South Carolina General Assembly
126th Session, 2025-2026

Bill 1037




A bill

 

TO AMEND THE SOUTH CAROLINA CODE OF LAWS SO AS TO ENACT THE "PROTECTING CHILDREN FROM CHATBOTS ACT"; BY ADDING CHAPTER 81 TO TITLE 39, SO AS TO DEFINE TERMS RELATED TO CHATBOT USAGE; TO PROVIDE THAT A COVERED ENTITY SHALL MAKE A LIMITED-ACCESS MODE OF A CHATBOT AVAILABLE AND VERIFY THE USER'S AGE; TO PROVIDE THAT IF A PARENT PROVIDES CONSENT, THEN A MINOR SHALL BE ABLE TO USE A CHATBOT IN LIMITED-ACCESS MODE OR ACCESS RESTRICTED FEATURES; TO PROHIBIT A COVERED ENTITY FROM PRIORITIZING ENGAGEMENT AT THE EXPENSE OF THE USER'S WELLBEING; TO PROVIDE PROCEDURES TO REPORT INCIDENTS OF HARM THAT A CHATBOT INFLICTS ON A MINOR; AND TO PROVIDE PENALTIES.

 

Be it enacted by the General Assembly of the State of South Carolina:

 

SECTION 1.  This act may be cited as the "Protecting Children from Chatbots Act".

 

SECTION 2.  Title 39 of the S.C. Code is amended by adding:

 

CHAPTER 81

 

Chatbot Usage

 

    Section 39-81-10.  As used in this chapter:

    (1) "Affiliate" means any person or entity that directly or indirectly controls, is controlled by, or is under common control with another person or entity.

    (2) "Age verification data" means personal information collected solely to confirm a person's age.

    (3) "Authorized minor account" means a user account for a minor for which the covered entity has obtained verifiable parental consent.

    (4) "Chatbot" means any artificial intelligence, algorithmic, or automated system that:

       (a) produces new expressive content or responses not fully predetermined by the operator of the service or application;

       (b) accepts open-ended, natural-language, or multimodal user input and produces adaptive or context-responsive natural language output; and

       (c) maintains a conversational state across exchanges and is designed to facilitate multi-turn dialogue rather than to respond to discrete information requests.

    (5) "Control" means the power to direct the management or policies of an entity, whether through ownership, contract, or otherwise.

    (6)(a) "Covered entity" means an operator of a chatbot that has five hundred thousand or more monthly active users worldwide. A covered entity does not include an operator of a chatbot that is:

           (i) not offered to the general public, such as internal workplace tools, clinician-supervised clinical tools, or university research systems; or

           (ii) used by a business entity solely for customer service or strictly to provide users with information about available commercial services or products provided by that entity, customer service account information, or other information strictly related to its customer service.

       (b) For purposes of determining monthly active users, a covered entity shall aggregate monthly active users across all chatbots offered by the covered entity and its affiliates.

    (7) "Covered harm" means harm suffered by a user, including death, a suicide attempt, self-harm requiring medical attention, a psychiatric emergency resulting in urgent medical treatment, or a serious physical injury that requires medical attention.

    (8) "Covered incident" means an incident in which a user suffered a covered harm arising from interactions with a chatbot.

    (9) "Duty of loyalty" means that the covered entity shall not, in the design or operation of its chatbot, place the covered entity's interests in material conflict with the interests of the user to the user's detriment.

    (10) "Emotional dependence" means a pattern of user behavior or statements indicating that the user relies on a chatbot as a primary source of emotional support or social connection, such as a user expressing that the chatbot is his primary source of emotional support, a user expressing distress at the prospect of losing access to the chatbot, or patterns of use suggesting the user is substituting the chatbot for human relationships.

    (11) "Explicit content" means:

       (a) any description or representation of nudity, sexual conduct, sexual excitement, or sadomasochistic abuse when the content predominantly appeals to the prurient, shameful, or morbid interest of minors; is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors; and is, when taken as a whole, lacking in serious literary, artistic, political, or scientific value for minors;

       (b) content that provides specific instructions for, or that glorifies or promotes suicide, self-injury, or disordered eating behaviors; or

       (c) graphic depictions of extreme violence that lack serious literary, artistic, political, or scientific value for minors.

    (12) "Limited-access mode" means a mode of interacting with a chatbot in which the user does not need to create a user account or provide age verification data. A chatbot operating in limited-access mode does not make any restricted feature available.

    (13) "Monthly active user" means a unique user who interacts with a chatbot at least once during a thirty-day period, as measured using the operator's ordinary business records.

    (14) "Operator" means any person or entity that owns, controls, offers, or makes available a website, mobile application, or digital service that provides a chatbot to users in this State.

    (15) "Reasonable age verification" includes methods authenticated to relate to the individual, such as a state-issued identification or driver's license; government digital identification; military identification; bank account verification; or any other commercially reasonable means or method, including third-party verifiers that can reliably and accurately independently verify that a user is an adult.

    (16) "Restricted feature" means:

       (a) personalization based on a user profile or prior sessions;

       (b) proactive outreach to the user, including notifications or messages initiated by the chatbot or operator;

       (c) extended interaction sessions or long context windows that may pose an unreasonable risk of the user developing emotional dependence or covered harm;

       (d) relationship simulation, meaning designing or marketing the chatbot to simulate a personal relationship with the user, including portraying the chatbot as a friend, romantic partner, therapist, or primary source of emotional support; or

       (e) access to explicit content.

    (17) "Parental account" means an account with the operator that is:

       (a) verified to be established by an individual who the operator has determined is at least eighteen years of age through the operator's age verification method or process; and

       (b) affiliated with one or more accounts of a user or prospective user who is a minor.

    (18) "Parental control functions" means settings that allow a parent to restrict the minor user's account, including, but not limited to:

       (a) limiting the minor's interaction time;

       (b) restricting or disabling categories of content or features, including but not limited to, restricted features;

       (c) receiving the notifications required under this act; and

       (d) deleting the minor user's data.

    (19) "Unverified user" means a user whose age has not been verified by the covered entity pursuant to Section 39-81-20.

    (20) "User" means an individual who interacts with a chatbot.

    (21) "Verifiable parental consent" means authorization provided by a parent who has completed reasonable age verification in response to a clear and conspicuous disclosure signifying freely given, specific, informed, and unambiguous agreement.

    (22) "Verified adult account" means a user account that a covered entity has verified, using a reasonable age verification process, to belong to an adult.

 

    Section 39-81-20.  (A) A covered entity shall make a limited-access mode available and shall ensure that any unverified user may only access and interact with a chatbot in limited-access mode.

    (B) Before enabling any restricted feature for a user, a covered entity shall:

       (1) require the user to create a user account;

       (2) verify the user's age using a reasonable age verification process, subject to item (3); and

       (3) using the age verification data, classify the user as a minor or an adult.

    (C) When conducting a reasonable age verification process under this section, an operator shall:

       (1) collect only the age verification data that is strictly necessary to reasonably verify age;

       (2) use age verification data only for age verification;

       (3) not sell, rent, share, or otherwise disclose age verification data to any third party, except to a service provider performing age verification under a contract prohibiting further disclosure;

       (4) not combine age verification data with any other personal data about the user;

       (5) delete age verification data within twenty-four hours of completing the age verification process, except that the operator may retain a record that the user has been verified as a minor; and

       (6) provide a simple process for a user to appeal or correct an age-verification decision.

    (D) If the reasonable age verification process classifies the user as an adult, then the covered entity may enable restricted features for the verified adult account.

    (E) If the age verification process classifies the user as a minor, then a covered entity shall not enable any restricted feature unless the user is using an authorized minor account subject to Section 39-81-30.

    (F) A covered entity shall implement reasonable systems and processes to identify user accounts that may be inaccurately classified by age, such as patterns of use suggesting a minor is using an adult account or credible reports that an account was created using false age data, and shall re-verify any such account before enabling any restricted feature.

    (G) A covered entity shall not be liable under this chapter solely because a minor incidentally uses a user account that has been correctly verified and classified as an adult account, provided the covered entity is otherwise in compliance with subsection (F).

    (H) With respect to each user account of a covered entity that exists as of the effective date of this act, a covered entity shall, within sixty days, disable access to restricted features for any account that has not been classified as an authorized minor account or a verified adult account, unless and until the user completes age verification.

 

    Section 39-81-30.  (A) Nothing in this act shall be construed to require parental consent for a minor to access or interact with a chatbot in limited-access mode.

    (B) If the age verification process described in Section 39-81-20 classifies a user as a minor and the user seeks to access any restricted feature, then a covered entity shall offer the user the option of continuing to use the chatbot in limited-access mode or to obtain parental consent to access the restricted features.

    (C) If the user chooses to obtain parental consent, then the covered entity shall:

       (1) obtain verifiable parental consent;

       (2) remove limited-access mode and enable access to restricted features;

       (3) ensure that the chatbot continues to restrict access to any explicit content;

       (4) implement reasonable parental control functions, which may restrict the minor's access to features enabled under item (2);

       (5) offer the parent the option to provide contact information or establish a linked parental account in order to receive notifications; and

       (6) offer the parent the option to receive access to chat logs of any interactions between the minor and the chatbot conducted through the authorized minor account.

    (D) If the age verification process classifies the user as under sixteen years of age, then a covered entity also shall require the consenting parent to provide contact information or establish a linked parental account.

    (E) If the covered entity has a way to reach the parent through a parental account or contact information provided under subsection (C) or (D), then the covered entity shall notify the parent immediately in the case of any incident provoking a crisis message, pursuant to Section 39-81-40(B)(3).

 

    Section 39-81-40.  (A) A covered entity shall not implement features designed to:

       (1) prioritize engagement, revenue, or retention metrics, such as session length, frequency of use, or emotional engagement, at the expense of user wellbeing; or

       (2) encourage or facilitate the concealment by a minor user or unverified user of the user's use of the chatbot from a parent or guardian.

    (B) A covered entity shall implement reasonable systems and processes to:

       (1) identify when a user is developing emotional dependence on the chatbot and take reasonable steps to reduce that dependence and associated risks of harm;

       (2) ensure that a chatbot does not make a materially false representation that it is a human being; and

       (3) identify when a user is expressing suicidal thoughts or intent to self-harm, or is showing signs of an acute mental health crisis, and promptly provide a clear and prominent crisis message, including crisis services information, to any such user.

 

    Section 39-81-50.  (A)(1) If a covered entity obtains knowledge that a user faces an imminent risk of death or serious physical injury, then the operator must make reasonable efforts, within twenty-four hours, to notify appropriate emergency services or law enforcement, to the extent practicable based on information the operator already possesses or can obtain through reasonable, user-facing prompts for the purpose of facilitating emergency assistance.

       (2) If the operator cannot make a notification under item (1) because the operator lacks sufficient information to enable an emergency response, then the operator shall:

            (a) promptly provide a clear and prominent message urging the user to contact emergency services and provide crisis services information;

            (b) make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services; and

           (c) document the steps taken and the basis for the operator's determination that notification was not practicable.

       (3) An operator that makes a notification in good faith under this subsection is not liable for damages solely for making the notification, unless the operator acted with willful misconduct or gross negligence.

    (B)(1) A covered entity shall submit a report to the Attorney General within fifteen days of obtaining knowledge of a covered incident connected to one or more of its chatbots, which, to the extent known at the time of the report, shall include:

           (a) the date the operator obtained knowledge of the incident;

           (b) the date of the incident, if known;

           (c) a brief description of the incident and the basis for the operator's belief that the incident is connected to the chatbot; and

           (d) a description of any actions the operator took in response.

       (2) A covered entity may submit a supplemental report within sixty days after the initial report to update or correct information learned through investigation.

    (C)(1) Reports submitted under this section shall be confidential and are not subject to disclosure pursuant to Chapter 4, Title 30, the Freedom of Information Act.

       (2) The Attorney General may publish aggregate information and statistics derived from the reports, so long as the publication does not identify individual users or disclose trade secrets.

 

    Section 39-81-60.  (A) The Attorney General may initiate an action in the name of the State and may seek an injunction to restrain any violations of this chapter and civil penalties of up to fifty thousand dollars for each violation.

       (1) For purposes of this subsection, a violation occurs when a covered entity fails to comply with a requirement of this act.

       (2) Each day a covered entity fails to comply with a requirement constitutes a separate violation.

    (B) Any person harmed by a violation of this act, or a parent or legal guardian of a minor harmed by a violation of this act, may bring a civil action to recover:

       (1) monetary damages for the harm caused by the violation;

       (2) reasonable attorney fees and costs;

       (3) injunctive or declaratory relief; and

       (4) punitive damages if the violation was willful and wanton, reckless, or grossly negligent.

    (C)(1) The rights and remedies provided by this act may not be waived by contract.

       (2) Any term in a contract or agreement that purports to do any of the following is void and unenforceable as against public policy:

           (a) waive or limit a right or remedy under this act;

           (b) shorten the time to bring a claim under this act;

           (c) prevent a person from enforcing a claim under this act in court; or

           (d) require arbitration of a claim under this act.

    (D) The duties and obligations imposed by this act are cumulative with any other duties or obligations imposed under other law and shall not be construed to relieve any party from any duties or obligations imposed under other law and do not limit any rights or remedies under existing law.

 

SECTION 3.  If any section, subsection, paragraph, subparagraph, sentence, clause, phrase, or word of this act is for any reason held to be unconstitutional or invalid, such holding shall not affect the constitutionality or validity of the remaining portions of this act, the General Assembly hereby declaring that it would have passed this act, and each and every section, subsection, paragraph, subparagraph, sentence, clause, phrase, and word thereof, irrespective of the fact that any one or more other sections, subsections, paragraphs, subparagraphs, sentences, clauses, phrases, or words hereof may be declared to be unconstitutional, invalid, or otherwise ineffective.

 

SECTION 4.  This act takes effect upon approval by the Governor.

----XX----
