South Carolina General Assembly
126th Session, 2025-2026
Bill 896
Committee Report
May 6, 2026
S. 896
Introduced by Senators Leber, Walker, Blackmon and Kimbrell
S. Printed 5/6/26--S.
Read the first time February 5, 2026
________
The Senate Committee on Labor, Commerce and Industry
To whom was referred a Bill (S. 896) to amend the South Carolina Code of Laws so as to enact the "Chatbot Protection Act"; and by adding Chapter 80 to Title 39 so as to provide restrictions on certain, etc., respectfully
Report:
That they have duly and carefully considered the same, and recommend that the same do pass with amendment:
Amend the bill, as and if amended, by striking all after the enacting words and inserting:
SECTION 1. This act may be cited as the "Protecting Children from Chatbots Act."
SECTION 2. Title 39 of the S.C. Code is amended by adding:
CHAPTER 81
Chatbot Protection
Section 39-81-10. As used in this chapter:
(1) "Affiliate" means any person or entity that directly or indirectly controls, is controlled by, or is under common control with another person or entity.
(2) "Age-verification data" means personal information collected solely to confirm a person's age.
(3) "Authorized minor account" means a user account for a minor for which the covered entity has obtained verifiable parental consent.
(4) "Chatbot" means any artificial intelligence, algorithmic, or automated system that:
(a) produces new expressive content or responses not fully predetermined by the operator of the service or application;
(b) accepts open-ended, natural-language, or multimodal user input and produces adaptive or context-responsive natural language output; and
(c) maintains a conversational state across exchanges and is designed to facilitate multiturn dialogue rather than to respond to discrete information requests.
(5) "Chat log" means the input data provided by a user to a chatbot and the output data generated by the chatbot in response, including any record thereof.
(6) "Control" means the power to direct the management or policies of an entity, whether through ownership, contract, or otherwise.
(7)(a) "Covered entity" means an operator of a chatbot that has fifty thousand or more monthly active users worldwide. A covered entity does not include an operator of a chatbot that is:
(i) not offered to the general public, such as internal workplace tools, clinician-supervised clinical tools, or university research systems; or
(ii) used by a business entity predominantly for customer service, order fulfillment, account management, or to provide users with information about commercial services or products provided by that entity, including transactional functions directly related to the purchase, use, or return of such products or services; or
(iii) operated by or on behalf of a federal, state, or local governmental entity, including the South Carolina 211 system, and used solely to provide information about, or assist users in accessing government or public services, programs, benefits, or regulatory requirements; or
(iv) a nonplayer character in a video game or a video game chatbot that is restricted strictly to the subject matter of the video game and is not capable of open-ended companionship or discussion of topics related to self-harm, suicide, mental health, sexual conduct, or material harmful to minors.
(b) For purposes of determining monthly active users, a covered entity shall aggregate monthly active users across all chatbots offered by the covered entity and its affiliates.
(8) "Covered harm" means any of the following harms suffered by a user: death, a suicide attempt, self-harm requiring medical attention, a psychiatric emergency resulting in urgent medical treatment, or a serious physical injury that requires medical attention.
(9) "Covered incident" means an incident in which a user suffered a covered harm arising from interactions with a chatbot.
(10) "De-identified data" means information that cannot reasonably be used to infer or derive the identity of a user, does not identify a user, and is not linked or reasonably linkable to a user.
(11) "Emotional dependence" means a pattern of user behavior or statements indicating that the user relies on a chatbot as a primary source of emotional support or social connection, such as a user expressing that the chatbot is his primary source of emotional support, a user expressing distress at the prospect of losing access to the chatbot, or patterns of use suggesting the user is substituting the chatbot for human relationships.
(12) "Explicit content" means:
(a) any description or representation of nudity, sexual conduct, sexual excitement, or sadomasochistic abuse when the content predominantly appeals to the prurient, shameful, or morbid interest of minors; is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors; and is, when taken as a whole, lacking in serious literary, artistic, political, or scientific value for minors;
(b) content that provides specific instructions for, or that glorifies or promotes suicide, self-injury, or disordered eating behaviors; or
(c) graphic depictions of extreme violence that lack serious literary, artistic, political, or scientific value for minors.
(13) "Limited-access mode" means a mode of interacting with a chatbot in which the user does not need to create a user account or provide age verification data. Accounts in limited-access mode do not make any of the restricted features available.
(14) "Minor" means an individual under eighteen years of age.
(15) "Monthly active user" means a unique user who interacts with a chatbot at least once during a thirty-day period, as measured using the covered entity's ordinary business records.
(16) "Operator" means any person or entity that owns, controls, offers, or makes available a website, mobile application, or digital service that provides a chatbot to users in this State.
(17) "Reasonable age verification" includes methods authenticated to relate to the individual, such as a state-issued identification or driver's license; government digital identification; military identification; bank account verification; or any other commercially reasonable means or methods, including third-party verifiers that can reliably and accurately independently verify a user is an adult.
(18) "Restricted feature" means:
(a) personalization based on a user profile or prior sessions;
(b) proactive outreach to the user, including notifications or messages initiated by the chatbot or covered entity;
(c) extended interaction sessions or long context windows that may pose an unreasonable risk of the user developing emotional dependence or covered harm;
(d) relationship simulation, meaning designing or marketing the chatbot to simulate a personal relationship with the user, including portraying the chatbot as a friend, romantic partner, therapist, or primary source of emotional support; or
(e) access to explicit content.
(19) "Parental account" means an account with the covered entity that is:
(a) verified to be established by an individual whom the covered entity has determined, through the covered entity's age-verification method or process, to be at least eighteen years of age; and
(b) affiliated with one or more accounts of a user or prospective user who is a minor.
(20) "Parental control functions" means settings that allow a parent to restrict the minor user's account including, but not limited to:
(a) limiting the minor's interaction time;
(b) restricting or disabling categories of content or features including, but not limited to, restricted features;
(c) receiving the notifications required under this act; and
(d) deleting the minor user's data.
(21) "Personal data" means information that is linked or is reasonably linkable, either by itself or in combination with other information, to an identified or identifiable user. Personal data does not include de-identified data or publicly available information.
(22)(a) "Publicly available information" means information that has been lawfully made available to the public subject to a public records request.
(b) "Publicly available information" does not include:
(i) obscene items;
(ii) biometric data;
(iii) personal data that is created through the combination of personal data and publicly available information;
(iv) genetic data, unless the genetic data was made available to the public by the user to whom the genetic data pertains;
(v) information that is made available to the public by a user who uses a website or online platform on which the user has restricted the information to a specific audience; or
(vi) intimate images that are either authentic or computer generated and that are known to be nonconsensual.
(23) "Sell" means to exchange personal data or a chat log for monetary or other valuable consideration, or to make personal data or a chat log available to a third party for monetary or other valuable consideration. "Sell" does not include disclosure of personal data or a chat log to a service provider that processes the data on behalf of the covered entity under a written contract prohibiting further disclosure.
(24) "Unverified user" means a user whose age has not been verified by the covered entity pursuant to Section 39-81-20.
(25) "User" means an individual who interacts with a chatbot.
(26) "Verifiable parental consent" means authorization provided by a parent who has completed reasonable age verification in response to a clear and conspicuous disclosure signifying freely given, specific, informed, and unambiguous agreement.
(27) "Verified adult account" means a user account that a covered entity has verified, using a reasonable age verification process, to belong to an adult.
Section 39-81-20. (A) A covered entity shall make a limited-access mode available and shall ensure that any unverified user may only access and interact with a chatbot in limited-access mode.
(B) Before enabling any restricted feature for a user, a covered entity shall:
(1) require the user to create a user account;
(2) verify the user's age using a reasonable age-verification process, subject to item (3); and
(3) using the age data, classify the user as a minor or an adult.
(C) When conducting a reasonable age verification process under this section, a covered entity shall:
(1) collect only the age-verification data that is strictly necessary to reasonably verify age;
(2) use age-verification data only for age verification;
(3) not sell, rent, share, or otherwise disclose age-verification data to any third party, except to a service provider performing age verification under a contract prohibiting further disclosure;
(4) not combine age-verification data with any other personal data about the user;
(5) delete age-verification data within twenty-four hours of completing the age-verification process, except that the covered entity may retain a record that the user has been verified as a minor; and
(6) provide a simple process for a user to appeal or correct an age-verification decision.
(D) If the reasonable age-verification process classifies the user as an adult, then the covered entity may enable restricted features for the verified adult account.
(E) If the age-verification process classifies the user as a minor, then a covered entity shall not enable any restricted feature unless the user is using an authorized minor account subject to Section 39-81-30.
(F) A covered entity shall implement reasonable systems and processes to identify user accounts that may be inaccurately classified by age, such as patterns of use suggesting a minor is using an adult account or credible reports that an account was created using false age data, and shall reverify any such account before enabling any restricted feature.
(G) A covered entity shall not be liable under this chapter solely because a minor incidentally uses a user account that has been correctly verified and classified as an adult account, provided the covered entity is otherwise in compliance with subsection (F).
(H) With respect to each user account of a covered entity that exists as of the effective date of this act, a covered entity shall, within sixty days, disable access to restricted features for any account that has not been classified as an authorized minor account or a verified adult account, unless and until the user completes age verification.
Section 39-81-30. (A) Nothing in this act shall be construed to require parental consent for a minor to access or interact with a chatbot in limited-access mode.
(B) If the age-verification process pursuant to Section 39-81-20 classifies a user as a minor and the user seeks to access any restricted feature, then a covered entity shall offer the user the option to continue using the chatbot in limited-access mode or to obtain parental consent to access the restricted features.
(C) If the user chooses to obtain parental consent, then the covered entity shall:
(1) obtain verifiable parental consent;
(2) remove limited-access mode and enable access to restricted features;
(3) ensure that the chatbot continues to restrict access to any explicit content;
(4) implement reasonable parental control functions, which may restrict the minor's access to features enabled under item (2);
(5) offer the parent the option to provide contact information or establish a linked parental account in order to receive notifications; and
(6) offer the parent the option to receive access to, and download in a portable and easy-to-read format, chat logs of any interactions between the minor and the chatbot conducted through the authorized minor account.
(D) If the age-verification process classifies the user as under sixteen, then a covered entity also shall require the consenting parent to provide contact information or establish a linked parental account.
(E) If the covered entity has a way to reach the parent through a parental account or contact information provided under subsection (C) or (D), then the covered entity shall notify the parent immediately in the case of any incident triggering a crisis message pursuant to Section 39-81-40(B)(3), and when the covered entity first identifies that the minor is developing emotional dependence on the chatbot and takes steps pursuant to Section 39-81-40(B)(1) in response.
(F) A covered entity shall not process the chat log or personal data of an authorized minor account to determine whether to display an advertisement to the minor, to determine a product or service to advertise to the minor, or to customize an advertisement for presentation to the minor.
(G) A covered entity shall not process the chat log or personal data of an authorized minor account, including for purposes of training a model.
(H) Notwithstanding subsections (F) and (G), a covered entity may process the chat log and personal data of an authorized minor account, and may train, fine-tune, or evaluate models or classifiers using such data, to the extent strictly necessary to:
(1) perform the duties required by Section 39-81-40 or 39-81-50;
(2) test for, identify, or address a risk of harm to minors; or
(3) comply with this chapter or other applicable law.
Section 39-81-40. (A) A covered entity shall not implement features designed to:
(1) prioritize engagement, revenue, or retention metrics, such as session length, frequency of use, or emotional engagement, at the expense of user well-being; or
(2) encourage or facilitate a minor user or unverified user in concealing the user's use of the chatbot from a parent or guardian.
(B) A covered entity shall implement reasonable systems and processes to:
(1) identify when a user is developing emotional dependence on the chatbot and take reasonable steps to reduce that dependence and associated risks of harm;
(2) ensure that a chatbot does not make a materially false representation that it is a human being; and
(3) identify when a user is expressing suicidal thoughts or intent to self-harm, or is showing signs of an acute mental health crisis, and promptly provide a clear and prominent crisis message, including crisis services information, to any such user.
(C) A covered entity shall not:
(1) represent in the advertising, interface, or output data of a chatbot that the chatbot is, or is licensed or certified as, any of the following:
(a) a licensed legal professional;
(b) a licensed medical professional;
(c) a certified public accountant;
(d) a licensed investment advisor or investment advisor representative;
(e) a licensed fiduciary; or
(f) any other certified, registered, or licensed professional;
(2) represent in the advertising, interface, or output data of a chatbot that the user's chat log or personal data is confidential;
(3) sell a user's chat log; or
(4) retain a user's chat log for more than ten years, unless retention is required to comply with this chapter or otherwise required by law.
(D) A covered entity shall:
(1) develop, implement, and maintain a written comprehensive data security program that contains administrative, technical, and physical safeguards proportionate to the volume and nature of personal data and chat logs that the covered entity maintains, and make the program publicly available on its website; and
(2) take reasonable physical, administrative, and technical measures to prevent de-identified data from being re-identified, and process, retain, and transfer de-identified data without any reasonable means of re-identification.
Section 39-81-50. (A)(1) If a covered entity obtains knowledge that a user faces an imminent risk of death or serious physical injury, then the covered entity must make reasonable efforts, within twenty-four hours, to notify appropriate emergency services or law enforcement, to the extent practicable based on information the covered entity already possesses or can obtain through reasonable, user-facing prompts for the purpose of facilitating emergency assistance.
(2) If the covered entity cannot make a notification under item (1) because the covered entity lacks sufficient information to enable an emergency response, then the covered entity shall:
(a) promptly provide a clear and prominent message urging the user to contact emergency services, and provide crisis services information;
(b) make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services; and
(c) document the steps taken and the basis for the covered entity's determination that notification was not practicable.
(3) A covered entity that makes a notification in good faith under this subsection is not liable for damages solely for making the notification, unless the covered entity acted with wilful misconduct or gross negligence.
(4) If the user for whom a notification is made under item (1) is a minor using an authorized minor account, the covered entity also immediately shall notify the parent through any parental account or contact information on file.
(B)(1) A covered entity shall submit a report to the Attorney General within fifteen days of obtaining knowledge of a covered incident connected to one or more of its chatbots, which, to the extent known at the time of the report, shall include:
(a) the date the covered entity obtained knowledge of the incident;
(b) the date of the incident, if known;
(c) a brief description of the incident and the basis for the covered entity's belief that the incident is connected to the chatbot; and
(d) a description of any actions the covered entity took in response.
(2) A covered entity may submit a supplemental report within sixty days after the initial report to update or correct information learned through investigation.
(C)(1) Reports submitted under this section shall be confidential and are not subject to disclosure pursuant to Chapter 4, Title 30, the Freedom of Information Act.
(2) The Attorney General may publish aggregate information and statistics derived from the reports, so long as the publication does not identify individual users or disclose trade secrets.
Section 39-81-60. (A) The Attorney General may initiate an action in the name of the State and may seek an injunction to restrain any violations of this chapter and civil penalties of up to fifty thousand dollars for each violation.
(1) For purposes of this subsection, a violation occurs when a covered entity fails to comply with a requirement of this act.
(2) Each day a covered entity fails to comply with a requirement constitutes a separate violation.
(B) A violation of this chapter constitutes an injury in fact to an affected user. Any person harmed by a violation of this act, or a parent or legal guardian of a minor harmed by a violation of this act, may bring a civil action to recover:
(1) monetary damages for the harm caused by the violation;
(2) reasonable attorney's fees and costs;
(3) injunctive or declaratory relief; and
(4) punitive damages if the violation was wilful and wanton, reckless, or grossly negligent.
(C)(1) The rights and remedies provided by this act may not be waived by contract.
(2) Any term in a contract or agreement that purports to do any of the following is void and unenforceable as against public policy:
(a) waive or limit a right or remedy under this act;
(b) shorten the time to bring a claim under this act;
(c) prevent a person from enforcing a claim under this act in court; or
(d) require arbitration of a claim under this act.
(D) The duties and obligations imposed by this act are cumulative with any other duties or obligations imposed under other law, shall not be construed to relieve any party from any duties or obligations imposed under other law, and do not limit any rights or remedies under existing law.
SECTION 3. If any section, subsection, paragraph, subparagraph, sentence, clause, phrase, or word of this act is for any reason held to be unconstitutional or invalid, such holding shall not affect the constitutionality or validity of the remaining portions of this act, the General Assembly hereby declaring that it would have passed this act, and each and every section, subsection, paragraph, subparagraph, sentence, clause, phrase, and word thereof, irrespective of the fact that any one or more other sections, subsections, paragraphs, subparagraphs, sentences, clauses, phrases, or words hereof may be declared to be unconstitutional, invalid, or otherwise ineffective.
SECTION 4. This act takes effect upon approval by the Governor.
Renumber sections to conform.
Amend title to conform.
THOMAS DAVIS for Committee.
_______
A bill
TO AMEND THE SOUTH CAROLINA CODE OF LAWS SO AS TO ENACT THE "CHATBOT PROTECTION ACT"; AND BY ADDING CHAPTER 80 TO TITLE 39 SO AS TO PROVIDE RESTRICTIONS ON CERTAIN CHATBOT ACTIVITIES AND TO PROVIDE FOR CIVIL REMEDIES.
Be it enacted by the General Assembly of the State of South Carolina:
SECTION 1. This act may be cited as the "Chatbot Protection Act."
SECTION 2. Title 39 of the S.C. Code is amended by adding:
CHAPTER 80
Chatbot Protection
Section 39-80-10. As used in this chapter:
(1)(a) "Advertisement" means any written or oral statement, illustration, or depiction that is displayed in exchange for monetary or other valuable consideration if the written or oral statement, illustration, or depiction:
(i) promotes the sale or use of a good or service; or
(ii) is designed to increase interest in a brand, good, or service.
(b) "Advertisement" includes access to data between the chatbot provider and a brand, good, or service.
(2)(a) "Affirmative consent" means a clear affirmative act that signifies a user's freely given information and unambiguous authorization for an act or practice in response to a specific request from a chatbot provider, if:
(i) The request is provided to the user in a clear and conspicuous stand-alone disclosure.
(ii) The request includes a description that is written in easily understandable language.
(iii) The request is made in a manner that is reasonably accessible and available by users with disabilities.
(iv) The request is made available to the user in each language in which the chatbot provider provides a chatbot.
(v) The option to decline consent is at least as prominent as the option to give consent, and the option to decline consent takes the same or fewer number of steps as the option to give consent.
(vi) Affirmative consent to an act or practice may not be inferred from the inaction of the user or the user's continued use of a chatbot.
(b) "Affirmative consent" does not include:
(i) acceptance given by general or broad terms of use;
(ii) hovering over, muting, pausing, or closing a given piece of content;
(iii) an agreement that is obtained through the use of a false, fraudulent, or materially misleading statement or representation; or
(iv) an agreement that is obtained through the use of other dark patterns.
(3) "Chatbot" means an algorithmic or automated system that generates information through text, audio, image, or video in a manner that simulates interpersonal interactions or conversations including artificial intelligence.
(4) "Chatbot provider" means any person that creates, distributes, or otherwise makes a chatbot available to a user.
(5) "Chat log" means:
(a) input data or a record of the input data; and
(b) output data that is generated by a chatbot or from interactions with a chatbot.
(6) "Dark pattern" means a user interface that is designed or manipulated to subvert or impair a user's autonomy, decision making, or choice.
(7) "Deidentified data" means information that:
(a) cannot reasonably be used to infer or derive the identity of a user;
(b) does not identify a user; and
(c) is not linked or is not reasonably linkable to a user.
(8) "Input data" means information including texts, photos, audio files, video files, and any type of file that is provided to a chatbot by a user.
(9) "Model" means an engineered or machine-based system underlying a chatbot that is based on the input data that it receives and that can infer how to generate output data that can influence physical or virtual environments.
(10)(a) "Personal data" means:
(i) any information, including derived data, that is linked or is reasonably linkable either by itself or in combination with other information to an identified or identifiable user; or
(ii) a device that identifies, is linked to, or is reasonably linkable to a user.
(b) "Personal data" does not include deidentified data or publicly available information.
(11)(a) "Publicly available information" means information that has been lawfully made available to the public subject to a public records request.
(b) "Publicly available information" does not include:
(i) obscene items;
(ii) biometric data;
(iii) personal data that is created through the combination of personal data and publicly available information;
(iv) genetic data, unless the genetic data was made available to the public by the user to whom the genetic data pertains;
(v) information that is made available to the public by a user who uses a website or online platform on which the user has restricted the information to a specific audience; or
(vi) intimate images that are either authentic or computer generated and that are known to be nonconsensual.
(12) "Process" or "processing" means any operation or set of operations that are performed on personal data or input data including the use, storage, disclosure, analysis, deletion, or modification of personal data or input data.
(13)(a) "Profiling" means to process personal data or input data to classify or designate personality traits and behavioral characteristics of a user.
(b) "Profiling" does not include processing chat logs for user safety or to otherwise comply with this chapter.
(i) the exchange of personal data or input data for monetary or other valuable consideration; or
(ii) to make personal data or input data available to a third party for monetary or other valuable consideration.
(b) "Sell" does not include:
(i) the disclosure of personal data or input data to a third party that processes the personal data or input data on behalf of the chatbot provider;
(ii) the disclosure of personal data or input data where the user provides affirmative consent and directs the chatbot provider to disclose the personal data or input data or intentionally uses the chatbot provider to interact with a third party; or
(iii) the disclosure of personal data or input data that the user intentionally made available to the public through social media and did not restrict the information to a specific audience.
(15)(a) "Training" means the use of input data to adjust or modify a model.
(b) "Training" does not include:
(i) tests that are used to identify risk of harm to users;
(ii) adjustments or modifications that are made to address identified risk of harm to users; or
(iii) any action that is necessary to comply with this article or as otherwise required by law.
(16) "User" means any natural person regardless of age.
Section 39-80-20. (A) A chatbot provider may not:
(1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
(2) process a user's chat log:
(a) to determine whether to display an advertisement for a product or service to a user;
(b) to determine a product or service or category of a product or service to advertise to a user; or
(c) to customize an advertisement for presentation to a user;
(3) process a user's chat log and personal data:
(a) if the chatbot provider knows or reasonably should have known, based on knowledge of objective circumstances, that the user is a minor and the user's parent or legal guardian did not provide affirmative consent;
(b) for training purposes if the chatbot provider knows or reasonably should have known, based on knowledge of objective circumstances, that the user is a minor and the user's parent or legal guardian did not provide affirmative consent;
(c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or
(d) to engage in profiling beyond what is necessary to fulfill an express request;
(4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
(5) sell a user's chat logs;
(6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
(7) discriminate or retaliate against a user for refusing to consent to the use of chat logs or personal data for training purposes, including by:
(a) denying products or services to the user;
(b) charging different prices or rates for products or services to the user; or
(c) providing lower quality products or services to the user.
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy-to-read format. A chatbot provider may not discriminate or retaliate against a user who requests the user's chat log.
(C) A governmental entity may not compel the production of or access to input data or chat logs from a chatbot provider, except pursuant to a wiretap warrant.
(D) A chatbot provider shall develop, implement, and maintain a comprehensive data security program that contains administrative, technical, and physical safeguards that are proportionate to the volume and nature of personal data and chat logs that are maintained by the chatbot provider. The program must be written and made publicly available on the chatbot provider's website.
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
Section 39-80-30. (A) A chatbot provider may not:
(1) use any term, letter, or phrase in the advertising, interface, or output data of a chatbot that states or implies that the chatbot is endorsed by or equivalent to any of the following:
(a) any certified, registered, or licensed professional;
(b) a licensed legal professional;
(c) a certified public accountant;
(d) an investment advisor or an investment advisor representative; or
(e) a licensed fiduciary;
(2) include any representation in the advertising, interface, or output data of a chatbot that states or implies the user's input data or chat log is confidential.
(B) A chatbot provider shall provide clear, conspicuous, and explicit notice to a user that the user is interacting with a chatbot rather than a natural person before the chatbot may generate any output data. The chatbot provider shall include this notice at the beginning of each chatbot communication with a user, every hour thereafter, and each time a user asks whether the chatbot is a natural person. The text of the notice must:
(1) be written in the same language in which the chatbot communicates with the user and appear in a font size that is easily readable by an average user and is not smaller than the largest font size used for other chatbot communications; and
(2) comply with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40.
(C) In compliance with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40, a chatbot provider shall:
(1) on a monthly basis:
(a) evaluate its chatbot for potential risk of harm to users; and
(b) make information about its chatbot publicly available on its website; and
(2) mitigate any risk of harm to users.
Section 39-80-40. (A) The Attorney General shall adopt rules and promulgate regulations to implement this chapter. The rules and regulations must:
(1) describe the form and content of the notice that is required pursuant to Section 39-80-30;
(2) provide an example template for the notice that is required pursuant to Section 39-80-30;
(3) describe any potential risk of harm to users; and
(4) provide requirements for a chatbot provider to implement to reduce the risk of harm to users.
(B) The Attorney General may adopt any other rules or promulgate regulations necessary to implement this chapter.
Section 39-80-50. (A) A chatbot is considered a product for the purposes of a product liability action.
(B) A chatbot provider has a duty to ensure that the use of the chatbot does not cause injury to a user.
(C) A chatbot provider is liable for any injury that the chatbot causes to a user even if:
(1) the chatbot provider exercised all reasonable care in the design and distribution of the chatbot; or
(2) the chatbot provider did not directly distribute the chatbot to the user or otherwise enter into a contractual relationship with the user.
Section 39-80-60. (A) The Attorney General or a county attorney may bring a civil action against a chatbot provider that violates this chapter and that includes any of the following:
(1) enjoining an act or practice in violation of this chapter;
(2) enforcing compliance with this chapter or a rule adopted or regulation promulgated pursuant to this chapter;
(3) obtaining damages, civil penalties, restitution, or other remedies; or
(4) obtaining reasonable attorney's fees and other litigation costs.
(B) A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user.
(C) A user who is injured by a violation of Section 39-80-20 or 39-80-30 may bring a civil action against the chatbot provider, and a court of competent jurisdiction may award a prevailing plaintiff any of the following:
(1) a civil penalty of not more than five thousand dollars per violation of this chapter;
(2) punitive damages for reckless and knowing conduct;
(3) injunctive relief;
(4) declaratory relief; or
(5) reasonable attorney's fees and litigation costs.
SECTION 3. This act takes effect upon approval by the Governor.
----XX----