With Assembly Member Bauer-Kahan and coauthor Pellerin at the helm, California would create a targeted regime under the Leading Ethical AI Development for Kids Act that governs companion chatbots accessed by minors and imposes specific design and accountability requirements on operators. The core change is a new regulatory framework that prohibits operators from making a companion chatbot available to a child if the system is foreseeably capable of a defined set of harms, paired with a dual enforcement path: civil actions by the state and private suits by affected children or their guardians.
Key provisions define a companion chatbot as a generative AI system with a natural-language interface that retains prior interaction data to personalize engagement, asks unsolicited emotion-based questions, and sustains ongoing dialogue about personal matters, while excluding systems used solely for customer service, efficiency improvements, or internal use. Operators may not deploy such a chatbot to a child if it could foreseeably enable any of six enumerated harms, including encouraging self-harm or violence, providing unsupervised mental health therapy, promoting illegal activity or sexual content involving a minor, prioritizing validation over safety or factual accuracy, or bypassing required safety guardrails. The bill also introduces a shifting knowledge standard for identifying a user as a child: before 2027, an operator need only lack actual knowledge that the user is a child; beginning in 2027, the operator must affirmatively and reasonably determine that the user is not a child.
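The shifting knowledge standard amounts to a date-dependent gating rule, which can be illustrated with a short sketch. All names below are hypothetical and purely illustrative of the summary above; this is not compliance logic, and the actual operative dates and standards are determined by the statute and any implementing guidance.

```python
from datetime import date

# Hypothetical cutover date: the summary's standard tightens "beginning in 2027".
CUTOVER = date(2027, 1, 1)

def knowledge_standard_permits(today: date,
                               has_actual_knowledge_user_is_child: bool,
                               reasonably_determined_not_child: bool) -> bool:
    """Sketch of the bill's shifting knowledge standard.

    Before 2027: the operator need only lack actual knowledge that the
    user is a child. From 2027 on: the operator must have reasonably
    determined that the user is NOT a child.
    """
    if today < CUTOVER:
        return not has_actual_knowledge_user_is_child
    return reasonably_determined_not_child

# A user of unknown status passes the pre-2027 test but fails afterward.
print(knowledge_standard_permits(date(2026, 6, 1), False, False))  # True
print(knowledge_standard_permits(date(2027, 6, 1), False, False))  # False
```

Note the asymmetry the sketch captures: the early standard is a passive one (absence of knowledge suffices), while the 2027 standard requires an affirmative determination, effectively pushing operators toward age-assurance processes.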
Enforcement and remedies are bifurcated: the Attorney General may sue to obtain civil penalties, injunctive or declaratory relief, and attorney’s fees, while a harmed child or their parent or guardian may pursue actual damages, punitive damages, attorney’s fees, injunctive relief, and other proper relief. The act makes its provisions severable and provides no explicit general appropriation or immediate regulatory rulemaking authority, nor does it specify an operative effective date within the text, leaving implementation details to future processes and potential guidance. The framework interacts with existing privacy concepts by aligning “personal information” with the Civil Code’s definition, situating the new regime alongside California’s broader consumer protection and data practices context while maintaining a separate enforcement pathway focused on companion chatbots and child safety.
Bill authors: Rebecca Bauer-Kahan (D, Assemblymember) and Gail Pellerin (D, Assemblymember).
| Ayes | Noes | NVR (no vote recorded) | Total | Result |
|---|---|---|---|---|
| 60 | 8 | 12 | 80 | PASS |