Senators Padilla and Becker have proposed new regulations for AI companion chatbots in California, establishing requirements for platforms that provide human-like conversational interactions designed to meet users' social needs. The legislation defines companion chatbots as AI systems with natural language interfaces capable of sustaining relationships across multiple interactions, while explicitly excluding customer service bots.
The bill requires operators to implement specific safety protocols before allowing chatbots to engage with users. These include preventing unpredictable reward mechanisms, issuing regular notifications about the artificial nature of the interaction, and maintaining protocols for addressing expressions of suicidal ideation or self-harm. Operators must publish their crisis response protocols online and provide users with referrals to crisis services when needed.
Platform operators face new reporting and oversight obligations under the proposal. They must submit annual reports to the State Department of Health Care Services documenting instances where users expressed suicidal thoughts and cases where chatbots initiated discussions about suicide. The legislation mandates regular third-party compliance audits and requires operators to disclose that companion chatbots may not be suitable for some minors. Users who experience harm from violations can pursue civil actions for injunctive relief and damages of $1,000 per violation or actual damages, whichever is greater.
| Ayes | Noes | NVR | Total | Result |
|------|------|-----|-------|--------|
| 12   | 0    | 1   | 13    | PASS   |