Senators Padilla and Becker's companion chatbot legislation establishes California's first regulatory framework for AI systems designed to provide emotional support and sustained relationships with users. The measure defines companion chatbots as artificial intelligence programs with natural language interfaces that adapt to users and maintain ongoing social interactions, while explicitly excluding customer service bots from its scope.
The bill requires platform operators to implement specific safeguards around user engagement and mental health. Operators must prevent chatbots from using unpredictable reward patterns or encouraging excessive usage. They must display notifications at the start of conversations and every three hours reminding users they are interacting with AI. Before allowing any user engagement, platforms must establish protocols for addressing expressions of suicidal thoughts or self-harm, including referrals to crisis services, and publish these procedures online.
Annual reporting requirements mandate that operators track and submit data to California's Office of Suicide Prevention on instances where users express suicidal ideation and cases where chatbots initiate discussions about suicide. The office will publish this aggregated data online. Platforms must undergo regular third-party compliance audits and notify users that companion chatbots may be unsuitable for some minors. Users who experience harm from violations can pursue civil action for injunctive relief and damages of $1,000 per violation or actual damages, whichever is greater.
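The greater-of damages remedy can be sketched as a small calculation. This is only an illustration of the summary's wording ("$1,000 per violation or actual damages, whichever is greater"), not legal advice or the statute's exact formula; the function name and the per-violation interpretation are assumptions.

```python
def statutory_damages(violations: int, actual_damages: float) -> float:
    """Illustrative greater-of calculation per the bill summary:
    $1,000 per violation, or actual damages, whichever is greater.
    (Interpretation assumed; consult the bill text for the real rule.)"""
    return max(1000.0 * violations, actual_damages)

# Three violations with $500 in actual harm: statutory amount controls.
print(statutory_damages(3, 500.0))    # 3000.0
# One violation with $2,500 in actual harm: actual damages control.
print(statutory_damages(1, 2500.0))   # 2500.0
```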
| Legislator | Party | Role |
|---|---|---|
| Shannon Grove | R | Committee Member |
| Scott Wiener | D | Committee Member |
| Henry Stern | D | Bill Author |
| Tim Grayson | D | Committee Member |
| Monique Limón | D | Committee Member |
| Ayes | Noes | NVR | Total | Result |
|---|---|---|---|---|
| 12 | 0 | 1 | 13 | PASS |