Assembly Members Bauer-Kahan and Berman, joined by several colleagues, have authored a measure addressing digitized sexually explicit material and deepfake pornography. The bill expands liability to cover depictions created when the victim was a minor and to reach those who facilitate such material, while widening enforcement options for private individuals and public prosecutors.
At its core, the measure adds new private causes of action for depicting a person in digitized sexually explicit material without consent or when the person was a minor at the time of creation, and for knowingly facilitating, or recklessly aiding or abetting, such conduct. It also imposes presumptions on certain actors: owners or operators of a deepfake pornography service are presumed to know the depicted individual did not consent unless they prove express written consent, and providers that enable ongoing operation of a deepfake service are presumed to be in violation if they fail to stop providing service after receiving timely, documented evidence. The measure introduces or refines definitions for terms such as digitization, digitized sexually explicit material, and deepfake pornography service, keeping its scope focused on digitally altered representations. The bill also exempts disclosures made in reporting unlawful activity, in enforcement actions, or in legal proceedings, and clarifies that a disclaimer stating the depicted person did not participate, or that generating such material is prohibited, is not a defense.
Remedies extend beyond private plaintiffs to public enforcement, with substantial damages and penalties available. Private plaintiffs may recover the defendant’s monetary gains or their own damages (economic and noneconomic, including emotional distress), and may elect statutory damages of $1,500 to $50,000 per work, rising to as much as $250,000 where malice is shown; punitive damages, attorney’s fees, and injunctive relief are also available. Public prosecutors may pursue civil actions seeking injunctive relief, per-violation civil penalties of $25,000 (or $50,000 where the violation is malicious), attorney’s fees, and other relief, and these remedies do not require proof of actual harm. The measure’s remedies are cumulative with those available under other laws. A three-year, discovery-based statute of limitations applies, the provisions are severable, and nothing in the measure is construed to override protections under federal law, including Section 230; internet service providers also receive a safe harbor for merely transmitting or routing third-party content.
In implementation terms, the measure would require providers to establish a process for handling evidence that a deepfake service is operating through them, including a named contact channel and a 30-day window for ceasing service provision, which a court may extend to accommodate investigations. It also establishes a written consent framework for depictions, including a three-business-day rescission window unless specific review or representation conditions apply. The policy context centers on addressing harms from digitized material and on extending liability to intermediaries that enable ongoing deepfake services, while preserving limited protections for lawful disclosures, newsworthy or constitutionally protected expression, and standard ISP activities. The fiscal implications are to be assessed by the Legislature’s fiscal committee, given the new enforcement mechanisms and potential civil penalties; the text describes no explicit funding allocation.
Legislator | Role
---|---
Jacqui Irwin (D), Assemblymember | Bill Author
Marc Berman (D), Assemblymember | Bill Author
Rebecca Bauer-Kahan (D), Assemblymember | Bill Author
Buffy Wicks (D), Assemblymember | Bill Author
Isaac Bryan (D), Assemblymember | Bill Author
Ayes | Noes | NVR (no vote recorded) | Total | Result
---|---|---|---|---
79 | 0 | 1 | 80 | PASS