AB-602, passed by the California State Senate on September 12, 2019, will, if signed by the governor, create a private right of action against persons who create or disclose sexually explicit material depicting another person through the use of “deepfake” technology. Specifically, the cause of action may be brought against a person who creates and intentionally discloses sexually explicit material when that person knows, or reasonably should know, that the depicted individual did not consent to its creation or disclosure, or against a person who did not create the material but intentionally discloses it knowing that the depicted individual did not consent to its creation.
Sponsors of the bill envision it applying in two distinct scenarios: (1) where a person’s face is superimposed on another’s body in such a way as to suggest that the person is engaging in sexually explicit conduct, and (2) where a mainstream filmmaker digitally alters a scene to make it appear as though an actor engaged in sexually explicit activity when, in fact, he or she did not.
Issues AB-602 Seeks to Resolve
Deepfake technology (“deepfake” being a portmanteau of “deep learning” and “fake”) is used for many purposes, including political commentary and parody. However, it is often used nefariously to depict individuals engaging in sexual acts in which they did not actually engage. It is these sexually explicit depictions that AB-602 seeks to prevent.
Once sexually explicit deepfakes proliferate online, a person’s reputation may be irreparably damaged, and the person may suffer deep shame, humiliation, and emotional distress. Additionally, such proliferation can result in long-lasting economic harm by tainting the depicted person’s professional image to such a degree that he or she becomes unemployable. Thus, AB-602 was introduced to provide victims of such deepfakes with a cause of action that affords sufficient redress.
Limitations That Could Make AB-602 Difficult for Plaintiffs to Employ Successfully
AB-602 is limited in notable ways. First, the bill appears to protect only persons whose faces are superimposed on another’s body, not the person whose body is shown engaging in the sexually explicit conduct. Additionally, claims against online platforms are likely preempted by the federal Communications Decency Act, 47 U.S.C. § 230, which shields internet content providers from liability for unlawful content posted by users of their services. Therefore, if the party who disclosed the unlawful content is difficult to find or is judgment-proof, victims may be unable to obtain meaningful redress for their injuries.
Moreover, under this bill the rights of the depicted individual are likely extinguished upon his or her death, as the legislation does not provide for the enforcement of rights on behalf of deceased persons. This creates two potential problems. First, it effectively prevents altered depictions of an individual from being used after death, because it creates no mechanism for successors or assigns to consent on behalf of the deceased individual. Second, it contains no provision that would allow a depicted individual’s heirs and assigns to bring a cause of action on the deceased party’s behalf.
Finally, certain provisions within the statute are vague and ambiguous. Notably, the bill defines “consent” as “an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.” But what does “plain language” mean if an individual enters into a complex legal agreement? Does the complexity of such a contract render the “consent” invalid? Moreover, what constitutes a “general description” of “sexually explicit material” is currently unclear. Finally, how is a prospective plaintiff to prove that a defendant “knew” the plaintiff did not, in fact, consent? Proving a negative is difficult, and for this reason there is uncertainty about AB-602’s potential as an effective remedial tool for plaintiffs.
AB-602’s purpose is a noble one; however, its effectiveness in practice remains uncertain, as it imposes a difficult standard for prospective plaintiffs to meet and raises the numerous interpretive issues noted above.
Nicholas Schmidt, Privacy Law and Resolving ‘Deepfakes’ Online, International Association of Privacy Professionals (Jan. 30, 2019), https://iapp.org/news/a/privacy-law-and-resolving-deepfakes-online/.
James Vincent, Watch Jordan Peele Use AI to Make Barack Obama Deliver a PSA About Fake News, The Verge (Apr. 17, 2018), https://www.theverge.com/tldr/2018/4/17/17247334/ai-fake-news-video-barack-obama-jordan-peele-buzzfeed.
James Adams, The Naked Truth About Deepnudes: The Stripping of Dignity and Democracy, Spectator (July 10, 2019), https://spectator.us/naked-truth-about-deepnudes/.
Concurrence in Senate Amendments (Sept. 6, 2019), https://leginfo.legislature.ca.gov/faces/billHistoryClient.xhtml?bill_id=201920200AB602.
David White, Deepfake Technology Is an Attack on Consent and Actors’ Rights to Control Sex Scenes (Guest Blog), The Wrap (Feb. 21, 2019), https://www.thewrap.com/deepfake-tech-consent-actors-rights-sex-scenes/.
Assembly Committee on Judiciary, Depiction of Individual Using Digital or Electronic Technology: Sexually Explicit Material: Cause of Action (Sept. 12, 2019), https://leginfo.legislature.ca.gov/faces/billHistoryClient.xhtml?bill_id=201920200AB602.
AB-602 (Sept. 12, 2019), https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602.