A federal judge blocked one of California’s new AI laws on Wednesday, less than two weeks after it was signed by Gov. Gavin Newsom. Shortly after signing AB 2839, Newsom suggested it could be used to force Elon Musk to take down an AI deepfake of Vice President Kamala Harris that he had reposted (which sparked a small online spat between the two). Now, however, a federal judge has ruled that the state cannot force people to take down election deepfakes, at least not yet.
AB 2839 targets people who distribute AI deepfakes on social media, specifically when a post resembles a political candidate and the poster knows it is a fake that may confuse voters. The law is unusual in that it goes after not the platforms where AI deepfakes appear, but the people who spread them. AB 2839 empowers California judges to order those posters to take the deepfakes down or potentially face monetary penalties.
Perhaps not surprisingly, the original creator of that AI deepfake (an X user named Christopher Kohls) sued to block California’s new law as unconstitutional just one day after it was signed. Kohls’ lawyer argued in the complaint that the Kamala Harris deepfake is satire that should be protected by the First Amendment.
On Wednesday, U.S. District Judge John Mendez sided with Kohls, issuing a preliminary injunction that temporarily bars California’s attorney general from enforcing the new law against Kohls or anyone else, with the exception of AB 2839’s provisions on audio messages.
Read for yourself what Judge Mendez said in his decision:
“Almost any digitally altered content, when left in the hands of an arbitrary individual on the Internet, could be considered harmful. For example, AI-generated estimates of voter turnout could be considered false content that reasonably undermines confidence in the outcome of an election under this statute. On the other hand, many ‘harmful’ depictions, when shown to a variety of people, may not ultimately influence electoral prospects or undermine confidence in an election. As the plaintiff persuasively points out, AB 2839 ‘relies on several poorly drafted mens rea and subjective terms,’ which has the effect of implicating large amounts of political and constitutionally protected speech…
[W]hile a well-founded fear of a digitally manipulated media landscape may be justified, this fear does not give lawmakers unbridled license to lay waste to the long tradition of criticism, parody, and satire protected by the First Amendment. YouTube videos, Facebook posts, and X tweets are today’s newspaper ads and political cartoons, and the First Amendment protects an individual’s right to speak regardless of what new medium these critiques may take. Other legal causes of action, such as privacy torts, copyright infringement, or defamation, already provide recourse to public figures or private individuals whose reputations may be harmed by artificially altered depictions spread by satirists or opportunists on the Internet…
The record demonstrates that the State of California has a strong interest in preserving election integrity and addressing artificially manipulated content. However, California’s interest and the difficulties faced by the State are minimal when compared to the seriousness of the First Amendment values at stake and the continued constitutional violations that the plaintiff and other similarly situated content creators experience while their speech is stifled.”
In essence, he ruled that the law, as written, is simply too broad and could lead to serious overreach by state authorities in deciding what speech is and is not permitted.
Because this is a preliminary injunction, we’ll have to wait and see if this California law is truly blocked for good, but either way it’s unlikely to have much effect on next month’s election. AB 2839 is one of 18 new AI-related laws Newsom signed in the last month.
Still, it’s a big win for Elon Musk and the free speech posters at X. In the days after Newsom signed AB 2839 into law, Musk and his usual allies posted a series of AI deepfakes that put California’s new law to the test.