Governor of California Signs Nine Bills Governing AI-Generated Content
The legislation aims to mitigate the risks associated with deepfakes.
Recently, California Governor Gavin Newsom signed nine measures aimed at addressing the dangers of artificial intelligence (AI)-generated content, specifically deepfakes. Ahead of his September 30 deadline to act on bills from this session, 29 more pieces of AI-related legislation await his signature or veto.
The most contentious of them is SB 1047, which would require AI developers to implement safeguards to reduce the risk that their technology causes or enables a catastrophe, such as a cyberattack or a mass-casualty event. Newsom has so far been noncommittal about whether he intends to sign it.
Among the measures he has signed are two laws intended to prevent the unauthorized use of actors’ and performers’ digital likenesses in AI and other digital technologies:
• AB 1836 prohibits the commercial use of digital replicas of deceased performers in films, TV shows, video games, audiobooks, sound recordings, and other media without the consent of the performers’ estates.
• AB 2602 requires a contract for any use of an AI-generated replica of a performer’s voice or likeness, and the performer must be professionally represented in negotiating that contract.
“Our North Star has always been to protect workers, but we continue to wade through uncharted territory when it comes to how AI and digital media are transforming the entertainment industry,” Newsom stated.
“This legislation strengthens protections for workers and how their likeness can or cannot be used, while ensuring the industry can continue to thrive.”
Three of the signed measures target the misuse of AI-generated content, particularly sexually explicit deepfakes:
• SB 926 makes it a crime to create and distribute sexually explicit images of a real person that appear authentic when done with the intent to cause that person serious emotional distress.
• SB 981 requires social media platforms to provide a mechanism for users to report sexually explicit deepfakes of themselves; once a report is made, the platform must temporarily block the material while it investigates and permanently remove it if the report is confirmed.
• SB 942 requires generative AI systems to place invisible watermarks on the content they produce and to provide free tools to detect them so that AI-generated content can be easily identified.
“Nobody should be threatened by someone on the internet who could deepfake them, especially in sexually explicit ways,” Newsom said. “We’re in an era where digital tools like AI have immense capabilities, but they can also be abused against other people. We’re stepping up to protect Californians.”
Four of the bills address the use of deepfakes and other deceptive, digitally created or altered content in political campaigns.
• AB 2655 requires large online platforms, during specified periods around an election, to remove or label deceptive or digitally altered political content and to provide a mechanism for users to report it.
• AB 2839 expands the window of time during which political entities are prohibited from knowingly distributing advertisements or other election materials that contain deceptive AI-generated content.
• AB 2355 requires political advertisements that use AI-generated content to disclose that the content has been altered.
• AB 2905 requires robocalls to disclose to recipients when the voice is artificially generated.
“Safeguarding the integrity of elections is essential to democracy, and it’s critical that we ensure AI is not deployed to undermine the public’s trust through disinformation especially in today’s fraught political climate,” Newsom said. “These measures will help to combat the harmful use of deepfakes in political ads and other content, one of several areas in which the state is being proactive to foster transparent and trustworthy AI.”