SAG-AFTRA Praises CA Bill Regulating AI Usage of Dead Performers
SAG-AFTRA is praising the California state Senate for passing a bill that restricts the use of AI-generated digital replicas of deceased performers.
The actors union shared in a statement shortly after the passage of AB 1836 on Saturday, “For those who would use the digital replicas of deceased performers in films, TV shows, videogames, audiobooks, sound recordings and more, without first getting the consent of those performers’ estates, the California Senate just said NO. AB 1836 is another win in SAG-AFTRA’s ongoing strategy of enhancing performer protections in a world of generative artificial intelligence. The passing of this bill, along with AB 2602 earlier this week, build on our mosaic of protections in law and contract.”
“Both of these bills have been a legislative priority for the union on behalf of our membership and beyond, making explicit consent in California mandatory,” the statement continued. “We look forward to these bills being signed by Governor Gavin Newsom.”
The bill now heads to Gov. Gavin Newsom’s desk, and he will have until the end of September to decide whether to sign it into law, veto it or allow it to become law without his signature. Earlier this week, the state Senate also passed AB 2602, which tightens consent requirements for digital replicas of living performers.
SAG-AFTRA has long championed AI protections at the legislative level, notably since the union’s 118-day strike last year, in which provisions governing AI in its contracts with Hollywood studios and streamers were a central issue.
At the federal level, bipartisan lawmakers have also been working on AI-protection bills, including the NO FAKES Act, which is intended to protect actors, singers and others from having AI programs generate their likenesses and voices without their informed written consent. Lawmakers have also introduced the No AI Fraud Act, which would prohibit the publication and distribution of unauthorized digital replicas, including deepfakes and voice clones.