California recently enacted new legislation aimed at protecting Hollywood actors from being digitally copied by artificial intelligence (AI). Governor Gavin Newsom signed the laws, which give actors the right to exit contracts whose terms about AI use of their voice or likeness are vague. The laws also make it illegal to create digital clones of deceased performers without approval from their estates. These measures were influenced by fears that studios could use AI to replace workers, and by a controversial case involving the unauthorized recreation of the late comedian George Carlin for a comedy special.

The laws received strong support from groups such as SAG-AFTRA and the California Labor Federation, which emphasized the importance of protecting workers in an industry increasingly affected by AI technology. Governor Newsom highlighted the need to safeguard performers' rights while still allowing Hollywood to grow. Critics, including the California Chamber of Commerce, have expressed concern that the legislation might face legal challenges, but advocates believe the laws represent a responsible approach to AI regulation.

California's measures follow Tennessee's earlier legislation, which focused on musicians, and mark the state as a leader in establishing protections against unauthorized AI use. Additional AI-related bills, such as regulations concerning deepfake videos during elections and safety standards for AI models, are still under review, with Newsom required to act on these proposals by September 30. These developments reflect a growing effort to balance innovation with ethical considerations in AI technology.