In a recent government meeting, California officials underscored the state's pivotal role in shaping legislation that affects not only local governance but also global standards, particularly in the realms of elections, artificial intelligence (AI), and social media. The discussions highlighted the "California effect" and the "Brussels effect," terms that describe how laws enacted in these jurisdictions can influence practices and regulations worldwide.
Key legislative proposals were presented, including Assembly Bill (AB) 2655, introduced by Assemblymember Berman, which would require social media platforms to label election-related deep fakes that could mislead voters. Another significant bill, AB 2839, proposed by Chair Pellerin, seeks to prohibit the dissemination of deceptive deep fakes in political advertising close to election dates.
Additionally, Assemblymember Wicks introduced AB 3211, which would establish watermarking standards for AI-generated content. The bill is intended to help tech companies comply with the deep fake regulations above by requiring them to embed invisible watermarks in both AI-generated and authentic content.
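The meeting summary does not describe how invisible watermarking works mechanically, but a toy sketch can make the concept concrete. The Python snippet below hides a short provenance tag in the least-significant bits of a raw pixel buffer; the function names and tag format are invented for this illustration, and it is only a conceptual sketch, not an implementation of any standard the bill would adopt.

```python
# Toy illustration of invisible watermarking: hide a length-prefixed
# provenance tag in the least-significant bits (LSBs) of raw pixel data.
# Hypothetical names and format; real provenance schemes (e.g., signed
# metadata in the style of C2PA) are far more robust than LSB hiding.

def embed_watermark(pixels: bytearray, tag: bytes) -> bytearray:
    """Return a copy of `pixels` with `tag` hidden in the LSBs."""
    payload = len(tag).to_bytes(2, "big") + tag  # 2-byte length prefix
    # Flatten the payload into individual bits, most significant first.
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("pixel buffer too small for watermark")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite the least-significant bit
    return out

def extract_watermark(pixels: bytes) -> bytes:
    """Recover the length-prefixed tag from the LSBs of `pixels`."""
    def read_bytes(start: int, count: int) -> bytes:
        value = 0
        for i in range(count * 8):
            value = (value << 1) | (pixels[start + i] & 1)
        return value.to_bytes(count, "big")
    length = int.from_bytes(read_bytes(0, 2), "big")
    return read_bytes(16, length)  # tag bits begin after the 16-bit prefix

if __name__ == "__main__":
    fake_image = bytearray(range(256)) * 16           # stand-in pixel data
    tagged = embed_watermark(fake_image, b"ai-generated:model-x")
    print(extract_watermark(tagged))                  # b'ai-generated:model-x'
```

Naive LSB hiding like this does not survive re-encoding, cropping, or compression, which is part of why real provenance standards lean on signed metadata and more resilient signals; the sketch only shows the basic embed-and-detect idea behind a watermarking mandate.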
The meeting also referenced the February 2024 AI elections accord, signed by 20 major AI companies in Munich, which included commitments to apply watermarks to AI-generated content. However, concerns were raised about the accord's lack of accountability mechanisms and compliance timelines, fueling skepticism about the effectiveness of voluntary commitments. Notably, OpenAI was acknowledged for adhering to some of these commitments, in contrast with the broader industry's performance.
As the meeting concluded, officials emphasized the importance of ongoing legislative efforts to safeguard democracy, urging collaboration on future bills to ensure the integrity of elections. The discussions reaffirmed California's leadership in setting standards that could resonate beyond its borders, reinforcing the state's critical role in the global political landscape.