
Bipartisan AI bill aims to give journalists, artists control over content  

Sen. Maria Cantwell (D-Wash.) arrives for an all-senators meeting to hear Ukrainian President Volodymyr Zelensky discuss future aid for the war effort on Sept. 21, 2023.

Journalists, songwriters and other artists would be given control over whether their content is used to train artificial intelligence (AI) systems under a new bipartisan bill introduced Thursday in the Senate.  

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act, or COPIED Act, would require platforms that develop or share AI systems and apps to allow users to attach content provenance information to their work within two years.  

Content provenance is defined in the bill as machine-readable information documenting the origin and history of a piece of digital content. The legislation would prevent companies from using work with such labels to train an AI system or to create synthetic content without the users’ consent and compensation.  

It would also create a cause of action allowing owners of covered content and state attorneys general to bring legal action against entities that improperly use the content.  

The bill would also direct the National Institute of Standards and Technology to develop guidelines and standards for content provenance information, watermarking and synthetic content detection. Those standards aim to promote transparency and to identify whether content was generated by AI and where AI-generated content originated.  

The bill was introduced by Senate Commerce Committee Chair Maria Cantwell (D-Wash.), Sen. Marsha Blackburn (R-Tenn.) and Sen. Martin Heinrich (D-N.M.). Heinrich was a member of the Senate AI working group that Senate Majority Leader Chuck Schumer (D-N.Y.) assembled.  

Cantwell said the bill will provide “much-needed transparency around AI-generated content.”  

“The COPIED Act will also put creators, including local journalists, artists and musicians, back in control of their content with a provenance and watermark process that I think is very much needed,” Cantwell added.  

The proposed legislation followed mounting warnings from groups that represent artists and media about the potential dangers of AI using creators’ work without their consent or compensation.  

SAG-AFTRA, a major actors’ union that secured AI protections in its recent contract negotiations, endorsed the bipartisan bill.  

“The capacity of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members,” Duncan Crabtree-Ireland, national executive director and chief negotiator of SAG-AFTRA, said in a statement.  

“We need a fully transparent and accountable supply chain for generative Artificial Intelligence and the content it creates in order to protect everyone’s basic right to control the use of their face, voice, and persona,” Crabtree-Ireland added.  

The Recording Academy, National Music Publishers’ Association, News/Media Alliance, and National Newspaper Association were also among groups to endorse the bill.  

It is the latest proposal introduced in Congress to regulate AI and address the wide array of challenges the technology poses, from impacts on creators to potential national security threats.  

Congress, however, has not yet voted on legislation aimed at creating guardrails around the development and deployment of AI.  

In May, Schumer released an AI road map for regulation with Heinrich and other members of the AI working group. The road map called for at least $32 billion to be allocated for nondefense AI innovation, but it was light on calls for specific regulation.