In an effort to combat the rise of AI-generated deepfakes, US lawmakers on Thursday, July 11, introduced a bill that would require online platforms to let users tag such synthetic content with information about its origin. This requirement is one of several transparency measures proposed in the bill, which seeks to make the detection and authentication of AI-generated content much easier.
Titled the Content Origin Protection and Integrity from Edited and Deepfaked Media Act, or COPIED Act, the draft bipartisan bill was authored by Maria Cantwell and Martin Heinrich, US senators from the Democratic Party, along with Marsha Blackburn, a Republican senator.
The problem of non-consensual deepfakes online came to a head when explicit AI-generated images of American musician Taylor Swift surfaced on X (formerly Twitter) in January this year. The incident prompted calls for the US Senate to pass new legislation to tackle the growing menace of deepfake abuse.
“The COPIED Act will also put creators, including local journalists, artists and musicians, back in control of their content with a provenance and watermark process that I think is very much needed,” US Senator Cantwell said in a statement.
What is the scope of the draft bill?
For many years, experts have pointed out that the first hurdle in identifying deepfakes is determining what is or isn’t a deepfake.
Notably, the COPIED bill defines the term ‘deepfake’ as “synthetic content or synthetically modified content that appears authentic to a reasonable person and creates a false understanding or impression.” This would include images, videos, audio clips, and text that have either been wholly generated or significantly modified using AI tools, as per the bill.
The provisions of the bill will apply to any online app or platform with US-based customers, provided the platform has generated an annual revenue of $50 million or registered over 25 million monthly active users for three consecutive months.
What are the requirements for platforms?
Under the COPIED bill, companies responsible for developing or deploying AI models would need to give users the option to attach content provenance information to AI-generated synthetic content. This means that platforms such as Instagram or Google Search would need to let users tag an AI-generated image with contextual information, such as its source and edit history, in a machine-readable format.
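The bill itself does not prescribe a particular data format, so the following Python sketch is only a hypothetical illustration of what "machine-readable" provenance information could look like. Every field name and value below is an assumption made for illustration; none of it is drawn from the bill or from any existing standard.

```python
# Hypothetical sketch of machine-readable provenance metadata for a
# synthetic image. Field names are illustrative only; the bill leaves
# the actual format to standards bodies such as NIST.
import json
from datetime import datetime, timezone

provenance = {
    "asset_id": "img_0001",                     # identifier for the tagged file (hypothetical)
    "synthetic": True,                          # wholly or partially AI-generated
    "generator": "example-image-model-v1",      # hypothetical tool name
    "created_at": datetime.now(timezone.utc).isoformat(),
    "source_assets": [],                        # upstream material, if any
    "edits": ["background replaced"],           # history of significant modifications
    "rights_holder": "Example Newsroom",        # who controls the terms of use
}

# Serialized alongside (or embedded in) the file so platforms and
# detection tools can parse it automatically.
print(json.dumps(provenance, indent=2))
```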
This requirement also extends to digital copyrighted content, and the bill gives platforms two years to comply.
In a crucial step, the bill proposes to make it illegal for anyone to remove or tamper with the content provenance information added to AI-generated synthetic content. However, the bill makes an exception in cases where content provenance information is removed for research purposes.
It also prohibits platforms from using copyrighted material with content provenance information for training AI models or generating content.
“These measures give content owners – journalists, newspapers, artists, songwriters, and others – the ability to protect their work and set the terms of use for their content, including compensation,” read an official press release. The bill proposes to give artists, newspapers, broadcasters, and other content owners the explicit right to take infringing companies to court.
How will the detection and labeling process work?
Pushing for a public-private partnership approach, the bill tasks various US government agencies, such as the US National Institute of Standards and Technology (NIST), the US Patent and Trademark Office (USPTO), and the US Copyright Office, with developing detection and watermarking standards that are voluntary and consensus-based.
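To give a sense of what "watermarking" synthetic content can mean in practice, here is a toy Python sketch that hides a short provenance tag in the least significant bits of an image array. It is a minimal illustration only: the standards the bill envisions would be worked out by NIST and the other agencies and would be far more robust, and none of the function or tag names below come from the bill.

```python
# Toy invisible watermark: hide a short text tag in pixel least significant bits.
# Illustrative only; real watermarking standards are far more robust.
import numpy as np

def embed_tag(pixels: np.ndarray, tag: str) -> np.ndarray:
    """Write the UTF-8 bits of `tag` into the lowest bit of each pixel value."""
    bits = np.unpackbits(np.frombuffer(tag.encode("utf-8"), dtype=np.uint8))
    flat = pixels.flatten()
    if bits.size > flat.size:
        raise ValueError("image too small to hold the tag")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite LSBs with tag bits
    return flat.reshape(pixels.shape)

def extract_tag(pixels: np.ndarray, length: int) -> str:
    """Read `length` bytes of tag data back out of the pixel LSBs."""
    bits = pixels.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes().decode("utf-8")

# Usage: tag a synthetic image and read the tag back.
image = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
tag = "ai-generated:2024-07-11"
tagged = embed_tag(image, tag)
print(extract_tag(tagged, len(tag)))  # prints the embedded tag
```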
It further proposes organizing competitions, with grand prizes on offer, for developing tools that aid in the labeling and detection of AI-generated content. The bill also proposes public awareness campaigns about synthetic content and deepfakes.
According to the press release, the COPIED bill has been backed by several labor unions, industry associations, and media organizations, including the Screen Actors Guild (SAG-AFTRA), the National Music Publishers’ Association, the Songwriters Guild of America (SGA), the National Association of Broadcasters, and The Seattle Times, among others.