Law enforcement struggles to prosecute AI-generated child pornography, asks Congress to act

Rep. Anna Paulina Luna, R-Fla., speaks with reporters after a meeting with House Majority Leader Steve Scalise of La., on Capitol Hill, Wednesday, Oct. 11, 2023, in Washington. (AP Photo/Andrew Harnik)

Law enforcement is struggling to prosecute abusive, sexually explicit images of minors created by artificial intelligence (AI), Rep. Anna Paulina Luna (R-Fla.) told fellow members at a House Oversight subcommittee hearing Tuesday.

Laws against child sexual abuse material (CSAM) require “an actual photo, a real photograph, of a child, to be prosecuted,” Carl Szabo, vice president of the nonprofit NetChoice, told lawmakers. With generative AI, ordinary photos of minors are being turned into fictitious but sexually explicit content.

“Bad actors are taking photographs of minors, using AI to modify into sexually compromising positions, and then escaping the letter of the law, not the purpose of the law but the letter of the law,” Szabo said.

Attorneys general from all 50 states wrote a bipartisan letter urging Congress to “study the means and methods of [AI] used to exploit children” and to “propose solutions to deter and address such exploitation to protect America’s children.”

The letter called on Congress to “explicitly cover AI-generated CSAM” to enable prosecutors.

“This is actually something that the FBI, in talking to them about cybercrimes, asked us to specifically look up because they are having issues currently prosecuting these really gross, sick individuals; because, technically, a child is not hurt in the process, because it is a generated image,” Luna said.

The Hill has reached out to the FBI for comment.

Although AI-generated CSAM currently represents a small portion of the abusive content circulating online, the ease of use, versatility and highly realistic nature of AI programs mean their use for CSAM will likely grow, John Shehan, vice president of the Exploited Children Division at the National Center for Missing & Exploited Children (NCMEC), said.

Lawmakers and witnesses frequently cited research from the Stanford Internet Observatory, which found that generative AI is enabling the creation of more CSAM and that training data for publicly available AI models has been tainted with CSAM.

NCMEC provides “the nation’s centralized reporting system for the online exploitation of children,” known as the CyberTipline. Only five generative AI companies have submitted reports to the tip line to date, despite an “explosion” in the number of apps or services available, according to Shehan.

“State and local law enforcement are having to deal with these issues, because the technology companies are not taking the steps on the front end to build these tools with safety by design,” he said.

Shehan also noted that “nudifying” or “declothing” AI applications and web services are especially egregious when it comes to the generation of CSAM.

“None of the platforms that offer ‘nudify’ or ‘unclothe’ apps have registered to report to NCMEC’s CyberTipline; none have engaged with NCMEC regarding how to avoid creation of sexually exploitative and nude content of children and none have submitted reports to NCMEC’s CyberTipline,” he said.

“The sheer volume of CyberTips has often prevented law enforcement from pursuing proactive investigations at first that would efficiently target the most egregious offenders,” Rep. Nick Langworthy (R-N.Y.) said.

“In only a three-month period from November 1, 2022, to February 1, 2023, there were over 99,000 IP addresses throughout the United States that distributed known CSAM, and only 782 were investigated. Currently, law enforcement, through no fault of their own, they just don’t have the ability to investigate, prosecute the overwhelming number of these cases,” Langworthy added, referring to information from previous testimony by John Pizzuro, CEO of nonprofit Raven, during a February 2023 Senate Judiciary hearing on protecting children online.
