AP Technology

European lawmakers try to balance protection and privacy with law on explicit images of children

FILE - FBI Supervisory Special Agent Stacey Bradley and Special Agent "Jackie" of the Innocent Images Unit monitor possible child pornography offenders in an online chat room on May 4, 2006, in Calverton, Md. European Union lawmakers adopted Tuesday, Nov. 14, 2023, a series of proposals to amend a bill from the EU’s executive arm to fight online child pornography as they tried to find the right balance between protecting children and protecting privacy. (AP Photo/Matt Houston)

BRUSSELS (AP) — Seeking to strike the right balance between protecting children and protecting privacy rights, European Union lawmakers on Tuesday adopted a series of amendments to a draft law that is intended to keep sexually explicit photos and videos of minors from circulating online.

The draft position adopted overwhelmingly by the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs would require internet providers to assess the risk of their services being used for the sexual abuse or exploitation of children, and to take steps to mitigate those threats.

But to “avoid generalized monitoring of the internet,” lawmakers proposed excluding end-to-end encrypted material from detection, while making sure time-limited detection orders approved by courts can be used to hunt down illegal material when mitigation actions are not sufficient.

They said they “want mitigation measures to be targeted, proportionate and effective, and providers should be able to decide which ones to use.”

Their position now needs to be endorsed by the whole Parliament before further negotiations involving EU member countries can take place.

Reports of online child sexual abuse in the 27-nation bloc have increased from 23,000 in 2010 to more than 1 million in 2020. A similar increase has been observed globally, with reports of child abuse on the internet rising from 1 million to almost 22 million between 2014 and 2020, and over 65 million images and videos of children being sexually abused identified.

The European Commission proposed last year to force online platforms operating in the EU to detect, report and remove the material. Voluntary detection is currently the norm and the Commission believes that the system does not adequately protect children since many companies don’t do the identification work.

Digital rights groups had immediately warned that the Commission’s proposal appeared to call for widespread scanning of private communications and would discourage companies from providing end-to-end encryption services, which scramble messages so they’re unreadable by anyone other than the sender and recipient, and are used by chat apps such as Signal and WhatsApp.

The Computer and Communications Industry Association, a big tech lobbying group, praised the committee’s proposed measures that “narrow scanning obligations, safeguard end-to-end encryption of communications and strengthen more targeted mitigation measures.”

“Indeed, the ‘cascade approach’ adopted by Parliament would first have online service providers assess risks and then take action to mitigate those,” the group said. “The tech industry commends this approach, just like the important clarification that detection orders will only be issued as a last-resort measure by a competent judicial authority, and have to be targeted and limited.”

Lawmakers from across the political spectrum also welcomed the changes to the initial proposal.

“There will be no such thing as general scanning of communications, and nothing will undermine end-to-end encryption,” said Hilde Vautmans, the Renew Europe group’s negotiator on the regulation. “This agreement is a major step forward in making the internet a safer place for children whilst upholding fundamental rights.”

The Parliament committee also wants pornography sites to implement appropriate age verification systems, mechanisms for flagging child sexual abuse material and human content moderation to process these reports.

“To stop minors being solicited online, MEPs propose that services targeting children should require by default user consent for unsolicited messages, have blocking and muting options, and boost parental controls,” the Parliament said in a statement.

To help providers better identify abuse, the Commission had proposed the creation of an EU Center on Child Sexual Abuse, similar to the National Center for Missing and Exploited Children, a U.S. nonprofit reference center that helps families and exploited victims.

Lawmakers approved the idea. The center would work with national authorities and Europol to implement the new rules and help providers to detect abuse materials online.

“The center would also support national authorities as they enforce the new child sexual abuse rulebook, conduct investigations and levy fines of up to 6% of worldwide turnover for non-compliance,” they said.