
Google, Meta, Snap, Microsoft Criticize EU Over Child Abuse Law Expiry

The EU Parliament blocked extension of a law allowing tech firms to scan for child sexual abuse, causing a legal gap experts warn will reduce abuse reports and increase risks globally. Google, Meta, Snap, and Microsoft criticized the decision and pledged voluntary scanning.

[Image: People walking outside the EU building]

Experts Warn of Sharp Decline in Abuse Reports Following EU Law Expiry

The European Parliament has blocked the extension of a law that allowed major technology companies to scan their platforms for child sexual exploitation, resulting in a legal gap that child safety experts warn will cause many crimes to go undetected.

The law, introduced in 2021 as a temporary derogation from the EU's ePrivacy rules, permitted companies to use automated detection technologies to identify harms such as child sexual abuse material (CSAM), grooming, and related offenses. However, it expired on April 3 after the European Parliament chose not to extend it amid privacy concerns raised by some lawmakers.

The resulting regulatory gap has created uncertainty for large tech firms. While scanning for harmful content is now prohibited, companies remain obligated under the Digital Services Act to remove any illegal content hosted on their platforms. In response, Google, Meta, Snap, and Microsoft announced in a joint statement on a Google blog that they would continue voluntarily scanning their platforms for CSAM.

“We are disappointed by this irresponsible failure to reach an agreement to maintain established efforts to protect children online,” the statement said.

The European Parliament stated it is prioritizing work on preventing and combating child sexual abuse online and that negotiations on a permanent legal framework are ongoing, though no timeline for agreement or implementation has been provided.

Child protection advocates warned that the lapse of this legislation could lead to a significant decline in reports of child sexual abuse. They cited a similar legal gap in 2021, during which reports of such material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) dropped by 58% over a comparable period.

“When detection tools are disrupted, we lose visibility that directly impacts our ability to find and protect child sexual abuse victims,” said John Shehan, vice-president at NCMEC, a US-based organization that serves as a clearinghouse for child abuse reports and forwards them to law enforcement agencies worldwide. “When detection goes dark, the abuse doesn’t stop.”

In 2023, NCMEC received over 29 million reports from around the world containing more than 61.8 million images, videos, and other files of suspected child abuse. Approximately 90% of those reports pertained to countries outside the United States.


A spokesperson for the European Parliament declined to comment on whether the legislative body conducted any assessments regarding the consequences of the law’s lapse.

Child safety experts indicated that the EU’s decision to prohibit scanning will have global repercussions. Many internet crimes are cross-border, with offenders sending illegal images or targeting children in other countries. Shehan noted that “sextortionists,” who impersonate romantic interests to coerce individuals into sending intimate photos before blackmailing them, may exploit the legal uncertainty.

“The offender can be anywhere in the world, but they could have unfettered access to minors in the EU now that there’s legal uncertainty around those safeguards and protections to identify when a child is being groomed,” Shehan said.

Years of Negotiations Culminate in Expiry of Key Legislation

The proposed child sexual abuse regulation had been under negotiation for four years, with contention arising from obligations it would impose on companies to minimize risks on their platforms, according to Hannah Swirsky, head of policy and public affairs at the Internet Watch Foundation, a UK-based child safety nonprofit.

Privacy advocates argue that scanning messages for child abuse infringes on fundamental privacy rights and data security for EU citizens, equating such measures to “chat control” that could lead to mass surveillance and false positives.

“There are claims of surveillance or infringement of privacy,” Swirsky said. “Blocking CSAM is not an invasion of privacy. Free speech does not include sexual abuse of children.”

The scanning technology employs machine learning to detect patterns identifying known images or videos of abuse, as well as language associated with child exploitation, without storing any data, explained Emily Slifer, director of policy at Thorn, a nonprofit that develops child abuse detection technology commonly used by companies and law enforcement.

The system operates by having trained analysts review known CSAM obtained from external sources such as police reports, public tips, or investigations into websites known for hosting child abuse material. When analysts confirm content as illegal child sexual abuse, they generate a unique digital fingerprint—known as a hash value—that identifies the exact image. Lists of hash values are then shared with platforms, which use automated systems to scan uploads and block matching content instantly without requiring human review.
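The matching step described above can be sketched in a few lines. This is a simplified illustration, not any platform's actual implementation: production systems typically use perceptual hashes (such as Microsoft's PhotoDNA) that tolerate re-encoding and resizing, whereas the plain SHA-256 fingerprint shown here only matches byte-identical files. All function names and sample values below are hypothetical.

```python
import hashlib

def hash_value(data: bytes) -> str:
    """Compute a digital fingerprint (hash value) for an uploaded file."""
    return hashlib.sha256(data).hexdigest()

# A shared hash list, as would be distributed to platforms after analyst
# review. The value here is computed from placeholder bytes, not real content.
known_illegal_hashes = {hash_value(b"analyst-confirmed-sample")}

def screen_upload(data: bytes) -> bool:
    """Return True if an upload matches the hash list and should be blocked.

    Only the fingerprint is compared; the upload itself is not inspected
    by a human and is not stored by this check.
    """
    return hash_value(data) in known_illegal_hashes

print(screen_upload(b"analyst-confirmed-sample"))  # True: blocked
print(screen_upload(b"ordinary-holiday-photo"))    # False: allowed
```

Because the comparison is between fixed-length fingerprints rather than the files themselves, platforms can block matching uploads instantly and at scale, without human review of each file.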

“The technology doesn’t find babies in bathtubs and things like that. If you just think of what an image of abuse would look like versus what consensual content would look like: those are two very different pieces of material, and technology can determine those patterns between them,” Slifer said.

While the EU has prohibited scanning for child abuse, it has allowed tech companies to voluntarily scan messages for the detection of terrorist content under legislation adopted in 2021, Slifer added.

“The EU is effectively risking open doors for predators,” Swirsky said. “If the EU is serious about protecting children online, then it needs to agree on a permanent legislative framework for safeguarding children and for enabling detection.”

This article was sourced from The Guardian.
