Meta’s AI Generates Excessive Low-Quality Child Abuse Reports, US Investigators Say
Officers from the US Internet Crimes Against Children (ICAC) taskforce have raised concerns that Meta’s use of artificial intelligence (AI) to moderate its social media platforms is producing a large volume of low-quality reports regarding child sexual abuse cases. These reports are reportedly draining resources and slowing investigations, according to testimony and statements from law enforcement officials.
“We get a lot of tips from Meta that are just kind of junk,” said Benjamin Zwiebel, a special agent with the ICAC taskforce in New Mexico, during his testimony last week in a lawsuit alleging that Meta prioritizes profits over child safety. Meta disputes these claims, highlighting platform changes such as teen accounts with default protections. The ICAC taskforce is a nationwide network of law enforcement agencies that coordinates with the US Department of Justice (DoJ) to investigate and prosecute online child exploitation and abuse cases.
Another ICAC officer, speaking anonymously to discuss internal matters, stated:
“Meta is providing thousands of tips each month. It’s pretty overwhelming because we’re getting so many reports, but the quality of the reports is really lacking in terms of our ability to take serious action.” This officer also noted that the total number of cybertips their department received doubled from 2024 to 2025.
The unviable tips originating from Instagram, Facebook, and WhatsApp sometimes contain non-criminal information, according to Zwiebel and two other ICAC officers. In other cases, a tip suggests a crime may have occurred, but the essential images, videos, or text are missing or redacted.
“[Unviable tips from] Instagram have really skyrocketed recently, especially in the last couple of months, and that’s one of the biggest places where we’re seeing this information not being provided,” said the ICAC officer.
“In those cases, we don’t have the information to further the investigation. It weighs on you to know that this crime occurred, but we can’t identify the perpetrator.”
When asked about Zwiebel’s testimony and the ICAC officers’ comments, a Meta spokesperson responded:
“We’ve supported law enforcement to prosecute criminals for years: the DoJ has repeatedly praised our fast cooperation that has helped lead to arrests, and NCMEC has praised our streamlined and ‘improve[d]’ tip reporting process. In 2024, we received over 9,000 emergency requests from US authorities and resolved them within an average of 67 minutes – and even more quickly for cases involving child safety and suicide. Consistent with applicable law, we also report apparent child sexual exploitation imagery to NCMEC and support them to prioritize reports, from helping build their case management tool to labeling cybertips so they know which are urgent.”
The company also highlighted that Zwiebel recommended the use of Meta’s teen accounts feature during his testimony, stating he did so
“because it is the only option available, assuming that teens will not abstain from the use of social media”.
Raúl Torrez, the New Mexico attorney general leading the case against Meta, acknowledged the company’s cooperation in providing leads on child abuse during the trial:
“I do want to credit some of the social media applications and platforms, including Meta, to a certain extent they do report images to the National Center for Missing and Exploited Children.”
Filings Reveal Internal Concerns Over Policing Child Sexual Abuse
Documents released on Friday as part of the lawsuit reveal Meta executives expressed internal alarm in early 2019 about the company’s capacity to police child sexual abuse and alert law enforcement, particularly as the company prepared to enable end-to-end encryption in Facebook Messenger. This encryption would prevent anyone but the intended recipient from accessing messages.
Monika Bickert, Meta’s head of content policy, wrote in internal communications:
“We are about to do a bad thing as a company. This is so irresponsible.” She further noted that encryption would mean
“no way to find the terror attack planning or child exploitation” and that Meta was making
“gross misstatements of our ability to conduct safety operations”.
Additional internal documents estimated that encrypting Messenger would have prevented Meta from proactively providing data to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases, and 9 threatened school shootings.
Andy Stone, a Meta spokesman, said:
“The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats.”
Child safety groups have criticized Messenger’s encryption, which was ultimately rolled out in 2023.
Mass Reporting of Child Abuse Content
Under US law, social media companies must report any detected child sexual abuse material (CSAM) on their platforms to the National Center for Missing & Exploited Children (NCMEC). NCMEC acts as a national clearinghouse for such reports and forwards them to relevant law enforcement agencies across the US and internationally. NCMEC does not have the authority to filter out unviable tips before forwarding them.
Meta is the largest reporter to NCMEC. According to NCMEC’s 2024 data report, Meta submitted 13.8 million reports across Facebook, Instagram, and WhatsApp, out of a total of 20.5 million tips received.
In 2024, over 1 million CyberTipline reports were linked to specific US states and made available to ICAC taskforces and other federal, state, and local law enforcement agencies for investigation.
Meta and other social media companies use AI to detect and report suspicious content and employ human moderators to review some flagged material before reporting it to law enforcement. Previous reporting has shown that content not reviewed by social media employees often cannot be accessed by law enforcement without a warrant, due to Fourth Amendment protections, which can delay investigations.
A Meta spokesperson said:
“It’s unfortunate that court rulings have increased the burden on law enforcement by requiring search warrants to open identical copies of content we’ve already reviewed and reported. Our image-matching system finds copies of known child exploitation at a scale that would be impossible to do manually, and we work to detect new child exploitation content through technology, reports from our community, and investigations by our specialist child safety teams.”
Legislative Changes Lead to Surge in Tips
The REPORT Act (Revising Existing Procedures On Reporting via Technology), effective November 2024, expanded online service providers’ obligations to notify NCMEC’s CyberTipline not only about child sexual abuse material but also about planned or imminent abuse, child sex trafficking, and related exploitation. The act also requires longer evidence preservation and imposes higher penalties for non-compliance.
Since the act’s passage, the number of unviable tips from Meta has increased significantly, possibly due to the company’s efforts to comply with the law, according to two ICAC officers. Many of these tips involve no criminal activity at all, such as adolescent girls discussing which celebrity they find attractive.
Zwiebel stated in court:
“Based on my training and experience, it appears that they are being submitted through the use of AI, as these are common mistakes that an AI would make that a human observer would not.”
In contrast, Zwiebel noted that his department receives fewer tips on legitimate CSAM distribution cases from Meta than in previous years.
Every tip received by an ICAC division must be reviewed, and the influx of unviable tips is diverting time and resources from investigating genuine child abuse cases, according to two officers.
“It is killing morale. We are drowning in tips and we want to get out there and do this work,” said one ICAC officer.
“We don’t have the personnel to sustain that. There’s no way that we can keep up with the flood that’s coming in.”