Technology

‘Shocked’: Senator Opens Inquiry into Ways Tech Companies Report Suspected Child Abuse

In 2025, Amazon’s AI services division filed 1.1 million reports of suspected online child abuse with the advocacy group that handles them. But because those reports lacked key information, law enforcement was unable to act on a single one. A new Senate investigation aims to ensure that does not happen again.

Sen. Chuck Grassley, Republican of Iowa and chairman of the Senate Judiciary Committee, this week opened an investigation into eight major technology companies over their handling of mandatory reports of online child abuse. It’s the latest step in a growing push to question whether tech companies can be trusted to keep their young users safe online.

Providers of electronic services are required by law to report incidents of child sexual exploitation to the CyberTipline operated by the National Center for Missing and Exploited Children. In 2025, more than 17 million reports of online child sexual exploitation were filed. But many of those reports lack the information needed to prompt action in the real world.

“What I read is shocking to me,” Grassley said. “Based on the information provided to my office, I am concerned that some companies have not provided NCMEC and law enforcement with sufficient information necessary to protect children and prosecute suspected predators.”


Grassley sent requests for more information to eight major tech companies: Meta, TikTok, Roblox, Snap, Amazon AI Services, xAI, Grindr and Discord. Together, those companies account for 81% of all child abuse reports submitted to NCMEC. Notably absent from the list was Google, the owner of YouTube.

A Meta spokesperson told CNET that the company is “working tirelessly” to protect children from this “horrific crime,” adding: “We are committed to continuous improvement and appreciate the feedback, which has already led to improvements, as NCMEC has acknowledged. We will continue to improve our reporting process.”

Grindr, Discord and Roblox made similar comments, saying they plan to work with the Senate and NCMEC on these issues. Grindr added that its dating app is restricted to adults 18 and older. The other technology companies did not immediately respond to requests for comment.

The Iowa Republican’s investigation follows reports from NCMEC in 2025 that tech companies were failing to include important location data in their reports and failing to disclose whether child sexual abuse material appeared in their AI training data. The latter concern stems largely from earlier cases of AI being used to create nonconsensual intimate imagery, including child sexual abuse material.

Child abuse online is a growing problem. In 2025, Meta alone filed nearly 11 million reports, 1.2 million of which concerned alleged child trafficking. Meta owns the popular platforms Facebook, Instagram and WhatsApp. NCMEC said in 2025 that Meta and xAI have improved their reporting, though gaps remain.

“Many ESPs consistently inflate the number of reports they send to the CyberTipline, but fail to disclose that millions of those reports lack basic information,” NCMEC wrote to Grassley in 2025. “This leaves children unprotected online, subjects survivors to continued abuse, allows sex offenders to remain free online and wastes valuable and limited law enforcement resources.”

Some branches of government have already moved to hold technology companies accountable for child safety. Meta was recently found liable by a New Mexico jury for misleading users about the safety of its platforms and failing to prevent child exploitation, and was ordered to pay $375 million in damages. A day later, a California judge found Meta and Google liable for designing social networks that are addictive to children.

On Tuesday, the first person was convicted under America’s new anti-deepfake law, the Take It Down Act, for creating AI-generated child sexual abuse material.
