
How to Fix the Online Child Exploitation Reporting System


The report is based on interviews with 66 respondents across industry, law enforcement, and civil society. Researchers also visited NCMEC's headquarters for three days of extensive interviews.

While our research focuses on CyberTipline challenges, we want to note that many respondents emphasized that the entire CyberTipline process is enormously valuable, and that the U.S. requirement that platforms report CSAM is a strength of the system. "The system is worth nurturing, preserving, and securing," one respondent said.

Findings

Law enforcement officers are overwhelmed by the high volume of CyberTipline reports they receive. However, we find that the core issue extends beyond volume: officers struggle to triage and prioritize these reports to identify offenders and reach children who are in harm's way. An officer might examine two CyberTipline reports, each documenting an individual uploading a single piece of CSAM; upon investigation, one report might lead nowhere, while the other could uncover ongoing child abuse by the uploader. Nothing in the reports would have indicated which should be prioritized.

We identify three key challenges that make it difficult for law enforcement to prioritize reports for investigation.

First, while some tech companies are known for providing careful and detailed CyberTipline reports, many reports are low quality. Executives may be unwilling to dedicate engineering resources to ensuring the accurate completion of fields within the reporting API. Turnover on trust and safety teams, combined with a lack of documented reporting best practices, leads to gaps in institutional knowledge and to inconsistent, less effective reports. This is especially true for platforms that make fewer reports. That said, submitting a high volume of reports is not necessarily correlated with submitting high-quality reports.
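To make the field-completion problem concrete, here is a minimal pre-submission validation sketch. The field names and report structure are hypothetical illustrations, not NCMEC's actual reporting API schema.

from dataclasses import dataclass, field

# Hypothetical required fields; NCMEC's actual schema differs.
REQUIRED_FIELDS = ("incident_time", "uploader_ip", "uploader_email", "file_hash")

@dataclass
class DraftReport:
    fields: dict = field(default_factory=dict)

def validate(report: DraftReport) -> list[str]:
    """Return a list of gaps that would make the report hard to action."""
    problems = []
    for name in REQUIRED_FIELDS:
        if not report.fields.get(name):
            problems.append(f"missing {name}")
    return problems

report = DraftReport(fields={"file_hash": "abc123", "incident_time": ""})
issues = validate(report)
if issues:
    # Surface gaps to engineers before submission rather than shipping a
    # low-quality report that law enforcement must later close out.
    print("Report incomplete:", ", ".join(issues))

A check like this is cheap to run at submission time, which is why respondents frame incomplete fields as a matter of engineering prioritization rather than technical difficulty.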

Second, NCMEC has faced challenges in rapidly implementing technological improvements that would aid law enforcement in triage. NCMEC faces resource constraints and lower salaries, leading to difficulties in retaining personnel, who are often poached by industry trust and safety teams. While there has been progress in report deconfliction—identifying connections between reports, such as identical offenders—the pace of improvement has been considered slow. Additionally, the varied case management interfaces used by law enforcement to process CyberTipline reports make it difficult to ensure linked reports are displayed. Integration difficulties with external data sources, which could enrich reports and facilitate triage, are partly attributed to the sensitive nature of CyberTipline data and potentially to staffing constraints for technical infrastructure upgrades. Legal restrictions on NCMEC's use of cloud services hamper its ability to leverage advanced machine learning tools, although opinions vary on the appropriateness of cloud storage for its sensitive data.
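To illustrate what deconfliction involves, the sketch below links reports that transitively share an identifier (an email, an IP address, or a file hash) using a union-find structure. It is a generic illustration of the technique, not NCMEC's implementation, and all field names are assumed.

from collections import defaultdict

def deconflict(reports: list[dict]) -> list[set[int]]:
    """Group report IDs that are transitively linked by any shared identifier."""
    parent = {r["id"]: r["id"] for r in reports}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}  # (field, value) -> first report ID seen with that identifier
    for r in reports:
        for key in ("uploader_email", "uploader_ip", "file_hash"):
            value = r.get(key)
            if value:
                if (key, value) in seen:
                    union(r["id"], seen[(key, value)])
                else:
                    seen[(key, value)] = r["id"]

    clusters = defaultdict(set)
    for r in reports:
        clusters[find(r["id"])].add(r["id"])
    return list(clusters.values())

reports = [
    {"id": 1, "uploader_email": "a@example.com"},
    {"id": 2, "uploader_email": "a@example.com", "uploader_ip": "203.0.113.7"},
    {"id": 3, "uploader_ip": "203.0.113.7"},
    {"id": 4, "uploader_email": "b@example.com"},
]
print(deconflict(reports))  # [{1, 2, 3}, {4}] -- reports 1-3 likely share an offender

The hard part in practice is not the clustering itself but surfacing the clusters inside the many different case management interfaces officers actually use, which is the display problem the paragraph above describes.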

Third, there are legal constraints on NCMEC's and law enforcement's roles. A federal appeals court held in 2016 that NCMEC is a governmental entity or agent, meaning its actions are subject to Fourth Amendment rules. As a result, NCMEC may not tell platforms what to look for or report: doing so risks turning them into government agents too, converting what once were voluntary private searches into warrantless government searches, which generally leads to suppression of the resulting evidence in court. Consequently, NCMEC is hesitant to put best practices in writing. Instead, many trust and safety staff who are new to the CyberTipline process must learn from more established platforms or industry coalitions.

Another federal appeals court held in 2021 that the government must get a warrant before opening a reported file unless the platform viewed that file before submitting the report. Platforms often do not indicate whether content has been viewed; absent that indication, NCMEC, like law enforcement, cannot open the files. Platforms may automate reports to the CyberTipline on the basis of a hash match against known CSAM instead of having staff view each file, whether due to limited review capacity or a desire not to expose staff to harmful content. Where reported files weren't viewed by the platform, law enforcement may need a warrant to investigate those reports, and NCMEC currently cannot help with an initial review.
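As a minimal sketch of what automated hash-match reporting might look like (the hash list, field names, and report structure are all assumptions, not a real NCMEC API), note that it is the viewed-by-platform flag, not the hash match itself, that determines whether the file can later be opened without a warrant under the 2021 ruling.

import hashlib

# Hypothetical stand-in for an industry hash list of known CSAM.
KNOWN_HASHES = {"<hex digest of a known file>"}

def handle_upload(file_bytes: bytes, reviewed_by_staff: bool = False) -> dict | None:
    """Return a draft report if the upload matches a known hash, else None."""
    digest = hashlib.md5(file_bytes).hexdigest()  # perceptual hashes are also common
    if digest not in KNOWN_HASHES:
        return None
    return {
        "file_hash": digest,
        # Whether a human viewed the file determines whether NCMEC and law
        # enforcement can open it without a warrant (per the 2021 ruling).
        "file_viewed_by_platform": reviewed_by_staff,
    }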

This review requirement makes it difficult to handle the high volume of reported viral and meme content. Such content commonly gets shared widely, for example out of outrage or a misguided attempt at humor; nevertheless, if it meets the definition of CSAM, it is still illegal and must be reported. Platform staff don't always review meme content (to avoid repeated, unnecessary exposure to known material), but if these reports with unviewed files are submitted without checking the CyberTipline report form's box for memes, it creates an enormous amount of work for law enforcement to close out these unactionable reports. Meanwhile, since platforms are required to preserve reported material for only 90 days, the time it takes to process a report means preserved content has often been deleted by the time law enforcement follows up with the platform in actionable cases.
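The sketch below shows the preservation-window arithmetic alongside a toy priority heuristic. The 90-day window and the meme checkbox come from the source; the report structure and the scoring itself are illustrative assumptions.

from datetime import datetime, timedelta, timezone

# The 90-day preservation requirement is from the source; all else is assumed.
PRESERVATION_WINDOW = timedelta(days=90)

def still_preserved(submitted_at: datetime, now: datetime | None = None) -> bool:
    """Past 90 days, a follow-up request to the platform will likely come back empty."""
    now = now or datetime.now(timezone.utc)
    return now - submitted_at <= PRESERVATION_WINDOW

def triage_priority(report: dict) -> int:
    """Lower number = investigate sooner. A toy heuristic, not a real policy."""
    if report.get("viral_meme_flag"):
        return 3  # flagged meme content: close out rather than investigate
    if not still_preserved(report["submitted_at"]):
        return 2  # evidence has likely already been deleted by the platform
    return 1

report = {"submitted_at": datetime.now(timezone.utc) - timedelta(days=100),
          "viral_meme_flag": False}
print(triage_priority(report))  # 2 -- the preservation window has lapsed

Even this toy version shows why the meme checkbox matters: a single unchecked box moves a report from the close-out pile into the queue of reports competing for scarce investigative time.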

Recommendations
