Stopping Child Sexual Abuse Online Should Start With Law Enforcement

May 10, 2023

Congress has recently introduced multiple bills aiming to crack down on one of the most harmful forms of illegal online content, child sexual abuse material (CSAM), including Sen. Dick Durbin’s (D-IL) bill, the STOP CSAM Act of 2023. This bill, like others before it, attempts to tackle CSAM by placing even higher penalties on the online services that some perpetrators use to commit these crimes. But this proposal is not an effective solution for protecting children and would come at the cost of decreased privacy and security for all users. Instead, Congress should focus on enabling law enforcement to more effectively protect victims and punish the perpetrators who create, solicit, or enable CSAM.

The STOP CSAM Act, introduced on April 19, 2023, includes provisions meant to increase transparency and accountability in CSAM reporting. It requires large online services—those with over 1 million unique monthly visitors and over $50 million in annual revenue—to submit annual reports to the Attorney General and the Chair of the Federal Trade Commission (FTC) detailing their CSAM policies, reporting systems, and other measures to promote a culture of safety for children. The Attorney General and FTC must publish these reports, though online services can request redactions.

Greater transparency in online child safety and CSAM reporting is important: it allows researchers and policymakers to understand the full extent of the problem, the efforts online services take to fight abusive content, and how successful those efforts are. It is equally important, when requiring that online services make that information available to the public, that Congress does not inadvertently provide perpetrators with a playbook on how to evade detection. For this reason, the ability of the Attorney General and FTC to redact information from online services’ annual reports before publishing them, and of online services to request redactions, is necessary. However, it makes less sense for the transparency obligations in the STOP CSAM Act to apply only to large online services. Just as much harm to children can take place on smaller services, and perpetrators could easily move to smaller services if committing crimes on larger services becomes even more difficult than it already is.

Other useful measures included in the STOP CSAM Act increase privacy protections for child victims and witnesses in federal court and make it easier for states to receive federal funding to establish Internet Crimes Against Children (ICAC) task forces to assist law enforcement in fighting child abuse.

Unfortunately, the STOP CSAM Act also includes a few provisions that would be detrimental to online safety, privacy, and free expression.

The bill requires online services to report and remove planned or imminent child exploitation, which, as defined by current law, could encompass a wide range of online content and activities that may or may not indicate child exploitation. For example, online communication between two adults about transporting a child across state lines could indicate a potential crime, but it more likely indicates that the child is going on vacation or visiting a family member. Because of the high criminal penalties in the bill—to the tune of $150,000 for an initial violation and $300,000 for subsequent violations—and the tight 48-hour turnaround for responding to notices, this provision would incentivize online services to remove all content that could indicate planned or imminent child exploitation, even if the content is innocuous.

In addition to criminal penalties, the bill creates a civil penalty of up to $100,000 for failure to report and remove content and up to $1 million for failure to submit an annual report. Moreover, it prevents online services from using Section 230 of the Communications Decency Act—a law that protects online services and users from facing legal consequences for hosting or sharing third-party content—as a defense for failing to remove content. The bill also contains broad language, making it a crime not only to “intentionally” or “knowingly” host or store CSAM or promote or facilitate child exploitation but also to do so “recklessly” or “negligently.” This would open the door to a flood of litigation over any activity by an online service that could lead to child exploitation, even if that activity is in no way meant to facilitate criminal behavior. Arguably, an online service could be liable merely for being generally aware that its service could be used to exploit children, which would be the case for virtually every online service that allows communication between users or image hosting and sharing.

Like another anti-CSAM bill, the EARN IT Act, which was recently reintroduced for the third time, the STOP CSAM Act could undermine end-to-end encryption, an important cybersecurity tool that ensures only the sender and recipient of a message can view its contents. The bill could create liability for online services that offer end-to-end encryption on the grounds that they acted negligently because they cannot scan and monitor their users’ communications for CSAM or planned or imminent child exploitation. This potential loss of end-to-end encryption would significantly undermine privacy and security for users.
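To illustrate why an end-to-end encrypted service cannot scan its users’ messages, consider a minimal sketch in Python using the open-source PyNaCl library. The library choice, key names, and message are illustrative assumptions for this example, not details drawn from the bill or any particular messaging service; real messengers layer more machinery (such as key ratcheting) on the same core property: the relaying service never holds a decryption key.

    # Minimal sketch of end-to-end encryption (illustrative assumption:
    # PyNaCl's public-key "Box" construction; real messengers use more
    # elaborate protocols built on the same principle).
    from nacl.public import PrivateKey, Box

    sender_key = PrivateKey.generate()      # stays on the sender's device
    recipient_key = PrivateKey.generate()   # stays on the recipient's device

    # The sender encrypts with their private key and the recipient's public key.
    ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"See you at 6.")

    # The service relaying the message handles only this ciphertext. Without
    # a private key, it has nothing it can scan or monitor.
    print(ciphertext.hex())

    # Only the recipient, holding the matching private key, can decrypt.
    plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
    assert plaintext == b"See you at 6."

A service built this way could comply with a scanning mandate only by weakening or abandoning this design, which is why a negligence standard for undetected content puts end-to-end encryption itself at risk.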

A one-pager on the STOP CSAM Act from the Senate Judiciary Committee opens by citing statistics showing that, in recent years, the number of victims identified in CSAM has risen almost ten-fold and reports to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline have increased by a factor of 77. An increase in the creation and distribution of CSAM online—likely due to greater use of the Internet for not only legitimate activity but also, unfortunately, criminal activity—could be partially responsible. But the rise is just as likely due to advancements in technology that allow online services to scan for CSAM and identify victims more effectively and efficiently, along with heightened efforts by online services to report CSAM.

Rather than imposing additional requirements on online services that, according to the statistics in the Senate Judiciary Committee’s one-pager, already submit millions of reports of CSAM to the CyberTipline every year, Congress could more effectively address online child exploitation by giving law enforcement more resources to investigate reports and prosecute the perpetrators responsible for creating, spreading, or enabling this exploitation. In addition to increasing funding for efforts such as ICAC task forces, which the STOP CSAM Act does, Congress should increase funding for police technology and training to keep pace with perpetrators who continually update their methods to evade detection.

Laws designed to increase safety always involve tradeoffs. However, many of the recent bills meant to target CSAM, including the STOP CSAM Act, strike the wrong balance and would sacrifice a significant level of privacy and free speech for, in some cases, marginal safety gains. Congress would see more success by removing the provisions in the STOP CSAM Act that would disproportionately damage privacy, security, and free speech and by taking additional measures to aid law enforcement in protecting victims and bringing perpetrators to justice.
