Congress' Blame Game Won't Keep Children Safe Online

February 6, 2024

Congress’ most recent “Big Tech” hearing, which the Senate Committee on the Judiciary held on January 31, 2024, on online child sexual exploitation, highlighted everything wrong with the current debate surrounding children’s online safety. Members of the committee spent much of their time on largely unrelated issues, including potential threats to democracy from interference by foreign actors on social media. And when the focus did turn to protecting children, the conversation quickly became a blame game rather than a discussion of solutions. These conversations do little to solve the real problems facing children and parents, and they ignore the significant progress tech companies have made.

To date, Congress’ “Big Tech” hearings have followed the same template, generating criticism for grilling the CEOs of high-profile tech companies instead of questioning subject matter experts on how regulation could either solve or create problems for children, parents, and tech companies. During last week’s hearing, multiple senators accused the CEOs in attendance of causing harm to children and called out specific examples of harm facilitated via online services. In turn, multiple witnesses—who included Linda Yaccarino of X (formerly Twitter), Shou Chew of TikTok, Evan Spiegel of Snap, Mark Zuckerberg of Meta, and Jason Citron of Discord—downplayed the presence of children on their services.

This game of placing and shifting blame detracts from real efforts to prevent child sexual exploitation, protect victims, and punish perpetrators. Harm to children undoubtedly does occur online, much as it does in the physical world. The millions of reports online services send to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline every year attest to this. Those reports are made possible in part by advancements in technology that allow online services to more effectively and efficiently scan for child sexual abuse material (CSAM) and identify victims, along with heightened efforts by online services to report CSAM. Congress could more effectively address online child exploitation by giving law enforcement more resources, including technology, to investigate reports and prosecute the perpetrators responsible for creating, spreading, or enabling this exploitation.

Another feature of Congress’ “Big Tech” hearings is the misrepresentation of evidence. Existing research demonstrates a lack of scientific consensus on how social media affects children, yet during last week’s hearing and other conversations surrounding children’s online safety, policymakers spoke as though social media’s effects on children were universally and irrefutably harmful. Unfortunately, policymakers in many states have also legislated from this perspective, passing laws that impose privacy risks and free speech restrictions on all users in an attempt to keep children off social media.

The current debate surrounding children on social media mirrors the tech panics of yesteryear, including the comparatively recent debate surrounding children’s use of video games. As was the case then, regulating out of fear and a desire to assign blame leads to policies that create more problems than they solve. Effective regulation should focus not only on mitigating risks but also on maximizing benefits. Until lawmakers take this approach, their conversations on children’s online safety will continue to produce many sound bites but few tangible results.
