
Parent anger at social media companies boils over ahead of tech CEO hearing 

The Senate is hauling in the CEOs of social media companies Wednesday to grill them over online harms to children, but parents and advocates said the time for talking is over and Congress must act to protect children and teens.

Parents who became advocates after losing their children to harms they say were created by social media companies will be among the crowd at Wednesday’s Judiciary Committee hearing. The hearing will feature testimony from Meta CEO Mark Zuckerberg, TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino, Snap CEO Evan Spiegel and Discord CEO Jason Citron.

The hearing is centered on the online sexual exploitation of children, but advocates said the harms extend to how social media companies amplify cyberbullying and the spread of harmful content that promotes eating disorders and self-harm.

“I hope this is the last hearing where we talk about the problems of unregulated social media on children and teens,” said Josh Golin, executive director of Fairplay, a nonprofit focused on kids’ online safety. 

“We’ve had a number of these, and I think they can be illuminating, and some really important points are made, but ultimately what’s going to save children’s lives and create a safer, less addictive internet for kids is not senators dunking on CEOs at hearings, but actually voting for legislation,” he added. 

A coalition of teens, parents and other advocates will attend the hearing to make a push for the Kids Online Safety Act (KOSA), a bipartisan bill that would add regulations for social media companies like the five in the hot seat Wednesday.

While advocates aren’t shying away from slamming tech companies as taking too little action to mitigate the risks posed by their services, they also place blame on lawmakers for failing to pass rules that would hold the companies accountable.

“There’s no question that this should be a priority of everyone, anybody who cares about a child,” Christine McComas told The Hill.

McComas’s daughter, Grace, died by suicide at 15 in 2012 following a sexual assault and subsequent cyberbullying on Twitter, the platform now known as X and owned by Elon Musk.  

McComas will be at the hearing Wednesday, carrying posters of her daughter’s face overlaid with the hateful comments posted about her before she died.

Parent advocates are “committed to sharing” their voices, McComas said, but it is the “hardest thing to keep going out there and telling your story and not having anything happen.”  

“To think that you were heard and then find out for whatever reason nothing was done again. It’s a moral mandate really to get this done,” she said.

Momentum behind efforts to regulate how social media companies operate for minors online has built over the past few years, especially since Facebook whistleblower Frances Haugen came forward in October 2021.

Despite hearings with Haugen, a second Meta whistleblower, the head of Instagram, the CEO of TikTok, executives from TikTok, YouTube, and Snapchat, and others over the last couple of years, lawmakers have yet to pass laws to protect kids online.

Advocates say the lack of rules has left that job up to parents, a role they say is impossible for them to play.

McComas said watching harms happen to your child online and not being able to prevent it is like a “slow-motion car crash.” At the time her daughter was experiencing cyberbullying, McComas said, she didn’t even have a Twitter account or a smartphone herself.

Neveen Radwan said her then-15-year-old daughter developed an eating disorder in 2020 after she was shown content promoting dangerous behavior and challenges while searching for workout videos during the COVID-19 pandemic.

Even though Radwan was proactively trying to mitigate risks through limiting screen time and using her background in IT to set up strong settings on her kids’ phones, it wasn’t enough, she said.

“I thought I had all the loopholes covered. And yet, nope, I was the one that fell through the cracks,” Radwan told reporters at a virtual press conference organized by the Tech Oversight Project.

“They can blame parents all they want. There is nothing that we can do to fight those algorithms. The parents have no recourse. No matter what we do, we can’t fight those algorithms — they’re the big bad wolf,” Radwan said.  

Social media companies have broadly pushed back on criticism that they are not doing enough to protect children and tout the policies they have put in place.  

Meta, the parent company of Facebook and Instagram, has faced criticism about harms to teens online. The company released a series of updates to its policies ahead of the hearing, including restricting self-harm and eating disorder content from teen users, and released a framework for legislation that pushes for proposals that add more parental controls.

Discord will aim to distance itself from the other companies by describing how its platform’s business model differs from theirs — focusing on chatrooms rather than an algorithm that pushes content — and tout its safety features for teens, which include a “sensitive media” filter to blur content and a “nudge” when teens are interacting with a stranger who doesn’t have any shared friends, according to a Discord spokesperson.

Snap broke from the other companies testifying when it came out in support of KOSA last week.

Although KOSA was introduced through the Commerce, Science and Transportation Committee, it’s likely to emerge as part of the debate at the Judiciary hearing about how to regulate the companies.

The bill’s lead sponsors, Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), are also members of the Judiciary Committee and will push for its passage Wednesday as part of a week of engagement to move the bill forward.

“Big Tech executives have been making the same hollow promises for years as kids are suffering horrific harms online. Without real and enforceable reforms, social media companies will only continue publicly pretending to care about young people’s safety while privately prioritizing profits,” the senators said in a joint statement.  

KOSA would create a duty of care for social media companies to prevent and mitigate harms to minors. It would also require companies to perform an annual independent audit to assess risks to minors and compliance with the rules.

KOSA advanced out of the Commerce Committee in July with bipartisan support, along with COPPA 2.0, a bill that updates data privacy rules for minors. Both bills advanced last Congress, too, but were not called for floor votes.

Golin said the bill needs to be passed this Congress.

“Kids are dying every single day because social media is not regulated,” he said.  

“What you’ve seen up until now is parents politely asking for online safety legislation. I think you’re going to start seeing real anger if this continues to drag on,” he added.