Meta whistleblower to testify in Senate hearing on child safety, social media

Senators will hear testimony Tuesday from a former Facebook employee who is alleging the company failed to act on reports of harassment and harm facing teens on the platform.  

Arturo Bejar, a former Facebook engineering director who later worked as a company consultant, will testify before a Senate subcommittee about social media and its impact on the teen mental health crisis, the panel announced Friday.

The testimony comes amid a bipartisan push in Congress to adopt regulations aimed at protecting kids online. 

Bejar has also met with Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), the lead sponsors of the Kids Online Safety Act (KOSA), according to the senators’ offices.  

“Arturo’s first-hand knowledge and damning evidence prove that Meta has put profits ahead of the safety and wellbeing of millions of teenagers, with a deadly toll on young people and families. It is clear that Facebook’s leadership will not act to make its platforms safer for users without a mandate,” the senators said in a joint statement.  


The senators added it is “time to say ‘enough is enough’ to Big Tech” and pass their legislation to address the harms from tech companies’ actions toward children.  

KOSA would create a duty of care for social media platforms to prevent and mitigate harm to minors. Although it has bipartisan support, there have been lingering concerns about whether the legislation could harm LGBTQ youth if attorneys general weaponize it to censor certain resources for young people.  

Children’s online advocacy group Fairplay also called for KOSA to be passed in light of the new allegations. 

“Arturo Bejar’s brave whistleblowing is yet more evidence that Meta not only knows Instagram’s design is harmful to young people, but it refuses to implement measures its own experts say could make the platform safer for children and teens,” Fairplay executive director Josh Golin said in a statement. 

“Meta’s continual and blatant disregard for children’s wellbeing is exhibit A for why we need the Kids Online Safety Act,” he added. 

An advocacy group called the Real Facebook Oversight Board, which is not associated with the official Meta Oversight Board, said Bejar’s allegations “should be the push that gets Congress to act on bipartisan legislation to safeguard” youth, such as KOSA and a new version of the Children’s Online Privacy Protection Act (COPPA). 

COPPA 2.0 would update the privacy rules for minors and add new regulations, such as banning targeted advertising to children. 

Both proposals have advanced out of the Senate Commerce Committee with bipartisan support, as they did last year, but have not been called for floor votes. 

Bejar’s allegations were revealed in a Wall Street Journal report published Friday. The former Facebook engineering director returned to the company in 2019 for a two-year consulting contract after seeing the harms his teen daughter and her friends experienced on Instagram.

During his two-year contract, Bejar and a team worked to create a questionnaire to evaluate user experience on the platform, according to the Journal. The questionnaire led to a report he sent to top Meta executives, including CEO Mark Zuckerberg, then-Chief Operating Officer Sheryl Sandberg, Chief Product Officer Chris Cox and Instagram head Adam Mosseri.  

The BEEF questionnaire, short for “Bad Emotional Experience Feedback,” found that users were 100 times more likely to tell Instagram they witnessed bullying in the past week than Meta’s bullying-prevention statistics indicated they would, according to the Journal.  

The survey also found that “bad experiences” were particularly common among teen users on Instagram. For example, 26 percent of users under 16 recalled having a bad experience in the last week due to witnessing hostility against someone based on their race, religion or identity, according to the report. 

The survey also revealed more than a fifth of users younger than 16 felt worse about themselves after viewing others’ posts. Thirteen percent said they experienced unwanted sexual advances in the past week, according to the Journal.  

Meta spokesperson Andy Stone pushed back on the allegations.  

“It’s absurd to suggest we only started user perception surveys in 2019 or that there’s some sort of conflict between that work and prevalence metrics. Prevalence metrics and user perception surveys measure two different things. We take actions based on both and work on both continues to this day,” Stone said in a statement. 

Stone also noted the company has rolled out specific features as a result of user perception surveys, including adding notifications about “hurtful comments or content,” and reminders for users to be respectful in direct messages. 

He also said the BEEF survey was global and did not include a definition of what constituted an unwanted sexual advance. 

Bejar’s scheduled testimony before the Senate panel comes shortly after dozens of states sued Meta over allegations of knowingly designing and deploying features that harmed young users. 

Bejar has been consulting with state attorneys general who filed suits against Meta late last month, the Journal reported.  

Meta pushed back on the allegations from the states. 
