As the pandemic has pushed kids to use the internet more for school and socialization, children’s media safety advocates warn that social media giants’ business models are a key hurdle in combating cyberbullying.
Kids’ and teens’ reliance on social media platforms to interact amid the pandemic, combined with remote schooling that has put distance between students and the teachers tasked with monitoring them, has exacerbated often toxic and dangerous online environments, according to experts.
But online hate speech faced by minors has prompted rare bipartisan agreement as well as a new legal battle in California federal court that could test the bounds of a controversial law that has shielded tech giants.
Jean Rogers, director of the Children’s Screen Time Action Network at the Campaign for a Commercial-Free Childhood, said young users can feel “boxed in” to the platforms when facing hateful interactions online.
“They don’t see a way out. It’s certainly not easy because their world — and especially COVID having made it so much more difficult — is keeping them online. And we know it’s not the school’s fault. It’s not the parents’ fault. It’s the business model, it’s profit that’s keeping them on these platforms, and keeping them with the techniques that make them overuse them,” Rogers told The Hill.
The companies have created a “social media validation loop” that can be particularly damaging for adolescents, said KK Myers, a co-founder of the Minnesota-based organization LiveMore ScreenLess.
“As we put them on those platforms, it is preying on their vulnerability, it is taking advantage of their developmental stage where they think, ‘This is where my friends are, so I need to stay here. If I’m not there, I’m going to be left out.’ So all of those feelings that we’ve all experienced as an adolescent are now baked into those apps to keep young people there,” Myers said.
Nearly every school shut its doors last year. The shift to remote schooling added more levels of separation between teachers and students, and with it additional hurdles to combating bullying, Rogers noted. Outside of a traditional classroom, it can be more difficult for a teacher to monitor student interactions.
Among the more challenging aspects of cyberbullying are spaces that allow for anonymous comments and chat rooms, experts said.
One such anonymous question-and-answer platform, “Yolo,” is facing a legal challenge from a mother who lost her son to suicide last year.
The lawsuit was filed by Kristin Bride, the mother of Carson Bride, a 16-year-old who died by suicide in June. Carson Bride received 105 anonymous messages on Yolo between Jan. 23 and June 22, with the majority coming in during June, just before his death, according to the complaint.
Of the 105 anonymous messages Carson Bride received, 62 included “content that was meant to humiliate him, often involving sexually explicit and disturbing content,” according to the complaint.
Yolo was available for integrated use on Snapchat. A Snapchat spokesperson confirmed that Yolo was suspended “in light of the serious allegations raised by the lawsuit,” but did not comment further on the allegations.
The lawsuit filed in California federal court alleges the companies violated consumer protection law.
Snapchat says it has a zero-tolerance policy toward bullying, harassment and intimidation on the platform.
Carson Bride’s web history on his phone from the morning of his death allegedly shows searches for “Reveal YOLO username Online.” Earlier in the month, he had searched websites for “YOLO reveal” and texted a friend asking about the identities of the anonymous senders, according to the complaint.
“YOLO developed, designed, and distributed the anonymous messaging feature to minor users, despite the known dangers and the foreseeability of injury and wrongful deaths caused by its services. In this way, YOLO failed to exercise the duty of care owed to Carson and other users,” the complaint states.
Many cases against tech companies over content posted by third parties are dismissed under Section 230 of the 1996 Communications Decency Act.
The law has come under attack from lawmakers on both sides of the aisle in recent months, and there are Republican- and Democratic-backed proposals aimed at reforming the law that could hold the companies legally responsible for certain third-party content.
For example, a proposal introduced by Democratic senators earlier this year would remove the protections in cases where users face cyberstalking, targeted harassment and discrimination.
In the House, Reps. Jan Schakowsky (D-Ill.) and Kathy Castor (D-Fla.) introduced a proposal earlier this month that would require social media platforms to establish and disclose terms of service, and then allow the Federal Trade Commission to take action against companies that violate the measure. And across the aisle, Rep. Jim Banks (R-Ind.) introduced a bill that would remove protections from companies that “knowingly peddle unlawful material” such as child pornography on their sites.
Nearly half of minors ages 9 to 17 said they have had potentially harmful experiences online, and 38 percent said those included experiences that bullied them or made them feel uncomfortable, according to a 2020 study released by Thorn, a nonprofit organization that builds technology to defend minors from sexual abuse. The survey was conducted between Oct. 25 and Nov. 11.
Snapchat and Instagram, two of the most popular apps among the respondents, were the platforms with the highest share of survey participants reporting a potentially harmful online experience, at 26 percent each.
The issue was even more prevalent among LGBTQ minors, with 33 percent reporting potentially harmful online experiences on Snapchat and 29 percent on Instagram.
A spokesperson for Snapchat said they “really appreciate the extensive findings” in the report and will make updates to address some of the concerns raised.
“In recent months, we have been increasing our in-app education and support tools for Snapchatters, working to revamp our in-app reporting tools, putting in place additional protections for minors, and expanding resources for parents. After reviewing this research, we are making additional changes to make us be even more responsive to the issues raised by the report,” the spokesperson said in a statement.
A Facebook company spokesperson said they “appreciate Thorn’s research and value our collaboration with them.”
The spokesperson also defended the company’s “meaningful progress on these issues,” including a recent update that restricts direct messages between teens and adults they don’t follow on Instagram.
“The real difficulty with cyberbullying as opposed to in-person bullying is that kids never get a break,” said Nusheen Ameenuddin, chair of the American Academy of Pediatrics’ Council on Communications and Media.
“It can happen all the time, and kids still feel pressure to have accounts because that’s how they interact with other kids and the rest of the world. But anyone can create a fake account and if you block one account, another one pops up and so it really is kind of like playing whack-a-mole,” she added.
Facebook is drawing even more pushback over its plans to launch an Instagram for kids, a platform that would allow users under 13 years old to join. The company has touted it as a way to keep underage users from circumventing the age requirements and joining the main platform. But advocates and elected officials from both parties have widely slammed the plans and urged Facebook to abandon the idea, in part over concerns about cyberbullying.
Facebook has said any Instagram platform it launches for kids will not allow ads.
Last week, Sens. Ed Markey (D-Mass.) and Bill Cassidy (R-La.) joined together to introduce a bill that would update children’s online privacy rules. And at a recent House Energy and Commerce Committee hearing with tech CEOs, lawmakers leveled questions at the executives about policies to protect children.
Myers and Maree Hampton, the co-founders of LiveMore ScreenLess, agree there has to be some form of legislation to help combat cyberbullying.
But they said the youth who are being impacted also need to be elevated to lead the way. LiveMore ScreenLess aims to do that by working with local districts in Minnesota to develop plans to use peer educators to speak to younger students.
“Any prevention program will be incomplete if only championed and led by school personnel. Students themselves must realize that their individual and collective voice is powerful,” Myers said.
With most schools nationwide now open and expected to operate in person in the fall, experts hope schools can provide classroom environments that are better suited to mitigating cyberbullying.
“Certainly there are ways that teachers are able to use platforms to support what they’re doing in class, but I think when they go back there is going to be sort of a revolution of in-person learning, because they, the teachers themselves, have been starved for it. The teachers don’t want to be replaced by the online curriculum,” Rogers said. “I think they’re going to be more aware of, ‘we need a balance here.’ ”