Technology

Instagram to launch ‘sensitivity screen’ that blocks images of self-harm

Instagram this week will launch “sensitivity screens” to blur out images of self-harm posted to the platform, a move that comes a month after a British family claimed their daughter killed herself after viewing images of self-harm on Instagram and Pinterest.

The father of Molly Russell, a 14-year-old British girl who killed herself in 2017, told multiple media outlets including The Sun that he blamed Instagram in part for her death. He told The Sun that her social media accounts were full of graphic images of cutting and self-harm.

Instagram’s head of product, Adam Mosseri, wrote in an op-ed for The Telegraph that Russell’s death spurred the company last week to launch an internal review of its ability to protect young people from harmful images.

“We are not yet where we need to be on the issues of suicide and self-harm,” Mosseri wrote, following sharp criticism in the U.K., including by the British health secretary, over Russell’s death. “We need to do everything we can to keep the most vulnerable people who use our platform safe.” 

Mosseri noted that while Instagram’s current guidelines do not permit posts that “promote or encourage suicide or self-harm,” he does not believe the company has done a good enough job of taking down such posts before users see them.

Some critics have pointed out that Instagram’s algorithms surface related images and accounts, so if users “like” or interact with one post displaying suicidal or violent content, Instagram will steer them toward similar posts.

Mosseri said this is one of the challenges that a team of “engineers and trained content reviewers” is working to address.

“We have engineers and trained content reviewers working around the clock to make it harder for people to find self-harm images,” Mosseri wrote. “We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions.”

He then pointed to the “sensitivity screens” as an example of Instagram’s efforts to weed out triggering content. The screens will blur posts that Instagram has reviewed and found to contain images of cutting or other self-harm.

“Starting this week we will be applying sensitivity screens to all posts we review that contain cutting self-harm imagery that is not in violation of our guidelines,” an Instagram spokesperson said in an email to The Hill. “Our guidelines allow for people to share that they are struggling, but we remove posts that promote such activity.”

Mosseri wrote that the images with “sensitivity screens” will “not be immediately visible, which will make it more difficult for people to see them.” 

Instagram did not respond to The Hill’s request for more information on how the self-harm images will be flagged as requiring a “sensitivity screen.”