The Views That Kill
Self-harm content viewed online contributed to the death of a teenager in the United Kingdom, prompting a rare verdict against the platforms that served it to her
Molly Russell, like many other teenagers, was addicted to social media. But unlike most other teens, 2,100 of the 16,300 Instagram posts she had liked, shared or saved were about self-harm. That isn’t all. The 14-year-old also had 5,793 pin impressions and 2,692 close-ups of similar content on Pinterest. The deadly content was marketed to her as ‘safe.’ She consumed it for six months before her death in 2017. Her ‘killers’ have been identified as Instagram and Pinterest.
Out there in cyberspace, there are many people like Molly, endlessly viewing disturbing content. It is available to them when it should not be. Are your alarms going off?
“Recommender systems optimize for high user engagement. They serve content to users based on their past interactions, clicks and likes. The goal of recommender systems is not to optimize for the health of their user base, but instead to increase the amount of time a user spends on the platform,” explains Srijan Kumar, a computer scientist and an assistant professor at the Georgia Institute of Technology.
While this increases the ad revenue for the platform, it can have severe consequences, including promoting self-harm, misinformation, hate speech and undesirable propaganda.
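Kumar’s description of engagement-first recommendation can be made concrete with a short sketch. The Python below is a simplified, hypothetical illustration built on assumed toy data (invented tags and an invented engagement score), not any platform’s actual system: candidate items are ranked purely by how closely they match a user’s past interactions, so the feed keeps surfacing more of whatever the user already dwells on, and nothing in the objective accounts for the user’s wellbeing.

```python
# Hypothetical sketch of engagement-first ranking; not any platform's real code.
# Items similar to what the user already interacted with score highest and get
# recommended again, regardless of whether the content is harmful.
from collections import Counter


def predicted_engagement(item_tags, user_history):
    """Score an item by how often its tags appear in the user's past interactions."""
    tag_counts = Counter(tag for tags in user_history for tag in tags)
    return sum(tag_counts[tag] for tag in item_tags)


def rank_for_engagement(candidate_items, user_history, top_k=3):
    """Return the candidates the user is predicted to engage with most."""
    return sorted(
        candidate_items,
        key=lambda item: predicted_engagement(item["tags"], user_history),
        reverse=True,
    )[:top_k]


if __name__ == "__main__":
    # A user whose recent likes and saves cluster around one theme keeps being
    # served more of the same, because the objective never penalizes harmful
    # content or rewards the user's health.
    history = [["self-harm", "sad"], ["self-harm"], ["music"]]
    candidates = [
        {"id": 1, "tags": ["self-harm", "sad"]},
        {"id": 2, "tags": ["sports"]},
        {"id": 3, "tags": ["music", "concert"]},
    ]
    print(rank_for_engagement(candidates, history))
```

A safety-aware variant would filter or down-rank flagged categories before serving results, which is the kind of design change critics quoted here say platforms should prioritize.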
“The content people consume online drastically shapes their opinions, which is especially amplified for teenagers. Our recent work showed that consuming online misinformation increases anxiety,” says Kumar, who was named to the Forbes ‘30 Under 30’ list for his work on social media safety and integrity. He says there is evidence that online hate speech translates into real-world violence, and that consuming self-harm content may induce suicidal ideation.
Romanticising self-harm
In Molly’s case, the London coroner, while holding that Meta Platforms Inc.’s Instagram and Pinterest had played “more than a minimal part” in her death, ruled that the platforms’ algorithms operated in a way that led, in some circumstances, to binge periods of images, video clips and text that “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help” the girl.
“If their (tech giants’) algorithm studies users’ likes and throws up similar content, why do they not ensure that a flag is raised when endangering material or content on self-harm or suicide is being searched for, saved or watched by a particular user, especially a child user?” wonders cyberpsychologist Nirali Bhatia.
She says many teenagers face peer pressure, conflict with parents, academic pressure and more, which can push a child into loneliness, low self-worth, mood swings, aggression, withdrawal or even helplessness. “They find escape in the online world and could be drawn towards looking for content which they believe would ease their pain or make them feel they are not alone,” says Bhatia, adding that it is the responsibility of these sites to flag extreme content and even redirect users to more positive and hopeful content.
Communication gap
During the hearings in Molly’s case, executives from Meta and Pinterest were questioned about whether the algorithms used on their platforms had worsened the girl’s mental health. The verdict came after a two-week inquest into whether social media played a role in her death.
According to Mukesh Chaudhary, a cyber security expert, the problem is that children are becoming more tech savvy while the older generation knows very little about the Internet. “As a result, there is a communication gap and children are hesitant to share their thoughts or concerns with their parents. Children are free to do whatever they want on the Internet,” he believes.
He recommends using parental monitoring software to restrict certain websites and monitor children’s activities. “Parents must stay up-to-date on the latest cyber trends. Being your child’s confidant is the only way to keep them from self-harming,” Mukesh adds.
Self-harm is among the leading causes of death among young people worldwide, including in India. In the US, a series of lawsuits has been filed against Big Tech by young people who claim social media addiction caused them to develop serious mental health issues.
Preventing tragedy
Srijan Kumar says a multi-pronged approach is needed to prevent more tragic incidents. “First, internet platforms need to prioritize user health over user engagement. Second, digital education needs to be emphasized at all levels, and teenagers need to be informed about the potentially harmful consequences of social media use,” says the cyberspace expert.
- Molly Russell excelled in school and was passionate about the performing arts. Everything about her appeared normal from the outside
- The London coroner concluded that it “would not be safe” to call Molly’s death in 2017 a suicide, ruling instead that it was “an act of self-harm whilst suffering from depression and the negative effects of online content.”
- Molly’s father Ian Russell is campaigning for better protection against potentially dangerous social media algorithms.
“Over the past few years, we’ve continued to strengthen our policies around self-harm content. Molly’s story has reinforced our commitment to creating a safe and positive space for our pinners.” --- Pinterest, after the verdict
“Facebook is misleading the public on progress against hate speech, violence and misinformation. Facebook harms children, sows division and undermines democracy in pursuit of breakneck growth and astronomical profits.” — Frances Haugen, Facebook whistle-blower
Throughout high school in the US, Christopher James Dawley, known as CJ, became dependent on social media. He frequently stayed up messaging people on Instagram until three in the morning, lost sleep and developed an obsession with his body image. On January 4, 2015, he texted his best friend, "God's speed," updated his Facebook status with the question, "Who turned out the light?", and fatally shot himself, a .22-caliber weapon in one hand and his smartphone in the other. He was 17. Police found a suicide note scrawled on his college acceptance letter. The Dawley family has since filed a lawsuit against Snap, the parent company of Snapchat, and Meta, the parent company of Facebook and Instagram, accusing them of designing their platforms to addict users through their algorithms.