Davine Lee still has the birthday present she bought for her friend Molly five years ago: an unopened box containing a hoodie emblazoned with her favourite TV show, left untouched at home.
She was 14 when she learned of her friend’s death, having seen her empty chair at school and wondered where she might be.
Even now, she says, the news doesn’t make any sense.
Warning: Some readers may find the content of this story distressing.
Molly Russell, a seemingly happy teenager from Harrow, north-west London, was found dead in her bedroom in November 2017, a day after rehearsing with Davine for the show in which she had been cast.
It was later discovered that she had seen a great deal of content online related to suicide, depression and anxiety.
In a landmark inquest ruling in September, the coroner found that she did not die by suicide but from “acts of self-harm while suffering from depression and the negative influence of online content”.
Dealing with the death of a friend in this way, especially at such a young age, is a particularly complex form of grief.
“Having to lose a friend at that age leaves a kind of scar,” Davine says quietly. “Losing Molly… it’s something we can never forget or fully get over…
“I still have the birthday present I got her in 2017 – it was her 15th birthday, and of course she never made it to that birthday. It’s still in my room and I don’t really know what to do with it. I obviously can’t give it to her, but somehow it feels like I can still hold on to her through it.”
With the permission of the Russell family, this is Davine’s first media interview, given exclusively to Sky News.
Now 20 and at university, she says she was moved to speak publicly for the first time by the Online Safety Bill’s return to parliament. She hopes that reliving that time might prompt anyone who is struggling to see how much they would be missed.
“It was shocking to see how bad it was,” Davine says, referring to the graphic material shown during Molly’s inquest. “I want people to know that what happened to Molly was not an isolated incident, and the kind of content she was shown is still out there.”
‘Poor mental health can almost hide in plain sight’
Molly and Davine had been friends since starting secondary school together, sharing a love of singing and musical theatre. Together they starred in school productions of Les Misérables and Beauty and the Beast.
“[Molly] had just been given one of the leads in a show we were doing that year… she was still doing what she loved… in that sense, depression or poor mental health can almost hide in plain sight,” says Davine.
Davine had no idea Molly was suffering.
“My first thought was, ‘No.’ It was an instant sense of disbelief, like, ‘No, not Molly.’ It didn’t even make sense.”
Recalling the day she and her school friends were told what had happened, she remembers teachers ushering them all into a room. Then came the news of Molly’s death.
They all burst into tears. “That’s the sound I can’t forget, the sound of so many children in so much pain.
“Going to a friend’s funeral at that age… we just wanted to get through each day.”
Coroner’s ruling: How content ‘romanticized’ self-harm
Molly’s family later learned that, alone in her room, she had been fed a stream of disturbing content by social media algorithms.
The coroner ruled that what she saw “romanticized” self-harm, “normalized” her depression, and that some of the content “dissuaded” the teenager from seeking help – all of which contributed to her death.
Davine wants to emphasize that Molly is not an isolated case, and that young people being drawn towards dark content on social media is a huge and damaging problem.
On Instagram, many of the hashtags Molly searched for are now blocked. However, Sky News’ data and forensics unit found that while blocks had been introduced and some content removed, autocomplete suggestions or deliberate typos could still lead users to some of the same kind of content Molly had seen – material too distressing to publish here.
Social media ‘nearly impossible to track’
A spokesperson for Meta, which owns Instagram, responded to Sky News saying the company was committed to protecting young people.
“We have already been working on many of the recommendations outlined in [the coroner’s] report, including new parental supervision tools that let parents see who their teens follow and limit the time they spend on Instagram,” the spokesperson said.
“We also automatically set teens’ accounts to private when they join, nudge them towards different content if they’ve been scrolling on the same topic for a while, and have controls designed to limit the type of content teens see.
“We don’t allow content that promotes suicide or self-harm, and of the content we take action on, we find 98% before it is reported to us. We’ll continue to work hard, with experts, teens and parents, so we can keep improving.”
‘I felt so unwell I couldn’t work’: ex-social media moderator speaks anonymously
Sky News spoke to a former social media moderator, who asked to remain anonymous and who described managing harmful content on social media platforms as “an impossible task”.
Working for one of the world’s leading social media companies for a year during the pandemic, her job was to take a second look at content flagged as potentially problematic, including posts that were “extremely violent”, homophobic, or that displayed paedophilia.
“But there was one video that probably affected me the most,” she said – footage of someone taking their own life.
She says she watched and tagged at least 1,000 videos a day. “It just makes you feel uneasy about the world – seeing so much, sorry to say it, s**t, really. The things people do to themselves or to others, it makes you lose confidence in the world, really.”
A year on, she says, she was mentally and physically overwhelmed. “I felt so unwell I couldn’t work at all, and I had to call my GP, who advised me not to go back.
“The thing that affected me the most was sleep. I couldn’t sleep because I was so stressed. I was dreaming about videos that I might have mislabeled.
“I won’t go into the details of my own situation, but it’s not far off Molly [Russell]’s. I can recognize those feelings from what I felt when all of this content was rushing at me.”
The enormity of the task of policing all content is simply too great, she said. “The system out there is in disarray … nobody really knows what’s going on.”
“We were just a lot of young [people], many [of whom had] just finished their degrees… sat around trying to figure out how to judge all of this stuff without a legal background.”
Rise in potentially harmful online content
Disturbing content is a growing problem, according to research shared exclusively with Sky News by mental health charity Young Minds.
Research has found that more than one in five (22%) young adults are automatically shown distressing content by a social media platform at least once a week based on their previous online activity.
Almost all young people with a mental health problem (89%) say social media facilitates harmful behaviour, and more than half (52%) of this group say they have sought out content they knew might cause them distress or discomfort.
The government has been accused of dragging its feet on legislation to regulate social media companies, but after years of delay the Online Safety Bill will return to parliament next week – proposing fines for tech companies of up to 10% of their global turnover if they fail to protect users from harmful content, and criminalizing posts that encourage self-harm.
But critics such as Baroness Claire Fox want the bill scrapped.
“The danger is that we – on the back of a very emotional reaction to events like the Molly Russell tragedy – introduce legislation that doesn’t just protect children but actually infantilizes adults and treats them like children,” she told Sky News. “If you’re a free speech activist like I am, this bill is a significant, serious censorship tool.”
For those fighting for better protection from potentially dangerous social media algorithms, Molly’s case illustrates the dire consequences of inaction.
Long-term effects and a ‘crisis’ in children’s mental health
Olly Parker, from Young Minds, said: “I’m a researcher in this field but I’m also a father and it scares me a lot.
“I don’t think we’ll really see the long-term impact of this for maybe another 10 or 15 years. But one of the things we are seeing is a real crisis in children and young people’s mental health. Every month we are now seeing record numbers of young people being referred by their GPs and doctors for additional mental health support.”
When the Online Safety Bill returns to Parliament, Molly Russell’s friends and family hope it will be the first step in holding big tech companies accountable for the content on their platforms.
“It’s big news that they now want to criminalize harmful content and hold those responsible for it to account, but at the same time, it does feel like it’s been a very long journey,” Davine says. “But I think it’s good that we’re here now.”
But however much hope the bill offers, it can never bring Molly back.
“She was loved by all of us,” Davine says. “I think she really believed we’d be better off without her… I think if she had seen how much pain we are going through, she wouldn’t have made that choice.”
Anyone feeling emotionally distressed or suicidal can call Samaritans on 116 123 or email jo@samaritans.org for help. Alternatively, letters can be sent to: Freepost SAMARITANS LETTERS