Molly Russell’s father has called for a stronger UK online safety bill, including criminal sanctions against tech executives who put children’s welfare at risk, after criticizing social media platforms’ responses to the coroner’s report into his daughter’s death.
Ian Russell said the inquest into 14-year-old Molly’s death was a “unique” opportunity for the tech industry and government to make online platforms safer. In September, the coroner concluded that harmful online content contributed to Molly’s death, finding that she “died from an act of self-harm while suffering from depression and the negative effects of online content.”
Molly, from Harrow, north-west London, took her own life in 2017 after looking at content related to suicide, depression and self-harm, including on Instagram and Pinterest.
Russell said the companies’ responses to a series of recommendations from the senior coroner, which included considering separate platforms for adults and children, were “lacking and not surprising”.
He said: “It’s not good enough when young lives are at risk.”
Russell said the responses from Pinterest, Snapchat owner Snap and Instagram parent Meta underscored the importance of the online safety bill, which passed its third reading in parliament on Tuesday.
“This makes the online safety bill a really important piece of legislation, because I don’t think the tech industry will get its house in order to prevent tragedies like Molly’s from happening again without effective regulation,” he said.
Following the inquest, the senior coroner, Andrew Walker, issued a prevention of future deaths notice. It advised the government to review how digital platforms are provided to children, considering: separate platforms for adults and children; age verification before a user joins a platform; the provision of age-appropriate content for children; the use of algorithms to serve content; advertising aimed at children; and parent or guardian access to a child’s social media account.
A notice was also sent to Meta, Pinterest and Snap, asking them to detail the actions they would take, although the coroner’s recommendations are not binding. In their responses, the companies set out their efforts to protect children from harmful content.
Pinterest’s response includes a commitment to an independent review of its moderation efforts; Snap has pointed to its “family hub”, which informs parents who their children are friends with; and Meta has announced policies including a content moderation tool on Instagram that allows teenage users to limit the amount of sensitive material they see. Twitter, which Molly used before her death, also received a copy of the coroner’s notice but has not yet published a response.
While Pinterest’s commitment to third-party monitoring is a “positive” development, Russell said the responses had a “business-as-usual feel”. Russell, who has become a leading campaigner for internet safety and founded the Molly Rose Foundation to help young people with mental health issues, added that he still finds dangerous content on platforms such as Instagram and TikTok.
Russell said he supports an amendment to the bill that would make tech executives criminally liable, with jail terms of up to two years, if they systematically fail to protect children on their platforms. As it stands, the bill only threatens executives with jail if they obstruct investigations by Ofcom, the communications regulator that will oversee the act. Companies that breach the act can be fined up to 10% of global turnover, which in Meta’s case would be more than $11bn (£9bn).
“The key to making change happen is to change the corporate culture. The threat of serious financial sanctions is clearly not enough to focus minds at the heads of these corporations,” Russell said, adding: “The prospect of a trial will focus minds.”
The culture secretary, Michelle Donelan, has indicated she is not dismissing the amendment, which has strong support among Conservative backbenchers and is backed by opposition parties including Labour.
In her response to the coroner’s notice, Donelan said the online safety bill had already been strengthened to provide greater protections for children, including a requirement for major platforms to publish risk assessments of the dangers their services pose to children.
The former Conservative leader Iain Duncan Smith on Sunday called on Rishi Sunak to back an amendment that would ensure social media bosses “face punishment” for failing to protect children on their platforms.
“We have all kinds of horrible, harmful nonsense on the internet, from suicide to extreme child pornography and general violence,” he said.
“It’s time we all coordinated and made sure children don’t slip through what is a very weak system meant to protect them,” he said.
In response to Russell’s comments, Pinterest said it was “committed to accelerating ongoing improvements” to user safety, and Snap said the family hub tool was designed to “enhance safer online experiences in general.”
Meta declined to comment.