Social media firms need to “purge” the internet of harmful content that promotes self-harm and suicide, the Health Secretary has said.
Matt Hancock delivered the message after the father of a teenager who took her own life said Instagram “helped kill my daughter”.
The minister has written to a number of internet giants following the death of 14-year-old Molly Russell, telling them they have a duty to act.
Hancock said he was “horrified” to learn of Molly’s death, and feels “desperately concerned to ensure young people are protected”.
In his letter to Twitter, Snapchat, Pinterest, Apple, Google and Facebook (which owns Instagram), he said: “I welcome that you have already taken important steps, and developed some capabilities to remove harmful content. But I know you will agree that more action is urgently needed.
“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.
“It is time for internet and social media providers to step up and purge this content once and for all.”
He added that the Government is developing a white paper addressing “online harms”, and said it will look at content on suicide and self-harm.
He said: “I want to work with internet and social media providers to ensure the action is as effective as possible. However, let me be clear that we will introduce new legislation where needed.”
Molly was found dead in her bedroom in November 2017 after showing “no obvious signs” of severe mental health issues.
Her family later found she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.
Molly’s father Ian Russell said the algorithms used by Instagram enabled her to view more harmful content, possibly contributing to her death.
Russell welcomed the Health Secretary's letter.
In a statement he said: “It is clear to us that despite what the social media companies tell the public about their policies of removing disturbing content, that such content is still available for young people to find easily and by finding it they have more and more of it pushed on them by algorithms.
“It is time for tech companies to stand up and take more responsibility for the content available to their young users.”
A spokeswoman for Instagram said: “We are undertaking a full review of our enforcement policies and technologies around self-harm, suicide and eating disorders.
“As part of this, we are consulting further with mental health bodies and academics to understand what more we can do to protect and support our community, especially young people.
“While we undertake this review, we are taking measures aimed at preventing people from finding self-harm-related content through search and hashtags.
“However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery.
“This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
An inquest into Molly’s death is expected later this year.