Social media companies could be forced by law to remove illegal content and sign a code of conduct protecting vulnerable users.
The proposed crackdown will be announced by digital minister Margot James on Tuesday, the Daily Mail newspaper reported.
Her speech comes after the death of Molly Russell, 14, whose family found she had viewed content on social media linked to anxiety, depression, self-harm and suicide before taking her own life in November 2017.
James will give a speech at a conference marking Safer Internet Day, saying: “The tragic death of Molly Russell is the latest consequence of a social media world that behaves as if it is above the law.
“There is far too much bullying, abuse, misinformation as well as serious and organised crime online. For too long the response from many of the large platforms has fallen short.
“We are working towards the publication of the final policy paper, and consultation, before bringing in a new regulatory regime.
“We will introduce laws that force social media platforms to remove illegal content, and to prioritise the protection of users beyond their commercial interests.”
A Department for Digital, Culture, Media and Sport spokesperson said: “We have heard calls for an internet regulator and to place a statutory duty of care on platforms, and are seriously considering all options.
“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”
It comes as the suicide prevention minister prepares to warn that the normalisation of self-harm and suicide content online poses a risk similar to child grooming.
Jackie Doyle-Price is expected to call on social media companies to take action to protect users from the impact of harmful content at a conference in London on Tuesday.
Doyle-Price will say: “We must look at the impact of harmful suicide and self-harm content online… in normalising it, it has an effect akin to grooming.
“We have embraced the liberal nature of social media platforms, but we need to protect ourselves and our children from the harm which can be caused by both content and behaviour.”
Doyle-Price is expected to tell the National Suicide Prevention Alliance Conference that internet and social media providers must “step up to their responsibilities to protect their users”, while the government considers tougher regulation.
The minister’s comments come ahead of a meeting with Facebook to discuss what action the company is taking to curb harmful online content.
Doyle-Price will echo Health Secretary Matt Hancock’s recent warning that the government is prepared to introduce new legislation “where needed” to tackle the issue.
“If companies cannot behave responsibly and protect their users, we will legislate,” she will say.
“Providers ought to want to do this. They shouldn’t wait for government to tell them what to do. It says a lot about the values of companies if they do not take action voluntarily.”
In January, Hancock called on internet giants to “purge” the internet of content that promotes self-harm and suicide, following Molly’s death.
The teenager’s father, Ian Russell, said he had “no doubt Instagram helped kill my daughter”.
Hancock is due to meet Instagram officials on Thursday to understand how the company is tackling harmful online content.
Useful websites and helplines:
- Mind, open Monday to Friday, 9am to 6pm, on 0300 123 3393
- Samaritans offers a listening service, open 24 hours a day, on 116 123 (UK and ROI – this number is free to call and will not appear on your phone bill)
- The Mix is a free support service for people under 25. Call 0808 808 4994 or email help@themix.org.uk
- Rethink Mental Illness offers practical help through its advice line on 0300 5000 927 (open Monday to Friday, 10am to 4pm). More information at www.rethink.org