Fear for young people’s safety has sparked a call for a potentially harmful Instagram algorithm to be banned.
The conversation surrounding harmful content on social media sites began after the father of a teen who died by suicide spoke out about a lack of censorship on Instagram.
Molly Russell was 14 when she passed away in 2017.
Molly’s father, Ian, is now speaking out and says he believes social media platform Instagram is partially responsible for her death.
Other social media sites including Facebook, which owns Instagram, and Pinterest were also criticised by Mr Russell.
After Molly’s death, her family found distressing content about suicide and depression on her Instagram account. Now, the UK government is urging social media companies to take more responsibility for the content on their platforms.
The secretary of state for Health and Social Care, Matt Hancock, has sent a letter to Twitter, Snapchat, Pinterest, Apple, Google and Facebook informing them that urgent action has to be taken to prevent further harm.
Last week Instagram boss Steve Hatch told the BBC that they were reviewing policies and called Molly’s death a devastating event.
Instagram currently has warnings in place that appear when users search potentially harmful hashtags, but this can be bypassed by pressing a button.
The content in question is against Instagram’s community guidelines, but still exists on the site. Concern centres on content focusing on self-harm, depression, suicide and eating disorders.
There are now calls to ban an algorithm which promotes content and accounts similar to those you have been searching for. For example, the more you search for photos of puppies, the more puppy photos and related suggested accounts will appear on your timeline.
Mr Hancock pointed out that the government has the power to ban social media platforms if no action is taken, but that this would be a drastic step.
If you or someone you know is struggling with mental ill health or self-harm contact Samaritans on 116 123.