The BBC reports that the government is considering ways to regulate social media companies, including Facebook, YouTube and Instagram, over harmful content.
The renewed focus comes after links were made between the suicide of teenager Molly Russell and her exposure to images of self-harm on Instagram. The photo- and video-sharing site has said it is making further changes to its approach and will no longer allow any graphic images of self-harm.

At the moment, when it comes to graphic content, social media largely relies on self-governance. Sites such as YouTube and Facebook have their own rules about what is unacceptable (video, pictures or text) and about how users are expected to behave towards one another. This covers content that promotes fake news, hate speech or extremism, or that causes mental health problems. If the rules are broken, it is up to the social media firms themselves to remove the offending material.

If illegal content, such as "revenge pornography" or extremist material, is posted on a social media site, it is the person who posted it, rather than the social media company, who is most at risk of prosecution.

So, what do other countries do? Some are taking action to protect users; others have been criticised for going too far.