Social media giants have pledged hundreds of thousands of pounds to the Samaritans charity in a bid to rid the internet of self-harm videos and other damaging material, Health Secretary Matt Hancock said.

Representatives from Facebook, Google, Snapchat and Instagram were summoned by the Government to meet with the charity to identify and tackle harmful content, including content promoting suicide.

The summit in Whitehall came three weeks after the Government announced plans to make tech giants and social networks more accountable for harmful material online.

The first such summit, in February, resulted in Instagram agreeing to ban graphic images of self-harm from its platform.

Speaking after the behind-closed-doors meeting, Mr Hancock said: “I met the main social media companies today and they have agreed to fund the Samaritans to identify what is harmful and then to put in place the technology on their platforms to find harmful material and make sure it is either removed or others can’t see it.

“The amount of support is in the hundreds of thousands.

“The crucial thing is that we have an independent body, the Samaritans, being able to be the arbiter of what is damaging content that needs taking down so all tech companies can follow the new rules that have been set out.”

Social media companies and the Government have been under pressure to act following the death of 14-year-old Molly Russell in 2017.

The schoolgirl’s family found material relating to depression and suicide when they looked at her Instagram account following her death.

Mr Hancock went on: “I feel the tech companies are starting to get the message, they’re starting to take action.

“But there’s much more to do … we also spoke about tackling eating disorders and some anti-vaccination messages which are so important to tackle to ensure they do not get prevalence online.”

In a statement, a spokesman for Facebook, which also owns Instagram, said: “The safety of people, especially young people, using our platforms is our top priority and we are continually investing in ways to ensure everyone on Facebook and Instagram has a positive experience.

“Most recently, as part of an ongoing review with experts, we have updated our policies around suicide, self-harm and eating disorder content so that more will be removed.

“We also continue to invest in our team of 30,000 people working in safety and security, as well as technology, to tackle harmful content.

“We support the new initiative from the Government and the Samaritans, and look forward to our ongoing work with industry to find more ways to keep people safe online.”

Samaritans chief executive Ruth Sutherland said: “The internet has evolved rapidly to be a force for good and a new forum to connect with others.

“However, there has been a worrying growth of dangerous online content, which is an urgent issue to combat and something we cannot solve alone.

“There is no black and white solution that protects the public from content on self-harm and suicide, as they are such specific and complex issues.

“That is why we need to work together with tech platforms to identify and remove harmful content whilst being extremely mindful that sharing certain content can be an important source of support for some.

“This partnership marks a collective commitment to learn more about the issues, build knowledge through research and insights from users and implement changes that can ultimately save lives.”

The online harms white paper sets out a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.

Compliance with this duty of care will be overseen and enforced by an independent regulator.

Failure to fulfil this duty of care will result in enforcement action, such as fines for companies or individual liability for senior management.