Social Media Platforms May Face Fines in the UK for Lax Hate Speech Moderation

Social media platforms like Twitter, Facebook and YouTube may face fines in the UK for taking a lax approach to hate speech moderation, MPs warn.

The warning comes as politicians step up the rhetoric against social media platforms, claiming the networks should do more to block the spread of hate and terrorism online.

The House of Commons Home Affairs Committee released a report sharply critical of social media companies’ role in the spread of hate, extremism and abuse.

In the report, the committee proposes fines for social media companies that fail to moderate hateful content. It has also called for a review of existing laws to ensure they clearly set out how the legislation applies in this area.

Germany last month proposed fines of up to 50 million euros for social media firms that fail to remove hate speech within 24 hours of a complaint.

In its report, the committee writes: “Social media companies currently face almost no penalties for failing to remove illegal content. There are too many examples of social media companies being made aware of illegal material yet failing to remove it, or to do so in a timely way. We recommend that the government consult on a system of escalating sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.”

The committee says social media companies are far from taking sufficient action to address hateful and extremist content, and are putting profits ahead of public protection. The committee noted that social media firms are quick to remove content over copyright infringement, but take a “laissez-faire approach” when it comes to illegal or hateful content.

The committee found during its investigation that terror recruitment videos were still accessible online even after complaints from MPs. Content that encouraged child abuse was also accessible even after journalists reported the material.

The inquiry was launched last year after the murder of Labour MP Jo Cox by a far-right gunman.

The committee has called for a thorough review of the legal framework that controls hate speech, extremism and abuse online, writing: “What is illegal offline should be illegal – and enforced – online.”

The report urges the government to determine whether a failure to remove hateful content is itself illegal. If it is not, the committee says, the law should be strengthened and social media firms should be treated as publishers.

“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so,” said Labour MP Yvette Cooper.

Google, YouTube’s parent company, said it is working on expanding its “trusted flagger” program to identify hateful and extreme content. Google says it will also invest in improving its alert system.

Facebook has also reassured MPs that it is reviewing how it handles violent and extremist content on its platform. The social media giant came under fire after a Facebook Live video of a man committing murder in the U.S. remained online for more than two hours.