
Fine Social Media Companies Over Child Pornography, Extremist Material: British Lawmakers

A commuter looks at his mobile phone as he crosses London Bridge during rush hour in London, Britain. Social media sites can contain illegal and extremist content. Stefan Wermuth/Reuters

Updated | Social media companies such as YouTube should be fined if they fail to remove illegal content, such as child pornography, from their platforms, an influential committee of British MPs has said.

“Social media companies currently face almost no penalties for failing to remove illegal content,” a report published Monday by the House of Commons Home Affairs Select Committee said.

“We recommend that the government consult on a system of escalating sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe,” it continued.

The report slammed companies for “outsourcing” some of the monitoring of their platforms to taxpayer-funded bodies: “In the U.K., the Metropolitan Police’s Counter Terrorism Internet Referral Unit (CTIRU) monitors social media companies for terrorist material,” the report said.

“That means that multi-billion pound companies like Google, Facebook and Twitter are expecting the taxpayer to bear the costs of keeping their platforms and brand reputations clean of extremism,” it added.

The report also tackles the failure of social media companies to remove terrorist and extremist material.

“The weakness and delays in Google’s response to our reports of illegal neo-Nazi propaganda on YouTube were dreadful,” it said. “Despite us consistently reporting the presence of videos promoting National Action, a proscribed far-right group, examples of this material can still be found simply by searching for the name of that organisation.”

“We recommend that all social media companies introduce clear and well-funded arrangements for proactively identifying and removing illegal content—particularly dangerous terrorist content or material related to online child abuse,” the report concluded.

A YouTube spokesperson told Newsweek via email: “We take this issue very seriously. We’ve recently tightened our advertising policies and enforcement; made algorithmic updates; and are expanding our partnerships with specialist organisations working in this field. We’ll continue to work hard to tackle these challenging and complex problems.”

Nick Pickles, Twitter's U.K. Head of Public Policy, said via email: “Our Rules clearly stipulate that we do not tolerate hateful conduct and abuse on Twitter. As well as taking action on accounts when they're reported to us by users, we've significantly expanded the scale of our efforts across a number of key areas.

“From introducing a range of brand new tools to combat abuse, to expanding and retraining our support teams, we're moving at pace and tracking our progress in real-time. We're also investing heavily in our technology in order to remove accounts who deliberately misuse our platform for the sole purpose of abusing or harassing others. It's important to note this is an ongoing process as we listen to the direct feedback of our users and move quickly in the pursuit of our mission to improve Twitter for everyone,” he continued.

Simon Milner, Director of Policy at Facebook, said: “Nothing is more important to us than people's safety on Facebook. That is why we have quick and easy ways for people to report content, so that we can review, and if necessary remove, it from our platform.

“We agree with the Committee that there is more we can do to disrupt people wanting to spread hate and extremism online. That’s why we are working closely with partners, including experts at King’s College London and at the Institute for Strategic Dialogue, to help us improve the effectiveness of our approach. We look forward to engaging with the new Government and parliament on these important issues after the election.”

The inquiry into hate speech online was prompted by the murder of Jo Cox, the Labour Party MP killed by a far-right extremist called Thomas Mair during the Brexit campaign.

The committee also heard evidence relating to Islamophobia, misogyny, far-right extremism, the role of social media in hate crime and the particular issues faced by members of parliament in relation to hate crime and its violent manifestations.

The report did note some positive behavior by social networks: “We welcome the fact that YouTube, Facebook and Twitter all have clear community standards that go beyond the requirements of the law,” it said.

“We strongly welcome the commitment that all three social media companies have to removing hate speech or graphically violent content, and their acceptance of their social responsibility towards their users and towards wider communities.”

This article has been updated to include a statement from Facebook.
