Facebook's Tommy Robinson ban is all about saving face, not stopping extremism

The social media firm's criteria for removing extremists are rather nebulous and applied on an ad hoc basis. So what's really driving Facebook to take on extremists?

Far-right activist Stephen Yaxley-Lennon, aka Tommy Robinson, has been banned from Facebook and Instagram, where he commanded a combined following of over one million.

In a statement, Facebook said that Robinson’s page contained “dehumanising language” and advocated “violence targeted at Muslims”. It declared that “individuals and organisations” praising, supporting or engaging in “organised hate” would also be banned.

In a follow-up statement to the press, Facebook offered some specifics. On January 24, the company said, the admins of the Tommy Robinson Facebook page received a written warning about the page's content, which allegedly included videos of bullying, and posts that labelled Muslims “filthy scumbags” and called for their beheading. The company said that it “subsequently became aware” that Robinson had been participating in events with recognised hate groups and figures, such as Generation Identity, the Proud Boys, and Gavin McInnes.

It is hard to discern Facebook’s reasoning behind the ban. The company claims that, “when ideas and opinions cross the line and amount to hate speech that may create an environment of intimidation and exclusion for certain groups in society...we take action.”

But what, specifically, triggered it to take this action now? If Facebook applied its standards with any consistency, then Robinson’s removal would surely have come sooner. Just this week, a personal trainer who set up a fitness group for Muslim women was targeted with racist abuse and death threats after Robinson posted a flyer promoting her class on his Instagram account. Last month, the family of the Syrian schoolboy who was assaulted in a Huddersfield playground announced that they would sue Robinson for defamation, after he claimed on Facebook that the victim had previously attacked four other students.

In July 2018, an undercover reporter for Channel 4 Dispatches revealed that the popularity of Robinson’s Facebook page had protected it from being taken down. Labour’s deputy leader and shadow secretary of state for digital, culture, media and sport, Tom Watson, said of Robinson’s ban: “For far too long this violent thug’s hate-spewing, anti-Islamist tirades were given a platform by Facebook. Today’s decision comes far too late.”

It is also hard to tell why Robinson has been banned while other far-right activists haven’t. Former MailOnline columnist Katie Hopkins, for instance, has a long history of using “dehumanising language” to describe the poor, migrants and Muslims, and recently gave a speech to far-right organisation For Britain. Has she not therefore also violated Facebook’s standards? Last December, British “alt-lite” provocateur Milo Yiannopoulos was removed from Patreon, after the company cited his associations with the Proud Boys. This is the same group Robinson has been accused of fraternising with. Why hasn’t Facebook removed Yiannopoulos, too?

“I think the key message to take away is that Facebook’s approach to Yaxley-Lennon/Tommy Robinson’s page has been entirely inconsistent,” says Dr Bharath Ganesh, a researcher at the Oxford Internet Institute and an expert in online extremism. “Milo and Hopkins are associated with major right-wing news outlets [in fact, Yiannopoulos used to be an editor at Breitbart, and Hopkins wrote for the MailOnline, but both publications severed their ties with them in 2017], and it is unlikely that Facebook would act on their pages despite their important influence on the far right.”

“It seems the company is concerned by claims from the right that conservative voices, in both the US and the UK, [believe] their free speech is under attack. It’s possible that Facebook’s desire to be unbiased has some effect on [its] decision-making in taking down accounts.” Both Yiannopoulos and Hopkins remain heavily active on Instagram; Yiannopoulos also boasts a blue tick and over two million followers on Facebook, whereas Hopkins has not posted on her page in a while.

Responding to the ban, Robinson said that he had “breached no laws of Facebook”.

His YouTube account remains up, even though the company demonetised it earlier this year. On Monday, he released an hour-long video titled “Pandorama: An Exposé of the Fake News BBC!”, in response to a Panorama investigation. However, Robinson’s confidence that his audience will continue to view his content may be misplaced. A recent report on right-wing influencers by anti-racist charity Hope not Hate found that the removal of Alex Jones and Gavin McInnes from social media had significantly dented their online clout.

Facebook says that, although it is not bound by international human rights law (being a company and not a country), it consults the documents that underpin that law for “guidance”. Its statement on Robinson’s ban cites Article 19 of the International Covenant on Civil and Political Rights (ICCPR), a treaty that sets standards for when it is appropriate to restrict freedom of speech.

Yet the key word here is “guidance”. With no clear framework regulating Facebook’s decisions, the company will continue to leave itself open to the charge that it acts opportunistically rather than ethically, and that the primary motive guiding these bans is the defence of its reputation.

This article was originally published by WIRED UK