Push the Fringe to the Fringe

Twitter’s recent announcement that President Trump’s account will lose special “public interest” protections was met with mixed reactions, ranging from “too little, too late” to “look at how biased Twitter is behaving.” For some of those who have watched @realdonaldtrump broadcast support for violence and white supremacy since the account came into prominence in 2015, Twitter’s actions are important.

If Trump again violates Twitter’s policies after January 2021, @realdonaldtrump may find itself, like white extremist David Duke’s @drdavidduke and conspiracy theorist Alex Jones’s @realalexjones, banned from Twitter. I believe that in the long run, deplatforming violently hateful individuals from mainstream platforms is a good thing because it diminishes their reach and stature.

White extremist ideas went mainstream only after years of lurking in the internet underground alongside a range of subcultures, from the benign to the malign. Slowly, white extremism moved from fringe message boards like 4chan to larger audiences through the algorithms of YouTube, Twitter, Reddit, and Facebook.

Helen Lewis’s recent piece in The Atlantic on ironic bigotry in the early years of the social internet shows how the digital world changed. She details how ideas that began as bad jokes morphed into serious endorsements of white extremism. The jokey memes of the internet I grew up with have had unfortunate real-world consequences.

According to our own research into violent hate speech on social media platforms, between 0.5% and 1.5% of all U.S. traffic on Twitter qualifies as violently hateful. A recently published joint report by the Global Disinformation Index and the Anti-Defamation League shows the problem metastasizing on Telegram and Facebook: a quarter of the messages that mentioned Black people on Telegram were derogatory, as were over half of the messages that mentioned Jews.

But change is afoot. As mainstream platforms stop being hospitable to racist content, they reduce casual users’ exposure to hateful and violent material, limit the ability of the loudest voices to win over bystanders, and keep people from gravitating toward fringe ideas they are no longer exposed to.

The peddlers of hate are plying their wares elsewhere, but the platforms they have found are smaller and offer far fewer people to influence and radicalize.

Tech activist Cory Doctorow recently explained why it is so critical that these fringe ideas be isolated in the recesses of the internet. He quips that the internet has become five major platforms where people share pictures of the other four. In reality, people ingest information online in unsystematic ways, crossing between platforms and media and moving quickly from one outlet to another. So when an Alex Jones is pushed off Facebook and YouTube to Parler and Rumble, regular people are simply less likely to encounter his bigoted content.

More important, extremists like the Proud Boys and Boogaloo movement thrive on confrontation and provocation, and that’s much more difficult and far less thrilling when everyone on the platform agrees with them. D-list social media sites like Parler just don’t offer the level of engagement a platform as big and diverse as Twitter does.

Finally, every move from one platform to another makes reconstituting an audience more difficult. As extremists change platforms, they inevitably lose followers. Reddit’s 2018 ban on the QAnon subreddit r/TheGreatAwakening is instructive: QAnon adherents largely did not return to Reddit. The movement did find willing hosts in Twitter and Facebook, but those platforms took action this summer to deplatform QAnon networks and their content.

Wide-scale bans have fractured the QAnon movement, and while the election of some adherents to Congress suggests we haven’t seen the last of it, the network’s rapid growth has ended, perhaps because its members are now excluded from mainstream social media platforms.

Extremists remain free to exercise their constitutionally protected right to speech, but there is no constitutionally protected right to a giant platform and audience. Rather than expecting the government to act, or expecting it to act in the interest of human rights, public pressure on major social media companies to push the fringe back to the fringe may be the most important way we can stem the growth of extremist beliefs and violent action.

Author: Welton Chang

Published on November 23, 2020
