Cloudflare announced late Sunday night that it would terminate service to 8chan, the online message board that has been linked to three mass shootings in 2019.

Despite efforts from major social media companies to try to weed out hate groups that use their platforms, the reality is they are still all over the networks, according to the Southern Poverty Law Center advocacy group.

“There is a direct correlation between the rise of hate groups on social media and the frequent attacks,” like the El Paso and Dayton weekend killings, says Keegan Hankes, a senior research analyst for the SPLC.

In 2019, Facebook, Twitter and Google-owned YouTube have taken stands against conspiracy theorists like Alex Jones of InfoWars by booting them off the platforms. YouTube recently changed its policies to bring in more content moderators to enforce community guidelines. YouTube says it yanked nearly 90,000 videos in the first quarter that were promoting violence or extremism, and nearly 20,000 for violating YouTube’s hate speech policy. 

Vigil on Aug. 4, 2019, in Dayton, Ohio. (Photo: John Minchillo/AP)

But Hankes says one can find extremist content on YouTube “without searching very hard…it does not take too much effort.”

As an example, Hankes shared a URL for a YouTube video, in the guise of a straight news talk show, from Redice.TV, in which the host complained about “blatant discrimination for people of European descent.” 

Redice has an active Twitter account with 38,000 followers, as does noted white nationalist and neo-Nazi Richard Spencer, whose account has 76,000 followers. 

Hankes is tougher on Twitter, President Donald Trump’s social media platform of choice, calling it an “absolute cesspool” of hate. “Twitter does one of the worst jobs of content moderation.” 

Twitter had no comment. YouTube pointed USA TODAY to a June blog post in which the company said it would be more aggressive in taking down conspiracy and hate videos. 

Hankes calls Facebook a “work in progress” that has improved, with stronger moderation tools and more content bans, but says it remains a platform of choice for hate groups. “They haven’t gotten rid of all of them.” Facebook is where the groups “go to recruit new members and indoctrinate them.”

Facebook didn’t respond to requests for comment. 

Facebook is primarily where the 2017 neo-Nazi march on Charlottesville, Virginia, which led to a woman’s death, was organized, Hankes notes. “Nearly two years later, you would expect these companies to be way more out front of the issue and stronger in enforcement.”

The El Paso shooter is reported to have used the online forum 8chan to post an anti-Latino manifesto before going on his killing spree. On Monday, web security firm Cloudflare said it would no longer work with 8chan, though another firm could pick up the slack. 

The SPLC has partnered with several other organizations, including the National Hispanic Media Coalition and the Center for American Progress, to form a new advocacy group, Change the Terms, which calls on Google, Facebook and Twitter to “do more to combat hateful conduct on their platforms.”

Specifically, Change the Terms seeks stricter enforcement of anti-hate policies. At the same time, many conservative groups see social media firms as biased against right-wing causes, a complaint Trump has aired on Twitter. 

Facebook’s hate speech policy says such content is banned because “it creates an environment of intimidation and exclusion and in some cases may promote real-world violence.” Twitter says users may not “promote violence against or directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease. We also do not allow accounts whose primary purpose is inciting harm towards others on the basis of these categories.”

YouTube says hate speech is not allowed on the network. “We remove content promoting violence or hatred against individuals or groups.”

Follow USA TODAY’s Jefferson Graham (@jeffersongraham) on Twitter, Instagram and YouTube. 