Facebook Removes White Nationalist Group Pages After Charlottesville Attack

Twitter and Reddit face scrutiny for allowing hate speech

Protesters rally against white supremacy and racism in New York City’s Columbus Circle. Drew Angerer/Getty Images

In the aftermath of the Charlottesville terror attack, many companies took action against white nationalist groups. Six CEOs stepped down from presidential advisory councils, and Airbnb banned white supremacists from making reservations and renting out homes. Even Google and GoDaddy got political, kicking neo-Nazi site The Daily Stormer off their domain hosting platforms.


And now, social networking sites are taking a stand.

Yesterday Facebook removed eight pages used by racist and white nationalist groups from its site. The banned groups included Right Winged Knight, Right Wing Death Squad, Awakening Red Pill, Physical Removal, Genuine Donald Trump, Awakened Masses, White Nationalists United and Vanguard America.

The last group is particularly notable, because James Fields reportedly marched with it on Saturday before driving his car into a crowd of counter-protesters, killing 32-year-old Heather Heyer. Dillon Hopper, the “commander” of the neo-Nazi group that may be connected with the banned page, is a former U.S. Marine.

“Our hearts go out to the people affected by the tragic events in Charlottesville,” Facebook said in a statement. “Facebook does not allow hate speech or praise of terrorist acts or hate crimes, and we are actively removing any posts that glorify the horrendous act committed in Charlottesville.”

While Facebook said it would continue to remove “organized hate groups,” it did not clarify why groups promoting “death squads” and other violence were allowed on the site in the first place. The social network, which was criticized earlier this year for mistakenly blocking atheist and ex-Muslim pages, does not plan to update its terms of service in the wake of the attack.

Facebook is also in hot water for allowing the event page for the Charlottesville “Unite the Right” march to stay up on the site for a month. It only deleted the page on Friday, the day before the march, after realizing it violated community standards.

Other social media sites are also grappling with these issues. A spokesperson for Twitter told Recode the site would not update its existing user guidelines, which don’t explicitly ban white nationalists or neo-Nazis but prohibit accounts that “directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability or disease.”

Given that former KKK leader David Duke is allowed to tweet about “anti white hatred,” however, this standard isn’t good enough for many regulators. For example, the European Union said in June that Twitter failed to meet minimum standards for removing hate speech from its platform.

Reddit, on the other hand, is taking a more active approach. The online discussion board removed a subreddit for Physical Removal (also one of the groups Facebook banned) because users wrote that the people killed and injured in Charlottesville were “mockeries of life” who “need[ed] to fucking go.”

“We are very clear in our site terms of service that posting content that incites violence will get users banned from Reddit,” a spokesperson told CNET.

This is somewhat ironic considering users of the r/The_Donald subreddit had for months threatened violence against people who don’t support President Trump. But maybe the Charlottesville violence finally convinced Reddit to get its act together.
