Freedom of speech has traditionally been a matter between governments and their citizens, a human-rights issue. But more and more companies are providing platforms where anyone can contribute content, usually in the form of text. And those companies are finding that they face many of the same issues governments have: how to balance giving users the ability to express themselves freely against the possibility that they will post problematic content.
“Problematic” can mean many things. In some cases, the content is genuinely dangerous, such as incitement to violence or false medical advice. In others, companies may simply not want to be associated with expressions of racism, sexism, or other forms of hatred. But is there anything companies can do when people use their service to broadcast content the companies don’t approve of?
A new study answers that question with a clear “yes.” Researchers looked at Reddit’s fight against hate speech, which saw the site ban several subreddits in 2015. The analysis suggests that regular users of those subreddits toned down their language as they moved to other areas of the site. And a number of users who wanted to keep sharing hateful thoughts simply left for other services, making them someone else’s problem.
Back in 2015, Reddit announced that it would begin banning subreddits that “allow their communities to use the subreddit as a platform to harm individuals.” Two of the obvious targets, r/fatpeoplehate and r/CoonTown, which were directed at overweight people and African Americans, respectively, were banned soon after. The new research, published in Proceedings of the ACM on Human-Computer Interaction, looks at what happened after the ban, using Reddit’s public data for some informative data mining.
Big data
First, the researchers assembled a database of over 100 million posts and comments from 2015, since the bans took place in the middle of that year. They then used two methods to create lists of hate terms they could track. One automatically mined the contents of fatpeoplehate and CoonTown to find terms used disproportionately in those subreddits. A second list was curated by hand, selecting terms from the automated list that were clearly offensive. While neither of these lists would necessarily fit a formal definition of hate speech, both provide at least a functional indicator of speech associated with hate, or at least with trolling.
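The paper’s actual term-extraction machinery isn’t reproduced here, but as a rough sketch of the automated approach, one could flag terms that appear far more often in the banned subreddits than in a baseline sample of Reddit at large. Everything below, from the function names to the thresholds to the frequency-ratio heuristic itself, is an illustrative assumption, not the authors’ method:

```python
from collections import Counter

def distinctive_terms(banned_posts, baseline_posts, min_count=10, min_ratio=5.0):
    """Toy heuristic: flag terms used far more often (per word) in posts
    from a banned subreddit than in a baseline sample of the rest of Reddit.
    Illustrative only; the study's actual extraction method differs."""
    banned = Counter(w for post in banned_posts for w in post.lower().split())
    baseline = Counter(w for post in baseline_posts for w in post.lower().split())
    n_banned = max(sum(banned.values()), 1)
    n_base = max(sum(baseline.values()), 1)
    flagged = []
    for term, count in banned.items():
        if count < min_count:
            continue  # skip rare terms; they're mostly noise
        rate_banned = count / n_banned
        rate_base = baseline.get(term, 0) / n_base
        # keep terms that are (nearly) exclusive to the banned community
        if rate_base == 0 or rate_banned / rate_base >= min_ratio:
            flagged.append(term)
    return flagged
```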
“These dictionaries (of hate speech) are open to the community as a resource,” the authors note. Presumably, you can also consult them if you’d like to learn to hate like a pro, or simply to describe other people’s hatred (“shitlording” appears on the list).
To understand the behavior of users of the two subreddits in question, the authors needed a control population. They built one by identifying the other subreddits those users posted to, then finding a group of users with similar posting patterns who didn’t frequent the parts of the site that were banned.
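One way to picture that matching is as a comparison of per-subreddit activity profiles. The representation below, with activity vectors over a shared subreddit list, cosine similarity, and a greedy nearest-neighbor match, is an assumed simplification rather than the paper’s actual procedure:

```python
import math
from collections import Counter

def activity_vector(posts, subreddits):
    """A user's posting profile: counts per subreddit, in a fixed order."""
    counts = Counter(p["subreddit"] for p in posts)
    return [counts.get(s, 0) for s in subreddits]

def cosine(a, b):
    """Cosine similarity between two activity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_control(treated_vec, candidates):
    """Pick the (username, vector) candidate most similar to a treated user.
    Candidates must exclude anyone who posted in the banned subreddits.
    A sketch, not the paper's matching procedure."""
    return max(candidates, key=lambda c: cosine(treated_vec, c[1]))
```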
With that in place, they tracked how user behavior changed after the ban. For many users, the end of these two subreddits meant the end of their time on Reddit. Once fatpeoplehate was closed, 21 percent of its users’ accounts went inactive, and another 12 percent were deleted entirely. That compares to normal churn in the control group, where a bit more than 10 percent went inactive and another 11 percent deleted their accounts. For CoonTown, the change was even more dramatic: 19 percent went inactive, and 21 percent deleted their accounts. The control population here was more prone to leaving Reddit, though, with 16 percent going inactive and 12 percent deleting their accounts.
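Measured this way, churn reduces to tallying post-ban account status within each cohort. The status labels and data shape here are hypothetical stand-ins for however the dataset actually encodes them:

```python
def churn_rates(statuses):
    """Fractions of a cohort that went inactive or deleted their account
    after the ban. `statuses` maps username -> 'active' | 'inactive' | 'deleted'.
    (A hypothetical representation of the cohort data.)"""
    n = max(len(statuses), 1)
    inactive = sum(s == "inactive" for s in statuses.values()) / n
    deleted = sum(s == "deleted" for s in statuses.values()) / n
    return inactive, deleted

# Per the article's figures, the fatpeoplehate cohort came out near
# (0.21, 0.12), versus roughly (0.10, 0.11) for its matched controls.
```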
That doesn’t mean those users gave up on hate, however. A look at Voat, a site that allows its users to continue engaging in racism and fat-shaming, shows that more than 1,500 Reddit usernames appeared there.
Behavioral change you can believe in
The majority of users of the two banned subreddits, however, continued to use Reddit, and most kept posting at roughly the same frequency after the ban as before it. Were those users simply engaging in the same behavior elsewhere on the site?
There are some indications that they tried. For the first few weeks after the ban, the use of hate speech dropped but showed sudden spikes. The authors suggest these may reflect attempts by individuals to recreate the banned communities elsewhere, which Reddit apparently recognized and banned as well.
About 50 days after the original bans, however, things settled down considerably. In the case of the manually curated word list, former subscribers of fatpeoplehate largely stopped using the terms, becoming statistically indistinguishable from control users. Ex-CoonTowners continued to use some racist terms, but at a fraction of the frequency they had while the subreddit was open. With the fully automated list of hate terms, former users of both banned subreddits continued to use those terms at a higher frequency than controls, but far less often than they had before the ban.
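In outline, that measurement is just hate-lexicon usage per cohort, before versus after the ban, compared against the matched controls. The per-word matching below is an assumed simplification; the paper’s tokenization and statistical testing are more involved:

```python
def hate_term_rate(posts, lexicon):
    """Fraction of words across a cohort's posts that appear in the
    hate-term lexicon. Simplified word-level matching; the study's
    actual measurement and significance tests are more careful."""
    words = [w for post in posts for w in post.lower().split()]
    if not words:
        return 0.0
    return sum(w in lexicon for w in words) / len(words)

# Compare a cohort against itself and against its matched controls:
# pre  = hate_term_rate(cohort_pre_ban_posts, lexicon)
# post = hate_term_rate(cohort_post_ban_posts, lexicon)
# ctrl = hate_term_rate(control_posts, lexicon)
```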
Former users of fatpeoplehate ended up dispersing to various subreddits. The only one clearly focused on continuing the habit of insulting strangers is RoastMe, in which people post pictures of themselves to be mocked. Some of CoonTown’s former residents also ended up there, but a fair number moved to The_Donald, Hometown, and BlackCrimeMatters, subreddits where “racist behavior is either tolerated or rampant,” as the authors put it. Even so, their use of racist terms remained very low, even in environments where other users might tolerate it.
FTW
From Reddit’s perspective, the ban worked, as hate speech on the site went down. While some evidence suggests that people who are truly dedicated to denigrating their fellow humans migrated to other sites, those sites don’t have Reddit’s reach, so the hate speech posted there has less of an effect on the targets of these users’ hatred.
Some of these individuals were undoubtedly among the people who abandoned or deleted their accounts following the bans. But it’s also possible that some people left as a form of protest against Reddit meddling with moderation in the first place. It’s also possible that the people who left Reddit are individuals who enjoy being hateful but weren’t committed to the specific hatreds these subreddits pursued. The extent to which people abandoned Reddit because they could no longer pursue their racism or other forms of hatred can’t be determined from this data.
But the key finding is that the people who stuck around changed their behavior, adapting their language to the norms of the new subreddits they became active in. Part of that may be fear of seeing another hangout get banned; the same fear may have encouraged moderators of the remaining subreddits to police language more aggressively. But it’s also possible that some of these users figured out that attacking people for their appearance is generally not socially acceptable.
As the authors note, this is not a definitive result; there are examples in the literature of aggressive moderation backfiring or driving communities to spiral out of control. Many other companies are also struggling with how to provide an open platform while preventing threats of violence or overt discrimination. Having an example of successful moderation from a community as large and contentious as Reddit makes the case that other sites can intervene successfully.
Full disclosure: the author regularly contributes to moderated discussions at this website.
Proceedings of the ACM on Human-Computer Interaction, 2017. DOI: 10.1145/3134666 (About DOIs).