Cleaning Up the Community: Shagun Jhaver Explores Impact of Content Moderation Practices on Social Media

Online communities like Reddit or Twitter act like town halls, where opinions are shared and everyone, in theory, has a voice. Only, it doesn’t always work like that. What was once optimistically viewed as a boon to public discourse, promising open and logical discussions where anyone with a keyboard and an internet connection could speak their piece, has instead become a bit of a Wild West. Message boards have degraded into sources of harassment, misinformation, radicalization, and more.

Now, the largely techno-utopian view has been adjusted, and content moderation has become the norm. The question is: how can you moderate while also maintaining the promise of free speech? And how can platforms avoid discouraging posters whose content has been moderated or removed, and instead encourage them to remain a part of the public discourse?

These are just a few of the questions being posed and pursued by Shagun Jhaver, a Ph.D. student in Georgia Tech’s School of Interactive Computing (IC), whose papers at the upcoming Computer-Supported Cooperative Work and Social Computing (CSCW) conference provide some context and, perhaps, solutions.

Fairness, accountability, and transparency

Jhaver is a computer scientist at heart. He earned his bachelor’s degree in electrical engineering in India and then a master’s in computer science at the University of Texas at Dallas. Like most in IC, though, his primary focus is on humans.

“One of the main attractions to our School was that, although it is a computer science school, I am able to do interviews and surveys with people,” Jhaver explained. “What good are technological developments if they don’t work for humans, if they don’t improve society? In order to understand the interactions between technology and society, I wanted to develop a mixed-methods background, and the resources and faculty here are perfect for that.”

One of his first projects as a graduate student was investigating communication on social media around the Black Lives Matter movement.

“I wanted to understand the emergent collective participation around this movement and what people were feeling on the ground in the moment,” he said. “That’s how I entered this area of social computing.”

Social computing is an area of computer science that focuses on the intersection between social behavior and computational systems. Integral to Jhaver’s study was how social media and the data gathered within those systems reflected what was happening within society as a whole.

There may be no better reflection of this phenomenon than on Reddit and Twitter, two communities his research has looked at. At CSCW, he’ll present a handful of studies examining content moderation. One of the papers, “Does Transparency in Moderation Really Matter?: User Behavior After Content Removal Explanations on Reddit,” earned a best paper award. Another, “Did You Suspect the Post Would be Removed?: Understanding User Reactions to Content Removals on Reddit,” earned an honorable mention.

How, he wonders, do you develop good moderation practices that enforce community rules while also maintaining the free expression of ideas? And what practices improve how posters feel about their moderated content and encourage them to continue participating in these forums?

“Content moderation is more nuanced than just editing and removing content,” Jhaver said. “It’s about the overall experience of the user and the community and how they interact.”

His research came to a few conclusions:

One, fairness matters; two, accountability is important; and three, platforms should be transparent in their decisions. For end users, that means rules are clear and easy to follow; when a post is removed, the poster is notified and given a clear explanation of why; and if they appeal, they receive an appropriate response.

But there are multiple stakeholders involved in the exchange, and who determines what is fair?

“These Reddit moderators are volunteers,” Jhaver said. “Is it fair for us to expect them to take on these increased responsibilities for providing explanations?”

In other words, these issues are much more nuanced than they might seem to many casual participants. Amy Bruckman, a professor in IC and Jhaver’s co-advisor (along with IC adjunct faculty member Eric Gilbert), said she can’t think of other research that has examined this aspect of social communities.

“I don’t think it has been studied – okay, your content was just removed, so how do you feel about that?” she said. “Taking that other side of it is unique.”

Giving everyone a voice

So, why do these explanations even matter? Why not just remove bad content and move on?

“But free speech is interesting,” Jhaver said. “There’s this dichotomy where if you are free to harass certain people over their race, gender, or other aspects of identity, then you are preventing them from having the voice to speak their truth. So, you are infringing on their freedom of speech. That’s why there’s this need.”

Whatever the case, these issues are not going away. Methods of communication will continue to change over time, particularly as technology continues to advance. But, Jhaver said, these conversations aren’t anything new either.

“These are age-old problems,” he said. “Harassment, free speech, suppression of free speech. These topics have always been discussed, but the internet has changed the way we see them and changed how they manifest themselves.

“I want my research to help minorities and other vulnerable groups have a greater voice in society,” Jhaver said. “I want to contribute to the design of more equitable, inclusive, and participatory technologies.”