Inequalities in online discourse 

Researchers conducted a Reddit field experiment to better understand the dynamics of online debates. 

As anyone who has ever scrolled through a comment section knows: Online discussions are often dominated by a small group of active users, while the majority remain silent. This imbalance can give the impression that extreme opinions are more widespread than is actually the case—and fuel polarization. A field experiment with 520 participants conducted on Reddit by researchers from the Max Planck Institute for Human Development, TU Dresden, and Stanford University sheds light on this dynamic—and what can be done to change it. 

Lisa Oswald from the Center for Adaptive Rationality and her team conducted a sophisticated field experiment on participation in online political discussions. They recruited 520 informed and consenting participants from the United States and randomly assigned them to six specially created discussion rooms on Reddit for four weeks of intensive exchange on 20 political topics. The research goal was not to assess individual posts and their effects but rather the group dynamics that arise when a lot of people are talking at the same time—or lurking in the background. Why do a few very active users (“power users”) take up most of the bandwidth in online debates, while the majority remain silent? And can this imbalance be addressed by realistic interventions? 

“It was a complex field experiment,” says Oswald. “But working with small groups in the lab wouldn’t have captured the public character of social media,” continues Oswald, who studies public discourse in digital spaces and the role of social media for democracy. “You need a certain group size for inequalities in participation to become visible—the dynamics we all know from classrooms and lecture halls.” 

A real-life lab 

For the study, which was funded by Horizon Europe’s Social Media for Democracy (Some4Dem) project, Oswald and her colleagues Philipp Lorenz-Spreen (TU Dresden and MPI for Human Development) and William Small Schulz (Stanford University) set up six private subreddits, each with up to 100 participants. The groups received a brief discussion prompt every day, and participants completed weekly check-in surveys, as well as surveys before and after the study. They were aware that the researchers were observing and, if necessary, moderating the discussion.  

The researchers distinguished two aspects of participation: whether participants posted at all, and how many comments they wrote. Analysis of 5,819 comments posted in response to the discussion prompts and a further 62,000 comments made by participants in other Reddit communities painted a clear picture. 
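How unevenly such comments are spread across participants can be summarized with a standard inequality measure like the Gini coefficient. The sketch below is purely illustrative (the toy comment counts are invented, not the study's data), but it shows the kind of metric behind statements such as "participation inequality remained high":

```python
def gini(counts):
    """Gini coefficient of non-negative comment counts.
    0 = everyone posts equally, 1 = one user posts everything."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Standard formula on sorted data: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * cum / (n * total) - (n + 1) / n

# Toy example: 10 users, one "power user" writes most of the comments
counts = [0, 0, 0, 0, 1, 1, 2, 3, 5, 40]
print(round(gini(counts), 2))  # prints 0.8 — highly unequal participation
```

With a perfectly equal distribution (everyone posting the same number of comments), the same function returns 0.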

Reasons for active or passive participation  

Those who perceived the discussion environment to be toxic, disrespectful, polarized, or unconstructive tended to remain silent. Surprisingly, however, those same perceptions predicted higher comment counts among active users. In other words, toxic spaces are a barrier for some, but may motivate others to be particularly active. 

The most active participants tended to be male, highly interested in politics, and to describe themselves as likely to comment online. Higher activity was also predicted by perceptions of polarization and a greater perceived distance between one’s own opinion and that of the group.  

In terms of content, many debates were constructive, but some topics stood out: The Israel–Gaza conflict was the most toxic, followed by gender issues, prostitution, and gun control. The discussion on economic and climate issues was more relaxed. 

Incentives, norms, moderation—what really works 

The study tested two types of intervention designed to either reduce discourse toxicity or encourage lurkers to become active: On the one hand, appeals to norms (“Please stay civil, respectful, and on topic”), supplemented by stricter spam filters; on the other hand, financial incentives, with participants receiving $2 for every day they wrote at least one serious comment, up to a maximum of $40. Findings showed that the money made a difference. Lurkers became more active and participation was broader. However, participation inequality remained high. “There was some change, but not a huge one,” says Oswald. “This suggests that roles in online discussions are remarkably stable.”  
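The financial incentive described above is simple enough to express directly: $2 for each day with at least one serious comment, capped at $40 in total (i.e., at most 20 qualifying days over the four weeks). A minimal sketch, with the function name and example day counts being illustrative rather than from the study materials:

```python
def incentive_payout(qualifying_days, rate=2, cap=40):
    """Payout under the incentive arm: $2 per day with at least
    one serious comment, capped at $40 total."""
    return min(qualifying_days * rate, cap)

print(incentive_payout(12))  # 12 qualifying days -> $24
print(incentive_payout(28))  # every day of the 4 weeks -> capped at $40
```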

While monetary incentives are not easy to implement, symbolic rewards—such as greater visibility for first-time commenters, “thank you” functions, labels for high-quality posts, or targeted highlighting of diverse voices—are more realistic. Limiting the number of comments per user could also curb the dominance of frequent users. 

“Lightweight” interventions—such as appealing to norms to improve the tone of the discussion and thus enhance the perceived safety of a discussion space—did not succeed in encouraging the silent majority to speak up. However, given the low overall level of toxicity in the study, the scope for such measures was limited. 

Social feedback, in contrast, had a measurable effect. Participants who received more upvotes than downvotes were more likely to write the following day—and wrote more. In other words, visible positive feedback was directly associated with higher participation.  

Online discourse does not reflect majority opinions 

For Oswald, the goal isn’t to “save” online debates. “We can’t expect everyone to participate online,” she says. She is more interested in understanding why a small minority are highly active while the majority remain silent—and what that means for perceptions of public opinion. The participation gap gives readers a distorted view of what the general public thinks. When reading the comments section under an online news article, for example, anyone assuming that the comments reflect public opinion will often be sorely mistaken. 

There was also a surprising side effect: After the four weeks of discussion, participants revised their self-reported knowledge of the topics discussed downward. Oswald suspects a kind of “epistemic humility” and social comparison processes. “In exchanging views with others, you realize how complex the topics are and how much other people know—and may well revise your self-assessment downward.” In a follow-up project, she plans to analyze the texts produced in the discussions in more detail to better understand this dynamic. 

Research under difficult conditions 

The study was conducted in the United States and on Reddit for pragmatic reasons. “Social media research is difficult at the moment because API access for research purposes has been restricted,” says Oswald. It is “practically impossible” to conduct experimental research on Twitter/X, and Mastodon is often too niche. Reddit offers researchers tools and allows them to create communities for research purposes. The Digital Services Act was intended to improve research access to platform data. “But these new access points have yet to be established in practice—and will likely be limited to observational data. Which means that setting up your own communities or running experimental interventions will only be possible to a very limited extent, if at all,” Oswald continues. 

The study data and code are publicly available in anonymized form: an interesting resource for further research, because it is rare to have so much contextual information on the people involved in social media discourse. 

Implications for platform design 

What are the practical implications? There’s no silver bullet for reducing participation inequality. But one can think of several possible points of intervention: nonmonetary rewards for first-time posts and high-quality comments; consistent enforcement of rules to combat toxicity and lower the participation threshold; caps on comment numbers to reduce the dominance of power users. From a media literacy perspective, one thing is particularly important: We need to be aware that what we see online does not reflect public opinion and to understand that debates on social media are often distorted by a few highly active users. “You can’t press a button to achieve more participation,” Oswald says. “But you can create conditions that make it easier for people to speak up—especially those who have been lurking in the background.”  

At a glance: 

  • Online discourse is driven by a minority of highly active users; the majority remain silent—which distorts perceptions of public opinion and can fuel polarization. 

  • Field experiment (6 subreddits, 520 people, 4 weeks) shows: Those who found the discussion toxic, disrespectful, or polarized were more likely to remain silent. Surprisingly, however, those same perceptions predicted higher comment counts among active users. The most prolific users tended to be male, highly interested in politics, and to describe themselves as likely to comment online. 

  • Interventions: Financial incentives broaden participation but only moderately reduce inequality; appealing to norms without enforcement has little effect (in this environment, which was generally low in toxicity). Visible positive feedback (more upvotes than downvotes) was associated with increased future participation. 

  • Implications for platform design: No one-size-fits-all solution, but several potential points of intervention: nonmonetary rewards for first-time and high-quality contributions, clear rules to reduce toxicity that are consistently enforced, and caps on comment counts to reduce the dominance of extremely active users. 

Original publication

Oswald, L., Schulz, W. S., & Lorenz-Spreen, P. (2025). Disentangling participation in online political discussions with a collective field experiment. Science Advances, 11(50), Article eady8022. https://doi.org/10.1126/sciadv.ady8022
