
The Wisdom of Crowds Requires the Political Left and Right to Work Together

Collaborations between followers of opposing ideologies lead to less biased, higher quality Wikipedia pages

Debates between the political left and right—whether in Congress or at the dinner table—can turn so acrimonious that it becomes hard to believe the two sides could ever come together and agree on anything, let alone work as a cohesive team. A new paper in Nature Human Behaviour suggests that when people from opposing political parties sit down and hash things out on a project, the outcome is superior to when members of a team all share the same viewpoint. Using data from popular Wikipedia pages, which can draw in hundreds of different editors, the researchers showed that the more ideologically diverse the contributors to a particular topic, the higher the content’s quality.

Scientific American talked to James Evans, a professor of sociology at the University of Chicago and senior author on the study, about how heated deliberations among Wikipedia editors with divergent views lead to better-honed arguments rooted in fact rather than opinion.

[An edited transcript of the interview follows.]




What inspired you to research this idea that political polarization might benefit a creative process or final product?

We had been studying polarization in the consumption of ideas, and we published a paper in Nature Human Behaviour two years ago, which showed that there was enormous [political] polarization in the consumption of science of various types, but also for culture, literature, all kinds of different books. That gave rise to this idea that—oh wow—different people with different political ideologies have different views of science, and they have different views of a wide variety of things.

I have done some of my research in the area of organizations, and there’s this one idea that’s very common in social psychology and organizational behavior, which is that diversity matters. But typically people talk about diversity in terms of gender, and race or ethnic diversity, or educational diversity. And they speak about those things as being important and positive in certain cases, especially in creative contexts. Yet the literature systematically suggested that political polarization was bad, that it was acrimonious and that it led to an inability to agree.

So that gave rise to this question: To what degree does political diversity end up contributing to complex problem solving and to synthesizing information about things? Especially because we know now that these people have very different perspectives, and they’ve consumed other kinds of information in different ways.

How did you come up with the idea to use Wikipedia to test this theory?

Misha Teplitskiy and Eaman Duede [who are co-authors on the paper] had been collaborating pretty actively with the Wikimedia Foundation, putting some of the data that we had collected as a research lab onto Wikimedia. Between the two of them, they realized, hey, you know, this would be a really interesting context in which to explore this idea of politically diverse teams and their production of knowledge.

So we looked at the degree to which [an editor] contributed to articles on the political right or left. Basically, the more you contribute to the right, the more we believe you hold right-wing opinions, and vice versa with the left. And when we surveyed people, we found that this was true—the [editorial] contributions people made explained a little over a third of the variation in whether they voted and identified ideologically with the right or the left.

We found a strong positive relationship between greater political polarization among a page’s contributors and the perceived quality of the page.

How do Wiki pages get edited? How do editors have to work together?

Behind every page there is something called the “talk page,” and for a very popular article, this is the equivalent of thousands of pages long. So any time you contribute something to the front page you have to make an argument for it on the back page. And the arguments are elaborate, I mean it’s like going to an academic conference.

We extracted all these measures about processes that were going on behind this talk page and looked at how political diversity influences these processes [how many issues are discussed (semantic diversity) and the number of ways the issues are described (lexical diversity)] and the impact of those things on quality. And we learned a lot of interesting things.

What did you learn?

We learned that if you have balanced polarization, that led to less toxic language, and less toxic language led to having longer debates. When people started dropping the f-bomb, that would end conversations pretty quickly. Sometimes when that would happen, pages would actually get locked down by the senior editors, and that was associated with a strong reduction in perceived quality—when only senior editors could contribute to the pages, the quality dropped precipitously.

So [when the debate was balanced], there was intense discussion, it was less toxic and disciplined by these Wikipedia policies and guidelines, and the discussions were longer. And all these things ended up contributing to the overall perceived quality of the page.

I’m curious about the effect of polarization on scientific articles—say, about climate change, which is politically divisive even though the science is well established. Could more political diversity ever reduce the quality of an article if someone adds misinformation?

There are certainly cases where more polarized groups decreased the quality of the page. It’s just that on average, it was much more likely for greater polarization to lead to more positive outcomes.

If it’s an important page like climate change, then there’s going to be an intensive back-end debate. And having different political perspectives ends up being associated with different assumptions about phenomena like climate change. For example, even though we know there’s broad consensus that humans and human waste products were involved in the broader change in climate, there are still underlying divisions in science: Exactly how bad have humans made it, how bad it’s becoming, what’s the relationship between future technology and global warming—all these kinds of things. And the fact that people are coming to these kinds of questions with different priors and different assumptions and typically having read different scientific literature ends up meaning that there’s an intensive debate.

What can people gain from this research, say during discussions at the Thanksgiving table? How can people use polarizing opinions to enhance the discussion rather than shutting it down?

For me, it’s a proof of concept that there is value in these different perspectives and that putting those perspectives together helps unearth assumptions and bring out arguments and really hone the quality of a complex knowledge product, like a Wikipedia page.

Political opinions are becoming a larger and larger way of organizing identity in American culture. And as this happens, it increases the likelihood that we can benefit from seeing the other perspective. Just realizing that, I think, is one step toward acting respectfully enough to unleash that benefit.

Dana Smith is a freelance science writer specializing in brains and bodies. She has written for Scientific American, the Atlantic, the Guardian, NPR, Discover, and Fast Company, among other outlets. In a previous life, she earned a Ph.D. in experimental psychology from the University of Cambridge.
