Facebook opened its doors to researchers. What they found paints a complicated picture of social media and echo chambers

An unprecedented partnership between a group of prominent independent academic researchers and researchers at Meta has published its first series of studies about the impact Facebook and Instagram had on U.S. users during the 2020 elections.

July 27, 2023, 2:00 PM EDT

By Brandy Zadrozny

A landmark study of how Facebook shaped the news users saw in the run-up to the 2020 election has found “significant ideological segregation” in political news exposure on the platform, with conservative users in particular more walled off and encountering far more misinformation than their liberal counterparts.

Looking at aggregated data from 208 million U.S. users, researchers found that “untrustworthy” news sources were favored by conservative audiences and that almost all (97%) of the political news webpages rated as false by Meta’s third-party fact-checkers were seen by more conservatives than liberals.

Overall, the research, led by Sandra González-Bailón, a professor at the University of Pennsylvania’s Annenberg School for Communication, found that Facebook’s pages and groups contributed more to this ideological segregation and polarization than users’ friends did. In general, it concluded that conservative sources dominated Facebook’s news ecosystem.

It was not clear whether this segregation was caused more by algorithms or user choice.

“These feedback loops are very difficult to disentangle with observational data,” González-Bailón said. “These require more research.”

The study is one of four published Thursday, three in the journal Science and one in Nature. They were part of an unprecedented partnership between a group of prominent independent academic researchers and researchers at Meta with the aim of studying the impact Facebook and Instagram had on U.S. users during the 2020 elections.

The project included 17 academic researchers from 12 universities who were granted deep access by Facebook to aggregated data. The researchers collaborated with more than two dozen researchers, engineers and legal analysts at Meta.

The independent researchers were not paid by Meta, and the social media company agreed not to reject research questions except for privacy or logistical reasons. Meta also relinquished the right to restrict or censor the researchers’ final findings. In the interest of transparency, the collaboration was monitored by an independent rapporteur, whose report was also published in Science on Thursday.

Together the studies offer the deepest look yet at how news flowed across Facebook and a more limited idea of how that news may or may not have affected political polarization.

But the research is also limited in its scope. The algorithm experiments were conducted on only two platforms, Facebook and Instagram, over three months, a relatively short amount of time at the height of a contentious presidential election.

Political content has traditionally been only a small part of what Facebook users see, and the platform has since sought to reduce how much political content is shown to users. In 2021, the company said it was doing initial tests on reducing political content in its News Feed, culminating in an update in April in which the company said it was continuing to refine its approach to such content and moving away from rankings based on engagement. Posts linking to political news webpages amounted to around 3% of all posts shared on Facebook, according to the González-Bailón study.

Each of the four studies found Meta’s recommendation algorithms — the complicated rules and rankings behind how platforms feed content and communities to their users — to be extremely influential in deciding what those users see and how they interact with content.

Three of the four studies experimented with the algorithm and concluded that the kinds of tweaks long hypothesized to be solutions to polarization and keys to healthier online experiences may not affect people’s political attitudes and real-world behaviors, at least in the short term. Such tweaks include reverting to chronological feeds, reducing virality by limiting reshared content or breaking up echo chambers.

“These findings should give all of us pause, including policymakers, about any simple sort of solution,” said Talia Stroud, a professor at the University of Texas at Austin, who helped lead the research project.

For the three experimental studies, paid participants allowed researchers to manipulate their experience on the platforms in some way. They used the platforms as usual, completed surveys on political attitudes and activities throughout the three-month period, and shared their online activity on and off the studied platforms.  

In one study, led by Andrew Guess, an assistant professor of politics and public affairs at Princeton University, researchers randomly assigned participants a reverse chronological feed on Facebook and Instagram, showing newest posts first without any other algorithmic weighting.
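
To make that contrast concrete, here is a minimal, hypothetical sketch of the difference between an engagement-weighted feed and the reverse chronological feed used in the experiment. The `Post` record and its single `predicted_engagement` score are illustrative assumptions, not Meta’s actual ranking logic; the point is only the difference between ordering by a predicted-engagement signal and ordering by recency.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical stand-in for the many signals a real ranker uses

def ranked_feed(posts: list[Post]) -> list[Post]:
    """Engagement-weighted ordering: a toy analogue of an algorithmic feed."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    """The experimental condition: newest posts first, with no other weighting."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

if __name__ == "__main__":
    now = datetime.now()
    posts = [
        Post("friend_a", now - timedelta(hours=5), predicted_engagement=0.9),
        Post("page_b", now - timedelta(hours=1), predicted_engagement=0.2),
        Post("group_c", now - timedelta(hours=3), predicted_engagement=0.6),
    ]
    print([p.author for p in ranked_feed(posts)])         # ['friend_a', 'group_c', 'page_b']
    print([p.author for p in chronological_feed(posts)])  # ['page_b', 'group_c', 'friend_a']
```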

In 2021, Facebook whistleblower Frances Haugen and some lawmakers suggested a time-ordered feed could fix the myriad problems that come with recommendation algorithms, which critics argue are engineered to keep users engaged and enraged. The next year, Facebook rolled out customizable feeds, though it’s unclear how many people use those options.

The new study doesn’t offer much hope for chronological feeds as a silver bullet, and some of the findings can appear contradictory. Facebook users who saw the newest posts first encountered more political and untrustworthy content (by more than two-thirds) but less “uncivil” content (by almost half). At the same time, they were also shown more posts from their moderate friends and from ideologically mixed groups and pages.

One significant effect: Researchers reported that, without the sophisticated algorithm, users liked, commented and shared less often and spent “dramatically less” time on Facebook and Instagram overall. Instead, mobile users who had been switched to a reverse chronological feed spent more time on TikTok and YouTube. Desktop users spent less time on Facebook and more time on Reddit.

Despite the effects on user experience, changing to a chronological feed didn’t affect participants’ levels of self-reported political polarization, knowledge or attitudes.

In a second study, the researchers experimented with virality, removing some participants’ ability to see content reshared from friends, groups or pages. Turning off what amounts to about one-quarter of the posts viewed on Facebook had a measurable effect. Users saw less political news, clicked less often on partisan news and were exposed to fewer posts containing untrustworthy content.

This seems to support the common belief, noted in leaked internal Facebook research reports, that emotionally charged and political content gets reshared more often. Still, as with the chronological feed, researchers couldn’t find any link to a shift in users’ political attitudes or behavior.

The third experiment investigated echo chambers, asking what happens when people see less content from like-minded groups and pages.

First, the research confirmed that echo chambers are real: Just over half of the content users saw came from like-minded sources, while just under 15% came from people or groups with different political leanings. As with the other experiments, reducing content from like-minded sources while increasing exposure to people and content from other points of view had no real effect on polarization or political preferences or opinions as measured by the study.

Limitations notwithstanding, Nick Clegg, president of Global Affairs at Meta, trumpeted the findings as an exoneration of Facebook and its role in politics and elections. He wrote in a blog post that the collaboration marked the first time the company had opened itself to academics in this way, and that the findings showed Facebook had no role in the toxicity of U.S. politics.

“These findings add to a growing body of research showing there is little evidence that social media causes harmful ‘affective’ polarization or has any meaningful impact on key political attitudes, beliefs or behaviors,” he wrote. “They also challenge the now commonplace assertion that the ability to reshare content on social media drives polarization.”

The researchers behind the new studies were more restrained.

“These findings don’t mean there aren’t reasons for concern about social media,” said Brendan Nyhan, a professor in the department of government at Dartmouth College and one of the lead authors behind the echo chambers study. “But the study is important in that it challenges some notions people have about the effects of social media. And it might help reorient that conversation.”

The researchers said the collaborative new studies underscore the need for tech companies to provide greater access to data.

“What we were able to do here — unpacking the sort of black box of algorithms, providing all these kinds of details about what’s happening on these platforms — is a huge illustration of the value of making sure that platforms make data available to external researchers,” said Joshua Tucker, project lead and co-director of the NYU Center for Social Media and Politics.

Still, collaborations with platforms may not be the model for research going forward, and perhaps they shouldn’t be, according to Michael W. Wagner, a professor in the University of Wisconsin-Madison’s School of Journalism and Mass Communication, who served as the collaboration’s independent rapporteur.

In an article about the project for Science, Wagner wrote the researchers had conducted “rigorous, carefully checked, transparent, ethical, and path-breaking studies.” But future scholarship should not depend, he wrote, on obtaining a social media company’s permission.

Additional studies from the project, currently in the peer-review process, are expected in the coming months.
