Algorithm adjustments made by social media platforms alter the political news their users see and how much they engage with it, but these changes do not notably affect users’ political attitudes, such as levels of political polarization, according to a collection of studies appearing in the latest issues of the journals Science and Nature.
The findings are among the first published studies stemming from the most comprehensive research project to date examining the role of social media in American democracy.
These newly published studies from the project also show that social media algorithms used by Facebook and Instagram are extremely influential in shaping users’ on-platform experiences and that there is significant ideological segregation in political news exposure.
The multi-year project, led by academics from U.S. colleges and universities working in collaboration with researchers at Meta, focuses primarily on how critical aspects of the algorithms that determine what appears in people’s feeds affect what users see and believe. In the coming year, additional papers from the project will be publicly released after completing the peer-review process. They are aimed at providing insight into the content circulating on Meta’s social media platforms, people’s behaviors, and the interaction between the two.
The project was announced in 2020 after internal researchers at Meta initiated a partnership with Professor Talia Jomini Stroud, founder and director of the Center for Media Engagement at the University of Texas at Austin, and Professor Joshua A. Tucker, co-founder and co-director of NYU’s Center for Social Media and Politics and director of NYU’s Jordan Center for the Advanced Study of Russia, to study the impact of Facebook and Instagram on the 2020 U.S. elections.
“Social scientists have been limited in the study of social media’s impact on U.S. democracy,” say Stroud and Tucker. “We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes. What we don’t know is why. It could be because the algorithms weren’t changed for a long enough period, because these platforms have already been around for decades, or because, while Facebook and Instagram are influential sources of information, they are not people’s only sources.”
Additional findings include the following:
Ideological segregation on Facebook
- Many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.
- There was an asymmetry between conservative and liberal audiences: far more political news URLs were seen almost exclusively by conservatives than were seen almost exclusively by liberals.
- Among political news URLs posted at least 100 times on Facebook, the large majority (97%) of those rated as false by Meta’s third-party fact-checking program were seen by more conservatives than liberals, although the proportion of political news URLs rated as false was very low.
Impacts of removing reshared content on Facebook
- Removing reshared content on Facebook substantially decreased the amount of political news and content from untrustworthy sources people saw in their feeds, decreased overall clicks and reactions, and reduced clicks on posts from partisan news sources.
- Removing reshares reduced the proportion of political content in people’s feeds by nearly 20% and the proportion of political news by more than half.
- Content from untrustworthy sources, although making up only 2.6% of Facebook feeds on average, was reduced by 30.6% when reshares were removed.
- Removing reshared content on Facebook decreased news knowledge among the consenting study participants but did not significantly affect political polarization or other individual-level political attitudes.
Impacts of altering feed algorithms from personalized to chronological
- Replacing consenting study participants’ algorithmically ranked feeds on Facebook and Instagram with a simple chronological ranking, meaning that they saw the newest content first, substantially decreased the time participants spent on the platforms and how much they engaged with posts while there.
- The chronologically ordered feed significantly increased content from moderate friends and sources with ideologically mixed audiences on Facebook; it also increased the amount of political and untrustworthy content relative to the default algorithmic feed. The chronological feed decreased uncivil content.
- Political content—appearing in 13.5% of participants’ feeds on Facebook and 5.3% on Instagram on average—increased by 15.2% on Facebook and 4.8% on Instagram when posts were presented in chronological order.
- Content from untrustworthy sources, making up 2.6% of Facebook feeds and 1.3% of Instagram feeds on average, increased by 68.8% and 22.1%, respectively, when participants were shown the chronological feed.
- Posts with uncivil content on Facebook (estimated as 3.2% of participants’ feeds on average) decreased by 43% when participants saw a chronological feed. Posts with uncivil content on Instagram (estimated as 1.6% of participants’ Instagram feeds on average), however, did not decrease.
- Despite these substantial changes in participants’ on-platform experience, the chronological feed did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes of the study participants during the three-month study period.
Impacts of deprioritizing content from like-minded sources on Facebook
- Posts from politically “like-minded” sources constitute the majority of what people see on the platform, although political information and news represent only a small fraction of these exposures.
- The median Facebook user received a majority of their content from politically like-minded sources (50.4%), versus 14.7% from cross-cutting sources (i.e., liberals seeing content from conservatives or vice versa). The remainder came from friends, Pages, and Groups classified as neither like-minded nor cross-cutting.
- Reducing the prevalence of politically like-minded content in consenting participants’ feeds during the 2020 U.S. presidential election had no measurable effect on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations, and belief in false claims.
“The breadth of these findings illustrates the importance of access to platform data for outside researchers,” adds Tucker.
The core research team includes nearly 20 academic researchers, with multiple areas of expertise, across more than a dozen colleges and universities. The team worked with Meta researchers to design experimental studies with consenting users who answered survey questions and shared data about their on-platform behavior. The team also analyzed platform-wide phenomena based on the behavior of all adult U.S. users of the platform. Platform-wide data was made available to the academic researchers only in aggregated form to protect user privacy.
The academic team proposed and selected specific research questions and study designs with the explicit agreement that Meta could reject such designs only for legal, privacy, or logistical reasons. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions.
In addition to NYU and the University of Texas at Austin, the project includes the College of William and Mary, Dartmouth College, George Washington University, Northeastern University, Princeton University, Stanford University, Syracuse University, the University of California at Davis, the University of Pennsylvania, the University of North Carolina at Chapel Hill, and the University of Wisconsin-Madison.