
The Surprising Impact of Algorithms

Moody College researcher leads unprecedented study with Meta exploring the role of social media in elections

Photo: Communication Studies professor Talia Stroud

In the aftermath of the 2016 election, politicians, the media and everyday people raised numerous concerns about the effects of social media on democracy and how platforms like Facebook and Instagram influence people’s political beliefs. What role do these powerful social networks and the algorithms that run them have in how people view candidates or feel about important issues?

Over the past several years, a multi-university academic team has been working alongside Meta to answer these very important questions as part of an unprecedented research project co-led by Moody College Communication Studies professor Talia Stroud.

As part of the project, the team had access to Meta data that had never before been made available to researchers and was given the ability to alter the Facebook and Instagram feeds of consenting study participants to see how changes in what people saw affected their political beliefs.

In the summer of 2023, the researchers released the project's first findings in a set of papers published in the journals Nature and Science. And while they found that algorithms have a tremendous effect on what people see in their feeds, changing these algorithms to change what people see doesn't necessarily affect people's political attitudes. Also, when the researchers looked at platform-wide data from U.S. adults, they found that many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.

To reach these conclusions, the researchers designed three experimental studies in which participants gave them permission to look at their platform behavior and to alter aspects of their news feeds in the context of the 2020 election. They also worked with platform-wide data for U.S. adults, aggregated to protect user privacy, to understand exposure to and engagement with political news and like-minded sources.


In one study, the team withheld shared content from people's Facebook news feeds to understand virality and how it affects people’s attitudes. In a second study, they demoted Facebook content from like-minded sources, including like-minded friends, like-minded groups and like-minded pages. In a third study, they switched participants’ Facebook and Instagram feeds from the standard-ranking feed determined by algorithms to a chronological feed, where the most recent content appeared first.

The idea was to understand whether changing aspects of the algorithms could be good for democracy by helping people become more informed or by preventing misperceptions.

According to the study findings, making these substantial changes did have a huge impact on what people saw on the platforms, including how much political content people were exposed to and how much content they saw from sources that were likely to share misinformation.

But, Stroud said, the interesting part is that these interventions, while run on the platforms for three months, didn't have major effects on people's political attitudes such as political polarization, a measure of how people feel about members of their own party and members of a different party.

"These sets of studies show that simple solutions have complicated effects,” said Stroud, who is also the founder and director of the Center for Media Engagement at UT Austin. “I think that this project demonstrates that we need to think more deeply and in more complex ways about what we do with social media and the outcomes that it will actually have."

The project also has implications for exposure to misinformation on social media, Stroud said. When the team changed the Facebook algorithm to show a chronological feed, for example, they found that people were actually exposed to more content from untrustworthy sources compared to what they saw with Facebook's standard-ranking algorithm.

“We really hope that this information can be used to understand exactly what's happening on these platforms and what occurs as the result of dialing different levers on the algorithm,” Stroud said. “We also hope that these results are useful for policymakers who are thinking about how they're going to tackle what's happening on the platforms. And, of course, I think it's very useful for the public to know what it is that the algorithms are doing.”

The team is expected to release additional research papers as part of this ongoing project with Meta.

“I think it’s critically important for everyone to understand what’s happening on social media. It’s become this phenomenon that takes up so much time in so many people’s lives, and if we don’t really get a handle on what’s happening on these platforms, how they are affecting us, and how changes to them will have implications for democracy and elections, I think we’re in real trouble,” Stroud said. “I really believe in initiatives like this that provide information to the public and policymakers so we can base our decision-making on real data.”