Fact-checking agendas
The internet continues to fuel conspiracy theories, misinformation, and disinformation across the globe.
To counter this surge of engineered deception, spread by social media platforms, deceitful activists, biased newsrooms, and foreign and domestic influence operations, the Center for Media Engagement (CME), led by Drs. Talia Stroud and Gina M. Masullo, was originally created in 2011 as the Engaging News Project. Relaunched as CME in 2017, it continues as an effort to understand, appreciate, and participate in the democratic exchange of ideas.
Working with researchers from The University of Texas and universities around the country and world, the center’s goal is to propagate a method of connective democracy uniting news agencies, scholars, social media platforms, and public policy concerns with methodical approaches toward bridging gaps in society. Some areas of CME study include aspects of journalism, media ethics, propaganda, science, platforms, and bridging divides.
Samuel Woolley is an assistant professor in the School of Journalism and Media and program director and Knight faculty fellow for CME.
Woolley said misinformation is the accidental spread of false content, and disinformation is the purposeful spread of it. Both can create animosity between friends and associates on social media, polarize individuals, and cause mistrust and apathy in the electorate.
His recent research explores computational propaganda and the ways political parties manipulate social media platforms to influence perception. Computational propaganda is defined as the use of social media and other digital tools in attempts to manipulate public opinion.
By disseminating this biased information, individuals, candidates, campaigns, companies, and political parties are able to propagate their agendas and put pressure on dissenting views. This is often executed by leveraging technology — using bots and trolling to control the spread of information, as well as automation, artificial intelligence, and trending algorithms.
Woolley is the author of "The Reality Game: How the Next Wave of Technology Will Break the Truth," a 2020 book that unpacks methods of propaganda generation under the tenets of content creation, reception, and intent. He is currently working on another book, titled "Manufacturing Consensus," that draws on more than seven years of international fieldwork to study those who build and launch armies of bots on social media.
"One of my goals is to think about the way we respond, and how we can design technology in such a way we’re able to get out in front of problems before they begin," Woolley said.
With multiple channels from a variety of sources available, misinformation and disinformation are ubiquitous and spreading worldwide. Patterns of discourse often overlap and can be generalized to other topics. School of Journalism and Media Assistant Professor Jo Lukito's research focuses on the relationship between disinformation and violence.
Lukito and School of Journalism and Media faculty member Dhiraj Murthy are collaborating with a team of UT Austin researchers from computer science, linguistics, the iSchool and McCombs School of Business to study the potential of designing responsible AI technologies to curb disinformation. This novel collaboration between social scientists and STEM researchers to study and build solutions against mis-/disinformation is part of a six-year, $750,000 grant funded by Good Systems, the Research Grand Challenge program created by UT’s Office of Research.
One ongoing project looks at domestic misinformation and conspiracy theories about election fraud. Along with several researchers at CME, Lukito published an article in Wired about a recent QAnon-linked rally that encouraged offline engagement — sometimes with violence — and has since been removed from several social media platforms.
Another project focuses on international political disinformation, examining the relationship between state violence and state-sponsored disinformation in Myanmar and Brazil. This work advances the understanding of political disinformation in developing BRICSAM countries — Brazil, Russia, India, China, South Africa, ASEAN (Association of Southeast Asian Nations) states, and Mexico — which experience a heightened degree of state-sponsored disinformation compared with Western countries.
"I am especially driven by questions related to globalization, global communication, and international relations because of my own experiences as a first-generation American, undergraduate, and graduate student," Lukito said.