As the presidential campaign goes viral, so, too, does misinformation
Moody College researchers examine political trends ahead of Election Day
The 2024 election season has been unprecedented by almost every measure.
President Joe Biden dropping out of the race less than four months before Election Day. The meteoric rise of Vice President Kamala Harris as the new Democratic presidential candidate. Two failed assassination attempts on former President Donald Trump. Accusations of AI-generated crowd sizes. And thousands and thousands of cat memes.
For anyone who closely studies U.S. elections, this cycle has proven rife with opportunities to explore new facets of political culture, including how meme makers, social media influencers, AI technologies and misinformation are shaping campaigns, affecting voter turnout, and even potentially delegitimizing the election results when they do come in next month.
Jo Lukito, an assistant professor of journalism at the Moody College of Communication, and her team at the Center for Media Engagement, which researches media ethics and propaganda, examine all of these factors in their work.
For Lukito, it’s a labor that is equal parts exciting and terrifying.
“Everything is moving super, super fast,” she said in September. “It’s hard to keep up, and we’ve had to change course quite a bit.”
Since her September interview, the southeastern U.S. has been hit with catastrophic hurricanes, the conflict in the Middle East has escalated into full-on warfare, and more documents have been released detailing Trump’s involvement in the Jan. 6 insurrection.
The world’s eyes are fixed on the political stage.
“This election is a really good time to talk about politics because many folks are getting more involved. I feel like there's a lot of engagement,” Lukito said. “But I think the part that makes me anxious is just how much distrust people have in the media system.”
A viral campaign
One of the most significant changes in political campaigns over the past decade has been the rise of social media platforms as leading sources of news. According to a recent study by the Pew Research Center, more than half of U.S. adults get at least some of their news from social platforms — a reality that has never been more obvious than in this election, with both positive and negative effects.
The positive: People are becoming more interested because of viral campaigns that capture public attention.
The negative: With so much information in circulation, it becomes harder to vet what is reliable.
The 2024 presidential campaign has been chock-full of viral moments: Harris dancing with coconuts falling from the sky (a nod to something her mother used to say to illustrate the influence of older generations on their children), Lil Jon doing Georgia’s roll call at the Democratic National Convention, and singer Charli XCX tweeting that Harris is “brat.”
These memes have generated interest in the election, especially among young people. For better and for worse.
After Trump claimed during the September presidential debate that Haitian immigrants in Springfield, Ohio, were eating residents’ pets, a statement that is false, thousands of cat memes began flooding people’s social media feeds.
“While it has the positive effect of being an educational opportunity to explore the roots of a funny statement like that and question where it comes from, the good mostly stops there,” Lukito said. “The danger is that most people don’t use it as an educational opportunity, but rather, it becomes both the unintentional spread of misinformation and the deliberate spread of disinformation.”
After the debate comments, Springfield schools and government buildings began receiving bomb threats, and far-right groups descended on the city. Still, vice presidential candidate JD Vance doubled down on the statements, going so far as to say that it’s OK to make up stories to get political attention.
“I strongly believe that when a politician says a piece of misinformation, it causes a lot of damage; a regular citizen, when they say something on social media, doesn’t quite have the same amount of attention,” Lukito said. “This is why I've become really interested in political and news influencers; they increasingly have this opinion leader role. It’s important to understand that they can also unintentionally spread misinformation.”
Political influencers run the gamut today, taking on issues from abortion to immigration. Center for Media Engagement research fellow Zelly Martin’s research looks specifically at anti-abortion influencers and how they use social media trends and violent images to influence people’s beliefs about abortion. One of her big concerns this year: the false claim that babies can be killed after birth.
“These people don’t worry me because of individually held beliefs, but it does worry me when they create content to change people’s minds in a way that is deceptive,” Martin said. “Inspiring people with fear and anger and violent imagery is a way to manipulate people without getting them to think about the issue.”
Is it real or not?
The threat of disinformation on social platforms came into laser focus in 2016, when Russian internet trolls began posting false information online to interfere with the results of the election: drawing people to fake political rallies, releasing fictitious and politically damaging information about candidates, and discouraging people from voting.
Lukito, who became interested in election propaganda after witnessing the persecution of Muslims in the wake of 9/11, said the goal of most disinformation actors is to sow distrust in the political system. When people can’t trust that what they are consuming is real news, they disengage from political life, and civic participation drops.
“Each foreign agent that tries to target the U.S. has a slightly different strategy or slightly different motivations,” Lukito said. “But overall, their goals are really to destabilize the United States politically.”
UT researchers are particularly concerned about the ramp-up of disinformation as the election approaches. Foreign actors have begun creating social media accounts that look like real news organizations to post false information about when, where and how to vote. They will assert that specific polling locations are closed or send people to the wrong locations to cast their ballots. The ultimate goal is to disenfranchise voters.
To combat these issues, Inga Trauthig, the head of research of the Propaganda Research Lab at the Center for Media Engagement, has worked directly with county officials to encourage them to use social media more aggressively in the lead-up to the election, countering misinformation by releasing quality information. She urged social media users to follow these official accounts to determine their correct polling locations and how to vote.
“That information is very likely going to stay exactly that way. Elections don’t face very last-minute changes,” Trauthig said. “Anything you see online that is very urgent or alarmist is a red flag. That is what propagandists love to do. Those messages can quickly go viral and develop a life of their own.”
UT researchers are equally concerned about misinformation after Election Day, which could lead people to question the election results and which candidates won or lost.
“We're starting to see an increased amount of misinformation immediately following the election, particularly regarding election fraud and election skepticism,” Lukito said.
‘The Wild West’
Artificial intelligence has further complicated issues this election cycle. It’s the first time generative AI has factored into campaigns, from Trump claiming that Harris’ rally sizes were AI-generated to a deepfake voice of Taylor Swift endorsing him as president.
“I think in the future we're actually going to tragically see more of this,” Lukito said. “As that technology advances, I think that's going to be a really big concern about how we detect those things. Because it's still really hard to empirically identify when it's some piece of AI-generated content.”
This summer, Trauthig and her team spoke to around 30 political strategists and campaigners who have been using generative AI in their campaigns, including to create more personalized messages and target certain audiences to help with fundraising.
“The bottom line: Everyone we interviewed had used or played around with AI, so it is already widely adopted,” Trauthig said. “We found the main guardrail holding back deceptive or targeted AI is that a lot of American campaigners still adhere to some internal code. They don’t want to ruin their reputation. But we know that foreign actors are using it much more aggressively.”
Trauthig said their interviews with officials were conducted at the beginning of the year, and since then, things have rapidly changed.
“I can tell you the gloves are probably going to come off,” she said. “We have definitely seen some developments over the last months. People are using AI and not labeling it as such. It is the Wild West.”
Checking the facts
To combat the spread of misinformation online, Lukito and her research team at UT are focused on using AI for good — creating algorithms to detect false information that is spreading most rapidly online. It’s easy to spot issues with the major headlines of the day, but there are falsehoods online that are much more insidious and easily disguised. Researchers aim to help identify this information more quickly and get it into the hands of journalists and fact-checking organizations to correct it immediately.
“I like to think of artificial intelligence as a sort of tool that can be used for good and bad,” Lukito said. “People can use AI to create misinformation or to spread deepfakes. But we can also use AI to detect misinformation to help fact-checkers or to highlight quality information on social media platforms.”
For social media users, Lukito encourages a diverse media diet, examining multiple sources of information, particularly verified news organizations like Reuters or the Associated Press.
At the same time, she acknowledges how unfair it can feel to consumers to have to do so much work.
“We ask our citizens to do a lot. Now we ask them to verify their own information and keep track of a whole variety of media outlets,” she said. “I do think it's a lot of pressure.”
Lukito has been working on Good Systems, an interdisciplinary research initiative at UT that seeks to understand the changes new AI technologies will bring, mitigate their harms and unintended consequences, and leverage their benefits.
This year, the Center for Media Engagement also received a $3 million grant from the John S. and James L. Knight Foundation that will support its work to rebuild public trust in the media and identify strategies that help people form bonds despite holding different views. This will include training students to think through ethical issues raised by new technology and helping marginalized communities counter misinformation.
“I think people are really yearning for a good media system, one that they can trust, one that gives them a lot of confidence when they go to the ballot box,” Lukito said. “A lot of the work that I had been doing previously was really focused on detecting misinformation and disinformation, all of which is incredibly important. But it’s not just about detecting and removing bad information but also making sure good information starts to fill that gap.”
‘Go outside and touch the grass’
Perhaps disinformation actors have succeeded in more ways than many would like to acknowledge. A 2023 Pew Research Center poll found that only 4% of U.S. adults say the political system is working extremely or very well, and 65% say they always or often feel exhausted when thinking about politics.
It’s not uncommon today for people to spend hours “doomscrolling” online, drawn in by outlandish and negative content that not only sows distrust but also takes a toll on mental health. Another big part of Lukito’s work is helping students and researchers who spend long hours online cope with those effects. She and her colleagues are building what they call an “ethics of care” to push universities and funders to provide the most vulnerable researchers with mental health support and therapists if they need them.
Her personal cure: Go on a hike and separate from her phone.
“The internet slang for this is to ‘go out and touch grass,’” she said. “I think those sort of practices, while we joke about it, are really important for mental health.”
Lukito encourages young people not to get disillusioned: Make sure you are registered to vote and haven’t been pulled off the voter rolls, look at candidates’ websites to understand their policy positions, do your best to remain an informed voter, and have healthy discussions with others about the significant issues of the day.
“I think we've moved away from talking about politics because it can be sensitive and perhaps emotionally charged,” Lukito said. “But talking to each other about politics is how we start to make sense of politics, whether that's people who agree with you ideologically or people you might disagree with, who might be voting along a different party line. I think those sort of conversations are really important to have. And I have a lot of faith that Moody students can actually hold those really mature and responsible conversations.”
Be an extremely cognizant social media user, she said.
And, please: Go outside and touch the grass.