A new way to think about bots

MIS faculty recognized for research describing the role of bots in information dissemination
Carolina Alves de Lima Salge poses for a portrait at the Terry College of Business on Wednesday, January 31, 2024, in Athens, Ga. Salge, an assistant professor of MIS and an affiliate of the UGA Institute for Artificial Intelligence, was one of eight researchers to receive the AIS Early-Career Award at the 2023 conference. The international award recognizes early-career academics who have made outstanding contributions to the field of information systems.

Winston Churchill said, “A lie gets halfway around the world before the truth has a chance to get its pants on.” And that was before social media.

Today, rumors and disinformation reach broader audiences more quickly than ever. One reason is new technologies that make communication between people easier; the other is bots. Bots, or automated social media accounts, act as multidirectional megaphones, blindly transmitting information from one account to others without vetting it, explains Carolina A. de Lima Salge, an assistant professor of Management Information Systems at the University of Georgia Terry College of Business.

Salge and co-authors — Elena Karahanna of UGA and Jason Thatcher of the University of Colorado Boulder (formerly of Temple University) — recently won best paper awards from the Association for Information Systems, the professional association for information systems scholars, and from MIS Quarterly, a premier information systems journal, for their investigation into the ways bots affect the information ecosystem online. Their paper — “Algorithmic Processes of Social Alertness and Social Transmission: How Bots Disseminate Information on Twitter” — was published in MIS Quarterly in 2022.

Over the last decade, bots have been blamed for spreading online misinformation around political campaigns and the COVID-19 pandemic in the United States and abroad. It’s been a difficult problem for social media companies to tackle because developers didn’t fully understand how bots affected the information ecosystem on social networks, Salge said.

“If we’re going to develop guidelines for how people should react or behave, we need to first understand what’s happening under the hood,” Salge said.

The paper takes an agnostic view of bots themselves. Bots can amplify disinformation or hateful, violent viewpoints but can also amplify necessary information such as public health alerts or natural disaster warnings, she said.

Her team didn’t examine the motivations behind the bots they studied for this paper.

“We just focused on the actions and the behaviors of the bots themselves,” Salge said. “We argue that regardless of the motivation, you can use this process from a good, morally right motivation, but also with bad motivations. It’s important to understand the process itself.”

For this study, Salge and her co-authors dissected and mapped bot traffic on X, formerly Twitter, surrounding Brazil’s 2013 FIFA Confederations Cup protests. The movement, sparked by a 20-cent increase in mass transit fares, grew into street protests involving 2 million people in Rio de Janeiro, São Paulo, and other cities. It drew worldwide attention because it coincided with the tournament, a warm-up for the 2014 World Cup.

She and her co-authors developed the framework and language to describe how bots disseminate information through a network of social media users and how that process differs from the way people transmit information to one another.

Conduit brokerage is the term information scientists use to describe how people pass information to one another in social networks. In this model, the spread of information depends on whether the focal person receives it, whether they judge it worth passing along, how many people they can reach through their network and, lastly, whether they are present in the network at all.

Human conduits have to sleep, eat, work and attend social functions.
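
To make the model concrete, here is a minimal Python sketch of a human conduit under these constraints; the class, probabilities, and message fields are illustrative assumptions, not constructs from the paper.

```python
import random

# Illustrative sketch of human "conduit brokerage": a person relays a
# message only if they are online, judge it worth sharing, and have
# followers to reach. All names and numbers here are hypothetical.

class HumanConduit:
    def __init__(self, followers, online_hours_per_day=16):
        self.followers = followers                 # people reachable through the network
        self.p_online = online_hours_per_day / 24  # sleep, work, and life limit availability

    def judges_worth_sharing(self, message):
        # A stand-in for human vetting of quality and consequence.
        return message.get("credible", False)

    def relay(self, message):
        if random.random() > self.p_online:
            return []                              # offline: the chain stops here
        if not self.judges_worth_sharing(message):
            return []                              # filtered: the person declines to pass it on
        return self.followers                      # reach is bounded by network size

# A conduit with three followers sometimes relays, sometimes doesn't.
person = HumanConduit(followers=["ana", "bia", "caio"])
print(person.relay({"credible": True, "text": "fare hike protest downtown"}))
```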

Salge points out that these factors slow or stop the flow of information on peer-to-peer networks but don’t impact a bot’s abilities. She and her co-authors coined the term “algorithmic conduit brokerage” to describe how bots pass along information.

Bots can find the information they are programmed to seek more quickly, automatically grow the network of contacts interested in that information, and pass it along with no judgment about its quality or consequences.
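
By contrast, here is the same sketch for a bot, loosely following the behaviors the paper’s title names social alertness (finding targeted information) and social transmission (passing it along and growing the audience); the code itself is an illustrative assumption, not the authors’ implementation.

```python
# Illustrative sketch of "algorithmic conduit brokerage": a bot is always
# online, watches for the keywords it was programmed to find, retransmits
# without vetting, and grows its own audience automatically.

class BotConduit:
    def __init__(self, keywords, followers):
        self.keywords = set(k.lower() for k in keywords)
        self.followers = followers

    def relay(self, message):
        # Always available: no sleep, work, or social functions.
        if self.keywords & set(message["text"].lower().split()):
            self.follow_back(message["author"])    # expand the network automatically
            return self.followers                  # retransmit with no quality judgment
        return []

    def follow_back(self, account):
        if account not in self.followers:
            self.followers.append(account)

# The bot never drops a matching message for being offline or unconvinced.
bot = BotConduit(keywords=["protest"], followers=["ana"])
print(bot.relay({"author": "dana", "text": "Protest downtown at noon"}))  # ['ana', 'dana']
```

Side by side, the human sketch filters and pauses while the bot matches and forwards, which mirrors the contrast the authors formalize.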

Salge’s concern is more with the unintended consequences of bots, especially as designers integrate more machine learning programming into their creations.

“When you are in a turbulent environment, like social media, even very simplistic bots can act very random and kind of crazy,” she said.

The hope is that the work she and her co-authors produced will help developers and platforms create guidelines for making and using bot accounts.

“This is just one piece of a larger conversation,” Salge said. “Bots are not necessarily bad. They are artificial agents, or autonomous agents, created by people to fulfill some goal, and we need to understand how they work to understand their impact.

“Knowing how they function enables us to broaden our vision to think about other ways in which we can develop bots that can be useful and make our lives better.”