MSU Researchers Untangle the Role of Algorithms in News and Politics
What do you stand to lose when the news you read every day on social media is delivered by an algorithm – a careful calculation of your preferences and behaviors? If the social media sorting hat assigns you to the wrong category, the answer could be alarming.
Researchers at ComArtSci set out to explore the content social media users see in their Newsfeed and to understand why political and news content is more visible to some people than others. Their study examined the relationship between algorithms and news and political content, focusing on how people are classified on Facebook. Through a feature embedded in each profile, Facebook sorts users into interest categories that advertisers can use for targeted advertising.
“There are lots of algorithmic systems within Facebook,” said Kjerstin Thorson, Ph.D., associate professor of Advertising and Public Relations, explaining how many algorithms work together to curate the Facebook Newsfeed. “What we were interested in are the algorithms that classify your interests.”
The study is the first to report how often Facebook users are presented to advertisers as interested in news or politics.
‘Sorting Out’ Comes with Consequences
Thorson worked with two student researchers at ComArtSci and a ComArtSci alum to examine how Facebook can algorithmically infer what users are interested in, and how that inference determines a Facebook user’s exposure to news and political content. A number of variables factored into how people were classified, including pages they liked, searches they conducted, content they engaged with, and the political interests of their family and friends on Facebook.
The team surveyed young adults who volunteered their Facebook data to find common threads in the way news content was delivered in the Newsfeed, which is often filled with photos, status updates and memes posted by friends and family. They compared this with the interest classifications to understand how people are being “sorted out.”
“The way it works is that you send different signals through your own behavior, so who you’re friends with, what kinds of stuff you click on, what you read, what you look at, and that shapes what the algorithm thinks you want to see,” said Thorson. “What we’re trying to understand is who specifically gets classified as interested in news or interested in politics. What are the different ways that you can be classified as interested in news and politics? And then, does that have an impact on how much political content or news content you end up seeing?”
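The mechanism Thorson describes can be made concrete with a toy example. The sketch below is purely hypothetical Python – Facebook’s actual classifiers are proprietary and far more complex, and every signal name, weight and threshold here is invented for illustration only:

```python
# Purely illustrative: a toy "interest classifier" that scores behavioral
# signals and assigns an interest category. None of these signal names or
# weights come from Facebook; they are assumptions for demonstration.

SIGNAL_WEIGHTS = {
    "liked_news_page": 0.4,        # e.g., following a news outlet's Page
    "clicked_news_link": 0.3,      # clicking through to an article
    "friend_shared_politics": 0.1, # political content shared by friends
}

def infer_interest(signals, threshold=0.5):
    """Sum weighted signal counts and label the user if the score clears
    a threshold. Sparse or noisy signal histories can easily produce a
    wrong label -- or no label at all."""
    score = sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
                for name, count in signals.items())
    return "interested in news/politics" if score >= threshold else None

# A user who only occasionally clicks a news link gets no political
# category, even if they care deeply about politics offline.
print(infer_interest({"clicked_news_link": 1}))                        # None
print(infer_interest({"liked_news_page": 1, "clicked_news_link": 1}))  # labeled
```

The point of the sketch is the failure mode the researchers describe: because classification is probabilistic and driven by observable behavior, a person’s actual interests and their algorithmic label can easily diverge.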
Thorson and her team aimed to reverse engineer an algorithm they couldn’t see.
“The algorithms are very opaque,” said Kelley Cotter, a Ph.D. candidate in the Information and Media program, who worked on the study. “Our study does some work clarifying what’s happening ‘under the hood’ of Facebook’s algorithmic systems. The algorithmic identity that has been created for us, we know, is not 100 percent accurate.”
A social media user might be classified as a parent when they are not one, or lumped into the wrong political categories – or often, no political interest categories at all. Cotter said this is a byproduct of the machine learning behind the algorithms.
“They’re like mistranslations, because it’s not a human process,” said Thorson. “People are building these algorithms, but then the algorithms make probabilistic choices that are not always good ones.”
She said there’s also significant potential for discrimination or information inequality, depending on how people are categorized and how advertisers target them based on those classifications.
Implications for Civic Engagement
Researchers were also interested in understanding the limitations of civic engagement on Facebook. While the platform has become highly politicized in recent years, many of its users join at age 13. When they create an account, people can indicate their interests and preferences, but this method of ‘sorting out’ people has limitations. Not everyone shares their political preferences on the platform, and people change over time – something the algorithm may not be able to capture.
“We wanted to study the role of algorithms – not only the role that algorithms play in shaping the information environment, but also in society,” said Cotter. “We know that a lot of people are getting news from Facebook and other platforms, so it has become an important source of news from around the country, around the world, in particular news about politics.”
If how Facebook users are categorized limits their exposure to news and political content, however, their understanding of current news and politics may be limited as well. In the study, some people reported low exposure – or no exposure at all – to news content on Facebook.
Thorson noted that even when a person goes out of their way to Like or Follow a Facebook Page, such as that of a government official, they won’t see all of the content from that page in their Newsfeed.
“Algorithms are determining what we see in our Newsfeeds, and algorithms are making judgments about how interested or not interested we are in civic content,” said Cotter.
Shaping the Newsfeed
The research reveals the power algorithms have to shape exposure to news, independent of other factors. “One thing that was surprising was the notable absence of algorithmically inferred categories on news and politics for many people,” said Cotter. For some Facebook users, she said, interaction with news and political content did not translate into political engagement categories listed in their profiles.
Within the interest categories made available to advertisers, 23 percent of participants had no news media or politics categories listed. Another 26 percent had one to three relevant categories, and the remaining 51 percent had four or more. The majority of keywords related to specific media organizations or politicians – for example, NPR, Donald Trump, Barack Obama or Bernie Sanders.
“We’re never going to be able to completely control the process,” said Cotter, but she said there’s hope if people work to change how they are categorized. “Our personal choices still do matter.”
Facebook users can manually edit their ad preferences, or they can proactively follow news outlets and politicians on Facebook. There’s no guarantee that this will directly change what appears in the Newsfeed, but if the algorithms become more responsive to human change over time, it could lead to a better mix of news and political content.
The research, “Algorithmic Inference, Political Interest, and Exposure to News and Politics on Facebook,” was published in Information, Communication & Society on July 27, 2019.
By Melissa Priebe