When you scroll through your social media feed, you may want to consider what you like or dislike, as well as what you comment on, post or read. And if you feel like your choices are being watched and recorded, that's probably because they are.
Almost everyone who uses social media, search engines or online shopping websites has noticed that what they search for, post or view one day shapes what they see or hear the next. That's because almost as quickly as you can click or scroll, lightning-fast computer algorithms work in the background, shaping your internet experience based on what you have done online in the past.
Finding the Logic
Emilee Rader of Michigan State University's College of Communication Arts and Sciences has been researching how massive datasets and computer algorithms impact people’s lives, both online and offline. As a scientist who studies human-computer interaction, Rader’s focus is sociotechnical systems, like Facebook, that consist of people, technology and information.
"The oldest version of a sociotechnical system would be a library," says the associate professor and AT&T Scholar in the Department of Media and Information. "You've got people. You've got information. And you've got technology for organizing and finding information, like the Dewey Decimal System and card catalogs, that pre-date computers."
Rader explains that sociotechnical systems are everywhere, providing new ways for people to interact and access information. Algorithms—those computer programs that extract data from our internet behaviors and automatically select and prioritize content for us to see—are part of the system and have the capability to evolve and change rapidly based on the data available to them.
That ability for rapid change has implications, particularly when it comes to understanding why algorithms make decisions about the communications and information we receive.
"If you ask a human being why they made a decision, they can usually explain it to you," says Rader. "But you can't ask a Facebook algorithm why it decided to show you what it did, or what it decided not to show you. Because these systems are hard to understand, it is also hard to determine if what they are doing is helpful or harmful."
Rader began examining how algorithms affect the internet experience in 2012, shortly after she came to MSU in 2011. It was a time, she says, when there wasn't a lot of buzz around the ability of systems to automatically shape social media and news streams. The 2016 election and indications of foreign influence on tech sites changed that.
Today, more people recognize how algorithms power social and commercial networks and may think twice about clicking on that cat video or on an opinionated post. Some users notice that their feeds serve up the content the platform thinks they want to see first, while limiting posts from others. Some users don't notice or think about it at all. Rader says the decision-making ability of algorithms, and the data about people they use to make those decisions, raises issues for privacy and a host of philosophical and sociopolitical concerns.
"One thing that's important to keep in mind is that these systems are recording data about everything we do," says Rader. "They watch everything from the people you are friends with to what you click on. They consider what you scroll through, how long you spend on a post, and which links you click on and read. They're taking signals and pieces of data and painting a picture of you. Then the system uses that data to choose what to show you."
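Facebook's actual ranking models are proprietary, so the details are unknowable from the outside. But the general mechanism Rader describes can be illustrated with a toy sketch: logged behavioral signals (clicks, dwell time, friendships) are combined into a per-post relevance score, and the highest-scoring posts surface first. All signal names and weights below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str

# Hypothetical per-user signals a platform might log
user_signals = {
    "topics_clicked": {"cats": 12, "politics": 3},
    "friends": {"alice", "bob"},
    "avg_dwell_seconds": {"cats": 45.0, "politics": 5.0},
}

def score(post: Post, signals: dict) -> float:
    """Combine logged behavior into a single relevance score.

    The weights here are arbitrary; a real system would learn them
    from engagement data rather than hard-code them.
    """
    clicks = signals["topics_clicked"].get(post.topic, 0)
    dwell = signals["avg_dwell_seconds"].get(post.topic, 0.0)
    friend_bonus = 2.0 if post.author in signals["friends"] else 0.0
    return clicks * 0.5 + dwell * 0.1 + friend_bonus

def rank_feed(posts: list[Post], signals: dict) -> list[Post]:
    # Highest-scoring posts are shown first; low scorers may never surface
    return sorted(posts, key=lambda p: score(p, signals), reverse=True)

posts = [Post("carol", "politics"), Post("alice", "cats")]
ranked = rank_feed(posts, user_signals)
```

Even in this toy version, the opacity Rader points to is visible: a user sees only the final ordering, never the signals or weights that produced it.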
Rader admits that sounds creepy and intellectually limiting, but she also acknowledges a more benevolent side. Algorithms can help people manage information overload and can filter searches. They can also improve the effectiveness of energy-efficient home thermostat systems, empower safety features like anti-lock brakes in cars, and provide automated feedback to students on assignments.
"They're simply a tool," she says. "Algorithms exist to make things easier for people. The fear is that the person or group making the algorithm has an objective. Another concern is how companies are using the data gathered through algorithms."
Pursuing the Path
Rader didn't always plan to examine the intersection of technology and human interaction. In fact, she trained to be an opera singer at Indiana University, then switched to psychology about midway through her bachelor's at the University of Wisconsin.
Rader followed her psychology path to graduate school. At Carnegie Mellon University she discovered the interdisciplinary field of human-computer interaction, the combined study of computer science, psychology and design.
"I took a longish route to finding my career, but along the way, I discovered I loved doing research," says Rader, who worked at Motorola for five years before earning her Ph.D. from the University of Michigan. "It's fulfilling to be in an environment like Michigan State and to have the ability to investigate important problems in the world and try to create solutions that will help people and improve their lives."
Rader came to MSU after doing postdoctoral research and teaching at Northwestern University. She's committed to exploring the effects of sociotechnical "black boxes"—or computer systems that collect data about people and have inner workings that seem mysterious or hard to explain.
Rader's research through the College's Behavior, Information and Technology Lab (BITLab) has been supported through grants from the National Science Foundation. Her particular focus is learning about how people understand and use algorithmic systems like the Facebook News Feed. She is also examining people's reactions when they begin to understand how Facebook’s algorithms influence what they see.
Rader says it's critical to recognize that tech companies, like Facebook, are in the business of grabbing attention, and monetizing the attention they grab. That, she says, is different from a traditional media organization that strives to offer balanced views and perspectives that are sourced and vetted.
"People like to say that if you're not paying for something with your money, you're paying for it with your attention," she says. "Facebook calls itself a technology company but it's also a media organization that is becoming a very powerful force for disseminating information around the globe. The fact that it's designed to capture attention means its mission and goals are different from traditional media. That has implications for what kinds of content the Facebook algorithms prioritize, and for the information that people consume."
In early December, Rader and Rick Wash, associate professor in the MSU Department of Media and Information, ran a two-day workshop on trustworthy algorithm decision-making in Washington, D.C. Supported by a grant from the NSF, the workshop invited people from the academic, government and industry sectors to discuss the increasing impact of algorithms in people's lives. The goal of the workshop was to identify ways to better understand, create, and evaluate algorithmic systems that people can trust.
By Ann Kammerer