Q&A with postdoc John S. Seberger on post-COVID study

John S. Seberger is a postdoc in the College of Communication Arts and Sciences at MSU. He works with Dean Prabu David and an interdisciplinary team from ComArtSci and the College of Engineering. 

He and Sameer Patil, an associate professor in the School of Computing at the University of Utah, coauthored a recent article published by JMIR Publications titled "Post-COVID Public Health Surveillance and Privacy Expectations in the United States: Scenario-Based Interview Study."

Below, he answers questions about his research and addresses important takeaways from the article.

Why did you want to conduct a study related to the COVID-19 pandemic? How does your research fit into studying an aspect of the pandemic?

I’m a technologist, but I’m a humanist first. I’ve been profoundly privileged to develop expertise in multiple fields, and I try to bring that expertise to bear on contemporary problems. I am passionate about understanding how technologies (broadly construed) change not only the daily experience of people’s lives, but the ways people think about themselves and are treated by institutions. More than that, though, I see it as my duty to argue on behalf of people’s rights in an increasingly data-driven world – primarily their right to be treated as people deserving of, and fundamentally entitled to, dignity. In an historical period characterized by digital-everything, this often means focusing on how apps implicitly treat people as data mines – things to be used as means of monetizable data generation.  

So, I ask questions about how people think about the technologies they use – things like what certain technologies allow people to accomplish, how technologies collect and use data, and how technologies shape the future. In the context of 2020 and 2021, these questions are necessarily contextualized by the pandemic. As a technologist and humanist, I wanted to contribute to our understanding of what it might mean in the long run to develop a public health infrastructure on the backs of pandemic-tracking apps.

Looking at the intersection of information gathering and personal security/dignity seems to be an important part of your area of research. Why is this an important topic to study?  

This is a great question. It makes me think of the saying, "plus ça change …" You know, "the more things change, the more they stay the same." The process of change is a constant. When so much of the world is changing by means of digitization – what folks call the digital transformation – I think it's essential to guide change in the directions that we think are beneficial. This requires not only careful empirical work, but deep conceptual work – and work that is participatory rather than prescriptive.

At this point, apps are just a part of life. It is expected that people use them – for work, for communicating, for mundane daily activities like commuting or shopping. A person decides to use one app or another for myriad reasons, some of which might be rational, others of which might be affective. But their decision to use an app doesn’t grant them agency in the way the app functions or how it treats their data. Instead, they’re empowered to do a handful of things – post photos, chat with friends, order cat litter – but they’re also disempowered because they resign themselves to the data practices in which an app or its associated platform engages. How can a person be treated with dignity if they have no agency or no reliable evidence of good faith data practices? 

Theoretically there is a sweet spot between disclosure and privacy. But the desire to find that sweet spot and design within it is frequently confounded by assuming that ours is an historically stable form of privacy. If I say, “privacy,” and you say, “privacy,” sure: there’s a good chance that we’re talking about similar things. (Right now, we’d probably be talking, at least to some extent, about data.) But maybe we both remember times when data wasn’t so central to daily life. Historical definitions of privacy don’t seem to apply very well anymore. For many folks, privacy is about keeping third parties out – preventing Eve from hearing what Alice and Bob are talking about. But when the devices and apps we use are valuable and valued precisely because they communicate data among a large and unspecified network of actants – people, devices, algorithms, etc. – basic ideas of privacy start to fall apart. Eve is always there; the third party is always implicitly part of the conversation. 

Finding the "sweet spot" shouldn't necessarily mean a return to the past – the realization of an idealized form of privacy that could only exist in a non-networked world. It means, for me anyway, the willingness to explore the futures of privacy. The willingness to confront the idea that privacy itself is changing. This isn't necessarily a bad thing, either. Values evolve, but we have an obligation to evolve along with them and to exert agency in shaping that evolution. The potential evolution of privacy means that we have all the more responsibility – not only as users, but as designers, scholars and developers, all the way up to Big Tech CEOs – to actively account for the values that matter to us as people.

Your article, "Post-COVID Public Health Surveillance and Privacy Expectations in the United States: Scenario-Based Interview Study," was recently published. It looks at the various apps that are out there and the understanding that, for many, giving information is something they're willing to do for the "common good" to fight the pandemic. Was there anything that surprised you about the study and/or people's responses?

I'm sort of a disappointed optimist. I guess some folks call that being a realist. I expected that we'd find conflicting narratives – that's one of the most interesting parts of talking to people! But what really surprised me was how often "the greater good" came up as a motivation. When we began making sense of what all these different people said, I got another surprise: ideas like "the greater good" do double duty for competing interests. On the one hand, many of the people we spoke to really want to contribute to the greater good by doing whatever they can to stop the spread of COVID-19. And that's really beautiful – I mean, it's people acting to protect other people as much as they're acting to alleviate their own pandemic-related stresses. But people's desire to serve the greater good opens some troubling doors. When serving the greater good means adopting new technologies – new apps – that are deployed within the greater cultural and economic context of surveillance capitalism, it opens up the possibility that serving the greater good in the immediate present of the pandemic may facilitate the routinization of ever more data tracking. In this light, easily envisioned public health surveillance apps become just another brick in the wall of an increasingly normalized surveillance culture. To a certain extent, the people we spoke with thought it was realistic that health data collected from public health surveillance apps would be monetized like any other data. I should think this would be deeply troubling from the perspective of medicine – a field that is predicated on the provision of care and "doing no harm."

Could you talk about the sociotechnical implications of these apps and the trade-offs people are willing to make?

People made really blunt statements about how public health outweighs personal privacy. Again, the apparent altruism of this was really something to see. But it was pretty clear that this sort of trade-off has a shelf life. People are willing to trade their privacy to one extent or another when there is a catastrophic emergency, like a pandemic, but not after. When we started talking about the use of pandemic-driven technologies as routine after the pandemic – as means of public health surveillance – people became very uncomfortable. They assumed that the data such apps generated would be collected and monetized. Frankly, this is a reasonable assumption given the economics of contemporary app use. So, in a sense, people were talking about a Catch-22: allow privacy violations to improve public health but expect those violations to be normalized later. 

The tricky part here is that useful technologies don’t disappear. When powerful institutions like Big Tech companies and governments come together to build an infrastructure for something, that infrastructure isn’t just going to go away. It may well be the case that pandemic-tracking apps play a meaningful role in curtailing the pandemic – we’ll likely decide that based on a mixture of empirical data and public opinion. But if we declare those apps to be successful, it’s only natural that we’ll seek other contexts in which that success can be duplicated. But just like there was a shelf life to people’s willingness to trade privacy for health, there are myriad contextual factors that need to be considered when we start thinking about how, say, the Google-Apple pandemic-tracking infrastructure might be deployed in the future. 

Have you found that, because of the pandemic, people have become desensitized to their own privacy or the privacy of others?

I'm not sure that "desensitized" is the right word. Privacy is one of those things that most people don't care about until it is breached. It's like a road – you don't really pay attention to the integrity of a road's surface unless you have reason to, and generally, reasons to pay attention to the road stem from a problem with the road. I think what we're seeing is a breakdown in privacy itself – now the road always appears questionable because it is. Technological solutions to the pandemic may contribute to privacy desensitization in the long run, but if they do, it'll be because we already have the infrastructure of resignation in place.

What can the public take away from this study? 

For the general public, the takeaway here is simple: We need to act in each other's best interests, and during a pandemic that means taking whatever actions are necessary to prevent the further spread of disease. The person next to you, across town, or across the country is as deserving of respect as you are. And that means being responsible about the role you play in contributing to the restoration of public health – or failing to.

At a higher level, beyond the triage and immediacy of the pandemic, we need to be cognizant, vigilant and vocal. People need to speak to their governmental representatives about data practices. Until we can reasonably expect that the apps we use will treat us and our data with the dignity we deserve, people need to seriously consider whether it is worth being a user. Now is the time to talk about and shape our data-driven futures – not later.