If you’ve ever had Google or Facebook ads show you a pair of shoes you just bought, or had your phone prompt you to leave a rating for a restaurant you just left, you are one of many who have had to confront the uncomfortable relationship between modern consumers and the devices they rely on. We all opt into digital services that make our lives easier, even knowing that there may be data privacy tradeoffs. So what does it mean to make these choices? And what are the ethics of companies using our data, especially data about our health?

For this episode of the podcast, two special guests helped me tackle these complex questions: Vivian Singletary, PHII’s director, and David Addiss, longtime health ethics advocate and the director of the Focus Area for Compassion and Ethics in Global Health (FACE).

If you haven’t subscribed to Inform Me, Informatics, you can do so on iTunes, Soundcloud or most podcatchers. We’re also now on Google Play, Player.fm and Spotify. Like the podcast? Please consider rating us on iTunes! This will help other listeners find out about the show. We’re also asking our listeners to share their own personal definitions of public health informatics. You can let us know yours by leaving a voicemail for us at our podcast call-in line, (678) 974-0344.

David Addiss, director of FACE (left) and Vivian Singletary, director of PHII (right)

INTRO

PIPER

Alexa, what’s a consumer health profile?

FAKE ALEXA

It’s when private companies analyze consumer data about you, like social media and your purchases, to draw conclusions about health practices and build a proxy medical record for you. This information can then be sold to advertisers, life insurance companies, and other interested parties.

PIPER

What does my health profile say about me?

FAKE ALEXA
Your credit card has been used at eight different donut shops in the last two months, and it looks like you returned a stationary bicycle you bought after just two weeks for a refund. So it doesn’t look great.

PIPER

OK, OK!

[Music fades in.]

PIPER

Hi everyone, this is Piper Hale, and welcome to another episode of Inform Me, Informatics. So, first of all, that wasn’t actually Alexa; these are just the desperate lows I sink to when I don’t have a human co-host. But just as fake Alexa explained a moment ago, all of us, as plugged-in consumers in the Digital Age, make trade-offs every day between convenience and privacy. I actually do have an Amazon Echo in my home (though she doesn’t nag me about donut consumption), I’m wearing a Fitbit right now, and I’m an avid social media user. As part of my employee benefits, I voluntarily log my meals and share my workout data through a third-party healthy living portal to earn health insurance incentives. So what does it mean to make these choices? And what are the ethics of companies using our data?

I recently sat down with two people far more qualified than I am to answer these questions: Vivian Singletary, the director of the Public Health Informatics Institute and a familiar voice here on the podcast, and David Addiss, longtime health ethics advocate and the director of the Focus Area for Compassion and Ethics in Global Health, or FACE. I asked David and Vivian to help me delve into these complex ethics issues, and I started by asking them about their own perspectives and definitions around public health ethics. Vivian got us started.

VIVIAN

So when I think about population ethics, I think about it from the perspective of, “Are we representing this population fairly, correctly? Are we using information about these populations in a way that will ultimately benefit them?” So keeping that in mind is kind of what I think about in terms of ethics around population health – doing the right things for these populations.

DAVID

Bioethics or, you know, clinical ethics deals largely with these four major principles of doing good, beneficence, avoiding harm, nonmaleficence, justice or social justice, and autonomy, which is really about respect for persons. And you can look at clinical ethics issues through those four lenses, and sometimes those interests or those principles might be in competition. With public health ethics, we also have an additional obligation to make sure that the interventions we’re proposing or the data we’re collecting are actually going to be used for the good, and we also often have this tension between what is good for the population but might harm specific individuals. And so in public health ethics, we have to ask the question, “Is there a way to deliver the same benefits, the same good for the population in a way that harms people, harms individuals to a lesser degree, or that impinges on their respect or their autonomy to a lesser degree?”

PIPER

So to drill down a little bit more into the informatics side of public health ethics: how do you feel that changing technology is reshaping some of these ethics-based conversations? So, Vivian, you and I had talked a little bit about those online genetic tests and ancestry sites, which make people findable to genetic relatives. So if you submit your information to one of these sites, your relatives are now findable on that site, even if they, themselves, did not consent.

So I would love to hear your thoughts on the ethical responsibilities of sort of the entities involved in making these new technological, you know, these decisions that have never been possible before. And what are the ethical responsibilities we all need to keep in mind as technology and data mining techniques outpace what we currently believe to be possible?

VIVIAN

Wow. I have to tell you that what is going on in that area is very complicated, not necessarily from the DNA science, but from the social and cultural perspective in terms of population health and ethics.

But there are some, you know, kind of edge cases where it may not be great, such as people who may have donated, you know, an egg to help a couple conceive, and who don’t necessarily want to be contacted by their biological children, which makes this extremely difficult to deal with. I don’t have a right or wrong answer.

I think there’s a lot of gray, as it relates to issues like this. And at least, I want to believe that when this opportunity to unlock DNA for, you know, large populations to better understand their origins, that it was all done from the perspective of good. And that there wasn’t this thinking about these, you know, edge cases that could negatively impact someone. And so we have to continue to work through this gray and see how we can continue to unlock technology and do new things, but also try to protect those who may not want to be a part of it. I think it’s a new, emerging field. It’s difficult, it’s complicated, and we’re gonna have to figure out how to work through those as we go forward.

DAVID

I totally agree. I think the challenge is that there were these unforeseen uses, unforeseen applications. And so people have given an informed consent. Perhaps they’ve given a consent. Whether it’s informed or not could be debated, but it was probably written before some of these uses were allowed by the companies. And so there’s probably some clause that, once they’ve given their DNA, they’ve given permission for these other uses. That is problematic. And in my view, there should be an opt-in opportunity, where, as these new applications become available, each person has the ability to opt in. Otherwise, the default is that their data would not be used in that new way. I also think because these are relatively new, we haven’t had a conversation about our values as a society, and how we balance transparency and knowledge versus privacy and autonomy. And one of the big challenges of technology is that it can be used to alleviate suffering, it can be used for great good, and it can be used for harm. The developers of the atomic bomb were horrified when they realized what they had created and the potential for harm and misuse of that weapon.

So we see this all the time with technology, and public health really is authorized by the government. If you have a benevolent government, a representative government, there are greater checks and balances on the use of technology and data. If the government has biases, or there are people who are disenfranchised or marginalized, then the potential for harm is much greater.

PIPER

To follow up, David, on something that you just said about how the future repercussions of technology aren’t always visible. I wanted to drill down more into that: when you think about how quickly data mining, in particular, and those techniques are improving, can people truly make informed decisions on health activities that they may be engaging in, when they don’t have any conception of, you know, what technology may exist in the future that could result in very personal information being extrapolated from those activities?

VIVIAN

I was reading an article yesterday, it had to do with smartphones tracking you, you know, on your phone up to 14,000 times per day. And while this is not necessarily your health, indirectly, it really does reflect your health. Because it can track you to a specific room, a place. It can tell whether you’re, you know, in a particular place in your home, whether you’ve gone to the gym, you know, you’ve gone to see a physician. It really knows where you are at any particular time. It knows your habits. It knows which grocery store you went to and can put location data together with your affinity card data. So for example, if you shop at a particular store where you use a card to get coupon discounts, it knows where you are, when you were there, what you purchased. It can potentially track, you know, your data and your lifestyle choices, essentially. And so when you start to talk about Big Data and personal, you know, choices and what that looks like, you know, kind of transposed upon the larger population, you can know quite a bit about a person without really trying very hard.

So I don’t think we’ve figured out, to David’s point, all of the data that we’re sharing and what it actually means quite yet. I think we’re starting to come into that as we continue to evolve. We’re being marketed to a certain way because we go to certain websites, which also, you know, we’re buying more and more online. So all of our personal choices are being collected in a very detailed way, and that can tell you quite a bit about a person. While, you know, the private sector wants to use that to try to market and sell to you, it can also be leveraged and used by other industries. Health industries, you know, you have, I believe it was GSK, but don’t quote me, that actually purchased some of the 23andMe data. They wanna understand the genetics. They wanna understand how they can use that and mine that data to make better pharmaceuticals to help, you know, treat, you know, emerging diseases and new things. Or even some of the existing things that are out there, they wanna understand. But is that data being mined by the pharmaceutical companies a good thing, and is it going to improve population health? That still is yet to be seen. You know? So there’s things that are literally emerging as we talk right at this moment.

DAVID

It seems to me that the major driving forces for Big Data and data collection are the for-profit sector marketing, and perhaps surveillance on different activities and getting a better sense of issues like when you’re awake, when you’re sleeping. So the for-profit sector is really driving this. And it seems like, as with trying to create advertisements against smoking, or against excessive alcohol use, or against consumerism, the poor local health department or state health department has very few resources. So while health departments want to utilize data in very creative ways to lead to better health outcomes, the overwhelming mass of usage of these data is for profit, or for things other than health.

PIPER

So this conversation we’re having is making me wonder, with these private data sources where data is being collected with the intention of commodifying it, or, you know, it’s much less restricted than public health surveillance is required to be…

To what extent is it ethical for public health to use the data that’s already being collected for these purposes? So for example, Google Flu Trends tracks Google search queries. When I type something into Google, or post on Twitter, I am not personally thinking about that being aggregated and packaged and analyzed for public health surveillance. I’m wondering what are your thoughts on these tools, and what are the ethical considerations at play for using them?

DAVID

It’s challenging because, clearly, early warnings of outbreaks can be obtained through these types of data. So there’s clearly potential for great public health benefits that can be gained through creative use of data. But there’s always this shadow side, or this underbelly, where those datasets or the combination of different datasets could be misused. I think it’s really important in looking at the ethics of public health, or the ethics of data in a public health setting, to examine our own motives. We are about improving public health, but we also have careers, and we might be publishing papers and giving talks, and we have a personal interest in, say, advancing a certain line of inquiry or advancing the use of data. And sometimes that personal identity interferes with clear thinking about risks and benefits, and with bringing all the voices into the room to make a decision. So while we want to use data to advance health, I think we also have to check ourselves and examine our own motives, examine the potential benefits and harms, and to really listen to the people who are most likely to be affected. Too often, we think we know how to use the data and how to benefit health, without really consulting people who would be affected.

VIVIAN

Yeah. I totally agree with David. I think the use case that you just described, like “influenza,” like “illness,” you know, typing that in on Google, it just seems very, you know, just kind of non-suspect. You know, you just want to use it to get early warnings of, you know, maybe it’s the peak of flu season. You know, we need to understand that so that we can have these early interventions. We can reinforce and, you know, encourage people to go out and get vaccinated, make sure people are washing their hands, doing some of the basic things to tamp down the spread of influenza, which makes a lot of sense to me. So if we’re using that data in the right way, I think that it’s okay for us to do that because we have the right intentions, and we’re trying to drive an effect that will be for the good of all.

PIPER

So it seems like, as we’re talking, we’re sort of finding that a lot of these issues center around balancing public good with individual rights, and using ethics as sort of that lens for determining where the line is. So kind of keeping that in mind, what issues around data ethics do you both think will become more pressing in the next decade or so, as technology continues to evolve?

DAVID

I think Big Data, as we’ve already discussed, and the uses of Big Data and the combination of different datasets that might in themselves be okay to individuals, but when you combine them, their privacy or their habits might be revealed in ways that neither dataset alone would reveal. So Big Data, the combination of datasets for purposes that were not originally intended raises some of these questions in a big way. So as we become a much more data-driven society and as commercial interests shape the data that we collect, I think more and more ethical issues will be raised along the lines of what we’ve been discussing. I also think that the erosion of human rights recently and the polarization of the society makes the abuse of data much more likely, and that I think is going to be a more pressing issue. How do we safeguard privacy, safeguard respect, safeguard autonomy with respect to data, when there are interests in using the data to actually harm people?

VIVIAN

You know, I think the whole issue around helping the consumers truly understand what it is that they’re consenting to is still very blurry. We just kind of click, opt in, not really understanding how our data is gonna be collected, how it’s going to be used. You know, and how is that gonna affect me, you know, 5, 10 years from now? I have no clue how long any of these services hold onto my data. I’m not clear on that. I’m probably like everyone else in America, opt in, keep it moving. I just wanna use the app at the expense of giving up privacy. So that’s a real issue, and I think it’s going to become more and more of an issue.

PIPER

So, you’ve heard Vivian’s answer to this on the show before—in fact, it’s in the show’s opening credits—but I had to ask again how she defines public health informatics.

VIVIAN

Well, public health informatics, for me, is really about having the right data at the right time to understand the disease burden or key issues that are going on in population health, so that we can make interventions to change the outcome in a positive way.

PIPER

I’ll be bringing you David Addiss’s answer to that question, along with the rest of our conversation on how these issues apply specifically to population health and the work of public health practitioners, on the next episode of Inform Me, Informatics.

Many thanks to both Vivian Singletary and David Addiss for sitting down with me for this chat! Thanks also go to Jelisa Lowe, who contributed to this episode.

This podcast is a project of the Public Health Informatics Institute, which is a program of The Task Force for Global Health. Visit phii.org to learn more about all of our informatics work! You can also find us on Facebook and follow us on Twitter @PHInformatics.

And now that we’ve asked so many of our guests the same questions, we really want to hear from you: what’s YOUR definition of public health informatics? You can let us know by leaving a voicemail for us at our podcast call-in line, 678-974-0344. That number will also be in the episode notes.

For more information on the consumer health profiles I mentioned at the top of the show, check out the book “Our Bodies, Our Data” by Adam Tanner. The music used in this episode was composed by Kevin MacLeod. Also, I’m closing the show with a heartfelt goodbye and thank you to Jessica Hill, who’s worked with me on the show since the beginning. She remains a good friend of the show and will be greatly missed. I wish her all the best in her next chapter! Fake Alexa just—isn’t the same.

I’m Piper Hale, and you’ve been informed.

BUTTON

PIPER

Sorry. The laughter outside. They’re having a party out there.
