Is Data More Valuable Than Your Blood? Part II with Gerry Stegmaier


Rather Than Oil, Data Is Oxygen For Artificial Intelligence; Currently The Market Is Starved For Oxygen

While Current HIPAA Law Allows Hospitals To Use De-Identified Data (e.g., To Improve Care), Ultimately The Patient, Not The Hospital or Payer, Should Own the Data

We Need To Balance Patient Interests and Societal Good. There Is A Cost To Innovation From Well-Intentioned Regulation (HIPAA and Research Protocols)

Alan Pitt: This is Alan Pitt with Healthcare Pittstop. I recently published a post asking whether your data is more valuable than your blood. I thought I'd follow that up with a friend and leading attorney who is taking a close look at data and privacy issues. I have with me today Gerry Stegmaier. Gerry, thanks so much for joining.

Gerry Stegmaier: Hey, absolutely, glad to be here, Alan.

AP: Could you tell me a little bit about you, your background, and some of the problems you like to work on?

GS: Sure. My Twitter handle is @1sand0sLawyer, so that gives you an idea of how I think about things. There are four kinds of intellectual property: patents, copyright, trademark, and trade secrets. If you think of those as fingers, I do a fifth area, which is everything else that people care about immensely but that doesn't fall into those buckets and categories: a thumb, which allows you to grip things. In Silicon Valley and elsewhere, really, most of the action these days is around things that are difficult to define [the "thumbs"], and yet people either want to own them, protect them, or use them. So all of my energy professionally for the last 20 years has focused on helping dreamers become doers: figuring out not how to say no, but how to get things done, and helping people tackle really difficult technical problems around information and information access.

AP: I’d really like to introduce you to my hospital, because I often get a note from them when I ask them for things. I know you’ve been involved in at least two Silicon Valley startups leveraging big data, and then moved on to Reed Smith. What did you learn from your previous efforts and what drove the move?

GS: One of the things I increasingly encountered is a real need for deep, Washington-based regulatory expertise around healthcare at agencies like the Food and Drug Administration, CMS, and OCR. However, many, many Washington lawyers are not actually close to or comfortable with the technology. The kind of folks I work with are people who are really trying to change the game… in other words, don't hate the player, hate the game. People who are trying to do game-changing work, who are really trying to push glaciers, need expertise that is very, very specialized but that can then be applied in a business context. We have more than 110 lawyers at Reed Smith focused in and around one of the best healthcare practices on the planet. So I've found that by embedding myself with those folks, I can, on a daily basis, over ramen at the coffee shop next to the office, talk about things that are really, really problematic for those dreamers I'm trying to help.

AP: Yeah, I totally get why you'd want all those resources available to you. As a point of disclosure, as you know, I am the Chief Medical Officer of a big data company, CloudMedx Health, an AI company out of Silicon Valley as well. Really, the future of AI is built on medical data as its fuel, and that has given rise to questions about who owns the data and about data monetization. Lots of hospitals, vendors, and other systems have been looking for ways to monetize the data they currently hold. Can you give me any comments about where we are today and where you see the future going?

GS: Sure. There’s been a lot of discussion about data being the new oil. I really don’t like the oil metaphor, but I do like the internal combustion engine metaphor because I think data is actually oxygen. And when it comes to artificial intelligence development, the market is really oxygen-starved. The Wall Street Journal—I think it was the Journal—had a story about 10 days ago that essentially said many, many AI projects are stalled because of the time and energy and work related to not only identifying data assets, but preparing data. Simply being able to run the legal traps, run the regulatory traps, is virtually impossible. And when you throw into this layer, as a predicate for your question, which is an increasing realization especially in healthcare, that data could be an asset class for the 21st century, it ends up resembling a Minions movie that my kids used to watch. Everybody goes, “Mine, mine, mine, mine, mine, mine, mine.” And that is not a great outlook when it comes to thinking about how to share because you know very well, Alan, to have models that are resilient, that aren’t brittle, often requires massive data sets, and very few institutions have data at the volume or scale required to build those non-brittle models.

AP: I work at one of the largest healthcare systems in the country, what was Dignity Health and is now called CommonSpirit. I think CommonSpirit certainly feels a need to protect data, but they also feel that they, to some degree, own that data, as do many insurers, who have taken the position that they own a patient's data. Where does that position come from?

GS: Greed. I hate to put it that way, but in many, many business contexts, the second someone says, "We own the data," it's a horrible oversimplification. What we've done in other companies I've been involved with, and even in other verticals, is try to move the conversation beyond ownership, because it's overly simplistic. I don't need to own the beach; I just would like to be able to get to the beach to enjoy it. Or if I'm having a dinner party, I don't have to own the painting; I just might like the painting to be in the room when I need it. That requires a new kind of thinking, but we have a lot of experience with it. When our kids are out on the playground and one says, "Hey, I really want to play this game," and the other kids say, "No, we want to play a different game," we don't say it's winner take all. What we say to our kids is, "Okay, why don't we play this one first, and then we'll play the game that you want to play?" And that's really a different mindset.

GS: When it comes to healthcare data, there are two things that are really critical, in my estimation, that we have to keep in mind. First, it's the patient's data on some level. And I don't mean ownership; when people in healthcare roles say, "Hey, this is our data," as a matter of intellectual property law you generally can't own facts. If you collected those facts from other people, they often have at least as good a claim to them as you do. We're seeing this play out in legislative initiatives and in people talking about data dividends. So my perspective is, at least as between all those entities saying that they own the data, the patients probably have the first claim.

GS: We’ve had a lot of discussion about what we’ll call patient-centric data interoperability at Higi, where I’ve been involved for a long time. What we found was when we said, “Look, everything we do is patient-centric. We’re going to do what patients want; we’re going to do what consumers want first,” that’s a really different mindset. Actually, it’s quite easy—physicians will say, “Hey, there’s this oath to do no harm, and if it isn’t in the patient’s interest, then I don’t think we should do it,” and that really makes those conversations go very easily.

GS: The second piece of this, besides patient-centric data interoperability and patient-centric approaches (and you know CMS has an initiative on this; I've heard we've spent $37 billion subsidizing EHR development, yet we still don't have interoperability), is the public interest: improving care. The fact of the matter is that in the United States, regardless of how we ultimately get there, every one of us as a taxpayer pays for a massive amount of healthcare. So balancing direct individual patient interests with our collective societal interest in the data generated from healthcare is something we don't have to deal with in some of the other verticals.

AP: Very interesting. You get at the point that I raised in my previous blog, which is really societal benefit versus personal privacy rights. Do you believe that hospitals, through their role in curating that data, will share in some of the revenue streams as products are made? How do you see that happening? How do they find a path forward from where they are?

GS: Yes, it’s a great question. I’ll just quickly say that what we saw in the past was that most institutions, at least in my experience, said, “Look, we’re in the healthcare delivery business. We’re not in the data-centric product development business.” And so you didn’t have that Minions example where they were saying, “Mine, mine, mine, mine, mine, mine” when it comes to the data. Now, that’s changed in the last couple of years, and where I think there’s a really interesting opportunity is for these organizations to participate in the upside, either by improving their own operations from building new products, but also maybe just building products that work for everyone and having a more participatory process. But candidly, I would say one of the most important things that has to happen at a mindset level… My dad used to always say, “Do what you do best, and get other people to do the rest.”

AP: Yes—hospitals certainly are not going to become data science companies to productize their data, and they’re going to need help. That is absolutely for sure.

GS: But the opportunity remains for strategic relationships, partnerships, public-private partnerships, bringing the right experts in, and collaborating strategically. In many instances, the biggest challenge is just those organizations getting out of their own way.

AP: Yes. Some people have looked at historical patient rights regarding tissue donation and tried to apply them to data rights; I alluded to that in the earlier blog. Both tissue and data have value. Any thoughts on how tissue and data rights overlap or differ?

GS: Yeah, I always think about HIPAA, the Health Insurance Portability and Accountability Act. Under HIPAA, once data is de-identified using one of the two approved methods (Safe Harbor or Expert Determination), an institution can use that information in any way it wants. One way to think about this: if two people talk to one another, the way the law and our society work, the person who hears what the first person said isn't limited in what they can do with it and what they can learn. So if we were to approach data, or even tissue for that matter, and just say, "Look, the healthcare and physician community can't benefit from their experience and what they learn" (those are what we as lawyers would call residuals), that's going to put us in a really bad place. You don't get the upside of having done arthroscopic surgery 10,000 times; you don't get the benefit of the things you learn from 10,000 hours of practice.
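[Editor's note: for readers curious what the Safe Harbor method involves in practice, here is a minimal illustrative sketch in Python. The record fields and helper names are hypothetical; a real implementation must remove all 18 identifier categories listed in 45 CFR 164.514(b)(2), only a few of which are shown here.]

```python
# Illustrative sketch only: HIPAA Safe Harbor-style de-identification.
# Field names are hypothetical; only a handful of the 18 identifier
# categories in 45 CFR 164.514(b)(2) are handled.
from datetime import date

# Direct identifiers that must be stripped entirely.
DIRECT_IDENTIFIERS = {"name", "ssn", "mrn", "phone", "email", "street_address"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with Safe Harbor-style redactions."""
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue  # drop direct identifiers entirely
        elif field == "zip":
            # Keep only the first three digits of the ZIP code (Safe Harbor
            # additionally requires the 3-digit area to exceed 20,000 people).
            out[field] = str(value)[:3] + "**"
        elif field == "birth_date":
            # Dates more specific than the year must go; ages over 89
            # collapse into a single "90+" bucket.
            age = date.today().year - value.year
            out["age"] = "90+" if age > 89 else age
        else:
            out[field] = value
    return out

patient = {"name": "Jane Doe", "mrn": "12345", "zip": "85004",
           "birth_date": date(1952, 3, 14), "diagnosis": "I10"}
print(deidentify(patient))  # e.g. {'zip': '850**', 'age': 73, 'diagnosis': 'I10'}
```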

GS: But at the same time, we've had some tissue-oriented cases where institutions treated people long after the need to treat them had passed, because they were trying to develop and build things. So I think coming back to that do-no-harm tenet, and its corollary, finding ways we can do well while doing good, is where we really have to strike the balance. I think we're really fortunate, at least in the healthcare community, that there is a very strong sense of ethics and a real public interest in the community. But to put a finer point on your question: I think there are some very specific distinctions between tissue-related research and data-related research, but the similarity that matters most is what we're talking about here. What should you be able to do with what you have learned?

GS: The way we operationalize that in other areas of law, and I think financial institutions are a good example, is that your bank isn't allowed to trade against you as a consumer, right?

AP: Right.

GS: So the question is, should your bank be able or allowed to benefit directly, without you participating, from what they have learned from you as a customer? Or, and we're frankly increasingly seeing this in the healthcare context, should the hospital system or the provider or the insurer be able to benefit from what they see from every patient? Because as you know, Alan, where data often differs is that it's very rare that any one individual patient, or any one patient's data, is the silver bullet, if you will. With AI, we're approaching things scientifically in a very different way. So it's a fascinating subject, and one that isn't going to go away. Ultimately, patients who prefer that their data not be used for curing cancer or doing really amazing things will likely have a way to say, "I withdraw my consent," or "I don't give you my permission in the first place." That's how it works with tissue donation now. Where we could mess this up is by failing to find a way to have those conversations efficiently, because no one wants more paper to fill out in the waiting room.

AP: Yes. I want to make two small comments: One, you did mention that HIPAA basically states that once your data is anonymized, the hospital pretty much owns your data?

GS: Correct, though I don't think of it so much as ownership… this goes to those subtleties. It can be used in an unrestricted way. Like that example: if two people talk to one another on the street, you can't expect that the person you talked to isn't going to repeat or use what they heard in that conversation. And de-identification, severing the linkage back to you, essentially changes those expectations.

AP: I get it. The second comment, by the way, is just an area of interest for me. I’m curious about whether blockchain and permissions may play a role in allowing patients to opt in or opt out—that’s a much more fluid model than we have today. I think that’s an interesting space.

GS: Yes, I think distributed ledgers and the accountability that comes with them present a really fascinating opportunity. There are also some pretty interesting privacy challenges in how you build those ledgers and in what is on-ledger and what's off-ledger. Between smart contracts and blockchain technology, we're likely going to see some really interesting innovation. But as you know, Alan, at the same time, in healthcare it's not uncommon to take five years to get to a no-revenue pilot. So it isn't the best place to be a venture capitalist.
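[Editor's note: to make the on-ledger/off-ledger distinction concrete, here is a minimal sketch in Python. The design and every name in it are hypothetical: the full consent record stays in a private, off-ledger store, and only a salted hash is anchored on-ledger, so opt-ins and withdrawals are tamper-evident without putting health information on a shared ledger.]

```python
# Hypothetical sketch: patient consent kept off-ledger, with only a
# salted hash anchored on-ledger for tamper-evident auditability.
import hashlib, json, secrets, time

off_ledger_store = {}   # private database: full consent records (PHI stays here)
on_ledger = []          # stand-in for a distributed ledger: hashes only

def record_consent(patient_id: str, purpose: str, granted: bool) -> str:
    record = {
        "patient_id": patient_id,
        "purpose": purpose,          # e.g. "secondary-research"
        "granted": granted,          # True = opt in, False = withdrawal
        "timestamp": time.time(),
        "salt": secrets.token_hex(16),  # salt prevents guessing the record from its hash
    }
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    off_ledger_store[digest] = record   # full record stays off-ledger
    on_ledger.append(digest)            # only the commitment goes on-ledger
    return digest

def verify(digest: str) -> bool:
    """Check that an off-ledger record matches its on-ledger commitment."""
    record = off_ledger_store.get(digest)
    if record is None or digest not in on_ledger:
        return False
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest() == digest

receipt = record_consent("patient-123", "secondary-research", granted=True)
record_consent("patient-123", "secondary-research", granted=False)  # later withdrawal
print(verify(receipt))  # True: the original opt-in is still auditable
```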

AP: Yeah, I definitely know that. One of the things raised in my podcast was this question of whether your data can be used against you. There was a case of denial of benefits for a patient. You gave me some stories from other industries. How do you feel about that, your data, your health, being used against you? What does that look like in the future?

GS: I think it will be increasingly difficult to use information against people in ways they don't know about. It's already regulated pretty extensively. We have a law called the Fair Credit Reporting Act, and if information related to a person's character, reputation, or fitness is used in what's called decisioning, the process by which that decisioning occurs, and how people are notified about how decisions were made, is highly regulated. We've all seen this when we've applied for life insurance and they want to come and do a physical. What I think is going to happen is that we're going to really need to improve those disclosures and how those processes happen; California has a new law that looks to this. We're going to see a lot of activity around regulating the data brokers and data sources that come into play.

GS: The best vertical example I can think of is connected cars: electronic data recorders in vehicles are regulated in virtually every state, and generally how you drive your car can't be used against you in pricing insurance. So usage-based insurance has come about as a very specific market. If you want that kind of usage-based insurance, you go through a very specific process; you agree to have a special device put in your car, et cetera, as opposed to a world where every car talks to the network and you're charged insurance based on how the insurer knows you used your car over the last year.

AP: Yes. I think of this as "you have to give to get." If you want discounts, whether it's for your auto insurance or maybe for your healthcare, you may have to give a little bit so that people can partner with you.

GS: I think that’s exactly right.

AP: In an ideal world, how would we as a society keep individual rights respected and at the forefront without putting up additional barriers to progress? Any thoughts?

GS: I think the first and most important piece is to focus on context and on what consumer expectations are. So rather than having a negotiation every time we want to do something, it works the way it works in many contexts: this is the baseline rule in this situation, and we don't have to talk about it. It's just how it works. And if you want to vary from the norm, then you have to have the negotiation. The way I think about it, there are going to be some things that you will just be able to do. Under HIPAA right now, using PHI, Protected Health Information, for healthcare operations is just okay. You don't have to have a conversation about that; we don't have to talk with patients about that. But when we want to use it for what we call secondary uses or purposes, a lot of times we have to have a conversation, or get permission, or give someone the ability to opt out. And I think we're seeing a lot of refinement around that.

GS: But there are two places where we have privacy-related pain. First, some information is just so sensitive that we've decided we're going to have the conversation about permissioning, informed consent, on the front end every time. But the second is more difficult: information that might not otherwise be perceived as sensitive, but when you add it all together, or when it can be gotten easily enough, bad things can happen to people. I think your insurance denial example is one of those sorts of things, right?

AP: Right.

GS: The government doesn't have to have a warrant to get information about you, because they can just buy it in the marketplace, so the protection that warrants give us is pretty greatly eroded. My view is that we need to figure out what we think is the right threshold as a norm, and then just let that oxygen move freely. If we think too much oxygen is a bad thing, as near a fire, then we put in some additional controls. One of the neat things about healthcare is that we have a ton of experience with HIPAA, and we know that this really, really rigorous system of IRBs and HIPAA is limiting innovation. It's limiting care improvement; it's stopping it, and I think I'm being generous. How do we tackle that? Or at least, how do we acknowledge that cost in a systematic way, alongside the privacy challenges, which I think we agree are very big and very real? We're not really having that conversation right now. There has been very little discussion of the real-world deadweight loss associated with privacy protections in healthcare.

AP: Yes. I once wrote a blog asking where is the opt-out-of-HIPAA button? I often think HIPAA is worse for me; more dangerous for me! You’re trying to help me, but you’re not really helping me very much.

GS: Anyone who has spent their day driving around to share medical records would agree with you.

AP: Yeah, absolutely. Can you talk a little bit about the work you’re really focused on now?

GS: Sure. I’m doing a lot of work with companies that are focused on unlocking and unleashing the potential for artificial intelligence in healthcare. If you’ve ever looked at a $20 bill or on the back of cash, you’ve seen that pyramid with the omniscient eye at the top. You don’t get the omniscient eye unless you can tackle some foundational problems. And those foundational problems focus on access to data; the ability to get compute at prices and rates that are affordable, the ability to get parties with different interests to set those interests aside and recognize collaboration. So a lot of the things that we’ve talked about here, and those are $100 billion opportunities often with an annual run rate of $20 billion. I saw something recently that suggests that Google is radically undervalued if you look at Google as an AI play, and a focus on artificial intelligence. So that’s the sandbox where I’ve chosen to spend the next 10 years of my life.

AP: How interesting. You know, I really appreciate our conversation. I recently read the book Thank You for Being Late. It’s not what we can do, but what we’re allowed to do. For all the doers in the world, I’m glad that you’re there to try to help them navigate all the barriers that have been put in place for the betterment of society. Thank you very much, Gerry, I appreciate your time.

GS: Thank you. I am really delighted to be here on Pittstop.
