John Sturino

Privacy after the pandemic

A friend once told me that the most important change in our lifetimes happened in 2008, when the cost of keeping data dropped below the cost of destroying it. Companies began to sit on ever-increasing piles of data, and eager Product Managers began finding ways to use it.

Thus began a decade of privacy challenges that resulted in a series of regulations - just now coming into force - which are starting to change the relationship between individuals and companies. But as we confront COVID-19 - a shock to the system that changes how we think about our relationship to each other - will we start devaluing our privacy, even as we walk around with ever-more-powerful data-gathering devices in our pockets?

Privacy regulations in a post-pandemic world

We traded our privacy for convenience when times were good. Now that we are entering into an era of global crisis, will we be even more willing to trade our privacy for the perception of security? And if we don’t, will our lawmakers?

For now - the answer appears to be no. As you can see from Joe Biden’s op-ed in The New York Times, politicians are still giving a nod to the idea that people are worried about their privacy.

[T]here needs to be widespread, easily available and prompt testing — and a contact tracing strategy that protects privacy. A recent report from Mr. Trump’s Department of Health and Human Services made clear that we are far from achieving this goal. -Joe Biden

Apple and Google announced that they are cooperating to enable Bluetooth communication between phones to let you know when you’ve come into contact with someone who has tested positive for Covid-19. As CNBC summarized:

Apple and Google, normally arch-rivals, announced. . . that they teamed up to build technology that enables public health agencies to write contact-tracing apps. The partnership is being closely watched: The two Silicon Valley giants are responsible for the two dominant mobile operating systems globally, iOS and Android, which together run almost 100% of smartphones sold, according to data from Statcounter…The fact that the apps work best when a lot of people use them has raised fears that governments could force citizens to use them. But representatives from both companies insist they won’t allow the technology to become mandatory…

The way the system is envisioned, when someone tests positive for Covid-19, local public health agencies will verify the test, then use these apps to notify anybody who may have been within 10 or 15 feet of them in the past few weeks. The identity of the person who tested positive would never be revealed to the companies or to other users; their identity would be tracked using scrambled codes on phones that are unlocked only when they test positive. Only public health authorities will be allowed to access these APIs, the companies said. The two companies have drawn a line in the sand in one area: Governments will not be able to require their citizens to use contact-tracing software built with these APIs — users will have to opt-in to the system, senior representatives said on Monday.

A quick read of the white paper shows you that (signaling a concern for) privacy was top of mind in developing the solution.

Maintaining user privacy is an essential requirement in the design of this specification. - Contact Tracing Bluetooth Specifications

Contact Tracing and Privacy

When a person has enabled this functionality, their phone keeps track of all of the Bluetooth signals it has come close to during the day (using signal strength to estimate how close). If any of the phones you’ve come close to belongs to someone who has enabled the functionality and tested positive for Covid-19, you will be alerted that you may have been exposed. Note that the use of location is optional and requires a separate opt-in. So, the system will only know if you’ve come into contact with someone who has downloaded the app, tested positive for Covid-19, and opted in to sharing that data. The data will be double-encrypted and decentralized, so there won’t be a central repository of this data.
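To make that flow concrete, here’s a minimal sketch in Python. It is simplified well beyond the actual Bluetooth specification - the key sizes, rotation intervals, and function names are my own illustrative assumptions, not the real protocol. The idea: each phone derives short-lived broadcast identifiers from a private daily key and records the identifiers it hears nearby; when someone tests positive and consents, only their daily keys are published, and every other phone checks locally for a match.

```python
import hashlib
import hmac
import os

# Illustrative sketch only - not the Apple/Google protocol. Key sizes,
# rotation intervals, and derivation details are simplifying assumptions.

def new_daily_key() -> bytes:
    """Each phone generates a fresh random key every day."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive the short-lived identifier broadcast during one time slot.
    Because it rotates (say, every ~15 minutes), passers-by can't link
    broadcasts back to a single device."""
    return hmac.new(daily_key, f"interval-{interval}".encode(),
                    hashlib.sha256).digest()[:16]

# On your phone, during the day: record identifiers heard over Bluetooth.
observed = set()
stranger_key = new_daily_key()              # lives only on the stranger's phone
observed.add(rolling_id(stranger_key, 42))  # you passed them in slot 42

# Later: the stranger tests positive and consents. Only their daily keys
# are published - no names, no locations, no contact lists.
published_keys = [stranger_key]

# Your phone downloads the keys and checks *locally* for a match, so no
# central server ever learns who met whom.
exposed = any(rolling_id(key, slot) in observed
              for key in published_keys
              for slot in range(96))        # 96 fifteen-minute slots per day
print("Possible exposure" if exposed else "No known exposure")
```

Publishing keys rather than contact events is what keeps the system decentralized: the matching happens on the handset, not in a database.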

There are a ton of challenges with this, but I want to focus on how it illustrates the tension between privacy and products that depend on network effects.

More to the point, what does it mean that only two companies are necessary for this to work? Apple and Google represent nearly 100% of smartphone operating systems, and smartphones are in the hands of about 45% of the global population. (If that number seems low to you, it’s closer to 76% in most European countries, the US, Australia, South Korea, and Japan. In Latin America it’s 45%, and in India it’s closer to 25%.)

Let’s say that between the two, this covers 75% of the folks in the US. If Apple (with 48% of the market) had launched alone, it would have covered just 36% of people (48% of that 75%). Now you’re walking around and about two out of every three people nearby could have COVID without you knowing. So, you’re just as twitchy about every sneeze.

But when Google and Apple both enable it, you’re thinking that at least you’ve got good coverage. I mean, we all know some idiot who spoils our text threads by having a different OS, but hey, if 3 out of every 4 strangers have it enabled, then I should be safe, right? And in reality, the 25% of people who don’t have smartphones are often socioeconomically distanced from those who do. [Yes, I know this means health protection only for those who can afford it, but that’s a point for a different post.]

But that calculation of “safe” assumes that everyone we happen to walk past will turn on the function. The fewer people who opt in, the less useful it is. So there will be great social pressure to turn it on, which will reduce our real choice to preserve our privacy.
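A quick back-of-the-envelope calculation (my own, with illustrative adoption figures) shows how sharply usefulness falls off: a contact is only detected when both phones are running the system, so detection scales roughly with the square of the opt-in rate.

```python
# Back-of-the-envelope: a contact is only detected when both people have
# the system enabled, so detection scales with the square of adoption.
# The adoption figures below are illustrative, not measured.
for adoption in (0.36, 0.75, 0.90):
    print(f"{adoption:.0%} adoption -> ~{adoption**2:.0%} of contacts detected")
```

At 36% adoption (Apple alone), barely one contact in eight would be caught; even at 75%, nearly half slip through. That’s the network effect - and the source of the social pressure.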

Per the design, the data captured for contact tracing is anonymized. But anonymized does not mean unidentifiable: matched with other pieces of data, it can still point to a specific person.
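Here’s a toy example - entirely hypothetical - of how that matching works. An exposure alert carries no name, but you know roughly when each alert’s contact happened and whom you were near at those times; intersecting a few small candidate sets can narrow “anonymous” down to one person.

```python
# Toy linkage attack, entirely hypothetical. The alerts are anonymous,
# but your own memory of each time slot is the auxiliary data.
encounters = {
    "Mon 09:00": {"Alice", "Bob"},
    "Tue 12:30": {"Bob", "Carol"},
    "Wed 17:15": {"Bob", "Dana"},
}
# Suppose you received an exposure alert for each of those three slots...
candidates = set.intersection(*encounters.values())
print(candidates)  # {'Bob'} - the "anonymized" data just named a person
```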

Après le déluge

We will not return from sheltering at home all at once. We will not have enough tests all at once. It is likely that our limited testing resources will be directed by the contact tracing data, which makes “opt-in” even less optional. Plus, our real-world interactions will be scaled down (fewer people in restaurants, etc.), so figuring out who is who - even from anonymized data - will be easier.

Our doctors will want to know as much info as they can, so they’ll need to know exactly when our exposure happened and in what context. Even if the time and location data is only visible to the doctors, the questions that they ask will give you a good idea of when and where. Did someone sneeze? Did they rub up against you? Were they thumping the melon that you picked up? So, for the data to be useful, it’s going to give us a pretty good idea of who has tested positive. And when they tested positive.

So, will Covid-positive be the new scarlet letter?

Of course, testing positive isn’t a bad thing. In a little while, it’s going to be the most positive sign in our society: to have tested positive and now be immune (well, that is, if it turns out that you can get immunity). Those people will be the only ones allowed to perform certain jobs, allowed to attend concerts or eat at restaurants, maybe even go to schools.

So, now there is an incentive to game the system. And wherever there is an incentive, there will be those who seek to exploit the weaknesses in the system.

But what about the data?

Importantly, what happens to the data once it’s been collected? As I noted at the beginning of this post, it’s cheaper to keep data than to discard it - to the point where it now takes a conscious, dedicated effort to actually get rid of collected data.

This might be OK - since the bigger companies are the bigger risk with regard to data breaches - if we had strong privacy regulations. But, as McKinsey noted in their article “How business leaders can plan for the next normal”:

Some consumers and governments—but by no means all—may change their attitudes toward the sharing and use of personal data if it can be demonstrated that the use of such data during the crisis helped safeguard lives.

Which means that, as we look at our post-pandemic world, regulators (and consumers?) are likely to take a different view of health data. As we all better understand how the person next to us can influence our own health, do we start to think of health data as a public good? And if it’s a public good, does it still belong to the individual?

Conclusion: the problem with privacy

Ultimately, the problem with privacy is the same as before the pandemic: your data is being used as currency, but there is no market transparency for its worth. When you pay for an app with your data, it’s not “free” - but a consumer has no way of evaluating the value of what they’re paying with. It’s unclear how that transparency could come about, and there is little incentive in the market to solve the problem right now.

As we enter this new era, it’s clear that the government and the market will both be crying out for increased data sharing in the interests of public health and safety (much like the increased surveillance that followed in the wake of September 11). But when the cost of discarding data is higher than the cost of keeping it, how do we ensure that it is only the public good that is served by everyone paying into the collective coffers with their data?

Over the longer term, we need to find a way to increase transparency into the value of our personal data so that, as consumers, we are making informed decisions. As businesses, we need to do a better job of educating consumers about how we provide better services through the use of their data (does anyone understand how hard it is to make things seem simple?). As a society, we need to make some decisions about what information truly is private, and what should be shared as the price of admission to that society.

But for now -  in our individual roles as decision-makers for companies and as consumers - we need to care on behalf of a future that is not too far away.


What does “privacy” mean? 

Privacy regulation is a really nuanced topic, but ultimately it centers on a few concepts* (a rough code sketch follows the list):

  • ownership - who owns the data (is it the property of the user, or did it become the property of the company when it was used as currency to pay for the service)? This includes the “right to be forgotten”.

  • transparency - is it clear what data is being collected and what it’s being used for? This is why you’re seeing all those cookie popups.

  • choice - as a user, you have the option to not share your data.

  • data transfer and residency - where your data can go, and where it actually physically resides. Since data regulations differ from country to country, where your data lives determines whether the company can/must comply with the other parts.
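To ground those concepts, here’s a purely illustrative sketch of how they might map onto a user record - the field names, regions, and purposes are my assumptions, not drawn from any actual regulation or company’s API.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Purely illustrative - not any real regulation's or company's schema."""
    user_id: str
    purposes: set = field(default_factory=set)  # transparency: declared uses
    opted_in: bool = False                      # choice: sharing is opt-in
    residency: str = "EU"                       # data transfer: where it lives

    def allow(self, purpose: str) -> bool:
        # Data may only be used for a declared purpose the user opted into.
        return self.opted_in and purpose in self.purposes

    def forget(self) -> None:
        # "Right to be forgotten": ownership implies the user can demand
        # deletion. A real system would also purge stored data and backups.
        self.purposes.clear()
        self.opted_in = False

record = ConsentRecord("user-123", purposes={"contact-tracing"}, opted_in=True)
assert record.allow("contact-tracing")
assert not record.allow("advertising")   # never declared, so never allowed
record.forget()
assert not record.allow("contact-tracing")
```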

*Before you start thinking this all sounds evil: privacy is important, but personalization is extremely sought after by users, and personalization requires a lot of your personal data. The tricks are: 1) what I call the “cool/creepy line” - make sure the amount of personalization doesn’t feel like an invasion; 2) keep that data secure.