The value of privacy

Frederike Kaltheuner, Data Exploitation programme lead at Privacy International, talks about security, marketing, data privacy and mass surveillance.



Privacy has never been more under threat, but strangely, at the same time, it has never been more alive. We face unprecedented threats not just from companies but also from governments, who can learn more about us than was ever previously possible. But I'm quite hopeful, because we're also seeing an exceptionally lively debate about privacy happening around the world, not just in Europe. What's so fascinating about this debate is that the definition of privacy is shifting. In the past, privacy was often misconstrued as being about hiding or secrecy, whereas the debate we are having right now understands privacy as what it really is: power dynamics, the relationship between individuals, the market, and the state. And that's why we are actually a little bit hopeful. At Privacy International, the programme I manage is called Data Exploitation. We think we're living in a world where it's impossible to know what kinds of data companies and governments hold on you, and that creates a vast power asymmetry between those who can learn things about you and profit from it, and those who are affected by it.



The idea that we are giving up data in return for services is dangerously misguided. First of all, the data that you actively provide is just a tiny fraction of the data that companies actually use to make money. Take the example of your smartphone: it goes without saying that a smartphone app needs data in order to function, but does it need thousands of third-party trackers that record your location and then build a fine-grained profile of your whereabouts? That's not part of the deal. At the same time, I don't like the idea that giving up your fundamental rights is a necessary precondition for participating in society. For many people around the world, companies like Facebook are the internet; it is not really possible to opt out. So we think companies have a responsibility to provide services that protect people by default. It is not your job to become a full-time data protection expert just to understand what is happening to your data. It is currently impossible to fully understand how companies collect data about you, and that's a real problem.



There's a tendency in cybersecurity discourse to focus exclusively on criminals and terrorism. We have a very different conception of what cybersecurity means: protecting people, devices, and networks. That's a much more comprehensive view. We don't think the threat of criminality should be a pretext for mass surveillance, because mass surveillance is ultimately a security risk itself: it makes people more vulnerable.



There is a persistent misconception that privacy is a European idea, and nothing could be further from the truth. Privacy, first of all, is a human right. Secondly, data protection is a widely recognised way of protecting the right to privacy. About 126 countries around the world have some form of data protection regulation. These regimes look very different in practice, but they all have the same purpose: to protect people's privacy. Now, in practice there are obviously different philosophies of how privacy can best be protected. The United States, for instance, has privacy laws, but they regulate sectors rather than data in general. What we're seeing at Privacy International is that emerging technologies are undermining regimes that regulate privacy purely by sector, and I can give you an example. You can have the best healthcare data laws in the world, protecting for example the confidentiality between patients and doctors, but we live in a world where very sensitive medical information can be derived and inferred from things like your purchasing history or your browsing history, which means that your medical information is up for grabs. It is important to recognise that an under-regulated data ecosystem is not just a threat to democracy but also poses a security risk: if your data ecosystem is under-regulated, people's most sensitive information is up for grabs. There was a study done by Amnesty International, which I thought was fascinating, in the context of Donald Trump toying with the idea of building a Muslim database. Amnesty tried to find out how easy it actually is to buy a list of everybody who is Muslim in the United States. The reality is that for about 150,000 to 160,000 dollars you can buy a list of the email addresses, phone numbers, and home addresses of everybody who is Muslim in the United States. That, I think, is simply unacceptable.
So even though there are differences in philosophy, all legal regimes have to grapple with the fact that we face unprecedented threats that need to be addressed.



When it comes to the IoT and new technologies in general, I'd love to talk a little bit about how we think about technology. We tend to treat technology as inevitable: as a set of inevitable processes, almost natural forces, that we as a society then have to grapple with. The more pressing question is: what kind of world do we want to live in? We are moving towards a world in which your fridge has a microphone and your toy has a camera. All of this is insecure, hackable, and highly privacy-invasive, and that's not the kind of world I want to live in. The way it is developing at the moment is not inevitable; we could build an IoT that is far more respectful of privacy, security, and safety than what we're seeing. But the internet of things that is beginning to develop poses unprecedented threats. Most people still think about privacy as online privacy, the privacy they have when they open a website, but we hear companies saying that with the internet of things the next place to mine data is the home. When you have conversations with smart devices, you reveal extremely intimate and sensitive details, and this is very profitable. We used to say that whenever you're not paying for a service, you are the product. With the IoT, you're paying for sometimes very expensive devices and you're still the product. So in all scenarios where the business model is mining your data without being very transparent and open about it, this is hugely problematic. GDPR is an interesting example, because people's inboxes have been flooded with new privacy policies. And the reason companies are updating their privacy policies is not because they feel like it, but because they have been forced to do so.
There is obviously an industry interest in keeping customers happy and in offering safe and secure products, but the pattern we have observed over the past years is that, more often than not, it takes regulation to create the right incentives. In a world of data exploitation, incentives are misaligned.



Privacy is a human right, and as such governments, and also companies, have an obligation to protect it. What I'm most concerned about is that we're dealing with a new breed of companies that we haven't really figured out how to define. Let me give an example. A company like Facebook is not just a neutral platform. In many countries around the world where Facebook is the internet, the company provides an infrastructure, and that raises the question of how we can not just hope that such a company respects human rights but also hold it to account. Governments, of course, are also complicit in human rights violations; that's why I wouldn't put all my faith in regulation. At the same time, I do think regulation is an important control and oversight mechanism. More fundamentally, though, if the service you're providing is almost infrastructure, and so crucial to things like democracy and discourse, we need to think about how it can be democratised and how it can become much more transparent than it currently is.