The endless cookie settings that pop up on every website feel a bit like a prank by an internet hell-bent on not changing. They are very annoying. And they feel a bit like the data markets' revenge on regulators, giving the General Data Protection Regulation (GDPR) a bad name, so that it may appear that political bureaucrats have, once again, clumsily interfered in the otherwise smooth progress of innovation.
The truth, however, is that the vision of privacy put forward by the GDPR would usher in an era of innovation far more exciting than today's seedy technology. As it stands, though, it simply falls short. What is needed is an infrastructural approach with the right incentives. Let me explain.
The granular metadata collected behind the scenes
As many of us know, laptops, phones and every device with the prefix "smart" produce an incessant stream of data and metadata. So much so that the concept of a sovereign decision over your personal data hardly makes sense: if you click "no" to cookies on one site, an email will have quietly delivered a tracker anyway. Delete Facebook, and your mother will still have tagged your face with your full name in an old birthday photo, and so on.
What is different today (and the reason why a CCTV camera is a terrible representation of surveillance) is that even if you choose, and have the skills and knowledge, to protect your own privacy, the broader environment of mass metadata collection will still harm you. It is not about your data, which will often be encrypted anyway, but about how the collective metadata streams reveal things at a fine-grained level and make you a target, a potential customer, or a potential suspect if your patterns of behavior stand out.
Despite appearances, however, everyone actually wants privacy. Even governments, companies and, above all, military and national security agencies. But they want privacy for themselves, not for others. And this lands them in a dilemma: how can national security agencies, on the one hand, prevent foreign agencies from spying on their populations while, at the same time, building back doors so that they themselves can snoop?
Governments and companies have no incentive to provide privacy
To put it in terms eminently familiar to these readers: the demand is there, but there is an incentive problem, to put it mildly. As an example of the incentive problem that currently exists, an EY report values the market for United Kingdom health data at $11 billion.
These reports, while highly speculative as to the true value of the data, produce an irresistible sense of loss, leading to a self-fulfilling prophecy as everyone goes for the promised profits. This means that while everyone, from individuals to governments to large tech corporations, wants to ensure privacy, they simply do not have strong enough incentives to do so. The temptation to sneak in a back door, to make systems a little less secure, is just too strong. Governments want to know what their population (and others) is talking about, companies want to know what their customers think, employers want to know what their employees are doing, and parents and teachers want to know what children are doing.
There is a useful concept from the early history of science and technology studies that can help illuminate this mess: affordance theory. The theory looks at the use of an object in its given environment, the system, and the things it offers people — the kinds of actions that become possible, desirable, comfortable and interesting to do as a result of the object or the system. Our current environment, to put it mildly, offers the irresistible temptation of surveillance to everyone, from pet owners and parents to governments.
In an excellent book, software engineer Ellen Ullman describes programming some network software for an office. She vividly describes the horror when, after the system has been installed, the boss excitedly realizes that it can also be used to track the keystrokes of his secretary, a person who had worked for him for more than a decade. Where before there had been trust and a good working relationship, the new powers inadvertently turned the boss, through this new software, into a snoop, peering into the detailed daily work rhythms of the people around him — the frequency of clicks and the pauses between keystrokes. This mindless surveillance, albeit by algorithms rather than humans, often passes for innovation today.
Privacy as a material and infrastructural fact
So where does this leave us? That we cannot simply patch up personal privacy within this surveillance environment. Your devices, the habits of your friends and the activities of your family will nevertheless be linked and will identify you. And metadata will leak regardless. Instead, privacy has to be ensured by default. And we know that this will not happen through the goodwill of governments or technology companies alone, because they simply do not have the incentive to do so.
The GDPR, in its immediate consequences, has fallen short. Privacy should not merely be a right that we desperately try to exercise with every website visit, or that most of us can only dream of enforcing through costly legal processes. No, it has to be a material and infrastructural fact. That infrastructure has to be decentralized and global, so that it does not fall under the control of particular national or commercial interests. It also has to carry the right incentives, rewarding those who run and maintain the infrastructure, so that protecting privacy is lucrative and attractive while harming it is unworkable.
In closing, I want to point out a hugely underrated aspect of privacy: its positive potential for innovation. Privacy tends to be understood as a protective measure. But if privacy were simply a given, data-driven innovation would suddenly make far more sense to people. It would enable much broader engagement in shaping the future of all things data, including machine learning and AI. But that is a topic for next time.
The views, thoughts and opinions expressed here are solely those of the author and do not necessarily reflect or represent the views and opinions of Cointelegraph.
Jaya Klara Brekke is the chief strategy officer at Nym, a global decentralized privacy project. She is a researcher at the Weizenbaum Institute, holds a Ph.D. from Durham University's Department of Geography on the politics of blockchain protocols, and is an occasional expert adviser to the European Commission on distributed ledger technology. She speaks, writes and researches privacy, power and the political economies of decentralized systems.