Elizabeth M. Renieris (Twitter @hackylawyER), For Rebooting Web of Trust, 1-3 March 2019
Elizabeth is Global Policy Counsel at digital ID startup Evernym. She is a law and policy expert who advises on data protection and privacy in the context of emerging technologies like blockchain, AI, and machine learning. Elizabeth is a Certified Information Privacy Professional in the US and Europe (CIPP/US, CIPP/E) and has advised the European Commission, UK Parliament, and other governments on the intersection of blockchain and data protection/privacy, particularly in the context of digital and self-sovereign identity.
TL;DR: We are allowing the tides of technology and commerce to haphazardly turn everything into commodifiable data. But will we allow ourselves to be reduced to data points and, worse yet, commodified? If we are not deliberate about designing the correct legal frameworks for humanity in the data-driven age, we risk losing sight of our fundamental rights as humans.
With more than half of the world's population now online and doing more than ever before, personal data is proliferating at an unprecedented rate. Our personal data exhaust is no longer confined to “surfing the Web” or traditional “online” activities. As our lives get “smarter” (through smartphones, smart homes, smart cars, smart devices, smart cities, and eventually smart or augmented humans), every second of our lives becomes a data point. Yet despite the seeming inevitability of all things digital fueled by our data, usage does not imply trust. On the contrary, our trust and confidence in the entities handling our personal data are at an all-time low. And despite an emerging consensus that the Internet's advertising revenue-based business model is largely to blame, the most popular solutions on the table risk extending this broken framework to our lives at large.
One popular view taking hold and gaining momentum in the public discourse is what I call the “data-as-property” approach. In a recent op-ed in The Economist, musician will.i.am argued that personal data should be regarded as property and that people should be compensated for the use of that property. He is not alone. Many loud voices are jumping on the “own your data” bandwagon, as if owning our data (if that were even possible) would correct the breaches of trust and confidence we have suffered at the hands of big tech. For the record, it will not. As we saw with one of Facebook's more recent scandals, involving its market research mobile app, compensating individuals for access to their data does little to solve the underlying problem. Rather, business models enabled by legal frameworks that treat our data as property capable of being bought and sold via contract are the root of the problem.
To course-correct the public discourse and help design better legal frameworks, let's first address why we are tempted to treat personal data as property by exploring the complex nature of data, which makes it feel like a commodity while in fact being ill-suited to commodification: all data points are relatively indistinguishable, consisting of binary bits and bytes; the word “data” itself, derived from the Latin verb meaning “to give,” conveys liquidity or transferability; and, once digitized, data becomes “money-like.” This is perhaps why our personal data is increasingly described as “the new currency” or “the new oil,” and why many espouse a property or commodity view of data as something we can own and sell. However, because data is a mere representation of other things (property, money, speech, and so on), it cannot fall under a single legal framework or body of law. The significance and value of any given data point differ from one context to another. Treating all data as property strips it from its context, from the environment that gives it meaning.
The nature of data challenges our existing legal frameworks, which are inadequate for the societal sea change now under way; that is why we need to consciously design better alternatives. As we haphazardly fuse the physical, digital, and biological worlds through emerging technologies, everything is converging as data, everything apart from our laws. Our approach to law has tried to maintain neat swim lanes, regulating, e.g., property as property, money as money, speech as speech, and (more recently) data as data. But as everything becomes data, these lanes are breaking down. Domain-specific legal frameworks ignore how the digital versions of things are qualitatively different from their analog counterparts, while data-specific regulations often neglect the consumptive, expressive, or other contextual qualities of the source of a given data point or data set, contexts in which we may have important rights as citizens, patients, and humans, rather than as mere customers or consumers in a data-industrial complex.
To illustrate how we should think about designing better legal frameworks, consider a concrete example of this digital convergence: money and speech. Money and speech have traditionally been governed by distinct legal frameworks with different, often opposing, objectives. Financial regulations are generally designed to curtail and control, while speech laws presumptively protect and promote freedom of expression. But money is increasingly digital, with businesses and localities going cashless, the explosion of “fintechs,” programmable money in the form of cryptocurrencies and tokens, and the digital record-keeping of value through blockchains and distributed ledgers. At the same time, human behavior and expression are themselves being turned into data as we are tracked online and off in what Shoshana Zuboff calls “surveillance capitalism.” This convergence threatens the fundamental rights of individuals: if we regulate digital money without considering its expressive qualities, we risk censoring certain individuals and vulnerable populations; conversely, if we regulate digital speech without considering its consumptive qualities, we risk curtailing their bargaining or purchasing power as well. Using examples like this one, I argue for legal frameworks that treat personal data as an extension of our identity, calling for inalienable, intrinsic rights in our data, a so-called “rights-based approach.”
The pace and nature of technological change, coupled with the seduction of reductive approaches to this complex challenge, make it all the more urgent to design viable frameworks for humanity in the data-driven age. Per the fundamental rule of technological innovation, what can be done will be done, and market forces will nudge us down a dangerous path. The time to protect our rights, by mindfully deciding how we will treat our data in the data-driven age, is now: our choices will determine how we interact with technology, how we relate to each other, and even how we order society. This is a (if not the) critical question of our time, because as our lives and our humanity are increasingly represented by data, the way we treat our data will become the way we treat our humanity. It is also an opportunity to renew and reaffirm our core values and ideals as we move into Web 3.0.