As we approach Data Privacy Day 2018, consider this question: how much risk do you believe theft or exposure of private data poses to human health, safety, or prosperity? ESET security researchers posed that exact question to over 700 US adults last year. Respondents were asked to rate the risk on an eight-point scale from “no risk at all” to “very high risk.” More than 70% of respondents rated the risk above moderate, and almost 50% rated it high or very high.
To put this level of concern about data privacy in perspective, the same respondents were asked about the risk of disposal of hazardous wastes in landfills; they rated that risk at roughly the same level as theft or exposure of private data. In other words, data privacy appears to be a serious concern to a lot of people, on a level with very concrete issues like hazardous waste disposal.
Data privacy is also a topic that can spark big debates, like the one between the US and the EU as to what protections should be accorded to data pertaining to people, specifically by those who collect, control, or process such data. In this article I discuss different attitudes to data privacy and some of the practical implications of those differences with respect to something called the General Data Protection Regulation.
A privacy divide?
To vastly over-simplify the transatlantic privacy debate, the default EU position on privacy is that sensitive information about you must not be collected or used without your knowledge and permission. The default US position is that sensitive information about you can be collected or used until a law or lawsuit says it cannot. By “default position” I mean the current state of privacy rights and privacy law in these two parts of the world (support for these positions among the respective citizenry is not implied and may vary by location).
An example might help explain the difference between these two positions. Suppose you were to start up a rideshare service like Uber or Lyft and collect information about people who use the service, including their names and the trips they take using the service. Are there any legal restrictions on what you can do with that data? In EU countries the answer is definitely yes (and these will be described in more detail later).
In the US the answer is “it depends,” followed by a lot of clarifying language. This language may even include “whatever you can get away with.” That is why, back in 2015, two very well-qualified US privacy attorneys called for Congress and/or the states to pass privacy legislation to regulate the use of personal data collected by Uber and other rideshare operations.
I chose ridesharing as an example to contrast how the two positions deal with new types of data, such as geolocation and travel patterns. If you think that’s an unfair test, then consider data that has existed since before computers, such as a list of topics that interest you, based on your borrowing of materials from your library. Can you name the federal law that protects this very personal information in the US? No, because there isn’t one. But, and this is where American privacy law starts to get complicated, particularly from a compliance perspective: library records do enjoy some legal protections in the US. To quote the American Library Association: “Forty-eight states and the District of Columbia have laws protecting the confidentiality of library records; two states, Kentucky and Hawaii, have attorney general’s opinions protecting library users’ privacy.”
In other words, the US has different protections for different types of personal data, created in different ways, at different times (the classic example being the Video Privacy Protection Act of 1988, created within days of Supreme Court justice nominee Robert Bork’s rental history being leaked to a newspaper). Privacy protection may come from federal law, state legislation, or court decisions at either the state or federal level.
In the EU, data that pertains to you as an identifiable individual is protected, by default, from inception. That is the practical meaning of the term “data protection” in European usage. Anyone who wants to collect data pertaining to you is required by law to get your permission to do so, and when they have your data they are required to tightly control who can access it and for what purpose. And that applies to new forms of personal data as soon as they exist. You don’t have to wait for a lawsuit or an embarrassing political incident.
For some historical context on data privacy in Europe, I can recommend the article Echoes of History: Understanding German Data Protection. The evolution of the EU position on data privacy is, for me at least, a fascinating part of the backstory of Data Privacy Day. (If you want more detail on US privacy law, described in a data protection context, see my white paper: Data privacy and data protection: US law and legislation.)
As you may know already, Data Privacy Day is January 28 every year. That date was chosen because on that day in 1981, the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data – referred to hereinafter as the Convention – was opened for signature by the Council of Europe (the source of the quotes in the following paragraph).
The Convention was the first “binding international instrument” created to protect individuals against “abuses which may accompany the collection and processing of personal data” while also seeking to regulate “the transfrontier flow of personal data”. The Convention “outlaws the processing of ‘sensitive’ data on a person’s race, politics, health, religion, sexual life, criminal record, etc., in the absence of proper legal safeguards”. At the same time, the Convention “enshrines the individual’s right to know that information is stored on him or her and, if necessary, to have it corrected”.
Over the next three decades, all 47 member countries of the Council of Europe, which includes the UK and Russia, ratified the Convention. The Convention also laid the groundwork for data protection legislation in the 28 countries of the EU, notably through the Data Protection Directive of 1995. This document declares that “data-processing systems are designed to serve man” and such systems must respect the “fundamental rights and freedoms” of people, whatever their nationality or residence, “notably the right to privacy.” In other words, data protection is “protection of individuals with regard to the processing of personal data.”
What might surprise you is that the US was involved in developing the data privacy principles underpinning the Convention. The first US legislation to consider privacy specifically in the context of computers appeared in the early 1970s thanks to – another surprise – President Richard Nixon. His Secretary for Health, Education and Welfare, Elliot Richardson, commissioned a study of record-keeping practices in the computer age. The resulting report, commonly known as the “HEW Report,” recommended the enactment of a federal “Code of Fair Information Practice” for all automated personal data systems.
The code envisioned by HEW contained five principles that would be given legal effect as “safeguard requirements” for automated personal data systems. Here is the original statement of the principles (note that I updated the “him/his” language typical of the early 1970s):
- There must be no personal data record keeping systems whose existence is secret.
- There must be a way for an individual to find out what information about him/her is in a record and how it is used.
- There must be a way for an individual to prevent information about him/her that was obtained for one purpose being used or made available for other purposes without his/her consent.
- There must be a way for an individual to correct or amend a record of identifiable information about him/her.
- Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuse of the data.
These principles were echoed in numerous landmark privacy documents, notably the OECD’s 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data and the aforementioned 1981 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.
Global data protection?
I recently read an article that described GDPR as Global Data Protection Regulation (to be clear, GDPR stands for General Data Protection Regulation). At first I thought it was a typo, but then I realized the author might have been making a point, because GDPR does have global implications.
There’s not enough space here to detail the journey from the 1981 Convention to the GDPR, which goes into effect May 25, 2018 (as discussed in numerous WeLiveSecurity articles). That journey passes through the previously mentioned EU Data Protection Directive of 1995 which mandated that a uniform set of legal protections for personally identifiable data be implemented in the laws of each EU country.
A convenient way to explore what that Directive meant in practice is to visit the website of the UK Information Commissioner’s Office (ICO). The ICO was created by the UK’s Data Protection Act which implements a data registration system, as required by the Directive. Simply put: “individuals and organisations that process personal information need to register with the Information Commissioner’s Office (ICO), unless they are exempt”. To get a sense of how seriously this is taken, I suggest you try the “registration self-assessment” (watch for the “gotcha” question about CCTV).
The concept of privacy is hard to define. Discussions about privacy can quickly turn philosophical. However, the fact that data privacy has evolved quite differently in the EU and the US clearly has practical implications when it comes to implementing privacy protections in law.
With the GDPR, the EU is imposing additional protections beyond those of the Directive, and it is doing so in the form of a centrally enforceable, union-wide Regulation. Because the Regulation seeks to protect the data of EU residents even when that data is in non-EU countries, it has serious practical implications for organizations outside the EU. If those organizations have not built their data systems on the premise of protection by default, they may find it hard to provide a way for individuals to correct or amend a record of identifiable information about them, to find out what information about them is in a record and how it is used, or to prevent information about them that was obtained for one purpose being used or made available for other purposes without their consent. Ironically, those are not just requirements under GDPR; they are principles proposed by the Nixon administration.