In May, the European Union's General Data Protection Regulation (GDPR) took effect. In June, California enacted its own legislation regulating the use of consumer data. While the two laws differ in important ways, they overlap substantially.
The goal of both laws is to tilt the data privacy scales in favor of the consumer and against the internet giants such as Alphabet (Google), Amazon, Facebook and Microsoft.
Both laws have broad definitions of personal information. Both laws give consumers greater control over whether their information may be shared with third parties, and both laws allow consumers to opt out of sharing altogether.
Where the GDPR requires companies to seek customer opt-in for almost any use of data, the California law requires that customers be given the opportunity to opt out. Both laws give consumers the right to request a company disclose the data it holds about them.
Both laws also contain provisions that have been described as a “right to be forgotten.” In other words, you may request a company delete personal information it possesses about you.
Consumer and privacy advocates have hailed these laws as important first steps in regulating privacy. And, following headlines about data breaches and data misuse at companies such as Target, Equifax and Facebook, consumers may feel that this regulation is needed and beneficial.
The problem with both the GDPR and the California law is that they lack clarity around fundamental terms, they threaten a business model that has been beneficial to consumers, and they jeopardize the pace of innovation and growth.
A fundamental aspect of any law is to clearly define what key terms mean. If you are regulating clean air, you need a clear definition of what a pollutant is. Similarly, if you are regulating data privacy, you need a clear definition of what personal information is.
The GDPR defines personal data as “any information relating to an identified or identifiable natural person.” That’s about as broad a definition as you could imagine.
The California law defines personal information as including biometric data, psychometric information, browsing and search history and geolocation data, but it excludes information that is publicly available or general enough to not identify an individual.
This definition is clear as mud. The lawyers who will argue over what falls inside and outside these definitions must already be counting their fees.
Beyond the drafting problems, the laws threaten a business model that has developed over the last 15 years. It works like this: We give the tech giants a certain amount of our personal data, and they give us free or heavily subsidized stuff (email accounts, messaging apps, professional and social networks, customer loyalty programs, location data, etc.).
If the flow of personal data we provide, which is highly valuable to the tech giants, stops, the flow of services to us will diminish, or we’ll have to start paying, or paying more, for the services we’ve come to enjoy. So far, the world, excluding some loud privacy zealots, seems generally pleased with the exchange.
Target is still in business. Equifax is still in business. And despite the silly hashtag campaign, large numbers of users have not abandoned Facebook.
It’s also important to note that privacy advocates almost never talk about the negative effects regulation has on innovation. One of the reasons the internet economy has boomed over the last 30 years is the general absence of regulation.
Entrepreneurs were able to test ideas, experiment with business models and put into motion a wave of innovation that has created hundreds of thousands of jobs, tremendous wealth and has improved our lives.
Remember hoping a taxi would drive by and agree to take you where you wanted to go? Remember trying to read a road map on the way to a new place? Remember having to call a travel agent to compare airline ticket prices?
If you are younger than 25, the likely answer is no, and that’s a good thing. Regulation risks stifling the next wave of innovation that can improve our lives.
When a regulation is enacted, its text should be clear, it should address genuine problems that the private sector cannot solve, and its benefits must outweigh its negative consequences. Unfortunately, the GDPR and the California data privacy law do not meet these requirements, and we all lose as a result.
Matthew R. A. Heiman is the chairman of the Cyber and Privacy Working Group at the Regulatory Transparency Project. He is also a visiting fellow at the National Security Institute at George Mason University’s Antonin Scalia Law School. Previously, he was a lawyer with the National Security Division at the U.S. Department of Justice.