
A smarter way to prevent the misuse of personal data


Whether we’re scheduling a meeting, shouting a Starbucks order, binging on Netflix, getting an X-ray or sending flowers to a loved one, we’re constantly sharing, using and transmitting personal data.

Because data is valuable and ubiquitous, it's also prone to abuse, and that creates challenges for policymakers. Efforts to govern the countless kinds of data we now use have led to a regulatory Rube Goldberg machine: a patchwork of local, national and international rulebooks that isn't internally consistent, let alone readily comprehensible.

Now, Congress is taking a crack at the problem. The American Privacy Rights Act (APRA) seeks to streamline privacy regulations and offer new consumer protections. It’s a laudable goal — but one that’s likely to run into the same challenges that have dogged past attempts at data regulation.

Fortunately, there’s a solution: stop regulating “data” per se, and focus instead on the specific harms or benefits that come with its use.

That might sound like a subtle difference, but it would be a major legal and intellectual shift. Current regulations target data based on how it’s created and who collects it: a website that snags data related to an EU citizen falls under the purview of the General Data Protection Regulation, Europe’s flagship privacy law. If data is collected by a U.S. healthcare organization, it’s governed by the Health Insurance Portability and Accountability Act, with corresponding limitations on how it’s stored, shared and used. Other laws focus on biometric data, data pertaining to children or even data extracted directly from your brain.

That’s a pretty unusual approach to regulation. Generally, rules govern outcomes, not processes. It doesn’t matter if I run you over while traveling by car, motorbike or jet ski — if I was negligent, I’m likely to suffer a penalty. Similarly, if my business dumps sludge in the river, it doesn’t matter if I’m manufacturing batteries or baguettes; what gets regulated is the harm I’ve caused.
 
In the same way, what matters when we regulate data isn’t the data itself, but the harm caused by its misuse. If my healthcare records get published on billboards around town, my privacy has been violated. But if the exact same data is responsibly shared with appropriate parties to treat my illness — or create new treatments, or enable better public health outcomes — then no harm has been done. The outcome, not the data, is what really matters.

Our current focus on data as data, though, leads us to simultaneously overregulate and underregulate. On the one hand, data that could have enabled crucial innovation or smarter decision-making winds up unnecessarily siloed away. On the other, novel data types and technologies often fall outside the scope of existing rulebooks, leaving consumers vulnerable and regulators playing catch-up.

The solution is to stop regulating "data" and focus instead on the consequences of its abuse. The ideal framework would resemble the way the EU currently regulates flour. The EU tightly restricts GMO ingredients in food, so when flour is used for bread it has to comply with those restrictions. But if the flour is being used to manufacture Play-Doh, no such restrictions apply. There's no need for a one-size-fits-all rulebook governing "flour" as flour; instead, enforcement focuses on preventing bad outcomes and encouraging good ones.

Some regulators are already heading in this direction. Under the EU's AI Act, for instance, low-risk commercial chatbots get one set of regulations, while AI tools used for high-risk applications such as healthcare or critical infrastructure face a much stricter rulebook. In the U.S., the FTC is taking a similarly flexible approach, using rules barring unfair business practices to punish companies that abuse consumer data. To regulate automakers that improperly track motorists' data, the FTC asserts, you don't need granular regulations specifically addressing driving data; you just need broad rules covering any misbehavior that hurts consumers.
 
Building a similar regulatory toolkit for the rest of the data economy would let us unlock the power of our data while protecting consumers more effectively. A financial services company, for instance, would be free to use data to develop new fraud-detection technologies while remaining accountable for ensuring that data is properly safeguarded as it flows through its ecosystem. (Figuring out how to implement that would be an engineering problem for the company itself, not something dictated by regulators.)
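To make that concrete, here is a minimal sketch, in Python, of what purpose-bound safeguarding might look like inside such a company. Everything in it is hypothetical: the ALLOWED_PURPOSES policy, the DataRequest type and the authorize helper are illustrative names, not a real API, and a production system would enforce far more than this.

```python
# Hypothetical sketch of purpose-based access control: the regulator cares
# about the outcome (data is used only for legitimate purposes); the company
# decides how to enforce it. All names here are illustrative, not a real API.

from dataclasses import dataclass

# Which data fields each business purpose is allowed to touch (assumed policy).
ALLOWED_PURPOSES = {
    "fraud_detection": {"transactions", "device_fingerprint", "login_history"},
    "marketing": {"email_address"},
}

@dataclass(frozen=True)
class DataRequest:
    purpose: str
    fields: frozenset

def authorize(request: DataRequest) -> bool:
    """Grant access only if every requested field is permitted for the purpose."""
    permitted = ALLOWED_PURPOSES.get(request.purpose, set())
    return request.fields <= permitted

# Fraud detection may read transaction data...
assert authorize(DataRequest("fraud_detection", frozenset({"transactions"})))
# ...but the very same data is off-limits when the purpose is marketing.
assert not authorize(DataRequest("marketing", frozenset({"transactions"})))
```

The point of the sketch is that the same data is permitted or forbidden depending on the outcome it serves, which is exactly the shift the regulation-by-harm approach calls for.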

An AI algorithm used to evaluate mortgage applications, meanwhile, would need to verifiably eliminate the risk of bias based on impermissible factors such as sex, race or age: not because it's mandated to do so by AI-specific laws, but simply because there are already laws in place barring such discrimination. At the end of the day, it doesn't matter whether I'm discriminated against by an algorithm or by a human with a paper map. It's the bias, not the data behind it, that needs regulating.
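What might "verifiably" look like in practice? One plausible, purely illustrative approach is an outcome audit: compare approval rates across groups and flag the model if any group falls too far behind. The sketch below uses the "four-fifths" rule familiar from U.S. disparate-impact analysis; the data, the 0.8 threshold and the function names are assumptions for illustration, not a prescribed legal test.

```python
# Illustrative outcome audit for a lending model: measure approval rates by
# group and apply the "four-fifths" rule used in U.S. disparate-impact
# analysis. The toy data and 0.8 threshold are assumptions, not legal advice.

from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs; approved is 0 or 1."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def passes_four_fifths(decisions, threshold=0.8):
    """Flag disparate impact if any group's approval rate falls below
    `threshold` times the highest group's rate."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    return all(rate >= threshold * best for rate in rates.values())

# Toy decisions: group B is approved far less often than group A.
decisions = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
print(approval_rates(decisions))      # {'A': 0.67, 'B': 0.33} (approx.)
print(passes_four_fifths(decisions))  # False: B is below 80% of A's rate
```

Note that the audit targets the outcome, disparate approval rates, and is indifferent to whether the decisions came from a neural network or a filing cabinet.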

What does all this mean for APRA and the host of other AI- and data-related legislation under consideration? Well, we certainly do need to streamline (and standardize) the data rulebook, and a federal privacy law would be a good place to start. But the ultimate goal shouldn’t be to create a standalone body of data-specific regulations — it should be to clearly articulate the rights we’re trying to protect and the outcomes we’re trying to ensure.

To protect these rights without limiting innovation, then, we need to stop focusing so much on “data” per se — and lean into what we already know works: regulating the outcomes we really care about.

Ellie Graeden is a research professor at the Georgetown University Massive Data Institute and is a partner and chief data scientist at Luminos.Law.

