
Congress is working on a federal privacy law. Here’s why we need one

Rep. Cathy McMorris Rodgers, R-Wash., speaks during a House Energy and Commerce Innovation, Data, and Commerce Subcommittee hearing on data privacy, Thursday, April 27, 2023, on Capitol Hill in Washington. (AP Photo/Mariam Zuhaib)

Don’t look now, but Congress is preparing to pass a federal privacy law. The American Privacy Rights Act would swap America’s mishmash of state-level laws for a single federal statute offering broad new protections — from regulating “dark patterns” that manipulate consumers, to banning unapproved sharing of sensitive information.

That would be a big deal. The average American’s online activities are tracked and sold 747 times per day; in total, our data is tracked 178 trillion times per year. This isn’t the background noise of the internet — it is the internet, and we’re surveilled and monitored every time we log on.

But while better regulations are certainly needed, it’s consumers, not regulators, who are currently doing the most to force brands to reevaluate their data practices. Take GM’s recent decision to scrap a program that sold motorists’ data: the move came in response not to a regulatory crackdown, but to a consumer backlash that threatened GM’s bottom line.

To exercise their power effectively, though, consumers first need to understand how their data is being used. Unfortunately, it’s increasingly hard to keep track: Nobody has time to read through endless privacy policies and consent notices, so nobody really knows what they’re giving up. To that end, here’s a quick refresher on the ways businesses are currently profiting from your data:

1) Your browser history. OK, obviously companies are scrutinizing your online behavior: It’s the value exchange we enter into — or devil’s bargain we strike — whenever we Google something or use an ad-supported website. Few people realize, though, just how much personal information is up for grabs. We’ve seen Grindr allegedly sharing users’ HIV status with third parties; data brokers classifying people based on past sexual trauma; and Alzheimer’s patients flagged for targeting by scammers. On the modern internet, the data brokers definitely know you’re a dog.

2) Your location. For data brokers and the brands that buy from them, where you are is almost as important as what you’re doing — and besides its value to marketers, location data can also be easily weaponized. Organizations can track people’s visits to Planned Parenthood, for instance, in order to create targeted anti-abortion campaigns. Regulators are clamping down, but many companies still color outside the lines: The Tim Hortons coffee chain was caught improperly tracking customers around the clock, including the times they left home, where they worked, and how often they visited rival coffee shops.

3) Your driving skills (and much, much more). Modern automobiles are smartphones on wheels, and all major car brands — not just GM — are collecting location and behavioral data on motorists. Some automakers have been caught reporting bad drivers to insurers, leading to rate increases or denial of coverage. Others collate driving data with additional information including race, genetic information, sexual activity and more. Every time you get behind the wheel, in other words, you’re giving up a startling amount of data.

4) Your kids. Children are supposed to be largely off-limits to data brokers, but the reality is much messier. Dozens of data brokers were recently found to have sold children’s location, health and other data; Google, meanwhile, reportedly allowed personalized ads to be served on YouTube videos made for children. Even apps specifically designed to keep kids safe sometimes leak GPS data, private messages and other personal information.

5) Your face, your fingerprints — and your thoughts? Biometric data — your features, fingerprints, DNA, retinal patterns and more — is a data goldmine. Retailers have been caught using facial recognition technology to monitor shoppers, while Meta got dinged for collecting biometrics on 60 million Facebook users. As technology advances, marketers will also use gaze-tracking, physiological markers, and even neural monitoring to figure out what you’re thinking from one moment to the next.

This just scratches the surface of the data swirling around the digital economy, and the “countermeasures” available to consumers are pretty weak sauce: They mostly boil down to expressing a preference and hoping companies will play by the rules. With two-thirds of Americans believing that companies will abuse their data no matter what they do, there’s a risk of us slipping into privacy nihilism and giving up our data because we don’t see an alternative.

Complex privacy regulations make that worse, not better, by pushing consumers into confusion and apathy. Over half of data deletion requests, for instance, come from states in which consumers aren’t empowered to demand that their data be deleted — a sign that even people who care about data privacy don’t currently understand their legal rights.

A federal law, by offering a single national rulebook, might help with that. But the real power will remain in the hands of consumers. The more we pay attention, get angry, and refuse to buy from brands that play fast and loose with our data, the more those brands will be forced to bring their data practices in line with our expectations.

We’ve already seen consumer pressure force companies to go beyond environmental regulations and embrace sustainable business practices, and to demand better working conditions across global supply chains.

Now, consumer pressure is driving data practices forward, too. The speed with which GM pulled the plug on its data-sharing program shows who’s really in the driver’s seat. It’s just up to all of us to keep our eyes on the road. As consumers, we have the power to force companies to respect our data preferences — if we stay angry and keep paying attention to how our data is used and abused.

Jonathan Joseph is a board member of The Ethical Tech Project.