
More user-friendly, inclusive privacy online requires funded research


Three years ago, in May 2018, Europe’s General Data Protection Regulation (GDPR) changed how companies collect and use our personal data. Some may recall this as the day when every website you’ve ever interacted with emailed you about changes in their privacy policies and data practices.

Since then, there has been some progress in the battle to protect privacy and data rights. U.S. state laws like the California Consumer Privacy Act (CCPA) offer protections similar to the GDPR’s, although the passage of the California Privacy Rights Act last November signaled that the state’s voters want even more to be done.

Whether consumers are actually better off remains uncertain. And research suggests that users cannot fully exercise their new rights.

Meaningful privacy legislation requires more research on how to make privacy work in practice — research that falls on the government to fund, as data-hungry companies have little incentive to do so.

To highlight users’ difficulty in exercising their rights, take as an example the requirements that companies be transparent about how they collect and process user data. The Pew Research Center found that only 22 percent of Americans report “always” or “often” reading privacy policies. Even fewer read them all the way through. This makes sense, as privacy policies are notoriously long and difficult to read.

As for the “accept cookies” banners that started appearing several years ago, researchers found many companies use deceptive practices and dark patterns to “nudge” users into accepting them. Most firms fail to transparently disclose what consumer data they store and how it is used.

To address the gap between law and technical implementation, researchers have recommended a variety of interventions, most aimed at making it easier to read and understand privacy policies. For instance, a research group at Carnegie Mellon University proposed “nutrition labels” for websites and internet-connected devices to make it easier for users to understand and compare what data companies collect — and what they do with it.

Usable systems are important, but they, too, are not enough. Even when presented with usable, readable booklet versions of privacy policies, users with concerns feel frustrated that their only alternative is to stop using a service altogether. Indeed, a growing body of research has challenged the very model of providing consumers transparency through privacy policies.

How can we do it better? Innovative proposals include data dividends, data co-ops and tokenized data, which aim to help users benefit financially from the data they generate and contribute. Ironically, GDPR may make these proposals illegal to implement in the EU. But a one-size-fits-all solution is unlikely to emerge: privacy means different things to different people in different social contexts, and different social groups (such as those who differ in age, race or physical ability) have varying needs, abilities and desires when it comes to privacy.

What we need is more research expertise and funding directed toward understanding privacy in more open-ended ways. To participate in contemporary everyday activities, people have to use the internet, and in the process they face potentially harmful data collection. This trend has only been exacerbated during the COVID-19 pandemic, in which people’s work and personal lives collided in their homes and on their home networks.

Companies and researchers need to design systems that are not only more usable, but that also take people’s individual needs into account without forcing users to do all the work themselves. This might mean finding ways to provide services that are less data-intensive or that let users turn off certain types of data collection. It might mean offering options for data ownership and licensing. The possibilities are out there; the funding is not.

Finally, companies need to embed privacy experts on product teams. Since there is no one-size-fits-all definition of privacy, companies need experts who can approach privacy in more open-ended ways. Otherwise, privacy harms will continue to disproportionately affect groups at the margins — those who are least likely to be represented on Silicon Valley campuses.

The fact that privacy still dominates discussions about technology shows how far we are from workable privacy policy.

Passing legislation was the easy part. Now, we need to make effective policy. We can’t do that without research — and it’s research that private companies questing for data to package and sell are unlikely to ask for.

Nick Merrill directs the Daylight Lab at the University of California, Berkeley Center for Long-Term Cybersecurity, which produces tools for understanding and addressing critical security issues.

Richmond Wong is a postdoctoral researcher at the University of California, Berkeley Center for Long-Term Cybersecurity, specializing in the intersection of tech and ethics.

