Why companies shouldn’t wait for regulation to step up their privacy practices

In light of recent well-documented data breaches and privacy violations, such as those at Equifax and Facebook, there have been numerous reports that Internet companies are finally warming to United States federal privacy regulations after fending them off for years. In late June, California drew a regulatory line in the sand by passing the California Consumer Privacy Act of 2018. Many are calling it the United States’ toughest privacy law, and given the state’s outsized market influence and its history as a role model for other states, expect it to have wide-reaching ramifications.

But it shouldn’t take legislation to motivate companies to re-examine what they do with personal data. Almost every day there is news of yet another serious data breach or privacy violation. Clearly, more is needed.

Many Internet companies have extraordinary access to individuals’ personal data – their actions, their friends, their preferences, their interests, and their most intimate secrets. Access to that kind of information should not be taken for granted: it should be handled responsibly. 

All Internet companies must take a stand for privacy. They can do so by stepping up their privacy practices and following what I think of as a “privacy code of conduct,” setting the example that all data handlers should aspire to achieve: 

  1. Adopt the mantle of data stewardship – Companies should act as custodians of users’ personal data – protecting the data, not only as a business necessity, but also on behalf of the individuals themselves. (In some circumstances, this may mean putting users’ interests first and collecting, using and sharing less personal data.)
  2. Be accountable – Companies should be transparent about their privacy practices, adhere to their privacy policies and demonstrate that they are doing what they say. They should establish clear safeguards for handling personal data and show how those safeguards are being enforced. They should commit to periodic independent audits of their practices and ensure that processors and partners abide by the same high standards. When something goes wrong, companies should be transparent about what happened, do the best they can to contain the harm, provide affected individuals with meaningful remedies and endeavor to prevent any recurrence.
  3. Stop using user consent to excuse bad practices – Companies should not rely on user consent to justify the legitimacy of their data-handling practices. They should openly demonstrate that their practices are lawful, fair and in the interests of the user before seeking user consent. Users should not be asked to agree to data-sharing practices that are unreasonable or unfair, or that they have no hope of understanding.
  4. Provide user-friendly privacy information – Companies should give users “in time” information about how their personal data is being collected, used and shared. The information should be relevant, straightforward, concise and easy to understand.
  5. Give users as much control of their privacy as possible – Users should be able to see, simply and clearly, when and how their data is being used. Companies should give users easy-to-use privacy controls and make privacy the default, not an optional extra. User permissions should not be persistent: they should have a limited duration and be specific to the task at hand (e.g. making a video call).
  6. Respect the context in which personal data was shared – Companies should confine the use of personal data to the context in which it was collected. They should not allow unauthorized or unwarranted secondary uses of personal data.
  7. Protect “anonymized” data as if it were personal data – Companies should apply basic privacy protections to “anonymized” data to mitigate potential harm if the data is later re-identified or used to single out particular individuals.
  8. Encourage privacy researchers to highlight privacy weaknesses, risks or violations – Companies should invite independent privacy experts to audit new services and features as they are being developed. As much as possible, the results of those audits should be made publicly available. Companies should also encourage privacy researchers to report privacy vulnerabilities or violations and provide an open, transparent process for responsible disclosure.
  9. Set privacy standards above and beyond what the law requires – Companies should set the next generation of privacy standards. For example, they could consider how to extend privacy protections to the personal data of non-users that has been uploaded by users, and better ways to handle the privacy preferences of group data (e.g. a group photo).

Information we share on the Internet is often deeply personal. Those with access to it likely know more about us than we know about ourselves. This intimate insight into our lives demands a high level of social responsibility to protect our data and privacy.

As a society, we cannot sit by and shrug our shoulders when something bad happens, or just continue to move from privacy crisis to privacy crisis. It is time that we insist that Internet companies provide genuine and lasting protections for our privacy.

Christine Runnegar is the senior director of Internet trust at the Internet Society, a non-profit organization founded in 1992 to provide leadership in Internet-related standards, education, access, and policy. She leads the organization’s trust agenda, which advocates for policies that support an open, globally connected and secure Internet.
