The views expressed by contributors are their own and not the view of The Hill

Why your privacy shouldn’t be so private

In this Aug. 11, 2019, file photo, a man uses a cellphone in New Orleans.
AP Photo/Jenny Kane, File

It has not been a good few months for tech companies. Meta shares have plummeted, Twitter is struggling under Elon Musk’s ownership, layoffs have abounded and the cryptocurrency exchange FTX has imploded. The mood towards tech has soured, and it is affecting U.S. policy in outsized ways. 

The Biden administration last month released its Blueprint for an AI Bill of Rights, a set of principles and practices calling for strengthening privacy and preventing algorithmic bias and discrimination. Like much contemporary policy, it focuses on the perceived harms and potential risks of automation. But the federal framing of artificial intelligence largely overlooks the advantages of machine decision-making and fails to recognize the tension between privacy and the progress that comes from collecting richer, fuller, more diverse data sets.

Privacy tops the list of concerns for civil rights activists. A new bill before Congress, the American Data Privacy and Protection Act, seeks to strengthen privacy protections and impose restrictions on data collection. The world’s toughest privacy and security law, Europe’s General Data Protection Regulation, already prohibits all data collection or use by default unless it falls within the regulation’s allowable exceptions. And the American Civil Liberties Union seeks to limit or ban the collection of biometric data, including data obtained through facial recognition.

Yet ironically, a tunnel-visioned focus on privacy rights may be hurting the most vulnerable. The very act of collecting data begets improved health, safety, equality, accuracy and socially valuable innovation. Given the long histories of offline exclusion, discrimination, secrecy and inequities, data collection and monitoring can make the root causes and patterns of inequality more visible and help ensure that those most in need of resources, protection and support receive them.

Too often, the stakeholders left behind in the data seclusion vs. data extraction calculus are those who are at the edge of data — vulnerable people and communities that have not had equal input in shaping our collective knowledge pools. In medical research, for example, where women and minorities have historically been marginalized in or omitted from clinical studies, more data collection and integration are vital to fostering greater accuracy, efficacy and access to health care.

Across the board, data richness gives individuals and communities access to more financial opportunities, infrastructure investment, personalized care and opportunities for civic engagement.

Professor Shoshana Zuboff popularized the term “surveillance capitalism,” sounding the alarm about a new era in which our personal information is extracted to the point that the human experience is reduced to raw bits. But as I have shown in my research, better information can help tackle society’s most insidious problems. 

In a social liberal regime, we can envision flipping the script from surveillance capitalism to “surveillance liberalism.” We can start imagining how, under the conditions of democratic trust, millions of surveillance cameras can become a “watchful aunt” rather than the proverbial Big Brother. Instead of simply policing what must be protected against collection, we need to police what is not collected and have corollary rights to data maximization. As our automation capabilities grow, we need to be just as worried about the information we’re not collecting as the information we are.  

We should advocate for surveillance liberalism in every aspect of our lives. The Biden AI Bill of Rights states, for example, that “continuous surveillance and monitoring should not be used in education, work, housing or in other contexts where the use of such surveillance technologies is likely to limit rights, opportunities, or access.” This is far too broad.  

In my research on pay gaps, for example, I find that the stagnant, frustratingly entrenched reality of gender and race salary inequities stems in large part from asymmetric information in the labor market: employers hold the salary data, putting both employees and regulators at a disadvantage. We can target this data asymmetry by publicly collecting salary data — including gender and race breakdowns — either through government agencies or through private data sharing on third-party, online job market platforms like LinkedIn.
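As an illustration, a minimal sketch of the kind of analysis that publicly shared salary data would enable might look like this in Python; the file shared_salaries.csv and its columns (salary, gender, role, years_experience) are hypothetical and chosen only for illustration, not drawn from any real data set.

```python
# Minimal sketch: estimating a raw and an adjusted pay gap from publicly
# shared salary data. File name and columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("shared_salaries.csv")      # hypothetical shared data set
df["log_salary"] = np.log(df["salary"])

# Raw gap: median salary by gender, with no controls.
print("Median salary by gender:\n", df.groupby("gender")["salary"].median())

# Adjusted gap: regress log salary on gender while controlling for role and
# experience, so the gender coefficient approximates the percentage pay
# difference between otherwise similar workers.
model = smf.ols("log_salary ~ C(gender) + C(role) + years_experience", data=df).fit()
print(model.summary().tables[1])
```

The regression’s gender coefficient approximates the pay difference between otherwise comparable workers — precisely the kind of gap that closed salary books keep hidden from employees and regulators alike.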

Economists have long understood fuller and more competitive information to be the engine of equality. Glassdoor, for example, has a salary calculator called Know Your Worth; such algorithm-derived information sharing is empowering and equalizing. Yet the closed-book, anti-surveillance mindset echoes in our cultural taboos against asking one another about our salaries, along with the broader terms and conditions of our work environments. Technological progress shapes and is shaped by changing cultural norms: The more we understand how such sharing benefits us all, the less we will fear surveillance liberalism.

Policy must play an active role in facilitating these shifts — for example, by limiting the enforceability of nondisclosure agreements, changing the definitions of company proprietary and confidential information and reforming laws about salary disclosures and salary data collection. Relying on privacy as a shield too often serves to protect the most powerful, abandoning our collective responsibility to the less privileged and the powerless. It also neglects privacy’s role as both a sword against state intervention and a bulwark against change.  

Whereas the alarms sounding against algorithms primarily decry their bias against women and minorities, the irony is that more data collection is actually the way to course-correct. Those who have less digital access will have less data collected about them — data that could support, for example, better dialect accuracy in speech-to-text technology, better accuracy in facial recognition technology, better predictions of needs and innovation and more inclusion in services. Moreover, when certain groups are underrepresented in the data used to train algorithms, predictions about these groups will inherently be less accurate.  
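To see why, consider a minimal, fully synthetic sketch in Python: a classifier trained on pooled data in which one group supplies only a handful of examples tends to be noticeably less accurate for that group. The group patterns, sample sizes and model here are invented for illustration and do not come from any real system.

```python
# Minimal sketch of the underrepresentation effect: a model learned on pooled
# data dominated by one group predicts less accurately for the other group.
# Fully synthetic data, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, w):
    """Generate features X and labels y with a group-specific relationship w."""
    X = rng.normal(size=(n, 5))
    y = (X @ w + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

w_major = np.array([1.0, -1.0, 0.5, 0.0, 0.0])   # majority group's pattern
w_minor = np.array([-0.5, 1.0, 0.0, 1.0, -1.0])  # minority group's pattern differs

# Training data: 5,000 majority examples but only 50 minority examples.
Xa, ya = make_group(5000, w_major)
Xb, yb = make_group(50, w_minor)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on fresh samples from each group.
for name, w in [("majority", w_major), ("minority", w_minor)]:
    Xt, yt = make_group(2000, w)
    print(name, "accuracy:", round(accuracy_score(yt, model.predict(Xt)), 3))
```

Running such a sketch typically shows strong accuracy for the well-represented group and much weaker accuracy for the underrepresented one — the statistical face of the exclusion described above.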

Unequal structures of digital access and uneven data mandate more, not less, collection of data about minorities, women and vulnerable populations. Better data can be used to prevent violence, tackle trafficking and hate crimes, promote safety and health, support innovation to protect the environment, increase employment diversity, improve accessibility and accommodation and benefit the learning process in schools. Academic researchers and government agencies need more access to data that today is deemed proprietary and enclosed. We need to shift our policy focus from the mere fact of data collection to the ways that data is used.  

In September, a San Francisco city ordinance allowing local authorities to use non-city-owned surveillance cameras to respond to crime was met with an outcry about privacy. But the new ordinance includes ample restrictions on how the cameras can be used and for how long, as well as rules about data retention and disposal, prohibitions on live monitoring inside homes, mandatory training and continuous oversight of law enforcement’s monitoring requests. These are the kinds of safeguards that we should support.

In reality, the contemporary battles for privacy may have unintended regressive effects. An algorithm is only as good as the information it’s fed. Counterintuitive though it may be, the best way to prevent discrimination and promote equity and inclusion is to allow algorithms to collect information — including data about gender and race — and learn from it.  

Orly Lobel is the Warren Distinguished Professor and director of the Center for Employment and Labor Policy (CELP) at the University of San Diego and the author of “The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future” (PublicAffairs, 2022).
