
Countering automation without representation


Tomorrow’s government workers aren’t your neighbors. They’re algorithms. 

Federal and local governments increasingly are turning to data-trained models and artificial intelligence to help make decisions — big decisions, such as the length of a prison sentence, and smaller decisions, such as when to deploy health inspectors to restaurants. But these algorithms need oversight, and it is nowhere to be found.

Congress should act on this critical issue, but so, too, should local governments, where some of the most consequential uses of government by algorithm can be found.

In Pittsburgh, for example, the city and county governments are using algorithms to perform a variety of functions. As in many localities across the country, algorithms are helping judges determine who gets bail and who does not. They are guiding police about where and when to send patrols. They are helping child welfare hotline screeners decide when to send in a case worker. And they’re only becoming more common.

Each of these applications has a worthy goal: keeping the public safe, preventing crime, and saving children from neglect, abuse or worse. It is possible that some of these algorithms are “better” at doing so than the humans who might be making decisions in their place. Algorithms can be programmed to take more evidence into account; they can do more, faster. The promise of big data and predictive analytics for improving our lives and society should not be ignored.

However, it is also possible that these algorithms could perpetuate — or even accelerate — existing patterns of discrimination. Ask Amazon, whose now-discarded recruiting tool trained itself to avoid selecting resumes that indicated the applicants were women. If policing patrol models are trained on years of notoriously biased data, such as historical arrest records, how can we ensure that the algorithms don’t churn out biased outcomes?

Yet too many of these algorithms operate like black boxes, with little to no means for the public and researchers to scrutinize how decisions are made. As algorithms increasingly determine government services and benefits, or even whether someone goes to prison, what remedies do individuals have if the algorithm is biased?

We cannot rely just on good intentions, or even impressive processes and evaluations within agencies. Instead, we should want public oversight to ensure accountability and fairness. No jurisdiction in the United States has successfully grappled with the complexities of how to adapt to this new administrative state by algorithm, though some are trying, including New York City and Washington State.

To be sure, this is no easy task. This is why the University of Pittsburgh Institute for Cyber Law, Policy and Security — Pitt Cyber — has formed the Pittsburgh Task Force on Public Algorithms to study oversight of local public algorithms. With support from The Heinz Endowments, we have gathered experts and community leaders to develop best practices and issue practical guidance for local policymakers wishing to take full advantage of the promise of data, while still ensuring accountability and equity for all residents. This task force will not work in isolation; we also will be working throughout the western Pennsylvania region to ensure that we learn from residents across communities. 

In a country where techno-optimism long has ruled, it is time we also ensure accountability and fairness. 

David Hickton is the founding director of the University of Pittsburgh Institute for Cyber Law, Policy and Security and the chair of the Pittsburgh Task Force on Public Algorithms. He is a former U.S. attorney for the Western District of Pennsylvania. Follow on Twitter @PittCyber.
