Technology

Artificial intelligence debate flares at Google

Google’s decision not to renew a controversial artificial intelligence (AI) contract with the Pentagon has reignited a debate about what Silicon Valley’s role should be with regard to the military and war.

Google, facing internal pressure, told employees during a meeting on Friday that it would not renew its contract with the Defense Department’s flagship AI program, known as Project Maven, after it expires in 2019, according to multiple reports.

The contract sparked a public relations crisis after a handful of employees reportedly resigned in protest and thousands more signed a letter urging CEO Sundar Pichai not to allow Google to be drafted into the “business of war.”

Employees pointed to the company’s old “Don’t Be Evil” motto in pressuring Google to cut ties with the Pentagon.

Project Maven had recruited Google to help advance technology like surveillance drones, which are used to track the whereabouts of terrorist organizations and uncover plots before they unfold.

Bob Work, a former deputy Defense secretary who established Project Maven in April 2017, told The Hill on Monday that he is still holding out hope that Google will reconsider. If it does not, he said, the decision would be unfortunate and could lead other technology companies to divorce themselves from Maven and similar projects.

“I worry that a lot of companies will look at Google and say, ‘Wow, if Google isn’t going to work with the Department of Defense, maybe I shouldn’t either.’ So I’m hoping that this is not going to turn into any type of stampede,” Work said.

Employees at Google and critics outside the company said the government’s partnership with the search giant raised a series of ethical and legal questions given the amount of personal data Google holds through email accounts and Google Maps.

They warned that with Google’s help, the U.S. military could build advanced AI weapons capable of functioning autonomously. The machines, they cautioned, could eventually become sophisticated enough to kill without human input.

“We have warned that technology companies should be extremely cautious about working with any military agency where the application involves potential harm to humans or could contribute to arms races or geopolitical instability,” said Karen Gullo, a spokeswoman for the Electronic Frontier Foundation.

“While we don’t know details about Google’s decision to not renew the Project Maven contract, we’re pleased to know that Google listened to the concerns of employees and will reportedly unveil a set of ethical principles about its use of AI.”

Google’s decision would be a particular blow to the Pentagon if the artificial intelligence technology it is pursuing is instead developed first by another country.

“Much of the interest … the Pentagon has in these kinds of technologies, and I think more importantly the Pentagon’s new embrace of Silicon Valley, is motivated by the fear of a rising China,” Peter Singer, a fellow studying war and technology at New America, told The Hill in an interview.

Other security experts cast decisions by tech companies not to work on such projects as shortsighted. They argued that by working with the Pentagon, Google would have an opportunity to shape how the technology develops and ensure it is applied in a more positive way.

“The question to me is not ‘Do you do this, yes or no,’ but how do you engage in ways that reduce the carnage of war,” said Daniel Byman, a counterterrorism and Middle East security expert at the Brookings Institution.

“You’re going to see the technology advance. The question is: Can companies like Google and other places shape the evolution in a positive way? They are not going to stop it,” he said.

Singer questioned the argument by Google employees that they do not want to be drafted into matters of war, arguing that they already have been. He cited examples ranging from the Islamic State of Iraq and Syria’s use of YouTube to share recruitment videos to Russia’s use of social media platforms in the U.S. during the 2016 presidential race.

“It is an immaturity on their part to act like they haven’t already touched the realm of war,” said Singer.

Project Maven is using machine learning to bolster the Pentagon’s ability to sort through hours of drone-collected footage, helping analysts arrive more quickly at the most important surveillance findings. The technology is used to detect and annotate objects in that footage.

Though Google Cloud CEO Diane Greene assured employees that the technology they develop will not “operate or fly drones” or be “used to launch weapons,” employees argued in their April letter that the military could easily adapt the technology for use in advanced weapons.

Groups like the Tech Workers Coalition and the Campaign to Stop Killer Robots joined the Google employees.

“We are calling on Google not to make weapons because Google has a special relationship with the public in virtue of the kind of personal data they are collecting — through our email, through Google Maps, through Android systems, through internet searches and all sorts of things,” said Peter Asaro, an associate professor at The New School who co-chairs the International Committee for Robot Arms Control.

Asaro, a member of the Campaign to Stop Killer Robots, echoed the concerns of employees, warning that there are ethical and legal risks to military use of such technology.

A spokesperson for the Tech Workers Coalition directed The Hill to the comments made by Google employees, who pushed to scrap Project Maven.

“We demonstrated that employees have a voice,” Google staff wrote in an email reported on by Bloomberg. “We need to remember this when we examine the AI principles and their governance mechanisms: we worked to make this happen, and we must ensure that we are not cut out of the process.”

Work, the former deputy Defense secretary, defended the initiative, stating that artificial intelligence aims to help protect U.S. military personnel in conflict zones.

“The machine isn’t making any decisions. It is not applying lethal force. All it is doing is reducing the workload of analysts. That’s it,” said Work.

He emphasized that AI could be used to save lives rather than take them, suggesting, for example, that AI-powered robots could replace the men and women who defuse bombs and other dangerous explosives.

Drone technology has proliferated rapidly among states, raising fears about how the new capabilities will be used.

Asaro said surveillance drones are now used by more than 100 countries, while armed drones are used by five to 10 countries.

Drones have long captured the attention of privacy hawks, who warn that the aircraft’s growing surveillance capabilities pose a risk to people’s privacy. The technology has also been repeatedly described as the new frontier of war.

The debate is likely to persist as the Defense Department continues to enlist the help of the private sector.

And it is unclear whether Google’s decision is a short-term remedy to assuage angry employees or part of a longer-term plan to keep its work separate from the military. Google did not respond to a request for comment, nor has it publicly confirmed its plans to walk away from the Pentagon contract.

“It is an important debate, one we should all have. When people take themselves out of the debate, I don’t think that helps really anybody,” Work added.

Ali Breland contributed.