James Cameron warns ‘weaponization’ is AI’s ‘biggest danger’ to humanity

Filmmaker James Cameron warned in an interview Tuesday that the “weaponization of AI” is the most significant of all the risks that the rapidly developing technology poses to humanity.

“I warned you guys in 1984,” the award-winning filmmaker said, somewhat facetiously, referring to “The Terminator,” his movie from that year about a cyborg assassin sent from the future. “But you didn’t listen.”

In his interview with Canada’s CTV, Cameron also struck a more serious note, pointing to the dangers he sees artificial intelligence (AI) posing to humanity. 

He warned specifically of the weaponization of the technology, which he predicted might evolve into a situation like a nuclear arms race, where “if we don’t build it, the other guys are for sure going to build it.”

“You’ve got to follow the money: Who’s building these things? They’re either building it to dominate market share — so what are you teaching it? Greed. Or you’re building it for defensive purposes — so you’re teaching it paranoia. I think the weaponization of AI is the biggest danger. I think that we will get into the equivalent of a nuclear arms race with AI,” he said.

“You could imagine an AI in a combat theater — the whole thing just being fought by the computers, at a speed that humans can no longer intercede. You have no ability to de-escalate,” Cameron warned, “and when you’re dealing with the potential of it escalating to nuclear warfare, de-escalation is the name of the game, and having that pause, that timeout.”

“But will they do that? The AIs will not,” the “Avatar: The Way of Water” director added.

Cameron’s warning echoes the concerns of many leading experts and lawmakers, who acknowledge the benefits of AI while recognizing its potential dangers. Many lawmakers and leaders in the field have voiced support for government regulation, but concerns linger that Congress lacks the expertise to craft, and the ability to enforce, rules that keep pace with the rapidly evolving technology.