Lawmakers are concerned that advances in video manipulation technology could set off a new era of fake news. Now legislators say they want to start working on fixes to the problem before it’s too late.
Technology experts have begun to sound the alarm on the new software, which lets users take existing videos and make high-quality altered video and audio that appears real. The emergence of the technology opens up a new world of hoaxes driven by doctored audio or video, and threatens to shake faith in the media even further.
Senator Ron Wyden (D-Ore.), one of the most vocal members of Congress on tech issues, painted a grim picture about what the advances could mean for the future of discerning truth in media.
“Since we can’t rely on the responsibility of individual actors or the platforms they use, I fully expect there will be a proliferation of these sorts of fictions to a degree that nearly drowns out actual facts,” Wyden told The Hill.
“For those who value real information, there will still be some reliable publications and news outlets, and their credibility will need to be guarded all the more intently by professional journalists,” he added.
Rep. Adriano Espaillat (D-N.Y.), who has targeted fake news in the past through legislation, also told The Hill that he’s concerned.
Espaillat said that he is worried about the potential for foreign governments to use counterfeit audio and videos to manipulate the American public.
Lawmakers’ fears are echoed by experts, who say that manipulated videos are another dangerous addition to the rising tide of fake news.
”Democracy depends on an informed electorate, and when we can’t even agree on the basics of what’s real, it becomes increasingly impossible to have the hard conversations necessary to move the country forward,” said Renee DiResta, one of the first researchers to sound the alarm on how social media platforms were being manipulated by foreign actors.
“The cumulative effect of this is a systemic erosion of trust, including trust between people and their leaders,” she added.
And widely available video-faking technology is drawing closer to reality.
Researcher Aviv Ovadya showed BuzzFeed new software that blurs the lines between genuine video and doctored footage that appears real. Meanwhile, researchers at Stanford and the University of Washington are currently developing technology that allows people to alter footage of world leaders like Russian President Vladimir Putin and former President George W. Bush, making them appear to do things they’ve never actually done.
Ovadya predicts that such technology could be used to manipulate diplomacy, and even goad countries into making decisions based on fake information.
“It doesn’t have to be perfect — just good enough to make the enemy think something happened that it provokes a knee-jerk and reckless response of retaliation,” Ovadya, the chief technologist for the University of Michigan’s Center for Social Media Responsibility, told BuzzFeed.
Sen. Mark Warner (D-Va.), the ranking Democrat on the Senate Intelligence Committee and a leading lawmaker in the push to respond to Russian social media manipulation in U.S. politics, thinks that counterfeit videos could have impacts beyond politics.
“I’m much more worried about what could come next — could bad actors target kids with fake videos from people they trust?” Warner questioned during an event earlier in February addressing the dangers technology can pose for children.
“This ultimately begs the question — how do you maintain trust in a digital-based economy when you may not be able to believe your own eyes anymore?” he asked rhetorically.
Such algorithms are already beginning to take on a darker life. People have begun to make pornographic videos, known as “deepfakes,” using software that superimposes celebrities’ faces onto adult-film actors.
Warner didn’t specify what action should be taken, but did say that “the federal government must step up and update our laws to address emerging threats to Americans.” He also pressed platforms to “commit more resources to identifying bad actors.”
It’s still not easy to produce combined video and audio convincing enough to fool people, but at least one lawmaker believes that there is concrete action the government can take. Rep. Ro Khanna (D-Calif.), who represents Silicon Valley, called for the Defense Advanced Research Projects Agency (DARPA) to create a way to authenticate genuine audio and video.
“I think we should task DARPA with figuring out how to create a secure internet protocol that requires authentication,” Khanna told The Hill.
“We all will need some form of authenticating our identity through biometrics. This way people will know whether the voice or image is real or from an impersonator,” he said, adding that social media platforms should invest in their authentication products to further validate the veracity of videos and audio.
“Congress can also provide a basic regulatory framework requiring authentication on internet platforms,” he added.
Congressional action on tools for the creation of fake news might not come easily, though. A bipartisan bill from Warner and Sen. Amy Klobuchar (D-Minn.), co-sponsored by Sen. John McCain (R-Ariz.) and meant to curb fake news and increase political ad transparency on social media, is already having a hard time gaining traction in Congress.
Even if it is politically difficult, experts say that it’s a fight worth having.
“Lawmakers have almost entirely avoided regulating technology companies,” DiResta said. “It’s time we rethink that.”