Facebook said Thursday it is planning to expand its fact-checking functions to photos and videos.
“Today, we’re expanding fact-checking for photos and videos to all of our 27 partners in 17 countries around the world (and are regularly on-boarding new fact-checking partners),” product manager Antonia Woodford said in a statement.
“We know that people want to see accurate information on Facebook, so for the last two years, we’ve made fighting misinformation a priority,” she said. “This will help us identify and take action against more types of misinformation, faster.”
Facebook had previously tested the photo and video fact-checking function in France with the news agency AFP.
The company said that misleading videos and photos will receive a label after being checked with visual verification techniques, like reverse image searching.
Facebook also promised to take additional steps to remove posts it determines to be false from its site.
“We know that fighting false news is a long-term commitment as the tactics used by bad actors are always changing,” Woodford wrote.
“As we take action in the short-term, we’re also continuing to invest in more technology and partnerships so that we can stay ahead of new types of misinformation in the future,” she concluded.
Facebook emphasized earlier Thursday its wider efforts to combat information it deems false as it looks ahead to the November midterm elections.
“In 2016, we were not prepared for the coordinated information operations we now regularly face,” CEO Mark Zuckerberg said. “But we have learned a lot since then and have developed sophisticated systems that combine technology and people to prevent election interference on our services.”
“This effort is part of a broader challenge to rework much of how Facebook operates to be more proactive about protecting our community from harm and taking a broader view of our responsibility overall,” he added.