For nearly 30 years, Oren Etzioni was among the most optimistic of artificial intelligence researchers.

But in 2019, Dr. Etzioni, a University of Washington professor and founding chief executive of the Allen Institute for A.I., became one of the first researchers to warn that a new breed of A.I. would accelerate the spread of disinformation online. And by the middle of last year, he said, he had grown distressed by the prospect that A.I.-generated deepfakes could swing a major election. In January, he founded a nonprofit to fight that threat.

On Tuesday, the organization released free tools for identifying digital disinformation, with a plan to put them in the hands of journalists, fact checkers and anyone else trying to figure out what is real online.

The tools, available from the nonprofit's website to anyone it approves, are designed to detect fake and doctored images, audio and video. They review links to media files and quickly determine whether the files should be trusted.

Dr. Etzioni sees these tools as an improvement over the patchwork defense currently being used to detect misleading or deceptive A.I. content. But in a year when billions of people worldwide are set to vote in elections, he continues to paint a bleak picture of what lies ahead.

“I’m terrified,” he said. “There is a very good chance we are going to see a tsunami of misinformation.”
