Can your ears distinguish a human voice from an AI deepfake? Knowing the difference could save you from a phone scam that could cost you thousands of dollars.
Criminals use AI-generated voice clones to trick victims into handing over money or passwords. These scams surged 442% between the first and second half of last year, according to cybersecurity firm CrowdStrike. While deepfake audio software is evolving quickly, there are still some “tells” that can betray an AI impersonator.
To test this, we enlisted David Falkenstein of corporate security firm IOActive to clone the voices of a few Wall Street Journal colleagues. He pulled clips of our publicly available social-media and podcast audio, each just 10 to 30 seconds long. He then used OpenAudio, easily accessible software that can run on a laptop, to make our voices say some pretty crazy things.
Read more | WALL STREET JOURNAL