Source: American Bankers Association
Imposter scams in particular are on the rise in the age of artificial intelligence (AI). Criminals are using deepfakes, or media that is generated or manipulated by AI, to gain your trust and scam you out of your hard-earned money.
Deepfakes can be altered images, videos or audio. They may depict people you know, including friends and family, or public figures such as celebrities, government officials and law enforcement.
How to detect a deepfake:
Look for inconsistencies:
- Are any of the facial features blurry or distorted?
- Does the person blink too much or too little?
- Do the hair and teeth look real?
- Are the audio and video out of sync?
- Is the voice tone flat or unnatural?
- Does the image or video show odd or unnatural shadows or lighting?
Tips to stay safe:
STOP AND THINK. Is someone trying to scare you or pressure you into sending money or sharing personal information?
VERIFY the legitimacy of people and requests by using trusted numbers, official websites and online reverse image/video search tools.
CREATE CODEWORDS or phrases with loved ones to confirm identities.
LIMIT YOUR DIGITAL FOOTPRINT. Photos, voice clips and videos can be used to train deepfake models.
NEVER REPOST videos or images without verifying the source.
Red flags of a deepfake scam:
- Unexpected requests for money, passwords, personal information or secrecy.
- Emotional manipulation involving fear or urgency.
- Uncharacteristic communication from someone you know, especially over text, phone or video.
Report scams: Contact your local police, the FBI at IC3.gov and, if you sent money, your bank.
To learn more, see the ABA infographic for tips against deepfake media scams.