AI scammers are now impersonating US government bigwigs, says FBI

Hackers armed with deepfakes are targeting US federal and state officials, masquerading as senior US officials in a brazen phishing campaign to steal sensitive data. 

The bad actors have been operating since April, using deepfake voice messages and text messages to masquerade as senior government officials and establish rapport with victims, the FBI said in a May 15 warning. 

“If you receive a message claiming to be from a senior US official, do not assume it is authentic,” the agency said.  

If US officials’ accounts are compromised, the scam could become far worse because hackers can then “target other government officials, or their associates and contacts, by using the trusted contact information they obtain,” the FBI said. 

As part of these scams, the FBI says the hackers are trying to access victims’ accounts through malicious links and directing them to hacker-controlled platforms or websites that steal sensitive data like passwords. 

Source: FBI

“Contact information acquired through social engineering schemes could also be used to impersonate contacts to elicit information or funds,” the agency added. 

Crypto founders targeted in separate deepfake attacks 

In an unrelated deepfake scam, Sandeep Nailwal, co-founder of blockchain platform Polygon, raised the alarm in a May 13 X post that bad actors were also impersonating him with deepfakes. 

Nailwal said the “attack vector is horrifying” and had left him slightly shaken because several people had “called me on Telegram asking if I was on zoom call with them and am I asking them to install a script.” 

Source: Sandeep Nailwal

As part of the scam, the bad actors hacked the Telegram account of Polygon’s ventures lead, Shreyansh, and pinged people asking them to join a Zoom call featuring deepfakes of Nailwal, Shreyansh and a third person, according to Nailwal. 

“The audio is disabled and since your voice is not working, the scammer asks you to install some SDK, if you install game over for you,” Nailwal said. 

“Other issue is, there is no way to complain this to Telegram and get their attention on this matter. I understand they can’t possibly take all these service calls but there should be a way to do it, maybe some sort of social way to call out a particular account.” 

At least one user replied in the comments saying the fraudsters had targeted them, while Web3 OG Dovey Wan said she had also been deepfaked in a similar scam. 

Source: Dovey Wan

FBI and crypto founder say vigilance is key to avoiding scams 

Nailwal suggests the best way to avoid being duped by these types of scams is to never install anything during an online interaction initiated by another person and to keep a separate device specifically for accessing crypto wallets. 

Related: AI deepfake attacks will extend beyond videos and audio — Security firms

Meanwhile, the FBI says to verify the identity of anyone who contacts you, examine all sender addresses for mistakes or inconsistencies, and check all images and videos for distorted hands, feet or unrealistic facial features. 

At the same time, the agency recommends never sharing sensitive information with someone you have never met, never clicking links from people you don’t know, and setting up two-factor or multifactor authentication. 

Magazine: Deepfake AI ‘gang’ drains $11M OKX account, Zipmex zapped by SEC: Asia Express
