Experts have raised the alarm about the increasing sophistication of scammers using audio deepfake technology to trick unsuspecting victims. Cybersecurity professionals highlight that these fraudsters employ AI to replicate voices and faces, making their scams more convincing.
Irene Corpuz, founding partner and board member at Women in Cybersecurity Middle East, referenced a recent incident in May where a British engineering company in Hong Kong lost approximately HK$200 million (Dh94 million) due to an AI-generated video call scam. “Scammers will engage you in phone conversations so that they can record your voice and use it in a future scam,” Corpuz explained. She added that this could also happen during Zoom meetings with multiple participants. “When a victim hears the voice or sees a video of a friend or a loved one, the scam becomes more believable,” she noted.
Corpuz advised the public to be wary of calls from unknown numbers, especially if the caller asks questions requiring a “yes” or “no” answer. “Scammers can initiate calls with a chatbot, and the chatbot will confirm a transaction request with a question: ‘Would you like to initiate a payment? Is this correct?’ This is when the scammers can use the recorded ‘yes’ or ‘no’ answer,” Corpuz told Khaleej Times. She cautioned against responding affirmatively to unknown callers, as a recorded “yes” can be replayed to authorize fraudulent transactions or pass voice-based identity verification.
Scammers often use verification tactics to appear legitimate. “The scammer would say ‘the first digits of your Emirates ID are 784-19…’ and then pause. Take note that the scammer is trying to make you fall into a trap of supplying the remaining digits of your Emirates ID yourself,” Corpuz warned. Once they gain your trust, they can proceed with their scam. She added that many scammers pose as representatives of banks, central banks, police, or utility companies.
JD Ackley, CEO of Raizor, a conversational AI deployment and services organization, emphasized the importance of being cautious with unsolicited calls. “Typically, scammers are looking for any reason you might take a call from them. Their premise will be generic, and they will zero in on things you mention, because bots are programmed to take direction based on your responses,” Ackley explained. He noted that scammers usually start with only vague information about your demographic and try to extract more details from you.
Ackley also advised vigilance when payment requests are made in unusual forms. “A legitimate business will not request payment in any type of gift card or money transfer,” he said. He suggested asking for a callback number and verifying the legitimacy of the caller. “Scammers will do anything to keep you on the call and extract payment during that call, but only a legitimate business will have a proper way for you to call them back,” Ackley added.
Barney Almazar, director of the corporate-commercial department at Gulf Law, pointed out that scammers often target individuals during moments of lowered vigilance, such as commuting or meal times. “This tactic makes it easier for them to conceal their fraudulent intentions. Additionally, during these periods, it can be challenging to reach bank hotlines, providing scammers with a crucial window to exploit before you can report the fraud,” he explained.
Almazar stressed the importance of education and awareness in combating deepfake scams. “Under the UAE Cybercrime Law (Federal Decree-Law No. 5 of 2012), there are stringent measures in place to combat these abuses. Article 2 explicitly criminalizes electronic fraud and impersonation, imposing severe penalties including imprisonment, fines, and subsequent deportation,” he stated. He also highlighted Article 21, which prohibits the recording, sharing, or publishing of personal data without consent.
“Vigilance and critical thinking play important roles when dealing with deepfakes or any kind of scam. Don’t just believe what your eyes see or what your ears hear. Check and verify before doing anything,” Almazar concluded.