Deepfake Attack on Bunq CEO: A Wake-up Call for Cybersecurity


Introduction

Is artificial intelligence becoming the newest weapon in the arsenal of cybercriminals? A recent incident involving Bunq CEO Ali Niknam and a deepfake clone has raised alarms in the cybersecurity community.


The Incident

Last week, an employee at Bunq, an Amsterdam-based online bank, received an email inviting them to a video conference. The email appeared to come from Ali Niknam, the CEO. During the video call, the employee was presented with an AI-generated clone of Niknam, complete with a replicated image and voice. The fake Niknam asked the employee to transfer a “significant amount of money.” Fortunately, the employee did not fall for the scam.


A Mission Impossible Scenario

Ali Niknam took to LinkedIn [1] to share the unsettling experience, stating that this was the first time a deepfake of him had been created. He described the AI clone as highly convincing and said it felt like a scene out of Mission: Impossible. Niknam’s post serves as a warning about the evolving tactics of cybercriminals.


The New Weapons of Cybercriminals

Deepfake technology is becoming increasingly accessible, and criminals are quick to adopt it. While email-based scams involving fake invoices or impersonations are common, this marks a new frontier in cybercrime. The AI-powered deception is so realistic that it can easily fool individuals into believing they are speaking to the actual person.


Rise of Voice Cloning in the US

The United States has seen a surge in voice cloning scams, with the technology performing exceptionally well in English. Incidents have even included cloning children’s voices to deceive parents into transferring money. In January 2020 [2], a bank in the UAE lost $35 million due to a voice cloning scam that targeted an employee in Hong Kong.


Police Advisory

Last month, Dutch police stated they were unaware of any local victims of voice cloning scams [3]. This may have been due to the absence of Dutch language support in voice cloning software, a situation that has recently changed. The police urge the public to be skeptical of any unusual requests, especially those involving irreversible actions such as money transfers.


Takeaways

  • Always double-check the identity of the person you’re speaking to.
  • Be cautious of unsolicited requests for money transfers or sharing sensitive information.
  • Keep up with emerging cyber threats to stay one step ahead of criminals.

Conclusion

The Bunq incident shines a light on the rapidly evolving landscape of cyber threats. As criminals adopt increasingly sophisticated methods, it’s crucial for individuals and organizations to stay vigilant and up to date on the latest cybersecurity measures. It serves as a wake-up call for everyone, underlining the importance of thorough identity verification and skepticism in the digital age.

Would you want to stay unaware of such evolving threats, or would you rather arm yourself with knowledge to combat them? The choice is yours.

  1. https://www.linkedin.com/feed/update/urn:li:activity:7114874320627085312/
  2. https://www.forbes.com/sites/thomasbrewster/2021/10/14/huge-bank-fraud-uses-deep-fake-voice-tech-to-steal-millions/?sh=6f02ae387559
  3. https://www.nrc.nl/nieuws/2023/10/04/fraudepoging-met-ai-kloon-van-topman-onlinebank-bunq-a4176143
Reza Rafati https://cyberwarzone.com

Reza Rafati, based in the Netherlands, is the founder of Cyberwarzone.com. An industry professional providing insightful commentary on infosec, cybercrime, cyberwar, and threat intelligence, Reza dedicates his work to bolstering digital defenses and promoting cyber awareness.
