Deep Fake: The Next Frontier in Cyber Scams

One of the most notorious incidents of cyber crime to date also stands out for the bare-faced cheek and simplicity of the methods employed. When criminals targeted Austrian aerospace firm FACC, they didn’t bother trying to hack into the company’s IT systems, bring down firewalls with a DDoS attack, or plant malware on its servers to quietly mine sensitive data.

Instead, they simply impersonated CEO Walter Stephan, sending a fake email in his name authorising a junior member of the accounts team to send $47m to what the email claimed was the bank account of a company Mr Stephan was negotiating to buy. It wasn’t, and the thieves made off with one of the biggest single hauls in cybercrime history.

Of course, how a single email could circumvent all the usual transaction protocols of a major enterprise raised serious questions. But what if the CEO of the company hadn’t just been impersonated in an email? What if criminals had found a way to mimic his voice and put in a call to the accounts team ordering that the transaction be made?

This was exactly what happened in an incident reported earlier this month that is bound to send shockwaves through the business and cybersecurity communities. The hapless managing director of an unnamed company received a phone call, apparently from his CEO, requesting that he urgently wire $243,000 to an account in Hungary to cover late payment fines. What put the MD off his guard was that he recognised the voice as undeniably that of his boss, and he carried out the instruction accordingly.

AI mimicry

This theft is believed to be the first reported incident of so-called Deep Fake technology being used by cyber criminals to target a business. Deep Fake has been gaining a lot of publicity lately, mainly with regard to the doctoring of photographs and concern over how the spread of fake news on social media is now being bolstered by the distribution of highly convincing faked images.

But Deep Fake is far from just a matter of altering photographs. The term refers to the use of Artificial Intelligence to combine or synthesise source material so convincingly that the result is indistinguishable from the genuine article. It means you can superimpose a photograph of one person onto another in a way which goes far beyond even the very best Photoshop jobs. And, in the case of the above scam, it means that you can digitally alter one person’s voice so that it sounds indistinguishable from another’s, even as they talk in real time down the phone. All you need is a recording of your ‘target’ and the right software.
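To give a sense of how low the barrier now is, the sketch below shows roughly what voice cloning looks like with a freely available open-source toolkit. It is a minimal illustration only, assuming the open-source Coqui TTS library and its XTTS v2 voice-cloning model; the file names and the spoken text are hypothetical placeholders.

# A minimal sketch of voice cloning with the open-source Coqui TTS library
# and its XTTS v2 model. File names and text are illustrative placeholders.
from TTS.api import TTS

# Download and load the multilingual voice-cloning model (runs locally).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short recording of the target speaker is enough for a recognisable imitation.
tts.tts_to_file(
    text="This is a demonstration of synthesised speech.",
    speaker_wav="reference_recording.wav",  # clip of the voice being cloned
    language="en",
    file_path="cloned_voice.wav",           # output spoken in the cloned voice
)

A handful of lines and a few seconds of audio are all that separate a genuine voice from a convincing imitation, which is precisely why a familiar voice on the phone can no longer be treated as proof of identity.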

What this means is that companies have yet another front to fight on in the ever-evolving battle against cyber crime. As Deep Fake scam successes mount up and the cost of the software falls, businesses of all sizes will have to be on the alert.


