The Rising Threat of Deep Fake Tech
Action Fraud has reported an escalation in AI-enabled attacks, specifically the growing use of AI voice cloning to deceive, defraud and scam individuals and organisations. In addition, the National Cyber Security Centre (NCSC) has updated its business communications guidance to heighten protection against Artificially Inflated Traffic (AIT). In response to this news, we explore the rising threat of fake communications, along with actions to reduce the risk.
Can You Spot The Difference Between A Legitimate And Fake Call?
We’d all like to think that we could tell the difference, especially when it comes to close colleagues, family and friends. However, there are plenty of reported incidents of convincing fake calls. According to a recent survey, 77% of people who received a bogus voice-cloning call went on to lose money, which suggests these messages are highly believable.
What is Deep Fake Tech?
Deep Fake Tech uses audio and video clips to replicate voices and faces. You may believe that cyber criminals need access to complex technology to do this. However, the reality is that voice cloning technology is available on the internet and you don’t need to be a coding expert to use it. What’s more, it can take as little as a 3-second clip to capture an individual’s speech patterns and accent.
In studies, a 3-second audio clip resulted in voice cloning with 85% accuracy. When several sources of the same voice were used, this increased to 95%. Then, you have to factor in that the call is typically emotional: a demanding customer who wants action taken now, or even your child who needs urgent help to get out of a sticky situation. When you are reacting under pressure, you are less likely to stop and think and more likely to act.
Where do Cyber Criminals Source Content for Deep Fake Tech?
How often do you share recorded notes or send a voicemail message? Do you share video clips with audio content on social media or take part in podcasts? More than ever, our voices and faces are recorded and publicly shared, so it is easy for cyber criminals to compile content.
A Message from Ella is a powerful and shocking short video to illustrate how this content can be manipulated.
The cloned content is then used to directly target individuals or organisations with information that seems credible. This is known as ‘spear phishing’. Employees are convinced that it was their manager who asked for access information or a regular client who requested changes to their account. Individuals are convinced that their child or spouse is in danger and they need to act to secure their safety.
The NCSC has recently warned that a Russian actor, known as Star Blizzard, is targeting UK individuals and organisations with spear phishing attacks. This is just one of the potential threats of AI-generated content being used to cause harm and extort money.
Read more about this cyber threat.
Can Deep Fake Tech Be Used for Good?
Although synthetic AI content is being harnessed by cybercriminals, it also has many positive applications. A recent tech article offers some examples, including:
- Providing immersive learning experiences
- Personalising customer service interactions
- Aiding patient care in medical settings
In banking, AI is being used to predict scams in real time. A partnership between a payment network processor and nine UK banks has applied the technology to significantly reduce fraud. In this application, AI is reducing financial risks for both firms and customers.
AI Exploiting Business Communications
Another way in which AI is being used to disrupt communications is by generating fake traffic to access websites and apps. This process, known as Artificially Inflated Traffic (AIT), is then exploited for financial gain. In response, the National Cyber Security Centre has revised its guidance on Business Communications Best Practice.
Reading and acting on this guidance will help you to deliver trusted and consistent messages to customers. It can protect customers from fraud and help retain your business reputation.
If you believe that this type of cybercrime only happens to high-profile organisations, think again. Of the 188 organisations that reported an incident to Action Fraud in October, 73% were micro or small businesses. What’s more, compromised email accounts were a major issue.
How to Reduce the Risk of Deep Fake Scams
Firstly, think before you act, and inform employees so they do the same. Cybercriminals use urgency to incite action: from immediate threats to time-limited offers, you are likely to feel pressured to respond. Although it may not feel instinctive, pause and take a moment to think.
If in doubt, check the source. Can you use another phone or communication platform to contact the person or organisation directly? Use stored contact details or those shown on a website rather than links or other options provided by the caller.
Remember that no legitimate organisation will request passwords, PINs or other private information over the telephone. Any request for personal details is a sign that the call isn’t genuine. Equally, when it comes to money, criminals will push for non-traceable payment options, so be suspicious if usual payment methods aren’t accepted.
The NCSC recommends agreeing on a word or phrase with children and other family members. This can be used to check whether an urgent call, text or other communication is genuine. We believe the same approach can be beneficial in business. Set a company passcode and consider whether this is also viable for customer communications.
Use privacy settings on personal social media platforms to restrict who can view your content.
Sign your business up for Cyber Essentials. This will build awareness of potential cyber threats, along with practical steps to reduce the risks.
Greater Awareness To Boost Protection
We’ve not shared this information to be scaremongers. We strongly believe that greater awareness of potential threats is necessary to ensure you and your business are better protected. Simply questioning sources and resisting the pressure to act before you think could save you from falling victim to a scam.
If you have any questions about cyber protection, talk to us on 0333 101 7300. We’re happy to lead you to the most suitable solutions for your requirements.