
Dangers of AI voice cloning for law firms

Most firms in the legal sector are aware of the dangers of impersonation scams. Conveyancers in particular will be familiar with ‘Friday afternoon fraud’, where cyber criminals infiltrate the email conversations between homebuyers and their solicitors with the intention of tricking the buyer into transferring the purchase funds into a criminal’s bank account.
Whilst measures have been taken to prevent such fraud, criminals have found ways around them and, using the latest generative AI tools, have made their scams even more convincing. They have also supplemented email with text messages and telephone calls in an effort to deceive their victims.
A more recent development, particularly worrying for law firms, is voice cloning – where AI tools are used to generate an artificial voice that sounds almost identical to the original speaker. This might sound like something from The Terminator, but it is in fact incredibly easy to do, and there have already been some well-documented cases where it has been used to defraud businesses of very significant amounts of money.
Are people really fooled by AI voices?
A 2023 study by security company McAfee found that AI voice cloning tools can replicate a person’s voice with up to 95% accuracy, and the evidence suggests that telling the difference is getting harder and harder as these tools improve.
If you are not convinced, try researching AI voice generation and cloning tools on the web. These are legitimate web services designed for creative and publishing purposes, but they can also be put to criminal use. Such tools can create a realistic-sounding clone from just 20 seconds of sample voice; fed two minutes or more, the cloned voice is frighteningly accurate.
To reinforce how accurate this technology has become, the BBC tried to fool online banking security systems using AI-cloned voices as part of its Scam Safe week. Very worryingly, it succeeded in getting past the security of two mainstream banks and gaining access to the accounts.
So how do AI voice scams work?
Imagine getting a call from a financial advisor about investing more money in a scheme they are running for you. There is limited time to act and you are being given a last opportunity. You recognise the caller – it’s someone you have spoken to several times before, from a firm you trust. In this case, however, the person on the other end of the line is actually a criminal. They obtained your details through a successful cyber attack on your financial advisor and used a video your advisor posted on LinkedIn to clone their voice.
This, unfortunately, is no longer science fiction; there are publicly available tools on the internet which can clone voices from recordings with worrying accuracy. Whilst there are many legitimate uses for this technology, the potential for criminal use is clear, and it is already happening at scale. The McAfee study found that 25% of respondents had experienced, or knew someone who had experienced, a voice cloning attack, with nearly three quarters of those targeted being successfully duped and suffering some kind of loss.
What can you do about it?
AI voice cloning scams target both businesses and individuals. From a personal perspective, agree ‘code words’ with family members or trusted friends who might ask you to transfer money or deal with other confidential matters.
In your business, code words can also help with internal authorisations. More important, however, is having robust multi-step authorisation processes for financial transactions, along the lines of the sketch below. Where possible, authorise through more than one method, or call the apparent caller back on a number you already know.
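To make the principle concrete, here is a minimal Python sketch of a dual-control release check. The Payment and Approval structures, the threshold and the channel names are illustrative assumptions rather than a real product or API; the point is simply that no single person, reached through a single channel, should be able to release a large payment.

```python
# Illustrative sketch only: dual-control check before releasing a payment.
# Payment, Approval and the threshold are hypothetical, not a real API.
from dataclasses import dataclass, field

@dataclass
class Approval:
    approver: str   # who signed off
    channel: str    # how they were reached: "portal", "callback", "in_person"

@dataclass
class Payment:
    amount: float
    payee_account: str
    approvals: list[Approval] = field(default_factory=list)

def can_release(payment: Payment, threshold: float = 10_000.0) -> bool:
    """Payments above the threshold need two different approvers,
    verified through two different channels, so one convincing phone
    call can never be enough on its own."""
    if payment.amount <= threshold:
        return len(payment.approvals) >= 1
    approvers = {a.approver for a in payment.approvals}
    channels = {a.channel for a in payment.approvals}
    return len(approvers) >= 2 and len(channels) >= 2
```

Under a rule like this, an ‘urgent’ instruction received by phone fails the check by design until a second person confirms it through an independent channel.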
To protect your clients, make it clear – at onboarding and repeatedly throughout the relationship – that you will not give instructions over the phone. Use a secure client portal and robust authorisation methods. It is also now advisable to use software solutions for secure authorisation via mobile phones with biometric authentication (although manual methods can achieve the same result), as in the sketch below.
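As a hedged sketch of what that can look like in practice, the fragment below accepts a phoned-in instruction only once the client has confirmed it over two channels the firm initiates. ClientRecord, the on-file number and both confirmation flags are hypothetical placeholders; a real implementation would sit behind your portal and case management system.

```python
# Illustrative sketch only: out-of-band confirmation of a phoned instruction.
# ClientRecord and the confirmation flags are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class ClientRecord:
    name: str
    onboarded_phone: str  # verified at onboarding; the only number we dial

def callback_number(client: ClientRecord, inbound_number: str) -> str:
    """Always dial the number on file, never the one that just called:
    caller ID can be spoofed and, as above, the voice itself cloned."""
    return client.onboarded_phone  # inbound_number is deliberately ignored

def instruction_verified(callback_confirmed: bool,
                         portal_confirmed: bool) -> bool:
    """A phoned instruction stands only when confirmed on a call we
    place to the client AND re-confirmed through the secure portal."""
    return callback_confirmed and portal_confirmed
```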
Above all else, be vigilant: if something feels even slightly suspicious, treat it as a scam – hang up and call the person back on a known number. Remember that these scams often target people whose voices criminals can find recorded online, frequently senior people in organisations such as partners and directors. Don’t let their seniority prevent you from questioning the legitimacy of the call.
