Deepfake Voices Are Coming for Law Offices: What You Need to Know

By Julie Bays, OBA Management Assistance Program Director

Have you noticed a rise in scam calls to your personal phone recently? Do these callers often leave identical voicemails, such as “Hello, I have been trying to reach you. Please call me back as soon as possible”? Listen closely, and you may realize the voice was likely generated by artificial intelligence.

Artificial intelligence has made it easier than ever to clone a person’s voice from just a few seconds of audio. What once required specialized equipment can now be done with consumer tools. Unfortunately, scammers have taken notice. Nationwide, there are increasing reports of law firms receiving calls from someone who sounds exactly like a client, a staff member, or even a fellow attorney, requesting sensitive information or urgent action.

Law firms are at particular risk because lawyers handle client funds, confidential data, and time-sensitive matters. A well-timed impersonation can be extremely convincing.

How These Scams Work

Scammers pull audio from voicemail greetings, recorded webinars, social media posts, podcasts, or even a short Zoom meeting clip. They feed that recording into a voice-cloning tool and generate a near-perfect imitation.

Once they have the voice model, the scam usually involves:

  • Asking for trust account transfers.
  • Requesting immediate payment of a “bill.”
  • Changing wiring instructions.
  • Pressuring support staff for confidential information.
  • Posing as a distressed client who needs help “right now.”

These calls are difficult to identify by sound alone.

Warning Signs

Even a flawless voice usually comes with behavioral red flags. Watch for calls that:

  • Create a sudden sense of urgency.
  • Ask you to bypass ordinary procedures.
  • Come from unusual numbers.
  • Sound like the person but include odd pacing, tone, or phrasing.
  • Feel “off” despite sounding correct.

Scammers rely on pressure and speed. Slowing things down is often enough to stop the attempt.

Why Lawyers Should Pay Attention

These attacks are growing because they work. Law firms routinely handle client funds, financial instructions, sensitive data, and urgent matters. A well-timed impersonation can undermine trust, compromise confidentiality, and cause significant harm before anyone realizes what happened.

Artificial intelligence is improving rapidly, which means impersonation will only become more convincing. Awareness is the first safeguard. Encouraging staff to trust their instincts, pause when something seems unusual, and follow established procedures can prevent costly errors.

For a deeper look at how these impersonation scams work in real-world legal settings, as well as examples of recent incidents, you may find this analysis helpful: https://www.linkedin.com/pulse/call-cost-25-million-how-ai-deepfakes-targeting-law-dasgupta-ph-d–llplc/

For more information on scams targeting lawyers, visit the Scam Alert page on the OBA website.