How AI scams exploit the elderly

Older adults are increasingly becoming targets for scammers using AI-driven techniques such as voice cloning, social engineering, and imposter scams. Data collated from the past seven years shows that the over 70s are losing more to fraud each year, and highlights this demographic’s vulnerability to the evolving cybercrime landscape.

From data included in Visa’s fall 2024 threat report and FTC Consumer Sentinel reports spanning 2017–2024, we identified several worrying trends in online scams targeting the elderly, and in how AI is changing the landscape.

Key Findings

  • 1 in 5 people lost money to fraud in 2023
  • The over 70s lost 2.5x more per incident than the average
  • Average losses to the over 70s have increased by 35% since 2017
  • Imposter scams are the most prevalent, increasing in number by 145% since 2017

The amount lost per incident is growing each year, with the losses suffered by the over 70s rising fastest. While people in the 20–39 age group filed more reports overall, the amount lost per incident was over $1,100 for the over 70s, rising to $1,450 for the over 80s.

This represents an increase of 35% in the average loss suffered by the over 70s, with other groups remaining relatively stable.

Scams in the age of AI voice cloning

The most common type of fraud reported in 2023 was imposter scams.

Imposter scams have been common for several years. In one of the most common variants, a scammer sends an SMS to the victim, pretending to be a friend or relative who is in trouble and needs money.

The number of imposter scams has grown by 145% since 2017, costing Americans $10.4 billion over that period.

With recent advancements in AI, scammers can now send voicemails that are indistinguishable from those of the victim’s relatives. This voice cloning capability is the latest development in the use of deepfakes to extort money from unsuspecting victims. An individual’s voice can be cloned from a speech sample just three seconds long, then used to convince friends and family that the person is in trouble so they send money.

These scams are especially effective against the elderly: the cloned voice of a distressed grandchild can extract significant sums of money, and often exacts a painful psychological toll on top of the financial cost.

How Families Can Protect Elderly Loved Ones From AI-Driven Scams

The most important thing you can do is make sure your elderly relatives feel supported and know they can ask for help; this is one of the simplest and most effective ways to keep them safe. Relatives who feel supported will reach out when something seems off, and regular contact gives you the chance to deliver frequent bite-size lessons that build their overall digital literacy.

Regular communication about scam tactics helps them know what to look out for.

Agreeing on a secret family “safe word” is very effective at stopping imposter scams: a word that only the family knows, used to verify the identity of anyone claiming to be a family member asking for help.

These conversations should include open discussion of which types of scams are currently common. You can keep up to date on current scams via the FTC consumer advice website, and report any suspicious activity on the FTC report fraud website.

Data sources