AI-Powered Jennifer Aniston Impersonator Scams Brit Out of Thousands—Here’s How
Deepfake romance cons just leveled up—and your wallet’s the target.
Another day, another AI scam bleeding unsuspecting victims dry. This time? A synthetic Jennifer Aniston sweet-talked a UK man into handing over money on Apple gift cards. No Hollywood ending here.
How the scam worked
The 'Aniston' deepfake deployed flirtatious banter, professions of love, and 'urgent' pleas for cash, classic romance-scam tactics. By the time the victim smelled a rat, his money had vanished into non-refundable gift cards.
Why this matters
Generative AI tools can now clone a voice or a face from just a few seconds of source material. Combine that with hard-to-trace payments, from gift cards to crypto, and you've got a perfect storm for financial carnage. (Bonus jab: at least traditional banks take 3-5 business days to rob you.)
Stay paranoid out there.

Davis, the man targeted in the scam, reported receiving a series of manipulative messages in which the individual posing as Aniston engaged in “love bombing”, using affectionate pet names, professing love, and even sending an image of Aniston holding a digitally altered sign that read “I Love You.”
Unfortunately, Davis was persuaded by the deceptive tactics of the AI deepfake and ultimately fell victim to the scam, losing money through non-refundable Apple gift cards.
“I’ve got fake videos from Jennifer Aniston asking me for £200 and saying she loves me,” Davis said.
Apart from the messages involving Aniston, Davis reported being targeted by AI-generated videos featuring Meta Platforms CEO Mark Zuckerberg and Tesla CEO Elon Musk. He stated that he has been subjected to these deepfake videos and messages for the past five months.
“This is not a scam, believe me,” the AI-generated deepfake of Zuckerberg reportedly told Davis in an attempt to appear credible. These fabricated messages were frequently accompanied by doctored certificates and fake identification cards.
Davis is not the only person to have fallen victim to an AI-driven scam. In January, a French woman was deceived by an AI deepfake of actor Brad Pitt, leading her to believe they were in a romantic relationship. She was ultimately defrauded of her life savings after being convinced that the funds were needed for Pitt’s alleged cancer treatment.
The woman received love poems, heartfelt declarations, AI-generated photographs, and even a marriage proposal from the scammers. They further escalated the deception by sending an email purportedly from a doctor, claiming that Pitt was critically ill and fighting for his life. Convinced by these communications, the victim ultimately transferred approximately $850,000 to an account in Turkey.
As AI technology advances rapidly, scammers are exploiting it for their own gain. The most frequent targets of such schemes are elderly individuals and people with limited technological proficiency, and romance-based scams have emerged as a prevalent tactic among perpetrators.
Michaela has no crypto positions and does not hold any crypto assets. This article is provided for informational purposes only and should not be construed as financial advice. The Shib Magazine and The Shib Daily are the official media and publications of the Shiba Inu cryptocurrency project. Readers are encouraged to conduct their own research and consult with a qualified financial adviser before making any investment decisions.