AI Deepfake Scams in India: New Tricks to Watch in 2025

AI deepfake scams in India are exploding as criminals use fake videos, AI-edited images, voice cloning and face swaps to steal money, blackmail victims and spread misinformation. From investment schemes promoted by deepfake “ministers” to obscene AI-morphed photos used for sextortion, these attacks are becoming more realistic and harder to detect.(www.ndtv.com)

In this guide, we’ll break down the newest AI image and video scams in India, real case studies, red flags to watch for, and practical steps to keep your money, identity and reputation safe.


1. Deepfake Investment Scams Using Ministers, CEOs & Gurus

One of the fastest-growing AI deepfake scams in India is fake investment advice using morphed videos of well-known personalities:

  • Finance and RBI officials
  • Billionaires and business leaders
  • Spiritual gurus and influencers

In 2025, a Roorkee resident reportedly lost ₹66 lakh after trusting an AI-generated video of a senior minister promoting a “high-return” crypto investment app.(The Times of India) Similar cases used deepfake videos of spiritual leader Jaggi Vasudev (Sadhguru) to push fraudulent schemes, costing a Bengaluru woman over ₹3.7 crore.(Scroll.in)

The RBI has formally warned citizens about deepfake videos of top officials promoting fake investment schemes and clarified that it never endorses such products.(India Today)

Red flags

  • “Official-looking” videos promising guaranteed or extremely high returns.
  • Investment links shared only via WhatsApp, Telegram or random social media pages.
  • Pressure to invest quickly before a “deadline” or “exclusive window”.

2. Voice Cloning Scams: Fake Relatives, Bosses and Bank Calls

AI tools can now clone a person’s voice using just a few seconds of audio from social media. Banks and telecom companies in India have warned that criminals are using AI voice cloning to impersonate relatives, bank staff or senior company executives to demand urgent money transfers.(www.ndtv.com)

Recent examples include:

  • A victim in Indore who lost about ₹1.83 lakh after fraudsters used an AI-generated voice of his brother-in-law from Australia, claiming a visa emergency.(The Times of India)
  • Cases in Chennai where cloned voices of family members were used to demand instant UPI transfers.(The New Indian Express)

Red flags

  • A “relative” or “boss” calling from an unknown number, asking for money urgently.
  • Callers insisting on UPI transfers, gift cards or crypto instead of regular banking channels.
  • Refusal to switch to video call or let you call back on their usual number.

3. AI-Morphed Obscene Photos & Deepfake Sextortion

Another disturbing trend in AI image scams in India is the use of morphed or deepfake obscene images and videos for blackmail.

Police in multiple states have reported cases where scammers download a person’s social media photos, use AI tools to create fake nude or obscene visuals, and then threaten to leak them unless money is paid.(The Times of India)

Tragically, there have been incidents where young victims died by suicide after being blackmailed with AI-generated obscene images of themselves or family members.(Hindustan Times)

How this scam usually works

  1. Scammer downloads your or your family’s photos from Instagram, Facebook or WhatsApp.
  2. They create fake intimate content using AI face-swap and image generation tools.
  3. They send you a sample and threaten to post it publicly or send it to relatives.
  4. They demand money (often ₹10,000–₹50,000, sometimes much more) to “delete” the content.

Red flags

  • Random accounts sending your own photo with morphed obscene visuals.
  • Threats to “make the video viral” or “tell your parents/boss” unless you pay.
  • Fake profiles pretending to be “cyber police” but asking for settlement money.

4. AI-Driven Romance, Dating & Video Call Traps

Romance and dating scams are not new, but AI image and video tools are making them more convincing:

  • Scammers use AI-generated profile photos or face-swapped images to appear more attractive or trustworthy.
  • On video calls, they may use filters or pre-recorded clips to hide their real identity.
  • After gaining trust, they may ask for intimate photos or videos, which are then used for sextortion and blackmail, or they may lure victims into fake investment or loan apps.

Police in several states have busted gangs that use obscene video calls (sometimes combined with AI manipulation) to record victims and then extort money by threatening to share the footage.(The Times of India)

Red flags

  • Profiles that look “too perfect” with only a few photos, all highly edited.
  • Immediate shifting from dating apps to WhatsApp, Telegram or private calls.
  • Quick pressure for explicit chats, photos or video calls.

5. Digital Arrest & Fake Authority Scams with Video

“Digital arrest” scams often start with a phone call pretending to be from police, CBI, TRAI or cyber-crime units. In some cases, fraudsters use video calls, official-looking backgrounds or AI-edited IDs to make the interaction look real.

An elderly woman in Mumbai was conned out of ₹1.6 crore in such a digital arrest scam, where fake officials claimed she was involved in money laundering and forced her to keep money in a “safe account” during the investigation.(The Times of India)

While not always deepfake-based, these scams are increasingly layering AI-generated documents, altered photos and fake video IDs to appear more official.

Red flags

  • Anyone on a video call claiming to be from RBI, police or CBI asking for money.
  • Threats of immediate arrest, freezing of accounts or criminal cases.
  • Requests to keep the call “secret” from family or local police.

6. AI-Based Job, Loan and Customer Support Scams

Fraudsters now combine AI-generated logos, images, fake chatbots and automated emails to impersonate:

  • Banks and NBFCs
  • Job portals and HR teams
  • Popular e-commerce platforms or courier services

They might send you:

  • A fake “video KYC” link with an AI avatar “bank officer”
  • AI-generated offer letters or approval letters
  • Morphed screenshots of supposed transactions or approvals

These scams usually end with the victim paying a “processing fee”, “security deposit” or “GST” that never gets refunded. While many of these scams use traditional phishing, AI is increasingly used to polish emails, generate realistic UIs and fake documents.(PwC)


7. Why AI Deepfake Scams Are Growing So Fast in India

Several factors are driving the rise of AI deepfake scams in India:

  • Cheap internet & mobile penetration – Almost everyone is online, including vulnerable first-time users.
  • Easy access to AI tools – Many face-swap and voice-cloning tools are free or very cheap.(www.ndtv.com)
  • Low awareness – Many people still believe “if it’s on video, it must be real”.
  • High digital payment usage – UPI and instant transfers make it easy to move money within seconds.(The New Indian Express)

Even regulators and big tech companies like Google have issued charters and advisories specifically warning Indians about deepfake and AI-powered scams.(blog.google)


8. How to Protect Yourself from AI Image & Deepfake Scams

A. Always Verify Through a Second Channel

  • If a relative or boss calls for money, hang up and call back on their usual number.
  • For investment or loan offers, visit the official website or branch; don't trust links shared on social media.

B. Treat Every “Too Good to Be True” Video as Suspicious

  • Do not trust investment tips just because they appear in a video of a famous person.
  • Check the official YouTube channel, website, or news outlets to see if the scheme is genuine.
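As a small illustration of this "verify before trusting" habit, a shared link's hostname can be checked against a short allowlist of domains you already know to be official. The domains below are examples only, not an authoritative list; in practice you would build the allowlist yourself from sources you trust:

```python
from urllib.parse import urlparse

# Example allowlist -- illustrative, not exhaustive or authoritative.
OFFICIAL_DOMAINS = {"rbi.org.in", "sebi.gov.in", "cybercrime.gov.in"}

def looks_official(url: str) -> bool:
    """Return True only if the URL's host is an allowlisted domain
    (or a subdomain of one). Anything else is treated as suspect."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

print(looks_official("https://www.rbi.org.in/notifications"))  # True
print(looks_official("https://rbi-invest-now.example.com"))    # False
```

Note the second example: scammers often register lookalike domains containing a real institution's name, which is why matching on the full registered domain (not just the word "rbi" appearing somewhere in the link) matters.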

C. Lock Down Your Photos & Profiles

  • Keep social media accounts private where possible.
  • Avoid posting high-resolution portrait photos publicly, especially of children.
  • Report fake accounts that use your photos immediately.

D. Never Pay to “Delete” Obscene Content

  • Paying once rarely ends the blackmail; it usually leads to more demands.
  • Instead, immediately preserve evidence (screenshots, URLs, transaction details), report the content at www.cybercrime.gov.in or call the 1930 helpline, and block the sender.

E. Use Basic Cyber-Hygiene

  • Enable two-factor authentication (2FA) on social media and banking apps.
  • Update your phone and apps regularly.
  • Avoid installing random APKs or apps promoted only via social media reels.

9. Legal Remedies for AI Deepfake & Image Abuse in India

India’s existing laws cover many deepfake and AI misuse scenarios, even though the word “deepfake” may not be explicitly used:

  • IT Act, 2000 & IT Rules – For publishing or transmitting obscene or defamatory content.
  • IPC sections (now carried over into the Bharatiya Nyaya Sanhita, 2023) – For cheating, extortion, criminal intimidation and outraging the modesty of a woman.(Best Cyber Crime Lawyer)
  • Cyber police stations – Every state now has dedicated cyber cells that handle such complaints.

Victims should:

  1. Preserve all evidence (screenshots, links, transaction IDs).
  2. File a complaint at the nearest police station or cyber cell.
  3. Report fake content to platforms (Instagram, Facebook, YouTube, etc.) for takedown.

10. FAQs on AI Deepfake Scams in India

1. Are AI deepfake scams in India only about money?

No. Many scams involve reputation damage, harassment and blackmail, especially using AI-morphed obscene images and sextortion.

2. How can I tell if a video is a deepfake?

Look for unnatural blinking, odd lighting, mismatched lip-sync, strange hand or body movement and robotic speech. But remember—some deepfakes are extremely realistic, so verification through trusted sources is crucial.

3. What should I do if my face is used in a fake video or image?

Immediately:

  • Save proof (screenshots, URLs).
  • Report the content to the platform for removal.
  • File a complaint via www.cybercrime.gov.in or call 1930.
  • Inform your family or trusted contacts so they don’t fall for blackmail.

4. Can banks or RBI officials contact me on WhatsApp with investment tips?

No. RBI has clearly stated that it does not endorse investment schemes or give financial advice via deepfake videos or private messages.(Business Today)


Final Thoughts: Stay Skeptical, Stay Secure

AI deepfake scams in India will only get more sophisticated from here. Videos, voices and images can all be faked—but your habit of verifying before trusting is your strongest defence.

  • Don’t trust any video, voice or image just because it “looks real”.
  • Double-check every money request.
  • Educate your parents, children and less tech-savvy relatives.

The goal is not to fear technology, but to use it wisely. With awareness, verification and strong digital habits, you can enjoy the benefits of AI while staying safe from the growing wave of AI-powered fraud.
