By Marlon Dale Ferreira
A chilling new scam is spreading rapidly, using facial recognition, voice capture and fingerprints to clone identities, leaving victims facing drained bank accounts, fake loans and legal nightmares. Authorities warn the public to stay alert as fraudsters exploit artificial intelligence and biometric technology in increasingly sophisticated ways.
The world of online fraud has entered a dangerous new phase.
After years of phishing emails, fake investment schemes and romance cons, scammers are now turning to something far more personal: your face, your voice and your fingerprints.
Cybersecurity experts worldwide have warned of a rise in AI-powered identity theft, where criminals use stolen biometric data to create highly convincing fake profiles. With the rapid advancement of artificial intelligence, fraudsters can now generate deepfake videos, clone voices from short audio clips and bypass identity verification systems that rely on facial recognition.
The latest scam circulating locally follows a deceptively simple pattern.
A scammer, often posing as an elderly man or woman, may approach an unsuspecting individual in public and ask for help with a mobile phone. They may claim they need assistance making a call, checking a message or navigating an app. The request appears harmless.
But the moment you take the phone and look at the screen, the front camera may silently capture your facial data. As you speak, your voice may be recorded. When you handle the device, your fingerprints may be left behind.
Security analysts warn that criminals can combine facial imagery, voice samples and other biometric traces with AI tools to construct digital replicas of victims. These digital clones can then be used to:
• Create fake social media or messaging profiles
• Attempt to bypass facial recognition banking apps
• Apply for online loans in the victim’s name
• Conduct social engineering scams targeting the victim’s contacts
• Impersonate the victim in video calls using deepfake software
International fraud monitoring agencies have reported a global spike in biometric-enabled scams. Financial institutions across Asia, Europe and North America have warned customers about criminals exploiting remote identity verification systems, particularly where banks rely on “selfie verification” and voice authentication.
In some documented cases overseas, criminals have successfully taken out online loans using stolen biometric credentials. Victims only discover the fraud when debt collectors begin contacting them.
Cybercrime specialists emphasize that while biometric systems are generally secure, they are not foolproof when manipulated with AI-enhanced deception techniques.
The public is urged to exercise caution in unfamiliar situations involving smartphones. Experts advise:
• Avoid handing your phone to strangers, and avoid touching theirs
• Be cautious when handling unfamiliar devices
• Do not speak sensitive information near unknown recording devices
• Enable multi-factor authentication on banking apps
• Regularly monitor bank statements for suspicious activity
Authorities also recommend reporting suspicious encounters to local police or cybercrime units.
The rise of artificial intelligence has brought remarkable innovation, but it has also armed criminals with powerful new tools. What once required hacking expertise can now be done with widely available AI software.
In an age where your face unlocks your phone and your voice confirms your identity, protecting your biometric data is no longer optional; it is essential.
Share this warning widely. Inform your parents, grandparents, friends and colleagues.
Because in this new wave of scams, the fraudster does not need your password.
They only need you.
