The latest in face fraud has little to do with AI-generated deepfake videos, according to new research this week from Joseph Cox at 404 Media. Instead, it involves a clever combination of video editing and paying unsuspecting people to record their faces while holding blank pieces of paper up to the camera. Sites such as Fotodropy have sprung up that use real people (as shown here) as face models, moving their heads and eyes about at random over the course of the video.
This goes beyond the more simplistic methods of holding up a printed photograph or a 3D-printed mask of a subject, what was known as face spoofing. Those produced a static image, but many financial sites have moved to more sophisticated detection methods that require a video to show someone is an actual human. These methods are called document liveness checks, and they are increasingly employed as part of know-your-customer (KYC) routines to catch fraudsters.
The goal is to open a new account not with your actual face but with one that is under the hacker's control. Once the account is vetted, it can then be used in various scams, with a “verified” ID that lends the whole scheme more believability.
Back in the pre-digital days, KYC often meant that a potential customer would have to pay an in-person visit to their local bank or other place of business, and hand over their ID card. A human employee would then verify that the ID matched the person’s face and other details. That seems so quaint now.
Liveness detection does more than have a model mug for the camera: it requires a customer to follow stage directions (look up, look to your left) in near-real time. This avoids any in-person verification and shifts the focus from physical ID checks to digital methods. Of course, these methods are subject to all sorts of attacks, just like anything else that operates across the internet.
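The challenge-response idea behind these stage directions can be sketched in a few lines of code. This is a hypothetical illustration, not any vendor's actual implementation: the function names, the direction list, and the latency threshold are all assumptions, and a real system would derive the detected head poses from computer-vision landmark tracking rather than accept them as inputs.

```python
import random

# Possible stage directions the server can prompt (illustrative list)
DIRECTIONS = ["look up", "look down", "look left", "look right"]

def issue_challenge(n=3, rng=None):
    """Server side: pick a random sequence of stage directions."""
    rng = rng or random.Random()
    return [rng.choice(DIRECTIONS) for _ in range(n)]

def verify_responses(challenge, responses, max_latency=2.0):
    """Check that the detected head poses match the challenge, in order,
    and that each response arrived within max_latency seconds of its prompt.
    `responses` is a list of (detected_pose, latency_seconds) pairs, which
    in a real system would come from a face-tracking pipeline."""
    if len(responses) != len(challenge):
        return False
    for expected, (detected, latency) in zip(challenge, responses):
        if detected != expected or latency > max_latency:
            return False
    return True
```

Because the prompt sequence is random and must be answered quickly, a pre-recorded clip of a paid face model cannot anticipate it; the fraud described above works by editing the video response after the fact rather than defeating the randomness itself.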
Several vendors offer these digital liveness detection tools, including Accurascan, ShuftiPro, IDnow.IO and Sensity.AI, just to name a few that I found. Some of these tools can measure blood flow across your face and capture other live biometric data. This post from IDnow goes into more detail about the ways facial recognition has been defeated in the past. It is definitely a cat-and-mouse game: as the defenders come up with new tools, the fraudsters come up with more sophisticated ways around them. “This had led to growing research work on machine learning techniques to solve anti-spoofing and liveness checks,” they wrote in their post.
The one fly in the ointment with these liveness routines is that to be truly effective, they have to distinguish between real and fake ID documents. This isn't all that different from the in-person KYC verification process, but if a fraudster pastes a fake driver's license or passport into the video, the detection system may not have coverage of that particular document. When you consider that there are nearly 200 countries with their own passports, and that each country has dozens if not hundreds of other potential ID documents, that is a lot of training data needed to teach these recognition systems properly.
Note that the liveness spoofing methods are different from deepfake videos, which basically attach someone’s face to a video of someone else’s body. They are also a proprietary and parallel path to the EU’s Digital Wallet Consortium, which attempts to standardize on a set of cross-border digital IDs for its citizenry.
Thanks David, I didn’t know about this. Related topic – I’ve deleted my phone msg greeting so nobody can capture my voice. Rarely, I answer a call from an unknown number if I’m expecting a call when the person may be using a different phone. Recently my neighbour called from her co-worker’s phone and on a hunch, I picked up. I said “Hellooo?” in the creakiest, most awful, distorted voice you’d never want to hear. We quickly identified one another so the call proceeded normally – except for our peals of laughter. At first, my neighbour wondered if I’d been shot. Months later we are still laughing about the voice I used!