Recruiter Interviews ‘Creepy’ Deepfake Job Candidate Faking Their Identity On A Video Call

This extremely unnerving trend seems to have grown exponentially in recent years.

Recruiter interviewing a creepy deepfake job candidate | Antonio Guillem / Shutterstock / Canva Pro

As AI technology advances, it is becoming ever more difficult to judge what's real and what's not, especially where so-called "deepfake" videos are concerned. What was once used to fake celebrity endorsements or malign politicians has moved into the hands of everyday people, and it's resulting in a new kind of fraud in an unlikely space: job interviews.

A recruiter shared footage of the creepy deepfake job candidate she interviewed.

AI technology has been making a mess of recruiting and hiring for years now. Artificial intelligence has been shown to be remarkably poor at deciphering the nuances of resumes, for example, trashing tons of applications for often ridiculous reasons. Even worse, it has repeatedly been shown to worsen discrimination in hiring.

[Embedded TikTok from @yourtango: A job applicant says they received an AI rejection letter for having an 'unprofessional' birthday]

But now the problems seem to have gone to a whole new level by making job interview fraud as easy as pie. Argentina-based tech recruiter Bettina Liporazzi recently took to LinkedIn to share her unnerving experience with this phenomenon that instantly went viral.


"I’d seen stories about candidates faking their identity with AI and figured it might happen to me one day," Liporazzi wrote in her post. "Well, that day was today, only a few minutes ago." She then shared a deeply unsettling video of the candidate, who looked barely human.

Whoever did this wasn't working with the best AI tools, that's for sure. The candidate's eyes seemed to be stuck in a permanent half-closed side glance, and he appeared unable to move his head side-to-side. It's like looking at a cyborg or something — one firmly positioned in the uncanny valley.

RELATED: Woman Says Brand Used An AI Deepfake Of Her To Promote Their Product Without Her Consent

When Liporazzi asked the deepfake candidate to wave a hand in front of his face, he abruptly ended the call.

Since Liporazzi was waiting for the day this would happen, she knew how to respond — first, by asking the candidate to turn on his camera. She wrote that this candidate said his camera was broken and only turned it on when she insisted he do so.


A series of questions, like asking the candidate to remove their filter or background or turn their head side to side, is a way to further suss out the fakery. So is what Liporazzi asked her candidate to do — wave a hand in front of his face.

In the video, the candidate seems confused by the request, asking if there was a connectivity issue and whether he should log out of the meeting and rejoin. "You don't need to join again," Liporazzi replied, then repeated her instructions. The candidate then abruptly ended the call.

"I don't know this person/group's intentions, but I doubt they are good," she went on to say in her post. "Many innocent people out there could fall for this because they have no idea that it is even remotely possible (both in and out of the workplace)."

RELATED: Recruiter Offers Interview To Job Applicant They Originally Called To Reject Before Even Looking At Their Resume Because Their AI System Auto-Rejected It


Deepfake job interview fraud is rapidly growing, both to cheat on interviews and as a form of cybercrime.

Cybersecurity firm Regula Forensics has been tracking the rise of both audio and video deepfake scams for a handful of years now. Its 2024 analysis showed a startling trend: 49% of companies surveyed had fallen prey to a deepfake video scam, up from just 29% in 2022.

Experts say deepfake job interviews are part of a new trend in which candidates try to game the ever more cutthroat job market by hiring people to do job interviews for them and using deepfakes to obscure their identity. Or, interviewees hide behind the deepfake avatar while using Google or tools like ChatGPT to formulate answers to interview questions.

It's easy to chalk this all up to how increasingly impossible the job search process is becoming — and to lay the blame on recruiters like Liporazzi, many of whom themselves use AI tools that make the process even more difficult for workers.


But deepfake job candidate scams go further than just cheating. Much like the way audio deepfakes are being used for increasingly sophisticated telephone scams, criminals are using video deepfakes to obtain jobs that then give them access to companies' sensitive information. Just last fall, a company was hacked this way by a North Korean cybercriminal, who collected a salary for four full months before the crime was discovered.

There's tons of talk about AI being "revolutionary," but the longer it's part of our world, the more it seems like it's not remotely ready for prime time. Here's hoping it's not too late to keep these scams from doing something truly catastrophic, because that seems like only a matter of time.

RELATED: Career Expert Shares 3 Tips To Make Sure A Human Sees Your Job Application In A World Of AI Auto-Rejections


John Sundholm is a writer, editor, and video personality with 20 years of experience in media and entertainment. He covers culture, mental health, and human interest topics.
