Presence peril: You just hired a deepfake. Get ready for the rise of imposter employees.

[As the excellent technology writer Mike Elgan reports in this story from Protocol, both video and audio deepfake technologies are already commonly used by a variety of fraudsters in remote hiring and will represent an increasingly serious threat as the ability to fool employers with deepfakes improves. –Matthew]

[Image: Credit: z_wei/iStock/Getty Images Plus; Protocol]

You just hired a deepfake. Get ready for the rise of imposter employees.

New technology — plus the pandemic remote work trend — is helping fraudsters use someone else’s identity to get a job.

By Mike Elgan, a journalist, opinion columnist and author
August 22, 2022

Before COVID-19, job interviews took place in person and new hires worked in the office, for the most part.

But with remote work came an increase in remote hiring, from the job application to onboarding and everything in between. Many employees have never been in the same room as their employers and co-workers, and that has opened the door for a rise in imposter employees.

The FBI is concerned; you should be too.

Lies, spies and deepfake video

Companies have been increasingly complaining to the FBI about prospective employees using real-time deepfake video and deepfake audio for remote interviews, along with personally identifiable information (PII), to land jobs at American companies.

One likely source of that PII is fake job postings: by advertising openings that don't exist, fraudsters can harvest job candidates' information, resumes and more, according to the FBI.

Deepfake video sounds advanced. But shady job candidates don’t need exotic or expensive hardware or software to impersonate someone on a live video call — only a photo of the fake person. Consumer products like Xpression Camera enable fraudsters to upload someone’s picture and use their face during a live video interview.

The FBI points out that such deepfake video calls often fail, as the “actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking.”

In other words, dishonest job applicants would like to take advantage of deepfake technology for remote hiring, but the technology isn't quite there yet. Soon, however, deepfake audio and video will be good enough to pass for the real thing.

And it’s not just deepfake video: You can clone someone’s voice with just a short audio sample and a publicly available tool on GitHub. It’s unlikely that a cybercriminal would land a job using a deepfake audio clone alone, but attackers can (and do) use cloned human voices for workplace phishing attacks.

What imposters want

The main drivers appear to be money, espionage, access to company systems and unearned career advancement.

Many of the job openings sought by these imposters include “information technology and computer programming, database, and software related job functions. Notably, some reported positions include access to customer PII, financial data, corporate IT databases and/or proprietary information,” according to an alert posted June 28 by the FBI’s Internet Crime Complaint Center. The perfect jobs for spies.

Some imposter candidates actually work for the North Korean government, according to a statement by the FBI and the U.S. State and Treasury Departments. Because of U.S. sanctions, North Koreans are ineligible for jobs at American companies. (Companies that employ North Koreans can be fined roughly $330,000 per violation.) So the North Korean government has workers apply and work as imposters in exchange for most of their salaries, or North Korean spies get jobs under false identities in order to steal secrets. Some North Koreans have used their real identities but claimed to be located outside North Korea.

The problem of imposter employees exists on a spectrum: from exaggerating experience, to lying about credentials and personal details, to claiming to be an entirely different person. And every facet is growing in scale.

Glider AI’s “The Future of Candidate Evaluation” report found that what it calls “candidate fraud” has nearly doubled, a 92% increase, since before the pandemic.

In addition to the imposter employee frauds already reported, it’s easy to imagine other scams that take advantage of new technology and remote work.

Malicious cyberattackers could get hired under stolen credentials in order to gain unauthorized access to sensitive data or systems inside companies. A skilled hacker may actually have the IT skills to get the job, and doing so may prove to be a relatively easy act of social engineering.

The bottom line is that our old habits for verifying employees — namely, interacting with them and recognizing who they are — are increasingly unreliable in the face of remote work and new technology that enables people to fake their appearance, voice and identity.

How to avoid hiring imposters

Remote work is here to stay, and it’s time to revisit and revamp how we hire. Here are some tips to bear in mind.

  • Include real identity verification before hiring, and make sure identity matches background screening. (Don’t assume your background provider is verifying identity.)
  • Asking for a driver’s license or passport can lead to a discrimination lawsuit if the candidate isn’t hired — they can claim discrimination based on age, health or country of birth. Request this information only after you’re certain you’ll hire.
  • Know the law in the state you’re in to find out what’s allowed in terms of biometric data collection.
  • If you’re doing background checks and identity verification on remote hires, do the same for in-office hires to avoid discrimination.
  • Consider abandoning all-remote hiring in favor of in-person interviews, even for remote staff. And bring in remote staff for in-house team building quarterly or annually.
  • Rely more on skills assessment and testing for technical positions rather than resume-based claims of experience, certifications and education. Verify identities at the point of testing and follow up on test results with a post-test interview. Imposters are likely to seek employment elsewhere if they have to prove their qualifications.
  • Take extra care with the hiring of IT people and others who will gain access to email systems, passwords, business secrets, physical security systems and other juicy targets for cyberattack. Do thorough background checks and criminal records checks and verify identity throughout the hiring and onboarding process.
  • Embrace AI fraud detection to evaluate resumes and job candidates. Fraud detection has been used for years in banking, insurance and other fields, and is slowly being applied to hiring.

The new world of remote work calls for a new approach to hiring. It’s time to rethink your HR practices to make sure the people you’re hiring and employing are who they say they are — and not imposters.
