Deepfake videos and camera injection attacks can enable fraudsters to fake their identity, says a biometric authentication expert.
Biometric authentication is gaining momentum as organizations adopt it for faster, smoother login flows. However, Stuart Wells, CTO of biometric authentication company Jumio, highlights the threats and methods fraudsters might use to bypass facial recognition.
Europol has forecast that by 2026 up to 90 percent of online content could be artificially generated, making it increasingly difficult for organizations to verify users' true identities.
In a post on Biometric Update, Wells writes that fraudsters can use a technique called “camera injection” to introduce deepfake videos into the system and trick biometric and liveness detection tools.
Camera injection occurs when a fraudster bypasses a camera's charge-coupled device (CCD) to inject pre-recorded content, a real-time face swap video stream, or content completely fabricated using deepfake technology.
The main danger of camera injection is that the attack can go entirely unnoticed by the victim. If malicious actors successfully bypass verification, they can cause substantial damage by stealing identities, creating fake accounts, or executing fraudulent transactions.
According to Wells, the key to defending against such fraud is twofold: detecting compromised camera device drivers and spotting manipulation through forensic examination of the video stream.
Fake videos can often be detected by comparing natural human motion against the motion in the captured footage. Cues such as eye movements, shifting facial expressions, and a typical blinking rhythm occur organically.
When these movements are absent, it strongly suggests that a video sequence may be fake. Unexplained changes in parameters such as ISO, aperture, frame rate, resolution, or light and color intensity can also give fraud away.
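As an illustration of the blink-rhythm cue, here is a minimal Python sketch. It assumes per-frame eye-aspect-ratio (EAR) values have already been extracted by a facial-landmark detector (not shown), and the threshold and timing values are illustrative assumptions, not anything Wells or Jumio has published.

```python
# Toy liveness cue: flag a video as suspicious when no blink-like dips
# appear in the eye-aspect-ratio (EAR) signal. EAR drops sharply when
# the eye closes, so a long, flat EAR series suggests injected footage.

BLINK_THRESHOLD = 0.21   # EAR below this usually indicates a closed eye (assumed)
MAX_BLINK_GAP_S = 10.0   # people rarely go this long without blinking (assumed)

def blinks_present(ear_series, fps):
    """Return True if every MAX_BLINK_GAP_S window of the EAR time
    series contains at least one blink-like dip."""
    window = int(MAX_BLINK_GAP_S * fps)
    for start in range(0, max(1, len(ear_series) - window + 1), window):
        chunk = ear_series[start:start + window]
        if min(chunk) >= BLINK_THRESHOLD:   # no dip -> no blink in window
            return False
    return True

fps = 30
# A natural capture shows at least one EAR dip (a blink):
natural = [0.30] * 200 + [0.15] * 5 + [0.30] * 95
# A pre-recorded or synthetic stream often shows a flat, blink-free signal:
synthetic = [0.30] * 300

print(blinks_present(natural, fps))    # True
print(blinks_present(synthetic, fps))  # False
```

A production system would combine this with the other cues the article mentions rather than rely on blinking alone, since some deepfake generators now synthesize blinks.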
A built-in accelerometer, which senses axis-based motion, could also be used to track how objects in the recorded video move and to determine whether the camera feed has fallen prey to a hacker.
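The accelerometer cross-check could be sketched as follows: if the device is physically moving but the video shows no corresponding motion, the stream may not be coming from the real camera. The per-frame motion magnitudes and the correlation threshold here are hypothetical inputs chosen for illustration.

```python
# Toy consistency check between accelerometer motion and apparent
# motion in the video. A genuine handheld capture shows correlated
# signals; an injected stream does not react to the device moving.

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def motion_consistent(accel_mag, frame_motion, min_corr=0.5):
    """True when device motion and on-screen motion move together."""
    return pearson(accel_mag, frame_motion) >= min_corr

# Genuine capture: handheld shake shows up in both signals.
accel = [0.1, 0.8, 0.2, 0.9, 0.1, 0.7]
video = [0.2, 0.9, 0.3, 1.0, 0.2, 0.8]
print(motion_consistent(accel, video))     # True

# Injected stream: the phone moves, but the video stays static.
injected = [0.05, 0.04, 0.06, 0.05, 0.04, 0.05]
print(motion_consistent(accel, injected))  # False
```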
Forensic analysis of individual video frames can also reveal signs of manipulation, such as double-compressed parts of the image or traces of computer-generated deepfake imagery.