
Trust that only real, live people are verified with Incode’s advanced liveness detection technology.
The person behind the camera could be a fake
They could be manipulating your systems to appear as someone else, or they might not be a live person at all.
Deepfake and digital attacks
Digital attacks attempt to bypass identity verification software using digital manipulation, including AI-generated deepfakes and other advanced manipulation techniques.
Face swapping
Replacing the face in a target image with a face from another source.
Face Morph
Combining two faces to create an entirely new, blended face.
2D synthetic assets
AI-generated images of people who don’t exist – all facial features are entirely synthetically generated, unlike face morphs that blend real people’s facial features to generate a new face.
Face reenactment
Recreating characteristic movements and facial expressions to make a deepfake more convincing.
Video replays
Footage is replayed on a high-resolution screen in front of the camera to simulate a live individual, making it difficult to distinguish a recording from a live person.
Physical presentation attacks
Counterfeit visual data that deceives camera-based verification systems. Physical presentation attacks involve manipulating visual data with high-quality screens or printed images to trick camera-based verification systems.
2D masks
Flat, printed masks are used to mimic someone’s face and deceive verification systems, but lack the depth and detail of a real human face.
3D masks
Lifelike, three-dimensional masks made of materials like silicone are crafted to closely resemble a real person, making them harder to detect.
Paper printouts
Printed photos of a face are presented to the camera, attempting to pass off a static image as a live individual.
Cardboard cutouts
Printed images mounted on cardboard provide a sturdier, more rigid appearance in an effort to fool systems.
Video replays
Footage is replayed on a high-resolution screen in front of the camera to simulate a live individual, making it difficult to distinguish a recording from a live person.
Evasion attacks
Facial modifications aimed at deceiving recognition systems. Evasion or obfuscation attacks occur when individuals alter their appearance to make it harder for facial recognition systems to verify their biometric features.
Exaggerated expressions
Overly dramatic facial movements, like extreme smiles or wide eyes, are used to distort the face and confuse recognition systems.
Occluding objects
Items like hats, glasses, or scarves are used to block parts of the face, preventing systems from capturing a clear image.
Heavy makeup
Complex makeup techniques are applied to change the appearance of key facial features, making it harder for systems to recognize the individual.
Why choose Incode’s liveness detection?
Discover how Incode Deepsight’s advanced liveness technology tackles fraud without impacting your user experience.
Full ownership of our ML models and tech stack
Unlike other providers who rely on third-party vendors, we build and own our entire technology stack. Our AI/ML models, powered by deep learning and designed for identity verification, allow us to train on the latest document and biometric fraud vectors. This full control enables us to tailor our models to the unique needs of our clients, ensuring superior performance and flexibility.
Rich, well-organized data to train models
Using advanced neural networks, including both standard Convolutional Neural Networks (CNNs) and cutting-edge Large Vision Models (LVMs) and transformers, we train our models to achieve state-of-the-art results across various tasks. Over nearly a decade, we’ve curated large, statistically representative datasets, so our models deliver balanced performance across variables like age, skin tone, and gender. Our in-house Fraud Lab has compiled over 1 million unique presentation attacks, from basic printouts to advanced 3D masks. We also generate synthetic data such as face swaps and synthetic faces, enhancing the robustness of our models.
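Balanced performance across demographic attributes starts with how the training data is sampled. The sketch below illustrates one common, generic approach — stratified sampling so each group contributes equally — and is an illustrative assumption, not a description of Incode’s actual data pipeline (the attribute names and group sizes are made up):

```python
import random
from collections import defaultdict

def stratified_sample(records, key, per_group, seed=0):
    """Draw an equal number of samples from each group of `key`
    so the training set is balanced across that attribute."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec)
    balanced = []
    for group, items in groups.items():
        if len(items) < per_group:
            raise ValueError(f"group {group!r} has only {len(items)} samples")
        balanced.extend(rng.sample(items, per_group))
    return balanced

# Toy dataset: liveness samples tagged with a hypothetical age-band attribute.
data = (
    [{"id": i, "age_band": "18-30"} for i in range(50)]
    + [{"id": i, "age_band": "31-50"} for i in range(50, 80)]
    + [{"id": i, "age_band": "51+"} for i in range(80, 100)]
)
balanced = stratified_sample(data, key="age_band", per_group=20)
print(len(balanced))  # 60: 20 samples from each of the 3 age bands
```

The same idea extends to any attribute (skin tone, gender, capture device) and keeps a model from overfitting to whichever group dominates the raw data.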
Internal testing for flawless detection
Our internal testing environment is designed to be more challenging than real-world spoof attempts. By testing our models against complex attacks, we ensure perfect detection rates in production. This way, we can prevent known fraud rings, repeat verification attempts, and other fraudulent behaviors before they impact your business.
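Testing a liveness model against a library of attacks is typically summarized with the standard presentation-attack-detection error rates from ISO/IEC 30107-3: APCER (attacks wrongly accepted as live) and BPCER (genuine users wrongly rejected). This minimal sketch computes both from labeled decisions; the data is synthetic and purely illustrative:

```python
def pad_error_rates(results):
    """Compute ISO/IEC 30107-3 style PAD metrics:
    APCER = fraction of attack presentations classified as live,
    BPCER = fraction of bona fide presentations classified as spoof."""
    attacks = [r for r in results if r["label"] == "attack"]
    bona_fide = [r for r in results if r["label"] == "bona_fide"]
    apcer = sum(r["decision"] == "live" for r in attacks) / len(attacks)
    bpcer = sum(r["decision"] == "spoof" for r in bona_fide) / len(bona_fide)
    return apcer, bpcer

# Hypothetical test run: 100 attack presentations, 100 bona fide presentations.
results = (
    [{"label": "attack", "decision": "spoof"}] * 98
    + [{"label": "attack", "decision": "live"}] * 2
    + [{"label": "bona_fide", "decision": "live"}] * 99
    + [{"label": "bona_fide", "decision": "spoof"}] * 1
)
apcer, bpcer = pad_error_rates(results)
print(apcer, bpcer)  # 0.02 0.01
```

Tracking both rates matters: driving APCER to zero is easy if you reject everyone, so a testing environment has to hold BPCER low at the same time.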
Leveraging multiple input modalities
Incode’s advanced liveness detection incorporates signals from multiple modalities, such as depth and motion, and analyzes multiple frames, compounding its accuracy.
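Combining modalities across frames can be illustrated with a simple score-fusion sketch: average each modality’s per-frame liveness score, then take a weighted sum. The modality names and weights here are illustrative assumptions, not Incode’s actual fusion logic:

```python
def fuse_liveness_scores(frames, weights):
    """Average per-modality liveness scores over all frames,
    then combine modalities with a normalized weighted sum."""
    modalities = weights.keys()
    avg = {m: sum(f[m] for f in frames) / len(frames) for m in modalities}
    total = sum(weights.values())
    return sum(weights[m] * avg[m] for m in modalities) / total

# Two frames, each scored 0..1 per modality (higher = more likely live).
frames = [
    {"texture": 0.90, "depth": 0.80, "motion": 0.70},
    {"texture": 0.95, "depth": 0.85, "motion": 0.75},
]
weights = {"texture": 0.5, "depth": 0.3, "motion": 0.2}
score = fuse_liveness_scores(frames, weights)
print(round(score, 4))  # 0.855
```

The benefit of multi-modal fusion is that an attack must fool every signal at once: a screen replay may have convincing texture but fails on depth, while a 3D mask may have depth but unnatural motion.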