Author ORCID Identifier

https://orcid.org/0009-0004-0193-577X

Semester

Spring

Date of Graduation

2025

Document Type

Thesis

Degree Type

MS

College

Statler College of Engineering and Mineral Resources

Department

Lane Department of Computer Science and Electrical Engineering

Committee Chair

Nima Karimian

Committee Co-Chair

Jeremy Dawson

Committee Member

Matthew Valenti

Abstract

Biometric authentication has become a key part of our everyday lives—from unlocking smartphones with a fingerprint or face to verifying identities in banks and airports. These systems rely on our unique physical or behavioral traits, making them both convenient and secure. Unlike passwords, biometrics cannot be forgotten or stolen in the traditional sense. However, they are not without risk. One of the biggest concerns is spoofing: attempts by attackers to fool systems using fake biometric traits, such as silicone fingerprints or AI-generated videos.

As generative AI tools become more powerful and accessible, creating convincing fake biometric data is easier than ever. This raises serious concerns about the security and trustworthiness of biometric systems. Traditional anti-spoofing methods often struggle to keep up: they depend heavily on labeled data and manually designed features, and they often fail when facing new or unknown types of attacks.

To address these challenges, this research explores deep learning approaches that aim to make biometric systems smarter, more adaptable, and more secure. The focus is on two areas where spoofing is especially dangerous: fingerprint presentation attack detection (PAD) and face liveness detection using a physiological signal called remote photoplethysmography (rPPG).

For fingerprint spoof detection, this study introduces both supervised and unsupervised deep learning models. The unsupervised approach learns only from real fingerprints and detects anything unusual that might indicate a spoof, eliminating the need to collect spoof data. This is especially valuable when new attack types appear. The supervised method goes a step further, using CNNs, attention modules, and Transformers to identify fine-grained patterns that distinguish real from fake fingerprints. Both approaches achieve high accuracy on benchmark datasets and help reduce the time and effort needed for data collection.
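
As a rough illustration of the unsupervised idea, the sketch below trains a small convolutional autoencoder on live fingerprints only and flags samples whose reconstruction error is far from the live distribution. The architecture, image size, and thresholding rule are illustrative assumptions, not the exact model developed in this thesis.

```python
# Anomaly-based fingerprint PAD sketch: train only on live (bona fide) images,
# then flag high-reconstruction-error samples as potential spoofs.
import torch
import torch.nn as nn

class FingerprintAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                      # 1x96x96 -> 64x12x12
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(                      # 64x12x12 -> 1x96x96
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def fit_threshold(model, live_loader, device, k=3.0):
    """Calibrate a spoof threshold from reconstruction errors on live data."""
    model.eval()
    errs = []
    with torch.no_grad():
        for x in live_loader:                 # assumed to yield 1x96x96 tensors
            x = x.to(device)
            errs.append(((model(x) - x) ** 2).flatten(1).mean(dim=1))
    errs = torch.cat(errs)
    return errs.mean() + k * errs.std()       # far from the live manifold => spoof

def is_spoof(model, x, threshold):
    with torch.no_grad():
        err = ((model(x) - x) ** 2).flatten(1).mean(dim=1)
    return err > threshold
```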

In the second part of the work, a novel framework for face liveness detection is proposed. Unlike traditional methods that rely on texture or appearance, this approach uses rPPG—a signal derived from tiny changes in facial skin color caused by blood flow. These changes are difficult to fake and provide strong evidence that a person is real and alive. The Swin-AUnet model was developed to reconstruct high-quality rPPG signals from facial videos. It combines the strengths of U-Net, Swin Transformers, and a GAN-based training strategy with multiple discriminators. Self-supervised learning helps the model learn without needing large labeled datasets, and temporal modeling improves its ability to work in real-world scenarios where lighting and movement vary.
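
For intuition about the signal being reconstructed, the sketch below extracts a raw rPPG trace from a face video in a conventional way: averaging the green channel over a facial skin region per frame and band-pass filtering to the pulse range. A model such as Swin-AUnet is described as reconstructing a much cleaner version of this kind of signal; the region-of-interest choice and filter settings here are simplifying assumptions.

```python
# Raw rPPG trace from video frames: spatial mean of the green channel over a
# skin region, detrended and band-passed to plausible heart rates.
import numpy as np
from scipy.signal import butter, filtfilt

def raw_rppg(frames, roi, fps=30.0, low=0.7, high=4.0):
    """frames: (T, H, W, 3) RGB video array; roi: (y0, y1, x0, x1) skin box."""
    y0, y1, x0, x1 = roi
    # Blood-volume changes absorb most strongly in green, so average that
    # channel over the skin region to get a 1-D trace of length T.
    trace = frames[:, y0:y1, x0:x1, 1].reshape(frames.shape[0], -1).mean(axis=1)
    trace = trace - trace.mean()
    # Band-pass to 0.7-4 Hz (roughly 42-240 beats per minute).
    b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
    return filtfilt(b, a, trace)
```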

Extensive testing on five public datasets—PURE, UBFC-rPPG, OBF, MR-NIRP, and MMSE-HR—shows that the proposed methods work well under diverse conditions. The reconstructed signals not only support accurate liveness detection but also open doors for remote health applications, such as monitoring heart rate and stress.
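
As a simple example of the health application mentioned above, heart rate can be read off a reconstructed rPPG signal from its dominant spectral peak. The estimator below is a minimal FFT-based sketch under assumed parameters, not the evaluation pipeline used in the thesis.

```python
# Estimate heart rate (bpm) from a 1-D rPPG signal via its dominant
# frequency inside the physiological band.
import numpy as np

def heart_rate_bpm(rppg, fps=30.0, low=0.7, high=4.0):
    n = len(rppg)
    spectrum = np.abs(np.fft.rfft(rppg * np.hanning(n)))   # windowed spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fps)
    band = (freqs >= low) & (freqs <= high)                 # 42-240 bpm
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_freq
```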

By bridging deep learning with physiological signal understanding, this research takes an important step toward building more secure and reliable biometric systems. The proposed methods reduce reliance on large spoof datasets, adapt to evolving threats, and offer practical solutions for real-world security. As biometric technology becomes even more integrated into daily life, these contributions help ensure it remains trustworthy, ethical, and resilient.

Comments

This project was supported in part by the National Science Foundation under Grant No. 2104520 and by the NSF CAREER Award under Grant No. 2338981.
