SaTC: CORE: Small: Probing Fairness of Ocular Biometrics Methods Across Demographic Variations

Information

  • Award Id
    2129173
  • Award Effective Date
    10/1/2021
  • Award Expiration Date
    9/30/2023
  • Award Amount
    $200,000.00
  • Award Instrument
    Standard Grant

Abstract

Biometric technology for recognizing the identities and traits of people has been widely adopted in intelligence gathering, law enforcement, and consumer applications. Recent studies suggest that face-based biometric technology does not work equitably across demographic groups. There is a pressing need to investigate fair biometric solutions and modalities toward accurate, fair, and trustworthy technology for enhanced security and public safety. Ocular biometrics, which uses the regions in and around the eyes, offers an alternative to face biometrics due to its accuracy and privacy. Furthermore, ocular biometrics can be acquired with regular cameras even when the face is covered. This project investigates the fairness of ocular biometric technology and develops solutions to mitigate accuracy gaps across demographic groups. The work spans a highly multidisciplinary research area that integrates engineering, statistics, mathematics, computing, and policy. The findings of this project are used to update engineering curricula, including computer vision, image analysis, machine learning, deep learning, and biometrics. The advances made in this project are disseminated through publications, the investigator's website, and seminars. The project also provides opportunities to broaden the participation of women, underrepresented minorities, and undergraduate students in computing.

This project investigates the fairness of ocular biometrics captured in the visible and near-infrared (NIR) spectra across demographic variations. Machine and deep learning models trained for ocular-based analysis of individuals are evaluated for unequal accuracy rates across demographic groups. The causes of unequal accuracy rates in these models are analyzed using explainable AI. Fairness-aware classifiers are developed by modifying the objective function, using ensemble techniques, and learning fair feature representations. Existing, publicly available ocular datasets are used for the evaluation and security analysis of the classifiers in this study. The efficacy of the proposed solutions is assessed through standard biometric performance and fairness evaluation metrics.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
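The evaluation plan described above rests on standard biometric error rates computed separately for each demographic group. As a rough illustration of that idea, and not code from the project, the Python sketch below computes false non-match and false match rates per group at a shared decision threshold and reports a simple max-min gap across groups; the group names, score distributions, and threshold are synthetic assumptions made only for this example.

```python
# Illustrative sketch: per-group biometric error rates and a simple fairness gap.
# All group labels, score distributions, and the threshold are synthetic.
import numpy as np


def error_rates(genuine_scores, impostor_scores, threshold):
    """FNMR and FMR at a fixed threshold; higher scores mean a better match."""
    fnmr = np.mean(genuine_scores < threshold)   # genuine pairs wrongly rejected
    fmr = np.mean(impostor_scores >= threshold)  # impostor pairs wrongly accepted
    return fnmr, fmr


def fairness_gap(per_group_rates):
    """Max-min spread of an error rate across groups; 0 means equal rates."""
    values = list(per_group_rates.values())
    return max(values) - min(values)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic match scores for two hypothetical demographic groups.
    scores = {
        "group_A": {"genuine": rng.normal(0.80, 0.10, 5000),
                    "impostor": rng.normal(0.30, 0.10, 5000)},
        "group_B": {"genuine": rng.normal(0.72, 0.12, 5000),
                    "impostor": rng.normal(0.35, 0.12, 5000)},
    }
    threshold = 0.55  # one global operating point, as in deployed systems

    fnmr_by_group, fmr_by_group = {}, {}
    for group, s in scores.items():
        fnmr, fmr = error_rates(s["genuine"], s["impostor"], threshold)
        fnmr_by_group[group], fmr_by_group[group] = fnmr, fmr
        print(f"{group}: FNMR={fnmr:.4f}  FMR={fmr:.4f}")

    print(f"FNMR gap across groups: {fairness_gap(fnmr_by_group):.4f}")
    print(f"FMR gap across groups:  {fairness_gap(fmr_by_group):.4f}")
```

A max-min gap is only one way to summarize disparity; the project's abstract leaves the specific fairness metrics open, and other standard choices (ratios of group error rates, worst-group error) fit the same per-group computation.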

  • Program Officer
    Jeremy Epstein, jepstein@nsf.gov, (703) 292-8338
  • Min Amd Letter Date
    7/12/2021
  • Max Amd Letter Date
    7/12/2021
  • ARRA Amount

Institutions

  • Name
    Wichita State University
  • City
    Wichita
  • State
    KS
  • Country
    United States
  • Address
    1845 Fairmount
  • Postal Code
    67260-0007
  • Phone Number
    (316) 978-3285

Investigators

  • First Name
    Ajita
  • Last Name
    Rattani
  • Email Address
    ajita.rattani@wichita.edu
  • Start Date
    7/12/2021

Program Element

  • Text
    Secure & Trustworthy Cyberspace
  • Code
    8060

Program Reference

  • Text
    SaTC: Secure and Trustworthy Cyberspace
  • Text
    SMALL PROJECT
  • Code
    7923
  • Text
    WOMEN, MINORITY, DISABLED, NEC
  • Code
    9102
  • Text
    EXP PROG TO STIM COMP RES
  • Code
    9150