News & Events

IREX Conducts Comprehensive Study to Estimate Bias in Suspect Recognition

2024-07-17 17:38 Corporate
IREX, the global leader in Ethical AI, has completed a comprehensive study to estimate recognition biases in its real-time suspect recognition technology. As part of our commitment to transparency and ethical AI, this study evaluates the accuracy and fairness of our technology across different ethnic groups and skin colors. The findings, shared with all users of the IREX Ethical AI platform, show that accuracy varies by less than 2% across these groups, confirming our technology's reliability and fairness.

Bias Mitigation Measures

At IREX, we’re committed to developing Artificial Intelligence models that are not only accurate but also fair and unbiased. In the context of face recognition, this means ensuring that our neural networks are designed to recognize individuals regardless of their race or ethnicity.
To achieve this goal, we’ve implemented several key measures during the training process:
  1. Diverse Validation: We regularly test our model’s performance on a range of datasets featuring people from different racial and ethnic backgrounds, ages, and postures. This helps us identify and address any biases that may emerge.
  2. Color Jittering: During data preparation, we apply color transformations to the images, which encourages the model to focus on facial features rather than skin tone or other physical characteristics.
  3. Geometric Transformations: We also apply various geometric transformations, including volumetric ones. This helps prevent the model from relying too heavily on certain characteristics that may be more prevalent in one racial group than another.
  4. Diverse Dataset Creation: Our approach involves combining multiple datasets from different origins, including our own custom dataset, to create a rich and diverse representation of people. This helps ensure that our model is trained on a broad range of facial features and patterns, which in turn reduces the likelihood of racial bias.
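As an illustration, the color jittering and geometric transformations described in steps 2 and 3 might look like the following in a training pipeline. This is a minimal NumPy sketch under assumed parameters; the actual IREX augmentation code and settings are not public:

```python
import numpy as np

rng = np.random.default_rng(0)

def color_jitter(img, brightness=0.2, contrast=0.2):
    """Randomly perturb brightness and contrast so the model cannot
    rely on absolute skin tone (illustrates step 2)."""
    b = 1.0 + rng.uniform(-brightness, brightness)
    c = 1.0 + rng.uniform(-contrast, contrast)
    mean = img.mean()
    return np.clip((img - mean) * c + mean * b, 0.0, 1.0)

def random_flip(img):
    """One simple geometric transformation (illustrates step 3):
    a horizontal flip applied with 50% probability."""
    return img[:, ::-1] if rng.random() < 0.5 else img

# Dummy 64x64 RGB face crop with pixel values in [0, 1].
face = rng.uniform(0.0, 1.0, size=(64, 64, 3))
augmented = random_flip(color_jitter(face))
print(augmented.shape)  # (64, 64, 3)
```

In practice such transforms are chained randomly per training image, so the network sees many color and geometry variants of each face.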
“By incorporating these measures into our face recognition neural networks, we’re able to develop models that are not only accurate but also fair and respectful of individual differences. At IREX, we believe it’s essential to prioritize fairness and transparency in AI development, and we’re committed to continuously improving our techniques to ensure the highest standards of performance and ethics,” says Dr. Filonenko, the Senior ML Scientist at IREX.

Study Overview

IREX utilized the Racial Faces in-the-Wild (RFW) testing database, which includes four distinct subsets: Caucasian, Asian, Indian, and African. Each subset contains approximately 3,000 unique individuals with 6,000 image pairs for face verification, providing a robust dataset for evaluating our technology's performance.
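Face verification on image pairs of this kind is typically scored by comparing face embeddings against a similarity threshold: a pair is predicted "same person" if the similarity clears the threshold, and accuracy is the fraction of pairs predicted correctly. A minimal illustrative sketch with toy data (not the IREX engine, whose internals are not public):

```python
import numpy as np

def cosine_similarity(a, b):
    """Row-wise cosine similarity between two batches of embeddings."""
    return np.sum(a * b, axis=1) / (
        np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1)
    )

def verification_accuracy(emb_a, emb_b, same, threshold=0.5):
    """Fraction of pairs where the thresholded similarity matches the label."""
    pred = cosine_similarity(emb_a, emb_b) >= threshold
    return float(np.mean(pred == same))

# Toy example: 4 embedding pairs with ground-truth labels.
rng = np.random.default_rng(1)
emb_a = rng.normal(size=(4, 128))
# First two pairs are identical embeddings (same person); last two are random.
emb_b = np.vstack([emb_a[:2], rng.normal(size=(2, 128))])
same = np.array([True, True, False, False])
print(verification_accuracy(emb_a, emb_b, same))  # 1.0 for this toy data
```

Running this per RFW subset yields one accuracy figure per ethnic group, which is what the results below report.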

Results

GPU Version Verification Accuracy of IREX Suspect Recognition

  • Caucasian: 99.15%
  • African: 98.23%
  • Indian: 98.20%
  • Asian: 97.20%
These results demonstrate a minimal variation in accuracy, with a spread of just 1.95 percentage points between the best- and worst-performing subsets for the GPU version.

CPU Version Verification Accuracy of IREX Suspect Recognition

  • Caucasian: 98.55%
  • African: 97.30%
  • Indian: 97.32%
  • Asian: 96.67%
The spread between the best- and worst-performing subsets was 1.88 percentage points for the CPU version.
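The reported spreads are simply best-minus-worst accuracy across the four subsets; a quick check against the figures above:

```python
# Verification accuracy (%) per RFW subset, as reported above.
gpu = {"Caucasian": 99.15, "African": 98.23, "Indian": 98.20, "Asian": 97.20}
cpu = {"Caucasian": 98.55, "African": 97.30, "Indian": 97.32, "Asian": 96.67}

for name, acc in (("GPU", gpu), ("CPU", cpu)):
    spread = max(acc.values()) - min(acc.values())
    print(f"{name} spread: {spread:.2f}")  # GPU: 1.95, CPU: 1.88
```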

Comparative Analysis

According to Mei Wang et al., the accuracy spreads on third-party commercial APIs (e.g., Microsoft, Amazon) vary significantly, ranging from 5.58% to 12.30%. In comparison, the IREX Suspect Recognition engine exhibits the smallest bias, highlighting our commitment to minimizing racial bias in our AI technologies.

Ongoing Commitment

At IREX, we routinely monitor and evaluate bias metrics across different ethnic groups, age groups, and genders to ensure our technologies remain fair and equitable.
“We are dedicated to continuous improvement and transparency, sharing these findings with all users of the IREX Ethical AI platform to help them understand the limitations and strengths of our technology,” says Calvin Yadav, the CEO of IREX.