Unfairness Metrics

Unfairness metrics measure whether a decision-making system, such as an AI model or algorithm, treats groups defined by attributes like race, gender, or age unequally. They help identify bias by comparing outcomes across these groups; for example, they can reveal whether a loan approval system favors one group over another. By quantifying these disparities, these metrics guide developers to improve their systems so they make more equitable and just decisions for all individuals.
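
As a concrete illustration, the sketch below computes one widely used metric of this kind, the demographic parity difference: the gap in positive-outcome rates (here, loan approval rates) between two groups. The function name and the toy data are illustrative, not taken from any particular library.

```python
def demographic_parity_difference(decisions, groups, group_a, group_b):
    """Return the difference in positive-outcome rates between two groups.

    decisions: list of 0/1 outcomes (1 = approved)
    groups:    list of group labels, aligned with decisions
    """
    def rate(group):
        # Positive-outcome rate within one group.
        outcomes = [d for d, g in zip(decisions, groups) if g == group]
        return sum(outcomes) / len(outcomes)

    return rate(group_a) - rate(group_b)


# Hypothetical loan decisions for two groups, "A" and "B".
decisions = [1, 0, 1, 1, 0, 1, 0, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(demographic_parity_difference(decisions, groups, "A", "B"))
# Group A approval rate is 0.75, group B is 0.25 -> difference of 0.50
```

A result of 0.0 would indicate equal approval rates across the two groups; the larger the magnitude, the greater the disparity the system exhibits. Other metrics of this family (such as equalized odds) compare error rates rather than raw outcome rates, but follow the same pattern of contrasting a statistic across groups.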