
fairness bias
Fairness bias occurs when decision-making systems, such as machine learning models or algorithms, unintentionally favor or disadvantage certain groups based on attributes such as race, gender, or age. It typically arises because these systems learn from historical data that may encode past inequalities or stereotypes. As a result, the system's outcomes can be unfair, perpetuating or even amplifying existing societal biases. Recognizing and addressing fairness bias, for example by auditing outcome rates across groups, is crucial to ensuring that automated decisions are equitable and do not reinforce discrimination or unfair treatment of individuals based on their attributes.
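As one concrete way to make such an audit measurable, the sketch below computes a simple group fairness metric, the demographic parity difference: the gap in positive-outcome rates between two groups. The hiring scenario, the toy data, and the choice of this particular metric are illustrative assumptions, not part of the definition above.

```python
import numpy as np

def demographic_parity_difference(y_pred, group):
    """Absolute difference in positive-outcome rates between two groups.

    y_pred: binary predictions (0/1) from the model
    group:  binary group membership (0/1), e.g. a protected attribute
    """
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    rate_a = y_pred[group == 0].mean()  # positive-outcome rate for group 0
    rate_b = y_pred[group == 1].mean()  # positive-outcome rate for group 1
    return abs(rate_a - rate_b)

# Hypothetical predictions from a hiring model: 1 = "invite to interview"
y_pred = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]
group  = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]  # two demographic groups, encoded 0/1

gap = demographic_parity_difference(y_pred, group)
print(f"Demographic parity difference: {gap:.2f}")  # 0.00 would mean equal rates
```

A large gap does not by itself prove discrimination, but it flags a disparity worth investigating; in practice, practitioners typically examine several fairness metrics, since different definitions of fairness can conflict with one another.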