
Algorithm Bias
Algorithm bias occurs when a computer program or machine learning system makes unfair or skewed decisions because of the data it was trained on or the way it was designed. If the training data reflects existing prejudices or lacks diversity, the algorithm can unintentionally reinforce stereotypes or discriminate against certain groups. This bias can affect outcomes in areas such as hiring, lending, and criminal justice, leading to unfair treatment. Recognizing and addressing algorithm bias is essential to ensuring that technology promotes fairness, accuracy, and equal opportunity for all users.
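A minimal sketch of how this happens in practice, using entirely made-up hiring data: a toy "model" is trained only on who was hired in the past, so the historical prejudice in those records becomes the decision rule itself. All names and numbers here are hypothetical, chosen to make the effect visible.

```python
# Toy, hypothetical records: (group, qualified, hired_in_the_past).
# Group A was historically hired more often regardless of qualifications.
history = [
    ("A", True,  True), ("A", True, True),  ("A", False, True),  ("A", False, False),
    ("B", True,  True), ("B", True, False), ("B", False, False), ("B", False, False),
]

def train(records):
    """'Learn' the past hire rate per group and hire whenever it exceeds 50%.

    Qualifications are never consulted, so the historical skew in the
    training data is reproduced directly by the model's decisions.
    """
    rates = {}
    for group in {g for g, _, _ in records}:
        hires = [hired for g, _, hired in records if g == group]
        rates[group] = sum(hires) / len(hires)
    return lambda group: rates[group] > 0.5

model = train(history)

# An unqualified A applicant is hired; a qualified B applicant is rejected.
for group, qualified in [("A", False), ("B", True)]:
    print(group, qualified, "->", "hire" if model(group) else "reject")

# A simple fairness check (selection-rate gap between groups):
# equal treatment would give a gap near 0; here it is maximal.
gap = abs(int(model("A")) - int(model("B")))
print("selection-rate gap:", gap)
```

Auditing a model with group-level checks like the selection-rate gap above is one common way such bias is detected before a system affects real decisions.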