+Algorithmic bias refers to systematic and repeatable errors in a computer system that produce unfair outcomes, often reflecting societal prejudices. These biases can arise from unrepresentative [Training Data](/wiki/training_data) or flawed model design, leading to discriminatory impacts in fields from finance to criminal justice. Mitigating such bias is crucial for building equitable [AI Systems](/wiki/ai_systems).
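+As a minimal, illustrative sketch (not part of the original text), one simple way to quantify such bias is a group-fairness metric such as the demographic parity difference: the gap in positive-prediction rates between groups. The function name and data below are hypothetical.
+
+```python
+# Illustrative sketch: demographic parity difference, a simple group-fairness
+# metric. All names and data here are hypothetical examples.
+
+def demographic_parity_difference(predictions, groups):
+    """Return the gap in positive-prediction rates across groups.
+
+    predictions: list of 0/1 model outputs
+    groups: list of group labels (e.g., "A" or "B"), aligned with predictions
+    """
+    rates = {}
+    for label in set(groups):
+        members = [p for p, g in zip(predictions, groups) if g == label]
+        rates[label] = sum(members) / len(members)
+    return max(rates.values()) - min(rates.values())
+
+
+# Hypothetical data: group "B" receives positive predictions far less often.
+preds = [1, 1, 0, 1, 0, 0, 0, 0]
+grps = ["A", "A", "A", "A", "B", "B", "B", "B"]
+print(demographic_parity_difference(preds, grps))  # 0.75
+```
+
+A value near 0 indicates similar positive-prediction rates across groups; larger values flag a disparity worth investigating, though no single metric captures fairness on its own.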
+## See also
+- [Data Bias](/wiki/data_bias)
+- [Fairness](/wiki/fairness)
+- [Machine Learning](/wiki/machine_learning)
... 1 more lines