
Early Stopping
Early stopping is a technique used in machine learning to prevent overfitting, which occurs when a model learns the training data too closely, including its noise. As training proceeds, performance on the training data keeps improving, but performance on unseen data may begin to worsen. Early stopping monitors the model's performance on a validation set (a subset of data held out from training) and halts training when that performance stalls or declines. This helps the model generalize, performing well on new, unseen data rather than merely memorizing the training data.
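The idea can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the function name, the `patience` parameter (how many epochs without improvement to tolerate), and the sequence of validation losses are all hypothetical.

```python
def train_with_early_stopping(val_losses, patience=3):
    """Given per-epoch validation losses (hypothetical values), return
    the epoch and loss of the best model seen before stopping."""
    best_loss = float("inf")
    best_epoch = 0
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss = loss              # validation improved: record it
            best_epoch = epoch            # (in practice, checkpoint weights here)
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break                     # no improvement for `patience` epochs
    return best_epoch, best_loss

# Validation loss falls, then rises as the model starts to overfit.
losses = [0.90, 0.70, 0.55, 0.50, 0.52, 0.56, 0.60, 0.65]
best_epoch, best_loss = train_with_early_stopping(losses, patience=3)
print(best_epoch, best_loss)  # → 3 0.5
```

In practice the loop would wrap real training epochs, and the model's weights from the best epoch would be saved and restored when training stops; deep learning frameworks typically provide this as a built-in callback.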