AUC-PR (Precision-Recall Curve)

AUC-PR, or Area Under the Precision-Recall Curve, is a metric for evaluating how well a model identifies the positive class, and it is especially informative on imbalanced datasets. Precision measures the proportion of true positives among all positive predictions (TP / (TP + FP)), while recall measures the proportion of actual positives that are correctly identified (TP / (TP + FN)). The precision-recall curve plots these two metrics against each other across a range of decision thresholds, and AUC-PR summarizes the curve as a single value between 0 and 1. A higher value indicates a better ability to find true positives while producing few false positives, which makes the metric particularly useful for tasks where identifying positive cases is critical.
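
To make this concrete, here is a minimal sketch of computing AUC-PR with scikit-learn. The tiny y_true and y_scores arrays are illustrative placeholders, not real data; the calls to precision_recall_curve, auc, and average_precision_score are standard scikit-learn APIs.

```python
import numpy as np
from sklearn.metrics import precision_recall_curve, auc, average_precision_score

# Ground-truth labels (1 = positive class) and the model's predicted
# scores; these small arrays are purely illustrative.
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y_scores = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.5, 0.7])

# Precision and recall evaluated at every distinct score threshold.
precision, recall, thresholds = precision_recall_curve(y_true, y_scores)

# Area under the PR curve via trapezoidal integration over recall.
auc_pr = auc(recall, precision)

# Average precision is a common alternative single-number summary of
# the same curve, computed without trapezoidal interpolation.
ap = average_precision_score(y_true, y_scores)

print(f"AUC-PR (trapezoidal): {auc_pr:.3f}")
print(f"Average precision:    {ap:.3f}")
```

Note that scikit-learn's documentation favors average_precision_score as the summary statistic, since trapezoidal interpolation of the PR curve can be slightly optimistic; both values lie between 0 and 1, with higher being better.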