Summarizing the confusion matrix
The confusion matrix is one of the standard evaluation tools for classification: it breaks predictions down by actual class and predicted class.
| | Predicted Negative | Predicted Positive |
|---|---|---|
| Actual Negative | True Negative (TN) | False Positive (FP) |
| Actual Positive | False Negative (FN) | True Positive (TP) |
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
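As a quick worked example with hypothetical counts TP = 40, FP = 10, and FN = 5: precision = 40 / (40 + 10) = 0.8, while recall = 40 / (40 + 5) ≈ 0.89.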
from sklearn.metrics import confusion_matrix
# Rows are actual labels, columns are predicted labels (for binary 0/1 labels: TN, FP / FN, TP)
cm = confusion_matrix(y_test, y_pred)
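As a minimal sketch, assuming small hypothetical label arrays in place of a real `y_test` / `y_pred`, the four cells can be unpacked with `ravel()` and plugged into the formulas above, then cross-checked against scikit-learn's built-in scores:

```python
from sklearn.metrics import confusion_matrix, precision_score, recall_score

# Hypothetical labels standing in for y_test and y_pred
y_test = [0, 0, 0, 1, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 1, 0, 1, 1, 0, 1, 0]

# For binary labels {0, 1}, ravel() unpacks the matrix as TN, FP, FN, TP
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()

# Apply the formulas above: here TP=4, FP=1, FN=1, so both come out to 0.8
precision = tp / (tp + fp)
recall = tp / (tp + fn)

# Cross-check against scikit-learn's built-in metrics
print(precision, precision_score(y_test, y_pred))  # 0.8 0.8
print(recall, recall_score(y_test, y_pred))        # 0.8 0.8
```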