@aniruddha27
Last active May 11, 2021 05:39
# confusion matrix in sklearn
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report

# actual values
actual = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]
# predicted values
predicted = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]

# confusion matrix: labels=[1,0] puts the positive class first,
# so rows (actual) and columns (predicted) are ordered [1, 0]
matrix = confusion_matrix(actual, predicted, labels=[1, 0])
print('Confusion matrix : \n', matrix)

# outcome values order in sklearn: with labels=[1,0], the flattened
# matrix reads tp, fn, fp, tn (with the default label order [0,1],
# ravel() would instead give tn, fp, fn, tp)
tp, fn, fp, tn = confusion_matrix(actual, predicted, labels=[1, 0]).ravel()
print('Outcome values : \n', tp, fn, fp, tn)

# classification report for precision, recall, f1-score and accuracy
report = classification_report(actual, predicted, labels=[1, 0])
print('Classification report : \n', report)
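Because the unpacking order depends on the `labels` argument, it is easy to doubt it. A quick sketch (assuming scikit-learn is installed) cross-checks the `labels=[1,0]` ordering against direct counts over the (actual, predicted) pairs, and against `ravel()` with the default label order:

```python
# Cross-check the tp, fn, fp, tn ordering by counting outcomes by hand
from sklearn.metrics import confusion_matrix

actual    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]
predicted = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]

# With labels=[1,0], the flattened matrix reads tp, fn, fp, tn
tp, fn, fp, tn = confusion_matrix(actual, predicted, labels=[1, 0]).ravel()

# With the default label order [0,1], ravel() reads tn, fp, fn, tp
tn2, fp2, fn2, tp2 = confusion_matrix(actual, predicted).ravel()

# Direct counts over the pairs agree with both unpackings
pairs = list(zip(actual, predicted))
assert tp == tp2 == sum(1 for a, p in pairs if a == 1 and p == 1)
assert fn == fn2 == sum(1 for a, p in pairs if a == 1 and p == 0)
assert fp == fp2 == sum(1 for a, p in pairs if a == 0 and p == 1)
assert tn == tn2 == sum(1 for a, p in pairs if a == 0 and p == 0)

print(tp, fn, fp, tn)  # 2 2 1 5 for this data
```

For this data the hand count gives tp=2, fn=2, fp=1, tn=5, matching the gist's unpacking, so fp and fn are not swapped when `labels=[1,0]` is passed.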
@aashish-chaubey

Dude, you seem to have swapped the fp and fn values. Please correct it if so; it might cause confusion in the real sense.
