Q: How do the following two differ in the computation of prediction accuracy? (Note: Y = true label, Y_prediction = predicted label.)
(From the deeplearning.ai programming assignments, Course 1 - Neural Networks and Deep Learning.)
Option 1 - Logistic Regression (Week 2):
print("Test accuracy: {} %".format(100 - np.mean(np.abs(Y_prediction - Y)) * 100))
Option 2 - Planar data classification (Week 3):
print('Accuracy of logistic regression: %d ' % float((np.dot(Y, Y_prediction) + np.dot(1 - Y, 1 - Y_prediction)) / float(Y.size) * 100) +
      '% ' + "(percentage of correctly labelled datapoints)")
My guess: Option 2 can handle multi-class labels (e.g. classes 0, 1, 2, 3, 4), whereas Option 1 only works for two classes (0 and 1). This still needs validation.
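One way to validate the guess would be to feed both formulas hypothetical multi-class labels (again made up, not from the assignment) and check whether either still reports 80% when 4 of 5 predictions are correct:

import numpy as np

# Hypothetical multi-class labels (classes 0-4), 4 of 5 correct.
Y = np.array([0, 1, 2, 3, 4])
Y_prediction = np.array([0, 1, 2, 3, 0])

print("Option 1:", 100 - np.mean(np.abs(Y_prediction - Y)) * 100)
print("Option 2:", (np.dot(Y, Y_prediction) + np.dot(1 - Y, 1 - Y_prediction)) / float(Y.size) * 100)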