Gist by RafayAK (53c8c403775210b1817e79f262042068), last active November 10, 2019 09:38.
This function computes the Binary Cross-Entropy (stable_bce) cost function the way Keras implements it.
import numpy as np

def compute_keras_like_bce_cost(Y, P_hat, from_logits=False):
    """
    Computes the Binary Cross-Entropy (stable_bce) cost function the way Keras
    implements it, accepting either probabilities (P_hat) from the sigmoid
    neuron or raw values (Z) directly from the linear node.

    Args:
        Y: labels of the data
        P_hat: probabilities from the sigmoid function (or logits, if from_logits=True)
        from_logits: whether P_hat contains logits rather than probabilities (default: False)

    Returns:
        cost: the "stable" Binary Cross-Entropy cost
        dZ_last: gradient of the cost w.r.t. Z_last
    """
    if from_logits:
        # P_hat contains logits, not probabilities
        return compute_stable_bce_cost(Y, Z=P_hat)
    else:
        # P_hat contains probabilities, so convert them back into logits.
        # First, clip the probabilities to a numerically stable range.
        EPSILON = 1e-07
        P_MAX = 1 - EPSILON  # 0.9999999
        P_hat = np.clip(P_hat, a_min=EPSILON, a_max=P_MAX)
        # Second, convert the clipped probabilities to logits (Z) via the logit function
        Z = np.log(P_hat / (1 - P_hat))
        # Now compute the stable cost from the logits
        return compute_stable_bce_cost(Y, Z)
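The helper `compute_stable_bce_cost` is not defined in this gist. Below is a minimal sketch of what it might look like, assuming the numerically stable formulation used by TensorFlow's `sigmoid_cross_entropy_with_logits` (cost = max(Z, 0) - Z*Y + log(1 + exp(-|Z|))) and assuming `Y` and `Z` are NumPy arrays of shape (1, m); the `_sigmoid` helper is introduced here for illustration:

```python
import numpy as np

def _sigmoid(Z):
    # Standard logistic function (hypothetical helper, not from the gist)
    return 1.0 / (1.0 + np.exp(-Z))

def compute_stable_bce_cost(Y, Z):
    """
    Sketch of a stable BCE cost computed from logits, following the
    max(Z, 0) - Z*Y + log(1 + exp(-|Z|)) formulation. Assumes Y and Z
    have shape (1, m), where m is the number of examples.
    """
    m = Y.shape[1]
    # Per-example stable BCE, averaged over all m examples
    cost = (1.0 / m) * np.sum(
        np.maximum(Z, 0) - Z * Y + np.log(1 + np.exp(-np.abs(Z)))
    )
    # Gradient of the averaged cost w.r.t. the logits Z
    dZ_last = (1.0 / m) * (_sigmoid(Z) - Y)
    return cost, dZ_last
```

For example, with labels `Y = [[1, 0]]` and logits `Z = [[0, 0]]`, each example contributes log(2), so the averaged cost is log(2) ≈ 0.6931 and the gradient is [[-0.25, 0.25]].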