Regularized Cost Function, Feedforward and Backpropagation Algorithms
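The function below calls sigmoid and sigmoidGrad, which are not shown in this snippet; a minimal sketch of the standard definitions it presumably assumes, plus the NumPy import the code needs:

import numpy as np

def sigmoid(z):
    # Logistic function, applied element-wise
    return 1 / (1 + np.exp(-z))

def sigmoidGrad(z):
    # Derivative of the sigmoid: g(z) * (1 - g(z))
    g = sigmoid(z)
    return g * (1 - g)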
def nnRegCostFunction(theta_1, theta_2, X, y, input_layer_size, hidden_layer_size, num_labels):
    # Useful variables (m = number of images, n = number of pixels per image)
    m, n = X.shape
    # Feedforward pass
    # Add the bias unit (a column of ones) to the dataset
    a1 = np.append(np.ones(shape=(X.shape[0], 1)), X, axis=1)
    z2 = a1 @ theta_1.transpose()
    a2 = sigmoid(z2)
    a2 = np.append(np.ones(shape=(a2.shape[0], 1)), a2, axis=1)
    z3 = a2 @ theta_2.transpose()
    a3 = sigmoid(z3)
    # Cost function (cross-entropy; y is assumed one-hot encoded, shape (m, num_labels))
    # The natural log is required here, not log10, so that the
    # backpropagation deltas below match the gradient of this cost
    log_htheta1 = np.log(a3)
    log_htheta2 = np.log(1 - a3)
    J_theta = -np.sum(y * log_htheta1 + (1 - y) * log_htheta2) / m
    # Backpropagation
    z2_aux = np.append(np.ones((z2.shape[0], 1)), z2, axis=1)
    delta_3 = a3 - y
    delta_2 = (delta_3 @ theta_2) * sigmoidGrad(z2_aux)
    delta_2 = delta_2[:, 1:]  # drop the error term of the bias unit
    Delta1 = delta_2.transpose() @ a1
    Delta2 = delta_3.transpose() @ a2
    Theta1_grad = Delta1 / m
    Theta2_grad = Delta2 / m
    return J_theta, Theta1_grad, Theta2_grad
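
Despite its name, nnRegCostFunction above returns the unregularized cost and gradients. A minimal sketch of how the regularization term could be added on top, assuming a hypothetical lambda_ weight-decay coefficient (not part of the original signature); the bias columns of theta_1 and theta_2 are excluded from the penalty, following the usual convention:

def addRegularization(J_theta, Theta1_grad, Theta2_grad, theta_1, theta_2, m, lambda_):
    # Zero out the bias columns so they are not penalized
    t1 = theta_1.copy()
    t1[:, 0] = 0
    t2 = theta_2.copy()
    t2[:, 0] = 0
    # Regularized cost: J + (lambda / 2m) * sum of squared non-bias weights
    J_reg = J_theta + (lambda_ / (2 * m)) * (np.sum(t1 ** 2) + np.sum(t2 ** 2))
    # Regularized gradients: add (lambda / m) * theta, bias columns excluded
    return (J_reg,
            Theta1_grad + (lambda_ / m) * t1,
            Theta2_grad + (lambda_ / m) * t2)

A call would then chain the two, e.g.:

J, g1, g2 = nnRegCostFunction(theta_1, theta_2, X, y, input_layer_size, hidden_layer_size, num_labels)
J, g1, g2 = addRegularization(J, g1, g2, theta_1, theta_2, X.shape[0], lambda_=1.0)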