Notes from reading "Deep Learning from Scratch: The Theory and Implementation of Deep Learning with Python" (Neural Networks) ref: https://qiita.com/edo_m18/items/f5ab5cd2d1293bee15c2
\begin{eqnarray}
y =
\begin{cases}
0 & : (b + w_1 x_1 + w_2 x_2) \leq 0 \\
1 & : (b + w_1 x_1 + w_2 x_2) > 0
\end{cases}
\end{eqnarray}
y = h(b + w_1 x_1 + w_2 x_2)
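As a quick illustration (my own sketch, not from the original notes): feeding the weighted sum plus bias through the step function h gives a perceptron, and with these example weights it behaves like an AND gate.

```python
def perceptron(x1, x2, w1=0.5, w2=0.5, b=-0.7):
    # weighted sum plus bias, then the step function h
    a = b + w1 * x1 + w2 * x2
    return 1 if a > 0 else 0

print(perceptron(0, 0), perceptron(1, 0), perceptron(0, 1), perceptron(1, 1))
# -> 0 0 0 1
```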
import numpy as np

a = np.array([1010, 1000, 990])
# np.exp(a) / np.sum(np.exp(a))  # computing this directly overflows,
# array([nan, nan, nan])         # so the result becomes NaN (undefined)
c = np.max(a)  # 1010
np.exp(a - c) / np.sum(np.exp(a - c))
# array([9.99954600e-01, 4.53978686e-05, 2.06106005e-09])
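Putting this together, the softmax can subtract the maximum before exponentiating for numerical stability; a minimal sketch, assuming 1-D input:

```python
import numpy as np

def softmax(a):
    c = np.max(a)           # subtract the max to avoid overflow
    exp_a = np.exp(a - c)   # the result is unchanged (see the derivation below)
    return exp_a / np.sum(exp_a)

print(softmax(np.array([1010, 1000, 990])))
# -> [9.99954600e-01 4.53978686e-05 2.06106005e-09]
```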
E = \frac{1}{2} \sum_{k}(y_k - t_k)^2
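The sum-of-squares error above translates directly to NumPy; a minimal sketch (the name sum_squared_error is mine; y is the network output, t the one-hot label):

```python
import numpy as np

def sum_squared_error(y, t):
    # E = 1/2 * sum_k (y_k - t_k)^2
    return 0.5 * np.sum((y - t) ** 2)
```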
E = -\sum_{k} t_k \log y_k
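Similarly, the cross-entropy error, with a small constant added inside the log so that log(0) does not become -inf; a sketch for a single one-hot sample:

```python
import numpy as np

def cross_entropy_error(y, t):
    delta = 1e-7  # avoid log(0) = -inf
    # E = -sum_k t_k * log(y_k)
    return -np.sum(t * np.log(y + delta))
```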
import numpy as np

def numerical_gradient(f, x):
    h = 1e-4  # 0.0001
    grad = np.zeros_like(x)  # array with the same shape as x

    for idx in range(x.size):
        tmp_val = x[idx]

        # compute f(x + h)
        x[idx] = tmp_val + h
        fx_h1 = f(x)

        # compute f(x - h)
        x[idx] = tmp_val - h
        fx_h2 = f(x)

        grad[idx] = (fx_h1 - fx_h2) / (h * 2)
        x[idx] = tmp_val  # restore the original value

    return grad
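For reference, a usage sketch I am adding (function_2 is just an example, not from the original notes): calling numerical_gradient on f(x0, x1) = x0^2 + x1^2 at (3.0, 4.0) should return roughly (6.0, 8.0).

```python
import numpy as np

def function_2(x):
    # f(x0, x1) = x0^2 + x1^2
    return np.sum(x ** 2)

print(numerical_gradient(function_2, np.array([3.0, 4.0])))
# -> [6. 8.]  (the analytic gradient is (2*x0, 2*x1))
```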
\lim_{x \to a} \frac{f(x) - f(a)}{x - a} = \lim_{\Delta x \to 0} \frac{f(a + \Delta x) - f(a)}{\Delta x}
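numerical_gradient above approximates this limit with a small finite h, using the central difference rather than the one-sided one to reduce the error; a scalar sketch (the name numerical_diff is mine):

```python
def numerical_diff(f, x):
    h = 1e-4  # small, but not so small that rounding error dominates
    # central difference: (f(x + h) - f(x - h)) / 2h
    return (f(x + h) - f(x - h)) / (2 * h)

print(numerical_diff(lambda x: x ** 2, 3.0))  # -> approximately 6.0
```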
\begin{eqnarray}
h(x) =
\begin{cases}
0 & (x \leq 0) \\
1 & (x > 0)
\end{cases}
\end{eqnarray}
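The step function can be written with NumPy so it also works element-wise on arrays; a minimal sketch:

```python
import numpy as np

def step_function(x):
    # returns 1 where x > 0, otherwise 0
    return (x > 0).astype(int)

print(step_function(np.array([-1.0, 1.0, 2.0])))
# -> [0 1 1]
```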
y = 1 - \exp(-x)
\begin{eqnarray}
h(x) =
\begin{cases}
0 & : x \leq n \\
1 & : x > n
\end{cases}
\end{eqnarray}
h(x) = \frac{1}{1 + \exp(-x)}
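The sigmoid maps any real input into (0, 1), and thanks to NumPy broadcasting the same expression works for scalars and arrays; a minimal sketch:

```python
import numpy as np

def sigmoid(x):
    # h(x) = 1 / (1 + exp(-x))
    return 1 / (1 + np.exp(-x))

print(sigmoid(np.array([-1.0, 1.0, 2.0])))
# -> [0.26894142 0.73105858 0.88079708]
```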
\begin{eqnarray}
h(x) =
\begin{cases}
x & : x > 0 \\
0 & : x \leq 0
\end{cases}
\end{eqnarray}
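ReLU is just an element-wise maximum with 0; a minimal sketch:

```python
import numpy as np

def relu(x):
    # h(x) = x for x > 0, otherwise 0
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))
# -> [0. 0. 3.]
```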
\begin{eqnarray}
y_k &=& \frac{\exp(a_k)}{\sum_{i=1}^{n}\exp(a_i)} \\
&=& \frac{C \exp(a_k)}{C \sum_{i=1}^{n}\exp(a_i)} \\
&=& \frac{\exp(a_k + \log C)}{\sum_{i=1}^{n}\exp(a_i + \log C)} \\
&=& \frac{\exp(a_k + C')}{\sum_{i=1}^{n}\exp(a_i + C')}
\end{eqnarray}
y_k = \frac{\exp(a_k)}{\sum_{i=1}^{n}\exp(a_i)}
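A quick numerical check of the invariance shown above (the input values are arbitrary examples of mine): shifting every element by the same constant C' leaves the softmax output unchanged.

```python
import numpy as np

a = np.array([0.3, 2.9, 4.0])
y1 = np.exp(a) / np.sum(np.exp(a))
y2 = np.exp(a + 2.0) / np.sum(np.exp(a + 2.0))  # shift all inputs by C' = 2.0
print(np.allclose(y1, y2))  # -> True
```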