{
"metadata": {
"name": ""
},
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "code",
"collapsed": false,
"input": [
"import pymatbridge as pymat\n",
"ip = get_ipython()\n",
"pymat.load_ipython_extension(ip)"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"Starting MATLAB on http://localhost:53650\n",
" visit http://localhost:53650/exit.m to shut down same\n"
]
},
{
"output_type": "stream",
"stream": "stdout",
"text": [
"."
]
},
{
"output_type": "stream",
"stream": "stdout",
"text": [
"."
]
},
{
"output_type": "stream",
"stream": "stdout",
"text": [
"."
]
},
{
"output_type": "stream",
"stream": "stdout",
"text": [
"MATLAB started and connected!\n"
]
}
],
"prompt_number": 1
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$\\frac{d}{d\\theta}J(\\theta) = lim_{\\epsilon\\rightarrow 0} \\frac{J(\\theta+\\epsilon) - J(\\theta - \\epsilon)}{2\\epsilon}$$\n",
"\n",
"Thus we can numerically approximate the derivatives $g(\\theta) = \\frac{d}{d\\theta}J(\\theta)$.\n",
"\n",
"When $\\mathbf\\theta$ is a vector, we compute $g(\\theta_i) = \\frac{d}{d\\theta}J(\\theta_i)$, which is the derivative with respetivve to the $i$-th element of $\\theta$.\n",
"\n",
"Then we combine $g(\\theta_i)$ vertically to get $g(\\theta) = [g(\\theta_1), \\cdots, g(\\theta_m)]^T$.\n",
"\n",
"To compute $g(\\theta_i)$, we use two intermediate vectors $\\theta_i^{(+)}$ and $\\theta_i^{(-)}$. \n",
"\n",
"$\\theta_i^{(+)} = \\theta + e_i$\n",
"\n",
"$\\theta_i^{(-)} = \\theta - e_i$\n",
"\n",
"$e_i = [0, 0, \\cdots, \\epsilon, \\cdots, 0]^T$, where $e_i$ is a vector with a \"1\" in the $i$-th position and \"0\"s everywhere else."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```matlab\n",
"function numgrad = computeNumericalGradient(J, theta)\n",
"% numgrad = computeNumericalGradient(J, theta)\n",
"% theta: a vector of parameters\n",
"% J: a function that outputs a real-number. Calling y = J(theta) will return the\n",
"% function value at theta. \n",
" \n",
"% Initialize numgrad with zeros\n",
"numgrad = zeros(size(theta));\n",
"\n",
"%% ---------- YOUR CODE HERE --------------------------------------\n",
"% Instructions: \n",
"% Implement numerical gradient checking, and return the result in numgrad. \n",
"% (See Section 2.3 of the lecture notes.)\n",
"% You should write code so that numgrad(i) is (the numerical approximation to) the \n",
"% partial derivative of J with respect to the i-th input argument, evaluated at theta. \n",
"% I.e., numgrad(i) should be the (approximately) the partial derivative of J with \n",
"% respect to theta(i).\n",
"% \n",
"% Hint: You will probably want to compute the elements of numgrad one at a time. \n",
"\n",
"e = zeros(size(theta));\n",
"epsilon = 1e-4;\n",
"\n",
"for i = 1:length(numgrad)\n",
" ei = e;\n",
" ei(i) = 1;\n",
" theta_pos = J(theta + epsilon .* ei);\n",
" theta_neg = J(theta - epsilon .* ei);\n",
"\n",
" numgrad(i) = (theta_pos - theta_neg)./(2*epsilon);\n",
"end\n",
"\n",
"\n",
"disp(' numerical comp done!')\n",
"\n",
"%% ---------------------------------------------------------------\n",
"end\n",
"```"
]
},
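{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before running the full checker, `computeNumericalGradient` can be sanity-checked by hand on a simple objective. Below is a minimal sketch, assuming the function above is saved on the MATLAB path; the quadratic objective is an illustrative choice, consistent with the 38/12 values printed by `checkNumericalGradient` in the next cell:\n",
"\n",
"```matlab\n",
"% Illustrative objective: J(theta) = theta(1)^2 + 3*theta(1)*theta(2)\n",
"% Analytic gradient: [2*theta(1) + 3*theta(2); 3*theta(1)]\n",
"J = @(theta) theta(1)^2 + 3*theta(1)*theta(2);\n",
"theta = [4; 10];\n",
"numgrad = computeNumericalGradient(J, theta);\n",
"disp(numgrad);  % expect values close to [38; 12]\n",
"```"
]
},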
{
"cell_type": "code",
"collapsed": true,
"input": [
"matlab checkNumericalGradient()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"metadata": {},
"output_type": "display_data",
"text": [
" 2 1\n",
"\n",
" numerical comp done!\n",
" 38.0000 38.0000\n",
" 12.0000 12.0000\n",
"\n",
"The above two columns you get should be very similar.\n",
"(Left-Your Numerical Gradient, Right-Analytical Gradient)\n",
"\n",
" 2.1452e-12\n",
"\n",
"Norm of the difference between numerical and analytical gradient (should be < 1e-9)\n",
"\n"
]
}
],
"prompt_number": 7
}
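,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The reported norm of the difference, about $2.1 \\times 10^{-12}$, is several orders of magnitude below the $10^{-9}$ tolerance, so the centered-difference implementation agrees with the analytical gradient."
]
}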
],
"metadata": {}
}
]
}