Studying entropy-based loss functions.
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Entropy Based Loss Functions\n",
"The goal of this notebook is to provide an understanding of:\n",
"* `torch.nn.functional.cross_entropy`\n",
"* `torch.nn.functional.nll_loss`\n",
"* `torch.nn.functional.bce_loss`\n",
"* `torch.nn.functional.kl_div`\n",
"\n",
"References:\n",
"- Information Theory Stuff: https://www.youtube.com/watch?v=ErfnhcEV1O8\n",
"- PyTorch Docs for Loss Functions: https://pytorch.org/docs/stable/nn.html#loss-functions"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Example Distributions"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABL4AAAHDCAYAAAAqZtO0AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjYuMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy89olMNAAAACXBIWXMAAA9hAAAPYQGoP6dpAAA190lEQVR4nO3de5zVdZ0/8NeAMoMXLgoMouigXfECBcEPtSwl2XRtbXXzso9AzNJEU6fWxBRkcx3T1WVXUbwU+miX1fKRVosahLfcdC2QLS0r88ZagKQOhgY5c35/9HBs5Do4F/jM8/l4zOPhfOf7Ped9vg7f15zX+Z7vqapUKpUAAAAAQGF6dPUAAAAAANARFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF/QST784Q/nwx/+cLvcVl1dXU466aSW7++7775UVVXlvvvua5fbB2DTbrrpplRVVeWZZ55ptfzyyy/P3nvvnZ49e2bkyJFdMlt7Oumkk1JXV9dqWVVVVS666KIumQeADdsWj89vfZ70zDPPpKqqKjfddFOXzURZFF/QjV1yySW54447unoMgGLMnz8/5557bg466KDMmTMnl1xySVePtNWYO3duZs6c2dVjAFCoa665RlnGem3X1QMAb9+HPvShvPbaa+nVq1ebtrvkkkty7LHH5uijj+6YwQAK9qlPfSrHH398qqurW5bdc8896dGjR772ta+1+Zi8LXnttdey3XZt+zNy7ty5eeyxx3L22Wd3zFAAFGGvvfbKa6+9lu23375N211zzTUZMGBAq3fGQOKML7qpSqWS1157ravHaDc9evRITU1NevTwTxqgs/Ts2TM1NTWpqqpqWbZixYr07t273UqvrTWvampq2lx8AXRHq1ev7uoRtjlVVVWpqalJz549u3oUCuFZMluliy66KFVVVXnyySdz0kknpV+/funbt28mT56cV199tWW9119/PV/5yleyzz77pLq6OnV1dTn//POzZs2aVrdXV1eXv/7rv873v//9jB49Or179851113Xcm2sb37zm5kxY0Z233337Lzzzjn22GPT2NiYNWvW5Oyzz86gQYOy0047ZfLkyevc9pw5c3LooYdm0KBBqa6uzvDhw3Pttde2y36oVCq5+OKLs8cee2SHHXbIRz7ykTz++OPrrLe+a3z9+te/zjHHHJPBgwenpqYme+yxR44//vg0NjYm+XOgrF69OjfffHOqqqpSVVXl1RGgaOu7VlXyZua8oaqqKmeccUbuuOOO7Lfffqmurs6+++6bu+++u9V2b73GV1VVVebMmZPVq1e3HFffeMvF1pBXm+ONx1xTU5P99tsvt99++3rXe+s1ZF555ZWcffbZqaurS3V1dQYNGpSPfvSjWbx4cZI/X79l3rx5efbZZ1v2zfr+XwBsy97Ik5///Oc58cQT079//xx88MH56U9/mpNOOil77713ampqMnjw4Jx88sn5/e9/v97tN/UcKEnWrFmTc845JwMHDszOO++cj3/84/m///u/9c716KOP5mMf+1j69OmTnXbaKYcddlgefvjhVuu8kWkPPvhgPv/5z2fgwIHp169fTj311KxduzYvv/xyJk6cmP79+6d///4599xzU6lU2ryPrr/++uyzzz7p3bt3xowZkx/+8IfrrLO+a3wtW7YskydPzh577JHq6urstttu+Zu/+ZuWDK6rq8vjjz+e+++/vyVn2uv6ymz7vFTHVu2Tn/xkhg0bloaGhixevDg33nhjBg0alK9+9atJklNOOSU333xzjj322HzhC1/I//zP/6ShoSG/+MUv1vlj/Ze//GVOOOGEnHrqqfnMZz6Td7/73S0/a2hoSO/evXPeeeflySefzFVXXZXtt98+PXr0yEsvvZSLLrooDz/8cG666aYMGzYs06ZNa9n22muvzb777puPf/zj2W677fK9730vp59+epqbmzNlypS39finTZuWiy++OEcccUSOOOKILF68OIcffnjWrl270e3Wrl2bCRMmZM2aNTnzzDMzePDgPP/88/mv//qvvPzyy+nbt2++8Y1v5JRTTsmYMWPy2c9+Nkmyzz77vK15AUrx4IMP5tvf/nZOP/307Lzzzvm3f/u3HHPMMXnuueey6667rnebb3zjG7n++uvzyCOP5MYbb0ySHHjggUm2jrzalPnz5+eYY47J8OHD09DQkN///vctTzI25bTTTsttt92WM844I8OHD8/vf//7PPjgg/nFL36R97///fnyl7+cxsbG/N///V/+5V/+JUmy0047bfZsANuSv/u7v8s73/nOXHLJJalUKlmwYEGeeuqpTJ48OYMHD87jjz+e66+/Po8//ngefvjhVi++JJt+DpT8OVf+/d//PSeeeGIOPPDA3HPPPTnyyCPXmeXxxx/PBz/4wfTp0yfnnntutt9++1x33XX58Ic/nPvvvz9jx45ttf4bzx1mzJiRhx9+ONdff3369euXH/3oR9lzzz1zySWX5M4778zll1+e/fbbLxMnTtzs/fK1r30tp556ag488MCcffbZeeqpp/Lxj388u+yyS4YOHbrRbY855pg8/vjjOfPMM1NXV5cVK1ZkwYIFee6551JXV5eZM2fmzDPPzE477ZQvf/nLSZLa2trNno3CVWArNH369EqSysknn9xq+Sc+8YnKrrvuWqlUKpUlS5ZUklROOeWUVut88YtfrCSp3HPPPS3L9tprr0qSyt13391q3XvvvbeSpLLffvtV1q5d27L8hBNOqFRVVVU+9rGPtVp/3Lhxlb322qvVsldffXWd+SdMmFDZe++9Wy075JBDKocccsjGH/hfWLFiRaVXr16VI488stLc3Nyy/Pzzz68kqUyaNGmdx3HvvfdWKpVK5dFHH60kqXzrW9/a6H3suOOOrW4HoGSTJk1a5xheqbyZOW9IUunVq1flySefbFn2v//7v5Uklauuuqpl2Zw5cypJKk8//XSr+9hxxx1b3f7WklebMnLkyMpuu+1Wefnll1uWzZ8/v5JkndtKUpk+fXrL93379q1MmTJlo7d/5JFHtnkmgG3JG3lywgkntFq+vucL//mf/1lJUnnggQfW2X5jz4EqlTdz5fTTT2+13oknnrjO8fnoo4+u9OrVq/Kb3/ymZdlvf/vbys4771z50Ic+1LLsjUybMGFCq+ce48aNq1RVVVVOO+20lmWvv/56ZY899mjTc5u1a9dWBg0aVBk5cmRlzZo1Lcuvv/76SpJWt/X0009XklTmzJlTqVQqlZdeeqmSpHL55Zdv9D723XffNs1E9+GtjmzVTjvttFbff/CDH8zvf//7rFq1KnfeeWeSpL6+vtU6X/jCF5Ik8+bNa7V82LBhmTBhwnrvZ+LEia0unjh27NhUKpWcfPLJrdYbO3Zsli5dmtdff71lWe/evVv+u7GxMStXrswhhxySp556quVthVviBz/4QdauXZszzzyz1atAm3NR4L59+yZJvv/9769zWjQAm
zZ+/PhWZ8EecMAB6dOnT5566qk239bWklcb87vf/S5LlizJpEmTWjIkST760Y9m+PDhm9y+X79++Z//+Z/89re/3az7AyjZW5/D/OXzhT/+8Y9ZuXJl/t//+39J0vKW8I1t/5fPgZI3c+Xzn/98q/Xe+jyhqakp8+fPz9FHH5299967Zfluu+2WE088MQ8++GDLbb7h05/+dKvnHm/kzKc//emWZT179szo0aPblIk/+clPsmLFipx22mmtroN50kkntcqd9Xnj2pn33XdfXnrppc2+T3iD4out2p577tnq+/79+ydJXnrppTz77LPp0aNH3vGOd7RaZ/DgwenXr1+effbZVsuHDRu22ffzxsH3rafc9u3bN83Nza0Krf/+7//O+PHjs+OOO6Zfv34ZOHBgzj///CR5W8XXG/O/853vbLV84MCBLfthQ4YNG5b6+vrceOONGTBgQCZMmJBZs2a9rXkAupO35kLy5wzakj+4t5a82tSMybqZk6TVWy035LLLLstjjz2WoUOHZsyYMbnooou2qCQEKMFbj+MvvvhizjrrrNTW1qZ3794ZOHBgyzrrO05v7DlQ8mauvPUyJW89Xr/wwgt59dVX13scf+9735vm5uYsXbp0o/e9sZxpSyZuKGe23377VqXc+lRXV+erX/1q7rrrrtTW1uZDH/pQLrvssixbtmyz75/uTfHFVm1Dn+RR+YsLKb71PfEb8pevtGzu/Wzq/n/zm9/ksMMOy8qVK3PllVdm3rx5WbBgQc4555wkSXNz82bN1hGuuOKK/PSnP83555+f1157LZ///Oez7777bvCilwCl21BeNDU1rbNsc/Knve7/rToirzraJz/5yTz11FO56qqrMmTIkFx++eXZd999c9ddd3XK/QNsTd56HP/kJz+ZG264Iaeddlq+/e1vZ/78+S0fmLK+5wtdeUxvS850VsYkfz6b7Ve/+lUaGhpSU1OTCy+8MO9973vz6KOPdtoMbLsUX2yz9tprrzQ3N+fXv/51q+XLly/Pyy+/nL322qvDZ/je976XNWvW5Lvf/W5OPfXUHHHEERk/fvxGn7Rsrjfmf+vje+GFFzb71ZX9998/F1xwQR544IH88Ic/zPPPP5/Zs2e3/Hxzn4QBlKB///55+eWX11n+1jOu2tvWkFebsqHMSf58sf3Nsdtuu+X000/PHXfckaeffjq77rpr/umf/qnl5zIH6I5eeumlLFy4MOedd15mzJiRT3ziE/noRz+6ybOcNuaNXPnNb37Tavlbj9cDBw7MDjvssN7j+BNPPJEePXps8qLy7WVDOfOnP/0pTz/99Gbdxj777JMvfOELmT9/fh577LGsXbs2V1xxRcvP5Qwbovhim3XEEUckSWbOnNlq+ZVXXpkk6/1Uk/b2xisff/lqR2NjY+bMmfO2b3v8+PHZfvvtc9VVV7W6/bc+3vVZtWrVOtd12X///dOjR49WH2+/4447rvdJIECJ9tlnnzQ2NuanP/1py7Lf/e5363yqYnvbGvJqU3bbbbeMHDkyN998c6u33SxYsCA///nPN7ptU1PTOm/VGTRoUIYMGbJO5njLPdDdrO/5QrJ5f9NvyMc+9rEkyb/9279t9DZ79uyZww8/PN/5znfyzDPPtCxfvnx55s6dm4MPPjh9+vTZ4jnaYvTo0Rk4cGBmz57d6hPqb7rppk0+H3n11Vfzxz/+sdWyffbZJzvvvLPnNmyW7bp6ANhSI0aMyKRJk3L99dfn5ZdfziGHHJJHHnkkN998c44++uh85CMf6fAZDj/88PTq1StHHXVUTj311PzhD3/IDTfckEGDBuV3v/vd27rtgQMH5otf/GIaGhry13/91zniiCPy6KOP5q677sqAAQM2uu0999yTM844I3/3d3+Xd73rXXn99dfzjW98Iz179swxxxzTst6oUaPygx/8IFdeeWWGDBmSYcOGrfORxgClOP744/OlL30pn/jEJ/L5z38+r776aq699tq8613vWu/FhdvL1pBXm6OhoSFHHnlkDj744Jx88sl58cUXc9VVV2XffffNH/7whw1u98orr2SPPfbIsccemxEjRmSnnXbKD37wg/z4xz9u9Ur8qFGjcuutt6a+vj4f+MAHstNOO+Woo47qjIcG0GX69OnTck2qP/3pT9l9990zf/78zT7LaX1GjhyZE044Iddcc00aGxtz4IEHZuHChXnyySfXWffiiy/OggULcvDBB+f000/Pdtttl+uuuy5r1qzJZZdd9nYeWptsv/32ufjii3Pqqafm0EMPzXHHHZenn346c+bM2eTZb7/61a9y2GGH5ZOf/GSGDx+e7bbbLrfffnuWL1+e448/vmW9UaNG5dprr83FF1+cd7zjHRk0aFAOPfTQjn5obAMUX2zTbrzxxuy999656aabcvvtt2fw4MGZOnVqpk+f3in3/+53vzu33XZbLrjggnzxi1/M4MGD87nPfS4DBw5c5xO2tsTFF1+cmpqazJ49O/fee2/Gjh2b+fPnb/LsgBEjRmTChAn53ve+l+effz477LBDRowYkbvuuqvlE2SSP59t8NnPfjYXXHBBXnvttUyaNEnxBRRr1113ze233576+vqce+65GTZsWBoaGvLrX/+6Q4uvpOvzanP81V/9Vb71rW/lggsuyNSpU7PPPvtkzpw5+c53vpP77rtvg9vtsMMOOf300zN//vx8+9vfTnNzc97xjnfkmmuuyec+97mW9U4//fQsWbIkc+bMyb/8y79kr732UnwB3cLcuXNz5plnZtasWalUKjn88MNz1113ZciQIVt8m1//+tczcODA/Md//EfuuOOOHHrooZk3b946b13cd99988Mf/jBTp05NQ0NDmpubM3bs2Pz7v/97p//d/9nPfjZNTU25/PLL8w//8A/Zf//9893vfjcXXnjhRrcbOnRoTjjhhCxcuDDf+MY3st122+U973lPvvnNb7Z6UX/atGl59tlnc9lll+WVV17JIYccovgiSVJV6cwr0gEAAABAJ3GNLwAAAACK5K2O0AVeeOGFNDU1bfDnvXr1yi677NKJEwFQqsbGxrz22msbXWfw4MGdNA0ApXnxxRdbXbD+rXr27JmBAwd24kTQmrc6Qheoq6vLs88+u8GfH3LIIRu9ngoAbK6TTjopN99880bX8ecgAFvqwx/+cO6///4N/nyvvfZq9amS0NnaXHw98MADufzyy7No0aKWjwA/+uijN7rNfffdl/r6+jz++OMZOnRoLrjggpx00klvY2zYtv33f//3Rl9979+/f0aNGtWJE8HWQ85A+/r5z3+e3/72txtdZ/z48Z00DXQ9OQPta9GiRXnppZc2+PPevXvnoIMO6sSJoLU2v9Vx9erVGTFiRE4++eT87d/+7SbXf/rpp3PkkUfmtNNOy3/8x39k4cKFOeWUU7LbbrtlwoQJWzQ0bOsc+GHD5Ay0r+HDh2f48OFdPQZsNeQMtC8v2LO1e1tvdayqqtrkKyRf+tKXMm/evDz22GMty44//vi8/PLLufvuu7f0rgHoBuQMAB1JzgCUr8Mv
bv/QQw+tc/r8hAkTcvbZZ29wmzVr1mTNmjUt3zc3N+fFF1/Mrrvumqqqqo4aFaDbqFQqeeWVVzJkyJD06LFtf8CvnAHY+sgZOQPQkdqSMx1efC1btiy1tbWtltXW1mbVqlV57bXX0rt373W2aWhoyIwZMzp6NIBub+nSpdljjz26eoy3Rc4AbL3kDAAdaXNypsOLry0xderU1NfXt3zf2NiYPffcM0uXLk2fPn26cDKAMqxatSpDhw7Nzjvv3NWjdAk5A9Cx5IycAehIbcmZDi++Bg8enOXLl7datnz58vTp02e9r44kSXV1daqrq9dZ3qdPH0EB0I5KeLuFnAHYesmZ1uQMQPvanJzp8Dfcjxs3LgsXLmy1bMGCBRk3blxH3zUA3YCcAaAjyRmAbVubi68//OEPWbJkSZYsWZLkzx/vu2TJkjz33HNJ/nxa78SJE1vWP+200/LUU0/l3HPPzRNPPJFrrrkm3/zmN3POOee0zyMAoChyBoCOJGcAupc2F18/+clP8r73vS/ve9/7kiT19fV53/vel2nTpiVJfve737WERpIMGzYs8+bNy4IFCzJixIhcccUVufHGGzNhwoR2eggAlETOANCR5AxA91JVqVQqXT3EpqxatSp9+/ZNY2Oj98QDtAPH1dbsD4D25bjamv0B0L7aclzt8Gt8AQAAAEBXUHwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUKQtKr5mzZqVurq61NTUZOzYsXnkkUc2uv7MmTPz7ne/O717987QoUNzzjnn5I9//OMWDQxA+eQMAB1JzgB0H20uvm699dbU19dn+vTpWbx4cUaMGJEJEyZkxYoV611/7ty5Oe+88zJ9+vT84he/yNe+9rXceuutOf/889/28ACUR84A0JHkDED30ubi68orr8xnPvOZTJ48OcOHD8/s2bOzww475Otf//p61//Rj36Ugw46KCeeeGLq6upy+OGH54QTTtjkqyoAdE9yBoCOJGcAupc2FV9r167NokWLMn78+DdvoEePjB8/Pg899NB6tznwwAOzaNGilmB46qmncuedd+aII47Y4P2sWbMmq1atavUFQPnkDAAdSc4AdD/btWXllStXpqmpKbW1ta2W19bW5oknnljvNieeeGJWrlyZgw8+OJVKJa+//npOO+20jZ4a3NDQkBkzZrRlNAAKIGcA6EhyBqD76fBPdbzvvvtyySWX5JprrsnixYvz7W9/O/PmzctXvvKVDW4zderUNDY2tnwtXbq0o8cEYBslZwDoSHIGYNvWpjO+BgwYkJ49e2b58uWtli9fvjyDBw9e7zYXXnhhPvWpT+WUU05Jkuy///5ZvXp1PvvZz+bLX/5yevRYt3urrq5OdXV1W0YDoAByBoCOJGcAup82nfHVq1evjBo1KgsXLmxZ1tzcnIULF2bcuHHr3ebVV19dJwx69uyZJKlUKm2dF4CCyRkAOpKcAeh+2nTGV5LU19dn0qRJGT16dMaMGZOZM2dm9erVmTx5cpJk4sSJ2X333dPQ0JAkOeqoo3LllVfmfe97X8aOHZsnn3wyF154YY466qiWwACAN8gZADqSnAHoXtpcfB133HF54YUXMm3atCxbtiwjR47M3Xff3XKByOeee67VKyIXXHBBqqqqcsEFF+T555/PwIEDc9RRR+Wf/umf2u9RAFAMOQNAR5IzAN1LVWUbOD931apV6du3bxobG9OnT5+uHgdgm+e42pr9AdC+HFdbsz8A2ldbjqsd/qmOAAAAANAVFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFGmLiq9Zs2alrq4uNTU1GTt2bB555JGNrv/yyy9nypQp2W233VJdXZ13vetdufPOO7doYADKJ2cA6EhyBqD72K6tG9x6662pr6/P7NmzM3bs2MycOTMTJkzIL3/5ywwaNGid9deuXZuPfvSjGTRoUG677bbsvvvuefbZZ9OvX7/2mB+AwsgZADqSnAHoXqoqlUqlLRuMHTs2H/jAB3L11VcnSZqbmzN06NCceeaZOe+889ZZf/bs2bn88svzxBNPZPvtt9+iIVetWpW+ffumsbExffr02aLbAOBNW/NxVc4AbPu25uOqnAHY9rXluNqmtzquXbs2ixYtyvjx49+8gR49Mn78+Dz00EPr3ea73/1uxo0blylTpqS2tjb77bdfLrnkkjQ1NW3wftasWZNVq1a1+gKgfHIGgI4kZwC6nzYVXytXrkxTU1Nqa2tbLa+trc2yZcvWu81TTz2V2267LU1NTbnzzjtz4YUX5oorrsjFF1+8wftpaGhI3759W76GDh3aljEB2Eb
JGQA6kpwB6H46/FMdm5ubM2jQoFx//fUZNWpUjjvuuHz5y1/O7NmzN7jN1KlT09jY2PK1dOnSjh4TgG2UnAGgI8kZgG1bmy5uP2DAgPTs2TPLly9vtXz58uUZPHjwerfZbbfdsv3226dnz54ty9773vdm2bJlWbt2bXr16rXONtXV1amurm7LaAAUQM4A0JHkDED306Yzvnr16pVRo0Zl4cKFLcuam5uzcOHCjBs3br3bHHTQQXnyySfT3NzcsuxXv/pVdtttt/WGBADdl5wBoCPJGYDup81vdayvr88NN9yQm2++Ob/4xS/yuc99LqtXr87kyZOTJBMnTszUqVNb1v/c5z6XF198MWeddVZ+9atfZd68ebnkkksyZcqU9nsUABRDzgDQkeQMQPfSprc6Jslxxx2XF154IdOmTcuyZcsycuTI3H333S0XiHzuuefSo8ebfdrQoUPz/e9/P+ecc04OOOCA7L777jnrrLPypS99qf0eBQDFkDMAdCQ5A9C9VFUqlUpXD7Epq1atSt++fdPY2Jg+ffp09TgA2zzH1dbsD4D25bjamv0B0L7aclzt8E91BAAAAICuoPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEhbVHzNmjUrdXV1qampydixY/PII49s1na33HJLqqqqcvTRR2/J3QLQTcgZADqarAHoHtpcfN16662pr6/P9OnTs3jx4owYMSITJkzIihUrNrrdM888ky9+8Yv54Ac/uMXDAlA+OQNAR5M1AN1Hm4uvK6+8Mp/5zGcyefLkDB8+PLNnz84OO+yQr3/96xvcpqmpKX//93+fGTNmZO+9935bAwNQNjkDQEeTNQDdR5uKr7Vr12bRokUZP378mzfQo0fGjx+fhx56aIPb/eM//mMGDRqUT3/605t1P2vWrMmqVatafQFQPjkDQEfrjKyRMwBbj+3asvLKlSvT1NSU2traVstra2vzxBNPrHebBx98MF/72teyZMmSzb6fhoaGzJgxoy2jwSbVnTevq0doV89cemRXjwDtTs4A0NE6I2s6Imf8LQuwZTr0Ux1feeWVfOpTn8oNN9yQAQMGbPZ2U6dOTWNjY8vX0qVLO3BKALZVcgaAjrYlWSNnALYebTrja8CAAenZs2eWL1/eavny5cszePDgddb/zW9+k2eeeSZHHXVUy7Lm5uY/3/F22+WXv/xl9tlnn3W2q66uTnV1dVtGA6AAcgaAjtYZWSNnALYebTrjq1evXhk1alQWLlzYsqy5uTkLFy7MuHHj1ln/Pe95T372s59lyZIlLV8f//jH85GPfCRLlizJ0KFD3/4jAKAYcgaAjiZrALqXNp3xlST19fWZNGlSRo8enTFjxmTmzJlZvXp1Jk+enCSZOHFidt999zQ0NKSmpib77bdfq+379euXJOssB4BEzgDQ8WQNQPfR5uLruOOOywsvvJBp06Zl2bJlGTlyZO6+++6Wi0M+99xz6dGjQy8dBkDB5AwAHU3WAHQfVZVKpdLVQ2zKqlWr0rdv3zQ2NqZPnz5dPQ7bKJ+EA29yXG3N/gBoX46rrbXH/vC3LMCb2nJc9TIGAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEXarqsHAAAAAGDT6s6b19UjtKtnLj2yw+/DGV8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRFF8AAAAAFEnxBQAAAECRtqj4mjVrVurq6lJTU5OxY8fmkUce2eC6N9xwQz74wQ+mf//+6d+/f8aPH7/R9QFAzgDQ0WQNQPfQ5uLr1ltvTX19faZPn57FixdnxIgRmTBhQlasWLHe9e+7776ccMIJuffee/PQQw9l6NChOfzww/P888+/7eEBKI+cAaCjyRqA7qOqUqlU2rLB2LFj84EPfCBXX311kqS5uTlDhw7NmWeemfPOO2+T2zc1NaV///65+uqrM3HixM26z1WrVqVv375pbGxMnz592jIutKg7b15Xj9Cunrn0yK4egW3Y1nxclTMA276t/bja2VnTHvvD37JA4ljwhrYcV9t0xtfatWuzaNGijB8//s0b6NEj48ePz0MPPbRZt/Hqq6/mT3/6U3bZZZcNrrNmzZqsWrWq1RcA5ZMzAHS0zsgaOQOw9diuLSuvXLkyTU1Nqa2tbbW8trY2TzzxxGbdxpe+9KUMGTKkVdC8VUNDQ2bMmNGW0dgErTBJWb8HfgfKtC3nTHf/91XS40/sg8Q+SOyDUrO2M7LG8xnoGCUdY5Nyj7Nbm079VMdLL700t9xyS26//fbU1NRscL2pU6emsbGx5Wvp0qWdOCUA2yo5A0BH25yskTMAW482nfE1YMCA9OzZM8uXL2+1fPny5Rk8ePBGt/3nf/7nXHrppfnBD36QAw
44YKPrVldXp7q6ui2jAVAAOQNAR+uMrJEzAFuPNp3x1atXr4waNSoLFy5sWdbc3JyFCxdm3LhxG9zusssuy1e+8pXcfffdGT169JZPC0DR5AwAHU3WAHQvbTrjK0nq6+szadKkjB49OmPGjMnMmTOzevXqTJ48OUkyceLE7L777mloaEiSfPWrX820adMyd+7c1NXVZdmyZUmSnXbaKTvttFM7PhQASiBnAOhosgag+2hz8XXcccflhRdeyLRp07Js2bKMHDkyd999d8vFIZ977rn06PHmiWTXXntt1q5dm2OPPbbV7UyfPj0XXXTR25segOLIGQA6mqwB6D7aXHwlyRlnnJEzzjhjvT+77777Wn3/zDPPbMldANCNyRkAOpqsAegeOvVTHQEAAACgsyi+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACiS4gsAAACAIim+AAAAACjSdl09QGeoO29eV4/Qrp659MiuHgG2SY4FAADbrpL+lvN3HHQeZ3wBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABFUnwBAAAAUCTFFwAAAABF2q6rBwAAAAA2ru68eV09Qrt65tIju3oEuglnfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEVSfAEAAABQJMUXAAAAAEXaouJr1qxZqaurS01NTcaOHZtHHnlko+t/61vfynve857U1NRk//33z5133rlFwwLQPcgZADqarAHoHtpcfN16662pr6/P9OnTs3jx4owYMSITJkzIihUr1rv+j370o5xwwgn59Kc/nUcffTRHH310jj766Dz22GNve3gAyiNnAOhosgag+2hz8XXllVfmM5/5TCZPnpzhw4dn9uzZ2WGHHfL1r399vev/67/+a/7qr/4q//AP/5D3vve9+cpXvpL3v//9ufrqq9/28ACUR84A0NFkDUD3sV1bVl67dm0WLVqUqVOntizr0aNHxo8fn4ceemi92zz00EOpr69vtWzChAm54447Nng/a9asyZo1a1q+b2xsTJKsWrWqLeO2aF7z6hZtt7Xakv1gH9gHSVn7oLs//mTLj4l/uW2lUmmvcdrFtpozSVm/X/592QeJfZDYByXmTNI5WSNnNs2/r+79+BP7ILEPki0/LrYlZ9pUfK1cuTJNTU2pra1ttby2tjZPPPHEerdZtmzZetdftmzZBu+noaEhM2bMWGf50KFD2zJusfrO7OoJup59YB9098eftM8+eOWVV9K3b9+3f0PtRM5sHfz7sg8S+yCxD0rMmaRzskbObJp/X109QdezD+yD5O3vg83JmTYVX51l6tSprV5RaW5uzosvvphdd901VVVVXTjZhq1atSpDhw7N0qVL06dPn64ep0vYB/ZBd3/8ybazDyqVSl555ZUMGTKkq0fpEnJm22Qf2AeJfbCtPH45s+3lTLLt/H51pO6+D7r740/sg2Tb2AdtyZk2FV8DBgxIz549s3z58lbLly9fnsGDB693m8GDB7dp/SSprq5OdXV1q2X9+vVry6hdpk+fPlvtL0ZnsQ/sg+7++JNtYx9sba/AJ3Jmc2wLv1sdzT6wDxL7YFt4/FtjziSdkzXbcs4k28bvV0fr7vuguz/+xD5Itv59sLk506aL2/fq1SujRo3KwoULW5Y1Nzdn4cKFGTdu3Hq3GTduXKv1k2TBggUbXB+A7kvOANDRZA1A99LmtzrW19dn0qRJGT16dMaMGZOZM2dm9erVmTx5cpJk4sSJ2X333dPQ0JAkOeuss3LIIYfkiiuuyJFHHplbbrklP/nJT3L99de37yMBoAhyBoCOJmsAuo82F1/HHXdcXnjhhUybNi3Lli3LyJEjc/fdd7dc7PG5555Ljx5vnkh24IEHZu7cubngggty/vnn553vfGfuuOOO7Lfffu33KLYC1dXVmT59+jqnNHcn9oF90N0ff2IftAc5s35+t+yDxD5I7IPu/vjbi6xZP79f9kF3f/yJfZCUtw+qKlvjZwwDAAAAwNvUpmt8AQAAAMC2QvEFAAAAQJEUXwAAAAAUSfEFAAAAQJEUX+1k1qxZqaurS01NTcaOHZtHHnmkq0fqNA888ECOOuqoDBkyJFVVVbnjjju6eqRO1dDQkA984APZeeedM2jQoBx99NH55S9/2dVjdaprr702BxxwQPr06ZM+ffpk3Lhxueuuu7p6rC516aWXpqqqKmeffXZXj0Ih5Ez3zZlE1siZdckZ2lt3zplE1nT3nElkzVuVlDOKr3Zw6623pr6+PtOnT8/ixYszYsSITJgwIStWrOjq0TrF6tWrM2LEiMyaNaurR+kS999/f6ZMmZKHH344CxYsyJ/+9KccfvjhWb16dVeP1mn22GOPXHrppVm0aFF+8pOf5NBDD83f/M3f5PHHH+/q0brEj3/841x33XU54IADunoUCiFnunfOJLJGzrQmZ2hv3T1nElnT3XMmkTV/qbicqfC2jRkzpjJlypSW75uamipDhgypNDQ0dOFUXSNJ5fbbb+/qMbrUihUrKkkq999/f1eP0qX69+9fufHGG7t6jE73yiuvVN75zndWFixYUDnkkEMqZ511VlePRAHkzJvkzJ/JGjkjZ2hPcqY1WSNn3tAds6bEnHHG19u0du3aLFq0KOPHj29Z1qNHj4wfPz4PPfRQF05GV2lsbEyS7LLLLl08SddoamrKLbfcktWrV2fcuHFdPU6nmzJlSo488shWxwR4O+QM69Ods0bOyBnal5xhfbpzziTdO2tKzJntunqAbd3KlSvT1NSU2traVstra2vzxBNPdNFUdJXm5uacffbZOeigg7Lffvt19Tid6mc/+1nGjRuXP/7xj9lpp51y++23Z/jw4V09Vqe65ZZbsnjx4vz4xz/u6lEoiJzhrbpr1sgZOUPHkDO8VXfNmUTWlJozii9oR1OmTMljjz2WBx98sKtH6XTvfve7s2TJkjQ2Nua2227LpEmTcv/993eboFi6dGnOOuusLFiwIDU1NV09DlCw7po1ckbOAJ2ju+ZM0r2zpuScUXy9TQMGDEjPnj2zfPnyVsuXL1+ewYMHd9FUdIUzzjgj//Vf/5UHHngge+yxR1eP0+l69
eqVd7zjHUmSUaNG5cc//nH+9V//Ndddd10XT9Y5Fi1alBUrVuT9739/y7KmpqY88MADufrqq7NmzZr07NmzCydkWyVn+EvdOWvkjJyhY8gZ/lJ3zpmke2dNyTnjGl9vU69evTJq1KgsXLiwZVlzc3MWLlzY7d4L3F1VKpWcccYZuf3223PPPfdk2LBhXT3SVqG5uTlr1qzp6jE6zWGHHZaf/exnWbJkScvX6NGj8/d///dZsmTJNhsSdD05QyJr1kfOyBnah5whkTMb0p2ypuScccZXO6ivr8+kSZMyevTojBkzJjNnzszq1aszefLkrh6tU/zhD3/Ik08+2fL9008/nSVLlmSXXXbJnnvu2YWTdY4pU6Zk7ty5+c53vpOdd945y5YtS5L07ds3vXv37uLpOsfUqVPzsY99LHvuuWdeeeWVzJ07N/fdd1++//3vd/VonWbnnXde5xoIO+64Y3bddddud20E2p+c6d45k8gaOSNn6FjdPWcSWdPdcyaRNUXnTBd/qmQxrrrqqsqee+5Z6dWrV2XMmDGVhx9+uKtH6jT33ntvJck6X5MmTerq0TrF+h57ksqcOXO6erROc/LJJ1f22muvSq9evSoDBw6sHHbYYZX58+d39VhdrpSP/2XrIGe6b85UKrJGzqyfnKE9deecqVRkTXfPmUpF1qxPKTlTValUKh3ergEAAABAJ3ONLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEiKLwAAAACKpPgCAAAAoEj/H+VPR2iKgpcwAAAAAElFTkSuQmCC",
"text/plain": [
"<Figure size 1500x500 with 3 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"distributions = {\n",
" \"normal_dist\": np.array([0.1, 0.2, 0.4, 0.2, 0.1]),\n",
" \"uniform_dist\": np.array([0.2, 0.2, 0.2, 0.2, 0.2]),\n",
" \"random_dist\": np.array([0.4, 0.1, 0.05, 0.20, 0.25]),\n",
"}\n",
"possible_events = list(range(5))\n",
"\n",
"fig, axes = plt.subplots(nrows=1, ncols=3, figsize=(15, 5))\n",
"for i, (name, dist) in enumerate(distributions.items()):\n",
" assert np.allclose(sum(dist), 1.0)\n",
" assert len(dist) == 5\n",
" axes[i].set_title(name)\n",
" axes[i].set_ylim(bottom=0, top=1)\n",
" axes[i].bar(x=possible_events, height=dist)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Information\n",
"Information can be interpreted as a measure for the rarity of an event (the reduction of uncertainty when that event is transmitted):\n",
"- High information for low probability events\n",
"- Low information for high probability events\n",
"$$\n",
"h(x) = -\\log P(x)\n",
"$$\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"normal_dist\n",
"h(0) = 2.3025850929940455\n",
"h(1) = 1.6094379124341003\n",
"h(2) = 0.916290731874155\n",
"h(3) = 1.6094379124341003\n",
"h(4) = 2.3025850929940455\n",
"\n",
"uniform_dist\n",
"h(0) = 1.6094379124341003\n",
"h(1) = 1.6094379124341003\n",
"h(2) = 1.6094379124341003\n",
"h(3) = 1.6094379124341003\n",
"h(4) = 1.6094379124341003\n",
"\n",
"random_dist\n",
"h(0) = 0.916290731874155\n",
"h(1) = 2.3025850929940455\n",
"h(2) = 2.995732273553991\n",
"h(3) = 1.6094379124341003\n",
"h(4) = 1.3862943611198906\n",
"\n"
]
}
],
"source": [
"h = lambda p, x: -np.log(p[x])\n",
"\n",
"for name, dist in distributions.items():\n",
" print(name)\n",
" for x in possible_events:\n",
" print(f\"h({x}) = {h(dist, x)}\")\n",
" print()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Entropy\n",
"Entropy can be interpreted as a measure of uncertainty of a distribution (the average amount of information from sampling the distribution once):\n",
"\n",
"- High entropy if there is lots of uncertainty (uniform distribution)\n",
"- Low entropy if there is little uncertainty (skewed distribution)\n",
"$$\n",
"H(P) = - \\sum P(x) * \\log(P(x))\n",
"$$"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"normal_dist = 1.4708084763221112\n",
"uniform_dist = 1.6094379124341005\n",
"random_dist = 1.415022588493559\n"
]
}
],
"source": [
"H = lambda p: -sum([p_x * np.log(p_x)for p_x in p])\n",
"\n",
"for name, P in distributions.items():\n",
" print(f\"{name} = {H(P)}\")"
]
},
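{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that the uniform distribution has the highest entropy of the three. In fact, a uniform distribution over $n$ events maximizes entropy, with $H(P) = \\log(n)$. A quick sanity check (a small sketch reusing the `H` helper above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# entropy of a uniform distribution over n events is log(n)\n",
"assert np.allclose(H(distributions[\"uniform_dist\"]), np.log(5))"
]
},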
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Cross Entropy\n",
"\n",
"From an information theory perspective, cross entropy can be interpreted as the average amount of information we get about the true distribution $P$ when sampling from an estimated distribution $Q$. It looks very similar to the entropy, but we just replace $\\log(P(x))$ with $\\log(Q(x))$:\n",
"$$\n",
"H(P, Q) = -\\sum P(x) * (\\log(Q(x)))\n",
"$$\n",
"\n",
"Notice if $P = Q$, then cross entropy is equal to entropy ($H(P, Q) = H(P)$). One useful property of cross entropy is that as $P$ and $Q$ diverge (become less alike), cross entropy grows ($H(P, Q) > H(P)$). As such, cross entropy can _sort of_ be interpreted as a measure of similarity between two distributions. This becomes more clear when it's written as:\n",
"$$\n",
"H(P, Q) = D_{KL}(P || Q) + H(P)\n",
"$$\n",
"\n",
"where $D_{KL}(P || Q)$ is the _relative entropy_ (also called Kullback–Leibler (KL) divergence) between $P$ and $Q$."
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"crossH(normal_dist, normal_dist) = 1.4708084763221112\n",
"crossH(normal_dist, uniform_dist) = 1.6094379124341003\n",
"crossH(normal_dist, random_dist) = 2.21095601980663\n",
"crossH(uniform_dist, normal_dist) = 1.7480673485460896\n",
"crossH(uniform_dist, uniform_dist) = 1.6094379124341005\n",
"crossH(uniform_dist, random_dist) = 1.8420680743952367\n",
"crossH(random_dist, normal_dist) = 2.0253262207700673\n",
"crossH(random_dist, uniform_dist) = 1.6094379124341005\n",
"crossH(random_dist, random_dist) = 1.415022588493559\n"
]
}
],
"source": [
"crossH = lambda p, q: -sum([p_x * np.log(q_x) for p_x, q_x in zip(p, q)])\n",
"\n",
"import itertools\n",
"for (name1, p), (name2, q) in itertools.product(distributions.items(), repeat=2):\n",
" print(f\"crossH({name1}, {name2}) = {crossH(p, q)}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### KL Divergence\n",
"KL Divergence can be interpreted as a measure of similarity between two distributions (how much extra information do I need to represent Q given P):\n",
"- High if the two distributions are dissimilar\n",
"- Low if the two distributions are similar\n",
"$$\n",
"D_{KL}(P || Q) = \\sum P(x) * (\\log(P(x)) - \\log(Q(x)))\n",
"$$\n",
"Sometimes also written as:\n",
"$$\n",
"\\sum P(x) * \\log(\\frac{P(x)}{Q(x)})\n",
"$$\n",
"or\n",
"$$\n",
"-\\sum P(x) * \\log(\\frac{Q(x)}{P(x)})\n",
"$$\n"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"KLD(normal_dist, normal_dist) = 0.0\n",
"KLD(normal_dist, uniform_dist) = 0.13862943611198902\n",
"KLD(normal_dist, random_dist) = 0.7401475434845188\n",
"KLD(uniform_dist, normal_dist) = 0.13862943611198905\n",
"KLD(uniform_dist, uniform_dist) = 0.0\n",
"KLD(uniform_dist, random_dist) = 0.23263016196113617\n",
"KLD(random_dist, normal_dist) = 0.6103036322765085\n",
"KLD(random_dist, uniform_dist) = 0.19441532394054145\n",
"KLD(random_dist, random_dist) = 0.0\n"
]
}
],
"source": [
"KLD = lambda p, q: sum([p_x * (np.log(p_x) - np.log(q_x)) for p_x, q_x in zip(p, q)])\n",
"\n",
"import itertools\n",
"for (name1, p), (name2, q) in itertools.product(distributions.items(), repeat=2):\n",
" print(f\"KLD({name1}, {name2}) = {KLD(p, q)}\")"
]
},
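{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sanity check, here's a quick sketch verifying the decomposition $H(P, Q) = D_{KL}(P || Q) + H(P)$ numerically for every pair of example distributions (reusing the `H`, `crossH`, and `KLD` helpers defined above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# verify H(P, Q) = KLD(P || Q) + H(P) for all pairs of example distributions\n",
"for (_, p), (_, q) in itertools.product(distributions.items(), repeat=2):\n",
"    assert np.allclose(crossH(p, q), KLD(p, q) + H(p))"
]
},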
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Loss Functions\n",
"$$\n",
"\\text{CCELoss}(\\bm{p}, y) = -\\log(\\bm{p}_y)\n",
"$$\n",
"where $\\bm{p}$ is the predicted probability distribution vector and $y$ is the index of the correct class.\n",
"\n",
"$$\n",
"\\text{BCELoss}(p, y) = -\\log(1 - p)(1 - y) - \\log(p)y\n",
"$$\n",
"where $p$ is the predicted probability and $y$ is the label, either 1 or 0.\n",
"\n",
"$$\n",
"\\text{KLDivLoss}(\\bm{p}, \\bm{y}) = \\sum_{i} \\bm{y}_i * (\\log(\\bm{y}_i) - \\log(\\bm{p}_i))\n",
"$$\n",
"where $\\bm{p}$ is the predicted probability distribution vector and $\\bm{y}$ is the true probability distribution vector"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"def log(x, eps=1e-12):\n",
" # safe version of log where log(0) -> very large negative number rather than -inf,\n",
" # which causes numerical instability\n",
" return np.log(x + eps)\n",
"\n",
"def cce_loss(p, y):\n",
" return -log(p[y])\n",
"\n",
"def bce_loss(p, y):\n",
" return -log(1 - p)*(1 - y) - log(p)*y\n",
"\n",
"def kld_loss(p, y):\n",
" return np.sum(y * (log(y) - log(p)))"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"p = 0.0\n",
"bce_loss(p, 0) = -1.000088900581841e-12\n",
"bce_loss(p, 1) = 27.631021115928547\n",
"\n",
"p = 0.1\n",
"bce_loss(p, 0) = 0.10536051565671518\n",
"bce_loss(p, 1) = 2.3025850929840455\n",
"\n",
"p = 0.2\n",
"bce_loss(p, 0) = 0.22314355131295974\n",
"bce_loss(p, 1) = 1.6094379124291003\n",
"\n",
"p = 0.3\n",
"bce_loss(p, 0) = 0.3566749439373039\n",
"bce_loss(p, 1) = 1.2039728043226028\n",
"\n",
"p = 0.4\n",
"bce_loss(p, 0) = 0.510825623764324\n",
"bce_loss(p, 1) = 0.9162907318716551\n",
"\n",
"p = 0.5\n",
"bce_loss(p, 0) = 0.6931471805579453\n",
"bce_loss(p, 1) = 0.6931471805579453\n",
"\n",
"p = 0.6\n",
"bce_loss(p, 0) = 0.9162907318716551\n",
"bce_loss(p, 1) = 0.510825623764324\n",
"\n",
"p = 0.7\n",
"bce_loss(p, 0) = 1.2039728043226026\n",
"bce_loss(p, 1) = 0.3566749439373039\n",
"\n",
"p = 0.8\n",
"bce_loss(p, 0) = 1.6094379124291005\n",
"bce_loss(p, 1) = 0.22314355131295974\n",
"\n",
"p = 0.9\n",
"bce_loss(p, 0) = 2.302585092984046\n",
"bce_loss(p, 1) = 0.10536051565671518\n",
"\n",
"p = 1.0\n",
"bce_loss(p, 0) = 27.631021115928547\n",
"bce_loss(p, 1) = -1.000088900581841e-12\n",
"\n"
]
}
],
"source": [
"for p in np.arange(11) / 10:\n",
" print(f\"p = {p}\")\n",
" print(\"bce_loss(p, 0) =\", bce_loss(p, 0))\n",
" print(\"bce_loss(p, 1) =\", bce_loss(p, 1))\n",
" print()"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"--- normal_dist ---\n",
"target = 0\n",
"cce_loss = 2.3025850929840455\n",
"kld_loss = 2.3025850929850455\n",
"\n",
"target = 1\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 2\n",
"cce_loss = 0.9162907318716551\n",
"kld_loss = 0.9162907318726552\n",
"\n",
"target = 3\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 4\n",
"cce_loss = 2.3025850929840455\n",
"kld_loss = 2.3025850929850455\n",
"\n",
"\n",
"--- uniform_dist ---\n",
"target = 0\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 1\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 2\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 3\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 4\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"\n",
"--- random_dist ---\n",
"target = 0\n",
"cce_loss = 0.9162907318716551\n",
"kld_loss = 0.9162907318726552\n",
"\n",
"target = 1\n",
"cce_loss = 2.3025850929840455\n",
"kld_loss = 2.3025850929850455\n",
"\n",
"target = 2\n",
"cce_loss = 2.995732273533991\n",
"kld_loss = 2.995732273534991\n",
"\n",
"target = 3\n",
"cce_loss = 1.6094379124291003\n",
"kld_loss = 1.6094379124301004\n",
"\n",
"target = 4\n",
"cce_loss = 1.3862943611158907\n",
"kld_loss = 1.3862943611168907\n",
"\n",
"\n"
]
}
],
"source": [
"# notice, cce_loss and kld_loss are interchangeable for the categorical case\n",
"for name, dist in distributions.items():\n",
" print(f\"--- {name} ---\")\n",
" for target in range(len(dist)):\n",
" target_one_hot = np.zeros(len(dist))\n",
" target_one_hot[target] = 1\n",
" print(f\"target = {target}\")\n",
" print(\n",
" f\"cce_loss = {cce_loss(dist, target)}\",\n",
" f\"kld_loss = {kld_loss(dist, target_one_hot)}\",\n",
" sep=\"\\n\",\n",
" )\n",
" print()\n",
" print()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### In Practice\n",
"> What do we do about $0 * \\log(0)$?\n",
"\n",
"We treat the output as 0, since $\\lim_{x \\rightarrow 0} x * \\log(x) = 0$\n",
"\n",
"> What do we do about $\\log(0)$?\n",
"\n",
"We replace $0$ with $\\epsilon$, which we can choose to set to a very small number, say $10^{-12}$.\n",
"\n",
"> How do you calculate the loss for the entire dataset, not just one example?\n",
"\n",
"We just take the mean of the per loss examples. For example, our equation for BCELoss becomes:\n",
"$$\n",
"\\text{BCELoss}(\\bm{p}, \\bm{y}) = \\frac{1}{N}\\sum_i^N-\\log(1 - \\bm{p}_i)(1 - \\bm{y}_i) - \\log(\\bm{p}_i)\\bm{y}_i\n",
"$$\n",
"\n",
"> How do we account for numerical instability and floating point errors?\n",
"\n",
"Neural networks output unnormalized probabilities (which are often called logits). To convert the logits to probabilities, we use the softmax function.\n",
"\n",
"$$\n",
"\\text{softmax}(\\bm{x})_i = \\frac{e^{\\bm{x}_i}}{\\sum_j e^{\\bm{x}_j}}\n",
"$$\n",
"\n",
"The above formulation of softmax is very numerical instable due to the value of the exponents being really large (and therefore may cause floating point errors). To preven this, we can take advantage of some properties of exponents:\n",
"\n",
"$$\n",
"\\begin{align}\n",
"\\text{softmax}(\\bm{x})_i\n",
"& = \\frac{e^{\\bm{x}_i}}{\\sum_j e^{\\bm{x}_j}} \\\\\n",
"& = \\frac{C}{C}\\frac{e^{\\bm{x}_i}}{\\sum_j e^{\\bm{x}_j}} \\\\\n",
"& = \\frac{Ce^{\\bm{x}_i}}{\\sum_j Ce^{\\bm{x}_j}} \\\\\n",
"& = \\frac{e^{\\bm{x}_i + \\log(C)}}{\\sum_j e^{\\bm{x}_j + \\log(C)}} \\\\\n",
"\\end{align}\n",
"$$\n",
"\n",
"If we set $\\log(C) = -\\max(\\bm{x})$, we can control our exponentiated values to be between 0 and 1, reducing numerical instability. However, even with this trick, taking the log after softmax is still quite numerically instable. However, if we combine the two together directly and do some manipulation:\n",
"\n",
"$$\n",
"\\begin{align}\n",
"\\text{log\\_softmax}(\\bm{x})_i\n",
"= & \\log(\\frac{e^{\\bm{x}_i + \\log(C)}}{\\sum_j e^{\\bm{x}_j + \\log(C)}}) \\\\\n",
"= & \\log(e^{\\bm{x}_i + \\log(C)}) - \\log(\\sum_j e^{\\bm{x}_j + \\log(C)}) \\\\\n",
"= & (\\bm{x}_i + \\log(C))\\log(e) - \\log(\\sum_j e^{\\bm{x}_j + \\log(C)}) \\\\\n",
"= & \\bm{x}_i + \\log(C) - \\log(\\sum_j e^{\\bm{x}_j + \\log(C)}) \\\\\n",
"= & \\bm{x}_i - \\max(\\bm{x}) - \\log(\\sum_j e^{\\bm{x}_j - \\max(\\bm{x})})\n",
"\\end{align}\n",
"$$"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def softmax_unstable(x):\n",
" return np.exp(x) / np.sum(np.exp(x))\n",
"\n",
"def softmax(x):\n",
" x_max = np.max(x)\n",
" return np.exp(x - x_max) / np.sum(np.exp(x - x_max))\n",
"\n",
"def log_softmax(x):\n",
" x_max = np.max(x)\n",
" return x - x_max - np.log(np.sum(np.exp(x - x_max)))"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"softmax_unstable(x): [0.61489897 0.23445245 0.09552892 0.05511966]\n",
"\n",
"softmax(x): [0.61489897 0.23445245 0.09552892 0.05511966]\n",
"\n",
"log(softmax(x)): [-0.48629729 -1.45050249 -2.34832625 -2.89824887]\n",
"\n",
"log_softmax(x): [-0.48629729 -1.45050249 -2.34832625 -2.89824887]\n"
]
}
],
"source": [
"x = np.random.normal(size=(4))\n",
"print(\n",
" f\"softmax_unstable(x): {softmax_unstable(x)}\",\n",
" f\"softmax(x): {softmax(x)}\",\n",
" f\"log(softmax(x)): {np.log(softmax(x))}\",\n",
" f\"log_softmax(x): {log_softmax(x)}\",\n",
" sep=\"\\n\\n\",\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"softmax_unstable(x): [nan 0. 0. nan]\n",
"\n",
"softmax(x): [1. 0. 0. 0.]\n",
"\n",
"log(softmax(x)): [ 0. -inf -inf -inf]\n",
"\n",
"log_softmax(x): [ 0. -5537.00322633 -7592.30679862 -1077.17353227]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"/var/folders/51/h90b7_jx6sz6vphspgd1921h0000gp/T/ipykernel_68202/809860734.py:4: RuntimeWarning: overflow encountered in exp\n",
" return np.exp(x) / np.sum(np.exp(x))\n",
"/var/folders/51/h90b7_jx6sz6vphspgd1921h0000gp/T/ipykernel_68202/809860734.py:4: RuntimeWarning: invalid value encountered in divide\n",
" return np.exp(x) / np.sum(np.exp(x))\n",
"/var/folders/51/h90b7_jx6sz6vphspgd1921h0000gp/T/ipykernel_68202/993447327.py:5: RuntimeWarning: divide by zero encountered in log\n",
" f\"log(softmax(x)): {np.log(softmax(x))}\",\n"
]
}
],
"source": [
"x = np.random.normal(size=(4)) * 10000\n",
"print(\n",
" f\"softmax_unstable(x): {softmax_unstable(x)}\",\n",
" f\"softmax(x): {softmax(x)}\",\n",
" f\"log(softmax(x)): {np.log(softmax(x))}\",\n",
" f\"log_softmax(x): {log_softmax(x)}\",\n",
" sep=\"\\n\\n\",\n",
")"
]
},
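{
"cell_type": "markdown",
"metadata": {},
"source": [
"Putting this together, here's a minimal sketch of computing the categorical cross entropy loss directly from the logits via `log_softmax`, so the whole computation stays numerically stable (the helper name `cce_loss_from_logits` is ours for illustration):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sketch: cross entropy straight from the logits, avoiding log(softmax(logits))\n",
"def cce_loss_from_logits(logits, y):\n",
"    return -log_softmax(logits)[y]\n",
"\n",
"logits = np.random.normal(size=(4))\n",
"print(cce_loss_from_logits(logits, 2))\n",
"print(cce_loss(softmax(logits), 2))  # nearly identical for moderate logits"
]
},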
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Torch Loss Functions"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"from torch.nn.functional import cross_entropy, kl_div, binary_cross_entropy, nll_loss"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [],
"source": [
"def apply_over_batch_and_cast_to_numpy(f):\n",
" return lambda X, Y: sum(f(np.array(x), np.array(y)) for x, y in zip(X, Y)) / len(X)\n",
"\n",
"batch_cce_loss = apply_over_batch_and_cast_to_numpy(cce_loss)\n",
"batch_bce_loss = apply_over_batch_and_cast_to_numpy(bce_loss)\n",
"batch_kld_loss = apply_over_batch_and_cast_to_numpy(kld_loss)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Binary Classification"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"torch: 1.1347017288208008\n",
"numpy: 1.1347016079713899\n"
]
}
],
"source": [
"# comparing binary cross entropy\n",
"batch_size = 64\n",
"\n",
"p = torch.rand(size=(batch_size,))\n",
"y = torch.randint(low=0, high=2, size=(batch_size,)).float()\n",
"\n",
"print(\n",
" f\"torch: {binary_cross_entropy(p, y).item()}\",\n",
" f\"numpy: {batch_bce_loss(p, y)}\",\n",
" sep=\"\\n\"\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Multi-class Classification\n",
"Notice:\n",
"* All three of `nll_loss`, `cross_entropy` and `kl_div` have the same result.\n",
"* This is because `cross_entropy(p, q) = kl_div(p, q) + entropy(p)`, but for multi-class classification `entropy(p) = 0` (since `p` is a one-hot vector)\n",
"* `torch` does not allow you to compute cross entropy loss using probabilities, instead:\n",
" * Use `cross_entropy` on the logits (raw unnormalized output)\n",
" * Use `nll_loss` on the log probabilities (i.e. on the output of `log_softmax(logits)`)\n",
" * Use `kl_div` on the log probabilities (same as above), however, you'll need to convert `y` to a one hot encoded vector\n",
"* The reason torch doesn't let you use probabilities directly is to prevent `log(softmax(logits))` and instead use `log_softmax(logits)` which is a lot more stable."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"--- torch functions ---\n",
"cross_entropy: 2.743520498275757\n",
"nll_loss: 2.743520498275757\n",
"kl_div: 2.743520498275757\n",
"--- numpy functions ---\n",
"batch_cce_loss: 2.74352032315047\n",
"batch_kld_loss: 2.743520326912403\n"
]
}
],
"source": [
"# comparing nll_loss, cross_entropy, kl_div for categorical data\n",
"batch_size = 64\n",
"n_classes = 10\n",
"\n",
"logits = torch.randn(size=(batch_size, n_classes))\n",
"p = torch.softmax(logits, axis=-1)\n",
"log_p = torch.log_softmax(logits, axis=-1)\n",
"\n",
"y = torch.randint(low=0, high=n_classes, size=(batch_size,))\n",
"y_one_hot = torch.eye(n_classes)[y]\n",
"\n",
"print(\n",
" \"--- torch functions ---\",\n",
" f\"cross_entropy: {cross_entropy(logits, y).item()}\",\n",
" f\"nll_loss: {nll_loss(log_p, y).item()}\",\n",
" f\"kl_div: {kl_div(log_p, y_one_hot, reduction='batchmean').item()}\",\n",
" \"--- numpy functions ---\",\n",
" f\"batch_cce_loss: {batch_cce_loss(p, y)}\",\n",
" f\"batch_kld_loss: {batch_kld_loss(p, y_one_hot)}\",\n",
" sep=\"\\n\"\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Fitting on Probability Distributions\n",
"Cross entropy loss only allows you to fit on probability distributions where all the \"mass\" is on one event (i.e. a one hot vector). If your network needs to learn _any_ distribution, then you'll need to use `kl_div` loss. In this case, the output of your network and the labels are both prob distributions."
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"torch: 0.8245439529418945\n",
"numpy: 0.8245439636521041\n"
]
}
],
"source": [
"# comparing kl_div on distribution targets\n",
"batch_size = 64\n",
"n_classes = 10\n",
"\n",
"logits = torch.randn(size=(batch_size, n_classes))\n",
"p = torch.softmax(logits, axis=-1)\n",
"log_p = torch.log_softmax(logits, axis=-1)\n",
"\n",
"y = torch.softmax(torch.randn(size=(batch_size, n_classes)), axis=-1)\n",
"\n",
"print(\n",
" f\"torch: {kl_div(log_p, y, reduction='batchmean').item()}\",\n",
" f\"numpy: {batch_kld_loss(p, y)}\",\n",
" sep=\"\\n\"\n",
")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3.9.10 64-bit ('3.9.10')",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.10"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "fd09b19eb83f586d348350b5c89c7a987a0d039b02a538583d56ff9c88f80cb0"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}