{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**An autoregressive model of order $n$, denoted $\\text{AR}(n)$**, can be defined as:\n",
"\n",
"$$x_t = b + \\varphi_1 x_{t-1} + \\varphi_2 x_{t-2} + \\dots + \\varphi_n x_{t-n} + \\varepsilon_t$$\n",
"\n",
"Using summation notation, this can be written as:\n",
"\n",
"$$x_t = b + \\sum_{i=1}^n \\varphi_i x_{t-i} + \\varepsilon_t$$\n",
"\n",
"where $b$ is the bias term, $x_t$ is the current value, $x_{t-1}, x_{t-2}, \\ldots, x_{t-n}$ are the $n$ previous values, $\\varphi_1, \\varphi_2, \\dots, \\varphi_n$ are the model parameters, and $\\varepsilon_t$ is the error term (white noise).\n",
"\n",
"During training, the model learns the optimal parameters $\\varphi_1, \\varphi_2, \\ldots, \\varphi_n$ by minimizing a loss function, such as the **Mean Squared Error (MSE)**:\n",
"\n",
"$$L(\\varphi) = \\frac{1}{N} \\sum_{t=1}^N (x_t - \\hat{x}_t)^2$$\n",
"\n",
"where $x_t$ is the true value, $\\hat{x}_t$ is the predicted value, and $N$ is the total number of time steps (written as $N$ to avoid clashing with the model order $n$)."
]
}
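,
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# A minimal sketch (not part of the original gist): simulate an AR(2) process\n",
"# and recover b and the phi coefficients by ordinary least squares, which\n",
"# minimizes the MSE loss defined above.\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(0)\n",
"\n",
"# True parameters: x_t = b + phi_1 * x_{t-1} + phi_2 * x_{t-2} + eps_t\n",
"b, phi = 0.5, np.array([0.6, -0.3])\n",
"T = 5000\n",
"x = np.zeros(T)\n",
"for t in range(2, T):\n",
"    x[t] = b + phi[0] * x[t - 1] + phi[1] * x[t - 2] + rng.normal(scale=0.1)\n",
"\n",
"# Design matrix of lagged values; lstsq minimizes the squared-error loss\n",
"X = np.column_stack([np.ones(T - 2), x[1:-1], x[:-2]])\n",
"y = x[2:]\n",
"b_hat, phi1_hat, phi2_hat = np.linalg.lstsq(X, y, rcond=None)[0]\n",
"print(b_hat, phi1_hat, phi2_hat)  # estimates should be close to 0.5, 0.6, -0.3"
]
}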
],
"metadata": {
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 2
}