@moewew
Created August 21, 2018 11:16
\documentclass[a4paper,12pt]{scrartcl}
\usepackage[english]{babel} % language support
\usepackage{amsthm} % theorem environments
\usepackage{mathtools} % amsmath plus \DeclarePairedDelimiter
\usepackage{aligned-overset} % allows alignment at \overset inside align
\newcommand{\Lagr}{\mathcal{L}} % objective/Lagrangian
\newcommand{\matr}[1]{\mathbf{#1}} % bold matrices/vectors
\newcommand{\X}{\matr{X}} % matrix X
\newcommand{\y}{\matr{y}} % y as full vector
\newcommand{\yct}{\ubar{\y}^\mathbf{T}} % centered; note: \ubar is not defined in this preamble (\yct is unused below, so the document still compiles)
\newcommand{\betahat}{\hat{\beta}} % beta hat
\DeclarePairedDelimiter{\abs}{\lvert}{\rvert}
\begin{document}
\begin{align*}
\Lagr(\beta_{0},\beta)
&= \frac{1}{2} \sum_{i=1}^N \Bigl(y_{i}-\beta_{0}
- \sum_{j=1}^p\beta_{j}\tilde{x}_{ij}\Bigr)^{\!2}
+ \lambda\sum_{j=1}^p \abs{\beta_{j}}\\
&= \frac{1}{2} \sum_{i=1}^N
\Bigl(y_{i}-\beta_{0} -
\sum_{j=1}^p\beta_{j}(x_{ij}-\bar{x}_{j})\Bigr)^{\!2}
+\lambda\sum_{j=1}^p \abs{\beta_{j}}
\end{align*}
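% Centering convention assumed for the predictors (not stated explicitly above).
Here $\tilde{x}_{ij}$ is assumed to denote the centered predictor,
\[
\tilde{x}_{ij} = x_{ij}-\bar{x}_{j},
\qquad
\bar{x}_{j} = \frac{1}{N}\sum_{i=1}^N x_{ij},
\]
so the second equality is simply this substitution. Differentiating with
respect to $\beta_{0}$ (with hats marking the values at the minimum) gives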
\begin{align*}
\frac{\partial \Lagr}{\partial \beta_{0}}
&=-\sum_{i=1}^N \Bigl(y_{i}-\betahat_{0}
- \sum_{j=1}^p\betahat_{j}(x_{ij}-\bar{x}_{j})\Bigr)\\
&= -\sum_{i=1}^{N}y_i +N\betahat_{0}
+\sum_{j=1}^p \Bigl(\betahat_{j}\sum_{i=1}^N x_{ij} \Bigr)
- N\sum_{j=1}^p\betahat_{j}\bar{x}_{j}\\
&= -N\bar{y}+N\betahat_{0}+\sum_{j=1}^p\betahat_{j}N\bar{x}_{j}
-N\sum_{j=1}^p\betahat_{j}\bar{x}_{j}\\
&= -N\bar{y}+N\betahat_{0}
\end{align*}
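% First-order condition for the intercept, using \bar{y} = (1/N) \sum_i y_i.
Setting this derivative to zero gives
\[
0 = -N\bar{y}+N\betahat_{0},
\qquad
\bar{y} = \frac{1}{N}\sum_{i=1}^N y_{i}.
\]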
At the zero of the derivative we must therefore have
\[ \betahat_0 = \bar{y}. \]
From the assumption that $\y$ is centered, it follows that the critical point satisfies
\[\betahat_{0}=0.\]
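% ``Centered'' is read here as zero empirical mean.
Spelled out: centering $\y$ means $\sum_{i=1}^N y_{i}=0$, hence $\bar{y}=0$
and thus $\betahat_{0}=\bar{y}=0$.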
\end{document}