#+OPTIONS: ^:{} H:1 toc:nil LaTeX:dvipng
** part 1.2
For 0 <= m <= 5 the fitted polynomials make sense (m = 0 simply
reproduces the sample mean).
For m = 19, the interpolation between -0.7 and 0.3 looks quite
reasonable, but outside this interval the polynomial displays spikes
that are not coherent with the sample data, which demonstrates how
maximum-likelihood estimation is prone to overfitting.
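The effect can be reproduced with a short sketch (in Python rather than the report's MATLAB; the noisy-sinusoid sample below is a hypothetical stand-in for the course data, which is not reproduced here): the least-squares solution is the Gaussian MLE, and the training error can only shrink as the degree m grows, so a high-degree fit hugs every sample point and oscillates wildly in between.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the sample data: a noisy sinusoid on [-1, 1].
x = rng.uniform(-1, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)

def mle_poly_fit(x, y, m):
    """Least-squares (= Gaussian MLE) fit of a degree-m polynomial."""
    A = np.vander(x, m + 1, increasing=True)  # design matrix
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def train_rmse(x, y, w):
    A = np.vander(x, w.size, increasing=True)
    return np.sqrt(np.mean((A @ w - y) ** 2))

# Nested models: training error is non-increasing in the degree m.
errs = [train_rmse(x, y, mle_poly_fit(x, y, m)) for m in (1, 5, 19)]
```

The degree-19 coefficients obtained this way are large in magnitude, which is what produces the spikes outside the well-sampled interval.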
** part 1.3
The training error is minimal for m = 20 and the testing error for m = 13.
[[./part_1.3-figure1.png]]
Given the distribution of the data points, it seems unlikely that the
prediction will be valid outside [-0.9, 0.9].
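The train/test comparison behind these numbers can be sketched as follows (again in Python, with hypothetical synthetic splits standing in for the report's data files): fit each degree on the training set, then evaluate the RMSE on both sets.

```python
import numpy as np

def make_set(n, seed):
    """Hypothetical data generator standing in for the course data."""
    r = np.random.default_rng(seed)
    x = r.uniform(-0.9, 0.9, n)
    y = np.sin(2 * np.pi * x) + r.normal(0, 0.2, n)
    return x, y

xtr, ytr = make_set(30, 1)   # training set
xte, yte = make_set(100, 2)  # testing set

def rmse(x, y, w):
    A = np.vander(x, w.size, increasing=True)
    return np.sqrt(np.mean((A @ w - y) ** 2))

train_err, test_err = [], []
for m in range(21):
    A = np.vander(xtr, m + 1, increasing=True)
    w, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    train_err.append(rmse(xtr, ytr, w))
    test_err.append(rmse(xte, yte, w))
```

The training error decreases monotonically with m, while the test error typically bottoms out at an intermediate degree before rising again, which is how the optimal m is read off the plot.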
** part 2.2
For m = 20 the interpolation still suffers from overfitting: it
tries to match too many training points.
** part 3
\begin{equation}
w_{\text{MAP}} = \left(\sigma_0^2 A^T A + \sigma^2 I\right)^{-1} \sigma_0^2 A^T y
\end{equation}
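A direct transcription of this closed form (a Python sketch; the names =A=, =sigma2=, =sigma02= are assumptions, with A the design matrix, sigma2 the noise variance, and sigma02 the prior variance):

```python
import numpy as np

def w_map(A, y, sigma2, sigma02):
    """MAP weights for a Gaussian likelihood (noise variance sigma2)
    and a zero-mean Gaussian prior (variance sigma02):
    w = (sigma02 A^T A + sigma2 I)^{-1} sigma02 A^T y."""
    d = A.shape[1]
    return np.linalg.solve(sigma02 * A.T @ A + sigma2 * np.eye(d),
                           sigma02 * A.T @ y)
```

Dividing through by sigma02 shows this is ridge regression with lambda = sigma2/sigma02; as the prior variance grows, the solution approaches the MLE.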
** part 4
#+include: "main.m" src matlab :lines "204-207"
*** standard hyperparameters
#+caption: Hyperparameter values: v=100, v_0=400
[[./part_4.png]]

*** overfitting
#+caption: Hyperparameter values: v=1, v_0=400
[[./part_4-overfit.png]]
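The role of the hyperparameters can be sketched with the same closed form (a Python illustration on hypothetical synthetic data, reading v as the noise variance and v_0 as the prior variance, as in the captions above): shrinking v from 100 to 1 makes the model trust the noisy samples far more than the prior, so the training error drops and the curve turns wiggly, which is the overfitting seen in the second figure.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical sample standing in for the report's data.
x = rng.uniform(-1, 1, 25)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)
A = np.vander(x, 21, increasing=True)  # degree-20 basis, as in part 4

def map_fit(v, v0):
    """w = (v0 A^T A + v I)^{-1} v0 A^T y  (v: noise var, v0: prior var)."""
    d = A.shape[1]
    return np.linalg.solve(v0 * A.T @ A + v * np.eye(d), v0 * A.T @ y)

def train_rmse(w):
    return np.sqrt(np.mean((A @ w - y) ** 2))

w_std = map_fit(100, 400)  # standard hyperparameters
w_over = map_fit(1, 400)   # weak noise assumption -> overfitting
```

Since the effective ridge penalty is lambda = v/v_0, the v = 1 fit is regularized 100 times less than the v = 100 fit, so its training error is necessarily no larger.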