Remove glossary

This commit is contained in:
Alexander Hess 2020-11-30 18:42:54 +01:00
commit 96a3b242c0
Signed by: alexander
GPG key ID: 344EA5AB10D868E0
16 changed files with 40 additions and 193 deletions

@@ -5,7 +5,7 @@ Because ML models are trained by minimizing a loss function $L$, the
resulting value of $L$ underestimates the true error we see when
predicting into the actual future by design.
To counter that, one popular and model-agnostic approach is cross-validation
-(\gls{cv}), as summarized, for example, by \cite{hastie2013}.
+(CV), as summarized, for example, by \cite{hastie2013}.
CV is a resampling technique that randomly splits the samples into a
training and a test set.
Trained on the former, an ML model makes forecasts on the latter.
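
The random split described in the excerpt can be sketched as follows (a minimal illustration, not part of the commit; the function name and parameters are hypothetical):

```python
import random

def train_test_split(samples, test_fraction=0.25, seed=42):
    """Randomly split samples into a training set and a test set.

    A model would be trained on the first returned list and
    evaluated by forecasting on the second, as described above.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    shuffled = list(samples)
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    # Test set comes from the shuffled head, training set from the rest.
    return shuffled[n_test:], shuffled[:n_test]

train, test = train_test_split(range(100))
```

In practice, CV repeats such splits several times (e.g., k folds) and averages the test errors to estimate the true prediction error.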