Remove glossary
This commit is contained in:
parent
f1844f8407
commit
96a3b242c0
16 changed files with 40 additions and 193 deletions
@@ -5,7 +5,7 @@ Because ML models are trained by minimizing a loss function $L$, the
 resulting value of $L$ underestimates the true error we see when
 predicting into the actual future by design.
 To counter that, one popular and model-agnostic approach is cross-validation
-(\gls{cv}), as summarized, for example, by \cite{hastie2013}.
+(CV), as summarized, for example, by \cite{hastie2013}.
 CV is a resampling technique, which randomly splits the samples into a
 training and a test set.
 Trained on the former, an ML model makes forecasts on the latter.
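The passage changed in this hunk describes CV's core mechanic: randomly partition the samples into a training and a test set, fit on the former, and forecast on the latter. A minimal sketch of that split in Python, using only the standard library (the helper name `train_test_split` and the 25% test fraction are illustrative assumptions, not part of the manuscript):

```python
import random

def train_test_split(samples, test_fraction=0.25, seed=0):
    """Randomly split samples into a training and a test set,
    as the changed passage describes (hypothetical helper)."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = samples[:]              # copy so the input stays untouched
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    # first n_test shuffled samples form the test set, the rest train
    test, train = shuffled[:n_test], shuffled[n_test:]
    return train, test

train, test = train_test_split(list(range(20)))
print(len(train), len(test))  # 15 5
```

A model would then be fitted on `train` and evaluated on `test`; repeating the split (as in k-fold CV) averages out the randomness of any single partition.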