
supplement to the supplemental cs229 boosting lecture notes
testing a simple extension of adaboost to prevent overfitting
the gentlest possible introduction to generalization error bounds beyond uniform convergence
just some basic statistical learning *practice*
generalizing linear discriminant analysis beyond normally distributed data
bayesian inference in model misspecification settings, visually explained
a toy bayesian neural network with an exact posterior $\beta \mid D$
never escaping jekyll :)