**Introduction to statistical modeling**

Linear models serve as a "toy" prerequisite to machine learning: while machine learning is more general, working through linear problems first makes the objectives and capabilities of machine learning much clearer.

- Statistical models; causal models
- Normal linear models
- Hierarchical models
- SVM, k-means
- Kernels and GLMs [1]
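As a taste of the algorithms listed above, here is a minimal k-means sketch in NumPy. The function signature, the `init` parameter, and the toy two-blob setup are all illustrative choices, not part of any particular curriculum:

```python
import numpy as np

def kmeans(X, k, iters=100, init=None, seed=0):
    """Minimal k-means: alternate nearest-centroid assignment and mean updates."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    if init is None:
        # Initialize centroids by sampling k distinct data points.
        centroids = X[rng.choice(len(X), size=k, replace=False)]
    else:
        centroids = np.asarray(init, dtype=float)
    for _ in range(iters):
        # Distance from every point to every centroid: shape (n, k).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # New centroid = mean of its assigned points; keep the old one if a cluster empties.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels
```

On two well-separated blobs this recovers the blob centers; k-means is sensitive to initialization, which is why production libraries use smarter seeding schemes such as k-means++.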

**Non-linear models**

- Hierarchical clustering, bootstrap [1], kNN
- ACE, backfitting, etc.
- Game dynamics, MCMC, etc.
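The bootstrap, mentioned above, is easy to sketch: resample the data with replacement many times and look at the spread of the statistic across resamples. A minimal version (the function name and defaults here are my own):

```python
import numpy as np

def bootstrap_se(data, statistic, n_resamples=2000, seed=0):
    """Estimate the standard error of a statistic by resampling with replacement."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    stats = np.array([
        statistic(rng.choice(data, size=len(data), replace=True))
        for _ in range(n_resamples)
    ])
    return stats.std(ddof=1)
```

For the mean of n i.i.d. samples with standard deviation sigma, the bootstrap estimate should land close to the theoretical sigma / sqrt(n), which makes for an easy sanity check.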

**Neural networks**

- The universal approximation theorem (+ Turing completeness of RNNs)
- Gradient descent and modifications [1][2]
- Backpropagation and the chain rule
- Overview of basic neural network architectures [1]
- Bayesian prior of neural networks [1] [2]
- Gaussian processes and infinite-width networks [1] [2]
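Gradient descent and one of its standard modifications (classical momentum) can be sketched in a few lines; the quadratic objective below is just a toy example to show convergence:

```python
def gradient_descent(grad, x0, lr=0.1, steps=500):
    """Plain gradient descent: step against the gradient at a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def gradient_descent_momentum(grad, x0, lr=0.1, beta=0.9, steps=500):
    """Classical momentum: accumulate a velocity that smooths successive gradients."""
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3.
grad = lambda x: 2.0 * (x - 3.0)
```

Backpropagation is just the chain rule applied to compute `grad` efficiently for a composed function; the update rules above stay the same.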

**Projects**

- A deeper look into convolutional networks [1]
- Playing with frequency representations [1]
- Neural networks to simulate an entropy source
- This Chinese character does not exist (trained version on GitHub)
- This Letter does not exist
- One-shot generative neural networks [1][2]
- Random walk of character shapes; this font does not exist
- Peeking into GANs [1]
- Progressive GANs [1]
- Various GAN projects: deepfakes, adversarial images, differentiable physics
- AI colorization, upscaling resolution and framerate, etc.
- Neural style transfer on tunes
- Predict Stackoverflow answer time
- NLP WTW/TOMT (e.g. "disrecommend" --> "dissuade", "arbitraryhood" --> "arbitrariness")
