Learning Algorithms – or a blend of computation and statistics

This page contains my sporadic notes on topics that interest me! They summarize some of the important material I have learned during my courses, or in the course of research on a topic when I was missing the necessary foundations. In other words, they grow along with my knowledge! The downside is that the notes likely contain a fair number of mistakes; on the upside, everything is explained smoothly and in detail, hopefully!

I would appreciate any comments on these notes. Emails pointing out mistakes are welcome!

  1. Expectation Maximization (EM)

  2. Discriminant Analysis

  3. Tree models

  4. Logistic Regression

  5. Linear Models

  6. Generalized Linear Models

  7. Nearest Neighbours

  8. Graphical Models:

    1. Graphical Models

    2. Hidden Markov Model

    3. Variational Approximation

    4. Topic Modelling

    5. Expectation Propagation

  9. Sampling-based learning

  10. Mixture Models

  11. Support Vector Machines and kernel methods

  12. Bayesian Non-parametrics

  13. Clustering

  14. Posterior Regularization for structured learning

  15. Learning Theory

    1. Introduction to Learning Theory and PAC modelling

    2. Concentration Inequalities

    3. VC dimension bounds

    4. Rademacher bounds

    5. Boosting

    6. Online and No-Regret Learning

  16. Singular Value Decomposition

  17. Popular Computational Techniques in NLP

This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 License.

As an attribution note, the inspiration for maintaining this page came partly from Jeff Erickson and partly from Charles Sutton.