Machine Learning Pages

These pages have been compiled by members of the CBU Learning Machine Learning (LML) Group.

Learning Machine Learning Course

1. Introduction (applications; supervised, unsupervised, semi-supervised and reinforcement learning; Bayes' rule, probability theory, randomness), Presentation1_LML.ppt, 27 May 2008, Eleftherios Garyfallidis.

2. Further Introduction (what is ML, Bayes' rule, Bayesian regression, entropy, relative entropy, mutual information), Presentation2_LML.ppt, 3 June 2008, Eleftherios Garyfallidis.

3. Maximum Likelihood vs Bayesian Learning (notes available upon request), Presentation3_LML.ppt, 10 June 2008, Hamed Nili.

4. Factor Analysis, PCA and probabilistic PCA (pPCA), Presentation4_LML.ppt, 17 June 2008, Hamed Nili.

5. Independent Component Analysis (ICA), Presentation5_LML.pdf, 24 June 2008, Jason Taylor.

6. ICA & Expectation-Maximization (EM), Presentation6_LML.ppt, 1 July 2008, Eleftherios Garyfallidis.

Up to this point, the lectures were based on the presentations of the machine learning course given by Zoubin Ghahramani at the Department of Engineering, University of Cambridge, http://learning.eng.cam.ac.uk/zoubin/ml06/index.html.

7. Graphical Models 1, 8 July 2008, Ian Nimmo-Smith.

8. Graphical Models 2, 13 January 2009, Ian Nimmo-Smith.

9. Markov Chain Monte Carlo, 20 January 2009, Eleftherios Garyfallidis.

10. ???, 27 January 2009, Hamed Nili.

Proposed next topics:

Books

1. Pattern Recognition and Machine Learning, C. M. Bishop, 2006. (Copy in our Library)

2. Information Theory, Inference, and Learning Algorithms, D. J. C. MacKay, 2003. (Available online)

3. Netlab: Algorithms for Pattern Recognition, I. T. Nabney, 2001.

4. Gaussian Processes for Machine Learning, C. E. Rasmussen and C. K. I. Williams, 2006. (Available online)

5. Kernel Methods for Pattern Analysis, J. Shawe-Taylor and N. Cristianini, 2004.

Reading

EM

An online demo of EM applied to mixtures of lines or mixtures of Gaussians: http://lcn.epfl.ch/tutorial/english/gaussian/html/
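For reference, below is a minimal sketch of EM for a two-component one-dimensional Gaussian mixture, written with NumPy only; the toy data, initial values and variable names are our own illustration and are not taken from the demo above.

# Minimal EM for a two-component 1-D Gaussian mixture (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 200)])

# Initial guesses for the mixture weights, means and standard deviations.
pi, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: responsibility of each component for each data point.
    dens = pi * normal_pdf(x[:, None], mu, sigma)      # shape (N, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and standard deviations.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(pi, mu, sigma)   # should approach the generating parameters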

ICA

An online demonstration of the concept at http://www.cis.hut.fi/projects/ica/icademo/

A tutorial is given at http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf

A maximum likelihood algorithm for ICA: http://www.inference.phy.cam.ac.uk/mackay/ica.pdf

ICA vs PCA

A simple graphical representation of the differences is given at http://genlab.tudelft.nl/~dick/cvonline/ica/node3.html
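As a rough illustration of the difference, the sketch below applies PCA and FastICA from scikit-learn to a toy mixture of two independent sources; the signals and the mixing matrix are made-up assumptions, not taken from the linked page.

# PCA vs ICA on a toy mixture of two independent sources (illustrative sketch).
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                        # sinusoidal source
s2 = np.sign(np.sin(3 * t))               # square-wave source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.5, 1.0]])    # mixing matrix
X = S @ A.T                               # observed mixed signals

# PCA finds orthogonal directions of maximum variance (decorrelation only).
X_pca = PCA(n_components=2).fit_transform(X)
# ICA looks for statistically independent, non-Gaussian components.
X_ica = FastICA(n_components=2, random_state=0).fit_transform(X)

PCA only decorrelates the data, so its components generally remain mixtures of the sources, whereas ICA typically recovers the original sources up to sign and scale.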

MCMC

Andrieu, C., de Freitas, N., Doucet, A. & Jordan, M.I. (2003) An Introduction to MCMC for Machine Learning. Machine Learning, 50, 5-43.

Reversible Jump Markov Chain Monte Carlo (RJMCMC)
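To give a feel for the basic machinery, here is a minimal sketch of a random-walk Metropolis sampler (plain MCMC, not reversible jump) targeting a standard normal density; the step size and chain length are arbitrary illustrative choices.

# Random-walk Metropolis sampler for a standard normal target (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    return -0.5 * x ** 2                       # unnormalised log density of N(0, 1)

x = 0.0
samples = []
for _ in range(10000):
    proposal = x + rng.normal(0.0, 1.0)        # symmetric random-walk proposal
    log_alpha = log_target(proposal) - log_target(x)
    if np.log(rng.uniform()) < log_alpha:      # accept with probability min(1, ratio)
        x = proposal
    samples.append(x)

print(np.mean(samples), np.std(samples))       # roughly 0 and 1 if the chain mixes well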

Bayes Rule

Chapter 1.2 of Bishop's book is highly recommended.

http://plato.stanford.edu/entries/bayes-theorem/

Griffiths, T.L. & Yuille, A. A Primer on Probabilistic Inference.

An Intuitive Explanation of Bayesian Reasoning (Bayes' Theorem), by Eliezer Yudkowsky.

Eight versions of Bayes' theorem
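As a small worked example of Bayes' rule, P(A|B) = P(B|A) P(A) / P(B), the snippet below computes the posterior probability of disease given a positive diagnostic test; the prevalence, sensitivity and false-positive rate are made-up numbers for illustration only.

# Worked Bayes' rule example with made-up diagnostic-test numbers.
p_disease = 0.01              # prior: prevalence of the disease
p_pos_given_disease = 0.95    # likelihood: sensitivity of the test
p_pos_given_healthy = 0.10    # false-positive rate

# Marginal probability of a positive test (law of total probability).
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Posterior probability of disease given a positive result.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)    # about 0.088: a positive test still leaves only ~9% probability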

Bayesian Methods in Neuroscience

Ma, W.J., Beck, J.M., Latham, P.E. & Pouget, A. (2006) Bayesian inference with probabilistic population codes. Nature Neuroscience, 9, 1432-1438.

Griffiths, Kemp & Tenenbaum. Bayesian models of cognition.

Knill, D.C. & Pouget, A. (2004) The Bayesian brain: the role of uncertainty in neural coding and computation. Trends in Neurosciences, 27(12), 712-719.

Kording, K. & Wolpert, D.M. (2006) Bayesian decision theory in sensorimotor control. Trends in Cognitive Sciences, 10, 319-326.

Online Demos for ML

http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/PPRPAGES/pprdem.htm

Software

Public code for machine learning:

http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/MLPAGES/mlcode.htm