Revision 44 as of 2013-03-08 10:28:25
Machine Learning Pages
These pages have been compiled by members of the CBU Learning Machine Learning (LML) Group
Learning Machine Learning Course
1. Introduction (applications, supervised, unsupervised, semi-supervised, reinforcement learning, Bayes rule, probability theory, randomness), Presentation1_LML.ppt, 27 May 2008, Eleftherios Garyfallidis.
2. Further Introduction (what is ML, Bayes rule, Bayesian regression, entropy, relative entropy, mutual information), Presentation2_LML.ppt, 3 June 2008, Eleftherios Garyfallidis.
3. Maximum Likelihood vs Bayesian Learning (notes available upon request), Presentation3_LML.ppt, 10 June 2008, Hamed Nili.
4. Factor Analysis, PCA and pPCA, Presentation4_LML.ppt, 17 June 2008, Hamed Nili.
5. Independent Component Analysis (ICA), Presentation5_LML.pdf, 24 June 2008, Jason Taylor.
6. ICA & Expectation Maximization (EM), Presentation6_LML.ppt, 1 July 2008, Eleftherios Garyfallidis.
Up to this point the lectures were based on the presentations from the ML course given by Zoubin Ghahramani at the Department of Engineering, University of Cambridge: http://learning.eng.cam.ac.uk/zoubin/ml06/index.html
7. Graphical Models 1, 8 July 2008, Ian Nimmo-Smith.
8. Graphical Models 2, 13 January 2009, Ian Nimmo-Smith.
9. Markov Chain Monte Carlo, 20 January 2009, Eleftherios Garyfallidis.
10. ???, 27 January 2009, Hamed Nili.
Proposed next topics:
- Non-parametric Methods (kernel density estimators, nearest-neighbour methods).
- Gaussian Processes.
- Sparse Kernel Machines (support vector machines, SVM).
- Sparse Kernel Machines 2 (relevance vector machines, RVM).
- Boosting.
- Overview of clustering methods (k-means, EM, hierarchical clustering).
- Mutual Information with applications to registration and neuronal coding.
- Random Field Theory with applications in fMRI.
- Artificial Neural Networks from a probabilistic viewpoint.
- Machine Learning methods used in SPM.
- Machine Learning methods used in FSL.
- Signal processing basics.
- Fourier Transform.
- Wavelets.
- Spherical Harmonics.
- Spherical Deconvolution.
- SNR in MRI experiments.
- Variational approximations (KL divergences, mean field, expectation propagation).
- Model comparison (Bayes factors, Occam's razor, BIC, Laplace approximations).
- Reinforcement Learning, Decision Making and MDPs (value functions, value iteration, policy iteration, Bellman equations, Q-learning, Bayesian decision theory).
Books
1. Pattern Recognition and Machine Learning, C. M. Bishop, 2006. (Copy in our Library)
2. Information Theory, Inference, and Learning Algorithms, D. J. C. MacKay, 2003. (Available online)
3. NETLAB: Algorithms for Pattern Recognition, I. T. Nabney, 2001.
4. Gaussian Processes for Machine Learning, C. E. Rasmussen and C. K. I. Williams, 2006. (Available online)
5. Kernel Methods for Pattern Analysis, Shawe-Taylor and Cristianini, 2004.
Reading
EM
An online demo with mixtures of lines or mixtures of Gaussians: http://lcn.epfl.ch/tutorial/english/gaussian/html/
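The idea behind the demo can be sketched directly. The snippet below is a minimal illustration written for this page (not taken from the demo): it fits a two-component 1-D Gaussian mixture by alternating the E-step (compute responsibilities) and the M-step (re-estimate weights, means and variances).

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: two 1-D Gaussian clusters at -2 and 3.
x = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.normal(3.0, 1.0, 300)])

# Initial guesses for mixture weights, means and variances.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibilities r[n, k] proportional to w_k N(x_n | mu_k, var_k).
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate the parameters from the responsibilities.
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.round(mu, 1))  # means close to the true values -2 and 3
```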
ICA
An online demonstration of the concept: http://www.cis.hut.fi/projects/ica/icademo/
A tutorial is given at http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf
A maximum likelihood algorithm for ICA: http://www.inference.phy.cam.ac.uk/mackay/ica.pdf
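For readers who want to experiment, here is a small FastICA-style sketch in NumPy. It is an illustration written for this page (not MacKay's maximum likelihood algorithm): two independent non-Gaussian sources are mixed, the mixtures are whitened, and an orthogonal unmixing matrix is estimated with the tanh nonlinearity and symmetric decorrelation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
# Two independent, non-Gaussian sources: a sine wave and uniform noise.
s = np.vstack([np.sin(np.linspace(0, 40, n)), rng.uniform(-1, 1, n)])
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s       # observed mixtures

# Whitening: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

# Symmetric FastICA iteration with g = tanh:
# w_i <- E[z g(w_i.z)] - E[g'(w_i.z)] w_i, then re-orthogonalise W.
W = rng.normal(size=(2, 2))
for _ in range(200):
    g = np.tanh(W @ z)
    W = g @ z.T / n - np.diag((1.0 - g ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt

y = W @ z  # recovered sources, up to sign and ordering
```

Each recovered component should correlate strongly with one of the original sources, which is all ICA promises (the sign and order are not identifiable).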
ICA vs PCA
A simple graphical representation of the differences is given in http://genlab.tudelft.nl/~dick/cvonline/ica/node3.html
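As a numerical complement (illustrative numbers only), the sketch below shows what PCA finds: the orthogonal direction of maximal variance. For data stretched along the diagonal, the first principal component lines up with that diagonal. PCA decorrelates the data, but unlike ICA it does not, in general, recover independent non-Gaussian sources.

```python
import numpy as np

rng = np.random.default_rng(3)
# 2-D Gaussian data, then rotate the long (std 2) axis onto the diagonal.
x = np.diag([2.0, 0.5]) @ rng.normal(size=(2, 2000))
R = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
data = R @ x

# PCA: eigenvectors of the covariance matrix; take the largest eigenvalue.
vals, vecs = np.linalg.eigh(np.cov(data))
pc1 = vecs[:, np.argmax(vals)]
print(np.round(np.abs(pc1), 2))  # close to [0.71, 0.71], i.e. the diagonal
```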
MCMC
Christophe Andrieu, Nando de Freitas, Arnaud Doucet and Michael I. Jordan (2003). An Introduction to MCMC for Machine Learning. Machine Learning, 50, 5–43.
Reversible Jump Markov Chain Monte Carlo (RJMCMC)
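A useful companion to the Andrieu et al. tutorial is to code the simplest case yourself. The sketch below (an illustration, assuming a standard Gaussian target density) is a random-walk Metropolis sampler: propose a Gaussian step and accept it with probability min(1, p(x')/p(x)).

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Unnormalised log-density of a standard Gaussian target.
    return -0.5 * x ** 2

samples = np.empty(20000)
x = 0.0
for i in range(len(samples)):
    prop = x + rng.normal(0.0, 1.0)          # random-walk proposal
    # Accept with probability min(1, p(prop)/p(x)), in log space.
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    samples[i] = x                            # store current state either way

print(round(samples.mean(), 1), round(samples.std(), 1))  # near 0 and 1
```

Note that the current state is stored again whenever a proposal is rejected; dropping rejected steps is a classic bug that biases the sampler.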
Bayes Rule
Section 1.2 of Bishop's book is highly recommended.
http://plato.stanford.edu/entries/bayes-theorem/
Thomas Griffiths, Alan Yuille. A Primer on Probabilistic Inference.
An Intuitive Explanation of Bayesian Reasoning Bayes' Theorem By Eliezer Yudkowsky
Eight versions of Bayes' theorem
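The mechanics of Bayes' rule are easy to check numerically. The snippet below uses hypothetical numbers, chosen only to illustrate the base-rate effect discussed in the links above: even a likelihood of 0.95 yields a small posterior when the prior is small.

```python
# Bayes' rule: P(H|D) = P(D|H) P(H) / P(D),
# with P(D) = P(D|H) P(H) + P(D|~H) P(~H).
prior = 0.01        # P(H): base rate of the hypothesis
likelihood = 0.95   # P(D|H): probability of the data if H is true
false_alarm = 0.05  # P(D|~H): probability of the data if H is false

evidence = likelihood * prior + false_alarm * (1 - prior)
posterior = likelihood * prior / evidence
print(round(posterior, 3))  # about 0.16, despite the 0.95 likelihood
```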
Bayesian Methods in Neuroscience
Griffiths, Kemp and Tenenbaum. Bayesian models of cognition.
Online Demos for ML
http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/PPRPAGES/pprdem.htm
Software
Public code for machine learning:
http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/MLPAGES/mlcode.htm