Machine Learning
===Definition===
===History===
===Examples===
==References==
<references />
<private>
Revision as of 23:43, 3 June 2011
The original list is linked here; I'd suggest visiting it for an updated version with lots of relevant comments. The following is mainly for my own notes.
Freely Available Books on Machine Learning
The Elements of Statistical Learning is a great text covering most core topics in supervised learning, along with a bit of unsupervised learning and some other specialist areas (such as high-dimensional problems).
Information Theory, Inference, and Learning Algorithms covers what the title suggests. It doesn't have a great deal of depth on the machine learning topics but is good for an overview of the techniques used in Bayesian inference.
Gaussian Processes for Machine Learning is the definitive reference for Gaussian processes.
Data-Intensive Text Processing with MapReduce contains patterns for implementing your algorithm in the MapReduce framework.
Reinforcement Learning: An Introduction covers the fundamentals of bandit algorithms and reinforcement learning in fully observable worlds (MDPs). Note that it says very little about generalisation and practically nothing about acting in partially observable worlds (POMDPs). Since this book was published there has been substantial work in all areas of reinforcement learning; while it will give you the basics, you'll have to do a lot of reading in the literature to catch up to current work.
D.Barber: Bayesian Reasoning and Machine Learning
Andrew Ng's CS229 Machine Learning course notes. Comes with video lectures.
Convex Optimization – Boyd and Vandenberghe
Introduction to Information Retrieval
Introduction to Machine Learning course notes (based on a forthcoming book by Shai Ben-David and Shai Shalev-Shwartz) - 110 pages of comprehensive notes. The notes have a slight bias towards the more theoretical side of ML - PAC, Rademacher complexity, etc., with theorems and all that.
Mike Jordan and Martin Wainwright's Graphical Models, Exponential Families, and Variational Inference
[http://idiom.ucsd.edu/~rlevy/textbook/text.html Probabilistic Models in the Study of Language] by Roger Levy at UCSD. Still in draft form and incomplete (some chapters have yet to be written), but freely available.
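The bandit fundamentals covered in the Sutton & Barto book above can be sketched in a few lines; this is a minimal epsilon-greedy agent on a Bernoulli bandit (the function name, parameters, and arm probabilities here are my own illustration, not from any of the books):

```python
import random

def epsilon_greedy_bandit(true_means, steps=10000, epsilon=0.1, seed=0):
    """Run epsilon-greedy on a Bernoulli bandit; return (estimates, pull counts)."""
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms          # number of pulls per arm
    estimates = [0.0] * n_arms     # running mean reward per arm

    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                            # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])   # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental mean update: Q <- Q + (r - Q) / n
        estimates[arm] += (reward - estimates[arm]) / counts[arm]

    return estimates, counts

est, counts = epsilon_greedy_bandit([0.2, 0.5, 0.8])
# with 10000 steps the best arm (index 2) should attract most of the pulls
```

The incremental mean update is the same sample-average rule the book develops in its bandit chapter before generalising to temporal-difference methods.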
</private>