
Adaptive Computation and Machine Learning

Machine Learning: A Probabilistic Perspective

A comprehensive introduction to machine learning that uses probabilistic models and inference as a unifying approach.

Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach.

The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package—PMTK (probabilistic modeling toolkit)—that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

1104 pages, Hardcover

Published August 24, 2012

206 people are currently reading
2801 people want to read

About the author

Kevin P. Murphy

11 books · 24 followers
Kevin P. Murphy is a Research Scientist at Google. Previously, he was Associate Professor of Computer Science and Statistics at the University of British Columbia.

Ratings & Reviews


Community Reviews

5 stars: 267 (51%)
4 stars: 182 (35%)
3 stars: 57 (10%)
2 stars: 8 (1%)
1 star: 6 (1%)
Displaying 1 - 17 of 17 reviews
4 reviews · 2 followers
May 15, 2015
Hard pressed to say anyone has actually "read" this whole book--it reads like a smattering of all the popular machine learning algorithms. I would not recommend it as an introduction to machine learning, not because of the technical prowess required (it is actually much lighter on math than other similar books), but more because of the method and depth with which the author introduces the material.

That being said, this is perhaps the best modern "reference" text on machine learning methods. If you are already familiar with where many of the methods sit in the overall landscape of machine learning, this book is absolutely fantastic.

This book describes itself as Bayesian, but it is clearly less so than other texts (e.g., Bishop's PRML; Hastie's ESL has an outright frequentist slant). Instead, most algorithms are motivated mostly by what is conventional/en vogue in the machine learning community. In particular, just about every algorithm is posed as a convex relaxation of the actual posterior distribution, such that our modern optimization algorithms can compute MAP solutions given any reasonably large data set. True Bayesian inference methods are more of an afterthought, "evidenced" by their treatment only in the much later chapters on MCMC/sampling and variational approaches.
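To make the reviewer's contrast concrete, a sketch (not taken from the book): full Bayesian inference keeps the whole posterior distribution, while the MAP recipe collapses it into a single optimization problem.

```latex
% Full Bayesian inference retains the posterior over parameters:
%   p(\theta \mid \mathcal{D}) \propto p(\mathcal{D} \mid \theta)\, p(\theta)
% MAP estimation reduces inference to optimizing the log-posterior:
\hat{\theta}_{\mathrm{MAP}}
  = \arg\max_{\theta} \bigl[\, \log p(\mathcal{D} \mid \theta) + \log p(\theta) \,\bigr]
```

With, say, a Laplace prior the log-prior term becomes a convex L1 penalty, which is why so many of the book's estimators end up as convex optimization problems.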

Buy this book! Buy some page markers for it! But also buy an accompanying text for more principled approaches to the fundamentals.
Aiham Taleb
27 reviews · 6 followers
September 14, 2021
Although this book is not made for reading cover to cover (in the common usage of the word "reading"), I found it really interesting. It contains every single thing related to machine learning: every algorithm that is used, every modern approach that has been developed. I liked how Murphy ordered the book's topics.
Surely it is not recommended for everyone, but it is recommended at least for those who want to understand machine learning deeply and comprehensively.
333 reviews · 24 followers
April 29, 2019
Excellent manual on statistical learning, providing a simple Bayesian explanation for the most common statistical models. Some good examples: the author derives the difference between least squares, ridge, lasso, etc. from different choices of distributions for the likelihood function and prior; or shows that the MLE (high variance/possible overfitting) is the MAP estimate (high bias) with a uniform prior, etc. It turns something that often looks like a pile of different cooking recipes into an ontology of clearly related concepts. It also includes some useful summary tables (see e.g. Table 8.1 for a long list of models, classified as classification/regression, generative/discriminative, parametric/non-parametric). Very pedagogical. However, due to the length and sometimes the depth of the maths, it is a book to read at different levels depending on what the reader is looking for. Also, I feel that the topics presented in the second half of the book (trees, SVMs, neural networks, Markov chains, etc.) are so varied, yet so technical, that books focused on those specific models may be preferred for some clarifications.
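The reviewer's example — MLE coinciding with the MAP estimate under a uniform prior, and ridge arising from a Gaussian prior — is easy to verify numerically. A minimal NumPy sketch on made-up data (the data and the `map_linear` helper are illustrative, not from the book):

```python
import numpy as np

# Synthetic linear-regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

def map_linear(X, y, lam):
    """MAP estimate for linear regression with a zero-mean Gaussian prior.

    The ridge penalty lam plays the role of sigma^2 / tau^2 (noise variance
    over prior variance); lam = 0 corresponds to a flat (uniform) prior.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_mle = np.linalg.lstsq(X, y, rcond=None)[0]  # MLE: ordinary least squares
w_map = map_linear(X, y, lam=0.0)             # MAP with a flat prior

# MLE coincides with MAP under a uniform prior ...
assert np.allclose(w_mle, w_map)
# ... and a stronger Gaussian prior shrinks the weights toward zero.
assert np.linalg.norm(map_linear(X, y, lam=10.0)) < np.linalg.norm(w_mle)
```

Swapping the Gaussian prior for a Laplace prior gives the lasso instead; that case has no closed form, which is where the convex-optimization machinery the book leans on comes in.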
Sami
5 reviews · 3 followers
May 30, 2016
Content of the book is fantastic (five stars), albeit slightly out of date in 2016. However, the first printing is so full of typos (zero stars) that it is difficult to understand how the version ever got printed. Clearly nobody read through it before printing approval. I would not recommend the first edition to anyone unless they are experts with the ability to verify and if necessary rewrite every single equation.
BCS
218 reviews · 33 followers
May 21, 2013
This substantial book is a deep and detailed introduction to the field of machine learning, using probabilistic methods. It is aimed at a graduate-level readership and assumes a mathematical background that includes calculus, statistics and linear algebra.

The book opens with a brief survey of the kinds of problems to which machine learning can be applied, and sketches the types of methods that can be used to model these problems.

After a short introduction to probability, the remaining 27 chapters, over around a thousand pages, expand in depth on these and related topics.

Due to the nature of the material, much of the content consists of mathematical equations and proofs. Numerical examples are scarce; rather, the author relies upon the reader's mathematical intuition, supported by a variety of graphical illustrations.

Where relevant, overviews of real-world applications of various techniques are provided, which help to make the rather abstract subject matter more concrete. In addition, MATLAB and GNU Octave code, which implements the algorithms given in the book, is freely available as a download from the book’s website.

The author’s style is easy but authoritative, with useful explanations and asides. That said, the book is intended as a course textbook and requires the reader to pay close attention. In the absence of an instructor, I frequently found myself reaching for other resources to support the explanations given in the book, though this is probably due to my not understanding the requisite background topics sufficiently well.

Exercises are included at the end of each chapter; however, while solutions are apparently available to instructors, they are not given in the book.

This comprehensive book should be of great interest to learners and practitioners in the field of machine learning.

Reviewed by Patrick Hill CEng MBCS CITP
Trung Nguyen
11 reviews · 6 followers
July 11, 2015
This can become a very good reference book for machine learning. A good complement to Pattern Recognition and Machine Learning by Bishop.
Aaron
16 reviews · 2 followers
July 27, 2019
Still relevant, still a useful reference, even in this day of machine learning mania. Clear and well presented.
Kevin Shen
66 reviews · 5 followers
August 29, 2021
This is a good book. It presents all the key ML ideas succinctly. However, I was a bit surprised to find it hard to navigate at times. Some chapters refer to concepts that are explored only in later chapters. Sometimes the author switches or introduces notation without explanation. There are an annoying number of typos and even more annoying variable overloads. I guess this is a natural pathology of trying to stuff so many ML ideas into so little space, and of what this book aims to be (a reference rather than an end-to-end read). I had higher expectations for the exercises: some chapters only have 1-3 exercises of the form "prove equation X".
1 review · 1 follower
November 26, 2017
This book is amazing. I really enjoy reading it. Kevin Murphy is a great teacher and excellent researcher. You can get lots of insights that are absent from practical books or blogs.
There are many typos in the first 3 printings. The 4th and later printings are much better. What I bought (11/24/2017) is the 6th printing (the same as the 4th).
3 reviews
September 28, 2019
An Oxford dictionary for machine learning. Clear formulas.
The only downside is that it lacks material on deep learning techniques.
Robert Muller
Author of 15 books · 35 followers
October 16, 2019
The best book on machine learning I've read, especially for those of us who like and understand the Bayesian approach to probability. It's quite math heavy and code light, but there's plenty of code available; check out the new Python code for the next edition (which itself will probably be even better than this edition, I would think).
Charluff
99 reviews · 3 followers
January 14, 2019
Solid manual for the ML field, though I found the writing heavy and it was not easy to get a grasp of the concepts; you have to dig through pages for them.
8 reviews
August 14, 2019
Either a statistics perspective or an optimization perspective has its own limitations. Maybe an approach like SGVB could be a promising option.
22 reviews
September 29, 2019
I read until Section 3.5 (p. 82, naive Bayes classifiers) and found it too hard and abstract to continue. Hmm... maybe I should start with an easier one?
Dominik Jurko
8 reviews · 1 follower
March 24, 2021
Clearly written, understandable, great illustrations and exercises. I LOVE IT.
108 reviews · 1 follower
June 23, 2016
Solid, but it needed better notation. The notation got very cumbersome by the end and obscured a lot of the intuition behind what was going on.
