Saturday, December 7, 2013

Information Theory, Inference and Learning Algorithms

Information Theory, Inference and Learning Algorithms (Hardcover)
By David J. C. MacKay

I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, cannot give a complete picture of what this text actually is. There are many books on information theory, but what makes this book unique (and, in my opinion, what makes it so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as pieces of a unified puzzle, focusing more on the connections between these areas, and on the philosophical implications of those connections, than on deep coverage of any single area.
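To give a flavor of the kind of connection the book dwells on (this is my own toy illustration, not an example from the text): the Shannon entropy that sets the limit on data compression is just the expected value of -log2 p, the same log-probability quantity that appears as a log-likelihood term in Bayesian inference.

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits: the minimum expected number of bits
    per symbol needed to encode draws from distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def surprisal_bits(pi):
    """Information content (-log2 p) of a single outcome; the same
    quantity shows up as a log-likelihood term in Bayesian inference."""
    return -math.log2(pi)

# A biased coin: its entropy is well below 1 bit, so a stream of
# such flips is compressible below one bit per flip on average.
p = [0.9, 0.1]
h = entropy_bits(p)

# Entropy is exactly the probability-weighted average of the surprisals.
assert abs(h - (0.9 * surprisal_bits(0.9) + 0.1 * surprisal_bits(0.1))) < 1e-12
```

The punchline, which the book develops far more seriously, is that "how many bits does this message cost?" and "how surprised should I be by this observation?" are the same question asked in two dialects.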

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion; although the book is dense, it is rarely terse. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think this is one book that is best worked through more or less from start to finish. For this reason and others, it does not make a very good reference: it occasionally uses nonstandard notation or terminology.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book provides some tools to anchor your thinking, at least in some cases, to something less arbitrary.

