This semester I attended a course on Kolmogorov Complexity and Minimum Description Length, given at the Institute for Logic, Language and Computation of the University of Amsterdam.
The main lecturer, Paul Vitanyi, is the co-author of THE book on Kolmogorov Complexity. Peter Grünwald, guest lecturer for two of the sessions, is the author of THE book on the Minimum Description Length principle. The exercise sessions were given by a talented PhD student of theirs, Wouter Koolen-Wijkstra.
But what is this all about?
In simple terms, the Kolmogorov complexity of a sequence of characters is the length of the shortest program that outputs that sequence. It is the theoretical limit of compression.
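Kolmogorov complexity itself is uncomputable, but any real compressor gives an upper bound on it: the compressed size is the length of one particular "program" (the compressed data plus the decompressor) that reproduces the sequence. A minimal sketch of this idea in Python, using the standard zlib module as the stand-in compressor:

```python
import random
import zlib


def compressed_length(s: str) -> int:
    """Upper bound (in bytes) on the Kolmogorov complexity of s,
    using zlib compression as a stand-in for the shortest program."""
    return len(zlib.compress(s.encode("utf-8"), level=9))


# A highly regular string compresses far below its raw length...
regular = "ab" * 500

# ...while random-looking text barely compresses at all.
random.seed(0)
messy = "".join(random.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(1000))

print(len(regular), compressed_length(regular))
print(len(messy), compressed_length(messy))
```

Both strings are 1000 characters long, but the regular one compresses to a few dozen bytes while the random one stays close to its entropy limit: a rough, practical glimpse of the difference in their Kolmogorov complexities.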
I would like to recommend this book by Jeff Hawkins, in which the author tries to create a theory about the neocortex.
He claims that the neocortex is basically a hierarchical memory system able to detect temporal and spatial patterns. Jeff Hawkins and his company, Numenta, are now trying to move forward by implementing this “neocortical algorithm” as software running on a computer.
I enjoyed reading it a lot, and I am now trying to read the technical papers. So far it looks like a good model, especially for computer vision systems, but it is not yet clear to me how it would solve problems from other cognitive areas such as language processing or planning.
More posts on that for the coming weeks!
Update: 5 years later, nothing very impressive has come out of Numenta. Even though the ideas in this book are appealing, in practice all solid Machine Learning results require: 1) a very clear loss function, 2) an efficient optimization algorithm, and 3) preferably, lots of data.