Tutorial on energy models and Deep Belief Networks

Topics: Energy models, causal generative models vs. energy models in overcomplete ICA, contrastive divergence learning, score matching, restricted Boltzmann machines, deep belief networks

Presentation notes: .pdf
This is a scan of my notes for the tutorial; I'm afraid they're not very polished. The first part on ICA is missing, but you can have a look at Rich's notes, which refer to the same paper.

Task for the coding session: implement a deep belief network for classification and generation of handwritten characters.
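Since the worksheet and my solution are only linked rather than shown inline, here is a minimal sketch of the core training step you will need: one-step contrastive divergence (CD-1) for a binary restricted Boltzmann machine, the building block that gets stacked into a deep belief network. This is written with numpy; the function names, toy data, and layer sizes are illustrative assumptions, not taken from the actual solution.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1):
    """One contrastive-divergence (CD-1) update, in place, on a
    mini-batch of binary visible vectors v0, shape (batch, n_visible)."""
    ph0 = sigmoid(v0 @ W + c)                         # P(h=1 | v0)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
    pv1 = sigmoid(h0 @ W.T + b)                       # one Gibbs step down...
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)                         # ...and back up
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n           # positive - negative stats
    b += lr * (v0 - v1).mean(axis=0)                  # visible biases
    c += lr * (ph0 - ph1).mean(axis=0)                # hidden biases

def recon_error(X, W, b, c):
    """Mean-field reconstruction error, a rough progress monitor."""
    return np.abs(X - sigmoid(sigmoid(X @ W + c) @ W.T + b)).mean()

# toy data: two repeated 4-pixel binary patterns (stand-in for real images)
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 10, dtype=float)
W = 0.01 * rng.standard_normal((4, 8))  # 4 visible, 8 hidden units
b, c = np.zeros(4), np.zeros(8)

err_before = recon_error(data, W, b, c)
for epoch in range(200):
    cd1_update(data, W, b, c)
err_after = recon_error(data, W, b, c)
```

To build the deep network, you train one such RBM on the data, then feed the hidden activation probabilities of the trained layer as "data" to the next RBM, and so on, greedily layer by layer.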

Worksheet .odp, .pdf

Data: Begin with only a few letters from the Binary Alphadigits. This dataset is quite small, so learning is fast and the hidden layers need only about 100 units each to reach good performance. Then test the complete system on the MNIST database (split the data into mini-batches).
You'll find a copy of the Binary Alphadigits and MNIST databases at Sam Roweis' site.
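Splitting MNIST into mini-batches, as suggested above, only needs a small numpy helper; this is an illustrative sketch (the function name and batch size of 100 are my assumptions, not taken from the worksheet).

```python
import numpy as np

def minibatches(X, batch_size=100, rng=None):
    """Yield shuffled mini-batches of the rows of X; the last batch
    may be smaller when batch_size does not divide len(X)."""
    if rng is None:
        rng = np.random.default_rng(0)
    idx = rng.permutation(len(X))          # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        yield X[idx[start:start + batch_size]]

# example: the 60000 MNIST training images, flattened to 784 pixels each
X = np.zeros((60000, 784))
batches = list(minibatches(X))
```

Updating the weights after each mini-batch rather than after a full pass keeps learning fast on a dataset the size of MNIST.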

My solution: .tgz (in Python)

Essential bibliography: