
NIPS 2007 Report: Hinton’s Tutorial on Deep Belief Nets

Author: movellan @ December 3rd, 2007

  • Cool relationship between directed sigmoid belief nets and undirected Boltzmann machines. In particular, the learning rule for sigmoid belief nets becomes the Boltzmann machine rule if you use an infinite set of directed layers with tied weights.
  • He keeps emphasizing the form of independence that restricted Boltzmann machines are good at: conditional independence of the hidden units given the observable vector. This reverses the standard ICA approach, where the hidden sources are unconditionally independent, and it connects to Conditional Random Fields; he called these models generative conditional random fields and conditional RBMs. (A minimal sketch of this factorized conditional and the contrastive-divergence update appears after this list.)
  • Conditional RBMs from Hinton and Sutskever, designed to deal with dynamical data. Used by Taylor, Roweis, and Hinton (2007) for generating skeletal animation models.
  • NIPS 2007 paper on learning the orientation of a face using deep belief networks combined with Gaussian Processes
    Here is the Paper

  • Bengio et al. (2007) have a paper on how to extend RBMs to the exponential family.
  • Hinton paper on Stochastic Embedding Using
  • Lots of deep autoencoders pretrained one layer at a time using the Boltzmann machine algorithm, then fine-tuned using backprop (a sketch of this recipe follows the list).
  • Salakhutdinov and Hinton -> semantic hashing (a toy hashing sketch follows the list).
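
To make the conditional-independence point and the Boltzmann-rule connection above concrete, here is a minimal NumPy sketch of a binary RBM trained with one step of contrastive divergence (CD-1). It is an illustrative sketch, not code from the tutorial; the layer sizes, learning rate, and random binary data are assumptions.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(v0, W, b, c, lr=0.1, rng=np.random):
        # Positive phase: given the visible vector, every hidden unit is
        # conditionally independent: p(h_j = 1 | v) = sigmoid(c_j + v . W[:, j]).
        ph0 = sigmoid(c + v0 @ W)                        # (batch, n_hid)
        h0 = (rng.random_sample(ph0.shape) < ph0) * 1.0
        # Negative phase: one step of alternating Gibbs sampling.
        pv1 = sigmoid(b + h0 @ W.T)                      # (batch, n_vis)
        v1 = (rng.random_sample(pv1.shape) < pv1) * 1.0
        ph1 = sigmoid(c + v1 @ W)
        # CD-1 approximation to the log-likelihood gradient.
        n = v0.shape[0]
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (ph0 - ph1).mean(axis=0)
        return W, b, c

    # Hypothetical usage: 100 random binary vectors, 20 visible and 10 hidden units.
    rng = np.random.RandomState(0)
    data = (rng.random_sample((100, 20)) < 0.3) * 1.0
    W = 0.01 * rng.standard_normal((20, 10))
    b, c = np.zeros(20), np.zeros(10)
    for _ in range(50):
        W, b, c = cd1_step(data, W, b, c, rng=rng)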
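The layer-by-layer autoencoder pretraining mentioned above amounts to training an RBM on the data, feeding its hidden probabilities to the next RBM, and repeating. A sketch under the same assumptions, reusing the hypothetical sigmoid and cd1_step from the previous block:

    def pretrain_stack(data, layer_sizes, epochs=10, rng=np.random):
        # Greedy layer-wise pretraining: each RBM models the hidden
        # activities of the layer below it.
        rbms, x = [], data
        for n_hid in layer_sizes:
            n_vis = x.shape[1]
            W = 0.01 * rng.standard_normal((n_vis, n_hid))
            b, c = np.zeros(n_vis), np.zeros(n_hid)
            for _ in range(epochs):
                W, b, c = cd1_step(x, W, b, c, rng=rng)
            rbms.append((W, b, c))
            x = sigmoid(c + x @ W)  # hidden probabilities become the next layer's data
        return rbms

After pretraining, the stack is unrolled into an encoder/decoder pair (the decoder reuses the transposed weights) and the whole deep autoencoder is fine-tuned with ordinary backpropagation on reconstruction error.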
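For the semantic hashing bullet, the idea is to threshold the top-layer code of such a stack into a short binary address, so that similar documents land at nearby addresses and can be retrieved by flipping a few bits. A toy sketch, again reusing the functions above:

    def semantic_hash(x, rbms):
        # Deterministic forward pass through the pretrained stack,
        # then threshold the top-layer code into a binary address.
        for W, b, c in rbms:
            x = sigmoid(c + x @ W)
        return (x > 0.5).astype(int)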