Predictive State Representations (Lab Meeting: Eric Wiewiora)

Author: nick @ August 6th, 2007

For our lab meeting this week, we will have Eric Wiewiora speaking to us about some research that he has been following and thinks is very promising for reinforcement learning.

Eric is a student of Gary Cottrell’s, and is the local expert on the current state of the Reinforcement Learning literature.

As background, he recommends reading:
http://www.eecs.umich.edu/~baveja/Papers/uai2004psr.pdf
http://www.cs.cmu.edu/~ggordon/mrosen-ggordon-thrun.tpsr-icml2004.pdf

The first one is a good motivation, and it is light on math. The second is a bit more technical, but it demonstrates one algorithm he’ll be talking about.

Abstract: Predictive State Representations

Predictive State Representations (PSRs) have shown a great deal of promise for building models of dynamic environments. PSRs are at least as powerful as Partially Observable Markov Decision Processes (POMDPs) and Hidden Markov Models (HMMs), but do not suffer from many of their shortcomings. PSRs model a system based only on observables, not on hidden, latent states. Because of this, a PSR can be learned without falling into local optima.
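
To make the "state as predictions" idea concrete, here is the standard linear-PSR formulation from the Singh et al. paper linked above (my summary of the notation, not part of Eric's abstract). A test t = a_1 o_1 \ldots a_k o_k is a sequence of actions and observations, and its prediction from history h is

    p(t \mid h) = \Pr(o_1 \ldots o_k \mid h, a_1 \ldots a_k).

The PSR state is the prediction vector p(Q \mid h) for a small set of core tests Q = \{q_1, \ldots, q_n\}. In a linear PSR every test prediction is a linear function of that vector, p(t \mid h) = m_t^{\top} p(Q \mid h), and after taking action a and observing o the state updates as

    p(q_i \mid hao) = \frac{p(a o q_i \mid h)}{p(a o \mid h)} = \frac{m_{a o q_i}^{\top} \, p(Q \mid h)}{m_{a o}^{\top} \, p(Q \mid h)},

so everything needed to maintain the state is a prediction about observable futures.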

In spite of these advantages, learning good parameters for a PSR has remained a challenge. This problem can clearly be seen when experimentally comparing state-of-the-art PSR learning algorithms to other standard methods for sequence prediction. I’ll explore the implications of these results, and what can be done to address this shortcoming.
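
The second paper linked above (the "transformed PSR", or TPSR, work) learns its model spectrally, from an SVD of matrices of observable statistics. As a rough illustration of that flavor of algorithm, here is a toy, action-free sketch for intuition only; it is not the paper's actual procedure, and the function names and the two-state example process are invented for this post. It estimates how often each one-step "history" is followed by each one-step "test", then reads a low-dimensional state space off the SVD of that matrix.

# Toy, action-free sketch of the SVD idea behind TPSR-style learning.
# Not the algorithm from the paper; for intuition only.

import numpy as np

def estimate_cooccurrence(seq, n_obs):
    """P_hat[h, t] ~ Pr(next observation = t | previous observation = h),
    using length-1 histories and length-1 tests for simplicity."""
    counts = np.zeros((n_obs, n_obs))
    for h, t in zip(seq[:-1], seq[1:]):
        counts[h, t] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return counts / np.maximum(row_sums, 1)

def learn_tpsr_subspace(P_hat, rank):
    """Keep the top-'rank' left singular vectors; these define a
    low-dimensional 'transformed' predictive state space."""
    U, S, Vt = np.linalg.svd(P_hat)
    return U[:, :rank], S[:rank]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: a 2-state HMM-like process emitting 3 observations.
    T = np.array([[0.9, 0.1], [0.2, 0.8]])            # latent transitions
    O = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])  # emission probabilities
    s, seq = 0, []
    for _ in range(50000):
        seq.append(rng.choice(3, p=O[s]))
        s = rng.choice(2, p=T[s])
    P_hat = estimate_cooccurrence(seq, n_obs=3)
    U, S = learn_tpsr_subspace(P_hat, rank=2)
    # A sharp drop after the second singular value suggests roughly
    # two underlying states, recovered from observables alone.
    print("singular values:", S)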