GP-SUM

Suddhu | June 16th 2020, RPL reading group | paper link

Overview


1. Introduction


2. Related work

3. Background on GP filtering

3.1 Bayes filters

3.1.1 Prediction update

Equation 1

3.1.2 Measurement update

3.1.3 Recursive belief update

Equation 3
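To make the prediction/measurement/recursive updates concrete, here is a minimal discretized Bayes filter sketch (illustrative only, not the paper's method): the state is binned onto a 1D grid, the prediction update convolves the belief with the transition model, and the measurement update reweights by the likelihood. All models here are Gaussians chosen for the example.

```python
import numpy as np

grid = np.linspace(-5, 5, 201)
dx = grid[1] - grid[0]

def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def predict(belief, transition_pdf):
    """Prediction update: bel-(x_t) = integral of p(x_t | x_{t-1}) bel(x_{t-1})."""
    T = transition_pdf(grid[:, None], grid[None, :])  # T[i, j] = p(x_i | x_j)
    return T @ belief * dx

def correct(belief_pred, likelihood):
    """Measurement update: bel(x_t) proportional to p(z_t | x_t) bel-(x_t)."""
    post = likelihood(grid) * belief_pred
    return post / (post.sum() * dx)

# One predict-measure cycle: Gaussian random-walk dynamics, measurement z = 1.0
belief = gauss(grid, 0.0, 1.0)                                 # prior belief
belief_pred = predict(belief, lambda xt, xp: gauss(xt, xp, 0.5))
belief = correct(belief_pred, lambda x: gauss(1.0, x, 0.3))
```

Repeating the two calls implements the recursive belief update: the posterior of one step becomes the prior of the next.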

3.2 Gaussian processes

$D = \{(x_i, y_i)\}_{i=1}^n \quad \text{and} \quad k(x, x')$

Equation 4
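The standard GP posterior mean and variance (the quantities Equation 4 refers to) can be sketched as follows. This is textbook GP regression with a squared-exponential kernel; the hyperparameters and training data are illustrative, not from the paper.

```python
import numpy as np

def sq_exp_kernel(a, b, ell=1.0, sf=1.0):
    """k(x, x') = sf^2 * exp(-(x - x')^2 / (2 ell^2))."""
    return sf**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_posterior(X, y, x_star, noise=0.1):
    """Posterior mean and variance of f(x*) given data D = {(x_i, y_i)}."""
    K = sq_exp_kernel(X, X) + noise**2 * np.eye(len(X))
    k_star = sq_exp_kernel(X, x_star)
    alpha = np.linalg.solve(K, y)          # K^{-1} y
    mean = k_star.T @ alpha                # k*^T K^{-1} y
    v = np.linalg.solve(K, k_star)         # K^{-1} k*
    var = sq_exp_kernel(x_star, x_star).diagonal() - np.sum(k_star * v, axis=0)
    return mean, var

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
mean, var = gp_posterior(X, y, np.array([0.5]))
```

Each evaluation costs $O(n^2)$ once the Cholesky/solve of $K$ is amortized, which is where the $n^2$ factor in the complexity analysis of Section 4.3 comes from.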

4. GP-SUM Bayes filter

4.1 Updating the prediction belief

Equation 5
Equation 7
Equation 6
Equation 8
$p(x_t \mid x_{t-1,j},\ u_{t-1}) \rightarrow \mathcal{N}(x_t \mid \mu_{t,j}, \Sigma_{t,j})$
Equation 9
Equation 10
Algorithm 1
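The prediction-belief update can be sketched as follows: sample the current prediction belief (a Gaussian mixture), push each sample through the GP dynamics posterior, and collect the per-sample predictive Gaussians as the new mixture. This is a sketch of Algorithm 1 in spirit; `gp_dynamics` below is a hypothetical stand-in for a trained GP model, not the paper's learned dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

def gp_dynamics(x, u):
    """Placeholder for the GP posterior p(x_t | x_{t-1}, u_{t-1}) = N(mu, var)."""
    mu = 0.9 * x + u            # illustrative mean function
    var = 0.05 + 0.01 * x**2    # illustrative input-dependent variance
    return mu, var

def prediction_update(weights, means, variances, u, M):
    # 1) Sample x_{t-1,j} from the current Gaussian mixture.
    comp = rng.choice(len(weights), size=M, p=weights)
    samples = rng.normal(means[comp], np.sqrt(variances[comp]))
    # 2) Each sample induces one Gaussian: p(x_t | x_{t-1,j}, u) = N(mu_j, var_j).
    mus, vars_ = gp_dynamics(samples, u)
    # 3) Components start with equal weight; reweighting happens in the
    #    measurement update.
    return np.full(M, 1.0 / M), mus, vars_

w, mu, var = prediction_update(np.array([1.0]), np.array([0.0]),
                               np.array([1.0]), u=0.1, M=50)
```

The resulting mixture is what lets GP-SUM represent multimodal prediction beliefs that a single-Gaussian filter would collapse.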

4.2 Recovering belief from prediction belief

Equation 11
Algorithm 2
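Recovering the belief from the prediction belief amounts to reweighting each mixture component by the measurement likelihood and renormalizing. Below is a sketch in the spirit of Algorithm 2; the Gaussian measurement model is an illustrative assumption, not the paper's GP observation model.

```python
import numpy as np

def gauss_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def measurement_update(weights, means, variances, z, meas_var=0.1):
    # Likelihood of z under each component, integrating out x_t:
    # int N(z | x, meas_var) N(x | mu_j, var_j) dx = N(z | mu_j, var_j + meas_var)
    lik = gauss_pdf(z, means, variances + meas_var)
    new_w = weights * lik
    return new_w / new_w.sum()

# Three-component mixture; measurement z = 0.9 favors the component at 1.0
w = measurement_update(np.full(3, 1 / 3), np.array([-1.0, 0.0, 1.0]),
                       np.array([0.1, 0.1, 0.1]), z=0.9)
```

Components consistent with the measurement gain weight; the others decay, which is how the belief sharpens without ever being forced into a single Gaussian.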

4.3 Computational complexity

$O(Mn^2 + M)$ per step, where $M$ is the number of Gaussian components in the belief and $n$ is the number of GP training points.

5. Results

5.1 1D non-linear synthetic model

Dynamics model (especially sensitive around 0):

Measurement model:

Three steps of the predict-measure-predict updates: beliefs and predictions become multimodal immediately. GP-SUM captures the three modes at t = 2, while the other methods output a single Gaussian that encloses them.

At t = 1, GP-SUM shows the best performance on all three metrics. GP-PF becomes particle-starved early, by t = 10.

GP-ADF worsens over time: when the dynamics are non-linear, its predicted variance increases, which lowers the likelihood of the true state.

5.2 Real task: uncertainty in pushing

Convergent and divergent behavior of two different pushes.
As with the push-grasp, the block's state distribution is unstable. The stochastic GP captures this uncertainty, and GP-SUM propagates it. Standard filtering frameworks cannot capture the resulting ring-shaped distribution.