particle filter

localizing the vehicle involves determining where on the map the vehicle is most likely to be, by matching what the vehicle sees to the map.

Markov localization, or the Bayes filter for localization, is the generalized filter. think of the robot location as a probability distribution: each time the robot moves, the distribution becomes more diffuse (wide). passing control data, map data, and observations into the filter concentrates (narrows) the distribution at each timestep.
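this diffuse/concentrate behavior can be sketched in closed form for a 1D Gaussian belief (a minimal sketch for intuition only; the noise values and the Kalman-style fusion rule are assumptions for illustration, not part of the notes above):

```python
# 1D Gaussian belief: motion adds process variance (belief widens),
# a measurement fuses information (belief narrows).
# all numeric values below are assumed for illustration.

def move(mean, var, u, process_var):
    """prediction: convolving two Gaussians adds their variances."""
    return mean + u, var + process_var

def sense(mean, var, z, meas_var):
    """update: fuse a Gaussian measurement; precision (1/var) adds."""
    k = var / (var + meas_var)              # Kalman-style gain
    return mean + k * (z - mean), (1 - k) * var

mean, var = 0.0, 1.0
mean, var = move(mean, var, u=1.0, process_var=0.5)
print(f"after move:  var={var:.2f}")        # wider than 1.0
mean, var = sense(mean, var, z=1.2, meas_var=0.5)
print(f"after sense: var={var:.2f}")        # narrower again
```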

state space

x_k = f(x_{k-1}, v_{k-1})            (1)

z_k = h(x_k, w_k)             (2)

v, w are the process noise and measurement noise respectively, and each follows a Gaussian (normal) distribution.
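as a concrete (hypothetical) instance of equations (1) and (2): a 1D model where the state moves by a control offset u and the sensor reads the position directly, both corrupted by Gaussian noise. the noise standard deviations are assumed values:

```python
import random

PROCESS_STD = 0.5      # std of process noise v (assumed value)
MEASUREMENT_STD = 1.0  # std of measurement noise w (assumed value)

def f(x, u):
    """state transition x_k = f(x_{k-1}, v): move by control u plus process noise."""
    return x + u + random.gauss(0.0, PROCESS_STD)

def h(x):
    """measurement z_k = h(x_k, w): observe the position plus measurement noise."""
    return x + random.gauss(0.0, MEASUREMENT_STD)

random.seed(7)  # reproducible demo
x = 0.0
for k in range(1, 4):
    x = f(x, u=1.0)
    z = h(x)
    print(f"k={k}: true x={x:.2f}, measured z={z:.2f}")
```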

Bayes filter derivation

p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1})    (b)

consider the multiply rule of probability:

p(a, b) = p(a|b) p(b)

lhs of equation (b) is:

p(x_k | z_{1:k}) = p(x_k | z_k, z_{1:k-1}) = p(z_k | x_k, z_{1:k-1}) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1})

given x_k, assuming z_k is independent from all previous measurements z_{1:k-1}:

p(z_k | x_k, z_{1:k-1}) = p(z_k | x_k)

substituting this back gives equation (b).

Markov Localization

in which the true state x is unobserved, and the measurements z are observed.
assuming a first-order Markov process, the probability of the current true state depends only on the previous state:

p(x_k | x_{0:k-1}) = p(x_k | x_{k-1}) (3)

similarly, the measurement depends only on the current state, of which it is a stochastic projection:

p(z_k | x_{0:k}) = p(z_k | x_k)     (4)            

(3) is referred to as the motion model, and (4) as the measurement/observation model.
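the two models can be written as Gaussian densities, consistent with the noise assumptions in (1) and (2) (a sketch; the offsets and standard deviations are assumed values):

```python
import math

def gaussian_pdf(x, mean, std):
    """density of N(mean, std^2) evaluated at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2.0 * math.pi))

def motion_model(x_k, x_prev, u=1.0, std=0.5):
    """p(x_k | x_{k-1}): Gaussian centered on the predicted position x_{k-1} + u."""
    return gaussian_pdf(x_k, x_prev + u, std)

def measurement_model(z_k, x_k, std=1.0):
    """p(z_k | x_k): Gaussian centered on the true position x_k."""
    return gaussian_pdf(z_k, x_k, std)

# the density peaks where the state agrees with the model
print(motion_model(1.0, 0.0))   # at the predicted position: highest
print(motion_model(3.0, 0.0))   # far from it: near zero
```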

the classical problem in partially observable Markov chains is to recover the posterior distribution from all available sensor measurements and controls over all timesteps.

specifically, the localization problem here is to obtain the posterior of the current system state, p(x_k | z_{1:k}), based on all existing measurements, which can be solved by the Bayes filter.

ps. the probability distribution of the current state also depends on other known inputs, e.g. map data and control data.

prediction

from the Bayes filter equation, p(x_k | z_{1:k-1}) needs to be obtained first, which is the prediction step. physically, it estimates the system state based on all previous measurements.

considering x_{k-1} as a random variable, integrating the joint pdf p(x_k, x_{k-1} | z_{1:k-1}) over x_{k-1} gives p(x_k | z_{1:k-1}):

p(x_k | z_{1:k-1}) = ∫ p(x_k, x_{k-1} | z_{1:k-1}) dx_{k-1}

consider the multiply rule:

p(x_k, x_{k-1} | z_{1:k-1}) = p(x_k | x_{k-1}, z_{1:k-1}) p(x_{k-1} | z_{1:k-1})

by the first-order Markov assumption, the first term in the integral reduces to p(x_k | x_{k-1}), which gives the prediction equation:

p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}

p(x_k | x_{k-1}) is determined by the system model and follows the distribution of the process noise. p(x_{k-1} | z_{1:k-1}) is known: it is the posterior state at timestep k-1. this is where the recursion comes from.
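on a discretized state the prediction integral becomes a sum. a minimal histogram-filter sketch over a hypothetical 3-cell world (the transition probabilities below are assumed values for a noisy "move right one cell" action):

```python
def predict(prior, transition):
    """p(x_k = i | z_{1:k-1}) = sum_j p(x_k = i | x_{k-1} = j) p(x_{k-1} = j | z_{1:k-1})"""
    n = len(prior)
    predicted = [0.0] * n
    for j in range(n):
        for i in range(n):
            predicted[i] += transition[j][i] * prior[j]
    return predicted

# row j: distribution over the next cell, given the robot is in cell j;
# the action "move right one cell" succeeds with probability 0.8 (assumed)
transition = [
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
    [0.8, 0.1, 0.1],  # the world wraps around
]
belief = [1.0, 0.0, 0.0]           # certain the robot starts in cell 0
belief = predict(belief, transition)
print(belief)                       # mass spreads: [0.1, 0.8, 0.1]
```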

update

equation (b) is used to update the current posterior state. the denominator of (b) is a normalizing constant. p(z_k | x_k) is the likelihood, determined by the measurement model.
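on a discretized state, the update step multiplies the predicted belief by the measurement likelihood p(z_k | x_k) and renormalizes; the normalizer eta plays the role of the constant denominator in (b). a sketch over a hypothetical 3-cell world, with assumed likelihood values:

```python
def update(predicted, likelihood):
    """posterior p(x_k = i | z_{1:k}) ∝ p(z_k | x_k = i) * p(x_k = i | z_{1:k-1})"""
    unnormalized = [l * p for l, p in zip(likelihood, predicted)]
    eta = sum(unnormalized)            # the denominator p(z_k | z_{1:k-1})
    return [u / eta for u in unnormalized]

predicted = [0.1, 0.8, 0.1]     # belief after a prediction step (assumed)
likelihood = [0.2, 0.2, 0.9]    # sensor reading strongly favors cell 2 (assumed)
posterior = update(predicted, likelihood)
print(posterior)                # measurement pulls belief mass toward cell 2
```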