
An Intuitive Tutorial on Bayesian Filtering

Introduction

This short tutorial aims to help readers understand Bayesian filtering intuitively. Instead of deriving the Kalman filter, I introduce it starting from the weighted average and the moving average. I expect that readers will gain intuition about the Kalman filter, such as the meaning of its equations.

  • To clone this repository (codes and slides): git clone https://github.com/mint-lab/filtering_tutorial.git
  • To install required Python packages: pip install -r requirements.txt
  • To fork this repository to your GitHub account: Click here
  • To download codes and slides as a ZIP file: Click here
  • To see the lecture slides: Click here
  • 📝 Please don't miss Roger Labbe's great book, Kalman and Bayesian Filters in Python

This tutorial contains example applications to 2-D localization (a.k.a. target tracking) under various conditions and situations. It is important for users to know how to define the following five items for their own applications, and I hope that readers can build this intuition from my series of examples. My codes are based on FilterPy, but their contents, and your understanding, need not be limited to that library.

  1. State variable
  2. State transition function
  3. State transition noise
  4. Observation function
  5. Observation noise

Code Examples

1) From Weighted and Moving Average to Simple 1-D Kalman Filter

  • 1-D noisy signal filtering: kf_1d_signal.py

    • State variable: $\mathbf{x} = x$
    • State transition function: $\mathbf{x}_{k+1} = f(\mathbf{x}_k; \mathbf{u}_{k+1}) = \mathbf{x}_k$
      • Control input: $\mathbf{u}_k = [ ]$
    • State transition noise: $\mathrm{Q} = \sigma^2_Q$
    • Observation function: $\mathbf{z} = h(\mathbf{x}) = \mathbf{x}$
      • Observation: 1-D signal value
    • Observation noise: $\mathrm{R} = \sigma^2_{R}$
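The 1-D filter specified above can be sketched with plain NumPy (the repository's code uses FilterPy; the noise values below are placeholders, not those from kf_1d_signal.py). Since the state transition is the identity, the predict step only inflates the variance, and the update step blends prediction and measurement with the Kalman gain, exactly like a weighted average.

```python
import numpy as np

def kalman_1d(zs, Q=1e-5, R=0.01, x0=0.0, P0=1.0):
    """Scalar Kalman filter for a constant signal: x_{k+1} = x_k, z = x."""
    x, P = x0, P0
    estimates = []
    for z in zs:
        # Predict: the transition is the identity, so only the variance grows.
        P = P + Q
        # Update: the Kalman gain weights the measurement against the prediction.
        K = P / (P + R)
        x = x + K * (z - x)
        P = (1.0 - K) * P
        estimates.append(x)
    return np.array(estimates)

# Noisy measurements of a constant true value 1.0
rng = np.random.default_rng(0)
zs = 1.0 + 0.1 * rng.standard_normal(500)
xs = kalman_1d(zs)
```

With a small $\sigma^2_Q$, the steady-state gain becomes small and the filter behaves like a long moving average, which is the intuition this section builds on.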
  • 2-D position tracking (without class inheritance): kf_2d_position.py

    • State variable: $\mathbf{x} = [x, y]^\top$
    • State transition function: $\mathbf{x}_{k+1} = f(\mathbf{x}_k; \mathbf{u}_{k+1}) = \mathbf{x}_k$
      • Control input: $\mathbf{u}_k = [ ]$
    • State transition noise: $\mathrm{Q} = \mathrm{diag}(\sigma^2_x, \sigma^2_y)$
    • Observation function: $\mathbf{z} = h(\mathbf{x}) = [x, y]^\top$
      • Observation: $\mathbf{z} = [x_{GPS}, y_{GPS}]^\top$
    • Observation noise: $\mathrm{R} = \mathrm{diag}(\sigma^2_{GPS}, \sigma^2_{GPS})$
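In matrix form, the 2-D position tracker above is the same filter with diagonal $\mathrm{Q}$ and $\mathrm{R}$. The following is a minimal sketch in plain NumPy (the actual values and structure live in kf_2d_position.py; the noise levels here are placeholders):

```python
import numpy as np

Q = np.diag([0.01, 0.01])   # state transition noise (sigma_x^2, sigma_y^2), placeholder values
R = np.diag([0.25, 0.25])   # GPS noise (sigma_GPS^2 on each axis), placeholder values
F = np.eye(2)               # static state: x_{k+1} = x_k
H = np.eye(2)               # the position is observed directly

def kf_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros((2, 1)), np.eye(2)
x, P = kf_step(x, P, np.array([[1.0], [2.0]]))
```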
  • 2-D pose tracking with simple transition noise (without class inheritance): ekf_2d_pose_simple_noise.py
    • State variable: $\mathbf{x} = [x, y, \theta, v, w]^\top$
    • State transition function: Constant velocity model (time interval: $t$)
      • Control input: $\mathbf{u}_k = [ ]$
$$\mathbf{x}_{k+1} = f(\mathbf{x}_k; \mathbf{u}_{k+1}) = \begin{bmatrix} x_k + v_k t \cos(\theta_k + w_k t / 2) \\ y_k + v_k t \sin(\theta_k + w_k t / 2) \\ \theta_k + w_k t \\ v_k \\ w_k \end{bmatrix}$$
    • State transition noise: $\mathrm{Q} = \mathrm{diag}(\sigma^2_x, \sigma^2_y, \sigma^2_\theta, \sigma^2_v, \sigma^2_w)$
    • Observation function: $\mathbf{z} = h(\mathbf{x}) = [x, y]^\top$
      • Observation: $\mathbf{z} = [x_{GPS}, y_{GPS}]^\top$
    • Observation noise: $\mathrm{R} = \mathrm{diag}(\sigma^2_{GPS}, \sigma^2_{GPS})$
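The constant velocity model above is nonlinear in $\theta$, which is why this example moves from the Kalman filter to the EKF. A direct transcription of the transition function (a sketch, not the code from ekf_2d_pose_simple_noise.py):

```python
import numpy as np

def f(state, dt):
    """Constant velocity motion model; state = [x, y, theta, v, w]."""
    x, y, theta, v, w = state
    return np.array([
        x + v * dt * np.cos(theta + w * dt / 2),  # advance along the mid-interval heading
        y + v * dt * np.sin(theta + w * dt / 2),
        theta + w * dt,                           # integrate the angular rate
        v,                                        # velocities are assumed constant
        w,
    ])

# Moving straight along +x: heading 0, v = 1, no rotation
state = f(np.array([0.0, 0.0, 0.0, 1.0, 0.0]), dt=0.1)
```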
  • 2-D pose tracking (using class inheritance): ekf_2d_pose.py

    • Its state variable, state transition function, observation function, and observation noise are the same as in ekf_2d_pose_simple_noise.py.
    • State transition noise derived from the translational and rotational motion noises ($\sigma_v^2$ and $\sigma_w^2$) [Thrun05]
$$\mathrm{Q} = \mathrm{W} \mathrm{M} \mathrm{W}^\top \quad \text{where} \quad \mathrm{W} = \begin{bmatrix} \frac{\partial f}{\partial v} & \frac{\partial f}{\partial w} \end{bmatrix} \quad \text{and} \quad \mathrm{M} = \begin{bmatrix} \sigma^2_v & 0 \\ 0 & \sigma^2_w \end{bmatrix}$$
    • State transition function with angular rate saturation (maximum angular rate: $w_{max}$)
$$\mathbf{x}_{k+1} = f(\mathbf{x}_k; \mathbf{u}_{k+1}) = \begin{bmatrix} x_k + v_k t \cos(\theta_k + w_k t / 2) \\ y_k + v_k t \sin(\theta_k + w_k t / 2) \\ \theta_k + w_k t \\ v_k \\ w_{max} \tanh{(w_k / w_{max})} \end{bmatrix}$$
    • Post-processing with heading angle correction [Cho24]
$$\mathbf{x}_k = \left\{ \begin{array}{ll} [x_k, y_k, \theta_k+\pi, -v_k, w_k]^\top & \text{if} \;\; v_k < \epsilon_- \\ \mathbf{x}_k & \text{otherwise} \end{array} \right.$$
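The motion-derived transition noise can be sketched as follows. $\mathrm{W}$ is the 5×2 Jacobian of $f$ with respect to $(v, w)$, so the product is taken as $\mathrm{W}\mathrm{M}\mathrm{W}^\top$ to yield a 5×5 matrix. This is a minimal sketch: the noise levels are placeholders, and the $\tanh$ saturation on $w$ is ignored when taking derivatives.

```python
import numpy as np

def transition_noise_Q(state, dt, sigma_v=0.1, sigma_w=0.1):
    """Q = W M W^T with W = [df/dv, df/dw], following [Thrun05]."""
    _, _, theta, v, w = state
    a = theta + w * dt / 2                        # mid-interval heading
    W = np.array([
        [dt * np.cos(a), -v * dt**2 * np.sin(a) / 2],  # d(x')/dv, d(x')/dw
        [dt * np.sin(a),  v * dt**2 * np.cos(a) / 2],  # d(y')/dv, d(y')/dw
        [0.0,             dt],                         # d(theta')/dv, d(theta')/dw
        [1.0,             0.0],                        # d(v')/dv, d(v')/dw
        [0.0,             1.0],                        # d(w')/dv, d(w')/dw
    ])
    M = np.diag([sigma_v**2, sigma_w**2])
    return W @ M @ W.T

Q = transition_noise_Q(np.array([0.0, 0.0, 0.0, 1.0, 0.1]), dt=0.1)
```

Unlike the diagonal $\mathrm{Q}$ of the previous example, this construction correlates the noise across state components according to the current pose and velocity.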
  • 2-D pose tracking with odometry: ekf_2d_pose_odometry.py
    • Its state transition noise, observation function, and observation noise are the same as in ekf_2d_pose.py.
    • State variable: $\mathbf{x} = [x, y, \theta]^\top$
    • State transition function with translational and rotational velocity control inputs (time interval: $t$)
      • Control input: $\mathbf{u}_{k+1} = [v_{k+1}, w_{k+1}]^\top$
$$\mathbf{x}_{k+1} = f(\mathbf{x}_k; \mathbf{u}_{k+1}) = \begin{bmatrix} x_k + v_{k+1} t \cos(\theta_k + w_{k+1} t / 2) \\ y_k + v_{k+1} t \sin(\theta_k + w_{k+1} t / 2) \\ \theta_k + w_{k+1} t \end{bmatrix}$$
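Because the velocities now come from odometry as control inputs, the state shrinks to the pose alone. A sketch of this transition (hypothetical helper, not the code from ekf_2d_pose_odometry.py):

```python
import numpy as np

def f_odometry(state, u, dt):
    """Pose-only transition driven by odometry u = (v, w); state = [x, y, theta]."""
    x, y, theta = state
    v, w = u
    return np.array([
        x + v * dt * np.cos(theta + w * dt / 2),
        y + v * dt * np.sin(theta + w * dt / 2),
        theta + w * dt,
    ])

# One step straight ahead with v = 1 m/s and no rotation
pose = f_odometry(np.array([0.0, 0.0, 0.0]), u=(1.0, 0.0), dt=0.1)
```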
  • 2-D pose tracking with off-centered GPS: ekf_2d_pose_off_centered.py
    • Its state variable, state transition function, state transition noise, and observation noise are the same as in ekf_2d_pose.py.
    • Observation function with off-centered GPS [Choi20] ($o_x$ and $o_y$ are the frontal and lateral offsets of the GPS.)

$$\mathbf{z} = \begin{bmatrix} x_{GPS} \\ y_{GPS} \end{bmatrix} = h(\mathbf{x}) = \begin{bmatrix} x + o_x \cos \theta - o_y \sin \theta \\ y + o_x \sin \theta + o_y \cos \theta \end{bmatrix}$$
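The observation function above rotates the mounting offset by the heading and adds it to the position, so the GPS reading now depends on $\theta$ as well. A sketch with placeholder offsets (the real values belong to the robot's mounting geometry):

```python
import numpy as np

def h_off_centered(state, o_x, o_y):
    """Expected GPS reading for an antenna mounted o_x ahead and o_y to the
    left of the robot center; state = [x, y, theta, ...]."""
    x, y, theta = state[0], state[1], state[2]
    return np.array([
        x + o_x * np.cos(theta) - o_y * np.sin(theta),
        y + o_x * np.sin(theta) + o_y * np.cos(theta),
    ])

# Robot at (1, 2) facing +y with the antenna 0.5 m ahead of the center
z = h_off_centered(np.array([1.0, 2.0, np.pi / 2]), o_x=0.5, o_y=0.0)
```

Since $h$ now involves $\theta$, GPS measurements also constrain the heading, which is what makes the off-centered mounting useful [Choi20].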

References

How to Cite

If you want to cite this tutorial and codes, please cite one of the following papers.

@article{Cho24,
    author      = {Se-Hyoung Cho and Sunglok Choi},
    title       = {Accurate and Resilient {GPS}-only Localization with Velocity Constraints},
    journal     = {IEEE Access},
    volume      = {12},
    year        = {2024},
    doi         = {10.1109/ACCESS.2024.3432335}
}
@article{Choi20,
    author      = {Sunglok Choi and Jong-Hwan Kim},
    title       = {Leveraging Localization Accuracy with Off-centered {GPS}},
    journal     = {IEEE Transactions on Intelligent Transportation Systems},
    volume      = {21},
    number      = {6},
    year        = {2020},
    doi         = {10.1109/TITS.2019.2915108}
}

Authors

Acknowledgement

This tutorial was supported by the following R&D projects in Korea.

  • AI-based Localization and Path Planning on 3D Building Surfaces (granted by MSIT/NRF, grant number: 2021M3C1C3096810)
