Merge branch 'master' of github.com:rpng/open_vins
goldbattle committed Feb 28, 2020
2 parents 9c43108 + 4e0bd81 commit 7940dd8
Showing 8 changed files with 96 additions and 72 deletions.
22 changes: 16 additions & 6 deletions ReadMe.md
@@ -14,11 +14,12 @@ Please take a look at the feature list below for full details on what the system
* Github project page - https://github.com/rpng/open_vins
* Documentation - https://docs.openvins.com/
* Getting started guide - https://docs.openvins.com/getting-started.html
* Publication reference - TBD
* Publication reference - http://udel.edu/~ghuang/iros19-vins-workshop/papers/06.pdf


## News / Events

* **January 21, 2020** - Our paper has been accepted for presentation in [ICRA 2020](https://www.icra2020.org/). We look forward to seeing everybody there! We have also added links to a few videos of the system running on different datasets.
* **October 23, 2019** - OpenVINS placed first in the [IROS 2019 FPV Drone Racing VIO Competition
](http://rpg.ifi.uzh.ch/uzh-fpv.html). We will be giving a short presentation at the [workshop](https://wp.nyu.edu/workshopiros2019mav/) at 12:45pm in Macau on November 8th.
* **October 1, 2019** - We will be presenting at the [Visual-Inertial Navigation: Challenges and Applications
@@ -60,7 +61,17 @@ Please take a look at the feature list below for full details on what the system
* Out of the box evaluation on EurocMav and TUM-VI datasets
* Extensive evaluation suite (ATE, RPE, NEES, RMSE, etc..)

## Demo Videos

[![IMAGE ALT TEXT](http://img.youtube.com/vi/KCX51GvYGss/1.jpg)](http://www.youtube.com/watch?v=KCX51GvYGss "OpenVINS - EuRoC MAV Vicon Rooms Flyby")
[![IMAGE ALT TEXT](http://img.youtube.com/vi/Lc7VQHngSuQ/1.jpg)](http://www.youtube.com/watch?v=Lc7VQHngSuQ "OpenVINS - TUM VI Datasets Flyby")
[![IMAGE ALT TEXT](http://img.youtube.com/vi/vaia7iPaRW8/1.jpg)](http://www.youtube.com/watch?v=vaia7iPaRW8 "OpenVINS - UZH-FPV Drone Racing Dataset Flyby")
[![IMAGE ALT TEXT](http://img.youtube.com/vi/MCzTF9ye2zw/1.jpg)](http://www.youtube.com/watch?v=MCzTF9ye2zw "OpenVINS - KAIST Urban 39 Dataset Demonstration")


[![IMAGE ALT TEXT](http://img.youtube.com/vi/187AXuuGNNw/1.jpg)](http://www.youtube.com/watch?v=187AXuuGNNw "OpenVINS - EuRoC MAV Vicon Rooms Demonstration")
[![IMAGE ALT TEXT](http://img.youtube.com/vi/oUoLlrFryk0/1.jpg)](http://www.youtube.com/watch?v=oUoLlrFryk0 "OpenVINS - TUM VI Datasets Demonstration")
[![IMAGE ALT TEXT](http://img.youtube.com/vi/ExPIGwORm4E/1.jpg)](http://www.youtube.com/watch?v=ExPIGwORm4E "OpenVINS - UZH-FPV Drone Racing Dataset Demonstration")


## Credit / Licensing
@@ -69,13 +80,12 @@ This code was written by the [Robot Perception and Navigation Group (RPNG)](http
If you have any issues with the code please open an issue on our github page with relevant implementation details and references.
For researchers that have leveraged or compared to this work, please cite the following:
```txt
@Conference{Geneva2019IROSws,
@Conference{Geneva2020ICRA,
Title = {OpenVINS: A Research Platform for Visual-Inertial Estimation},
Author = {Patrick Geneva and Kevin Eckenhoff and Woosik Lee and Yulin Yang and Guoquan Huang},
Booktitle = {{IROS} 2019 Workshop on Visual-Inertial Navigation: Challenges and Applications},
Year = {2019},
Address = {Macau, China},
Month = nov,
Booktitle = {Proc. of the IEEE International Conference on Robotics and Automation},
Year = {2020},
Address = {Paris, France},
Url = {\url{https://github.com/rpng/open_vins}}
}
```
20 changes: 13 additions & 7 deletions docs/propagation.dox
@@ -297,23 +297,29 @@ That is,
where \f$\mathbf{n} = [ \mathbf{n}_g ~ \mathbf{n}_a ~ \mathbf{n}_{bg} ~ \mathbf{n}_{ba} ]^\top\f$ are the IMU sensor noises.


For the orientation error propagation, we start with the rotation matrix perturbation using
\f${}^{I}_G \mathbf{R} \approx (\mathbf{I}_3 - \lfloor ^{I}_{G}\tilde{\boldsymbol{\theta}}\times\rfloor)^{I}_{G} \hat{\mathbf{R}}\f$:

For the orientation error propagation, we start with the \f$\mathbf{SO}(3)\f$ perturbation using
\f${}^{I}_G \mathbf{R} \approx (\mathbf{I}_3 - \lfloor ^{I}_{G}\tilde{\boldsymbol{\theta}}\times\rfloor)^{I}_{G} \hat{\mathbf{R}}\f$
:

\f{align*}{
^{I_{k+1}}_G \mathbf{R} &= \text{}^{I_{k+1}}_{I_{k}} \mathbf{R} \text{}^{I_{k}}_G \mathbf{R} \\
{}^{I_{k+1}}_G \mathbf{R} &= \text{}^{I_{k+1}}_{I_{k}} \mathbf{R} \text{}^{I_{k}}_G \mathbf{R} \\
(\mathbf{I}_3 - \lfloor ^{I_{k+1}}_{G}\tilde{\boldsymbol{\theta}}\times\rfloor)^{I_{k+1}}_{G}
\hat{\mathbf{R}}
&\approx \text{}^{I_{k+1}}_{I_{k}} \hat{\mathbf{R}}
(\mathbf{I}_3 - \lfloor \mathbf J_r(^{I_{k+1}}_{I_k}\hat{\boldsymbol{\theta}})
&\approx \textrm{exp}(-{}^{I_{k}}\hat{\boldsymbol{\omega}}\Delta t - {}^{I_{k}}\tilde{\boldsymbol{\omega}}\Delta t)
(\mathbf{I}_3 - \lfloor ^{I_{k}}_{G}\tilde{\boldsymbol{\theta}}\times\rfloor)^{I_{k}}_{G}
\hat{\mathbf{R}}\\
&=\textrm{exp}(-{}^{I_{k}}\hat{\boldsymbol{\omega}}\Delta t)\textrm{exp}(-\mathbf{J}_r(-{}^{I_{k}}\hat{\boldsymbol{\omega}}\Delta t){}^{I_{k}}\tilde{\boldsymbol{\omega}}\Delta t)
(\mathbf{I}_3 - \lfloor ^{I_{k}}_{G}\tilde{\boldsymbol{\theta}}\times\rfloor)^{I_{k}}_{G}
\hat{\mathbf{R}}\\
&=\text{}^{I_{k+1}}_{I_{k}} \hat{\mathbf{R}}
(\mathbf{I}_3 - \lfloor \mathbf J_r(-{}^{I_{k}}\hat{\boldsymbol{\omega}}\Delta t)
\tilde{\boldsymbol{\omega}}_k\Delta t \times\rfloor)
(\mathbf{I}_3 - \lfloor ^{I_k}_{G}\tilde{\boldsymbol{\theta}}\times\rfloor)
\text{}^{I_{k}}_G \hat{\mathbf{R}}
\f}

where \f$\tilde{\boldsymbol{\omega}} = \boldsymbol{\omega} - \hat{\boldsymbol{\omega}}
= -(\tilde{\mathbf{b}}_{\mathbf{g}} + \mathbf{n}_g)\f$.
= -(\tilde{\mathbf{b}}_{\mathbf{g}} + \mathbf{n}_g)\f$ captures both the bias perturbation and the measurement noise.
\f$\mathbf {J}_r(\boldsymbol{\theta})\f$ is the right Jacobian of \f$\mathbf{SO}(3)\f$
that maps the variation of rotation angle in the parameter vector space into the variation in the tangent vector space to the manifold
[see @ref ov_core::Jr_so3()].
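
For reference, a minimal sketch of that closed-form right Jacobian, written against Eigen with an assumed small-angle fallback (the helper names here are illustrative only, not the `ov_core::Jr_so3()` source):

```cpp
#include <Eigen/Dense>
#include <cmath>

// Skew-symmetric matrix so that skew(w) * v == w x v.
static Eigen::Matrix3d skew(const Eigen::Vector3d &w) {
    Eigen::Matrix3d W;
    W <<    0.0, -w(2),  w(1),
           w(2),   0.0, -w(0),
          -w(1),  w(0),   0.0;
    return W;
}

// Right Jacobian of SO(3):
//   Jr(theta) = I - (1-cos|t|)/|t|^2 [t]x + (|t|-sin|t|)/|t|^3 [t]x^2
// with Jr -> I as |theta| -> 0 (the cutoff value is an assumption).
static Eigen::Matrix3d Jr_so3_sketch(const Eigen::Vector3d &theta) {
    double t = theta.norm();
    if (t < 1e-8)
        return Eigen::Matrix3d::Identity();
    Eigen::Matrix3d K = skew(theta);
    return Eigen::Matrix3d::Identity()
           - (1.0 - std::cos(t)) / (t * t) * K
           + (t - std::sin(t)) / (t * t * t) * K * K;
}
```

In the propagation above this Jacobian is evaluated at \f$-{}^{I_k}\hat{\boldsymbol{\omega}}\Delta t\f$, i.e. at the estimated rotation over the integration interval.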
100 changes: 52 additions & 48 deletions ov_core/src/sim/BsplineSE3.cpp
@@ -270,41 +270,46 @@ bool BsplineSE3::find_bounding_poses(double timestamp, std::map<double,Eigen::Ma
pose1 = Eigen::Matrix4d::Identity();

// Find the bounding poses
double min_time = -INFINITY;
double max_time = INFINITY;
bool found_older = false;
bool found_newer = false;

// Find the bounding poses for interpolation. If no older one is found, measurement is unusable
for(std::pair<const double, Eigen::MatrixXd> &pose : poses) {
if (pose.first > min_time && pose.first <= timestamp){
min_time = pose.first;
// Find the bounding poses for interpolation.
auto lower_bound = poses.lower_bound(timestamp); // Finds timestamp or next(timestamp) if not available
auto upper_bound = poses.upper_bound(timestamp); // Finds next(timestamp)

if(lower_bound != poses.end()) {
// Check that the lower bound is the timestamp.
// If not then we move iterator to previous timestamp so that the timestamp is bounded
if(lower_bound->first == timestamp) {
found_older = true;
} else if(lower_bound != poses.begin()) {
--lower_bound;
found_older = true;
}
if (pose.first < max_time && pose.first > timestamp){
max_time = pose.first;
found_newer = true;
}
}

// If we found the oldest one, set it
if(upper_bound != poses.end()) {
found_newer = true;
}

// If we found the older one, set it
if (found_older) {
t0 = min_time;
pose0 = poses.at(min_time);
t0 = lower_bound->first;
pose0 = lower_bound->second;
}

// If we found the newest one, set it
if(found_newer) {
t1 = max_time;
pose1 = poses.at(max_time);
if (found_newer) {
t1 = upper_bound->first;
pose1 = upper_bound->second;
}

// Assert the timestamps
if(found_older && found_newer)
assert(t0<t1);
if (found_older && found_newer)
assert(t0 < t1);

// Return true if we found both bounds
return (found_older && found_newer);
return (found_older && found_newer);

}
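
A self-contained sketch of the bound lookup above, using plain integer values in place of the spline's `Eigen::MatrixXd` poses (the numbers are made up for illustration): `lower_bound(t)` returns the first entry with key `>=` `t` and `upper_bound(t)` the first with key `>` `t`, so after stepping the lower iterator back when it overshoots, the pair brackets the query timestamp.

```cpp
#include <cassert>
#include <map>

int main() {
    std::map<double, int> poses = {{1.0, 10}, {2.0, 20}, {3.0, 30}};

    // Query strictly between two keys: lower_bound overshoots to 3.0,
    // so step it back one entry to get the older bounding pose.
    double timestamp = 2.5;
    auto lower = poses.lower_bound(timestamp);   // -> key 3.0 (first >= 2.5)
    auto upper = poses.upper_bound(timestamp);   // -> key 3.0 (first >  2.5)
    if (lower != poses.end() && lower->first != timestamp && lower != poses.begin())
        --lower;
    assert(lower->first == 2.0 && upper->first == 3.0);

    // Query exactly on a key: lower_bound lands on it, upper_bound skips past it.
    timestamp = 2.0;
    assert(poses.lower_bound(timestamp)->first == 2.0);
    assert(poses.upper_bound(timestamp)->first == 3.0);
    return 0;
}
```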

@@ -328,47 +333,46 @@ bool BsplineSE3::find_bounding_control_points(double timestamp, std::map<double,
bool success = find_bounding_poses(timestamp, poses, t1, pose1, t2, pose2);

// Return false if this was a failure
if(!success)
if (!success)
return false;

// Now find the poses that are below and above
double min_time = -INFINITY;
double max_time = INFINITY;
bool found_min_of_min = false;
bool found_max_of_max = false;
for(std::pair<const double, Eigen::MatrixXd> &pose : poses) {
if (pose.first > min_time && pose.first < t1){
min_time = pose.first;
found_min_of_min = true;
}
if (pose.first < max_time && pose.first > t2){
max_time = pose.first;
found_max_of_max = true;
}
}
auto iter_t1 = poses.find(t1);
auto iter_t2 = poses.find(t2);

// If we found the oldest one, set it
if (found_min_of_min) {
t0 = min_time;
pose0 = poses.at(min_time);
// Check that t1 is not the first timestamp
if(iter_t1 == poses.begin()) {
return false;
}

// If we found the newest one, set it
if(found_max_of_max) {
t3 = max_time;
pose3 = poses.at(max_time);
// Move the older pose backwards in time
// Move the newer one forwards in time
auto iter_t0 = --iter_t1;
auto iter_t3 = ++iter_t2;

// Check that it is valid
if (iter_t3 == poses.end()) {
return false;
}

// Set the oldest one
t0 = iter_t0->first;
pose0 = iter_t0->second;

// Set the newest one
t3 = iter_t3->first;
pose3 = iter_t3->second;

// Assert the timestamps
if(success && found_min_of_min && found_max_of_max) {
assert(t0<t1);
assert(t1<t2);
assert(t2<t3);
if (success) {
assert(t0 < t1);
assert(t1 < t2);
assert(t2 < t3);
}

// Return true if we found all four bounding poses
return (success && found_min_of_min && found_max_of_max);

return success;
}
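
The same stepping logic can also be written with `std::prev`/`std::next`; the helper below is a hypothetical, simplified sketch of that guard-then-step pattern (it uses `Eigen::Matrix4d` values and a free-function name of its own, not the class method or its `Eigen::MatrixXd` signature).

```cpp
#include <iterator>
#include <map>
#include <Eigen/Dense>

// Given bounding timestamps t1 < t2 already present in the map, fetch the
// control points one step outside them; fail cleanly at either end.
bool outer_control_points(const std::map<double, Eigen::Matrix4d> &poses,
                          double t1, double t2,
                          double &t0, Eigen::Matrix4d &pose0,
                          double &t3, Eigen::Matrix4d &pose3) {
    auto it1 = poses.find(t1);
    auto it2 = poses.find(t2);
    if (it1 == poses.end() || it2 == poses.end())
        return false;                 // both bounds must be actual map keys
    if (it1 == poses.begin())
        return false;                 // nothing older than t1
    auto it0 = std::prev(it1);        // one pose back in time
    auto it3 = std::next(it2);        // one pose forward in time
    if (it3 == poses.end())
        return false;                 // nothing newer than t2
    t0 = it0->first;  pose0 = it0->second;
    t3 = it3->first;  pose3 = it3->second;
    return true;
}
```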


2 changes: 1 addition & 1 deletion ov_core/src/sim/Simulator.cpp
@@ -38,7 +38,7 @@ Simulator::Simulator(ros::NodeHandle& nh) {
ROS_INFO("=======================================");

// Load the groundtruth trajectory and its spline
std::string path_traj = "/home/patrick/workspace/catkin_ws_ov/src/open_vins/ov_data/sim/udel_gore.txt";
std::string path_traj = "./src/open_vins/ov_data/sim/udel_gore.txt";
nh.param<std::string>("sim_traj_path", path_traj, path_traj);
load_data(path_traj);
spline.feed_trajectory(traj_data);
2 changes: 1 addition & 1 deletion ov_core/src/track/TrackBase.h
@@ -245,7 +245,7 @@ namespace ov_core {
/**
* @brief Changes the ID of an actively tracked feature to another one
* @param id_old Old id we want to change
* @param id_old Id we want to change the old id to
* @param id_new Id we want to change the old id to
*/
void change_feat_id(size_t id_old, size_t id_new) {

14 changes: 9 additions & 5 deletions ov_msckf/src/core/RosVisualizer.cpp
@@ -245,6 +245,8 @@ void RosVisualizer::publish_state() {
poses_seq_imu++;

// Publish our transform on TF
// NOTE: since we use JPL we have an implicit conversion to Hamilton when we publish
// NOTE: a rotation from GtoI in JPL has the same xyzw as a ItoG Hamilton rotation
tf::StampedTransform trans;
trans.stamp_ = ros::Time::now();
trans.frame_id_ = "global";
@@ -258,16 +260,18 @@
// Loop through each camera calibration and publish it
for(const auto &calib : state->get_calib_IMUtoCAMs()) {
// need to flip the transform to the IMU frame
Eigen::Vector4d q_CtoI = Inv(calib.second->quat());
Eigen::Vector3d p_IinC = -calib.second->Rot().transpose()*calib.second->pos();
// publish
Eigen::Vector4d q_ItoC = calib.second->quat();
Eigen::Vector3d p_CinI = -calib.second->Rot().transpose()*calib.second->pos();
// publish our transform on TF
// NOTE: since we use JPL we have an implicit conversion to Hamilton when we publish
// NOTE: a rotation from ItoC in JPL has the same xyzw as a CtoI Hamilton rotation
tf::StampedTransform trans;
trans.stamp_ = ros::Time::now();
trans.frame_id_ = "imu";
trans.child_frame_id_ = "cam"+std::to_string(calib.first);
tf::Quaternion quat(q_CtoI(0),q_CtoI(1),q_CtoI(2),q_CtoI(3));
tf::Quaternion quat(q_ItoC(0),q_ItoC(1),q_ItoC(2),q_ItoC(3));
trans.setRotation(quat);
tf::Vector3 orig(p_IinC(0),p_IinC(1),p_IinC(2));
tf::Vector3 orig(p_CinI(0),p_CinI(1),p_CinI(2));
trans.setOrigin(orig);
mTfBr->sendTransform(trans);
}
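
The two NOTE comments can be checked numerically. The sketch below is hypothetical (not part of the visualizer): it builds the rotation matrix from the same `xyzw` values under both quaternion conventions and confirms the JPL matrix is the transpose of the Hamilton one, which is why a `GtoI` (or `ItoC`) JPL quaternion can be handed to `tf` unchanged as an `ItoG` (or `CtoI`) Hamilton rotation.

```cpp
#include <Eigen/Dense>
#include <iostream>

static Eigen::Matrix3d skew(const Eigen::Vector3d &v) {
    Eigen::Matrix3d S;
    S <<   0.0, -v(2),  v(1),
          v(2),   0.0, -v(0),
         -v(1),  v(0),   0.0;
    return S;
}

int main() {
    Eigen::Vector4d q(0.1, -0.2, 0.3, 0.9);   // xyzw, arbitrary example values
    q.normalize();
    Eigen::Vector3d qv = q.head<3>();
    double qw = q(3);

    // Shared terms of the standard quaternion-to-rotation formulas.
    Eigen::Matrix3d common = (qw * qw - qv.dot(qv)) * Eigen::Matrix3d::Identity()
                           + 2.0 * qv * qv.transpose();
    Eigen::Matrix3d R_jpl      = common - 2.0 * qw * skew(qv);  // JPL convention
    Eigen::Matrix3d R_hamilton = common + 2.0 * qw * skew(qv);  // Hamilton convention

    // Same xyzw, flipped convention: the two matrices are transposes of each other.
    std::cout << (R_jpl - R_hamilton.transpose()).norm() << std::endl;  // ~0
    return 0;
}
```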
2 changes: 1 addition & 1 deletion ov_msckf/src/core/VioManager.cpp
@@ -724,7 +724,7 @@ void VioManager::do_feature_propagate_update(double timestamp) {
ROS_INFO("\u001b[34m[TIME]: %.4f seconds for total\u001b[0m",(rT6-rT1).total_microseconds() * 1e-6);

// Update our distance traveled
if(state->get_clones().find(timelastupdate) != state->get_clones().end()) {
if(timelastupdate != -1 && state->get_clones().find(timelastupdate) != state->get_clones().end()) {
Eigen::Matrix<double,3,1> dx = state->imu()->pos() - state->get_clone(timelastupdate)->pos();
distance += dx.norm();
}
6 changes: 3 additions & 3 deletions ov_msckf/src/state/State.h
@@ -139,7 +139,7 @@ namespace ov_msckf {
}

/// Access to all current clones in the state
std::unordered_map<double, PoseJPL*> get_clones() {
std::map<double, PoseJPL*> get_clones() {
return _clones_IMU;
}
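
A small illustration of what the container change buys (a hedged sketch with string placeholders instead of `PoseJPL*`): a `std::map` keyed by time always iterates clones oldest-first, so the front of the map is the natural marginalization candidate, whereas `std::unordered_map` gives no ordering guarantee.

```cpp
#include <cassert>
#include <map>
#include <string>

int main() {
    // Clone timestamps inserted out of order; strings stand in for PoseJPL*.
    std::map<double, std::string> clones;
    clones[1.3] = "clone_c";
    clones[1.1] = "clone_a";
    clones[1.2] = "clone_b";

    // Keys are kept sorted, so begin() is the oldest clone and rbegin() the newest.
    assert(clones.begin()->first == 1.1);
    assert(clones.rbegin()->first == 1.3);
    return 0;
}
```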

@@ -194,7 +194,7 @@ namespace ov_msckf {
IMU *_imu;

/// Map between imaging times and clone poses (q_GtoIi, p_IiinG)
std::unordered_map<double, PoseJPL*> _clones_IMU;
std::map<double, PoseJPL*> _clones_IMU;

/// Our current set of SLAM features (3d positions)
std::unordered_map<size_t, Landmark*> _features_SLAM;
@@ -215,7 +215,7 @@ namespace ov_msckf {
Eigen::MatrixXd _Cov;

/// Vector of variables
std::vector<Type *> _variables;
std::vector<Type*> _variables;


private:
