Here we stress that this is a loosely coupled method, thus no information is returned to the estimator to improve the underlying OpenVINS odometry.
This codebase has been modified in a few key areas including: exposing more loop closure parameters, subscribing to camera intrinsics, simplifying configuration such that only topics need to be supplied, and some tweaks to the loop closure detection to improve frequency.
This codebase contains the interface wrapper for exporting visual-inertial runs from [OpenVINS](https://github.com/rpng/open_vins) into the ViMap structure taken by [maplab](https://github.com/ethz-asl/maplab).
The state estimates and raw images are appended to the ViMap as OpenVINS runs through a dataset.
After completion of the dataset, features are re-extracted and triangulated with maplab's feature system.
@section gs-data-kaist KAIST Urban Dataset
The [KAIST urban dataset](https://irap.kaist.ac.kr/dataset/index.html) @cite Jeong2019IJRR is a dataset focused on autonomous driving and localization in challenging complex urban environments.
The dataset was collected in Korea with a vehicle equipped with a stereo camera pair, 2D SICK LiDARs, a 3D Velodyne LiDAR, an Xsens IMU, a fiber optic gyro (FoG), wheel encoders, and RTK GPS.
The cameras run at 10 Hz, while the Xsens IMU has a 100 Hz sensing rate.
A groundtruth "baseline" trajectory is also provided, which is the output of fusing the FoG, RTK GPS, and wheel encoders.
The key idea of the zero velocity update (ZUPT) is to allow the system to reduce its uncertainty by leveraging motion knowledge (i.e., the fact that the system is stationary).
This is of particular importance in cases where we have a monocular system without any temporal SLAM features.
In this case, if we are stationary we will be unable to triangulate features and thus will be unable to update the system.
This can be avoided by either using a stereo system or temporal SLAM features.
One problem that neither of these solves is dynamic environmental objects.
In a typical autonomous car scenario the sensor system will become stationary at stop lights, where dynamic objects, such as other cars crossing the intersection, can quickly corrupt the system.
A zero velocity update and skipping feature tracking can address these issues if we are able to classify the cases where the sensor system is at rest.
To perform the update, we create a synthetic "measurement" which says that the current **true** acceleration and angular velocity are zero.
As compared to saying the velocity is zero, we can model the uncertainty of these measurements based on the readings from our inertial measurement unit.
\f{align*}{
\mathbf{a} &= \mathbf{0} \\
\boldsymbol{\omega} &= \mathbf{0}
\f}
It is important to realize this is not strictly enforcing zero velocity, but rather constant velocity.
This means we can have a false detection at constant velocity times (zero acceleration), but this can be easily addressed by a velocity magnitude check.
We have the following measurement equation relating this above synthetic "measurement" to the currently recorded inertial readings:
It is important to note that here our actual measurement is the true \f$\mathbf{a}\f$ and \f$\boldsymbol{\omega}\f$, and thus we have the following residual, in which we subtract our measurement function from the synthetic "measurement":
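The measurement equation and residual themselves are not reproduced in this excerpt. A minimal numeric sketch is given below, assuming the standard IMU measurement model (accelerometer reads true acceleration plus rotated gravity plus bias, gyroscope reads true angular velocity plus bias); all names and conventions here are our assumptions, not OpenVINS API:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.81])  # global gravity vector (assumed convention)

def zupt_residual(a_m, w_m, R_GtoI, b_a, b_g):
    """Residual between recorded inertial readings and the synthetic
    zero-motion "measurement" (true a = 0, true omega = 0).

    Assumes the standard IMU model:
      a_m = a + R_GtoI @ g + b_a + noise
      w_m = omega + b_g + noise
    so with a = omega = 0 the predicted readings are rotated gravity
    plus the accelerometer bias, and the gyroscope bias alone.
    """
    r_a = a_m - (R_GtoI @ GRAVITY + b_a)  # accelerometer residual
    r_w = w_m - b_g                       # gyroscope residual
    return np.concatenate([r_a, r_w])
```

For a perfectly stationary, bias-free sensor with identity orientation, the accelerometer reads exactly gravity and the residual is zero.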
@section update-zerovelocity-detect Zero Velocity Detection
Zero velocity detection is itself a challenging problem which many different works have tried to address @cite Wagstaff2017IPIN, @cite Ramanandan2011TITS, @cite Davidson2009ENC.
Most works boil down to simple thresholding: the approach is to determine the optimal threshold that best classifies the zero velocity update (ZUPT) portions of the trajectory.
Other works, @cite Wagstaff2017IPIN and @cite Ramanandan2011TITS, have looked at more sophisticated methods that address the issue that this threshold can depend on the type of motion (such as running vs. walking) and on the characteristics of the platform the sensor is mounted on (we want to ignore vehicle engine vibrations and other non-essential vibrations).
We approach this detection problem by tuning a \f$\chi^2\f$ (chi-squared) threshold based on the measurement model above.
It is important to note that we also have a velocity magnitude check, aimed at rejecting constant-velocity cases with non-zero magnitude.
More specifically, we perform the following threshold check to see if we are currently at zero velocity:
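The threshold check itself is not reproduced in this excerpt. A minimal sketch of a chi-squared gate combined with the velocity-magnitude check could look like the following; the gate confidence, multiplier, velocity threshold, and all names are illustrative assumptions rather than OpenVINS parameters:

```python
import numpy as np
from scipy.stats import chi2

def is_stationary(residual, S, velocity, chi2_mult=1.0, vel_thresh=0.1):
    """Hypothetical zero-velocity detector: chi-squared gating on the
    ZUPT residual, plus a velocity-magnitude check that rejects the
    constant-velocity (zero-acceleration) false-positive case.

    residual : stacked accel/gyro residual (6,)
    S        : residual covariance (6, 6)
    velocity : current estimated velocity (3,)
    """
    dof = residual.size                                    # 6: accel + gyro
    stat = float(residual @ np.linalg.solve(S, residual))  # Mahalanobis distance
    gate = chi2_mult * chi2.ppf(0.95, dof)                 # 95% confidence gate
    return stat < gate and float(np.linalg.norm(velocity)) < vel_thresh
```

A zero residual with a small velocity estimate passes the gate; a large residual, or a small residual with a large velocity magnitude (the constant-velocity case), is rejected.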
We found that in real-world experiments, the inertial measurement noise \f$\mathbf{R}\f$ typically needs to be inflated by 50-100 times to allow for proper detection.
This hints that we may be using overconfident inertial noise values, or that additional frequencies (such as motor vibrations) inject extra noise.
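The effect of inflating \f$\mathbf{R}\f$ on the gate is easy to see numerically: scaling the covariance by a factor \f$k\f$ divides the chi-squared statistic \f$\mathbf{r}^\top (k\mathbf{R})^{-1} \mathbf{r}\f$ by \f$k\f$, loosening the threshold check. A quick sketch with purely illustrative values:

```python
import numpy as np

# Scaling the IMU noise covariance R by a factor k divides the
# Mahalanobis statistic r^T (k R)^{-1} r by k, so small vibration-induced
# residuals no longer trip the chi-squared gate.
r = np.array([0.3, -0.2, 0.1, 0.05, -0.04, 0.02])  # example residual (made up)
R = 0.01 * np.eye(6)                               # nominal IMU noise (made up)
stat_nominal = float(r @ np.linalg.solve(R, r))
stat_inflated = float(r @ np.linalg.solve(50.0 * R, r))  # R inflated 50x
```

Here `stat_inflated` is exactly `stat_nominal / 50`, which is why a 50-100x inflation can move vibration-level residuals back under the detection threshold.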