Frontiers of Computer Science

ISSN 2095-2228

ISSN 2095-2236 (Online)

CN 10-1014/TP

Postal Subscription Code 80-970

2018 Impact Factor: 1.129

Front. Comput. Sci., 2019, Vol. 13, Issue 6: 1326-1336    https://doi.org/10.1007/s11704-018-6600-8
RESEARCH ARTICLE
Fast and accurate visual odometry from a monocular camera
Xin YANG(), Tangli XUE, Hongcheng LUO, Jiabin GUO
School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan 430074, China
Abstract

This paper presents a semi-dense visual odometry system that is accurate, robust, and able to run in real time on mobile devices such as smartphones, AR glasses, and small drones. The key contributions of our system are: 1) a modified pyramidal Lucas-Kanade algorithm that incorporates spatial and depth constraints for fast and accurate camera pose estimation; 2) adaptive image resizing based on inertial sensors, which greatly accelerates tracking with little accuracy degradation; and 3) an ultrafast binary feature descriptor computed directly from the intensities of a resized and smoothed image patch around each pixel, which is sufficiently effective for relocalization. A quantitative evaluation on public datasets demonstrates that our system achieves better tracking accuracy and up to roughly 2.0X faster tracking speed compared with the state-of-the-art monocular SLAM system LSD-SLAM. For the relocalization task, our system is 2.0X∼4.6X faster than DBoW2 while achieving similar accuracy.
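The binary descriptor summarized above compares pixel intensities inside a resized, smoothed patch. The sketch below is a minimal illustration of that general idea only; the patch size, the 3x3 box smoothing, and the threshold-against-the-mean bit test are assumptions for illustration, not the authors' exact design.

```python
# Illustrative intensity-comparison binary descriptor (assumed design:
# 8x8 patch, 3x3 box smoothing, one bit per pixel vs. the patch mean).

def smooth(patch):
    """3x3 box filter with edge clamping; patch is a list of rows."""
    h, w = len(patch), len(patch[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += patch[yy][xx]
            out[y][x] = acc / 9.0
    return out

def binary_descriptor(patch):
    """Pack one bit per pixel: 1 if the smoothed intensity exceeds the patch mean."""
    s = smooth(patch)
    flat = [v for row in s for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    """Descriptor distance = number of differing bits."""
    return bin(a ^ b).count("1")
```

With descriptors of this form, candidate matching for relocalization reduces to Hamming-distance comparisons, which are cheap bitwise operations well suited to mobile CPUs.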

Keywords: visual odometry; mobile devices; direct tracking; relocalization; inertial sensing; binary feature
Corresponding Author(s): Xin YANG   
Just Accepted Date: 25 September 2017   Online First Date: 15 November 2018    Issue Date: 19 July 2019
 Cite this article:   
Xin YANG, Tangli XUE, Hongcheng LUO, et al. Fast and accurate visual odometry from a monocular camera[J]. Front. Comput. Sci., 2019, 13(6): 1326-1336.
 URL:  
https://academic.hep.com.cn/fcs/EN/10.1007/s11704-018-6600-8
https://academic.hep.com.cn/fcs/EN/Y2019/V13/I6/1326
1 D Gálvez-López, J D Tardos. Bags of binary words for fast place recognition in image sequences. IEEE Transactions on Robotics, 2012, 28(5): 1188–1197
https://doi.org/10.1109/TRO.2012.2197158
2 R Mur-Artal, J M M Montiel, J D Tardós. ORB-SLAM: a versatile and accurate monocular SLAM system. IEEE Transactions on Robotics, 2015, 31(5): 1147–1163
https://doi.org/10.1109/TRO.2015.2463671
3 G Klein, D Murray. Parallel tracking and mapping for small AR workspaces. In: Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. 2007, 225–234
https://doi.org/10.1109/ISMAR.2007.4538852
4 J Engel, T Schöps, D Cremers. LSD-SLAM: large-scale direct monocular SLAM. In: Proceedings of the European Conference on Computer Vision. 2014, 834–849
https://doi.org/10.1007/978-3-319-10605-2_54
5 J Engel, J Sturm, D Cremers. Semi-dense visual odometry for a monocular camera. In: Proceedings of the IEEE International Conference on Computer Vision. 2013, 1449–1456
https://doi.org/10.1109/ICCV.2013.183
6 R A Newcombe, S J Lovegrove, A J Davison. DTAM: dense tracking and mapping in real-time. In: Proceedings of the 2011 IEEE International Conference on Computer Vision. 2011, 2320–2327
https://doi.org/10.1109/ICCV.2011.6126513
7 S Gauglitz, C Sweeney, J Ventura, M Turk, T Höllerer. Live tracking and mapping from both general and rotation-only camera motion. In: Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality. 2012, 13–22
https://doi.org/10.1109/ISMAR.2012.6402532
8 S Gauglitz, C Sweeney, J Ventura, M Turk, T Höllerer. Model estimation and selection towards unconstrained real-time tracking and mapping. IEEE Transactions on Visualization and Computer Graphics, 2014, 20(6): 825–838
https://doi.org/10.1109/TVCG.2013.243
9 R Mur-Artal, J D Tardós. Probabilistic semi-dense mapping from highly accurate feature-based monocular SLAM. Robotics: Science and Systems, 2015
10 T Schöps, J Engel, D Cremers. Semi-dense visual odometry for AR on a smartphone. In: Proceedings of the IEEE International Symposium on Mixed and Augmented Reality. 2014, 145–150
https://doi.org/10.1109/ISMAR.2014.6948420
11 C Forster, M Pizzoli, D Scaramuzza. SVO: fast semi-direct monocular visual odometry. In: Proceedings of the IEEE International Conference on Robotics and Automation. 2014, 15–22
https://doi.org/10.1109/ICRA.2014.6906584
12 J Y Bouguet. Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm. Intel Corporation, 2001, 5(1–10): 4
13 E Rublee, V Rabaud, K Konolige, G Bradski. ORB: an efficient alternative to SIFT or SURF. In: Proceedings of the International Conference on Computer Vision. 2011, 2564–2571
14 J Sturm, N Engelhard, F Endres, W Burgard, D Cremers. A benchmark for the evaluation of RGB-D SLAM systems. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. 2012, 573–580
15 M Smith, I Baldwin, W Churchill, R Paul, P Newman. The new college vision and laser data set. The International Journal of Robotics Research, 2009, 28(5): 595–599
https://doi.org/10.1177/0278364909103911
16 J L Blanco, F A Moreno, J Gonzalez. A collection of outdoor robotic datasets with centimeter-accuracy ground truth. Autonomous Robots, 2009, 27(4): 327–351
https://doi.org/10.1007/s10514-009-9138-7