Table of Contents

[1] Liu Di, Chen Xiyuan. A calculation method for low dynamic vehicle velocity based on fusion of optical flow and feature point matching [J]. Journal of Southeast University (English Edition), 2017, 33(4): 426-431. DOI: 10.3969/j.issn.1003-7985.2017.04.006.

A calculation method for low dynamic vehicle velocity based on fusion of optical flow and feature point matching

Journal of Southeast University (English Edition) [ISSN: 1003-7985 / CN: 32-1325/N]

Volume:
33
Issue:
4
Page:
426-431
Research Field:
Automation
Publishing date:
2017-12-30

Info

Title:
A calculation method for low dynamic vehicle velocity based on fusion of optical flow and feature point matching
Author(s):
Liu Di, Chen Xiyuan
School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
Key Laboratory of Micro-Inertial Instrument and Advanced Navigation Technology of Ministry of Education, Southeast University, Nanjing 210096, China
Keywords:
velocity; optical flow; feature point matching; non-uniform light intensity distribution
CLC number:
TP242.6
DOI:
10.3969/j.issn.1003-7985.2017.04.006
Abstract:
Aiming at the low accuracy of low dynamic vehicle velocity calculation under non-uniform light intensity distribution, an improved adaptive Kalman filter method is proposed that estimates the velocity error by fusing optical flow tracking with scale-invariant feature transform (SIFT) matching. The algorithm introduces a nonlinear fuzzy membership function together with the filter residual to adaptively adjust the noise covariance matrix. In calculating the vehicle velocity, the inter-frame displacement is tracked and matched and the vehicle velocity is computed by the optical flow tracking and SIFT methods, respectively; meanwhile, the difference between the velocities output by the two methods is used as the observation of the improved adaptive Kalman filter. Finally, the velocity calculated by the optical flow method is corrected with the velocity error estimate output by the modified adaptive Kalman filter. Semi-physical experiments show that the maximum velocity error of the fusion algorithm is 29% lower than that of the optical flow method, and its computation time is 80% shorter than that of the SIFT method.
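
The following is a minimal Python sketch of the fusion scheme described above, intended only to make the data flow concrete. It is built from the abstract alone, not from the authors' implementation: the scalar random-walk model for the velocity error, the shape of the fuzzy membership function, and every numeric parameter (q, r0, the scaling gain) are illustrative assumptions, and v_of and v_sift stand for the per-frame velocities already computed by the optical flow and SIFT methods.

import numpy as np

def fuzzy_membership(residual, innov_cov, k=1.0):
    # Nonlinear membership in [0, 1): approaches 1 as the squared residual
    # grows relative to the predicted innovation covariance. The Gaussian-like
    # shape is an assumption; the abstract only states that a nonlinear fuzzy
    # membership function of the filter residual drives the adaptation.
    return 1.0 - np.exp(-k * residual ** 2 / innov_cov)

def fuse_velocities(v_of, v_sift, q=1e-4, r0=1e-2):
    # Estimate the optical-flow velocity error x with a scalar adaptive
    # Kalman filter and return the corrected velocity sequence.
    x, p, r = 0.0, 1.0, r0           # error estimate, its variance, meas. noise
    v_fused = np.empty_like(v_of, dtype=float)
    for i, (vo, vs) in enumerate(zip(v_of, v_sift)):
        p += q                       # predict: random-walk error model
        z = vo - vs                  # observation: velocity difference
        residual = z - x             # filter residual (innovation)
        mu = fuzzy_membership(residual, p + r)
        r = r0 * (1.0 + 10.0 * mu)   # inflate R on large residuals (the gain
                                     # 10.0 is an illustrative tuning constant)
        k_gain = p / (p + r)         # Kalman gain with the adapted R
        x += k_gain * residual
        p *= 1.0 - k_gain
        v_fused[i] = vo - x          # correct optical flow by the error estimate
    return v_fused

# Synthetic usage: optical flow drifts slowly (as under uneven lighting),
# SIFT is unbiased but noisier.
rng = np.random.default_rng(0)
t = np.arange(200)
v_true = 1.0 + 0.1 * np.sin(0.05 * t)
v_of = v_true + 0.002 * t                        # accumulating bias
v_sift = v_true + rng.normal(0.0, 0.05, t.size)
v_fused = fuse_velocities(v_of, v_sift)
print("max |error|  optical flow: %.3f   fused: %.3f"
      % (np.abs(v_of - v_true).max(), np.abs(v_fused - v_true).max()))

When the residual is small, the filter trusts the observation and tracks the optical-flow bias closely; when it is large, the membership function inflates the measurement noise covariance so that outlier SIFT matches are down-weighted, which is the behavior the abstract attributes to the adaptive adjustment.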


Memo

Memo:
Biographies: Liu Di (1987—), male, graduate; Chen Xiyuan (corresponding author), male, doctor, professor, chxiyuan@seu.edu.cn.
Foundation items: The National Natural Science Foundation of China (No. 51375087, 51405203), the Transformation Program of Science and Technology Achievements of Jiangsu Province (No. BA2016139).
Citation: Liu Di, Chen Xiyuan. A calculation method for low dynamic vehicle velocity based on fusion of optical flow and feature point matching[J]. Journal of Southeast University (English Edition), 2017, 33(4): 426-431. DOI: 10.3969/j.issn.1003-7985.2017.04.006.
Last Update: 2017-12-20