Real-Time Surround-View System for Mobile Robotic System
https://doi.org/10.17587/mau.20.162-170
Abstract
The paper addresses the problem of improving the operator's perception of the environment during remote control of a mobile robot. A variant of a real-time surround-view system for a mobile robot, based on multiple cameras with fisheye lenses and overlapping fields of view, is proposed. A prototype of the surround-view system has been developed. The paper considers the key features of the architecture and software of the surround-view system. Algorithms for determining the intrinsic parameters of the cameras and for distortion correction are investigated, and new models for describing the distortion of wide-angle and fisheye lenses are applied. Algorithms for finding the extrinsic parameters of the cameras, as well as homography matrices based on invariant descriptors, are implemented. Static homography matrices are used when stitching the images into a panorama. Various image-stitching techniques based on blending the overlapping image regions are investigated and implemented. Methods of projective geometry and augmented reality were studied to obtain a perspective third-person view. A projection surface for the surround-view panorama is proposed. The cross-platform game engine Unity was selected for the software implementation. Directions for further research are identified.
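The abstract's stitching pipeline rests on two operations: mapping image points through a precomputed static homography, and feather-blending the overlapping region of adjacent camera images. The sketch below illustrates both in minimal form with NumPy (the helper names are hypothetical; the authors' actual implementation is not shown in this summary, and real stitching operates on full 2-D images rather than the 1-D rows used here for brevity):

```python
import numpy as np

def apply_homography(H, pts):
    """Map an Nx2 array of points through a 3x3 homography H."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

def feather_blend(left, right, overlap):
    """Blend two 1-D intensity rows whose trailing/leading `overlap`
    pixels cover the same scene region, using a linear weight ramp."""
    w = np.linspace(1.0, 0.0, overlap)                # 1 -> 0 across overlap
    merged = w * left[-overlap:] + (1.0 - w) * right[:overlap]
    return np.concatenate([left[:-overlap], merged, right[overlap:]])

# A pure translation expressed as a homography: shifts x by 5 pixels.
H = np.array([[1.0, 0.0, 5.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
print(apply_homography(H, np.array([[10.0, 20.0]])))  # [[15. 20.]]
```

In a full pipeline the homographies would be estimated once (e.g., from matched invariant descriptors with RANSAC, as in refs. 19 and 24) and then reused as static matrices at runtime, which is what makes real-time operation feasible.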
About the Authors
V. V. Varlashin
Russian Federation
Varlashin Victor V. - Graduate Student of the Department of Mechatronics and Robotics (under the RTC), 2nd Class Engineer of the SPbPU.
St. Petersburg, 195251.
M. A. Ershova
Russian Federation
St. Petersburg, 195251.
V. A. Bunyakov
Russian Federation
St. Petersburg, 194064.
O. U. Shmakov
Russian Federation
St. Petersburg, 194064.
References
1. Kadous M. W., Sheh R. K.-M., Sammut C. Effective user interface design for rescue robotics, ACM Conference on Human-Robot Interaction, ACM Press, 2006, pp. 250—257.
2. Song Y., Shi Q., Huang Q., Fukuda T. Development of an omnidirectional vision system for environment perception, ROBIO, IEEE, 2014, pp. 695—700.
3. Shi Q., Li C., Wang C., Luo H., Huang Q., Fukuda T. Design and implementation of an omnidirectional vision system for robot perception, Mechatronics, 2017, vol. 41, pp. 58—66.
4. Sato T., Moro A., Sugahara A., Tasaki T., Yamashita A., Asama H. Spatio-temporal bird's-eye view images using multiple fisheye cameras, Proceedings of the 2013 IEEE/SICE International Symposium on System Integration, Kobe, 2013, pp. 753—758.
5. Lim S., Jun S., Jung I.-K. Wrap-around View Equipped on Mobile Robot, World Academy of Science, Engineering and Technology International Journal of Electronics and Communication Engineering, 2012, vol. 6, no. 3, pp. 354—356.
6. Jung I.-K., Kim H., Jung W. S., Jeon S. A Remote Robot Control System based on AVM using Personal Smart Device, Recent Advances in Circuits, Communications and Signal Processing, 2013, pp. 177—180.
7. Oficial’nyj sajt kompanii PointGrey [Official web site of PointGrey], available at: www.ptgrey.com (accessed 31.10.2017).
8. Oficial’nyj sajt kompanii GoPro [Official web site of Go-Pro], available at: go-pro.com (accessed 31.10.2017).
9. Yeh Y.-T., Peng C.-K., Chen K.-W., Chen Y.-S., Hung Y.-P. Driver Assistance System Providing an Intuitive Perspective View of Vehicle Surrounding, ACCV 2014 Workshops, Part II. Lecture Notes in Computer Science, Springer, 2014, vol. 9009, pp. 403—417.
10. Oficial’nyj sajt kompanii Nissan [Official web site of Nissan Motor Co], available at: www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/avm.html (accessed 31.10.2017).
11. Oficial’nyj sajt kompanii Fujitsu [Official web site of Fujitsu Limited], available at: www.fujitsu.com/us/products/devices/semi-conductor/gdc/products/omni.html (accessed 31.10.2017).
12. Gurrieri L. E., Dubois E. Acquisition of omnidirectional stereoscopic images and videos of dynamic scenes: a review, Journal of Electronic Imaging, 2013, vol. 22, no. 3, pp. 1—21.
13. Schreer O., Kauff P., Eisert P., Weissig C., Rosenthal J.-C. Geometrical Design Concept for Panoramic 3D Video Acquisition, Signal Processing Conference (EUSIPCO), 2012, pp. 2757—2761.
14. Aliakbarpour H., Tahri O., Araújo H. Visual servoing of mobile robots using non-central catadioptric cameras, Robotics and Autonomous Systems, 2014, vol. 62, no. 11, pp. 1613—1622.
15. Lin H.-Y., Wang M.-L. HOPIS: Hybrid Omnidirectional and Perspective Imaging System for Mobile Robots, Sensors, 2014, vol. 14, no. 9, pp. 16508—16531.
16. Lazarenko V. P., Dzhamijkov T. S., Korotaev V. V., Jaryshev S. N. Metod sozdaniâ sferičeskih panoram iz izobraženij, polučennyh vsenapravlennymi optiko-èlektronnymi sistemami [Method for creation of spherical panoramas from images obtained by omnidirectional optoelectronic systems], Nauchno-tehnicheskij vestnik informacionnyh tehnologij, mehaniki i optiki, 2016, vol. 16, no. 1, pp. 46—53 (in Russian).
17. Lazarenko V., Korotaev V., Yaryshev S. The algorithm for transformation of images from omnidirectional cameras, Proc. Latin America Optics and Photonics Conference (LAOP), Mexico, 2014.
18. Utsugi K., Moriya T. A Camera Revolver for Improved Image Stitching, Machine Vision Applications (MVA), 2002, pp. 261—264.
19. Brown M., Lowe D. G. Automatic Panoramic Image Stitching using Invariant Features, Int. J. Comput. Vision, 2007, vol. 74, no. 1, pp. 59—73.
20. Sakharkar V. S., Gupta S. R. Image Stitching Techniques: An Overview, International Journal of Computer Science and Applications, 2013, vol. 6, no. 2, pp. 324—330.
21. Samsung NX Lens Guide, available at: http://www.samsung.com/ru/pdf/NX_Lens_Guide_120606.pdf (accessed 28.11.2017) (in Russian).
22. Mei C., Rives P. Single View Point Omnidirectional Camera Calibration from Planar Grids, IEEE International Conference on Robotics and Automation (ICRA), 2007, pp. 3945—3950.
23. Scaramuzza D., Martinelli A., Siegwart R. A Toolbox for Easily Calibrating Omnidirectional Cameras, IEEE International Conference on Intelligent Robots and Systems (IROS), 2006, pp. 5695—5701.
24. Zuliani M. RANSAC for Dummies, Vision Research Lab, 2014, pp. 1—101, available at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.475.1243&rep=rep1&type=pdf (accessed 18.12.2018).
25. Szeliski R. Image Alignment and Stitching: A Tutorial, FNT in Computer Graphics and Vision, 2006, vol. 2, no. 1, pp. 1—104.
26. Kashyap V., Agrawal P., Akhbari F. Real-time, Quasi Immersive, High Definition Automotive 3D Surround View System, Int’l Conf. IP, Comp. Vision, and Pattern Recognition, 2017, pp. 10—16.
For citations:
Varlashin V.V., Ershova M.A., Bunyakov V.A., Shmakov O.U. Real-Time Surround-View System for Mobile Robotic System. Mekhatronika, Avtomatizatsiya, Upravlenie. 2019;20(3):162-170. (In Russ.) https://doi.org/10.17587/mau.20.162-170