A New Approach to Images’ Perspective Transformation in Robotics Telemetry
https://doi.org/10.17587/mau.22.644-649
Abstract
Remotely controlled robots make up the majority of today’s robot population. As a rule, such robots are used for inspection, patrolling, and mine disposal. The reason is that the current level of robot autonomy is rather low, and autonomous robots cannot yet guarantee reliable performance. The performance of remotely operated robots, however, depends largely on how efficiently the human operator perceives information. This paper studies the presentation of images to operators on robot control pendants, and more specifically the presentation of 3D scenes on flat displays. The goal of this study is to increase the operator’s reaction rate and to decrease the number of errors caused by image uncertainty or poor image quality. Our main attention is paid to the development of software that implements perspective transformation of images. Matrices for perspective transformation are studied, and the application of these matrices in remotely operated robots is discussed. The novelty of this research lies in new knowledge about perspective transformation aimed at better information perception by the human operator and, as an outcome, at increasing the efficiency of remotely operated robots. Related technologies in the field of telemetry and technical vision systems have been investigated, as well as work in medical fields, in particular the psychology of perception of images and space. A static software model has been developed. A video camera has been implemented with the introduction of perspective distortions to improve the reliability of transmitting the relevant areas of the image. Research on this technology is carried out jointly with the Institute of Biomedical Problems of the Russian Academy of Sciences. A special prototype for perspective transformation based on various scenarios is being developed.
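The perspective transformations the abstract refers to are conventionally expressed as a 3×3 homography matrix acting on homogeneous image coordinates. The sketch below is a minimal illustration of that idea, not the authors’ software: the matrix values and the function name `warp_point` are purely illustrative.

```python
# Apply a 3x3 perspective (homography) matrix H to a 2D point (x, y).
# In homogeneous coordinates: [x', y', w]^T = H * [x, y, 1]^T,
# and the transformed pixel is (x'/w, y'/w).

def warp_point(H, x, y):
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xp / w, yp / w

# Illustrative matrix: identity plus a small perspective term in the
# bottom row, so points farther to the right are scaled down.
H = [[1.0,   0.0, 0.0],
     [0.0,   1.0, 0.0],
     [0.001, 0.0, 1.0]]

print(warp_point(H, 100.0, 50.0))
```

Image-processing libraries apply the same per-pixel mapping over a whole frame (e.g. OpenCV’s `cv2.warpPerspective`, which takes such a 3×3 matrix); the division by `w` is what distinguishes a perspective transformation from an affine one.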
About the Authors
F. M. Belchenko
Russian Federation
Belchenko Fillip M., Programming Engineer
Moscow, 119526
I. L. Ermolov
Russian Federation
Moscow, 119526
For citations:
Belchenko F.M., Ermolov I.L. A New Approach to Images’ Perspective Transformation in Robotics Telemetry. Mekhatronika, Avtomatizatsiya, Upravlenie. 2021;22(12):644-649. (In Russ.) https://doi.org/10.17587/mau.22.644-649