
Mekhatronika, Avtomatizatsiya, Upravlenie


Method of Classification the Mobile Robot Workspace Based on the Analysis of a 3D Point Cloud

https://doi.org/10.17587/mau.23.31-36

Abstract

One of the main and most difficult tasks in the development of autonomous vehicle systems is the classification of the workspace of a mobile robot. Based on the classification results, a local map of the area is built, which is then used to plan the robot's trajectory. The article describes a method for classifying the working area of an autonomous mobile robot moving over rough terrain. The developed classification method is based on the analysis of a three-dimensional point cloud obtained by a scanning 3D laser rangefinder. The use of a scanning laser rangefinder makes it possible to classify the robot's motion zone at any time of day or year. A set of classification features is proposed, computed using the least squares method together with elements of probability theory and mathematical statistics. The robot workspace is classified into four classes: "Flat surface", "Small roughness", "Large roughness" and "Obstacle". Each class characterizes the degree of passability of the surface over which the robot moves. Classification results are stored as a local passability map. Each cell of this map holds a number that characterizes the passability of the region of the working area bounded by that cell. The developed classifier is integrated into the on-board control system of a wheeled mobile robot. The results of experimental studies confirming the efficiency and effectiveness of the proposed classification method are presented. The accuracy of class recognition of the mobile robot workspace has been determined. The developed classifier operates successfully in various conditions, including in winter and at dusk, but it has limitations when working under natural noise such as rain or snow. The average classification accuracy with minimal influence of natural noise is 92.3 %, and the execution time of each iteration does not exceed 0.085 s, which allows the developed classifier to be used as part of the on-board control systems of autonomous mobile robots.
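The abstract describes the pipeline only at a high level: least-squares features computed per region of the working area, four traversability classes, and a grid-shaped local passability map whose cells hold a class number. The sketch below illustrates that general idea, not the paper's actual implementation: it fits a plane to the points falling into each grid cell and assigns one of the four classes from the residual statistics. The thresholds, cell size, class encoding, and function names (classify_cell, build_passability_map) are assumptions made for the example and do not come from the article.

```python
# Illustrative sketch of per-cell terrain classification from a 3D point cloud.
# All thresholds, the cell size, and the class codes are assumed values.
import numpy as np

# Assumed encoding of the four classes written into the passability map.
FLAT, SMALL_ROUGHNESS, LARGE_ROUGHNESS, OBSTACLE = 0, 1, 2, 3

def classify_cell(points, small_thr=0.03, large_thr=0.10, obstacle_height=0.30):
    """Classify one map cell from its 3D points (N x 3 array, meters)."""
    if len(points) < 3:
        return FLAT  # too few returns to judge; assumed fallback
    # Least-squares plane z = a*x + b*y + c fitted to the cell's points.
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residuals = points[:, 2] - A @ coeffs
    sigma = residuals.std()                       # roughness of the cell
    height_span = residuals.max() - residuals.min()
    if height_span > obstacle_height:
        return OBSTACLE
    if sigma > large_thr:
        return LARGE_ROUGHNESS
    if sigma > small_thr:
        return SMALL_ROUGHNESS
    return FLAT

def build_passability_map(cloud, cell=0.5, size=20.0):
    """Bin a point cloud (N x 3) into a square grid and classify every cell."""
    n = int(size / cell)
    grid = np.full((n, n), FLAT, dtype=np.int8)
    ix = ((cloud[:, 0] + size / 2) / cell).astype(int)
    iy = ((cloud[:, 1] + size / 2) / cell).astype(int)
    valid = (ix >= 0) & (ix < n) & (iy >= 0) & (iy < n)
    for i, j in set(zip(ix[valid], iy[valid])):
        mask = valid & (ix == i) & (iy == j)
        grid[i, j] = classify_cell(cloud[mask])
    return grid
```

In an on-board system such a grid would typically be anchored to the current robot pose and refreshed with every scan, so that the per-cell passability numbers can feed directly into the trajectory planner mentioned in the abstract.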

About the Author

T. P. Ryzhova
Bauman Moscow State Technical University
Russian Federation

 Ph.D., Senior Researcher

Moscow, 105005



References

1. Unmanned car: past, present and future, available at: https://hub.forklog.com/bespilotnye-avtomobili-proshloe-nastoyashhee-i-budushhee/ (in Russian).

2. Sophrygin A. Unmanned car Yandex, bespilot: unmanned vehicles and technologies, news and catalog of companies, available at: https://bespilot.com/news/366-yandex-bespilot (in Russian).

3. "KAMAZ" began testing the unmanned car, kamaz.ru: Official site of KAMAZ, available at: https://kamaz.ru/press/news/kamaz_nachal_testirovanie_bespilotnika/ (in Russian).

4. Unmanned car StarLine, available at: https://smartcar.starline.ru (in Russian).

5. Haselich M., Arends M., Lang D., Paulus D. Terrain Classification with Markov Random Fields on fused Camera and 3D Laser Range Data, available at: http://aass.oru.se/Agora/ECMR2011/proceedings/papers/ECMR2011_0025.pdf.

6. Iagnemma K., Dubowsky S. Terrain estimation for high-speed rough-terrain autonomous vehicle navigation, available at: http://robots.mit.edu/people/Karl/SPIE_02.pdf.

7. Lee S. Y., Kwak D. M. A Terrain Classification Method for UGV Autonomous Navigation Based on SURF, available at: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6145981.

8. Nguyen D. V., Kuhnert L., Jiang T., Kuhnert K. D. A Novel Approach of Terrain Classification for Outdoor Automobile Navigation, available at: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5952752.

9. Laible S., Khan Y. N., Bohlmann K., Zell A. 3D LIDAR- and Camera-Based Terrain Classification Under Different Lighting Conditions, Proceedings of Conference: Autonomous Mobile Systems, September 2012, available at: https://www.researchgate.net/publication/231681273_3D_LIDAR-_and_Camera-Based_Terrain_Classification_Under_Different_Lighting_Conditions.

10. Himmelsbach M., Muller A., Luttel T., Wunsche H. J. LIDAR-based 3D Object Perception, Proceedings of 1st International Workshop on Cognition for Technical Systems, October 2008, Munich, Germany, available at: https://www.researchgate.net/publication/229018428_LIDAR-based_3D_object_perception.

11. Woods M., Guivant J., Katupitiya J. Terrain Classification using Depth Texture Features, Proceedings of Australasian Conference on Robotics and Automation (ACRA), December 2013, available at: https://www.researchgate.net/publication/273093294_Terrain_Classification_using_Depth_Texture_Features.

12. Kragh M., Jorgensen R. N., Pedersen H. Object Detection and Terrain Classification in Agricultural Fields Using 3D Lidar Data, Springer International Publishing Switzerland, 2015, pp. 188—197.

13. Reymann C., Lacroix S. Improving LiDAR Point Cloud Classification using Intensities and Multiple Echoes, Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015), September 2015, Hamburg, Germany, available at: https://www.researchgate.net/publication/308845174_Improving_LiDAR_point_cloud_classification_using_intensities_and_multiple_echoes.

14. Suger B., Steder B., Burgard W. Terrain-Adaptive Obstacle Detection, Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), October 9—14, 2016, Daejeon, Korea, available at: https://www.researchgate.net/publication/312241137_Terrain-adaptive_obstacle_detection.

15. Approximation of an empirically obtained surface by the least squares method, Korolevstvo Delphi. Virtualnyy klub programmistov, available at: http://www.delphikingdom.com/asp/viewitem.asp?catalogid=1368 (in Russian).

16. Laptev G. F. Elements of vector calculations, Moscow, Nauka, 1975, 336 p. (in Russian).



For citation:


Ryzhova T.P. Method of Classification the Mobile Robot Workspace Based on the Analysis of a 3D Point Cloud. Mekhatronika, Avtomatizatsiya, Upravlenie. 2022;23(1):31-36. (In Russ.) https://doi.org/10.17587/mau.23.31-36



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1684-6427 (Print)
ISSN 2619-1253 (Online)