<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.3 20210610//EN" "JATS-journalpublishing1-3.dtd">
<article article-type="research-article" dtd-version="1.3" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xml:lang="ru"><front><journal-meta><journal-id journal-id-type="publisher-id">novtexmech</journal-id><journal-title-group><journal-title xml:lang="ru">Мехатроника, автоматизация, управление</journal-title><trans-title-group xml:lang="en"><trans-title>Mekhatronika, Avtomatizatsiya, Upravlenie</trans-title></trans-title-group></journal-title-group><issn pub-type="ppub">1684-6427</issn><issn pub-type="epub">2619-1253</issn><publisher><publisher-name>Commercial Publisher «New Technologies»</publisher-name></publisher></journal-meta><article-meta><article-id pub-id-type="doi">10.17587/mau.20.162-170</article-id><article-id custom-type="elpub" pub-id-type="custom">novtexmech-595</article-id><article-categories><subj-group subj-group-type="heading"><subject>Research Article</subject></subj-group><subj-group subj-group-type="section-heading" xml:lang="ru"><subject>РОБОТЫ, МЕХАТРОНИКА И РОБОТОТЕХНИЧЕСКИЕ СИСТЕМЫ</subject></subj-group><subj-group subj-group-type="section-heading" xml:lang="en"><subject>ROBOT, MECHATRONICS AND ROBOTIC SYSTEMS</subject></subj-group></article-categories><title-group><article-title>Система кругового обзора реального времени для мобильных робототехнических комплексов</article-title><trans-title-group xml:lang="en"><trans-title>Real-Time Surround-View System for Mobile Robotic System</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author" corresp="yes"><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Варлашин</surname><given-names>В. В.</given-names></name><name name-style="western" xml:lang="en"><surname>Varlashin</surname><given-names>V. V.</given-names></name></name-alternatives><bio xml:lang="ru"><p>Аспирант, инженер 2-й категории.</p></bio><bio xml:lang="en"><p>Varlashin Victor V. 
- Graduate Student of the Department of Mechatronics and Robotics (under the RTC), 2nd Class Engineer of the SPbPU.</p><p>St. Petersburg, 195251.</p></bio><email xlink:type="simple">v.varlashin@rtc.ru</email><xref ref-type="aff" rid="aff-1"/></contrib><contrib contrib-type="author" corresp="yes"><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Ершова</surname><given-names>М. А.</given-names></name><name name-style="western" xml:lang="en"><surname>Ershova</surname><given-names>M. A.</given-names></name></name-alternatives><bio xml:lang="ru"><p>Научный сотрудник.</p></bio><bio xml:lang="en"><p>St. Petersburg, 195251.</p></bio><email xlink:type="simple">m.ershova@rtc.ru</email><xref ref-type="aff" rid="aff-1"/></contrib><contrib contrib-type="author" corresp="yes"><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Буняков</surname><given-names>В. А.</given-names></name><name name-style="western" xml:lang="en"><surname>Bunyakov</surname><given-names>V. A.</given-names></name></name-alternatives><bio xml:lang="ru"><p>Начальник лаборатории.</p></bio><bio xml:lang="en"><p>St. Petersburg, 194064.</p></bio><email xlink:type="simple">bunyakov@rtc.ru</email><xref ref-type="aff" rid="aff-2"/></contrib><contrib contrib-type="author" corresp="yes"><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Шмаков</surname><given-names>О. А.</given-names></name><name name-style="western" xml:lang="en"><surname>Shmakov</surname><given-names>O. A.</given-names></name></name-alternatives><bio xml:lang="ru"><p>Начальник отдела.</p></bio><bio xml:lang="en"><p>St. 
Petersburg, 194064.</p></bio><email xlink:type="simple">shmakov@rtc.ru</email><xref ref-type="aff" rid="aff-2"/></contrib></contrib-group><aff-alternatives id="aff-1"><aff xml:lang="ru">Санкт-Петербургский политехнический университет Петра Великого<country>Россия</country></aff><aff xml:lang="en">SPbPU<country>Russian Federation</country></aff></aff-alternatives><aff-alternatives id="aff-2"><aff xml:lang="ru">Центральный научно-исследовательский и опытно-конструкторский институт робототехники и технической кибернетики (ЦНИИ РТК)<country>Россия</country></aff><aff xml:lang="en">RTC<country>Russian Federation</country></aff></aff-alternatives><pub-date pub-type="collection"><year>2019</year></pub-date><pub-date pub-type="epub"><day>06</day><month>03</month><year>2019</year></pub-date><volume>20</volume><issue>3</issue><fpage>162</fpage><lpage>170</lpage><permissions><copyright-statement>Copyright &#x00A9; Commercial Publisher «New Technologies», 2019</copyright-statement><copyright-year>2019</copyright-year><copyright-holder xml:lang="ru">Commercial Publisher «New Technologies»</copyright-holder><copyright-holder xml:lang="en">Commercial Publisher «New Technologies»</copyright-holder><license xlink:href="https://mech.novtex.ru/jour/about/submissions#copyrightNotice" xlink:type="simple"><license-p>https://mech.novtex.ru/jour/about/submissions#copyrightNotice</license-p></license></permissions><self-uri xlink:href="https://mech.novtex.ru/jour/article/view/595">https://mech.novtex.ru/jour/article/view/595</self-uri><abstract><p>Обсуждается проблема повышения адекватности восприятия окружающей среды оператором мобильного робота при удаленном управлении. Предложен вариант системы кругового обзора реального времени для мобильных робототехнических комплексов на базе системы телекамер с перекрывающимися полями зрения с использованием fisheye-объективов (объективы типа "рыбий глаз"). Разработан макет модуля кругового обзора. 
Определены особенности архитектуры и программного обеспечения системы кругового обзора. Исследованы алгоритмы определения внутренних параметров телекамер и устранения дисторсий с использованием новых моделей описания искажений широкоугольных и сверхширокоугольных объективов. Реализованы алгоритмы нахождения внешних параметров телекамер, а также матриц гомографии с использованием инвариантных дескрипторов, использованы статические матрицы гомографии при склейке изображений в панораму. Исследованы и реализованы алгоритмы смешивания граничных областей склеиваемых кадров на базе методов типа "blending". Исследованы методы проективной геометрии и дополненной реальности для получения перспективного вида "от третьего лица". Предложен вариант поверхности для проецирования панорамы кругового обзора. Для реализации программного обеспечения выбран кроссплатформенный игровой движок "Unity". Определены направления дальнейших исследований.</p></abstract><trans-abstract xml:lang="en"><p>The paper is devoted to the problem of increasing the adequacy of perception of the environment by the mobile robot operator under remote control. A variant of a real-time surround-view system for mobile robots based on multiple cameras with fisheye lenses and overlapping fields of view is proposed. A prototype of the surround-view module has been developed. In this paper, the key features of the architecture and software of the surround-view system are considered. The algorithms for determining the internal parameters of cameras and for distortion correction are investigated. New models for describing the distortions of wide-angle and fisheye lenses are used. Algorithms for finding the external parameters of cameras, as well as homography matrices using invariant descriptors, are implemented. Static homography matrices are used when stitching images into a panorama. Various image-stitching techniques based on blending of the overlapping image regions are investigated and implemented. 
The methods of projective geometry and augmented reality were studied to obtain a perspective third-person view. A surface variant for projecting the surround-view panorama is proposed. The cross-platform game engine "Unity" was selected for the software implementation. Directions for further research are identified.</p></trans-abstract><kwd-group xml:lang="ru"><kwd>компьютерное зрение</kwd><kwd>система кругового обзора мобильного робота</kwd><kwd>проекция изображений</kwd></kwd-group><kwd-group xml:lang="en"><kwd>computer vision</kwd><kwd>surround-view system for mobile robot</kwd><kwd>image projection</kwd></kwd-group><funding-group xml:lang="ru"><funding-statement>Министерство науки и высшего образования РФ</funding-statement></funding-group><funding-group xml:lang="en"><funding-statement>The Ministry of Science and Higher Education of the Russian Federation</funding-statement></funding-group></article-meta></front><back><ref-list><title>References</title><ref id="cit1"><label>1</label><citation-alternatives><mixed-citation xml:lang="ru">Kadous M. W., Sheh R. K.-M., Sammut C. Effective user interface design for rescue robotics // ACM Conference on Human-Robot Interaction. ACM Press, 2006. P. 250—257.</mixed-citation><mixed-citation xml:lang="en">Kadous M. W., Sheh R. K.-M., Sammut C. Effective user interface design for rescue robotics, ACM Conference on Human-Robot Interaction, ACM Press, 2006, pp. 250—257.</mixed-citation></citation-alternatives></ref><ref id="cit2"><label>2</label><citation-alternatives><mixed-citation xml:lang="ru">Song Y., Shi Q., Huang Q., Fukuda T. Development of an omnidirectional vision system for environment perception // ROBIO, IEEE. 2014. P. 695—700.</mixed-citation><mixed-citation xml:lang="en">Song Y., Shi Q., Huang Q., Fukuda T. Development of an omnidirectional vision system for environment perception, ROBIO, IEEE, 2014, pp. 
695—700.</mixed-citation></citation-alternatives></ref><ref id="cit3"><label>3</label><citation-alternatives><mixed-citation xml:lang="ru">Shi Q., Li C., Wang C., Luo H., Huang Q., Fukuda T. Design and implementation of an omnidirectional vision system for robot perception // Mechatronics. 2017. Vol. 41. P. 58—66.</mixed-citation><mixed-citation xml:lang="en">Shi Q., Li C., Wang C., Luo H., Huang Q., Fukuda T. Design and implementation of an omnidirectional vision system for robot perception, Mechatronics, 2017, vol. 41, pp. 58—66.</mixed-citation></citation-alternatives></ref><ref id="cit4"><label>4</label><citation-alternatives><mixed-citation xml:lang="ru">Sato T., Moro A., Sugahara A., Tasaki T., Yamashita A., Asama H. Spatio-temporal bird’s-eye view images using multiple fisheye cameras // Proceedings of the 2013 IEEE/SICE International Symposium on System Integration, Kobe, 2013. P. 753—758.</mixed-citation><mixed-citation xml:lang="en">Sato T., Moro A., Sugahara A., Tasaki T., Yamashita A., Asama H. Spatio-temporal bird’s-eye view images using multiple fisheye cameras, Proceedings of the 2013 IEEE/SICE International Symposium on System Integration, Kobe, 2013, pp. 753—758.</mixed-citation></citation-alternatives></ref><ref id="cit5"><label>5</label><citation-alternatives><mixed-citation xml:lang="ru">Lim S., Jun S., Jung I.-K. Wraparound View Equipped on Mobile Robot // World Academy of Science, Engineering and Technology International Journal of Electronics and Communication Engineering. 2012. Vol. 6, N. 3. P. 354—356.</mixed-citation><mixed-citation xml:lang="en">Lim S., Jun S., Jung I.-K. Wraparound View Equipped on Mobile Robot, World Academy of Science, Engineering and Technology International Journal of Electronics and Communication Engineering, 2012, vol. 6, no. 3, pp. 354—356.</mixed-citation></citation-alternatives></ref><ref id="cit6"><label>6</label><citation-alternatives><mixed-citation xml:lang="ru">Jung I.-K., Kim H., Jung W. S., Jeon S. 
A Remote Robot Control System based on AVM using Personal Smart Device // Recent Advances in Circuits, Communications and Signal Processing. 2013. P. 177—180.</mixed-citation><mixed-citation xml:lang="en">Jung I.-K., Kim H., Jung W. S., Jeon S. A Remote Robot Control System based on AVM using Personal Smart Device, Recent Advances in Circuits, Communications and Signal Processing, 2013, pp. 177—180.</mixed-citation></citation-alternatives></ref><ref id="cit7"><label>7</label><citation-alternatives><mixed-citation xml:lang="ru">Официальный сайт компании PointGrey. URL: www.ptgrey.com (дата обращения: 31.10.2017).</mixed-citation><mixed-citation xml:lang="en">Oficial’nyj sajt kompanii PointGrey [Official web site of PointGrey], available at: www.ptgrey.com (accessed 31.10.2017).</mixed-citation></citation-alternatives></ref><ref id="cit8"><label>8</label><citation-alternatives><mixed-citation xml:lang="ru">Официальный сайт компании GoPro. URL: gopro.com (дата обращения: 31.10.2017).</mixed-citation><mixed-citation xml:lang="en">Oficial’nyj sajt kompanii GoPro [Official web site of GoPro], available at: gopro.com (accessed 31.10.2017).</mixed-citation></citation-alternatives></ref><ref id="cit9"><label>9</label><citation-alternatives><mixed-citation xml:lang="ru">Yeh Y.-T., Peng C.-K., Chen K.-W., Chen Y.-S., Hung Y.-P. Driver Assistance System Providing an Intuitive Perspective View of Vehicle Surrounding // ACCV 2014 Workshops, Part II. Lecture Notes in Computer Science. Springer, 2014. Vol. 9009. P. 403—417.</mixed-citation><mixed-citation xml:lang="en">Yeh Y.-T., Peng C.-K., Chen K.-W., Chen Y.-S., Hung Y.-P. Driver Assistance System Providing an Intuitive Perspective View of Vehicle Surrounding, ACCV 2014 Workshops, Part II. Lecture Notes in Computer Science, Springer, 2014, vol. 9009, pp. 
403—417.</mixed-citation></citation-alternatives></ref><ref id="cit10"><label>10</label><citation-alternatives><mixed-citation xml:lang="ru">Официальный сайт компании Nissan. URL: www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/avm.html (дата обращения: 31.10.2017).</mixed-citation><mixed-citation xml:lang="en">Oficial’nyj sajt kompanii Nissan [Official web site of Nissan Motor Co], available at: www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/avm.html (accessed 31.10.2017).</mixed-citation></citation-alternatives></ref><ref id="cit11"><label>11</label><citation-alternatives><mixed-citation xml:lang="ru">Официальный сайт компании Fujitsu. URL: www.fujitsu.com/us/products/devices/semiconductor/gdc/products/omni.html (дата обращения: 31.10.2017).</mixed-citation><mixed-citation xml:lang="en">Oficial’nyj sajt kompanii Fujitsu [Official web site of Fujitsu Limited], available at: www.fujitsu.com/us/products/devices/semiconductor/gdc/products/omni.html (accessed 31.10.2017).</mixed-citation></citation-alternatives></ref><ref id="cit12"><label>12</label><citation-alternatives><mixed-citation xml:lang="ru">Gurrieri L. E., Dubois E. Acquisition of omnidirectional stereoscopic images and videos of dynamic scenes: a review // Journal of Electronic Imaging. 2013. Vol. 22, N. 3. P. 1—21.</mixed-citation><mixed-citation xml:lang="en">Gurrieri L. E., Dubois E. Acquisition of omnidirectional stereoscopic images and videos of dynamic scenes: a review, Journal of Electronic Imaging, 2013, vol. 22, no. 3, pp. 1—21.</mixed-citation></citation-alternatives></ref><ref id="cit13"><label>13</label><citation-alternatives><mixed-citation xml:lang="ru">Schreer O., Kauff P., Eisert P., Weissig C., Rosenthal J.-C. Geometrical Design Concept for Panoramic 3D Video Acquisition // Signal Processing Conference (EUSIPCO). 2012. P. 2757—2761.</mixed-citation><mixed-citation xml:lang="en">Schreer O., Kauff P., Eisert P., Weissig C., Rosenthal J.-C. 
Geometrical Design Concept for Panoramic 3D Video Acquisition, Signal Processing Conference (EUSIPCO), 2012, pp. 2757—2761.</mixed-citation></citation-alternatives></ref><ref id="cit14"><label>14</label><citation-alternatives><mixed-citation xml:lang="ru">Aliakbarpour H., Tahri O., Araújo H. Visual servoing of mobile robots using non-central catadioptric cameras // Robotics and Autonomous Systems. 2014. Vol. 62, N. 11. P. 1613—1622.</mixed-citation><mixed-citation xml:lang="en">Aliakbarpour H., Tahri O., Araújo H. Visual servoing of mobile robots using non-central catadioptric cameras, Robotics and Autonomous Systems, 2014, vol. 62, no. 11, pp. 1613—1622.</mixed-citation></citation-alternatives></ref><ref id="cit15"><label>15</label><citation-alternatives><mixed-citation xml:lang="ru">Lin H.-Y., Wang M.-L. HOPIS: Hybrid Omnidirectional and Perspective Imaging System for Mobile Robots // Sensors. 2014. Vol. 14, N. 9. P. 16508—16531.</mixed-citation><mixed-citation xml:lang="en">Lin H.-Y., Wang M.-L. HOPIS: Hybrid Omnidirectional and Perspective Imaging System for Mobile Robots, Sensors, 2014, vol. 14, no. 9, pp. 16508—16531.</mixed-citation></citation-alternatives></ref><ref id="cit16"><label>16</label><citation-alternatives><mixed-citation xml:lang="ru">Лазаренко В. П., Джамийков Т. С., Коротаев В. В., Ярышев С. Н. Метод создания сферических панорам из изображений, полученных всенаправленными оптико-электронными системами // Научно-технический вестник информационных технологий, механики и оптики. 2016. Т. 16, № 1. С. 46—53.</mixed-citation><mixed-citation xml:lang="en">Lazarenko V. P., Dzhamijkov T. S., Korotaev V. V., Jaryshev S. N. Metod sozdaniâ sferičeskih panoram iz izobraženij, polučennyh vsenapravlennymi optiko-èlektronnymi sistemami [Method for creation of spherical panoramas from images obtained by omnidirectional optoelectronic systems], Nauchno-tehnicheskij vestnik informacionnyh tehnologij, mehaniki i optiki, 2016, vol. 16, no. 1, pp. 
46—53 (in Russian).</mixed-citation></citation-alternatives></ref><ref id="cit17"><label>17</label><citation-alternatives><mixed-citation xml:lang="ru">Lazarenko V., Korotaev V., Yaryshev S. The algorithm for transformation of images from omnidirectional cameras // Proc. Latin America Optics and Photonics Conference (LAOP), Mexico. 2014.</mixed-citation><mixed-citation xml:lang="en">Lazarenko V., Korotaev V., Yaryshev S. The algorithm for transformation of images from omnidirectional cameras, Proc. Latin America Optics and Photonics Conference (LAOP), Mexico, 2014.</mixed-citation></citation-alternatives></ref><ref id="cit18"><label>18</label><citation-alternatives><mixed-citation xml:lang="ru">Utsugi K., Moriya T. A Camera Revolver for Improved Image Stitching // Machine Vision Applications (MVA). 2002. P. 261—264.</mixed-citation><mixed-citation xml:lang="en">Utsugi K., Moriya T. A Camera Revolver for Improved Image Stitching, Machine Vision Applications (MVA), 2002, pp. 261—264.</mixed-citation></citation-alternatives></ref><ref id="cit19"><label>19</label><citation-alternatives><mixed-citation xml:lang="ru">Brown M., Lowe D. G. Automatic Panoramic Image Stitching using Invariant Features // Int. J. Comput. Vision. 2007. Vol. 74, N. 1. P. 59—73.</mixed-citation><mixed-citation xml:lang="en">Brown M., Lowe D. G. Automatic Panoramic Image Stitching using Invariant Features, Int. J. Comput. Vision, 2007, vol. 74, no. 1, pp. 59—73.</mixed-citation></citation-alternatives></ref><ref id="cit20"><label>20</label><citation-alternatives><mixed-citation xml:lang="ru">Sakharkar V. S., Gupta S. R. Image stitching Techniques - an overview // International Journal of Computer Science and Applications. 2013. Vol. 6, N. 2. P. 324—330.</mixed-citation><mixed-citation xml:lang="en">Sakharkar V. S., Gupta S. R. Image stitching Techniques - an overview, International Journal of Computer Science and Applications, 2013, vol. 6, no. 2, pp. 
324—330.</mixed-citation></citation-alternatives></ref><ref id="cit21"><label>21</label><citation-alternatives><mixed-citation xml:lang="ru">Samsung NX Series. Руководство по продуктам. URL: http://www.samsung.com/ru/pdf/NX_Lens_Guide_120606.pdf (дата обращения: 28.11.2017).</mixed-citation><mixed-citation xml:lang="en">Samsung NX Lens Guide, available at: http://www.samsung.com/ru/pdf/NX_Lens_Guide_120606.pdf (accessed 28.11.2017) (in Russian).</mixed-citation></citation-alternatives></ref><ref id="cit22"><label>22</label><citation-alternatives><mixed-citation xml:lang="ru">Mei C., Rives P. Single View Point Omnidirectional Camera Calibration from Planar Grids // IEEE International Conference on Robotics and Automation (ICRA). 2007. P. 3945—3950.</mixed-citation><mixed-citation xml:lang="en">Mei C., Rives P. Single View Point Omnidirectional Camera Calibration from Planar Grids, IEEE International Conference on Robotics and Automation (ICRA), 2007, pp. 3945—3950.</mixed-citation></citation-alternatives></ref><ref id="cit23"><label>23</label><citation-alternatives><mixed-citation xml:lang="ru">Scaramuzza D., Martinelli A., Siegwart R. A Toolbox for Easily Calibrating Omnidirectional Cameras // IEEE International Conference on Intelligent Robots and Systems (IROS). 2006. P. 5695—5701.</mixed-citation><mixed-citation xml:lang="en">Scaramuzza D., Martinelli A., Siegwart R. A Toolbox for Easily Calibrating Omnidirectional Cameras, IEEE International Conference on Intelligent Robots and Systems (IROS), 2006, pp. 5695—5701.</mixed-citation></citation-alternatives></ref><ref id="cit24"><label>24</label><citation-alternatives><mixed-citation xml:lang="ru">Zuliani M. RANSAC for Dummies // Vision Research Lab. 2014. P. 1—101. URL: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.475.1243&amp;rep=rep1&amp;type=pdf (дата обращения: 18.12.2018).</mixed-citation><mixed-citation xml:lang="en">Zuliani M. RANSAC for Dummies, Vision Research Lab, 2014, pp. 1—101. 
URL: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.475.1243&amp;rep=rep1&amp;type=pdf (accessed 18.12.2018).</mixed-citation></citation-alternatives></ref><ref id="cit25"><label>25</label><citation-alternatives><mixed-citation xml:lang="ru">Szeliski R. Image Alignment and Stitching: A Tutorial // FNT in Computer Graphics and Vision. 2006. Vol. 2, N. 1. P. 1—104.</mixed-citation><mixed-citation xml:lang="en">Szeliski R. Image Alignment and Stitching: A Tutorial, FNT in Computer Graphics and Vision, 2006, vol. 2, no. 1, pp. 1—104.</mixed-citation></citation-alternatives></ref><ref id="cit26"><label>26</label><citation-alternatives><mixed-citation xml:lang="ru">Kashyap V., Agrawal P., Akhbari F. Real-time, Quasi Immersive, High Definition Automotive 3D Surround View System // Int’l Conf. IP, Comp. Vision, and Pattern Recognition. 2017. P. 10—16.</mixed-citation><mixed-citation xml:lang="en">Kashyap V., Agrawal P., Akhbari F. Real-time, Quasi Immersive, High Definition Automotive 3D Surround View System, Int’l Conf. IP, Comp. Vision, and Pattern Recognition, 2017, pp. 10—16.</mixed-citation></citation-alternatives></ref></ref-list><fn-group><fn fn-type="conflict"><p>The authors declare that there are no conflicts of interest present.</p></fn></fn-group></back></article>
