{"id":2581,"date":"2020-01-15T16:12:37","date_gmt":"2020-01-15T15:12:37","guid":{"rendered":"https:\/\/mecatron.rma.ac.be\/?page_id=2581"},"modified":"2025-11-10T10:10:11","modified_gmt":"2025-11-10T09:10:11","slug":"ground-robotics","status":"publish","type":"page","link":"https:\/\/mecatron.rma.ac.be\/index.php\/research\/ground-robotics\/","title":{"rendered":"Ground Robotics"},"content":{"rendered":"<p><section class=\"kc-elm kc-css-391455 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-231718 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-42449\" style=\"height: 20px; clear: both; width:100%;\"><\/div>\n\t<div class=\"kc-elm kc-css-613837 kc-animate-speed-3s kc_shortcode kc_video_play kc_video_wrapper\" data-video=\"https:\/\/youtu.be\/rUHLHKt0G5Q\" data-width=\"600\" data-height=\"338.98305084746\" data-fullwidth=\"yes\" data-autoplay=\"yes\" data-loop=\"yes\" data-control=\"yes\" data-related=\"\" data-showinfo=\"yes\" data-kc-video-mute=\"yes\">\n\t\t\t<\/div>\n\n<div class=\"kc-elm kc-css-256660\" style=\"height: 20px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-790852 kc_text_block\"><\/p>\n<p>Our research on ground robotics focuses on mobility and assessing 3D traversability of rough outdoor terrain and on heterogeneous team operations:<\/p>\n<p>\n<\/div><div class=\"kc-elm kc-css-516232\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-852199 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-152918 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-70544\" style=\"height: 55px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-942655 kc_shortcode kc_single_image\">\n\n        <img decoding=\"async\" 
src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/01\/Unmanned-Ground-System.jpg\" class=\"\" alt=\"\" \/>    <\/div>\n<\/div><\/div><div class=\"kc-elm kc-css-449635 kc_col-sm-8 kc_column kc_col-sm-8\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-916809 kc-title-wrap \">\n\n\t<h5 class=\"kc_title\">Optimal design and system integration of Unmanned Ground Vehicles for tough applications<\/h5>\n<\/div>\n<div class=\"kc-elm kc-css-546004 kc_text_block\"><\/p>\n<p>The unstructured outside world imposes many constraints on unmanned ground systems that want to navigate through this environment. RMA focuses specifically on the system integration task to develop novel ground robot systems that are capable to cope with the requirements.<\/p>\n<p>RMA focuses on applications where the robotic systems are stressed under tough conditions. Examples are humanitarian demining, search and rescue and persistent outdoor environmental surveillance.<\/p>\n<ul>\n<\/ul>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-798823 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-120354 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-445681\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-858258 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-903089 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-919970\" style=\"height: 55px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-832761 kc_shortcode kc_single_image\">\n\n        <img decoding=\"async\" src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/01\/mapping.png\" class=\"\" alt=\"\" \/>    <\/div>\n<\/div><\/div><div class=\"kc-elm kc-css-138820 kc_col-sm-8 
kc_column kc_col-sm-8\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-760550 kc-title-wrap \">\n\n\t<h5 class=\"kc_title\">Fast 3D mapping tools, combining data from aerial and ground-based assets <\/h5>\n<\/div>\n<div class=\"kc-elm kc-css-53013 kc_text_block\"><\/p>\n<p>As more and more robotic tools get deployed, the key issue becomes how to obtain a common understanding of the environment by using heterogeneous robots.<\/p>\n<p>Therefore, RMA studies the combination of multiple heterogeneous 3D datasets acquired by different multi-robot sensor systems operating in various large unstructured outdoor environments. This problem is very complex, especially when the system deals with a-priori unknown large-scale outdoor environments, facing problems of displacement, orientation and scale difference between the 3D data sets.<\/p>\n<p>In order to overcome the limitations of dealing with 3D data sets coming from different sensor systems (lasers, cameras) and different perspectives of the environment (ground \u00a0and air), we develop a semi-automated and robust 3D registration process, that allows us to consistently align two or more heterogeneous point clouds.<\/p>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-667072 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-958707 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-205090\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-860246 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-204834 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-801505\" style=\"height: 55px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-951964 kc_shortcode kc_single_image\">\n\n      
  <img decoding=\"async\" src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/01\/Collaborative-Mapping-Robots.jpg\" class=\"\" alt=\"\" \/>    <\/div>\n<\/div><\/div><div class=\"kc-elm kc-css-739170 kc_col-sm-8 kc_column kc_col-sm-8\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-518740 kc-title-wrap \">\n\n\t<h5 class=\"kc_title\">Multi-Agent Collaborative Mapping<\/h5>\n<\/div>\n<div class=\"kc-elm kc-css-716935 kc_text_block\"><\/p>\n<p>When multiple robotic agents are active in the same environment, they need to build up a common understanding or representation of this environment. This is commonly referred to as collaborative mapping. RMA works on the development of heterogeneous robotic ground systems for the collaborative mapping using multiple sensing modalities (LIDAR, 3D vision, &#8230;).\u00a0<\/p>\n<p>RMA works on collaborative mapping robots for application domains such as perimeter and area surveilance and also for supporting CBRN operations.\u00a0<\/p>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-623478 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-480738 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-813364\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-876481 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-431492 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-550904\" style=\"height: 55px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-520023 kc_shortcode kc_single_image\">\n\n        <img decoding=\"async\" src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/01\/Collaborative-Ground-Robots.jpg\" class=\"\" alt=\"\" \/>    <\/div>\n<\/div><\/div><div 
class=\"kc-elm kc-css-278372 kc_col-sm-8 kc_column kc_col-sm-8\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-617242 kc-title-wrap \">\n\n\t<h5 class=\"kc_title\">Heterogeneous collaboration between multiple autonomous and semi-autonomous unmanned agents<\/h5>\n<\/div>\n<div class=\"kc-elm kc-css-16587 kc_text_block\"><\/p>\n<p>RMA works on the interoperability between heterogeneous robotic systems. Interoperability is a concept that spans multiple domains: from networking, protocols and messaging over software architectures to shared situational awareness to multi-agent coordination strategies. RMA is active in all these domains with research partners from across the EU (e.g. in the <a href=\"https:\/\/mecatron.rma.ac.be\/index.php\/projects\/icarus\/\">ICARUS<\/a> project) and NATO countries (e.g. in the context of the NATO-IST-149 RTG) working group, in order to ensure interoperability between ground robotic systems of different nations.<\/p>\n<p><span style=\"font-style: inherit;\">The research of RMA focuses on ensuring the interoperability between the very different (heterogeneous) ground robotics tools and focuses on the following reseach questions:<\/span><\/p>\n<ul>\n<li>What are the most optimal strategies for collaboration?<\/li>\n<li>How can multi-agent systems share their data most efficiently?<\/li>\n<li>How can unmanned ground systems and humans on the field collaborate most effectively?<\/li>\n<\/ul>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-418742 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-506182 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-49619\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-92625 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div 
class=\"kc-elm kc-css-480120 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-874074\" style=\"height: 55px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-984734 kc_shortcode kc_single_image\">\n\n        <img decoding=\"async\" src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/01\/Terrain-Traversability-Analysis.jpg\" class=\"\" alt=\"\" \/>    <\/div>\n<\/div><\/div><div class=\"kc-elm kc-css-866818 kc_col-sm-8 kc_column kc_col-sm-8\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-776573 kc-title-wrap \">\n\n\t<h5 class=\"kc_title\">Terrain traversability analysis and mobility on rough terrain<\/h5>\n<\/div>\n<div class=\"kc-elm kc-css-743461 kc_text_block\"><\/p>\n<p>When navigating on rough outdoor terrain, one of the key concerns for an autonomous robot is to assess whether the terrain is traversable or not. This is a difficult research question, as the traversability depends on multiple factors, like the soil conditions, vegetation, robot mobility characteristics, robot dynamics, weather conditions, etc.\u00a0<\/p>\n<p>In order to make (semi-)autonomous ground robots capable of working also in difficult outdoor conditions, RMA works on the development of algorithms for the automatic assessment of the terrain traversability, mostly using camera systems (monocular, binocular, trinocular and time-of-flight cameras). 
The two major challenges are the identification of a model to link the traversability parameters to a correct traversability state and the assessment and processing of all these parameters in real-time.<\/p>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-999294 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-888865 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-116927\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-221168 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-850291 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-860705\" style=\"height: 55px; clear: both; width:100%;\"><\/div><div class=\"kc-elm kc-css-172219 kc_shortcode kc_single_image\">\n\n        <img decoding=\"async\" src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/01\/3D-Reconstrcution.jpg\" class=\"\" alt=\"\" \/>    <\/div>\n<\/div><\/div><div class=\"kc-elm kc-css-934993 kc_col-sm-8 kc_column kc_col-sm-8\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-174747 kc-title-wrap \">\n\n\t<h5 class=\"kc_title\">3D Reconstruction as an aid for robot navigation<\/h5>\n<\/div>\n<div class=\"kc-elm kc-css-650423 kc_text_block\"><\/p>\n<p>Intelligent robotic systems that want to perform complex tasks in an unstructured environment need to understand the 3D nature of their surroundings and their own position within this space. This is a problem typically solved with a varied mix of sensing technologies, including active sensing systems (such as LIDAR, ultrasound or infrared rangers, etc) for 3D reconstruction and GNSS &#8211; sensors for localisation. 
However, in a military context, active sensors have the disadvantage of being detectable, whereas GNSS solutions can be easily jammed.<\/p>\n<p>Therefore, RMA works on the development of 3D reconstruction and localisation algorithms that are mostly based on passive sensors (e.g. monocular, binocular and trinocular cameras) and that can work in GNSS-denied environments.<\/p>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-823207 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-579530 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-389717\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><\/p>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":2715,"parent":275,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-2581","page","type-page","status-publish","has-post-thumbnail","hentry"],"_links":{"self":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/2581","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/comments?post=2581"}],"version-history":[{"count":27,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/2581\/revisions"}],"predecessor-version":[{"id":5379,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/2581\/revisions\/5379"}],"up":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/
275"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/media\/2715"}],"wp:attachment":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/media?parent=2581"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}