<h1>Emile Le Flecher</h1>

<img src="https://mecatron.rma.ac.be/wp-content/uploads/2020/12/Emile_Le_Flecher.jpg" alt="Emile Le Flecher" />

<h4>Researcher</h4>
<p>Robotics &amp; Autonomous Systems,<br />Royal Military Academy</p>

<h4>Address</h4>
<p>Avenue De La Renaissance 30, 1000 Brussels, Belgium</p>

<h4>Contact Information</h4>
<p><strong>Call</strong>: –</p>
<p><strong>Email</strong>: <a href="mailto:Emile.LeFlecher@mil.be">Emile.LeFlecher@mil.be</a></p>

<p>Emile is a robotics researcher in the Robotics &amp; Autonomous Systems unit of the Department of Mechanics of the Belgian Royal Military Academy. His research focuses on developing solutions for managing heterogeneous robot fleets in harsh environments.</p>

<p>He received his Master's degree in Robotics and Control Theory in 2016 from the <a href="https://www.univ-tlse3.fr/">Université Toulouse III, Paul Sabatier</a>. He then received his doctorate in Robotics in 2020 from the <a href="https://www.univ-tlse3.fr/">Université Toulouse III, Paul Sabatier</a> and the <a href="https://www.laas.fr/public/">LAAS-CNRS</a> laboratory, with the thesis "Motion coordination of a bi-arm mobile robot to perform complex navigation and manipulation tasks in highly dynamic environments".</p>

<p>During his thesis, Emile worked in close collaboration with the agricultural robotics and sensors department of the <a href="https://www.ucdavis.edu/">University of California, Davis</a> (USA), where he conducted one year of his research, and with the mechanical engineering department of the <a href="https://www.ufpe.br/">Federal University of Pernambuco</a> (Brazil).
Afterwards, Emile participated as a researcher in a project with <a href="https://www.laas.fr/public/">LAAS-CNRS</a> and the company <a href="https://www.naio-technologies.com/en/">Naio Technologies</a> on improving autonomous navigation in agricultural fields.</p>

<p>In 2020, Emile joined the Belgian Royal Military Academy to participate in the <a href="https://mecatron.rma.ac.be/index.php/projects/imugs/">iMUGS</a> project, which aims to develop and deploy a modular, standardized, and open system architecture for manned-unmanned teams of robots supporting armed forces in the field.</p>

<h4>Profiles</h4>
<p>
<a href="mailto:Emile.LeFlecher@mil.be">Email</a> ·
<a href="https://www.linkedin.com/in/emile-le-flecher-183317a9/">LinkedIn</a> ·
<a href="https://scholar.google.com/citations?hl=en&user=FGXbrvQAAAAJ&view_op=list_works&authuser=1&sortby=pubdate">Google Scholar</a> ·
<a href="https://www.researchgate.net/profile/Emile_Le_Flecher2">ResearchGate</a>
</p>

<h2>Publications</h2>

<h3>2025</h3>
<ul>
<li>A. Miuccio, T. Fréville, E. Le Flécher, and C. Hamesse, "Autonomous Mobile Manipulation for Safe and Efficient Landmine Disposal," CEIA Humanitarian Clearance Teamwork, 2025.
<br />
<a href="https://www.ctro.hr/userfiles/files/BROSURA%20CTRO%20MINE%20ACTION_2025_za%20web.pdf#page=25">[Download PDF]</a>
<pre><code class="tex bibtex">@misc{miuccio_autonomous_2025,
title = {Autonomous {Mobile} {Manipulation} for {Safe} and {Efficient} {Landmine} {Disposal}},
url = {https://www.ctro.hr/userfiles/files/BROSURA%20CTRO%20MINE%20ACTION_2025_za%20web.pdf#page=25},
language = {EN},
publisher = {CEIA Humanitarian Clearance Teamwork},
author = {Miuccio, Alessandra and Fréville, Timothée and Le Flécher, Emile and Hamesse, Charles},
year = {2025},
unit = {meca-ras},
project = {DREAM}
}</code></pre>
</li>
<li>A. La Grappe, E. Le Flécher, and G. De Cubber, "Multi-Sensor Multi-Target Tracking for Maritime Surveillance with Autonomous Surface Vehicles Using Belief Propagation," in <i>OCEANS 2025 Brest</i>, 2025, pp. 1–8.
<br />
<a href="https://ieeexplore.ieee.org/document/11104349">[Download PDF]</a>
<a href="http://dx.doi.org/10.1109/OCEANS58557.2025.11104349">[DOI]</a>
<br />
<blockquote>
<p>We present a distributed multi-sensor multi-target tracking algorithm for maritime surveillance using unmanned surface vehicles (USVs) in multi-agent settings. Our approach fuses measurements from radar, Automatic Identification System (AIS), and camera sensors within a Bayesian framework, employing an adaptive particle filtering strategy to jointly estimate the kinematic states and identities of vessels. Our solution incorporates a factorized data association model that integrates cooperative self-reports from AIS with radar and camera measurements, with visual re-identification capability. We evaluate our method using a high-fidelity simulation environment, which generates photorealistic maritime scenarios. Our performance analysis indicates that the integration of camera-based cues improves both the spatial localization and identity consistency, particularly in scenarios with low radar detection probability and non-cooperative targets. Furthermore, the distributed inference framework scales well with the number of USVs, making it well suited for large-scale multi-agent applications.
Overall, our work demonstrates that fusing heterogeneous sensor modalities using belief propagation can enhance multi-target tracking performance in congested maritime environments.</p>
</blockquote>
<pre><code class="tex bibtex">@inproceedings{2c491cd7d3c64ae3ae227fdab6a2c80f,
title = "Multi-Sensor Multi-Target Tracking for Maritime Surveillance with Autonomous Surface Vehicles Using Belief Propagation",
keywords = "Visualization, Target tracking, Radar measurements, Surveillance, Radar, Radar tracking, Cameras, Particle measurements, Sensors, Artificial intelligence, Multi-target tracking, Maritime surveillance, Unmanned Surface Vessels, Distributed sensor fusion, Particle filtering, Belief propagation, Multi-agent robotics",
author = "La Grappe, Alexandre and Le Flécher, Emile and De Cubber, Geert",
year = "2025",
month = jun,
day = "19",
doi = "10.1109/OCEANS58557.2025.11104349",
language = "English",
pages = "1--8",
booktitle = "OCEANS 2025 Brest",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
url = "https://ieeexplore.ieee.org/document/11104349",
unit = "meca-ras",
project = "MULTIMETER"
}</code></pre>
</li>
<li>E. Maroulis, D. Hawari, K. Hasselmann, E. Le Flécher, and G. De Cubber, "Experimental Evaluation of Roadmap-Based Map Generation with Continuous-Time Conflict-Based Search for Multi-Agent Pathfinding," in <i>IEEE International Conference on Autonomous Robots and Agents, ICARA</i>, 2025, pp. 380–387.
<br />
<a href="https://ieeexplore.ieee.org/document/10977707">[Download PDF]</a>
<a href="http://dx.doi.org/10.1109/ICARA64554.2025.10977707">[DOI]</a>
<br />
<blockquote>
<p>This article presents an experimental evaluation of a Multi-Agent Pathfinding (MAPF) approach for large-scale robotic fleets operating in diverse outdoor environments. We generated three distinct types of roadmap graphs: Constrained Delaunay Triangulation (CDT), Voronoi Diagram (VD), and Grid, derived from an obstacle file, and assessed their quality using metrics obtained from graph theory. Then, the performance of the Continuous-time Conflict-Based Search (CCBS) algorithm was evaluated across three different environmental maps, considering practical performance metrics including makespan and failure rate. Subsequently, the roadmap generation methods were ranked based on CCBS performance in similar scenarios using the Friedman statistical test. The results indicate that CDT outperforms both VD and Grid maps, even though it does not exhibit the best graph metrics in many environments. CDT's superior performance is attributed to its enhanced interconnectedness and the availability of multiple pathways, as evidenced by its balanced metrics and structural properties. We show that CDT is the most efficient and reliable roadmap generation technique for multiagent systems under our experimental conditions, making it a preferred choice for robust pathfinding.</p>
</blockquote>
<pre><code class="tex bibtex">@inproceedings{34774d01cc3341398188fc8353028be2,
title = "Experimental Evaluation of Roadmap-Based Map Generation with Continuous-Time Conflict-Based Search for Multi-Agent Pathfinding",
keywords = "Measurement, Automation, Reliability theory, Graph theory, Path planning, Robots, Multi-agent systems",
author = "Emmanouil Maroulis and Danial Hawari and Ken Hasselmann and Le Flécher, Emile and De Cubber, Geert",
year = "2025",
month = may,
day = "5",
doi = "10.1109/ICARA64554.2025.10977707",
language = "English",
pages = "380--387",
booktitle = "IEEE International Conference on Autonomous Robots and Agents, ICARA",
issn = "2767-7745",
url = "https://ieeexplore.ieee.org/document/10977707",
unit = {meca-ras},
project = {CUGS, ANIMUS, AIDEDEX, CONVOY}
}</code></pre>
</li>
</ul>

<h3>2024</h3>
<ul>
<li>A. M. Casado Fauli, M. Malizia, K. Hasselmann, E. Le Flécher, G. De Cubber, and B. Lauwens, "HADRON: Human-friendly Control and Artificial Intelligence for Military Drone Operations," in <i>Proceedings of the 33rd IEEE International Conference on Robot and Human Interactive Communication, IEEE RO-MAN 2024</i>, 2024.
<br />
<a href="https://arxiv.org/abs/2408.07063">[Download PDF]</a>
<pre><code class="tex bibtex">@inproceedings{fauli2024hadronhumanfriendlycontrolartificial,
title={HADRON: Human-friendly Control and Artificial Intelligence for Military Drone Operations},
author={Casado Fauli, Ana Maria and Malizia, Mario and Hasselmann, Ken and Le Flécher, Emile and De Cubber, Geert and Lauwens, Ben},
year={2024},
booktitle={Proceedings 33rd IEEE International Conference on Robot and Human Interactive Communication, IEEE RO-MAN 2024},
publisher = {IEEE},
location = {Pasadena, USA},
unit= {meca-ras},
project= {HADRON},
eprint={2408.07063},
archivePrefix={arXiv},
primaryClass={cs.RO},
url={https://arxiv.org/abs/2408.07063}
}</code></pre>
</li>
<li>M. Malizia, A. M. Casado Fauli, K. Hasselmann, E. Le Flécher, G. De Cubber, and R. Haelterman, "Assisted Explosive Ordnance Disposal: Teleoperated Robotic Systems with AI, Virtual Reality, and Semi-Autonomous Manipulation for Safer Demining Operations," in <i>20th International Symposium Mine Action</i>, 2024, pp. 52–55.
<br />
<a href="https://www.ctro.hr/userfiles/files/MINE%20ACTION_2024_ONLIINE.pdf">[Download PDF]</a>
<pre><code class="tex bibtex">@inproceedings{maliziamineact2024,
title={Assisted Explosive Ordnance Disposal: Teleoperated Robotic Systems with AI, Virtual Reality, and Semi-Autonomous Manipulation for Safer Demining Operations},
author={Malizia, Mario and Casado Fauli, Ana Maria and Hasselmann, Ken and Le Flécher, Emile and De Cubber, Geert and Haelterman, Rob},
booktitle={20th International Symposium Mine Action},
publisher = {CTRO-HR},
year = {2024},
location = {Cavtat, Croatia},
unit= {meca-ras},
url={https://www.ctro.hr/userfiles/files/MINE%20ACTION_2024_ONLIINE.pdf},
pages={52-55},
project= {BELGIAN}
}</code></pre>
</li>
</ul>

<h3>2023</h3>
<ul>
<li>G. De Cubber, E. Le Flécher, A. La Grappe, E. Ghisoni, E. Maroulis, P. Ouendo, D. Hawari, and D. Doroftei, "Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case," in <i>IEEE International Conference on Safety, Security, and Rescue Robotics</i>, 2023.
<br />
<a href="https://mecatron.rma.ac.be/pub/2023/SSRR2023-DeCubber.pdf">[Download PDF]</a>
<pre><code class="tex bibtex">@inproceedings{ssrr2023decubber,
title={Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case},
author={De Cubber, Geert and Le Flécher, Emile and La Grappe, Alexandre and Ghisoni, Enzo and Maroulis, Emmanouil and Ouendo, Pierre-Edouard and Hawari, Danial and Doroftei, Daniela},
booktitle={IEEE International Conference on Safety, Security, and Rescue Robotics},
editors = {Kimura, Tetsuya},
publisher = {IEEE},
year = {2023},
vol = {1},
project = {AIDED, iMUGs, CUGS},
location = {Fukushima, Japan},
unit= {meca-ras},
url={https://mecatron.rma.ac.be/pub/2023/SSRR2023-DeCubber.pdf}
}</code></pre>
</li>
<li>G. De Cubber, E. Le Flécher, A. Dominicus, and D. Doroftei, "Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario," in <i>Human Factors in Robots, Drones and Unmanned Systems, AHFE (2023) International Conference</i>, 2023.
<br />
<a href="https://openaccess.cms-conferences.org/publications/book/978-1-958651-69-8/article/978-1-958651-69-8_5">[Download PDF]</a>
<a href="https://doi.org/10.54941/ahfe1003746">[DOI]</a>
<br />
<blockquote>
<p>Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives.
Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kind of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces. The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve. The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning &amp; scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission. Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the field. The resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.</p>
</blockquote>
<pre><code class="tex bibtex">@inproceedings{ahfe20203decubber,
title={Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario},
author={De Cubber, G. and Le Flécher, E. and Dominicus, A. and Doroftei, D.},
booktitle={Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.},
editors = {Tareq Ahram and Waldemar Karwowski},
publisher = {AHFE Open Access, AHFE International, USA},
year = {2023},
vol = {93},
project = {iMUGs},
location = {San Francisco, USA},
unit= {meca-ras},
doi = {10.54941/ahfe1003746},
url={https://openaccess.cms-conferences.org/publications/book/978-1-958651-69-8/article/978-1-958651-69-8_5}
}
</code></pre>
</li>
</ul>

<h3>2022</h3>
<ul>
<li>E. Le Flécher, A. La Grappe, and G. De Cubber, "iMUGS – A ground multi-robot architecture for military Manned-Unmanned Teaming," in <i>2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)</i>, IEEE, 2022.
<br />
<pre><code class="tex bibtex">@inbook{imugs_le_flecher_la_grappe_de_cubber,
place={Kyoto},
title={iMUGS - A ground multi-robot architecture for military Manned-Unmanned Teaming},
booktitle={2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
publisher={IEEE},
year={2022},
author={Le Flécher, Emile and La Grappe, Alexandre and De Cubber, Geert},
project = {iMUGs},
unit= {meca-ras}
}</code></pre>
</li>
</ul>

<h3>2021</h3>
<ul>
<li>L. Emmi, E. Le Flécher, V. Cadenat, and M. Devy, "A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture," <i>Precision Agriculture</i>, vol. 22, no. 2, pp. 524–549, 2021.
<br />
<a href="https://link.springer.com/10.1007/s11119-020-09773-9">[Download PDF]</a>
<a href="http://dx.doi.org/10.1007/s11119-020-09773-9">[DOI]</a>
<br />
<blockquote>
<p>This paper considers the problem of autonomous navigation in agricultural fields. It proposes a localization and mapping framework based on semantic place classification and key location estimation, which together build a hybrid topological map. This map benefits from generic partitioning of the field, which contains a finite set of well-differentiated workspaces and, through a semantic analysis, it is possible to estimate in a probabilistic way the position (state) of a mobile system in the field. Moreover, this map integrates both metric (key locations) and semantic features (working areas). One of its advantages is that a full and precise map prior to navigation is not necessary. The identification of the key locations and working areas is carried out by a perception system based on 2D LIDAR and RGB cameras. Fusing these data with odometry allows the robot to be located in the topological map. The approach is assessed through off-line data recorded in real conditions in diverse fields during different seasons. It exploits a real-time object detector based on a convolutional neural network called you only look once, version 3, which has been trained to classify a considerable number of crops, including market-garden crops such as broccoli and cabbage, and to identify grapevine trunks.
The results show the interest in the approach, which allows (i) obtaining a simple and easy-to-update map, (ii) avoiding the use of artificial landmarks, and thus (iii) improving the autonomy of agricultural robots.</p>
</blockquote>
<pre><code class="tex bibtex">@article{emmi_hybrid_2021,
title = {A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture},
volume = {22},
issn = {1385-2256, 1573-1618},
url = {https://link.springer.com/10.1007/s11119-020-09773-9},
doi = {10.1007/s11119-020-09773-9},
language = {en},
number = {2},
urldate = {2022-10-06},
journal = {Precision Agric},
author = {Emmi, L. and Le Flécher, E. and Cadenat, V. and Devy, M.},
month = apr,
year = {2021},
pages = {524--549}
}</code></pre>
</li>
</ul>

<h3>2020</h3>
<ul>
<li>E. Le Flécher, "Coordination des mouvements d'un système mobile bi-bras pour la réalisation de tâches complexes de navigation et de manipulation dans un environnement fortement dynamique," PhD thesis, Université Toulouse 3 Paul Sabatier, 2020.
<br />
<pre><code class="tex bibtex">@phdthesis{These_2020,
place={Toulouse, FR},
title={Coordination des mouvements d'un système mobile bi-bras pour la réalisation de tâches complexes de navigation et de manipulation dans un environnement fortement dynamique},
school={Université Toulouse 3 Paul Sabatier},
author={Le Flécher, Emile},
year={2020},
month={Feb}
}</code></pre>
</li>
<li>E. Le Flécher, A. Durand-Petiteville, V. Cadenat, and T. Sentenac, "Simultaneous Control of Two Robotics Arms Sharing Workspace via Visual Predictive Control," in <i>Informatics in Control, Automation and Robotics: 16th International Conference</i>, Springer, 2020, pp. 79–98.
<br />
<pre><code class="tex bibtex">@inbook{Book_2020,
place={S.l.},
title={Simultaneous Control of Two Robotics Arms Sharing Workspace via Visual Predictive Control},
ISBN={978-3-030-63193-2},
booktitle={INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS 16th international.},
publisher={SPRINGER},
author={Le Flécher, Emile and Durand-Petiteville, Adrien and Cadenat, Viviane and Sentenac, Thierry},
year={2020},
pages={79--98}
}</code></pre>
</li>
</ul>

<h3>2019</h3>
<ul>
<li>D. Leca, V. Cadenat, T. Sentenac, A. Durand-Petiteville, F. Gouaisbaut, and E. Le Flécher, "Sensor-based Obstacles Avoidance Using Spiral Controllers For an Aircraft Maintenance Inspection Robot," in <i>Proceedings of the European Control Conference</i>, 2019, p. 7.
<br />
<pre><code class="tex bibtex">@inproceedings{ECC_2019,
place={Naples, Italy},
title={Sensor-based Obstacles Avoidance Using Spiral Controllers For an Aircraft Maintenance Inspection Robot},
booktitle={Proceeding of European Control Conference},
author={D Leca and V Cadenat and T Sentenac and A Durand-Petiteville and F Gouaisbaut and E Le Flécher},
year={2019},
pages={7}
}</code></pre>
</li>
<li>E. Le Flécher, A. Durand-Petiteville, F. Gouaisbaut, V. Cadenat, S. Vougioukas, and T. Sentenac, "Nonlinear Output Feedback for Autonomous U-turn Maneuvers of a Robot in Orchard Headlands," in <i>Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics</i>, 2019, pp. 355–362.
<br />
<a href="http://dx.doi.org/10.5220/0007918803550362">[DOI]</a>
<pre><code class="tex bibtex">@inproceedings{ICINCO_2019_1,
place={Prague, Czech Republic},
title={Nonlinear Output Feedback for Autonomous U-turn Maneuvers of a Robot in Orchard Headlands},
ISBN={978-989-758-380-3},
DOI={10.5220/0007918803550362},
booktitle={Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics},
publisher={SCITEPRESS - Science and Technology Publications},
author={E. Le Flécher and A. Durand-Petiteville and F. Gouaisbaut and V. Cadenat and S. Vougioukas and T. Sentenac},
year={2019},
pages={355--362}
}</code></pre>
</li>
<li>E. Le Flécher, A. Durand-Petiteville, V. Cadenat, and T. Sentenac, "Visual Predictive Control of Robotic Arms with Overlapping Workspace," in <i>Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics</i>, 2019, pp. 130–137.
<br />
<a href="http://dx.doi.org/10.5220/0008119001300137">[DOI]</a>
<pre><code class="tex bibtex">@inproceedings{ICINCO_2019_2,
place={Prague, Czech Republic},
title={Visual Predictive Control of Robotic Arms with Overlapping Workspace},
ISBN={978-989-758-380-3},
DOI={10.5220/0008119001300137},
booktitle={Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics},
publisher={SCITEPRESS - Science and Technology Publications},
author={E. Le Flécher and A. Durand-Petiteville and V. Cadenat and T. Sentenac},
year={2019},
pages={130--137}
}</code></pre>
</li>
</ul>

<h3>2018</h3>
<ul>
<li>A. Durand-Petiteville, E. Le Flécher, V. Cadenat, T. Sentenac, and S. Vougioukas, "Tree detection with low-cost 3D sensors for autonomous navigation in orchards," <i>Robotics and Autonomous Letters</i>, p. 8, 2018.
<br />
<pre><code class="tex bibtex">@article{RAL_2018,
title={Tree detection with low-cost 3D sensors for autonomous navigation in orchards},
journal={Robotics and Autonomous Letters},
author={A Durand-Petiteville and E Le Flécher and V Cadenat and T Sentenac and S Vougioukas},
year={2018},
pages={8}
}</code></pre>
</li>
</ul>

<h3>2017</h3>
<ul>
<li>E. Le Flécher, A. Durand-Petiteville, V. Cadenat, T. Sentenac, and S. Vougioukas, "Implementation on a harvesting robot of a sensor-based controller performing a u-turn," in <i>Proceedings of the IEEE International Workshop of Electronics, Control, Measurement, Signals and their Application to Mechatronics</i>, 2017, pp. 1–6.
<br />
<a href="http://dx.doi.org/10.1109/ECMSM.2017.7945895">[DOI]</a>
<pre><code class="tex bibtex">@inproceedings{ECMSM_2017,
title={Implementation on a harvesting robot of a sensor-based controller performing a u-turn},
ISBN={978-1-5090-5582-1},
DOI={10.1109/ECMSM.2017.7945895},
booktitle={Proceedings of IEEE International Workshop of Electronics, Control, Measurement, Signals and their application to Mechatronics},
publisher={IEEE},
author={Le Flécher, E. and A. Durand-Petiteville and V. Cadenat and T. Sentenac and S. Vougioukas},
year={2017},
month={May},
pages={1--6}
}</code></pre>
</li>
<li>A. Durand-Petiteville, E. Le Flécher, V. Cadenat, T. Sentenac, and S. Vougioukas, "Design of a Sensor-based Controller Performing U-turn to Navigate in Orchards," in <i>ICINCO</i>, 2017, pp. 172–181.
<br />
<a href="http://dx.doi.org/10.5220/0006478601720181">[DOI]</a>
<pre><code class="tex bibtex">@inproceedings{ICINCO_2017,
title={Design of a Sensor-based Controller Performing U-turn to Navigate in Orchards},
ISBN={978-989-758-263-9},
DOI={10.5220/0006478601720181},
booktitle={ICINCO},
publisher={SCITEPRESS - Science and Technology Publications},
author={A. Durand-Petiteville and E. Le Flécher and V. Cadenat and T. Sentenac and S. Vougioukas},
year={2017},
pages={172--181}
}</code></pre>
</li>
</ul>