{"id":3586,"date":"2020-02-20T11:36:32","date_gmt":"2020-02-20T10:36:32","guid":{"rendered":"https:\/\/mecatron.rma.ac.be\/?page_id=3586"},"modified":"2025-06-15T21:13:13","modified_gmt":"2025-06-15T20:13:13","slug":"daniela-doroftei","status":"publish","type":"page","link":"https:\/\/mecatron.rma.ac.be\/index.php\/people\/daniela-doroftei\/","title":{"rendered":"Daniela Doroftei"},"content":{"rendered":"<p><section class=\"kc-elm kc-css-426738 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-962437 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-180409\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-916923 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-776118 kc_col-sm-4 kc_column kc_col-sm-4\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-194886 kc_shortcode kc_single_image\">\n\n        <img decoding=\"async\" src=\"https:\/\/mecatron.rma.ac.be\/wp-content\/uploads\/2020\/04\/Daniela.Doroftei.png\" class=\"\" alt=\"\" \/>    <\/div>\n<div class=\"kc-elm kc-css-919921 kc_text_block\"><\/p>\n<h4><span style=\"color: inherit; font-size: 1.25em; font-style: inherit;\">Senior Researcher<\/span><\/h4>\n<p>Robotics &#038; Autonomous Systems,<br \/>Royal Military Academy<\/p>\n<h4>Address<\/h4>\n<p>Avenue De La Renaissance 30, 1000 Brussels, Belgium<\/p>\n<h4>Contact Information<\/h4>\n<p><strong>Call<\/strong>: +32(0)2-44-14106<\/p>\n<p><strong>Email<\/strong>: <a href=\"mailto:daniela.doroftei@rma.ac.be\">daniela.doroftei@rma.ac.be<\/a><\/p>\n<p>\n<\/div><\/div><\/div><div class=\"kc-elm kc-css-238467 kc_col-sm-7 kc_column kc_col-sm-7\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-493651 kc_text_block\"><\/p>\n<p>Daniela Doroftei is a senior researcher at the Robotics &#038; Autonomous 
Systems unit of the department of Mechanics of the Belgian Royal Military Academy. Her research focuses on the tight integration and seamless interfacing between humans and robots in security and defence applications. Within the Robotics &#038; Autonomous Systems unit, she is therefore the expert on research questions related to human factors, requirements engineering, human-robot shared control methodologies and operational quantitative validation methods.\u00a0<\/p>\n<p>Daniela received her Master's degree in Mechanical Engineering in 2002 from the <a href=\"https:\/\/www.tuiasi.ro\/?lang=en\">Gheorghe Asachi University<\/a>\u00a0of Iasi, Romania, and a Master-after-Master degree in Applied Sciences in 2003 from the <a href=\"https:\/\/www.ulb.be\/en\/ulb-homepage\">Universit\u00e9 Libre de Bruxelles<\/a> (ULB), Belgium.\u00a0\u00a0<\/p>\n<p>She is a task or work-package leader in multiple European research projects, such as <a href=\"https:\/\/mecatron.rma.ac.be\/index.php\/projects\/icarus\/\">FP7-ICARUS<\/a> (on the development of search and rescue robots) and <a href=\"https:\/\/mecatron.rma.ac.be\/index.php\/projects\/safeshore\/\">H2020-SafeShore<\/a> (on the development of a threat detection system). Within these projects, Daniela acts as the end-user liaison officer, bridging the gap in the field between the scientists and the end-user stakeholders.\u00a0<\/p>\n<p><span style=\"font-style: inherit;\">Daniela is a principal investigator for RMA in multiple international research projects such as <a href=\"https:\/\/mecatron.rma.ac.be\/index.php\/projects\/starseu\/\">STARS*EU<\/a> and<\/span><span style=\"font-style: inherit;\">\u00a0<\/span><a style=\"font-style: inherit;\" href=\"https:\/\/mecatron.rma.ac.be\/index.php\/projects\/assets\/\">ASSETs+<\/a><span style=\"font-style: inherit;\">. 
Moreover, she is the technical coordinator for <\/span><a style=\"font-style: inherit;\" href=\"https:\/\/mecatron.rma.ac.be\/index.php\/projects\/alphonse\/\">the Alphonse research project<\/a><span style=\"font-style: inherit;\">, which aims to reduce the number of drone incidents by improving the training procedures for drone operators.\u00a0<\/span><\/p>\n<p>Daniela is active as a reviewer for the European Commission and other funding agencies and has published around 50 scientific papers, including books and book chapters.\u00a0<\/p>\n<p>\n<\/div><\/div><\/div><div class=\"kc-elm kc-css-482053 kc_col-sm-1 kc_column kc_col-sm-1\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-678699 kc-raw-code\"><script src=\"https:\/\/kit.fontawesome.com\/5feed4ac07.js\" crossorigin=\"anonymous\"><\/script>\r\n<link rel=\"stylesheet\" href=\"https:\/\/cdn.rawgit.com\/jpswalsh\/academicons\/master\/css\/academicons.min.css\">\r\n<span style=\"font-size: 36px; color: Dodgerblue;\">\r\n<center>\r\n<a href=\"mailto:daniela.doroftei@rma.ac.be\"> <i class=\"fas fa-envelope fas-3x\"><\/i><\/a><br><br>\r\n<a href=\"skype:daniela.doroftei?call\"> <i class=\"fab fa-skype fab-3x\"><\/i><\/a><br><br>\r\n<a href=\"#\"> <i class=\"fab fa-twitter\"><\/i><\/a><br><br>\r\n<a href=\"https:\/\/www.linkedin.com\/in\/daniela-doroftei-7b412427\/\"> <i class=\"fab fa-linkedin\"><\/i><\/a><br><br>\r\n<a href=\"http:\/\/mecatron.rma.ac.be\/People\/DOROFTEI_Daniela_CV.pdf\"> <i class=\"ai ai-cv-square ai\"><\/i><\/a><br><br>\r\n<\/center>\r\n<\/span>\r\n<span style=\"font-size: 36px; color: Dodgerblue;\">\r\n<center>\r\n<a href=\"https:\/\/scholar.google.be\/citations?user=EUZro_cAAAAJ&hl=en&oi=ao\"><i class=\"ai ai-google-scholar-square ai\"><\/i><\/a><br><br>\r\n<a href=\"https:\/\/www.researchgate.net\/profile\/Daniela_Doroftei\"><i class=\"fab fa-researchgate\"><\/i><\/a><br><br>\r\n<a href=\"#\"><i class=\"fab fa-mendeley\"><\/i><\/a><br><br>\r\n<a href=\"#\"><i 
class=\"ai ai-researcherid-square ai\"><\/i><\/a><br><br>\r\n<a href=\"#\"><i class=\"ai ai-orcid-square ai\"><\/i><\/a>\r\n<\/center>\r\n<\/span><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-666880 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-2311 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-360185\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-907376 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-989665 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\">\n<div class=\"kc-elm kc-css-430122 kc-title-wrap \">\n\n\t<h2 class=\"kc_title\">Publications<\/h2>\n<\/div>\n<div class=\"kc-elm kc-css-218336 kc_text_block\"><\/p>\n<p><span style=\"font-style: inherit;\"><\/p>\n<h3 class=\"papercite\">2025<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber and D. Doroftei, &#8220;Resource Optimisation for Distributed Teams of Manned Aircraft and Drones,\" in <span style=\"font-style: italic\">2025 11th International Conference on Control, Automation and Robotics, ICCAR 2025<\/span>,  2025, p. 554\u2013559.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_0\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_0\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/11072934\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ICCAR64901.2025.11072934' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_0_block\">\n<p>With the significant increase in onboard computing capabilities, modern aerial robotic systems can execute a broad range of perception and control algorithms simultaneously. Moreover, more and more, they are also deployed as heterogeneous collaborative teams, where manned and unmanned assets need to collaborate in a manned-unmanned teaming concept. This introduces the challenge of determining the optimal distribution of cognitive processes across aerial platforms, edge computing nodes, and cloud-based services. In this paper, we propose a novel load distribution methodology tailored to the aerial domain. The approach adopts a decentralized framework for allocating perception and control processes by evaluating communication parameters (e.g., bandwidth, latency, and cost), the computational capabilities of the drones and supporting infrastructure (including CPU, GPU, memory, and storage performance), and the real-time delivery requirements of high-quality output data. 
The proposed methodology is validated in a simulated environment, demonstrating promising performance and scalability in handling dynamic operational conditions.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_0_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{182aea17e9984dd395cb61aeec26ce48,\ntitle = \"Resource Optimisation for Distributed Teams of Manned Aircraft and Drones\",\nabstract = \"With the significant increase in onboard computing capabilities, modern aerial robotic systems, can execute a broad range of perception and control algorithms simultaneously. Moreover, more and more, they are also deployed as heterogeneous collaborative teams, where manned and unmanned assets need to collaborate in a manned-unmanned teaming concept. This introduces the challenge of determining the optimal distribution of cognitive processes across aerial platforms, edge computing nodes, and cloud-based services. In this paper, we propose a novel load distribution methodology tailored to the aerial domain. The approach adopts a decentralized framework for allocating perception and control processes by evaluating communication parameters (e.g., bandwidth, latency, and cost), the computational capabilities of the drones and supporting infrastructure (including CPU, GPU, memory, and storage performance), and the real-time delivery requirements of high-quality output data. 
The proposed methodology is validated in a simulated environment, demonstrating promising performance and scalability in handling dynamic operational conditions.\",\nkeywords = \"drones, manned-unmanned teaming, resource optimisation\",\nauthor = \"De Cubber, Geert and Doroftei, Daniela\",\nnote = \"Publisher Copyright: {textcopyright} 2025 IEEE.; 11th International Conference on Control, Automation and Robotics, ICCAR 2025 ; Conference date: 18-04-2025 Through 20-04-2025\",\nyear = \"2025\",\ndoi = \"10.1109\/ICCAR64901.2025.11072934\",\nlanguage = \"English\",\npages = \"554--559\",\nbooktitle = \"2025 11th International Conference on Control, Automation and Robotics, ICCAR 2025\",\npublisher = \"Institute of Electrical and Electronics Engineers Inc.\",\nedition = \"2025\",\nurl = \"https:\/\/ieeexplore.ieee.org\/document\/11072934\",\nunit= meca-ras,\nproject= {COURAGEOUS2,HADRON,ALPHONSE}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, P. Petsioti, A. Koniaris, K. Brewczy\u0144ski, M. \u017byczkowski, R. Roman, S. Sima, A. Mohamoud, J. van de Pol, I. Maza, A. Ollero, C. Church, and C. Popa, &#8220;Standardized Evaluation of Counter-Drone Systems: Methods, Technologies, and Performance Metrics,\" <span style=\"font-style: italic\">Drones<\/span>, vol. 9, iss. 5, 2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_1\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_1\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.mdpi.com\/2504-446X\/9\/5\/354\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3390\/drones9050354' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_1_block\">\n<p>This paper aims to introduce a standardized test methodology for drone detection, tracking, and identification systems. 
It is the aim that this standardized test methodology for assessing the performance of counter-drone systems will lead to a much better understanding of the capabilities of these solutions. This is urgently needed, as there is an increase in drone threats and there are no cohesive policies to evaluate the performance of these systems and hence mitigate and manage the threat. The presented methodology has been developed within the framework of the project COURAGEOUS funded by the European Union\u2019s Internal Security Fund Police. This standardized test methodology is based upon a series of standard user-defined scenarios representing a wide set of use cases. At this moment, these standard scenarios are geared toward civil security end users. However, the proposed standard methodology provides an open architecture where the standard scenarios can be modularly extended, providing standard users the possibility to easily add new scenarios. For each of these scenarios, operational needs and functional performance requirements are provided. Using this information, an integral test methodology is presented that allows for a fair qualitative and quantitative comparison between different counter-drone systems. The standard test methodology concentrates on the qualitative and quantitative evaluation of counter-drone systems. This test methodology was validated during three user-scripted validation trials.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_1_block\">\n<pre><code class=\"tex bibtex\">@article{5edff7d5878f412ba52cbbeab092019f,\ntitle = \"Standardized Evaluation of Counter-Drone Systems: Methods, Technologies, and Performance Metrics\",\nabstract = \"This paper aims to introduce a standardized test methodology for drone detection, tracking, and identification systems. 
It is the aim that this standardized test methodology for assessing the performance of counter-drone systems will lead to a much better understanding of the capabilities of these solutions. This is urgently needed, as there is an increase in drone threats and there are no cohesive policies to evaluate the performance of these systems and hence mitigate and manage the threat. The presented methodology has been developed within the framework of the project COURAGEOUS funded by European Union{textquoteright}s Internal Security Fund Police. This standardized test methodology is based upon a series of standard user-defined scenarios representing a wide set of use cases. At this moment, these standard scenarios are geared toward civil security end users. However, the proposed standard methodology provides an open architecture where the standard scenarios can be modularly extended, providing standard users the possibility to easily add new scenarios. For each of these scenarios, operational needs and functional performance requirements are provided. Using this information, an integral test methodology is presented that allows for a fair qualitative and quantitative comparison between different counter-drone systems. The standard test methodology concentrates on the qualitative and quantitative evaluation of counter-drone systems. 
This test methodology was validated during three user-scripted validation trials.\",\nkeywords = \"CUAS, Counter-Drone, C-UAS, Standardization, Standard Test Methods\",\nauthor = \"De Cubber, Geert and Daniela Doroftei and Paraskevi Petsioti and Alexios Koniaris and Konrad Brewczy{'n}ski and Marek {.Z}yczkowski and Razvan Roman and Silviu Sima and Ali Mohamoud and {van de Pol}, Johan and Ivan Maza and Anibal Ollero and Christopher Church and Cristina Popa\",\nyear = \"2025\",\nmonth = may,\nday = \"6\",\ndoi = \"10.3390\/drones9050354\",\nlanguage = \"English\",\nvolume = \"9\",\njournal = \"Drones\",\nissn = \"2504-446X\",\nnumber = \"5\",\nurl = \"https:\/\/www.mdpi.com\/2504-446X\/9\/5\/354\",\nunit= {meca-ras},\nproject= {COURAGEOUS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2024<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    P. Petsioti, M. Zyczkowski, K. Brewczyski, K. Cichulski, K. Kaminski, R. Razvan, A. Mohamoud, C. Church, A. Koniaris, G. De Cubber, and D. Doroftei, &#8220;Methodological Approach for the Development of Standard C-UAS Scenarios,\" <span style=\"font-style: italic\">Open Research Europe<\/span>, vol. 4, iss. 240, 2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_2\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/open-research-europe.ec.europa.eu\/articles\/4-240\/v1\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.12688\/openreseurope.18339.1' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_2_block\">\n<pre><code class=\"tex bibtex\">@Article{ 10.12688\/openreseurope.18339.1,\nAUTHOR = {Petsioti, P. and Zyczkowski, M. and Brewczyski, K. and Cichulski, K. and Kaminski, K. and Razvan, R. and Mohamoud, A. and Church, C. and Koniaris, A. and De Cubber, G. 
and Doroftei, D.},\nTITLE = {Methodological Approach for the Development of Standard C-UAS Scenarios},\nJOURNAL = {Open Research Europe},\nVOLUME = {4},\nYEAR = {2024},\nNUMBER = {240},\nDOI = {10.12688\/openreseurope.18339.1},\nURL = {https:\/\/open-research-europe.ec.europa.eu\/articles\/4-240\/v1},\nunit= {meca-ras},\nproject= {COURAGEOUS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, S. Lo Bue, and H. De Smet, &#8220;Quantitative Assessment of Drone Pilot Performance,\" <span style=\"font-style: italic\">Drones<\/span>, vol. 8, iss. 9, 2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_3\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_3\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.mdpi.com\/2504-446X\/8\/9\/482\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3390\/drones8090482' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_3_block\">\n<p>This paper introduces a quantitative methodology for assessing drone pilot performance, aiming to reduce drone-related incidents by understanding the human factors influencing performance. The challenge lies in balancing evaluations in operationally relevant environments with those in a standardized test environment for statistical relevance. The proposed methodology employs a novel virtual test environment that records not only basic flight metrics but also complex mission performance metrics, such as the video quality from a target. A group of Belgian Defence drone pilots were trained using this simulator system, yielding several practical results. 
These include a human-performance model linking human factors to pilot performance, an AI co-pilot providing real-time flight performance guidance, a tool for generating optimal flight trajectories, a mission planning tool for ideal pilot assignment, and a method for iterative training improvement based on quantitative input. The training results with real pilots demonstrate the methodology\u2019s effectiveness in evaluating pilot performance for complex military missions, suggesting its potential as a valuable addition to new pilot training programs.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_3_block\">\n<pre><code class=\"tex bibtex\">@Article{drones8090482,\nAUTHOR = {Doroftei, Daniela and De Cubber, Geert and Lo Bue, Salvatore and De Smet, Hans},\nTITLE = {Quantitative Assessment of Drone Pilot Performance},\nJOURNAL = {Drones},\nVOLUME = {8},\nYEAR = {2024},\nunit= {meca-ras},\nNUMBER = {9},\nARTICLE-NUMBER = {482},\nURL = {https:\/\/www.mdpi.com\/2504-446X\/8\/9\/482},\nISSN = {2504-446X},\nproject= {ALPHONSE},\nABSTRACT = {This paper introduces a quantitative methodology for assessing drone pilot performance, aiming to reduce drone-related incidents by understanding the human factors influencing performance. The challenge lies in balancing evaluations in operationally relevant environments with those in a standardized test environment for statistical relevance. The proposed methodology employs a novel virtual test environment that records not only basic flight metrics but also complex mission performance metrics, such as the video quality from a target. A group of Belgian Defence drone pilots were trained using this simulator system, yielding several practical results. 
These include a human-performance model linking human factors to pilot performance, an AI co-pilot providing real-time flight performance guidance, a tool for generating optimal flight trajectories, a mission planning tool for ideal pilot assignment, and a method for iterative training improvement based on quantitative input. The training results with real pilots demonstrate the methodology\u2019s effectiveness in evaluating pilot performance for complex military missions, suggesting its potential as a valuable addition to new pilot training programs.},\nDOI = {10.3390\/drones8090482}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2023<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, E. Le Fl\u00e9cher, A. La Grappe, E. Ghisoni, E. Maroulis, P. Ouendo, D. Hawari, and D. Doroftei, &#8220;Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case,\" in <span style=\"font-style: italic\">IEEE International Conference on Safety, Security, and Rescue Robotics<\/span>,  2023.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_4\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mecatron.rma.ac.be\/pub\/2023\/SSRR2023-DeCubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_4_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ssrr2023decubber,\ntitle={Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case},\nauthor={De Cubber, Geert and Le Fl\u00e9cher, Emile and La Grappe, Alexandre and Ghisoni, Enzo and Maroulis, Emmanouil and Ouendo, Pierre-Edouard and Hawari, Danial and Doroftei, Daniela},\nbooktitle={IEEE International Conference on Safety, Security, and Rescue Robotics},\neditors ={Kimura, Tetsuya},\npublisher = {IEEE},\nyear = {2023},\nvol = {1},\nproject = {AIDED, iMUGs, CUGS},\nlocation = {Fukushima, Japan},\nunit= {meca-ras},\ndoi = {},\nurl={https:\/\/mecatron.rma.ac.be\/pub\/2023\/SSRR2023-DeCubber.pdf}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, E. Le Fl\u00e9cher, A. Dominicus, and D. Doroftei, &#8220;Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario,\" in <span style=\"font-style: italic\">Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.<\/span>,  2023.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_5\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_5\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_5\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/http:\/\/doi.org\/10.54941\/ahfe1003746' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_5_block\">\n<p>Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives. Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kinds of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces. The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. 
Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve. The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning &#038; scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission. Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the field. The resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_5_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ahfe20203decubber,\ntitle={Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario},\nauthor={De Cubber, G. and Le Fl\u00e9cher, E. and Dominicus, A. and Doroftei, D.},\nbooktitle={Human Factors in Robots, Drones and Unmanned Systems. 
AHFE (2023) International Conference.},\neditors ={Tareq Ahram and Waldemar Karwowski},\npublisher = {AHFE Open Access, AHFE International, USA},\nyear = {2023},\nvol = {93},\nproject = {iMUGs},\nlocation = {San Francisco, USA},\nunit= {meca-ras},\ndoi = {http:\/\/doi.org\/10.54941\/ahfe1003746},\nurl={https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_5},\nabstract = {Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives. Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kind of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces.The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. 
Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve.The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning & scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission.Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the fieldThe resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and H. De Smet, &#8220;Human factors assessment for drone operations: towards a virtual drone co-pilot,\" in <span style=\"font-style: italic\">Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.<\/span>,  2023.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_6\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_6\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_6\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/http:\/\/doi.org\/10.54941\/ahfe1003747' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_6_block\">\n<p>As the number of drone operations increases, so does the risk of incidents with these novel, yet sometimes dangerous unmanned systems. Research has shown that over 70% of drone incidents are caused by human error, so in order to reduce the risk of incidents, the human factors related to the operation of the drone should be studied. However, this is not a trivial exercise, because on the one hand, a realistic operational environment is required (in order to study the human behaviour in realistic conditions), while on the other hand a standardised environment is required, such that repeatable experiments can be set up in order to ensure statistical relevance. In order to remedy this, within the scope of the ALPHONSE project, a realistic simulation environment was developed that is specifically geared towards the evaluation of human factors for military drone operations. Within the ALPHONSE simulator, military (and other) drone pilots can perform missions in realistic operational conditions. At the same time, they are subjected to a range of factors that can influence operator performance. These constitute both person-induced factors like pressure to achieve the set goals in time or people talking to the pilot and environment-induced stress factors like changing weather conditions. 
During the flight operation, the ALPHONSE simulator continuously monitors over 65 flight parameters. After the flight, an overall performance score is calculated, based upon the achievement of the mission objectives. Throughout the ALPHONSE trials, a wide range of pilots, from beginners to experts, has flown in the simulator. Using all the data recorded during these flights, three actions were performed: (i) an Artificial Intelligence (AI)-based classifier was trained to automatically recognize good and bad flight behaviour in real time, enabling a virtual co-pilot that can warn the pilot at any given moment when the pilot starts to exhibit behaviour that the classifier associates with inexperienced rather than proficient pilots; (ii) the human factors were identified and ranked according to their impact on flight performance, by linking the induced stress factors to the performance scores; and (iii) the training procedures were updated to take into consideration the human factors that impact flight performance, so that newly trained pilots are better aware of these influences. The objective of this paper is to present the complete ALPHONSE simulator system for the evaluation of human factors for drone operations and to present the results of the experiments with real military flight operators. The focus of the paper is on the elaboration of the design choices for the development of the AI-based classifier for real-time flight performance evaluation. The proposed development is highly significant, as it presents a concrete and cost-effective methodology for developing a virtual co-pilot for drone pilots that can render drone operations safer. 
Indeed, while the initial training of the AI model requires considerable computing resources, the implementation of the classifier can be readily integrated into commodity flight controllers to provide real-time alerts when pilots are manifesting undesired flight behaviours. The paper presents results of tests with drone pilots from Belgian Defence and civilian Belgian Defence researchers who have flown within the ALPHONSE simulator. These pilots first acted as data subjects to provide flight data to train the model and were later used to validate the model. The validation shows that the virtual co-pilot achieves high accuracy and can correctly identify bad flight profiles in real time in over 80% of cases.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_6_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ahfe20203doroftei,\ntitle={Human factors assessment for drone operations: towards a virtual drone co-pilot},\nauthor={Doroftei, D. and De Cubber, G. and De Smet, H.},\nbooktitle={Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.},\neditors ={Tareq Ahram and Waldemar Karwowski},\npublisher = {AHFE Open Access, AHFE International, USA},\nyear = {2023},\nvol = {93},\nproject = {Alphonse},\nlocation = {San Francisco, USA},\nunit= {meca-ras},\ndoi = {http:\/\/doi.org\/10.54941\/ahfe1003747},\nurl={https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_6},\nabstract = {As the number of drone operations increases, so does the risk of incidents with these novel, yet sometimes dangerous unmanned systems. Research has shown that over 70% of drone incidents are caused by human error, so in order to reduce the risk of incidents, the human factors related to the operation of the drone should be studied. 
However, this is not a trivial exercise, because on the one hand, a realistic operational environment is required (in order to study the human behaviour in realistic conditions), while on the other hand a standardised environment is required, such that repeatable experiments can be set up in order to ensure statistical relevance. In order to remedy this, within the scope of the ALPHONSE project, a realistic simulation environment was developed that is specifically geared towards the evaluation of human factors for military drone operations. Within the ALPHONSE simulator, military (and other) drone pilots can perform missions in realistic operational conditions. At the same time, they are subjected to a range of factors that can influence operator performance. These constitute both person-induced factors like pressure to achieve the set goals in time or people talking to the pilot and environment-induced stress factors like changing weather conditions. During the flight operation, the ALPHONSE simulator continuously monitors over 65 flight parameters. After the flight, an overall performance score is calculated, based upon the achievement of the mission objectives. Throughout the ALPHONSE trials, a wide range of pilots has flown in the simulator, ranging from beginner to expert pilots. Using all the data recorded during these flights, three actions are performed:-An Artificial Intelligence (AI) - based classifier was trained to automatically recognize in real time good and bad flight behaviour. 
This allows for the development of a virtual co-pilot that can warn the pilot at any given moment when the pilot is starting to exhibit behaviour that is recognized by the classifier to correspond mostly to the behaviour of inexperienced pilots and not to the behaviour of good pilots.-An identification and ranking of the human factors and their impact on the flight performance, by linking the induced stress factors to the performance scores-An update of the training procedures to take into consideration the human factors that impact flight performance, such that newly trained pilots are better aware of these influences.The objective of this paper is to present the complete ALPHONSE simulator system for the evaluation of human factors for drone operations and present the results of the experiments with real military flight operators. The focus of the paper will be on the elaboration of the design choices for the development of the AI - based classifier for real-time flight performance evaluation.The proposed development is highly significant, as it presents a concrete and cost-effective methodology for developing a virtual co-pilot for drone pilots that can render drone operations safer. Indeed, while the initial training of the AI model requires considerable computing resources, the implementation of the classifier can be readily integrated in commodity flight controllers to provide real-time alerts when pilots are manifesting undesired flight behaviours.The paper will present results of tests with drone pilots from Belgian Defence and civilian Belgian Defence researchers that have flown within the ALPHONSE simulator. These pilots have first acted as data subjects to provide flight data to train the model and have later been used to validate the model. 
The validation shows that the virtual co-pilot achieves a very high accuracy and can in over 80% of the cases correctly identify bad flight profiles in real-time.}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2022<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, G. De Cubber, and H. De Smet, &#8220;A quantitative measure for the evaluation of drone-based video quality on a target,\" in <span style=\"font-style: italic\">Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS)<\/span>, Venice, Italy,  2022.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_62\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_62\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.thinkmind.org\/articles\/icas_2022_1_40_20018.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/https:\/\/www.thinkmind.org\/index.php?view=article&#038;articleid=icas_2022_1_40_20018' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_62_block\">\n<p>This paper presents a methodology to assess video quality and based on that automatically calculate drone trajectories that optimize the video quality.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_62_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2022alphonse2,\nauthor = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},\nbooktitle = {Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS)},\ntitle = {A quantitative measure for the evaluation of drone-based video quality on a target},\nyear = {2022},\nmonth = jun,\norganization = {IARIA},\npublisher = {ThinkMind},\naddress = {Venice, Italy},\nurl = {https:\/\/www.thinkmind.org\/articles\/icas_2022_1_40_20018.pdf},\nisbn={978-1-61208-966-9},\ndoi 
= {https:\/\/www.thinkmind.org\/index.php?view=article&articleid=icas_2022_1_40_20018},\nabstract = {This paper presents a methodology to assess video quality and based on that automatically calculate drone trajectories that optimize the video quality.},\nproject = {Alphonse},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and H. De Smet, &#8220;Assessing Human Factors for Drone Operations in a Simulation Environment,\" in <span style=\"font-style: italic\">Human Factors in Robots, Drones and Unmanned Systems &#8211; AHFE (2022) International Conference<\/span>, New York, USA,  2022.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_63\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_63\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/openaccess-api.cms-conferences.org\/articles\/download\/978-1-958651-33-9_16\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/http:\/\/doi.org\/10.54941\/ahfe1002319' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_63_block\">\n<p>This paper presents an overview of the Alphonse methodology for Assessing Human Factors for Drone Operations in a Simulation Environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_63_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2022a,\nauthor = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},\nbooktitle = {Human Factors in Robots, Drones and Unmanned Systems - AHFE (2022) International Conference},\ntitle = {Assessing Human Factors for Drone Operations in a Simulation Environment},\nyear = {2022},\nmonth = jul,\nvolume = {57},\neditor = {Tareq Ahram and Waldemar Karwowski},\npublisher = {AHFE International},\naddress = {New York, USA},\nurl = 
{https:\/\/openaccess-api.cms-conferences.org\/articles\/download\/978-1-958651-33-9_16},\nabstract = {This paper presents an overview of the Alphonse methodology for Assessing Human Factors for Drone Operations in a Simulation Environment.},\ndoi = {http:\/\/doi.org\/10.54941\/ahfe1002319},\nproject = {Alphonse},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2021<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, T. De Vleeschauwer, S. L. Bue, M. Dewyn, F. Vanderstraeten, and G. De Cubber, &#8220;Human-Agent Trust Evaluation in a Digital Twin Context,\" in <span style=\"font-style: italic\">2021 30th IEEE International Conference on Robot Human Interactive Communication (RO-MAN)<\/span>, Vancouver, BC, Canada,  2021, pp. 203-207.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_60\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.researchgate.net\/profile\/Geert-De-Cubber\/publication\/354078858_Human-Agent_Trust_Evaluation_in_a_Digital_Twin_Context\/links\/61430bd22bfbd83a46cf2b8c\/Human-Agent-Trust-Evaluation-in-a-Digital-Twin-Context.pdf?_sg%5B0%5D=BdEPB9AGDUV3sOwnEQKCr-DgWRA7uDNeMlvyQYNaMPGSO2bhCDbyG4AENXXxH3j323ypYTq9nMftVbDr2fsCSA.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&#038;_sg%5B1%5D=ykQnQS2LN8fUQXAYx5Fpiy2NXqIwqO1UyVCENkpSUUWZn8Qqgrelh1bb4ry9Q9XPgCts7lVXU1_68YLjqnCPh4seSzWfG5BpKHc3MuFwsK6l.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&#038;_iepl=\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/RO-MAN50785.2021.9515445' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_60_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{9515445,\nauthor={Doroftei, Daniela and De Vleeschauwer, Tom and Bue, Salvatore Lo and Dewyn, Micha\u00ebl and 
Vanderstraeten, Frik and De Cubber, Geert},\nbooktitle={2021 30th IEEE International Conference on Robot Human Interactive Communication (RO-MAN)},\ntitle={Human-Agent Trust Evaluation in a Digital Twin Context},\nyear={2021},\nvolume={},\nnumber={},\npages={203-207},\nurl={https:\/\/www.researchgate.net\/profile\/Geert-De-Cubber\/publication\/354078858_Human-Agent_Trust_Evaluation_in_a_Digital_Twin_Context\/links\/61430bd22bfbd83a46cf2b8c\/Human-Agent-Trust-Evaluation-in-a-Digital-Twin-Context.pdf?_sg%5B0%5D=BdEPB9AGDUV3sOwnEQKCr-DgWRA7uDNeMlvyQYNaMPGSO2bhCDbyG4AENXXxH3j323ypYTq9nMftVbDr2fsCSA.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&_sg%5B1%5D=ykQnQS2LN8fUQXAYx5Fpiy2NXqIwqO1UyVCENkpSUUWZn8Qqgrelh1bb4ry9Q9XPgCts7lVXU1_68YLjqnCPh4seSzWfG5BpKHc3MuFwsK6l.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&_iepl=},\nproject={Alphonse},\npublisher={IEEE},\naddress={Vancouver, BC, Canada},\nmonth=aug,\ndoi={10.1109\/RO-MAN50785.2021.9515445},\nunit= {meca-ras}}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, R. Lahouli, D. Doroftei, and R. Haelterman, &#8220;Distributed coverage optimisation for a fleet of unmanned maritime systems,\" <span style=\"font-style: italic\">ACTA IMEKO<\/span>, vol. 10, iss. 3, pp. 36-43, 2021.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_61\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_61\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-10%20%282021%29-03-07\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/http:\/\/dx.doi.org\/10.21014\/acta_imeko.v10i3.1031' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_61_block\">\n<p>Unmanned maritime systems (UMS) can provide important benefits for maritime law enforcement agencies for tasks such as area surveillance and patrolling, especially when they are able to work together as one coordinated system. In this context, this paper proposes a methodology that optimises the coverage of a fleet of UMS, thereby maximising the opportunities for identifying threats. 
Unlike traditional approaches to maritime coverage optimisation, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small UMS, compared with traditional large ships, by incorporating the danger level into the design of the optimiser.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_61_block\">\n<pre><code class=\"tex bibtex\">@ARTICLE{cubberimeko2021,\nauthor={De Cubber, Geert and Lahouli, Rihab and Doroftei, Daniela and Haelterman, Rob},\njournal={ACTA IMEKO},\ntitle={Distributed coverage optimisation for a fleet of unmanned maritime systems},\nyear={2021},\nvolume={10},\nnumber={3},\npages={36-43},\nissn={2221-870X},\nurl={https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-10%20%282021%29-03-07\/pdf},\nproject={MarSur, SSAVE},\npublisher={IMEKO},\nmonth=oct,\nabstract = {Unmanned maritime systems (UMS) can provide important benefits for maritime law enforcement agencies for tasks such as area surveillance and patrolling, especially when they are able to work together as one coordinated system. In this context, this paper proposes a methodology that optimises the coverage of a fleet of UMS, thereby maximising the opportunities for identifying threats. Unlike traditional approaches to maritime coverage optimisation, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small UMS, compared with traditional large ships, by incorporating the danger level into the design of the optimiser. },\ndoi={http:\/\/dx.doi.org\/10.21014\/acta_imeko.v10i3.1031},\nunit= {meca-ras}}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2020<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, G. De Cubber, and H. 
De Smet, &#8220;Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process,\" in <span style=\"font-style: italic\">Advances in Human Factors in Robots, Drones and Unmanned Systems<\/span>, San Diego, USA,  2020, p. 71\u201377.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_58\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_58\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2020\/Reducing%20drone%20incidents%20by%20incorporating%20human%20factors%20in%20the%20drone%20and%20drone%20pilot%20accreditation%20process.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1007\/978-3-030-51758-8_10' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_58_block\">\n<p>Considering the ever-increasing use of drones in a multitude of application areas, the risk is that an ever-increasing number of drone incidents will also be observed. Research has shown that a large majority of all incidents with drones is due not to technological, but to human error. An advanced risk-reduction methodology, focusing on the human element, is thus required in order to allow for the safe use of drones. In this paper, we therefore introduce a novel concept to provide a qualitative and quantitative assessment of the performance of the drone operator. 
The proposed methodology is based on one hand upon the development of standardized test methodologies and on the other hand on human performance modeling of the drone operators in a highly realistic simulation environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_58_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2020alphonse,\nauthor = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},\nbooktitle = {Advances in Human Factors in Robots, Drones and Unmanned Systems},\ntitle = {Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process},\nyear = {2020},\nmonth = jul,\neditor = {Zallio, Matteo},\npublisher = {Springer International Publishing},\npages = {71--77},\nisbn = {978-3-030-51758-8},\norganization = {AHFE},\naddress = {San Diego, USA},\nabstract = {Considering the ever-increasing use of drones in a plentitude of application areas, the risk is that also an ever-increasing number of drone incidents would be ob-served. Research has shown that a large majority of all incidents with drones is due not to technological, but to human error. An advanced risk-reduction meth-odology, focusing on the human element, is thus required in order to allow for the safe use of drones. In this paper, we therefore introduce a novel concept to pro-vide a qualitative and quantitative assessment of the performance of the drone op-erator. The proposed methodology is based on one hand upon the development of standardized test methodologies and on the other hand on human performance modeling of the drone operators in a highly realistic simulation environment.},\ndoi = {10.1007\/978-3-030-51758-8_10},\nunit= {meca-ras},\nproject = {Alphonse},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2020\/Reducing%20drone%20incidents%20by%20incorporating%20human%20factors%20in%20the%20drone%20and%20drone%20pilot%20accreditation%20process.pdf},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. 
De Cubber, R. Lahouli, D. Doroftei, and R. Haelterman, &#8220;Distributed coverage optimization for a fleet of unmanned maritime systems for a maritime patrol and surveillance application,\" in <span style=\"font-style: italic\">ISMCR 2020: 23rd International Symposium on Measurement and Control in Robotics<\/span>, Budapest, Hungary,  2020.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_59\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_59\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2020\/conference_101719.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ISMCR51255.2020.9263740' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_59_block\">\n<p>In order for unmanned maritime systems to provide added value for maritime law enforcement agencies, they have to be able to work together as a coordinated team for tasks such as area surveillance and patrolling. Therefore, this paper proposes a methodology that optimizes the coverage of a fleet of unmanned maritime systems, and thereby maximizes the chances of noticing threats. 
Unlike traditional approaches for maritime coverage optimization, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small unmanned systems, as compared to traditional large ships, by incorporating the danger level in the design of the optimizer.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_59_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{decubber2020dco,\nauthor = {De Cubber, Geert and Lahouli, Rihab and Doroftei, Daniela and Haelterman, Rob},\nbooktitle = {ISMCR 2020: 23rd International Symposium on Measurement and Control in Robotics},\ntitle = {Distributed coverage optimization for a fleet of unmanned maritime systems for a maritime patrol and surveillance application},\nyear = {2020},\nmonth = oct,\norganization = {ISMCR},\npublisher = {{IEEE}},\nabstract = {In order for unmanned maritime systems to provide added value for maritime law enforcement agencies, they have to be able to work together as a coordinated team for tasks such as area surveillance and patrolling. Therefore, this paper proposes a methodology that optimizes the coverage of a fleet of unmanned maritime systems, and thereby maximizes the chances of noticing threats. 
Unlike traditional approaches for maritime coverage optimization, which are also used for example in search and rescue operations when searching for victims at sea, this approaches takes into consideration the limited seaworthiness of small unmanned systems, as compared to traditional large ships, by incorporating the danger level in the design of the optimizer.},\nproject = {SSAVE,MarSur},\naddress = {Budapest, Hungary},\ndoi = {10.1109\/ISMCR51255.2020.9263740},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2020\/conference_101719.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2019<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei and G. De Cubber, &#8220;Using a qualitative and quantitative validation methodology to evaluate a drone detection system,\" <span style=\"font-style: italic\">ACTA IMEKO<\/span>, vol. 8, iss. 4, p. 20\u201327, 2019.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_46\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_46\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08%20%282019%29-04-05\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.21014\/acta_imeko.v8i4.682' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_46_block\">\n<p>Now that the use of drones is becoming more common, the need to regulate the access to airspace for these systems is becoming more pressing. A necessary tool in order to do this is a means of detecting drones. Numerous parties have started the development of such drone detection systems. 
A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation that requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can make a huge difference to the performance of the systems. In order to provide a fair evaluation, it is therefore paramount to follow a validation procedure that finds a compromise between the requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want statistically relevant tests). Therefore, we propose in this article a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_46_block\">\n<pre><code class=\"tex bibtex\">@Article{doroftei2019using,\nauthor = {Doroftei, Daniela and De Cubber, Geert},\njournal = {{ACTA} {IMEKO}},\ntitle = {Using a qualitative and quantitative validation methodology to evaluate a drone detection system},\nyear = {2019},\nmonth = dec,\nnumber = {4},\npages = {20--27},\nvolume = {8},\nabstract = {Now that the use of drones is becoming more common, the need to regulate the access to airspace for these systems is becoming more pressing. A necessary tool in order to do this is a means of detecting drones. Numerous parties have started the development of such drone detection systems. 
A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation that requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation, it is therefore paramount that a validation procedure that finds a compromise between the requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want statistically relevant tests) is followed. Therefore, we propose in this article a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).},\ndoi = {10.21014\/acta_imeko.v8i4.682},\npdf = {https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08%20%282019%29-04-05\/pdf},\nproject = {SafeShore},\npublisher = {{IMEKO} International Measurement Confederation},\nurl = {https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08%20%282019%29-04-05\/pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and H. De Smet, &#8220;Evaluating Human Factors for Drone Operations using Simulations and Standardized Tests,\" in <span style=\"font-style: italic\">10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019)<\/span>, Washington DC, USA,  2019.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_57\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_57\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2019\/Poster_Alphonse_Print.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.3742199' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_57_block\">\n<p>This poster publication presents an overview of the Alphonse project on the development of new training curricula to reduce the number of drone incidents due to human error.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_57_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2019alphonse,\nauthor = {Doroftei, Daniela and De Smet, Han},\nbooktitle = {10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019)},\ntitle = {Evaluating Human Factors for Drone Operations using Simulations and Standardized Tests},\nyear = {2019},\nmonth = jul,\norganization = {AHFE},\npublisher = {Springer},\naddress = {Washington DC, USA},\nabstract = {This poster publication presents an overview of the Alphonse project on the development of new training curricula to reduce the number of drone incidents due to human error.},\ndoi = {10.5281\/zenodo.3742199},\nproject = {Alphonse},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2019\/Poster_Alphonse_Print.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2018<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    Y. Baudoin, D. Doroftei, G. de Cubber, J. Habumuremyi, H. Balta, and I. Doroftei, &#8220;Unmanned Ground and Aerial Robots Supporting Mine Action Activities,\" <span style=\"font-style: italic\">Journal of Physics: Conference Series<\/span>, vol. 1065, iss. 17, p. 
172009, 2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_39\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_39\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/iopscience.iop.org\/article\/10.1088\/1742-6596\/1065\/17\/172009\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1088\/1742-6596\/1065\/17\/172009' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_39_block\">\n<p>During humanitarian demining actions, teleoperation of sensors or multi-sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European-funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_39_block\">\n<pre><code class=\"tex bibtex\">@Article{baudoin2018unmanned,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and de Cubber, Geert and Habumuremyi, Jean-Claude and Balta, Haris and Doroftei, Ioan},\ntitle = {Unmanned Ground and Aerial Robots Supporting Mine Action Activities},\nyear = {2018},\nmonth = aug,\nnumber = {17},\norganization = {IOP Publishing},\npages = {172009},\npublisher = {{IOP} Publishing},\nvolume = {1065},\nabstract = {During the Humanitarian\u2010demining actions, teleoperation of sensors or multi\u2010sensor heads can enhance\u2010detection process by allowing more precise scanning, which is use\u2010 ful for the optimization of the signal processing algorithms. 
This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields},\ndoi = {10.1088\/1742-6596\/1065\/17\/172009},\njournal = {Journal of Physics: Conference Series},\nproject = {TIRAMISU},\nurl = {https:\/\/iopscience.iop.org\/article\/10.1088\/1742-6596\/1065\/17\/172009\/pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and G. De Cubber, &#8220;Qualitative and quantitative validation of drone detection systems,\" in <span style=\"font-style: italic\">International Symposium on Measurement and Control in Robotics ISMCR2018<\/span>, Mons, Belgium,  2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_44\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_44\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2018\/Paper_Daniela.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/ZENODO.1462586' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_44_block\">\n<p>As drones are more and more entering our world, so comes the need to regulate the access to airspace for these systems. A necessary tool in order to do this is a means of detecting these drones. Numerous commercial and non-commercial parties have started the development of such drone detection systems. 
A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation, which requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can make a huge difference in the performance of the systems. In order to provide a fair evaluation and an honest comparison between systems, it is therefore paramount that a stringent validation procedure is followed. Moreover, the validation methodology needs to find a compromise between the often contrasting requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want tests to be performed that are statistically relevant). Therefore, we propose in this paper a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_44_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2018qualitative,\nauthor = {Doroftei, Daniela and De Cubber, Geert},\nbooktitle = {International Symposium on Measurement and Control in Robotics ISMCR2018},\ntitle = {Qualitative and quantitative validation of drone detection systems},\nyear = {2018},\nvolume = {1},\nabstract = {As drones are more and more entering our world, so comes the need to regulate the access to airspace for these systems. A necessary tool in order to do this is a means of detecting these drones. Numerous commercial and non-commercial parties have started the development of such drone detection systems. 
A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation, which requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can make a huge difference in the performance of the systems. In order to provide a fair evaluation and an honest comparison between systems, it is therefore paramount that a stringent validation procedure is followed. Moreover, the validation methodology needs to find a compromise between the often contrasting requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want tests to be performed that are statistically relevant). Therefore, we propose in this paper a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).},\ndoi = {10.5281\/ZENODO.1462586},\nfile = {:doroftei2018qualitative - Qualitative and Quantitative Validation of Drone Detection Systems.PDF:PDF},\nkeywords = {Unmanned Aerial Vehicles, Drones, Detection systems, Drone detection, Test and evaluation methods},\nproject = {SafeShore},\naddress = {Mons, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2018\/Paper_Daniela.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2017<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. S. L\u00f3pez, G. Moreno, J. Cordero, J. Sanchez, S. Govindaraj, M. M. Marques, V. Lobo, S. Fioravanti, A. Grati, K. Rudin, M. Tosa, A. Matos, A. Dias, A. Martins, J. Bedkowski, H. Balta, and G. 
De Cubber, &#8220;Interoperability in a Heterogeneous Team of Search and Rescue Robots,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_34\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_34\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/interoperability-in-a-heterogeneous-team-of-search-and-rescue-robots\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69493' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_34_block\">\n<p>Search and rescue missions are complex operations. A disaster scenario is generally unstructured, time\u2010varying and unpredictable. This poses several challenges for the successful deployment of unmanned technology. The variety of operational scenarios and tasks leads to the need for multiple robots of different types, domains and sizes. A priori planning of the optimal set of assets to be deployed and the definition of their mission objectives are generally not feasible as information only becomes available during the mission. The ICARUS project responds to this challenge by developing a heterogeneous team composed of different and complementary robots, dynamically cooperating as an interoperable team. This chapter describes our approach to multi\u2010robot interoperability, understood as the ability of multiple robots to operate together, in synergy, enabling multiple teams to share data, intelligence and resources, which is the ultimate objective of the ICARUS project. 
It also includes the analysis of the relevant standardization initiatives in multi\u2010robot multi\u2010domain systems, our implementation of an interoperability framework and several examples of multi\u2010robot cooperation of the ICARUS robots in realistic search and rescue missions.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_34_block\">\n<pre><code class=\"tex bibtex\">@InBook{lopez2017interoperability,\nauthor = {Daniel Serrano L{'{o}}pez and German Moreno and Jose Cordero and Jose Sanchez and Shashank Govindaraj and Mario Monteiro Marques and Victor Lobo and Stefano Fioravanti and Alberto Grati and Konrad Rudin and Massimo Tosa and Anibal Matos and Andre Dias and Alfredo Martins and Janusz Bedkowski and Haris Balta and De Cubber, Geert},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 6},\npublisher = {{InTech}},\ntitle = {Interoperability in a Heterogeneous Team of Search and Rescue Robots},\nyear = {2017},\nmonth = aug,\nabstract = {Search and rescue missions are complex operations. A disaster scenario is generally unstructured, time\u2010varying and unpredictable. This poses several challenges for the successful deployment of unmanned technology. The variety of operational scenarios and tasks leads to the need for multiple robots of different types, domains and sizes. A priori planning of the optimal set of assets to be deployed and the definition of their mission objectives are generally not feasible as information only becomes available during the mission. The ICARUS project responds to this challenge by developing a heterogeneous team composed of different and complementary robots, dynamically cooperating as an interoperable team. This chapter describes our approach to multi\u2010robot interoperability, understood as the ability of multiple robots to operate together, in synergy, enabling multiple teams to share data, intelligence and resources, which is the ultimate objective of the ICARUS project. 
It also includes the analysis of the relevant standardization initiatives in multi\u2010robot multi\u2010domain systems, our implementation of an interoperability framework and several examples of multi\u2010robot cooperation of the ICARUS robots in realistic search and rescue missions.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69493},\nproject = {ICARUS},\nunit= {meca-ras},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/interoperability-in-a-heterogeneous-team-of-search-and-rescue-robots},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, H. Balta, A. Matos, E. Silva, D. Serrano, S. Govindaraj, R. Roda, V. Lobo, M. Marques, and R. Wagemans, &#8220;Operational Validation of Search and Rescue Robots,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_35\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_35\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/operational-validation-of-search-and-rescue-robots\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69497' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_35_block\">\n<p>This chapter describes how the different ICARUS unmanned search and rescue tools have been evaluated and validated using operational benchmarking techniques. Two large\u2010scale simulated disaster scenarios were organized: a simulated shipwreck and an earthquake response scenario. 
Next to these simulated response scenarios, where ICARUS tools were deployed in tight interaction with real end users, ICARUS tools also participated in a real relief operation, embedded in a team of end users for a flood response mission. These validation trials allow us to conclude that the ICARUS tools fulfil the user requirements and goals set up at the beginning of the project.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_35_block\">\n<pre><code class=\"tex bibtex\">@InBook{de2017operational,\nauthor = {De Cubber, Geert and Daniela Doroftei and Haris Balta and Anibal Matos and Eduardo Silva and Daniel Serrano and Shashank Govindaraj and Rui Roda and Victor Lobo and M{'{a}}rio Marques and Rene Wagemans},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 10},\npublisher = {{InTech}},\ntitle = {Operational Validation of Search and Rescue Robots},\nyear = {2017},\nmonth = aug,\nabstract = {This chapter describes how the different ICARUS unmanned search and rescue tools have been evaluated and validated using operational benchmarking techniques. Two large\u2010scale simulated disaster scenarios were organized: a simulated shipwreck and an earthquake response scenario. Next to these simulated response scenarios, where ICARUS tools were deployed in tight interaction with real end users, ICARUS tools also participated in a real relief operation, embedded in a team of end users for a flood response mission. 
These validation trials allow us to conclude that the ICARUS tools fulfil the user requirements and goals set up at the beginning of the project.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69497},\njournal = {Search and Rescue Robotics: From Theory to Practice},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/operational-validation-of-search-and-rescue-robots},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Berns, A. Nezhadfard, M. Tosa, H. Balta, and G. De Cubber, &#8220;Unmanned Ground Robots for Rescue Tasks,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_36\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_36\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/unmanned-ground-robots-for-rescue-tasks\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69491' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_36_block\">\n<p>This chapter describes two unmanned ground vehicles that can help search and rescue teams in their difficult, but life-saving tasks. These robotic assets have been developed within the framework of the European project ICARUS. The large unmanned ground vehicle is intended to be a mobile base station. It is equipped with a powerful manipulator arm and can be used for debris removal, shoring operations, and remote structural operations (cutting, welding, hammering, etc.) on very rough terrain. 
The smaller unmanned ground vehicle is also equipped with an array of sensors, enabling it to search for victims inside semi-destroyed buildings. Working together with each other and the human search and rescue workers, these robotic assets form a powerful team, increasing the effectiveness of search and rescue operations, as proven by operational validation tests in collaboration with end users.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_36_block\">\n<pre><code class=\"tex bibtex\">@InBook{berns2017unmanned,\nauthor = {Karsten Berns and Atabak Nezhadfard and Massimo Tosa and Haris Balta and De Cubber, Geert},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 4},\npublisher = {{InTech}},\ntitle = {Unmanned Ground Robots for Rescue Tasks},\nyear = {2017},\nmonth = aug,\nabstract = {This chapter describes two unmanned ground vehicles that can help search and rescue teams in their difficult, but life-saving tasks. These robotic assets have been developed within the framework of the European project ICARUS. The large unmanned ground vehicle is intended to be a mobile base station. It is equipped with a powerful manipulator arm and can be used for debris removal, shoring operations, and remote structural operations (cutting, welding, hammering, etc.) on very rough terrain. The smaller unmanned ground vehicle is also equipped with an array of sensors, enabling it to search for victims inside semi-destroyed buildings. 
Working together with each other and the human search and rescue workers, these robotic assets form a powerful team, increasing the effectiveness of search and rescue operations, as proven by operational validation tests in collaboration with end users.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69491},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/unmanned-ground-robots-for-rescue-tasks},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, R. Wagemans, A. Matos, E. Silva, V. Lobo, K. C. Guerreiro Cardoso, S. Govindaraj, J. Gancet, and D. Serrano, &#8220;User-centered design,\" , G. De Cubber and D. Doroftei, Eds., InTech, 2017, p. 19\u201336.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_37\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_37\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/user-centered-design\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69483' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_37_block\">\n<p>The successful introduction and acceptance of novel technological tools are only possible if end users are completely integrated in the design process. However, obtaining such integration of end users is not obvious, as end\u2010user organizations often do not consider research toward new technological aids as their core business and are therefore reluctant to engage in these kinds of activities. 
This chapter explains how this problem was tackled in the ICARUS project, by carefully identifying and approaching the targeted user communities and by compiling user requirements. Resulting from these user requirements, system requirements and a system architecture for the ICARUS system were deduced. An important aspect of the user\u2010centered design approach is that it is an iterative methodology, based on multiple intermediate operational validations by end users of the developed tools, leading to a final validation according to user\u2010scripted validation scenarios.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_37_block\">\n<pre><code class=\"tex bibtex\">@InBook{doroftei2017user,\nauthor = {Doroftei, Daniela and De Cubber, Geert and Wagemans, Rene and Matos, Anibal and Silva, Eduardo and Lobo, Victor and Guerreiro Cardoso, Keshav Chintamani and Govindaraj, Shashank and Gancet, Jeremi and Serrano, Daniel},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 2},\npages = {19--36},\npublisher = {{InTech}},\ntitle = {User-centered design},\nyear = {2017},\nabstract = {The successful introduction and acceptance of novel technological tools are only possible if end users are completely integrated in the design process. However, obtaining such integration of end users is not obvious, as end\u2010user organizations often do not consider research toward new technological aids as their core business and are therefore reluctant to engage in these kinds of activities. This chapter explains how this problem was tackled in the ICARUS project, by carefully identifying and approaching the targeted user communities and by compiling user requirements. Resulting from these user requirements, system requirements and a system architecture for the ICARUS system were deduced. 
An important aspect of the user\u2010centered design approach is that it is an iterative methodology, based on multiple intermediate operational validations by end users of the developed tools, leading to a final validation according to user\u2010scripted validation scenarios.},\ndoi = {10.5772\/intechopen.69483},\njournal = {Search and rescue robotics. From theory to practice. IntechOpen, London},\nproject = {ICARUS},\nunit= {meca-ras},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/user-centered-design},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. D. Cubber, D. Doroftei, K. Rudin, K. Berns, A. Matos, D. Serrano, J. Sanchez, S. Govindaraj, J. Bedkowski, R. Roda, E. Silva, and S. Ourevitch, &#8220;Introduction to the use of robotic tools for search and rescue,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_38\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_38\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/introduction-to-the-use-of-robotic-tools-for-search-and-rescue\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69489' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_38_block\">\n<p>Modern search and rescue workers are equipped with a powerful toolkit to address natural and man-made disasters. This introductory chapter explains how a new tool can be added to this toolkit: robots. 
The use of robotic assets in search and rescue operations is explained and an overview is given of the worldwide efforts to incorporate robotic tools in search and rescue operations. Furthermore, the European Union ICARUS project on this subject is introduced. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, such that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_38_block\">\n<pre><code class=\"tex bibtex\">@InBook{cubber2017introduction,\nauthor = {Geert De Cubber and Daniela Doroftei and Konrad Rudin and Karsten Berns and Anibal Matos and Daniel Serrano and Jose Sanchez and Shashank Govindaraj and Janusz Bedkowski and Rui Roda and Eduardo Silva and Stephane Ourevitch},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 1},\npublisher = {{InTech}},\ntitle = {Introduction to the use of robotic tools for search and rescue},\nyear = {2017},\nmonth = aug,\nabstract = {Modern search and rescue workers are equipped with a powerful toolkit to address natural and man-made disasters. This introductory chapter explains how a new tool can be added to this toolkit: robots. The use of robotic assets in search and rescue operations is explained and an overview is given of the worldwide efforts to incorporate robotic tools in search and rescue operations. 
Furthermore, the European Union ICARUS project on this subject is introduced. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, such that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69489},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/introduction-to-the-use-of-robotic-tools-for-search-and-rescue},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. D. Cubber, D. Doroftei, K. Rudin, K. Berns, A. Matos, D. Serrano, J. M. Sanchez, S. Govindaraj, J. Bedkowski, R. Roda, E. Silva, S. Ourevitch, R. Wagemans, V. Lobo, G. Cardoso, K. Chintamani, J. Gancet, P. Stupler, A. Nezhadfard, M. Tosa, H. Balta, J. Almeida, A. Martins, H. Ferreira, B. Ferreira, J. Alves, A. Dias, S. Fioravanti, D. Bertin, G. Moreno, J. Cordero, M. M. Marques, A. Grati, H. M. Chaudhary, B. Sheers, Y. Riobo, P. Letier, M. N. Jimenez, M. A. Esbri, P. Musialik, I. Badiola, R. Goncalves, A. Coelho, T. Pfister, K. Majek, M. Pelka, A. Maslowski, and R. Baptista, <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_41\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_41\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.68449' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_41_block\">\n<p>In the event of large crises (earthquakes, typhoons, floods, &#8230;), a primordial task of the fire and rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which &#8211; too often &#8211; leads to loss of lives among the human crisis managers themselves. This book explains how unmanned search can be added to the toolkit of the search and rescue workers, offering a valuable tool to save human lives and to speed up the search and rescue process. The introduction of robotic tools in the world of search and rescue is not straightforward, due to the fact that the search and rescue context is extremely technology-unfriendly, meaning that very robust solutions, which can be deployed extremely quickly, are required. Multiple research projects across the world are tackling this problem and in this book, a special focus is placed on showcasing the results of the European Union ICARUS project on this subject. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, so that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. 
The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them in order to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_41_block\">\n<pre><code class=\"tex bibtex\">@Book{de2017search,\nauthor = {Geert De Cubber and Daniela Doroftei and Konrad Rudin and Karsten Berns and Anibal Matos and Daniel Serrano and Jose Manuel Sanchez and Shashank Govindaraj and Janusz Bedkowski and Rui Roda and Eduardo Silva and Stephane Ourevitch and Rene Wagemans and Victor Lobo and Guerreiro Cardoso and Keshav Chintamani and Jeremi Gancet and Pascal Stupler and Atabak Nezhadfard and Massimo Tosa and Haris Balta and Jose Almeida and Alfredo Martins and Hugo Ferreira and Bruno Ferreira and Jose Alves and Andre Dias and Stefano Fioravanti and Daniele Bertin and German Moreno and Jose Cordero and Mario Monteiro Marques and Alberto Grati and Hafeez M. Chaudhary and Bart Sheers and Yudani Riobo and Pierre Letier and Mario Nunez Jimenez and Miguel Angel Esbri and Pawel Musialik and Irune Badiola and Ricardo Goncalves and Antonio Coelho and Thomas Pfister and Karol Majek and Michal Pelka and Andrzej Maslowski and Ricardo Baptista},\neditor = {De Cubber, Geert and Doroftei, Daniela},\npublisher = {{InTech}},\ntitle = {Search and Rescue Robotics - From Theory to Practice},\nyear = {2017},\nmonth = aug,\nabstract = {In the event of large crises (earthquakes, typhoons, floods, ...), a primordial task of the fire and rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which - too often - leads to loss of lives among the human crisis managers themselves. 
This book explains how unmanned search can be added to the toolkit of the search and rescue workers, offering a valuable tool to save human lives and to speed up the search and rescue process. The introduction of robotic tools in the world of search and rescue is not straightforward, due to the fact that the search and rescue context is extremely technology-unfriendly, meaning that very robust solutions, which can be deployed extremely quickly, are required. Multiple research projects across the world are tackling this problem and in this book, a special focus is placed on showcasing the results of the European Union ICARUS project on this subject. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, so that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them in order to learn to use the ICARUS system.},\ndoi = {10.5772\/intechopen.68449},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, J. Habumuremyi, H. Balta, and I. Doroftei, &#8220;Unmanned Ground and Aerial Robots Supporting Mine Action Activities,\" in <span style=\"font-style: italic\">Mine Action &#8211; The Research Experience of the Royal Military Academy of Belgium<\/span>, C. Beumier, D. Closson, V. Lacroix, N. Milisavljevic, and Y. 
Yvinec, Eds., InTech, 2017, vol. 1.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_43\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_43\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/mine-action-the-research-experience-of-the-royal-military-academy-of-belgium\/unmanned-ground-and-aerial-robots-supporting-mine-action-activities\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/65783' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_43_block\">\n<p>During humanitarian demining actions, teleoperation of sensors or multi\u2010sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_43_block\">\n<pre><code class=\"tex bibtex\">@InBook{baudoin2017unmanned,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Habumuremyi, Jean-Claude and Balta, Haris and Doroftei, Ioan},\neditor = {Beumier, Charles and Closson, Damien and Lacroix, Vincianne and Milisavljevic, Nada and Yvinec, Yann},\nchapter = {Chapter 9},\npublisher = {{InTech}},\ntitle = {Unmanned Ground and Aerial Robots Supporting Mine Action Activities},\nyear = {2017},\nmonth = aug,\nvolume = {1},\nabstract = {During humanitarian demining actions, teleoperation of sensors or multi\u2010sensor heads can 
enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.},\nbooktitle = {Mine Action - The Research Experience of the Royal Military Academy of Belgium},\ndoi = {10.5772\/65783},\nproject = {TIRAMISU},\nurl = {https:\/\/www.intechopen.com\/books\/mine-action-the-research-experience-of-the-royal-military-academy-of-belgium\/unmanned-ground-and-aerial-robots-supporting-mine-action-activities},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2015<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, A. Matos, E. Silva, V. Lobo, R. Wagemans, and G. De Cubber, &#8220;Operational validation of robots for risky environments,\" in <span style=\"font-style: italic\">8th IARP Workshop on Robotics for Risky Environments<\/span>, Lisbon, Portugal,  2015.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_31\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_31\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/Operational validation of robots for risky environments.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_31_block\">\n<p>This paper presents an operational test and validation approach for the evaluation of the performance of a range of marine, aerial and ground search and rescue robots. 
The proposed approach seeks to find a compromise between the traditional rigorous standardized approaches and the open-ended robot competitions. Operational scenarios are defined, including a performance assessment of individual robots but also collective operations where heterogeneous robots cooperate together and with manned teams in search and rescue activities. That way, it is possible to perform a more complete validation of the use of robotic tools in challenging real world scenarios.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_31_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2015operational,\nauthor = {Doroftei, Daniela and Matos, Anibal and Silva, Eduardo and Lobo, Victor and Wagemans, Rene and De Cubber, Geert},\nbooktitle = {8th IARP Workshop on Robotics for Risky Environments},\ntitle = {Operational validation of robots for risky environments},\nyear = {2015},\nabstract = {This paper presents an operational test and validation approach for the evaluation of the performance of a range of marine, aerial and ground search and rescue robots. The proposed approach seeks to find a compromise between the traditional rigorous standardized approaches and the open-ended robot competitions. Operational scenarios are defined, including a performance assessment of individual robots but also collective operations where heterogeneous robots cooperate together and with manned teams in search and rescue activities. That way, it is possible to perform a more complete validation of the use of robotic tools in challenging real world scenarios.},\nproject = {ICARUS},\naddress = {Lisbon, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/Operational validation of robots for risky environments.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, G. De Cubber, Y. Baudoin, and D. 
Doroftei, &#8220;UAS deployment and data processing during the Balkans flooding with the support to Mine Action,\" in <span style=\"font-style: italic\">8th IARP Workshop on Robotics for Risky Environments<\/span>, Lisbon, Portugal,  2015.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_32\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_32\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/RISE_2015_Haris_Balta_RMA.PDF\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_32_block\">\n<p>In this paper, we provide a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. The destructive impact of landslides, sediment torrents and floods on the mine fields and the change of mine action situation resulted in significant negative environmental and security consequences. 
Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_32_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2015uas,\nauthor = {Balta, Haris and De Cubber, Geert and Baudoin, Yvan and Doroftei, Daniela},\nbooktitle = {8th IARP Workshop on Robotics for Risky Environments},\ntitle = {{UAS} deployment and data processing during the {Balkans} flooding with the support to Mine Action},\nyear = {2015},\nabstract = {In this paper, we provide a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. The destructive impact of landslides, sediment torrents and floods on the mine fields and the change of mine action situation resulted in significant negative environmental and security consequences. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.},\nproject = {ICARUS},\naddress = {Lisbon, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/RISE_2015_Haris_Balta_RMA.PDF},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2014<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, A. Matos, and G. 
De Cubber, &#8220;Designing Search and Rescue Robots towards Realistic User Requirements,\" in <span style=\"font-style: italic\">Advanced Concepts on Mechanical Engineering (ACME)<\/span>, Iasi, Romania,  2014.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_28\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_28\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2014\/Designing Search and Rescue robots towards realistic user requirements - full article -v3.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.4028\/www.scientific.net\/amm.658.612' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_28_block\">\n<p>In the event of a large crisis (think about typhoon Haiyan or the Tohoku earthquake and tsunami in Japan), a primordial task of the rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which often leads to loss of lives among the human crisis managers themselves. The introduction of unmanned search and rescue devices can offer a valuable tool to save human lives and to speed up the search and rescue process. In this context, the EU-FP7-ICARUS project [1] concentrates on the development of unmanned search and rescue technologies for detecting, locating and rescuing humans. The complex nature and difficult operating conditions of search and rescue operations pose heavy constraints on the mechanical design of the unmanned platforms. In this paper, we discuss the different user requirements which have an impact on the design of the mechanical systems (air, ground and marine robots). 
We show how these user requirements are obtained, how they are validated, how they lead to design specifications for operational prototypes which are tested in realistic operational conditions and we show how the final mechanical design specifications are derived from these different steps. An important aspect of all these design steps which is emphasized in this paper is to always keep the end-users in the loop in order to come to realistic requirements and specifications, ensuring the practical deployability [2] of the developed platforms.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_28_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2014designing,\nauthor = {Doroftei, Daniela and Matos, Anibal and De Cubber, Geert},\nbooktitle = {Advanced Concepts on Mechanical Engineering (ACME)},\ntitle = {Designing Search and Rescue Robots towards Realistic User Requirements},\nyear = {2014},\nabstract = {In the event of a large crisis (think about typhoon Haiyan or the Tohoku earthquake and tsunami in Japan), a primordial task of the rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which often leads to loss of lives among the human crisis managers themselves. The introduction of unmanned search and rescue devices can\noffer a valuable tool to save human lives and to speed up the search and rescue process. In this context, the EU-FP7-ICARUS project [1] concentrates on the development of unmanned search and rescue technologies for detecting, locating and rescuing humans. The complex nature and difficult operating conditions of search and rescue operations pose heavy constraints on the mechanical design of the unmanned platforms. In this paper, we discuss the different user requirements which have an impact on the design of the mechanical systems (air, ground and marine robots). 
We show how these user requirements are obtained, how they are validated, how they lead to design specifications for operational prototypes which are tested in realistic operational conditions and we show how the final mechanical design specifications are derived from these different steps. An important aspect of all these design steps which is emphasized in this paper is to always keep the end-users in the loop in order to come to realistic requirements and specifications, ensuring the practical deployability [2] of the developed platforms.},\ndoi = {10.4028\/www.scientific.net\/amm.658.612},\nproject = {ICARUS},\naddress = {Iasi, Romania},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2014\/Designing Search and Rescue robots towards realistic user requirements - full article -v3.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, H. Balta, D. Doroftei, and Y. Baudoin, &#8220;UAS deployment and data processing during the Balkans flooding,\" in <span style=\"font-style: italic\">2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)<\/span>, Toyako-cho, Hokkaido, Japan,  2014, p. 1\u20134.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_29\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_29\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2014\/SSRR2014_proj_037.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2014.7017670' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_29_block\">\n<p>This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. 
An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_29_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2014uas,\nauthor = {De Cubber, Geert and Balta, Haris and Doroftei, Daniela and Baudoin, Yvan},\nbooktitle = {2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)},\ntitle = {{UAS} deployment and data processing during the Balkans flooding},\nyear = {2014},\norganization = {IEEE},\npages = {1--4},\nabstract = {This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.},\ndoi = {10.1109\/ssrr.2014.7017670},\nproject = {ICARUS},\naddress = {Toyako-cho, Hokkaido, Japan},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2014\/SSRR2014_proj_037.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2013<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    H. Balta, G. De Cubber, D. Doroftei, Y. 
Baudoin, and H. Sahli, &#8220;Terrain traversability analysis for off-road robots using time-of-flight 3d sensing,\" in <span style=\"font-style: italic\">7th IARP International Workshop on Robotics for Risky Environment-Extreme Robotics<\/span>, Saint-Petersburg, Russia,  2013.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_25\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_25\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/Terrain Traversability Analysis ver 4-HS.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_25_block\">\n<p>In this paper we present a terrain traversability analysis methodology which classifies all image pixels in the TOF image as traversable or not, by estimating for each pixel a traversability score which is based upon the analysis of the 3D (depth data) and 2D (IR data) content of the TOF camera data. This classification result is then used for the (semi) \u2013 autonomous navigation of two robotic systems, operating in extreme environments: a search and rescue robot and a humanitarian demining robot. 
Integrated in autonomous robot control architecture, terrain traversability classification increases the environmental situational awareness and enables a mobile robot to navigate (semi) \u2013 autonomously in an unstructured dynamical outdoor environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_25_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2013terrain,\nauthor = {Balta, Haris and De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Sahli, Hichem},\nbooktitle = {7th IARP International Workshop on Robotics for Risky Environment-Extreme Robotics},\ntitle = {Terrain traversability analysis for off-road robots using time-of-flight 3d sensing},\nyear = {2013},\nabstract = {In this paper we present a terrain traversability analysis methodology which classifies all image pixels in the TOF image as traversable or not, by estimating for each pixel a traversability score which is based upon the analysis of the 3D (depth data) and 2D (IR data) content of the TOF camera data. This classification result is then used for the (semi) \u2013 autonomous navigation of two robotic systems, operating in extreme environments: a search and rescue robot and a humanitarian demining robot. Integrated in autonomous robot control architecture, terrain traversability classification increases the environmental situational awareness and enables a mobile robot to navigate (semi) \u2013 autonomously in an unstructured dynamical outdoor environment.},\nproject = {ICARUS},\naddress = {Saint-Petersburg, Russia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/Terrain Traversability Analysis ver 4-HS.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, D. Serrano, K. Chintamani, R. Sabino, and S. 
Ourevitch, &#8220;The EU-ICARUS project: developing assistive robotic tools for search and rescue operations,\" in <span style=\"font-style: italic\">2013 IEEE international symposium on safety, security, and rescue robotics (SSRR)<\/span>, Linkoping, Sweden,  2013, p. 1\u20134.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_26\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_26\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/SSRR2013_ICARUS.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2013.6719323' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_26_block\">\n<p>The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but lifesaving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad-hoc cognitive radio networking. 
To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I (command, control, communications, computers, and intelligence) equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_26_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2013eu,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},\nbooktitle = {2013 IEEE international symposium on safety, security, and rescue robotics (SSRR)},\ntitle = {The {EU-ICARUS} project: developing assistive robotic tools for search and rescue operations},\nyear = {2013},\norganization = {IEEE},\npages = {1--4},\naddress = {Linkoping, Sweden},\nabstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but lifesaving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad-hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I (command, control, communications, computers, and intelligence) equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\ndoi = {10.1109\/ssrr.2013.6719323},\nproject = {ICARUS},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/SSRR2013_ICARUS.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, G. De Cubber, and D. 
Doroftei, &#8220;Increasing Situational Awareness through Outdoor Robot Terrain Traversability Analysis based on Time-Of-Flight Camera,\" in <span style=\"font-style: italic\">Spring School on Developmental Robotics and Cognitive Bootstrapping<\/span>, Athens, Greece, 2013, p. 8.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_27\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_27\" class=\"papercite_toggle\">[Abstract]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_27_block\">\n<p>Poster paper<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_27_block\">\n<pre><code class=\"tex bibtex\">@InCollection{balta2013increasing,\nauthor = {Balta, Haris and De Cubber, Geert and Doroftei, Daniela},\nbooktitle = {Spring School on Developmental Robotics and Cognitive Bootstrapping},\ntitle = {Increasing Situational Awareness through Outdoor Robot Terrain Traversability Analysis based on Time-Of-Flight Camera},\nyear = {2013},\nnumber = {Developmental Robotics and Cognitive Bootstrapping},\npages = {8},\nabstract = {Poster paper},\naddress = {Athens, Greece},\nproject = {ICARUS},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Serrano, K. Berns, K. Chintamani, R. Sabino, S. Ourevitch, D. Doroftei, C. Armbrust, T. Flamma, and Y. Baudoin, &#8220;Search and rescue robots developed by the European Icarus project,\" in <span style=\"font-style: italic\">7th Int Workshop on Robotics for Risky Environments<\/span>, Saint-Petersburg, Russia,  2013.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_42\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_42\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/Search and Rescue robots developed by the European ICARUS project - Article.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_42_block\">\n<p>This paper discusses the efforts of the European ICARUS project towards the development of unmanned search and rescue (SAR) robots. ICARUS project proposes to equip first responders with a comprehensive and integrated set of remotely operated SAR tools, to increase the situational awareness of human crisis managers. In the event of large crises, a primordial task of the fire and rescue services is the search for human survivors on the incident site, which is a complex and dangerous task. The introduction of remotely operated SAR devices can offer a valuable tool to save human lives and to speed up the SAR process. Therefore, ICARUS concentrates on the development of unmanned SAR technologies for detecting, locating and rescuing humans. The remotely operated SAR devices are foreseen to be the first explorers of the area, along with in-situ supporters to act as safeguards to human personnel. While the ICARUS project also considers the development of marine and aerial robots, this paper will mostly concentrate on the development of the unmanned ground vehicles (UGVs) for SAR. Two main UGV platforms are being developed within the context of the project: a large UGV including a powerful arm for manipulation, which is able to make structural changes in disaster scenarios. The large UGV also serves as a base platform for a small UGV (and possibly also a UAV), which is used for entering small enclosures, while searching for human survivors. 
In order not to increase the cognitive load of the human crisis managers, the SAR robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station, being able to navigate in an autonomous and semi-autonomous manner. The robots connect to the base station and to each other using a wireless self-organizing cognitive network of mobile communication nodes which adapts to the terrain. The SAR robots are equipped with sensors that detect the presence of humans and will also be equipped with a wide array of other types of sensors. At the base station, the data is processed and combined with geographical information, thus enhancing the situational awareness of the personnel leading the operation with in-situ processed data that can improve decision-making.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_42_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2013search,\nauthor = {De Cubber, Geert and Serrano, Daniel and Berns, Karsten and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane and Doroftei, Daniela and Armbrust, Christopher and Flamma, Tommasso and Baudoin, Yvan},\nbooktitle = {7th Int Workshop on Robotics for Risky Environments},\ntitle = {Search and rescue robots developed by the {European} {Icarus} project},\nyear = {2013},\nabstract = {This paper discusses the efforts of the European ICARUS project towards the development of unmanned search and rescue (SAR) robots. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of remotely operated SAR tools, to increase the situational awareness of human crisis managers. In the event of large crises, a primordial task of the fire and rescue services is the search for human survivors on the incident site, which is a complex and dangerous task. The introduction of remotely operated SAR devices can offer a valuable tool to save human lives and to speed up the SAR process. 
Therefore, ICARUS concentrates on the development of unmanned SAR technologies for detecting, locating and rescuing humans. The remotely operated SAR devices are foreseen to be the first explorers of the area, along with in-situ supporters to act as safeguards to human personnel. While the ICARUS project also considers the development of marine and aerial robots, this paper will mostly concentrate on the development of the unmanned ground vehicles (UGVs) for SAR. Two main UGV platforms are being developed within the context of the project: a large UGV including a powerful arm for manipulation, which is able to make structural changes in disaster scenarios. The large UGV also serves as a base platform for a small UGV (and possibly also a UAV), which is used for entering small enclosures, while searching for human survivors. In order not to increase the cognitive load of the human crisis managers, the SAR robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station, being able to navigate in an autonomous and semi-autonomous manner. The robots connect to the base station and to each other using a wireless self-organizing cognitive network of mobile communication nodes which adapts to the terrain. The SAR robots are equipped with sensors that detect the presence of humans and will also be equipped with a wide array of other types of sensors. 
At the base station, the data is processed and\ncombined with geographical information, thus enhancing the situational awareness of the personnel leading the operation with in-situ processed data that can improve decision-making.},\nproject = {ICARUS},\naddress = {Saint-Petersburg, Russia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/Search and Rescue robots developed by the European ICARUS project - Article.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2012<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;ICARUS : Providing Unmanned Search and Rescue Tools,\" in <span style=\"font-style: italic\">6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)<\/span>, Warsaw, Poland,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_20\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_20\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/RISE2012_ICARUS.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_20_block\">\n<p>The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. 
To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_20_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2012icarus01,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},\nbooktitle = {6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)},\ntitle = {{ICARUS} : Providing Unmanned Search and Rescue Tools},\nyear = {2012},\nabstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\nproject = {ICARUS},\naddress = {Warsaw, Poland},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/RISE2012_ICARUS.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and K. Chintamani, &#8220;Towards collaborative human and robotic rescue workers,\" in <span style=\"font-style: italic\">5th International Workshop on Human-Friendly Robotics (HFR2012)<\/span>, Brussels, Belgium,  2012, p. 18\u201319.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_22\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_22\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/citeseerx.ist.psu.edu\/viewdoc\/download?doi=10.1.1.303.6697&#038;rep=rep1&#038;type=pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_22_block\">\n<p>This paper discusses some of the main remaining bottlenecks towards the successful introduction of robotic search and rescue (SAR) tools, collaborating with human rescue workers. It also sketches some of the recent advances which are being made to in the context of the European ICARUS project to get rid of these bottlenecks.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_22_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2012towards,\nauthor = {Doroftei, Daniela and De Cubber, Geert and Chintamani, Keshav},\nbooktitle = {5th International Workshop on Human-Friendly Robotics (HFR2012)},\ntitle = {Towards collaborative human and robotic rescue workers},\nyear = {2012},\npages = {18--19},\nabstract = {This paper discusses some of the main remaining bottlenecks towards the successful introduction of robotic search and rescue (SAR) tools, collaborating with human rescue workers. It also sketches some of the recent advances which are being made to in the context of the European ICARUS project to get rid of these bottlenecks.},\nproject = {ICARUS},\naddress = {Brussels, Belgium},\nurl = {http:\/\/citeseerx.ist.psu.edu\/viewdoc\/download?doi=10.1.1.303.6697&rep=rep1&type=pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Conduraru, I. Conduraru, E. Puscalau, G. De Cubber, D. Doroftei, and H. 
Balta, &#8220;Development of an autonomous rough-terrain robot,\" in <span style=\"font-style: italic\">IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN&#8217;12)<\/span>, Villamoura, Portugal,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_23\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_23\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/pdfs.semanticscholar.org\/884e\/6a80c8768044a1fd68ee91f45f17e5125153.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_23_block\">\n<p>In this paper, we discuss the development process of a mobile robot intended for environmental observation applications. The paper describes how a standard tele-operated Explosive Ordnance Disposal (EOD) robot was upgraded with electronics, sensors, computing power and autonomous capabilities, such that it becomes able to execute semi-autonomous missions, e.g. for search &#038; rescue or humanitarian demining tasks. The aim of this paper is not to discuss the details of the navigation algorithms (as these are often task-dependent), but rather to concentrate on the development of the platform and its control architecture as a whole.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_23_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{conduraru2012development,\nauthor = {Conduraru, Alina and Conduraru, Ionel and Puscalau, Emanuel and De Cubber, Geert and Doroftei, Daniela and Balta, Haris},\nbooktitle = {IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN'12)},\ntitle = {Development of an autonomous rough-terrain robot},\nyear = {2012},\nabstract = {In this paper, we discuss the development process of a mobile robot intended for environmental observation applications. 
The paper describes how a standard tele-operated Explosive Ordnance Disposal (EOD) robot was upgraded with electronics, sensors, computing power and autonomous capabilities, such that it becomes able to execute semi-autonomous missions, e.g. for search & rescue or humanitarian demining tasks. The aim of this paper is not to discuss the details of the navigation algorithms (as these are often task-dependent), but rather to concentrate on the development of the platform and its control architecture as a whole.},\nproject = {ICARUS},\naddress = {Villamoura, Portugal},\nurl = {https:\/\/pdfs.semanticscholar.org\/884e\/6a80c8768044a1fd68ee91f45f17e5125153.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;Operational RPAS scenarios envisaged for search &#038; rescue by the EU FP7 ICARUS project,\" in <span style=\"font-style: italic\">Remotely Piloted Aircraft Systems for Civil Operations (RPAS2012)<\/span>, Brussels, Belgium,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_24\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_24\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/De-Cubber-Geert_RMA_Belgium_WP.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_24_block\">\n<p>The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. 
To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_24_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2012operational,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},\nbooktitle = {Remotely Piloted Aircraft Systems for Civil Operations (RPAS2012)},\ntitle = {Operational {RPAS} scenarios envisaged for search & rescue by the {EU FP7 ICARUS} project},\nyear = {2012},\nabstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\nproject = {ICARUS},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/De-Cubber-Geert_RMA_Belgium_WP.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;ICARUS: AN EU-FP7 PROJECT PROVIDING UNMANNED SEARCH AND RESCUE TOOLS,\" in <span style=\"font-style: italic\">IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN&#8217;12)<\/span>, Villamoura, Portugal,  2012.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_30\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_30\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/Icarus - ROSIN2012 Presentation.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_30_block\">\n<p>Overview of the objectives of the ICARUS project<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_30_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2012icarus02,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Y and Serrano, D and Chintamani, K and Sabino, R and Ourevitch, S},\nbooktitle = {IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN'12)},\ntitle = {{ICARUS}: AN {EU-FP7} PROJECT PROVIDING UNMANNED SEARCH AND RESCUE TOOLS},\nyear = {2012},\nabstract = {Overview of the objectives of the ICARUS project},\nproject = {ICARUS},\naddress = {Villamoura, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/Icarus - ROSIN2012 Presentation.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2011<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, D. Doroftei, H. Sahli, and Y. Baudoin, &#8220;Outdoor Terrain Traversability Analysis for Robot Navigation using a Time-Of-Flight Camera,\" in <span style=\"font-style: italic\">RGB-D Workshop on 3D Perception in Robotics<\/span>, Vasteras, Sweden,  2011.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_15\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_15\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/TTA_TOF.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_15_block\">\n<p>Autonomous robotic systems operating in unstructured outdoor environments need to estimate the traversabilityof the terrain in order to navigate safely. Traversability estimation is a challenging problem, as the traversability is a complex function of both the terrain characteristics, such as slopes, vegetation, rocks, etc and the robot mobility characteristics, i.e. locomotion method, wheels, etc. It is thus required to analyze in real-time the 3D characteristics of the terrain and pair this data to the robot capabilities. In this paper, a method is introduced to estimate the traversability using data from a time-of-flight camera.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_15_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2011outdoor,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Sahli, Hichem and Baudoin, Yvan},\nbooktitle = {RGB-D Workshop on 3D Perception in Robotics},\ntitle = {Outdoor Terrain Traversability Analysis for Robot Navigation using a Time-Of-Flight Camera},\nyear = {2011},\nabstract = {Autonomous robotic systems operating in unstructured outdoor environments need to estimate the traversabilityof the terrain in order to navigate safely. Traversability estimation is a challenging problem, as the traversability is a complex function of both the terrain characteristics, such as slopes, vegetation, rocks, etc and the robot mobility characteristics, i.e. locomotion method, wheels, etc. 
It is thus required to analyze in real-time the 3D characteristics of the terrain and pair this data to the robot capabilities. In this paper, a method is introduced to estimate the traversability using data from a time-of-flight camera.},\nproject = {ViewFinder, Mobiniss},\naddress = {Vasteras, Sweden},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/TTA_TOF.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and D. Doroftei, &#8220;Multimodal terrain analysis for an all-terrain crisis Management Robot,\" in <span style=\"font-style: italic\">IARP HUDEM 2011<\/span>, Sibenik, Croatia,  2011.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_16\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_16\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/Multimodal terrain analysis for an all-terrain crisis management robot .pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_16_block\">\n<p>In this paper, a novel stereo-based terrain-traversability estimation methodology is proposed. The novelty is that \u2013 contrary to classic depth-based terrain classification algorithms \u2013 all the information of the stereo camera system is used, including the color information. 
Using this approach, depth and color information are fused in order to obtain a higher classification accuracy than is possible with uni-modal techniques.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_16_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2011multimodal,\nauthor = {De Cubber, Geert and Doroftei, Daniela},\nbooktitle = {IARP HUDEM 2011},\ntitle = {Multimodal terrain analysis for an all-terrain crisis Management Robot},\nyear = {2011},\nabstract = {In this paper, a novel stereo-based terrain-traversability estimation methodology is proposed. The novelty is that \u2013 contrary to classic depth-based terrain classification algorithms \u2013 all the information of the stereo camera system is used, including the color information. Using this approach, depth and color information are fused in order to obtain a higher classification accuracy than is possible with uni-modal techniques.},\nproject = {Mobiniss},\naddress = {Sibenik, Croatia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/Multimodal terrain analysis for an all-terrain crisis management robot .pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, K. Verbiest, and S. A. Berrabah, &#8220;Autonomous camp surveillance with the ROBUDEM robot: challenges and results,\" in <span style=\"font-style: italic\">IARP Workshop RISE\u20192011<\/span>, Belgium,  2011.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_17\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_17\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/ELROB-RISE.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_17_block\">\n<p>Autonomous robotic systems can help in risky interventions to reduce the risk to human lives. 
An example of such a risky intervention is a camp surveillance scenario, where an environment needs to be patrolled and intruders need to be detected and intercepted. This paper describes the development of a mobile outdoor robot which is capable of performing such a camp surveillance task. The key research issues tackled are the robot design, geo-referenced localization and path planning, traversability estimation, the optimization of the terrain coverage strategy and the development of an intuitive human-robot interface.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_17_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2011autonomous,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Verbiest, Kristel and Berrabah, Sid Ahmed},\nbooktitle = {IARP Workshop RISE\u20192011},\ntitle = {Autonomous camp surveillance with the {ROBUDEM} robot: challenges and results},\nyear = {2011},\nabstract = {Autonomous robotic systems can help in risky interventions to reduce the risk to human lives. An example of such a risky intervention is a camp surveillance scenario, where an environment needs to be patrolled and intruders need to be detected and intercepted. This paper describes the development of a mobile outdoor robot which is capable of performing such a camp surveillance task. The key research issues tackled are the robot design, geo-referenced localization and path planning, traversability estimation, the optimization of the terrain coverage strategy and the development of an intuitive human-robot interface.},\nproject = {Mobiniss},\naddress = {Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/ELROB-RISE.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and D. 
Doroftei, &#8220;Using Robots in Hazardous Environments: Landmine Detection, de-Mining and Other Applications,\" in <span style=\"font-style: italic\">Using Robots in Hazardous Environments: Landmine Detection, De-Mining and Other Applications<\/span>, Y. Baudoin and M. Habib, Eds., Woodhead Publishing, 2011, vol. 1, p. 476\u2013498.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_40\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_40\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/Handbook Chapter 4 - Human Victim Detection and Stereo-based Terrain Traversability Analysis for Behavior-Based Robot Navigation.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_40_block\">\n<p>This chapter presents three main aspects of the development of a crisis management robot. First, we present an approach for robust victim detection in difficult outdoor conditions. Second, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data. 
Lastly, we present a behavior-based control architecture, enabling a robot to search for human victims on an incident site, while navigating semi-autonomously, using stereo vision as the main source of sensor information.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_40_block\">\n<pre><code class=\"tex bibtex\">@InBook{de2010human,\nauthor = {De Cubber, Geert and Doroftei, Daniela},\neditor = {Baudoin, Yvan and Habib, Maki},\nchapter = {Chapter 20},\npages = {476--498},\npublisher = {Woodhead Publishing},\ntitle = {Using Robots in Hazardous Environments: Landmine Detection, de-Mining and Other Applications},\nyear = {2011},\nisbn = {1845697863},\nvolume = {1},\nabstract = {This chapter presents three main aspects of the development of a crisis management robot. First, we present an approach for robust victim detection in difficult outdoor conditions. Second, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data. Lastly, we present a behavior-based control architecture, enabling a robot to search for human victims on an incident site, while navigating semi-autonomously, using stereo vision as the main source of sensor information.},\nbooktitle = {Using Robots in Hazardous Environments: Landmine Detection, De-Mining and Other Applications},\ndate = {2011-01-11},\nean = {9781845697860},\npagetotal = {665},\nproject = {Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/Handbook Chapter 4 - Human Victim Detection and Stereo-based Terrain Traversability Analysis for Behavior-Based Robot Navigation.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Decentralized multi-robot coordination for a risky surveillance application,\" in <span style=\"font-style: italic\">Proc. IARP HUDEM 2011<\/span>, Sibenik, Croatia,  2011.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_47\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_47\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/HUDEM2011_Doroftei_Colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_47_block\">\n<p>This paper proposes a multi-robot control methodology that is based on a behavior-based control framework. In this behavior-based context, the robotic team members are controlled using one of 2 mutually exclusive behaviors: patrolling or intercepting. In patrol mode the robot seeks to detect enemy forces as rapidly as possible, by balancing 2 constraints: the intervention time should be minimized and the map coverage should be maximized. In interception mode, the robot tries to advance towards an enemy which was detected by one of the robotic team members. Subsequently, the robot tries to neutralize the threat posed by the enemy before enemy is able to reach the camp.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_47_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2011decentralized,\nauthor = {Doroftei, Daniela and Colon, Eric},\nbooktitle = {Proc. {IARP} {HUDEM} 2011},\ntitle = {Decentralized multi-robot coordination for a risky surveillance application},\nyear = {2011},\npublisher = {{IARP}},\nabstract = {This paper proposes a multi-robot control methodology that is based on a behavior-based control framework. In this behavior-based context, the robotic team members are controlled using one of 2 mutually exclusive behaviors: patrolling or intercepting. In patrol mode the robot seeks to detect enemy forces as rapidly as possible, by balancing 2 constraints: the intervention time should be minimized and the map coverage should be maximized. 
In interception mode, the robot tries to advance towards an enemy which was detected by one of the robotic team members. Subsequently, the robot tries to neutralize the threat posed by the enemy before the enemy is able to reach the camp. },\nproject = {NMRS},\naddress = {Sibenik, Croatia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/HUDEM2011_Doroftei_Colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2010<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, S. A. Berrabah, D. Doroftei, Y. Baudoin, and H. Sahli, &#8220;Combining Dense Structure from Motion and Visual SLAM in a Behavior-Based Robot Control Architecture,\" <span style=\"font-style: italic\">International Journal of Advanced Robotic Systems<\/span>, vol. 7, iss. 1, 2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_10\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_10\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/e_from_motion_and_visual_slam_in_a_behavior-based_robot_control_architecture.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/7240' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_10_block\">\n<p>In this paper, we present a control architecture for an intelligent outdoor mobile robot. This enables the robot to navigate in a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features. 
This information enables a behavior-based robot motion and path planner to navigate the robot through the environment. In this paper, we show the theoretical aspects of setting up this architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_10_block\">\n<pre><code class=\"tex bibtex\">@Article{de2010combining,\nauthor = {De Cubber, Geert and Sid Ahmed Berrabah and Daniela Doroftei and Yvan Baudoin and Hichem Sahli},\njournal = {International Journal of Advanced Robotic Systems},\ntitle = {Combining Dense Structure from Motion and Visual {SLAM} in a Behavior-Based Robot Control Architecture},\nyear = {2010},\nmonth = mar,\nnumber = {1},\nvolume = {7},\nabstract = {In this paper, we present a control architecture for an intelligent outdoor mobile robot. This enables the robot to navigate in a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features. This information enables a behavior-based robot motion and path planner to navigate the robot through the environment. In this paper, we show the theoretical aspects of setting up this architecture.},\ndoi = {10.5772\/7240},\npublisher = {{SAGE} Publications},\nproject = {ViewFinder, Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/e_from_motion_and_visual_slam_in_a_behavior-based_robot_control_architecture.pdf},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, E. Colon, C. Pinzon, A. Maslowski, J. Bedkowski, and J. PENDERS, &#8220;VIEW-FINDER: Robotics Assistance to fire-Fighting services,\" in <span style=\"font-style: italic\">Mobile Robotics: Solutions and Challenges<\/span>, , 2010, p. 
397\u2013406.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_11\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_11\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/books.google.be\/books?id=zcfFCgAAQBAJ&#038;pg=PA397&#038;lpg=PA397&#038;dq=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&#038;source=bl&#038;ots=Jh6P63OKCr&#038;sig=O1GPy_c42NPSEdO8Hb_pa9V6K7g&#038;hl=en&#038;sa=X&#038;ved=2ahUKEwiLr76B-5zfAhUMCewKHQS_Af0Q6AEwDXoECAEQAQ#v=onepage&#038;q=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&#038;f=false\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_11_block\">\n<p>This paper presents an overview of the View-Finder project<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_11_block\">\n<pre><code class=\"tex bibtex\">@InCollection{baudoin2010view,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz and PENDERS, Jacques},\nbooktitle = {Mobile Robotics: Solutions and Challenges},\ntitle = {{VIEW-FINDER}: Robotics Assistance to fire-Fighting services},\nyear = {2010},\npages = {397--406},\nabstract = {This paper presents an overview of the View-Finder project},\nproject = {ViewFinder},\nunit= {meca-ras},\nurl = {https:\/\/books.google.be\/books?id=zcfFCgAAQBAJ&pg=PA397&lpg=PA397&dq=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&source=bl&ots=Jh6P63OKCr&sig=O1GPy_c42NPSEdO8Hb_pa9V6K7g&hl=en&sa=X&ved=2ahUKEwiLr76B-5zfAhUMCewKHQS_Af0Q6AEwDXoECAEQAQ#v=onepage&q=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&f=false},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, G. De Cubber, E. Colon, D. Doroftei, and S. A. 
Berrabah, &#8220;Robotics Assistance by Risky Interventions: Needs and Realistic Solutions,\" in <span style=\"font-style: italic\">Workshop on Robotics for Extreme conditions<\/span>, Saint-Petersburg, Russia,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_14\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_14\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/Robotics Assistance by risky interventions.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_14_block\">\n<p>This paper discusses the requirements towards robotics systems in the domains of firefighting, CBRN-E and humanitarian demining.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_14_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2010robotics,\nauthor = {Baudoin, Yvan and De Cubber, Geert and Colon, Eric and Doroftei, Daniela and Berrabah, Sid Ahmed},\nbooktitle = {Workshop on Robotics for Extreme conditions},\ntitle = {Robotics Assistance by Risky Interventions: Needs and Realistic Solutions},\nyear = {2010},\nabstract = {This paper discusses the requirements towards robotics systems in the domains of firefighting, CBRN-E and humanitarian demining.},\nproject = {ViewFinder, Mobiniss},\naddress = {Saint-Petersburg, Russia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/Robotics Assistance by risky interventions.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, and S. A. Berrabah, &#8220;Using visual perception for controlling an outdoor robot in a crisis management scenario,\" in <span style=\"font-style: italic\">ROBOTICS 2010<\/span>, Clermont-Ferrand, France,  2010.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_33\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_33\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/Usingvisualperceptionforcontrollinganoutdoorrobotinacrisismanagementscenario (1).pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_33_block\">\n<p>Crisis management teams (e.g. fire and rescue services, anti-terrorist units &#8230;) are often confronted with dramatic situations where critical decisions have to be made within hard time constraints. Therefore, they need correct information about what is happening on the crisis site. In this context, the View-Finder projects aims at developing robots which can assist the human crisis managers, by gathering data. This paper gives an overview of the development of such an outdoor robot. The presented robotic system is able to detect human victims at the incident site, by using vision-based human body shape detection. To increase the perceptual awareness of the human crisis managers, the robotic system is capable of reconstructing a 3D model of the environment, based on vision data. Also for navigation, the robot depends mostly on visual perception, as it combines a model-based navigation approach using geo-referenced positioning with stereo-based terrain traversability analysis for obstacle avoidance. The robot control scheme is embedded in a behavior-based robot control architecture, which integrates all the robot capabilities. 
This paper discusses all the above-mentioned technologies.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_33_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2010using,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Berrabah, Sid Ahmed},\nbooktitle = {ROBOTICS 2010},\ntitle = {Using visual perception for controlling an outdoor robot in a crisis management scenario},\nyear = {2010},\nabstract = {Crisis management teams (e.g. fire and rescue services, anti-terrorist units ...) are often confronted with dramatic situations where critical decisions have to be made within hard time constraints. Therefore, they need correct information about what is happening on the crisis site. In this context, the View-Finder project aims at developing robots which can assist the human crisis managers, by gathering data. This paper gives an overview of the development of such an outdoor robot. The presented robotic system is able to detect human victims at the incident site, by using vision-based human body shape detection. To increase the perceptual awareness of the human crisis managers, the robotic system is capable of reconstructing a 3D model of the environment, based on vision data. Also for navigation, the robot depends mostly on visual perception, as it combines a model-based navigation approach using geo-referenced positioning with stereo-based terrain traversability analysis for obstacle avoidance. The robot control scheme is embedded in a behavior-based robot control architecture, which integrates all the robot capabilities. This paper discusses all the above-mentioned technologies.},\nproject = {ViewFinder, Mobiniss},\naddress = {Clermont-Ferrand, France},\nunit= {meca-ras},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/Usingvisualperceptionforcontrollinganoutdoorrobotinacrisismanagementscenario (1).pdf},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. 
Colon, &#8220;Decentralized Multi-Robot Coordination in an Urban Environment,\" <span style=\"font-style: italic\">European Journal of Mechanical and Environmental Engineering<\/span>, vol. 1, 2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_48\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_48\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/EJMEE2010_doroftei_colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_48_block\">\n<p>In this paper, a novel control strategy is presented for multi\u2010robot coordination. An important aspect of the presented control architecture is that it is formulated in a decentralized context. This means that the robots cannot rely on traditional global path planning algorithms for navigation. The presented approach casts the multi\u2010robot control problem as a behavior\u2010based control problem.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_48_block\">\n<pre><code class=\"tex bibtex\">@Article{doro2010decentralized,\nauthor = {Doroftei, Daniela and Colon, Eric},\njournal = {European Journal of Mechanical and Environmental Engineering},\ntitle = {Decentralized Multi-Robot Coordination in an Urban Environment},\nyear = {2010},\nvolume = {1},\nabstract = {In this paper, a novel control strategy is presented for multi\u2010robot coordination. An important aspect of the presented control architecture is that it is formulated in a decentralized context. This means that the robots cannot rely on traditional global path planning algorithms for navigation. The presented approach casts the multi\u2010robot control problem as a behavior\u2010based control problem. 
},\nproject = {NMRS},\naddress = {Sheffield, UK},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/EJMEE2010_doroftei_colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Multi-robot collaboration and coordination in a high-risk transportation scenario,\" in <span style=\"font-style: italic\">Proc. IARP HUDEM 2010<\/span>, Sousse, Tunisia,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_49\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_49\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/HUDEM\/HUDEM%20-%202010\/HUDEM2010_Doroftei.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_49_block\">\n<p>This paper discusses a decentralized multi-robot coordination strategy which aims to control and guide a team of robotic agents safely through a hostile area. The \u201dhostility\u201d of the environment is due to the presence of enemy forces, seeking to intercept the robotic team. In order to avoid detection and ensure global team safety, the robotic agents must carefully plan their trajectory towards a list of goal locations, while holding a defensive formation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_49_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2001multi,\nauthor = {Doroftei, Daniela and Colon, Eric},\nbooktitle = {Proc. {IARP} {HUDEM} 2010},\ntitle = {Multi-robot collaboration and coordination in a high-risk transportation scenario},\nyear = {2010},\npublisher = {{IARP}},\nabstract = {This paper discusses a decentralized multi-robot coordination strategy which aims to control and guide a team of robotic agents safely through a hostile area. 
The \u201dhostility\u201d of the environment is due to the presence of enemy forces, seeking to intercept the robotic team. In order to avoid detection and ensure global team safety, the robotic agents must carefully plan their trajectory towards a list of goal locations, while holding a defensive formation. },\nproject = {NMRS},\naddress = {Sousse, Tunisia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/HUDEM\/HUDEM%20-%202010\/HUDEM2010_Doroftei.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Decentralized Multi-Robot Coordination for Risky Interventions,\" in <span style=\"font-style: italic\">Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance RISE<\/span>, Sheffield, UK,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_50\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_50\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/RISE\/RISE%20-%202010\/Decentralized%20Multi-Robot%20Coordination%20for%20Risky%20Interventio.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_50_block\">\n<p>The paper describes an approach to design a behavior-based architecture, how each behavior was designed and how the behavior fusion problem was solved.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_50_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2010multibis,\nauthor = {Doroftei, Daniela and Colon, Eric},\nbooktitle = {Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance {RISE}},\ntitle = {Decentralized Multi-Robot Coordination for Risky Interventions},\nyear = {2010},\nabstract = {The paper describes an approach to design a behavior-based architecture, how each behavior was 
designed and how the behavior fusion problem was solved.},\nproject = {NMRS, ViewFinder},\naddress = {Sheffield, UK},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/RISE\/RISE%20-%202010\/Decentralized%20Multi-Robot%20Coordination%20for%20Risky%20Interventio.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2009<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, D. Doroftei, L. Nalpantidis, G. C. Sirakoulis, and A. Gasteratos, &#8220;Stereo-based terrain traversability analysis for robot navigation,\" in <span style=\"font-style: italic\">IARP\/EURON Workshop on Robotics for Risky Interventions and Environmental Surveillance, Brussels, Belgium<\/span>, Brussels, Belgium,  2009.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_7\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_7\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DECUBBER-DUTH.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_7_block\">\n<p>In this paper, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_7_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2009stereo,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Nalpantidis, Lazaros and Sirakoulis, Georgios Ch and Gasteratos, Antonios},\nbooktitle = {IARP\/EURON Workshop on Robotics for Risky Interventions and Environmental Surveillance, Brussels, Belgium},\ntitle = {Stereo-based terrain traversability analysis for robot navigation},\nyear = {2009},\nabstract = {In this paper, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using 
only stereo vision as input data.},\nproject = {ViewFinder, Mobiniss},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DECUBBER-DUTH.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, E. Colon, and Y. Baudoin, &#8220;Behavior based control for an outdoor crisis management robot,\" in <span style=\"font-style: italic\">Proceedings of the IARP International Workshop on Robotics for Risky Interventions and Environmental Surveillance<\/span>, Brussels, Belgium,  2009, p. 12\u201314.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_8\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_8\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DOROFTEI.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_8_block\">\n<p>The design and development of a control architecture for a robotic crisis management agent raises 3 main questions: 1. How can we design the individual behaviors, such that the robot is capable of avoiding obstacles and of navigating semi-autonomously? 2. How can these individual behaviors be combined in an optimal way, leading to a rational and coherent global robot behavior? 3. How can all these capabilities be combined in a comprehensive and modular framework, such that the robot can handle a high-level task (searching for human victims) with minimal input from human operators, by navigating in a complex and dynamic environment, while avoiding potentially hazardous obstacles? 
In this paper, we present each of these three main aspects of the general robot control architecture in more detail.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_8_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2009behavior,\nauthor = {Doroftei, Daniela and De Cubber, Geert and Colon, Eric and Baudoin, Yvan},\nbooktitle = {Proceedings of the IARP International Workshop on Robotics for Risky Interventions and Environmental Surveillance},\ntitle = {Behavior based control for an outdoor crisis management robot},\nyear = {2009},\npages = {12--14},\nabstract = {The design and development of a control architecture for a robotic crisis management agent raises 3 main questions:\n1. How can we design the individual behaviors, such that the robot is capable of avoiding obstacles and of navigating semi-autonomously?\n2. How can these individual behaviors be combined in an optimal way, leading to a rational and coherent global robot behavior?\n3. How can all these capabilities be combined in a comprehensive and modular framework, such that the robot can handle a high-level task (searching for human victims) with minimal input from human operators, by navigating in a complex and dynamic environment, while avoiding potentially hazardous obstacles?\nIn this paper, we present each of these three main aspects of the general robot control architecture in more detail.},\nproject = {ViewFinder, Mobiniss},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DOROFTEI.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, C. Pinzon, F. Warlet, J. Gancet, E. Motard, M. Ilzkovitz, L. Nalpantidis, and A. 
Gasteratos, &#8220;VIEW-FINDER : Robotics assistance to fire-fighting services and Crisis Management,\" in <span style=\"font-style: italic\">2009 IEEE International Workshop on Safety, Security &#038; Rescue Robotics (SSRR 2009)<\/span>, Denver, USA,  2009, p. 1\u20136.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_12\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_12\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/5424172\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2009.5424172' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_12_block\">\n<p>In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the Base Station (BS) the data is processed and combined with geographical information originating from a Web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. 
This paper will focus on the Crisis Management Information System that has been developed for improving a Disaster Management Action Plan and for linking the Control Station with an off-site Crisis Management Centre, and on the software tools implemented on the mobile robot gathering data in the outdoor area of the crisis.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_12_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{Baudoin2009view01,\nauthor = {Y. Baudoin and D. Doroftei and G. De Cubber and S. A. Berrabah and C. Pinzon and F. Warlet and J. Gancet and E. Motard and M. Ilzkovitz and L. Nalpantidis and A. Gasteratos},\nbooktitle = {2009 {IEEE} International Workshop on Safety, Security {&} Rescue Robotics ({SSRR} 2009)},\ntitle = {{VIEW}-{FINDER} : Robotics assistance to fire-fighting services and Crisis Management},\nyear = {2009},\nmonth = nov,\norganization = {IEEE},\npages = {1--6},\npublisher = {{IEEE}},\nabstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the Base Station (BS) the data is processed and combined with geographical information originating from a Web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. 
This paper will focus on the Crisis Management Information System that has been developed for improving a Disaster Management Action Plan and for linking the Control Station with an off-site Crisis Management Centre, and on the software tools implemented on the mobile robot gathering data in the outdoor area of the crisis.},\ndoi = {10.1109\/ssrr.2009.5424172},\nproject = {ViewFinder},\naddress = {Denver, USA},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/5424172},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, C. Pinzon, J. Penders, A. Maslowski, and J. Bedkowski, &#8220;VIEW-FINDER : Outdoor Robotics Assistance to Fire-Fighting services,\" in <span style=\"font-style: italic\">International Symposium Clawar<\/span>, Istanbul, Turkey,  2009.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_18\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_18\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/CLAWAR2009.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_18_block\">\n<p>In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. 
At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station. The robots are off-the-shelf units, consisting of wheeled robots. The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_18_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2009view02,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Pinzon, Carlos and Penders, Jacques and Maslowski, Andrzej and Bedkowski, Janusz},\nbooktitle = {International Symposium Clawar},\ntitle = {{VIEW-FINDER} : Outdoor Robotics Assistance to Fire-Fighting services},\nyear = {2009},\nabstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). 
The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station. The robots are off-the-shelf units, consisting of wheeled robots. The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It\nwill be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.},\nproject = {ViewFinder, Mobiniss},\naddress = {Istanbul, Turkey},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/CLAWAR2009.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, E. Colon, C. Pinzon, A. Maslowski, and J. Bedkowski, &#8220;View-Finder: a European project aiming the Robotics assistance to Fire-fighting services and Crisis Management,\" in <span style=\"font-style: italic\">IARP workshop on Service Robotics and Nanorobotics<\/span>, Beijing, China,  2009.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_19\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_19\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/IARP-paper2009.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_19_block\">\n<p>In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command. 
We\u2019ll essentially focus in this paper on the steps entrusted to the RMA and PIAP through the work-packages of the project.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_19_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2009view03,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz},\nbooktitle = {IARP workshop on Service Robotics and Nanorobotics},\ntitle = {{View-Finder}: a European project aiming the Robotics assistance to Fire-fighting services and Crisis Management},\nyear = {2009},\nabstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. 
It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.\nWe\u2019ll essentially focus in this paper on the steps entrusted to the RMA and PIAP through the work-packages of the project.},\nproject = {ViewFinder},\naddress = {Beijing, China},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/IARP-paper2009.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, G. De Cubber, S. A. Berrabah, D. Doroftei, E. Colon, C. Pinzon, A. Maslowski, and J. Bedkowski, &#8220;VIEW-FINDER: European Project Aiming CRISIS MANAGEMENT TOOLS and the Robotics Assistance to Fire-Fighting Services,\" in <span style=\"font-style: italic\">IARP WS on service Robotics, Beijing<\/span>, Beijing, China,  2009.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_21\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_21\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.academia.edu\/2879650\/VIEW-FINDER_European_Project_Aiming_CRISIS_MANAGEMENT_TOOLS_and_the_Robotics_Assistance_to_Fire-Fighting_Services\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_21_block\">\n<p>Overview of the View-Finder project<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_21_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2009view04,\nauthor = {Baudoin, Yvan and De Cubber, Geert and Berrabah, Sid Ahmed and Doroftei, Daniela and Colon, E and Pinzon, C and Maslowski, A and Bedkowski, J},\nbooktitle = {IARP WS on service Robotics, Beijing},\ntitle = {{VIEW-FINDER}: European Project Aiming CRISIS MANAGEMENT TOOLS and the Robotics Assistance to Fire-Fighting Services},\nyear = {2009},\nabstract = {Overview of the View-Finder project},\nproject = {ViewFinder},\naddress = {Beijing, China},\nunit= {meca-ras},\nurl 
= {https:\/\/www.academia.edu\/2879650\/VIEW-FINDER_European_Project_Aiming_CRISIS_MANAGEMENT_TOOLS_and_the_Robotics_Assistance_to_Fire-Fighting_Services},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, Y. Baudoin, and H. Sahli, &#8220;Development of a behaviour-based control and software architecture for a visually guided mine detection robot,\" <span style=\"font-style: italic\">European Journal of Automated Systems (JESA)<\/span>, vol. 43, iss. 3, p. 295\u2013314, 2009.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_51\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_51\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/doc-article-hermes.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_51_block\">\n<p>Humanitarian demining is a labor-intensive and high-risk task which could benefit from the development of a humanitarian mine detection robot, capable of scanning a minefield semi-automatically. The design of such an outdoor autonomous robot requires the consideration and integration of multiple aspects: sensing, data fusion, path and motion planning and robot control embedded in a control and software architecture. 
This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_51_block\">\n<pre><code class=\"tex bibtex\">@Article{doro2009development,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan and Sahli, Hichem},\njournal = {European Journal of Automated Systems ({JESA})},\ntitle = {Development of a behaviour-based control and software architecture for a visually guided mine detection robot},\nyear = {2009},\nvolume = {43},\nnumber = {3},\nabstract = { Humanitarian demining is a labor-intensive and high-risk task which could benefit from the development of a humanitarian mine detection robot, capable of scanning a minefield semi-automatically. The design of such an outdoor autonomous robot requires the consideration and integration of multiple aspects: sensing, data fusion, path and motion planning and robot control embedded in a control and software architecture. This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.},\npages = {295--314},\nproject = {Mobiniss, ViewFinder},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/doc-article-hermes.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2008<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, E. Colon, and G. De Cubber, &#8220;A Behaviour-Based Control and Software Architecture for the Visually Guided Robudem Outdoor Mobile Robot,\" <span style=\"font-style: italic\">Journal of Automation Mobile Robotics and Intelligent Systems<\/span>, vol. 2, iss. 4, p. 19\u201324, 2008.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_9\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_9\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/XXX JAMRIS No8 - Doroftei.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_9_block\">\n<p>The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semiautonomous outdoor robot for risky interventions. This paper focuses on three main aspects of the design process: visual sensing using stereo vision and image motion analysis, design of a behaviourbased control architecture and implementation of modular software architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_9_block\">\n<pre><code class=\"tex bibtex\">@Article{doroftei2008behaviour,\nauthor = {Doroftei, Daniela and Colon, Eric and De Cubber, Geert},\njournal = {Journal of Automation Mobile Robotics and Intelligent Systems},\ntitle = {A Behaviour-Based Control and Software Architecture for the Visually Guided Robudem Outdoor Mobile Robot},\nyear = {2008},\nissn = {1897-8649},\nmonth = oct,\nnumber = {4},\npages = {19--24},\nvolume = {2},\nabstract = {The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semiautonomous outdoor robot for risky interventions. 
This paper focuses on three main aspects of the design process: visual sensing using stereo vision and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.},\nproject = {ViewFinder, Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/XXX JAMRIS No8 - Doroftei.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, and G. Marton, &#8220;Development of a visually guided mobile robot for environmental observation as an aid for outdoor crisis management operations,\" in <span style=\"font-style: italic\">Proceedings of the IARP Workshop on Environmental Maintenance and Protection<\/span>, Baden Baden, Germany,  2008.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_13\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_13\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/environmental observation as an aid for outdoor crisis management operations.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_13_block\">\n<p>To solve these issues, an outdoor mobile robotic platform was equipped with a differential GPS system for accurate geo-registered positioning, and a stereo vision system. This stereo vision system serves two purposes: 1) victim detection and 2) obstacle detection and avoidance. For semi-autonomous robot control and navigation, we rely on a behavior-based robot motion and path planner. 
In this paper, we present each of the three main aspects (victim detection, stereo-based obstacle detection and behavior-based navigation) of the general robot control architecture in more detail.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_13_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2008development,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Marton, Gabor},\nbooktitle = {Proceedings of the IARP Workshop on Environmental Maintenance and Protection},\ntitle = {Development of a visually guided mobile robot for environmental observation as an aid for outdoor crisis management operations},\nyear = {2008},\nabstract = {To solve these issues, an outdoor mobile robotic platform was equipped with a differential GPS system for accurate geo-registered positioning, and a stereo vision system. This stereo vision system serves two purposes: 1) victim detection and 2) obstacle detection and avoidance. For semi-autonomous robot control and navigation, we rely on a behavior-based robot motion and path planner. In this paper, we present each of the three main aspects (victim detection, stereo-based obstacle detection and behavior-based navigation) of the general robot control architecture in more detail.},\nproject = {ViewFinder, Mobiniss},\naddress = {Baden Baden, Germany},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/environmental observation as an aid for outdoor crisis management operations.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and Y. Baudoin, &#8220;Development of a semi-autonomous De-mining vehicle,\" in <span style=\"font-style: italic\">7th IARP Workshop HUDEM2008<\/span>, Cairo, Egypt,  2008.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_52\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_52\" class=\"papercite_toggle\">[Abstract]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_52_block\">\n<p>The paper describes the development of a semi-autonomous de-mining vehicle.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_52_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2008development,\nauthor = {Doroftei, Daniela and Baudoin, Yvan},\nbooktitle = {7th {IARP} Workshop {HUDEM}2008},\ntitle = {Development of a semi-autonomous De-mining vehicle},\nyear = {2008},\nabstract = {The paper describes the development of a semi-autonomous de-mining vehicle.},\naddress = {Cairo, Egypt},\nproject = {Mobiniss},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and J. Bedkowski, &#8220;Towards the autonomous navigation of robots for risky interventions,\" in <span style=\"font-style: italic\">Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance RISE<\/span>, Benicassim, Spain,  2008.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_53\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_53\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/Doroftei.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_53_block\">\n<p>In the course of the ViewFinder project, two robotics teams (RMS and PIAP) are working on the development of an intelligent autonomous mobile robot. 
This paper reports on the progress of both teams.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_53_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2008towards,\nauthor = {Doroftei, Daniela and Bedkowski, Janusz},\nbooktitle = {Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance {RISE}},\ntitle = {Towards the autonomous navigation of robots for risky interventions},\nyear = {2008},\nabstract = {In the course of the ViewFinder project, two robotics teams (RMS and PIAP) are working on the development of an intelligent autonomous mobile robot. This paper reports on the progress of both teams.},\nproject = {ViewFinder, Mobiniss},\naddress = {Benicassim, Spain},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/Doroftei.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2007<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, E. Colon, and G. De Cubber, &#8220;A behaviour-based control and software architecture for the visually guided Robudem outdoor mobile robot,\" in <span style=\"font-style: italic\">ISMCR 2007<\/span>, Warsaw, Poland,  2007.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_45\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_45\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2007\/Doroftei_ISMCR07.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_45_block\">\n<p>The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. 
This paper describes partial aspects of this research work, which is aimed at developing a semi\u2010autonomous outdoor robot for risky interventions. This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour\u2010based control architecture and implementation of a modular software architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_45_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2007behaviour,\nauthor = {Doroftei, Daniela and Colon, Eric and De Cubber, Geert},\nbooktitle = {ISMCR 2007},\ntitle = {A behaviour-based control and software architecture for the visually guided {Robudem} outdoor mobile robot},\nyear = {2007},\naddress = {Warsaw, Poland},\nabstract = {The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi\u2010autonomous outdoor robot for risky interventions. This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour\u2010based control architecture and implementation of a modular software architecture.},\nproject = {ViewFinder,Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2007\/Doroftei_ISMCR07.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, Y. Baudoin, and H. Sahli, &#8220;Development of a semi-autonomous off-road vehicle,\" in <span style=\"font-style: italic\">IEEE HuMan&#8217;07<\/span>, Timimoun, Algeria,  2007, p. 340\u2013343.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_54\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_54\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2007\/Development_of_a_semi-autonomous_off-road_vehicle.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_54_block\">\n<p>Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, the aim is to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_54_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2007development,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan and Sahli, Hichem},\nbooktitle = {{IEEE} {HuMan}'07},\ntitle = {Development of a semi-autonomous off-road vehicle},\nyear = {2007},\naddress = {Timimoun, Algeria},\npages = {340--343},\nabstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, the aim is to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. 
This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.},\nproject = {Mobiniss, ViewFinder},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2007\/Development_of_a_semi-autonomous_off-road_vehicle.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2006<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, E. Colon, and Y. Baudoin, &#8220;A modular control architecture for semi-autonomous navigation,\" in <span style=\"font-style: italic\">CLAWAR 2006<\/span>, Brussels, Belgium,  2006, p. 712\u2013715.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_55\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_55\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2006\/Clawar2006_Doroftei_colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_55_block\">\n<p>Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, the aim is to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. 
This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_55_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2006modular,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan},\nbooktitle = {{CLAWAR} 2006},\ntitle = {A modular control architecture for semi-autonomous navigation},\nyear = {2006},\npages = {712--715},\nabstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, the aim is to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.},\nproject = {Mobiniss, ViewFinder},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2006\/Clawar2006_Doroftei_colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, and Y. Baudoin, &#8220;Development of a control architecture for the ROBUDEM outdoor mobile robot platform,\" in <span style=\"font-style: italic\">IARP Workshop RISE 2006<\/span>, Brussels, Belgium,  2006.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_56\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_56\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2006\/IARPWS2006_Doroftei_Colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_56_block\">\n<p>Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, the aim is to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_56_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2006development,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan},\nbooktitle = {{IARP} Workshop {RISE} 2006},\ntitle = {Development of a control architecture for the ROBUDEM outdoor mobile robot platform},\nyear = {2006},\nabstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, the aim is to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. 
This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. },\nproject = {Mobiniss, ViewFinder},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2006\/IARPWS2006_Doroftei_Colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<p><\/span><\/p>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-669379 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-552659 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-320049\" style=\"height: 20px; clear: both; width:100%;\"><\/div><\/div><\/div><\/div><\/div><\/section><\/p>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":250,"parent":3120,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-3586","page","type-page","status-publish","has-post-thumbnail","hentry"],"_links":{"self":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3586","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/comments?post=3586"}],"version-history":[{"count":67,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3586\/revisions"}],"predecessor-version":[{"id":5190,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3586\/revisions\/5190"}],"up":[{"embeddable":true,"hr
ef":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3120"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/media\/250"}],"wp:attachment":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/media?parent=3586"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}