Daniela Doroftei

Senior Researcher

Robotics & Autonomous Systems,
Royal Military Academy

Address

Avenue De La Renaissance 30, 1000 Brussels, Belgium

Contact Information

Call: +32(0)2-44-14106

Email: daniela.doroftei@rma.ac.be

Daniela Doroftei is a senior researcher at the Robotics & Autonomous Systems unit of the Department of Mechanics of the Belgian Royal Military Academy. Her research focuses on the tight integration and seamless interfacing between humans and robots in security and defence applications. Within the Robotics & Autonomous Systems unit, she is therefore the expert on research questions related to human factors, requirements engineering, human-robot shared control methodologies, and operational quantitative validation methods.

Daniela received her Master's degree in Mechanical Engineering in 2002 from the Gheorghe Asachi University of Iasi, Romania, and an advanced Master's degree (Master-after-Master) in Applied Sciences in 2003 from the Université Libre de Bruxelles (ULB), Belgium.

She is a task or work package leader in multiple European research projects, such as FP7-ICARUS (on the development of search and rescue robots) and H2020-SafeShore (on the development of a threat detection system). Within these projects, Daniela acts as the end-user liaison officer, bridging the gap in the field between the scientists and the end-user stakeholders.

Daniela is a principal investigator for the RMA in multiple international research projects, such as STARS*EU and ASSETs+. She is also the technical coordinator of the Alphonse research project, which aims to reduce the number of drone incidents by improving the training procedures for drone operators.

Daniela is active as a reviewer for the European Commission and other funding agencies and has published around 50 scientific papers, including books and book chapters.

Publications

2024

  • P. Petsioti, M. Zyczkowski, K. Brewczyski, K. Cichulski, K. Kaminski, R. Razvan, A. Mohamoud, C. Church, A. Koniaris, G. De Cubber, and D. Doroftei, “Methodological Approach for the Development of Standard C-UAS Scenarios," Open Research Europe, vol. 4, iss. 240, 2024.
    [BibTeX] [Download PDF] [DOI]
    @Article{ 10.12688/openreseurope.18339.1,
    AUTHOR = {Petsioti, P. and Zyczkowski, M. and Brewczyski, K. and Cichulski, K. and Kaminski, K. and Razvan, R. and Mohamoud, A. and Church, C. and Koniaris, A. and De Cubber, G. and Doroftei, D.},
    TITLE = {Methodological Approach for the Development of Standard C-UAS Scenarios},
    JOURNAL = {Open Research Europe},
    VOLUME = {4},
    YEAR = {2024},
    NUMBER = {240},
    DOI = {10.12688/openreseurope.18339.1},
    URL = {https://open-research-europe.ec.europa.eu/articles/4-240/v1},
    unit= {meca-ras},
    project= {COURAGEOUS}
    }

  • D. Doroftei, G. De Cubber, S. Lo Bue, and H. De Smet, “Quantitative Assessment of Drone Pilot Performance," Drones, vol. 8, iss. 9, 2024.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This paper introduces a quantitative methodology for assessing drone pilot performance, aiming to reduce drone-related incidents by understanding the human factors influencing performance. The challenge lies in balancing evaluations in operationally relevant environments with those in a standardized test environment for statistical relevance. The proposed methodology employs a novel virtual test environment that records not only basic flight metrics but also complex mission performance metrics, such as the video quality from a target. A group of Belgian Defence drone pilots were trained using this simulator system, yielding several practical results. These include a human-performance model linking human factors to pilot performance, an AI co-pilot providing real-time flight performance guidance, a tool for generating optimal flight trajectories, a mission planning tool for ideal pilot assignment, and a method for iterative training improvement based on quantitative input. The training results with real pilots demonstrate the methodology’s effectiveness in evaluating pilot performance for complex military missions, suggesting its potential as a valuable addition to new pilot training programs.

    @Article{drones8090482,
    AUTHOR = {Doroftei, Daniela and De Cubber, Geert and Lo Bue, Salvatore and De Smet, Hans},
    TITLE = {Quantitative Assessment of Drone Pilot Performance},
    JOURNAL = {Drones},
    VOLUME = {8},
    YEAR = {2024},
    unit= {meca-ras},
    NUMBER = {9},
    ARTICLE-NUMBER = {482},
    URL = {https://www.mdpi.com/2504-446X/8/9/482},
    ISSN = {2504-446X},
    project= {ALPHONSE},
    ABSTRACT = {This paper introduces a quantitative methodology for assessing drone pilot performance, aiming to reduce drone-related incidents by understanding the human factors influencing performance. The challenge lies in balancing evaluations in operationally relevant environments with those in a standardized test environment for statistical relevance. The proposed methodology employs a novel virtual test environment that records not only basic flight metrics but also complex mission performance metrics, such as the video quality from a target. A group of Belgian Defence drone pilots were trained using this simulator system, yielding several practical results. These include a human-performance model linking human factors to pilot performance, an AI co-pilot providing real-time flight performance guidance, a tool for generating optimal flight trajectories, a mission planning tool for ideal pilot assignment, and a method for iterative training improvement based on quantitative input. The training results with real pilots demonstrate the methodology’s effectiveness in evaluating pilot performance for complex military missions, suggesting its potential as a valuable addition to new pilot training programs.},
    DOI = {10.3390/drones8090482}
    }

2023

  • G. De Cubber, E. Le Flécher, A. La Grappe, E. Ghisoni, E. Maroulis, P. Ouendo, D. Hawari, and D. Doroftei, “Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case," in IEEE International Conference on Safety, Security, and Rescue Robotics, 2023.
    [BibTeX] [Download PDF]
    @inproceedings{ssrr2023decubber,
    title={Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case},
    author={De Cubber, Geert and Le Flécher, Emile and La Grappe, Alexandre and Ghisoni, Enzo and Maroulis, Emmanouil and Ouendo, Pierre-Edouard and Hawari, Danial and Doroftei, Daniela},
    booktitle={IEEE International Conference on Safety, Security, and Rescue Robotics},
    editor = {Kimura, Tetsuya},
    publisher = {IEEE},
    year = {2023},
    volume = {1},
    project = {AIDED, iMUGs, CUGS},
    location = {Fukushima, Japan},
    unit= {meca-ras},
    url={https://mecatron.rma.ac.be/pub/2023/SSRR2023-DeCubber.pdf}
    }

  • G. De Cubber, E. Le Flécher, A. Dominicus, and D. Doroftei, “Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario," in Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference., 2023.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives. Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kind of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces. The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve. The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning & scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission. Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the field. The resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.

    @inproceedings{ahfe20203decubber,
    title={Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario},
    author={De Cubber, G. and Le Flécher, E. and Dominicus, A. and Doroftei, D.},
    booktitle={Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.},
    editor = {Tareq Ahram and Waldemar Karwowski},
    publisher = {AHFE Open Access, AHFE International, USA},
    year = {2023},
    volume = {93},
    project = {iMUGs},
    location = {San Francisco, USA},
    unit= {meca-ras},
    doi = {10.54941/ahfe1003746},
    url={https://openaccess.cms-conferences.org/publications/book/978-1-958651-69-8/article/978-1-958651-69-8_5},
    abstract = {Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives. Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kind of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces. The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve. The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning & scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission. Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the field. The resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.}
    }

  • D. Doroftei, G. De Cubber, and H. De Smet, “Human factors assessment for drone operations: towards a virtual drone co-pilot," in Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference., 2023.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    As the number of drone operations increases, so does the risk of incidents with these novel, yet sometimes dangerous unmanned systems. Research has shown that over 70% of drone incidents are caused by human error, so in order to reduce the risk of incidents, the human factors related to the operation of the drone should be studied. However, this is not a trivial exercise, because on the one hand, a realistic operational environment is required (in order to study the human behaviour in realistic conditions), while on the other hand a standardised environment is required, such that repeatable experiments can be set up in order to ensure statistical relevance. In order to remedy this, within the scope of the ALPHONSE project, a realistic simulation environment was developed that is specifically geared towards the evaluation of human factors for military drone operations. Within the ALPHONSE simulator, military (and other) drone pilots can perform missions in realistic operational conditions. At the same time, they are subjected to a range of factors that can influence operator performance. These constitute both person-induced factors like pressure to achieve the set goals in time or people talking to the pilot and environment-induced stress factors like changing weather conditions. During the flight operation, the ALPHONSE simulator continuously monitors over 65 flight parameters. After the flight, an overall performance score is calculated, based upon the achievement of the mission objectives. Throughout the ALPHONSE trials, a wide range of pilots has flown in the simulator, ranging from beginner to expert pilots. Using all the data recorded during these flights, three actions are performed: (1) an Artificial Intelligence (AI)-based classifier was trained to automatically recognize in real time good and bad flight behaviour. This allows for the development of a virtual co-pilot that can warn the pilot at any given moment when the pilot is starting to exhibit behaviour that is recognized by the classifier to correspond mostly to the behaviour of inexperienced pilots and not to the behaviour of good pilots. (2) An identification and ranking of the human factors and their impact on the flight performance, by linking the induced stress factors to the performance scores. (3) An update of the training procedures to take into consideration the human factors that impact flight performance, such that newly trained pilots are better aware of these influences. The objective of this paper is to present the complete ALPHONSE simulator system for the evaluation of human factors for drone operations and present the results of the experiments with real military flight operators. The focus of the paper will be on the elaboration of the design choices for the development of the AI-based classifier for real-time flight performance evaluation. The proposed development is highly significant, as it presents a concrete and cost-effective methodology for developing a virtual co-pilot for drone pilots that can render drone operations safer. Indeed, while the initial training of the AI model requires considerable computing resources, the implementation of the classifier can be readily integrated in commodity flight controllers to provide real-time alerts when pilots are manifesting undesired flight behaviours. The paper will present results of tests with drone pilots from Belgian Defence and civilian Belgian Defence researchers that have flown within the ALPHONSE simulator. These pilots have first acted as data subjects to provide flight data to train the model and have later been used to validate the model. The validation shows that the virtual co-pilot achieves a very high accuracy and can in over 80% of the cases correctly identify bad flight profiles in real-time.

    @inproceedings{ahfe20203doroftei,
    title={Human factors assessment for drone operations: towards a virtual drone co-pilot},
    author={Doroftei, D. and De Cubber, G. and De Smet, H.},
    booktitle={Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.},
    editor = {Tareq Ahram and Waldemar Karwowski},
    publisher = {AHFE Open Access, AHFE International, USA},
    year = {2023},
    volume = {93},
    project = {Alphonse},
    location = {San Francisco, USA},
    unit= {meca-ras},
    doi = {10.54941/ahfe1003747},
    url={https://openaccess.cms-conferences.org/publications/book/978-1-958651-69-8/article/978-1-958651-69-8_6},
    abstract = {As the number of drone operations increases, so does the risk of incidents with these novel, yet sometimes dangerous unmanned systems. Research has shown that over 70% of drone incidents are caused by human error, so in order to reduce the risk of incidents, the human factors related to the operation of the drone should be studied. However, this is not a trivial exercise, because on the one hand, a realistic operational environment is required (in order to study the human behaviour in realistic conditions), while on the other hand a standardised environment is required, such that repeatable experiments can be set up in order to ensure statistical relevance. In order to remedy this, within the scope of the ALPHONSE project, a realistic simulation environment was developed that is specifically geared towards the evaluation of human factors for military drone operations. Within the ALPHONSE simulator, military (and other) drone pilots can perform missions in realistic operational conditions. At the same time, they are subjected to a range of factors that can influence operator performance. These constitute both person-induced factors like pressure to achieve the set goals in time or people talking to the pilot and environment-induced stress factors like changing weather conditions. During the flight operation, the ALPHONSE simulator continuously monitors over 65 flight parameters. After the flight, an overall performance score is calculated, based upon the achievement of the mission objectives. Throughout the ALPHONSE trials, a wide range of pilots has flown in the simulator, ranging from beginner to expert pilots. Using all the data recorded during these flights, three actions are performed: (1) an Artificial Intelligence (AI)-based classifier was trained to automatically recognize in real time good and bad flight behaviour. This allows for the development of a virtual co-pilot that can warn the pilot at any given moment when the pilot is starting to exhibit behaviour that is recognized by the classifier to correspond mostly to the behaviour of inexperienced pilots and not to the behaviour of good pilots. (2) An identification and ranking of the human factors and their impact on the flight performance, by linking the induced stress factors to the performance scores. (3) An update of the training procedures to take into consideration the human factors that impact flight performance, such that newly trained pilots are better aware of these influences. The objective of this paper is to present the complete ALPHONSE simulator system for the evaluation of human factors for drone operations and present the results of the experiments with real military flight operators. The focus of the paper will be on the elaboration of the design choices for the development of the AI-based classifier for real-time flight performance evaluation. The proposed development is highly significant, as it presents a concrete and cost-effective methodology for developing a virtual co-pilot for drone pilots that can render drone operations safer. Indeed, while the initial training of the AI model requires considerable computing resources, the implementation of the classifier can be readily integrated in commodity flight controllers to provide real-time alerts when pilots are manifesting undesired flight behaviours. The paper will present results of tests with drone pilots from Belgian Defence and civilian Belgian Defence researchers that have flown within the ALPHONSE simulator. These pilots have first acted as data subjects to provide flight data to train the model and have later been used to validate the model. The validation shows that the virtual co-pilot achieves a very high accuracy and can in over 80% of the cases correctly identify bad flight profiles in real-time.}
    }
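
    For readers curious how such a real-time flight-behaviour classifier could look in practice, the sketch below is purely illustrative: it is not the ALPHONSE implementation, and the feature layout, the choice of a random-forest model, and the synthetic training data are all assumptions.

    # Illustrative sketch only: trains a binary classifier on simulated flight-parameter
    # windows and flags "inexperienced-pilot" behaviour in real time. The feature layout,
    # the random-forest model and the synthetic data are assumptions, not the ALPHONSE code.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the >65 logged flight parameters (65 features per time window).
    X = rng.normal(size=(5000, 65))
    # Label 1 = behaviour typical of inexperienced pilots (synthetic labelling rule, for illustration).
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0.8).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

    def copilot_alert(window: np.ndarray, threshold: float = 0.7) -> bool:
        """Return True when the current parameter window looks like 'bad' flight behaviour."""
        p_bad = clf.predict_proba(window.reshape(1, -1))[0, 1]
        return p_bad >= threshold

    # Example: evaluate one incoming telemetry window.
    print("alert:", copilot_alert(rng.normal(size=65)))

    In a deployed virtual co-pilot, the classifier would be trained offline on recorded flights and only the lightweight prediction step would run on the flight controller, which is what makes the real-time alerting feasible.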

2022

  • D. Doroftei, G. De Cubber, and H. De Smet, “A quantitative measure for the evaluation of drone-based video quality on a target," in Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS), Venice, Italy, 2022.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This paper presents a methodology to assess video quality and based on that automatically calculate drone trajectories that optimize the video quality.

    @InProceedings{doroftei2022alphonse2,
    author = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},
    booktitle = {Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS)},
    title = {A quantitative measure for the evaluation of drone-based video quality on a target},
    year = {2022},
    month = jun,
    organization = {IARIA},
    publisher = {ThinkMind},
    address = {Venice, Italy},
    url = {https://www.thinkmind.org/articles/icas_2022_1_40_20018.pdf},
    isbn={978-1-61208-966-9},
    doi = {https://www.thinkmind.org/index.php?view=article&articleid=icas_2022_1_40_20018},
    abstract = {This paper presents a methodology to assess video quality and based on that automatically calculate drone trajectories that optimize the video quality.},
    project = {Alphonse},
    unit= {meca-ras}
    }

  • D. Doroftei, G. De Cubber, and H. De Smet, “Assessing Human Factors for Drone Operations in a Simulation Environment," in Human Factors in Robots, Drones and Unmanned Systems – AHFE (2022) International Conference, New York, USA, 2022.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This paper presents an overview of the Alphonse methodology for Assessing Human Factors for Drone Operations in a Simulation Environment.

    @InProceedings{doroftei2022a,
    author = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},
    booktitle = {Human Factors in Robots, Drones and Unmanned Systems - AHFE (2022) International Conference},
    title = {Assessing Human Factors for Drone Operations in a Simulation Environment},
    year = {2022},
    month = jul,
    volume = {57},
    editor = {Tareq Ahram and Waldemar Karwowski},
    publisher = {AHFE International},
    address = {New York, USA},
    url = {https://openaccess-api.cms-conferences.org/articles/download/978-1-958651-33-9_16},
    abstract = {This paper presents an overview of the Alphonse methodology for Assessing Human Factors for Drone Operations in a Simulation Environment.},
    doi = {10.54941/ahfe1002319},
    project = {Alphonse},
    unit= {meca-ras}
    }

2021

  • D. Doroftei, T. De Vleeschauwer, S. Lo Bue, M. Dewyn, F. Vanderstraeten, and G. De Cubber, “Human-Agent Trust Evaluation in a Digital Twin Context," in 2021 30th IEEE International Conference on Robot Human Interactive Communication (RO-MAN), Vancouver, BC, Canada, 2021, pp. 203-207.
    [BibTeX] [Download PDF] [DOI]
    @INPROCEEDINGS{9515445,
    author={Doroftei, Daniela and De Vleeschauwer, Tom and Lo Bue, Salvatore and Dewyn, Michaël and Vanderstraeten, Frik and De Cubber, Geert},
    booktitle={2021 30th IEEE International Conference on Robot Human Interactive Communication (RO-MAN)},
    title={Human-Agent Trust Evaluation in a Digital Twin Context},
    year={2021},
    pages={203-207},
    url={https://www.researchgate.net/profile/Geert-De-Cubber/publication/354078858_Human-Agent_Trust_Evaluation_in_a_Digital_Twin_Context/links/61430bd22bfbd83a46cf2b8c/Human-Agent-Trust-Evaluation-in-a-Digital-Twin-Context.pdf?_sg%5B0%5D=BdEPB9AGDUV3sOwnEQKCr-DgWRA7uDNeMlvyQYNaMPGSO2bhCDbyG4AENXXxH3j323ypYTq9nMftVbDr2fsCSA.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&_sg%5B1%5D=ykQnQS2LN8fUQXAYx5Fpiy2NXqIwqO1UyVCENkpSUUWZn8Qqgrelh1bb4ry9Q9XPgCts7lVXU1_68YLjqnCPh4seSzWfG5BpKHc3MuFwsK6l.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&_iepl=},
    project={Alphonse},
    publisher={IEEE},
    address={Vancouver, BC, Canada},
    month=aug,
    doi={10.1109/RO-MAN50785.2021.9515445},
    unit= {meca-ras}}

  • G. De Cubber, R. Lahouli, D. Doroftei, and R. Haelterman, “Distributed coverage optimisation for a fleet of unmanned maritime systems," ACTA IMEKO, vol. 10, iss. 3, pp. 36-43, 2021.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    Unmanned maritime systems (UMS) can provide important benefits for maritime law enforcement agencies for tasks such as area surveillance and patrolling, especially when they are able to work together as one coordinated system. In this context, this paper proposes a methodology that optimises the coverage of a fleet of UMS, thereby maximising the opportunities for identifying threats. Unlike traditional approaches to maritime coverage optimisation, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small UMS, compared with traditional large ships, by incorporating the danger level into the design of the optimiser.

    @ARTICLE{cubberimeko2021,
    author={De Cubber, Geert and Lahouli, Rihab and Doroftei, Daniela and Haelterman, Rob},
    journal={ACTA IMEKO},
    title={Distributed coverage optimisation for a fleet of unmanned maritime systems},
    year={2021},
    volume={10},
    number={3},
    pages={36-43},
    issn={2221-870X},
    url={https://acta.imeko.org/index.php/acta-imeko/article/view/IMEKO-ACTA-10%20%282021%29-03-07/pdf},
    project={MarSur, SSAVE},
    publisher={IMEKO},
    month=oct,
    abstract = {Unmanned maritime systems (UMS) can provide important benefits for maritime law enforcement agencies for tasks such as area surveillance and patrolling, especially when they are able to work together as one coordinated system. In this context, this paper proposes a methodology that optimises the coverage of a fleet of UMS, thereby maximising the opportunities for identifying threats. Unlike traditional approaches to maritime coverage optimisation, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small UMS, compared with traditional large ships, by incorporating the danger level into the design of the optimiser. },
    doi={10.21014/acta_imeko.v10i3.1031},
    unit= {meca-ras}}
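
    As a rough illustration of the danger-aware coverage idea described in the abstract above, the sketch below greedily assigns fleet members to surveillance cells while excluding cells whose danger level exceeds a platform's seaworthiness. This is not the optimiser from the paper; the grid, the danger and seaworthiness scales, and the greedy strategy are all assumptions.

    # Illustrative sketch only (not the paper's optimiser): assign a small fleet of
    # unmanned maritime systems (UMS) to grid cells so that total surveillance value is
    # maximised, while cells too dangerous for a given vessel are excluded.
    from dataclasses import dataclass

    @dataclass
    class Cell:
        x: int
        y: int
        value: float   # expected benefit of covering this cell (e.g. traffic density)
        danger: float  # sea-state / danger level, 0 (calm) .. 1 (severe)

    @dataclass
    class Vessel:
        name: str
        seaworthiness: float  # maximum danger level the platform can handle

    def assign_coverage(cells: list[Cell], fleet: list[Vessel]) -> dict[str, Cell]:
        """Greedy one-cell-per-vessel assignment, skipping cells a vessel cannot survive."""
        assignment: dict[str, Cell] = {}
        free = set(range(len(cells)))
        for vessel in fleet:
            candidates = [i for i in free if cells[i].danger <= vessel.seaworthiness]
            if not candidates:
                continue  # no safe cell left for this vessel
            best = max(candidates, key=lambda i: cells[i].value * (1.0 - cells[i].danger))
            assignment[vessel.name] = cells[best]
            free.remove(best)
        return assignment

    if __name__ == "__main__":
        grid = [Cell(x, y, value=1.0 + 0.1 * x, danger=0.2 * y) for x in range(3) for y in range(3)]
        fleet = [Vessel("UMS-1", seaworthiness=0.3), Vessel("UMS-2", seaworthiness=0.8)]
        for name, cell in assign_coverage(grid, fleet).items():
            print(f"{name} -> cell ({cell.x},{cell.y}), value {cell.value:.1f}, danger {cell.danger:.1f}")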

2020

  • D. Doroftei, G. De Cubber, and H. De Smet, “Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process," in Advances in Human Factors in Robots, Drones and Unmanned Systems, San Diego, USA, 2020, p. 71–77.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    Considering the ever-increasing use of drones in a plentitude of application areas, the risk is that also an ever-increasing number of drone incidents would be observed. Research has shown that a large majority of all incidents with drones is due not to technological, but to human error. An advanced risk-reduction methodology, focusing on the human element, is thus required in order to allow for the safe use of drones. In this paper, we therefore introduce a novel concept to provide a qualitative and quantitative assessment of the performance of the drone operator. The proposed methodology is based on one hand upon the development of standardized test methodologies and on the other hand on human performance modeling of the drone operators in a highly realistic simulation environment.

    @InProceedings{doroftei2020alphonse,
    author = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},
    booktitle = {Advances in Human Factors in Robots, Drones and Unmanned Systems},
    title = {Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process},
    year = {2020},
    month = jul,
    editor = {Zallio, Matteo},
    publisher = {Springer International Publishing},
    pages = {71--77},
    isbn = {978-3-030-51758-8},
    organization = {AHFE},
    address = {San Diego, USA},
    abstract = {Considering the ever-increasing use of drones in a plentitude of application areas, the risk is that also an ever-increasing number of drone incidents would be observed. Research has shown that a large majority of all incidents with drones is due not to technological, but to human error. An advanced risk-reduction methodology, focusing on the human element, is thus required in order to allow for the safe use of drones. In this paper, we therefore introduce a novel concept to provide a qualitative and quantitative assessment of the performance of the drone operator. The proposed methodology is based on one hand upon the development of standardized test methodologies and on the other hand on human performance modeling of the drone operators in a highly realistic simulation environment.},
    doi = {10.1007/978-3-030-51758-8_10},
    unit= {meca-ras},
    project = {Alphonse},
    url = {http://mecatron.rma.ac.be/pub/2020/Reducing%20drone%20incidents%20by%20incorporating%20human%20factors%20in%20the%20drone%20and%20drone%20pilot%20accreditation%20process.pdf},
    }

  • G. De Cubber, R. Lahouli, D. Doroftei, and R. Haelterman, “Distributed coverage optimization for a fleet of unmanned maritime systems for a maritime patrol and surveillance application," in ISMCR 2020: 23rd International Symposium on Measurement and Control in Robotics, Budapest, Hungary, 2020.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    In order for unmanned maritime systems to provide added value for maritime law enforcement agencies, they have to be able to work together as a coordinated team for tasks such as area surveillance and patrolling. Therefore, this paper proposes a methodology that optimizes the coverage of a fleet of unmanned maritime systems, and thereby maximizes the chances of noticing threats. Unlike traditional approaches for maritime coverage optimization, which are also used for example in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small unmanned systems, as compared to traditional large ships, by incorporating the danger level in the design of the optimizer.

    @InProceedings{decubber2020dco,
    author = {De Cubber, Geert and Lahouli, Rihab and Doroftei, Daniela and Haelterman, Rob},
    booktitle = {ISMCR 2020: 23rd International Symposium on Measurement and Control in Robotics},
    title = {Distributed coverage optimization for a fleet of unmanned maritime systems for a maritime patrol and surveillance application},
    year = {2020},
    month = oct,
    organization = {ISMCR},
    publisher = {{IEEE}},
    abstract = {In order for unmanned maritime systems to provide added value for maritime law enforcement agencies, they have to be able to work together as a coordinated team for tasks such as area surveillance and patrolling. Therefore, this paper proposes a methodology that optimizes the coverage of a fleet of unmanned maritime systems, and thereby maximizes the chances of noticing threats. Unlike traditional approaches for maritime coverage optimization, which are also used for example in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small unmanned systems, as compared to traditional large ships, by incorporating the danger level in the design of the optimizer.},
    project = {SSAVE,MarSur},
    address = {Budapest, Hungary},
    doi = {10.1109/ISMCR51255.2020.9263740},
    url = {http://mecatron.rma.ac.be/pub/2020/conference_101719.pdf},
    unit= {meca-ras}
    }

2019

  • D. Doroftei and G. De Cubber, “Using a qualitative and quantitative validation methodology to evaluate a drone detection system," ACTA IMEKO, vol. 8, iss. 4, p. 20–27, 2019.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    Now that the use of drones is becoming more common, the need to regulate the access to airspace for these systems is becoming more pressing. A necessary tool in order to do this is a means of detecting drones. Numerous parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation that requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation, it is therefore paramount that a validation procedure that finds a compromise between the requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want statistically relevant tests) is followed. Therefore, we propose in this article a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).

    @Article{doroftei2019using,
    author = {Doroftei, Daniela and De Cubber, Geert},
    journal = {{ACTA} {IMEKO}},
    title = {Using a qualitative and quantitative validation methodology to evaluate a drone detection system},
    year = {2019},
    month = dec,
    number = {4},
    pages = {20--27},
    volume = {8},
    abstract = {Now that the use of drones is becoming more common, the need to regulate the access to airspace for these systems is becoming more pressing. A necessary tool in order to do this is a means of detecting drones. Numerous parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation that requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation, it is therefore paramount that a validation procedure that finds a compromise between the requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want statistically relevant tests) is followed. Therefore, we propose in this article a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).},
    doi = {10.21014/acta_imeko.v8i4.682},
    pdf = {https://acta.imeko.org/index.php/acta-imeko/article/view/IMEKO-ACTA-08%20%282019%29-04-05/pdf},
    project = {SafeShore},
    publisher = {{IMEKO} International Measurement Confederation},
    url = {https://acta.imeko.org/index.php/acta-imeko/article/view/IMEKO-ACTA-08%20%282019%29-04-05/pdf},
    unit= {meca-ras}
    }

  • D. Doroftei and H. De Smet, “Evaluating Human Factors for Drone Operations using Simulations and Standardized Tests," in 10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019), Washington DC, USA, 2019.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This poster publication presents an overview of the Alphonse project on the development of new training curricula to reduce the number of drone incidents due to human error.

    @InProceedings{doroftei2019alphonse,
    author = {Doroftei, Daniela and De Smet, Hans},
    booktitle = {10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019)},
    title = {Evaluating Human Factors for Drone Operations using Simulations and Standardized Tests},
    year = {2019},
    month = jul,
    organization = {AHFE},
    publisher = {Springer},
    address = {Washington DC, USA},
    abstract = {This poster publication presents an overview of the Alphonse project on the development of new training curricula to reduce the number of drone incidents due to human error.},
    doi = {10.5281/zenodo.3742199},
    project = {Alphonse},
    url = {http://mecatron.rma.ac.be/pub/2019/Poster_Alphonse_Print.pdf},
    unit= {meca-ras}
    }

2018

  • Y. Baudoin, D. Doroftei, G. de Cubber, J. Habumuremyi, H. Balta, and I. Doroftei, “Unmanned Ground and Aerial Robots Supporting Mine Action Activities," Journal of Physics: Conference Series, vol. 1065, iss. 17, p. 172009, 2018.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    During humanitarian-demining actions, teleoperation of sensors or multi-sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and/or European-funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.

    @Article{baudoin2018unmanned,
    author = {Baudoin, Yvan and Doroftei, Daniela and de Cubber, Geert and Habumuremyi, Jean-Claude and Balta, Haris and Doroftei, Ioan},
    title = {Unmanned Ground and Aerial Robots Supporting Mine Action Activities},
    year = {2018},
    month = aug,
    number = {17},
    organization = {IOP Publishing},
    pages = {172009},
    publisher = {{IOP} Publishing},
    volume = {1065},
    abstract = {During humanitarian-demining actions, teleoperation of sensors or multi-sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and/or European-funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.},
    doi = {10.1088/1742-6596/1065/17/172009},
    journal = {Journal of Physics: Conference Series},
    project = {TIRAMISU},
    url = {https://iopscience.iop.org/article/10.1088/1742-6596/1065/17/172009/pdf},
    unit= {meca-ras}
    }

  • D. Doroftei and G. De Cubber, “Qualitative and quantitative validation of drone detection systems," in International Symposium on Measurement and Control in Robotics ISMCR2018, Mons, Belgium, 2018.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    As drones are more and more entering our world, so comes the need to regulate the access to airspace for these systems. A necessary tool in order to do this is a means of detecting these drones. Numerous commercial and non-commercial parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation, which requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation and an honest comparison between systems, it is therefore paramount that a stringent validation procedure is followed. Moreover, the validation methodology needs to find a compromise between the often contrasting requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want tests to be performed that are statistically relevant). Therefore, we propose in this paper a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).

    @InProceedings{doroftei2018qualitative,
    author = {Doroftei, Daniela and De Cubber, Geert},
    booktitle = {International Symposium on Measurement and Control in Robotics ISMCR2018},
    title = {Qualitative and quantitative validation of drone detection systems},
    year = {2018},
    volume = {1},
    abstract = {As drones are more and more entering our world, so comes the need to regulate the access to airspace for these systems. A necessary tool in order to do this is a means of detecting these drones. Numerous commercial and non-commercial parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation, which requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation and an honest comparison between systems, it is therefore paramount that a stringent validation procedure is followed. Moreover, the validation methodology needs to find a compromise between the often contrasting requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want tests to be performed that are statistically relevant). Therefore, we propose in this paper a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).},
    doi = {10.5281/ZENODO.1462586},
    keywords = {Unmanned Aerial Vehicles, Drones, Detection systems, Drone detection, Test and evaluation methods},
    project = {SafeShore},
    address = {Mons, Belgium},
    url = {http://mecatron.rma.ac.be/pub/2018/Paper_Daniela.pdf},
    unit= {meca-ras}
    }

2017

  • D. S. López, G. Moreno, J. Cordero, J. Sanchez, S. Govindaraj, M. M. Marques, V. Lobo, S. Fioravanti, A. Grati, K. Rudin, M. Tosa, A. Matos, A. Dias, A. Martins, J. Bedkowski, H. Balta, and G. De Cubber, “Interoperability in a Heterogeneous Team of Search and Rescue Robots," in Search and Rescue Robotics – From Theory to Practice, G. De Cubber and D. Doroftei, Eds., InTech, 2017.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    Search and rescue missions are complex operations. A disaster scenario is generally unstructured, time‐varying and unpredictable. This poses several challenges for the successful deployment of unmanned technology. The variety of operational scenarios and tasks lead to the need for multiple robots of different types, domains and sizes. A priori planning of the optimal set of assets to be deployed and the definition of their mission objectives are generally not feasible as information only becomes available during mission. The ICARUS project responds to this challenge by developing a heterogeneous team composed by different and complementary robots, dynamically cooperating as an interoperable team. This chapter describes our approach to multi‐robot interoperability, understood as the ability of multiple robots to operate together, in synergy, enabling multiple teams to share data, intelligence and resources, which is the ultimate objective of ICARUS project. It also includes the analysis of the relevant standardization initiatives in multi‐robot multi‐domain systems, our implementation of an interoperability framework and several examples of multi‐robot cooperation of the ICARUS robots in realistic search and rescue missions.

    @InBook{lopez2017interoperability,
    author = {Daniel Serrano L{\'{o}}pez and German Moreno and Jose Cordero and Jose Sanchez and Shashank Govindaraj and Mario Monteiro Marques and Victor Lobo and Stefano Fioravanti and Alberto Grati and Konrad Rudin and Massimo Tosa and Anibal Matos and Andre Dias and Alfredo Martins and Janusz Bedkowski and Haris Balta and De Cubber, Geert},
    editor = {De Cubber, Geert and Doroftei, Daniela},
    chapter = {Chapter 6},
    publisher = {{InTech}},
    title = {Interoperability in a Heterogeneous Team of Search and Rescue Robots},
    year = {2017},
    month = aug,
    abstract = {Search and rescue missions are complex operations. A disaster scenario is generally unstructured, time‐varying and unpredictable. This poses several challenges for the successful deployment of unmanned technology. The variety of operational scenarios and tasks lead to the need for multiple robots of different types, domains and sizes. A priori planning of the optimal set of assets to be deployed and the definition of their mission objectives are generally not feasible as information only becomes available during mission. The ICARUS project responds to this challenge by developing a heterogeneous team composed by different and complementary robots, dynamically cooperating as an interoperable team. This chapter describes our approach to multi‐robot interoperability, understood as the ability of multiple robots to operate together, in synergy, enabling multiple teams to share data, intelligence and resources, which is the ultimate objective of ICARUS project. It also includes the analysis of the relevant standardization initiatives in multi‐robot multi‐domain systems, our implementation of an interoperability framework and several examples of multi‐robot cooperation of the ICARUS robots in realistic search and rescue missions.},
    booktitle = {Search and Rescue Robotics - From Theory to Practice},
    doi = {10.5772/intechopen.69493},
    project = {ICARUS},
    unit= {meca-ras},
    url = {https://www.intechopen.com/books/search-and-rescue-robotics-from-theory-to-practice/interoperability-in-a-heterogeneous-team-of-search-and-rescue-robots},
    }

  • G. De Cubber, D. Doroftei, H. Balta, A. Matos, E. Silva, D. Serrano, S. Govindaraj, R. Roda, V. Lobo, M. Marques, and R. Wagemans, “Operational Validation of Search and Rescue Robots," in Search and Rescue Robotics – From Theory to Practice, G. De Cubber and D. Doroftei, Eds., InTech, 2017.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This chapter describes how the different ICARUS unmanned search and rescue tools have been evaluated and validated using operational benchmarking techniques. Two large‐scale simulated disaster scenarios were organized: a simulated shipwreck and an earthquake response scenario. Next to these simulated response scenarios, where ICARUS tools were deployed in tight interaction with real end users, ICARUS tools also participated to a real relief, embedded in a team of end users for a flood response mission. These validation trials allow us to conclude that the ICARUS tools fulfil the user requirements and goals set up at the beginning of the project.

    @InBook{de2017operational,
    author = {De Cubber, Geert and Daniela Doroftei and Haris Balta and Anibal Matos and Eduardo Silva and Daniel Serrano and Shashank Govindaraj and Rui Roda and Victor Lobo and M{\'{a}}rio Marques and Rene Wagemans},
    editor = {De Cubber, Geert and Doroftei, Daniela},
    chapter = {Chapter 10},
    publisher = {{InTech}},
    title = {Operational Validation of Search and Rescue Robots},
    year = {2017},
    month = aug,
    abstract = {This chapter describes how the different ICARUS unmanned search and rescue tools have been evaluated and validated using operational benchmarking techniques. Two large‐scale simulated disaster scenarios were organized: a simulated shipwreck and an earthquake response scenario. Next to these simulated response scenarios, where ICARUS tools were deployed in tight interaction with real end users, ICARUS tools also participated to a real relief, embedded in a team of end users for a flood response mission. These validation trials allow us to conclude that the ICARUS tools fulfil the user requirements and goals set up at the beginning of the project.},
    booktitle = {Search and Rescue Robotics - From Theory to Practice},
    doi = {10.5772/intechopen.69497},
    journal = {Search and Rescue Robotics: From Theory to Practice},
    project = {ICARUS},
    url = {https://www.intechopen.com/books/search-and-rescue-robotics-from-theory-to-practice/operational-validation-of-search-and-rescue-robots},
    unit= {meca-ras}
    }

  • K. Berns, A. Nezhadfard, M. Tosa, H. Balta, and G. De Cubber, “Unmanned Ground Robots for Rescue Tasks," in Search and Rescue Robotics – From Theory to Practice, G. De Cubber and D. Doroftei, Eds., InTech, 2017.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This chapter describes two unmanned ground vehicles that can help search and rescue teams in their difficult, but life-saving tasks. These robotic assets have been developed within the framework of the European project ICARUS. The large unmanned ground vehicle is intended to be a mobile base station. It is equipped with a powerful manipulator arm and can be used for debris removal, shoring operations, and remote structural operations (cutting, welding, hammering, etc.) on very rough terrain. The smaller unmanned ground vehicle is also equipped with an array of sensors, enabling it to search for victims inside semi-destroyed buildings. Working together with each other and the human search and rescue workers, these robotic assets form a powerful team, increasing the effectiveness of search and rescue operations, as proven by operational validation tests in collaboration with end users.

    @InBook{berns2017unmanned,
    author = {Karsten Berns and Atabak Nezhadfard and Massimo Tosa and Haris Balta and De Cubber, Geert},
    editor = {De Cubber, Geert and Doroftei, Daniela},
    chapter = {Chapter 4},
    publisher = {{InTech}},
    title = {Unmanned Ground Robots for Rescue Tasks},
    year = {2017},
    month = aug,
    abstract = {This chapter describes two unmanned ground vehicles that can help search and rescue teams in their difficult, but life-saving tasks. These robotic assets have been developed within the framework of the European project ICARUS. The large unmanned ground vehicle is intended to be a mobile base station. It is equipped with a powerful manipulator arm and can be used for debris removal, shoring operations, and remote structural operations (cutting, welding, hammering, etc.) on very rough terrain. The smaller unmanned ground vehicle is also equipped with an array of sensors, enabling it to search for victims inside semi-destroyed buildings. Working together with each other and the human search and rescue workers, these robotic assets form a powerful team, increasing the effectiveness of search and rescue operations, as proven by operational validation tests in collaboration with end users.},
    booktitle = {Search and Rescue Robotics - From Theory to Practice},
    doi = {10.5772/intechopen.69491},
    project = {ICARUS},
    url = {https://www.intechopen.com/books/search-and-rescue-robotics-from-theory-to-practice/unmanned-ground-robots-for-rescue-tasks},
    unit= {meca-ras}
    }

  • D. Doroftei, G. De Cubber, R. Wagemans, A. Matos, E. Silva, V. Lobo, K. C. Guerreiro Cardoso, S. Govindaraj, J. Gancet, and D. Serrano, “User-centered design," in Search and Rescue Robotics – From Theory to Practice, G. De Cubber and D. Doroftei, Eds., InTech, 2017, p. 19–36.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    The successful introduction and acceptance of novel technological tools are only possible if end users are completely integrated in the design process. However, obtaining such integration of end users is not obvious, as end‐user organizations often do not consider research toward new technological aids as their core business and are therefore reluctant to engage in these kinds of activities. This chapter explains how this problem was tackled in the ICARUS project, by carefully identifying and approaching the targeted user communities and by compiling user requirements. Resulting from these user requirements, system requirements and a system architecture for the ICARUS system were deduced. An important aspect of the user‐centered design approach is that it is an iterative methodology, based on multiple intermediate operational validations by end users of the developed tools, leading to a final validation according to user‐scripted validation scenarios.

    @InBook{doroftei2017user,
    author = {Doroftei, Daniela and De Cubber, Geert and Wagemans, Rene and Matos, Anibal and Silva, Eduardo and Lobo, Victor and Guerreiro Cardoso, Keshav Chintamani and Govindaraj, Shashank and Gancet, Jeremi and Serrano, Daniel},
    editor = {De Cubber, Geert and Doroftei, Daniela},
    chapter = {Chapter 2},
    pages = {19--36},
    publisher = {{InTech}},
    title = {User-centered design},
    year = {2017},
    abstract = {The successful introduction and acceptance of novel technological tools are only possible if end users are completely integrated in the design process. However, obtaining such integration of end users is not obvious, as end‐user organizations often do not consider research toward new technological aids as their core business and are therefore reluctant to engage in these kinds of activities. This chapter explains how this problem was tackled in the ICARUS project, by carefully identifying and approaching the targeted user communities and by compiling user requirements. Resulting from these user requirements, system requirements and a system architecture for the ICARUS system were deduced. An important aspect of the user‐centered design approach is that it is an iterative methodology, based on multiple intermediate operational validations by end users of the developed tools, leading to a final validation according to user‐scripted validation scenarios.},
    doi = {10.5772/intechopen.69483},
    journal = {Search and rescue robotics. From theory to practice. IntechOpen, London},
    project = {ICARUS},
    unit= {meca-ras},
    url = {https://www.intechopen.com/books/search-and-rescue-robotics-from-theory-to-practice/user-centered-design},
    }

  • G. De Cubber, D. Doroftei, K. Rudin, K. Berns, A. Matos, D. Serrano, J. Sanchez, S. Govindaraj, J. Bedkowski, R. Roda, E. Silva, and S. Ourevitch, “Introduction to the use of robotic tools for search and rescue," in Search and Rescue Robotics – From Theory to Practice, G. De Cubber and D. Doroftei, Eds., InTech, 2017.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    Modern search and rescue workers are equipped with a powerful toolkit to address natural and man-made disasters. This introductory chapter explains how a new tool can be added to this toolkit: robots. The use of robotic assets in search and rescue operations is explained and an overview is given of the worldwide efforts to incorporate robotic tools in search and rescue operations. Furthermore, the European Union ICARUS project on this subject is introduced. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, such that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.

    @InBook{cubber2017introduction,
    author = {Geert De Cubber and Daniela Doroftei and Konrad Rudin and Karsten Berns and Anibal Matos and Daniel Serrano and Jose Sanchez and Shashank Govindaraj and Janusz Bedkowski and Rui Roda and Eduardo Silva and Stephane Ourevitch},
    editor = {De Cubber, Geert and Doroftei, Daniela},
    chapter = {Chapter 1},
    publisher = {{InTech}},
    title = {Introduction to the use of robotic tools for search and rescue},
    year = {2017},
    month = aug,
    abstract = {Modern search and rescue workers are equipped with a powerful toolkit to address natural and man-made disasters. This introductory chapter explains how a new tool can be added to this toolkit: robots. The use of robotic assets in search and rescue operations is explained and an overview is given of the worldwide efforts to incorporate robotic tools in search and rescue operations. Furthermore, the European Union ICARUS project on this subject is introduced. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, such that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},
    booktitle = {Search and Rescue Robotics - From Theory to Practice},
    doi = {10.5772/intechopen.69489},
    project = {ICARUS},
    url = {https://www.intechopen.com/books/search-and-rescue-robotics-from-theory-to-practice/introduction-to-the-use-of-robotic-tools-for-search-and-rescue},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, K. Rudin, K. Berns, A. Matos, D. Serrano, J. M. Sanchez, S. Govindaraj, J. Bedkowski, R. Roda, E. Silva, S. Ourevitch, R. Wagemans, V. Lobo, G. Cardoso, K. Chintamani, J. Gancet, P. Stupler, A. Nezhadfard, M. Tosa, H. Balta, J. Almeida, A. Martins, H. Ferreira, B. Ferreira, J. Alves, A. Dias, S. Fioravanti, D. Bertin, G. Moreno, J. Cordero, M. M. Marques, A. Grati, H. M. Chaudhary, B. Sheers, Y. Riobo, P. Letier, M. N. Jimenez, M. A. Esbri, P. Musialik, I. Badiola, R. Goncalves, A. Coelho, T. Pfister, K. Majek, M. Pelka, A. Maslowski, and R. Baptista, Search and Rescue Robotics – From Theory to Practice, G. De Cubber and D. Doroftei, Eds., InTech, 2017.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    In the event of large crises (earthquakes, typhoons, floods, …), a primordial task of the fire and rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which – too often – leads to loss of lives among the human crisis managers themselves. This book explains how unmanned search can be added to the toolkit of the search and rescue workers, offering a valuable tool to save human lives and to speed up the search and rescue process. The introduction of robotic tools in the world of search and rescue is not straightforward, due to the fact that the search and rescue context is extremely technology-unfriendly, meaning that very robust solutions, which can be deployed extremely quickly, are required. Multiple research projects across the world are tackling this problem and in this book, a special focus is placed on showcasing the results of the European Union ICARUS project on this subject. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, so that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them in order to learn to use the ICARUS system.

    @Book{de2017search,
    author = {Geert De Cubber and Daniela Doroftei and Konrad Rudin and Karsten Berns and Anibal Matos and Daniel Serrano and Jose Manuel Sanchez and Shashank Govindaraj and Janusz Bedkowski and Rui Roda and Eduardo Silva and Stephane Ourevitch and Rene Wagemans and Victor Lobo and Guerreiro Cardoso and Keshav Chintamani and Jeremi Gancet and Pascal Stupler and Atabak Nezhadfard and Massimo Tosa and Haris Balta and Jose Almeida and Alfredo Martins and Hugo Ferreira and Bruno Ferreira and Jose Alves and Andre Dias and Stefano Fioravanti and Daniele Bertin and German Moreno and Jose Cordero and Mario Monteiro Marques and Alberto Grati and Hafeez M. Chaudhary and Bart Sheers and Yudani Riobo and Pierre Letier and Mario Nunez Jimenez and Miguel Angel Esbri and Pawel Musialik and Irune Badiola and Ricardo Goncalves and Antonio Coelho and Thomas Pfister and Karol Majek and Michal Pelka and Andrzej Maslowski and Ricardo Baptista},
    editor = {De Cubber, Geert and Doroftei, Daniela},
    publisher = {{InTech}},
    title = {Search and Rescue Robotics - From Theory to Practice},
    year = {2017},
    month = aug,
    abstract = {In the event of large crises (earthquakes, typhoons, floods, ...), a primordial task of the fire and rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which - too often - leads to loss of lives among the human crisis managers themselves. This book explains how unmanned search can be added to the toolkit of the search and rescue workers, offering a valuable tool to save human lives and to speed up the search and rescue process. The introduction of robotic tools in the world of search and rescue is not straightforward, due to the fact that the search and rescue context is extremely technology-unfriendly, meaning that very robust solutions, which can be deployed extremely quickly, are required. Multiple research projects across the world are tackling this problem and in this book, a special focus is placed on showcasing the results of the European Union ICARUS project on this subject. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, so that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them in order to learn to use the ICARUS system.},
    doi = {10.5772/intechopen.68449},
    project = {ICARUS},
    url = {https://www.intechopen.com/books/search-and-rescue-robotics-from-theory-to-practice},
    unit= {meca-ras}
    }

  • Y. Baudoin, D. Doroftei, G. De Cubber, J. Habumuremyi, H. Balta, and I. Doroftei, “Unmanned Ground and Aerial Robots Supporting Mine Action Activities," in Mine Action – The Research Experience of the Royal Military Academy of Belgium, C. Beumier, D. Closson, V. Lacroix, N. Milisavljevic, and Y. Yvinec, Eds., InTech, 2017, vol. 1.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    During humanitarian demining actions, teleoperation of sensors or multi-sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experience developed over 16 years of national and/or European-funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy in Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotic systems on minefields.

    @InBook{baudoin2017unmanned,
    author = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Habumuremyi, Jean-Claude and Balta, Haris and Doroftei, Ioan},
    editor = {Beumier, Charles and Closson, Damien and Lacroix, Vincianne and Milisavljevic, Nada and Yvinec, Yann},
    chapter = {Chapter 9},
    publisher = {{InTech}},
    title = {Unmanned Ground and Aerial Robots Supporting Mine Action Activities},
    year = {2017},
    month = aug,
    volume = {1},
    abstract = {During humanitarian demining actions, teleoperation of sensors or multi-sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experience developed over 16 years of national and/or European-funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy in Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotic systems on minefields.},
    booktitle = {Mine Action - The Research Experience of the Royal Military Academy of Belgium},
    doi = {10.5772/65783},
    project = {TIRAMISU},
    url = {https://www.intechopen.com/books/mine-action-the-research-experience-of-the-royal-military-academy-of-belgium/unmanned-ground-and-aerial-robots-supporting-mine-action-activities},
    unit= {meca-ras}
    }

2015

  • D. Doroftei, A. Matos, E. Silva, V. Lobo, R. Wagemans, and G. De Cubber, “Operational validation of robots for risky environments," in 8th IARP Workshop on Robotics for Risky Environments, Lisbon, Portugal, 2015.
    [BibTeX] [Abstract] [Download PDF]

    This paper presents an operational test and validation approach for the evaluation of the performance of a range of marine, aerial and ground search and rescue robots. The proposed approach seeks to find a compromise between the traditional rigorous standardized approaches and the open-ended robot competitions. Operational scenarios are defined, including a performance assessment of individual robots but also collective operations where heterogeneous robots cooperate together and with manned teams in search and rescue activities. That way, it is possible to perform a more complete validation of the use of robotic tools in challenging real world scenarios.

    @InProceedings{doroftei2015operational,
    author = {Doroftei, Daniela and Matos, Anibal and Silva, Eduardo and Lobo, Victor and Wagemans, Rene and De Cubber, Geert},
    booktitle = {8th IARP Workshop on Robotics for Risky Environments},
    title = {Operational validation of robots for risky environments},
    year = {2015},
    abstract = {This paper presents an operational test and validation approach for the evaluation of the performance of a range of marine, aerial and ground search and rescue robots. The proposed approach seeks to find a compromise between the traditional rigorous standardized approaches and the open-ended robot competitions. Operational scenarios are defined, including a performance assessment of individual robots but also collective operations where heterogeneous robots cooperate together and with manned teams in search and rescue activities. That way, it is possible to perform a more complete validation of the use of robotic tools in challenging real world scenarios.},
    project = {ICARUS},
    address = {Lisbon, Portugal},
    url = {http://mecatron.rma.ac.be/pub/2015/Operational validation of robots for risky environments.pdf},
    unit= {meca-ras}
    }

  • H. Balta, G. De Cubber, Y. Baudoin, and D. Doroftei, “UAS deployment and data processing during the Balkans flooding with the support to Mine Action," in 8th IARP Workshop on Robotics for Risky Environments, Lisbon, Portugal, 2015.
    [BibTeX] [Abstract] [Download PDF]

    In this paper, we provide a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which had been moved due to the flooding and landslides. The destructive impact of landslides, sediment torrents and floods on the minefields and the change of the mine action situation resulted in significant negative environmental and security consequences. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied in the field in order to support the relief workers and minimize human suffering.

    @InProceedings{balta2015uas,
    author = {Balta, Haris and De Cubber, Geert and Baudoin, Yvan and Doroftei, Daniela},
    booktitle = {8th IARP Workshop on Robotics for Risky Environments},
    title = {{UAS} deployment and data processing during the {Balkans} flooding with the support to Mine Action},
    year = {2015},
    abstract = {In this paper, we provide a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which had been moved due to the flooding and landslides. The destructive impact of landslides, sediment torrents and floods on the minefields and the change of the mine action situation resulted in significant negative environmental and security consequences. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied in the field in order to support the relief workers and minimize human suffering.},
    project = {ICARUS},
    address = {Lisbon, Portugal},
    url = {http://mecatron.rma.ac.be/pub/2015/RISE_2015_Haris_Balta_RMA.PDF},
    unit= {meca-ras}
    }

2014

  • D. Doroftei, A. Matos, and G. De Cubber, “Designing Search and Rescue Robots towards Realistic User Requirements," in Advanced Concepts on Mechanical Engineering (ACME), Iasi, Romania, 2014.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    In the event of a large crisis (think about typhoon Haiyan or the Tohoku earthquake and tsunami in Japan), a primordial task of the rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which often leads to loss of lives among the human crisis managers themselves. The introduction of unmanned search and rescue devices can offer a valuable tool to save human lives and to speed up the search and rescue process. In this context, the EU-FP7-ICARUS project [1] concentrates on the development of unmanned search and rescue technologies for detecting, locating and rescuing humans. The complex nature and difficult operating conditions of search and rescue operations pose heavy constraints on the mechanical design of the unmanned platforms. In this paper, we discuss the different user requirements which have an impact on the design of the mechanical systems (air, ground and marine robots). We show how these user requirements are obtained, how they are validated, how they lead to design specifications for operational prototypes which are tested in realistic operational conditions, and how the final mechanical design specifications are derived from these different steps. An important aspect of all these design steps, which is emphasized in this paper, is to always keep the end-users in the loop in order to come to realistic requirements and specifications, ensuring the practical deployability [2] of the developed platforms.

    @InProceedings{doroftei2014designing,
    author = {Doroftei, Daniela and Matos, Anibal and De Cubber, Geert},
    booktitle = {Advanced Concepts on Mechanical Engineering (ACME)},
    title = {Designing Search and Rescue Robots towards Realistic User Requirements},
    year = {2014},
    abstract = {In the event of a large crisis (think about typhoon Haiyan or the Tohoku earthquake and tsunami in Japan), a primordial task of the rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which often leads to loss of lives among the human crisis managers themselves. The introduction of unmanned search and rescue devices can offer a valuable tool to save human lives and to speed up the search and rescue process. In this context, the EU-FP7-ICARUS project [1] concentrates on the development of unmanned search and rescue technologies for detecting, locating and rescuing humans. The complex nature and difficult operating conditions of search and rescue operations pose heavy constraints on the mechanical design of the unmanned platforms. In this paper, we discuss the different user requirements which have an impact on the design of the mechanical systems (air, ground and marine robots). We show how these user requirements are obtained, how they are validated, how they lead to design specifications for operational prototypes which are tested in realistic operational conditions, and how the final mechanical design specifications are derived from these different steps. An important aspect of all these design steps, which is emphasized in this paper, is to always keep the end-users in the loop in order to come to realistic requirements and specifications, ensuring the practical deployability [2] of the developed platforms.},
    doi = {10.4028/www.scientific.net/amm.658.612},
    project = {ICARUS},
    address = {Iasi, Romania},
    url = {http://mecatron.rma.ac.be/pub/2014/Designing Search and Rescue robots towards realistic user requirements - full article -v3.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, H. Balta, D. Doroftei, and Y. Baudoin, “UAS deployment and data processing during the Balkans flooding," in 2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014), Toyako-cho, Hokkaido, Japan, 2014, p. 1–4.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which had been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied in the field in order to support the relief workers and minimize human suffering.

    @InProceedings{de2014uas,
    author = {De Cubber, Geert and Balta, Haris and Doroftei, Daniela and Baudoin, Yvan},
    booktitle = {2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)},
    title = {{UAS} deployment and data processing during the Balkans flooding},
    year = {2014},
    organization = {IEEE},
    pages = {1--4},
    abstract = {This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which had been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied in the field in order to support the relief workers and minimize human suffering.},
    doi = {10.1109/ssrr.2014.7017670},
    project = {ICARUS},
    address = {Toyako-cho, Hokkaido, Japan},
    url = {http://mecatron.rma.ac.be/pub/2014/SSRR2014_proj_037.pdf},
    unit= {meca-ras}
    }

2013

  • H. Balta, G. De Cubber, D. Doroftei, Y. Baudoin, and H. Sahli, “Terrain traversability analysis for off-road robots using time-of-flight 3d sensing," in 7th IARP International Workshop on Robotics for Risky Environment-Extreme Robotics, Saint-Petersburg, Russia, 2013.
    [BibTeX] [Abstract] [Download PDF]

    In this paper, we present a terrain traversability analysis methodology which classifies all pixels in the TOF image as traversable or not, by estimating for each pixel a traversability score based upon the analysis of the 3D (depth data) and 2D (IR data) content of the TOF camera data. This classification result is then used for the (semi-)autonomous navigation of two robotic systems operating in extreme environments: a search and rescue robot and a humanitarian demining robot. Integrated in an autonomous robot control architecture, terrain traversability classification increases the environmental situational awareness and enables a mobile robot to navigate (semi-)autonomously in an unstructured, dynamic outdoor environment.

    @InProceedings{balta2013terrain,
    author = {Balta, Haris and De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Sahli, Hichem},
    booktitle = {7th IARP International Workshop on Robotics for Risky Environment-Extreme Robotics},
    title = {Terrain traversability analysis for off-road robots using time-of-flight 3d sensing},
    year = {2013},
    abstract = {In this paper, we present a terrain traversability analysis methodology which classifies all pixels in the TOF image as traversable or not, by estimating for each pixel a traversability score based upon the analysis of the 3D (depth data) and 2D (IR data) content of the TOF camera data. This classification result is then used for the (semi-)autonomous navigation of two robotic systems operating in extreme environments: a search and rescue robot and a humanitarian demining robot. Integrated in an autonomous robot control architecture, terrain traversability classification increases the environmental situational awareness and enables a mobile robot to navigate (semi-)autonomously in an unstructured, dynamic outdoor environment.},
    project = {ICARUS},
    address = {Saint-Petersburg, Russia},
    url = {http://mecatron.rma.ac.be/pub/2013/Terrain Traversability Analysis ver 4-HS.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, “The EU-ICARUS project: developing assistive robotic tools for search and rescue operations," in 2013 IEEE international symposium on safety, security, and rescue robotics (SSRR), Linkoping, Sweden, 2013, p. 1–4.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but lifesaving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad-hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I (command, control, communications, computers, and intelligence) equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.

    @InProceedings{de2013eu,
    author = {De Cubber, Geert and Doroftei, Daniela and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},
    booktitle = {2013 IEEE international symposium on safety, security, and rescue robotics (SSRR)},
    title = {The {EU-ICARUS} project: developing assistive robotic tools for search and rescue operations},
    year = {2013},
    organization = {IEEE},
    pages = {1--4},
    address = {Linkoping, Sweden},
    abstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but lifesaving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad-hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I (command, control, communications, computers, and intelligence) equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},
    doi = {10.1109/ssrr.2013.6719323},
    project = {ICARUS},
    url = {http://mecatron.rma.ac.be/pub/2013/SSRR2013_ICARUS.pdf},
    unit= {meca-ras}
    }

  • H. Balta, G. De Cubber, and D. Doroftei, “Increasing Situational Awareness through Outdoor Robot Terrain Traversability Analysis based on Time-Of-Flight Camera," in Spring School on Developmental Robotics and Cognitive Bootstrapping, Athens, Greece, 2013, p. 8.
    [BibTeX] [Abstract]

    Poster paper

    @InCollection{balta2013increasing,
    author = {Balta, Haris and De Cubber, Geert and Doroftei, Daniela},
    booktitle = {Spring School on Developmental Robotics and Cognitive Bootstrapping},
    title = {Increasing Situational Awareness through Outdoor Robot Terrain Traversability Analysis based on Time-Of-Flight Camera},
    year = {2013},
    number = {Developmental Robotics and Cognitive Bootstrapping},
    pages = {8},
    abstract = {Poster paper},
    address = {Athens, Greece},
    project = {ICARUS},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Serrano, K. Berns, K. Chintamani, R. Sabino, S. Ourevitch, D. Doroftei, C. Armbrust, T. Flamma, and Y. Baudoin, “Search and rescue robots developed by the European ICARUS project," in 7th Int Workshop on Robotics for Risky Environments, Saint-Petersburg, Russia, 2013.
    [BibTeX] [Abstract] [Download PDF]

    This paper discusses the efforts of the European ICARUS project towards the development of unmanned search and rescue (SAR) robots. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of remotely operated SAR tools, to increase the situational awareness of human crisis managers. In the event of large crises, a primordial task of the fire and rescue services is the search for human survivors on the incident site, which is a complex and dangerous task. The introduction of remotely operated SAR devices can offer a valuable tool to save human lives and to speed up the SAR process. Therefore, ICARUS concentrates on the development of unmanned SAR technologies for detecting, locating and rescuing humans. The remotely operated SAR devices are foreseen to be the first explorers of the area, along with in-situ supporters to act as safeguards for human personnel. While the ICARUS project also considers the development of marine and aerial robots, this paper mostly concentrates on the development of the unmanned ground vehicles (UGVs) for SAR. Two main UGV platforms are being developed within the context of the project: a large UGV with a powerful manipulator arm, able to make structural changes in disaster scenarios, and a small UGV used for entering small enclosures while searching for human survivors. The large UGV also serves as a base platform for the small UGV (and possibly also a UAV). In order not to increase the cognitive load of the human crisis managers, the SAR robots are designed to navigate individually or cooperatively and to follow high-level instructions from the base station, being able to navigate in an autonomous and semi-autonomous manner. The robots connect to the base station and to each other using a wireless self-organizing cognitive network of mobile communication nodes which adapts to the terrain. The SAR robots are equipped with sensors that detect the presence of humans, as well as a wide array of other types of sensors. At the base station, the data is processed and combined with geographical information, thus enhancing the situational awareness of the personnel leading the operation with in-situ processed data that can improve decision-making.

    @InProceedings{de2013search,
    author = {De Cubber, Geert and Serrano, Daniel and Berns, Karsten and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane and Doroftei, Daniela and Armbrust, Christopher and Flamma, Tommasso and Baudoin, Yvan},
    booktitle = {7th Int Workshop on Robotics for Risky Environments},
    title = {Search and rescue robots developed by the {European} {ICARUS} project},
    year = {2013},
    abstract = {This paper discusses the efforts of the European ICARUS project towards the development of unmanned search and rescue (SAR) robots. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of remotely operated SAR tools, to increase the situational awareness of human crisis managers. In the event of large crises, a primordial task of the fire and rescue services is the search for human survivors on the incident site, which is a complex and dangerous task. The introduction of remotely operated SAR devices can offer a valuable tool to save human lives and to speed up the SAR process. Therefore, ICARUS concentrates on the development of unmanned SAR technologies for detecting, locating and rescuing humans. The remotely operated SAR devices are foreseen to be the first explorers of the area, along with in-situ supporters to act as safeguards for human personnel. While the ICARUS project also considers the development of marine and aerial robots, this paper mostly concentrates on the development of the unmanned ground vehicles (UGVs) for SAR. Two main UGV platforms are being developed within the context of the project: a large UGV with a powerful manipulator arm, able to make structural changes in disaster scenarios, and a small UGV used for entering small enclosures while searching for human survivors. The large UGV also serves as a base platform for the small UGV (and possibly also a UAV). In order not to increase the cognitive load of the human crisis managers, the SAR robots are designed to navigate individually or cooperatively and to follow high-level instructions from the base station, being able to navigate in an autonomous and semi-autonomous manner. The robots connect to the base station and to each other using a wireless self-organizing cognitive network of mobile communication nodes which adapts to the terrain. The SAR robots are equipped with sensors that detect the presence of humans, as well as a wide array of other types of sensors. At the base station, the data is processed and combined with geographical information, thus enhancing the situational awareness of the personnel leading the operation with in-situ processed data that can improve decision-making.},
    project = {ICARUS},
    address = {Saint-Petersburg, Russia},
    url = {http://mecatron.rma.ac.be/pub/2013/Search and Rescue robots developed by the European ICARUS project - Article.pdf},
    unit= {meca-ras}
    }

2012

  • G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, “ICARUS : Providing Unmanned Search and Rescue Tools," in 6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE), Warsaw, Poland, 2012.
    [BibTeX] [Abstract] [Download PDF]

    The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.

    @InProceedings{de2012icarus01,
    author = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},
    booktitle = {6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)},
    title = {{ICARUS} : Providing Unmanned Search and Rescue Tools},
    year = {2012},
    abstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},
    project = {ICARUS},
    address = {Warsaw, Poland},
    url = {http://mecatron.rma.ac.be/pub/2012/RISE2012_ICARUS.pdf},
    unit= {meca-ras}
    }

  • D. Doroftei, G. De Cubber, and K. Chintamani, “Towards collaborative human and robotic rescue workers," in 5th International Workshop on Human-Friendly Robotics (HFR2012), Brussels, Belgium, 2012, p. 18–19.
    [BibTeX] [Abstract] [Download PDF]

    This paper discusses some of the main remaining bottlenecks towards the successful introduction of robotic search and rescue (SAR) tools collaborating with human rescue workers. It also sketches some of the recent advances which are being made in the context of the European ICARUS project to address these bottlenecks.

    @InProceedings{doroftei2012towards,
    author = {Doroftei, Daniela and De Cubber, Geert and Chintamani, Keshav},
    booktitle = {5th International Workshop on Human-Friendly Robotics (HFR2012)},
    title = {Towards collaborative human and robotic rescue workers},
    year = {2012},
    pages = {18--19},
    abstract = {This paper discusses some of the main remaining bottlenecks towards the successful introduction of robotic search and rescue (SAR) tools collaborating with human rescue workers. It also sketches some of the recent advances which are being made in the context of the European ICARUS project to address these bottlenecks.},
    project = {ICARUS},
    address = {Brussels, Belgium},
    url = {http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.303.6697&rep=rep1&type=pdf},
    unit= {meca-ras}
    }

  • A. Conduraru, I. Conduraru, E. Puscalau, G. De Cubber, D. Doroftei, and H. Balta, “Development of an autonomous rough-terrain robot," in IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN’12), Villamoura, Portugal, 2012.
    [BibTeX] [Abstract] [Download PDF]

    In this paper, we discuss the development process of a mobile robot intended for environmental observation applications. The paper describes how a standard tele-operated Explosive Ordnance Disposal (EOD) robot was upgraded with electronics, sensors, computing power and autonomous capabilities, such that it becomes able to execute semi-autonomous missions, e.g. for search & rescue or humanitarian demining tasks. The aim of this paper is not to discuss the details of the navigation algorithms (as these are often task-dependent), but more to concentrate on the development of the platform and its control architecture as a whole.

    @InProceedings{conduraru2012development,
    author = {Conduraru, Alina and Conduraru, Ionel and Puscalau, Emanuel and De Cubber, Geert and Doroftei, Daniela and Balta, Haris},
    booktitle = {IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN'12)},
    title = {Development of an autonomous rough-terrain robot},
    year = {2012},
    abstract = {In this paper, we discuss the development process of a mobile robot intended for environmental observation applications. The paper describes how a standard tele-operated Explosive Ordnance Disposal (EOD) robot was upgraded with electronics, sensors, computing power and autonomous capabilities, such that it becomes able to execute semi-autonomous missions, e.g. for search & rescue or humanitarian demining tasks. The aim of this paper is not to discuss the details of the navigation algorithms (as these are often task-dependent), but more to concentrate on the development of the platform and its control architecture as a whole.},
    project = {ICARUS},
    address = {Villamoura, Portugal},
    url = {https://pdfs.semanticscholar.org/884e/6a80c8768044a1fd68ee91f45f17e5125153.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, “Operational RPAS scenarios envisaged for search & rescue by the EU FP7 ICARUS project," in Remotely Piloted Aircraft Systems for Civil Operations (RPAS2012), Brussels, Belgium, 2012.
    [BibTeX] [Abstract] [Download PDF]

    The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.

    @InProceedings{de2012operational,
    author = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},
    booktitle = {Remotely Piloted Aircraft Systems for Civil Operations (RPAS2012)},
    title = {Operational {RPAS} scenarios envisaged for search \& rescue by the {EU FP7 ICARUS} project},
    year = {2012},
    abstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},
    project = {ICARUS},
    address = {Brussels, Belgium},
    url = {http://mecatron.rma.ac.be/pub/2012/De-Cubber-Geert_RMA_Belgium_WP.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, “ICARUS: An EU-FP7 Project Providing Unmanned Search and Rescue Tools," in IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN’12), Villamoura, Portugal, 2012.
    [BibTeX] [Abstract] [Download PDF]

    Overview of the objectives of the ICARUS project

    @InProceedings{de2012icarus02,
    author = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Y and Serrano, D and Chintamani, K and Sabino, R and Ourevitch, S},
    booktitle = {IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN'12)},
    title = {{ICARUS}: An {EU-FP7} Project Providing Unmanned Search and Rescue Tools},
    year = {2012},
    abstract = {Overview of the objectives of the ICARUS project},
    project = {ICARUS},
    address = {Villamoura, Portugal},
    url = {http://mecatron.rma.ac.be/pub/2012/Icarus - ROSIN2012 Presentation.pdf},
    unit= {meca-ras}
    }

2011

  • G. De Cubber, D. Doroftei, H. Sahli, and Y. Baudoin, “Outdoor Terrain Traversability Analysis for Robot Navigation using a Time-Of-Flight Camera," in RGB-D Workshop on 3D Perception in Robotics, Vasteras, Sweden, 2011.
    [BibTeX] [Abstract] [Download PDF]

    Autonomous robotic systems operating in unstructured outdoor environments need to estimate the traversability of the terrain in order to navigate safely. Traversability estimation is a challenging problem, as the traversability is a complex function of both the terrain characteristics, such as slopes, vegetation and rocks, and the robot mobility characteristics, i.e., the locomotion method, wheels, etc. It is thus required to analyze the 3D characteristics of the terrain in real time and pair this data with the robot's capabilities. In this paper, a method is introduced to estimate the traversability using data from a time-of-flight camera.

    @InProceedings{de2011outdoor,
    author = {De Cubber, Geert and Doroftei, Daniela and Sahli, Hichem and Baudoin, Yvan},
    booktitle = {RGB-D Workshop on 3D Perception in Robotics},
    title = {Outdoor Terrain Traversability Analysis for Robot Navigation using a Time-Of-Flight Camera},
    year = {2011},
    abstract = {Autonomous robotic systems operating in unstructured outdoor environments need to estimate the traversability of the terrain in order to navigate safely. Traversability estimation is a challenging problem, as the traversability is a complex function of both the terrain characteristics, such as slopes, vegetation and rocks, and the robot mobility characteristics, i.e., the locomotion method, wheels, etc. It is thus required to analyze the 3D characteristics of the terrain in real time and pair this data with the robot's capabilities. In this paper, a method is introduced to estimate the traversability using data from a time-of-flight camera.},
    project = {ViewFinder, Mobiniss},
    address = {Vasteras, Sweden},
    url = {http://mecatron.rma.ac.be/pub/2011/TTA_TOF.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber and D. Doroftei, “Multimodal terrain analysis for an all-terrain crisis management robot," in IARP HUDEM 2011, Sibenik, Croatia, 2011.
    [BibTeX] [Abstract] [Download PDF]

    In this paper, a novel stereo-based terrain-traversability estimation methodology is proposed. The novelty is that, in contrast to classic depth-based terrain classification algorithms, all the information of the stereo camera system is used, including the color information. Using this approach, depth and color information are fused in order to obtain a higher classification accuracy than is possible with uni-modal techniques.

    @InProceedings{de2011multimodal,
    author = {De Cubber, Geert and Doroftei, Daniela},
    booktitle = {IARP HUDEM 2011},
    title = {Multimodal terrain analysis for an all-terrain crisis management robot},
    year = {2011},
    abstract = {In this paper, a novel stereo-based terrain-traversability estimation methodology is proposed. The novelty is that, in contrast to classic depth-based terrain classification algorithms, all the information of the stereo camera system is used, including the color information. Using this approach, depth and color information are fused in order to obtain a higher classification accuracy than is possible with uni-modal techniques.},
    project = {Mobiniss},
    address = {Sibenik, Croatia},
    url = {http://mecatron.rma.ac.be/pub/2011/Multimodal terrain analysis for an all-terrain crisis management robot .pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, K. Verbiest, and S. A. Berrabah, “Autonomous camp surveillance with the ROBUDEM robot: challenges and results," in IARP Workshop RISE’2011, Belgium, 2011.
    [BibTeX] [Abstract] [Download PDF]

    Autonomous robotic systems can help for risky interventions to reduce the risk to human lives. An example of such a risky intervention is a camp surveillance scenario, where an environment needs to be patrolled and intruders need to be detected and intercepted. This paper describes the development of a mobile outdoor robot which is capable of performing such a camp surveillance task. The key research issues tackled are the robot design, geo-referenced localization and path planning, traversability estimation, the optimization of the terrain coverage strategy and the development of an intuitive human-robot interface.

    @InProceedings{de2011autonomous,
    author = {De Cubber, Geert and Doroftei, Daniela and Verbiest, Kristel and Berrabah, Sid Ahmed},
    booktitle = {IARP Workshop RISE’2011},
    title = {Autonomous camp surveillance with the {ROBUDEM} robot: challenges and results},
    year = {2011},
    abstract = {Autonomous robotic systems can help for risky interventions to reduce the risk to human lives. An example of such a risky intervention is a camp surveillance scenario, where an environment needs to be patrolled and intruders need to be detected and intercepted. This paper describes the development of a mobile outdoor robot which is capable of performing such a camp surveillance task. The key research issues tackled are the robot design, geo-referenced localization and path planning, traversability estimation, the optimization of the terrain coverage strategy and the development of an intuitive human-robot interface.},
    project = {Mobiniss},
    address = {Belgium},
    url = {http://mecatron.rma.ac.be/pub/2011/ELROB-RISE.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber and D. Doroftei, “Using Robots in Hazardous Environments: Landmine Detection, de-Mining and Other Applications," in Using Robots in Hazardous Environments: Landmine Detection, De-Mining and Other Applications, Y. Baudoin and M. Habib, Eds., Woodhead Publishing, 2011, vol. 1, p. 476–498.
    [BibTeX] [Abstract] [Download PDF]

    This chapter presents three main aspects of the development of a crisis management robot. First, we present an approach for robust victim detection in difficult outdoor conditions. Second, we present an approach where the terrain is classified into the classes traversable and obstacle, using only stereo vision as input data. Lastly, we present a behavior-based control architecture, enabling a robot to search for human victims on an incident site while navigating semi-autonomously, using stereo vision as the main source of sensor information.

    @InBook{de2010human,
    author = {De Cubber, Geert and Doroftei, Daniela},
    editor = {Baudoin, Yvan and Habib, Maki},
    chapter = {Chapter 20},
    pages = {476--498},
    publisher = {Woodhead Publishing},
    title = {Using Robots in Hazardous Environments: Landmine Detection, de-Mining and Other Applications},
    year = {2011},
    isbn = {1845697863},
    volume = {1},
    abstract = {This chapter presents three main aspects of the development of a crisis management robot. First, we present an approach for robust victim detection in difficult outdoor conditions. Second, we present an approach where the terrain is classified into the classes traversable and obstacle, using only stereo vision as input data. Lastly, we present a behavior-based control architecture, enabling a robot to search for human victims on an incident site while navigating semi-autonomously, using stereo vision as the main source of sensor information.},
    booktitle = {Using Robots in Hazardous Environments: Landmine Detection, De-Mining and Other Applications},
    date = {2011-01-11},
    ean = {9781845697860},
    pagetotal = {665},
    project = {Mobiniss},
    url = {http://mecatron.rma.ac.be/pub/2009/Handbook Chapter 4 - Human Victim Detection and Stereo-based Terrain Traversability Analysis for Behavior-Based Robot Navigation.pdf},
    unit= {meca-ras}
    }

  • D. Doroftei and E. Colon, “Decentralized multi-robot coordination for a risky surveillance application," in Proc. IARP HUDEM 2011, Sibenik, Croatia, 2011.
    [BibTeX] [Abstract] [Download PDF]

    This paper proposes a multi-robot control methodology that is based on a behavior-based control framework. In this behavior-based context, the robotic team members are controlled using one of two mutually exclusive behaviors: patrolling or intercepting. In patrol mode, the robot seeks to detect enemy forces as rapidly as possible by balancing two constraints: the intervention time should be minimized and the map coverage should be maximized. In interception mode, the robot tries to advance towards an enemy which was detected by one of the robotic team members. Subsequently, the robot tries to neutralize the threat posed by the enemy before the enemy is able to reach the camp.

    @InProceedings{doro2011decentralized,
    author = {Doroftei, Daniela and Colon, Eric},
    booktitle = {Proc. {IARP} {HUDEM} 2011},
    title = {Decentralized multi-robot coordination for a risky surveillance application},
    year = {2011},
    publisher = {{IARP}},
    abstract = {This paper proposes a multi-robot control methodology that is based on a behavior-based control framework. In this behavior-based context, the robotic team members are controlled using one of two mutually exclusive behaviors: patrolling or intercepting. In patrol mode, the robot seeks to detect enemy forces as rapidly as possible by balancing two constraints: the intervention time should be minimized and the map coverage should be maximized. In interception mode, the robot tries to advance towards an enemy which was detected by one of the robotic team members. Subsequently, the robot tries to neutralize the threat posed by the enemy before the enemy is able to reach the camp.},
    project = {NMRS},
    address = {Sibenik, Croatia},
    url = {http://mecatron.rma.ac.be/pub/2011/HUDEM2011_Doroftei_Colon.pdf},
    unit= {meca-ras}
    }

2010

  • G. De Cubber, S. A. Berrabah, D. Doroftei, Y. Baudoin, and H. Sahli, “Combining Dense Structure from Motion and Visual SLAM in a Behavior-Based Robot Control Architecture," International Journal of Advanced Robotic Systems, vol. 7, iss. 1, 2010.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    In this paper, we present a control architecture for an intelligent outdoor mobile robot. This enables the robot to navigate in a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features. This information enables a behavior-based robot motion and path planner to navigate the robot through the environment. In this paper, we show the theoretical aspects of setting up this architecture.

    @Article{de2010combining,
    author = {De Cubber, Geert and Sid Ahmed Berrabah and Daniela Doroftei and Yvan Baudoin and Hichem Sahli},
    journal = {International Journal of Advanced Robotic Systems},
    title = {Combining Dense Structure from Motion and Visual {SLAM} in a Behavior-Based Robot Control Architecture},
    year = {2010},
    month = mar,
    number = {1},
    volume = {7},
    abstract = {In this paper, we present a control architecture for an intelligent outdoor mobile robot. This enables the robot to navigate in a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features. This information enables a behavior-based robot motion and path planner to navigate the robot through the environment. In this paper, we show the theoretical aspects of setting up this architecture.},
    doi = {10.5772/7240},
    publisher = {{SAGE} Publications},
    project = {ViewFinder, Mobiniss},
    url = {http://mecatron.rma.ac.be/pub/2010/e_from_motion_and_visual_slam_in_a_behavior-based_robot_control_architecture.pdf},
    unit= {meca-ras,vub-etro}
    }

  • Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, E. Colon, C. Pinzon, A. Maslowski, J. Bedkowski, and J. Penders, “VIEW-FINDER: Robotics Assistance to fire-Fighting services," in Mobile Robotics: Solutions and Challenges, 2010, p. 397–406.
    [BibTeX] [Abstract] [Download PDF]

    This paper presents an overview of the View-Finder project

    @InCollection{baudoin2010view,
    author = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz and Penders, Jacques},
    booktitle = {Mobile Robotics: Solutions and Challenges},
    title = {{VIEW-FINDER}: Robotics Assistance to fire-Fighting services},
    year = {2010},
    pages = {397--406},
    abstract = {This paper presents an overview of the View-Finder project},
    project = {ViewFinder},
    unit= {meca-ras},
    url = {https://books.google.be/books?id=zcfFCgAAQBAJ&pg=PA397&lpg=PA397&dq=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&source=bl&ots=Jh6P63OKCr&sig=O1GPy_c42NPSEdO8Hb_pa9V6K7g&hl=en&sa=X&ved=2ahUKEwiLr76B-5zfAhUMCewKHQS_Af0Q6AEwDXoECAEQAQ#v=onepage&q=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&f=false},
    }

  • Y. Baudoin, G. De Cubber, E. Colon, D. Doroftei, and S. A. Berrabah, “Robotics Assistance by Risky Interventions: Needs and Realistic Solutions," in Workshop on Robotics for Extreme conditions, Saint-Petersburg, Russia, 2010.
    [BibTeX] [Abstract] [Download PDF]

    This paper discusses the requirements for robotic systems in the domains of firefighting, CBRN-E and humanitarian demining.

    @InProceedings{baudoin2010robotics,
    author = {Baudoin, Yvan and De Cubber, Geert and Colon, Eric and Doroftei, Daniela and Berrabah, Sid Ahmed},
    booktitle = {Workshop on Robotics for Extreme conditions},
    title = {Robotics Assistance by Risky Interventions: Needs and Realistic Solutions},
    year = {2010},
    abstract = {This paper discusses the requirements for robotic systems in the domains of firefighting, CBRN-E and humanitarian demining.},
    project = {ViewFinder, Mobiniss},
    address = {Saint-Petersburg, Russia},
    url = {http://mecatron.rma.ac.be/pub/2010/Robotics Assistance by risky interventions.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, and S. A. Berrabah, “Using visual perception for controlling an outdoor robot in a crisis management scenario," in ROBOTICS 2010, Clermont-Ferrand, France, 2010.
    [BibTeX] [Abstract] [Download PDF]

    Crisis management teams (e.g. fire and rescue services, anti-terrorist units …) are often confronted with dramatic situations where critical decisions have to be made within hard time constraints. Therefore, they need correct information about what is happening on the crisis site. In this context, the View-Finder project aims at developing robots which can assist the human crisis managers, by gathering data. This paper gives an overview of the development of such an outdoor robot. The presented robotic system is able to detect human victims at the incident site, by using vision-based human body shape detection. To increase the perceptual awareness of the human crisis managers, the robotic system is capable of reconstructing a 3D model of the environment, based on vision data. Also for navigation, the robot depends mostly on visual perception, as it combines a model-based navigation approach using geo-referenced positioning with stereo-based terrain traversability analysis for obstacle avoidance. The robot control scheme is embedded in a behavior-based robot control architecture, which integrates all the robot capabilities. This paper discusses all the above mentioned technologies.

    @InProceedings{de2010using,
    author = {De Cubber, Geert and Doroftei, Daniela and Berrabah, Sid Ahmed},
    booktitle = {ROBOTICS 2010},
    title = {Using visual perception for controlling an outdoor robot in a crisis management scenario},
    year = {2010},
    abstract = {Crisis management teams (e.g. fire and rescue services, anti-terrorist units ...) are often confronted with dramatic situations where critical decisions have to be made within hard time constraints. Therefore, they need correct information about what is happening on the crisis site. In this context, the View-Finder project aims at developing robots which can assist the human crisis managers, by gathering data. This paper gives an overview of the development of such an outdoor robot. The presented robotic system is able to detect human victims at the incident site, by using vision-based human body shape detection. To increase the perceptual awareness of the human crisis managers, the robotic system is capable of reconstructing a 3D model of the environment, based on vision data. Also for navigation, the robot depends mostly on visual perception, as it combines a model-based navigation approach using geo-referenced positioning with stereo-based terrain traversability analysis for obstacle avoidance. The robot control scheme is embedded in a behavior-based robot control architecture, which integrates all the robot capabilities. This paper discusses all the above mentioned technologies.},
    project = {ViewFinder, Mobiniss},
    address = {Clermont-Ferrand, France},
    unit= {meca-ras},
    url = {http://mecatron.rma.ac.be/pub/2010/Usingvisualperceptionforcontrollinganoutdoorrobotinacrisismanagementscenario (1).pdf},
    }

  • D. Doroftei and E. Colon, “Decentralized Multi-Robot Coordination in an Urban Environment," European Journal of Mechanical and Environmental Engineering, vol. 1, 2010.
    [BibTeX] [Abstract] [Download PDF]

    In this paper, a novel control strategy is presented for multi‐robot coordination. An important aspect of the presented control architecture is that it is formulated in a decentralized context. This means that the robots cannot rely on traditional global path planning algorithms for navigation. The presented approach casts the multi‐robot control problem as a behavior‐based control problem.

    @Article{doro2010decentralized,
    author = {Doroftei, Daniela and Colon, Eric},
    journal = {European Journal of Mechanical and Environmental Engineering},
    title = {Decentralized Multi-Robot Coordination in an Urban Environment},
    year = {2010},
    volume = {1},
    abstract = {In this paper, a novel control strategy is presented for multi‐robot coordination. An important aspect of the presented control architecture is that it is formulated in a decentralized context. This means that the robots cannot rely on traditional global path planning algorithms for navigation. The presented approach casts the multi‐robot control problem as a behavior‐based control problem. },
    project = {NMRS},
    address = {Sheffield, UK},
    url = {http://mecatron.rma.ac.be/pub/2010/EJMEE2010_doroftei_colon.pdf},
    unit= {meca-ras}
    }
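
    To make the decentralised, behaviour-based formulation concrete, the following minimal Python sketch lets each robot derive a steering direction purely from its own position, the positions of nearby team mates and a goal; the behaviours, weights and radii are illustrative assumptions, not the controller from the paper.

    import numpy as np

    def steering(robot_pos, neighbour_pos, goal_pos,
                 w_goal=1.0, w_sep=1.5, w_coh=0.3, sep_radius=2.0):
        """Purely local behaviour fusion: combine a goal-seeking, a separation
        and a cohesion vector computed from this robot's own position and the
        positions of nearby team mates only (no global path planner)."""
        to_goal = goal_pos - robot_pos
        sep = np.zeros(2)
        coh = np.zeros(2)
        if len(neighbour_pos):
            nbrs = np.asarray(neighbour_pos, dtype=float)
            diffs = robot_pos - nbrs
            dists = np.linalg.norm(diffs, axis=1)
            close = dists < sep_radius
            if close.any():
                # Push away from close team mates, more strongly when nearer.
                sep = (diffs[close] / dists[close, None] ** 2).sum(axis=0)
            coh = nbrs.mean(axis=0) - robot_pos
        v = w_goal * to_goal + w_sep * sep + w_coh * coh
        norm = np.linalg.norm(v)
        return v / norm if norm > 1e-9 else v  # unit steering direction

    if __name__ == "__main__":
        me = np.array([0.0, 0.0])
        team = [np.array([0.5, 0.2]), np.array([3.0, -1.0])]
        print(steering(me, team, goal_pos=np.array([10.0, 5.0])))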

  • D. Doroftei and E. Colon, “Multi-robot collaboration and coordination in a high-risk transportation scenario," in Proc. IARP HUDEM 2010, Sousse, Tunisia, 2010.
    [BibTeX] [Abstract] [Download PDF]

    This paper discusses a decentralized multi-robot coordination strategy which aims to control and guide a team of robotic agents safely through a hostile area. The “hostility” of the environment is due to the presence of enemy forces, seeking to intercept the robotic team. In order to avoid detection and ensure global team safety, the robotic agents must carefully plan their trajectory towards a list of goal locations, while holding a defensive formation.

    @InProceedings{doro2001multi,
    author = {Doroftei, Daniela and Colon, Eric},
    booktitle = {Proc. {IARP} {HUDEM} 2010},
    title = {Multi-robot collaboration and coordination in a high-risk transportation scenario},
    year = {2010},
    publisher = {{IARP}},
    abstract = {This paper discusses a decentralized multi-robot coordination strategy which aims to control and guide a team of robotic agents safely through a hostile area. The “hostility” of the environment is due to the presence of enemy forces, seeking to intercept the robotic team. In order to avoid detection and ensure global team safety, the robotic agents must carefully plan their trajectory towards a list of goal locations, while holding a defensive formation.},
    project = {NMRS},
    address = {Sousse, Tunisia},
    url = {http://mecatron.rma.ac.be/pub/HUDEM/HUDEM%20-%202010/HUDEM2010_Doroftei.pdf},
    unit= {meca-ras}
    }

  • D. Doroftei and E. Colon, “Decentralized Multi-Robot Coordination for Risky Interventions," in Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance RISE, Sheffield, UK, 2010.
    [BibTeX] [Abstract] [Download PDF]

    The paper describes an approach to design a behavior-based architecture, how each behavior was designed and how the behavior fusion problem was solved.

    @InProceedings{doro2010multibis,
    author = {Doroftei, Daniela and Colon, Eric},
    booktitle = {Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance {RISE}},
    title = {Decentralized Multi-Robot Coordination for Risky Interventions},
    year = {2010},
    abstract = {The paper describes an approach to design a behavior-based architecture, how each behavior was designed and how the behavior fusion problem was solved.},
    project = {NMRS, ViewFinder},
    address = {Sheffield, UK},
    url = {http://mecatron.rma.ac.be/pub/RISE/RISE%20-%202010/Decentralized%20Multi-Robot%20Coordination%20for%20Risky%20Interventio.pdf},
    unit= {meca-ras}
    }

2009

  • G. De Cubber, D. Doroftei, L. Nalpantidis, G. C. Sirakoulis, and A. Gasteratos, “Stereo-based terrain traversability analysis for robot navigation," in IARP/EURON Workshop on Robotics for Risky Interventions and Environmental Surveillance, Brussels, Belgium, 2009.
    [BibTeX] [Abstract] [Download PDF]

    In this paper, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data.

    @InProceedings{de2009stereo,
    author = {De Cubber, Geert and Doroftei, Daniela and Nalpantidis, Lazaros and Sirakoulis, Georgios Ch and Gasteratos, Antonios},
    booktitle = {IARP/EURON Workshop on Robotics for Risky Interventions and Environmental Surveillance},
    title = {Stereo-based terrain traversability analysis for robot navigation},
    year = {2009},
    abstract = {In this paper, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data.},
    project = {ViewFinder, Mobiniss},
    address = {Brussels, Belgium},
    url = {http://mecatron.rma.ac.be/pub/2009/RISE-DECUBBER-DUTH.pdf},
    unit= {meca-ras}
    }
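
    As a rough illustration of such a traversable/obstacle classification, the sketch below bins a stereo-derived 3D point cloud into ground cells and flags any cell whose height spread exceeds a step threshold; the cell size and threshold are assumed values, and the paper's actual classifier may work quite differently.

    import numpy as np

    def traversability_grid(points_xyz, cell_m=0.25, max_step_m=0.15):
        """Bin a stereo-derived point cloud (x forward, y left, z up, in metres)
        into ground cells and label a cell as an obstacle when the height spread
        inside it exceeds a step threshold."""
        pts = np.asarray(points_xyz, dtype=float)
        cells = np.floor(pts[:, :2] / cell_m).astype(int)
        height_range = {}
        for cell, z in zip(map(tuple, cells), pts[:, 2]):
            zmin, zmax = height_range.get(cell, (z, z))
            height_range[cell] = (min(zmin, z), max(zmax, z))
        return {cell: ("obstacle" if zmax - zmin > max_step_m else "traversable")
                for cell, (zmin, zmax) in height_range.items()}

    if __name__ == "__main__":
        flat = np.c_[np.random.rand(200, 2) * 5.0, np.random.rand(200) * 0.02]
        rock = np.array([[2.10, 2.10, 0.0], [2.15, 2.12, 0.4]])  # a tall feature
        labels = traversability_grid(np.vstack([flat, rock]))
        print(sum(v == "obstacle" for v in labels.values()), "obstacle cell(s)")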

  • D. Doroftei, G. De Cubber, E. Colon, and Y. Baudoin, “Behavior based control for an outdoor crisis management robot," in Proceedings of the IARP International Workshop on Robotics for Risky Interventions and Environmental Surveillance, Brussels, Belgium, 2009, p. 12–14.
    [BibTeX] [Abstract] [Download PDF]

    The design and development of a control architecture for a robotic crisis management agent raises 3 main questions: 1. How can we design the individual behaviors, such that the robot is capable of avoiding obstacles and of navigating semi-autonomously? 2. How can these individual behaviors be combined in an optimal way, leading to a rational and coherent global robot behavior? 3. How can all these capabilities be combined in a comprehensive and modular framework, such that the robot can handle a high-level task (searching for human victims) with minimal input from human operators, by navigating in a complex and dynamic environment, while avoiding potentially hazardous obstacles? In this paper, we present each of these three main aspects of the general robot control architecture in more detail.

    @InProceedings{doroftei2009behavior,
    author = {Doroftei, Daniela and De Cubber, Geert and Colon, Eric and Baudoin, Yvan},
    booktitle = {Proceedings of the IARP International Workshop on Robotics for Risky Interventions and Environmental Surveillance},
    title = {Behavior based control for an outdoor crisis management robot},
    year = {2009},
    pages = {12--14},
    abstract = {The design and development of a control architecture for a robotic crisis management agent raises 3 main questions:
    1. How can we design the individual behaviors, such that the robot is capable of avoiding obstacles and of navigating semi-autonomously?
    2. How can these individual behaviors be combined in an optimal way, leading to a rational and coherent global robot behavior?
    3. How can all these capabilities be combined in a comprehensive and modular framework, such that the robot can handle a high-level task (searching for human victims) with minimal input from human operators, by navigating in a complex and dynamic environment, while avoiding potentially hazardous obstacles?
    In this paper, we present each of these three main aspects of the general robot control architecture in more detail.},
    project = {ViewFinder, Mobiniss},
    address = {Brussels, Belgium},
    url = {http://mecatron.rma.ac.be/pub/2009/RISE-DOROFTEI.pdf},
    unit= {meca-ras}
    }
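
    For readers unfamiliar with behaviour fusion, the toy sketch below shows one common scheme, a relevance-weighted average of per-behaviour velocity commands; the behaviours and gains are invented for illustration and do not reproduce the architecture described in the paper.

    import math

    def avoid_obstacle(nearest_obstacle_dist_m, bearing_rad):
        """Turn away from the nearest obstacle; relevance grows as it gets closer."""
        relevance = max(0.0, 1.0 - nearest_obstacle_dist_m / 3.0)
        command = (0.2, -math.copysign(1.0, bearing_rad))  # slow down, turn away
        return command, relevance

    def go_to_goal(goal_bearing_rad):
        """Drive forward while steering towards the goal bearing."""
        return (0.6, 0.8 * goal_bearing_rad), 1.0

    def fuse(commands_with_weights):
        """Relevance-weighted average of (v, omega) commands."""
        total = sum(w for _, w in commands_with_weights) or 1.0
        v = sum(c[0] * w for c, w in commands_with_weights) / total
        omega = sum(c[1] * w for c, w in commands_with_weights) / total
        return v, omega

    if __name__ == "__main__":
        cmds = [avoid_obstacle(1.0, bearing_rad=0.3), go_to_goal(goal_bearing_rad=-0.5)]
        print(fuse(cmds))  # fused (translational, rotational) velocity command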

  • Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, C. Pinzon, F. Warlet, J. Gancet, E. Motard, M. Ilzkovitz, L. Nalpantidis, and A. Gasteratos, “VIEW-FINDER : Robotics assistance to fire-fighting services and Crisis Management," in 2009 IEEE International Workshop on Safety, Security & Rescue Robotics (SSRR 2009), Denver, USA, 2009, p. 1–6.
    [BibTeX] [Abstract] [Download PDF] [DOI]

    In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the Base Station (BS) the data is processed and combined with geographical information originating from a Web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. This paper will focus on the Crisis Management Information System that has been developed for improving a Disaster Management Action Plan and for linking the Control Station with a out-site Crisis Management Centre, and on the software tools implemented on the mobile robot gathering data in the outdoor area of the crisis.

    @InProceedings{Baudoin2009view01,
    author = {Y. Baudoin and D. Doroftei and G. De Cubber and S. A. Berrabah and C. Pinzon and F. Warlet and J. Gancet and E. Motard and M. Ilzkovitz and L. Nalpantidis and A. Gasteratos},
    booktitle = {2009 {IEEE} International Workshop on Safety, Security {\&} Rescue Robotics ({SSRR} 2009)},
    title = {{VIEW}-{FINDER} : Robotics assistance to fire-fighting services and Crisis Management},
    year = {2009},
    month = nov,
    organization = {IEEE},
    pages = {1--6},
    publisher = {{IEEE}},
    abstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the Base Station (BS) the data is processed and combined with geographical information originating from a Web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. This paper will focus on the Crisis Management Information System that has been developed for improving a Disaster Management Action Plan and for linking the Control Station with a out-site Crisis Management Centre, and on the software tools implemented on the mobile robot gathering data in the outdoor area of the crisis.},
    doi = {10.1109/ssrr.2009.5424172},
    project = {ViewFinder},
    address = {Denver, USA},
    url = {https://ieeexplore.ieee.org/document/5424172},
    unit= {meca-ras}
    }

  • Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, C. Pinzon, J. Penders, A. Maslowski, and J. Bedkowski, “VIEW-FINDER : Outdoor Robotics Assistance to Fire-Fighting services," in International Symposium Clawar, Istanbul, Turkey, 2009.
    [BibTeX] [Abstract] [Download PDF]

    In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station. The robots are off-theshelf units, consisting of wheeled robots. The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.

    @InProceedings{baudoin2009view02,
    author = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Pinzon, Carlos and Penders, Jacques and Maslowski, Andrzej and Bedkowski, Janusz},
    booktitle = {International Symposium Clawar},
    title = {{VIEW-FINDER} : Outdoor Robotics Assistance to Fire-Fighting services},
    year = {2009},
    abstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station. The robots are off-theshelf units, consisting of wheeled robots. The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It
    will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.},
    project = {ViewFinder, Mobiniss},
    address = {Istanbul, Turkey},
    url = {http://mecatron.rma.ac.be/pub/2009/CLAWAR2009.pdf},
    unit= {meca-ras}
    }

  • Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, E. Colon, C. Pinzon, A. Maslowski, and J. Bedkowski, “View-Finder: a European project aiming the Robotics assistance to Fire-fighting services and Crisis Management," in IARP workshop on Service Robotics and Nanorobotics, Beijing, China, 2009.
    [BibTeX] [Abstract] [Download PDF]

    In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command. We’ll essentially focus in this paper to the steps entrusted to the RMA and PIAP through the work-packages of the project.

    @InProceedings{baudoin2009view03,
    author = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz},
    booktitle = {IARP workshop on Service Robotics and Nanorobotics},
    title = {{View-Finder}: a European project aiming the Robotics assistance to Fire-fighting services and Crisis Management},
    year = {2009},
    abstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.
    We’ll essentially focus in this paper to the steps entrusted to the RMA and PIAP through the work-packages of the project.},
    project = {ViewFinder},
    address = {Beijing, China},
    url = {http://mecatron.rma.ac.be/pub/2009/IARP-paper2009.pdf},
    unit= {meca-ras}
    }

  • Y. Baudoin, G. De Cubber, S. A. Berrabah, D. Doroftei, E. Colon, C. Pinzon, A. Maslowski, and J. Bedkowski, “VIEW-FINDER: European Project Aiming CRISIS MANAGEMENT TOOLS and the Robotics Assistance to Fire-Fighting Services," in IARP WS on service Robotics, Beijing, China, 2009.
    [BibTeX] [Abstract] [Download PDF]

    Overview of the View-Finder project

    @InProceedings{baudoin2009view04,
    author = {Baudoin, Yvan and De Cubber, Geert and Berrabah, Sid Ahmed and Doroftei, Daniela and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz},
    booktitle = {IARP WS on service Robotics},
    title = {{VIEW-FINDER}: European Project Aiming CRISIS MANAGEMENT TOOLS and the Robotics Assistance to Fire-Fighting Services},
    year = {2009},
    abstract = {Overview of the View-Finder project},
    project = {ViewFinder},
    address = {Beijing, China},
    unit= {meca-ras},
    url = {https://www.academia.edu/2879650/VIEW-FINDER_European_Project_Aiming_CRISIS_MANAGEMENT_TOOLS_and_the_Robotics_Assistance_to_Fire-Fighting_Services},
    }

  • D. Doroftei, E. Colon, Y. Baudoin, and H. Sahli, “Development of a behaviour-based control and software architecture for a visually guided mine detection robot," European Journal of Automated Systems (JESA), vol. 43, iss. 3, p. 295–314, 2009.
    [BibTeX] [Abstract] [Download PDF]

    Humanitarian demining is a labor-intensive and high-risk operation which could benefit from the development of a humanitarian mine detection robot, capable of scanning a minefield semi-automatically. The design of such an outdoor autonomous robot requires the consideration and integration of multiple aspects: sensing, data fusion, path and motion planning and robot control embedded in a control and software architecture. This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.

    @Article{doro2009development,
    author = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan and Sahli, Hichem},
    journal = {European Journal of Automated Systems ({JESA})},
    title = {Development of a behaviour-based control and software architecture for a visually guided mine detection robot},
    year = {2009},
    volume = {43},
    number = {3},
    abstract = {Humanitarian demining is a labor-intensive and high-risk operation which could benefit from the development of a humanitarian mine detection robot, capable of scanning a minefield semi-automatically. The design of such an outdoor autonomous robot requires the consideration and integration of multiple aspects: sensing, data fusion, path and motion planning and robot control embedded in a control and software architecture. This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.},
    pages = {295--314},
    project = {Mobiniss, ViewFinder},
    url = {http://mecatron.rma.ac.be/pub/2009/doc-article-hermes.pdf},
    unit= {meca-ras}
    }
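
    Semi-automatic minefield scanning is typically realised with a coverage pattern. The sketch below generates simple back-and-forth waypoints over a rectangular field; it is a generic example, and the actual scan strategy used on the robot may differ.

    def boustrophedon_waypoints(width_m, length_m, lane_spacing_m):
        """Back-and-forth ('lawnmower') waypoints covering a rectangular field,
        one plausible pattern for semi-automatic minefield scanning."""
        waypoints = []
        x, going_up = 0.0, True
        while x <= width_m + 1e-9:
            ys = (0.0, length_m) if going_up else (length_m, 0.0)
            waypoints.extend([(x, ys[0]), (x, ys[1])])
            x += lane_spacing_m
            going_up = not going_up
        return waypoints

    if __name__ == "__main__":
        for wp in boustrophedon_waypoints(width_m=2.0, length_m=5.0, lane_spacing_m=1.0):
            print(wp)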

2008

  • D. Doroftei, E. Colon, and G. De Cubber, “A Behaviour-Based Control and Software Architecture for the Visually Guided Robudem Outdoor Mobile Robot," Journal of Automation Mobile Robotics and Intelligent Systems, vol. 2, iss. 4, p. 19–24, 2008.
    [BibTeX] [Abstract] [Download PDF]

    The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi-autonomous outdoor robot for risky interventions. This paper focuses on three main aspects of the design process: visual sensing using stereo vision and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.

    @Article{doroftei2008behaviour,
    author = {Doroftei, Daniela and Colon, Eric and De Cubber, Geert},
    journal = {Journal of Automation Mobile Robotics and Intelligent Systems},
    title = {A Behaviour-Based Control and Software Architecture for the Visually Guided Robudem Outdoor Mobile Robot},
    year = {2008},
    issn = {1897-8649},
    month = oct,
    number = {4},
    pages = {19--24},
    volume = {2},
    abstract = {The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi-autonomous outdoor robot for risky interventions. This paper focuses on three main aspects of the design process: visual sensing using stereo vision and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.},
    project = {ViewFinder, Mobiniss},
    url = {http://mecatron.rma.ac.be/pub/2008/XXX JAMRIS No8 - Doroftei.pdf},
    unit= {meca-ras}
    }

  • G. De Cubber, D. Doroftei, and G. Marton, “Development of a visually guided mobile robot for environmental observation as an aid for outdoor crisis management operations," in Proceedings of the IARP Workshop on Environmental Maintenance and Protection, Baden Baden, Germany, 2008.
    [BibTeX] [Abstract] [Download PDF]

    To solve these issues, an outdoor mobile robotic platform was equipped with a differential GPS system for accurate geo-registered positioning, and a stereo vision system. This stereo vision system serves two purposes: 1) victim detection and 2) obstacle detection and avoidance. For semi-autonomous robot control and navigation, we rely on a behavior-based robot motion and path planner. In this paper, we present each of the three main aspects (victim detection, stereo-based obstacle detection and behavior-based navigation) of the general robot control architecture in more detail.

    @InProceedings{de2008development,
    author = {De Cubber, Geert and Doroftei, Daniela and Marton, Gabor},
    booktitle = {Proceedings of the IARP Workshop on Environmental Maintenance and Protection},
    title = {Development of a visually guided mobile robot for environmental observation as an aid for outdoor crisis management operations},
    year = {2008},
    abstract = {To solve these issues, an outdoor mobile robotic platform was equipped with a differential GPS system for accurate geo-registered positioning, and a stereo vision system. This stereo vision system serves two purposes: 1) victim detection and 2) obstacle detection and avoidance. For semi-autonomous robot control and navigation, we rely on a behavior-based robot motion and path planner. In this paper, we present each of the three main aspects (victim detection, stereo-based obstacle detection and behavior-based navigation) of the general robot control architecture in more detail.},
    project = {ViewFinder, Mobiniss},
    address = {Baden Baden, Germany},
    url = {http://mecatron.rma.ac.be/pub/2008/environmental observation as an aid for outdoor crisis management operations.pdf},
    unit= {meca-ras}
    }
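
    Geo-registered positioning with differential GPS normally involves converting latitude/longitude fixes into a local metric frame before they can feed a behaviour-based planner. The sketch below uses the standard equirectangular approximation for that conversion; it is a textbook formula, not code from the project.

    import math

    EARTH_RADIUS_M = 6371000.0  # mean Earth radius; adequate for a local frame

    def gps_to_local_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
        """Equirectangular approximation: convert a GPS fix into metres east/north
        of a reference point, turning geo-referenced (e.g. DGPS) positions into a
        local navigation frame over short ranges."""
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
        x_east = (lon - ref_lon) * math.cos(ref_lat) * EARTH_RADIUS_M
        y_north = (lat - ref_lat) * EARTH_RADIUS_M
        return x_east, y_north

    if __name__ == "__main__":
        # Roughly 70 m east and 111 m north of the reference, at Brussels latitude.
        print(gps_to_local_xy(50.8513, 4.3935, 50.8503, 4.3925))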

  • D. Doroftei and Y. Baudoin, “Development of a semi-autonomous De-mining vehicle," in 7th IARP Workshop HUDEM2008, Cairo, Egypt, 2008.
    [BibTeX] [Abstract]

    The paper describes the development of a semi-autonomous de-mining vehicle.

    @InProceedings{doro2008development,
    author = {Doroftei, Daniela and Baudoin, Yvan},
    booktitle = {7th {IARP} Workshop {HUDEM}2008},
    title = {Development of a semi-autonomous De-mining vehicle},
    year = {2008},
    abstract = {The paper describes the development of a semi-autonomous de-mining vehicle},
    address = {Cairo, Egypt},
    project = {Mobiniss},
    unit= {meca-ras}
    }

  • D. Doroftei and J. Bedkowski, “Towards the autonomous navigation of robots for risky interventions," in Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance RISE, Benicassim, Spain, 2008.
    [BibTeX] [Abstract] [Download PDF]

    In the course of the ViewFinder project, two robotics teams (RMA and PIAP) are working on the development of an intelligent autonomous mobile robot. This paper reports on the progress of both teams.

    @InProceedings{doro2008towards,
    author = {Doroftei, Daniela and Bedkowski, Janusz},
    booktitle = {Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance {RISE}},
    title = {Towards the autonomous navigation of robots for risky interventions},
    year = {2008},
    abstract = {In the course of the ViewFinder project, two robotics teams (RMA and PIAP) are working on the development of an intelligent autonomous mobile robot. This paper reports on the progress of both teams.},
    project = {ViewFinder, Mobiniss},
    address = {Benicassim, Spain},
    url = {http://mecatron.rma.ac.be/pub/2008/Doroftei.pdf},
    unit= {meca-ras}
    }

2007

  • D. Doroftei, E. Colon, and G. De Cubber, “A behaviour-based control and software architecture for the visually guided Robudem outdoor mobile robot," in ISMCR 2007, Warsaw, Poland, 2007.
    [BibTeX] [Abstract] [Download PDF]

    The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi‐autonomous outdoor robot for risky interventions. This paper focuses mainly on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour‐based control architecture and implementation of a modular software architecture.

    @InProceedings{doroftei2007behaviour,
    author = {Doroftei, Daniela and Colon, Eric and De Cubber, Geert},
    booktitle = {ISMCR 2007},
    title = {A behaviour-based control and software architecture for the visually guided {Robudem} outdoor mobile robot},
    year = {2007},
    address = {Warsaw, Poland},
    abstract = {The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi‐autonomous outdoor robot for risky interventions. This paper focuses mainly on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour‐based control architecture and implementation of a modular software architecture.},
    project = {ViewFinder,Mobiniss},
    url = {http://mecatron.rma.ac.be/pub/2007/Doroftei_ISMCR07.pdf},
    unit= {meca-ras}
    }

  • D. Doroftei, E. Colon, Y. Baudoin, and H. Sahli, “Development of a semi-autonomous off-road vehicle," in IEEE HuMan’07, Timimoun, Algeria, 2007, p. 340–343.
    [BibTeX] [Abstract] [Download PDF]

    Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.

    @InProceedings{doro2007development,
    author = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan and Sahli, Hichem},
    booktitle = {{IEEE} {HuMan}'07},
    title = {Development of a semi-autonomous off-road vehicle},
    year = {2007},
    address = {Timimoun, Algeria},
    pages = {340--343},
    abstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.},
    project = {Mobiniss, ViewFinder},
    url = {http://mecatron.rma.ac.be/pub/2007/Development_of_a_semi-autonomous_off-road_vehicle.pdf},
    unit= {meca-ras}
    }

2006

  • D. Doroftei, E. Colon, and Y. Baudoin, “A modular control architecture for semi-autonomous navigation," in CLAWAR 2006, Brussels, Belgium, 2006, p. 712–715.
    [BibTeX] [Abstract] [Download PDF]

    Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.

    @InProceedings{doro2006modular,
    author = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan},
    booktitle = {{CLAWAR} 2006},
    title = {A modular control architecture for semi-autonomous navigation},
    year = {2006},
    pages = {712--715},
    abstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.},
    project = {Mobiniss, ViewFinder},
    address = {Brussels, Belgium},
    url = {http://mecatron.rma.ac.be/pub/2006/Clawar2006_Doroftei_colon.pdf},
    unit= {meca-ras}
    }

  • D. Doroftei, E. Colon, and Y. Baudoin, “Development of a control architecture for the ROBUDEM outdoor mobile robot platform," in IARP Workshop RISE 2006, Brussels, Belgium, 2006.
    [BibTeX] [Abstract] [Download PDF]

    Humanitarian demining still is a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.

    @InProceedings{doro2006development,
    author = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan},
    booktitle = {{IARP} Workshop {RISE} 2006},
    title = {Development of a control architecture for the ROBUDEM outdoor mobile robot platform},
    year = {2006},
    abstract = {Humanitarian demining still is a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. },
    project = {Mobiniss, ViewFinder},
    address = {Brussels, Belgium},
    url = {http://mecatron.rma.ac.be/pub/2006/IARPWS2006_Doroftei_Colon.pdf},
    unit= {meca-ras}
    }