{"id":3685,"date":"2020-02-20T18:54:32","date_gmt":"2020-02-20T17:54:32","guid":{"rendered":"https:\/\/mecatron.rma.ac.be\/?page_id=3685"},"modified":"2023-02-06T15:48:56","modified_gmt":"2023-02-06T14:48:56","slug":"publications","status":"publish","type":"page","link":"https:\/\/mecatron.rma.ac.be\/index.php\/publications\/","title":{"rendered":"Publications"},"content":{"rendered":"<p><section class=\"kc-elm kc-css-596572 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-92663 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-471827 kc_text_block\"><\/p>\n<p>\n<strong>Disclaimer:\u00a0<\/strong>The papers below are intended for private viewing by the page owner or those who otherwise have legitimate access to them. No part of it may in any form or by any electronic, mechanical, photocopying, recording, or any other means be reproduced, stored in a retrieval system or be broadcast or transmitted without the prior permission of the respective publishers. If your organization has a valid subscription of the journals, click on the DOI link for the legitimate copy of the papers.<\/p>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><section class=\"kc-elm kc-css-231588 kc_row\"><div class=\"kc-row-container  kc-container\"><div class=\"kc-wrap-columns\"><div class=\"kc-elm kc-css-185124 kc_col-sm-12 kc_column kc_col-sm-12\"><div class=\"kc-col-container\"><div class=\"kc-elm kc-css-127982 kc_text_block\"><\/p>\n<p><h3 class=\"papercite\">2026<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    M. Malizia, C. Hamesse, K. Hasselmann, G. De Cubber, N. Tsiogkas, E. Demeester, and R. Haelterman, &#8220;MineInsight: A Multi-sensor Dataset for Humanitarian Demining Robotics in Off-Road Environments,\" <span style=\"font-style: italic\">IEEE Robotics and Automation Letters<\/span>, vol. 11, iss. 2, pp. 1650-1657, 2026.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_2\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/arxiv.org\/abs\/2506.04842\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/LRA.2025.3643265' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_2_block\">\n<pre><code class=\"tex bibtex\">@article{11297788,\nauthor={Malizia, Mario and Hamesse, Charles and Hasselmann, Ken and De Cubber, Geert and Tsiogkas, Nikolaos and Demeester, Eric and Haelterman, Rob},\njournal={IEEE Robotics and Automation Letters},\ntitle={MineInsight: A Multi-sensor Dataset for Humanitarian Demining Robotics in Off-Road Environments},\nyear={2026},\nvolume={11},\nnumber={2},\npages={1650-1657},\ndoi={10.1109\/LRA.2025.3643265},\nurl={https:\/\/arxiv.org\/abs\/2506.04842},\nunit= {meca-ras},\nproject= {BELGIAN}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2025<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    C. Sillero Illanes, T. Rogalski, G. De Cubber, and M. Kowalik, &#8220;Case Study of the Drone Ecosystem of Podkarpackie (Poland),\" , iss. KJ-01-25-618-EN-N, 2025.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_0\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_0\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/publications.jrc.ec.europa.eu\/repository\/bitstream\/JRC143402\/JRC143402_01.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.2760\/26795' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_0_block\">\n<p>This report presents the findings of a review of the Podkarpackie aerial drone and robotics ecosystem, conducted between March 2024 and November 2025 in partnership with the Regional Government of Podkarpackie. The study forms part of REGDUALOSA (Regions, Dual Use, Open Strategic Autonomy), an exploratory initiative of the European Commission\u2019s Joint Research Centre (JRC). It examines policy pathways through which Podkarpackie could strengthen its aerial drone sector as a dual-use industry, thereby enhancing regional competitiveness and contributing to European strategic autonomy. The analysis applies an adapted version of the POINT methodology, combining desk research, expert interviews, and stakeholder consultations. Amid increasing geopolitical instability, the European Commission announced the European Drone Defence Initiative in October 2025, aimed at protecting EU borders and reinforcing defence capabilities under the ReArm Europe initiative, which mobilises up to \u20ac800 billion. Bordering Ukraine and hosting a well-established aerospace cluster, Podkarpackie is strategically positioned to advance dual-use innovation. Based on the evidence gathered, the report formulates twelve strategic recommendations to improve governance, funding coordination, SME participation, and skills development. 
It also introduces the concept of Territorial Preparedness as an innovation policy framework that positions regional ecosystems at the forefront of Europe\u2019s resilience, industrial sovereignty, and long-term security.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_0_block\">\n<pre><code class=\"tex bibtex\">@article{JRC143402,\nnumber = {KJ-01-25-618-EN-N },\naddress = {Luxembourg (Luxembourg)},\nissn = {1831-9424 },\nyear = {2025},\nauthor = {Sillero Illanes, C. and Rogalski, T. and De Cubber, G. and Kowalik, M.},\nisbn = {978-92-68-34794-2 },\npublisher = {Publications Office of the European Union},\nabstract = {This report presents the findings of a review of the Podkarpackie aerial drone and robotics ecosystem, conducted between March 2024 and November 2025 in partnership with the Regional Government of Podkarpackie. The study forms part of REGDUALOSA (Regions, Dual Use, Open Strategic Autonomy), an exploratory initiative of the European Commission\u2019s Joint Research Centre (JRC). It examines policy pathways through which Podkarpackie could strengthen its aerial drone sector as a dual-use industry, thereby enhancing regional competitiveness and contributing to European strategic autonomy. The analysis applies an adapted version of the POINT methodology, combining desk research, expert interviews, and stakeholder consultations. Amid increasing geopolitical instability, the European Commission announced the European Drone Defence Initiative in October 2025, aimed at protecting EU borders and reinforcing defence capabilities under the ReArm Europe initiative, which mobilises up to \u20ac800 billion. Bordering Ukraine and hosting a well-established aerospace cluster, Podkarpackie is strategically positioned to advance dual-use innovation. Based on the evidence gathered, the report formulates twelve strategic recommendations to improve governance, funding coordination, SME participation, and skills development. 
It also introduces the concept of Territorial Preparedness as an innovation policy framework that positions regional ecosystems at the forefront of Europe\u2019s resilience, industrial sovereignty, and long-term security.},\ntitle = {Case Study of the Drone Ecosystem of Podkarpackie (Poland)},\nurl = {https:\/\/publications.jrc.ec.europa.eu\/repository\/bitstream\/JRC143402\/JRC143402_01.pdf},\ndoi = {10.2760\/26795},\nunit= {meca-ras},\nproject= {COURAGEOUS, COURAGEOUS2, ORIGAMI}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    M. Malizia, K. Hasselmann, A. Miuccio, R. Haelterman, N. Tsiogkas, and E. Demeester, &#8220;PFM-1 Landmine Detection in Vegetation Using Thermal Imaging with Limited Training Data,\" in <span style=\"font-style: italic\">2025 25th International Conference on Control, Automation and Systems (ICCAS)<\/span>,  2025, pp. 1864-1869.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_1\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/ras.papercept.net\/conferences\/conferences\/ICCAS25\/program\/ICCAS25_ContentListWeb_4.html\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.23919\/ICCAS66577.2025.11301116' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_1_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{malizia2025pfm,\nauthor = {Malizia, Mario and Hasselmann, Ken and Miuccio, Alessandra and Haelterman, Rob and Tsiogkas, Nikolaos and Demeester, Eric},\ntitle = {{PFM}-1 Landmine Detection in Vegetation Using Thermal Imaging with Limited Training Data},\nbooktitle = {2025 25th International Conference on Control, Automation and Systems (ICCAS)},\nyear = {2025},\npages={1864-1869},\nunit= {meca-ras},\nurl={https:\/\/ras.papercept.net\/conferences\/conferences\/ICCAS25\/program\/ICCAS25_ContentListWeb_4.html},\ndoi={10.23919\/ICCAS66577.2025.11301116},\nproject= {BELGIAN, 
DREAM}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    R. De Greef, A. Discepoli, E. Aguililla Klein, T. Engels, K. Hasselmann, and A. Paolillo, &#8220;Towards Macro-Aware C-to-Rust Transpilation (WIP),\" in <span style=\"font-style: italic\">Proceedings of the 26th ACM SIGPLAN\/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems<\/span>, New York, NY, USA,  2025, p. 57\u201361.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_3\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_3\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/doi.org\/10.1145\/3735452.3735535\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1145\/3735452.3735535' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_3_block\">\n<p>The automatic translation of legacy C code to Rust presents significant challenges, particularly in handling preprocessor macros. C macros introduce metaprogramming constructs that operate at the text level, outside of C&#8217;s syntax tree, making their direct translation to Rust non-trivial. Existing transpilers \u2013 source-to-source compilers \u2013 expand macros before translation, sacrificing their abstraction and reducing code maintainability. In this work, we introduce Oxidize, a macro-aware C-to-Rust transpilation framework that preserves macro semantics by translating C macros into Rust-compatible constructs while selectively expanding only those that interfere with Rust&#8217;s stricter semantics. 
We evaluate our techniques on a small-scale study of real-world macros and find that the majority can be safely and idiomatically transpiled without full expansion.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_3_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{10.1145\/3735452.3735535,\nauthor = {De Greef, Robbe and Discepoli, Attilio and Aguililla Klein, Esteban and Engels, Th\\'{e}o and Hasselmann, Ken and Paolillo, Antonio},\ntitle = {Towards Macro-Aware C-to-Rust Transpilation (WIP)},\nyear = {2025},\nisbn = {9798400719219},\npublisher = {Association for Computing Machinery},\naddress = {New York, NY, USA},\nurl = {https:\/\/doi.org\/10.1145\/3735452.3735535},\ndoi = {10.1145\/3735452.3735535},\nabstract = {The automatic translation of legacy C code to Rust presents significant challenges, particularly in handling preprocessor macros. C macros introduce metaprogramming constructs that operate at the text level, outside of C's syntax tree, making their direct translation to Rust non-trivial. Existing transpilers --- source-to-source compilers --- expand macros before translation, sacrificing their abstraction and reducing code maintainability. In this work, we introduce Oxidize, a macro-aware C-to-Rust transpilation framework that preserves macro semantics by translating C macros into Rust-compatible constructs while selectively expanding only those that interfere with Rust's stricter semantics. 
We evaluate our techniques on a small-scale study of real-world macros and find that the majority can be safely and idiomatically transpiled without full expansion.},\nbooktitle = {Proceedings of the 26th ACM SIGPLAN\/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems},\npages = {57\u201361},\nnumpages = {5},\nkeywords = {Abstract Syntax Tree, C, Embedded, Macros, Metaprogramming, Preprocessor, Rust, Transpilation},\nlocation = {Seoul, Republic of Korea},\nunit= {meca-ras},\nproject= {FORCES},\nseries = {LCTES '25}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T. Engels, A. Discepoli, R. De Greef, E. Aguililla Klein, F. D&#8217;Agostino, R. Gunsett, J. Pisane, K. Hasselmann, and A. Paolillo, &#8220;FORCES: An Incremental Transpiler from C\/C++ to Rust for Robust and Secure Robotics Systems,\" in <span style=\"font-style: italic\">Workshop on Rust for Robotics: Building Robust Foundations for Tomorrow\u2019s Autonomous Systems, IEEE International Conference on Robotics and Automation (ICRA)<\/span>,  2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_4\" class=\"papercite_toggle\">[BibTeX]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_4_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{engels2025forces, author = {Engels, Th{\\'e}o and Discepoli, Attilio and De Greef, Robbe and Aguililla Klein, Esteban and D'Agostino, Francesco and Gunsett, Remi and Pisane, Jonathan and Hasselmann, Ken and Paolillo, Antonio}, title = {{FORCES}: An Incremental Transpiler from {C\/C++} to {Rust} for Robust and Secure Robotics Systems}, booktitle = {Workshop on Rust for Robotics: Building Robust Foundations for Tomorrow\u2019s Autonomous Systems, IEEE International Conference on Robotics and Automation (ICRA)}, year = {2025}, unit= {meca-ras},  project= {FORCES}, note = {Workshop Paper} }<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Miuccio, T. Fr\u00e9ville, E. Le Fl\u00e9cher, and C. 
Hamesse, <span style=\"font-style: italic\">Autonomous Mobile Manipulation for Safe and Efficient Landmine Disposal<\/span>, CEIA Humanitarian Clearance Teamwork, 2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_5\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_5\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.ctro.hr\/userfiles\/files\/BROSURA%20CTRO%20MINE%20ACTION_2025_za%20web.pdf#page=25\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_5_block\">\n<p>Autonomous mobile manipulation for safe and efficient landmine disposal<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_5_block\">\n<pre><code class=\"tex bibtex\">@misc{miuccio_autonomous_2025,\ntitle = {Autonomous {Mobile} {Manipulation} for {Safe} and {Efficient} {Landmine} {Disposal}},\nurl = {https:\/\/www.ctro.hr\/userfiles\/files\/BROSURA%20CTRO%20MINE%20ACTION_2025_za%20web.pdf#page=25},\nabstract = {Autonomous mobile manipulation for safe and efficient landmine disposal},\nlanguage = {EN},\npublisher = {CEIA Humanitarian Clearance Teamwork},\nauthor = {Miuccio, Alessandra and Fr\u00e9ville, Timoth\u00e9e and Le Fl\u00e9cher, Emile and Hamesse, Charles},\nyear = {2025},\nunit= {meca-ras},\nproject= {DREAM}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    S. Papili, &#8220;The Urgent call on data management: are we capable to store valuable (meta)data for naval application?,\" in <span style=\"font-style: italic\">UACE2025. 8th Underwater Acoustics Conference and Exhibition. Conference Proceedings<\/span>,  2025, p. 680P.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_6\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.uaconferences.org\/proceedings\/proceedings-2025\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_6_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{UACE2025Proceedings,\nauthor = {Sonia Papili},\ntitle = {{The Urgent call on data management: are we capable to store valuable (meta)data for naval application?}},\nbooktitle = {UACE2025. 8th Underwater Acoustics Conference and Exhibition. Conference Proceedings},\neditor = {Michael I. Taroudakis},\norganization = {Underwater Acoustics conference & exhibition series},\npublisher = {FORTH},\npages = {680P},\nkeywords = {data model, historical information, descriptive data, descriptive information},\nyear = {2025},\nISSN = {2408-0195},\nURL = {https:\/\/www.uaconferences.org\/proceedings\/proceedings-2025},\nunit= {meca-ras},\nproject= {DISCIMBA}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Cools, C. Maathuis, A. M. van Oers, C. S. H\u00fcbner, N. Deligiannis, M. Vandewal, and G. De Cubber, &#8220;Vision transformers: the threat of realistic adversarial patches,\" in <span style=\"font-style: italic\">Artificial Intelligence for Security and Defence Applications III<\/span>,  2025, p. 136790P.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_7\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/doi.org\/10.1117\/12.3070069\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1117\/12.3070069' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_7_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{10.1117\/12.3070069,\nauthor = {Kasper Cools and Clara Maathuis and Alexander M. van Oers and Claudia S. 
H{\"u}bner and Nikos Deligiannis and Marijke Vandewal and Geert De Cubber},\ntitle = {{Vision transformers: the threat of realistic adversarial patches}},\nvolume = {13679},\nbooktitle = {Artificial Intelligence for Security and Defence Applications III},\neditor = {Hugo J. Kuijf and Radhakrishna Prabhu and Yitzhak Yitzhaky},\norganization = {International Society for Optics and Photonics},\npublisher = {SPIE},\npages = {136790P},\nkeywords = {Adversarial Patches, Vision Transformers, Evasion Attacks, Attack Transferability},\nyear = {2025},\ndoi = {10.1117\/12.3070069},\nURL = {https:\/\/doi.org\/10.1117\/12.3070069},\nunit= {meca-ras},\nproject= {ARC}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T. -T. Nguyen, V. Van Rijswijck, G. De Cubber, B. Janssens, and H. Bruyninckx, &#8220;State-of-the-art autonomous landing solutions for UAVs on moving platforms,\" in <span style=\"font-style: italic\">Proc. SPIE Sensors + Imaging 2025, Autonomous Systems for Security and Defence II,<\/span>,  2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_8\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/13680\/3069539\/State-of-the-art-autonomous-landing-solutions-for-UAVs-on\/10.1117\/12.3069539.full\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.3069539' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_8_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{spie_sota_2025,\ntitle={State-of-the-art autonomous landing solutions for UAVs on moving platforms},\nauthor={Nguyen, T.-T and Van Rijswijck, V and De Cubber, G and Janssens, B. and Bruyninckx, H. },\nbooktitle={Proc. 
SPIE Sensors + Imaging 2025, Autonomous Systems for Security and Defence II},\nyear = {2025},\nvolume = {13680},\nlocation = {Madrid, Spain},\nunit= {meca-ras},\ndoi = {10.1117\/12.3069539},\nurl={https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/13680\/3069539\/State-of-the-art-autonomous-landing-solutions-for-UAVs-on\/10.1117\/12.3069539.full},\nproject= {SAILS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Casado Faul\u00ed, M. Coppieters, L. Carpi, T. -T. Nguyen, M. Vochten, R. Ronsse, G. De Cubber, B. Lauwens, and B. C. Arrue, &#8220;Multi-UAV geometrical area coverage using gradient descent,\" in <span style=\"font-style: italic\">Proc. SPIE Sensors + Imaging 2025, Autonomous Systems for Security and Defence II<\/span>,  2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_9\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/13680\/136800B\/Multi-UAV-geometrical-area-coverage-using-gradient-descent\/10.1117\/12.3063062.full\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1117\/12.3063062' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_9_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{spie_multi_2025,\ntitle={Multi-UAV geometrical area coverage using gradient descent},\nauthor={Casado Faul\u00ed, A and Coppieters, M and Carpi, L and Nguyen, T.-T and Vochten, M and Ronsse, R and De Cubber, Geert and Lauwens, B and C. Arrue, B},\nbooktitle={Proc. 
SPIE Sensors + Imaging 2025, Autonomous Systems for Security and Defence II},\nyear = {2025},\nvolume = {13680},\nlocation = {Madrid, Spain},\nunit= {meca-ras},\ndoi = {10.1117\/12.3063062},\nurl={https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/13680\/136800B\/Multi-UAV-geometrical-area-coverage-using-gradient-descent\/10.1117\/12.3063062.full},\nproject= {HADRON}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. La Grappe, E. Le Fl\u00e9cher, and G. De Cubber, &#8220;Multi-Sensor Multi-Target Tracking for Maritime Surveillance with Autonomous Surface Vehicles Using Belief Propagation,\" in <span style=\"font-style: italic\">OCEANS 2025 Brest<\/span>,  2025, p. 1\u20138.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_10\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_10\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/11104349\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/OCEANS58557.2025.11104349' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_10_block\">\n<p>We present a distributed multi-sensor multi-target tracking algorithm for maritime surveillance using unmanned surface vehicles (USVs) in multi-agent settings. Our approach fuses measurements from radar, Automatic Identification System (AIS), and camera sensors within a Bayesian framework, employing an adaptive particle filtering strategy to jointly estimate the kinematic states and identities of vessels. Our solution incorporates a factorized data association model that integrates cooperative self-reports from AIS with radar and camera measurements, with visual re-identification capability. 
We evaluate our method using a high-fidelity simulation environment, which generates photorealistic maritime scenarios. Our performance analysis indicates that the integration of camera-based cues improves both the spatial localization and identity consistency, particularly in scenarios with low radar detection probability and non-cooperative targets. Furthermore, the distributed inference framework scales well with the number of USVs, making it well suited for large-scale multi-agent applications. Overall, our work demonstrates that fusing heterogeneous sensor modalities using belief propagation can enhance multi-target tracking performance in congested maritime environments.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_10_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{2c491cd7d3c64ae3ae227fdab6a2c80f,\ntitle = \"Multi-Sensor Multi-Target Tracking for Maritime Surveillance with Autonomous Surface Vehicles Using Belief Propagation\",\nabstract = \"We present a distributed multi-sensor multi-target tracking algorithm for maritime surveillance using unmanned surface vehicles (USVs) in multi-agent settings. Our approach fuses measurements from radar, Automatic Identification System (AIS), and camera sensors within a Bayesian framework, employing an adaptive particle filtering strategy to jointly estimate the kinematic states and identities of vessels. Our solution incorporates a factorized data association model that integrates cooperative self-reports from AIS with radar and camera measurements, with visual re-identification capability. We evaluate our method using a high-fidelity simulation environment, which generates photorealistic maritime scenarios. Our performance analysis indicates that the integration of camera-based cues improves both the spatial localization and identity consistency, particularly in scenarios with low radar detection probability and non-cooperative targets. 
Furthermore, the distributed inference framework scales well with the number of USVs, making it well suited for large-scale multi-agent applications. Overall, our work demonstrates that fusing heterogeneous sensor modalities using belief propagation can enhance multi-target tracking performance in congested maritime environments.\",\nkeywords = \"Visualization, Target tracking, Radar measurements, Surveillance, Radar, Radar tracking, Cameras, Particle measurements, Sensors, Artificial intelligence, Multi-target tracking, Maritime surveillance, Unmanned Surface Vessels, Distributed sensor fusion, Particle filtering, Belief propagation, Multi-agent robotics\",\nauthor = \"La Grappe, Alexandre and Le Fl\u00e9cher, Emile and De Cubber, Geert\",\nyear = \"2025\",\nmonth = jun,\nday = \"19\",\ndoi = \"10.1109\/OCEANS58557.2025.11104349\",\nlanguage = \"English\",\npages = \"1--8\",\nbooktitle = \"OCEANS 2025 Brest\",\npublisher = \"Institute of Electrical and Electronics Engineers Inc.\",\nurl = \"https:\/\/ieeexplore.ieee.org\/document\/11104349\",\nunit= {meca-ras},\nproject= {MULTIMETER}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and D. Doroftei, &#8220;Resource Optimisation for Distributed Teams of Manned Aircraft and Drones,\" in <span style=\"font-style: italic\">2025 11th International Conference on Control, Automation and Robotics, ICCAR 2025<\/span>,  2025, p. 554\u2013559.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_11\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_11\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/11072934\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ICCAR64901.2025.11072934' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_11_block\">\n<p>With the significant increase in onboard computing capabilities, modern aerial robotic systems can execute a broad range of perception and control algorithms simultaneously. Moreover, more and more, they are also deployed as heterogeneous collaborative teams, where manned and unmanned assets need to collaborate in a manned-unmanned teaming concept. This introduces the challenge of determining the optimal distribution of cognitive processes across aerial platforms, edge computing nodes, and cloud-based services. In this paper, we propose a novel load distribution methodology tailored to the aerial domain. The approach adopts a decentralized framework for allocating perception and control processes by evaluating communication parameters (e.g., bandwidth, latency, and cost), the computational capabilities of the drones and supporting infrastructure (including CPU, GPU, memory, and storage performance), and the real-time delivery requirements of high-quality output data. 
The proposed methodology is validated in a simulated environment, demonstrating promising performance and scalability in handling dynamic operational conditions.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_11_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{182aea17e9984dd395cb61aeec26ce48,\ntitle = \"Resource Optimisation for Distributed Teams of Manned Aircraft and Drones\",\nabstract = \"With the significant increase in onboard computing capabilities, modern aerial robotic systems can execute a broad range of perception and control algorithms simultaneously. Moreover, more and more, they are also deployed as heterogeneous collaborative teams, where manned and unmanned assets need to collaborate in a manned-unmanned teaming concept. This introduces the challenge of determining the optimal distribution of cognitive processes across aerial platforms, edge computing nodes, and cloud-based services. In this paper, we propose a novel load distribution methodology tailored to the aerial domain. The approach adopts a decentralized framework for allocating perception and control processes by evaluating communication parameters (e.g., bandwidth, latency, and cost), the computational capabilities of the drones and supporting infrastructure (including CPU, GPU, memory, and storage performance), and the real-time delivery requirements of high-quality output data. 
The proposed methodology is validated in a simulated environment, demonstrating promising performance and scalability in handling dynamic operational conditions.\",\nkeywords = \"drones, manned-unmanned teaming, resource optimisation\",\nauthor = \"De Cubber, Geert and Doroftei, Daniela\",\nnote = \"Publisher Copyright: {\\textcopyright} 2025 IEEE.; 11th International Conference on Control, Automation and Robotics, ICCAR 2025 ; Conference date: 18-04-2025 Through 20-04-2025\",\nyear = \"2025\",\ndoi = \"10.1109\/ICCAR64901.2025.11072934\",\nlanguage = \"English\",\npages = \"554--559\",\nbooktitle = \"2025 11th International Conference on Control, Automation and Robotics, ICCAR 2025\",\npublisher = \"Institute of Electrical and Electronics Engineers Inc.\",\nedition = \"2025\",\nurl = \"https:\/\/ieeexplore.ieee.org\/document\/11072934\",\nunit= {meca-ras},\nproject= {COURAGEOUS2,HADRON,ALPHONSE}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    E. Maroulis, D. Hawari, K. Hasselmann, E. Le Fl\u00e9cher, and G. De Cubber, &#8220;Experimental Evaluation of Roadmap-Based Map Generation with Continuous-Time Conflict-Based Search for Multi-Agent Pathfinding,\" in <span style=\"font-style: italic\">IEEE International Conference on Autonomous Robots and Agents, ICARA<\/span>,  2025, p. 380\u2013387.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_12\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_12\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/10977707\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ICARA64554.2025.10977707' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_12_block\">\n<p>This article presents an experimental evaluation of a Multi-Agent Pathfinding (MAPF) approach for large-scale robotic fleets operating in diverse outdoor environments. We generated three distinct types of roadmap graphs: Constrained Delaunay Triangulation (CDT), Voronoi Diagram (VD), and Grid-derived from an obstacle file, and assessed their quality using metrics obtained from graph theory. Then, the performance of the Continuous-time Conflict-Based Search (CCBS) algorithm was evaluated across three different environmental maps, considering practical performance metrics including makespan and failure rate. Subsequently, the roadmap generation methods were ranked based on CCBS performance in similar scenarios using the Friedman statistical test. The results indicate that CDT outperforms both VD and Grid maps, even though it does not exhibit the best graph metrics in many environments. CDT&#8217;s superior performance is attributed to its enhanced interconnectedness and the availability of multiple pathways, as evidenced by its balanced metrics and structural properties. 
We show that CDT is the most efficient and reliable roadmap generation technique for multiagent systems under our experimental conditions making it a preferred choice for robust pathfinding.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_12_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{34774d01cc3341398188fc8353028be2,\ntitle = \"Experimental Evaluation of Roadmap-Based Map Generation with Continuous-Time Conflict-Based Search for Multi-Agent Pathfinding\",\nabstract = \"This article presents an experimental evaluation of a Multi-Agent Pathfinding (MAPF) approach for large-scale robotic fleets operating in diverse outdoor environments. We generated three distinct types of roadmap graphs: Constrained Delaunay Triangulation (CDT), Voronoi Diagram (VD), and Grid-derived from an obstacle file, and assessed their quality using metrics obtained from graph theory. Then, the performance of the Continuous-time Conflict-Based Search (CCBS) algorithm was evaluated across three different environmental maps, considering practical performance metrics including makespan and failure rate. Subsequently, the roadmap generation methods were ranked based on CCBS performance in similar scenarios using the Friedman statistical test. The results indicate that CDT outperforms both VD and Grid maps, even though it does not exhibit the best graph metrics in many environments. CDT's superior performance is attributed to its enhanced interconnectedness and the availability of multiple pathways, as evidenced by its balanced metrics and structural properties. 
We show that CDT is the most efficient and reliable roadmap generation technique for multiagent systems under our experimental conditions making it a preferred choice for robust pathfinding.\",\nkeywords = \"Measurement, Automation, Reliability theory, Graph theory, Path planning, Robots, Multi-agent systems\",\nauthor = \"Emmanouil Maroulis and Danial Hawari and Ken Hasselmann and Le Fl\u00e9cher, Emile and De Cubber, Geert\",\nyear = \"2025\",\nmonth = may,\nday = \"5\",\ndoi = \"10.1109\/ICARA64554.2025.10977707\",\nlanguage = \"English\",\npages = \"380--387\",\nbooktitle = \"IEEE International Conference on Autonomous Robots and Agents, ICARA\",\nissn = \"2767-7745\",\nurl = \"https:\/\/ieeexplore.ieee.org\/document\/10977707\",\nunit= {meca-ras},\nproject= {CUGS, ANIMUS, AIDEDEX, CONVOY}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, P. Petsioti, A. Koniaris, K. Brewczy\u0144ski, M. \u017byczkowski, R. Roman, S. Sima, A. Mohamoud, J. van de Pol, I. Maza, A. Ollero, C. Church, and C. Popa, &#8220;Standardized Evaluation of Counter-Drone Systems: Methods, Technologies, and Performance Metrics,\" <span style=\"font-style: italic\">Drones<\/span>, vol. 9, iss. 5, 2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_13\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_13\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.mdpi.com\/2504-446X\/9\/5\/354\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3390\/drones9050354' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_13_block\">\n<p>This paper aims to introduce a standardized test methodology for drone detection, tracking, and identification systems. 
It is the aim that this standardized test methodology for assessing the performance of counter-drone systems will lead to a much better understanding of the capabilities of these solutions. This is urgently needed, as there is an increase in drone threats and there are no cohesive policies to evaluate the performance of these systems and hence mitigate and manage the threat. The presented methodology has been developed within the framework of the project COURAGEOUS funded by European Union\u2019s Internal Security Fund Police. This standardized test methodology is based upon a series of standard user-defined scenarios representing a wide set of use cases. At this moment, these standard scenarios are geared toward civil security end users. However, the proposed standard methodology provides an open architecture where the standard scenarios can be modularly extended, providing standard users the possibility to easily add new scenarios. For each of these scenarios, operational needs and functional performance requirements are provided. Using this information, an integral test methodology is presented that allows for a fair qualitative and quantitative comparison between different counter-drone systems. The standard test methodology concentrates on the qualitative and quantitative evaluation of counter-drone systems. This test methodology was validated during three user-scripted validation trials.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_13_block\">\n<pre><code class=\"tex bibtex\">@article{5edff7d5878f412ba52cbbeab092019f,\ntitle = \"Standardized Evaluation of Counter-Drone Systems: Methods, Technologies, and Performance Metrics\",\nabstract = \"This paper aims to introduce a standardized test methodology for drone detection, tracking, and identification systems. 
It is the aim that this standardized test methodology for assessing the performance of counter-drone systems will lead to a much better understanding of the capabilities of these solutions. This is urgently needed, as there is an increase in drone threats and there are no cohesive policies to evaluate the performance of these systems and hence mitigate and manage the threat. The presented methodology has been developed within the framework of the project COURAGEOUS funded by European Union's Internal Security Fund Police. This standardized test methodology is based upon a series of standard user-defined scenarios representing a wide set of use cases. At this moment, these standard scenarios are geared toward civil security end users. However, the proposed standard methodology provides an open architecture where the standard scenarios can be modularly extended, providing standard users the possibility to easily add new scenarios. For each of these scenarios, operational needs and functional performance requirements are provided. Using this information, an integral test methodology is presented that allows for a fair qualitative and quantitative comparison between different counter-drone systems. The standard test methodology concentrates on the qualitative and quantitative evaluation of counter-drone systems. 
This test methodology was validated during three user-scripted validation trials.\",\nkeywords = \"CUAS, Counter-Drone, C-UAS, Standardization, Standard Test Methods\",\nauthor = \"De Cubber, Geert and Daniela Doroftei and Paraskevi Petsioti and Alexios Koniaris and Konrad Brewczy\u0144ski and Marek \u017byczkowski and Razvan Roman and Silviu Sima and Ali Mohamoud and {van de Pol}, Johan and Ivan Maza and Anibal Ollero and Christopher Church and Cristina Popa\",\nyear = \"2025\",\nmonth = may,\nday = \"6\",\ndoi = \"10.3390\/drones9050354\",\nlanguage = \"English\",\nvolume = \"9\",\njournal = \"Drones\",\nissn = \"2504-446X\",\nnumber = \"5\",\nurl = \"https:\/\/www.mdpi.com\/2504-446X\/9\/5\/354\",\nunit= {meca-ras},\nproject= {COURAGEOUS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Cools, C. Maathuis, G. De Cubber, M. Vandewal, and N. Deligiannis, &#8220;Evaluation Techniques for Modern Military Camouflage,\" in <span style=\"font-style: italic\">Evaluation Techniques for Modern Military Camouflage<\/span>,  2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_14\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/icmt2025.cz\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_14_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{0a1d528b681b4dc7a81b6b8487df795e,\ntitle = \"Evaluation Techniques for Modern Military Camouflage\",\nauthor = \"Kasper Cools and Clara Maathuis and De Cubber, Geert and Marijke Vandewal and Nikos Deligiannis\",\nyear = \"2025\",\nlanguage = \"Nederlands\",\nbooktitle = \"Evaluation Techniques for Modern Military Camouflage\",\nnote = \"International Conference on Military Technologies 2025, ICMT2025 ; Conference date: 27-05-2025 Through 30-05-2025\",\nurl = \"https:\/\/icmt2025.cz\",\nunit= {meca-ras},\nproject= {ARC}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    C. Maathuis and K. 
Cools, &#8220;FHATMO: Feedback Model for Human-AI Teaming in Military Operations,\" in <span style=\"font-style: italic\">FHATMO: Feedback Model for Human-AI Teaming in Military Operations<\/span>,  2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_15\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/icmt2025.cz\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_15_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{7b6f43681fff433b80174d2a190df291,\ntitle = \"FHATMO: Feedback Model for Human-AI Teaming in Military Operations\",\nauthor = \"Clara Maathuis and Kasper Cools\",\nyear = \"2025\",\nlanguage = \"Nederlands\",\nbooktitle = \"FHATMO: Feedback Model for Human-AI Teaming in Military Operations\",\nnote = \"International Conference on Military Technologies 2025, ICMT2025 ; Conference date: 27-05-2025 Through 30-05-2025\",\nurl = \"https:\/\/icmt2025.cz\",\nunit= {meca-ras},\nproject= {ARC}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    C. Maathuis and K. Cools, &#8220;Risks and Control Measures for Building Trustworthy Autonomous Weapon Systems,\" in <span style=\"font-style: italic\">ICCWS &#8211; 20th International Conference on Cyber Warfare and Security<\/span>,  2025.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_18\" class=\"papercite_toggle\">[BibTeX]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_18_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{8087a20d04e3490ca837aa9b8f5d158c,\ntitle = \"Risks and Control Measures for Building Trustworthy Autonomous Weapon Systems\",\nauthor = \"Clara Maathuis and Kasper Cools\",\nyear = \"2025\",\nlanguage = \"Nederlands\",\nbooktitle = \"ICCWS - 20th International Conference on Cyber Warfare and Security\",\nunit= {meca-ras},\nproject= {ARC}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    C. Maathuis and K. 
Cools, &#8220;The Role of AI in Military Cyber Security: Data Insights and Evaluation Methods,\" in <span style=\"font-style: italic\">The Role of AI in Military Cyber Security: Data Insights and Evaluation Methods<\/span>,  2025, p. 191\u2013200.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_19\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S1877050925004284\/pdf?md5=cb26bfdc1347c1719be794d52430711a&#038;pid=1-s2.0-S1877050925004284-main.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1016\/j.procs.2025.02.078' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_19_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{2cd3f64017734a629399671997131f32,\ntitle = \"The Role of AI in Military Cyber Security: Data Insights and Evaluation Methods\",\nauthor = \"Clara Maathuis and Kasper Cools\",\nyear = \"2025\",\ndoi = \"10.1016\/j.procs.2025.02.078\",\nlanguage = \"English\",\npages = \"191--200\",\nbooktitle = \"The Role of AI in Military Cyber Security: Data Insights and Evaluation Methods\",\npublisher = \"Elsevier Science Publishers B.V.\",\nurl=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S1877050925004284\/pdf?md5=cb26bfdc1347c1719be794d52430711a&pid=1-s2.0-S1877050925004284-main.pdf\",\nunit= {meca-ras},\nproject= {ARC}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2024<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    K. Cools, G. Mailette de Buy Wenniger, and C. Maathuis, &#8220;Modeling offensive content detection for TikTok,\" in <span style=\"font-style: italic\">IEEE Conference on Digital Platforms and Societal Harms 2024<\/span>,  2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_16\" class=\"papercite_toggle\">[BibTeX]<\/a>            <a href='http:\/\/dx.doi.org\/10.1109\/DPSH60098.2024.10774634' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_16_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{7fe0105eb06648fb806e91e8fa3ceb66,\ntitle = \"Modeling offensive content detection for TikTok\",\nauthor = \"Kasper Cools and {Mailette de Buy Wenniger}, Gideon and Clara Maathuis\",\nyear = \"2024\",\nmonth = dec,\ndoi = \"10.1109\/DPSH60098.2024.10774634\",\nlanguage = \"English\",\nbooktitle = \"IEEE Conference on Digital Platforms and Societal Harms 2024\",\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    C. Hamesse, T. &#8216;e, J. Saarinen, M. Vlaminck, H. Luong, and R. Haelterman, &#8220;Development of Ultra-Portable 3D Mapping Systems for Emergency Services,\" in <span style=\"font-style: italic\">IEEE ICRA Workshop on Field Robotics<\/span>,  2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_17\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_17\" class=\"papercite_toggle\">[Abstract]<\/a>            <a href='http:\/\/dx.doi.org\/10.48550\/arXiv.2405.03514' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_17_block\">\n<p>Miniaturization of cameras and LiDAR sensors has enabled the development of wearable 3D mapping systems for emergency responders. These systems have the potential to revolutionize response capabilities by providing real-time, high-fidelity maps of dynamic and hazardous environments. We present our recent efforts towards the development of such ultra-portable 3D mapping systems. 
We review four different sensor configurations, either helmet-mounted or body-worn, with two different mapping algorithms that were implemented and evaluated during field trials. The paper discusses the experimental results with the aim to stimulate further discussion within the portable 3D mapping research community.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_17_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{93e1e7b40c664f7fa853b8853463fdac,\ntitle = \"Development of Ultra-Portable 3D Mapping Systems for Emergency Services\",\nabstract = \"Miniaturization of cameras and LiDAR sensors has enabled the development of wearable 3D mapping systems for emergency responders. These systems have the potential to revolutionize response capabilities by providing real-time, high-fidelity maps of dynamic and hazardous environments. We present our recent efforts towards the development of such ultra-portable 3D mapping systems. We review four different sensor configurations, either helmet-mounted or body-worn, with two different mapping algorithms that were implemented and evaluated during field trials. The paper discusses the experimental results with the aim to stimulate further discussion within the portable 3D mapping research community. \",\nkeywords = \"SLAM, 3D point cloud, 3D reconstruction, LiDAR\",\nauthor = \"Charles Hamesse and Timoth{'e}e Freville and Juha Saarinen and Michiel Vlaminck and Hiep Luong and Rob Haelterman\",\nyear = \"2024\",\ndoi = \"10.48550\/arXiv.2405.03514\",\nlanguage = \"English\",\nbooktitle = \"IEEE ICRA Workshop on Field Robotics\",\nunit= {mwmw, meca-ras},\nproject= {DREAM}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    S. Papili, A. Penko, K. Becker, and T. Weber, &#8220;TH43A &#8211; Biogeochemical Interactions Impacting Seafloor Properties and Dynamics.\"  2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_20\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_20\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/agu.confex.com\/agu\/OSM24\/prelim.cgi\/Home\/0\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_20_block\">\n<p>This Town Hall focuses on cultivating collaborations between biologists, chemists, geologists, and oceanographers to understand the nonlinear, interdisciplinary processes occurring on the ocean{textquoteright}s seafloor. Biological, chemical, geological, and physical processes modify the properties, composition, and texture of the seabed, adding complexity and heterogeneity. The interactions of these processes over a range of temporal and spatial scales can cause anomalies in the prediction of geotechnical, physical, and acoustic responses. Due to their interconnectivity, we hypothesize that fundamental knowledge gaps of the feedback between biogeochemical processes on and within the seabed can be elucidated through a holistic approach to understanding the seafloor system. To address the interdisciplinarity of seafloor dynamics, we invite all scientists who study the ocean{textquoteright}s seafloor to aid in outlining the dominant processes, state-of-the-art modeling capabilities, and gaps in knowledge through a panel and group discussion. 
The outcome will help guide future research and aid in multidisciplinary collaborations.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_20_block\">\n<pre><code class=\"tex bibtex\">@conference{c22ee9b7610d4295990ced933cb0f33e,\ntitle = \"TH43A - Biogeochemical Interactions Impacting Seafloor Properties and Dynamics\",\nabstract = \"This Town Hall focuses on cultivating collaborations between biologists, chemists, geologists, and oceanographers to understand the nonlinear, interdisciplinary processes occurring on the ocean{textquoteright}s seafloor. Biological, chemical, geological, and physical processes modify the properties, composition, and texture of the seabed, adding complexity and heterogeneity. The interactions of these processes over a range of temporal and spatial scales can cause anomalies in the prediction of geotechnical, physical, and acoustic responses. Due to their interconnectivity, we hypothesize that fundamental knowledge gaps of the feedback between biogeochemical processes on and within the seabed can be elucidated through a holistic approach to understanding the seafloor system. To address the interdisciplinarity of seafloor dynamics, we invite all scientists who study the ocean{textquoteright}s seafloor to aid in outlining the dominant processes, state-of-the-art modeling capabilities, and gaps in knowledge through a panel and group discussion. The outcome will help guide future research and aid in multidisciplinary collaborations.\",\nauthor = \"Sonia Papili and Allison Penko and Kyle Becker and Tom Weber\",\nyear = \"2024\",\nmonth = feb,\nday = \"18\",\nlanguage = \"English\",\nnote = \"Ocean Science Meeting 2024, OSM24 ; Conference date: 18-02-2024 Through 23-02-2024\",\nurl = \"https:\/\/agu.confex.com\/agu\/OSM24\/prelim.cgi\/Home\/0\",\nunit= {meca-ras},\nproject= {DISCIMBA}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Cools and C. 
Maathuis, &#8220;Trust or Bust: Ensuring Trustworthiness in Autonomous Weapon Systems,\" in <span style=\"font-style: italic\">IEEE Military Communication Conference 2024<\/span>,  2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_21\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/milcom2024.ieee-milcom.org\/\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/MILCOM61039.2024.10773908' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_21_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{b0d1d2ad7c7446dfbe873b15d4b3b337,\ntitle = \"Trust or Bust: Ensuring Trustworthiness in Autonomous Weapon Systems\",\nauthor = \"Kasper Cools and Clara Maathuis\",\nyear = \"2024\",\nmonth = dec,\ndoi = \"10.1109\/MILCOM61039.2024.10773908\",\nlanguage = \"English\",\nbooktitle = \"IEEE Military Communication Conference 2024\",\nnote = \"2024 IEEE Military Communications Conference ; Conference date: 28-10-2024 Through 01-11-2024\",\nurl = \"https:\/\/milcom2024.ieee-milcom.org\/\",\nunit= {meca-ras},\nproject= {ARC}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T. Nguyen, C. Hamesse, T. Dutrannois, T. Halleux, G. De Cubber, R. Haelterman, and B. Janssens, &#8220;Visual-based Localization Methods for Unmanned Aerial Vehicles in Landing Operation on Maritime Vessel,\" <span style=\"font-style: italic\">Acta IMEKO<\/span>, vol. 13, iss. 4, p. 1\u201313, 2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_22\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/1575\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.21014\/actaimeko.v13i4.1575' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_22_block\">\n<pre><code class=\"tex bibtex\">@article{nguyen_visual_2024,\ntitle = {Visual-based {Localization} {Methods} for {Unmanned} {Aerial} {Vehicles} in {Landing} {Operation} on {Maritime} {Vessel}},\nvolume = {13},\nissn = {2221-870X},\nurl = {https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/1575},\ndoi = {10.21014\/actaimeko.v13i4.1575},\nnumber = {4},\njournal = {Acta IMEKO},\nauthor = {Nguyen, Tien-Thanh and Hamesse, Charles and Dutrannois, Thomas and Halleux, Timothy and De Cubber, Geert and Haelterman, Rob and Janssens, Bart},\nmonth = nov,\nyear = {2024},\npages = {1--13},\nunit= {meca-ras},\nproject= {MarLand}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Geers, T. Willems, C. Nita, T-T. Nguyen, and J. Aelterman, &#8220;Maritime surveillance using unmanned vehicles: deep learning-based vessel re-identification,\" in <span style=\"font-style: italic\">2024 SPIE: Security + Defence, Artificial Intelligence for Security and Defence Applications II<\/span>,  2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_23\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/13206\/3028805\/Maritime-surveillance-using-unmanned-vehicles\u2013deep-learning-based-vessel\/10.1117\/12.3028805.full\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.3028805' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_23_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{spie2024,\ntitle={Maritime surveillance using unmanned vehicles: deep learning-based vessel re-identification},\nauthor={Geers, Yoni and Willems, Tim and Nita, Cornelia and Nguyen, T-T. and Aelterman, Jan},\nbooktitle={2024 SPIE: Security + Defence, Artificial Intelligence for Security and Defence Applications II},\neditors ={},\npublisher = {},\nyear = {2024},\nvol = {13206},\nlocation = {Edinburgh, United Kingdom},\nunit= {meca-ras},\ndoi = {https:\/\/doi.org\/10.1117\/12.3028805},\nurl={https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/13206\/3028805\/Maritime-surveillance-using-unmanned-vehicles--deep-learning-based-vessel\/10.1117\/12.3028805.full},\nunit= {meca-ras},\nproject= {MarLand}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Z. Chekakta, N. Aouf, S. Govindaraj, F. Polisano, and G. De Cubber, &#8220;Towards Learning-Based Distributed Task Allocation Approach for Multi-Robot System,\" in <span style=\"font-style: italic\">2024 10th International Conference on Automation, Robotics and Applications (ICARA)<\/span>,  2024, pp. 34-39.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_24\" class=\"papercite_toggle\">[BibTeX]<\/a>            <a href='http:\/\/dx.doi.org\/10.1109\/ICARA60736.2024.10553196' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_24_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{10553196,\nauthor={Chekakta, Zakaria and Aouf, Nabil and Govindaraj, Shashank and Polisano, Fabio and De Cubber, Geert},\nbooktitle={2024 10th International Conference on Automation, Robotics and Applications (ICARA)},\ntitle={Towards Learning-Based Distributed Task Allocation Approach for Multi-Robot System},\nyear={2024},\nvolume={},\nnumber={},\npages={34-39},\nkeywords={Sequential analysis;Automation;Accuracy;Robot kinematics;Prediction algorithms;Approximation algorithms;Resource management;Task Allocation;Multirobot System;Distributed Algorithms;Graph Convolutional Neural Networks},\ndoi={10.1109\/ICARA60736.2024.10553196},\nunit= {meca-ras},\nproject= {AIDED}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    P. Petsioti, M. Zyczkowski, K. Brewczyski, K. Cichulski, K. Kaminski, R. Razvan, A. Mohamoud, C. Church, A. Koniaris, G. De Cubber, and D. Doroftei, &#8220;Methodological Approach for the Development of Standard C-UAS Scenarios,\" <span style=\"font-style: italic\">Open Research Europe<\/span>, vol. 4, iss. 240, 2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_25\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/open-research-europe.ec.europa.eu\/articles\/4-240\/v1\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.12688\/openreseurope.18339.1' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_25_block\">\n<pre><code class=\"tex bibtex\">@Article{ 10.12688\/openreseurope.18339.1,\nAUTHOR = {Petsioti, P. and Zyczkowski, M. and Brewczyski, K. and Cichulski, K. 
and Kaminski, K. and Razvan, R. and Mohamoud, A. and Church, C. and Koniaris, A. and De Cubber, G. and Doroftei, D.},\nTITLE = {Methodological Approach for the Development of Standard C-UAS Scenarios},\nJOURNAL = {Open Research Europe},\nVOLUME = {4},\nYEAR = {2024},\nNUMBER = {240},\nDOI = {10.12688\/openreseurope.18339.1},\nURL = {https:\/\/open-research-europe.ec.europa.eu\/articles\/4-240\/v1},\nunit= {meca-ras},\nproject= {COURAGEOUS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. D. Brewczy\u0144ski, M. \u017byczkowski, K. Cichulski, K. A. Kami\u0144ski, P. Petsioti, and G. De Cubber, &#8220;Methods for Assessing the Effectiveness of Modern Counter Unmanned Aircraft Systems,\" <span style=\"font-style: italic\">Remote Sensing<\/span>, vol. 16, iss. 19, 2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_26\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_26\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.mdpi.com\/2072-4292\/16\/19\/3714\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3390\/rs16193714' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_26_block\">\n<p>Given the growing threat posed by the widespread availability of unmanned aircraft systems (UASs), which can be utilised for various unlawful activities, the need for a standardised method to evaluate the effectiveness of systems capable of detecting, tracking, and identifying (DTI) these devices has become increasingly urgent. This article draws upon research conducted under the European project COURAGEOUS, where 260 existing drone detection systems were analysed, and a methodology was developed for assessing the suitability of C-UASs in relation to specific threat scenarios. 
The article provides an overview of the most commonly employed technologies in C-UASs, such as radars, visible light cameras, thermal imaging cameras, laser range finders (lidars), and acoustic sensors. It explores the advantages and limitations of each technology, highlighting their reliance on different physical principles, and also briefly touches upon the legal implications associated with their deployment. The article presents the research framework and provides a structural description, alongside the functional and performance requirements, as well as the defined metrics. Furthermore, the methodology for testing the usability and effectiveness of individual C-UAS technologies in addressing specific threat scenarios is elaborated. Lastly, the article offers a concise list of prospective research directions concerning the analysis and evaluation of these technologies.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_26_block\">\n<pre><code class=\"tex bibtex\">@Article{rs16193714,\nAUTHOR = {Brewczy\u0144ski, Konrad D. and \u017byczkowski, Marek and Cichulski, Krzysztof and Kami\u0144ski, Kamil A. and Petsioti, Paraskevi and De Cubber, Geert},\nTITLE = {Methods for Assessing the Effectiveness of Modern Counter Unmanned Aircraft Systems},\nJOURNAL = {Remote Sensing},\nVOLUME = {16},\nYEAR = {2024},\nNUMBER = {19},\nARTICLE-NUMBER = {3714},\nURL = {https:\/\/www.mdpi.com\/2072-4292\/16\/19\/3714},\nISSN = {2072-4292},\nABSTRACT = {Given the growing threat posed by the widespread availability of unmanned aircraft systems (UASs), which can be utilised for various unlawful activities, the need for a standardised method to evaluate the effectiveness of systems capable of detecting, tracking, and identifying (DTI) these devices has become increasingly urgent. 
This article draws upon research conducted under the European project COURAGEOUS, where 260 existing drone detection systems were analysed, and a methodology was developed for assessing the suitability of C-UASs in relation to specific threat scenarios. The article provides an overview of the most commonly employed technologies in C-UASs, such as radars, visible light cameras, thermal imaging cameras, laser range finders (lidars), and acoustic sensors. It explores the advantages and limitations of each technology, highlighting their reliance on different physical principles, and also briefly touches upon the legal implications associated with their deployment. The article presents the research framework and provides a structural description, alongside the functional and performance requirements, as well as the defined metrics. Furthermore, the methodology for testing the usability and effectiveness of individual C-UAS technologies in addressing specific threat scenarios is elaborated. Lastly, the article offers a concise list of prospective research directions concerning the analysis and evaluation of these technologies.},\nDOI = {10.3390\/rs16193714},\nunit= {meca-ras},\nproject= {COURAGEOUS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Borghgraef, M. Vandewal, and G. De Cubber, &#8220;COURAGEOUS: test methods for counter-UAS systems,\" in <span style=\"font-style: italic\">In Proceedings SPIE Sensors + Imaging, Target and Background Signatures X: Traditional Methods and Artificial Intelligence<\/span>,  2024, p. 131990D.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_27\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/doi.org\/10.1117\/12.3030928\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1117\/12.3030928' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_27_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{spie_alex,\ntitle={COURAGEOUS: test methods for counter-UAS systems},\nauthor={Borghgraef, Alexander and Vandewal, Marijke and De Cubber, Geert},\nyear={2024},\nbooktitle={In Proceedings SPIE Sensors + Imaging, Target and Background Signatures X: Traditional Methods and Artificial Intelligence},\npublisher = {SPIE},\nlocation = {Edinburgh, United Kingdom},\nunit= {meca-ras, ciss},\nproject= {COURAGEOUS},\nvolume = {13199},\neditor = {Karin Stein and Ric Schleijpen},\norganization = {International Society for Optics and Photonics},\npublisher = {SPIE},\npages = {131990D},\nkeywords = {counter-UAS, drone, border protection, standardization, measurement campaign, law enforcement, DTI, evaluation methods},\ndoi = {10.1117\/12.3030928},\nurl = {https:\/\/doi.org\/10.1117\/12.3030928}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. M. Casado Fauli, M. Malizia, K. Hasselmann, E. Le Fl\u00e9cher, G. De Cubber, and B. Lauwens, &#8220;HADRON: Human-friendly Control and Artificial Intelligence for Military Drone Operations,\" in <span style=\"font-style: italic\">In Proceedings 33rd IEEE International Conference on Robot and Human Interactive Communication, IEEE RO-MAN 2024<\/span>,  2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_28\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/arxiv.org\/abs\/2408.07063\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_28_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{fauli2024hadronhumanfriendlycontrolartificial,\ntitle={HADRON: Human-friendly Control and Artificial Intelligence for Military Drone Operations},\nauthor={Casado Fauli, Ana Maria and Malizia, Mario and Hasselmann, Ken and Le Fl\u00e9cher, Emile and De Cubber, Geert and Lauwens, Ben},\nyear={2024},\nbooktitle={In Proceedings 33rd IEEE International Conference on Robot and Human Interactive Communication, IEEE RO-MAN 2024},\npublisher = {IEEE},\nlocation = {Pasadena, USA},\nunit= {meca-ras},\nproject= {HADRON},\neprint={2408.07063},\narchivePrefix={arXiv},\nprimaryClass={cs.RO},\nurl={https:\/\/arxiv.org\/abs\/2408.07063},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, S. Lo Bue, and H. De Smet, &#8220;Quantitative Assessment of Drone Pilot Performance,\" <span style=\"font-style: italic\">Drones<\/span>, vol. 8, iss. 9, 2024.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_29\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_29\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.mdpi.com\/2504-446X\/8\/9\/482\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3390\/drones8090482' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_29_block\">\n<p>This paper introduces a quantitative methodology for assessing drone pilot performance, aiming to reduce drone-related incidents by understanding the human factors influencing performance. 
The challenge lies in balancing evaluations in operationally relevant environments with those in a standardized test environment for statistical relevance. The proposed methodology employs a novel virtual test environment that records not only basic flight metrics but also complex mission performance metrics, such as the video quality from a target. A group of Belgian Defence drone pilots were trained using this simulator system, yielding several practical results. These include a human-performance model linking human factors to pilot performance, an AI co-pilot providing real-time flight performance guidance, a tool for generating optimal flight trajectories, a mission planning tool for ideal pilot assignment, and a method for iterative training improvement based on quantitative input. The training results with real pilots demonstrate the methodology\u2019s effectiveness in evaluating pilot performance for complex military missions, suggesting its potential as a valuable addition to new pilot training programs.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_29_block\">\n<pre><code class=\"tex bibtex\">@Article{drones8090482,\nAUTHOR = {Doroftei, Daniela and De Cubber, Geert and Lo Bue, Salvatore and De Smet, Hans},\nTITLE = {Quantitative Assessment of Drone Pilot Performance},\nJOURNAL = {Drones},\nVOLUME = {8},\nYEAR = {2024},\nunit= {meca-ras},\nNUMBER = {9},\nARTICLE-NUMBER = {482},\nURL = {https:\/\/www.mdpi.com\/2504-446X\/8\/9\/482},\nISSN = {2504-446X},\nproject= {ALPHONSE},\nABSTRACT = {This paper introduces a quantitative methodology for assessing drone pilot performance, aiming to reduce drone-related incidents by understanding the human factors influencing performance. The challenge lies in balancing evaluations in operationally relevant environments with those in a standardized test environment for statistical relevance. 
The proposed methodology employs a novel virtual test environment that records not only basic flight metrics but also complex mission performance metrics, such as the video quality from a target. A group of Belgian Defence drone pilots were trained using this simulator system, yielding several practical results. These include a human-performance model linking human factors to pilot performance, an AI co-pilot providing real-time flight performance guidance, a tool for generating optimal flight trajectories, a mission planning tool for ideal pilot assignment, and a method for iterative training improvement based on quantitative input. The training results with real pilots demonstrate the methodology\u2019s effectiveness in evaluating pilot performance for complex military missions, suggesting its potential as a valuable addition to new pilot training programs.},\nDOI = {10.3390\/drones8090482}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    M. Malizia, A. M. Casado Fauli, K. Hasselmann, E. Le Fl\u00e9cher, G. De Cubber, and R. Haelterman, &#8220;Assisted Explosive Ordnance Disposal: Teleoperated Robotic Systems with AI, Virtual Reality, and Semi-Autonomous Manipulation for Safer Demining Operations,\" in <span style=\"font-style: italic\">20th International Symposium Mine Action<\/span>,  2024, pp. 52-55.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_31\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.ctro.hr\/userfiles\/files\/MINE%20ACTION_2024_ONLIINE.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_31_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{maliziamineact2024,\ntitle={Assisted Explosive Ordnance Disposal: Teleoperated Robotic Systems with AI, Virtual Reality, and Semi-Autonomous Manipulation for Safer Demining Operations},\nauthor={Malizia, Mario and Casado Fauli, Ana Maria and Hasselmann, Ken and Le Fl\u00e9cher, Emile and De Cubber, Geert and Haelterman, Rob},\nbooktitle={20th International Symposium Mine Action},\npublisher = {CTRO-HR},\nyear = {2024},\nlocation = {Cavtat, Croatia},\nunit= {meca-ras},\nurl={https:\/\/www.ctro.hr\/userfiles\/files\/MINE%20ACTION_2024_ONLIINE.pdf},\npages={52-55},\nproject= {BELGIAN}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Hasselmann, M. Malizia, R. Caballero, F. Polisano, S. Govindaraj, J. Stigler, O. Ilchenko, M. Bajic, and G. De Cubber, &#8220;A multi-robot system for the detection of explosive devices,\" in <span style=\"font-style: italic\">&#8220;IEEE ICRA Workshop on Field Robotics\"<\/span>,  2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_32\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/arxiv.org\/abs\/2404.14167\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.48550\/ARXIV.2404.14167' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_32_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{Hasselmannetal2024ICRAWSFRO,\ndoi = {10.48550\/ARXIV.2404.14167},\nurl={https:\/\/arxiv.org\/abs\/2404.14167},\nbooktitle = {\"IEEE ICRA Workshop on Field Robotics\"},\nauthor = {Hasselmann, Ken and Malizia, Mario and Caballero, Rafael and Polisano, Fabio and Govindaraj, Shashank and Stigler, Jakob and Ilchenko, Oleksii and Bajic, Milan and De Cubber, Geert},\ntitle = {A multi-robot system for the detection of explosive devices},\nyear = {2024},\nunit= {meca-ras},\nproject= {AIDED, AIDEDEX}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    M. Kegeleirs, D. G. Ramos, K. Hasselmann, L. Garattoni, G. Francesca, and M. Birattari, &#8220;Transferability in the automatic off-line design of robot swarms: from sim-to-real to embodiment and design-method transfer across different platforms,\" <span style=\"font-style: italic\">IEEE Robotics and Automation Letters<\/span>, 2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_33\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/10416330\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/LRA.2024.3360013' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_33_block\">\n<pre><code class=\"tex bibtex\">@article{kegeleirs2024transferability,\ntitle={Transferability in the automatic off-line design of robot swarms: from sim-to-real to embodiment and design-method transfer across different platforms},\nauthor={Kegeleirs, Miquel and Ramos, David Garz{\'o}n and Hasselmann, Ken and Garattoni, Lorenzo and Francesca, Gianpiero and Birattari, Mauro},\njournal={IEEE Robotics and Automation Letters},\nyear={2024},\ndoi={10.1109\/LRA.2024.3360013},\nurl={https:\/\/ieeexplore.ieee.org\/document\/10416330},\npublisher={IEEE},\nunit= {meca-ras},\nproject= {AIDEDEX}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T-T. Nguyen, A. Crismer, G. De Cubber, B. Janssens, and H. Bruyninckx, &#8220;Landing UAV on Moving Surface Vehicle: Visual Tracking and Motion Prediction of Landing Deck,\" in <span style=\"font-style: italic\">2024 IEEE\/SICE International Symposium on System Integration (SII)<\/span>,  2024.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_34\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/drive.google.com\/file\/d\/1UiF4uPF9VkxgMxMX_JCBOFiZJPmoDSRv\/view?usp=drive_link\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/SII58957.2024.10417303' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_34_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{sii2024,\ntitle={Landing UAV on Moving Surface Vehicle: Visual Tracking and Motion Prediction of Landing Deck},\nauthor={Nguyen, T-T. and Crismer, A. and De Cubber, G. and Janssens, B. and Bruyninckx, H.},\nbooktitle={2024 IEEE\/SICE International Symposium on System Integration (SII)},\npublisher = {IEEE},\nyear = {2024},\nlocation = {Ha Long, Vietnam},\nunit= {meca-ras},\ndoi = {10.1109\/SII58957.2024.10417303},\nurl={https:\/\/drive.google.com\/file\/d\/1UiF4uPF9VkxgMxMX_JCBOFiZJPmoDSRv\/view?usp=drive_link},\nproject= {MarLand}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2023<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, P. Petsioti, R. Roman, A. Mohamoud, I. Maza, and C. Church, &#8220;The COURAGEOUS project efforts towards standardized test methods for assessing the performance of counter-drone solutions,\" in <span style=\"font-style: italic\">Proceedings of the 11th biennial Symposium on Non-Lethal Weapons<\/span>,  2023, p. 44.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_30\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mecatron.rma.ac.be\/pub\/2024\/Towards%20standardized%20test%20methods%20for%20assessing%20the%20performance%20of%20counter-drone%20solutions.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_30_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{decubbercuas2023,\ntitle={The COURAGEOUS project efforts towards standardized test methods for assessing the performance of counter-drone solutions},\nauthor={De Cubber, Geert and Petsioti, P. and Roman, Razvan and Mohamoud, Ali and Maza, Ivan and Church, Christopher},\nbooktitle={Proceedings of the 11th biennial Symposium on Non-Lethal Weapons},\npublisher = {European Working Group on Non-Lethal Weapons},\nyear = {2023},\nlocation = {Brussels, Belgium},\nunit= {meca-ras},\nurl={https:\/\/mecatron.rma.ac.be\/pub\/2024\/Towards%20standardized%20test%20methods%20for%20assessing%20the%20performance%20of%20counter-drone%20solutions.pdf},\npages={44},\nproject= {COURAGEOUS}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, E. Le Fl\u00e9cher, A. La Grappe, E. Ghisoni, E. Maroulis, P. Ouendo, D. Hawari, and D. Doroftei, &#8220;Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case,\" in <span style=\"font-style: italic\">IEEE International Conference on Safety, Security, and Rescue Robotics<\/span>,  2023.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_35\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mecatron.rma.ac.be\/pub\/2023\/SSRR2023-DeCubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_35_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ssrr2023decubber,\ntitle={Dual Use Security Robotics: A Demining, Resupply and Reconnaissance Use Case},\nauthor={De Cubber, Geert and Le Fl\u00e9cher, Emile and La Grappe, Alexandre and Ghisoni, Enzo and Maroulis, Emmanouil and Ouendo, Pierre-Edouard and Hawari, Danial and Doroftei, Daniela},\nbooktitle={IEEE International Conference on Safety, Security, and Rescue Robotics},\neditors ={Kimura, Tetsuya},\npublisher = {IEEE},\nyear = {2023},\nvol = {1},\nproject = {AIDED, iMUGs, CUGS},\nlocation = {Fukushima, Japan},\nunit= {meca-ras},\ndoi = {},\nurl={https:\/\/mecatron.rma.ac.be\/pub\/2023\/SSRR2023-DeCubber.pdf}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T-T. Nguyen, L. Somers, J. Van den Bosch, G. De Cubber, B. Janssens, and H. Bruyninckx, &#8220;Affordable and Customizable Research and Educational Aerial and Surface Vehicles Robot Platforms \u2013 first implementation,\" in <span style=\"font-style: italic\">17th Mechatronics Forum International Conference.<\/span>,  2023.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_36\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mechatronics2023.eu\/wp-content\/uploads\/2023\/09\/MX_2023_session_3_paper_3_nguyen.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_36_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{mechatronics20203usv,\ntitle={Affordable and Customizable Research and Educational Aerial and Surface Vehicles Robot Platforms \u2013 first implementation},\nauthor={Nguyen, T-T. and Somers, L. and Van den Bosch, J. and De Cubber, G. 
and Janssens, B. and Bruyninckx, H.},\nbooktitle={17th Mechatronics Forum International Conference.},\neditors ={},\npublisher = {},\nyear = {2023},\nvol = {},\nlocation = {Leuven, Belgium},\nunit= {meca-ras},\ndoi = {},\nurl={https:\/\/mechatronics2023.eu\/wp-content\/uploads\/2023\/09\/MX_2023_session_3_paper_3_nguyen.pdf},\nproject= {MarLand}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T-T. Nguyen, J. Duverger, G. De Cubber, B. Janssens, and H. Bruyninckx, &#8220;Development of Dual-function Adaptive Landing Gear and Gripper for Unmanned Aerial Vehicles,\" in <span style=\"font-style: italic\">17th Mechatronics Forum International Conference.<\/span>,  2023.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_37\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mechatronics2023.eu\/wp-content\/uploads\/2023\/09\/MX_2023_session_3_paper_1_nguyen.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_37_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{mechatronics20203gripper,\ntitle={Development of Dual-function Adaptive Landing Gear and Gripper for Unmanned Aerial Vehicles},\nauthor={Nguyen, T-T. and Duverger, J. and De Cubber, G. and Janssens, B. and Bruyninckx, H.},\nbooktitle={17th Mechatronics Forum International Conference.},\neditors ={},\npublisher = {},\nyear = {2023},\nvol = {},\nlocation = {Leuven, Belgium},\nunit= {meca-ras},\ndoi = {},\nurl={https:\/\/mechatronics2023.eu\/wp-content\/uploads\/2023\/09\/MX_2023_session_3_paper_1_nguyen.pdf},\nproject= {MarLand}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, E. Le Fl\u00e9cher, A. Dominicus, and D. Doroftei, &#8220;Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario,\" in <span style=\"font-style: italic\">Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.<\/span>,  2023.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_38\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_38\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_5\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.54941\/ahfe1003746' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_38_block\">\n<p>Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives. Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kinds of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces. The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. 
Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve. The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning &#038; scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission. Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the field. The resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_38_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ahfe20203decubber,\ntitle={Human-agent teaming between soldiers and unmanned ground systems in a resupply scenario},\nauthor={De Cubber, G. and Le Fl\u00e9cher, E. and Dominicus, A. and Doroftei, D.},\nbooktitle={Human Factors in Robots, Drones and Unmanned Systems. 
AHFE (2023) International Conference.},\neditors ={Tareq Ahram and Waldemar Karwowski},\npublisher = {AHFE Open Access, AHFE International, USA},\nyear = {2023},\nvol = {93},\nproject = {iMUGs},\nlocation = {San Francisco, USA},\nunit= {meca-ras},\ndoi = {http:\/\/doi.org\/10.54941\/ahfe1003746},\nurl={https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_5},\nabstract = {Thanks to advances in embedded computing and robotics, intelligent Unmanned Ground Systems (UGS) are used more and more in our daily lives. Also in the military domain, the use of UGS is highly investigated for applications like force protection of military installations, surveillance, target acquisition, reconnaissance, handling of chemical, biological, radiological, nuclear (CBRN) threats, explosive ordnance disposal, etc. A pivotal research aspect for the integration of these military UGS in the standard operating procedures is the question of how to achieve a seamless collaboration between human and robotic agents in such high-stress and non-structured environments. Indeed, in these kind of operations, it is critical that the human-agent mutual understanding is flawless; hence, the focus on human factors and ergonomic design of the control interfaces.The objective of this paper is to focus on one key military application of UGS, more specifically logistics, and elaborate how efficient human-machine teaming can be achieved in such a scenario. While getting much less attention than other application areas, the domain of logistics is in fact one of the most important for any military operation, as it is an application area that is very well suited for robotic systems. 
Indeed, military troops are very often burdened by having to haul heavy gear across large distances, which is a problem UGS can solve.The significance of this paper is that it is based on more than two years of field research work on human + multi-agent UGS collaboration in realistic military operating conditions, performed within the scope of the European project iMUGS. In the framework of this project, not less than six large-scale field trial campaigns were organized across Europe. In each field trial campaign, soldiers and UGS had to work together to achieve a set of high-level mission goals that were distributed among them via a planning & scheduling mechanism. This paper will focus on the outcomes of the Belgian field trial, which concentrated on a resupply logistics mission.Within this paper, a description of the iMUGS test setup and operational scenarios is provided. The ergonomic design of the tactical planning system is elaborated, together with the high-level swarming and task scheduling methods that divide the work between robotic and human agents in the fieldThe resupply mission, as described in this paper, was executed in summer 2022 in Belgium by a mixed team of soldiers and UGS for an audience of around 200 people from defence actors from European member states. The results of this field trial were evaluated as highly positive, as all high-level requirements were obtained by the robotic fleet.}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and H. De Smet, &#8220;Human factors assessment for drone operations: towards a virtual drone co-pilot,\" in <span style=\"font-style: italic\">Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.<\/span>,  2023.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_39\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_39\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_6\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.54941\/ahfe1003747' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_39_block\">\n<p>As the number of drone operations increases, so does the risk of incidents with these novel, yet sometimes dangerous unmanned systems. Research has shown that over 70% of drone incidents are caused by human error, so in order to reduce the risk of incidents, the human factors related to the operation of the drone should be studied. However, this is not a trivial exercise, because on the one hand, a realistic operational environment is required (in order to study the human behaviour in realistic conditions), while on the other hand a standardised environment is required, such that repeatable experiments can be set up in order to ensure statistical relevance. In order to remedy this, within the scope of the ALPHONSE project, a realistic simulation environment was developed that is specifically geared towards the evaluation of human factors for military drone operations. Within the ALPHONSE simulator, military (and other) drone pilots can perform missions in realistic operational conditions. At the same time, they are subjected to a range of factors that can influence operator performance. These constitute both person-induced factors like pressure to achieve the set goals in time or people talking to the pilot and environment-induced stress factors like changing weather conditions. 
During the flight operation, the ALPHONSE simulator continuously monitors over 65 flight parameters. After the flight, an overall performance score is calculated, based upon the achievement of the mission objectives. Throughout the ALPHONSE trials, a wide range of pilots has flown in the simulator, ranging from beginner to expert pilots. Using all the data recorded during these flights, three actions are performed. First, an Artificial Intelligence (AI)-based classifier was trained to automatically recognize in real time good and bad flight behaviour. This allows for the development of a virtual co-pilot that can warn the pilot at any given moment when the pilot is starting to exhibit behaviour that is recognized by the classifier to correspond mostly to the behaviour of inexperienced pilots and not to the behaviour of good pilots. Second, the human factors and their impact on the flight performance were identified and ranked, by linking the induced stress factors to the performance scores. Third, the training procedures were updated to take into consideration the human factors that impact flight performance, such that newly trained pilots are better aware of these influences. The objective of this paper is to present the complete ALPHONSE simulator system for the evaluation of human factors for drone operations and present the results of the experiments with real military flight operators. The focus of the paper will be on the elaboration of the design choices for the development of the AI-based classifier for real-time flight performance evaluation. The proposed development is highly significant, as it presents a concrete and cost-effective methodology for developing a virtual co-pilot for drone pilots that can render drone operations safer. 
Indeed, while the initial training of the AI model requires considerable computing resources, the implementation of the classifier can be readily integrated in commodity flight controllers to provide real-time alerts when pilots are manifesting undesired flight behaviours. The paper will present results of tests with drone pilots from Belgian Defence and civilian Belgian Defence researchers that have flown within the ALPHONSE simulator. These pilots have first acted as data subjects to provide flight data to train the model and have later been used to validate the model. The validation shows that the virtual co-pilot achieves a very high accuracy and can in over 80% of the cases correctly identify bad flight profiles in real-time.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_39_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ahfe20203doroftei,\ntitle={Human factors assessment for drone operations: towards a virtual drone co-pilot},\nauthor={Doroftei, D. and De Cubber, G. and De Smet, H.},\nbooktitle={Human Factors in Robots, Drones and Unmanned Systems. AHFE (2023) International Conference.},\neditors ={Tareq Ahram and Waldemar Karwowski},\npublisher = {AHFE Open Access, AHFE International, USA},\nyear = {2023},\nvol = {93},\nproject = {Alphonse},\nlocation = {San Francisco, USA},\nunit= {meca-ras},\ndoi = {http:\/\/doi.org\/10.54941\/ahfe1003747},\nurl={https:\/\/openaccess.cms-conferences.org\/publications\/book\/978-1-958651-69-8\/article\/978-1-958651-69-8_6},\nabstract = {As the number of drone operations increases, so does the risk of incidents with these novel, yet sometimes dangerous unmanned systems. Research has shown that over 70% of drone incidents are caused by human error, so in order to reduce the risk of incidents, the human factors related to the operation of the drone should be studied. 
However, this is not a trivial exercise, because on the one hand, a realistic operational environment is required (in order to study the human behaviour in realistic conditions), while on the other hand a standardised environment is required, such that repeatable experiments can be set up in order to ensure statistical relevance. In order to remedy this, within the scope of the ALPHONSE project, a realistic simulation environment was developed that is specifically geared towards the evaluation of human factors for military drone operations. Within the ALPHONSE simulator, military (and other) drone pilots can perform missions in realistic operational conditions. At the same time, they are subjected to a range of factors that can influence operator performance. These constitute both person-induced factors like pressure to achieve the set goals in time or people talking to the pilot and environment-induced stress factors like changing weather conditions. During the flight operation, the ALPHONSE simulator continuously monitors over 65 flight parameters. After the flight, an overall performance score is calculated, based upon the achievement of the mission objectives. Throughout the ALPHONSE trials, a wide range of pilots has flown in the simulator, ranging from beginner to expert pilots. Using all the data recorded during these flights, three actions are performed:-An Artificial Intelligence (AI) - based classifier was trained to automatically recognize in real time good and bad flight behaviour. 
This allows for the development of a virtual co-pilot that can warn the pilot at any given moment when the pilot is starting to exhibit behaviour that is recognized by the classifier to correspond mostly to the behaviour of inexperienced pilots and not to the behaviour of good pilots.-An identification and ranking of the human factors and their impact on the flight performance, by linking the induced stress factors to the performance scores-An update of the training procedures to take into consideration the human factors that impact flight performance, such that newly trained pilots are better aware of these influences.The objective of this paper is to present the complete ALPHONSE simulator system for the evaluation of human factors for drone operations and present the results of the experiments with real military flight operators. The focus of the paper will be on the elaboration of the design choices for the development of the AI - based classifier for real-time flight performance evaluation.The proposed development is highly significant, as it presents a concrete and cost-effective methodology for developing a virtual co-pilot for drone pilots that can render drone operations safer. Indeed, while the initial training of the AI model requires considerable computing resources, the implementation of the classifier can be readily integrated in commodity flight controllers to provide real-time alerts when pilots are manifesting undesired flight behaviours.The paper will present results of tests with drone pilots from Belgian Defence and civilian Belgian Defence researchers that have flown within the ALPHONSE simulator. These pilots have first acted as data subjects to provide flight data to train the model and have later been used to validate the model. 
The validation shows that the virtual co-pilot achieves a very high accuracy and can correctly identify bad flight profiles in real time in over 80% of cases.}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2025<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    E. Ghisoni, S. Govindaraj, A. M. C. Faul\u00ed, G. De Cubber, F. Polisano, N. Aouf, D. Rondao, Z. Chekakta, and B. de Waard, &#8220;Multi-agent system and AI for Explosive Ordnance Disposal,\" in <span style=\"font-style: italic\">19th International Symposium Mine Action<\/span>,  2023, p. 26.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_170\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.ctro.hr\/userfiles\/files\/MINE-ACTION-2023_.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_170_block\">\n<pre><code class=\"tex bibtex\">@inproceedings{ghisonimulti,\ntitle={Multi-agent system and AI for Explosive Ordnance Disposal},\nauthor={Ghisoni, Enzo and Govindaraj, Shashank and Faul\u00ed, Ana Mar\u00eda Casado and De Cubber, Geert and Polisano, Fabio and Aouf, Nabil and Rondao, Duarte and Chekakta, Zakaria and de Waard, Bob},\nbooktitle={19th International Symposium Mine Action},\npublisher = {CEIA},\nyear = {2023},\nproject = {AIDED},\nlocation = {Croatia},\nunit= {meca-ras},\nurl={https:\/\/www.ctro.hr\/userfiles\/files\/MINE-ACTION-2023_.pdf},\npages={26}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2022<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    R. Lahouli, G. De Cubber, B. Pairet, C. Hamesse, T. Freville, and R. Haelterman, &#8220;Deep Learning based Object Detection and Tracking for Maritime Situational Awareness,\" in <span style=\"font-style: italic\">Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications &#8211; Volume 4: VISAPP<\/span>,  2022, pp. 643-650.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_161\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.scitepress.org\/PublicationsDetail.aspx?ID=mJ5eF6o+SbM=&#038;t=1\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5220\/0010901000003124' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_161_block\">\n<pre><code class=\"tex bibtex\">@conference{visapp22,\nauthor={Lahouli, Rihab and De Cubber, Geert and Pairet, Benoit and Hamesse, Charles and Freville, Timothee and Haelterman, Rob},\ntitle={Deep Learning based Object Detection and Tracking for Maritime Situational Awareness},\nbooktitle={Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP,},\nyear={2022},\npages={643-650},\npublisher={SciTePress},\norganization={INSTICC},\ndoi={10.5220\/0010901000003124},\nisbn={978-989-758-555-5},\nproject={SSAVE},\nurl={https:\/\/www.scitepress.org\/PublicationsDetail.aspx?ID=mJ5eF6o+SbM=&t=1},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and H. De Smet, &#8220;A quantitative measure for the evaluation of drone-based video quality on a target,\" in <span style=\"font-style: italic\">Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS)<\/span>, Venice, Italy,  2022.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_162\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_162\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.thinkmind.org\/articles\/icas_2022_1_40_20018.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='https:\/\/www.thinkmind.org\/index.php?view=article&#038;articleid=icas_2022_1_40_20018' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_162_block\">\n<p>This paper presents a methodology to assess video quality and based on that automatically calculate drone trajectories that optimize the video quality.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_162_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2022alphonse2,\nauthor = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},\nbooktitle = {Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS)},\ntitle = {A quantitative measure for the evaluation of drone-based video quality on a target},\nyear = {2022},\nmonth = jun,\norganization = {IARIA},\npublisher = {ThinkMind},\naddress = {Venice, Italy},\nurl = {https:\/\/www.thinkmind.org\/articles\/icas_2022_1_40_20018.pdf},\nisbn={978-1-61208-966-9},\nabstract = {This paper presents a methodology to assess video quality and based on that automatically calculate drone trajectories that optimize the video quality.},\nproject = {Alphonse},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    E. Ghisoni and G. 
De Cubber, &#8220;AIDED: Robotics &#038; Artificial Intelligence for Explosive Ordnance Disposal,\" in <span style=\"font-style: italic\">International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance (VRISE)<\/span>, Les Bons Villers, Belgium,  2022.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_163\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_163\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.ici-belgium.be\/registration-and-program-vrise2022-june-7\/\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_163_block\">\n<p>This paper presents an overview of the AIDED project on AI for IED detection.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_163_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{ghisoni2022a,\nauthor = {Ghisoni, Enzo and De Cubber, Geert},\nbooktitle = {International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance (VRISE)},\ntitle = {AIDED: Robotics & Artificial Intelligence for Explosive Ordnance Disposal},\nyear = {2022},\nmonth = jun,\norganization = {IMEKO},\npublisher = {IMEKO},\naddress = {Les Bons Villers, Belgium},\nurl = {https:\/\/www.ici-belgium.be\/registration-and-program-vrise2022-june-7\/},\nabstract = {This paper presents an overview of the AIDED project on AI for IED detection.},\nproject = {AIDED},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and F. E. Schneider, &#8220;Military Robotics,\" in <span style=\"font-style: italic\">Encyclopedia of Robotics<\/span>, M. H. Ang, O. Khatib, and B. Siciliano, Eds., Springer, 2022.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_164\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/meteor.springer.com\/project\/dashboard.jsf?id=347\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_164_block\">\n<pre><code class=\"tex bibtex\">@InCollection{encyclopedia2022,\nauthor = {De Cubber, Geert and Schneider, Frank E.},\ntitle = {Military Robotics},\neditor = {Ang, Marcelo H. and Khatib, Oussama and Siciliano, Bruno},\nbooktitle = {Encyclopedia of Robotics},\npublisher = {Springer},\nyear = {2022},\nurl = {https:\/\/meteor.springer.com\/project\/dashboard.jsf?id=347},\nproject = {iMUGs},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and H. De Smet, &#8220;Assessing Human Factors for Drone Operations in a Simulation Environment,\" in <span style=\"font-style: italic\">Human Factors in Robots, Drones and Unmanned Systems &#8211; AHFE (2022) International Conference<\/span>, New York, USA,  2022.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_165\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_165\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/openaccess-api.cms-conferences.org\/articles\/download\/978-1-958651-33-9_16\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.54941\/ahfe1002319' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_165_block\">\n<p>This paper presents an overview of the Alphonse methodology for Assessing Human Factors for Drone Operations in a Simulation Environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_165_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2022a,\nauthor = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},\nbooktitle = {Human Factors in Robots, Drones and Unmanned Systems - AHFE (2022) International Conference},\ntitle = {Assessing Human Factors for Drone Operations in a Simulation Environment},\nyear = {2022},\nmonth = jul,\nvolume = {57},\neditor = {Tareq Ahram and Waldemar Karwowski},\npublisher = {AHFE International},\naddress = {New York, USA},\nurl = {https:\/\/openaccess-api.cms-conferences.org\/articles\/download\/978-1-958651-33-9_16},\nabstract = {This paper presents an overview of the Alphonse methodology for Assessing Human Factors for Drone Operations in a Simulation Environment.},\ndoi = {10.54941\/ahfe1002319},\nproject = {Alphonse},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T. Halleux, T. Nguyen, C. Hamesse, G. De Cubber, and B. 
Janssens, &#8220;Visual Drone Detection and Tracking for Autonomous Operation from Maritime Vessel,\" in <span style=\"font-style: italic\">Proceedings of TC17-ISMCR2022 &#8211; A Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), Theme: &#8220;Robotics and Virtual Tools for a New Era\"<\/span>,  2022.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_166\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mecatron.rma.ac.be\/pub\/2022\/ISMCR-Drone_detection_tracking_FullPaper.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.7074445' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_166_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{ismcr2022_1,\nauthor={Halleux, Timothy and Nguyen, Tien-Thanh and Hamesse, Charles and De Cubber, Geert and Janssens, Bart},\nbooktitle={Proceedings of TC17-ISMCR2022 - A Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), Theme: \"Robotics and Virtual Tools for a New Era\"},\ntitle={Visual Drone Detection and Tracking for Autonomous Operation from Maritime Vessel},\nyear={2022},\nvolume={},\nnumber={},\nurl={https:\/\/mecatron.rma.ac.be\/pub\/2022\/ISMCR-Drone_detection_tracking_FullPaper.pdf},\nproject={MarLand, COURAGEOUS},\npublisher={IMEKO},\naddress={},\ndoi={10.5281\/zenodo.7074445},\nmonth={September},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T. Dutrannois, T. Nguyen, C. Hamesse, G. De Cubber, and B. 
Janssens, &#8220;Visual SLAM for Autonomous Drone Landing on a Maritime Platform,\" in <span style=\"font-style: italic\">Proceedings of TC17-ISMCR2022 &#8211; A Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), Theme: &#8220;Robotics and Virtual Tools for a New Era\"<\/span>,  2022.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_167\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mecatron.rma.ac.be\/pub\/2022\/ISMCR-Visual_SLAM_FullPaper.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.7074451' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_167_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{ismcr2022_2,\nauthor={Dutrannois, Thomas and Nguyen, Tien-Thanh and Hamesse, Charles and De Cubber, Geert and Janssens, Bart},\nbooktitle={Proceedings of TC17-ISMCR2022 - A Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), Theme: \"Robotics and Virtual Tools for a New Era\"},\ntitle={Visual SLAM for Autonomous Drone Landing on a Maritime Platform},\nyear={2022},\nvolume={},\nnumber={},\nurl={https:\/\/mecatron.rma.ac.be\/pub\/2022\/ISMCR-Visual_SLAM_FullPaper.pdf},\nproject={MarLand},\npublisher={IMEKO},\naddress={},\ndoi={10.5281\/zenodo.7074451},\nmonth={September},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    E. Le Fl\u00e9cher, A. La Grappe, and G. De Cubber, &#8220;iMUGS &#8211; A ground multi-robot architecture for military Manned-Unmanned Teaming,\" in <span style=\"font-style: italic\">2022 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS)<\/span>, IEEE, 2022.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_169\" class=\"papercite_toggle\">[BibTeX]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_169_block\">\n<pre><code class=\"tex bibtex\">@inbook{imugs_le_flecher_la_grappe_de_cubber,\nplace={Kyoto},\ntitle={iMUGS - A ground multi-robot architecture for military Manned-Unmanned Teaming},\nbooktitle={2022 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS)},\npublisher={IEEE},\nyear={2022},\nauthor={Le Fl\u00e9cher, Emile and La Grappe, Alexandre and De Cubber, Geert},\nproject = {iMUGs},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2021<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    K. Mathiassen, F. E. Schneider, P. Bounker, A. Tiderko, G. D. Cubber, M. Baksaas, J. G\u0142\u00f3wka, R. Kozik, T. Nussbaumer, J. R\u00f6ning, J. Pellenz, and A. 
Volk, &#8220;Demonstrating interoperability between unmanned ground systems and command and control systems,\" <span style=\"font-style: italic\">International Journal of Intelligent Defence Support Systems<\/span>, vol. 6, iss. 2, pp. 100-129, 2021.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_157\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.inderscienceonline.com\/doi\/abs\/10.1504\/IJIDSS.2021.115236\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1504\/ijidss.2021.115236' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_157_block\">\n<pre><code class=\"tex bibtex\">@article{doi:10.1504\/IJIDSS.2021.115236,\nauthor = {Mathiassen, Kim and Schneider, Frank E. and Bounker, Paul and Tiderko, Alexander and Cubber, Geert De and Baksaas, Magnus and G\u0142\u00f3wka, Jakub and Kozik, Rafa\u0142 and Nussbaumer, Thomas and R\u00f6ning, Juha and Pellenz, Johannes and Volk, Andr\u00e9},\ntitle = {Demonstrating interoperability between unmanned ground systems and command and control systems},\njournal = {International Journal of Intelligent Defence Support Systems},\nvolume = {6},\nnumber = {2},\npages = {100-129},\nyear = {2021},\ndoi = {10.1504\/IJIDSS.2021.115236},\nurl = {https:\/\/www.inderscienceonline.com\/doi\/abs\/10.1504\/IJIDSS.2021.115236},\neprint = {https:\/\/www.inderscienceonline.com\/doi\/pdf\/10.1504\/IJIDSS.2021.115236},\nproject = {ICARUS, iMUGs},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, T. De Vleeschauwer, S. L. Bue, M. Dewyn, F. Vanderstraeten, and G. De Cubber, &#8220;Human-Agent Trust Evaluation in a Digital Twin Context,\" in <span style=\"font-style: italic\">2021 30th IEEE International Conference on Robot Human Interactive Communication (RO-MAN)<\/span>, Vancouver, BC, Canada,  2021, pp. 203-207.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_158\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.researchgate.net\/profile\/Geert-De-Cubber\/publication\/354078858_Human-Agent_Trust_Evaluation_in_a_Digital_Twin_Context\/links\/61430bd22bfbd83a46cf2b8c\/Human-Agent-Trust-Evaluation-in-a-Digital-Twin-Context.pdf?_sg%5B0%5D=BdEPB9AGDUV3sOwnEQKCr-DgWRA7uDNeMlvyQYNaMPGSO2bhCDbyG4AENXXxH3j323ypYTq9nMftVbDr2fsCSA.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&#038;_sg%5B1%5D=ykQnQS2LN8fUQXAYx5Fpiy2NXqIwqO1UyVCENkpSUUWZn8Qqgrelh1bb4ry9Q9XPgCts7lVXU1_68YLjqnCPh4seSzWfG5BpKHc3MuFwsK6l.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&#038;_iepl=\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/RO-MAN50785.2021.9515445' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_158_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{9515445,\nauthor={Doroftei, Daniela and De Vleeschauwer, Tom and Bue, Salvatore Lo and Dewyn, Micha\u00ebl and Vanderstraeten, Frik and De Cubber, Geert},\nbooktitle={2021 30th IEEE International Conference on Robot Human Interactive Communication (RO-MAN)},\ntitle={Human-Agent Trust Evaluation in a Digital Twin 
Context},\nyear={2021},\nvolume={},\nnumber={},\npages={203-207},\nurl={https:\/\/www.researchgate.net\/profile\/Geert-De-Cubber\/publication\/354078858_Human-Agent_Trust_Evaluation_in_a_Digital_Twin_Context\/links\/61430bd22bfbd83a46cf2b8c\/Human-Agent-Trust-Evaluation-in-a-Digital-Twin-Context.pdf?_sg%5B0%5D=BdEPB9AGDUV3sOwnEQKCr-DgWRA7uDNeMlvyQYNaMPGSO2bhCDbyG4AENXXxH3j323ypYTq9nMftVbDr2fsCSA.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&_sg%5B1%5D=ykQnQS2LN8fUQXAYx5Fpiy2NXqIwqO1UyVCENkpSUUWZn8Qqgrelh1bb4ry9Q9XPgCts7lVXU1_68YLjqnCPh4seSzWfG5BpKHc3MuFwsK6l.ePETOgrc5VHnE0GK_yjBK1XVVfdQ9S6g2UKVfg8Z8miIkGlMPXpzaYKlB0JPDSiroGp9QoFbmcY2egYAXbL1ZQ&_iepl=},\nproject={Alphonse},\npublisher={IEEE},\naddress={Vancouver, BC, Canada},\nmonth=aug,\ndoi={10.1109\/RO-MAN50785.2021.9515445},\nunit= {meca-ras}}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, R. Lahouli, D. Doroftei, and R. Haelterman, &#8220;Distributed coverage optimisation for a fleet of unmanned maritime systems,\" <span style=\"font-style: italic\">ACTA IMEKO<\/span>, vol. 10, iss. 3, pp. 36-43, 2021.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_159\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_159\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-10%20%282021%29-03-07\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/http:\/\/dx.doi.org\/10.21014\/acta_imeko.v10i3.1031' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_159_block\">\n<p>Unmanned maritime systems (UMS) can provide important benefits for maritime law enforcement agencies for tasks such as area surveillance and patrolling, especially when they are able to work together as one coordinated system. 
In this context, this paper proposes a methodology that optimises the coverage of a fleet of UMS, thereby maximising the opportunities for identifying threats. Unlike traditional approaches to maritime coverage optimisation, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small UMS, compared with traditional large ships, by incorporating the danger level into the design of the optimiser.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_159_block\">\n<pre><code class=\"tex bibtex\">@ARTICLE{cubberimeko2021,\nauthor={De Cubber, Geert and Lahouli, Rihab and Doroftei, Daniela and Haelterman, Rob},\njournal={ACTA IMEKO},\ntitle={Distributed coverage optimisation for a fleet of unmanned maritime systems},\nyear={2021},\nvolume={10},\nnumber={3},\npages={36-43},\nissn={2221-870X},\nurl={https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-10%20%282021%29-03-07\/pdf},\nproject={MarSur, SSAVE},\npublisher={IMEKO},\nmonth=oct,\nabstract = {Unmanned maritime systems (UMS) can provide important benefits for maritime law enforcement agencies for tasks such as area surveillance and patrolling, especially when they are able to work together as one coordinated system. In this context, this paper proposes a methodology that optimises the coverage of a fleet of UMS, thereby maximising the opportunities for identifying threats. Unlike traditional approaches to maritime coverage optimisation, which are also used, for example, in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small UMS, compared with traditional large ships, by incorporating the danger level into the design of the optimiser. },\ndoi={http:\/\/dx.doi.org\/10.21014\/acta_imeko.v10i3.1031},\nunit= {meca-ras}}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, G. 
De Cubber, and E. Cepolina, &#8220;Mobile Robots Supporting Risky Interventions, Humanitarian actions and Demining, in particular the promising DISARMADILLO Tool,\" in <span style=\"font-style: italic\">Proceedings of TC17-VRISE2021 &#8211; A VIRTUAL Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), Theme: &#8220;Robotics for Risky Interventions and Environmental Surveillance\"<\/span>, Houston, TX, USA,  2021, pp. 5-6.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_160\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/mecatron.rma.ac.be\/pub\/2021\/TC17-VRISE2021-Abstract%20Proceedings.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_160_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{knvrise,\nauthor={Baudoin, Yvan and De Cubber, Geert and Cepolina, Emanuela},\nbooktitle={Proceedings of TC17-VRISE2021 - A VIRTUAL Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), Theme: \"Robotics for Risky Interventions and Environmental Surveillance\"},\ntitle={Mobile Robots Supporting Risky Interventions, Humanitarian actions and Demining, in particular the promising DISARMADILLO Tool},\nyear={2021},\nvolume={},\nnumber={},\npages={5-6},\nurl={https:\/\/mecatron.rma.ac.be\/pub\/2021\/TC17-VRISE2021-Abstract%20Proceedings.pdf},\nproject={AIDED, Alphonse, MarSur, SSAVE, MarLand, iMUGs, ICARUS, TIRAMISU},\npublisher={IMEKO},\naddress={Houston, TX, USA},\nmonth=oct,\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2020<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    H. Balta, J. Velagic, H. Beglerovic, G. De Cubber, and B. 
Siciliano, &#8220;3D Registration and Integrated Segmentation Framework for Heterogeneous Unmanned Robotic Systems,\" <span style=\"font-style: italic\">Remote Sensing<\/span>, vol. 12, iss. 10, p. 1608, 2020.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_40\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_40\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.mdpi.com\/2072-4292\/12\/10\/1608\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3390\/rs12101608' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_40_block\">\n<p>The paper proposes a novel framework for registering and segmenting 3D point clouds of large-scale natural terrain and complex environments coming from a multisensor heterogeneous robotics system, consisting of unmanned aerial and ground vehicles. This framework involves data acquisition and pre-processing, 3D heterogeneous registration and integrated multi-sensor based segmentation modules. The first module provides robust and accurate homogeneous registrations of 3D environmental models based on sensors\u2019 measurements acquired from the ground (UGV) and aerial (UAV) robots. For 3D UGV registration, we proposed a novel local minima escape ICP (LME-ICP) method, which is based on the well known iterative closest point (ICP) algorithm extending it by the introduction of our local minima estimation and local minima escape mechanisms. It did not require any prior known pose estimation information acquired from sensing systems like odometry, global positioning system (GPS), or inertial measurement units (IMU). The 3D UAV registration has been performed using the Structure from Motion (SfM) approach. 
In order to improve and speed up the process of outlier removal for large-scale outdoor environments, we introduced the Fast Cluster Statistical Outlier Removal (FCSOR) method. This method was used to filter out the noise and to downsample the input data, which will spare computational and memory resources for further processing steps. Then, we co-registered a point cloud acquired from a laser ranger (UGV) and a point cloud generated from images (UAV) by the SfM method. The 3D heterogeneous module consists of a semi-automated 3D scan registration system, developed with the aim to overcome the shortcomings of the existing fully automated 3D registration approaches. This semi-automated registration system is based on the novel Scale Invariant Registration Method (SIRM). The SIRM provides the initial scaling between two heterogeneous point clouds and provides an adaptive mechanism for tuning the mean scale, based on the difference between two consecutive estimated point clouds\u2019 alignment error values. Once aligned, the resulting homogeneous ground-aerial point cloud is further processed by a segmentation module. For this purpose, we have proposed a system for integrated multi-sensor based segmentation of 3D point clouds. This system followed a two-step sequence: ground-object segmentation and color-based region-growing segmentation. 
The experimental validation of the proposed 3D heterogeneous registration and integrated segmentation framework was performed on large-scale datasets representing unstructured outdoor environments, demonstrating the potential and benefits of the proposed semi-automated 3D registration system in real-world environments.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_40_block\">\n<pre><code class=\"tex bibtex\">@Article{balta20203Dregistration,\nauthor = {Balta, Haris and Velagic, Jasmin and Beglerovic, Halil and De Cubber, Geert and Siciliano, Bruno},\njournal = {Remote Sensing},\ntitle = {3D Registration and Integrated Segmentation Framework for Heterogeneous Unmanned Robotic Systems},\nyear = {2020},\nmonth = may,\nnumber = {10},\npages = {1608},\nvolume = {12},\nabstract = {The paper proposes a novel framework for registering and segmenting 3D point clouds of large-scale natural terrain and complex environments coming from a multisensor heterogeneous robotics system, consisting of unmanned aerial and ground vehicles. This framework involves data acquisition and pre-processing, 3D heterogeneous registration and integrated multi-sensor based segmentation modules. The first module provides robust and accurate homogeneous registrations of 3D environmental models based on sensors\u2019 measurements acquired from the ground (UGV) and aerial (UAV) robots. For 3D UGV registration, we proposed a novel local minima escape ICP (LME-ICP) method, which is based on the well known iterative closest point (ICP) algorithm extending it by the introduction of our local minima estimation and local minima escape mechanisms. It did not require any prior known pose estimation information acquired from sensing systems like odometry, global positioning system (GPS), or inertial measurement units (IMU). The 3D UAV registration has been performed using the Structure from Motion (SfM) approach. 
In order to improve and speed up the process of outlier removal for large-scale outdoor environments, we introduced the Fast Cluster Statistical Outlier Removal (FCSOR) method. This method was used to filter out the noise and to downsample the input data, which will spare computational and memory resources for further processing steps. Then, we co-registered a point cloud acquired from a laser ranger (UGV) and a point cloud generated from images (UAV) by the SfM method. The 3D heterogeneous module consists of a semi-automated 3D scan registration system, developed with the aim to overcome the shortcomings of the existing fully automated 3D registration approaches. This semi-automated registration system is based on the novel Scale Invariant Registration Method (SIRM). The SIRM provides the initial scaling between two heterogeneous point clouds and provides an adaptive mechanism for tuning the mean scale, based on the difference between two consecutive estimated point clouds\u2019 alignment error values. Once aligned, the resulting homogeneous ground-aerial point cloud is further processed by a segmentation module. For this purpose, we have proposed a system for integrated multi-sensor based segmentation of 3D point clouds. This system followed a two-step sequence: ground-object segmentation and color-based region-growing segmentation. The experimental validation of the proposed 3D heterogeneous registration and integrated segmentation framework was performed on large-scale datasets representing unstructured outdoor environments, demonstrating the potential and benefits of the proposed semi-automated 3D registration system in real-world environments.},\ndoi = {10.3390\/rs12101608},\nproject = {NRTP,ICARUS,TIRAMISU,MarSur},\npublisher = {MDPI},\nurl = {https:\/\/www.mdpi.com\/2072-4292\/12\/10\/1608\/pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and H. 
De Smet, &#8220;Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process,\" in <span style=\"font-style: italic\">Advances in Human Factors in Robots, Drones and Unmanned Systems<\/span>, San Diego, USA,  2020, p. 71\u201377.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_149\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_149\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2020\/Reducing%20drone%20incidents%20by%20incorporating%20human%20factors%20in%20the%20drone%20and%20drone%20pilot%20accreditation%20process.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1007\/978-3-030-51758-8_10' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_149_block\">\n<p>Considering the ever-increasing use of drones in a plentitude of application areas, the risk is that also an ever-increasing number of drone incidents would be observed. Research has shown that a large majority of all incidents with drones is due not to technological, but to human error. An advanced risk-reduction methodology, focusing on the human element, is thus required in order to allow for the safe use of drones. In this paper, we therefore introduce a novel concept to provide a qualitative and quantitative assessment of the performance of the drone operator. 
The proposed methodology is based, on the one hand, on the development of standardized test methodologies and, on the other hand, on human performance modeling of the drone operators in a highly realistic simulation environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_149_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2020alphonse,\nauthor = {Doroftei, Daniela and De Cubber, Geert and De Smet, Hans},\nbooktitle = {Advances in Human Factors in Robots, Drones and Unmanned Systems},\ntitle = {Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process},\nyear = {2020},\nmonth = jul,\neditor = {Zallio, Matteo},\npublisher = {Springer International Publishing},\npages = {71--77},\nisbn = {978-3-030-51758-8},\norganization = {AHFE},\naddress = {San Diego, USA},\nabstract = {Considering the ever-increasing use of drones in a plenitude of application areas, the risk is that an ever-increasing number of drone incidents will also be observed. Research has shown that a large majority of all incidents with drones is due not to technological, but to human error. An advanced risk-reduction methodology, focusing on the human element, is thus required in order to allow for the safe use of drones. In this paper, we therefore introduce a novel concept to provide a qualitative and quantitative assessment of the performance of the drone operator. The proposed methodology is based, on the one hand, on the development of standardized test methodologies and, on the other hand, on human performance modeling of the drone operators in a highly realistic simulation environment.},\ndoi = {10.1007\/978-3-030-51758-8_10},\nunit= {meca-ras},\nproject = {Alphonse},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2020\/Reducing%20drone%20incidents%20by%20incorporating%20human%20factors%20in%20the%20drone%20and%20drone%20pilot%20accreditation%20process.pdf},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. 
Kakogawa, S. Ma, B. Ristic, C. Gilliam, A. K. Kamath, V. K. Tripathi, L. Behera, A. Ferrein, I. Scholl, T. Neumann, K. Kr\u00fcckel, S. Schiffer, A. Joukhadar, M. Alchehabi, and A. Jejeh, <span style=\"font-style: italic\">Unmanned Robotic Systems and Applications<\/span>, M. Reyhanoglu and G. De Cubber, Eds., InTech, 2020.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_153\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_153\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/unmanned-robotic-systems-and-applications\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.88414' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_153_block\">\n<p>This book presents recent studies of unmanned robotic systems and their applications. With its five chapters, the book brings together important contributions from renowned international researchers. Unmanned autonomous robots are ideal candidates for applications such as rescue missions, especially in areas that are difficult to access. Swarm robotics (multiple robots working together) is another exciting application of the unmanned robotics systems, for example, coordinated search by an interconnected group of moving robots for the purpose of finding a source of hazardous emissions. 
These robots can behave like individuals working in a group without a centralized control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_153_block\">\n<pre><code class=\"tex bibtex\">@Book{de2020unmanned,\nauthor = {Atsushi Kakogawa and Shugen Ma and Branko Ristic and Christopher Gilliam and Archit Krishna Kamath and Vibhu Kumar Tripathi and Laxmidhar Behera and Alexander Ferrein and Ingrid Scholl and Tobias Neumann and Kai Kr\u00fcckel and Stefan Schiffer and Abdulkader Joukhadar and Mohammad Alchehabi and Adnan Jejeh},\neditor = {Reyhanoglu, Mahmut and De Cubber, Geert},\npublisher = {{InTech}},\ntitle = {Unmanned Robotic Systems and Applications},\nyear = {2020},\nmonth = apr,\nabstract = {This book presents recent studies of unmanned robotic systems and their applications. With its five chapters, the book brings together important contributions from renowned international researchers. Unmanned autonomous robots are ideal candidates for applications such as rescue missions, especially in areas that are difficult to access. Swarm robotics (multiple robots working together) is another exciting application of the unmanned robotics systems, for example, coordinated search by an interconnected group of moving robots for the purpose of finding a source of hazardous emissions. These robots can behave like individuals working in a group without a centralized control.},\ndoi = {10.5772\/intechopen.88414},\nproject = {NRTP,ICARUS,MarSur},\nurl = {https:\/\/www.intechopen.com\/books\/unmanned-robotic-systems-and-applications},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, R. Lahouli, D. Doroftei, and R. Haelterman, &#8220;Distributed coverage optimization for a fleet of unmanned maritime systems for a maritime patrol and surveillance application,\" in <span style=\"font-style: italic\">ISMCR 2020: 23rd International Symposium on Measurement and Control in Robotics<\/span>, Budapest, Hungary,  2020.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_156\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_156\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2020\/conference_101719.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ISMCR51255.2020.9263740' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_156_block\">\n<p>In order for unmanned maritime systems to provide added value for maritime law enforcement agencies, they have to be able to work together as a coordinated team for tasks such as area surveillance and patrolling. Therefore, this paper proposes a methodology that optimizes the coverage of a fleet of unmanned maritime systems, and thereby maximizes the chances of noticing threats. Unlike traditional approaches for maritime coverage optimization, which are also used for example in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small unmanned systems, as compared to traditional large ships, by incorporating the danger level in the design of the optimizer.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_156_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{decubber2020dco,\nauthor = {De Cubber, Geert and Lahouli, Rihab and Doroftei, Daniela and Haelterman, Rob},\nbooktitle = {ISMCR 2020: 23rd International Symposium on Measurement and Control in Robotics},\ntitle = {Distributed coverage optimization for a fleet of unmanned maritime systems for a maritime patrol and surveillance application},\nyear = {2020},\nmonth = oct,\norganization = {ISMCR},\npublisher = {{IEEE}},\nabstract = {In order for unmanned maritime systems to provide added value for maritime law enforcement agencies, they 
have to be able to work together as a coordinated team for tasks such as area surveillance and patrolling. Therefore, this paper proposes a methodology that optimizes the coverage of a fleet of unmanned maritime systems, and thereby maximizes the chances of noticing threats. Unlike traditional approaches for maritime coverage optimization, which are also used for example in search and rescue operations when searching for victims at sea, this approach takes into consideration the limited seaworthiness of small unmanned systems, as compared to traditional large ships, by incorporating the danger level in the design of the optimizer.},\nproject = {SSAVE,MarSur},\naddress = {Budapest, Hungary},\ndoi = {10.1109\/ISMCR51255.2020.9263740},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2020\/conference_101719.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2019<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, &#8220;Opportunities and threats posed by new technologies,\" in <span style=\"font-style: italic\">SciFi-IT<\/span>, Ghent, Belgium,  2019.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_129\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_129\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2019\/Sci-Fi-It-2019-DeCubber (2).pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.2628758' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_129_block\">\n<p>The technological evolution is introducing new technologies into our everyday lives at a fast pace. As always, these new technologies can be applied for good causes and thereby give us the opportunity to do many interesting new things. 
Think for example about drones transporting blood samples between hospitals. However, as always, new technologies can also be applied for bad causes. Think for example about the same drones, but this time transporting bomb parcels instead of blood. In this paper, we focus on a number of novel technologies and discuss how security actors are currently doing their best to maximize the good use of these tools while minimizing the bad use. We will focus on research actions taken by the Belgian Royal Military Academy in the domains of: &#8211; Augmented reality, and showcase how this technology can be used to improve surveillance operations. &#8211; Unmanned Aerial Systems (Drones), and showcase how the potential security threats posed by these systems can be mitigated by novel drone detection systems. &#8211; Unmanned Maritime Systems, and showcase how this technology can be used to increase the safety at sea. &#8211; Unmanned Ground Systems, and more specifically the autonomous cars, showcasing how to prevent potential cyber-attacks on these future transportation tools.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_129_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2019opportunities,\nauthor = {De Cubber, Geert},\nbooktitle = {SciFi-IT},\ntitle = {Opportunities and threats posed by new technologies},\nyear = {2019},\nabstract = {The technological evolution is introducing new technologies into our everyday lives at a fast pace. As always, these new technologies can be applied for good causes and thereby give us the opportunity to do many interesting new things. Think for example about drones transporting blood samples between hospitals. However, as always, new technologies can also be applied for bad causes. 
Think for example about the same drones, but this time transporting bomb parcels instead of blood.\nIn this paper, we focus on a number of novel technologies and discuss how security actors are currently\ndoing their best to maximize the good use of these tools while minimizing the bad use. We will focus on research actions taken by the Belgian Royal Military Academy in the domains of:\n- Augmented reality, and showcase how this technology can be used to improve surveillance operations.\n- Unmanned Aerial Systems (Drones), and showcase how the potential security threats posed by these systems can be mitigated by novel drone detection systems.\n- Unmanned Maritime Systems, and showcase how this technology can be used to increase the safety at sea.\n- Unmanned Ground Systems, and more specifically the autonomous cars, showcasing how to prevent potential cyber-attacks on these future transportation tools.},\ndoi = {10.5281\/zenodo.2628758},\naddress = {Ghent, Belgium},\nproject = {MarSur,SafeShore},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2019\/Sci-Fi-It-2019-DeCubber (2).pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, &#8220;Explosive drones: How to deal with this new threat?,\" in <span style=\"font-style: italic\">International workshop on Measurement, Prevention, Protection and Management of CBRN Risks (RISE)<\/span>, Les Bon Villers, Belgium,  2019.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_130\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_130\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2019\/Explosive drones - How to deal with this new threat.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/ZENODO.2628752' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_130_block\">\n<p>As the commercial and recreational use of small unmanned aerial vehicles or drones is booming, the military and criminals are also starting to use these systems more and more. Due to improvements in flight stability, autonomy and payload capacity, it becomes possible to equip these drones with explosive charges, making them threat agents against which traditional response mechanisms have few answers. In this paper, we will discuss this new type of threat in detail, distinguishing between loitering munitions, as used by regular armies, and traditional drones equipped with explosive charges, as used in guerrilla warfare and by criminals. 
We will then discuss which research actions are currently being undertaken to provide answers to each of these threats, which countermeasures are already available, and which will become available in the near future.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_130_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2019explosive,\nauthor = {De Cubber, Geert},\nbooktitle = {International workshop on Measurement, Prevention, Protection and Management of CBRN Risks (RISE)},\ntitle = {Explosive drones: How to deal with this new threat?},\nyear = {2019},\nnumber = {9},\naddress = {Les Bon Villers, Belgium},\nabstract = {As the commercial and recreational use of small unmanned aerial vehicles or drones is booming, the military and criminals are also starting to use these systems more and more. Due to improvements in flight stability, autonomy and payload capacity, it becomes possible to equip these drones with explosive charges, making them threat agents against which traditional response mechanisms have few answers. In this paper, we will discuss this new type of threat in detail, distinguishing between loitering munitions, as used by regular armies, and traditional drones equipped with explosive charges, as used in guerrilla warfare and by criminals. We will then discuss which research actions are currently being undertaken to provide answers to each of these threats, which countermeasures are already available, and which will become available in the near future.},\ndoi = {10.5281\/ZENODO.2628752},\nproject = {SafeShore},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2019\/Explosive drones - How to deal with this new threat.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, Z. Chtourou, M. A. Ben Ayed, R. Haelterman, G. De Cubber, and R. 
Attia, &#8220;Pedestrian Detection and Trajectory Estimation in the Compressed Domain Using Thermal Images,\" in <span style=\"font-style: italic\">Computer Vision, Imaging and Computer Graphics Theory and Applications<\/span>, Springer, 2019, p. 212\u2013227.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_131\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_131\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.springerprofessional.de\/en\/pedestrian-detection-and-trajectory-estimation-in-the-compressed\/16976092\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1007\/978-3-030-26756-8_10' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_131_block\">\n<p>For a few decades, Unmanned Aerial Vehicles (UAVs) have been considered precious tools for different military applications such as automatic surveillance in outdoor environments. Nevertheless, the onboard implementation of image and video processing techniques poses many challenges, like the high computational cost and the high bandwidth requirements, especially on low-performance processing platforms like small or medium UAVs. A fast and efficient framework for pedestrian detection and trajectory estimation for outdoor surveillance using thermal images is presented in this paper. First, the detection process is based on a conjunction between contrast enhancement techniques and saliency maps as a hotspot detector, on Discrete Chebychev Moments (DCM) as a global image content descriptor and on a linear Support Vector Machine (SVM) as a classifier. Second, raw H.264\/AVC compressed video streams with limited computational overhead are exploited to estimate the trajectories of the detected pedestrians. 
In order to simulate suspicious events, six different scenarios were carried out and filmed using a thermal camera. The obtained results show the effectiveness and the low computational requirements of the proposed framework which make it suitable for real-time applications and onboard implementation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_131_block\">\n<pre><code class=\"tex bibtex\">@InCollection{lahouli2019pedestrian,\nauthor = {Lahouli, Ichraf and Chtourou, Zied and Ben Ayed, Mohamed Ali and Haelterman, Rob and De Cubber, Geert and Attia, Rabah},\nbooktitle = {Computer Vision, Imaging and Computer Graphics Theory and Applications},\npublisher = {Springer},\ntitle = {Pedestrian Detection and Trajectory Estimation in the Compressed Domain Using Thermal Images},\nyear = {2019},\npages = {212--227},\nabstract = {For a few decades, Unmanned Aerial Vehicles (UAVs) have been considered precious tools for different military applications such as automatic surveillance in outdoor environments. Nevertheless, the onboard implementation of image and video processing techniques poses many challenges, like the high computational cost and the high bandwidth requirements, especially on low-performance processing platforms like small or medium UAVs. A fast and efficient framework for pedestrian detection and trajectory estimation for outdoor surveillance using thermal images is presented in this paper. First, the detection process is based on a conjunction between contrast enhancement techniques and saliency maps as a hotspot detector, on Discrete Chebychev Moments (DCM) as a global image content descriptor and on a linear Support Vector Machine (SVM) as a classifier. Second, raw H.264\/AVC compressed video streams with limited computational overhead are exploited to estimate the trajectories of the detected pedestrians. In order to simulate suspicious events, six different scenarios were carried out and filmed using a thermal camera. 
The obtained results show the effectiveness and the low computational requirements of the proposed framework which make it suitable for real-time applications and onboard implementation.},\ndoi = {10.1007\/978-3-030-26756-8_10},\nproject = {SafeShore},\nurl = {https:\/\/www.springerprofessional.de\/en\/pedestrian-detection-and-trajectory-estimation-in-the-compressed\/16976092},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, R. Haelterman, Z. Chtourou, G. De Cubber, and R. Attia, &#8220;Pedestrian Tracking in the Compressed Domain Using Thermal Images,\" in <span style=\"font-style: italic\">Representations, Analysis and Recognition of Shape and Motion from Imaging Data, Communications in Computer and Information Science<\/span>, Springer International Publishing, 2019, vol. 842, p. 35\u201344.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_132\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_132\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/app.dimensions.ai\/details\/publication\/pub.1113953804\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1007\/978-3-030-19816-9_3' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_132_block\">\n<p>The video surveillance of sensitive facilities or borders poses many challenges like the high bandwidth requirements and the high computational cost. In this paper, we propose a framework for detecting and tracking pedestrians in the compressed domain using thermal images. Firstly, the detection process uses a conjunction between saliency maps and contrast enhancement techniques followed by a global image content descriptor based on Discrete Chebychev Moments (DCM) and a linear Support Vector Machine (SVM) as a classifier. 
Secondly, the tracking process exploits raw H.264 compressed video streams with limited computational overhead. In addition to two well-known public datasets, we have generated our own dataset by carrying out six different scenarios of suspicious events using a thermal camera. The obtained results show the effectiveness and the low computational requirements of the proposed framework which make it suitable for real-time applications and onboard implementation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_132_block\">\n<pre><code class=\"tex bibtex\">@InCollection{lahouli2019pedestriantracking,\nauthor = {Lahouli, Ichraf and Haelterman, Rob and Chtourou, Zied and De Cubber, Geert and Attia, Rabah},\nbooktitle = {Representations, Analysis and Recognition of Shape and Motion from Imaging Data, Communications in Computer and Information Science},\npublisher = {Springer International Publishing},\ntitle = {Pedestrian Tracking in the Compressed Domain Using Thermal Images},\nyear = {2019},\npages = {35--44},\nvolume = {842},\nabstract = {The video surveillance of sensitive facilities or borders poses many challenges like the high bandwidth requirements and the high computational cost. In this paper, we propose a framework for detecting and tracking pedestrians in the compressed domain using thermal images. Firstly, the detection process uses a conjunction between saliency maps and contrast enhancement techniques followed by a global image content descriptor based on Discrete Chebychev Moments (DCM) and a linear Support Vector Machine (SVM) as a classifier. Secondly, the tracking process exploits raw H.264 compressed video streams with limited computational overhead. In addition to two well-known public datasets, we have generated our own dataset by carrying out six different scenarios of suspicious events using a thermal camera. 
The obtained results show the effectiveness and the low computational requirements of the proposed framework which make it suitable for real-time applications and onboard implementation.},\ndoi = {10.1007\/978-3-030-19816-9_3},\nproject = {SafeShore},\nurl = {https:\/\/app.dimensions.ai\/details\/publication\/pub.1113953804},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and R. Haelterman, &#8220;Optimized distributed scheduling for a fleet of heterogeneous unmanned maritime systems,\" in <span style=\"font-style: italic\">2019 IEEE International Symposium on Measurement and Control in Robotics (ISMCR)<\/span>, Houston, USA,  2019.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_133\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_133\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2019\/ICMCR-DeCubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ismcr47492.2019.8955727' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_133_block\">\n<p>Due to the increase in embedded computing power, modern robotic systems are capable of running a wide range of perception and control algorithms simultaneously. This raises the question where to optimally allocate each robotic cognition process. In this paper, we present a concept for a novel load distribution approach. 
The proposed methodology adopts a decentralised approach towards the allocation of perception and control processes to different agents (unmanned vessels, fog or cloud services) based on an estimation of the communication parameters (bandwidth, latency, cost), the agent capabilities in terms of processing hardware (not only focusing on the CPU, but also taking into consideration the GPU, disk &#038; memory speed and size) and the requirements in terms of timely delivery of quality output data. The presented approach is extensively validated in a simulation environment and shows promising properties.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_133_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2019optimized,\nauthor = {De Cubber, Geert and Haelterman, Rob},\nbooktitle = {2019 {IEEE} International Symposium on Measurement and Control in Robotics ({ISMCR})},\ntitle = {Optimized distributed scheduling for a fleet of heterogeneous unmanned maritime systems},\nyear = {2019},\nmonth = sep,\nnumber = {23},\npublisher = {{IEEE}},\naddress = {Houston, USA},\nabstract = {Due to the increase in embedded computing power, modern robotic systems are capable of running a wide range of perception and control algorithms simultaneously. This raises the question where to optimally allocate each robotic cognition process. In this paper, we present a concept for a novel load distribution approach. The proposed methodology adopts a decentralised approach towards the allocation of perception and control processes to different agents (unmanned vessels, fog or cloud services) based on an estimation of the communication parameters (bandwidth, latency, cost), the agent capabilities in terms of processing hardware (not only focusing on the CPU, but also taking into consideration the GPU, disk & memory speed and size) and the requirements in terms of timely delivery of quality output data. 
The presented approach is extensively validated in a simulation environment and shows promising properties.},\ndoi = {10.1109\/ismcr47492.2019.8955727},\nproject = {MarSur},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2019\/ICMCR-DeCubber.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, J. Velagic, G. De Cubber, and B. Siciliano, &#8220;Semi-Automated 3D Registration for Heterogeneous Unmanned Robots Based on Scale Invariant Method,\" in <span style=\"font-style: italic\">2019 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)<\/span>, Wurzburg, Germany,  2019.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_134\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_134\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/8848951\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2019.8848951' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_134_block\">\n<p>This paper addresses the problem of 3D registration of outdoor environments combining heterogeneous datasets acquired from unmanned aerial (UAV) and ground (UGV) vehicles. In order to solve this problem, we introduced a novel Scale Invariant Registration Method (SIRM) for semi-automated registration of 3D point clouds. The method is capable of coping with an arbitrary scale difference between the point clouds, without any information about their initial position and orientation. Furthermore, the SIRM does not require having a good initial overlap between two heterogeneous datasets. 
Our method strikes an elegant balance between the existing fully automated 3D registration systems (which often fail in the case of heterogeneous datasets and harsh outdoor environments) and fully manual registration approaches (which are labour-intensive). The experimental validation of the proposed 3D heterogeneous registration system was performed on large-scale datasets representing unstructured and harsh outdoor environments, demonstrating the potential and benefits of the proposed 3D registration system in real-world environments.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_134_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2019semi,\nauthor = {Balta, Haris and Velagic, Jasmin and De Cubber, Geert and Siciliano, Bruno},\nbooktitle = {2019 {IEEE} International Symposium on Safety, Security, and Rescue Robotics ({SSRR})},\ntitle = {Semi-Automated {3D} Registration for Heterogeneous Unmanned Robots Based on Scale Invariant Method},\nyear = {2019},\nmonth = sep,\npublisher = {{IEEE}},\nvolume = {1},\naddress = {Wurzburg, Germany},\nabstract = {This paper addresses the problem of 3D registration of outdoor environments combining heterogeneous datasets acquired from unmanned aerial (UAV) and ground (UGV) vehicles. In order to solve this problem, we introduced a novel Scale Invariant Registration Method (SIRM) for semi-automated registration of 3D point clouds. The method is capable of coping with an arbitrary scale difference between the point clouds, without any information about their initial position and orientation. Furthermore, the SIRM does not require having a good initial overlap between two heterogeneous datasets. Our method strikes an elegant balance between the existing fully automated 3D registration systems (which often fail in the case of heterogeneous datasets and harsh outdoor environments) and fully manual registration approaches (which are labour-intensive). 
The experimental validation of the proposed 3D heterogeneous registration system was performed on large-scale datasets representing unstructured and harsh outdoor environments, demonstrating the potential and benefits of the proposed 3D registration system in real-world environments.},\ndoi = {10.1109\/ssrr.2019.8848951},\nproject = {NRTP},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/8848951},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and G. De Cubber, &#8220;Using a qualitative and quantitative validation methodology to evaluate a drone detection system,\" <span style=\"font-style: italic\">ACTA IMEKO<\/span>, vol. 8, iss. 4, p. 20\u201327, 2019.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_135\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_135\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08%20%282019%29-04-05\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.21014\/acta_imeko.v8i4.682' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_135_block\">\n<p>Now that the use of drones is becoming more common, the need to regulate the access to airspace for these systems is becoming more pressing. A necessary tool in order to do this is a means of detecting drones. Numerous parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation that requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can make a huge difference to the performance of the systems. 
In order to provide a fair evaluation, it is therefore paramount that a validation procedure that finds a compromise between the requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want statistically relevant tests) is followed. Therefore, we propose in this article a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_135_block\">\n<pre><code class=\"tex bibtex\">@Article{doroftei2019using,\nauthor = {Doroftei, Daniela and De Cubber, Geert},\njournal = {{ACTA} {IMEKO}},\ntitle = {Using a qualitative and quantitative validation methodology to evaluate a drone detection system},\nyear = {2019},\nmonth = dec,\nnumber = {4},\npages = {20--27},\nvolume = {8},\nabstract = {Now that the use of drones is becoming more common, the need to regulate the access to airspace for these systems is becoming more pressing. A necessary tool in order to do this is a means of detecting drones. Numerous parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation that requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. 
In order to provide a fair evaluation, it is therefore paramount that a validation procedure that finds a compromise between the requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want statistically relevant tests) is followed. Therefore, we propose in this article a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).},\ndoi = {10.21014\/acta_imeko.v8i4.682},\npdf = {https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08%20%282019%29-04-05\/pdf},\nproject = {SafeShore},\npublisher = {{IMEKO} International Measurement Confederation},\nurl = {https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08%20%282019%29-04-05\/pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    N. Nauwynck, H. Balta, G. De Cubber, and H. Sahli, &#8220;A proof of concept of the in-flight launch of unmanned aerial vehicles in a search and rescue scenario,\" <span style=\"font-style: italic\">ACTA IMEKO<\/span>, vol. 8, iss. 4, p. 13\u201319, 2019.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_136\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_136\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08 (2019)-04-04\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.21014\/acta_imeko.v8i4.681' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_136_block\">\n<p>This article considers the development of a system to enable the in-flight-launch of one aerial system by another. The article discusses how an optimal release mechanism was developed taking into account the aerodynamics of one specific mothership and child Unmanned Aerial Vehicle (UAV). Furthermore, it discusses the PID-based control concept that was introduced in order to autonomously stabilise the child UAV after being released from the mothership UAV. Finally, the article demonstrates how the concept of a mothership and child UAV combination could be taken advantage of in the context of a search and rescue operation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_136_block\">\n<pre><code class=\"tex bibtex\">@Article{nauwynck2019proof,\nauthor = {Nauwynck, Niels and Balta, Haris and De Cubber, Geert and Sahli, Hichem},\njournal = {{ACTA} {IMEKO}},\ntitle = {A proof of concept of the in-flight launch of unmanned aerial vehicles in a search and rescue scenario},\nyear = {2019},\nmonth = dec,\nnumber = {4},\npages = {13--19},\nvolume = {8},\nabstract = {This article considers the development of a system to enable the in-flight-launch of one aerial system by another. The article discusses how an optimal release mechanism was developed taking into account the aerodynamics of one specific mothership and child Unmanned Aerial Vehicle (UAV). 
Furthermore, it discusses the PID-based control concept that was introduced in order to autonomously stabilise the child UAV after being released from the mothership UAV. Finally, the article demonstrates how the concept of a mothership and child UAV combination could be taken advantage of in the context of a search and rescue operation.},\ndoi = {10.21014\/acta_imeko.v8i4.681},\npublisher = {{IMEKO} International Measurement Confederation},\nproject = {ICARUS, NRTP},\nurl = {https:\/\/acta.imeko.org\/index.php\/acta-imeko\/article\/view\/IMEKO-ACTA-08 (2019)-04-04},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and H. De Smet, &#8220;Evaluating Human Factors for Drone Operations using Simulations and Standardized Tests,\" in <span style=\"font-style: italic\">10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019)<\/span>, Washington DC, USA,  2019.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_148\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_148\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2019\/Poster_Alphonse_Print.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.3742199' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_148_block\">\n<p>This poster publication presents an overview of the Alphonse project on the development of new training curricula to reduce the number of drone incidents due to human error.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_148_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2019alphonse,\nauthor = {Doroftei, Daniela and De Smet, Han},\nbooktitle = {10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019)},\ntitle = {Evaluating 
Human Factors for Drone Operations using Simulations and Standardized Tests},\nyear = {2019},\nmonth = jul,\norganization = {AHFE},\npublisher = {Springer},\naddress = {Washington DC, USA},\nabstract = {This poster publication presents an overview of the Alphonse project on the development of new training curricula to reduce the number of drone incidents due to human error.},\ndoi = {10.5281\/zenodo.3742199},\nproject = {Alphonse},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2019\/Poster_Alphonse_Print.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Coluccia, A. Fascista, A. Schumann, L. Sommer, M. Ghenescu, T. Piatrik, G. De Cubber, M. Nalamati, A. Kapoor, M. Saqib, N. Sharma, M. Blumenstein, V. Magoulianitis, D. Ataloglou, A. Dimou, D. Zarpalas, P. Daras, C. Craye, S. Ardjoune, D. De la Iglesia, M. M\u00e1ndez, R. Dosil, and I. Gonz\u00e1lez, &#8220;Drone-vs-Bird Detection Challenge at IEEE AVSS2019,\" in <span style=\"font-style: italic\">2019 16th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS)<\/span>,  2019, pp. 1-7.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_154\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_154\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/8909876\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/AVSS.2019.8909876' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_154_block\">\n<p>This paper presents the second edition of the \u201cdrone-vs-bird\u201d detection challenge, launched within the activities of the 16-th IEEE International Conference on Advanced Video and Signal-based Surveillance (AVSS). 
The challenge&#8217;s goal is to detect one or more drones appearing at some point in video sequences where birds may be also present, together with motion in background or foreground. Submitted algorithms should raise an alarm and provide a position estimate only when a drone is present, while not issuing alarms on birds, nor being confused by the rest of the scene. This paper reports on the challenge results on the 2019 dataset, which extends the first edition dataset provided by the SafeShore project with additional footage under different conditions.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_154_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{8909876,\nauthor={A. {Coluccia} and A. {Fascista} and A. {Schumann} and L. {Sommer} and M. {Ghenescu} and T. {Piatrik} and G. {De Cubber} and M. {Nalamati} and A. {Kapoor} and M. {Saqib} and N. {Sharma} and M. {Blumenstein} and V. {Magoulianitis} and D. {Ataloglou} and A. {Dimou} and D. {Zarpalas} and P. {Daras} and C. {Craye} and S. {Ardjoune} and D. {De la Iglesia} and M. {M\u00e1ndez} and R. {Dosil} and I. {Gonz\u00e1lez}},\nbooktitle={2019 16th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS)},\ntitle={Drone-vs-Bird Detection Challenge at IEEE AVSS2019},\nyear={2019},\nvolume={},\nnumber={},\npages={1-7},\nproject = {SafeShore,MarSur},\ndoi = {10.1109\/AVSS.2019.8909876},\nabstract = {This paper presents the second edition of the \u201cdrone-vs-bird\u201d detection challenge, launched within the activities of the 16-th IEEE International Conference on Advanced Video and Signal-based Surveillance (AVSS). The challenge's goal is to detect one or more drones appearing at some point in video sequences where birds may be also present, together with motion in background or foreground. 
Submitted algorithms should raise an alarm and provide a position estimate only when a drone is present, while not issuing alarms on birds, nor being confused by the rest of the scene. This paper reports on the challenge results on the 2019 dataset, which extends the first edition dataset provided by the SafeShore project with additional footage under different conditions.},\nurl = {https:\/\/ieeexplore.ieee.org\/abstract\/document\/8909876},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2018<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    Y. Baudoin, D. Doroftei, G. de Cubber, J. Habumuremyi, H. Balta, and I. Doroftei, &#8220;Unmanned Ground and Aerial Robots Supporting Mine Action Activities,\" <span style=\"font-style: italic\">Journal of Physics: Conference Series<\/span>, vol. 1065, iss. 17, p. 172009, 2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_108\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_108\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/iopscience.iop.org\/article\/10.1088\/1742-6596\/1065\/17\/172009\/pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1088\/1742-6596\/1065\/17\/172009' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_108_block\">\n<p>During humanitarian demining actions, teleoperation of sensors or multi\u2010sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. 
This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_108_block\">\n<pre><code class=\"tex bibtex\">@Article{baudoin2018unmanned,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and de Cubber, Geert and Habumuremyi, Jean-Claude and Balta, Haris and Doroftei, Ioan},\ntitle = {Unmanned Ground and Aerial Robots Supporting Mine Action Activities},\nyear = {2018},\nmonth = aug,\nnumber = {17},\norganization = {IOP Publishing},\npages = {172009},\npublisher = {{IOP} Publishing},\nvolume = {1065},\nabstract = {During humanitarian demining actions, teleoperation of sensors or multi\u2010sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.},\ndoi = {10.1088\/1742-6596\/1065\/17\/172009},\njournal = {Journal of Physics: Conference Series},\nproject = {TIRAMISU},\nurl = {https:\/\/iopscience.iop.org\/article\/10.1088\/1742-6596\/1065\/17\/172009\/pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, R. Haelterman, Z. Chtourou, G. De Cubber, and R. 
Attia, &#8220;Pedestrian Detection and Tracking in Thermal Images from Aerial MPEG videos,\" in <span style=\"font-style: italic\">International Conference on Computer Vision Theory and Applications<\/span>, Funchal, Portugal,  2018, p. 487\u2013495.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_114\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_114\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.scitepress.org\/Papers\/2018\/67237\/67237.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5220\/0006723704870495' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_114_block\">\n<p>Video surveillance for security and intelligence purposes has been a precious tool as long as the technology has been available but is computationally heavy. In this paper, we present a fast and efficient framework for pedestrian detection and tracking using thermal images. It is designed for automatic surveillance applications in an outdoor environment like preventing border intrusions or attacks on sensitive facilities using image and video processing techniques implemented on-board Unmanned Aerial Vehicles (UAV)s. The proposed framework exploits raw H.264 compressed video streams with limited computational overhead. Our work is driven by the fact that Motion Vectors (MV) are an integral part of any video compression technique, by day and night capabilities of thermal sensors and the distinguished thermal signature of humans. Six different scenarios were carried out and filmed using a thermal camera in order to simulate suspicious events. 
The obtained results show the effectiveness of the proposed framework and its low computational requirements which make it adequate for on-board processing and real-time applications.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_114_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{lahouli2018pedestrian,\nauthor = {Lahouli, Ichraf and Haelterman, Robby and Chtourou, Zied and De Cubber, Geert and Attia, Rabah},\nbooktitle = {International Conference on Computer Vision Theory and Applications},\ntitle = {Pedestrian Detection and Tracking in Thermal Images from Aerial {MPEG} videos},\nyear = {2018},\norganization = {DOI 10.5220\/0006723704870495},\npages = {487--495},\npublisher = {{SCITEPRESS} - Science and Technology Publications},\nvolume = {1},\nabstract = {Video surveillance for security and intelligence purposes has been a precious tool as long as the technology has been available but is computationally heavy. In this paper, we present a fast and efficient framework for pedestrian detection and tracking using thermal images. It is designed for automatic surveillance applications in an outdoor environment like preventing border intrusions or attacks on sensitive facilities using image and video processing techniques implemented on-board Unmanned Aerial Vehicles (UAV)s. The proposed framework exploits raw H.264 compressed video streams with limited computational overhead. Our work is driven by the fact that Motion Vectors (MV) are an integral part of any video compression technique, by day and night capabilities of thermal sensors and the distinguished thermal signature of humans. Six different scenarios were carried out and filmed using a thermal camera in order to simulate suspicious events. 
The obtained results show the effectiveness of the proposed framework and its low computational requirements which make it adequate for on-board processing and real-time applications.},\ndoi = {10.5220\/0006723704870495},\nproject = {SafeShore},\naddress = {Funchal, Portugal},\nurl = {https:\/\/www.scitepress.org\/Papers\/2018\/67237\/67237.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, R. Haelterman, J. Degroote, M. Shimoni, G. De Cubber, and R. Attia, &#8220;Accelerating existing non-blind image deblurring techniques through a strap-on limited-memory switched Broyden method,\" <span style=\"font-style: italic\">IEICE TRANSACTIONS on Information and Systems<\/span>, vol. 1, iss. 1, p. 8, 2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_116\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_116\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.jstage.jst.go.jp\/article\/transinf\/E101.D\/5\/E101.D_2017MVP0022\/_pdf\/-char\/en\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1587\/transinf.2017mvp0022' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_116_block\">\n<p>Video surveillance from airborne platforms can suffer from many sources of blur, like vibration, low-end optics, uneven lighting conditions, etc. Many different algorithms have been developed in the past that aim to recover the deblurred image but often incur substantial CPU-time, which is not always available on-board. 
This paper shows how a strap-on quasi-Newton method can accelerate the convergence of existing iterative methods with little extra overhead while keeping the performance of the original algorithm, thus paving the way for (near) real-time applications using on-board processing.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_116_block\">\n<pre><code class=\"tex bibtex\">@Article{lahouli2018accelerating,\nauthor = {Lahouli, Ichraf and Haelterman, Robby and Degroote, Joris and Shimoni, Michal and De Cubber, Geert and Attia, Rabah},\njournal = {IEICE TRANSACTIONS on Information and Systems},\ntitle = {Accelerating existing non-blind image deblurring techniques through a strap-on limited-memory switched {Broyden} method},\nyear = {2018},\nnumber = {1},\npages = {8},\nvolume = {1},\nabstract = {Video surveillance from airborne platforms can suffer from many sources of blur, like vibration, low-end optics, uneven lighting conditions, etc. Many different algorithms have been developed in the past that aim to recover the deblurred image but often incur substantial CPU-time, which is not always available on-board. This paper shows how a strap-on quasi-Newton method can accelerate the convergence of existing iterative methods with little extra overhead while keeping the performance of the original algorithm, thus paving the way for (near) real-time applications using on-board processing.},\ndoi = {10.1587\/transinf.2017mvp0022},\nfile = {:lahouli2018accelerating - Accelerating Existing Non Blind Image Deblurring Techniques through a Strap on Limited Memory Switched Broyden Method.PDF:PDF},\npublisher = {The Institute of Electronics, Information and Communication Engineers},\nproject = {SafeShore},\nurl = {https:\/\/www.jstage.jst.go.jp\/article\/transinf\/E101.D\/5\/E101.D_2017MVP0022\/_pdf\/-char\/en},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, E. Karakasis, R. Haelterman, Z. Chtourou, G. De Cubber, A. 
Gasteratos, and R. Attia, &#8220;Hot spot method for pedestrian detection using saliency maps, discrete Chebyshev moments and support vector machine,\" <span style=\"font-style: italic\">IET Image Processing<\/span>, vol. 12, iss. 7, p. 1284\u20131291, 2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_118\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_118\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/8387035\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1049\/iet-ipr.2017.0221' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_118_block\">\n<p>The increasing risks of border intrusions or attacks on sensitive facilities and the growing availability of surveillance cameras lead to extensive research efforts for robust detection of pedestrians using images. However, the surveillance of borders or sensitive facilities poses many challenges including the need to set up many cameras to cover the whole area of interest, the high bandwidth requirements for data streaming and the high-processing requirements. Driven by day and night capabilities of the thermal sensors and the distinguished thermal signature of humans, the authors propose a novel and robust method for the detection of pedestrians using thermal images. The method is composed of three steps: a detection which is based on a saliency map in conjunction with a contrast-enhancement technique, a shape description based on discrete Chebyshev moments and a classification step using a support vector machine classifier. The performance of the method is tested using two different thermal datasets and is compared with the conventional maximally stable extremal regions detector. 
The obtained results prove the robustness and the superiority of the proposed framework in terms of true and false positives rates and computational costs which make it suitable for low-performance processing platforms and real-time applications.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_118_block\">\n<pre><code class=\"tex bibtex\">@Article{lahouli2018hot,\nauthor = {Lahouli, Ichraf and Karakasis, Evangelos and Haelterman, Robby and Chtourou, Zied and De Cubber, Geert and Gasteratos, Antonios and Attia, Rabah},\njournal = {IET Image Processing},\ntitle = {Hot spot method for pedestrian detection using saliency maps, discrete {Chebyshev} moments and support vector machine},\nyear = {2018},\nnumber = {7},\npages = {1284--1291},\nvolume = {12},\nabstract = {The increasing risks of border intrusions or attacks on sensitive facilities and the growing availability of surveillance cameras lead to extensive research efforts for robust detection of pedestrians using images. However, the surveillance of borders or sensitive facilities poses many challenges including the need to set up many cameras to cover the whole area of interest, the high bandwidth requirements for data streaming and the high-processing requirements. Driven by day and night capabilities of the thermal sensors and the distinguished thermal signature of humans, the authors propose a novel and robust method for the detection of pedestrians using thermal images. The method is composed of three steps: a detection which is based on a saliency map in conjunction with a contrast-enhancement technique, a shape description based on discrete Chebyshev moments and a classification step using a support vector machine classifier. The performance of the method is tested using two different thermal datasets and is compared with the conventional maximally stable extremal regions detector. 
The obtained results prove the robustness and the superiority of the proposed framework in terms of true and false positives rates and computational costs which make it suitable for low-performance processing platforms and real-time applications.},\ndoi = {10.1049\/iet-ipr.2017.0221},\npublisher = {IET Digital Library},\nproject = {SafeShore},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/8387035},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, R. Haelterman, G. De Cubber, Z. Chtourou, and R. Attia, &#8220;A fast and robust approach for human detection in thermal imagery for surveillance using UAVs,\" in <span style=\"font-style: italic\">15th Multi-Conference on Systems, Signals and Devices<\/span>, Hammamet, Tunisia,  2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_120\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_120\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/8570637\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssd.2018.8570637' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_120_block\">\n<p>The use of Unmanned Aerial Vehicles (UAV)s has spread in various fields such as surveillance and search and rescue purposes. This leads to many research efforts that are focusing on the detection of people using aerial images. However, these platforms have limited resources of power and bandwidth which cause many restrictions and challenges. The use of the thermal sensors offers the possibility to work day and night and the detection of the human bodies because of its distinguished thermal signature. In this paper, we propose a fast and efficient method for the detection of humans in outdoor scenes using thermal images taken from aerial platforms. 
We start by extracting the bright blobs based on a conjunction between a saliency map and a contrast enhancement technique. Then, we use the Discrete Chebyshev Moments as a shape descriptor and finally, we classify the blobs into humans and non-humans. The proposed framework is first tested using a well-known thermal database that covers a wide range of lighting and weather conditions, and then compared to another well-known blob extractor, the Maximally Stable Extremal Regions detector (MSER). The results highlight the effectiveness and even the superiority of the proposed method in terms of true positives, false alarms and processing time.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_120_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{lahouli2018fast,\nauthor = {Lahouli, Ichraf and Haelterman, Robby and De Cubber, Geert and Chtourou, Zied and Attia, Rabah},\nbooktitle = {15th Multi-Conference on Systems, Signals and Devices},\ntitle = {A fast and robust approach for human detection in thermal imagery for surveillance using {UAVs}},\nyear = {2018},\nvolume = {1},\nabstract = {The use of Unmanned Aerial Vehicles (UAVs) has spread in various fields such as surveillance and search and rescue purposes. This leads to many research efforts that are focusing on the detection of people using aerial images. However, these platforms have limited resources of power and bandwidth which cause many restrictions and challenges. The use of thermal sensors offers the possibility to work day and night and to detect human bodies because of their distinguished thermal signature. In this paper, we propose a fast and efficient method for the detection of humans in outdoor scenes using thermal images taken from aerial platforms. We start by extracting the bright blobs based on a conjunction between a saliency map and a contrast enhancement technique. 
Then, we use the Discrete Chebyshev Moments as a shape descriptor and finally, we classify the blobs into humans and non-humans. The proposed framework is first tested using a well-known thermal database that covers a wide range of lighting and weather conditions, and then compared to another well-known blob extractor, the Maximally Stable Extremal Regions detector (MSER). The results highlight the effectiveness and even the superiority of the proposed method in terms of true positives, false alarms and processing time.},\ndoi = {10.1109\/ssd.2018.8570637},\nfile = {:lahouli2018fast - A Fast and Robust Approach for Human Detection in Thermal Imagery for Surveillance Using UAVs.PDF:PDF},\nproject = {SafeShore},\naddress = {Hammamet, Tunisia},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/8570637},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    N. Nauwynck, H. Balta, G. De Cubber, and H. Sahli, &#8220;In-flight launch of unmanned aerial vehicles,\" in <span style=\"font-style: italic\">International Symposium on Measurement and Control in Robotics ISMCR2018<\/span>, Mons, Belgium,  2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_121\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_121\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2018\/Paper_Niels.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.1462605' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_121_block\">\n<p>This paper considers the development of a system to enable the in-flight launch of one aerial system by another. The paper will discuss how an optimal release mechanism was developed, taking into account the aerodynamics of one specific mother and child UAV. 
Furthermore, it will discuss the PID-based control concept that was introduced in order to autonomously stabilize the child UAV after being released from the mothership UAV. Finally, the paper will show how the concept of a mothership UAV + child UAV combination could be usefully taken advantage of in the context of a search and rescue operation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_121_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{nauwynck2018flight,\nauthor = {Nauwynck, Niels and Balta, Haris and De Cubber, Geert and Sahli, Hichem},\nbooktitle = {International Symposium on Measurement and Control in Robotics ISMCR2018},\ntitle = {In-flight launch of unmanned aerial vehicles},\nyear = {2018},\nvolume = {1},\nabstract = {This paper considers the development of a system to enable the in-flight launch of one aerial system by another. The paper will discuss how an optimal release mechanism was developed, taking into account the aerodynamics of one specific mother and child UAV. Furthermore, it will discuss the PID-based control concept that was introduced in order to autonomously stabilize the child UAV after being released from the mothership UAV. Finally, the paper will show how the concept of a mothership UAV + child UAV combination could be usefully taken advantage of in the context of a search and rescue operation.},\ndoi = {10.5281\/zenodo.1462605},\nfile = {:nauwynck2018flight - In Flight Launch of Unmanned Aerial Vehicles.PDF:PDF},\nkeywords = {Unmanned Aerial Vehicles, Control, Autonomous stabilization, Search and Rescue drones, Heterogeneous systems},\nproject = {NRTP},\naddress = {Mons, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2018\/Paper_Niels.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and G. 
De Cubber, &#8220;Qualitative and quantitative validation of drone detection systems,\" in <span style=\"font-style: italic\">International Symposium on Measurement and Control in Robotics ISMCR2018<\/span>, Mons, Belgium,  2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_122\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_122\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2018\/Paper_Daniela.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/ZENODO.1462586' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_122_block\">\n<p>As drones are more and more entering our world, so comes the need to regulate the access to airspace for these systems. A necessary tool in order to do this is a means of detecting these drones. Numerous commercial and non-commercial parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation, which requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation and an honest comparison between systems, it is therefore paramount that a stringent validation procedure is followed. Moreover, the validation methodology needs to find a compromise between the often contrasting requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want tests to be performed that are statistically relevant). 
Therefore, we propose in this paper a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_122_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2018qualitative,\nauthor = {Doroftei, Daniela and De Cubber, Geert},\nbooktitle = {International Symposium on Measurement and Control in Robotics ISMCR2018},\ntitle = {Qualitative and quantitative validation of drone detection systems},\nyear = {2018},\nvolume = {1},\nabstract = {As drones are more and more entering our world, so comes the need to regulate the access to airspace for these systems. A necessary tool in order to do this is a means of detecting these drones. Numerous commercial and non-commercial parties have started the development of such drone detection systems. A big problem with these systems is that the evaluation of the performance of drone detection systems is a difficult operation, which requires the careful consideration of all technical and non-technical aspects of the system under test. Indeed, weather conditions and small variations in the appearance of the targets can have a huge difference on the performance of the systems. In order to provide a fair evaluation and an honest comparison between systems, it is therefore paramount that a stringent validation procedure is followed. Moreover, the validation methodology needs to find a compromise between the often contrasting requirements of end users (who want tests to be performed in operational conditions) and platform developers (who want tests to be performed that are statistically relevant). 
Therefore, we propose in this paper a qualitative and quantitative validation methodology for drone detection systems. The proposed validation methodology seeks to find this compromise between operationally relevant benchmarking (by providing qualitative benchmarking under varying environmental conditions) and statistically relevant evaluation (by providing quantitative score sheets under strictly described conditions).},\ndoi = {10.5281\/ZENODO.1462586},\nfile = {:doroftei2018qualitative - Qualitative and Quantitative Validation of Drone Detection Systems.PDF:PDF},\nkeywords = {Unmanned Aerial Vehicles, Drones, Detection systems, Drone detection, Test and evaluation methods},\nproject = {SafeShore},\naddress = {Mons, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2018\/Paper_Daniela.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, J. Velagic, G. De Cubber, W. Bosschaerts, and B. Siciliano, &#8220;Fast Statistical Outlier Removal Based Method for Large 3D Point Clouds of Outdoor Environments,\" in <span style=\"font-style: italic\">12th IFAC SYMPOSIUM ON ROBOT CONTROL &#8211; SYROCO 2018<\/span>, Budapest, Hungary,  2018, p. 348\u2013353.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_123\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_123\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2405896318332725\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1016\/j.ifacol.2018.11.566' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_123_block\">\n<p>This paper proposes a very effective method for data handling and preparation of the input 3D scans acquired from laser scanner mounted on the Unmanned Ground Vehicle (UGV). 
The main objectives are to improve and speed up the process of outliers removal for large-scale outdoor environments. This process is necessary in order to filter out the noise and to downsample the input data which will spare computational and memory resources for further processing steps, such as 3D mapping of rough terrain and unstructured environments. It includes the Voxel-subsampling and Fast Cluster Statistical Outlier Removal (FCSOR) subprocesses. The introduced FCSOR represents an extension on the Statistical Outliers Removal (SOR) method which is effective for both homogeneous and heterogeneous point clouds. This method is evaluated on real data obtained in outdoor environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_123_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2018fast01,\nauthor = {Balta, Haris and Velagic, Jasmin and De Cubber, Geert and Bosschaerts, Walter and Siciliano, Bruno},\nbooktitle = {12th IFAC SYMPOSIUM ON ROBOT CONTROL - SYROCO 2018},\ntitle = {Fast Statistical Outlier Removal Based Method for Large {3D} Point Clouds of Outdoor Environments},\nyear = {2018},\nnumber = {22},\npages = {348--353},\npublisher = {Elsevier {BV}},\nvolume = {51},\nabstract = {This paper proposes a very effective method for data handling and preparation of the input 3D scans acquired from laser scanner mounted on the Unmanned Ground Vehicle (UGV). The main objectives are to improve and speed up the process of outliers removal for large-scale outdoor environments. This process is necessary in order to filter out the noise and to downsample the input data which will spare computational and memory resources for further processing steps, such as 3D mapping of rough terrain and unstructured environments. It includes the Voxel-subsampling and Fast Cluster Statistical Outlier Removal (FCSOR) subprocesses. 
The introduced FCSOR represents an extension on the Statistical Outliers Removal (SOR) method which is effective for both homogeneous and heterogeneous point clouds. This method is evaluated on real data obtained in outdoor environment.},\ndoi = {10.1016\/j.ifacol.2018.11.566},\nfile = {:balta2018fast - Fast Statistical Outlier Removal Based Method for Large 3D Point Clouds of Outdoor Environments.PDF:PDF},\njournal = {{IFAC}-{PapersOnLine}},\nproject = {NRTP},\naddress = {Budapest, Hungary},\nurl = {https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2405896318332725},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, J. Velagic, G. De Cubber, W. Bosschaerts, and B. Siciliano, &#8220;Fast Iterative 3D Mapping for Large-Scale Outdoor Environments with Local Minima Escape Mechanism,\" in <span style=\"font-style: italic\">12th IFAC SYMPOSIUM ON ROBOT CONTROL &#8211; SYROCO 2018<\/span>, Budapest, Hungary,  2018, p. 298\u2013305.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_124\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_124\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2405896318332646\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1016\/j.ifacol.2018.11.558' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_124_block\">\n<p>This paper introduces a novel iterative 3D mapping framework for large scale natural terrain and complex environments. The framework is based on an Iterative-Closest-Point (ICP) algorithm and an iterative error minimization mechanism, allowing robust 3D map registration. 
This was accomplished by performing pairwise scan registrations without any prior known pose estimation information and taking into account the measurement uncertainties due to the 6D coordinates (translation and rotation) deviations in the acquired scans. Since the ICP algorithm does not guarantee to escape from local minima during the mapping, new algorithms for the local minima estimation and local minima escape process were proposed. The proposed framework is validated using large scale field test data sets. The experimental results were compared with those of standard, generalized and non-linear ICP registration methods and the performance evaluation is presented, showing improved performance of the proposed 3D mapping framework.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_124_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2018fast02,\nauthor = {Balta, Haris and Velagic, Jasmin and De Cubber, Geert and Bosschaerts, Walter and Siciliano, Bruno},\nbooktitle = {12th IFAC SYMPOSIUM ON ROBOT CONTROL - SYROCO 2018},\ntitle = {Fast Iterative {3D} Mapping for Large-Scale Outdoor Environments with Local Minima Escape Mechanism},\nyear = {2018},\nnumber = {22},\npages = {298--305},\npublisher = {Elsevier {BV}},\nvolume = {51},\nabstract = {This paper introduces a novel iterative 3D mapping framework for large scale natural terrain and complex environments. The framework is based on an Iterative-Closest-Point (ICP) algorithm and an iterative error minimization mechanism, allowing robust 3D map registration. This was accomplished by performing pairwise scan registrations without any prior known pose estimation information and taking into account the measurement uncertainties due to the 6D coordinates (translation and rotation) deviations in the acquired scans. 
Since the ICP algorithm does not guarantee to escape from local minima during the mapping, new algorithms for the local minima estimation and local minima escape process were proposed. The proposed framework is validated using large scale field test data sets. The experimental results were compared with those of standard, generalized and non-linear ICP registration methods and the performance evaluation is presented, showing improved performance of the proposed 3D mapping framework.},\ndoi = {10.1016\/j.ifacol.2018.11.558},\njournal = {{IFAC}-{PapersOnLine}},\naddress = {Budapest, Hungary},\nproject = {NRTP},\nurl = {https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2405896318332646},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, &#8220;Legal Issues in Search and Rescue UAV operations,\" in <span style=\"font-style: italic\">IROS2018 forum on Legal Issues, Cybersecurity and Policymakers Implication in AI Robotics<\/span>, Madrid, Spain,  2018.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_125\" class=\"papercite_toggle\">[BibTeX]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_125_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2018legal,\nauthor = {De Cubber, Geert},\nbooktitle = {IROS2018 forum on Legal Issues, Cybersecurity and Policymakers Implication in AI Robotics},\ntitle = {Legal Issues in Search and Rescue {UAV} operations},\nyear = {2018},\naddress = {Madrid, Spain},\nproject = {ICARUS},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2017<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. S. L\u00f3pez, G. Moreno, J. Cordero, J. Sanchez, S. Govindaraj, M. M. Marques, V. Lobo, S. Fioravanti, A. Grati, K. Rudin, M. Tosa, A. Matos, A. Dias, A. Martins, J. Bedkowski, H. Balta, and G. 
De Cubber, &#8220;Interoperability in a Heterogeneous Team of Search and Rescue Robots,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_103\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_103\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/interoperability-in-a-heterogeneous-team-of-search-and-rescue-robots\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69493' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_103_block\">\n<p>Search and rescue missions are complex operations. A disaster scenario is generally unstructured, time\u2010varying and unpredictable. This poses several challenges for the successful deployment of unmanned technology. The variety of operational scenarios and tasks lead to the need for multiple robots of different types, domains and sizes. A priori planning of the optimal set of assets to be deployed and the definition of their mission objectives are generally not feasible as information only becomes available during mission. The ICARUS project responds to this challenge by developing a heterogeneous team composed by different and complementary robots, dynamically cooperating as an interoperable team. This chapter describes our approach to multi\u2010robot interoperability, understood as the ability of multiple robots to operate together, in synergy, enabling multiple teams to share data, intelligence and resources, which is the ultimate objective of ICARUS project. 
It also includes the analysis of the relevant standardization initiatives in multi\u2010robot multi\u2010domain systems, our implementation of an interoperability framework and several examples of multi\u2010robot cooperation of the ICARUS robots in realistic search and rescue missions.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_103_block\">\n<pre><code class=\"tex bibtex\">@InBook{lopez2017interoperability,\nauthor = {Daniel Serrano L{\\'{o}}pez and German Moreno and Jose Cordero and Jose Sanchez and Shashank Govindaraj and Mario Monteiro Marques and Victor Lobo and Stefano Fioravanti and Alberto Grati and Konrad Rudin and Massimo Tosa and Anibal Matos and Andre Dias and Alfredo Martins and Janusz Bedkowski and Haris Balta and De Cubber, Geert},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 6},\npublisher = {{InTech}},\ntitle = {Interoperability in a Heterogeneous Team of Search and Rescue Robots},\nyear = {2017},\nmonth = aug,\nabstract = {Search and rescue missions are complex operations. A disaster scenario is generally unstructured, time\u2010varying and unpredictable. This poses several challenges for the successful deployment of unmanned technology. The variety of operational scenarios and tasks lead to the need for multiple robots of different types, domains and sizes. A priori planning of the optimal set of assets to be deployed and the definition of their mission objectives are generally not feasible as information only becomes available during mission. The ICARUS project responds to this challenge by developing a heterogeneous team composed by different and complementary robots, dynamically cooperating as an interoperable team. This chapter describes our approach to multi\u2010robot interoperability, understood as the ability of multiple robots to operate together, in synergy, enabling multiple teams to share data, intelligence and resources, which is the ultimate objective of ICARUS project. 
It also includes the analysis of the relevant standardization initiatives in multi\u2010robot multi\u2010domain systems, our implementation of an interoperability framework and several examples of multi\u2010robot cooperation of the ICARUS robots in realistic search and rescue missions.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69493},\nproject = {ICARUS},\nunit= {meca-ras},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/interoperability-in-a-heterogeneous-team-of-search-and-rescue-robots},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, H. Balta, A. Matos, E. Silva, D. Serrano, S. Govindaraj, R. Roda, V. Lobo, M. Marques, and R. Wagemans, &#8220;Operational Validation of Search and Rescue Robots,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_104\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_104\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/operational-validation-of-search-and-rescue-robots\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69497' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_104_block\">\n<p>This chapter describes how the different ICARUS unmanned search and rescue tools have been evaluated and validated using operational benchmarking techniques. Two large\u2010scale simulated disaster scenarios were organized: a simulated shipwreck and an earthquake response scenario. 
Next to these simulated response scenarios, where ICARUS tools were deployed in tight interaction with real end users, ICARUS tools also participated to a real relief, embedded in a team of end users for a flood response mission. These validation trials allow us to conclude that the ICARUS tools fulfil the user requirements and goals set up at the beginning of the project.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_104_block\">\n<pre><code class=\"tex bibtex\">@InBook{de2017operational,\nauthor = {De Cubber, Geert and Daniela Doroftei and Haris Balta and Anibal Matos and Eduardo Silva and Daniel Serrano and Shashank Govindaraj and Rui Roda and Victor Lobo and M{\\'{a}}rio Marques and Rene Wagemans},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 10},\npublisher = {{InTech}},\ntitle = {Operational Validation of Search and Rescue Robots},\nyear = {2017},\nmonth = aug,\nabstract = {This chapter describes how the different ICARUS unmanned search and rescue tools have been evaluated and validated using operational benchmarking techniques. Two large\u2010scale simulated disaster scenarios were organized: a simulated shipwreck and an earthquake response scenario. Next to these simulated response scenarios, where ICARUS tools were deployed in tight interaction with real end users, ICARUS tools also participated to a real relief, embedded in a team of end users for a flood response mission. 
These validation trials allow us to conclude that the ICARUS tools fulfil the user requirements and goals set up at the beginning of the project.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69497},\njournal = {Search and Rescue Robotics: From Theory to Practice},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/operational-validation-of-search-and-rescue-robots},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Berns, A. Nezhadfard, M. Tosa, H. Balta, and G. De Cubber, &#8220;Unmanned Ground Robots for Rescue Tasks,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_105\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_105\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/unmanned-ground-robots-for-rescue-tasks\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69491' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_105_block\">\n<p>This chapter describes two unmanned ground vehicles that can help search and rescue teams in their difficult, but life-saving tasks. These robotic assets have been developed within the framework of the European project ICARUS. The large unmanned ground vehicle is intended to be a mobile base station. It is equipped with a powerful manipulator arm and can be used for debris removal, shoring operations, and remote structural operations (cutting, welding, hammering, etc.) on very rough terrain. 
The smaller unmanned ground vehicle is also equipped with an array of sensors, enabling it to search for victims inside semi-destroyed buildings. Working together with each other and the human search and rescue workers, these robotic assets form a powerful team, increasing the effectiveness of search and rescue operations, as proven by operational validation tests in collaboration with end users.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_105_block\">\n<pre><code class=\"tex bibtex\">@InBook{berns2017unmanned,\nauthor = {Karsten Berns and Atabak Nezhadfard and Massimo Tosa and Haris Balta and De Cubber, Geert},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 4},\npublisher = {{InTech}},\ntitle = {Unmanned Ground Robots for Rescue Tasks},\nyear = {2017},\nmonth = aug,\nabstract = {This chapter describes two unmanned ground vehicles that can help search and rescue teams in their difficult, but life-saving tasks. These robotic assets have been developed within the framework of the European project ICARUS. The large unmanned ground vehicle is intended to be a mobile base station. It is equipped with a powerful manipulator arm and can be used for debris removal, shoring operations, and remote structural operations (cutting, welding, hammering, etc.) on very rough terrain. The smaller unmanned ground vehicle is also equipped with an array of sensors, enabling it to search for victims inside semi-destroyed buildings. 
Working together with each other and the human search and rescue workers, these robotic assets form a powerful team, increasing the effectiveness of search and rescue operations, as proven by operational validation tests in collaboration with end users.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69491},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/unmanned-ground-robots-for-rescue-tasks},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, R. Wagemans, A. Matos, E. Silva, V. Lobo, K. C. Guerreiro Cardoso, S. Govindaraj, J. Gancet, and D. Serrano, &#8220;User-centered design,\" , G. De Cubber and D. Doroftei, Eds., InTech, 2017, p. 19\u201336.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_106\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_106\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/user-centered-design\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69483' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_106_block\">\n<p>The successful introduction and acceptance of novel technological tools are only possible if end users are completely integrated in the design process. However, obtaining such integration of end users is not obvious, as end\u2010user organizations often do not consider research toward new technological aids as their core business and are therefore reluctant to engage in these kinds of activities. 
This chapter explains how this problem was tackled in the ICARUS project, by carefully identifying and approaching the targeted user communities and by compiling user requirements. Resulting from these user requirements, system requirements and a system architecture for the ICARUS system were deduced. An important aspect of the user\u2010centered design approach is that it is an iterative methodology, based on multiple intermediate operational validations by end users of the developed tools, leading to a final validation according to user\u2010scripted validation scenarios.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_106_block\">\n<pre><code class=\"tex bibtex\">@InBook{doroftei2017user,\nauthor = {Doroftei, Daniela and De Cubber, Geert and Wagemans, Rene and Matos, Anibal and Silva, Eduardo and Lobo, Victor and Guerreiro Cardoso, Keshav Chintamani and Govindaraj, Shashank and Gancet, Jeremi and Serrano, Daniel},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 2},\npages = {19--36},\npublisher = {{InTech}},\ntitle = {User-centered design},\nyear = {2017},\nabstract = {The successful introduction and acceptance of novel technological tools are only possible if end users are completely integrated in the design process. However, obtaining such integration of end users is not obvious, as end\u2010user organizations often do not consider research toward new technological aids as their core business and are therefore reluctant to engage in these kinds of activities. This chapter explains how this problem was tackled in the ICARUS project, by carefully identifying and approaching the targeted user communities and by compiling user requirements. Resulting from these user requirements, system requirements and a system architecture for the ICARUS system were deduced. 
An important aspect of the user\u2010centered design approach is that it is an iterative methodology, based on multiple intermediate operational validations by end users of the developed tools, leading to a final validation according to user\u2010scripted validation scenarios.},\ndoi = {10.5772\/intechopen.69483},\njournal = {Search and rescue robotics. From theory to practice. IntechOpen, London},\nproject = {ICARUS},\nunit= {meca-ras},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/user-centered-design},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. D. Cubber, D. Doroftei, K. Rudin, K. Berns, A. Matos, D. Serrano, J. Sanchez, S. Govindaraj, J. Bedkowski, R. Roda, E. Silva, and S. Ourevitch, &#8220;Introduction to the use of robotic tools for search and rescue,\" in <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_107\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_107\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/introduction-to-the-use-of-robotic-tools-for-search-and-rescue\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.69489' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_107_block\">\n<p>Modern search and rescue workers are equipped with a powerful toolkit to address natural and man-made disasters. This introductory chapter explains how a new tool can be added to this toolkit: robots. 
The use of robotic assets in search and rescue operations is explained and an overview is given of the worldwide efforts to incorporate robotic tools in search and rescue operations. Furthermore, the European Union ICARUS project on this subject is introduced. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, such that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_107_block\">\n<pre><code class=\"tex bibtex\">@InBook{cubber2017introduction,\nauthor = {Geert De Cubber and Daniela Doroftei and Konrad Rudin and Karsten Berns and Anibal Matos and Daniel Serrano and Jose Sanchez and Shashank Govindaraj and Janusz Bedkowski and Rui Roda and Eduardo Silva and Stephane Ourevitch},\neditor = {De Cubber, Geert and Doroftei, Daniela},\nchapter = {Chapter 1},\npublisher = {{InTech}},\ntitle = {Introduction to the use of robotic tools for search and rescue},\nyear = {2017},\nmonth = aug,\nabstract = {Modern search and rescue workers are equipped with a powerful toolkit to address natural and man-made disasters. This introductory chapter explains how a new tool can be added to this toolkit: robots. The use of robotic assets in search and rescue operations is explained and an overview is given of the worldwide efforts to incorporate robotic tools in search and rescue operations. 
Furthermore, the European Union ICARUS project on this subject is introduced. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, such that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\nbooktitle = {Search and Rescue Robotics - From Theory to Practice},\ndoi = {10.5772\/intechopen.69489},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\/introduction-to-the-use-of-robotic-tools-for-search-and-rescue},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, R. Shalom, A. Coluccia, O. Borcan, R. Chamr\u00e1d, T. Radulescu, E. Izquierdo, and Z. Gagov, &#8220;The SafeShore system for the detection of threat agents in a maritime border environment,\" in <span style=\"font-style: italic\">IARP Workshop on Risky Interventions and Environmental Surveillance<\/span>, Les Bon Villers, Belgium,  2017.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_109\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_109\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2017\/SafeShore Abstract RISE-2017_.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.1115552' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_109_block\">\n<p>This paper discusses the goals of the H2020-SafeShore project, whose main goal is to cover existing gaps in coastal border surveillance, increasing internal security by preventing cross-border crime such as trafficking in human beings and the smuggling of drugs. It is designed to be integrated with existing systems and create a continuous detection line along the border.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_109_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2017safeshore,\nauthor = {De Cubber, Geert and Shalom, Ron and Coluccia, Angelo and Borcan, Octavia and Chamr{á}d, Richard and Radulescu, Tudor and Izquierdo, Ebroul and Gagov, Zhelyazko},\nbooktitle = {IARP Workshop on Risky Interventions and Environmental Surveillance},\ntitle = {The {SafeShore} system for the detection of threat agents in a maritime border environment},\nyear = {2017},\norganization = {IARP},\nabstract = {This paper discusses the goals of the H2020-SafeShore project, whose main goal is to cover existing gaps in coastal border surveillance, increasing internal security by preventing cross-border crime such as trafficking in human beings and the smuggling of drugs. 
It is designed to be integrated with existing systems and create a continuous detection line along the border.},\ndoi = {10.5281\/zenodo.1115552},\nkeywords = {SafeShore, Counter UAV, Counter RPAS},\nlanguage = {en},\nproject = {Safeshore},\naddress = {Les Bon Villers, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2017\/SafeShore Abstract RISE-2017_.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    M. Buric and G. De Cubber, &#8220;Counter Remotely Piloted Aircraft Systems,\" <span style=\"font-style: italic\">MTA Review<\/span>, vol. 27, iss. 1, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_110\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_110\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2017\/Counter Remotely Piloted Aircraft Systems.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5281\/zenodo.1115502' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_110_block\">\n<p>An effective Counter Remotely Piloted Aircraft System is a major objective of many researchers and industry entities. Their activity is strongly impelled by the operational requirements of the Law Enforcement Authorities and naturally follows both the course of the latest terrorist events and technological developments. The designing process of an effective Counter Remotely Piloted Aircraft System needs to benefit from a systemic approach, starting from the legal aspects, and ending with the technical ones. 
From a technical point of view, the system has to work according to the five-phase \u201ckill chain\u201d model, starting with the detection phase, going on with the classification, prioritization, tracking and neutralization of the targets, and ending with the forensic phase.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_110_block\">\n<pre><code class=\"tex bibtex\">@Article{buric2017counter,\nauthor = {Buric, Marian and De Cubber, Geert},\njournal = {MTA Review},\ntitle = {Counter Remotely Piloted Aircraft Systems},\nyear = {2017},\nnumber = {1},\nvolume = {27},\nabstract = {An effective Counter Remotely Piloted Aircraft System is a major objective of many researchers and industry entities. Their activity is strongly impelled by the operational requirements of the Law Enforcement Authorities and naturally follows both the course of the latest terrorist events and technological developments. The designing process of an effective Counter Remotely Piloted Aircraft System needs to benefit from a systemic approach, starting from the legal aspects, and ending with the technical ones. From a technical point of view, the system has to work according to the five-phase \u201ckill chain\u201d model, starting with the detection phase, going on with the classification, prioritization, tracking and neutralization of the targets, and ending with the forensic phase.},\ndoi = {10.5281\/zenodo.1115502},\nkeywords = {Counter Remotely Piloted Aircraft Systems, drone, drone detection tracking and neutralization, RPAS, SafeShore},\nlanguage = {en},\npublisher = {Military Technical Academy Publishing House},\nproject = {SafeShore},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2017\/Counter Remotely Piloted Aircraft Systems.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Coluccia, M. Ghenescu, T. Piatrik, G. De Cubber, A. Schumann, L. Sommer, J. Klatte, T. Schuchert, J. Beyerer, M. Farhadi, R. Amandi, C. Aker, S. Kalkan, M. Saqib, N. Sharma, S. Daud, K. 
Makkah, and M. Blumenstein, &#8220;Drone-vs-Bird detection challenge at IEEE AVSS2017,\" in <span style=\"font-style: italic\">2017 14th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS)<\/span>, Lecce, Italy,  2017, p. 1\u20136.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_111\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_111\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2017\/WOSDETCpaper (1).pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/avss.2017.8078464' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_111_block\">\n<p>Small drones are a rising threat due to their possible misuse for illegal activities, in particular smuggling and terrorism. The project SafeShore, funded by the European Commission under the Horizon 2020 program, has launched the drone-vs-bird detection challenge to address one of the many technical issues arising in this context. The goal is to detect a drone appearing at some point in a video where birds may be also present: the algorithm should raise an alarm and provide a position estimate only when a drone is present, while not issuing alarms on birds. 
This paper reports on the challenge proposal, evaluation, and results<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_111_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{coluccia2017drone,\nauthor = {Angelo Coluccia and Marian Ghenescu and Tomas Piatrik and Geert De Cubber and Arne Schumann and Lars Sommer and Johannes Klatte and Tobias Schuchert and Juergen Beyerer and Mohammad Farhadi and Ruhallah Amandi and Cemal Aker and Sinan Kalkan and Muhammad Saqib and Nabin Sharma and Sultan Daud and Khan Makkah and Michael Blumenstein},\nbooktitle = {2017 14th {IEEE} International Conference on Advanced Video and Signal Based Surveillance ({AVSS})},\ntitle = {Drone-vs-Bird detection challenge at {IEEE} {AVSS}2017},\nyear = {2017},\nmonth = aug,\norganization = {IEEE},\npages = {1--6},\npublisher = {{IEEE}},\nabstract = {Small drones are a rising threat due to their possible misuse for illegal activities, in particular smuggling and terrorism. The project SafeShore, funded by the European Commission under the Horizon 2020 program, has launched the drone-vs-bird detection challenge to address one of the many technical issues arising in this context. The goal is to detect a drone appearing at some point in a video where birds may be also present: the algorithm should raise an alarm and provide a position estimate only when a drone is present, while not issuing alarms on birds. This paper reports on the challenge proposal, evaluation, and results},\ndoi = {10.1109\/avss.2017.8078464},\nproject = {SafeShore},\naddress = {Lecce, Italy},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2017\/WOSDETCpaper (1).pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    I. Lahouli, R. Haelterman, Z. Chtourou, G. De Cubber, and R. 
Attia, &#8220;Pedestrian Tracking in the Compressed Domain Using Thermal Images,\" in <span style=\"font-style: italic\">VIIth International Workshop on Representation, analysis and recognition of shape and motion from Image data<\/span>, Savoie, France,  2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_113\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_113\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2017\/RFMI2017_LAHOULI.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1007\/978-3-030-19816-9_3' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_113_block\">\n<p>The video surveillance of sensitive facilities or borders poses many challenges, such as high bandwidth requirements and high computational cost. In this paper, we propose a framework for detecting and tracking pedestrians in the compressed domain using thermal images. Firstly, the detection process uses a conjunction between saliency maps and contrast enhancement techniques, followed by a global image content descriptor based on Discrete Chebychev Moments (DCM) and a linear Support Vector Machine (SVM) as a classifier. Secondly, the tracking process exploits raw H.264 compressed video streams with limited computational overhead. In addition to two well-known public datasets, we have generated our own dataset by carrying out six different scenarios of suspicious events using a thermal camera. 
The obtained results show the effectiveness and the low computational requirements of the proposed framework, which make it suitable for real-time applications and on-board implementation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_113_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{lahouli2017pedestrian,\nauthor = {Lahouli, Ichraf and Haelterman, Robby and Chtourou, Zied and De Cubber, Geert and Attia, Rabah},\nbooktitle = {VIIth International Workshop on Representation, analysis and recognition of shape and motion from Image data},\ntitle = {Pedestrian Tracking in the Compressed Domain Using Thermal Images},\nyear = {2017},\nnumber = {1},\nvolume = {1},\nabstract = {The video surveillance of sensitive facilities or borders poses many challenges, such as high bandwidth requirements and high computational cost. In this paper, we propose a framework for detecting and tracking pedestrians in the compressed domain using thermal images. Firstly, the detection process uses a conjunction between saliency maps and contrast enhancement techniques, followed by a global image content descriptor based on Discrete Chebychev Moments (DCM) and a linear Support Vector Machine (SVM) as a classifier. Secondly, the tracking process exploits raw H.264 compressed video streams with limited computational overhead. In addition to two well-known public datasets, we have generated our own dataset by carrying out six different scenarios of suspicious events using a thermal camera. The obtained results show the effectiveness and the low computational requirements of the proposed framework, which make it suitable for real-time applications and on-board implementation.},\ndoi = {10.1007\/978-3-030-19816-9_3},\nproject = {SafeShore},\naddress = {Savoie, France},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2017\/RFMI2017_LAHOULI.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, K. Rudin, K. Berns, A. Matos, D. 
Serrano, J. M. Sanchez, S. Govindaraj, J. Bedkowski, R. Roda, E. Silva, S. Ourevitch, R. Wagemans, V. Lobo, G. Cardoso, K. Chintamani, J. Gancet, P. Stupler, A. Nezhadfard, M. Tosa, H. Balta, J. Almeida, A. Martins, H. Ferreira, B. Ferreira, J. Alves, A. Dias, S. Fioravanti, D. Bertin, G. Moreno, J. Cordero, M. M. Marques, A. Grati, H. M. Chaudhary, B. Sheers, Y. Riobo, P. Letier, M. N. Jimenez, M. A. Esbri, P. Musialik, I. Badiola, R. Goncalves, A. Coelho, T. Pfister, K. Majek, M. Pelka, A. Maslowski, and R. Baptista, <span style=\"font-style: italic\">Search and Rescue Robotics &#8211; From Theory to Practice<\/span>, G. De Cubber and D. Doroftei, Eds., InTech, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_115\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_115\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/intechopen.68449' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_115_block\">\n<p>In the event of large crises (earthquakes, typhoons, floods, &#8230;), a primordial task of the fire and rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which &#8211; too often &#8211; leads to loss of lives among the human crisis managers themselves. This book explains how unmanned search can be added to the toolkit of the search and rescue workers, offering a valuable tool to save human lives and to speed up the search and rescue process. 
The introduction of robotic tools in the world of search and rescue is not straightforward, due to the fact that the search and rescue context is extremely technology-unfriendly, meaning that very robust solutions, which can be deployed extremely quickly, are required. Multiple research projects across the world are tackling this problem and in this book, a special focus is placed on showcasing the results of the European Union ICARUS project on this subject. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, so that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them in order to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_115_block\">\n<pre><code class=\"tex bibtex\">@Book{de2017search,\nauthor = {Geert De Cubber and Daniela Doroftei and Konrad Rudin and Karsten Berns and Anibal Matos and Daniel Serrano and Jose Manuel Sanchez and Shashank Govindaraj and Janusz Bedkowski and Rui Roda and Eduardo Silva and Stephane Ourevitch and Rene Wagemans and Victor Lobo and Guerreiro Cardoso and Keshav Chintamani and Jeremi Gancet and Pascal Stupler and Atabak Nezhadfard and Massimo Tosa and Haris Balta and Jose Almeida and Alfredo Martins and Hugo Ferreira and Bruno Ferreira and Jose Alves and Andre Dias and Stefano Fioravanti and Daniele Bertin and German Moreno and Jose Cordero and Mario Monteiro Marques and Alberto Grati and Hafeez M. 
Chaudhary and Bart Sheers and Yudani Riobo and Pierre Letier and Mario Nunez Jimenez and Miguel Angel Esbri and Pawel Musialik and Irune Badiola and Ricardo Goncalves and Antonio Coelho and Thomas Pfister and Karol Majek and Michal Pelka and Andrzej Maslowski and Ricardo Baptista},\neditor = {De Cubber, Geert and Doroftei, Daniela},\npublisher = {{InTech}},\ntitle = {Search and Rescue Robotics - From Theory to Practice},\nyear = {2017},\nmonth = aug,\nabstract = {In the event of large crises (earthquakes, typhoons, floods, ...), a primordial task of the fire and rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which - too often - leads to loss of lives among the human crisis managers themselves. This book explains how unmanned search can be added to the toolkit of the search and rescue workers, offering a valuable tool to save human lives and to speed up the search and rescue process. The introduction of robotic tools in the world of search and rescue is not straightforward, due to the fact that the search and rescue context is extremely technology-unfriendly, meaning that very robust solutions, which can be deployed extremely quickly, are required. Multiple research projects across the world are tackling this problem and in this book, a special focus is placed on showcasing the results of the European Union ICARUS project on this subject. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers, so that more work can be done in a shorter amount of time. The ICARUS tools consist of assistive unmanned air, ground, and sea vehicles, equipped with victim-detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. 
To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the command and control equipment of the human crisis managers and a set of training and support tools is provided to them in order to learn to use the ICARUS system.},\ndoi = {10.5772\/intechopen.68449},\nproject = {ICARUS},\nurl = {https:\/\/www.intechopen.com\/books\/search-and-rescue-robotics-from-theory-to-practice},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, J. Habumuremyi, H. Balta, and I. Doroftei, &#8220;Unmanned Ground and Aerial Robots Supporting Mine Action Activities,\" in <span style=\"font-style: italic\">Mine Action &#8211; The Research Experience of the Royal Military Academy of Belgium<\/span>, C. Beumier, D. Closson, V. Lacroix, N. Milisavljevic, and Y. Yvinec, Eds., InTech, 2017, vol. 1.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_119\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_119\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.intechopen.com\/books\/mine-action-the-research-experience-of-the-royal-military-academy-of-belgium\/unmanned-ground-and-aerial-robots-supporting-mine-action-activities\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/65783' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_119_block\">\n<p>During humanitarian\u2010demining actions, teleoperation of sensors or multi\u2010sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. 
This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_119_block\">\n<pre><code class=\"tex bibtex\">@InBook{baudoin2017unmanned,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Habumuremyi, Jean-Claude and Balta, Haris and Doroftei, Ioan},\neditor = {Beumier, Charles and Closson, Damien and Lacroix, Vincianne and Milisavljevic, Nada and Yvinec, Yann},\nchapter = {Chapter 9},\npublisher = {{InTech}},\ntitle = {Unmanned Ground and Aerial Robots Supporting Mine Action Activities},\nyear = {2017},\nmonth = aug,\nvolume = {1},\nabstract = {During humanitarian\u2010demining actions, teleoperation of sensors or multi\u2010sensor heads can enhance the detection process by allowing more precise scanning, which is useful for the optimization of the signal processing algorithms. This chapter summarizes the technologies and experiences developed during 16 years through national and\/or European\u2010funded projects, illustrated by some contributions of our own laboratory, located at the Royal Military Academy of Brussels, focusing on the detection of unexploded devices and the implementation of mobile robotics systems on minefields.},\nbooktitle = {Mine Action - The Research Experience of the Royal Military Academy of Belgium},\ndoi = {10.5772\/65783},\nproject = {TIRAMISU},\nurl = {https:\/\/www.intechopen.com\/books\/mine-action-the-research-experience-of-the-royal-military-academy-of-belgium\/unmanned-ground-and-aerial-robots-supporting-mine-action-activities},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Lapandic, J. Velagic, and H. 
Balta, &#8220;Framework for automated reconstruction of 3D model from multiple 2D aerial images,\" in <span style=\"font-style: italic\">2017 International Symposium ELMAR<\/span>, Zadar, Croatia,  2017, pp. 173-176.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_150\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_150\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/8124461\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.23919\/ELMAR.2017.8124461' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_150_block\">\n<p>The paper considers a problem of 3D environment model reconstruction from a set of 2D images acquired by the Unmanned Aerial Vehicle (UAV) in near real-time. The designed framework combines the FAST (Features from Accelerated Segment Test) algorithm and optical flow approach for detection of interest image points and adjacent images reconstruction. The robust estimation of camera locations is performed using the image points tracking. The coordinates of 3D points and the projection matrix are computed simultaneously using Structure-from-Motion (SfM) algorithm, from which the 3D model of environment is generated. The designed framework is tested using real image data and video sequences captured with camera mounted on the UAV. The effectiveness and quality of the proposed framework are verified through analyses of accuracy of the 3D model reconstruction and its time execution.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_150_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{8124461,\nauthor={D. {Lapandic} and J. {Velagic} and H. 
{Balta}},\nbooktitle={2017 International Symposium ELMAR},\ntitle={Framework for automated reconstruction of 3D model from multiple 2D aerial images},\nyear={2017},\nvolume={},\nnumber={},\npages={173-176},\nabstract={The paper considers a problem of 3D environment model reconstruction from a set of 2D images acquired by the Unmanned Aerial Vehicle (UAV) in near real-time. The designed framework combines the FAST (Features from Accelerated Segment Test) algorithm and optical flow approach for detection of interest image points and adjacent images reconstruction. The robust estimation of camera locations is performed using the image points tracking. The coordinates of 3D points and the projection matrix are computed simultaneously using Structure-from-Motion (SfM) algorithm, from which the 3D model of environment is generated. The designed framework is tested using real image data and video sequences captured with camera mounted on the UAV. The effectiveness and quality of the proposed framework are verified through analyses of accuracy of the 3D model reconstruction and its time execution.},\nkeywords={autonomous aerial vehicles;cameras;feature extraction;image reconstruction;image segmentation;image sensors;image sequences;remotely operated vehicles;video signal processing;automated reconstruction;multiple 2D aerial images;3D environment model reconstruction;UAV;optical flow approach;interest image points;robust estimation;camera locations;image data;3D model reconstruction;unmanned aerial vehicle;adjacent image reconstruction;structure-from-motion algorithm;features from accelerated segment test;Three-dimensional displays;Solid modeling;Image reconstruction;Two dimensional displays;Cameras;Feature extraction;Optical imaging;3D Model reconstruction;Aerial images;Structure from motion;Unmanned aerial vehicle},\ndoi={10.23919\/ELMAR.2017.8124461},\nISSN={},\nproject={NRTP,ICARUS},\naddress = {Zadar, 
Croatia},\npublisher={IEEE},\nurl={https:\/\/ieeexplore.ieee.org\/document\/8124461},\nmonth={Sep.},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, &#8220;Spatial registration of 3D data from aerial and ground-based unmanned robotic systems,\" PhD Thesis, 2017.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_151\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_151\" class=\"papercite_toggle\">[Abstract]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_151_block\">\n<p>Robotic systems are more and more leaving the protected laboratory environment and entering our daily lives. These robotic entities can come in the form of aerial systems (drones), ground robots or unmanned maritime systems. Each of these robots gathers data about its environment for analysis and reasoning purposes. As more and more robotic systems are deployed, the amount of environmental data gathered by these systems also increases tremendously. This gives rise to a new problem: how to coherently combine the environmental information acquired by different robotic systems into one representation that is both accurate and easy to use by human end-users? In this thesis, we introduce novel methodologies to solve this data fusion problem, by proposing a novel framework for combining heterogeneous 3D data models acquired by different robotic systems, operated in unknown large unstructured outdoor environments into a common homogeneous model. The first proposed novelty of the research work is a fast and robust ground-based 3D map reconstruction methodology for large-scale unstructured outdoor environments. It is based on an enhanced Iterative-Closest-Point algorithm and an iterative error minimization structure, as well as a fast and computationally very efficient method for outlier analysis and removal in 3D point clouds. 
The second proposed novelty of the research work is a registration methodology combining heterogeneous data-sets acquired from unmanned aerial and ground vehicles (UAV and UGV). This is accomplished by introducing a semi-automated 3D registration framework. The framework is capable of coping with an arbitrary scale difference between the point clouds, without any information about their initial position and orientation. Furthermore, it does not require a good initial overlap between the two heterogeneous UGV and UAV point clouds. Our framework strikes an elegant balance between the existing fully automated 3D registration systems (which often fail in the case of heterogeneous data-sets and harsh outdoor environments) and fully manual registration approaches (which are labour-intensive). A special and defining aspect of this PhD work was that we not only investigated scientific and technical innovations but also concentrated on bringing these innovations to the terrain in real operational environments in the security context. As an example, we deployed the technological tools developed in the framework of this research work to the field for demining and crisis relief operations in an actual crisis situation. This operational deployment was highly successful, based upon the feedback provided by the end-users.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_151_block\">\n<pre><code class=\"tex bibtex\">@PHDTHESIS {phdbalta,\nauthor = \"Haris Balta\",\ntitle = \"Spatial registration of 3D data from aerial and ground-based unmanned robotic systems\",\nschool = \"Royal Military Academy of Belgium\",\nyear = \"2017\",\nproject={NRTP,ICARUS,TIRAMISU},\nabstract = {Robotic systems are more and more leaving the protected laboratory environment and entering our daily lives. These robotic entities can come in the form of aerial systems (drones), ground robots or unmanned maritime systems. 
Each of these robots gathers data about its environment for analysis and reasoning purposes. As more and more robotic systems are deployed, the amount of environmental data gathered by these systems also increases tremendously. This gives rise to a new problem: how to coherently combine the environmental information acquired by different robotic systems into one representation that is both accurate and easy to use by human end-users? In this thesis, we introduce novel methodologies to solve this data fusion problem, by proposing a novel framework for combining heterogeneous 3D data models acquired by different robotic systems, operated in unknown large unstructured outdoor environments into a common homogeneous model.\nThe first proposed novelty of the research work is a fast and robust ground-based 3D map reconstruction methodology for large-scale unstructured outdoor environments. It is based on an enhanced Iterative-Closest-Point algorithm and an iterative error minimization structure, as well as a fast and computationally very efficient method for outlier analysis and removal in 3D point clouds.\nThe second proposed novelty of the research work is a registration methodology combining heterogeneous data-sets acquired from unmanned aerial and ground vehicles (UAV and UGV). This is accomplished by introducing a semi-automated 3D registration framework. The framework is capable of coping with an arbitrary scale difference between the point clouds, without any information about their initial position and orientation. Furthermore, it does not require a good initial overlap between the two heterogeneous UGV and UAV point clouds. Our framework strikes an elegant balance between the existing fully automated 3D registration systems (which often fail in the case of heterogeneous data-sets and harsh outdoor environments) and fully manual registration approaches (which are labour-intensive).\nA special and defining aspect of this PhD 
work was that we not only investigated scientific and technical innovations but also concentrated on bringing these innovations to the terrain in real operational environments in the security context. As an example, we deployed the technological tools developed in the framework of this research work to the field for demining and crisis relief operations in an actual crisis situation. This operational deployment was highly successful, based upon the feedback provided by the end-users.},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2016<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    M. M. Marques, R. Parreira, V. Lobo, A. Martins, A. Matos, N. Cruz, J. M. Almeida, J. C. Alves, E. Silva, J. Bedkowski, K. Majek, M. Pelka, P. Musialik, H. Ferreira, A. Dias, B. Ferreira, G. Amaral, A. Figueiredo, R. Almeida, F. Silva, D. Serrano, G. Moreno, G. De Cubber, H. Balta, and H. Beglerovic, &#8220;Use of multi-domain robots in search and rescue operations &#8211; Contributions of the ICARUS team to the euRathlon 2015 challenge,\" in <span style=\"font-style: italic\">OCEANS 2016<\/span>, Shanghai, China,  2016, p. 1\u20137.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_98\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2016\/euRathlon2015_paper_final.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/oceansap.2016.7485354' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_98_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{marques2016use,\nauthor = {Mario Monteiro Marques and Rui Parreira and Victor Lobo and Alfredo Martins and Anibal Matos and Nuno Cruz and Jose Miguel Almeida and Jose Carlos Alves and Eduardo Silva and Janusz Bedkowski and Karol Majek and Michal Pelka and Pawel Musialik and Hugo Ferreira and Andre Dias and Bruno Ferreira and Guilherme Amaral and Andre Figueiredo and Rui Almeida and Filipe Silva and Daniel Serrano and German Moreno and De Cubber, Geert and Haris Balta and Halil Beglerovic},\nbooktitle = {{OCEANS} 2016},\ntitle = {Use of multi-domain robots in search and rescue operations --- Contributions of the {ICARUS} team to the {euRathlon} 2015 challenge},\nyear = {2016},\nmonth = apr,\norganization = {IEEE},\npages = {1--7},\npublisher = {{IEEE}},\ndoi = {10.1109\/oceansap.2016.7485354},\nproject = {ICARUS},\nunit= {meca-ras},\naddress = {Shanghai, China},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2016\/euRathlon2015_paper_final.pdf},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, J. Bedkowski, S. Govindaraj, K. Majek, P. Musialik, D. Serrano, K. Alexis, R. Siegwart, and G. De Cubber, &#8220;Integrated Data Management for a Fleet of Search-and-rescue Robots,\" <span style=\"font-style: italic\">Journal of Field Robotics<\/span>, vol. 34, iss. 3, p. 539\u2013582, 2016.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_99\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_99\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/onlinelibrary.wiley.com\/doi\/abs\/10.1002\/rob.21651\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1002\/rob.21651' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_99_block\">\n<p>Search\u2010and\u2010rescue operations have recently been confronted with the introduction of robotic tools that assist the human search\u2010and\u2010rescue workers in their dangerous but life\u2010saving job of searching for human survivors after major catastrophes. However, the world of search and rescue is highly reliant on strict procedures for the transfer of messages, alarms, data, and command and control over the deployed assets. The introduction of robotic tools into this world causes an important structural change in this procedural toolchain. Moreover, the introduction of search\u2010and\u2010rescue robots acting as data gatherers could potentially lead to an information overload toward the human search\u2010and\u2010rescue workers, if the data acquired by these robotic tools are not managed in an intelligent way. With that in mind, we present in this paper an integrated data combination and data management architecture that is able to accommodate real\u2010time data gathered by a fleet of robotic vehicles on a crisis site, and we present and publish these data in a way that is easy to understand by end\u2010users. In the scope of this paper, a fleet of unmanned ground and aerial search\u2010and\u2010rescue vehicles is considered, developed within the scope of the European ICARUS project. 
As a first step toward the integrated data\u2010management methodology, the different robotic systems require an interoperable framework in order to pass data from one to another and toward the unified command and control station. As a second step, a data fusion methodology will be presented, combining the data acquired by the different heterogeneous robotic systems. The computation needed for this process is done in a novel mobile data center and then (as a third step) published in a software as a service (SaaS) model. The SaaS model helps in providing access to robotic data over ubiquitous Ethernet connections. As a final step, we show how the presented data\u2010management architecture allows for reusing recorded exercises with real robots and rescue teams for training purposes and teaching search\u2010and\u2010rescue personnel how to handle the different robotic tools. The system was validated in two experiments. First, in the controlled environment of a military testing base, a fleet of unmanned ground and aerial vehicles was deployed in an earthquake\u2010response scenario. The data gathered by the different interoperable robotic systems were combined by a novel mobile data center and presented to the end\u2010user public. Second, an unmanned aerial system was deployed on an actual mission with an international relief team to help with the relief operations after major flooding in Bosnia in the spring of 2014. Due to the nature of the event (floods), no ground vehicles were deployed here, but all data acquired by the aerial system (mainly three\u2010dimensional maps) were stored in the ICARUS data center, where they were securely published for authorized personnel all over the world. 
This mission (which is, to our knowledge, the first recorded deployment of an unmanned aerial system by an official governmental international search\u2010and\u2010rescue team in another country) also proved the concept of the procedural integration of the ICARUS data management system into the existing procedural toolchain of the search and rescue workers, and this in an international context (deployment from Belgium to Bosnia). The feedback received from the search\u2010and\u2010rescue personnel on both validation exercises was highly positive, proving that the ICARUS data management system can efficiently increase the situational awareness of the search\u2010and\u2010rescue personnel.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_99_block\">\n<pre><code class=\"tex bibtex\">@Article{balta2017integrated,\nauthor = {Haris Balta and Janusz Bedkowski and Shashank Govindaraj and Karol Majek and Pawel Musialik and Daniel Serrano and Kostas Alexis and Roland Siegwart and De Cubber, Geert},\njournal = {Journal of Field Robotics},\ntitle = {Integrated Data Management for a Fleet of Search-and-rescue Robots},\nyear = {2016},\nmonth = jul,\nnumber = {3},\npages = {539--582},\nvolume = {34},\nabstract = {Search\u2010and\u2010rescue operations have recently been confronted with the introduction of robotic tools that assist the human search\u2010and\u2010rescue workers in their dangerous but life\u2010saving job of searching for human survivors after major catastrophes. However, the world of search and rescue is highly reliant on strict procedures for the transfer of messages, alarms, data, and command and control over the deployed assets. The introduction of robotic tools into this world causes an important structural change in this procedural toolchain. 
Moreover, the introduction of search\u2010and\u2010rescue robots acting as data gatherers could potentially lead to an information overload toward the human search\u2010and\u2010rescue workers, if the data acquired by these robotic tools are not managed in an intelligent way. With that in mind, we present in this paper an integrated data combination and data management architecture that is able to accommodate real\u2010time data gathered by a fleet of robotic vehicles on a crisis site, and we present and publish these data in a way that is easy to understand by end\u2010users. In the scope of this paper, a fleet of unmanned ground and aerial search\u2010and\u2010rescue vehicles is considered, developed within the scope of the European ICARUS project. As a first step toward the integrated data\u2010management methodology, the different robotic systems require an interoperable framework in order to pass data from one to another and toward the unified command and control station. As a second step, a data fusion methodology will be presented, combining the data acquired by the different heterogeneous robotic systems. The computation needed for this process is done in a novel mobile data center and then (as a third step) published in a software as a service (SaaS) model. The SaaS model helps in providing access to robotic data over ubiquitous Ethernet connections. As a final step, we show how the presented data\u2010management architecture allows for reusing recorded exercises with real robots and rescue teams for training purposes and teaching search\u2010and\u2010rescue personnel how to handle the different robotic tools. The system was validated in two experiments. First, in the controlled environment of a military testing base, a fleet of unmanned ground and aerial vehicles was deployed in an earthquake\u2010response scenario. 
The data gathered by the different interoperable robotic systems were combined by a novel mobile data center and presented to the end\u2010user public. Second, an unmanned aerial system was deployed on an actual mission with an international relief team to help with the relief operations after major flooding in Bosnia in the spring of 2014. Due to the nature of the event (floods), no ground vehicles were deployed here, but all data acquired by the aerial system (mainly three\u2010dimensional maps) were stored in the ICARUS data center, where they were securely published for authorized personnel all over the world. This mission (which is, to our knowledge, the first recorded deployment of an unmanned aerial system by an official governmental international search\u2010and\u2010rescue team in another country) also proved the concept of the procedural integration of the ICARUS data management system into the existing procedural toolchain of the search and rescue workers, and this in an international context (deployment from Belgium to Bosnia). The feedback received from the search\u2010and\u2010rescue personnel on both validation exercises was highly positive, proving that the ICARUS data management system can efficiently increase the situational awareness of the search\u2010and\u2010rescue personnel.},\ndoi = {10.1002\/rob.21651},\npublisher = {Wiley},\nproject = {ICARUS},\nunit= {meca-ras},\nurl = {https:\/\/onlinelibrary.wiley.com\/doi\/abs\/10.1002\/rob.21651},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2015<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, A. Matos, E. Silva, V. Lobo, R. Wagemans, and G. De Cubber, &#8220;Operational validation of robots for risky environments,\" in <span style=\"font-style: italic\">8th IARP Workshop on Robotics for Risky Environments<\/span>, Lisbon, Portugal,  2015.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_93\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_93\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/Operational validation of robots for risky environments.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_93_block\">\n<p>This paper presents an operational test and validation approach for the evaluation of the performance of a range of marine, aerial and ground search and rescue robots. The proposed approach seeks to find a compromise between the traditional rigorous standardized approaches and the open-ended robot competitions. Operational scenarios are defined, including not only a performance assessment of individual robots but also collective operations in which heterogeneous robots cooperate with each other and with manned teams in search and rescue activities. That way, it is possible to perform a more complete validation of the use of robotic tools in challenging real world scenarios.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_93_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2015operational,\nauthor = {Doroftei, Daniela and Matos, Anibal and Silva, Eduardo and Lobo, Victor and Wagemans, Rene and De Cubber, Geert},\nbooktitle = {8th IARP Workshop on Robotics for Risky Environments},\ntitle = {Operational validation of robots for risky environments},\nyear = {2015},\nabstract = {This paper presents an operational test and validation approach for the evaluation of the performance of a range of marine, aerial and ground search and rescue robots. The proposed approach seeks to find a compromise between the traditional rigorous standardized approaches and the open-ended robot competitions. 
Operational scenarios are defined, including not only a performance assessment of individual robots but also collective operations in which heterogeneous robots cooperate with each other and with manned teams in search and rescue activities. That way, it is possible to perform a more complete validation of the use of robotic tools in challenging real world scenarios.},\nproject = {ICARUS},\naddress = {Lisbon, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/Operational validation of robots for risky environments.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Serrano, P. Chrobocinski, G. De Cubber, D. Moore, G. Leventakis, and S. Govindaraj, &#8220;ICARUS and DARIUS approaches towards interoperability,\" in <span style=\"font-style: italic\">8th IARP Workshop on Robotics for Risky Environments<\/span>, Lisbon, Portugal,  2015.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_94\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_94\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/RISE - 2015 - ICARUS and DARIUS approach towards interoperability - rev1.3.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_94_block\">\n<p>The two FP7 projects ICARUS and DARIUS share a common objective which is to integrate the unmanned platforms in Search and Rescue operations and assess their added value through the development of an integrated system that will be tested in realistic conditions on the field. 
This paper describes the concept of both projects towards an optimized interoperability level in the three dimensions: organizational, procedural and technical interoperability, describing the system components and illustrating the results of the trials already performed.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_94_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{serrano2015icarus,\nauthor = {Serrano, Daniel and Chrobocinski, Philippe and De Cubber, Geert and Moore, Dave and Leventakis, Georgios and Govindaraj, Shashank},\nbooktitle = {8th IARP Workshop on Robotics for Risky Environments},\ntitle = {{ICARUS} and {DARIUS} approaches towards interoperability},\nyear = {2015},\nabstract = {The two FP7 projects ICARUS and DARIUS share a common objective which is to integrate the unmanned platforms in Search and Rescue operations and assess their added value through the development of an integrated system that will be tested in realistic conditions on the field. This paper describes the concept of both projects towards an optimized interoperability level in the three dimensions: organizational, procedural and technical interoperability, describing the system components and illustrating the results of the trials already performed.},\nproject = {ICARUS},\naddress = {Lisbon, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/RISE - 2015 - ICARUS and DARIUS approach towards interoperability - rev1.3.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, G. De Cubber, Y. Baudoin, and D. Doroftei, &#8220;UAS deployment and data processing during the Balkans flooding with the support to Mine Action,\" in <span style=\"font-style: italic\">8th IARP Workshop on Robotics for Risky Environments<\/span>, Lisbon, Portugal,  2015.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_95\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_95\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/RISE_2015_Haris_Balta_RMA.PDF\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_95_block\">\n<p>In this paper, we provide a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. The destructive impact of landslides, sediment torrents and floods on the mine fields and the change of the mine action situation resulted in significant negative environmental and security consequences. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_95_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2015uas,\nauthor = {Balta, Haris and De Cubber, Geert and Baudoin, Yvan and Doroftei, Daniela},\nbooktitle = {8th IARP Workshop on Robotics for Risky Environments},\ntitle = {{UAS} deployment and data processing during the {Balkans} flooding with the support to Mine Action},\nyear = {2015},\nabstract = {In this paper, we provide a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. 
An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. The destructive impact of landslides, sediment torrents and floods on the mine fields and the change of the mine action situation resulted in significant negative environmental and security consequences. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.},\nproject = {ICARUS},\naddress = {Lisbon, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/RISE_2015_Haris_Balta_RMA.PDF},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and H. Balta, &#8220;Terrain Traversability Analysis using full-scale 3D Processing,\" in <span style=\"font-style: italic\">8th IARP Workshop on Robotics for Risky Environments<\/span>, Lisbon, Portugal,  2015.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_96\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_96\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/Terrain Traversability Analysis.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_96_block\">\n<p>Autonomous robotic systems which aspire to navigate through rough unstructured terrain require the capability to reason about the environmental characteristics of their environment. As a first priority, the robotic systems need to assess the degree of traversability of their immediate environment to ensure their mobility while navigating through these rough environments. 
This paper presents a novel terrain-traversability analysis methodology which is based on processing the full 3D model of the terrain, not on a projected or downscaled version of this model. The approach is validated through field tests using a time-of-flight camera.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_96_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2015terrain,\nauthor = {De Cubber, Geert and Balta, Haris},\nbooktitle = {8th IARP Workshop on Robotics for Risky Environments},\ntitle = {Terrain Traversability Analysis using full-scale {3D} Processing},\nyear = {2015},\nabstract = {Autonomous robotic systems which aspire to navigate through rough unstructured terrain require the capability to reason about the environmental characteristics of their environment. As a first priority, the robotic systems need to assess the degree of traversability of their immediate environment to ensure their mobility while navigating through these rough environments. This paper presents a novel terrain-traversability analysis methodology which is based on processing the full 3D model of the terrain, not on a projected or downscaled version of this model. The approach is validated through field tests using a time-of-flight camera.},\nproject = {ICARUS},\naddress = {Lisbon, Portugal},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/Terrain Traversability Analysis.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    O. De Meyst, T. Goethals, H. Balta, G. De Cubber, and R. Haelterman, &#8220;Autonomous guidance for a UAS along a staircase,\" in <span style=\"font-style: italic\">International Symposium on Visual Computing<\/span>, Las Vegas, USA,  2015, p. 466\u2013475.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_97\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_97\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-319-27857-5_42\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1007\/978-3-319-27857-5_42' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_97_block\">\n<p>In the quest for fully autonomous unmanned aerial systems (UAS), multiple challenges are faced. For enabling autonomous UAS navigation in indoor environments, one of the major bottlenecks is the capability to autonomously traverse narrow 3D passages, like staircases. This paper presents a novel integrated system that implements a semi-autonomous navigation system for a quadcopter. The navigation system permits the UAS to detect a staircase using only the images provided by an on-board monocular camera. A 3D model of this staircase is then automatically reconstructed and this model is used to guide the UAS to the top of the detected staircase. For validating the methodology, a proof of concept is created, based on the Parrot AR.Drone 2.0, which is a cheap commercial off-the-shelf quadcopter.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_97_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2015autonomous,\nauthor = {De Meyst, Olivier and Goethals, Thijs and Balta, Haris and De Cubber, Geert and Haelterman, Rob},\nbooktitle = {International Symposium on Visual Computing},\ntitle = {Autonomous guidance for a {UAS} along a staircase},\nyear = {2015},\norganization = {Springer, Cham},\npages = {466--475},\nabstract = {In the quest for fully autonomous unmanned aerial systems (UAS), multiple challenges are faced. 
For enabling autonomous UAS navigation in indoor environments, one of the major bottlenecks is the capability to autonomously traverse narrow 3D passages, like staircases. This paper presents a novel integrated system that implements a semi-autonomous navigation system for a quadcopter. The navigation system permits the UAS to detect a staircase using only the images provided by an on-board monocular camera. A 3D model of this staircase is then automatically reconstructed and this model is used to guide the UAS to the top of the detected staircase. For validating the methodology, a proof of concept is created, based on the Parrot AR.Drone 2.0, which is a cheap commercial off-the-shelf quadcopter.},\ndoi = {10.1007\/978-3-319-27857-5_42},\nproject = {ICARUS},\naddress = {Las Vegas, USA},\nunit= {meca-ras},\nurl = {https:\/\/link.springer.com\/chapter\/10.1007\/978-3-319-27857-5_42},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, &#8220;Search and Rescue Robots,\" <span style=\"font-style: italic\">Belgisch Militair Tijdschrift<\/span>, vol. 10, p. 50\u201360, 2015.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_100\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_100\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/rmb102.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_100_block\">\n<p>This article provides an overview of the work on search and rescue robotics and more specifically the research performed within the ICARUS research project.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_100_block\">\n<pre><code class=\"tex bibtex\">@Article{de2015search,\nauthor = {De Cubber, Geert},\njournal = {Belgisch Militair Tijdschrift},\ntitle = {Search and Rescue Robots},\nyear = {2015},\npages = {50--60},\nvolume = {10},\nabstract = {This article provides an overview of the work on search and rescue robotics and more specifically the research performed within the ICARUS research project.},\npublisher = {Defensie},\nproject = {ICARUS},\nunit= {meca-ras},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2015\/rmb102.pdf},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    E. Avdic, H. Balta, and T. Ivelja, &#8220;UAS deployment and data processing of natural disaster with impact to mine action in B and H, case study: Region Olovo,\" in <span style=\"font-style: italic\">International Symposium Mine Action 2015<\/span>, Biograd, Croatia,  2015, pp. 5-12.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_152\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_152\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2015\/HUDEM_2015_Avdic_Balta_Ivelja_final_ver.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_152_block\">\n<p>In this paper, we present a case study report on how novel robotics technologies like the Unmanned Aerial System (UAS) and data processing methodologies could be used in order to support the traditional mine action procedures and be directly applied onto the terrain while increasing the operational efficiency, supporting mine action workers and minimizing human suffering in case of natural disaster with impact to mine action. Our case study focuses on the region Olovo (Central Bosnia and Herzegovina) in response to massive flooding, landslides and sediment torrents in spring-summer of 2014. Such destructive impact of the natural disaster on the mine action situation resulted in the re-localization of many explosive remnants of war which have been moved due to the flooding and landslides with significant negative environmental and security consequences increasing new potentially suspected hazardous areas. 
What will be elaborated in this paper is the following: problem definition with a statement of needs, data acquisition procedures with UAS, data processing and quality assessment and usability in further mine action procedures.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_152_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{balta2015article,\nauthor={Avdic, Esad and Balta, Haris and Ivelja, Tamara},\nbooktitle={International Symposium Mine Action 2015},\nyear = {2015},\naddress = {Biograd, Croatia},\npages = {5-12},\nkeywords = {Mine Action Support, Unmanned Aerial System, Natural Disaster},\ntitle = {UAS deployment and data processing of natural disaster with impact to mine action in B and H, case study: Region Olovo},\npublisher = {HCR-CTRO d.o.o.},\npublisherplace = {Biograd, Hrvatska},\nproject={TIRAMISU},\nurl={http:\/\/mecatron.rma.ac.be\/pub\/2015\/HUDEM_2015_Avdic_Balta_Ivelja_final_ver.pdf},\nabstract= {In this paper, we present a case study report on how novel robotics technologies like the Unmanned Aerial System (UAS) and data processing methodologies could be used in order to support the traditional mine action procedures and be directly applied onto the terrain while increasing the operational efficiency, supporting mine action workers and minimizing human suffering in case of natural disaster with impact to mine action. Our case study focuses on the region Olovo (Central Bosnia and Herzegovina) in response to massive flooding, landslides and sediment torrents in spring-summer of 2014. Such destructive impact of the natural disaster on the mine action situation resulted in the re-localization of many explosive remnants of war which have been moved due to the flooding and landslides with significant negative environmental and security consequences increasing new potentially suspected hazardous areas. 
What will be elaborated in this paper is the following: problem definition with a statement of needs, data acquisition procedures with UAS, data processing and quality assessment and usability in further mine action procedures.},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2014<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, A. Matos, and G. De Cubber, &#8220;Designing Search and Rescue Robots towards Realistic User Requirements,\" in <span style=\"font-style: italic\">Advanced Concepts on Mechanical Engineering (ACME)<\/span>, Iasi, Romania,  2014.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_86\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_86\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2014\/Designing Search and Rescue robots towards realistic user requirements - full article -v3.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.4028\/www.scientific.net\/amm.658.612' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_86_block\">\n<p>In the event of a large crisis (think about typhoon Haiyan or the Tohoku earthquake and tsunami in Japan), a primordial task of the rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which often leads to loss of lives among the human crisis managers themselves. The introduction of unmanned search and rescue devices can offer a valuable tool to save human lives and to speed up the search and rescue process. In this context, the EU-FP7-ICARUS project [1] concentrates on the development of unmanned search and rescue technologies for detecting, locating and rescuing humans. 
The complex nature and difficult operating conditions of search and rescue operations pose heavy constraints on the mechanical design of the unmanned platforms. In this paper, we discuss the different user requirements which have an impact on the design of the mechanical systems (air, ground and marine robots). We show how these user requirements are obtained, how they are validated, how they lead to design specifications for operational prototypes which are tested in realistic operational conditions, and we show how the final mechanical design specifications are derived from these different steps. An important aspect of all these design steps which is emphasized in this paper is to always keep the end-users in the loop in order to come to realistic requirements and specifications, ensuring the practical deployability [2] of the developed platforms.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_86_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2014designing,\nauthor = {Doroftei, Daniela and Matos, Anibal and De Cubber, Geert},\nbooktitle = {Advanced Concepts on Mechanical Engineering (ACME)},\ntitle = {Designing Search and Rescue Robots towards Realistic User Requirements},\nyear = {2014},\nabstract = {In the event of a large crisis (think about typhoon Haiyan or the Tohoku earthquake and tsunami in Japan), a primordial task of the rescue services is the search for human survivors on the incident site. This is a complex and dangerous task, which often leads to loss of lives among the human crisis managers themselves. The introduction of unmanned search and rescue devices can\noffer a valuable tool to save human lives and to speed up the search and rescue process. In this context, the EU-FP7-ICARUS project [1] concentrates on the development of unmanned search and rescue technologies for detecting, locating and rescuing humans. 
The complex nature and difficult operating conditions of search and rescue operations pose heavy constraints on the mechanical design of the unmanned platforms. In this paper, we discuss the different user requirements which have an impact of the design of the mechanical systems (air, ground and marine robots). We show how these user requirements are obtained, how they are validated, how they lead to design specifications for operational prototypes which are tested in realistic operational conditions and we show how the final mechanical design specifications are derived from these different steps. An important aspect of all these design steps which is emphasized in this paper is to always keep the end-users in the loop in order to come to realistic requirements and specifications, ensuring the practical deployability [2] of the developed platforms.},\ndoi = {10.4028\/www.scientific.net\/amm.658.612},\nproject = {ICARUS},\naddress = {Iasi, Romania},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2014\/Designing Search and Rescue robots towards realistic user requirements - full article -v3.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, H. Balta, and C. Lietart, &#8220;Teodor: A semi-autonomous search and rescue and demining robot,\" in <span style=\"font-style: italic\">Advanced Concepts on Mechanical Engineering (ACME)<\/span>, Iasi, Romania,  2014.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_87\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_87\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2014\/Teodor - A semi-autonomous search and rescue and demining robot - full article.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.4028\/www.scientific.net\/amm.658.599' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_87_block\">\n<p>In this paper, we present a ground robotic system which is developed to deal with rough outdoor conditions. The platform is to be used as an environmental monitoring robot for two main application areas: 1) Humanitarian demining: The vehicle is equipped with a specialized multichannel metal detector array. An unmanned aerial system supports it by locating suspected locations of mines, which can then be confirmed by the ground vehicle. 2) Search and rescue: The vehicle is equipped with human victim detection sensors and a 3D camera enabling it to assess the traversability of the terrain in front of the robot in order to be able to navigate autonomously. This paper discusses both the mechanical design of these platforms and the autonomous perception capabilities on board these vehicles.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_87_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2014teodor,\nauthor = {De Cubber, Geert and Balta, Haris and Lietart, Claude},\nbooktitle = {Advanced Concepts on Mechanical Engineering (ACME)},\ntitle = {Teodor: A semi-autonomous search and rescue and demining robot},\nyear = {2014},\nabstract = {In this paper, we present a ground robotic system which is developed to deal with rough outdoor conditions. 
The platform is to be used as an environmental monitoring robot for two main application areas: 1) Humanitarian demining: The vehicle is equipped with a specialized multichannel metal detector array. An unmanned aerial system supports it by locating suspected locations of mines, which can then be confirmed by the ground vehicle. 2) Search and rescue: The vehicle is equipped with human victim detection sensors and a 3D camera enabling it to assess the traversability of the terrain in front of the robot in order to be able to navigate autonomously. This paper discusses both the mechanical design of these platforms and the autonomous perception\ncapabilities on board these vehicles.},\ndoi = {10.4028\/www.scientific.net\/amm.658.599},\nproject = {ICARUS},\naddress = {Iasi, Romania},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2014\/Teodor - A semi-autonomous search and rescue and demining robot - full article.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, H. Balta, D. Doroftei, and Y. Baudoin, &#8220;UAS deployment and data processing during the Balkans flooding,\" in <span style=\"font-style: italic\">2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)<\/span>, Toyako-cho, Hokkaido, Japan,  2014, p. 1\u20134.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_88\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_88\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2014\/SSRR2014_proj_037.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2014.7017670' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_88_block\">\n<p>This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_88_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2014uas,\nauthor = {De Cubber, Geert and Balta, Haris and Doroftei, Daniela and Baudoin, Yvan},\nbooktitle = {2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)},\ntitle = {{UAS} deployment and data processing during the Balkans flooding},\nyear = {2014},\norganization = {IEEE},\npages = {1--4},\nabstract = {This project paper provides a report on a real relief operation mission, jointly conducted by two European research projects, in response to the massive flooding in the Balkans in spring 2014. 
An Unmanned Aerial System was deployed on-site in collaboration with traditional relief workers, to support them with damage assessment, area mapping, visual inspection and re-localizing the many explosive remnants of war which have been moved due to the flooding and landslides. Novel robotic technologies and data processing methodologies were brought from the research labs and directly applied onto the terrain in order to support the relief workers and minimize human suffering.},\ndoi = {10.1109\/ssrr.2014.7017670},\nproject = {ICARUS},\naddress = {Toyako-cho, Hokkaido, Japan},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2014\/SSRR2014_proj_037.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    M. Pelka, K. Majek, J. Bedkowski, P. Musialik, A. Maslowski, G. de Cubber, H. Balta, A. Coelho, R. Goncalves, R. Baptista, J. M. Sanchez, and S. Govindaraj, &#8220;Training and Support system in the Cloud for improving the situational awareness in Search and Rescue (SAR) operations,\" in <span style=\"font-style: italic\">2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (2014)<\/span>, Toyako-cho, Hokkaido, Japan,  2014, p. 1\u20136.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_90\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_90\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/7017644?arnumber=7017644&#038;sortType=asc_p_Sequence&#038;filter=AND(p_IS_Number:7017643)=\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2014.7017644' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_90_block\">\n<p>In this paper, a Training and Support system for Search and Rescue operations is described. 
The system is a component of the ICARUS project (http:\/\/www.fp7-icarus.eu), whose goal is to develop sensor, robotic and communication technologies for Human Search And Rescue teams. The support system for planning and managing complex SAR operations is implemented as a command and control component that integrates different sources of spatial information, such as maps of the affected area, satellite images and sensor data coming from the unmanned robots, in order to provide a situation snapshot to the rescue team who will make the necessary decisions. Support issues will include planning of frequency resources needed for given areas, prediction of coverage conditions, location of fixed communication relays, etc. The training system is developed for the ICARUS operators controlling UGVs (Unmanned Ground Vehicles), UAVs (Unmanned Aerial Vehicles) and USVs (Unmanned Surface Vehicles) from a unified Remote Control Station (RC2). The Training and Support system is implemented in the SaaS model (Software as a Service). Therefore, its functionality is available over the network. SAR ICARUS teams from different countries can be trained simultaneously on a shared virtual stage. In this paper we will show the multi-robot 3D mapping component (aerial vehicle and ground vehicles). We will demonstrate that these 3D maps can be used for training purposes. 
Finally, we demonstrate the current approach for ICARUS Urban SAR (USAR) and Marine SAR (MSAR) operation training.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_90_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{pelka2014training,\nauthor = {Michal Pelka and Karol Majek and Janusz Bedkowski and Pawel Musialik and Andrzej Maslowski and Geert de Cubber and Haris Balta and Antonio Coelho and Ricardo Goncalves and Ricardo Baptista and Jose Manuel Sanchez and Shashank Govindaraj},\nbooktitle = {2014 {IEEE} International Symposium on Safety, Security, and Rescue Robotics (2014)},\ntitle = {Training and Support system in the Cloud for improving the situational awareness in Search and Rescue ({SAR}) operations},\nyear = {2014},\nmonth = oct,\norganization = {IEEE},\npages = {1--6},\npublisher = {{IEEE}},\nabstract = {In this paper, a Training and Support system for Search and Rescue operations is described. The system is a component of the ICARUS project (http:\/\/www.fp7-icarus.eu), whose goal is to develop sensor, robotic and communication technologies for Human Search And Rescue teams. The support system for planning and managing complex SAR operations is implemented as a command and control component that integrates different sources of spatial information, such as maps of the affected area, satellite images and sensor data coming from the unmanned robots, in order to provide a situation snapshot to the rescue team who will make the necessary decisions. Support issues will include planning of frequency resources needed for given areas, prediction of coverage conditions, location of fixed communication relays, etc. The training system is developed for the ICARUS operators controlling UGVs (Unmanned Ground Vehicles), UAVs (Unmanned Aerial Vehicles) and USVs (Unmanned Surface Vehicles) from a unified Remote Control Station (RC2). The Training and Support system is implemented in the SaaS model (Software as a Service). 
Therefore, its functionality is available over the network. SAR ICARUS teams from different countries can be trained simultaneously on a shared virtual stage. In this paper we will show the multi-robot 3D mapping component (aerial vehicle and ground vehicles). We will demonstrate that these 3D maps can be used for training purposes. Finally, we demonstrate the current approach for ICARUS Urban SAR (USAR) and Marine SAR (MSAR) operation training.},\ndoi = {10.1109\/ssrr.2014.7017644},\nproject = {ICARUS},\naddress = {Toyako-cho, Hokkaido, Japan},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/7017644?arnumber=7017644&sortType=asc_p_Sequence&filter=AND(p_IS_Number:7017643)=},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    C. Armbrust, G. De Cubber, and K. Berns, &#8220;ICARUS Control Systems for Search and Rescue Robots,\" <span style=\"font-style: italic\">Field and Assistive Robotics &#8211; Advances in Systems and Algorithms<\/span>, 2014.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_91\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_91\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/pdfs.semanticscholar.org\/713d\/8c8561eba9b577f17d3059155e1f3953893a.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_91_block\">\n<p>This paper describes results of the European project ICARUS in the field of search and rescue robotics. It presents the software architectures of two unmanned ground vehicles (a small and a large one) developed in the context of the project. The architectures of the two vehicles share many similarities. This allows for component reuse and thus reduces the overall development effort. 
Hence, the main contributions of this paper are design concepts that can serve as a basis for the development of different robot control systems.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_91_block\">\n<pre><code class=\"tex bibtex\">@Article{armbrust2014icarus,\nauthor = {Armbrust, Christopher and De Cubber, Geert and Berns, Karsten},\njournal = {Field and Assistive Robotics - Advances in Systems and Algorithms},\ntitle = {{ICARUS} Control Systems for Search and Rescue Robots},\nyear = {2014},\nabstract = {This paper describes results of the European project ICARUS in the field of search and rescue robotics. It presents the software architectures of two unmanned ground vehicles (a small and a large one) developed in the context of the project. The architectures of the two vehicles share many similarities. This allows for component reuse and thus reduces the overall development effort. Hence, the main contributions of this paper are design concepts that can serve as a basis for the development of different robot control systems.},\npublisher = {Shaker Verlag},\nproject = {ICARUS},\nurl = {https:\/\/pdfs.semanticscholar.org\/713d\/8c8561eba9b577f17d3059155e1f3953893a.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and H. Balta, &#8220;ICARUS RPAS AND THEIR OPERATIONAL USE IN Bosnia,\" in <span style=\"font-style: italic\">RPAS 2014<\/span>, Brussels, Belgium,  2014.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_92\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_92\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2014\/Icarus Project - RPAS in Bosnia_.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_92_block\">\n<p>This is a report on the field mission with an unmanned aircraft system in spring 2014 in Bosnia, to help with flood relief and mine clearing operations.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_92_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2014icarus,\nauthor = {De Cubber, Geert and Balta, Haris},\nbooktitle = {RPAS 2014},\ntitle = {{ICARUS RPAS} AND THEIR OPERATIONAL USE IN {Bosnia}},\nyear = {2014},\norganization = {UVS International},\nabstract = {This is a report on the field mission with an unmanned aircraft system in spring 2014 in Bosnia, to help with flood relief and mine clearing operations.},\nproject = {ICARUS},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2014\/Icarus Project - RPAS in Bosnia_.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    T. Nguyen, E. Kayakan, J. De Baerdemaeker, and W. Saeys, &#8220;Motion planning algorithm and its real-time implementation in apples harvesting robot,\" in <span style=\"font-style: italic\">International Conference of Agricultural Engineering, CIGR-Ageng<\/span>,  2014.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_155\" class=\"papercite_toggle\">[BibTeX]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_155_block\">\n<pre><code class=\"tex bibtex\">@INPROCEEDINGS{proc_tt_006,\nauthor={T.Th. {Nguyen} and E. {Kayakan} and J. {De Baerdemaeker} and W. 
{Saeys}},\ntitle={Motion planning algorithm and its real-time implementation in apples harvesting robot},\nbooktitle={International Conference of Agricultural Engineering, CIGR-Ageng},\nmonth=jul,\nyear={2014},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2013<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    J. Bedkowski, K. Majek, I. Ostrowski, P. Musialik, A. Mas{l}owski, A. Adamek, A. Coelho, and G. De Cubber, &#8220;Methodology of Training and Support for Urban Search and Rescue With Robots,\" in <span style=\"font-style: italic\">Proc. Ninth International Conference on Autonomic and Autonomous Systems (ICAS), Lisbon, Portugal<\/span>, Lisbon, Portugal,  2013, p. 77\u201382.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_76\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_76\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.thinkmind.org\/download.php?articleid=icas_2013_3_40_20054\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_76_block\">\n<p>A primordial task of the fire-fighting and rescue services in the event of a large crisis is the search for human survivors on the incident site. This task, being complex and dangerous, often leads to loss of lives. Unmanned search and rescue devices can provide a valuable tool for saving human lives and speeding up the search and rescue operations. The Urban Search and Rescue (USAR) community agrees that the operator's skill is the main factor for successfully using unmanned robotic platforms. The key training concept is the &#8220;train as you fight\" mentality. Intervention troops focus on &#8220;real training\", as a crisis is difficult to simulate. For this reason, in this paper a methodology of training and support for USAR with unmanned vehicles is proposed. 
The methodology integrates the Qualitative Spatio-Temporal Representation and Reasoning (QSTRR) framework with USAR tools to decrease the cognitive load on human operators working with sophisticated robotic platforms. Tools for simplifying and improving virtual training environment generation from live data are shown.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_76_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{bedkowski2013methodology,\nauthor = {Bedkowski, Janusz and Majek, Karol and Ostrowski, Igor and Musialik, Pawe{l} and Mas{l}owski, Andrzej and Adamek, Artur and Coelho, Antonio and De Cubber, Geert},\nbooktitle = {Proc. Ninth International Conference on Autonomic and Autonomous Systems (ICAS), Lisbon, Portugal},\ntitle = {Methodology of Training and Support for Urban Search and Rescue With Robots},\nyear = {2013},\naddress = {Lisbon, Portugal},\nmonth = mar,\npages = {77--82},\nabstract = {A primordial task of the fire-fighting and rescue services in the event of a large crisis is the search for human survivors on the incident site. This task, being complex and dangerous, often leads to loss of lives. Unmanned search and rescue devices can provide a valuable tool for saving human lives and speeding up the search and rescue operations. The Urban Search and Rescue (USAR) community agrees that the operator's skill is the main factor for successfully using unmanned robotic platforms. The key training concept is the \"train as you fight\" mentality. Intervention troops focus on \"real training\", as a crisis is difficult to simulate. For this reason, in this paper a methodology of training and support for USAR with unmanned vehicles is proposed. The methodology integrates the Qualitative Spatio-Temporal Representation and Reasoning (QSTRR) framework with USAR tools to decrease the cognitive load on human operators working with sophisticated robotic platforms. 
Tools for simplifying and improving virtual training environment generation from live data are shown.},\nproject = {ICARUS},\nurl = {https:\/\/www.thinkmind.org\/download.php?articleid=icas_2013_3_40_20054},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, &#8220;ICARUS Consortium &#8211; Providing Unmanned Search and Rescue Tools,\" in <span style=\"font-style: italic\">Remotely Piloted Aircraft Systems &#8211; The Global Perspective &#8211; Yearbook 2013\/2014<\/span>, Brussels, Belgium: Blyenburgh &#038; co, 2013, vol. 11, p. 133\u2013134.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_78\" class=\"papercite_toggle\">[BibTeX]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_78_block\">\n<pre><code class=\"tex bibtex\">@InCollection{de2013icarus,\nauthor = {De Cubber, Geert},\nbooktitle = {Remotely Piloted Aircraft Systems - The Global Perspective - Yearbook 2013\/2014},\npublisher = {Blyenburgh & co},\ntitle = {{ICARUS} Consortium - Providing Unmanned Search and Rescue Tools},\nyear = {2013},\npages = {133--134},\naddress = {Brussels, Belgium},\nproject = {ICARUS},\nvolume = {11},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and H. Sahli, &#8220;Augmented Lagrangian-based approach for dense three-dimensional structure and motion estimation from binocular image sequences,\" <span style=\"font-style: italic\">IET Computer Vision<\/span>, 2013.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_79\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_79\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/digital-library.theiet.org\/content\/journals\/10.1049\/iet-cvi.2013.0017\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1049\/iet-cvi.2013.0017' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_79_block\">\n<p>In this study, the authors propose a framework for stereo\u2013motion integration for dense depth estimation. They formulate the stereo\u2013motion depth reconstruction problem as a constrained minimisation one. A sequential unconstrained minimisation technique, namely, the augmented Lagrange multiplier (ALM) method has been implemented to address the resulting constrained optimisation problem. ALM has been chosen because of its relative insensitivity to whether the initial design points for a pseudo-objective function are feasible or not. The development of the method and results from solving the stereo\u2013motion integration problem are presented. Although the authors' work is not the only one adopting the ALM framework in the computer vision context, to their knowledge the presented algorithm is the first to use this mathematical framework in the context of stereo\u2013motion integration. This study describes how the stereo\u2013motion integration problem was cast in a mathematical context and solved using the presented ALM method. 
Results on benchmark and real visual input data show the validity of the approach.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_79_block\">\n<pre><code class=\"tex bibtex\">@Article{de2013augmented,\nauthor = {De Cubber, Geert and Sahli, Hichem},\njournal = {IET Computer Vision},\ntitle = {Augmented Lagrangian-based approach for dense three-dimensional structure and motion estimation from binocular image sequences},\nyear = {2013},\nabstract = {In this study, the authors propose a framework for stereo\u2013motion integration for dense depth estimation. They formulate the stereo\u2013motion depth reconstruction problem as a constrained minimisation one. A sequential unconstrained minimisation technique, namely, the augmented Lagrange multiplier (ALM) method has been implemented to address the resulting constrained optimisation problem. ALM has been chosen because of its relative insensitivity to whether the initial design points for a pseudo-objective function are feasible or not. The development of the method and results from solving the stereo\u2013motion integration problem are presented. Although the authors' work is not the only one adopting the ALM framework in the computer vision context, to their knowledge the presented algorithm is the first to use this mathematical framework in the context of stereo\u2013motion integration. This study describes how the stereo\u2013motion integration problem was cast in a mathematical context and solved using the presented ALM method. Results on benchmark and real visual input data show the validity of the approach.},\ndoi = {10.1049\/iet-cvi.2013.0017},\npublisher = {IET Digital Library},\nproject = {ICARUS,ViewFinder,Mobiniss},\nurl = {https:\/\/digital-library.theiet.org\/content\/journals\/10.1049\/iet-cvi.2013.0017},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, G. De Cubber, D. Doroftei, Y. Baudoin, and H. 
Sahli, &#8220;Terrain traversability analysis for off-road robots using time-of-flight 3d sensing,\" in <span style=\"font-style: italic\">7th IARP International Workshop on Robotics for Risky Environment-Extreme Robotics<\/span>, Saint-Petersburg, Russia,  2013.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_80\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_80\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/Terrain Traversability Analysis ver 4-HS.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_80_block\">\n<p>In this paper we present a terrain traversability analysis methodology which classifies all image pixels in the TOF image as traversable or not, by estimating for each pixel a traversability score which is based upon the analysis of the 3D (depth data) and 2D (IR data) content of the TOF camera data. This classification result is then used for the (semi) \u2013 autonomous navigation of two robotic systems, operating in extreme environments: a search and rescue robot and a humanitarian demining robot. 
Integrated in an autonomous robot control architecture, terrain traversability classification increases the environmental situational awareness and enables a mobile robot to navigate (semi) \u2013 autonomously in an unstructured, dynamic outdoor environment.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_80_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2013terrain,\nauthor = {Balta, Haris and De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Sahli, Hichem},\nbooktitle = {7th IARP International Workshop on Robotics for Risky Environment-Extreme Robotics},\ntitle = {Terrain traversability analysis for off-road robots using time-of-flight 3d sensing},\nyear = {2013},\nabstract = {In this paper we present a terrain traversability analysis methodology which classifies all image pixels in the TOF image as traversable or not, by estimating for each pixel a traversability score which is based upon the analysis of the 3D (depth data) and 2D (IR data) content of the TOF camera data. This classification result is then used for the (semi) \u2013 autonomous navigation of two robotic systems, operating in extreme environments: a search and rescue robot and a humanitarian demining robot. Integrated in an autonomous robot control architecture, terrain traversability classification increases the environmental situational awareness and enables a mobile robot to navigate (semi) \u2013 autonomously in an unstructured, dynamic outdoor environment.},\nproject = {ICARUS},\naddress = {Saint-Petersburg, Russia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/Terrain Traversability Analysis ver 4-HS.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin and G. De Cubber, &#8220;TIRAMISU-ICARUS: FP7-Projects Challenges for Robotics Systems,\" in <span style=\"font-style: italic\">7th IARP Workshop on Robotics for Risky Environment &#8211; Extreme Robotics<\/span>, Saint-Petersburg, Russia,  2013, p. 55\u201369.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_81\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_81\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/KN Paper YB.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_81_block\">\n<p>TIRAMISU: Clearing large civilian areas from anti-personnel landmines and cluster munitions is a difficult problem because of the large diversity of hazardous areas and explosive contamination. A single solution does not exist and many Mine Action actors have called for a toolbox from which they could choose the tools best fit to a given situation. Some have built their own toolboxes, usually specific to their activities, such as clearance. The TIRAMISU project aims at providing the foundation for a global toolbox that will cover the main Mine Action activities, from the survey of large areas to the actual disposal of explosive hazards, including Mine Risk Education. The toolbox produced by the project will provide Mine Action actors with a large set of tools, grouped into thematic modules, which will help them to better perform their job. These tools will have been designed with the help of end-users and validated by them in mine affected countries. ICARUS: Recent dramatic events such as the earthquakes in Haiti and L\u2019Aquila or the flooding in Pakistan have shown that local civil authorities and emergency services have difficulties with adequately managing crises. The result is that these crises lead to major disruption of the whole local society. The goal of ICARUS is to decrease the total cost (both in human lives and in euro) of a major crisis. 
In order to realise this goal, the ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers and to assist search and rescue teams for dealing with the difficult and dangerous, but life-saving task of finding human survivors. As every crisis is different, it is impossible to provide one solution which fits all needs. Therefore, the ICARUS project will concentrate on developing components or building blocks that can be directly used by the crisis managers when arriving on the field. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with human detection sensors. The ICARUS unmanned vehicles are intended as the first explorers of the area, as well as in-situ supporters to act as safeguards to human personnel. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these ICARUS tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to the human crisis managers to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_81_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2013tiramisu,\nauthor = {Baudoin, Yvan and De Cubber, Geert},\nbooktitle = {7th IARP Workshop on Robotics for Risky Environment - Extreme Robotics},\ntitle = {{TIRAMISU-ICARUS}: {FP7}-Projects Challenges for Robotics Systems},\nyear = {2013},\npages = {55--69},\naddress = {Saint-Petersburg, Russia},\nabstract = {TIRAMISU: Clearing large civilian areas from anti-personnel landmines and cluster munitions is a difficult problem because of the large diversity of hazardous areas and explosive contamination. 
A single solution does not exist and many Mine Action actors have called for a toolbox from which they could choose the tools best fit to a given situation. Some have built their own toolboxes, usually specific to their activities, such as clearance. The TIRAMISU project aims at providing the foundation for a global toolbox that will cover the main Mine Action activities, from the survey of large areas to the actual disposal of explosive hazards, including Mine Risk Education. The toolbox produced by the project will provide Mine Action actors with a large set of tools, grouped into thematic modules, which will help them to better perform their job. These tools will have been designed with the help of end-users and validated by them in mine affected countries.\nICARUS: Recent dramatic events such as the earthquakes in Haiti and L\u2019Aquila or the flooding in Pakistan have shown that local civil authorities and emergency services have difficulties with adequately managing crises. The result is that these crises lead to major disruption of the whole local society. The goal of ICARUS is to decrease the total cost (both in human lives and in euro) of a major crisis. In order to realise this goal, the ICARUS project proposes to equip first responders with a comprehensive and integrated set of unmanned search and rescue tools, to increase the situational awareness of human crisis managers and to assist search and rescue teams for dealing with the difficult and dangerous, but life-saving task of finding human survivors. As every crisis is different, it is impossible to provide one solution which fits all needs. Therefore, the ICARUS project will concentrate on developing components or building blocks that can be directly used by the crisis managers when arriving on the field. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with human detection sensors. 
The ICARUS unmanned vehicles are intended as the first explorers of the area, as well as in-situ supporters to act as safeguards to human personnel. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these ICARUS tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to the human crisis managers to learn to use the ICARUS system.},\nproject = {ICARUS, TIRAMISU},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/KN Paper YB.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;The EU-ICARUS project: developing assistive robotic tools for search and rescue operations,\" in <span style=\"font-style: italic\">2013 IEEE international symposium on safety, security, and rescue robotics (SSRR)<\/span>, Linkoping, Sweden,  2013, p. 1\u20134.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_82\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_82\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/SSRR2013_ICARUS.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2013.6719323' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_82_block\">\n<p>The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but lifesaving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. 
The unmanned vehicles collaborate as a coordinated team, communicating via ad-hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I (command, control, communications, computers, and intelligence) equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_82_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2013eu,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},\nbooktitle = {2013 IEEE international symposium on safety, security, and rescue robotics (SSRR)},\ntitle = {The {EU-ICARUS} project: developing assistive robotic tools for search and rescue operations},\nyear = {2013},\norganization = {IEEE},\npages = {1--4},\naddress = {Linkoping, Sweden},\nabstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but lifesaving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad-hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I (command, control, communications, computers, and intelligence) equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\ndoi = {10.1109\/ssrr.2013.6719323},\nproject = {ICARUS},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/SSRR2013_ICARUS.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    S. Govindaraj, K. Chintamani, J. Gancet, P. Letier, B. van Lierde, Y. Nevatia, G. 
De Cubber, D. Serrano, M. E. Palomares, J. Bedkowski, C. Armbrust, J. Sanchez, A. Coelho, and I. Orbe, &#8220;The ICARUS project &#8211; Command, Control and Intelligence (C2I),\" in <span style=\"font-style: italic\">2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)<\/span>, Linkoping, Sweden,  2013, p. 1\u20134.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_83\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_83\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/Govindaraj_SSRR_WS_Paper_V2.0.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2013.6719356' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_83_block\">\n<p>This paper describes the features and concepts behind the Command, Control and Intelligence (C2I) system under development in the ICARUS project, which aims at improving crisis management with the use of unmanned search and rescue robotic appliances embedded and integrated into existing infrastructures. A beneficial C2I system should assist the search and rescue process by enhancing first responder situational awareness, decision making and crisis handling by designing intuitive user interfaces that convey detailed and extensive information about the crisis and its evolution. 
The different components of C2I, their architectural and functional aspects are described along with the robot platform used for development and field testing.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_83_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{govindaraj2013icarus,\nauthor = {Shashank Govindaraj and Keshav Chintamani and Jeremi Gancet and Pierre Letier and Boris van Lierde and Yashodhan Nevatia and Geert De Cubber and Daniel Serrano and Miguel Esbri Palomares and Janusz Bedkowski and Christopher Armbrust and Jose Sanchez and Antonio Coelho and Iratxe Orbe},\nbooktitle = {2013 {IEEE} International Symposium on Safety, Security, and Rescue Robotics ({SSRR})},\ntitle = {The {ICARUS} project - Command, Control and Intelligence (C2I)},\nyear = {2013},\nmonth = oct,\norganization = {IEEE},\naddress = {Linkoping, Sweden},\npages = {1--4},\npublisher = {{IEEE}},\nabstract = {This paper describes the features and concepts behind the Command, Control and Intelligence (C2I) system under development in the ICARUS project, which aims at improving crisis management with the use of unmanned search and rescue robotic appliances embedded and integrated into existing infrastructures. A beneficial C2I system should assist the search and rescue process by enhancing first responder situational awareness, decision making and crisis handling by designing intuitive user interfaces that convey detailed and extensive information about the crisis and its evolution. The different components of C2I, their architectural and functional aspects are described along with the robot platform used for development and field testing.},\ndoi = {10.1109\/ssrr.2013.6719356},\nproject = {ICARUS},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/Govindaraj_SSRR_WS_Paper_V2.0.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, S. Rossi, S. Iengo, B. Siciliano, A. Finzi, and G. 
De Cubber, &#8220;Adaptive behavior-based control for robot navigation: A multi-robot case study,\" in <span style=\"font-style: italic\">2013 XXIV International Conference on Information, Communication and Automation Technologies (ICAT)<\/span>, Sarajevo, Bosnia and Herzegovina,  2013, p. 1\u20137.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_84\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_84\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/6684083?tp=&#038;arnumber=6684083\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/icat.2013.6684083' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_84_block\">\n<p>The main focus of the work presented in this paper is to investigate the application of certain biologically-inspired control strategies in the field of autonomous mobile robots, with particular emphasis on multi-robot navigation systems. The control architecture used in this work is based on the behavior-based approach. The main argument in favor of this approach is its impressive and rapid practical success. This powerful methodology has demonstrated simplicity, parallelism, perception-action mapping and real implementation. When a group of autonomous mobile robots needs to achieve a goal operating in complex dynamic environments, such a task involves high computational complexity and a large volume of data needed for continuous monitoring of internal states and the external environment. Most autonomous mobile robots have limited capabilities in computation power or energy sources with limited capability, such as batteries. 
Therefore, it becomes necessary to build additional mechanisms on top of the control architecture able to efficiently allocate resources for enhancing the performance of an autonomous mobile robot. For this purpose, it is necessary to build an adaptive behavior-based control system focused on sensory adaptation. This adaptive property will assure efficient use of robot&#8217;s limited sensorial and cognitive resources. The proposed adaptive behavior-based control system is then validated through simulation in a multi-robot environment with a task of prey\/predator scenario.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_84_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{balta2013adaptive,\nauthor = {Balta, Haris and Rossi, Silvia and Iengo, Salvatore and Siciliano, Bruno and Finzi, Alberto and De Cubber, Geert},\nbooktitle = {2013 XXIV International Conference on Information, Communication and Automation Technologies (ICAT)},\ntitle = {Adaptive behavior-based control for robot navigation: A multi-robot case study},\nyear = {2013},\norganization = {IEEE},\npages = {1--7},\nabstract = {The main focus of the work presented in this paper is to investigate the application of certain biologically-inspired control strategies in the field of autonomous mobile robots, with particular emphasis on multi-robot navigation systems. The control architecture used in this work is based on the behavior-based approach. The main argument in favor of this approach is its impressive and rapid practical success. This powerful methodology has demonstrated simplicity, parallelism, perception-action mapping and real implementation. When a group of autonomous mobile robots needs to achieve a goal operating in complex dynamic environments, such a task involves high computational complexity and a large volume of data needed for continuous monitoring of internal states and the external environment. 
Most autonomous mobile robots have limited capabilities in computation power or energy sources with limited capability, such as batteries. Therefore, it becomes necessary to build additional mechanisms on top of the control architecture able to efficiently allocate resources for enhancing the performance of an autonomous mobile robot. For this purpose, it is necessary to build an adaptive behavior-based control system focused on sensory adaptation. This adaptive property will assure efficient use of robot's limited sensorial and cognitive resources. The proposed adaptive behavior-based control system is then validated through simulation in a multi-robot environment with a task of prey\/predator scenario.},\ndoi = {10.1109\/icat.2013.6684083},\naddress = {Sarajevo, Bosnia and Herzegovina},\nproject = {ICARUS},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/6684083?tp=&arnumber=6684083},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    H. Balta, G. De Cubber, and D. Doroftei, &#8220;Increasing Situational Awareness through Outdoor Robot Terrain Traversability Analysis based on Time-Of-Flight Camera,\" in <span style=\"font-style: italic\">Spring School on Developmental Robotics and Cognitive Bootstrapping<\/span>, Athens, Greece, 2013, p. 8.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_85\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_85\" class=\"papercite_toggle\">[Abstract]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_85_block\">\n<p>Poster paper<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_85_block\">\n<pre><code class=\"tex bibtex\">@InCollection{balta2013increasing,\nauthor = {Balta, Haris and De Cubber, Geert and Doroftei, Daniela},\nbooktitle = {Spring School on Developmental Robotics and Cognitive Bootstrapping},\ntitle = {Increasing Situational Awareness through Outdoor Robot Terrain Traversability Analysis based on Time- Of-Flight Camera},\nyear = {2013},\nnumber = {Developmental Robotics and Cognitive Bootstrapping},\npages = {8},\nabstract = {Poster paper},\naddress = {Athens, Greece},\nproject = {ICARUS},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Serrano, K. Berns, K. Chintamani, R. Sabino, S. Ourevitch, D. Doroftei, C. Armbrust, T. Flamma, and Y. Baudoin, &#8220;Search and rescue robots developed by the European Icarus project,\" in <span style=\"font-style: italic\">7th Int Workshop on Robotics for Risky Environments<\/span>, Saint &#8211; Petersburg, Russia,  2013.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_117\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_117\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2013\/Search and Rescue robots developed by the European ICARUS project - Article.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_117_block\">\n<p>This paper discusses the efforts of the European ICARUS project towards the development of unmanned search and rescue (SAR) robots. 
The ICARUS project proposes to equip first responders with a comprehensive and integrated set of remotely operated SAR tools, to increase the situational awareness of human crisis managers. In the event of large crises, a primordial task of the fire and rescue services is the search for human survivors on the incident site, which is a complex and dangerous task. The introduction of remotely operated SAR devices can offer a valuable tool to save human lives and to speed up the SAR process. Therefore, ICARUS concentrates on the development of unmanned SAR technologies for detecting, locating and rescuing humans. The remotely operated SAR devices are foreseen to be the first explorers of the area, along with in-situ supporters to act as safeguards to human personnel. While the ICARUS project also considers the development of marine and aerial robots, this paper will mostly concentrate on the development of the unmanned ground vehicles (UGVs) for SAR. Two main UGV platforms are being developed within the context of the project: a large UGV including a powerful arm for manipulation, which is able to make structural changes in disaster scenarios. The large UGV also serves as a base platform for a small UGV (and possibly also a UAV), which is used for entering small enclosures, while searching for human survivors. In order not to increase the cognitive load of the human crisis managers, the SAR robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station, being able to navigate in an autonomous and semi-autonomous manner. The robots connect to the base station and to each other using a wireless self-organizing cognitive network of mobile communication nodes which adapts to the terrain. The SAR robots are equipped with sensors that detect the presence of humans and will also be equipped with a wide array of other types of sensors. 
At the base station, the data is processed and combined with geographical information, thus enhancing the situational awareness of the personnel leading the operation with in-situ processed data that can improve decision-making.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_117_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2013search,\nauthor = {De Cubber, Geert and Serrano, Daniel and Berns, Karsten and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane and Doroftei, Daniela and Armbrust, Christopher and Flamma, Tommasso and Baudoin, Yvan},\nbooktitle = {7th Int Workshop on Robotics for Risky Environments},\ntitle = {Search and rescue robots developed by the {European} {Icarus} project},\nyear = {2013},\nabstract = {This paper discusses the efforts of the European ICARUS project towards the development of unmanned search and rescue (SAR) robots. The ICARUS project proposes to equip first responders with a comprehensive and integrated set of remotely operated SAR tools, to increase the situational awareness of human crisis managers. In the event of large crises, a primordial task of the fire and rescue services is the search for human survivors on the incident site, which is a complex and dangerous task. The introduction of remotely operated SAR devices can offer a valuable tool to save human lives and to speed up the SAR process. Therefore, ICARUS concentrates on the development of unmanned SAR technologies for detecting, locating and rescuing humans. The remotely operated SAR devices are foreseen to be the first explorers of the area, along with in-situ supporters to act as safeguards to human personnel. While the ICARUS project also considers the development of marine and aerial robots, this paper will mostly concentrate on the development of the unmanned ground vehicles (UGVs) for SAR. 
Two main UGV platforms are being developed within the context of the project: a large UGV including a powerful arm for manipulation, which is able to make structural changes in disaster scenarios. The large UGV also serves as a base platform for a small UGV (and possibly also a UAV), which is used for entering small enclosures, while searching for human survivors. In order not to increase the cognitive load of the human crisis managers, the SAR robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station, being able to navigate in an autonomous and semi-autonomous manner. The robots connect to the base station and to each other using a wireless self-organizing cognitive network of mobile communication nodes which adapts to the terrain. The SAR robots are equipped with sensors that detect the presence of humans and will also be equipped with a wide array of other types of sensors. At the base station, the data is processed and\ncombined with geographical information, thus enhancing the situational awareness of the personnel leading the operation with in-situ processed data that can improve decision-making.},\nproject = {ICARUS},\naddress = {Saint - Petersburg, Russia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2013\/Search and Rescue robots developed by the European ICARUS project - Article.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2012<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    J. B\u0119dkowski, A. Mas\u0142owski, and G. De Cubber, &#8220;Real time 3D localization and mapping for USAR robotic application,\" <span style=\"font-style: italic\">Industrial Robot: An International Journal<\/span>, vol. 39, iss. 5, p. 464\u2013474, 2012.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_67\" class=\"papercite_toggle\">[BibTeX]<\/a>            <a href='http:\/\/dx.doi.org\/10.1108\/01439911211249751' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_67_block\">\n<pre><code class=\"tex bibtex\">@Article{bkedkowski2012real,\nauthor = {B{k{e}}dkowski, Janusz and Mas{l}owski, Andrzej and De Cubber, Geert},\njournal = {Industrial Robot: An International Journal},\ntitle = {Real time {3D} localization and mapping for {USAR} robotic application},\nyear = {2012},\nnumber = {5},\npages = {464--474},\nvolume = {39},\ndoi = {10.1108\/01439911211249751},\nproject = {ICARUS},\npublisher = {Emerald Group Publishing Limited},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and H. Sahli, &#8220;Partial differential equation-based dense 3D structure and motion estimation from monocular image sequences,\" <span style=\"font-style: italic\">IET computer vision<\/span>, vol. 6, iss. 3, p. 174\u2013185, 2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_68\" class=\"papercite_toggle\">[BibTeX]<\/a>            <a href='http:\/\/dx.doi.org\/10.1049\/iet-cvi.2011.0174' class='papercite_doi' title='View on publisher site'>[DOI]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_68_block\">\n<pre><code class=\"tex bibtex\">@Article{de2012partial,\nauthor = {De Cubber, Geert and Sahli, Hichem},\njournal = {IET computer vision},\ntitle = {Partial differential equation-based dense {3D} structure and motion estimation from monocular image sequences},\nyear = {2012},\nnumber = {3},\npages = {174--185},\nvolume = {6},\ndoi = {10.1049\/iet-cvi.2011.0174},\nproject = {ViewFinder, Mobiniss},\npublisher = {IET Digital Library},\nunit= {meca-ras,vu-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Yvinec, Y. Baudoin, G. De Cubber, M. Armada, L. Marques, J. Desaulniers, and M. 
Bajic, &#8220;TIRAMISU: FP7-Project for an integrated toolbox in Humanitarian Demining,\" in <span style=\"font-style: italic\">GICHD Technology Workshop<\/span>, Geneva, Switzerland,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_69\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_69\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/TIRAMISU-TWS-GICHD.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_69_block\">\n<p>The TIRAMISU project aims at providing the foundation for a global toolbox that will cover the main mine action activities, from the survey of large areas to the actual disposal of explosive hazards, including mine risk education and training tools. After a short description of some tools, particular emphasis will be given to the two topics proposed by the GICHD Technology Workshop, namely the methodology adopted by the explosion of an ammunition storage and the possible use of UAV (or UGV\/UAV) in Technical survey and\/or Close-in-Detection<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_69_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{yvinec2012tiramisu01,\nauthor = {Yvinec, Yann and Baudoin, Yvan and De Cubber, Geert and Armada, Manuel and Marques, Lino and Desaulniers, Jean-Marc and Bajic, Milan},\nbooktitle = {GICHD Technology Workshop},\ntitle = {{TIRAMISU}: {FP7}-Project for an integrated toolbox in Humanitarian Demining},\nyear = {2012},\nabstract = {The TIRAMISU project aims at providing the foundation for a global toolbox that will cover the main mine action activities, from the survey of large areas to the actual disposal of explosive hazards, including mine risk education and training tools. 
After a short description of some tools, particular emphasis will be given to the two topics proposed by the GICHD Technology Workshop, namely the methodology adopted by the explosion of an ammunition storage and the possible use of UAV (or\nUGV\/UAV) in Technical survey and\/or Close-in-Detection},\nproject = {TIRAMISU},\naddress = {Geneva, Switzerland},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/TIRAMISU-TWS-GICHD.pdf},\nunit= {meca-ras,ciss}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Yvinec, Y. Baudoin, G. De Cubber, M. Armada, L. Marques, J. Desaulniers, M. Bajic, E. Cepolina, and M. Zoppi, &#8220;TIRAMISU: FP7-Project for an integrated toolbox in Humanitarian Demining , focus on UGV, UAV and technical survey,\" in <span style=\"font-style: italic\">6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)<\/span>, Warsaw, Poland,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_70\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_70\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/RISE-TIRAMISU.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_70_block\">\n<p>The TIRAMISU project aims at providing the foundation for a global toolbox that will cover the main mine action activities, from the survey of large areas to the actual disposal of explosive hazards, including mine risk education and training tools. 
After a short description of some tools, particular emphasis will be given to the two topics proposed by the GICHD Technology Workshop, namely the methodology adopted by the explosion of an ammunition storage and the possible use of UAV (or UGV\/UAV) in Technical survey and\/or Close-in-Detection<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_70_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{yvinec2012tiramisu02,\nauthor = {Yvinec, Yann and Baudoin, Yvan and De Cubber, Geert and Armada, Manuel and Marques, Lino and Desaulniers, Jean-Marc and Bajic, Milan and Cepolina, Emanuela and Zoppi, Marco},\nbooktitle = {6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)},\ntitle = {{TIRAMISU}: {FP7}-Project for an integrated toolbox in Humanitarian Demining , focus on UGV, UAV and technical survey},\nyear = {2012},\nabstract = {The TIRAMISU project aims at providing the foundation for a global toolbox that will cover the main mine action activities, from the survey of large areas to the actual disposal of explosive hazards, including mine risk education and training tools. After a short description of some tools, particular emphasis will be given to the two topics proposed by the GICHD Technology Workshop, namely the methodology adopted by the explosion of an ammunition storage and the possible use of UAV (or\nUGV\/UAV) in Technical survey and\/or Close-in-Detection},\naddress = {Warsaw, Poland},\nproject = {TIRAMISU},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/RISE-TIRAMISU.pdf},\nunit= {meca-ras,ciss}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;ICARUS : Providing Unmanned Search and Rescue Tools,\" in <span style=\"font-style: italic\">6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)<\/span>, Warsaw, Poland,  2012.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_71\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_71\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/RISE2012_ICARUS.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_71_block\">\n<p>The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoccognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_71_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2012icarus01,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},\nbooktitle = {6th IARP Workshop on Risky Interventions and Environmental Surveillance (RISE)},\ntitle = {{ICARUS} : Providing Unmanned Search and Rescue Tools},\nyear = {2012},\nabstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. 
The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\nproject = {ICARUS},\naddress = {Warsaw, Poland},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/RISE2012_ICARUS.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, and K. Chintamani, &#8220;Towards collaborative human and robotic rescue workers,\" in <span style=\"font-style: italic\">5th International Workshop on Human-Friendly Robotics (HFR2012)<\/span>, Brussels, Belgium,  2012, p. 18\u201319.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_73\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_73\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/citeseerx.ist.psu.edu\/viewdoc\/download?doi=10.1.1.303.6697&#038;rep=rep1&#038;type=pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_73_block\">\n<p>This paper discusses some of the main remaining bottlenecks towards the successful introduction of robotic search and rescue (SAR) tools, collaborating with human rescue workers. 
It also sketches some of the recent advances which are being made in the context of the European ICARUS project to get rid of these bottlenecks.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_73_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2012towards,\nauthor = {Doroftei, Daniela and De Cubber, Geert and Chintamani, Keshav},\nbooktitle = {5th International Workshop on Human-Friendly Robotics (HFR2012)},\ntitle = {Towards collaborative human and robotic rescue workers},\nyear = {2012},\npages = {18--19},\nabstract = {This paper discusses some of the main remaining bottlenecks towards the successful introduction of robotic search and rescue (SAR) tools, collaborating with human rescue workers. It also sketches some of the recent advances which are being made in the context of the European ICARUS project to get rid of these bottlenecks.},\nproject = {ICARUS},\naddress = {Brussels, Belgium},\nurl = {http:\/\/citeseerx.ist.psu.edu\/viewdoc\/download?doi=10.1.1.303.6697&rep=rep1&type=pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    A. Conduraru, I. Conduraru, E. Puscalau, G. De Cubber, D. Doroftei, and H. Balta, &#8220;Development of an autonomous rough-terrain robot,\" in <span style=\"font-style: italic\">IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN&#8217;12)<\/span>, Villamoura, Portugal,  2012.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_74\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_74\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/pdfs.semanticscholar.org\/884e\/6a80c8768044a1fd68ee91f45f17e5125153.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_74_block\">\n<p>In this paper, we discuss the development process of a mobile robot intended for environmental observation applications. The paper describes how a standard tele-operated Explosive Ordnance Disposal (EOD) robot was upgraded with electronics, sensors, computing power and autonomous capabilities, such that it becomes able to execute semi-autonomous missions, e.g. for search &#038; rescue or humanitarian demining tasks. The aim of this paper is not to discuss the details of the navigation algorithms (as these are often task-dependent), but more to concentrate on the development of the platform and its control architecture as a whole.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_74_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{conduraru2012development,\nauthor = {Conduraru, Alina and Conduraru, Ionel and Puscalau, Emanuel and De Cubber, Geert and Doroftei, Daniela and Balta, Haris},\nbooktitle = {IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN'12)},\ntitle = {Development of an autonomous rough-terrain robot},\nyear = {2012},\nabstract = {In this paper, we discuss the development process of a mobile robot intended for environmental observation applications. The paper describes how a standard tele-operated Explosive Ordnance Disposal (EOD) robot was upgraded with electronics, sensors, computing power and autonomous capabilities, such that it becomes able to execute semi-autonomous missions, e.g. 
for search & rescue or humanitarian demining tasks. The aim of this paper is not to discuss the details of the navigation algorithms (as these are often task-dependent), but more to concentrate on the development of the platform and its control architecture as a whole.},\nproject = {ICARUS},\naddress = {Villamoura, Portugal},\nurl = {https:\/\/pdfs.semanticscholar.org\/884e\/6a80c8768044a1fd68ee91f45f17e5125153.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;Operational RPAS scenarios envisaged for search &#038; rescue by the EU FP7 ICARUS project,\" in <span style=\"font-style: italic\">Remotely Piloted Aircraft Systems for Civil Operations (RPAS2012)<\/span>, Brussels, Belgium,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_75\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_75\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/De-Cubber-Geert_RMA_Belgium_WP.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_75_block\">\n<p>The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. 
To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_75_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2012operational,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Yvan and Serrano, Daniel and Chintamani, Keshav and Sabino, Rui and Ourevitch, Stephane},\nbooktitle = {Remotely Piloted Aircraft Systems for Civil Operations (RPAS2012)},\ntitle = {Operational {RPAS} scenarios envisaged for search & rescue by the {EU FP7 ICARUS} project},\nyear = {2012},\nabstract = {The ICARUS EU-FP7 project deals with the development of a set of integrated components to assist search and rescue teams in dealing with the difficult and dangerous, but life-saving task of finding human survivors. The ICARUS tools consist of assistive unmanned air, ground and sea vehicles, equipped with victim detection sensors. The unmanned vehicles collaborate as a coordinated team, communicating via ad hoc cognitive radio networking. To ensure optimal human-robot collaboration, these tools are seamlessly integrated into the C4I equipment of the human crisis managers and a set of training and support tools is provided to them to learn to use the ICARUS system.},\nproject = {ICARUS},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2012\/De-Cubber-Geert_RMA_Belgium_WP.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    J. B{k{e}}dkowski, G. De Cubber, and A. Mas{l}owski, &#8220;6D SLAM with GPGPU computation,\" <span style=\"font-style: italic\">Pomiary Automatyka Robotyka<\/span>, vol. 16, iss. 2, p. 275\u2013280, 2012.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_77\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_77\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/www.par.pl\/en\/content\/download\/14036\/170476\/file\/275_280.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_77_block\">\n<p>The main goal was to improve a state-of-the-art 6D SLAM algorithm with a new GPGPU-based implementation of the data registration module. Data registration is based on the ICP (Iterative Closest Point) algorithm that is fully implemented in the GPU with NVIDIA FERMI architecture. In our research we focus on mobile robot inspection intervention systems applicable in hazardous environments. The goal is to deliver a complete system capable of being used in real life. In this paper we demonstrate our achievements in the field of on-line robot localization and mapping. We demonstrated an experiment in a real large environment. We compared two strategies of data alignment &#8211; simple ICP and ICP using a so-called meta scan.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_77_block\">\n<pre><code class=\"tex bibtex\">@Article{bkedkowski20126d,\nauthor = {B{k{e}}dkowski, Janusz and De Cubber, Geert and Mas{l}owski, Andrzej},\njournal = {Pomiary Automatyka Robotyka},\ntitle = {{6D SLAM} with {GPGPU} computation},\nyear = {2012},\nnumber = {2},\npages = {275--280},\nvolume = {16},\nproject = {ICARUS},\nabstract = {The main goal was to improve a state-of-the-art 6D SLAM algorithm with a new GPGPU-based implementation of the data registration module. Data registration is based on the ICP (Iterative Closest Point) algorithm that is fully implemented in the GPU with NVIDIA FERMI architecture. In our research we focus on mobile robot inspection intervention systems applicable in hazardous environments. 
The goal is to deliver a complete system capable of being used in real life. In this paper we demonstrate our achievements in the field of on-line robot localization and mapping. We demonstrated an experiment in a real large environment. We compared two strategies of data alignment - simple ICP and ICP using a so-called meta scan.},\nurl = {http:\/\/www.par.pl\/en\/content\/download\/14036\/170476\/file\/275_280.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, Y. Baudoin, D. Serrano, K. Chintamani, R. Sabino, and S. Ourevitch, &#8220;ICARUS: AN EU-FP7 PROJECT PROVIDING UNMANNED SEARCH AND RESCUE TOOLS,\" in <span style=\"font-style: italic\">IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN&#8217;12)<\/span>, Villamoura, Portugal,  2012.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_89\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_89\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2012\/Icarus - ROSIN2012 Presentation.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_89_block\">\n<p>Overview of the objectives of the ICARUS project<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_89_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2012icarus02,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Baudoin, Y and Serrano, D and Chintamani, K and Sabino, R and Ourevitch, S},\nbooktitle = {IROS2012 Workshop on Robots and Sensors integration in future rescue INformation system (ROSIN'12)},\ntitle = {{ICARUS}: AN {EU-FP7} PROJECT PROVIDING UNMANNED SEARCH AND RESCUE TOOLS},\nyear = {2012},\nabstract = {Overview of the objectives of the ICARUS project},\nproject = {ICARUS},\naddress = {Villamoura, Portugal},\nurl = 
{http:\/\/mecatron.rma.ac.be\/pub\/2012\/Icarus - ROSIN2012 Presentation.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2011<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, D. Doroftei, H. Sahli, and Y. Baudoin, &#8220;Outdoor Terrain Traversability Analysis for Robot Navigation using a Time-Of-Flight Camera,\" in <span style=\"font-style: italic\">RGB-D Workshop on 3D Perception in Robotics<\/span>, Vasteras, Sweden,  2011.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_59\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_59\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/TTA_TOF.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_59_block\">\n<p>Autonomous robotic systems operating in unstructured outdoor environments need to estimate the traversability of the terrain in order to navigate safely. Traversability estimation is a challenging problem, as the traversability is a complex function of both the terrain characteristics, such as slopes, vegetation, rocks, etc. and the robot mobility characteristics, i.e. locomotion method, wheels, etc. It is thus required to analyze in real-time the 3D characteristics of the terrain and pair this data to the robot capabilities. 
In this paper, a method is introduced to estimate the traversability using data from a time-of-flight camera.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_59_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2011outdoor,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Sahli, Hichem and Baudoin, Yvan},\nbooktitle = {RGB-D Workshop on 3D Perception in Robotics},\ntitle = {Outdoor Terrain Traversability Analysis for Robot Navigation using a Time-Of-Flight Camera},\nyear = {2011},\nabstract = {Autonomous robotic systems operating in unstructured outdoor environments need to estimate the traversability of the terrain in order to navigate safely. Traversability estimation is a challenging problem, as the traversability is a complex function of both the terrain characteristics, such as slopes, vegetation, rocks, etc. and the robot mobility characteristics, i.e. locomotion method, wheels, etc. It is thus required to analyze in real-time the 3D characteristics of the terrain and pair this data to the robot capabilities. In this paper, a method is introduced to estimate the traversability using data from a time-of-flight camera.},\nproject = {ViewFinder, Mobiniss},\naddress = {Vasteras, Sweden},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/TTA_TOF.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and D. Doroftei, &#8220;Multimodal terrain analysis for an all-terrain crisis Management Robot,\" in <span style=\"font-style: italic\">IARP HUDEM 2011<\/span>, Sibenik, Croatia,  2011.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_60\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_60\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/Multimodal terrain analysis for an all-terrain crisis management robot .pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_60_block\">\n<p>In this paper, a novel stereo-based terrain-traversability estimation methodology is proposed. The novelty is that \u2013 contrary to classic depth-based terrain classification algorithms \u2013 all the information of the stereo camera system is used, including the color information. Using this approach, depth and color information are fused in order to obtain a higher classification accuracy than is possible with uni-modal techniques.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_60_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2011multimodal,\nauthor = {De Cubber, Geert and Doroftei, Daniela},\nbooktitle = {IARP HUDEM 2011},\ntitle = {Multimodal terrain analysis for an all-terrain crisis Management Robot},\nyear = {2011},\nabstract = {In this paper, a novel stereo-based terrain-traversability estimation methodology is proposed. The novelty is that \u2013 contrary to classic depth-based terrain classification algorithms \u2013 all the information of the stereo camera system is used, including the color information. Using this approach, depth and color information are fused in order to obtain a higher classification accuracy than is possible with uni-modal techniques.},\nproject = {Mobiniss},\naddress = {Sibenik, Croatia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/Multimodal terrain analysis for an all-terrain crisis management robot .pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, K. 
Verbiest, and S. A. Berrabah, &#8220;Autonomous camp surveillance with the ROBUDEM robot: challenges and results,\" in <span style=\"font-style: italic\">IARP Workshop RISE\u20192011<\/span>, Belgium,  2011.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_61\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_61\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/ELROB-RISE.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_61_block\">\n<p>Autonomous robotic systems can help in risky interventions to reduce the risk to human lives. An example of such a risky intervention is a camp surveillance scenario, where an environment needs to be patrolled and intruders need to be detected and intercepted. This paper describes the development of a mobile outdoor robot which is capable of performing such a camp surveillance task. The key research issues tackled are the robot design, geo-referenced localization and path planning, traversability estimation, the optimization of the terrain coverage strategy and the development of an intuitive human-robot interface.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_61_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2011autonomous,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Verbiest, Kristel and Berrabah, Sid Ahmed},\nbooktitle = {IARP Workshop RISE\u20192011},\ntitle = {Autonomous camp surveillance with the {ROBUDEM} robot: challenges and results},\nyear = {2011},\nabstract = {Autonomous robotic systems can help in risky interventions to reduce the risk to human lives. An example of such a risky intervention is a camp surveillance scenario, where an environment needs to be patrolled and intruders need to be detected and intercepted. 
This paper describes the development of a mobile outdoor robot which is capable of performing such a camp surveillance task. The key research issues tackled are the robot design, geo-referenced localization and path planning, traversability estimation, the optimization of the terrain coverage strategy and the development of an intuitive human-robot interface.},\nproject = {Mobiniss},\naddress = {Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/ELROB-RISE.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and D. Doroftei, &#8220;Using Robots in Hazardous Environments: Landmine Detection, de-Mining and Other Applications,\" in <span style=\"font-style: italic\">Using Robots in Hazardous Environments: Landmine Detection, De-Mining and Other Applications<\/span>, Y. Baudoin and M. Habib, Eds., Woodhead Publishing, 2011, vol. 1, p. 476\u2013498.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_112\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_112\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/Handbook Chapter 4 - Human Victim Detection and Stereo-based Terrain Traversability Analysis for Behavior-Based Robot Navigation.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_112_block\">\n<p>This chapter presents three main aspects of the development of a crisis management robot. First, we present an approach for robust victim detection in difficult outdoor conditions. Second, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data. 
Lastly, we present a behavior-based control architecture, enabling a robot to search for human victims on an incident site, while navigating semi-autonomously, using stereo vision as the main source of sensor information.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_112_block\">\n<pre><code class=\"tex bibtex\">@InBook{de2010human,\nauthor = {De Cubber, Geert and Doroftei, Daniela},\neditor = {Baudoin, Yvan and Habib, Maki},\nchapter = {Chapter 20},\npages = {476--498},\npublisher = {Woodhead Publishing},\ntitle = {Using Robots in Hazardous Environments: Landmine Detection, de-Mining and Other Applications},\nyear = {2011},\nisbn = {1845697863},\nvolume = {1},\nabstract = {This chapter presents three main aspects of the development of a crisis management robot. First, we present an approach for robust victim detection in difficult outdoor conditions. Second, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data. Lastly, we present a behavior-based control architecture, enabling a robot to search for human victims on an incident site, while navigating semi-autonomously, using stereo vision as the main source of sensor information.},\nbooktitle = {Using Robots in Hazardous Environments: Landmine Detection, De-Mining and Other Applications},\ndate = {2011-01-11},\nean = {9781845697860},\npagetotal = {665},\nproject = {Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/Handbook Chapter 4 - Human Victim Detection and Stereo-based Terrain Traversability Analysis for Behavior-Based Robot Navigation.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Decentralized multi-robot coordination for a risky surveillance application,\" in <span style=\"font-style: italic\">Proc. IARP HUDEM 2011<\/span>, Sibenik, Croatia,  2011.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_138\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_138\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2011\/HUDEM2011_Doroftei_Colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_138_block\">\n<p>This paper proposes a multi-robot control methodology that is based on a behavior-based control framework. In this behavior-based context, the robotic team members are controlled using one of 2 mutually exclusive behaviors: patrolling or intercepting. In patrol mode the robot seeks to detect enemy forces as rapidly as possible, by balancing 2 constraints: the intervention time should be minimized and the map coverage should be maximized. In interception mode, the robot tries to advance towards an enemy which was detected by one of the robotic team members. Subsequently, the robot tries to neutralize the threat posed by the enemy before the enemy is able to reach the camp.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_138_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2011decentralized,\nauthor = {Doroftei, Daniela and Colon, Eric},\nbooktitle = {Proc. {IARP} {HUDEM} 2011},\ntitle = {Decentralized multi-robot coordination for a risky surveillance application},\nyear = {2011},\npublisher = {{IARP}},\nabstract = {This paper proposes a multi-robot control methodology that is based on a behavior-based control framework. In this behavior-based context, the robotic team members are controlled using one of 2 mutually exclusive behaviors: patrolling or intercepting. In patrol mode the robot seeks to detect enemy forces as rapidly as possible, by balancing 2 constraints: the intervention time should be minimized and the map coverage should be maximized. 
In interception mode, the robot tries to advance towards an enemy which was detected by one of the robotic team members. Subsequently, the robot tries to neutralize the threat posed by the enemy before the enemy is able to reach the camp. },\nproject = {NMRS},\naddress = {Sibenik, Croatia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2011\/HUDEM2011_Doroftei_Colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2010<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, S. A. Berrabah, D. Doroftei, Y. Baudoin, and H. Sahli, &#8220;Combining Dense Structure from Motion and Visual SLAM in a Behavior-Based Robot Control Architecture,\" <span style=\"font-style: italic\">International Journal of Advanced Robotic Systems<\/span>, vol. 7, iss. 1, 2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_51\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_51\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/e_from_motion_and_visual_slam_in_a_behavior-based_robot_control_architecture.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/7240' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_51_block\">\n<p>In this paper, we present a control architecture for an intelligent outdoor mobile robot. This enables the robot to navigate in a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features. 
This information enables a behavior-based robot motion and path planner to navigate the robot through the environment. In this paper, we show the theoretical aspects of setting up this architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_51_block\">\n<pre><code class=\"tex bibtex\">@Article{de2010combining,\nauthor = {De Cubber, Geert and Sid Ahmed Berrabah and Daniela Doroftei and Yvan Baudoin and Hichem Sahli},\njournal = {International Journal of Advanced Robotic Systems},\ntitle = {Combining Dense Structure from Motion and Visual {SLAM} in a Behavior-Based Robot Control Architecture},\nyear = {2010},\nmonth = mar,\nnumber = {1},\nvolume = {7},\nabstract = {In this paper, we present a control architecture for an intelligent outdoor mobile robot. This enables the robot to navigate in a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features. This information enables a behavior-based robot motion and path planner to navigate the robot through the environment. In this paper, we show the theoretical aspects of setting up this architecture.},\ndoi = {10.5772\/7240},\npublisher = {{SAGE} Publications},\nproject = {ViewFinder, Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/e_from_motion_and_visual_slam_in_a_behavior-based_robot_control_architecture.pdf},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, E. Colon, C. Pinzon, A. Maslowski, J. Bedkowski, and J. PENDERS, &#8220;VIEW-FINDER: Robotics Assistance to fire-Fighting services,\" in <span style=\"font-style: italic\">Mobile Robotics: Solutions and Challenges<\/span>, , 2010, p. 
397\u2013406.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_52\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_52\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/books.google.be\/books?id=zcfFCgAAQBAJ&#038;pg=PA397&#038;lpg=PA397&#038;dq=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&#038;source=bl&#038;ots=Jh6P63OKCr&#038;sig=O1GPy_c42NPSEdO8Hb_pa9V6K7g&#038;hl=en&#038;sa=X&#038;ved=2ahUKEwiLr76B-5zfAhUMCewKHQS_Af0Q6AEwDXoECAEQAQ#v=onepage&#038;q=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&#038;f=false\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_52_block\">\n<p>This paper presents an overview of the View-Finder project<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_52_block\">\n<pre><code class=\"tex bibtex\">@InCollection{baudoin2010view,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz and PENDERS, Jacques},\nbooktitle = {Mobile Robotics: Solutions and Challenges},\ntitle = {{VIEW-FINDER}: Robotics Assistance to fire-Fighting services},\nyear = {2010},\npages = {397--406},\nabstract = {This paper presents an overview of the View-Finder project},\nproject = {ViewFinder},\nunit= {meca-ras},\nurl = {https:\/\/books.google.be\/books?id=zcfFCgAAQBAJ&pg=PA397&lpg=PA397&dq=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&source=bl&ots=Jh6P63OKCr&sig=O1GPy_c42NPSEdO8Hb_pa9V6K7g&hl=en&sa=X&ved=2ahUKEwiLr76B-5zfAhUMCewKHQS_Af0Q6AEwDXoECAEQAQ#v=onepage&q=VIEW-FINDER: Robotics Assistance to fire-Fighting services mobile robots&f=false},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. 
De Cubber, &#8220;On-line and Off-line 3D Reconstruction for Crisis Management Applications,\" in <span style=\"font-style: italic\">Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance, RISE\u20192010<\/span>, Sheffield, UK,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_57\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_57\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/RISE\/RISE - 2010\/On-line and Off-line 3D Reconstruction_Geert_De_Cubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_57_block\">\n<p>We present in this paper a 3D reconstruction methodology. This approach fuses dense stereo and sparse motion data to estimate high-quality instantaneous depth maps. This methodology achieves near real-time processing frame rates, such that it can be directly used on-line by the crisis management teams.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_57_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2010line,\nauthor = {De Cubber, Geert},\nbooktitle = {Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance, RISE\u20192010},\ntitle = {On-line and Off-line {3D} Reconstruction for Crisis Management Applications},\nyear = {2010},\nabstract = {We present in this paper a 3D reconstruction methodology. This approach fuses dense stereo and sparse motion data to estimate high-quality instantaneous depth maps. 
This methodology achieves near real-time processing frame rates, such that it can be directly used on-line by the crisis management teams.},\nproject = {ViewFinder, Mobiniss},\naddress = {Sheffield, UK},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/RISE\/RISE - 2010\/On-line and Off-line 3D Reconstruction_Geert_De_Cubber.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, G. De Cubber, E. Colon, D. Doroftei, and S. A. Berrabah, &#8220;Robotics Assistance by Risky Interventions: Needs and Realistic Solutions,\" in <span style=\"font-style: italic\">Workshop on Robotics for Extreme conditions<\/span>, Saint-Petersburg, Russia,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_58\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_58\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/Robotics Assistance by risky interventions.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_58_block\">\n<p>This paper discusses the requirements towards robotics systems in the domains of firefighting, CBRN-E and humanitarian demining.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_58_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2010robotics,\nauthor = {Baudoin, Yvan and De Cubber, Geert and Colon, Eric and Doroftei, Daniela and Berrabah, Sid Ahmed},\nbooktitle = {Workshop on Robotics for Extreme conditions},\ntitle = {Robotics Assistance by Risky Interventions: Needs and Realistic Solutions},\nyear = {2010},\nabstract = {This paper discusses the requirements towards robotics systems in the domains of firefighting, CBRN-E and humanitarian demining.},\nproject = {ViewFinder, Mobiniss},\naddress = {Saint-Petersburg, Russia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/Robotics Assistance by risky 
interventions.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, &#8220;Variational methods for dense depth reconstruction from monocular and binocular video sequences,\" PhD Thesis, 2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_62\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_62\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/PhD_Thesis_Geert_.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_62_block\">\n<p>This research work tackles the problem of dense three-dimensional reconstruction from monocular and binocular image sequences. Recovering 3D-information has been in the focus of attention of the computer vision community for a few decades now, yet no all-satisfying method has been found so far. The main problem with vision is that the perceived computer image is a two-dimensional projection of the 3D world. Three-dimensional reconstruction can thus be regarded as the process of re-projecting the 2D image(s) back to a 3D model, as such recovering the depth dimension which was lost during projection. In this work, we focus on dense reconstruction, meaning that a depth estimate is sought for each pixel of the input image. Most attention in the 3D reconstruction area has been on stereo-vision based methods, which use the displacement of objects in two (or more) images. Where stereo vision must be seen as a spatial integration of multiple viewpoints to recover depth, it is also possible to perform a temporal integration. The problem arising in this situation is known as the Structure from Motion problem and deals with extracting 3-dimensional information about the environment from the motion of its projection onto a two-dimensional surface. 
Based upon the observation that the human visual system uses both stereo and structure from motion for 3D reconstruction, this research work also targets the combination of stereo information in a structure from motion-based 3D-reconstruction scheme. The data fusion problem arising in this case is solved by casting it as an energy minimization problem in a variational framework.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_62_block\">\n<pre><code class=\"tex bibtex\">@PhdThesis{de2010variational,\nauthor = {De Cubber, Geert},\nschool = {Vrije Universiteit Brussel-Royal Military Academy},\ntitle = {Variational methods for dense depth reconstruction from monocular and binocular video sequences},\nyear = {2010},\nabstract = {This research work tackles the problem of dense three-dimensional reconstruction from monocular and binocular image sequences. Recovering 3D-information has been in the focus of attention of the computer vision community for a few decades now, yet no all-satisfying method has been found so far. The main problem with vision is that the perceived computer image is a two-dimensional projection of the 3D world. Three-dimensional reconstruction can thus be regarded as the process of re-projecting the 2D image(s) back to a 3D model, as such recovering the depth dimension which was lost during projection.\nIn this work, we focus on dense reconstruction, meaning that a depth estimate is sought for each pixel of the input image. Most attention in the 3D reconstruction area has been on stereo-vision based methods, which use the displacement of objects in two (or more) images. Where stereo vision must be seen as a spatial integration of multiple viewpoints to recover depth, it is also possible to perform a temporal integration. 
The problem arising in this situation is known as the Structure from Motion problem and deals with extracting 3-dimensional information about the environment from the motion of its projection onto a two-dimensional surface. Based upon the observation that the human visual system uses both stereo and structure from motion for 3D reconstruction, this research work also targets the combination of stereo information in a structure from motion-based 3D-reconstruction scheme. The data fusion problem arising in this case is solved by casting it as an energy minimization problem in a variational framework.},\nproject = {ViewFinder, Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/PhD_Thesis_Geert_.pdf},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, and S. A. Berrabah, &#8220;Using visual perception for controlling an outdoor robot in a crisis management scenario,\" in <span style=\"font-style: italic\">ROBOTICS 2010<\/span>, Clermont-Ferrand, France,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_101\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_101\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/Usingvisualperceptionforcontrollinganoutdoorrobotinacrisismanagementscenario (1).pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_101_block\">\n<p>Crisis management teams (e.g. fire and rescue services, anti-terrorist units &#8230;) are often confronted with dramatic situations where critical decisions have to be made within hard time constraints. Therefore, they need correct information about what is happening on the crisis site. In this context, the View-Finder project aims at developing robots which can assist the human crisis managers, by gathering data. 
This paper gives an overview of the development of such an outdoor robot. The presented robotic system is able to detect human victims at the incident site, by using vision-based human body shape detection. To increase the perceptual awareness of the human crisis managers, the robotic system is capable of reconstructing a 3D model of the environment, based on vision data. Also for navigation, the robot depends mostly on visual perception, as it combines a model-based navigation approach using geo-referenced positioning with stereo-based terrain traversability analysis for obstacle avoidance. The robot control scheme is embedded in a behavior-based robot control architecture, which integrates all the robot capabilities. This paper discusses all the above mentioned technologies.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_101_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2010using,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Berrabah, Sid Ahmed},\nbooktitle = {ROBOTICS 2010},\ntitle = {Using visual perception for controlling an outdoor robot in a crisis management scenario},\nyear = {2010},\nabstract = {Crisis management teams (e.g. fire and rescue services, anti-terrorist units ...) are often confronted with dramatic situations where critical decisions have to be made within hard time constraints. Therefore, they need correct information about what is happening on the crisis site. In this context, the View-Finder project aims at developing robots which can assist the human crisis managers, by gathering data. This paper gives an overview of the development of such an outdoor robot. The presented robotic system is able to detect human victims at the incident site, by using vision-based human body shape detection. To increase the perceptual awareness of the human crisis managers, the robotic system is capable of reconstructing a 3D model of the environment, based on vision data. 
Also for navigation, the robot depends mostly on visual perception, as it combines a model-based navigation approach using geo-referenced positioning with stereo-based terrain traversability analysis for obstacle avoidance. The robot control scheme is embedded in a behavior-based robot control architecture, which integrates all the robot capabilities. This paper discusses all the above mentioned technologies.},\nproject = {ViewFinder, Mobiniss},\naddress = {Clermont-Ferrand, France},\nunit= {meca-ras},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/Usingvisualperceptionforcontrollinganoutdoorrobotinacrisismanagementscenario (1).pdf},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Decentralized Multi-Robot Coordination in an Urban Environment,\" <span style=\"font-style: italic\">European Journal of Mechanical and Environmental Engineering<\/span>, vol. 1, 2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_139\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_139\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2010\/EJMEE2010_doroftei_colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_139_block\">\n<p>In this paper, a novel control strategy is presented for multi\u2010robot coordination. An important aspect of the presented control architecture is that it is formulated in a decentralized context. This means that the robots cannot rely on traditional global path planning algorithms for navigation. 
The presented approach casts the multi\u2010robot control problem as a behavior\u2010based control problem.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_139_block\">\n<pre><code class=\"tex bibtex\">@Article{doro2010decentralized,\nauthor = {Doroftei, Daniela and Colon, Eric},\njournal = {European Journal of Mechanical and Environmental Engineering},\ntitle = {Decentralized Multi-Robot Coordination in an Urban Environment},\nyear = {2010},\nvolume = {1},\nabstract = {In this paper, a novel control strategy is presented for multi\u2010robot coordination. An important aspect of the presented control architecture is that it is formulated in a decentralized context. This means that the robots cannot rely on traditional global path planning algorithms for navigation. The presented approach casts the multi\u2010robot control problem as a behavior\u2010based control problem. },\nproject = {NMRS},\naddress = {Sheffield, UK},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2010\/EJMEE2010_doroftei_colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Multi-robot collaboration and coordination in a high-risk transportation scenario,\" in <span style=\"font-style: italic\">Proc. IARP HUDEM 2010<\/span>, Sousse, Tunisia,  2010.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_140\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_140\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/HUDEM\/HUDEM%20-%202010\/HUDEM2010_Doroftei.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_140_block\">\n<p>This paper discusses a decentralized multi-robot coordination strategy which aims to control and guide a team of robotic agents safely through a hostile area. 
The \u201chostility\u201d of the environment is due to the presence of enemy forces, seeking to intercept the robotic team. In order to avoid detection and ensure global team safety, the robotic agents must carefully plan their trajectory towards a list of goal locations, while holding a defensive formation.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_140_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2001multi,\nauthor = {Doroftei, Daniela and Colon, Eric},\nbooktitle = {Proc. {IARP} {HUDEM} 2010},\ntitle = {Multi-robot collaboration and coordination in a high-risk transportation scenario},\nyear = {2010},\npublisher = {{IARP}},\nabstract = {This paper discusses a decentralized multi-robot coordination strategy which aims to control and guide a team of robotic agents safely through a hostile area. The \u201chostility\u201d of the environment is due to the presence of enemy forces, seeking to intercept the robotic team. In order to avoid detection and ensure global team safety, the robotic agents must carefully plan their trajectory towards a list of goal locations, while holding a defensive formation. },\nproject = {NMRS},\naddress = {Sousse, Tunisia},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/HUDEM\/HUDEM%20-%202010\/HUDEM2010_Doroftei.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and E. Colon, &#8220;Decentralized Multi-Robot Coordination for Risky Interventions,\" in <span style=\"font-style: italic\">Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance RISE<\/span>, Sheffield, UK,  2010.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_141\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_141\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/RISE\/RISE%20-%202010\/Decentralized%20Multi-Robot%20Coordination%20for%20Risky%20Interventio.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_141_block\">\n<p>The paper describes an approach to design a behavior-based architecture, how each behavior was designed and how the behavior fusion problem was solved.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_141_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2010multibis,\nauthor = {Doroftei, Daniela and Colon, Eric},\nbooktitle = {Fourth International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance {RISE}},\ntitle = {Decentralized Multi-Robot Coordination for Risky Interventions},\nyear = {2010},\nabstract = {The paper describes an approach to design a behavior-based architecture, how each behavior was designed and how the behavior fusion problem was solved.},\nproject = {NMRS, ViewFinder},\naddress = {Sheffield, UK},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/RISE\/RISE%20-%202010\/Decentralized%20Multi-Robot%20Coordination%20for%20Risky%20Interventio.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2009<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, D. Doroftei, L. Nalpantidis, G. C. Sirakoulis, and A. Gasteratos, &#8220;Stereo-based terrain traversability analysis for robot navigation,\" in <span style=\"font-style: italic\">IARP\/EURON Workshop on Robotics for Risky Interventions and Environmental Surveillance, Brussels, Belgium<\/span>, Brussels, Belgium,  2009.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_44\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_44\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DECUBBER-DUTH.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_44_block\">\n<p>In this paper, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_44_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2009stereo,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Nalpantidis, Lazaros and Sirakoulis, Georgios Ch and Gasteratos, Antonios},\nbooktitle = {IARP\/EURON Workshop on Robotics for Risky Interventions and Environmental Surveillance, Brussels, Belgium},\ntitle = {Stereo-based terrain traversability analysis for robot navigation},\nyear = {2009},\nabstract = {In this paper, we present an approach where a classification of the terrain in the classes traversable and obstacle is performed using only stereo vision as input data.},\nproject = {ViewFinder, Mobiniss},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DECUBBER-DUTH.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber and G. Marton, &#8220;Human Victim Detection,\" in <span style=\"font-style: italic\">Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance, RISE<\/span>, Brussels, Belgium,  2009.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_45\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_45\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DECUBBER_BUTE.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_45_block\">\n<p>This paper presents an approach to achieve robust victim detection from color video images. The applied approach builds on the Viola-Jones algorithm for Haar-features based template recognition. This algorithm was adapted to recognize persons lying on the ground in difficult outdoor illumination conditions.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_45_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2009human,\nauthor = {De Cubber, Geert and Marton, Gabor},\nbooktitle = {Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance, RISE},\ntitle = {Human Victim Detection},\nyear = {2009},\nabstract = {This paper presents an approach to achieve robust victim detection from color video images. The applied approach builds on the Viola-Jones algorithm for Haar-features based template recognition. This algorithm was adapted to recognize persons lying on the ground in difficult outdoor illumination conditions.},\nproject = {ViewFinder, Mobiniss},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DECUBBER_BUTE.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, G. De Cubber, E. Colon, and Y. Baudoin, &#8220;Behavior based control for an outdoor crisis management robot,\" in <span style=\"font-style: italic\">Proceedings of the IARP International Workshop on Robotics for Risky Interventions and Environmental Surveillance<\/span>, Brussels, Belgium,  2009, p. 12\u201314.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_46\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_46\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DOROFTEI.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_46_block\">\n<p>The design and development of a control architecture for a robotic crisis management agent raises 3 main questions: 1. How can we design the individual behaviors, such that the robot is capable of avoiding obstacles and of navigating semi-autonomously? 2. How can these individual behaviors be combined in an optimal way, leading to a rational and coherent global robot behavior? 3. How can all these capabilities be combined in a comprehensive and modular framework, such that the robot can handle a high-level task (searching for human victims) with minimal input from human operators, by navigating in a complex, dynamic environment, while avoiding potentially hazardous obstacles? In this paper, we present each of these three main aspects of the general robot control architecture in more detail.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_46_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2009behavior,\nauthor = {Doroftei, Daniela and De Cubber, Geert and Colon, Eric and Baudoin, Yvan},\nbooktitle = {Proceedings of the IARP International Workshop on Robotics for Risky Interventions and Environmental Surveillance},\ntitle = {Behavior based control for an outdoor crisis management robot},\nyear = {2009},\npages = {12--14},\nabstract = {The design and development of a control architecture for a robotic crisis management agent raises 3 main questions:\n1. How can we design the individual behaviors, such that the robot is capable of avoiding obstacles and of navigating semi-autonomously?\n2. 
How can these individual behaviors be combined in an optimal way, leading to a rational and coherent global robot behavior?\n3. How can all these capabilities be combined in a comprehensive and modular framework, such that the robot can handle a high-level task (searching for human victims) with minimal input from human operators, by navigating in a complex, dynamic environment, while avoiding potentially hazardous obstacles?\nIn this paper, we present each of these three main aspects of the general robot control architecture in more detail.},\nproject = {ViewFinder, Mobiniss},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/RISE-DOROFTEI.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, C. Pinzon, F. Warlet, J. Gancet, E. Motard, M. Ilzkovitz, L. Nalpantidis, and A. Gasteratos, &#8220;VIEW-FINDER : Robotics assistance to fire-fighting services and Crisis Management,\" in <span style=\"font-style: italic\">2009 IEEE International Workshop on Safety, Security &#038; Rescue Robotics (SSRR 2009)<\/span>, Denver, USA,  2009, p. 1\u20136.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_53\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_53\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/ieeexplore.ieee.org\/document\/5424172\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/ssrr.2009.5424172' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_53_block\">\n<p>In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. 
The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the Base Station (BS) the data is processed and combined with geographical information originating from a Web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. This paper will focus on the Crisis Management Information System that has been developed for improving a Disaster Management Action Plan and for linking the Control Station with an off-site Crisis Management Centre, and on the software tools implemented on the mobile robot gathering data in the outdoor area of the crisis.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_53_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{Baudoin2009view01,\nauthor = {Y. Baudoin and D. Doroftei and G. De Cubber and S. A. Berrabah and C. Pinzon and F. Warlet and J. Gancet and E. Motard and M. Ilzkovitz and L. Nalpantidis and A. Gasteratos},\nbooktitle = {2009 {IEEE} International Workshop on Safety, Security {&} Rescue Robotics ({SSRR} 2009)},\ntitle = {{VIEW}-{FINDER} : Robotics assistance to fire-fighting services and Crisis Management},\nyear = {2009},\nmonth = nov,\norganization = {IEEE},\npages = {1--6},\npublisher = {{IEEE}},\nabstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. 
The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the Base Station (BS) the data is processed and combined with geographical information originating from a Web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. This paper will focus on the Crisis Management Information System that has been developed for improving a Disaster Management Action Plan and for linking the Control Station with an off-site Crisis Management Centre, and on the software tools implemented on the mobile robot gathering data in the outdoor area of the crisis.},\ndoi = {10.1109\/ssrr.2009.5424172},\nproject = {ViewFinder},\naddress = {Denver, USA},\nurl = {https:\/\/ieeexplore.ieee.org\/document\/5424172},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, C. Pinzon, J. Penders, A. Maslowski, and J. Bedkowski, &#8220;VIEW-FINDER : Outdoor Robotics Assistance to Fire-Fighting services,\" in <span style=\"font-style: italic\">International Symposium Clawar<\/span>, Istanbul, Turkey,  2009.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_63\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_63\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/CLAWAR2009.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_63_block\">\n<p>In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station. The robots are off-the-shelf units, consisting of wheeled robots. The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. 
It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_63_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2009view02,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Pinzon, Carlos and Penders, Jacques and Maslowski, Andrzej and Bedkowski, Janusz},\nbooktitle = {International Symposium Clawar},\ntitle = {{VIEW-FINDER} : Outdoor Robotics Assistance to Fire-Fighting services},\nyear = {2009},\nabstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots will be designed to navigate individually or cooperatively and to follow high-level instructions from the base station. The robots are off-the-shelf units, consisting of wheeled robots. The robots connect wirelessly to the control station. 
The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It\nwill be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.},\nproject = {ViewFinder, Mobiniss},\naddress = {Istanbul, Turkey},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/CLAWAR2009.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, D. Doroftei, G. De Cubber, S. A. Berrabah, E. Colon, C. Pinzon, A. Maslowski, and J. Bedkowski, &#8220;View-Finder: a European project aiming the Robotics assistance to Fire-fighting services and Crisis Management,\" in <span style=\"font-style: italic\">IARP workshop on Service Robotics and Nanorobotics<\/span>, Beijing, China,  2009.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_64\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_64\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/IARP-paper2009.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_64_block\">\n<p>In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. 
At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command. We\u2019ll essentially focus in this paper on the steps entrusted to the RMA and PIAP through the work-packages of the project.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_64_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2009view03,\nauthor = {Baudoin, Yvan and Doroftei, Daniela and De Cubber, Geert and Berrabah, Sid Ahmed and Colon, Eric and Pinzon, Carlos and Maslowski, Andrzej and Bedkowski, Janusz},\nbooktitle = {IARP workshop on Service Robotics and Nanorobotics},\ntitle = {{View-Finder}: a European project aiming the Robotics assistance to Fire-fighting services and Crisis Management},\nyear = {2009},\nabstract = {In the event of an emergency due to a fire or other crisis, a necessary but time consuming pre-requisite, that could delay the real rescue operation, is to establish whether the ground or area can be entered safely by human emergency workers. The objective of the VIEW-FINDER project is to develop robots which have the primary task of gathering data. The robots are equipped with sensors that detect the presence of chemicals and, in parallel, image data is collected and forwarded to an advanced Control station (COC). 
The robots will be equipped with a wide array of chemical sensors, on-board cameras, Laser and other sensors to enhance scene understanding and reconstruction. At the control station the data is processed and combined with geographical information originating from a web of sources; thus providing the personnel leading the operation with in-situ processed data that can improve decision making. The information may also be forwarded to other forces involved in the operation (e.g. fire fighters, rescue workers, police, etc.). The robots connect wirelessly to the control station. The control station collects in-situ data and combines it with information retrieved from the large-scale GMES-information bases. It will be equipped with a sophisticated human interface to display the processed information to the human operators and operation command.\nWe\u2019ll essentially focus in this paper to the steps entrusted to the RMA and PIAP through the work-packages of the project.},\nproject = {ViewFinder},\naddress = {Beijing, China},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/IARP-paper2009.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    Y. Baudoin, G. De Cubber, S. A. Berrabah, D. Doroftei, E. Colon, C. Pinzon, A. Maslowski, and J. Bedkowski, &#8220;VIEW-FINDER: European Project Aiming CRISIS MANAGEMENT TOOLS and the Robotics Assistance to Fire-Fighting Services,\" in <span style=\"font-style: italic\">IARP WS on service Robotics, Beijing<\/span>, Beijing, China,  2009.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_72\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_72\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.academia.edu\/2879650\/VIEW-FINDER_European_Project_Aiming_CRISIS_MANAGEMENT_TOOLS_and_the_Robotics_Assistance_to_Fire-Fighting_Services\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_72_block\">\n<p>Overview of the View-Finder project<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_72_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{baudoin2009view04,\nauthor = {Baudoin, Yvan and De Cubber, Geert and Berrabah, Sid Ahmed and Doroftei, Daniela and Colon, E and Pinzon, C and Maslowski, A and Bedkowski, J},\nbooktitle = {IARP WS on service Robotics, Beijing},\ntitle = {{VIEW-FINDER}: European Project Aiming CRISIS MANAGEMENT TOOLS and the Robotics Assistance to Fire-Fighting Services},\nyear = {2009},\nabstract = {Overview of the View-Finder project},\nproject = {ViewFinder},\naddress = {Beijing, China},\nunit= {meca-ras},\nurl = {https:\/\/www.academia.edu\/2879650\/VIEW-FINDER_European_Project_Aiming_CRISIS_MANAGEMENT_TOOLS_and_the_Robotics_Assistance_to_Fire-Fighting_Services},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, Y. Baudoin, and H. Sahli, &#8220;Development of a behaviour-based control and software architecture for a visually guided mine detection robot,\" <span style=\"font-style: italic\">European Journal of Automated Systems (JESA)<\/span>, vol. 43, iss. 3, p. 295\u2013314, 2009.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_142\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_142\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2009\/doc-article-hermes.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_142_block\">\n<p>Humanitarian demining is a labor-intensive and high-risk task which could benefit from the development of a humanitarian mine detection robot, capable of scanning a minefield semi-automatically. The design of such an outdoor autonomous robot requires the consideration and integration of multiple aspects: sensing, data fusion, path and motion planning and robot control embedded in a control and software architecture. This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_142_block\">\n<pre><code class=\"tex bibtex\">@Article{doro2009development,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan and Sahli, Hichem},\njournal = {European Journal of Automated Systems ({JESA})},\ntitle = {Development of a behaviour-based control and software architecture for a visually guided mine detection robot},\nyear = {2009},\nvolume = {43},\nnumber = {3},\nabstract = {Humanitarian demining is a labor-intensive and high-risk task which could benefit from the development of a humanitarian mine detection robot, capable of scanning a minefield semi-automatically. The design of such an outdoor autonomous robot requires the consideration and integration of multiple aspects: sensing, data fusion, path and motion planning and robot control embedded in a control and software architecture. 
This paper focuses on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.},\npages = {295--314},\nproject = {Mobiniss, ViewFinder},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2009\/doc-article-hermes.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2008<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    D. Doroftei, E. Colon, and G. De Cubber, &#8220;A Behaviour-Based Control and Software Architecture for the Visually Guided Robudem Outdoor Mobile Robot,\" <span style=\"font-style: italic\">Journal of Automation Mobile Robotics and Intelligent Systems<\/span>, vol. 2, iss. 4, p. 19\u201324, 2008.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_48\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_48\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/XXX JAMRIS No8 - Doroftei.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_48_block\">\n<p>The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi-autonomous outdoor robot for risky interventions. 
This paper focuses on three main aspects of the design process: visual sensing using stereo vision and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_48_block\">\n<pre><code class=\"tex bibtex\">@Article{doroftei2008behaviour,\nauthor = {Doroftei, Daniela and Colon, Eric and De Cubber, Geert},\njournal = {Journal of Automation Mobile Robotics and Intelligent Systems},\ntitle = {A Behaviour-Based Control and Software Architecture for the Visually Guided Robudem Outdoor Mobile Robot},\nyear = {2008},\nissn = {1897-8649},\nmonth = oct,\nnumber = {4},\npages = {19--24},\nvolume = {2},\nabstract = {The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi-autonomous outdoor robot for risky interventions. This paper focuses on three main aspects of the design process: visual sensing using stereo vision and image motion analysis, design of a behaviour-based control architecture and implementation of a modular software architecture.},\nproject = {ViewFinder, Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/XXX JAMRIS No8 - Doroftei.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, L. Nalpantidis, G. C. Sirakoulis, and A. Gasteratos, &#8220;Intelligent robots need intelligent vision: visual 3D perception,\" in <span style=\"font-style: italic\">RISE\u201908: Proceedings of the EURON\/IARP International Workshop on Robotics for Risky Interventions and Surveillance of the Environment<\/span>, Benicassim, Spain,  2008.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_50\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_50\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/DeCubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_50_block\">\n<p>In this paper, we investigate the possibilities of stereo and structure from motion approaches. It is not the aim to compare both theories of depth reconstruction with the goal of designating a winner and a loser. Both methods are capable of providing sparse as well as dense 3D reconstructions and both approaches have their merits and defects. The thorough, year-long research in the field indicates that accurate depth perception requires a combination of methods rather than a sole one. In fact, cognitive research has shown that the human brain uses no less than 12 different cues to estimate depth. Therefore, we also finally introduce in a following section a methodology to integrate stereo and structure from motion.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_50_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2008intelligent,\nauthor = {De Cubber, Geert and Nalpantidis, Lazaros and Sirakoulis, Georgios Ch and Gasteratos, Antonios},\nbooktitle = {RISE\u201908: Proceedings of the EURON\/IARP International Workshop on Robotics for Risky Interventions and Surveillance of the Environment},\ntitle = {Intelligent robots need intelligent vision: visual {3D} perception},\nyear = {2008},\nabstract = {In this paper, we investigate the possibilities of stereo and structure from motion approaches. It is not the aim to compare both theories of depth reconstruction with the goal of designating a winner and a loser. 
Both methods are capable of providing sparse as well as dense 3D reconstructions and both approaches have their merits and defects. The thorough, year-long research in the field indicates that accurate depth perception requires a combination of methods rather than a sole one. In fact, cognitive research has shown that the human brain uses no less than 12 different cues to estimate depth. Therefore, we also finally introduce in a following section a methodology to integrate stereo and structure from motion.},\nproject = {ViewFinder, Mobiniss},\naddress = {Benicassim, Spain},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/DeCubber.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, D. Doroftei, and G. Marton, &#8220;Development of a visually guided mobile robot for environmental observation as an aid for outdoor crisis management operations,\" in <span style=\"font-style: italic\">Proceedings of the IARP Workshop on Environmental Maintenance and Protection<\/span>, Baden Baden, Germany,  2008.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_56\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_56\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/environmental observation as an aid for outdoor crisis management operations.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_56_block\">\n<p>To solve these issues, an outdoor mobile robotic platform was equipped with a differential GPS system for accurate geo-registered positioning, and a stereo vision system. This stereo vision system serves two purposes: 1) victim detection and 2) obstacle detection and avoidance. For semi-autonomous robot control and navigation, we rely on a behavior-based robot motion and path planner. 
In this paper, we present each of the three main aspects (victim detection, stereo-based obstacle detection and behavior-based navigation) of the general robot control architecture more in detail.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_56_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2008development,\nauthor = {De Cubber, Geert and Doroftei, Daniela and Marton, Gabor},\nbooktitle = {Proceedings of the IARP Workshop on Environmental Maintenance and Protection},\ntitle = {Development of a visually guided mobile robot for environmental observation as an aid for outdoor crisis management operations},\nyear = {2008},\nabstract = {To solve these issues, an outdoor mobile robotic platform was equipped with a differential GPS system for accurate geo-registered positioning, and a stereo vision system. This stereo vision system serves two purposes: 1) victim detection and 2) obstacle detection and avoidance. For semi-autonomous robot control and navigation, we rely on a behavior-based robot motion and path planner. In this paper, we present each of the three main aspects (victim detection, stereo-based obstacle detection and behavior-based navigation) of the general robot control architecture more in detail.},\nproject = {ViewFinder, Mobiniss},\naddress = {Baden Baden, Germany},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/environmental observation as an aid for outdoor crisis management operations.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, &#8220;Dense 3D structure and motion estimation as an aid for robot navigation,\" <span style=\"font-style: italic\">Journal of Automation Mobile Robotics and Intelligent Systems<\/span>, vol. 2, iss. 4, p. 14\u201318, 2008.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_137\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_137\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/www.jamris.org\/images\/ISSUES\/ISSUE-2008-04\/002 JAMRIS No8 - De Cubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_137_block\">\n<p>Three-dimensional scene reconstruction is an important tool in many applications varying from computer graphics to mobile robot navigation. In this paper, we focus on the robotics application, where the goal is to estimate the 3D rigid motion of a mobile robot and to reconstruct a dense three-dimensional scene representation. The reconstruction problem can be subdivided into a number of subproblems. First, the egomotion has to be estimated. For this, the camera (or robot) motion parameters are iteratively estimated by reconstruction of the epipolar geometry. Secondly, a dense depth map is calculated by fusing sparse depth information from point features and dense motion information from the optical flow in a variational framework. This depth map corresponds to a point cloud in 3D space, which can then be converted into a model to extract information for the robot navigation algorithm. 
Here, we present an integrated approach for the structure and egomotion estimation problem.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_137_block\">\n<pre><code class=\"tex bibtex\">@Article{DeCubber2008,\nauthor = {De Cubber, Geert},\njournal = {Journal of Automation Mobile Robotics and Intelligent Systems},\ntitle = {Dense {3D} structure and motion estimation as an aid for robot navigation},\nyear = {2008},\nissn = {1897-8649},\nmonth = oct,\nnumber = {4},\npages = {14--18},\nvolume = {2},\nabstract = {Three-dimensional scene reconstruction is an important tool in many applications varying from computer graphics to mobile robot navigation. In this paper, we focus on the robotics application, where the goal is to estimate the 3D rigid motion of a mobile robot and to reconstruct a dense three-dimensional scene representation. The reconstruction problem can be subdivided into a number of subproblems. First, the egomotion has to be estimated. For this, the camera (or robot) motion parameters are iteratively estimated by reconstruction of the epipolar geometry. Secondly, a dense depth map is calculated by fusing sparse depth information from point features and dense motion information from the optical flow in a variational framework. This depth map corresponds to a point cloud in 3D space, which can then be converted into a model to extract information for the robot navigation algorithm. Here, we present an integrated approach for the structure and egomotion estimation problem.},\nproject = {ViewFinder,Mobiniss},\nurl = {http:\/\/www.jamris.org\/images\/ISSUES\/ISSUE-2008-04\/002 JAMRIS No8 - De Cubber.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and Y. Baudoin, &#8220;Development of a semi-autonomous De-mining vehicle,\" in <span style=\"font-style: italic\">7th IARP Workshop HUDEM2008<\/span>, Cairo, Egypt,  2008.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_143\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_143\" class=\"papercite_toggle\">[Abstract]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_143_block\">\n<p>The paper describes the development of a semi-autonomous de-mining vehicle.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_143_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2008development,\nauthor = {Doroftei, Daniela and Baudoin, Yvan},\nbooktitle = {7th {IARP} Workshop {HUDEM}2008},\ntitle = {Development of a semi-autonomous De-mining vehicle},\nyear = {2008},\nabstract = {The paper describes the development of a semi-autonomous de-mining vehicle.},\naddress = {Cairo, Egypt},\nproject = {Mobiniss},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei and J. Bedkowski, &#8220;Towards the autonomous navigation of robots for risky interventions,\" in <span style=\"font-style: italic\">Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance RISE<\/span>, Benicassim, Spain,  2008.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_144\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_144\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2008\/Doroftei.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_144_block\">\n<p>In the course of the ViewFinder project, two robotics teams (RMA and PIAP) are working on the development of an intelligent autonomous mobile robot. 
This paper reports on the progress of both teams.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_144_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2008towards,\nauthor = {Doroftei, Daniela and Bedkowski, Janusz},\nbooktitle = {Third International Workshop on Robotics for risky interventions and Environmental Surveillance-Maintenance {RISE}},\ntitle = {Towards the autonomous navigation of robots for risky interventions},\nyear = {2008},\nabstract = {In the course of the ViewFinder project, two robotics teams (RMA and PIAP) are working on the development of an intelligent autonomous mobile robot. This paper reports on the progress of both teams.},\nproject = {ViewFinder, Mobiniss},\naddress = {Benicassim, Spain},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2008\/Doroftei.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2007<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    E. Colon, G. De Cubber, H. Ping, J. Habumuremyi, H. Sahli, and Y. Baudoin, &#8220;Integrated Robotic systems for Humanitarian Demining,\" <span style=\"font-style: italic\">International Journal of Advanced Robotic Systems<\/span>, vol. 4, iss. 2, p. 24, 2007.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_47\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_47\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2007\/10.1.1.691.7544.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.5772\/5694' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_47_block\">\n<p>This paper summarises the main results of 10 years of research and development in Humanitarian Demining. 
The Hudem project focuses on mine detection systems and aims at providing different solutions to support the mine detection operations. Robots using different kinds of locomotion systems have been designed and tested on dummy minefields. In order to control these robots, software interfaces, control algorithms, visual positioning and terrain following systems have also been developed. Typical data acquisition results obtained during trial campaigns with robots and data acquisition systems are reported. Lessons learned during the project and future work conclude this paper.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_47_block\">\n<pre><code class=\"tex bibtex\">@Article{colon2007integrated,\nauthor = {Colon, Eric and De Cubber, Geert and Ping, Hong and Habumuremyi, Jean-Claude and Sahli, Hichem and Baudoin, Yvan},\njournal = {International Journal of Advanced Robotic Systems},\ntitle = {Integrated Robotic systems for Humanitarian Demining},\nyear = {2007},\nmonth = jun,\nnumber = {2},\npages = {24},\nvolume = {4},\nabstract = {This paper summarises the main results of 10 years of research and development in Humanitarian Demining. The Hudem project focuses on mine detection systems and aims at providing different solutions to support the mine detection operations. Robots using different kinds of locomotion systems have been designed and tested on dummy minefields. In order to control these robots, software interfaces, control algorithms, visual positioning and terrain following systems have also been developed. Typical data acquisition results obtained during trial campaigns with robots and data acquisition systems are reported. Lessons learned during the project and future work conclude this paper.},\ndoi = {10.5772\/5694},\npublisher = {{SAGE} Publications},\nproject = {Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2007\/10.1.1.691.7544.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. 
De Cubber, &#8220;Dense 3D structure and motion estimation as an aid for robot navigation,\" in <span style=\"font-style: italic\">ISMCR 2007<\/span>, Warsaw, Poland,  2007.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_126\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_126\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2007\/Dense 3D Structure and Motion Estimation as an aid for Robot Navigation.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_126_block\">\n<p>Three-dimensional scene reconstruction is an important tool in many applications varying from computer graphics to mobile robot navigation. In this paper, we focus on the robotics application, where the goal is to estimate the 3D rigid motion of a mobile robot and to reconstruct a dense three-dimensional scene representation. The reconstruction problem can be subdivided into a number of subproblems. First, the egomotion has to be estimated. For this, the camera (or robot) motion parameters are iteratively estimated by reconstruction of the epipolar geometry. Secondly, a dense depth map is calculated by fusing sparse depth information from point features and dense motion information from the optical flow in a variational framework. This depth map corresponds to a point cloud in 3D space, which can then be converted into a model to extract information for the robot navigation algorithm. 
Here, we present an integrated approach for the structure and egomotion estimation problem.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_126_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2007dense,\nauthor = {De Cubber, Geert},\nbooktitle = {ISMCR 2007},\ntitle = {Dense {3D} structure and motion estimation as an aid for robot navigation},\nyear = {2007},\nabstract = {Three-dimensional scene reconstruction is an important tool in many applications varying from computer graphics to mobile robot navigation. In this paper, we focus on the robotics application, where the goal is to estimate the 3D rigid motion of a mobile robot and to reconstruct a dense three-dimensional scene representation. The reconstruction problem can be subdivided into a number of subproblems. First, the egomotion has to be estimated. For this, the camera (or robot) motion parameters are iteratively estimated by reconstruction of the epipolar geometry. Secondly, a dense depth map is calculated by fusing sparse depth information from point features and dense motion information from the optical flow in a variational framework. This depth map corresponds to a point cloud in 3D space, which can then be converted into a model to extract information for the robot navigation algorithm. Here, we present an integrated approach for the structure and egomotion estimation problem.},\nproject = {ViewFinder,Mobiniss},\naddress = {Warsaw, Poland},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2007\/Dense 3D Structure and Motion Estimation as an aid for Robot Navigation.pdf},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, and G. De Cubber, &#8220;A behaviour-based control and software architecture for the visually guided Robudem outdoor mobile robot,\" in <span style=\"font-style: italic\">ISMCR 2007<\/span>, Warsaw, Poland,  2007.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_127\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_127\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2007\/Doroftei_ISMCR07.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_127_block\">\n<p>The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi\u2010autonomous outdoor robot for risky interventions. This paper focuses mainly on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour\u2010based control architecture and implementation of a modular software architecture.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_127_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doroftei2007behaviour,\nauthor = {Doroftei, Daniela and Colon, Eric and De Cubber, Geert},\nbooktitle = {ISMCR 2007},\ntitle = {A behaviour-based control and software architecture for the visually guided {Robudem} outdoor mobile robot},\nyear = {2007},\naddress = {Warsaw, Poland},\nabstract = {The design of outdoor autonomous robots requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. This paper describes partial aspects of this research work, which is aimed at developing a semi\u2010autonomous outdoor robot for risky interventions. 
This paper focuses mainly on three main aspects of the design process: visual sensing using stereo and image motion analysis, design of a behaviour\u2010based control architecture and implementation of a modular software architecture.},\nproject = {ViewFinder,Mobiniss},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2007\/Doroftei_ISMCR07.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, Y. Baudoin, and H. Sahli, &#8220;Development of a semi-autonomous off-road vehicle,\" in <span style=\"font-style: italic\">IEEE HuMan&#8217;07&#8217;<\/span>, Timimoun, Algeria,  2007, p. 340\u2013343.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_145\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_145\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2007\/Development_of_a_semi-autonomous_off-road_vehicle.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_145_block\">\n<p>Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan semi-automatically a minefield. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. 
This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_145_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2007development,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan and Sahli, Hichem},\nbooktitle = {{IEEE} {HuMan}'07'},\ntitle = {Development of a semi-autonomous off-road vehicle},\nyear = {2007},\naddress = {Timimoun, Algeria},\npages = {340--343},\nabstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan semi-automatically a minefield. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.},\nproject = {Mobiniss, ViewFinder},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2007\/Development_of_a_semi-autonomous_off-road_vehicle.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2006<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    S. A. Berrabah, G. De Cubber, V. Enescu, and H. Sahli, &#8220;MRF-Based Foreground Detection in Image Sequences from a Moving Camera,\" in <span style=\"font-style: italic\">2006 International Conference on Image Processing<\/span>, Atlanta, USA,  2006, p. 1125\u20131128.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_41\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_41\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/ieeexplore.ieee.org\/xpls\/abs_all.jsp?arnumber=4106732\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1109\/icip.2006.312754' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_41_block\">\n<p>This paper presents a Bayesian approach for simultaneously detecting the moving objects (foregrounds) and estimating their motion in image sequences taken with a moving camera mounted on the top of a mobile robot. To model the background, the algorithm uses the GMM approach for its simplicity and capability to adapt to illumination changes and small motions in the scene. To overcome the limitations of the GMM approach with its pixel-wise processing, the background model is combined with the motion cue in a maximum a posteriori probability (MAP)-MRF framework. This enables us to exploit the advantages of spatio-temporal dependencies that moving objects impose on pixels and the interdependence of motion and segmentation fields. As a result, the detected moving objects have visually attractive silhouettes and they are more accurate and less affected by noise than those obtained with simple pixel-wise methods. To enhance the segmentation accuracy, the background model is re-updated using the MAP-MRF results. 
Experimental results and a qualitative study of the proposed approach are presented on image sequences with a static camera as well as with a moving camera.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_41_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{berrabah2006mrf,\nauthor = {Berrabah, Sid Ahmed and De Cubber, Geert and Enescu, Valentin and Sahli, Hichem},\nbooktitle = {2006 International Conference on Image Processing},\ntitle = {{MRF}-Based Foreground Detection in Image Sequences from a Moving Camera},\nyear = {2006},\nmonth = oct,\norganization = {IEEE},\npages = {1125--1128},\npublisher = {{IEEE}},\nabstract = {This paper presents a Bayesian approach for simultaneously detecting the moving objects (foregrounds) and estimating their motion in image sequences taken with a moving camera mounted on the top of a mobile robot. To model the background, the algorithm uses the GMM approach for its simplicity and capability to adapt to illumination changes and small motions in the scene. To overcome the limitations of the GMM approach with its pixel-wise processing, the background model is combined with the motion cue in a maximum a posteriori probability (MAP)-MRF framework. This enables us to exploit the advantages of spatio-temporal dependencies that moving objects impose on pixels and the interdependence of motion and segmentation fields. As a result, the detected moving objects have visually attractive silhouettes and they are more accurate and less affected by noise than those obtained with simple pixel-wise methods. To enhance the segmentation accuracy, the background model is re-updated using the MAP-MRF results. 
Experimental results and a qualitative study of the proposed approach are presented on image sequences with a static camera as well as with a moving camera.},\ndoi = {10.1109\/icip.2006.312754},\nproject = {MOBINISS,ViewFinder},\naddress = {Atlanta, USA},\nurl = {http:\/\/ieeexplore.ieee.org\/xpls\/abs_all.jsp?arnumber=4106732},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, V. Enescu, H. Sahli, E. Demeester, M. Nuttin, and D. Vanhooydonck, &#8220;Active stereo vision-based mobile robot navigation for person tracking,\" <span style=\"font-style: italic\">Integrated Computer-Aided Engineering<\/span>, vol. 13, p. 203\u2013222, 2006.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_43\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_43\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/cris.vub.be\/en\/publications\/active-stereo-visionbased-mobile-robot-navigation-for-person-tracking(2c2cd28d-2aea-4009-ae01-35448c005050)\/export.html\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_43_block\">\n<p>In this paper, we propose a mobile robot architecture for person tracking, consisting of an active stereo vision module (ASVM) and a navigation module (NM). The first uses a stereo head equipped with a pan-tilt mechanism to track a moving target (selected by an operator) and keep it centered in the visual field. Its output, i.e. the 3D position of the person, is fed to the NM, which drives the robot towards the target while avoiding obstacles. For this, a hybrid navigation algorithm is adopted with a reactive part that efficiently reacts to the most recent sensor data, and a deliberative part that generates a globally optimal path to a target destination, such as the person&#8217;s location. 
As a peculiarity of the system, there is no feedback from the NM or the robot motion controller (RMC) to the ASVM. While this imparts flexibility in combining the ASVM with a wide range of robot platforms, it puts considerable strain on the ASVM. Indeed, besides the changes in the target dynamics, it has to cope with the robot motion during obstacle avoidance. These disturbances are accommodated via a suitable stochastic dynamic model for the stereo head-target system. Robust tracking is achieved by combining a color-based particle filter with a method to update the color model of the target under changing illumination conditions. The main contributions of this paper lie in (1) devising a robust color-based 3D target tracking method, (2) proposing a hybrid deliberative\/reactive navigation scheme, and (3) integrating them on a wheelchair platform for the final goal of person following. Experimental results are presented for ASVM separately and in combination with a wheelchair platform-based implementation of the NM.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_43_block\">\n<pre><code class=\"tex bibtex\">@Article{2c2cd28d2aea4009ae0135448c005050,\nauthor = {De Cubber, Geert and Valentin Enescu and Hichem Sahli and Eric Demeester and Marnix Nuttin and Dirk Vanhooydonck},\njournal = {Integrated Computer-Aided Engineering},\ntitle = {Active stereo vision-based mobile robot navigation for person tracking},\nyear = {2006},\nissn = {1069-2509},\nmonth = jul,\npages = {203--222},\nvolume = {13},\nabstract = {In this paper, we propose a mobile robot architecture for person tracking, consisting of an active stereo vision module (ASVM) and a navigation module (NM). The first uses a stereo head equipped with a pan-tilt mechanism to track a moving target (selected by an operator) and keep it centered in the visual field. Its output, i.e. 
the 3D position of the person, is fed to the NM, which drives the robot towards the target while avoiding obstacles. For this, a hybrid navigation algorithm is adopted with a reactive part that efficiently reacts to the most recent sensor data, and a deliberative part that generates a globally optimal path to a target destination, such as the person's location. As a peculiarity of the system, there is no feedback from the NM or the robot motion controller (RMC) to the ASVM. While this imparts flexibility in combining the ASVM with a wide range of robot platforms, it puts considerable strain on the ASVM. Indeed, besides the changes in the target dynamics, it has to cope with the robot motion during obstacle avoidance. These disturbances are accommodated via a suitable stochastic dynamic model for the stereo head-target system. Robust tracking is achieved by combining a color-based particle filter with a method to update the color model of the target under changing illumination conditions. The main contributions of this paper lie in (1) devising a robust color-based 3D target tracking method, (2) proposing a hybrid deliberative\/reactive navigation scheme, and (3) integrating them on a wheelchair platform for the final goal of person following. Experimental results are presented for ASVM separately and in combination with a wheelchair platform-based implementation of the NM.},\nday = {24},\nkeywords = {mobile robot, active vision, stereo, navigation},\nlanguage = {English},\nproject = {Mobiniss, ViewFinder},\npublisher = {IOS Press},\nunit= {meca-ras,vub-etro},\nurl = {https:\/\/cris.vub.be\/en\/publications\/active-stereo-visionbased-mobile-robot-navigation-for-person-tracking(2c2cd28d-2aea-4009-ae01-35448c005050)\/export.html},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    K. Cauwerts, G. De Cubber, T. Geerinck, W. Mattheyses, I. Ravyse, H. Sahli, M. Shami, P. Soens, W. Verhelst, and P. 
Verhoeve, &#8220;Audio-Visual Signal Processing: Speech and emotion processing for human-machine interaction,\" in <span style=\"font-style: italic\">Second annual IEEE BENELUX\/DSP Valley Signal Processing Symposium (SPS-DARTS 2006)<\/span>, Brussels, Belgium,  2006.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_102\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.semanticscholar.org\/paper\/Audio-Visual-Signal-Processing:-Speech-and-emotion-Cauwerts-Cubber\/c6cc775bfc9f5528c8c889d32af53566f1ae8415\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_102_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{cauwerts2006audio,\nauthor = {Cauwerts, Kenny and De Cubber, Geert and Geerinck, Thomas and Mattheyses, W and Ravyse, Ilse and Sahli, Hichem and Shami, M and Soens, P and Verhelst, Werner and Verhoeve, P},\nbooktitle = {Second annual IEEE BENELUX\/DSP Valley Signal Processing Symposium (SPS-DARTS 2006)},\ntitle = {Audio-Visual Signal Processing: Speech and emotion processing for human-machine interaction},\nyear = {2006},\naddress = {Brussels, Belgium},\nunit= {meca-ras},\nurl = {https:\/\/www.semanticscholar.org\/paper\/Audio-Visual-Signal-Processing:-Speech-and-emotion-Cauwerts-Cubber\/c6cc775bfc9f5528c8c889d32af53566f1ae8415},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, and Y. Baudoin, &#8220;A modular control architecture for semi-autonomous navigation,\" in <span style=\"font-style: italic\">CLAWAR 2006<\/span>, Brussels, Belgium,  2006, p. 712\u2013715.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_146\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_146\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2006\/Clawar2006_Doroftei_colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_146_block\">\n<p>Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan semi-automatically a minefield. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_146_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2006modular,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan},\nbooktitle = {{CLAWAR} 2006},\ntitle = {A modular control architecture for semi-autonomous navigation},\nyear = {2006},\npages = {712--715},\nabstract = {Humanitarian demining is still a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan semi-automatically a minefield. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. 
This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. },\nproject = {Mobiniss, ViewFinder},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2006\/Clawar2006_Doroftei_colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    D. Doroftei, E. Colon, and Y. Baudoin, &#8220;Development of a control architecture for the ROBUDEM outdoor mobile robot platform,\" in <span style=\"font-style: italic\">IARP Workshop RISE 2006<\/span>, Brussels, Belgium,  2006.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_147\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_147\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2006\/IARPWS2006_Doroftei_Colon.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_147_block\">\n<p>Humanitarian demining still is a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. 
This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_147_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{doro2006development,\nauthor = {Doroftei, Daniela and Colon, Eric and Baudoin, Yvan},\nbooktitle = {{IARP} Workshop {RISE} 2006},\ntitle = {Development of a control architecture for the ROBUDEM outdoor mobile robot platform},\nyear = {2006},\nabstract = {Humanitarian demining still is a highly labor-intensive and high-risk operation. Advanced sensors and mechanical aids can significantly reduce the demining time. In this context, it is the aim to develop a humanitarian demining mobile robot which is able to scan a minefield semi-automatically. This paper discusses the development of a control scheme for such a semi-autonomous mobile robot for humanitarian demining. This process requires the careful consideration and integration of multiple aspects: sensors and sensor data fusion, design of a control and software architecture, design of a path planning algorithm and robot control. },\nproject = {Mobiniss, ViewFinder},\naddress = {Brussels, Belgium},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2006\/IARPWS2006_Doroftei_Colon.pdf},\nunit= {meca-ras}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2005<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    V. Enescu, G. De Cubber, H. Sahli, E. Demeester, D. Vanhooydonck, and M. Nuttin, &#8220;Active stereo vision-based mobile robot navigation for person tracking,\" in <span style=\"font-style: italic\">International Conference on Informatics in Control, Automation and Robotics<\/span>, Barcelona, Spain,  2005, p. 32\u201339.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_128\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_128\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2005\/f969ee9e1169623340aa409f539fddb9c413.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.3233\/ica-2006-13302' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_128_block\">\n<p>In this paper, we propose a mobile robot architecture for person tracking, consisting of an active stereo vision module (ASVM) and a navigation module (NM). The first tracks the person in stereo images and controls the pan\/tilt unit to keep the target in the visual field. Its output, i.e. the 3D position of the person, is fed to the NM, which drives the robot towards the target while avoiding obstacles. As a peculiarity of the system, there is no feedback from the NM or the robot motion controller (RMC) to the ASVM. While this imparts flexibility in combining the ASVM with a wide range of robot platforms, it puts considerable strain on the ASVM. Indeed, besides the changes in the target dynamics, it has to cope with the robot motion during obstacle avoidance. These disturbances are accommodated by generating target location hypotheses in an efficient manner. Robustness against outliers and occlusions is achieved by employing a multi-hypothesis tracking method &#8211; the particle filter &#8211; based on a color model of the target. Moreover, to deal with illumination changes, the system adaptively updates the color model of the target. 
The main contributions of this paper lie in (1) devising a stereo, color-based target tracking method using the stereo geometry constraint and (2) integrating it with a robotic agent in a loosely coupled manner.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_128_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{enescu2005active,\nauthor = {Enescu, Valentin and De Cubber, Geert and Sahli, Hichem and Demeester, Eric and Vanhooydonck, Dirk and Nuttin, Marnix},\nbooktitle = {International Conference on Informatics in Control, Automation and Robotics},\ntitle = {Active stereo vision-based mobile robot navigation for person tracking},\nyear = {2005},\naddress = {Barcelona, Spain},\nmonth = sep,\npages = {32--39},\nabstract = {In this paper, we propose a mobile robot architecture for person tracking, consisting of an active stereo vision module (ASVM) and a navigation module (NM). The first tracks the person in stereo images and controls the pan\/tilt unit to keep the target in the visual field. Its output, i.e. the 3D position of the person, is fed to the NM, which drives the robot towards the target while avoiding obstacles. As a peculiarity of the system, there is no feedback from the NM or the robot motion controller (RMC) to the ASVM. While this imparts flexibility in combining the ASVM with a wide range of robot platforms, it puts considerable strain on the ASVM. Indeed, besides the changes in the target dynamics, it has to cope with the robot motion during obstacle avoidance. These disturbances are accommodated by generating target location hypotheses in an efficient manner. Robustness against outliers and occlusions is achieved by employing a multi-hypothesis tracking method - the particle filter - based on a color model of the target. Moreover, to deal with illumination changes, the system adaptively updates the color model of the target. 
The main contributions of this paper lie in (1) devising a stereo, color-based target tracking method using the stereo geometry constraint and (2) integrating\nit with a robotic agent in a loosely coupled manner.},\nproject = {Mobiniss, ViewFinder},\ndoi = {10.3233\/ica-2006-13302},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2005\/f969ee9e1169623340aa409f539fddb9c413.pdf},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2004<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, S. A. Berrabah, and H. Sahli, &#8220;Color-based visual servoing under varying illumination conditions,\" <span style=\"font-style: italic\">Robotics and Autonomous Systems<\/span>, vol. 47, iss. 4, p. 225\u2013249, 2004.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_42\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_42\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/abs\/pii\/S0921889004000570\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>         <a href='http:\/\/dx.doi.org\/10.1016\/j.robot.2004.03.015' class='papercite_doi' title='View on publisher site'>[DOI]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_42_block\">\n<p>Visual servoing, or the control of motion on the basis of image analysis in a closed loop, is more and more recognized as an important tool in modern robotics. Here, we present a new model driven approach to derive a description of the motion of a target object. This method can be subdivided into an illumination invariant target detection stage and a servoing process which uses an adaptive Kalman filter to update the model of the non-linear system. 
This technique can be applied to any pan tilt zoom camera mounted on a mobile vehicle as well as to a static camera tracking moving environmental features.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_42_block\">\n<pre><code class=\"tex bibtex\">@Article{de2004color,\nauthor = {De Cubber, Geert and Berrabah, Sid Ahmed and Sahli, Hichem},\njournal = {Robotics and Autonomous Systems},\ntitle = {Color-based visual servoing under varying illumination conditions},\nyear = {2004},\nmonth = jul,\nnumber = {4},\npages = {225--249},\nvolume = {47},\nabstract = {Visual servoing, or the control of motion on the basis of image analysis in a closed loop, is more and more recognized as an important tool in modern robotics. Here, we present a new model driven approach to derive a description of the motion of a target object. This method can be subdivided into an illumination invariant target detection stage and a servoing process which uses an adaptive Kalman filter to update the model of the non-linear system. This technique can be applied to any pan tilt zoom camera mounted on a mobile vehicle as well as to a static camera tracking moving environmental features.},\ndoi = {10.1016\/j.robot.2004.03.015},\npublisher = {Elsevier {BV}},\nproject = {Mobiniss},\nurl = {https:\/\/www.sciencedirect.com\/science\/article\/abs\/pii\/S0921889004000570},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2003<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, H. Sahli, E. Colon, and Y. Baudoin, &#8220;Visual Servoing under Changing Illumination Conditions,\" in <span style=\"font-style: italic\">Proc. International Workshop on Attention and Performance in Computer Vision (ICVS03)<\/span>, Graz, Austria,  2003, p. 47\u201354.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_55\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_55\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2003\/ICVS03_Geert.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_55_block\">\n<p>Visual servoing, or the control of motion on the basis of image analysis in a closed loop, is more and more recognized as an important tool in modern robotics. In this paper, we present a new model-driven approach to derive a description of the motion of a target object. This method can be subdivided into an illumination invariant target detection stage and a servoing process which uses an adaptive Kalman filter to update the model of the nonlinear system. This technique can be applied to any pan-tilt-zoom camera mounted on a mobile vehicle as well as to a static camera tracking moving environmental features<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_55_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2003visual,\nauthor = {De Cubber, Geert and Sahli, Hichem and Colon, Eric and Baudoin, Yvan},\nbooktitle = {Proc. International Workshop on Attention and Performance in Computer Vision (ICVS03)},\ntitle = {Visual Servoing under Changing Illumination Conditions},\nyear = {2003},\npages = {47--54},\naddress = {Graz, Austria},\nabstract = {Visual servoing, or the control of motion on the basis of image analysis in a closed loop, is more and more recognized as an important tool in modern robotics. In this paper, we present a new model-driven approach to derive a description of the motion of a target object. This method can be subdivided into an illumination invariant target detection stage and a servoing process which uses an adaptive Kalman filter to update the model of the nonlinear system. 
This technique can be applied to any pan-tilt-zoom camera mounted on a mobile vehicle as well as to a static camera tracking moving environmental features},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2003\/ICVS03_Geert.pdf},\nproject = {Mobiniss},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, S. A. Berrabah, and H. Sahli, &#8220;A Bayesian Approach for Color Constancy based Visual Servoing,\" in <span style=\"font-style: italic\">11th International Conference on Advanced Robotics<\/span>, Coimbra, Portugal,  2003.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_65\" class=\"papercite_toggle\">[BibTeX]<\/a>         <a href=\"https:\/\/www.semanticscholar.org\/paper\/A-Bayesian-Approach-for-Color-Constancy-based-Cubber-Berrabah\/ed5636626e307f2b8d0c5f4fcc79d5d54a9cc639\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a>\n<div class=\"papercite_bibtex\" id=\"papercite_65_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2003bayesian,\nauthor = {De Cubber, Geert and Berrabah, Sid Ahmed and Sahli, Hichem},\nbooktitle = {11th International Conference on Advanced Robotics},\ntitle = {A Bayesian Approach for Color Constancy based Visual Servoing},\nyear = {2003},\naddress = {Coimbra, Portugal},\nunit= {meca-ras,vub-etro},\nproject = {Mobiniss},\nurl = {https:\/\/www.semanticscholar.org\/paper\/A-Bayesian-Approach-for-Color-Constancy-based-Cubber-Berrabah\/ed5636626e307f2b8d0c5f4fcc79d5d54a9cc639},\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2002<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, H. Sahli, and F. Decroos, &#8220;Sensor Integration on a Mobile Robot,\" in <span style=\"font-style: italic\">ISMCR 2002: Proc. 12th Int&#8217;l Symp. Measurement and Control in Robotics,<\/span>, Bourges, France,  2002.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_54\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_54\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2002\/Paper ISMCR'02 - Sensor Integration on a Mobile Robot.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_54_block\">\n<p>The purpose of this paper is to show an application of path planning for a mobile pneumatic robot. The robot is capable of searching for a specific target in the scene and navigating towards it, in an a priori unknown environment. To accomplish this task, the robot uses a colour pan-tilt camera and two ultrasonic sensors. As the camera is only used for target tracking, the robot is left with very incomplete sensor data with a high degree of uncertainty. To counter this, a fuzzy logic &#8211; based sensor fusion procedure is set up to aid the map building process in constructing a reliable environmental model. The significance of this work is that it shows that the use of fuzzy logic based fusion and potential field navigation can achieve good results for path planning<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_54_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2002sensor,\nauthor = {De Cubber, Geert and Sahli, Hichem and Decroos, Francis},\nbooktitle = {ISMCR 2002: Proc. 12th Int'l Symp. Measurement and Control in Robotics,},\ntitle = {Sensor Integration on a Mobile Robot},\nyear = {2002},\naddress = {Bourges, France},\nabstract = {The purpose of this paper is to show an application of path planning for a mobile pneumatic robot. The robot is capable of searching for a specific target in the scene and navigating towards it, in an a priori unknown environment. To accomplish this task, the robot uses a colour pan-tilt camera and two ultrasonic sensors. 
As the camera is only used for target tracking, the robot is left with very incomplete sensor data with a high degree of uncertainty. To counter this, a fuzzy logic - based sensor fusion procedure is set up to aid the map building process in constructing a reliable environmental model. The significance of this work is that it shows that the use of fuzzy logic based fusion and potential field navigation can achieve good results for path planning},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2002\/Paper ISMCR'02 - Sensor Integration on a Mobile Robot.pdf},\nproject = {Mobiniss},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<li>    G. De Cubber, H. Sahli, H. Ping, and E. Colon, &#8220;A Colour Constancy Approach for Illumination Invariant Colour Target Tracking,\" in <span style=\"font-style: italic\">IARP Workshop on Robots for Humanitarian Demining<\/span>, Vienna, Austria,  2002.    <br \/>   <a href=\"javascript:void(0)\" id=\"papercite_66\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_66\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2002\/Paper IARP - Geert De Cubber.pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_66_block\">\n<p>Many robotic agents use color vision to retrieve quality information about the environment. In this work, we present a visual servoing technique, where vision is the primary sensing modality and sensing is based upon the analysis of the perceived visual information. We describe how colored targets can be identified and how their position and motion can be estimated quickly and reliably. The visual servoing procedure is essentially a four-stage process, with color target identification, motion parameter estimation, target tracking and target position estimation. 
These individual parts add up to a global vision system enabling precise positioning for a demining robot.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_66_block\">\n<pre><code class=\"tex bibtex\">@InProceedings{de2002colour,\nauthor = {De Cubber, Geert and Sahli, Hichem and Ping, Hong and Colon, Eric},\nbooktitle = {IARP Workshop on Robots for Humanitarian Demining},\ntitle = {A Colour Constancy Approach for Illumination Invariant Colour Target Tracking},\nyear = {2002},\naddress = {Vienna, Austria},\nabstract = {Many robotic agents use color vision to retrieve quality information about the environment. In this work, we present a visual servoing technique, where vision is the primary sensing modality and sensing is based upon the analysis of the perceived visual information. We describe how colored targets can be identified and how their position and motion can be estimated quickly and reliably. The visual servoing procedure is essentially a four-stage process, with color target identification, motion parameter estimation, target tracking and target position estimation. These individual parts add up to a global vision system enabling precise positioning for a demining robot.},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2002\/Paper IARP - Geert De Cubber.pdf},\nproject = {Mobiniss},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<h3 class=\"papercite\">2001<\/h3>\n<ul class=\"papercite_bibliography\">\n<li>    G. De Cubber, &#8220;Integration of sensors on a mobile robot,\" Master Thesis, 2001.    
<br \/>   <a href=\"javascript:void(0)\" id=\"papercite_49\" class=\"papercite_toggle\">[BibTeX]<\/a>      <a href=\"javascript:void(0)\" id=\"papercite_abstract_49\" class=\"papercite_toggle\">[Abstract]<\/a>         <a href=\"http:\/\/mecatron.rma.ac.be\/pub\/2001\/ThesisText (2).pdf\" title='Download PDF' class='papercite_pdf'>[Download PDF]<\/a><br \/>\n<blockquote class=\"papercite_bibtex\" id=\"papercite_abstract_49_block\">\n<p>The final goal of this project is to add some sort of intelligence to an existing pneumatic mobile robot and, by doing this, to make the robot capable of walking towards a designated target in a complex and unknown environment with multiple obstacles, without any user interaction. To realise this goal, some sensory equipment was added to the robot, in particular two ultrasonic sensors and a camera. The camera has the specific task of following the target object and returning its position, whereas the ultrasonic sensors have the more general task of retrieving environmental information. This information, coming from the different sensors, is brought together and fused in an intelligent way by a sensor fusion procedure based upon the principles of fuzzy logic. In order to be able to navigate in its environment, the robot makes use of the acquired sensory data to build a map \u2013 more specifically a potential field map \u2013 as a means of representing its surroundings. This map is used to plan the path to be followed and the actions to be undertaken. 
A control program was written in order to gather and to coordinate all these different functions, making the robot capable of reaching the goals set up initially.<\/p>\n<\/blockquote>\n<div class=\"papercite_bibtex\" id=\"papercite_49_block\">\n<pre><code class=\"tex bibtex\">@MastersThesis{de2001integration,\nauthor = {De Cubber, Geert},\nschool = {Vrije Universiteit Brussel},\ntitle = {Integration of sensors on a mobile robot},\nyear = {2001},\nabstract = {The final goal of this project is to add some sort of intelligence to an existing pneumatic mobile robot and, by doing this, to make the robot capable of walking towards a designated target in a complex and unknown environment with multiple obstacles, without any user interaction. To realise this goal, some sensory equipment was added to the robot, in particular two ultrasonic sensors and a camera. The camera has the specific task of following the target object and returning its position, whereas the ultrasonic sensors have the more general task of retrieving environmental information. This information, coming from the different sensors, is brought together and fused in an intelligent way by a sensor fusion procedure based upon the principles of fuzzy logic. In order to be able to navigate in its environment, the robot makes use of the acquired sensory data to build a map \u2013 more specifically a potential field map \u2013 as a means of representing its surroundings. This map is used to plan the path to be followed and the actions to be undertaken. 
A control program was written in order to gather and to coordinate all these different functions, making the robot capable of reaching the goals set up initially},\npublisher = {Vrije Universiteit Brussel},\nurl = {http:\/\/mecatron.rma.ac.be\/pub\/2001\/ThesisText (2).pdf},\nunit= {meca-ras,vub-etro}\n}<\/code><\/pre>\n<\/p>\n<\/div>\n<\/li>\n<\/ul>\n<p>\n<\/div><\/div><\/div><\/div><\/div><\/section><\/p>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":250,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-3685","page","type-page","status-publish","has-post-thumbnail","hentry"],"_links":{"self":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3685","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/comments?post=3685"}],"version-history":[{"count":5,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3685\/revisions"}],"predecessor-version":[{"id":4660,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/pages\/3685\/revisions\/4660"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/media\/250"}],"wp:attachment":[{"href":"https:\/\/mecatron.rma.ac.be\/index.php\/wp-json\/wp\/v2\/media?parent=3685"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}