<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://introlab.3it.usherbrooke.ca/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Letd2801</id>
	<title>IntRoLab - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="http://introlab.3it.usherbrooke.ca/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Letd2801"/>
	<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php/Special:Contributions/Letd2801"/>
	<updated>2026-04-21T10:34:01Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.41.0</generator>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3520</id>
		<title>RTAB-Map</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3520"/>
		<updated>2024-05-28T12:48:57Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
&amp;lt;big&amp;gt;&lt;br /&gt;
&amp;lt;!-- &amp;lt;english&amp;gt;&lt;br /&gt;
[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Real-Time Appearance-Based Mapping&lt;br /&gt;
&amp;lt;/english&amp;gt; --&amp;gt;&lt;br /&gt;
&amp;lt;!-- &amp;lt;french&amp;gt; --&amp;gt;&lt;br /&gt;
[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Cartographie temps réel basée sur l&#039;apparence de l&#039;environnement &lt;br /&gt;
&amp;lt;!-- &amp;lt;/french&amp;gt; --&amp;gt;&lt;br /&gt;
&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;This page is about the loop closure detection approach used by RTAB-Map. For RGB-D mapping, visit [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
Loop closure detection is the process of finding a match between the current location and a previously visited one in SLAM (Simultaneous Localization And Mapping). &lt;br /&gt;
The time required to process each new observation increases with the size of the internal map, which may compromise real-time processing. &lt;br /&gt;
RTAB-Map is a novel real-time loop closure detection approach for large-scale and long-term SLAM. Our approach is based on efficient memory management that keeps the computation time for each new observation under a fixed time limit, thus respecting the real-time constraint during long-term operation. Results demonstrate the approach&#039;s adaptability and scalability on two custom data sets and ten standard data sets.&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
--&amp;gt;&lt;br /&gt;
&amp;lt;!-- &amp;lt;french&amp;gt; --&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;Cette page est à propos de l&#039;approche de détection de fermeture de boucle utilisée dans RTAB-Map. Pour la cartographie RGB-D, visitez [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
La détection de fermeture de boucle est le processus impliqué en SLAM (localisation et cartographie simultanées) lorsqu&#039;on tente de trouver une correspondance entre un endroit présent et un autre déjà visité. Plus la carte interne augmente en taille, plus le temps requis pour la détection de fermeture de boucle augmente, ce qui peut affecter le traitement en temps réel. RTAB-Map est une nouvelle approche de détection de fermeture de boucle fonctionnant en temps réel pour du SLAM à grande échelle et à long terme. Notre approche est basée sur une gestion efficace de la mémoire afin de garder le temps de calcul en dessous d&#039;un seuil de temps, respectant ainsi la limite de temps réel à long terme. En utilisant dix ensembles de données standards, notre propre ensemble de données dérivées d&#039;un parcours de plus de 2 km rassemblant des conditions diverses et notre ensemble de données montrant un parcours où le robot visite les mêmes endroits une centaine de fois, les résultats démontrent l&#039;adaptabilité et l&#039;extensibilité de notre approche.&lt;br /&gt;
&amp;lt;!-- &amp;lt;/french&amp;gt; --&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{{#ev:youtube|71eRxTc1DaU}}&lt;br /&gt;
{{#ev:youtube|CAk-QGMlQmI}}&lt;br /&gt;
{{#ev:youtube|AMLwjo80WzI}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Results ==&lt;br /&gt;
&#039;&#039;Note that these more recent results may differ from those in the presentation videos above.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Summary of the loop closures detected on the UdeS data set:&lt;br /&gt;
* Green: Loop closures detected&lt;br /&gt;
* Yellow: Loop closures rejected&lt;br /&gt;
* Red: Unable to detect a loop closure because old places could not be retrieved&lt;br /&gt;
&lt;br /&gt;
Figure 2: Processing time for each image acquired (real-time limit fixed to 700 ms for an image rate of 1 Hz)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall at 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduce the loop closure detection results&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visit the [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] wiki page on [http://github.com/introlab/rtabmap/wiki RTAB-Map&#039;s GitHub]. The ground truths can be downloaded below.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Videos&#039;&#039;&#039;&lt;br /&gt;
* Newer:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/center&amp;gt;&lt;br /&gt;
* Older:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Résultats ==&lt;br /&gt;
&#039;&#039;À noter que les résultats (plus récents) présentés ici peuvent différer de ceux dans le vidéo...&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Sommaire des détections de boucles sur l&#039;ensemble de données UdeS :&lt;br /&gt;
* Vert : Fermetures de boucle acceptées&lt;br /&gt;
* Jaune : Fermetures de boucle rejetées&lt;br /&gt;
* Rouge : Impossibilité de détecter une fermeture de boucle, car les anciens endroits n&#039;ont pu être récupérés&lt;br /&gt;
&lt;br /&gt;
Figure 2: Temps d&#039;exécution pour chaque itération (limite temps réel fixée à 700 ms pour un temps d&#039;acquisition de 1 seconde)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall à 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduire les résultats de détection de boucles&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visitez la page wiki [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] sur le [http://github.com/introlab/rtabmap/wiki GitHub de RTAB-Map]. Les &amp;quot;ground truths&amp;quot; peuvent être téléchargés en bas de la page.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Vidéos&#039;&#039;&#039;&lt;br /&gt;
* Nouveaux:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/center&amp;gt;&lt;br /&gt;
* Anciens:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Source code ==&lt;br /&gt;
The code was tested on Windows (XP, 7), Mac OS X 10.6 and Ubuntu 10.04 LTS.&lt;br /&gt;
* Standalone application, libraries and ROS packages : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images acquired in Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Code source ==&lt;br /&gt;
Le code a été testé sur Windows (XP, 7), Mac OS X 10.6 et Ubuntu 10.04 LTS.&lt;br /&gt;
* Logiciel &amp;quot;stand-alone&amp;quot;, bibliothèques logicielles et noeuds ROS : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images provenant de Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Data sets ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images at 1 Hz (1.5 hours). &lt;br /&gt;
* Images taken while walking a ~2 km loop, traversed twice.&lt;br /&gt;
* The data set contains indoor and outdoor environments.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 on Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|Rain!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Compare illumination and camera orientation with the next image...&lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Elevator door...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images at 1 Hz (7 hours).&lt;br /&gt;
* Images taken from the racing video game Need For Speed: Most Wanted.&lt;br /&gt;
* 2 areas visited about a hundred times each (100 traversals in area 1, then another 102 traversals in area 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Compare illumination with the next image...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Community&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Community data sets from other loop closure detection approaches:&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor and Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege and CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa and Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
Ground truths:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images at 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images at 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images at ~1 Hz (Note that we removed some images from the original data set to obtain an image rate of approximately 1 Hz)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images at 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images at 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images at 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images at ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images at ~1 Hz&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Ensembles de données ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images à 1 Hz (1,5 heures).&lt;br /&gt;
* Images prises en marchant sur un trajet de ~2 km, parcouru deux fois.&lt;br /&gt;
* L&#039;ensemble de données contient des images prises à l&#039;intérieur et à l&#039;extérieur.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 sur Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|De la pluie!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Comparer l&#039;illumination et l&#039;orientation de la caméra avec l&#039;image suivante... &lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Porte d&#039;ascenseur...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images à 1 Hz (7 heures).&lt;br /&gt;
* Images prises dans le jeu vidéo de course Need For Speed: Most Wanted.&lt;br /&gt;
* 2 zones ont été visitées une centaine de fois chacune (100 boucles dans la zone 1, puis 102 boucles dans la zone 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Comparer l&#039;illumination avec l&#039;image suivante...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Communauté&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Ensembles de données provenant d&#039;autres approches de détection de fermeture de boucle :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor et Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege et CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa et Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Ground truths&#039;&#039;:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images à ~0.5 Hz (les images de gauche et de droite fusionnées)&lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images à ~0.5 Hz (les images de gauche et de droite fusionnées) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images à 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images à 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images à ~1 Hz (Noter que nous avons enlevé des images de l&#039;ensemble de données original pour avoir une fréquence d&#039;acquisition d&#039;images d&#039;environ 1 Hz.)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images à 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images à 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images à 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images à ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images à ~1 Hz&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
# M. Labbé and F. Michaud, “Multi-Session Visual SLAM for Illumination-Invariant Re-Localization in Indoor Environments,” in &#039;&#039;Frontiers in Robotics and AI&#039;&#039;, vol. 9, 2022. ([https://arxiv.org/abs/2103.03827 pdf]) ([https://doi.org/10.3389/frobt.2022.801886 Frontiers])&lt;br /&gt;
# M. Labbé and F. Michaud, “RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation,” in &#039;&#039;Journal of Field Robotics&#039;&#039;, vol. 36, no. 2, pp. 416–446, 2019. ([[Media:Labbe18JFR_preprint.pdf|pdf]]) ([https://doi.org/10.1002/rob.21831 Wiley])&lt;br /&gt;
# M. Labbé and F. Michaud, “Long-term online multi-session graph-based SPLAM with memory management,” in &#039;&#039;Autonomous Robots&#039;&#039;, vol. 42, no. 6, pp. 1133-1150, 2017. ([[Media:LabbeAURO2017.pdf|pdf]]) ([http://dx.doi.org/10.1007/s10514-017-9682-5 Springer])&lt;br /&gt;
# M. Labbé and F. Michaud, “Online Global Loop Closure Detection for Large-Scale Multi-Session Graph-Based SLAM,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 2014. ([[Media:Labbe14-IROS.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/login.jsp?tp=&amp;amp;arnumber=6942926 IEEE Xplore])&lt;br /&gt;
# M. Labbé and F. Michaud, “Appearance-based loop closure detection in real-time for large-scale and long-term operation,” in &#039;&#039;IEEE Transactions on Robotics&#039;&#039;, vol. 29, no. 3, pp. 734-745, 2013. ([[Media:TRO2013.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6459608 IEEE Xplore])&lt;br /&gt;
# M. Labbé and F. Michaud, “Memory management for real-time appearance-based loop closure detection,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 2011. ([[Media:labbe11memory.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6094602 IEEE Xplore])&lt;br /&gt;
&lt;br /&gt;
==== Presentations ====&lt;br /&gt;
* M. Labbé, &amp;quot;Simultaneous Localization and Mapping (SLAM) with RTAB-Map&amp;quot;, Université Laval, Québec, November 2015 ([[Media:Labbe2015ULaval.pdf|slides pdf]])&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Team ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Équipe ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3519</id>
		<title>RTAB-Map</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3519"/>
		<updated>2024-05-28T12:45:39Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
&amp;lt;big&amp;gt;&lt;br /&gt;
&amp;lt;!-- &amp;lt;english&amp;gt;&lt;br /&gt;
[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Real-Time Appearance-Based Mapping&lt;br /&gt;
&amp;lt;/english&amp;gt; --&amp;gt;&lt;br /&gt;
&amp;lt;!-- &amp;lt;french&amp;gt; --&amp;gt;&lt;br /&gt;
[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Cartographie temps réel basée sur l&#039;apparence de l&#039;environnement &lt;br /&gt;
&amp;lt;!-- &amp;lt;/french&amp;gt; --&amp;gt;&lt;br /&gt;
&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;This page is about the loop closure detection approach used by RTAB-Map. For RGB-D mapping, visit [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
Loop closure detection is the process of finding a match between the current location and a previously visited one in SLAM (Simultaneous Localization And Mapping). &lt;br /&gt;
The time required to process each new observation increases with the size of the internal map, which may compromise real-time processing. &lt;br /&gt;
RTAB-Map is a novel real-time loop closure detection approach for large-scale and long-term SLAM. Our approach is based on efficient memory management that keeps the computation time for each new observation under a fixed time limit, thus respecting the real-time constraint during long-term operation. Results demonstrate the approach&#039;s adaptability and scalability on two custom data sets and ten standard data sets.&lt;br /&gt;
&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;Cette page est à propos de l&#039;approche de détection de fermeture de boucle utilisée dans RTAB-Map. Pour la cartographie RGB-D, visitez [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
La détection de fermeture de boucle est le processus impliqué en SLAM (localisation et cartographie simultanées) lorsqu&#039;on tente de trouver une correspondance entre un endroit présent et un autre déjà visité. Plus la carte interne augmente en taille, plus le temps requis pour la détection de fermeture de boucle augmente, ce qui peut affecter le traitement en temps réel. RTAB-Map est une nouvelle approche de détection de fermeture de boucle fonctionnant en temps réel pour du SLAM à grande échelle et à long terme. Notre approche est basée sur une gestion efficace de la mémoire afin de garder le temps de calcul en dessous d&#039;un seuil de temps, respectant ainsi la limite de temps réel à long terme. En utilisant dix ensembles de données standards, notre propre ensemble de données dérivées d&#039;un parcours de plus de 2 km rassemblant des conditions diverses et notre ensemble de données montrant un parcours où le robot visite les mêmes endroits une centaine de fois, les résultats démontrent l&#039;adaptabilité et l&#039;extensibilité de notre approche.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{{#ev:youtube|71eRxTc1DaU}}&lt;br /&gt;
{{#ev:youtube|CAk-QGMlQmI}}&lt;br /&gt;
{{#ev:youtube|AMLwjo80WzI}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Results ==&lt;br /&gt;
&#039;&#039;Note that these more recent results may differ from those in the presentation videos above.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Summary of the loop closures detected on the UdeS data set:&lt;br /&gt;
* Green: Loop closures detected&lt;br /&gt;
* Yellow: Loop closures rejected&lt;br /&gt;
* Red: Unable to detect a loop closure because old places could not be retrieved&lt;br /&gt;
&lt;br /&gt;
Figure 2: Processing time for each image acquired (real-time limit fixed to 700 ms for an image rate of 1 Hz)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall at 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduce the loop closure detection results&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visit the [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] wiki page on [http://github.com/introlab/rtabmap/wiki RTAB-Map&#039;s GitHub]. The ground truths can be downloaded below.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Videos&#039;&#039;&#039;&lt;br /&gt;
* Newer:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/center&amp;gt;&lt;br /&gt;
* Older:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Résultats ==&lt;br /&gt;
&#039;&#039;À noter que les résultats (plus récents) présentés ici peuvent différer de ceux dans le vidéo...&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Sommaire des détections de boucles sur l&#039;ensemble de données UdeS :&lt;br /&gt;
* Vert : Fermetures de boucle acceptées&lt;br /&gt;
* Jaune : Fermetures de boucle rejetées&lt;br /&gt;
* Rouge : Impossibilité de détecter une fermeture de boucle, car les anciens endroits n&#039;ont pu être récupérés&lt;br /&gt;
&lt;br /&gt;
Figure 2: Temps d&#039;exécution pour chaque itération (limite temps réel fixée à 700 ms pour un temps d&#039;acquisition de 1 seconde)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall à 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduire les résultats de détection de boucles&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visitez la page wiki [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] sur le [http://github.com/introlab/rtabmap/wiki GitHub de RTAB-Map]. Les &amp;quot;ground truths&amp;quot; peuvent être téléchargés en bas de la page.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Vidéos&#039;&#039;&#039;&lt;br /&gt;
* Nouveaux:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/center&amp;gt;&lt;br /&gt;
* Anciens:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Source code ==&lt;br /&gt;
The code was tested on Windows (XP, 7), Mac OS X 10.6 and Ubuntu 10.04 LTS.&lt;br /&gt;
* Standalone application, libraries and ROS packages : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images acquired in Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Code source ==&lt;br /&gt;
Le code a été testé sur Windows (XP, 7), Mac OS X 10.6 et Ubuntu 10.04 LTS.&lt;br /&gt;
* Logiciel &amp;quot;stand-alone&amp;quot;, bibliothèques logicielles et noeuds ROS : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images provenant de Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Data sets ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images at 1 Hz (1.5 hours). &lt;br /&gt;
* Images taken while walking a loop of ~2 km, traversed twice.&lt;br /&gt;
* The data set contains indoor and outdoor environments.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 on Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|Rain!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Compare illumination and camera orientation with the next image...&lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Elevator door...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images at 1 Hz (7 hours).&lt;br /&gt;
* Images taken from the racing video game Need For Speed: Most Wanted.&lt;br /&gt;
* 2 areas, each visited about a hundred times (100 traversals in area 1, then 102 traversals in area 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Compare illumination with the next image...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Community&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Community data sets from other loop closure detection approaches:&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor and Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege and CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa and Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
Ground truths:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images at 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images at 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images at ~1 Hz (Note that we removed some images from the original data set to obtain an image rate of approximately 1 Hz)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images at 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images at 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images at 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images at ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images at ~1 Hz&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Ensembles de données ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images à 1 Hz (1,5 heures).&lt;br /&gt;
* Images prises en marchant sur un trajet de ~2 km, parcouru deux fois.&lt;br /&gt;
* L&#039;ensemble de données contient des images prises à l&#039;intérieur et à l&#039;extérieur.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 sur Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|De la pluie!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Comparer l&#039;illumination et l&#039;orientation de la caméra avec l&#039;image suivante... &lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Porte d&#039;ascenseur...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images à 1 Hz (7 heures).&lt;br /&gt;
* Images prises dans le jeu vidéo de course Need For Speed: Most Wanted.&lt;br /&gt;
* 2 zones ont été visitées une centaine de fois chacune (100 parcours dans la zone 1, puis 102 parcours dans la zone 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Comparer l&#039;illumination avec l&#039;image suivante...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Communauté&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Ensembles de données provenant d&#039;autres approches de détection de fermeture de boucle :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor et Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege et CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa et Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Ground truths&#039;&#039;:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images à ~0.5 Hz (les images de gauche et de droite fusionnées)&lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images à ~0.5 Hz (les images de gauche et de droite fusionnées) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images à 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images à 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images à ~1 Hz (Noter que nous avons enlevé des images de l&#039;ensemble de données original pour obtenir une fréquence d&#039;acquisition d&#039;images d&#039;environ 1 Hz.)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images à 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images à 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images à 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images à ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images à ~1 Hz&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
# M. Labbé and F. Michaud, “Multi-Session Visual SLAM for Illumination-Invariant Re-Localization in Indoor Environments,” in &#039;&#039;Frontiers in Robotics and AI&#039;&#039;, vol. 9, 2022. ([https://arxiv.org/abs/2103.03827 pdf]) ([https://doi.org/10.3389/frobt.2022.801886 Frontiers])&lt;br /&gt;
# M. Labbé and F. Michaud, “RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation,” in &#039;&#039;Journal of Field Robotics&#039;&#039;, vol. 36, no. 2, pp. 416–446, 2019. ([[Media:Labbe18JFR_preprint.pdf|pdf]]) ([https://doi.org/10.1002/rob.21831 Wiley])&lt;br /&gt;
# M. Labbé and F. Michaud, “Long-term online multi-session graph-based SPLAM with memory management,” in &#039;&#039;Autonomous Robots&#039;&#039;, vol. 42, no. 6, pp. 1133-1150, 2017. ([[Media:LabbeAURO2017.pdf|pdf]]) ([http://dx.doi.org/10.1007/s10514-017-9682-5 Springer])&lt;br /&gt;
#M. Labbé and F. Michaud, “Online Global Loop Closure Detection for Large-Scale Multi-Session Graph-Based SLAM,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 2014. ([[Media:Labbe14-IROS.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/login.jsp?tp=&amp;amp;arnumber=6942926 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud, F. (2013), “Appearance-based loop closure detection in real-time for large-scale and long-term operation,” &#039;&#039;IEEE Transactions on Robotics&#039;&#039;, vol. 29, no. 3, pp. 734-745. ([[Media:TRO2013.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6459608 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud, F. (2011), “Memory management for real-time appearance-based loop closure detection,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;. ([[Media:labbe11memory.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6094602 IEEE Xplore])&lt;br /&gt;
&lt;br /&gt;
==== Presentations ====&lt;br /&gt;
* M. Labbé, &amp;quot;Simultaneous Localization and Mapping (SLAM) with RTAB-Map&amp;quot;, Université Laval, Québec, November 2015 ([[Media:Labbe2015ULaval.pdf|slides pdf]])&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Team ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Équipe ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=OpenECoSys&amp;diff=3518</id>
		<title>OpenECoSys</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=OpenECoSys&amp;diff=3518"/>
		<updated>2024-05-28T12:44:06Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Description = &lt;br /&gt;
[[Image:OpenEcoSys_NetworkViewer.png|center|600px]]&lt;br /&gt;
&lt;br /&gt;
The [http://www.openecosys.org Open Embedded Computing Systems (OpenECoSys)] project provides free, open-source hardware &amp;amp; software implementations for embedded computing devices. The initial project was started at IntRoLab, which over time developed multiple embedded modules for its own mobile robot platforms. All modules are connected through a shared CAN (Controller Area Network) bus to form a distributed network of sensors and actuators used on advanced platforms such as the [[AZIMUT]] robot. Most of the embedded systems are based on [http://www.microchip.com Microchip] microcontrollers, which are inexpensive, powerful and versatile. Software tools such as the [https://sourceforge.net/apps/mediawiki/openecosys/index.php?title=NetworkViewer NetworkViewer] were developed to allow monitoring of multiple internal variables in the distributed network, facilitating the development of any application.&lt;br /&gt;
&lt;br /&gt;
More information about OpenECoSys can be found here:&lt;br /&gt;
&lt;br /&gt;
* http://openecosys.sourceforge.net&lt;br /&gt;
&lt;br /&gt;
= Related IntRoLab Project(s) =&lt;br /&gt;
&lt;br /&gt;
* [[AZIMUT]]&lt;br /&gt;
* [[Autonomous Robot]]&lt;br /&gt;
* [[Teletrauma]]&lt;br /&gt;
* [[Telerobot]]&lt;br /&gt;
* [[DEA]]&lt;br /&gt;
* [[DDRA]]&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MARIE&amp;diff=3517</id>
		<title>MARIE</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MARIE&amp;diff=3517"/>
		<updated>2024-05-28T12:43:55Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Description =&lt;br /&gt;
&lt;br /&gt;
[[Image:MARIE.png|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://marie.sourceforge.net MARIE] is a design tool for mobile and autonomous robot applications, designed to facilitate the integration of multiple heterogeneous software elements. It is a flexible tool based on a distributed model, allowing an application to be realized on one machine or across various networked machines, architectures and platforms. It has since been replaced by [http://www.ros.org ROS] from [http://www.willowgarage.com Willow Garage].&lt;br /&gt;
&lt;br /&gt;
  &#039;&#039;&#039;Note : MARIE is no longer maintained.&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
= Publications =&lt;br /&gt;
&lt;br /&gt;
#Michaud, F., Côté, C., Létourneau, D., Brosseau, Y., Valin, J.-M., Beaudry, É., Raïevsky, C., Ponchon, Moisan, P., Lepage, P., Morin, Y., Gagnon, F., Giguère, P., Roux, M.-A., Caron, S., Frenette, P., Kabanza, F. (2007), “Spartacus attending the 2005 AAAI Conference,” to be published in &#039;&#039;Autonomous Robots, &#039;&#039;Special Issue on the AAAI Mobile Robot Competitions and Exhibition. ([http://introlab.3it.usherbrooke.ca/papers/AR2007.pdf pdf]) &lt;br /&gt;
#Côté, C., Champagne, R., Michaud, F. (2007), &amp;quot;Coping with architectural mismatch in autonomous mobile robotics&amp;quot;, &#039;&#039;Proceedings Workshop on Software Development and Integration in Robotics: Understanding Robot Software Architectures, International Conference on Robotics and Automation&#039;&#039;. &lt;br /&gt;
#Côté, C., Létourneau, D., Michaud, F., Brosseau, Y. (2007), &amp;quot;Robotics system integration frameworks: MARIE’s approach to software development and integration&amp;quot;, &#039;&#039;Springer Tracts in Advanced Robotics: Software Engineering for Experimental Robotics, &#039;&#039;Springer Verlag Heidelberg, vol. 30. ([http://introlab.3it.usherbrooke.ca/papers/STARbookChapter.pdf pdf]) &lt;br /&gt;
#Côté, C., Brosseau, Y., Létourneau, D., Raïevsky, C., Michaud, F. (2006), &amp;quot;Using MARIE in software development and integration for autonomous mobile robotics&amp;quot;, &#039;&#039;International Journal of Advanced Robotic Systems&#039;&#039;, Special Issue on Software Development and Integration in Robotics, 3(1):55-60. ([http://introlab.3it.usherbrooke.ca/papers/IJARS2006.pdf pdf]) &lt;br /&gt;
#Michaud, F., Brosseau, Y., Côté, C., Létourneau, D., Moisan, P., Ponchon, A., Raïevsky, C., Valin, J.-M., Beaudry, É., Kabanza, F. (2005) “Modularity and integration in the design of a socially interactive robot,” &#039;&#039;Proceedings&#039;&#039; &#039;&#039;IEEE International Workshop on Robot and Human Interactive Communication&#039;&#039;, Nashville USA, 172-177. ([http://introlab.3it.usherbrooke.ca/papers/ROMAN2005_Spartacus.pdf pdf])&lt;br /&gt;
#Côté, C., Létourneau, D., Michaud, F., Brosseau, Y. (2005) “Software design patterns for robotics: Solving integration problems with MARIE,” invited presentation, Workshop of Robotic Software Environment, &#039;&#039;IEEE International Conference on Robotics and Automation&#039;&#039;. &lt;br /&gt;
#Côté, C., Létourneau, D., Michaud, F., Valin, J.-M., Brosseau, Y., Raïevsky, C., Lemay, M., Tran. V. (2004), &amp;quot;Code reusability for programming mobile robots&amp;quot;, &#039;&#039;Proceedings IEEE/RSJ International Conference on Robots and Intelligent Systems&#039;&#039;, 1820-1825. ([http://introlab.3it.usherbrooke.ca/papers/IROS2004MARIE.pdf pdf])&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=ManyEars&amp;diff=3516</id>
		<title>ManyEars</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=ManyEars&amp;diff=3516"/>
		<updated>2024-05-28T12:43:38Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Description =&lt;br /&gt;
&lt;br /&gt;
[[Image:ManyEarsGUI.png|600px|center]]&lt;br /&gt;
&lt;br /&gt;
The [http://manyears.sourceforge.net ManyEars] project was set up to provide source code from the original [[AUDIBLE]] project. It provides an easy-to-use &#039;C&#039; library for microphone array processing, including sound source localisation, tracking and separation. A [http://qt.nokia.com Qt] GUI is also available for fine-tuning the parameters.&lt;br /&gt;
&lt;br /&gt;
= 8 Sounds USB =&lt;br /&gt;
&lt;br /&gt;
[http://manyears.sf.net ManyEars] requires a synchronous audio capture card. Commercial audio acquisition cards are available, but they are expensive, bulky and too generic for the needs of ManyEars and of mobile robotics. The goal of 8 Sounds USB is a synchronous audio capture card with low power consumption, a size similar to a credit card, 8 input channels and 2 output channels. The first iteration of the project is a prototype with wiring diagrams, circuit board, full documentation and source code available under open-source licenses. [http://www.xmos.com XMOS] is our platform of choice to process the I2S codec data and the USB transmission. We are also working toward a [http://www.usb.org/developers/devclass_docs USB Generic Audio Class 2] implementation. More information on the project can be found here:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* https://sourceforge.net/p/eightsoundsusb/home/&lt;br /&gt;
* https://www.xcore.com/projects/8-sounds-usb&lt;br /&gt;
&lt;br /&gt;
* NEW! [http://www.willowgarage.com/blog/2013/04/12/giving-ears-pr2-8sounds-and-manyears ManyEars on a PR2 (March 2013)]&lt;br /&gt;
&lt;br /&gt;
= Publications =&lt;br /&gt;
&lt;br /&gt;
#Grondin, F., Létourneau, D., Ferland, F., Rousseau, V., and Michaud, F. (2013), &amp;quot;The ManyEars Open Framework - Microphone array open software and open hardware system for robotic applications,&amp;quot; &#039;&#039;Autonomous Robots&#039;&#039;, 34:217-232. [http://link.springer.com/article/10.1007/s10514-012-9316-x] &lt;br /&gt;
#Grondin, F., Létourneau, D., Ferland, F., and Michaud, F. (2013), &amp;quot;An open hardware and software microphone array system for robotic applications,&amp;quot; Demonstration session IEEE International Conference on Human-Robot Interaction. ([[Media:HRI2013demo.pdf|pdf]]) &lt;br /&gt;
#Grondin, F., Michaud, F. (2012), &amp;quot;WISS, a Speaker Identification System for Mobile Robots,&amp;quot; Proceedings of the International Conference on Robotics and Automation: 1817-1822 ([[Media:Grondin2012wiss.pdf|pdf]]) ([[Media:ICRA2012.mpg|mpg]])&lt;br /&gt;
#Grondin, F. (2011), Reconnaissance de locuteurs pour robot mobile, Mémoire de maîtrise, Département de génie électrique et de génie informatique, Université de Sherbrooke. ([[Media:MemoireGrondin.pdf|pdf]])&lt;br /&gt;
#Abran-Côté D., Bandou M., Béland A., Cayer G, Choquette S., Gosselin F., Robitaille F., Telly Kizio D. (2011), “USB Synchronous Multichannel Audio Acquisition System”, IntRoLab Technical Paper ([[Media:8SoundsUSBTechnicalPaper2.pdf|pdf]])&lt;br /&gt;
#Badali, A., Valin, J.-M., Michaud, F., Aarabi, P. (2009), “Evaluating real-time audio localization algorithms for artificial audition on mobile robots,” to be presented at IEEE International Conference on Intelligent Robots and Systems, October. ([http://introlab.3it.usherbrooke.ca/papers/IROS2009.pdf pdf])&lt;br /&gt;
#Valin, J.-M., Yamamoto, S., Rouat, J., Michaud, F., Nakadai, K., Okuno, H. (2007), “Robust recognition of simultaneous speech by a mobile robot,” &#039;&#039;IEEE Transactions on Robotics&#039;&#039;, 23(4):742-752.&lt;br /&gt;
#Michaud, F., Côté, C., Létourneau, D., Brosseau, Y., Valin, J.-M., Beaudry, É., Raïevsky, C., Ponchon, Moisan, P., Lepage, P., Morin, Y., Gagnon, F., Giguère, P., Roux, M.-A., Caron, S., Frenette, P., Kabanza, F. (2007), “Spartacus attending the 2005 AAAI Conference,” &#039;&#039;Autonomous Robots, &#039;&#039;Special Issue on the AAAI Mobile Robot Competitions and Exhibition. ([http://introlab.3it.usherbrooke.ca/papers/AR2007.pdf pdf])&lt;br /&gt;
#Valin, J.-M., Michaud, F., Rouat, J. (2007), “Robust localization and tracking of simultaneous moving sound sources using beamforming and particle filtering,” &#039;&#039;Robotics and Autonomous Systems Journal&#039;&#039;, 55: 216-228. ([http://introlab.3it.usherbrooke.ca/papers/RAS2007.pdf pdf]) &lt;br /&gt;
#Valin, J.-M.. (2005), &amp;quot;Auditory system for a mobile robot&amp;quot;, Ph.D. Thesis, Department of Electrical Engineering and Computer Engineering, Université de Sherbrooke, August. ([http://introlab.3it.usherbrooke.ca/papers/PhDValin.pdf pdf])&lt;br /&gt;
#Valin, J.-M., Rouat, J., Michaud, F. (2004), &amp;quot;Enhanced robot audition based on microphone array source separation with post-filter&amp;quot;, &#039;&#039;Proceedings IEEE/RSJ International Conference on Robots and Intelligent Systems&#039;&#039;, 2123-2128. ([http://introlab.3it.usherbrooke.ca/papers/IROS2004_Valin.pdf pdf])&lt;br /&gt;
#Valin, J.-M., Michaud, F., Hadjou, B., Rouat, J. (2004), &amp;quot;Localization of simultaneous moving sound sources for mobile robot using a frequency-domain steered beamformer approach&amp;quot;, &#039;&#039;Proceedings IEEE International Conference on Robotics and Automation&#039;&#039;, 1033-1038. ([http://introlab.3it.usherbrooke.ca/papers/ICRA2004audible.pdf pdf])&lt;br /&gt;
#Valin, J.-M., Rouat, J., Michaud, F. (2004), &amp;quot;Microphone array post-filter for separation of simultaneous non-stationary sources&amp;quot;, &#039;&#039;ICASSP&#039;&#039;, Montréal. ([http://introlab.3it.usherbrooke.ca/papers/ICASSP2004.pdf pdf])&lt;br /&gt;
#Valin, J.-M., Michaud, F., Létourneau, D., Rouat, J. (2003), &amp;quot;Robust sound source localization using a microphone array on a mobile robot&amp;quot;, &#039;&#039;Proceedings IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, p. 1228-1233. ([http://introlab.3it.usherbrooke.ca/papers/IROS2003_Valin.pdf pdf])&lt;br /&gt;
&lt;br /&gt;
= Related IntRoLab Project(s) =&lt;br /&gt;
* [[AUDIBLE]]&lt;br /&gt;
* [[Autonomous Robot]]&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=FlowDesigner&amp;diff=3515</id>
		<title>FlowDesigner</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=FlowDesigner&amp;diff=3515"/>
		<updated>2024-05-28T12:43:13Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Description =&lt;br /&gt;
&lt;br /&gt;
[[Image:FlowDesigner.png|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://flowdesigner.sf.net FlowDesigner] is a free (GPL/LGPL) data-flow-oriented development environment. It can be used to build complex applications by combining small, reusable building blocks. In some ways it is similar to both Simulink and LabView, but it is hardly a clone of either. FlowDesigner features a RAD GUI with a visual debugger. Although FlowDesigner can be used as a rapid prototyping tool, it can also be used to build real-time applications such as audio effects processing. Since FlowDesigner is not an interpreted language, it can be quite fast. It is written in C++ and features a plugin mechanism that allows plugins/toolboxes to be easily added.&lt;br /&gt;
&lt;br /&gt;
= RobotFlow = &lt;br /&gt;
&lt;br /&gt;
[[Image:RobotFlow.jpeg|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://robotflow.sf.net RobotFlow] is a mobile robotics toolkit based on the FlowDesigner project. The visual programming interface provided by the FlowDesigner project helps users visualize &amp;amp; understand what is really happening in the robot&#039;s control loops, sensors and actuators, using graphical probes and real-time debugging.&lt;br /&gt;
&lt;br /&gt;
 &#039;&#039;&#039;Note : RobotFlow is no longer maintained.&#039;&#039;&#039;&lt;br /&gt;
= Publications =&lt;br /&gt;
#Létourneau, D., Valin, J.-M., Côté, C., Michaud, F. (2005), “FlowDesigner: the free data-flow oriented development environment”, &#039;&#039;Software 2.0&#039;&#039;, vol. 3. ([[Media:Software2005.pdf|pdf]]) &lt;br /&gt;
#Côté, C., Létourneau, D., Michaud, F., Valin, J.-M., Brosseau, Y., Raïevsky, C., Lemay, M., Tran. V. (2004), &amp;quot;Code reusability for programming mobile robots&amp;quot;, &#039;&#039;Proceedings IEEE/RSJ International Conference on Robots and Intelligent Systems&#039;&#039;, 1820-1825.&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=DEA&amp;diff=3514</id>
		<title>DEA</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=DEA&amp;diff=3514"/>
		<updated>2024-05-28T12:41:55Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;big&amp;gt;&lt;br /&gt;
ADE : Actionneur Différentiel-Élastique / DEA: Differential Elastic Actuator (Patent pending)&lt;br /&gt;
&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Description =&lt;br /&gt;
The implementation of machines able to precisely control interaction with their environment begins with actuators specially designed for that purpose. To that end, a new compact design for high-performance actuators, especially adapted for integration in robotic mechanisms, has been developed. This design uses a mechanical differential as its central element. Differential coupling between an intrinsically high-impedance transducer and an intrinsically low-impedance spring element provides the same benefits as serial coupling, while allowing interesting new design possibilities, especially for rotational actuators.&lt;br /&gt;
&lt;br /&gt;
[[Image:ADETaxonomy.jpg|400px]][[Image:ADEAdvantages.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
[[Image:ADEMechanism.jpg|400px]][[Image:ADEPhoto.jpg|200px]]&lt;br /&gt;
&lt;br /&gt;
= Videos =&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:dailymotion|xbxw3r_interactive-arm_tech }}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:IROS2008_audio.mp4 | Video showing our 3DOF arm using three Differential Elastic Actuators (in english)]]&lt;br /&gt;
&lt;br /&gt;
[[Media:Presentation_ADE.mov | Video of a gravity compensation experiment with one Differential Elastic Actuator (in french)]]&lt;br /&gt;
&lt;br /&gt;
= Patents =&lt;br /&gt;
&lt;br /&gt;
* http://www.google.com/patents/US8209052&lt;br /&gt;
&lt;br /&gt;
= Publications =&lt;br /&gt;
&lt;br /&gt;
#Ferland, F., Létourneau, D., Aumont, A., Frémy, J, Legault, M.-A., Lauria, M., Michaud, F. (2012), &amp;quot;Natural interaction design of a humanoid robot,&amp;quot; Journal of Human-Robot Interaction, 1 (2), 118-134, [http://www.humanrobotinteraction.org/journal/index.php/HRI/article/view/65].&lt;br /&gt;
#Fremy, J. (2011), Contrôle en force sécuritaire d&#039;une plateforme omnidirectionnelle non-holonome, Mémoire de maîtrise, Département de génie électrique et de génie informatique, Université de Sherbrooke. ([[Media:MemoireFremy.pdf|pdf]]) &lt;br /&gt;
#Frémy, J., Ferland, F., Clavien, L., Létourneau, D., Michaud, F., Lauria, M. (2010), “Force-controlled motion of a mobile platform,” IEEE/RSJ International Conference on Intelligent Robots and Systems. ([[Media:IROS2010.pdf|pdf]]) ([[Media:IROS2010.m4v|m4v]])&lt;br /&gt;
#Ferland, F., Clavien, L., Frémy, J., Létourneau, D., Michaud, F., Lauria, M. (2010), “Teleoperation of AZIMUT-3, an omnidirectional non-holonomic platform with steerable wheels,” IEEE/RSJ International Conference on Intelligent Robots and Systems. ([[Media:IROS2010f.pdf|pdf]]) ([[Media:IROS2010f.m4v|m4v]])&lt;br /&gt;
# Lavoie, M.-A., Développement et contrôle d&#039;un bras robotique basé sur l&#039;actionneur différentiel élastique, Mémoire de maîtrise, Département de génie mécanique, Université de Sherbrooke. ([http://laborius.gel.usherbrooke.ca/papers/MemoireLavoie.pdf pdf])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Lavoie, M.-A., Michaud, F. (2008) “Differential elastic actuator for robotic interaction tasks,” Proceedings of IEEE International Conference on Robotics and Automation, Pasadena, USA. ([[Media:DifferentialElasticActuattorICRA2008.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Fauteux, Ph., Legault, M.-A., Lavoie, M.-A., Michaud, F. (2008) “Differential elastic actuator for robotic interaction tasks,” in Proceedings of Actuator 2008, 11th International Conference on New Actuators, Bremen, Germany. ([[Media:DifferentialElasticActuatorForInteractionACTUATOR2008.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Lavoie, M.-A., Giguère, P., Gagnon, F., Michaud, F., &amp;quot;High Performance Differential Actuator for Robotic Interaction Tasks&amp;quot;, United States Patent Application number 694123, March 31, 2007.&lt;br /&gt;
#Legault, M.-A. (2007), &amp;quot;Développement d’un actionneur différentiel élastique&amp;quot;, Mémoire de maîtrise, Département de génie mécanique, Université de Sherbrooke, mars. ([[Media:MemoireLegault.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Michaud, F., (2007), “High performance differential elastic actuator for robotic interaction tasks”, Proceedings American Association for Artificial Intelligence Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robotics, Stanford, March. ([[Media:HighPerformanceElasticActuatorAAAI2007.pdf|pdf]])&lt;br /&gt;
&lt;br /&gt;
= Équipe / Team = &lt;br /&gt;
* Silvan Widmer&lt;br /&gt;
* Marc-Antoine Legault&lt;br /&gt;
* Marc-André Lavoie&lt;br /&gt;
* Philippe Fauteux&lt;br /&gt;
* Matthieu Tanguay&lt;br /&gt;
* Michel Lauria&lt;br /&gt;
* François Michaud&lt;br /&gt;
&lt;br /&gt;
= See Also =&lt;br /&gt;
* [[DDRA | Double Differential Rheological Actuator]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!--&amp;lt;english&amp;gt; --&amp;gt;&lt;br /&gt;
Similar project in [http://www.jsk.t.u-tokyo.ac.jp/research/hrp2w/index.htm Japan] and at [http://www.willowgarage.com/ Willow Garage]&lt;br /&gt;
&amp;lt;!--&amp;lt;/english&amp;gt; --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- &amp;lt;french&amp;gt; &lt;br /&gt;
Projet similaire au [http://www.jsk.t.u-tokyo.ac.jp/research/hrp2w/index.htm Japon] et chez [http://www.willowgarage.com/ Willow Garage]&lt;br /&gt;
&amp;lt;/french&amp;gt; --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=DEA&amp;diff=3513</id>
		<title>DEA</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=DEA&amp;diff=3513"/>
		<updated>2024-05-28T12:40:05Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;big&amp;gt;&lt;br /&gt;
ADE : Actionneur Différentiel-Élastique / DEA: Differential Elastic Actuator (Patent pending)&lt;br /&gt;
&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Description =&lt;br /&gt;
The implementation of machines able to precisely control interaction with their environment begins with actuators specially designed for that purpose. To that end, a new compact design for high-performance actuators, especially adapted for integration into robotic mechanisms, has been developed. This design uses a mechanical differential as its central element. Differential coupling between an intrinsically high-impedance transducer and an intrinsically low-impedance spring element provides the same benefits as serial coupling. However, differential coupling opens up new and interesting design possibilities, especially for rotational actuators.&lt;br /&gt;
&lt;br /&gt;
[[Image:ADETaxonomy.jpg|400px]][[Image:ADEAdvantages.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
[[Image:ADEMechanism.jpg|400px]][[Image:ADEPhoto.jpg|200px]]&lt;br /&gt;
&lt;br /&gt;
= Videos =&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:dailymotion|xbxw3r_interactive-arm_tech }}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:IROS2008_audio.mp4 | Video showing our 3DOF arm using three Differential Elastic Actuators (in English)]]&lt;br /&gt;
&lt;br /&gt;
[[Media:Presentation_ADE.mov | Video of a gravity compensation experiment with one Differential Elastic Actuator (in French)]]&lt;br /&gt;
&lt;br /&gt;
= Patents =&lt;br /&gt;
&lt;br /&gt;
* http://www.google.com/patents/US8209052&lt;br /&gt;
&lt;br /&gt;
= Publications =&lt;br /&gt;
&lt;br /&gt;
#Ferland, F., Létourneau, D., Aumont, A., Frémy, J., Legault, M.-A., Lauria, M., Michaud, F. (2012), &amp;quot;Natural interaction design of a humanoid robot,&amp;quot; Journal of Human-Robot Interaction, 1 (2), 118-134, [http://www.humanrobotinteraction.org/journal/index.php/HRI/article/view/65].&lt;br /&gt;
#Frémy, J. (2011), Contrôle en force sécuritaire d&#039;une plateforme omnidirectionnelle non-holonome, Mémoire de maîtrise, Département de génie électrique et de génie informatique, Université de Sherbrooke. ([[Media:MemoireFremy.pdf|pdf]]) &lt;br /&gt;
#Frémy, J., Ferland, F., Clavien, L., Létourneau, D., Michaud, F., Lauria, M. (2010), “Force-controlled motion of a mobile platform,” IEEE/RSJ International Conference on Intelligent Robots and Systems. ([[Media:IROS2010.pdf|pdf]]) ([[Media:IROS2010.m4v|m4v]])&lt;br /&gt;
#Ferland, F., Clavien, L., Frémy, J., Létourneau, D., Michaud, F., Lauria, M. (2010), “Teleoperation of AZIMUT-3, an omnidirectional non-holonomic platform with steerable wheels,” IEEE/RSJ International Conference on Intelligent Robots and Systems. ([[Media:IROS2010f.pdf|pdf]]) ([[Media:IROS2010f.m4v|m4v]])&lt;br /&gt;
# Lavoie, M.-A., Développement et contrôle d&#039;un bras robotique basé sur l&#039;actionneur différentiel élastique, Mémoire de maîtrise, Département de génie mécanique, Université de Sherbrooke. ([http://laborius.gel.usherbrooke.ca/papers/MemoireLavoie.pdf pdf])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Lavoie, M.-A., Michaud, F. (2008) “Differential elastic actuator for robotic interaction tasks,” Proceedings of IEEE International Conference on Robotics and Automation, Pasadena, USA. ([[Media:DifferentialElasticActuattorICRA2008.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Fauteux, Ph., Legault, M.-A., Lavoie, M.-A., Michaud, F. (2008) “Differential elastic actuator for robotic interaction tasks,” in Proceedings of Actuator 2008, 11th International Conference on New Actuators, Bremen, Germany. ([[Media:DifferentialElasticActuatorForInteractionACTUATOR2008.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Lavoie, M.-A., Giguère, P., Gagnon, F., Michaud, F., &amp;quot;High Performance Differential Actuator for Robotic Interaction Tasks&amp;quot;, United States Patent Application number 694123, March 31, 2007.&lt;br /&gt;
#Legault, M.-A. (2007), &amp;quot;Développement d’un actionneur différentiel élastique&amp;quot;, Mémoire de maîtrise, Département de génie mécanique, Université de Sherbrooke, mars. ([[Media:MemoireLegault.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Michaud, F., (2007), “High performance differential elastic actuator for robotic interaction tasks”, Proceedings American Association for Artificial Intelligence Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robotics, Stanford, March. ([[Media:HighPerformanceElasticActuatorAAAI2007.pdf|pdf]])&lt;br /&gt;
&lt;br /&gt;
= Équipe / Team = &lt;br /&gt;
* Silvan Widmer&lt;br /&gt;
* Marc-Antoine Legault&lt;br /&gt;
* Marc-André Lavoie&lt;br /&gt;
* Philippe Fauteux&lt;br /&gt;
* Matthieu Tanguay&lt;br /&gt;
* Michel Lauria&lt;br /&gt;
* François Michaud&lt;br /&gt;
&lt;br /&gt;
= See Also =&lt;br /&gt;
* [[DDRA | Double Differential Rheological Actuator]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
Similar project in [http://www.jsk.t.u-tokyo.ac.jp/research/hrp2w/index.htm Japan] and at [http://www.willowgarage.com/ Willow Garage]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- &amp;lt;french&amp;gt; --&amp;gt;&lt;br /&gt;
Projet similaire au [http://www.jsk.t.u-tokyo.ac.jp/research/hrp2w/index.htm Japon] et chez [http://www.willowgarage.com/ Willow Garage]&lt;br /&gt;
&amp;lt;!-- &amp;lt;/french&amp;gt; --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=DEA&amp;diff=3512</id>
		<title>DEA</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=DEA&amp;diff=3512"/>
		<updated>2024-05-28T12:39:53Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;&lt;br /&gt;
ADE : Actionneur Différentiel-Élastique / DEA: Differential Elastic Actuator (Patent pending)&lt;br /&gt;
&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Description =&lt;br /&gt;
The implementation of machines able to precisely control interaction with their environment begins with actuators specially designed for that purpose. To that end, a new compact design for high-performance actuators, especially adapted for integration into robotic mechanisms, has been developed. This design uses a mechanical differential as its central element. Differential coupling between an intrinsically high-impedance transducer and an intrinsically low-impedance spring element provides the same benefits as serial coupling. However, differential coupling opens up new and interesting design possibilities, especially for rotational actuators.&lt;br /&gt;
&lt;br /&gt;
[[Image:ADETaxonomy.jpg|400px]][[Image:ADEAdvantages.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
[[Image:ADEMechanism.jpg|400px]][[Image:ADEPhoto.jpg|200px]]&lt;br /&gt;
&lt;br /&gt;
= Videos =&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:dailymotion|xbxw3r_interactive-arm_tech }}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Media:IROS2008_audio.mp4 | Video showing our 3DOF arm using three Differential Elastic Actuators (in English)]]&lt;br /&gt;
&lt;br /&gt;
[[Media:Presentation_ADE.mov | Video of a gravity compensation experiment with one Differential Elastic Actuator (in French)]]&lt;br /&gt;
&lt;br /&gt;
= Patents =&lt;br /&gt;
&lt;br /&gt;
* http://www.google.com/patents/US8209052&lt;br /&gt;
&lt;br /&gt;
= Publications =&lt;br /&gt;
&lt;br /&gt;
#Ferland, F., Létourneau, D., Aumont, A., Frémy, J., Legault, M.-A., Lauria, M., Michaud, F. (2012), &amp;quot;Natural interaction design of a humanoid robot,&amp;quot; Journal of Human-Robot Interaction, 1 (2), 118-134, [http://www.humanrobotinteraction.org/journal/index.php/HRI/article/view/65].&lt;br /&gt;
#Frémy, J. (2011), Contrôle en force sécuritaire d&#039;une plateforme omnidirectionnelle non-holonome, Mémoire de maîtrise, Département de génie électrique et de génie informatique, Université de Sherbrooke. ([[Media:MemoireFremy.pdf|pdf]]) &lt;br /&gt;
#Frémy, J., Ferland, F., Clavien, L., Létourneau, D., Michaud, F., Lauria, M. (2010), “Force-controlled motion of a mobile platform,” IEEE/RSJ International Conference on Intelligent Robots and Systems. ([[Media:IROS2010.pdf|pdf]]) ([[Media:IROS2010.m4v|m4v]])&lt;br /&gt;
#Ferland, F., Clavien, L., Frémy, J., Létourneau, D., Michaud, F., Lauria, M. (2010), “Teleoperation of AZIMUT-3, an omnidirectional non-holonomic platform with steerable wheels,” IEEE/RSJ International Conference on Intelligent Robots and Systems. ([[Media:IROS2010f.pdf|pdf]]) ([[Media:IROS2010f.m4v|m4v]])&lt;br /&gt;
# Lavoie, M.-A., Développement et contrôle d&#039;un bras robotique basé sur l&#039;actionneur différentiel élastique, Mémoire de maîtrise, Département de génie mécanique, Université de Sherbrooke. ([http://laborius.gel.usherbrooke.ca/papers/MemoireLavoie.pdf pdf])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Lavoie, M.-A., Michaud, F. (2008) “Differential elastic actuator for robotic interaction tasks,” Proceedings of IEEE International Conference on Robotics and Automation, Pasadena, USA. ([[Media:DifferentialElasticActuattorICRA2008.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Fauteux, Ph., Legault, M.-A., Lavoie, M.-A., Michaud, F. (2008) “Differential elastic actuator for robotic interaction tasks,” in Proceedings of Actuator 2008, 11th International Conference on New Actuators, Bremen, Germany. ([[Media:DifferentialElasticActuatorForInteractionACTUATOR2008.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Lavoie, M.-A., Giguère, P., Gagnon, F., Michaud, F., &amp;quot;High Performance Differential Actuator for Robotic Interaction Tasks&amp;quot;, United States Patent Application number 694123, March 31, 2007.&lt;br /&gt;
#Legault, M.-A. (2007), &amp;quot;Développement d’un actionneur différentiel élastique&amp;quot;, Mémoire de maîtrise, Département de génie mécanique, Université de Sherbrooke, mars. ([[Media:MemoireLegault.pdf|pdf]])&lt;br /&gt;
# Lauria, M., Legault, M.-A., Michaud, F., (2007), “High performance differential elastic actuator for robotic interaction tasks”, Proceedings American Association for Artificial Intelligence Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robotics, Stanford, March. ([[Media:HighPerformanceElasticActuatorAAAI2007.pdf|pdf]])&lt;br /&gt;
&lt;br /&gt;
= Équipe / Team = &lt;br /&gt;
* Silvan Widmer&lt;br /&gt;
* Marc-Antoine Legault&lt;br /&gt;
* Marc-André Lavoie&lt;br /&gt;
* Philippe Fauteux&lt;br /&gt;
* Matthieu Tanguay&lt;br /&gt;
* Michel Lauria&lt;br /&gt;
* François Michaud&lt;br /&gt;
&lt;br /&gt;
= See Also =&lt;br /&gt;
* [[DDRA | Double Differential Rheological Actuator]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
Similar project in [http://www.jsk.t.u-tokyo.ac.jp/research/hrp2w/index.htm Japan] and at [http://www.willowgarage.com/ Willow Garage]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- &amp;lt;french&amp;gt; --&amp;gt;&lt;br /&gt;
Projet similaire au [http://www.jsk.t.u-tokyo.ac.jp/research/hrp2w/index.htm Japon] et chez [http://www.willowgarage.com/ Willow Garage]&lt;br /&gt;
&amp;lt;!-- &amp;lt;/french&amp;gt; --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3494</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3494"/>
		<updated>2023-07-13T16:36:17Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
{|  class=&amp;quot;wikitable&amp;quot;  style=&amp;quot;width:100%; height:200px; text-align:left;&amp;quot; border=&amp;quot;0&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&amp;lt;french&amp;gt;&amp;lt;big&amp;gt;IntRoLab - Laboratoire de robotique intelligente / interactive / intégrée / interdisciplinaire &amp;lt;/big&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Institut interdisciplinaire d&#039;innovation technologique [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Téléphone : 819 821-8000 poste 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contactez-nous!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Accès routier] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;IntRoLab - Intelligent / Interactive / Integrated / Interdisciplinary Robot Lab &amp;lt;/big&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Interdisciplinary Institute for Technological Innovation [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Phone :  819 821-8000 ext. 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contact us!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Map] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;Juillet 2023 - [https://github.com/introlab/t-top T-Top]&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;July 2023 - [https://github.com/introlab/t-top T-Top]&amp;lt;/english&amp;gt;&lt;br /&gt;
{{#ev:youtube|2jpqTp6jozc}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
[[Image:Twitter.jpeg|50px|link=https://twitter.com/introlab/]] &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
{| style=&amp;quot;float: right;&amp;quot;&lt;br /&gt;
| [https://introlab.3it.usherbrooke.ca/introlab-secure/ Intranet]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3493</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3493"/>
		<updated>2023-07-13T16:34:17Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
{|  class=&amp;quot;wikitable&amp;quot;  style=&amp;quot;width:100%; height:200px; text-align:left;&amp;quot; border=&amp;quot;0&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&amp;lt;french&amp;gt;&amp;lt;big&amp;gt;IntRoLab - Laboratoire de robotique intelligente / interactive / intégrée / interdisciplinaire &amp;lt;/big&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Institut interdisciplinaire d&#039;innovation technologique [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Téléphone : 819 821-8000 poste 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contactez-nous!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Accès routier] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;IntRoLab - Intelligent / Interactive / Integrated / Interdisciplinary Robot Lab &amp;lt;/big&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Interdisciplinary Institute for Technological Innovation [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Phone :  819 821-8000 ext. 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contact us!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Map] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;Juillet 2023 - [https://github.com/introlab/t-top T-Top]&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;July 2023 - [https://github.com/introlab/t-top T-Top]&amp;lt;/english&amp;gt;&lt;br /&gt;
{{#ev:youtube|2jpqTp6jozc}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
[[Image:Twitter.jpeg|50px|link=https://twitter.com/introlab/]] &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
{| style=&amp;quot;float: right;&amp;quot;&lt;br /&gt;
| [https://introlab.3it.usherbrooke.ca/introlab-secure/ Intranet]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3483</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3483"/>
		<updated>2022-10-25T13:38:09Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3482</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3482"/>
		<updated>2022-10-25T12:56:47Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3481</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3481"/>
		<updated>2022-10-11T13:27:41Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3480</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3480"/>
		<updated>2022-10-03T18:15:30Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:IntRoLab.png&amp;diff=3474</id>
		<title>File:IntRoLab.png</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:IntRoLab.png&amp;diff=3474"/>
		<updated>2022-05-16T12:42:56Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:IntRoLab.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3453</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3453"/>
		<updated>2022-01-25T22:15:29Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3438</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3438"/>
		<updated>2021-09-20T13:08:51Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3435</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3435"/>
		<updated>2021-09-14T22:55:53Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* News&lt;br /&gt;
**News|Toutes les nouvelles / All News&lt;br /&gt;
&lt;br /&gt;
* Projets / Projects&lt;br /&gt;
**Projects|Tous les projets / All Projects&lt;br /&gt;
**ADE|DEA&lt;br /&gt;
**AUDIBLE|AUDIBLE&lt;br /&gt;
**Autonomous Robot|Autonomous Robot&lt;br /&gt;
**AZIMUT|AZIMUT&lt;br /&gt;
**CRI|Children Robot Interaction&lt;br /&gt;
**DCD|DCD&lt;br /&gt;
**DDRA|DDRA&lt;br /&gt;
**DRF|DRF&lt;br /&gt;
**EQ|EQ&lt;br /&gt;
**MapIt|MapIt&lt;br /&gt;
**HBBA|HBBA&lt;br /&gt;
**PEXAT|PEXAT&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**SAM|SAM&lt;br /&gt;
**Telerobot|Telerobot&lt;br /&gt;
**Teletrauma|Teletrauma&lt;br /&gt;
**TRInterface|Ego/Exocentric Teleoperation&lt;br /&gt;
**WISS|WISS&lt;br /&gt;
* Open Source&lt;br /&gt;
**8SoundsUSB|8 Inputs USB Sound Card&lt;br /&gt;
**FlowDesigner|FlowDesigner&lt;br /&gt;
**ManyEars|ManyEars&lt;br /&gt;
**MARIE|MARIE&lt;br /&gt;
**OpenECoSys|OpenECoSys&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Find-Object|Find-Object&lt;br /&gt;
**ROS_OpenTLD|ROS OpenTLD&lt;br /&gt;
**ROS4iOS|ROS4iOS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Équipe / Team&lt;br /&gt;
** Team|Toute l&#039;équipe / All Team&lt;br /&gt;
&lt;br /&gt;
* Infrastructure&lt;br /&gt;
**Infrastructure|Infrastructure de laboratoire / Lab Infrastructure&lt;br /&gt;
&lt;br /&gt;
* Publications&lt;br /&gt;
** Publications|Toutes les publications / All publications&lt;br /&gt;
&lt;br /&gt;
* Information&lt;br /&gt;
** Information|Information du laboratoire / Lab Information&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3434</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3434"/>
		<updated>2021-09-14T22:54:53Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* News&lt;br /&gt;
**News|Toutes les nouvelles / All News&lt;br /&gt;
&lt;br /&gt;
* Projets / Projects&lt;br /&gt;
**Projects|Tous les projets / All Projects&lt;br /&gt;
**ADE|DEA&lt;br /&gt;
**AUDIBLE|AUDIBLE&lt;br /&gt;
**Autonomous Robot|Autonomous Robot&lt;br /&gt;
**AZIMUT|AZIMUT&lt;br /&gt;
**CRI|Children Robot Interaction&lt;br /&gt;
**DCD|DCD&lt;br /&gt;
**DDRA|DDRA&lt;br /&gt;
**DRF|DRF&lt;br /&gt;
**EQ|EQ&lt;br /&gt;
**MapIt|MapIt&lt;br /&gt;
**HBBA|HBBA&lt;br /&gt;
**PEXAT|PEXAT&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**SAM|SAM&lt;br /&gt;
**Telerobot|Telerobot&lt;br /&gt;
**Teletrauma|Teletrauma&lt;br /&gt;
**TRInterface|Ego/Exocentric Teleoperation&lt;br /&gt;
**WISS|WISS&lt;br /&gt;
* Open Source&lt;br /&gt;
**8SoundsUSB|8 Inputs USB Sound Card&lt;br /&gt;
**FlowDesigner|FlowDesigner&lt;br /&gt;
**ManyEars|ManyEars&lt;br /&gt;
**MARIE|MARIE&lt;br /&gt;
**OpenECoSys|OpenECoSys&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Find-Object|Find-Object&lt;br /&gt;
**ROS_OpenTLD|ROS OpenTLD&lt;br /&gt;
**ROS4iOS|ROS4iOS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Équipe / Team&lt;br /&gt;
** Team|Toute l&#039;équipe / All Team&lt;br /&gt;
&lt;br /&gt;
* Infrastructure&lt;br /&gt;
**Infrastructure|Infrastructure de laboratoire / Lab Infrastructure&lt;br /&gt;
&lt;br /&gt;
* Publications&lt;br /&gt;
** Publications|Toutes les publications / All publications&lt;br /&gt;
&lt;br /&gt;
* Information&lt;br /&gt;
** Information|Information du laboratoire / Lab Information&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MapIt&amp;diff=3433</id>
		<title>MapIt</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MapIt&amp;diff=3433"/>
		<updated>2021-09-14T17:05:28Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= &amp;lt;english&amp;gt;MapIt on phone&amp;lt;/english&amp;gt; &amp;lt;french&amp;gt;MapIt sur téléphone&amp;lt;/french&amp;gt; =&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|-BTO71JVjX4}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|wPt8xABSsa0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
YouTube channel: [[Image:MapIt.png|link=https://www.youtube.com/channel/UC78bYbYPRuCEj4yUkNP07Lw/playlists|MapIt|50px]]&lt;br /&gt;
&lt;br /&gt;
Best practices to improve the quality of the final model:&lt;br /&gt;
# To start, point at a spot with many visual landmarks, such as a painting or a photo on the wall. The goal is to end the scan at the same spot, so that MapIt can detect that it has returned to the starting point (a loop closure is then detected). MapIt can then correct motion-sensor errors to optimize the model.&lt;br /&gt;
# Sweep up and down/down and up while turning slowly (a serpentine motion, similar to the &amp;quot;Paint Brush Motion&amp;quot; in this [https://www.youtube.com/watch?v=XqJbFn5rJfE video]).&lt;br /&gt;
# Avoid fast movements.&lt;br /&gt;
# Avoid pointing the camera too close to a surface; stay at least 1 metre away.&lt;br /&gt;
# For a bathtub, sweep above the tub as much as possible to capture its inner sides.&lt;br /&gt;
# By sliding a finger on the screen from bottom to top, you can change the virtual camera&#039;s viewpoint to better see the areas that have not yet been scanned.&lt;br /&gt;
# &#039;&#039;&#039;Make sure the phone is charged above 90% before starting a new scan&#039;&#039;&#039;. Below 50%, the phone switches to power saving, reducing the application&#039;s performance.&lt;br /&gt;
&lt;br /&gt;
== Installation ==&lt;br /&gt;
&amp;lt;span style=&amp;quot;background:#FFFF00&amp;quot;&amp;gt;&amp;lt;b&amp;gt;MapIt is a prototype: it is available for free in exchange for your feedback. [https://docs.google.com/forms/d/e/1FAIpQLSfBkeZXw1CboBONpq05WY6mSkpbLsbH63wbaM376SDTwkfNVw/viewform?usp=sf_link Click here to give your feedback], thank you!&amp;lt;/b&amp;gt;&amp;lt;/span&amp;gt;&lt;br /&gt;
=== Android ===&lt;br /&gt;
* Téléphones supportés (&#039;&#039;&#039;avec Google Tango&#039;&#039;&#039;): Lenovo Phab2Pro, Asus Zenfone AR&lt;br /&gt;
* [https://docs.google.com/forms/d/e/1FAIpQLSflyRp4PcJgk6PhW6sk7Pif3KPQXjygG91VXwC9Rq1XrrG9IA/viewform?usp=sf_link Formulaire pour demander l&#039;App].&lt;br /&gt;
&lt;br /&gt;
=== iOS ===&lt;br /&gt;
* Téléphones supportés (&#039;&#039;&#039;avec LiDAR&#039;&#039;&#039;): iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 2020, iPad Pro 2021&lt;br /&gt;
* [https://docs.google.com/forms/d/e/1FAIpQLSflyRp4PcJgk6PhW6sk7Pif3KPQXjygG91VXwC9Rq1XrrG9IA/viewform?usp=sf_link Formulaire pour demander l&#039;App].&lt;br /&gt;
&lt;br /&gt;
== Tags ==&lt;br /&gt;
When the &amp;quot;Tag&amp;quot; option in the &amp;quot;Loop closure detection&amp;quot; menu is enabled, print one of the tags below and attach it to a wall of the room. To start scanning, point the phone at the tag, then scan the room as described above. At the end, come back in front of the tag so it is detected again; this corrects some errors in the scan. It is important not to move the tag during the scan. &lt;br /&gt;
* 4 tags per page (cut along the dotted lines): &lt;br /&gt;
** [https://introlab.3it.usherbrooke.ca/mediawiki-introlab/images/2/23/Tags1-4.pdf Tags1-4.pdf]&lt;br /&gt;
** [https://introlab.3it.usherbrooke.ca/mediawiki-introlab/images/6/69/Tags5-8.pdf Tags5-8.pdf]&lt;br /&gt;
** [https://introlab.3it.usherbrooke.ca/mediawiki-introlab/images/a/ae/Tags9-12.pdf Tags9-12.pdf]&lt;br /&gt;
** Open the file in Microsoft Edge, Google Chrome or Adobe Reader. Make sure the software you use does not print the tags blurry (Firefox on the left):&lt;br /&gt;
::[[File:impression_tag.jpg|300px]]&lt;br /&gt;
&lt;br /&gt;
If you want to merge several rooms afterwards, it is important to use a different tag for each room. During the last scanning session, visit every tag so that one scan references all the tags relative to each other. For this last session, there is no need to sweep the environment; simply walk directly from one tag to the next, returning to the initial tag at the end.&lt;br /&gt;
&lt;br /&gt;
== &amp;lt;english&amp;gt;Privacy Policy&amp;lt;/english&amp;gt; &amp;lt;french&amp;gt;Politique de confidentialité&amp;lt;/french&amp;gt; ==&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
MapIt app on Google Play Store requires access to the camera to record images that will be used for creating the map. When saving, a database containing these images is created. That database is saved locally on the device (on the SD card under the MapIt folder). MapIt requires read/write access to the MapIt folder only, to save, export and open maps. MapIt doesn’t access any other information outside the MapIt folder. MapIt doesn’t share information over the Internet unless the user explicitly clicks the Share button.&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
L&#039;application MapIt sur Google Play Store requiert l&#039;accès à la caméra pour enregistrer les images utilisées pour créer le modèle. Lors de la sauvegarde, une base de données avec ces images est créée. La base de données est sauvegardée sur l&#039;appareil (sur la carte SD sous le dossier MapIt). MapIt requiert l&#039;accès en écriture et lecture au dossier MapIt seulement, pour sauvegarder, exporter et ouvrir des modèles. MapIt n&#039;accède pas à d&#039;autres informations en dehors du dossier MapIt. MapIt ne partage pas d&#039;information sur Internet à moins que l&#039;utilisateur n&#039;appuie explicitement sur le bouton Partager.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Self-learning videos ==&lt;br /&gt;
* Scanning {{#ev:youtube|_BQpaIplOeU}}&lt;br /&gt;
* Scanning best practices  {{#ev:youtube|CTAEmElyZaw}}&lt;br /&gt;
* Viewing the 3D model on the phone {{#ev:youtube|F-4sU28wgHk}}&lt;br /&gt;
* Finding/Deleting/Renaming/Sharing a saved model {{#ev:youtube|sjdB5EDixDk}}&lt;br /&gt;
&lt;br /&gt;
= MapIt on desktop =&lt;br /&gt;
&amp;lt;span style=&amp;quot;background:#FFFF00&amp;quot;&amp;gt;&amp;lt;b&amp;gt;MapIt is a prototype: it is available for free in exchange for your feedback. [https://docs.google.com/forms/d/e/1FAIpQLSfBkeZXw1CboBONpq05WY6mSkpbLsbH63wbaM376SDTwkfNVw/viewform?usp=sf_link Click here to give your feedback], thank you!&amp;lt;/b&amp;gt;&amp;lt;/span&amp;gt;&lt;br /&gt;
== Installation ==&lt;br /&gt;
=== Windows ===&lt;br /&gt;
Download:&lt;br /&gt;
* Windows 64-bit&lt;br /&gt;
** Installer: [https://docs.google.com/uc?authuser=0&amp;amp;id=1uNyVo-hi1PoopuGvhEzhHoFuN5YwpyGf&amp;amp;export=download MapIt-0.4.3-win64.exe]&lt;br /&gt;
*** Requires [https://www.microsoft.com/en-ca/download/details.aspx?id=48145 Visual C++ Redistributable for Visual Studio 2015]&lt;br /&gt;
*** Requires [https://www.microsoft.com/en-ca/download/details.aspx?id=14632 Visual C++ Redistributable for Visual Studio 2010 x64]&lt;br /&gt;
** ZIP version: [https://docs.google.com/uc?authuser=0&amp;amp;id=1ZSJ16jTmK4mHWTy6mDIVynaKkV5lz8Da&amp;amp;export=download MapIt-0.4.3-win64.zip]&lt;br /&gt;
** [https://drive.google.com/drive/folders/1pWU4laHNWHYHOQUISk7gSvoRkpZH-ZOH?usp=sharing Older versions]&lt;br /&gt;
The installer installs the software like any other program but requires administrator rights on the computer. For the ZIP version, run bin/MapIt.exe to launch the software.&lt;br /&gt;
&amp;lt;center&amp;gt;[[File:MapItWinInstall2.png|300px]] [[File:MapItWinInstall3.png|300px]]&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mac OS X ===&lt;br /&gt;
* [https://docs.google.com/uc?authuser=0&amp;amp;id=13UWY67vSTMn02KVq6l-xwFNPA6Lwz_zi&amp;amp;export=download MapIt-0.4.3-Darwin.dmg].&lt;br /&gt;
** [https://drive.google.com/drive/folders/1pWU4laHNWHYHOQUISk7gSvoRkpZH-ZOH?usp=sharing Older versions]&lt;br /&gt;
* Double-click the DMG file, then drag the MapIt application into the Applications folder. This extracts and copies MapIt into your Applications folder. You can then eject the DMG in Finder. Then use Spotlight (the magnifying glass at the top right of the desktop) to search for MapIt and find the application in the Applications folder.&lt;br /&gt;
&amp;lt;center&amp;gt;[[File:MapItMacInstall.jpeg]]&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* If you get the message &amp;quot;MapIt cannot be opened...&amp;quot; when trying to open MapIt, go to Mac OS X Preferences, select &amp;quot;Security and Privacy&amp;quot;, unlock the padlock at the bottom of the window and click &amp;quot;Open Anyway&amp;quot; next to the message referring to MapIt. Then try opening MapIt again; there should now be an &amp;quot;Open&amp;quot; option in the message box to start MapIt:&lt;br /&gt;
&amp;lt;center&amp;gt;[[File:MapItMacInstallRefused.png|400px]] [[File:MapItMacInstallPref.png|400px]]&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;[[File:MapItMacInstallUnlock.png|400px]]&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Mac OS X Big Sur ====&lt;br /&gt;
On &#039;&#039;&#039;macOS Big Sur&#039;&#039;&#039; (the latest Mac version), MapIt fails to start normally. Start it from a &amp;quot;Terminal&amp;quot; by typing (or copying) this line: &lt;br /&gt;
* &amp;lt;code&amp;gt;QT_MAC_WANTS_LAYER=1 /Applications/MapIt.app/Contents/MacOS/MapIt&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Importing models from the phone ==&lt;br /&gt;
&lt;br /&gt;
First, connect the phone with a USB cable, then make sure the phone is in &amp;quot;File transfer&amp;quot; mode in the drop-down menu at the top of the phone (swipe down from the top of the device to display this menu): &lt;br /&gt;
&amp;lt;center&amp;gt;[[File:MapItUsb1.jpg|225px]] --&amp;gt; [[File:MapItUsb2.jpg|225px]] --&amp;gt; [[File:MapItUsb3.jpg|225px]]&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once in &amp;quot;File transfer&amp;quot; mode, there are two ways to transfer the files:&lt;br /&gt;
&lt;br /&gt;
1) The first time MapIt for desktop is used, it creates a &amp;quot;MapIt&amp;quot; directory in your &amp;quot;My Documents&amp;quot; folder. Click File-&amp;gt;Import and wait a few minutes for the models to be copied into the &amp;quot;MapIt&amp;quot; folder. &lt;br /&gt;
&amp;lt;center&amp;gt;[[File:MapItOpenEnd.jpeg|550px]]&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
2) The second approach uses &amp;lt;strike&amp;gt;File Explorer on Windows or&amp;lt;/strike&amp;gt; [https://www.android.com/filetransfer/ Android File Transfer] on Mac OS X. Open the ASUS_002 device, then navigate to the MapIt folder on the phone. Select and copy the files into a folder on the desktop computer, preferably the &amp;quot;MapIt&amp;quot; folder created in &amp;quot;My Documents&amp;quot; so that the models are loaded automatically when MapIt starts.&lt;br /&gt;
&lt;br /&gt;
== Using MapIt ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|BmSAv9sIaek}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Open MapIt. On first use, the display should be empty. To import models from the phone, see the Import section above. By default, MapIt opens the &amp;quot;MapIt&amp;quot; folder in &amp;quot;My Documents&amp;quot; to display the models. If the models were imported into another folder, use File-&amp;gt;Open to select the folder containing the imported models.&lt;br /&gt;
#* If you don&#039;t have the phone, download this sample model [https://docs.google.com/uc?authuser=0&amp;amp;id=1TjELNu93b7-LrxUv7Mo1mwSxSr03HIGD&amp;amp;export=download marriot_tysons_corner.zip] and unzip it into &amp;quot;My Documents/MapIt&amp;quot;.&lt;br /&gt;
# Double-click a model in the list on the left. The model should appear in the window on the right. You can click &amp;quot;Automatic Measuring&amp;quot; to take measurements automatically.&lt;br /&gt;
# The application tries to find as many measurements as possible based on the planes found in the model. If there are too many, you can remove some with the &amp;quot;Remove a measurement&amp;quot; action.&lt;br /&gt;
# Custom measurements can be added with the other measuring actions. To measure the height of objects, use the &amp;quot;Measure height&amp;quot; action.&lt;br /&gt;
# To measure the distance between objects along the room&#039;s main axes, use the &amp;quot;Measure between two planes&amp;quot; action. &lt;br /&gt;
# To save the measurements, click File-&amp;gt;Save or simply use the &amp;quot;ctrl-s&amp;quot; shortcut.&lt;br /&gt;
# To switch the measurement units to the imperial system (feet/inches), go to Edit-&amp;gt;Preferences.&lt;br /&gt;
# For a &amp;quot;floor plan&amp;quot; view of the room, click the &amp;quot;Plan View&amp;quot; button in the action bar. You can also click Z in the action bar to hide the Z measurements.&lt;br /&gt;
&lt;br /&gt;
== Self-learning videos ==&lt;br /&gt;
* Transferring the 3D model file (.db) to the computer {{#ev:youtube|PPoNWj-2A1o}}&lt;br /&gt;
* Opening a 3D model {{#ev:youtube|H_ulPuZAhhs}}&lt;br /&gt;
* Viewing the 3D model on the computer {{#ev:youtube|1bYBJrO9PLU}}&lt;br /&gt;
* Generating automatic measurements {{#ev:youtube|PQxfJb1dZUc}}&lt;br /&gt;
* Taking custom measurements {{#ev:youtube|csiUai4PVxA}}&lt;br /&gt;
* Saving measurement files {{#ev:youtube|3Y6IS6BjktE}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= What&#039;s new? =&lt;br /&gt;
== v0.4.3 ==&lt;br /&gt;
June 20, 2020&lt;br /&gt;
* Windows/Mac: Added the &amp;quot;mm&amp;quot; unit option in the Preferences menu.&lt;br /&gt;
&lt;br /&gt;
== v0.4.2 ==&lt;br /&gt;
June 18, 2019&lt;br /&gt;
* Windows: Fixed an incompatibility between the desktop application and the latest phone version after import.&lt;br /&gt;
== v0.4.1 ==&lt;br /&gt;
May 28, 2019&lt;br /&gt;
* Added the &amp;quot;Loop closure detection&amp;quot; menu with &amp;quot;Visual&amp;quot; and &amp;quot;Tag&amp;quot; options.&lt;br /&gt;
* Improved scan optimization by also using gravity.&lt;br /&gt;
&lt;br /&gt;
March 12, 2019&lt;br /&gt;
* Added an &amp;quot;Original&amp;quot; measurement session to the choices when opening a scan (no measurements by default).&lt;br /&gt;
* The scan name and the open measurement session (if any) are shown in the window title.&lt;br /&gt;
* The black terminal no longer appears on Windows.&lt;br /&gt;
* Added a &amp;quot;Help&amp;quot; link that redirects to this page.&lt;br /&gt;
* Added an &amp;quot;About&amp;quot; dialog showing the MapIt version and a link to this page.&lt;br /&gt;
* Fixed translation issues on some dialog buttons.&lt;br /&gt;
&lt;br /&gt;
== v0.4.0 ==&lt;br /&gt;
February 13, 2019&lt;br /&gt;
* Added an explanation video for the MapIt desktop application.&lt;br /&gt;
November 12, 2018&lt;br /&gt;
* The Open menu now shows the files compatible with MapIt (*.db).&lt;br /&gt;
* Added the Edit-&amp;gt;Preferences dialog. All saved settings are now in this dialog. Added new settings to configure the number of measurements generated by automatic measuring. &lt;br /&gt;
* When a model is opened, it is auto-aligned with the grid, so all measuring actions are available immediately.&lt;br /&gt;
* Several measurement sessions can be saved in the same model. Use File-&amp;gt;&amp;quot;Save As...&amp;quot; to save the current measurements in a new session. When a model has more than one measurement session, a dialog asks which session to open when the model is opened.&lt;br /&gt;
* You can now zoom with the &amp;quot;+&amp;quot; and &amp;quot;-&amp;quot; keyboard keys.&lt;br /&gt;
* Right-click menu:&lt;br /&gt;
** Actions have been translated into French &lt;br /&gt;
** Only the actions useful to MapIt are shown &lt;br /&gt;
** Fixed the background color change function&lt;br /&gt;
* The &amp;quot;Remove a measurement&amp;quot; button stays active until it is pressed again.&lt;br /&gt;
* Added a &amp;quot;Plan View&amp;quot; button in the action bar. This replaces the Camera-&amp;gt;Ortho mode function.&lt;br /&gt;
&lt;br /&gt;
== v0.3.0 ==&lt;br /&gt;
September 12, 2018&lt;br /&gt;
* First desktop version!&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Scientific publications and communications =&lt;br /&gt;
* Guay, M., Labbé, M., Séguin-Tremblay, N., Auger, C., Goyer, G., Veloza, E., Chevalier, N., Polgar, J. and Michaud, F., “Adapting a Person’s Home in 3D Using a Mobile App (MapIt): Participatory Design Framework Investigating the App’s Acceptability,” in &#039;&#039;JMIR rehabilitation and assistive technologies&#039;&#039;, vol. 8, no. 2, p.e24669, 2021. ([https://rehab.jmir.org/2021/2/e24669 JMIR])&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3418</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3418"/>
		<updated>2021-08-30T14:09:48Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3417</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3417"/>
		<updated>2021-08-18T14:46:31Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 uploaded a new version of File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:IntRoLab.png&amp;diff=3416</id>
		<title>File:IntRoLab.png</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:IntRoLab.png&amp;diff=3416"/>
		<updated>2021-07-14T14:17:13Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3415</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3415"/>
		<updated>2021-06-23T20:49:33Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3406</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3406"/>
		<updated>2020-11-06T18:32:19Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 a téléversé une nouvelle version de File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3405</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3405"/>
		<updated>2020-10-13T17:05:40Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Letd2801 uploaded a new version of File:Weights.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3402</id>
		<title>File:Weights.zip</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=File:Weights.zip&amp;diff=3402"/>
		<updated>2020-10-09T17:50:52Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: TTOP weights&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
TTOP weights&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3318</id>
		<title>SAM</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3318"/>
		<updated>2018-12-09T20:34:22Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;french&amp;gt;&lt;br /&gt;
[https://introlab.3it.usherbrooke.ca/sam Visitez le site de SAM.]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
[https://introlab.3it.usherbrooke.ca/sam Please visit SAM&#039;s website.]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3317</id>
		<title>SAM</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3317"/>
		<updated>2018-12-09T20:34:03Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;french&amp;gt;&lt;br /&gt;
[https://introlab.3it.usherbrooke.ca/sam | Visitez le site de SAM.]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
[https://introlab.3it.usherbrooke.ca/sam | Please visit SAM&#039;s website.]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3316</id>
		<title>SAM</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3316"/>
		<updated>2018-12-09T20:33:07Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;french&amp;gt;&lt;br /&gt;
[[Visitez le site de SAM. | https://introlab.3it.usherbrooke.ca/sam]]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
[[Please visit SAM&#039;s website. | https://introlab.3it.usherbrooke.ca/sam]]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3315</id>
		<title>SAM</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3315"/>
		<updated>2018-12-09T20:32:34Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;french&amp;gt;&lt;br /&gt;
[Visitez le site de SAM. | https://introlab.3it.usherbrooke.ca/sam]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
[Please visit SAM&#039;s website. | https://introlab.3it.usherbrooke.ca/sam]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3314</id>
		<title>SAM</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=SAM&amp;diff=3314"/>
		<updated>2018-12-09T20:32:17Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: Created page with &amp;quot;&amp;lt;french&amp;gt; [Visitez le site de SAM.|https://introlab.3it.usherbrooke.ca/sam] &amp;lt;/french&amp;gt; &amp;lt;english&amp;gt; [Please visit SAM&amp;#039;s website.|https://introlab.3it.usherbrooke.ca/sam] &amp;lt;/english&amp;gt;&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;french&amp;gt;&lt;br /&gt;
[Visitez le site de SAM.|https://introlab.3it.usherbrooke.ca/sam]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
[Please visit SAM&#039;s website.|https://introlab.3it.usherbrooke.ca/sam]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3313</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3313"/>
		<updated>2018-12-09T20:30:40Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* News&lt;br /&gt;
**News|Toutes les nouvelles / All News&lt;br /&gt;
&lt;br /&gt;
* Projets / Projects&lt;br /&gt;
**Projects|Tous les projets / All Projects&lt;br /&gt;
**ADE|DEA&lt;br /&gt;
**AUDIBLE|AUDIBLE&lt;br /&gt;
**Autonomous Robot|Autonomous Robot&lt;br /&gt;
**AZIMUT|AZIMUT&lt;br /&gt;
**CRI|Children Robot Interaction&lt;br /&gt;
**DCD|DCD&lt;br /&gt;
**DDRA|DDRA&lt;br /&gt;
**DRF|DRF&lt;br /&gt;
**EQ|EQ&lt;br /&gt;
**MapIt|MapIt&lt;br /&gt;
**HBBA|HBBA&lt;br /&gt;
**PEXAT|PEXAT&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**SAM|SAM&lt;br /&gt;
**Telerobot|Telerobot&lt;br /&gt;
**Teletrauma|Teletrauma&lt;br /&gt;
**TRInterface|Ego/Exocentric Teleoperation&lt;br /&gt;
**WISS|WISS&lt;br /&gt;
* Open Source&lt;br /&gt;
**8SoundsUSB|8 Inputs USB Sound Card&lt;br /&gt;
**FlowDesigner|FlowDesigner&lt;br /&gt;
**ManyEars|ManyEars&lt;br /&gt;
**MARIE|MARIE&lt;br /&gt;
**OpenECoSys|OpenECoSys&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Find-Object|Find-Object&lt;br /&gt;
**ROS_OpenTLD|ROS OpenTLD&lt;br /&gt;
**ROS4iOS|ROS4iOS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Équipe / Team&lt;br /&gt;
** Team|Toute l&#039;équipe / All Team&lt;br /&gt;
&lt;br /&gt;
* Infrastructure&lt;br /&gt;
**Infrastructure|Infrastructure de laboratoire / Lab Infrastructure&lt;br /&gt;
&lt;br /&gt;
* Publications&lt;br /&gt;
** Publications|Toutes les publications / All publications&lt;br /&gt;
&lt;br /&gt;
* Information&lt;br /&gt;
** Information|Information du laboratoire / Lab Information&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3312</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3312"/>
		<updated>2018-11-29T13:55:47Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
{|  class=&amp;quot;wikitable&amp;quot;  style=&amp;quot;width:100%; height:200px; text-align:left;&amp;quot; border=&amp;quot;0&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&amp;lt;french&amp;gt;&amp;lt;big&amp;gt;IntRoLab - Laboratoire de robotique intelligente / interactive / intégrée / interdisciplinaire &amp;lt;/big&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Institut interdisciplinaire d&#039;innovation technologique [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Téléphone : 819 821-8000 poste 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contactez-nous!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Accès routier] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;IntRoLab - Intelligent / Interactive / Integrated / Interdisciplinary Robot Lab &amp;lt;/big&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Interdisciplinary Institute for Technological Innovation [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Phone :  819 821-8000 ext. 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contact us!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Map] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;Janvier 2018 - [https://github.com/introlab/odas ODAS : Open embeddeD Audition System]&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;January 2018 - [https://github.com/introlab/odas ODAS : Open embeddeD Audition System]&amp;lt;/english&amp;gt;&lt;br /&gt;
{{#ev:youtube|n7y2rLAnd5I}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
[[Image:Twitter.jpeg|50px|link=https://twitter.com/introlab/]] &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
{| style=&amp;quot;float: right;&amp;quot;&lt;br /&gt;
| [https://introlab.3it.usherbrooke.ca/introlab-secure/ Intranet]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=ROS_OpenTLD&amp;diff=3311</id>
		<title>ROS OpenTLD</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=ROS_OpenTLD&amp;diff=3311"/>
		<updated>2018-11-28T19:18:40Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|cVdi4oIKqeU}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
= Author = &lt;br /&gt;
* Ronan Chauvin&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
= Auteur =&lt;br /&gt;
* Ronan Chauvin&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
= Description = &lt;br /&gt;
ROS_OpenTLD is a ROS version of the OpenTLD tracker.&lt;br /&gt;
OpenTLD is a C++ implementation of TLD Predator (Tracking, Learning and Detection) implemented by the AIT (Austrian Institute of Technology), originally published in Matlab by Zdenek Kalal. OpenTLD is used for tracking objects in video streams. It doesn&#039;t need any training data and is also able to load predefined models (http://gnebehay.github.com/OpenTLD/).&lt;br /&gt;
&lt;br /&gt;
The ROS implementation consists of two nodes: a tracker node, which uses the opentld library, and an interface node that lets you select a bounding box, start and stop the tracking, start and stop the learning, import or export a model, clear the background, and change the tracker&#039;s method.&lt;br /&gt;
In the two launch files, you can configure the input video stream. In the tracker launch file, you can configure the bounding box source, the default bounding box if there is one, the model that you may want to load and its path, automatic face detection with the OpenCV cascade classifier, and some other parameters.&lt;br /&gt;
Like OpenTLD, ROS_OpenTLD is published under the terms of the GNU General Public License.&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
= Description = &lt;br /&gt;
ROS_OpenTLD  est une version ROS du logiciel de suivi OpenTLD.&lt;br /&gt;
OpenTLD est une implémentation C++ de l’algorithme de suivi TLD Predator (Tracking, Learning and Detection) implémenté par le AIT (Austrian Institute of Technology), qui a été publiée à l’origine sur Matlab par Zdenek Kalal. OpenTLD est utilisé pour suivre des objets sur des flux vidéo. Il ne nécessite pas de données d’entraînement et est aussi capable de charger des modèles préenregistrés (http://gnebehay.github.com/OpenTLD/).&lt;br /&gt;
&lt;br /&gt;
L’implémentation ROS consiste en deux nœuds : un nœud de suivi qui utilise la librairie opentld et un nœud d’interface utilisateur permettant de sélectionner la cible (rectangle), démarrer ou arrêter le suivi, démarrer ou arrêter l’apprentissage, importer ou exporter un modèle, effacer l’arrière-plan et changer la méthode de suivi.&lt;br /&gt;
Dans les deux fichiers « .launch », vous pouvez configurer le topic du flux vidéo. Dans le fichier « .launch » du nœud de suivi, vous pouvez configurer le topic approvisionnant le nœud en cibles, une cible par défaut s’il y en a une, le modèle à charger et son chemin d’accès, ainsi que d’autres paramètres.&lt;br /&gt;
Comme OpenTLD, ROS_OpenTLD est publié sous les termes de la licence publique générale GNU.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
= Source Code =&lt;br /&gt;
You can get the source code on GitHub : &lt;br /&gt;
* http://github.com/Ronan0912/ros_opentld&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
= Code Source =&lt;br /&gt;
Vous pouvez obtenir le code source sur GitHub : &lt;br /&gt;
* http://github.com/Ronan0912/ros_opentld&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
= Applications =&lt;br /&gt;
There are many possible applications in robotics. For example, ROS_OpenTLD has already been used to track cars from a ROS version of the Parrot AR Drone. For our part, we used the algorithm to track a target (a head or a body) and to follow it with a mobile robotic platform.&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
= Applications =&lt;br /&gt;
Il y a beaucoup d’applications possibles en robotique. Par exemple, ROS_OpenTLD a déjà été utilisé pour suivre des voitures à partir d’une version fonctionnant sous ROS de l’AR Drone de Parrot. De notre côté, nous avons utilisé l’algorithme dans le but de suivre une cible (une tête ou une silhouette) et de suivre physiquement cette cible avec une plateforme robotique mobile.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;= Videos =&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;= Vidéos = &amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ELA1yemmshE}}&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=WISS&amp;diff=3310</id>
		<title>WISS</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=WISS&amp;diff=3310"/>
		<updated>2018-11-28T19:16:53Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: /* Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Description = &lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
Artificial audition has recently become popular in mobile robotics as a way to enhance human-robot interaction. Speech recognition is the main field of interest, whereas speaker recognition receives little attention. The ManyEars system (based on the [[AUDIBLE]] project) allows a mobile robot to localize, track and separate multiple simultaneous sound sources, using an array of eight microphones arranged in a cubic shape. The speaker recognition system presented here, named WISS (Who IS Speaking), is coupled with ManyEars and is robust to noise and to dynamic environments. Parallel model combination (PMC) and masks are used to increase the identification rate in noisy environments, and a confidence value is introduced to weight the obtained identifications. The simplicity of this system makes it suitable for real-time applications on a General Purpose Processor (GPP).&lt;br /&gt;
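The core PMC idea can be sketched as follows. This is only a rough illustration of the principle, not the actual WISS code: clean-speech and noise models, expressed here as per-channel log-spectral energies, are combined in the linear energy domain to approximate a model of noisy speech. The channel values and the gain parameter below are invented for the example.

```python
import math

def pmc_combine(clean_log_spec, noise_log_spec, gain=1.0):
    """Simplified parallel model combination: convert per-channel
    log-spectral energies to the linear domain, add the clean and
    (gain-scaled) noise energies, then return to the log domain."""
    return [
        math.log(math.exp(c) + gain * math.exp(n))
        for c, n in zip(clean_log_spec, noise_log_spec)
    ]

# Hypothetical per-channel log energies for a clean model and a noise model.
clean = [2.0, 1.5, 0.5]
noise = [0.0, 0.0, 1.0]
noisy = pmc_combine(clean, noise)
```

Because energies add in the linear domain, each combined channel is always at least as large as both the clean and the noise channel, which matches the intuition that noise can only raise the observed energy.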
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
L&#039;audition artificielle est de plus en plus utilisée en robotique mobile pour améliorer l&#039;interaction humain-robot. La reconnaissance de la parole occupe présentement une place importante tandis que la reconnaissance de locuteurs est encore peu explorée pour ce genre d&#039;application. Le système ManyEars  (qui découle du projet [[AUDIBLE]]) permet actuellement à un robot mobile de localiser, suivre et séparer plusieurs sources sonores. Ce système utilise un ensemble de huit microphones qui sont disposés en cube. Ce système de reconnaissance de locuteurs, nommé WISS (Who IS Speaking ), est couplé au système ManyEars. Le système de reconnaissance de locuteurs conçu est robuste au bruit ambiant et au changement d&#039;environnement. Une technique de parallel model combination (PMC) et des masques sont utilisés pour améliorer le taux d&#039;identification dans un milieu bruité. Un indice de confiance est également introduit pour pondérer les identifications obtenues. La simplicité du système proposé fait en sorte qu&#039;il est possible d&#039;exécuter en temps réel l&#039;algorithme sur un General Purpose Processor (GPP).&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:ManyEars_Overview_v2.png]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&#039;&#039;&#039;Fig. 1&#039;&#039;&#039;. Blocks diagram of the ManyEars system&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&#039;&#039;&#039;Fig. 1&#039;&#039;&#039;. Schéma-bloc du système ManyEars&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:Wiss_entrainement_v4.png]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&#039;&#039;&#039;Fig. 2&#039;&#039;&#039;. Blocks diagram of the WISS system for training&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&#039;&#039;&#039;Fig. 2&#039;&#039;&#039;. Schéma-bloc du système WISS pour l&#039;entraînement&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:Wiss_evaluation_v3.png]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&#039;&#039;&#039;Fig. 3&#039;&#039;&#039;. Blocks diagram of the WISS system for identification&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&#039;&#039;&#039;Fig. 3&#039;&#039;&#039;. Schéma-bloc du système WISS pour la reconnaissance&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;= Team =&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;= Équipe =&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* [[User:François_Grondin | François Grondin, jr.eng.]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud, eng., Ph.D.] &lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* [[User:François_Grondin | François Grondin, ing.jr.]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud, ing., Ph.D.] &lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;= Source code =&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;= Code source =&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;This project is open source and available for the Matlab/Octave environments on [https://github.com/introlab/WISS GitHub].&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;Le code source de ce projet est ouvert et disponible pour les environnements Matlab/Octave sur [https://github.com/introlab/WISS GitHub].&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;= Videos =&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;= Vidéos =&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{{#ev:youtube|Acfxl3oqg90}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Publications =&lt;br /&gt;
#Grondin, F., Michaud, F. (2012), &amp;quot;WISS, a Speaker Identification System for Mobile Robots,&amp;quot; Proceedings of the International Conference on Robotics and Automation: 1817-1822 ([[Media:Grondin2012wiss.pdf|pdf]]) &lt;br /&gt;
#Grondin, F., Reconnaissance de locuteurs pour robot mobile, Mémoire de maîtrise, Département de génie électrique et de génie informatique, Université de Sherbrooke. ([[Media:MemoireGrondin.pdf|pdf]])&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=TRInterface&amp;diff=3309</id>
		<title>TRInterface</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=TRInterface&amp;diff=3309"/>
		<updated>2018-11-28T19:16:03Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Egocentric and Exocentric Teleoperation Interface with 3D Video Projection =&lt;br /&gt;
&lt;br /&gt;
== Description ==&lt;br /&gt;
[[File:trinterface.jpg|thumb|300px]]&lt;br /&gt;
&lt;br /&gt;
Following the original Telerobot project, we began development on a novel 3D&lt;br /&gt;
interface for teleoperated navigation tasks.&lt;br /&gt;
This interface combines, in real time:&lt;br /&gt;
&lt;br /&gt;
* An extrusion of a SLAM-built 2D map of the environment&lt;br /&gt;
* A laser-based surface projection of a 2D video feed&lt;br /&gt;
* A 3D projection of a colored point cloud built from a stereoscopic camera&lt;br /&gt;
* A CAD-based 3D model of the robot&lt;br /&gt;
&lt;br /&gt;
With this interface, the user can seamlessly transition from egocentric to&lt;br /&gt;
exocentric viewpoints by moving the virtual camera around the controlled&lt;br /&gt;
robot.&lt;br /&gt;
As in modern third-person 3D video games, the user can set their viewpoint&lt;br /&gt;
to best suit the task at hand, such as a top-down view to navigate tight spaces or&lt;br /&gt;
a straight-ahead view to communicate with people.&lt;br /&gt;
&lt;br /&gt;
== Future work ==&lt;br /&gt;
A new, ROS-compatible, open-source implementation that takes advantage of&lt;br /&gt;
modern sensors such as the Kinect is currently in development.&lt;br /&gt;
&lt;br /&gt;
Follow the project repository [http://github.com/francoisferland/rd] for details.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
&lt;br /&gt;
A few test sequences done in and around the laboratory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| width=&amp;quot;80%&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|{{#ev:youtube|75yjelCkDno}}&lt;br /&gt;
|{{#ev:youtube|r9kSAwziHJ8}}&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Related Publications ==&lt;br /&gt;
#Ferland, F., Létourneau, D., Aumont, A., Frémy, J., Legault, M.-A., Lauria, M., Michaud, F. (2012), &amp;quot;Natural interaction design of a humanoid robot,&amp;quot; Journal of Human-Robot Interaction, 1 (2), 118-134, [http://www.humanrobotinteraction.org/journal/index.php/HRI/article/view/65].&lt;br /&gt;
#Ferland, F., Pomerleau, F., Le Dinh, C.T., Michaud, F. (2009), &amp;quot;Egocentric and exocentric teleoperation interface using real-time, 3D video projection&amp;quot;, &#039;&#039;Proceedings ACM/IEEE International Conference on Human-Robot Interaction&#039;&#039;, March. ([http://laborius.gel.usherbrooke.ca/papers/HRI2009.pdf pdf])&lt;br /&gt;
#Pomerleau, F., Colas, F., Ferland, F., Michaud, F. (2009), &amp;quot;Kd-ICP for fast and robust map registration&amp;quot;, &#039;&#039;Proceedings Field and Services Robotics&#039;&#039;, July. ([http://laborius.gel.usherbrooke.ca/papers/FSR2009.pdf pdf])&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3308</id>
		<title>RTAB-Map</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3308"/>
		<updated>2018-11-28T19:12:27Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;&amp;lt;english&amp;gt;[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Real-Time Appearance-Based Mapping&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Cartographie temps réel basée sur l&#039;apparence de l&#039;environnement &amp;lt;/french&amp;gt;&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;This page is about the loop closure detection approach used by RTAB-Map. For RGB-D mapping, visit [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
Loop closure detection is the process involved when trying to find a match between the current and a previously visited locations in SLAM (Simultaneous Localization And Mapping). &lt;br /&gt;
As the internal map grows, the time required to process each new observation increases, which can compromise real-time processing. &lt;br /&gt;
RTAB-Map is a novel real-time loop closure detection approach for large-scale and long-term SLAM. Our approach is based on efficient memory management that keeps the computation time for each new observation under a fixed time limit, thus respecting the real-time constraint during long-term operation. Results demonstrate the approach&#039;s adaptability and scalability using two custom data sets and ten standard data sets.&lt;br /&gt;
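The memory-management idea can be sketched as follows. This is an illustrative simplification, not RTAB-Map's actual implementation: the dictionaries, the `max_wm_size` bound and the revisit-weight heuristic are invented for the example. The point is that loop closure detection only compares the current observation against a bounded working memory, with less-often-revisited locations transferred to long-term memory.

```python
# Illustrative sketch of bounded-memory management for loop closure
# detection (names and heuristics are invented, not RTAB-Map's real code).

def manage_memory(working_memory, long_term_memory, max_wm_size):
    """Transfer the lowest-weight (least often revisited) locations from
    working memory to long-term memory so that each update only compares
    against a fixed number of candidate locations."""
    while len(working_memory) > max_wm_size:
        # Pick the location with the smallest weight (fewest revisits),
        # breaking ties by oldest location id.
        loc_id = min(working_memory, key=lambda i: (working_memory[i], i))
        long_term_memory[loc_id] = working_memory.pop(loc_id)

# Each entry maps a location id to a "weight" (how often it was revisited).
wm = {1: 5, 2: 1, 3: 3, 4: 1, 5: 2}
ltm = {}
manage_memory(wm, ltm, max_wm_size=3)
# wm now keeps the three most-revisited locations; 2 and 4 moved to ltm.
```

Because the working-memory size is bounded, the per-observation comparison cost no longer grows with the total map size, which is what keeps the update under a fixed time budget.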
&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;Cette page est à propos de l&#039;approche de détection de fermeture de boucle utilisée dans RTAB-Map. Pour la cartographie RGB-D, visitez [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
La détection de fermeture de boucle est le processus impliqué en SLAM (localisation et cartographie simultanées) lorsqu&#039;on tente de trouver une correspondance entre un endroit présent et un autre déjà visité. Plus la carte interne augmente en taille, plus le temps requis pour la détection de fermeture de boucle augmente, ce qui peut affecter le traitement en temps réel. RTAB-Map est une nouvelle approche de détection de fermeture de boucle fonctionnant en temps réel pour du SLAM à grande échelle et à long terme. Notre approche est basée sur une gestion efficace de la mémoire afin de garder le temps de calcul en dessous d&#039;un seuil de temps, respectant ainsi la limite de temps réel à long terme. En utilisant dix ensembles de données standards, notre propre ensemble de données dérivées d&#039;un parcours de plus de 2 km rassemblant des conditions diverses et notre ensemble de données montrant un parcours où le robot visite les mêmes endroits une centaine de fois, les résultats démontrent l&#039;adaptabilité et l&#039;extensibilité de notre approche.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{{#ev:youtube|71eRxTc1DaU}}&lt;br /&gt;
{{#ev:youtube|CAk-QGMlQmI}}&lt;br /&gt;
{{#ev:youtube|AMLwjo80WzI}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Results ==&lt;br /&gt;
&#039;&#039;Note that these results (more recent) may differ from those in the presentation video above...&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Summary of the loop closures detected on the UdeS data set:&lt;br /&gt;
* Green: Loop closures detected&lt;br /&gt;
* Yellow: Loop closures rejected&lt;br /&gt;
* Red: Unable to detect a loop closure because old places could not be retrieved&lt;br /&gt;
&lt;br /&gt;
Figure 2: Processing time for each image acquired (real-time limit fixed to 700 ms for an image rate of 1 Hz)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall at 100% precision)&lt;br /&gt;
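For reference, precision and recall over a loop closure ground truth can be computed as sketched below. This is a generic illustration; the tiny detection and ground-truth sets are invented, not taken from the UdeS results.

```python
def precision_recall(detections, ground_truth):
    """detections and ground_truth are sets of (current, matched) image
    index pairs. Precision = correct detections / all detections;
    recall = correct detections / all true loop closures."""
    true_pos = len(detections & ground_truth)
    precision = true_pos / len(detections) if detections else 1.0
    recall = true_pos / len(ground_truth) if ground_truth else 1.0
    return precision, recall

# Toy example: 3 detections, all correct, out of 6 true loop closures.
gt = {(10, 2), (11, 3), (12, 4), (13, 5), (14, 6), (15, 7)}
det = {(10, 2), (12, 4), (14, 6)}
p, r = precision_recall(det, gt)  # p == 1.0, r == 0.5
```

A result such as "48% recall at 100% precision" means that no false loop closure was accepted while 48% of the true loop closures were detected.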
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduce the loop closure detection results&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visit the [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] wiki page on [http://github.com/introlab/rtabmap/wiki RTAB-Map&#039;s GitHub]. The ground truths can be downloaded below.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Videos&#039;&#039;&#039;&lt;br /&gt;
* Newer:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/center&amp;gt;&lt;br /&gt;
* Older:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Résultats ==&lt;br /&gt;
&#039;&#039;À noter que les résultats (plus récents) présentés ici peuvent différer de ceux dans le vidéo...&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Sommaire des détections de fermeture de boucle sur l&#039;ensemble de données UdeS :&lt;br /&gt;
* Vert : Fermetures de boucle acceptées&lt;br /&gt;
* Jaune : Fermetures de boucle rejetées &lt;br /&gt;
* Rouge : Impossibilité de détecter une fermeture de boucle car les anciens endroits n&#039;ont pu être remémorisés&lt;br /&gt;
&lt;br /&gt;
Figure 2: Temps d&#039;exécution pour chaque itération (limite temps réel fixée à 700 ms pour un temps d&#039;acquisition de 1 seconde)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall à 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduire les résultats de détection de boucles&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visitez la page wiki [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] sur le [http://github.com/introlab/rtabmap/wiki GitHub de RTAB-Map]. Les &amp;quot;ground truths&amp;quot; peuvent être téléchargés en bas de la page.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Vidéos&#039;&#039;&#039;&lt;br /&gt;
* Nouveaux:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/center&amp;gt;&lt;br /&gt;
* Anciens:&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Source code ==&lt;br /&gt;
The code was tested on Windows (XP, 7), Mac OS X 10.6 and Ubuntu 10.04 LTS.&lt;br /&gt;
* Standalone application, libraries and ROS packages : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images acquired in Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Code source ==&lt;br /&gt;
Le code a été testé sur Windows (XP, 7), Mac OS X 10.6 et Ubuntu 10.04 LTS.&lt;br /&gt;
* Logiciel &amp;quot;stand-alone&amp;quot;, bibliothèques logicielles et noeuds ROS : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images provenant de Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Data sets ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images at 1 Hz (1.5 hours). &lt;br /&gt;
* Images taken while walking a loop of ~2 km, traversed twice.&lt;br /&gt;
* The data set contains indoor and outdoor environments.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 on Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|Rain!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Compare illumination and camera orientation with the next image...&lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Elevator door...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images at 1 Hz (7 hours).&lt;br /&gt;
* Images taken from the racing video game Need For Speed: Most Wanted.&lt;br /&gt;
* Two areas each visited about a hundred times (100 traversals in area 1, then 102 traversals in area 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Compare illumination with the next image...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Community&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Community data sets from other loop closure detection approaches :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor and Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege and CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa and Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
Ground truths:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images at 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images at 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images at ~1 Hz (Note that we removed some images from the original data set to obtain an image rate of approximately 1 Hz)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images at 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images at 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images at 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images at ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images at ~1 Hz&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Ensembles de données ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images à 1 Hz (1,5 heures).&lt;br /&gt;
* Images prises en marchant sur un trajet de ~2 km, parcouru deux fois.&lt;br /&gt;
* L&#039;ensemble de données contient des images prises à l&#039;intérieur et à l&#039;extérieur.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 sur Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|De la pluie!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Comparer l&#039;illumination et l&#039;orientation de la caméra avec l&#039;image suivante... &lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Porte d&#039;ascenseur...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images à 1 Hz (7 heures).&lt;br /&gt;
* Images prises dans le jeu vidéo de course Need For Speed: Most Wanted.&lt;br /&gt;
* Deux zones ont été visitées une centaine de fois chacune (100 boucles dans la zone 1, puis 102 boucles dans la zone 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Comparer l&#039;illumination avec l&#039;image suivante...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Communauté&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Ensembles de données provenant d&#039;autres approches de détection de fermeture de boucle :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor et Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege et CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa et Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Ground truths&#039;&#039;:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images à ~0.5 Hz (les images de gauche et de droite fusionnées)&lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images à ~0.5 Hz (les images de gauche et de droite fusionnées) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images à 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images à 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images à ~1 Hz (Noter que nous avons enlevé des images de l&#039;ensemble de données original pour avoir une fréquence d&#039;acquisition d&#039;images d&#039;environ 1 Hz.)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images à 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images à 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images à 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images à ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images à ~1 Hz&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
#M. Labbé and F. Michaud, “Long-term online multi-session graph-based SPLAM with memory management,” in &#039;&#039;Autonomous Robots&#039;&#039;, accepted, 2017. ([[Media:LabbeAURO2017.pdf|pdf]]) ([http://dx.doi.org/10.1007/s10514-017-9682-5 Springer])&lt;br /&gt;
#M. Labbé and F. Michaud, “Online Global Loop Closure Detection for Large-Scale Multi-Session Graph-Based SLAM,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 2014. ([[Media:Labbe14-IROS.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/login.jsp?tp=&amp;amp;arnumber=6942926 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud., F. (2013), “Appearance-based loop closure detection in real-time for large-scale and long-term operation,” &#039;&#039;IEEE Transactions on Robotics&#039;&#039;, vol. 29, no. 3, pp. 734-745. ([[Media:TRO2013.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6459608 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud, F. (2011), “Memory management for real-time appearance-based loop closure detection,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;. ([[Media:labbe11memory.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6094602 IEEE Xplore])&lt;br /&gt;
&lt;br /&gt;
==== Presentations ====&lt;br /&gt;
* M. Labbé, &amp;quot;Simultaneous Localization and Mapping (SLAM) with RTAB-Map&amp;quot;, Université Laval, Québec, November 2015 ([[Media:Labbe2015ULaval.pdf|slides pdf]])&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Team ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Équipe ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3307</id>
		<title>RTAB-Map</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3307"/>
		<updated>2018-11-28T19:10:28Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;&amp;lt;english&amp;gt;[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Real-Time Appearance-Based Mapping&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Cartographie temps réel basée sur l&#039;apparence de l&#039;environnement &amp;lt;/french&amp;gt;&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;This page is about the loop closure detection approach used by RTAB-Map. For RGB-D mapping, visit [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
Loop closure detection is the process involved when trying to find a match between the current and a previously visited locations in SLAM (Simultaneous Localization And Mapping). &lt;br /&gt;
As the internal map grows, the time required to process each new observation increases, which may compromise real-time processing. &lt;br /&gt;
RTAB-Map is a novel real-time loop closure detection approach for large-scale and long-term SLAM. Our approach is based on efficient memory management that keeps the computation time for each new observation under a fixed time limit, thus respecting the real-time constraint during long-term operation. Results demonstrate the approach&#039;s adaptability and scalability on two custom data sets and ten standard data sets.&lt;br /&gt;
&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;Cette page est à propos de l&#039;approche de détection de fermeture de boucle utilisée dans RTAB-Map. Pour la cartographie RGB-D, visitez [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
La détection de fermeture de boucle est le processus impliqué en SLAM (localisation et cartographie simultanées) lorsqu&#039;on tente de trouver une correspondance entre un endroit présent et un autre déjà visité. Plus la carte interne augmente en taille, plus le temps requis pour la détection de fermeture de boucle augmente, ce qui peut affecter le traitement en temps réel. RTAB-Map est une nouvelle approche de détection de fermeture de boucle fonctionnant en temps réel pour du SLAM à grande échelle et à long terme. Notre approche est basée sur une gestion efficace de la mémoire afin de garder le temps de calcul en dessous d&#039;un seuil de temps, respectant ainsi la limite de temps réel à long terme. En utilisant dix ensembles de données standards, notre propre ensemble de données dérivées d&#039;un parcours de plus de 2 km rassemblant des conditions diverses et notre ensemble de données montrant un parcours où le robot visite les mêmes endroits une centaine de fois, les résultats démontrent l&#039;adaptabilité et l&#039;extensibilité de notre approche.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=71eRxTc1DaU&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=CAk-QGMlQmI&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=AMLwjo80WzI&amp;lt;/youtube&amp;gt;&lt;br /&gt;
{{#ev:youtube|71eRxTc1DaU}}&lt;br /&gt;
{{#ev:youtube|CAk-QGMlQmI}}&lt;br /&gt;
{{#ev:youtube|AMLwjo80WzI}}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Results ==&lt;br /&gt;
&#039;&#039;Note that these more recent results may differ from those in the presentation videos above.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Summary of the loop closures detected on the UdeS data set:&lt;br /&gt;
* Green : Loop closures detected&lt;br /&gt;
* Yellow : Loop closures rejected&lt;br /&gt;
* Red : Unable to detect a loop closure because old places could not be retrieved&lt;br /&gt;
&lt;br /&gt;
Figure 2: Processing time for each image acquired (real-time limit fixed to 700 ms for an image rate of 1 Hz)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall at 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduce the loop closure detection results&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visit the [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] wiki page on [http://github.com/introlab/rtabmap/wiki RTAB-Map&#039;s GitHub]. The ground truths can be downloaded below.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Videos&#039;&#039;&#039;&lt;br /&gt;
* Newer:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
* Older:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Résultats ==&lt;br /&gt;
&#039;&#039;À noter que ces résultats plus récents peuvent différer de ceux présentés dans les vidéos ci-dessus.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Sommaire des détections de fermeture de boucle sur l&#039;ensemble de données UdeS :&lt;br /&gt;
* Vert : Fermetures de boucle acceptées&lt;br /&gt;
* Jaune : Fermetures de boucle rejetées &lt;br /&gt;
* Rouge : Impossibilité de détecter une fermeture de boucle car les anciens endroits n&#039;ont pu être récupérés en mémoire&lt;br /&gt;
&lt;br /&gt;
Figure 2: Temps d&#039;exécution pour chaque itération (limite temps réel fixée à 700 ms pour un temps d&#039;acquisition de 1 seconde)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall à 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduire les résultats de détection de boucles&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visitez la page wiki [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] sur le [http://github.com/introlab/rtabmap/wiki GitHub de RTAB-Map]. Les &amp;quot;ground truths&amp;quot; peuvent être téléchargés en bas de la page.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Vidéos&#039;&#039;&#039;&lt;br /&gt;
* Nouveaux:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
* Anciens:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Source code ==&lt;br /&gt;
The code was tested on Windows (XP, 7), Mac OS X 10.6 and Ubuntu 10.04 LTS.&lt;br /&gt;
* Standalone application, libraries and ROS packages : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images acquired in Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Code source ==&lt;br /&gt;
Le code a été testé sur Windows (XP, 7), Mac OS X 10.6 et Ubuntu 10.04 LTS. &lt;br /&gt;
* Logiciel &amp;quot;stand-alone&amp;quot;, bibliothèques logicielles et noeuds ROS : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images provenant de Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Data sets ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images at 1 Hz (1.5 hours). &lt;br /&gt;
* Images taken while walking a ~2 km loop, traversed twice.&lt;br /&gt;
* The data set contains indoor and outdoor environments.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 on Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|Rain!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Compare illumination and camera orientation with the next image...&lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Elevator door...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images at 1 Hz (7 hours).&lt;br /&gt;
* Images taken from the racing video game Need For Speed: Most Wanted.&lt;br /&gt;
* Two areas visited about a hundred times each (100 traversals of area 1, followed by 102 traversals of area 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Compare illumination with the next image...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Community&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Community data sets from other loop closure detection approaches :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor and Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege and CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa and Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
Ground truths:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images at 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images at 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images at ~1 Hz (Note that we removed some images from the original data set to obtain an image rate of approximately 1 Hz.)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images at 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images at 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images at 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images at ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images at ~1 Hz&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Ensembles de données ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images à 1 Hz (1,5 heures).&lt;br /&gt;
* Images prises en marchant sur un trajet de ~2 km, parcouru deux fois.&lt;br /&gt;
* L&#039;ensemble de données contient des images prises à l&#039;intérieur et à l&#039;extérieur.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 sur Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|De la pluie!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Comparer l&#039;illumination et l&#039;orientation de la caméra avec l&#039;image suivante... &lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Porte d&#039;ascenseur...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images à 1 Hz (7 heures).&lt;br /&gt;
* Images prises dans le jeu vidéo de course Need For Speed: Most Wanted.&lt;br /&gt;
* 2 zones visitées une centaine de fois chacune (100 boucles dans la zone 1, puis 102 boucles dans la zone 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Comparer l&#039;illumination avec l&#039;image suivante...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Communauté&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Ensembles de données provenant d&#039;autres approches de détection de fermeture de boucle :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor et Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege et CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ images omnidirectionnelles NewCollege]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa et Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Ground truths&#039;&#039;:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images à ~0.5 Hz (les images de gauche et de droite fusionnées)&lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images à ~0.5 Hz (les images de gauche et de droite fusionnées) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images à 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images à 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images à ~1 Hz (Noter que nous avons enlevé des images de l&#039;ensemble de données original pour avoir une fréquence d&#039;acquisition d&#039;images d&#039;environ 1 Hz.)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images à 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images à 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images à 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images à ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images à ~1 Hz&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
#M. Labbé and F. Michaud, “Long-term online multi-session graph-based SPLAM with memory management,” in &#039;&#039;Autonomous Robots&#039;&#039;, accepted, 2017. ([[Media:LabbeAURO2017.pdf|pdf]]) ([http://dx.doi.org/10.1007/s10514-017-9682-5 Springer])&lt;br /&gt;
#M. Labbé and F. Michaud, “Online Global Loop Closure Detection for Large-Scale Multi-Session Graph-Based SLAM,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 2014. ([[Media:Labbe14-IROS.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/login.jsp?tp=&amp;amp;arnumber=6942926 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud., F. (2013), “Appearance-based loop closure detection in real-time for large-scale and long-term operation,” &#039;&#039;IEEE Transactions on Robotics&#039;&#039;, vol. 29, no. 3, pp. 734-745. ([[Media:TRO2013.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6459608 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud, F. (2011), “Memory management for real-time appearance-based loop closure detection,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;. ([[Media:labbe11memory.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6094602 IEEE Xplore])&lt;br /&gt;
&lt;br /&gt;
==== Presentations ====&lt;br /&gt;
* M. Labbé, &amp;quot;Simultaneous Localization and Mapping (SLAM) with RTAB-Map&amp;quot;, Université Laval, Québec, November 2015 ([[Media:Labbe2015ULaval.pdf|slides pdf]])&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Team ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Équipe ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3306</id>
		<title>RTAB-Map</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=RTAB-Map&amp;diff=3306"/>
		<updated>2018-11-28T19:08:50Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;&amp;lt;english&amp;gt;[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Real-Time Appearance-Based Mapping&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;[[Image:RTAB-Map.png|link=http://introlab.github.io/rtabmap|RTAB-Map]] RTAB-Map : Cartographie temps réel basée sur l&#039;apparence de l&#039;environnement &amp;lt;/french&amp;gt;&amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;This page is about the loop closure detection approach used by RTAB-Map. For RGB-D mapping, visit [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
Loop closure detection is the process involved when trying to find a match between the current and a previously visited locations in SLAM (Simultaneous Localization And Mapping). &lt;br /&gt;
As the internal map grows, the time required to process each new observation increases, which may compromise real-time processing. &lt;br /&gt;
RTAB-Map is a novel real-time loop closure detection approach for large-scale and long-term SLAM. Our approach is based on efficient memory management that keeps the computation time for each new observation under a fixed time limit, thus respecting the real-time constraint during long-term operation. Results demonstrate the approach&#039;s adaptability and scalability on two custom data sets and ten standard data sets.&lt;br /&gt;
&amp;lt;/english&amp;gt;&amp;lt;french&amp;gt;&lt;br /&gt;
== Description ==&lt;br /&gt;
&#039;&#039;&#039;Cette page est à propos de l&#039;approche de détection de fermeture de boucle utilisée dans RTAB-Map. Pour la cartographie RGB-D, visitez [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
La détection de fermeture de boucle est le processus impliqué en SLAM (localisation et cartographie simultanées) lorsqu&#039;on tente de trouver une correspondance entre un endroit présent et un autre déjà visité. Plus la carte interne augmente en taille, plus le temps requis pour la détection de fermeture de boucle augmente, ce qui peut affecter le traitement en temps réel. RTAB-Map est une nouvelle approche de détection de fermeture de boucle fonctionnant en temps réel pour du SLAM à grande échelle et à long terme. Notre approche est basée sur une gestion efficace de la mémoire afin de garder le temps de calcul en dessous d&#039;un seuil de temps, respectant ainsi la limite de temps réel à long terme. En utilisant dix ensembles de données standards, notre propre ensemble de données dérivées d&#039;un parcours de plus de 2 km rassemblant des conditions diverses et notre ensemble de données montrant un parcours où le robot visite les mêmes endroits une centaine de fois, les résultats démontrent l&#039;adaptabilité et l&#039;extensibilité de notre approche.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=71eRxTc1DaU&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=CAk-QGMlQmI&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=AMLwjo80WzI&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:youtube|71eRxTc1DaU}}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:youtube|CAk-QGMlQmI}}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:youtube|AMLwjo80WzI}}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Results ==&lt;br /&gt;
&#039;&#039;Note that these more recent results may differ from those in the presentation videos above.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Summary of the loop closures detected on the UdeS data set:&lt;br /&gt;
* Green : Loop closures detected&lt;br /&gt;
* Yellow : Loop closures rejected&lt;br /&gt;
* Red : Unable to detect a loop closure because old places could not be retrieved&lt;br /&gt;
&lt;br /&gt;
Figure 2: Processing time for each image acquired (real-time limit fixed to 700 ms for an image rate of 1 Hz)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Precision-Recall (48% recall at 100% precision)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduce the loop closure detection results&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visit the [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] wiki page on [http://github.com/introlab/rtabmap/wiki RTAB-Map&#039;s GitHub]. The ground truths can be downloaded below.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Videos&#039;&#039;&#039;&lt;br /&gt;
* Newer:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
* Older:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Résultats ==&lt;br /&gt;
&#039;&#039;À noter que ces résultats plus récents peuvent différer de ceux présentés dans les vidéos ci-dessus.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Figure 1: Sommaire des détections de fermeture de boucle sur l&#039;ensemble de données UdeS :&lt;br /&gt;
* Vert : Fermetures de boucle acceptées&lt;br /&gt;
* Jaune : Fermetures de boucle rejetées &lt;br /&gt;
* Rouge : Impossibilité de détecter une fermeture de boucle car les anciens endroits n&#039;ont pu être récupérés en mémoire&lt;br /&gt;
&lt;br /&gt;
Figure 2: Temps d&#039;exécution pour chaque itération (limite temps réel fixée à 700 ms pour un temps d&#039;acquisition de 1 seconde)&lt;br /&gt;
&lt;br /&gt;
Figure 3: Précision-rappel (rappel de 48 % à une précision de 100 %)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_LoopClosureMapResults.png|250px]] [[File:RTAB-Map_LoopClosureTimeResults.png|250px]] [[File:RTAB-Map_RecallResults.png|250px]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Reproduire les résultats de détection de boucles&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:RTAB-Map_LoopClosureAllPrecisionRecall.png|250px]]&lt;br /&gt;
&lt;br /&gt;
* Visitez la page wiki [http://github.com/introlab/rtabmap/wiki/Benchmark Benchmark] sur le [http://github.com/introlab/rtabmap/wiki GitHub de RTAB-Map]. Les &amp;quot;ground truths&amp;quot; peuvent être téléchargés au bas de la page.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Vidéos&#039;&#039;&#039;&lt;br /&gt;
* Nouveaux:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|1dImRinTJSE}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|N5q0jQrV3gw}} {{#ev:youtube|PqO_x8tcFiY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|2MogQIT_B2I}} {{#ev:youtube|AH_oKp3CrRA}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0fNUD11FNZU}} {{#ev:youtube|ViXlUywWHYQ}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
* Anciens:&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|0zWs6jTaAwQ}} {{#ev:youtube|J8KGEA9ecS0}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|kghs6XM8Yzw}} {{#ev:youtube|awV2Xbjq7OM}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|CuWESlLfWpQ}} {{#ev:youtube|SQiFs1z7qSY}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;code&amp;gt;{{#ev:youtube|ShQlakkzsY4}} {{#ev:youtube|cTmf5yrpcl8}}&amp;lt;/code&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Source code ==&lt;br /&gt;
The code has been tested on Windows (XP, 7), Mac OS X 10.6, and Ubuntu 10.04 LTS.&lt;br /&gt;
* Standalone application, libraries and ROS packages: [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images acquired in Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Code source ==&lt;br /&gt;
Le code a été testé sur Windows (XP, 7), Mac OS X 10.6 et Ubuntu 10.04 LTS. &lt;br /&gt;
* Logiciel autonome (&amp;quot;stand-alone&amp;quot;), bibliothèques logicielles et nœuds ROS : [http://introlab.github.io/rtabmap introlab.github.io/rtabmap]&lt;br /&gt;
&amp;lt;div style=&amp;quot;text-align: center;&amp;quot;&amp;gt;&lt;br /&gt;
[[File:RTAB-Map_Interface.png|800px|Images provenant de Need For Speed Most Wanted]]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Data sets ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images at 1 Hz (1.5 hours). &lt;br /&gt;
* Images taken while walking a loop of ~2 km, traversed twice.&lt;br /&gt;
* The data set contains indoor and outdoor environments.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 on Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|Rain!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Compare illumination and camera orientation with the next image...&lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Elevator door...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images at 1 Hz (7 hours).&lt;br /&gt;
* Images taken from the racing video game Need For Speed: Most Wanted.&lt;br /&gt;
* Two areas visited about a hundred times each (100 traversals in area 1, then another 102 traversals in area 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Compare illumination with the next image...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Community&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Community data sets from other loop closure detection approaches:&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor and Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege and CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ NewCollege omnidirectional images]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa and Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
Ground truths:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images at ~0.5 Hz (left and right images merged) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images at 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images at 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images at ~1 Hz (note that we removed some images from the original data set to obtain an image rate of approximately 1 Hz)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images at 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images at 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images at 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images at 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images at ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images at ~1 Hz&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Ensembles de données ==&lt;br /&gt;
&#039;&#039;&#039;UdeS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 5395 images à 1 Hz (1,5 heure).&lt;br /&gt;
* Images prises en marchant sur un trajet de ~2 km, parcouru deux fois.&lt;br /&gt;
* L&#039;ensemble de données contient des images prises à l&#039;intérieur et à l&#039;extérieur.&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:UdeS_1Hz_map.png|([http://maps.google.ca/maps?q=Universit%C3%A9+de+sherbrooke&amp;amp;hl=en&amp;amp;ie=UTF8&amp;amp;ll=45.377714,-71.927383&amp;amp;spn=0.011546,0.016158&amp;amp;sll=49.891235,-97.15369&amp;amp;sspn=43.664668,66.181641&amp;amp;t=h&amp;amp;z=16 sur Google maps])&lt;br /&gt;
File:UdeS_1Hz_sample1.jpg&lt;br /&gt;
File:UdeS_1Hz_sample3.jpg&lt;br /&gt;
File:UdeS_1Hz_sample4.jpg&lt;br /&gt;
File:UdeS_1Hz_sample5.jpg&lt;br /&gt;
File:UdeS_1Hz_sample6.jpg&lt;br /&gt;
File:UdeS_1Hz_sample7.jpg&lt;br /&gt;
File:UdeS_1Hz_sample8.jpg&lt;br /&gt;
File:UdeS_1Hz_sample9.jpg&lt;br /&gt;
File:UdeS_1Hz_sample11.jpg|De la pluie!&lt;br /&gt;
File:UdeS_1Hz_sample16.jpg|Comparer l&#039;illumination et l&#039;orientation de la caméra avec l&#039;image suivante... &lt;br /&gt;
File:UdeS_1Hz_sample12.jpg&lt;br /&gt;
File:UdeS_1Hz_sample13.jpg|Porte d&#039;ascenseur...&lt;br /&gt;
File:UdeS_1Hz_sample14.jpg&lt;br /&gt;
File:UdeS_1Hz_sample15.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:UdeS_1Hz.part1.rar|UdeS_1Hz.part1.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part2.rar|UdeS_1Hz.part2.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.part3.rar|UdeS_1Hz.part3.rar]]&lt;br /&gt;
 [[Media:UdeS_1Hz.png|UdeS_1Hz GroundTruth]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;NFSMW&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* 25098 images à 1 Hz (7 heures).&lt;br /&gt;
* Images prises dans le jeu vidéo de course Need For Speed: Most Wanted.&lt;br /&gt;
* Deux zones ont été visitées environ 100 fois chacune (100 boucles dans la zone 1, puis 102 boucles dans la zone 2).&lt;br /&gt;
&amp;lt;gallery perrow=5&amp;gt;&lt;br /&gt;
File:NFSMW_1Hz_map.png&lt;br /&gt;
File:NFSMW_1Hz_sample2.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample3.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample4.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample5.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample6.jpg|Comparer l&#039;illumination avec l&#039;image suivante...&lt;br /&gt;
File:NFSMW_1Hz_sample8.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample7.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample9.jpg&lt;br /&gt;
File:NFSMW_1Hz_sample10.jpg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
 [[Media:NFSMW_1Hz.part01.rar|NFSMW_1Hz.part01.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part02.rar|NFSMW_1Hz.part02.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part03.rar|NFSMW_1Hz.part03.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part04.rar|NFSMW_1Hz.part04.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part05.rar|NFSMW_1Hz.part05.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part06.rar|NFSMW_1Hz.part06.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part07.rar|NFSMW_1Hz.part07.rar]]&lt;br /&gt;
 [[Media:NFSMW_1Hz.part08.rar|NFSMW_1Hz.part08.rar]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Communauté&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Ensembles de données provenant d&#039;autres approches de détection de fermeture de boucle :&lt;br /&gt;
* Angeli et al. : [http://cogrob.ensta.fr/loopclosure.html Lip6Indoor et Lip6Outdoor]&lt;br /&gt;
* Cummins et al. (FAB-MAP) : [http://www.robots.ox.ac.uk/~mobile/IJRR_2008_Dataset NewCollege et CityCentre]&lt;br /&gt;
* Cummins et al. (FAB-MAP 2.0) : [http://www.robots.ox.ac.uk/~mobile Eynsham (70 km)]&lt;br /&gt;
* Maddern et al. : [http://www.robots.ox.ac.uk/NewCollegeData/ Images omnidirectionnelles de NewCollege]&lt;br /&gt;
* Kawewong et al. (PIRF-Nav 2.0): [http://haselab.info/pirf.html CrowdedCanteen]&lt;br /&gt;
* Gálvez-López et al. : [http://www.rawseeds.org/home/category/benchmarking-toolkit/datasets/ Bovisa et Bicocca]&lt;br /&gt;
* Blanco et al. : [http://www.mrpt.org/malaga_dataset_2009 Malaga 2009]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Ground truths&#039;&#039;:&lt;br /&gt;
* [[Media:NewCollege.png|NewCollege.png]] 1073 images à ~0.5 Hz (les images de gauche et de droite fusionnées)&lt;br /&gt;
* [[Media:CityCentre.png|CityCentre.png]] 1237 images à ~0.5 Hz (les images de gauche et de droite fusionnées) &lt;br /&gt;
* [[Media:Lip6Indoor.png|Lip6Indoor.png]] 388 images à 1 Hz&lt;br /&gt;
* [[Media:Lip6Outdoor.png|Lip6Outdoor.png]] 531 images à 0.5 Hz&lt;br /&gt;
* [[Media:Eynsham70km.png|Eynsham70km.png]] 5519 images à ~1 Hz (Noter que nous avons enlevé des images de l&#039;ensemble de données original pour avoir une fréquence d&#039;acquisition d&#039;images d&#039;environ 1 Hz.)&lt;br /&gt;
* [[Media:NewCollegeOmni.png|NewCollegeOmni.png]] 1626 images à 1 Hz&lt;br /&gt;
* [[Media:CrowdedCanteen.png|CrowdedCanteen.png]] 692 images à 2 Hz&lt;br /&gt;
* [[Media:BicoccaIndoor-2009-02-25b.png|BicoccaIndoor-2009-02-25b.png]] 1757 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaOutdoor-2008-10-04.png|BovisaOutdoor-2008-10-04.png]] 2277 images à 1 Hz&lt;br /&gt;
* [[Media:BovisaMixed-2008-10-06.png|BovisaMixed-2008-10-06.png]] 2147 images à 1 Hz&lt;br /&gt;
* [[Media:malaga2009_campus_2L.png|malaga2009_campus_2L.png]] 653 images à ~1 Hz&lt;br /&gt;
* [[Media:malaga2009_parking_6L.png|malaga2009_parking_6L.png]] 435 images à ~1 Hz&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
#M. Labbé and F. Michaud, “Long-term online multi-session graph-based SPLAM with memory management,” in &#039;&#039;Autonomous Robots&#039;&#039;, accepted, 2017. ([[Media:LabbeAURO2017.pdf|pdf]]) ([http://dx.doi.org/10.1007/s10514-017-9682-5 Springer])&lt;br /&gt;
#M. Labbé and F. Michaud, “Online Global Loop Closure Detection for Large-Scale Multi-Session Graph-Based SLAM,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 2014. ([[Media:Labbe14-IROS.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/login.jsp?tp=&amp;amp;arnumber=6942926 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud., F. (2013), “Appearance-based loop closure detection in real-time for large-scale and long-term operation,” &#039;&#039;IEEE Transactions on Robotics&#039;&#039;, vol. 29, no. 3, pp. 734-745. ([[Media:TRO2013.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6459608 IEEE Xplore])&lt;br /&gt;
#Labbé, M., Michaud, F. (2011), “Memory management for real-time appearance-based loop closure detection,” in &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;. ([[Media:labbe11memory.pdf|pdf]]) ([http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6094602 IEEE Xplore])&lt;br /&gt;
&lt;br /&gt;
==== Presentations ====&lt;br /&gt;
* M. Labbé, &amp;quot;Simultaneous Localization and Mapping (SLAM) with RTAB-Map&amp;quot;, Université Laval, Québec, November 2015 ([[Media:Labbe2015ULaval.pdf|slides pdf]])&lt;br /&gt;
&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
== Team ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
== Équipe ==&lt;br /&gt;
* [[Mathieu Labbé]]&lt;br /&gt;
* [http://www.gel.usherbrooke.ca/michaudf/ François Michaud]&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3305</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=Main_Page&amp;diff=3305"/>
		<updated>2018-11-28T19:07:21Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
{|  class=&amp;quot;wikitable&amp;quot;  style=&amp;quot;width:100%; height:200px; text-align:left;&amp;quot; border=&amp;quot;0&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&amp;lt;french&amp;gt;&amp;lt;big&amp;gt;IntRoLab - Laboratoire de robotique intelligente / interactive / intégrée / interdisciplinaire &amp;lt;/big&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Institut interdisciplinaire d&#039;innovation technologique [http://www.3it.ca 3IT]&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Téléphone : 819 821-8000 poste 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contactez-nous!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Accès routier] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&amp;lt;big&amp;gt;IntRoLab - Intelligent / Interactive / Integrated / Interdisciplinary Robot Lab &amp;lt;/big&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Interdisciplinary Institute for Technological Innovation [http://www.3it.ca 3IT] &amp;lt;br&amp;gt;&lt;br /&gt;
Université de Sherbrooke &amp;lt;br&amp;gt;&lt;br /&gt;
3000 boul. de l&#039;Université &amp;lt;br&amp;gt;&lt;br /&gt;
Sherbrooke (Québec) J1K 0A5 &amp;lt;br&amp;gt;&lt;br /&gt;
Canada &amp;lt;br&amp;gt;&lt;br /&gt;
Phone :  819 821-8000 ext. 65700 &amp;lt;br&amp;gt;&lt;br /&gt;
[mailto:francois.michaud@usherbrooke.ca Contact us!] &amp;lt;br&amp;gt;&lt;br /&gt;
[https://www.usherbrooke.ca/visiter/acces-routiers/ Map] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
|width=&amp;quot;50%&amp;quot; |&lt;br /&gt;
&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;french&amp;gt;Janvier 2018 - [https://github.com/introlab/odas ODAS : Open embeddeD Audition System]&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;January 2018 - [https://github.com/introlab/odas ODAS : Open embeddeD Audition System]&amp;lt;/english&amp;gt;&lt;br /&gt;
&amp;lt;youtube&amp;gt;https://www.youtube.com/watch?v=n7y2rLAnd5I&amp;lt;/youtube&amp;gt;&lt;br /&gt;
&amp;lt;code&amp;gt;{{#ev:youtube|n7y2rLAnd5I}}&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
[[Image:Twitter.jpeg|50px|link=https://twitter.com/introlab/]] &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
{| style=&amp;quot;float: right;&amp;quot;&lt;br /&gt;
| [https://introlab.3it.usherbrooke.ca/introlab-secure/ Intranet]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=FlowDesigner&amp;diff=3296</id>
		<title>FlowDesigner</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=FlowDesigner&amp;diff=3296"/>
		<updated>2018-10-26T19:12:36Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Description =&lt;br /&gt;
&lt;br /&gt;
[[Image:FlowDesigner.png|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://flowdesigner.sf.net FlowDesigner] is a free (GPL/LGPL) data-flow-oriented development environment. It can be used to build complex applications by combining small, reusable building blocks. In some ways it is similar to both Simulink and LabVIEW, but it is a clone of neither. FlowDesigner features a RAD GUI with a visual debugger. Although FlowDesigner is well suited to rapid prototyping, it can also be used to build real-time applications such as audio effects processing. Since FlowDesigner is not an interpreted language, it can be quite fast. It is written in C++ and features a plugin mechanism that allows plugins/toolboxes to be easily added.&lt;br /&gt;
&lt;br /&gt;
= RobotFlow = &lt;br /&gt;
&lt;br /&gt;
[[Image:RobotFlow.jpeg|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://robotflow.sf.net RobotFlow] is a mobile robotics toolkit based on the FlowDesigner project. FlowDesigner&#039;s visual programming interface helps people better visualize and understand what is really happening in the robot&#039;s control loops, sensors, and actuators, through graphical probes and real-time debugging.&lt;br /&gt;
&lt;br /&gt;
 &#039;&#039;&#039;Note : RobotFlow is no longer maintained.&#039;&#039;&#039;&lt;br /&gt;
= Publications =&lt;br /&gt;
#Létourneau, D., Valin, J.-M., Côté, C., Michaud, F. (2005), “FlowDesigner: the free data-flow oriented development environment”, &#039;&#039;Software 2.0&#039;&#039;, vol. 3. ([[Media:Software2005.pdf|pdf]]) &lt;br /&gt;
#Côté, C., Létourneau, D., Michaud, F., Valin, J.-M., Brosseau, Y., Raïevsky, C., Lemay, M., Tran, V. (2004), &amp;quot;Code reusability for programming mobile robots&amp;quot;, &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 1820-1825.&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=FlowDesigner&amp;diff=3295</id>
		<title>FlowDesigner</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=FlowDesigner&amp;diff=3295"/>
		<updated>2018-10-26T19:11:37Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Description =&lt;br /&gt;
&lt;br /&gt;
[[Image:test1.png]]&lt;br /&gt;
&lt;br /&gt;
[[Image:FlowDesigner.png|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://flowdesigner.sf.net FlowDesigner] is a free (GPL/LGPL) data-flow-oriented development environment. It can be used to build complex applications by combining small, reusable building blocks. In some ways it is similar to both Simulink and LabVIEW, but it is a clone of neither. FlowDesigner features a RAD GUI with a visual debugger. Although FlowDesigner is well suited to rapid prototyping, it can also be used to build real-time applications such as audio effects processing. Since FlowDesigner is not an interpreted language, it can be quite fast. It is written in C++ and features a plugin mechanism that allows plugins/toolboxes to be easily added.&lt;br /&gt;
&lt;br /&gt;
= RobotFlow = &lt;br /&gt;
&lt;br /&gt;
[[Image:RobotFlow.jpeg|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[http://robotflow.sf.net RobotFlow] is a mobile robotics toolkit based on the FlowDesigner project. FlowDesigner&#039;s visual programming interface helps people better visualize and understand what is really happening in the robot&#039;s control loops, sensors, and actuators, through graphical probes and real-time debugging.&lt;br /&gt;
&lt;br /&gt;
 &#039;&#039;&#039;Note : RobotFlow is no longer maintained.&#039;&#039;&#039;&lt;br /&gt;
= Publications =&lt;br /&gt;
#Létourneau, D., Valin, J.-M., Côté, C., Michaud, F. (2005), “FlowDesigner: the free data-flow oriented development environment”, &#039;&#039;Software 2.0&#039;&#039;, vol. 3. ([[Media:Software2005.pdf|pdf]]) &lt;br /&gt;
#Côté, C., Létourneau, D., Michaud, F., Valin, J.-M., Brosseau, Y., Raïevsky, C., Lemay, M., Tran, V. (2004), &amp;quot;Code reusability for programming mobile robots&amp;quot;, &#039;&#039;Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems&#039;&#039;, 1820-1825.&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3292</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3292"/>
		<updated>2018-09-13T01:18:05Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* News&lt;br /&gt;
**News|Toutes les nouvelles / All News&lt;br /&gt;
&lt;br /&gt;
* Projets / Projects&lt;br /&gt;
**Projects|Tous les projets / All Projects&lt;br /&gt;
**ADE|DEA&lt;br /&gt;
**AUDIBLE|AUDIBLE&lt;br /&gt;
**Autonomous Robot|Autonomous Robot&lt;br /&gt;
**AZIMUT|AZIMUT&lt;br /&gt;
**CRI|Children Robot Interaction&lt;br /&gt;
**DCD|DCD&lt;br /&gt;
**DDRA|DDRA&lt;br /&gt;
**DRF|DRF&lt;br /&gt;
**EQ|EQ&lt;br /&gt;
**MapIt|MapIt&lt;br /&gt;
**HBBA|HBBA&lt;br /&gt;
**PEXAT|PEXAT&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Telerobot|Telerobot&lt;br /&gt;
**Teletrauma|Teletrauma&lt;br /&gt;
**TRInterface|Ego/Exocentric Teleoperation&lt;br /&gt;
**WISS|WISS&lt;br /&gt;
* Open Source&lt;br /&gt;
**8SoundsUSB|8 Inputs USB Sound Card&lt;br /&gt;
**FlowDesigner|FlowDesigner&lt;br /&gt;
**ManyEars|ManyEars&lt;br /&gt;
**MARIE|MARIE&lt;br /&gt;
**OpenECoSys|OpenECoSys&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Find-Object|Find-Object&lt;br /&gt;
**ROS_OpenTLD|ROS OpenTLD&lt;br /&gt;
**ROS4iOS|ROS4iOS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Équipe / Team&lt;br /&gt;
** Team|Toute l&#039;équipe / All Team&lt;br /&gt;
&lt;br /&gt;
* Infrastructure&lt;br /&gt;
**Infrastructure|Infrastructure de laboratoire / Lab Infrastructure&lt;br /&gt;
&lt;br /&gt;
* Publications&lt;br /&gt;
** Publications|Toutes les publications / All publications&lt;br /&gt;
&lt;br /&gt;
* Information&lt;br /&gt;
** Information|Information du laboratoire / Lab Information&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3291</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=MediaWiki:Sidebar&amp;diff=3291"/>
		<updated>2018-09-13T01:17:00Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* News&lt;br /&gt;
**News|Toutes les nouvelles / All News&lt;br /&gt;
&lt;br /&gt;
* Projets / Projects&lt;br /&gt;
**Projects|Tous les projets / All Projects&lt;br /&gt;
**ADE|DEA&lt;br /&gt;
**AUDIBLE|AUDIBLE&lt;br /&gt;
**Autonomous Robot|Autonomous Robot&lt;br /&gt;
**AZIMUT|AZIMUT&lt;br /&gt;
**CRI|Children Robot Interaction&lt;br /&gt;
**DCD|DCD&lt;br /&gt;
**DDRA|DDRA&lt;br /&gt;
**DRF|DRF&lt;br /&gt;
**EQ|EQ&lt;br /&gt;
**MapIT!|MapIT&lt;br /&gt;
**HBBA|HBBA&lt;br /&gt;
**PEXAT|PEXAT&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Telerobot|Telerobot&lt;br /&gt;
**Teletrauma|Teletrauma&lt;br /&gt;
**TRInterface|Ego/Exocentric Teleoperation&lt;br /&gt;
**WISS|WISS&lt;br /&gt;
* Open Source&lt;br /&gt;
**8SoundsUSB|8 Inputs USB Sound Card&lt;br /&gt;
**FlowDesigner|FlowDesigner&lt;br /&gt;
**ManyEars|ManyEars&lt;br /&gt;
**MARIE|MARIE&lt;br /&gt;
**OpenECoSys|OpenECoSys&lt;br /&gt;
**RTAB-Map|RTAB-Map&lt;br /&gt;
**Find-Object|Find-Object&lt;br /&gt;
**ROS_OpenTLD|ROS OpenTLD&lt;br /&gt;
**ROS4iOS|ROS4iOS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Équipe / Team&lt;br /&gt;
** Team|Toute l&#039;équipe / All Team&lt;br /&gt;
&lt;br /&gt;
* Infrastructure&lt;br /&gt;
**Infrastructure|Infrastructure de laboratoire / Lab Infrastructure&lt;br /&gt;
&lt;br /&gt;
* Publications&lt;br /&gt;
** Publications|Toutes les publications / All publications&lt;br /&gt;
&lt;br /&gt;
* Information&lt;br /&gt;
** Information|Information du laboratoire / Lab Information&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3211</id>
		<title>User:Dominic Létourneau</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3211"/>
		<updated>2018-03-29T18:14:56Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Dominic Létourneau, ing. M.Sc.A. =&lt;br /&gt;
[[Image:DominicLétourneau.jpg|200px|thumb]]&lt;br /&gt;
 &lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
Professionnel de recherche pour IntRoLab&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
Research Engineer for IntRoLab&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Coordonnées :&#039;&#039;&#039;&lt;br /&gt;
* Département de génie électrique et de génie informatique, Université de Sherbrooke&lt;br /&gt;
* Institut interdisciplinaire d’innovation technologique - 3IT&lt;br /&gt;
* Local: P2-3021&lt;br /&gt;
* Téléphone : 819 821-8000, poste 65778&lt;br /&gt;
* Courriel : dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Contact Information :&#039;&#039;&#039;&lt;br /&gt;
* Dept. of Computer &amp;amp; Electrical Engineering, Université de Sherbrooke&lt;br /&gt;
* Interdisciplinary Institute for Technological Innovation - 3IT&lt;br /&gt;
* Room : P2-3021&lt;br /&gt;
* Phone : (819) 821-8000 ext. 65778&lt;br /&gt;
* Email : dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
== Formation ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Maîtrise en génie électrique, Université de Sherbrooke&lt;br /&gt;
** Sujet de maîtrise : [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile]]. Voir [[READ]].&lt;br /&gt;
** Directeur : Prof. François Michaud, ing.,  Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bac. en génie informatique, Université de Sherbrooke&lt;br /&gt;
* Membre de l&#039;OIQ depuis 2002.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Master&#039;s degree in Electrical Engineering, Université de Sherbrooke&lt;br /&gt;
** Thesis: [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile (in French)]]. See [[READ]].&lt;br /&gt;
** Supervisor: Prof. François Michaud, ing., Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bachelor&#039;s degree in Computer Engineering, Université de Sherbrooke&lt;br /&gt;
* OIQ Member since 2002.&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Open Source ===&lt;br /&gt;
* https://github.com/introlab&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
* [[Projects]]&lt;br /&gt;
&lt;br /&gt;
=== Publications ===&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* Voir [[Publications]].&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* Please see [[Publications]].&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3210</id>
		<title>User:Dominic Létourneau</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3210"/>
		<updated>2018-03-29T18:12:19Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: /* Dominic Létourneau, ing. M.Sc.A. */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Dominic Létourneau, ing. M.Sc.A. =&lt;br /&gt;
[[Image:DominicLétourneau.jpg|200px|thumb]]&lt;br /&gt;
 &lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
Professionnel de recherche pour IntRoLab&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
Research Engineer for IntRoLab&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Coordonnées :&#039;&#039;&#039;&lt;br /&gt;
* Département de génie électrique et de génie informatique, Université de Sherbrooke&lt;br /&gt;
* Institut interdisciplinaire d’innovation technologique - 3IT&lt;br /&gt;
* Local: P2-3021&lt;br /&gt;
* Téléphone : (819) 821-8000, poste 65778&lt;br /&gt;
* Courriel : dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Contact Information:&#039;&#039;&#039;&lt;br /&gt;
* Dept. of Electrical and Computer Engineering, Université de Sherbrooke&lt;br /&gt;
* Interdisciplinary Institute for Technological Innovation - 3IT&lt;br /&gt;
* Room: P2-3021&lt;br /&gt;
* Phone: (819) 821-8000 ext. 65778&lt;br /&gt;
* Email: dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
== Formation ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Maîtrise en génie électrique, Université de Sherbrooke&lt;br /&gt;
** Sujet de maîtrise : [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile]]. Voir [[READ]].&lt;br /&gt;
** Directeur : Prof. François Michaud, ing.,  Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bac. en génie informatique, Université de Sherbrooke&lt;br /&gt;
&lt;br /&gt;
* Membre de l&#039;OIQ depuis 2002.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Master&#039;s degree in Electrical Engineering, Université de Sherbrooke&lt;br /&gt;
** Thesis: [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile (in French)]]. See [[READ]].&lt;br /&gt;
** Supervisor: Prof. François Michaud, ing., Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bachelor&#039;s degree in Computer Engineering, Université de Sherbrooke&lt;br /&gt;
&lt;br /&gt;
* OIQ member since 2002.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Open Source ===&lt;br /&gt;
* https://github.com/introlab&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
* [[Projects]]&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* Voir [[Publications]].&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* Please see [[Publications]].&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3209</id>
		<title>User:Dominic Létourneau</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3209"/>
		<updated>2018-03-29T18:09:01Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Dominic Létourneau, ing. M.Sc.A. =&lt;br /&gt;
[[Image:DominicLétourneau.jpg|200px|thumb]]&lt;br /&gt;
 &lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Coordonnées :&#039;&#039;&#039;&lt;br /&gt;
* Département de génie électrique et de génie informatique, Université de Sherbrooke&lt;br /&gt;
* Institut interdisciplinaire d’innovation technologique - 3IT&lt;br /&gt;
* Local: P2-3021&lt;br /&gt;
* Téléphone : (819) 821-8000, poste 65778&lt;br /&gt;
* Courriel : dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Contact Information:&#039;&#039;&#039;&lt;br /&gt;
* Dept. of Electrical and Computer Engineering, Université de Sherbrooke&lt;br /&gt;
* Interdisciplinary Institute for Technological Innovation - 3IT&lt;br /&gt;
* Room: P2-3021&lt;br /&gt;
* Phone: (819) 821-8000 ext. 65778&lt;br /&gt;
* Email: dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
== Formation ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Maîtrise en génie électrique, Université de Sherbrooke&lt;br /&gt;
** Sujet de maîtrise : [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile]]. Voir [[READ]].&lt;br /&gt;
** Directeur : Prof. François Michaud, ing.,  Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bac. en génie informatique, Université de Sherbrooke&lt;br /&gt;
&lt;br /&gt;
* Membre de l&#039;OIQ depuis 2002.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Master&#039;s degree in Electrical Engineering, Université de Sherbrooke&lt;br /&gt;
** Thesis: [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile (in French)]]. See [[READ]].&lt;br /&gt;
** Supervisor: Prof. François Michaud, ing., Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bachelor&#039;s degree in Computer Engineering, Université de Sherbrooke&lt;br /&gt;
&lt;br /&gt;
* OIQ member since 2002.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Open Source ===&lt;br /&gt;
* https://github.com/introlab&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
* [[Projects]]&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* Voir [[Publications]].&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* Please see [[Publications]].&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
	<entry>
		<id>http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3208</id>
		<title>User:Dominic Létourneau</title>
		<link rel="alternate" type="text/html" href="http://introlab.3it.usherbrooke.ca/index.php?title=User:Dominic_L%C3%A9tourneau&amp;diff=3208"/>
		<updated>2018-03-29T18:06:32Z</updated>

		<summary type="html">&lt;p&gt;Letd2801: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;analytics uacct=&amp;quot;UA-27707792-1&amp;quot; &amp;gt;&amp;lt;/analytics&amp;gt;&lt;br /&gt;
= Dominic Létourneau, ing. M.Sc.A. =&lt;br /&gt;
[[Image:DominicLétourneau.jpg|200px|thumb]]&lt;br /&gt;
 &lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Coordonnées :&#039;&#039;&#039;&lt;br /&gt;
* Département de génie électrique et de génie informatique, Université de Sherbrooke&lt;br /&gt;
* Institut interdisciplinaire d’innovation technologique - 3IT&lt;br /&gt;
* Local: P2-3021&lt;br /&gt;
* Téléphone : (819) 821-8000, poste 65778&lt;br /&gt;
* Courriel : dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Contact Information:&#039;&#039;&#039;&lt;br /&gt;
* Dept. of Electrical and Computer Engineering, Université de Sherbrooke&lt;br /&gt;
* Interdisciplinary Institute for Technological Innovation - 3IT&lt;br /&gt;
* Room: P2-3021&lt;br /&gt;
* Phone: (819) 821-8000 ext. 65778&lt;br /&gt;
* Email: dominic.letourneau@usherbrooke.ca&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
== Formation ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Maîtrise en génie électrique, Université de Sherbrooke&lt;br /&gt;
** Sujet de maîtrise : [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile]]. Voir [[READ]].&lt;br /&gt;
** Directeur : Prof. François Michaud, ing.,  Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bac. en génie informatique, Université de Sherbrooke&lt;br /&gt;
&lt;br /&gt;
* Membre de l&#039;OIQ depuis 2002.&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;2000-2001&#039;&#039;&#039; Master&#039;s degree in Electrical Engineering, Université de Sherbrooke&lt;br /&gt;
** Thesis: [[Media:MemoireDLetourneau.pdf | Identification visuelle de symboles par un robot mobile (in French)]]. See [[READ]].&lt;br /&gt;
** Supervisor: Prof. François Michaud, ing., Ph.D.&lt;br /&gt;
* &#039;&#039;&#039;1995-1999&#039;&#039;&#039; Bachelor&#039;s degree in Computer Engineering, Université de Sherbrooke&lt;br /&gt;
&lt;br /&gt;
* OIQ member since 2002.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
== &amp;lt;french&amp;gt;Langues&amp;lt;/french&amp;gt;&amp;lt;english&amp;gt;Languages&amp;lt;/english&amp;gt;==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* Français (oral, écrit)&lt;br /&gt;
* Anglais (oral, écrit)&lt;br /&gt;
* Vietnamien (en apprentissage)&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* French&lt;br /&gt;
* English&lt;br /&gt;
* Vietnamese (learning)&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Prix et distinctions ==&lt;br /&gt;
* Vainqueur de la AAAI Mobile Robot Competition, 2005 et 2006&lt;br /&gt;
* Innovation technologique – Mention d’honneur, MIM, Salon des technologies numériques du Québec, 2003&lt;br /&gt;
* AZIMUT – OCTAS, Fédération informatique du Québec (FiQ) 2003 et MIMs d&#039;Or 2003&lt;br /&gt;
* Bourse de maîtrise Denis Wood en 2001 (Fondation de l&#039;Université) pour excellence académique&lt;br /&gt;
&lt;br /&gt;
== Emploi ==&lt;br /&gt;
* &#039;&#039;&#039;2001- Aujourd&#039;hui&#039;&#039;&#039; Professionnel de recherche pour IntRoLab.&lt;br /&gt;
** Responsable du fonctionnement du laboratoire (réseaux, serveurs, ordinateurs de travail)&lt;br /&gt;
** Conception et réalisation de robots mobiles&lt;br /&gt;
** Intégration mécanique / électrique / informatique pour robots mobiles&lt;br /&gt;
** Support pour les étudiants&lt;br /&gt;
** Gestion des projets&lt;br /&gt;
&lt;br /&gt;
== Intérêts ==&lt;br /&gt;
* Systèmes embarqués intégrés&lt;br /&gt;
* Protocoles de communication&lt;br /&gt;
* Capteurs &amp;amp; actionneurs intelligents&lt;br /&gt;
* Véhicules électriques&lt;br /&gt;
* Accumulateurs / Batteries&lt;br /&gt;
&lt;br /&gt;
== Projets ==&lt;br /&gt;
&lt;br /&gt;
=== Conception de Robots ===&lt;br /&gt;
* [[AZIMUT | Azimut, Azimut2, Azimut3]]&lt;br /&gt;
* [[Roball | Roball]]&lt;br /&gt;
* [[UltimateRobot | Spartacus &amp;amp; Johnny-0 ]]&lt;br /&gt;
* [[Teletrauma]]&lt;br /&gt;
&lt;br /&gt;
=== Open Source ===&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
&amp;lt;french&amp;gt;&lt;br /&gt;
* Voir [[Publications]].&lt;br /&gt;
&amp;lt;/french&amp;gt;&lt;br /&gt;
&amp;lt;english&amp;gt;&lt;br /&gt;
* Please see [[Publications]].&lt;br /&gt;
&amp;lt;/english&amp;gt;&lt;/div&gt;</summary>
		<author><name>Letd2801</name></author>
	</entry>
</feed>