4.4 Augmented reality stereoscopic visualization for intuitive robot teleguide
The paper by Livatino et al. [13] proposes a methodology for fusing laser and visual data in a teleoperation interface. The methodology exploits augmented reality to obtain a coherent and intuitive visualization of the integrated data, and uses stereoscopy to increase teleoperation efficiency.
The interface presented in this work renders laser data as virtual overlays on the video images received from the robot's cameras. Three different kinds of virtual overlays are used (a possible data representation is sketched after the list):
• proximity planes, semi-transparent colored layers superimposed on the objects within the scene (figure 20a);
• rays, colored lines departing approximately from the camera position and reaching the closest objects (figure 20b);
• distance values, numeric indications of the absolute distance between the robot and the objects (figure 20b).
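A minimal sketch of how these three overlay kinds could be represented in code is given below; the class and field names are illustrative assumptions and are not taken from [13].

```python
# Hypothetical representation of the three overlay kinds described above;
# names and fields are assumptions for illustration, not the authors' design.
from dataclasses import dataclass
from enum import Enum, auto


class OverlayKind(Enum):
    PROXIMITY_PLANE = auto()  # semi-transparent colored layer over an object
    RAY = auto()              # colored line from the camera to the closest object
    DISTANCE_VALUE = auto()   # numeric robot-to-object distance label


@dataclass
class Overlay:
    kind: OverlayKind
    distance_m: float            # robot-to-object distance measured by the laser
    image_column: int            # horizontal position in the camera image
    base_row: int | None = None  # vertical anchor found by edge detection, if any
```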
The virtual overlays are colored according to the distance between the robot and the real objects to which they correspond: red overlays correspond to the nearest objects, yellow overlays to objects at medium distances, and green overlays to the furthest objects.
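As a rough illustration of this color coding, the following sketch maps a measured distance to an RGB color; the distance thresholds are assumptions chosen for the example and are not specified in the paper.

```python
# Sketch of the distance-based color coding; the thresholds (in meters)
# are illustrative assumptions.
def overlay_color(distance_m: float,
                  near_m: float = 1.0,
                  far_m: float = 3.0) -> tuple[int, int, int]:
    """Return an RGB color: red for near, yellow for medium, green for far objects."""
    if distance_m < near_m:
        return (255, 0, 0)      # red: nearest objects
    if distance_m < far_m:
        return (255, 255, 0)    # yellow: objects at medium distance
    return (0, 255, 0)          # green: furthest objects
```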
The laser measurements are linearly mapped to image pixels between the left and right margins of the image. A semi-automatic calibration allows the user to adjust the first and last mapped angles.
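The linear mapping could be sketched as follows; first_angle_deg and last_angle_deg stand for the two angles adjusted by the user during calibration, and the function is an illustrative assumption rather than the implementation used in [13].

```python
# Sketch of the linear angle-to-column mapping; the calibration angles are
# assumed to be supplied by the user, as described in the text.
def angle_to_column(angle_deg: float,
                    first_angle_deg: float,
                    last_angle_deg: float,
                    image_width: int) -> int:
    """Linearly map a laser beam angle to a pixel column between the image margins."""
    t = (angle_deg - first_angle_deg) / (last_angle_deg - first_angle_deg)
    t = min(max(t, 0.0), 1.0)          # clamp readings outside the visible field of view
    return round(t * (image_width - 1))
```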
Edge detection is then performed on the image to locate the bases of the objects (by taking the first edge pixels encountered from the bottom of the image) and to vertically align the virtual overlays with the real objects.
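A possible sketch of this bottom-up search for an object's base is given below, assuming a binary edge map produced by any standard edge detector; the specific detector used in the paper is not detailed here.

```python
# Sketch of locating an object's base: scan a column of the edge map from the
# bottom of the image upwards and return the first edge pixel encountered.
import numpy as np


def object_base_row(edges: np.ndarray, column: int) -> int | None:
    """Return the row of the lowest edge pixel in `column`, or None if none is found."""
    rows = np.flatnonzero(edges[:, column])      # rows containing an edge pixel
    return int(rows[-1]) if rows.size else None  # last index = closest to the image bottom
```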