
A Multimodal User Interface for Remote Object Exploration in Teleoperation Systems

ALEOTTI, Jacopo; CASELLI, Stefano; REGGIANI, Monica
2002-01-01

Abstract

This paper describes the design and implementation of a multimodal user interface for exploration and surface reconstruction of remote environments in teleoperation systems. The sensory information collected by heterogeneous sensors at the teleoperated site is integrated by the system to build a meaningful representation of the remote environment. The user interface includes streaming of video information, surface reconstruction and visualization of sampled data from tactile and proximity sensors, and a haptic device (the CyberTouch virtual reality glove) returning distance information through vibrotactile actuators. The glove also enables intuitive guidance of the remote end-effector, as long as an appropriate mapping between the operator's hand and the robotic device is established. The user interface is implemented on top of a distributed architecture for teleoperation, designed according to a software framework exploiting advanced CORBA features. The multimodal user interface, along with the teleoperation framework, has been successfully applied to control simulated robotic devices operating in virtual environments, and is currently being deployed in a real robot workcell comprising a manipulator and several sensory devices.
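As a purely illustrative aid to the vibrotactile feedback described in the abstract, the sketch below maps a proximity-sensor distance reading to a normalized vibration amplitude for one glove actuator. The inverse-linear law, the clamping range, and the setActuatorAmplitude() call are assumptions introduced here; they stand in for the actual CyberTouch driver and the feedback law used in the system.

// Minimal sketch (C++17), not the paper's implementation.
#include <algorithm>
#include <cstdio>

// Hypothetical actuator command, standing in for the real glove driver API.
void setActuatorAmplitude(int actuatorId, double amplitude) {
    std::printf("actuator %d -> amplitude %.2f\n", actuatorId, amplitude);
}

// Assumed inverse-linear mapping from distance (metres) to amplitude in [0, 1]:
// full vibration at or below dMin, no vibration at or beyond dMax.
double distanceToAmplitude(double distance, double dMin = 0.02, double dMax = 0.30) {
    double a = (dMax - distance) / (dMax - dMin);
    return std::clamp(a, 0.0, 1.0);
}

int main() {
    // Example: a 0.10 m proximity reading drives the actuator nearest the obstacle.
    setActuatorAmplitude(2, distanceToAmplitude(0.10));
    return 0;
}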
2002
A Multimodal User Interface for Remote Object Exploration in Teleoperation Systems / Aleotti, Jacopo; Bottazzi, S.; Caselli, Stefano; Reggiani, Monica. - (2002). (Paper presented at the IARP International Workshop on Human-Robot Interfaces: Technologies & Applications (HUROIN2002), held in Frascati (Rome), Italy, November 2002).

Use this identifier to cite or link to this document: https://hdl.handle.net/11381/1451871