
ETD

Digital archive of theses discussed at the University of Pisa

Thesis etd-08302018-140102


Thesis type
Master's thesis
Author
PIGA, NICOLA AGOSTINO
URN
etd-08302018-140102
Title
Object localization using vision and touch: experiments on the iCub humanoid robot
Department
INGEGNERIA DELL'INFORMAZIONE
Degree programme
INGEGNERIA ROBOTICA E DELL'AUTOMAZIONE
Supervisors
Supervisor Prof. Pallottino, Lucia
Supervisor Dr. Natale, Lorenzo
Keywords
  • Bayesian state estimation
  • iCub humanoid robot
  • object localization
  • particle filtering
  • visuo-tactile localization
Defense date
27/09/2018
Availability
Full
Abstract
Precise knowledge of the pose of objects is a fundamental requirement for an autonomous robot that has to manipulate and interact with objects in order to grasp and/or move them in a robust and efficient way. Such knowledge is available to the robot only through noisy sensors, each of which provides a different piece of information. Hence, as with humans, robots need to employ multiple sensing modalities in order to extract as much information as possible from the surrounding environment.
This thesis proposes a filtering algorithm for object localization that uses visual and tactile information in the form of Cartesian points belonging to the surface of the object. To this end, the state-of-the-art Memory Unscented Particle Filter algorithm for tactile localization of a stationary object is extended in order to localize an object using visual measurements, in the form of point clouds, and to track its pose using tactile measurements while the object is manipulated by an external end-effector. The tactile measurements take the form of contact points computed from binary contact sensors, encoder readings, and the forward kinematics of the robot. The performance and limitations of the proposed algorithm are discussed through the analysis of localization experiments performed in the Gazebo simulator and on the iCub humanoid robot, using its stereo vision and tactile sensing systems.
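The idea of weighting pose hypotheses by how well surface-point measurements fit an object model can be illustrated with a minimal particle filter. The sketch below is not the thesis's Memory Unscented Particle Filter: it assumes a much simpler setting (position-only estimation of a sphere of known radius, with made-up noise levels and particle counts), where a dense "visual" point cloud initializes the particles and sparse "tactile" contact points refine the estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (all values assumed, not from the thesis): localize
# the center of a sphere of known radius from noisy Cartesian surface
# points, standing in for visual (point cloud) and tactile (contact
# point) measurements.
RADIUS = 0.05            # object radius [m], assumed known from a model
MEAS_STD = 0.002         # measurement noise std [m], assumed
N_PARTICLES = 500

true_center = np.array([0.10, -0.20, 0.30])

def surface_points(n):
    """Simulate n noisy Cartesian points on the object's surface."""
    dirs = rng.normal(size=(n, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return true_center + RADIUS * dirs + rng.normal(scale=MEAS_STD, size=(n, 3))

# "Visual" initialization: spread particles around the centroid of an
# initial point cloud, used as a coarse localization.
cloud = surface_points(50)
particles = cloud.mean(axis=0) + rng.normal(scale=0.03, size=(N_PARTICLES, 3))

# "Tactile" refinement: weight each particle by how well sparse contact
# points lie on the surface it hypothesizes, then resample.
for _ in range(20):
    contacts = surface_points(10)
    dists = np.linalg.norm(contacts[None, :, :] - particles[:, None, :], axis=2)
    err = dists - RADIUS                      # deviation from the surface
    loglik = -0.5 * np.sum((err / 0.005) ** 2, axis=1)
    weights = np.exp(loglik - loglik.max())   # subtract max for stability
    weights /= weights.sum()
    idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
    # Roughening noise keeps the resampled set from collapsing.
    particles = particles[idx] + rng.normal(scale=0.002, size=(N_PARTICLES, 3))

estimate = particles.mean(axis=0)
print(np.linalg.norm(estimate - true_center))  # localization error [m]
```

The same likelihood treats visual and tactile points uniformly, which mirrors the abstract's point that both modalities reduce to Cartesian points on the object surface; the real algorithm additionally handles full 6-DOF pose and a moving object.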