Thesis etd-05082014-183140
Thesis type
Doctoral thesis (PhD)
Author
LAZZERI, NICOLE
URN
etd-05082014-183140
Title
Development of a cognitive and emotional control system for a social humanoid robot
Scientific-disciplinary sector
ING-INF/06
Degree programme
INGEGNERIA
Supervisors
tutor Ing. Mazzei, Daniele
tutor Prof. De Rossi, Danilo
Keywords
- Affective computing
- Believability
- Cognitive system
- Control architecture
- Embodied agent
- Expert system
- Facial expression
- Human-Robot Interaction
- Humanoid robot
- Social robot
Defense date
19/05/2014
Availability
Full
Abstract
In recent years, an increasing number of social robots have stepped out of science fiction novels and movies and become reality. These robots are of interest not only in the world of science fiction but also in scientific research. Building socially intelligent robots in a human-centred manner can help us better understand ourselves and the psychological and behavioural dynamics behind social interaction.
The primary function of a social robot is to appear “believable” to human observers and interaction partners. This means that a social robot must be able to express its own state and perceive the state of its social environment in a human-like way in order to act successfully, i.e., it must possess a “social intelligence” that maintains the illusion of dealing with a real human being. The term “social intelligence” covers aspects of both appearance and behaviour, factors that are tightly coupled with each other. For example, a social robot designed to look like an animal is expected to have limited functionalities. By contrast, a humanoid robot that physically resembles a human being elicits strong expectations about its behavioural and cognitive capabilities, and if such expectations are not met, a person is likely to experience disorientation and disappointment.
The believability of a social robot is not only an objective matter: it also depends on a subjective evaluation by the person involved in the interaction. A social robot will be judged believable or not on the basis of the individual experience and background of the person who interacts with it. Clearly, it is not possible to know what is really going on in that person's mind during the interaction. Nevertheless, it is possible to analyse and evaluate the subject's psychophysiological and behavioural reactions to obtain useful cues for improving the quality and performance of the social interaction.
Based on these considerations, this thesis aims to answer two research questions: (1) How can a robot be believable and behave in a socially acceptable manner? (2) How can the subject's social interaction with the robot be evaluated?
This thesis presents, on the one hand, the development of a novel software architecture for controlling a humanoid robot able to reproduce realistic facial expressions and, on the other, the development of a software platform for analysing human-robot interaction studies from the point of view of the subject who interacts with the robot. The control architecture is based on a hybrid Deliberative/Reactive paradigm that enables the robot both to react quickly to events (reactive behaviours) and to perform more complex high-level tasks that require reasoning (deliberative behaviours). The integration of a deliberative system, based on a rule-based expert system, with the reactive system makes the robot controllable through a declarative language that is closer to the human natural way of thinking. An interactive graphical interface provides the user with a tool for controlling the robot's behaviour. The robot thus becomes a research tool suitable for investigating its “being social and believable” and for testing social behavioural models defined by sets of rules. The hybrid architecture has proven to be a good design for enabling the robot to perform complex animations and convey emotional stimuli. The robot can perceive and interpret social cues from the environment, react emotionally to people in its surroundings, and follow the person who attracts its attention.
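The hybrid Deliberative/Reactive idea described above can be sketched in a few lines: reactive behaviours fire immediately on perceived events and pre-empt the deliberative layer, while the deliberative layer selects behaviours declaratively from condition/action rules (a minimal stand-in for a full expert system). All names here (`Percept`, `Rule`, `HybridController`) and the specific behaviours are illustrative assumptions, not the thesis's actual API.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Percept:
    """A snapshot of what the robot currently perceives."""
    face_detected: bool = False
    loud_noise: bool = False

@dataclass
class Rule:
    """A declarative condition/action pair for the deliberative layer."""
    name: str
    condition: Callable[[Percept], bool]
    action: str

class HybridController:
    def __init__(self, rules: List[Rule]):
        self.rules = rules

    def reactive_layer(self, percept: Percept) -> Optional[str]:
        # Fast, hard-wired reactions: no reasoning involved.
        if percept.loud_noise:
            return "startle"  # reflex-like response
        return None

    def deliberative_layer(self, percept: Percept) -> str:
        # Rule-based selection: the first rule whose condition matches wins.
        for rule in self.rules:
            if rule.condition(percept):
                return rule.action
        return "idle"

    def step(self, percept: Percept) -> str:
        # Reactive behaviours pre-empt deliberative ones.
        reaction = self.reactive_layer(percept)
        return reaction if reaction is not None else self.deliberative_layer(percept)

rules = [Rule("greet", lambda p: p.face_detected, "smile_and_track_face")]
controller = HybridController(rules)
print(controller.step(Percept(face_detected=True)))  # deliberative rule fires
print(controller.step(Percept(loud_noise=True)))     # reactive layer pre-empts
```

The split mirrors the paradigm's key design choice: time-critical responses bypass reasoning entirely, while the rule list can be edited declaratively without touching the control loop.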
The platform developed for studying the subject's psychophysiological and behavioural reactions during the interaction with a robot is designed to be modular and configurable. Based on the experiment specifications, multiple heterogeneous sensors with different hardware and software characteristics can be integrated into the platform. Collecting and fusing complementary and redundant subject-related information makes it possible to obtain an enriched scene interpretation. Indeed, merging different types of data can highlight important information that may otherwise remain hidden if each type of data is analysed separately. The multimodal data acquisition platform was used in the context of a research project aimed at evaluating the interaction of typically developing and autistic children with social robots. The results demonstrated the reliability and effectiveness of the platform in storing different types of data synchronously. In multimodal data fusion systems, keeping temporal coherence between data coming from different sensors is a fundamental problem. The availability of synchronized heterogeneous data acquired by the platform, such as self-report annotations, physiological measures, and behavioural observations, facilitated the analysis and evaluation of the subjects' interaction with the robot.
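The temporal-coherence problem mentioned above can be illustrated with a small sketch: if every stream stores `(timestamp, value)` samples on a shared clock, an event from one stream (e.g., a self-report annotation) can be matched to the nearest-in-time sample of another (e.g., a physiological measure). The function name, the example streams, and the tolerance value are assumptions for illustration, not the platform's actual interface.

```python
import bisect

def nearest_sample(stream, t, tolerance=0.5):
    """Return the (timestamp, value) sample of `stream` closest to time `t`,
    or None if no sample lies within `tolerance` seconds.
    `stream` must be sorted by timestamp (a shared clock is assumed)."""
    times = [ts for ts, _ in stream]
    i = bisect.bisect_left(times, t)
    # The nearest sample is either just before or just after the insertion point.
    candidates = [c for c in (i - 1, i) if 0 <= c < len(stream)]
    if not candidates:
        return None
    best = min(candidates, key=lambda c: abs(times[c] - t))
    if abs(times[best] - t) > tolerance:
        return None
    return stream[best]

# Heart-rate samples (seconds on the shared clock, bpm) and an annotated event.
heart_rate = [(0.0, 72), (1.0, 75), (2.0, 90), (3.0, 88)]
annotation_time = 2.1  # e.g., observer marks "child touches the robot"

print(nearest_sample(heart_rate, annotation_time))  # → (2.0, 90)
```

Acquiring all streams against one clock, as the platform does, is what makes this kind of cross-stream lookup meaningful; without synchronization, the nearest timestamp in another stream need not correspond to the same moment.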
File
File name | Size
---|---
Lazzeri_...hesis.pdf | 17.24 MB