ETD

Digital archive of theses defended at the University of Pisa

Thesis etd-11182013-092326


Thesis type
PhD thesis
Author
SPANO, LUCIO DAVIDE
URN
etd-11182013-092326
Title
A Model-Based Approach for Gesture Interfaces
Scientific disciplinary sector
INF/01
Course of study
SCIENZE DI BASE
Supervisors
Tutor: Cisternino, Antonio
Supervisor: Paternò, Fabio
Keywords
  • model-based approach
  • gesture interfaces
  • interaction models
  • gestures
Defense session start date
19/12/2013
Availability
Full
Abstract
The description of a gesture requires a temporal analysis of the values generated by input sensors, and it does not fit well with the observer pattern traditionally used by frameworks to handle user input. The current solution is to embed a fixed set of gesture-based interactions into frameworks, which notify the application only when a gesture has been completely detected. This approach lacks flexibility, unless the programmer performs an explicit temporal analysis of the raw sensor data.
This thesis proposes a compositional, declarative meta-model for gesture definition based on Petri Nets. Basic traits are used as building blocks for defining gestures; each one notifies the change of a feature value. A complex gesture is defined by composing sub-gestures through a set of operators. The user interface behaviour can be associated with the recognition of the whole gesture or of any sub-component, addressing the problem of granularity in event notification.
The meta-model can be instantiated for different gesture recognition platforms, and its definition has been validated through a proof-of-concept library. Sample applications have been developed to support multi-touch gestures on iOS and full-body gestures with Microsoft Kinect.
In addition to solving the event granularity problem, this thesis discusses how the proposed compositional approach separates the definition of a gesture from the user interface behaviour.
The gesture description meta-model has been integrated into MARIA, a model-based user interface description language, extending it with support for describing full-body gesture interfaces.
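
To make the compositional idea concrete, the following is a minimal sketch, not taken from the thesis or its proof-of-concept library: all class and method names (Gesture, FeatureChange, Composite, on_complete, then, iterate) are illustrative assumptions. It shows only the declarative definition side, basic building blocks that notify single feature changes, operators that compose them, and handlers attachable to sub-gestures as well as to the whole gesture; the Petri-Net-based recognition engine described in the thesis is not modelled here.

    # Hypothetical sketch of compositional gesture definition (names are
    # illustrative, not the thesis' actual API).

    class Gesture:
        def __init__(self, name):
            self.name = name
            self.handlers = []

        def on_complete(self, handler):
            # Attach user-interface behaviour to this (sub-)gesture.
            self.handlers.append(handler)
            return self

        def then(self, other):
            # Sequence operator: this gesture followed by another.
            return Composite("seq", [self, other])

        def iterate(self):
            # Iteration operator: the gesture may be repeated.
            return Composite("iter", [self])

    class FeatureChange(Gesture):
        # Basic building block: notifies the change of a single feature value.
        def __init__(self, feature):
            super().__init__(feature)

    class Composite(Gesture):
        # A gesture built by composing sub-gestures with an operator.
        def __init__(self, operator, parts):
            super().__init__(f"{operator}({', '.join(p.name for p in parts)})")
            self.operator = operator
            self.parts = parts

    # Example: a pinch described as touch movement, iterated, then a touch lift.
    # Behaviour is attached both to sub-gestures and to the composite gesture.
    touch_move = FeatureChange("touch.position").on_complete(lambda: print("update scale"))
    touch_up = FeatureChange("touch.up").on_complete(lambda: print("commit scale"))
    pinch = touch_move.iterate().then(touch_up)
    pinch.on_complete(lambda: print("pinch recognized"))
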
File