ETD

Digital archive of the theses discussed at the Università di Pisa

Thesis etd-09192017-124224


Thesis type
Master's degree thesis
Author
CRECCHI, FRANCESCO
URN
etd-09192017-124224
Title
Augmenting Recurrent Neural Networks Resiliency by Dropout
Department
INFORMATICA
Course of study
INFORMATICA
Supervisors
Supervisor Prof. Bacciu, Davide
Keywords
  • DropIn
  • Resiliency
  • RNN
  • Dropout
Defence session start date
06/10/2017
Availability
Full
Abstract
This thesis presents a novel, principled approach to training recurrent neural networks that are robust to missing input features at prediction time.
Building on the ensembling properties of Dropout regularization, we propose a methodology, named DropIn, which efficiently trains a neural network model as a committee machine of subnetworks, each capable of predicting with a subset of the original input features (a minimal illustrative sketch of this idea follows the abstract).
We discuss the application of the DropIn methodology to the most representative recurrent neural models, ranging from simple recurrent networks to Reservoir Computing models, and targeting applications whose input sources may be unreliable or prone to intermittent measurements, leading to missing values in the input data (e.g., in pervasive wireless sensor networks and IoT contexts). We provide an experimental assessment on real-world data from the ambient assisted living and healthcare application domains, showing how the DropIn methodology maintains predictive performance comparable to that of a model with no missing features, even when 20%-50% of the inputs are unavailable.
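The sketch below is a minimal, hypothetical illustration of the input-masking idea behind DropIn, written in PyTorch: dropout is applied to whole input features of a recurrent model during training, so the network implicitly learns a committee of subnetworks that can each predict from a subset of the inputs. The class name DropInRNN, the feature-wise masking strategy, and all hyperparameters are illustrative assumptions, not the implementation described in the thesis.

```python
import torch
import torch.nn as nn


class DropInRNN(nn.Module):
    """Recurrent model that masks whole input features during training (illustrative sketch)."""

    def __init__(self, input_size, hidden_size, output_size, drop_p=0.3):
        super().__init__()
        self.drop_p = drop_p
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # x has shape (batch, time, features).
        if self.training and self.drop_p > 0:
            # Drop each input feature for the whole sequence with probability
            # drop_p (emulating an unavailable sensor) and rescale the survivors,
            # as in standard inverted dropout, so the expected input is unchanged.
            keep = (torch.rand(x.size(0), 1, x.size(2),
                               device=x.device) > self.drop_p).float()
            x = x * keep / (1.0 - self.drop_p)
        out, _ = self.rnn(x)
        return self.readout(out[:, -1])  # predict from the last time step


# Illustrative usage: at prediction time the model runs unchanged, and truly
# missing features can simply be zeroed, matching the training-time masking.
model = DropInRNN(input_size=8, hidden_size=32, output_size=1)
model.eval()
x = torch.randn(4, 20, 8)   # 4 sequences, 20 time steps, 8 input features
y_hat = model(x)            # shape: (4, 1)
```

The same input-masking scheme can in principle be wrapped around other recurrent architectures, including the Reservoir Computing models mentioned in the abstract; the exact training procedure and evaluation protocol are those described in the thesis, not in this sketch.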