
ETD

Digital archive of theses defended at the Università di Pisa

Thesis etd-02112019-100520


Thesis type
Master's thesis
Author
DI SARLI, DANIELE
URN
etd-02112019-100520
Title
Gated Reservoir Networks for Sequences and Trees
Department
INFORMATICA
Degree programme
INFORMATICA
Supervisors
supervisor Gallicchio, Claudio
supervisor Prof. Micheli, Alessio
co-examiner Ciuffoletti, Augusto
Keywords
  • neural networks
  • machine learning
  • gated recurrent units
  • echo state networks
  • reservoir computing
Defence session start date
01/03/2019
Availability
Full
Abstract
Recurrent Neural Networks are an important tool in the field of Machine Learning, since they provide a learning model that allows a neural network to learn inferences over temporal or sequential data. Such neural networks, in their naïve implementation, are notoriously costly and difficult to train. For this reason, different approaches have emerged in the literature for exploiting Recurrent Neural Networks in practical settings. The first approach that we consider, Echo State Networks, harnesses the intrinsic characteristics of Recurrent Neural Networks to make the costly training of the recurrent part of the network unnecessary. The second approach that we consider, Gated Recurrent Units, exploits architectural structures called "gates" in order to make training easier.
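The efficiency argument behind Echo State Networks can be illustrated with a minimal sketch: the recurrent "reservoir" weights are drawn at random, rescaled to a spectral radius below one, and left untrained, while only a linear readout is fit in closed form. All names, sizes, and the toy next-step prediction task below are illustrative choices, not details from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Random, fixed input and reservoir weights (never trained).
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius = 0.9

def run_reservoir(inputs):
    """Drive the reservoir with a sequence; collect the state at each step."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: predict the next value of a sine wave.
seq = np.sin(np.linspace(0, 8 * np.pi, 400))
X = run_reservoir(seq[:-1])
y = seq[1:]

# Only the linear readout is trained, by ridge regression in closed form.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @W_out
```

Because training reduces to one linear solve, the cost is a small fraction of backpropagation through time over the same network.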
We propose a new efficient neural network model (GResNet) that extends Echo State Networks by introducing gates, and for Gated Recurrent Units we explore hybrid training approaches in which only the gates get trained.
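One plausible way to combine the two ingredients (the exact GResNet equations are given in the thesis; the particular update form and all weight names below are my assumption) is to keep an ESN-style random transition and blend it with the previous state through a GRU-style update gate. In the hybrid setting described above, the gate weights could be the only trained parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res = 1, 50

# Fixed random reservoir, scaled as in an Echo State Network.
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Gate weights: random here; in a hybrid training scheme these
# would be the only parameters adjusted by learning.
Wz_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
Wz = rng.uniform(-0.5, 0.5, (n_res, n_res))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def step(x, u):
    u = np.atleast_1d(u)
    z = sigmoid(Wz_in @ u + Wz @ x)       # update gate in (0, 1)
    x_cand = np.tanh(W_in @ u + W @ x)    # candidate ESN-style state
    return (1.0 - z) * x + z * x_cand     # GRU-style interpolation

# Drive the gated reservoir with a short sine sequence.
x = np.zeros(n_res)
for u in np.sin(np.linspace(0, 2 * np.pi, 50)):
    x = step(x, u)
```

The gate lets the network decide, per unit and per step, how much of the previous state to retain, while the recurrent weights themselves stay fixed and cheap.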
We then extend our proposed models to deal with tree-structured data.
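Extending a reservoir from sequences to trees can be sketched by computing each node's state bottom-up from its label and its children's states, in the style of Tree Echo State Networks; the recursion below is my own simplification, not the thesis's exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_res, max_children = 1, 40, 2

W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
# Scale down by the maximum arity so summing children stays contractive.
W *= (0.9 / max_children) / np.max(np.abs(np.linalg.eigvals(W)))

def encode(tree):
    """tree = (label, [child_trees]); returns the root's reservoir state."""
    label, children = tree
    child_sum = sum((encode(c) for c in children), np.zeros(n_res))
    return np.tanh(W_in @ np.atleast_1d(label) + W @ child_sum)

# A tiny binary tree: root labelled 0.5 with leaves 0.1 and -0.3.
state = encode((0.5, [(0.1, []), (-0.3, [])]))
```

The root state then serves as a fixed-size encoding of the whole tree, on which a readout (trained or gated, as above) can operate.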
In our experiments, we validated our neural network models on two Natural Language Processing tasks, covering both sequences and trees. The results show an increase in predictive performance for GResNet with respect to Echo State Networks, and highlight how full training of a Gated Recurrent Unit can even be unnecessary.