
ETD

Digital archive of theses defended at the University of Pisa

Thesis etd-04082019-102332


Thesis type
Master's thesis
Author
GIORGINI, DIEGO
URN
etd-04082019-102332
Title
Incremental pretraining of multi-resolution memory networks
Department
INFORMATICA
Course of study
INFORMATICA
Supervisors
Supervisor Prof. Bacciu, Davide
Supervisor Dr. Carta, Antonio
Co-examiner Prof. Oneto, Luca
Keywords
  • clockwork rnn
  • linear memory networks
  • machine learning
  • MFCC
  • recurrent neural networks
  • speech recognition
  • temporal sequences
  • TIMIT
  • vanishing gradient
Defence session start date
03/05/2019
Availability
Full
Abstract
In the context of temporal sequences and Recurrent Neural Networks, the vanishing gradient and the need to discover and memorize long-term dependencies and hierarchical information are actively studied problems, but addressing them can lead to overly complicated networks. Some researchers have therefore chosen to separate concerns in order to control this complexity.
We combined Linear Memory Networks, which conceptually separate functional input-output transformations from memory capabilities, with Clockwork-RNNs, which better memorize dependencies at different temporal resolutions thanks to dedicated modules.
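
A minimal sketch of one step of such a model (illustrative only, not code from the thesis: the weight names, module sizes, and wiring are assumptions) with a nonlinear functional component and a linear memory split into clockwork modules, each updated only when its clock period divides the current time step:

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
periods = (1, 2, 4, 8)                         # one clock period per memory module
mod = n_hid // len(periods)                    # assume equally sized modules

W_xh = 0.1 * rng.normal(size=(n_hid, n_in))    # input -> functional component
W_mh = 0.1 * rng.normal(size=(n_hid, n_hid))   # memory -> functional component
W_hm = 0.1 * rng.normal(size=(n_hid, n_hid))   # functional -> memory
W_mm = 0.1 * rng.normal(size=(n_hid, n_hid))   # memory -> memory (purely linear)

def cw_lmn_step(x_t, m_prev, t):
    h_t = np.tanh(W_xh @ x_t + W_mh @ m_prev)  # nonlinear input-output transformation
    m_new = W_hm @ h_t + W_mm @ m_prev         # linear memory update
    m_t = m_prev.copy()
    for i, T in enumerate(periods):
        if t % T == 0:                         # module i ticks only every T steps
            m_t[i * mod:(i + 1) * mod] = m_new[i * mod:(i + 1) * mod]
    return h_t, m_t

m = np.zeros(n_hid)
for t, x_t in enumerate(rng.normal(size=(16, n_in))):
    h, m = cw_lmn_step(x_t, m, t)

Slow modules thus retain their state across many steps, which is what lets them carry long-term dependencies without gating.
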
We call this new model the Clockwork Linear Memory Network (CW-LMN). We also developed an incremental pretraining algorithm for this model, extending the pretraining algorithm available for the memory component of Linear Memory Networks, in which we add and train one memory module at a time. We show that our model outperforms related models from the literature, such as gated networks, on sequence generation from signals and on spoken word recognition, and that the pretraining algorithms provide better performance, improved training stability, and possibly lower training times.
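
The incremental schedule can be pictured with the following toy sketch (our own assumptions throughout: a closed-form rank-k fit stands in for the actual pretraining objective, which in the thesis extends the linear-autoencoder pretraining of the LMN memory); it shows only the add-one-module, train, freeze loop, where each new module is fit on what the already-frozen modules fail to reconstruct:

import numpy as np

rng = np.random.default_rng(1)
H = rng.normal(size=(200, 16))   # toy sequence of functional activations to encode
k = 4                            # hypothetical size of one memory module

modules = []                     # frozen (period, encoder, decoder) triples
residual = H
for period in (1, 2, 4, 8):      # add one module per clock rate
    U, s, Vt = np.linalg.svd(residual, full_matrices=False)
    enc, dec = Vt[:k].T, Vt[:k]  # rank-k encoder/decoder for the new module
    modules.append((period, enc, dec))            # freeze after training
    residual = residual - (residual @ enc) @ dec  # later modules fit what remains

print(len(modules), "modules; relative residual",
      np.linalg.norm(residual) / np.linalg.norm(H))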