Thesis etd-03092023-161823
Thesis type
Master's thesis
Author
ZUPPOLINI, ANDREA
Email address
a.zuppolini@studenti.unipi.it, oneanders@live.it
URN
etd-03092023-161823
Title
Knowledge transfer in Distributed Continual Learning Scenarios
Department
INFORMATICA
Degree programme
INFORMATICA
Supervisors
supervisor Carta, Antonio
supervisor De Caro, Valerio
Keywords
- continual learning
- knowledge distillation
Thesis defense date
14/04/2023
Availability
Full
Abstract
This master's thesis explores the application of knowledge distillation to mitigating catastrophic forgetting in a continual learning setting. Continual learning is a sub-field of machine learning in which a model is trained on a sequence of tasks while trying to preserve performance on earlier ones; training in this way typically leads to catastrophic forgetting. Here, a knowledge distillation approach is used to mitigate the forgetting phenomenon, proposing a model architecture in which a student learns from a teacher model while approaching new tasks. This method is applied to three experimental settings: in the first, the teacher is pre-trained on the full dataset (Joint CIFAR-100), while in the second and third both teacher and student are trained on Split CIFAR-100.
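For context, the teacher-student setup described in the abstract follows the general knowledge-distillation recipe (Hinton et al., 2015): the student is trained on a weighted sum of the usual cross-entropy on the current task's labels and a KL-divergence term that pulls its softened outputs toward the teacher's. The sketch below is a minimal PyTorch illustration of such a loss, not the thesis's actual implementation; the function name and the `temperature` and `alpha` hyperparameters are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=2.0, alpha=0.5):
    # Hard-label term: ordinary cross-entropy on the current task's labels.
    ce = F.cross_entropy(student_logits, targets)
    # Soft-label term: KL divergence between the temperature-softened
    # teacher and student distributions, scaled by T^2 as in Hinton et al.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd

# Toy usage: a batch of 8 examples over 100 classes (a CIFAR-100-sized head).
student_logits = torch.randn(8, 100)
teacher_logits = torch.randn(8, 100)
targets = torch.randint(0, 100, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
```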
File
File name | Size |
---|---|
Master_T...olini.pdf | 1.38 MB |