Thesis etd-02092026-142307
Thesis type
Master's thesis (laurea magistrale)
Author
PETRUZZELLA, CHRISTIAN
URN
etd-02092026-142307
Title
Learning to Prune Image Classifiers in Weight Space via Hypernetworks
Department
INGEGNERIA DELL'INFORMAZIONE
Degree programme
ARTIFICIAL INTELLIGENCE AND DATA ENGINEERING
Supervisors
Supervisor Prof. Cimino, Mario Giovanni Cosimo Antonio
Supervisor Prof. Galatolo, Federico Andrea
Supervisor Dott. Parola, Marco
Keywords
- hypernetworks
- image classification
- model pruning
- sparsity control
- weight-space learning
Defence session date
27/02/2026
Availability
Not available for consultation
Release date
27/02/2096
Abstract (English)
This thesis proposes a hypernetwork-based pruning approach for image classifiers. The core idea is to formulate pruning as a learning problem in weight space, where a hypernetwork is trained to generate binary masks for pretrained neural networks. Mask prediction is conditioned on an external sparsity control signal, so the same hypernetwork can produce pruned models at different sparsity levels without retraining. Using a hypernetwork also makes it possible to address the problem more efficiently and without relying on the original training data. The proposed method is evaluated on convolutional neural networks of different sizes trained on standard image classification benchmarks, and is compared against both traditional pruning techniques (random pruning, magnitude-based pruning, and layerwise Optimal Brain Surgeon) and more advanced methods such as WoodFisher and CHITA. Pruning quality is assessed using accuracy retention and compression rate as the primary evaluation metrics. The work investigates the properties, advantages, and limitations of hypernetwork-based mask prediction, with particular attention to sparsity controllability, architectural scaling, and generalization across model zoos.
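The mask-prediction idea described above can be illustrated with a minimal sketch. This is not the thesis's actual method: the "hypernetwork" here is a tiny untrained MLP with hypothetical per-weight features (weight value, magnitude, and the target sparsity), and the binary mask is obtained by dropping the lowest-scored weights until the requested sparsity is met.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretrained layer weights to be pruned (hypothetical 64x32 dense layer).
W = rng.normal(size=(64, 32))

# Toy "hypernetwork": a one-hidden-layer MLP mapping each weight's features
# plus the target sparsity level to a keep-score. The parameters are random
# here, standing in for trained hypernetwork weights.
H1 = rng.normal(scale=0.1, size=(3, 16))
H2 = rng.normal(scale=0.1, size=(16, 1))

def predict_mask(W, sparsity):
    """Generate a binary mask for W at the requested sparsity level."""
    feats = np.stack(
        [W.ravel(), np.abs(W).ravel(), np.full(W.size, sparsity)], axis=1
    )                                            # (n, 3) per-weight features
    scores = np.maximum(feats @ H1, 0.0) @ H2    # (n, 1) keep-scores
    k = int(round(sparsity * W.size))            # number of weights to drop
    order = np.argsort(scores.ravel())           # lowest-scored pruned first
    mask = np.ones(W.size)
    mask[order[:k]] = 0.0
    return mask.reshape(W.shape)

# The same "hypernetwork" serves any sparsity level without retraining.
mask = predict_mask(W, sparsity=0.9)
print(1.0 - mask.mean())  # achieved sparsity, approximately 0.9
```

Applying the mask elementwise (`W * mask`) yields the pruned layer; the external sparsity argument is what gives the single hypernetwork control over the compression rate.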
File
| File name | Size |
|---|---|
| The thesis is not available for consultation. | |