
ETD

Digital archive of theses defended at the University of Pisa

Thesis etd-03162024-112440


Thesis type
Master's thesis
Author
FARCHIONE, DIEGO EUSTACHIO
URN
etd-03162024-112440
Title
Bidirectional Encoder Representations from Transformers (BERT) for Task Automation in Seismic Processing
Department
EARTH SCIENCES
Degree programme
EXPLORATION AND APPLIED GEOPHYSICS
Supervisors
Supervisor: Prof. Bienati, Nicola
Keywords
  • seismic processing
  • nlp
  • bert
  • geophysics
  • deep learning
  • large language models
  • synthetic data
  • signal processing
Defense session start date
12/04/2024
Availability
Full
Abstract
BERT (Bidirectional Encoder Representations from Transformers) is a deep learning model for Natural Language Processing developed by researchers at Google in 2018. In this thesis, the model is used to automate several tasks in seismic processing: velocity prediction, denoising, and first-break picking. First, synthetic velocity models were generated in two ways: with a deep learning technique called Neural Style Transfer, and by stacking horizontal layers with random properties.
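The thesis does not include code, but the second generation strategy (random horizontal layers) can be illustrated with a minimal sketch. All names and parameter values below are hypothetical, not taken from the thesis: a 1D depth profile is split at random interface depths, and each layer receives a random velocity, sorted so that velocity tends to increase with depth.

```python
import random

def random_layered_model(nz=100, n_layers=5, v_min=1500.0, v_max=4500.0, seed=0):
    """Build a 1D velocity profile (m/s) made of horizontal layers with
    random thicknesses and random, depth-increasing velocities."""
    rng = random.Random(seed)
    # Random interface depths split the nz depth samples into n_layers bands.
    cuts = sorted(rng.sample(range(1, nz), n_layers - 1))
    boundaries = [0] + cuts + [nz]
    # One velocity per layer, sorted so velocity increases with depth.
    velocities = sorted(rng.uniform(v_min, v_max) for _ in range(n_layers))
    profile = []
    for i in range(n_layers):
        profile += [velocities[i]] * (boundaries[i + 1] - boundaries[i])
    return profile

model = random_layered_model()
```

A 2D model would simply replicate such a profile laterally; the thesis presumably adds more structure than this sketch shows.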
Next, synthetic shot gathers were produced with the Devito software, which solves the elastic wave equation with the finite-difference method. Three main configurations of shot gathers were generated by applying different filters, source wavelets, and the two types of velocity models created in the first step.
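To give a flavor of the finite-difference modeling step without depending on Devito or the elastic formulation, here is a deliberately simplified, hypothetical sketch: an explicit second-order scheme for the 1D acoustic wave equation with a Ricker source, recording a single receiver trace. It stands in for the much richer elastic 2D/3D modeling done in the thesis; every name and parameter is an assumption for illustration only.

```python
import math

def fd_shot_gather_1d(velocity, nt=300, dt=0.001, dx=10.0, f0=25.0,
                      src_ix=50, rec_ix=2):
    """Explicit 2nd-order finite differences for the 1D acoustic wave
    equation; records one receiver trace (a minimal 'shot gather').
    Assumes the CFL condition max(velocity)*dt/dx <= 1 holds."""
    nz = len(velocity)
    prev, curr = [0.0] * nz, [0.0] * nz
    trace = []
    for it in range(nt):
        nxt = [0.0] * nz
        for iz in range(1, nz - 1):
            lap = (curr[iz + 1] - 2.0 * curr[iz] + curr[iz - 1]) / dx ** 2
            nxt[iz] = 2.0 * curr[iz] - prev[iz] + (velocity[iz] * dt) ** 2 * lap
        # Inject a Ricker wavelet of peak frequency f0 at the source position.
        t = it * dt - 1.0 / f0
        a = (math.pi * f0 * t) ** 2
        nxt[src_ix] += (1.0 - 2.0 * a) * math.exp(-a) * dt ** 2
        prev, curr = curr, nxt
        trace.append(curr[rec_ix])
    return trace

trace = fd_shot_gather_1d([2000.0] * 101)
```

In the actual workflow, Devito generates optimized stencil code from a symbolic statement of the (elastic) wave equation, so the hand-written update loop above is never needed.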
These generated synthetic data were fed to BERT, together with the training data in equal proportion, during the pre-training stage so that the model could learn the main features of seismic shot gathers. The fine-tuning phase then adapts the pre-trained model to a specific downstream task, using only the synthetic data as input. The model is shown to perform well on denoising, velocity prediction, and first-break picking on field data.
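BERT-style pre-training is self-supervised: a fraction of the input tokens is masked and the model learns to reconstruct them. How the thesis tokenizes shot gathers is not stated here, so the sketch below only shows the generic masking step on an abstract token sequence; the function, the mask id, and the 15% fraction (BERT's default) are illustrative assumptions.

```python
import random

MASK_ID = 0  # hypothetical id reserved for the [MASK] token

def mask_tokens(tokens, mask_frac=0.15, seed=0):
    """BERT-style masked pre-training input: hide a fraction of the tokens
    and keep their original values as the reconstruction targets."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(tokens) * mask_frac))
    positions = rng.sample(range(len(tokens)), n_mask)
    masked = list(tokens)
    targets = {}
    for p in positions:
        targets[p] = masked[p]   # label the model must predict
        masked[p] = MASK_ID      # replace the input with the mask token
    return masked, targets

tokens = list(range(1, 21))
masked, targets = mask_tokens(tokens)
```

Fine-tuning then replaces this reconstruction objective with a task-specific head (e.g. for velocity prediction or first-break picking) while reusing the pre-trained encoder weights.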
File