Effects of Network-Specific Training and Waveform Denoising on ML-Based Seismic Phase Picking
Abstract
We train and evaluate deep-learning-based models for picking seismic phases in local, regional, and teleseismic seismological datasets. The training and test datasets at regional and teleseismic distances are assembled from bulletins provided by the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). In this study we test how machine learning (ML) algorithms, namely phase picking and denoising, can support the detection and characterization of potential nuclear explosions. ML algorithms promise faster operational procedures for detecting explosive and non-explosive events at lower signal-to-noise ratios than is currently possible. For training and testing at local distances, a dataset provided by the German Federal Seismological Survey (GR) is used. The established phase-picking algorithms PhaseNet and EQTransformer (EQT) are employed here both in their original model form and with newly trained models. The network-specific phase-picker models are evaluated against other available models. We also test the influence that autoencoder-based denoising has on the performance of deep-learning-based seismic phase picking, and we train phase-picking models on denoised data. We find that the original models already generalize well to IMS and GR data. However, training network-specific models improves predictions at teleseismic and regional distances. Application to denoised seismic waveforms necessitates dedicated models trained on denoised data.
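Evaluating a picker model against another, as described above, typically reduces to matching predicted arrival times to reference (catalog) picks within a fixed time tolerance and scoring precision, recall, and F1. The sketch below uses hypothetical helper names and an illustrative 0.5 s tolerance; it is not the study's actual evaluation code.

```python
def match_picks(predicted, reference, tol=0.5):
    """Greedily match predicted pick times (s) to reference picks.

    A predicted pick counts as a true positive when it lies within
    `tol` seconds of a not-yet-matched reference pick.
    """
    used = [False] * len(reference)
    tp = 0
    for p in sorted(predicted):
        # Find the nearest unmatched reference pick within tolerance.
        best, best_dt = None, tol
        for i, r in enumerate(reference):
            if not used[i] and abs(p - r) <= best_dt:
                best, best_dt = i, abs(p - r)
        if best is not None:
            used[best] = True
            tp += 1
    fp = len(predicted) - tp   # spurious picks
    fn = len(reference) - tp   # missed catalog picks
    return tp, fp, fn

def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

With these scores, two models (e.g., an original and a network-specific retraining) can be compared on the same test traces under identical matching rules.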
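The denoising idea can be illustrated in its simplest form: a linear autoencoder with MSE loss learns the top-k principal subspace of its inputs, so its denoising action amounts to projecting each trace onto the strongest components (mostly signal) and discarding the rest (mostly noise). The sketch below uses synthetic sinusoids as stand-ins for seismic traces (toy data, not the IMS/GR waveforms from the study), and the linear/PCA form is a deliberate simplification of the deep autoencoders the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "waveforms": sinusoids of random frequency, plus white noise.
t = np.linspace(0.0, 1.0, 64)
freqs = rng.uniform(2.0, 6.0, size=200)
clean = np.sin(2.0 * np.pi * freqs[:, None] * t[None, :])
noisy = clean + 0.5 * rng.standard_normal(clean.shape)

# Linear-autoencoder denoising as a PCA projection: keep the k
# strongest principal components of the noisy traces.
k = 16
mean = noisy.mean(axis=0)
_, _, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
proj = Vt[:k].T @ Vt[:k]                # encode, then decode
denoised = (noisy - mean) @ proj + mean

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
```

Because the projection also distorts the signal slightly, a picker trained on raw waveforms sees systematically different inputs after denoising, which is consistent with the abstract's finding that denoised data call for dedicated, denoised-trained picking models.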