arxiv:2512.04694

TimesNet-Gen: Deep Learning-based Site Specific Strong Motion Generation

Published on Dec 4 · Submitted by Yilmaz on Dec 8

Abstract

AI-generated summary: TimesNet-Gen, a time-domain conditional generator with a station-specific latent bottleneck, effectively synthesizes site-specific strong ground motion records, outperforming a spectrogram-based conditional VAE baseline.

Effective earthquake risk reduction relies on accurate site-specific evaluations. This requires models that can represent the influence of local site conditions on ground motion characteristics. In this context, data-driven approaches that learn site-controlled signatures from recorded ground motions offer a promising direction. We address strong ground motion generation from time-domain accelerometer records and introduce TimesNet-Gen, a time-domain conditional generator. The approach uses a station-specific latent bottleneck. We evaluate generation by comparing HVSR curves and fundamental site-frequency f_0 distributions between real and generated records per station, and summarize station specificity with a score based on the f_0 distribution confusion matrices. TimesNet-Gen achieves strong station-wise alignment and compares favorably with a spectrogram-based conditional VAE baseline for site-specific strong motion synthesis. Our code is available at https://github.com/brsylmz23/TimesNet-Gen.
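To make the evaluation protocol above concrete, here is a minimal sketch of how an HVSR curve and the fundamental site frequency f_0 could be computed per record. It is not taken from the released code: it assumes a Welch spectral estimate and the common geometric-mean convention for the horizontal spectrum, and the paper's exact smoothing and peak-picking choices may differ.

```python
# Illustrative sketch only (not the authors' implementation).
import numpy as np
from scipy.signal import welch

def hvsr_curve(ns, ew, ud, fs, nperseg=1024):
    """HVSR(f): geometric mean of horizontal amplitude spectra over the vertical spectrum."""
    f, p_ns = welch(ns, fs=fs, nperseg=nperseg)
    _, p_ew = welch(ew, fs=fs, nperseg=nperseg)
    _, p_ud = welch(ud, fs=fs, nperseg=nperseg)
    # welch returns power spectral densities; square roots give amplitude spectra.
    h = np.sqrt(np.sqrt(p_ns * p_ew))   # geometric mean of the two horizontal amplitudes
    v = np.sqrt(p_ud)
    return f, h / np.maximum(v, 1e-12)

def fundamental_frequency(f, hvsr, fmin=0.2, fmax=20.0):
    """f_0: frequency of the HVSR peak within a plausible site-response band."""
    band = (f >= fmin) & (f <= fmax)
    return f[band][np.argmax(hvsr[band])]

# Per-station comparison: collect f_0 over real and generated records (hypothetical arrays
# of shape (n_records, 3, n_samples), sampled at fs) and compare the two distributions.
# f0_real = [fundamental_frequency(*hvsr_curve(r[0], r[1], r[2], fs=100.0)) for r in records_real]
```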

Community

Paper author · Paper submitter

This work presents a transformer-based generative model for complex time-series signals, with experiments on seismic accelerometer data.

Key idea: treat seismic waveforms as structured high-dimensional sequences and learn a latent trajectory that captures both physical dynamics and long-range temporal dependencies.
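As a rough illustration of what such a station-specific latent bottleneck can look like in a time-domain conditional generator, the PyTorch sketch below adds a learned per-station embedding at the bottleneck of a 1-D convolutional autoencoder. Layer choices and names are placeholders, not the paper's TimesNet-based architecture.

```python
# Illustrative sketch only (hypothetical layers, not TimesNet-Gen itself).
import torch
import torch.nn as nn

class ConditionalWaveGenerator(nn.Module):
    def __init__(self, n_stations, latent_dim=128):
        super().__init__()
        # 1-D conv encoder: compress the raw acceleration trace into a latent sequence.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=15, stride=4, padding=7), nn.GELU(),
            nn.Conv1d(32, 64, kernel_size=15, stride=4, padding=7), nn.GELU(),
            nn.Conv1d(64, latent_dim, kernel_size=15, stride=4, padding=7), nn.GELU(),
        )
        # Station-specific conditioning: one learned vector per recording station.
        self.station_emb = nn.Embedding(n_stations, latent_dim)
        # Transposed-conv decoder: map the conditioned latent back to a waveform.
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(latent_dim, 64, kernel_size=16, stride=4, padding=6), nn.GELU(),
            nn.ConvTranspose1d(64, 32, kernel_size=16, stride=4, padding=6), nn.GELU(),
            nn.ConvTranspose1d(32, 1, kernel_size=16, stride=4, padding=6),
        )

    def forward(self, wave, station_id):
        z = self.encoder(wave)                          # (batch, latent_dim, T')
        cond = self.station_emb(station_id)[..., None]  # (batch, latent_dim, 1)
        return self.decoder(z + cond)                   # fuse the station signature at the bottleneck

# model = ConditionalWaveGenerator(n_stations=20)
# out = model(torch.randn(8, 1, 4096), torch.randint(0, 20, (8,)))  # -> (8, 1, 4096)
```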
