PREPRINT

Don't Pay Attention to the Noise: Learning Self-supervised Representations of Light Curves with a Denoising Time Series Transformer

Mario Morvan, Nikolaos Nikolaou, Kai Hou Yip, Ingo Waldmann
arXiv:2207.02777

Submitted on 6 July 2022

Abstract

Astrophysical light curves are particularly challenging data objects due to the intensity and variety of noise contaminating them. Yet, despite the astronomical volumes of light curves available, the majority of algorithms used to process them still operate on a per-sample basis. To remedy this, we propose a simple Transformer model -- called Denoising Time Series Transformer (DTST) -- and show that it excels at removing noise and outliers from datasets of time series when trained with a masked objective, even when no clean targets are available. Moreover, the use of self-attention enables rich and illustrative queries into the learned representations. We present experiments on real stellar light curves from the Transiting Exoplanet Survey Satellite (TESS), showing the advantages of our approach over traditional denoising techniques.
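
As a rough illustration of the masked training objective described in the abstract, the PyTorch sketch below shows one way a denoising Transformer for light-curve segments could be set up. The module layout, the zero-masking of a random 15% of time steps, and all hyperparameters are illustrative assumptions and are not taken from the paper.

import torch
import torch.nn as nn

class DenoisingTimeSeriesTransformer(nn.Module):
    # Hypothetical architecture: the paper's actual DTST may differ.
    def __init__(self, d_model=64, nhead=4, num_layers=3, max_len=512):
        super().__init__()
        self.embed = nn.Linear(1, d_model)                         # project flux values to model dimension
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positional encoding (assumption)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)                          # reconstruct flux at every time step

    def forward(self, x):                                          # x: (batch, time, 1)
        h = self.embed(x) + self.pos[:, : x.size(1)]
        h = self.encoder(h)
        return self.head(h)

def masked_training_step(model, optimizer, flux, mask_frac=0.15):
    # One self-supervised step: hide a random subset of time steps and
    # compute the reconstruction loss only at the hidden positions,
    # so no clean target light curve is ever required.
    mask = torch.rand(flux.shape[:2], device=flux.device) < mask_frac  # (batch, time) boolean mask
    corrupted = flux.clone()
    corrupted[mask] = 0.0                                              # simple zero-masking; other schemes possible
    pred = model(corrupted)
    loss = ((pred - flux)[mask] ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

At inference time, the trained model would be applied to the full, unmasked light curve and its output taken as the denoised series; the encoder's self-attention weights can then be inspected to query the learned representations.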

Comment: ICML 2022 Workshop: Machine Learning for Astrophysics

Subjects: Astrophysics - Instrumentation and Methods for Astrophysics; Statistics - Machine Learning

URL: https://arxiv.org/abs/2207.02777