Self-Supervised Learning for Time Series: Contrastive or Generative?

Kavli Affiliate: Xiang Zhang

| First 5 Authors: Ziyu Liu, Azadeh Alavi, Minyi Li, Xiang Zhang,

| Summary:

Self-supervised learning (SSL) has recently emerged as a powerful approach to
learning representations from large-scale unlabeled data, showing promising
results in time series analysis. Self-supervised representation learning can be
categorized into two mainstream paradigms: contrastive and generative. In this
paper, we present a comprehensive comparative study of contrastive and
generative methods for time series. We first introduce the basic frameworks
for contrastive and generative SSL, respectively, and discuss how to obtain the
supervision signal that guides the model optimization. We then implement
classical algorithms (SimCLR vs. MAE) for each type and conduct a comparative
analysis in fair settings. Our results provide insights into the strengths and
weaknesses of each approach and offer practical recommendations for choosing
suitable SSL methods. We also discuss the implications of our findings for the
broader field of representation learning and propose future research
directions. All the code and data are released at
https://github.com/DL4mHealth/SSL_Comparison.
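The two supervision signals contrasted in the abstract can be made concrete with a minimal NumPy sketch (illustrative only, not the paper's implementation): a SimCLR-style NT-Xent loss, where two augmented views of the same series form the positive pair and the rest of the batch serves as negatives, and an MAE-style reconstruction loss computed only on masked time steps. All function and variable names here are assumptions for illustration.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent contrastive loss (illustrative sketch).

    z1, z2: (N, D) embeddings of two augmented views of the same N series.
    Positive pairs are (z1[i], z2[i]); all other samples act as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)              # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-normalize -> cosine sim
    sim = z @ z.T / temperature                       # (2N, 2N) similarity matrix
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    n = z1.shape[0]
    # index of each sample's positive: row i pairs with row i+n and vice versa
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def masked_reconstruction_loss(x, x_hat, mask):
    """MAE-style generative loss: MSE averaged over masked time steps only.

    x, x_hat: (N, T) original and reconstructed series; mask: (N, T) in {0, 1},
    1 marking time steps that were hidden from the encoder.
    """
    diff = (x - x_hat) ** 2
    return (diff * mask).sum() / np.maximum(mask.sum(), 1)
```

The contrast is visible in what supplies the training signal: NT-Xent needs a batch of views (instance discrimination across samples), whereas the reconstruction loss needs only the input itself plus a mask.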

| Search Query: ArXiv Query: search_query=au:"Xiang Zhang"&id_list=&start=0&max_results=3
