Evaluating Temporal Plasticity in Foundation Time Series Models for Incremental Fine-tuning

Kavli Affiliate: Jia Liu

| First 5 Authors: Jia Liu, Cheng Jinguo, Xia Fang, Zhenyuan Ma, Yuankai Wu

| Summary:

Time series foundation models excel at diverse time series forecasting tasks,
but their capacity for continuous improvement through incremental learning
remains unexplored. We present the first comprehensive study investigating
these models’ temporal plasticity: their ability to progressively enhance
performance through continual learning while maintaining existing capabilities.
Through experiments on real-world datasets exhibiting distribution shifts, we
evaluate both conventional deep learning models and foundation models using a
novel continual learning framework. Our findings reveal that while traditional
models struggle with performance deterioration during incremental fine-tuning,
foundation models like Time-MoE and Chronos demonstrate sustained improvement
in predictive accuracy. These results suggest that optimizing foundation model
fine-tuning strategies may be more valuable than developing domain-specific
small models. Our research introduces new evaluation methodologies and insights
for developing foundation time series models with robust continuous learning
capabilities.
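The evaluation protocol the summary describes can be approximated as a walk-forward loop: fine-tune on one chunk of the series, score on the next, and track whether error shrinks (plasticity) or grows (deterioration). The sketch below is a hypothetical illustration of that loop, not the paper's actual framework; the `update`/`predict` toy model and all function names are assumptions for demonstration.

```python
import numpy as np

def incremental_eval(model_update, model_predict, series, chunk_len):
    """Walk-forward incremental fine-tuning evaluation (illustrative sketch).

    Splits `series` into consecutive chunks; after updating on chunk t the
    model forecasts chunk t+1. Returns per-step MAE, so a downward trend
    indicates sustained improvement and an upward trend indicates
    deterioration under distribution shift.
    """
    chunks = [series[i:i + chunk_len]
              for i in range(0, len(series) - chunk_len + 1, chunk_len)]
    errors = []
    state = None
    for t in range(len(chunks) - 1):
        state = model_update(state, chunks[t])            # incremental fine-tune
        preds = model_predict(state, len(chunks[t + 1]))  # forecast next chunk
        errors.append(float(np.mean(np.abs(preds - chunks[t + 1]))))
    return errors

# Toy stand-in for a forecasting model: tracks a running mean
# over everything seen so far and predicts it forward.
def update(state, chunk):
    if state is None:
        return (chunk.sum(), len(chunk))
    total, n = state
    return (total + chunk.sum(), n + len(chunk))

def predict(state, horizon):
    total, n = state
    return np.full(horizon, total / n)
```

Swapping the toy model for a gradient step on a conventional network versus a parameter-efficient update of a foundation model (e.g. Time-MoE or Chronos) would reproduce the kind of comparison the study performs.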
