Repurposing Foundation Model for Generalizable Medical Time Series Classification

Kavli Affiliate: Xiang Zhang

| First 5 Authors: Nan Huang, Haishuai Wang, Zihuai He, Marinka Zitnik, Xiang Zhang

| Summary:

Medical time series (MedTS) classification is critical for a wide range of
healthcare applications such as Alzheimer’s Disease diagnosis. However, its
real-world deployment is severely challenged by poor generalizability due to
inter- and intra-dataset heterogeneity in MedTS, including variations in
channel configurations, time series lengths, and diagnostic tasks. Here, we
propose FORMED, a foundation classification model that leverages a pre-trained
backbone and tackles these challenges through repurposing. FORMED integrates
the general representation learning enabled by the backbone foundation model
and the medical domain knowledge gained on a curated cohort of MedTS datasets.
FORMED can adapt seamlessly to unseen MedTS datasets, regardless of the number
of channels, sample lengths, or medical tasks. Experimental results show that,
without any task-specific adaptation, the repurposed FORMED achieves
performance that is competitive with, and often superior to, 11 baseline models
trained specifically for each dataset. Furthermore, FORMED can effectively
adapt to entirely new, unseen datasets with lightweight parameter updates,
consistently outperforming baselines. Our results highlight FORMED as a
versatile and scalable model for a wide range of MedTS classification tasks,
positioning it as a strong foundation model for future research in MedTS
analysis.
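
The abstract gives no implementation details, but the core repurposing idea it describes, a frozen pre-trained backbone paired with a lightweight, channel- and task-agnostic classification head that is the only component updated for a new dataset, can be sketched roughly as below. This is a minimal illustrative sketch, not FORMED's actual architecture: all class names, parameter names, and dimensions (`FrozenBackbone`, `RepurposingHead`, `d_model`, `patch_len`) are assumptions made for the example.

```python
# Minimal sketch of "repurposing" a frozen backbone with a lightweight head.
# All names and shapes here are illustrative assumptions, not FORMED's API.
import torch
import torch.nn as nn


class FrozenBackbone(nn.Module):
    """Stand-in for a pre-trained time series foundation model.

    Each channel is embedded independently into d_model-dimensional tokens,
    and the weights are frozen, so nothing here is updated during adaptation.
    """

    def __init__(self, patch_len: int = 16, d_model: int = 64):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)  # placeholder for the real backbone
        for p in self.parameters():
            p.requires_grad = False  # backbone stays fixed

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length). Encoding each channel independently
        # keeps the model agnostic to the number of channels.
        b, c, length = x.shape
        n_patches = length // self.patch_len
        x = x[:, :, : n_patches * self.patch_len]
        x = x.reshape(b * c, n_patches, self.patch_len)
        return self.embed(x)  # (batch * channels, n_patches, d_model)


class RepurposingHead(nn.Module):
    """Lightweight head: the only part updated when adapting to a new dataset."""

    def __init__(self, d_model: int = 64, n_classes: int = 2):
        super().__init__()
        self.query = nn.Parameter(torch.randn(1, 1, d_model))  # learned pooling query
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, tokens: torch.Tensor, batch: int, channels: int) -> torch.Tensor:
        # Attention-pool each channel's variable-length token sequence ...
        q = self.query.expand(tokens.size(0), -1, -1)
        pooled, _ = self.attn(q, tokens, tokens)      # (batch * channels, 1, d_model)
        pooled = pooled.reshape(batch, channels, -1)  # regroup tokens by sample
        # ... then average over channels, so any channel count is accepted.
        return self.classifier(pooled.mean(dim=1))    # (batch, n_classes)


backbone = FrozenBackbone()
head = RepurposingHead(n_classes=3)  # e.g. a hypothetical 3-class diagnostic task

x = torch.randn(8, 19, 256)  # 8 samples, 19 channels (EEG-like), length 256
tokens = backbone(x)
logits = head(tokens, batch=8, channels=19)
print(logits.shape)  # torch.Size([8, 3])

# "Lightweight parameter updates": only the head is trained on an unseen dataset.
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
```

Under these assumptions, per-channel encoding plus attention pooling is what lets a single set of backbone weights serve datasets with arbitrary channel counts, sample lengths, and label spaces, which is the generalizability property the abstract emphasizes.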

| Search Query: ArXiv Query: search_query=au:"Xiang Zhang"&id_list=&start=0&max_results=3
