UnitNorm: Rethinking Normalization for Transformers in Time Series

Kavli Affiliate: Xiang Zhang

| First 5 Authors: Nan Huang, Christian Kümmerle, Xiang Zhang

| Summary:

Normalization techniques are crucial for enhancing Transformer models’ performance and stability in time series analysis tasks, yet traditional methods like batch and layer normalization often lead to issues such as token shift, attention shift, and sparse attention. We propose UnitNorm, a novel approach that scales input vectors by their norms and modulates attention patterns, effectively circumventing these challenges. Grounded in existing normalization frameworks, UnitNorm is rigorously evaluated across diverse time series analysis tasks, including forecasting, classification, and anomaly detection, on 6 state-of-the-art models and 10 datasets. Notably, UnitNorm shows superior performance, especially in scenarios requiring robust attention mechanisms and contextual comprehension, evidenced by improvements of up to a 1.46 decrease in MSE for forecasting and a 4.89% increase in accuracy for classification. This work not only calls for a reevaluation of normalization strategies in time series Transformers but also sets a new direction for enhancing model performance and stability. The source code is available at https://anonymous.4open.science/r/UnitNorm-5B84.
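
The summary describes the core operation as scaling input vectors by their norms. As a rough illustration only, the minimal PyTorch sketch below shows what per-token L2 norm scaling could look like; the module name, shapes, and epsilon guard are assumptions for this example, and the paper's actual UnitNorm (including its attention-pattern modulation) is specified in the linked source code.

```python
import torch
import torch.nn as nn

class UnitNormSketch(nn.Module):
    """Illustrative per-token unit-norm scaling (not the paper's exact method).

    Each token embedding x in R^d is rescaled to x / ||x||_2, so every token
    lies on the unit hypersphere regardless of its original magnitude.
    """

    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.eps = eps  # guards against division by zero for near-zero tokens

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); normalize along the feature dimension
        norm = x.norm(dim=-1, keepdim=True).clamp_min(self.eps)
        return x / norm

if __name__ == "__main__":
    x = torch.randn(2, 96, 64)    # (batch, time steps, model dim) -- assumed shapes
    y = UnitNormSketch()(x)
    print(y.norm(dim=-1)[0, :3])  # each token now has (approximately) unit L2 norm
```

Unlike batch or layer normalization, a scaling of this kind does not recenter tokens, which is one plausible way to avoid the token-shift and attention-shift effects the summary mentions.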

| Search Query: ArXiv Query: search_query=au:"Xiang Zhang"&id_list=&start=0&max_results=3
