GenEFT: Understanding Statics and Dynamics of Model Generalization via Effective Theory

Kavli Affiliate: Max Tegmark

| First 5 Authors: David D. Baek, Ziming Liu, Max Tegmark

| Summary:

We present GenEFT: an effective theory framework for shedding light on the
statics and dynamics of neural network generalization, and illustrate it with
graph learning examples. We first investigate the generalization phase
transition as data size increases, comparing experimental results with
information-theory-based approximations. We find that generalization occurs in
a Goldilocks zone where the decoder is neither too weak nor too powerful. We then introduce
an effective theory for the dynamics of representation learning, where
latent-space representations are modeled as interacting particles (repons), and
find that it explains our experimentally observed phase transition between
generalization and overfitting as encoder and decoder learning rates are
scanned. This highlights the power of physics-inspired effective theories for
bridging the gap between theoretical predictions and practice in machine
learning.
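The repon picture, where latent-space representations behave as interacting particles, can be illustrated with a minimal toy sketch. The spring-like attraction, logarithmic repulsion, and the helper `repon_step` below are illustrative assumptions for this sketch, not the paper's actual interaction law or equations.

```python
import numpy as np

def repon_step(z, eta_enc, attract_pairs, repel_pairs):
    """One gradient step on pairwise 'repon' interactions.

    z: (N, d) array of latent representations ("repons").
    eta_enc: encoder-side learning rate for the representations.
    attract_pairs / repel_pairs: index pairs pulled together / pushed apart.
    Force law (illustrative): quadratic spring attraction, log repulsion.
    """
    grad = np.zeros_like(z)
    for i, j in attract_pairs:       # spring: gradient of 0.5 * ||zi - zj||^2
        d = z[i] - z[j]
        grad[i] += d
        grad[j] -= d
    for i, j in repel_pairs:         # repulsion: gradient of -0.5 * log ||zi - zj||^2
        d = z[i] - z[j]
        r2 = np.dot(d, d) + 1e-6
        grad[i] -= d / r2
        grad[j] += d / r2
    return z - eta_enc * grad

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 2))          # four repons in a 2-D latent space
for _ in range(200):
    z = repon_step(z, 0.05, attract_pairs=[(0, 1), (2, 3)],
                   repel_pairs=[(0, 2), (1, 3)])

# Attracting pairs should end up much closer than repelling pairs.
d_attract = np.linalg.norm(z[0] - z[1])
d_repel = np.linalg.norm(z[0] - z[2])
print(d_attract < d_repel)
```

In this toy setting, sweeping `eta_enc` (and, with a trainable decoder, a separate decoder learning rate) is what would let one probe the generalization-versus-overfitting boundary the summary describes.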
