Learngene: Inheriting Condensed Knowledge from the Ancestry Model to Descendant Models

Kavli Affiliate: Jing Wang

| First 5 Authors: Qiufeng Wang, Xu Yang, Shuxia Lin, Jing Wang, Xin Geng

| Summary:

During the continuous evolution of one organism’s ancestry, its genes
accumulate extensive experiences and knowledge, enabling newborn descendants to
rapidly adapt to their specific environments. Motivated by this observation, we
propose a novel machine learning paradigm, Learngene, which enables learning models
to incorporate three key characteristics of genes. (i) Accumulating: the
knowledge is accumulated during the continuous learning of an ancestry model.
(ii) Condensing: the extensive accumulated knowledge is condensed into a much
more compact piece of information, i.e., the learngene. (iii) Inheriting: the
condensed learngene is inherited to make it easier for descendant models to
adapt to new environments. Since accumulating has been studied in
well-established paradigms like large-scale pre-training and lifelong learning,
we focus on condensing and inheriting, which raise three key issues; we
provide preliminary solutions to them in this paper: (i) Learngene Form: the
learngene takes the form of a few integral layers that preserve the most
significant knowledge. (ii) Learngene Condensing: we identify which layers of
the ancestry model are most similar to those of a pseudo descendant model.
(iii) Learngene Inheriting: to construct distinct descendant models for
specific downstream tasks, we stack randomly initialized layers onto the learngene
layers. Extensive experiments across various settings, spanning different
network architectures such as Vision Transformer (ViT) and Convolutional
Neural Networks (CNNs) as well as different datasets, confirm four
advantages of Learngene: it makes the descendant models 1) converge more
quickly, 2) exhibit less sensitivity to hyperparameters, 3) perform better, and
4) require fewer training samples to converge.
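
To make condensing and inheriting more concrete, below is a minimal PyTorch sketch using toy MLP blocks in place of the paper's ViT/CNN layers. The cosine-similarity criterion in condense_learngene, the placement of the fresh layers beneath the learngene, and all names and dimensions are illustrative assumptions, not the paper's exact method.

    import copy

    import torch
    import torch.nn as nn

    def condense_learngene(ancestry_blocks, pseudo_blocks, k):
        # Hedged sketch of condensing: keep the k ancestry blocks whose
        # flattened weights are most similar (by cosine similarity) to the
        # corresponding blocks of a pseudo descendant model. The similarity
        # measure is a placeholder assumption, not the paper's criterion.
        def flat(block):
            return torch.cat([p.detach().flatten() for p in block.parameters()])

        scores = [
            torch.cosine_similarity(flat(a), flat(p), dim=0).item()
            for a, p in zip(ancestry_blocks, pseudo_blocks)
        ]
        top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
        # Keep the selected blocks in their original order; deep-copy so the
        # ancestry model itself is left untouched.
        return [copy.deepcopy(ancestry_blocks[i]) for i in sorted(top)]

    def inherit(learngene_blocks, in_dim, hidden_dim, num_classes):
        # Sketch of inheriting: stack randomly initialized, task-specific
        # layers with the inherited learngene blocks, plus a fresh task head.
        new_layers = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        head = nn.Linear(hidden_dim, num_classes)
        return nn.Sequential(new_layers, *learngene_blocks, head)

    # Toy usage: MLP blocks stand in for ViT/CNN layers.
    def block(dim):
        return nn.Sequential(nn.Linear(dim, dim), nn.ReLU())

    ancestry = nn.ModuleList([block(64) for _ in range(6)])
    pseudo = nn.ModuleList([block(64) for _ in range(6)])
    learngene = condense_learngene(ancestry, pseudo, k=2)
    descendant = inherit(learngene, in_dim=32, hidden_dim=64, num_classes=10)
    print(descendant(torch.randn(4, 32)).shape)  # torch.Size([4, 10])

Placing the randomly initialized layers beneath the inherited blocks reflects one plausible reading of "stacking"; the descendant's exact layout depends on which layers of the ancestry model form the learngene.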

| Search Query: ArXiv Query: search_query=au:"Jing Wang"&id_list=&start=0&max_results=10
