Papers
arxiv:2602.03846

PLATE: Plasticity-Tunable Efficient Adapters for Geometry-Aware Continual Learning

Published on Feb 3
Abstract

PLATE is a continual learning approach that operates without access to previous-task data by exploiting geometric redundancy in pretrained networks through structured low-rank updates.

AI-generated summary

We develop a continual learning method for pretrained models that requires no access to old-task data, addressing a practical barrier in foundation model adaptation where pretraining distributions are often unavailable. Our key observation is that pretrained networks exhibit substantial geometric redundancy, and that this redundancy can be exploited in two complementary ways. First, redundant neurons provide a proxy for dominant pretraining-era feature directions, enabling the construction of approximately protected update subspaces directly from pretrained weights. Second, redundancy offers a natural bias for where to place plasticity: by restricting updates to a subset of redundant neurons and constraining the remaining degrees of freedom, we obtain update families with reduced functional drift on the old-data distribution and improved worst-case retention guarantees. These insights lead to PLATE (Plasticity-Tunable Efficient Adapters), a continual learning method requiring no past-task data that provides explicit control over the plasticity-retention trade-off. PLATE parameterizes each layer with a structured low-rank update ΔW = B A Q^⊤, where B and Q are computed once from pretrained weights and kept frozen, and only A is trained on the new task. The code is available at https://github.com/SalesforceAIResearch/PLATE.
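The per-layer parameterization ΔW = B A Q^⊤ described above can be sketched in NumPy. This is a minimal illustration, not the paper's released implementation: the names (`effective_weight`, `d_out`, `d_in`) are invented, and the choice of building B and Q from the least-dominant singular directions of the pretrained weight is an assumption consistent with the abstract's idea of steering updates away from dominant pretraining feature directions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pretrained weight for one layer (shapes are illustrative).
d_out, d_in, r = 8, 6, 2
W = rng.standard_normal((d_out, d_in))

# One plausible construction of the frozen factors: SVD of the pretrained
# weight, keeping the least-dominant ("redundant") directions so the update
# subspace avoids dominant pretraining-era feature directions.
U, S, Vt = np.linalg.svd(W, full_matrices=False)
B = U[:, -r:]        # frozen output-side basis, shape (d_out, r)
Q = Vt[-r:, :].T     # frozen input-side basis, shape (d_in, r)

# Only A is trainable; it has r*r parameters regardless of layer size.
A = np.zeros((r, r))

def effective_weight(A):
    """Return W + ΔW with ΔW = B A Q^T, as in the abstract."""
    return W + B @ A @ Q.T

# At initialization (A = 0) the adapted layer equals the pretrained one,
# and any update B A Q^T has rank at most r.
assert np.allclose(effective_weight(A), W)
```

Because B and Q are fixed once from the pretrained weights, training touches only the small r×r matrix A, which is what makes the adapter "efficient" and confines all drift to the chosen subspace.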


