INFTY Engine: An Optimization Toolkit to Support Continual AI

  • 🌟 Initial version of INFTY is released. (Pre-print to be updated)

🌈 What is INFTY?

Welcome to INFTY, a flexible and user-friendly optimization engine tailored for Continual AI (existing libraries typically treat the optimizer as a fixed default). INFTY includes a suite of built-in optimization algorithms that directly tackle core challenges in Continual AI, e.g., catastrophic forgetting, the stability–plasticity dilemma, and generalization. INFTY supports plug-and-play usage and theoretical analysis utilities, and is compatible with: i) various Continual AI settings, e.g., PTM-based CL, Continual PEFT, Continual Diffusion, and Continual VLM; ii) diverse models, e.g., ResNet, Transformer, ViT, CLIP, and Diffusion. INFTY provides a unified optimization solution for Continual AI and can serve as infrastructure for broad deployment.

✨ Features

  • Generality: Built-in CL–friendly optimization algorithms, supporting a wide range of scenarios, models, methods, and learning paradigms.

  • Usability: Portable, plugin-style design, enabling easy replacement of fixed options within existing pipelines.

  • Utilities: Built-in tools for theoretical analysis and visualization, facilitating investigation and diagnostic insight into optimization behavior.

🧠 Algorithms

INFTY currently implements three mainstream algorithms: C-Flat, ZeroFlow, and UniGrad-FS (see the cases below).

📚 Versatile Cases (Ongoing Updates)

Scenario 1: Typical Continual Learning

Case 1: Generalizability support

This category promotes unified and flat loss landscapes to enhance adaptation across tasks over time. These methods can be applied to most architectures and training platforms, either from scratch or with pre-trained models (PTMs). Details can be found in C_Flat.
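
For intuition, here is a minimal sketch of a flatness-seeking update in the SAM style (perturb the weights toward the locally worst direction, then descend with the gradient taken at the perturbed point). This is an illustration only, not C-Flat's actual rule; it assumes a closure that returns a scalar loss, unlike the INFTY closure shown under 🧩 Custom usage.

import torch

def flat_step(model, loss_fn, base_optimizer, rho=0.05):
    # NOTE: loss_fn() here returns a scalar loss; the INFTY closure shown
    # under "Custom usage" returns logits and a list of losses instead.
    base_optimizer.zero_grad()
    loss_fn().backward()                      # gradient at the current point
    with torch.no_grad():
        grads = [p.grad for p in model.parameters() if p.grad is not None]
        scale = rho / (torch.norm(torch.stack([g.norm() for g in grads])) + 1e-12)
        eps = []
        for p in model.parameters():
            if p.grad is None:
                continue
            e = p.grad * scale                # step toward the locally worst direction
            p.add_(e)
            eps.append((p, e))
    base_optimizer.zero_grad()
    loss_fn().backward()                      # gradient at the perturbed point
    with torch.no_grad():
        for p, e in eps:
            p.sub_(e)                         # restore the original weights
    base_optimizer.step()                     # descend with the perturbed gradient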

Case 2: BP-Free support

This category focuses on gradient approximation when backpropagation is not feasible. Combining these methods with PTMs is strongly recommended to achieve better initialization and faster convergence. Details can be found in ZeroFlow.
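
The core idea can be illustrated with a two-point (SPSA-style) finite-difference estimator, which needs only forward passes. This is a sketch for intuition, not ZeroFlow's actual estimator; loss_fn is assumed to return a scalar loss.

import torch

@torch.no_grad()
def zo_gradient_step(model, loss_fn, lr=1e-3, mu=1e-3):
    params = [p for p in model.parameters() if p.requires_grad]
    # Sample a random perturbation direction u.
    u = [torch.randn_like(p) for p in params]
    # Evaluate the loss at theta + mu*u and theta - mu*u (forward passes only).
    for p, d in zip(params, u):
        p.add_(mu * d)
    loss_plus = loss_fn()
    for p, d in zip(params, u):
        p.sub_(2 * mu * d)
    loss_minus = loss_fn()
    for p, d in zip(params, u):
        p.add_(mu * d)                        # restore theta
    # The projected directional derivative approximates the gradient along u.
    g = (loss_plus - loss_minus) / (2 * mu)
    for p, d in zip(params, u):
        p.sub_(lr * g * d)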

Case 3: Multi-objective support

This category mitigates gradient interference between old and new task objectives, with gradient manipulation applied solely to shared parameters. Details can be found in UniGrad_FS.
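
As a sketch of the underlying idea (in the spirit of PCGrad-style gradient surgery, not UniGrad-FS's exact rule), an interfering gradient component can be projected out before the update:

import torch

def project_conflicting(g_old: torch.Tensor, g_new: torch.Tensor) -> torch.Tensor:
    """Remove the component of g_new that conflicts with g_old.

    g_old, g_new: flattened gradients of the old- and new-task objectives
    on the *shared* parameters.
    """
    dot = torch.dot(g_new, g_old)
    if dot < 0:  # gradients conflict (angle > 90 degrees)
        g_new = g_new - dot / (g_old.norm() ** 2 + 1e-12) * g_old
    return g_new

Such a projection would be applied per step to the flattened shared-parameter gradients before the base optimizer update.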

Scenario 2: Continual Text-to-Image Diffusion Model

INFTY empowers CIDM! A tiny demo shows how INFTY can be applied to train Concept-Incremental text-to-image Diffusion Models. The original repo can be found at CIDM.

Scenario 3: Vision-Language Continual Learning

INFTY also supports multi-modal continual learning, ready for VLMs, AVLMs, and more. The original repo can be found at DMNSP.

| Method | T1 | T2 | T3 | T4 | T5 | T6 | T7 | T8 | T9 | T10 | Avg |
| ------ | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- |
| DMNSP  | 99.20 | 96.10 | 91.93 | 87.05 | 87.00 | 86.10 | 84.17 | 83.05 | 81.58 | 79.94 | 87.61 |
| +INFTY | 99.20 | 96.30 | 91.80 | 87.30 | 87.44 | 86.60 | 84.46 | 83.20 | 81.69 | 80.52 | 87.85 |

🛠️ Installation

Option 1: Using pip

pip install infty

Option 2: Install from source

conda create -n infty python=3.8

conda activate infty

git clone https://github.com/THUDM/INFTY.git

cd INFTY && pip install .

🚀 Quick start

Thanks to the PILOT repo, we provide a simple example showcasing the INFTY Engine. Hyperparameters for specific methods are configured in ../infty_configs/.

cd INFTY

pip install ".[examples]"

cd examples/PILOT

python main.py --config=exps/memo_scr.json --inftyopt=c_flat
python main.py --config=exps/ease.json --inftyopt=zo_sgd_conserve
python main.py --config=exps/icarl.json --inftyopt=unigrad_fs

Tips: Feel free to use INFTY in your own projects following 🛠️ Installation or 🧩 Custom usage.

🧩 Custom usage

Optimizers

Step 1. Wrap your base optimizer with an INFTY optimizer

import torch.optim as optim
from infty import optim as infty_optim

# Create the base optimizer as usual ...
base_optimizer = optim.SGD(
    filter(lambda p: p.requires_grad, self._network.parameters()),
    lr=self.args['lrate'],
    momentum=0.9,
    weight_decay=self.args['weight_decay'],
)
# ... then wrap it with an INFTY optimizer.
optimizer = infty_optim.C_Flat(
    params=self._network.parameters(),
    base_optimizer=base_optimizer,
    model=self._network,
    args=self.args,
)

Step 2. Implement the create_loss_fn function

import torch.nn.functional as F

def create_loss_fn(self, inputs, targets):
    """Create a closure that computes the loss for the current batch."""
    def loss_fn():
        outputs = self._network(inputs)
        logits = outputs["logits"]
        loss_clf = F.cross_entropy(logits, targets)
        # Return the logits together with a list of loss terms.
        return logits, [loss_clf]
    return loss_fn
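
The closure returns the logits together with a list of loss terms. Assuming the INFTY optimizer aggregates that list, a method with an additional objective could simply append it. Here is a hedged sketch with a hypothetical distillation term; self._old_network and self._known_classes are assumed attributes for illustration, not part of the INFTY API.

import torch
import torch.nn.functional as F

def create_loss_fn(self, inputs, targets):
    def loss_fn():
        outputs = self._network(inputs)
        logits = outputs["logits"]
        loss_clf = F.cross_entropy(logits, targets)
        # Hypothetical distillation term against a frozen previous-task
        # network (assumed attributes, shown only to illustrate multiple losses).
        with torch.no_grad():
            old_logits = self._old_network(inputs)["logits"]
        loss_kd = F.mse_loss(logits[:, : self._known_classes], old_logits)
        return logits, [loss_clf, loss_kd]
    return loss_fn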

Step 3. Use the loss_fn to calculate the loss and backward

loss_fn = self.create_loss_fn(inputs, targets)
optimizer.set_closure(loss_fn)          # register the closure
logits, loss_list = optimizer.step()    # forward/backward run inside step()
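
Putting the three steps together, a minimal per-batch training loop might look like this (train_loader and self._device are assumed to follow the PILOT-style setup above):

for inputs, targets in train_loader:
    inputs, targets = inputs.to(self._device), targets.to(self._device)
    loss_fn = self.create_loss_fn(inputs, targets)   # Step 2: build the closure
    optimizer.set_closure(loss_fn)                   # Step 3: register it
    logits, loss_list = optimizer.step()             # update; backward runs inside
    preds = logits.argmax(dim=1)                     # logits are returned for metrics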

Visualization plots

INFTY includes built-in visualization tools for inspecting optimization behavior:

  • Loss Landscape: visualize sharpness around local minima
  • Hessian ESD: curvature analysis via eigenvalue spectrum density
  • Conflict Curves: quantify gradient interference (supports PCGrad, GradVac, UniGrad_FS, CAGrad)
  • Optimization Trajectory: observe optimization directions under gradient shifts with a toy example

from infty import plot as infty_plot

infty_plot.visualize_landscape(self._network, self.create_loss_fn, train_loader, self._cur_task, self._device)
infty_plot.visualize_esd(self._network, self.create_loss_fn, train_loader, self._cur_task, self._device)
infty_plot.visualize_conflicts()
infty_plot.visualize_trajectory(optim="c_flat")

📝 Citation

If any content in this repo is useful for your work, please cite the following papers:

  • ZeroFlow: Overcoming Catastrophic Forgetting is Easier Than You Think. ICML 2025 [paper]

  • C-Flat++: Towards a More Efficient and Powerful Framework for Continual Learning. arXiv 2025 [paper]

  • C-Flat: Make Continual Learning Stronger via C-Flat. NeurIPS 2024 [paper]

  • UniGrad-FS: Unified Gradient Projection With Flatter Sharpness for Continual Learning. TII 2024 [paper]

🙏 Acknowledgements

We thank the repos referenced above (e.g., PILOT, CIDM, and DMNSP) for providing helpful components and functions used in our work.

📬 Contact us

If you have any questions, feel free to open an issue or contact the authors: Wei Li ([email protected]) or Tao Feng ([email protected]).

🧾 License

This project is licensed under the MIT License.
