
Conversation

@AlexanderDokuchaev (Collaborator) commented on Nov 6, 2025

Changes

Add nncf.batch_norm_adaptation function.
Add bn_adaptation mode to the example.
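
A minimal usage sketch of the new function, assuming it takes the (already pruned) model and an nncf.Dataset of calibration inputs; the exact signature may differ from the merged API:

```python
import torch
import nncf

# Toy stand-ins for illustration; in the example these would be the pruned
# ResNet-18 and a real torch.utils.data.DataLoader.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.BatchNorm2d(8), torch.nn.ReLU())
calibration_data = [torch.randn(4, 3, 32, 32) for _ in range(10)]

# Recompute the BatchNorm running statistics on calibration data; no weights are trained.
adapted_model = nncf.batch_norm_adaptation(model, nncf.Dataset(calibration_data))
```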

Related tickets

174483

Tests

https://github.com/openvinotoolkit/nncf/actions/runs/19162855749/job/54776694802

@github-actions bot added the documentation (Improvements or additions to documentation) label on Nov 6, 2025
Copilot AI left a comment

Pull Request Overview

This PR implements BatchNorm adaptation functionality for PyTorch models after pruning. The feature allows updating BatchNorm layer statistics using a calibration dataset, which can improve model accuracy post-pruning without full fine-tuning.

Key Changes:

  • Added batch_norm_adaptation function to adapt BatchNorm statistics after pruning
  • Introduced set_batchnorm_train_only context manager to selectively set only BatchNorm layers to training mode
  • Updated type hints to use TypeVar for better type preservation in pruning functions
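
As an illustration of the set_batchnorm_train_only idea listed above, here is a hypothetical sketch of such a context manager, not the actual implementation from this PR: only BatchNorm layers are put into train mode so their running statistics update, while the rest of the model stays in eval mode.

```python
from contextlib import contextmanager

import torch


@contextmanager
def set_batchnorm_train_only(model: torch.nn.Module):
    """Temporarily set only BatchNorm layers to train mode, everything else to eval."""
    original_modes = {name: module.training for name, module in model.named_modules()}
    model.eval()
    for module in model.modules():
        if isinstance(module, torch.nn.modules.batchnorm._BatchNorm):
            module.train()  # enables running_mean/running_var updates on forward
    try:
        yield model
    finally:
        # Restore the original train/eval state of every submodule.
        for name, module in model.named_modules():
            module.train(original_modes[name])
```

Forwarding calibration batches under torch.no_grad() inside such a guard updates only the BatchNorm buffers, which matches the adaptation behavior described in this overview.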

Reviewed Changes

Copilot reviewed 9 out of 9 changed files in this pull request and generated 2 comments.

Summary per file:

| File | Description |
| --- | --- |
| src/nncf/torch/function_hook/pruning/batch_norm_adaptation.py | New module implementing BatchNorm adaptation logic with a context manager |
| src/nncf/pruning/prune_model.py | Added public API for batch_norm_adaptation with backend routing |
| src/nncf/__init__.py | Exported batch_norm_adaptation function in the public API |
| tests/torch2/function_hook/pruning/test_bn_adaptation.py | Added comprehensive tests for BatchNorm adaptation functionality |
| examples/pruning/torch/resnet18/main.py | Integrated bn_adaptation mode into the ResNet-18 pruning example |
| examples/pruning/torch/resnet18/README.md | Updated documentation with bn_adaptation usage instructions |
| src/nncf/torch/function_hook/pruning/prune_model.py | Updated type hints using TypeVar for type preservation |
| src/nncf/torch/function_hook/pruning/magnitude/algo.py | Updated type hints using TypeVar for type preservation |
| src/nncf/torch/function_hook/pruning/rb/algo.py | Updated type hints using TypeVar for type preservation |
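
For background on the TypeVar change noted in the rows above, a minimal sketch of the pattern (illustrative names, not the PR's exact signatures): annotating the model parameter and return value with the same TypeVar lets callers keep their concrete model type instead of getting back a plain nn.Module.

```python
from typing import TypeVar

import torch

TModel = TypeVar("TModel", bound=torch.nn.Module)


def prune_model(model: TModel, ratio: float) -> TModel:
    # Returning TModel instead of nn.Module preserves the caller's concrete type
    # (e.g. torchvision's ResNet), so downstream attribute access stays type-checkable.
    ...  # pruning logic elided
    return model
```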



@AlexanderDokuchaev AlexanderDokuchaev marked this pull request as ready for review November 7, 2025 09:24
@AlexanderDokuchaev AlexanderDokuchaev requested a review from a team as a code owner November 7, 2025 09:24
"--mode",
type=str,
choices=["magnitude", "rb"],
choices=["magnitude", "bn_adaptation", "rb"],
Collaborator
Is it possible to give users the choice to use BN adaptation regardless of the pruning algorithm? I mean, a --bn_adaptation option would be simple, IMHO.
Especially because there is in fact a choice: fine-tune the pruned model or just apply BN adaptation.
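
For concreteness, one hypothetical reading of this suggestion, an independent flag next to --mode (names illustrative, not from the PR):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--mode", type=str, choices=["magnitude", "rb"])
# Hypothetical: expose BN adaptation as its own switch so it can follow either algorithm.
parser.add_argument(
    "--bn_adaptation",
    action="store_true",
    help="Apply BatchNorm adaptation to the pruned model instead of full fine-tuning.",
)
args = parser.parse_args()
```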

Collaborator Author
I’m not sure I see a reason to do that.
What pipeline for BN adaptation do you suggest for RB pruning?

Comment on lines +52 to +58:

```python
if isinstance(input_data, dict):
    model(**input_data)
elif isinstance(input_data, tuple):
    model(*input_data)
else:
    model(input_data)
```

Collaborator
Can we reuse PTEngine here?

Collaborator Author
No, PTEngine calls model.eval(), so the BatchNorm running statistics would not be updated during the calibration forward passes.
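
A small PyTorch illustration of that constraint (not code from this PR): BatchNorm running statistics only update when the layer is in train mode, which is why an engine that forces model.eval() cannot drive BN adaptation.

```python
import torch

bn = torch.nn.BatchNorm2d(8)
x = torch.randn(4, 8, 16, 16)

bn.eval()
with torch.no_grad():
    bn(x)
print(bn.running_mean.abs().sum())  # still 0: running stats are frozen in eval mode

bn.train()
with torch.no_grad():
    bn(x)
print(bn.running_mean.abs().sum())  # non-zero: stats were updated from the batch
```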

