Conversation

@kevalmorabia97
Collaborator

What does this PR do?

Type of change: Improve existing feature

Overview: The GPT-OSS model uses Yarn RoPE, which adds extra nn.Embedding modules that must be enabled in DynamicModule for Minitron pruning.
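
The idea behind the change can be sketched as follows. This is a minimal, framework-free illustration with hypothetical names (it is not the actual Model Optimizer API): a pruning framework keeps an allow-list of module types it can convert to dynamic (prunable) modules, and GPT-OSS's Yarn RoPE introduces extra Embedding submodules, so Embedding must be on that list for Minitron pruning to cover the whole model.

```python
# Hypothetical sketch, not the real DynamicModule implementation.

class Module:
    """Stand-in for torch.nn.Module."""

class Linear(Module):
    """Stand-in for torch.nn.Linear."""

class Embedding(Module):
    """Stand-in for torch.nn.Embedding (e.g. the Yarn RoPE embedding tables)."""

# Before the fix: only Linear layers are eligible for dynamic conversion.
PRUNABLE_TYPES = {Linear}
# After the fix: Embedding modules are also eligible.
PRUNABLE_TYPES_FIXED = PRUNABLE_TYPES | {Embedding}

def dynamic_modules(model, allowed):
    """Return the submodules the pruner would wrap as dynamic modules."""
    return [m for m in model if type(m) in allowed]

# Toy GPT-OSS block: two Linear layers plus a Yarn RoPE Embedding.
gpt_oss_block = [Linear(), Embedding(), Linear()]
before = dynamic_modules(gpt_oss_block, PRUNABLE_TYPES)       # misses the Embedding
after = dynamic_modules(gpt_oss_block, PRUNABLE_TYPES_FIXED)  # covers it
```

Without the Embedding type enabled, the Yarn RoPE submodules would be silently skipped during pruning, leaving the pruned model inconsistent with its shrunken hidden dimensions.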

Testing

  • gpt-oss-20b pruned using the M-LM pruning example and config scripts.

@copy-pr-bot

copy-pr-bot bot commented Nov 8, 2025

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.


@codecov

codecov bot commented Nov 8, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 74.36%. Comparing base (e74a468) to head (9278516).
⚠️ Report is 4 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #530   +/-   ##
=======================================
  Coverage   74.36%   74.36%           
=======================================
  Files         182      182           
  Lines       18216    18216           
=======================================
  Hits        13547    13547           
  Misses       4669     4669           


@kevalmorabia97 force-pushed the kmorabia/yarn-emb-pruning branch 4 times, most recently from e1ae4b6 to 870e6a2 on November 10, 2025 17:08
@kevalmorabia97 marked this pull request as ready for review November 10, 2025 17:26
@kevalmorabia97 requested review from a team as code owners November 10, 2025 17:26
@kevalmorabia97 force-pushed the kmorabia/yarn-emb-pruning branch from 870e6a2 to ca95e01 on November 10, 2025 17:46
@kevalmorabia97 force-pushed the kmorabia/yarn-emb-pruning branch from ca95e01 to 25656c8 on November 10, 2025 19:30
@kevalmorabia97 force-pushed the kmorabia/yarn-emb-pruning branch from 25656c8 to 9278516 on November 10, 2025 19:38
@jenchen13
Contributor

jenchen13 left a comment
LGTM

@kevalmorabia97 merged commit 6a8d6da into main on Nov 11, 2025
26 checks passed
@kevalmorabia97 deleted the kmorabia/yarn-emb-pruning branch November 11, 2025 04:30
mxinO pushed a commit that referenced this pull request Nov 11, 2025
Signed-off-by: Keval Morabia <[email protected]>
Signed-off-by: mxin <[email protected]>
4 participants