
@GuoningHuang

This PR introduces a Linear + SiLU fusion pass and integrates it into the BuddyMlp example.
It aims to improve performance by detecting the common computation pattern `SiLU(Linear(x))` and lowering it to a single fused kernel, so the intermediate linear result no longer has to be materialized in full between the two ops.
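For context, here is a minimal sketch of the kind of unfused IR the pass is meant to match. This is an illustrative reconstruction, not the exact IR from the BuddyMlp lowering: the shapes, SSA names, and bias handling (omitted here) are assumptions.

```mlir
// Unfused pattern (illustrative): a matmul producing the linear output,
// followed by an elementwise linalg.generic computing
// SiLU(y) = y * sigmoid(y) = y / (1 + exp(-y)).
%linear = linalg.matmul
    ins(%x, %weight : tensor<?x?xf32>, tensor<?x?xf32>)
    outs(%acc : tensor<?x?xf32>) -> tensor<?x?xf32>

%silu = linalg.generic
    {indexing_maps = [affine_map<(d0, d1) -> (d0, d1)>,
                      affine_map<(d0, d1) -> (d0, d1)>],
     iterator_types = ["parallel", "parallel"]}
    ins(%linear : tensor<?x?xf32>)
    outs(%out : tensor<?x?xf32>) {
^bb0(%y: f32, %o: f32):
  // Compute SiLU elementwise: y / (1 + exp(-y)).
  %one = arith.constant 1.0 : f32
  %neg = arith.negf %y : f32
  %exp = math.exp %neg : f32
  %den = arith.addf %one, %exp : f32
  %res = arith.divf %y, %den : f32
  linalg.yield %res : f32
} -> tensor<?x?xf32>
```

Fusing the two means the activation is applied inside the same kernel that produces the linear result, so the intermediate tensor between them never needs to be written out in full.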
**Pass & Tests**

- Added a new `LinearSiluFusionPass` that identifies and fuses `linalg.generic` patterns representing Linear + SiLU.
- Implemented unit tests that validate correct fusion and guard against regressions.
**Examples**

- Extended the BuddyMlp example to demonstrate the new fusion pass.
- Verified fused kernel generation and execution with representative inputs.
