Commit 061f62b

Authored by KarelZe and justinchuby
fix: handling of default attrs in SimplifiedLayerNormalization + LayerNormalization🐛 (#2396)
`SkipLayerNormFusion` currently does not fuse ops if `stash_type` is at its default (=1) or `epsilon` is at its default (=1e-5) for [`LayerNormalization`](https://onnx.ai/onnx/operators/onnx__LayerNormalization.html#) and `SimplifiedLayerNormalization`.

This PR:
- fixes handling of default attrs in `LayerNormalization` and `SimplifiedLayerNormalization`
- adds the BART encoder as a new test model. I added this model because some of its stash types are at default. The model is versatile and can also be used to test other fusions, e.g. `EmbedLayerNormalization`.
- allows for commuted inputs.

Closes #2378.

@shubhambhokare1 @justinchuby Could you please review? Any feedback is greatly appreciated.

---------

Co-authored-by: Justin Chu <[email protected]>
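To illustrate the underlying issue: in ONNX, a `LayerNormalization` node may simply omit `epsilon` and `stash_type`, in which case the spec-defined defaults (1e-5 and 1) apply. A pattern match that only inspects the attributes actually present on the node will therefore miss such nodes. Below is a minimal, hedged sketch of the fix's idea; `layer_norm_attrs` and the plain-dict node representation are hypothetical, not the actual onnxscript rewriter code.

```python
# Hypothetical helper: resolve LayerNormalization attributes, falling
# back to the ONNX-specified defaults when an attribute is absent.
def layer_norm_attrs(node_attrs: dict) -> tuple[float, int]:
    """Return (epsilon, stash_type) for a LayerNormalization node.

    node_attrs maps attribute names to values; absent entries mean the
    node relies on the ONNX defaults (epsilon=1e-5, stash_type=1).
    """
    epsilon = node_attrs.get("epsilon", 1e-5)   # ONNX default
    stash_type = node_attrs.get("stash_type", 1)  # ONNX default
    return epsilon, stash_type


# A node with no explicit attributes still matches the default pattern:
assert layer_norm_attrs({}) == (1e-5, 1)
# Explicit values override the defaults as usual:
assert layer_norm_attrs({"epsilon": 1e-6}) == (1e-6, 1)
```

A fusion rule written against this helper matches nodes whether the defaults are spelled out explicitly or left implicit, which is what the buggy behavior described above failed to do.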
1 parent: 0bf5ca0

File tree: 3 files changed, +743 −15 lines


0 commit comments