
Commit afe9eb4 (1 parent: 19bee6d)

[Bugfix] Fix flashinfer ar+norm kernel not available issue (#29960)

Authored by elvischenv
Signed-off-by: elvischenv <[email protected]>

1 file changed: +2 -1

vllm/compilation/fix_functionalization.py

Lines changed: 2 additions & 1 deletion
@@ -104,7 +104,8 @@ def __call__(self, graph: torch.fx.Graph):
                 mutated_args = {1: "result"}
                 self.defunctionalize(graph, node, mutated_args)
             elif (
-                at_target
+                hasattr(torch.ops.vllm, "flashinfer_trtllm_fused_allreduce_norm")
+                and at_target
                 == torch.ops.vllm.flashinfer_trtllm_fused_allreduce_norm.default
             ):
                 mutated_args = {
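Why the guard matters: before this change, the pass compared at_target against torch.ops.vllm.flashinfer_trtllm_fused_allreduce_norm.default unconditionally, which errors out when that custom op was never registered (e.g. the FlashInfer allreduce+norm kernel is unavailable). Putting hasattr(...) first lets `and` short-circuit before `.default` is dereferenced. Below is a minimal, self-contained sketch of the same guard pattern; the `ops` namespace object and the `matches_fused_allreduce_norm` helper are hypothetical stand-ins for torch.ops.vllm and the pass's target check, so the example runs without torch or vLLM installed.

```python
# Minimal sketch of the hasattr-guard pattern from this fix.
# `ops` and `matches_fused_allreduce_norm` are hypothetical stand-ins, not vLLM APIs.
from types import SimpleNamespace


def matches_fused_allreduce_norm(at_target: object, ops: object) -> bool:
    """Return True only when the optional op is registered and the node targets it."""
    # hasattr() is evaluated first; if the op was never registered (for
    # example, the FlashInfer kernel is not available), the `and`
    # short-circuits and `.default` is never touched, so no AttributeError
    # can be raised.
    return (
        hasattr(ops, "flashinfer_trtllm_fused_allreduce_norm")
        and at_target == ops.flashinfer_trtllm_fused_allreduce_norm.default
    )


if __name__ == "__main__":
    # Namespace without the optional kernel: the guard short-circuits safely.
    empty_ops = SimpleNamespace()
    print(matches_fused_allreduce_norm(object(), empty_ops))  # False, no error

    # Namespace with the kernel registered: the target comparison actually runs.
    fused_op = SimpleNamespace(default=object())
    full_ops = SimpleNamespace(flashinfer_trtllm_fused_allreduce_norm=fused_op)
    print(matches_fused_allreduce_norm(fused_op.default, full_ops))  # True
```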
