
Commit 25389c9

[https://nvbugs/5453806][unwaive] Unwaive fp8 kvcache attention test (NVIDIA#7243)
Signed-off-by: peaceh <[email protected]>
1 parent d8ec546 commit 25389c9

File tree

1 file changed: +0 −5 lines

tests/unittest/_torch/attention/test_attention_mla.py

Lines changed: 0 additions & 5 deletions
@@ -389,11 +389,6 @@ def test_attention_mla(scenario: Scenario, context_sequence_lengths: List[int],
     dtype = scenario.dtype
     kv_cache_dtype = scenario.kv_cache_dtype
 
-    FAILED_CSL = [777, 912, 431, 42, 266, 989, 524]
-    if (kv_cache_dtype is torch.float8_e4m3fn
-            and context_sequence_lengths == FAILED_CSL):
-        pytest.skip("https://nvbugs/5453806")
-
     print(
         f"--------------------------------Test for scenario: {scenario} start--------------------------------"
     )
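
For context, the removed lines implemented a conditional waive: the test skipped itself for one known-failing fp8 kv-cache parameter combination and pointed at the tracking bug. Below is a minimal, self-contained sketch of that pattern; the test name, parametrization, and trailing assertion are illustrative placeholders and not the real test_attention_mla body.

# Sketch of the waive pattern removed by this commit (illustrative only).
# A parametrized test bails out for a known-failing combination and records
# the bug URL; "unwaiving" is simply deleting this guard so the case runs.
import pytest
import torch


@pytest.mark.parametrize("kv_cache_dtype",
                         [torch.bfloat16, torch.float8_e4m3fn])
@pytest.mark.parametrize("context_sequence_lengths",
                         [[64, 128], [777, 912, 431, 42, 266, 989, 524]])
def test_waive_pattern(kv_cache_dtype, context_sequence_lengths):
    # Before this commit, the real test skipped the fp8 kv-cache case with
    # these context sequence lengths, tracked by https://nvbugs/5453806.
    FAILED_CSL = [777, 912, 431, 42, 266, 989, 524]
    if (kv_cache_dtype is torch.float8_e4m3fn
            and context_sequence_lengths == FAILED_CSL):
        pytest.skip("https://nvbugs/5453806")

    # The actual attention checks would go here; this stand-in assertion
    # only keeps the sketch runnable.
    assert len(context_sequence_lengths) > 0

Deleting the guard re-enables the previously skipped fp8 kv-cache combination, which is exactly what this commit does.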
