Commit becdede

Authored by esmeetu, committed by ZhengHongming888
[Frontend] Fix logging format when enable response logging (vllm-project#28049)
Signed-off-by: esmeetu <[email protected]>
1 parent fdf310b commit becdede

File tree

1 file changed: +1 −2 lines changed


vllm/entrypoints/openai/api_server.py

Lines changed: 1 addition & 2 deletions
@@ -1572,8 +1572,7 @@ def buffered_iterator():
                 full_content = full_content[:2048] + ""
                 "...[truncated]"
             logger.info(
-                "response_body={streaming_complete: "
-                "content='%s', chunks=%d}",
+                "response_body={streaming_complete: content=%r, chunks=%d}",
                 full_content,
                 chunk_count,
             )
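The switch from `content='%s'` to `content=%r` logs the response body via `repr()`, so embedded quotes and newlines are escaped and each log record stays on a single line. A minimal sketch of the difference, using a hypothetical streamed response (the sample string and logger name are not from the commit):

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("demo")

# Hypothetical streamed response content containing quotes and a newline.
full_content = "hello 'world'\nsecond line"
chunk_count = 3

# With %s the raw newline splits the log entry across lines;
# %r formats the repr() of the string, escaping quotes and newlines.
with_s = "content='%s'" % full_content
with_r = "content=%r" % full_content

# The corrected call from the commit, with lazy %-style argument passing:
logger.info(
    "response_body={streaming_complete: content=%r, chunks=%d}",
    full_content,
    chunk_count,
)
```

Note that `%r` also sidesteps the quoting bug in the old format string: a response body containing a `'` no longer produces ambiguous, unbalanced quotes in the log.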

0 commit comments