
Commit 7fbc0df

terrytangyuan authored and jimpang committed
[Bugfix] Only print out chat template when supplied (vllm-project#13444)
1 parent 95681b9 commit 7fbc0df

File tree: 1 file changed (+3, -1 lines)


vllm/entrypoints/openai/api_server.py

Lines changed: 3 additions & 1 deletion
@@ -797,7 +797,9 @@ async def init_app_state(
     state.log_stats = not args.disable_log_stats
 
     resolved_chat_template = load_chat_template(args.chat_template)
-    logger.info("Using supplied chat template:\n%s", resolved_chat_template)
+    if resolved_chat_template is not None:
+        logger.info("Using supplied chat template:\n%s",
+                    resolved_chat_template)
 
     state.openai_serving_models = OpenAIServingModels(
         engine_client=engine_client,
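
For context, a minimal standalone sketch of the pattern this fix applies: log the resolved chat template only when one was actually supplied, instead of logging "Using supplied chat template:\nNone". The load_chat_template body below is a simplified stand-in for illustration, not vLLM's actual implementation, and init_chat_template is a hypothetical helper used only for this sketch.

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def load_chat_template(path: str | None) -> str | None:
    # Stand-in for vLLM's load_chat_template (assumption, simplified):
    # return the template text, or None when no template was supplied.
    if path is None:
        return None
    with open(path) as f:
        return f.read()


def init_chat_template(chat_template_arg: str | None) -> str | None:
    resolved_chat_template = load_chat_template(chat_template_arg)
    # Before the fix, the log line was emitted unconditionally, so a run
    # without --chat-template printed "Using supplied chat template:\nNone".
    # After the fix, the message appears only when a template exists.
    if resolved_chat_template is not None:
        logger.info("Using supplied chat template:\n%s",
                    resolved_chat_template)
    return resolved_chat_template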
