fix(llm): preserve custom prompt and override Modelfile system prompt #970
Open
alexwrite wants to merge 2 commits into thewh1teagle:main from
Conversation
Two fixes for the LLM summarization feature:

1. `ollama.ts` — add `system: ''` to `/api/generate` requests so the model's Modelfile `SYSTEM` prompt no longer silently overrides the user's custom prompt. Ollama's `/api/generate` always prepends the Modelfile system prompt; fine-tuned models (e.g. summarization models) therefore ignored user instructions entirely. Reference: https://github.com/ollama/ollama/blob/main/docs/api.md
2. `Params.tsx` — preserve `llmConfig.prompt` when switching the LLM platform selector. Previously the `prompt` field was not included in the preserved fields, so switching platform (even back to the same one) silently reset the prompt to the platform default.

Fixes thewh1teagle#969
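A minimal sketch of fix 1, extracted into a standalone helper for illustration (the real `ollama.ts` builds this body inline; `buildGenerateBody` is a hypothetical name). Setting `system: ''` asks `/api/generate` to use an empty system prompt instead of the model's Modelfile `SYSTEM` directive:

```typescript
// Hypothetical helper mirroring the request body built in ollama.ts.
// An explicit empty `system` field overrides the Modelfile SYSTEM prompt,
// so the user's custom prompt is the only instruction the model sees.
function buildGenerateBody(model: string, prompt: string): string {
  return JSON.stringify({
    model,
    prompt,
    // Override Modelfile system prompt so the user's custom prompt takes effect
    system: '',
    stream: false,
  })
}
```

Note that omitting the field entirely is not equivalent: without a `system` key, Ollama falls back to the Modelfile's `SYSTEM` prompt.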
Summary
Fixes two bugs in the LLM summarization feature reported in #969.
Bug 1 — Custom prompt ignored by Ollama (Modelfile system prompt override)
Root cause: `ollama.ts` uses `/api/generate`, which always prepends the model's Modelfile `SYSTEM` prompt before inference. For fine-tuned models (e.g. summarization-specific models), this built-in system prompt takes precedence over the user's custom prompt, making the setting effectively ignored.

Fix: Pass `system: ''` in the request body to clear the Modelfile system prompt and let the user's configured prompt take full effect. This matches the documented behaviour of the `system` parameter in Ollama's API.

```diff
 const body = JSON.stringify({
   model: this.config.model,
   prompt,
+  // Override Modelfile system prompt so user's custom prompt takes effect
+  system: '',
   stream: false,
 })
```

Bug 2 — Switching LLM platform silently resets the custom prompt
Root cause: In `Params.tsx`, the `onValueChange` handler for the platform selector spreads `defaults` (which includes `prompt: "<default template>"`) without re-adding `llmConfig.prompt`, so the user's custom prompt is lost every time the platform select changes.

Fix: Preserve `llmConfig.prompt` in the spread:

```diff
 setLlmConfig({
   ...defaults,
   ollamaBaseUrl: llmConfig.ollamaBaseUrl,
   claudeApiKey: llmConfig.claudeApiKey,
   openaiBaseUrl: llmConfig.openaiBaseUrl,
   openaiApiKey: llmConfig.openaiApiKey,
   enabled: llmConfig?.enabled ?? false,
+  prompt: llmConfig.prompt,
 })
```

Test plan
Summarize in French as bullet points.\n"""\n%s\n""")
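The Bug 2 spread can be sketched as a pure function (hypothetical names `LlmConfig` and `nextConfig`; the real code lives inline in the `Params.tsx` selector handler). Platform defaults win for every field except the ones the user already configured, and the fix adds `prompt` to that preserved set:

```typescript
// Minimal config shape, assumed for illustration only.
type LlmConfig = {
  platform: string
  prompt: string
  enabled: boolean
  ollamaBaseUrl?: string
  claudeApiKey?: string
  openaiBaseUrl?: string
  openaiApiKey?: string
}

// Hypothetical helper mirroring the onValueChange spread in Params.tsx:
// start from the new platform's defaults, then carry over user settings.
function nextConfig(defaults: LlmConfig, current: LlmConfig): LlmConfig {
  return {
    ...defaults,
    ollamaBaseUrl: current.ollamaBaseUrl,
    claudeApiKey: current.claudeApiKey,
    openaiBaseUrl: current.openaiBaseUrl,
    openaiApiKey: current.openaiApiKey,
    enabled: current?.enabled ?? false,
    // The fix: keep the user's custom prompt across platform switches
    prompt: current.prompt,
  }
}
```

Because the spread of `defaults` comes first, any field listed afterwards overrides the default; before the fix, `prompt` was missing from that list and the default template won.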