fix(llm): preserve custom prompt and override Modelfile system prompt#970

Open
alexwrite wants to merge 2 commits into thewh1teagle:main from alexwrite:fix/llm-prompt-override

Conversation

@alexwrite

Summary

Fixes two bugs in the LLM summarization feature reported in #969.

Bug 1 — Custom prompt ignored by Ollama (Modelfile system prompt override)

Root cause: ollama.ts uses /api/generate which always prepends the model's Modelfile SYSTEM prompt before inference. For fine-tuned models (e.g. summarization-specific models), this built-in system prompt takes precedence over the user's custom prompt, making the setting effectively ignored.

Fix: Pass system: '' in the request body to clear the Modelfile system prompt and let the user's configured prompt take full effect. This matches the documented behaviour of the system parameter in Ollama's API.

 const body = JSON.stringify({
     model: this.config.model,
     prompt,
+    // Override Modelfile system prompt so user's custom prompt takes effect
+    system: '',
     stream: false,
 })
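A minimal TypeScript sketch of the fixed request construction. The interface and function names here (`GenerateRequest`, `buildGenerateBody`) are illustrative assumptions, not the actual code in ollama.ts; the field names match Ollama's documented `/api/generate` parameters.

```typescript
// Hypothetical sketch: building the /api/generate request body.
// `system: ''` clears the model's Modelfile SYSTEM prompt so the
// user's custom prompt is the only instruction the model receives.
interface GenerateRequest {
  model: string;
  prompt: string;
  system?: string;
  stream: boolean;
}

function buildGenerateBody(model: string, prompt: string): string {
  const req: GenerateRequest = {
    model,
    prompt,
    // Empty string overrides the Modelfile system prompt (per Ollama's API docs).
    system: '',
    stream: false,
  };
  return JSON.stringify(req);
}
```

Note that omitting `system` entirely would keep the Modelfile default, so an explicit empty string is required to suppress it.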

Bug 2 — Switching LLM platform silently resets the custom prompt

Root cause: In Params.tsx, the onValueChange handler for the platform selector spreads defaults (which includes prompt: "<default template>") without re-adding llmConfig.prompt, so the user's custom prompt is lost every time the platform selection changes.

Fix: Preserve llmConfig.prompt in the spread:

 setLlmConfig({
     ...defaults,
     ollamaBaseUrl: llmConfig.ollamaBaseUrl,
     claudeApiKey: llmConfig.claudeApiKey,
     openaiBaseUrl: llmConfig.openaiBaseUrl,
     openaiApiKey: llmConfig.openaiApiKey,
     enabled: llmConfig?.enabled ?? false,
+    prompt: llmConfig.prompt,
 })
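The handler logic above can be sketched as a pure function. This is an assumption-laden simplification of Params.tsx: the `LlmConfig` shape and `switchPlatform` name are invented for illustration, and state updates via setLlmConfig are factored out.

```typescript
// Hypothetical sketch of the platform-switch merge in Params.tsx.
interface LlmConfig {
  platform: string;
  prompt: string;
  enabled: boolean;
  ollamaBaseUrl?: string;
  claudeApiKey?: string;
  openaiBaseUrl?: string;
  openaiApiKey?: string;
}

function switchPlatform(current: LlmConfig, defaults: LlmConfig): LlmConfig {
  return {
    ...defaults,
    ollamaBaseUrl: current.ollamaBaseUrl,
    claudeApiKey: current.claudeApiKey,
    openaiBaseUrl: current.openaiBaseUrl,
    openaiApiKey: current.openaiApiKey,
    enabled: current.enabled ?? false,
    // The fix: carry the user's custom prompt across platform changes
    // instead of letting the spread of `defaults` reset it.
    prompt: current.prompt,
  };
}
```

Because object spread is last-write-wins, listing `prompt: current.prompt` after `...defaults` guarantees the user's value survives the merge.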

Test plan

  • Configure a custom LLM prompt in Settings (e.g. Summarize in French as bullet points.\n"""\n%s\n""")
  • Transcribe a file with LLM enabled → summary should follow the custom prompt
  • Change the LLM platform selector (e.g. ollama → openai → ollama) → custom prompt should still be present in the textarea

Two fixes for the LLM summarization feature:

1. ollama.ts — add `system: ''` to /api/generate requests so the
   model's Modelfile SYSTEM prompt no longer silently overrides the
   user's custom prompt. Ollama's /api/generate always prepends the
   Modelfile system prompt; fine-tuned models (e.g. summarization
   models) therefore ignored user instructions entirely.
   Reference: https://github.com/ollama/ollama/blob/main/docs/api.md

2. Params.tsx — preserve `llmConfig.prompt` when switching the LLM
   platform selector. Previously the `prompt` field was not included
   in the preserved fields, so switching platform (even back to the
   same one) silently reset the prompt to the platform default.

Fixes thewh1teagle#969