Conversation

@Vanalite (Contributor) commented Oct 1, 2025

Describe Your Changes

  • Partial / user-terminated LLM responses are now saved in threads
  • Allow continuing the AI response after an interruption
  • This behavior applies only to llamacpp
  • Other providers' behavior remains untouched
  • If users switch models during the interruption, the response is regenerated from the beginning to avoid cross-model problems (see the sketch after this list)
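
A minimal sketch of the continue-vs-restart rule from the last bullet, in TypeScript. All names here (ThreadMessage, resolveGenerationMode, continueFromMessageId's placement) are illustrative assumptions, not the PR's actual identifiers:

```typescript
// Hypothetical sketch: ThreadMessage and resolveGenerationMode are
// illustrative names, not the identifiers used in this PR.
interface ThreadMessage {
  id: string
  status: 'ready' | 'pending' | 'stopped'
  model: string
  content: string
}

function resolveGenerationMode(
  lastAssistantMessage: ThreadMessage | undefined,
  selectedModel: string
): { mode: 'continue' | 'regenerate'; continueFromMessageId?: string } {
  // Nothing was interrupted: generate a fresh response.
  if (!lastAssistantMessage || lastAssistantMessage.status !== 'stopped') {
    return { mode: 'regenerate' }
  }
  // The model was swapped during the interruption: restart from the
  // beginning to avoid mixing output from different models.
  if (lastAssistantMessage.model !== selectedModel) {
    return { mode: 'regenerate' }
  }
  // Same model: resume from the saved partial response.
  return { mode: 'continue', continueFromMessageId: lastAssistantMessage.id }
}
```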

Fixes Issues

Test plan

Screen.Recording.2025-10-01.at.5.44.50.PM.mov
  • Check behavior with other providers

Self Checklist

  • Added relevant comments, esp in complex areas
  • Updated docs (for bug fixes / features)
  • Created issues for follow-up changes or refactoring needed

github-actions bot commented Oct 1, 2025

@Vanalite Vanalite requested a review from louis-jan October 2, 2025 01:59
@qnixsynapse (Contributor)
This needs to be rebased, I think.

@Vanalite Vanalite force-pushed the feat/retain-interruption-message branch from 082ee28 to 35264e9 on November 3, 2025 05:45
@Vanalite Vanalite force-pushed the feat/retain-interruption-message branch from 35264e9 to 29c5dfa on November 3, 2025 10:31
@Vanalite (Contributor, Author) commented Nov 3, 2025

Rebased to latest dev and implemented the force restart if users swap models midway. @qnixsynapse

@louis-jan louis-jan requested a review from Copilot November 4, 2025 07:08
Copilot AI (Contributor) left a comment

Pull Request Overview

This PR implements a "Continue with AI Response" feature that allows users to continue generating a stopped/interrupted AI response from where it left off. When a user stops an AI response mid-generation, the system now preserves it as a "Stopped" message, which can later be continued with the same or a different model.

Key changes:

  • Added modifyMessage method to the message interface and services to support updating existing messages
  • Implemented updateMessage state function for optimistic UI updates with persistence
  • Enhanced chat functionality to support continuing from a stopped message via a continueFromMessageId parameter (a rough sketch follows below)
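
A rough sketch of how these pieces could fit together. The signatures below are assumptions for illustration and simplify what useChat.ts and useMessages.ts actually do; only modifyMessage and continueFromMessageId are named in the PR, the rest is hypothetical:

```typescript
// Assumed, simplified signatures: the real service and hooks live in
// web-app/src/services/messages and web-app/src/hooks.
interface MessagesService {
  modifyMessage(threadId: string, messageId: string, content: string): Promise<void>
}

async function continueStoppedMessage(
  service: MessagesService,
  threadId: string,
  continueFromMessageId: string,
  partialContent: string,
  streamCompletion: (context: string) => AsyncIterable<string>,
  onDelta: (content: string) => void // optimistic UI update callback
): Promise<void> {
  let content = partialContent
  // Resume streaming, feeding the saved partial response back as context
  // so generation picks up where it stopped.
  for await (const delta of streamCompletion(partialContent)) {
    content += delta
    onDelta(content) // update the UI before the persisted copy is written
  }
  // Finalize: persist the completed text over the stopped message instead
  // of creating a new one.
  await service.modifyMessage(threadId, continueFromMessageId, content)
}
```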

Reviewed Changes

Copilot reviewed 12 out of 13 changed files in this pull request and generated 12 comments.

| File | Description |
| --- | --- |
| yarn.lock | Added type definitions for @types/node@^20.0.0 and @types/whatwg-mimetype@^3.0.2 |
| web-app/src/services/messages/types.ts | Added modifyMessage method to the MessagesService interface |
| web-app/src/services/messages/default.ts | Implemented modifyMessage for DefaultMessagesService |
| web-app/src/locales/en/common.json | Added translation key for "Continue with AI Response" |
| web-app/src/hooks/useMessages.ts | Implemented updateMessage with optimistic updates and persistence |
| web-app/src/hooks/useChat.ts | Major refactoring to support continuing interrupted messages, including helper functions for streaming and message finalization |
| web-app/src/hooks/useAppState.ts | Exported the PromptProgress type for use in other modules |
| web-app/src/hooks/tests/useMessages.test.ts | Updated tests to verify optimistic update behavior |
| web-app/src/hooks/tests/useChat.test.ts | Added comprehensive tests for the continue functionality |
| web-app/src/containers/StreamingContent.tsx | Added logic to hide streaming content when a stopped message exists |
| web-app/src/containers/ScrollToBottom.tsx | Added logic to detect partial responses and model mismatches, showing the continue button |
| web-app/src/containers/GenerateResponseButton.tsx | Implemented logic to continue or regenerate based on partial response and model mismatch |
| core/src/types/message/messageInterface.ts | Added modifyMessage method documentation to MessageInterface |
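
For orientation, a hypothetical shape of the core-level addition from the last table row; the actual signature in core/src/types/message/messageInterface.ts may differ:

```typescript
// Assumed shape only; parameter and return types are not copied from the PR.
type ThreadMessage = {
  id: string
  thread_id: string
  content: string
  status: string
}

interface MessageInterface {
  // ...existing message methods...

  /**
   * Update an existing message in a thread, e.g. to persist a partial
   * response that the user stopped mid-generation.
   */
  modifyMessage(message: ThreadMessage): Promise<ThreadMessage>
}
```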

@Vanalite Vanalite merged commit 9e250b0 into dev Nov 6, 2025
17 checks passed
@Vanalite Vanalite deleted the feat/retain-interruption-message branch November 6, 2025 07:35
@github-project-automation github-project-automation bot moved this to QA in Jan Nov 6, 2025
@github-actions github-actions bot added this to the v0.7.4 milestone Nov 6, 2025

Development

Successfully merging this pull request may close these issues.

bug: Partial / user-terminated llm responses should be saved in threads
