feat: Allow continuing with AI response after interrupting #6697
Conversation
This needs to be rebased, I think.
082ee28 to 35264e9
…id message ID duplication
Once the user switches models after interrupting the response midway, force the response to be regenerated from the beginning to avoid cross-model inconsistencies.
35264e9 to 29c5dfa
Rebase to latest dev and implement the force restart if they swap models midway. @qnixsynapse
Pull Request Overview
This PR implements a "Continue with AI Response" feature that allows users to continue generating a stopped/interrupted AI response from where it left off. When a user stops an AI response mid-generation, the system now preserves it as a "Stopped" message, which can later be continued with the same or a different model.
Key changes:
- Added a `modifyMessage` method to the message interface and services to support updating existing messages
- Implemented an `updateMessage` state function for optimistic UI updates with persistence
- Enhanced chat functionality to support continuing from a stopped message via a `continueFromMessageId` parameter
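The continue-from-stopped-message flow could be sketched as follows. This is a hypothetical illustration under assumed names (`ThreadMessage`, `buildPrompt`); the PR's real implementation lives in `useChat.ts` and may differ:

```typescript
// Illustrative types; not the PR's actual message shape.
type ThreadMessage = {
  id: string
  role: 'user' | 'assistant'
  content: string
  status: 'ready' | 'stopped'
}

// Build the generation input: either a fresh request, or a continuation that
// feeds earlier messages as context and keeps the stopped message's partial
// text as the prefix that new tokens are appended to.
function buildPrompt(
  history: ThreadMessage[],
  continueFromMessageId?: string
): { messages: ThreadMessage[]; prefix: string } {
  if (!continueFromMessageId) {
    return { messages: history, prefix: '' }
  }
  const stopped = history.find((m) => m.id === continueFromMessageId)
  if (!stopped || stopped.status !== 'stopped') {
    // Nothing to continue from; fall back to a normal generation.
    return { messages: history, prefix: '' }
  }
  const idx = history.indexOf(stopped)
  return { messages: history.slice(0, idx), prefix: stopped.content }
}
```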
Reviewed Changes
Copilot reviewed 12 out of 13 changed files in this pull request and generated 12 comments.
| File | Description |
|---|---|
| yarn.lock | Added type definitions for @types/node@^20.0.0 and @types/whatwg-mimetype@^3.0.2 |
| web-app/src/services/messages/types.ts | Added modifyMessage method to MessagesService interface |
| web-app/src/services/messages/default.ts | Implemented modifyMessage method for DefaultMessagesService |
| web-app/src/locales/en/common.json | Added translation key for "Continue with AI Response" |
| web-app/src/hooks/useMessages.ts | Implemented updateMessage function with optimistic updates and persistence |
| web-app/src/hooks/useChat.ts | Major refactoring to support continuing interrupted messages, including helper functions for streaming and message finalization |
| web-app/src/hooks/useAppState.ts | Exported PromptProgress type for use in other modules |
| web-app/src/hooks/tests/useMessages.test.ts | Updated tests to verify optimistic update behavior |
| web-app/src/hooks/tests/useChat.test.ts | Added comprehensive tests for continue functionality |
| web-app/src/containers/StreamingContent.tsx | Added logic to hide streaming content when a stopped message exists |
| web-app/src/containers/ScrollToBottom.tsx | Added logic to detect partial responses and model mismatches, showing continue button |
| web-app/src/containers/GenerateResponseButton.tsx | Implemented logic to continue or regenerate based on partial response and model mismatch |
| core/src/types/message/messageInterface.ts | Added modifyMessage method documentation to MessageInterface |
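The optimistic-update-with-persistence pattern described for `useMessages.ts` can be sketched as below. All names here are assumptions for illustration, not the hook's actual signature:

```typescript
// Illustrative message shape.
type Msg = { id: string; content: string }

// Optimistically update local state first, then persist the change.
async function updateMessage(
  setState: (updater: (msgs: Msg[]) => Msg[]) => void,
  persist: (msg: Msg) => Promise<void>,
  updated: Msg
): Promise<void> {
  // Replace the message in local state immediately so the UI reflects the edit.
  setState((msgs) => msgs.map((m) => (m.id === updated.id ? updated : m)))
  // Then persist in the background; a production version would roll back on failure.
  await persist(updated)
}
```

The UI stays responsive because the state update does not wait for persistence to finish.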
Describe Your Changes
Fixes Issues
Test plan
Screen.Recording.2025-10-01.at.5.44.50.PM.mov
Self Checklist