bug: Token Count shows 0% when model is not loaded #6689

@louis-jan

Description

Problem

The token count displays 0% when the model is not loaded. This provides no useful information to users and creates a poor user experience, especially when users are composing messages in a thread.

Steps to Reproduce

  1. Open a thread without loading the model
  2. Start typing a message
  3. Observe the token count indicator shows 0%

Expected Behavior

One of the following should occur:

  • Model should be automatically loaded when a thread is opened, OR
  • Token count should show an estimated value based on a default model's context size, OR
  • Display a clear indicator that the model needs to be loaded for accurate token counting
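The second option above (an estimated count against a default context size) could be sketched roughly as follows. All names (`estimateTokenUsage`, `DEFAULT_CONTEXT_SIZE`) and the ~4-characters-per-token heuristic are illustrative assumptions, not Jan's actual implementation:

```typescript
// Hypothetical sketch: fall back to a rough estimate when no model is loaded,
// and flag the result so the UI can render it as "~X% (estimated)".
const DEFAULT_CONTEXT_SIZE = 4096; // assumed fallback context window

interface TokenUsage {
  percent: number;   // 0–100, capped at 100
  estimated: boolean; // true when the heuristic was used instead of the model's tokenizer
}

function estimateTokenUsage(
  text: string,
  loadedContextSize: number | null
): TokenUsage {
  // Rough heuristic: ~4 characters per token for English text.
  const approxTokens = Math.ceil(text.length / 4);
  const contextSize = loadedContextSize ?? DEFAULT_CONTEXT_SIZE;
  return {
    percent: Math.min(100, (approxTokens / contextSize) * 100),
    estimated: loadedContextSize === null,
  };
}
```

Rendering an explicitly "estimated" percentage would give users immediate feedback while still distinguishing it from an exact tokenizer-based count.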

Actual Behavior

Token count shows 0%, providing no useful feedback to users about their message length relative to context limits.

Proposed Solution

Implement automatic model loading when a thread is opened. This would:

  • Provide immediate, accurate token count feedback
  • Improve the user experience by eliminating the manual model-loading step
  • Ensure users are aware of context limits before composing long messages

Impact

  • Users cannot gauge message length against context limits
  • Reduced usability when composing messages
  • Requires manual model loading for basic functionality


Metadata

Status: Eng Planning