
Conversation

@urmauur urmauur commented Oct 3, 2025

Describe Your Changes

Frontend Changes
  • Added chat_template_kwargs: { enable_thinking: false } when calling engine.getTokensCount()
  • Updated the type definition to include an optional chat_template_kwargs parameter
  • This ensures token counting always succeeds regardless of a model's default template settings
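The change described above can be sketched roughly as follows. This is a minimal illustration, not the actual Jan code: the `TokenCountRequest` shape and the `getTokensCount` body are hypothetical stand-ins; only the `chat_template_kwargs` / `enable_thinking` names come from the PR description.

```typescript
// Sketch only: shapes and the counting logic are assumed, not copied
// from the Jan codebase.
interface TokenCountRequest {
  messages: { role: string; content: string }[]
  // Optional kwargs forwarded to the chat template renderer.
  chat_template_kwargs?: Record<string, unknown>
}

// Hypothetical stand-in for engine.getTokensCount(): here, a crude
// whitespace-based word count across all messages.
function getTokensCount(req: TokenCountRequest): number {
  return req.messages.reduce(
    (sum, m) => sum + m.content.split(/\s+/).filter(Boolean).length,
    0
  )
}

// The caller always disables thinking explicitly, so counting behaves
// the same regardless of the model's default template settings.
const count = getTokensCount({
  messages: [{ role: 'user', content: 'hello there world' }],
  chat_template_kwargs: { enable_thinking: false },
})
console.log(count) // 3
```

Passing the kwarg explicitly at every call site is the simplest way to make the behavior independent of per-model defaults.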

Fixes Issues

  • Closes #
  • Closes #

Self Checklist

  • Added relevant comments, esp in complex areas
  • Updated docs (for bug fixes / features)
  • Created issues for follow-up changes or refactoring needed

@urmauur urmauur added this to the v0.7.1 milestone Oct 3, 2025
@urmauur urmauur requested a review from louis-jan October 3, 2025 12:08
@urmauur urmauur self-assigned this Oct 3, 2025
Copilot AI review requested due to automatic review settings October 3, 2025 12:08
@urmauur urmauur requested a review from Minh141120 October 3, 2025 12:08

Copilot AI left a comment

Pull Request Overview

This PR fixes token counting issues by ensuring the enable_thinking parameter is consistently set to false when calculating tokens, preventing failures when models have different default settings.

Key changes:

  • Added optional chat_template_kwargs parameter to the token counting API
  • Modified token counting calls to explicitly disable thinking mode
  • Applied code formatting improvements to function calls

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

File Description

  • web-app/src/services/models/default.ts — Added the chat_template_kwargs type definition and the explicit disable-thinking parameter in token counting
  • extensions/llamacpp-extension/src/index.ts — Updated the token counting implementation to use chat_template_kwargs with fallback logic and applied code formatting
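The "fallback logic" mentioned for the llamacpp extension might look like the sketch below. This is an assumption, not the actual implementation: the function name `resolveTemplateKwargs` is illustrative, and only the `enable_thinking: false` default comes from the PR.

```typescript
// Hypothetical sketch: default enable_thinking to false, but let any
// explicit caller-provided kwargs win over the default.
function resolveTemplateKwargs(
  incoming?: Record<string, unknown>
): Record<string, unknown> {
  return { enable_thinking: false, ...(incoming ?? {}) }
}

// No kwargs supplied: thinking is disabled by default.
const defaults = resolveTemplateKwargs()

// Caller overrides survive the merge.
const merged = resolveTemplateKwargs({ enable_thinking: true, foo: 1 })
console.log(defaults, merged)
```

Spreading the incoming object after the default is what makes a caller's explicit value take precedence.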


@urmauur urmauur changed the title from "Fix/prompt token" to "fix: prompt token" Oct 3, 2025

github-actions bot commented Oct 3, 2025

@louis-jan louis-jan left a comment


LGTM

@Vanalite Vanalite left a comment


LGTM

@urmauur urmauur merged commit c378d76 into release/v0.7.1 Oct 3, 2025
16 checks passed
@urmauur urmauur deleted the fix/prompt-token branch October 3, 2025 13:12
@github-project-automation github-project-automation bot moved this to QA in Jan Oct 3, 2025
@github-actions github-actions bot modified the milestones: v0.7.1, v0.7.2 Oct 3, 2025
louis-jan pushed a commit that referenced this pull request Oct 7, 2025
