fix: some of the model settings are not applied #5644
Conversation
Caution
Changes requested ❌
Reviewed everything up to 7b22ba8 in 1 minute and 24 seconds.
- Reviewed 27 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 0 draft comments
Caution
Changes requested ❌
Reviewed db74f2c in 1 minute and 20 seconds.
- Reviewed 13 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 0 draft comments
Describe Your Changes
This PR ensures that all model settings are sent to the llama.cpp server. In this patch we apply only a very minimal validation to avoid potential issues; any further enhancements should be submitted to the llama.cpp extension Epic.
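The "very minimal validation" mentioned above could look like the following sketch. The rule shapes and the `extractInferenceParams()` body here are assumptions for illustration, not the actual Jan source:

```typescript
// Hypothetical sketch of minimal per-parameter validation before
// forwarding settings to the llama.cpp server. Rule definitions are
// assumed, not taken from the real utils.ts.
type Rule = (v: unknown) => boolean

const validationRules: Record<string, Rule> = {
  // Only forward values llama.cpp can actually use.
  temperature: (v) => typeof v === 'number' && v >= 0,
  top_p: (v) => typeof v === 'number' && v >= 0 && v <= 1,
}

function extractInferenceParams(settings: Record<string, unknown>) {
  const out: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(settings)) {
    const rule = validationRules[key]
    // Keep the setting when no rule exists or the rule passes;
    // drop it otherwise so an invalid value never reaches the server.
    if (!rule || rule(value)) out[key] = value
  }
  return out
}

console.log(extractInferenceParams({ temperature: 0.7, top_p: 2 }))
```

With this shape, an out-of-range `top_p` is silently dropped rather than rejected with an error, which matches the PR's goal of staying minimal.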
With some parameters disabled (left empty)

With values entered

Fixes Issues
Self Checklist
Important
Fixes an issue where some model settings were not applied, by converting specific parameters to float in normalizeValue() in utils.ts:
- Updates normalizeValue() in utils.ts to convert specific parameters to float.
- normalizeValue() handles temperature, top_k, top_p, min_p, repeat_penalty, frequency_penalty, presence_penalty, repeat_last_n as floats.
- Adds validationRules in extractInferenceParams() and extractModelLoadParams() to validate and normalize parameters.

This description was created for db74f2c. It will automatically update as commits are pushed.
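The float conversion described in the summary can be sketched as follows. This is an illustrative guess at the shape of normalizeValue(), assuming a key/value signature; it is not the actual code from utils.ts:

```typescript
// Hypothetical sketch of normalizeValue() from the summary above.
// The parameter list comes from the PR summary; the signature is assumed.
const FLOAT_PARAMS = new Set([
  'temperature',
  'top_k',
  'top_p',
  'min_p',
  'repeat_penalty',
  'frequency_penalty',
  'presence_penalty',
  'repeat_last_n',
])

function normalizeValue(key: string, value: unknown): unknown {
  if (FLOAT_PARAMS.has(key)) {
    // Settings may arrive as strings from the UI; coerce them to numbers.
    const parsed = typeof value === 'string' ? parseFloat(value) : Number(value)
    // Drop unparseable values rather than sending NaN to the server.
    return Number.isNaN(parsed) ? undefined : parsed
  }
  // Non-numeric settings pass through unchanged.
  return value
}

console.log(normalizeValue('temperature', '0.7')) // 0.7
console.log(normalizeValue('model', 'llama'))     // "llama"
```

Returning `undefined` for unparseable values lets the caller omit the field entirely, so an empty input in the UI never overrides the server default.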