Fix: enable parameter suppression for LiteLLM #1411
Conversation
DCO Assistant Lite bot: All contributors have signed the DCO ✍️ ✅
jmartin-tech
left a comment
Given that this targets a single model family in a specific deployment method, I am not sure a hard-coded restriction in the generator is maintainable.
It might be a good idea to refactor this to be more reusable. The OpenAICompatible generator does something similar: it gathers the parameters passed for inference by excluding configurable parameters via a suppressed_params list, which the user can override through a configuration file or an extending generator class.
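As a rough illustration of the configuration-file route, a user could suppress `top_p` with something like the snippet below. The exact key nesting of garak's YAML config is an assumption here, not taken from this thread:

```yaml
# Hypothetical garak config snippet overriding suppressed_params
# for the LiteLLM generator; the key layout is an assumption.
plugins:
  generators:
    litellm:
      suppressed_params:
        - top_p
```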
Sure, let me refactor this.
@jmartin-tech I've added a `suppressed_params` option as suggested.
Test failure is addressed in #1420; new commits to this branch or a merge of #1420 should resolve it. End-to-end testing is in queue.
jmartin-tech
left a comment
Looks good, one minor suggestion on the type validation being performed.
garak/generators/litellm.py
```python
if hasattr(self, 'suppressed_params') and not isinstance(self.suppressed_params, set):
    self.suppressed_params = set(self.suppressed_params)
```
The existence check here is not required, as DEFAULT_PARAMS will be projected onto self even if None/null values are provided in the passed configuration.
The type check also seems reasonable to skip: YAML and JSON loaders will usually produce a list, since I suspect users are unlikely to add the extra syntax in YAML for a set, and JSON does not support sets by default.
Consider:

```diff
- if hasattr(self, 'suppressed_params') and not isinstance(self.suppressed_params, set):
-     self.suppressed_params = set(self.suppressed_params)
+ self.suppressed_params = set(self.suppressed_params)
```
Enforcement-wise, I believe this would create the same result and allow set() to raise if provided a value that cannot be coerced into a set. This is still imperfect type checking, but the project has not yet settled on a pattern for type-check enforcement in plugins. There are some ideas floating around about providing type hints on DEFAULT_PARAMS in the future.
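The coercion behavior the suggestion relies on can be shown in isolation. The `normalize_suppressed_params` helper below is hypothetical, written only to illustrate the point; it is not garak code:

```python
def normalize_suppressed_params(value):
    """Coerce a config-provided value into a set (hypothetical helper).

    YAML/JSON loaders typically hand back a list, so set() normalizes it;
    a non-iterable value makes set() raise TypeError, surfacing bad config.
    """
    return set(value)


# A list from a YAML/JSON config becomes a set, deduplicated.
print(normalize_suppressed_params(["top_p", "temperature"]))

# A value that cannot be coerced raises, flagging the misconfiguration.
try:
    normalize_suppressed_params(42)
except TypeError:
    print("non-iterable value rejected")
```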
@jmartin-tech Good point, I've now simplified it based on your suggestion. I've also synced it with the main branch, so hopefully the test will now succeed. Thanks!
Summary
This fixes #1410
This ensures that Claude 4.5 via AWS Bedrock on LiteLLM works and does not instantly fail.
Changes
`top_p` can now be suppressed via a configurable `suppressed_params` list.
Rationale
Currently Garak does not work with Claude 4.5 via AWS Bedrock on LiteLLM. It instantly fails due to this error:
```
litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: `temperature` and `top_p` cannot both be specified for this model. Please use only one."}. Received Model Group=eu.anthropic.claude-sonnet-4-5-20250929-v1:0 Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
```
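A minimal sketch of what parameter suppression does at call time, under the assumption that garak filters its inference parameters against the suppressed list before building the request (variable names here are illustrative, not garak's actual internals):

```python
# Illustrative sketch of parameter suppression (not garak's actual code).
suppressed_params = {"top_p"}  # set by the user or an extending generator class
inference_params = {"temperature": 0.7, "top_p": 0.9, "max_tokens": 150}

# Drop any suppressed parameter before building the API call, so that
# `temperature` and `top_p` are never sent together to Bedrock.
call_args = {k: v for k, v in inference_params.items() if k not in suppressed_params}
print(call_args)  # {'temperature': 0.7, 'max_tokens': 150}
```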