Commit d5e3e09

chore: fix anthropic integration test (#571)
## Description

Integration tests didn't like my max_tokens setting:

```
anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': '`max_tokens` must be greater than `thinking.budget_tokens`
```

## PR Type

🆕 New Feature
🐛 Bug Fix
💅 Refactor
📚 Documentation
🚦 Infrastructure

## Relevant issues

## Checklist

- [ ] I have added unit tests that prove my fix/feature works
- [ ] New and existing tests pass locally
- [ ] Documentation was updated where necessary
- [ ] I have read and followed the [contribution guidelines](https://github.com/mozilla-ai/any-llm/blob/main/CONTRIBUTING.md)
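For context, the 400 above is Anthropic's extended-thinking constraint: when `thinking` is enabled, `max_tokens` must be strictly greater than `thinking.budget_tokens`. A minimal sketch of the shape that trips it, calling the `anthropic` SDK directly (the model name and budget here are illustrative, not values taken from the test):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

request = {
    "model": "claude-sonnet-4-20250514",  # illustrative model name
    "messages": [{"role": "user", "content": "Think this over briefly."}],
    "thinking": {"type": "enabled", "budget_tokens": 1024},  # illustrative budget
}

# max_tokens=100 <= budget_tokens=1024 -> the BadRequestError quoted above:
# client.messages.create(max_tokens=100, **request)

# Leaving headroom above the thinking budget succeeds:
response = client.messages.create(max_tokens=2048, **request)
print(response.content[-1])
```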
1 parent 725b960 commit d5e3e09

1 file changed: +3 -2 lines changed

tests/integration/test_reasoning.py

Lines changed: 3 additions & 2 deletions
```diff
@@ -7,6 +7,7 @@
 
 from any_llm import AnyLLM, LLMProvider
 from any_llm.exceptions import MissingApiKeyError
+from any_llm.providers.anthropic.utils import DEFAULT_MAX_TOKENS
 from any_llm.types.completion import ChatCompletion, ChatCompletionChunk
 from tests.constants import EXPECTED_PROVIDERS, LOCAL_PROVIDERS
 
@@ -43,7 +44,7 @@ async def test_completion_reasoning(
 LLMProvider.PORTKEY,
 )
 else "auto",
-max_tokens=100,  # Portkey with anthropic needed a max tokens value to be set (because it's an anthropic model)
+max_tokens=DEFAULT_MAX_TOKENS,  # Portkey with anthropic needed a max tokens value to be set (because it's an anthropic model)
 )
 except MissingApiKeyError:
 if provider in EXPECTED_PROVIDERS:
@@ -96,7 +97,7 @@ async def test_completion_reasoning_streaming(
 LLMProvider.TOGETHER,
 )
 else "auto",
-max_tokens=100,  # Portkey with anthropic needed a max tokens value to be set (because it's an anthropic model)
+max_tokens=DEFAULT_MAX_TOKENS,  # Portkey with anthropic needed a max tokens value to be set (because it's an anthropic model)
 )
 assert isinstance(results, AsyncIterable)
 async for result in results:
```
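The swap to DEFAULT_MAX_TOKENS works because the provider default is sized to clear whatever thinking budget the test enables, whereas the hard-coded 100 could not. As a rough sketch of that rule (a hypothetical helper for illustration, not any-llm code; the 4096 fallback is an assumption):

```python
def pick_max_tokens(requested: int | None, thinking_budget: int | None, fallback: int = 4096) -> int:
    """Hypothetical helper: choose a max_tokens value Anthropic will accept.

    Anthropic rejects requests where max_tokens <= thinking.budget_tokens,
    so any value that does not clear the budget is bumped to a safe default.
    """
    if thinking_budget is None:
        return requested if requested is not None else fallback
    floor = thinking_budget + 1
    if requested is None or requested < floor:
        return max(fallback, floor)
    return requested


assert pick_max_tokens(100, 1024) == 4096   # the old hard-coded value would be bumped
assert pick_max_tokens(8192, 1024) == 8192  # explicit values above the budget pass through
```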

0 commit comments