Session ID: v2:77d21987-0e51-4fdf-9a3b-460d5eecf7ed
Session Schema: v2.0
Pro User ID: ****iLCm
Issue Description (required)
Sorry, there was an error from the AI: [Request ID: b88f24d4-ee63-4fd8-bb80-5e4dfa8c3544] {"error":"400 litellm.BadRequestError: AnthropicException - b'{"type":"error","error":{"type":"invalid_request_error","message":"This model does not support assistant message prefill. The conversation must end with a user message."},"request_id":"req_011CZzQFLniAWVqcU9iX8vqX"}'. Received Model Group=anthropic/claude-sonnet-4-6
Available Model Group Fallbacks=['anthropic/claude-sonnet-4-6:backup']
Error doing the fallback: litellm.BadRequestError: BedrockException - {"message":"The model returned the follo"}
Can we get rid of LiteLLM and route requests through Amazon Bedrock directly? We have had nothing but problems with LiteLLM, including recent security breaches.
Expected Behavior (required)
The request should complete successfully and return a model response.
Actual Behavior (required)
Sorry, there was an error from the AI: [Request ID: b88f24d4-ee63-4fd8-bb80-5e4dfa8c3544] {"error":"400 litellm.BadRequestError: AnthropicException - b'{"type":"error","error":{"type":"invalid_request_error","message":"This model does not support assistant message prefill. The conversation must end with a user message."},"request_id":"req_011CZzQFLniAWVqcU9iX8vqX"}'. Received Model Group=anthropic/claude-sonnet-4-6
Available Model Group Fallbacks=['anthropic/claude-sonnet-4-6:backup']
Error doing the fallback: litellm.BadRequestError: BedrockException - {"message":"The model returned the follo"}
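For context on the error above: the provider rejects any conversation whose final message has the "assistant" role when the selected model does not support assistant-message prefill, so the request must end with a "user" turn. The function names and message shape below are illustrative assumptions, not Dyad's or LiteLLM's actual code; this is only a minimal sketch of the kind of client-side guard that would avoid the 400:

```python
# Hypothetical client-side guard (illustration only, not Dyad's code):
# drop any trailing assistant turns so the conversation ends with a
# user message, as the invalid_request_error above demands.

def strip_trailing_assistant(messages):
    """Return a copy of `messages` with trailing assistant turns removed."""
    trimmed = list(messages)
    while trimmed and trimmed[-1]["role"] == "assistant":
        trimmed.pop()  # remove dangling assistant/prefill turn
    return trimmed

# Example conversation ending in a dangling prefill turn:
conversation = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi!"},
    {"role": "assistant", "content": ""},  # prefill turn the model rejects
]
print(strip_trailing_assistant(conversation)[-1]["role"])  # -> user
```

Whether the fix belongs in the client or in LiteLLM's request translation layer is a separate question, but the constraint itself is what the error message states.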
System Information
- Dyad Version: 0.43.0
- Platform: win32
- Architecture: x64
- Node Version: v22.14.0
- PNPM Version: 10.18.3
- Node Path: C:\Program Files\nodejs\node.exe
- Pro User ID: ****iLCm
- Telemetry ID: b05f6e4b-e2b6-44d2-aaa1-151815ce1fcb
- Model: anthropic:claude-sonnet-4-6 | customId: undefined
Settings
- Selected Model: anthropic:claude-sonnet-4-6
- Chat Mode: local-agent
- Auto Approve Changes: n/a
- Dyad Pro Enabled: true
- Thinking Budget: n/a
- Runtime Mode: host
- Release Channel: beta
- Auto Fix Problems: false
- Native Git: true
Logs
(truncated records; the leading lines of each log object were cut off)
… hasProviderModels: true }
… source: 'remote', version: '2026-04-12T18:03:26.656Z', expiresAt: '2026-04-12T20:03:28.156Z' }
… source: 'remote', version: '2026-04-12T18:03:26.656Z', providerCount: 6 }
… source: 'remote', version: '2026-04-12T18:03:26.656Z', expiresAt: '2026-04-12T20:03:28.156Z' }
… providerId: 'anthropic', source: 'remote', version: '2026-04-12T18:03:26.656Z', hasProviderModels: true }