Checks
- I have updated to the latest minor and patch version of Strands
- I have checked the documentation and this is not expected behavior
- I have searched the existing issues and there are no duplicates of my issue
Strands Version
1.8.0
Python Version
3.13.1
Operating System
macOS 15.7
Installation Method
pip
Steps to Reproduce
from strands import Agent
from strands.models.litellm import LiteLLMModel
from strands.types.exceptions import ContextWindowOverflowException
model = LiteLLMModel(model_id="litellm_proxy/us.anthropic.claude-3-7-sonnet-20250219-v1:0", params={
"api_base": "http://0.0.0.0:4000",
"api_key": "key"
})
agent = Agent(model=model)
try:
    agent("Hi" * 1000000)
except ContextWindowOverflowException as e:
    print(f"Caught context window overflow exception: {e}")
except Exception as e:
    print(f"Did not catch context window overflow exception: {e}")
Expected Behavior
Caught context window overflow exception: ...
Actual Behavior
Did not catch context window overflow exception: litellm.ContextWindowExceededError: litellm.BadRequestError: Error code: 400 - {'error': {'message': 'litellm.ContextWindowExceededError: litellm.BadRequestError: BedrockException: Context Window Error - {"message":"The model returned the following errors: Input is too long for requested model."}\nmodel=us.anthropic.claude-3-7-sonnet-20250219-v1:0. context_window_fallbacks=None. fallbacks=None.\n\nSet 'context_window_fallback' - https://docs.litellm.ai/docs/routing#fallbacks. Received Model Group=us.anthropic.claude-3-7-sonnet-20250219-v1:0\nAvailable Model Group Fallbacks=None', 'type': None, 'param': None, 'code': '400'}}
Additional Context
Raising a typed ContextWindowOverflowException from the LiteLLM model (instead of letting the raw litellm error propagate) is useful for conversation management and automatic handling / pruning of the message history.
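For example, the agent's sliding-window conversation manager can only prune and retry when the event loop sees the typed exception; with the raw litellm error the turn simply fails. A minimal sketch of the intended usage, assuming the documented SlidingWindowConversationManager import path and window_size parameter:

from strands import Agent
from strands.agent.conversation_manager import SlidingWindowConversationManager
from strands.models.litellm import LiteLLMModel

model = LiteLLMModel(
    model_id="litellm_proxy/us.anthropic.claude-3-7-sonnet-20250219-v1:0",
    params={"api_base": "http://0.0.0.0:4000", "api_key": "key"},
)

# If the model raised ContextWindowOverflowException, the event loop could hand
# the overflow to the conversation manager, which prunes older messages and retries.
# With the raw litellm.ContextWindowExceededError, that recovery path never runs.
agent = Agent(
    model=model,
    conversation_manager=SlidingWindowConversationManager(window_size=20),
)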
Possible Solution
Implement model-client exception handling in the LiteLLM model similar to the Bedrock model: https://github.com/strands-agents/sdk-python/blob/08dc4aeaad6e75dc273d41a9898e0d08df09863b/src/strands/models/bedrock.py#L713C1-L715C63
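A minimal sketch of what that could look like, not the SDK's actual implementation: translate litellm.ContextWindowExceededError into Strands' ContextWindowOverflowException at the model boundary. The ContextAwareLiteLLMModel subclass below is illustrative, and it assumes the model's async stream generator is the single request entry point; the real fix would live inside LiteLLMModel itself, mirroring the Bedrock mapping linked above.

import litellm

from strands.models.litellm import LiteLLMModel
from strands.types.exceptions import ContextWindowOverflowException


class ContextAwareLiteLLMModel(LiteLLMModel):
    # Illustrative wrapper: surfaces context-window errors as a typed Strands exception.

    async def stream(self, *args, **kwargs):
        try:
            # Delegate to the underlying LiteLLM streaming call.
            async for event in super().stream(*args, **kwargs):
                yield event
        except litellm.ContextWindowExceededError as e:
            # Re-raise as the typed exception so conversation managers can prune
            # and retry, mirroring the Bedrock model's handling.
            raise ContextWindowOverflowException(str(e)) from e

With this in place, the try/except in the reproduction above would print "Caught context window overflow exception: ..." instead of falling through to the generic handler.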
Related Issues
No response