Commit cc3a832

docs: comprehensive AWS Bedrock documentation update (#283)
* docs: update AWS Bedrock documentation with comprehensive coverage
  - Add Quick Start section to aws-bedrock.md for running a full Bedrock instance
  - Document bedrock/ prefix requirement for embedding models vs. no prefix for generation
  - Add installation instructions for the [aws] extra (boto3, botocore)
  - Fix Docker Compose instructions to use --profile aws instead of DOCKER_TARGET
  - Fix region env var references (REGION_NAME + AWS_REGION_NAME)
  - Add latest Anthropic Claude models (Opus 4.6, Sonnet 4.6, Opus 4.1, Sonnet 4, etc.)
  - Add additional embedding models (Titan multimodal, Cohere Embed v4)
  - Fix default GENERATION_MODEL in llm-providers.md to match config.py
  - Add hybrid config examples (Bedrock embeddings + OpenAI generation and vice versa)
  - Add troubleshooting entries for missing AWS dependencies
  - Cross-reference between aws-bedrock.md and llm-providers.md

* fix: update stale docstring in langchain integration to use create_agent

  The module docstring was using the removed create_tool_calling_agent and AgentExecutor API from LangChain < v0.2. Updated to use create_agent from langchain.agents, which is the current API.

* docs: address PR review feedback from Copilot
  - Fix REGION_NAME vs AWS_REGION_NAME: clarify AWS_REGION_NAME is required (LiteLLM), REGION_NAME is optional (server-side boto3 model-existence checks)
  - Fix IAM role section: note that server boto3 utilities currently require explicit credentials, but LiteLLM handles IAM roles natively
  - Fix REDISVL_VECTOR_DIMENSIONS description: it is a fallback/override, not auto-detected
  - Add asterisk notes on embedding models not in MODEL_CONFIGS (titan-embed-image, cohere.embed-v4) requiring explicit REDISVL_VECTOR_DIMENSIONS

* docs: address reviewer feedback
  - Clarify bedrock/ prefix is optional (not prohibited) for generation models
  - Remove boto3/botocore version ranges from docs
  - Rename "LLM Proxy" to "LLM Proxy (LiteLLM)" in architecture diagram
  - Update wording from "no prefix needed" to "prefix optional"
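The hybrid-configuration and region-variable bullets above can be sketched as environment variables. AWS_REGION_NAME, REGION_NAME, GENERATION_MODEL, and REDISVL_VECTOR_DIMENSIONS are named in the commit message; the EMBEDDING_MODEL variable name and the exact model ids are assumptions for illustration only.

```shell
# Hypothetical hybrid setup: Bedrock embeddings + OpenAI generation.
export AWS_REGION_NAME=us-east-1    # required by LiteLLM
export REGION_NAME=us-east-1        # optional: server-side boto3 model-existence checks

# Embedding models carry the bedrock/ prefix (model id is illustrative).
export EMBEDDING_MODEL=bedrock/amazon.titan-embed-text-v2:0

# Generation stays on OpenAI; no prefix.
export GENERATION_MODEL=gpt-4o

# Fallback/override for vector dimensions; not auto-detected.
export REDISVL_VECTOR_DIMENSIONS=1024
```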
1 parent fbf9464 commit cc3a832

3 files changed

Lines changed: 286 additions & 175 deletions

File tree

agent-memory-client/agent_memory_client/integrations/langchain.py

Lines changed: 8 additions & 10 deletions
````diff
@@ -8,9 +8,8 @@
 ```python
 from agent_memory_client import create_memory_client
 from agent_memory_client.integrations.langchain import get_memory_tools
-from langchain.agents import create_tool_calling_agent, AgentExecutor
+from langchain.agents import create_agent
 from langchain_openai import ChatOpenAI
-from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

 # Initialize memory client
 memory_client = await create_memory_client("http://localhost:8000")
@@ -24,16 +23,15 @@

 # Use with LangChain agent
 llm = ChatOpenAI(model="gpt-4o")
-prompt = ChatPromptTemplate.from_messages([
-    ("system", "You are a helpful assistant with memory."),
-    ("human", "{input}"),
-    MessagesPlaceholder("agent_scratchpad"),
-])
-agent = create_tool_calling_agent(llm, tools, prompt)
-executor = AgentExecutor(agent=agent, tools=tools)
+agent = create_agent(
+    llm, tools,
+    system_prompt="You are a helpful assistant with memory."
+)

 # Run the agent
-result = await executor.ainvoke({"input": "Remember that I love pizza"})
+result = await agent.ainvoke(
+    {"messages": [("human", "Remember that I love pizza")]}
+)
 ```
 """
````
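The prefix convention the commit documents (embedding models require the bedrock/ prefix; for generation models it is optional) can be illustrated with a small helper. This function and the model ids are hypothetical, for illustration only; they are not part of the project's API.

```python
# Hypothetical helper illustrating the documented naming convention:
# embedding models must carry the "bedrock/" prefix, while generation
# models may be used with or without it.
def normalize_bedrock_model(name: str, *, embedding: bool) -> str:
    """Return a model id that follows the documented prefix rules."""
    if embedding and not name.startswith("bedrock/"):
        # Embedding models require the bedrock/ prefix.
        return f"bedrock/{name}"
    # Generation models: prefix is optional, so leave the name as given.
    return name

print(normalize_bedrock_model("amazon.titan-embed-text-v2:0", embedding=True))
# bedrock/amazon.titan-embed-text-v2:0
```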
