Commit d7fd6e3 (1 parent: d4ef886)

Add usage example to README.md

File tree

1 file changed: +42, -0 lines

README.md

Lines changed: 42 additions, 0 deletions

@@ -349,3 +349,45 @@

```python
fastmcp_tool = to_fastmcp(add)
mcp = FastMCP("Math", tools=[fastmcp_tool])
mcp.run(transport="stdio")
```

## Passing InjectedToolArg to an MCP Tool

By using LangChain MCP Adapters on both the server and client sides, you can use `InjectedToolArg` to hide certain parameters from the LLM.

```python
# server.py
from typing import Annotated

from langchain_core.tools import InjectedToolArg, tool
from langchain_mcp_adapters.tools import to_fastmcp
from mcp.server.fastmcp import FastMCP

data = {
    "user_0": "Spike"
}


@tool
async def get_user_pet_name(user_id: Annotated[str, InjectedToolArg]) -> str:
    """Returns the user's pet name."""
    return data[user_id]


fastmcp_tool = to_fastmcp(get_user_pet_name)
mcp = FastMCP("Pets", tools=[fastmcp_tool])
mcp.run(transport="stdio")
```

The user ID can then be passed as part of the input, without the LLM's knowledge:

```python
# client.py
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

client = MultiServerMCPClient(
    ...
)

# Run inside an async function (or an event loop).
tools = await client.get_tools()
agent = create_react_agent("openai:gpt-4.1", tools)
response = await agent.ainvoke(
    {
        "messages": "What is my dog's name?",
        "user_id": "user_0"
    }
)
```
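
The hiding mechanism itself is independent of MCP: a marker attached via `typing.Annotated` lets a framework strip an argument from the schema the model sees while still accepting it at call time. Here is a minimal, stdlib-only sketch of that idea; the `Injected` marker and `llm_visible_params` helper are hypothetical stand-ins for `InjectedToolArg` and LangChain's schema generation, not part of any library:

```python
# Sketch: parameters tagged with a marker inside typing.Annotated are dropped
# from the schema shown to the model, but can still be supplied by the runtime.
from typing import Annotated, get_type_hints


class Injected:
    """Hypothetical marker: hide this parameter from the model."""


def llm_visible_params(fn) -> list[str]:
    """Return the parameter names the model would be shown."""
    hints = get_type_hints(fn, include_extras=True)
    visible = []
    for name, hint in hints.items():
        if name == "return":
            continue
        # Annotated[...] aliases carry their extra metadata in __metadata__.
        if Injected in getattr(hint, "__metadata__", ()):
            continue  # injected by the runtime, invisible to the model
        visible.append(name)
    return visible


data = {"user_0": "Spike"}


def get_user_pet_name(user_id: Annotated[str, Injected], style: str = "plain") -> str:
    """Returns the user's pet name."""
    return data[user_id]


print(llm_visible_params(get_user_pet_name))  # ['style'] -- user_id is hidden
```

The caller (here, the agent runtime) still passes `user_id` when invoking the tool; only the model-facing schema omits it.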
