lines changed Original file line number Diff line number Diff line change @@ -349,3 +349,45 @@ fastmcp_tool = to_fastmcp(add)
349349mcp = FastMCP(" Math" , tools = [fastmcp_tool])
350350mcp.run(transport = " stdio" )
351351```

## Passing InjectedToolArg to an MCP Tool

By using the LangChain MCP Adapter on both the server and client sides, you can use `InjectedToolArg` to hide certain parameters from the LLM.

```python
# server.py
from typing import Annotated

from langchain_core.tools import InjectedToolArg, tool
from langchain_mcp_adapters.tools import to_fastmcp
from mcp.server.fastmcp import FastMCP

data = {
    'user_0': 'Spike'
}

@tool
async def get_user_pet_name(user_id: Annotated[str, InjectedToolArg]) -> str:
    """Returns the user's pet name"""
    return data[user_id]

fastmcp_tool = to_fastmcp(get_user_pet_name)
mcp = FastMCP("Pets", tools=[fastmcp_tool])
mcp.run(transport="stdio")
```
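
What `InjectedToolArg` does to the tool's schema can be illustrated with plain `typing` machinery. The `Injected` marker and `visible_params` helper below are illustrative stand-ins, not part of the adapter API: parameters carrying the marker are filtered out of the argument list the model is shown.

```python
import inspect
from typing import Annotated, get_args, get_origin, get_type_hints

class Injected:
    """Illustrative marker: parameters annotated with this are hidden from the LLM."""

def visible_params(fn):
    """Return the parameter names the LLM-facing schema should include."""
    hints = get_type_hints(fn, include_extras=True)
    visible = []
    for name in inspect.signature(fn).parameters:
        hint = hints.get(name)
        # Annotated[X, meta, ...] -> metadata is everything after the base type
        metadata = get_args(hint)[1:] if get_origin(hint) is Annotated else ()
        if not any(m is Injected or isinstance(m, Injected) for m in metadata):
            visible.append(name)
    return visible

def get_user_pet_name(question: str, user_id: Annotated[str, Injected]) -> str:
    ...

print(visible_params(get_user_pet_name))  # ['question']
```

The real adapter performs the same kind of filtering when it builds the MCP tool schema, so the model never sees `user_id` as a parameter it must fill in.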

The user ID can then be passed as part of the tool input, without the LLM's knowledge:

```python
# client.py
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

client = MultiServerMCPClient(
    ...
)

tools = await client.get_tools()
agent = create_react_agent("openai:gpt-4.1", tools)
response = await agent.ainvoke(
    {
        "messages": "What is my dog's name?",
        "user_id": "user_0"
    }
)
```
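
Conceptually, the injected value never comes from the model: it is bound to the call by the application, much like `functools.partial` pre-fills an argument before the callable is exposed. A minimal sketch of that idea (not the adapter's actual mechanism):

```python
from functools import partial

def get_user_pet_name(user_id: str) -> str:
    data = {'user_0': 'Spike'}
    return data[user_id]

# Bind the injected value up front; the exposed callable takes no user_id.
bound = partial(get_user_pet_name, user_id='user_0')
print(bound())  # Spike
```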