topic/machine-learning/llm-langchain: 3 files changed, +4 −4 lines changed

@@ -2,7 +2,6 @@ name: LangChain
 
 on:
   pull_request:
-    branches: ~
     paths:
       - '.github/workflows/ml-langchain.yml'
       - 'topic/machine-learning/llm-langchain/**'

@@ -90,7 +90,7 @@ and [CrateDB].
   It is based on the previous notebook, and it illustrates how to use Vertex AI platform
   on Google Cloud for RAG pipeline.
 
-- `agent_with_mcp.py`
+- `agent_with_mcp.py` [![Open on GitHub](https://img.shields.io/badge/Open%20on-GitHub-lightgray?logo=GitHub)](agent_with_mcp.py)
 
   This example illustrates how to use LangGraph and the `langchain-mcp-adapters`
   package to implement an LLM agent that is connecting to the CrateDB MCP server.
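The README entry above describes an agent built with LangGraph and the `langchain-mcp-adapters` package that talks to the CrateDB MCP server. A rough sketch of that wiring follows, assuming an MCP server reachable over streamable HTTP; the server name, endpoint URL, and model identifier are illustrative assumptions, not taken from the repository:

```python
def make_server_config(url="http://localhost:8000/mcp"):
    """Build a MultiServerMCPClient config for a single,
    hypothetical CrateDB MCP endpoint."""
    return {
        "cratedb": {
            "transport": "streamable_http",
            "url": url,
        }
    }


async def build_agent(model="gpt-4o-mini"):
    # Imported lazily so the pure config helper above also works in
    # environments where these packages are not installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(make_server_config())
    tools = await client.get_tools()  # expose MCP tools as LangChain tools
    return create_react_agent(model, tools)
```

An agent built this way is then invoked with `await agent.ainvoke({"messages": ...})`, as the patched `agent_with_mcp.py` below does.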
@@ -173,7 +173,7 @@ pytest -k document_loader
 pytest -k "notebook and loader"
 ```
 
-To force a regeneration of the Jupyter Notebook, use the
+To force regeneration of Jupyter notebooks, use the
 `--nb-force-regen` option.
 ```shell
 pytest -k document_loader --nb-force-regen

@@ -24,6 +24,7 @@
 python agent_with_mcp.py
 """
 import asyncio
+import os
 
 from cratedb_about.instruction import GeneralInstructions
 from langchain_mcp_adapters.client import MultiServerMCPClient
@@ -46,7 +47,7 @@ async def amain():
         prompt=GeneralInstructions().render(),
     )
 
-    QUERY_STR = "What is the average value for sensor 1?"
+    QUERY_STR = os.getenv("DEMO_QUERY", "What is the average value for sensor 1?")
     response = await agent.ainvoke({"messages": QUERY_STR})
     answer = response["messages"][-1].content
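The change above swaps the hard-coded question for an environment-variable lookup, so the demo query can be overridden via `DEMO_QUERY` without editing the file. The fallback behavior of `os.getenv` can be exercised in isolation; the helper name and alternate query below are illustrative:

```python
import os


def get_demo_query(default="What is the average value for sensor 1?"):
    # os.getenv returns the variable's value when it is set,
    # and the supplied default otherwise.
    return os.getenv("DEMO_QUERY", default)


os.environ.pop("DEMO_QUERY", None)
print(get_demo_query())  # falls back to the default question
os.environ["DEMO_QUERY"] = "How many readings arrived today?"
print(get_demo_query())  # returns the override
```

This keeps the script runnable out of the box while letting a demo driver set the question externally, e.g. `DEMO_QUERY="..." python agent_with_mcp.py`.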