@@ -17,10 +17,10 @@ This tutorial shows how to set up real-time weather functionality with Llama-Nex
## 1. Set Up Your MCP Server

```bash
- curl -LO https://github.com/cardea-mcp/cardea-mcp-servers/releases/download/0.7.0/cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
+ curl -LO https://github.com/cardea-mcp/cardea-mcp-servers/releases/download/0.8.0/cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
tar xvf cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
```
- > Download for your platform: https://github.com/cardea-mcp/cardea-mcp-servers/releases/tag/0.7.0
+ > Download for your platform: https://github.com/cardea-mcp/cardea-mcp-servers/releases/tag/0.8.0

Set the environment variables:

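As a rough sketch of this step, assuming the weather MCP server reads an OpenWeatherMap API key from the environment (the variable names below are assumptions, not taken from the tutorial; check the cardea-mcp-servers README for the exact names), the exports could look like:

```bash
# Assumed variable names -- verify against the cardea-mcp-servers documentation.
export OPENWEATHERMAP_API_KEY=your_api_key_here   # API key for the weather data provider (assumed)
export RUST_LOG=debug                             # optional: verbose logging while troubleshooting
```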
@@ -45,11 +45,11 @@ Run the MCP server (accessible from external connections):
Download and extract llama-nexus:

```bash
- curl -LO https://github.com/LlamaEdge/llama-nexus/releases/download/0.5.0/llama-nexus-apple-darwin-aarch64.tar.gz
+ curl -LO https://github.com/LlamaEdge/llama-nexus/releases/download/0.6.0/llama-nexus-apple-darwin-aarch64.tar.gz
tar xvf llama-nexus-apple-darwin-aarch64.tar.gz
```

- > Download for your platform: https://github.com/LlamaEdge/llama-nexus/releases/tag/0.5.0
+ > Download for your platform: https://github.com/LlamaEdge/llama-nexus/releases/tag/0.6.0

### Configure llama-nexus

@@ -87,11 +87,13 @@ Register an LLM chat API server for the `/chat/completions` endpoint:
curl --location 'http://localhost:9095/admin/servers/register' \
--header 'Content-Type: application/json' \
--data '{
-     "url": "https://0xb2962131564bc854ece7b0f7c8c9a8345847abfb.gaia.domains",
+     "url": "https://0xb2962131564bc854ece7b0f7c8c9a8345847abfb.gaia.domains/v1",
      "kind": "chat"
}'
```

+ > If your API server requires an API key for access, you can add an `api-key` field in the registration request, as sketched below.
+
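For example, a registration request that includes the key might look like this (the `api-key` value is a placeholder, and its placement in the JSON body is an assumption based on the note above):

```bash
# Sketch: same registration call as above, with an assumed api-key field added.
curl --location 'http://localhost:9095/admin/servers/register' \
--header 'Content-Type: application/json' \
--data '{
    "url": "https://0xb2962131564bc854ece7b0f7c8c9a8345847abfb.gaia.domains/v1",
    "kind": "chat",
    "api-key": "YOUR_API_KEY"
}'
```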
## 3. Test the Setup

Test the inference server by requesting the `/chat/completions` API endpoint, which is OpenAI-compatible:
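A minimal request sketch, assuming llama-nexus listens on port 9095 (as in the registration step above) and exposes the OpenAI-style `/v1` prefix; the port, path prefix, and whether a `model` field is required may differ in your setup:

```bash
# Sketch of an OpenAI-compatible chat request against the gateway (port and /v1 prefix assumed).
curl --location 'http://localhost:9095/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the weather in Singapore right now?"}
    ]
}'
```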