Test coverage: 65.614% 😌👏
clai (/klaɪ/, like "cli" in "climate") is a command-line context-feeder for any AI task.
Installing:

```bash
curl -fsSL https://raw.githubusercontent.com/baalimago/clai/main/setup.sh | sh
```

You can also install via go:

```bash
go install github.com/baalimago/clai@latest
```

Then run:

```bash
clai help | clai query Please give a concise explanation of clai
```

Either look at `clai help` or the examples for how to use clai.
If you have time, you can also check out this blog post for a slightly more structured introduction on how to use clai efficiently.
Install Glow for formatted markdown output when querying text responses.
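If you already have Go on your PATH, one way to get Glow is via go install (Glow is also available through most package managers):

```bash
# Installs the Glow markdown renderer, which clai uses for formatted output
go install github.com/charmbracelet/glow@latest
```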
- MCP client support - Add any MCP server you'd like by simply pasting their configuration.
- Vendor agnosticism - Use any functionality in Clai with most LLM vendors interchangeably.
- Conversations - Create, manage and continue conversations.
- Profiles - Pre-prompted profiles enabling customized workflows and agents.
- Unix-like - Clai follows the Unix philosophy and works seamlessly with data piped in and out, as sketched below.
All of these features can be combined and tweaked, supporting very diverse workflows. See examples for additional info.
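Because clai reads piped input, any command output or file can serve as context for a query. A minimal sketch of this pattern (the file name is just a placeholder):

```bash
# Pipe any text into clai; the piped data becomes context for the query
cat main.go | clai query Explain what this file does
```

The table below lists the supported vendors and the environment variables used to authenticate with them.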
| Vendor | Environment Variable | Models |
|---|---|---|
| Mistral | MISTRAL_API_KEY | Text models |
| HuggingFace | HF_API_KEY | Text models, use prefix hf: |
| OpenAI | OPENAI_API_KEY | Text models, photo models |
| Anthropic | ANTHROPIC_API_KEY | Text models |
| Gemini | GEMINI_API_KEY | Text models, photo models |
| xAi | XAI_API_KEY | Text models |
| Inception | INCEPTION_API_KEY | Text models |
| Ollama | N/A | Use format `ollama:<model>` (defaults to llama3), server defaults to localhost:11434 |
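As a minimal sketch of switching vendors: export the key for the vendor you want and query as usual. The key value below is a placeholder, and which model actually answers depends on your clai configuration (see `clai help`):

```bash
# Authenticate against a vendor by exporting its API key (placeholder value)
export ANTHROPIC_API_KEY="sk-ant-..."
clai query Summarize the unix philosophy in one sentence
```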

