Command line artificial intelligence - Your local LLM context-feeder
Command Line Artificial Intelligence

Go Report Card Wakatime

Test coverage: 65.614% 😌👏


clai (/klaɪ/, like "cli" in "climate") is a command line context-feeder for any AI task.

Banner

Get started

Installing:

curl -fsSL https://raw.githubusercontent.com/baalimago/clai/main/setup.sh | sh

You can also install via go:

go install github.com/baalimago/clai@latest

Then run:

clai help | clai query Please give a concise explanation of clai
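The command above pipes clai's own help text in as context. The same stdin mechanism works for any text. A minimal sketch (assumes clai is installed, on PATH, and that a vendor API key such as OPENAI_API_KEY is exported; the file name is just an illustration):

```shell
#!/bin/sh
# Feed a local file to the model as context, then ask a question about it.
# The guard keeps this runnable even on machines without clai installed.
if command -v clai >/dev/null 2>&1; then
  cat README.md | clai query Please summarize this file in one paragraph
else
  echo "clai not found on PATH; see the install instructions above" >&2
fi
```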

See clai help or the examples for how to use clai. If you have time, you can also check out this blog post for a slightly more structured introduction to using clai efficiently.

Install Glow for formatted markdown output when querying text responses.

Features

Showcase
  • MCP client support - Add any MCP server you'd like by simply pasting their configuration.
  • Vendor agnosticism - Use any functionality in Clai with most LLM vendors interchangeably.
  • Conversations - Create, manage and continue conversations.
  • Profiles - Pre-prompted profiles enabling customized workflows and agents.
  • Unix-like - Clai follows the unix philosophy and works seamlessly with data piped in and out.

All of these features can be combined and tweaked, supporting a wide range of use cases. See the examples for additional info.
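The unix-like design means both ends of a query can be plugged into a pipeline. A hedged sketch of combining piped-in context with piped-out output (assumes clai is installed with an API key exported, and that the command runs inside a git repository; the output file name is arbitrary):

```shell
#!/bin/sh
# Pipe a diff in as context, redirect the model's answer to a file.
# Guarded so the script is a no-op where clai is not installed.
if command -v clai >/dev/null 2>&1; then
  git diff | clai query Write a concise commit message for this diff > commit_msg.txt
fi
```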

Supported vendors

| Vendor      | Environment variable | Models                                                                  |
| ----------- | -------------------- | ----------------------------------------------------------------------- |
| Mistral     | `MISTRAL_API_KEY`    | Text models                                                             |
| HuggingFace | `HF_API_KEY`         | Text models; use prefix `hf:`                                           |
| OpenAI      | `OPENAI_API_KEY`     | Text models, photo models                                               |
| Anthropic   | `ANTHROPIC_API_KEY`  | Text models                                                             |
| Gemini      | `GEMINI_API_KEY`     | Text models, photo models                                               |
| xAi         | `XAI_API_KEY`        | Text models                                                             |
| Inception   | `INCEPTION_API_KEY`  | Text models                                                             |
| Ollama      | N/A                  | Use format `ollama:` (defaults to `llama3`); server defaults to `localhost:11434` |
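Vendor selection starts with exporting the matching API key from the table. A minimal sketch (the key values are placeholders, not real credentials; Ollama needs no key, since its server defaults to `localhost:11434`):

```shell
#!/bin/sh
# Export the key for whichever vendor you want clai to use.
export OPENAI_API_KEY="sk-..."         # placeholder value
export ANTHROPIC_API_KEY="sk-ant-..."  # placeholder value
# HuggingFace models take the hf: prefix; Ollama models take the ollama:
# prefix (defaulting to llama3) and require no API key at all.
if command -v clai >/dev/null 2>&1; then
  clai query Say hello
fi
```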
