A step-by-step guide to installing, configuring, and using Cortex.
```bash
npm install -g @gzoo/cortex
```

Verify it worked:

```bash
cortex --version
```

Run the setup wizard:

```bash
cortex init
```

The wizard walks you through:
- Local LLM — Do you have Ollama running? If yes, it detects your GPU and recommends a model.
- Cloud LLM — Pick a provider (Anthropic, Google Gemini, Groq, OpenRouter) and paste your API key.
- Routing mode — How to split work between cloud and local:
  - `cloud-first` — all tasks go to the cloud. Best quality, costs money.
  - `hybrid` — bulk tasks (extraction, ranking) go to Ollama; reasoning tasks go to the cloud. Cheapest good option.
  - `local-first` — everything goes to Ollama, with the cloud only as a fallback. Minimal cost.
  - `local-only` — never touches the internet. Free. Requires Ollama.
- Watch directories — Which directories Cortex should monitor for file changes.
- Budget — Monthly LLM spend cap (default $25).
Config is saved to `~/.cortex/cortex.config.json`. API keys go in `~/.cortex/.env`.
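For orientation, a config file built from the keys this guide mentions (`llm.mode`, `llm.budget.monthlyLimitUsd`, `llm.local.model`, `llm.local.host`, `server.port`, `ingest.exclude`, `graph.dbPath`, `logging.level`) might look roughly like the sketch below. The nesting and values here are illustrative assumptions — treat the file the wizard generates as authoritative.

```json
{
  "llm": {
    "mode": "hybrid",
    "budget": { "monthlyLimitUsd": 25 },
    "local": { "model": "llama3:8b", "host": "http://localhost:11434" }
  },
  "server": { "port": 3710 },
  "ingest": { "exclude": ["**/node_modules/**", "**/dist/**", "**/.git/**"] },
  "graph": { "dbPath": "~/.cortex/graph.db" },
  "logging": { "level": "info" }
}
```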
Tell Cortex which projects to track:
```bash
cortex projects add my-app ~/projects/my-app
cortex projects add api ~/projects/api
cortex projects add docs ~/projects/docs
```

Verify:

```bash
cortex projects list
```

There are two ways to run the ingestion pipeline:
| Command | What it does |
|---|---|
| `cortex serve` | Web dashboard + API + file watching + ingestion (recommended) |
| `cortex watch` | File watching + ingestion only (CLI, no dashboard) |
You do not need both. `cortex serve` already includes a file watcher. If you run `cortex watch` and `cortex serve` at the same time, they will compete for file changes and the Live Feed in the dashboard will not show events.
For most users, just run:
```bash
cortex serve
```

This starts the dashboard at http://localhost:3710 and watches all registered projects.
If you only want CLI-based ingestion without the web UI:
```bash
cortex watch
```

When a file is saved, Cortex:
- Parses the file (markdown, TypeScript, JSON, YAML)
- Extracts entities (decisions, patterns, components, etc.)
- Infers relationships between entities
- Detects contradictions
- Stores everything in your local knowledge graph
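The pipeline above can be sketched in miniature. This is a hypothetical illustration of the flow (parse a file, extract entities, store them in a graph) — every name in it is invented for the example; it is not Cortex's actual code.

```typescript
// Hypothetical sketch of the ingestion flow: parse → extract → store.
// All names are invented for illustration; this is not Cortex's internals.
type Entity = { id: string; type: "Decision" | "Pattern" | "Component"; name: string };
type Relationship = { from: string; to: string; kind: string };

class KnowledgeGraph {
  entities = new Map<string, Entity>();
  relationships: Relationship[] = [];
  addEntity(e: Entity): void {
    this.entities.set(e.id, e);
  }
}

// Toy extractor: treat "we decided ..." lines in markdown as Decision entities.
function extractEntities(markdown: string): Entity[] {
  return markdown
    .split("\n")
    .filter((line) => line.toLowerCase().includes("we decided"))
    .map((line, i): Entity => ({ id: `decision-${i}`, type: "Decision", name: line.trim() }));
}

// "Ingest" a saved file's contents into the local graph.
function ingest(markdown: string, graph: KnowledgeGraph): void {
  for (const entity of extractEntities(markdown)) graph.addEntity(entity);
}

const graph = new KnowledgeGraph();
ingest("# Notes\nWe decided to cache with Redis.\nToday I refactored.", graph);
console.log(graph.entities.size); // prints 1
```

The real tool additionally infers relationships and checks new entities against existing ones for contradictions before writing to the graph.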
Watch a specific project:
```bash
cortex watch my-app
```

Stop with `Ctrl+C`.
The watcher only picks up new changes. To ingest files that already exist:
```bash
cortex ingest "~/projects/my-app/src/**/*.ts"
cortex ingest ~/projects/my-app/README.md
```

Preview what would be extracted without writing to the DB:

```bash
cortex ingest "~/projects/my-app/docs/*.md" --dry-run
```

Ask questions in natural language:

```bash
cortex query "what caching strategies am I using?"
cortex query "what decisions have I made about authentication?"
cortex query "what are the dependencies between my projects?"
```

Responses include source citations so you can trace back to the original files.
Filter to a specific project:
```bash
cortex query "what patterns does this project use?" --project my-app
```

Search for specific entities and their relationships:

```bash
cortex find "PostgreSQL"
cortex find "PostgreSQL" --expand 2    # show 2 hops of relationships
cortex find "auth" --type Decision     # only decisions matching "auth"
```

Check the system at a glance:

```bash
cortex status
```

Shows: entity/relationship counts, LLM provider status, budget usage, and storage size.
```bash
cortex serve
```

Starts the server at http://localhost:3710 — this includes both the web dashboard and a file watcher. You do not need `cortex watch` when running `cortex serve`.
The dashboard includes:
- Dashboard — stats, recent entities, entity type breakdown
- Graph — interactive knowledge graph visualization
- Live Feed — real-time ingestion events (shows DB stats on load)
- Query — natural language queries with streaming responses
- Contradictions — review and resolve conflicting decisions
Access from another machine on your network:
```bash
cortex serve --host 0.0.0.0
```

Cortex ignores `node_modules`, `dist`, `.git`, and other common directories by default.
Add more exclusions:
```bash
cortex config exclude add "**/generated/**"
cortex config exclude add "**/*.min.js"
cortex config exclude add "vendor"
```

See what's excluded:

```bash
cortex config exclude list
```

Remove an exclusion:

```bash
cortex config exclude remove "vendor"
```

Important: `cortex config set ingest.exclude '[...]'` overwrites the entire exclude list. Always use `cortex config exclude add` to append without losing existing patterns.
When Cortex detects conflicting decisions across your projects:
```bash
cortex contradictions                      # list active contradictions
cortex contradictions --all                # include resolved ones
cortex resolve <id> --action supersede     # the newer decision wins
cortex resolve <id> --action dismiss       # ignore this contradiction
cortex resolve <id> --action both-valid    # both decisions stand
```

Track LLM spending:

```bash
cortex costs                   # this month's spending
cortex costs --period today    # today only
cortex costs --by model        # breakdown by model
cortex costs --csv             # export for spreadsheets
```

Mark sensitive directories so they're never sent to cloud LLMs:

```bash
cortex privacy set ~/projects/client-work restricted
cortex privacy set ~/projects/internal sensitive
cortex privacy list    # show all classifications
```

Privacy levels:

- `standard` — can be sent to cloud LLMs
- `sensitive` — sent to the cloud, but with extra scrubbing
- `restricted` — never leaves your machine; local LLM only
```bash
cortex config set llm.mode hybrid                  # switch routing mode
cortex config set llm.budget.monthlyLimitUsd 10    # lower budget
cortex config set llm.local.model "llama3:8b"      # change Ollama model
cortex config set server.port 8080                 # change dashboard port
cortex config list                                 # see all non-default values
cortex config get ingest.exclude                   # check a specific value
```

Environment variables override config file values:
| Variable | Overrides |
|---|---|
| `CORTEX_LLM_MODE` | `llm.mode` |
| `CORTEX_SERVER_PORT` | `server.port` |
| `CORTEX_DB_PATH` | `graph.dbPath` |
| `CORTEX_LOG_LEVEL` | `logging.level` |
| `CORTEX_BUDGET_LIMIT` | `llm.budget.monthlyLimitUsd` |
| `CORTEX_OLLAMA_HOST` | `llm.local.host` |
The file is too large. Exclude it:
```bash
cortex config exclude add "**/prisma/runtime/**"
```

Upgrade to v0.2.5+. If still stuck, press `Ctrl+C` twice for a force exit.
Upgrade to v0.2.4+. JSONC comments and trailing commas are now supported.
Upgrade to v0.2.3+. Older versions only checked for the Anthropic key variable.
Upgrade to v0.2.6+, or use `--verbose` only when debugging. Log lines are suppressed by default.
Upgrade to v0.2.7+. The Live Feed now loads existing DB stats on page load. New events appear when files are modified while `cortex serve` is running.