Claude Context is an MCP plugin that adds semantic code search to Claude Code and other AI coding agents, giving them deep context from your entire codebase.
🧠 Your Entire Codebase as Context: Claude Context uses semantic search to find all relevant code from millions of lines. No multi-round discovery needed. It brings results straight into Claude's context.
💰 Cost-Effective for Large Codebases: Loading entire directories into Claude for every request can be very expensive. Claude Context instead stores your codebase locally in a vector database and places only the relevant code in context, keeping your costs manageable.
🏠 Local-First with LanceDB: By default, Claude Context uses LanceDB for local, embedded vector storage - no external services required. Quick setup with your data staying on your machine.
Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g. Claude Code.
Setup in 30 seconds - Just one command!
Get an API key from OpenAI (it starts with sk-)

```bash
claude mcp add claude-context -e OPENAI_API_KEY=your-openai-api-key -- npx @dannyboy2042/claude-context-mcp@latest
```

That's it! 🎉 Claude Context will:
- ✅ Install automatically with no setup required
- ✅ Store everything locally using LanceDB (no cloud services needed)
- ✅ Work with any codebase size
- ✅ Enable hybrid vector + text search for better results
- Open Claude Code in your project directory
- Ask Claude to "index this codebase"
- Then ask questions like "find the authentication logic" or "show me the database connection code"
Claude Context automatically handles multiple projects by creating isolated collections for each codebase:
- Each project gets its own collection based on the project's absolute path
- Projects are completely isolated from each other
- Switch between projects by opening Claude Code in different directories
- All data stored locally in `~/.claude-context/lancedb/`
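The per-project isolation described above can be sketched as a deterministic mapping from a project's absolute path to a collection name. The hash and the `code_chunks_` prefix below are hypothetical, not Claude Context's actual naming scheme:

```typescript
// Hypothetical sketch: derive a stable collection name from a project's
// absolute path so each codebase gets its own isolated collection.
function collectionNameFor(absoluteProjectPath: string): string {
  // FNV-1a 32-bit hash of the path, rendered as hex
  let hash = 0x811c9dc5;
  for (let i = 0; i < absoluteProjectPath.length; i++) {
    hash ^= absoluteProjectPath.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return `code_chunks_${hash.toString(16).padStart(8, "0")}`;
}

console.log(collectionNameFor("/home/alice/my-app"));
```

Because the name is derived from the absolute path, opening Claude Code in a different directory naturally resolves to a different, fully isolated collection.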
System Requirements:
- Node.js >= 20.0.0 (Claude Context works locally - no external services required!)
See the Claude Code MCP documentation for more details.
Simple setup for any MCP client - just add your OpenAI API key!
Gemini CLI
Create or edit the ~/.gemini/settings.json file:
```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Qwen Code
Create or edit the ~/.qwen/settings.json file:
```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Cursor
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Add this to your Cursor ~/.cursor/mcp.json file:
```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Void
Go to: Settings -> MCP -> Add MCP Server
Add the following configuration to your Void MCP settings:
```json
{
  "mcpServers": {
    "code-context": {
      "command": "npx",
      "args": ["-y", "@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Claude Desktop
Add to your Claude Desktop configuration:
```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Windsurf
Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:
```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

VS Code
Add the following configuration to your VS Code MCP settings:
```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Cherry Studio
Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:
- Navigate to Settings → MCP Servers → Add Server.
- Fill in the server details:
  - Name: `claude-context`
  - Type: `STDIO`
  - Command: `npx`
  - Arguments: `["@dannyboy2042/claude-context-mcp@latest"]`
  - Environment Variables: `OPENAI_API_KEY`: your-openai-api-key (optionally `MILVUS_ADDRESS`: milvus-cloud-public-endpoint and `MILVUS_TOKEN`: milvus-cloud-api-key if you use the optional cloud vector database)
- Save the configuration to activate the server.
Cline
Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:
- Open Cline and click on the MCP Servers icon in the top navigation bar.
- Select the Installed tab, then click Advanced MCP Settings.
- In the `cline_mcp_settings.json` file, add the following configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

- Save the file.
Augment
To configure Claude Context MCP in Augment Code, you can use either the graphical interface or manual configuration.
Using the graphical interface:
- Click the hamburger menu.
- Select Settings.
- Navigate to the Tools section.
- Click the + Add MCP button.
- Enter the following command: `npx @dannyboy2042/claude-context-mcp@latest`
- Name the MCP: Claude Context.
- Click the Add button.

Using manual configuration:
- Press Cmd/Ctrl+Shift+P or go to the hamburger menu in the Augment panel
- Select Edit Settings
- Under Advanced, click Edit in settings.json
- Add the server configuration to the `mcpServers` array in the `augment.advanced` object:
```json
"augment.advanced": {
  "mcpServers": [
    {
      "name": "claude-context",
      "command": "npx",
      "args": ["-y", "@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  ]
}
```

Roo Code
Roo Code utilizes a JSON configuration file for MCP servers:
- Open Roo Code and navigate to Settings → MCP Servers → Edit Global Config.
- In the `mcp_settings.json` file, add the following configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@dannyboy2042/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

- Save the file to activate the server.
Other MCP Clients
The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:
```bash
npx @dannyboy2042/claude-context-mcp@latest
```

For more detailed MCP environment variable configuration, see our Environment Variables Guide.
📚 Need more help? Check out our complete documentation for detailed guides and troubleshooting tips.
- 🔍 Hybrid Search: Advanced hybrid search combining vector similarity + full-text search using RRF (Reciprocal Rank Fusion) for better results
- 🔒 100% Local: Everything runs locally using LanceDB - your code never leaves your machine
- ⚡ Fast & Incremental: Smart indexing that only processes changed files
- 🧩 AST-Powered: Intelligent code chunking using Abstract Syntax Trees for better context
- 🗄️ No Limits: Handle codebases of any size locally
- 🛠️ Zero Config: Works out of the box - just add your OpenAI API key
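The hybrid search above fuses two rankings with Reciprocal Rank Fusion: each document scores 1/(k + rank) in every list that contains it, and the sums are sorted. A minimal sketch (function name illustrative, k = 60 is the conventional constant):

```typescript
// Sketch of Reciprocal Rank Fusion (RRF): merge a vector-similarity ranking
// and a full-text ranking into a single list. Each item's fused score is the
// sum of 1 / (k + rank) over the lists it appears in.
function rrfMerge(vectorRanked: string[], textRanked: string[], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranked of [vectorRanked, textRanked]) {
    ranked.forEach((id, i) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()].sort((a, b) => b[1] - a[1]).map(([id]) => id);
}

// "b" wins: it appears near the top of both rankings.
console.log(rrfMerge(["a", "b", "c"], ["b", "d"])); // [ 'b', 'a', 'd', 'c' ]
```

RRF needs only ranks, not raw scores, which is why it works well for combining a cosine-similarity list with a keyword-match list whose scores are on different scales.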
Claude Context is a monorepo containing these main packages:
- `@dannyboy2042/claude-context-core`: Core indexing engine with embedding and vector database integration
- `@dannyboy2042/claude-context-mcp`: Model Context Protocol server for AI agent integration
- Vector Database: LanceDB (local, embedded, zero-config)
- Embedding: OpenAI text-embedding-3-small (other providers available: VoyageAI, Ollama, Gemini)
- Code Analysis: AST-based intelligent chunking with LangChain fallback
- Languages: TypeScript, JavaScript, Python, Java, C++, C#, Go, Rust, PHP, Ruby, Swift, Kotlin, Scala, Markdown
- AI Clients: Claude Code, Cursor, Windsurf, Gemini CLI, and all MCP-compatible clients
🔧 Advanced: Optional Cloud Vector Database
For enterprise teams or advanced use cases, you can optionally configure Milvus/Zilliz Cloud by setting environment variables:

```bash
claude mcp add claude-context -e OPENAI_API_KEY=your-key -e VECTOR_DB_TYPE=milvus -e MILVUS_TOKEN=your-token -- npx @dannyboy2042/claude-context-mcp@latest
```

The MCP server is the primary way to use Claude Context with AI assistants like Claude Code.
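As a rough sketch of what that environment switch implies (illustrative only - the type and function names here are hypothetical, not the server's actual code):

```typescript
// Hypothetical sketch: default to local LanceDB unless VECTOR_DB_TYPE=milvus
// is set, in which case MILVUS_TOKEN is required.
type Env = Record<string, string | undefined>;

type VectorDbChoice =
  | { kind: "lancedb"; uri: string }
  | { kind: "milvus"; token: string };

function chooseVectorDb(env: Env): VectorDbChoice {
  if (env.VECTOR_DB_TYPE === "milvus") {
    if (!env.MILVUS_TOKEN) throw new Error("MILVUS_TOKEN is required for the milvus backend");
    return { kind: "milvus", token: env.MILVUS_TOKEN };
  }
  // Default: local LanceDB under the user's home directory
  return { kind: "lancedb", uri: `${env.HOME ?? "~"}/.claude-context/lancedb` };
}

console.log(chooseVectorDb({ HOME: "/home/alice" }).kind); // "lancedb"
```

The key point is the default: with no extra variables set, everything stays local.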
The @dannyboy2042/claude-context-core package provides the fundamental functionality for code indexing and semantic search.
```typescript
import { Context, LanceDBVectorDatabase, OpenAIEmbedding } from '@dannyboy2042/claude-context-core';

// Initialize embedding provider
const embedding = new OpenAIEmbedding({
    apiKey: process.env.OPENAI_API_KEY || 'your-openai-api-key',
    model: 'text-embedding-3-small'
});

// Initialize vector database (LanceDB - local, no setup required)
const vectorDatabase = new LanceDBVectorDatabase({
    uri: './.claude-context/lancedb' // Local storage path
});

// Create context instance
const context = new Context({
    embedding,
    vectorDatabase
});

// Index your codebase with progress tracking
const stats = await context.indexCodebase('./your-project', (progress) => {
    console.log(`${progress.phase} - ${progress.percentage}%`);
});
console.log(`Indexed ${stats.indexedFiles} files, ${stats.totalChunks} chunks`);

// Perform semantic search
const results = await context.semanticSearch('./your-project', 'vector database operations', 5);
results.forEach(result => {
    console.log(`File: ${result.relativePath}:${result.startLine}-${result.endLine}`);
    console.log(`Score: ${(result.score * 100).toFixed(2)}%`);
    console.log(`Content: ${result.content.substring(0, 100)}...`);
});
```

```bash
# Clone repository
git clone https://github.com/danielbowne/claude-context.git
cd claude-context

# Install dependencies
pnpm install

# Build all packages
pnpm build

# Start development mode
pnpm dev
```

```bash
# Build all packages
pnpm build

# Build specific package
pnpm build:core
pnpm build:vscode
pnpm build:mcp
```

```bash
# Development with file watching
cd examples/basic-usage
pnpm dev
```

By default, Claude Context supports:
- Programming languages: `.ts`, `.tsx`, `.js`, `.jsx`, `.py`, `.java`, `.cpp`, `.c`, `.h`, `.hpp`, `.cs`, `.go`, `.rs`, `.php`, `.rb`, `.swift`, `.kt`, `.scala`, `.m`, `.mm`
- Documentation: `.md`, `.markdown`, `.ipynb`
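The defaults above amount to a simple extension allow-list. A minimal sketch of such a filter (the function name is illustrative, not Claude Context's API):

```typescript
// Sketch: decide whether a file should be indexed based on its extension.
// The extension list mirrors the documented defaults.
const INDEXED_EXTENSIONS = new Set([
  ".ts", ".tsx", ".js", ".jsx", ".py", ".java", ".cpp", ".c", ".h", ".hpp",
  ".cs", ".go", ".rs", ".php", ".rb", ".swift", ".kt", ".scala", ".m", ".mm",
  ".md", ".markdown", ".ipynb",
]);

function shouldIndex(filename: string): boolean {
  const dot = filename.lastIndexOf(".");
  if (dot < 0) return false; // no extension, e.g. "Makefile"
  return INDEXED_EXTENSIONS.has(filename.slice(dot).toLowerCase());
}

console.log(shouldIndex("src/auth.ts")); // true
```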
Common directories and files are automatically ignored:
- Build outputs: `node_modules/**`, `dist/**`, `build/**`, `out/**`, `target/**`, `coverage/**`, `.nyc_output/**`
- Version control: `.git/**`, `.svn/**`, `.hg/**`
- IDE/Editor files: `.vscode/**`, `.idea/**`, `*.swp`, `*.swo`
- Cache directories: `.cache/**`, `__pycache__/**`, `.pytest_cache/**`
- Logs and temporary: `logs/**`, `tmp/**`, `temp/**`, `*.log`
- Environment files: `.env`, `.env.*`, `*.local`
- Minified/bundled files: `*.min.js`, `*.min.css`, `*.bundle.js`, `*.bundle.css`, `*.chunk.js`, `*.map`
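Patterns like these are glob rules: `*` matches within one path segment, `**` matches across segments. A minimal sketch of such matching, assuming a simplified glob-to-regex translation (the real implementation may use a full glob library):

```typescript
// Simplified glob matching for ignore patterns such as "node_modules/**"
// or "*.min.js". Not a complete glob implementation.
function globToRegExp(pattern: string): RegExp {
  const escaped = pattern
    .replace(/[.+^${}()|[\]\\]/g, "\\$&") // escape regex metacharacters
    .replace(/\*\*/g, "\u0000")           // placeholder for "**"
    .replace(/\*/g, "[^/]*")              // "*" stays within one path segment
    .replace(/\u0000/g, ".*");            // "**" crosses segment boundaries
  return new RegExp(`^${escaped}$`);
}

function isIgnored(path: string, patterns: string[]): boolean {
  return patterns.some((p) => globToRegExp(p).test(path));
}

console.log(isIgnored("node_modules/react/index.js", ["node_modules/**", "*.min.js"])); // true
```

Note the asymmetry: `*.min.js` only ignores minified files at the top level here, while `node_modules/**` ignores the whole subtree.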
See FAQ Guide for detailed and customized configuration of supported file extensions and ignore patterns.
Check the /examples directory for complete usage examples:
- Basic Usage: Simple indexing and search example
Common Questions:
- What files does Claude Context decide to embed?
- Can I use a fully local deployment setup?
- Does it support multiple projects / codebases?
For detailed answers and more troubleshooting tips, see our FAQ Guide.
We welcome contributions! Please see our Contributing Guide for details on how to get started.
Package-specific contributing guides:
- AST-based code analysis for improved understanding
- Support for additional embedding providers
- Agent-based interactive search mode
- Enhanced code chunking strategies
- Search result ranking optimization
This project is licensed under the MIT License - see the LICENSE file for details.
Special thanks to Cheney Zhang and the Zilliz team for creating the original Claude Context project. This fork builds upon their excellent foundation to provide a local-first vector database experience with LanceDB.

