Tired of Cursor cutting off context, missing your files, and spitting out empty responses?
Shotgun is the bridge between your local codebase and the world's most powerful LLMs. It doesn't just copy files; it intelligently packages your project context and can execute prompts directly against OpenAI (GPT-4o/GPT-5), Google Gemini, or OpenRouter.
Stop copy-pasting 50 files manually.
- Select your repo.
- Let AI pick the relevant files (Auto-Context).
- Blast the payload directly to the model or copy it for use in Cursor/Windsurf.
Shotgun is a desktop power-tool that explodes your project into a structured payload designed for AI reasoning.
It has evolved from a simple "context dumper" into a full-fledged LLM Client for Codebases:
- Smart Selection: Uses AI ("Auto-Context") to analyze your task and automatically select only the relevant files from your tree.
- Direct Execution: Configurable API integration with OpenAI, Gemini, and OpenRouter.
- Prompt Engineering: Built-in templates for different roles (Developer, Architect, Bug Hunter).
- History & Audit: Keeps a full log of every prompt sent and response received.
- Auto-Context: Don't know which files are needed for a bug fix? Type your task, and Shotgun uses an LLM to scan your tree and select the relevant files for you (a conceptual sketch follows this list).
- Repo Scan: Supplement context retrieval with a `shotgun_reposcan.md` summary of your architecture to give the LLM high-level awareness before diving into code.
- Fast Tree Scan: Go + Wails backend scans thousands of files in milliseconds.
- Interactive Tree: Manually toggle files/folders or use `.gitignore` and custom rule sets to filter noise.
- One-Click Blast: Generate a massive context payload instantly.
- OpenAI: Support for GPT-4o and experimental support for GPT-5 family models.
- Google Gemini: Native integration for Gemini 2.5/3 Pro & Flash.
- OpenRouter: Access hundreds of LLMs via a unified API.
- Prompt Templates: Switch modes easily (e.g., "Find Bug" vs "Refactor" vs "Write Docs").
- History Tracking: Never lose a generated patch. Browse past prompts, responses, and raw API payloads.
- Privacy Focused: Your code goes only to the API provider you choose. No intermediate servers.
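Conceptually, Auto-Context boils down to sending the task description plus the file tree to a model and asking it to return only the paths that matter. The sketch below is illustrative only, not Shotgun's actual code; `callLLM` is a placeholder for whichever provider you have configured:

```go
package autocontext

import (
	"fmt"
	"strings"
)

// selectRelevantFiles asks an LLM to pick the files worth including for a task.
// callLLM is a stand-in for the configured provider (OpenAI, Gemini, OpenRouter).
func selectRelevantFiles(task string, tree []string, callLLM func(prompt string) (string, error)) ([]string, error) {
	prompt := fmt.Sprintf(
		"Task: %s\n\nProject files:\n%s\n\nReturn only the paths needed for this task, one per line.",
		task, strings.Join(tree, "\n"))

	reply, err := callLLM(prompt)
	if err != nil {
		return nil, err
	}

	// Parse the newline-separated list of paths the model returned.
	var selected []string
	for _, line := range strings.Split(reply, "\n") {
		if p := strings.TrimSpace(line); p != "" {
			selected = append(selected, p)
		}
	}
	return selected, nil
}
```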
Shotgun guides you through a 3-step process:
- Step 1: Prepare Context
  - Select Project: Open your local repository.
  - Filter: Use the checkbox tree, `.gitignore`, or the Auto-Context button to define the scope.
  - Repo Scan: Edit or load the high-level repository summary for better AI grounding.
  - Result: A structured XML-like dump of your selected codebase.
- Step 2: Compose & Execute
  - Define Task: Describe what you need (e.g., "Refactor the auth middleware to use JWT").
  - Select Template: Choose a persona (Dev, Architect, QA).
  - Execute: Click "Execute Prompt" to send it to the configured LLM API immediately, OR copy the full payload to your clipboard for use in external tools like ChatGPT or Cursor.
- Step 3: Review
  - Review: View the AI's response alongside your original prompt.
  - Diffs: The AI output is optimized for `diff` generation (see the example after this list).
  - Audit: Inspect raw API calls for debugging or token usage analysis.
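As an illustration of what a diff-oriented response can look like (hypothetical file and function names, not real output), the model might return a standard unified diff that you can apply with `git apply`:

```diff
--- a/backend/auth/middleware.go
+++ b/backend/auth/middleware.go
@@ -12 +12 @@
-	session := readSessionCookie(r)
+	claims := parseJWT(r.Header.Get("Authorization"))
```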
- Go ≥ 1.20
- Node.js LTS
- Wails CLI: `go install github.com/wailsapp/wails/v2/cmd/wails@latest`
```bash
git clone https://github.com/glebkudr/shotgun_code
cd shotgun_code

# Install frontend dependencies
cd frontend
npm install
cd ..

# Run in Development Mode (Hot Reload)
wails dev

# Build Production Binary
wails build
```

Binaries will be located in `build/bin/`.
Click the Settings (gear icon) in the app to configure providers:
- Provider: Select OpenAI, Gemini, or OpenRouter.
- API Key: Paste your key (stored locally).
- Model: Select your preferred model (e.g., `gpt-4o`, `gemini-2.5-pro`, `claude-3.5-sonnet`).
You can define global excludes (like `node_modules`, `dist`, `.git`) and custom prompt instructions that are appended to every request.
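As a rough mental model (not Shotgun's actual rule engine), a global exclude acts as a path filter applied before the payload is assembled. The Go sketch below assumes simple directory-name matching:

```go
package excludes

import "strings"

// isExcluded reports whether any segment of a slash-separated path matches a
// globally excluded directory name such as "node_modules" or ".git".
func isExcluded(path string, excluded []string) bool {
	for _, segment := range strings.Split(path, "/") {
		for _, name := range excluded {
			if segment == name {
				return true
			}
		}
	}
	return false
}

// filterPaths drops every path that falls under an excluded directory.
func filterPaths(paths, excluded []string) []string {
	var kept []string
	for _, p := range paths {
		if !isExcluded(p, excluded) {
			kept = append(kept, p)
		}
	}
	return kept
}
```

Real `.gitignore` semantics (negation, globs, anchoring) are more involved; this only conveys the general idea.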
Shotgun generates context optimized for LLM parsing:
<file path="backend/main.go">
package main
...
</file>
<file path="frontend/src/App.vue">
<template>
...
</template>
</file>This format allows models to understand file boundaries perfectly, enabling accurate multi-file refactoring suggestions.
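For illustration, a payload in this shape could be produced with a few lines of Go. This is a minimal sketch under the assumption of plain file reads, not Shotgun's actual implementation, and the paths in `main` are just examples:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// buildPayload wraps each selected file in <file path="..."> tags so the
// model sees unambiguous file boundaries.
func buildPayload(paths []string) (string, error) {
	var b strings.Builder
	for _, p := range paths {
		data, err := os.ReadFile(p)
		if err != nil {
			return "", err
		}
		fmt.Fprintf(&b, "<file path=%q>\n%s\n</file>\n", p, data)
	}
	return b.String(), nil
}

func main() {
	// Example selection; in Shotgun this comes from the interactive tree.
	payload, err := buildPayload([]string{"backend/main.go", "frontend/src/App.vue"})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(payload)
}
```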
My name is Gleb Curly, and I am an indie developer making software for a living.
Shotgun is developed and maintained by Curly's Technology Tmi.
This project uses a Community License model:
You can use Shotgun for free (including modification and internal use) if:
- Your company/team generates less than $1M USD in annual revenue.
- You do not use the code to build a competing public product.
If your annual revenue exceeds $1M USD, you are required to purchase a commercial license (the price is pretty reasonable).
Please contact me at [email protected] for pricing.
See LICENSE.md for the full legal text.
Shotgun – Load, Aim, Blast your code into the future.
