Adhoc is a command-line tool designed to automatically document changes in your codebase. By integrating with local language models, it generates detailed explanations of code modifications and compiles them into professional documentation formats such as LaTeX, Markdown, or Word. This tool streamlines the documentation process, making it effortless for developers to maintain up-to-date records of their code evolution.
Watch the demo video to see Adhoc in action.
- Automatic Documentation: Generates explanations for code changes using Large Language Models (LLMs).
- Multiple LLM Providers: Supports both local Ollama models and OpenAI-compatible APIs (OpenAI, OpenRouter, LiteLLM, etc.).
- Multiple Output Formats: Supports documentation in LaTeX, Markdown, and Word formats.
- Version Control Integration: Detects code changes and commits through simple commands.
- Configurable Settings: Allows customization of LLM provider, API keys, output formats, and author information via a configuration file.
- Extensible Design: A modular structure makes it easy to extend functionality.
To install Adhoc, you can use pip:
```bash
pip install adhoc-python
```

Ensure you have the necessary dependencies listed in the requirements.txt file.
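A minimal install sketch, assuming a standard virtual-environment workflow; only the `pip install adhoc-python` line comes from this README, the rest is ordinary Python tooling:

```bash
# Create and activate an isolated environment (optional, standard Python practice)
python -m venv .venv
source .venv/bin/activate

# Install Adhoc from PyPI
pip install adhoc-python
```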
Adhoc provides a set of commands to initialize your project, commit changes, generate documentation, and configure settings. Below is an explanation of each command and how to use it.
Before using Adhoc, you need to initialize your project:
```bash
adhoc init --model "your ollama model"
```

What it does:

- Creates a `.Adhoc` directory in your project root to store configurations and databases.
- Initializes a SQLite database to track code changes.
- Configures the project to work with your chosen LLM and generates a codebase summary for context.
- By default, Adhoc uses Ollama for local LLM inference. You can configure it to use OpenAI-compatible APIs (see Configuration section below).
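As a rough sketch of what initialization leaves behind, assuming the layout described above (the `config.json` name comes from the Configuration section below; the database file name is illustrative):

```bash
# Initialize against a local Ollama model
adhoc init --model "llama3.1"

# Inspect the generated project metadata
ls .Adhoc
# config.json   <- provider, model, output format, and author settings
# adhoc.db      <- SQLite database of snapshots and explanations (file name assumed)
```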
Adhoc supports two types of LLM providers:
- Ollama (default): Local LLM inference

  ```bash
  adhoc init --model "llama3.1"
  ```

- OpenAI-compatible APIs: Including OpenAI, OpenRouter, LiteLLM, and other model routers

  ```bash
  # First initialize with any model name
  adhoc init --model "gpt-4"

  # Then configure for OpenAI or other compatible services
  adhoc config -p openai -k "your-api-key"

  # For OpenRouter or custom endpoints
  adhoc config -p openai -k "your-api-key" -e "https://openrouter.ai/api/v1/chat/completions" -m "anthropic/claude-3-opus"
  ```
After making changes to your codebase, use the following command to commit those changes and generate explanations:
```bash
adhoc commit -m "Your commit message"
```

Options:

- `-m, --message`: (Optional) A commit message describing the changes.
What it does:
- Detects changes since the last commit by comparing snapshots.
- Generates explanations for the changes using the LLM, incorporating your commit message if provided.
- Stores the changes and explanations in the database for future reference.
To create documentation of your codebase and its changes, run:
```bash
adhoc generate
```

What it does:
- Retrieves the codebase summary and change explanations from the database.
- Generates a documentation file in the format specified in your configuration (latex, markdown, or word).
- The output file (`documentation.tex`, `documentation.md`, or `documentation.docx`) is created in your project directory.
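Putting the commands together, a typical session might look like the sketch below; it uses only the commands documented in this README, and the output file name depends on the configured format:

```bash
# One-time setup in the project root
adhoc init --model "llama3.1"

# ...edit your code, then record the changes with an explanation
adhoc commit -m "Refactor the parser into its own module"

# Compile the recorded changes into a document
adhoc generate
# Produces documentation.md, documentation.tex, or documentation.docx,
# depending on the configured format
```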
Customize Adhoc settings using the `adhoc config` command:

```bash
adhoc config -d md -u "Your Name"
```

Options:

- `-d, --document-format`: Sets the output document format. Accepts `md` for Markdown, `tex` for LaTeX, or `word` for Word documents.
- `-u, --username`: Sets the author name to be used in the documentation.
- `-p, --provider`: Sets the LLM provider. Accepts `ollama` (default) or `openai` for OpenAI-compatible APIs.
- `-k, --api-key`: Sets the API key for OpenAI-compatible services.
- `-e, --api-endpoint`: Sets a custom API endpoint for model routers (e.g., OpenRouter, LiteLLM).
- `-m, --model`: Sets the model name to use.
What it does:
- Updates the configuration file (`config.json` in the `.Adhoc` directory) with your preferences.
- The changes affect how documentation is generated and personalized.
Examples:
```bash
# Configure for local Ollama usage
adhoc config -d word -u "Shreyas"

# Configure for OpenAI
adhoc config -p openai -k "your-openai-api-key" -m "gpt-4"

# Configure for OpenRouter (or other OpenAI-compatible model routers)
adhoc config -p openai -k "your-openrouter-api-key" -e "https://openrouter.ai/api/v1/chat/completions" -m "anthropic/claude-3-opus"

# Configure for LiteLLM or other custom endpoints
adhoc config -p openai -k "your-api-key" -e "https://your-service.com/v1/chat/completions" -m "your-model"
```

Requirements:

- Python 3.6 or higher
Dependencies:
- jinja2
- requests
- watchdog
- python-docx (for Word document generation)
Additional dependencies are listed in requirements.txt.
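If you want to pull the documented dependencies in manually (for example, when working on Adhoc itself), the packages listed above can be installed directly; treat requirements.txt as the authoritative, pinned list:

```bash
# Install the runtime dependencies named above
pip install jinja2 requests watchdog python-docx

# Or install everything pinned by the project
pip install -r requirements.txt
```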
Contributions are welcome! Please open an issue or submit a pull request on GitHub.
This project is licensed under the MIT License. See the LICENSE file for details.
"Life is an ad hoc affair. It has to be improvised all the time because of the hard fact that everything we do changes what is. This is distressing to people who would like to see things beautifully planned out and settled once and for all. That cannot be."
― Jane Jacobs
Happy coding!
