Advanced AI framework for unrestricted coding assistance with extended context capabilities
GitHub · Telegram · Instagram · X (Twitter) · YouTube
Here is a glimpse of the HacxGPT framework in action:
- About The Project
- Features
- Supported Providers & Models
- Getting Started
- Configuration
- Usage
- Roadmap
- Star History
- Contributing
- License
HacxGPT is designed to provide powerful, unrestricted, and seamless AI-driven conversations, pushing the boundaries of what is possible with natural language processing and code generation.
This repository is an advanced open-source interface framework designed to unlock uncensored AI capabilities through extensive jailbreaking. It serves as a professional bridge for interfacing with both standard language models and our custom-trained production model, providing a specialized framework for unrestricted data generation and technical exploration.
Important
HacxGPT utilizes an advanced, extensive system prompt architecture engineered to provide superior, uncensored results. This specialized jailbreak framework ensures that the AI operates without standard behavioral constraints, delivering precise and unfiltered responses tailored for technical and security research.
It is important to understand: this code is an interface wrapper and proof of concept for unrestricted AI interaction. It works by combining third-party APIs (OpenRouter, Groq) with specialized system prompts.
What you get in this repository:
- ✅ Open-source CLI tool for AI interaction
- ✅ Access to multiple free AI models through third-party providers
- ✅ Advanced jailbreak prompts for reduced censorship
- ✅ Multi-provider support with easy switching
- ✅ Cross-platform compatibility
- ❌ This is NOT the core HacxGPT production model
- ❌ This uses public APIs, not our custom infrastructure
While this repository offers a glimpse into HacxGPT's potential through prompt engineering, our flagship offering is a custom-trained, production-grade model running on dedicated infrastructure.
Why Choose the Production HacxGPT Model?
| Feature | This GitHub Demo | Production HacxGPT |
|---|---|---|
| Technology | Third-party APIs + system prompts | Custom-trained model |
| Context Window | Varies (4k-128k depending on provider) | Extended context optimized for code |
| Censorship Approach | Jailbreak prompts on public models | Built uncensored from the ground up |
| Performance | Good | Excellent |
| Reliability | Depends on third-party APIs | Dedicated infrastructure |
| Cost | Free (bring your own API key) | Subscription-based |
| Support | Community | Priority support |
| Best For | Testing and experimentation | Production coding workflows |
Key Advantages:
✨ Custom Training - Our model is specifically fine-tuned for coding tasks, not general conversation. This means better code understanding, refactoring capabilities, and technical accuracy.
🚀 Extended Context - Handle larger codebases and longer conversations without context limitations that plague standard models.
🔓 Truly Uncensored - No jailbreak prompts needed. The model is fundamentally designed without arbitrary restrictions, making it ideal for security research and technical exploration.
⚡ Optimized Performance - Running on dedicated GPU infrastructure ensures fast, consistent response times.
🎯 Code-First Design - Built specifically for developers who need powerful coding assistance without the limitations of consumer-focused AI tools.
Access to our production model is available through our API service. To get started:
➡️ Try HacxGPT: Visit hacxgpt.com to experience the production model
➡️ Get API Access: Join our Telegram for API keys and subscription information
➡️ Community Support: Connect with other users in our Telegram Channel
This Open-Source Framework Provides:
- Powerful AI Conversations: Get intelligent and context-aware answers to your queries
- Extensive Model Support: Access to HacxGPT production models, specialized Groq models, and a vast library of open-source FREE models via OpenRouter
- Unrestricted Framework: System prompts engineered to bypass conventional AI limitations
- Easy-to-Use CLI: Clean and simple command-line interface for smooth interaction
- Cross-Platform: Tested and working on Kali Linux, Ubuntu, Windows, and Termux
- Multi-Provider Support: Seamlessly switch between different AI providers
- Configuration Management: Built-in commands for managing API keys and model selection
HacxGPT provides a versatile interface for a wide range of industry-leading models, including extensive support for FREE open-source models.
| Provider | Key Models Supported | Best For |
|---|---|---|
| HacxGPT | `hacxgpt-lightning` | Production coding, extended context |
| Groq | `kimi-k2-instruct-0905`, `qwen3-32b` | Specialized free models |
| OpenRouter | `mimo-v2-flash`, `devstral-2512`, `glm-4.5-air`, `kimi-k2`, `deepseek-r1t-chimera` | Vast library of free open-source models |
Tip
We actively support and encourage the use of open-source FREE models to ensure accessible, high-performance AI for everyone. Many powerful models are available at no cost through OpenRouter and Groq.
Follow these steps to get the HacxGPT framework running on your system.
To use this framework, you must obtain an API key from a supported provider. These services offer free tiers that are perfect for getting started.
Option 1: Try Production HacxGPT (Recommended)
- Visit hacxgpt.com to try our production model
- Join Telegram for API access and subscription details
Option 2: Use Free Third-Party Providers
- Choose a provider:
  - OpenRouter: Visit openrouter.ai/keys to get a free API key. They provide access to a variety of models.
  - Groq: Visit console.groq.com/keys for a free API key to use their powerful models.
- Copy your API key. You will need to paste it into the script when prompted during the first run.
We provide simple, one-command installation scripts for your convenience.
On Windows:
- Download the `install.bat` script from this repository
- Double-click the file to run it. It will automatically clone the repository and install all dependencies
On Linux or Termux:
- Open your terminal
- Run the following command. It will download the installer, make it executable, and run it for you:
  ```bash
  bash <(curl -s https://raw.githubusercontent.com/BlackTechX011/Hacx-GPT/main/scripts/install.sh)
  ```
Manual Installation (Alternative)
If you prefer to install manually, follow these steps:
- Clone the repository:
  ```bash
  git clone https://github.com/BlackTechX011/Hacx-GPT.git
  ```
- Navigate to the directory:
  ```bash
  cd Hacx-GPT
  ```
- Install Python dependencies:
  ```bash
  pip install -e .
  ```
HacxGPT uses a centralized `providers.json` file for managing API endpoints and models. You can easily switch between providers and models using the built-in commands or through the setup menu.
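The exact schema is defined by the `providers.json` shipped with the tool. Purely as a hedged illustration of the idea, the sketch below assumes a simplified layout (the field names `base_url` and `models` are assumptions for this example, not the real schema) and prints which models each configured provider exposes:

```python
import json

# Assumed layout for this sketch only (NOT the authoritative schema;
# see the providers.json in this repository for the real format):
# {
#   "openrouter": {"base_url": "https://openrouter.ai/api/v1",
#                  "models": ["deepseek-r1t-chimera", "glm-4.5-air"]},
#   "groq":       {"base_url": "https://api.groq.com/openai/v1",
#                  "models": ["qwen3-32b"]}
# }

with open("providers.json", encoding="utf-8") as fh:
    providers = json.load(fh)

# List every configured provider and the models it exposes.
for name, cfg in providers.items():
    print(f"{name}: {', '.join(cfg['models'])}")
```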
- Launch the tool:
  ```bash
  hacxgpt
  # OR
  python -m hacxgpt.main
  ```
- Select Option [2] to Configure Security Keys
- Choose your provider and select your preferred model from the interactive list
While in chat, use these commands to dynamically manage your configuration:
- `/setup` - Re-configure API keys and default models
- `/provider <name>` - Switch between configured providers (e.g., `/provider openrouter`)
- `/model <name>` - Switch the active model (e.g., `/model llama-3.3-70b`)
- `/models` - List all available models for your current provider
- `/status` - Show current uplink configuration (provider, model, API status)
- `/help` - Display all available commands
- `/clear` - Clear the conversation history
- `/exit` or `/quit` - Exit the application
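For readers curious how a slash-command loop like this can be wired up, here is a minimal, self-contained Python sketch. It is illustrative only and is not the actual HacxGPT implementation; the handled commands mirror the list above, and anything that is not a command would be forwarded to the selected provider.

```python
# Minimal sketch of a slash-command chat loop (illustrative only;
# not the actual HacxGPT implementation).

def handle_command(line: str, state: dict) -> bool:
    """Return True if the line was a slash command and was handled."""
    if not line.startswith("/"):
        return False
    cmd, _, arg = line.partition(" ")
    if cmd == "/provider" and arg:
        state["provider"] = arg              # e.g. /provider openrouter
    elif cmd == "/model" and arg:
        state["model"] = arg                 # e.g. /model llama-3.3-70b
    elif cmd == "/status":
        print(f"provider={state['provider']} model={state['model']}")
    elif cmd in ("/exit", "/quit"):
        raise SystemExit(0)
    else:
        print(f"Unhandled command in this sketch: {cmd}")
    return True


if __name__ == "__main__":
    # Defaults chosen arbitrarily for the sketch.
    state = {"provider": "openrouter", "model": "deepseek-r1t-chimera"}
    while True:
        line = input("You > ").strip()
        if not handle_command(line, state):
            # In the real tool, this is where the prompt would be sent
            # to the selected provider's API.
            print(f"[{state['provider']}/{state['model']}] received: {line!r}")
```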
Run the application directly:
```bash
hacxgpt
# OR
python -m hacxgpt.main
```

The first time you run it, you will be prompted to enter your API key. It will be saved locally for future sessions.
- Use `/models` to see what's available for your current provider
- Switch models with `/model` to find the best fit for your task
- Use `/status` to verify your current configuration
- For production use, try HacxGPT at hacxgpt.com for enhanced performance
We are constantly evolving the HacxGPT framework. Here are some of the technical milestones we are currently targeting:
- Advanced Reasoning Support: Deep-think/reasoning capabilities for complex problem-solving
- Agentic Capabilities: Autonomous tool use and multi-step execution chains
- Web Search Integration: Real-time data retrieval for up-to-date context
- Advanced File Analysis: Native support for processing large datasets and documents
- IDE Integrations: Plugins for VS Code, IntelliJ, and other popular editors
- Conversation Management: Save, load, and resume conversations
- Multi-Modal Support: Image and document analysis capabilities
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
- 🐛 Bug fixes and testing
- 📝 Documentation improvements
- 🎨 UI/UX enhancements for the CLI
- 🔌 Adding support for new AI providers
- 🌍 Translations and internationalization
- 💡 New feature implementations
Distributed under the Personal-Use Only License (PUOL) 1.0. See LICENSE for more information.
- 🌐 Production Model: hacxgpt.com - Try the real HacxGPT
- 💬 Telegram Announcements: t.me/HacxGPT - API access and updates
- 🐙 GitHub Repository: github.com/BlackTechX011/Hacx-GPT
- 🐦 Twitter/X: @BlackTechX011
- 📺 YouTube: @BlackTechX_
Need help? Have questions?
- 📖 Check the documentation in this README
- 💬 Join our Telegram Community
- 🐛 Report bugs via GitHub Issues
- ⭐ For production support, visit hacxgpt.com
This tool is designed for educational and research purposes. Users are responsible for ensuring their use complies with applicable laws and the terms of service of any third-party APIs they access. The developers of HacxGPT are not responsible for misuse of this software.
Built with ❤️ by BlackTechX
⭐ Star this repo if you find it useful!
Want unrestricted AI for production? Try hacxgpt.com

