A modern, feature-rich desktop application for interacting with Ollama models. Built with Tauri, React, and TypeScript for a seamless cross-platform experience.
Ollama Chat App is a user-friendly interface for the official Ollama CLI that makes it easy to chat with large language models locally. Whether you're a developer, researcher, or AI enthusiast, this app provides an intuitive way to interact with Ollama without touching the command line.
- 🎨 Modern UI – Clean, intuitive interface built with React and Tailwind CSS
- 💬 Multiple Conversations – Manage and organize multiple chat sessions
- 🤖 Auto-detect Models – Automatically discover available Ollama models
- 🖥️ Flexible Host Configuration – Connect to Ollama running on any host
- ⏰ Auto-start Server – Automatically start the Ollama server when needed
- 💾 Persistent Storage – All conversations are saved locally using SQLite
- 📤 Import & Export – Easily backup and share your conversations
- 🌗 Light & Dark Theme – Choose your preferred visual style
- ⚡ Cross-platform – Available for macOS (Intel & Apple Silicon) and Windows
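
Under the hood, model auto-detection maps naturally onto Ollama's REST API, which exposes the installed models at `GET /api/tags` (port 11434 by default). A minimal TypeScript sketch of that flow — the `parseModelNames` and `listModels` helpers are illustrative names, not this app's actual code:

```typescript
// Shape of one entry in Ollama's GET /api/tags response (fields trimmed).
interface OllamaModel {
  name: string;
  size: number;
  modified_at: string;
}

// Parse the JSON body returned by GET <host>/api/tags and return just the
// model names. Kept as a pure function so it is easy to test in isolation.
function parseModelNames(body: string): string[] {
  const data = JSON.parse(body) as { models?: OllamaModel[] };
  return (data.models ?? []).map((m) => m.name);
}

// Hypothetical helper: fetch the model list from a configurable host,
// mirroring the app's flexible host configuration.
async function listModels(host = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${host}/api/tags`);
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  return parseModelNames(await res.text());
}
```
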
- Clone the repository:

```bash
git clone https://github.com/ollama-interface/Ollama-Gui.git
cd Ollama-Gui
```

- Install dependencies:

```bash
pnpm install
```

- Run in development mode:

```bash
pnpm tauri dev
```

Build for your platform:
```bash
# macOS (Apple Silicon)
pnpm build:app:silicon

# macOS (Intel)
pnpm build:app:intell

# macOS (Universal - both architectures)
pnpm build:app:universal

# Windows
pnpm build:app:windows
```

- Frontend: React 18, TypeScript, Tailwind CSS, Vite
- Desktop: Tauri 2
- Backend: Rust
- Database: SQLite with sqlx
- UI Components: Radix UI, shadcn/ui
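
For the import/export feature, a simple versioned JSON format is one natural choice for backup files. A hedged sketch of such a round-trip in TypeScript — the `Conversation` shape and version field below are illustrative assumptions, not the app's actual SQLite schema:

```typescript
// Illustrative conversation shape; the app's real schema may differ.
interface Message {
  role: "user" | "assistant";
  content: string;
}

interface Conversation {
  id: string;
  model: string;
  messages: Message[];
}

// Serialize conversations into a pretty-printed JSON backup string.
function exportConversations(convos: Conversation[]): string {
  return JSON.stringify({ version: 1, conversations: convos }, null, 2);
}

// Restore conversations from an exported string, with a basic version
// check so future formats can be migrated explicitly instead of silently.
function importConversations(json: string): Conversation[] {
  const data = JSON.parse(json) as {
    version: number;
    conversations: Conversation[];
  };
  if (data.version !== 1) {
    throw new Error(`Unsupported export version: ${data.version}`);
  }
  return data.conversations;
}
```

Versioning the payload up front keeps old backups importable even if the conversation shape changes later.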
- Linux support
- Improved settings interface
- Additional model parameters customization
- Conversation search and filtering
- Model management UI
Contributions are welcome! Feel free to open issues and pull requests.
This project is licensed under the MIT License – see the LICENSE file for details.
For questions or feedback, reach out to Twan Luttik on X/Twitter.
