10 changes: 10 additions & 0 deletions .github/dependabot.yml
@@ -120,6 +120,16 @@ updates:
schedule:
interval: "daily"

- directory: "/application/open-webui"
package-ecosystem: "docker-compose"
schedule:
interval: "daily"

- directory: "/application/open-webui/init"
package-ecosystem: "docker"
schedule:
interval: "daily"

- directory: "/application/ingestr"
package-ecosystem: "pip"
schedule:
56 changes: 56 additions & 0 deletions .github/workflows/application-open-webui.yml
@@ -0,0 +1,56 @@
name: "Open WebUI"

on:
pull_request:
paths:
- '.github/workflows/application-open-webui.yml'
- 'application/open-webui/**'
push:
branches: [ main ]
paths:
- '.github/workflows/application-open-webui.yml'
- 'application/open-webui/**'

# Allow job to be triggered manually.
workflow_dispatch:

# Run job each night after CrateDB nightly has been published.
#schedule:
# - cron: '0 3 * * *'

# Cancel in-progress jobs when pushing to the same branch.
concurrency:
cancel-in-progress: true
group: ${{ github.workflow }}-${{ github.ref }}

jobs:

test:
runs-on: ${{ matrix.os }}

strategy:
fail-fast: true
matrix:
os: [ "ubuntu-latest" ]

name: OS ${{ matrix.os }}

env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

steps:

- name: Acquire sources
uses: actions/checkout@v4

- name: Validate application/open-webui
run: |
# TODO: Generalize invocation into `ngr` test runner.

# Invoke software stack.
cd application/open-webui
docker compose up --detach

# Invoke validation payload.
# TODO: Currently does not work on GHA.
# docker compose run --rm test
9 changes: 9 additions & 0 deletions application/open-webui/.env
@@ -0,0 +1,9 @@
# ------------------------------------------
# User configuration
# ------------------------------------------

# Define your OpenAI API key here if you want to make it persistent.
# You can also use other models with Open WebUI; however, that is
# currently out of scope for this miniature rig.

# OPENAI_API_KEY=your_openai_api_key
131 changes: 131 additions & 0 deletions application/open-webui/README.md
@@ -0,0 +1,131 @@
# Use CrateDB with Open WebUI

## About

A complete end-to-end rig comprising CrateDB, CrateDB MCPO, and Open WebUI,
plus a touch of integration tests on CI/GHA.

This stack is intended solely for demonstration purposes and does **not**
implement any security hardening. Do **not** deploy it to production.

## Introduction

[Open WebUI] is an extensible, feature-rich, and user-friendly self-hosted AI
platform designed to operate entirely offline. It supports various LLM runners
like Ollama and OpenAI-compatible APIs, and ships with a built-in inference
engine for RAG, making it a powerful AI deployment solution.

CrateDB MCPO is an adapter wrapper around the [CrateDB MCP] server. Because
Open WebUI uses [OpenAPI Tool Servers] to integrate external tooling and data
sources into LLM agents and workflows, standard MCP servers need to be adapted
to how [Open WebUI MCP Support] works.
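
Once the stack is running, you can inspect the tool operations that CrateDB MCPO
publishes to Open WebUI by querying its OpenAPI schema. A minimal sketch, assuming
the default port mapping from `compose.yml` and that `jq` is installed:
```shell
# List the operation paths published by CrateDB MCPO.
curl --silent http://localhost:5200/openapi.json | jq '.paths | keys'
```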

## Usage

### Sources

We recommend cloning the Git repository and running the demo stack from there.
That way, it will be easy for you to receive updates.
```shell
git clone https://github.com/crate/cratedb-examples
cd cratedb-examples/application/open-webui
```

### Start services

Configure the API key for OpenAI within the `.env` file next to `compose.yml`
to make it persistent for unattended service operations.
```dotenv
# .env
OPENAI_API_KEY=your_openai_api_key_here
```
Or export it for a one-off run:
```shell
export OPENAI_API_KEY=your_openai_api_key_here
```

Spin up the software stack. On the first run, it will take a while to download
the OCI images and for Open WebUI to bootstrap itself.
```shell
docker compose up
```
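
If you prefer to start the stack in the background, a reasonably recent Docker
Compose can also wait until all services report healthy before returning.
A minimal sketch; alternatively, the `start-dependencies` bundler service in
`compose.yml` serves the same purpose:
```shell
docker compose up --detach --wait
```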

### User interface

You can access the services' resources at these URLs.

- CrateDB: http://localhost:4200/
- Open WebUI: http://localhost:6200/

Explore the APIs at these endpoints; a quick readiness check is sketched after
the list.

- CrateDB MCPO:
- Swagger: http://localhost:5200/docs
- OpenAPI: http://localhost:5200/openapi.json
- Open WebUI:
- Swagger: http://localhost:6200/docs
- OpenAPI: http://localhost:6200/openapi.json
- Jupyter:
- http://localhost:7200/
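
To verify from the command line that the services are up, you can probe the same
endpoints the health checks use. A minimal sketch, assuming the default port
mappings from `compose.yml`:
```shell
# Probe CrateDB, CrateDB MCPO, Open WebUI, and Jupyter.
curl --fail --silent --output /dev/null http://localhost:4200 && echo "CrateDB is up"
curl --fail --silent --output /dev/null http://localhost:5200/docs && echo "CrateDB MCPO is up"
curl --fail --silent --output /dev/null http://localhost:6200 && echo "Open WebUI is up"
curl --fail --silent --output /dev/null http://localhost:7200 && echo "Jupyter is up"
```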

### Configure

To make the ensemble work well, you need to configure a few bits in the Open WebUI
user interface.

- Make sure to enable the "CrateDB" tool. The toggle switch is located within the
flyout menu on the left side of the query prompt, which can be opened using the
`More (+)` button.

- In the "Chat Controls" flyout widget, located in the top right corner of the page,
- make sure to enable `Function Calling: Native`, see [OPEN-WEBUI-15939],
- and dial down to `Temperature: 0.0`.

### Example questions

Enjoy conversations with CrateDB (talk to your data) and its documentation
(talk to your knowledge base).

- Text-to-SQL: _What is the average value for sensor 1?_ (see the SQL sketch below)
- Knowledge base: _How do I use CrateDB with SQLAlchemy?_
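
Behind the scenes, a text-to-SQL question like the first one typically resolves
into a plain SQL statement executed against CrateDB. A minimal sketch of issuing
such a query directly through CrateDB's HTTP endpoint; the table and column names
are hypothetical, the actual demo schema is created by the `setup` job:
```shell
# Hypothetical table/column names; adjust to the demo data created by `setup`.
curl --silent -H "Content-Type: application/json" -X POST http://localhost:4200/_sql \
  -d '{"stmt": "SELECT AVG(value) AS average_value FROM sensor_readings WHERE sensor_id = 1"}'
```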

### Stop services
Tear down services.
```shell
docker compose down
```
Delete all volumes.
```shell
docker compose down --volumes
```
Delete individual volumes.
```shell
docker volume rm open-webui_open-webui
```
```shell
docker volume rm open-webui_cratedb
```

### Jobs
Invoke individual jobs defined in the Compose file.
```shell
export BUILDKIT_PROGRESS=plain
docker compose run --rm setup
docker compose run --rm test
```

## What's inside

- `.env`: The dotenv file defines `OPENAI_API_KEY` for `compose.yml`.
- `compose.yml`: The service composition file defines four main services:
  CrateDB, CrateDB MCPO, Open WebUI, and Jupyter. Helper jobs (setup, test, ...)
  are omitted here for brevity. Use it with Docker or Podman; see the override
  sketch below for local adjustments.
- `init/`: Runtime configuration snippets.
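
To adjust the stack to your environment without editing `compose.yml`, you can use
Docker Compose's standard override mechanism. A minimal sketch that raises CrateDB's
heap size; the value is just an example:
```yaml
# compose.override.yml -- picked up automatically by `docker compose up`.
services:
  cratedb:
    environment:
      CRATE_HEAP_SIZE: 4g
```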


[CrateDB MCP]: https://cratedb.com/docs/guide/integrate/mcp/cratedb-mcp.html
[OpenAPI Tool Servers]: https://docs.openwebui.com/openapi-servers/
[Open WebUI]: https://docs.openwebui.com/
[Open WebUI MCP Support]: https://docs.openwebui.com/openapi-servers/mcp/
[OPEN-WEBUI-15939]: https://github.com/open-webui/open-webui/issues/15939#issuecomment-3108279768
179 changes: 179 additions & 0 deletions application/open-webui/compose.yml
@@ -0,0 +1,179 @@
# Use CrateDB with Open WebUI
#
# https://cratedb.com/docs/
# https://docs.openwebui.com/getting-started/quick-start
---
networks:
llm-demo:
name: llm-demo
driver: bridge

volumes:
cratedb:
open-webui:
jupyter:

services:

# -------
# CrateDB
# -------
cratedb:
image: docker.io/crate/crate:6.0.0
environment:
CRATE_HEAP_SIZE: 2g
ports:
- "4200:4200"
- "5432:5432"
command: [
"crate",
"-Cdiscovery.type=single-node",
"-Ccluster.routing.allocation.disk.threshold_enabled=false",
]
networks:
- llm-demo
volumes:
- cratedb:/data
healthcheck:
test: [ "CMD", "curl", "--fail", "http://localhost:4200" ]
start_period: 3s
interval: 10s

# ------------
# CrateDB MCPO
# ------------
cratedb-mcpo:
image: ghcr.io/crate/cratedb-mcpo:0.0.7
environment:
CRATEDB_CLUSTER_URL: http://crate:crate@cratedb:4200/
ports:
- "5200:8000"
networks:
- llm-demo
healthcheck:
test: [ "CMD", "curl", "--fail", "http://localhost:8000/docs" ]
start_period: 3s
interval: 10s
depends_on:
cratedb:
condition: service_healthy

# ----------
# Open WebUI
# ----------
open-webui:
image: ghcr.io/open-webui/open-webui:0.6.18
# https://docs.openwebui.com/getting-started/env-configuration
# https://docs.openwebui.com/getting-started/api-endpoints/#swagger-documentation-links
environment:
# From caller's environment or `.env` file.
OPENAI_API_KEY: ${OPENAI_API_KEY}
# Currently defined here.
ENABLE_SIGNUP: "false"
ENABLE_LOGIN_FORM: "false"
WEBUI_AUTH: "false"
DEFAULT_MODELS: "gpt-4.1"
DEFAULT_USER_ROLE: "admin"
ENABLE_CHANNELS: "true"
RESPONSE_WATERMARK: "This text is AI generated"
WEBUI_NAME: "CrateDB LLM Cockpit"
BYPASS_MODEL_ACCESS_CONTROL: "true"
ENABLE_OLLAMA_API: "false"
ENABLE_OPENAI_API: "true"
ENABLE_DIRECT_CONNECTIONS: "true"
ENV: "dev"
# Jupyter code execution and interpreter.
ENABLE_CODE_INTERPRETER: "true"
CODE_EXECUTION_ENGINE: "jupyter"
CODE_EXECUTION_JUPYTER_URL: "http://jupyter:8888"
CODE_EXECUTION_JUPYTER_AUTH: "token"
CODE_EXECUTION_JUPYTER_AUTH_TOKEN: "123456"
CODE_EXECUTION_JUPYTER_TIMEOUT: 60
CODE_INTERPRETER_ENGINE: "jupyter"
CODE_INTERPRETER_JUPYTER_URL: "http://jupyter:8888"
CODE_INTERPRETER_JUPYTER_AUTH: "token"
CODE_INTERPRETER_JUPYTER_AUTH_TOKEN: "123456"
CODE_INTERPRETER_JUPYTER_TIMEOUT: 60
ports:
- "6200:8080"
networks:
- llm-demo
volumes:
- open-webui:/app/backend/data
healthcheck:
test: [ "CMD", "curl", "--fail", "http://localhost:8080" ]
start_period: 3s
interval: 10s
retries: 60
timeout: 90s
depends_on:
cratedb-mcpo:
condition: service_healthy
jupyter:
condition: service_healthy

# -------
# Jupyter
# -------
jupyter:
image: quay.io/jupyter/minimal-notebook:notebook-7.4.4
# https://docs.openwebui.com/tutorials/jupyter/
environment:
JUPYTER_ENABLE_LAB: "yes"
JUPYTER_TOKEN: "123456"
ports:
- "7200:8888"
networks:
- llm-demo
volumes:
- jupyter:/home/jovyan/work
healthcheck:
test: [ "CMD", "curl", "--fail", "http://localhost:8888" ]
start_period: 3s
interval: 10s
retries: 60
timeout: 90s

# -----
# Setup
# -----
setup:
build:
context: init
command: bash /app/setup.sh
networks:
- llm-demo
depends_on:
cratedb:
condition: service_healthy
cratedb-mcpo:
condition: service_healthy
open-webui:
condition: service_healthy

# ----
# Test
# ----
test:
build:
context: init
command: bash /app/test.sh
networks:
- llm-demo
depends_on:
setup:
condition: service_completed_successfully
deploy:
replicas: 0

# -------
# Bundler
# -------
# Wait for all defined services to be fully available by probing their health
# status, even when using `docker compose up --detach`.
# https://marcopeg.com/2019/docker-compose-healthcheck/
start-dependencies:
image: docker.io/dadarek/wait-for-dependencies
depends_on:
setup:
condition: service_completed_successfully