@@ -0,0 +1,9 @@
# Copyright (C) 2025 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

# this file should be run in the root of the repo
services:
prompt-template:
build:
dockerfile: comps/prompt_template/src/Dockerfile
image: ${REGISTRY:-opea}/prompt-template:${TAG:-latest}
1 change: 1 addition & 0 deletions comps/cores/mega/constants.py
@@ -39,6 +39,7 @@ class ServiceType(Enum):
TEXT2KG = 22
STRUCT2GRAPH = 23
LANGUAGE_DETECTION = 24
PROMPT_TEMPLATE = 25


class MegaServiceEndpoint(Enum):
19 changes: 19 additions & 0 deletions comps/prompt_template/deployment/docker_compose/compose.yaml
@@ -0,0 +1,19 @@
# Copyright (C) 2025 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

services:
prompt-template:
image: ${REGISTRY:-opea}/prompt-template:${TAG:-latest}
container_name: prompt-template
ports:
- "7900:7900"
environment:
- no_proxy=${no_proxy}
- https_proxy=${https_proxy}
- http_proxy=${http_proxy}
ipc: host
restart: always

networks:
default:
driver: bridge
34 changes: 34 additions & 0 deletions comps/prompt_template/src/Dockerfile
@@ -0,0 +1,34 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

FROM python:3.11-slim

# Set environment variables
ENV LANG=en_US.UTF-8

RUN apt-get update -y && \
apt-get install build-essential -y && \
apt-get install -y --no-install-recommends --fix-missing \
vim && \
apt-get clean && rm -rf /var/lib/apt/lists/*

COPY comps /home/comps

RUN useradd -m -s /bin/bash user && \
mkdir -p /home/user && \
chown -R user /home/user/

ARG uvpip='uv pip install --system --no-cache-dir'
RUN pip install --no-cache-dir --upgrade pip setuptools uv && \
$uvpip -r /home/comps/prompt_template/src/requirements.txt

ENV PYTHONPATH=$PYTHONPATH:/home

USER user

WORKDIR /home/comps/prompt_template/src

ENTRYPOINT ["python", "opea_prompt_template_microservice.py"]


199 changes: 199 additions & 0 deletions comps/prompt_template/src/README.md
@@ -0,0 +1,199 @@
# Prompt Template Microservice

The Prompt Template microservice dynamically generates system and user prompts based on structured inputs and document context. It supports usage in LLM pipelines to customize prompt formatting with reranked documents, conversation history, and user queries.
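Conceptually, the service fills placeholder fields in a template with the user query, reranked documents, and conversation history. The sketch below illustrates the idea with plain `str.format`; the function and placeholder names mirror the request fields but are not the microservice's internal API:

```python
# Illustrative sketch of prompt assembly; not the microservice's actual code.
def build_prompt(template, query, docs, history):
    """Fill the template's placeholders with query, documents, and chat history."""
    docs_block = "\n".join(docs)
    history_block = "\n".join(f"User: {q}\nAssistant: {a}" for q, a in history)
    return template.format(
        reranked_docs=docs_block,
        conversation_history=history_block,
        initial_query=query,
    )

prompt = build_prompt(
    "### Search results: {reranked_docs}\n"
    "### Conversation history: {conversation_history}\n"
    "### Question: {initial_query}\n### Answer:",
    "What is Deep Learning?",
    ["Deep Learning is a subfield of machine learning..."],
    [("Hello", "Hello as well")],
)
```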

## Getting started

### 🚀1. Start Prompt Template Microservice with Python (Option 1)

To start the Prompt Template microservice directly with Python, first install the required packages.

#### 1.1. Install Requirements

```bash
pip install -r requirements.txt
```

#### 1.2. Start Microservice

```bash
python opea_prompt_template_microservice.py
```

### 🚀2. Start Prompt Template Microservice with Docker (Option 2)

#### 2.1. Build the Docker Image:

Use the following `docker build` command, run from the repository root, to create the image:

```bash
cd ../../../
docker build -t opea/prompt-template:latest -f comps/prompt_template/src/Dockerfile .
```

Note that the build may take a while to complete.

#### 2.2. Run the Docker Container:

```bash
```bash
docker run -d --name="prompt-template-microservice" \
-p 7900:7900 \
--ipc=host \
opea/prompt-template:latest
```

Note: do not combine `-p` with `--net=host`; host networking ignores port mappings.

### 3. Verify the Prompt Template Microservice

#### 3.1. Check Status

```bash
curl http://localhost:7900/v1/health_check \
-X GET \
-H 'Content-Type: application/json'
```

#### 3.2. Sending a Request

##### 3.2.1 Default Template Generation

Generates the prompt using the default template:

**Example Input**

```bash
curl -X POST -H "Content-Type: application/json" -d @- http://localhost:7900/v1/prompt_template <<JSON_DATA
{
"data": {
"user_prompt": "What is Deep Learning?",
"reranked_docs": [{ "text": "Deep Learning is a subfield of machine learning..." }]
},
"conversation_history": [
{ "question": "Hello", "answer": "Hello as well" },
{ "question": "How are you?", "answer": "I am good, thank you!" }
],
"system_prompt_template": "",
"user_prompt_template": ""
}
JSON_DATA
```

**Example Output**

The response contains a `chat_template` that begins with the default assistant system prompt:

```json
{
"id": "4e799abdf5f09433adc276b511a8b0ae",
"model": null,
"query": "What is Deep Learning?",
"max_tokens": 1024,
"max_new_tokens": 1024,
"top_k": 10,
"top_p": 0.95,
"typical_p": 0.95,
"temperature": 0.01,
"frequency_penalty": 0.0,
"presence_penalty": 0.0,
"repetition_penalty": 1.03,
"stream": true,
"language": "auto",
"input_guardrail_params": null,
"output_guardrail_params": null,
  "chat_template": "### You are a helpful, respectful, and honest assistant to help the user with questions. Please refer to the search results obtained from the local knowledge base. Refer also to the conversation history if you think it is relevant to the current question. Ignore all information that you think is not relevant to the question. If you don't know the answer to a question, please don't share false information. \n ### Search results: [File: Unknown Source]\nDeep Learning is a subfield of machine learning...\n### Conversation history: User: Hello\nAssistant: Hello as well\nUser: How are you?\nAssistant: I am good, thank you!\n### Question: What is Deep Learning? \n\n### Answer:",
"documents": []
}
```
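Downstream components typically consume the rendered `chat_template` together with the sampling parameters. A quick sketch of pulling those fields out of the JSON response (the response text is abbreviated here; see the full example above):

```python
import json

# Abbreviated sample of the service's JSON response.
response_text = (
    '{"query": "What is Deep Learning?", "max_new_tokens": 1024, '
    '"temperature": 0.01, '
    '"chat_template": "### Question: What is Deep Learning? \\n\\n### Answer:", '
    '"documents": []}'
)

payload = json.loads(response_text)
llm_inputs = {
    "prompt": payload["chat_template"],           # fully rendered prompt for the LLM
    "max_new_tokens": payload["max_new_tokens"],  # sampling parameters pass through unchanged
    "temperature": payload["temperature"],
}
```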

##### 3.2.2 Custom Prompt Template

You can provide custom system and user prompt templates:

**Example Input**

```bash
curl -X POST -H "Content-Type: application/json" -d @- http://localhost:7900/v1/prompt_template <<JSON_DATA
{
"data": {
"initial_query": "What is Deep Learning?",
"reranked_docs": [{ "text": "Deep Learning is..." }]
},
"system_prompt_template": "### Please refer to the search results obtained from the local knowledge base. But be careful to not incorporate information that you think is not relevant to the question. If you don't know the answer to a question, please don't share false information. ### Search results: {reranked_docs}",
  "user_prompt_template": "### Question: {initial_query} \n### Answer:"
}
JSON_DATA
```

**Example Output**

Custom instructions about using search results in the chat_template.

```json
{
"id": "b1f1cec396954d5dc1b942f5959d556d",
"model": null,
"query": "What is Deep Learning?",
"max_tokens": 1024,
"max_new_tokens": 1024,
"top_k": 10,
"top_p": 0.95,
"typical_p": 0.95,
"temperature": 0.01,
"frequency_penalty": 0.0,
"presence_penalty": 0.0,
"repetition_penalty": 1.03,
"stream": true,
"language": "auto",
"input_guardrail_params": null,
"output_guardrail_params": null,
  "chat_template": "### Please refer to the search results obtained from the local knowledge base. But be careful to not incorporate information that you think is not relevant to the question. If you don't know the answer to a question, please don't share false information. ### Search results: [File: Unknown Source]\nDeep Learning is...\n### Question: What is Deep Learning? \n### Answer:",
"documents": []
}
```
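The custom templates use Python-style `{placeholder}` fields that the service fills from the keys of the request's `data` object; the substitution behaves like `str.format`. A sketch (not the service's internal code):

```python
# Sketch: {placeholder} fields map to keys of the request's "data" object.
user_prompt_template = "### Question: {initial_query} \n### Answer:"
data = {"initial_query": "What is Deep Learning?"}

user_prompt = user_prompt_template.format(**data)
# user_prompt == "### Question: What is Deep Learning? \n### Answer:"
```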

##### 3.2.3 Translation Scenario

Using a translation-related prompt template:

**Example Input**

```bash
curl -X POST -H "Content-Type: application/json" -d @- http://localhost:7900/v1/prompt_template <<JSON_DATA
{
"data": {
    "initial_query": "什么是深度学习?",
"source_lang": "chinese",
"target_lang": "english"
},
"system_prompt_template": "### You are a helpful, respectful, and honest assistant to help the user with translations. Translate this from {source_lang} to {target_lang}.",
  "user_prompt_template": "### Question: {initial_query} \n### Answer:"
}
JSON_DATA
```

**Example Output**

The `chat_template` carries the translation instruction, e.g. "Translate this from chinese to english.":

```json
{
"id": "4f5e0024c2330a7be065b370d02e061f",
"model": null,
"query": "什么是深度学习?",
"max_tokens": 1024,
"max_new_tokens": 1024,
"top_k": 10,
"top_p": 0.95,
"typical_p": 0.95,
"temperature": 0.01,
"frequency_penalty": 0.0,
"presence_penalty": 0.0,
"repetition_penalty": 1.03,
"stream": true,
"language": "auto",
"input_guardrail_params": null,
"output_guardrail_params": null,
"chat_template": "### You are a helpful, respectful, and honest assistant to help the user with translations. Translate this from chinese to english.\n### Question: 什么是深度学习? \n### Answer:",
"documents": []
}
```
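Since a placeholder with no matching key makes substitution fail, it can help to sanity-check a custom template locally before sending it to the service. The helper below is a hypothetical client-side check, independent of the microservice:

```python
from string import Formatter

def template_fields(template):
    """Collect the {placeholder} names appearing in a template string."""
    return {field for _, field, _, _ in Formatter().parse(template) if field}

template = "### Translate this from {source_lang} to {target_lang}."
# Verify every placeholder is among the keys we plan to send in "data".
assert template_fields(template) <= {"source_lang", "target_lang", "initial_query"}
```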
2 changes: 2 additions & 0 deletions comps/prompt_template/src/integrations/__init__.py
@@ -0,0 +1,2 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0