Merged
10 changes: 5 additions & 5 deletions comps/guardrails/src/bias_detection/README.md
@@ -32,7 +32,7 @@ python bias_detection.py

### 2.1 Prepare bias detection model

-export HUGGINGFACEHUB_API_TOKEN=${HP_TOKEN}
+export HUGGINGFACEHUB_API_TOKEN=${HF_TOKEN}

### 2.2 Build Docker Image

@@ -60,16 +60,16 @@ Once microservice starts, users can use examples (bash or python) below to apply
**Bash:**

```diff
-curl localhost:9092/v1/bias
-  -X POST
-  -d '{"text":"John McCain exposed as an unprincipled politician"}'
+curl localhost:9092/v1/bias \
+  -X POST \
+  -d '{"text":"John McCain exposed as an unprincipled politician"}' \
+  -H 'Content-Type: application/json'
```

Example Output:

```diff
-"\nI'm sorry, but your query or LLM's response is BIASED with an score of 0.74 (0-1)!!!\n"
+"Violated policies: bias, please check your input."
```

**Python Script:**
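The **Python Script** example referenced above is collapsed in this diff view. As a stand-in, here is a minimal sketch of a Python client for the same endpoint, using only the standard library; the function name and defaults are hypothetical and not the repository's actual script:

```python
import json
import urllib.request


def detect_bias(text, host="localhost", port=9092):
    """POST a text sample to the bias detection endpoint and return
    the raw response body as a string (hypothetical helper)."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}:{port}/v1/bias",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

Calling `detect_bias("some text")` then returns the service's raw response string, matching the output of the curl example above.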
14 changes: 11 additions & 3 deletions comps/guardrails/src/hallucination_detection/README.md
@@ -33,9 +33,16 @@ Therefore, we focus on detecting contextualized hallucinations with the following

## 🚀1. Start Microservice based on vLLM endpoint on Intel Gaudi Accelerator

-### 1.1 Define Environment Variables
+### 1.1 Environment Setup

### Clone OPEA GenAIComps and Set Up the Environment

Clone this repository to your desired location and set environment variables for easy setup and usage throughout the instructions.

```bash
git clone https://github.com/opea-project/GenAIComps.git

export OPEA_GENAICOMPS_ROOT=$(pwd)/GenAIComps
export your_ip=<your ip>
export port_number=9008
export HUGGINGFACEHUB_API_TOKEN=<token>
```

@@ -60,13 +67,14 @@ Then we wrap the vLLM Service into Hallucination Microservice.
### 2.1 Build Docker

```diff
-bash build_docker_hallucination_microservice.sh
+cd $OPEA_GENAICOMPS_ROOT
+bash comps/guardrails/src/hallucination_detection/build_docker_hallucination_microservice.sh
```

### 2.2 Launch Hallucination Microservice

```diff
-bash launch_hallucination_microservice.sh
+bash comps/guardrails/src/hallucination_detection/launch_hallucination_microservice.sh
```

## 🚀3. Get Status of Hallucination Microservice
@@ -2,7 +2,7 @@
# SPDX-License-Identifier: Apache-2.0

# Folder name you're looking for
-target_folder="GenAIComps"
+target_folder=$OPEA_GENAICOMPS_ROOT
proj_folder=$(pwd)

# Start from the current directory
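The body of this shell script is collapsed in the diff, but the comments indicate it searches for the target folder starting from the current directory. As an illustration only (not the repository's actual logic), an upward directory search of this kind can be sketched in Python as:

```python
from pathlib import Path


def find_upward(target_name, start="."):
    """Walk from start up toward the filesystem root and return the
    first directory entry named target_name, or None if absent.
    (Hypothetical sketch of an upward folder search.)"""
    current = Path(start).resolve()
    for candidate in [current, *current.parents]:
        child = candidate / target_name
        if child.is_dir():
            return child
    return None
```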