Adapt example code for guardrails refactor (#1360)

Signed-off-by: lvliang-intel <liang1.lv@intel.com>
Signed-off-by: chensuyue <suyue.chen@intel.com>
This commit is contained in:
Liang Lv
2025-01-08 14:35:23 +08:00
committed by GitHub
parent 5638075d65
commit b3c405a5f6
7 changed files with 15 additions and 15 deletions

@@ -92,7 +92,7 @@ docker build --no-cache -t opea/dataprep-redis:latest --build-arg https_proxy=$h
To fortify AI initiatives in production, Guardrails microservice can secure model inputs and outputs, building Trustworthy, Safe, and Secure LLM-based Applications.
```bash
-docker build -t opea/guardrails-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/llama_guard/langchain/Dockerfile .
+docker build -t opea/guardrails:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/src/guardrails/Dockerfile .
```
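The compose file changed by this commit references the image as `${REGISTRY:-opea}/guardrails:${TAG:-latest}`. A minimal sketch of how that `${VAR:-default}` shell expansion resolves (the variable names come from the compose file; the override values here are illustrative):

```shell
# Unset variables fall back to the defaults "opea" and "latest".
unset REGISTRY TAG
echo "${REGISTRY:-opea}/guardrails:${TAG:-latest}"
# -> opea/guardrails:latest

# Overrides take precedence over the defaults.
REGISTRY=myregistry TAG=v1.1 sh -c 'echo "${REGISTRY:-opea}/guardrails:${TAG:-latest}"'
# -> myregistry/guardrails:v1.1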
### 4. Build MegaService Docker Image
@@ -168,7 +168,7 @@ If Conversation React UI is built, you will find one more image:
If Guardrails docker image is built, you will find one more image:
-- `opea/guardrails-tgi:latest`
+- `opea/guardrails:latest`
## 🚀 Start MicroServices and MegaService

@@ -51,8 +51,8 @@ services:
ipc: host
command: --model-id ${GURADRAILS_MODEL_ID} --max-input-length 1024 --max-total-tokens 2048
guardrails:
-    image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
-    container_name: guardrails-tgi-gaudi-server
+    image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
+    container_name: guardrails-gaudi-server
ports:
- "9090:9090"
ipc: host
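With the renamed container listening on port 9090, a minimal request sketch in Python. The `/v1/guardrails` route and the JSON body with a `text` field are assumptions about the refactored microservice's API, not confirmed by this commit; verify against your deployment before relying on them.

```python
import json
from urllib import request

# Assumed endpoint for the refactored guardrails microservice; adjust as needed.
GUARDRAILS_URL = "http://localhost:9090/v1/guardrails"

def build_payload(text: str) -> bytes:
    # The service is assumed to accept a JSON body with a "text" field to screen.
    return json.dumps({"text": text}).encode("utf-8")

def check_text(text: str) -> str:
    # POST the text to the guardrails service and return the raw JSON response.
    req = request.Request(
        GUARDRAILS_URL,
        data=build_payload(text),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8")
```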