diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
index 62e0b6032..a0b0df5b2 100644
--- a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
+++ b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -92,7 +92,7 @@ docker build --no-cache -t opea/dataprep-redis:latest --build-arg https_proxy=$h
 To fortify AI initiatives in production, Guardrails microservice can secure model inputs and outputs, building Trustworthy, Safe, and Secure LLM-based Applications.
 
 ```bash
-docker build -t opea/guardrails-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/llama_guard/langchain/Dockerfile .
+docker build -t opea/guardrails:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/src/guardrails/Dockerfile .
 ```
 
 ### 4. Build MegaService Docker Image
@@ -168,7 +168,7 @@ If Conversation React UI is built, you will find one more image:
 
 If Guardrails docker image is built, you will find one more image:
 
-- `opea/guardrails-tgi:latest`
+- `opea/guardrails:latest`
 
 ## 🚀 Start MicroServices and MegaService
 
diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml
index 7bebade29..55230d582 100644
--- a/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml
+++ b/ChatQnA/docker_compose/intel/hpu/gaudi/compose_guardrails.yaml
@@ -51,8 +51,8 @@ services:
     ipc: host
     command: --model-id ${GURADRAILS_MODEL_ID} --max-input-length 1024 --max-total-tokens 2048
   guardrails:
-    image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
-    container_name: guardrails-tgi-gaudi-server
+    image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
+    container_name: guardrails-gaudi-server
     ports:
       - "9090:9090"
     ipc: host
diff --git a/ChatQnA/docker_image_build/build.yaml b/ChatQnA/docker_image_build/build.yaml
index a7ee05770..a93a4b6a5 100644
--- a/ChatQnA/docker_image_build/build.yaml
+++ b/ChatQnA/docker_image_build/build.yaml
@@ -95,12 +95,12 @@ services:
       dockerfile: comps/dataprep/pinecone/langchain/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/dataprep-pinecone:${TAG:-latest}
-  guardrails-tgi:
+  guardrails:
     build:
       context: GenAIComps
-      dockerfile: comps/guardrails/llama_guard/langchain/Dockerfile
+      dockerfile: comps/guardrails/src/guardrails/Dockerfile
     extends: chatqna
-    image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
+    image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
   vllm:
     build:
       context: vllm
diff --git a/ChatQnA/kubernetes/intel/cpu/xeon/manifest/chatqna-guardrails.yaml b/ChatQnA/kubernetes/intel/cpu/xeon/manifest/chatqna-guardrails.yaml
index f464c6e64..3ebf18a56 100644
--- a/ChatQnA/kubernetes/intel/cpu/xeon/manifest/chatqna-guardrails.yaml
+++ b/ChatQnA/kubernetes/intel/cpu/xeon/manifest/chatqna-guardrails.yaml
@@ -758,7 +758,7 @@ spec:
            runAsUser: 1000
            seccompProfile:
              type: RuntimeDefault
-          image: "opea/guardrails-tgi:latest"
+          image: "opea/guardrails:latest"
           imagePullPolicy: Always
           ports:
             - name: guardrails-usvc
diff --git a/ChatQnA/kubernetes/intel/hpu/gaudi/manifest/chatqna-guardrails.yaml b/ChatQnA/kubernetes/intel/hpu/gaudi/manifest/chatqna-guardrails.yaml
index b5fbaf8f0..1b7f1f5d2 100644
--- a/ChatQnA/kubernetes/intel/hpu/gaudi/manifest/chatqna-guardrails.yaml
+++ b/ChatQnA/kubernetes/intel/hpu/gaudi/manifest/chatqna-guardrails.yaml
@@ -688,7 +688,7 @@ spec:
            runAsUser: 1000
            seccompProfile:
              type: RuntimeDefault
-          image: "opea/guardrails-tgi:latest"
+          image: "opea/guardrails:latest"
           imagePullPolicy: Always
           ports:
             - name: guardrails-usvc
diff --git a/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh b/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh
index 56f1cfdfd..b0376affb 100644
--- a/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh
+++ b/ChatQnA/tests/test_compose_guardrails_on_gaudi.sh
@@ -19,7 +19,7 @@ function build_docker_images() {
     git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
 
     echo "Build all the images with --no-cache, check docker_image_build.log for details..."
-    service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails-tgi nginx"
+    service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails nginx"
     docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log
 
     docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6
@@ -136,7 +136,7 @@ function validate_microservices() {
         "${ip_address}:9090/v1/guardrails" \
         "Violated policies" \
         "guardrails" \
-        "guardrails-tgi-gaudi-server" \
+        "guardrails-gaudi-server" \
         '{"text":"How do you buy a tiger in the US?"}'
 }
 
diff --git a/docker_images_list.md b/docker_images_list.md
index 83bc6753e..69df49ec9 100644
--- a/docker_images_list.md
+++ b/docker_images_list.md
@@ -65,9 +65,9 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
 | [opea/finetuning-gaudi](https://hub.docker.com/r/opea/finetuning-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/Dockerfile.intel_hpu) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use on the Gaudi |
 | [opea/gmcrouter](https://hub.docker.com/r/opea/gmcrouter) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.manager) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to route the traffic among the microservices defined in GMC |
 | [opea/gmcmanager](https://hub.docker.com/r/opea/gmcmanager) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.router) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to be controller manager to handle GMC CRD |
-| [opea/guardrails-tgi](https://hub.docker.com/r/opea/guardrails-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/llama_guard/langchain/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
-| [opea/guardrails-toxicity-detection](https://hub.docker.com/r/opea/guardrails-toxicity-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/toxicity_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide toxicity detection for GenAI application use |
-| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
+| [opea/guardrails]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/guardrails/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
+| [opea/guardrails-toxicity-detection](https://hub.docker.com/r/opea/guardrails-toxicity-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/toxicity_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide toxicity detection for GenAI application use |
+| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
 | [opea/llm-docsum-tgi](https://hub.docker.com/r/opea/llm-docsum-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/summarization/tgi/langchain/Dockerfile) | This docker image is designed to build a document summarization microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a document summary. |
 | [opea/llm-faqgen-tgi](https://hub.docker.com/r/opea/llm-faqgen-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/faq-generation/tgi/langchain/Dockerfile) | This docker image is designed to build a frequently asked questions microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a FAQ. |
 | [opea/llm-textgen](https://hub.docker.com/r/opea/llm-textgen) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/src/text-generation/Dockerfile) | The docker image exposed the OPEA LLM microservice upon TGI docker image for GenAI application use |
@@ -78,7 +78,7 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
 | [opea/lvm-video-llama](https://hub.docker.com/r/opea/lvm-video-llama) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/lvms/video-llama/Dockerfile) | The docker image exposed the OPEA microservice running Video-Llama as a large visual model (LVM) for GenAI application use |
 | [opea/nginx](https://hub.docker.com/r/opea/nginx) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/3rd_parties/nginx/src/Dockerfile) | The docker image exposed the OPEA nginx microservice for GenAI application use |
 | [opea/promptregistry-mongo-server](https://hub.docker.com/r/opea/promptregistry-mongo-server) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/prompt_registry/mongo/Dockerfile) | The docker image exposes the OPEA Prompt Registry microservices which based on MongoDB database, designed to store and retrieve user's preferred prompts |
-| [opea/reranking-tei](https://hub.docker.com/r/opea/reranking-tei) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/reranks/src/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use |
+| [opea/reranking]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/rerankings/src/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use |
 | [opea/retriever-milvus](https://hub.docker.com/r/opea/retriever-milvus) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/milvus/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use |
 | [opea/retriever-pathway](https://hub.docker.com/r/opea/retriever-pathway) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pathway/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice with pathway for GenAI application use |
 | [opea/retriever-pgvector](https://hub.docker.com/r/opea/retriever-pgvector) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pgvector/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on pgvector vectordb for GenAI application use |
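Taken together, the patch is a mechanical rename: every `guardrails-tgi` image, service, and container name becomes `guardrails`, and the Dockerfile path moves under `comps/guardrails/src/guardrails/`. As a sketch (the throwaway file and `sed` invocation below are illustrative, not part of the patch), the same substitution could be reproduced like this:

```shell
#!/bin/sh
# Illustrative only: reproduce the patch's rename on a throwaway fragment
# that mirrors compose_guardrails.yaml before the change.
set -eu

tmp=$(mktemp)
cat > "$tmp" <<'EOF'
    image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
    container_name: guardrails-tgi-gaudi-server
EOF

# A single substitution covers both cases: "guardrails-tgi" ->
# "guardrails" also turns "guardrails-tgi-gaudi-server" into
# "guardrails-gaudi-server", matching the patch.
sed 's/guardrails-tgi/guardrails/g' "$tmp" > "$tmp.new"

cat "$tmp.new"
# prints:
#     image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
#     container_name: guardrails-gaudi-server
rm -f "$tmp" "$tmp.new"
```

Note that a blanket `sed` like this is only safe because the patch renames every occurrence; a partial rename would need the file-by-file edits the diff actually makes.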