Compare commits

4 Commits
main ... v1.0rc

Author SHA1 Message Date
Letong Han
3c3d0b4d36 [ProductivitySuite] Fix CD Issue (#858)
Signed-off-by: letonghan <letong.han@intel.com>
(cherry picked from commit d55a33dda1)
2024-09-20 16:32:05 +08:00
XinyaoWa
c9001a3912 Fix SearchQnA tests bug (#857)
Signed-off-by: Xinyao Wang <xinyao.wang@intel.com>
(cherry picked from commit daf2a4fad7)
2024-09-20 16:31:49 +08:00
chen, suyue
08fa591ebd print image build test commit (#856)
Signed-off-by: chensuyue <suyue.chen@intel.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
(cherry picked from commit 3ce395582b)
2024-09-20 16:31:46 +08:00
Letong Han
6d4b3d6b0b [Doc] Refine ChatQnA README (#855)
Signed-off-by: letonghan <letong.han@intel.com>
(cherry picked from commit 7eaab93d0b)
2024-09-20 16:31:44 +08:00
9 changed files with 36 additions and 37 deletions

@@ -46,33 +46,34 @@ jobs:
       - name: Clean Up Working Directory
         run: sudo rm -rf ${{github.workspace}}/*
-      - name: Get checkout ref
+      - name: Get Checkout Ref
         run: |
           if [ "${{ github.event_name }}" == "pull_request" ] || [ "${{ github.event_name }}" == "pull_request_target" ]; then
             echo "CHECKOUT_REF=refs/pull/${{ github.event.number }}/merge" >> $GITHUB_ENV
           else
             echo "CHECKOUT_REF=${{ github.ref }}" >> $GITHUB_ENV
           fi
+          echo "checkout ref ${{ env.CHECKOUT_REF }}"
-      - name: Checkout out Repo
+      - name: Checkout out GenAIExamples
         uses: actions/checkout@v4
         with:
           ref: ${{ env.CHECKOUT_REF }}
           fetch-depth: 0
-      - name: Clone required Repo
+      - name: Clone Required Repo
         run: |
           cd ${{ github.workspace }}/${{ inputs.example }}/docker_image_build
           docker_compose_path=${{ github.workspace }}/${{ inputs.example }}/docker_image_build/build.yaml
           if [[ $(grep -c "tei-gaudi:" ${docker_compose_path}) != 0 ]]; then
             git clone https://github.com/huggingface/tei-gaudi.git
+            cd tei-gaudi && git rev-parse HEAD && cd ../
           fi
           if [[ $(grep -c "vllm:" ${docker_compose_path}) != 0 ]]; then
             git clone https://github.com/vllm-project/vllm.git
+            cd vllm && git rev-parse HEAD && cd ../
           fi
           git clone https://github.com/opea-project/GenAIComps.git
-          cd GenAIComps && git checkout ${{ inputs.opea_branch }} && cd ../
+          cd GenAIComps && git checkout ${{ inputs.opea_branch }} && git rev-parse HEAD && cd ../
       - name: Build Image
         if: ${{ fromJSON(inputs.build) }}
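The "Get Checkout Ref" step above picks the PR merge ref for pull-request events and falls back to the triggering ref otherwise. That branching can be sketched as a standalone shell function (a minimal sketch; the function name and arguments are hypothetical stand-ins for GitHub's `github.event_name`, `github.event.number`, and `github.ref` contexts):

```shell
# Hypothetical helper mirroring the "Get Checkout Ref" step:
# pull_request and pull_request_target events check out the PR merge ref,
# every other event checks out the ref that triggered the workflow.
resolve_checkout_ref() {
    event_name="$1"   # e.g. "pull_request", "push"
    pr_number="$2"    # PR number; only used for PR events
    ref="$3"          # e.g. "refs/heads/main"
    if [ "$event_name" = "pull_request" ] || [ "$event_name" = "pull_request_target" ]; then
        echo "refs/pull/${pr_number}/merge"
    else
        echo "$ref"
    fi
}

resolve_checkout_ref pull_request 858 refs/heads/main   # prints refs/pull/858/merge
resolve_checkout_ref push "" refs/heads/v1.0rc          # prints refs/heads/v1.0rc
```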

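The `grep -c` gate in "Clone Required Repo" — clone a dependency repo only when its service key occurs in `build.yaml` — can be sketched against a throwaway compose file (the file contents and service names here are illustrative, not the real `build.yaml`):

```shell
# Sketch of the grep-count gate: act only when a key appears in the compose file.
compose=$(mktemp)
cat > "$compose" <<'EOF'
services:
  tei-gaudi:
    image: opea/tei-gaudi:latest
EOF

needs_repo() {
    # grep -c prints the number of matching lines; a non-zero count means
    # the service is built from this compose file, so its repo is needed.
    [ "$(grep -c "${1}:" "$compose")" != 0 ]
}

needs_repo tei-gaudi && echo "would clone tei-gaudi"
needs_repo vllm || echo "vllm not referenced, skipping clone"
rm -f "$compose"
```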
@@ -72,9 +72,9 @@ docker pull opea/chatqna-ui:latest
 In following cases, you could build docker image from source by yourself.
-- Failed to download the docker image.
+- Failed to download the docker image. (The essential Docker image `opea/nginx` has not yet been released, users need to build this image first)
-- Use the latest or special version.
+- If you want to use a specific version of Docker image.
 Please refer to the 'Build Docker Images' in [Guide](docker_compose/intel/cpu/xeon/README.md).

@@ -49,9 +49,9 @@ docker pull opea/chatqna-ui:latest
 In following cases, you could build docker image from source by yourself.
-- Failed to download the docker image.
+- Failed to download the docker image. (The essential Docker image `opea/nginx` has not yet been released, users need to build this image first)
-- Use the latest or special version.
+- If you want to use a specific version of Docker image.
 Please refer to 'Build Docker Images' in below.

@@ -50,9 +50,9 @@ docker pull opea/chatqna-ui:latest
 In following cases, you could build docker image from source by yourself.
-- Failed to download the docker image.
+- Failed to download the docker image. (The essential Docker image `opea/nginx` has not yet been released, users need to build this image first)
-- Use the latest or special version.
+- If you want to use a specific version of Docker image.
 Please refer to 'Build Docker Images' in below.

@@ -50,9 +50,9 @@ docker pull opea/chatqna-ui:latest
 In following cases, you could build docker image from source by yourself.
-- Failed to download the docker image.
+- Failed to download the docker image. (The essential Docker image `opea/nginx` has not yet been released, users need to build this image first)
-- Use the latest or special version.
+- If you want to use a specific version of Docker image.
 Please refer to 'Build Docker Images' in below.

@@ -72,9 +72,7 @@ services:
       REDIS_URL: ${REDIS_URL}
       INDEX_NAME: ${INDEX_NAME}
       TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT}
-      LANGCHAIN_API_KEY: ${LANGCHAIN_API_KEY}
-      LANGCHAIN_TRACING_V2: ${LANGCHAIN_TRACING_V2}
-      LANGCHAIN_PROJECT: "opea-retriever-service"
+      HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
     restart: unless-stopped
   tei-reranking-service:
     image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
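Since the retriever service now reads `HUGGINGFACEHUB_API_TOKEN` instead of the LangChain variables, a pre-flight check that the token is exported avoids starting a container without credentials. A minimal sketch (the check itself is an assumption for illustration, not part of the change; the token value below is a placeholder):

```shell
# Pre-flight check: fail fast when a required environment variable is empty,
# instead of letting `docker compose up` start a service without credentials.
require_env() {
    eval "value=\${$1:-}"
    [ -n "$value" ] || { echo "ERROR: $1 must be set" >&2; return 1; }
}

HUGGINGFACEHUB_API_TOKEN="hf_example"   # placeholder value for this sketch
require_env HUGGINGFACEHUB_API_TOKEN && echo "token present"
```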

@@ -53,20 +53,20 @@ function start_services() {
     export TGI_LLM_ENDPOINT_CODEGEN="http://${ip_address}:8028"
     export TGI_LLM_ENDPOINT_FAQGEN="http://${ip_address}:9009"
     export TGI_LLM_ENDPOINT_DOCSUM="http://${ip_address}:9009"
-    export BACKEND_SERVICE_ENDPOINT_CHATQNA="http://${host_ip}:8888/v1/chatqna"
-    export BACKEND_SERVICE_ENDPOINT_FAQGEN="http://${host_ip}:8889/v1/faqgen"
-    export DATAPREP_DELETE_FILE_ENDPOINT="http://${host_ip}:6009/v1/dataprep/delete_file"
-    export BACKEND_SERVICE_ENDPOINT_CODEGEN="http://${host_ip}:7778/v1/codegen"
-    export BACKEND_SERVICE_ENDPOINT_DOCSUM="http://${host_ip}:8890/v1/docsum"
-    export DATAPREP_SERVICE_ENDPOINT="http://${host_ip}:6007/v1/dataprep"
-    export DATAPREP_GET_FILE_ENDPOINT="http://${host_ip}:6008/v1/dataprep/get_file"
-    export CHAT_HISTORY_CREATE_ENDPOINT="http://${host_ip}:6012/v1/chathistory/create"
-    export CHAT_HISTORY_CREATE_ENDPOINT="http://${host_ip}:6012/v1/chathistory/create"
-    export CHAT_HISTORY_DELETE_ENDPOINT="http://${host_ip}:6012/v1/chathistory/delete"
-    export CHAT_HISTORY_GET_ENDPOINT="http://${host_ip}:6012/v1/chathistory/get"
-    export PROMPT_SERVICE_GET_ENDPOINT="http://${host_ip}:6015/v1/prompt/get"
-    export PROMPT_SERVICE_CREATE_ENDPOINT="http://${host_ip}:6015/v1/prompt/create"
-    export KEYCLOAK_SERVICE_ENDPOINT="http://${host_ip}:8080"
+    export BACKEND_SERVICE_ENDPOINT_CHATQNA="http://${ip_address}:8888/v1/chatqna"
+    export BACKEND_SERVICE_ENDPOINT_FAQGEN="http://${ip_address}:8889/v1/faqgen"
+    export DATAPREP_DELETE_FILE_ENDPOINT="http://${ip_address}:6009/v1/dataprep/delete_file"
+    export BACKEND_SERVICE_ENDPOINT_CODEGEN="http://${ip_address}:7778/v1/codegen"
+    export BACKEND_SERVICE_ENDPOINT_DOCSUM="http://${ip_address}:8890/v1/docsum"
+    export DATAPREP_SERVICE_ENDPOINT="http://${ip_address}:6007/v1/dataprep"
+    export DATAPREP_GET_FILE_ENDPOINT="http://${ip_address}:6008/v1/dataprep/get_file"
+    export CHAT_HISTORY_CREATE_ENDPOINT="http://${ip_address}:6012/v1/chathistory/create"
+    export CHAT_HISTORY_CREATE_ENDPOINT="http://${ip_address}:6012/v1/chathistory/create"
+    export CHAT_HISTORY_DELETE_ENDPOINT="http://${ip_address}:6012/v1/chathistory/delete"
+    export CHAT_HISTORY_GET_ENDPOINT="http://${ip_address}:6012/v1/chathistory/get"
+    export PROMPT_SERVICE_GET_ENDPOINT="http://${ip_address}:6015/v1/prompt/get"
+    export PROMPT_SERVICE_CREATE_ENDPOINT="http://${ip_address}:6015/v1/prompt/create"
+    export KEYCLOAK_SERVICE_ENDPOINT="http://${ip_address}:8080"
     export MONGO_HOST=${ip_address}
     export MONGO_PORT=27017
     export DB_NAME="opea"
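The long run of exports follows one pattern: host, port, and route joined into a URL, now consistently built from `${ip_address}` rather than `${host_ip}`. The pattern can be sketched with a tiny helper (the function name and the IP value below are hypothetical; the real script exports each URL individually):

```shell
# Build a service endpoint URL from host, port, and route,
# mirroring the exports above that all use ${ip_address}.
endpoint() {
    printf 'http://%s:%s%s\n' "$1" "$2" "$3"
}

ip_address="192.168.0.10"   # placeholder; the script derives the real value
endpoint "$ip_address" 8888 /v1/chatqna
endpoint "$ip_address" 6012 /v1/chathistory/create
```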
@@ -235,7 +235,7 @@ function validate_microservices() {
     # FAQGen llm microservice
     validate_service \
-        "${ip_address}:${LLM_SERVICE_HOST_PORT_FAQGEN}/v1/faqgen" \
+        "${ip_address}:9002/v1/faqgen" \
         "data: " \
         "llm_faqgen" \
         "llm-faqgen-server" \
@@ -243,7 +243,7 @@ function validate_microservices() {
     # Docsum llm microservice
     validate_service \
-        "${ip_address}:${LLM_SERVICE_HOST_PORT_DOCSUM}/v1/chat/docsum" \
+        "${ip_address}:9003/v1/chat/docsum" \
         "data: " \
         "llm_docsum" \
         "llm-docsum-server" \
@@ -251,7 +251,7 @@ function validate_microservices() {
     # CodeGen llm microservice
     validate_service \
-        "${ip_address}:${LLM_SERVICE_HOST_PORT_CODEGEN}/v1/chat/completions" \
+        "${ip_address}:9001/v1/chat/completions" \
         "data: " \
         "llm_codegen" \
         "llm-tgi-server-codegen" \

@@ -73,10 +73,10 @@ function start_services() {
 function validate_megaservice() {
-    result=$(http_proxy="" curl http://${ip_address}:3008/v1/searchqna -XPOST -d '{"messages": "How many gold medals does USA win in olympics 2024? Give me also the source link.", "stream": "False"}' -H 'Content-Type: application/json')
+    result=$(http_proxy="" curl http://${ip_address}:3008/v1/searchqna -XPOST -d '{"messages": "What is black myth wukong?", "stream": "False"}' -H 'Content-Type: application/json')
     echo $result
-    if [[ $result == *"2024"* ]]; then
+    if [[ $result == *"the"* ]]; then
         docker logs web-retriever-chroma-server > ${LOG_PATH}/web-retriever-chroma-server.log
         docker logs searchqna-gaudi-backend-server > ${LOG_PATH}/searchqna-gaudi-backend-server.log
         docker logs tei-embedding-gaudi-server > ${LOG_PATH}/tei-embedding-gaudi-server.log

@@ -71,10 +71,10 @@ function start_services() {
 function validate_megaservice() {
-    result=$(http_proxy="" curl http://${ip_address}:3008/v1/searchqna -XPOST -d '{"messages": "How many gold medals does USA win in olympics 2024? Give me also the source link.", "stream": "False"}' -H 'Content-Type: application/json')
+    result=$(http_proxy="" curl http://${ip_address}:3008/v1/searchqna -XPOST -d '{"messages": "What is black myth wukong?", "stream": "False"}' -H 'Content-Type: application/json')
     echo $result
-    if [[ $result == *"2024"* ]]; then
+    if [[ $result == *"the"* ]]; then
         docker logs web-retriever-chroma-server
         docker logs searchqna-xeon-backend-server
         echo "Result correct."