Update README for some minor issues (#1000)
Signed-off-by: lvliang-intel <liang1.lv@intel.com>
@@ -206,8 +206,6 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
 
-> Notice: Currently only the **Habana Driver 1.16.x** is supported for Gaudi.
-
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 ### Deploy ChatQnA on Xeon

@@ -97,6 +97,11 @@ After launching your instance, you can connect to it using SSH (for Linux instan
 
 First of all, you need to build Docker Images locally and install the python package of it.
 
+```bash
+git clone https://github.com/opea-project/GenAIComps.git
+cd GenAIComps
+```
+
 ### 1. Build Retriever Image
 
 ```bash

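The added block clones GenAIComps and changes into the repository, but the "install the python package" part of the sentence above is not shown in this hunk. A hedged sketch of what that step could look like, assuming the repository's top-level requirements file and standard pip install are used (neither command is part of this commit):

```bash
# Assumption: GenAIComps lists its dependencies in a top-level requirements.txt
# and is pip-installable from the repository root; follow the repository README
# if its install entry point differs.
pip install -r requirements.txt   # dependencies (path assumed)
pip install .                     # the GenAIComps python package itself
```
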
@@ -111,7 +111,7 @@ Build frontend Docker image that enables Conversational experience with ChatQnA
 **Export the value of the public IP address of your Xeon server to the `host_ip` environment variable**
 
 ```bash
-cd GenAIExamples/ChatQnA//ui
+cd GenAIExamples/ChatQnA/ui
 export BACKEND_SERVICE_ENDPOINT="http://${host_ip}:8912/v1/chatqna"
 export DATAPREP_SERVICE_ENDPOINT="http://${host_ip}:6043/v1/dataprep"
 docker build --no-cache -t opea/chatqna-conversation-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy --build-arg BACKEND_SERVICE_ENDPOINT=$BACKEND_SERVICE_ENDPOINT --build-arg DATAPREP_SERVICE_ENDPOINT=$DATAPREP_SERVICE_ENDPOINT -f ./docker/Dockerfile.react .

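The build commands in this hunk read `host_ip` from the environment but do not set it. A minimal sketch of that prerequisite step, assuming a Linux host where the first address reported by `hostname -I` is the one clients can reach (substitute the server's actual public IP if it is not):

```bash
# Assumption: the first local IPv4 address is reachable by clients.
# If the Xeon server sits behind NAT, set host_ip to its public IP instead.
export host_ip=$(hostname -I | awk '{print $1}')
echo "host_ip=${host_ip}"
```
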
@@ -70,6 +70,11 @@ curl http://${host_ip}:8888/v1/chatqna \
 
 First of all, you need to build Docker Images locally. This step can be ignored after the Docker images published to Docker hub.
 
+```bash
+git clone https://github.com/opea-project/GenAIComps.git
+cd GenAIComps
+```
+
 ### 1. Build Retriever Image
 
 ```bash

@@ -132,8 +132,6 @@ cd GenAIExamples/CodeGen/docker_compose/intel/hpu/gaudi
 docker compose up -d
 ```
 
-> Notice: Currently only the **Habana Driver 1.16.x** is supported for Gaudi.
-
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 #### Deploy CodeGen on Xeon

@@ -72,8 +72,6 @@ cd GenAIExamples/CodeTrans/docker_compose/intel/hpu/gaudi
 docker compose up -d
 ```
 
-> Notice: Currently only the **Habana Driver 1.16.x** is supported for Gaudi.
-
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 #### Deploy Code Translation on Xeon

@@ -71,8 +71,6 @@ cd GenAIExamples/DocSum/docker_compose/intel/hpu/gaudi/
 docker compose -f compose.yaml up -d
 ```
 
-> Notice: Currently only the **Habana Driver 1.16.x** is supported for Gaudi.
-
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 #### Deploy on Xeon

@@ -137,8 +137,6 @@ cd GenAIExamples/SearchQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
 
-> Notice: Currently only the **Habana Driver 1.16.x** is supported for Gaudi.
-
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 ### Deploy SearchQnA on Xeon

@@ -76,8 +76,6 @@ cd GenAIExamples/VisualQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
 
-> Notice: Currently only the **Habana Driver 1.16.x** is supported for Gaudi.
-
 ### Deploy VisualQnA on Xeon
 
 Refer to the [Xeon Guide](./docker_compose/intel/cpu/xeon/README.md) for more instructions on building docker images from source.