GenAIExamples/ChatQnA/docker_compose/amd/gpu
Wang, Kai Lawrence 284db982be [ROCm] Fix the hf-token setting for TGI and TEI in ChatQnA (#1432)
This PR corrects the environment variable names passed to the TGI and TEI Docker containers in the ChatQnA example on the ROCm platform. TGI accepts the Hugging Face token under either HF_TOKEN or HUGGING_FACE_HUB_TOKEN, while TEI expects it under HF_API_TOKEN.

TGI: https://github.com/huggingface/text-generation-inference/blob/main/router/src/server.rs#L1700C1-L1702C15
TEI: https://github.com/huggingface/text-embeddings-inference/blob/main/router/src/main.rs#L112
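As a rough sketch of the intended outcome, a compose fragment with the corrected variable names could look like the following (service names and the host-side HUGGINGFACEHUB_API_TOKEN variable are assumptions, not the exact contents of the ROCm compose file):

```yaml
# Minimal sketch only; service names and the host-side
# HUGGINGFACEHUB_API_TOKEN variable are illustrative assumptions.
services:
  tgi-service:
    environment:
      # TGI accepts the Hugging Face token under either of these names.
      HF_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
      HUGGING_FACE_HUB_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
  tei-embedding-service:
    environment:
      # TEI expects the token under HF_API_TOKEN.
      HF_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
```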

Signed-off-by: Wang, Kai Lawrence <kai.lawrence.wang@intel.com>
2025-01-21 14:22:39 +08:00