
Single node on-prem deployment with Docker Compose on Xeon Scalable processors

This example showcases a hierarchical multi-agent system for question-answering applications. The Xeon deployment uses OpenAI LLM models accessed via API calls rather than locally hosted models. For step-by-step instructions, refer to the deployment guide.
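In broad strokes, deploying with Docker Compose comes down to exporting the required credentials and bringing the stack up. A minimal sketch is below; the compose file name `compose_openai.yaml` and the `host_ip` variable are assumptions for illustration, so consult the deployment guide and the files in this directory for the exact names and steps.

```shell
# Provide the OpenAI API key used for the LLM calls
# (placeholder value -- substitute your own key).
export OPENAI_API_KEY="sk-..."

# Assumed: export the host IP so services can reach each other
# across containers on a single node.
export host_ip=$(hostname -I | awk '{print $1}')

# Assumed compose file name -- check this directory for the real one.
docker compose -f compose_openai.yaml up -d
```

Once the services are up, `docker compose ps` shows their status, and `docker compose down` tears the deployment back down.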