# Translation Application
The Translation example demonstrates the implementation of language translation using OPEA component-level microservices.
## Table of Contents

- [Architecture](#architecture)
- [Deployment Options](#deployment-options)
## Architecture
The architecture of the Translation Application is illustrated below:
The Translation example is implemented using the component-level microservices defined in GenAIComps. The flow chart below shows the information flow between different microservices for this example.
```mermaid
---
config:
  flowchart:
    nodeSpacing: 400
    rankSpacing: 100
    curve: linear
  themeVariables:
    fontSize: 50px
---
flowchart LR
    %% Colors %%
    classDef blue fill:#ADD8E6,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
    classDef orange fill:#FBAA60,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
    classDef orchid fill:#C26DBC,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
    classDef invisible fill:transparent,stroke:transparent;
    style Translation-MegaService stroke:#000000

    %% Subgraphs %%
    subgraph Translation-MegaService["Translation MegaService "]
        direction LR
        LLM([LLM MicroService]):::blue
    end
    subgraph UserInterface[" User Interface "]
        direction LR
        a([User Input Query]):::orchid
        UI([UI server<br>]):::orchid
    end

    LLM_gen{{LLM Service <br>}}
    GW([Translation GateWay<br>]):::orange
    NG([Nginx MicroService]):::blue

    %% Questions interaction
    direction LR
    a[User Input Query] --> UI
    a[User Input Query] --> |Need Proxy Server|NG
    NG --> UI
    UI --> GW
    GW <==> Translation-MegaService

    %% Embedding service flow
    direction LR
    LLM <-.-> LLM_gen
```
This Translation use case performs language translation inference across multiple platforms. Currently, examples are provided for Intel Gaudi2 and Intel Xeon Scalable Processors, and we invite contributions from other hardware vendors to expand the OPEA ecosystem.
## Deployment Options
The table below lists the available deployment options and their implementation details for different hardware platforms.
| Platform | Deployment Method | Link |
|---|---|---|
| Intel Xeon | Docker compose | Deployment on Xeon |
| Intel Gaudi2 | Docker compose | Deployment on Gaudi |
| AMD ROCm | Docker compose | Deployment on AMD ROCm |
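Once a deployment is up, the Translation MegaService is exercised with a JSON request through its gateway. The sketch below builds such a request body; the gateway port (`8888`), the endpoint path (`/v1/translation`), and the field names are assumptions based on upstream GenAIComps/GenAIExamples conventions and should be verified against the deployment you are running.

```python
import json

# Example request body for the Translation MegaService gateway.
# NOTE: the field names below ("language_from", "language_to",
# "source_language") are assumptions and may differ between releases;
# check them against your deployed gateway's API.
payload = {
    "language_from": "Chinese",          # language of the input text
    "language_to": "English",            # target language
    "source_language": "我爱机器翻译。",  # text to translate
}

body = json.dumps(payload, ensure_ascii=False)
print(body)

# The request would then be sent to the gateway, for example with:
#   curl http://${host_ip}:8888/v1/translation \
#        -H "Content-Type: application/json" \
#        -d "$body"
```

The gateway forwards the request into the MegaService pipeline (see the flow chart above), where the LLM microservice produces the translated text.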
