Provide a clear working example of how to orchestrate multiple services #1272
Closed
Description
There are some code examples showing how you could create a megaservice from multiple microservices, but these examples appear incomplete in explaining how to get them working.
https://opea-project.github.io/latest/GenAIComps/README.html
- It appears you can do `pip install opea-comps`, but then Python can't find the `comps` package
- It also shows cloning the repo, but it downloads as `GenAIComps`, so my assumption is we have to reference it via `GenAIComps`
- We have a megaservice example class. What do you do with it? How do you run it? Do I just create an instance of the class, and it should spin up the expected resources?
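To make the question concrete, here is a self-contained toy sketch of the orchestration pattern the docs gesture at. This is NOT the real `ServiceOrchestrator`/`MicroService` from `comps`; the class and method names mirror the README's example style, but every implementation detail below is an assumption for illustration only:

```python
from dataclasses import dataclass

@dataclass
class MicroService:
    # Toy stand-in for a remote microservice endpoint; fields are
    # illustrative, not the real comps.MicroService signature.
    name: str
    host: str = "localhost"
    port: int = 8000

    def invoke(self, payload: dict) -> dict:
        # A real service would POST the payload to http://host:port;
        # here we just tag it so the flow is visible.
        return {**payload, "handled_by": self.name}

class ServiceOrchestrator:
    # Toy DAG of services: add() registers nodes, flow_to() wires edges.
    def __init__(self):
        self.services: list[MicroService] = []
        self.flows: list[tuple[str, str]] = []

    def add(self, service: MicroService) -> "ServiceOrchestrator":
        self.services.append(service)
        return self  # chainable, matching the add(...).add(...) style

    def flow_to(self, src: MicroService, dst: MicroService) -> None:
        self.flows.append((src.name, dst.name))

    def schedule(self, payload: dict) -> dict:
        # Run services in registration order, piping output to input.
        for svc in self.services:
            payload = svc.invoke(payload)
        return payload

# Instantiating the class only wires the graph; nothing executes until
# you schedule a request through it.
embedding = MicroService("embedding", port=6000)
llm = MicroService("llm", port=9000)
mega = ServiceOrchestrator()
mega.add(embedding).add(llm)
mega.flow_to(embedding, llm)
result = mega.schedule({"text": "hello"})
```

The point of the sketch: in this pattern the megaservice class is a graph definition plus a scheduler, so constructing it does not by itself spin up anything; the remote microservices must already be running, and a request only flows when you call the scheduling entry point.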
Other Notes
- TGI's docs suggest it's only for Xeon and Gaudi, and reviewing the code doesn't suggest it can run on consumer-grade Intel hardware or GPUs.
- Ollama is lacking documentation. I thought maybe I should not use TGI locally for teaching, but then when I read about the LLM comp, it suggests you have to use vLLM or TGI.
- Further investigation suggests that these three serving backends all follow the OpenAI API schema, so they will likely be interchangeable.
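If that's right, swapping backends should reduce to changing the base URL, since TGI, vLLM, and Ollama each expose an OpenAI-compatible `/v1/chat/completions` route. A minimal sketch of building the same request against each (the localhost ports are common defaults and the `"llama3"` model name is a placeholder; both are assumptions to verify for your deployment):

```python
import json

def chat_completion_request(base_url: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat request; only base_url varies per backend."""
    return {
        "url": f"{base_url}/v1/chat/completions",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# The request body is identical for every backend; only the host/port
# differs (ports below are typical defaults, not guaranteed):
backends = {
    "tgi": "http://localhost:8080",
    "vllm": "http://localhost:8000",
    "ollama": "http://localhost:11434",
}
requests_by_backend = {
    name: chat_completion_request(url, "llama3", "Hello")
    for name, url in backends.items()
}
```

Nothing here actually sends traffic; the point is that the payload shape is shared, so a teaching setup could start on Ollama and move to TGI or vLLM by changing one URL.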