PROWORKBENCH integration: OpenAI API provider + request for attribution/bundling guidance #7390
jamiegrl100 started this conversation in Ideas
Hi all — I’m Jamie, working on PROWORKBENCH (PB): https://github.com/jamiegrl100/proworkbench
We’re integrating Text Generation WebUI as PB’s default local model provider via the OpenAI-compatible API:
PB detects WebUI at http://127.0.0.1:5000 and does NOT auto-start it (the user starts WebUI and loads a model themselves).
PB is still in development, but we plan to bundle WebUI as part of setup later. If there's preferred attribution wording or bundling guidance you'd like integrators to follow, we'd really appreciate a pointer.
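For context, the detection step is roughly this: probe the OpenAI-compatible `/v1/models` route on the default port and treat any connection failure as "WebUI not running". A minimal sketch (the helper name `detect_webui` is ours, not part of WebUI's API; `/v1/models` is the standard OpenAI-compatible endpoint that WebUI's API serves):

```python
import json
import urllib.error
import urllib.request


def detect_webui(base_url: str = "http://127.0.0.1:5000", timeout: float = 2.0) -> bool:
    """Return True if an OpenAI-compatible API answers at base_url.

    Probes GET /v1/models; any connection error or malformed
    response is treated as "WebUI is not running here".
    """
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=timeout) as resp:
            data = json.load(resp)
            # The OpenAI-style models list is a dict with a "data" key.
            return isinstance(data, dict) and "data" in data
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

Since nothing is auto-started, a negative result just means PB falls back to asking the user to launch WebUI and load a model.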
We’ll credit and link back prominently (README/About/Attribution) and include clear non-technical setup docs (e.g., “open WebUI in a browser and load a model”).
Thanks!
Jamie