It would be great if you could use self-hosted Ollama as the LLM provider instead of having to pay for tokens through a hosted provider. API docs for reference: https://github.com/ollama/ollama/blob/main/docs/api.md
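
For reference, a minimal sketch of what the request shape could look like against Ollama's `/api/generate` endpoint, per the docs linked above. The base URL is Ollama's default local address; the helper function and model name here are just hypothetical examples, not part of any existing provider code:

```python
import json

# Assumption: Ollama running locally on its default port (11434).
OLLAMA_BASE_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str, stream: bool = False):
    """Build the (url, body) pair for Ollama's POST /api/generate endpoint.

    The JSON fields (model, prompt, stream) come straight from the Ollama
    API docs; stream=False asks for a single complete response object.
    """
    url = f"{OLLAMA_BASE_URL}/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode()
    return url, body

# Example usage ("llama3" is just an illustrative model name):
url, body = build_generate_request("llama3", "Why is the sky blue?")
print(url)
print(json.loads(body.decode())["model"])
```

Sending `body` to `url` with any HTTP client (e.g. `urllib.request` or `requests`) would then return the completion, so token costs stay at zero since everything runs on the user's own hardware.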