Add Ollama Support for On-prem Hosted LLMs #1232

@markcichy

Description

It would be great to be able to use a self-hosted Ollama instance as the LLM provider, rather than having to pay for tokens through a hosted provider.

API docs for ref:
https://github.com/ollama/ollama/blob/main/docs/api.md
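For illustration, a minimal sketch of what calling a local Ollama server could look like, using the `/api/generate` endpoint described in the docs above. It assumes Ollama is running on its default port (11434) and that a model (here `llama3`, as an example) has already been pulled; only the standard library is used.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: default install, no auth)
OLLAMA_URL = "http://localhost:11434"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint.

    stream=False asks the server for a single JSON response instead
    of a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ollama_generate(model: str, prompt: str, base_url: str = OLLAMA_URL) -> str:
    """Send a completion request to a local Ollama server.

    Returns the generated text from the "response" field of the
    JSON reply. Requires a running Ollama server.
    """
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be something like `ollama_generate("llama3", "Why is the sky blue?")`. A real integration would also want the `/api/chat` endpoint for multi-turn conversations and configurable base URL/model via app settings.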
