### Description

It appears `marimo` supports both [GitHub Copilot](https://docs.marimo.io/guides/editor_features/ai_completion.html#github-copilot) and [Codeium Copilot](https://docs.marimo.io/guides/editor_features/ai_completion.html#codeium-copilot); I am hoping to use locally hosted models on Ollama as a third option.

### Suggested solution

The `[completion]` section in `~/.marimo.toml` would look like:

```toml
[completion]
copilot = "ollama"
api_key = "ollama"
model = "codeqwen:7b-code-v1.5-q5_1"
base_url = "http://localhost:11434/v1/chat/completions"
```

### Alternative

_No response_

### Additional context

_No response_
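---

For reference, the proposed `base_url` points at Ollama's OpenAI-compatible chat-completions endpoint, so marimo could talk to it the same way it would to any OpenAI-style backend. Below is a minimal sketch, assuming a recent Ollama version running on its default port with the model already pulled (`ollama pull codeqwen:7b-code-v1.5-q5_1`), to confirm the endpoint responds before wiring it into marimo:

```python
import requests

# Probe the same endpoint the suggested config would use.
# Ollama ignores the API key, so no Authorization header is needed.
response = requests.post(
    "http://localhost:11434/v1/chat/completions",
    json={
        "model": "codeqwen:7b-code-v1.5-q5_1",
        "messages": [
            {"role": "user", "content": "Complete: def fibonacci(n):"}
        ],
    },
    timeout=60,
)
response.raise_for_status()
# OpenAI-style response shape: choices[0].message.content
print(response.json()["choices"][0]["message"]["content"])
```

If this returns a completion, the values in the suggested `[completion]` block are all that marimo would need to route completion requests to the local model.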