Pull requests: mudler/LocalAI
#9618 feat(llama-cpp): bump to d775992 and adapt to spec params refactor (opened Apr 29, 2026 by mudler, Owner)
#9617 chore: ⬆️ Update leejet/stable-diffusion.cpp to 3d6064b37ef4607917f8acf2ca8c8906d5087413 (opened Apr 29, 2026 by localai-bot, Collaborator)
#9614 fix(http): honor X-Forwarded-Prefix when proxy strips the prefix (opened Apr 29, 2026 by Dennisadira, Contributor; 2 of 3 tasks)
#9609 chore: ⬆️ Update ggml-org/llama.cpp to d77599234ea6e498775aeadbce665eece5bd98cd (opened Apr 28, 2026 by localai-bot, Collaborator)
#9559 fix(middleware): parse OpenAI-spec tool_choice in /v1/chat/completions [needs-review] (opened Apr 25, 2026 by Anai-Guo, Contributor; 5 tasks)
#9532 feat(backend): add buun-llama-cpp fork (DFlash + TCQ KV-cache) (opened Apr 24, 2026 by mudler, Owner)
#9527 feat(watchdog): add size-aware LRU eviction mode (opened Apr 24, 2026 by SuperMarioYL; 2 of 3 tasks)
#9354 feat(skills): add MiniMax-AI/cli as default skill tap (opened Apr 14, 2026 by octo-patch)
#9223 fix: correct runtime_settings.json loading and add Agent Pool settings [needs-review] (opened Apr 4, 2026 by AlekseyMoyseyuk)
#9148 Fix watchdog_enabled setting persistence [needs-review] (opened Mar 27, 2026 by localai-bot, Collaborator)
#9146 fix: X-Forwarded-Prefix not working in reverse proxy scenarios [needs-review] (opened Mar 27, 2026 by localai-bot, Collaborator)
#8955 fix(openai): always include content field in streaming responses [needs-review] (opened Mar 11, 2026 by localai-bot, Collaborator)
#8806 feat(vllm): add grammar and structured output support (opened Mar 6, 2026 by eureka0928, Contributor; 1 task done)
#8666 feat: add Avian as a cloud LLM inference provider [area/ai-model, dependencies] (opened Feb 27, 2026 by avianion; 7 tasks)
#8404 feat(diffusers): support large models and add Shutdown for dynamic reloading (opened Feb 5, 2026 by JairoGuo)
#8095 feat: add job monitoring endpoint /backends/jobs (opened Jan 18, 2026 by Divyanshupandey007, Contributor)
#7615 feat(ROCm): Allow selecting ROCm version when building llama.cpp backend [dependencies, kind/documentation] (opened Dec 16, 2025 by sredman, Contributor; 1 task done)