Closed
Ollama announced support for IBM Granite: https://x.com/ollama/status/1848223852465213703

I tried to run granite3-moe with ramalama:
```
[vpavlin@vpavlin-tuxedo ~/devel/github.com/vpavlin/ramalama(main) ]
$ ramalama run granite3-moe
Pulling dfc8e4074962e215: 100% ▕####################▏ 1.92G/1.92G 3.23MB/s 00:00
[vpavlin@vpavlin-tuxedo ~/devel/github.com/vpavlin/ramalama(main) ]
$ ramalama run granite3-moe
```
It fails after the download without printing any error log. The latest Ollama works fine with this model.

OS: Ubuntu 23.10
Python: 3.11.6
Ramalama:

```
$ ramalama info
{
    "Engine": "podman",
    "Image": "quay.io/ramalama/ramalama:latest",
    "Runtime": "llama.cpp",
    "Store": "/home/vpavlin/.local/share/ramalama",
    "Version": 0
}
```