Not able to pull this model with Ollama

#1 opened by DhruvStar-AI

Error: 500 Internal Server Error: unable to load model: /Users/dhruv/.ollama/models/blobs/sha256-a2648395d533b6d1408667d00e0b778f3823f3f3179ba371f89355f2e957e42e

This model requires Ollama 0.13.1
https://github.com/ollama/ollama/releases/
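If you're on macOS and installed through Homebrew (an assumption based on the path in the error), upgrading can be as simple as:

    # upgrade Ollama and confirm the new version
    brew upgrade ollama
    ollama -v

Otherwise, grab the latest installer from the releases page above.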

You can run ollama -v to check your version.
You can also run OLLAMA_DEBUG=1 ollama serve before running the model to get more detail on why it's failing.
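For example (the model tag is a placeholder; use whatever you pulled):

    # terminal 1: start the server with debug logging
    OLLAMA_DEBUG=1 ollama serve

    # terminal 2: reproduce the failure and watch the server log for the reason
    ollama run ministral-3:latest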

Actually, the official ministral-3 model is a fused model (GGUF + mmproj), and Ollama isn't yet based on a recent enough llama.cpp build to support the ministral-3 architecture.
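As a stopgap, if you have the split GGUF + mmproj pair, a recent llama.cpp build can load it directly with its multimodal CLI; a rough sketch (file names are placeholders for whatever you downloaded):

    # load the language model plus its vision projector and ask about an image
    llama-mtmd-cli -m ministral-3.gguf --mmproj mmproj.gguf \
        --image photo.jpg -p "Describe this image."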

I had the same issue with qwen3-vl, where importing a GGUF + mmproj pair separately doesn't work even though there is an official model. It's working now, though; I'm running 0.13.3-rc0.
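For reference, the separate-import path is the standard Modelfile flow: the text weights import fine, and the projector is the part that fails on older builds (names and paths here are placeholders):

    # Modelfile
    FROM ./ministral-3.gguf

    # build a local model from it, then run it
    ollama create ministral-3-local -f Modelfile
    ollama run ministral-3-local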
