Help with testing the 0.5.8 pre-release

Hi everyone,

The next version of Ollama overhauls how acceleration libraries are packaged. This adds support for GPU acceleration on CPUs without AVX, as well as on CPUs with AVX2. It also adds AVX512 instruction support for high-end CPUs such as the AMD Threadripper processors.
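
If you're not sure which of these instruction sets your CPU supports, one quick way to check on Linux is to look at the flags the kernel reports in /proc/cpuinfo (this is just a convenience check, not how Ollama itself detects support):

grep -o 'avx[0-9a-z_]*' /proc/cpuinfo | sort -u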

We'd love your help testing it out to work out any kinks with GPU support before we mark it as a final release. You can download it here:

https://github.com/ollama/ollama/releases/tag/v0.5.8-rc11

On Linux, you can run:

curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.8-rc11 sh
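
Once it's installed, you can confirm you're on the release candidate (it should report 0.5.8-rc11):

ollama -v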

For Docker, you can use the following command to pull the new version:

docker pull ollama/ollama:0.5.8-rc11

or for ROCm users:

docker pull ollama/ollama:0.5.8-rc11-rocm
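
After pulling, a typical way to start the container is shown below (adjust the volume, port, and container name to your setup; the first command is for NVIDIA GPUs, the second for AMD GPUs via ROCm):

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:0.5.8-rc11

docker run -d --device /dev/kfd --device /dev/dri -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:0.5.8-rc11-rocm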

If you hit any issues, feel free to DM me or create a GitHub issue and mention that you're on the 0.5.8 RC version. Thanks so much!
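
Including the server logs in your report makes GPU detection problems much easier to track down. Assuming the default systemd service on Linux, or a container named ollama for Docker, you can grab them with:

journalctl -u ollama --no-pager

docker logs ollama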