turkishdelight@lemmy.ml to LocalLLaMA@sh.itjust.works · English · 8 months ago
Ollama now supports AMD graphics cards (ollama.com)
But in all fairness, it’s really llama.cpp that supports AMD. Now looking forward to the Vulkan support!
sardaukar@lemmy.world · English · 8 months ago
I've been using it with a 6800 for a few months now, all it needs is a few env vars.
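For a 6800-class card that usually means the standard ROCm variables. A minimal sketch, assuming a gfx1030-class RDNA2 GPU and a single-GPU box (the exact values depend on your card's gfx target and ROCm version, so treat these as placeholders, not an official recipe):

```sh
# Minimal sketch, not an official recipe: typical ROCm env vars for an RDNA2 card.
export HSA_OVERRIDE_GFX_VERSION=10.3.0  # assumes the gfx1030 target used by the RX 6800 family
export HIP_VISIBLE_DEVICES=0            # assumes the first GPU is the one Ollama should use
ollama serve
```

If Ollama runs as a systemd service or in Docker, the same variables go into the unit's environment or the container's env instead of the shell.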