just_another_person@lemmy.world to Linux@lemmy.world · English · 2 months ago
AMD Announces "Instella" Fully Open-Source 3B Language Models (www.phoronix.com)
cross-posted to: [email protected]
brokenlcd@feddit.it · 2 months ago
The problem is… how do we run it if ROCm is still a mess for most of their GPUs? CPU time?
swelter_spark@reddthat.com · 6 days ago
There are ROCm versions of llama.cpp, ollama, and kobold.cpp that work well, although they'll have to add support for this model before they can run it.
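For reference, a ROCm build of llama.cpp uses its HIP backend. A rough sketch of the usual build-and-run flow follows; the exact CMake flag names, the GPU target (gfx1100 is for RDNA3 cards), and the model path are illustrative and vary by llama.cpp version and hardware:

```shell
# Sketch: build llama.cpp with the HIP (ROCm) backend.
# Flag names have changed across versions; -DGGML_HIP=ON is the current spelling.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1100
cmake --build build --config Release -j

# Run a GGUF model, offloading all layers to the GPU with -ngl 99.
# An Instella GGUF would only work once llama.cpp gains support for the architecture.
./build/bin/llama-cli -m ./models/some-model.gguf -ngl 99 -p "Hello"
```

As the comment notes, none of this helps with Instella until the model architecture itself is supported upstream; until then, the build above only runs already-supported GGUF models.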