corbin@infosec.pub to Lemmy Shitpost@lemmy.world · English · 8 days ago
can't beat the classics (image, infosec.pub)
lmuel@sopuli.xyz · 8 days ago
Well, in some ways they are. It also depends a lot on the hardware you have, of course. A normal 16 GB GPU won't fit huge LLMs. The smaller ones are getting impressively good at some things, but a lot of them still struggle with non-English languages, for example.
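A rough back-of-the-envelope sketch of the "16 GB won't fit huge LLMs" point, as my own illustration rather than anything from the thread: weight memory is roughly parameter count × bits per weight ÷ 8, and the parameter sizes and 16 GiB budget below are just example assumptions. Real usage also needs room for the KV cache and activations, so these are lower bounds.

```python
# Illustrative estimate only: VRAM needed just to hold the weights.
GIB = 1024 ** 3

def weight_vram_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate GiB required for the model weights alone."""
    return params_billions * 1e9 * bits_per_weight / 8 / GIB

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    for bits in (16, 8, 4):
        need = weight_vram_gib(params, bits)
        verdict = "fits" if need <= 16 else "does not fit"
        print(f"{name} @ {bits}-bit: ~{need:.1f} GiB -> {verdict} in 16 GiB")
```

By this estimate a 7B model fits comfortably even at 16-bit, while a 70B model is out of reach on a 16 GiB card even at 4-bit quantization.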