offline Wikipedia on a phone vs llama which needs a PC:
wiki needs:
20-30 GB for the text version
can fit on most phones, otherwise use an SD card or USB storage (see the reading sketch after this comparison)
isn’t full of AI slop
Llama 3.1 8B Requirements:
CPU: Modern processor with at least 8 cores
RAM: Minimum of 16 GB recommended
GPU: NVIDIA RTX 3090
doesn’t run on Android or iOS
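For anyone wondering what the wiki side actually looks like in practice: the text version is typically a Kiwix ZIM dump, which apps can read fully offline. A minimal sketch, assuming the python-libzim package and a downloaded dump; the file name and article path are placeholders, and sizes and paths depend on which dump you grab:

```python
# Sketch of reading an offline Wikipedia dump (Kiwix ZIM format) with python-libzim.
# The file name and article path below are placeholders; real dumps differ.
from libzim.reader import Archive
from libzim.search import Query, Searcher

zim = Archive("wikipedia_en_all_nopic.zim")  # placeholder file name

# Full-text search runs entirely offline against the index inside the ZIM.
search = Searcher(zim).search(Query().set_query("photosynthesis"))
for path in search.getResults(0, 5):  # first five matching entry paths
    print(path)

# Read one article's HTML by path (the path layout differs between dumps).
entry = zim.get_entry_by_path("Photosynthesis")
html = bytes(entry.get_item().content).decode("UTF-8")
print(html[:200])
```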
also keep in mind most people outside of tech/gamer bros can’t afford a PC that can run it. most people can afford an Android phone
my PC is a refurbished second-hand dinosaur from 2019 and it can do this fine, no gpu. And you don’t need good specs at all other than the RAM being enough to load the model into memory, which in this case is 5 gigs.
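That 5-gig figure lines up with a 4-bit quantized GGUF of the 8B model running through llama.cpp. A minimal CPU-only sketch using the llama-cpp-python bindings, assuming you’ve already downloaded such a quant (the model file name is a placeholder):

```python
# CPU-only inference sketch with llama-cpp-python; no GPU involved.
# The model file name is a placeholder for whatever Q4 GGUF you downloaded
# (a 4-bit quant of an 8B model is roughly 5 GB on disk and in RAM).
from llama_cpp import Llama

llm = Llama(
    model_path="Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,    # context window
    n_threads=4,   # whatever the old CPU has
)

out = llm("Q: Who wrote The Origin of Species?\nA:", max_tokens=32, stop=["\n"])
print(out["choices"][0]["text"])
```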
“doesn’t run on Android or iOS”
You sure about androids?
Anyway if llama 3.1 is too big, just use qwen3:0.6b or something small
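(On the Android question: llama.cpp-based runtimes do build and run on Android, e.g. inside Termux, RAM permitting.) For the small-model route, a sketch using the ollama Python client, assuming a local Ollama server is running and the tag from the message above has already been pulled:

```python
# Sketch of talking to a small local model through the ollama Python client.
# Assumes an Ollama server is running locally and `ollama pull qwen3:0.6b`
# has already been done; the model tag comes from the message above.
import ollama

resp = ollama.chat(
    model="qwen3:0.6b",
    messages=[{"role": "user", "content": "Explain photosynthesis in two sentences."}],
)
print(resp["message"]["content"])
```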
Also Wikipedia doesn’t contain all knowledge of the internet, but this stuff was trained on everything
at least Wikipedia won’t waste your time with misinformation. llama could be trained on the entirety of human history for all I care, doesn’t matter one bit if it can’t provide accurate sources and facts
When you don’t have internet access, which is the use case I was talking about, you don’t have sources other than what you’ve downloaded. If you can’t check the sources, then effectively there are none.
congrats you just discovered offline Wikipedia
what if you don’t have enough space for all 150 gigabytes? llama 3.1 8b fits into 8 gigs
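For what it’s worth, the size figures on both sides follow from the same back-of-the-envelope arithmetic: parameter count times bits per weight. A quick sketch, treating the 8B model as roughly 8 billion parameters and ignoring runtime overhead (KV cache, tokenizer, etc.), so real numbers run a bit higher:

```python
# Rough model-size arithmetic: parameters * bits-per-weight / 8 = bytes.
# Treats Llama 3.1 8B as ~8e9 parameters and ignores small overheads.
params = 8e9

for name, bits in [("fp16", 16), ("8-bit quant", 8), ("4-bit quant", 4)]:
    gb = params * bits / 8 / 1e9
    print(f"{name}: ~{gb:.0f} GB")

# fp16: ~16 GB        -> the "16 GB RAM minimum" quoted above
# 8-bit quant: ~8 GB  -> the "fits into 8 gigs" figure
# 4-bit quant: ~4 GB  -> why ~5 GB of RAM is enough on an old CPU-only box
```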