AI should just die. It’s not something we need in our lives.

  • amino@lemmy.blahaj.zone · 3 days ago

    offline Wikipedia on a phone vs. Llama, which needs a PC:

    Wikipedia:

    1. needs 20-30 GB for the text version
    2. fits on most phones; otherwise use an SD card or USB storage
    3. isn’t full of AI slop

    Llama 3.1 8B requirements:

    1. CPU: modern processor with at least 8 cores
    2. RAM: 16 GB minimum recommended
    3. GPU: NVIDIA RTX 3090
    4. doesn’t run on Android or iOS

    Also keep in mind that most people outside of tech/gamer bros can’t afford a PC that can run it. Most people can afford an Android phone.

    • untakenusername@sh.itjust.works · 3 days ago

      My PC is a refurbished second-hand dinosaur from 2019 and it runs this fine, with no GPU. You don’t need good specs at all beyond enough RAM to load the model into memory, which in this case is about 5 GB.
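A rough way to sanity-check that "5 gigs" figure: estimate the weight footprint from parameter count and quantization width. This is a back-of-the-envelope sketch (the function name and the numbers plugged in are my own illustration, not anything from the thread):

```python
def model_ram_gb(params: float, bits_per_weight: float) -> float:
    """Rough RAM needed just to hold the weights, in decimal GB."""
    return params * bits_per_weight / 8 / 1e9

# Llama 3.1 8B at full fp16 precision: ~16 GB of weights alone,
# which is where the beefy-PC requirements come from.
fp16 = model_ram_gb(8e9, 16)

# The same model 4-bit quantized: ~4 GB of raw weights. With the KV
# cache and runtime overhead on top, it lands near the ~5 GB mentioned
# above, which is why it fits on a modest CPU-only machine.
q4 = model_ram_gb(8e9, 4)

print(f"fp16: {fp16:.1f} GB, 4-bit quantized: {q4:.1f} GB")
```

The point of the arithmetic: quantization, not GPU horsepower, is what makes an 8B model fit in ordinary laptop RAM.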

      doesn’t run on Android or iOS

      You sure about Android?

      Anyway, if Llama 3.1 is too big, just use qwen3:0.6b or something small.

      Also, Wikipedia doesn’t contain all the knowledge on the internet, but these models were trained on everything.

      • amino@lemmy.blahaj.zone · 3 days ago

        At least Wikipedia won’t waste your time with misinformation. Llama could be trained on the entirety of human history for all I care; it doesn’t matter one bit if it can’t provide accurate sources and facts.

        • untakenusername@sh.itjust.works · 3 days ago

          …it can’t provide accurate sources and facts

          When you don’t have internet access, which is the use case I was talking about, you don’t have sources other than what you’ve downloaded. If you can’t check the sources, then effectively there are none.