AI should just die. It’s not something we need in our lives.

  • untakenusername@sh.itjust.works · 3 days ago

    My PC is a refurbished second-hand dinosaur from 2019 and it runs this fine, no GPU. You don’t need good specs at all, other than enough RAM to load the model into memory, which in this case is about 5 GB.
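
    Rough back-of-the-envelope for where that 5 GB comes from, assuming an ~8B-parameter model at roughly 4-bit quantization (the parameter count, bits-per-weight, and overhead below are my assumptions; actual file sizes vary by quant format):

    ```python
    # Back-of-the-envelope RAM estimate for a quantized local model.
    # Assumption: ~8B parameters at ~4-bit quantization plus some runtime overhead.
    params = 8e9          # parameter count (assumed 8B-class model)
    bits_per_param = 4.5  # ~4-bit weights plus quantization bookkeeping
    overhead_gb = 0.5     # rough allowance for KV cache and runtime buffers

    weights_gb = params * bits_per_param / 8 / 1e9
    print(f"~{weights_gb + overhead_gb:.1f} GB of RAM")  # prints ~5.0 GB
    ```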

    “doesn’t run on Android or iOS”

    You sure about Android?

    Anyway, if Llama 3.1 is too big, just use qwen3:0.6b or something similarly small.
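
    If you want to script it instead of using the CLI, here’s a minimal sketch, assuming the `ollama` Python package and a local Ollama install with the model already pulled (the model tag and prompt are just examples):

    ```python
    # Minimal offline chat with a small local model through Ollama.
    # Assumes the model was pulled beforehand (e.g. `ollama pull qwen3:0.6b`)
    # while internet was still available; after that, no connection is needed.
    import ollama

    response = ollama.chat(
        model="qwen3:0.6b",  # swap in llama3.1 if you have the RAM for it
        messages=[{"role": "user", "content": "Explain how a lead-acid battery works."}],
    )
    print(response["message"]["content"])
    ```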

    Also, Wikipedia doesn’t contain all the knowledge on the internet, but this stuff was trained on pretty much everything.

    • amino@lemmy.blahaj.zone · 3 days ago

      at least Wikipedia won’t waste your time with misinformation. Llama could be trained on the entirety of human history for all I care; it doesn’t matter one bit if it can’t provide accurate sources and facts.

      • untakenusername@sh.itjust.works · 3 days ago

        “…it can’t provide accurate sources and facts”

        When you don’t have internet access, which is the use case I was talking about, you don’t have sources other than what you’ve already downloaded. If you can’t check the sources, then effectively there are none.