• Kongar@lemmy.dbzer0.com

    I think local compute will kill these huge data centers for AI. It’s amazing what you can do with free tools like Ollama or RAG agents built in n8n, even on a business laptop with only 16GB of RAM. If you’ve got a 4090 at home in your gaming PC and some big RAM sticks - well, you’d be surprised at what some models can do (and how quickly they respond).
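
    To give a sense of how little is involved: once Ollama is installed and serving on its default port, querying a local model is a few lines of Python. This is a minimal sketch, assuming you’ve already pulled a model with `ollama pull` - the model name and prompt are just placeholders:

    ```python
    # Minimal sketch: query a locally hosted model through Ollama's HTTP API.
    # Assumes the Ollama server is running on its default port (11434) and a
    # model such as "llama3.1:8b" has already been pulled (placeholder name).
    import requests

    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.1:8b",   # whatever model you pulled locally
            "prompt": "Explain what a RAG pipeline is in two sentences.",
            "stream": False,          # return one complete response, not a stream
        },
        timeout=120,
    )
    print(response.json()["response"])
    ```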

    You all know how the internet works - in a short time someone’s going to put together a free tool that’s as easy as “click this button to install” and it’ll do 80% of what ChatGPT can do, i.e. probably enough for the average user - for free.

    So how are they going to recoup all these billions spent on data centers if people’s personal computers can mostly do the same thing? How do they monetize your information and sell you ads if it’s all done locally? Go download one and ask it questions - sure, it’s not perfect, but it’s surprisingly good for a locally hosted model.

    I think the people spending these billions are starting to realize that… Meanwhile, I think this keeps video card prices high, unfortunately.

    • Bob Robertson IX@lemmy.world

      Unfortunately I think most businesses will still prefer that their AI solution is hosted by a company like OpenAI rather than maintaining their own. There’s still going to be a need for these large data centers, but I do hope most people realize that hosting your own LLM isn’t that difficult, and it doesn’t cost you your privacy.
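
      It isn’t even a big change for existing tooling, since Ollama also exposes an OpenAI-compatible endpoint. A rough sketch, assuming a default local install and a placeholder model name:

      ```python
      # Sketch: point the standard OpenAI client at a local Ollama server
      # instead of a hosted service. Assumes Ollama is running locally and a
      # model has already been pulled; nothing leaves your machine.
      from openai import OpenAI

      client = OpenAI(
          base_url="http://localhost:11434/v1",  # local Ollama, OpenAI-compatible API
          api_key="ollama",                      # required by the client, unused locally
      )

      reply = client.chat.completions.create(
          model="llama3.1:8b",  # placeholder model name
          messages=[{"role": "user", "content": "Summarize these meeting notes."}],
      )
      print(reply.choices[0].message.content)
      ```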