• Daxtron2@startrek.website
    1 year ago

    Then you haven’t been paying attention. There have been huge strides in the field of small open language models, which can now run inference locally on a phone with low enough power consumption.