

it was, it’s just that they have officially released a 2B model trained for the BitNet architecture
Nice to know. Thanks.
Same, I have an HDD from 2012 with my childhood memories on it. The first thing I’m going to do when I start earning is get it repaired by a reputable service.
Everything was. Is …
Whyyy???
Welcome here!
I think the bigger bottleneck is SLAM. Running that is intensive, and it won’t run directly on video. SLAM is tough, I guess; reading the repo doesn’t give any clues that it can run on CPU inference.
There is a repo they released.
It will, they have released a repo with code.
I mean, I didn’t see any pressing need for a Google Docs alternative, so I might actually be living under a rock.
taste of his own medicine
I checked out most of them from the list, but 1B models are generally unusable for RAG.
I use PageAssist with Ollama.
We can use the same test name as proposed by a user in the original post’s comment: Odd-straw-in-the-haystack :)
Yeahh, I often think about how many amazing things we eat almost every day that the rest of the world misses out on. Glad to see someone enjoying niche stuff haha
Yeah it’s all propaganda, I like bashing a keyboard’s keys due to sexual reasons.
Ayy, that’s nice. LLMs are truly overkill just for semantic search, though; I didn’t know there were other ways to achieve this. But we need intelligence too, right? (Somewhat.)
I’m not the smartest out there to explain it, but it’s like this: instead of floating-point numbers as the weights, it’s just -1, 0, 1.
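A minimal sketch of the idea, assuming the "absmean" ternary quantization described in the BitNet b1.58 paper (the helper name and the toy weights below are my own illustration, not code from any released repo): each weight is divided by the mean absolute weight of the tensor, rounded, and clamped so only -1, 0, or 1 remains.

```python
import numpy as np

def ternary_quantize(w):
    # Absmean quantization sketch: scale by the mean absolute weight,
    # then round and clamp so every weight becomes -1, 0, or 1.
    scale = np.mean(np.abs(w)) + 1e-8  # epsilon avoids divide-by-zero
    q = np.clip(np.round(w / scale), -1, 1)
    return q.astype(np.int8), scale

# Toy example weights (illustrative only)
w = np.array([0.9, -0.05, -1.2, 0.4])
q, s = ternary_quantize(w)
# q is [1, 0, -1, 1]: only ternary values remain
```

Since the quantized weights are ternary, the matrix multiplies in a forward pass reduce to additions and subtractions (no floating-point multiplies), which is where the efficiency claim comes from.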