Microsoft researchers build 1-bit AI LLM with 2B parameters — model small enough to run on some CPUs
(www.tomshardware.com)
Rather than CPUs, I think these are a much bigger deal for GPUs, where memory is far more expensive. I can get 128 GB of RAM for 300 CAD; the same amount of VRAM would cost several grand.
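The memory savings behind this comment can be sketched with quick arithmetic. A rough estimate only: the ~1.58 bits/weight figure assumes a BitNet-style ternary weight set {-1, 0, +1} (log2(3) bits per weight), which is what the headline's "1-bit" shorthand is generally understood to mean; the article itself doesn't spell out the packing.

```python
import math

# Back-of-envelope weight footprint for a 2B-parameter model
# at different precisions. Assumes weights only (no KV cache,
# activations, or runtime overhead).
PARAMS = 2e9  # 2 billion parameters

def weight_gib(bits_per_weight: float) -> float:
    """Weight storage in GiB for a given bits-per-weight."""
    return PARAMS * bits_per_weight / 8 / 2**30

for name, bpw in [("fp16", 16), ("int8", 8),
                  ("int4", 4), ("ternary", math.log2(3))]:
    print(f"{name:>8}: {weight_gib(bpw):5.2f} GiB")
```

At ~1.58 bits/weight the 2B model's weights fit in well under half a GiB, versus roughly 3.7 GiB at fp16, which is why it comfortably fits in ordinary system RAM.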
For a second, I was wondering why you'd need 300 instances of CAD software, and whether 128 GB isn't a bit too small for that ludicrous amount of computer-aided design.