this post was submitted on 15 Nov 2024
94 points (99.0% liked)
Futurology
Ok, so if you want to run a local LLM on your desktop, use your GPU. If you're doing that on a laptop in a cafe, get a laptop with an NPU, since it's far more power-efficient on battery. If you don't care about either case, you don't need to think about these AI PCs.
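That decision logic is simple enough to sketch. This is just an illustrative toy, not any real framework's API: the `pick_accelerator` helper and its inputs (`has_gpu`, `has_npu`, `on_battery`) are made-up names standing in for whatever device detection your inference stack actually does.

```python
def pick_accelerator(has_gpu: bool, has_npu: bool, on_battery: bool) -> str:
    """Toy device-selection heuristic mirroring the advice above.

    Hypothetical example: real stacks (llama.cpp, ONNX Runtime, etc.)
    detect devices themselves; the flags here are placeholders.
    """
    # On battery (laptop in a cafe), prefer the NPU for power efficiency.
    if on_battery and has_npu:
        return "npu"
    # Plugged in (desktop), prefer the GPU for raw throughput.
    if has_gpu:
        return "gpu"
    # An NPU still beats plain CPU inference if it's all you have.
    if has_npu:
        return "npu"
    return "cpu"

print(pick_accelerator(has_gpu=True, has_npu=False, on_battery=False))  # gpu
print(pick_accelerator(has_gpu=True, has_npu=True, on_battery=True))    # npu
```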
Or use a laptop with a GPU? An NPU seems to be just slightly upgraded onboard graphics.