this post was submitted on 28 Aug 2025
Conservatives
I think "server" is rapidly becoming GPU and "user" is becoming power-efficient (but still fast) Arm chips.
I wouldn’t want to go down with the Xeon and desktop/laptop ship.
I know how critical I sound, and there's a case for Intel making sensitive (read: defense) chips domestically, but they need a good kick in the pants, not sweetheart deals.
On the contrary, I think inference is going on-device more in the future, but ‘users’ will still need decent CPUs and GPUs. Intel is well set up for this: they have good CPU, GPU, and NPU IP.
Intel can go Arm if they want, no problem, just like AMD can (and almost did). They could theoretically preserve most of their core design and still switch ISA.
Servers will still need CPUs for a long time.
As for GPU compute, we are both in a bubble, and at several forks in the road:
Is BitNet-style ML going to take off? If it does, that shifts the advantage to almost cryptominer-like ASICs, since expensive matrix multiplication no longer matters for inference.
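To make that point concrete, here's a minimal sketch (not any real BitNet implementation) of why ternary weights kill the multiplier: with weights restricted to {-1, 0, +1}, a matrix-vector product collapses into pure additions and subtractions, which is the kind of operation cheap fixed-function silicon eats for breakfast.

```python
# Hedged sketch: BitNet-style ternary weights in {-1, 0, +1} mean a
# matvec needs no multiplications at all -- only adds and subtracts.
# Function and variable names here are illustrative, not from any library.

def ternary_matvec(weights, x):
    """weights: rows of ternary values in {-1, 0, 1}; x: activation vector."""
    out = []
    for row in weights:
        acc = 0.0
        for w, xi in zip(row, x):
            if w == 1:
                acc += xi      # add instead of multiply
            elif w == -1:
                acc -= xi      # subtract instead of multiply
            # w == 0 contributes nothing (free sparsity)
        out.append(acc)
    return out

W = [[1, 0, -1],
     [-1, 1, 1]]
x = [2.0, 3.0, 5.0]
print(ternary_matvec(W, x))  # [-3.0, 6.0]
```

An adder is far smaller and cheaper in area and power than a floating-point multiplier, which is why inference hardware for such models could look more like a crypto ASIC than a GPU.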
Otherwise, what about NPUs? Huawei is already using them, and even training good production models with them. Intel can try their hand at this game again if loads start shifting away from CUDA.
Otherwise, they still have a decent shot at the CUDA ecosystem via ZLUDA and their own frameworks. Training and research will probably forever be Nvidia (and some niches like Cerebras), but still.
Maybe I’ve been in a silo but I’ve never heard “intel” and “npu” in the same breath.
Because it's only used for crappy copilot stuff right now, heh.
But technically the Gaudi accelerators they picked up with Habana Labs and kept developing are server 'NPUs'. Last I heard, they're putting them on ice, but they may have integrated the tech into laptop processors anyway, and could scale it back up if Huawei's NPUs take off.