this post was submitted on 03 Sep 2025
119 points (92.2% liked)

PC Gaming

[–] zqwzzle@lemmy.ca 77 points 2 days ago (3 children)

It’s the fucking AI tech bros

[–] MystikIncarnate@lemmy.ca 2 points 1 day ago

Microsoft.

Microsoft is buying them for AI.

From what I understand, ChatGPT is running on Azure servers.

[–] Tinidril@midwest.social 41 points 2 days ago (1 children)

Don't forget the crypto scammers.

[–] 9488fcea02a9@sh.itjust.works 5 points 2 days ago (2 children)

GPU mining hasn't been profitable for many years now.

People just keep parroting anti-crypto talking points for years without actually knowing what's going on.

To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.

[–] D06M4@lemmy.zip 2 points 2 days ago

Most people are buying Nvidia because that's what's commonly recommended on reviews. "Want to use AI? Buy Nvidia! Want the latest DX12+ support? Buy Nvidia! Want to develop videogames or encode video? Buy Nvidia! Want to upgrade to Windows 11? Buy Nvidia!" Nonstop Nvidia adverts everywhere, with tampered benchmarks and whatnot. Other brands' selling points aren't well known and the general notion is that if it's not Nvidia it sucks.

[–] Tinidril@midwest.social 0 points 2 days ago (1 children)

Profitability of Bitcoin mining is dependent on the value of Bitcoin, which has more than doubled in the last 12 months. It's true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware depend on a lot of the same limited resources, including fabs.

You are right that crypto doesn't drive the GPU market like it used to in the crypto boom, but I think you are underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.

[–] Taldan@lemmy.world 9 points 2 days ago* (last edited 2 days ago) (1 children)

Profitability of Bitcoin mining is dependent on the value of Bitcoin

No it isn't. It's driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.

ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.

A massive Bitcoin spike would not affect the GPU market in any appreciable way

Crypto mining is pretty dumb, but misinformation helps no one

[–] Tinidril@midwest.social 2 points 1 day ago

ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, both in expertise and in high-quality materials.

You are wrong about the market value of Bitcoin's impact on the profitability of Bitcoin mining.

https://www.investopedia.com/articles/forex/051115/bitcoin-mining-still-profitable.asp
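The price dependence is straightforward arithmetic. A rough sketch, using made-up illustrative numbers for the rig and network (not real stats):

```python
# Rough sketch of why mining profit depends on the coin's price.
# All numbers below are illustrative assumptions, not real network stats.

def daily_profit_usd(hashrate_ths, network_ths, block_reward_btc,
                     btc_price_usd, power_kw, electricity_usd_per_kwh):
    """Expected daily mining profit for one rig, in USD."""
    blocks_per_day = 144  # Bitcoin targets ~one block every 10 minutes
    share = hashrate_ths / network_ths          # fraction of network hashrate
    revenue = share * blocks_per_day * block_reward_btc * btc_price_usd
    cost = power_kw * 24 * electricity_usd_per_kwh
    return revenue - cost

# Same rig, same network, same power bill: doubling the price
# flips the operation from a daily loss to a daily profit.
low = daily_profit_usd(200, 600_000_000, 3.125, 30_000, 3.5, 0.10)   # ~ -3.90
high = daily_profit_usd(200, 600_000_000, 3.125, 60_000, 3.5, 0.10)  # ~ +0.60
```

Hashrate, reward, and difficulty held fixed, the price term scales revenue linearly while costs stay flat, which is why price swings decide profitability at the margin.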

Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for others. Some miners (especially the most scammy ones) choose GPUs for the flexibility to switch coins at will. That doesn't change the fact that ASICs now dominate, but GPUs do still have a share, especially for some of the newer scam coins.

[–] brucethemoose@lemmy.world 12 points 2 days ago* (last edited 2 days ago) (2 children)

Not as many as you’d think. The 5000 series is not great for AI because they have like no VRAM for the price.

4x3090 or 3060 homelabs are the standard, heh.

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

Who the fuck buys a consumer GPU for AI?

If you're not doing it in a home lab, you'll need more juice than anything an RTX 3000/4000/5000/whatever-000 series could have.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

Who the fuck buys a consumer GPU for AI?

Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.

I can (just barely) run GLM-4.5 on a single 3090 desktop.
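For context on why VRAM is the sticking point, here's a back-of-envelope sketch. All model and hardware numbers are illustrative assumptions (a hypothetical quantized model on a 24 GB 3090-class card), not real specs:

```python
# Back-of-envelope check of whether a quantized model fits in VRAM,
# and how many layers would spill over to CPU RAM if it doesn't.
# Model sizes, layer counts, and overhead factor are illustrative guesses.

def gb_needed(params_b, bits_per_weight, overhead=1.2):
    """Approximate memory for the weights, plus ~20% headroom for
    KV cache and activations (a rough rule of thumb, not exact)."""
    return params_b * bits_per_weight / 8 * overhead

def layers_on_gpu(total_gb, n_layers, vram_gb):
    """How many of n_layers fit in vram_gb if memory is spread evenly."""
    per_layer = total_gb / n_layers
    return min(n_layers, int(vram_gb // per_layer))

# A hypothetical 32B-parameter model at 4-bit: ~19.2 GB -> fits on 24 GB.
need = gb_needed(32, 4)
# A hypothetical 106B-parameter MoE at 4-bit: ~63.6 GB -> doesn't fit,
# so most of its (assumed) 46 layers get offloaded to CPU RAM.
big = gb_needed(106, 4)
fit = layers_on_gpu(big, 46, 24)
```

This is why MoE models are attractive for this setup: only a fraction of the weights are active per token, so the layers parked in CPU RAM hurt throughput far less than they would for a dense model of the same size.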

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

.... Yeah, for yourself.

I'm referring to anyone running an LLM for commercial purposes.

Y'know, 80% of Nvidia's business?

[–] brucethemoose@lemmy.world 1 points 1 day ago (1 children)

I've kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

I guess my original point was agreement: the 5000 series is not great for 'AI', not like everyone makes it out to be, to the point where folks who can't drop $10K on a GPU are picking up older cards instead. But if you look at download stats for these models, there is interest in running stuff locally instead of ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.

[–] MystikIncarnate@lemmy.ca 1 points 20 hours ago (1 children)

The original post is about Nvidia's domination of discrete GPUs, not consumer GPUs.

So I'm not limiting myself to people running an LLM on their personal desktop.

That's what I was trying to get across.

And it's right on point for the original material.

[–] brucethemoose@lemmy.world 1 points 20 hours ago (1 children)

I'm not sure the bulk of datacenter cards count as 'discrete GPUs' anymore, and they aren't counted in that survey. They're generally sold socketed into 8P servers with crazy interconnects, hyper specialized to what they do. Nvidia does sell some repurposed gaming silicon as a 'low end' PCIe server card, but these don't get a ton of use compared to the big silicon sales.

[–] MystikIncarnate@lemmy.ca 1 points 18 hours ago (1 children)

I wouldn't be surprised in the slightest if they are included in the list. I dunno; I'm not the statistician who crunched the numbers here. I didn't collect the data, and the source material isn't available for me to examine.

What I can say is that the article specifies "discrete" GPUs instead of just "GPUs" to exclude all the iGPUs. Intel dominates that space along with AMD, but it's hard to make an iGPU when you don't make CPUs, and the two largest CPU manufacturers make their own.

The overall landscape of the GPU market is very different than what this data implies.

[–] brucethemoose@lemmy.world 1 points 16 hours ago* (last edited 16 hours ago)

Well, it’s no mystery:

https://www.jonpeddie.com/news/q225-pc-graphics-add-in-board-shipments-increased-27-0-from-last-quarter/

It’s specifically desktop addin boards:

AMD’s RX 9070 XT and RX 9070 represent AMD’s new RDNA 4 architecture, competing with Nvidia’s midrange offerings. Nvidia introduced two new Blackwell-series AIBs: the GeForce RTX 5080 Super and the RTX 5070. The company also announced the RTX 500 workstation AIB. Rumors have persisted about two new AIBs from Intel, including a dual-GPU model.

It is including workstation cards like the Blackwell Pro. But this is clearly not including server silicon like the B200, H200, MI325X and so on, otherwise they would have mentioned updates. They are not AIBs.

I hate to obsess over such a distinction, but it’s important: server sales are not skewing this data, and workstation sales volumes are pretty low. It’s probably an accurate chart for gaming GPUs.

[–] zqwzzle@lemmy.ca 23 points 2 days ago (1 children)

Their data centre division is pulling in $41 billion in revenue vs. $4 billion from the consumer market.

https://finance.yahoo.com/news/nvidia-q2-profit-soars-59-021402431.html

[–] brucethemoose@lemmy.world 7 points 2 days ago (1 children)

Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.

[–] zqwzzle@lemmy.ca 8 points 2 days ago (1 children)

Does the original title mention home setups?

[–] brucethemoose@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

It mentions desktop GPUs, which are not part of this market share survey.

Basically I don't see what the server market has to do with desktop dGPU market share. Why did you bring that up?