this post was submitted on 03 Sep 2025
119 points (92.2% liked)

PC Gaming

12208 readers
512 users here now

For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments, within reason.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)

founded 2 years ago
[–] 9tr6gyp3@lemmy.world 87 points 2 days ago (9 children)

Who the hell keeps buying nvidia? Stop it.

[–] GaMEChld@lemmy.world 3 points 1 day ago

The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the GeForce box with the biggest number within their budget. LTT even did a breakdown at some point showing that even their most-watched reviews have little to no impact on sales numbers. Nvidia has the mind share; in a lot of people's minds, GeForce = graphics. And I say all that as someone who is currently on a Radeon 7900 XTX. I'd be sad to see AMD and Intel quit the dGPU space, but I wouldn't be surprised.

[–] zqwzzle@lemmy.ca 77 points 2 days ago (3 children)

It’s the fucking AI tech bros

[–] MystikIncarnate@lemmy.ca 2 points 1 day ago

Microsoft.

Microsoft is buying them for AI.

From what I understand, ChatGPT runs on Azure servers.

[–] Tinidril@midwest.social 41 points 2 days ago (1 children)

Don't forget the crypto scammers.

[–] 9488fcea02a9@sh.itjust.works 5 points 2 days ago (2 children)

GPU mining hasn't been profitable for many years now.

People have been parroting anti-crypto talking points for years without actually knowing what's going on.

To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.

[–] D06M4@lemmy.zip 2 points 2 days ago

Most people buy Nvidia because that's what reviews commonly recommend. "Want to use AI? Buy Nvidia! Want the latest DX12+ support? Buy Nvidia! Want to develop video games or encode video? Buy Nvidia! Want to upgrade to Windows 11? Buy Nvidia!" Nonstop Nvidia adverts everywhere, with tampered benchmarks and whatnot. Other brands' selling points aren't well known, and the general notion is that if it's not Nvidia, it sucks.

[–] Tinidril@midwest.social 0 points 2 days ago (1 children)

Profitability of Bitcoin mining depends on the value of Bitcoin, which has more than doubled in the last 12 months. It's true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware compete for a lot of the same limited resources, including fabs.

You're right that crypto doesn't drive the GPU market like it did during the crypto boom, but I think you're underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.

[–] Taldan@lemmy.world 9 points 2 days ago* (last edited 2 days ago) (1 children)

Profitability of Bitcoin mining is dependent on the value of Bitcoin

No it isn't. It's driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.

ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and Ethereum mining is no longer a thing since it moved to proof of stake.

A massive Bitcoin spike would not affect the GPU market in any appreciable way

Crypto mining is pretty dumb, but misinformation helps no one

[–] Tinidril@midwest.social 2 points 1 day ago

ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, in both expertise and high-quality materials.

You are wrong about the market value of Bitcoin's impact on the profitability of Bitcoin mining.

https://www.investopedia.com/articles/forex/051115/bitcoin-mining-still-profitable.asp

Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for others. Some miners (especially the most scammy ones) want the flexibility to switch coins at will. That doesn't change the fact that ASICs now dominate, but GPUs do still have a share, especially among some of the newer scam coins.
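
To make the price dependence concrete, here's a back-of-the-envelope sketch; every constant is an assumption for illustration, not a live network stat. A rig's BTC income is pinned by its hashrate share, but revenue is fiat, so it scales with the BTC price while the electricity bill does not:

```python
# Rough sketch of daily Bitcoin mining profit. Every constant below is an
# assumed illustration, not a current network figure.
MINER_HASHRATE = 200e12        # 200 TH/s, a typical modern ASIC (assumed)
NETWORK_HASHRATE = 600e18      # 600 EH/s total network hashrate (assumed)
BLOCKS_PER_DAY = 144           # roughly one block every 10 minutes
BLOCK_REWARD_BTC = 3.125       # post-2024-halving block subsidy
POWER_KW, KWH_PRICE = 3.5, 0.08  # assumed rig draw (kW) and electricity ($/kWh)

def daily_profit(btc_price: float) -> float:
    # Expected BTC mined per day is fixed by your share of network hashrate...
    btc_mined = (MINER_HASHRATE / NETWORK_HASHRATE) * BLOCKS_PER_DAY * BLOCK_REWARD_BTC
    revenue = btc_mined * btc_price      # ...but fiat revenue scales with price,
    cost = POWER_KW * 24 * KWH_PRICE     # while the power bill stays flat.
    return revenue - cost

for price in (30_000, 60_000, 120_000):
    print(f"BTC at ${price:,}: ${daily_profit(price):+.2f}/day")
```

With these made-up numbers, doubling the price flips the same rig from a daily loss to a profit, which is exactly the dependence the Investopedia piece describes.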

[–] brucethemoose@lemmy.world 12 points 2 days ago* (last edited 2 days ago) (2 children)

Not as many as you'd think. The 5000 series is not great for AI because the cards have basically no VRAM relative to their price.

4x3090 or 3060 homelabs are the standard, heh.

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

Who the fuck buys a consumer GPU for AI?

If you're not doing it in a home lab, you'll need more juice than anything an RTX 3000/4000/5000/whatever-000 series could offer.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

Who the fuck buys a consumer GPU for AI?

Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run DeepSeek in CUDA instead of hitting an API or something.

I can (just barely) run GLM-4.5 on a single 3090 desktop.
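
For anyone curious, here's roughly what that looks like with llama-cpp-python; the GGUF filename and layer split are placeholder assumptions, and in practice you raise n_gpu_layers until the card's VRAM is full:

```python
# Minimal sketch of consumer-GPU + CPU offloading with llama-cpp-python.
# The model file and layer count are assumptions; tune n_gpu_layers until
# VRAM (e.g. a 24GB 3090) is full and leave the remaining layers in CPU RAM.
from llama_cpp import Llama

llm = Llama(
    model_path="./glm-4.5-air-q4_k_m.gguf",  # hypothetical quantized MoE file
    n_gpu_layers=30,  # layers offloaded to the GPU; the rest run on the CPU
    n_ctx=8192,       # context window
)

out = llm("Why do MoE models tolerate CPU offloading well?", max_tokens=128)
print(out["choices"][0]["text"])
```

MoEs take this split well because only a few experts fire per token, so the CPU-resident layers do far less work than their parameter count suggests.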

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

.... Yeah, for yourself.

I'm referring to anyone running an LLM for commercial purposes.

Y'know, 80% of Nvidia's business?

[–] brucethemoose@lemmy.world 1 points 1 day ago (1 children)

I've kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

I guess my original point was agreement: the 5000 series is not great for 'AI', not like everyone makes it out to be, to the point where folks who can't drop $10K on a GPU are picking up older cards instead. But if you look at download stats for these models, there is real interest in running stuff locally instead of on ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.

[–] MystikIncarnate@lemmy.ca 1 points 20 hours ago (1 children)

The original post is about Nvidia's domination of discrete GPUs, not consumer GPUs.

So I'm not limiting myself to people running an LLM on their personal desktop.

That's what I was trying to get across.

And it's right on point for the original material.

[–] brucethemoose@lemmy.world 1 points 20 hours ago (1 children)

I'm not sure the bulk of datacenter cards count as 'discrete GPUs' anymore, and they aren't counted in that survey. They're generally sold socketed into 8P servers with crazy interconnects, hyper-specialized for what they do. Nvidia does sell some repurposed gaming silicon as a 'low end' PCIe server card, but those don't get a ton of use compared to the big-silicon sales.

[–] MystikIncarnate@lemmy.ca 1 points 18 hours ago (1 children)

I wouldn't be surprised in the slightest if they are included in the list. I dunno, I'm not the statistician who crunched the numbers here. I didn't collect the data, and that source material is not available for me to examine.

What I can say is that the article counts "discrete" GPUs rather than just "GPUs" to exclude all the iGPUs. Intel dominates that space along with AMD, but it's hard to make an iGPU when you don't make CPUs, and the two largest CPU manufacturers make their own.

The overall landscape of the GPU market is very different than what this data implies.

[–] brucethemoose@lemmy.world 1 points 16 hours ago* (last edited 16 hours ago)

Well, it’s no mystery:

https://www.jonpeddie.com/news/q225-pc-graphics-add-in-board-shipments-increased-27-0-from-last-quarter/

It’s specifically desktop addin boards:

AMD’s RX 9070 XT and RX 9070 represent AMD’s new RDNA 4 architecture, competing with Nvidia’s midrange offerings. Nvidia introduced two new Blackwell-series AIBs: the GeForce RTX 5080 Super and the RTX 5070. The company also announced the RTX 500 workstation AIB. Rumors have persisted about two new AIBs from Intel, including a dual-GPU model.

It does include workstation cards like the Blackwell Pro. But it is clearly not including server silicon like the B200, H200, MI325X and so on, otherwise they would have mentioned those updates too. They are not AIBs.

I hate to obsess over such a distinction, but it's important: server sales are not skewing this data, and workstation sales volumes are pretty low. It's probably an accurate chart for gaming GPUs.

[–] zqwzzle@lemmy.ca 23 points 2 days ago (1 children)

Their data centre division is pulling in $41 billion in revenue vs. $4 billion from the consumer market.

https://finance.yahoo.com/news/nvidia-q2-profit-soars-59-021402431.html

[–] brucethemoose@lemmy.world 7 points 2 days ago (1 children)

Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.

[–] zqwzzle@lemmy.ca 8 points 2 days ago (1 children)

Does the original title mention home setups?

[–] brucethemoose@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

It mentions desktop GPUs, which are not part of this market share survey.

Basically I don't see what the server market has to do with desktop dGPU market share. Why did you bring that up?

[–] tidderuuf@lemmy.world 20 points 2 days ago

The same people buying Intel and Microsoft.

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 12 points 2 days ago (2 children)

Nvidia is the only real option for AI work. Before Trump lifted the really restrictive ban on GPU exports to China, buyers there had to smuggle in GPUs from the US, and if you're Joe Schmo the only GPUs you can really buy are gaming ones. That's why the 5090 has been selling so well despite being $2K and not all that much better than the 4090 in gaming.

Also AMD has no high end GPUs, and Intel barely has a mid range GPU.

[–] brucethemoose@lemmy.world 14 points 2 days ago (2 children)

To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.

[–] GaMEChld@lemmy.world 1 points 1 day ago (1 children)

What's the VRAM cartel story? Think I missed that.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Basically, consumer VRAM is dirt cheap, not too far from DDR5 in $/gigabyte. And high VRAM (especially 48GB+) cards are in high demand.

But Nvidia charges through the nose for the privilege of adding more VRAM to cards. See this card, which uses almost the same silicon as the 5090: https://www.amazon.com/Blackwell-Professional-Workstation-Simulation-Engineering/dp/B0F7Y644FQ

That's even though the bill of materials is really only like $100-$200 more, at most. Nvidia can get away with this because everyone is clamoring for their top-end cards.


AMD, meanwhile, is kind of a laughing stock in the prosumer GPU space. No one's buying them for CAD. No one's buying them for compute, for sure... And yet they do the same thing as Nvidia: https://www.amazon.com/AMD-Professional-Workstation-Rendering-DisplaPortTM/dp/B0C5DK4R3G/

In other words, with a phone call to their OEMs like Asus and such, Lisa Su could lift the VRAM restrictions from their cards and say "you're allowed to sell a 7900 or 9000 series card with as much VRAM as you can fit." They could pull the rug out from under Nvidia and charge a $100-$200 markup instead of a $3000-$7000 one.

...Yet they don't.

It makes no sense. They're maintaining an anticompetitive VRAM 'cartel' with Nvidia instead of trying to compete.

Intel has more of an excuse here, as they literally don't manufacture a GPU that can take more than 24GB of VRAM, but AMD has no excuse that I can think of.
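
Rough arithmetic on that markup claim, with every price below an assumption for illustration rather than a quoted figure:

```python
# Illustrative sketch of the VRAM markup argument. All prices are assumed.
CONSUMER_CARD_PRICE = 2_000     # assumed street price of a 32GB flagship
WORKSTATION_CARD_PRICE = 8_500  # assumed price of a 96GB workstation variant
EXTRA_VRAM_GB = 96 - 32         # extra memory on the workstation card

markup_per_gb = (WORKSTATION_CARD_PRICE - CONSUMER_CARD_PRICE) / EXTRA_VRAM_GB
ASSUMED_GDDR_COST_PER_GB = 4    # assumed component cost of consumer GDDR, $/GB

print(f"~${markup_per_gb:.0f}/GB charged vs ~${ASSUMED_GDDR_COST_PER_GB}/GB of memory cost")
```

Even with generous assumptions, the per-gigabyte markup lands well over an order of magnitude above the memory's component cost.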

[–] Diplomjodler3@lemmy.world 4 points 2 days ago

My theory is that they're just scared to annoy Nvidia too much. If they priced their GPUs so as to really increase their market share, Nvidia might retaliate. And Nvidia definitely has the deeper pockets. AMD has no chance to win a price war.

[–] BCsven@lemmy.ca 8 points 2 days ago (1 children)

This suggests AMD has comparable GPUs, in some cases better, in some cases worse. People chasing diminishing returns will never be happy. https://www.digitaltrends.com/computing/nvidia-vs-amd/

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 10 points 2 days ago (1 children)

That article is a year old and is missing the latest generation of cards. Neither AMD nor Nvidia produces those GPUs anymore. AMD's best GPU from their 9000 series competes with Nvidia's 5070/5070 Ti. The 5090 and 5080 are unmatched.

[–] BCsven@lemmy.ca 5 points 2 days ago* (last edited 2 days ago) (3 children)

Kind of my point: those were high end and are still usable by 95% of people. Everyone is chasing 1% gains for twice the price. I have a new RTX card via work equipment for rendering, and I play games on the side, but that RTX doesn't really make the gameplay that much better. It looks great with the shine on metal, or water reflections, but when you're totally immersed in gameplay that stuff is wasted.

[–] brb@sh.itjust.works 4 points 2 days ago (1 children)

At the end of the day I think it is this simple: CUDA works and developers use it, so users get a tangible benefit.

AMD comes up with a better version of CUDA and you have the disruption needed to compete.

[–] Marthirial@lemmy.world 4 points 1 day ago (1 children)

I'm not sure that would even help much, since tools out there already support CUDA, and even if AMD had a better version, it would still require everyone to update their apps to support it.
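
For a sense of why the moat is so sticky, here's a minimal sketch of how most tools pick a device today. PyTorch's ROCm builds do answer to the same torch.cuda API, but anything that digs deeper (custom CUDA kernels, cuDNN-only code paths) still breaks on non-Nvidia hardware:

```python
# Minimal sketch: the device selection most ML tools ship with.
# On PyTorch's ROCm builds an AMD GPU also answers to "cuda", but tools with
# hand-written CUDA kernels bypass this layer and stay Nvidia-only.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(1024, 1024, device=device)
y = x @ x  # runs on whichever backend answered to "cuda"
print(device, tuple(y.shape))
```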

[–] lemonySplit@lemmy.ca 7 points 2 days ago (1 children)

Meanwhile Framework's new AMD offering has Nvidia slop in it. Just why? We want AMD. Give us AMD.

[–] notthebees@reddthat.com 4 points 2 days ago

They did. There are just no new AMD mobile GPUs. I think they only have about 100 watts of TDP worth of cooling to work with, and the 7700S is the fastest AMD mobile GPU currently.

If AMD makes a new mobile GPU, Framework will probably make it into a module.

[–] warm@kbin.earth 5 points 2 days ago

They need DLSS, otherwise the triple-A games they love so much won't reach 30 FPS!!

[–] SoftestSapphic@lemmy.world -1 points 2 days ago* (last edited 2 days ago) (1 children)

I will never get another AMD card after my first one just sucked ass and didn't ever work right.

I wanted to try an Intel card, but I wasn't even sure I could find Linux drivers for it; they weren't on the site for download, and I couldn't find anything specifying whether their newer cards even worked on Linux.

So yeah, Nvidia is the only viable company for me to buy a graphics card from

[–] ganryuu@lemmy.ca 3 points 2 days ago (2 children)

That kind of comment always feels a bit weird to me; are you basing AMD's worth as a GPU manufacturer on that one bad experience? It could just as well have been the same on an Nvidia chip, would you be pro-AMD in that case?

On the Intel front, I'm not up to date, but historically Intel has been very good about developing drivers for Linux, and most of the time they're actually included in the kernel (hence no download necessary).
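
If you want to check, here's a small sketch (assuming Linux) that looks for Intel's in-kernel GPU drivers; i915 (older GPUs) and xe (newer ones) are the actual module names, which is why no separate download is needed:

```python
# Small sketch (Linux only): check whether an in-kernel Intel GPU driver is
# loaded. i915 and xe are the real Intel GPU module names.
from pathlib import Path

loaded = {line.split()[0] for line in Path("/proc/modules").read_text().splitlines()}
intel = loaded & {"i915", "xe"}
print("Intel GPU driver loaded:", ", ".join(intel) if intel else "none")
```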

[–] SoftestSapphic@lemmy.world 1 points 2 days ago* (last edited 2 days ago)

That kind of comment always feels a bit weird to me; are you basing AMD's worth as a GPU manufacturer on that one bad experience?

Absolutely. If a company I'm trying for the first time gives me a bad experience, I will not go back. That was me giving them a chance, and AMD fucked up that chance, and I couldn't even get a refund for a roughly $200 card. Choosing to try a different option resulted in me wasting time and money, and it pushed my rig back half a year until I could afford a working card again, which really pissed me off.

I didn't know that about Intel cards. I'll have to try one for my next upgrade if I can confirm on their site that they're supported.

[–] njm1314@lemmy.world -2 points 1 day ago (1 children)

What else would a consumer base things on except their own experiences? It's not like it's a rare story either.

[–] ganryuu@lemmy.ca 0 points 1 day ago* (last edited 1 day ago) (1 children)

I don't know, real-world data maybe? Your one, or two, or even ten experiences are very insignificant statistically speaking. And of course it's not a rare story: people who talk about a product online are usually the ones with a bad experience, complaining about it, which introduces a bias you have to account for. So you go for things like failure rates, which you can find online.

By the way, it's almost never actually a fault of AMD or Nvidia, but of the actual manufacturer of the card.

Edit: Not that I care about Internet points, but downvoting without a rebuttal is... Not very convincing

[–] njm1314@lemmy.world 0 points 1 day ago (1 children)

A person's actual experience with a product isn't real-world data? Fanboys for huge companies are so weird.

[–] ganryuu@lemmy.ca 1 points 1 day ago (1 children)

Please read my entire comment, I also said your experience as one person is statistically insignificant. As in, you cannot rely on 1 bad experience considering the volume of GPUs sold. Anybody can be unlucky with a purchase and get a defective product, no matter how good the manufacturer is.

Also, please point out where I did any fanboyism. I did not take any side in my comments. Bad faith arguments are so weird.

[–] njm1314@lemmy.world 0 points 1 day ago (1 children)

Sure buddy, we're all idiots for not liking the product you simp for. Got it.

[–] ganryuu@lemmy.ca 0 points 1 day ago

Nice. You didn't answer anything, didn't point out where I'm simping or being a fanboy. I'm not pro-Nvidia, nor pro-AMD, nor pro-anything (if anything I'm pretty anti-consumerist actually, not that you care).

You're being extremely transparent in your bad faith.

[–] darkkite@lemmy.ml 1 points 2 days ago

I do local AI stuff and I get more support with Nvidia CUDA, and you usually get exclusive gaming features first on Nvidia, like DLSS, RTX, and Voice.

I wish they shipped with more vram though