Rocking an R9 280 atm
PC Gaming
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context you may lightly edit the title.)
I only upgraded from my 380 this year
Still plenty of fun to be had with new GOG mods, etc. :)
AMD needs better answers on the HPC software side.
They need dGPUs worth buying for HPC, other than servers that cost more than a house, so devs will actually target them.
They have the hardware for HPC with their Instinct cards. Software support is slowly growing: ROCm is fine, but ZLUDA is pain and suffering on AMD cards. I have a 6800 XT, so old but decently supported, and even then it's annoying.
What I mean is they need to sell reasonable high VRAM cards that aren't a MI325X, heh.
There's not really a motivation to target them over a 3090 or 4090 or whatever, but that would change with bigger VRAM pools.
The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes. There's no point in shoving that much VRAM into a card if the software support is painful and makes it hard to develop for. I'm probably biased due to my 6800 XT, one of the earliest cards that's still supported by ROCm, so there's a bunch of stuff my GPU can't do. ZLUDA is painful to get working (and I have it easier thanks to my 6800 XT); ROCm mostly works, but VRAM utilization is very inefficient for some reason, and it's Linux-only, which is fine but I'd like more cross-platform options. Vulkan compute is deprecated within PyTorch. AMD HIP is annoying as well, but I don't know how much of that was just my experience with ZLUDA.
Intel actually has better cross-platform support with IPEX, but that's just PyTorch. Again, fine.
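To make the backend fragmentation above concrete, here's a minimal sketch of how device selection tends to look in PyTorch across vendors: on ROCm builds, AMD GPUs are exposed through the `torch.cuda` API (HIP underneath), while Intel GPUs show up as the `xpu` device. The helper below is hypothetical, not from any of these projects; only the `torch.cuda.is_available()` / `torch.xpu` names in the comments are real PyTorch attributes, and their availability depends on your build.

```python
# Hedged sketch: picking a PyTorch device string across GPU vendors.
# The flags are passed in so this runs without torch installed; in real
# code they would come from the (real) PyTorch calls shown below.

def pick_device(cuda_ok: bool, xpu_ok: bool) -> str:
    """Return a device string given backend availability flags."""
    if cuda_ok:   # NVIDIA CUDA *and* AMD ROCm builds both report here
        return "cuda"
    if xpu_ok:    # Intel GPU backend (IPEX / recent PyTorch)
        return "xpu"
    return "cpu"  # fallback

# In real code the flags would be gathered roughly like this:
#   cuda_ok = torch.cuda.is_available()
#   xpu_ok  = hasattr(torch, "xpu") and torch.xpu.is_available()
print(pick_device(False, True))  # -> xpu
```

The awkward part this illustrates: an AMD card "pretends" to be CUDA, so projects that branch on device names often work until they hit a CUDA-only kernel, which is where the unfixed-bug experience described above comes from.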
The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes.
The AI Pro isn't even available! And 32 GB is not enough anyway.
I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they're working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48 GB at a small markup (instead of $4000), AMD would have grassroots support everywhere, because that's what devs would spend their time making work. And these are the same projects that trickle up to the MI325X and newer.
I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for 'only' 24 GB plus all the fuss.
There are lingering bits of AMD support everywhere: Vulkan backends in popular projects, unfixed ROCm bugs, stuff that works with tweaks but isn't optimized yet. The problem is AMD isn't making it worth anyone's while to maintain them when devs can (and do) just use 3090s or whatever.
They kind of took a baby step in this direction with the AI 395 (effectively a 110 GB VRAM APU, albeit very compute-light compared to a 7900/9700), but it's still $2K, effectively mini-PC-only, and kind of too little, too late.
Just saw a "leak video" claiming the 9070 is outselling the 5070, and that 5070 and 5080 Supers are being released very soon.