501
submitted 9 months ago by L4s@lemmy.world to c/technology@lemmy.world

AMD’s new CPU hits 132fps in Fortnite without a graphics card: also gets 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 using the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.

[-] BombOmOm@lemmy.world 119 points 9 months ago

I have routinely been impressed with AMD integrated graphics. For my last laptop I specifically went for one, as it meant I didn't need a dedicated GPU, which adds significant weight, cost, and power draw.

It isn't my main gaming rig, of course, but I have had no complaints.

[-] prole@sh.itjust.works 19 points 9 months ago* (last edited 9 months ago)

Same. I got a cheap Ryzen laptop a few years back and put Linux on it last year, and I've been shocked by how well it can play some games. I just recently got Disgaea 7 (mostly to play on Steam Deck) and it's so well optimized that I get steady 60fps, at full resolution, on my shitty integrated graphics.

[-] empireOfLove2@lemmy.dbzer0.com 13 points 9 months ago

I have a Lenovo ultralight with a 7730U mobile chip in it, which is a pretty mid CPU... it happily plays Minecraft at a full 60fps while using like 10W on the package. I can play Minecraft on battery for like 4 hours. It's nuts.

AMD does the right thing and uses full CUs from their current graphics uArch for the iGPU on a new die, instead of trying to cram some poorly designed iGPU inside the CPU package like Intel does.

[-] CalcProgrammer1@lemmy.ml 58 points 9 months ago

AMD's integrated GPUs have been getting really good lately. I'm impressed at what they are capable of in gaming handhelds, and it only makes sense to put the same extra GPU power into desktop APUs. Hopefully this will lead to true gaming laptops that don't require power-hungry discrete GPUs and the workarounds/render offloading of hybrid graphics. That said, to truly replace a gaming laptop I want to see a solid 60fps minimum at 1080p or higher, but the fact that we're seeing numbers close to that is impressive nonetheless.

[-] PerogiBoi@lemmy.ca 58 points 9 months ago

I was sold on AMD once I got my Steam Deck.

[-] EndHD@lemm.ee 11 points 9 months ago

same here. or at least i finally recognized their potential. but it's not just the performance, it's the power efficiency too!

[-] prole@sh.itjust.works 7 points 9 months ago

Everything I see about AMD makes me like them more than Intel or Nvidia (for CPU and GPU respectively). You can't even use an Nvidia card with Linux without running into serious issues.

[-] fosstulate@iusearchlinux.fyi 41 points 9 months ago

I hope red and blue both find success in this segment. Ideally the strengthened APU share of the market exerts pressure on publishers to properly optimize their games instead of cynically offloading the compute cost onto players.

[-] Rai@lemmy.dbzer0.com 18 points 9 months ago

Hell yeah, I want EVERYONE to make dope ass shit. I’ve made machines with both sides, and I hate tribal…ness. My current machine is a 9900k that’s getting to be… five years old?! I’d make an AMD machine today if I needed a new machine. AMD/Intel rivalry is so good for us all. Intel slacked so hard after the 9000-series. I hope they come back.

[-] Vlyn@lemmy.zip 19 points 9 months ago

Intel had slacked hard since the 2000-series: one shitty 4-core release after another, until AMD kicked things into gear with Ryzen.

And during that time you couldn't buy Intel due to security flaws (Meltdown, Spectre, etc.).

Even now they are slacking, just look at the power consumption. The way they currently produce CPUs isn't sustainable (AMD pays way less per chip with the chiplet design and is far more flexible).

[-] 4grams@awful.systems 34 points 9 months ago

Now, if they stick one in a framework laptop, I’ll be a few thousand dollars poorer.

[-] BombOmOm@lemmy.world 9 points 9 months ago

The good news is, Framework is shipping with AMD CPUs now. :)

Currently 7th gen Ryzens, not sure when the 8th gens become available.

[-] Dehydrated@lemmy.world 34 points 9 months ago

Common W for AMD

[-] aluminium@lemmy.world 30 points 9 months ago

Oh, ok. I thought one of the new Threadrippers was so powerful that the CPU could do all those graphics in software.

[-] sardaukar@lemmy.world 24 points 9 months ago

It's gonna take decades to be able to render 1080p CP2077 at an acceptable frame rate with just software rendering.

[-] le_saucisson_masquay@sh.itjust.works 29 points 9 months ago

For people like me who game once a month, and mostly stupid little games, this is great news. I bet many people could use this; it would reduce demand for graphics cards and let those who do want them buy them cheaper.

[-] RememberTheApollo_@lemmy.world 28 points 9 months ago

Only downside, if integrated graphics becomes a thing, is that you can't upgrade if the next gen needs a different motherboard. It's pretty easy to swap from a 2080 to a 3080.

[-] olympicyes@lemmy.world 40 points 9 months ago

Integrated graphics is already a thing. Intel iGPU has over 60% market share. This is really competing with Intel and low-end discrete GPUs. Nice to have the option!

[-] RememberTheApollo_@lemmy.world 4 points 9 months ago* (last edited 9 months ago)

Yeah, I know integrated graphics is a thing. And that’s been fine for running a web browser, watching videos, or whatever other low-demand graphical application was needed for office work. Now they’re holding it up against gaming, which typically places large demands on graphical processing power.

The only reason I brought up what I did is because it's an if… if people start looking at CPU-integrated graphics as an alternative to expensive GPUs, it makes the upgrade path more costly versus the short-term savings of skipping a good GPU purchase.

Again, if one's gaming consists of games that aren't high-demand, like Fortnite, then upgrades and performance probably aren't a concern for the user. One could still buy a GPU and add it to the system for more power, assuming the PSU has enough wattage and the case has room.

[-] tonyravioli@lemm.ee 33 points 9 months ago

AMD has been pretty good about this, though; AM4 lasted from 2016 to 2022. Compare that to Intel, which seems to change sockets every 1-2 years.

[-] the_q@lemmy.world 20 points 9 months ago

Actually, AMD is still releasing new AM4 CPUs now. The 5700X3D was just announced.

[-] miss_brainfarts@lemmy.blahaj.zone 4 points 9 months ago

Oh, now that sounds like something I might like

I don't have the fastest RAM out there, so whenever I upgrade from my 1600, I want an X3D variant to help with that

[-] ShustOne@lemmy.one 16 points 9 months ago

That's true, but I'm excited about the future of laptops. Some of the specs are getting really impressive while keeping power draw low. I'm currently jealous of what Apple has accomplished with literal all-day battery life in a 14-inch laptop. I'm hopeful some of the AMD chips will get us there on other hardware.

[-] T156@lemmy.world 6 points 9 months ago

Could you not just slot in a dedicated video card if you needed one, keeping the integrated as a backup?

[-] GhostFence@lemmy.world 5 points 9 months ago

And the shared RAM. Games like Star Trek Fleet Command will crash your computer by messing with that; memory leaks galore. Far less crashy with a dedicated GPU. How many other games interact poorly with integrated GPUs?

[-] sapetoku@sh.itjust.works 4 points 9 months ago

AMD keeps the same sockets for ages. I was able to upgrade a 5 year old Ryzen 5 2600G to a 5600G last month. Can't do that with Intel in general.

[-] inclementimmigrant@lemmy.world 22 points 9 months ago

Mind you, it gets these frame rates at low settings. While that's pretty damn impressive for an APU, it's still a very niche type of APU at this point, and I don't see it getting all that much traction myself.

[-] BorgDrone@lemmy.one 5 points 9 months ago

I think the opposite is true. Discrete graphics cards are on the way out; SoCs are the future. There are just too many disadvantages to having a discrete GPU and CPU, each with its own RAM. We'll see SoCs catch up and eventually overtake PCs with discrete components, especially with the growth of AI applications.

[-] flintheart_glomgold@lemmy.world 15 points 9 months ago* (last edited 9 months ago)

US$330 for the top 8700G APU with 12 RDNA 3 compute units (compare to 32 RDNA 3 CUs in the Radeon RX 7600). And it only draws 88W at peak load and can be passively cooled (or overclocked).

US$230 for the 8600G with 8 RDNA 3 CUs. It falls about 10-15% short of 8700G performance in games, but with a much bigger spread in CPU workloads (Tom's Hardware benchmarks), so I'm pretty meh on that one.

Given the higher costs for AM5 boards and DDR5 RAM, for about the same money or $100-200 more than an 8700G build, you could combine a cheaper CPU with a better GPU and get way more bang for your buck. But I see the 8700G being a solid option for gamers on a budget, or parents wanting to build younger kids their first cheap-but-effective PC.

I also see this as a lazy man's solution for building small-form-factor mini-ITX home theatre PCs that run silent and don't need a separate GPU to receive 4K live streams. I'm exactly in this boat right now: I literally don't wanna fiddle with cramming a GPU into some tiny box, but I also don't want some piece-of-crap iGPU in case I use the HTPC for some light gaming from time to time.
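A quick back-of-the-envelope check of the value argument above, using the 132fps Fortnite figure from the article for the 8700G and assuming the 8600G lands ~12.5% lower (the midpoint of the 10-15% gap mentioned; all numbers rough and illustrative):

```python
# Rough cost-per-frame comparison of the two APUs.
# Assumptions: 8700G hits 132 fps in Fortnite (from the article title);
# the 8600G is taken as ~12.5% slower, the midpoint of the quoted 10-15% gap.
PRICE_8700G, PRICE_8600G = 330, 230   # USD launch prices
FPS_8700G = 132
FPS_8600G = FPS_8700G * 0.875         # ~12.5% slower -> 115.5 fps

cost_per_frame_8700g = PRICE_8700G / FPS_8700G
cost_per_frame_8600g = PRICE_8600G / FPS_8600G

print(f"8700G: ${cost_per_frame_8700g:.2f} per fps")  # -> $2.50 per fps
print(f"8600G: ${cost_per_frame_8600g:.2f} per fps")  # -> $1.99 per fps
```

By this crude gaming-only measure the 8600G is actually cheaper per frame; it's the bigger CPU-side performance gap that tips the value call back toward the 8700G.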

[-] bonus_crab@lemmy.world 5 points 9 months ago

It'll be a great upgrade for those little NUC-like things, thin laptops, and Steam Deck competitors.

[-] sturmblast@lemmy.world 14 points 9 months ago

That's pretty damn impressive. AMD is changing the game!

[-] northendtrooper@lemmy.ca 10 points 9 months ago

So will this be the HTPC king? The article kind of skimped on temps. I assume HWU goes over it; I'll watch that soon.

[-] XEAL@lemm.ee 4 points 9 months ago

Aaaaand the 7950X3D is not top tier anymore

[-] SpookyLegs@lemmy.world 9 points 9 months ago

Back in my day the 7950 was a GPU!

Yelling at clouds

[-] echo64@lemmy.world 4 points 9 months ago

The PlayStation 5 also does this.

this post was submitted on 31 Jan 2024
501 points (97.0% liked)
