Open source community figures out problems with performance in Starfield
(www.destructoid.com)
I'm inclined to believe this, and this likely isn't even the whole extent of it. I've been playing on a Series X, but decided to check it out on my ROG Ally. On Low at 720p with FSR2 on, I'd get 25-30fps somewhere like New Atlantis. I downloaded a tweaked .ini for the Ultra preset, and now not only does the game look much better, but the city is up closer to 40fps, with most other areas at 45-60+. It makes me wonder what they thought was worth the massive performance cost of the default settings, given there's no real visual improvement.
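For anyone curious what a tweaked preset actually changes versus the stock file, here's a rough sketch of how you could diff the two (file names are placeholders, point them at your own copies; Creation Engine inis are plain INI files, though messy enough that you want strict=False and no interpolation):

```python
import configparser

def load(path):
    # strict=False tolerates duplicate keys; interpolation=None avoids choking
    # on values that happen to contain "%" characters
    cfg = configparser.ConfigParser(strict=False, interpolation=None)
    cfg.optionxform = str  # keep keys case-sensitive
    cfg.read(path)
    return cfg

stock = load("StarfieldPrefs_stock.ini")      # placeholder paths,
tweaked = load("StarfieldPrefs_tweaked.ini")  # swap in your own files

# Print every setting the tweaked preset changes or adds
for section in tweaked.sections():
    for key, new_value in tweaked.items(section):
        old_value = stock.get(section, key, fallback=None)
        if old_value != new_value:
            print(f"[{section}] {key}: {old_value!r} -> {new_value!r}")
```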
Another odd thing: if I'm playing Cyberpunk or something, this thing is in the 90%+ CPU and GPU utilization range, with temps in the 90°C+ range. Starfield? The GPU reads something like 99%, the CPU sits around 30%, and the temp stays at or below 70°C, which basically doesn't happen with any other "AAA" game. I could buy Todd's comments if the frame rate were bad while the hardware was genuinely maxed out, but not getting anywhere near full utilization on a handheld with an APU points to something less simple.
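If anyone wants to sanity-check the same thing on their own machine, a rough logger like this works (needs `pip install psutil`; it only covers the CPU side, so GPU load and temps still come from Task Manager, Adrenalin, HWiNFO, etc., and the output file name is just a placeholder):

```python
import csv
import time
import psutil

# Log per-core CPU utilization once a second to a CSV, so you can see whether
# the CPU is actually being pushed during gameplay or sitting mostly idle.
with open("cpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "avg_percent", "per_core_percent"])
    try:
        while True:
            per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks ~1s
            avg = sum(per_core) / len(per_core)
            writer.writerow([int(time.time()), round(avg, 1), per_core])
            f.flush()
    except KeyboardInterrupt:
        pass  # Ctrl+C to stop logging
```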
I'm hoping the work from Hans finds its way to all platforms in one way or another, because I'd love to use the Series X, but 30fps with the weird HDR on a 120Hz OLED TV actually makes me a little nauseous after playing for a while, which isn't something I usually have a problem with.
From my experience on the Steam Deck, it doesn't matter whether I run low or medium graphics (with some settings on high); the performance is almost the same.