Lately we've been talking about games not performing well enough on current hardware. It's had me wondering just what we should be asking for. I think the basic principle is that components from the last 5 years should be adequate to play current-generation titles at 1080p60. Not at max settings, of course, but certainly playable without resorting to DLSS and FSR.
It makes me wonder: is it really so much to ask? There are games from 10+ years ago that still look great, or at least acceptable. Should we expect new games like Starfield to be configurable down to the hardware demands of an older title like Portal 2 or CS:GO? If the gameplay is what really matters, and games of the 2010s looked good at the time, why can't we expect current games to be configurable that low?
From what I've seen, GTX 1070 owners need to run Starfield at 720p with FSR to get 60fps. Which is better: 60fps at 720p with FSR, or 60fps at native 1080p with reduced texture resolution and model detail?
It shouldn't even be that hard to pull off. Lower-detail models and textures could be generated automatically, and other effects could simply be turned off.
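For the texture half of that, engines already do something similar with mipmaps. As a toy sketch (the folder names and the 0.5 scale factor are made up for the example, not from any real pipeline), generating a half-resolution texture set could look like this in Python with Pillow:

```python
# Toy sketch: auto-generating a half-resolution texture set with Pillow.
# The folder names and the 0.5 scale factor are made up for the example.
from pathlib import Path

from PIL import Image

def build_low_detail_textures(src_dir: str, dst_dir: str, scale: float = 0.5) -> None:
    """Write a downscaled copy of every PNG texture in src_dir into dst_dir."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    for tex in Path(src_dir).glob("*.png"):
        img = Image.open(tex)
        low = img.resize(
            (max(1, round(img.width * scale)), max(1, round(img.height * scale))),
            Image.Resampling.LANCZOS,  # high-quality downsampling filter
        )
        low.save(out / tex.name)

build_low_detail_textures("textures/high", "textures/low")
```

Mesh LOD generation is harder (it usually needs something like quadric-error decimation), but most engines and asset tools ship automatic versions of that too.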
"Perfect" is a bit of stretch. I've tried running games with FSR on my GTX 1080 and it looks like absolute ass. By lowering graphics settings I was able to run at a much higher resolution/framerate while looking leaps and bounds better.
I realize that newer GPUs will give better results with FSR, but if I'm getting a newer GPU then presumably it would have better native performance anyway.
I'll certainly give it another go when I upgrade to an Ada/RDNA3 GPU, but until then it's just a marketing gimmick to me.
Did you try FSR 1 or 2? The difference is quite noticeable for me, though still not enough to justify ever using it on a 1080p monitor. Maybe FSR 3 will change that.
Wouldn't a potato setting also look like absolute ass? I mean, few games are very pretty at 1080p Low, which OP thinks still isn't low enough to accommodate old hardware. In those situations FSR can be used to hit 60fps with 1080p output. Sure, it won't be a pretty experience, but it will run, and it won't break the gameplay mechanics.
Now, I do believe the best outcome would be developers getting better at building their games and graphics engines to be resource efficient, or "optimized" as the gaming community likes to call it. But that's quite an ask in reality: most studios don't build their own engine, and those that do generally aren't building it with resource efficiency in mind; they're focused on the kind of games they want to make with it. id Software really stands out, but they're engine-making wizards almost more than game developers, and they pride themselves on making good-looking games that run well on just about anything. Doom Eternal runs and looks ridiculously good on the Steam Deck (considering what little juice that machine has), for example.
I'm being awfully apologetic toward developers here, but I really don't think the issue is that you can't run these games on older hardware, because FSR has mostly solved that, imo.
The specific example I experimented with the most was Firmament at 3440x1440, targeting 100fps.
Using FSR at the recommended settings, it was a blotchy, blurry mess, and text was barely readable. By turning down shadows and basically everything else except antialiasing, I got native 3440x1440 at a pretty solid 100fps.
It's a real shame that the default settings in that game have FSR enabled, because I'm sure a lot of players just go with the defaults and think the game looks like shit, when actually it's very beautiful with correct settings, even on relatively modest hardware.
It's really quite fair that FSR and DLSS don't look great at 1080p, since they weren't designed for that use case at all. Ideally they're meant to upscale to 4K, where the base resolution is at minimum ~1080p and there are enough pixels to produce a good output. When upscaling to 720p or 1080p, the base resolution drops into the 300p range (maybe even lower), which just isn't enough, at least with today's models.
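To put numbers on that, here's a quick sketch of the internal resolutions implied by FSR 2's documented per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the output targets are just common monitor sizes:

```python
# Internal render resolutions implied by FSR 2's documented per-axis
# scale factors. The output targets below are just common monitor sizes.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

for target_w, target_h in [(3840, 2160), (1920, 1080), (1280, 720)]:
    print(f"Target {target_w}x{target_h}:")
    for mode, factor in FSR2_MODES.items():
        w, h = round(target_w / factor), round(target_h / factor)
        print(f"  {mode:<17} -> {w}x{h} internal render")
```

At a 1080p target, Ultra Performance means rendering at 640x360 internally, which is where numbers in the 300p range come from; at a 720p target it drops to 427x240.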
So I don't see FSR or DLSS as a solution to the hardware-longevity question at all. To me, they just let a weaker GPU look nicer on a 4K screen (theoretically); raising FPS isn't really the main feature.
Maybe GPUs could get cinema upscalers? I don't know how they work or how feasible that would be, other than that for a while, 4K Blu-rays were just the 2K footage run through those upscalers.