Lately we've been talking about games not performing well enough on current hardware, and it's had me wondering what exactly we should be asking for. I think the basic principle is that hardware from the last 5 years should be adequate to play current-generation titles at 1080p60. Not at max settings, of course, but certainly playable without resorting to DLSS or FSR.
It makes me wonder: is it really so much to ask? There are games from 10+ years ago that still look great, or at least acceptable. Shouldn't we expect a new game like Starfield to be configurable down to the demands of an older game like Portal 2 or CS:GO? If the gameplay is what really matters, and games of the 2010s looked good at the time, why can't we expect current games to scale down that far?
From what I've seen, GTX 1070 owners need to drop Starfield to 720p with FSR to get 60fps. Which is better: 60fps at 720p with FSR, or 60fps at 1080p with reduced texture resolution and model detail?
It shouldn't even be that hard to pull off. Lower-detail models and textures can be generated automatically, and other effects can simply be turned off.
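For textures, at least, the "automatic" part is basically what mipmap generation already does: each lower level is just the previous one downscaled by half. Here's a rough sketch in plain Python (not any engine's actual pipeline; the function name and data layout are made up for illustration) of producing one half-resolution level with a 2x2 box filter:

```python
# Illustrative sketch only: halve a texture's resolution with a 2x2 box filter,
# the same idea behind mipmap generation. Real asset pipelines do this more
# carefully (gamma, alpha, compression), but the core step is this simple.

def downsample_texture(pixels, width, height):
    """pixels: row-major list of (r, g, b) tuples; returns (half-res pixels, w, h)."""
    new_w, new_h = width // 2, height // 2
    out = []
    for y in range(new_h):
        for x in range(new_w):
            # Average the 2x2 block of source texels this output texel covers.
            block = [
                pixels[(2 * y + dy) * width + (2 * x + dx)]
                for dy in (0, 1)
                for dx in (0, 1)
            ]
            out.append(tuple(sum(c[i] for c in block) // 4 for i in range(3)))
    return out, new_w, new_h


if __name__ == "__main__":
    # Tiny 4x2 example texture: left half red, right half blue.
    tex = [(255, 0, 0), (255, 0, 0), (0, 0, 255), (0, 0, 255),
           (255, 0, 0), (255, 0, 0), (0, 0, 255), (0, 0, 255)]
    low, w, h = downsample_texture(tex, 4, 2)
    print(w, h, low)  # 2 1 [(255, 0, 0), (0, 0, 255)]
```

Models are less mechanical (automatic mesh decimation usually needs some artist cleanup), but the principle is the same: derive cheaper assets from the originals instead of authoring a whole low-end version by hand.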
I'm a little confused by your timeline. I agree that 5-year-old hardware should definitely support 1080p60, but the 1070 is 7 years old now. Since the 1070 hit that target when it came out, and 1080p60 is a static target, I'd expect it to keep hitting it indefinitely in games comparable to what was shipping at the time. But it's a bit unfair to compare Starfield to Portal 2 and CS:GO: those games run in constrained, controlled environments, while Starfield is vast and open, and large open environments definitely take a GPU toll, so you'll lose some performance to that compared to those games. I haven't played Starfield yet, so I don't know the details, but given the scope I know of, it doesn't sound unreasonable for it to miss the 1080p60 mark a bit given the difference in game environment.