Yeah, downvoted because I woke up and saw this absolutely ridiculous strawman that bordered on marketing drivel worthy of the advertising wings of Nvidia and the monitor manufacturers.
Yeah, marketing lies. I mentioned this in the last paragraph.
You're skeptical of the benefits; that much is obvious.
You're wrong about it being subjective though. There are peer-reviewed methods for creating photographs that show motion blur as a human eye would experience it, and people have been using these techniques to evaluate monitors for years now. Here's a high-level overview of the state of objective testing: https://blurbusters.com/massive-upgrade-with-120-vs-480-hz-oled-much-more-visible-than-60-vs-120-hz-even-for-office/ . We are seeing diminishing returns because it takes roughly a doubling of the refresh rate to cut the motion blur in half: going from 60 Hz to 120 Hz halves the blur, while going from 144 Hz to 240 Hz only reduces it by about 40%.
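To put rough numbers on that, here's a minimal sketch assuming the simple sample-and-hold model, where perceived blur scales with how long each frame is held on screen:

```python
# Minimal sketch: sample-and-hold persistence, where the blur trail length
# is proportional to how long each frame is held on screen.

def persistence_ms(refresh_hz: float) -> float:
    """Frame hold time in milliseconds on a sample-and-hold display."""
    return 1000.0 / refresh_hz

for low, high in [(60, 120), (144, 240), (240, 480), (480, 1000)]:
    reduction = 1 - persistence_ms(high) / persistence_ms(low)
    print(f"{low:>4} Hz -> {high:>4} Hz: hold time {persistence_ms(low):5.2f} ms -> "
          f"{persistence_ms(high):5.2f} ms, blur trail ~{reduction:.0%} shorter")
```

Each doubling halves the hold time, but the absolute milliseconds you shave off get smaller every step, which is exactly the diminishing-returns effect.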
If you want to keep seeing noticeable gains, all the way to the point where motion blur becomes imperceptible, then display refresh rates need to keep doubling and a new frame has to exist for every one of those refreshes. Even if a card can hit 480 fps in a limited set of games, it can't do 1000 fps, or 2000 fps.
We need exponential increases in monitor refresh rates to keep improving motion blur, but graphics cards haven't seen exponential increases in performance for quite some time.
Rasterization and ray tracing performance growth is sub-exponential, while the refresh rates needed to keep cutting motion blur grow exponentially. So either monitor companies decide to stop improving (not likely, since TCL just demoed a 4K 1000 Hz display) or there has to be some technological solution to fill the gap.
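To see how the gap compounds, here's a toy projection; the 1.4x per-generation raster gain is purely an illustrative assumption, not a measured figure:

```python
# Hypothetical illustration of the gap: the fps target doubles each step,
# while native rendering grows by a sub-2x factor per step.
# The 1.4x figure is an assumption for illustration, not measured data.
required_fps, native_fps = 240.0, 240.0
for generation in range(1, 6):
    required_fps *= 2.0   # next blur-reduction target: double the refresh rate
    native_fps *= 1.4     # assumed per-generation raster improvement
    print(f"gen {generation}: need {required_fps:>6.0f} fps, "
          f"native gets ~{native_fps:>5.0f} fps, "
          f"gap = {required_fps / native_fps:.1f}x to be covered some other way")
```

Whatever the exact growth numbers are, any sub-2x rate against a doubling target means the shortfall keeps widening.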
That technological solution is frame generation.
Unless you know of some other way to introduce exponential growth in processing power (if you did, you'd win multiple Nobel prizes), we have to use something other than raw rendering. There is no way for a game to 'optimize' its way to 10x or 100x the framerate.
Yes, game companies are lazy, and they cover that laziness by leaning on upscaling in their marketing so they can keep producing crazier and crazier graphics even though graphics card performance growth isn't keeping up. That's the fault of the game companies and their marketing, not of upscaling and frame generation technology.
Frame generation gives every card better FPS, which objectively smooths out motion: going from 30 to 60 fps cuts motion blur in half. There's nothing 'supposed' about it.
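For anyone unclear on what frame generation actually does, here's a deliberately naive sketch: interpolate a new frame between each pair of rendered frames to double the displayed framerate. Real implementations warp pixels along motion vectors or optical flow rather than blending, so don't read this as any vendor's actual algorithm:

```python
import numpy as np

def interpolate_frame(prev: np.ndarray, nxt: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive generated frame: a plain weighted blend of two rendered frames.
    Real frame generation warps pixels along motion vectors instead, which
    avoids the ghosting a blend like this would produce."""
    blended = (1.0 - t) * prev.astype(np.float32) + t * nxt.astype(np.float32)
    return blended.astype(prev.dtype)

# Double the displayed framerate (e.g. 30 fps rendered -> ~60 fps shown)
# by inserting one interpolated frame between each pair of rendered frames.
rendered = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8) for _ in range(3)]
displayed = []
for a, b in zip(rendered, rendered[1:]):
    displayed += [a, interpolate_frame(a, b)]
displayed.append(rendered[-1])
print(f"{len(rendered)} rendered frames -> {len(displayed)} displayed frames")
```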
A developer's choice to optimize their game and their choice to support upscaling and frame generation are not mutually exclusive. There are plenty of games that run well natively and also support frame generation and upscaling.
Also, frame generation only adds meaningful latency when the frame time is long (low source FPS); as the source framerate increases, the added delay shrinks along with the frame time. On top of that, it's possible to use frame generation (extrapolation and reprojection rather than interpolation) to actually reduce input delay (Blur Busters: https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/). Input latency is a very solvable problem.
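A rough way to see why the latency cost shrinks, assuming interpolation has to hold back roughly one rendered frame before it can generate the in-between frame:

```python
# Back-of-the-envelope: interpolation buffers about one rendered frame,
# so the added input delay is roughly one source frame time.

def added_latency_ms(source_fps: float) -> float:
    return 1000.0 / source_fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps source: ~{added_latency_ms(fps):4.1f} ms extra delay "
          f"on top of the game's own input lag")
```

At 30 fps the penalty is a very noticeable ~33 ms; at 120 fps it's down to ~8 ms, and extrapolation/reprojection approaches avoid even that buffering cost.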
My point is that you're not accounting for the trajectory of display hardware development versus graphics card performance growth, and instead presenting frame generation and upscaling as some plot by game developers and graphics card designers to produce worse products.
It's conspiracy nonsense.