submitted 9 months ago by hsr@lemmy.dbzer0.com to c/gaming@lemmy.world

How did we get here?

[-] Contramuffin@lemmy.world 19 points 9 months ago

I'll be completely honest, that's probably the coldest take someone can make about recent tech that I've seen, and it's being presented as a hot take.

Virtually everyone prefers native, almost aggressively so. That being said, I think there's important nuance that's missing in most talks about upscaling. In my testing, my experience of blurring and smearing with upscaling/frame gen seems to be hugely dependent on pixel density. If you get a really dense screen, then upscaling, in my experience at least, becomes virtually undetectable even at 1080p.

[-] hsr@lemmy.dbzer0.com 8 points 9 months ago* (last edited 9 months ago)

probably the coldest take someone can make about recent tech that I’ve seen, and it’s being presented as a hot take

That's exactly what the "we have you surrounded" meme template conveys (at least according to my understanding): a popular opinion, but ironically presented as a fringe opinion.

So no, this isn't really intended as a "hot take"; there seem to be a decent number of people who dislike TAA, for example. I'm pointing out a trend in the industry: devs are using temporal or upscaling tools to make games run/look better, and GPU vendors support those tools to squeeze the most fps out of their cards. At this point TAA is the standard AA method and is integral to how some games are rendered, and upscaling is advertised as basically free* performance. Unfortunately, by its nature, all this temporal tech doesn't work too well at low framerates and resolutions, exactly the scenario where it would be most useful.

I would agree that most artifacts and the softening effects of upscaling are less visible on higher-density screens, or when you're sitting further from the screen. Unless your TAA/upscaling implementation is absolutely botched, in which case it will always look like garbage, but that's not really the fault of a specific technology.

[-] Contramuffin@lemmy.world 3 points 9 months ago

Interesting - I was not aware that that was the intended use of that meme. Maybe I'm getting old?

[-] moody 15 points 9 months ago

I don't play a lot of AAA games, but ngl I'm quite happy gaming at 1080p on my 27" monitors.

I get way better framerates, and it still looks plenty good with maxed out graphics, as long as I'm not sticking my face right up against the screen.

[-] ichbinjasokreativ@lemmy.world 7 points 9 months ago

Have you actually tried 4k though? Yes, framerates are lower, but boy does it look better. For me, and that's just my take on it, 1080p ends at 24" monitors.

[-] PopOfAfrica@lemmy.world -2 points 9 months ago

If a person is running TAA, I don't really see the point.

[-] Crozekiel@lemmy.zip 4 points 9 months ago

Same. Honestly my eyes aren't good enough to notice a difference between 1080p and 1440p (or 4k) at the scale of my pc monitor, but I damn sure notice a difference between 60 fps and 200 fps...

[-] hsr@lemmy.dbzer0.com 3 points 9 months ago

Glad it works for you. Since I upgraded to a 1440p monitor (I still have the same GPU) I went from comfortable high-ultra settings to mid-high settings + FSR Quality in more demanding titles. From the games I played in both 1080p and 1440p, I'd say that less GPU-intensive titles definitely look better in high-res, but I found the overall experience quite whelming.

Simply playing on a higher res monitor won't necessarily give you better visuals if you don't have the GPU power to match settings, however at that point it's not "higher res = better visuals" but "more powerful PC = better visuals" which, duh, of course it will look better.

[-] c0mbatbag3l@lemmy.world 7 points 9 months ago

Well there's your problem, you wanted better resolution but didn't match it with a GPU upgrade.

Gotta have both or you'll suffer a bit of loss.

[-] moody 1 points 9 months ago

That's basically the feeling I get. I've been gaming on PC since the days of CRT monitors that could run many different resolutions. The tradeoff was always quality vs resolution vs framerate, but nowadays LCD/LED monitors have a fixed native resolution, so that's one factor to take out of the equation. Nobody wants to play games at a non-native resolution.

[-] pythonoob@programming.dev 11 points 9 months ago* (last edited 9 months ago)

I still play in 1080. I believe monitors shouldn't cost as much as my build. (Yes I'm exaggerating for comedic effect).

[-] PopOfAfrica@lemmy.world 3 points 9 months ago

1080p is still great. What giant-ass screens are people running where it isn't?

[-] sharkfucker420@lemmy.ml 3 points 9 months ago

1080p is great yes but it could be better. Those graphics must be sharp enough to fucking cut me

[-] Still@programming.dev 2 points 9 months ago

27 and 34 inchers that are like 15 inches from our faces

[-] wizardbeard@lemmy.dbzer0.com 8 points 9 months ago

Friendly tip: For singleplayer games, you can always disable the game's built in AA solution and use reshade for AA instead. If you have extra GPU power you can also use reshade to add all sorts of other graphical effects if you're willing to fiddle around with things to get it looking good.

If you have an NVidia card, sometimes PCGamingWiki has instructions for tweaks you can do in Profile Inspector to adjust how the driver applies AA to a game too.

[-] Glifted@lemmy.world 8 points 9 months ago

I'm still running a 1060. You'd be surprised what you can play if you're willing to put up with shit graphics

[-] Blackmist@feddit.uk 2 points 9 months ago

And for anything else, I run it on my PS5.

[-] LouNeko@lemmy.world 7 points 9 months ago

Remember when you could force MSAA through Nvidia Control Panel on almost any game without issue? Pepperidge Farm Remembers.

[-] wesker@lemmy.sdf.org 7 points 9 months ago

Solution is to not play modern AAA garbage. For the $40-50 pricetag, I could get a handful of great indie games off my wishlist. Games that won't bat an eye at an aging GPU.

[-] PeterPoopshit@lemmy.world 5 points 9 months ago* (last edited 9 months ago)

They would make a lot more money if they made games run on older hardware. Most people can't play Cities: Skylines 2. Most people can't play Kerbal Space Program 2. We don't want photorealistic graphics. Just give us Fallout 3-era graphics, because that's good enough. Fuck.

[-] c0mbatbag3l@lemmy.world 3 points 9 months ago

I agree in theory but Fallout 3 is a horrible example, the art direction was the epitome of the era's fascination with bland brown and sickening green landscapes.

The answer here is to ask if photorealism matters to the game or not, if another type of art direction suits better then do that. Hell, look at Boltgun or even just games that used contrast and bold colors to their advantage like Halo 3 or Mass Effect 2.

[-] aBundleOfFerrets@sh.itjust.works 6 points 9 months ago

The completely unfounded death of MSAA in modern games is devastating. It was (and still is!!!) so much better than every alternative.

[-] hsr@lemmy.dbzer0.com 15 points 9 months ago* (last edited 9 months ago)

Turns out the death of MSAA was actually quite founded! Here's a great Digital Foundry video on anti-aliasing (piped link | youtube).

tl;dw MSAA only affects geometry, so while it worked fine for older titles, it can't handle textures, normal maps, shading etc.
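To sketch the tl;dw in code (my own simplification of the idea, not anything from the video): MSAA tests triangle coverage at several sample points per pixel but still runs the pixel shader only once, so aliasing that comes from the shading itself (a high-frequency texture or specular term, say) passes through unfiltered. A toy 1D version:

```python
import math

def shading(x: float) -> float:
    """High-frequency shading term (stand-in for tight specular/texture detail)."""
    return 0.5 + 0.5 * math.sin(40.0 * x)

def coverage(x: float) -> float:
    """Geometric edge at x = 0.5: inside the triangle when x < 0.5."""
    return 1.0 if x < 0.5 else 0.0

SAMPLES = [-0.375, -0.125, 0.125, 0.375]  # 4x sample offsets in pixel widths

def msaa_pixel(center: float, width: float) -> float:
    # Coverage is multisampled, but shading runs ONCE at the pixel center.
    cov = sum(coverage(center + o * width) for o in SAMPLES) / len(SAMPLES)
    return cov * shading(center)

def ssaa_pixel(center: float, width: float) -> float:
    # Supersampling shades at every sample: smooths edges AND shading.
    return sum(coverage(center + o * width) * shading(center + o * width)
               for o in SAMPLES) / len(SAMPLES)
```

On the pixel straddling the edge, both give a smooth 50% blend; on pixels fully inside the triangle, `msaa_pixel` returns the raw shading value while `ssaa_pixel` actually averages it down.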

[-] umbrella@lemmy.ml 2 points 9 months ago* (last edited 9 months ago)

Still looks better than all alternatives by far though.

edit: second best, just remembered DLAA is a thing, whenever it's actually implemented.

[-] aBundleOfFerrets@sh.itjust.works 1 points 9 months ago

Games like Deep Rock Galactic have every reason to use MSAA but don’t anyway because game engines decided it was unnecessary, and small devs like that can’t be arsed to maintain their own implementation.

Also textures and normal maps don’t need anti-aliasing because they will already have it baked in. Shaders are a similar situation where any aliasing will be situational and should be handled by the shader itself. (If it even makes sense to do so)
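For what it's worth, the "baked in" part is basically mipmapping: minified texture levels are pre-averaged offline, so sampling a distant texture is already filtered. A toy 1D mip chain (illustrative only; assumes a power-of-two length):

```python
def build_mips(texels: list[float]) -> list[list[float]]:
    """1D mip chain via a box filter: each level pre-averages pairs from the
    level above. This pre-filtering is the anti-aliasing that's 'baked in'
    for minified textures. Assumes len(texels) is a power of two."""
    mips = [texels]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev), 2)])
    return mips

# A flickery checker pattern converges to its average once pre-filtered,
# instead of shimmering when viewed from far away.
chain = build_mips([0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])
```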

[-] Coreidan@lemmy.world 3 points 9 months ago

Sounds like you care more about eye candy than gameplay. To each their own.

[-] sharkfucker420@lemmy.ml 2 points 9 months ago* (last edited 9 months ago)

Anyone want to explain to me wtf anti-aliasing even is

[-] Tattorack@lemmy.world 2 points 9 months ago

Most AAA games have been so shitty lately that calling something "AAA" these days is almost like saying a bad word.

Gods, the amount of disappointments I'm glad I didn't waste money on. The biggest spending I've done on gaming lately is buying myself a Steam Deck. Now I'm enjoying my backlog of indies I got from Humble Monthly.

[-] therealjcdenton@lemmy.zip 1 points 9 months ago

Watch as Half-Life 3 comes out and you won't be able to play it without DLSS or FSR.

[-] Venator@lemmy.nz 1 points 9 months ago

DLAA with frame generation seems pretty good to me in Cyberpunk.

[-] umbrella@lemmy.ml 1 points 9 months ago

I've replayed RDR2 recently and TAA absolutely DESTROYS the beautiful visuals of the game.

That said, stuff like DLSS is a godsend and looks 90% there depending on the situation. It's simply another tradeoff you can make.

My old Nvidia card can punch above its weight because of it.

[-] mlg@lemmy.world 1 points 8 months ago* (last edited 8 months ago)

This is especially apparent, and a major downgrade, in War Thunder, where you need to be able to see player tanks and aircraft in the distance that end up being like 3 pixels on your screen, which DLSS will make invisible lol.

There are definitely some games that benefit from this, but the inconsistency is still enough that it's not really useful for FPS and multiplayer games, where even the slightest change on your screen can dramatically affect your gameplay. A sharp 60 fps is much preferable to an AI-generated 120 fps which may remove detail and accuracy.

[-] Blackmist@feddit.uk 0 points 9 months ago

It's strange how upscaling was bad and "for potato box gamers, hurdur", but give it a special PC-only name and they gobble it up.

The point of increased framerate is to give less input lag, and DLSS3 can't do that because it needs the next frame to be able to interpolate to it. Literally a joke feature, so people can pretend they've got a better PC.
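The latency math is simple enough to sketch (idealized numbers, my own toy model, ignoring render queues and driver buffering): interpolation has to hold frame N back until frame N+1 exists before it can show the in-between frame, so input lag tracks the base render rate, not the doubled output rate.

```python
def base_latency_ms(render_fps: float) -> float:
    """Idealized latency at native rendering: roughly one frame time."""
    return 1000.0 / render_fps

def framegen_latency_ms(render_fps: float) -> float:
    """Interpolation-based frame generation must wait for frame N+1 before
    displaying the frame generated between N and N+1, adding roughly one
    extra render-frame of delay (idealized; real pipelines differ)."""
    return 2.0 * 1000.0 / render_fps

fps = 60.0
print(f"native 60 fps:         ~{base_latency_ms(fps):.1f} ms")
print(f"60->120 fps frame gen: ~{framegen_latency_ms(fps):.1f} ms, shown at 120 fps")
```

So the counter looks like 120 fps while the game responds like something slower than plain 60 fps.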

[-] onlinepersona@programming.dev -1 points 9 months ago

I don't get it... my 36 inch screens are 1080p --> games I play are in 1080p. What's the problem?


[-] Snoopey@lemmy.world -1 points 9 months ago

DLSS 2 is literally voodoo magic that in some cases looks better than native res. Don't take away my DLSS

this post was submitted on 15 Feb 2024
88 points (87.3% liked)
