
Aside from fps, is there any difference in quality of raytracing on Nvidia and AMD, or is it the same? (Like how they say DLSS is better than FSR.)

[-] Vinny_93@lemmy.world 32 points 8 months ago

DLSS works on Tensor cores only available to Nvidia. FSR works on anything. This means that DLSS is more specialised and, if implemented in a game properly, will work better.
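To make the spatial-vs-AI distinction concrete, here's a toy Python sketch (my own illustration, not actual FSR or DLSS code) of plain bilinear upscaling, the kind of per-frame spatial math that FSR 1 builds on. DLSS instead feeds the frame, plus motion vectors and history, through a trained network running on Tensor cores, which is why it needs Nvidia hardware:

```python
# Toy illustration (not actual FSR/DLSS code): spatial upscaling in the
# spirit of FSR 1, which works from the current frame's pixels alone and
# therefore runs on any GPU.

def bilinear_upscale(img, new_w, new_h):
    """Upscale a 2D grayscale image (list of rows) with bilinear filtering."""
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        # Map the output pixel back into source coordinates.
        sy = y * (old_h - 1) / max(new_h - 1, 1)
        y0, fy = int(sy), sy - int(sy)
        y1 = min(y0 + 1, old_h - 1)
        row = []
        for x in range(new_w):
            sx = x * (old_w - 1) / max(new_w - 1, 1)
            x0, fx = int(sx), sx - int(sx)
            x1 = min(x0 + 1, old_w - 1)
            # Blend the four nearest source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

low_res = [[0.0, 1.0],
           [1.0, 0.0]]
high_res = bilinear_upscale(low_res, 4, 4)  # 2x2 -> 4x4
```

A purely spatial filter like this can only smooth between existing pixels; a learned temporal upscaler can reconstruct detail the low-res frame never contained, which is the gap people see between FSR 1 and DLSS.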

GSync only works on Nvidia cards and GSync monitors, whilst FreeSync works on FreeSync and GSync monitors with any gpu.

Now, ray tracing runs on RT cores for Nvidia, and I believe AMD has something similar. The key difference from the former technologies is that ray tracing doesn't have an Nvidia or AMD version; the tech is part of the DirectX 12 Ultimate specification (I think Vulkan has something similar). Both GPU makers support DX12, so they use the same software interface to apply ray tracing.
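For a sense of what that shared specification actually asks the hardware to do, here's a toy CPU intersection test in Python (illustrative only; real GPUs test rays against triangles through a BVH, and that traversal and intersection math is exactly what RT cores accelerate):

```python
import math

# Toy CPU version of the core ray tracing operation: testing whether a
# ray hits geometry. A sphere keeps the math short; real hardware tests
# rays against triangles via a bounding volume hierarchy.

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the first hit, or None if the ray misses."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2*a)
    return t if t >= 0 else None  # ignore hits behind the ray origin

# A ray from the origin down +z toward a sphere 5 units away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)   # t = 4.0
miss = ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0)  # None
```

Since both vendors expose this through the same DX12/Vulkan interface, the math (and thus the image) is identical; what differs is how fast each architecture churns through millions of these tests per frame.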

The fact of the matter is that the RT cores of Nvidia are more effective than the ones AMD utilises. AMD usually combats that by just adding more cores.

In the end, it all hangs on implementation. In some games, AMD will be better because the game devs have optimised it for AMD GPUs. In most games, Nvidia will be better. I suggest looking up benchmarks for games you play with and without ray tracing.

[-] ninjan@lemmy.mildgrim.com 26 points 8 months ago

To clarify, for the purpose of answering OP's question: the quality will be the same, because it's the same code in both cases. But the performance, as in how many FPS you get, will most often differ.

[-] Eric_Pollock@lemmy.dbzer0.com 3 points 8 months ago

GSync only works on Nvidia cards and GSync monitors

I have a Freesync monitor (MSI) with an Nvidia RTX 3060, and Nvidia control panel gives me the option to "enable support for unverified displays" or something. Works just fine for me?

[-] TheGrandNagus@lemmy.world 8 points 8 months ago* (last edited 8 months ago)

That's FreeSync, although Nvidia, confusingly, calls it G-Sync, just like their other frame sync tech.

G-Sync required an expensive module in the display, FreeSync doesn't.

Nvidia lost the G-Sync vs FreeSync battle, but because of their marketing chops, they managed to get away with just slapping their name on it and going with the open solution.

DLSS has been much more successful, but the G-Sync situation would be like Nvidia adopting FSR and rebranding it as DLSS.

[-] Eric_Pollock@lemmy.dbzer0.com 5 points 8 months ago

Oh that's really gross... but of course they get away with something like that.

[-] miss_brainfarts@lemmy.blahaj.zone 6 points 8 months ago

That just means your Nvidia card can make use of a Freesync monitor, there's no "real" Gsync happening there.

Actual Gsync comes with a dedicated hardware module in the monitor, and it used to be only compatible with Nvidia cards, but that's also not the case anymore.

[-] Eric_Pollock@lemmy.dbzer0.com 4 points 8 months ago

So how does it work? Is it a software layer that mimics G-Sync behavior? Something like V-Sync?

[-] miss_brainfarts@lemmy.blahaj.zone 3 points 8 months ago* (last edited 8 months ago)

It's pretty much just leveraging the open VESA Adaptive Sync standard, which AMD Freesync is practically speaking a rebrand of. It's indeed purely software to make it vendor-agnostic.

Well, unless the vendor locks it down/blocks it on purpose, which is what Nvidia has done up until... whenever Gsync Compatible became a thing.
(Misleading name imo, because as said before, there's no actual Gsync running)
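A toy timing model (simplified; it ignores the minimum/maximum sync range real monitors have) shows why adaptive sync feels smoother than V-Sync. With V-Sync the display only flips on a fixed refresh tick, so a frame that takes slightly longer than one tick waits for the next; with adaptive sync the display refreshes the moment the frame is ready:

```python
# Toy frame-pacing model: fixed-tick V-Sync vs. VESA Adaptive Sync.

REFRESH_MS = 1000 / 60  # fixed 60 Hz tick, ~16.7 ms

def vsync_present(render_ms):
    """V-Sync: the frame is held until the next fixed refresh tick."""
    ticks = 1
    while ticks * REFRESH_MS < render_ms:
        ticks += 1
    return ticks * REFRESH_MS

def adaptive_present(render_ms):
    """Adaptive sync: the display waits for the frame instead."""
    return render_ms

for render_ms in (10, 17, 25):
    print(render_ms, "ms render ->",
          round(vsync_present(render_ms), 1), "ms with V-Sync,",
          round(adaptive_present(render_ms), 1), "ms with adaptive sync")
# A 17 ms frame shows after ~33.3 ms under V-Sync (it just missed the
# 16.7 ms tick), but after 17 ms under adaptive sync.
```

That doubling when a frame barely misses the tick is the stutter V-Sync users notice, and it's the same problem whether the solution is branded FreeSync or "G-Sync Compatible".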

[-] Lojcs@lemm.ee 1 points 8 months ago

Couldn't there be a difference between demonising algorithms if those are baked in the drivers?

[-] Vinny_93@lemmy.world 3 points 8 months ago

I have no clue what you mean

[-] Lojcs@lemm.ee 1 points 8 months ago

Autocorrect, meant denoising

[-] Vinny_93@lemmy.world 1 points 8 months ago

Most likely. Bottom line, it's a total package kinda deal. If it were just one or two components, it'd be pretty easy to improve. It's the synergy between the graphics API, the implementation in the game, the GPU, the GPU driver, the CPU, the motherboard chipset, etc., all working together.

[-] Kyouki@lemmy.world 17 points 8 months ago

Should be the same. That said, Nvidia's implementation, using dedicated hardware on board the GPU, does have an advantage. To what degree, I wouldn't know specifically, but I doubt quality changes because of that.

I find it odd that people say "No, Nvidia is better" because it's clearly the same technique; only the hardware on board each vendor's GPUs differs in how it handles that workload.

Personally, I think raytracing is cool, but it's still too early and too costly for GPUs to handle efficiently, and it isn't a factor in which vendor or game I buy. Nor do I care much about it.

[-] Ranvier@sopuli.xyz 6 points 8 months ago* (last edited 8 months ago)

To add to this, most GPU reviews will now have two sets of benchmarks, one with ray tracing and one without. You can see the gap in raytracing performance at each price point narrowing considerably over the years as AMD catches up. It also narrows further at higher resolutions (since the price-equivalent AMD options tend to have higher raw performance and more memory, which becomes increasingly important at higher resolutions). Right now, all else being equal, at most price points you'll see AMD with a lead in non-raytracing performance and Nvidia with a lead in raytracing performance. In addition to considering target resolution, which card wins out can also vary a lot per game. So if you have a particular game in mind, I would see if there is a benchmark for that game, so you know what to expect with different cards and can see what makes the most sense for your targeted performance, budget, and priorities.

And to clarify for OP, when I say raytracing performance, I mean the fps with raytracing turned on. Visually it will appear the same in each particular game no matter what GPU you're using, since it's the game that implements the ray tracing. The one exception I know of in terms of actual quality right now is "ray reconstruction", a part of DLSS, which only works on Nvidia chips, and which they claim reduces the noise between individual rays better than traditional denoisers through use of AI. Theoretically there should be other ways to reduce noise at a performance cost too, so in the end it does come down to performance and game-by-game implementation again. Not a lot of games have this right now; I think Cyberpunk, Portal, and Control.
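The reason denoising matters at all can be shown with a toy Monte Carlo sketch (my own illustration, not any vendor's actual denoiser). Each pixel averages a handful of random ray samples, so with a small ray budget the estimate is noisy, and the error only shrinks roughly with the square root of the sample count. That's why a smart denoiser is much cheaper than simply tracing more rays:

```python
import random

# Toy illustration of ray tracing noise: a pixel's brightness is
# estimated by averaging random ray samples. Few rays -> noisy image;
# error falls roughly as 1/sqrt(rays per pixel).

random.seed(42)
TRUE_BRIGHTNESS = 0.5  # the value an infinite number of rays would converge to

def pixel_estimate(rays_per_pixel):
    """Average several noisy ray samples for one pixel."""
    samples = [random.uniform(0.0, 1.0) for _ in range(rays_per_pixel)]
    return sum(samples) / rays_per_pixel

def mean_abs_error(rays_per_pixel, pixels=2000):
    """Average per-pixel error across an image at a given ray budget."""
    errs = [abs(pixel_estimate(rays_per_pixel) - TRUE_BRIGHTNESS)
            for _ in range(pixels)]
    return sum(errs) / pixels

low = mean_abs_error(4)     # few rays per pixel: visibly noisy
high = mean_abs_error(64)   # 16x the rays: much closer, but 16x the cost
```

Going from 4 to 64 rays per pixel cuts the error by only about 4x while costing 16x the work, which is why every real-time ray tracer leans on a denoiser (AI-based like ray reconstruction, or traditional) instead of brute-forcing more rays.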

Especially since I use VR sometimes, I tend to favor raw power at the price point to get the best resolutions and frame rates. If you just want a great picture at lower resolutions like 1080p, there start to be diminishing returns (is 180 fps really a better experience than 120 fps?) to favoring non-raytracing performance, maybe making a lower-raw-performance Nvidia card even more of a consideration if you feel its non-raytracing performance is good enough. And then if money is no object, Nvidia of course has the best performing GPU overall in all aspects at the extreme high-price end (4090), with no equivalent AMD option at that level.

Also, DLSS vs FSR needs to be considered, FSR being not as far along as DLSS. This matters more at the lower end, though (except in the case of ray reconstruction); higher-end GPUs likely won't need to rely on these technologies to achieve good fps in current games. Hopefully FSR continues to improve and becomes a more widespread option. AMD is also working on fluid motion frames at the driver level, which may allow a similar effect to FSR 3 even if not implemented specifically by the game.
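The core idea behind frame generation tech like FSR 3 or fluid motion frames can be sketched in a few lines (heavily simplified; the real thing uses motion vectors and optical flow rather than a plain blend): synthesize an in-between frame from two rendered ones to raise the apparent frame rate without rendering more.

```python
# Toy frame interpolation sketch: blend two rendered frames to create
# an in-between frame. Real frame generation warps pixels along motion
# vectors instead of naively blending, to avoid ghosting.

def interpolate_frame(prev, nxt, t=0.5):
    """Blend two frames (2D lists of brightness values) at time t in [0, 1]."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(prev, nxt)]

frame_a = [[0.0, 0.2], [0.4, 0.6]]
frame_b = [[0.2, 0.4], [0.6, 0.8]]
mid = interpolate_frame(frame_a, frame_b)  # halfway between the two
```

Doing this at the driver level, as AMD proposes, means the game doesn't need to supply anything; the trade-off is the driver has less information (no game-provided motion vectors) to interpolate with.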

[-] Dexx1s@lemmy.world 2 points 8 months ago

I find it odd that people say "No, Nvidia is better"

I'd assume that by better, they mean the performance, in which Nvidia is definitely better. They've been doing it for longer, and at this point, if you account for the number of years in the game, they're pretty equal. 20-series ray tracing is a joke.

The majority of people aren't bothering with RT anyway.

[-] Kyouki@lemmy.world 2 points 8 months ago

Advantage, not better in terms of quality as OP asked.

[-] RedWeasel@lemmy.world 2 points 8 months ago

I think it depends. If it is just basic RT, then it should be about the same. If it uses Nvidia's ray reconstruction it will have a higher fidelity. I don't think they have started using ray reconstruction on anything but path tracing yet, though they are probably going to bring it down to more performant methods.

this post was submitted on 18 Mar 2024
39 points (95.3% liked)

PC Master Race