Linux vs Windows tested in 10 games - Linux 17% faster on Average
(video.hardlimit.com)
Discussions and news about gaming on the GNU/Linux family of operating systems (including the Steam Deck). Potentially a $HOME
away from home for disgruntled /r/linux_gaming denizens of the redditarian demesne.
Man, look: I've been using Linux as a daily driver for 18 years, and people have been saying exactly what you're saying since before performance was even comparable.
You're not going to get 17% better performance on the GPU just because you're using another operating system; it's not going to happen unless you're running a Linux-native version of the game, and oftentimes that isn't even the case.
Performance can be a little bit better if the game is natively OpenGL or Vulkan, but if it's DirectX (the vast majority of Windows games) then it's going to be comparable at best in GPU-bound scenarios, i.e. most of the games people are playing on PC.
You can't just magically put more transistors in a GPU just because you're running a different OS. CPU-bound games run better on Linux because of the god-tier scheduler, but a GPU is essentially a computer in itself; all the drivers do is tell the GPU to take this information and translate it into something you see on a screen.
By the way, the Nvidia thing has been false for quite some time now. I primarily use AMD on Linux, but the only place you'll run into issues with Nvidia is Wayland; otherwise it works perfectly fine everywhere else.
"works fine" is very different than "is equivalently optimized."
Valve has done a lot of work to get games to work well on the Steam Deck, and that likely translates to other AMD GPUs. So it makes total sense that Valve would optimize the Proton translation layer for DirectX calls to the AMD driver differently than the NVIDIA driver (or rather, in a way that AMD handles better). A big issue in GPU optimization is keeping the GPU busy, so perhaps the AMD driver working with Valve's patches on the DirectX-to-Vulkan layer improves utilization. That could translate to a modest performance improvement even in well-optimized games (perhaps 5-10%, probably not more than 20%).
I don't know if that's what's going on here, but it's a plausible explanation.
I can see why you'd think that, but what you fail to understand is that Valve is not the only one working on Proton, and Valve themselves didn't even make DXVK. Those are free and open source efforts, and Valve even pays external devs to commit to that software. I'm telling you that DXVK itself is not going to give a boost to graphical performance because it literally cannot; those are extra instructions that have to be performed in order to send out frames.
DirectX-to-Vulkan translation is exactly that: translation. It receives DirectX calls and translates them to Vulkan. For one, it has overhead; for two, if the game is optimized, it's already going to be running at max performance on Windows, and using DXVK is going to slow the GPU time down because it has to perform more calculations. No scheduler will save you from that, not even the Linux one, because it isn't something the scheduler handles.
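To make the overhead point concrete, here's a toy sketch in Python (purely illustrative; real DXVK translates GPU command streams in C++, and `native_draw`/`TOPOLOGY_MAP` are made-up names, not real API entry points): a translated call produces the same result as the native call, but it does extra remapping work on every single invocation.

```python
def native_draw(vertex_count, topology):
    """Stand-in for a native API call (think: a driver entry point)."""
    return f"draw {vertex_count} vertices as {topology}"

# Hypothetical mapping from "DirectX-style" topology names to "Vulkan-style" ones.
TOPOLOGY_MAP = {"TRIANGLELIST": "TRIANGLE_LIST", "LINELIST": "LINE_LIST"}

def translated_draw(vertex_count, topology):
    """Stand-in for a DXVK-style shim: same result, plus per-call translation work."""
    vk_topology = TOPOLOGY_MAP[topology]  # extra lookup on every call = overhead
    return native_draw(vertex_count, vk_topology)

print(translated_draw(3, "TRIANGLELIST"))  # → draw 3 vertices as TRIANGLE_LIST
```

The shim never makes the underlying call faster; at best it costs nothing noticeable, which is the argument being made here.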
Two things:
DirectX -> Vulkan isn't a direct translation since the APIs aren't 1:1, so there's going to be some tuning in how APIs are mapped, and the tuning can differ depending on the GPU driver you're using.
It's the same with processors: you can optimize a compiler to work better on AMD vs. Intel or vice versa (look at Intel C++ compiler benchmarks for an example), even if they use the exact same instruction set, because the microarchitectures are optimized differently. The way the instruction stream gets mapped to the microarchitecture can impact performance significantly (something like 10% is possible, depending on the benchmark).
GPU drivers are complicated, and there are a lot of areas where the interaction between the driver, software, and system services can be optimized. AMD's drivers are open source, which helps with those optimization efforts. Then you throw in a big, well-funded, and motivated company like Valve funding development (both through salaries and donations) and you end up with AMD GPUs getting extra attention for things like DXVK.
So I would expect AMD on Linux to perform better vs NVIDIA on Linux when compared to AMD vs NVIDIA on Windows. As in, the performance difference on Linux vs Windows would be more favorable for AMD cards than NVIDIA ones because AMD on Linux gets more attention than NVIDIA on Linux. I don't expect the same for compute, since NVIDIA invests heavily in that space on Linux, so it's not an inherent advantage of the platform (e.g. the scheduler discussion), but a question of where optimization efforts are focused.
Alright look, I'm not going to argue about who said what because we both know what we said and it is unrelated to the topic at hand.
The reason the Windows AMD driver is bad is not performance; it's the very same reason the proprietary driver is bad on Linux: horrible reliability.
There are circumstances where they trade blows and circumstances where they perform similarly. If you really want to compare the two based on OS alone, you need to compare the equivalent drivers which is the proprietary one.
We're already not doing an apples-to-apples comparison here because we're comparing WINE+DXVK vs DirectX. Comparing the OS itself isn't that interesting, at least from an end-user perspective; what is interesting is comparing the typical user experience on both platforms. As in, no tinkering with stuff, just installing in the most obvious way.
Valve is optimizing for that typical user experience on their Steam Deck, and that translates to the desktop fairly well. They're not really doing the same on Windows, so it's interesting to compare devs+manufacturers optimizing stuff on Windows vs the community+Valve optimizing stuff on Linux.
Why would not comparing the OS itself be interesting? That is literally the foundation of everything you are seeing on the screen.
You also can't just compare WINE+DXVK to DirectX, because you can actually use DXVK on Windows. If the video title were "DirectX vs DXVK" then that would be totally fair, but it's not; it's called "Windows vs Linux". I'm simply trying to say that the vast majority of games are not going to see a 17% increase in GPU performance; your biggest boost is going to lie with CPU-bound games, because that's the truth.
The only time you'll see a game perform better on a GPU on Linux is when the game has a native version, and even then that only applies if they actively develop that version, many games are not actively developed and are even a few versions behind.
Because regular users aren't going to be changing drivers based on the game, or doing a ton of system-level configuration to get a bit better performance.
So it should be defaults vs defaults.
If we want to compare OSes, we should do targeted benchmarks (Phoronix does a ton of those). There are far more interesting ways to compare schedulers than running games, and the same is true for disk performance, GPU overhead, etc.
How many people actually do that though? I'm guessing not many.
"Windows vs Linux" is comparing the default experiences on both systems, and that's interesting for people who are unlikely to change the defaults (i.e. most people).
That's just not true, as evidenced by this video. If you take the typical setup on Windows vs the typical setup on Linux, it seems you get a 17% average performance uplift on Linux on these games.
That doesn't mean Linux is 17% faster than Windows, nor does it mean you should expect games to run 17% better on Linux, it just means Linux is competitive and sometimes faster with the default configuration. And that's interesting.
Linux does not have a default configuration; that's why we have over 600 distros. If you want a baseline "default configuration" then Fedora would be the way to go, which he has not used.
Yes, he got a performance uplift of 17% on average in these games; the point he's trying to make is that you can get this in every game on Linux, which is not true.
Most of those games are also CPU bound, an area where Linux is going to destroy Windows. Once again, I'm referring to GPU performance specifically, as that's the general point OP makes with these posts.
Sure, but each distro has a default configuration, and distros don't vary that much in terms of performance with those default configurations for playing games. If there is a consistent performance difference, it'll likely be something like 1-2%, which should be within run-to-run variance and not really impact the results.
And anyone who assumes that an average across 10 games represents the difference you'll see on average for your own games doesn't understand statistics: 10 games is not enough to be a representative sample, especially since they weren't even randomly selected to begin with. It's still an interesting result, though.
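As a rough illustration of why a 10-game average is thin evidence (Python; the per-game FPS ratios below are invented numbers, not the video's measurements), the 95% confidence interval around a 10-sample mean uplift comes out very wide:

```python
import statistics

# Hypothetical per-game FPS ratios (Linux FPS / Windows FPS) — NOT the video's data.
ratios = [1.25, 1.02, 1.31, 0.97, 1.18, 1.40, 1.05, 1.22, 0.99, 1.29]

mean = statistics.mean(ratios)          # sample mean of the uplift ratios
sd = statistics.stdev(ratios)           # sample standard deviation
n = len(ratios)
t = 2.262                               # two-sided 95% t critical value, n-1 = 9 dof
half_width = t * sd / n ** 0.5          # half-width of the t confidence interval

print(f"mean uplift: {(mean - 1) * 100:.1f}%")
print(f"95% CI: {(mean - half_width - 1) * 100:.1f}% to {(mean + half_width - 1) * 100:.1f}%")
```

With these made-up numbers the mean uplift is about 17%, but the interval spans roughly +6% to +28%; an interval that wide says little about what any particular game will do, and that's before accounting for the games not being randomly chosen.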
You're being hyperbolic here.
The differences, all else being equal, should be pretty small most of the time unless there's a hardware driver issue (e.g. when Intel's new p-core vs e-core split came out, Windows had much better support).
If we're seeing a huge difference, more is going on than just a "better" scheduler or more efficient kernel or whatever. It's much more likely Windows is using DirectX and Linux is using DXVK or something. The bigger the gap, the less likely it's the kernel that's doing it.
As someone who has used Linux exclusively for ~15 years, these kinds of benchmarks are certainly exciting. However, we need to be careful to not read too much into them.
That may be true, but the de facto default today is Proton Experimental on Steam with a recent Linux kernel. That's pretty much the same across all distros.
Yup, the difference between Ubuntu, Fedora, and Arch or whatever isn't going to be all that big, assuming you're working with each distribution's default kernel and running with Steam's provided runtime. You might get 1-2% here and there, but that's pretty much within run-to-run variance anyway.
Those aren't all the factors that play a role in game performance.
For instance, what fork of the kernel are they using? Are they using zram? What graphics driver are they using? Gamescope? Gamemode? All of those things affect a game's performance to varying degrees.
Also, Proton experimental is definitely not the default on any system, that would be Proton 8.
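A couple of those factors can at least be inspected from userspace. Here's a minimal Python sketch (the `/proc` path is Linux-specific, `perf_snapshot` is a made-up name, and it only covers kernel version and zram; detecting Gamescope or Gamemode would need extra checks):

```python
import platform
from pathlib import Path

def perf_snapshot():
    """Best-effort snapshot of a couple of performance-relevant settings."""
    info = {"kernel": platform.release()}  # which kernel (fork/version) is running
    swaps = Path("/proc/swaps")            # zram devices show up as swap entries
    info["zram"] = swaps.exists() and "zram" in swaps.read_text()
    return info

print(perf_snapshot())
```

Recording a snapshot like this alongside each benchmark run is one way to make "defaults vs defaults" comparisons reproducible.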
I don't see an argument which disproves my results apart from your disbelief. But I like the Nvidia comment. I'll do a video of Linux vs Windows on my 3080M laptop. We'll see how true it is that Nvidia works as well as AMD on Linux. :)
Go right on ahead, I've done the tests myself already.
Keep in mind though that if you're using a laptop, Nvidia tends to work better when paired with Intel than with AMD for the sake of graphics offloading.
I don't think you understand how this works; I'm not trying to disprove anything, you are the one trying to prove something. You chose 10 very specific games to run these tests, some of them heavily CPU bound, and state that you're getting an increase in GPU performance when that's simply not the case. All of these games are also optimized for Proton, which doesn't help your case.
Tell you what, why don't you give something like "Spec Ops: The Line" a test? Halo Infinite? 40k Darktide? Vermintide 2? Dying Light? Hell, infinite and darktide are very popular in the Linux gaming community, I was even one of the beta testers for darktide.
You say that like I'm afraid to do it. You're missing the point that these games don't have benchmarks lol. If you want I can do a gameplay comparison, but then don't tell me the areas or movements aren't the same. :)
Also, these games couldn't be more diverse. I tested DXVK, VKD3D, and Vulkan (both on Linux and Windows) with these games. If you can find a more diverse benchmark, please let me know, cause I haven't found one.
Also, I'm already doing benchmarks on my i7-10870H and 3080 laptop. Linux won't go above 80W because of the Nvidia drivers (545 beta, btw), so the difference will be IMMENSE in Windows' favor there.
You don't need a specialized benchmark to do a benchmark; you can use a real-time rendered cutscene, and you can average over several games. That's how they've been done for like a decade and a half at this point.
Also, I'm not referring specifically to mobile Nvidia graphics, but Nvidia altogether. Linux laptop gamers make up a very, very small share of total Linux gamers; it's an incredibly small niche at the intersection of two already small niches, Linux gamers and laptop gamers. Yes, of course if you have a limit on total power, it will lag behind.
I gave you a list of games, start there; my list is also diverse and covers all of those APIs except Vulkan, and if you want, throw Doom Eternal in there for that. Though as I've already stated, Vulkan will get a small increase on Linux over Windows in terms of GPU performance, so that's not really proving anything anyone doesn't already know.
If you want a fair comparison, limit it to 80 watts on Windows as well. Remember though that power is NOT EVERYTHING when it comes to GPU performance. All of the games I listed above are GPU-bound games and will be a fair comparison. Just a heads up: Darktide may or may not have graphical glitches on your system if you're running AMD (on both operating systems; it's hardware related). I've worked with the devs to fix it in the past, but it seems like recently people have been having issues with it again.
I only have Doom Eternal and Vermintide 2 from the games you mentioned. I can do the opening sequences of those. Is that ok?
Vermintide 2 would be fine, but everyone already knows how Eternal is going to work out; that's a mostly CPU-bound game.
Edit: also, Halo Infinite is free, and if I remember correctly it has a benchmark.
I know the multiplayer is free. Does that have a benchmark?
Yes
Awesome, I'll check it out then! :)
This is awesome, I didn't know that!
It sounds like some time in that 18 years, you solidified this impression, and are choosing to not recognize the advancements in Proton and drivers that have occurred post-Steam Deck.
I've been using Linux since before Xwindows existed, and I am open to OP's research. Just because we've used it longer doesn't make either of us right without proof. OP supplied evidence. Prove them wrong.
Why are you blatantly lying like this? X came out seven years before the Linux kernel was even released, and even then, there wasn't a working system for the Linux kernel when it was released. Keep in mind I said I DAILIED Linux for 18 years; I didn't say USED. I've been using Linux for 27 years now. I actually remember a time when Linux was not an operating system people would use to play games on.
I'm using my time in the community specifically as an example to show that this is not the first time I've heard this. OP supplied evidence from ten very specific games here; there are over 12,000 games on ProtonDB that are "playable", not even verified. I've run across quite a few games myself that run at half to three quarters of the performance they get on Windows, and that's absolutely fine.
Telling people that using Linux will get you a "free performance boost of as much as 17%" when it very likely will NOT will create a lot more angst towards the Linux community than there already is. The elitists are already doing that for us; we don't need more of it.
We should be pushing people towards Linux for digital privacy+security and free software, not cherry picked performance boosts.
Yes, I very well recognize the black magic sorcery of Proton and WINE, but you're sitting here trying to tell me that Proton is somehow going to make your GPU physically push more calculations per cycle just because it's running on Linux. Don't even give me the "Mesa drivers" spiel either, which is also BS, as performance is not the main area where the FOSS drivers are better.
Linux is not going to break the laws of physics, buddy. I've already said what I said: a boost in CPU-bound games, little to no boost in GPU-bound games. If you're seeing a boost, it's because you have a CPU bottleneck and you're getting it because of the scheduler.