tnt

@kkarhan @sima @marcan Does that include Nvidia HW? I never tested it myself; it's just what I've read that my perf would be way worse with Wayland, and what I saw in recently published benchmarks too.
(And AFAIK it's being worked on and improving, but I'm just looking to know the status, perf-wise, today, running Proton on Xorg vs. Wayland.)

Hector Martin

@tnt @kkarhan @sima Nvidia Wayland support is known to be horrible and broken, and that's 100% Nvidia's fault and a major cause of reputational damage to Wayland.

Andy

@tnt @kkarhan @sima @marcan After marcan's tweet yesterday, I decided to give Wayland on an RTX 2080 Ti a go again. I've switched back to Xorg now, because Nvidia's Wayland support is absolutely horrible:

The most annoying part: Electron apps, and Proton with DXVK, show the weird behavior that old frames are displayed between newer ones. While typing in Slack, for example, the letters you just typed vanish for a split second and then show up again. The same happens in Far Cry 6 when running in gamescope + Proton, which results in horrible stuttering. Some synchronization primitive is broken. I'm not the only one, and I'm running the closed-source driver, so that one seems to be affected as well: github.com/NVIDIA/open-gpu-ker

Also I tried to 3D print something and Cura just crashed with an int3 on its way through `gdk_display_manager_open_display`. I didn't debug it further and just switched back to Xorg.

See you in a year (or when I decide to buy new, non-Nvidia hardware)!


Hector Martin

@G33KatWork @tnt @kkarhan @sima Yes, it's completely broken on Nvidia, and it's entirely Nvidia's fault, and this is one major reason Wayland undeservedly gets a bad rap.

For the record, the missing synchronization feature in the Nvidia proprietary drivers is implicit sync. It's been years and they still don't have it. Wayland is completely broken without it (so is X on modesetting for that matter, but presumably they do something else in their proprietary DDX).

@lina implemented it for the Asahi GPU driver in two weeks, plus maybe a couple more of debugging, give or take. It's already been shipping to users for a while, with a few rare glitches that are identified and fixed for the next version already.
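The implicit-sync model Hector is referring to can be caricatured in a few lines. This is a toy userspace illustration with made-up names (`Fence`, `Buffer`, `compositor_read`); the real mechanism is a kernel dma-fence attached to a dma-buf, which the driver waits on for you. The point it sketches is why a driver without implicit sync produces exactly the stale-frame symptom Andy describes:

```python
# Toy model of implicit sync on a shared buffer (illustration only; the real
# mechanism lives in the kernel's dma-buf/dma-fence code, not userspace).

class Fence:
    """Stands in for a dma-fence: signaled once the GPU write completes."""
    def __init__(self):
        self.signaled = False
    def signal(self):
        self.signaled = True

class Buffer:
    """A shared buffer; with implicit sync, the producer's fence rides along."""
    def __init__(self):
        self.pixels = "old frame"
        self.write_fence = None  # attached implicitly by the producer

def gpu_render(buf, new_pixels):
    """Producer: queue a write and attach its completion fence to the buffer."""
    fence = Fence()
    buf.write_fence = fence
    def complete():  # called when the GPU actually finishes
        buf.pixels = new_pixels
        fence.signal()
    return complete

def compositor_read(buf, honor_implicit_sync):
    """Consumer: a compositor sampling the client's buffer."""
    if honor_implicit_sync and buf.write_fence and not buf.write_fence.signaled:
        return None  # the driver would make this read wait on the fence
    return buf.pixels  # no sync: may sample a stale frame

buf = Buffer()
finish = gpu_render(buf, "new frame")

# Before the GPU finishes: without implicit sync you get stale data.
assert compositor_read(buf, honor_implicit_sync=False) == "old frame"
assert compositor_read(buf, honor_implicit_sync=True) is None  # would wait

finish()
assert compositor_read(buf, honor_implicit_sync=True) == "new frame"
```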


DELETED

@marcan @G33KatWork @tnt @kkarhan @sima @lina Sadly, NVIDIA is still more or less the biggest player, so NVIDIA not working well with it will cause headaches for a lot of people, including me. People are not going to switch to different, worse-performing GPUs for their tasks (even if the main perf advantage NVIDIA has is brute force with massive power draw).

Hopefully, now that NVIDIA has an open kernel driver, some of this can be alleviated?

DistroHopper39B :verified:

@LunaFoxgirlVT @marcan @G33KatWork @tnt @kkarhan @sima @lina this goes along with NVIDIA also supporting CUDA, while AMD's Linux OpenCL support comes in the form of a DKMS driver that only seems to support a 3-year-old LTS of Ubuntu and is broken on the last 2 LTS kernels (or at least was, the last time I tried to use it)

🐧sima🐧

@LunaFoxgirlVT @marcan @G33KatWork @tnt @kkarhan @lina the open driver only fixes nvidia's issue of no longer being able to cheat to get access to GPL-only kernel services, which they need for cuda

the other thing they had to fix is make the fw redistributable, which was the total killer before

it's still a gigantic mess because they don't do any kind of reasonable fw api versioning, which means doing a real linux driver with all the features in upstream is still very hard, and unnecessarily so

Hector Martin

@sima @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina To be fair Apple also aren't doing any FW API versioning, and we're dealing with it anyway :P

🐧sima🐧

@marcan @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina you don't need to load it from linux (so no lolz with redistribution rights) and I thought in the bootloader entry you can spec which one you want, so that you don't have to support them all? at least it sounded somewhat reasonable

nvidia didn't even do that for years, until they were forced because the kernel's module loader got stricter with enforcing GPL-only module access, and that broke cuda

Hector Martin replied to 🐧sima🐧

@sima @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina We pick the supported versions and our installer only offers those, but if it's loaded by Linux itself then you should also be able to restrict the set of supported versions, right?
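The version pinning Hector describes here could look, very roughly, like the following. This is a hypothetical sketch; the version strings and the `load_firmware` helper are made up, not taken from any real driver:

```python
# Hypothetical sketch: a driver pinned to the firmware ABI versions it was
# actually validated against, refusing anything else at load time.

SUPPORTED_FW_VERSIONS = {"12.3", "13.0"}  # made-up version identifiers

def load_firmware(header_version):
    """Refuse to load firmware the driver was never validated against."""
    if header_version not in SUPPORTED_FW_VERSIONS:
        raise RuntimeError(
            f"firmware ABI {header_version} not supported; "
            f"expected one of {sorted(SUPPORTED_FW_VERSIONS)}")
    return f"loaded fw {header_version}"

assert load_firmware("13.0") == "loaded fw 13.0"
```

Without such a check (or without any fw api versioning at all, as sima notes), the driver has to chase whatever blob happens to ship, which is where the mess comes from.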

🐧sima🐧 replied to Hector

@marcan @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina yes

maybe it's just me being extremely biased because nvidia has a track record of maximally screwing over the open drivers, but gut feeling is that the nvidia way sounds a lot more messy

like from what I've heard apple's design seems pretty settled, which helps. nvidia's is a first cut because they panicked and tried way too hard to hide stuff in the fw, and some things will need to change drastically, at least for compute in upstream

Hector Martin replied to 🐧sima🐧

@sima @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina Yeah, I don't know if it's good or bad but at least for us we *know* we're getting whatever Apple does on macOS and there isn't any room for asking for something else, so the path forward is clear regardless of how easy or hard it is.

🐧sima🐧 replied to Hector

@marcan @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina yeah

I guess my worry is that nvidia has a track record of actively making the open stack harder than necessary, and some of the things suggest the new open driver + new fw blob is going to be a repeat

apple seems to just not care and so designs more things in a way that makes sense, instead of trying real hard to make the gpl'ed kernel driver as small as possible (now that impossible is out), whether that makes sense technically or not

Kevin Karhan :verified: replied to 🐧sima🐧

@sima @marcan @LunaFoxgirlVT @G33KatWork @tnt @lina *nods in agreement*
#Apple really did leverage the solid basis of #ARM / #ARM64 as a cleaner [not entirely clean, tho!] slate on their machines.
And whilst I can only speculate on how Apple 'feels' about @AsahiLinux, I am convinced that a lot of driver and hardware engineers there would love to commit code if they weren't under dozens of NDAs.

🐧sima🐧 replied to 🐧sima🐧

@marcan @LunaFoxgirlVT @G33KatWork @tnt @kkarhan @lina at least in my experience nothing good ever comes out of designing fw when your principle is to hide all your "vendor value add" in it and move it out of the gpl'ed kernel driver

I've seen that in other places than nvidia, and it's absolute pain. and from what I've heard, this "hide it all in fw and use the kernel as the new shim to get access to gpl stuff" is absolutely their plan

that's also why the fw is ginormous

Gen X-Wing

@marcan @G33KatWork @tnt @kkarhan @sima @lina She did it without proper documentation and without the support of a massive corporation as well.

Let’s face it: it’s not incompetence from nVidia, it’s disinterest. I mean, Lina (hi!) is good, but given time and documentation I’m sure I could do it too. Their Windows devs surely could.

Not claiming anything high and mighty with this, but I selected an AMD GPU for this very reason.

tnt

@G33KatWork @kkarhan @sima @marcan

Did you try with the part-binary kernel module, or with the open-gpu-kernel-modules one? Not sure if it makes any difference...

Andy

@tnt @kkarhan @sima @marcan Just the binary driver from the Arch Linux package repository. I didn't even touch the open source driver yet.
