FS#56993 - [nvidia-utils] 387.34-4 breaks glxgears

Attached to Project: Arch Linux
Opened by ads (flv) - Sunday, 07 January 2018, 17:15 GMT
Last edited by Sven-Hendrik Haase (Svenstaro) - Tuesday, 09 January 2018, 10:33 GMT
Task Type Bug Report
Category Packages: Testing
Status Closed
Assigned To Sven-Hendrik Haase (Svenstaro)
Architecture All
Severity Low
Priority Normal
Reported Version
Due in Version Undecided
Due Date Undecided
Percent Complete 100%
Votes 8
Private No

Details

Description:

After installing 387.34-4, glxgears doesn't work on the integrated Intel card.

The output is similar to this:
https://bbs.archlinux.org/viewtopic.php?id=135562

Steps to reproduce:
install version 387.34-4 on a laptop with integrated (Intel) graphics
run glxgears

output similar to this: https://bbs.archlinux.org/viewtopic.php?id=135562
This task depends upon

Closed by  Sven-Hendrik Haase (Svenstaro)
Tuesday, 09 January 2018, 10:33 GMT
Reason for closing:  Fixed
Comment by Sven-Hendrik Haase (Svenstaro) - Sunday, 07 January 2018, 19:18 GMT
Is this with an optimus setup?
Comment by ads (flv) - Sunday, 07 January 2018, 19:24 GMT
Hey. Yes it is. Intel 620 + NVidia 940
Comment by ads (flv) - Sunday, 07 January 2018, 19:34 GMT
Just to be clear. I'm using bumblebee to switch between the cards, so the system starts X with the Intel card.

The -3 version of the package works without issue, so the bug is definitely due to changes in -4
Comment by Sven-Hendrik Haase (Svenstaro) - Sunday, 07 January 2018, 19:51 GMT
Well, here's the issue: before -4, X didn't start at all on dual-GPU systems. I'm not entirely sure what to do here, and I don't have a system to test this on anymore.
Comment by ads (flv) - Sunday, 07 January 2018, 20:27 GMT
Yes, it did and it still does

Running -3 and everything works fine. X starts on Intel, optirun runs programs on the dGPU just fine.

I repeat. -3 is working correctly
Comment by ads (flv) - Sunday, 07 January 2018, 20:50 GMT
Added Xorg logs. It seems that with -4 the nvidia /usr/lib/nvidia/xorg/libglx.so is loaded instead of /usr/lib/xorg/modules/extensions/libglx.so

Xorg.1 is for -4
Xorg.0 for -3
Comment by Yakumo Ran (kitsunyan) - Monday, 08 January 2018, 00:10 GMT
That happened because /usr/lib/nvidia/xorg was added to the module path, so the nvidia glx module shadows the xorg glx module.

In order to fix it you can create /etc/X11/xorg.conf.d/20-modulepath.conf with the following content:

Section "Files"
ModulePath "/usr/lib/xorg/modules"
EndSection
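
A minimal sketch of that workaround, staged in a temporary directory so it can run unprivileged (copy the resulting file to /etc/X11/xorg.conf.d/ as root to apply it):

```shell
# Stage the drop-in in a temp dir; move it to /etc/X11/xorg.conf.d/ as
# root to apply. The file contents match the workaround above.
dir="$(mktemp -d)"
cat > "$dir/20-modulepath.conf" <<'EOF'
Section "Files"
        ModulePath "/usr/lib/xorg/modules"
EndSection
EOF
cat "$dir/20-modulepath.conf"
```

Note that this restricts the search path of the primary X server only; bumblebee's secondary server is started with its own configuration and module path, so it should still find the nvidia-provided glx module.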
Comment by Sven-Hendrik Haase (Svenstaro) - Monday, 08 January 2018, 00:10 GMT
I got another bug report which says that -3 didn't work perfectly after all. I'm getting really mixed signals here, and I'm not sure how to fix it for both groups.
Comment by Neros (Neros) - Monday, 08 January 2018, 01:31 GMT
@Yakumo Ran, your fix works for me (with /etc/X11/.. and not /usr/X11/..), thank you!
Comment by tgn-ff (tgn-ff) - Monday, 08 January 2018, 02:32 GMT
In the other bug report, the user doesn't seem to use bumblebee. What seems to happen here is that, because of the new 99-nvidia.conf file, X tries to use nvidia's libglx on the Intel card too. Adding a higher-priority file to the bumblebee package that loads /usr/lib/xorg/modules first should work for everyone.
Comment by ads (flv) - Monday, 08 January 2018, 04:22 GMT
Here's how I fixed my issue:

cat /usr/share/X11/xorg.conf.d/99-nvidia.conf

Section "Files"
ModulePath "/usr/lib/xorg/modules"
ModulePath "/usr/lib/nvidia/xorg"
EndSection


I've basically switched the order of the two ModulePath entries. Somebody without dual graphics should probably test this.
Comment by ads (flv) - Monday, 08 January 2018, 04:33 GMT
What's weird is that /usr/lib/nvidia/xorg is in 10-nvidia-drm-outputclass.conf too.

cat /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf
Section "OutputClass"
Identifier "intel"
MatchDriver "i915"
Driver "modesetting"
EndSection

Section "OutputClass"
Identifier "nvidia"
MatchDriver "nvidia-drm"
Driver "nvidia"
Option "AllowEmptyInitialConfiguration"
Option "PrimaryGPU" "yes"
ModulePath "/usr/lib/nvidia/xorg"
EndSection


so 99-nvidia.conf could probably be removed if this is done:

cat /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf
Section "OutputClass"
Identifier "intel"
MatchDriver "i915"
Driver "modesetting"
EndSection

Section "OutputClass"
Identifier "nvidia"
MatchDriver "nvidia-drm"
Driver "nvidia"
Option "AllowEmptyInitialConfiguration"
Option "PrimaryGPU" "yes"
ModulePath "/usr/lib/xorg/modules"
ModulePath "/usr/lib/nvidia/xorg"
EndSection
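
If the entry order is what matters (Xorg searches ModulePath entries in the order they are listed, so /usr/lib/xorg/modules must come before /usr/lib/nvidia/xorg for the stock libglx to win), a quick sanity check against a staged copy of the proposed section could look like this:

```shell
# Staged copy of the proposed nvidia OutputClass section; the real file
# is /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf.
conf="$(mktemp)"
cat > "$conf" <<'EOF'
Section "OutputClass"
        Identifier "nvidia"
        MatchDriver "nvidia-drm"
        Driver "nvidia"
        Option "AllowEmptyInitialConfiguration"
        Option "PrimaryGPU" "yes"
        ModulePath "/usr/lib/xorg/modules"
        ModulePath "/usr/lib/nvidia/xorg"
EndSection
EOF
# The first ModulePath entry should be the stock xorg module directory
grep ModulePath "$conf" | head -n1
```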

Comment by Arthur Boudart (Akila) - Monday, 08 January 2018, 10:52 GMT
Just to say, this also broke Steam for me ("OpenGL GLX extension not supported by display"). I resolved the error by downgrading nvidia-utils to -3, so -4 broke it.
Comment by David Roth (V1del) - Monday, 08 January 2018, 11:17 GMT
IMO this should be reverted. One of the bugs Svenstaro refers to didn't contain any useful info, and trying to start a second X server is at best a corner case (we also don't know how that second X server start was attempted, or whether it loaded the necessary config files). Anyone who wants this to work without bumblebee should follow the Nvidia Optimus wiki documentation.

Comment by Sven-Hendrik Haase (Svenstaro) - Monday, 08 January 2018, 13:22 GMT
Can you guys check out the new packages in [testing]?
Comment by ads (flv) - Monday, 08 January 2018, 13:31 GMT
On my Intel+Nvidia setup, everything works after installing -5.
Comment by Sven-Hendrik Haase (Svenstaro) - Monday, 08 January 2018, 13:39 GMT
Can you also check the state of https://bugs.archlinux.org/task/53704 with your setup? Also please check and see whether you can reproduce that with -5: https://bugs.archlinux.org/task/53090
Comment by ads (flv) - Monday, 08 January 2018, 13:56 GMT
I can't really test 53704, since I have a dual-graphics setup, so it will always load the xorg libgl. The same applies to the second one. Both should be tested by nvidia-only users, but I'd say it should work for everybody this time.
Comment by ads (flv) - Monday, 08 January 2018, 14:03 GMT
Anyway, bumblebee (which starts an invisible X server when running apps with optirun) seems to use the correct nvidia-provided libglx as per Xorg.8.log, even for the 2nd instance.
Comment by Arthur Boudart (Akila) - Monday, 08 January 2018, 14:15 GMT
Everything is working for me too with -5, also on a bumblebee setup so can't really test the other issues
Comment by Sven-Hendrik Haase (Svenstaro) - Monday, 08 January 2018, 16:06 GMT
I moved the packages to [extra] but will leave this up for just a few more days to see whether everything is working fine for everyone.