Laptop with AMD 'Cezanne' graphics + Nvidia RTX 3060

Still have no idea how to use the nVidia GPU. PRIME support was added in Xorg 1.21, which is not available on FreeBSD at the moment.
 
I'm not quite sure, but I think Xorg's PRIME support reached a reasonably complete state somewhere around 2014-2016: https://github.com/freedesktop/xorg-xserver/search?o=asc&q=PRIME&s=committer-date&type=commits.

Now, the offloading method that concerns us was added to the Nvidia driver in 2019. There are actually two of them: http://download.nvidia.com/XFree86/Linux-x86_64/515.57/README/randr14.html and http://download.nvidia.com/XFree86/Linux-x86_64/515.57/README/primerenderoffload.html. The latter works, the former does not.
 
Hi bmeneg,

Yes, it now works fine on Cezanne. You have to git clone drm-kmod manually, switch to the 5.10-lts branch, apply the simple patch below to disable HDCP, then build and install the new modules.

Code:
diff --git a/drivers/gpu/drm/amd/amdgpu/psp_v12_0.c b/drivers/gpu/drm/amd/amdgpu/psp_v12_0.c
index b0ee77ee8..58ba7bf60 100644
--- a/drivers/gpu/drm/amd/amdgpu/psp_v12_0.c
+++ b/drivers/gpu/drm/amd/amdgpu/psp_v12_0.c
@@ -49,9 +49,14 @@ static int psp_v12_0_init_microcode(struct psp_context *psp)
 {
        struct amdgpu_device *adev = psp->adev;
        const char *chip_name;
+#ifdef __linux__
        char fw_name[30];
        int err = 0;
        const struct ta_firmware_header_v1_0 *ta_hdr;
+#elif defined(__FreeBSD__)
+       /* We do not support HDCP in drm-kmod yet */
+       int err = 0;
+#endif
        DRM_DEBUG("\n");
 
        switch (adev->asic_type) {
@@ -69,6 +74,7 @@ static int psp_v12_0_init_microcode(struct psp_context *psp)
        if (err)
                return err;
 
+#ifdef __linux__
        snprintf(fw_name, sizeof(fw_name), "amdgpu/%s_ta.bin", chip_name);
        err = request_firmware(&adev->psp.ta_fw, fw_name, adev->dev);
        if (err) {
@@ -113,6 +119,7 @@ out:
                        "psp v12.0: Failed to load firmware \"%s\"\n",
                        fw_name);
        }
+#endif
 
        return err;
 }
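
For reference, the whole procedure looks roughly like this (just a sketch: it assumes matching kernel sources are present in /usr/src, and disable-hdcp.patch is only a placeholder name for the patch above):

Code:
# fetch drm-kmod and switch to the 5.10 LTS branch
git clone https://github.com/freebsd/drm-kmod.git
cd drm-kmod
git checkout 5.10-lts

# apply the HDCP patch from above (saved here as disable-hdcp.patch)
git apply disable-hdcp.patch

# build and install the modules, then reload amdgpu (or simply reboot)
make
sudo make install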

This patch was already committed on May 31st?
Code:
commit f7cc9dd58ca097a428ffcacad1686d970ca633a8 (tag: drm_v5.10.113_2)
Author: ivan-volnov <51086293+ivan-volnov@users.noreply.github.com>
Date:   Tue May 31 14:01:33 2022 +0800

    drm/amdgpu: Disable HDCP on green_sardine and renoir

    We do not support HDCP in drm-kmod yet


... I can load that module without crash and Xorg runs, but shows a colorful pattern and no content, and a working mousepointer ...
... I have a Renoir chip ...



... this patch is even included in the drm-510-kmod port (5.10.113_2)
 
I'm not quite sure, but I think Xorg's PRIME support reached a reasonably complete state somewhere around 2014-2016: https://github.com/freedesktop/xorg-xserver/search?o=asc&q=PRIME&s=committer-date&type=commits.

Now, the offloading method that concerns us was added to the Nvidia driver in 2019. There are actually two of them: http://download.nvidia.com/XFree86/Linux-x86_64/515.57/README/randr14.html and http://download.nvidia.com/XFree86/Linux-x86_64/515.57/README/primerenderoffload.html. The latter works, the former does not.
Just gave it one more try: neither of these two methods works for me, I cannot get two providers listed. If I enable both GPUs in xorg.conf, it just crashes when accessing the NVIDIA GPU.
 
Just tried. It does not work because the secondary Xorg crashes as soon as it accesses the NVIDIA driver, see the log below. I strongly believe all our problems running FreeBSD on such hybrid systems come from an outdated Xorg (1.20.14); it's simply too old and misses some important features. A couple of months ago I spent a lot of time investigating it and came to the conclusion that we need at least Xorg 21.1, which incorporates all the necessary patches and bugfixes. I even tried to port it to FreeBSD, but failed due to a lack of understanding of how it's put together and how its build system works.

Code:
[    26.228]
X.Org X Server 1.20.14
X Protocol Version 11, Revision 0
[    26.228] Build Operating System: FreeBSD 13.1-RELEASE amd64
[    26.228] Current Operating System: FreeBSD butterfly 13.1-RELEASE FreeBSD 13.1-RELEASE releng/13.1-n250148-fc952ac2212 GENERIC amd64
[    26.228] Build Date: 19 June 2022  12:55:14AM
[    26.228] 
[    26.228] Current version of pixman: 0.40.0
[    26.228]     Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
[    26.228] Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[    26.228] (==) Log file: "/var/log/Xorg.8.log", Time: Sun Jul 31 04:59:25 2022
[    26.228] (++) Using config file: "/var/cache/nvidia-headless/xorg.conf"
[    26.228] (EE) Unable to locate/open config directory: "xorg-nvidia-headless.conf.d"
[    26.228] (==) Using system config directory "/usr/local/share/X11/xorg.conf.d"
[    26.228] (==) ServerLayout "nvidia"
[    26.228] (**) |-->Screen "Screen0" (0)
[    26.228] (**) |   |-->Monitor "Monitor0"
[    26.229] (**) |   |-->Device "Device0"
[    26.229] (**) |-->Input Device "fake"
[    26.229] (**) Option "AutoAddDevices" "false"
[    26.229] (**) Not automatically adding devices
[    26.229] (==) Automatically enabling devices
[    26.229] (==) Not automatically adding GPU devices
[    26.229] (==) Max clients allowed: 256, resource mask: 0x1fffff
[    26.229] (==) FontPath set to:
    /usr/local/share/fonts/misc/,
    /usr/local/share/fonts/TTF/,
    /usr/local/share/fonts/OTF/,
    /usr/local/share/fonts/Type1/,
    /usr/local/share/fonts/100dpi/,
    /usr/local/share/fonts/75dpi/,
    catalogue:/usr/local/etc/X11/fontpath.d
[    26.229] (**) ModulePath set to "/usr/local/lib/xorg/modules-NVIDIA,/usr/local/lib/xorg/modules"
[    26.229] (II) Loader magic: 0x433270
[    26.229] (II) Module ABI versions:
[    26.229]     X.Org ANSI C Emulation: 0.4
[    26.229]     X.Org Video Driver: 24.1
[    26.229]     X.Org XInput driver : 24.1
[    26.229]     X.Org Server Extension : 10.0
[    26.229] (--) PCI: (1@0:0:0) 10de:25a2:17aa:3a5d rev 161, Mem @ 0xd0000000/16777216, 0xfb00000000/4294967296, 0xfc00000000/33554432, I/O @ 0x00003000/128
[    26.229] (--) PCI:*(5@0:0:0) 1002:1638:17aa:3a5d rev 198, Mem @ 0xfc10000000/268435456, 0xfc20000000/2097152, 0xd1400000/524288, I/O @ 0x00001000/256, BIOS @ 0x????????/65536
[    26.229] (WW) "efifb" will not be loaded unless you've specified it to be loaded elsewhere.
[    26.229] (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
[    26.229] (II) LoadModule: "dri3"
[    26.229] (II) Module "dri3" already built-in
[    26.229] (II) LoadModule: "glx"
[    26.229] (II) Loading /usr/local/lib/xorg/modules/extensions/libglx.so
[    26.229] (II) Module glx: vendor="X.Org Foundation"
[    26.229]     compiled for 1.20.14, module version = 1.0.0
[    26.229]     ABI class: X.Org Server Extension, version 10.0
[    26.229] (II) LoadModule: "nvidia"
[    26.229] (II) Loading /usr/local/lib/xorg/modules/drivers/nvidia_drv.so
[    26.229] (II) Module nvidia: vendor="NVIDIA Corporation"
[    26.229]     compiled for 1.6.99.901, module version = 1.0.0
[    26.229]     Module class: X.Org Video Driver
[    26.229] (II) NVIDIA dlloader X Driver  515.57  Wed Jun 22 22:24:02 UTC 2022
[    26.229] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[    26.229] (--) Using syscons driver with X support (version 2.0)
[    26.229] (++) using VT number 9

[    26.230] (II) Loading sub module "fb"
[    26.230] (II) LoadModule: "fb"
[    26.230] (II) Loading /usr/local/lib/xorg/modules/libfb.so
[    26.230] (II) Module fb: vendor="X.Org Foundation"
[    26.230]     compiled for 1.20.14, module version = 1.0.0
[    26.230]     ABI class: X.Org ANSI C Emulation, version 0.4
[    26.230] (II) Loading sub module "wfb"
[    26.230] (II) LoadModule: "wfb"
[    26.230] (II) Loading /usr/local/lib/xorg/modules/libwfb.so
[    26.230] (II) Module wfb: vendor="X.Org Foundation"
[    26.230]     compiled for 1.20.14, module version = 1.0.0
[    26.230]     ABI class: X.Org ANSI C Emulation, version 0.4
[    26.230] (II) Loading sub module "ramdac"
[    26.230] (II) LoadModule: "ramdac"
[    26.230] (II) Module "ramdac" already built-in
[    26.230] (WW) VGA arbiter: cannot open kernel arbiter, no multi-card support
[    26.230] (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
[    26.230] (==) NVIDIA(0): RGB weight 888
[    26.230] (==) NVIDIA(0): Default visual is TrueColor
[    26.230] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[    26.230] (**) NVIDIA(0): Option "AllowEmptyInitialConfiguration" "True"
[    26.231] (**) NVIDIA(0): Option "UseEDID" "False"
[    26.231] (**) NVIDIA(0): Option "ConnectedMonitor" "DFP"
[    26.231] (**) NVIDIA(0): Enabling 2D acceleration
[    26.231] (**) NVIDIA(0): ConnectedMonitor string: "DFP"
[    26.231] (**) NVIDIA(0): Ignoring EDIDs
[    26.231] (II) Loading sub module "glxserver_nvidia"
[    26.231] (II) LoadModule: "glxserver_nvidia"
[    26.231] (II) Loading /usr/local/lib/xorg/modules/extensions/libglxserver_nvidia.so
[    26.294] (II) Module glxserver_nvidia: vendor="NVIDIA Corporation"
[    26.294]     compiled for 1.6.99.901, module version = 1.0.0
[    26.294]     Module class: X.Org Server Extension
[    26.294] (II) NVIDIA GLX Module  515.57  Wed Jun 22 22:21:01 UTC 2022
[    26.295] (II) NVIDIA: The X server supports PRIME Render Offload.
[    27.035] (EE)
[    27.035] (EE) Backtrace:
[    27.036] (EE) 0: /usr/local/bin/Xorg (OsInit+0x38a) [0x41c96a]
[    27.038] (EE) unw_get_proc_name failed: no unwind info found [-10]
[    27.038] (EE) 1: /lib/libthr.so.3 (?+0x0) [0x80093158e]
[    27.038] (EE) unw_get_proc_name failed: no unwind info found [-10]
[    27.038] (EE) 2: /lib/libthr.so.3 (?+0x0) [0x800930b3f]
[    27.039] (EE) 3: ? (?+0x0) [0x7ffffffff8a3]
[    27.039] (EE) 4: ? (?+0x0) [0x0]
[    27.039] (EE) 5: /usr/local/lib/xorg/modules/drivers/nvidia_drv.so (nvidiaUnlock+0x4201f) [0x801ca8c1f]
[    27.040] (EE) 6: ? (?+0x0) [0x0]
[    27.040] (EE) unw_step failed: unspecified (general) error [-1]
[    27.040] (EE)
[    27.040] (EE) Segmentation fault at address 0x0
[    27.040] (EE)
Fatal server error:
[    27.040] (EE) Caught signal 11 (Segmentation fault). Server aborting
[    27.040] (EE)
[    27.040] (EE)
Please consult the The X.Org Foundation support
     at http://wiki.x.org
 for help.
[    27.040] (EE) Please also check the log file at "/var/log/Xorg.8.log" for additional information.
[    27.040] (EE)
[    27.040] (EE) Server terminated with error (1). Closing log file.
 
Here is my Xorg.0.log. It's practically identical to that of a Manjaro live stick (but there, playing videos also takes 70% CPU; does that mean no accel?) ...
 

My Goddess, it worked! The Option "UseDisplayDevice" "None" in the Device section for the nvidia driver did the trick: it's not crashing any more and I'm able to use a hybrid setup (two Xorg servers, one per GPU).
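
The relevant piece boils down to this Device section (the full configs are further down in the thread; the BusID is specific to my laptop):

Code:
Section "Device"
        Identifier "dGPU"
        Driver "nvidia"
        BusID "PCI:1:0:0"
        Option  "UseDisplayDevice" "None"
EndSection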

Here's a brief benchmark of NVIDIA GPU on my laptop:

Code:
rz@butterfly:~ % nvrun-vgl  glxgears -info -fullscreen
GL_RENDERER   = NVIDIA GeForce RTX 3050 Laptop GPU/PCIe/SSE2
GL_VERSION    = 4.6.0 NVIDIA 515.57
GL_VENDOR     = NVIDIA Corporation
VisualID 33, 0x21
1766 frames in 5.0 seconds = 353.034 FPS
1777 frames in 5.0 seconds = 355.243 FPS
1767 frames in 5.0 seconds = 353.321 FPS
^C

Same test with AMD GPU:

Code:
rz@butterfly:~ % glxgears -info -fullscreen
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
GL_RENDERER   = AMD RENOIR (DRM 3.40.0, 13.1-RELEASE, LLVM 13.0.1)
GL_VERSION    = 4.6 (Compatibility Profile) Mesa 21.3.8
GL_VENDOR     = AMD
VisualID 1276, 0x4fc
401 frames in 5.0 seconds = 79.980 FPS
301 frames in 5.0 seconds = 60.046 FPS
301 frames in 5.0 seconds = 60.055 FPS
301 frames in 5.0 seconds = 60.047 FPS
^C

This is so cool! Thank you shkhln for the hint.
 
Here is my Xorg.0.log. It's practically identical to that of a Manjaro live stick (but there, playing videos also takes 70% CPU; does that mean no accel?) ...
I looked through your log and to me it seems OK. If you run the startx command, will it switch to graphics and produce xterm windows?

As for video decoding acceleration, what browser do you use? In Firefox there are some hardware acceleration settings, AFAIK, which may not be enabled in your case.
 
I would generally advise you to follow Nvidia's documentation rather than use that nvidia-hybrid-graphics port (which is basically a wrapper for VirtualGL).
Sure, I gave it a try even before the hybrid setup. Unfortunately PRIME (and Reverse PRIME as described in the docs) did not work - I was not able to obtain two providers despite the fact that both GPUs were initialized by Xorg. I shuffled options in xorg.conf back and forth, I tried without xorg.conf, all with no luck. Seems the problem is some missing feature in Xorg 1.20.14.

Personally I find the hybrid setup more convenient; it gives me obvious control over which GPU an app is using.

BTW, the nVidia driver still does not support suspend/resume. Attempting to suspend while the GPU is in use either crashes the OS or just hangs forever; there's a note about this in their README. Only once did it come back (resume), but the GPU was stuck in an unusable state. So this discrete GPU has to be used with care: always stop the app before suspending. The AMD GPU works flawlessly.

Anyway, I'm very much delighted with the result.

Just in case someone is interested in reproducing my setup:

First, install the wrapper scripts for the hybrid setup from ports:


Code:
rz@butterfly:~ % uname -a
FreeBSD butterfly 13.1-RELEASE FreeBSD 13.1-RELEASE releng/13.1-n250148-fc952ac2212 GENERIC amd64

rz@butterfly:~ % cd /usr/ports/x11/nvidia-hybrid-graphics

rz@butterfly:/usr/ports/x11/nvidia-hybrid-graphics % sudo make && sudo make install
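
(If a prebuilt package is available for your release, installing it with pkg may work as well; I built from ports, so this is untested:)

Code:
pkg install nvidia-hybrid-graphics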


Configure xorg.conf for the primary GPU (AMD):
Code:
rz@butterfly:~ % cat /etc/X11/xorg.conf

Section "Files"
        FontPath "/usr/local/share/fonts/jmk-x11-fonts"
        FontPath "/usr/local/share/fonts/cyrillic"
        FontPath "/usr/local/share/fonts/Liberation"
        FontPath "/usr/local/share/fonts/dejavu"
        FontPath "/usr/local/share/fonts/Caladea"
        FontPath "/usr/local/share/fonts/Carlito"
        FontPath "/usr/local/share/fonts/gnu-unifont-ttf"
EndSection

Section "ServerLayout"
      Identifier "layout"
      Screen 0 "iGPU"
      #Screen 1 "dGPU"
      #inactive "dGPU"
      Option "AllowNVIDIAGPUScreens"
      InputDevice    "Keyboard0" "CoreKeyboard"
      InputDevice    "Mouse0" "CorePointer"
EndSection

Section "ServerFlags"
        Option  "AutoAddGPU"      "1"
EndSection

Section "Device"
        Identifier "dGPU"
        Driver "nvidia"
        BusID "PCI:1:0:0"
        Option  "UseDisplayDevice" "None"
EndSection

Section "Screen"
      Identifier "dGPU"
      Device "dGPU"
EndSection


Section "Device"
      Identifier "iGPU"
      Driver "amdgpu"
      #Driver "modesetting"
      BusID "PCI:5:0:0"
EndSection

Section "Screen"
      Identifier "iGPU"
      Device "iGPU"
EndSection

Section "InputDevice"
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/sysmouse"
    Option         "Emulate3Buttons" "yes"

Section "InputDevice"
    Identifier     "Touchpa0"
    Driver         "evedv"
EndSection

Section "InputClass"
        Identifier "libinput touchpad catchall"
        MatchIsTouchpad "on"
        MatchDevicePath "/dev/input/event*"
        Driver "libinput"
        Option "Tapping" "on"
        Option "MiddleEmulation" "on"
        Option "AccelProfile" "adaptive"
EndSection

Section "InputClass"
        Identifier "keyboard defaults"
        MatchIsKeyboard "on"
        Option "XkbLayout" "us,ru"
        Option "XKbOptions" "terminate:ctrl_alt_bksp,numpad:microsoft,grp:caps_t
oggle"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "keyboard"
EndSection

:q
rz@butterfly:~ % cat /etc/X11/xorg.conf
Section "Files"
    FontPath "/usr/local/share/fonts/jmk-x11-fonts"
    FontPath "/usr/local/share/fonts/cyrillic"
    FontPath "/usr/local/share/fonts/Liberation"
    FontPath "/usr/local/share/fonts/dejavu"
    FontPath "/usr/local/share/fonts/Caladea"
    FontPath "/usr/local/share/fonts/Carlito"
    FontPath "/usr/local/share/fonts/gnu-unifont-ttf"
EndSection

Section "ServerLayout"
      Identifier "layout"
      Screen 0 "iGPU"
      #Screen 1 "dGPU"
      #inactive "dGPU"
      Option "AllowNVIDIAGPUScreens"
      InputDevice    "Keyboard0" "CoreKeyboard"
      InputDevice    "Mouse0" "CorePointer"
EndSection

Section "ServerFlags"
    Option    "AutoAddGPU"      "1"
EndSection

Section "Device"
    Identifier "dGPU"
    Driver "nvidia"
    BusID "PCI:1:0:0"
    Option    "UseDisplayDevice" "None"
EndSection

Section "Screen"
      Identifier "dGPU"
      Device "dGPU"
EndSection


Section "Device"
      Identifier "iGPU"
      Driver "amdgpu"
      #Driver "modesetting"
      BusID "PCI:5:0:0"
EndSection

Section "Screen"
      Identifier "iGPU"
      Device "iGPU"
EndSection

Section "InputDevice"
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/sysmouse"
    Option         "Emulate3Buttons" "yes"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    Identifier     "Touchpad0"
    Driver         "evdev"
EndSection

Section "InputClass"
        Identifier "libinput touchpad catchall"
        MatchIsTouchpad "on"
        MatchDevicePath "/dev/input/event*"
        Driver "libinput"
        Option "Tapping" "on"
        Option "MiddleEmulation" "on"
        Option "AccelProfile" "adaptive"
EndSection

Section "InputClass"
    Identifier "keyboard defaults"
    MatchIsKeyboard "on"
    Option "XkbLayout" "us,ru"
    Option "XKbOptions" "terminate:ctrl_alt_bksp,numpad:microsoft,grp:caps_toggle"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "keyboard"
EndSection

Configure xorg.conf for the secondary GPU (nVidia):
Code:
rz@butterfly:~ % cat /usr/local/etc/X11/xorg-nvidia-headless.conf

Section "ServerLayout"
    Identifier     "nvidia"
    Screen      0  "Screen0"
    InputDevice    "fake" "CorePointer" "CoreKeyboard"
    Option         "AutoAddDevices" "false"
EndSection

Section "Files"
    ModulePath      "/usr/local/lib/xorg/modules-NVIDIA"
    ModulePath      "/usr/local/lib/xorg/modules"
EndSection

Section "Module"
    Load           "dri3"
    Load           "glx"
    Disable        "efifb"
EndSection

Section "InputDevice"
    Identifier     "fake"
    Driver         ""
EndSection

Section "Monitor"
    Identifier     "Monitor0"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    BusID          "PCI:1:0:0"
    Option         "UseDisplayDevice" "None"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
EndSection

Run the secondary service:

Code:
service nvidia_xorg restart
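
To start it automatically at boot, enabling the rc script should be enough; I'm assuming the port installs a standard rc.d script named nvidia_xorg, matching the service command above:

Code:
sysrc nvidia_xorg_enable="YES"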


Log in to your X environment, then run a test:

Code:
rz@butterfly:~ % nvrun-vgl glxgears -info -fullscreen
GL_RENDERER = NVIDIA GeForce RTX 3050 Laptop GPU/PCIe/SSE2
GL_VERSION = 4.6.0 NVIDIA 515.57
GL_VENDOR = NVIDIA Corporation
 
Unfortunately PRIME (and Reverse PRIME as described in the docs) did not work - I was not able to obtain two providers despite the fact that both GPUs were initialized by Xorg.
You are not supposed to have two providers. In fact, you don't even need to look at that information; directly testing whether __NV_PRIME_RENDER_OFFLOAD works should be enough.
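
For example, something along the lines of the check in NVIDIA's PRIME render offload README (glxinfo comes from the Mesa demos / glx-utils port; the exact port name may vary):

Code:
# if offload works, the vendor strings should mention NVIDIA
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor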

I shuffled options in xorg.conf back and forth, I tried without xorg.conf, all with no luck.
The explicit config is definitely required; an xorg.conf-less Nvidia setup works only on Linux at the moment.
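
A minimal sketch of what that explicit config needs to contain, going by the NVIDIA README and the config posted earlier in this thread (the BusIDs are from that laptop and will differ elsewhere):

Code:
Section "ServerLayout"
        Identifier "layout"
        Screen 0 "iGPU"
        Option "AllowNVIDIAGPUScreens"
EndSection

Section "Device"
        Identifier "iGPU"
        Driver "modesetting"
        BusID "PCI:5:0:0"
EndSection

Section "Screen"
        Identifier "iGPU"
        Device "iGPU"
EndSection

Section "Device"
        Identifier "dGPU"
        Driver "nvidia"
        BusID "PCI:1:0:0"
EndSection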

Seems the problem is some missing feature in Xorg 1.20.14.
No.

Personally I find the hybrid setup more convenient; it gives me obvious control over which GPU an app is using.
VirtualGL obviously doesn't work with Vulkan apps and, in the nvidia-hybrid-graphics implementation, it also doesn't work with 32-bit apps or Linux apps. Considering there aren't really any common use cases for a dedicated GPU other than gaming, being unable to run Wine or Steam games is a pretty significant limitation.
 
Just tried setting the __NV_PRIME_RENDER_OFFLOAD env variable; it does not change anything. The problem is that I do not have the NVIDIA-G0 provider together with the modesetting provider. The nVidia docs explicitly say that there should be two providers listed to get things working. I do not know how to achieve this. I can have two separate screens, one per GPU, but that's not what is needed.
 
Just tried setting the __NV_PRIME_RENDER_OFFLOAD env variable; it does not change anything.
That's a bit too vague.

The problem is that I do not have the NVIDIA-G0 provider together with the modesetting provider.
What exactly does xrandr --listproviders print?

I can have two separate screens, one per GPU, but that's not what is needed.
I'm pretty sure you can't, unless you mean an external display.
 
I'm pretty sure you can't, unless you mean an external display.
Well, it allows me to. :) The NVIDIA driver gets loaded and I have a black text-mode screen, of course, which is expected, but technically I can have two screens.

What exactly does xrandr --listproviders print?

Following the nVidia docs, I have to set the iGPU to "modesetting" and the dGPU to "nvidia"; then this is what I get from xrandr:


Code:
rz@butterfly:~ % xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x45 cap: 0xa, Sink Output, Sink Offload crtcs: 4 outputs: 2 associated providers: 0 name:modesetting

According to the Xorg.0.log, it does NOT instantiate the NVIDIA driver.

If I set "amdgpu" driver for iGPU, then I get this:

Code:
rz@butterfly:~ % xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x55 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 2 associated providers: 0 name:Unknown AMD Radeon GPU @ pci:0000:05:00.0
 
Oh, and don't test this while nvidia-hybrid-graphics is installed. You need the normal nvidia-driver port.
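
In practice that means something like this (a sketch; package names follow the ports tree, adjust to how you actually installed things):

Code:
pkg delete nvidia-hybrid-graphics
pkg install nvidia-driver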
 
I looked through your log and to me it seems OK. If you run the startx command, will it switch to graphics and produce xterm windows?

As for video decoding acceleration, what browser do you use? In Firefox there are some hardware acceleration settings, AFAIK, which may not be enabled in your case.
I see some colorful little squares in the lower region and a mouse pointer which works OK, but no windows. My .xinitrc opens an xterm, of which I see nothing, but I can close it blindly with Ctrl-D to end the session.
So the browser doesn't matter.
Video is mpv or vlc (accel off). That uses about 180% CPU vs. 60% on Linux.
 
Oh, and don't test this while nvidia-hybrid-graphics is installed. You need the normal nvidia-driver port.
I use the hybrid setup with the latest nVidia driver, 515.57; it works pretty well.

As I said, I can have two screens configured in xorg.conf, one screen per GPU. In this case both GPU drivers load fine, but this setup does not allow render offloading. No matter what I try, I cannot fit two GPUs into a single-screen setup. I can have either the "nvidia" driver or the "amdgpu" driver loaded by Xorg at any moment, but not both. If I set the "modesetting" driver for the iGPU, then the "nvidia" driver for the dGPU never gets probed. I have tried a great many combinations of options in xorg.conf and I conclude that something in Xorg itself does not allow this.

Can you please share your Xorg.0.log from a working setup with two GPUs?
 