Nvidia 1050 Ti + Intel HD 630

Aloha!
I'm trying to get my first install of FreeBSD 11.2 working with X, on a laptop that has two video cards: an Intel HD 630 and an Nvidia 1050 Ti.
To be specific, it's an HP Omen. To be more specific, it's an HP Omen 15-ax250wm.

After much gnashing of teeth©, I suspect I have managed to install and load the nvidia and intel drivers.
Sadly, I still get "no screens". The issue appears to be:

Code:
(WW) VGA arbiter: cannot open kernel arbiter, no multi-card support

The question for you, then, is: am I completely screwed, or is there some clever (or stupid) workaround?

My original intention was to get X running on the nVidia card so I can test WebGL for a game I am building.
But getting it to work on the intel card would be acceptable too.

It seems like I should be able to get X to work on one or the other video card, but after some effort, I haven't been able to get X to work on either. If I could disable one of the cards in the BIOS, I think it would be easy. But easy, it appears, is not my path. HP removed the ability to turn off cards in the BIOS some years back. Their solution is: use Windows or die. (As a special note, I can confirm that even using Windows for gaming, this 'feature' has led to many hours of gnashing of teeth©.)

So, what say you? What would it take to make this work?
 
I'm getting that warning as well when using nvidia-driver with Quadro M2200, and it doesn't affect anything in any way.

Having said that, did you check the HOWTO: https://forums.freebsd.org/threads/howto-setup-xorg-with-nvidias-driver.52311/
Thanks for the prompt response! Yes, I have indeed checked that post out! Also, dozens of others. I discovered from one post that I needed to recompile the nvidia driver, and that cleared up some issues. The Nvidia driver is loading and recognizing the card, and the only obvious error report before "no screens" is "no multi-card support".
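For anyone following along, the rebuild in question was along these lines; the port path is an assumption (legacy cards may need one of the versioned nvidia-driver-XXX ports instead):

```shell
# Rebuild the NVIDIA driver port against the installed kernel,
# then load the module and confirm it registered
cd /usr/ports/x11/nvidia-driver && make reinstall clean
kldload nvidia
kldstat | grep nvidia
```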
 
OK. There's a possibility that one of the graphics adapters can be turned off using a direct ACPI method call, e.g. with the help of sysutils/acpi_call, though I'm not sure if it would work on your laptop, or which exact method it should be; guess you have a bit more searching ahead of you :)
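A sketch of what that probing might look like; the method path below is purely a guess, since the real one has to be dug out of this laptop's own DSDT (e.g. with acpidump -dt):

```shell
# Load the acpi_call module, then try invoking a power-off method on
# the discrete GPU. \_SB.PCI0.PEG0.PEGP._OFF is a common location on
# Optimus laptops, but it varies by vendor -- check your DSDT first.
kldload acpi_call
acpi_call -p '\_SB.PCI0.PEG0.PEGP._OFF'
```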

And still, just in case, attach the Xorg.0.log.
 
[Attached: freebsdnvidiaintel.jpg]

Yeah, it's a screenshot of /var/log/Xorg.0.log after running startx. Lazy is as lazy does :D
Any insights?
 
Again, thanks for the prompt response. I will look into your suggestion. But it's late here, so I will check back tomorrow.
 
Sadly, that thread actually says it doesn't work. Other threads also confirm that this method doesn't work.
One person blames the computer, and suggests throwing it out.
I'm not sure this is the right attitude, since it's only a year old and works fine with Windows and Ubuntu.
So, no FreeBSD for this machine.
 
Success! At least, I have the intel card working now.
It turns out I should use the "modesetting" driver, not the "intel" driver.
This is described here.
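For reference, the change amounts to a device section selecting "modesetting" instead of "intel"; the file name and Identifier below are arbitrary:

```
# /usr/local/etc/X11/xorg.conf.d/driver-intel.conf (name is arbitrary)
Section "Device"
    Identifier "Card0"
    Driver     "modesetting"
EndSection
```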
 
Madness! Success, but why it works is something of a mystery.

So, as tankist02 says, the modesetting driver doesn't do GPU acceleration.
But, happy to have anything working, I went and tried to install gnome. It kept crashing at boot, so I installed KDE instead. That worked.
Then I installed Firefox. That worked. Then Chromium, and that worked.

So then I fiddled around trying to get the GPU to work, and finally just put the minimum device section for "intel" in the .conf file.
And... that worked for some reason now! It didn't before I got the modesetting driver working. However, OpenGL was still not working.
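The "minimum device section" mentioned above is, in essence, just the previous snippet with the driver name swapped (Identifier is arbitrary):

```
Section "Device"
    Identifier "Card0"
    Driver     "intel"
EndSection
```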
So I searched around and found this advice, which has you remove the nvidia driver and reinstall some libraries, because the nvidia driver hoses the GL configuration for the other card.
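A rough sketch of that cleanup, with package names assumed for the 11.2-era ports tree (the linked advice is the authoritative version):

```shell
# Remove the NVIDIA driver, whose libGL shadowed Mesa's,
# then force-reinstall the Mesa GL library it clobbered
pkg delete nvidia-driver
pkg install -f libGL
```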
Bam! After doing that, the Intel card is fully functional under X, and I now have my WebGL game running under FreeBSD with X and Chrome.
I need shadows disabled to get 60 fps on the Intel, but that's good enough for testing purposes. I'm not going to try to get the nVidia card working; too much headache.

Thanks for the help guys.
 
OMG, it broke again. I recall I did two things: I booted into Windows, and I configured my iwm0 wifi to work. Once I got wifi working, X stopped working.
I retraced my previous steps, and recompiling drm-next-kmod fixed it (cd /usr/ports/graphics/drm-next-kmod/ && make install clean).
 
Ok, I didn't actually give up on getting the nVidia card to work. I gave it another whirl. This time following the idea in this thread on faking optimus support.
Sadly, this clever method did not work for me. The issue again seems to be DRM support. As far as I can tell, only one driver can use it at a time, both drivers need it, and I can't turn one off. So working around this might not be so easy.
 
Any insights?
The NVIDIA driver is unable to assign a connected display device because of Optimus.

With Optimus (i.e. both integrated and discrete graphics adapter) enabled, the integrated graphics adapter (Intel HD Graphics 630 in your case) is controlling the display and the NVIDIA graphics adapter is just an extension - that is how the whole thing works. On *nix, OpenGL applications which need extra juice are actually rendered on a separate X server which is running on the NVIDIA card and the output is transferred back to the X server running on the Intel card. This is how Bumblebee works on Linux; x11/virtualgl works in the same way.

As you do not have the option to disable the integrated graphics adapter, VirtualGL is the only way to make use of the NVIDIA card. According to the Wiki, it should just work. ;) I am currently trying to get my NVIDIA setup working but I also want to try out VirtualGL as I do not need that much 3D performance all the time. So I will definitely look into this topic.

Thomas
 
I was able to get VirtualGL running on the nVidia card using the 'fake' method in the link above, by setting the DISPLAY environment variable and additionally recompiling. So I ran glxinfo on the nVidia card via vglrun, but after displaying lots of information about the nVidia card and its GL support, it segfaults.
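For the record, the invocation was roughly the following; the display numbers are assumptions (:0 for the visible Intel server, :8 as VirtualGL's conventional second server), so adjust them to whatever is actually running:

```shell
# DISPLAY points at the 2D X server that shows the output;
# vglrun -d selects the 3D X display that does the actual rendering
env DISPLAY=:0 vglrun -d :8 glxinfo
```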

I wasn't able to get anything else to work, and I'm not even entirely sure how it's supposed to go. What I finally want is to run a browser with WebGL, so I think I want to run X on the Intel card and then vglrun chrome.

But I cannot run anything on the Intel card while the nVidia card is in use, because it can't become the DRM master. I tried the modesetting driver, but it also needs to be DRM master. And I can't just run Chrome or Firefox with vglrun, because there isn't a display.

So, I'm stuck again :(
 