Intel for display, NVIDIA for computing

This guide will show you how to use Intel graphics for rendering display and NVIDIA graphics for CUDA computing on Ubuntu 18.04 / 20.04 desktop.

I made this work on an ordinary gaming PC with two graphics devices: an Intel UHD Graphics 630 plus an NVIDIA GeForce GTX 1080 Ti. Both are listed by lspci | grep VGA:

00:02.0 VGA compatible controller: Intel Corporation Device 3e92
01:00.0 VGA compatible controller: NVIDIA Corporation GP102 [GeForce GTX 1080 Ti] (rev a1)

This guide can be summarized into two steps:

  1. Install the NVIDIA driver without OpenGL files.
  2. Configure Xorg to use the Intel graphics.

I haven't tested on different hardware configurations, but it should work similarly. See this section for more discussion.

0. Preparation

Before operating within Linux, you need to configure your hardware. Make sure the monitors are plugged into the motherboard's video outputs instead of the dedicated graphics card. Configure the BIOS to make the Intel graphics the primary display device (usually by selecting IGFX instead of PEG or PCIE). Make sure your computer can boot to the GUI and log in to the desktop successfully under this setting.

1. Install NVIDIA driver without OpenGL files

I suggest installing the driver in either of the following ways. If you would rather follow your own method, just make sure the OpenGL files are not installed.

1.1. Uninstall all previous installations

These steps are common to both methods and avoid possible conflicts.

  1. If you have installed via PPA repositories

    sudo apt purge "nvidia*"
    # some interactive operations
    sudo apt autoremove
    # some interactive operations

    Check remaining packages related to NVIDIA.

    dpkg -l | grep nvidia

    If some packages remain, remove them manually with

    sudo dpkg -P <package_name>

    If you have added third-party repositories, e.g. ones from NVIDIA, remove them too. This can be done by removing the related files under /etc/apt/sources.list.d/ or via the ppa-purge utility.

  2. If you have installed via binary installers

    sudo nvidia-uninstall
    # some interactive operations
  3. Reboot.

1.2.A. Install from PPA Repository

  1. Add the ppa:graphics-drivers/ppa repository.

    sudo add-apt-repository ppa:graphics-drivers/ppa
    # some interactive operations

    On Ubuntu 18.04, sudo apt update is performed automatically after adding a PPA, so it is not required manually.

  2. Install the headless NVIDIA driver

    sudo apt install nvidia-headless-418 nvidia-utils-418

    Version 418 was the latest when this page was written; substituting the latest available version is a good idea.

    IMPORTANT

    The nvidia-headless-418 package contains only the driver, while the full nvidia-driver-418 package contains everything, including display components such as the OpenGL libraries. If you plan to connect your display to an NVIDIA graphics card, install the full package; otherwise, install only the headless driver.

    The nvidia-utils-418 package provides utilities such as nvidia-smi.

  3. Reboot. If the installation is successful, command nvidia-smi will show all NVIDIA GPUs.

1.2.B. Install from Binary Installer

My previous post, though old, is detailed and still works.

To summarize:

  1. Download the binary installer from NVIDIA official site and make it executable.

  2. Disable nouveau driver by creating file /etc/modprobe.d/blacklist-nouveau.conf with content

    blacklist nouveau
    options nouveau modeset=0
    

    or just executing the installer and it will prompt to create it for you.

  3. Execute sudo update-initramfs -u and then reboot.

  4. After booting, switch to tty1. Stop display services such as gdm, gdm3, or lightdm. Kill all processes that are using the graphics card, such as Xorg or X.

  5. Execute the installer with the --no-opengl-files flag (IMPORTANT) to avoid installing any OpenGL-related files:

    sudo ./NVIDIA-Linux-x86_64-418.56.run --no-opengl-files

    Or, if you would like to display from an NVIDIA graphics card, execute the installer without any arguments:

    sudo ./NVIDIA-Linux-x86_64-418.56.run
  6. After a successful installation, command nvidia-smi will show all NVIDIA GPUs.
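Step 4 above can be sketched as follows (a sketch only: service names vary by distribution, and sddm is an extra guess for KDE-based systems):

```shell
# Stop whichever display manager is present; units that don't exist
# simply fail to stop, and those errors are silenced here.
sudo systemctl stop gdm gdm3 lightdm sddm 2>/dev/null

# Verify nothing is still holding the graphics card. The [X] bracket
# keeps grep from matching its own command line; output should be empty.
ps aux | grep -E '[X]org|[X]wayland'
```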

2. Configure Xorg

The installed NVIDIA driver and its configuration files hint Xorg to start on the NVIDIA devices. Depending on whether the NVIDIA display libraries are installed, the X server will either fail to start, or start successfully but still use the NVIDIA devices; both outcomes are unwanted.

We can force Xorg to use the Intel graphics by creating a configuration file with the following contents and saving it as /etc/X11/xorg.conf.

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "intel"
    VendorName     "Intel Corporation"
    BusID          "PCI:0:2:0"
EndSection

The key point is the BusID option. It indicates the PCI bus ID of the Intel graphics device and can be retrieved from lspci. For example, on my computer lspci outputs 00:02.0 VGA compatible controller: Intel Corporation Device 3e92, so the bus ID is 0:2:0.

Note that the bus ID output by lspci is hexadecimal, but the numbers filled in xorg.conf must be decimal. For example, if the output from lspci is 82:00.0 VGA com..., you need to fill PCI:130:0:0 in the configuration.
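This conversion can be scripted; a small shell sketch using the 82:00.0 example above:

```shell
# lspci prints the slot as bus:device.function in hexadecimal;
# xorg.conf wants PCI:bus:device:function in decimal.
slot="82:00.0"                        # example slot taken from lspci
bus=$(printf '%d' "0x${slot%%:*}")    # hex 82 -> decimal 130
rest=${slot#*:}
dev=$(printf '%d' "0x${rest%%.*}")    # hex 00 -> 0
fn=$(printf '%d' "0x${rest#*.}")      # hex 0  -> 0
echo "PCI:${bus}:${dev}:${fn}"        # PCI:130:0:0
```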

On Ubuntu 20.04 you may want to set the driver to modesetting instead of intel. I ran into some problems and solved them as this link (and the links within it) describes.

Setups with multiple monitors are a little more complex. Here is my example with two monitors. Some fields are missing, but it works because Xorg falls back to sensible defaults. Search the Internet for a proper set of configurations for your hardware; man xorg.conf and the Arch Linux wiki are good references.

Section "Device"
    Identifier     "Device0"
    Driver         "modesetting"
    VendorName     "Intel Corporation"
    BusID          "PCI:0:2:0"
    Option         "TearFree" "true"
    Option         "monitor-DP-1" "DP"
    Option         "monitor-HDMI-2" "HDMI"
EndSection

Section "Monitor"
    Identifier "DP"
    Option     "Rotate" "Left"
EndSection

Section "Monitor"
    Identifier "HDMI"
    Option     "RightOf" "DP"
    Option     "Primary" "true"
EndSection

After the configuration, reboot the computer. If it succeeds, you will be able to log in to the desktop. If it fails, you could be stuck at the login screen. Reboot into recovery mode, drop to a root prompt, and check /var/log/Xorg.*.log.* for hints.
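For example, from the root prompt, the error and warning lines can be pulled out of the newest log (a sketch; the exact log names depend on your setup):

```shell
# Xorg marks errors with (EE) and warnings with (WW).
log=$(ls -t /var/log/Xorg.*.log 2>/dev/null | head -n 1)
grep -E '\(EE\)|\(WW\)' "$log"
```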

After logging in successfully, open a terminal and execute glxheads. The displayed rendering device should be the Intel graphics.

glxheads: exercise multiple GLX connections (any key = exit)
Usage:
  glxheads xdisplayname ...
Example:
  glxheads :0 mars:0 venus:1
Name: :0
  Display:     0x56151a1570e0
  Window:      0x2000002
  Context:     0x56151a182f60
  GL_VERSION:  3.0 Mesa 18.0.5
  GL_VENDOR:   Intel Open Source Technology Center
  GL_RENDERER: Mesa DRI Intel(R) HD Graphics (Coffeelake 3x8 GT2)

Check nvidia-smi; there should be no Xorg process listed, and memory usage should be 0 MiB.

Thu Nov 22 07:05:55 2018
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 418.56       Driver Version: 418.56       CUDA Version: 10.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 108...  Off  | 00000000:01:00.0 Off |                  N/A |
|  0%   52C    P0    57W / 250W |      0MiB / 11178MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

Execute a heavy GPU program and move your mouse during the computation. The display should stay smooth, since computing and display rendering run on separate devices.

Congratulations!

If glxheads reports some virtual graphics device and nvidia-smi still shows Xorg processes, you are probably still using the NVIDIA graphics for rendering; it just passes the rendered image to the Intel graphics for display. When running heavy GPU programs, the display will still lag.
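glxinfo (from the mesa-utils package) gives a quick second check of which device is actually rendering; the strings should mention Intel/Mesa, not NVIDIA or a virtual llvmpipe device:

```shell
# Print only the vendor and renderer lines from the full glxinfo dump.
glxinfo | grep -E 'OpenGL (vendor|renderer) string'
```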

3. Discussion

Separating display from computing is important for smooth work.

  1. You use a remote headless server for computing and a local client for display, connecting to the server via SSH. This is the simplest setup. Here, a properly installed headless driver makes everything work; you don't need this guide.

  2. You have a single computer with only one graphics device. That's unfortunate: you have to use the one device for all tasks, and you will suffer severe display latency when running GPU programs.

  3. You use a single computer for both computing and display. There are two (or more) graphics devices, and they are from the same vendor. Then the problem is much easier. Taking NVIDIA as the example, with a properly installed driver, display and computing can be perfectly separated: plug the monitors into device 0 and use the CUDA_VISIBLE_DEVICES=1 environment variable to perform computing on device 1. You probably don't need this guide.

  4. You use a single computer for both computing and display. There are two (or more) graphics devices, but they are from different vendors. For example, an ordinary gaming PC could have an Intel HD Graphics within the CPU and a dedicated GPU from NVIDIA. One needs to plug the monitors into the motherboard to use the Intel device for display and run CUDA programs on the NVIDIA one. Then you are in the right place.

  5. For some purposes, you need an NVIDIA GPU for both computing and rendering, so you have to run both on a single GPU. You need to install the driver with OpenGL support and tune some rendering settings in Xorg.
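For case 3 above, pinning the computation to a specific device can be sketched as follows (./train is a hypothetical stand-in for your own CUDA program):

```shell
# Monitors are plugged into device 0; computation is pinned to device 1.
# ./train is hypothetical -- substitute any CUDA workload.
CUDA_VISIBLE_DEVICES=1 ./train
```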

@luohuidong luohuidong commented Dec 6, 2018

Awesome! Thanks a lot.

@wanghanlin14250 wanghanlin14250 commented Jan 18, 2019

when I run nvidia-smi, I got 'no devices were found', can you tell me why?
is that my gpu not work?

@ludovicschwartz ludovicschwartz commented Feb 4, 2019

Thank you very much for this, you're a lifesaver. This works perfectly on my laptop. My only issue is that when I can't use 2 screens at once, when I connect it to another screen, nothing happens. So far, I've not been able to solve that problem. I believe the answer is to add the correct line in the xorg.cong to tell the computer to still use the intel to display on a second screen. If you know how to proceed, I would be very grateful if you could share that with me.

@wangruohui wangruohui commented Feb 12, 2019

when I run nvidia-smi, I got 'no devices were found', can you tell me why?
is that my gpu not work?

Probably. Try lspci and see whether the GPU appears in the output.

@wangruohui wangruohui commented Feb 12, 2019

Thank you very much for this, you're a lifesaver. This works perfectly on my laptop. My only issue is that when I can't use 2 screens at once, when I connect it to another screen, nothing happens. So far, I've not been able to solve that problem. I believe the answer is to add the correct line in the xorg.cong to tell the computer to still use the intel to display on a second screen. If you know how to proceed, I would be very grateful if you could share that with me.

I am sorry that I haven’t tried it. But you may find some hint here:
https://wiki.archlinux.org/index.php/multihead
The key should be another Screen in the ServerLayout section and another copy of the configuration for the device and monitor, with some IDs changed.

@feynman233 feynman233 commented Apr 1, 2019

When i use the command lspci | grep VGA , I can't find intel graphics and there are only nvidia gpus, can you tell me why?thanks very much!

@wangruohui wangruohui commented May 28, 2019

When i use the command lspci | grep VGA , I can't find intel graphics and there are only nvidia gpus, can you tell me why?thanks very much!

Will the Intel graphics appear if you remove the discrete GPU? If yes, try to find options like "enable multiple displays" in your BIOS settings. Some motherboards disable the integrated graphics when a discrete one is detected.

@timxzz timxzz commented Jun 11, 2019

This is awesome!!!

@sq5rix sq5rix commented Jun 28, 2019

Hi! Thanks a lot for this tutorial. I have my opengl installed, I will need them in the future. They use 2MB on my 1070. I can work now smoothly! :)
Greetings from Poland
Tom

@samuelgarcia samuelgarcia commented Jul 19, 2019

Thank you very much.
Exactly what I was looking for.

@ruixingw ruixingw commented Jul 27, 2019

Thank you for the tutorial.
I have a i7-7700k with HD630 and a GTX1080. I was also wondering, if, in Linux I use GTX1080 solely for computing and HD630 for both rendering and display output, is it possible to passthrough GTX1080 to QEMU/KVM for gaming in windows? (GTX1080 for rendering in VM and HD630 for display output)

@zafar-hussain zafar-hussain commented Aug 6, 2019

Thanks mate

@ln23415 ln23415 commented Aug 18, 2019

hi mate, it's work on my computer, thanks. But I found a drawback in this mode : it seems like cmd 'nvidia-settings ' is not supported. Do you have the same suitation as me ?

@wangruohui wangruohui commented Aug 18, 2019

hi mate, it's work on my computer, thanks. But I found a drawback in this mode : it seems like cmd 'nvidia-settings ' is not supported. Do you have the same suitation as me ?

Yes. I think nvidia-settings is available only when you use an NVIDIA GPU for display, so when you are using Intel for display it no longer works.

@zafar-hussain zafar-hussain commented Aug 18, 2019

hi,
Nvidia-settings dosent work on on my machine either, but nvidia-smi and device_query work well

cheers

@ln23415 ln23415 commented Aug 18, 2019

hi,
Nvidis-settings dosent work on on my machine either, but nvidia-smi and device_query work well

cheers

ah, what do you mean about device_query, the cmd 'nvidia-settings -q all' ?
actually I am trying to overclock my nvidia 1080 for controlling the fan speed, the temperture is too high when I run some DL codes.
you can try 'nvidia-settings -q all' on your devices to test : )

@zafar-hussain zafar-hussain commented Aug 18, 2019

hi,
Nvidis-settings dosent work on on my machine either, but nvidia-smi and device_query work well
cheers

ah, what do you mean about device_query, the cmd 'nvidia-settings -q all' ?
actually I am trying to overclock my nvidia 1080 for controlling the fan speed, the temperture is too high when I run some DL codes.
you can try 'nvidia-settings -q all' on your devices to test : )

I meant device_query in the samples/examples, sorry I dont have any knowledge about overclocking the GPU, Best of luck

@liuhengyue liuhengyue commented Dec 28, 2019

Hi,

Thanks for your posting and it saved some extra space for my hard-working GPU.
I have a integrated Intel UHD 630 graphics and Ubuntu 18.04. After set up everything, there is flicker and horizontal splits when playing video. Then I found a solution for that. Just add following lines within the "Device" section in the xorg config.

    Option "AccelMethod" "sna"
    Option "TearFree" "true"
    Option "DRI" "3"

Hope it can help other people with similar issues. Thanks.

@isofew isofew commented Jan 5, 2020

Hi,

Thanks for your posting and it saved some extra space for my hard-working GPU.
I have a integrated Intel UHD 630 graphics and Ubuntu 18.04. After set up everything, there is flicker and horizontal splits when playing video. Then I found a solution for that. Just add following lines within the "Device" section in the xorg config.

    Option "AccelMethod" "sna"
    Option "TearFree" "true"
    Option "DRI" "3"

Hope it can help other people with similar issues. Thanks.

Thank you! This fixed my problem exactly.

@uberneko uberneko commented Jun 16, 2020

Hello !
Are you sure this is still working on Ubuntu 20.04 ? I just tried on Mint 20 Cinnamon (beta) - once I do the xorg.conf file, I can reboot but when for instance I try launching Blender, it doesn't work (segmentation issue - core dump). And glxheads seems not to work either.
I tried before in plain Ubuntu 20.04 - same issues (plus the glitch).
I tried now with Xubuntu 20.04 - xorg won't start, it complains that something is wrong is the xorg.conf file.

@uberneko uberneko commented Jun 16, 2020

By removing GPUs and putting them back at the very last moment (just after installing the headless drivers actually), I managed to make it boot with Xubuntu, but same result : Blender would not start, segmentation fault, core dump. Plus glxheads does not work as well.

@wangruohui wangruohui commented Jun 16, 2020

Hello !
Are you sure this is still working on Ubuntu 20.04 ? I just tried on Mint 20 Cinnamon (beta) - once I do the xorg.conf file, I can reboot but when for instance I try launching Blender, it doesn't work (segmentation issue - core dump). And glxheads seems not to work either.
I tried before in plain Ubuntu 20.04 - same issues (plus the glitch).
I tried now with Xubuntu 20.04 - xorg won't start, it complains that something is wrong is the xorg.conf file.

Hi, I just tested it on plain Ubuntu 20.04 a few days ago. There was a glitch problem, but it finally got resolved: it seems the driver option should be changed from intel to modesetting. The reference links are also updated within the post. See if that helps you. Thank you :)

@uberneko uberneko commented Jun 16, 2020

Thanks for your pretty fast answer !
Umm indeed I didn't test this one - I shall try later.
When trying to find a solution I also discovered the existence of a kernel parameter : "nogpumanager" - which apparently avoids ... Ubuntu to keep on re-initializing the xorg conf ? (https://forums.developer.nvidia.com/t/ubuntu-18-04-headless-390-intel-igpu-after-prime-select-intel-lost-contact-to-geforce-1050ti/66698) - I'll investigate this one too.

@gautham20 gautham20 commented Jun 19, 2020

Exactly what I needed, Thanks!
Few things I have noticed, In ubuntu 18.04.4, driver modesetting had to be used instead of intel, because with intel I had display glitches.
And before updating xorg.conf, get the current xorg conf by sudo X :2 -configure and change the drives whereever necessary, this will have you covered for multiple displays.

@lxxue lxxue commented Jun 20, 2020

Hi,

Thanks for your posting and it saved some extra space for my hard-working GPU.
I have a integrated Intel UHD 630 graphics and Ubuntu 18.04. After set up everything, there is flicker and horizontal splits when playing video. Then I found a solution for that. Just add following lines within the "Device" section in the xorg config.

    Option "AccelMethod" "sna"
    Option "TearFree" "true"
    Option "DRI" "3"

Hope it can help other people with similar issues. Thanks.

Thanks!

@lxxue lxxue commented Jun 20, 2020

Exactly what I needed, Thanks!
Few things I have noticed, In ubuntu 18.04.4, driver modesetting had to be used instead of intel, because with intel I had display glitches.
And before updating xorg.conf, get the current xorg conf by sudo X :2 -configure and change the drives whereever necessary, this will have you covered for multiple displays.

Hi!
Could you share the updated xorg.conf? The sudo X :2 -configure doesn't work for me and I couldn't find a fix for it using google (Error: Number of created screens does not match number of detected devices).
Right now I have two monitors but only one monitor correctly displays the desktop. I check the Displays provided by Ubuntu 18.04 and found that it cannot detect the second screen. I used to use Nvidia GPU for display and it worked with two monitors. Thanks!

@gautham20 gautham20 commented Jun 22, 2020

Exactly what I needed, Thanks!
Few things I have noticed, In ubuntu 18.04.4, driver modesetting had to be used instead of intel, because with intel I had display glitches.
And before updating xorg.conf, get the current xorg conf by sudo X :2 -configure and change the drives whereever necessary, this will have you covered for multiple displays.

Hi!
Could you share the updated xorg.conf? The sudo X :2 -configure doesn't work for me and I couldn't find a fix for it using google (Error: Number of created screens does not match number of detected devices).
Right now I have two monitors but only one monitor correctly displays the desktop. I check the Displays provided by Ubuntu 18.04 and found that it cannot detect the second screen. I used to use Nvidia GPU for display and it worked with two monitors. Thanks!

I'm not sure if this will fix your issue, but here is my xorg.conf

Section "ServerLayout"
	Identifier     "X.org Configured"
	Screen      0  "Screen0" 0 0
	Screen      1  "Screen1" RightOf "Screen0"
	InputDevice    "Mouse0" "CorePointer"
	InputDevice    "Keyboard0" "CoreKeyboard"
EndSection

Section "Files"
	ModulePath   "/usr/lib/xorg/modules"
	FontPath     "/usr/share/fonts/X11/misc"
	FontPath     "/usr/share/fonts/X11/cyrillic"
	FontPath     "/usr/share/fonts/X11/100dpi/:unscaled"
	FontPath     "/usr/share/fonts/X11/75dpi/:unscaled"
	FontPath     "/usr/share/fonts/X11/Type1"
	FontPath     "/usr/share/fonts/X11/100dpi"
	FontPath     "/usr/share/fonts/X11/75dpi"
	FontPath     "built-ins"
EndSection

Section "Module"
	Load  "glx"
EndSection

Section "InputDevice"
	Identifier  "Keyboard0"
	Driver      "kbd"
EndSection

Section "InputDevice"
	Identifier  "Mouse0"
	Driver      "mouse"
	Option	    "Protocol" "auto"
	Option	    "Device" "/dev/input/mice"
	Option	    "ZAxisMapping" "4 5 6 7"
EndSection

Section "Monitor"
	Identifier   "Monitor0"
	VendorName   "Monitor Vendor"
	ModelName    "Monitor Model"
EndSection

Section "Monitor"
	Identifier   "Monitor1"
	VendorName   "Monitor Vendor"
	ModelName    "Monitor Model"
EndSection

Section "Device"
        ### Available Driver options are:-
        ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
        ### <string>: "String", <freq>: "<f> Hz/kHz/MHz",
        ### <percent>: "<f>%"
        ### [arg]: arg optional
        #Option     "SWcursor"           	# [<bool>]
        #Option     "HWcursor"           	# [<bool>]
        #Option     "NoAccel"            	# [<bool>]
        #Option     "ShadowFB"           	# [<bool>]
        #Option     "VideoKey"           	# <i>
        #Option     "WrappedFB"          	# [<bool>]
        #Option     "GLXVBlank"          	# [<bool>]
        #Option     "ZaphodHeads"        	# <str>
        #Option     "PageFlip"           	# [<bool>]
        #Option     "SwapLimit"          	# <i>
        #Option     "AsyncUTSDFS"        	# [<bool>]
        #Option     "AccelMethod"        	# <str>
        #Option     "DRI"                	# <i>
	Identifier  "Card0"
	Driver      "modesetting"
	BusID       "PCI:0:2:0"
EndSection

Section "Device"
        ### Available Driver options are:-
        ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
        ### <string>: "String", <freq>: "<f> Hz/kHz/MHz",
        ### <percent>: "<f>%"
        ### [arg]: arg optional
        #Option     "SWcursor"           	# [<bool>]
        #Option     "kmsdev"             	# <str>
        #Option     "ShadowFB"           	# [<bool>]
        #Option     "AccelMethod"        	# <str>
        #Option     "PageFlip"           	# [<bool>]
        #Option     "ZaphodHeads"        	# <str>
        #Option     "DoubleShadow"       	# [<bool>]
	Identifier  "Card1"
	Driver      "modesetting"
	BusID       "PCI:0:2:0"
EndSection

Section "Screen"
	Identifier "Screen0"
	Device     "Card0"
	Monitor    "Monitor0"
	SubSection "Display"
		Viewport   0 0
		Depth     1
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     4
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     8
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     15
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     16
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     24
	EndSubSection
EndSection

Section "Screen"
	Identifier "Screen1"
	Device     "Card1"
	Monitor    "Monitor1"
	SubSection "Display"
		Viewport   0 0
		Depth     1
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     4
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     8
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     15
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     16
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     24
	EndSubSection
EndSection

@uberneko uberneko commented Jun 27, 2020

@wangruohui:
After some time, I settled on a different option, though. But I confirm that switching to modesetting instead of intel on Ubuntu 20.04 does the trick, and it still works.
I use Linux Mint and spent some time with their beta Cinnamon Linux Mint 20.
Actually now they introduced the nvidia applet switcher (more or less prime, I'd say), which allows to go to saving mode (intel), performance mode (nvidia) and on-demand.
On saving mode, strangely, even if NVIDIA drivers are installed (the full ones), Blender would not see the cards, so it would propose only CPU rendering. On performance, it works, but it is the situation we don't want, using one NVIDIA graphic card for both display and computing (CUDA here).
The interesting mode is 'on-demand'. Here we are on intel for display, and programs can be launched with running with NVIDIA cards as option.
There, Blender does see the graphic cards, and GPU rendering (even Optix now) works quite well.
I just see some strange Xorg processes in nvidia-smi (one per NVIDIA card actually) - but if I run a game with NVIDIA, I clearly see the new process for the game, if I run that game without graphic card, I don't see it in nvidia-smi. So I guess it works (for some reasons the only program 'polluting' nvidia-smi with a process when being lauch on intel, i.e. normally, without demanding the graphic card, is LibreOffice... can't understand why...).
But currently that new situation satisfies me.

@mhassankhan90 mhassankhan90 commented Aug 5, 2020

Hi there,

I followed the guide to set up my integrated graphics (intel) for display and nvidia for computing. I am using Ubuntu 20.04. I installed the Nvidia driver and checked nvidia-smi. It was working fine. I then added the above mentioned contents to my xorg.conf file. Now when I run nvidia-smi command, it gives me error that "NVIDIA-SMI has failed because it couldn't communicate with the NVIDIA driver. Make sure that the latest NVIDIA driver is installed and running." Please help me resolve this issue.

Regards,

@as181920 as181920 commented Aug 15, 2020

@mhassankhan90
I tried this and it works: edit the file /etc/modprobe.d/blacklist-nvidia.conf with
blacklist nvidia-drm
alias nvidia-drm off

found here:http://litaotju.github.io/2019/03/13/=Use-intel-for-display-nvidia-for-computing/

@josiahlaivins josiahlaivins commented Sep 17, 2020

If your xorg.conf keeps being overwritten, try: https://askubuntu.com/questions/731990/gpu-manager-overwrites-xorg-conf
