The following mining findings were gathered on EVGA GeForce GTX 1070 SC GAMING Black Edition cards.
First run nvidia-xconfig --enable-all-gpus, then edit the xorg.conf file to correctly set the Coolbits option.
# /etc/X11/xorg.conf
Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BoardName "GeForce GTX 1070"
BusID "PCI:1:0:0"
Option "Coolbits" "28"
EndSection
Section "Device"
Identifier "Device1"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BoardName "GeForce GTX 1070"
BusID "PCI:2:0:0"
Option "Coolbits" "28"
EndSection
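Rather than hand-editing, nvidia-xconfig can also write the Coolbits option for every GPU in one step; a sketch, assuming your driver's nvidia-xconfig supports the flag. The value 28 is the bitmask 4 + 8 + 16 (manual fan control, clock offsets, overvoltage):

```shell
# Equivalent to the manual edits above: enable all GPUs and set
# Coolbits (28 = 4 fan control + 8 clock offsets + 16 overvoltage).
nvidia-xconfig --enable-all-gpus --cool-bits=28
```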
Let's now apply a very light OC to the cards:
skylake:~# nvidia-settings -c :0 -q gpus
2 GPUs on skylake:0
[0] skylake:0[gpu:0] (GeForce GTX 1070)
Has the following names:
GPU-0
GPU-08ba492c-xxxx
[1] skylake:0[gpu:1] (GeForce GTX 1070)
Has the following names:
GPU-1
GPU-16e218e7-xxxx
# Apply a +1300 MHz memory clock offset and +100 MHz on the GPU clock.
# Found these were the most stable on my Dual EVGA SC Black 1070s.
nvidia-settings -c :0 -a '[gpu:0]/GPUMemoryTransferRateOffset[3]=1300'
nvidia-settings -c :0 -a '[gpu:1]/GPUMemoryTransferRateOffset[3]=1300'
nvidia-settings -c :0 -a '[gpu:0]/GPUGraphicsClockOffset[3]=100'
nvidia-settings -c :0 -a '[gpu:1]/GPUGraphicsClockOffset[3]=100'
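The four assignments above can be collapsed into a loop. A dry-run sketch that prints each command instead of executing it (remove the echo to actually apply; GPU indices and offsets are the ones used above):

```shell
# Dry-run sketch: print the nvidia-settings call for each GPU.
# Remove 'echo' to actually apply the offsets.
MEM_OFFSET=1300
CORE_OFFSET=100
for i in 0 1; do
    echo nvidia-settings -c :0 -a "[gpu:$i]/GPUMemoryTransferRateOffset[3]=$MEM_OFFSET"
    echo nvidia-settings -c :0 -a "[gpu:$i]/GPUGraphicsClockOffset[3]=$CORE_OFFSET"
done
```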
To check whether these have applied, your X11 server needs to be running; if it is, you'll get a confirmation:
~⟫ nvidia-settings -c :0 -a '[gpu:0]/GPUMemoryTransferRateOffset[3]=1400'
Failed to connect to Mir: Failed to connect to server socket: No such file or directory
Unable to init server: Could not connect: Connection refused
Attribute 'GPUMemoryTransferRateOffset' (skylake:0[gpu:0]) assigned value 1400.
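You can also read the offsets back with -q instead of re-assigning them; a sketch using the same attribute names as above (X must be reachable on :0):

```shell
# Query the currently applied offsets rather than setting them again.
nvidia-settings -c :0 -q '[gpu:0]/GPUMemoryTransferRateOffset[3]'
nvidia-settings -c :0 -q '[gpu:0]/GPUGraphicsClockOffset[3]'
```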
Check the final config:
skylake:~# nvidia-smi
Sat Jun 17 03:31:57 2017
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 375.66 Driver Version: 375.66 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
|===============================+======================+======================|
| 0 GeForce GTX 1070 Off | 0000:01:00.0 On | N/A |
| 60% 75C P2 146W / 151W | 2553MiB / 8112MiB | 99% Default |
+-------------------------------+----------------------+----------------------+
| 1 GeForce GTX 1070 Off | 0000:02:00.0 Off | N/A |
| 38% 66C P2 149W / 151W | 2198MiB / 8114MiB | 99% Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: GPU Memory |
| GPU PID Type Process name Usage |
|=============================================================================|
| 0 1267 G /usr/lib/xorg/Xorg 184MiB |
| 0 3457 G compiz 170MiB |
| 0 4956 C ./ethdcrminer64 2195MiB |
| 1 4956 C ./ethdcrminer64 2195MiB |
+-----------------------------------------------------------------------------+
@blacksausage As counter-intuitive as it sounds, you need to have a display manager running in order to run nvidia-settings via the CLI. Additionally, if you are logged into the server via SSH, you may need to follow f0k's approach here.
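A minimal sketch of that headless route, wrapped in a function so nothing runs until you call it; the display number :0 and the sleep are assumptions, and the details live in f0k's writeup:

```shell
# Sketch only: start a bare X server so nvidia-settings has a display
# to talk to, then apply an offset. Adjust DISPLAY for your setup.
headless_oc() {
    sudo xinit &                 # minimal X server, typically on :0
    sleep 3                      # assumption: give X a moment to start
    export DISPLAY=:0            # point nvidia-settings at that server
    nvidia-settings -a '[gpu:0]/GPUMemoryTransferRateOffset[3]=1300'
}
```

Call headless_oc from the SSH session once the driver is loaded.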