X570 Aorus Master with dual 3060 GPU

A quick Google search shows how tricky it can be to get an X570 Aorus motherboard to correctly detect a second GPU, especially if you're already using one of the PCIe slots for a GPU plus an M.2 PCIe slot for an NVMe SSD, which was the case for my build.

Therefore, neither my BIOS nor my OS detected the 2nd GPU card right away. I spent some time reading the motherboard manual, but nothing there was immediately helpful. The section on PCIe did mention how to split the available lanes when multiple PCIe Gen 4.0 devices are in use (a single device gets all x16 lanes by default), and that's what we rely on here via the "bifurcation" BIOS feature listed below.
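
If you want to sanity-check how the x16 lanes actually end up split once everything is installed (i.e., after the steps further down), here is a minimal Python sketch that asks nvidia-smi for each card's negotiated link width. It assumes the NVIDIA driver, which bundles nvidia-smi, is installed and on your PATH; it isn't required for the setup itself.

```python
import subprocess

# Per-GPU fields from nvidia-smi's --query-gpu interface.
query = "name,pcie.link.gen.current,pcie.link.width.current,pcie.link.width.max"

out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={query}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout

# One CSV line per detected GPU. Note that the reported current link
# generation can drop while a card is idle (power saving), so compare the
# max values if the numbers look low.
for line in out.strip().splitlines():
    gpu, gen, width_now, width_max = (field.strip() for field in line.split(","))
    print(f"{gpu}: PCIe Gen {gen}, x{width_now} link now (x{width_max} max)")
```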

I got it working, so I'm sharing the steps here in case somebody out there finds them useful.

Assumptions:

  • You already ensured that your PSU can handle both cards. If not, use PC Part Picker to estimate the minimum wattage needed; for my build, an 850W PSU suffices (see parts list). A rough back-of-the-envelope estimate is also sketched right after this list.
  • You're using Windows 10 as the OS. Might work on Windows 11 too, but I haven't tested.
  • You have an SSD already using one of the two M.2 slots (if you don't, other PCIe bifurcation configs might still be an option for you, e.g., 1x8 + 2x4, or 2x8).
  • You already have "Initial Display Output" set to "PCIe 1 Slot" in your BIOS (the default setting).
  • The two GPU cards you want to install are Gen 4.0 (in my case, both are RTX 3060 12GB OC cards, just different editions).
  • You know what you're doing, e.g., whenever you open the box to add/remove cards, the PSU is turned off and unplugged from the outlet, you know how to insert/remove GPU cards and plug the EVGA PSU cables into them, etc.
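
For a rough back-of-the-envelope version of that wattage estimate (PC Part Picker remains the easier and more accurate option), here is a small Python sketch. The power numbers in it are illustrative assumptions; substitute the published figures for your own parts.

```python
# Rough PSU sizing -- all numbers below are assumptions, not measurements.
gpu_board_power_watts = 170   # assumed per-card board power for an RTX 3060
num_gpus = 2
cpu_watts = 150               # assumed CPU package power under load
rest_of_system_watts = 100    # assumed: motherboard, RAM, SSDs, fans, USB

load = num_gpus * gpu_board_power_watts + cpu_watts + rest_of_system_watts
target_utilization = 0.8      # leave ~20% headroom on the PSU rating
print(f"Estimated load: {load} W -> aim for a PSU of at least "
      f"{round(load / target_utilization)} W")
```

With these placeholder numbers you land around 740 W, comfortably inside an 850W PSU.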

BIOS config you'll need, in addition to keeping PCIe 1 Slot as Initial Display Output:

Settings tab > "IO Ports":

  • PCIEX16 Bifurcation -> PCIE 4x4
  • Above 4G Decoding -> Enabled

Settings tab > "Miscellaneous":

  • PCIEX16 Slot Configuration -> Auto
  • PCIe Slot Configuration -> Auto
  • IOMMU -> Auto

Sequence of steps to get Windows to recognize both of your cards:

  1. Make sure you have a GPU card only in the PCIe 2 slot (the one in the middle; PCIe 3 could also work, but in my case there isn't enough clearance to use it). Plug in your peripherals (including an HDMI monitor connected to that GPU), and boot.

  2. Windows recognizes it (listed under "Display Adapters" in "Device Manager"), and either automagically installs the NVIDIA driver on the card or gives you a chance to do it yourself. All good; turn the server off.

  3. Open your box and place your 2nd GPU in the PCIe 1 slot. Leave the HDMI monitor that was already plugged into the first GPU card (in the PCIe 2 slot), and plug a second HDMI monitor into the card you just placed in the PCIe 1 slot.

  4. Turn the server on and boot into Windows: it should now recognize the 2nd card and install the NVIDIA driver on it (you'll now see two entries under "Display Adapters" in "Device Manager"). Confirm visually that both monitors have signal; a script-based check is also sketched right after this list.
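
If you prefer a script over clicking through Device Manager, here is a minimal Python sketch that lists every display adapter Windows currently sees via a stock Windows PowerShell query (nothing extra to install); with both cards detected, you should get two RTX 3060 entries.

```python
import subprocess

# PowerShell one-liner that returns just the display adapter names,
# i.e. the same list you'd see under "Display Adapters" in Device Manager.
ps_cmd = "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name"

out = subprocess.run(
    ["powershell", "-NoProfile", "-Command", ps_cmd],
    capture_output=True, text=True, check=True,
).stdout

adapters = [line.strip() for line in out.splitlines() if line.strip()]
print(f"{len(adapters)} display adapter(s) detected:")
for name in adapters:
    print(f"  - {name}")
```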

You should now be all set.
