PCI express risers to use multiple GPUs on one motherboard - not detecting card?

Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95029 - Posted: 14 Jan 2020, 18:52:19 UTC - in response to Message 95025.  

Exactly which AMD GPU cards are you trying to use? Do they have sockets for supplementary power input?

If they don't have sockets, they'll take all the power they need - up to 75 watts - from the motherboard via the x16 slots. An x1 slot won't supply enough power, and you'll have to check that your riser can carry enough power.


USB-style risers do not take any power from the slot; they get their power from the external power connector on the PCIe x16 board at the USB end.

It sounds like he has one USB-style riser like this:

[image: USB-style x1-to-x16 riser]

and one 1x ribbon riser with an external molex connector, like this:

[image: 1x ribbon riser with molex]

He should only be plugging these into the 16x slots.

But for what he's doing, I still recommend just buying nice shielded ribbon risers like the ones I linked before. He can find the same thing from a UK-based seller.


The first photo you show is one of the ones I ordered, but have not yet received.

The second photo is the one I already have, and it doesn't work (I've tried two of them; I have 7).

Why do I have to plug them into 16x slots if they are only 1x plugs? I want to use the 1x slots as well, then I can have more than two GPUs per motherboard. I've also ordered a 1x plug with four USB sockets, so I can put 4 cards into one motherboard slot (if it works).

I have ordered a ribbon riser, as, like you, I assumed this would cause fewer problems; it's just moving the socket. But that will only work for the first two cards. I want to have many GPUs without having to buy many motherboards, processors, and RAM.
ID: 95029
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95031 - Posted: 14 Jan 2020, 18:57:20 UTC - in response to Message 95027.  
Last modified: 14 Jan 2020, 18:57:46 UTC

Exactly which AMD GPU cards are you trying to use? Do they have sockets for supplementary power input?

If they don't have sockets, they'll take all the power they need - up to 75 watts - from the motherboard via the x16 slots. An x1 slot won't supply enough power, and you'll have to check that your riser can carry enough power.


AMD Sapphire R9 280X (two of them). They both have twin 8-pin PCI Express power connectors. How does the power allocation work? Are they 3 separate inputs, with the voltage regulator on the card taking what it needs in preference? Or are they just all shorted together? The riser I tried had a rather flimsy attachment of a molex cable soldered straight onto it. Can you take 75 watts from a 1-lane socket, or is that only for the x16 ones?
Try https://en.wikipedia.org/wiki/PCI_Express#Power


Interesting, although it doesn't say how the electronics on the graphics card decide where to take the power from. I assume the VRM supplying the GPU has inputs from the PCI Express slot and from the extra power connectors on top of the card, and draws the right amount of current intelligently from each?
ID: 95031
Ian&Steve C.
Joined: 24 Dec 19
Posts: 151
United States
Message 95032 - Posted: 14 Jan 2020, 18:57:49 UTC - in response to Message 95022.  


Will PCI Express 1.0 be OK for BOINC? I'd like to use those slots as well, unless I can multiplex the PCI Express 2.0 slots. Is there any way I can tell how much data is being transferred on the PCI Express bus? Or look up the requirement by a project? I was under the impression that BOINC was similar to bitcoin, in that a small amount of data is passed to the card, it processes hard for ages, then passes back a small amount of data.

I tried 2 of the 7 risers I had lying about, and neither produced even the BIOS screen on the monitor. They have been sat in my garage (which is only heated to 10°C for the parrots), and I have found some DVI cables stored in there that have gone rusty! The sooner I get a heat pump with heating/AC/dehumidification all in one the better; Scotland is DAMP!

A graphics card once scared the hell out of me late in the evening when I forgot to plug in the extra power connectors. As soon as I turned on the PC, it made a very loud alarm sound and flashed on the screen, in red block capitals, something along the lines of my being a stupid idiot who hadn't plugged things in correctly. I thought I had a horrid boot virus or something was about to catch fire.

One of the things I've ordered is a straight ribbon cable, 16x to 16x, which looks similar to yours; hopefully that will work. Although I need the 1x-to-16x to work if I want a 3rd card or more.


The 16x-to-16x riser you bought may work since you are only running PCIe 2.0, but if it were me I would feel a lot better with the higher-quality cable.

The 1.0 slot will "work", but I think you will find HUGE performance reductions in doing so. Compared to the x16 2.0 slot, the x1 1.0 slot has just 1/32 of the total bandwidth. I can't imagine it will run very well; just stick to the 2 cards, in my opinion.
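
To put numbers on that 1/32 figure, here's a quick sketch using the per-lane rates from the published PCIe specs (illustrative arithmetic only):

    # Approximate usable bandwidth per PCIe lane, per generation.
    # PCIe 1.0 and 2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b.
    PER_LANE_MB_S = {
        "1.0": 2500 * 8 / 10 / 8,     # 2.5 GT/s -> ~250 MB/s
        "2.0": 5000 * 8 / 10 / 8,     # 5.0 GT/s -> ~500 MB/s
        "3.0": 8000 * 128 / 130 / 8,  # 8.0 GT/s -> ~985 MB/s
    }

    def link_bandwidth(gen: str, lanes: int) -> float:
        """Total one-direction bandwidth in MB/s for a link."""
        return PER_LANE_MB_S[gen] * lanes

    x16_gen2 = link_bandwidth("2.0", 16)  # ~8000 MB/s
    x1_gen1 = link_bandwidth("1.0", 1)    # ~250 MB/s
    print(f"x16 2.0: {x16_gen2:.0f} MB/s, x1 1.0: {x1_gen1:.0f} MB/s")
    print(f"ratio: {x16_gen2 / x1_gen1:.0f}x")  # -> 32x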
ID: 95032
robsmith
Volunteer tester
Help desk expert
Joined: 25 May 09
Posts: 952
United Kingdom
Message 95033 - Posted: 14 Jan 2020, 18:57:57 UTC

Why do I have to plug them into 16x slots if they are only 1x plugs?

You don't - electrically the first 18(?) pins on an x16 are exactly the same as those on an x1, or at least they SHOULD be. The reason for trying an x16 is to make sure that there isn't something strange with the x1 sockets (and that wouldn't be the first time).
ID: 95033
Richard Haselgrove
Volunteer tester
Help desk expert
Joined: 5 Oct 06
Posts: 4414
United Kingdom
Message 95034 - Posted: 14 Jan 2020, 18:58:46 UTC - in response to Message 95029.  

Why do I have to plug them into 16x slots if they are only 1x plugs?
Although all the power pins are concentrated in the short section before the key, and are common to all sizes, the motherboard manufacturer is at liberty to provide only enough power tracks between the PSU and the slot to supply the PCIe-spec power for that size of slot.

And if they are at liberty, they will have saved money by exercising that liberty. Copper is expensive.
ID: 95034
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95035 - Posted: 14 Jan 2020, 19:01:09 UTC - in response to Message 95028.  

Exactly which AMD GPU cards are you trying to use? Do they have sockets for supplementary power input?

If they don't have sockets, they'll take all the power they need - up to 75 watts - from the motherboard via the x16 slots. An x1 slot won't supply enough power, and you'll have to check that your riser can carry enough power.


AMD Sapphire R9 280X (two of them). They both have twin 8-pin PCI Express power connectors. How does the power allocation work? Are they 3 separate inputs, with the voltage regulator on the card taking what it needs in preference? Or are they just all shorted together? The riser I tried had a rather flimsy attachment of a molex cable soldered straight onto it. Can you take 75 watts from a 1-lane socket, or is that only for the x16 ones?


The PCIe power connectors on the GPUs have sense pins: the card knows when a connector is not plugged in, will not start until it is, and will usually give that warning message on the screen when you have forgotten.

You need all power connectors plugged in. If you are not doing that, try that first.


Yes, I've plugged all the power connectors in. It is a cheap power supply, but it powers the card just fine when connected straight to the motherboard. I was just wondering whether there wasn't enough power coming through that dodgy soldered-on molex plug on my riser, so the card decided not to start? If it's just a power problem, it should be solved when I get the beefier ones, as in the top picture from Ian&Steve above.
ID: 95035
Ian&Steve C.
Joined: 24 Dec 19
Posts: 151
United States
Message 95036 - Posted: 14 Jan 2020, 19:02:22 UTC - in response to Message 95031.  



Interesting, although it doesn't say how the electronics on the graphics card decide where to take the power from. I assume the VRM supplying the GPU has inputs from the PCI Express slot and from the extra power connectors on top of the card, and draws the right amount of current intelligently from each?


He sent you the PCIe spec.

The information on "what power comes from where" is totally up to the GPU design and how the power connections are routed on the GPU's PCB, and it will vary from card to card. On most modern graphics cards the slot power and the PCIe connector power are totally segregated: something along the lines of slot power supplying the VRAM and fans, and the PCIe connectors supplying the GPU core. Some manufacturers might even power only the card's LEDs and fans from slot power and put all other loads on the PCIe connectors. You're unlikely to find any documentation on how your specific card is set up without independent testing and measurement.
ID: 95036
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95037 - Posted: 14 Jan 2020, 19:03:31 UTC - in response to Message 95032.  
Last modified: 14 Jan 2020, 19:04:00 UTC


Will PCI Express 1.0 be OK for BOINC? I'd like to use those slots as well, unless I can multiplex the PCI Express 2.0 slots. Is there any way I can tell how much data is being transferred on the PCI Express bus? Or look up the requirement by a project? I was under the impression that BOINC was similar to bitcoin, in that a small amount of data is passed to the card, it processes hard for ages, then passes back a small amount of data.

I tried 2 of the 7 risers I had lying about, and neither produced even the BIOS screen on the monitor. They have been sat in my garage (which is only heated to 10°C for the parrots), and I have found some DVI cables stored in there that have gone rusty! The sooner I get a heat pump with heating/AC/dehumidification all in one the better; Scotland is DAMP!

A graphics card once scared the hell out of me late in the evening when I forgot to plug in the extra power connectors. As soon as I turned on the PC, it made a very loud alarm sound and flashed on the screen, in red block capitals, something along the lines of my being a stupid idiot who hadn't plugged things in correctly. I thought I had a horrid boot virus or something was about to catch fire.

One of the things I've ordered is a straight ribbon cable, 16x to 16x, which looks similar to yours; hopefully that will work. Although I need the 1x-to-16x to work if I want a 3rd card or more.


The 16x-to-16x riser you bought may work since you are only running PCIe 2.0, but if it were me I would feel a lot better with the higher-quality cable.

The 1.0 slot will "work", but I think you will find HUGE performance reductions in doing so. Compared to the x16 2.0 slot, the x1 1.0 slot has just 1/32 of the total bandwidth. I can't imagine it will run very well; just stick to the 2 cards, in my opinion.


I'll see how it runs. When adding a new piece of hardware or a new project, I always check that nothing is causing a nasty bottleneck; I like all processors maxed out! It may be that some projects will run OK on the x1 slots. I don't want to buy more computers if I don't have to; GPU computing is my favourite :-)
ID: 95037
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95038 - Posted: 14 Jan 2020, 19:06:01 UTC - in response to Message 95033.  

Why do I have to plug them into 16x slots if they are only 1x plugs?

You don't - electrically the first 18(?) pins on an x16 are exactly the same as those on an x1, or at least they SHOULD be. The reason for trying an x16 is to make sure that there isn't something strange with the x1 sockets (and that wouldn't be the first time).


Well, the card failed to start on the riser plugged into either an x1 or an x16 slot, so I'm blaming the flimsy-looking riser. I'll find out when I eventually get the new risers. I hate 2nd class post. So many sellers on eBay either don't offer 1st class or charge stupid prices for it. 1st class is usually only 50p more! I always sell things 1st class, or give the option with a small, sensible price increase.
ID: 95038
Ian&Steve C.
Joined: 24 Dec 19
Posts: 151
United States
Message 95039 - Posted: 14 Jan 2020, 19:06:38 UTC - in response to Message 95033.  

Why do I have to plug them into 16x slots if they are only 1x plugs?

You don't - electrically the first 18(?) pins on an x16 are exactly the same as those on an x1, or at least they SHOULD be. The reason for trying an x16 is to make sure that there isn't something strange with the x1 sockets (and that wouldn't be the first time).


Technically what Richard wrote is correct: the x1 slots are only spec'd to supply about 10-25 W of power. He shouldn't try plugging a GPU into one of these with a riser that pulls power from the slot (ribbon style), as pulling too much power could damage the slot.

Yes, the electrical spec (which pins carry which voltages) is the same, but the designed power delivery is different, quite apart from the obvious bandwidth differences on the data side.
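
For reference, the slot power limits from the PCIe CEM spec work out roughly like this - a sketch for full-height cards; exact limits depend on card type and configuration:

    # Rough slot power limits from the PCIe CEM spec, in watts
    # (full-height cards). An x1 card may draw 10 W at power-up and
    # up to 25 W once configured; an x16 graphics card may draw 25 W
    # at power-up and up to 75 W.
    SLOT_POWER_W = {
        "x1":  {"initial": 10, "maximum": 25},
        "x4":  {"initial": 25, "maximum": 25},
        "x8":  {"initial": 25, "maximum": 25},
        "x16": {"initial": 25, "maximum": 75},
    }

    def slot_headroom(slot: str, card_draw_w: float) -> float:
        """Spare capacity in watts; negative means over the slot's limit."""
        return SLOT_POWER_W[slot]["maximum"] - card_draw_w

    # A ribbon riser passing a 75 W slot load into an x1 slot is 50 W over:
    print(slot_headroom("x1", 75))  # -> -50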
ID: 95039
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95040 - Posted: 14 Jan 2020, 19:07:17 UTC - in response to Message 95034.  

Why do I have to plug them into 16x slots if they are only 1x plugs?
Although all the power pins are concentrated in the short section before the key, and are common to all sizes, the motherboard manufacturer is at liberty to provide only enough power tracks between the PSU and the slot to supply the PCIe-spec power for that size of slot.

And if they are at liberty, they will have saved money by exercising that liberty. Copper is expensive.


So I assume that if I use one of those well-made risers, as pictured above, the power will be supplied from the 6-pin PCI Express power connector, so that won't be a problem?
ID: 95040
Ian&Steve C.
Joined: 24 Dec 19
Posts: 151
United States
Message 95041 - Posted: 14 Jan 2020, 19:08:10 UTC - in response to Message 95037.  

It's also possible that the board simply won't recognize a GPU in the x1 slots.
ID: 95041
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95042 - Posted: 14 Jan 2020, 19:09:48 UTC - in response to Message 95036.  



Interesting, although it doesn't say how the electronics on the graphics card decide where to take the power from. I assume the VRM supplying the GPU has inputs from the PCI Express slot and from the extra power connectors on top of the card, and draws the right amount of current intelligently from each?


He sent you the PCIe spec.

The information on "what power comes from where" is totally up to the GPU design and how the power connections are routed on the GPU's PCB, and it will vary from card to card. On most modern graphics cards the slot power and the PCIe connector power are totally segregated: something along the lines of slot power supplying the VRAM and fans, and the PCIe connectors supplying the GPU core. Some manufacturers might even power only the card's LEDs and fans from slot power and put all other loads on the PCIe connectors. You're unlikely to find any documentation on how your specific card is set up without independent testing and measurement.


That might explain my problem. In the past when I forgot to supply the extra power and got a warning, presumably the GPU's BIOS was powered from the slot so it could show me the warning. Maybe these ones can't start at all because that slot power is unavailable due to the dodgy riser?
ID: 95042
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95043 - Posted: 14 Jan 2020, 19:13:25 UTC - in response to Message 95041.  

It's also possible that the board simply won't recognize a GPU in the x1 slots.


It did behave slightly differently when I used an x1 slot. I think (I've forgotten exactly what happened now) Windows booted with no graphics card on the x16 slot (I could see it via remote access, though), and the computer refused to do anything at all with the x1 slot. So I may be out of luck putting more than 2 cards in that machine. Oh well, someone needs to earn more money and buy another motherboard for more cards. I guess it can do CPU projects as well...
ID: 95043
ProDigit
Joined: 8 Nov 19
Posts: 615
United States
Message 95135 - Posted: 15 Jan 2020, 16:01:15 UTC - in response to Message 95043.  

My 2 cents:
Stay away from 1x to 16x risers if you can.

Use $8, 6" 16x to 16x double ribbon extension risers.
They're cheap and effective.
You'll still connect at x16, x8, or x4 (depending on the mobo).

USB risers (1x to 16x) are good for x1 ports, in case the full-size slots are already occupied.

After this you might need to reinstall the GPU drivers.
ID: 95135
robsmith
Volunteer tester
Help desk expert
Joined: 25 May 09
Posts: 952
United Kingdom
Message 95137 - Posted: 15 Jan 2020, 16:13:35 UTC

Great, but if you read the description of what the OP's motherboard has, and what he wants to achieve, you would realise that what you are suggesting will not fully solve his problem.
Summary - the motherboard has two x16 slots and a number of x1 slots. He wants to run more than 2 GPUs, he wants them air-cooled, and the x16 slots are too close together for his comfort. Thus he needs to be able to use at least one of the x16 slots with a riser, and the x1 slots will need a riser anyway - and those risers will have to be x1 to x16, because they have to be x1 at the motherboard end. So far he's had no joy in getting the motherboard to recognise anything sitting on an x1-to-x16 riser. Ian&Steve has made a few suggestions, and I know he has done a lot of work in getting similar (not identical) systems working.
ID: 95137
Richard Haselgrove
Volunteer tester
Help desk expert
Joined: 5 Oct 06
Posts: 4414
United Kingdom
Message 95141 - Posted: 15 Jan 2020, 16:30:00 UTC - in response to Message 95137.  

That may require some nifty work with a fine cutting tool like a Dremel (probably not a hacksaw). By opening up the end of the PCIe x1 slot, the x16 riser could be physically inserted into the x1 slot. He'd still have to watch out for power consumption: the sense pin should tell the motherboard that a card is present. But he would still have to investigate and manage the card's actual power draw from each input.

It should be possible: his cards have nominal power inputs totalling 375W (75W from the PCIe slot, plus 2x 150W 8-pin supplementary inputs). The cards he has are rated at 250W average total board power. So there's headroom; it's just a question of which input has the spare capacity, and that depends on the manufacturer.

To my mind, fitting dual 8-pin connectors suggests that the bulk of the power will be taken from them: if the full 75W were taken from the motherboard, they could have got away with one 8-pin and one 6-pin. But I am not a circuitry designer: it's all supposition, and it might still fry the motherboard. Proceed with extreme caution, and keep a fire extinguisher close at hand.
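
The arithmetic, as a quick sketch (connector ratings from the PCIe spec; how the card actually splits its draw between the three inputs is the unknown):

    # Nominal input capacity per the PCIe spec ratings.
    SLOT_W = 75        # x16 slot allowance for a graphics card
    EIGHT_PIN_W = 150  # each 8-pin supplementary connector

    available_w = SLOT_W + 2 * EIGHT_PIN_W  # 375 W nominal
    board_power_w = 250                     # R9 280X average board power

    print(f"available: {available_w} W, draw: {board_power_w} W, "
          f"headroom: {available_w - board_power_w} W")
    # Even with zero slot power, the two 8-pins alone (300 W) would cover
    # the 250 W board power - but whether the card is wired that way is
    # entirely up to the manufacturer.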
ID: 95141
Jord
Volunteer tester
Help desk expert
Joined: 29 Aug 05
Posts: 14560
Netherlands
Message 95143 - Posted: 15 Jan 2020, 17:04:22 UTC - in response to Message 95141.  
Last modified: 15 Jan 2020, 17:04:48 UTC

That may require some nifty work with a fine cutting tool like a Dremel (probably not a hacksaw). By opening up the end of the PCIe x1 slot, the x16 riser could be physically inserted into the x1 slot.
Before the OP goes this route, let me point out that BOINC Dev does not condone this sort of DIY, and any damage done to hardware is at your own risk, etc. Or blame Richard. I know where he lives. ;-)
ID: 95143
Peter Hucker
Joined: 6 Oct 06
Posts: 1088
United Kingdom
Message 95146 - Posted: 15 Jan 2020, 18:35:24 UTC - in response to Message 95143.  
Last modified: 15 Jan 2020, 18:35:53 UTC

That may require some nifty work with a fine cutting tool like a Dremel (probably not a hacksaw). By opening up the end of the PCIe x1 slot, the x16 riser could be physically inserted into the x1 slot.
Before the OP goes this route, let me point out that BOINC Dev does not condone this sort of DIY, and any damage done to hardware is at your own risk, etc. Or blame Richard. I know where he lives. ;-)


I will not be adjusting things like that, since the last time I tried something similar (retrofitting watercooling inside a PSU), I got smoke. The workmanship was fine, but my planning was incorrect: I didn't cool every component that needed it, and something got terribly, terribly hot. I did do bad workmanship retrofitting the watercooler to the northbridge on the MB, though - that leaked. The PC froze, but it worked after drying it out :-) I have also told myself NO MORE OVERCLOCKING! Even if the temperature is fine, it wears out the chips really fast. That's why I'm in the place I'm in now: the last GPU died, so I tried to get another, found some half-price ones, and bought two. Always run everything at stock speed! No point in a 20% speed gain if the chip lasts half as long.

Oh and I'm in the UK, so I can get to Richard quite easily.

I have received the first of my adapters - the 16x-to-16x ribbon - but it's advertised as PCI Express 1.0 (unshielded); it uses a ribbon similar to an IDE cable. I've ordered a shielded one that says PCI Express 3.0. The 1.0 ribbon does seem to be working, though: 2x Milkyway on each of 2 cards, or 2x Einstein Gamma on each of 2 cards, with no crashing :-)

Is there any way I can tell (perhaps through Device Manager - in the card's properties, Details tab, there's a huge list of technical numbers) what PCI Express version it's trying to use? Since the motherboard is 2.0 and the cards are 3.0, I assume it will negotiate 2.0 without realising the ribbon isn't shielded for that.
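
A minimal sketch of one way to check, assuming a Linux host where lspci is available and run with enough privilege to read the PCIe capability block (tools like GPU-Z reportedly show the same thing on Windows as "Bus Interface"):

    import re
    import subprocess

    # List each VGA device's PCIe link capability vs. negotiated status.
    # LnkCap is what the device supports; LnkSta is what was negotiated
    # (2.5 GT/s = PCIe 1.0, 5 GT/s = 2.0, 8 GT/s = 3.0).
    out = subprocess.run(["lspci", "-vv"], capture_output=True,
                         text=True).stdout

    for block in out.split("\n\n"):
        if "VGA compatible controller" in block:
            print(block.splitlines()[0])  # device name line
            for line in re.findall(r"Lnk(?:Cap|Sta):.*", block):
                print("   ", line.strip())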
ID: 95146
Ian&Steve C.
Joined: 24 Dec 19
Posts: 151
United States
Message 95147 - Posted: 15 Jan 2020, 18:41:50 UTC - in response to Message 95141.  
Last modified: 15 Jan 2020, 18:43:04 UTC

That may require some nifty work with a fine cutting tool like a Dremel (probably not a hacksaw). By opening up the end of the PCIe x1 slot, the x16 riser could be physically inserted into the x1 slot. He'd still have to watch out for power consumption: the sense pin should tell the motherboard that a card is present. But he would still have to investigate and manage the card's actual power draw from each input.

It should be possible: his cards have nominal power inputs totalling 375W (75W from the PCIe slot, plus 2x 150W 8-pin supplementary inputs). The cards he has are rated at 250W average total board power. So there's headroom; it's just a question of which input has the spare capacity, and that depends on the manufacturer.

To my mind, fitting dual 8-pin connectors suggests that the bulk of the power will be taken from them: if the full 75W were taken from the motherboard, they could have got away with one 8-pin and one 6-pin. But I am not a circuitry designer: it's all supposition, and it might still fry the motherboard. Proceed with extreme caution, and keep a fire extinguisher close at hand.


Actually, he can't do that at all, even if he wanted to.

The topmost x1 slot is immediately obstructed, aft of the slot, by the northbridge heatsink.
The lower x1 slot is obstructed a little further out by the BIOS battery; he would not be able to fit an x16 card there, though there may be enough room for an x8 device.

It's still a poor option to try to use these slots for anything: one lane at PCIe 1.0 speed is just too little bandwidth to be useful for BOINC projects.

Stick to the two x16 slots only.

ID: 95147