PCI express risers to use multiple GPUs on one motherboard - not detecting card?


Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95773 - Posted: 9 Feb 2020, 11:47:31 UTC - in response to Message 95768.  

Way too hot!


Maybe someone should talk to AMD about that. Looks like AMD is doing all the things you seem to hate nvidia for. Overheating, inadequate fan control, not throttling to keep temps controlled.

https://youtu.be/V1SzOzRwbxg


Maybe so; I'm just going by my own experience: about 200 AMD cards with no heat problems, and about 15 Nvidia cards, two with heat problems.

I guess it depends on who designs the heatsink, fans, and controller settings, but I've never overheated an AMD. Shouldn't AMD and Nvidia design stock heatsinks and fans to go with their chips? Intel do it for their CPUs. They may not be quiet, but they always keep things cool enough.
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95774 - Posted: 9 Feb 2020, 11:50:05 UTC - in response to Message 95771.  

Nvidia selling me a card which broke in under a week is not acceptable. They can't make up for that


Was it a GPU that was manufactured by nVidia, or was it a GPU manufactured by a third party using an nVidia chipset? Since the vast majority of the GPUs we use are actually manufactured by a third party, with the chipset supplied by either nVidia or AMD, I would hazard a guess that your "nVidia" GPU was actually packaged and manufactured by a third party (Asus, MSI, etc.). Thus you should be blaming that third-party manufacturer for your woes - they got the package wrong.


The GPU itself (the actual processor) was obviously Nvidia. The heatsink, fan, etc. should have been designed by Nvidia too, just like Intel provide stock coolers for their CPUs. You can buy better ones, but you start with one that actually works, then buy quieter if necessary. Or maybe Nvidia shouldn't make chips with such a high TDP that it's almost impossible to cool? Admittedly I have seen one AMD card like that - a dual-GPU card. There is no way in hell you can cool two GPUs in that space.
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95775 - Posted: 9 Feb 2020, 11:52:06 UTC - in response to Message 95772.  
Last modified: 9 Feb 2020, 11:52:19 UTC

Nvidia selling me a card which broke in under a week is not acceptable. They can't make up for that
Was it a GPU that was manufactured by nVidia, or was it a GPU manufactured by a third party using an nVidia chipset? Since the vast majority of the GPUs we use are actually manufactured by a third party, with the chipset supplied by either nVidia or AMD, I would hazard a guess that your "nVidia" GPU was actually packaged and manufactured by a third party (Asus, MSI, etc.). Thus you should be blaming that third-party manufacturer for your woes - they got the package wrong.
And as Ian&Steve C. said, it would be helpful to know when this anecdote dates from. I think the mainstream manufacturers have put a lot more effort into designing their cooling systems, once the problems with overheating became known.


I like the Gigabyte cards (e.g. Windforce); they always seem to put extra effort into design (motherboards and graphics cards).

200th post in this thread. That's a lot said about PCIe risers!


Hey, at least I've solved my original problem :-)
Ian&Steve C.
Joined: 24 Dec 19
Posts: 115
United States
Message 95779 - Posted: 9 Feb 2020, 17:18:52 UTC - in response to Message 95774.  


The GPU itself (the actual processor) was obviously Nvidia.


Which GPU was it? What model?

The heatsink, fan, etc. should have been designed by Nvidia too, just like Intel provide stock coolers for their CPUs. You can buy better ones, but you start with one that actually works, then buy quieter if necessary.


Not always. Both AMD and Nvidia have cards that they do not release a reference design for, meaning they produce the GPU itself and let the AIB make their own PCB and cooling solution.

And even on the CPU side, Intel and AMD both produce CPUs that they do not include a stock heatsink for.

Or maybe Nvidia shouldn't make chips with such a high TDP that it's almost impossible to cool? Admittedly I have seen one AMD card like that - a dual-GPU card. There is no way in hell you can cool two GPUs in that space.


AMD is more guilty of this than Nvidia. You'd know that if you'd actually compared Nvidia and AMD cards at the same performance level and with similar release dates.

Example:
(Relative performance) Model - TDP - Date
(100%) AMD Radeon VII - 295W - Feb 7th 2019
(106%) Nvidia 1080ti - 250W - Mar 10th 2017
(97%) Nvidia RTX 2070 - 175W - Oct 17th 2018
(99%) AMD 5700XT - 225W - Jul 7th 2019
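
Working those numbers out as performance per watt makes the efficiency gap concrete. A quick sketch in Python, using only the figures from the list above:

# perf_per_watt.py - back-of-the-envelope efficiency from the list above
cards = {
    "Radeon VII":  (100, 295),  # (relative performance %, TDP in watts)
    "GTX 1080 Ti": (106, 250),
    "RTX 2070":    (97, 175),
    "RX 5700 XT":  (99, 225),
}
for name, (perf, tdp) in sorted(cards.items(), key=lambda kv: -kv[1][0] / kv[1][1]):
    print(f"{name:12s} {perf / tdp:.3f} perf/W")
# Prints the RTX 2070 first (~0.554 perf/W) and the Radeon VII last (~0.339).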

AMD cards are less efficient, use more power, run hotter, and have more driver issues than comparable Nvidia cards.
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95780 - Posted: 9 Feb 2020, 17:57:53 UTC - in response to Message 95779.  
Last modified: 9 Feb 2020, 17:58:45 UTC


The GPU itself (the actual processor) was obviously Nvidia.


Which GPU was it? What model?


I can't remember; it was a couple of years ago. It was a build I was doing for a customer. When I told him it blew up while I was running a test, he took my advice and had me change it to an AMD. I did. Faster for the same price, and it didn't melt.

The heatsink, fan, etc. should have been designed by Nvidia too, just like Intel provide stock coolers for their CPUs. You can buy better ones, but you start with one that actually works, then buy quieter if necessary.


Not always. Both AMD and Nvidia have cards that they do not release a reference design for, meaning they produce the GPU itself and let the AIB make their own PCB and cooling solution.


I wouldn't if I were them. It spoils their reputation when the cooler is badly designed.

And even on the CPU side, Intel and AMD both produce CPUs that they do not include a stock heatsink for.


I've never seen one. Every Intel I've bought has a stock fan. You can of course order one without a fan if you know you're going to fit an Arctic cooler or a watercooler, etc.

Or maybe Nvidia shouldn't make chips with such a high TDP that it's almost impossible to cool? Admittedly I have seen one AMD card like that - a dual-GPU card. There is no way in hell you can cool two GPUs in that space.


AMD is more guilty of this than Nvidia. You'd know that if you'd actually compared Nvidia and AMD cards at the same performance level and with similar release dates.


I have done, 100 times when I built gaming rigs.

Example:
(Relative performance) Model - TDP - Date
(100%) AMD Radeon VII - 295W - Feb 7th 2019
(106%) Nvidia 1080ti - 250W - Mar 10th 2017
(97%) Nvidia RTX 2070 - 175W - Oct 17th 2018
(99%) AMD 5700XT - 225W - Jul 7th 2019

AMD cards are less efficient, use more power, run hotter, and have more driver issues than comparable Nvidia cards.


Not in my experience. I've even had problems replacing an Nvidia card with an AMD on quite a few occasions: leaving the Nvidia driver on the machine caused it to lock up when the AMD card was inserted. It's all very well saying remove the Nvidia driver first, but I shouldn't have to, and I can't if the card is broken. I had to use safe mode!
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95781 - Posted: 9 Feb 2020, 18:04:00 UTC - in response to Message 95779.  

AMD cards are less efficient, use more power, run hotter, and have more driver issues than comparable Nvidia cards.


Ok, here's a link to a graph of GPUs good for compute power, as in Boinc:
https://www.videocardbenchmark.net/directCompute.html

The most powerful sensibly priced (as in under $500) AMD is the Radeon RX 5700 XT, speed 8186, $380.
The most powerful sensibly priced Nvidia is the GeForce RTX 2070 Super, speed 7051, $500.

I myself prefer power over $.
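
For what it's worth, those two data points work out as compute per dollar like this (a quick sketch using the speeds and prices quoted above):

# directCompute score per dollar, figures from the post above
rx_5700_xt = 8186 / 380  # ~21.5 points per $
rtx_2070_s = 7051 / 500  # ~14.1 points per $
print(f"RX 5700 XT: {rx_5700_xt:.1f} pts/$   RTX 2070 Super: {rtx_2070_s:.1f} pts/$")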
Joseph Stateson
Volunteer tester
Joined: 27 Jun 08
Posts: 538
United States
Message 95782 - Posted: 9 Feb 2020, 18:04:07 UTC
Last modified: 9 Feb 2020, 18:10:02 UTC

This thread is really off topic. I am declaring it a troll thread and am providing tools for identifying abusers.
My pick for Peter is the "Wank-o-meter". The rest of you can decide what yours are.
Simplified - Numeric
    TROLL-O-METER
0 1 2 3 [4] 5 6 7 8 9 10 


Simplified - Legend
    T R O L L - O - M A T I C 
PATHETIC ---------+---------INSPIRED 


Simplified - with watchdog
 0  1  2  3  4  5  6  7  8  9 10 
+------------------------------+ 
|**********************        | 
|**********************        | 
+------------------------------+ 
|            (o o) 
| -------oOO--(_)--OOo-------- 


Move along folks. Nothing to see here. Just a troll having a seizure.  Show's over. Keep it moving.
TROLL-O-METER 
1 2 3 4 5 6 7 8 9 +10 +20 
|||||||||||||||||-||||||| 

BULLSHIT-O-METER 
1 2 3 4 5 6 7 8 9 +10 +20 
||||||||||||||||-|||||||| 

DUMBASS-O-METER 
1 2 3 4 5 6 7 8 9 +10 +20 
||||||||||||||||-|||||||| 


Try to be a bit more subtle next time. Thanks for playing!
---------------------- 
0-1-2-3-4-5-6-7-8-9-10 
---------------------- 
^

Better, but no bite. I am uninterested in getting into a flame war with you no matter how many personal attacks you make. If you insist on having a fight, go ahead and start without me.
.------------------------------------------. 
[ reeky neighborhood watch Troll-O-Meter] 
[---0---1---2---3---4--5--6--7--8---9---] 
[||||||||| ] 
'------------------------------------------'


 Not bad, as trolls go. The Lame-o-meter was off scale on the highest range and the Clue-o-meter wouldn't register. 
0  1  2  3  4  5  6  7  8  9  10 
_________________________________ 
|  |  |  |  |  |  |  |  |  |  | 
---------------------------------
                                ^ 
                                | 


OMG! They said it couldn't be done, but this post rates: 
+--------------------------+ 
|           4 5            |
|        3       6         | 
|     2             7      | 
|   1                 8    | 
|  0         o         9   | 
|  -1      /               | 
|   -2    /                | 
|     -3 /                 | 
+--------------------------+ 
|      Troll-O-Meter       | 
'--------------------------'
Warning, troll-o-meters are susceptible to particularly stupid statements, and may give inaccurate readings under such conditions.  A shrill response generally indicates an effective complaint. 

 
 /_______________________/| 
| TROLL-O-METER(tm)      || 
|                        || 
| .---  .---             || 
|     |     | millitrolls|| 
| .---'     |    _  _    ||
| |         | . | || |   || 
|  ---'     '   `-'`-'   |/
`------------------------'



A bit better, but still not enough to hook me. Better luck on your next cast! 
+----------------------+ +----------------------+ 
|0 1 2 3 4 5 6 7 8 9 10| |0 1 2 3 4 5 6 7 8 9 10| 
| \ TROLL-O-METER      | | WANK-O-METER    /    | 
|  \                   | |                /     | 
|   \                  | |               /      | 
|    \                 | |              /       | 
|     \                | |             /        | 
|      \               | |            /         | 
|       \              | |           /          | 
|Certifie\  next cal:  | |Certified / next cal: | 
| NIST    \ date 12/5  | | NIST    / date 12/5  | 
+----------------------+ +----------------------+ 



*BANG* 

That was the Troll-O-Meter exploding. 
Fell for that one hook, line, and sinker.

                    # 
                   ; 
                  ; 
                       @ 
+----------------+   ,.' 
| .0 .2 .4 .8 1.0|   ,.' 
|              ' | 
|              ` | ''..''..''% 
+------------`    | 
| Troll-O-Met`  (((o 
+-----------`      '., 
                      '., 
             ;            # 
               ; 
                * 
       _ 
 _____|_|_____ 
|   PLEASE!   | 
|-------------| 
| Do NOT Feed | 
|  The Troll  | 
|_____________| 
      | | 
      | | 
  \ \ | | / /

                                            .:\:/:.
         +-------------------+            .:\:\:/:/:.
         |   PLEASE DO NOT   |           :.:\:\:/:/:.:
         |  FEED THE TROLLS  |           :=.' - - '.=:
         |                   |           '=(\ 9 9 /)='
         | Thank you,        |             (  (_)  )
         |     Management    |             /`-vvv-'\
         +-------------------+            /         \
                 | |         @@@         / /|,,,,,|\ \
                 | |         @@@        /_//  /^\  \\_\
   @x@@x@        | |          |/        WW(  (  )  )WW
   \||||/        | |         \|          __\,,\ /,,/__
    \||/         | |          |     jgs (______Y______)
/\/\/\/\/\/\/\/\//\/\\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95783 - Posted: 9 Feb 2020, 18:28:24 UTC - in response to Message 95782.  
Last modified: 9 Feb 2020, 18:28:58 UTC

This thread is really off topic. I am declaring it a troll thread and am providing tools for identifying abusers.
My pick for Peter is the "Wank-o-meter". The rest of you can decide what yours are.


(Rest of childish rant snipped)

Do grow up.

How is posting an opinion and experience with graphics cards "trolling"?

I won't report your abusive message because, unlike some people in here, I'm not childish.
Richard Haselgrove
Volunteer tester
Help desk expert
Joined: 5 Oct 06
Posts: 3959
United Kingdom
Message 95784 - Posted: 9 Feb 2020, 18:42:54 UTC - in response to Message 95783.  

Do grow up.

How is posting an opinion and experience with graphics cards "trolling"?
It's trolling if you post assertions on a technical board like this without being willing to back them up with answers to factual questions when asked.

What was the year of manufacture of the NVidia card "which broke in under a week"?
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95785 - Posted: 9 Feb 2020, 18:50:53 UTC - in response to Message 95784.  

Do grow up.

How is posting an opinion and experience with graphics cards "trolling"?
It's trolling if you post assertions on a technical board like this without being willing to back them up with answers to factual questions when asked.


I've given plenty of links. And I shouldn't have to. I don't keep records of everything I do; I just know I've had a lot of problems with Nvidia and none with AMD, just as I've had loads of problems with Renault cars - but guess what, I didn't keep the details.

What was the year of manufacture of the NVidia card "which broke in under a week"?


A couple of years ago, I think. I'm not going to assume they've mended their ways in that time. And since (per my last link) AMD clearly give more compute power per $, I'll stick with them.
Ian&Steve C.
Joined: 24 Dec 19
Posts: 115
United States
Message 95786 - Posted: 9 Feb 2020, 18:58:52 UTC - in response to Message 95780.  

I wouldn't if I was them. It spoils their reputation when the cooler is designed badly.

I guess they know what they are doing, both being multi-billion dollar companies and all. The cooler design of an AIB reflects on the AIB, not the chip manufacturer, especially when a reference design doesn’t exist.

I've never seen one. Every Intel I've bought has a stock fan.

Intel has MANY SKUs, dating all the way back to 2011, for CPUs that were shipped without a thermal solution. These are not simply CPUs that can be ordered "OEM"; these are CPUs that were never offered with one in the first place. And this is only desktop consumer stuff, not even including their server-grade stuff.

https://www.intel.com/content/www/us/en/support/articles/000005940/processors.html

Not in my experience.

This whole thread can probably be summed up by these four words. The reality of the world is not always the same as in your own little bubble. A lot of your opinions are based on very old experiences not reflective of reality today.
Ian&Steve C.
Joined: 24 Dec 19
Posts: 115
United States
Message 95787 - Posted: 9 Feb 2020, 19:07:43 UTC - in response to Message 95781.  

AMD cards are less efficient, use more power, run hotter, and have more driver issues than comparable Nvidia cards.


Ok, here's a link to a graph of GPUs good for compute power, as in Boinc:
https://www.videocardbenchmark.net/directCompute.html

The most powerful sensibly priced (as in under $500) AMD is the Radeon RX 5700 XT, speed 8186, $380.
The most powerful sensibly priced Nvidia is the GeForce RTX 2070 Super, speed 7051, $500.

I myself prefer power over $.


Compute performance is highly dependent on the application being used. You have to answer the question of compute for “what”?

The 2070 Super is overall faster and more power efficient than an RX 5700 XT.

For Einstein the 5700 XT will probably win, as their apps favor AMD cards. For SETI, since there are specialized applications for CUDA, the 2070S will embarrass the 5700 XT.

It’s very difficult to compare two different cards in compute scenarios, since the wildly different architectures require wildly different applications to run. You can’t really make a true apples to apples comparison.
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95788 - Posted: 9 Feb 2020, 19:11:45 UTC - in response to Message 95786.  

I guess they know what they are doing, both being multi-billion dollar companies and all. The cooler design of an AIB reflects on the AIB, not the chip manufacturer, especially when a reference design doesn’t exist.


Nope, when I buy an Nvidia card and it melts, I blame Nvidia. If my Rover car fails because of its BMW engine, I blame BMW.

Intel has MANY SKUs, dating all the way back to 2011, for CPUs that were shipped without a thermal solution. These are not simply CPUs that can be ordered "OEM"; these are CPUs that were never offered with one in the first place. And this is only desktop consumer stuff, not even including their server-grade stuff.


I've never come across one, and I tend to buy gaming-spec Intel CPUs. I've certainly seen them offered without the fan, but there's always either an option to have the fan with it, or to buy one separately.

Not in my experience.

This whole thread can probably be summed up by these four words. The reality of the world is not always the same as in your own little bubble. A lot of your opinions are based on very old experiences not reflective of reality today.


I prefer my own experience (which, as I've already said, covers over 100 machines built from scratch by myself, plus over 1000 I've maintained, upgraded, and repaired at two companies).
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95789 - Posted: 9 Feb 2020, 19:15:24 UTC - in response to Message 95787.  

Compute performance is highly dependent on the application being used. You have to answer the question of compute for “what”?

The 2070 Super is overall faster and more power efficient than an RX 5700 XT.

For Einstein the 5700 XT will probably win, as their apps favor AMD cards. For SETI, since there are specialized applications for CUDA, the 2070S will embarrass the 5700 XT.

It’s very difficult to compare two different cards in compute scenarios, since the wildly different architectures require wildly different applications to run. You can’t really make a true apples to apples comparison.


Well, all I can do is go by what I've read on here (Einstein, which I use, is faster on AMD; SETI, which I don't use, is faster on Nvidia), plus some hard specs on websites. I use this: https://www.techpowerup.com/gpu-specs/ - click any card and look at "theoretical performance". They list the GFLOPS of every card ever made - very useful for finding the most FLOPS per $ you can get.
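
A sketch of that spreadsheet approach, for anyone who wants to reproduce it (the card names, GFLOPS, and prices below are made-up placeholders to be filled in from techpowerup and a price list, not quotes from the site):

# Hypothetical FLOPS-per-dollar ranking; replace with real GFLOPS and prices.
cards = [
    ("Card A", 9754, 380),  # (name, FP32 GFLOPS, price in $) - placeholders
    ("Card B", 9062, 500),
    ("Card C", 6497, 330),
]
for name, gflops, price in sorted(cards, key=lambda c: -c[1] / c[2]):
    print(f"{name}: {gflops / price:.1f} GFLOPS per $")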
Ian&Steve C.
Joined: 24 Dec 19
Posts: 115
United States
Message 95790 - Posted: 9 Feb 2020, 19:30:08 UTC - in response to Message 95789.  

That site is exactly what I used to show you the relative performance of the four cards listed a few posts back.
Peter Hucker
Joined: 6 Oct 06
Posts: 566
United Kingdom
Message 95791 - Posted: 9 Feb 2020, 20:06:29 UTC - in response to Message 95790.  

That site is exactly what I used to show you the relative performance of the four cards listed a few posts back.


It's what I used to compile a spreadsheet of what's the best value for me. I'm not an AMD fanboy; I'll buy an Nvidia if it's better value - if it does overheat, then if it's new it's under warranty, and if it's second-hand it's already proven to be OK. I just haven't seen a good one yet.
ProDigit
Joined: 8 Nov 19
Posts: 460
United States
Message 95831 - Posted: 14 Feb 2020, 6:10:40 UTC - in response to Message 95755.  

It’s amazing that someone can come to such outrageous conclusions.

The hardware can only perform as well as the software commanding it. It’s down to the software programming, not the hardware.

I’m not sure why you’re talking about CUDA. The current Einstein apps are not coded with CUDA. They are OpenCL. But regardless of the platform used, the effectiveness of the application is down to the code written by the programmer, not the hardware.

On SETI, for example: if you only compared the CUDA60 app to the SoG app (OpenCL), you'd think that CUDA is terrible, as the SoG app is much faster - until you tried the CUDA90 app, written by someone else, which is 4x faster than the SoG app.

Nvidia drivers can decide to run OpenCL tasks in CUDA if they want...
I've seen it happen.
As for the 15%, it's an estimate, based on playing around with WUs and tasks.
Nvidia X Server only shows increments of roughly 15-20%; it doesn't show accurate results.

It's also amazing how you can say someone has come to "such outrageous conclusions" while you provide absolutely no proof yourself; not to mention, it's degrading and just plain not nice to say that about someone else (be it me, or anyone on board here).
We're all here to learn!
ProDigit
Joined: 8 Nov 19
Posts: 460
United States
Message 95832 - Posted: 14 Feb 2020, 6:18:14 UTC - in response to Message 95736.  
Last modified: 14 Feb 2020, 6:41:33 UTC


I can't even service one GPU with that CPU. It's doing MW and Einstein Gamma instead. No matter how many Gravity tasks I put onto one GPU, the CPU can't keep up. I can max out all four cores of the CPU while the GPU sits at 30%.


You mention gravity and also gamma. I quit doing gravity due to low credit and problems with drivers on older boards. With a "weak" CPU, the gamma-ray pulsar app seems to run nicely on ATI boards, not so on Nvidia. This was discussed over at Einstein, and it seems to be down to the way the polling is implemented, as well as the hardware.

Following is a Linux system with a really cheap 2-core Celeron G1840. The motherboard is an H61 BTC model I got down from the attic, and I put in a pair of RX 560s and a pair of RX 570s.

The 11 minute 44 sec completion is typical for the RX 570; not obvious from the picture, but the RX 560 typically takes 25-29 minutes. Note the CPU usage is at most 22%, which works out nicely for the Celeron.

[screenshot of completed tasks omitted]

On the other hand, the Nvidia board uses up a full CPU. Both systems run Ubuntu 18.04.

[screenshot omitted]

The actual CPU load the Nvidia drivers put on the CPU can be seen in htop.
The scenario below is just an example, not an actual representation of Nvidia driver CPU load.
However, the CPU bars show quite accurately the actual CPU usage, and the idle data sent from the CPU to the GPU:

[htop screenshot omitted]

The green parts of the CPU bars (could be any other assigned color) reflect actual CPU crunching data that is sent to the GPU, while the red parts of the bars show the idle data.
With Nvidia, the red bars will nearly always load to 100%, while the green bars could be anywhere between 1 and 99%; the red bars just fill up to 100% from wherever the green bars leave off.
The idle data is just there to prevent the system from switching the job to another core, letting you enjoy the highest performance (this way, the CPU doesn't need to copy L-cache from one core to another).
This doesn't work with all operating systems: despite this polling (idle data), some operating systems still swap cores around.
AMD would show just the green bars, and the red bars (idle data) would be black (unused).
This has benefits, in that AMD GPUs tax the CPU less, and thus run the computer more efficiently.
But then again, the GPU side runs less efficiently than Nvidia's.

About idle data being absent when computing with AMD GPUs: I've seen the same thing happen on Nvidia drivers when CPU cores get shared between GPUs (e.g. a 2-core/4-thread CPU sharing 4 threads between 3 or 4 GPUs in Linux). Nvidia drivers will disable the idle polling data in that case, since at least one thread is being shared. However, there's a slight performance penalty that comes along with it.

No one has really tried to measure it, but I think there's a slight possibility that the idle data Nvidia drivers send is also consuming CPU resources and costing extra watts. Though the tradeoff on most PCs should be small (adding only a few watts for a few percent higher performance).
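
What's being described is essentially the difference between a busy-wait (spin polling) and a blocking wait. A minimal sketch of the two, assuming a generic job queue rather than anything from the actual drivers:

import queue

q = queue.Queue()

def blocking_worker():
    # Blocking wait: the thread sleeps in the kernel until work arrives,
    # so it shows ~0% CPU while idle (the black/unused bars in htop).
    while True:
        job = q.get()  # sleeps until an item is available
        if job is None:
            return     # sentinel value: shut down

def spinning_worker():
    # Busy-wait: the thread polls in a tight loop, pinning one core at
    # 100% even when there is nothing to do (the always-full red bars).
    while True:
        try:
            job = q.get_nowait()  # returns immediately, work or not
            if job is None:
                return
        except queue.Empty:
            pass  # nothing yet - spin and try again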
ProDigit
Joined: 8 Nov 19
Posts: 460
United States
Message 95833 - Posted: 14 Feb 2020, 6:31:35 UTC - in response to Message 95720.  

FUNCTIONALLY a VRM is a power supply, or even a component within a power supply.

My 2ct: a VRM is a Voltage Regulator Module.
It regulates the voltage, it doesn't convert it.
It takes the 12V DC in and makes sure the GPU core gets the voltage it needs.
This is different from a PSU, which not only converts AC to DC but also changes the voltage/amps; I'd also say it's safe to say that a PSU is larger than a VRM.
A VRM is just a feedback-loop controller that makes sure voltage and current draw stay within limits.
A VRM is a plain digital chip, while a PSU has capacitors and digital circuitry inside, and thus is more complex than a VRM.
I think most people would see a VRM as a controller rather than a generator of power (old-style PSUs were transformer-based, and thus generate voltage; the newer ones are based on digital circuitry, which makes them function much closer to a VRM).
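
The feedback-loop idea can be sketched in a few lines (a toy proportional controller, not a model of any real VRM):

# Toy feedback loop: nudge the measured voltage toward a setpoint.
setpoint = 1.0  # target core voltage in volts (illustrative value only)
v_out = 0.0     # measured output voltage
gain = 0.5      # proportional gain

for _ in range(20):
    error = setpoint - v_out  # how far off target we are
    v_out += gain * error     # correct a fraction of the error
print(f"settled at {v_out:.4f} V")  # converges toward the 1.0 V setpoint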

My 2 uneducated cents. I don't care if I'm right or wrong about this.
This is just what I (to this day) believe.
ProDigit
Joined: 8 Nov 19
Posts: 460
United States
Message 95834 - Posted: 14 Feb 2020, 6:34:46 UTC - in response to Message 95789.  

Compute performance is highly dependent on the application being used. You have to answer the question of compute for “what”?

The 2070 Super is overall faster and more power efficient than an RX 5700 XT.

For Einstein the 5700 XT will probably win, as their apps favor AMD cards. For SETI, since there are specialized applications for CUDA, the 2070S will embarrass the 5700 XT.

It’s very difficult to compare two different cards in compute scenarios, since the wildly different architectures require wildly different applications to run. You can’t really make a true apples to apples comparison.


Well, all I can do is go by what I've read on here (Einstein, which I use, is faster on AMD; SETI, which I don't use, is faster on Nvidia), plus some hard specs on websites. I use this: https://www.techpowerup.com/gpu-specs/ - click any card and look at "theoretical performance". They list the GFLOPS of every card ever made - very useful for finding the most FLOPS per $ you can get.


From reading the forum, one user said the optimal setting for an RX 5700 (XT), I believe, was 190W with an overclock.
The same performance came from an RTX 2070 at 137W (slower) and an RTX 2070 Super at 150W (faster than the 5700).
So while AMD may be faster, it also consumes more power.
The extra 40W works out to roughly $40 more on electricity per year.
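
That yearly figure roughly checks out for 24/7 crunching; note the electricity rate below is an assumed typical US rate, not something from the post:

# Rough yearly cost of an extra 40W running around the clock.
extra_watts = 40
kwh_per_year = extra_watts * 24 * 365 / 1000  # 350.4 kWh
rate = 0.115                                  # assumed $/kWh
print(f"~${kwh_per_year * rate:.0f} per year")  # ~$40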


Copyright © 2020 University of California. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.