PCI express risers to use multiple GPUs on one motherboard - not detecting card?

Message boards : GPUs : PCI express risers to use multiple GPUs on one motherboard - not detecting card?
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 4467
United Kingdom
Message 95751 - Posted: 8 Feb 2020, 9:25:02 UTC - in response to Message 95747.  

So the Nvidia uses a lot more CPU? Kinda points to the Nvidia GPU being rubbish.
It's a reflection on the software (programming) language chosen, not on the hardware.
ID: 95751
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95752 - Posted: 8 Feb 2020, 15:29:58 UTC - in response to Message 95751.  

So the Nvidia uses a lot more CPU? Kinda points to the Nvidia GPU being rubbish.
It's a reflection on the software (programming) language chosen, not on the hardware.


I thought Cuda was supposed to be better? If the app runs Cuda, it's Nvidia's fault for making Cuda rubbish. If the app runs with more CPU in OpenCL than an AMD, it's Nvidia's fault for making rubbish hardware.
ID: 95752
Dave

Joined: 28 Jun 10
Posts: 1296
United Kingdom
Message 95753 - Posted: 8 Feb 2020, 15:46:05 UTC - in response to Message 95752.  

So the Nvidia uses a lot more CPU? Kinda points to the Nvidia GPU being rubbish.
It's a reflection on the software (programming) language chosen, not on the hardware.


I thought Cuda was supposed to be better? If the app runs Cuda, it's Nvidia's fault for making Cuda rubbish. If the app runs with more CPU in OpenCL than an AMD, it's Nvidia's fault for making rubbish hardware.


Or crunchers' fault for using them for something they were not designed for?
ID: 95753
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 4467
United Kingdom
Message 95754 - Posted: 8 Feb 2020, 15:46:59 UTC - in response to Message 95752.  

Factor in the efficiency / proficiency of the programmer, too.
ID: 95754
Ian&Steve C.

Joined: 24 Dec 19
Posts: 156
United States
Message 95755 - Posted: 8 Feb 2020, 15:52:20 UTC - in response to Message 95752.  

It’s amazing that someone can come to such outrageous conclusions.

The hardware can only perform as well as the software commanding it. It’s down to the software programming, not the hardware.

I’m not sure why you’re talking about CUDA. The current Einstein apps are not coded with CUDA. They are OpenCL. But regardless of the platform used, the effectiveness of the application is down to the code written by the programmer, not the hardware.

On SETI for example. If you only compared the CUDA60 app to the SoG app (openCL), you’d think that CUDA is terrible, as the SoG app is much faster. Until you tried the CUDA90 app written by someone else which is 4x faster than the SoG app.
ID: 95755
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95756 - Posted: 8 Feb 2020, 15:55:45 UTC - in response to Message 95755.  

It’s amazing that someone can come to such outrageous conclusions.

The hardware can only perform as well as the software commanding it. It’s down to the software programming, not the hardware.

I’m not sure why you’re talking about CUDA. The current Einstein apps are not coded with CUDA. They are OpenCL. But regardless of the platform used, the effectiveness of the application is down to the code written by the programmer, not the hardware.

On SETI for example. If you only compared the CUDA60 app to the SoG app (openCL), you’d think that CUDA is terrible, as the SoG app is much faster. Until you tried the CUDA90 app written by someone else which is 4x faster than the SoG app.


If it's in Cuda, Nvidia wrote that. If it's OpenCL, how come it works better on AMD cards?
ID: 95756
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95757 - Posted: 8 Feb 2020, 15:56:18 UTC - in response to Message 95753.  

So the Nvidia uses a lot more CPU? Kinda points to the Nvidia GPU being rubbish.
It's a reflection on the software (programming) language chosen, not on the hardware.


I thought Cuda was supposed to be better? If the app runs Cuda, it's Nvidia's fault for making Cuda rubbish. If the app runs with more CPU in OpenCL than an AMD, it's Nvidia's fault for making rubbish hardware.


Or crunchers' fault for using them for something they were not designed for?


Cuda is designed for crunching.
ID: 95757
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 4467
United Kingdom
Message 95758 - Posted: 8 Feb 2020, 16:01:00 UTC - in response to Message 95757.  

Cuda is designed for crunching.
It still needs to be programmed - by a programmer.

Not all programmers are created equal. If the project's application is open source, you are invited to improve it (as the SETI apps mentioned by Ian&Steve C. have been improved by crunchers, with more or less success).
ID: 95758
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95759 - Posted: 8 Feb 2020, 16:08:56 UTC - in response to Message 95758.  
Last modified: 8 Feb 2020, 16:09:54 UTC

Cuda is designed for crunching.
It still needs to be programmed - by a programmer.

Not all programmers are created equal. If the project's application is open source, you are invited to improve it (as the SETI apps mentioned by Ian&Steve C. have been improved by crunchers, with more or less success).


I thought Nvidia actively helped make Cuda apps for the likes of Seti?

Anyway, if the OpenCL version works better, that probably means it was easier to program in.

No matter what the reason, I know which cards I'll be buying. They're more expensive for the same abilities with games too. And they're more likely to overheat.
ID: 95759
Ian&Steve C.

Joined: 24 Dec 19
Posts: 156
United States
Message 95760 - Posted: 8 Feb 2020, 16:14:23 UTC - in response to Message 95756.  

It’s amazing that someone can come to such outrageous conclusions.


If it's in Cuda, Nvidia wrote that. If it's OpenCL, how come it works better on AMD cards?


No. There are CUDA developers who do not work for Nvidia. The CUDA90 special application at SETI was NOT written by Nvidia; it was written by a very talented volunteer from Finland.

Further, the OpenCL app, at least at SETI, doesn’t really give a preference to AMD cards; it scales pretty evenly with the relative performance of the hardware.

The difference at Einstein is down to the programming. Perhaps their developers simply have more experience writing applications for AMD hardware than for Nvidia.
ID: 95760
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 4467
United Kingdom
Message 95761 - Posted: 8 Feb 2020, 16:39:52 UTC - in response to Message 95759.  

I thought Nvidia actively helped make Cuda apps for the likes of Seti?
Yes, they did. They wrote the application announced and released on 18 Dec 2008. Which was poorly tested and buggy, causing BSODs on VLAR tasks. After testing and revision, I ran the first successful VLAR task on 15 Jan 2009. After that, NVidia supplied a slightly revised version compatible with Fermi GPUs in early 2010, but after that - nothing. Note that ATI/AMD/Intel never supplied SETI applications - all that porting was contributed by SETI volunteers.

Anyway, if the OpenCL version works better, that probably means it was easier to program in.
I wouldn't say it works 'better', but it is easier for the programmers because the same program code (with minor changes) can be used on all three platforms. OpenCL is pretty much obligatory for Intel GPUs, and has been for ATI/AMD since they dropped support for their proprietary CAL platform. OpenCL is supposed to be a common, unified language available for all platforms, but that doesn't mean it currently has a competitive, qualitative advantage over the closed, expensive platforms.
ID: 95761
Ian&Steve C.

Joined: 24 Dec 19
Posts: 156
United States
Message 95762 - Posted: 8 Feb 2020, 17:11:36 UTC - in response to Message 95759.  

No matter what the reason, I know which cards I'll be buying. They're more expensive for the same abilities with games too. And they're more likely to overheat.


Confirmation bias and a bit of fanboy-ism. Nice.

The overheating comment is quite hilarious, as AMD is the one known for overheating. See the Vega 64, Radeon VII, the first iteration of the 5700 XT, and nearly all of their older high-performance cards. Thermal performance is more down to how well the cooling solution is designed, and most of that lands on the shoulders of the AIB partner that designed the card, not AMD or Nvidia themselves. In recent years AMD has been the one producing cards that consume more power and generate more heat than the Nvidia cards at the same performance level. And AMD can't even compete at the very high end: they have nothing to match an RTX 2080 or higher in games performance; the 5700 XT is about on par with an RTX 2070, at the mid-range.
ID: 95762
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95763 - Posted: 8 Feb 2020, 18:29:23 UTC - in response to Message 95762.  
Last modified: 8 Feb 2020, 18:32:09 UTC

No matter what the reason, I know which cards I'll be buying. They're more expensive for the same abilities with games too. And they're more likely to overheat.


Confirmation bias and a bit of fanboy-ism. Nice.


No, just choosing one of the competitors based on my own experience and what I've read here. Same as I choose Tesco over Asda because they don't tend to have everything out of stock.

the overheating comment is quite hilarious, as AMD is the one known for overheating. See Vega64, Radeon 7, first iteration of 5700XT, and nearly all of their older high performance cards. thermal performance is more down to how well the cooling solution is designed. and most of that lands on the shoulders of the AIB that designed it, not AMD or Nvidia themselves. In recent years AMD has been the ones producing cards that consume more power and generate more heat than the Nvidia cards at the same performance level. As well as AMDs inability to even compete at the very high level (they have nothing to compete with a RTX 2080 or higher in games performance, 5700XT is about on par with an RTX 2070, at the mid-range)


I'm just going by the cards I've used. Yes, they both get very hot, but AMD cards have always throttled to protect themselves. I had a Nvidia card run up to about 110C, make a terrible smell, then never work again.

And every time I've looked at various cards for gaming, I find the AMDs are cheaper for the same performance. I've built over 100 gaming machines to customer specs, and the AMD cards always end up the better option.
ID: 95763
Ian&Steve C.

Joined: 24 Dec 19
Posts: 156
United States
Message 95764 - Posted: 8 Feb 2020, 21:29:31 UTC - in response to Message 95763.  

Which AMD card is cheaper and provides the same gaming experience as the RTX 2080? 2080 Super? 2080ti?

Not sure why you think Nvidia cards don’t throttle. They absolutely do. Max clocks are binned based on temperature and the power limits are strictly enforced. Nvidia does a better job at this than AMD, and has for a long time.
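
Roughly speaking, temperature-binned clocks plus an enforced power cap work like the toy sketch below. The bin boundaries, clock speeds, and the power model are invented for illustration only; they are not values from any real Nvidia or AMD firmware.

```python
# Toy model of temperature-binned boost clocks with a hard power cap.
# All numbers here are made up for illustration, not real firmware values.

def allowed_clock_mhz(temp_c: float) -> int:
    """Return the maximum boost clock permitted at a given core temperature."""
    bins = [
        (60, 1900),   # below 60C: full boost
        (70, 1850),
        (80, 1750),
        (90, 1500),
    ]
    for limit, clock in bins:
        if temp_c < limit:
            return clock
    return 1100       # at/above the thermal limit: heavy throttle, not shutdown

def clamp_to_power_limit(clock_mhz: int, watts_per_mhz: float,
                         power_cap_w: float) -> int:
    """Reduce the clock until the estimated draw fits under the power cap."""
    if clock_mhz * watts_per_mhz <= power_cap_w:
        return clock_mhz
    return int(power_cap_w / watts_per_mhz)

# A cool card keeps full boost; a hot card gets binned down;
# the power cap can pull the clock down even further.
print(allowed_clock_mhz(55))                     # 1900
print(allowed_clock_mhz(85))                     # 1500
print(clamp_to_power_limit(1900, 0.12, 200.0))   # 1666
```

The point is that the clock a card actually runs at is the lower of the thermal bin and the power-cap clock; both vendors enforce something like this, just with different bins and caps.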
ID: 95764
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95765 - Posted: 8 Feb 2020, 21:35:09 UTC - in response to Message 95764.  

Which AMD card is cheaper and provides the same gaming experience as the RTX 2080? 2080 Super? 2080ti?


No idea, never compared those. All I know is every time I specced a card for a machine, AMD was better value.

Not sure why you think Nvidia doesn’t throttle themselves. They absolutely do. Max clocks are binned based on temperature and the power limits are strictly enforced. Nvidia does a better job at this than AMD does and has for a long time.


The one I had did not throttle, it melted. And I'd say no card throttles sensibly. I always set the fans on CPU and GPU to limit it to about 70C. The default seems to be 80 or 90! Way too close to the point at which damage occurs.
ID: 95765
Ian&Steve C.

Joined: 24 Dec 19
Posts: 156
United States
Message 95766 - Posted: 8 Feb 2020, 22:09:59 UTC - in response to Message 95765.  
Last modified: 8 Feb 2020, 22:14:04 UTC

Sounds like your experience is way out of date. What was the last Nvidia card you had? An 8800GT? If a card was running at 90C it was because of poor maintenance, poor airflow, or both. And the card would be heavily throttled on core clocks.

Nvidia cards are more power efficient and have better power control than the AMD cards, and it’s been that way for a long time. Only with AMD’s newest 7nm cards are they coming close to the power efficiency of Nvidia’s 12nm cards...

Stock temp limit on most GPUs these days is usually somewhere around 85C. Any user can set custom limits or fan control to keep the card cooler. That’s true for both Nvidia and AMD. And as long as you have adequate airflow it’s unlikely to even reach that point.
ID: 95766
Peter Hucker

Joined: 6 Oct 06
Posts: 1125
United Kingdom
Message 95767 - Posted: 8 Feb 2020, 22:18:24 UTC - in response to Message 95766.  

Sounds like your experience is way out of date. What was the last Nvidia card you had? 8800GT? If a card was running 90C it was because of poor maintenance or poor airflow or both. And the card would be heavily throttled on core clocks.


Nvidia selling me a card which broke in under a week is not acceptable. They can't make up for that.

Nvidia cards are more power efficient and have better power control than the AMD cards and it’s been that way for a long time. Only with AMDs newest 7nm cards are they coming close to the power efficiency of Nvidia’s 12nm cards...


I guess we need Nvidia to keep AMD on its toes. Just like we need AMD to keep Intel on its toes. Intel for CPUs, AMD for GPUs, that's the way I'll always be.

Stock temp limit on most GPUs these days is Usually somewhere around 85C.


Way too hot!
ID: 95767
Ian&Steve C.

Joined: 24 Dec 19
Posts: 156
United States
Message 95768 - Posted: 9 Feb 2020, 1:54:37 UTC - in response to Message 95767.  

Way too hot!


Maybe someone should talk to AMD about that. Looks like AMD is doing all the things you seem to hate Nvidia for: overheating, inadequate fan control, not throttling to keep temps under control.

https://youtu.be/V1SzOzRwbxg
ID: 95768
robsmith
Volunteer tester
Help desk expert

Joined: 25 May 09
Posts: 958
United Kingdom
Message 95771 - Posted: 9 Feb 2020, 8:26:07 UTC

Nvidia selling me a card which broke in under a week is not acceptable. They can't make up for that


Was it a GPU that was manufactured by nVidia, or one manufactured by a third party using an nVidia chipset? Since the vast majority of the GPUs we use are actually manufactured by a third party, with the chipset supplied by either nVidia or AMD, I would hazard a guess that your "nVidia" GPU was actually packaged and manufactured by a third party (Asus, MSI etc.). Thus you should be blaming that third-party manufacturer for your woes - they got the package wrong.
ID: 95771
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 4467
United Kingdom
Message 95772 - Posted: 9 Feb 2020, 9:03:01 UTC - in response to Message 95771.  

Nvidia selling me a card which broke in under a week is not acceptable. They can't make up for that
Was it a GPU that was manufactured by nVidia, or one manufactured by a third party using an nVidia chipset? Since the vast majority of the GPUs we use are actually manufactured by a third party, with the chipset supplied by either nVidia or AMD, I would hazard a guess that your "nVidia" GPU was actually packaged and manufactured by a third party (Asus, MSI etc.). Thus you should be blaming that third-party manufacturer for your woes - they got the package wrong.
And as Ian&Steve C. said, it would be helpful to know when this anecdote dates from. I think the mainstream manufacturers have put a lot more effort into designing their cooling systems, once the problems with overheating became known.

200th. post in this thread. That's a lot said about PCIe risers!
ID: 95772

Copyright © 2021 University of California. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.