
GIGABYTE Outs RX 5600 XT Gaming OC VBIOS Update and Easy Updater

CUDA, like most of Nvidia's tech, is closed and designed to force you onto their hardware. CUDA cores suck compared to AMD's compute units, but NV tries their hardest to kill or stall OpenCL adoption: because the only company that can make CUDA cores is NV, they cripple OpenCL on their hardware and make CUDA look so much better.

CUDA is open-source and ties into existing OpenCL code quite well. Even by itself, it is still more fleshed out than OpenCL. Most machine learning libraries that have a CUDA backend can still run the code directly on an x86 CPU or any other GPGPU without major issues.
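
For concreteness, here is a minimal sketch of that fallback pattern, assuming PyTorch as one such library (the toy model and tensor shapes are made up for illustration):

[CODE]
import torch

# Pick the GPU when one is available, otherwise fall back to the CPU;
# the rest of the code is identical either way.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(128, 10).to(device)   # toy model, illustrative only
batch = torch.randn(32, 128, device=device)   # toy input batch

logits = model(batch)                         # runs on GPU or CPU transparently
print(logits.device)
[/CODE]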

AMD doesn't care about the machine learning market as much as NVIDIA does. If they did, they would have continued with GCN (the Vega 56, 64 and Radeon VII are great cards for the enthusiast researcher) instead of stepping backwards with RDNA (1.0, at least; they say 2.0 will be more robust for developers).

ROCm (which has its own port of Google's TensorFlow) is thankfully still alive, but it is still primitive compared to NVIDIA's cuDNN. MIOpen (AMD's deep learning math library) works extremely well on NVIDIA's own tensor cores (which is not surprising, since it's HIP-compatible).
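
HIP is deliberately close to CUDA, which is why code written against one maps onto the other. As a hedged sketch of what that looks like from the user side (assuming a reasonably recent PyTorch): the ROCm builds of PyTorch reuse the same "cuda" device API, and you can tell the builds apart like this:

[CODE]
import torch

def describe_backend() -> str:
    """Report which GPU stack this PyTorch build was compiled against."""
    if getattr(torch.version, "hip", None):   # ROCm/HIP build (AMD GPUs)
        return f"ROCm/HIP {torch.version.hip}"
    if torch.version.cuda:                    # CUDA build (NVIDIA GPUs)
        return f"CUDA {torch.version.cuda}"
    return "CPU-only build"

# On a ROCm build, AMD GPUs still show up under torch.cuda,
# so existing CUDA-style code keeps working unchanged.
print(describe_backend(), "| GPU visible:", torch.cuda.is_available())
[/CODE]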

NVIDIA's CUDA cores play the same role as AMD's GCN/RDNA shader processors; it all depends on the libraries used. It just so happens that NVIDIA's implementation is the most compatible at the moment.
 
CUDA, like most of Nvidia's tech, is closed and designed to force you onto their hardware. CUDA cores suck compared to AMD's compute units, but NV tries their hardest to kill or stall OpenCL adoption: because the only company that can make CUDA cores is NV, they cripple OpenCL on their hardware and make CUDA look so much better.

Stuffing black-box code into games that devs aren't even allowed to look at, code that suspiciously enhances performance for one side but causes problems for the other.
That post showed that you really don't know what you're talking about.
CUDA is open-source. Half of your absurdly long post swirls around some "black box" nonsense.
And BTW: black boxes are perfectly fine. Don't be afraid of them. Abstraction is what let programming evolve so quickly (and become so easy).

Your main problem, albeit one shared with many AMD fans on this forum, is a lack of understanding of what these companies actually make.
You seem to think Nvidia's product is the GPU. That the software layer should just be a free and fully hardware-agnostic contribution to humanity. Cute.

In the real world what Nvidia makes is a solution. That solution consists of a GPU and a software layer that lets you use it as efficiently as possible. You buy a card - you pay for both.
No one is forced to write using CUDA. Just like no one is forced to use AMD's in-house libraries (yes, AMD does the same thing - it's just that no one cares :) ).

The reason for CUDA's popularity is very simple. There was a time when almost no one outside of the gaming business knew how to program GPUs. We knew they were good for some problems, but coding was so difficult that learning it made very little sense for engineers and scientists.

My first contact with CUDA was at a condensed matter physics seminar in 2008. The lecturer (late 30s) started with something like: "Hey, there's this great new thing called CUDA and we can compute stuff on GPUs. It's super easy." And the audience (mostly late 50s and over) said: "GPUs?"
By the time I was leaving university in 2012, CUDA was in the syllabus of the first-year programming course.
It was a great idea by Nvidia and it just paid off.

Compared to what we had in 2007 (OpenGL), it was like Python vs C++ in usability, but with no performance loss.
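
To give a feel for how low the bar eventually became (a sketch, not from the original post; it assumes an NVIDIA GPU plus the numba and numpy packages), a whole SAXPY kernel today fits in a few lines of Python:

[CODE]
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)              # global thread index
    if i < out.size:              # guard against out-of-range threads
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

d_x, d_y = cuda.to_device(x), cuda.to_device(y)   # host -> device copies
d_out = cuda.device_array_like(x)

threads = 256
blocks = (n + threads - 1) // threads
saxpy[blocks, threads](np.float32(2.0), d_x, d_y, d_out)

out = d_out.copy_to_host()        # device -> host copy
[/CODE]

The 2007-era equivalent meant expressing the same arithmetic as a fragment shader and round-tripping the data through OpenGL textures.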

As for OpenCL - it's not a bad initiative, but it is losing importance and this won't change.
Nvidia has CUDA. Intel focuses on OneAPI. Apple left it for Metal.
OpenCL simply isn't needed. And, frankly, it's not that great with the latest AMD offerings.
 
Why don't Sapphire, XFX, PowerColor and ASRock make Nvidia GPUs? That kind of logic flows both ways, yet somehow you focus only on one side of the coin. I would be impressed if you came out and accused AMD of this!

XFX used to make Nvidia-based GPUs, and then also started making AMD-based GPUs when AMD released the HD 4000 series. Nvidia punished XFX by not supplying the newer Fermi chips for the GTX 400 series; in retaliation, XFX dropped the Nvidia product line completely and only made the HD 5000 series.

This is just one of the AIB stories that made it onto the internet. Nvidia has been pressuring smaller AIBs to make only Nvidia-based GPUs, not both.
ASUS, Gigabyte and MSI have the leeway to make cards for both sides because they have leverage as major AIBs.

Do your research first, before you post.
 
XFX used to make Nvidia-based GPUs, and then also started making AMD-based GPUs when AMD released the HD 4000 series. Nvidia punished XFX by not supplying the newer Fermi chips for the GTX 400 series; in retaliation, XFX dropped the Nvidia product line completely and only made the HD 5000 series.
This is just one of the AIB stories that made it onto the internet. Nvidia has been pressuring smaller AIBs to make only Nvidia-based GPUs, not both.

You see, this is what I don't like about some people: they take something that really happened and invent a story full of BS that suits their agenda!

We are all aware of the story between Nvidia and XFX. That still doesn't explain why XFX doesn't make Nvidia cards today. Oh yeah, I guess you must be one of those who confuse multi-million-dollar businesses with children, so because those children argued in the past they shall never talk to each other again, right? Yeah, right...

There are three other companies on that list whose AMD exclusivity you fail to explain. Not only that, but you are ready to blame Nvidia for this when, in the same breath, you will not hesitate a second to blame Nvidia for EVGA not making AMD GPUs. Indeed, DOUBLE STANDARDS FTW... this is where you people lose all credibility!

ASUS, Gigabyte and MSI have the leeway to make cards for both sides because they have leverage as major AIBs.

Yeah, sure, either that or this is another made-up BS argument that suits your narrative... and here comes AFOX, which makes both Nvidia and AMD cards, but I'm sure AFOX, being the small brand they are, has a ton more leverage than ASRock, PowerColor and Sapphire, right? Right?

Do your research first, before you post.

Dude, there is a difference between doing some research and making up BS to suit your agenda; besides, this coming from you is the icing on the cake!
 
Why don't Sapphire, XFX, PowerColor and ASRock make Nvidia GPUs? That kind of logic flows both ways, yet somehow you focus only on one side of the coin. I would be impressed if you came out and accused AMD of this!

Some of those companies worked with Nvidia in the past, and the bad taste from dealing with them made them go the ATI/AMD way. You should really read an objective history of Nvidia before thinking that I am promoting one over the other.
 
Some of those companies worked with Nvidia in the past, and the bad taste from dealing with them made them go the ATI/AMD way. You should really read an objective history of Nvidia before thinking that I am promoting one over the other.
But how do you know the companies that are Nvidia-only haven't had bad experiences with ATI/AMD? :)
 