Wednesday, January 22nd 2020

GIGABYTE Outs RX 5600 XT Gaming OC VBIOS Update and Easy Updater

GIGABYTE late Tuesday released a video BIOS update for its Radeon RX 5600 XT Gaming OC graphics card (model: GV-R56XTGAMING OC-6GD). The new FA0 video BIOS increases the card's TGP to a massive 180 W (up from the original 150 W spec), raises gaming clocks to 1670 MHz and boost clocks to 1750 MHz, and bumps memory speed to 14 Gbps (GDDR6-effective), up from 12 Gbps. GIGABYTE's BIOS update package for the RX 5600 XT Gaming OC includes a Windows-based GUI updater that's a lot easier to use than manually flashing the BIOS with ATIWinFlash. Grab the update package from the link below.

DOWNLOAD: GIGABYTE RX 5600 XT Gaming OC New FA0 BIOS Update

30 Comments on GIGABYTE Outs RX 5600 XT Gaming OC VBIOS Update and Easy Updater

#26
notb
gamefoo21CUDA, like most of Nvidia's tech, is closed and designed to force you onto their hardware. CUDA cores suck compared to AMD's compute units, but NV tries their hardest to kill or stall OpenCL adoption: the only company that can make CUDA cores is NV, so they cripple OpenCL on their hardware and make CUDA look so much better.

They stuff black-box code into games that devs aren't even allowed to look at, which suspiciously enhances performance for one side but causes problems for the other.
That post showed that you really don't know what you're talking about.
CUDA is open-source. Half of your absurdly long post swirls around some "black box" nonsense.
And BTW: black boxes are perfectly fine. Don't be afraid of them. Abstraction is what let programming evolve so quickly (and become so easy).

Your main problem, albeit one shared with many AMD fans on this forum, is a lack of understanding of what these companies actually make.
You seem to think Nvidia's product is the GPU, and that the software layer should just be a free and fully hardware-agnostic contribution to humanity. Cute.

In the real world what Nvidia makes is a solution. That solution consists of a GPU and a software layer that lets you use it as efficiently as possible. You buy a card - you pay for both.
No one is forced to write using CUDA. Just like no one is forced to use AMD's in-house libraries (yes, AMD does the same thing - it's just that no one cares :) ).

The reason for CUDA's popularity is very simple. There was a time when almost no one outside the gaming business knew how to program GPUs. We knew they were good for some problems, but coding was so difficult that learning it made very little sense for engineers and scientists.

My first contact with CUDA was at a condensed matter physics seminar in 2008. The lecturer (late 30s) started with something like: "Hey, there's this great new thing called CUDA and we can compute stuff on GPUs. It's super easy." And the audience (mostly late 50s and over) said: "GPUs?"
When I was leaving university in 2012, CUDA was in the syllabus of the first-year programming course.
It was a great idea by Nvidia and it just paid off.

Compared to what we had in 2007 (OpenGL), it was like Python vs C++ in usability, but with no performance loss.
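
To give an idea, here is a minimal, hypothetical sketch (names and dimensions are my own, not from any real project) of what a complete CUDA kernel for adding two arrays looks like; before CUDA, getting the same result meant encoding your data into textures and writing shaders:

__global__ void add(const float *a, const float *b, float *c, int n)
{
    // one thread per element
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

// Host side (error checks omitted): cudaMalloc and cudaMemcpy the three buffers,
// launch with add<<<(n + 255) / 256, 256>>>(d_a, d_b, d_c, n),
// then copy the result back with cudaMemcpy and cudaFree everything.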

As for OpenCL: it's not a bad initiative, but it's losing importance and this won't change.
Nvidia has CUDA. Intel focuses on oneAPI. Apple left it for Metal.
OpenCL simply isn't needed. And, frankly, it's not that great with the latest AMD offerings.
#27
chaosmassive
RH92Why don't Sapphire, XFX, PowerColor, and ASRock make Nvidia GPUs? That kind of logic flows both ways, yet somehow you focus only on one side of the coin. I would be impressed if you came up and accused AMD of this!
XFX used to make Nvidia-based GPUs, then also started making AMD-based GPUs when AMD released the HD 4000 series. Nvidia then punished XFX by not supplying the newer Fermi chips for the GTX 400 series. In retaliation, XFX dropped the Nvidia product line completely and only made HD 5000 series cards.

this is just one of the AIB stories that made it onto the internet. Nvidia has been pressuring smaller AIBs to make only Nvidia-based GPUs, not both.
ASUS, Gigabyte, and MSI have leeway to make cards from both sides because, as major AIBs, they have leverage.

Do your research first before posting.
#28
RH92
chaosmassiveXFX used to make Nvidia-based GPUs, then also started making AMD-based GPUs when AMD released the HD 4000 series. Nvidia then punished XFX by not supplying the newer Fermi chips for the GTX 400 series. In retaliation, XFX dropped the Nvidia product line completely and only made HD 5000 series cards.
this is just one of the AIB stories that made it onto the internet. Nvidia has been pressuring smaller AIBs to make only Nvidia-based GPUs, not both.
You see, this is what I don't like about some people: they take something that really happened and invent a story full of BS that suits their agenda!

We are all aware of the story between Nvidia and XFX, but that still doesn't explain why XFX doesn't make Nvidia cards today. Oh yeah, I guess you must be one of those who confuse multi-million-dollar businesses with children, so because those children argued in the past they shall never talk to each other again, right? Yeah right.........

There are three other companies on that list, and you fail to explain why they make AMD GPUs exclusively. Not only that, but you are ready to blame Nvidia for this when, in the same breath, you won't hesitate a second to blame Nvidia for EVGA not making AMD GPUs. Indeed, DOUBLE STANDARDS FTW.... this is where you people lose all credibility!
chaosmassiveASUS, Gigabyte, and MSI have leeway to make cards from both sides because, as major AIBs, they have leverage.
Yeah sure, either that or this is another made-up BS argument that suits your narrative.... And here comes AFOX, which makes both Nvidia and AMD cards, but I'm sure AFOX, being the small brand they are, has a ton more leverage than ASRock, PowerColor, and Sapphire, right? Right?
chaosmassiveDo your research first before posting.
Dude, there is a difference between doing some research and making up BS to suit your agenda; besides, this coming from you is the icing on the cake!
#29
kapone32
RH92Why don't Sapphire, XFX, PowerColor, and ASRock make Nvidia GPUs? That kind of logic flows both ways, yet somehow you focus only on one side of the coin. I would be impressed if you came up and accused AMD of this!

Some of those companies worked with Nvidia in the past and the bad taste from dealing with them made them go the ATI/AMD way. You should really read the objective history of Nvidia before thinking that I am promoting one over the other.
#30
notb
kapone32Some of those companies worked with Nvidia in the past and the bad taste from dealing with them made them go the ATI/AMD way. You should really read the objective history of Nvidia before thinking that I am promoting one over the other.
But how do you know that the companies that are Nvidia-only haven't had bad experiences with ATI/AMD? :)