Friday, November 9th 2018
XFX Radeon RX 590 Fatboy Smiles For The Camera
The flood of leaked AMD Radeon RX 590 graphics cards continues, with the latest coming from XFX. Sporting a new naming scheme, the XFX Radeon RX 590 Fatboy is very similar to the RX 580 GTS series. It features the same dual-fan cooler used on the RX 580 GTS, which takes up roughly 2.5 slots. Even the backplate remains the same with no changes to speak of, meaning that side by side you wouldn't be able to tell the difference unless you look at power delivery, which is where the designs diverge: the RX 590 Fatboy utilizes an 8+6 pin design, compared to the single 8-pin design of the RX 580 GTS series. Display outputs remain the same between the two, with three DisplayPorts, one HDMI port, and one DVI-I port being standard.
When it comes to clock speeds, the XFX RX 590 Fatboy OC+, at least according to Videocardz, will come with a 1600 MHz boost clock. That is an increase of roughly 200 MHz over XFX's highest-clocked RX 580. With such a high boost clock, the additional 6-pin power connector is likely included for improved power delivery and, depending on silicon luck, may allow for more overclocking headroom. Considering no vendor releases just one version of a graphics card, it is likely that a few more variants will be available at launch. Sadly, no pricing information is available as of yet.
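For the curious, the significance of that extra connector is easy to put in numbers. A quick back-of-envelope sketch, assuming the standard PCI Express power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin connector), shows the in-spec headroom the 8+6 pin layout buys over the single 8-pin RX 580 GTS:

```python
# Back-of-envelope PCIe power budgets, assuming the standard spec limits:
# 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin connector.
PCIE_SLOT_W = 75
PIN6_W = 75
PIN8_W = 150

def board_power_budget(num_6pin: int, num_8pin: int) -> int:
    """Maximum in-spec board power for a given connector layout."""
    return PCIE_SLOT_W + num_6pin * PIN6_W + num_8pin * PIN8_W

print(board_power_budget(0, 1))  # RX 580 GTS, single 8-pin: 225 W
print(board_power_budget(1, 1))  # RX 590 Fatboy, 8+6 pin:   300 W
```

That extra 75 W of in-spec budget is what makes the overclocking-headroom argument plausible.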
Source:
Videocardz
60 Comments on XFX Radeon RX 590 Fatboy Smiles For The Camera
It's cool for people who don't just game and are able to use their hardware however they want. But we rely so much on APIs to do anything at all, and adoption of them is slow as hell.
And
What kind of double-standard bullshit is that? Why should I pay for PhysX and tensor cores that only 5% of games use?
Tough shit; that stuff is not just made for your purposes or mine.
Now both sections of users do get a product from Nvidia that is better tailored to them.
They also iterated the heck out of the GTX 10 series without major pillorying from the community, and basically just added better memory each phase.
But look at them prices, everyone is now missing a ringpiece.
And I agree that Nvidia has provided something decent in the past few years. Hell, this 1070 OC'ed to 2.2GHz runs everything butter smooth and punches above its weight. I still don't like that there's no hardware DX12 support, but within 6-12 months I will probably upgrade, and even then I'm using Vulkan wherever possible anyway.
If Vega 20 or Navi impresses me, I am jumping ship again. I want that FreeSync, dammit.
I wouldn't buy one of these with my money, but for a new build for someone, maybe.
And I am the same too, certainly not perfect.
Oh, and since we're calling a spade a spade, take a gander down in the red camp, since they are the ones that bloat their arches with crap nobody uses for the next 5 years, hence why the compute part of their GPUs didn't go away; but that doesn't stop them from smacking a sticker on the box that says "GAMING". Both sides are doing questionable stuff, get over it. In more recent news, AMD boasted that their MI60 runs better than a Tesla, yet when they benchmarked it they disabled the tensor cores on the Tesla, and once enabled, said Tesla wiped the floor with the MI60. Nice, ain't it?
Go beyond that, PCI Express logo for you ;) Actually, even the power-hungry Vega can stay within the 275 W the PCIe spec defines. The 8+8 pin designs will go above that, but there are custom Nvidia designs that do that, too.
Sure, AMD has been trailing in power efficiency till every aficionado convinced themselves power draw is not to be taken seriously when looking at a video card, but let's stick to the facts.
And also, AMD/RTG is working with literally a fraction of Nvidia's R&D budget, and AMD also makes (highly competitive) CPUs. GCN is tooled for high compute throughput, and that is where it shines. IMO AMD made a bet that game engines would scale more with compute instead of geometry, etc. Also, GCN is having utilisation issues in all current games (even games that run really well, like Wolfenstein 2): the shader arrays are sometimes being used less than 50% of the time in a frame. All you have to do is download the GPUOpen tool 'Radeon GPU Profiler' and run it on Vega to see that the 'wavefront occupancy' is extremely low. You've got all that number-crunching power, but you can't use it to make more FPS because other parts of the pipeline are bottlenecking it (raster/geometry, I think). Also, I heard something about workload distribution being an issue. Either way, AMD has had 8 years to learn the issues with GCN and fix them. Limitations in the effective scalability of the current GCN arch prevent them from really fixing this... Anyway, I hope this 'Arcturus' architecture in 2020 puts AMD back in the high-end graphics game. (Even though Vega 10-based cards can still compete with the RTX 2070, which is 'high-end' in my book.)
Oops, I got off-topic. Anyway, TL;DR for the last bit: IMO, GCN is a parallel compute architecture with graphics bits tacked on, lol, despite the name.
edit: fixing grammarz
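To make the occupancy point concrete, here's a rough sketch of what 'wavefront occupancy' actually measures: the fraction of the GPU's wavefront slots that are busy over a frame. The SIMD and wave-slot counts below are ballpark Vega figures; the cycle counts are made up for illustration, not real Radeon GPU Profiler output.

```python
# Rough illustration of "wavefront occupancy": busy wavefront-cycles as a
# fraction of the theoretical maximum. Numbers are illustrative, not real
# Radeon GPU Profiler output.
def wavefront_occupancy(busy_wave_cycles: int, num_simds: int,
                        waves_per_simd: int, frame_cycles: int) -> float:
    # Total wave slots available across the whole frame.
    capacity = num_simds * waves_per_simd * frame_cycles
    return busy_wave_cycles / capacity

# Ballpark Vega 64 figures: 64 CUs x 4 SIMDs = 256 SIMDs, 10 wave slots each.
# If raster/geometry starves the shader array, occupancy tanks even though
# the raw compute throughput is huge.
print(f"{wavefront_occupancy(1_100_000_000, 256, 10, 1_000_000):.0%}")  # ~43%
```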
I hope the power envelope (= TDP) is equal to the RX 580's at 180 W. Then it's actually not a bad performer at all. If the card has tighter timings on the RAM or can be tweaked some more, then it might come very close to the 1070/Ti or so.
Did they screw up with Bulldozer? Yes. Did they under-deliver with Vega? Yes. Is Polaris bland? Yes. Does it compete with the 1060? Yes. The fact that a Vega 64 basically matches the 2070 is quite impressive in my eyes. My assumption is that it will fare just as well with RTRT as well.
See my posts after that; I happily credit Nvidia accordingly with more gamer-focused designs, but they make you pay for that.
As for your Maxwell drivel, very funny. Maxwell was the last Nvidia arch that could do 64-bit right, hence it still appears powerful to some. Meanwhile, in the real world, I swapped a 660 Ti for an RX 480 last night and the guy is ecstatic at 165 FPS on Black Ops 4 @ 1900x1200, and his 660 is worth nought.
On an i7 2600K over PCIe x1, no less.
I am so torn between upgrading to this or a Vega. I feel like the Vega will be the better investment, but until the 7nm Vega comes out, they are too pricey for me.
Some AMD cards violated (and still violate) the PCI-E power specifications. This is a fact.
If we stick to averages only, that is 4 W more than allowed over the PCI-E 6-pin connection and 7 W more than allowed over the PCI-E motherboard slot connection. PCI-SIG was about to send an official statement when AMD released new drivers that allowed switching the card to "nerf mode" (AMD called it "Compatibility Mode"). A graceless and shameless move.
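Taking the post's averages at face value, a quick sanity check against the usual per-rail limits (75 W from the slot, 75 W from a 6-pin connector) looks like this; the "measured" values are simply back-calculated from the "4 W over" and "7 W over" deltas claimed above:

```python
# Sanity-check the claimed overdraw against the standard PCIe per-rail
# limits. The "measured" averages are back-calculated from the post's
# "4 W over the 6-pin" and "7 W over the slot" figures, not real data.
LIMITS_W = {"slot": 75, "6-pin": 75}
MEASURED_W = {"slot": 82, "6-pin": 79}

for rail, limit in LIMITS_W.items():
    draw = MEASURED_W[rail]
    over = draw - limit
    verdict = f"{over} W over spec" if over > 0 else "within spec"
    print(f"{rail}: {draw} W average ({verdict})")
```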
I guess they sat this one out. Come to think of it, I haven't seen many of their cards in UK retail anyway.
And as he said, Nvidia have been caught doing the same; they just didn't fix it.
I used a 480 through that time, two in CrossFire folding 24/7, no Compatibility Mode, no issues, 'cos I'm no drama queen.
Or maybe I just lost my sense for the "FASHION" :roll: