Friday, November 9th 2018

XFX Radeon RX 590 Fatboy Smiles For The Camera

The flood of leaked AMD Radeon RX 590 graphics cards continues, with the latest one coming from XFX. Sporting a new naming scheme, the XFX Radeon RX 590 Fatboy is very similar to the RX 580 GTS series. It features the same dual-fan cooler used on the RX 580 GTS, which takes up roughly 2.5 slots. Even the backplate remains unchanged, meaning that side by side you wouldn't be able to tell the two apart unless you looked at power delivery, which is where the designs diverge: the RX 590 Fatboy uses an 8+6 pin configuration, compared to the single 8-pin of the RX 580 GTS series. Display outputs remain the same between the two, with three DisplayPorts, one HDMI port, and one DVI-I port being standard.

When it comes to clock speeds, the XFX RX 590 Fatboy OC+, at least according to Videocardz, will come with a 1600 MHz boost clock. That is an increase of roughly 200 MHz over XFX's highest-clocked RX 580. With such a high boost clock, the additional 6-pin power connector is likely included for improved power delivery and, depending on silicon luck, may allow for more overclocking headroom. Considering no vendor releases just one version of a graphics card, it is likely that a few more variants will be available at launch. Sadly, no pricing information is available as of yet.
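For a rough sense of what the extra 6-pin buys, here is a minimal sketch (not from the article) that totals the standard PCIe power allowances - 75 W from the slot, 75 W per 6-pin and 150 W per 8-pin connector - for the two layouts, and works out the clock uplift assuming roughly 1400 MHz for XFX's highest-clocked RX 580 (an assumption; the article only pegs the gap at about 200 MHz):

```python
# Rough sketch (not from the article): the power budget implied by the
# connector change, using the standard PCIe allowances of 75 W from the
# slot, 75 W per 6-pin and 150 W per 8-pin connector.
PCIE_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(aux_connectors):
    """Slot allowance plus the allowance of each auxiliary connector."""
    return PCIE_LIMITS_W["slot"] + sum(PCIE_LIMITS_W[c] for c in aux_connectors)

rx580_gts = board_power_budget(["8-pin"])              # 225 W ceiling
rx590_fatboy = board_power_budget(["8-pin", "6-pin"])  # 300 W ceiling

# Boost clock uplift, assuming ~1400 MHz for XFX's highest-clocked RX 580.
uplift_pct = (1600 - 1400) / 1400 * 100

print(f"RX 580 GTS budget: {rx580_gts} W, RX 590 Fatboy budget: {rx590_fatboy} W")
print(f"Boost clock uplift: ~{uplift_pct:.0f}%")  # roughly 14%
```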
Source: Videocardz

60 Comments on XFX Radeon RX 590 Fatboy Smiles For The Camera

#26
EntropyZ
theoneandonlymrk: Lack of understanding much.

Different arch, different tactics. AMD: one arch, many markets. No game-specific arch.

Nvidia: multiple archs tiered for gaming alone, and a separate design for data.

GCN does it all; Nvidia makes special bits to do special shit, i.e. 64- or 32-bit CUDA cores, tensor.

And for power, the RTX cards suck plenty. Get off that high horse; the Nvidia fanboi days of riding power comments are over.
GCN does it all, but at a price; a typical AAA title doesn't even use those special cores anyway. Why should I pay for GPGPU hardware when the applications gamers use can't take advantage of it? And even if you do, that means using some API like GameWorks that adds overhead, and most of the effects aren't worth the performance cost. Rendering tricks will still be better and more widely accepted than anything that uses GPGPU.

It's cool for people that don't just game and are able to use their hardware however they want. But we rely so much on APIs to do anything at all and adoption of them is slow as hell.
#27
TheoneandonlyMrK
EntropyZ: GCN does it all, but at a price; a typical AAA title doesn't even use those special cores anyway. Why should I pay for GPGPU hardware when the applications gamers use can't take advantage of it? And even if you do, that means using some API like GameWorks that adds overhead.
Accepted, there is a cost to gaming performance, but some of us actually use our equipment for more than a few hours a day.

And

What kind of double-standard bullshit is that? Why should I pay for PhysX and tensor cores that only 5% of games use?
Tough shit; that stuff is not made just for your purposes or mine.
#28
EntropyZ
theoneandonlymrk: Accepted, there is a cost to gaming performance, but some of us actually use our equipment for more than a few hours a day.

And

What kind of double-standard bullshit is that? Why should I pay for PhysX and tensor cores that only 5% of games use?
Tough shit; that stuff is not made just for your purposes or mine.
Which is ironic considering most consumer graphics cards are marketed for gaming. Apart from the very low end, Quadros, Instinct and Titans.
#29
TheoneandonlyMrK
EntropyZ: Which is ironic considering most consumer graphics cards are marketed for gaming. Apart from the very low end, Quadros, Instinct and Titans.
Funny that you know what irony is, considering your posting history; just a few years ago Nvidia's corporate and consumer cards had parity.
Now both sets of users get a product from Nvidia better tailored to them.
They also iterated the heck out of the GTX 10 series without major pillorying from the community, and basically just added better memory each phase.

But look at them prices, everyone is now missing a ringpiece.
#30
EntropyZ
theoneandonlymrk: Funny that you know what irony is, considering your posting history; just a few years ago Nvidia's corporate and consumer cards had parity.
Now both sets of users get a product from Nvidia better tailored to them.

But look at them prices, everyone is now missing a ringpiece.
My posting history. That's what that is, history. You learn from it not to repeat mistakes. I still do make them at my own expense. You are free to criticize me for all of them.

And I agree that Nvidia has provided something decent in the past few years. Hell, this 1070 OC'ed to 2.2GHz runs everything butter smooth and punches above its weight. I still don't like that there's no hardware DX12 support, but within 6-12 months I will probably upgrade, and even then I'm using Vulkan wherever possible anyway.

If Vega 20 or Navi impresses me, I am jumping ship again. I want that FreeSync, damnit.
#31
TheoneandonlyMrK
EntropyZ: My posting history. That's what that is, history. You learn from it not to repeat mistakes. I still do make them at my own expense. You are free to criticize me for all of them.
Nope, you are a better debater than some, sir. Fair enough, we disagree; life goes on.
I wouldn't buy one of these with my own money, but for a new build for someone else, maybe.

And I am the same too, certainly not perfect.
#32
I No
theoneandonlymrk: Lack of understanding much.

Different arch, different tactics. AMD: one arch, many markets. No game-specific arch.

Nvidia: multiple archs tiered for gaming alone, and a separate design for data.

GCN does it all; Nvidia makes special bits to do special shit, i.e. 64- or 32-bit CUDA cores, tensor.

And for power, the RTX cards suck plenty. Get off that high horse; the Nvidia fanboi days of riding power comments are over.
GCN - the jack of all trades, the master of none. Can't tell if you're trolling or you're actually serious. You do know that Turing still draws less power than Vega even though it stomps it into the ground, right? Oh, and by the way, you might want to check out Steve from GamersNexus; he did a nice review with an unlimited power mod for Vega 56, and in order for it to match the stock 2070 the power consumption was in the mid-600s. So unless AMD comes up with something that can level the playing field, which I highly doubt they will, Nvidia still has the better power envelope for its products.
Oh, and since we're calling a spade a spade, take a gander down in the red camp, since they are the ones that bloat their arches with crap nobody uses for the next 5 years, hence why the compute part of their GPUs didn't go away; but that doesn't stop them from smacking a sticker on the box that says "GAMING". Both sides are doing questionable stuff, get over it. In more recent news, AMD boasted that their MI60 runs better than a Tesla, yet when they benchmarked it they disabled the Tesla's tensor cores, and once they were enabled, said Tesla wiped the floor with the MI60. Nice, ain't it?
#33
Vya Domus
I No: take a gander down in the red camp, since they are the ones that bloat their arches with crap nobody uses for the next 5 years
You mean Tensor and RT cores ? Oh wait, that would be Nvidia with their 750mm^2 monstrosity , 50% of which is occupied by useless crap according to your definition, no wonder it's power envelope hasn't changed much. Or does this work one way only depending on your favorite color ?
#34
I No
Vya Domus: You mean tensor and RT cores? Oh wait, that would be Nvidia with their 750 mm² monstrosity, 50% of which is occupied by useless crap according to your definition; no wonder its power envelope hasn't changed much. Or does this work one way only, depending on your favorite color?
Next time quote the whole phrase. You kinda left out the part where I mentioned "Both sides are doing questionable stuff", but yeah, it's easy to cherry-pick. I don't care who makes a better product as long as someone is doing it. Just going balls-out on one camp while strongly defending the other is flawed; that's all I'm pointing out. And yes, at this point in time Nvidia's die is 50% useless, but that doesn't mean AMD's crap is faring any better now, does it? Oh, and that 750 mm² die kicks the living daylights out of that overbloated arch that AMD keeps refreshing with zero tangible gains unless it's two gens over, and once those are out they play catch-up and think they can gain market share with questionable marketing and Maxwell levels of performance after 3+ years. But it's OK to punch at the competition when your product is mediocre at best, right?
#35
bug
RealNeil: The PCI-E power spec relates to the power the card draws from the PCI-E slot it is plugged into (a standard amount of power on all boards). The PCI-E power cables from your PSU supplement that, depending on what your GPU's real requirements are.
A combination of both may require more robust cooling.

I have a GTX-980Ti that uses two 8-pin and one 6-pin power cord as well as what the PCI-E bus provides.

My two GTX-1080s have just one 8-pin.
Actually, PCI-SIG defines everything up to an 8+6 pin design: en.wikipedia.org/wiki/PCI_Express#Power
Go beyond that and there's no PCI Express logo for you ;)
Kamgusta: Latest AMD cards violate PCI-E specifications, drawing much more power than allowed. This also results in cases where the PCI-E voltage drops significantly, from 12 V to 11.5 V.
Actually, even the power-hungry Vega can stay within the 275 W that PCIe defines. The 8+8 pin designs will go above that, but there are custom Nvidia designs that do that, too.
Sure, AMD has been trailing in power efficiency for so long that every aficionado has convinced themselves power draw is not to be taken seriously when looking at a video card, but let's stick to the facts.
#36
Vya Domus
I No: Next time quote the whole phrase. You kinda left out the part where I mentioned "Both sides are doing questionable stuff"
I left it out because it has nothing to do with your bizarre complaint that AMD is stuffing useless features in their GPUs when, in fact, the crown has clearly been taken by Nvidia by this point.
I No: But it's OK to punch at the competition when your product is mediocre at best, right?
I wouldn't know, I don't know what you are talking about.
I No: Oh, and that 750 mm² die kicks the living daylights out of that overbloated arch that AMD
So I am right indeed: you don't care how bloated the hardware is, or any of that nonsense. All that matters is that it's faster and has a green sticker on it; no need for elaborate justifications that only go one way. I wouldn't have cared if that were the case - a fair point.
#37
Ruru
S.T.A.R.S.
I understand the name in these days of "body positivity". :D
#38
ArbitraryAffection
1600 MHz is a pretty good result for Polaris, I think. This will almost always beat out the 1060 while having more VRAM. AMD needs to use 12 nm wafers from GloFo, and porting the Polaris architecture from 14 nm is extremely easy and cost-effective - 12nm LP was designed for exactly that. With no mid-range Turing in sight - and I think a TU106-based 2060 may not be possible with Nvidia's margin intentions considered (it is still a big chip even if cut down) - they may still rely on GTX 1060 cards to fill the midrange, which the RX 590 will happily compete with/beat (even if it does use more power; who cares if the cooler is good enough, I have a 1000 W PSU lol -shrug-). I think a price cut to the GTX 1070 would actually be a good move for NVIDIA, but they probably don't want to eat into potential RTX 2070 sales.

And also, AMD/RTG is working with literally a fraction of Nvidia's R&D budget, and AMD also makes (highly competitive) CPUs. GCN is tooled for high compute throughput, and that is where it shines. IMO AMD made a bet that game engines would scale more with compute instead of geometry, etc. Also, GCN is having utilisation issues in all current games (even games that run really well, like Wolfenstein 2). The shader arrays are sometimes being used less than 50% of the time in a frame. All you have to do is download the GPUOpen tool 'Radeon GPU Profiler' and run it on Vega to see that the 'wavefront occupancy' is extremely low. You've got all that number-crunching power but you can't use it to make more FPS cuz other parts of the pipeline are bottlenecking it (raster/geometry, I think). Also, I heard something about workload distribution being an issue. Either way, AMD has had 8 years to learn the issues with GCN and fix them. Limitations in the effective scalability of the current GCN arch prevent them from really fixing this... Anyway, I hope this 'Arcturus' architecture in 2020 puts AMD back in the high-end graphics game. (Even though Vega 10-based cards can still compete with the RTX 2070, which is 'high-end' in my book.)
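To put a number on that 'wavefront occupancy' figure: it is essentially the average number of wavefronts resident per SIMD divided by the hardware maximum, which is 10 on GCN. A minimal sketch of the calculation, using made-up sample counts rather than real profiler output:

```python
# Minimal sketch (not the Radeon GPU Profiler's actual output format):
# how a wavefront occupancy percentage can be derived. GCN SIMDs can keep
# up to 10 wavefronts resident; the samples below are hypothetical.
GCN_MAX_WAVES_PER_SIMD = 10

def occupancy(sampled_waves_per_simd):
    """Average resident wavefronts per SIMD divided by the hardware maximum."""
    avg = sum(sampled_waves_per_simd) / len(sampled_waves_per_simd)
    return avg / GCN_MAX_WAVES_PER_SIMD

samples = [3, 4, 2, 5, 3]  # hypothetical per-SIMD wave counts over a frame
print(f"Wavefront occupancy: {occupancy(samples):.0%}")  # 34%
```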

Oops. I got off-topic, anyway: TL;DR for last bit: IMO, GCN is a parallel compute architecture with graphics bits tacked on, lol, despite the name.

edit: fixing grammarz
#39
Dave65
It's FAT, like me :laugh:
#40
Jism
The core clock (1600 MHz) is actually not so bad. Some 580s, like mine, come with a default 1380 MHz. It's overclockable to almost 1500 MHz, but the power consumption seems to rise higher and higher once you pass 1400~1450 MHz on the card.

I hope the power envelope (= TDP) is equal to the RX 580's at 180 W. Then it's actually not a bad performer at all. If the card has tighter timings on the RAM, or can be tweaked some more, then it might come very close to the 1070/Ti or so.
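That climb is expected: dynamic power scales roughly with frequency times voltage squared, and clocks past the 1400~1450 MHz mark on Polaris usually need a voltage bump. A quick illustrative sketch; the voltages here are assumptions, not measured values:

```python
# Illustrative only: why power rises faster than clock speed once a voltage
# bump is needed. Dynamic power scales roughly as P ~ f * V^2; the voltage
# figures below are assumptions for the sake of the example.
def relative_power(f_mhz, v, f_ref_mhz=1380, v_ref=1.05):
    """Dynamic power relative to a reference clock/voltage operating point."""
    return (f_mhz / f_ref_mhz) * (v / v_ref) ** 2

print(f"1450 MHz @ 1.10 V: {relative_power(1450, 1.10):.2f}x reference power")  # ~1.15x
print(f"1500 MHz @ 1.17 V: {relative_power(1500, 1.17):.2f}x reference power")  # ~1.35x
```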
#41
moproblems99
EntropyZ: I want to wake up one morning and see something that is good on launch day and priced at what the previous generation was, and have my mind blown. Reasonable expectation, I hope?
See, this is the problem. Everyone wants AMD to beat 2080 Ti performance, but they only want to pay RX 580 prices. Just look at when Nvidia released the 480 compared to when AMD released the 290X. My prediction is that AMD is returning the middle finger they got when they had superior products. They are fighting a losing battle against stupidity, both from the masses and from some 'techies'. I am honestly floored every time I read about one having better drivers than the other. Each company has had shit streaks with their drivers. Each performs about the same.

Did they screw up with Bulldozer? Yes. Did they underdeliver with Vega? Yes. Is Polaris bland? Yes. Does it compete with the 1060? Yes. The fact that a Vega 64 basically matches the 2070 is quite impressive in my eyes. My assumption is that it will fare just as well with RTRT as well.
#42
TheoneandonlyMrK
I No: Next time quote the whole phrase. You kinda left out the part where I mentioned "Both sides are doing questionable stuff", but yeah, it's easy to cherry-pick. I don't care who makes a better product as long as someone is doing it. Just going balls-out on one camp while strongly defending the other is flawed; that's all I'm pointing out. And yes, at this point in time Nvidia's die is 50% useless, but that doesn't mean AMD's crap is faring any better now, does it? Oh, and that 750 mm² die kicks the living daylights out of that overbloated arch that AMD keeps refreshing with zero tangible gains unless it's two gens over, and once those are out they play catch-up and think they can gain market share with questionable marketing and Maxwell levels of performance after 3+ years. But it's OK to punch at the competition when your product is mediocre at best, right?
You, sir, are deluded; it took 750 mm², and consumers are paying a lot for useless tech atm.
See my posts after that; I happily credit Nvidia accordingly with more gamer-focused designs, but they make you pay for that.
As for your Maxwell drivel, very funny; Maxwell was the last Nvidia arch that could do 64-bit right, hence it still appears powerful to some. Meanwhile, in the real world, I swapped a 660 Ti for an RX 480 last night and the guy's ecstatic at 165 fps in Bops 4 @ 1900x1200, and his 660 is worth nought.
On an i7 2600K, pciex1 no less.
#43
Ravenas
Why will XFX not put a custom Vega 64 on the market?
#44
_larry
I have an Arctic Cooling Accelero III on my R9 290, which takes up almost 3 slots... not worried about that.
I am so torn between upgrading to this or a Vega. I feel like the Vega will be the better investment, but until the 7 nm Vega comes out, they are too pricey for me...
#45
Kamgusta
bug: Actually, even the power-hungry Vega can stay within the 275 W that PCIe defines. The 8+8 pin designs will go above that, but there are custom Nvidia designs that do that, too. Sure, AMD has been trailing in power efficiency for so long that every aficionado has convinced themselves power draw is not to be taken seriously when looking at a video card, but let's stick to the facts.
Please do some research before talking about a topic you don't know anything about. I do not have time to waste correcting uneducated people.
Some AMD cards violated (and still violate) PCI-E power specifications. This is a fact.


If we stick to averages only, these are 4 W more than allowed over the PCI-E 6-pin connector and 7 W more than allowed over the PCI-E motherboard connection. PCI-SIG was about to issue an official statement when AMD released new drivers that allowed switching the card to a "nerf mode" (AMD called it "Compatibility Mode"). A graceless and shameless move.
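For reference, taking those quoted averages at face value and assuming the standard 75 W limits for both the motherboard slot and a 6-pin connector, the implied draws work out as follows:

```python
# Quick arithmetic on the averages quoted above, assuming the standard PCIe
# limits of 75 W for the motherboard slot and 75 W for a 6-pin connector.
SLOT_LIMIT_W = 75
SIX_PIN_LIMIT_W = 75

six_pin_avg = SIX_PIN_LIMIT_W + 4  # "4 W more" over the 6-pin  -> 79 W
slot_avg = SLOT_LIMIT_W + 7        # "7 W more" over the slot   -> 82 W

print(f"Implied average 6-pin draw: {six_pin_avg} W")
print(f"Implied average slot draw:  {slot_avg} W")
```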
#46
moproblems99
Take it for what it's worth, but to me, worrying about GPU power efficiency is like worrying about fuel economy in a supercar.
#47
EntropyZ
Ravenas: Why will XFX not put a custom Vega 64 on the market?
Remember HIS? The last time I saw their PCBs was on the R9 300 series. I miss those guys. Something isn't right; when the R9 Fury came out with HBM, they just kind of never made their own cooler for HBM cards. They do have branded Vega cards, but they are all reference. And the 400/500 series seems to have its own cooler, but they don't look all that impressive compared to their 7000/R(X) 200/R(X) 300 coolers.

I guess they sat this one out. When I think about it, I haven't seen much of their cards on UK retail anyway.
#48
WikiFM
GCN has definitely been with us for too long already; AMD needs a new arch from scratch. I think one of the gaming bottlenecks is the raster engines, of which an equivalently priced Nvidia card has 50% more, clocked about 40% higher. Example: the 1060 (48×1900) vs the 580 (32×1350); Nvidia has more than 2x the raster power, AMD around 30% higher compute power, yet still similar performance. An uneven arrangement of components inside GCN's pipeline.
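Putting rough numbers on that comparison, using the ROP and clock figures above plus the public shader counts (1280 for the GTX 1060, 2304 for the RX 580):

```python
# Worked version of the comparison above: pixel fill from the ROP x clock
# figures quoted, FP32 throughput from public shader counts at those clocks.
gtx1060 = {"rops": 48, "shaders": 1280, "clock_ghz": 1.9}
rx580   = {"rops": 32, "shaders": 2304, "clock_ghz": 1.35}

def pixel_fill_gpix(gpu):
    """Peak pixel fill rate: ROPs x clock (Gpixel/s)."""
    return gpu["rops"] * gpu["clock_ghz"]

def fp32_tflops(gpu):
    """Peak FP32 throughput: shaders x 2 ops per FMA x clock (TFLOPS)."""
    return gpu["shaders"] * 2 * gpu["clock_ghz"] / 1000

print(f"Raster advantage (1060 vs 580): {pixel_fill_gpix(gtx1060) / pixel_fill_gpix(rx580):.1f}x")  # ~2.1x
print(f"Compute advantage (580 vs 1060): {fp32_tflops(rx580) / fp32_tflops(gtx1060) - 1:.0%}")      # ~28%
```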
#49
TheoneandonlyMrK
Kamgusta: Please do some research before talking about a topic you don't know anything about. I do not have time to waste correcting uneducated people.
Some AMD cards violated (and still violate) PCI-E power specifications. This is a fact.


If we stick to averages only, these are 4 W more than allowed over the PCI-E 6-pin connector and 7 W more than allowed over the PCI-E motherboard connection. PCI-SIG was about to issue an official statement when AMD released new drivers that allowed switching the card to a "nerf mode" (AMD called it "Compatibility Mode"). A graceless and shameless move.
What's 78 or 82 watts going to do to a PCIe socket that 75 won't? Do some research, smart ass - nothing, unless the motherboard is a piece of shit.

And as he said, Nvidia have been caught doing the same; they just didn't fix it.

I used a 480 through that time - two in CrossFire, folding 24/7, no Compatibility Mode, no issues, cos I'm no drama queen.
#50
Zyll Goliat
I mean, the name is COOL, but for God's sake that design really looks cheap and just simply awful... XFX cards always looked modern and fancy to me (before), with excellent cooling solutions... here, let's just compare with the pic below:



Or maybe I've just lost my sense of "FASHION" :roll: