Tuesday, May 19th 2015

Top-end AMD "Fiji" to Get Fancy SKU Name

Weeks ago, we were the first to report that AMD could give the top SKU carved out of its upcoming "Fiji" silicon a fancy name à la GeForce GTX TITAN, breaking away from the R9 3xx mold. A new report by SweClockers confirms that AMD is carving out at least one flagship SKU based on the "Fiji" silicon, which will be given a fancy brand name. The product will go head-to-head with NVIDIA's GeForce GTX TITAN X. We know that AMD is preparing two SKUs from the fully-loaded "Fiji" silicon: an air-cooled variant with 4 GB of memory, and a liquid-cooled one with up to 8 GB of memory. We even got a glimpse of what this card could look like. AMD is expected to unveil its "Fiji" based high-end graphics card at E3 (mid-June 2015), with a product launch a week after.
Source: SweClockers

56 Comments on Top-end AMD "Fiji" to Get Fancy SKU Name

#26
GhostRyder
So AMD is just now realizing that names are OP?

So wait, if it's just the top-end SKU, this leaves an interesting opening for where the other cards will sit. Are we expecting the top Fiji to be the only one receiving a name like that, with the cut-down variants filling the void below? To me, that means we could be seeing something like the cut-down variant being the 390X, while the 390 might be the revamped R9 290X.

I guess this would help explain the rebranded R9 285 being where it is. Though it's a bit confusing, since by that logic a lot of card segments are being ignored, including possibly some cut-down silicon.
#27
the54thvoid
Super Intoxicated Moderator
If you undervalue your product, it will always be seen as inferior. Even if AMD could make Fiji for pennies, they would need to charge megabucks to make it seem high-end. As shitty as Nvidia's pricing policy is, it doesn't hurt sales.
One month to go, and I'm genuinely excited about this card.
#28
TheGuruStud
m1dg3t: But I don't wanna pay $850 for a GFX card. I want to pay $1500, then I can show all the plebs how much betterer I am than they... /sarcasm

Gotta love the free market, run by greed & stupidity.

If current pricing trends continue, this will be the last PC I build; the only component that has come down in price since I built my last one is the SSD. Everything else (equivalent) has stayed the same or increased in price. Good job :clap:
I don't think you understand Moore's Law (and how it's failing). GPUs are incredibly complex, and silicon is maxing out.

Intel is just ripping people off as usual.

Memory manufacturers have admitted to price fixing, again. They were sued before, but since laws don't matter anymore...
#29
Solaris17
Super Dainty Moderator
I don't really understand how or why people are crying about GPUs over $600; they have been like that for years. If people doubt me, I will show you the receipt for my 290X, and you can just Google news stories about the TITAN launch and the 980, or all the other GTX x80s for that matter. WTF do people want? This isn't news or shocking; the RADEON 9800 PRO back in like 2005 was $300+ at launch, and I hope you have the cognitive ability to notice the technological differences now. Back then everyone said the same thing: WTF, why isn't this card $150?! QQ rage1one1
#30
RealNeil
I want to see the reviews before I jump, but I'm champing at the bit!

If these new GPUs are that good, I have a friend who will unload his three Titan X cards for a good price. Maybe I'll just get those.
#31
HumanSmoke
Solaris17: I don't really understand how or why people are crying about GPUs over $600; they have been like that for years. If people doubt me, I will show you the receipt for my 290X, and you can just Google news stories about the TITAN launch and the 980, or all the other GTX x80s for that matter. WTF do people want? This isn't news or shocking; the RADEON 9800 PRO back in like 2005 was $300+ at launch, and I hope you have the cognitive ability to notice the technological differences now. Back then everyone said the same thing: WTF, why isn't this card $150?! QQ rage1one1
I generally think that most of the griping comes from people who haven't been buying graphics cards (or hardware in general) very long. Back in the day, $350-400 was the going rate for cutting-edge 3D graphics with ~2MB of RAM for those of us who used to stump up the cash for ATI's 3D Rage. When the 32MB cards first arrived in 2000, the same thing applied again (Radeon 7200 64MB, GeForce2 GTS, etc.), and likewise with the 128MB cards (ATI AIW 8500DV, GeForce3 Ti), which started moving into the $500 price bracket. As you say, the 9800 Pro was $300+ (actually closer to $400-450 for many, as was the preceding 9700 Pro and the FX 5900U/5950U, and the following 9800 XT), and the trend continued through the X800 PRO/XT and GeForce 6800U, and climbed to $500 with the XT PE and Ultra Extreme - and pretty much stayed there for anyone buying an X1800 XT or 7800 GTX... then you could tack on an additional $100-150 when the first 512MB cards (7800 GTX 512MB and X1900 XTX) arrived.

Some people seem to think that the pricing structure has somehow escalated, when the fact is the consumer was somewhat spoiled for a brief number of generations thanks to the R600 debacle, which led to AMD targeting the value-for-money segment with the HD 3870/4870/5870/6970... which accounts for less than three years of product releases in the twenty years that consumer 3D graphics have been generally available.
RejZoR: If you buy stuff based on marketing, then you're dumb. A good consumer (for companies), but dumb. I see absolutely no need for a 4K monitor.
Well, if your needs and desires dictated the fortunes of the entire AMD company, I could see the relevance, but the company needs to appeal to a wider market than just yourself. You see the market from a personal standpoint; I tend to look at it from the entire customer base.
For the record, I don't see a need for 4K either - especially with OS font issues, the lack of native 4K video content, and the nascent state of the 4K monitor market (why bother jumping on the train early when the next wave of panels - and cards - should feature DP 1.3 and a wider range of offerings in the 10-bit/IPS class?).
#32
nunyabuisness
The problem is that the HBM v1 standard is limited to 4GB of RAM. The only way they will get to 8GB is if they have two chips on the one board again, and we all know what that means... only 4GB is usable! So it's not an 8GB card, because DX11 doesn't support memory pooling!

Major fail by AMD with Fiji XT. A 290X with 8GB is still gonna be the choice for 4K gamers.
Or the TITAN X is a much better option. RIP AMD
#33
Xzibit
nunyabuisness: The problem is that the HBM v1 standard is limited to 4GB of RAM. The only way they will get to 8GB is if they have two chips on the one board again, and we all know what that means... only 4GB is usable! So it's not an 8GB card, because DX11 doesn't support memory pooling!

Major fail by AMD with Fiji XT. A 290X with 8GB is still gonna be the choice for 4K gamers.
Or the TITAN X is a much better option. RIP AMD
You missed my post I guess.
AMD CTO Joe Macri: "You're not limited in this world to any number of stacks, but from a capacity point of view, this generation-one HBM, each DRAM is a two-gigabit DRAM, so yeah, if you have four stacks you're limited to four gigabytes. You could build things with more stacks, you could build things with less stacks. Capacity of the frame buffer is just one of our concerns. There are many things you can do to utilise that capacity better. So if you have four stacks you're limited to four [gigabytes], but we don't really view that as a performance limitation from an AMD perspective."
They already hinted they were developing techniques for 8- and 16-stack configurations. One would suspect the Fiji FirePro will be equipped with more stacks.
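The arithmetic behind Macri's numbers is easy to sketch. Here is a minimal Python example using only the figures from the quote above (2-gigabit dies, four dies per first-generation stack); stack counts other than four are hypothetical:

```python
# Back-of-the-envelope HBM1 capacity math, per the Joe Macri quote above:
# each DRAM die is 2 Gbit, and a first-generation stack is four dies high.
GBIT_PER_DIE = 2      # HBM1 DRAM die density, in gigabits
DIES_PER_STACK = 4    # 4-Hi stack

def hbm1_capacity_gb(stacks: int) -> int:
    """Total frame buffer in GB for a given number of HBM1 stacks."""
    return stacks * DIES_PER_STACK * GBIT_PER_DIE // 8  # 8 Gbit = 1 GB

print(hbm1_capacity_gb(4))  # -> 4, the Fiji configuration discussed here
print(hbm1_capacity_gb(8))  # -> 8, the hypothetical eight-stack variant
```

Four stacks gives the 4 GB figure discussed in the thread; doubling the stack count (or the per-die density, as later HBM generations do) doubles the total.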
#34
nunyabuisness
Xzibit: You missed my post I guess.



They already hinted they were developing techniques for 8- and 16-stack configurations. One would suspect the Fiji FirePro will be equipped with more stacks.
How does that help me? I'm not buying a $5,000 video card to play games on, man... It's a failure out of the box. This is probably why Nvidia decided not to use HBM1, due to this limitation. It's only going to be for a flagship product, which means 4K, and 4GB isn't enough for 4K today! And running an X2 setup, it's still only going to use 4GB, because no pooling!

A 980 or a 980 Ti with 6-8GB is looking like a great option when they come out, until HBM matures with HBM v2 and 8GB becomes possible. And then we will have Pascal vs. AMD's version (IF they stay alive that long; as I said, I don't know how AMD is even still alive).
#35
Xzibit
nunyabuisness: How does that help me? I'm not buying a $5,000 video card to play games on, man... It's a failure out of the box. This is probably why Nvidia decided not to use HBM1, due to this limitation. It's only going to be for a flagship product, which means 4K, and 4GB isn't enough for 4K today! And running an X2 setup, it's still only going to use 4GB, because no pooling!

A 980 or a 980 Ti with 6-8GB is looking like a great option when they come out, until HBM matures with HBM v2 and 8GB becomes possible. And then we will have Pascal vs. AMD's version (IF they stay alive that long; as I said, I don't know how AMD is even still alive).
You might want to go back and look at all the Nvidia Pascal screenshots: HBM1, four stacks.

#37
Xzibit
HumanSmoke: That is a technology demonstrator - not a shipping product.
Nvidia has already stated that the shipping product will feature HBM2.
I don't discount that. I also suspect both are working on multi-stacking. The original HBM1 paper covered 4 and 8 stacks.

I wouldn't be too quick to discount that both are working on higher stack counts for their pro lines in any case, which would then work their way down to the gaming products.

Eight HBM1 stacks would give them 8GB, and with HBM2, 16GB; it makes sense that they would work towards a higher stack count, be it HBM1 or HBM2.
#38
Patriot
nunyabuisness: The problem is that the HBM v1 standard is limited to 4GB of RAM. The only way they will get to 8GB is if they have two chips on the one board again, and we all know what that means... only 4GB is usable! So it's not an 8GB card, because DX11 doesn't support memory pooling!

Major fail by AMD with Fiji XT. A 290X with 8GB is still gonna be the choice for 4K gamers.
Or the TITAN X is a much better option. RIP AMD
DX12 supports pooling.
#39
nunyabuisness
Xzibit: I don't discount that. I also suspect both are working on multi-stacking. The original HBM1 paper covered 4 and 8 stacks.

I wouldn't be too quick to discount that both are working on higher stack counts for their pro lines in any case, which would then work their way down to the gaming products.

Eight HBM1 stacks would give them 8GB, and with HBM2, 16GB; it makes sense that they would work towards a higher stack count, be it HBM1 or HBM2.
They probably built that with Maxwell and HBM v1, and realized it's not worth it yet, which is why that card is there.
That's when they announced they would release Pascal with HBM2 in 2016.
#40
nunyabuisness
Patriot: DX12 supports pooling.
It does, I agree. So let's see what happens.

But you can't market a single GPU as a 4K GFX card with only 4GB. And if they say, oh, you need two to do 4K, then that's a complete waste. The 290X in CrossFire did 4K, so what are we wasting all that extra horsepower on if there is no memory there to drive the anti-aliasing and extra textures?
#41
Patriot
nunyabuisness: It does, I agree. So let's see what happens.

But you can't market a single GPU as a 4K GFX card with only 4GB. And if they say, oh, you need two to do 4K, then that's a complete waste. The 290X in CrossFire did 4K, so what are we wasting all that extra horsepower on if there is no memory there to drive the anti-aliasing and extra textures?
It really depends on whether DX12 is more efficient in that realm as well... Who knows. Frankly, I don't think 8GB is required for 4K.
#42
nunyabuisness
Patriot: It really depends on whether DX12 is more efficient in that realm as well... Who knows. Frankly, I don't think 8GB is required for 4K.
I have a Samsung 32D 1440p monitor, and I max out 4GB easily! So I can say 4K would kill 4GB easily!
Again, they are saying compression tech is getting better, etc., and less memory is needed, but I remain sceptical.

For me, the 980 Ti will be a sweet buy if it's got 6-8GB of RAM, and then, if AMD remains alive, I'll see what they offer with HBM2, and Pascal from Nvidia; that way there is real competition. HBM1 will be full of issues, I think: fragile links and heat rising to the top DRAM chip. Think Xbox 360 issues and a billion bucks in write-downs!
#43
Patriot
Huh, I have not noticed maxing out my 4GB card on my ZR30 at 1600p...
#44
HumanSmoke
Patriot: Huh, I have not noticed maxing out my 4GB card on my ZR30 at 1600p...
Probably depends upon which titles you play.


With consoles having a greater pool of memory to work with, it is probably a safe bet that vRAM usage won't remain static in the foreseeable future.
#45
rtwjunkie
PC Gaming Enthusiast
erocker: I translate this as "AMD is going the expensive route".
And they should. They're not going to get out of their financial hole by selling products cheaply.

If you build quality, people will buy it.
#46
RealNeil
rtwjunkie: If you build quality, people will buy it.
^^^This^^^

If I perceive that it's worth it (after reading reviews and comments from others who already have it), and I can get the cash together, I'll buy it.
#48
TRWOV
It's not that HBM is limited to 4GB; it's just that the interposer AMD will use for Fiji is designed for four stacks only. Theoretically, you could fit as many as the space allows, but then again, HBM costs more than GDDR5, so I think the four-stack limit was set for economic reasons. Once HBM2 hits, each stack will be 2GB, and so 8GB cards would be possible under AMD's current implementation.

The Nvidia Pascal HBM design also has four stacks only, but they will wait for HBM2 before using it.
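TRWOV's point, that the interposer fixes the number of stack sites while the HBM generation fixes the per-stack density, can be sketched as follows. The per-stack capacities (1 GB for HBM1, 2 GB for HBM2) are the figures quoted in this thread, not official specifications:

```python
# With the interposer fixed at four stack sites, total capacity scales
# with per-stack density, not stack count. Per-stack figures below are
# the numbers quoted in this thread (assumptions, not official specs).
INTERPOSER_STACK_SITES = 4

GB_PER_STACK = {"HBM1": 1, "HBM2": 2}

for gen, per_stack in GB_PER_STACK.items():
    print(f"{gen}: {INTERPOSER_STACK_SITES * per_stack} GB total")
# prints:
# HBM1: 4 GB total
# HBM2: 8 GB total
```

This is why the same four-site design tops out at 4 GB today but would reach 8 GB once HBM2 parts are available.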
#49
EarthDog
Well, yes, the interposer specifically. Thank you for that clarification (edited my post to be more clear).

My underlying point should still stand, though, considering Fiji uses HBM1, not HBM2, with that interposer, correct? So there will not be an 8GB card out of the gate.
#50
HumanSmoke
EarthDog: Well, yes, the interposer specifically. Thank you for that clarification (edited my post to be more clear).
My underlying point should still stand, though, considering Fiji uses HBM1, not HBM2, with that interposer, correct? So there will not be an 8GB card out of the gate.
AMD will probably just follow Nvidia's example. Maxwell isn't being offered as a math co-processor (Tesla) due to the elimination of the bulk of its double-precision capability. AMD would likely follow a similar model, since they cannot compete with their own 16GB vRAM-equipped W9100. It also remains to be seen whether Fiji has had its native FP64 rate reduced in an effort to shrink the GPU's power envelope.