Tuesday, May 19th 2015

Top-end AMD "Fiji" to Get Fancy SKU Name

Weeks ago, we were the first to report that AMD could give its top SKU carved out of the upcoming "Fiji" silicon a fancy name à la GeForce GTX TITAN, breaking away from the R9 3xx mold. A new report by SweClockers confirms that AMD is carving out at least one flagship SKU based on the "Fiji" silicon, which will be given a fancy brand name. The product will go head-to-head against NVIDIA's GeForce GTX TITAN X. We know that AMD is preparing two SKUs out of the fully-loaded "Fiji" silicon - an air-cooled variant with 4 GB of memory, and a liquid-cooled one with up to 8 GB of memory. We even got a glimpse of what this card could look like. AMD is expected to unveil its "Fiji" based high-end graphics card at E3 (mid-June, 2015), with a product launch a week after.
Source: SweClockers

56 Comments on Top-end AMD "Fiji" to Get Fancy SKU Name

#1
net2007
"Ash nazg durbatulûk, ash nazg gimbatul, ash nazg thrakatulûk, agh burzum-ishi krimpatul"

One GPU to rule them all, One GPU to find them; One GPU to bring them all

and in the darkness bind them.
#2
Solaris17
Super Dainty Moderator
Can't wait!
#3
manofthem
WCG-TPU Team All-Star!
net2007"Ash nazg durbatulûk, ash nazg gimbatul, ash nazg thrakatulûk, agh burzum-ishi krimpatul"

One GPU to rule them all, One GPU to find them; One GPU to bring them all

[INDENT]and in the darkness bind them.[/INDENT]
That sounds like some Lord of the Rings mojo right there. Maybe this new AMD card was forged in the fires of Mordor's Mt. Doom... and maybe it'll get that hot too ;)

But excited for sure, been looking forward to it for quite some time!
#4
Patriot
manofthem: That sounds like some Lord of the Rings mojo right there. Maybe this new AMD card was forged in the fires of Mordor's Mt. Doom... and maybe it'll get that hot too ;)

But excited for sure, been looking forward to it for quite some time!
Looks hot but is actually cold to the touch ;)
#5
RealNeil
I hope that it's all they want it to be.
If it's aimed at the Titan cards, it will be expensive.
#6
NC37
How 'bout 8GB variants that aren't water-cooled? We've got games right now going over 6GB... just sayin'. I don't want to have to pay top dollar for VRAM, AMD, when you could just as easily dump more VRAM on a lower model and call it a day.
#7
RejZoR
And where are the names? I can only see the R9-300 series, the ones we've been looking at for ages...
#8
FordGT90Concept
"I go fast!1!11!1!"
NC37: How 'bout 8GB variants that aren't water-cooled? We've got games right now going over 6GB... just sayin'. I don't want to have to pay top dollar for VRAM, AMD, when you could just as easily dump more VRAM on a lower model and call it a day.
Hoping an OEM will strap 8 GB on an air-cooled R9 390 myself. When I heard only the R9 390X was getting 8 GB, I was disappointed. :(
#9
Solaris17
Super Dainty Moderator
I agree, I really hope they make an 8GB non-water-cooled one. That's the only one I'll probably buy.
#10
RejZoR
Do you think AMD itself will also offer the R9-390X in a 4GB configuration? I don't think I'll need 8GB of VRAM, but I could save on the price with a 4GB variant. If my ancient HD7950 3GB runs pretty much everything on Ultra, I'm quite confident 4GB will be enough for my 1080p screen.
#11
FordGT90Concept
"I go fast!1!11!1!"
The grapevine is saying the R9 390X will cost $600+ and at that price point, they couldn't sell it without 8 GiB of RAM. I suppose an OEM could do it, but usually they increase the amount of memory from manufacturer spec, not decrease it.

You're forgetting DirectX 12 and the increasing proliferation of 64-bit games. VRAM is quickly becoming a limiting factor, especially as more and more games are starting to use ridiculously high resolution textures.
#12
HumanSmoke
btarunr: We know that AMD is preparing two SKUs out of the fully-loaded "Fiji" silicon - an air-cooled variant with 4 GB of memory, and a liquid-cooled one with up to 8 GB of memory
Haven't AMD's own people basically come out with a 4GB maximum per GPU? (unless you're referring to a dual-GPU 2x4GB situation)
This first-gen HBM stack will impose at least one limitation of note: its total capacity will only be 4GB....[snip]...When I asked Macri about this issue, he expressed confidence in AMD's ability to work around this capacity constraint.
#13
Xzibit
AMD's CTO Joe Macri: "You're not limited in this world to any number of stacks, but from a capacity point of view, this generation-one HBM, each DRAM is a two-gigabit DRAM, so yeah, if you have four stacks you're limited to four gigabytes. You could build things with more stacks, you could build things with less stacks. Capacity of the frame buffer is just one of our concerns. There are many things you can do to utilise that capacity better. So if you have four stacks you're limited to four [gigabytes], but we don't really view that as a performance limitation from an AMD perspective."
"If you actually look at frame buffers and how efficient they are and how efficient the drivers are at managing capacities across the resolutions, you'll find that there's a lot that can be done. We do not see 4GB as a limitation that would cause performance bottlenecks. We just need to do a better job managing the capacities. We were getting free capacity, because with [GDDR5] in order to get more bandwidth we needed to make the memory system wider, so the capacities were increasing. As engineers, we always focus on where the bottleneck is. If you're getting capacity, you don't put as much effort into better utilising that capacity. 4GB is more than sufficient. We've had to go do a little bit of investment in order to better utilise the frame buffer, but we're not really seeing a frame buffer capacity [problem]. You'll be blown away by how much [capacity] is wasted."
#14
The N
I believe this will be an extraordinary performer from AMD against the TITAN X, more so than ever before. Of course, 4GB is more than enough to run games at Ultra at 1080p, limited to 4xAA. The water-cooled 390X with 8GB will definitely be expensive; AMD still has to decide the price range, because we already have the 970/980/TITAN X on the market at fair prices.

Price will be a main concern.
#15
erocker
*
I translate this as "AMD is going the expensive route".
#16
The N
Yeah, expensive, as they are in a position to raise the cost due to the jump in performance.

But what about software optimization from AMD? Many people switched from Hawaii to Maxwell due to weak driver support; that's the only weakness they are facing right now. Maxwell is still performing great in games, and paying extra for it is worth it.
#17
RejZoR
I've seen quite a lot of GTX 9xx problems with games. Not sure how awesome it really is...

I'm thinking about the vanilla R9-390 (non X). I don't need absolute top performance, but I don't think I'll feel happy with an "old" architecture packaged in new cards under the R9-380X. In that case I may just as well stick with my trusty HD7950 3GB, because I'm not willing to pay 600 € ($ = €) for a graphics card...
#18
HumanSmoke
The N: I believe this will be an extraordinary performer from AMD against the TITAN X, more so than ever before. Of course, 4GB is more than enough to run games at Ultra at 1080p.
I doubt that the target audience and the marketing will be aimed at 1080p gaming. If AMD targets a 1080p customer base, they are doing it wrong, especially when 2560x1440 is becoming the de facto entry level for the enthusiast sector, and 4K is now the holy grail of PR.
#19
Aceman.au
My 290s are about a year and a half old. Can't wait for these things to hit the market. While I'm currently only using 1920x1080, I think I'll upgrade to 4K soon after I get the new cards.
#20
RejZoR
HumanSmoke: I doubt that the target audience and the marketing will be aimed at 1080p gaming. If AMD targets a 1080p customer base, they are doing it wrong, especially when 2560x1440 is becoming the de facto entry level for the enthusiast sector, and 4K is now the holy grail of PR.
Problem is, people buy massive, cheap, super-high-res monitors, pair them with mid-range cards, and complain about how demanding they are. I'm usually one step behind with monitors, but I can always run ANY game at max settings and still have it butter smooth.

The only real reason I want to buy the R9-390 is because I simply want to have something new to fiddle with, to explore and admire its tech capabilities. I won't mind the extra framerate and future-proofing. I just don't see any point in having a 4K display. For what? It doesn't make the image any more "HD" than 1080p unless you're gaming on a 40-inch monitor... especially not in games. This isn't video, this is real-time rendering.
#21
HumanSmoke
RejZoR: Problem is....
I think you missed my point.
What sells more hardware: Common sense or marketing?
Companies are in the business of selling product. How much product would they sell if they admitted that the new generation of tech is ideally suited for the previous generation ecosystem?
Now, if vendor "A" tells world+dog that their cards are 4K capable/ready, how much worse does it sound if vendor "B" touts 1080p capable?
The other issue directly related to AMD is, why the hell would they target 1080p with their marketing when their own AIB partner is proclaiming 4K?
#22
The Von Matrices
FordGT90Concept: I suppose an OEM could do it, but usually they increase the amount of memory from manufacturer spec, not decrease it.
I don't see how it would be possible for an AIB vendor to change the amount of memory on an HBM-equipped product. In the past, the GPU and memory were sold separately and assembled together by the AIB, so it was relatively easy for an AIB to change the amount of memory on the card just by swapping the RAM chips. With HBM, the GPU and memory are one package encased under a soldered IHS, assembled by the manufacturer (AMD). Unless an AIB vendor finds a way to desolder the IHS and memory chips, replace the memory chips, and then resolder the IHS back on, there is no way to upgrade the memory capacity (assuming that higher-density memory chips even exist). From now on, the manufacturer will have to have an official SKU for there to be a high-memory-capacity configuration.
#23
RejZoR
HumanSmoke: I think you missed my point.
What sells more hardware: Common sense or marketing?
Companies are in the business of selling product. How much product would they sell if they admitted that the new generation of tech is ideally suited for the previous generation ecosystem?
Now, if vendor "A" tells world+dog that their cards are 4K capable/ready, how much worse does it sound if vendor "B" touts 1080p capable?
The other issue directly related to AMD is, why the hell would they target 1080p with their marketing when their own AIB partner is proclaiming 4K?
If you buy stuff based on marketing, then you're dumb. A good consumer (for companies), but dumb. I see absolutely no need for a 4K monitor. I switched to 1080p from 1280x1024 so I could more easily capture games in a nice, TV-viewable FullHD format, and that's it. And because I wanted a 144Hz screen. Otherwise I'd have stayed with my old display...

Where would I get 4K? When I was testing a 42-inch LCD as a monitor replacement. With such a big screen, 1080p didn't feel right because of the massive pixels. But you felt almost like you were in the game, because you could only see game content and no room :D Stuffing 4K into such tiny formats is pointless. Like I said, pixel count doesn't make games better looking, especially not since we have FSAA... Games != video. Resolution doesn't affect anything within object edges; it only makes edges look better without FSAA. Are people really still playing games without FSAA? I know I haven't since GeForce 2 days. 2xFSAA was a must-have, and since the GeForce 6600 I've been using 4x regularly. I literally can't imagine playing a game without any FSAA these days. I'd rather not play at all than play without FSAA...
#24
m1dg3t
But I don't wanna pay $850 for a GFX card. I want to pay $1500, then I can show all the plebs how much betterer I am than they... /sarcasm

Gotta love the free market, run by greed & stupidity.

If current pricing trends continue, this will be the last PC I build; the only component that has come down in price since I built my last one is the SSD. Everything else (equivalent) has remained the same or increased in price. Good job :clap:
#25
uuuaaaaaa
Will this be the famed AMD Radeon R9 Zeus card?