Friday, March 11th 2016

NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations - launching the second-biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous-generation enthusiast product, and launching the biggest chip later, as the high-end enthusiast product. The second-biggest chip based on NVIDIA's upcoming "Pascal" architecture, the "GP104," which could let NVIDIA win the crucial $550 and $350 price points, will be a lean machine. NVIDIA will design the chip to keep manufacturing costs low enough to score big on price-performance and to weather a potential price war with AMD.

As part of its efforts to keep the GP104 as cost-effective as possible, NVIDIA could give exotic new tech such as HBM2 memory a miss, and go with GDDR5X. Implementing GDDR5X should be straightforward and cost-effective for NVIDIA, given that it has implemented the nearly-identical GDDR5 standard on three previous generations. The new standard doubles densities, so one could expect NVIDIA to build its GP104-based products with 8 GB of memory as standard. GDDR5X gives a new lease on life to GDDR5, whose clock speeds had plateaued around 7 Gbps/pin. The new standard could come in speeds of up to 10 Gbps at first, and eventually 12 Gbps and 14 Gbps. NVIDIA could reserve HBM2 for its biggest "Pascal" chip, on which it could launch its next TITAN product.
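For a rough sense of what those per-pin speeds mean, peak memory bandwidth is simply the bus width (in bytes) multiplied by the per-pin data rate. A minimal sketch, assuming a hypothetical 256-bit bus for the GP104 (the actual bus width is unconfirmed):

```python
# Peak memory bandwidth (GB/s) = bus width in bytes x per-pin data rate (Gbps)
def peak_bandwidth(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth(256, 7))   # 224.0 GB/s - GDDR5 at its ~7 Gbps plateau
print(peak_bandwidth(256, 10))  # 320.0 GB/s - first-wave GDDR5X
print(peak_bandwidth(256, 14))  # 448.0 GB/s - eventual 14 Gbps GDDR5X
```

At 14 Gbps, a 256-bit GDDR5X interface would match the bandwidth of a hypothetical 512-bit GDDR5 interface running at 7 Gbps.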

The GP104 will be built on TSMC's 16 nm FinFET process. NVIDIA is hoping to unveil the first GP104-based products in April, at its annual GPU Technology Conference (GTC), with possible market availability by late May or early June 2016.
Source: Benchlife.info

135 Comments on NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

#76
Ferrum Master
Uh, I missed such a shitstorm :D

It is all speculation, 5X or not; the real benchmarks on real production silicon will show how it really works. HBM or not doesn't really matter as long as it works. And I haven't seen any info on bus width for these products either.

@Soalris17 I don't think so, the 980 Ti is and will remain a fine card... anything more powerful that justifies the upgrade cost still won't really be seen till late Q3, and that's a hell of a lot of time actually, so enjoy it...

As buyers, we will wait till both camps release Polaris and Pascal and then decide what is the best choice.
Posted on Reply
#77
EEQPCyblerr
Just Fu*k that NVidiot Fu*k asyc , I want HBM 2, not a GDDR5FxCK, if polaris use HBM, take my money AMD !!!
and why people talk about asyc, none game support that for now, and just small performance for FPS or TPS game
Posted on Reply
#78
rruff
Ferrum MasterIt is all speculation 5X or not, the real benchmarks on real production silicon will show how really works. HBM yes or not doesn't really matter as long it works. And I haven't seen any info on bus width for these products too.
No kidding. Spec numbers are for spec junkies. Fury has HBM but that didn't make it better than a GTX 980 with lowly GDDR5. And the 980 destroys it in FPS/W.
Posted on Reply
#79
the54thvoid
Super Intoxicated Moderator
EEQPCyblerrJust Fu*k that NVidiot Fu*k asyc , I want HBM 2, not a GDDR5FxCK, if polaris use HBM, take my money AMD !!!
and why people talk about asyc, none game support that for now, and just small performance for FPS or TPS game
Hmm. Language can get heated on TPU, but how about you dial back the tone and talk in a more civil manner. Manners are what separate PC from consoles.
Posted on Reply
#80
dj-electric
EEQPCyblerrJust Fu*k that NVidiot Fu*k asyc , I want HBM 2, not a GDDR5FxCK, if polaris use HBM, take my money AMD !!!
and why people talk about asyc, none game support that for now, and just small performance for FPS or TPS game
"I dont want ABC, i want XYZ because they told me its better" type stuff here.
Gratz, corporate BS actually got through your thick skull and made you believe this stuff.

Ask your GTX 980 Ti using friends how miserable they are that they don't have the HBM that the Fury X uses.
Posted on Reply
#81
Unregistered
I was going to wait until May to build my new system but it doesn't sound like Nvidia will have any new cards ready for release until the end of the year.

Any thoughts on if new motherboards will need to be manufactured to support the new gpu technologies?
Posted on Reply
#82
medi01
bpgt64How does
"NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X. "

Translate to;
"NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface"

eg.

About the same as;
"I could grow wings and fly"

Translate to;
"Man grows wings and flies"
Yep, "news" makes no sense whatsoever, but nobody cares (and hardly anyone notices), it seems.
Probably that's a hidden announcement that says: "let GFX warz startorz" on this site.
Startorz they have.
UnCagedAny thoughts on if new motherboards will need to be manufactured to support the new gpu technologies?
No, certainly not.
Posted on Reply
#83
FreedomEclipse
~Technological Technocrat~
With this in mind, is it still worth getting a 980 Ti, I wonder.
Posted on Reply
#85
the54thvoid
Super Intoxicated Moderator
FreedomEclipsewith this in mind. Is it still worth getting a 980ti i wonder
Honestly, if you're looking forward to W10 DX12 gaming - not so much, but you still get DX11 goodness for quite a while. Despite my defence of my 980 Ti - going forward, if I had to buy a card today, it'd be a hard call, but I'd probably buy a Fury X.

As for the Anandtech article - it's a bit meh... And I've lost a lot of faith in Appletech, I mean, Anandtech.
Posted on Reply
#86
truth teller
StefanMFYI: the whitepaper from last November reads HBM
the paper, and even the presentation, for drive px2 also mention pascal, and yet we got maxwell
you _can't_ trust that shit, especially given nfooledya's track record

they _will_ use gddr5x for mid/low end and that's fine, hbm doesn't even make sense here since the gpu doesn't have enough fillrate to saturate the bus (red camp will do it for sure too). this has been known for quite a while, yet people like to speculate this and that
Posted on Reply
#87
Unregistered
the54thvoidHonestly, if you're looking forward to W10 DX12 gaming - not so much but you still get DX11 goodiness for quite a while. Despite my defence of my 980ti - going forward, if I had to buy a card today, it'd be a hard call but I'd probably buy a Fury X.

As for the Anandtech article - it's a bit meh.... And I've lost alot of faith in Appletech, I mean, Anandtech.
The problem with getting an AMD/ATI GPU has always been the crappy drivers; how is it that after AMD took over ATI, they still suffer from bad driver support? I used to be a fan of AMD years ago, when I felt they offered real value, especially on the gaming side, but I had one too many issues after they took over ATI, and ever since Intel came out with the 2600K, AMD no longer appealed to me.
Posted on Reply
#88
FreedomEclipse
~Technological Technocrat~
UnCagedThe problem with getting an AMD/ATI GPU has always been the crappy drivers; how is it that after AMD took over ATI, they still suffer from bad driver support? I used to be a fan of AMD years ago, when I felt they offered real value, especially on the gaming side, but I had one too many issues after they took over ATI, and it seems after Intel came out with the 2600K, AMD no longer was such a good deal
If you run a CrossFire setup, then you have a valid complaint. CrossFire performance is still blighted by bad drivers; it's been like this for years, and it's one of the reasons I moved away from AMD GPUs as a whole.

However - if you only run a single GPU then I advise you to stop living in the past.


This thread is also an Nvidia thread, btw, so please quit with the AMD bashing, as it's not relevant here
Posted on Reply
#89
Naito
It's ludicrous that some people complaining here expect to see HBM2 in low to mid tier SKUs. How many of these people were even genuinely interested in buying Nvidia, but are now upset with this alleged GDDR5X implementation? Furthermore, how many of those that are upset have a 2160p screen, where bandwidth actually begins to mean something? Besides, core design plays a bigger role in performance in my opinion - a memory overclock to increase bandwidth provides marginal gains compared to bumping other clocks.

Yet again, this is the internet - trolls like to throw anything around.
Posted on Reply
#90
Unregistered
FreedomEclipseIf you run a crossfire setup then you have a valid complaint. Crossfire performance is still blighted by bad drivers and its been like this for years and one of the reasons i moved away from AMD GPUs as a whole.

However - if you only run a single GPU then I advise you to stop living in the past.


This thread is also an Nvidia thread btw so please quit with the AMD bashing as its not relevant here
I am building a new system and looking for a 2K monitor, and was hoping to get in on the new GPU models, but it's not a must for me; I would rather sit out the first few months of the new technology to avoid the pitfalls and headaches.
I would consider getting an AMD GPU if they actually offered a superior product to the 980 Ti. I actually came here looking for feedback and suggestions. Does AMD have a product that would make more sense than going with the 980 Ti?
Posted on Reply
#91
FreedomEclipse
~Technological Technocrat~
UnCagedI am building a new system and looking for a 2K monitor, and was hoping to get in on the new GPU models, but it's not a must for me; I would rather sit out the first few months of the new technology to avoid the pitfalls and headaches.
I would consider getting an AMD GPU if they actually offered a superior product to the 980 Ti. I actually came here looking for feedback and suggestions. Does AMD have a product that would make more sense than going with the 980 Ti?
Fury X as recommended - though I have heard of pump whine and stuff. YMMV
Posted on Reply
#92
Tsukiyomi91
The name is kinda weird to me though... GTX 1080... sounds like a resolution rather than an actual number system, but the tech that Pascal brings might put AMD's Polaris to the test. I'm down. GDDR5X vs HBM2. It's gonna be interesting. =D
Posted on Reply
#93
Stephen.
Hi all, is it just me, or is price what matters most about the upcoming generation of GPUs? If so, they should bring the price of the G_104 chips back down to $300 and $250 respectively, and the upcoming G_100 chips back to $500 and $400

With that, we will truly move to UHD

Just my two cents
Posted on Reply
#94
HumanSmoke
Tsukiyomi91the name is kinda weird to me though... GTX1080... sounds like a resolution rather than an actual number system
1080 is just a placeholder, because no one actually knows the naming convention. Nvidia changed their naming scheme after the GF 9000 series, so it isn't a reach to think they would do the same after the 900 series.
Posted on Reply
#95
rtwjunkie
PC Gaming Enthusiast
ZoneDymoErmm, so you are contradicting yourself: you say I did not read, yet then begin about reading comprehension... aka reading but not understanding... good job.

The starving kids in Africa is exactly the same joke stretch you made with your heart valve problems etc.; it literally has nothing to do with the conversation and with what to get worked up about. Honestly, how you cannot see this is beyond me.

We are talking about GPUs and you start about retirement... well, do I really have to repeat it? You put all that irrelevant information right on display...
It's again a non-argument.

And wait... starving children in Africa is a stretch, but "deal with raising and providing for your children" is not? It's basically the same issue, except with a little less selfishness (aka your children above other children) involved.



In the context of PC hardware being discussed on a PC hardware forum, you mean?
Yeah... don't know what he was thinking...
Totally, this is the place we should talk about starving children, cancer, etc. etc., what is important in life, like life itself... yep, seems just about right.
Honestly, how you cannot see that what rtwjunkie said is exactly the opposite of putting things in context, aka taking them OUT of context, is beyond me.
My remark about life itself takes the out-of-context to a further extreme, to illustrate how much of a non-argument it really is.
And if that is too much to understand then I'm sorry; I really cannot see how I can possibly make it any clearer.
Wow, is it time for your English lesson? I made no contradiction. You did not comprehend what you read, and that is what I said.

@HumanSmoke does know what I was thinking, because unlike you, he actually understands the concept of context, and has had frequent interaction with me, both positive and negative.

If you don't understand my point that anger over a GPU company is nonsensical, because it does not make any difference in your life like real-life events do, then I'm sorry; I can't create rational thought in your head. It doesn't matter if this is a tech site. Over-the-top anger at a hardware company is completely irrational, as they are minuscule in terms of their impact on life.

I just noticed how young you are. This explains your non-comprehension of what's actually important in the world. Like I said, you've got the luxury of allowing yourself to get worked up about GPUs. It still can't ruin your life, though. One day you'll realize what is actually worth getting mad over.

I'm done with this thread.
Posted on Reply
#96
Parn
This is the chip I'll be getting, provided its performance surpasses my GTX 980 by at least 15-20%. I don't really care whether it will be kitted out with GDDR5X or HBM2, as long as the performance is not handicapped by the less advanced memory technology. HBM2 would also have a card-size advantage, but to me the lower price of GDDR5X is more important.
Posted on Reply
#98
Fx
NaitoThe GP104 is not their flagship chip, so it is understandable. Having said that, GDDR5X is nothing to scoff at if what I've read is to be believed. 8GB/256bit SKUs with bandwidth one might expect from a 512bit memory system serving GDDR5, may become more commonplace, especially in the mid to high end. Surely this is a good thing for consumers?
I agree because having the stopgap technology (GDDR5X) in addition to the HBM2 allows them to tailor the technologies to their respective target performance profiles.
Posted on Reply
#99
xenocide
ProtagonistHi all, is it just me, or is price what matters most about the upcoming generation of GPUs? If so, they should bring the price of the G_104 chips back down to $300 and $250 respectively, and the upcoming G_100 chips back to $500 and $400
I'm sure Nvidia would love to sell you parts like that, but it's just not possible given manufacturing costs. You can't expect them to shove every new bit of tech into a new manufacturing process and keep it cost-effective. Even GDDR5X is a huge improvement over regular GDDR5, and in most instances memory isn't the bottleneck outside of maybe 4K--which most people don't even play at. The other issue is Nvidia has to go back to competing with AMD in terms of Compute capability, which is a lot of added cost. I think they planned on using HBM2 for GP104 and up but realized when DX12 was in the works they would need to account for Async. Compute and had to drop HBM2 to keep costs in control.
Posted on Reply
#100
Rubble
xenocideI'm sure Nvidia would love to sell you parts like that, but it's just not possible given manufacturing costs. You can't expect them to shove every new bit of tech into a new manufacturing process and keep it cost-effective. Even GDDR5X is a huge improvement over regular GDDR5, and in most instances memory isn't the bottleneck outside of maybe 4K--which most people don't even play at. The other issue is Nvidia has to go back to competing with AMD in terms of Compute capability, which is a lot of added cost. I think they planned on using HBM2 for GP104 and up but realized when DX12 was in the works they would need to account for Async. Compute and had to drop HBM2 to keep costs in control.
Totally agree with this; it really does seem like the most likely option. And with that new Pro Duo beating out the Titan Z, I think they'll focus on compute even more; even if it is unrelated, it's still a performance crown they'll want to claim.
Posted on Reply