Friday, March 11th 2016
NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface
It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations - launching the second-biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous-generation enthusiast product, and launching the biggest chip later, as the high-end enthusiast product. The second-biggest chip based on NVIDIA's upcoming "Pascal" architecture, the "GP104," which could let NVIDIA win the crucial $550 and $350 price-points, will be a lean machine. NVIDIA will design the chip to keep manufacturing costs low enough to score big on price-performance, and to weather a potential price-war with AMD.
As part of its efforts to keep the GP104 as cost-effective as possible, NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X. Implementing GDDR5X should be straightforward and cost-effective for NVIDIA, given that it has implemented the nearly-identical GDDR5 standard across three previous generations. The new standard doubles densities, so one could expect NVIDIA to equip its GP104-based products with 8 GB of memory as standard. GDDR5X breathes new life into GDDR5, whose clock speeds had plateaued around 7 Gbps/pin. The new standard could come in speeds of up to 10 Gbps at first, and eventually 12 Gbps and 14 Gbps. NVIDIA could reserve HBM2 for its biggest "Pascal" chip, on which it could launch its next TITAN product.
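Those per-pin data rates translate into board-level bandwidth in a simple way: bus width times per-pin rate, divided by eight bits per byte. As a rough sketch only - the article does not state GP104's memory bus width, so the 256-bit figure below is an assumption based on NVIDIA's previous second-biggest chips:

```python
# Peak memory bandwidth estimate for GDDR5/GDDR5X data rates.
# NOTE: the 256-bit bus width is an assumption (typical of NVIDIA's
# "x04" chips); GP104's actual bus width is not confirmed.

def bandwidth_gb_s(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: (pins * per-pin rate in Gbps) / 8 bits per byte."""
    return bus_width_bits * rate_gbps_per_pin / 8

for rate in (7.0, 10.0, 12.0, 14.0):
    print(f"{rate:>4} Gbps/pin on a 256-bit bus -> {bandwidth_gb_s(256, rate):.0f} GB/s")
```

Under that assumption, 10 Gbps/pin GDDR5X would deliver roughly 320 GB/s, against 224 GB/s for 7 Gbps GDDR5 on the same bus - a sizeable uplift without the packaging complexity of HBM2.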
The GP104 will be built on TSMC's 16 nm FinFET process. NVIDIA hopes to unveil the first GP104-based products in April, at its annual GPU Technology Conference (GTC) event, with possible market availability by late May or early June 2016.
Source:
Benchlife.info
135 Comments on NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface
They don't know shit about that; they have absolutely NO know-how about HBM at all!
They have done nothing in this regard - just Jensen's big mouth. Besides that, they have done nothing at all for HBM! NOTHING!
And to expect that nVidia is able to deliver an HBM chip right off the bat? No way in hell!
The HBM experiment can still be a disaster for nVidia: they have to do a whole chip in 16 nm, and they also have to do HBM - good chances that something will go horribly wrong...
And then we will see just another GDDR5(X) chip from them, while AMD rocks the boat with HBM...
If you've been paying attention, instead of blindly flying the red flag: AMD is going to do the same thing, only bringing in HBM2 in the NEXT generation after Polaris. Given the VRAM capacities Polaris models will need, we will likely see GDDR5X from them too.
So I had more than an 'issue' with the dead GPU in my Tecra. It was a known defect in the cooling that nVidia washed their hands of. I don't give a crap if it was 11 years ago or yesterday. That's ALL I NEED TO KNOW about a company to not want to give them any more money in the future. Kinda like Ford calculating that it would cost them more money to fix the Pinto than to pay for the lawsuits from exploding gas tanks. Doesn't exactly make me want to rush out and buy a Ford. When I catch wind of that kind of garbage, yeah, it's an 'issue' for me to give that company money. But you know what? I still gave nVidia another chance. Then I had essentially unworkably buggy drivers screwing around with my three monitor setup, and I gave up and sold the card. And don't give me any crap about using the 670 with 3 monitors. The specs said it supported nVidia Surround, and that's what I expected it to do. And it didn't, at least not without a 19 step procedure with every single driver update.
But what's funny is that even after I'd already decided not to give any more money to nVidia, the worst was yet to come from them. The $200 G-Sync tax, the GTX970 fiasco, the deliberate sabotaging of games with GimpWorks, hell, they're now integrating PhysX into the game engine, so you can't turn it off. Just how much more screwing over do you need to have some sense of self-respect? You yourself were screwed over by the Green Goblin, and you're defending them. The crap about how 'all 4GB is addressable' is the joke. Of course it's addressable, it just runs at 1/7th normal speed. Games actually scale back on textures when they see you're using a 970 so they won't blow over the 3.5GB. Hey, if you're happy with having been lied to, that's your problem, but don't go saying what I've said is a joke. As for 'green glasses', I think you meant to imply I'm wearing 'red glasses'. You're the one wearing green ones. The joke is you arguing to defend them when YOU YOURSELF as a GTX970 owner were sold a bill of goods. You're the one who's deluded for down-playing that so intensely. If AMD had done it, you'd be all over it, and you know it.
For the record, I've owned both camps, and am now considering an AMD card. I don't have a preferred company to buy from. Not that what I purchase should matter.
Do look up sales numbers for the 970 though, SINCE January last year. Hint: phenomenal. Read user reviews. Nearly all buyers were aware of the memory configuration. It hasn't bothered them.
The nVidia fans will buy the shit and say it's good, even if it is utter garbage like the memory of the GTX 970.
Why would you do that?
There is just one reason for that...
And that's to make it unusable when the Pascal successor is around. You tweak the memory a little, the newly bought game runs like shit, and the nVidia consumer zombie runs to the next store to buy the next nVidia graphics card...
That 'the other one' could have saved money because it could have lasted longer does not matter; it has to be an nVidia graphics card to be hip on the schoolyard - and because those nVidia fans say it has to be this way, because they believe strongly in this company...
Well, a couple of years ago, you had your god and fought about that.
Today you believe in companies that like nothing more than to strip you naked, and YOU like being raped by them...
There is some kind of perversion in that, don't you think?
Nobody buys a 670 for Surround, dude. It's literally not powerful enough to push it. My best friend has used Surround since he upped to a 780 Ti, and now has three ROG Swifts (the 9q), and although he has minor issues with games, and the update to Windows 10 really screwed with things, his overall experience is that he likes it enough to keep it. I can't say the same for G-Sync, because I sold my Swift within 6 months. But sure, I'm "wearing green glasses".
No buddy, I just call it like it is, and all I see is a guy who's had some bad experiences and wants to whine to the world about it. I simply use whatever is the best at the time. If I was truly wearing "green glasses" I wouldn't own many products from both sides. Where fanboyism is concerned I'm pretty unbiased, which comes with the territory of being a previous hardware reviewer. I bought a 970 for testing to debunk exactly the kind of retarded misinformation that guys like you spew constantly. 1/7th the normal speed? Even if it did, it sure doesn't hamper games at all, and I'd love for you to provide a legit source (not bullshit like WCCF) saying they "scale back textures". Now you're just pulling shit out of your ass, because if they did that, they'd do it with any other 4GB card (and they certainly don't). :rolleyes: I also like how you went from saying it's not a 4GB card to saying it's addressable. You guys with your one-sided, double-standard, contradicting arguments make me laugh. What you're saying was, and still is, a joke.
Actually no, I don't go posting on AMD threads spewing blatant fanboyism like guys like you do. Go ahead, search. I could easily bring up how they're using yet another Cooler Master AIO system despite the allegations from Asetek. I could go on about how they are or were being sued for their FX series false advertising. I could go on about how they haven't had any competition in the CPU market since 2012. I could go on about how their PR stunts are outright laughable (anybody remember those terrible Youtube videos?). I could go on about how their only form of marketing is hype hype hype while delivering an underwhelming product. Do I? No. I've also had many, many, many bad experiences from the AMD side of things, but lo and behold, it doesn't keep me from buying AMD products for testing. It's called having a sense of understanding, and not getting butthurt about little things. Anyways, this shit's way off topic. :shadedshu:
The only reason I've been buying NV since Kepler was released is their superior energy efficiency compared to AMD cards. If Polaris manages to turn the tables on Pascal, I won't hesitate to join the red camp.
Benchmark and actual game performance DID NOT CHANGE between before we knew how the memory was carved up and afterwards. And the card performs in line with the price bracket it sits in.
Basically, this is a non-issue that continues to have the dead horse beaten.
All I see coming from you is complete ignorance. If you really think nVidia is innocent and never tried to screw over anybody; that it honestly just 'forgot' to mention for three months that the 4GB GTX970 didn't really have 4GB of full-speed memory; that it isn't really ripping anybody off by charging a $200 premium for FreeSync with a Green Goblin logo that won't work with anybody else's graphics cards, or for a pathetic attempt at confetti effects patented as PhysX; that GameWorks is really good for the PC game industry, and not a transparent rear-guard action to gimp games on all but the latest nVidia cards to force an upgrade; and that you don't need to hold them to account for anything they've done, the lies, the deception - then go right ahead and enjoy taking it from behind. Keep supporting the company that's proven itself happy to screw you over. That's your prerogative. But don't tell me I'm just 'whining'. I've given you true anecdotes from my experience. There's no 'shit' being pulled from my ass, just true accounts.
I'm the one telling it like it is, and you're one justifying why all that crap is OK, so who's really the one whining, eh?
Yea, you're pulling shit out of your ass; that is your opinion, and nobody is winning, because it's a forum on the internet. If anybody is winning, it's me, because I'm not a biased shit stain whining on the internet because boohoo Nvidia is soooo bad. I've given you factual arguments, and all you're doing is whining about it, saying the same crap over and over again. Nvidia is a company out to make money, same as AMD, same as Intel, same as Alienware/Dell, same as NCIX, same as Samsung, same as Apple, and many others. If you don't like their business practices, take your wallet somewhere else. Don't go onto a board stirring shit on an Nvidia thread because you had bad experiences that give you a warped opinion of a company and anything they do. That's like having an ex-gf where anything she does pisses you off, even something as simple as putting a harmless selfie on Facebook. You're being that guy, that ex. It's exactly why I made the sarcastic remarks about how I should hate AMD because I've had so many bad products from them, but nope. Shit happens; you get over it and realize it's a freakin' computer part, dude. That doesn't make the company Hitler, and it definitely doesn't make your opinion fact.
Just agree to disagree with different opinions. :-)
You consider them to be "fanbois" and "irrational". Well. So what?
Maybe that's where the problem is? Why should you care about "bitching"? Why not just get over it? Bingo. Most people judge GPU performance based on reviews, not real life experience.
Yep. It's damn subjective. Most things expressed in perf bars are hardly noticeable in real life, that's why we have things like "halo product", more 3xx's being sold after Fiji was released etc.
And your point was?