Friday, March 11th 2016

NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations: launching the second-biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous generation's enthusiast product, and launching the biggest chip later, as the high-end enthusiast product. The second-biggest chip based on NVIDIA's upcoming "Pascal" architecture, the "GP104," which could let NVIDIA win the crucial $550 and $350 price-points, will be a lean machine. NVIDIA will design the chip to keep manufacturing costs low enough to score big on price-performance, and to weather a potential price-war with AMD.

As part of its efforts to keep the GP104 as cost-effective as possible, NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X. Implementing GDDR5X should be straightforward and cost-effective for NVIDIA, given that it has implemented the nearly-identical GDDR5 standard on three previous generations. The new standard doubles densities, so one could expect NVIDIA to build its GP104-based products with 8 GB as the standard memory amount. GDDR5X breathed new life into GDDR5, whose clock speeds had plateaued around 7 Gbps/pin. The new standard could come in speeds of up to 10 Gbps at first, and eventually 12 Gbps and 14 Gbps. NVIDIA could reserve HBM2 for its biggest "Pascal" chip, on which it could launch its next TITAN product.
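For context, those per-pin speeds translate directly into board bandwidth. A minimal sketch, assuming the GP104 retains a 256-bit memory bus like its GM204 predecessor (the bus width is an assumption on our part, not something the report confirms):

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits-per-byte
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * gbps_per_pin / 8

# Assuming a 256-bit bus (GM204-like; speculative for GP104):
for rate in (7, 10, 12, 14):
    print(f"{rate} Gbps/pin -> {bandwidth_gb_s(256, rate):.0f} GB/s")
# 7 Gbps/pin -> 224 GB/s, 10 -> 320 GB/s, 12 -> 384 GB/s, 14 -> 448 GB/s
```

Even at the initial 10 Gbps, such a card would comfortably outrun the 224 GB/s of a 7 Gbps GDDR5 setup on the same bus width.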

The GP104 will be built on TSMC's 16 nm FinFET process. NVIDIA hopes to unveil the first GP104-based products in April, at the GPU Technology Conference (GTC) event it hosts annually, with possible market availability by late-May or early-June 2016.
Source: Benchlife.info

135 Comments on NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

#126
medi01
Slizzo...card's performance was the same before we knew this information, and afterwards.
I understand your POV. Now please try to understand the other side.

Imagine that you buy a car that is advertised to have 4-wheel drive and later discover that it's only, lol, 3. (somehow) Would your car become any worse because of it? Nope. Did the manufacturer save costs by not doing it? Yep. Did they lie to you? Yep. That's the whole point.
Posted on Reply
#127
EarthDog
Asinine analogy.

I see both sides of this. In the end, A LOT of people made more out of this than needed to be made (and seemingly are still at it...). Performance never changed, people rarely hit the slowdown, and there actually is 4GB, not 3.5GB; it's just that the last 512MB is slower. Nobody shit themselves when they did this on the 660 Ti... but now it's a big deal... MEH.
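The 3.5+0.5 split being argued about here can be put into numbers. A rough sketch, using the figures NVIDIA disclosed after launch (a 256-bit, 7 Gbps GDDR5 bus split into eight 32-bit channels, with seven channels serving the fast 3.5 GB segment and one serving the slow 0.5 GB segment):

```python
# GTX 970 memory layout (per NVIDIA's post-launch disclosure):
# 256-bit GDDR5 at 7 Gbps/pin, split into eight 32-bit channels.
def segment_bandwidth_gb_s(channels_32bit, gbps_per_pin=7):
    """Peak bandwidth of a memory segment in GB/s."""
    return channels_32bit * 32 * gbps_per_pin / 8

fast = segment_bandwidth_gb_s(7)  # 3.5 GB segment on seven channels
slow = segment_bandwidth_gb_s(1)  # 0.5 GB segment on one channel
print(f"fast segment: {fast:.0f} GB/s, slow segment: {slow:.0f} GB/s")
# fast segment: 196 GB/s, slow segment: 28 GB/s
```

So the last 512MB is not absent, just roughly 7x slower to reach, which is why the slowdown only shows up when a game actually spills into that segment.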
Posted on Reply
#128
Slizzo
medi01I understand your POV. Now please try to understand the other side.

Imagine that you buy a car that is advertised to have 4-wheel drive and later discover that it's only, lol, 3. (somehow) Would your car become any worse because of it? Nope. Did the manufacturer save costs by not doing it? Yep. Did they lie to you? Yep. That's the whole point.
That's completely different. In your analogy, there is false advertising at play. nVidia advertised the card as having 4GB, because that is what it physically has and can physically address. How it addresses that memory was never spoken of.
Posted on Reply
#129
rtwjunkie
PC Gaming Enthusiast
medi01I understand your POV. Now please try to understand the other side.

Imagine that you buy a car that is advertised to have 4-wheel drive and later discover that it's only, lol, 3. (somehow) Would your car become any worse because of it? Nope. Did the manufacturer save costs by not doing it? Yep. Did they lie to you? Yep. That's the whole point.
I respect your viewpoint, however: the number of people who bought 970s in the last 13 months AFTER the information came out, who were aware of the issue and bought anyway, FAR outnumbers the number sold before the issue became public.

So they knew what they were buying; no lies were told to those buyers. In short, it really isn't an issue for most people, and user reviews on multiple sites, which indicate buyers knew of the issue before purchase, back this up.
Posted on Reply
#130
Stefan Payne
The Problem with the GTX 970 Memory is:

a) there are special profiles needed for some games
b) nVidia has the power to make this GPU useless if they want to.

With a normal memory architecture, that's not that easy, and you can still use your graphics card.
But with the Memory Architecture of the GTX 970, nVidia is able to make this chip useless, more or less with the flip of a switch.

And have you guys learned nothing from The Witcher 3??
Yes, that game, where a GTX 780/Titan was beaten by a ~200€ card like the GTX 960.
This happened with a couple of other games, like Batman...
Thus you can take this as confirmation that nVidia easily drops support for older hardware (well, at least the optimisation work).

What do you think will happen when there is a successor to the GTX 970??
Do you really think that nVidia will still optimize for a card that is no longer manufactured/sold??
They could also use the memory architecture to cripple this card, so that most newer games are unplayable -> most GTX 970 users will run to the next store and buy the next GeForce card...
Posted on Reply
#131
rtwjunkie
PC Gaming Enthusiast
Stefan PayneAnd have you guys learned nothing from The Witcher 3??
Yes, that game, where a GTX 780/Titan was beaten by a ~200€ card like the GTX 960.
This happened with a couple of other games, like Batman...
Thus you can take this as confirmation that nVidia easily drops support for older hardware (well, at least the optimisation work).
Honestly, you've been reading too many media reports instead of trying things yourself. This is where real world experience counts for more.

I had the 780, and used it the first two times I played TW3. Most who know me know I am an image-quality-over-frame-rate person. I ran the 780 at almost all high to very high settings, except shadows on medium and no HairWorks, and still played at a consistent 50 to 60 fps.

Also owning a 960, it's almost embarrassing for the 960 how badly it gets beaten. For it to put out a mere 30 to 35 fps, most settings MUST be dropped way down to medium. Think about that...same resolution, settings that don't produce as good visuals as the 780 had, and it is only JUST playable on a 960.

In conclusion, the only thing a 960 wins at versus a 780 in TW3 is the title of Lower Performing Card.
Posted on Reply
#132
Tsukiyomi91
Still rocking the 970 here. I don't care what people say about why it has 3.5+0.5 instead of the full 4. As long as you don't run demanding games with hi-res texture mods and graphics settings tweaked to Very High or Ultra for the sake of eye-candy, you'll be perfectly fine hitting well over 50fps on average most of the time, with slight dips to 40fps.
Posted on Reply
#133
Tsukiyomi91
@rtwjunkie the 960 is a mid-range card, so no surprise here that it can't outpace its older brother, the 780. With 2GB on most models and a few that have 4, it's not meant to win fps contests, but it's good enough for those who are on a tight budget and want a 1080p-ready card that handles most games well.
Posted on Reply
#134
dr emulator (madmax)
NaitoDarn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...
Same here :).

Hopefully it'll be cheaper on the electric bill than the HD 5870 I have at the mo.
Posted on Reply
#135
TheHunter
I see the full GP104 chip landing in the ~7 TFLOPS area at best, which is about what a 980 Ti does at a 1200-1300 MHz boost, plus a lot of marketing crap about lower power again.

Imo, for someone on a high-end Maxwell GM200 chip, a GP104 GTX 1080 is not really a worthy upgrade.

I would maybe trade my new 980 Ti for a full GP100 chip, but I don't see the point at 1080p, well, unless it's literally 2x faster..
Think I'll just stick with my original plan and wait for Volta and AMD's offering in Q1 2018... :)
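The ~7 TFLOPS ballpark above checks out against the standard FP32 throughput formula (each shader retires one FMA, i.e. 2 FLOPs, per clock); a quick sketch using the GTX 980 Ti's 2816 shaders:

```python
def fp32_tflops(shaders, clock_mhz):
    """Peak single-precision throughput: 2 FLOPs (one FMA) per shader per clock."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# GTX 980 Ti: 2816 shaders, at the 1200-1300 MHz boost range mentioned above
for mhz in (1200, 1250, 1300):
    print(f"{mhz} MHz -> {fp32_tflops(2816, mhz):.2f} TFLOPS")
# 1200 MHz -> 6.76 TFLOPS, 1250 MHz -> 7.04 TFLOPS, 1300 MHz -> 7.32 TFLOPS
```

Peak TFLOPS is of course only a rough proxy; architectural efficiency per FLOP can differ between generations.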
Posted on Reply