# NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface



## btarunr (Mar 11, 2016)

It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations: launching the second-biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous generation's enthusiast product, and launching the biggest chip later, as the high-end enthusiast product. The second-biggest chip based on NVIDIA's upcoming "Pascal" architecture, the "GP104," which could let NVIDIA win the crucial $550 and $350 price points, will be a lean machine. NVIDIA will design the chip to keep manufacturing costs low enough to score big on price-performance, and in a potential price war with AMD. 

As part of its effort to keep GP104 as cost-effective as possible, NVIDIA could give exotic new tech such as HBM2 memory a skip and go with GDDR5X. Implementing GDDR5X should be straightforward and cost-effective for NVIDIA, given that it has implemented the nearly-identical GDDR5 standard on three previous generations. The new standard doubles densities, so one could expect NVIDIA to build its GP104-based products with 8 GB of memory as standard. GDDR5X gives a new lease of life to GDDR5, which had seen its clock speeds plateau around 7 Gbps/pin. The new standard could come in speeds of up to 10 Gbps at first, and eventually 12 Gbps and 14 Gbps. NVIDIA could reserve HBM2 for its biggest "Pascal" chip, on which it could launch its next TITAN product.
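For a rough sense of what those per-pin speeds mean at the board level: peak bandwidth scales linearly with bus width and data rate. A quick sketch (the 256-bit bus width is an assumption based on the GP104 rumours discussed in this thread, not a confirmed spec):

```python
# Peak memory bandwidth in GB/s:
# bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

print(peak_bandwidth_gbs(256, 7))   # GDDR5 plateau at 7 Gbps   -> 224.0 GB/s
print(peak_bandwidth_gbs(256, 10))  # initial GDDR5X grade      -> 320.0 GB/s
print(peak_bandwidth_gbs(256, 14))  # eventual GDDR5X top grade -> 448.0 GB/s
```

Even the initial 10 Gbps grade would give a healthy jump over plateaued GDDR5 on the same bus width, which fits the article's framing of GDDR5X as a cheaper alternative to HBM2.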



The GP104 will be built on the 16 nm FinFET process by TSMC. NVIDIA is hoping to unveil the first GP104-based products by April, at the GPU Technology Conference (GTC), which it hosts annually, with possible market availability by late May or early June 2016.

*View at TechPowerUp Main Site*


----------



## esrever (Mar 11, 2016)

I would be surprised if it did use GDDR5X, given how little enthusiasm NVIDIA and pretty much every other tech company have shown for it. I expect normal GDDR5 over GDDR5X.


----------



## P4-630 (Mar 11, 2016)

Nice! I will wait for that!
Using Intel HD 530 now on my new Skylake build; still gaming on my ROG laptop just a little longer.


----------



## FordGT90Concept (Mar 11, 2016)

GDDR5X was announced not that long ago.  It takes time to design a new memory controller and push it to market.  AMD will likely do the same with their mid-range cards.


----------



## Solaris17 (Mar 11, 2016)

Damn I just retired my 290x with a 980ti like 2 weeks ago.


----------



## Frick (Mar 11, 2016)

Gooooooooosddddddddd I just want the stuff on the market so we can skip the hype.


----------



## Casecutter (Mar 11, 2016)

When did/does GDDR5X reach production readiness? The info I had was that volume production was said to be this summer.
http://www.anandtech.com/show/10017/micron-reports-on-gddr5x-progress

My money is on the GTX 1080 with 8 GB GDDR5, 256-bit, 150 W TDP; it can best the Fury X and more so the reference 980 Ti, and they'll sell for $550. They'll update it with GDDR5X and higher clocks (more or less a rebrand) when the FinFET process permits.


----------



## rtwjunkie (Mar 11, 2016)

My money is on a paper launch in late May/early June, with a hard launch as soon as the new GDDR5X is ready for production.


----------



## the54thvoid (Mar 11, 2016)

Casecutter said:


> When did/does GDDR5X reach production readiness? The info I had was that volume production was said to be this summer.
> http://www.anandtech.com/show/10017/micron-reports-on-gddr5x-progress
> 
> My money is on the GTX 1080 with 8 GB GDDR5, 256-bit, 150 W TDP; it can best the Fury X and more so the reference 980 Ti, and they'll sell for $550. They'll update it with GDDR5X and higher clocks (more or less a rebrand) when the FinFET process permits.



Quite possible on the performance metric. The GTX 980 didn't exactly outclass the 780 Ti, so the '1080' might be on par with the 980 Ti but, as you say, with far less power draw. It'd be strange for Nvidia to release a card they weren't sure would take the crown, though, so perhaps they know Polaris is coming far later in the year, when they can release a 1080 Ti? Almost reminiscent of the 780 being bettered by the 290X, then the 780 Ti dropping like a week or so later. It'd be nice if Polaris comes out with guns blazing, though, and forces a price war with Nvidia.


----------



## truth teller (Mar 11, 2016)

FordGT90Concept said:


> It takes time to design a new memory controller and push it to market.


Not really, since the protocol is mostly untouched; it's mostly just a speed bump plus a wider word/bus access.



FordGT90Concept said:


> AMD will likely do the same with their mid-range cards.


Most definitely, which is not a bad thing in itself, since on mid/low-end SKUs there won't be much benefit in going the HBM route.


----------



## Naito (Mar 12, 2016)

Darn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...


----------



## manofthem (Mar 12, 2016)

I'll be looking forward to this. I haven't upgraded in a long while and I'm already scratching the itch


----------



## NC37 (Mar 12, 2016)

Will be disappointing. Reports are the 1080 will use a 104 chip instead of a 110, which means the 1060 will be a 128-bit piece of junk, unless they do like they did with the 600 series and run 192-bit. Either way, it'll be more of the same. We haven't had a good 256-bit chip in the 60 series since Fermi.


----------



## rtwjunkie (Mar 12, 2016)

Naito said:


> Darn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...



Wait longer for what?  For a long time it had been looking like mid to late summer.  This is earlier than expected.


----------



## HumanSmoke (Mar 12, 2016)

FordGT90Concept said:


> GDDR5X was announced not that long ago.  It takes time to design a new memory controller and push it to market.  AMD will likely do the same with their mid-range cards.


JEDEC might have only just ratified the specification, but it doesn't necessarily follow that memory controller and associated physical-layer IP development starts from that point. QA, verification and test tooling have been available for some time.


More likely is that the base GPU (for both vendors) is prepped for both IMC/PHY logic blocks. GDDR5 initially, and GDDR5X when production hits its stride. If 14/16nm is anything like previous nodes, both companies will want a low (R&D)-cost refresh option.


Casecutter said:


> When did/does GDDR5X reach production readiness? The info I had was that volume production was said to be this summer.
> http://www.anandtech.com/show/10017/micron-reports-on-gddr5x-progress


The salient point is mass production; a mass-production announcement does not gate products shipping with the chips. SK Hynix announced volume production of HBM only a week before the Fury X launched, which means the chips used on the Fiji series of cards were fabricated some months before the Hynix announcement, given the lead time needed for GPU package assembly, board assembly, and shipping. Micron are already sampling GDDR5X, and they certainly didn't wait around for JEDEC to ratify the specification before beginning work.


Casecutter said:


> My money is on the GTX 1080 with 8 GB GDDR5, 256-bit, 150 W TDP; it can best the Fury X and more so the reference 980 Ti, and they'll sell for $550. They'll update it with GDDR5X and higher clocks (more or less a rebrand) when the FinFET process permits.


Excepting the naming (I sincerely doubt "1080" will be a model number), that seems reasonable. Nvidia will almost certainly target high-end mobile and performance-segment discrete as they did with GM204. The smaller die size will offset higher wafer costs, so $500-550 for the GTX _80 and $350 for the GTX _70 seems in line with previous pricing.


the54thvoid said:


> Quite possible on the performance metric.  The GTX980 didn't exactly outclass the 780ti, so the '1080' might be on par with the 980ti but as you say with far less power draw.  It'd be strange for Nvidia to release a card though that they weren't sure would take the crown so perhaps they know Polaris is going to be far later in the year when they can release 1080ti?  Almost reminiscent of 780 > bettered by 290X, then the 780ti is dropped like a week or so later.


I'm picking that GP104 has enough performance to pick off the custom 980 Tis and Fury cards, but maybe not much more. Price the cards at $500-550 and $330-350 (GTX _70) and they will still stack up well for consumers in comparison. One thing is certain: the company usually knows how to sell product, so they will have the angles covered.


the54thvoid said:


> It'd be nice if Polaris comes out with guns blazing though and forces a price war with Nvidia.


Sounds like a Grimm faery tale, unfortunately. Except for some hollow sabre-rattling and PR-driven token price cuts, both AMD and Nvidia seem intent on giving the impression of a price war without actually engaging in one.


----------



## rruff (Mar 12, 2016)

btarunr said:


> It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations - launching the second biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous-generation enthusiast product, and launching the biggest chip later, as the high-end enthusiast product.



GTX 750 doesn't count? It was way before the 970.


----------



## HumanSmoke (Mar 12, 2016)

rruff said:


> GTX 750 doesn't count? It was way before the 970.


The second-largest Kepler chip, the GK104, was also launched the same day as the GK107 (22nd March 2012), which powered the 640M, 650M, and 660M.


----------



## Naito (Mar 12, 2016)

rtwjunkie said:


> Wait longer for what?  For a long time it had been looking like mid to late summer.  This is earlier than expected.



For the FULL Pascal. 4K takes a lot of horsepower to drive.


----------



## midnightoil (Mar 12, 2016)

rtwjunkie said:


> Wait longer for what?  For a long time it had been looking like mid to late summer.  This is earlier than expected.



Um, NVIDIA's first cards are expected in very low volume at either the beginning of Q4 or end of Q3, no earlier.  There's no way there'll be a Titan until late Q1 '17 at the very earliest.

This whole report is completely made up.

GDDR5X won't be in volume production until the end of Q3. So how exactly is NVIDIA going to launch a GDDR5X card at the end of Q2?

Answer: they can't and won't.

They still haven't demo'd real cards to anyone as far as we know.  If they had it would have leaked.  AMD had their first demos to the press in early December with real cards.  Select partners had access earlier than that.


----------



## rtwjunkie (Mar 12, 2016)

I'm not sure what your "um" is for, making it sound like I'm an idiot or something. It's obvious you didn't read the whole thread. Read my post #8: http://www.techpowerup.com/forums/t...-gddr5x-memory-interface.220823/#post-3429647

This isn't my first rodeo.


----------



## PP Mguire (Mar 12, 2016)

midnightoil said:


> Um, NVIDIA's first cards are expected in very low volume at either the beginning of Q4 or end of Q3, no earlier.  There's no way there'll be a Titan until late Q1 '17 at the very earliest.
> 
> This whole report is completely made up.
> 
> ...


GDDR5X is expected to reach high-volume production in the summer, which means GP104 could very well roll out in the fall, as partners are supposed to be receiving samples now. How they got to "available by June" is beyond me, but a paper launch seems like it could happen.


----------



## ZoneDymo (Mar 12, 2016)

Pretty sad for such a "premium" company to go with GDDR5X for all but the most overpriced of their products.
You would think that that massive market share would count for something.
Oh well... guess MOAR profits rule again and the blind will all gladly bend over.


----------



## Naito (Mar 12, 2016)

ZoneDymo said:


> pretty sad for such a "premium" company to go for GDDR5x for all but the most overpriced of their products.
> You would think that that massive market share would count for something.
> Oh well...guess MOAR profits rule again and the blind will all gladly bend over.



The GP104 is not their flagship chip, so it is understandable. Having said that, GDDR5X is nothing to scoff at, if what I've read is to be believed: 8 GB/256-bit SKUs with bandwidth one might expect from a 512-bit memory system serving GDDR5 may become more commonplace, especially in the mid to high end. Surely this is a good thing for consumers?


----------



## TheDeeGee (Mar 12, 2016)

Solaris17 said:


> Damn I just retired my 290x with a 980ti like 2 weeks ago.



It was known for ages that the new cards would come out around May/June.


----------



## Jism (Mar 12, 2016)

Since AMD was involved in the development of HBM and HBM2, it's said to have first priority on HBM1 & HBM2 chips. The reason Nvidia is choosing GDDR5X is simply that it can't get HBM2 chips yet.

HBM is still superior to GDDR: fewer chips, less power, much tighter latencies and much bigger bandwidth.


----------



## Naito (Mar 12, 2016)

Jism said:


> HBM is still superior to GDDR, less chips, less power, much tighter latency's and much bigger bandwidth.



GDDR5X closes the gap considerably compared to HBM1, so much so that unless you're gaming at 4K, you'd probably never notice the difference, if then.


----------



## Jism (Mar 12, 2016)

Yes, for now. But HBM is still the better product, with lower power and lower latency, and even future generations of HBM can still outperform GDDR5X easily. 

You wouldn't have had a Fury X in a small form factor if it wasn't for HBM. There wouldn't be room to stack 8 GDDR5X chips around the GPU anymore.


----------



## bug (Mar 12, 2016)

Jism said:


> Since AMD was involved in the development of HBM and HBM2, it's said to have first priority on HBM1 & HBM2 chips. The reason Nvidia is choosing GDDR5X is simply that it can't get HBM2 chips yet.
> 
> HBM is still superior to GDDR: fewer chips, less power, much tighter latencies and much bigger bandwidth.



Actually, the latency of HBM is worse than GDDR's. It makes up for it with its additional bandwidth. It's a little bit like the old RAMBUS.


----------



## rtwjunkie (Mar 12, 2016)

Jism said:


> Yes, for now. But HBM is still the better product, with lower power and lower latency, and even future generations of HBM can still outperform GDDR5X easily.
> 
> You wouldn't have had a Fury X in a small form factor if it wasn't for HBM. There wouldn't be room to stack 8 GDDR5X chips around the GPU anymore.



And?... We get it, you'd rather see HBM2 GPUs. You will, on the high-end cards. The GP104 cards occupy the same placement as the current 980 and 970, the upper end of the mid-tier. They are not high-end flagships, and neither will the first Pascal chip, the GP104, be one.

GP104 should be affordable. GDDR5X, though expensive, should be a lot cheaper to produce in volume than HBM2, which translates into keeping GP104 GPUs somewhat affordable.


----------



## PP Mguire (Mar 12, 2016)

Jism said:


> Since AMD was involved in the development of HBM and HBM2, it's said to have first priority on HBM1 & HBM2 chips. The reason Nvidia is choosing GDDR5X is simply that it can't get HBM2 chips yet.
> 
> HBM is still superior to GDDR: fewer chips, less power, much tighter latencies and much bigger bandwidth.


HBM is still expensive, HBM2 will be more expensive than GDDR5X, and Nvidia/AMD are getting their chips from different companies. Nvidia will have its HBM2 chips as soon as they're available in Q3, and is releasing GP104 first anyway, as it has for the past three generations. That being said, HBM2 will be left for their big chip, just like HBM was for AMD and Fury. Nothing new here. GDDR5X will be perfectly fine for products that aren't top tier.


----------



## the54thvoid (Mar 12, 2016)

ZoneDymo said:


> Oh well...guess MOAR profits rule again and the blind will all gladly bend over.



I know. Isn't it just awful. Imagine if AMD released something smaller, less powerful than Fury X, with lower circuit componentry and then took the AIO unit off. Imagine they did that and charged more for it...

Shock horror, both teams try to make a profit.  Deal with it.


----------



## anubis44 (Mar 12, 2016)

Solaris17 said:


> Damn I just retired my 290x with a 980ti like 2 weeks ago.



Uh Oh:

http://pokde.net/blog/amd-r9-290x-beats-the-gtx-980-ti-in-dx12/
https://pcpartpicker.com/forums/topic/105376-do-not-buy-gtx980-or-gtx980ti-they-are-obsolete

Maybe you're still within the return period for that obsolete 980Ti? Best of luck.


----------



## Solaris17 (Mar 12, 2016)

anubis44 said:


> Uh Oh:
> 
> http://pokde.net/blog/amd-r9-290x-beats-the-gtx-980-ti-in-dx12/
> https://pcpartpicker.com/forums/topic/105376-do-not-buy-gtx980-or-gtx980ti-they-are-obsolete
> ...



Lol, as someone who owns both: the 290X does not beat my 980 Ti, unfortunately.


----------



## rtwjunkie (Mar 12, 2016)

anubis44 said:


> Uh Oh:
> 
> http://pokde.net/blog/amd-r9-290x-beats-the-gtx-980-ti-in-dx12/
> https://pcpartpicker.com/forums/topic/105376-do-not-buy-gtx980-or-gtx980ti-they-are-obsolete
> ...



So, as most members here know, those stories are based on performance in Ashes of the Singularity, the only real DX12 game out there half a year after the premiere of DX12.

So there is absolutely zero credence to either article, especially the second, hysterical "your Maxwells are obsolete" one.


----------



## medi01 (Mar 12, 2016)

I read the article two times and missed which part of it was news (or what the title was based on).

It says NV MIGHT do something? Is what some company MIGHT do considered news these days? ^^


----------



## midnightoil (Mar 12, 2016)

rtwjunkie said:


> I'm not sure what your "um" is for, making it sound like I'm an idiot or something. It's obvious you didn't read the whole thread. Read my post #8: http://www.techpowerup.com/forums/t...-gddr5x-memory-interface.220823/#post-3429647
> 
> This isn't my first rodeo.



That would likely mean a paper launch four or even five months before any real availability... they may be late and AMD earlier, but showing nothing at all would be far less damaging to their brand than that kind of paper launch.


----------



## midnightoil (Mar 12, 2016)

bug said:


> Actually, the latency of HBM is worse than GDDR's. It makes up for it with its additional bandwidth. It's a little bit like the old RAMBUS.



GDDR5(X) has a minute advantage in tRC and a disadvantage in tCCD. Neither is really enough to matter. 

However, a 256-bit GDDR5X 8 GB card will only have 448 GB/s of bandwidth, while an HBM2 8 GB card will have 1024 GB/s. The HBM2 card will also likely have well under half the power consumption and consume about one quarter of the board area. There are further advantages from a 4096-bit bus vs a 256-bit one.

It's not even close.
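The arithmetic behind those two figures checks out; a quick sketch (assuming the eventual 14 Gbps/pin GDDR5X grade, and an HBM2 configuration of four 1024-bit stacks at 2 Gbps/pin):

```python
# Peak bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8
gddr5x_gbs = 256 * 14 / 8  # 256-bit bus at 14 Gbps/pin
hbm2_gbs = 4096 * 2 / 8    # 4 stacks x 1024 bits at 2 Gbps/pin

print(gddr5x_gbs)             # 448.0
print(hbm2_gbs)               # 1024.0
print(hbm2_gbs / gddr5x_gbs)  # roughly 2.3x in HBM2's favour
```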


----------



## PP Mguire (Mar 12, 2016)

midnightoil said:


> GDDR5(X) has a minute advantage in tRC and a disadvantage in tccd.  Both aren't really enough to matter.
> 
> However a 256bit GDDR5X 8GB card will only have 448GB/s of bandwidth.   An HBM2 8GB card will have 1024GB/s of bandwidth.  It'll also likely have well under half the power consumption and consume about one quarter of the board area.  There are also further advantages from being a 4096bit bus vs 256bit.
> 
> It's not even close.


At this point I can safely say it really doesn't matter much [for now]. We get more of an advantage from raw GPU power than from memory bandwidth: I can raise the clocks on my VRAM and hardly get any performance benefit, while raising my core 300 MHz gives me a good FPS boost. Sure, a boost in memory bandwidth coupled with the architectural changes from Maxwell to Pascal will be nice, but the majority of the performance benefit will come from Pascal being faster itself, with the faster memory just a complement. The main benefit HBM will bring to the table in large quantities is the ability to store large amounts of big texture data.


----------



## the54thvoid (Mar 12, 2016)

anubis44 said:


> Uh Oh:
> 
> http://pokde.net/blog/amd-r9-290x-beats-the-gtx-980-ti-in-dx12/
> https://pcpartpicker.com/forums/topic/105376-do-not-buy-gtx980-or-gtx980ti-they-are-obsolete
> ...



Lol. I'll gladly pitch my 980 Ti against any Fury X at 1440p or 4K on any current or upcoming game. As far as working DX12 titles go, scientifically speaking you have nothing to bring to the table bar one unreleased, arguably AMD-favouring title.
Love the avatar, by the way; nice to see you wear your heart on your sleeve.


----------



## anubis44 (Mar 12, 2016)

the54thvoid said:


> Lol. I'll gladly pitch my 980ti against any Fury X at 1440p or 4k on any current or upcoming game. As far as working DX12 titles, well, scientifically speaking you have nothing to bring to the table bar one unreleased, arguably AMD favouring title.
> Love the avatar by the way, nice to see you wear your heart on your sleeve.



Nothing to bring to the table, eh? AOtS the 'only' example of async compute being enabled on a DX12 game, eh?

What's this?
http://wccftech.com/hitman-pc-directx-12-benchmarks/

The Radeons are beating the crap out of your beloved Green Goblin, and a 390X is beating the Titan X.

nVidia GPUs do not have hardware schedulers, so they can't perform asynchronous compute. This is like not having 'hyperthreading' on a GPU. And it's quite probable that Pascal won't have them either.

As for 'wearing my heart on my sleeve', well, what can I say? I hate nVidia with a burning passion, so why be coy about it? They screwed me over with a burned out GPU in my Toshiba Tecra M3, for which I never received compensation, their drivers were so crappy with my GTX670 on a 3 monitor setup I gave up and sold the card, they're STILL selling a 3.5GB (useable) graphics card as a 4GB graphics card (so they're a pack of liars as far as I'm concerned), they stripped out asynchronous compute engines in order to save power, and now that they've been caught with their pants down, they're using their money to try to insert GimpWorks into every title they can to deliberately sabotage Radeon cards, so they're a pack of desperate cheaters:

http://wccftech.com/nvidia-gameworks-visual-corruption-gears-war-ultimate-edition/

But the biggest joke is on nVidia owners, because of course, Kepler owners are the ones getting Gimped the most by GimpWorks:










If you think all that is irrelevant, be my guest and keep bending over for them.


----------



## AsRock (Mar 12, 2016)

Jism said:


> Yes, for now. But HBM is still the better product with low power, lower latency, and even future generations of HBM can still outperform GDDRX5 easily.
> 
> You woud'nt had a Fury X in small form factor if it was'nt for HBM. There woud'nt be room to stack 8 GDDRX5 chips around the chip anymore.


But if not much difference will be noticed, it's not worth it due to cost. People don't like spending $600+ on video cards, and to tell you the truth I don't need HBM; I also like to be able to cool my memory chips, not have them in a hot zone.


----------



## Frick (Mar 12, 2016)

rtwjunkie said:


> I'm not sure what your "um" is for, making it sound like I'm an idiot or something.



He must be from Arstechnica.

And wow, TWO games in which AMD are faster? Need moar data.


----------



## rtwjunkie (Mar 13, 2016)

anubis44 said:


> they're STILL selling a 3.5GB (useable) graphics card as a 4GB graphics card (so they're a pack of liars as far as I'm concerned)



Why do you effin' care? It's always the people who haven't used a 970 that get angry.  

And angry you are. Really, dude, you're going to die an early death of a stroke getting all worked up about things THAT DON'T MATTER AT ALL in the grand scheme of LIFE. There's enough real crap in this world to focus on.


----------



## ZoneDymo (Mar 13, 2016)

the54thvoid said:


> I know. Isn't it just awful. Imagine if AMD released something smaller, less powerful than Fury X, with lower circuit componentry and then took the AIO unit off. Imagine they did that and charged more for it...
> 
> Shock horror, both teams try to make a profit.  Deal with it.



Yeah, and with AMD being a much smaller company it makes sense.
Remember that massive market-share difference; it would certainly speak to Nvidia's image if they just said,
"You know what, let's not go for insane profit margins, just make a healthy amount of profit, and move technology forward by putting HBM2 memory on all our GTX cards."

But no, procrastinate for some more years please; we have the time, let it all go as slowly as possible and release as many incremental jumps in performance as possible, so as to make as much profit as possible off the backs of people.

I for one think we as a people should give the finger to those practices, but again, plenty will gladly bend over and even praise these tactics that they themselves are ultimately the victim of.

And yeah, I am dealing with it by pointing out these problems.
I feel all the people who use terms like "deal with it" just mean "accept it, bend over and let it happen" instead of actually dealing with it (which, again, is what I'm doing by protesting).


----------



## ZoneDymo (Mar 13, 2016)

rtwjunkie said:


> Why do you effin' care? It's always the people who haven't used a 970 that get angry.
> 
> And angry you are. Really, dude, you're going to die an early death of a stroke getting all worked up about things THAT DON'T MATTER AT ALL in the grand scheme of LIFE. There's enough real crap in this world to focus on.



Always love these silly "arguments".
If you go that far, then why does life even matter? Might as well kill yourself right now, man. And no, I don't mean that as an insult or to offend; if your mind really wanders that far, then yeah, life itself is pointless as well, so why go on?


----------



## rtwjunkie (Mar 13, 2016)

ZoneDymo said:


> Always love these silly "arguments"
> If you go that far then why does life even matter? might as well kill yourself right now man and no, I dont mean that as an insult or to offend, if your mind really wanders that far then yeah, life itself is pointless as well so why go on?



That's just it: I don't care. These things people get mad about pale in comparison to almost everything.  

I usually find that once people have dealt with death, disease, aging, raising children, etc., anger at a gaming hardware company takes a back seat and really means nothing.

That's a luxury only those who have not had to actually live life can afford.


----------



## ZoneDymo (Mar 13, 2016)

rtwjunkie said:


> That's just it. I don't care.  But these issues that people get actually mad about are pointless.



again, everything is ultimately pointless, life itself is pointless, if you go that far nothing matters ever which makes it a rather redundant argument.


----------



## rtwjunkie (Mar 13, 2016)

ZoneDymo said:


> again, everything is ultimately pointless, life itself is pointless, if you go that far nothing matters ever which makes it a rather redundant argument.



Read the rest of what I wrote. There are real things to worry about.


----------



## HumanSmoke (Mar 13, 2016)

ZoneDymo said:


> Always love these silly "arguments"
> If you go that far then why does life even matter? might as well kill yourself right now man and no, I dont mean that as an insult or to offend, if your mind really wanders that far then yeah, life itself is pointless as well so why go on?


10/10 for hyperbole. 
Life can, and often does, stretch across decades with a near-infinite variation of experiences. A graphics card for an enthusiast typically measures its lifespan in a year or less, and during that time the options for decent gaming might be counted on the fingers of one hand, between immature drivers, broken games, and short games split over months thanks to artificially partitioning a game into bite-sized DLCs.
I have one life, but I've lost count of the cards and generations of cards that have passed through my hands. Once upon a time, my 9700 PRO was manna from heaven; now it's just a fading memory, as are my 8800s, GTX 280s, HD 4890s, HD 5970, triple HD 5850s, GTX 580s, GTX 670s and a host of other cards I've owned since the Mach64 and S3-based Diamond Stealth days.

Apples and Oranges dude.


----------



## 64K (Mar 13, 2016)

ZoneDymo said:


> Always love these silly "arguments"
> If you go that far then why does life even matter? might as well kill yourself right now man and no, I dont mean that as an insult or to offend, if your mind really wanders that far then yeah, life itself is pointless as well so why go on?


----------



## PP Mguire (Mar 13, 2016)

anubis44 said:


> Nothing to bring to the table, eh? AOtS the 'only' example of async compute being enabled on a DX12 game, eh?
> 
> What's this?
> http://wccftech.com/hitman-pc-directx-12-benchmarks/
> ...


No, Maxwell has 32 queues compared to AMD's 128. Maxwell supports async compute but chokes when too many commands are sent to it, as has been explained in detail many times by reputable sources on the net.

Funny you link a site known for FUD and rumors. Even further, the benchmarks they show don't even match: how is it that the first set they showed has a 980 Ti (which beats the Fury X) running 76 fps in DX12 @ 1080p (it wins at all three resolutions, too) but the other site shows 68? Furthermore, the gap at 1440p for the Fury X is higher between the two sites. Who to trust? I'd say nobody. It's also been mentioned extensively that DX12 in the game is buggy anyway, as the ComputerBase site notes. Finally, the game is an AMD title and I'd expect it to run better on AMD cards anyway; of course, one set of benchmarks seems to show otherwise.

As for the 390X beating the crap out of the Titan X: yeah, I'd be willing to bet that, like every other site, they aren't letting the card boost. An EVGA Titan X SC will easily boost to 1350 by itself, which is close to the 980 Ti clock speed they used, which means the Titan X and 980 Ti would be on top of the charts for the first set of benches shown. So no, anybody worth their salt who owns a Titan X will see performance higher than a 980 Ti's, which would handily beat the 390X if the top set of charts is to be believed.

So you had an issue with two Nvidia setups and you want to spread BS? What happens when your precious AMD card dies? Are you going to make excuses to continue hating on Nvidia? Get a grip, dude. Your Toshiba laptop came out 11 years ago and you're still whining about it? And who the hell runs Surround on a 670, of all cards? Oh man, my 5770 CrossFire setup was screwed up even after three replacement cards because XFX kept giving me Rev B boards.
AMD drivers keep crashing when I test my 295X2 for performance figures for friends. Snap, I might as well hate on AMD, not consider their lineup this year against Pascal, and cry about it a decade later. 
The 970 has 4 GB of usable VRAM. If you actually OWNED one of these cards, you would know that all 4 GB is addressable and doesn't actually hamper performance like all the misinformation on the net suggests. Want to know how I know? I have one sitting right behind me and have tested it for this very reason. Maybe if you took off your green glasses and stopped the fanboyism, you'd see that your post here just sounds ridiculous.

Nvidia didn't bother putting a lot of effort into Maxwell regarding DX12 and DX12 features because they are meant to be DX11 1080p/1440p beasts, and that they are. Wait until Pascal is here, you'll see that very statement is beyond true.

The only thing that's a joke here is you and your post bud.


----------



## bpgt64 (Mar 13, 2016)

How does
"NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X."

translate to

"NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface"?

That's about the same as
"I could grow wings and fly"

translating to

"Man grows wings and flies".


----------



## ZoneDymo (Mar 13, 2016)

rtwjunkie said:


> Read the rest of what i wrote. There are real things to worry about.



I did read; where did you think my words came from?
Why do those other things matter? Why would you worry about those? Will anything in the universe change if the entire Earth blew up? Would it matter that anything changes in the first place?

If you take the discussion away from what memory is used on what video card to starving kids in Africa, then why not take it to an even more nonsensical, redundant point: life itself and everything.

Would anything be different if you were never born? Would it matter?

Obviously, if anyone had the choice between GDDR5X memory and HBM2, and GDDR5X would mean the starving children get fed, we would go with that, but it has NOTHING AT ALL TO DO WITH THE CURRENT DISCUSSION.
Hope that cleared some things up; now stop making silly non-arguments.



HumanSmoke said:


> 10/10 for hyperbole.
> Life can, and often does, stretch decades with a near infinite variation of experiences. A graphics card for an enthusiast typically measures its lifespan in a year or less. During that time the options in decent gaming might be measured on the fingers of one hand between immature drivers, broken games, short games split over months thanks to artificially partitioning a game into bite-sized DLC's.
> I have one life, but I've lost count of the cards and generations of cards that have passed through my hands. Once upon a time, my 9700 PRO was manna from heaven - now just a fading memory, as were my 8800's, GTX 280's, HD 4890's, HD 5970, triple HD 5850's, GTX 580's, GTX 670's and a host of other cards I've owned since Mach 64 and S3 based Diamond Stealth cards.
> 
> Apples and Oranges dude.



Ermm, I think you have to read a few steps back; it was the person I was replying to who made the ridiculous leap (hyperbole) from
"what memory is used on what video card"
to
"erh mah gerd, what about the poor children and dictators and cancer etc. etc., this all does not matter, people!!!".


----------



## rtwjunkie (Mar 13, 2016)

ZoneDymo said:


> I did read, where did you think my words came from?
> Why do those other things matter? why would you worry about those? will anything in the universe change if the entire earth would blow up? would it matter that something changes in the first place?
> 
> If you take the discussion away from what memory is used on what videocard to starving kids in Africa then why not take it to an even more nonsensical redundant point of life itself and anything.
> ...



No, you did not read. Reading comprehension is fundamental. What I said was: once you start working toward retirement, have people you know and love near death because of heart valve problems, deal with raising and providing for your children, work to pay your bills, and enjoy as well as deal with real life, then getting as worked up and angry as anubis44 is about a GPU company becomes laughable.

What a GPU company does or sells is nothing in the grand scheme of things that matter, and won't actually affect your life.

Nowhere did I go on about starving kids in Africa and whatnot. That was your stretch, not mine.


----------



## 64K (Mar 13, 2016)

You're funny @ZoneDymo I like you but please stop trolling this thread. Thanks.


----------



## HumanSmoke (Mar 13, 2016)

ZoneDymo said:


> Ermm I think you have to read a few steps back, it was the person I was replying to that made the ridiculous leap (hyperbole) from
> "what memory is used on what videocard"
> to
> "erh mah gerd what about the poor children and dictators and cancer etc etc this all does not matter people!!!".


Then you clearly have no understanding of what the word "hyperbole" actually means.
rtwjunkie simply said to another poster that they might want to put the business of the GTX 970 in particular, and Nvidia in general, into an appropriate context rather than railing against a piece of hardware in some OTT outpouring of anger.
You then decided to insert yourself into the conversation, attempting to undermine rtw's perfectly reasonable and measured stance by launching into some hyperbolic nonsense, and are now upping the derp ante by trying to play the persecution card.


----------



## PP Mguire (Mar 13, 2016)

HumanSmoke said:


> Then you clearly have no understanding of what the word "hyperbole" actually means.
> rtwjunkie simply said to another poster that they might want to put the business of the GTX 970 in particular, and Nvidia in general into an appropriate context rather than railing against a piece of hardware in some OTT outpouring of anger.
> You then decided to insert yourself into the conversation attempting to undermine a rtw's perfectly reasonable and measured stance by launching into some hyperbolic nonsense, and are now upping the derp ante by trying to play the persecution card.


Lold at upping the derp ante.


----------



## anubis44 (Mar 13, 2016)

rtwjunkie said:


> Why do you effin care? It's always the people that haven't used a 970 that get angry.
> 
> And angry you are. Really dude, you're going to die an early death of a stroke getting all worked up about things THAT DON'T MATTER AT ALL in the grand scheme of LIFE. There's enough real crap in this world to focus on.



Oh, don't worry, I'm not worked up about this stuff at all. I resolved not to buy nVidia products and left it at that. I just reply once in a while to people who seem so enamored with the company that it's sickening; never for one moment am I 'angry' about it. In fact, my revenge is not being 'angry', it's owning over 12,000 shares of AMD stock right now at an average cost of $2.00/share (but that's not why I'm criticizing nVidia; it's because I really despise how they operate).

And yes, as an MA graduate of poli-sci, I'm much more focused on things that really matter, like finally getting a proportional voting system in Canada, which actually looks like it can happen now.

As for this:

Frick: "And wow, TWO games in which AMD are faster? Need moar data."

OK, here's another one:

http://fudzilla.com/news/graphics/40084-amd-dooms-nvidia-in-benchmarks
http://techfrag.com/2016/02/18/amd-beats-nvidia-in-doom-alpha-benchmarks/

So in the new Doom, even the R9 280X (yes 280X, not 290X) is beating the GreedForce 980Ti. I guess when it gets up to TEN, you'll be saying "TEN games in which AMD are faster? Need moar data," and when it's a hundred, "A HUNDRED games in which AMD are faster? Need moar data," etc. etc.


----------



## rtwjunkie (Mar 13, 2016)

anubis44 said:


> And yes, as an MA graduate of poli-sci, I'm much more focused on things that really matter, like finally getting a proportional voting system in Canada, which actually looks like it can happen now.



I'm glad to hear it, and pleased to know it's not keeping you up at night.

Perhaps I misjudged your response, because it's really hard to go about despising any company. Having been a senior manager in a Fortune 500 company, I can tell you business is business, and they all pretty much operate the same: they all have the top goal of making as much money as they can.


----------



## anubis44 (Mar 13, 2016)

rtwjunkie said:


> I'm glad to hear and and pleased to know it's not keeping you up at night.
> 
> Perhaps I misjudged your response, because it's really hard to go about deapising any company.  Having been a senior manager in a Fortune 500 company, I can tell you, business is business, and they all pretty much operate the same. They all have the top goal of making as much money as they can.



Yes, I understand, and I acknowledge that many companies do things and get away with them. I may not even know about some of these transgressions. But there are certain ethical boundaries for me, and when a company crosses one of them, I won't support it. It's that simple. There's marketing, and then there's just simple lying. AMD has never sold a product as having X amount of memory when it really had Y amount of usable memory. It's straight-up dishonesty, and nVidia can only get away with it on a technicality, because most people don't understand the implications of 0.5 GB of the 4 only running at 1/7th the speed of the rest. There are plenty of games right now requiring a full 4 GB of video memory that should run fine on the GTX 970 and don't, and the owners of these cards are being told to 'man up', or that they're 'AMD fanboys', when really they bought these cards expecting that the '4GB' printed on the box actually meant 4 GB.
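For what it's worth, the 1/7th figure is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming the publicly disclosed GTX 970 layout (256-bit bus at 7 Gbps/pin, with seven of the eight 32-bit controllers serving the 3.5 GB segment and one serving the 0.5 GB segment); the split figures come from NVIDIA's public disclosure, not from this thread:

```python
# Rough check of the GTX 970 segmented-memory bandwidth claim.
# Assumed figures (public GTX 970 disclosure): 256-bit bus, 7 Gbps/pin GDDR5,
# split 224-bit (3.5 GB segment) + 32-bit (0.5 GB segment).

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * gbps_per_pin

full_card  = bandwidth_gb_s(256, 7)   # 224 GB/s on paper
fast_3_5gb = bandwidth_gb_s(224, 7)   # 196 GB/s for the 3.5 GB segment
slow_0_5gb = bandwidth_gb_s(32, 7)    # 28 GB/s for the 0.5 GB segment

print(full_card, fast_3_5gb, slow_0_5gb)  # 224.0 196.0 28.0
print(slow_0_5gb / fast_3_5gb)            # ~0.1429, i.e. roughly 1/7
```

So the "1/7th the speed" number in the post is just 28 GB/s versus 196 GB/s for the two segments.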

As for GameWorks, this is the very last straw for me. It's now so clearly a mere rear-guard action to slow down AMD's increasingly successful DX12 counter-offensive, because nVidia was obviously caught off-guard by its release and quick adoption. I understand it from a business perspective, but it's really like sabotaging another company's products. It's not just aggressive competition anymore; it's more like paying some guy to stick his foot out and trip a competitor during a race, or to slash a tire on a competitor's car during a pit stop. It's dirty and underhanded. And actually, it doesn't upset me; on the contrary, it reassures me that nVidia truly is a desperate company now, or else they wouldn't be risking such naked cheating and getting caught. Legions of tech nerds ARE going to find out, and when they do, there is a point where a majority of them will say the same thing I'm saying, and tell all the people who trust their tech advice not to buy nVidia products or support that company.

If you watched the 'Nvidia gameworks - game over for you' YouTube video I linked to, you'll see near the end of it that it's not just Radeon owners getting hosed by the sabotage nVidia's pulling right now (unnecessarily high tessellation that you can't even see, tessellated water that isn't even visible to the player, or PhysX integrated into game engines so you can't easily turn it off); it's previous-generation nVidia owners, too. nVidia is now gimping their OWN cards from just one generation back, just to sell more Maxwells, and as the author of that video points out, they're likely to deliberately gimp Maxwells too once Pascal is out, in order to accelerate adoption of Pascal. For me, this is no longer a 'fan-boy' issue; it's a simple self-respect issue. Nobody with self-respect, in my opinion, can watch and learn about nVidia's actions and still buy their products, knowing the underhanded lengths they'll go to just to squeeze a buck out of you.


----------



## Prima.Vera (Mar 13, 2016)

One other thing is certain in life, besides death and taxes: people love to derail forums and write shit.


----------



## PP Mguire (Mar 13, 2016)

anubis44 said:


> Oh, don't worry. I'm not worked up about this stuff at all. I resolved not to buy nVidia products, and left it at that. I just reply once in a while to people who seem so enamored with the company that it's sickening, never for one moment am I 'angry' about it. In fact, my revenge is not being 'angry', it's owning over 12,000 shares of AMD stock right now at an average cost of $2.00/share , (but that's not why I'm criticizing nVidia. It's because I really despise how they operate).
> 
> And yes, as an MA graduate of poli-sci, I'm much more focused on things that really matter, like finally getting a proportional voting system in Canada, which actually looks like it can happen now.
> 
> ...


Lol, you post the worst benchmarks. Dude, at 1080p they show a 280X beating all of the cards with a 61 fps average when the game looks like it's capped at 60 fps.

Considering these were posted in the middle of last month and Nvidia just released a preliminary Vulkan driver, I'd be willing to bet AMD is using Vulkan and Nvidia is using OpenGL 4.4. The increased memory usage for Nvidia over AMD seems to indicate this.

So we have two alphas and a game with reported issues on DX12. How about actually waiting for matured builds to be benchmarked by proper sites that know what they're doing, instead of sites creating clickbait articles for fanboys to argue over?

And I see the new post is still at it with the 4GB crap? What games don't run fine on the 970 because of the supposed memory issue? I will literally fire up my 970 and test them, because I bet they run fine. My roomie plays all AAA titles with no issues @ 1080p on one mated to a 4690K.

If you want to crap on Nvidia for supposedly dishonest behavior and unethical business practices, then why are you using an Intel CPU? They have been caught red-handed doing downright dirty shit next to which this inflated-RAM debacle is moot. Sounds like a bunch of excuses to me, if I'm honest.

Going on about GameWorks? You realize AMD has its own version of GameWorks, right? You realize that both companies do the same thing to market their cards by pairing up with devs for AAA titles, right? You realize that both companies are doing what they need to do to have an edge and sell product, right? Sheesh.
Nvidia is not desperate at all. They've had the majority of the market in their hands thanks to having simply faster cards for a couple of generations, which is why they can sell midrange chips at full price and then make more money by releasing the big chips later. My 980s creamed my buddy's 290Xs in everything, and we have identical CPUs. I now have Titans, he has Fury X Crossfire, and I still have the upper hand. If AMD had any upper hand I wouldn't be using Nvidia cards, that's for sure, but they don't, and I have a good feeling it'll continue to be this way for the next generation as well. You can rant and rave to try to justify your distaste for Nvidia, but to the majority of us it's just blatant bashing for no reason. If you don't want to consume some logic and put aside your decade-old hate for the company, then at least give the rest of us some peace and don't derail an Nvidia thread while continuing to game happily on your AMD graphics cards.


----------



## anubis44 (Mar 13, 2016)

PP Mguire said:


> And I see the new post and still at it with the 4GB crap? What games don't run fine on the 970 because of the supposed memory issue? I will literally fire up my 970 and test them because I bet they run fine. My roomie plays all AAA titles with no issues @ 1080p on it mated to a 4690k.
> 
> If you want to crapshoot Nvidia for supposed dishonest behavior and unethical business practices then why are you using an Intel CPU? They have been caught red handed doing down right dirty shit which is moot in comparison to this inflated RAM debacle. Sounds like a bunch of excuses to me if I'm honest.
> 
> ...



I'm currently using an Intel CPU because, as I explained in a much older thread, I decided to confirm for myself the reports of 'much faster performance' than my FX-8350 system in many games. So I loaded up on some Gravol, held my nose (it was the first Intel CPU I'd bought since my Celeron 300A that overclocked to 450 MHz back in 1998, after all), bought an i5 4690K on deep discount for $229.00 Canadian, and built a system around it. It turned out to be exactly 3 FPS faster than the FX-8350 in my favourite game, Company of Heroes 2, at the definitely-not-GPU-bound resolution of 1680x1050, but I had already given my FX-8350 system to my girlfriend, so I'm stuck with this piece of crap. Believe me, when Zen comes out, I'll be tearing out this i5 4690K and donating it to a relative faster than you can say 'Intel caught cheating, gets slap on the wrist'.

As for the 'inflated RAM' debacle: if nVidia didn't really do anything wrong, why did the CEO pretend to apologize (http://www.technobuffalo.com/2015/02/25/nvidias-ceo-apologizes-for-gtx-970-memory-controversy/) and try to reassure everybody that it 'won't happen again', yet allow board partners to continue doing it? That doesn't sound kosher to me. 'I'm so sorry I did this unethical thing that I'm going to continue doing it for as long as I can'?

As for 'ranting and raving', I think I've maintained a civil level of decorum, and backed up everything I've said with an explanation from personal experience or references, so I hardly think it's just 'blatant bashing for no reason.' As for giving you peace so you can continue to be deceived by your beloved company, absolutely. I've said my piece here, and will stop 'derailing' the thread. Looking forward to seeing you with a Radeon in your system specs sometime soon.


----------



## alwayssts (Mar 13, 2016)

I think y'all pretty much agree with me on the direction things are going.

Fury X had 512 GB/s of bandwidth (4 stacks x 128 GB/s, 1 GB each)... and Polaris 11 will probably have 512 GB/s as well (2 stacks x 256 GB/s, 4 GB each). If you go by straight compute, that's good for a Fury X at up to 1110-1120 MHz, around 1200 MHz if you figure in where memory compression is applicable. While the cache structure and compression could change, let's assume for a second they don't, at least not substantially. I think it's somewhat fair to assume 14 nm can probably clock up to ~1200 MHz supremely efficiently and top out around 1400 MHz or so. It's pretty obvious that if you're AMD you're essentially shrinking Fiji one way or another, with or without subtracting 4/8 compute units (which should add to overall efficiency) and raising clock speed to compensate (or not). Given they probably want to use it in the mobile space, and these parts should be all about staying within 225 W at most (probably with a buffer at stock, let's say 200 W... gotta make sure a 375 W x2 will work, and perhaps a super-efficient voltage/clock point can fit under 150 W), I'm inclined to believe that gives them wiggle room to opt for fewer units and to pump up (at least the potential of) the clock, even if raising the clock is half as efficient as building those (mostly redundant) transistors in.

For instance, they could do something like 3584 units @ 1200 MHz, which for all intents and purposes should be similar overall to a Fury X in compute but much more efficient and easier to yield, faster than a stock 980 Ti (which is what, essentially 3520 units at 11xx MHz?), and could potentially clock higher to tap out the max bandwidth (perhaps compete with or beat an overclocked 980 Ti). I'm not ruling out 3840 units and/or higher stock clocks either, or a lower-end part tackling the 980 Ti, perhaps specifically to put a nail in GM200. Let's not forget there is 1600 MHz HBM2 coming as well, which fits pretty darn well with product differentiation (80%), competing with GM200, if not a perfect spot to try to stay under 150 W...
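The bandwidth figures above follow from simple arithmetic: peak bandwidth is bus width (in bytes) times per-pin data rate. A minimal sketch; the Fury X numbers are its public specs (four 1024-bit HBM1 stacks at 1 Gbps/pin), while the two-stack HBM2 config is the speculation in the post, not a confirmed Polaris spec:

```python
# Peak memory bandwidth = bus width in bytes x per-pin data rate (Gbps).
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

# Fury X: 4 HBM1 stacks, 1024 bits each, 1 Gbps/pin (public spec).
fury_x = bandwidth_gb_s(4 * 1024, 1.0)         # 512.0 GB/s

# Speculative config from the post: 2 HBM2 stacks at 2 Gbps/pin.
polaris_guess = bandwidth_gb_s(2 * 1024, 2.0)  # 512.0 GB/s as well

print(fury_x, polaris_guess)
```

Halving the stack count while doubling the per-pin rate leaves the total unchanged, which is the equivalence the post leans on.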

Does this whole thing sound familiar?

It would be like an optimal mix of when AMD shrank a 420 mm² bear of a chip down to 192 mm² (*R600 -> RV670: a 2.1875x smaller chip on a 2x smaller process, or a net efficiency gain to the arch of 10% die space*) and then cranked the shit out of the voltage (that chip ran at 1.3+ V) to clock/yield it decently while selling it for peanuts, mixed with that other time their 256 mm² chip (using a new memory standard, GDDR5) was clocked to put a nail in the former high-end from nvidia (G92b) and gave the affordable version of nvidia's big-ass chip (GT200) a run for its money... all while being good enough to hit certain performance levels that kept people content at an affordable price. You know the bolded sentence above? Well, 14 nm should be closer to 2.1-2.32x smaller, and AMD has said their improvements to the arch account for ~30% of the efficiency improvement (which, when added together, starts to look a lot like _the amount brighter Polaris looks compared to when it was observed blah blah blah_). While that's probably accounting for the shrink (i.e. the net efficiency is divided by the shrink), that's still a similar if not greater efficiency improvement in the arch than RV670... somewhere to the tune of ~13-15%. Just throwing it out there, but 4096/3584 is also ~14%... and surely having half the memory controllers (even if faster ones) amounts to something.


As for nvidia:

Guys... do you know that with the way Maxwell is set up it essentially requires 1/3 less bandwidth than AMD (not counting the compression of Tonga/Fiji, which lowers it to around 25% or slightly less) due to its cache/SFU setup? It's true. That alone should pretty much offset any worries between AMD's 512 GB/s and whatever nvidia comes to bat with, assuming they can at least muster 13000 MHz GDDR5X (6500 MHz x 2). Given Micron has said they have exactly that in the labs (coincidence, I'm sure; not at all theoretically a requirement imposed by nvidia), I wouldn't be too worried. While we'd all love for nvidia to come to bat with a higher compute ratio (say 224-240 SPs per module instead of Kepler's 192 or Maxwell's 128), there's no saying they won't... or that they won't simply up the ratio of cache to supplement the design. It's gonna be fine.
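Taking the post's premises at face value, the arithmetic does work out. A rough sketch; both the "needs ~1/3 less bandwidth" factor and the 256-bit bus width are the poster's assumptions, not confirmed GP104 specs:

```python
# Back-of-envelope check of the "13 Gbps GDDR5X is enough" argument.
AMD_RAW_BW = 512.0  # GB/s, the HBM figure discussed in this thread

# Premise from the post: a Maxwell-style design needs ~1/3 less bandwidth
# for equivalent performance, so parity requires only two thirds of 512.
equivalent_need = AMD_RAW_BW * (2 / 3)  # ~341.3 GB/s

# Assumed GP104 config: 256-bit bus at 13 Gbps/pin GDDR5X.
gddr5x_13gbps = 256 / 8 * 13            # 416.0 GB/s

print(equivalent_need, gddr5x_13gbps, gddr5x_13gbps >= equivalent_need)
```

Under those assumptions, 416 GB/s clears the ~341 GB/s equivalence point with room to spare, which is the post's point.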

_At the end of the day,_

I have no idea who's going to perform better in any situation, but I wouldn't be surprised if both designs are fairly similar (and optimized versions of their former high-ends). My gut (and yours, probably) says nvidia wins the pure gaming metrics. I also don't know who will be cheaper, but my gut (and yours, probably) says AMD. Still, if both can achieve 4k30 in mostly everything... and can overclock to 4k60 in the titles we expect they should... does it really matter? Each will surely have its strengths, be it compute features, pure gaming performance, price, etc., but I think we're in for one hell of a fight.

I'm just happy one company, let alone both, is targeting <225 W, 8 GB, and probably short, easily cooled cards. That's what I want, and I'm fairly certain what the community needs, especially at an affordable price. While I feel for the guys who want 4k60 at all costs (and I'm one of them)... hey, there's always multi-GPU... and at least now it will make sense (given that a game rarely uses over an 8 GB frame buffer, which can't be said for 6 GB, let alone 4 GB, even if it's fast enough to swap textures out quickly).


----------



## sweet (Mar 13, 2016)

PP Mguire said:


> Going on about Gameworks? You realize AMD has it's own version of Gameworks right?



Actually, AMD doesn't have any black-box features like GameWorks. Their proposed features are mostly open source.


----------



## RejZoR (Mar 13, 2016)

Wasn't it initially said that GDDR5X doesn't need new controllers, unlike HBM? That's why GDDR5X is more competitive while still doubling (or whatever the boost was) the performance of regular GDDR5.


----------



## t88powered (Mar 13, 2016)

The biggest question I look forward to having answered is whether the 16 nm silicon will clock like the third-generation 28 nm did.

If the new GTX _80's performance is comparable to the 980 Ti's, and the GTX _80 core tops out at 1200-1300 MHz vs. 1500-1600 MHz on the 980 Ti's core, then we are not talking about much gain per dollar, seeing that Nvidia will likely charge $550 for the GTX _80 at release.

That said, hopefully we see at least a 20-25% advantage over comparable cards and awesome clocking potential again. It would also be sick to have a VRM capable of 400+ watts with a chip that draws 150 watts at stock speeds, but I'm sure they will lock down and skimp on the reference design as always.


----------



## the54thvoid (Mar 13, 2016)

anubis44 said:


> I'm currently using an Intel CPU because, as I explained in a much older thread, I decided to confirm for myself reports of 'much faster performance' than my FX-8350 system in many games. So I loaded up on some gravol, held my nose (it was the first Intel CPU I'd bought since my Celeron 300a that overclocked to 450MHz back in 1998, after all), and bought an i5 4690K on deep discount for $229.00 Canadian, and built a system around it. It turned out to be exactly 3FPS faster than the FX-8350 in my favourite game, Company of Heroes 2, at the definitely not GPU bound resolution of 1680x1050, but I had already given my FX-8350 system to my girlfriend, so I'm stuck with this piece of crap.  Believe me, when Zen comes out, I'll be tearing out this i5 4690K and donating it to a relative faster than you can say 'Intel caught cheating, gets slap on the wrist'.
> 
> As for the 'inflated ram' debacle, if nVidia didn't really do anything wrong, why did the CEO pretend to apologize (http://www.technobuffalo.com/2015/02/25/nvidias-ceo-apologizes-for-gtx-970-memory-controversy/), and try to reassure everybody that it 'won't happen again', and yet allow board partners to continue doing it? That doesn't sound kosher to me? I'm so sorry I did this unethical thing that I'm going to continue doing it for as long as I can?
> 
> As for 'ranting and raving', I think I've maintained a civil level of decorum, and backed up everything I've said with an explanation from personal experience or references, so I hardly think it's just 'blatant bashing for no reason.' As for giving you peace so you can continue to be deceived by your beloved company, absolutely. I've said my piece here, and will stop 'derailing' the thread. Looking forward to seeing you with a Radeon in your system specs sometime soon.



I can understand your position. I will not buy Apple products because I disagree with their business policy and marketing smoke and mirrors.
But as a hobbyist PC builder, I will buy what is best for the job, and last year that was Nvidia's 980 Ti. I held off for the release of the Fury X, as the hype promised so much, but it failed to 'blow me away'. So I bought my current card after that disappointing release.

If AMD RTG is on the ascendancy (and I wish they'd been broken off as ATI), I will again buy whatever is best going forward. What would benefit all of us is seeing the Pascal architecture in a GP104 card. There is no doubt the benchmark test suites in use will show Nvidia's strengths and weaknesses. If they haven't evolved their warp schedulers to deal with more queues, then we'll know about it.

As long as AMD allows their next card to have AIB versions, I'll be happy to buy.


----------



## HumanSmoke (Mar 13, 2016)

RejZoR said:


> Wasn't it initially said that GDDR5X doesn't need new controllers unlike HBM ? That's why GDDR5X is more competitive while still doubling (or what was the boost anyway) the performance of regular GDDR5.


The memory controllers need minor revision (almost certainly done by now): the pinout is increased (190 pins per IC compared with 170 for standard GDDR5 chips) to allow for the doubled data rate per cycle, and the prefetch architecture is likewise doubled. The IMC logic blocks are very likely similar enough to be interchanged readily with the rest of the GPU. The plus side is that the chips are supposed to operate in the 2-2.5 W range (so 16-20 W for 8x1 GB chips) and offer the quad data rate. The downside is that due to the increased pinout, trace layout for the PCB will be a little more complex and assembly will be a little more exacting; the GDDR5X chips are also slightly smaller (14x10 mm rather than 14x12 mm for GDDR5), so the higher pin count means smaller solder balls placed closer together.
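Those per-chip figures translate into board-level numbers straightforwardly. A sketch under the assumptions above (32-bit chip interfaces, 10 Gbps/pin first-generation GDDR5X, 2-2.5 W per chip); the 8-chip, 256-bit card is a hypothetical configuration, not a confirmed product:

```python
# Board-level GDDR5X arithmetic for a hypothetical 8-chip, 256-bit card.
def chip_bandwidth_gb_s(data_pins: int, gbps_per_pin: float) -> float:
    """Per-chip bandwidth: data pins / 8 (bits -> bytes) x data rate."""
    return data_pins / 8 * gbps_per_pin

gddr5_chip  = chip_bandwidth_gb_s(32, 7.0)   # 28 GB/s per 7 Gbps GDDR5 chip
gddr5x_chip = chip_bandwidth_gb_s(32, 10.0)  # 40 GB/s per 10 Gbps GDDR5X chip

board_bw = 8 * gddr5x_chip     # 320 GB/s total across a 256-bit bus
power_w  = (8 * 2.0, 8 * 2.5)  # 16-20 W for eight 1 GB chips (per-chip range above)

print(gddr5_chip, gddr5x_chip, board_bw, power_w)
```

So even the launch-speed 10 Gbps grade gives a ~43% bandwidth bump over 7 Gbps GDDR5 on the same bus width, within a modest power envelope.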


the54thvoid said:


> I can understand your position. I will not buy Apple products because I disagree with their business policy and marketing smoke and mirrors.


I tend to also *try* to boycott some companies based on how they operate. Apple is one I can happily avoid, but the true predators of the tech world are quite hard to stay sanitized from. Samsung is about the most despicable conglomerate on the face of the earth - bribery, extortion, kickbacks, price fixing, bid rigging - they pretty much tick every box for scumbag practices, but it can be bloody difficult to boycott a company that is OEM/ODM for so many third party products. The same can be said for Qualcomm and LG (anubis44's panel of choice) and a host of other companies whose tentacles wind through a huge volume of products not directly carrying their brand.


----------



## matar (Mar 13, 2016)

I can't wait. If they sell the GTX 1070 for $350, I will buy two for SLI rather than one GTX 1080 for $550.


----------



## Kaotik (Mar 13, 2016)

The problem is, GDDR5X manufacturing won't start until late summer; Micron has mentioned August before. There's no way for NVIDIA to launch a GDDR5X product in April or May.


----------



## ZoneDymo (Mar 13, 2016)

rtwjunkie said:


> No you did not read. Reading comprehension is fundamental.  What I said was, once you start working for retirement, have people you know and love near death because of heart valve problems, deal with raising and providing for your children, work to pay your bills and enjoy as well as deal with real life, then getting as worked up and angry as anubis44 is about a GPU company becomes laughable.
> 
> What a GPU company does or sells is nothing in the grand scheme of things that matter, and won't actually affect your life.
> 
> Nowhere did I go on about starving kids in Africa and whatnot. That was your stretch, not mine.



Ermm, so you are contradicting yourself: you say I did not read, yet then bring up reading comprehension... aka reading but not understanding. Good job.

The starving kids in Africa is exactly the same joking stretch you made with your heart valve problems etc.; it literally has nothing to do with the conversation and what to get worked up about. Honestly, how you cannot see this is beyond me.

We are talking about GPUs and you start in about retirement... well, do I really have to repeat it? You put all that irrelevant information right on display...
It's again a non-argument.

And wait... starving children in Africa is a stretch, but "dealing with raising and providing for your children" is not? It's basically the same issue, except with a little less selfishness (i.e. your children above other children) involved.



HumanSmoke said:


> Then you clearly have no understanding of what the word "hyperbole" actually means.
> rtwjunkie simply said to another poster that they might want to put the business of the GTX 970 in particular, and Nvidia in general into an appropriate context rather than railing against a piece of hardware in some OTT outpouring of anger.
> You then decided to insert yourself into the conversation attempting to undermine a rtw's perfectly reasonable and measured stance by launching into some hyperbolic nonsense, and are now upping the derp ante by trying to play the persecution card.



In the context of PC hardware being discussed on a PC hardware forum, you mean?
Yeah... don't know what he was thinking...
Totally, this is the place we should talk about starving children, cancer, etc. etc., what is important in life, like life itself... yep, seems just about right.
Honestly, how you cannot see that what rtwjunkie said is exactly the opposite of putting things in context, aka taking them OUT of context, is beyond me.
My remark about life itself takes the out-of-context to a further extreme to illustrate how much of a non-argument it really is.
And if that is too much to understand then I'm sorry; I really cannot see how I can possibly make it any clearer.


----------



## HumanSmoke (Mar 13, 2016)

Kaotik said:


> the problem is, gddr5x manufacturing won't start until late summer, Micron has mentioned august earlier. There's no way for nvidia to launch gddr5x product in april or may


That is incorrect. Micron announced that *mass* production wouldn't start until the summer. That does not mean that chips won't be available during the ramp to volume production and Micron have confirmed as much and have had a test, evaluation, and verification program going for some time:


> Micron’s GDDR5X program is in full swing and first components have already completed manufacturing. We plan to hit mass production this summer.


Silicon programs don't ramp to volume instantaneously. As I mentioned earlier, Hynix announced mass production of HBM just a week before the Fury X launch. Is it really feasible that the whole Fiji GPU package, card assembly, packaging, and shipping were all accomplished in a week?


----------



## PP Mguire (Mar 13, 2016)

anubis44 said:


> I'm currently using an Intel CPU because, as I explained in a much older thread, I decided to confirm for myself reports of 'much faster performance' than my FX-8350 system in many games. So I loaded up on some gravol, held my nose (it was the first Intel CPU I'd bought since my Celeron 300a that overclocked to 450MHz back in 1998, after all), and bought an i5 4690K on deep discount for $229.00 Canadian, and built a system around it. It turned out to be exactly 3FPS faster than the FX-8350 in my favourite game, Company of Heroes 2, at the definitely not GPU bound resolution of 1680x1050, but I had already given my FX-8350 system to my girlfriend, so I'm stuck with this piece of crap.  Believe me, when Zen comes out, I'll be tearing out this i5 4690K and donating it to a relative faster than you can say 'Intel caught cheating, gets slap on the wrist'.
> 
> As for the 'inflated ram' debacle, if nVidia didn't really do anything wrong, why did the CEO pretend to apologize (http://www.technobuffalo.com/2015/02/25/nvidias-ceo-apologizes-for-gtx-970-memory-controversy/), and try to reassure everybody that it 'won't happen again', and yet allow board partners to continue doing it? That doesn't sound kosher to me? I'm so sorry I did this unethical thing that I'm going to continue doing it for as long as I can?
> 
> As for 'ranting and raving', I think I've maintained a civil level of decorum, and backed up everything I've said with an explanation from personal experience or references, so I hardly think it's just 'blatant bashing for no reason.' As for giving you peace so you can continue to be deceived by your beloved company, absolutely. I've said my piece here, and will stop 'derailing' the thread. Looking forward to seeing you with a Radeon in your system specs sometime soon.


So you're basing your experience on one game that isn't demanding at all? The minimum requirements for that game are based around Core 2. It definitely didn't hit machines as hard as the first one did when it came out. It was rather docile compared to the tech we had available 3 years ago. The 4690K is basically better than the 8350 at practically anything besides a few multi-threaded instances. Even the benchmarks *you* posted showed DX12 carrying the FX line because their single-threaded performance is so low. Idk how many people I've built rigs for who were coming from an FX 8*** or 6*** and said it was a night-and-day difference, but they actually go in with an unbiased view and play a wide variety of games. Of course, if you want to continue chugging along on the hate train, be my guest.

JHH publicly apologizing was a professional courtesy that wasn't necessary at all. In fact, nobody got cheated, and there is a very technical reason why the last half of the RAM can't be addressed at full speed when the rest is occupied. Actually, it isn't that technical at all, and anybody with limited hardware knowledge can understand why it happens. The real fact of the matter is that it doesn't actually hurt performance, and I've done extensive testing to prove this, as did renowned tech sites. I call it an inflated debacle simply because one dude ran some code, said a thing, and then everybody went spreading a metric fuck-ton of misinformation, which in turn caused JHH to apologize like that. In fact, if Nvidia were such an evil company as you believe, he wouldn't have bothered, and the company as a whole would have ignored it completely without a care. People keep gobbling the cards up because of the above fact: they run great and are great little 1080p beasts. If the last half of VRAM were such an issue, it wouldn't be the dominating card on Steam and wouldn't have the market presence it does. In fact, it wouldn't be one of the most sought-after cards of the generation for gaming.
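For what it's worth, the segmentation described above comes down to bus-width arithmetic. A back-of-envelope sketch, assuming the publicly reported GTX 970 figures (a 224-bit path to the 3.5 GB partition, a 32-bit path to the 0.5 GB partition, 7 Gbps GDDR5); the `bandwidth_gbs` helper is illustrative, not any vendor tool:

```python
# Peak theoretical bandwidth of each GTX 970 memory segment,
# from bus width (bits) and per-pin data rate (Gbps).

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (per-pin rate)."""
    return bus_width_bits / 8 * data_rate_gbps

fast = bandwidth_gbs(224, 7.0)  # 3.5 GB segment -> 196.0 GB/s
slow = bandwidth_gbs(32, 7.0)   # 0.5 GB segment -> 28.0 GB/s

print(f"fast segment: {fast:.0f} GB/s")          # 196 GB/s
print(f"slow segment: {slow:.0f} GB/s")          # 28 GB/s
print(f"slow/fast ratio: {slow / fast:.4f}")     # ~0.1429, i.e. 1/7th
```

This is also where the oft-repeated "1/7th normal speed" figure for the last 512 MB comes from: 28 / 196 = 1/7.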

Sure, you're way more civil than I've seen, and I'll give you that, but it doesn't really make my statement any less true. You had 2 bad experiences and want to go off on your own stance, like a bitter old woman (not a personal attack, just how it looks), that Nvidia is the most evil company out there, when really they're not. They didn't steal your sweet roll; they're a business out there to make money, just like AMD. I'd say Samsung is by far worse than Nvidia when it comes to it, but how many Samsung products do you own? (That's rhetorical, btw.) I wouldn't say Nvidia is my 'beloved' company; I just go for the absolute best performance in graphics. I don't have an AMD product in my specs, but I do own a lot of AMD product, the most recent being the 390X. I don't own a Fury card because my bud has 2 of them and I can borrow them when I want to for testing. I have no problem using AMD product if they have something superior, though, because I go for whoever has the best performance and ignore the politically correct bullshit. I don't let bad experiences hinder my purchasing cycle unless it's something to do with CS and RMA (like Gigabyte). I don't go on every Gigabyte thread bashing their product, though; I just simply don't buy their stuff and leave it at that.



sweet said:


> Actually AMD doesn't have any black-box features like Gamesworks. Their proposed features are mostly open source.


No, they don't have proprietary tech, but I do know there are some instances where some of the stuff they come out with runs obviously, and probably purposely, worse on Nvidia cards (I clearly remember TressFX's initial release). It's really no different. Most of the enhanced technology that Nvidia offers in their Gameworks program doesn't even get integrated into games, even though a LOT of it is awesome. I've never seen Grassworks, Waveworks, or a few others, simply because they are game changers and developers don't want to segregate their gaming experience to one party. I know of two titles that were supposed to implement these features, but I haven't seen them see the light of day. It is a well-known fact that AMD still collaborates with developers, just like Nvidia does on certain games, to push their brand; it's literally the same thing. One such example was linked in this very thread: Hitman. AoS is another well-known example, and typically AMD cards show better performance figures than Nvidia does on such titles because the games are initially coded to run better on them. I guess this is where PhysX is naturally going to be the counter-argument, but on AMD-based rigs PhysX is downgraded and run on the CPU, and in most cases it can be turned off, either by an in-game setting or otherwise. The whole deal is exceptionally blown out of proportion, but in all reality, if it all weren't so split, things like Grassworks would be awesome. Idk how many times I've come across grassy segments in a game and thought to myself that this would make my experience 10x better if it actually looked realistic, instead of a bunch of flat sprites tossed together to form foliage. Being able to walk on a grassy field that actually flattens and interacts with the character is just a minute detail that people overlook but could be very awesome. Of course, most would argue against it simply to argue, with the reason being "burn Nvidia, down with the evil Gameworks!", whatever.


----------



## StefanM (Mar 13, 2016)

FYI: the whitepaper from last November reads HBM


----------



## the54thvoid (Mar 13, 2016)

StefanM said:


> FYI: the whitepaper from last November reads HBM
> 
> View attachment 72851



A white paper is not the finished article. If a selected component isn't ready to be utilised, they may use an alternative, i.e. GDDR5X.
Besides, the top tier will probably still be using it, so it's erroneous to cast HBM as the memory of all Pascal cards. They could even keep it for the compute part only.
I mean, smartphones sometimes contain different chips despite being the same model. If the advertised performance is delivered, the internal hardware is actually irrelevant.

Edit: NVLink won't be on mainstream desktop either; it's for the compute cards.


----------



## Ferrum Master (Mar 13, 2016)

Uh, I missed such a shitstorm 

It is all speculation, 5X or not; the real benchmarks on real production silicon will show how it really works. HBM or not doesn't really matter, as long as it works. And I haven't seen any info on bus width for these products either.

@Soalris17 I think the 980 Ti is, and will remain, a fine card... anything more powerful that would justify the upgrade cost still won't really be seen till late Q3, and that's a hell of a lot of time actually, so enjoy it...

As buyers, we will still wait till both camps release Polaris and Pascal, and then decide what the best choice is.


----------



## EEQPCyblerr (Mar 13, 2016)

Just fu*k that NVidiot, and fu*k async. I want HBM2, not GDDR5FxCK. If Polaris uses HBM, take my money, AMD!!!
And why do people talk about async? No game supports that for now, and it's just a small performance gain for FPS or TPS games.


----------



## rruff (Mar 13, 2016)

Ferrum Master said:


> It is all speculation 5X or not, the real benchmarks on real production silicon will show how really works. HBM yes or not doesn't really matter as long it works. And I haven't seen any info on bus width for these products too.



No kidding. Spec numbers are for spec junkies. Fury has HBM but that didn't make it better than a GTX 980 with lowly GDDR5. And the 980 destroys it in FPS/W.


----------



## the54thvoid (Mar 13, 2016)

EEQPCyblerr said:


> Just Fu*k that NVidiot Fu*k asyc , I want HBM 2, not a GDDR5FxCK, if polaris use HBM, take my money AMD !!!
> and why people talk about asyc, none game support that for now, and just small performance for FPS or TPS game



Hmm. Language can get heated on TPU, but how about you dial back the tone and talk in a more civil manner? Manners are what separate PC from consoles.


----------



## dj-electric (Mar 13, 2016)

EEQPCyblerr said:


> Just Fu*k that NVidiot Fu*k asyc , I want HBM 2, not a GDDR5FxCK, if polaris use HBM, take my money AMD !!!
> and why people talk about asyc, none game support that for now, and just small performance for FPS or TPS game



"I don't want ABC, I want XYZ because they told me it's better" type stuff here.
Gratz, corporate BS actually got through your thick skull and made you believe this stuff.

Ask your GTX 980 Ti-using friends how miserable they are that they don't have the HBM that the Fury X uses.


----------



## UnCaged (Mar 13, 2016)

I was going to wait until May to build my new system, but it doesn't sound like Nvidia will have any new cards ready for release until the end of the year.

Any thoughts on whether new motherboards will need to be manufactured to support the new GPU technologies?


----------



## medi01 (Mar 13, 2016)

bpgt64 said:


> How does
> "NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X. "
> 
> Translate to;
> ...



Yep, "news" makes no sense whatsoever, but nobody cares (and hardly anyone notices), it seems.
Probably that's a hidden announcement that says: "let GFX warz startorz" on this site.
Startorz they have.




UnCaged said:


> Any thoughts on if new motherboards will need to be manufactured to support the new gpu technologies?


No, certainly not.


----------



## FreedomEclipse (Mar 13, 2016)

With this in mind, I wonder if it's still worth getting a 980 Ti.


----------



## medi01 (Mar 13, 2016)

http://anandtech.com/show/10136/discussing-the-state-of-directx-12-with-microsoft-oxide-games


----------



## the54thvoid (Mar 13, 2016)

FreedomEclipse said:


> with this in mind. Is it still worth getting a 980ti i wonder



Honestly, if you're looking forward to W10 DX12 gaming, not so much, but you still get DX11 goodness for quite a while. Despite my defence of my 980 Ti, going forward, if I had to buy a card today, it'd be a hard call, but I'd probably buy a Fury X.

As for the AnandTech article, it's a bit meh... And I've lost a lot of faith in Appletech, I mean, AnandTech.


----------



## truth teller (Mar 13, 2016)

StefanM said:


> FYI: the whitepaper from last November reads HBM


the paper, and even the presentation, for Drive PX2 also mention Pascal, and yet we got Maxwell.
you _can't_ trust that shit, especially given nfooledya's track record

they _will_ use GDDR5X for mid/low end, and that's fine; HBM doesn't even make sense here, since the GPU doesn't have enough fillrate to saturate the bus (the red camp will do it for sure too). this has been known for quite a while, yet people like to speculate this and that


----------



## UnCaged (Mar 13, 2016)

the54thvoid said:


> Honestly, if you're looking forward to W10 DX12 gaming - not so much but you still get DX11 goodiness for quite a while.  Despite my defence of my 980ti - going forward, if I had to buy a card today, it'd be a hard call but I'd probably buy a Fury X.
> 
> As for the Anandtech article - it's a bit meh....  And I've lost alot of faith in Appletech, I mean, Anandtech.



The problem with getting an AMD/ATI GPU has always been the crappy drivers. How is it that, after AMD took over ATI, they still suffer from bad driver support? I used to be a fan of AMD years ago, when I felt they offered real value, especially on the gaming side, but I had one too many issues after they took over ATI, and ever since Intel came out with the 2600K, AMD no longer appealed to me.


----------



## FreedomEclipse (Mar 14, 2016)

UnCaged said:


> The problem with getting an AMD/ATI gpu has always been the crappy drivers and how is it after AMD takes over ATI they still suffer from bad  driver support? I used to be a fan of AMD years ago when I felt they offered real value especially on the gaming side but I had one to many issues after they took over ATI and it seems after Intel came out with the 2600k AMD no longer was such a good deal



If you run a Crossfire setup, then you have a valid complaint. Crossfire performance is still blighted by bad drivers; it's been like this for years, and it's one of the reasons I moved away from AMD GPUs as a whole. 

However, if you only run a single GPU, then I advise you to stop living in the past.


This thread is also an Nvidia thread, btw, so please quit with the AMD bashing, as it's not relevant here.


----------



## Naito (Mar 14, 2016)

It's ludicrous that some people complaining here expect to see HBM2 in low- to mid-tier SKUs. How many of these people were even genuinely interested in buying Nvidia, but are now upset with this alleged GDDR5X implementation? Furthermore, how many of those who are upset have a 2160p screen, where bandwidth actually begins to mean something? Besides, core design plays a bigger role in performance, in my opinion; overclocking memory to increase bandwidth provides marginal gains compared to bumping other clocks.

Yet again, this is the internet; trolls like to throw anything around.


----------



## UnCaged (Mar 14, 2016)

FreedomEclipse said:


> If you run a crossfire setup then you have a valid complaint. Crossfire performance is still blighted by bad drivers and its been like this for years and one of the reasons i moved away from AMD GPUs as a whole.
> 
> However - if you only run a single GPU then I advise you to stop living in the past.
> 
> ...



I am building a new system and looking for a 2K monitor, and was hoping to get in on the new GPU models, but it's not a must for me. I would rather sit out the first few months of the new technology to avoid the pitfalls and headaches.
I would consider getting an AMD GPU if they actually offered a superior product to the 980 Ti. I actually came here looking for feedback and suggestions. Does AMD have a product that would make more sense than going with the 980 Ti?


----------



## FreedomEclipse (Mar 14, 2016)

UnCaged said:


> I am building a new system and looking for a 2k monitor and was hoping to get in on the new gpu models but it's not a must for me I would rather sit out the first few months of the new technology to avoid the pit falls and headaches .
> I would consider getting an AMD GPU if they actually offered a superior product to the 980 ti. I actually came here looking for feedback and suggestions. Does AMD have a product that would make more sense than going with the 980 ti?




Fury X, as recommended, though I have heard of pump whine and such. YMMV.


----------



## Tsukiyomi91 (Mar 14, 2016)

The name is kinda weird to me, though... GTX 1080... sounds like a resolution rather than an actual numbering system, but the tech that Pascal brings might put AMD's Polaris to the test. I'm down. GDDR5X vs HBM2. It's gonna be interesting. =D


----------



## Protagonist (Mar 14, 2016)

Hi all, is it just me, or is price what matters most about the upcoming generation of GPUs? If that's the case, they should bring the price of the G_104 chips back down to $300 and $250 respectively, and the upcoming G_100 chips back to $500 and $400.

With that, we would truly move to UHD.

Just my two cents.


----------



## HumanSmoke (Mar 14, 2016)

Tsukiyomi91 said:


> the name is kinda weird to me though... GTX1080... sounds like a resolution rather than an actual number system


"1080" is just a placeholder, because no one actually knows the naming convention. Nvidia changed their naming scheme after the GeForce 9000 series, so it isn't a reach to think they would do the same after the 900 series.


----------



## rtwjunkie (Mar 14, 2016)

ZoneDymo said:


> ermm so you are contradicting yourself, you say I did not read yet then begin about reading comprehension...aka reading but not understanding...good job.
> 
> The starving kids in Africa is exactly the same joke stretch you made with your heart valve problems etc, it literally has nothing to do with the conversation and what to get worked up about, honestly how you cannot see this is beyond me.
> 
> ...



Wow, is it time for your English lesson? I made no contradiction. You did not comprehend what you read, and that is what I said. 

@HumanSmoke does know what I was thinking, because unlike you, he actually understands the concept of context and has had frequent interaction with me, both positive and negative.

If you don't understand my point that anger at a GPU company is nonsensical, because it does not make any difference in your life the way real-life events do, then I'm sorry. I can't create rational thought in your head. It doesn't matter that this is a tech site. Over-the-top anger at a hardware company is completely irrational, as they are minuscule in terms of their impact on life.

I just noticed how young you are. This explains your non-comprehension of what's actually important in the world. Like I said, you've got the luxury of allowing yourself to get worked up about GPUs. It still can't ruin your life, though. One day you'll realize what is actually worth getting mad over.

I'm done with this thread.


----------



## Parn (Mar 14, 2016)

This is the chip I'll be getting, provided its performance surpasses my GTX 980 by at least 15-20%. I don't really care whether it will be kitted out with GDDR5X or HBM2, as long as the performance is not handicapped by the less advanced memory technology. HBM2 would also have a card-size advantage, but to me the lower price of GDDR5X is more important.


----------



## Tsukiyomi91 (Mar 14, 2016)

@HumanSmoke noted.


----------



## Fx (Mar 14, 2016)

Naito said:


> The GP104 is not their flagship chip, so it is understandable. Having said that, GDDR5X is nothing to scoff at if what I've read is to be believed. 8GB/256bit SKUs with bandwidth one might expect from a 512bit memory system serving GDDR5, may become more commonplace, especially in the mid to high end. Surely this is a good thing for consumers?



I agree, because having the stopgap technology (GDDR5X) in addition to HBM2 allows them to tailor the technologies to their respective target performance profiles.
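The bandwidth claim in the quoted post is easy to sanity-check with peak-theoretical arithmetic: a 256-bit GDDR5X bus at the standard's eventual 14 Gbps per pin works out to the same figure as a 512-bit GDDR5 bus at GDDR5's ~7 Gbps plateau. A rough sketch (the `bandwidth_gbs` helper is illustrative, and these are peak numbers, not measured throughput):

```python
# Compare peak bandwidth of a wide GDDR5 bus against a narrow GDDR5X bus.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (per-pin rate)."""
    return bus_width_bits / 8 * data_rate_gbps

gddr5_512bit = bandwidth_gbs(512, 7.0)    # classic wide-bus GDDR5 design
gddr5x_256bit = bandwidth_gbs(256, 14.0)  # narrow bus, fast GDDR5X

print(gddr5_512bit, gddr5x_256bit)  # 448.0 448.0 GB/s, a dead heat
```

At the initial 10 Gbps, the same 256-bit bus yields 320 GB/s, so the "512-bit-class" comparison only holds once the faster GDDR5X bins arrive.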


----------



## xenocide (Mar 15, 2016)

Protagonist said:


> Hi all, is it just me but what i would be much interested with the up coming generation of gpus is price, if they do this then they should bring down the price of the G_104 chips back to $300 and $250 respectively and the upcoming G_100 chips to be back to $500 and $400



I'm sure Nvidia would love to sell you parts like that, but it's just not possible given manufacturing costs. You can't expect them to shove every new bit of tech into a new manufacturing process and keep it cost-effective. Even GDDR5X is a huge improvement over regular GDDR5, and in most instances memory isn't the bottleneck outside of maybe 4K, which most people don't even play at. The other issue is that Nvidia has to go back to competing with AMD in terms of compute capability, which is a lot of added cost. I think they planned on using HBM2 for GP104 and up, but realized when DX12 was in the works that they would need to account for async compute, and had to drop HBM2 to keep costs under control.


----------



## Onibi (Mar 15, 2016)

xenocide said:


> I'm sure Nvidia would love to sell you parts like that, but it's just not possible given manufacturing costs.  You can't expect them to shove every new bit of tech into a new manufacturing process and keep it cost-effective.  Even GDDR5X is a huge improvement over regular GDDR5, and in most instances memory isn't the bottleneck outside of maybe 4K--which most people don't even play at.  The other issue is Nvidia has to go back to competing with AMD in terms of Compute capability, which is a lot of added cost.  I think they planned on using HBM2 for GP104 and up but realized when DX12 was in the works they would need to account for Async. Compute and had to drop HBM2 to keep costs in control.



Totally agree with this; it really does seem like the most likely option. And with that new Pro Duo beating out the Titan Z, I think they'll focus on compute even more; even if it is unrelated, it's still a performance crown they'll want to claim.


----------



## PP Mguire (Mar 15, 2016)

xenocide said:


> I'm sure Nvidia would love to sell you parts like that, but it's just not possible given manufacturing costs.  You can't expect them to shove every new bit of tech into a new manufacturing process and keep it cost-effective.  Even GDDR5X is a huge improvement over regular GDDR5, and in most instances memory isn't the bottleneck outside of maybe 4K--which most people don't even play at.  The other issue is Nvidia has to go back to competing with AMD in terms of Compute capability, which is a lot of added cost.  I think they planned on using HBM2 for GP104 and up but realized when DX12 was in the works they would need to account for Async. Compute and had to drop HBM2 to keep costs in control.


Not at all. HBM is meant for top-tier chips, and the 2 recent news releases prove this. They won't reduce their pricing and release structure until AMD can hammer them back down with solid competition. So until then, we will see midrange chips asking a $500 price until the big chips come out half a year later. Rinse and repeat. Pascal was also roadmapped to be a number cruncher, so you'll see increased FP16 support in all their chips, and HBM2 will be on their big chip, as well as more queues for async. Nothing really to do with costs or competition; it's how they had it mapped out.


----------



## Stefan Payne (Mar 15, 2016)

I wouldn't expect to see any nVidia HBM chip anytime soon.

They don't know shit about that; they have absolutely NO know-how about HBM at all!
They have done nothing in this regard; Jensen just has a big mouth, and besides that, they have done nothing at all for HBM! NOTHING!

And to expect that nVidia is able to deliver an HBM chip right off the bat? No way in hell!

The HBM experiment can still be a disaster for nVidia: they have to do a whole chip in 16 nm, and they also have to do HBM; there's a good chance that something will go horribly wrong...

And then we will see just another GDDR5(X) chip from them, whilst AMD rocks the boat with HBM...


----------



## rtwjunkie (Mar 15, 2016)

Stefan Payne said:


> I woudn't expect to see any nVidia HBM chip anytime soon.
> 
> They don't know shit about that, they have absolutely NO know how about HBM at all!
> They did do shit in this regards, just the big moth jensen has, besides that, they have don nothing at all for HBM! NOTHING!
> ...



They aren't making the HBM modules, and neither does AMD. All they have to do is implement it once production is underway. 

If you've been paying attention, instead of blindly flying the red flag, AMD is also going to do the same thing, only bringing in HBM2 in the NEXT generation after Polaris. This means, given the size of the VRAM modules needed for the Polaris models, we will likely see GDDR5X from them too.


----------



## anubis44 (Mar 16, 2016)

PP Mguire said:


> No, Maxwell has 32 queues compared to the 128 from AMD. Maxwell supports Async Compute but only chokes when too many commands are sent to it as has been explained in detail many times by renowned sources on the net.
> 
> Funny you link a site known for fud and rumors. Even further, the benchmarks they show don't even match. How is it the first set they showed has a 980ti (which beats the Fury X) running 76fps in DX12 @ 1080p (beats in all 3 resolutions too) but the other site shows 68? Furthermore the gap at 1440p for Fury X is higher between the two sites. Who to trust? I'll say nobody. Also, it's been mentioned extensively that DX12 in the game is buggy anyways, as the Computer Base site mentions. Finally, the game is an AMD title and I'd expect it to run better on AMD cards anyways. Of course, one set of benchmarks seems to show otherwise. As to the 390x beating the crap out of the Titan X, yea and I'd be willing to bet like every other site they aren't letting the card boost. An eVGA Titan X SC will boost easily to 1350 by itself which is close to the 980ti clockspeed they have which means Titan X and 980ti would be on top of the charts for the first set of benches shown. So no, anybody worth their salt that owns a Titan X will have performance actually higher than a 980ti, which would handily beat the 390x if the top set of charts are to be believed.
> 
> ...



OK, you clearly want a recap, so I'll give it to you.

So I had more than an 'issue' with the dead GPU in my Tecra. It was a known defect in the cooling that nVidia washed their hands of. I don't give a crap if it was 11 years ago or yesterday. That's ALL I NEED TO KNOW about a company to not want to give them any more money in the future. Kinda like Ford calculating that it would cost them more money to fix the Pinto than to pay for the lawsuits from exploding gas tanks. Doesn't exactly make me want to rush out and buy a Ford. When I catch wind of that kind of garbage, yeah, it's an 'issue' for me to give that company money. But you know what? I still gave nVidia another chance. Then I had essentially unworkably buggy drivers screwing around with my three monitor setup, and I gave up and sold the card. And don't give me any crap about using the 670 with 3 monitors. The specs said it supported nVidia Surround, and that's what I expected it to do. And it didn't, at least not without a 19 step procedure with every single driver update.

But what's funny is that even after I'd already decided not to give any more money to nVidia, the worst was yet to come from them. The $200 G-Sync tax, the GTX970 fiasco, the deliberate sabotaging of games with GimpWorks, hell, they're now integrating PhysX into the game engine, so you can't turn it off. Just how much more screwing over do you need to have some sense of self-respect? You yourself were screwed over by the Green Goblin, and you're defending them. The crap about how 'all 4GB is addressable' is the joke. Of course it's addressable, it just runs at 1/7th normal speed. Games actually scale back on textures when they see you're using a 970 so they won't blow over the 3.5GB. Hey, if you're happy with having been lied to, that's your problem, but don't go saying what I've said is a joke. As for 'green glasses', I think you meant to imply I'm wearing 'red glasses'. You're the one wearing green ones. The joke is you arguing to defend them when YOU YOURSELF as a GTX970 owner were sold a bill of goods. You're the one who's deluded for down-playing that so intensely. If AMD had done it, you'd be all over it, and you know it.


----------



## anubis44 (Mar 16, 2016)

rtwjunkie said:


> No you did not read. Reading comprehension is fundamental.  What I said was, once you start working for retirement, have people you know and love near death because of heart valve problems, deal with raising and providing for your children, work to pay your bills and enjoy as well as deal with real life, then getting as worked up and angry as anubis44 is about a GPU company becomes laughable.
> 
> What a GPU company does or sells is nothing in the grand scheme of things that matter, and won't actually affect your life.
> 
> Nowhere did I go on about starving kids in Africa and whatnot. That was your stretch, not mine.



I am working for retirement. No kids because I went through 2 marriages with women who turned out to have mental health issues they refused to deal with. I pay my bills and enjoy my 'real' life. I'm not getting the least bit worked up, I just call a spade a spade. Screwed over by nVidia several times. Check. OK, don't buy their stuff. End of discussion. That's the calculation. No getting worked up going on here. Enjoy your nVidia card.


----------



## rtwjunkie (Mar 16, 2016)

anubis44 said:


> I am working for retirement. No kids because I went through 2 marriages with women who turned out to have mental health issues they refused to deal with. I pay my bills and enjoy my 'real' life. I'm not getting the least bit worked up, I just call a spade a spade. Screwed over by nVidia several times. Check. OK, don't buy their stuff. End of discussion. That's the calculation. No getting worked up going on here. Enjoy your nVidia card.



Hey, we came to a mutual semi-truce. You are quoting my response to the other guy, who felt he needed to insert himself into the discussion.

For the record, I've owned cards from both camps, and am now considering an AMD card. I don't have a preferred company to buy from. Not that what I purchase should matter. 

Do look up sales numbers for the 970, though, SINCE January last year. Hint: phenomenal. Read user reviews. Nearly all were aware of the memory configuration. It hasn't bothered them.


----------



## Parn (Mar 16, 2016)

PP Mguire said:


> Not at all, HBM is meant for top tier chips, and the 2 recent news releases proves this. They won't reduce their pricing and release structure until AMD can hammer them back down with solid competition. So until then we will see midrange chips asking a 500 dollar price until the big chips come out half a year later. Rinse and repeat. Pascal was also roadmapped to be a number cruncher so you'll see increased FP16 support in all their chips and HBM2 will be on their big chip as well as more queues for Async. Nothing really to do with costs or competition, it's how they had it mapped out.



Looks like the Fury X wasn't enough competition for NV, then, if the new GTX 1080 will start off around $500. I guess I will just have to wait until the 1080 Ti is released before getting my hands on a 1080.


----------



## rtwjunkie (Mar 16, 2016)

Parn said:


> Looks like Fury X wasn't enough of a competition for NV then if the new GTX1080 would start off around $500. I guess I will just have to wait until 1080 Ti is released before getting my hand on a 1080.



Actually, that's the assumed price of the as-yet-unnamed card using the GP104 chip, which is not the "full" Pascal. It's the upper mid-tier, just like where the 980 sits today in the lineup.


----------



## Stefan Payne (Mar 16, 2016)

Parn said:


> Looks like Fury X wasn't enough of a competition for NV then if the new GTX1080 would start off around $500. I guess I will just have to wait until 1080 Ti is released before getting my hand on a 1080.


nVidia can do what they want.

The nVidia fans will buy the stuff and say it's good, even if it is utter garbage like the memory of the GTX 970.
Why would you do that? 
There is just one reason for that...

And that's to make it unusable when Pascal's successor is around. You tweak the memory a little, the newly bought game runs like crap, and the nVidia consumer zombie runs to the next store to buy the next nVidia graphics card....

That 'the other one' could have saved money because it could have lasted longer does not matter; it has to be an nVidia graphics card to be hip on the schoolyard - and because those nVidia fans say it has to be this way, because they believe strongly in this company...

Well, a couple of years ago, you had your god and fought over that.

Today you believe in companies that like nothing more than to rip you off, and YOU enjoy it....
There is some kind of perversion in that, don't you think?


----------



## PP Mguire (Mar 16, 2016)

anubis44 said:


> OK, you clearly want a recap, so I'll give it to you.
> 
> So I had more than an 'issue' with the dead GPU in my Tecra. It was a known defect in the cooling that nVidia washed their hands of. I don't give a crap if it was 11 years ago or yesterday. That's ALL I NEED TO KNOW about a company to not want to give them any more money in the future. Kinda like Ford calculating that it would cost them more money to fix the Pinto than to pay for the lawsuits from exploding gas tanks. Doesn't exactly make me want to rush out and buy a Ford. When I catch wind of that kind of garbage, yeah, it's an 'issue' for me to give that company money. But you know what? I still gave nVidia another chance. Then I had essentially unworkably buggy drivers screwing around with my three monitor setup, and I gave up and sold the card. And don't give me any crap about using the 670 with 3 monitors. The specs said it supported nVidia Surround, and that's what I expected it to do. And it didn't, at least not without a 19 step procedure with every single driver update.
> 
> But what's funny is that even after I'd already decided not to give any more money to nVidia, the worst was yet to come from them. The $200 G-Sync tax, the GTX970 fiasco, the deliberate sabotaging of games with GimpWorks, hell, they're now integrating PhysX into the game engine, so you can't turn it off. Just how much more screwing over do you need to have some sense of self-respect? You yourself were screwed over by the Green Goblin, and you're defending them. The crap about how 'all 4GB is addressable' is the joke. Of course it's addressable, it just runs at 1/7th normal speed. Games actually scale back on textures when they see you're using a 970 so they won't blow over the 3.5GB. Hey, if you're happy with having been lied to, that's your problem, but don't go saying what I've said is a joke. As for 'green glasses', I think you meant to imply I'm wearing 'red glasses'. You're the one wearing green ones. The joke is you arguing to defend them when YOU YOURSELF as a GTX970 owner were sold a bill of goods. You're the one who's deluded for down-playing that so intensely. If AMD had done it, you'd be all over it, and you know it.


All I see is bitch bitch bitch, tbh. Last time I checked, GPU makers don't create the cooling solution for laptops; the laptop makers do. That would be a Toshiba thing. 

Nobody buys a 670 for Surround, dude. It's literally not powerful enough to push it. My best friend has used Surround since he upped to a 780 Ti and now has 3 ROG Swifts (the 9q), and although he has minor issues with games, and the update to Windows 10 really screwed with things, his overall experience is that he likes it enough to keep it. I can't say the same for G-Sync, because I sold my Swift within 6 months. But sure, I'm "wearing green glasses".

No buddy, I just call it like it is, and all I see is a guy who's had some bad experiences and wants to whine to the world about it. I simply use whatever is the best at the time. If I was truly wearing "green glasses" I wouldn't own many products from both sides. Where fanboyism is concerned I'm pretty unbiased, which comes with the territory of being a previous hardware reviewer. I bought a 970 for testing to debunk exactly the kind of misinformation that guys like you spew constantly. 1/7th the normal speed? Even if it were, it sure doesn't hamper games at all, and I'd love for you to provide a legit source (not bullshit like WCCF) saying they "scale back textures". Now you're just pulling shit out of your ass, because if they did that then they'd do it with any other 4GB card (and they certainly don't). I also like how you went from saying it's not a 4GB card to saying it's addressable. You guys with your one-sided, double-standard, contradicting arguments make me laugh. What you're saying was, and still is, a joke. 

Actually no, I don't go posting in AMD threads spewing blatant fanboyism like guys like you do. Go ahead, search. I could easily bring up how they're using yet another Cooler Master AIO system despite the allegations from Asetek. I could go on about how they are or were being sued for false advertising of the FX series. I could go on about how they haven't had any competition in the CPU market since 2012. I could go on about how their PR stunts are outright laughable (anybody remember those terrible YouTube videos?). I could go on about how their only form of marketing is hype hype hype while delivering an underwhelming product. Do I? No. I've also had many, many, many bad experiences from the AMD side of things, but lo and behold, it doesn't keep me from buying AMD products for testing. It's called having a sense of understanding, and not getting butthurt about little things. Anyways, this shit's way off topic.


----------



## rtwjunkie (Mar 16, 2016)

Stefan Payne said:


> Today you believe in companies that like nothing more than to rip you off, and YOU enjoy it....
> There is some kind of perversion in that, don't you think?



Man, I'm not sure what world you live in, but being extreme fans of and "believing" in a company are fanboy acts, and most of us on here are way beyond that age in life where anything like that matters.  We buy by card for the most part, not card family.


----------



## PP Mguire (Mar 16, 2016)

rtwjunkie said:


> Man, I'm not sure what world you live in, but being extreme fans of and "believing" in a company are fanboy acts, and most of us on here are way beyond that age in life where anything like that matters.  We buy by card for the most part, not card family.


This. It's freakin computer stuff. So and so has best performing product, I buy that. Idc about name, I just want what performs.


----------



## Parn (Mar 16, 2016)

rtwjunkie said:


> Man, I'm not sure what world you live in, but being extreme fans of and "believing" in a company are fanboy acts, and most of us on here are way beyond that age in life where anything like that matters.  We buy by card for the most part, not card family.



Well said.

The only reason I've been buying NV since Kepler was released is their superior energy efficiency compared to AMD cards. If Polaris manages to turn the tables on Pascal, I won't hesitate to join the red camp.


----------



## rruff (Mar 16, 2016)

rtwjunkie said:


> most of us on here are way beyond that age in life where anything like that matters.  We buy by card for the most part, not card family.



I'd buy AMD if it were close, up to a coin toss. But at the moment the only AMD GPU or CPU I have is an old Athlon 64 X2.


----------



## Slizzo (Mar 16, 2016)

Stefan Payne said:


> nVidia can do what they want.
> 
> The nVidia fans will buy the stuff and say it's good, even if it is utter garbage like the memory of the GTX 970.
> Why would you do that?
> There is just one reason for that...



Jesus Christ, how many times are people going to harp on that? Here's the thing:

Benchmark and actual game performance DID NOT CHANGE between before we knew how the memory was carved up and afterwards. And the card performs in line with the bracket it's priced in.

Basically, this is a non-issue, and yet the dead horse keeps getting beaten.


----------



## PP Mguire (Mar 16, 2016)

Slizzo said:


> Jesus Christ, how many times are people going to harp on that? Here's the thing:
> 
> Benchmark and actual game performance DID NOT CHANGE between before we knew how the memory was carved up and afterwards. And the card performs in line with the bracket it's priced in.
> 
> Basically, this is a non-issue, and yet the dead horse keeps getting beaten.


Gee, how many times have I said this? All the Nvidia haters have are the 970 and GameWorks.


----------



## anubis44 (Mar 22, 2016)

PP Mguire said:


> All I see is bitch bitch bitch tbh. Last time I checked GPU makers don't create the cooling solution for laptops, the laptop makers do. That would be a Toshiba thing.
> 
> Nobody buys a 670 for Surround dude. It's literally not powerful to push it. My best friend has used Surround since he upped to 780ti and now has 3 ROG Swift (the 9q) and although he has minor issues with games and the update to Windows 10 really screwed with things his overall experience with it is he likes it enough to keep it. I can't say the same for Gsync because I sold my Swift within 6 months. But sure, I'm "wearing green glasses".
> 
> ...



I bought a 670 to play Diablo 3 using three monitors. If you don't like that, screw yourself. That's why I did it. I used to play the hell out of Diablo 2, and I decided I'd play the hell out of Diablo 3 on three monitors. Contrary to your empty-headed rant, the 670 put out plenty of performance for that game in surround (120FPS). Unfortunately, the crappy GreedForce drivers didn't play well with 3 monitors, so I sold the card in frustration after a month of having to fiddle with it.

All I see coming from you is complete ignorance. If you really think nVidia is innocent and never tried to screw over anybody; that they honestly just 'forgot' to mention for 3 months that the 4GB GTX 970 didn't really have 4GB of full-speed memory; that they aren't ripping anybody off by charging a $200 premium for FreeSync with a Green Goblin logo that won't work with anybody else's graphics cards; that confetti effects patented as PhysX are anything but pathetic; that GameWorks is really good for the PC game industry and not a transparent rear-guard action to gimp games on all but the latest nVidia cards to force an upgrade; and that you don't need to hold them to account for any of it, the lies, the deception, then go right ahead and enjoy taking it from behind. Keep supporting the company that's proven itself happy to screw you over. That's your prerogative. But don't tell me I'm just 'whining'. I've given you true anecdotes from my experience. There's no 'shit' being pulled from my ass, just true accounts.

I'm the one telling it like it is, and you're the one justifying why all that crap is OK, so who's really the one whining, eh?


----------



## PP Mguire (Mar 22, 2016)

anubis44 said:


> I bought a 670 to play Diablo 3 using three monitors. If you don't like that, screw yourself. That's why I did it. I used to play the hell out of Diablo 2, and I decided I'd play the hell out of Diablo 3 on three monitors. Contrary to your empty-headed rant, the 670 put out plenty of performance for that game in surround (120FPS). Unfortunately, the crappy GreedForce drivers didn't play well with 3 monitors, so I sold the card in frustration after a month of having to fiddle with it.
> 
> All I see coming from you is complete ignorance. If you really think nVidia is innocent, never tried to screw over anybody, honestly just 'forgot' to mention that the 4GB GTX970 didn't really have 4GB of full-speed memory until 3 months later, isn't really ripping anybody off by charging them a $200 premium for FreeSync with a Green Goblin logo that won't work with anybody else's graphics cards, a pathetic attempt at confetti effects that are patented as PhysX, and that GameWorks is really good for the PC game industry, and not really a transparent rear-guard action to gimp games on all but the latest nVidia cards to force an upgrade, and you don't need to hold them to account for anything they've done, the lies, the deception, go right ahead enjoy taking it from behind. Keep supporting the company that's proven itself happy to screw you over. That's your prerogative. But don't tell me I'm just 'whining'. I've given you true anecdotes from my experience. There's no 'shit' being pulled from my ass, just true accounts.
> 
> I'm the one telling it like it is, and you're the one justifying why all that crap is OK, so who's really the one whining, eh?


The only one spewing ignorance here is you, because you're butthurt as hell. Get over yourself, buddy. I don't have to justify anything; I buy whatever is the best, period. I like how before you completely ignored the "and how many devices do you own that have Samsung in them" bit, because you're just a hypocrite who wants to take his decade-old frustration out in forum posts. Samsung is by far one of the worst companies for pulling shit, yet I bet you give zero fucks, because all you want to do is bash Nvidia like all the Nvidia fanboys bash AMD. 

Yeah, you're pulling shit out of your ass; that's your opinion, and nobody is winning because it's a forum on the internet. If anybody is winning it's me, because I'm not a biased shit stain whining on the internet because boohoo Nvidia is soooo bad. I've given you factual arguments and all you're doing is whining about it, saying the same crap over and over again. Nvidia is a company out to make money, same as AMD, same as Intel, same as Alienware/Dell, same as NCIX, same as Samsung, same as Apple, and many others. If you don't like their business practices, take your wallet somewhere else. Don't come onto a board talking shit in an Nvidia thread because you had bad experiences that give you a warped opinion of a company and anything they do. That's like having an ex-gf where anything she does pisses you off, even something as simple as putting a harmless selfie on Facebook. You're being that guy, that ex. It's exactly why I made the sarcastic remarks about how I should hate AMD because I've had so many bad products from them, but nope. Shit happens; you get over it and realize it's a freakin' computer part, dude. That doesn't make the company Hitler, and it definitely doesn't make your opinion fact.


----------



## rtwjunkie (Mar 22, 2016)

Come on guys....I repeat my earlier stance. It's not worth getting angry over. It isn't going to impact anyone enough to ruin their life, nor is it going to cure world hunger.

Just agree to disagree with different opinions.


----------



## PP Mguire (Mar 22, 2016)

rtwjunkie said:


> Come on guys....I repeat my earlier stance. It's not worth getting angry over. It isn't going to impact anyone enough to ruin their life, nor is it going to cure world hunger.
> 
> Just agree to disagree with different opinions.


Who's angry? I guess maybe him, but this is a usual day in the office for me, and I find it hilarious.


----------



## EarthDog (Mar 22, 2016)

I wish members could vote to send some of these troublemakers to purgatory... like OCN. You get punished and have to go there for a month.


----------



## PP Mguire (Mar 22, 2016)

EarthDog said:


> I wish members could vote to send some of these trouble makers to purgatory... like OCN. You get punished and have to go there for a month.


Please no massa, I'z be goodz I swearz it!


----------



## Prima.Vera (Mar 23, 2016)

Reading the comments is so much better than the "article" itself... ))


----------



## medi01 (Mar 24, 2016)

PP Mguire said:


> All I see is bitch bitch bitch tbh.



Some people think that nVidia's business practices set a record low mark AND openly say so.
You consider them "fanbois" and "irrational". Well. So what?

Maybe that's where the problem is? Why should you care about "bitching"? Why not just get over it?



Slizzo said:


> Benchmark and actual game performance DID NOT CHANGE...


Bingo. Most people judge GPU performance based on reviews, not real life experience.

Yep. It's damn subjective. Most things expressed in perf bars are hardly noticeable in real life; that's why we have things like the "halo product", more 3xx's being sold after Fiji was released, etc.

And your point was?


----------



## Slizzo (Mar 24, 2016)

medi01 said:


> Some people think that nVidia's business practices set a record low mark AND openly state that..
> You consider them to be "fanbois" and "irrational". Well. So what?
> 
> Maybe that's where the problem is? Why should you care about "bitching"? Why not just get over it?
> ...



Pretty sure my point was clear. He was calling the GTX 970 garbage based on information about how the memory subsystem is set up that was learned only after the card's release, after its performance had been advertised. And the card's performance was the same before we knew this information and afterwards, which basically nullifies his point about the card being garbage because of the way the memory subsystem is set up.


----------



## medi01 (Mar 25, 2016)

Slizzo said:


> ...card's performance was the same before we knew this information, and afterwards.



I understand your POV. Now please try to understand the other one.

Imagine that you buy a car that is advertised to have 4-wheel drive and later discover that it somehow only has 3. Would your car become any worse because of it? Nope. Did the manufacturer save costs by not doing it? Yep. Did they lie to you? Yep. That's the whole point.


----------



## EarthDog (Mar 25, 2016)

Asinine analogy.

I see both sides of this. In the end, A LOT of people made more out of this than needed to be made (and seemingly still are...). Performance never changed, people rarely hit the slowdown, and there actually is 4GB, not 3.5GB; the last 512MB is just slower. Nobody shit themselves when they did this on the 660 Ti... but now it's a big deal... MEH.
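To put rough numbers on "the last 512MB is just slower": period reviews commonly cited ~196 GB/s for the 970's fast 3.5GB segment and ~28 GB/s for the slow 0.5GB segment (both figures are assumptions here, not official specs, and `effective_bandwidth` is just a hypothetical toy model). A quick sketch of what happens when a working set spills past 3.5GB:

```python
# Toy model of the GTX 970's segmented memory: a fast 3.5 GB partition and a
# slow 0.5 GB partition. Bandwidth figures are the commonly cited numbers from
# period reviews, not official specs.
FAST_GB, FAST_BW = 3.5, 196.0   # segment size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(working_set_gb: float) -> float:
    """GB/s achieved streaming `working_set_gb` once, filling the fast segment first."""
    fast = min(working_set_gb, FAST_GB)
    slow = max(working_set_gb - FAST_GB, 0.0)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return working_set_gb / seconds

print(effective_bandwidth(3.0))   # stays in the fast segment: 196.0 GB/s
print(effective_bandwidth(4.0))   # spills into the slow segment: ~112 GB/s
```

Under these assumed figures, a full 4GB streaming pass lands around half the fast-segment bandwidth, while anything under 3.5GB is unaffected, which is consistent with the "people rarely hit the slowdown" observation.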


----------



## Slizzo (Mar 27, 2016)

medi01 said:


> I understand your POV. Now please try to understand the other one.
> 
> Imagine that you buy a car that is advertised to have 4-wheel drive and later discover that it somehow only has 3. Would your car become any worse because of it? Nope. Did the manufacturer save costs by not doing it? Yep. Did they lie to you? Yep. That's the whole point.



That's completely different. In your analogy, there is false advertising at play. nVidia advertised the card as having 4GB because that is what it physically has and can physically address. How it addresses the memory was never spoken of.


----------



## rtwjunkie (Mar 28, 2016)

medi01 said:


> I understand your POV. Now please try to understand the other one.
> 
> Imagine that you buy a car that is advertised to have 4-wheel drive and later discover that it somehow only has 3. Would your car become any worse because of it? Nope. Did the manufacturer save costs by not doing it? Yep. Did they lie to you? Yep. That's the whole point.



I respect your viewpoint, however: the number of people who bought 970s in the 13 months AFTER the information came out, who were aware of the issue and bought anyway, FAR outnumbers the number sold before the issue became public. 

So they knew what they were buying; no lies were told to those buyers. In short, it really isn't an issue for most people, and user reviews on multiple sites, which indicate buyers knew of the issue before purchase, back this up.


----------



## Stefan Payne (Mar 31, 2016)

The problem with the GTX 970 memory is:

a) special profiles are needed for some games
b) nVidia has the power to make this GPU useless if they want to.

With a normal memory architecture, that's not so easy and you can still use your graphics card.
But with the memory architecture of the GTX 970, nVidia is able to make this chip useless more or less with the flip of a switch.

And have you guys learned nothing from The Witcher 3??
Yes, that game where a GTX 780/Titan was beaten by a ~200€ card like the GTX 960.
This happened with a couple of other games, like Batman...
Thus you can take this as confirmation that nVidia easily drops support for older hardware (well, at least the optimisation work).

What do you think will happen when there is a successor to the GTX 970??
Do you really think that nVidia will still optimize for a card that is no longer manufactured and sold??
They could also use the memory architecture to cripple this card so that most newer games are unplayable -> most GTX 970 users will run to the next store and buy the next GeForce card...


----------



## rtwjunkie (Mar 31, 2016)

Stefan Payne said:


> And have you guys learned nothing from The Witcher 3??
> Yes, that game where a GTX 780/Titan was beaten by a ~200€ card like the GTX 960.
> This happened with a couple of other games, like Batman...
> Thus you can take this as confirmation that nVidia easily drops support for older hardware (well, at least the optimisation work).



Honestly, you've been reading too many media reports instead of trying things yourself. This is where real-world experience counts for more.  

I had the 780, and used it to play TW3 the first two times I played the game. Most who know me know I am an image-quality-over-frame-rate person. I used the 780 to render almost all settings at high to very high, except shadows on medium and no HairWorks. I still played at a consistent 50 to 60 fps.

Also owning a 960, it's almost embarrassing for the 960 how badly it gets beaten. For it to put out a mere 30 to 35 fps, most settings MUST be dropped way down to medium. Think about that... same resolution, settings that don't produce visuals as good as the 780 had, and it is only JUST playable on a 960.  

In conclusion, the only thing a 960 wins at versus a 780 in TW3 is the title of Lower Performing Card.


----------



## Tsukiyomi91 (Apr 5, 2016)

Still rocking the 970 here. I don't care what people say about why it has 3.5+0.5 instead of a full 4. As long as you don't run demanding games with hi-res texture mods and graphics settings tweaked to Very High or Ultra for the sake of eye candy, you'll be perfectly fine, hitting well over 50fps on average most of the time, with slight dips to 40fps.


----------



## Tsukiyomi91 (Apr 5, 2016)

@rtwjunkie the 960 is a mid-range card, so no surprise here, since it can't outpace its older brother, the 780. With 2GB on most models and a few that have 4GB, it's not meant to win fps contests, but it's good enough for those who are on a tight budget and want a 1080p-ready card that handles most games well.


----------



## dr emulator (madmax) (Apr 20, 2016)

Naito said:


> Darn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...



Same here,

hopefully it'll be cheaper on the electric bill than the HD 5870 I have at the mo


----------



## TheHunter (Apr 20, 2016)

I'm seeing this full GP104 chip in the ~7 TFLOPS area at best; that's just around a 1200-1300MHz boost on a 980 Ti, plus a lot of marketing crap about lower power again.

Imo, for someone with a high-end Maxwell GM200 chip, a GP104 GTX 1080 is not really a worthy upgrade.

I would maybe trade my new 980 Ti for a full GP100 chip, but I don't see the point at 1080p, well, unless it's literally 2x faster..
Think I'll just stick with my original plan and wait for Volta and AMD's offering in Q1 2018?..
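For context, the ~7 TFLOPS ballpark follows from the usual single-precision estimate of cores × 2 FLOPs per cycle (FMA) × clock. A quick sanity check against the 980 Ti's 2816 CUDA cores (GP104's core count wasn't public at the time, so this is only a comparison point; `fp32_tflops` is a hypothetical helper, not a real API):

```python
# Rough peak FP32 throughput: CUDA cores * 2 FLOPs per cycle (one FMA) * clock.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPS under the simple FMA-based estimate."""
    return cuda_cores * 2 * clock_ghz / 1000.0

# GTX 980 Ti has 2816 CUDA cores; 1.2-1.3 GHz is the boost range the post
# refers to.
for clock in (1.2, 1.25, 1.3):
    print(f"980 Ti @ {clock} GHz ~ {fp32_tflops(2816, clock):.2f} TFLOPS")
```

At those clocks the estimate lands between roughly 6.8 and 7.3 TFLOPS, which is why a boosted 980 Ti and a ~7 TFLOPS GP104 would sit in the same performance neighborhood.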


----------

