Friday, February 26th 2016

NVIDIA to Unveil "Pascal" at the 2016 Computex

NVIDIA is reportedly planning to unveil its next-generation GeForce GTX "Pascal" GPUs at the 2016 Computex show, in Taipei, scheduled for early June. This unveiling doesn't necessarily mean market availability. SweClockers reports that problems, particularly with NVIDIA supplier TSMC getting its 16 nm FinFET node up to speed, especially following the recent Taiwan earthquake, could delay market availability to late- or even post-Summer. It remains to be seen whether the "Pascal" architecture debuts as an all-mighty "GP100" chip, or as a smaller, performance-segment "GP104" that will be peddled as enthusiast-segment by virtue of being faster than the current big chip, the GM200. NVIDIA's next-generation GeForce nomenclature will also be particularly interesting to look out for, given that the current lineup is already at the GTX 900 series.
Source: SweClockers

97 Comments on NVIDIA to Unveil "Pascal" at the 2016 Computex

#76
rtwjunkie
PC Gaming Enthusiast
medi01: It sucks when they play dirty, the way nVidia does.
That plus all the similar comments. What I find amusing is that you are so very naive to imagine AMD are somehow a model company.

Frankly, your idealized and warped view of the business world does nothing but show you to be out of your element. You expect perfection and exaggerate the negatives, with normal business practices being blown up into a nefarious scheme to spread "evil".

LMFAO
Posted on Reply
#77
medi01
rtwjunkie: What I find amusing is that you are so very naive to imagine AMD are somehow a model company.
Strong-arm politics only works if you have a dominant market position. AMD, being a permanent underdog, cannot do such things even if it wanted to; that doesn't mean it wouldn't if it could.
rtwjunkie: ...normal business practices...
Normal? As in "everyone does it"? Or "the way it should be"? Or "I don't give a flying f**k"?
Make up your mind.

There are countries where the "normal" things nVidia did to XFX are illegal.
Posted on Reply
#78
HumanSmoke
Frick: Yeah I know but I meant a series of them. It might have been Mars I was thinking about. For some reason I was thinking about Sapphire.
The only Sapphire premium dual-GPU cards I can think of were the Toxic version of the HD 5970 - they and XFX released fairly pricey 4GB versions - and the HD 4870 X2 Atomic.
medi01: Talking about arguments lost on opponents, ironic.
It's not irony. You are the only one involved in the argument you are making.
medi01: A company does NOT need to strong-arm journalists and suppliers to build great products.
A company does NOT need to force proprietary APIs to build great products.
That has absolutely nothing to do with the points being made by me and others. You are right, Nvidia and Intel don't have to do these things to build great products. It is also a FACT that both Intel and Nvidia are the respective market leaders based on strategies that DO leverage these practices among other facets of their business. Whether they NEED to or not is immaterial to the point being made. It is simply historical fact that these practices are part of how they got where they are. You can argue all day about the rights and wrongs, but it has no bearing on the position they occupy. Squealing about injustice doesn't retroactively change the totals in the account books.
medi01: You referred to shitty practices as if they were something good (for customers) and worth following.
No I didn't. You are so caught up in your own narrative that you don't understand that some people can view the industry dispassionately in historical context. Not everyone is like you, eager to froth at the bung at the drop of a hat and turn the industry into some personal crusade. Stating a fact isn't condoning a practice. By your reasoning, any fact-based article or book about a distasteful event in human history (i.e. armed conflict) means that the authors automatically condone the actions of the combatants.
Let's face it, from your posting history you just need any excuse, however tenuous, to jump onto the soapbox. Feel free to do so, but don't include quotes and arguments that have no direct bearing on what you are intent on sermonizing about.
rtwjunkie: That plus all the similar comments. What I find amusing is that you are so very naive to imagine AMD are somehow a model company.
Presumably this model company's dabbling in price fixing (the price fixing continued for over a year after AMD assumed control of ATI), posting fraudulent benchmarks for fictitious processors and deliberately out-of-date Intel benchmarks, being hit for blatant patent infringement, and a host of other dubious practices don't qualify.
Underdog = Get out of Jail Free.
Market Leader = Burn in Hell.
Posted on Reply
#79
FordGT90Concept
"I go fast!1!11!1!"
medi01: So that world would have been better, had nVidia NOT bought it.

There was NO NEED for G-Sync the way it was done; there was nothing special about variable refresh rate, that stuff already existed in notebooks (that's why it didn't take AMD long to counter). The only drive (and wasted money) was to come out with some "only me, only mine!!!" shit, nothing else.

Had it been a common, open standard, it would have pushed the market forward a lot. But no, we have crippled "only this company" shit now. Thanks, great progress.

It's great to have more than one competitive player in the market. It sucks when they play dirty, the way nVidia does.

Strong-arm politics all over the place, on all fronts: XFX, hell, ANAND BLOODY TECH. Punished, learned their lesson, and next time put a cherry-picked overclocked Fermi up against stock AMD. And that's only the VISIBLE part of it; who fucking knows what's going on underneath.
Had G-Sync not come out, we wouldn't have external adaptive sync today. It likely wouldn't have appeared until DisplayPort 1.3 (coming with Pascal/Polaris) and HDMI 2.1 (no date known). The 1.2a and 2.0a specifications exist because AMD, VESA, and the HDMI Forum couldn't wait 3-4 years to compete with G-Sync.

It's a lot like AMD pushing out Mantle before Direct3D 12 and Vulkan.



Edit: It should also be noted that Ashes of the Singularity now uses async compute and NVIDIA cards take a fairly severe performance penalty (25% in the case of Fury X versus 980 Ti) because of it:
www.techpowerup.com/reviews/Performance_Analysis/Ashes_of_the_Singularity_Mixed_GPU/4.html

GCN never cut corners for async compute (a Direct3D 11 feature)--these kinds of numbers should go back to the 7950 when async compute is used. The only reason NVIDIA came out ahead in the last few years is that developers didn't use async compute. One could speculate as to why that is. For example, because NVIDIA is the segment leader, did developers avoid using it because 80% of cards sold wouldn't perform well with it? There could be more nefarious reasons, like NVIDIA recommending developers not to use it (I wonder if any developers would step forward with proof of this). Oxide went against the grain and did it anyway. The merits of having hardware support for something software doesn't use could be argued but, at the end of the day, it has been part of the D3D11 specification for years now and NVIDIA decided to ignore it in the name of better performance when it is not used.

For their part, Oxide did give NVIDIA ample time to fix it but a software solution is never going to best a hardware solution.
Posted on Reply
#80
BiggieShady
FordGT90Concept: Edit: It should also be noted that Ashes of the Singularity now uses async compute and NVIDIA cards take a fairly severe performance penalty (25% in the case of Fury X versus 980 Ti) because of it:
www.techpowerup.com/reviews/Performance_Analysis/Ashes_of_the_Singularity_Mixed_GPU/4.html

GCN never cut corners for async compute (a Direct3D 11 feature)--these kinds of numbers should go back to the 7950 when async compute is used. The only reason NVIDIA came out ahead in the last few years is that developers didn't use async compute. One could speculate as to why that is. For example, because NVIDIA is the segment leader, did developers avoid using it because 80% of cards sold wouldn't perform well with it? There could be more nefarious reasons, like NVIDIA recommending developers not to use it (I wonder if any developers would step forward with proof of this). Oxide went against the grain and did it anyway. The merits of having hardware support for something software doesn't use could be argued but, at the end of the day, it has been part of the D3D11 specification for years now and NVIDIA decided to ignore it in the name of better performance when it is not used.

For their part, Oxide did give NVIDIA ample time to fix it but a software solution is never going to best a hardware solution.
Here is a nice read that should clear things up: ext3h.makegames.de/DX12_Compute.html
In a nutshell, both architectures benefit from async compute: GCN profits most from many small, highly parallelized compute tasks, while Maxwell 2 profits most from batching async tasks just as if they were draw calls.
When it comes to async compute, the GCN architecture is more forgiving and more versatile; Maxwell needs more specific optimizations to extract peak performance (or even using DX12 only for graphics and CUDA for all compute :laugh:).
I'm just hoping Nvidia will make the necessary async compute changes with Pascal, because of all the lazy console ports to come.
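For anyone wondering what "async compute" actually means on the API side, here is a rough sketch (not taken from any shipping engine, just an illustration of the D3D12 mechanism): compute work is submitted on its own COMPUTE-type queue so the GPU can overlap it with graphics work. The device, compute pipeline state, and root signature are assumed to already exist; resource binding, error handling, and the fence that would normally synchronize the two queues are left as comments.

// Minimal illustration: a dedicated COMPUTE queue lets the GPU overlap this
// work with whatever the DIRECT (graphics) queue is doing, hardware permitting.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SubmitAsyncCompute(ID3D12Device* device,
                        ID3D12PipelineState* computePso,   // assumed: a compute PSO
                        ID3D12RootSignature* computeRoot)  // assumed: matching root signature
{
    D3D12_COMMAND_QUEUE_DESC qDesc = {};
    qDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;          // separate from the graphics queue

    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&qDesc, IID_PPV_ARGS(&computeQueue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE, IID_PPV_ARGS(&allocator));

    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE,
                              allocator.Get(), computePso, IID_PPV_ARGS(&list));

    list->SetComputeRootSignature(computeRoot);
    list->Dispatch(64, 64, 1);                             // a real workload would bind UAVs first
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    computeQueue->ExecuteCommandLists(1, lists);
    // A fence would normally synchronize this queue with the graphics queue
    // before the results are consumed.
}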
Posted on Reply
#81
FordGT90Concept
"I go fast!1!11!1!"
Anandtech: Update 02/24: NVIDIA sent a note over this afternoon letting us know that asynchronous shading is not enabled in their current drivers, hence the performance we are seeing here. Unfortunately they are not providing an ETA for when this feature will be enabled.
And no, the Anandtech review shows NVIDIA only loses with async compute enabled (0 to -4%), while AMD ranged from -2% to +10%:

The divide even gets crazier with higher resolutions and quality:
Posted on Reply
#82
nem
lies and more lies.. :B


Posted on Reply
#83
rtwjunkie
PC Gaming Enthusiast
nem: lies and more lies.. :B


Please provide empirical evidence, testing, or references to show your post is anything but trolling.
Posted on Reply
#84
HumanSmoke
rtwjunkie: Please provide empirical evidence, testing, or references to show your post is anything but trolling.
Nvidia confirmed that they wouldn't pursue Vulkan development for Fermi-based cards six weeks ago at GTC (page 55 of the PDF presentation). With many people upgrading, many Fermi cards being underpowered for future games (as well as most having only 1GB or 2GB of vRAM), and the current profile of gaming shifting to upgrades (as the new JPR figures confirm, with enthusiast card sales doubling over the last year), they decided to concentrate on newer architectures. Realistically, only the GTX 580 maintains any degree of competitiveness with modern architectures... so nem, while not trolling the veracity of support, still continues to troll threads with unrelated content. Hardly surprising when even the trolls at wccftech label him a troll of the highest order - not sure if that's an honour at wccf or the lowest form of life. The subject is too boring for me to devote fact-finding time to.

Posted on Reply
#85
rtwjunkie
PC Gaming Enthusiast
HumanSmoke: Nvidia confirmed that they wouldn't pursue Vulkan development for Fermi-based cards six weeks ago at GTC (page 55 of the PDF presentation). With many people upgrading, many Fermi cards being underpowered for future games (as well as most having only 1GB or 2GB of vRAM), and the current profile of gaming shifting to upgrades (as the new JPR figures confirm, with enthusiast card sales doubling over the last year), they decided to concentrate on newer architectures. Realistically, only the GTX 580 maintains any degree of competitiveness with modern architectures... so nem, while not trolling the veracity of support, still continues to troll threads with unrelated content. Hardly surprising when even the trolls at wccftech label him a troll of the highest order - not sure if that's an honour at wccf or the lowest form of life. The subject is too boring for me to devote fact-finding time to.

Ok, thanks for an intelligent response. I'm so used to and weary of his trolling I can't tell when he's not.
Posted on Reply
#86
the54thvoid
Super Intoxicated Moderator
rtwjunkie: Ok, thanks for an intelligent response. I'm so used to and weary of his trolling I can't tell when he's not.
Must be that weird allusion to free speech being misconstrued as a right in privately owned forums. Quite poor of the admins to continually allow troll posts and posters to continue.
I'm all for reasoned, if somewhat biased, viewpoints from either side, but seriously, some members should be banned. TPU's tolerance of trolls is a sign of misguided liberalism. Troll posts are damaging to a site's reputation.
Posted on Reply
#87
medi01
HumanSmoke: By your reasoning, any fact-based article or book about a distasteful event in human history (i.e. armed conflict) means that the authors automatically condone the actions of the combatants.
Stating FACTS isn't. Voicing assessments, such as "the Soviets bombed the hell out of Berlin, which was great, since it allowed modern houses to be built," is.
HumanSmoke: if you think that the game dev software R&D has no merit
No, I never implied that.
A cross-platform, PhysX-like API could have pushed the market forward. Each hardware company would need to invest in implementing it on its platform, and game developers could use it for CORE game mechanics.
Grab it, turn it proprietary, and suddenly it can only be used for a bunch of meaningless visual effects.

There isn't much to add to that, though: you clearly think the latter is good for the market, I think it is bad; these are just two opinions, not facts. Let's leave it at that.
Posted on Reply
#88
HumanSmoke
medi01: There isn't much to add to that, though: you clearly think the latter is good for the market
What a load of bullshit.
Show me any post in this thread where I've voiced the opinion that proprietary standards are good for the market.

You are one very ineffectual troll :roll:
Posted on Reply
#89
medi01
HumanSmoke: Show me any post in this thread where
Post #67 in this very thread.
Posted on Reply
#90
HumanSmoke
medi01:
HumanSmoke: Show me any post in this thread where I've voiced the opinion that proprietary standards are good for the market.
Post #67 in this very thread.
You really don't have a clue, do you? :roll: Nowhere in that post did I say anything about proprietary standards being good for the market. The post concerned Nvidia's strategy - A FACT, not an opinion...
HumanSmoke: QFT, although I suspect any reasoned argument is lost on medi01. He seems to have lost the plot of the thread he jumped on - which was about the various companies' positions in their respective markets and how they arrived there.
So what? The philosophical debate over the ethics of PhysX doesn't alter the fact that Nvidia used its gaming development program to further its brand awareness. They are two mutually exclusive arguments. Do me a favour - if you're quoting me at least make your response relevant to what is being discussed.
You should spend some time trying to understand what is posted before answering a post. I'd suggest popping for a basic primer.

I'd actually make an attempt to report your trolling, but 1. as @the54thvoid noted, the threshold must be quite high, and 2. I'm not sure you aren't just lacking basic reading skills rather than trolling.
Posted on Reply
#91
the54thvoid
Super Intoxicated Moderator
medi01: PhysX was BOUGHT and forcefully made exclusive. At best it is "NV bought game development program".
Then you slap another nice sum to bribe devs to use it, and, yay, it's sooo good for customers.
Just for the hell of it, let's use 100% reason.

PhysX was exclusive before Nvidia bought it. You had to buy an Ageia PhysX card to run it (which made it a hardware-exclusive technology). Even then, Ageia had bought NovodeX, whose physics engine the technology was built on. They didn't do very well with their product. That was the issue: without Nvidia buying it, PhysX as devised by Ageia was going nowhere - devs wouldn't code for it because few people bought the add-on card. Great idea - zero market traction. Nvidia and ATI were looking at developing physics processing as well. So Nvidia acquired Ageia instead to use its tech and, in doing so, push it to a far larger audience with, as @HumanSmoke points out, a better gaming and marketing relationship with card owners and developers alike.
medi01: At best it is "NV bought game development program"
Is logically false. NV bought Ageia - a company with a physical object to sell (IP rights). NV used their own game development program to help push Physx.

As far as bribing devs goes - it's not about bribing devs. You assist them financially to make a feature of a game that might help sell it. A dev won't use a feature unless it adds to the game. Arguably, PhysX doesn't bring too much to the table anyway, although in today's climate, particle modelling using PhysX and async compute combined would be lovely.

All large companies will invest in smaller companies if it suits their business goal. So buying Ageia and allowing all relevant Nvidia cards to use its IP was a great way to give access to PhysX to a much larger audience, albeit Nvidia owners only. In the world of business you do not buy a company and then share your fruits with your competitor. Shareholders would not allow it. Nvidia and AMD/ATI are not charitable trusts - they are owned by shareholders who require payment of dividends. In a similar way to Nvidia holding PhysX, each manufacturer also has its own architecture-specific IP. They aren't going to help each other out.

Anyway, enough of reason. The biggest enemies of the PC space are the console developers and software publishing houses, not Nvidia. In fact, without Nvidia pushing and AMD reacting (and vice versa), the PC industry would be down the pan. So whining about how evil Nvidia is does not reflect an accurate understanding of how strongly Nvidia is propping up PC gaming. Imagine if AMD stopped focusing on discrete GPUs and only worked on consoles. Imagine what would happen to PC development then? Nvidia would have to fight harder to prove how much we need faster, stronger graphics.
Posted on Reply
#92
BiggieShady
the54thvoid: Arguably, PhysX doesn't bring too much to the table anyway, although in today's climate
Let's not forget how the PhysX SDK has advanced over the years since the x87 fiasco in 2010. The latest version, PhysX SDK 3.x, has full multithreading and SIMD optimizations and is one of the fastest solutions currently available.
My point is devs choose PhysX because it runs well across all CPU architectures. Yes, even AMD and ARM.
On the GPU side, PhysX has grown into the entire GameWorks program: everything optimized for NV architecture (which is the worst-case scenario for AMD architecture) and locked in prebuilt DLLs that come with the cheapest licence; if you want to optimize for AMD, you buy an expensive licence that gets you the source code. My take is that's a dick move when you already have 80% of the market, but also a necessary one when you consider consoles are 100% AMD.
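For reference, here is a rough sketch of the CPU-side PhysX 3.x setup described above - the names follow the public SDK as I remember it, so treat the details as approximate rather than copy-paste material. The point is simply that the simulation is stepped by a pool of CPU worker threads, with no GPU involved at all:

// Hedged sketch: CPU-only PhysX 3.x initialization with a multithreaded dispatcher.
// Assumes the PhysX 3.x headers/libs are available; error handling is omitted.
#include <PxPhysicsAPI.h>
using namespace physx;

int main()
{
    static PxDefaultAllocator     allocator;
    static PxDefaultErrorCallback errorCallback;

    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, allocator, errorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation,
                                               PxTolerancesScale(), false);

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);   // 4 CPU worker threads
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;

    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation; solver work is spread across the worker threads.
    for (int i = 0; i < 60; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);   // block until the step completes
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}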
Posted on Reply
#93
FordGT90Concept
"I go fast!1!11!1!"
Devs only choose PhysX because NVIDIA sponsored the title/engine. PhysX is rarely/never seen outside of sponsorship.

If you don't have an NVIDIA GPU, there is no hardware acceleration. Because of this, PhysX is only used in games for cosmetic reasons, not practical reasons. If they used PhysX for practical reasons, the game would break on all systems that lack an NVIDIA GPU. PhysX is an impractical technology which goes to my previous point that it is only used where sponsorship is involved.

Most developers out there have written their own code for handling physics inside their respective engines. Case in point: Frostbite. About the only major engine that still uses PhysX is Unreal Engine. As per the above, most developers on Unreal Engine code on the assumption that there is no NVIDIA card.


Edit: Three games come to mind as relying on physics: Star Citizen, BeamNG.drive, and Next Car Game: Wreckfest. The former two are on CryEngine. None of them use PhysX.
Posted on Reply
#94
BiggieShady
GPU PhysX and the rest of GameWorks is what it is: locked, sponsored, and heavily optimized for NV architecture... some of it in CUDA, some of it in DirectCompute; it's a mess and all cosmetics. I'm saying their CPU PhysX SDK is good and popular, and on all architectures. Every game on the Unreal and Unity 3D engines uses it.
Posted on Reply
#95
HumanSmoke
the54thvoid: NV bought Ageia - a company with a physical object to sell (IP rights). NV used their own game development program to help push PhysX.
As ATI, and later AMD, would have done had they actually bought Ageia rather than spending two years trying to publicly lowball the company. It is no coincidence that Richard Huddy - then head of ATI/AMD's game dev program - was the one repeatedly talking about acquiring the company, rather than AMD's CEO, CTO, or CFO.
the54thvoid: Nvidia and ATI were looking at developing physics processing as well.
Yes. ATI had hitched their wagon to Intel's star; HavokFX was to be the consummation of their physics marriage. Then AMD acquired ATI, which broke off the engagement; Intel swallowed up Havok and proceeded to play along with AMD's pipe-dream for HavokFX, to the tune of zero games actually using it.
the54thvoid: All large companies will invest in smaller companies if it suits their business goal. So buying Ageia and allowing all relevant Nvidia cards to use its IP was a great way to give access to PhysX to a much larger audience, albeit Nvidia owners only. In the world of business you do not buy a company and then share your fruits with your competitor.
[sarcasm] Are you sure about that? AMD acquired ATI - didn't AMD make ATI's software stack such as Avivo/Avivo HD free to Nvidia, Intel, S3, SiS etc. [/sarcasm]
FordGT90Concept: Devs only choose PhysX because NVIDIA sponsored the title/engine. PhysX is rarely/never seen outside of sponsorship.
Very much agree. Game developers are a lazy bunch of tightwads if the end result (unpatched) is any indication. If a vendor is willing to make life easier for them with support (and this doesn't just apply to PhysX), the dev studios will in all likelihood sign up before the sales pitch is halfway through.
Posted on Reply
#96
medi01
the54thvoid: Just for the hell of it, let's use 100% reason.
Ok. Since the HeSaidSheSaidIDidn'tMeanThatHereIsPersonalInsultToProveIt in this thread is already annoying enough, could you please confirm that I got you right, that:

1) PhysX was proprietary anyway, so nVidia did no harm in that regard. On the contrary, a much wider audience now had access to PhysX. Shareholders would not understand it if nVidia had a codepath for AMD GPUs.
2) What nVidia bought was basically the owner of a funny, useless (since it had next to no market penetration) card that could do "physics computing". Well, there was some know-how in it, but actually NV used its own game development program to push PhysX.
3) Paying devs to use your software that runs well on your hardware but has a terrible impact when running on a competitor's hardware is not bribing, it's "assisting them financially to make a feature of a game that might help sell it".
4) Consoles are the main enemies of the PC world.
5) If AMD quit the discrete desktop GPU market altogether, nVidia "would have to fight harder to prove how much we need faster, stronger graphics".
Posted on Reply
#97
FordGT90Concept
"I go fast!1!11!1!"
Where Ageia didn't have the resources to bribe developers to implement their code, NVIDIA does; therein lies the problem.

NVIDIA wasn't interested in Ageia's hardware. They wanted the API, which acted as middleware and executed on dedicated hardware. NVIDIA modified the API to execute on x86/x64/CUDA. In response to NVIDIA snatching PhysX, Intel snatched Havok. If memory serves, Intel was going to work with AMD on HavokFX but the whole thing kind of fell apart.

Pretty sure the decision to buy Ageia came from NVIDIA's GPGPU CUDA work. NVIDIA had a reason for the scientific community to buy their cards but they didn't have a reason for the game development community to buy their cards. Ageia was their door into locking developers (and consumers) into NVIDIA hardware. Needless to say, it worked.

Consoles always have been and always will be simplified, purpose-built computers. I wouldn't call them "enemies" because they represent an audience that makes possible games that wouldn't exist if there were only PC gaming (Mass Effect comes to mind, as does the size, scope, and scale of The Witcher 3 and GTA5).

I don't buy that argument in #5 at all. NVIDIA would likely double the price of GPUs at each tier and that's about it. The market always needs better performance (e.g. VR and 4K gaming, laser scanning and 3D modeling).
Posted on Reply