
NVIDIA to Unveil "Pascal" at the 2016 Computex

medi01
Whaaaaaaa....
Talking about arguments lost on opponents, ironic.

A company does NOT need to strong-arm journalists and suppliers to build great products.
A company does NOT need to force proprietary APIs to build great products.

You referred to shitty practices as if they were something good (for customers) and worth following.
No, they clearly aren't.
 

rtwjunkie

PC Gaming Enthusiast
It sucks when they play dirty, the way nVidia does.

That plus all the similar comments. What I find amusing is that you are so very naive as to imagine AMD is somehow a model company.

Frankly, your idealized and warped view of the business world does nothing but show you to be out of your element. You expect perfection and exaggerate the negatives, with normal business practices being blown up into a nefarious scheme to spread "evil".

LMFAO
 
medi01
What I find amusing is that you are so very naive as to imagine AMD is somehow a model company.
Strong-arm politics only works if you have a dominant market position. AMD, being a permanent underdog, cannot do such things even if it wanted to; that doesn't mean it wouldn't if it could.

...normal business practices...
Normal? As in "everyone does it"? Or "the way it should be"? Or "I don't give a flying f**k"?
Make up your mind.

There are countries where the "normal" things nVidia did to XFX are illegal.
 
HumanSmoke
Yeah I know but I meant a series of them. It might have been Mars I was thinking about. For some reason I was thinking about Sapphire.
The only Sapphire premium dual-GPU cards I can think of were the Toxic version of the HD 5970 (they and XFX released fairly pricey 4GB versions) and the HD 4870 X2 Atomic.
Talking about arguments lost on opponents, ironic.
It's not irony. You are the only one involved in the argument you are making.
A company does NOT need to strong-arm journalists and suppliers to build great products.
A company does NOT need to force proprietary APIs to build great products.
That has absolutely nothing to do with the points being made by me and others. You are right: Nvidia and Intel don't have to do these things to build great products. It is also a FACT that both Intel and Nvidia are the respective market leaders based on strategies that DO leverage these practices among other facets of their business. Whether they NEED to or not is immaterial to the point being made. It is simply historical fact that these practices are part of how they got where they are. You can argue all day about the rights and wrongs, but that has no bearing on the position they occupy. Squealing about injustice doesn't retroactively change the totals in the account books.
You referred to shitty practices as if they were something good (for customers) and worth following.
No I didn't. You are so caught up in your own narrative that you don't understand that some people can view the industry dispassionately, in historical context. Not everyone is like you, eager to froth at the bung at the drop of a hat and turn the industry into some personal crusade. Stating a fact isn't condoning a practice. By your reasoning, any fact-based article or book about a distasteful event in human history (e.g. armed conflict) means that the authors automatically condone the actions of the combatants.
Let's face it, from your posting history you just need any excuse, however tenuous, to jump onto the soapbox. Feel free to do so, but don't include quotes and arguments that have no direct bearing on what you are intent on sermonizing about.
That plus all the similar comments. What I find amusing is that you are so very naive as to imagine AMD is somehow a model company.
Presumably this model company's dabbling in price fixing (which continued for over a year after AMD assumed control of ATI), posting fraudulent benchmarks for fictitious processors and deliberately out-of-date Intel benchmarks, being hit for blatant patent infringement, and a host of other dubious practices don't qualify.
Underdog = Get out of Jail Free.
Market Leader = Burn in Hell.
 

FordGT90Concept

"I go fast!1!11!1!"
So that world would have been better, had nVidia NOT bought it.

There was NO NEED for G-Sync the way it was done; there was nothing special about variable refresh rate, that stuff was already there in notebooks (which is why it didn't take AMD long to counter). The only drive (and wasted money) was to come out with some "only me, only mine!!!" shit, nothing else.

Had it been a common, open standard, it would have pushed the market forward a lot. But no, we have crippled "only this company" shit now. Thanks, great progress.

It's great to have more than one competitive player in the market. It sucks when they play dirty, the way nVidia does.

Strong-arm politics all over the place, on all fronts: XFX, hell, ANAND BLOODY TECH. Punished, learned the lesson; next time they put a cherry-picked, overclocked Fermi up against stock AMD. And that's only the VISIBLE part of it; who fucking knows what's going on underneath.
Had G-Sync not come out, we wouldn't have external adaptive sync today. It likely wouldn't have appeared until DisplayPort 1.3 (coming with Pascal/Polaris) and HDMI 2.1 (no date known). The 1.2a and 2.0a specifications exist because AMD, VESA, and the HDMI Forum couldn't wait 3-4 years to compete with G-Sync.

It's a lot like AMD pushing out Mantle before Direct3D 12 and Vulkan.



Edit: It should also be noted that Ashes of the Singularity now uses async compute and NVIDIA cards take a fairly severe performance penalty (25% in the case of Fury X versus 980 Ti) because of it:
http://www.techpowerup.com/reviews/Performance_Analysis/Ashes_of_the_Singularity_Mixed_GPU/4.html

GCN never cut corners on async compute (a Direct3D 11 feature)--these kinds of numbers should go back to the 7950 when async compute is used. The only reason NVIDIA came out ahead in the last few years is that developers didn't use async compute. One could speculate why that is. For example, because NVIDIA is the segment leader, did developers avoid using it because 80% of cards sold wouldn't perform well with it? There could be more nefarious reasons, like NVIDIA recommending that developers not use it (one wonders if any developers would step forward with proof of this). Oxide went against the grain and did it anyway. The merits of having hardware support for something software doesn't use can be argued but, at the end of the day, it has been part of the specification for years and NVIDIA decided to ignore it in the name of better performance when it is not used.

For their part, Oxide did give NVIDIA ample time to fix it but a software solution is never going to best a hardware solution.
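
To make concrete what "using async compute" means for a developer: under Direct3D 12's multi-engine model, the game submits work on a separate compute queue so the GPU can overlap it with graphics. Below is a minimal, hypothetical C++ sketch of that setup, assuming the Windows 10 SDK; the names are illustrative and this is not Oxide's actual code.

// Minimal async-compute queue setup (illustrative sketch; link d3d12.lib).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Default adapter; feature level 11_0 is the D3D12 minimum.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The usual "direct" queue accepts draws, copies, and compute.
    D3D12_COMMAND_QUEUE_DESC directDesc = {};
    directDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> directQueue;
    device->CreateCommandQueue(&directDesc, IID_PPV_ARGS(&directQueue));

    // A second, compute-only queue: work submitted here MAY run
    // concurrently with graphics work. Whether it actually overlaps is
    // up to the hardware and driver, which is the crux of the
    // GCN-versus-Maxwell debate in this thread.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // A fence orders the queues wherever one consumes the other's output.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // ... record command lists, call ExecuteCommandLists() on each queue,
    // and use directQueue->Wait(fence.Get(), value) for dependencies.
    return 0;
}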
 
Joined
Feb 8, 2012
Edit: It should also be noted that Ashes of the Singularity now uses async compute and NVIDIA cards take a fairly severe performance penalty (25% in the case of Fury X versus 980 Ti) because of it [...]
For their part, Oxide did give NVIDIA ample time to fix it but a software solution is never going to best a hardware solution.

Here is a nice read that should clear things up: http://ext3h.makegames.de/DX12_Compute.html
In a nutshell, both architectures benefit from async compute: GCN profits most from many small, highly parallelized compute tasks, while Maxwell 2 profits most from batching async tasks as if they were draw calls.
When it comes to async compute, the GCN architecture is more forgiving and more versatile; Maxwell needs more special optimization to extract peak performance (or even using DX12 only for graphics and CUDA for all compute :laugh:).
I'm just hoping Nvidia will make the necessary changes to async compute with Pascal, because of all the future lazy console ports.
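
To illustrate the batching point: a hypothetical sketch of the Maxwell-friendly submission pattern, recording many small dispatches into one compute command list and handing them to the queue in a single call. It assumes a device, compute queue, command allocator, root signature, and compute PSO already exist (see the earlier queue-setup sketch).

// Record N small compute jobs, then submit them as ONE batch; GCN is
// happy either way, but this is the pattern described above for Maxwell.
#include <d3d12.h>
#include <wrl/client.h>

void SubmitBatchedCompute(ID3D12Device* device,
                          ID3D12CommandQueue* computeQueue,
                          ID3D12CommandAllocator* allocator,
                          ID3D12PipelineState* pso,
                          ID3D12RootSignature* rootSig,
                          UINT jobCount)
{
    Microsoft::WRL::ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE,
                              allocator, pso, IID_PPV_ARGS(&list));
    list->SetComputeRootSignature(rootSig);
    for (UINT i = 0; i < jobCount; ++i)
        list->Dispatch(64, 1, 1);             // one small job per iteration
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    computeQueue->ExecuteCommandLists(1, lists);  // single batched submit
}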
 

FordGT90Concept

"I go fast!1!11!1!"
Anandtech said:
Update 02/24: NVIDIA sent a note over this afternoon letting us know that asynchronous shading is not enabled in their current drivers, hence the performance we are seeing here. Unfortunately they are not providing an ETA for when this feature will be enabled.

And no, the Anandtech review shows NVIDIA only loses with async compute enabled (0 to -4%), while AMD ranged from -2% to +10%:

The divide gets even crazier at higher resolutions and quality settings:
 

nem

lies and more lies.. :B


 

rtwjunkie

PC Gaming Enthusiast
lies and more lies.. :B



Please provide empirical evidence, testing, or references to prove your post is anything but trolling.
 
HumanSmoke
Please provide empirical evidence, testing, or references to prove your post is anything but trolling.
Nvidia confirmed that they wouldn't pursue Vulkan development for Fermi-based cards six weeks ago at GTC (page 55 of the PDF presentation). With many people upgrading, many Fermi cards being underpowered for future games (as well as most having a 1GB or 2GB vRAM capacity), and the current profile of gaming shifting to upgrades (as the new JPR figures confirm, with enthusiast card sales doubling over the last year), they decided to concentrate on newer architectures. Realistically, only the GTX 580 maintains any degree of competitiveness with modern architectures... so nem, while not trolling on the veracity of support, still continues to troll threads with unrelated content. Hardly surprising when even the trolls at wccftech label him a troll of the highest order - not sure if that's an honour at wccf or the lowest form of life. The subject is too boring for me to devote fact-finding time to.

 

rtwjunkie

PC Gaming Enthusiast
Nvidia confirmed that they wouldn't pursue Vulkan development for Fermi-based cards six weeks ago at GTC (page 55 of the PDF presentation) [...]


Ok, thanks for an intelligent response. I'm so used to and weary of his trolling that I can't tell when he's not.
 

the54thvoid

Super Intoxicated Moderator
Ok, thanks for an intelligent response. I'm so used to and weary of his trolling that I can't tell when he's not.

Must be that weird allusion to free speech being misconstrued as a right on privately owned forums. It's quite poor of the admins to continually allow troll posts and posters to continue.
I'm all for reasoned, if somewhat biased, viewpoints from either side, but seriously, some members should be banned. TPU's tolerance of trolls is a sign of misguided liberalism. Troll posts are damaging to a site's reputation.
 
medi01
By your reasoning, any fact-based article or book about a distasteful event in human history (e.g. armed conflict) means that the authors automatically condone the actions of the combatants.
Stating FACTS isn't condoning. Voicing assessments, such as "the Soviets bombed the hell out of Berlin, which was great, since it allowed modern houses to be built", is.

if you think that the game dev software R&D has no merit
No, I never implied that.
A cross-platform, PhysX-like API could have pushed the market forward. Each hardware company would need to invest in implementing it on its platform, and game developers could use it for CORE game mechanics.
Grab it, make it proprietary, and suddenly it can only be used for a bunch of meaningless visual effects.

There isn't much to add to that, though: you clearly think the latter is good for the market, I think it is bad; these are just two opinions, not facts. Let's leave it at that.
 
HumanSmoke
There isn't much to add to that, though: you clearly think the latter is good for the market
What a load of bullshit.
Show me any post in this thread where I've voiced the opinion that proprietary standards are good for the market.

You are one very ineffectual troll :roll:
 
HumanSmoke
Show me any post in this thread where I've voiced the opinion that proprietary standards are good for the market.
Post #67 in this very thread.
You really don't have a clue, do you? :roll: Nowhere in that post did I say anything about proprietary standards being good for the market. The post concerned Nvidia's strategy - A FACT, not an opinion...
QFT, although I suspect any reasoned argument is lost on medi01. He seems to have lost the plot of the thread he jumped on - which was about the various companies' positions in their respective markets and how they arrived there.
So what? The philosophical debate over the ethics of PhysX doesn't alter the fact that Nvidia used its gaming development program to further its brand awareness. They are two mutually exclusive arguments. Do me a favour - if you're quoting me, at least make your response relevant to what is being discussed.

You should spend some time trying to understand what is posted before answering. I'd suggest popping for a basic primer.

I'd actually make an attempt to report your trolling, but 1. as @the54thvoid noted, the threshold must be quite high, and 2. I'm not sure you aren't just lacking basic reading skills rather than trolling.
 

the54thvoid

Super Intoxicated Moderator
PhysX was BOUGHT and forcefully made exclusive. At best it is "NV bought game development program".
Then you slap another nice sum to bribe devs to use it, and, yay, it's sooo good for customers.

Just for the hell of it, let's use 100% reason.

PhysX was exclusive before Nvidia bought it. You had to buy an Ageia PhysX card to run it (that made it a hardware-exclusive technology). Even then, Ageia had bought NovodeX, who had created the physics processing chip. They didn't do very well with their product. That was the issue: without Nvidia buying it, PhysX, as devised by Ageia, was going nowhere - devs wouldn't code for it because few people bought the add-on card. Great idea - zero market traction. Nvidia and ATI were looking at developing physics processing as well. So Nvidia acquired Ageia instead to use its tech and, in doing so, push it to a far larger audience with, as @HumanSmoke points out, a better gaming and marketing relationship with card owners and developers alike.

At best it is "NV bought game development program

Is logically false. NV bought Ageia - a company with a physical object to sell (IP rights). NV used its own game development program to help push PhysX.

As far as bribing devs goes - it's not about bribing devs. You assist them financially to make a feature of a game that might help sell it. A dev won't use a feature unless it adds to the game. Arguably, PhysX doesn't bring too much to the table anyway, although in today's climate, particle modelling using PhysX and async compute combined would be lovely.

All large companies will invest in smaller companies if it suits their business goals. So buying Ageia and allowing all relevant Nvidia cards to use its IP was a great way to give access to PhysX to a much larger audience, albeit Nvidia owners only. In the world of business you do not buy a company and then share the fruits with your competitor. Shareholders would not allow it. Nvidia and AMD/ATI are not charitable trusts - they are owned by shareholders who require payment of dividends. Just as Nvidia holds PhysX, each manufacturer also has its own architecture-specific IP. They aren't going to help each other out.

Anyway, enough of reason. The biggest enemy of the PC race is the console developers and software publishing houses, not Nvidia. In fact, without Nvidia pushing and AMD reacting (and vice versa), the PC industry would be down the pan. So whining about how evil Nvidia is does not reflect an accurate understanding of how strongly Nvidia is propping up PC gaming. Imagine if AMD stopped focusing on discrete GPUs and only worked on consoles. Imagine what would happen to PC development then? Nvidia would have to fight harder to prove how much we need faster, stronger graphics.
 
Joined
Feb 8, 2012
Arguably, PhysX doesn't bring too much to the table anyway, although in today's climate
Let's not forget how the PhysX SDK has advanced over the years since the x87 fiasco in 2010. The latest version, PhysX SDK 3.x, has full multithreading and SIMD optimizations and is one of the fastest solutions currently available.
My point is that devs choose PhysX because it runs well across all CPU architectures. Yes, even AMD and ARM.
On the GPU side, PhysX has grown into the entire GameWorks program, everything optimized for NV's architecture (which is the worst-case scenario for AMD's architecture) and locked into prebuilt DLLs that come with the cheapest licence; if you want to optimize for AMD, you buy an expensive license that gets you the source code. My take on that: it's a dick move when you already have 80% of the market, but also a necessary one when you consider 100% AMD in consoles.
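
For reference, this is roughly what the CPU-side setup he's describing looks like with the PhysX 3.x SDK; a minimal sketch, assuming the SDK headers and libraries are on the include and link paths. The CPU dispatcher is where the multithreading lives: simulation tasks are spread over a worker-thread pool, identically on x86, AMD, or ARM CPUs.

// Minimal CPU-only PhysX 3.x setup sketch (version ~3.3 API assumed).
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Worker-thread pool: 4 threads for simulation tasks.
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(4);

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = dispatcher;
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation by one 60 Hz frame and wait for the results.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}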
 

FordGT90Concept

"I go fast!1!11!1!"
Devs only choose PhysX because NVIDIA sponsored the title/engine. PhysX is rarely/never seen outside of sponsorship.

If you don't have an NVIDIA GPU, there is no hardware acceleration. Because of this, PhysX is only used in games for cosmetic reasons, not practical reasons. If they used PhysX for practical reasons, the game would break on all systems that lack an NVIDIA GPU. PhysX is an impractical technology, which goes back to my previous point that it is only used where sponsorship is involved.

Most developers have written their own code for handling physics inside their respective engines. Case in point: Frostbite. About the only major engine that still uses PhysX is Unreal Engine. As per the above, most developers on Unreal Engine code on the assumption that there is no NVIDIA card.


Edit: Three games come to mind as relying on physics: Star Citizen, BeamNG.drive, and Next Car Game: Wreckfest. The first two are on CryEngine. None of them use PhysX.
 
Joined
Feb 8, 2012
GPU PhysX and the rest of GameWorks is what it is: locked, sponsored, and heavily optimized for NV's architecture... some of it in CUDA, some of it in DirectCompute; it's a mess, and all cosmetics. I'm saying their CPU PhysX SDK is good and popular, and on all architectures. Every game on the Unreal and Unity 3D engines uses it.
 
HumanSmoke
NV bought Ageia - a company with a physical object to sell (IP rights). NV used its own game development program to help push PhysX.
As ATI, and later AMD, would have done had they actually bought Ageia rather than spending 2 years trying to publicly lowball the company. It is no coincidence that Richard Huddy - then head of ATI/AMD's game dev program - was the one repeatedly talking about acquiring the company, rather than AMD's CEO, CTO, or CFO.
Nvidia and ATI were looking at developing physics processing as well.
Yes. ATI had hitched their wagon to Intel's star. HavokFX was to be the consummation of their physics marriage. Then AMD acquired ATI, which broke off the engagement; Intel then swallowed up Havok and proceeded to play along with AMD's pipe-dream of HavokFX to the tune of zero games actually using it.
All large companies will invest in smaller companies if it suits their business goals. So buying Ageia and allowing all relevant Nvidia cards to use its IP was a great way to give access to PhysX to a much larger audience, albeit Nvidia owners only. In the world of business you do not buy a company and then share the fruits with your competitor.
[sarcasm] Are you sure about that? AMD acquired ATI - didn't AMD make ATI's software stack, such as Avivo/Avivo HD, free to Nvidia, Intel, S3, SiS etc.? [/sarcasm]
Devs only choose PhysX because NVIDIA sponsored the title/engine. PhysX is rarely/never seen outside of sponsorship.
Very much agree. Game developers are a lazy bunch of tightwads if the end result (unpatched) is any indication. Vendors willing to make life easier for them with support (and this doesn't just apply to PhysX) will in all likelihood have dev studios signed up before the sales pitch is halfway through.
 
medi01
Just for the hell of it, let's use 100% reason.
Ok. Since the HeSaidSheSaidIDidn'tMeanThatHereIsAPersonalInsultToProveIt in this thread is already annoying enough, could you please confirm that I got you right:

1) PhysX was proprietary anyway, so nVidia did no harm in that regard. On the contrary, a much wider audience now had access to PhysX. Shareholders would not understand it if nVidia had a codepath for AMD GPUs.
2) What nVidia bought was basically the owner of a funny, useless card (since it had next to no market penetration) that could do "physics computing". Well, there was some know-how in it, but actually NV used its own game development program to push PhysX.
3) Paying devs to use your software that runs well on your hardware but has a terrible impact when running on a competitor's hardware is not bribing; it's "assisting them financially to make a feature of a game that might help sell it".
4) Consoles are the main enemies of the PC world.
5) If AMD quit the discrete desktop GPU market altogether, nVidia "would have to fight harder to prove how much we need faster, stronger graphics".
 

FordGT90Concept

"I go fast!1!11!1!"
Where Ageia didn't have the resources to bribe developers to implement their code, NVIDIA does; therein lies the problem.

NVIDIA wasn't interested in Ageia's hardware. They wanted the API, which acted as middleware and executed on dedicated hardware. NVIDIA modified the API to execute on x86/x64/CUDA. In response to NVIDIA snatching PhysX, Intel snatched Havok. If memory serves, Intel was going to work with AMD on HavokFX, but the whole thing kind of fell apart.

Pretty sure the decision to buy Ageia came from NVIDIA's GPGPU/CUDA work. NVIDIA had a reason for the scientific community to buy their cards, but they didn't have a reason for the game development community to buy them. Ageia was their door into locking developers (and consumers) into NVIDIA hardware. Needless to say, it worked.

Consoles always have been and always will be simplified, purpose-built computers. I wouldn't call them "enemies" because they represent an audience that makes games possible that wouldn't be if there were only PC gaming (Mass Effect comes to mind, as do the size, scope, and scale of Witcher 3 and GTA5).

I don't buy argument #5 at all. NVIDIA would likely double the price of GPUs at each tier and that's about it. The market always needs better performance (e.g. VR and 4K gaming, laser scanning and 3D modeling).
 