
PhysX will Die, Says AMD

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,944 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
LMAO....of course AMD would say this; there might actually be gamers out there buying green just for PhysX. They might just as well say "buy the HD4870 because it's 4 times faster than a GTX280". Their argument, whether you believe it or not, bears little credibility when it comes from THE major competitor.

As a matter of interest, NVidia's market share has dropped by quite a bit this year, but it still has a much bigger market share than AMD/ATI. This is quite an interesting little piece.....NVidia actually admitting that they were caught "on the hop" by AMD. Similar things have popped up around the web on the same subject, but this seems to quite nicely bring all the angles together...........

http://www.crn.com/it-channel/212001134
 

Tatty_Two

Gone Fishing
"Proprietary" is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of which API is used to create the physics. If both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.


Agreed, well most of it. I think AMD may also be being clever in a kind of sadistic way. Their "current" architectural route (and possibly their future route), in its most basic terms, is to throw in loads of fixed-speed shaders (for fixed, read slow :)), whereas NVidia is on the few-but-faster road (I am deliberately not going into the more complex differences between the 2!). Now, with PhysX, it will use a proportion of the shaders to get the job done. A GPU running shaders at twice the speed of a competitor's is, I would GUESS, gonna get much better-performing PhysX. AMD would not want that, perhaps?
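The shader-clock guess above can be made concrete with some back-of-the-envelope arithmetic. The unit counts and clocks below are purely illustrative (not the specs of any real card); this is just a minimal sketch of why raw shader throughput depends on the product of unit count and clock, not on either number alone:

```python
# Back-of-the-envelope peak throughput for two hypothetical shader
# designs: "many slow" units vs "few fast" units. For a shader-bound
# physics workload, what matters is unit_count * clock.

def peak_gflops(shader_units, shader_clock_mhz, flops_per_unit_per_clock=2):
    """Peak throughput in GFLOPS; 2 FLOPs/clock models one multiply-add."""
    return shader_units * shader_clock_mhz * flops_per_unit_per_clock / 1000.0

many_slow = peak_gflops(800, 750)   # hypothetical wide, low-clock design
few_fast = peak_gflops(240, 1500)   # hypothetical narrow, high-clock design
print(many_slow, few_fast)          # 1200.0 720.0
```

So doubling the shader clock only wins if the competitor doesn't have more than twice the units; the argument cuts both ways.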
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
PhysX is not going to kill anyone... it's DX10.1 all over again. No one is going to make/sell a game relying on this thing if it's gonna screw over a good portion of the market, or leave them out of any significant part of the game. Unless AMD/ATI support PhysX it will never be anything more than optional eye candy, and that limited role will limit it as a factor for people to buy into (would you pay $500 for a new card instead of $300 so you can get more broken glass?).

Some of you are talking about studios and games adopting PhysX, but what have you seen so far? Mirror's Edge seems to do a bit of showcase work for PhysX, but how many would really buy that game? Being an EA game through and through, I personally have a hard time believing the game would offer anything more than what I could get watching the trailer or some demos. Otherwise I haven't seen PhysX do anything that Havok wasn't already doing. There's no extra edge here. There's probably some incentives from Nvidia - but then that comes back to what this guy was saying in the first place:

"We cannot provide comments on our competitor's business model except that it is awfully hard to sustain support by monetary incentives. The product itself must be competitive. We view Havok technologies and products to be the leaders in physics simulation and this is why we are working with them. Games developers share this view."



This lends itself to why this stuff won't kill ATI... at the end of the list is Apple: didn't they put together the OpenCL standard? Which do you think they'll be pushing, CUDA or their own standard? Microsoft will be pushing its own thing with DX11 down the road. Adobe recently took advantage of Nvidia's CUDA and ATI's Stream, if I'm not mistaken... but do you think they'll want to keep on making 2 versions of their products when other people are pushing for a unified standard?



I guess this is all moot anyway... if AMD responding to an Nvidia announcement for a reporter will guarantee success for PhysX, then surely the grandstanding Nvidia took part in vs. Intel will have Larrabee burying all competition.


At the end of the day people may not like what this guy is saying, why, or how, but it's true. AMD is not going to support Nvidia's proprietary APIs (and why the hell would they?), and without that support other companies will have less incentive to get on board unless Nvidia provides it. That requires either a superior product or, probably, cash incentives. Now realistically... when the immediate alternatives to Nvidia's systems seem to be OpenCL (Apple - but open), DirectX 11 (Microsoft), and Havok (Intel), do you think these other standards won't have the resources behind them to provide both of those things more so than Nvidia? If you were in AMD's shoes, who would you side with? They could do it all, but seriously... why? It'd just confuse other efforts and probably waste their own resources, and for what? To better prop up the competition that they can beat? So they can claim some moral high ground when they recall how Nv made DX10.1 completely moot?

All that falls apart given the simple fact that, had Ati adopted PhysX when Nvidia gave it away for free, or at least when the guy at NGOHQ already made PhysX on Ati cards* possible, then we would already have games with 10 times better physics. The point of "Game developers are not supporting it because it doesn't run on all cards" loses its value when that is your own fault, when you have obstructed all efforts to get the thing running on your cards.

And the "We are not going to follow a standard that only a few developers have adopted" line is hypocritical at best, when you are doing as much as you can for that to be true. These fallacious comments included.

It's a lame path that AMD has taken. As I said, I have lost any faith in them, and the worst thing is that they sided with the most dangerous and "evil" company, Intel, and once Intel has their GPU they are going to crush them so hard it's going to hurt us as well. As a popular quote says: "One must never rely on alliances with the powerful; they probably want something more from that allegiance than to help us."

*What can be read in that thread is not the end of the story. Nvidia ended up supporting that guy's efforts 100%, but obviously it's not in their hands to release such a thing. They could help make it feasible, but not release it. Nor can the guy release it; remember the guy releasing Vista drivers for X-Fi cards. AMD, on the other hand, refused any help to the guy, not even providing a single HD4000 card for testing. Furthermore, the thing suddenly went into the shadows, probably because AMD (or Intel, due to AMD's deal with Havok) threatened him with litigation. After that, and under a judge's mandate, he probably can't say anything about the issue...

I would like to update you about what’s going on. First, we were very pleased to see so many users and readers have applied to our Beta Test Program! To be specific: 191 users, 5 spies and 2 double agents have submitted applications during the last week. Those that are chosen will be informed early, before the beta is available – we still can’t point to “when” at this stage.

The bad news is we still don’t have access to any HD 4800 hardware. It is very important for this project to receive AMD’s support on both developer and PR levels. It seems that AMD is still not being cooperative; we get the feeling that they want this project to fail. Perhaps their plan is to strangle PhysX since AMD and Intel have Havok. We truly hope this is not the case, since “format wars” are really bad for consumers (for example: Blu-ray vs. HD-DVD).

Before we get to the good news, I’m going to ask you to hold on to something steady, as some of you are going to feel a bit dizzy after you hear this. The truth is… Nvidia is now helping us with the project and it seems they are giving us their blessing. It’s very impressive, inspiring and motivating to see Nvidia's view on this. Why are they helping us? My best guess would be: they probably want to take on Intel with CUDA and to deal with the latest Havok threat from both AMD and Intel.

Some other good news: we are getting a lot of help from cool journalists like Theo Valich to address the HD 4800 access issue. I can confirm that our CUDA Radeon library is almost done and everything is going as planned on this side. There are still some issues that need to be addressed: adding Radeon support in CUDA isn’t a big deal - but it’s not enough! We also need to add CUDA support at AMD’s driver level, and it’s being addressed as we speak.

And since then AFAIK nothing.
 

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.03/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
Well, in AMD's defense, Nvidia didn't want to adopt DX10.1 out of spite and pushed developers not to use that feature; it would've given AMD a performance boost. This time it's kind of the same thing: PhysX would give Nvidia some performance supremacy if AMD were to use this feature, and they would drop dead before using it :).
You can understand how technology progresses when money is in the way :).
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,300 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Well, in AMD's defense, Nvidia didn't want to adopt DX10.1 out of spite and pushed developers not to use that feature; it would've given AMD a performance boost. This time it's kind of the same thing: PhysX would give Nvidia some performance supremacy if AMD were to use this feature, and they would drop dead before using it :).
You can understand how technology progresses when money is in the way :).

How do you come to the conclusion that if NVIDIA had DirectX 10.1 compliance, ATI would have some sort of supremacy? On the other hand, I can straightaway call PhysX on NVIDIA GPUs a supremacy over PhysX on ATI GPUs, and I have the F@H performance numbers to back my statement.

If developers used DirectX 10.1 and it became a trend, NVIDIA would've been forced to gain compliance or devs would've avoided DX10.1 altogether in fear of sales loss due to a major GPU player not supporting it. Chicken and egg really :).
 

DarkMatter

New Member
Well in AMD's defense , Nvidia didn't want to adopt dx10.1 in spite and forced developers to not use that feature , it would've gived AMD a performance boost , this time it's kind of the same thing , physx would give Nvidia some perf. supremacy if AMD would use this feat. and they would drop dead before using it :).
You can understand how tehnology progresses when money is in the way :).

That's only half the story. Nvidia didn't force developers not to use it, and the thing that makes DX10.1 faster is also available in DX10, but only as a suggestion, not a requirement. More specifically, the memory buffer readback, something that Nvidia does support in hardware.

The idea (a meme at this point) you pointed to above is purely based on what happened with Assassin's Creed. Well, the developers themselves said it had nothing to do with Nvidia. That developer stance is not an isolated thing. Ati supporters have been claiming those kinds of things about TWIMTBP since the beginning, while Ati never had a word to say about it (interesting), and many developers (id, Epic Games, Crytek, Ubisoft) have specifically denied such a thing many times. If I have to believe anyone, I believe ALL the developers. Between what developers say and an Ati-fan-made conspiracy theory, in which even Ati never took part, I obviously take the developers' word.

But let's get back to what happened with Assassin's Creed. The same developer, Ubisoft Montreal, has just released Far Cry 2, and in this game DX10 antialiasing performance is higher than in DX9 and the same as under DX10.1*. Intriguing? Conspiracy again? Not for those of us who know (and always have known) the truth about DX10.1.

* Just in case people don't get the whole meaning of that sentence: Far Cry 2 does support DX10.1, but in the words of the developers, it didn't make a difference in performance or features. Under DX10.1 things are simpler to program; no one will ever dispute that.
 

leonard_222003

New Member
How do you come to the conclusion that if NVIDIA had DirectX 10.1 compliance, ATI would have some sort of supremacy? On the other hand, I can straightaway call PhysX on NVIDIA GPUs a supremacy over PhysX on ATI GPUs, and I have the F@H performance numbers to back my statement.

If developers used DirectX 10.1 and it became a trend, NVIDIA would've been forced to gain compliance or devs would've avoided DX10.1 altogether in fear of sales loss due to a major GPU player not supporting it. Chicken and egg really :).

Well, it wouldn't be much of a problem for them to implement DX10.1, but spending money on a new generation of graphics cards would've been a problem. If developers used DX10.1, graphics cards based on the old architecture (G80, G92) would've been left a bit behind by ATI products in some games, and there is the problem of AA on foliage, which Nvidia can't do because it doesn't have DX10.1; Unigine proved that.
Assassin's Creed is another proof: Nvidia made some noise when a patch enabled DX10.1 for ATI, but in the next patch that feature became unavailable. People started pointing fingers at Nvidia, and AnandTech made an article about that and the performance numbers ATI gained with this feature.
So DX10.1 could've made, and could in the future make, things much better for ATI. Why would Nvidia not adopt this in the new GT200 generation? Because they would kill all their cards in the mainstream market, where ATI already has DX10.1 and it could render a 3870 a worthy foe for an 8800GT.
In one way or another, developers held back from using ATI features, considering Nvidia supports them in developing games when ATI does nothing. Why should they give a f.... about ATI? I can understand the developers, but for the consumer of ATI products this is not nice.
 
Joined
Jan 31, 2005
Messages
2,097 (0.29/day)
Location
gehenna
System Name Commercial towing vehicle "Nostromo"
Processor 5800X3D
Motherboard X570 Unify
Cooling EK-AIO 360
Memory 32 GB Fury 3666 MHz
Video Card(s) 4070 Ti Eagle
Storage SN850 NVMe 1TB + Renegade NVMe 2TB + 870 EVO 4TB
Display(s) 25" Legion Y25g-30 360Hz
Case Lian Li LanCool 216 v2
Audio Device(s) Razer Blackshark v2 Hyperspeed / Bowers & Wilkins Px7 S2e
Power Supply HX1500i
Mouse Harpe Ace Aim Lab Edition
Keyboard Scope II 96 Wireless
Software Windows 11 23H2 / Fedora w. KDE
Just a little wood for the fire: who is putting (my guess) a lot of money into game developers' pockets with the slogan "The way it´s meant to be played"?
 

DarkMatter

New Member
Well, it wouldn't be much of a problem for them to implement DX10.1, but spending money on a new generation of graphics cards would've been a problem. If developers used DX10.1, graphics cards based on the old architecture (G80, G92) would've been left a bit behind by ATI products in some games, and there is the problem of AA on foliage, which Nvidia can't do because it doesn't have DX10.1; Unigine proved that.
Assassin's Creed is another proof: Nvidia made some noise when a patch enabled DX10.1 for ATI, but in the next patch that feature became unavailable. People started pointing fingers at Nvidia, and AnandTech made an article about that and the performance numbers ATI gained with this feature.
So DX10.1 could've made, and could in the future make, things much better for ATI. Why would Nvidia not adopt this in the new GT200 generation? Because they would kill all their cards in the mainstream market, where ATI already has DX10.1 and it could render a 3870 a worthy foe for an 8800GT.
In one way or another, developers held back from using ATI features, considering Nvidia supports them in developing games when ATI does nothing. Why should they give a f.... about ATI? I can understand the developers, but for the consumer of ATI products this is not nice.

First of all, the fact that Nvidia doesn't support DX10.1 doesn't mean they don't support the features under DX10.1. M$ decided that in order to be DX10 compliant, cards had to support EVERY feature specified in the API. That means that if you don't support 1 feature out of 1000, your card can't be labeled DX10 (or DX10.1, or DX11, etc.) compliant. Prior to DX10 that was not necessary. If M$ had applied the same system to previous DX versions, there WOULDN'T BE A SINGLE DX9 or DX8 compliant card on the market. As I said, everything that was used in the DX10.1 version of AC is available in DX10 and on Nvidia cards; Far Cry 2 is the proof. Plus, where did you hear that G80 or G92 (had Nvidia supported DX10.1 100%) would have been behind? Never mind, I know where, which points to why I said what I said in my first entry to this thread...
 

Tatty_Two

Gone Fishing
Just a little wood for the fire: who is putting (my guess) a lot of money into game developers' pockets with the slogan "The way it´s meant to be played"?

Agreed. However, there are a number of "The way it's meant to be played" games that actually perform as well (and in some cases perhaps better) on ATI hardware. I know where you are coming from, but to be fair, from a non-ATI-or-NVidia standpoint, you could argue the reasons why: it seems that one of the teams occasionally has to develop a driver patch for a specific game because their hardware runs the game so poorly, and sometimes that game does not even open with the "The way it's meant to be played" splash screen.

Some might say (again from a middle-of-the-road standpoint) that "The way it's meant to be played" actually helps fund "better games", as that funding, support, blackmail, or whatever you would like to call it gives the developers more time to get it right in the first place - all for our gaming pleasure, of course. I am always a little wary of games that are released and in their first month of retail life have 2 or more patches released to fix bugs that IMO should not have been there in the first place. Many developers are just getting lazy, IMO. 10 or 15 years ago, when the web was probably only accessed by 10% of us, the developer had to try much harder to get the game right the first time, as these "patches/fixes" were of course much less readily available. Kind of a coincidence!
 

DarkMatter

New Member
Just a little wood for the fire: who is putting (my guess) a lot of money into game developers' pockets with the slogan "The way it´s meant to be played"?

Not a single penny has been put in developers' hands under TWIMTBP, nor by Nvidia under other conditions. And by no means do they ask developers to cripple Ati performance. They do help developers with support, access to hardware, and cross-marketing - something very valuable for them - but that's not the same. People don't know how valuable access to hardware and support in man-power form is for developers...

I've been seeing 10 Nvidia names and only 2 AMD/Ati ones in the thanks section of game credits for so long it's not even funny.

I'll tell you a little story. We have three friends: A, N and D (Ati, Nvidia, Developers :)). When D had problems with his wife, N brought beers and had a long conversation with him. When D's mother was sick, N brought medicines and later accompanied them to the doctor. When D lost his job, N advertised his abilities to everybody he knew, which was a lot of people.
When D was...

One day A and N's children (all of them called C, for customers :)) said they wanted the latest console. A and N knew it was released in low quantities and that no store in town had it yet. They did know of a store in the nearest town that was going to have it, but it would soon be out of stock. They had to get there as soon as possible. They had a problem, though: they had no means of transport. D has a motorcycle, so they both call him asking for help. D in the end makes a decision: he will take N to the store and then return and carry A. In the meantime, A can make the trip on his own, by foot or by any other means. A's children cry a lot - why would D take N first? It's not fair! :cry::cry:

But at the end of the day... IT IS.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
a lot of people don't seem to be getting it

PhysX is an nvidia-only way of doing this
DirectX 11 is doing an open (any video card) version of this.

ATI/AMD are saying that nvidia's closed one will die, and the open version will live on.

The problem is that PhysX had the opportunity to not be nVidia-only, but ATi decided to make it nVidia-only. nVidia had no problem with getting PhysX up and running on ATi hardware, IMO because they knew that is what it would take for the technology to actually be adopted in the industry. Ageia originally failed because it couldn't get PhysX adopted in the industry, and people didn't want to buy a separate card that sat idle in their PC 90% of the time.

Now, nVidia had a different idea for PhysX: if you use a small fraction of the GPU's power to calculate PhysX, then people don't have to worry about buying another add-on card that is going to sit idle most of the time. If a game doesn't support PhysX, then that small fraction of GPU power is returned for use rendering the game.
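The workload being offloaded here is embarrassingly parallel: thousands of independent bodies, each receiving the same integration step. As a rough illustration (plain Python with my own naming, not the PhysX API), one explicit-Euler step over a batch of particles looks like this; on a GPU each loop iteration would map to its own shader thread:

```python
# One explicit-Euler integration step for a batch of particles.
# Each particle updates independently of the others, which is why
# this kind of work maps so naturally onto GPU shader units.

def step_particles(particles, dt, gravity=(0.0, -9.81, 0.0)):
    """particles: list of {'pos': (x,y,z), 'vel': (x,y,z)} dicts."""
    out = []
    for p in particles:
        vel = tuple(v + g * dt for v, g in zip(p["vel"], gravity))
        pos = tuple(x + v * dt for x, v in zip(p["pos"], vel))
        out.append({"pos": pos, "vel": vel})
    return out

# A particle dropped from y=10 while drifting along x:
batch = [{"pos": (0.0, 10.0, 0.0), "vel": (1.0, 0.0, 0.0)}]
batch = step_particles(batch, dt=0.1)
```

With no dependencies between particles, the cost scales with particle count, so the "small fraction of GPU power" claim amounts to reserving a slice of shader units for steps like this.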

Sort of like Nvidia saying the 4830 is defective when it's not. Then before that, the head of NV said they underestimated the R770.

When did nVidia say the HD4830 is defective?

And the head of nVidia saying they underestimated the RV770 is totally different from ATi saying PhysX will never catch on.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,728 (2.22/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
Meh... PhysX will be around for a while... it won't just stop being used the day DX11 comes out. Just look at DX revisions: how many forgot how long it took DX10 to become more widely used? It's still not as widely used as was once thought... it was going to be the "end of DX9", the "beginning of amazing graphics", blah, blah, blah. DX11 may have physics, but they have a lot of work to do... IMO PhysX will be a step or two ahead just because of the time and practice: utilising physics from GPU power is now hugely supported on the newer range of NV cards that so many are using out there.

You guys will see: DX11 will come out, there will be like 20-30 threads created about it... woohoo... how about we see some true results? I'm waiting...

PhysX will still be around and used for quite a while, IMO... especially if it's easier, or some game creators more used to that code may opt to stick with PhysX instead.... ya never know!

:toast:
 
Joined
Apr 21, 2008
Messages
5,250 (0.86/day)
Location
IRAQ-Baghdad
System Name MASTER
Processor Core i7 3930k run at 4.4ghz
Motherboard Asus Rampage IV extreme
Cooling Corsair H100i
Memory 4x4G kingston hyperx beast 2400mhz
Video Card(s) 2X EVGA GTX680
Storage 2X Crusial M4 256g raid0, 1TbWD g, 2x500 WD B
Display(s) Samsung 27' 1080P LED 3D monitior 2ms
Case CoolerMaster Chosmos II
Audio Device(s) Creative sound blaster X-FI Titanum champion,Creative speakers 7.1 T7900
Power Supply Corsair 1200i, Logitch G500 Mouse, headset Corsair vengeance 1500
Software Win7 64bit Ultimate
Benchmark Scores 3d mark 2011: testing
Only game companies have a say about this, and only they can tell you whether you need physics hardware or not. But when I look at all the new games, it looks like AMD is winning right now.
 

DarkMatter

New Member
It's funny when people talk about how many games are being released with it. How long did DX10 take to be used in games? And that's only an update to an API they know like they know their own name... PhysX was released only 3 months ago, for God's sake...
 
Joined
Apr 24, 2008
Messages
2,025 (0.33/day)
Processor RyZen R9 3950X
Motherboard ASRock X570 Taichi
Cooling Coolermaster Master Liquid ML240L RGB
Memory 64GB DDR4 3200 (4x16GB)
Video Card(s) RTX 3050
Storage Samsung 2TB SSD
Display(s) Asus VE276Q, VE278Q and VK278Q triple 27” 1920x1080
Case Zulman MS800
Audio Device(s) On Board
Power Supply Seasonic 650W
VR HMD Oculus Rift, Oculus Quest V1, Oculus Quest 2
Software Windows 11 64bit
Whether AMD/ATI opted out of PhysX support or not, it doesn't really matter. PhysX is not universally supported in any official capacity, and DX11 will be with DX11 hardware. IMO, AMD's statement is likely accurate: PhysX will probably die and DX11 will almost certainly prevail. It doesn't matter one whit that ATI rejected support for PhysX and opted instead for Intel-owned Havok (they had to go to the competition whichever way they turned, because they had nothing to compete with). If the PhysX/CUDA API went open source, I still don't think that would save it in this respect. DX11 is coming, and Microsoft has a habit of killing off competition in a way only a virtually unchecked, poorly regulated monopoly juggernaut can.

Still, not everyone will or can upgrade to DX11 hardware when it comes out, and there is a huge number of gamers who refuse to upgrade to Vista, let alone Windows 7. Then there is the issue of DX11 titles in a world where DX10 hasn't really taken off. It will be some time before DX11 is viable in any real, applicable way beyond the odd tech demo here and there. AMD / ATI may well be first to market with DX11 hardware, but DX9 and DX10 will probably be much more relevant when that happens, with DX11 being an unusable checkmark feature. In the meantime, PhysX is helping to sell cards for nVidia in much the same way SLI did. I've even heard of ATI users buying nVidia cards just to run PhysX. I ask you: if you can get people who use a fast, powerful competing product to buy one of your cards anyway... haven't you already accomplished a lot? Haven't you already won a big victory?

I say PhysX will have served its purpose by the time it dies, and yes, it will likely die.
 
Joined
Feb 26, 2008
Messages
4,876 (0.79/day)
Location
Joplin, Mo
System Name Ultrabeast GX2
Processor Intel Core 2 Duo E8500 @ 4.0GHZ 24/7
Motherboard Gigabit P35-DS3L
Cooling Rosewill RX24, Dual Slot Vid, Fan control
Memory 2x1gb 1066mhz@850MHZ DDR2
Video Card(s) 9800GX2 @ 690/1040
Storage 750/250/250/200 all WD 7200
Display(s) 24" DCLCD 2ms 1200p
Case Apevia
Audio Device(s) 7.1 Digital on-board, 5.1 digital hooked up
Power Supply 700W RAIDMAXXX SLI
Software winXP Pro
Benchmark Scores 17749 3DM06
PhysX is an Nvidia platform, if I'm not mistaken. Would this not be equivalent to Ford supporting the Chevrolet motor division by using their engines? The point of selling in competition is saying that your equipment is the best, so AMD supporting PhysX would say that Nvidia was right in buying PhysX. AMD's only business choice in this matter was to forgo PhysX.
I don't think PhysX will die, but it will probably sit behind a curtain as a part of Nvidia's system.
 

imperialreign

New Member
Joined
Jul 19, 2007
Messages
7,043 (1.11/day)
Location
Sector ZZ₉ Plural Z Alpha
System Name УльтраФиолет
Processor Intel Kentsfield Q9650 @ 3.8GHz (4.2GHz highest achieved)
Motherboard ASUS P5E3 Deluxe/WiFi; X38 NSB, ICH9R SSB
Cooling Delta V3 block, XPSC res, 120x3 rad, ST 1/2" pump - 10 fans, SYSTRIN HDD cooler, Antec HDD cooler
Memory Dual channel 8GB OCZ Platinum DDR3 @ 1800MHz @ 7-7-7-20 1T
Video Card(s) Quadfire: (2) Sapphire HD5970
Storage (2) WD VelociRaptor 300GB SATA-300; WD 320GB SATA-300; WD 200GB UATA + WD 160GB UATA
Display(s) Samsung Syncmaster T240 24" (16:10)
Case Cooler Master Stacker 830
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro PCI-E x1
Power Supply Kingwin Mach1 1200W modular
Software Windows XP Home SP3; Vista Ultimate x64 SP2
Benchmark Scores 3m06: 20270 here: http://hwbot.org/user.do?userId=12313
Bit-Tech.net interviewed the AMD person earlier this week to get the company's take on EA's and 2K's decision to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would impact the propagation of the API, Cheng responded by saying that monetary incentives provided to publishing houses alone won't help a great deal in propagating the API, that the product (PhysX) must be competitive, and that AMD viewed Havok and its physics simulation technologies as leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhance gameplay," he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired and developed by NVIDIA. You can read the full Bit-Tech.net interview with Godfrey Cheng here.



and thus it begins - ATI vs nVidia PPU shootout round 3!!

Let's get it on!
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
PhysX is an Nvidia platform, if I'm not mistaken. Would this not be equivalent to Ford supporting the Chevrolet motor division by using their engines? The point of selling in competition is saying that your equipment is the best, so AMD supporting PhysX would say that Nvidia was right in buying PhysX. AMD's only business choice in this matter was to forgo PhysX.
I don't think PhysX will die, but it will probably sit behind a curtain as a part of Nvidia's system.

You don't get it. AMD is supporting Havok, which is Intel's. NOW WHO really is the competition? At least in the GPU arena they had a chance against Nvidia.
 

brian.ca

New Member
Joined
Nov 1, 2007
Messages
71 (0.01/day)
All that falls apart given the simple fact that, had ATI adopted PhysX when Nvidia offered it for free, or at least when the guy at NGOHQ had already made PhysX possible on ATI cards, we would already have games with ten times better physics. The argument "game developers are not supporting it because it doesn't run on all cards" loses its value when that is your own fault, when you have obstructed every effort to get the thing running on your cards.

That would make it fall apart if I were arguing that AMD wanted to push PhysX if able, but I wasn't... it's like I said, PhysX won't take off unless ATI supports it because of the whole chicken-and-egg thing, and ATI just doesn't want to support it. And I can't see why they would. PhysX started as a proprietary system to make Ageia money selling their hardware; now that Nvidia has bought them out, it would seem foolish to think that PhysX will not favor Nvidia cards and that they couldn't pull some strong-arm tactics down the road if PhysX were to become the industry standard.

And the one that goes "We are not going to follow a standard that only a few developers have adopted" is hypocritical at best, when you are doing as much as you can to make that true. These fallacious comments included.

Saying that may be misleading, but at the end of the day developers didn't go to Havok because ATI wouldn't support PhysX. They stuck with Havok because it's what they used, what they know, and the incentives to switch were not great enough.

It's a lame path that AMD has taken. As I said, I have lost any faith in them, and the worst thing is that they sided with the most dangerous and "evil" company, Intel, and once Intel has their GPU they are going to crush them so hard it's going to hurt us as well. As a popular quote says: "One must never rely on alliances with the powerful; they probably want something more from that allegiance than to help us."

Here's the thing with Intel, though... they're not going to be beaten. Nvidia can be beaten by ATI, and has beaten ATI. They might not be powerful in the big picture versus a company like Intel, but versus ATI they are... so that quote applies to them as well. Now I'd ask, which is the smarter side to get on: the company that could probably crush your competition, has reason enough to keep you around, and that you have no chance of beating? Or the company that you can beat, that doesn't really offer you much, and that also stands to gain if it gets you out of the way?

Lame or not, it's the smart path.
 
Last edited:

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I'm waiting for Intel's Larrabee. AMD and NVIDIA have "lost their way." They are both extremely proprietary, and it's time to move to a microarchitecture-based GPU and PPU system. Intel is poised to do that.
 

brian.ca

New Member
Joined
Nov 1, 2007
Messages
71 (0.01/day)
The problem is that PhysX had the opportunity to not be nVidia-only, but ATI decided to make it nVidia-only. nVidia had no problem with getting PhysX up and running on ATI hardware, IMO because they knew that is what it would take for the technology to actually be adopted in the industry. Ageia originally failed because it couldn't get PhysX adopted in the industry, and people didn't want to buy a separate card that sat idle in their PC 90% of the time.

When did nVidia say the HD4830 is defective?

It could have been more than Nvidia-only, but since Nvidia developed this, I'd be hard pressed to believe that it wasn't developed for their cards and that they wouldn't always retain some upper hand here (it would make sense for them to do this). When you consider that alternatives were coming from parties with less of a vested interest in favoring a specific card vendor, I'm not sure what ATI's motivation to support Nvidia's solution would be.

About the 4830 thing, I'd wager he was referring to Nvidia putting out those slides showing the performance of the cards that had the disabled SPs (iirc).
 
Joined
Feb 26, 2008
Messages
4,876 (0.79/day)
Location
Joplin, Mo
System Name Ultrabeast GX2
Processor Intel Core 2 Duo E8500 @ 4.0GHZ 24/7
Motherboard Gigabit P35-DS3L
Cooling Rosewill RX24, Dual Slot Vid, Fan control
Memory 2x1gb 1066mhz@850MHZ DDR2
Video Card(s) 9800GX2 @ 690/1040
Storage 750/250/250/200 all WD 7200
Display(s) 24" DCLCD 2ms 1200p
Case Apevia
Audio Device(s) 7.1 Digital on-board, 5.1 digital hooked up
Power Supply 700W RAIDMAXXX SLI
Software winXP Pro
Benchmark Scores 17749 3DM06
You don't get it. AMD is supporting Havok, which is Intel's. NOW WHO really is the competition? At least in the GPU arena they had a chance against Nvidia.

This probably means that the PII kicks ass, and they know it already.
 

brian.ca

New Member
Joined
Nov 1, 2007
Messages
71 (0.01/day)
It's funny when people talk about how many games are being released with it. How long did DX10 take to be used in games? And that's only an update to an API they know like they know their own name... PhysX was released only 3 months ago, for God's sake...

I don't mean to seem like I'm picking on you or anything, but I wanted to say something about the DX10 comments... one thing people need to consider is that DX10 was tied to Vista, which a lot of people didn't like. So it shouldn't be any surprise that DX10 had a slow adoption rate if Vista had one as well. If Windows 7 has a better adoption rate and DX11 is available for Vista, then I don't think it's safe to try to judge how well received DX11 will be. If Win7 turns out to be good and they kill support for XP, forcing a lot of users to migrate to the new OS, the major bottleneck would become the hardware. And then, like you said, not being fully compliant with a new standard doesn't mean some of the features aren't already supported on older cards, so that may not even be much of a bottleneck.

When you say 3 months, are you referring to that driver update? PhysX has been around a lot longer than that. In the original article's comment section I think I saw what seemed like an Nv fanboy say Nv has been pushing PhysX for 10 months now.
 
Last edited:

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,944 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Whether AMD / ATI opted out of PhysX support or not, it doesn't really matter. PhysX is not universally supported in any official capacity, and DX11 will be with DX11 hardware. IMO, AMD's statement is likely accurate: PhysX will probably die and DX11 will almost certainly prevail. It doesn't matter one whit that ATI rejected support for PhysX and opted instead for Intel-owned Havok (they had to go to the competition whichever way they turned, because they had nothing of their own to compete with). Even if the PhysX / CUDA API went open source, I still don't think that would save it in this respect. DX11 is coming, and Microsoft has a habit of killing off competition in a way only a virtually unchecked, poorly regulated monopoly juggernaut can.

Still, not everyone will or can upgrade to DX11 hardware when it comes out, and there is a huge number of gamers who refuse to upgrade to Vista, let alone Windows 7. Then there is the issue of DX11 titles in a world where DX10 hasn't really taken off. It will be some time before DX11 is viable in any real, applicable way beyond the odd tech demo here and there. AMD / ATI may well be first to market with DX11 hardware, but DX9 and DX10 will probably be much more relevant when that happens, with DX11 being an unusable checkmark feature. In the meantime, PhysX is helping to sell cards for nVidia in much the same way SLI did. I've even heard of ATI users buying nVidia cards just to run PhysX. I ask you: if you can get people who use a fast, powerful competing product to buy one of your cards anyway... haven't you already accomplished a lot? Haven't you already won a big victory?

I say PhysX will have served its purpose by the time it dies, and yes, it will likely die.

I agree with much of what you are saying there; however, just one point: DX11 hardware is irrelevant if games don't support it. Now let's briefly look back, say at NVidia and ATI. Their first DX10 hardware (the 8800 G80 series, and the 2900 for ATI) was on the shelves for a year before there were 10... yes, that's just 10 games with DX10 enabled. Now look how many games today are DX10 enabled and divide them by the number of months DX10 hardware has been available... end result: a cr*p load of money spent by us with VERY little return in gaming terms. Damn, I have owned 7 different DX10 graphics cards, and I own 7 DX10-enabled games (OK, I am only a light to mid gamer)... DX11 cards will be released, we will wait 6 months for one DX11 game and a year for 5, in which time we will all have spent our hard-earned money to find out that those early DX11 cards just don't cut it, so we buy better ones! Now with PhysX, at least it's here, it's growing whether AMD likes it or not, but most importantly, it gives the consumer even more choice, and that's gotta be good in my book.
 
Top