
PhysX will Die, Says AMD

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,189 (7.56/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
In an interview with Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Products Group, Bit-Tech.net quotes him as saying that standards such as PhysX will die due to their proprietary and closed nature. Says Mr. Cheng:

"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."

Bit-Tech.net interviewed the AMD executive to get the company's take on EA and 2K's decision, announced earlier this week, to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would impact the propagation of the API, Cheng responded that monetary incentives to publishing houses alone won't do much to propagate the API, that the product (PhysX) must be competitive, and that AMD views Havok and its physics simulation technologies as the leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhances gameplay," he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired and developed by NVIDIA. You can read the full Bit-Tech.net interview with Godfrey Cheng here.

View at TechPowerUp Main Site
 

PCpraiser100

New Member
Joined
Jul 17, 2008
Messages
1,062 (0.18/day)
System Name REBEL R1
Processor Core i7 920
Motherboard ASUS P6T
Cooling Stock
Memory 6GB OCZ GOLD TC LV Kit 1866MHz@1.65V 9-9-9-24
Video Card(s) Two Sapphire HD 5770 Vapor-X Xfire'd and OC'd (920/1330)
Storage Seagate 7200.11 500GB 32MB
Case Antec Three Hundred
Audio Device(s) ASUS Xonar D1 PCI Sound Card
Power Supply OCZ StealthXStream 500W
Software Windows 7 Ultimate 64-bit
Benchmark Scores 16585 Performance Score on 3DMark Vantage
Take that Ghost Recon!
 

Kreij

Senior Monkey Moderator
Joined
Feb 6, 2007
Messages
13,817 (2.13/day)
Location
Cheeseland (Wisconsin, USA)
A bold statement from a company who can be bold at the moment.
We shall see.
 

ShadowFold

New Member
Joined
Dec 23, 2007
Messages
16,918 (2.74/day)
Location
Omaha, NE
System Name The ShadowFold Draconis (Ordering soon)
Processor AMD Phenom II X6 1055T 2.8ghz
Motherboard ASUS M4A87TD EVO AM3 AMD 870
Cooling Stock
Memory Kingston ValueRAM 4GB DDR3-1333
Video Card(s) XFX ATi Radeon HD 5850 1gb
Storage Western Digital 640gb
Display(s) Acer 21.5" 5ms Full HD 1920x1080P
Case Antec Nine-Hundred
Audio Device(s) Onboard + Creative "Fatal1ty" Headset
Power Supply Antec Earthwatts 650w
Software Windows 7 Home Premium 64bit
Benchmark Scores -❶-❸-❸-❼-
You tell 'em AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know we're gonna pwn you".
 

KBD

New Member
Joined
Feb 23, 2007
Messages
2,477 (0.38/day)
Location
The Rotten Big Apple
Processor Intel e8600 @ 4.9 Ghz
Motherboard DFI Lanparty DK X48-T2RSB Plus
Cooling Water
Memory 2GB (2 x 1GB) of Buffalo Firestix DDR2-1066
Video Card(s) MSI Radeon HD 4870 1GB OC (820/950) & tweaking
Storage 2x 74GB Velociraptors in RAID 0; 320 GB Barracuda 7200.10
Display(s) 22" Mitsubishi Diamond Pro 2070SB
Case Silverstone TJ09-BW
Audio Device(s) Creative X-Fi Titanium Fatal1ty Profesional
Power Supply Ultra X3 800W
Software Windows XP Pro w/ SP3
You tell 'em AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know we're gonna pwn you".

I think it's more about their GPU division, not CPU. One reason they are so bold is that, in their view, the Radeon 5000 series will whoop NVIDIA's ass again; they may have a surprise on that front. Maybe buying ATI wasn't such a bad move after all; that graphics division is probably helping keep the company afloat.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.68/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens here.
 

kysg

New Member
Joined
Aug 20, 2008
Messages
1,255 (0.21/day)
Location
Pacoima, CA
System Name Workhorse lappy
Processor AMD A6 3420
Memory 8GB DDR3 1066
Video Card(s) ATI radeon 6520G
Storage OCZ Vertex4 128GB SSD SATAIII
Display(s) 15inch LCD
Software Windows 7 64bit
I think it's more about their GPU division, not CPU. One reason they are so bold is that, in their view, the Radeon 5000 series will whoop NVIDIA's ass again; they may have a surprise on that front. Maybe buying ATI wasn't such a bad move after all; that graphics division is probably helping keep the company afloat.

It's not surprising though; the graphics division has been hitting its stride. Hopefully the 5 series won't be just a die-shrunk 4 series and there will continue to be improvement for the red camp.

Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens here.

Wasn't DX the only standard at that time besides OpenGL, which really wasn't a standard?

whoops dbl post my bad.
 
Joined
Sep 26, 2006
Messages
6,959 (1.05/day)
Location
Australia, Sydney
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it either, and neither have I.

The main issue stems from developers not even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics in game engines such as CryEngine 2, or even the latest Source engine, is generally sufficient.

Sure, AMD is being bold and attempting to scare away NVIDIA shareholders, but it's true. Havok basically beats PhysX in terms of how widely it's implemented.

"It should be noted that title support for GPU accelerated physics simulation is NOT the end game. The end game is having GPU physics as an integral part of game play and not just eye candy. If it is optional eye candy, GPU physics will not gain traction. The titles we have seen today with shattering glass and cloth waving in the wind is not integral to game play and the impact on the game's experience is minimal. We are looking for ways to integrate GPU physics better into game play. Or even things like AI instead of focusing on eye candy / effects physics."

Cheng's final words make a lot of sense and I find myself agreeing with him. We said something similar when Nvidia announced that the PC version of Mirror's Edge was delayed because of the PhysX implementation which, following a brief hands-on preview last week, does nothing but add some additional eye candy. None of it influences the actual gameplay experience.

Cannot agree more.
 

kysg

New Member
Joined
Aug 20, 2008
Messages
1,255 (0.21/day)
Location
Pacoima, CA
System Name Workhorse lappy
Processor AMD A6 3420
Memory 8GB DDR3 1066
Video Card(s) ATI radeon 6520G
Storage OCZ Vertex4 128GB SSD SATAIII
Display(s) 15inch LCD
Software Windows 7 64bit
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it either, and neither have I.

The main issue stems from developers not even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics in game engines such as CryEngine 2, or even the latest Source engine, is generally sufficient.

Well, this is obvious; they plan to do things their own way. When you've been doing stuff one way for a while, it's really gonna tick off a few people when something new gets introduced that doesn't really do squat and makes your day very long, when you could have already been done doing it the old way.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.68/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it either, and neither have I.

The main issue stems from developers not even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics in game engines such as CryEngine 2, or even the latest Source engine, is generally sufficient.

In GRAW 2 I found it made the game much more enjoyable and worth the small performance hit. Sufficient doesn't cut it for me. The GPU can handle PhysX a hell of a lot better than even the fastest quad can. I want GPU-accelerated physics to become the norm. The CPU just doesn't cut it anymore.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If they both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.
 
Joined
May 19, 2007
Messages
7,662 (1.20/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If they both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

You know what they meant ... a standard in which none of them has an overly powerful controlling interest.
 

WarEagleAU

Bird of Prey
Joined
Jul 9, 2006
Messages
10,812 (1.61/day)
Location
Gurley, AL
System Name Pandemic 2020
Processor AMD Ryzen 5 "Gen 2" 2600X
Motherboard AsRock X470 Killer Promontory
Cooling CoolerMaster 240 RGB Master Cooler (Newegg Eggxpert)
Memory 32 GB Geil EVO Portenza DDR4 3200 MHz
Video Card(s) ASUS Radeon RX 580 DirectX 12 DUAL-RX580-O8G 8GB 256-Bit GDDR5 HDCP Ready CrossFireX Support Video C
Storage WD 250 M.2, Corsair P500 M.2, OCZ Trion 500, WD Black 1TB, Assorted others.
Display(s) ASUS MG24UQ Gaming Monitor - 23.6" 4K UHD (3840x2160) , IPS, Adaptive Sync, DisplayWidget
Case Fractal Define R6 C
Audio Device(s) Realtek 5.1 Onboard
Power Supply Corsair RMX 850 Platinum PSU (Newegg Eggxpert)
Mouse Razer Death Adder
Keyboard Corsair K95 Mechanical & Corsair K65 Wired, Wireless, Bluetooth)
Software Windows 10 Pro x64
Proprietary in that it's not easily programmable across a wide range of hardware. Kind of like Dell hardware used to be: you couldn't swap parts out with anything, it had to be Dell-specific (as an example). I think it is bold and cocky, and I like it. Will it succeed? We shall see. I don't think ATI is hurting themselves here either.
 
Joined
Apr 7, 2008
Messages
633 (0.10/day)
Location
Australia
System Name _Speedforce_ (Successor to Strike-X, 4LI3NBR33D-H, Core-iH7 & Nemesis-H)
Processor Intel Core i9 7980XE (Lapped) @ 5.2Ghz With XSPC Raystorm (Lapped)
Motherboard Asus Rampage VI Extreme (XSPC Watercooled) - Custom Heatsinks (Lapped)
Cooling XSPC Custom Water Cooling + Custom Air Cooling (From Delta 220's TFB1212GHE to Spal 30101504&5)
Memory 8x 8Gb G.Skill Trident Z RGB 4266MHz @ 4667Mhz (2x F4-4266C17Q-32GTZR)
Video Card(s) 3x Asus GTX1080 Ti (Lapped) With Customised EK Waterblock (Lapped) + Custom heatsinks (Lapped)
Storage 1x Samsung 970 EVO 2TB - 2280 (Hyper M.2 x16 Card), 7x Samsung 860 Pro 4Tb
Display(s) 6x Asus ROG Swift PG348Q
Case Aerocool Strike X (Modified)
Audio Device(s) Creative Sound BlasterX AE-5 & Aurvana XFi Headphones
Power Supply 2x Corsair AX1500i With Custom Sheilding, Custom Switching Unit. Braided Cables.
Mouse Razer Copperhead + R.A.T 9
Keyboard Ideazon Zboard + Optimus Maximus. Logitech G13.
Software w10 Pro x64.
Benchmark Scores pppft, gotta see it to believe it. . .
I really enjoyed playing co-op GRAW, and ever since I've been awaiting the release of more co-op campaign gameplay for those Friday or Saturday night LAN sessions with the guys. I have to admit, GRAW 2's PhysX wasn't the best I've seen, but taking into consideration the game's age and the official release date of the AGEIA PhysX P1 cards, I'm more inclined to think . . . whatever . . . What matters is the enjoyable hours of fun played.

Since the GRAW 2 production days, PhysX has come a long way, as the many examples out there on the internet show: whether it be a fluid demo, a particle demo, a ripping flag, or my balls bouncing off each other. Either way, the realism it provides is a vital step. EA and 2K seem to think so.

Enabling PhysX reduces performance on lower-end systems and/or systems missing the required hardware. Of course we can get the CPU to run the PhysX stuff, but what's going to run everything else . . . .

Cheng and all of AMD are scared that PhysX will evolve to the only next step it has: becoming part of the AI and the gameplay.
PhysX can't get worse, and we all know this technology will eventually evolve. Simulating and ripping alone is second grade and will never amount to the best.

I wonder how the 295GTX will cope with all this.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.96/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
A lot of people don't seem to be getting it.

PhysX is an NVIDIA-only way of doing this.
DirectX 11 is doing an open (any video card) version of this.

ATI/AMD are saying that NVIDIA's closed one will die, and the open version will live on.
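The open-vs-closed point comes down to where the same math runs. A physics step is the same small update applied independently to thousands of objects, which is why it maps onto any data-parallel GPU once a vendor-neutral compute API exists. Here's a minimal sketch in plain Python of the kind of per-particle kernel involved (all names illustrative, not from any real physics SDK):

```python
# One explicit-Euler integration step applied independently to each
# particle -- the data-parallel pattern GPU physics APIs exploit.
# Illustrative sketch only; real engines use better integrators,
# collision detection, and constraint solvers.

GRAVITY = -9.81  # m/s^2, acting on the y axis

def step_particle(pos, vel, dt):
    """Advance one particle by dt seconds under gravity."""
    x, y = pos
    vx, vy = vel
    vy += GRAVITY * dt                      # accumulate acceleration
    return (x + vx * dt, y + vy * dt), (vx, vy)

def step_world(particles, dt):
    """Each particle depends only on its own state, so this loop can
    become one GPU thread per particle under CUDA, OpenCL, or a
    DirectX 11 compute shader alike."""
    return [step_particle(p, v, dt) for p, v in particles]

world = [((0.0, 10.0), (1.0, 0.0)),  # (position, velocity) pairs
         ((5.0, 2.0), (0.0, 3.0))]
world = step_world(world, dt=0.1)
```

The physics math itself is vendor-neutral; only the API used to launch thousands of such updates on the GPU (CUDA versus OpenCL/DirectCompute) is proprietary or open, which is the distinction being drawn here.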
 

Swansen

New Member
Joined
Nov 18, 2007
Messages
182 (0.03/day)
EA and 2K seem to think so.
I wonder how the 295GTX will cope with all this.

I tend to not follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late: game developers focusing on the wrong things. Gameplay should always come first; everything else comes after.

DirectX 11 is doing an open (any video card) version of this.

Lol, no, I get what they were saying; it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool... as long as it's done right.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,918 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If they both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.

Sort of like Nvidia saying the 4830 is defective when it's not. And before that, the head of NV said they underestimated the RV770.
 
Joined
Sep 26, 2006
Messages
6,959 (1.05/day)
Location
Australia, Sydney
I tend to not follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late: game developers focusing on the wrong things. Gameplay should always come first; everything else comes after.



Lol, no, I get what they were saying; it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool... as long as it's done right.

Very well spoken there...
 
Joined
Nov 13, 2007
Messages
10,691 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Is this anything like the time AMD said something about their 'true' quad core being faster than two Core 2s glued together? :nutkick: I think he's right - but ONLY if the DX11 way ACTUALLY works like it's supposed to... which is a big if
 

Sapientwolf

New Member
Joined
Aug 23, 2006
Messages
57 (0.01/day)
Processor Intel Core 2 Quad QX9770 Yorkfield 4.00GHz
Motherboard Asus P5E3 Deluxe/WiFi-AP X38 Chipset Motherboard
Cooling Cooler Master Hyper 212 CPU Heatsink| Fans: Intake 1x120mm and 2x140mm| Exhaust 1x120mm and 2x140mm
Memory 4GB OCZ Platinum DDR3 1600 7-7-7-26
Video Card(s) 2 x Diamond Multimedia HD 4870 512MB Graphics Cards in CrossfireX
Storage 2 Western Digital 500GB 32MB Cache Caviar Blacks in RAID 0| 1 500GB 32MB Cache Seagate Barracuda.
Display(s) Sceptre X24WG 24" 1920x1200 4000:1 2ms LCD Monitor
Case Cooler Master CM 690
Audio Device(s) HT Omega HT Claro+
Power Supply Aerocool 750W Horsepower PSU
Software Windows Vista Home Premium x64
Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens here.

Well, look at DX pre-9: no one wanted to use it because OpenGL was a lot easier to use to achieve the same results. It wasn't until 9 that it was viewed as a worthy API. There is a difference between when something proprietary is received well and when there is a generally easy-to-use alternative. In this case, DX9 onward offered an ease of use and feature set that developers liked. PhysX is just kind of there, offering what can be done with alternatives; alternatives that work with more systems and are free.
 
Joined
Jan 31, 2005
Messages
2,082 (0.29/day)
Location
gehenna
System Name Commercial towing vehicle "Nostromo"
Processor 5800X3D
Motherboard X570 Unify
Cooling EK-AIO 360
Memory 32 GB Fury 3666 MHz
Video Card(s) 4070 Ti Eagle
Storage SN850 NVMe 1TB + Renegade NVMe 2TB + 870 EVO 4TB
Display(s) 25" Legion Y25g-30 360Hz
Case Lian Li LanCool 216 v2
Audio Device(s) Razer Blackshark v2 Hyperspeed / Bowers & Wilkins Px7 S2e
Power Supply HX1500i
Mouse Harpe Ace Aim Lab Edition
Keyboard Scope II 96 Wireless
Software Windows 11 23H2 / Fedora w. KDE
I hope that game engines (like the Source engine from Valve) will gain the upper hand in this battle - this way no one has to think about buying a specific piece of hardware to get the physics pling-bing.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Once again they come with the open-standard excuse and LIES. Congratulations AMD, you finally made me an Nvidia fanboy; at least they are HONEST about their intentions. There's nothing I hate more than LIES, and that is a big lie and distorted misinformation. Everything AMD/Ati is saying now amounts to:
"We won't support a standard where Nvidia is faster until we have something to compete."

And open your eyes, guys: Nvidia IS and will probably stay faster at physics calculations, because they made changes to the architecture, like a more CPU-like caching system, ultra-light branch prediction, etc. Not exactly branch prediction, but something that palliates the effects of lacking one. THAT's why Nvidia cards are faster in F@H, for example, where more than number crunching is required. At simple number crunching Ati is faster: video conversion.

That advantage Nvidia has would apply to PhysX, OpenCL, DX11 physics, or ANY hardware physics API you'd want to throw in. Their claim is just an excuse until they can prepare something. For instance, they say they support Havok; Intel-OWNED Havok. Open standards? Yeah, sure.


One more thing: PhysX is a physics API and middleware, with some bits of an engine here and there just like Havok, that can RUN on various platforms unchanged: Ageia PPUs, x86 CPUs, the Cell microprocessor, CUDA, and potentially any PowerPC. It does not run directly on Nvidia GPUs; CUDA is the API layer that runs it on Nvidia cards. Once OpenCL is out, it will be possible to do PhysX through OpenCL just as well as through CUDA. As long as Ati has good OpenCL support there shouldn't be any problems; until then they could make PhysX run through CAL/Stream for FREE, BUT they don't want to, because it would be slower. It's as simple as that.
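The middleware point here (one physics API running unchanged on PPUs, CPUs, Cell, or CUDA) is the classic pluggable-backend pattern. A hypothetical Python sketch of that layering; the class names are invented for illustration and do not reflect PhysX's actual internals:

```python
# Hypothetical sketch: one physics-facing API, swappable compute
# backends underneath. All names are invented for illustration.

class Backend:
    """Interface every compute target implements."""
    name = "generic"
    def integrate(self, positions, velocities, dt):
        raise NotImplementedError

class CPUBackend(Backend):
    """Plain x86 path."""
    name = "cpu"
    def integrate(self, positions, velocities, dt):
        return [p + v * dt for p, v in zip(positions, velocities)]

class GPUBackend(Backend):
    """Stand-in for a CUDA/OpenCL/Stream path; it reuses the CPU math
    here because only the dispatch pattern matters for the sketch."""
    name = "gpu"
    def integrate(self, positions, velocities, dt):
        return [p + v * dt for p, v in zip(positions, velocities)]

class PhysicsAPI:
    """Game code talks only to this; the backend underneath can change
    (PPU, CPU, CUDA, OpenCL) without touching the game."""
    def __init__(self, backend):
        self.backend = backend
    def step(self, positions, velocities, dt):
        return self.backend.integrate(positions, velocities, dt)

sim = PhysicsAPI(CPUBackend())
print(sim.step([0.0, 1.0], [2.0, -1.0], 0.5))  # prints [1.0, 0.5]
```

Swapping `CPUBackend()` for `GPUBackend()` changes where the work runs, not the game-facing API, which is why an OpenCL path could slot in once one exists.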

Another lie there, which you have to love, is the claim that PhysX is just being used for eye candy. IT IS being used only for that, AMD, yeah, but tell us WHY. Because developers have been told hardware physics will not be supported on Ati GPUs until DX11, that's why. Because they are working hard along with Intel to make that statement true. Nvidia has many demos where PhysX is being used for a lot more, so it can be done.

AMD is just playing a double act. It's a shame, Ati/AMD, a shame; I remember the days you were honest. I know bad times and experiences change personalities, but this is inexcusable, as is the fact that the whole advertising campaign has been based on bashing every initiative made by Nvidia instead of making yours better.

This sentence sums it all up (speaking of Havok on their GPU):

Our guidance was end of this year or early next year but, first and foremost, it will be driven by the milestones that we hit. To put some context behind GPU based physics acceleration, it is really just at the beginning. Like 3D back in the early 1990s. Our competition has made some aggressive claims about support for GPU physics acceleration by the end of this year. I.e. Support in many titles....but we can count the titles on one hand or just one or two fingers.

Like back in the '90s, because they are facing competition in something where they can't compete, they are downplaying it. They know it's a great thing, they know it's the future, but they don't want that future to kick-start yet. YOU SIMPLY CAN'T DOWNPLAY SOMETHING AND SAY IT WILL DIE WHILE AT THE SAME TIME YOU'RE WORKING HARD ON YOUR OWN BEHIND THE CURTAINS!! AND USING INTEL'S HAVOK!!!
We know how accelerated graphics history evolved: despite what they said back then, the GPU has become the most important thing, and so will hardware physics. Just as back then, they are just HOLDING BACK the revolution until they can be part of it. Clever, from a marketing standpoint, but DISHONEST. You won't have my support, Ati; you already pulled down another thing that I liked a lot: stereo 3D. You can cheat me once, but not more.
 

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
You are right, DarkMatter; it's true that when DX11 is available PhysX will be obsolete or will slowly die. But until then, pray, AMD/ATI, that Nvidia doesn't get more developers to use PhysX; it's a cool thing, and the performance impact isn't that big for the eye candy it adds. It's worth the performance loss.
I for one think this could kill AMD for good: if 2-3 big games launch with some PhysX feature and the difference between them is big, it could kill AMD's graphics department forever.
Nvidia could continue to battle AMD in 3DMarks and game performance, but that seems set to go on for a long time, and one advantage like this could end the competition a little faster.
I feel sorry for them; Intel is kicking their asses, now Nvidia too. Second place forever for AMD.
 

brian.ca

New Member
Joined
Nov 1, 2007
Messages
71 (0.01/day)
PhysX is not going to kill anyone... it's DX10.1 all over again. No one is going to make/sell a game relying on this thing if it's gonna screw over a good portion of the market, or leave them out of any significant part of the game. Unless AMD/ATI support PhysX it will never be anything more than optional eye candy, and that limited role will limit it as a factor in people buying into it (would you pay $500 for a new card instead of $300 so you can get more broken glass?).

Some of you are talking about studios and games adopting PhysX, but what have you seen so far? Mirror's Edge seems to do a bit of showcase work for PhysX, but how many would really buy that game? It being an EA game through and through, I personally have a hard time believing it would offer anything more than what I could get watching the trailer or some demos. Otherwise, I haven't seen PhysX do anything that Havok wasn't already doing. There's no extra edge here. There are probably some incentives from Nvidia - but then that comes back to what this guy was saying in the first place:

"We cannot provide comments on our competitor's business model except that it is awfully hard to sustain support by monetary incentives. The product itself must be competitive. We view Havok technologies and products to be the leaders in physics simulation and this is why we are working with them. Games developers share this view."

If they can market these technologies to all those big goons (Adobe, Microsoft, SunMicro, Apple...) and create user-friendly apps, boom! ATI would be certified dead in no time

This lends itself to why this stuff won't kill ATI... at the end of the list is Apple; didn't they put together the OpenCL standard? Which do you think they'll be pushing, CUDA or their own standard? Microsoft will be pushing its own thing with DX11 down the road. Adobe recently took advantage of Nvidia's CUDA and ATI's Stream, if I'm not mistaken... but do you think they'll want to keep making two versions of their products when other people are pushing for a unified standard?

Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.

I guess this is all moot anyway then... if AMD responding to an Nvidia announcement for a reporter will guarantee success for PhysX, then surely the grandstanding Nvidia took part in vs. Intel will have Larrabee burying all competition.


At the end of the day, people may not like what this guy is saying, why, or how, but it's true. AMD is not going to support Nvidia's proprietary APIs (and why the hell would they?), and without that support other companies will have less incentive to get on board unless Nvidia provides it. That requires either a superior product or, probably, cash incentives. Now realistically... when the immediate alternatives to Nvidia's systems seem to be OpenCL (Apple - but open), DirectX 11 (Microsoft), and Havok (Intel), do you think these other standards won't have the resources behind them to provide both of those things more so than Nvidia? If you were in AMD's shoes, who would you side with? They could support all of them, but seriously... why? It'd just confuse other efforts and probably waste their own resources, and for what? To better prop up competition that they can beat? So they can claim some moral high ground when they recall how NV made DX10.1 completely moot?
 