
NVIDIA GeForce 4XX Series Discussion

Status
Not open for further replies.
You mean they pressured companies not to make games that support it? Is this a fact? Is tessellation even noticeable on that kind of hardware?

Can someone run a tessellation benchmark with an HD 4000 series card? I'm a little confused here.

AMD can do the same with AMD-sponsored titles: show off what their cards can do and NVIDIA's can't, to promote their own cards. Why don't they do it?

The fact that there are a lot of TWIMTBP titles out there that run better on NVIDIA hardware isn't a bad thing for their customers; it's a good thing.

It's a fact that MORE people play games on consoles, with no AA, and they look like crap compared to PC. And all those games are optimized for consoles. Is that a good thing? Or would releasing a new console every 6 months be better? NVIDIA offers what customers need and promotes their products. AMD can do the same thing with their cards IF AMD truly has something really different to offer.
 
I remember reading an article on that a long time ago. They did: AMD has had tessellators in their cards since the HD 2xxx series, but they weren't that powerful (and they didn't need that much power at the time). Those tessellators are outdated now, but if developers had utilized them back then, it would have completely changed the game, and we would more than likely have more powerful cards now. The only reasons I can see that they wouldn't are: one, Nvidia; two, Microsoft and their DirectX; or three, they didn't think it was useful, and we see now where that went.
 
I think Far Cry did, the original game (in a patch). Feck all beyond that... because nvidia pressured companies not to support it, since their cards couldn't run it.

Didn't know about that, nvidia deserves a MASSIVE bitch slap for that! :slap:
 
I hate DirectX.
 

Mussels

Freshwater Moderator
You mean they pressured companies not to make games that support it? Is this a fact? Is tessellation even noticeable on that kind of hardware? (...)

Because no company is going to waste their time supporting something that limits their customer base. Even nvidia's proprietary PhysX system works on the CPU in every machine when hardware PhysX is unavailable. nvidia were late to the game with tessellation AND with DX10.1, so they forced game vendors to drop support.
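That CPU fallback is just backend selection at startup. Here's a minimal sketch of the pattern, assuming a hypothetical engine API (these names are not PhysX's actual interface): probe once for a capable GPU, then route every simulation step through whichever backend is available, so the same game runs everywhere.

Code:
#include <cstdio>

// Hypothetical physics engine with hardware acceleration and a CPU fallback,
// in the spirit of how PhysX behaves on machines without an NVIDIA GPU.
// None of these names are the real PhysX API.
class PhysicsBackend {
public:
    virtual ~PhysicsBackend() = default;
    virtual void step(float dt) = 0;   // advance the simulation by dt seconds
};

class GpuPhysics : public PhysicsBackend {
public:
    void step(float dt) override { std::printf("GPU step: %.4f s\n", dt); }
};

class CpuPhysics : public PhysicsBackend {
public:
    void step(float dt) override { std::printf("CPU step: %.4f s\n", dt); }
};

// Stand-in for the capability query a real driver/runtime would provide.
bool gpu_physics_available() { return false; }   // pretend: no capable GPU

int main() {
    GpuPhysics gpu;
    CpuPhysics cpu;
    // Same game loop either way; only the backend differs.
    PhysicsBackend& physics = gpu_physics_available()
        ? static_cast<PhysicsBackend&>(gpu)
        : static_cast<PhysicsBackend&>(cpu);

    for (int frame = 0; frame < 3; ++frame)
        physics.step(1.0f / 60.0f);
}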
 

Wile E

Power User
That's kinda hard, since almost every AAA game out there is TWIMTBP-infected.

nVidia will do what they've been doing for years. They design their GPUs with one architectural advantage over competing ATI cards, and then they elevate that one advantage in every AAA game through the TWIMTBP infection.

So here is the first DX11 TWIMTBP-infected benchmark: Unigine Heaven. "Tessellation is faster on nVidia hardware, so let's push tessellation to unreal levels in a few scenes."

The end result is a bunch of games out there that don't take advantage of most features in most GPUs, but just run better on nVidia.
Sad, isn't it?

Proof? Compare the number of games with PhysX to the number of games that support HD2000-HD4000 tessellation.
Oh god, TWIMTBP bashing again? Look, ATI has every opportunity to offer a program similar to Nvidia's. Titles can even be optimized for both architectures. It's ATI's fault that they don't push any similar program.

nVidia sends help to devs that want it under the TWIMTBP program, to help optimize for their hardware. It's not like they send help to cripple ATI cards. Why doesn't ATI send help to devs to optimize a game engine for their product? ATI decides that they'd rather tackle the optimizations in drivers, and it bites them in the ass.

Give up the conspiracy, folks. Nvidia isn't forcing devs to drop anything. ATI is just not offering devs any incentives for getting their tech to work better. The devs aren't going to optimize for ATI when nVidia offers the help for free. It's just common sense. Why would they burn their own dev time if they don't have to?
 

crazyeyesreaper

Not a Moderator
Staff member
Um, wait, so the Batman: AA fiasco isn't a conspiracy?? And neither is The Last Remnant on PC, where if you max shadows on an ATi card the game crawls, yes, while even on an old nvidia GPU it runs fine? Hmmm, I love conspiracy theories.
 
You mean they pressured companies not to make games that support it? Is this a fact? Is tessellation even noticeable on that kind of hardware?

You can be sure it's noticeable. Just check out the 3-year-old Ruby Whiteout demo.
You'll see 2010-level geometry running at 60 fps, 1920×1200, on an HD 2900 card.



AMD can do the same with AMD-sponsored titles: show off what their cards can do and NVIDIA's can't, to promote their own cards. Why don't they do it? (...)

Oh god, TWIMTBP bashing again? Look, ATI has every opportunity to offer a program similar to Nvidia's. (...)

The reason is simple: cash. Ever since 2002 (the beginning of TWIMTBP), nVidia has had loads of cash to spend on this. They go to the developers, offer tens of gaming machines with nVidia cards for testing, and also send engineers to the game developers to write code specific to their hardware.
That's why TWIMTBP is an infection. Those games have code written by nVidia; it's like a trojan horse that optimizes performance for nVidia cards and breaks it for ATI cards.
It's kind of like what Intel did with their compilers (blocking AMD CPUs from using SSE extensions).
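For reference, the compiler trick worked because the runtime dispatcher keyed off the CPUID vendor string rather than the actual feature bits. A minimal sketch of that pattern (hypothetical dispatcher code, not Intel's actual compiler output; the CPUID leaves and the SSE bit are the real ones; x86 with GCC/Clang):

Code:
#include <cpuid.h>    // __get_cpuid (GCC/Clang, x86 only)
#include <cstdio>
#include <cstring>

// Read the 12-byte vendor string from CPUID leaf 0 (stored EBX, EDX, ECX).
static bool is_genuine_intel() {
    unsigned eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) return false;
    char vendor[13];
    std::memcpy(vendor + 0, &ebx, 4);
    std::memcpy(vendor + 4, &edx, 4);
    std::memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

// CPUID leaf 1, EDX bit 25 is the architectural "SSE supported" flag.
static bool has_sse() {
    unsigned eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) return false;
    return (edx >> 25) & 1;
}

int main() {
    // Vendor-gated dispatch: an AMD CPU that supports SSE still takes the
    // slow path, because the check asks "who made you?", not "what can you do?"
    if (is_genuine_intel() && has_sse())
        std::puts("fast SSE code path");
    else
        std::puts("generic fallback path");   // where SSE-capable AMD lands
}

A neutral dispatcher would just call has_sse() on its own; gating the fast path on the vendor string is what made it discriminatory.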


ATI couldn't do it because they didn't have enough cash, period. Back in the R300 days, the company was still recovering from 3 years of sub-par graphics cards and trying to survive where all the others (3dfx, Matrox, S3) were doomed. Then the R400 was under-specced, the R500 was late, and finally the R600 was underperforming and late. It wasn't until the RV670 series that ATI started to build up some cash, and now they have to sustain AMD's processor business.


In my opinion, this should be considered a monopolistic practice. nVidia can only do it because they have more money, not because they have superior products.





nVidia sends help to devs that want it under the TWIMTBP program, to help optimize for their hardware. It's not like they send help to cripple ATI cards. (...)

LOL, someone needs a wake-up call that's 7 years late.

Here's the freshest example:

Ian McNaughton said:
Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available for the retail game as there is SecuROM.

As far as I know, this started with Comanche 4. The game only allows AA to be enabled if an nVidia card is detected.
There are loads of examples: the DX10.1 cut from Assassin's Creed, the shadow performance in The Last Remnant, the PhysX thingie altering the score in 3DMark Vantage, etc. etc.
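What McNaughton describes above is simple vendor-ID gating, which is also why spoofing the ID brings the option back. A toy illustration of the mechanism (hypothetical code, not the game's actual logic; 0x10DE and 0x1002 are the real PCI vendor IDs of NVIDIA and ATI/AMD):

Code:
#include <cstdint>
#include <cstdio>

constexpr std::uint16_t kVendorNvidia = 0x10DE;  // NVIDIA PCI vendor ID
constexpr std::uint16_t kVendorAti    = 0x1002;  // ATI/AMD PCI vendor ID

struct GpuInfo {
    std::uint16_t vendor_id;  // as reported by the driver / adapter descriptor
};

// Vendor-gated feature check: the AA option is offered only on NVIDIA IDs,
// regardless of whether the hardware can actually do the work.
bool in_game_aa_available(const GpuInfo& gpu) {
    return gpu.vendor_id == kVendorNvidia;
}

int main() {
    GpuInfo ati_card   {kVendorAti};
    GpuInfo spoofed_ati{kVendorNvidia};  // same hardware, reported ID changed

    // The plain ATI card loses the menu option; the spoofed one gets it back,
    // which is exactly the experiment described in the quote above.
    std::printf("ATI card:     AA option %s\n",
                in_game_aa_available(ati_card) ? "shown" : "hidden");
    std::printf("spoofed card: AA option %s\n",
                in_game_aa_available(spoofed_ati) ? "shown" : "hidden");
}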
 
If Nvidia's performance is not up to par with what it should be, there is one thing Nvidia could do that would upset the establishment.

After thinking about this for a while: Nvidia could take their losses and just cut their prices, similar to what ATI did with the 4800 series. Then fix Fermi's technology and blast them away with the GTX 500 series.
Just a thought and a comment... I doubt it will happen, but if Fermi is as disappointing as it sounds like it might be, that would be one way to compete and recover!
 
Well, there is a trend right now:

the 100 series is all OEM
the 200 series is not
the 300 series is all OEM
the 400 series is not

There might not be a 500 series for mainstream consumers.
 
In my opinion, this should be considered a monopolistic practice. nVidia can only do it because they have more money, not because they have superior products.

Really? A biased opinion. A successful company built upon non-superior products: you're gonna need solid proof. I may not be old enough to know too far back for both companies, but the results usually show the truth.
 
Really? A biased opinion. A successful company built upon non-superior products: you're gonna need solid proof. (...)

Well, I'd love to see some results! We've been beating around the bush for what, at least 6 months now?

Of course any intelligent person is going to be skeptical of a product where the company producing it says it's the best thing since sliced bread, yet has little to no real-world numbers/benchmarks for it.
 

Benetanegia

New Member
You can be sure it's noticeable. Just check out the 3-year-old Ruby Whiteout demo. (...) That's why TWIMTBP is an infection. Those games have code written by nVidia; it's like a trojan horse that optimizes performance for nVidia cards and breaks it for ATI cards. (...) As far as I know, this started with Comanche 4. (...)

Pft. What a load of crap. The optimizations made never break performance on Ati cards. You can ask any developer, and you will only hear words of praise for TWIMTBP. In fact, looking at the current scene full of console ports, any optimization made for Nvidia GPUs can only help Ati. Many DX10 titles are only DX10 and not DX9 thanks to TWIMTBP.

It's the same crap as always: people who don't know shit talking about conspiracy and, what's worse, calling 80% of the developers liars. It's funny that they think developers have to break Ati performance in order for Nvidia cards to win, but they always expect drivers to bring a huge improvement. /sarcasm on/ It's natural, and something worth accepting without a doubt, that a driver team with no idea how a certain game works can make driver optimizations in 1 month that increase performance by 30%, but there's no way that optimizations made before launch, with full knowledge of the game code and the graphics card architecture, can make some cards run faster. There's no fucking way! It must be cheating, lalala!! /sarcasm off/

Sorry man, time to wake up, that's optimization. If games are optimized for Nvidia cards because Nvidia sends people to help and Ati doesn't, the thing needs no further discussion. The few games where Ati has had an active role run better on Ati cards, so logically they break Nvidia's performance, isn't it? Pft. BS.
 
@Erocker

He was talking about released products and TWIMTBP games. The results I was talking about are the relationships they have built with game devs, a very good thing for gamers, and how they have managed their business to this day.

I believe all companies promote their products that way: "best thing since sliced bread".

I'm obviously defending NVIDIA here, and I'm quite tired of waiting. It's a good thing for me, actually; I save more money if the time between video cards is longer... -_-

If the GTX 480 offers 1.8x or more the performance of the GTX 280, priced below $400, then I'm willing to get one. If not, I'll wait a little longer; it doesn't matter.
 
Really? A biased opinion. A successful company built upon non-superior products: you're gonna need solid proof. (...)

I'm pretty sure you're old enough to know how Pentium 4 sales came out on top of the Athlon 64 back in 2004, when the former was clearly the inferior product.
"Usually"? There's hardly any "usually" in a super-competitive technology market like this.

And I never even suggested that nVidia was built upon inferior products.
Their actions after reaching stardom are what I don't agree with.
Like Google says, "don't be evil".





Pft. What a load of crap. The optimizations made never break performance on Ati cards. You can ask any developer, and you will only hear words of praise for TWIMTBP. (...)

LOL, of course every TWIMTBP-infected developer will love the program.
What were you expecting?
"Yeah, we let nVidia developers write code for us, so we're really screwing ATI owners, because it's cheaper and our paychecks are bigger in the end."




(...) blah blah I love TWIMTBP. TWIMTBP is the best thing ever. blah blah blah

Cool, great for you man. We agree to disagree.




I for one think that sending coders to game developers to write code specific to one hardware vendor should be absolutely forbidden. It's a monopolistic activity, because it depends on the amount of cash the company has, not on the product's performance.

It's like Ferrari sending a construction team to alter an F1 racing circuit to respond better to their car. It just doesn't make sense.



But hey, the AMD vs Intel case was also very hard for some people to understand. I'm not really hoping for everyone to understand my point about TWIMTBP, but I stand by my opinion nonetheless.
 
He was talking about released products and TWIMTBP games. The results I was talking about are the relationships they have built with game devs, a very good thing for gamers, and how they have managed their business to this day. (...)

I agree with you except for the part in boldface (that it's a very good thing for gamers). You are just repeating marketing speak, really. TWIMTBP does work great for Nvidia, but honestly, I cannot recall not being able to play any newer game on an ATi card at good FPS with all the eye candy on. Nvidia directly working with developers is a bit anti-competitive, though.

The one thing I am very much against is PhysX, or I should say, a proprietary set of instructions from one company. Who's to blame for this? Not just Nvidia; everyone (ATi and Nvidia). This is great for marketing and making money, but poor for the gamer. If there were an open standard, the end user would benefit from greater competition between the companies.

PhysX really isn't working, either. The list of games that actually use PhysX is not very big considering how long PhysX has been around. We'll see where it all goes, though. I see that Nvidia is touting PhysX with these new cards, and Metro 2033 has a PhysX label on it. It will be interesting to see if there are any changes with PhysX and these new cards. I'm also seeing Havok and other physics engine names in newer games. Open standards and competitiveness are what we need.
 
PhysX is just like any other physics engine; the difference is that it supports hardware acceleration. You're still able to run it on the CPU. Since NVIDIA owns PhysX, it's not normal for them to make it "open"... :ohwell:
 
PhysX is just like any other physics engine; the difference is that it supports hardware acceleration. (...)

Havok is owned by Intel and it's open. There are a few other engines, all open. It's normal for GPU instructions to be open; look at CPU instructions, for example. I'm not blaming anyone. It's just as much ATi's fault for not using PhysX. I mean, if a 3rd party developed a nice physics engine, I'm sure ATi/Nvidia/AMD/Intel/VIA would love to use it. Just because Nvidia develops PhysX doesn't mean other companies shouldn't pick it up. Who's at fault: Nvidia for not letting ATi use PhysX, or ATi for not using PhysX?
 

Benetanegia

New Member
LOL, of course every TWIMTBP-infected developer will love the program. What were you expecting? (...)

Another argument full of crap. So your point is that 100% of developers currently working are full of crap and receive money, because ALL of them have been under TWIMTBP at one point or another. Interesting theory, really. Do you really think a company like Nvidia can pay ALL those developers? You have no idea what you are talking about. In fact, we can't even talk about independent developers nowadays, since most of them are owned by a publisher. So do you think Nvidia has the money to pay those publishers? And in the meantime, let's include the Hydra case here too: Nvidia pays ALL the publishers, and MSI, Asus, Gigabyte, etc. etc., lalala. No matter that each of those companies makes more than twice the money that Nvidia makes. Nvidia has the money! lol

The argument that Nvidia cards have been faster because of that and not any product superiority is such a lol moment. I mean, you know that because obviously you are a GPU engineer with 4 Masters and 20 years of practice. Both companies believe in their architecture, and they know what they are doing, because they ARE the engineers. Again, believing that only one company is right is so short-sighted and belongs so much to the mentality of someone who has been indoctrinated... sad.

But hey, the AMD vs Intel case was also very hard for some people to understand. I'm not really hoping for everyone to understand my point about TWIMTBP, but I stand by my opinion nonetheless.

There's a small difference that everyone following the conspiracy theory prefers to forget. Has AMD been saying publicly that this was happening from the beginning, going to court, etc.? (They did so in the Intel case.) Ati/AMD themselves have never said anything about the subject; it's always been coming from 3rd-party bloggers who in reality only wanted some clicks on their site.
 
It's a CPU engine, and Intel makes CPUs... Intel wants to show the world that GPU acceleration is not needed... lol. >.>
 
Bah, either way, I think if Nvidia were a little friendlier about sharing and ATi sucked it up a bit, they could both use PhysX/CUDA and both work to make it better for all of us. Nvidia was first to the table with GPU physics (no offense to Ageia, lawl); everyone should embrace it, use it, love it, or come to an agreement on something they can all agree to use. This would be better for the end user in the long run.
 

shevanel

New Member
And nvidia wants the world to only use what they have. Havok works great on AMD CPUs too.

And all the games I've played with Havok physics were a lot more fun than any PhysX game.
 
There's always a catch in the business world; there's no such thing as free stuff or open. NVIDIA GPUs were built with PhysX in mind. ATI knew that if they supported PhysX on their hardware, the performance would be below NVIDIA's; one way or another, NVIDIA would still benefit from both.
 
The argument that Nvidia cards have been faster because of that and not any product superiority is such a lol moment. (...)

I don't think it's product superiority; I think it's just that Nvidia supports their products better...

...not including the mass driver suicide they just did.
 