
NVIDIA GeForce 4XX Series Discussion

Status
Not open for further replies.
You mean pressured companies not to make games that support it? Is that a fact? Is tessellation even noticeable on that kind of hardware?

Can someone run a tessellation benchmark on an HD4000 series card? I'm a little confused here.

AMD can do the same with their AMD titles, to show off what their cards can do and Nvidia's can't, and to promote their cards. Why don't they do it?

The fact that there are a lot of TWIMTBP titles out there that could run better on their hardware isn't a bad thing for their customers; it's a good thing.

It's a fact that MORE people play games on consoles with no AA, which look like crap compared to PC, and all those games are optimized for them. Is that a good thing? Or would releasing a console every 6 months be better? NVIDIA offers what customers need and promotes its products. AMD can do the same thing with their cards IF AMD truly has something different to offer.
 
I remember reading an article on that a long time ago. AMD has had tessellators in their cards since the HD2xxx series, but they weren't that powerful (and they didn't need that much power at the time). Those tessellators are outdated now, but if developers had utilized them then, it would have completely changed the game, and we would more than likely have more powerful cards now. The only reasons I can see that they wouldn't are: one, Nvidia; two, Microsoft and their DirectX; or three, they didn't think it was useful, and we see now where that went.
 
I think Far Cry did, the original game (in a patch). Feck all beyond that... because Nvidia pressured companies not to support it, since their cards couldn't run it.

Didn't know about that. Nvidia deserves a MASSIVE bitch slap for that! :slap:
 
I hate DirectX.
 
You mean pressured companies not to make games that support it? Is that a fact? Is tessellation even noticeable on that kind of hardware?

Can someone run a tessellation benchmark on an HD4000 series card? I'm a little confused here.

AMD can do the same with their AMD titles, to show off what their cards can do and Nvidia's can't, and to promote their cards. Why don't they do it?

The fact that there are a lot of TWIMTBP titles out there that could run better on their hardware isn't a bad thing for their customers; it's a good thing.

It's a fact that MORE people play games on consoles with no AA, which look like crap compared to PC, and all those games are optimized for them. Is that a good thing? Or would releasing a console every 6 months be better? NVIDIA offers what customers need and promotes its products. AMD can do the same thing with their cards IF AMD truly has something different to offer.

Because no company is going to waste their time supporting something that limits their customer base. Even Nvidia's proprietary PhysX system runs on the CPU in every machine when hardware PhysX is unavailable. Nvidia was late to the game with tessellation AND with DX10.1, so they forced game vendors to drop support.
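The fallback described above — use the GPU when it's there, otherwise run the same simulation call on the CPU — is just a runtime capability check. Here's a hypothetical sketch of that pattern; none of these names are the real PhysX API, they just illustrate the idea:

```python
# Hypothetical sketch of hardware-accelerated physics with a CPU fallback.
# The caller uses one entry point; the backend is chosen at runtime, which
# is why a proprietary GPU feature can still "work in every machine".

def gpu_physics_available() -> bool:
    """Stand-in for a driver capability query."""
    return False  # pretend no capable GPU is installed

def step_simulation_gpu(bodies, dt):
    raise RuntimeError("no GPU")  # would dispatch to the GPU here

def step_simulation_cpu(bodies, dt):
    # Trivial CPU integrator: advance each body's position by its velocity.
    return [(x + vx * dt, vx) for (x, vx) in bodies]

def step_simulation(bodies, dt):
    # Single entry point; falls back to the CPU path when no GPU is found.
    if gpu_physics_available():
        return step_simulation_gpu(bodies, dt)
    return step_simulation_cpu(bodies, dt)

bodies = [(0.0, 1.0), (2.0, -0.5)]
print(step_simulation(bodies, 0.1))  # takes the CPU path here
```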
 
That's kind of hard, since almost every AAA game out there is TWIMTBP-infected.


nVidia will do what they've been doing for years: design their GPUs with one architectural advantage over competing ATI cards, and then push that one advantage into every AAA game through the TWIMTBP infection.

So here is the first DX11 TWIMTBP-infected benchmark: Unigine Heaven. "Tessellation is faster on nVidia hardware, so let's push tessellation to unreal levels in a few scenes."
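For a sense of why "unreal levels" of tessellation explodes the workload: under simple uniform subdivision, a tessellation factor of n splits each edge of a triangle patch into n segments, producing on the order of n² small triangles per patch. A rough back-of-the-envelope (the patch count is a made-up number):

```python
# Back-of-the-envelope: uniform tessellation with factor n yields n**2
# sub-triangles per triangle patch, so triangle count grows quadratically
# with the tessellation factor.

def triangles_after_tessellation(patches: int, factor: int) -> int:
    return patches * factor * factor

base = 10_000  # patches in a hypothetical scene
for f in (1, 8, 32, 64):
    print(f, triangles_after_tessellation(base, f))
# factor 64 turns 10k patches into ~41M triangles
```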


The end result is a bunch of games out there that don't take advantage of most features in most GPUs, but just run better on nVidia.
Sad, isn't it?



Proof? Compare the number of games with PhysX to the number of games that support HD2000-HD4000 tessellation.
Oh god, TWIMTBP bashing again? Look, ATI has every opportunity to offer a program similar to Nvidia's. Titles can even be optimized for both architectures. It's ATI's fault that they don't push any similar programs.

nVidia sends help to devs that want it under the TWIMTBP program, to help them optimize for nVidia hardware. It's not like they send help to cripple ATI cards. Why doesn't ATI send help to devs to optimize a game engine for their product? ATI decides that they'd rather tackle the optimizations in drivers, and it bites them in the ass.

Give up the conspiracy, folks. Nvidia isn't forcing devs to drop anything. ATI just isn't offering devs any incentives for getting their tech to work better. The devs aren't going to optimize for ATI when nVidia offers the help for free. It's just common sense. Why would they burn their own dev time if they don't have to?
 
Um, wait, so the Batman: AA fiasco isn't a conspiracy? And neither is The Last Remnant on PC, where if you max shadows on an ATI card the game crawls, yet even on an old Nvidia GPU it runs fine? Hmmm, I love conspiracy theories.
 
You mean pressured companies not to make games that support it? Is that a fact? Is tessellation even noticeable on that kind of hardware?


You can be sure it's noticeable. Just check out the three-year-old Ruby Whiteout demo.
You'll see 2010-level geometry running at 60 fps, 1920x1200, on an HD2900 card.



AMD can do the same with their AMD titles, to show off what their cards can do and Nvidia's can't, and to promote their cards. Why don't they do it?

The fact that there are a lot of TWIMTBP titles out there that could run better on their hardware isn't a bad thing for their customers; it's a good thing.

It's a fact that MORE people play games on consoles with no AA, which look like crap compared to PC, and all those games are optimized for them. Is that a good thing? Or would releasing a console every 6 months be better? NVIDIA offers what customers need and promotes its products. AMD can do the same thing with their cards IF AMD truly has something different to offer.

Oh god, TWIMTBP bashing again? Look, ATI has every opportunity to offer a program similar to Nvidia's. Titles can even be optimized for both architectures. It's ATI's fault that they don't push any similar programs.

The reason is simple: cash. Ever since 2002 (the beginning of TWIMTBP), nVidia has had loads of cash to spend on this. They go to the developers, offer tens of gaming machines with nVidia cards for testing, and also send engineers to game developers to write code specific to their hardware.
That's why TWIMTBP is an infection. Those games have code written by nVidia; it's like a Trojan horse that optimizes performance for nVidia cards and breaks it for ATI cards.
It's kind of like what Intel did with their compilers (blocking AMD CPUs from using SSE extensions).
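The Intel compiler case came down to what the generated binaries dispatched on: the CPUID vendor string rather than the actual feature flags. A hypothetical sketch of the two styles (the function names are made up for illustration):

```python
# Hypothetical sketch of the two dispatch styles at issue in the Intel
# compiler case. Dispatching on the *vendor string* instead of the actual
# feature flag sends a fully SSE2-capable non-Intel CPU down the slow path.

def dispatch_by_vendor(vendor: str, has_sse2: bool) -> str:
    # What the complaint described: check who made the CPU.
    if vendor == "GenuineIntel" and has_sse2:
        return "fast_sse2_path"
    return "generic_path"

def dispatch_by_feature(vendor: str, has_sse2: bool) -> str:
    # What feature detection should do: check what the CPU can do.
    return "fast_sse2_path" if has_sse2 else "generic_path"

print(dispatch_by_vendor("AuthenticAMD", True))   # generic_path
print(dispatch_by_feature("AuthenticAMD", True))  # fast_sse2_path
```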


ATI couldn't do it because they didn't have enough cash, period. Back in the R300 days, the company was still recovering from three years of sub-par graphics cards and trying to survive where all the others (3dfx, Matrox, S3) were doomed. Then the R400 was under-specced, the R500 was late, and finally the R600 was underperforming and late. It wasn't until the RV670 series that ATI started to build up some cash, and now they have to sustain AMD's processor business.


In my opinion, this should be treated as monopolistic behavior. nVidia can only do it because they have more money, not because they have superior products.





nVidia sends help to devs that want it under the TWIMTBP program, to help them optimize for nVidia hardware. It's not like they send help to cripple ATI cards. Why doesn't ATI send help to devs to optimize a game engine for their product? ATI decides that they'd rather tackle the optimizations in drivers, and it bites them in the ass.

Give up the conspiracy, folks. Nvidia isn't forcing devs to drop anything. ATI just isn't offering devs any incentives for getting their tech to work better. The devs aren't going to optimize for ATI when nVidia offers the help for free. It's just common sense. Why would they burn their own dev time if they don't have to?

LOL, someone needs a wake-up call that's 7 years late.

Here's the freshest example:

Ian McNaughton said:
Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available for the retail game, as there is SecuROM.

As far as I know, this started with Comanche 4. That game only allowed AA if an nVidia card was detected.
There are loads of examples: the DX10.1 support cut from Assassin's Creed, the shadow performance in The Last Remnant, the PhysX thingie altering the score in 3DMark Vantage, etc., etc.
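What McNaughton describes boils down to gating a feature on the reported PCI vendor ID (NVIDIA's is 0x10DE, ATI's is 0x1002), so spoofing the ID flips the behavior. A minimal, purely illustrative sketch — this is not the actual Batman: AA code:

```python
# Sketch of the vendor-ID gating described above: the game exposes its
# in-game AA path only when it sees NVIDIA's PCI vendor ID, so reporting
# a different ID for the same hardware flips the option on or off.
# (Illustrative only -- not taken from any real game.)

NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def ingame_aa_available(reported_vendor_id: int) -> bool:
    return reported_vendor_id == NVIDIA_VENDOR_ID

print(ingame_aa_available(ATI_VENDOR_ID))     # False: option hidden on ATI
print(ingame_aa_available(NVIDIA_VENDOR_ID))  # True: spoofing this ID
                                              # exposes the AA option
```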
 
If Nvidia's performance is not up to par with what it should be, there is one thing Nvidia could do that would upset the establishment.

After thinking about this for a while: Nvidia could take their losses and just cut their prices, similar to what ATI did with the 4800 series, then fix Fermi's technology and blast them away with the GTX 500 series.
Just a thought and a comment. I doubt it will happen, but if Fermi is as disappointing as it sounds like it might be, that would be one way to compete and recover!
 
Well, there's a trend right now:

The 100 series is all OEM.
The 200 series is not.
The 300 series is all OEM.
The 400 series is not.

There might not be a 500 series for mainstream consumers.
 
In my opinion, this should be treated as monopolistic behavior. nVidia can only do it because they have more money, not because they have superior products.

Really, a biased opinion. For a claim that a successful company was built on non-superior products, you're going to need solid proof. I may not be old enough to know how far back both companies go, but the results usually show the truth.
 
Really, a biased opinion. For a claim that a successful company was built on non-superior products, you're going to need solid proof. I may not be old enough to know how far back both companies go, but the results usually show the truth.

Well, I'd love to see some results! We've been beating around the bush for what, at least 6 months now?

Of course any intelligent person is going to be skeptical of a product when the company producing it says it's the best thing since sliced bread, yet has little to no real-world numbers/benchmarks for it.
 
You can be sure it's noticeable. Just check out the three-year-old Ruby Whiteout demo.
You'll see 2010-level geometry running at 60 fps, 1920x1200, on an HD2900 card.







The reason is simple: cash. Ever since 2002 (the beginning of TWIMTBP), nVidia has had loads of cash to spend on this. They go to the developers, offer tens of gaming machines with nVidia cards for testing, and also send engineers to game developers to write code specific to their hardware.
That's why TWIMTBP is an infection. Those games have code written by nVidia; it's like a Trojan horse that optimizes performance for nVidia cards and breaks it for ATI cards.
It's kind of like what Intel did with their compilers (blocking AMD CPUs from using SSE extensions).


ATI couldn't do it because they didn't have enough cash, period. Back in the R300 days, the company was still recovering from three years of sub-par graphics cards and trying to survive where all the others (3dfx, Matrox, S3) were doomed. Then the R400 was under-specced, the R500 was late, and finally the R600 was underperforming and late. It wasn't until the RV670 series that ATI started to build up some cash, and now they have to sustain AMD's processor business.


In my opinion, this should be treated as monopolistic behavior. nVidia can only do it because they have more money, not because they have superior products.







LOL, someone needs a wake-up call that's 7 years late.

As far as I know, this started with Comanche 4. That game only allowed AA if an nVidia card was detected.
There are loads of examples: the DX10.1 support cut from Assassin's Creed, the shadow performance in The Last Remnant, the PhysX thingie altering the score in 3DMark Vantage, etc., etc.

Pft. What a load of crap. The optimizations made never break performance on ATI cards. You can ask any developer and you will only hear words of praise for TWIMTBP. In fact, looking at the current scene full of console ports, any optimization made for Nvidia GPUs can only help ATI. Many titles are DX10 and not just DX9 thanks to TWIMTBP. It's the same crap as always: people who don't know shit talking about conspiracies and, what's worse, calling 80% of developers liars.

It's funny that they think developers have to break ATI performance in order for Nvidia cards to win, yet they always expect drivers to deliver a huge improvement. /sarcasm on/ It's natural, and something worth accepting without a doubt, that a driver team with no idea how a certain game works can make driver optimizations in one month that increase performance by 30%, but there's no way that optimizations made before launch, with full knowledge of the game code and the graphics card architecture, can make some cards run faster. There's no fucking way! It must be cheating, lalala! /sarcasm off/

Sorry man, time to wake up: that's optimization. If games are optimized for Nvidia cards because Nvidia sends people to help and ATI doesn't, there's nothing further to discuss. The few games where ATI has had an active role run better on ATI cards, so logically they break Nvidia's performance, isn't it? Pft. BS.
 
@Erocker

He was talking about released products and TWIMTBP games. The results I was talking about are the relationships they have built with game devs, a very good thing for gamers, and how they have managed their business to this day.

I believe all companies promote their products that way: "best thing since sliced bread".

I'm obviously defending NVIDIA here, and I'm quite tired of waiting. It's a good thing for me, actually; I save more money if the time between video cards is longer... -_-

If the GTX480 offers 1.8x or more the performance of the GTX280 and is priced below $400, then I'm willing to get one. If not, I'll wait a little longer; it doesn't matter.
 
Really, a biased opinion. For a claim that a successful company was built on non-superior products, you're going to need solid proof. I may not be old enough to know how far back both companies go, but the results usually show the truth.

I'm pretty sure you're old enough to know how Pentium 4 sales came out on top of the Athlon 64 back in 2004, when the former was clearly the inferior product.
"Usually"? There's hardly any "usually" in a super-competitive technology market like this.

And I never even suggested that nVidia was built upon inferior products.
It's their actions after reaching stardom that I don't agree with.
As Google says, "don't be evil".





Pft. What a load of crap. The optimizations made never break performance on ATI cards. You can ask any developer and you will only hear words of praise for TWIMTBP.
LOL, of course every TWIMTBP-infected developer will love the program.
What were you expecting?
"Yeah, we let nVidia developers write code for us, so we're really screwing ATI owners, because it's cheaper and our paychecks are bigger in the end."




(...)
blah blah I love TWIMTBP. TWIMTBP is the best thing ever. blah blah blah
Cool, great for you, man. We agree to disagree.




I for one think that sending coders to game developers to write code specific to one hardware vendor should be absolutely forbidden. It's a monopolistic activity, because it depends on the amount of cash the company has, not on the product's performance.

It's like Ferrari sending a construction team to alter an F1 racing circuit to respond better to their car. It just doesn't make sense.

But hey, the AMD vs. Intel case was also very hard for some people to understand. I'm not really hoping for everyone to understand my point about TWIMTBP, but I stand by my opinion nonetheless.
 
@Erocker

He was talking about released products and TWIMTBP games. The results I was talking about are the relationships they have built with game devs, a very good thing for gamers, and how they have managed their business to this day.

I believe all companies promote their products that way: "best thing since sliced bread".

I'm obviously defending NVIDIA here, and I'm quite tired of waiting. It's a good thing for me, actually; I save more money if the time between video cards is longer... -_-

If the GTX480 offers 1.8x or more the performance of the GTX280 and is priced below $400, then I'm willing to get one. If not, I'll wait a little longer; it doesn't matter.

I agree with you except for what is in boldface. You are just repeating marketing speak, really. TWIMTBP does work great for Nvidia, but really, I cannot recall not being able to play any newer game on an ATI card at good FPS with all the eye candy on. Nvidia directly working with developers is a bit anti-competitive, though.

The one thing I am very much against is PhysX, or I should say, a proprietary set of instructions from one company. Who's to blame for this? Not just Nvidia; everyone (ATI and Nvidia). This is great for marketing and making money, but poor for the gamer. If there were an open standard, the end user would benefit from greater competition between the companies.

PhysX really isn't working, either. The list of games that actually use PhysX is not very big considering how long PhysX has been around. We'll see where it all goes, though. I see that Nvidia is touting PhysX with these new cards, and Metro 2033 has a PhysX label on it. It will be interesting to see if there are any changes with PhysX and these new cards. I'm also seeing Havok and other physics engine names in newer games. Open standards and competitiveness are what we need.
 
PhysX is just like any other physics engine; the difference is it supports hardware acceleration. You're still able to run it on the CPU. Since NVIDIA owns PhysX, it's not normal for them to make it "open"... :ohwell:
 
PhysX is just like any other physics engine; the difference is it supports hardware acceleration. You're still able to run it on the CPU. Since NVIDIA owns PhysX, it's not normal for them to make it "open"... :ohwell:

Havok is owned by Intel, and it's open. There are a few other engines, all open. It's normal for GPU instructions to be open; look at CPU instructions, for example. I'm not blaming anyone. It's just as much ATI's fault for not using PhysX. I mean, if a third party developed a nice physics engine, I'm sure ATI/Nvidia/AMD/Intel/VIA would love to use it. Just because Nvidia develops PhysX doesn't mean other companies shouldn't pick it up. Who's at fault: Nvidia for not letting ATI use PhysX, or ATI for not using PhysX?
 
LOL, of course every TWIMTBP-infected developer will love the program.
What were you expecting?
"Yeah, we let nVidia developers write code for us, so we're really screwing ATI owners, because it's cheaper and our paychecks are bigger in the end."

Another argument full of crap. So your point is that 100% of developers currently working are full of crap and receive money, because ALL of them have been under TWIMTBP at one point or another. Interesting theory, really. Do you really think a company like Nvidia can pay ALL those developers? You have no idea what you are talking about. In fact, we can't even talk about individual developers nowadays, since most of them are owned by a publisher. So do you think Nvidia has the money to pay those publishers? And in the meantime, let's include the Hydra case here too. Nvidia pays ALL the publishers, and MSI, Asus, Gigabyte, etc., etc., lalala. No matter that each of those companies makes more than twice the money Nvidia makes. Nvidia has the money! lol

The argument that Nvidia cards have been faster because of that, and not any product superiority, is such a lol moment. I mean, you know that because obviously you are a GPU engineer with 4 Masters and 20 years of practice. Both companies believe in their architecture, and they know what they are doing, because they ARE the engineers. Again, believing that only one company is right is so short-sighted, and belongs so much to the mentality of someone who has been indoctrinated... sad.

But hey, the AMD vs. Intel case was also very hard for some people to understand. I'm not really hoping for everyone to understand my point about TWIMTBP, but I stand by my opinion nonetheless.

There's a small difference that everyone following the conspiracy theory prefers to forget. Has AMD been publicly saying this was happening from the beginning and going to court, etc., as they did in the Intel case? ATI/AMD themselves have never said anything about the subject; it's always come from third-party bloggers who, in reality, only wanted some clicks on their site.
 
It's a CPU engine, and Intel makes CPUs... Intel wants to show the world that GPU acceleration is not needed... lol. >.>
 
Bah, either way, I think if Nvidia were a little friendlier about sharing and ATI sucked it up a bit, they could both use PhysX/CUDA and work together to make it better for all of us. Nvidia was first to the table with GPU physics (no offense to Ageia, lawl); everyone should embrace it, use it, love it, or come to an agreement on something they can all agree to use. This would be better for the end user in the long run.
 
And Nvidia wants the world to only use what they have. Havok works great on AMD CPUs too.

And all the games I've played with Havok physics were a lot more fun than any PhysX game.
 
There's always a catch in the business world. There's no such thing as free stuff, or truly open. NVIDIA's GPUs were built with PhysX in mind; ATI knew that if they supported PhysX on their hardware, the performance would be below NVIDIA's. One way or another, NVIDIA would still benefit from both.
 
The argument that Nvidia cards have been faster because of that, and not any product superiority, is such a lol moment. I mean, you know that because obviously you are a GPU engineer with 4 Masters and 20 years of practice. Both companies believe in their architecture, and they know what they are doing, because they ARE the engineers. Again, believing that only one company is right is so short-sighted, and belongs so much to the mentality of someone who has been indoctrinated... sad.

I don't think it's product superiority; I think it's just that Nvidia supports their products better.

. . . . . .

Not including the mass driver suicide they just did.
 