# DirectX 11 Won't Define GPU Sales: NVIDIA



## btarunr (Sep 17, 2009)

"DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons." This comes from the same company that, a few years ago, said there was every reason to opt for a DirectX 10 compliant graphics card to complete the Windows Vista experience, at a time when it was the first and only company out with compliant hardware. In the wake of rival AMD's ambitious Evergreen family of DirectX 11 compliant graphics cards being released, NVIDIA made it a point to tell the press that the development shouldn't really change anything in the industry. 

Speaking at the Deutsche Bank Securities Technology Conference, NVIDIA's VP of investor relations, Michael Hara, said "DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is at work with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in a GPU."



"Now, we know, people are doing a lot in the area of video, people are going to do more and more in the area of photography… I think that the things we are doing would allow the GPU to be a co-processor to the CPU and deliver a better user experience, better battery life, and make computers a little bit more optimized," added Mr. Hara.

NVIDIA, which until very recently was a firm believer in graphics processing horsepower as the biggest selling point of new GPUs, now switches its line on what it believes will drive the market forward. All of a sudden, software that relies on the raw computational power of GPUs (e.g., media transcoding software), and not the advanced visual effects that a new-generation API brings with it (in games and CGI applications), is what will drive people to buy graphics processors, according to the company.

Mr. Hara concluded, saying "The graphics industry, I think, is at the point the microprocessor industry was several years ago, when AMD made the public confession that frequency does not matter anymore and it is more about performance per watt. I think we are at the same crossroad with the graphics world: framerate and resolution are nice, but today they are very high and going from 120 fps to 125 fps is not going to fundamentally change the end-user experience. But I think the things that we are doing with Stereo 3D Vision, PhysX, about making the games more immersive, more playable, is beyond framerates and resolutions. Nvidia will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side." 

The timing is notable: NVIDIA does not have any concrete product plans laid out, while AMD is working towards a headstart with its next-generation GPUs, which are DirectX 11 compliant and also comply with industry-wide GPGPU standards such as DirectCompute 11 and OpenCL.

*View at TechPowerUp Main Site*


----------



## TheMailMan78 (Sep 17, 2009)

People are going to say this is sour grapes. That they (Nvidia) are just saying this because they do not have a DX11 GPU out before ATI. However, I honestly think they may just be speaking from their past experience with DX10.

Good find BTA.


----------



## mikek75 (Sep 17, 2009)

LOL, I wonder if this has anything to do with this....http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/


----------



## gumpty (Sep 17, 2009)

*Interesting.*

While I think most people will agree with the DX11 part (it isn't a solid reason to go out and buy a new GPU - until applications/games are written for it), the timing of the statement and the turn-around in attitude from the launch of DX10 is very interesting.

Basically, there is a LOT that can (and will) be read into this statement. Bring on the _'reading between the lines'_ speculators about what this all means.


----------



## btarunr (Sep 17, 2009)

No comment.


----------



## TheMailMan78 (Sep 17, 2009)

btarunr said:


> http://img.techpowerup.org/090917/bta041.jpg
> 
> No comment.



I called it didn't I?


----------



## araditus (Sep 17, 2009)

While the statement makes great points, I kept thinking to myself that it was merely a nice way of saying: wow, ATI has us beat for the starting gun of DX11; we saw a few leaks of its power, looked at our GT300 and went, uh oh! They got us this time, ok, better show the consumer that our card has more value than fps.


----------



## TheMailMan78 (Sep 17, 2009)

araditus said:


> While the statement makes great points, I kept thinking to myself that it was merely a nice way of saying: wow, ATI has us beat for the starting gun of DX11; we saw a few leaks of its power, looked at our GT300 and went, uh oh! They got us this time, ok, better show the consumer that our card has more value than fps.



The best bang for the buck is the 4850. It will be for the next few years. Until the new (next-generation) consoles are released, all of these high end cards will be for guys like us. We will use them for folding and benching, because no game made will drag them down. I know I sound like a broken record, but damn it, I can see this thread turning into a flame war.


----------



## Mussels (Sep 17, 2009)

of course it wont affect nvidias sales at all - they dont have any DX11 cards.


----------



## AsRock (Sep 17, 2009)

Well, for all I care, IF I was in the market for a 5870 I would not care if it was DX10.1, as by the time there's enough games to make it worthwhile the 6870 will be out lol.


----------



## TheMailMan78 (Sep 17, 2009)

AsRock said:


> Well, for all I care, IF I was in the market for a 5870 I would not care if it was DX10.1, as by the time there's enough games to make it worthwhile the 6870 will be out lol.



Thats what people said about DX10. There are already a TON of games that will use DX11.


----------



## Mussels (Sep 17, 2009)

TheMailMan78 said:


> Thats what people said about DX10. There are already a TON of games that will use DX11.



nevermind all the DX10 games that will run awesome on these cards.


----------



## laszlo (Sep 17, 2009)

i think they made a good point; where are the dx11 games?  we'll see them in 2011 maybe...


----------



## Mussels (Sep 17, 2009)

laszlo said:


> i think they made a good point; where are the dx11 games?  we'll see them in 2011 maybe...



no ones going to release DX11 games until there are DX11 cards to play them.
Nvidia being slack is the reason DX10.1 failed, since their cards don't meet the requirements for it. However, since DX10 and 10.1 cards can run DX11 titles (with disabled features), DX11 games WILL roll out faster than DX10 games did - 'everyone' can run them, it's just that the ATI users get prettier effects.


----------



## TheMailMan78 (Sep 17, 2009)

laszlo said:


> i think they made a good point; where are the dx11 games?  we'll see them in 2011 maybe...



Dirt 2 is DX11. There is a whole thread for DX11 games here and they will be out in less than 6 months.


----------



## mdm-adph (Sep 17, 2009)

Hmm...

Nvidia, circa 2007: "Buy our cards, we're DX10 compatible and the other guy isn't!!"


----------



## TheMailMan78 (Sep 17, 2009)

mdm-adph said:


> Hmm...
> 
> Nvidia, circa 2007: "Buy our cards, we're DX10 compatible and the other guy isn't!!"



Read the first user post.


----------



## mdm-adph (Sep 17, 2009)

TheMailMan78 said:


> Read the first user post.



Their "past experience with DX10?"

Yeah, their experience has shown them that trumping up new technologies can lead to a huge increase in sales.  What's your point?


----------



## Mussels (Sep 17, 2009)

DX11 wont make sales? "It will be one of the reasons" - the other reasons are that AMD has a card twice as fast as nvidia's best offering, with triple monitor support and far better idle power consumption.


----------



## AsRock (Sep 17, 2009)

TheMailMan78 said:


> Dirt 2 is DX11. There is a whole thread for DX11 games here and they will be out in less than 6 months.



But thats my point - ya halfway to the 68xx range. And ya save yaself a few pennies too if you just wait 6 months.


----------



## Mussels (Sep 17, 2009)

AsRock said:


> But thats my point - ya halfway to the 68xx range. And ya save yaself a few pennies too if you just wait 6 months.



in the meantime, you get all the other benefits anyway.

Oh but if you wait 6 months before buying, you might as well wait another 6 months and get the 6 series from ATI. or another 6 after that for nvidias next offering... or another 6 after that...


its an endless loop. if it has good performance to price ratio at any given time and people can afford it when they want to update, they will buy it.

when these cards launch, Nvidia will have nothing comparable - ATI is ahead in performance, features, and DX compatibility - you'd be stupid to buy a DX10.0 card over an 11.0 card


----------



## TheMailMan78 (Sep 17, 2009)

mdm-adph said:


> Their "past experience with DX10?"
> 
> Yeah, their experience has shown them that trumping up new technologies can lead to a huge increase in sales.  What's your point?


 Stop flame baiting. Newtek will be here any minute. I know how you two get. It turns into Dawson's Creek around here.



AsRock said:


> But thats my point - ya halfway to the 68xx range. And ya save yaself a few pennies too if you just wait 6 months.


 Sorry man I aint had my coffee yet. What do ya mean?



Mussels said:


> you'd be stupid to buy a DX10.0 card over an 11.0 card



I say wait until the new consoles are released. I wont but it would be a better investment. Nothing will push these cards.


----------



## Mussels (Sep 17, 2009)

TheMailMan78 said:


> I say wait until the new consoles are released. I wont but it would be a better investment. Nothing will push these cards.



triple monitor/eyefinity.

seriously, look how cheap LCD's are lately and tell me you dont see a new wave of gamers using 3 19-24" monitors for FPS gaming.


----------



## TheMailMan78 (Sep 17, 2009)

Mussels said:


> triple monitor/eyefinity.
> 
> seriously, look how cheap LCD's are lately and tell me you dont see a new wave of gamers using 3 19-24" monitors for FPS gaming.



Personally I couldn't. I would rather have one HUGE screen than a bunch of little ones that will break up the picture. Triple monitor/eyefinity seems gimmicky to me. Cool but gimmicky.


----------



## mdm-adph (Sep 17, 2009)

TheMailMan78 said:


> Stop flame baiting. Newtek will be here any minute. I know how you two get. It turns into Dawsons Creek around here.



Pointing out hypocrisy is not flame baiting -- it's a public service to the community at large.  

And anyway, for your information, I personally think the G300 is going to be a much faster chip than any R800's.  It's just going to be a good while before they come out, and when they do, they're going to cost an arm and a leg.


----------



## AphexDreamer (Sep 17, 2009)

Please don't consider this fanboyism, but Nvidia are the biggest BSers I know of in the computer industry. I mean, most companies say shit, but Nvidia is ridiculous.


----------



## Mussels (Sep 17, 2009)

AphexDreamer said:


> Please don't consider this fanboyism, but Nvidia are the biggest BSers I know of in the computer industry. I mean, most companies say shit, but Nvidia is ridiculous.



i lost all trust in nvidia after seeing GTX280 mobility cards using a cut down G92 core.

nvidias been struggling lately - if they werent, they'd have DX10.1 and 11 cards out, and they wouldnt need to rename products to keep in the news every time ATI release a new card.


----------



## TheMailMan78 (Sep 17, 2009)

mdm-adph said:


> Pointing out hypocrisy is not flame baiting -- it's a public service to the community at large.
> 
> And anyway, for your information, I personally think the G300 is going to be a much faster chip than any R800's.  It's just going to be a good while before they come out, and when they do, they're going to cost an arm and a leg.



You know WTF you're doing man. Cut it out. We all know about the renaming thing, but ATI does it too. The HD2900 and 3870 come to mind. Same damn performance, just a different fabrication. 

Both cards at the time they were released were ATI's top tier card. At least Nvidia's renamed cards gave you a performance boost.


----------



## heky (Sep 17, 2009)

TheMailMan78 said:


> You know WTF you're doing man. Cut it out. We all know about the renaming thing, but ATI does it too. The HD2900 and 3870 come to mind. Same damn performance, just a different fabrication.
> 
> Both cards at the time they were released were ATI's top tier card. At least Nvidia's renamed cards gave you a performance boost.



And what performance boost would that be???


----------



## Benetanegia (Sep 17, 2009)

I don't know why all the buzz about this, and I certainly don't relate these comments to Ati's DX11 or the GT300. They've been saying that GPGPU would become more important than graphics when it comes to selling GPUs *since they released the 8800GTX and CUDA*. It's not as if this is new from them; it's not as if they suddenly changed their minds because they have no DX11 cards before Ati. They even spent 10% of GT200's die area on GPGPU, even when that meant less gaming performance per die area. They even made Intel angry over this 3 years back. 

This is not new.
This is not related to DX11.
This is not related to GT300 or RV870.


----------



## mdm-adph (Sep 17, 2009)

TheMailMan78 said:


> You know WTF you're doing man. Cut it out. We all know about the renaming thing, but ATI does it too. The HD2900 and 3870 come to mind. Same damn performance, just a different fabrication.
> 
> Both cards at the time they were released were ATI's top tier card. At least Nvidia's renamed cards gave you a performance boost.



What the who?  I didn't say anything about any renaming... 

I was just pointing out the hypocrisy of a company saying that new technologies aren't worth buying a new card for, when at one time, that was all I ever heard about Nvidia.  I still think it's a very valid point.


----------



## phanbuey (Sep 17, 2009)

Mussels said:


> triple monitor/eyefinity.
> 
> seriously, look how cheap LCD's are lately and tell me you dont see a new wave of gamers using 3 19-24" monitors for FPS gaming.



tbh im not impressed with having a screen border cut the image.. especially the 6 monitor one where the center of the "screen" has a line running through it.  I think they were playing an RPG and you couldn't even see the character - just the head and the feet lol.

It's not that I dont see a wave of gamers using multiple screens... Its just that I definitely wouldn't buy it... the borders are way too distracting.


----------



## HalfAHertz (Sep 17, 2009)

TheMailMan78 said:


> You know WTF you're doing man. Cut it out. We all know about the renaming thing, but ATI does it too. The HD2900 and 3870 come to mind. Same damn performance, just a different fabrication.
> 
> Both cards at the time they were released were ATI's top tier card. At least Nvidia's renamed cards gave you a performance boost.



The 3xxx series added 10.1 support, new version of UVD for better HD playback and some other minor tweaks to the core...


----------



## TheMailMan78 (Sep 17, 2009)

HalfAHertz said:


> The 3xxx series added 10.1 support, new version of UVD for better HD playback and some other minor tweaks to the core...



And the same exact performance in games.



mdm-adph said:


> What the who?  I didn't say anything about any renaming...
> 
> I was just pointing out the hypocrisy of a company saying that new technologies aren't worth buying a new card for, when at one time, that was all I ever heard about Nvidia.  I still think it's a very valid point.



Somebody said something about someone naming something and I'm pissed!


----------



## Easo (Sep 17, 2009)

Imho NVidia just had to say something cause of the ATi card launch...


----------



## Mussels (Sep 17, 2009)

the point isnt what you think it is mailman.

while the ATI cards had the same performance, at least changes WERE made. features were added.

Nvidia doesnt add anything to theirs, you can even BIOS flash them between each "variant" of the cards.


----------



## phanbuey (Sep 17, 2009)

TheMailMan78 said:


> And the same exact performance in games.
> 
> 
> 
> Somebody said something about someone naming something and I'm pissed!



well lets face it... ATI did it ONCE, and the GPU had substantial internal changes - and the card itself was quite different in terms of characteristics (power draw, memory bus, etc etc)... Nvidia did it twice, on the same card.  W/e, that isnt the point.

Point is they don't have a dx11 part in the immediate future, whereas AMD's 5870 is imminent - quite obvious why NOW, of all times, they are beating the "DX11 doesn't matter" drum.  DX11 will start to matter only after their product comes out; then they will say that everyone needs to have it for the best computing experience.


----------



## AphexDreamer (Sep 17, 2009)

phanbuey said:


> well lets face it... ATI did it ONCE, and the GPU had substantial internal changes - and the card itself was quite different in terms of characteristics (power draw, memory bus, etc etc)... Nvidia did it twice, on the same card.  W/e, that isnt the point.
> 
> Point is they don't have a dx11 part in the immediate future, whereas AMD's 5870 is imminent - quite obvious why NOW, of all times, they are beating the "DX11 doesn't matter" drum.  DX11 will start to matter only after their product comes out; then they will say that everyone needs to have it for the best computing experience.



Yup, exactly what I stand by. Nvidia does a good job of fooling its consumers, especially the not-so-PC-savvy ones. That's why most people I talk to consider Nvidia the best, all-knowing, god-like graphics brand. I try to enlighten them, but Nvidia has brainwashed them.


----------



## Mussels (Sep 17, 2009)

its no different to the P4 era, when people bought inferior products to what AMD had at the time, simply because "intel is best" was stuck in their minds.


----------



## AphexDreamer (Sep 17, 2009)

Mussels said:


> its no different to the P4 era, when people bought inferior products to what AMD had at the time, simply because "intel is best" was stuck in their minds.



Yup, this is where marketing really does affect sales. I really don't care though; it doesn't affect me.


----------



## phanbuey (Sep 17, 2009)

well... its not just new tech - its also performance... the 5870's will be the fastest cards on the market until the next gen NV comes out.  

That crown is key...  if the current benches are correct, then two $350 5870's will spank what is now two $500 gtx295 in quad SLI.  And after all this time, I think that is a bitter pill for NV to swallow.


----------



## tkpenalty (Sep 17, 2009)

Nvidia deliberately talked to the bank just to get more shareholders onboard, because investors always listen to what the advisors, etc, say. Even if this proves to be bullshit they'll see more shares bought.


----------



## [I.R.A]_FBi (Sep 17, 2009)

Somehow i knew this 'wang' guy would send them out to say sumpn like this... too bad someone already said sour grapes ....


----------



## KainXS (Sep 17, 2009)

I used to LOVE Nvidia, but after they started playing the renaming game with the G92s and then lying about just about all of their current mobile GPUs, I just went like, F it. In order for me to buy nvidia again they would have to truly wow me, because I just see a lot of fooling going around in their corner.


----------



## [I.R.A]_FBi (Sep 17, 2009)

although they may be correct that dx11 will not define gpu sales, the gpu's spawned by it will make the current stuff look like chopped liver, if the speculations im seeing are correct


----------



## polaromonas (Sep 17, 2009)

Whether or not DX11 boosts GPU sales, AMD will have an advantage over Nvidia. By launching DX11 GPUs at the same time Windows 7 comes out, they can save some money on ads, because MS will do that for them *IF* AMD creates the image that their cards are the "only way" to bring the most out of your Windows 7 PC, or something like that. 

PS. I really hope AMD doesn't screw up this time. And please bring out legit drivers for DirectCompute / OpenCL. I wanna know how long Nvidia can hold onto their CUDA thing.


----------



## TheMailMan78 (Sep 17, 2009)

All I'm saying is they may be being honest from their perspective. They were betting big on DX10 and it fell through. They foresee the same thing with DX11. 

Here's my prediction. Since I'm ALWAYS wrong, it will probably be just the opposite:
We are seeing roles reversed from the last generation. ATI will lead this time in overall performance, and Nvidia will take their lunch in price/performance with the 300s.


----------



## 1Kurgan1 (Sep 17, 2009)

TheMailMan78 said:


> All I'm saying is they may be being honest from their perspective. They were betting big on DX10 and it fell through. They foresee the same thing with DX11.
> 
> Here's my prediction. Since I'm ALWAYS wrong, it will probably be just the opposite:
> We are seeing roles reversed from the last generation. ATI will lead this time in overall performance, and Nvidia will take their lunch in price/performance with the 300s.



Here's the major difference between DX10 and DX11: DX11 isn't even out yet, and there are already more titles announced to support it than DX10 has had on the market since it was released. (Or at least very close.)

DX9 has been a good platform, but it has to retire at some point, and this is going to be it.


----------



## wolf (Sep 17, 2009)

phanbuey said:


> well... its not just new tech - its also performance... the 5870's will be the fastest cards on the market until the next gen NV comes out.
> 
> That crown is key...  if the current benches are correct, then two $350 5870's will spank what is now two $500 gtx295 in quad SLI.  And after all this time, I think that is a bitter pill for NV to swallow.



+1 Very well said, this is a big hit to Nvidia's ego.

Hey, if prices on GT200 class cards plummet, that's good news for us.


----------



## KainXS (Sep 17, 2009)

the way I see it is like this:

Nvidia is going to release its 300 series sometime next year, from what all sources point to. By that time ATI will already be about to release the HD6k series (they always replace their cards in under a year, sometimes in 6 months) and nvidia will be taking losses. Thats gonna be a short lunch.

either way you look at it, a next-year launch for the 300 series is not looking good.


----------



## toyo (Sep 17, 2009)

Someone prepare the 2nd violin for Nvidia; the wind's blowing towards ATI for now, even if it will only be for a few months (or more? seems NV has birth issues with the GT300). This will remain in GPU history as an ATI win - hell, Wikipedia even mentions the few days the K6-III was king back in 1999.
There's no stopping the "horsepower", duh. To get more and better feats you have to pack some...
So NV, stop barking at the moon and get your act together; we need you so prices go back to normal levels.


----------



## TheMailMan78 (Sep 17, 2009)

You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and then we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!


----------



## toyo (Sep 17, 2009)

I don't remember anyone waiting for competitors to release equivalent products before declaring the champion of the time, whenever a new generation of cards based on a new DirectX or technology (hardware T&L) was released (among others: the Radeon 9700, Nvidia's GeForce 256, the 8800).
Although, before the 23rd we cannot know for sure; maybe NV has the GT300 already waiting to punch from the darkness at that date (mhmmm....)


----------



## wahdangun (Sep 17, 2009)

way to go nvidia, 

it's clear now: they won't have the GT300 ready when ati launches Evergreen


----------



## mdm-adph (Sep 17, 2009)

toyo said:


> I don't remember anyone (mentioned among others: radeon 9700, Nvidia Geforce 256, 8800) waiting for competitors to release their equivalent products when a new generation of cards based on a new DirectX or technology (T&L hardware) is released, before declaring the champion of the time.
> Although, before the 23rd we cannot know for sure, maybe NV has the GT300 already waiting to punch from the darkness at that date (mhmmm....)



I don't know what you're talking about -- the GeForce 5900 Ultra was a worthy competitor.

Now _that_ was the kind of card that could keep you warm on cold nights.


----------



## toyo (Sep 17, 2009)

Radeon 9700
Radeon 9700's advanced architecture was very efficient and, of course, more powerful compared to its older peers of 2002. Under normal conditions it beat the GeForce4 Ti 4600, the previous top-end card, by 15–20%. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled it would beat the Ti 4600 by anywhere from 40–100%. At the time, this was quite astonishing, and resulted in the widespread acceptance of AA and AF as critical, truly usable features.

Besides advanced architecture, reviewers also took note of ATI's change in strategy. The 9700 would be the second of ATI's chips (after the 8500) to be shipped to third-party manufacturers instead of ATI producing all of its graphics cards, though ATI would still produce cards off of its highest-end chips. This freed up engineering resources that were channeled towards driver improvements, and the 9700 performed phenomenally well at launch because of this. id Software technical director John Carmack had the Radeon 9700 run the E3 Doom 3 demonstration.[3]

The performance and quality increases offered by the R300 GPU are considered to be among the greatest in the history of 3D graphics, alongside the achievements of the GeForce 256 and Voodoo Graphics. Furthermore, NVIDIA's response in the form of the GeForce FX 5800 was both late to market and somewhat unimpressive, especially when pixel shading was used. R300 would become one of the GPUs with the longest useful lifetime in history, allowing playable performance in new games at least 3 years after its launch.[4]

GeForce256 and Nvidia 8800 series were also uncontested winners at that time, no other player on the market had equivalent functional technologies.


----------



## phanbuey (Sep 17, 2009)

TheMailMan78 said:


> You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.
> 
> Ill forum fight all you bastards!
> http://img211.imageshack.us/img211/9926/motivator6650984.jpg



We have no clue what Nvidia has in store because we dont SEE ANYTHING.  Not a peep from nvidia except a blatant attempt to write off DX11.

I am a big nvidia fan (check specs)... but reality is reality. This is, for all intents and purposes, a HUGE ati win. Nvidia has dominated for so long, and now they will lose the crown - they were SO far ahead... and now they are back where they were during the g7x series relative to ATI. That is a win for ATI no matter how you spin it.

ATI has a dx11 part that will take the crown... and nvidia is saying that DX11 wont matter ?!? 

*These are the same muppets that told us all that physX matters*. LOL. They haven't learned their lesson from DX10; they are just trying to convince their investors not to jump ship because they don't have a competing part. This is a business move, plain and simple... just trying to minimize the pain until they can compete.


----------



## EastCoasthandle (Sep 17, 2009)

Wow, talk about trying to pull the wool over our eyes. Let's go back to the G80 release, shall we? Because ATI was late to the party with its DX10 part (i.e. the HD2900), Nvidia reaped the benefit of having the only DX10 card in town, and their market share did increase during ATI's absence - in spite of there being little to no DX10 games out. Not only did consumers prefer the G80 at its higher price, it caused ATI to lose a big piece of the discrete GPU market (as well as mobile, etc).   

I believe their market share loss was initiated by the G80 release with no answer from ATI, compounded by the HD 2900 release, more or less. Today, AMD is still trying to recover from "that". Now all of a sudden we are to forget what happened and say that DX11 is nice but not all that important. :shadedshu  Yes, we know that market conditions then and now are completely different. However, if AMD is able to adapt and compensate for that, I see no reason why they wouldn't do well.


----------



## PVTCaboose1337 (Sep 17, 2009)

Remember that lots of stupid people buy graphics cards. If Nvidia says that they have DX11 equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL define graphics card sales greatly.


----------



## phanbuey (Sep 17, 2009)

PVTCaboose1337 said:


> Remember that lots of stupid people buy graphics cards. If Nvidia says that they have DX11 equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL define graphics card sales greatly.



+1... exactly... theyre just trying to pull a Baghdad Bob on their investors.  "No No... we ARE winning the war...  ATI is cowering in fear, and our customers don't care about new tech at all... its just not important."


----------



## EastCoasthandle (Sep 17, 2009)

PVTCaboose1337 said:


> Remember that lots of stupid people buy graphics cards. If Nvidia says that they have DX11 equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL define graphics card sales greatly.



People feign ignorance all the time when it comes to products they prefer. It doesn't mean the masses simply don't know any better. I believe it's the positive word of mouth from their friends, etc, that causes brand recognition, more so than just "not knowing better". Again, I'm talking about the masses, not individual cases.


----------



## PVTCaboose1337 (Sep 17, 2009)

phanbuey said:


> +1... exactly... theyre just trying to pull a Baghdad Bob on their investors.  "No No... we ARE winning the war...  ATI is running in fear"



Exactly!  Cause we know that ATI will have the first DX11 card out soon (the HD 5xxx series) so ATI will win!  That means Nvidia will be in dire trouble.


----------



## AsRock (Sep 17, 2009)

Mussels said:


> in the meantime, you get all the other benefits anyway.
> 
> Oh but if you wait 6 months before buying, you might as well wait another 6 months and get the 6 series from ATI. or another 6 after that for nvidias next offering... or another 6 after that...
> 
> ...



Yes, my last post was a little narrow minded.

By mid/end of next year DX11 will, COUGH, should be more worth it, as more games will be out for it. Although the said boost makes it more tempting, tell ya the truth i've played all the games i want to play already, and the ones giving me issues are more CPU bound than GPU.

Stupid to buy a DX10 card now? That depends on what card they have now. BUT DX11 cards are going be like $250-$300+. So they could get a DX10 card for around $150, and by mid/end of next year, if DX11 is more acceptable, get one - the price is going be cheaper by then, and there'll be a reason to get one.

I'm a gamer, so thats my view on it. I do very few benchmarks, as thats not what i get faster hardware for.

Sure, if you have a lower end card it's going be more worth it, but if you already have a card like the 285 or the 4890, there is no need if you're a gamer..


----------



## PVTCaboose1337 (Sep 17, 2009)

AsRock said:


> Yes, my last post was a little narrow-minded.
> 
> By mid to late next year, DX11 will... cough... *should*... be more worth it, as more games will be out for it. The claimed boost does make it more tempting, but to tell you the truth I've already played all the games I want to play, and the ones giving me issues are more CPU bound than GPU bound.
> 
> ...



If DX11 turns out like DX10, we won't need it!  Every DX10 game could run in DX9 mode.  Were DX10-capable cards necessary?  No.


----------



## [I.R.A]_FBi (Sep 17, 2009)

PVTCaboose1337 said:


> If DX11 turns out like DX10, we won't need it!  Every DX10 game could run in DX9 mode.  Were DX10-capable cards necessary?  No.




Yes they were, if you wanted the extra-high quality settings.


----------



## mtosev (Sep 17, 2009)

nvidia needs to cut the crap and make a DX11 card. if they don't then they can STFU!


----------



## WhiteLotus (Sep 17, 2009)

I kind of agree with nVidia - I don't believe DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelf. What will cause these cards to sell is that they will be top dog for a good few months. And being DX11 pretty much equates to performing well in DX10 et al. (hell, there are still "will this run Crysis" threads).

What nVidia are saying to all the people who think DX11 will make a huge difference is: hey, you don't need DX11 just yet; here, we'll cut the price (well, I would think they would) on our cards that still perform pretty damn well.


----------



## mtosev (Sep 17, 2009)

WhiteLotus said:


> I kind of agree with nVidia - I don't believe DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelf. What will cause these cards to sell is that they will be top dog for a good few months. And being DX11 pretty much equates to performing well in DX10 et al. (hell, there are still "will this run Crysis" threads).
> 
> What nVidia are saying to all the people who think DX11 will make a huge difference is: hey, you don't need DX11 just yet; here, we'll cut the price (well, I would think they would) on our cards that still perform pretty damn well.



it's AMD.


----------



## HossHuge (Sep 17, 2009)

I've enjoyed reading this thread.  You guys are making a lot of valid points.


----------



## btarunr (Sep 17, 2009)

TheMailMan78 said:


> You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and then we can talk. Until then all I see is mob rule and bandwagon jumpers.
> 
> I'll forum fight all you bastards!
> http://img211.imageshack.us/img211/9926/motivator6650984.jpg



That's not nerd rage, this is:






He wasn't happy when he found out that his GTX 380M was based on 40 nm G92c. That aside, let's get back on track.


----------



## extrasalty (Sep 17, 2009)

ATI: We have DX11 WHQL driver.
nVidia: DX11? Phew, let's concentrate on what's important: the PowerPoint slides.


----------



## [I.R.A]_FBi (Sep 17, 2009)

does this driver have opencl?


----------



## newconroer (Sep 17, 2009)

*"...framerate and resolution are nice, but today they are very high and going from 120fps to 125fps is not going to fundamentally change end-user experience. But I think the things that we are doing with Stereo 3D Vision, PhysX, about making the games more immersive, more playable is beyond framerates and resolutions. Nvidia will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side.”*

Um, no, going from 120 to 125 isn't worth anything, correct, but stopping performance from dropping from 60 to 30 IS worth something.

The compute side is all well and good, because without it 'special' visuals won't work efficiently, but to say that pure computing is what's necessary is a bit premature. 

Hopefully he's hinting at what we want to see in the near future, which is real-time vector drawing rather than pre-rendered visuals. But that would require cards with massive computing flexibility, like the FireGL types used with CAD programs.

But still: stop making cards that give you 125fps over 120fps, and start making ones that don't cower in fear at a few dynamic shadows in a 3D program. Then worry about 'compute' cards.
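For what it's worth, the framerate point above is just frame-time arithmetic, nothing vendor-specific; a quick sketch:

```python
def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds at a given framerate."""
    return 1000.0 / fps

# Going from 120 to 125 fps shaves only a third of a millisecond per frame:
print(round(frame_time_ms(120) - frame_time_ms(125), 2))  # 0.33
# ...while dropping from 60 to 30 fps adds a very noticeable 16.67 ms per frame:
print(round(frame_time_ms(30) - frame_time_ms(60), 2))  # 16.67
```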


----------



## Steevo (Sep 17, 2009)

ATI was hardware accelerating before Nvidia.


----------



## extrasalty (Sep 17, 2009)

It will soon:
http://developer.amd.com/gpu/ATIStreamSDK/pages/TutorialOpenCL.aspx


----------



## Benetanegia (Sep 17, 2009)

I think most angry fans are completely missing the point the Nvidia rep was making. He is not saying DX11 won't matter, and he is not saying it is worthless. All he is saying is that it won't drive sales as much as other factors: performance and, yes, GPGPU capabilities. The non-gamer crowd impressed by GPGPU keeps growing, in forums like CGSociety and even among the YouTube junkies who upload lots of videos every day. All these people, who couldn't care less about gaming, let alone DX11, do find GPGPU quite useful, because it means they can encode their videos twice as fast just by adding a small GPU in place of the Intel IGP most of them have. 

Don't be naive and pretend the gaming crowd is anywhere close to that installed base of users wanting some acceleration in video encoding, Photoshop and the like. That is what Nvidia is talking about. The capable software is here already and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device, and that will sell more cards, simply because, as I said, the non-gamer crowd is much, much bigger than the gamer one. And considering the WoW and Sims crowd, who don't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.

At the end of the day only enthusiasts care and know about DX11, and probably only half of them will buy the new cards because of it, since we know it will mean squat, at least in the first titles and in multi-platform titles. So that leaves us with a number of around 2%. That's the percentage of people who will buy a card caring about DX11. The rest will buy the hardware for *something else*, but not DX11.
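The "twice as fast" encoding figure above is the poster's claim, not a measurement, but it is the kind of thing you can sanity-check with Amdahl's law. A toy sketch (the 90% offloadable fraction and 5x GPU factor below are made-up illustrative numbers, not encoder data):

```python
def overall_speedup(offload_fraction: float, accel_factor: float) -> float:
    """Amdahl's law: end-to-end speedup when only a fraction of the
    work is accelerated by the given factor."""
    return 1.0 / ((1.0 - offload_fraction) + offload_fraction / accel_factor)

# Even if 90% of a transcode runs 5x faster on the GPU, the whole job
# only gets about 3.6x faster; the serial 10% limits the gain:
print(round(overall_speedup(0.9, 5.0), 2))  # 3.57
```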


----------



## TheMailMan78 (Sep 17, 2009)

Benetanegia said:


> I think most angry fans are completely missing the point the Nvidia rep was making. He is not saying DX11 won't matter, and he is not saying it is worthless. All he is saying is that it won't drive sales as much as other factors: performance and, yes, GPGPU capabilities. The non-gamer crowd impressed by GPGPU keeps growing, in forums like CGSociety and even among the YouTube junkies who upload lots of videos every day. All these people, who couldn't care less about gaming, let alone DX11, do find GPGPU quite useful, because it means they can encode their videos twice as fast just by adding a small GPU in place of the Intel IGP most of them have.
> 
> *Don't be naive and pretend the gaming crowd is anywhere close to that installed base of users wanting some acceleration in video encoding, Photoshop and the like.* That is what Nvidia is talking about. The capable software is here already and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device, and that will sell more cards, simply because, as I said, the non-gamer crowd is much, much bigger than the gamer one. And considering the WoW and Sims crowd, who don't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.
> 
> At the end of the day only enthusiasts care and know about DX11, and probably only half of them will buy the new cards because of it, since we know it will mean squat, at least in the first titles and in multi-platform titles. So that leaves us with a number of around 2%. That's the percentage of people who will buy a card caring about DX11. The rest will buy the hardware for *something else*, but not DX11.


A dedicated GPU card speeds up Photoshop? That's news to me.


----------



## toyo (Sep 17, 2009)

Yeah, CS4 has a few accelerated features. Nvidia even put out a dedicated Quadro CX card for Adobe CS4...


----------



## Steevo (Sep 17, 2009)

Yes it does, and ATI was doing GPU acceleration first: F@H-style computing, transcoding, better video acceleration...


Then Nvidia jumped on, and they do it better in some respects with the G200 series. However, the move to DX11 will allow a common platform that Nvidia can't throw "TWIMTBP" money at, levelling the playing field and providing a better consumer experience for all.


DX11 is just the first step toward computers that are faster at everything, and a hugely better experience. It is the DX7 of our time: the beginning of something better.


----------



## TheMailMan78 (Sep 17, 2009)

No, it doesn't. I use CS4 10 hours a day. All a dedicated GPU does is help out with some of the scaling, and that can be done by a good IGP with Shader Model 3.0.

Look.
http://kb2.adobe.com/cps/404/kb404898.html


----------



## extrasalty (Sep 17, 2009)

The rep from nVidia is VP of investor relations. The quarterly loss must be near.


----------



## toyo (Sep 17, 2009)

TheMailMan78 said:


> No, it doesn't. I use CS4 10 hours a day. All a dedicated GPU does is help out with some of the scaling, and that can be done by a good IGP with Shader Model 3.0.
> 
> Look.
> http://kb2.adobe.com/cps/404/kb404898.html



There's more to it than scaling, and in my opinion this is GPU acceleration... I feel I'm misunderstanding what you mean somewhere.


----------



## KainXS (Sep 17, 2009)

Of course DX11 doesn't matter right now. What I see in the 5870 is a pretty awesomely priced card, with performance that seems to top everything else available and a standby power state that beats everything else too. Nvidia is just shaking in their boots right now, because they know they are screwed until next year.


----------



## TheMailMan78 (Sep 17, 2009)

toyo said:


> There's more to it than scaling, and in my opinion this is GPU acceleration... I feel I'm misunderstanding what you mean somewhere.



No, there's not. Read the article from Adobe. Read the system requirements for the GPU to be of any use. It's a gimmick. 99% of the computers made in the last 2 years can run this on their IGP. A dedicated GPU card will make no difference in Photoshop.


----------



## Benetanegia (Sep 17, 2009)

Steevo said:


> ATI was hardware accelerating before Nvidia.



Correction: Stanford was hardware-accelerating before Nvidia, and they used ATI cards to accelerate their Brook parallel computing libraries. Essentially, ATI X19xx cards were better for that purpose, but ATI had very little to do with the project apart from the technical support they had to give.

Curiously, the chief scientist behind the Brook project, Bill Dally, and one of the project directors (whose name I don't remember now) both work for Nvidia. Bill Dally is Nvidia's chief scientist and VP, and the other one runs Nvidia's parallel computing division.


----------



## toyo (Sep 17, 2009)

TheMailMan78 said:


> No, there's not. Read the article from Adobe. Read the system requirements for the GPU to be of any use. It's a gimmick. 99% of the computers made in the last 2 years can run this on their IGP. A dedicated GPU card will make no difference in Photoshop.



Ah, I get you now. I don't have any idea about IGP-vs-GPU performance in PS, and at this stage maybe it isn't much more than a gimmick, but it's the right gimmick to choose. It's the first accelerated PS, and the differences vs. the CPU are visible where it works. I hope Adobe will continue down this path, and if they can make any IGP run it, that's even more value. I also hope they prioritize bug fixing and coherence between their apps.


----------



## erocker (Sep 17, 2009)

extrasalty said:


> The rep from nVidia is VP of investor relations. The quarterly loss must be near.



And that says it all folks.


----------



## phanbuey (Sep 17, 2009)

Benetanegia said:


> I think that most angry fans are completely missing the point that the Nvidia rep was making. He is not saying DX11 won't matter, he is not saying it is worthless. All that he is saying is that it won't drive sales as much as other factors. Performance and YES GPGPU capabilities. The number of impressed non-gamer crowd is increasing in forums like CGSociety and even youtube yonkies that upload lots of videos everyday. All this people that couldn't care less about gaming, let alone DX11, do find GPGPU quite useful, because it means they can encode their videos twice as fast by just adding an small GPU instead of the Intel IGP most of them have.
> 
> Don't be naive and pretend that the gaming crowd is anywhere close to that installed base of users wanting some acceleration in video encoding, Photoshop and the like. Nvidia is talking about that. The capable software it's here already and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device and that will sell more cards, simply because as I said the volume of non-gamer crowd is much much bigger than the gamer one. And considering the WoW and Sims crowd, that doesn't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.
> 
> At the end of the day only enthusiasts care and know about DX11, and probably only half of them will buy the new cards based on DX11, because we know it will mean squat, at least in first tittles and multi-platform titles. So that leaves us with a number of around 2%. That's the percentage of people that will buy a card caring about DX11. The rest will buy the hardware for *something else*, but not DX11.




Yeah, BUT... they touted DX10 and CUDA and PhysX as reasons why their GPUs would sell... and are now saying that DX11 doesn't really matter.  

Of course you can argue it and spin it any way you want, and say that DX11 does or doesn't matter while giving valid reasons.  But Nvidia changed the tune of their song.  That's the point... they used to be the ones touting benefits that didn't exist (lol, PhysX, and even CUDA to a big extent).  Like the Photoshop "speedup", which only affects a sliver of the features in Photoshop.

Yet now they're trying to write off DX11... yeah... de Nile is a river in Africa...  And they are talking to investors.  

Honestly, think about it: if DX11 WAS a major reason people would buy gfx cards, hypothetically, would Nvidia really go out to investors and say "hey, this is a huge feature and we got NOTHIN'!  This will sell, but we don't have it yet... sorry, our bad"?


----------



## Benetanegia (Sep 17, 2009)

TheMailMan78 said:


> No, there's not. Read the article from Adobe. Read the system requirements for the GPU to be of any use. It's a gimmick. 99% of the computers made in the last 2 years can run this on their IGP. A dedicated GPU card will make no difference in Photoshop.



But the higher the performance of the GPU (be it IGP or dedicated), the better. An IGP will let you apply some masks and filters faster than with the CPU alone. A faster card will let you apply them on the fly, and on photos with two- and three-digit megapixel counts, something an IGP can't do. Granted, it was my fault to include CS4 as a GPGPU app, considering what GPGPU officially means (like what HD is as opposed to mere high resolution), given that any DX10-capable card can accelerate CS4. But the point still stands that the better the card, the faster CS4 will run those things, and gimmick or not, it could drive sales even better than DX11. Far more apps are going to get acceleration anyway, and their convenience is undeniable.


----------



## erocker (Sep 17, 2009)

phanbuey said:


> Yeah, BUT... they touted DX10 and CUDA and PhysX as reasons why their GPUs would sell... and are now saying that DX11 doesn't really matter.
> 
> Of course you can argue it and spin it any way you want, and say that DX11 does or doesn't matter while giving valid reasons.  But Nvidia changed the tune of their song.  That's the point... they used to be the ones touting benefits that didn't exist (lol, PhysX, and even CUDA to a big extent).  Like the Photoshop "speedup", which only affects a sliver of the features in Photoshop.
> 
> ...



...and why should we believe (technologically speaking) anything any VP of investor relations has to say about a product he most likely doesn't understand? This is nothing but spin and bullshit, and why should it even matter to us? I want to hear what Mr. "Investor Relations" said coming out of one of Nvidia's engineers' mouths.

Regardless, we all know what's going to happen anyway. In a few months (more or less) Nvidia will launch a DX11 card that will most likely perform a bit better than ATI's and will most definitely be more expensive. Until then we are going to hear a bunch of crap telling us not to buy the competitor's products. Whoop de doo!


----------



## Mussels (Sep 17, 2009)

TheMailMan78 said:


> A dedicated GPU card speeds up Photoshop? Thats news to me.



The latest Photoshop (CS4) actually has GPU acceleration. It uses GPU RAM and renders the canvas as a 3D surface, speeding up editing/effects and other fancy crap I don't know about, since I just use it to resize my photos.


While it doesn't make HUGE differences, even an IGP can do some things faster than some CPUs (or at least, the two combined are faster than the CPU alone).


----------



## erocker (Sep 17, 2009)

Mussels said:


> The latest Photoshop (CS4) actually has GPU acceleration. It uses GPU RAM and renders the canvas as a 3D surface, speeding up editing/effects and other fancy crap I don't know about, since I just use it to resize my photos.



Didn't they implement that in CS3? I dunno, I'm still using CS2.


----------



## PVTCaboose1337 (Sep 17, 2009)

OK, so let's just use the X8xx series as an example (old, I know).  Remember that this card did not have the shader support for BioShock?  So Zek upgraded to the 2400 because he needed shader support, even though the 2400 was slower than his X850.  Features do matter.  

Also on photoshop:  







It does have that GPU stuff or whatnot.  It's just that my Vista hates my amazing modified drivers...


----------



## Mussels (Sep 17, 2009)

erocker said:


> Didn't they implement that in CS3? I dunno, I'm still using CS2.



I went from Photoshop 7.0 to CS4, so I had a big leap.







It has a list of features there that are only enabled with hardware acceleration on.


----------



## phanbuey (Sep 17, 2009)

erocker said:


> ...and why should we believe (technologically speaking) anything any VP of investor relations has to say about a product he most likely doesn't understand? This is nothing but spin and bullshit, and why should it even matter to us? I want to hear what Mr. "Investor Relations" said coming out of one of Nvidia's engineers' mouths.



they keep those people locked up far, far away from the investors... 

But honestly, I think he does understand that NV is going to be f***ed very quickly if the DX11 market share goes to ATI. As do the investors. There really isn't any need for tech knowledge to connect those dots.


----------



## Benetanegia (Sep 17, 2009)

phanbuey said:


> Yeah, BUT... they touted DX10 and CUDA and PhysX as reasons why their GPUs would sell... and are now saying that DX11 doesn't really matter.
> 
> Of course you can argue it and spin it any way you want, and say that DX11 does or doesn't matter while giving valid reasons.  But Nvidia changed the tune of their song.  That's the point... they used to be the ones touting benefits that didn't exist (lol, PhysX, and even CUDA to a big extent).  Like the Photoshop "speedup", which only affects a sliver of the features in Photoshop.
> 
> ...



Yeah, and everybody thought DX10 was going to change the field of gaming. It did not, as we all learned. It's stupid to stumble over the same rock again. I know that, you know that, Nvidia knows that and AMD knows that. If anything, the only dishonest voice regarding DX11 is AMD, with all the "know the future" BS and whatnot. Even though *it does* change how games *could* be made (i.e. it is a huge technological advancement), it won't change the gaming reality in the near future. Even if they released DX12 today with nuclear technology in it, games would still be "tweaked" DX9 games. That is the reality; DX11 means as much as DX10 did, that is, *0*.


----------



## laszlo (Sep 17, 2009)

what I know for sure:

- people who like new stuff will jump to buy
- benchmark fans also
- gamers with a lot of cash also
- e-peen people also
--------------------------------
- people who couldn't give a shit won't
- gamers who try to squeeze everything from older cards won't
- those who have the cash but don't consider it necessary yet won't

and the lists could go on...


----------



## PVTCaboose1337 (Sep 17, 2009)

Although DX10 did not change the field of gaming, it sold cards.  That means DX11 will sell cards too.  People are SCARED that their card will not work in the future.  Fear drives sales.  People FEAR not having a card that can handle DX11, and that will drive sales more than the hardware will.  

ALSO: I love how we all immediately called BS on this Nvidia statement.  Nice one, guys!


----------



## Benetanegia (Sep 17, 2009)

PVTCaboose1337 said:


> Although DX10 did not change the field of gaming, it sold cards.  That means DX11 will sell cards too.



Extreme example: Hitler won the elections legally, which means most people voted for him. Is that going to happen again? No. Definitely not.

And DX10 didn't really sell a lot of cards on its own. The 8800 sold, but what about the 8600 or HD 2600 when they were released? X19xx and 79xx cards vastly outsold them, because their performance was better. The 8800 sold a lot because it offered unprecedented performance, like the ability to play every game at 1920x1200 with 4xAA, something that not even SLI or Crossfire could do at the time in newer games.


----------



## phanbuey (Sep 17, 2009)

Benetanegia said:


> Yeah, and everybody thought DX10 was going to change the field of gaming. It did not, as we all learned. It's stupid to stumble over the same rock again. I know that, you know that, Nvidia knows that and AMD knows that. If anything, the only dishonest voice regarding DX11 is AMD, with all the "know the future" BS and whatnot. Even though *it does* change how games *could* be made (i.e. it is a huge technological advancement), it won't change the gaming reality in the near future. Even if they released DX12 today with nuclear technology in it, games would still be "tweaked" DX9 games. That is the reality; DX11 means as much as DX10 did, that is, *0*.



We agree there... but what about PhysX? I have it... and it's bullsh**, yet at the same time that they downplayed DX11 they talked up PhysX and the "immersive experience"... you see?

They are both full of sh**.  But only one is a hypocrite.


----------



## PVTCaboose1337 (Sep 17, 2009)

Benetanegia said:


> Extreme example: Hitler won the elections legally, which means most people voted for him. Is that going to happen again? No. Definitely not.
> 
> And DX10 didn't really sell a lot of cards on its own. The 8800 sold, but what about the 8600 or HD 2600 when they were released? X19xx and 79xx cards vastly outsold them, because their performance was better. The 8800 sold a lot because it offered unprecedented performance, like the ability to play every game at 1920x1200 with 4xAA, something that not even SLI or Crossfire could do at the time in newer games.



Extreme example 2: Well, we elected Obama...

But back on topic: remember DX10.1 vs DX10?  People bought the DX10.1-equipped cards because they thought they would not be able to play the latest and greatest games otherwise.


----------



## toyo (Sep 17, 2009)

In the end, how many people even know who nVidia are? Looking back, I remember the puzzled faces of every guy/gal I helped with their computer when I told them: "Your video card is nVidia/ATI. It's overheating, etc." Say what? "Ah, we know Intel, we heard about them on TV. Them and Microsoft. Big, bad companies, they do big, bad things. Never heard of such things as a video card or ATI/nVidia. Maybe you wanna make fun of us..." Damn, I had two friends who brought their shiny laptop to me crying that it wouldn't start anymore... it had a CD loaded, so it couldn't boot.
I mean, within this sea of ignorance (or shall we call it a "cloud"?) almost no one will know about these (fascinating) GPU wars. People find it hard to think these days.


----------



## erocker (Sep 17, 2009)

Benetanegia said:


> Yeah, and everybody thought DX10 was going to change the field of gaming. It did not, as we all learned. It's stupid to stumble over the same rock again. I know that, you know that, Nvidia knows that and AMD knows that. If anything, the only dishonest voice regarding DX11 is AMD, with all the "know the future" BS and whatnot. Even though *it does* change how games *could* be made (i.e. it is a huge technological advancement), it won't change the gaming reality in the near future. Even if they released DX12 today with nuclear technology in it, games would still be "tweaked" DX9 games. That is the reality; DX11 means as much as DX10 did, that is, *0*.



Nope, that all remains to be seen. All the companies involved are guilty of crappy slogans, saying this, saying that... whatever. We will see.


----------



## [I.R.A]_FBi (Sep 17, 2009)

We know not everyone will run out to buy a 58XX or a GTX380 or whatever, but whatever is at the top determines, to some extent, how the middle and bottom dollars are spent.


----------



## PVTCaboose1337 (Sep 17, 2009)

Seeing as 16 members and 28 guests (44 people) are watching, we have to ask: which of you is going to get a DX11-capable card?  When you give your opinion about whether DX11 will change the industry, just put in a "PS: I'm gonna get a DX11-capable card (or not)" and say why you are (or aren't) buying one!


----------



## Benetanegia (Sep 17, 2009)

phanbuey said:


> We agree there... but what about PhysX? I have it... and it's bullsh**, yet at the same time that they downplayed DX11 they talked up PhysX and the "immersive experience"... you see?
> 
> They are both full of sh**.  But only one is a hypocrite.



I have PhysX and I love it. Between you and me, that's a 50/50 love/hate. And the thing is that PhysX is something that comes apart from the engine; developers don't have to make the engine any different in order to use it (you do for DX). As long as PhysX is integrated, GPU acceleration can be toggled as easily as you can change from low-detail textures to high-detail textures. And PhysX changes the gaming experience much, much more than what a "better way of doing AA" or "look, the head doesn't have edges now, but I have to stop playing to even notice it" DX feature can do for me.

None of the companies is free of hypocrisy. Continuing with PhysX: AMD said they would not support a proprietary technology, while at the same time they were secretly working with Havok, which is to say Intel, to use their proprietary tech.


----------



## Mussels (Sep 17, 2009)

PVTCaboose1337 said:


> Seeing as 16 members and 28 guests (44 people) are watching, we have to ask: which of you is going to get a DX11-capable card?  When you give your opinion about whether DX11 will change the industry, just put in a "PS: I'm gonna get a DX11-capable card"



I'll get one.

One 5850 will match the power of my 4870s, while letting me run three screens at the same time with lower power consumption.


----------



## PVTCaboose1337 (Sep 17, 2009)

Mussels said:


> I'll get one.
> 
> One 5850 will match the power of my 4870s, while letting me run three screens at the same time with lower power consumption.



OK, so you will get one, but not because of DX11!  Or does DX11 even factor into the equation for you?


----------



## [I.R.A]_FBi (Sep 17, 2009)

"PS: I'm gonna get a DX11 capable card" after the prices drop

better feature set and lower power consumption.


----------



## TheMailMan78 (Sep 17, 2009)

PVTCaboose1337 said:


> Seeing as 16 members and 28 guests (44 people) are watching, we have to ask: which of you is going to get a DX11-capable card?  When you give your opinion about whether DX11 will change the industry, just put in a "PS: I'm gonna get a DX11-capable card (or not)" and say why you are (or aren't) buying one!



I'll be buying one for sure. Do I "need" one? No. But for some reason I must have one. DX11 will not be utilized to its full extent until new consoles are released. Hell, I have two 4850s in CrossFire right now and all I play is BF2.


----------



## laszlo (Sep 17, 2009)

PVTCaboose1337 said:


> Extreme example 2: Well we elected Obama...
> 
> But back to the topic, remember DX10.1 vs DX10.  People bought the DX10.1 equipped cards because they thought that they would not be able to play the latest and greatest games.



exactly, and let's see the minimum system requirements for a few new games:

Red Faction: Guerrilla: "Video Memory: 128 MB (ATI Radeon X1300/NVIDIA GeForce 7600)"

Wolfenstein: "Video Card: 256MB NVIDIA(R) GeForce(R) 6800 GT or ATI Radeon(TM) X800"

NFS SHIFT: "Video Card – 256 MB, with support for Pixel Shader 3.0; Supported chipsets: ATI Radeon X1800 XT 512MB or greater; NVIDIA GeForce 7800 GT 256MB or greater"

so we still see games supporting Pixel Shader 2.0-class cards, interesting, no?

as already discussed in another thread, all future games will be DX9-compatible for sure, and I bet they will run on the better DX9/DX10 cards


----------



## Batou1986 (Sep 17, 2009)

I will get a DX11-capable card, but I'm in no rush. I didn't buy my current card because of DX11 or CUDA or PhysX; I bought it because it just works with no hassle and runs smoother regardless of DX version.

All I can say is that from my experience I'm sticking with Nvidia. I was an ATI fanboy from the 9800 Pro until I decided to upgrade around the 3xxx series, and I went through three cards that were either DOA or artifacted and glitched in games, so I bought a 9800 GTX+ figuring, what the hell, it can't be any worse.
Boy, was I surprised: everything just worked. No more games where I have to tweak the settings endlessly before they run right, and no more glitchy drivers.


----------



## Mussels (Sep 17, 2009)

PVTCaboose1337 said:


> OK, so you will get one, but not because of DX11!  Or does DX11 even factor into the equation for you?



It does. It's ONE of many reasons to get one of these cards.

I tend to keep my cards for several years, and I don't like to replay games.

If I play a game at less than max graphics the first time around, I rarely bother to come back a second time a few years later when I upgrade, so I like to be high end from the start.


That said, I'm in no immediate rush to upgrade. I'll wait until I have a buyer for my current cards to offset the cost.


----------



## PVTCaboose1337 (Sep 17, 2009)

I myself drive my cards to their death.  My 7900GT lasted until I could not do what I wanted.  And what did I want?  To play Crysis.  Can my 4850 do that?  Hell yes.  Can my 4850 do all I want right now?  Hell yes, and more!  Do I need to upgrade now?  No.


----------



## Batou1986 (Sep 17, 2009)

PVTCaboose1337 said:


> I myself drive my cards to their death.  My 7900GT lasted until I could not do what I wanted.  And what did I want?  To play Crysis.  Can my 4850 do that?  Hell yes.  Can my 4850 do all I want right now?  Hell yes, and more!  Do I need to upgrade now?  No.



Exactly. About time graphics people wake up and realize there's no point in being able to run Crysis or something at 16xAA with 16xAF at some ridiculous 2000x resolution at 150FPS. It looks fine with 2xAA at 1680x1050 at 30+ FPS on my rig.


----------



## Mussels (Sep 17, 2009)

PVTCaboose1337 said:


> I myself drive my cards to their death.  My 7900gt lasted till I could not do what I wanted.  What was what I want?  Play Crysis.  Can my 4850 do that?  Hell yes.  Can my 4850 do all I want right now?  Hell yes, and more!  Do I need to upgrade now?  No.



My problem is that I have the ability to see up to around 120 FPS/120Hz. I used to run that on my old 19" CRT at 1024x768, but lack that ability now on LCD.

Also, anything below 60FPS feels slow to me, since I happen to have fast reaction times. So when I say I want to play games maxed out, I mean "play them maxed out at 60FPS minimum" - and that is my drive to overclock.


----------



## phanbuey (Sep 17, 2009)

Mussels said:


> my problem is that i have the ability to see upto around 120 FPS/120hz. i used to run that on my old 19" CRT at 1024x768, but lack that ability now on LCD.
> 
> My problem is that anything below 60FPS feels slow to me, since i happen to have fast reaction times. so when i say i want to play games maxed out, i mean "play them maxed out at 60FPS minimum" - and thus is my drive to overclock



Yep, me too!... I'm jealous of the 4850 doing everything that one needs... lol, my 260s don't even do that for me... I could easily double up on power and feel like it was a worthwhile investment.

And AA is key.


----------



## PVTCaboose1337 (Sep 17, 2009)

Mussels said:


> my problem is that i have the ability to see upto around 120 FPS/120hz. i used to run that on my old 19" CRT at 1024x768, but lack that ability now on LCD.
> 
> My problem is that anything below 60FPS feels slow to me, since i happen to have fast reaction times. so when i say i want to play games maxed out, i mean "play them maxed out at 60FPS minimum" - and thus is my drive to overclock





Yeah I guess you might want to play L4D2 on max settings...  OH WAIT YOU CAN'T CAUSE YOUR COUNTRY BANNED IT 

Just kidding, but yeah, I am the same way. The problem is I cannot afford to do that (not much income here). In fact, I actually downgraded because I was going off to college and needed a laptop (specs to left).


----------



## Batou1986 (Sep 17, 2009)

I will agree there. I've been a stickler for Vsync as of late, though you're talking 120Hz; I've always had issues with skipping and stuff when games run well over 100FPS, especially online.


----------



## toyo (Sep 17, 2009)

I'm itching to buy one too. I can't say yet if it will be Nvidia or ATI; I'm waiting for the price battle that will take place next year.
My reasons are:
- I love tech, especially CPU/GPU stuff; if I had the money I'd probably go with the fastest things available
- It will match with W7 and DirectX 11. Games will appear slowly...
- I hope more and more programs will take advantage of the GPU. The kind of software I use (design) will probably be among the first to do that
- I just can't wait to see it in my case: cool as spring, powerful as Niagara, silent as the Dark Knight preying on the Arkham inmates (ok, just let me dream about the cool & silent, k), 40nm process FTW
- other reasons, I could go on for ages...


----------



## Mussels (Sep 17, 2009)

Batou1986 said:


> I will agree there ive been a stickler for Vsync as of late, tho ur talking 120hz ive always had issues with skipping and stuff when games run well over 100fps especially online



That's because you have a 60Hz screen. You can't see more than 60FPS on it; any frames above that are simply discarded.

My point was that I got used to 120Hz and 120FPS, so dropping to 60 is acceptable, though bad enough for me. Dropping to 30 is not acceptable, and drives me mad.


One thing that needs to be said as well: any DX11 games being made right now will be coded for ATI, since they have those cards to test on.

What this means is that when Nvidia does launch, ATI will have a head start on support and performance, as game makers can support ATI from the early stages, whereas Nvidia has to be patched in.


----------



## [I.R.A]_FBi (Sep 17, 2009)

Mussels said:


> my problem is that i have the ability to see upto around 120 FPS/120hz. i used to run that on my old 19" CRT at 1024x768, but lack that ability now on LCD.
> 
> My problem is that anything below 60FPS feels slow to me, since i happen to have fast reaction times. so when i say i want to play games maxed out, i mean "play them maxed out at 60FPS minimum" - and thus is my drive to overclock



I wish my wallet could see that well


----------



## TheMailMan78 (Sep 17, 2009)

Mussels said:


> One thing that needs to be said as well: any DX11 games being made right now, will be coded for ATI since they have those cards to test on.
> 
> What this means is that when Nvidia do launch, ATI has a head start on support and performance, as the game makes can support ATI from the early stages, whereas nvidia has to be patched in.


 THIS is a good point. You just sold me on the 5870


----------



## HellasVagabond (Sep 17, 2009)

What does it matter right now who gets the first DX11 card on the market? It will matter when the first games get released, just like it mattered with the DX10 launch.

That aside, it is no secret that with CUDA/PhysX/Stereo3D, NVIDIA has been trying to make a better product, and that is what they are promoting. Is that a bad thing? Would I or anyone else support the opponent's solution? Did ATI ever support NVIDIA's solutions?

Reading between the lines means nothing; the bottom line does. Right now we are all arguing about the single silliest thing: who will launch a DX11-compliant card first, when there is no use for it yet.

Mussels, you code a game against the API, not a specific card - unless that company invests in your project - so it has nothing to do with who launches a card first.


----------



## TheMailMan78 (Sep 17, 2009)

HellasVagabond said:


> What does it matter right now who gets the first DX11 card on the market ? It will matter when the first games get released, just like it mattered with the DX10 launch.
> 
> That aside it is no secret that with Cuda/PhysX/Stereo3D NVIDIA has been trying to make a better product and that is what they are supporting, is that a bad thing ? Would i or anyone else support the opponents solution ? Did ATI ever support NVIDIAs solutions ?
> 
> Reading between the lines means nothing, the bottom line does and right now we are all arguing about the single most silly thing, which will launch a DX11 compliant card first when there is no usage for it yet.



DX11 games will be on the market in way less than a few months. Your argument is null.


----------



## HellasVagabond (Sep 17, 2009)

DX11 games will be on the market long after NVIDIA cards.


----------



## PVTCaboose1337 (Sep 17, 2009)

Ok, here is a good question: will developers keep DX9-compatible cards from running their games? E.g. RE5 lets you run in DX9 or DX10. Will the new games only run in DX10 or DX11?


----------



## TheMailMan78 (Sep 17, 2009)

HellasVagabond said:


> DX11 games will be on the market long after NVIDIA cards.



Nvidia will have DX11 cards on the market before December?


----------



## btarunr (Sep 17, 2009)

TheMailMan78 said:


> Nvidia will have DX11 cards on the market before December?



I don't rule out a "jelly launch" (a hard launch with such limited quantities actually sold that it's almost a soft launch). Sell 12 cards in CA, 12 in the US, 12 in the UK, 12 in the EU, 12 in CN, 12 in AU, have 12 samples whore around among 200+ press people... done. Then keep telling people how awesome it is while they hold their piss and wait for proper quantities to reach stores.


----------



## toyo (Sep 17, 2009)

TheMailMan78 said:


> Nvidia will have DX11 cards on the market before December?





HellasVagabond said:


> DX11 games will be on the market long after NVIDIA cards.



Even so, games take time to develop... many are probably in development as we write. So developers build for what is known of DX11, and as of now that means ATI hardware.


----------



## TheMailMan78 (Sep 17, 2009)

btarunr said:


> I don't rule out a "jelly-launch" (hard-launch with such limited quantities actually sold, that it's almost a soft-launch). Sell 12 cards in CA, 12 cards in US, 12 cards in UK, 12 in EU, 12 in CN, 12 in AU, make 12 samples whore around between 200+ press people..done. Now keep telling people how awesome it is to hold your piss wait for proper quantities to reach stores.



Yeah, but even then I doubt Nvidia will be ready before December, which is when the first DX11 game will be on the market. But what you said is very feasible.


----------



## Velvet Wafer (Sep 17, 2009)

Nvidia are simply wrynecks, addicted to money. But they made quite nice cards, up until now.


----------



## driver66 (Sep 17, 2009)

It's all speculation at this point. No one knows what Nvidia has or WHEN IT WILL BE RELEASED, so why all the doom and gloom from everyone?
I think the press statement was B.S., but people comparing DX10 to DX11 are not making a valid argument.
When DX10 was released you needed it to unlock all of the functions of an OS. The biggest reason Nvidia sold so many G80s was not DX10; it's that they totally ASS STOMPED the competition with better card(s)............
DX11 is not needed yet but WILL bring benefits in the FUTURE... IMHO


----------



## Valdez (Sep 17, 2009)

Benetanegia said:


> I have Physx and I love it. Between you and me that's a 50/50 love/hate. And the thing is that Physx is something that comes appart, they don't have to make the engine any different in order to use it (you have to for DX). As long as Physx is integrated, GPU acceleration can be used or not as easily as you can change from using low detail textures to high detail textures. And Physx changes the gaming experience much much more than what a "better way of doing AA" or "look the head doesn't have edges now, but I have to not be playing to even notice it" DX features can do for me.
> 
> None of the companies are free of being hypocrite. Continuing with Physx, AMD said they would not support a propietary technology, while at the same time they were secretly working with Havok, or what it is the same Intel, to use their propietary tech.

http://www.brightsideofnews.com/Dat...hysX-in-trouble/AMD_ATI_Bullet_OpenCL_675.jpg


----------



## Scrizz (Sep 17, 2009)

Ati Ftc!


----------



## erocker (Sep 17, 2009)

Valdez said:


> http://www.brightsideofnews.com/Dat...hysX-in-trouble/AMD_ATI_Bullet_OpenCL_675.jpg



Oh gawd, more names. Bullet, Pixelux, Khronos... How about just putting physics in the game, have it be realistic and not give it some stupid name. Leave it "open."


----------



## TheMailMan78 (Sep 17, 2009)

erocker said:


> Oh gawd, more names. Bullet, Pixelux, Khronos... How about just putting physics in the game, have it be realistic and not give it some stupid name. Leave it "open."



How about "Butt Bullet Physics®"


----------



## Scrizz (Sep 17, 2009)

erocker said:


> Oh gawd, more names. Bullet, Pixelux, Khronos... How about just putting physics in the game, have it be realistic and not give it some stupid name. Leave it "open."



It is; it's called "AMD's Bullet, Pixelux, Khronos OpenCL physics 5860 Platinum Edition"


----------



## mdm-adph (Sep 17, 2009)

erocker said:


> Oh gawd, more names. Bullet, Pixelux, Khronos... How about just putting physics in the game, have it be realistic and not give it some stupid name. Leave it "open."



"Open" anything -- bad for business.

Crappy, cheesy names -- good for business!


----------



## TheMailMan78 (Sep 17, 2009)

No one likes "Butt Bullet Physics®". Man I thought that was a winner. My genius is wasted on you people.


----------



## erocker (Sep 17, 2009)

The average consumer probably doesn't know what PhysX is. Nor will they know or care what these new names are. Just like the title of this thread says, DX11 won't define GPU sales, just as PhysX doesn't define GPU sales, just as Khronos and whatever won't define GPU sales. Well, perhaps PhysX does a little, since Nvidia has been ramming it down people's throats. Just a thought. If OpenCL does become the standard, I suppose Nvidia will just call it PhysX and ATI can call it whatever they want.


----------



## grunt_408 (Sep 17, 2009)

I jumped on the bandwagon for DX9, then I jumped on for DX10; meh, why not jump on for DX11? My seat didn't even get warm with DX10, lol.


----------



## trt740 (Sep 17, 2009)

The truth is, mainstream DX11 will not take hold for a year. Look how long DX10 took to take hold, and 10.1 had almost no support.


----------



## Kitkat (Sep 17, 2009)

What did you all expect? Everyone plays the press-conference game. If the situation were REVERSED, and the news were that AMD's (Nvidia's, lol) latest yields went from 10 to 2%, they'd have released a "DX11 ain't sh!t" statement too, lol. You know how it goes, come on, why are we acting like newbs all of a sudden?

DX11's performance ratios and EASY development (which is the real deciding factor in ANYTHING: "no development, no play....") plus error fixes and almost-native everything sound like great news. Take away the gripes and complaints of developers and THAT will decide EVERYTHING, lol. With DX10 we learned of the problems early on...

As for Nvidia's statement about DX11 not affecting sales: it will, no matter what they do. It's a stupid comment and pure PR, and AMD would have released the same statement on some level in the reverse situation. The next generation of cards will always matter, because not only do they support the next DX, they also happen to be 1.6x ape$h!t FASTER. Even playing a DX9 game, IT WILL MATTER. The strong single-GPU high-entry-level card (the "50") and the high-end chip (the "70") are the whole gaming market, maybe with one more below them. Everything else around them is a sideshow to the main action.

Also, I think DX11 is totally "needed". Graphics need to be pushed hard at the software level, and all those effects they showed us in DX10 that could BARELY run at home now can with DX11. The PC needs to spank these games proper.


----------



## TheMailMan78 (Sep 17, 2009)

Kitkat said:


> Wot did u all expect everyone plays the press conference game if the situation was REVERSED and the news about AMD's(nvdias lol) latest yields were from 10 to 2% they'd have released "DX11 ain't Sh!t" statement too lol. You know how it goes comon why are we newbs everyday all a sudden. DX11 performance ratios, and EASY development (which is the real deciding factor in ANYTHING "no development no play....") error fixes and, almost native everything sound like great news. Take away gripes and complaints of developers and THAT will decide EVERYTHING lol. In 10 we learned early of problems.... As far as Nvidia's statement about there "DX11 sales effecting anything." It will no matter what they do its a stupid comment and PR and the same statement would have been released AMD in a reverse situation on some level. The next generation of cards will always matter because not only do they support the next DX they happen to be 1.6 ape$h!t FASTER. Even playing a DX9 game IT WILL MATTER. That strong Single GPU high-entry-level ("50") and high end chip ("70") is the whole gaming market. And maybe one more below it. Everything else around it is "side jerking" to the action. Also i think DX11 is totally "needed". Graphics need to be spanked at a software level. And all those effects they showed us in DX10 that could BARELY run at home can now with DX 11. PC needs to spank these games proper.



One word. Paragraphs.


----------



## btarunr (Sep 17, 2009)

erocker said:


> Oh gawd, more names. Bullet, Pixelux, Khronos... How about just putting physics in the game, have it be realistic and not give it some stupid name. Leave it "open."



Don't worry, since they're industry-wide, it wouldn't matter if you forget the names before picking up the game. On the other hand, you have to see if the game you're buying is "enhanced by PhysX" (or "won't work as intended on your Radeon").


----------



## Kitkat (Sep 17, 2009)

TheMailMan78 said:


> One word. Paragraphs.



hahahahaha


----------



## Fx (Sep 17, 2009)

extrasalty said:


> ATI: We have DX11 WHQL driver.
> nVidia: DX11? Phew, let's concentrate on whats important- the Powerpoint slides.



LMAO- nice summary


----------



## TheMailMan78 (Sep 17, 2009)

btarunr said:


> Don't worry, since they're industry-wide, it wouldn't matter if you forget the names before picking up the game. On the other hand, you have to see if the game you're buying is "enhanced by PhysX" (or "won't work as intended on your Radeon").



Well, let's just hope PhysX finally dies. Off topic: WTF time is it in your part of the world, BTA? I mean, why the hell are you posting at this time?


----------



## Fx (Sep 17, 2009)

Craigleberry said:


> I jumped on the bandwagon for DX9 then I jumped on for Dx10 meh why not Jump on for DX11. My seat didnt even get warm with DX10 lol



Very true - many people will buy based just on the larger number. From our point of view, it is nice to know that DX11 is actually going to bring the goods. It isn't being half-assed like DX10.


----------



## ste2425 (Sep 17, 2009)

Mussels said:


> of course it wont affect nvidias sales at all - they dont have any DX11 cards.



Don't know why, but I like that.

On the original post, I agree with a lot of what people say; there are some very good points there, and what they say about DX11 is true. But I think they're trying to draw people's attention not to their own offerings (Nvidia), but mainly away from ATI's.


----------



## Benetanegia (Sep 17, 2009)

TheMailMan78 said:


> Well lets just hope PhysX finally dies. Off topic: WTF time is it in your part of the world BTA? I mean why the hell are you posting at this time?



Let's just hope it doesn't; otherwise we would be left with Intel-owned Havok only, no competition. What people don't get is that OpenCL, DX11 Compute, etc. are not complete physics engines; PhysX and Havok are. OpenCL and DX Compute are APIs or platforms (you can build on top of them) that enable and simplify creating GPU-accelerated physics, but there already existed a platform that enables the creation of physics engines, and it's the easiest of all of them: dadaaaa... *x86*

As a game developer, under x86, using C++ (just one of the alternatives), you can easily create your own physics engine; no need to deal with APIs or extensions. There's no easier way of creating your own physics, *yet most game developers use either Havok or PhysX*. Epic Games and apparently EA use PhysX (CPU PhysX running on x86, on the XB360 CPU and on the PS3's Cell), as do many others, but Havok has/had a greater market share. Anyway, between them they easily account for 90% of the physics we find in games. In fact, the only developer I can think of right now that makes its own physics, and actually has some good physics, is Crytek.

So under these circumstances, when most developers choose 3rd-party engines even for the relatively easy physics that can be done on the CPU, do you really think that because they can now run physics on the GPU (allowing for far more complex simulation) they are going to start making their own engines? Did the introduction of pixel and vertex shaders make developers create their own 3D engines, or did they continue using Epic's or id's engines for a long time? How many developers have their own engine versus a 3rd-party engine even today?

PhysX and Havok are not going to disappear anytime soon.
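To be fair, the core of a homegrown physics engine really is simple at its heart: just an integration step run every frame. A minimal sketch of that idea (semi-implicit Euler under gravity; all the names here are illustrative, not taken from any actual engine):

```cpp
// Illustrative 2D rigid-body sketch: a homegrown engine's innermost loop.
struct Vec2 { float x, y; };
struct Body { Vec2 pos, vel; float mass; };

// One semi-implicit Euler step: update velocity from acceleration first,
// then advance position using the *new* velocity (more stable than
// plain explicit Euler for game-style simulation).
void step(Body& b, float dt) {
    const float g = -9.81f;   // gravitational acceleration, m/s^2
    b.vel.y += g * dt;        // integrate acceleration into velocity
    b.pos.x += b.vel.x * dt;  // integrate velocity into position
    b.pos.y += b.vel.y * dt;
}
```

The hard parts that Havok and PhysX actually sell (broad-phase collision, constraint solvers, stability at scale) sit on top of a loop like this, which is why most studios license an engine rather than write one, easy as the basics are.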


----------



## ArmoredCavalry (Sep 17, 2009)

TheMailMan78 said:


> People are going to say this is sour grapes. That they (Nvidia) is just saying this because they do not have a DX11 GPU out before ATI. However I honestly think they may be just speaking from their past experience with DX10.



Nah, I'll hold back. I think the article speaks for itself.


----------



## Noggrin (Sep 18, 2009)

mikek75 said:


> LOL, I wonder if this is anything to do with this....http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/



 Oh my.. Charlie aka the douchebag is back at it again.. gonna read just for the lulz, this idiot and his fantasies are just too funny.


----------



## pr0n Inspector (Sep 18, 2009)

Comments written in high school English are hard to read.


----------



## [I.R.A]_FBi (Sep 18, 2009)

pr0n Inspector said:


> Comments written in high school English are hard to read.



not nice


----------



## HTC (Sep 18, 2009)

I'm curious as to how DX11 games will be made. I mean, it's obvious they won't make them only for PCs with DX11-compliant cards, right?

Will they make 2 versions of the game? Make them for DX11 and then add support for DX9/DX10/DX10.1? The other way around? Some other way that doesn't currently occur to me?

Unless it's option 1 (doubtful), it will be like Crysis all over again, no? Or is my reasoning flawed?


----------



## Riou (Sep 18, 2009)

HTC said:


> I'm curious as to how DX11 games will be made. I mean: it's obvious they won't make them for PCs with DX11 compliant only cards, right?
> 
> Will they make 2 versions of the game? Make them for DX11 and then add support for DX9/DX10/DX10.1? The other way around? Other way that doesn't currently occur to me?
> 
> Unless option 1 (doubtful), it will be like Crysis all over again, no? Or is my reasoning flawed?



They have to make it compatible with older DX10/9 generation cards. The developers would be cutting themselves out of a large consumer base if they did not.
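In practice that compatibility tends to work through feature-level negotiation: the game lists the API levels it would like, best first, and settles for the highest one the installed card reports. A hypothetical sketch of that logic (the enum and function names are made up for illustration; this is not real Direct3D API code, though D3D11's feature-level array at device creation works on the same principle):

```cpp
#include <vector>

// Hypothetical feature levels, not the real D3D enum values.
enum class FeatureLevel { DX9, DX10, DX10_1, DX11 };

// Try the game's preferred levels in order (best first) and return the
// first one the installed card supports.
FeatureLevel negotiate(const std::vector<FeatureLevel>& wanted,
                       const std::vector<FeatureLevel>& supported) {
    for (FeatureLevel want : wanted)
        for (FeatureLevel have : supported)
            if (want == have)
                return want;        // first match = highest acceptable level
    return FeatureLevel::DX9;       // conservative last-resort fallback
}
```

So a DX11 title running on a DX10 card would simply come up at the DX10 level with the DX11-only effects switched off, rather than refusing to start.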


----------



## erocker (Sep 18, 2009)

Aren't new releases of Direct X based upon previous generations of Direct X anyways?


----------



## HTC (Sep 18, 2009)

Riou said:


> They have to make it compatible with older DX10/9 generation cards. The developers would be cutting themselves out of a large consumer base if they did not.



Exactly.



erocker said:


> Aren't new releases of Direct X based upon previous generations of Direct X anyways?



Look at Crysis: they made it for DX9 and added DX10 elements, which is why it has better FPS on XP than on Vista, correct?

Will they resort to this approach for DX11?

For a developer, I think they would go with the cost-saving option, and that would be like Crysis: either exactly like it (made for the older version of DX with added elements of the newer version) or just the opposite (made for DX11, with whatever could not run on older cards redone in DX9/DX10), if I'm making any sense. Preferably, though, the developer would make 2 versions of the game; then one could *really* see how much better/worse DX11 really is.


----------



## Kitkat (Sep 18, 2009)

HTC said:


> I'm curious as to how DX11 games will be made. I mean: it's obvious they won't make them for PCs with DX11 compliant only cards, right?
> 
> Will they make 2 versions of the game? Make them for DX11 and then add support for DX9/DX10/DX10.1? The other way around? Other way that doesn't currently occur to me?
> 
> Unless option 1 (doubtful), it will be like Crysis all over again, no? Or is my reasoning flawed?



DX9 is always going to be the "standard", or lowest tier, and for most DX10 games there is a 9a version, a very high 9a. It's not quite the Crysis situation... That was a new engine that tested hardware to the bone, yes, but a WAY smarter Crysis came after, maintaining the same effects with less hurt, better code, and new tricks. It was a new game engine on a new and flawed DX10 (which DX11 kills in performance).

As for "THE GAME": there will always be one that KILLS a whole set of video cards, the one everyone asks "Yes, but will it play ______?" Game companies will always push hardware further. True, not everything was coded as well in the first Crysis as in the second, but once they improve, they push the bar right back up to where it was, or someone else will. You'll always have to worry about "A GAME".

As for DX10 versions of DX11 games, you might see a lot more of those really high 9a versions instead. They won't have all the same native effects or texture tricks as DX11, or be as crisp, or run the same number of effects, etc. But the cards will be so strong it'll still look good. Then again, there will be some SUPER OVERKILL games in DX11 that will use every effect, every trick, every limit, and you won't see those ported back to DX9 unless they make an Xbox 360 version or so.


----------



## trt740 (Sep 18, 2009)

HTC said:


> I'm curious as to how DX11 games will be made. I mean: it's obvious they won't make them for PCs with DX11 compliant only cards, right?
> 
> Will they make 2 versions of the game? Make them for DX11 and then add support for DX9/DX10/DX10.1? The other way around? Other way that doesn't currently occur to me?
> 
> Unless option 1 (doubtful), it will be like Crysis all over again, no? Or is my reasoning flawed?



No most will be DX10 because the Xbox is DX10 as is the PS3 and they don't want to rewrite an entire game just for PC it's easier to port it.


----------



## pr0n Inspector (Sep 18, 2009)

trt740 said:


> No most will be DX10 because the *Xbox is DX10* as is the PS3 and they don't want to rewrite an entire game just for PC it's easier to port it.



LOL WHAT?


----------



## Mussels (Sep 18, 2009)

pr0n Inspector said:


> LOL WHAT?



The 360's GPU meets the early (before Nvidia messed with them) DX10 standards. It's not compliant with DX10 as we know it (so he is wrong), but it has a lot of similarities.


----------



## raptori (Sep 18, 2009)

phanbuey said:


> +1... exactly... theyre just trying to pull a Baghdad Bob on their investors.  "No No... we ARE winning the war...  ATI is cowering in fear, and our customers don't care about new tech at all... its just not important."



Hell yea that's true.


----------



## pr0n Inspector (Sep 18, 2009)

Mussels said:


> the 360's GPU meets the early (before nvidia messed with them ) DX10 standards. its not compliant with DX10 as we know it (so he is wrong), but has a lot of similarities.



Actually now that I read it again, the "as is the PS3" part is even more hilarious.


----------



## Mussels (Sep 18, 2009)

pr0n Inspector said:


> Actually now that I read it again, the "as is the PS3" part is even more hilarious.



The PS3 is pure DX9; it's based off an NV 7900-series card (whereas the ATI Xenos in the 360 is at least a DX10 "prototype").


----------



## pr0n Inspector (Sep 18, 2009)

Mussels said:


> the PS3 is pure DX9, its based off an Nv 7900 series card (whereas the ATI xenos in the 360 is at least a DX10 "prototype"



Not to mention PS3 doesn't use DirectX at all.


----------



## Wile E (Sep 18, 2009)

pr0n Inspector said:


> Not to mention PS3 doesn't use DirectX at all.



They mean from a hardware equivalency standpoint, not literally.


----------



## pr0n Inspector (Sep 18, 2009)

Wile E said:


> They mean from a hardware equivalency standpoint, not literally.



No he said


> _No most will be DX10 because the Xbox is DX10 as is the PS3 and *they don't want to rewrite an entire game just for PC it's easier to port it*._


despite the fact that game devs have to make at least two distinct versions(PC/Xbox360 and PS3) of their cross-platform game anyway.


----------



## Zubasa (Sep 18, 2009)

Benetanegia said:


> Let's just hope it doesn't, otherwise we would be left with Intel owned Havok only, no competition. What you all don't get is that OpenCL, DX11 Compute and etc are not complete physics engines, PhysX and Havok are. OpenCL, and DX Compute are APIs or platforms (you can build on top of them) that enable and make it "easy" creating GPU accelerated physics, but it already existed a platform that enables the creation of physics engines and that it's the easiest of all of them: dadaaaa... *x86*
> 
> As a game developer, under x86, using C++ (just one of the alternatives) you can easily create your physics engine, no need to deal with APIs or extensions. There's no easier way of creating your own physics *yet most game developers use either Havok or PhysX*. Epic games and apparently EA uses PhysX (CPU PhysX running in x86, in Xenos, XB360 CPU and Cell, PS3 CPU), as well as many others, but Havok has/had a greater market share. Anyway, between both, they easily make 90% of the physics that we find in games. In fact the only developer that makes their own physics that I can think of now and that actually has some good physics is Crytek.
> 
> ...


Nobody is saying Havok is going down, because guess what? OpenCL allows Havok to run on ATI's GPUs.
It is only the stupid nV-only *PhysX that is going the way of the dinosaur.*


----------



## Mussels (Sep 18, 2009)

Zubasa said:


> No body is saying Havok is going down, because guess what? OpenCL allows Havok to be run on ATi's GPU
> It is only the stupid nV only *Physx that is going the way of the Dinosaur.*



PhysX is proprietary and only runs on Nvidia systems, therefore when it's used in games it's always as an add-on pretty effect. It never affects gameplay, and all it ever does is hurt FPS - _PhysX is never used to boost FPS, always to add more crap on top._

AMD/ATI got in bed with Havok early, so that when DX11 launches officially they can have OpenCL drivers ready and working with Havok from the get-go - and we WILL see games using it for physics (with no software-only option, so it's going to have gameplay effects), simply because every Nvidia CUDA or ATI Stream capable GPU (therefore, all DX10 GPUs) can run OpenCL (and therefore hardware-accelerated Havok).

Even in the worst case, they make this the integral engine, and GPU support merely boosts the FPS.


----------



## Zubasa (Sep 18, 2009)

Mussels said:


> physx is proprietary and only runs on nvidia systems, therefore when its used in games its always as an add-on pretty effect. it never affects gameplay, and all it ever does it hurts FPS - _physx is never used to boost FPS, always to add more crap on top_
> 
> AMD/ATI got in bed with havok early, so that when DX11 launches officially they can have openCL drivers ready and working with havok from the get go - and we WILL see games using it for physx (with no software option, so its going to have gameplay effects) simply because every Nvidia cuda or ATI stream capable GPU (therefore, all DX10 GPU's) is capable of running openCL (and therefore hardware accelerated havok)
> 
> even worst case, they make this the integral engine, and GPU support merely boosts the FPS


I hope you meant "games using it for *physics*"

Another point is, many people here are getting messed up, DirectX 9.0 aka Shader Model 2.0 is in effect dead,
and is not supported in many (most) new games, it is only Shader Model 3.0 aka DX 9.0c that is supported.
Try running RE5/DMC4 etc on a X800XL.

Edit: Even Rainbow 6 Vegas 2 will not run on a 9600Pro.


----------



## Mussels (Sep 18, 2009)

typo fixed, oopsies.


----------



## PP Mguire (Sep 18, 2009)

Gotta love these stupid nvidia/physx-ati/havok threads. I love new releases but sometimes i just have to ignore the extra posts on these threads.


----------



## TheMailMan78 (Sep 18, 2009)

It's the thread that never ends. It just goes on and on, my friends.


----------



## Binge (Sep 18, 2009)

YOU, Mailman, won't define GPU sales!


----------



## Mussels (Sep 18, 2009)

TheMailMan78 said:


> Its the thread that never ends. It just goes on and on my friends.



some people, started singing it...



Spoiler



This is the song that never ends, yes it goes on and on my friend. Some people started singing it, not knowing what it was, and they'll continue singing it forever just because...This is the song that never ends, yes it goes on and on my friend. Some people started singing it, not knowing what it was, and they'll continue singing it forever just because...


----------



## Scrizz (Sep 18, 2009)

Mussels said:


> some people, started singing it...
> 
> 
> 
> ...



you sir, are officially BANNED!


----------



## a_ump (Sep 18, 2009)

yea, i'm definitely getting an HD 5870, but only because i can't play STALKER: CS maxed. I just upgraded my mobo (as well as a cooler) to help OC my CPU more, so i'm only getting one because my 7800GTX is killing me lol.


----------



## 15th Warlock (Sep 18, 2009)

This is all so interesting; it's so good to see ATI and NVIDIA trading punches like back in the old days. 

Reminds me of the time ATI beat NVIDIA by releasing the better DX9 solution (the good old 9700 Pro) and made NVIDIA's FX cards look like crap. IMO those events started a golden age of video card releases, including classics like the X800 and X1900 series and the 6800, 7800 and 8800 series. 

Lately it felt to me like we had hit a wall, and little progress was being made in terms of graphics card evolution. 

We are on the verge of a new revolution. Whoever wins this match won't matter, because we as consumers win!


----------



## Kitkat (Sep 18, 2009)

15th Warlock said:


> Reminds me of the time Ati beat nVidia by releasing the better DX9 solution (the good old 9700Pro) and made nVidia FX cards look like crap. IMO those events started a golden age of video card releases including classics like the X800 and X1900 series and the 6800, 7800 and 8800 series.



9800 Pro, X1650, X1950, 4850, 4870 were the sweet spots, and the 9800 was the shhhhh


----------



## ste2425 (Sep 18, 2009)

Mussels said:


> some people, started singing it...
> 
> 
> 
> ...



sad thing is i got halfway through without realising it's all the same :shadedshu


----------



## Frizz (Sep 18, 2009)

They really do have a point; a few extra FPS won't really matter, because the army of rebranded cards they currently have on sale will drop in price by a huge margin. Everyone, even mainstream users, will be capable of running the latest games at 60 FPS once AMD's latest series is released. I never imagined prices for good-performing GPUs would actually drop below 150 AUD, but it's here now.

I don't know about NVIDIA, but I sure as hell want to start seeing multi-core GPUs and lifelike graphics at 60 FPS in the next few years.


----------



## leonard_222003 (Sep 18, 2009)

I remember the days when AMD bought ATI. There was talk about how Nvidia feared AMD and Intel; probably nobody understood at the time what there was to fear about AMD, being almost bankrupt, but rumors still said Nvidia feared the year to come. Well, it seems they were right to fear them: AMD's knowledge from CPUs seems to give them an advantage in GPUs too.
If you look at the progress they've made, it's always in GPU architecture and fabrication. Stream is shit compared to CUDA, and their attempts with Havok are pointless compared to a working PhysX, so good job AMD. I feel they will battle with Nvidia on price again. Again, Nvidia can build a huge GPU, but it will be too expensive to make; integrators will complain about how they don't make money, and problems, problems.


----------



## Bo_Fox (Sep 19, 2009)

LOL!!!   Over at Xbitlabs.com, I posted a comment saying 



> Anton, you better remember this article because in a few months from now, Nvidia will be claiming that DX11 is all-important with their GT300 cards!
> 
> Nvidia's just saying it now to hurt ATI, and Nvidia thinks that with their greater marketing power, they will have no problem turning the mindset around in making us think that we need their new GT300 cards for DX11 games.



Then a guy replied:



> I think you guys are missing the point. Hara was likely saying that DX11 is only one facet of next gen, people also need to consider their graphics card is capable of of making their PC a supercomputer for things like video transcoding and processing, or adding higher levels of immersion through proprietary techs like PhysX and 3dVision.
> 
> -Rollo
> NVIDIA Focus Group



http://www.xbitlabs.com/news/video/display/20090916140327_Nvidia_DirectX_11_Will_Not_Catalyze_Sales_of_Graphics_Cards.html

LOL, was that just a coincidence, or was it because of me that I actually got an Nvidia sales rep to defend Nvidia over at Xbitlabs?  Or is Rollo just joking around and not really part of the so-called "NVIDIA Focus Group"?

You guys decide!


----------



## FordGT90Concept (Sep 19, 2009)

DX11 really isn't that important, no.  Most games are still DX 9.0c, as proof of that.  What people are looking for, especially with LCDs, are acceptable framerates at higher resolutions.  Yes, it has to play the game too, but it will be quite some time before DX11 sees serious use.



TheMailMan78 said:


> Its the thread that never ends. It just goes on and on my friends.


Lamb-chops outro!


----------



## Bo_Fox (Sep 19, 2009)

FordGT90Concept said:


> DX11 really isn't that important, no.  Most games are still DX 9.0c as proof of that.  What people are looking for, especially with LCDs, is acceptable framerates at higher resolutions.  Yes it has to play the game too but it will be quite some time before DX11 sees serious use.



Just wait 'til you see how much Nvidia emphasizes DX11 when they market their new GT300 cards!!!   That's the way it's always been, my friend!


----------



## trt740 (Sep 19, 2009)

both are true


----------



## jagd (Sep 20, 2009)

Blame Vista: Win XP = DX 9.0c, while DX10/10.1 is Vista-only for now. Game companies will not make a game for a limited user base.
The Win7 launch may change things, with all the hype around it. 



FordGT90Concept said:


> DX11 really isn't that important, no.  Most games are still DX 9.0c as proof of that.


----------



## FordGT90Concept (Sep 20, 2009)

Bo_Fox said:


> Just wait 'til you see how much Nvidia emphasizes DX11 when they market their new GT300 cards!!!   That's the way it's always been, my friend!


Their exec said it's not important; marketing says it is.  As trt740 said, both are true, but not from the same mouth.





jagd said:


> Blame vista  ,win xp =DX 9.0c , DX10 /10.1 =vista only now .Game companies will not make a game for limited user base .
> win7 launch may change things with all the hype around it.


I believe the lion's share of developers/publishers won't want to abandon Win XP even two years after Win 7's release.  Seriously, I doubt support for XP will stop until Microsoft pulls the plug.  Only then will there be a sharp drop-off in developers/publishers making DX9 games.


----------



## eidairaman1 (Sep 21, 2009)

peh, the only reason they are saying that now is because they are behind in tech advancement.


----------



## niko084 (Sep 21, 2009)

TheMailMan78 said:


> People are going to say this is sour grapes. That they (Nvidia) is just saying this because they do not have a DX11 GPU out before ATI. However I honestly think they may be just speaking from their past experience with DX10.
> 
> Good find BTA.



Of course they are; it was an entirely different tune with DX10.


----------



## yogurt_21 (Sep 22, 2009)

btarunr said:


> "DirectX 11 by itself is not going be the defining reason to buy a new GPU. It will be one of the reasons." ....
> Source: Xbit Labs



roughly translated: "we're quite happy with the sales of our current cards, and we're not going to change anything until someone lights a fire under our butts and makes us!"


----------



## Hayder_Master (Sep 24, 2009)

nvidia wants to say "please stop buying new ATI cards, just wait for me"


----------

