Thursday, September 17th 2009
DirectX 11 Won't Define GPU Sales: NVIDIA
"DirectX 11 by itself is not going be the defining reason to buy a new GPU. It will be one of the reasons." This coming from the same company that a few years ago said that there was every reason to opt for a DirectX 10 compliant graphics card, to complete the Windows Vista experience, at a time when it was the first and only company to be out with compliant hardware. In the wake of rival AMD's ambitious Evergreen family of DirectX 11 compliant graphics cards being released, NVIDIA made it a point to tell the press that the development shouldn't really change anything in the industry.
Speaking at the Deutsche Bank Securities Technology Conference, NVIDIA's VP of investor relations said, "DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in a GPU."
"Now, we know, people are doing a lot in the area of video, people are going to do more and more in the area of photography… I think that the things we are doing would allow the GPU to be a co-processor to the CPU and deliver better user experience, better battery life and make that computers little bit more optimized," added Mr. Hara
NVIDIA, which until very recently was a firm believer in graphics processing horsepower as the biggest selling point of new GPUs, has now switched its line on what it believes will drive the market forward. All of a sudden, software that relies on the raw computational power of GPUs (e.g., media transcoding software), and not the advanced visual effects a new-generation API brings with it (in games and CGI applications), is what will drive people to buy graphics processors, according to the company.
Mr. Hara concluded, "The graphics industry, I think, is at the point the microprocessor industry was at several years ago, when AMD made the public confession that frequency does not matter anymore and it is more about performance per watt. I think we are at the same crossroads with the graphics world: framerate and resolution are nice, but today they are very high, and going from 120 fps to 125 fps is not going to fundamentally change the end-user experience. But I think the things that we are doing with Stereo 3D Vision and PhysX, about making games more immersive, more playable, go beyond framerates and resolutions. NVIDIA will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side."
The timing is telling: NVIDIA has no concrete product plans laid out, while AMD is working toward a head start with its next-generation GPUs, which are DirectX 11 compliant and also comply with industry-wide GPGPU standards such as DirectCompute 11 and OpenCL.
Source:
Xbit Labs
194 Comments on DirectX 11 Won't Define GPU Sales: NVIDIA
I'll forum fight all you bastards!
Although, before the 23rd we can't know for sure; maybe NV already has the GT300 waiting to punch out of the darkness on that date (mhmmm....)
It's clear now: they won't have the GT300 ready when ATI launches Evergreen.
Now that was the kind of card that could keep you warm on cold nights.
Radeon 9700's advanced architecture was very efficient and, of course, more powerful compared to its older peers of 2002. Under normal conditions it beat the GeForce4 Ti 4600, the previous top-end card, by 15–20%. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled it would beat the Ti 4600 by anywhere from 40–100%. At the time, this was quite astonishing, and resulted in the widespread acceptance of AA and AF as critical, truly usable features.
Besides advanced architecture, reviewers also took note of ATI's change in strategy. The 9700 would be the second of ATI's chips (after the 8500) to be shipped to third-party manufacturers instead of ATI producing all of its graphics cards, though ATI would still produce cards off of its highest-end chips. This freed up engineering resources that were channeled towards driver improvements, and the 9700 performed phenomenally well at launch because of this. id Software technical director John Carmack had the Radeon 9700 run the E3 Doom 3 demonstration.[3]
The performance and quality increases offered by the R300 GPU are considered to be among the greatest in the history of 3D graphics, alongside the achievements of the GeForce 256 and Voodoo Graphics. Furthermore, NVIDIA's response in the form of the GeForce FX 5800 was both late to market and somewhat unimpressive, especially when pixel shading was used. R300 would become one of the GPUs with the longest useful lifetime in history, allowing playable performance in new games at least 3 years after its launch.[4]
The GeForce 256 and NVIDIA's 8800 series were also uncontested winners in their time; no other player on the market had equivalent functional technology.
I am a big NVIDIA fan (check specs)... but reality is reality. This is, for all intents and purposes, a HUGE ATI win. NVIDIA has dominated for so long, and now they will lose the crown; they were SO far ahead... and now they are back where they were during the G7x series in relation to ATI. That is a win for ATI no matter how you spin it.
ATI has a DX11 part that will take the crown... and NVIDIA is saying that DX11 won't matter?!? :roll::roll::roll:
These are the same muppets that told us all that PhysX matters. :nutkick: LOL. They haven't learned their lesson from DX10; they are just trying to convince their investors not to jump ship because they don't have a competing part. This is a business move, plain and simple... just trying to minimize the pain until they can compete.
I believe that their market-share loss started with the G80 release, which went unanswered by ATI. It was compounded by the HD 2900 release, more or less. Today, AMD is still trying to recover from "that". Now all of a sudden we are to forget what happened and say that DX11 is nice but not all that important. :shadedshu Yes, we know that market conditions then and now are completely different. But if AMD is able to adapt and compensate for that, I see no reason why they wouldn't do well.
By mid/end of next year DX11 will, *cough*, should be more worth it, as more games will be out for it. Although the promised boost makes it more tempting, to tell you the truth I've already played all the games I want to play, and the ones giving me issues are more CPU-bound than GPU-bound.
Stupid to buy a DX10 card now? That depends on what card they have now. But DX11 cards are going to be like $250-$300+. So they could get a DX10 card for around $150, and by mid/end of next year, if DX11 is more widely accepted, get one then; prices will be cheaper, and there'll be an actual reason to get one.
I'm a gamer, so that's my view on it. I do very few benchmarks, as that's not what I get faster hardware for.
Sure, if you have a lower-end card it's going to be more worthwhile, but if you already have a card like the 285 or the 4890, there's no need if you're a gamer.
What NVIDIA is doing is saying to all those people who think DX11 will make a huge difference: hey, you don't need DX11 just yet; here, we'll (well, I would think they would) cut the price on our cards, which still perform pretty damn well.
He wasn't happy when he found out that his GTX 380M was based on 40 nm G92c. That aside, let's get back on track.
nVidia: DX11? Phew, let's concentrate on what's important: the PowerPoint slides.
Um, no, going from 120 to 125 isn't worth anything, correct, but keeping performance from dropping from 60 to 30 IS worth something.
The compute side is all well and good, because without it 'special' visuals won't work efficiently, but to say that pure computing is necessary is a bit premature.
Hopefully he's hinting at and suggesting what we want to see in the near future, which is real-time vector drawing rather than pre-rendered visuals. But that would require cards with massive computing flexibility, like the FireGL types used with CAD programs such as AutoCAD.
But still, stop making cards that give you 125 fps instead of 120 fps, and start making ones that don't cower in fear at a few dynamic shadows in a 3D program. Then worry about 'compute' cards.
developer.amd.com/gpu/ATIStreamSDK/pages/TutorialOpenCL.aspx
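For anyone who doesn't want to dig through that tutorial: GPGPU through OpenCL boils down to compiling a small kernel at run time and queuing it on the GPU. Below is a minimal vector-add sketch in C against the OpenCL 1.0 API, not the tutorial's actual code; the kernel name "vadd" and the array size are just illustrative, and error checking is stripped for brevity.

/* Minimal OpenCL sketch: add two float arrays on the GPU.
   Assumes an OpenCL 1.0 runtime such as the ATI Stream SDK;
   all error checking is omitted to keep the example short. */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    int i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Pick the first platform and the first GPU device on it. */
    cl_platform_id plat;
    clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Compile the kernel source at run time. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    /* Copy inputs to the device, launch N work-items, read back. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]);
    return 0;
}

The point being: the exact same code path accelerates a video encoder or a Photoshop filter just as well as a game effect, which is the market NVIDIA is talking about.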
Don't be naive and pretend that the gaming crowd is anywhere close to the installed base of users wanting some acceleration in video encoding, Photoshop, and the like. That is what NVIDIA is talking about. The capable software is here already, and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device, and that will sell more cards, simply because, as I said, the non-gamer crowd is much, much bigger than the gamer one. And considering the WoW and Sims crowd, who don't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.
At the end of the day, only enthusiasts know and care about DX11, and probably only half of them will buy the new cards because of it, since we know it will mean squat, at least in the first titles and in multi-platform titles. So that leaves us with a number of around 2%. That's the percentage of people who will buy a card caring about DX11. The rest will buy the hardware for something else, but not for DX11.