Thursday, July 3rd 2008

NVIDIA Roadmap Suggests DirectX 10.1 GPU in Q4 2008, GDDR5 in 2008

TG Daily stumbled across NVIDIA roadmap information that provides details of the green giant's DirectX 10.1 implementation plans. Currently, ATI and S3 Graphics make graphics processors (GPUs) compliant with the DirectX 10.1 specification. A presentation slide shows NVIDIA planning a brand new fleet of mobile (notebook) DirectX 10.1 graphics processors slated for spring 2009, and desktop GPUs for either late Q4 2008 or early 2009, with a possible ramp throughout Q1 and Q2 of 2009.

This gives competitors at least a six-month head start in which to build developer relations and aid the development of games based on the DirectX 10.1 API. With the biggest player in the discrete-graphics industry planning to embrace it, the API now looks certain to become an industry-wide standard.

The second revelation the slide brings up is that NVIDIA will implement GDDR5 memory in upcoming products within 2008.
Source: TG Daily

44 Comments on NVIDIA Roadmap Suggests DirectX 10.1 GPU in Q4 2008, GDDR5 in 2008

#26
Nkd
Sorry, didn't mean to flame or anything, but 1.5 to 2 times the performance in 6 months is just too good to believe. Just pointing out the info from that article. Again, sorry if it sounded like that.
Posted on Reply
#27
farlex85
btarunr: The AA model of DirectX 10 is seriously flawed, and a major issue with this is fixed in 10.1, so you can throw DX10 out of the window right away, since a succeeding DX revision (not to be confused with a version change: DX 9 -> 9.0c is a revision change, DX 9 -> 10 is a version change) carries all the features of its predecessor. So there's no point in sticking to DX 10 when you get all its features plus several issues addressed with 10.1.
I hadn't really noticed any problems w/ DX10 AA; I played every game w/ max AA except for Crysis. I don't know about the model thing, but it worked fine for me. I'm not sure how much this matters, though, if game developers don't start making an effort to fully utilize DX10 instead of having it as a little bonus feature.
Posted on Reply
#28
btarunr
Editor & Senior Moderator
So you played a DX 10.1 game on a 10.1 GPU with DX 10.1 installed (SP1 for Vista installed), and compared performance with the same game + GPU without DX 10.1 installed (no Vista SP1 in place)? Impressive.
Posted on Reply
#29
farlex85
btarunr: So you played a DX 10.1 game on a 10.1 GPU with DX 10.1 installed (SP1 for Vista installed), and compared performance with the same game + GPU without DX 10.1 installed (no Vista SP1 in place)? Impressive.
What? Me? :confused: I just said I didn't notice any AA problems w/ DX10; I haven't used DX10.1. I thought Assassin's Creed was the only DX10.1 game, and it wasn't even fully that, so I have no idea what you mean.
Posted on Reply
#30
newtekie1
Semi-Retired Folder
btarunr: So you played a DX 10.1 game on a 10.1 GPU with DX 10.1 installed (SP1 for Vista installed), and compared performance with the same game + GPU without DX 10.1 installed (no Vista SP1 in place)? Impressive.
There are DX10.1 games?:D
Posted on Reply
#31
btarunr
Editor & Senior Moderator
newtekie1: There are DX10.1 games? :D
Assassin's Creed (PC) sans any patch.
Posted on Reply
#32
imperialreign
btarunr: Assassin's Creed (PC) sans any patch.
And sadly, that's the only one I know of as well... :ohwell:

At least after the amount of pressure Ubi came under with 1.02, they might reinstate 10.1 in the near future.
Posted on Reply
#33
candle_86
imperialreign: The issue so far hasn't been what ATI has been able to do with DX10.1, but the fact that no game developers wanted to support it because the biggest GPU market holder (nVidia) didn't support it; coupled with the fact that neither did Intel, the market was against 10.1 from the start. nVidia did originally plan to support it at some point; they just downplayed 10.1 as being "a minor extension of DirectX 10 that makes a few optional features in DirectX 10 mandatory," and didn't intend to offer supporting hardware for a while.

Now that there's evidence that nVidia will back-pedal on their original statements about DX10.1 and move along quicker with it, I can almost guarantee that within the next year we'll start seeing more games utilizing it.

My personal take is more that nVidia is concerned about losing game developer support now that ATI is competitive, performance-wise, on their level: ATI's hardware is looking more promising to developers, and it's also cheaper (if they actually do have to purchase it). That's the only reason I can see for nVidia pushing 10.1 support along quicker.
I don't see it. Remember the mass exodus of partners to ATI in 2003 and the total flop of the NV3x? Nvidia still retained the market share then, even while being 100% crushed.
Posted on Reply
#34
DarkMatter
btarunr: Assassin's Creed (PC) sans any patch.
One game doesn't automatically make DX10.1 much faster than DX10, or AA faulty in DX10. DX10.1 has some features that are faster than DX10 under some very specific conditions. Change the conditions and there isn't as much benefit. To be clearer: AFAIK DX10.1's benefit (and that of Ati's architecture, the main reason they are trying to push DX10.1 adoption) is only pronounced when using custom AA and custom filtering under a deferred rendering model. Assassin's Creed uses deferred rendering and custom AA, Call of Juarez did too, and thus you can see the benefits in them. But IMO we have yet to see it demonstrated that deferred rendering or custom AA is better than other techniques, and Assassin's Creed (or COJ) doesn't demonstrate anything, because the engine is nothing good to start with. Its graphics are no better than the average out there, and the performance is poor by comparison. They have already admitted bad multi-GPU scaling due to an excess of draw calls, a clear sign of very deficient optimization. IMO DX10.1 fixes AC's deficiencies, but we can't take that as the norm with the info we have today.

And that info we have today is:

1- A data sheet and company line from Ati that touts the benefits of DX10.1, but that states clearly (if you read it carefully) that the benefits are limited to some specific features.

2- ONE game that benefits from DX10.1.
2.1 - According to the developers it was broken, so this point is not very clear anyway.

3- Many developers saying DX10.1 isn't an upgrade at all for them, because the way they use the API means there's no benefit in going to DX10.1.

4- The maker of the API itself saying that it doesn't have many benefits over DX10, except for some specific features.

5- Nvidia saying it's not that important for them because it's not important for developers.

In the end, what we have is two fronts in this story: one that believes points 1-2 are true and 3-4-5 are lying, and one that believes 1-2 are true in some cases and that so many developers and MS won't lie just to help Nvidia, when in fact they know MS may be pissed off with them (it wasn't too long ago), because of what happened with the Xbox.
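For readers wondering what targeting DX10.1 actually involves at the API level, here is a minimal C++ sketch; it is an illustration only, not code from any of the games discussed, and the CreateBestDevice helper name is made up. Because 10.1 is a strict superset of 10.0, an engine can ask the runtime for the 10.1 feature level first and fall back to 10.0 on the same ID3D10Device1 interface, enabling the extra 10.1-only features (such as reading individual AA samples) only when the device reports them.

// Minimal sketch (illustrative only): prefer a Direct3D 10.1 device, fall back to 10.0.
// Assumes the DirectX SDK: include <d3d10_1.h> and link against d3d10_1.lib.
#include <windows.h>
#include <d3d10_1.h>

ID3D10Device1* CreateBestDevice()   // hypothetical helper name
{
    ID3D10Device1* device = NULL;
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // try the 10.1 feature level first
        D3D10_FEATURE_LEVEL_10_0    // same interface, 10.0 feature set
    };

    for (UINT i = 0; i < sizeof(levels) / sizeof(levels[0]); ++i)
    {
        HRESULT hr = D3D10CreateDevice1(
            NULL,                         // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,   // hardware rasterizer
            NULL,                         // no software module
            0,                            // no creation flags
            levels[i],
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            return device;                // device->GetFeatureLevel() reports which level was granted
    }
    return NULL;                          // no D3D10-class hardware/driver available
}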
Posted on Reply
#35
Weer
It's hard to imagine what's more useless right now: GDDR5 with a 512-bit memory bus, or DirectX 10.1 with no games utilizing it.
Posted on Reply
#36
Solaris17
Super Dainty Moderator
Seems like '08 will be the time to plunge into an Nvidia card...
Posted on Reply
#37
magibeg
Maybe I'm a little off on my quarters, but doesn't the 4th business quarter end in, like, April 2009?
Posted on Reply
#38
eidairaman1
The Exiled Airman
You'd think companies would follow fiscal years.
Posted on Reply
#39
Hayder_Master
What about the 9800 GTX+ or GTX 260 or 280? Did they forget them? They released them only a month ago. Is nvidia playing with customers? I know nvidia planned to use GDDR5 and now they want to add DX10.1, so it should have been supported in the GTX 200.
Posted on Reply
#40
btarunr
Editor & Senior Moderator
hayder.master: What about the 9800 GTX+ or GTX 260 or 280? Did they forget them? They released them only a month ago. Is nvidia playing with customers? I know nvidia planned to use GDDR5 and now they want to add DX10.1, so it should have been supported in the GTX 200.
NVIDIA played with its customers even back with the GeForce 7950 GX2. People bought it for ~$600, and in a couple of months came the GeForce 8800. The same applies to those who bought the 9800 GX2 only to see the GTX 280 arrive a couple of months later.
Posted on Reply
#41
Hayder_Master
btarunr: NVIDIA played with its customers even back with the GeForce 7950 GX2. People bought it for ~$600, and in a couple of months came the GeForce 8800. The same applies to those who bought the 9800 GX2 only to see the GTX 280 arrive a couple of months later.
That's right. I remember when my friend got a 7950 GX2 for about $800; at the time I was hoping to get one too. Two months later I got an 8800 GT, and he saw that the 8800 GT was the next generation of nvidia GPU with support for DX10 and Pixel Shader 4. I thank god I didn't have more money to buy the 7950 GX2. I saw his face back then; he just wanted to cry, throw the 7950 GX2 in the garbage, and burn nvidia down.
Posted on Reply
#42
JohnyBGood
Any power requirement predictions for these new Nvidia cards?
Posted on Reply
#43
eidairaman1
The Exiled Airman
Well, if they continue down the path they're on, they'll wind up having too much power draw.
Posted on Reply
#44
DarkMatter
eidairaman1: Well, if they continue down the path they're on, they'll wind up having too much power draw.
You need a reality check. Nvidia is doing very well when it comes to power draw, considering what the competition is doing. Nowadays GPU power draw is astronomical, but if Nvidia is doing it wrong, how is Ati doing it? Despite their monolithic design path, Nvidia is doing much, much better than Ati in that respect. Look at Wizzard's reviews, for example the latest one:

www.techpowerup.com/reviews/Palit/Revolution_R700/25.html

Forget about the Palit R700 card and the X2 altogether; they are both dual-GPU cards and it would be very easy for me to make a point out of them. No, look at the HD4870 and GTX 260. Same performance, and the Ati card has the fab-process advantage and is smaller, yet on average it consumes more. And the average shown by Wizz's reviews is actually not like other averages where the card spends hours idling, but an average of a 3DMark run:
Average: 3DMark03 Nature at 1280x1024, 6xAA, 16xAF. This results in the highest power consumption. Average of all readings (two per second) while the test was rendering (no title screen).
Never forget that Nvidia has always had a fab-process disadvantage, and that's a big one, and yet they are far ahead in performance-per-watt ratio.

Now look at this article:

forums.techpowerup.com/showthread.php?t=76104

That's what G200b will consume. That's what a TRUE Nvidia 55nm card consumes. I said that G92b was NOT a true 55nm chip plenty of times, but people prefer to dismiss that fact.

Now, I'm not comparing Nvidia's power draw to Ati's to bash or praise either of the two, but we can only compare Nvidia to Ati, because there are only those two players in the field. And you just can't say that if they continue that way they will consume a lot, because reality contradicts that sentence. I mean, we have two vendors and two strategies, and as we can see, Nvidia has the better one in that respect: power consumption. Just compare their 55nm high-ends with comparable performance, almost 300W (HD4870X2) versus 160W...

The final word about this will come with 40nm cards, so be patient.
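As a back-of-the-envelope illustration of the performance-per-watt argument (a sketch only: the frame rate is an assumed placeholder standing in for "comparable performance", and the wattages are the rough figures quoted above):

// Illustrative arithmetic only: two cards assumed to deliver the same frame rate,
// using the ~300 W and ~160 W figures quoted in the post above.
#include <cstdio>

int main()
{
    const double fps        = 60.0;   // assumed, equal for both cards
    const double watts_x2   = 300.0;  // approx. average draw quoted for the HD4870X2
    const double watts_g200 = 160.0;  // approx. figure quoted for a 55 nm G200b card

    std::printf("HD4870X2:   %.2f fps per watt\n", fps / watts_x2);    // ~0.20
    std::printf("55nm G200b: %.2f fps per watt\n", fps / watts_g200);  // ~0.38
    return 0;
}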
Posted on Reply