Thursday, May 22nd 2008

Next-gen NVIDIA GeForce Specifications Unveiled

Now that we know what AMD/ATI are planning in their camp, it's NVIDIA's turn to show us what we should be prepared for. As verified by DailyTech, NVIDIA plans to refresh its GPU line-up on June 18th with two new video cards that will feature the first CUDA-enabled graphics core, codenamed D10U. Two models are expected to launch simultaneously: the flagship GeForce GTX 280 (D10U-30) and the GeForce GTX 260 (D10U-20). The first chip will use a 512-bit memory bus, 240 stream processors (up from 128 on the 9800 GTX) and support up to 1 GB of memory. The GTX 260 will be a trimmed-down version with 192 stream processors, a 448-bit bus and up to 896 MB of graphics memory. Both cards will use the PCI-Express 2.0 interface and will support NVIDIA's 3-way SLI technology. NVIDIA also promises that the unified shaders of both cards will perform 50% faster than those of the previous generation. Compared to the upcoming AMD Radeon 4000 series, the D10U GPU lacks DirectX 10.1 support and is limited to GDDR3 memory. NVIDIA's documentation does not list an estimated street price for the new cards.
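The memory capacities follow directly from those bus widths: GDDR3 devices use a 32-bit interface, so a 512-bit bus means 16 chips and a 448-bit bus means 14. Here is a minimal sanity-check sketch in Python, assuming 512 Mbit (64 MB) parts; the chip density is our assumption, not something listed in NVIDIA's documentation:

# Sanity check: bus width / 32-bit per GDDR3 device gives the chip count,
# and chip count times the assumed 64 MB per chip gives the capacity.
GDDR3_BUS_PER_CHIP_BITS = 32
ASSUMED_CHIP_DENSITY_MB = 64  # 512 Mbit parts; assumed, not from NVIDIA docs

def memory_config(bus_width_bits: int) -> tuple[int, int]:
    """Return (chip count, total memory in MB) for a given bus width."""
    chips = bus_width_bits // GDDR3_BUS_PER_CHIP_BITS
    return chips, chips * ASSUMED_CHIP_DENSITY_MB

for name, bus in (("GeForce GTX 280", 512), ("GeForce GTX 260", 448)):
    chips, capacity = memory_config(bus)
    print(f"{name}: {bus}-bit bus -> {chips} chips -> {capacity} MB")
# GeForce GTX 280: 512-bit bus -> 16 chips -> 1024 MB
# GeForce GTX 260: 448-bit bus -> 14 chips -> 896 MB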
Source: DailyTech

87 Comments on Next-gen NVIDIA GeForce Specifications Unveiled

#1
HaZe303
Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200?? I might be wrong, but it sounds like the nV card is just an evolution of G92 and not a new GPU? I mean GDDR3 still, they could at least go over to 4, preferably to 5 like ATI. And still no DX 10.1?? No, I'm getting me a 4870 this summer, sounds like I'll be getting it for cheap as well. Maybe finally I can afford a Xfire system?? :)
Posted on Reply
#2
tkpenalty
Malware, why did you ignore my links that had similar information posted (release date), sent to you half a month ago?
Posted on Reply
#3
spud107
would have thought 10.1 would have been implemented, bit like having dx9.0b over dx9.0c?
Posted on Reply
#4
largon
As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
And DX10.1 is hardly worth mentioning, how many DX10.1 titles are there again?
Posted on Reply
#5
malware
tkpenalty said: Malware, why did you ignore my links that had similar information posted (release date), sent to you half a month ago?
The only recent PM I have from you is the one with the GIGABYTE Extreme motherboard?
Posted on Reply
#6
Edito
Maybe they just don't see any performance improvement from GDDR4 over GDDR3 either. Look at the 8800GTS G92: it has spectacular performance but still uses GDDR3. When the time comes they will make good use of it, I believe. I think ATI is using it but not using it well, because we just can't see any performance improvement... Don't get me wrong, it's just what I think...
Posted on Reply
#7
btarunr
Editor & Senior Moderator
HaZe303 said: Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200??
That's always the case. We're drenched in amazing numbers such as "512bit", "1 GB GDDR4", "320 SP's".

No, I don't think the HD4870 can beat the GTX 280 in raw performance at least; maybe in price, power and other factors.
Posted on Reply
#8
spud107
largon said: As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
And DX10.1 is hardly worth mentioning, how many DX10.1 titles are there again?
There would probably be more if nV was using 10.1.
Posted on Reply
#9
largon
GDDR4 could have been a smart move as it is much more power efficient than GDDR3. Those 16/14 chips of GDDR3 on GTX280/GTX260 are going to suck stupid amounts of power, something like freaking 60-80W for the GDDR3 alone...
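For what it's worth, 60-80W over 16 chips works out to roughly 4-5W per GDDR3 device; applying that same per-chip ballpark (an estimate implied by the numbers above, not a datasheet figure) to both boards:

# Rough memory-power estimate, assuming ~4-5 W per GDDR3 device under load
# (a ballpark implied by the 60-80 W figure above, not a datasheet value).
WATTS_PER_CHIP_RANGE = (4.0, 5.0)

for card, chips in (("GTX 280", 16), ("GTX 260", 14)):
    low, high = (chips * w for w in WATTS_PER_CHIP_RANGE)
    print(f"{card}: {chips} chips -> roughly {low:.0f}-{high:.0f} W for the GDDR3 alone")
# GTX 280: 16 chips -> roughly 64-80 W for the GDDR3 alone
# GTX 260: 14 chips -> roughly 56-70 W for the GDDR3 alone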

65nm - instead of 55nm - is another problem and causes more unnecessary power consumption.

And yet again, nV fails in creating a practical PCB layout. The board used for GTX280/260 is pure horror.
Posted on Reply
#10
kylew
largon said: As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
And DX10.1 is hardly worth mentioning, how many DX10.1 titles are there again?
Well, you KNOW why there's very little DX10.1 implementation: look at Assassin's Creed, NV moaned, stamped their feet, and so on to get it removed. DX10.1 is "insignificant" because NV wants it to be. In reality, NV can't implement it whereas DX10.1 on the 3800s shows massive performance gains when enabled.
Posted on Reply
#11
Animalpak
btarunr said: That's always the case. We're drenched in amazing numbers such as "512bit", "1 GB GDDR4", "320 SP's".

No, I don't think the HD4870 can beat the GTX 280 in raw performance at least; maybe in price, power and other factors.
I agree :toast:

GT200 rocks ! :rockout:
Posted on Reply
#12
JAKra
DX10.1 Upgrade?

Hi!

I have one question. If DX10.1 can be removed by a patch, does it mean that it works the other way around? Like upgrade Crysis to DX10.1? Or any other DX10 title.
That would be nice, and I presume not too hard to accomplish (technically).
Posted on Reply
#13
Animalpak
HaZe303 said: Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200?? I might be wrong, but it sounds like the nV card is just an evolution of G92 and not a new GPU? I mean GDDR3 still, they could at least go over to 4, preferably to 5 like ATI. And still no DX 10.1?? No, I'm getting me a 4870 this summer, sounds like I'll be getting it for cheap as well. Maybe finally I can afford a Xfire system?? :)
Completely wrong.

GT200 is a FULL new GPU, and GDDR3 works just fine, even better than GDDR5. In the end you get the same results, but GDDR3 is more exploitable.

The differences between DX10 and DX10.1 are minimal! Games have just begun to use DX10, and there is little of it!!
Posted on Reply
#14
Valdez
JAKra said: Hi!

I have one question. If DX10.1 can be removed by a patch, does it mean that it works the other way around? Like upgrade Crysis to DX10.1? Or any other DX10 title.
That would be nice, and I presume not too hard to accomplish (technically).
nvidia owns crytek now, so there will be no dx10.1 support (crytek officially confirmed this).
Posted on Reply
#15
FilipM
It looks great on paper, but how will the wallet look when you buy one of these?

Anyone know the price or have a hint?
Posted on Reply
#16
largon
JAKra,
Unlikely, but really, anything's possible. But of course it's way easier to cut than to add something.
kylew said: (...) NV can't implement it whereas DX10.1 on the 3800s shows massive performance gains when enabled.
DX10.1 in AC allows a performance boost when AA is used. Sure.
But then again, it also causes incompatibility with nV GPUs that only support DX10.

Choose now, which would you fix?
Valdez said: nvidia owns crytek now, so there will be no dx10.1 support (crytek officially confirmed this).
Link please.
Posted on Reply
#17
Valdez
Animalpak said: Completely wrong.

GT200 is a FULL new GPU, and GDDR3 works just fine, even better than GDDR5. In the end you get the same results, but GDDR3 is more exploitable.

The differences between DX10 and DX10.1 are minimal! Games have just begun to use DX10, and there is little of it!!
The GT200 is not new, it's just an improved G80. The memory controller in G80 is not flexible, so they have to use GDDR3 in GT200 too.
Posted on Reply
#18
Valdez
largon said: JAKra,
Unlikely, but really, anything's possible. But of course it's way easier to cut than to add something.

DX10.1 in AC allows a performance boost when AA is used. Sure.
But then again, it also causes incompatibility with nV GPUs that only support DX10.

Choose now, which would you fix? Link please.
It's not incompatible; when Vista SP1 is installed, nV GPUs just don't use the DX10.1 features, but there is no incompatibility.
Posted on Reply
#19
Animalpak
Valdez said: The GT200 is not new, it's just an improved G80. The memory controller in G80 is not flexible, so they have to use GDDR3 in GT200 too.
Sure? Then when will a new GPU come out? They confirmed to everybody that it was new!!

DAMN :shadedshu:mad::banghead:
Posted on Reply
#20
Valdez
Animalpak said: Sure? Then when will a new GPU come out? They confirmed to everybody that it was new!!

DAMN :shadedshu:mad::banghead:
Don't be sad, the GT200 will be the fastest GPU ever released; the 9900GTX will be a brutal card, much faster than the 8800Ultra/9800GTX :)
Posted on Reply
#21
Exavier
I very much doubt it's a new G80 as the most recent cards are G92..

I would also discourage the fanboy attitudes already emerging in this thread... get whichever is best, they're both still unreleased..

also, this comes out on my birthday
mega lol
Posted on Reply
#22
largon
Exavier,
G200 is an evolved G92, which is an evolved G80. So it's more like a "new G80" as it's targeted at the ultra high-end rather than the performance sector like G92.
Valdez said: It's not incompatible; when Vista SP1 is installed, nV GPUs just don't use the DX10.1 features, but there is no incompatibility.
Well obviously nV chips are incompatible with Ubisoft's DX10.1 implementation, as removing it removes the problems with nV GPUs.
Posted on Reply
#23
Valdez
Exavier said: I very much doubt it's a new G80 as the most recent cards are G92..
G92 is just a revised G80, as RV670 is a revised R600, and RV770 is an improved RV670.
Posted on Reply
#24
Valdez
largon said: Well obviously nV chips are incompatible with Ubisoft's DX10.1 implementation, as removing it removes the problems with nV GPUs.
hardocp.com/article.html?art=MTQ5MywxLCxoZW50aHVzaWFzdA==

With SP1 installed, NVIDIA cards perform the same as without SP1. There is no incompatibility; they run fine with DX10.1 but don't use its features. So they run in DX10 mode even if DX10.1 is installed.
Posted on Reply
#25
largon
I'm talking about the DX10.1 implementation in the game, not in SP1. DX10.1 code in AC causes problems with nV GPUs. That's why it was removed by Ubisoft.
Posted on Reply