Monday, May 26th 2008

Next-gen NVIDIA GeForce Specs Unveiled, Part 2

Although it's a bit risky to post information taken from sources unknown to me, not to mention that the site is in German, it's worth a try. The guys over at Gamezoom (Google translated) reported yesterday that during NVIDIA's Editors Day, the same event where the F@H project for NVIDIA cards and the buyout of RayScale were announced, NVIDIA also unveiled the final specs for its soon-to-be-released GT200 cards. This information complements our previous Next-gen NVIDIA GeForce Specs Unveiled story:
  • GeForce GTX 280 will feature 602MHz/1296MHz/1107MHz core/shader/memory clocks, 240 stream processors, a 512-bit memory interface, and GDDR3 memory, as already mentioned.
  • GeForce GTX 260 will come with 576MHz/999MHz/896MHz reference core/shader/memory clocks, 192 stream processors, a 448-bit memory interface, and GDDR3 memory.
The prices are $449 for the GTX 260 and more than $600 for the GeForce GTX 280. That's all for now.
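For a rough sense of scale, the rumored numbers imply the following peak memory bandwidth. This is only a back-of-the-envelope sketch, assuming the quoted memory clocks are base clocks and GDDR3's usual double data rate:

def gddr3_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s for a double-data-rate memory bus."""
    effective_mts = mem_clock_mhz * 2        # GDDR3 transfers on both clock edges
    bytes_per_transfer = bus_width_bits / 8  # bus width in bytes
    return effective_mts * bytes_per_transfer / 1000.0  # MT/s x bytes = MB/s -> GB/s

print(gddr3_bandwidth_gbs(1107, 512))  # GTX 280: ~141.7 GB/s
print(gddr3_bandwidth_gbs(896, 448))   # GTX 260: ~100.4 GB/s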
Source: Gamezoom

108 Comments on Next-gen NVIDIA GeForce Specs Unveiled, Part 2

#51
DarkMatter
imperialreign: But, all-in-all, the new ATI R700 series is a brand new design, not a re-hash of an older GPU like the R600 was to R500 - it might be wishful thinking, but . . . I think we might be in for a surprise with the new ATI GPUs. Probably just wishful thinking on my part, though :ohwell:

Anyhow, someone correct me if I'm wrong, but I thought I remember hearing that nVidia's new G200 is another re-hash of G92/G80? :confused:
WTF? R600 a rehash of R500? :confused::confused: What are you talking about?

Well, it does share many things with R500 (the Xbox 360 GPU), unified shaders for the most part, but nothing with R520 and R580, the X1800 and X1900 respectively.

R600 was a completely new PC GPU architecture, RV670 was a rehash of it, and RV770 is again a rehash of RV670 (kinda). GT200 is also (kind of) a rehash of G92, indeed.
#52
Rurouni Strife
Actually, the R600 was a brand new design that took a few hints from the R580. The R700 is based on the R600, but with multiple design fixes and improvements. Can't say if the G200 is a G92 rehash, but it could be.

People still bought the HD2900, didn't they? That had awful power consumption and so on. People will still buy the GTX 280. Personally, I won't. I don't have that kind of money. I also don't want a mini necular reactor in my case. That's one of the reasons I never got a 2900 Pro or GT. Also, my roommates have had problems with Nvidia drivers that are totally wack (not that I haven't had a few minor issues with ATI; usually the CCC won't install right or work).
#53
imperialreign
We can't compare these new GPUs on paper - not until we start seeing the hardware itself on shelves, coupled with real-world gaming benchmarks.

Sure, nVidia's new G200 series does appear a lot better on paper than ATI's new R700 series - but the R700 series has been in design for a long time; we were hearing rumors of it before the R600 was even released, although it shares a lot of the R600's design.

Just for comparison, the last time we saw a brand spankin' new GPU design from ATI was the R500 series - and the cream of the crop there was the X1800/X1900 series of cards.

nVidia's 7800/7900 cards looked better on paper than the X1800/X1900 series did, but which cards came out of the gate better and stayed ahead of the competition?


It's very possible we'll see that again with these new generations of cards - we'll have to wait and see.
#54
TheGuruStud
Rurouni Strife: Actually, the R600 was a brand new design that took a few hints from the R580. The R700 is based on the R600, but with multiple design fixes and improvements. Can't say if the G200 is a G92 rehash, but it could be.

People still bought the HD2900, didn't they? That had awful power consumption and so on. People will still buy the GTX 280. Personally, I won't. I don't have that kind of money. I also don't want a mini necular reactor in my case. That's one of the reasons I never got a 2900 Pro or GT. Also, my roommates have had problems with Nvidia drivers that are totally wack (not that I haven't had a few minor issues with ATI; usually the CCC won't install right or work).
If you're going for Bushisms, it's "nucular" :P

And my drivers have always been great, along with all the PCs I build (and I use the betas). They probably don't know how to uninstall and reinstall properly.
#55
Rurouni Strife
Haha, I'll say I was (I'm just a horrible speller).

They didn't until I showed them how. That fixed one of their problems, but one of them still has a 7950GX2, and he was missing resolutions and had some problem with Age of Conan. That could be thrown out because it's a 7950GX2, though, I suppose.
#56
DarkMatter
imperialreign: Just for comparison, the last time we saw a brand spankin' new GPU design from ATI was the R500 series - and the cream of the crop there was the X1800/X1900 series of cards.
Again: "the last time we saw a brand spankin' new GPU design from ATI was the R600."

Also, GT200, AKA G100, AKA G90, has been in development for as much time as R700, if not more.

BTW: R580 looked a lot better on paper than Nvidia's cards; R520 didn't. Indeed, that's why the X1900 was so much better and the X1800 was not.
#57
TheGuruStud
Rurouni Strife: Haha, I'll say I was (I'm just a horrible speller).

They didn't until I showed them how. That fixed one of their problems, but one of them still has a 7950GX2, and he was missing resolutions and had some problem with Age of Conan. That could be thrown out because it's a 7950GX2, though, I suppose.
Yeah, the GX2 kinda sucks, haha.
#58
WarEagleAU
Bird of Prey
The prices aren't justified, and no way in hell are they justified if they're only 3-6% better than ATI's high-end offerings. This is ridiculous.

As an aside, they aren't even increasing clocks, shaders, and memory that much from what they have now.
#59
Megasty
WarEagleAU: The prices aren't justified, and no way in hell are they justified if they're only 3-6% better than ATI's high-end offerings. This is ridiculous.

As an aside, they aren't even increasing clocks, shaders, and memory that much from what they have now.
nVidia's just swinging a magic wand. You can look at the ROPs or TMUs, but how in the hell is that going to translate when the core/shader/memory clocks are so low - especially the memory? So there's a billion transistors, but if they're doing half the work they could be doing, then that's a sweet bottleneck. Maybe they just figured we'll be voltmodding it anyway; otherwise the 280 won't even come close to the 4870 X2. $600!? :shadedshu
#60
Millenia
Megasty: nVidia's just swinging a magic wand. You can look at the ROPs or TMUs, but how in the hell is that going to translate when the core/shader/memory clocks are so low - especially the memory? So there's a billion transistors, but if they're doing half the work they could be doing, then that's a sweet bottleneck. Maybe they just figured we'll be voltmodding it anyway; otherwise the 280 won't even come close to the 4870 X2. $600!? :shadedshu
Exactly - at that price it'd damn well better be the cure for cancer, or else it's just a massive waste of money.
#61
mandelore
Millenia: Exactly - at that price it'd damn well better be the cure for cancer, or else it's just a massive waste of money.
lmao, that's well funny :laugh:
#62
iLLz
trt740: They are on crack at that price. Get real.
I bought my 8800 GTX at its stock $649 price because it was the most future-proof card at the time I built my new system. This was in November 2006. I still have that first-revision 8800 GTX, and it's the best card I've ever had. A bit expensive, but well worth it.
#64
yogurt_21
newtekie1: I don't know how accurate this information is, especially since we haven't seen any other reports of it from any reputable sources. I think I'll wait to believe the specs until the cards are actually out, but the shader speeds on these cards seem a little low to me. I know there are more of them, but to drop the speeds that much seems insane to me.
See, I couldn't agree more. I hate it when manufacturers slack off on things just because the competition isn't there.

Sure, Nvidia cards are faster than ATI cards right now, but that doesn't mean you can slack off on specs. I mean, the shader clock was one of ATI's biggest problems, so they've upped it on this round of cards - and how does Nvidia respond? By lowering the clocks on theirs? That doesn't make any sense to me.

I wonder if they're sandbagging on purpose, like they did with the 7800 GTX 256MB, which got pwned by the X1800 XT - then Nvidia launched a few 7800 GTX 512MB cards with uber clocks out of the blue.

So maybe there's a G200 Ultra chip sitting in Nvidia's labs, waiting to crush the RV770. Time will tell.
#65
spearman914
This happens with EVERY recent card: the price starts out mad high, just like when the 8800 GT first came out... but it eventually dropped. I hope this happens to the GTX 260 and GTX 280 too. And I like the naming, instead of a 10800 GTX...
#66
1c3d0g
Millenia: Exactly - at that price it'd damn well better be the cure for cancer, or else it's just a massive waste of money.
A massive waste of money is oil sheiks flying around in their private jets with gold-plated bathroom sinks while the rest of their countrymen are struggling every day to survive. :mad:
#67
TheGuruStud
1c3d0g: A massive waste of money is oil sheiks flying around in their private jets with their gold-plated bathroom sinks. :mad:
Gold plated? I heard they were solid gold :D
#68
Megasty
TheGuruStud: Gold plated? I heard they were solid gold :D
Heh, and I bet all the heatsinks in their comps are made of diamond :respect:
#69
TheGuruStud
Megasty: Heh, and I bet all the heatsinks in their comps are made of diamond :respect:
They know how to use comps? All I ever see them doing is wrecking brand-new imports trying to drift :roll:
#70
magibeg
I thought everyone had diamond heatsinks? Intel gave me a special diamond IHS with a diamond Ultra 120 that has air chilled to absolute zero blowing over it. My temperatures are an even 3 kelvin at load.
#71
GSG-9
magibeg: I thought everyone had diamond heatsinks? Intel gave me a special diamond IHS with a diamond Ultra 120 that has air chilled to absolute zero blowing over it. My temperatures are an even 3 kelvin at load.
I thought the PCB was non-conductive platinum and the circuitry was diamond?
#72
a_of
laszlo: Don't understand what you mean - playable at 50 and unplayable under 60? All games are playable at 30 fps; the human eye doesn't see frames above 25 fps. Maybe you have some special implants (from Nvidia) and you have reached the 50 fps target as a minimum. Good for you, and good for us who play above 30 and under 60 and are happy with it.
You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30 fps and 60 fps. Hell, I can even notice the difference between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when I was playing Counter-Strike with my clan a few years ago, I bought the best that money could buy at the time.
#73
magibeg
a_of: You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30 fps and 60 fps. Hell, I can even notice the difference between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when I was playing Counter-Strike with my clan a few years ago, I bought the best that money could buy at the time.
30 vs. 60 you could probably notice, but I'm afraid your monitor may start to lag behind when you reach 60-80. Personally, I've always found LCDs to give a crisper image than CRTs, as long as you're running at native resolution.
#74
Megasty
magibeg: 30 vs. 60 you could probably notice, but I'm afraid your monitor may start to lag behind when you reach 60-80. Personally, I've always found LCDs to give a crisper image than CRTs, as long as you're running at native resolution.
Yeah, anything over 60 fps doesn't matter anyway, because your monitor won't keep up. CRTs are stuck at 60Hz as a refresh rate, while LCDs can go to 75Hz, but OSes and drivers keep them at 60Hz anyway. Just because a card is running a game at 90+ fps doesn't mean you're seeing all of them, unless you're superhuman :rolleyes:
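The arithmetic behind this back-and-forth is simple enough to sketch. Below is a toy model, purely illustrative, assuming vsync shows at most one new frame per refresh interval:

def frame_time_ms(fps):
    """Per-frame time budget in milliseconds."""
    return 1000.0 / fps

def displayed_fps(render_fps, refresh_hz):
    """With vsync on, the display shows at most one new frame per refresh."""
    return min(render_fps, refresh_hz)

print(frame_time_ms(30))       # 33.3 ms per frame
print(frame_time_ms(60))       # 16.7 ms per frame
print(displayed_fps(90, 60))   # 60 -> a 60Hz panel drops the extra frames
print(displayed_fps(90, 100))  # 90 -> a 100Hz CRT would show all of them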
#75
TheGuruStud
Megasty: Yeah, anything over 60 fps doesn't matter anyway, because your monitor won't keep up. CRTs are stuck at 60Hz as a refresh rate, while LCDs can go to 75Hz, but OSes and drivers keep them at 60Hz anyway. Just because a card is running a game at 90+ fps doesn't mean you're seeing all of them, unless you're superhuman :rolleyes:
I hope you're being sarcastic.

#1: 30-60 fps sucks big time; it looks horrible.

#2: My cutoff in framerate is about 85 before I can't tell.

#3: My refresh point is about 100Hz.

#4: I game at 1280x1024 at 100Hz, with vsync of course, b/c tears blow.

#5: You're using a shitty driver if you can't select 75 on an LCD.

#6: You all need to get new monitors.