# Next-gen NVIDIA GeForce Specs Unveiled, Part 2



## malware (May 26, 2008)

Although it's a bit risky to post information taken from sources unknown to me, not to mention that the site is German, it's worth a try. The guys over at Gamezoom (Google translated) reported yesterday that during NVIDIA's Editors Day, the same event where the F@H project for NVIDIA cards and the buyout of RayScale were announced, NVIDIA also unveiled the final specs for its soon-to-be-released GT200 cards. This information complements our previous Next-gen NVIDIA GeForce Specs Unveiled story:
GeForce GTX 280 will feature 602 MHz/1296 MHz/1107 MHz core/shader/memory clocks, 240 stream processors, a 512-bit memory interface, and GDDR3 memory, as already mentioned.
GeForce GTX 260 will come with 576 MHz/999 MHz/896 MHz reference clock speeds, 192 stream processors, a 448-bit memory interface, and GDDR3 memory.
The prices are $449 for the GTX 260 and more than $600 for the GeForce GTX 280. That's all for now.

*View at TechPowerUp Main Site*


----------



## X-TeNDeR (May 26, 2008)

*Six hundred dollars!!!*  awww..

Specs look good, but man.. NVidia is hunting for fat wallets again


----------



## Megasty (May 26, 2008)

huh...$600+ :shadedshu

It better be the best dang thing ever. They just seem to be asking for it this time. I don't see how that price will cut the mustard at all, especially with the fabled 4870x2 @ $500. Hopefully the card will be enough of a monster to back that price up.


----------



## Animalpak (May 26, 2008)

Wow, the GTX 280 is a true monster. $600 is not too much if you compare it to what an 8800 Ultra cost some years ago. And remember the 8800 Ultra Black Pearl or the EVGA 8800 Ultra Superclocked? The most expensive cards ever sold.


----------



## v-zero (May 26, 2008)

Ridiculous prices again - I really hope ATi wipe the floor with those idiots at nVidia this time.


----------



## laszlo (May 26, 2008)

v-zero said:


> Ridiculous prices again - I really hope ATi wipe the floor with those idiots at nVidia this time.




At least in price/performance they will; common users don't buy the high-end cards, and the $500-600 cards are good for e-penis and records...


----------



## hat (May 26, 2008)

For everyone bitching about the price: remember when the x800/x850 came out?
Remember the 8800 Ultra? It was like $800. The GTX was around $600 as well.
Prices will come down; everything is always uber expensive at launch...


----------



## substance90 (May 26, 2008)

laszlo said:


> at least with price/perf. they will;common user don't buy the high-end cards and the 5-600$ cards are good for e-penis and records...


Well said, mate! Besides, what does one need such a graphics card for? I'm running every single game (except Crysis, of course) on high at 1920x1200 with an 8800 GT 512 MB. And there's the fact that more and more developers are abandoning the PC platform.


----------



## Megasty (May 26, 2008)

Price bitching is fine & all, but if it's not faster than the 4870x2 then it's worthless, & no, I don't care about comparing a 2-GPU setup to 1. It may cost them that much to produce it, but it should at least include GDDR5 instead of sticking with the stuff they've been using for the last 3 yrs.


----------



## Black Hades (May 26, 2008)

Well, if it's going to have a 3% to 6% performance advantage against the 4xxx then I guess the price is justified. We'll just have to wait and see.

I won't line NVIDIA's pockets with my cash anymore, as long as ATi's products can play all the new games at max settings while costing less.

On paper ATi looks better, nobody can deny that, but in real life games are mostly designed in partnership with NVIDIA. Games made specifically for ATi products run much better on ATi, what a surprise... 

Makes you think a bit about what's more important: making big, bad GPUs or optimizing the software.


----------



## Darksaber (May 26, 2008)

What's with the GTX 260?

33% more shaders, but clocked 35% slower than the shaders on the 8800 GTX... I somehow fail to see how this card will massively pwn the 8800 GTX at twice the current price... maybe it overclocks well? We Europeans can expect prices of 399 and 599 Euros when the cards hit. Way too much for my taste.

cheers
DS


----------



## oli_ramsay (May 26, 2008)

Wow, $600 is a bit steep; wonder how that'll translate into pounds over here. I reckon £400 

Here's an interesting comparison chart:






If the 4870 is nearly as fast as the GTX 280, it'll be a much better buy I think.


----------



## FreedomEclipse (May 26, 2008)

Black Hades said:


> Well if it's going to have a 3% to 6% advantage performance wise against the 4xxx then I guess the price is justified. We'll just wait to see.
> 
> I wont line NVIDIA's pockets anymore with my cash, as long as ATi's products can play all the new games at max settings while costing less.
> 
> ...



I totally agree, you've hit the nail on the head.


----------



## adrianx (May 26, 2008)

From what oli_ramsay posted... 

ATi/AMD's design looks better.


----------



## Weer (May 26, 2008)

Darksaber said:


> whats with the GTX 260?
> 
> 33% more shaders, but clocked 35% slower than the shaders on the 8800 GTX...I somehow fail to see how this card will massively pwn the 8800 GTX at twice the current price...maybe it overclocks well? We europeans can expect a price of 399 and 599 Euros when the cards hit. way to much for my taste.
> 
> ...



Dude.. it's 50% more. How did you get to 33%? And you should compare it to the 8800 GTS, over which it has 100% more. The shader clock is irrelevant seeing as 65nm can get you 2000 MHz.


----------



## oli_ramsay (May 26, 2008)

adrianx said:


> from what oli_ramsay post...
> 
> ati/amd are better from design



They sure are, nearly 1/2 the price and nearly 100 watts less power consumption, which is an important thing to consider with energy prices relentlessly shooting up.

Just hope they perform well, can't wait for some benchies


----------



## magibeg (May 26, 2008)

It seems nvidia really thinks they have something special here. Although I guess, considering the card lineup, the price and such should be about right, with the GTX 260 sitting above the 9800 GX2 and the GTX 280 sitting above that. The prices actually make sense. Price doesn't matter as much to the target audience of these cards, and this way nvidia might be able to say they have the fastest GPUs in the world.


----------



## Para_Franck (May 26, 2008)

Now I've about had enough! Can't anyone wait a little while and see the street prices and real-world performance before comparing products? C'mon guys! Who cares which card is the best 1 month before it hits the market? 

For my part, I don't. When I see real-world performance and I can select the card at my favorite online stores, then and only then will I have an opinion on any cards or products from any manufacturer.

The only thing we KNOW for now, and this may not even be true (I hope it is), is that either the 48xx or the gt2xx will be better than what we have right now, but not as good as what will be available in 6 to 8 months.

Franck


----------



## largon (May 26, 2008)

Yet again I see "GT200", which doesn't exist... 
It's G200.


----------



## Cold Storm (May 26, 2008)

I spent $600+ on my card, not because of the e-penis, but because I was going SLI either way to try things out. So I think if needed, I'd get one of these cards at the same price. But I just hope that for others the price will be different...


----------



## Darksaber (May 26, 2008)

Weer said:


> Dude.. it's 50% more. How did you get to 33%? And you should compare it to the 8800 GTS which it has 100% more. The shader clock is irelevant seeing as 65nm can get you 2000Mhz.



LOL, let me rephrase that 

The 8800 GTX has 2/3 of the shaders (128 instead of 192), but those are 35% faster (stock). So that leaves a bit of extra performance for the 260 just looking at the shaders; of course I am not considering GPU design and so on... could be that the 260 blows the 88GTX to pieces...

time will tell ^^

cheers
DS


----------



## Animalpak (May 26, 2008)

ATi can just RUN the games, but how it runs the games everybody knows. Between the playable (around 50 fps) and the unplayable (under the 60 fps), but nothing special.

Why do I have to purchase a card that is only all right in the benchmarks?


----------



## laszlo (May 26, 2008)

Animalpak said:


> Ati can just RUN the games, but how he run the  games everybody knows. Between the playable ( around 50fps ) and the unplayable ( Under the 60 fps ) but nothing special.




I don't understand; what do you mean, playable at 50 and unplayable under 60? 

All games at 30 fps are playable; the human eye doesn't see frames above 25 fps. Maybe you have some special implants (from Nvidia) and you have reached the 50 fps target as a minimum. Good for you, and good for us who play above 30 and under 60 and are happy with it.


----------



## largon (May 26, 2008)

*Darksaber*,
There's a huge difference in SP throughput compared to G80:
G80 GTX: 128 SPs (MADD) @ 1.35 GHz -> 345.6 GFLOPS
GTX 280: 240 SPs (MADD+MUL) @ 1.296 GHz -> 933.12 GFLOPS
That's exactly 2.7 times that of G80.
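The arithmetic is just peak throughput = SPs × FLOPs per SP per clock × shader clock, counting a MADD as 2 FLOPs and MADD+MUL dual issue as 3. A quick sketch to check it (the helper name is mine):

```python
# Peak shader throughput: SPs * FLOPs-per-SP-per-clock * clock (GHz) = GFLOPS.
def peak_gflops(sps, flops_per_clock, clock_ghz):
    return sps * flops_per_clock * clock_ghz

g80_gtx = peak_gflops(128, 2, 1.35)    # MADD = 2 FLOPs -> 345.6 GFLOPS
gtx280  = peak_gflops(240, 3, 1.296)   # MADD+MUL = 3 FLOPs -> 933.12 GFLOPS
print(round(gtx280 / g80_gtx, 2))      # 2.7
```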

*Animalpak*,
By the looks of your graphs _you_ have the gaming card (8800Ultra). HD3870X2 is the benchmarking card.


----------



## EastCoasthandle (May 26, 2008)

hat said:


> for everyone bitching about the price. Remember when the x800/x850 came out?
> Remember the 8800 ultra? It was like $800. The GTX was around $600 as well.
> Prices will come down, everything is always uber expensive at launch...



The problem with this opinion is that back then the performance increase was needed.  As it stands now, the added performance is not needed at such a high premium.  A lot of high-end card owners aren't overclocking like they did in days past.  That's a pure indicator that we have plateaued above average frame rates in most games.  Sure, e-peen dictates that higher FPS is better than what you get now.  However, someone whose card can already play games at above-average frame rates without even overclocking (in most cases) would be hard pressed to buy a card that costs $600+.


----------



## PVTCaboose1337 (May 26, 2008)

That is too much for me to pay!  Why would you spend that much money?


----------



## magibeg (May 26, 2008)

I'm almost afraid to say it, but it is strange how the ati cards do so much better in benchmarks.  (Once again I may regret saying this.) It does seem kind of like a lot of games are specifically designed with nvidia cards in mind.


----------



## laszlo (May 26, 2008)

EastCoasthandle said:


> The problem with this opinion is that back then the performance increase WAS NEEDED.  As it stands now, the added performance increase is NOT NEEDED at such a high premium.  A lot of high end card owners aren't over-clocking like they did in days past.  That's a pure indicator that we have plateau'd above average frame rates in most games.  Sure, e-peen dictates higher FPS is better then what you get now.  However, having a card that can play games at above average frame rates without even overclocking (in most cases) would be hard pressed to buy a card that cost $600+.




Exactly; no matter the GPU brand, in the last year the midrange $150-250 cards proved to be enough to play all titles at a decent fps. When VIA or S3 or Intel has a GPU capable of a decent fps, people will buy it because of the price/performance ratio; the best-buy card will have the best $/fps ratio, a tendency which already rules the market.

I don't really understand why people must be fanboys of Nvidia or Ati; the point is to buy a card that suits you better and use it for a few years without upgrading. I don't like it when mud is thrown from both sides just to prove that "I have a card from the 1st and best GPU manufacturer in the world" (this can be Nv or Ati). Who cares?


----------



## newtekie1 (May 26, 2008)

I don't know how accurate this information is, especially since we haven't seen reports of it from any other reputable sources.  I think I'll wait to believe specs until the cards are actually out, but the shader speeds on these cards seem a little low to me.  I know there are more of them, but dropping the speeds that much seems insane.


----------



## DarkMatter (May 26, 2008)

magibeg said:


> I'm almost afraid to say it but it is strange why the ati cards do so much better in benchmarks.  (once again i may regret saying this) It does seem to be kind of like a lot of games are specifically designed with nvidia cards in mind.



Of course, because it's much easier to get ALMOST ALL developers to make your cards run games faster than to make 2-3 benchmarks run faster. Funny how corrupted all game developers are, but benchmark developers are so incorruptible...


----------



## Widjaja (May 26, 2008)

No doubt nVidia will end up releasing a GT 280 that ends up being the next-gen 8800 GT.

So it's most probably just a wait for the price-vs-performance-minded people after the GTX 280 is released.


----------



## largon (May 26, 2008)

magibeg said:


> I'm almost afraid to say it but it is strange why the ati cards do so much better in benchmarks.  (once again i may regret saying this) It does seem to be kind of like a lot of games are specifically designed with nvidia cards in mind.


Or that benchmarks are optimized for ATi. 
*shrug* 

The reason is in the architecture. For a game to run well on ATi R6-gen GPUs, some heavy optimizations are required, as the architecture of R6 GPUs is so different from earlier ones, and then of course there are the obvious flaws in R600/RV670, like the absolutely horrible texture filtering capabilities and the innate inefficiency of the superscalar shaders. RV770 will partially fix the texture filtering shortcomings, but unfortunately the TMUs are _only_ doubled; thus RV770's texturing muscle will still clearly trail that of even G92.


----------



## Azazel (May 26, 2008)

well its a business after all. they want to make a profit


----------



## trt740 (May 26, 2008)

X-TeNDeR said:


> *Six hundred dollars!!!*  awww..
> 
> Specs look good, but man.. NVidia is hunting for fat wallets again



They are on crack at that price, get real.


----------



## Black Hades (May 26, 2008)

DarkMatter said:


> Of course, because it's much easier to make ALMOST ALL developers to make your card run the games faster than to make 2-3 benchmarks run faster. Funny how corrupted all game developers are, but benchmark developers are so incorruptible...



This is not about corrupt ppl, it's about the fact that NVIDIA and ATi cards are very different.
When you make a new game, for instance, you have to optimize the code so that it takes advantage of the hardware the GPU has. This process takes time; you don't generally have time to cover all aspects and explore all the resources of two cards that work in different ways. So developers have to choose. Nvidia gives them all the help they need in understanding and using the nooks and crannies of their cards. So games run better on NVIDIA cards.

I'm trying to stay neutral here, the only reason that I like ATi cards right now is because the competition is overcharging.


----------



## DrPepper (May 26, 2008)

When the Ultra came out and the price was a lifetime of slavery, some people would still try to buy 3 of them.


----------



## DarkMatter (May 26, 2008)

Black Hades said:


> This is not about corrupt ppl, it's about the fact that NVIDIA and ATi cards are very different.
> When you make a new game for instance you have to optimize the code so that it takes advantage of the hardware the GPU has. This process can take time, you dont generaly have time to cover all aspects and explore all the resources of two cards that work in different ways. So designers have to choose. Nvidia gives them all the help they need in understanding and using the nooks and crannies  of their cards. Then games run better on NVIDIA cards.
> 
> I'm trying to stay neutral here, the only reason that I like ATi cards right now is because the competition is overcharging.



Have you ever read any game developer's blog? They write specific code not only for each brand, but for almost each card architecture. They may then optimize the Nvidia-specific code better under TWIMTBP, because Nvidia gives them extensive support. In that respect the code for Nvidia hardware may be better optimized, but each card has its own code.

Long before TWIMTBP, many first-tier game developers said the way 3DMark did things WAS NOT how they were going to do things in the future. AFAIK this never changed, so it's the same with 3DMark 06 too. Saying that the Ati architecture is any better based on these kinds of benchmarks implies that those benchmarks are doing things right, which they aren't. 

This is something that has always bugged me. People say that game developers are writing code "Nvidia's way", but they never stop and think that MAYBE Nvidia is designing their hardware the "game developers' way", while Ati may not be. TWIMTBP is a two-way relationship. Over time Ati has become better (comparatively) in benchmarks and worse in games. Everybody blames TWIMTBP for this, without taking into account each company's design decisions. For a simple example, Ati's superscalar, VLIW and SIMD shader processors are A LOT better suited to the HOMOGENEOUS CODE involved in a static benchmark than to the ever-changing code involved in a game. Also, in the case of R600 and one of its biggest flaws, its TMUs, benchmarks are a lot more favorable than games, since you can "guess" which texture comes next and you don't have to care about the textures that have already been used. In games you don't know where the camera will head next, so you don't know which textures you can discard. In reality neither of the architectures can "guess" the next texture, but G80/92 with its bigger texture power can react better to texture changes. In benchmarks R600/670 can mitigate this effect with streaming.


----------



## TheGuruStud (May 26, 2008)

Well, with 1 billion transistors does it even matter how many ROPs, shaders, etc. it has? LOL *sarcasm*

It has way more switching power, which means ownage (as long as they don't screw it up like the FX series). Clocks are irrelevant (overall).


----------



## BigBruser13 (May 26, 2008)

*Latest upcoming nvidia offerings*

All I want to know is: can it play Crysis on Very High at 1920x1200?


----------



## HTC (May 26, 2008)

oli_ramsay said:


> wow $600 is a bit steep, wonder has that'll translate into pounds over here.  I reckon £400
> 
> Here's an interesting comparison chart:
> 
> ...



Even if the nVidia options turn out to be better (by say... 10% to 15%), *if this power consumption is real*, I would buy an ATI every time.

Only if the difference turns out more substantial would I consider buying an nVidia, but *NOT* for that price!


----------



## TheGuruStud (May 26, 2008)

HTC said:


> Even if the nVidia options turn out to be better (by say ... 10% to 15%), *if this power consumption is real*, i would buy an ATI every time.
> 
> Only if the difference turns out more substantial would i consider to buy a nVidia but *NOT* for that price!



Well, if it's real, then I can say that ATI is missing transistors, b/c they're the ones always sucking down the juice.


----------



## DarkMatter (May 26, 2008)

HTC said:


> Even if the nVidia options turn out to be better (by say ... 10% to 15%), *if this power consumption is real*, i would buy an ATI every time.
> 
> Only if the difference turns out more substantial would i consider to buy a nVidia but *NOT* for that price!



Thing is, from the looks of the specs, RV770 will have 2x the power of RV670 and GT200 will have 2x that of G92. This means a GTX 280 40-50% faster than the HD4870, and the GTX 260 could end up being 25% faster. 
It's early to say anything, but most probably each card will have its market segment and ALL of them will have similar price/performance, as has almost always been the case. Also, the GTX cards won't lag far behind in performance-per-watt, always based on these specs, which I don't know whether to trust. But then again we are always talking about these specs, so, simple math:

157 W + 50% = 157 + 79 = 236 W. Isn't that funny?
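That back-of-the-envelope scaling is just a percentage bump on whatever rumored wattage you plug in (the helper name is mine, and 157 W is only the rumored figure from this thread, not a confirmed spec):

```python
# Scale a rumored TDP figure by a percentage increase, as in the post above.
def scale_power(watts, pct_increase):
    return watts * (1 + pct_increase / 100)

print(scale_power(157, 50))  # 235.5, which the post rounds up to 236 W
```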


----------



## HTC (May 26, 2008)

DarkMatter said:


> Thing is that from the looks of the specs, RV770 will have 2x the power of RV670 and GT200 will have 2x that of G92. This means a GTX 280 40-50% faster than the HD4870 and GTX260 could end up being 25% faster.
> It's early to say anything, but most probably each card will have its market segment and ALL of them will have a similar price/performance, as it's been the case almost always. Also GTX cards won't lag a lot behind in performance-per-watt, always based on these specs which I don't know if are trusty. But then again we are always talking about these specs so, simple math:
> 
> 160 watts +50% = 160 +80 = 240



If you're correct, I would buy an nVidia card, but *only when the price drops to a more realistic value*.

nVidia's GTX 280 could be 10 times faster than ATI's 4870x2, but I wouldn't buy it for this amount of money: NO WAY!!!!


----------



## DarkMatter (May 26, 2008)

HTC said:


> If you're correct, i would buy a nVidia card, but *only when the price dropped to a more realistic value*.
> 
> nVidia's GTX280 could be 10 times faster the ATI's 4870x2 but i wouldn't buy it for this amount of money: NO WAY!!!!



I will never buy any card above $300 either, but the VALUE of the cards is undeniable.

Also, I've been looking around, and other sources say $400 and $500 for the GTX 260 and 280 respectively, and Nvidia may still have an ace up its sleeve called the GTX 260 448 MB, which could pwn like the GTS 320 did in the past, so who knows...


----------



## Megasty (May 26, 2008)

HTC said:


> If you're correct, i would buy a nVidia card, but *only when the price dropped to a more realistic value*.
> 
> nVidia's GTX280 could be 10 times faster the ATI's 4870x2 but i wouldn't buy it for this amount of money: NO WAY!!!!



That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.


----------



## HTC (May 26, 2008)

Megasty said:


> That's the full gray zone. *On paper the 280 & 260 will be 25-50% faster than the 4870*. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.



Dunno how you figure that, but it doesn't matter!

To me, there are only 2 important aspects for any card: power usage and price (in that order).


----------



## DarkMatter (May 26, 2008)

Megasty said:


> That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.



That logic has its flaws:

1- The HD3870 X2 is not twice as fast as the HD3870; we can guess the HD4 X2 won't be either.

2- The HD3870 X2's price is more than twice that of the HD3870; will this be different? That puts the HD4870 X2 well above $600. Probably it will be cheaper, but it won't launch until August, and we don't know how prices are going to be then...


----------



## imperialreign (May 26, 2008)

DarkMatter said:


> Have you ever read any game developer's blog? They do specific code not only for each brand, but for almost each card architecture. Then they may optimize better the code specific for Nvidia hardware under TWIMTBP, because Nvidia gives them extense support. In that respect the code for Nvidia hardware may be better optimized, but each card has its own code.
> 
> Long before TWIMTBP, many first tier game developers said the way 3DMark did things WAS NOT how they were going to do things in the future. AFAIK this never changed, so it's the same with 3DMark 06 too. Saying that Ati architecture is any better based on these kind of benchmarks, involves that those benchmarks are doing things right, which they aren't.
> 
> This is something that has always bugged me. People say that game developers are making code "Nvidia's way", but they never stop and think that MAYBE Nvidia is doing their hardware in "game developers way", while Ati may not. TWIMTBP it's a two way relationship. With the time Ati has become better (comparatively) in benchmarks and worse in games. Everybody blames TWIMTBP for this, without taking into account each company's design decisions. For a simple example, Ati's Superscalar, VLIW and SIMD shader processors are A LOT better suited for the HOMOGENEOUS CODE involved in an static benchmark, than for the ever changing code involved in a game. Also in the case of R600 and one of it's biggest flaws, its TMUs, benchmarks are a lot more favorable than games, since you can "guess" which texture comes next and you don't have to care about the textures that have already been used. In games you don't know where the camera will head next, so you don't know which textures you can discard. In reality none of the architectures can "guess" the next texture, but G80/92 with it's bigger texture power can react better to texture changes. On benchmarks R600/670 can mitigate this effect with streaming.





I agree to an extent.  


Although I think there is a major difference in GPU architecture that has been holding ATI back across the board, the few games where ATI has worked closely with developers show a fairly level playing field, or better ATI performance upon release.

Sadly, the only two games I can think of where I know for certain that ATI worked closely with the developers are Call of Juarez, where we see ATI cards tending to outperform nVidia's, and FEAR, where we see ATI cards continuing to keep pace with nVidia's.


I'm sure a certain amount of collaboration does tend to help out nVidia overall, but yes, GPU architecture does come into play as well; and ATI's GPUs just haven't been suited to the more complex games we've seen over the last 2 years or so.

But, all-in-all, the new ATI R700 series is a brand new design, not a re-hash of an older GPU like the R600 was to the R500. It might be wishful thinking, but... I think we might be in for a surprise with the new ATI GPUs. Probably just wishful thinking on my part, though 





Anyhow, someone correct me if I'm wrong, but I thought I remember hearing that nVidia's new G200 is another re-hash of G92/G80?


----------



## mandelore (May 26, 2008)

lol, you know what, for that price you could buy 2 of ATI's single offerings and beat out the NV counterpart ^^

Or just go X2 and win that way, and the X2 will no doubt be far cheaper than NV's GX2 alternative, so again, price-wise and performance-wise it's win-win for ATI atm.

Think about it: NV may have one killer card, but if you can almost buy 2 of ATI's own killer cards (doesn't matter if they are less powerful than NV's offering) for around or just over the price of one of NV's cards, do the math; ATI / the customer would win every time?


----------



## warhammer (May 26, 2008)

Save your money for the DX11 cards


----------



## Megasty (May 26, 2008)

DarkMatter said:


> That logic has its flaws:
> 
> 1- HD3870 X2 is not twice as fast as HD3870, we could guess HD4 x2 won't either.
> 
> 2- HD3870 X2 price is more than twice than DH3870, will this be different? That puts the HD4870 X2 well above $600. Probably it will be cheaper, but it won't launch until August, we don't know how prices are going to be then...



I'm not even comparing the R600 to the R700. They're 2 completely different GPUs. The 4870x2 is also using a different method of combining the cores, so who knows what new mess they will present to us. Anyway, they said a long time ago that the thing wouldn't be over $499. How accurate that is now, with the price of GDDR5 going up, is in the air as well. But for now I'll hold them to it until I hear differently. 

The only way the 2 monsters won't compete is if the 4870x2 is priced that low. I guess it'll make sense to ATi, but there's still no way that 280 or 260 is getting my money. I'd stick with the GT model if I had to get one, but I'm sure ppl will still flock to snatch them up. Hopefully it will be worth it to them.


----------



## DarkMatter (May 26, 2008)

imperialreign said:


> But, all-in-all, the new ATI R700 series is a brand new design, not a re-hash of an older GPU like the R600 was to R500 - It might be wishful thinking, but . . . I think we might be in for a surprise with the new ATI GPUs.  Probably just wishful thinking on my part, though
> 
> Anyhow, someone correct me if I'm wrong, but I thought I remember hearing that nVidia's new G200 is another re-hash of G92/G80?



WTF? R600 a rehash of R500?  What are you talking about? 

Well, well, it shares many things with R500 (the XB360 GPU) indeed, unified shaders for the most part. But nothing with R520 and R580, the X1800 and X1900 respectively.

R600 was a completely new PC GPU architecture, RV670 was a rehash of it, and RV770 is again a rehash of RV670 (kinda). GT200 is also (kind of) a rehash of G92 indeed.


----------



## Rurouni Strife (May 26, 2008)

Actually, the R600 was the brand new design that took a few hints from the R580.  The R700 is based off the R600 but with multiple design fixes and improvements.  Can't say if the G200 is a G92 rehash, but it could be.

People still bought the HD2900, didn't they?  That had awful power consumption and so on.  People will still buy the GTX 280.  Personally, I won't.  I don't have that kind of money.  I also don't want a mini necular reactor in my case.  That's one of the reasons I never got a 2900 Pro or GT.  Also, my roommates have had problems with Nvidia drivers (not that I haven't had a few minor issues with ATI, usually the CCC won't install right or work) that are totally wack.


----------



## imperialreign (May 26, 2008)

we can't compare these new GPUs on paper - not until we start seeing the hardware itself on shelves, and coupled with real-world gaming benchmarks

Sure, nVidia's new G200 series does appear a lot better on paper than ATI's new R700 series - but the R700 series has been in design for a long time; we were hearing rumors of it before R600 was even released, although it shares a lot of the design of the R600.

Just for comparison, the last time we saw a brand spankin new GPU design from ATI was the R500 series - and the cream of the crop there was the 1800/1900 series of cards.

nVidia's 7800/7900 cards looked better on paper than the 1800/1900 series did, but which cards came out of the gate better, and stayed ahead of the competition?


It's very possible we might see that again with these new generations of cards - we'll have to wait and see.


----------



## TheGuruStud (May 26, 2008)

Rurouni Strife said:


> Actually, the R600 was the brand new design that took a few hints from the R580.  The R700 is based off of the R600 but with multiple design fixes and improvements.  Can't say if the G200 is a G92 rehash but it could be.
> 
> People still bought the HD2900 didn't they?  That had awful power consumption and so on.  People will still buy the GTX280.  Personally, I wont.  I dont have that kind of money.  I also don't want a mini necular reactor in my case.  Thats one of the reasons I never got a 2900Pro or GT.  Also, my roommates have had problems with Nvidia drivers (not that I haven't had a few minor issues with ATI, usually the CCC won't install right or work) that are totally wack.



If you're going for bushisms, it's nucular 

And my drivers have always been great along with all the PCs I build (and I use the "betas"). They probably don't know how to uninstall and reinstall properly.


----------



## Rurouni Strife (May 26, 2008)

Haha I'll say I was (I'm just a horrible speller).

They didn't until I showed them how.  That fixed one of their problems, but one of them has a 7950GX2 still, and he was missing resolutions and had some problem with Age of Conan.  That could be thrown out because it's a 7950GX2 though I suppose.


----------



## DarkMatter (May 26, 2008)

imperialreign said:


> Just for comparison, the last time we saw a brand spankin new GPU design from ATI was the R500 series - and the cream of the crop there was the 1800/1900 series of cards.



Again, "the last time we saw a brand spankin new GPU design from ATI was the *R600*"

Also GT200, AKA G100, AKA G90 has been in development for as much if not more time than R700.

BTW: R580 looked a lot better on paper than Nvidia's card, R520 didn't. Indeed, that's why the X1900 was so much better and the X1800 was not.


----------



## TheGuruStud (May 26, 2008)

Rurouni Strife said:


> Haha I'll say I was (I'm just a horrible speller).
> 
> They didn't until I showed them how.  That fixed one of their problems, but one of them has a 7950GX2 still, and he was missing resolutions and had some problem with Age of Conan.  That could be thrown out because it's a 7950GX2 though I suppose.



Yeah, the gx2 kinda sucks haha.


----------



## WarEagleAU (May 26, 2008)

The prices aren't justified, and no way in hell are they justified if they're only 3-6% better than ATI's high-end offerings. This is ridiculous.

As an aside, they aren't even increasing clocks, shaders, and memory that much over what they have now.


----------



## Megasty (May 26, 2008)

WarEagleAU said:


> The prices aren't justified, and no way in hell are they justified if they're only 3-6% better than ATI's high-end offerings. This is ridiculous.
> 
> As an aside, they aren't even increasing clocks, shaders, and memory that much over what they have now.



nVidia's just swinging a magic wand. You can look at the ROPs or TMUs but how in the hell is it going to translate when the core/shd/mem are so low - especially the mem. So there's a billion transistors, but if they're doing half the work that they could be doing then that's a sweet bottleneck. Maybe they just figured we'll be voltmodding it anyway, otherwise the 280 won't even come close to the 4870x2. $600!? :shadedshu
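For what it's worth, the "low" memory clocks still work out to serious theoretical bandwidth. A rough back-of-the-envelope (my numbers, assuming the usual GDDR3 double-data-rate doubling, not anything from the article):

```python
# Peak theoretical memory bandwidth from the rumored specs.
# GDDR3 is double-data-rate, so each clock carries two transfers.
def bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """GB/s = clock * 2 transfers/clock * bytes moved per transfer."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

# GTX 280: 1107 MHz on a 512-bit bus -> ~141.7 GB/s
print(round(bandwidth_gbps(1107, 512), 1))
# GTX 260: 896 MHz on a 448-bit bus -> ~100.4 GB/s
print(round(bandwidth_gbps(896, 448), 1))
```

So even at these clocks the wide bus keeps the cards well ahead of a 256-bit design at the same memory speed; whether the cores can actually use all of it is the real question.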


----------



## Millenia (May 26, 2008)

Megasty said:


> nVidia's just swinging a magic wand. You can look at the ROPs or TMUs but how in the hell is it going to translate when the core/shd/mem are so low - especially the mem. So there's a billion transistors, but if they're doing half the work that they could be doing then that's a sweet bottleneck. Maybe they just figured we'll be voltmodding it anyway, otherwise the 280 won't even come close to the 4870x2. $600!? :shadedshu



Exactly, at that price it'd damn better be the cure for cancer or else it's just a massive waste of money.


----------



## mandelore (May 26, 2008)

Millenia said:


> Exactly, at that price it'd damn better be the cure for cancer or else it's just a massive waste of money.



lmao, that's well funny


----------



## iLLz (May 26, 2008)

trt740 said:


> they are on crack at that price get real



I bought my 8800GTX stock at $649 because it was the most future-proof card at the time I built my new system.  This was November 2006.  I still have this first revision of the 8800GTX and it's the best card I've ever had.  A bit expensive, but it was well worth it.


----------



## MrMilli (May 26, 2008)

http://www.theinquirer.net/gb/inquirer/news/2008/05/26/deliberate-understatement-real


----------



## yogurt_21 (May 26, 2008)

newtekie1 said:


> I don't know how accurate this information is, especially since we haven't seen any other reports of it from any reputable sources.  I think I'll wait to believe specs until the cards are actually out, but to me the shader speeds on these cards seem a little low to me.  I know there is more of them, but to drop the speeds that much seems insane to me.



See, I couldn't agree more. I hate it when manufacturers slack off just because the competition isn't there.

Sure, nVidia cards are faster than ATI cards right now, but that doesn't mean you can slack off on specs. I mean, the shader clock was one of ATI's biggest problems, so they've upped it on this round of cards, and how does nVidia respond? By lowering the clocks on theirs? That doesn't make any sense to me.

I wonder if they're sandbagging on purpose, like they did with the 7800GTX 256MB, which got pwned by the X1800XT; then nVidia launched a few 7800GTX 512MB cards with uber clocks out of the blue.

So maybe there's a G200 Ultra chip sitting in nVidia's labs waiting to crush the RV770. Time will tell.


----------



## spearman914 (May 26, 2008)

This happens to EVERY recent card. The price will be mad high, just like when the 8800 GT first came out... eventually, however, it dropped. I hope this happens to the GTX 260 and GTX 280 too. And I like the naming, instead of 10800 GTX.......


----------



## 1c3d0g (May 27, 2008)

Millenia said:


> Exactly, at that price it'd damn better be the cure for cancer or else it's just a massive waste of money.



A massive waste of money is oil sheiks flying around in their private jets with gold-plated bathroom sinks while the rest of their countrymen are struggling every day to survive.


----------



## TheGuruStud (May 27, 2008)

1c3d0g said:


> A massive waste of money is oil sheiks flying around in their private jets with their gold-plated bathroom sinks.



Gold plated? I heard they were solid gold


----------



## Megasty (May 27, 2008)

TheGuruStud said:


> Gold plated? I heard they were solid gold



heh, & I bet all the heatsinks in their comps are made of diamond


----------



## TheGuruStud (May 27, 2008)

Megasty said:


> heh, & I bet all the heatsinks in their comps are made of diamond



They know how to use comps? All I ever see them doing is wrecking brand new imports trying to drift


----------



## magibeg (May 27, 2008)

I thought everyone had diamond heatsinks? Intel gave me a special diamond IHS with a diamond ultra 120 that has air chilled to absolute 0 blowing over it. My temperatures are an even 3 kelvin at load.


----------



## GSG-9 (May 27, 2008)

magibeg said:


> I thought everyone had diamond heatsinks? Intel gave me a special diamond IHS with a diamond ultra 120 that has air chilled to absolute 0 blowing over it. My temperatures are an even 3 kelvin at load.



I thought the pcb was non conductive platinum and the circuitry was diamond?


----------



## a_of (May 27, 2008)

You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30fps and 60fps. Hell, I can even notice between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when playing Counter-Strike with my clan a few years ago, I bought the best money could buy at that moment.




laszlo said:


> don't understand what you mean playable 50 and unplayable under 60?
> 
> all games with 30 fps are playable;human eye don't see the frames above 25 fps maybe you have some special implants (from Nvidia) and you have reached the 50 fps target as minimum,good for you and good for us who play above 30 and under 60 and we're happy with it.


----------



## magibeg (May 27, 2008)

a_of said:


> You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30fps and 60fps. Hell, I can even notice between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when playing Counter-Strike with my clan a few years ago, I bought the best money could buy at that moment.



30 and 60 you could probably notice, but I'm afraid your monitor may start to lag behind when you reach 60-80. Personally I've always found LCDs to give a crisper image than CRTs, so long as you're running at native res.


----------



## Megasty (May 27, 2008)

magibeg said:


> 30 and 60 you could probably notice but I'm afraid your monitor may start to lag behind when you reach 60-80. Personally I've always found LCD's to be more crisp of an image when compared to CRT's so long as you're running at native.



Yeah, anything over 60 fps don't matter anyway cause your monitor won't keep up. CRTs are stuck 60hz as a refresh rate while LCDs can go to 75hz but OS's & drivers keep them at 60hz anyway. Just because a card is running a game at 90+ fps don't mean you're seeing them unless you're superhuman


----------



## TheGuruStud (May 27, 2008)

Megasty said:


> Yeah, anything over 60 fps don't matter anyway cause your monitor won't keep up. CRTs are stuck 60hz as a refresh rate while LCDs can go to 75hz but OS's & drivers keep them at 60hz anyway. Just because a card is running a game at 90+ fps don't mean you're seeing them unless you're superhuman



I hope you are being sarcastic.

#1 30-60 fps sucks big time, looks horrible

#2 My cutoff in framerate is about 85 before I can't tell

#3 My refresh point is about 100 Hz

#4 I game at 1280x1024 at 100 Hz with vsync of course b/c tears blow

#5 You're using a shitty driver if you can't select 75 on an LCD

#6 You all need to get new monitors


----------



## Megasty (May 27, 2008)

TheGuruStud said:


> I hope you are being sarcastic.
> 
> #1 30-60 fps sucks big time, looks horrible
> 
> ...



You sir are superhuman & yes I was & still is being sarcastic


----------



## TheGuruStud (May 27, 2008)

Megasty said:


> You sir are superhuman & yes I was & still is being sarcastic



just superhuman in bed...


----------



## wolf (May 27, 2008)

come on people, i thought we had actually gotten over comparing a single-GPU config against a dual-GPU one.

sure enough the 4870X2 might beat a GTX 280, but given these specs i'm fairly sure that both new nvidia cards will tear the ass out of a single 4870/4850.


----------



## laszlo (May 27, 2008)

a_of said:


> You must be joking. I have been playing FPS since I have memory, and I can easily distinguish between 30fps and 60fps. Hell, I can even notice between 60 and 80. Perhaps you use a flat panel? I still use a CRT, nothing beats it for serious gaming. That's why when playing counter strike with my clan a few years ago, I bought the best money could buy at that moment.




I also use a CRT, and if the fps delivered by the video card stays above 30 without dropping below it, you can't notice it.

Refresh rate has nothing to do with fps: http://hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm


When you watch a movie at 30 fps, do you see the frames?

It's the same principle whether you play a game or watch a movie; you don't need 200 fps to run it smooth.
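Whatever each pair of eyes can or can't see, the numbers being argued about here are just frame times. A quick back-of-the-envelope (my illustration, not from the thread):

```python
# Frame time (the gap between consecutive frames) is what stutter
# complaints are really about: a 30 fps game waits twice as long
# between frames as a 60 fps one.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (24, 30, 60, 100):
    print(f"{fps:>3} fps = {frame_time_ms(fps):5.1f} ms between frames")
```

So the 30-vs-60 argument is really about whether a ~17 ms difference per frame is perceptible, which, as the thread shows, varies a lot by person and by game.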


----------



## Hayder_Master (May 27, 2008)

X-TeNDeR said:


> *Six hundred dollars!!!*  awww..
> 
> Specs look good, but man.. NVidia is hunting for fat wallets again




In my country the shops sell the WinFast 8800 for about $950. I bought a Gigabyte 8800GT for $300. The 9600 and 9800 haven't arrived yet, and if the 280 ever comes, I'll probably find it at about $1500 in the market.


----------



## trt740 (May 27, 2008)

TheGuruStud said:


> I hope you are being sarcastic.
> 
> #1 30-60 fps sucks big time, looks horrible
> 
> ...



Television-quality video is like 25 fps. If 60fps isn't good enough for you, your eyes have a problem. A sustained FPS of 30 and above is more than enough.


----------



## TheGuruStud (May 27, 2008)

trt740 said:


> Television-quality video is like 25 fps. If 60fps isn't good enough for you, your eyes have a problem. A sustained FPS of 30 and above is more than enough.



Television is bearable b/c they trick your eyes with motion blur (and a couple other things I think). If you look at any vertical object moving you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in a literal 2 mins with low refresh rate (low is below 100 haha).


----------



## trt740 (May 27, 2008)

TheGuruStud said:


> Television is bearable b/c they trick your eyes with motion blur (and a couple other things I think). If you look at any vertical object moving you can clearly see profound jerking.
> 
> If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.
> 
> And don't even get me started on refresh rate haha. I get a headache in a literal 2 mins with low refresh rate (low is below 100 haha).



sounds like a tumor just kidding


----------



## [I.R.A]_FBi (May 27, 2008)

i agree with trt


----------



## Guru Janitor (May 27, 2008)

TheGuruStud said:


> Television is bearable b/c they trick your eyes with motion blur (and a couple other things I think). If you look at any vertical object moving you can clearly see profound jerking.
> 
> If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.
> 
> And don't even get me started on refresh rate haha. I get a headache in a literal 2 mins with low refresh rate (low is below 100 haha).



He could be seeing up to 85 FPS; the AVERAGE for humans is around 30, but every person sees FPS differently.  Some might not even be able to tell that they're playing a game at 20 fps.


----------



## acperience7 (May 27, 2008)

laszlo said:


> i also use crt and if the  fps delivered by the video card is over 30 without dropping below it can't be noticed.
> 
> refresh rate has nothing to do with fps :http://hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm
> 
> ...



I can also notice a difference between 35 and 60 fps, at least in Test Drive Unlimited and Halo on the PC. Oh yeah, I use a 19'' LCD by the way; maybe that has something to do with it.


----------



## DaMulta (May 27, 2008)

So let me get this right the GT200 is not the 9900GTX now?


----------



## largon (May 27, 2008)

"GT200" doesn't exist. 
G200 = GeForce GTX 260 & 280


----------



## DaMulta (May 27, 2008)

Thank you


2 many cards are coming out.......


----------



## imperialreign (May 27, 2008)

as to all the FPS debating going on . . .


36-60 FPS is perfectly fine for slower-paced games (like Splinter Cell, Thief, most RTS, etc.), but for fast-paced games (like Doom 3 on nightmare, etc.) it's unacceptable.

As to what people can visually see, it depends on the person and on your setup - even here, I can usually tell when FPS starts dropping below 60 . . . but the only game I have an issue with "tearing" in, even over 60 FPS, is Crysis, and vsync causes too hard of a hit to enable it even with triple buffering.  I've never experienced tearing with any other game, and multi-GPU setups are a ton more susceptible to it than single GPUs.

The higher the screen res you play at, the more noticeable tearing will be - even if you have high FPS and a beefcake GPU to handle it, that's a ton more pixels that the GPU has to render and keep pace with.

Your set refresh rate has a major impact on this if you're using LCDs.  A 60Hz refresh on a native 75Hz LCD will lead to tearing or other artifacts.



But this all boils down to individual ability to perceive what's on screen - everyone is different, some more sensitive to it than others.


----------



## wolf (May 28, 2008)

ok, as for this silly fps debate...

like others have mentioned, some games can be fine at a lower fps: Crysis is perfectly playable above 30fps, and Test Drive Unlimited can be 45-50 and smooth as.... BUT;

first-person shooters are a different story.

sure enough our eyes cannot see the difference above 60hz, however your hand can feel the difference in responsiveness at 100fps as opposed to 60fps.

UT3 is a great example of this. if you run the game stock, it's capped at 62fps, which is more than smooth, don't get me wrong, but when i set max fps to 150, i usually get 90-110, and there is a HUGE difference in how fast you can react, especially in such an intense and fast-paced game as UT3.

so the short answer is maybe we can't SEE the difference, but we can FEEL the difference of faster gameplay.

thus to me, 75+ fps is always what i aim for in those kinds of games. and to get 75+ fps @ 1920x1200 +4xAA +16xAF in every game, your gfx card needs to be high end. and i will be absolutely going for the new nvidia high end when it arrives.

especially cos it will sock it to the ever more cruddy-looking 48xx series.

all they've done is added 50% more shader units, 50% more texture units, and given it faster memory.

the nvidia cards have more ROPs, more memory, more bus width, more shaders, more texture units, and a refined design of what we KNOW works. nvidia has literally made this card more beefy as opposed to trying to fill in its shortcomings.

NV FTW.


----------



## magibeg (May 28, 2008)

wolf said:


> especially cos it will sock it to the ever more cruddy-looking 48xx series.
> 
> all they've done is added 50% more shader units, 50% more texture units, and given it faster memory.
> 
> ...



Ok you were good until you said that -1


----------



## TheGuruStud (May 28, 2008)

wolf said:


> ok as for this silly fps debate...
> sure enough our eyes cannot see the difference above 60hz, however your hand can feel the difference in responsiveness at 100fps as opposed to 60fps.
> 
> ut3 is a great example of this, if you run the game stock, its capped at 62fps, which is more than smooth dont get me wrong, but when i make max fps 150, i usually get 90-110, and there is a HUGE difference in how fast you can react. especially in such an intense and fast paced game like UT3.
> ...



Refresh rate is pretty analogous to FPS. Fluorescent lights flicker to me b/c I can easily see 60 Hz (monitors are even worse; think page flipping). Even 85 Hz starts to strain my eyes after a couple minutes (that's an annoyingly fast flicker).

But I agree, a higher framerate does play better (that's how I got good at CS - 120 with vsync on). But I also don't know how anyone plays without vsync. Sure, you can take a massive performance penalty (up to 50% if your card can't handle it), but to me that's better than the entire screen tearing and disrupting my view (and pissing me off haha). Of course, my refresh is 100, so it would be 50 fps minimum and not 30
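That "up to 50% penalty" is the classic double-buffered vsync quantization: if the GPU misses a refresh tick, the frame waits for the next one, so frame rate snaps to refresh/1, refresh/2, refresh/3, and so on. A rough sketch of that behavior (illustrative only; the function name and numbers are mine):

```python
import math

def vsync_fps(raw_fps: float, refresh_hz: float) -> float:
    """Effective fps under double-buffered vsync: a new frame can only
    appear on a refresh tick, so each frame's time rounds UP to a whole
    number of refresh intervals."""
    refresh_interval = 1.0 / refresh_hz
    frame_time = 1.0 / raw_fps
    # small epsilon so an exact multiple doesn't round up a whole interval
    intervals = math.ceil(frame_time / refresh_interval - 1e-9)
    return 1.0 / (intervals * refresh_interval)

print(vsync_fps(90, 100))   # card just misses 100 Hz -> locked to 50 fps
print(vsync_fps(120, 100))  # card outruns the refresh -> capped at 100 fps
```

Which is exactly why a 100 Hz refresh gives him a 50 fps floor where a 60 Hz user would fall to 30 (triple buffering avoids most of this, at the cost of extra latency).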


----------



## wolf (May 28, 2008)

> Ok you were good until you said that -1



lol, kudos to ATi for giving Nv the competition, but they are thoroughly prawned this time around, and we will soon see about next gen. 

in any case it's each to their own, and the price is always right for both products, so there's no dumb choice.


----------



## farlex85 (May 28, 2008)

TheGuruStud said:


> Refresh rate is pretty analogous to FPS. Fluorescent lights flicker to me b/c I can easily see 60 Hz (monitors are even worse, think like page flipping). Even 85 Hz starts to strain my eyes after a couple minutes (that's an annoying fast flicker).
> 
> But I agree, higher framerate does play better (that's how I got good at CS - 120 vsync on). But I also don't know how anyone plays without vsync. Sure, you can get massive performance penalty (up to 50% if your card can't handle it), but it's better to me than the entire screen tearing and disrupting my view (and pissing me off haha). Of course, my refresh is 100 so it would be 50 fps minimum and not 30



Everybody can see fluorescent lights flicker. And yes, monitors can flicker too. 60fps is not too slow in any game, unless it is so fast-paced that things are moving 3-4 times as fast as they normally would. People don't see different Hz (at least not to any extent worth noting); people just differ in how much they care, and occasionally in how much it strains their eyes. Me, I can run a game at 25-30 fps with occasional tearing and not really mind, even in a game like Crysis. Some need to have 60fps with no tearing to be happy, and that's fine too. I personally am glad it doesn't bother me.


----------



## magibeg (May 28, 2008)

I just think its a little soon to be counting anyone out quite yet. Not only did ati double the number of shaders but they also increased the shader clock on them by a fair margin, from 775 to 1050mhz, a fair increase from any view. They also doubled their TMU's. ROP's supposedly don't have as much importance anymore when things are shader focused. Even so though, the nvidia cards listed there are MUCH more expensive than the ati cards that are coming out.


----------



## farlex85 (May 28, 2008)

magibeg said:


> I just think its a little soon to be counting anyone out quite yet. Not only did ati double the number of shaders but they also increased the shader clock on them by a fair margin, from 775 to 1050mhz, a fair increase from any view. They also doubled their TMU's. ROP's supposedly don't have as much importance anymore when things are shader focused. Even so though, the nvidia cards listed there are MUCH more expensive than the ati cards that are coming out.



I would wager it won't be much different from the way it's been recently. Nvidia will remain on top but ATI will offer very compelling price/performance. I really do hope the 4870 competes with these GTXs though; then we wouldn't have to pay such outrageous prices for them, and it would drive the whole market down. I hope *crosses fingers*.


----------



## TheGuruStud (May 28, 2008)

magibeg said:


> I just think its a little soon to be counting anyone out quite yet. Not only did ati double the number of shaders but they also increased the shader clock on them by a fair margin, from 775 to 1050mhz, a fair increase from any view. They also doubled their TMU's. ROP's supposedly don't have as much importance anymore when things are shader focused. Even so though, the nvidia cards listed there are MUCH more expensive than the ati cards that are coming out.



I bet the price will be astronomical (the current rumored price) until ATI releases, then they will be close in price according to real-world performance. They're just trying to get some bank b/c the chips cost like $130 (read that somewhere) to make each. Which is freaking insane haha.


----------



## farlex85 (May 28, 2008)

TheGuruStud said:


> I bet the price will be astronomical (current rumored price) until ATI releases, then they will be close in price according to real world performance. They're just trying to get some bank b/c they cost like $130 (read that somewhere) to make per chip. Which, is freaking insane haha.



ATI is actually supposed to release theirs 3 days before these.


----------



## TheGuruStud (May 28, 2008)

farlex85 said:


> ATI is actually supposed to release theirs 3 days before these.



Well, Idk then 

There's a marketing stunt going on or these are very bad details. *shrugs*


----------



## farlex85 (May 28, 2008)

It's just how it goes. The 8800GTX was around $600 at launch, the Ultra about $800. They didn't fall very fast either; the GTX is still $350+ (and that's what, like a year and a half old?). How can Nvidia get away with that? They have no competition up there. They are indisputably the fastest cards around. The GX2, although the fastest card in town right now, is challenged enough by the 3870X2 that the prices had to drop a decent amount. If the 4870 can compete up top, the prices would go down.

Same reason Intel can sell a Q9550 for $500+: they have no competition. If AMD was competing in the top rungs, we could get that for less than $300. That's why, although I own Intel and Nvidia, I root for AMD/ATI all the way.


----------



## imperialreign (May 28, 2008)

magibeg said:


> I just think its a little soon to be counting anyone out quite yet. Not only did ati double the number of shaders but they also increased the shader clock on them by a fair margin, from 775 to 1050mhz, a fair increase from any view. They also doubled their TMU's. ROP's supposedly don't have as much importance anymore when things are shader focused. Even so though, the nvidia cards listed there are MUCH more expensive than the ati cards that are coming out.



I agree with this statement - sure, nVidia's looks better on paper, and this new series might still topple the HD4000 (we'll have to see), but it's still recycled tech.  nVidia can only revamp and sandbag this GPU so many times, and sooner or later ATI will catch up.  What's to say that nVidia's next brand-new GPU design won't be an Achilles heel?  Besides, like I mentioned before, ATI was close on the heels of the G92 with the HD3000 series - not topping out in most games, but at least able to keep up with respectable performance.  The R700 line was designed alongside the R600, so we already have an idea of baseline performance . . . I'm just trying to say that, contrary to popular belief, the R700 could seriously surprise everyone, including nVidia.

We'll just have to see - no one here can really make 100% certain claims or judgements until we see how both camps perform.  This is going to be a heated competition the last half of this year, the likes of which we haven't seen in a couple of years now.  Both camps' offerings look superb on paper; now we just need to see how well they deliver.


----------



## wolf (May 28, 2008)

magibeg said:


> ROP's supposedly don't have as much importance anymore when things are shader focused.



i gotta disagree there. ROPs account for the sheer pixel-filling muscle, and pixel fillrate accounts for A LOT of your FPS


----------



## R_1 (May 30, 2008)

After 2 or 3 optical shrinks and 2 or 3 years from now, the G280 (or whatever future name it gets) will be a good and cheap GPU for everybody.


----------



## department76 (May 30, 2008)

$600 is fine for a high end card....  my friend bought a 7800GTX back in '05 on the week of release for $600.  that was a decent deal because the card owned everything else on the market.


----------



## Hayder_Master (Jun 1, 2008)

Don't forget the 8800 Ultra - when it first came out, it released at over $800.


----------



## candle_86 (Jun 1, 2008)

Yep, because Nvidia is top dog they get to set prices on the best cards out there. And the best card right now is the 9800GTX. The dual-GPU cards are great, but some games don't scale that well with dual graphics, and some not at all, even in this day and age. I think Nvidia realizes that the market for SLI is still small and games are still far more optimized for single-card setups than for dual options, and that's why they are making one massive GPU to do the work. In the long run it's smarter, because when they can shrink it in 10 months or so, they can again make a GX2 offshoot if they want to and make a truly awesome card. Always start off strong with the core, then worry about putting two there, because you can't always be assured that dual will perform well.


----------



## yogurt_21 (Jun 1, 2008)

department76 said:


> $600 is fine for a high end card....  my friend bought a 7800GTX back in '05 on the week of release for $600.  that was a decent deal because the card owned everything else on the market.



i hope that was a 512mb version otherwise ouch.


----------

