# R700 up to 80% Faster than GeForce GTX 280



## btarunr (Jul 10, 2008)

Pre-release performance evaluations suggest that the Radeon HD 4870 X2 2GB GDDR5 model will, on average, be 50% faster than the GeForce GTX 280, and in some tests up to 80% faster. A second model, the HD 4850 X2 (2GB GDDR3 memory, 2x RV770Pro), will also convincingly outperform the GeForce GTX 280. The R700 series will be brought to market from late July through August.

*View at TechPowerUp Main Site*


----------



## zOaib (Jul 10, 2008)

yessir !!!!!!

and don't forget cheaper !!!!!


----------



## [I.R.A]_FBi (Jul 10, 2008)

owned ... can you say owned?


----------



## magibeg (Jul 10, 2008)

Oh man, that card sounds like it's going to be a beast. I wonder if AMD/ATI will crank up the price on it because of the performance gap. Although I still want to see benchmarks before I entirely believe the numbers.


----------



## IcrushitI (Jul 10, 2008)

Some intelligent guesses: if the 4870 X2 goes for hopefully $499, will that make the 4850 X2 $399? I was going for the 4870 but can wait a few more weeks. Personally I like the $100 jumps. I still say we were all getting ripped off at the $700 mark for the high end, but thankfully the consoles helped these two outfits realize the younger crowds were leaving, especially when you can get an Xbox now with the price drop to $299. My son told me that was his next buy if the vid cards stayed as high as they were. He brought home his friend's Xbox and I have to agree, on my big HDTV it didn't look too shabby at all. He teased me with COD4, and just about had me convinced to buy one until the price drop. I hope the marketing personnel from the Red and Green teams read these forums.


----------



## hAKtivate (Jul 10, 2008)

I want to see in-game verification!  Then I want to see how this thing looks inside my case...


----------



## Spacegoast (Jul 10, 2008)

Heard the price tag would be more than the GTX280, at $549. Still worth it since it outperforms it by far.


----------



## X-TeNDeR (Jul 10, 2008)

one word: *PWND! *

At last the answer from ATI is here.. and I'm so happy.
Now we need some nice new Phenoms to go with our 48XX cards.. hey AMD, can you hear me?


----------



## btarunr (Jul 10, 2008)

If it's true....words cannot express the awesome!


----------



## Darkrealms (Jul 10, 2008)

So then what would the result of a 280 X2 be?  Wouldn't we figure that would be Nvidia's response to this?
Good deal at $550 compared to the current competition.


----------



## Spacegoast (Jul 10, 2008)

Darkrealms said:


> So then what would the result of a 280 x2 be?  Wouldn't we figure that would be Nvidias responce to this?
> Good deal for $550 compared to the current competition.



Nvidia is supposed to come out with a "B" revision of their current GPUs, but some new info suggests a new lineup is in the works. Perhaps the GTX3?? series?


----------



## Darkrealms (Jul 10, 2008)

Spacegoast said:


> nvidia is supposed to come out with "B" revision of thier curretn gpu's. but some new info suggests a new line up is in the works. perhaps the GTX3?? series?


Yeah, NVIDIA Preparing GT300 Graphics Processor?
I'm thinking they may just do a 280 X2 and call it good for this round, or for that matter use the limited supplies (I'm sure they've already made) of the 280b (it's just an example number/lettering) and X2 that, because sales won't be that high anyway at the most likely price point.


----------



## newbielives (Jul 10, 2008)

So when will NVidia become the underdog so we can start rooting for NVidia again?


----------



## flashstar (Jul 10, 2008)

Darkrealms said:


> Yeah, NVIDIA Preparing GT300 Graphics Processor?.
> I'm thinking they may just 280 x2 and call it good for this round, or for that matter use the limited supplies (I"m sure they've already made) of the 280b (its just an example number/lettering) and x2 that because sales won't be that high anyway with the most likely price point.



It would be very hard to cool a 280 x2 because the 280 core is so large and hot. Nvidia might be able to release a special water-cooled 280 or a slowed-down 260 X2.


----------



## magibeg (Jul 10, 2008)

I recommend just rooting for the people who have the best products for the cost.


----------



## Darkrealms (Jul 10, 2008)

newbielives said:


> So when will NVidia become the underdog so we can start rooting for NVidia again


LoL, I honestly don't think a company has to be the underdog to learn from their mistakes.  I'm just hoping Nvidia will.  After all, I was wrong about them suing over CUDA/PhysX.





flashstar said:


> It would be very hard to cool a 280 x2 because the 280 core is so large and hot. Nvidia might be able to release a special water-cooled 280 or a slowed-down 260 X2.


That doesn't mean they won't try.  Notice I said I had no clue what price point it would have.  Remember the 6800 Ultras were $999 at one point @_0


----------



## rampage (Jul 10, 2008)

If it's true (which I assume it will be) it's great news. Too bad my GTX died just as the GTX 280 was released, and I didn't want to have to wait, so I got the 280.


----------



## PVTCaboose1337 (Jul 10, 2008)

ATI wins.  I am quite surprised, as that is a HUGE increase.


----------



## Darkrealms (Jul 10, 2008)

PVTCaboose1337 said:


> ATI wins.  I am quite surprised, as that is a HUGE increase.


I agree with that statement if this is true.  That would make it the fastest current card.
But if Nvidia managed a 280 X2, I think it would lose to that.


----------



## Kei (Jul 10, 2008)

This entire round of cards (from both sides of the fence) is simply insane, but these two cards on the horizon are unspeakable forces of nature or something! I think I'm afraid of the numbers they might produce.




X-TeNDeR said:


> one word: *PWND! *
> 
> At last the answer from ATI is here.. and i'm so happy
> Now we need some nice new phenoms to go with our 48XX cards.. hey AMD, can you hear me?



No offense...but do you own a Phenom...or a 4xxx series card? I see your sig mentions you have a Spider, but your specs don't show anything that is Spider at all? 

K


----------



## FatForester (Jul 10, 2008)

I'm getting sick of all the talk - let's see some numbers!


----------



## yogurt_21 (Jul 10, 2008)

Sigh, more speculation. While this has a good likelihood of being true, I like real benches, especially when the words "up to" are used. lol


----------



## DarkMatter (Jul 10, 2008)

flashstar said:


> It would be very hard to cool a 280 x2 because the 280 core is so large and hot. Nvidia might be able to release a special water-cooled 280 or a slowed-down 260 X2.



It's not much hotter than the HD4870. In fact it's cooler, but it does consume a bit more. Not much really under normal usage: it has a higher peak consumption but consumes less on average, and dual-GPU cards never have both GPUs under full load. Also, Nvidia just needs a GTX260 GX2 with higher clocks; it doesn't need a GTX280 GX2 to be on top.


----------



## Deleted member 24505 (Jul 10, 2008)

Some people say that because of this competition, we win. But that is not true, because Nvidia and ATI are competing so fiercely now, they are releasing new cards more regularly. And we are buying them more regularly, giving them even more of our dosh.

I have not had my 3850 very long and I am about to change it for a 4850.


----------



## FelipeV (Jul 10, 2008)

And if ATI solved the Quad CF problem, we are going to see some really nice numbers.


----------



## Darkrealms (Jul 10, 2008)

DarkMatter said:


> It's not too much hotter than HD4870. In fact it's cooler, but it does consume a bit more. Not much really under normal usage. It has a higher peak consumption but consumes less on average. Dual GPU cards never have both cards under full load. Also Nvidia just needs a GTX260 GX2 with higher clocks. Does not need GTX280 GX2 to be on top.


As true as that may be. . .   It doesn't mean I don't want to see what a 280 x2 would be capable of ; )


----------



## Kei (Jul 10, 2008)

tigger69 said:


> Some people say because of this competition,we win.But that is not true,because nvidia and ati are competeing so fiercely now,they are releasing new cards more regularly.And we are buying them more regularly,giving them even more of our dosh.
> 
> I have not had my 3850 very long and i am about to change it for a 4850.



Ha, c'mon tigger, we don't HAVE to buy the cards...you just want one, that's not their fault 

K


----------



## ShinyG (Jul 10, 2008)

This might be just hype, but under any circumstances, we have competition! 

We have a saying in Romania for the situation nVidia is in now: "nVidia simte cum ii intra morcovul in cur!" which can be translated as: "nVidia can feel the carrot going up its azz!"

I hope they have the power to come back and fight, because I'm loving this competition thing!


----------



## Jansku07 (Jul 10, 2008)

*IF* NVIDIA launches a GTX280X2 or GTX260X2 it will cost like hell, and may not even be that much faster. NVIDIA must now lower their top card's price and quickly launch G200b.


----------



## DarkMatter (Jul 10, 2008)

tigger69 said:


> Some people say because of this competition,we win.But that is not true,because nvidia and ati are competeing so fiercely now,they are releasing new cards more regularly.And we are buying them more regularly,giving them even more of our dosh.
> 
> I have not had my 3850 very long and i am about to change it for a 4850.



Don't buy it then. 

lol you sound like an alcoholic who seems to have been locked up by force in a cellar.


----------



## DarkMatter (Jul 10, 2008)

Jansku07 said:


> *IF* NVIDIA launches GTX280X2 or GTX260X2 it will cost like hell.



GTX260 is pretty much at the same price as HD4870...


----------



## btarunr (Jul 10, 2008)

The world doesn't buy stuff from Newegg. Here the GTX 260 has only gone $15 below its launch price.


----------



## Deleted member 24505 (Jul 10, 2008)

DarkMatter said:


> Don't buy it then.
> 
> lol you sound like an alcoholic who seems to have been locked up by force in a cellar.


 

Well, I suppose there are always the rich boys to keep their pockets lined.


----------



## DarkMatter (Jul 10, 2008)

btarunr said:


> The world doesn't buy stuff from Newegg. Here it's only gone $15 less than launch price....GTX 260.



Don't take offense, but India is not precisely the biggest market out there, not one that big companies would care a lot about. In the US and all of Europe the GTX260 IS at almost the same price, AFAIK. Here in Spain it is, at least. I have seen Palit's and Gigabyte's GTX260 for 230 euros VAT incl.; the HD4870 sells for 215-230, and I couldn't find one for less. I can't post the prices because the ones I'm talking about are retail, not etail. All the other GTX260s that I found are around 250 or above, but all of them are heavily overclocked cards. Because Nvidia is more flexible with partners, they overclock a lot and their prices are very loose, but you can always have one (even OCed) for a very good price. The same happened with the 8800 GT: the average price of the GT was significantly higher than the HD3870, but I bought mine for 203 euros, when the average GT was 250 and the HD was 220.


----------



## btarunr (Jul 10, 2008)

DarkMatter said:


> Don't take offense, but India is not precisely the biggest market out there, one that big companies would care a lot of. In US and entire Europe the GTX260 IS almost at same price AFAIK. Here in Spain it is at least. I have seen Palit's and Gigabyte's GTX260 for 230 euros VAT incl. HD4870 sells for 215-230, I couldn't find one for less. I can't post the prices because the ones I'm talking about are retail, not etail. All the other GTX260 that I found are around 250 or above, but all of them are heavlily overclocked cards. Because Nvidia is more flexible with partners, they overclock a lot and their prices are very loose, but you can always have one (even OCed) for very good price. With 8800 GT happened the same, average price of the GT was significantly higher than HD3870 but I bought mine for 203 euros, when average GT was 250 and HD was 220.



The fact that the release prices here are (nearly) on par with release prices in the US shows that. Just as you sit in the Basque Country telling me the prices (of the GTX260 and 4870) are on par, I can say that's not so in all markets, but cards when launched are priced on par with company-specified MSRPs. So you can't generalise.

And oh...our price bands are close to those of China's, two really huge populations. You don't have such huge price cuts in China either.


----------



## robodude666 (Jul 10, 2008)

How many nVidia supporters do you think will say it doesn't count because it's not 1 GPU?


----------



## DarkMatter (Jul 10, 2008)

btarunr said:


> The fact that the release prices here are on-par (nearly) with release prices in the US shows that. Just as you sit in Basaque Country telling the prices are on par (of GTX260 and 4870), I can say that's not with all markets, but cards when launched are priced on-par with company-specified MSRPs. So you can't generalise.



Well, you can't generalise either: at the biggest US etailers the price is close, in Germany and surrounding countries it's close, and in Spain (politically the Basque Country is Spain, but not at heart, long to explain) it's close. And in case you were wondering, I've read somewhere that Spain was the 3rd biggest electronics consumer country in Europe and I think it was 9th in the world. I don't know if that survey was reliable anyway, so don't quote me, but you get the idea.


----------



## btarunr (Jul 10, 2008)

I didn't generalise, I said "The world doesn't buy from Newegg", you said "The prices of GTX 260 and HD4870 are nearly the same". We could agree to disagree.


----------



## candle_86 (Jul 10, 2008)

Well, in the US market they are the same price, and I'd take the GTX 260 over the 4850 any day.


----------



## Millenia (Jul 10, 2008)

candle_86 said:


> well in the US market they are the same price and id take the GTX 260 over the 4850 anyday



I guess you would take a GTX 260 over a 4850 since they're in a different performance class 
Did you mean to type 4870?

Anyways, it seems that at Newegg the GTX260 costs at least 50 bucks more than a 4870 while performing similarly? The difference is even bigger here in Finland: 230€ vs 290€ for the cheapest offers I could find. 60€ (nearly 100 bucks) is quite a bit of cash only for the sake of fanboyism.

And bloody hell, you're the worst Nvidia troll I've ever seen, you're always bashing AMD/ATI in every single thread concerning either Radeons or GeForces - even when they're CLEARLY better D:


----------



## candle_86 (Jul 10, 2008)

You moron, I'm running CrossFire, but I would go Nvidia; they have something that the RV770 doesn't have: overclocking headroom. And these special RV770 cards are gonna cost a shitload, so I'll take the card that's close in price and OCs like a mother.


----------



## btarunr (Jul 10, 2008)

No name-calling please. Have respect towards each other.


----------



## chron (Jul 10, 2008)

candle_86 said:


> you moron im running crossfire, but i would go Nvidia they have something that the R770 doesnt have, overclocking headroom and these special R770 cards are gonna cost a shit load, so ill take the card thats close in price and OC's like a mother



Yeah, but ATI has something Nvidia doesn't have (correct me if I'm wrong): dynamic voltage control (which in turn gives better overclocking with proper cooling).


----------



## DarkMatter (Jul 10, 2008)

btarunr said:


> I didn't generalise, I said "The world doesn't buy from Newegg", you said "The prices of GTX 260 and HD4870 are nearly the same". We could agree to disagree.



I didn't understand the sentence then; I thought you were saying Newegg was the only place where prices were similar. I didn't generalise either, I just mentioned Spain to say that I was sure about prices here; no honest person would claim to know how prices are elsewhere, sorry if I sounded that way. Usually prices here track those in the US and the rest of Europe. That's why I mentioned it.


----------



## candle_86 (Jul 10, 2008)

chron said:


> yeah but ati has something nvidia doesnt have (correct me if I'm wrong) : dynamic voltage control (which in turn gives better overclocking with propper cooling)



Yea, but their stock cooler is crap, so that feature is worthless and is just a way for most people to damage something. I don't volt my video cards; I OC at stock volts, and I care about who has the headroom at stock voltage with stock cooling, and IMO ATI doesn't have either.


----------



## Deleted member 24505 (Jul 10, 2008)

I go for the cheapest card with some goooo. I usually change the cooler from stock anyway, so I don't care if the stock cooler is loud/crap.

I don't see the point of buying the high-end uber cards when they will just replace them in a few months, making your card worth less than you paid for it (sometimes).


----------



## Millenia (Jul 10, 2008)

tigger69 said:


> I go for the cheapest card with some goooo.I usually change the cooler anyway from stock so i dont care if the stock cooler is loud/crap.
> 
> I dont see the point buying the high end uber cards when they will just replace it a few months making your card worth less than you payed for it(sometimes)



Usually yeah, though the 8800 GTX was the king of the hill for quite a while, like 1½ years


----------



## johnnyfiive (Jul 10, 2008)

ATi FTMFW.


----------



## chron (Jul 10, 2008)

candle_86 said:


> yea but there stock cooler is crap so that feature is worthless and its just a way for most people to damage something, i dont volt my video cards, i OC at stock volts and i care who has the headroom at stock voltage with stock cooling and IMO ATI doesnt have either



Actually the stock fans aren't crap, the stock fan SETTINGS are.  And to volt mod an ATI card all you need to do is stick a small case fan over the VRM heatsink, while Nvidia requires soldering.

I can overclock my 8800GT to 700, but I have to turn the fan to 100% or games like COD4 and Crysis will lock up.  With my X1800GTO from back in the day, I hit 700/700 from 500/500 AND unlocked it to 16 pipes, all on stock cooling, but with an 80mm fan over the VRM heatsink.


----------



## vojc (Jul 10, 2008)

X-TeNDeR said:


> one word: *PWND! *
> 
> At last the answer from ATI is here.. and i'm so happy
> Now we need some nice new phenoms to go with our 48XX cards.. hey AMD, can you hear me?



True, we need better Phenom CPUs, or else I go blue


----------



## vojc (Jul 10, 2008)

candle_86 said:


> yea but there stock cooler is crap so that feature is worthless and its just a way for most people to damage something, i dont volt my video cards, i OC at stock volts and i care who has the headroom at stock voltage with stock cooling and IMO ATI doesnt have either



Please read some news first to see the upcoming stock (or non-stock, if you wish) coolers.


----------



## Millenia (Jul 10, 2008)

vojc said:


> True, we need better phenom CPUs, or else i go blue



Honestly, you don't need THAT much out of a CPU for gaming nowadays; even a mid-range dual core will do more than enough.

Kinda funny, as I would probably have more actual use for a quad as a 3D artist than most of TPU, yet I don't have one


----------



## DarkMatter (Jul 10, 2008)

Millenia said:


> Kinda funny as I would probably have more actual use for a quad as a 3d artist than most of TPU yet I don't have one



+1. I'm in the same circumstances. LOL. But we don't work at home anyway, do we?

I don't, and I want to...


----------



## JC316 (Jul 10, 2008)

Now that is ownage. I just want AMD to come out with a chip that can do that to Intel. That should keep the company alive for a while longer. Go ATI!


----------



## Darkrealms (Jul 10, 2008)

Darkrealms said:


> PVTCaboose1337 said:
> 
> 
> > ATI wins.  I am quite surprised, as that is a HUGE increase.
> ...





robodude666 said:


> How many nVidia supports do you think will say it doesn't count because its not 1 GPU?


Ouch, see my above statement . . .


----------



## Selene (Jul 10, 2008)

I'm glad this will bring the 260/280 prices down even more; also, with the 55nm 260/280s coming, look out ATI.
I'm sorry, card for card NV owns. Yeah, the prices were out of whack, but that's being fixed. It just goes to show that being the first to have new stuff does indeed come at a price!


----------



## [I.R.A]_FBi (Jul 10, 2008)

candle_86 said:


> yea but there stock cooler is crap so that feature is worthless and its just a way for most people to damage something, i dont volt my video cards, i OC at stock volts and i care who has the headroom at stock voltage with stock cooling and IMO ATI doesnt have either



why do you have to be so openly ignorant?


----------



## X-TeNDeR (Jul 10, 2008)

Kei said:


> No offense...but do you own a Phenom...or a 4xxx series card? I see your sig mentions you have a spider but you specs don't show anything that is spider at all?
> 
> K



None taken. I had a 9500 B2 Phenom for a while.. then had to sell it for a better rig overall, then got stuck with a budget, so I got the cheapest rig for now, with a Phenom 9850BE on the way. My current rig is satisfying my needs for now, so no rush.


----------



## sam0t (Jul 10, 2008)

Just a quick check on ATI vs Nvidia prices from small Finland:

4850: 149e > 230$
4870: 230e > 354$

GTX 260: 286e > 440$
GTX 280: 399e > 614$
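The euro-to-dollar figures above work out to an exchange rate of roughly 1.54 USD per EUR (plausible for mid-2008). A quick sketch of the conversion; the rate is an assumption inferred from the post's own numbers, not an official figure:

```python
# EUR -> USD conversion for the prices listed above.
# EUR_TO_USD is an assumption (~1.54, back-calculated from the post).
EUR_TO_USD = 1.54

prices_eur = {
    "HD 4850": 149,
    "HD 4870": 230,
    "GTX 260": 286,
    "GTX 280": 399,
}

for card, eur in prices_eur.items():
    # Round to the nearest dollar, as in the post
    print(f"{card}: {eur}e -> ~${eur * EUR_TO_USD:.0f}")
```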


----------



## vojc (Jul 10, 2008)

Yep, prices in Europe are not buyer-friendly.


----------



## NympH (Jul 10, 2008)

[I.R.A]_FBi said:


> owned ... can you say owned?



OWNED!


----------



## wolf2009 (Jul 10, 2008)

I think the 4870 X2 could be equal to a GTX 280 X2, since they did that chip which provides better efficiency than other dual-GPU solutions.


----------



## SK-1 (Jul 10, 2008)

First time ATI has been on top in a LONG while...the X1900 XTX comes to mind.
Good for ATI!


----------



## chron (Jul 10, 2008)

SK-1 said:


> First time ATI has been on top in a LONG while,...x1900xtx comes to mind.
> Good for ATI!



Yeah, the X1900 XTX was on top for a while, but then the 7900 GTX came out and did better for like 100 bucks less.   Why does that seem like a lifetime ago?


----------



## MilkyWay (Jul 10, 2008)

A GTX 280 X2 prolly will be a little faster, by 10 or 15%, but efficiency might go to ATI.

If it's 80% faster than one GTX 280, the dual version of the 280 is bound to be faster.


----------



## wolf2009 (Jul 10, 2008)

MilkyWay said:


> GTX 280 x2 prolly will be a little faster by 10 or 15% but efficiecy might go to ati.
> 
> If its 80% faster than 1 GTX 280, the dual version of the 280 is bound to be faster.



Two 280s may not be as efficient.


----------



## TooFast (Jul 10, 2008)

chron said:


> yeah the x1900xtx was on top for a while, but then the 7900gtx came out and did better for like 100 bucks less.   Why does that seem like a lifetime ago?


Don't forget the X1950 XTX was the king at that time


----------



## WarEagleAU (Jul 10, 2008)

Great news and whatnot, but it's taking two cores to top one Nvidia GPU. I'd love to see ATI get a one-core stomper out there.


----------



## Megasty (Jul 10, 2008)

Now that's being pwned to death. NV needs to dump that GT200 completely. If they don't watch it, their next-gen card won't even be as fast as the 4870 X2 and yet cost just as much, if not more, than the GTX280. I thought the thing would pwn, but this is ridiculous.


----------



## VanguardGX (Jul 10, 2008)

WarEagleAU said:


> Great news and what not, but its taking two cores to top on Nvidia gpu. Id love to see ATI get a one core stomper out there.



I was wondering how long it would take for someone to mention that. So what if it has 2 cores? It's the way of the future, just like a dual-core processor.


----------



## Megasty (Jul 10, 2008)

VanguardGX said:


> I was wondering how long it would take for someone to mention that So what if it has 2 cores? Its the way of the future just like a dual core processor.



That really wasn't his point; ah, it doesn't matter 

ATi pretty much has a game plan. They're not focusing as much on NV as NV is on them, for obvious reasons. If ATi continues on this path, they will have a single GPU that eats all comers. The 4870 is more than twice as powerful as a 3870. It's ridiculous to think that their next GPU will be 2X+ as powerful as a 4870, but the possibility is there. ATi also has rumored plans for a dual-core GPU, and if it takes something like that to take the 'single' GPU crown, then so be it. But the last thing ATi will do right now is stray from their game plan (architecture) when it's finally starting to bear sweet fruit.


----------



## swaaye (Jul 10, 2008)

How about we compare R700 (2xRV770) to GTX 280 SLI?  Price is a bit different though I guess lol.

BTW, some of you really need to understand GPU architecture a bit better. R700 is not more efficient than a design based on a single giant GPU would be. At least, not for pure 3D performance. The fastest GPU design, due to how parallel 3D rendering is, is always a single GPU. Dual GPUs waste RAM and have to deal with inefficiencies caused by trying to split the tasks and communicate via a pathetically slow bridge chip. There is extra hardware and hardware performing redundant tasks. And the drivers have to be specially set up for every game basically (this is conveniently ignored by most people for some reason.)

The problem is that manufacturing technology cannot cope with mega-huge GPUs. That is why GT200 can't clock as high as G92. The bigger the chip gets with more transistors, the hotter it is and the more complex it becomes to make it stable at higher clock speeds. If you look back at how early GPUs barely needed fans compared to today's ridiculous furnaces, you see that manufacturing is way behind what competition has pushed GPUs to become.

R700 and 9800GX2 are designed to overcome manufacturing inadequacies in the only way possible. They also conveniently allow an entire lineup to be based mainly on a single GPU design. It's just important to realize that this is not the optimum way to go for performance.

Also, realize that there is potential for a refreshed GT200 to be vastly faster. If they shrink it down and tweak it, and this allows it to clock up decently higher, a dual GT250 (or whatever) could be a lot faster than R700.
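swaaye's scaling argument can be put into rough numbers with a toy AFR (alternate frame rendering) model. Everything below, frame rates and scaling factors alike, is an illustrative assumption, not a benchmark:

```python
# Toy model of dual-GPU (AFR-style) scaling vs. a single GPU.
# All numbers are made-up assumptions for illustration only.

def multi_gpu_fps(one_gpu_fps, n_gpus=2, scaling=0.8):
    """Each extra GPU adds only a fraction of its theoretical throughput;
    the rest is lost to bridge traffic, redundant work, and the need for
    a per-game driver profile."""
    return one_gpu_fps * (1 + (n_gpus - 1) * scaling)

small_gpu = 60.0  # hypothetical single RV770-class GPU

print(multi_gpu_fps(small_gpu, scaling=0.8))  # good driver profile: ~108 fps
print(multi_gpu_fps(small_gpu, scaling=0.0))  # no profile (the EQ2 case): ~60 fps
```

With no working profile the second GPU contributes nothing, which is exactly the 7950GX2-in-Everquest-2 situation described above.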


----------



## candle_86 (Jul 10, 2008)

TooFast said:


> dont forget the x1950 xtx was the king in that time



nope, the 7950GX2 took the DX9 crown


----------



## swaaye (Jul 10, 2008)

candle_86 said:


> nope, the 7950GX2 took the DX9 crown



When it worked, and it didn't always work. The critical flaw of dual GPU cards = drivers and whether the game engine works with the method used to split the work across the two GPUs. For example, in Everquest 2, 7950GX2 was only as fast as one of its boards (=7900GT).

Besides, who can really claim a GF7 or X19xx as the crown of anything? Current cards run DX9 just fine and definitely look a hell of a lot better doing it than GF7; GF7's image quality was not all that great.


----------



## brian.ca (Jul 10, 2008)

DarkMatter said:


> It's not too much hotter than HD4870. In fact it's cooler, but it does consume a bit more. Not much really under normal usage. It has a higher peak consumption but consumes less on average. Dual GPU cards never have both cards under full load. Also Nvidia just needs a GTX260 GX2 with higher clocks. Does not need GTX280 GX2 to be on top.



I haven't looked too deeply into this or anything but 2 points about what you just said,

1) Wasn't the default fan speed set lower than it needed to be?   From what I read, there was a simple fix for this, and just turning the speed up was supposed to help a lot with the heat situation without too much cost in noise.  If that is the case, early benchmarks measuring heat don't really speak to the thermals of the chip itself.

2) It was my understanding from various posts that the issue with the 4870's idle power draw is that it's not downclocking properly for 2D mode, so it continues at full-strength readiness even when you're just reading emails.  I believe this is just a driver issue, and ATI said it will be resolved with a new Catalyst release.  So saying that the RV770 on average draws more power seems like a faulty argument: it draws less at max and currently draws more at idle, but once fixed via a driver I'm not sure there's much reason to think it wouldn't draw less power at idle as well.  I'm assuming your average was just (load + idle) / 2; if that's the case, as soon as that's fixed the numbers should change significantly.
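Point 2 is easy to see with placeholder numbers: if the quoted "average" is just (load + idle) / 2, a driver fix to the 2D downclock shifts it substantially. The wattages below are hypothetical, chosen only to illustrate the arithmetic:

```python
# The simple average assumed in the post: (load + idle) / 2.
# All wattages are hypothetical placeholders, not measurements.

def avg_draw(load_w, idle_w):
    """Naive average power draw across a load state and an idle state."""
    return (load_w + idle_w) / 2

print(avg_draw(160, 90))  # broken 2D downclock: high idle pushes the average up
print(avg_draw(160, 40))  # driver-fixed downclock: same load, much lower average
```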


----------



## purecain (Jul 10, 2008)

The difference in this multi-GPU situation is that ATI designed their chip to work in a multi-chip config from the very beginning... Nvidia did not.... 
come on ATI.....


----------



## TheGuruStud (Jul 10, 2008)

*waits for tweaked 280 with 800 core*


----------



## purecain (Jul 10, 2008)

*waits for tweaked BIOS for 4870 with 900 core* lol....


----------



## TheGuruStud (Jul 10, 2008)

purecain said:


> waits for tweaked bios for 4870 with 900core lol....



THEN YOUR HOUSE CATCHES ON FIRE!!!!


----------



## LiveOrDie (Jul 10, 2008)

Sounds sweet, just hope it's not hyped up like the new Nvidia cards were


----------



## purecain (Jul 10, 2008)

TheGuruStud said:


> THEN YOUR HOUSE CATCHES ON FIRE!!!!



Hardly, the core never goes above 58°C even at 840MHz...

the fan issues have given this card bad press....


----------



## imperialreign (Jul 10, 2008)

swaaye said:


> How about we compare R700 (2xRV770) to GTX 280 SLI?  Price is a bit different though I guess lol.
> 
> BTW, some of you really need to understand GPU architecture a bit better. R700 is not more efficient than a design based on a single giant GPU would be. At least, not for pure 3D performance. The fastest GPU design, due to how parallel 3D rendering is, is always a single GPU. Dual GPUs waste RAM and have to deal with inefficiencies caused by trying to split the tasks and communicate via a pathetically slow bridge chip. There is extra hardware and hardware performing redundant tasks. And the drivers have to be specially set up for every game basically (this is conveniently ignored by most people for some reason.)
> 
> ...




I would tend to agree, but we must also keep in mind that there are reports that ATI is binning the RV770 GPUs, holding onto the better cores for higher-end models . . . I think we can expect to see a second 4870 X2 release with better components should nVidia come close to countering its performance.

Also, although it might be 2 GPUs on one PCB, each GPU has its own DRAM; AFAIK neither GPU is sharing memory, which eliminates a lot of the inefficiencies of the 3870 X2. Include the higher bandwidth of the GDDR5, and a lot of the microstutter associated with multi-GPU setups would be practically eliminated (that was a big issue with the 3870 X2 versus crossfired 3870s: although the 70x2 had higher frame rates, minimum frame rates were lower than with two separate cards).


The big point of this whole debate at this point . . . the 4870 X2 is a single-card, dual-slot solution . . . to best it with nVidia hardware ATM would require 2 cards, totalling 4 hardware slots you would have to sacrifice, and paying nearly twice as much out of pocket compared to the 4870 X2 price . . . if we want to look at it like that, for the same price you could purchase 2 4870 X2s, sacrifice the same 4 slots, and PWN THA LIVIN HELL out of any nVidia setup currently on the market.  Sadly, due to nVidia's GPU design, they can't release anything similar to compete with it like the 9800GX2 just yet.


Just 'cause it takes ATI 2 cores to best 1 nVidia core doesn't really mean squat anymore; ATI brought the cake with the RV770, proving they've still got game.  Hell, sometimes it takes more than one challenger to beat down another competitor . . . I don't recall anyone crying foul when the UK, USSR, and the USA teamed up to pwn some Nazis back in the day


----------



## brian.ca (Jul 10, 2008)

swaaye said:


> How about we compare R700 (2xRV770) to GTX 280 SLI?  Price is a bit different though I guess lol.
> 
> BTW, some of you really need to understand GPU architecture a bit better. R700 is not more efficient than a design based on a single giant GPU would be. At least, not for pure 3D performance. The fastest GPU design, due to how parallel 3D rendering is, is always a single GPU. Dual GPUs waste RAM and have to deal with inefficiencies caused by trying to split the tasks and communicate via a pathetically slow bridge chip. There is extra hardware and hardware performing redundant tasks. And the drivers have to be specially set up for every game basically (this is conveniently ignored by most people for some reason.)
> 
> ...



When you say, "communicate via a pathetically slow bridge chip," are you talking about the PLX chip?  It was my understanding that on the 4870X2 the PLX chip will not handle communication between the two GPUs - that's supposed to be done through the side port or some other means, which is meant to fix some of the issues people had with the 3870X2.  There is still a PLX chip, but from what I read it's needed for another function (splitting the PCI-E lanes in two to divide them between the two chips, and reversing that process in the opposite direction).

About the other stuff, I really think most people don't care too much about how performance is achieved (i.e., its efficiency, how many chips to a card, etc.) so much as that it is achieved and how much it costs.  As long as there are no substantial issues resulting from the method, I'd have to agree that it shouldn't really matter.  I haven't owned any X2 cards to experience it first hand, but that micro-stuttering issue I've heard about sounds like it would be such an issue.  If that's fixed now without causing a new issue, then great.  Otherwise you have to consider that you get what you pay for and decide what's important to you. E.g.: Would a single 280 give you easier performance? Sure. Will it get you as much? Apparently not.  If you're dead set on the extra performance, is the cost in inconvenience/issues worth the money you save vs. other solutions? Etc.

Also, one other thing I don't get... NV's 8800GT and ATI's 3870 weren't too far apart in release times, and now with both companies releasing new cards the release dates are again close (if not closer).  So why do people talk about the potential for a die shrink of the GTX 200 as if ATI doesn't have the capability to counter over that same time frame?


----------



## yogurt_21 (Jul 10, 2008)

The 4870X2 should be a monster, but based on the 9800GX2 I see no reason why a GTX260X2 isn't possible. It's not like they have to worry about putting 2 cores on the same PCB; just use 2 PCBs like on the 9800GX2. So core size wouldn't matter, and heat, eh, that can be dealt with.

A GTX280X2 could happen heat- and power-wise (considering the power and heat of the GTX280 vs the 4870), but the cost of the card would be high, and given that the extremist market is small, I don't see too many selling. A dual GTX260 could probably hit that $500 mark and be competitive. Then you just drop the price of the GTX280 to $400 and be set. Then ATI would have to release a 4850X2 to counter the GTX280's price point. It'd be a nice little competitive market.


----------



## OzzmanFloyd120 (Jul 10, 2008)

Darkrealms said:


> So then what would the result of a 280 x2 be?



$1500 USD... or one soul.


----------



## imperialreign (Jul 10, 2008)

OzzmanFloyd120 said:


> $1500 USD... or one soul.





Exactly . . . and for about $1000 - you could have 4 RV770 GPUs sitting on 2 PCBs.


----------



## tkpenalty (Jul 10, 2008)

A GTX280X2 is impossible. Why? HUUUUUUUUUGE heat output, and the fact that the BGA solder balls on the packages are fragile. The only reason the GTX280 has "low temps" is the cooler used on it; it's basically better than anything else out there in terms of cooling at the moment. Well... good luck fitting all that into TWO slots. 

An HD4870X4 would require some smart engineering, but I'd think an HD4870X3 would be enough anyway. They would have to reposition the memory banks, etc.... not very easy. But it's possible.


----------



## mlupple (Jul 10, 2008)

magibeg said:


> I recommend just rooting for the people who have the best products for the cost


If everyone thought like you, we'd all have to root for Wal-Mart and Chinese-made products.  Then they'd have an absolute monopoly since nobody would shop anywhere else, and then they'd fuck us.  Supporting the underdog to keep competition alive is what makes this country roll!


----------



## GPUCafe (Jul 10, 2008)

Came here since I got a linkback from here.

We actually posted this two weeks earlier.

Performance claims are 30% faster than the GTX 280 overall (best-case scenario: 70% faster).
Target driver for reviews is Catalyst 8.7. Claimed Quad CrossFireX gains of up to 300% over a single 4870.
http://gpucafe.com/?p=12

Needless to say, our Chinese friends are faster. 

And I've got some GT200b info that I've been sitting on for a couple of days. Not going live with it because the numbers look flaky. :shadedshu


----------



## magibeg (Jul 11, 2008)

mlupple said:


> If everyone thought like you, we'd all have to root for Wal-Mart and Chinese-made products.  Then they'd have an absolute monopoly since nobody would shop anywhere else, and then they'd fuck us.  Supporting the underdog to keep competition alive is what makes this country roll!



I thought it was capitalism that makes the US roll?


----------



## imperialreign (Jul 11, 2008)

magibeg said:


> I thought it was capitalism that makes the US roll?



no . . .

a roll is a roll

and a toll is a toll

and if the US don't get no tolls

then we don't get no rolls


----------



## OzzmanFloyd120 (Jul 11, 2008)

... like dinner rolls?


----------



## lemonadesoda (Jul 11, 2008)

2 billion transistors and a lot of watts


----------



## SK-1 (Jul 11, 2008)

imperialreign said:


> no . . .
> 
> a roll is a roll
> 
> ...



 Indica or Sativa imperialreign?


----------



## warhammer (Jul 11, 2008)

Just wait for the new DX11 cards to come out and we will have some more new cards to talk about..

Microsoft is going to release DirectX11 http://en.hardspell.com/doc/showcont.asp?news_id=3708


----------



## KainXS (Jul 11, 2008)

warhammer said:


> Just wait for the new DX11 cards to come out and we will have some more new cards to talk about..
> 
> Microsoft is going to release DirectX11 http://en.hardspell.com/doc/showcont.asp?news_id=3708



Yep, the Radeon HD 5800s (R800) and the GTX 360/380s.


----------



## indybird (Jul 11, 2008)

If nvidia takes the GX2 route again with the GTX 280, then (assuming the GX2 performs equal to two GTX 280s in SLI) it will be approximately equal to the HD4870X2.

Here's my reasoning:
-In SLI, the second GTX 280 adds anywhere from 50% to 80% of the performance of a single card (according to most reviews).
-The ATI HD4870X2 is claimed to be 50% to 80% faster than the GTX 280.
-Therefore, unless nvidia improves their scaling or ATI's isn't as good as they are currently claiming, these two cards will be equal in performance.
-Based on the cost of a 65nm GTX 280 core, such a GX2 would be very expensive: ~$700.

However, if nvidia chooses to compete with the HD4870X2 using a 55nm (probably overclocked) GTX 280, then I believe the HD4870X2 will be more powerful.

More reasoning:
-When nvidia shrank the 9800GTX to 55nm, overclocked it, and called it the 9800GTX+, the average performance gain was 10%, 20% at absolute best (also according to reviews).
-If the ATI HD4870X2 is 50% to 80% faster than the GTX 280, then the HD4870X2 will be approximately 35% to 60% faster than the 55nm/overclocked GTX 280.

Sounds to me like the best option for nvidia would be a GTX260GX2 with the cores overclocked to GTX280 speeds, sold for about $550-$600.

-Indybird
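indybird's arithmetic above is easy to sanity-check in a few lines of Python. Note that every figure here is a claimed or rumored percentage from this thread, not a measurement:

```python
# Sanity-check of the scaling arithmetic above, using the thread's
# claimed figures (not measured data). Baseline: single GTX 280 = 1.0.

def gx2_estimate(sli_scaling):
    """Hypothetical GTX 280 GX2 throughput, assuming it matches GTX 280
    SLI, where the second GPU adds `sli_scaling` of a single card."""
    return 1.0 + sli_scaling

def x2_estimate(claimed_lead):
    """Claimed HD 4870 X2 throughput: `claimed_lead` faster than a GTX 280."""
    return 1.0 + claimed_lead

# Claimed ranges: SLI adds 50-80%; the X2 is claimed 50-80% faster.
for s in (0.50, 0.80):
    print(f"GX2 estimate at {s:.0%} SLI scaling: {gx2_estimate(s):.2f}x")
for lead in (0.50, 0.80):
    print(f"X2 estimate at {lead:.0%} claimed lead: {x2_estimate(lead):.2f}x")

# A 55nm refresh at roughly +12% (the 9800GTX+ gain cited above) would
# narrow the claimed 50-80% X2 lead to roughly 34-61%:
REFRESH = 1.12
for lead in (0.50, 0.80):
    print(f"X2 lead over a +12% refresh: {(1.0 + lead) / REFRESH - 1.0:.0%}")
```

With the claimed inputs, the two cards land in the same 1.5x-1.8x band, and a ~12% refresh narrows the X2's lead to roughly the mid-30s to low-60s percent, consistent with the post's 35-60% estimate.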


----------



## KainXS (Jul 11, 2008)

I think Nvidia might skip a GX2 this time around. Compared to the scores the R700 is pulling and what SLI GTX280s do - from what I've seen they scale about 10-45% on average in this review - releasing a GX2 would be murder. It costs a lot to make those G200 cores, and even when the die shrinks they're still going to cost a lot to make. They're going to have to speed up development of the G300s.

And why would any of you want a GTX280-GX2? That card would cost at least 900 dollars when you see that the 280s cost about 500-550 dollars now. Unless you have money to burn and are a major Nvidia fan, that's a waste of money.

Nvidia totally  this time

I want to wait for the R800s before I buy another card, but if the R700 can really pull off 80%, then I will definitely buy one.

http://www.tbreak.com/reviews/article.php?cat=grfx&id=618&pagenumber=10


----------



## steelkane (Jul 11, 2008)

I remember 3DFX making similar claims. As far as AMD/ATI selling the better-performing card for a cheaper price, they're only doing that because they have been behind for so long; let's just hope that when they pull ahead, they don't raise the prices. As far as who's better, AMD/ATI or Nvidia, I don't care about that; I've had both and just want good performance.


----------



## Ketxxx (Jul 11, 2008)

Calm down, people, it's all hearsay with no solid numbers to back it up. Remember, right now there are the likes of the HD4850 that sell for stupidly cheap, and stupidly cheap HD3870 GDDR3 models that sell for even less but give stellar performance. I can't speak for a HD4850/70 yet, but I can say this HD3870 GDDR3 I have does very well. Scores 11k off the bat, completely stock, in 3DM06. My point is, don't be in awe of something you have absolutely no proof of. Instead, be in awe of what you do have proof of.


----------



## Megasty (Jul 11, 2008)

KainXS said:


> I think Nvidia might skip a GX2 this time around. Compared to the scores the R700 is pulling and what SLI GTX280s do - from what I've seen they scale about 10-45% on average in this review - releasing a GX2 would be murder. It costs a lot to make those G200 cores, and even when the die shrinks they're still going to cost a lot to make. They're going to have to speed up development of the G300s.
> 
> And why would any of you want a GTX280-GX2? That card would cost at least 900 dollars when you see that the 280s cost about 500-550 dollars now. Unless you have money to burn and are a major Nvidia fan, that's a waste of money.
> 
> ...



That review is painful to look at when it comes to GTX280 SLI. This is the main reason they should back off any idea of a GX2 and just write it off. The reason the 9800GX2 worked so well is that the G92 scaled very well. This thing scales like garbage. Drivers might help it eventually, but who is going to sink $1000+ into either option to find out? The 4870X2 is going to be the cornerstone for dual GPUs. If NV wants to kill themselves with a $1000+ POS that scales horribly as 2 cards now, then I see no reason to even consider it.


----------



## wolf2009 (Jul 11, 2008)

indybird said:


> If nvidia takes the GX2 route again with the GTX 280, then (assuming the GX2 performs equal to two GTX 280s in SLI) it will be approximately equal to the HD4870X2.
> 
> Here's my reasoning:
> -In SLI, the second GTX 280 adds anywhere from 50% to 80% of the performance of a single card (according to most reviews).
> ...




Exactly what I said a few posts ago.


----------



## InfDamarvel (Jul 11, 2008)

ATI/AMD has put so much time into this dual-GPU solution that they have finally begun to perfect it. It seems that Nvidia spent too much time making a single-GPU solution which, while quite powerful, isn't very cost-effective. And at the same time they let ATI take over dual-GPU solutions, even though they started the entire trend.

What an interesting time in the market, lol.


----------



## yogurt_21 (Jul 11, 2008)

KainXS said:


> I think Nvidia might skip a GX2 this time around. Compared to the scores the R700 is pulling and what SLI GTX280s do - from what I've seen they scale about 10-45% on average in this review - releasing a GX2 would be murder. It costs a lot to make those G200 cores, and even when the die shrinks they're still going to cost a lot to make. They're going to have to speed up development of the G300s.
> 
> And why would any of you want a GTX280-GX2? That card would cost at least 900 dollars when you see that the 280s cost about 500-550 dollars now. Unless you have money to burn and are a major Nvidia fan, that's a waste of money.
> 
> ...



That reviewer needs a lesson in SLI; I call shenanigans. I mean, seriously, the scores couldn't be more inconsistent if they were made up entirely. lol


----------



## wolf (Jul 11, 2008)

And just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.


----------



## GPUCafe (Jul 11, 2008)

wolf said:


> And just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.


Like before? This is exactly like 7950GX2 versus X1900. Smaller efficient chips versus bigger brute-force chip.


----------



## wolf (Jul 11, 2008)

No, this is exactly like 3870X2 vs 8800/9800 single GPU. When you put them in SLI, they no doubt beat ATI's counterpart.

Heck, even 9600GT SLI gives a 3870X2 a damn good run for its money - I'd say roughly even - not to mention, for all the price/performance fanatics out there, as I remember it the 9600GT SLI option was cheaper.


----------



## btarunr (Jul 11, 2008)

wolf said:


> No, this is exactly like 3870X2 vs 8800/9800 single GPU. When you put them in SLI, they no doubt beat ATI's counterpart.



...and cost exponentially more... and void the convenience of running two GPUs in a single slot...


----------



## imperialreign (Jul 11, 2008)

SK-1 said:


> Indica or Sativa imperialreign?



neither . . . (sadly, and thankfully, I can't do that anymore)

it was another off-handed reference to one of the funniest movies of all time . . .

I was only trying to lighten the mood . . .


----------



## wolf (Jul 11, 2008)

btarunr said:


> ...and cost exponentially more... and void the convenience of running two GPUs in a single slot...



The ratio isn't exponential, please don't exaggerate that much; and like I said, as for price: 9600GT.

I've said it before and I'll say it again: the people out there who want THE BEST performance will throw money at it. I know a lot of people like that. They all choose Nvidia because it makes the beefiest GPUs.


----------



## Megasty (Jul 11, 2008)

GPUCafe said:


> Like before? This is exactly like 7950GX2 versus X1900. Smaller efficient chips versus bigger brute-force chip.



Now that was a GPU war if I ever remembered one. If that GX2 had gotten off the ground, it would have eaten the X1950XTX alive. Too bad it didn't even come close to beating it, because it was flawed from the ground up.



			
wolf said:

> And just like before, they can't beat Nvidia's card unless they cram on 2 GPUs.



Does it really matter? Performance-wise, 1 of these OC'd will come close to matching 2 GTX280s. But if you really need that much performance and have a grand to waste, 2 of these will murder 2 GTX280s (let alone 3 of them). There's no way I would waste $1000+ on 2 GTX280s when 2 4870X2s stomp all over them.


----------



## wolf (Jul 11, 2008)

because we've all seen how awesome quad GPU scaling is right?


----------



## farlex85 (Jul 11, 2008)

wolf said:


> The ratio isn't exponential, please don't exaggerate that much; and like I said, as for price: 9600GT.
> 
> I've said it before and I'll say it again: the people out there who want THE BEST performance will throw money at it. I know a lot of people like that. They all choose Nvidia because it makes the beefiest GPUs.



Don't expect nvidia to do it again. ATI's method of smaller, quicker GPUs has won out; we likely won't see another monolithic chip from nvidia, as it wouldn't make much sense for them. Those people you speak of are likely just inclined to root for nvidia b/c it's their brand - I'm not gonna say fanboy, I'll say trusted brand. Many who want the best performance will analyze the situation and choose the best GPU out there, and not simply throw away their money on something that someone else does better for less.



wolf said:


> because we've all seen how awesome quad GPU scaling is right?



We have, actually. The GX2 and X2 have both come near 100% scaling across all 4 cores in certain applications. You've got to admit it's getting better, better all the time.......


----------



## magibeg (Jul 11, 2008)

wolf said:


> because we've all seen how awesome quad GPU scaling is right?



It's a completely different style of GPU scaling though; you can't judge it the same way until we see it. Let's all wait for the numbers. Why don't we all just have red and green sigs from now on? Would make discussion easier.


----------



## wolf (Jul 11, 2008)

I mean that for both companies, dude; Nvidia's and ATI's 4-GPU scaling is terrible so far.

And as for sigs, don't count me as a green boy; I'm getting a 4870X2 also, I've gotta see what it's all about. 

What I'm saying in effect, though, is that if I had the money I would probably also go for GTX280 SLI and race them.


----------



## Megasty (Jul 11, 2008)

wolf said:


> because we've all seen how awesome quad GPU scaling is right?



I did have 2 3870X2s. With 8.6 they scaled great, even though up until 8.5 there was hardly any difference between them & tri-fire. 

3-way SLI is even worse - the 3rd GPU might as well not exist with most setups (9800GTX & GTX280/260). The fact that ATI has nearly perfected their dual-GPU architecture speaks wonders for how scaling has progressed over the years. 3- & 4-GPU scaling still has a long way to go, but at least it's better than nothing.


----------



## Bjorn_Of_Iceland (Jul 11, 2008)

4870s in xfire are indeed faster than a GTX 280. But GTX 280s in SLI still pwn... The price is not right, though.

A 4870X2 in quad xfire is insane!


----------



## Nyte (Jul 11, 2008)

There are a lot of people in here who need to take a course in ASIC design, sheesh...

You guys are saying "ATI sux" because it takes 2 to defeat 1?  I guess you can make the same argument that it takes a dual-core CPU to beat a single core?  Smarten up.


----------



## wolf (Jul 11, 2008)

You smarten up; we're comparing 2 VERY different products here, and it's not as simple as single core vs dual core.

Remember also that dual cores are 2 execution cores on the same die. Dual-GPU solutions are not.

It's not us who need lessons in design. And also, which person said "ati sux"?


----------



## Makaveli (Jul 11, 2008)

Lol, it's easy to spot the green team cheerleaders in this thread.

Bring on the 4870 X2 and whatever NV's answer will be!!!

I want more price drops let the children fight over who is better.


----------



## candle_86 (Jul 11, 2008)

You say GTX 280 SLI sucks. Yes, right now it does - but is it the drivers, or is there just not enough CPU power available right now to drive even one card?

I compare these cards to the 8800GTX simply because I can. When it came out, Core 2 adoption was still low, as a lot of people were still using 939- and AM2-based rigs; remember, Core 2 wasn't yet 6 months old when the G80 came out. And what did we see when the world went to Core 2? Almost a 30% increase in FPS for the G80 over the FX62, which was the top CPU out there. Given how powerful the GTX280 is, can even a modern QX9770 honestly drive this thing at its best? I'm waiting on Nehalem, and I'll bet you money we see massive gains for Nvidia and not-so-massive gains for AMD. You watch: it happened once, it will happen again.
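candle_86's CPU-bottleneck argument can be illustrated with a toy model (all numbers here are invented purely for illustration): a frame takes as long as the slower of the CPU and the GPU, so a faster GPU only pays off once the CPU can keep up.

```python
# Toy CPU-bottleneck model: a frame takes as long as the slower of the
# CPU and the GPU, so a faster GPU only pays off once the CPU keeps up.
# All per-frame times below are invented for illustration.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frame rate limited by whichever component takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

old_cpu, new_cpu = 12.0, 8.0    # ms per frame: slower vs faster CPU
mid_gpu, big_gpu = 14.0, 7.0    # ms per frame: mid-range vs high-end GPU

# On the slower CPU, the big GPU's gain is capped by the CPU:
print(f"slow CPU: {fps(old_cpu, mid_gpu):.0f} -> {fps(old_cpu, big_gpu):.0f} FPS")
# On the faster CPU, the big GPU's headroom finally shows:
print(f"fast CPU: {fps(new_cpu, mid_gpu):.0f} -> {fps(new_cpu, big_gpu):.0f} FPS")
```

In this sketch the high-end GPU gains only ~17% on the slow CPU but ~75% on the fast one, which is the shape of the G80-on-Core 2 story being described.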


----------



## echo75 (Jul 11, 2008)

I don't want to sound like a pessimist, but I will believe it when I see it. There was similar uber hype about what the 4870 could supposedly do... meanwhile, in real life, me and many others never saw that hero performance. Maybe it's due to our own hardware limitations, driver incompatibilities, or whatever... however, it still stands: "I'll believe it when I see it."


----------



## Nyte (Jul 11, 2008)

wolf said:


> You smarten up; we're comparing 2 VERY different products here, and it's not as simple as single core vs dual core.
> 
> Remember also that dual cores are 2 execution cores on the same die. Dual-GPU solutions are not.
> 
> It's not us who need lessons in design. And also, which person said "ati sux"?



The person who said "ati sux" edited their post conveniently after I posted mine.

Then I guess your scientific criterion for fair comparison would be that you have to have 2 ASIC GPUs on the same die (to be on the same level as comparing a dual-core CPU to a single-core CPU), am I right?  Is this criterion defined in an ISO standard somewhere?  I'd be interested to see that.  Maybe HardOCP or Guru3D can begin to use this standard since it's so scientific.

You can never compare technologies like that.  The only way you can ever make a fair performance comparison between 2 products is if they are in the same league of price, power consumption, featureset, and requirements.  Comparing a "dual GPU" technology to a "single GPU" by implicitly denouncing the former is not a fair means of comparison in ANY standard (except for some of the posters in this thread).

"Cramming 2 GPUs in to beat NVIDIA."  That statement by itself is enough to make any engineer walk away, because it clearly means the speaker knows nothing about ASIC design.  Yields, cost, BOM, TDP, complexity... I guess I can throw all those factors away, because as far as I know, AMD needs to cram in 2 GPUs to beat NVIDIA, and that MUST mean NVIDIA is better, right?


My input on this matter is done.


----------



## GLD (Jul 11, 2008)

One GPU for me, please. A quad-core CPU? Now we're talking.


----------



## bigtye (Jul 11, 2008)

Regardless of whether you're an Nvidia or ATI loyal customer, these new cards from both sides are a big performance gain over the ones they replace. Does this mean we can expect a new era of games graphics to begin shortly?

After all, they can only code to what the hardware can produce. Currently my 9600GT plays everything I want and it looks good. I can't justify buying one of these new cards just yet 'cause mine still works, as did my 1950 Pro before it was replaced. However, when new stuff comes out I will be "forced to upgrade" (that's for the benefit of my wife, the "forced to" bit).

I am looking forward to the new generation of games which these new cards will hopefully encourage and make possible. 

Cheaper prices, new games! Woot! Sounds good.

Tye


----------



## vojc (Jul 11, 2008)

wolf said:


> and just like before, they cant beat nvidias card unless they cram on 2 gpu's.



Yeah, the point is that 2 ATI GPUs equal the size of 1 Nvidia GPU, and....
at the same total die size, ATI is up to 80% faster and only 10-20% more power-hungry.
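vojc's perf-per-area point can be made concrete with a toy calculation. The die areas below are rough public figures from the period (treat them as assumptions), and the performance number is this thread's best-case claim, not a measurement:

```python
# Toy performance-per-area comparison. Die areas are approximate public
# figures from 2008 (treat them as assumptions); the performance numbers
# are this thread's claims, not measurements.
RV770_MM2 = 256    # approx. RV770 die area, mm^2
GT200_MM2 = 576    # approx. GT200 (GTX 280) die area, mm^2

x2_area = 2 * RV770_MM2   # two RV770 dies on one 4870 X2
x2_perf = 1.8             # claimed best case: 80% faster than a GTX 280
gtx280_perf = 1.0

print(f"4870 X2 silicon: {x2_area} mm^2 vs GTX 280: {GT200_MM2} mm^2")
ratio = (x2_perf / x2_area) / (gtx280_perf / GT200_MM2)
print(f"claimed perf per mm^2, X2 relative to GTX 280: {ratio:.2f}x")
```

Under those assumptions, two RV770s actually use slightly less silicon than one GT200, so the claimed 80% lead would work out to roughly double the performance per mm².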


----------



## InnocentCriminal (Jul 11, 2008)

Let's hope we see these sorts of figures; if so, I'll be extremely happy settling for a 4850X2 over a 4870X2. It would all depend on power consumption to seal my decision.


----------



## kaneda (Jul 11, 2008)

candle_86 said:


> nope, the 7950GX2 took the DX9 crown



No, it didn't.

For one, the moment you upped the AA/AF, the GX2 had a fit.

Also, 2 X1950XTXs in CF > 2 GX2s in SLI.

HDR+AA


----------



## InnocentCriminal (Jul 11, 2008)

kaneda said:


> ... Also, 2 X1950XTXs in CF...



Brings back memories.


----------



## fullinfusion (Jul 11, 2008)

Kiss Amd's azz nvidia lmfao...


----------



## zOaib (Jul 11, 2008)

can someone make my name red , i wanna be a fanboi representin !!! thx


----------



## btarunr (Jul 11, 2008)

Mouse over the username to see it in red whenever you feel the 'urge'. Shed fanboyism, be rational.


----------



## DarkMatter (Jul 11, 2008)

btarunr said:


> Mouse over the username to see it in red whenever you feel the 'urge'. Shed fanboyism, be rational.


----------



## Megasty (Jul 11, 2008)

btarunr said:


> Mouse over the username to see it in red whenever you feel the 'urge'. Shed fanboyism, be rational.



Rational!? Wats dat  I thought we all bought the biggest, most powerful thing we can't afford  Dang, now I have to find out what this 'rational' thing is :shadedshu

Fanboyism actually sprouts from a form of rationalism. Some ppl buy a card from one camp - & it pleases them to no end, w/o giving them any problems. Fantasy worlds like that are then cemented. Fanboyism of that nature is *hard* to crack. On the other side of the tracks are fanboys who haven't bought a card in their lives 

A rational person wouldn't even be able to compare a 4870X2 & a GTX280, mainly because the p/p ratios are way too different. That thing would need to be $350-400 to make any kind of rational comparison.


----------



## btarunr (Jul 11, 2008)

Rational = I will buy whatever is best for my money, without bias toward either company, NV / ATI. 

I was going to buy a HD 2900 XT. Prices sucked compared to a 8800 GTS 640MB. Bought the GTS. Then came the 8800 GT that outperformed it. Sold the GTS for the same price at which an 8800 GT could be bought. Got happy with the 8800 GT. Next time I'm in the market, if ATI offers the best for my cash, I will buy it, but if NV comes up with something better even at the last moment, NV gets my cash.


----------



## newconroer (Jul 11, 2008)

btarunr said:


> Pre-release performance evaluations suggest that the Radeon HD 4870 X2 2GB GDDR5 model will on an average be 50% faster than the GeForce GTX 280 and in some tests 80% faster. A second model, the HD 4850 X2 (2GB GDDR3 memory, 2x RV770Pro) will also convincingly outperform the GeForce GTX 280. The R700 series will be brought into the market late-July thru August.
> 
> Source: Hardspell




Good good... 100+ FPS in games, just what I need.

But is it going to micro-stutter? Is it going to drop from 100fps to 20 every time your gun goes off?


The only thing the X2 will do is push 280 prices down. The 280 will become even more of the ultimate gamer's card because it's a single-GPU solution, and it eliminates the most common stutter issues. 

ATI should have been happy with their jump from the 3x series to the 4x series when it comes to performance, and continued to push the 'price/performance' appeal for the mass market, using that income to rebuild their dwindling campaign. Now they risk waking a sleeping dragon. 

As I said before, ATI could possibly kill off the appeal of their own 4850/70 product line, leaving them with more or less just the 'crown,' which Nvidia will turn around and trump because they have the resources to do so.


----------



## btarunr (Jul 11, 2008)

newconroer said:


> Good good... 100+ FPS in games, just what I need.
> 
> But is it going to micro-stutter? Is it going to drop from 100fps to 20 every time your gun goes off?
> 
> ...



http://forums.techpowerup.com/showpost.php?p=880187&postcount=30

...hope that helps.


----------



## newconroer (Jul 11, 2008)

Eh, one person, one game - that doesn't give me very conclusive results.

We already see the 280 accomplishing the same thing; why jam a possibly more heat- and power-demanding component into your system if you don't need to?


----------



## btarunr (Jul 11, 2008)

One person... one game... at least it's better than "nothing to prove there's no stutter".


----------



## Darkrealms (Jul 11, 2008)

newconroer said:


> Good good... 100+ FPS in games, just what I need.
> 
> But is it going to micro-stutter? Is it going to drop from 100fps to 20 every time your gun goes off?
> 
> ...


Nice work, I can agree with that.  Unfortunately everyone wants the crown.  Nvidia has enough going wrong for them right now, and they have the resources.  If they have been lowballing because of the competition, and the 300 series does exist with 45nm and GDDR5, this could really hurt ATI.


----------



## steelkane (Jul 11, 2008)

btarunr said:


> Rational = I will buy whatever is best for my money, without bias toward either company, NV / ATI.
> 
> I was going to buy a HD 2900 XT. Prices sucked compared to a 8800 GTS 640MB. Bought the GTS. Then came the 8800 GT that outperformed it. Sold the GTS for the same price at which an 8800 GT could be bought. Got happy with the 8800 GT. Next time I'm in the market, if ATI offers the best for my cash, I will buy it, but if NV comes up with something better even at the last moment, NV gets my cash.



I agree with that statement 1000%. Choice is the greatest thing on EARTH.


----------



## trt740 (Jul 11, 2008)

steelkane said:


> I agree with that statement 1000%, Choice is the greatest thing on EARTH.



Good luck with that; they are coming out so fast it's impossible to tell what's the best buy.


----------



## DarkMatter (Jul 12, 2008)

Well, for me it's very easy to tell. The more cards out there the better - more choice. Even when you can't tell which one is the better one for your needs, it doesn't really matter: with more cards out there, prices go down, and even if you choose the "wrong" card you are always buying more for less than if there weren't such extreme competition.

On another note, that competition in price IS HURTING the graphics industry, mainly driven by ATI's prices. I'm not blaming them - this is business and that's what they have to do (they are almost forced to), and it just happens to be good for us. But I've been wondering lately if we are getting a little bit greedy: it's very common to see people complaining about prices that are no higher than in the past; every year we want double the performance, but now we also want it at less money. There will be the eternal debate about whether they were charging too much in the past and only now are being honest with prices. IMO that's not the case. The average selling price has decreased a lot, while at the same time the share of high-end cards has increased. This means lower profits for the companies. And I know people think "so what?", but IT companies are more fragile than people think. IMO if the competition continues in this direction, one of the companies could end up disappearing, and we would regret it. And in the long term this is bad for both ATI and Nvidia; either of the two could eventually disappear. ATI's pricing strategy is not sustainable; they can't continue selling cards this cheap forever, so what will happen when they release the profitable card? It won't be as good from a perf/price point of view, and the market will be flooded with the previous cards. They would be forced to price the cards at barely profitable levels again, just as they are forcing Nvidia to now, and of course Nvidia would be in the same situation.

I'm not saying we have to pay more for the same. This rant is not about purchasing decisions - buy the best thing your money can buy. I just think we have to be more aware of the current situation when complaining: current prices are not necessarily fair, as in absolute truth. They are very good for us, but I strongly believe we are heading toward a situation where we would be paying less than what would be fair. That said, I love the situation, I'd love this to continue, but I want to stay away from hypocrisy. IMHO:

- That prices go down, even if they make less profit, as long as they have enough to stay in the game without affecting their workers... GOOD
- That we benefit from that situation, even if we know it's not necessarily fair according to an absolute truth... GOOD
- That we complain about the pricing when it doesn't fit our "distorted" expectations, even if cards come at the same prices as in the past... BAD, very bad.

My 2 cents. 

Sorry for the rant. I just felt we needed a bit of self-criticism, and this was as good a day as any to do it.


----------



## Nick89 (Jul 12, 2008)

____ gets the award for biggest nvidia fanboy in this thread.

Ok ____, we get it, you won! Aren't you happy? 


Insert a name as you wish, as I can't decide...lol


----------



## wolf (Jul 12, 2008)

Nyte said:


> The person who said "ati sux" edited their post conveniently after I posted mine.
> 
> Then I guess your scientific criterion for fair comparison would be that you have to have 2 ASIC GPU's on the same die (to be on the same level as comparing a dual core CPU to a single core CPU) am I right?  Is this criterion defined in an ISO standard somewhere?  I'd be interested to see that.  Maybe HardOCP or Guru3D can begin to use this standard since it's so scientific.
> 
> ...



my issue isn't with how it's built, or the price, or anything pointed out in your fantastic "lecture"; it's with the problems inherent in dual-GPU solutions.

granted they have come a long way, there are users out there who prefer one GPU to avoid any of those possible issues.

it doesn't really bother me who has the best performance, I'll buy it anyway; I'm getting a 4870X2 for sure. however, the day I crave is when either company can get 4870X2/GTX280 SLI performance from a single-GPU solution. with a single GPU you can never go wrong.


----------



## Megasty (Jul 12, 2008)

wolf said:


> my issue isn't with how it's built, or the price, or anything pointed out in your fantastic "lecture"; it's with the problems inherent in dual-GPU solutions.
> 
> granted they have come a long way, there are users out there who prefer one GPU to avoid any of those possible issues.
> 
> it doesn't really bother me who has the best performance, I'll buy it anyway; I'm getting a 4870X2 for sure. *however, the day I crave is when either company can get 4870X2/GTX280 SLI performance from a single-GPU solution. with a single GPU you can never go wrong*.



That day will definitely come. Technology doesn't stop, hit walls, reach limits, or anything of that nature. Just like the GTX280 is as fast as a 9800GX2 and a 4870 is as fast as a 3870X2, single chips can come to match the generational standard set by their dual-chip predecessors. If that trend continues (although that's uncertain, since it only just started), then the next generation's single chips will blow us away just as this generation's single-chip improvements did.


----------



## handydagger (Jul 13, 2008)

lol poor nvidia. I'm imagining their labs now: they are working hard to get GDDR5 onto their cores, assuming it will let them slap ATI in the next round, while ATI is working on dual-core or maybe quad-core GPUs with 4-8 GPUs in Crossfire 

Nvidia is lagging now, the same as what happened to 3dfx a few years ago, and maybe Intel will take care of it later on


----------



## HTC (Jul 13, 2008)

What if both nVidia and ATI, in the near future, did what Intel and AMD did?

Imagine ...

- A "low end" card would be a 2-core card with *1 GPU*, much like the 48X0X2 or the 9800GX2, but in a single GPU package.

- A "high end" card would be a 4-core, *1 GPU* card.

- The "single core" card would no longer exist, except as very old cards.


Just as Intel and AMD, as far as I know, now only make dual, triple (AMD), quad cores and more, it wouldn't be much of a stretch to guess that the GPU future would be a multi-core one.

IMO, the "single core" card will fade, and it won't take very long: about 2 or 3 years, I reckon.


They will manage to make a dual-core GPU, just like a C2D, for example.



In order to make a dual-core but single GPU, each core *must* be small enough to fit two of them into a single GPU.

This is where I believe ATI has the upper hand: they managed to get a powerful GPU with a modest die size, whereas nVidia made a better card but with a very big die.


----------



## handydagger (Jul 13, 2008)

ATI is coming closer to a dual-core GPU than nvidia, assuming the upcoming 4870X2 will have 2 separate cores on a single card.
The future is in multi-core.


----------



## Hayder_Master (Jul 13, 2008)

80%? With an overclock on the R700, can we say 1x R700 = 2x GTX280 in SLI?


----------

