# ATI Radeon HD 3870 X2 (R680) 1GB First Full Review Posted



## malware (Jan 22, 2008)

The guys over at PConline.com.cn have again managed to be the first to post a full review of a graphics card that should be available later next month: the ATI Radeon HD 3870 X2 1GB (R680). The first thing you'll notice is that the card beats NVIDIA's GeForce 8800 Ultra in every Futuremark benchmark and in almost every game by quite a margin. Have a good time reading the full story here.

*View at TechPowerUp Main Site*


----------



## snuif09 (Jan 22, 2008)

i love that card already


----------



## Weer (Jan 22, 2008)

It's hard to load the pages, but from the only one I saw so far, the 8800 Ultra beats the hell out of it in Lost Planet.


----------



## MikeJeng (Jan 22, 2008)

This is better than the Ultra but it's also X2...



I still like Nvidia.


----------



## WildCat87 (Jan 22, 2008)

Weak.


----------



## cmberry20 (Jan 22, 2008)

Wow! Very surprising. I wasn't expecting the performance to be as good as that. Well done ATI/AMD.

However, I'm worried about the price. I think they should sell it at under £250/€350, which would be the top end of most people's (enthusiasts') budgets. Over £300/€400 (which it looks like it is) and you're budgeting for the rich only, a niche market.


----------



## cmberry20 (Jan 22, 2008)

btw, what board is that behind the X2 in the third picture - is that an 8800 Ultra?


----------



## Wile E (Jan 22, 2008)

WildCat87 said:


> Weak.


This card is also much cheaper than an Ultra. It's supposed to retail around the price of a GTX.


----------



## cmberry20 (Jan 22, 2008)

Wile E said:


> This card is also much cheaper than an Ultra. It's supposed to retail around the price of a GTX.



Well, considering you can now pick up an 8800 GTX for £240/€330, let's hope so!!


----------



## erocker (Jan 22, 2008)

Also, the HD 3870 X2 is new, while the Ultra has been around a while. Give it a month or two and the 3870 X2 will beat the Ultra in Crysis.


----------



## WildCat87 (Jan 22, 2008)

erocker said:


> Also, the HD 3870 X2 is new, while the Ultra has been around a while. Give it a month or two and the 3870 X2 will beat the Ultra in Crysis.


All I see is a brand new high-end dual-GPU video card that barely cracks 40 FPS at a low resolution with NO AA.

Weak. lol


----------



## Wile E (Jan 22, 2008)

WildCat87 said:


> All I see is a brand new high-end dual-GPU video card that barely cracks 40 FPS at a low resolution with NO AA.
> 
> Weak. lol


Until you realize that the drivers aren't anywhere near mature, and it's still faster than the nVidia offering in the same price range.


----------



## erocker (Jan 22, 2008)

WildCat87 said:


> All I see is a brand new high-end dual-GPU video card that barely cracks 40 FPS at a low resolution with NO AA.
> 
> Weak. lol



I wouldn't be surprised if there is a 20-40% gain with a patch and/or new drivers. Give it a little time; the card isn't even at retail yet!


----------



## WildCat87 (Jan 22, 2008)

Wile E said:


> Until you realize that the drivers aren't anywhere near mature, and it's still faster than the nVidia offering in the same price range.


I won't argue about the drivers.

A brand new, high-end, dual-GPU video card, even if only $50, getting just 1 frame more than an old, single-GPU card is still weak. Excellent price/performance ratio, but I was truly expecting more as far as pure performance, even with launch drivers.


----------



## erocker (Jan 22, 2008)

When a card beats out another card in everything else... it's drivers.  I'm sure they are working on it.


----------



## Blacklash (Jan 22, 2008)

I'd love a card that just works. It looks like with this one, if a title doesn't traditionally do well with CrossFire, you're going to be left with slightly worse than a single HD 3870's result. See: WiC, Crysis, and Lost Planet. I bet it blows in DiRT too. It will probably do fine in Crysis on the DX9 path; I know my 3850s do.

I am going to hold onto my money for now. Hopefully someone will deliver a single GPU that can beat my GTX by a decent margin soon. It's 661|2040 with a 1600 shader clock.


----------



## Mussels (Jan 22, 2008)

Let's see how it goes. It seems fast in DX9 and weak in DX10.


----------



## erocker (Jan 22, 2008)

Mussels said:


> lets see how it goes. It seems fast in DX9 and weak in DX10.



Like everything else!  We literally haven't seen a real high-end DX10 card from either side.  I think this card has the greatest potential out of all of them though.


----------



## Xaser04 (Jan 22, 2008)

Could someone possibly give a quick rundown of the results, as the website does not load at all here at work?

I am seriously interested in one of these as something new to play with. 

Cheers


----------



## LiveOrDie (Jan 22, 2008)

Benchmarks don't show how the card is really going to perform in real-world gaming. This is ATI's high-end card, and NVIDIA's 9800 GTX is not far off. Like the saying goes, two is better than one; I would guess this would beat the 8800 Ultra because it has two GPUs working on one thing with more memory. It would be a good card if not priced too high, but I think it will fall behind NVIDIA's 9800 GX2 among multi-GPU cards. My 8800 GTS 512MB comes within about 100 points of the Ultra in 3DMark06 but sure doesn't perform like an Ultra, lol.


----------



## Mussels (Jan 22, 2008)

actually, on the more-memory thing... doesn't CrossFire (and SLI) only share GPU power? Wouldn't this really be a 512MB card for most purposes?


----------



## LiveOrDie (Jan 22, 2008)

Mussels said:


> actually, on the more-memory thing... doesn't CrossFire (and SLI) only share GPU power? Wouldn't this really be a 512MB card for most purposes?



I'm not sure; I think that's only when you use two cards, not a single card with two GPUs, because I think it works like two GPUs running off the same 1GB of memory. Does anyone know if Windows picks the card up as one card or two, and do you have to turn CrossFire/SLI on in the video card's control panel?


----------



## Darkmag (Jan 22, 2008)

Xaser04 said:


> Could someone possibly give a quick run down of the results as the website does not load at all here at work.
> 
> I am seriously interested in one of these as something new to play with.
> 
> Cheers


 Hope this helps

Performance difference, R680 vs. 8800 Ultra:
Bioshock 1280x1024 = 4% Slower
Bioshock 1920x1200 = 24% Faster
Bioshock 2560x1600 = 39% Faster

COJ 1280x1024 = 11% Faster
COJ 1920x1200 = 24% Faster
COJ 2560x1600 = 10% Faster

COJ 1280x1024 4xAA 16xAF = 13% Faster
COJ 1920x1200 4xAA 16xAF = 23% Faster

Lost Planet 1280x1024 = 27% Slower
Lost Planet 1920x1200 = 30% Slower
Lost Planet 2560x1600 = 37% Faster

Lost Planet 1280x1024 4xAA 16xAF = 18% Slower
Lost Planet 1920x1200 4xAA 16xAF = 18% Slower
Lost Planet 2560x1600 4xAA 16xAF = 10% Faster

Crysis 1280x1024 = 13% Slower
Crysis 1920x1200 = 2% Slower
Crysis 2560x1600 = 12% Slower

COD4 1280x1024 = 42% Faster
COD4 1920x1200 = 32% Faster
COD4 2560x1600 = 26% Faster

COD4 1280x1024 4xAA 16xAF = 27% Faster
COD4 1920x1200 4xAA 16xAF = 20% Faster
COD4 2560x1600 4xAA 16xAF = 16% Faster

NFS ProStreet 1280x1024 = 32% Faster
NFS ProStreet 1920x1200 = 38% Faster

NFS ProStreet 1280x1024 4xAA 16xAF = 72% Slower
NFS ProStreet 1920x1200 4xAA 16xAF = 67% Slower

Serious Sam 2 1280x1024 HAA 16xAF = 30% Faster
Serious Sam 2 1920x1200 HAA 16xAF = 45% Faster
Serious Sam 2 2560x1600 HAA 16xAF = 78% Faster

UT3 1280x1024 = 7% Faster
UT3 1920x1200 = 24% Faster
UT3 2560x1600 = 37% Faster

F.E.A.R. 1600x1200 = 20% Faster
F.E.A.R. 2048x1536 = 20% Faster
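For a quick overall read of the list above, here is a one-line average of the 35 deltas (my own back-of-the-envelope sketch: Slower entries counted as negative, every result weighted equally regardless of game or resolution, which is crude but matches how people are eyeballing the thread):

```python
# Percentage deltas from the list above (positive = R680 faster than the Ultra)
deltas = [
    -4, 24, 39,      # Bioshock
    11, 24, 10,      # Call of Juarez
    13, 23,          # Call of Juarez, 4xAA/16xAF
    -27, -30, 37,    # Lost Planet
    -18, -18, 10,    # Lost Planet, 4xAA/16xAF
    -13, -2, -12,    # Crysis
    42, 32, 26,      # COD4
    27, 20, 16,      # COD4, 4xAA/16xAF
    32, 38,          # NFS ProStreet
    -72, -67,        # NFS ProStreet, 4xAA/16xAF
    30, 45, 78,      # Serious Sam 2, HAA/16xAF
    7, 24, 37,       # UT3
    20, 20,          # F.E.A.R.
]
mean = sum(deltas) / len(deltas)
print(f"{len(deltas)} results, average delta {mean:+.1f}%")  # 35 results, average delta +12.1%
```

So on this review's numbers the X2 averages roughly 12% ahead of the Ultra overall, dragged down hard by the ProStreet AA results.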


----------



## OnBoard (Jan 22, 2008)

Must be a driver bug in ProStreet with AA/AF; it runs at nearly the same ~30 FPS on my machine with 2xAA/8xAF =)


----------



## Mussels (Jan 22, 2008)

summary: drivers are immature, and people are expecting it to be about 30-50% faster than an Ultra, for about 75% of the price of an Ultra.


----------



## Xaser04 (Jan 22, 2008)

Darkmag said:


> Hope this helps
> 
> 
> 
> ...



Brilliant, thanks for that!


----------



## UnXpectedError (Jan 22, 2008)

card looks like it will be awesome once it gets some proper drivers


----------



## laszlo (Jan 22, 2008)

"Attack of the clones" or "The Empire strikes back" ?


----------



## btarunr (Jan 22, 2008)

I can already predict this card will lose to the GeForce 9800 GX2. Okay, they've compared this card to the 8800 Ultra and the 8800 Ultra loses, but give the 8800 Ultra's scores a 35% increment and you can pretty much figure out how a 9800 GX2 performs and how it compares to this Radeon HD 3870 X2.


----------



## SiCk (Jan 22, 2008)

laszlo said:


> "Attack of the clones" or "The Empire strikes back" ?



Return of the ATI, maybe


----------



## Mussels (Jan 22, 2008)

if it's an attack of the clones... why did they clone conjoined twins?


----------



## Xaser04 (Jan 22, 2008)

btarunr said:


> I can already predict this card will lose to the GeForce 9800 GX2. Okay, they've compared this card to the 8800 Ultra and the 8800 Ultra loses, but give the 8800 Ultra's scores a 35% increment and you can pretty much figure out how a 9800 GX2 performs and how it compares to this Radeon HD 3870 X2.



It would have been useful if they had included results from a pair of 8800 GTS (G92) cards in SLI, as this would have been roughly representative of the performance of the 9800 GX2.

EDIT: Did they include any SLI results? (Sorry for the question, but I can't load the website so I don't know.)


----------



## TooFast (Jan 22, 2008)

yes! ati is back.........


----------



## unsmart (Jan 22, 2008)

Crysis shouldn't even be benched, it's so biased, just like Doom 3. If it is benched, they should call it "Nvidia presents Crysis".


----------



## OrbitzXT (Jan 22, 2008)

Wile E said:


> This card is also much cheaper than an Ultra. It's supposed to retail around the price of a GTX.



Supposed to... we all know how these things work; it'll be available for $450ish on Newegg for all of 15 seconds until they're sold out, then the next batch will go up in price to 500-something. I'm guessing the R680 needs better drivers, considering it's still behind the Ultra in a number of games but does very well in the 3DMark benchmark. I must have said this in 15 or so posts, but I wish ATI would fix little things in their drivers, like image scaling. It's such an annoying issue to me that I simply won't buy any more of their cards until it's working as it should. I love my older games, but games with 4:3 resolutions stretched to 16:10 look nasty. Even if this R680 outperforms nVidia's 9800 GX2, I'll still go with nVidia on the simple issue of drivers.


----------



## ShinyG (Jan 22, 2008)

Wow, finally something high-end from ATi!
Performance will hopefully go up as new drivers emerge.
People want to see this go head-to-head with a pair of 8800 GTSes... Even if those were faster, the price of two 8800 GTSes would be a lot more, plus not everybody wants to fill their case with two graphics cards; some people actually need those PCI and PCI-E x1 slots.
As for the 9800 GX2, I'm sure it's going to be better than ATi's offering, but also more expensive and probably hard as hell to find.
"Hard to find" seems to be the word for decently priced nVidia products lately!


----------



## Airbrushkid (Jan 22, 2008)

The same can be said about ATI. Look at some of the games when they start up and the splash screen is ATI. Isn't F.E.A.R. one of those biased toward ATI? Why didn't they use World in Conflict? It's a DX10 game. It takes ATI producing a card with *two GPUs* to catch up with NVIDIA.




unsmart said:


> Crysis shouldn't even be benched, it's so biased, just like Doom 3. If it is benched, they should call it "Nvidia presents Crysis".


----------



## OrbitzXT (Jan 22, 2008)

ShinyG said:


> People want to see this go head-to-head with a pair of 8800 GTSes... Even if those were faster, the price of two 8800 GTSes would be a lot more, plus not everybody wants to fill their case with two graphics cards



They want to see 2 GTS's because they hypothesize that those will perform similarly to the 9800 GX2 (Whether they're right or wrong). So cost and space really isn't an issue.


----------



## Xaser04 (Jan 22, 2008)

OrbitzXT said:


> They want to see 2 GTS's because they hypothesize that those will perform similarly to the 9800 GX2 (Whether they're right or wrong). So cost and space really isn't an issue.



This is the main reason I want to see it against two GTS cards. In theory, two GTSes should perform roughly equivalent to the 9800 GX2, judging by the identical technical specs and the fact that the GX2 relies on SLI just like two GTSes do.


----------



## ShinyG (Jan 22, 2008)

@Airbrushkid:
They can put as many GPUs as they want on one card. As long as performance is high and the dimensions and power consumption are OK, I don't see a problem, especially when the price is right! (Hopefully it will be, or people will flame me like crazy...)
@OrbitzXT:
Oh, I wasn't paying attention there, sorry!


----------



## Edito (Jan 22, 2008)

Hmm, well done ATI/AMD, it's impressive, but I was expecting more, since the 8800 Ultra is kinda old after all. Maybe with time the gap between them will increase, but I still prefer NVIDIA all the way...

Good for the people who like ATI/AMD


----------



## [I.R.A]_FBi (Jan 22, 2008)

Blacklash said:


> I am going to hold onto my money for now. Hopefully someone will deliver a single GPU that can beat my GTX by a decent margin soon.



That's not happening for the next 12 months.


----------



## OrbitzXT (Jan 22, 2008)

Edito said:


> Hmm, well done ATI/AMD, it's impressive, but I was expecting more, since the 8800 Ultra is kinda old after all. Maybe with time the gap between them will increase, but I still prefer NVIDIA all the way...
> 
> Good for the people who like ATI/AMD



The nVidia fans said the exact same thing, that they're expecting more of the 9800 GX2, which was stated as being "30% faster than the Ultra", which they love to point out is getting old as well. I suspect the 9800 will be in the neighborhood of where the R680 is now.


----------



## Mussels (Jan 22, 2008)

OrbitzXT said:


> Supposed to... we all know how these things work; it'll be available for $450ish on Newegg for all of 15 seconds until they're sold out, then the next batch will go up in price to 500-something. I'm guessing the R680 needs better drivers, considering it's still behind the Ultra in a number of games but does very well in the 3DMark benchmark. I must have said this in 15 or so posts, but I wish ATI would fix little things in their drivers, like image scaling. It's such an annoying issue to me that I simply won't buy any more of their cards until it's working as it should. I love my older games, but games with 4:3 resolutions stretched to 16:10 look nasty. Even if this R680 outperforms nVidia's 9800 GX2, I'll still go with nVidia on the simple issue of drivers.



I've been complaining about the lack of scaling for a long time. Every time I do, some nugget comes along and says his games work just great in widescreen, so I'm making it up / an Nv fanboi.


----------



## SK-1 (Jan 22, 2008)

WildCat87 said:


> All I see is a brand new high-end dual-GPU video card that barely cracks 40 FPS at a low resolution with NO AA.
> 
> Weak. lol



Not been around graphics card tech much??? You need to know the GTX series gained 10-15% in its benchies after the FIRST driver update. It is VERY normal for pre-release drivers to be nowhere near a card's potential performance. It's all part of a card's evolution.
Basic stuff here, WildCat.


----------



## unsmart (Jan 22, 2008)

Airbrushkid said:


> The same can be said about ATI. Look at some of the games when they start up and the splash screen is ATI. Isn't F.E.A.R. one of those biased toward ATI? Why didn't they use World in Conflict? It's a DX10 game. It takes ATI producing a card with *two GPUs* to catch up with NVIDIA.


F.E.A.R. is actually an Nvidia game; HL2 is an ATI game, and it does favor ATI a bit, but not to this level. What I'm talking about is a 30% difference between two cards that perform close in most other games/benches. Recall the ATI patch someone made for Doom 3 that put ATI cards on a level with Nvidia with no loss of image quality. That's something I think id should have done, not an end user.
I just don't care for anything that favors one brand over the other due to marketing. They should try to optimize for both brands' cards and CPUs. It's not like it's impossible to load a patch for each brand's card when it's recognized. As it is, we get screwed on both sides: a game runs its best on one brand and like crap on the other. I think it's time we demand that game producers make it run its best on whatever brand you choose. What if HBO was "best if viewed on Sony TVs"? I don't think that would last too long.


----------



## phanbuey (Jan 22, 2008)

SK-1 said:


> Not been around graphics card tech much??? You need to know the GTX series gained 10-15% in its benchies after the FIRST driver update. It is VERY normal for pre-release drivers to be nowhere near a card's potential performance. It's all part of a card's evolution.
> Basic stuff here, WildCat.



Yeah... that's true and all... but 10-15% for this card is still not very impressive, considering the Ultra is practically ancient tech and it's still in the same playing field. All that means is that this won't hold the performance crown for very long... maybe 1-2 months before the GX2 and 98XX series come out. But hey, I could be dead wrong; they could give it some volts (like the 2900 series) and overclock the pants off of it (1GHz core, anyone?), and then improve drivers, which could altogether give it up to a 50% increase in performance over these benchmarks... not to mention quadfire (I think my PSU just started crying).


----------



## PVTCaboose1337 (Jan 22, 2008)

YA!!! Hopefully this will bring AMD back from the bad times.


----------



## Judas (Jan 22, 2008)

Well done ...AMD/ATI


EDIT: I tried to read the full story but could not make heads nor tails of it


----------



## jydie (Jan 22, 2008)

WOW... great performance for the price!!  

If they can keep the supply greater than the demand, then AMD/ATI might do well with this card. I would love to put this in my system and "test" it out. But it is out of my price range, so I will just have to read the reviews and keep dreaming.


----------



## niko084 (Jan 22, 2008)

Wile E said:


> Until you realize that the drivers aren't anywhere near mature, and it's still faster than the nVidia offering in the same price range.



Don't forget that nothing out can beat down crysis yet...

I think this kid is looking for a miracle... You can't please some people without releasing a $200 card that will give you 100fps in anything completely maxed.


----------



## EastCoasthandle (Jan 22, 2008)

Impressive, regardless of what the fanboys think.
Now let's see what the GX2 brings to the table.


----------



## EddxPT (Jan 22, 2008)

CrossFire scales a lot better than SLI, and NVIDIA can't lately compete with ATI on price/performance ratio. My guess is the GX2 from NVIDIA will perform a little better than the 3870 X2, but at a 30% to 40% higher price. NVIDIA has also been struggling with availability. It will be, from NVIDIA, merely a product to state that they are still the leaders in single-card performance.

Remember also, a lot of people have P965, 975X, P35 and X38 mobos which support CrossFire... I can see someone pairing this with a 3870 or 3850 they already own if they need to increase performance further. Hopefully ATI releases CrossFireX for Intel mobos. So well done ATI; now ATI should do some work on the drivers to squeeze some more FPS from the 3x00 series. Also, game developers: stop favoring brands - you end up losing market in the end.


----------



## EastCoasthandle (Jan 22, 2008)

The FPS Labs review uses a Foxconn N68S7AA (NVIDIA 680i) SLI motherboard that only has PCIe 1.1, not 2.0 (correct me if I am wrong). Remember, the 3870 X2 wants a PCIe 2.0 slot. Also, this is no different than the 7950 GX2 vs. X1950 XTX last time around.







Source




> As you know, the ATI Radeon HD 3870 X2 board carries two graphics processors. According to the available information, the reference ATI Radeon HD 3870 X2 will have 1 GB of GDDR3 memory installed, operating at 2 GHz and connected to the processors over 256-bit buses. *The product is meant to connect to a PCI Express 2.0 bus*, with a special bridge chip responsible for the interaction. It seems that the chip used in this role will be not the PEX6347, as expected, but the PEX8548. In this scheme, the bridge sits at the center of the board between the two processors; near each processor, four memory chips can be found.
> 
> The following illustration shows that the transition from PCI Express 1.1 to PCI Express 2.0 provides a noticeable increase in performance - around 20-30%, depending on the application and the screen resolution.


source
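On the PCIe point, the raw numbers behind the 1.1-vs-2.0 gap are easy to sanity-check: PCIe 1.x signals at 2.5 GT/s per lane and PCIe 2.0 at 5 GT/s, both with 8b/10b line coding (8 data bits carried per 10 bits on the wire), so an x16 slot doubles from 4 GB/s to 8 GB/s per direction. A quick sketch (the function name is my own):

```python
def x16_bandwidth_gb_s(transfer_rate_gt_s):
    """Usable one-way bandwidth of an x16 slot in GB/s.

    transfer_rate_gt_s: per-lane signaling rate in GT/s.
    8b/10b coding keeps 8 of every 10 bits; 16 lanes; 8 bits per byte.
    """
    return transfer_rate_gt_s * 8 / 10 * 16 / 8

print("PCIe 1.1 x16:", x16_bandwidth_gb_s(2.5), "GB/s")  # 4.0 GB/s
print("PCIe 2.0 x16:", x16_bandwidth_gb_s(5.0), "GB/s")  # 8.0 GB/s
```

Whether the X2's bridge chip actually saturates a 1.1 link is a separate question, but the 20-30% gains quoted above suggest the extra headroom matters.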


----------



## Makaveli (Jan 22, 2008)

Do you really believe PCI E 1.0 bandwidth is holding this card back?

Seems like you are hoping so?


----------



## niko084 (Jan 22, 2008)

Makaveli said:


> Do you really believe PCI E 1.0 bandwidth is holding this card back?
> 
> Seems like you are hoping so?



Well, I believe the Ultra uses almost all the bandwidth of a PCIe 1.0 slot, so I would figure this card would too.


----------



## EastCoasthandle (Jan 22, 2008)

I would ignore the Crysis results. From what I read in another post, Crysis can't use more than 128 shaders, which means it's not using all of the 3870's shaders. Two GTs in SLI is still 2x128.

Let's not forget.


----------



## EastCoasthandle (Jan 22, 2008)

Makaveli said:


> Do you really believe PCI E 1.0 bandwidth is holding this card back?
> 
> Seems like you are hoping so?



The review found in the OP uses a 2.0-capable MB. The review from FPS Labs uses an SLI 680i MB and clearly shows a decrease in performance in most games. Why would you use a CF video card on an SLI motherboard as the de facto measure of gaming performance? Looks like you are hoping that the lower performance found using a 680i MB is the truth, when it's the worst-case scenario :shadedshu


----------



## hacker111 (Jan 22, 2008)

Love it!


----------



## niko084 (Jan 22, 2008)

EastCoasthandle said:


> I would ignore the Crysis results.  Crysis can't use more than 128 shaders which means it's not using all of 3870's shaders.  2 GTs in SLI is still 2*128.
> 
> Lets not forget



Ya Crysis is some BS.... I bought it, maybe I should file a complaint to the BBB


----------



## Tekmuch (Jan 22, 2008)

Airbrushkid said:


> The same can be said about ATI. Look at some of the games when they start up and the splash screen is ATI. Isn't F.E.A.R. one of those biased toward ATI? Why didn't they use World in Conflict? It's a DX10 game. It takes ATI producing a card with *two GPUs* to catch up with NVIDIA.



What is with the hate? Do you not want any competition at all? You've had your time to gloat on your favorites. Let the ATi folks have some too.


----------



## pentastar111 (Jan 22, 2008)

Tekmuch said:


> What is with the hate? Do you not want any competition at all? You've had your time to gloat on your favorites. Let the ATi folks have some too.


I think it is fantastic that AMD/ATI is finally rivalling nVidia, and at a fraction of what the GeForce cards are selling for! I'm going to be switching camps for my next build (all AMD this time around); the price vs. performance ratio is very, very good. NVIDIA seems to have borrowed the same "bloated" Intel mindset when it comes to their pricing tactics. Don't get me wrong, this rig I'm currently running is a very nice machine; I have no complaints as far as the performance goes, but it did cost a pretty penny to assemble :shadedshu... I'm figuring that for that same amount of cash, the AMD/ATI configuration would be a very healthy beast indeed.


----------



## DaMulta (Jan 22, 2008)

I want one of these in a LAN case.

I can't wait to see a CrossFire review of these cards, or a quad-GPU setup if they bring a driver out for it. I have run CrossFire HD 2900 XTs and I want to see what that would be with four cards, which is about what you could consider a CrossFire 3870 X2 setup.

So far I like the looks of the new dual-GPU card.


----------



## pentastar111 (Jan 22, 2008)

DaMulta said:


> I want one of these in a LAN case.
> 
> 
> I can't wait to see a CrossFire review of these cards, or a quad-GPU setup if they bring a driver out for it. I have run CrossFire HD 2900 XTs and I want to see what that would be with four cards, which is about what you could consider a CrossFire 3870 X2 setup.
> ...


I'm getting this LianLi case
http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=204&products_id=22439


----------



## Airbrushkid (Jan 22, 2008)

It's not hate, it's the truth. ATI needs two GPUs to stay with NVIDIA's one GPU. Just pointing it out. It's funny that you guys will buy that.




Tekmuch said:


> What is with the hate.  Do you not want any competition at all.  You've had your time to gloat on your favorites.  Let the ATi folks have some to.


----------



## AsRock (Jan 22, 2008)

I'd still like to know: when a game doesn't support the X2's two GPUs, does performance crash way below the GTX/Ultra? Or will it use both GPUs all the time?


----------



## OrbitzXT (Jan 22, 2008)

Airbrushkid said:


> It's not hate, it's the truth. ATI needs two GPUs to stay with NVIDIA's one GPU. Just pointing it out. It's funny that you guys will buy that.



As someone who doesn't hide his love of nVidia, I have to say your argument is pretty stupid. Who cares how they achieve their numbers? When it comes down to it, all that really matters to us is price and performance. That said, as I pointed out earlier, I will not buy the R680 because ATI drivers don't have functional image scaling; otherwise I'd probably be interested in this card because of its price and performance.

Grow up, people; who cares how many GPUs it takes to get the performance? Do you really think that 10 years from now we'll have graphics cards with a single processing unit on them? nVidia tried this method earlier and it wasn't great because of bad drivers; if ATI can pull it off and have good drivers to support it, more power to them.


----------



## magibeg (Jan 22, 2008)

Airbrushkid, you have to realize that to an end user it doesn't matter how many cores they need to use, so long as they deliver ample performance over the competition.

I'm really not sure why there's a semi-flame war going on. Also, for the nVidia fans out there: the next-gen nVidia card will come out and do well, I'm sure, but it doesn't exist yet, so you can't compare this to something that doesn't exist.

All that said and done, as long as ATI prices this properly and there's no massive price gouging, this would be an excellent product for the high end.

edit - I see Orbitz beat me to the punch here


----------



## a111087 (Jan 22, 2008)

Airbrushkid said:


> It's not hate, it's the truth. ATI needs two GPUs to stay with NVIDIA's one GPU. Just pointing it out. It's funny that you guys will buy that.



yes, it's two GPUs on one card, so what? What is important is how it performs, and it performs great!


----------



## D.F. (Jan 22, 2008)

Airbrushkid said:


> It's not hate, it's the truth. ATI needs two GPUs to stay with NVIDIA's one GPU. Just pointing it out. It's funny that you guys will buy that.




It's a good card. From what I read before, the 9800 GX2 (or something like that), which uses two GPUs, was only about 30%-40% faster than the 8800 Ultra. This card is already about 15%(?) faster than the 8800 Ultra if you average all the results, and I'll bet it will be cheaper than the 9800 GX2.


----------



## HaZe303 (Jan 22, 2008)

Better performance than I expected from the card. When ATI/AMD gets the drivers figured out, maybe this card will be the monster it should be. Now the only things that concern me are cooling the beast, and the price.


----------



## Scrizz (Jan 22, 2008)

so does windows see it as one?


----------



## HaZe303 (Jan 22, 2008)

Scrizz said:


> so does windows see it as one?


Hope so; maybe that will help with games that have no xfire/SLI support? But I seriously doubt it, so I think it will show up as xfire, or however ATI wants it to show up in its drivers.


----------



## [I.R.A]_FBi (Jan 22, 2008)

does a car driver care if a car has 4 cylinders or 8 if it's fast? i think not!


----------



## OrbitzXT (Jan 22, 2008)

Doesn't one use more gas than the other? I don't know anything about cars but I was told that.


----------



## [I.R.A]_FBi (Jan 22, 2008)

OrbitzXT said:


> Doesn't one use more gas than the other? I don't know anything about cars but I was told that.


depends... but a Corvette (V8) returns similar gas mileage to an RSX (I4)...


----------



## TooFast (Jan 23, 2008)

2 gpus on one board is way cheaper/better than 2 cards glued together!
Go ati!


----------



## csplayer089 (Jan 23, 2008)

and chances are NVIDIA's dual-GPU card, the 9800 GX2, will kick this thing so hard in the nuts that AMD will have to go back to the drawing board.


----------



## TooFast (Jan 23, 2008)

http://www.fpslabs.com/reviews/video/amd-radeon-hd-3870-x2-review

the best single card solution is now from amd/ati!!!!!!!!!


----------



## Hawk1 (Jan 23, 2008)

Why does it always end in a flame war?!:shadedshu


----------



## [I.R.A]_FBi (Jan 23, 2008)

csplayer089 said:


> and chances are NVIDIA's dual-GPU card, the 9800 GX2, will kick this thing so hard in the nuts that AMD will have to go back to the drawing board.


A new AMD GPU is coming in H2 2008... NVIDIA was caught with their pants down...


----------



## Esse (Jan 23, 2008)

DaMulta said:


> Or a Quad card setup if they bring a driver out for it.



Which is not going to happen because there is only one CF bridge.


----------



## ShadowFold (Jan 23, 2008)

WildCat87 said:


> Weak.



OMG 4FPS OH NOEZ WHAT WILL ATI DEW

I sure hope you're being sarcastic tho :shadedshu


----------



## Scrizz (Jan 23, 2008)

guess i'll just have to dream about the HD4990X2


----------



## Scrizz (Jan 23, 2008)

w00t

"All CrossFire connectivity appears to be handled internally, and the computer recognizes the X2 as a single video card."

http://www.fpslabs.com/reviews/video/amd-radeon-hd-3870-x2-review/page-3


----------



## Ravenas (Jan 23, 2008)

Just another reason I keep considering the 3870.


----------



## Mussels (Jan 23, 2008)

It's good news that it's decently fast and is seen as a single card. Needing PCI-E 2.0 is a bummer, however; can't hide that.


----------



## phanbuey (Jan 23, 2008)

[I.R.A]_FBi said:


> new AMD GPU coming in h2 2008 .. nvidia was caught with their pants down ...



eeerrrrmmm... if anything, it's that ATi is pulling their pants back up .


----------



## Mussels (Jan 23, 2008)

ATI has a dual-GPU card that just beats Nvidia's top single card, as long as it's on PCI-E 2.0.

Nvidia have the 9800 GX2 coming out very soon.

ATI have an unknown card due in 2008 (omg, shocking, that).

It's always like this... each company bringing out a new card. Why can't you people just shut up, read the reviews once details ARRIVE and the cards actually exist, and just buy whichever is the best value for money?


----------



## erocker (Jan 23, 2008)

WildCat87 said:


> Weak.



Don't worry mate!  AMD is holding this card back five more days to fix just that.^^^  Just wait for the reviews a couple weeks from now when this card owns all tests.  Unless you are just not happy because you are one of those corporate followers, or fanboys, as some call them.  That's just ridiculous!


----------



## Giletus (Jan 23, 2008)

I have an older 8800 GTS now. I think when I build a new system next month, I will try one of these out. Does anyone think they will build multi-GPU cores?


----------



## erocker (Jan 23, 2008)

Absolutely!  ATi will anyways.  Man, if Intel would just buy Nvidia now...  or not, their cards would be way too good.


----------



## erocker (Jan 23, 2008)

Giletus said:


> Have older 8800 gts now,I think when I build new system here next month, I will try one of these out. Does anyone think they will build multi-gpu cores ?



What clocks are you running your card at?  I've been running at 700/1550shaders/1000memory for a couple months now and this card is still plenty fast.  I can't wait to see how it does in my new system!


----------



## asb2106 (Jan 23, 2008)

erocker said:


> Absolutely!  ATi will anyways.  Man, if Intel would just buy Nvidia now...  or not, thier cards would be wayy too good.



I would really like that. I use ATI now because I can run a multi-card solution with Intel chipsets, and I need my Intel chipset, but if Intel bought out Nvidia I would fly to their graphics cards immediately.


----------



## Giletus (Jan 23, 2008)

I am running an EVGA 8800 GTS 320MB version, factory overclocked. I am using a Dell case so I don't dare push it more before my new build and better case. It already gets pretty hot; can't remember the temps, but high enough to make me worry. On the new build I will use this video card for now to see how it performs - building a system around a Core 2 Duo 6750.


----------



## Mussels (Jan 23, 2008)

asb2106 said:


> I would really like that, I use ATI now because I can run a multi card solution with Intel chipsets, and I need my Intel Chipset, but if Intel bought out Nvidia I would fly to their graphics cards immediately



I too await the day good SLI boards come out.

Looking at the track record (650i/680i), the Nv chipsets suck ass for next-gen tech, while the Intel chipsets manage to support CPUs 1-2 generations later (I've seen a Q6600 G0 on a 975X).


----------



## asb2106 (Jan 23, 2008)

Mussels said:


> I too, await the day good SLI boards come out.
> 
> Looking at the track record (650i/680i) the Nv chipsets suck ass for next gen tech, while the intel chipsets manage 1-2 generations later of CPU's (i've seen a Q6600 G0 on a 975x)



and OCing ability. My buddy has an ASUS 680i Striker and the top we could get out of my Q6600 was 3.2, maybe 3.3, but it was shaky.

On my P5B Deluxe (I know it's old; X48 is mine as soon as it comes out) I get a solid 3.6GHz. I had 3.8 but I couldn't keep it stable 24/7 with SETI running.

EDIT** my P5B is a 965 chipset, works great with my Q6600 BTW


----------



## asb2106 (Jan 23, 2008)

oh and with my P965, check this - just got done and I was pretty excited about it

I'm very thankful for OCing because I'll never spend a grand on a proc, just can't do it


----------



## Ravenas (Jan 23, 2008)

Mussels said:


> ATI has a dual GPU card that just beats Nvidias single top card, as long as its on PCI-E 2.0
> 
> Nvidia have the 9800GX2 coming out very soon.
> 
> ...



I'm not going to jump in the middle of this, and yet I agree with you BUT... Companies like ATI and NVIDIA purposely build brand loyalty to keep lifelong customers. It is their goal, and nearly every company's goal. Whether we like it or not, brand loyalty does and will always exist until the end of the consumer age.


----------



## asb2106 (Jan 23, 2008)

I really was excited about getting the 3870 X2, but now that I hear the R700s are so close to release I'm thinking of holding back till they come out.... what do y'all think?


----------



## asb2106 (Jan 23, 2008)

Ravenas said:


> I'm not going to jump in the middle of this, and yet I agree with you BUT...Companies like ATI and NVIDIA purposely build brand loyalty to keep life long customers. It is there goal, and nearly every companies goal. Wether we like it or not, brand loyalty does and will always exist until the end of the consumer age.



I agree 110%. Brand loyalty with graphics is no different than with cars; look at all the people that won't buy anything but Ford, or any other company for that matter.


----------



## erocker (Jan 23, 2008)

asb2106 said:


> I really was excited about getting the 3870 x 2 but now that I hear the r700's are so close to release im thinking of holding back till they come out.... what y'all think?



If you are able to keep a backup video card on hand, I would get the 3870X2.  If R700 turns out to be a wonderful, magical chip, you can always sell it and get that.  Now, I know you may be thinking that R700 is going to drive 3870X2 prices down, and that may be, but those "in the know" will know how good R700 is going to be long before it's released.  Hence, the backup card.  I guess it's just a good way to have the "latest" stuff and not be careless with your money.


----------



## asb2106 (Jan 23, 2008)

erocker said:


> If you are able to keep a backup video card on hand, I would get the 3870X2.  If R700 turns out to be a wonderful, magical chip, you can always sell it and get that.  Now, I know you may be thinking that R700 is going to drive 3870X2 prices down, and that may be, but those "in the know" will know how good R700 is going to be long before it's released.  Hence, the backup card.  I guess it's just a good way to have the "latest" stuff and not be careless with your money.



True, I have a 3870 now that's been pretty good to me. I'm almost looking at it as my "backup" card.

For some reason though I'm not really feeling like the R700 is gonna be that great; ATI has been a little disappointing lately.


----------



## Mussels (Jan 23, 2008)

With the exception of the 9700/9800 Pro and the 8800GTX, few cards have ever lived up to the pre-release hype.

If you can wait, wait. If not, grab the best value now.

Yes, I have an 8800GTX - but I'm just as happy with my 'backup' 8800GT in my media PC, for less than half the price.


----------



## asb2106 (Jan 23, 2008)

Mussels said:


> with the exception of the 9700/9800pro and 8800GTX, few cards have ever lived up to the pre-release hype.
> 
> If you can wait, wait. If not, grab the best value now.
> 
> Yes i have an 8800GTX - but i'm just as happy with my 'backup' 8800GT in my media PC, for less than half the price.



It's almost funny when you can call an 8800GT a backup. I give that card a lot of credit; I think the 8800GT surpassed its pre-release hype. I think G92 has really shown a good side, with a great future for Nvidia, but I feel the same way about the 55nm tech that ATI has started on - I think it will have a good future.


----------



## erocker (Jan 23, 2008)

Why not just go crossfire?


----------



## asb2106 (Jan 23, 2008)

erocker said:


> Why not just go crossfire?



The day I see a 3870 for <$200 (which I think will be soon) I'm gonna pick one up to hold off for the R700. That's been my thought today actually!


----------



## Mussels (Jan 23, 2008)

I won't go CrossFire, or ATI, for one reason - the broken scaling.

I got so pissed when I was on ATI with my widescreen because I could not stop 4:3 games from stretching - games like BF2 do not support widescreen and never will, because EA suck balls.

Nvidia and ATI both have options to fix it; it's just that ATI's doesn't work, and I won't spend $2K on a Dell screen that does it itself.

Just so people know, ATI has an option for 'centered timings' that claims to add the black bars to maintain aspect ratio. Stupidly, it's broken and ONLY works at the monitor's native ratio.

16:10 screen, 1680x1050 resolution: full screen, no blur.
Same screen at 1440x900: fills the screen but blurry. Both ATI and Nv scaling work here - adds black bars top AND bottom, but removes the blur.

Use a 4:3 res, say 1280x1024 (BF2, older games, 3DMarks), and you get the following:
Nvidia: black bars on the sides, tiny black bar on top - no blur, everything's peachy.
ATi: full-screen stretching, horrible blur. ATI's response (if you get one): "Buy a screen with built-in scaling."

It's been this way since the X1k series, and now they're into the HD 3k series without fixing it. ATI are ignoring a simple driver issue that's making a lot of people angry.
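The arithmetic behind those black bars is simple; here's a minimal Python sketch of aspect-preserving scaling (the `fit_aspect` name is made up, and the resolutions are just the examples from this post):

```python
def fit_aspect(src_w, src_h, screen_w, screen_h):
    """Scale an image to fit the screen without changing its aspect
    ratio. Returns (scaled_w, scaled_h, side_bar, top_bar), where
    side_bar/top_bar are the black-bar sizes on each edge."""
    scale = min(screen_w / src_w, screen_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    return w, h, (screen_w - w) // 2, (screen_h - h) // 2

# 1280x1024 (5:4) on a 1680x1050 (16:10) panel: bars on the sides
print(fit_aspect(1280, 1024, 1680, 1050))  # → (1312, 1050, 184, 0)
```

That 184-pixel pillarbox on each side is roughly what a working scaler is supposed to give you: scale to fit, pad the rest with black, no stretching.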


----------



## Ravenas (Jan 23, 2008)

asb2106 said:


> The day I see a 3870 for <200 (which I think will be soon) Im gonna pick up one to hold off for the r700, thats been my thoughts today actually!


----------



## Mussels (Jan 23, 2008)

you just made people skip my long post  scroll up folks!


----------



## asb2106 (Jan 23, 2008)

Ravenas said:


>



???

and @mussels

Odd, I've never had a problem like that with my 1950 Pro CrossFire setup. When I had those 2 in CF I had them on my 19" 1440x900, but the only games I really played were FEAR, Ghost Recon 1 & 2, and B&W 2. Didn't really get into any other games then, and I don't remember ever seeing issues like that. Weird?

Any articles you could point me to that I could read about that?


----------



## Ravenas (Jan 23, 2008)

Those are the r700 wafers.


----------



## Mussels (Jan 23, 2008)

Not any articles, and I've ditched ATI here so I can't take photos...
(the games you listed all support widescreen to some extent, except maybe B&W2)


The blurring is monitor-related, but I challenge anyone on ATI with a widescreen monitor to run a game at a 4:3 resolution and NOT have it stretch to fill the screen.

The best I could manage would be to show the scaling on Nvidia with photos and have someone on ATI do the same. Any ATI users out there with a digital camera, PM me and we can work up an article with pics on this.


----------



## asb2106 (Jan 23, 2008)

Ravenas said:


> Those are the r700 wafers.



oohhh sweet!  Are they gonna use the 55nm tech? u know?


----------



## asb2106 (Jan 23, 2008)

Mussels said:


> not any articles, and i've ditched ATI here so i cant take photos...
> (the games you listed all support widescreen to some extent, except maybe B&W2)
> 
> 
> ...



OOOHhh, a 4:3 on a widescreen - I understand, but that would only be a problem with older games, right? Because most new games run widescreen.

And couldn't you just run the game windowed? I have done that with my emulators because it looked like crap running widescreen when I had CrossFire.


----------



## Mussels (Jan 23, 2008)

asb2106 said:


> OOOHhh a 4:3 on a widescreen, I understand, but that would only be a problem with older games right, because most new games run widescreen.
> 
> And couldnt you just run the game windowed?  I have done that with my emulators because it looked like crap running widescreen when i had crossfire



You COULD run the games/programs windowed... but that's crap. Quite often moving the mouse out of the window in an RTS results in the game minimising or otherwise going stupid.

It's not just a problem with older games, as many MODERN games don't properly support widescreen.

Widescreen comes in two flavours: vert- or hor+.
Vert- gives you the same horizontal view as a 4:3 image, but cuts the top off - you get LESS image than a 4:3 user would.

Hor+ gives you the same vertical view, with more on the sides (real widescreen).

If I have a game with vert- (such as Bioshock on first release) I would rather play at 1280x1024 than lose part of the graphics.

Also... even with a GTX and a quad, there's the odd game that I can't max out (Crysis, cough). Why the hell should I run in a window if I can't run at max res? How do people with 1080p screens handle this kind of thing on ATI??
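The hor+ behaviour described above can be put in numbers with the usual FOV conversion formula; a small Python sketch (the function name and the 90° base FOV are just for illustration):

```python
import math

def horplus_hfov(base_hfov_deg, base_aspect=4/3, new_aspect=16/10):
    """Hor+ behaviour: keep the base vertical FOV and widen the
    horizontal FOV in proportion to the new aspect ratio.
    Vert- would instead keep the horizontal FOV and crop vertically."""
    half = math.radians(base_hfov_deg) / 2
    wide = math.atan(math.tan(half) * new_aspect / base_aspect)
    return math.degrees(2 * wide)

print(round(horplus_hfov(90), 1))  # 90° at 4:3 widens to ~100.4° at 16:10
```

So a proper hor+ game genuinely shows you more on the sides, while a vert- game shows a 16:10 user strictly less than a 4:3 user sees.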


----------



## asb2106 (Jan 23, 2008)

Mussels said:


> you COULD run the games/programs windowed... but thats crap. Quite often moving the mouse out of the window in an RTS results the game minimising or otherwise going stupid.
> 
> Its not just a problem with older games, as many MODERN games dont properly suppot widescreen.
> 
> ...



OK, I can agree with the windowing and mouse issue. I run Crysis at 1920x1200 on medium with like 2 settings on high and I get about 25FPS. My card is OCed pretty high and it looks pretty good. To some, 25fps sucks, but I can deal, I don't care too much. I'd like to get 40+ but beggars can't be choosers.


----------



## erocker (Jan 23, 2008)

asb2106 said:


> I run Crysis *at 1920x1200




You need another card.  Or the X2.


----------



## Ravenas (Jan 23, 2008)

asb2106 said:


> oohhh sweet!  Are they gonna use the 55nm tech? u know?



45nm as far as I know.


----------



## asb2106 (Jan 23, 2008)

Ravenas said:


> 45nm as far as I know.



exciting, and Id like to get a second 3870, but not for Crysis, already beat it and have no intentions of playin it anymore


----------



## btarunr (Jan 23, 2008)

EastCoasthandle said:


> I would ignore the Crysis results.  From what I read in another post Crysis can't use more than 128 shaders which means it's not using all of 3870's shaders.  2 GTs in SLI is still 2*128.
> 
> Lets not forget



Absolutely. It's something like Half-Life 2 benches favouring ATI cards (at least they used to).



WildCat87 said:


> Weak.



Yeah right. The R680 cards will be priced $400~$500. The 8800 Ultra cards are priced $600~$700. You're losing 4 fps while saving $100~$200.
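That trade-off is easy to put in fps-per-dollar terms; a quick sketch using the midpoints of those price ranges and hypothetical frame rates (60 vs 64 fps, i.e. the 4 fps gap):

```python
# Rough price/performance: R680 at ~$450 (midpoint of $400~$500),
# 8800 Ultra at ~$650 (midpoint of $600~$700). The fps values are
# made up to illustrate a 4 fps deficit.
def fps_per_dollar(fps, price):
    return fps / price

print(round(fps_per_dollar(60, 450), 3))  # R680:  0.133 fps per dollar
print(round(fps_per_dollar(64, 650), 3))  # Ultra: 0.098 fps per dollar
```

By that yardstick the slightly slower card still wins comfortably on value.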


----------



## Scrizz (Jan 23, 2008)

EDIT: nvm


----------



## pentastar111 (Jan 23, 2008)

btarunr said:


> Absolutely. It's something like Half-Life 2 benches favouring ATI cards (at least they used to).
> 
> 
> 
> Yeah right. The R680 cards will be priced $400~$500. The 8800 Ultra cards are priced $600~$700. You're losing 4 fps while saving $100~$200.


Good point!


----------



## TooFast (Jan 23, 2008)

http://www.tomshardware.com/2008/01/23/ati_r680_the_rage_fury_maxx_2/page19.html


AMD Radeon HD 3870 X2
The fastest card today, once again, carries an AMD label, after long term domination by NVIDIA


----------



## btarunr (Jan 23, 2008)

LOL:

Good ol' Tom's Hardware

First review from a good reviewer: http://www.tomshardware.com/2008/01/23/ati_r680_the_rage_fury_maxx_2/index.html

Whoa:

*Respect for ATI grows*


----------



## Mussels (Jan 23, 2008)

ah I love Tom's... first they write an article about how the Phenom is a failure and doesn't work on any AM2 boards (only Asus and Gigabyte AM2, otherwise you need AM2+)... then a week later, an article about the Phenom, how great it is and why it's so awesome that it works on every AM2 board....

Tom's has some issues, possibly with the writers themselves.


edit: that link says R700 was delayed til 2009. Ouch.


another edit cause I like them: this seems a quality review. Very good for Tom's.


----------



## btarunr (Jan 23, 2008)

What's interesting is the variation between the results of the PCOnline.com.cn review and the Tom's review. Even more interesting are the UT III scores, considering it's another of those titles rumored to 'favour' GeForce.


----------



## Scrizz (Jan 23, 2008)

man i never got to read toms review


----------



## btarunr (Jan 23, 2008)

I was wondering if any reviewer benched a HD3870 X2 against two HD3870 cards in CrossFire.


----------



## btarunr (Jan 23, 2008)

Scrizz said:


> man i never got to read toms review



Yup, the webmaster is moving the links; we have to wait till he fixes them. The link and the pics worked a few hours ago and Mussels read the review.

Try going here: http://www.tomshardware.com/graphics/index.html and clicking on the article "ATI Radeon HD 3870 X2 - Fastest Yet!" to see if it works. If you get a 404 from Tom's, then they're still fixing it.


----------



## Scrizz (Jan 23, 2008)

nope it's not up yet

oh and on a side note I can't activate my account on LoG


----------



## tkpenalty (Jan 23, 2008)

It's faster than an Ultra.... costs as much as a GTX (which seems to be EOL to me) and runs cooler (as well as taking two slots instead of four). Nice card.

Guys, you have to remember that ATi has the performance crown now.

Yes, power usage is 30W more, but it's not like you will run into any cooling issues... RV670s run cool. It would be better if some manufacturer made one of these with two VF700ALCUs or even VF900CUs... I mean, you can install them yourself as there is enough space. Let's just hope the non-reference-cooling R680s will come with the stiffening bar that the reference design has. It's a bit long though... GTX length.


----------



## tkpenalty (Jan 23, 2008)

Mussels said:


> you COULD run the games/programs windowed... but thats crap. Quite often moving the mouse out of the window in an RTS results the game minimising or otherwise going stupid.
> 
> Its not just a problem with older games, as many MODERN games dont properly suppot widescreen.
> 
> ...



You are overreacting (sorry to answer an old question), but blame the problem on your monitor, because the 22-inch monitors I've seen automatically add the black bars. (Haven't touched Samsung; you might as well blame them instead.)


----------



## Nyte (Jan 24, 2008)

FYI, just so that reviewers don't trounce on AMD and make its stock plummet...

All the reviews so far (3 as of writing this post) seem to have not taken notice of the fact that this is NOT a PCIe 2.0 card.  Let me explain...

The HD3870 ASICs themselves are PCIe 2.0 compliant.  However, did they ever wonder how these 2 GPUs talk to each other?  Well, there's this switch chip situated between the 2 GPUs (you can see it in the review photos as well).

This switch is a PCIe 1.1 part.  This means that all data transfers in and out of this switch (ie. between the GPUs and the motherboard) are at 1.1 speeds and not at 2.0 speeds.

This isn't a bad thing.  We will not need 2.0 bandwidth for quite some time, considering that even AGP 8x is still holding its own.  But this should be known NOW, so no customer purchases it, goes all CRAZY saying it's not 2.0, and starts a smear campaign all over the net, pulling AMD's stock price down (NVIDIA fanboys included in this smear campaign rally).

The reviewers are most likely not checking the PCI configuration space of the devices (ie. the switch) and are relying on Catalyst Control Center (which reports the capabilities of the HD3870).


My 2 cents.
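For the numbers behind that argument: PCIe 1.x signals at 2.5 GT/s per lane and PCIe 2.0 at 5 GT/s, and 8b/10b encoding puts 10 bits on the wire per data byte. A quick Python sketch (function name is made up):

```python
def pcie_x16_mb_s(gt_per_s, lanes=16):
    """Usable PCIe bandwidth in MB/s: with 8b/10b encoding, 10 bits
    on the wire carry one data byte, so GT/s / 10 = GB/s per lane."""
    return gt_per_s * 1e9 / 10 / 1e6 * lanes

print(pcie_x16_mb_s(2.5))  # PCIe 1.1 x16: 4000.0 MB/s
print(pcie_x16_mb_s(5.0))  # PCIe 2.0 x16: 8000.0 MB/s
```

So a 1.1 switch caps the card's link to the board at ~4 GB/s in each direction, which, as the post says, is still plenty for today's games.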


----------



## erocker (Jan 24, 2008)

Nyte said:


> FYI, just so that reviewers don't trounce on AMD and make its stock plummet...
> 
> All the reviews so far (3 as of writing this post) seem to have not take notice of the fact that this is NOT a PCIe 2.0 card.  Let me explain...
> 
> ...




Thing is, it doesn't require a PCI-E 2.0 bridge between the two chips - PCI-E 1.1 is plenty sufficient - but when you need to send info from both GPUs to the motherboard, PCI-E 2.0 is needed.


----------



## Nyte (Jan 24, 2008)

erocker said:


> Thing is, it doesn't require a pci-e 2.0 bridge between the two chips, a pci-e 1.1 is plenty sufficient, but when you need to send info from both gpu's to the motherboard pci-e 2.0 is needed.



It's not needed, because the slave GPU will never need to talk to the system.  The slave GPU is only there to update the framebuffer for its own portion (by portion, I mean every other frame, or half a frame).  The slave will never read from the system and it will never write to the system.  The master GPU coordinates all of that.


----------



## erocker (Jan 24, 2008)

I'm kinda losing you..  Are you saying the pci-e 2.0 isn't needed or the pci-e 1.1?


----------



## Nyte (Jan 24, 2008)

erocker said:


> I'm kinda losing you..  Are you saying the pci-e 2.0 isn't needed or the pci-e 1.1?



There is no PCIe 2.0 on the board is what I'm saying.  The switch chip which arbitrates communication between the 2 HD3870s is 1.1.

Think of the switch chip as a central connection between the 2 GPUs and the motherboard (it has 3 connection ports).  The data transfers are therefore limited to 1.1 speeds (which is definitely sufficient).

The reviewers all report that the board operates at 2.0 speeds, which is not true.  That's all.


----------



## erocker (Jan 24, 2008)

Makes sense.  Are there any benchmarks using a 1.1 vs. 2.0 motherboard out there?


----------



## Nyte (Jan 24, 2008)

erocker said:


> Makes sense.  Are there any benchmarks using a 1.1 vs. 2.0 motherboard out there?



There definitely are.  The X38s, RD790s, and 780i's (the 780i is not true 2.0 - it also uses a 1.1 switch) are all 2.0 compliant.  You obviously need a 2.0 graphics card as well, but the performance difference is minimal (if any exists at all).

AGP 8x is still comparable with 1.1.  And 2.0 is double 1.1...

What's 2 x minimal equal to?


----------



## erocker (Jan 24, 2008)

Well, I would be more interested in seeing benchmarks from a PCI-E 2.0 motherboard vs. say a 680i chipset, or a 975X.


----------



## Mussels (Jan 24, 2008)

tkpenalty said:


> You are overreacting (sorry to answer an old question), but blame the problem on your monitor because 22inch monitors i've seen automatically add the black bars. (Haven't touched samsung, you might as well blame them instead).



Not really over-reacting... and I've used a lot of monitors. In my experience only Dell and Apple (24" and above) add the bars.


Feel free to prove me wrong - show me it. LOTS of people argue this, mostly because of some mistake they made (some people saw it on movies, others didn't realise it's 4:3 games that screw up, and so on).

I've tried 22" models in the $300 to $600 (AU) price range from Asus, Acer, Dell (not UltraSharp), Chi Mei, CMV, and one generic Chinese one I can't remember. Most of them are new enough to support HDCP, and one even had HDMI support - none of them had built-in scaling. My TV does, but pretty much only a few expensive models support it, and they're all 24" or larger, or cost over $2K.


----------



## Edito (Jan 24, 2008)

tkpenalty said:


> Its faster than an Ultra.... costs as much as a GTX (Which seems to be EOL to me) and runs cooler. (As well as taking two slots instead of four). Nice card .
> 
> Guys, you have to remember that ATi has the performance crown now.
> 
> Yes power usage is 30W more but its not like you will run into any cooling issues... RV670s run cool. It would be better if some manufacturer makes one of these with two VF700ALCUs or even VF900CUs... I mean you can install them yourself as there is enough space . Lets just hope that the non-reference cooling R680s will come with the stiffening bar that the reference has. Its a bit long though... GTX Length.



I don't think ATI/AMD has the performance crown, because there is no big difference between the cards, and the Nvidia cards have better performance according to these benchmarks: http://forums.techpowerup.com/showthread.php?t=44484. I just think ATI is doing fine because they are doing a lot just to get closer to the actual Nvidia performance. I mean, the dual-GPU 3870 X2 is inferior to a single 8800 Ultra in some benches; with those results I can't say ATI has the performance crown. Don't forget, I'm not an ATI hater - I'm just telling what I think is the truth.


----------



## eidairaman1 (Jan 27, 2008)

WildCat87 said:


> I won't argue about the drivers.
> 
> A brand new, high-end, dual-GPU video card, even if only $50, getting just 1 frame more than an old, single-GPU card is still weak. Excellent price/performance ratio, but I was truly expecting more as far as pure performance, even with launch drivers.



Boy, you are the one to talk, yet you own 2 ATI products. Also, from a starting price point, the performance-per-dollar ratio is far better than that of the 8800 Ultra; I mean, seriously, paying 800 bucks for a gaming card is dumb. I'd understand paying that much for a Quadro or FireGL card.
Also, you gotta realize that software hasn't yet caught up to dual-chip GPUs the way it has to multicore CPUs. I recall single-core CPUs outperforming the dual-core CPUs back in the day; well, it took time for software to mature.


----------



## Mussels (Jan 27, 2008)

I believe Wildcat's point is that he was expecting more of a performance leap.

Wildcat: FYI, VERY few launches make great performance leaps. The Core 2 Duo line and the 8800 series were the first 'huge leaps' in a long time. Generally, performance increases gradually between generations over a period of 2-3 years, not this incredible doubling of performance.

These performance leaps are rare; we just got lucky having two (CPU and GPU) happen at the same time.


----------

