Monday, February 4th 2008

ATI Radeon HD 3870 X2 CrossFireX Tested

Admittedly this isn't the first review of AMD's new 3870 X2 running in CrossFireX, but it does seem to give a better indication of real-world performance. As in the previous test, this article shows unimpressive 3DMark scores, but as the author is quick to point out, 3DMark06 is known to be heavily CPU-dependent, so it doesn't always give a reliable indication of real-world GPU performance. More interesting are the benchmarks of Call of Juarez and F.E.A.R., which show performance increases of around 50% and 75% respectively. It's important to remember that AMD's CrossFireX driver is still in development, but these figures should give a rough idea of what to expect. You can read FPSLABS' full article here.
Source: FPSLABS
Add your own comment

24 Comments on ATI Radeon HD 3870 X2 CrossFireX Tested

#1
paul06660
Dynamite Planted!!!
(at the Nvidia Command Post)
Posted on Reply
#2
Hawk1
Looks very good. Not that I would go Quadfire, but good to know it will (should) scale very well. Man, nothing but good news for AMD/ATI the past several days. There must be two moons in the sky or something.
Posted on Reply
#3
mdm-adph
So... anyone got a copy of this CrossFireX "evaluation" driver...? ;)
Posted on Reply
#5
Fitseries3
Eleet Hardware Junkie
It's interesting to see that it seems to do amazingly well on the older DX9 games. It does well on new games too, but most of the benches I've seen have used mostly older games.
Posted on Reply
#6
ShadowFold
If only I hit the lottery :p Looks like ATi got good drivers out for once.
Posted on Reply
#8
InnocentCriminal
Resident Grammar Amender
Interesting article, better than the Tom's Hardware one.

I plan to eventually have CrossfireX, once I've got the most out of a single card. If they don't release a 2nd revision of the card that uses an updated PCI-bridge chip I may just stick with one. Depends...
Posted on Reply
#9
Ravenas
Wow, I've always wanted to try Crossfire or SLI. Seems like now's the time! Nice work on ATI's part! :toast:
Posted on Reply
#10
hat
Enthusiast
I detect extreme cpu bottlenecking, even on a 4GHz C2Q Extreme 45nm...

On second thought, I detect extreme bottlenecking of everything.
Posted on Reply
#11
REVHEAD
hat: I detect extreme CPU bottlenecking, even on a 4GHz C2Q Extreme 45nm...

On second thought, I detect extreme bottlenecking of everything.
Yep, even my 8800 GTX SLI setup is seriously bottlenecked with a quad core @ 3.82GHz and 4 gig of RAM, even more so when I run at 2560x1600. These cards will never reach their potential unless the CPU is overclocked under phase; the CPU is always the limiting factor, and it sucks.
Posted on Reply
#12
OrbitzXT
In most games, isn't it the GPU bottlenecking, not the CPU? The only time you ever see a CPU bottleneck is with benchmarks like 3DMark. Why would the frame rates in these games improve so much with the addition of a second card? If one card were bottlenecked by the CPU, a second card shouldn't result in a drastic increase in frames. I remember reading a good post here that explained it all, and that most users are just unaware of how bottlenecks work and always assume the CPU is bottlenecking just because of 3DMark scores.
Posted on Reply
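OrbitzXT's reasoning can be sketched with a toy frame-time model, where each frame costs the slower of the CPU's and GPU's work. The millisecond figures below are made-up illustrative values, not numbers from the article:

```python
# Toy model: frame time is the slower of CPU and GPU work per frame.
# All millisecond figures are hypothetical, chosen only to illustrate.

def fps(cpu_ms, gpu_ms):
    """Frames per second when frame time = max(CPU time, GPU time)."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: a second card (assume it halves GPU time) doubles the fps.
single_gpu_bound = fps(cpu_ms=8.0, gpu_ms=20.0)       # 50 fps
dual_gpu_bound   = fps(cpu_ms=8.0, gpu_ms=20.0 / 2)   # 100 fps

# CPU-bound: the same second card changes nothing.
single_cpu_bound = fps(cpu_ms=20.0, gpu_ms=8.0)       # 50 fps
dual_cpu_bound   = fps(cpu_ms=20.0, gpu_ms=8.0 / 2)   # still 50 fps
```

On this model, a large gain from a second card (as in the Call of Juarez and F.E.A.R. results) is itself evidence that the single card was GPU-bound rather than CPU-bound.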
#13
imperialreign
OrbitzXT: In most games, isn't it the GPU bottlenecking, not the CPU? The only time you ever see a CPU bottleneck is with benchmarks like 3DMark. Why would the frame rates in these games improve so much with the addition of a second card? If one card were bottlenecked by the CPU, a second card shouldn't result in a drastic increase in frames. I remember reading a good post here that explained it all, and that most users are just unaware of how bottlenecks work and always assume the CPU is bottlenecking just because of 3DMark scores.
I think it also has a lot to do with specific CPUs. The P4's are notorious for bottlenecking high end cards. The 1950s, for example, are great VGA adapters, but when coupled with a P4, they just don't shine like they should.

With the addition of my second card, I only saw a 15-25% increase in performance on average (whereas I was hoping to see the typical 25-35% increase), depending on the app and how heavily GPU-intensive it was.

Damn, I can't wait to get away from this CPU, though :banghead:
Posted on Reply
#14
W1zzard
These are quite early drivers... some benchmarks even see -35% (yes, I have the drivers), and AMD has not released them for this reason. If public opinion is "sucks" at launch, they have to do massive positive PR afterwards.
Posted on Reply
#15
imperialreign
W1zzard: These are quite early drivers... some benchmarks even see -35% (yes, I have the drivers), and AMD has not released them for this reason. If public opinion is "sucks" at launch, they have to do massive positive PR afterwards.
It would seem, then, that ATI is trying to avoid a repeat of the 2900?

Still, though, from all we've seen, there's some massive potential behind these cards - ATI's probably burning the midnight oil on them.
Posted on Reply
#16
HaZe303
Very impressive. I thought I saw in some other bench that a single 3870 X2 got like 13000-14000 in 3DMark06? Now it shows 20000?? Maybe it's the CPU that made the difference?? Maybe time to move over to ATI?? Nah, I'll wait for NV's G9 and then decide which is best for me. Probably most others will do the same; it's a shame that ATI got this "break" at the wrong time.
Posted on Reply
#17
AphexDreamer
imperialreign: ATI's probably burning the midnight oil on them.
Hmm, not quite familiar with that idiom, care to elaborate?

I really want one of these cards. I hope to manage to sell my Pro and for the GDDR4 version to come out soon; I just won't settle for GDDR3, I'm sorry.
Posted on Reply
#18
imperialreign
AphexDreamer: Hmm, not quite familiar with that idiom, care to elaborate?
It means they're working overtime to get stuff done (working late into the night).
Posted on Reply
#19
WarEagleAU
Bird of Prey
Extremely good news on the ATI front. This card is starting to look very good. I hope to see a high end from them sometime this year.
Posted on Reply
#20
cdawall
where the hell are my stars
REVHEAD: Yep, even my 8800 GTX SLI setup is seriously bottlenecked with a quad core @ 3.82GHz and 4 gig of RAM, even more so when I run at 2560x1600. These cards will never reach their potential unless the CPU is overclocked under phase; the CPU is always the limiting factor, and it sucks.
At low res the CPU is the limiting factor; at high res it's the vid card. I seriously doubt your C2Q is the issue @ 2560x1600.


Back on topic: how about 3x or 4x 3870 X2?
Posted on Reply
#21
mdm-adph
WarEagleAU: Extremely good news on the ATI front. This card is starting to look very good. I hope to see a high end from them sometime this year.
You wouldn't call the 3870 X2 "high-end?" :p
Posted on Reply
#22
happita
mdm-adph: You wouldn't call the 3870 X2 "high-end?" :p
:laugh:
Posted on Reply
#23
REVHEAD
cdawall: At low res the CPU is the limiting factor; at high res it's the vid card. I seriously doubt your C2Q is the issue @ 2560x1600.


Back on topic: how about 3x or 4x 3870 X2?
OK, how would you explain the same setup running an overclocked 4GHz quad getting an extra 12fps max in 3DMark 2006, and the same setup running at 3.6GHz getting 12fps less in 3DMark tests 1 and 2? Hmm, I have done hundreds of tests and benchmarks; the limiting factor, and I quote, "is the CPU".

What systems have you tested this theory on?
Posted on Reply
#24
cdawall
where the hell are my stars
REVHEAD: OK, how would you explain the same setup running an overclocked 4GHz quad getting an extra 12fps max in 3DMark 2006, and the same setup running at 3.6GHz getting 12fps less in 3DMark tests 1 and 2? Hmm, I have done hundreds of tests and benchmarks; the limiting factor, and I quote, "is the CPU".

What systems have you tested this theory on?
Are you kidding me? 3DMark is massively CPU-biased; run a real game and watch the fps. Want proof it's biased? My 7800GS is OC'd quite high; in games it performs better than cards I've seen paired with faster CPUs but lower GPU clocks, yet in 3DMark I come up in last place... gee, I wonder if that's got anything to do with the program?

And an extra 12fps is nice if it went from 12fps to 24fps, but I doubt that it did. That would show CPU bottlenecking; going from 100fps to 112fps, however, does not.
Posted on Reply
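cdawall's point about the 12fps figure is easier to see as a relative gain; the same absolute increase means very different things at different baselines (hypothetical numbers, not measured):

```python
# The same absolute fps increase is dramatic at a low baseline and
# modest at a high one. Figures are hypothetical, for illustration only.

def gain_pct(before_fps, after_fps):
    """Percentage improvement from before_fps to after_fps."""
    return 100.0 * (after_fps - before_fps) / before_fps

cpu_limited  = gain_pct(12.0, 24.0)    # 100.0 -> the extra 12 fps doubled performance
gpu_headroom = gain_pct(100.0, 112.0)  # 12.0  -> the same 12 fps is a minor bump
```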