# Call of Juarez DX10 Benchmark...



## Alcpone (Jun 14, 2007)

Anyone tried this out? I found it jerky in a few parts where there was a lot of foliage etc. Looks sweet still; got a max FPS of 33.7 I think, and a low of 12ish!

It always seems to be accessing the HDD, which may be the jerky issue, and it also crashed with a BSOD at the end for some reason?


----------



## von kain (Jul 4, 2007)

any news on this post?


----------



## Chewy (Jul 4, 2007)

Hopefully we can get some benchmarks comparing the 8800s and 2900XTs. I'm soon going to be getting a new card, but it seems Crysis and other DX10 games need unreleased hardware to play at max settings.


----------



## mandelore (Jul 5, 2007)

How do I run the benchy? I've got the demo installed; tried the single player @ 1920x1200, maxed settings, 16xAA etc. Rather nice, pretty fast FPS as far as I can tell, though I'm not sure how to enable the FPS counter in-game. Any ideas about the benchy part?


----------



## Alcpone (Jul 5, 2007)

I just installed it, set the resolution I wanted it to run at, and started it off. It does a sweep around the map and through the barn and saloon and stuff, then gives you your max, min and avg FPS at the end, IIRC. It was fairly straightforward. I'm on XP at the mo so can't check ATM for you.


----------



## mandelore (Jul 5, 2007)

I don't understand... start it off? I only have the option to start a single player game; there's no benchmark or anything...

It's the Call of J single player demo I'm using...


----------



## mandelore (Jul 5, 2007)

NM, the DX10 benchy is a separate install.


----------



## mandelore (Jul 5, 2007)

You running the default settings? Can you get a screeny of your results, mate?

The game itself is FAB. I'm not into westerns etc., but I got drawn in and really enjoyed the single player demo.


----------



## Chewy (Jul 5, 2007)

Cool Mandelore, hopefully we can get some DX10 game benches to see if the 2900XT is really more DX10-ready than the 8800 series (could be, maybe not though). I wouldn't base it off just one game; I was thinking of making a topic on it and having people bench these DX10 games that are out with their DX10 cards. Nuff said.

Yeah, I think maybe the demo does not have the frame rate test. It should be an option when you go to change the display settings, maybe a button saying Performance Test like in Company of Heroes.

I'm going for a shower. Another day's work down, one more to go.


----------



## Alcpone (Jul 5, 2007)

I will run it again sometime, Mand, and post a shot for you.


----------



## mandelore (Jul 5, 2007)

My CPU is really holding me back, methinks (CPU @ 2.9). Anyhoo, I just loaded the benchy; it crashed a couple of times, then worked. Everything stock on the benchmark, and stock on my 2900: 825 core, 1100 memory.

I will use the Vista tweak thread to get some bloated crap removed and do it again.


----------



## mandelore (Jul 5, 2007)

Alcpone said:


> Anyone tried this out? I found it jerky in a few parts where there was a lot of foliage etc. Looks sweet still; got a max FPS of 33.7 I think, and a low of 12ish!
> 
> It always seems to be accessing the HDD, which may be the jerky issue, and it also crashed with a BSOD at the end for some reason?



I found it jumpy here and there. I have a feeling something was struggling with it, or at least not properly supported in the benchy, as the single player demo worked great: everything felt fast and responsive, and it did indeed feel like a much higher FPS than shown by the benchy.

You think my CPU is bottlenecking the card?

Edit: my current Opty is seeing me through till I upgrade to a Phenom, and I'm still on DDR1, so I'll upgrade to DDR2/3 and deffo a quad Phenom; that will unleash some mighty evilness from my card.


----------



## Tatty_One (Jul 5, 2007)

I reckon the memory quantity helps a great deal. If these cards are only getting averages of around 30 FPS or less, think how anything below an 8800GTS/2900XT is going to struggle! Either that makes the game almost unplayable on a mid-range DX10 card, or this demo is as buggy as hell.


----------



## mandelore (Jul 5, 2007)

Tatty_One said:


> I reckon the memory quantity helps a great deal. If these cards are only getting averages of around 30 FPS or less, think how anything below an 8800GTS/2900XT is going to struggle! Either that makes the game almost unplayable on a mid-range DX10 card, or this demo is as buggy as hell.



That's what I found strange: the single player demo of the game (not the benchy) felt like it ran with a much higher average than 30. So I think this benchy is either very, very taxing on purpose, or there is something blocking its full potential, coz the game deffo ran smoother than the benchmark.

I think the bench demo is buggy, as it crashed a good few times, whereas I got better performance and zero glitches in the actual game demo at the same settings.


----------



## Tatty_One (Jul 5, 2007)

Changing the subject slightly but staying on the DX10 theme: as there are a couple of fellow Brits subscribed to this thread, what's the best kind of price I can expect to pay for Vista Ultimate, and is there an upgrade version to upgrade from XP, or am I better off with the full version?


----------



## mandelore (Jul 5, 2007)

I got the OEM version of Vista Ultimate from overclockers.co.uk:

http://www.overclockers.co.uk/showproduct.php?prodid=SW-040-MS&groupid=33&catid=20&subcat=

£123

The DVD has all the different versions of Vista too, but hell, I'd never pay £300 or whatever for the non-OEM version; that's just crazy.


----------



## Alcpone (Jul 5, 2007)

http://www.scan.co.uk/Products/ProductInfo.asp?WebProductID=527295

A bit cheaper than Mand's. The upgrade version is more expensive?


----------



## mandelore (Jul 5, 2007)

Alcpone said:

> The upgrade version is more expensive?

That's logic for ya, lol.


----------



## Chewy (Jul 5, 2007)

Tatty_One said:


> I reckon the memory quantity helps a great deal, if these cards are only gettin averages of around 30fps or less think how anything below 8800GTS/2900XT are going to struggle!  It would either make the game almost unplayable on a mid ranged DX10 card or this demo is as buggy as hell.




I think it's the game that might be buggy... like, WTF, 1280x1024 res, not maxing AA, and getting a low of 10 FPS on mid/high-end GPUs.

AA is on x2, Mandelore?


----------



## mandelore (Jul 5, 2007)

Chewy said:


> I think it's the game that might be buggy... like, WTF, 1280x1024 res, not maxing AA, and getting a low of 10 FPS on mid/high-end GPUs.
> 
> AA is on x2, Mandelore?



The benchmark is a different app from the single player demo, but yes, 2x MSAA is the default on the benchy.

I downloaded the actual game demo and ran it at higher settings, and it was very smooth, looked GREAT, and ran so much better than the benchmark. But I do have a crappy-stepping Opty 185 that doesn't overclock very well; people with procs over 3.5GHz would, I think, see a better improvement from removing any possible bottleneck.


----------



## mandelore (Jul 6, 2007)

I gots an update:

Ran my Opty @ 3GHz (3002) and bumped my 2900 to 850MHz core and 2200MHz memory on just stock volts!! My X1900 needed big voltage increases for any sort of overclock, so I'm pretty excited to see what increasing the voltage under liquid cooling can do!

I got a nice bump in performance in the benchmark. HOWEVER, it took several repeats to get a successful run to completion, as the overclock on my card produced quite nasty artifacts and 90% of the tests crashed midway through. Voltage bumps will sort that, but here's the fruits of my labour, pretty happy so far:

- *boost of 4.1 FPS to the minimum frame rate*
- *boost of 23 FPS to the maximum frame rate*
- *boost of 10.2 FPS to the average frame rate*
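
For anyone tabulating their own before/after numbers the same way, here's a minimal sketch. The baseline and overclocked figures below are hypothetical placeholders (the actual result screenshots didn't survive this archive), chosen only so the deltas line up with the boosts quoted above.

```python
# Quick sketch: tabulate the min/max/avg delta between two benchmark runs.
# NOTE: the numbers below are made-up placeholders, not the real results
# from this thread -- only the deltas are meant to match the quoted boosts.

def compare_runs(baseline, overclocked):
    """Return the FPS delta and percent change for each metric."""
    report = {}
    for metric in ("min", "max", "avg"):
        delta = overclocked[metric] - baseline[metric]
        pct = 100.0 * delta / baseline[metric]
        report[metric] = (delta, pct)
    return report

baseline = {"min": 12.0, "max": 34.0, "avg": 22.0}     # hypothetical stock run
overclocked = {"min": 16.1, "max": 57.0, "avg": 32.2}  # hypothetical OC run

for metric, (delta, pct) in compare_runs(baseline, overclocked).items():
    print(f"{metric}: {delta:+.1f} fps ({pct:+.1f}%)")
```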


----------



## mandelore (Jul 6, 2007)

Just out of curiosity... is there a chance this benchmark is indeed bugged? Coz a 25MHz core increase is not an awful lot for such a rise in performance. Granted, the memory is 100MHz faster and I got another 100MHz on my CPU, but surely that wouldn't have this huge an impact, coz 23 FPS on the max is rather a lot IMO.


----------



## mandelore (Jul 6, 2007)

This was taken from a Beyond3D forum thread:

http://forum.beyond3d.com/showthread.php?t=41903&page=2 

Post 50:

"Anyway, other strange thing found by the users in the thread linked:
This is a default 2900XT:

http://img368.imageshack.us/my.php?i...maginednm2.jpg

and this is a default 2900XT, too:

http://img108.imageshack.us/my.php?i...0453125iy1.jpg

difference? PCI-Express frequency at 120 MHz. I remember also that Kinc reported that R600 seemed really PCI-E frequency dependant, which is absolutely strange, but I have no idea of why it behaves so."

I'm intrigued... I'm gonna whack up the PCI-E frequency and see what happens.

Could anyone else with a 2900 or 8800 try changing the PCI-E frequency? Just to see if it affects NV as well as ATI. You never know...


----------



## Grings (Jul 6, 2007)

This was run on my Opty rig @ 3GHz (468 RAM) for a comparison (not put Longhorn/Vista on my C2D yet). My GTS is a 320MB model, though everything still reports it as a 640.


----------



## mandelore (Jul 6, 2007)

Can't wait to see what you get on your C2D rig.


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> Can't wait to see what you get on your C2D rig.



Just be careful when raising the PCI-E frequency. Small increases can provide benefits, but depending on the motherboard, if you raise it too much it can make your system jump like a freaking donkey, lock up, or start smelling of burnt components... that's in the extreme though.


----------



## technicks (Jul 6, 2007)

How much is the max for upping the PCI-E frequency?


----------



## mandelore (Jul 6, 2007)

150MHz, methinks.


----------



## theonetruewill (Jul 6, 2007)

Mandelore, to tell if your card is *MAJORLY* being bottlenecked by your CPU, why don't you try downclocking it and seeing how large the performance difference is? Just a thought.


----------



## mandelore (Jul 6, 2007)

That's a thought, will give it a go.


----------



## theonetruewill (Jul 6, 2007)

mandelore said:


> That's a thought, will give it a go.



Looking forward to the insight that will provide.


----------



## mandelore (Jul 6, 2007)

Well, stock CPU @ around 2.6GHz: huge difference.


----------



## theonetruewill (Jul 6, 2007)

mandelore said:


> Well, stock CPU @ around 2.6GHz: huge difference.



OK, I want to see an HD 2900XT with a Core 2 @ 3.4+!
This makes it quite obvious that the CPU is MAJORLY holding you back.


----------



## mandelore (Jul 6, 2007)

Meh... me needs some quad core goodness.

**politely asks AMD to HURRY THE F*%K UP**


----------



## theonetruewill (Jul 6, 2007)

mandelore said:


> **politely asks AMD to HURRY THE F*%K UP**



 I think that's what we're all thinking!


----------



## mandelore (Jul 6, 2007)

theonetruewill said:


> OK, I want to see an HD 2900XT with a Core 2 @ 3.4+!
> This makes it quite obvious that the CPU is MAJORLY holding you back.



I know, it's almost worth crying about.

And this is what stock volts can achieve on my card:

SO:

1) I need to be able to overclock my card better. Well, actually, I need to be able to overclock at all (IMO 25MHz ain't worthy of OC status).

2) I need a muchos faster CPU.

3) Then me smile and grin and feel pretty smug.


----------



## theonetruewill (Jul 6, 2007)

mandelore said:


> I know, it's almost worth crying about.
> 
> And this is what stock volts can achieve on my card:
> 
> ...



And smile complacently at all the struggling GTSes... I'm not an ATi fanboi... really.


----------



## mandelore (Jul 6, 2007)

Hehehe, well, I had to smile at that.


----------



## mandelore (Jul 6, 2007)

OMG!!! I broke 1000 posts!!!

Me gets an extra star.

Edit: *feels strangely old (says quietly, for risk of offending the "Elder Ones") he-he*


----------



## theonetruewill (Jul 6, 2007)

mandelore said:


> OMG!!! I broke 1000 posts!!!
> 
> Me gets an extra star.



Congratulations, you're a... nerd! ...Damn, I'm nearly an official nerd too.
What's the quality of the DX10 benchmark? Is it very much different from DX9.0c? Also, if you found a way to overclock the shaders, do you think that would make the HD 2900XT a killer now, drivers or no drivers?


----------



## mandelore (Jul 6, 2007)

I read the shader clock was synced with the core, but that could just be a standing value: say the core was 750, the shader may be 750, but it will remain that regardless of overclocking. So MAYBE, with higher-performing cores, the shader may have an initial speed higher than that of the stock 512MB memory cards, but this is pure speculation.


----------



## mandelore (Jul 6, 2007)

The DX10 benchy of Call of Juarez was pretty nice, a tad slow on max settings, but I really like the ATI (non-benchmark) exhibit of DX10:
http://forums.techpowerup.com/showthread.php?t=34358

Eye candy till you can almost taste it. Well, methinks that was the case; kept looking at those nice glossy lips and thinking my fiancée could do with that lip gloss...
*possibly going too far there, but I liked it*

Deffo not as good as Ruby, but here's two screenies from the Call of Juarez benchmark:


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> 150MHz, methinks.



I think you will find it varies quite a lot from board to board and model to model; some get lock-ups at just 105MHz, some higher. It has been shown there is a "peak", usually about 108 for a single card and 115-120 for SLI/Crossfire, but again there is no firm rule I think, as it does vary a lot. Some cards will not function at 150, though.


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> Meh... me needs some quad core goodness.
> 
> **politely asks AMD to HURRY THE F*%K UP**



Nah... a dual core at around 3.8 gig would do you very nicely. Are these DX10 benches or DX9?


----------



## mandelore (Jul 6, 2007)

DX10.


----------



## mandelore (Jul 6, 2007)

Tatty_One said:


> I think you will find it varies quite a lot from board to board and model to model; some get lock-ups at just 105MHz, some higher. It has been shown there is a "peak", usually about 108 for a single card and 115-120 for SLI/Crossfire, but again there is no firm rule I think, as it does vary a lot. Some cards will not function at 150, though.



I just booted @ 140MHz PCIe, but needed 1.5V across the PCIe to do it.


----------



## mandelore (Jul 6, 2007)

Tatty_One said:


> Nah... a dual core at around 3.8 gig would do you very nicely. Are these DX10 benches or DX9?



Lol, I'd imagine I'd be happy with ANY AMD that could go to 3.8... *drools*...


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> Lol, I'd imagine I'd be happy with ANY AMD that could go to 3.8... *drools*...



Well, Trt's old 6000+ did well over 3.5 gig and was a match in most graphics benches for a C2D at the same speed. You would still get a decent price on fleabay for your chip... it might not even cost you to upgrade.


----------



## mandelore (Jul 7, 2007)

Well, I'd have to superglue my IHS back on ^^ hehe.

But I'm still on 939, so I'm waiting for a full upgrade from 939 and DDR1 to whatever appears to be rather beastly at the time of upgrading.


----------



## Tatty_One (Jul 7, 2007)

I sold an Opty 170 with excellent stepping (3.150 gig) 2 months ago for 120 quid; that would easily buy a 6000+.


----------



## Tatty_One (Jul 7, 2007)

mandelore said:


> Well, I'd have to superglue my IHS back on ^^ hehe.



Lol, OK fair point!


----------



## Tatty_One (Jul 7, 2007)

mandelore said:


> I just booted @ 140MHz PCIe, but needed 1.5V across the PCIe to do it.



Don't actually think that's too good; just go for 110 on stock volts... more is not always better.


----------



## mandelore (Jul 7, 2007)

Ahh, well, the reason for my IHS removal was that it's a terrible stepping; it honks like a whore at a $1 free-for-all trying to get even a marginal overclock. The IHS removal did wonders, but only to an extent, mainly in dropping temperatures rather than giving much overclocking headroom. I can get 2.9 easy; 3GHz is a squeeze at uncomfortable voltages.


----------



## mandelore (Jul 7, 2007)

Tatty_One said:


> Don't actually think that's too good; just go for 110 on stock volts... more is not always better.



Yeah, I reduced it to 120. I typically run fine @ 110 @ stock PCIe volts; I was just doing it for the DX10 benching, to see if it indeed made a difference.


----------



## Tatty_One (Jul 7, 2007)

mandelore said:


> Yeah, I reduced it to 120. I typically run fine @ 110 @ stock PCIe volts; I was just doing it for the DX10 benching, to see if it indeed made a difference.



Did it? I would guess you may see a marginal improvement up to a point (maybe 110-120), but beyond that perhaps not.


----------



## mandelore (Jul 7, 2007)

In fact it did, just slightly, but that variation I could easily attribute to natural variances between benchmark runs; I think I would need to test further. But I was too busy drinking wine and listening to crap to pry further into the mysteries of PCIe overclocking. Maybe tomorrow, though I'm attending a house party of a very fine Polish lass tomorrow night, so I may be feeling rather rough...


----------



## Tatty_One (Jul 7, 2007)

mandelore said:


> In fact it did, just slightly, but that variation I could easily attribute to natural variances between benchmark runs; I think I would need to test further. But I was too busy drinking wine and listening to crap to pry further into the mysteries of PCIe overclocking. Maybe tomorrow, though I'm attending a house party of a very fine Polish lass tomorrow night, so I may be feeling rather rough...



Need a bodyguard to accompany you to the party?... I like Polish.


----------



## mandelore (Jul 7, 2007)

Tatty_One said:


> Need a bodyguard to accompany you to the party?... I like Polish.



Hey, if you were local I'd invite you, mate.


----------



## HellasVagabond (Jul 7, 2007)

For some stupid reason I can't take a snapshot of the benchmark: not with PrntScr, not with SnagIt, and not with TPUCapture...
Ideas?

Anyway, I'm getting the following on my 8800GTS 320:

Min: 12.8
Max: 47.2
Avg: 25.4

I think it's rather low, but from what I see the engine benefits a lot from more VRAM.


----------



## mandelore (Jul 7, 2007)

HellasVagabond said:


> For some stupid reason I can't take a snapshot of the benchmark: not with PrntScr, not with SnagIt, and not with TPUCapture...
> Ideas?
> 
> Anyway, I'm getting the following on my 8800GTS 320:
> ...



Hmmm, being a bit cheeky here, but haven't you just got owned by a 2900? And in DX10...

Strange that you cannot do a screeny. Have you tried running it in windowed mode? Not sure what effect that will have on performance, but it may be worth a shot to get some form of screeny.


----------



## wazzledoozle (Jul 7, 2007)

Someone should run this on an 8600GTS.


----------



## mandelore (Jul 7, 2007)

Busy downloading the Lost Planet demo, gonna give that a whirl. Am I right in assuming the 2900 had serious problems with that game?

Or has that been resolved with the latest Cats?


----------



## HellasVagabond (Jul 7, 2007)

Mandelore, I got owned by a card that's worth almost $150 more than mine... I'm sure the 8800GTS 640 would fill the gap; I won't even mention the GTX or the Ultra.
(Plus, Call of Juarez favours ATI in DX9, so that's a reason we see the difference in DX10 too.)


----------



## mandelore (Jul 7, 2007)

HellasVagabond said:


> Mandelore, I got owned by a card that's worth almost $150 more than mine... I'm sure the 8800GTS 640 would fill the gap; I won't even mention the GTX or the Ultra.
> (Plus, Call of Juarez favours ATI in DX9, so that's a reason we see the difference in DX10 too.)



And isn't the Ultra that much more expensive as well?

But isn't Call of J an NVIDIA game? After all, I'm certain I read they helped develop it?


----------



## mandelore (Jul 7, 2007)

Well, let's hope we see some GTXs etc. join the thread, coz I really want to see some competition going on here. But I guess we also have to take CPUs into account, coz I'm pretty bottlenecked ATM, which is kinda sucky, but it runs good enough for now.


----------



## HellasVagabond (Jul 7, 2007)

Like I said above, this game uses lots of VRAM, and that's why the 320MB on my 8800 can't keep up. I'm sure the 640 version and the GTX will do a lot better.


----------



## mandelore (Jul 7, 2007)

HellasVagabond said:


> Like I said above, this game uses lots of VRAM, and that's why the 320MB on my 8800 can't keep up. I'm sure the 640 version and the GTX will do a lot better.



Yeah, I agree. Just wish owners would hurry up and try the benchy.


----------



## DOM (Jul 7, 2007)

Tatty_One


----------



## mandelore (Jul 7, 2007)

** sounds the Tatty Siren and looks to the hills for his steed o7


----------



## Tatty_One (Jul 7, 2007)

Gallops into town and pulls out his six-shooter... damn... blanks... no Vista... no DX10.


----------



## Tatty_One (Jul 7, 2007)

Otherwise I would beat ya! Just got 12008 in 3DMark06! Trouble is, the damn puter crashed when I tried to paste the screenshot into Paint. It must be a temps thing; I was running her at 672/1090 without the VGPU mod. Will have to try again later.


----------



## FR@NK (Jul 7, 2007)

That's with a C2D at 3.6GHz and the GPU at 840/940.


----------



## mandelore (Jul 7, 2007)

Tatty_One said:


> Otherwise I would beat ya! Just got 12008 in 3DMark06! Trouble is, the damn puter crashed when I tried to paste the screenshot into Paint. It must be a temps thing; I was running her at 672/1090 without the VGPU mod. Will have to try again later.



Dammit, but I guess a 3.8GHz CPU don't hurt either, hehehe.

Really want my Opty to do better, but voltage thrashing just don't help it.

I'm totally stuck @ a max bench-stable of 3GHz for Call of J; I don't even think 3DMark runs with it at 3GHz...

Damn, I've a lot to whinge about... crap CPU, no proper overclocking on my card, mehhh.

Feels like I've got an Armani suit with platinum cufflinks and only a bicycle to ride.


----------



## Tatty_One (Jul 7, 2007)

FR@NK said:


> That's with a C2D at 3.6GHz and the GPU at 840/940.



That's nice. Just think what I could do with a decent card, if I had DX10.


----------



## mandelore (Jul 7, 2007)

FR@NK said:


> That's with a C2D at 3.6GHz and the GPU at 840/940.



Ahhh, nice, man!!! Now I know I'll get a bump with a better proc!!

BTW, are you still using stock volts on your 2900?


----------



## mandelore (Jul 7, 2007)

Looking at the comparison between your 512MB 2900 and my 1GB, it looks like the extra memory makes a difference.

FR@NK, if you had a 1GB 2900 you would seriously PWN.

As I'm only on 3GHz for my benchy and pretty unable to overclock my card ATM, that extra 600MHz of CPU is very nice for raising your minimum frame rate!!


----------



## FR@NK (Jul 7, 2007)

mandelore said:


> Ahhh, nice, man!!! Now I know I'll get a bump with a better proc!!
> 
> BTW, are you still using stock volts on your 2900?



Yeah, I don't think there is any way to easily overvolt these cards yet. I'm hoping the next update of ATITool can do such a thing. Then we will see what these cards can really do when watercooled.


----------



## Tatty_One (Jul 7, 2007)

mandelore said:


> Busy downloading the Lost Planet demo, gonna give that a whirl. Am I right in assuming the 2900 had serious problems with that game?
> 
> Or has that been resolved with the latest Cats?



Give that a run; does it report FPS??? I only ask because I have just read the release notes for the latest ForceWare beta and they have been optimised for Lost Planet, so it would be interesting to see how the cards compare with the most up-to-date drivers. I also think Cat 7.6 had optimisations for this game, though I'm not sure the current ForceWare release does.


----------



## mandelore (Jul 7, 2007)

FR@NK said:


> Yeah, I don't think there is any way to easily overvolt these cards yet. I'm hoping the next update of ATITool can do such a thing. Then we will see what these cards can really do when watercooled.



Hell yeah, I can't wait to liquid cool with a full-coverage block and up the volts. Since it's already running such a low voltage to start with, that gives us pretty much a WHOLE 1 VOLT to play with.

**hurrah!**

Actually... I think the 1GB card has RAM on BOTH sides of the PCB? A full-coverage block would therefore only cover half of the RAM chips...

Maybe just a GPU block and RAM sinks would suffice? Since with zero voltage increase on the RAM, I'm at 2200MHz already!


----------



## Tatty_One (Jul 7, 2007)

Yay! Just noticed... I got my 8th star! (Post whore.)


----------



## mandelore (Jul 7, 2007)

Tatty_One said:


> Give that a run; does it report FPS??? I only ask because I have just read the release notes for the latest ForceWare beta and they have been optimised for Lost Planet, so it would be interesting to see how the cards compare with the most up-to-date drivers. I also think Cat 7.6 had optimisations for this game, though I'm not sure the current ForceWare release does.



I can run it; not sure what settings to use for comparisons, but... it looked bad. There were blocky effects and glitches... looks like something is messed up in my drivers, coz it don't look nice at all. In fact it was rather painful to watch...


----------



## mandelore (Jul 7, 2007)

Tatty_One said:


> Yay! Just noticed... I got my 8th star! (Post whore.)



GRATZ!!

I just got my 1000th-post star yesterday.

Lol, I think I was on TPU from when I got home from work till I eventually decided I required at least a few hours' sleep, hehe.


----------



## Tatty_One (Jul 7, 2007)

Time to leave now and move on to another forum, which is what I usually do when I get as far as I can go, so to speak.


----------



## mandelore (Jul 7, 2007)

Tatty_One said:


> Time to leave now and move on to another forum, which is what I usually do when I get as far as I can go, so to speak.



Huh, you're leaving us?

*waves hand in Jedi-like fashion*

"You are new to these forums... you have posted only a dozen times... you want to stay till you have more stars..."


----------



## Tatty_One (Jul 7, 2007)

mandelore said:


> Huh, you're leaving us?
> 
> *waves hand in Jedi-like fashion*
> 
> "You are new to these forums... you have posted only a dozen times... you want to stay till you have more stars..."



Lol, was kidding.


----------



## mandelore (Jul 7, 2007)

You see, my Jedi powers are outstanding.


----------



## FR@NK (Jul 7, 2007)

mandelore said:


> Actually... I think the 1GB card has RAM on BOTH sides of the PCB? A full-coverage block would therefore only cover half of the RAM chips...
> 
> Maybe just a GPU block and RAM sinks would suffice? Since with zero voltage increase on the RAM, I'm at 2200MHz already!



I wouldn't recommend a full-coverage waterblock. The best way to watercool these cards is to use the stock heatsink + a Maze5 block. Here's a link with pictures:

http://www.xtremesystems.org/forums/showthread.php?t=145701

Using that setup, my load temps on the GPU are 50-55°C, and that's with a pelt on the CPU, which adds a lot of heat to the system. The voltage regs get to around 62°C under load.


----------



## mandelore (Jul 7, 2007)

I have my pelt setup on a separate loop, so my GPU waterblock will also have a separate loop; I've never liked combining CPUs & GPUs on a single loop.

I had a full-coverage waterblock on my X1900XTX and it was great, as I like the idea of having all the hotspots watercooled.

But I'd consider a GPU block and RAM sinks/stock RAM plates.

My X1900 had the Danger Den Tyee waterblock, and TBH I may just end up getting the 2900 variant once it's released.

Edit: actually, I tried removing the red plastic casing from my card the other day, just to have a close look, and I just could not get it off. I took every screw out and it wouldn't budge at the front end? Any ideas?


----------



## FR@NK (Jul 13, 2007)

mandelore said:


> Edit: actually, I tried removing the red plastic casing from my card the other day, just to have a close look, and I just could not get it off. I took every screw out and it wouldn't budge at the front end? Any ideas?



Did you get this sorted out? There's a screw near the DVI ports, on the bracket where the warm air is pushed out of the case.


----------



## newconroer (Jul 13, 2007)

Tatty, Microsoft was running a deal from last December until June this year (possibly changed, not sure) where, if you purchased XP with a complete system, you'd get a free upgrade to Vista.


----------



## MarcusTaz (Jul 13, 2007)

Bench: Call of Juarez
Q6600 @ 3.25GHz
ATI HD2900XT 1GB version (BC09 mem)... 848MHz core, 2306MHz mem

Run 1 (my native resolution):

Resolution 1920x1080p
Shadowmap Size 2048x2048
Shadow Quality 2
Super-sampling off
Multi-sampling off

Min 8.7
Max 24.2
Avg 15.2

Run 2 (to match the rest at 1280x1024):

Resolution 1280x1024
Shadowmap Size 2048x2048
Shadow Quality 1
Super-sampling off
Multi-sampling on (x2)

Min 12.3
Max 71.3
Avg 30.1

Benchmark version 1.301, with the Cat 7.6 drivers.
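
Since the thread's goal is comparing cards side by side, here's one quick way to collect the results posted so far and rank them. The FPS figures are the ones actually quoted in this thread (MarcusTaz's 1280x1024 run and HellasVagabond's 8800GTS 320); the dict layout and field names are just an illustrative choice, not any standard format.

```python
# Collect the 1280x1024 DX10 benchmark results posted in this thread so the
# cards can be compared side by side. Field names are arbitrary.

RESULTS = [
    {"user": "HellasVagabond", "card": "8800GTS 320MB", "cpu": None,
     "min": 12.8, "max": 47.2, "avg": 25.4},
    {"user": "MarcusTaz", "card": "HD2900XT 1GB", "cpu": "Q6600 @ 3.25GHz",
     "min": 12.3, "max": 71.3, "avg": 30.1},  # 1280x1024, 2x MSAA run
]

def ranked_by_avg(results):
    """Sort entries by average FPS, highest first."""
    return sorted(results, key=lambda r: r["avg"], reverse=True)

for r in ranked_by_avg(RESULTS):
    print(f'{r["card"]:<16} min {r["min"]:5.1f}  max {r["max"]:5.1f}  avg {r["avg"]:5.1f}')
```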


----------



## erocker (Jul 13, 2007)

Jedi powers? Say, I'll give you 100,000 credits for the boy and the droids, and to trick my 1950pro into thinking it's DirectX 10. I know Jedis need credits like the rest of us peasants these days.


----------



## trt740 (Jul 13, 2007)

mandelore said:


> cant wait to see what u get on ur c2d rig



I will run it as well, as soon as this beast downloads. Hey, fellas, on a side note: I was screwing around and found out I can now run stable at 3.3GHz at stock voltage. That burn-in idea is true; before, I could only run at 2.9GHz stable at stock voltage.


----------



## MarcusTaz (Jul 13, 2007)

Tell me the burn-in method again please... Not to take this too far off topic; a link or a PM would suffice.

Thanks


----------



## trt740 (Jul 14, 2007)

MarcusTaz said:


> Tell me the burn-in method again please... Not to take this too far off topic; a link or a PM would suffice.
> 
> Thanks


Run your video card, RAM and CPU for a while and they get more stable and take less voltage to overclock. That's called a burn-in period. *Just with use over time, it becomes more stable.*


----------



## trt740 (Jul 14, 2007)

Well, I cannot run this bench. Something about a security module that cannot be accessed; it says a duplicate should be in my registry, but it's not, so I cannot run it.


----------



## MarcusTaz (Jul 14, 2007)

trt740 said:


> Run your video card, RAM and CPU for a while and they get more stable and take less voltage to overclock. That's called a burn-in period.



You mean run it at higher voltages? Or run it at 100 percent? Link to what you read?


----------



## mandelore (Jul 15, 2007)

Marcus, have you changed the voltages on your 2900XT at all? I'm just curious whether you have got ATITool to work, as I'm stuck with everything at stock...


----------



## mandelore (Jul 15, 2007)

erocker said:


> Jedi powers?  Say, I'll give you 100,000 credits for the boy and the droids and to trick my 1950pro into thinking it's direct x10.  I know jedi's need credits like the rest of us peasants these days.



That made me laugh.


----------



## MarcusTaz (Jul 15, 2007)

mandelore said:


> Marcus, have you changed the voltages on your 2900XT at all? I'm just curious whether you have got ATITool to work, as I'm stuck with everything at stock...



Nah, man, I read a post that Wizzard responded to about Vista and his ATITool, and if I remember correctly he said it did not work with Vista. So I use the n00b ATI Overdrive to OC. No voltage changes here...


----------



## mandelore (Jul 15, 2007)

MarcusTaz said:


> Nah, man, I read a post that Wizzard responded to about Vista and his ATITool, and if I remember correctly he said it did not work with Vista. So I use the n00b ATI Overdrive to OC. No voltage changes here...



Awww, I was hoping for some magical solution. Hell, I haven't even tried over 2200 memory yet; think I'll give that a go, but it's running OK at 2200, though it gets flaky above 850 core. I hope we are soon to be blessed with voltage control, and with watercooling I wanna rip open some benchies.


----------



## MarcusTaz (Jul 15, 2007)

Yeah, I keep the core 2 clicks below the max and max the mem in ATI Overdrive, and it seems to run stable... It did pretty much the same thing with my 512MB version...


----------



## mandelore (Jul 15, 2007)

There is one thing which was SO very strange: just randomly one time, I reinstalled ATITool, and IT RECOGNISED THE CLOCKS!!!!! I couldn't believe it! But I had to leave for work like 1 minute after I did that, so I shut down and went to work, came back later, and poof, back to not working.

Was gutted; I thought it had started working... I'd even modify the BIOS, but RaBiT doesn't support the 1GB 2900, or I'd have made a custom BIOS long ago.


----------



## MarcusTaz (Jul 15, 2007)

What about compatibility mode?


----------



## mandelore (Jul 15, 2007)

Nah, I've tried everything. I keep getting the same crash log:

2007-07-15 09:51:07	D  VDDC: 1.000 V
2007-07-15 09:51:07	D  MVDDC: 1.550 V
2007-07-15 09:51:07	D  MVDDQ: 1.938 V
2007-07-15 09:51:07	D   Voltage Controller: HD 2900 XT
2007-07-15 09:51:07	D Device initialized successfully
2007-07-15 09:51:07	D Scanning for NVIDIA VGA Cards...
2007-07-15 09:51:07	D Could not read driver clock table: 7

The last entry is always the same.
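
For diffing these logs across runs, here's a minimal sketch of pulling out the interesting lines: the reported voltages and whether the clock-table read failed. The log layout is assumed purely from the snippet pasted above, not from any ATITool documentation.

```python
# Sketch: extract voltages and the clock-table failure from an ATITool-style
# log. The format is assumed from the snippet posted in this thread.
import re

LOG = """\
2007-07-15 09:51:07\tD  VDDC: 1.000 V
2007-07-15 09:51:07\tD  MVDDC: 1.550 V
2007-07-15 09:51:07\tD  MVDDQ: 1.938 V
2007-07-15 09:51:07\tD   Voltage Controller: HD 2900 XT
2007-07-15 09:51:07\tD Device initialized successfully
2007-07-15 09:51:07\tD Scanning for NVIDIA VGA Cards...
2007-07-15 09:51:07\tD Could not read driver clock table: 7
"""

def parse_atitool_log(text):
    """Return ({rail: volts}, clock_read_failed) for one log dump."""
    voltages = {m.group(1): float(m.group(2))
                for m in re.finditer(r"(VDDC|MVDDC|MVDDQ): ([\d.]+) V", text)}
    failed = "Could not read driver clock table" in text
    return voltages, failed

voltages, clock_read_failed = parse_atitool_log(LOG)
print(voltages, clock_read_failed)
```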


----------



## mandelore (Jul 15, 2007)

So I think it must have been a fluke, or an error that made it work, coz it just don't appear to be able to read the clocks.


----------

