# Never realized how well the x850xt PE competes with the 1800/1900 cards



## trt740 (Jul 15, 2006)

It even beats them at lower to mid resolutions, and in some games it only gets beaten at super high settings


----------



## mikelopez (Jul 15, 2006)

I sent you a pm in response to your x1800.


----------



## trt740 (Jul 15, 2006)

Here is one example of the many that I looked at.

http://www.hardwaresecrets.com/article/228/9


----------



## micron (Jul 15, 2006)

trt740 said:
			
		

> Never realized how well the x850xt PE competes with the 1800/1900 cards


Without SM3.0 and OpenEXR HDR capabilities....it _can't_ compete


----------



## trt740 (Jul 15, 2006)

Yes, but only one game at present uses it very much.


----------



## dduummyy (Jul 15, 2006)

trt740 said:
			
		

> Yes, but only one game at present uses it very much.


LMAO! I agree with ya man, my X850XT plays it all with NO lag on every game I got at super high settings. Quake 4 and Doom look awesome on ultra settings too, smooth and crisp! I only paid $134 for it on NEWEGG! Lemme know when SM 4 comes out so I can snatch up an SM 3 version super cheap too.


----------



## micron (Jul 16, 2006)

trt740 said:
			
		

> Yes, but only one game at present uses it very much.



Oblivion, Far Cry, Splinter Cell Chaos Theory, plus more.


----------



## dduummyy (Jul 16, 2006)

It'll still play those games though, right?


----------



## micron (Jul 16, 2006)

Yes, but without HDR or the efficiency of sm3.0.


----------



## _33 (Jul 16, 2006)

Oblivion on my X800GTO2 "unlocked" 16 pipelines 500/1240 with my 2.8GHz Venice lags horribly in the outdoor scenes.  It lags at 1024x768 even without anti-aliasing and with 8X aniso (not 16X, that's even worse).  Also, the texture quality is set at normal, because I suppose that's how we enable texture compression.  The game uses an incredible amount of graphics memory (512MB recommended for best effects).  I do set all shadow effects at maximum, but the card won't do HDR anyway, so no slowdown from that.  I use HQ mip mapping too.

Seriously guys with X800/X850, it's just getting older by the day, and an X1800/X1900 is what you need for the more recent games, including the game FEAR.

Quake 4 lags quite a bit on my card if I enable all effects and filters at 1024x768.

Not only does the X1800/X1900 excel in older games, it also excels in games that use OpenGL, where the older X800/X850 and below lag behind.  For the first time, ATI cards equal Nvidia cards in OpenGL (DOOM 3, Quake 4, and the soon to be released Quake Wars).

X800, the end of the world?  Of course not.


----------



## i_am_mustang_man (Jul 16, 2006)

_33 you get lag at that? i have a lot of things set pretty high, and your CPU and RAM kick my PC's ass.  my core on the gto2 goes higher, but i run 10x7 w/ 2x AA and 4x AF.  my grass is at max and trees too, as well as distance.  i cut down my npc distance because i find that oblivion is very cpu limited in high population areas, but you have 400MHz above my proccy.  one thing tho, my shadows are set pretty low, cuz i heard there were some errors with ATI and the shadows in the game. another thing, our 3dmark scores are similar.  try turning the shadows down a lot and then raising the other things, it might help.

onto the topic!  i think 850's are great, and will last until dx10 hits.  sure, some games you need to tone down, but it's not bad yet! with my setup, i run HL2: Lost Coast @ 1600x1200 with 4x/8x and everything maxxed and i get 38fps average.  fear and oblivion i cannot run at 16x12, but they are brand new games!  i remember when games would come out, and hardware would have to catch up to play them at 12x10 or even 10x7.  i'm sticking with x850 (800GTO2) because they do so darn well! esp when you consider the price jump when going from X8xx to X1xxx (850xt is like 130 or so online!)


----------



## noneed4me2 (Jul 16, 2006)

My x800 gto is flashed to 16p x800xt clocks and it plays Half-Life 2 on medium with little to no lag, and I have really gotten into the new Star Wars: Empire at War and it just runs wonderfully at med to high settings. Serious Sam and Doom 3 look great at settings below ultra with no frame drops I have seen. I am upgrading a lot of stuff on my PC, but I have decided to wait till DX10 is here. Even though the x1000 series definitely appeals to me, I think in this case holding out will be better, and when I finally upgrade my graphics I will get something that can blow away games like Oblivion, FEAR and the like. And really, if ATI got down and really made a better than average OpenGL driver, I think we would see these cards' lifespans extended further, cause the hardware ain't second rate and easily compares to the x1600 cards regardless of SM3 support.


----------



## dduummyy (Jul 16, 2006)

I don't know what you're talking about, mine plays FEAR fine, maybe it's your system. I'm not trying to get in a pissing match cause I really don't care, but my 850 runs amazingly great and it plays it all. If I don't need SM3, who cares really.


----------



## noneed4me2 (Jul 16, 2006)

I wasn't saying anything was wrong, just saying that the x800 series still has life in it. I myself am sticking with it till the next set of GPUs for DX10, no pissing necessary, I agree with you completely. I have yet to play FEAR or Oblivion, but some other newer games I play run fine on my system. If your reply was for someone else, my apologies.


----------



## JC316 (Jul 16, 2006)

I have owned both an X850 PRO and an X1800GTO. The X1800 kicks the ever loving shit outta the X850. Oblivion ran OK on high at 1024x768 with the X850; the X1800 runs Ultra settings at 1600x1200 with HDR on quite smoothly. It's the same story with F.E.A.R., Quake 4, and The Battle for Middle-earth.

I am not saying that the X8xx series are bad cards, but they can't compete with the next gen, even though they are running the same pipelines and speeds.


----------



## micron (Jul 16, 2006)

JC316 said:
			
		

> I have owned both an X850 PRO and a X1800GTO.



If you're stating (in your sig) that you're playing Oblivion in "Ultra" settings @1600x1200 with an X1800GTO and an Athlon64 3000+, I'm going to go ahead and laugh in your face right now, because you're telling porkies.

Honestly man....we're all grown up here


----------



## POGE (Jul 16, 2006)

micron said:
			
		

> If you're stating(in your sig) that you're playing Oblivion in "Ultra" settings @1600x1200 with an X1800GTO and an Athlon64 3000+, I'm going to go ahead and laugh in your face right now, because you're telling porkies.
> 
> Honestly man....we're all grown up here


If it's unlocked and overclocked, it's easily doable.


----------



## i_am_mustang_man (Jul 16, 2006)

i wish i was able to do that!  i get 6882 in 3d05 (only 300 less than you jc), and to play oblivion at 16x12 (new monitor woot woot!) i have to go with no AA or AF, and pretty crappy settings all around.  maybe the sm3.0 arch handles these processes that much faster, and maybe 3dmark is being shoved around as a bad tell!

or not


----------



## yogurt_21 (Jul 16, 2006)

Actually, in the raw hp department the x1800's fly compared to the x850xt's, giving a 30-40% performance gain in Shader 2.0 games. Now, that gain isn't so noticeable when you're already getting 120+ frames in those games with the x850, but I've certainly not seen any game or bench where the x850xt beat the x1800xt, nor even came close. lol
Gotta remember both have 16 pipes, but the x1800xt has way higher clocks and more advanced shader technology. Essentially the x1800xt is an x850xt with Shader 3.0 on steroids.

Now, the x1800xl, due to its clocks, does get beat out in certain games, as do the x1900gt's. But if you were referring to the xt's, I think not. lol


----------



## infrared (Jul 16, 2006)

micron said:
			
		

> Yes, but without HDR or the efficiency of sm3.0.




HDR/SM3.0 puts more load on the GPU and reduces fps... it doesn't make it more efficient. I was playing Oblivion on my friend's x1900xtx that he let me borrow for the weekend, and I really wasn't that impressed with SM3.0. I'm getting a second x850 to go in Crossfire - same performance as an x1900, just without SM3.0.


----------



## Daveburt (Jul 16, 2006)

I recently bought an x1600xt and a new Mobo (pci-e) to put in my HTPC, mainly because I needed Vivo features for the HTPC (which my 9800pro didn't have), and I didn't want to take my x850xt out of my Main (gamer machine)....

Oblivion looked SO much better and ran so much smoother that I wound up rebuilding both of them, and my Big Dog now runs on the X1600xt!!

Benchmarks aside... Once you see the quality of the X1K cards, the X800's don't stand a chance!! Don't get me wrong... I'm an ATI guy.... But in the real world....  I picked a 1600 over an 850xt, and newer games are only gonna take more advantage of X1K features!

In older games the x800's may beat the X1K's, but I'm burnt out on most of those!


----------



## mikelopez (Jul 16, 2006)

yogurt_21 said:
			
		

> Actually, in the raw hp department the x1800's fly compared to the x850xt's, giving a 30-40% performance gain in Shader 2.0 games. Now, that gain isn't so noticeable when you're already getting 120+ frames in those games with the x850, but I've certainly not seen any game or bench where the x850xt beat the x1800xt, nor even came close. lol
> Gotta remember both have 16 pipes, but the x1800xt has way higher clocks and more advanced shader technology. Essentially the x1800xt is an x850xt with Shader 3.0 on steroids.
> 
> Now, the x1800xl, due to its clocks, does get beat out in certain games, as do the x1900gt's. But if you were referring to the xt's, I think not. lol



This bench has the X850 XT beating the x1800xt (and the x1800 is w/512MB memory):

http://www23.tomshardware.com/graphics.html?modelx=33&model1=302&model2=310&chart=107


----------



## Tatty_One (Jul 16, 2006)

dduummyy said:
			
		

> itll still play those games though right?



Yeah, and because it doesn't have SM3.0 it will play them faster.


----------



## Tatty_One (Jul 16, 2006)

noneed4me2 said:
			
		

> My x800 gto is flashed to 16p x800xt clocks and it plays Half-Life 2 on medium with little to no lag, and I have really gotten into the new Star Wars: Empire at War and it just runs wonderfully at med to high settings. Serious Sam and Doom 3 look great at settings below ultra with no frame drops I have seen. I am upgrading a lot of stuff on my PC, but I have decided to wait till DX10 is here. Even though the x1000 series definitely appeals to me, I think in this case holding out will be better, and when I finally upgrade my graphics I will get something that can blow away games like Oblivion, FEAR and the like. And really, if ATI got down and really made a better than average OpenGL driver, I think we would see these cards' lifespans extended further, cause the hardware ain't second rate and easily compares to the x1600 cards regardless of SM3 support.



My old 850xt at 610/650 walked Oblivion at maxed settings but with just 2xAA; my current 7900GT plays it at max with full AA and doesn't even break a sweat. So apart from the DX10 thing, there are plenty of cards around now that will already play the games you mentioned flawlessly.


----------



## GLD (Jul 16, 2006)

I would not trade my Sapphire X850XT for any 7800 or lower nVidia card. A 7900 series maybe, if you kicked in some cash.


----------



## Tatty_One (Jul 16, 2006)

GLD said:
			
		

> I would not trade my Sapphire X850XT for any 7800 or lower nVidia card. A 7900 series maybe, if you kicked in some cash.



I absolutely loved my 850xt. Got the 7900GT for a birthday pressie and it is much faster, but having said that, I would have been more than happy to keep the 850, as there is plenty of life in them yet, and apart from SM3 I would not really see the point in anyone upgrading to an 1800xt at this point before DX10, but that's just my opinion.  The good thing about the 7900GT is that it runs so cool, partly because it is throttled to 1.2V from 1.4V (7900GTX), but with just about the simplest voltmod out there you can bring it back up to 1.4V, and with some improved cooling you can get far in excess of GTX speeds for £150 less! Got a friend doing my voltmod this week!


----------



## BIOHazard87 (Jul 16, 2006)

micron said:
			
		

> Oblivion, Far Cry, Splinter Cell Chaos Theory, plus more.


Oh wow.... OK, you listed ONE that's worth talking about, maybe 2 (Oblivion/Far Cry).

Personally all I play is CS 1.6... so it doesn't even matter for me.


----------



## _33 (Jul 16, 2006)

mikelopez said:
			
		

> This bench has the X850 XT beating the x1800xt (and the x1800 is w/512 memory):
> 
> http://www23.tomshardware.com/graphics.html?modelx=33&model1=302&model2=310&chart=107



OK, now how about this...

Or, this!


----------



## trt740 (Jul 16, 2006)

Look at the example benchmarks I posted: both the x1800/1900 lose at low to mid range resolutions to the x850 PE  http://www.hardwaresecrets.com/article/228/9

Here it is again. Remember, this is one of many I saw, and in some games the PE can match the x1800/x1900, not beat it, or it loses by a few frames. But is that worth the 190 dollar difference for an x1900xt? Na baby, na. In the future, yes!!!


----------



## _33 (Jul 16, 2006)

trt740 said:
			
		

> Look at the example benchmarks I posted: both the x1800/1900 lose at low to mid range resolutions to the x850 PE  http://www.hardwaresecrets.com/article/228/9
> 
> Here it is again. Remember, this is one of many I saw, and in some games the PE can match the x1800/x1900, not beat it, or it loses by a few frames. But is that worth the 190 dollar difference for an x1900xt? Na baby, na. In the future, yes!!!



I don't play in low detail.  I set AAA modes on, 16X aniso if possible, and advanced mip mapping.  In those cases, the X1800/X1900 will definitely pull ahead, as it has more memory bandwidth right from the start, and the ring bus makes memory work in a way that boosts performance when you add in all the bells and whistles and bling bling.

Really, I'd slap the guy in the back of the head if he was playing a game at 150FPS in low detail, especially with the features we can enable.  It's not like we're running a GeForce MX 200 or an ATI Rage Pro...
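As a rough sanity check on the bandwidth point, the gap can be sketched from the commonly listed specs (assumed here, not taken from this thread): both cards use a 256-bit bus, with the X850 XT PE's GDDR3 at roughly 1180MHz effective versus about 1500MHz effective on the X1800 XT:

```python
# Peak memory bandwidth = effective transfers/s * bytes moved per transfer.
# The clock/bus figures below are commonly listed specs, assumed here.
def peak_bandwidth_gbps(effective_mem_mhz, bus_width_bits):
    return effective_mem_mhz * 1e6 * (bus_width_bits / 8) / 1e9

x850xt_pe = peak_bandwidth_gbps(1180, 256)  # ~37.8 GB/s
x1800xt = peak_bandwidth_gbps(1500, 256)    # ~48.0 GB/s
print(f"X850 XT PE: {x850xt_pe:.1f} GB/s, X1800 XT: {x1800xt:.1f} GB/s")
```

On those assumed numbers, the X1800 XT starts with roughly 27% more raw bandwidth before the ring bus or any compression tricks even enter into it.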


----------



## mikelopez (Jul 16, 2006)

_33 said:
			
		

> OK, now how about this...
> 
> Or, this!



That looks fantastic, but remember what I mentioned.  The x1800xt in this case is OC'd and is 512MB.

The X850 XT is at stock clocks and is 256MB.


----------



## _33 (Jul 16, 2006)

mikelopez said:
			
		

> That looks fantastic, but remember what I mentioned.  The x1800xt in this case is OC'd and is 512MB.
> 
> The X850 XT is at stock clocks and is 256MB.



Bottom line, Serious Sam 2 is a stupid game after all, and Croteam are more Nvidia fans than ATI fans, so they don't really care about this.

The real feature games are not from Croteam.


----------



## dduummyy (Jul 16, 2006)

It's a shame Tom's doesn't do a $price$ benchmark too. For $134 bucks my 850XT will last me a long time. How much are the 1900's?


----------



## micron (Jul 16, 2006)

infrared said:
			
		

> HDR/sm3.0 puts more load on the gpu and reduces fps...


HDR does, but SM3.0 doesn't. An SM3.0 card can render a scene in fewer passes than a PS2.0 card (it allows for longer shader instructions); it's more efficient, and efficiency = speed.
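The fewer-passes argument is just arithmetic on per-pass instruction limits; a minimal sketch, using illustrative numbers (96 is the oft-quoted PS2.0 pixel shader limit, and the 200-instruction effect is hypothetical):

```python
import math

# Multipass rendering: an effect longer than the per-pass instruction
# limit has to be split across ceil(n / limit) passes.
def passes_needed(total_instructions, max_per_pass):
    return math.ceil(total_instructions / max_per_pass)

effect = 200  # hypothetical long lighting shader
print(passes_needed(effect, 96))   # PS2.0-style limit: 3 passes
print(passes_needed(effect, 512))  # SM3.0-style limit: 1 pass
```

Whether real 2006-era shaders ever approached that limit is exactly what gets disputed later in the thread.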


----------



## _33 (Jul 16, 2006)

A 16 pipeline, 500MHz and above X8x0 card will be good for another 3 or 4 years, just like the older Nvidia Geforce 4200Ti that was an excellent deal in its time.  But it will remain a card with fewer features than the newer cards.  ATI succeeded in adding AAA modes 2 months ago in the Catalyst drivers, and I was all "joy joy".  But really, there's not much else you can expect from a 160 million transistor GPU compared to a much more advanced R580 with an almost 400 million transistor GPU running at close to 700MHz.  You will get a short list of "miracle benchmarks" and a cut down in features for newer games.


----------



## b1lk1 (Jul 16, 2006)

X1800 Crossfire for the win.  I have yet to see under 60FPS in the games I play @ 1600X1200 res with max details/settings.  I know most everyone is stating the newer cards are better for high res.  If you can spare the money, newer is better.  But those X800/X850's are still kickass cards.  I had a ViperJohn modded X850 that was just unreal.  I wish I had the common sense to keep that card back when instead of spending all this money now.


----------



## Tatty_One (Jul 16, 2006)

micron said:
			
		

> HDR does, but SM3.0 doesn't. An SM3.0 card can render a scene in fewer passes than a PS2.0 card (it allows for longer shader instructions); it's more efficient, and efficiency = speed.



OK, but if you take CSS for example: on an SM3-enabled map an 850xt will run in SM2.0, and its frame rates will be higher, as rendering in SM2.0 is less GPU intensive.


----------



## JC316 (Jul 16, 2006)

micron said:
			
		

> If you're stating(in your sig) that you're playing Oblivion in "Ultra" settings @1600x1200 with an X1800GTO and an Athlon64 3000+, I'm going to go ahead and laugh in your face right now, because you're telling porkies.
> 
> Honestly man....we're all grown up here



First, I am not someone who makes crap up about my system to make myself look good. I average 20-25 FPS in town, 60+ indoors and 45-50 outdoors, smooth enough for me to play. The X850 had those numbers on just high at 1024x768. If I were "telling porkies" I would have said it runs over 60 fps at all times. I also have my CPU overclocked, 900MB of RAM free, and an overclocked video card.


----------



## _33 (Jul 16, 2006)

JC316 said:
			
		

> First, I am not someone who makes crap up about my system to make myself look good. I average 20-25 FPS in town, 60+ indoors and 45-50 outdoors, smooth enough for me to play. The X850 had those numbers on just high at 1024x768. If I were "telling porkies" I would have said it runs over 60 fps at all times. I also have my CPU overclocked, 900MB of RAM free, and an overclocked video card.




I wouldn't call 2GHz an overclock......  Can't it get a little higher (that CPU)?  Or is your RAM multiplier inaccessible in your system BIOS?

X1800GTO, not a bad card.  Are the 4 other pipelines unlockable, do you think?


----------



## trog100 (Jul 17, 2006)

i have an x850 card and a 1900xtx.. it comes down to being able to run higher settings on one than the other.. the x850 still copes well enough with all current games.. as for hdr and sm3 i can't say i have noticed any super "wow" factors there.. in fact with oblivion i prefer bloom and don't use it.. as for sm3 i can't say i have noticed the benefits of having or not having it.. in fact it's pretty "unnoticeable" to be honest.. looking at the 3dmark 2006 canyon flight using sm3 crawling by at 5 frames per second don't exactly impress me.. he he

trog


----------



## v-zero (Jul 17, 2006)

micron said:
			
		

> A sm3.0 card can render a scene in fewer passes then a ps2.0 card(it allows for longer shader instructions), it's more efficient, and efficiency = speed.



1. The number of passes has nothing to do with whether it is SM2 or SM3... SM2 can run shaders of up to 96 instructions in length, SM3 is about 3000 I think; however, each shader is done in one pass, and it is unlikely a shader is longer than 20 to 30 instructions, let alone 96...

2. Since SM3 requires FP32 precision, whilst colours will be "better", it will also be less efficient than the FP24 of SM2... So that is why SM3 cards have to do more work, and hence SM3 code slows cards down...

3. Efficiency does not equal speed; something can be very fast and very inefficient in many ways... That kind of blanket statement is wrong. Higher efficiency can aid speed, but it doesn't have to...
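The FP32-vs-FP24 point above is mostly about data width; here is a minimal sketch of the per-component storage overhead (real register files and internal datapaths complicate this, so treat it as illustrative only):

```python
# FP32 vs FP24: relative storage/bandwidth cost per shader component.
fp24_bits, fp32_bits = 24, 32
overhead = fp32_bits / fp24_bits - 1
print(f"FP32 carries {overhead:.0%} more bits per component than FP24")
```

Whether that extra third of a register actually costs frames depends on the hardware, which is why the thread never settles the question.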


----------



## dduummyy (Jul 17, 2006)

What it boils down to, I guess, is thank god we're all fortunate enough here to have decent video cards for gaming, wouldn't you say?


----------



## JC316 (Jul 17, 2006)

_33 said:
			
		

> I wouldn't call 2ghz an overclock......  Can't it get a little higher (that cpu)?  Or is your ram multiplier inaccesible in your system bios?
> 
> X1800GTO, not a bad card.  Are the 4 other pipelines unlockable do you think?



No, my card can't unlock; Sapphires have a very low success rate of unlocking. My sig needs to be updated, I am now running 2.2GHz on the CPU. It will go higher, but I am making sure it's stable.


----------



## i_am_mustang_man (Jul 17, 2006)

dduummyy said:
			
		

> What it boils down to, I guess, is thank god we're all fortunate enough here to have decent video cards for gaming, wouldn't you say?


 palabra
and
 peace


----------



## _33 (Jul 17, 2006)

JC316 said:
			
		

> No, my card can't unlock, sapphires have a very low success rate of unlocking. My sig needs to be updated, I am now running 2.2GHZ on the CPU. It will go higher, but I am making sure it's stable.



Your Venice is easily safe up to 2.6GHz, trust me.


----------



## i_am_mustang_man (Jul 17, 2006)

_33, i am pretty new to OCing 64s, and i have a venice 3200+.  above 2.4, it freaks out.  should i bump the voltage, if so by how much, or did i maybe get a bad batch?


----------



## Tatty_One (Jul 17, 2006)

i_am_mustang_man said:
			
		

> _33, i am pretty new to OCing 64s, and i have a venice 3200+.  above 2.4, it freaks out.  should i bump the voltage, if so by how much, or did i maybe get a bad batch?



If you don't mind me answering in his place: I assume you have made adjustments to your memory and HTT to compensate for the raise in FSB.  I had until recently (well, I still have) a 3200 that I got to 2.7GHz stable, and yes, I had to raise the volts, in my case from about 2450MHz.  I don't know what voltage options your mobo has, but I got my 2700 on 1.55V and that is considered safe.  Reports show that some can take up to 1.7V, but that is VERY dangerous and should not be attempted; anything from 1.6V is going to reduce the life of the CPU. IMO NEVER go above 1.575 (well, I wouldn't); see what you can get out of 1.55 first.

Bear in mind that the heat issues multiply incredibly fast with the voltage increases, so if you have not already (because I was too lazy to check your specs), get a good cooler. I got mine to 2700 @ 1.55V with an Arctic Freezer 64 Pro that idled at 32C and never went above 43C at load. If you are hitting well into the 40s at idle she may start to twitch under load, cause she will be hitting well into the high fifties.... AMD max operating temps for the Venice cores are 49 - 65C I think.

Edit: seen you have stock cooling - upgrade. The 64 Pro is really cheap at the moment; you can get better, but I have modded mine (see my specs) and she is probably running as good as any air now.


----------



## i_am_mustang_man (Jul 17, 2006)

i'm not actually running stock, so woohoo!  (should update my specs tho!) i have a thermaltake silent 939 on there, not ridiculous, but it's silent  
and yea, i have lowered the memory, it's running at exactly 2000MHz (240x4x2), so it's not being taxed.  i just found out that there is a program to tell me temps, so i will try that out and maybe raise the volts.  if i can get to 2.6, i will be happier than i don't know what.  thanks for the quick reply tatty!


----------



## DRDNA (Jul 17, 2006)

Well, the x850 in Crossfire mode's highest score is a 3DMark score of 12025 (Untitled), and the x1800 Crossfire mode's highest is 14474 (DRDNA) for approved, and 15186 (165 Opteron X1800 Crossfire, b1lk1) for unapproved..... They don't look to me like being almost the same.


----------



## Tatty_One (Jul 17, 2006)

i_am_mustang_man said:
			
		

> i'm not actualy running stock, so woohoo!  (should update my specs tho!) i have a thermaltake silent 939 on there, not ridiculous, but it's silent
> and yea, i have lowered the memory, it's running at exactly 2000MHz (240x4x2), so it's not being taxed.  i just found out that there is a program to tell me temps, so i will try that out and maybe raise the volts.  if i can get to 2.6, i will be happier than i don't know.  thanks for the quick reply tatty!



No probs m8... as a matter of interest, I use "PC Alert 4" from the MSI site; it is under software downloads.  I have tried many CPU temp monitors, but after actually taking the temps, PC Alert was dead-on accurate!  Small download too, give it a try; it doesn't matter if you don't have an MSI mobo, it works fine and gives you the volts off each rail too.

Let me know how U get on!


----------



## Tatty_One (Jul 17, 2006)

DRDNA said:
			
		

> Well, the x850 in Crossfire mode's highest score is a 3DMark score of 12025 (Untitled), and the x1800 Crossfire mode's highest is 14474 (DRDNA) for approved, and 15186 (165 Opteron X1800 Crossfire, b1lk1) for unapproved..... They don't look to me like being almost the same.



Yeah, nice scores!  Am hoping to top 12000 tomorrow; am voltmodding my 7900GT so it should run faster than a 7900GTX.  Am also gonna try and break the BIG 6000 for 2006.


----------



## DRDNA (Jul 17, 2006)

Good luck M8


----------



## Tatty_One (Jul 17, 2006)

DRDNA said:
			
		

> Good luck M8



Cheers....will keep U posted!


----------



## Daveburt (Jul 18, 2006)

Hey Mustang, on your OC.... did you drop your Mem frequency to 166?  You seem like you pretty well have a handle on overclocking, so I thought I would let ya know how I got 2.6GHz out of my 3800 X2.

Mem Freq:   166 (not 200)
HTT:        4x  (not 5; this seems to be the most touchy thing, make sure that HTT x FSB is never greater than 1060)
FSB:        260 (not 200)
Multiplier: 10x (max for a 3800 X2)

The end result is a 30% OC (2.6GHz), with the Mem @ (appx.) 222, and the HTT bus @ 1040.

Hope some of that info helps, Bud. Seems like your chip should do a lot better than what ya have, unless ya got a really cr*ppy stepping/chip....

Sorry Moderator, I know this is a bit off topic for the thread.....
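The arithmetic behind those settings checks out; a quick sketch (the memory figure is only an estimate, since the Athlon 64's memory clock actually comes from an integer divisor, which is why the real value can land near the quoted "appx. 222" rather than exactly on the naive ratio):

```python
fsb_mhz = 260    # reference clock, up from the stock 200
cpu_multi = 10   # max multiplier for a 3800 X2
htt_multi = 4    # LDT/HTT multiplier, dropped from 5

cpu_mhz = fsb_mhz * cpu_multi    # 2600 MHz, a 30% OC over the stock 2000
htt_mhz = fsb_mhz * htt_multi    # 1040 MHz link, under the ~1060 ceiling
mem_est = fsb_mhz * 166 / 200    # naive 166-divider estimate, ~215.8 MHz
print(cpu_mhz, htt_mhz, round(mem_est, 1))
```

Dropping the HTT multiplier to 4x is what keeps the link at 1040 instead of 1300, which is the "most touchy thing" the post warns about.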


----------



## v-zero (Jul 18, 2006)

Isn't the highest multi for the 3800 X2 9x? I don't think it can do 10x, since it is 1.8GHz at default speed...


----------



## Tatty_One (Jul 18, 2006)

v-zero said:
			
		

> Isn't the highest multi for the 3800 X2 9x ? I don't think it cn do 10x, since it is 1.8ghz at default speed...


No, it's 10x, running at 2000MHz.

Extract from AMD site:

Matching products (OPN Tray / OPN PIB - model - frequency - socket - wattage):

- ADA3800DAA5CD / ADA3800BVBOX - AMD Athlon™ 64 X2 Dual-Core 3800+ - 2000MHz - Socket 939 - 89W
- ADA3800DAA5BV / ADA3800BVBOX - AMD Athlon™ 64 X2 Dual-Core 3800+ - 2000MHz - Socket 939 - 89W


----------



## i_am_mustang_man (Jul 18, 2006)

ya, i already dropped my mem freq.  i started another thread about my cpu oc glass ceiling.  it's here:  http://forums.techpowerup.com/showthread.php?t=14558


----------



## Tatty_One (Jul 18, 2006)

OK, got my 7900GT voltmodded today to 1.4V... no problems with that, apart from the fact that my Zalman cooler is playing up now!!!  Am going to RMA it and order another old trusty Zalman Fatality instead; was TBH never that impressed with the VF900, and the Fatalities have never let me down.

The voltmod, for those interested, is near impossible to do with solder (a conductive pen works, but it is difficult to control the flow and there is VERY little space to play with) as the manufacturer uses lead-free solder on the board, as my buddy found out when it disintegrated!  He has wired it up for me now and it works fine; I just have to downclock the core at the moment as I am back on stock until the Fatality arrives Thursday.


----------



## Ketxxx (Jul 18, 2006)

micron said:
			
		

> Yes, but without HDR or the efficiency of sm3.0.


SM3 offers few tangible benefits over SM2.0b at best - tried, tested, and proven. Until there is a proper replacement for SM2.0b there's not much point in upgrading. Just look at SM3: supposed to be the next "big" thing, and it ended up being little more than a whisper. SM4 will probably be much the same; maybe when SM5 comes out the industry might move on, but they don't like change. They would much rather beat something to death and then continue beating it.


----------



## _33 (Jul 18, 2006)

Ketxxx said:
			
		

> SM3 offers few tangible benefits over SM2.0b at best - tried, tested, and proven. Until there is a proper replacement for SM2.0b there's not much point in upgrading. Just look at SM3: supposed to be the next "big" thing, and it ended up being little more than a whisper. SM4 will probably be much the same; maybe when SM5 comes out the industry might move on, but they don't like change. They would much rather beat something to death and then continue beating it.



What's wrong with DDR1?

My Atari ST still runs!  J/K


----------



## Casheti (Jul 25, 2006)

I run Oblivion max settings at 1024x768 nice n smooth with the new patch; the only thing turned down is shadows. Firstly Tatty One, how did you get your old X850XT to 610/650?? And second, how can I voltmod?? I don't really wanna do it physically; are there no progs that raise voltage? And... is there ANY POSSIBILITY WHATSOEVER that only using a 300W PSU like mine to power my X850XT can reduce its performance, or overclockability??? I lag in BF2 and it fucks me off. I know somebody with an Athlon 3800+, X800XT PE, and 1.25GB of DDR 400MHz RAM, and he absolutely owns me in every single game. He can run BF2 maxed out at 1280x1024 like me, except he NEVER, EVER lags in BF2 at all. Not even when you first load a map; it's as smooth as silk. And lastly, is BF2 dual core enabled??


----------



## Tatty_One (Jul 25, 2006)

Casheti said:
			
		

> I run Oblivion max settings at 1024x768 nice n smooth with the new patch; the only thing turned down is shadows. Firstly Tatty One, how did you get your old X850XT to 610/650?? And second, how can I voltmod?? I don't really wanna do it physically; are there no progs that raise voltage? And... is there ANY POSSIBILITY WHATSOEVER that only using a 300W PSU like mine to power my X850XT can reduce its performance, or overclockability??? I lag in BF2 and it fucks me off. I know somebody with an Athlon 3800+, X800XT PE, and 1.25GB of DDR 400MHz RAM, and he absolutely owns me in every single game. He can run BF2 maxed out at 1280x1024 like me, except he NEVER, EVER lags in BF2 at all. Not even when you first load a map; it's as smooth as silk. And lastly, is BF2 dual core enabled??



OK, to answer in order: I must have got lucky with my 850XT; no voltmod, it was the HIS ICEQ2 so it came with the Arctic Silencer 5 as standard, but it got me to 7300+ in 3DMark 2005. And secondly.... yes, your PSU is holding you back..... the minimum power rating for the 850XT is 350W!!!!!!!  Ohhhh, missed one: no softmod for volts is possible on the 850XT, so it must be a hardmod job. If that worries you, do some research, print off the pictures, and ask a local TV/HiFi repair shop if they will do the soldering for you; it will only take a couple of minutes so should be very cheap.


----------



## _33 (Jul 25, 2006)

Casheti said:
			
		

> I run Oblivion max settings at 1024 x 768 nice n smooth with the new patch, only thing turned down is shadows. Firstly tatty one, how did you get your old X850XT to 610/650?? And second, how can I voltmod?? I don't really wanna do it physically, is there no progs that raise voltage? And...is there ANY POSSIBILITY WHATSOEVER that only using a 300W PSU like mine to power my X850XT can reduce it's performance, or overclockability??? I lag in BF2 and it fucks me off. I know somebody with Athlon 3800+, X800XT PE, 1.25GB DDR 400MHz ram, and he absolutely owns me in every single game. He can run BF2 maxed out at 1280x1024 like me, accept he NEVER, EVER lags in BF2 at all. Not even when you first load a map, it's a smooth as silk. And lastly, is BF2 dual core enabled??



2 things I find wrong:

#1 - Oblivion looks ugly even with all shadows turned on and bloom effects; how the heck can you play with such an ugly image on your hyped-up X850XT (as stated in the title: "almost equal to an X1800/X1900")???

#2 - A 300 watt power supply?  Sheesh, are we back in 1990 or what?  You need 450 watts or more with your setup!  And I'm very serious.  I don't even understand how your system operates on 300 watts.


----------



## infrared (Jul 26, 2006)

Yeah dude.... You're complaining that you have bad lag in BF2... Now you know why!

Get a nice antec 450w


----------



## Soopahmahn (Jul 26, 2006)

_33, Bloom lighting ain't that bad. It isn't as nice as HDR etc but it's a nice attempt at making a cool effect on budget hardware. But yeah I'm definitely not buying some of these Oblivion stories - I have medium quality 1024x768 w/some shadow options enhanced and most draw distances 25-50%, and I dunno my FPS but it gets very slideshowy when scenery makes a quick change or I'm outside. Heck, even when I'm inside it can be jerky. Typically I'm looking at 15-20fps I guess.

:- SYSTEM (used to be cool) -:
P4 2.4GHz w/AVC Sunflower @ 2.6GHz/580MHz FSB (crap OC :shadedshu )
Gigabyte 8IEXP mobo w/AGP 4x  
Sapphire X800GTO w/16p (crap OC :shadedshu )
2x512MB GeIL DDR ( :banghead: )
430W Enermax PSU
2x120GB WD Caviar Special Edition 8MB buffer


----------



## GLD (Jul 26, 2006)

1 Brand of PSU to Rule them All!: PC Power & Cooling!


----------



## Casheti (Jul 26, 2006)

_33 said:
			
		

> #1 - Oblivion sucks even with all shadows turned on with bloom effects, how the heck can you play with such an ugly image on your hyped up X850XT (as stated in the title: "almost equal to an X1800:X1900)???
> 
> #2 - 300watts power supply? Sheesh, are we back in 1990 or what? You need 450 watts or more with your setup! And I'm very serious. I don't even understand how your system operates with 300 watts.



What do you mean it sucks at those settings on high?? All on full gfx apart from shadows, anisotropic filtering 16x, no AA; it looks awesome, and runs real smooth, mostly, apart from when I'm loading in the woods or something, but it's okay after loading.

And it runs stable, I have no idea how either. Must be a miracle. Or maybe those random crashes and crap I keep getting about twice a day when I try to play BF2 are not my PC, but simply my PSU?? Or maybe just the shit patches they bring out for it??

And 2nd, 



			
				infrared said:
			
		

> Yeah dude.... You're complaining that you have bad lag in BF2... Now you know why!
> 
> Get a nice antec 450w



Can having a low power PSU for a card like mine REALLY cause lag??


----------



## TooFast (Jul 26, 2006)

trt740 said:
			
		

> It even beat them in the lower to mid resolutions and in somes games only at super high settings does it get beaten





BS. BS! I have had all three cards (X850 PE, X1800XT, X1900XTX) and let me tell you, the X850 is nowhere near as fast as the X1800XT or X1900!!!!!!


----------



## Tatty_One (Jul 26, 2006)

Casheti said:
			
		

> What do you mean it sucks at those settings on high?? All on full gfx apart from shadows, anisotropic filtering 16x, no AA, it looks awesome, and runs real smooth, mostly, apart from when I'm loading in the woods or something, but it's okay after loading.
> 
> And it runs stable, I have no idea how either. Must be a miracle, or maybe those random crashes and crap I keep getting about twice a day when I try to play BF2 are not my PC, but simply my PSU?? Or maybe just the shit patches they bring out for it??
> 
> ...



As I mentioned...the minimum REQUIREMENT for an 850XT is 350W....you will only be getting enough power to pull about 80% of the card's potential, and you will get "power fade": when you reach a particularly graphically intensive bit of action and the card tries to draw more power (but there ain't none!), the rendering flow will not be smooth, and performance fluctuates because of the lack of power....hence...."lag"
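To see why the wattage argument is even close, you can tally rough component draws. This is just a back-of-envelope sketch; every figure below is a ballpark assumption for 2006-era parts, not a measurement of anyone's actual system:

```python
# Rough power-budget sanity check for a 2006-era P4 + X850XT box.
# Every wattage below is a ballpark assumption, NOT a measured figure.
component_draw_w = {
    "Pentium 4 (loaded)":       115,
    "Radeon X850XT (loaded)":    70,
    "Motherboard + RAM":         40,
    "Drives, optical, fans":     30,
}

total_w = sum(component_draw_w.values())   # estimated peak draw
headroom_w = 300 - total_w                 # what a 300W label leaves over

print(f"peak draw ~{total_w}W, headroom on a 300W unit ~{headroom_w}W")
```

Under these assumptions the box peaks around 255W, leaving only ~45W of paper headroom, and a budget 300W unit often can't sustain its label on the 12V rail anyway, which is why the 350W minimum matters.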


----------



## Casheti (Jul 26, 2006)

Ah, I see. So...I'm off to buy a new PSU then. This one will do me fine.

http://www.ebuyer.com/customer/prod...hvd19wcm9kdWN0X292ZXJ2aWV3&product_uid=102994


----------



## Tatty_One (Jul 26, 2006)

Casheti said:
			
		

> Ah, I see. So...I'm off to buy a new PSU then. This one will do me fine.
> 
> http://www.ebuyer.com/customer/prod...hvd19wcm9kdWN0X292ZXJ2aWV3&product_uid=102994



Never skimp on a PSU, it really is important; better to have a quality 400W than a budget 550W. My experience of Ebuyer value PSUs is not good: I've had two, and both blew on me, honestly.

If your budget is tight can it stretch to this:

http://www.ebuyer.com/UK/product/89518/rb/20636773354

It's not great, but it's not bad; reviews are pretty good and it's a damn sight better than the Ebuyer value unit. Please don't let the output number blind you....on a limited budget it's far better to get a balance of power and quality than to be overly biased towards the wattage.


----------



## Casheti (Jul 26, 2006)

Trouble is I need one with a 20-24 pin converter. (Don't ask, gay motherboard)

Is it also possible that a crap motherboard like mine can limit the X850XT's performance?


----------



## Casheti (Jul 26, 2006)

Cheap PSU for a cheap budget. EVERY SINGLE REVIEW of it (all 35) has been excellent.


----------



## infrared (Jul 26, 2006)

All good PSUs come with the 24-pin motherboard cable (ATX 2.0/2.2)... don't buy a cheap PSU... please!!! Your motherboard is fine, I doubt it would be causing any problems.



			
				Casheti said:
			
		

> Can having a low power PSU for a card like mine REALLY cause lag??



Yes, because the hardware doesn't have enough power to work properly.


----------



## Casheti (Jul 26, 2006)

Well, it may not have enough power, but...it IS however stable. I am definitely gonna get a new PSU now.


----------



## infrared (Jul 26, 2006)

Casheti said:
			
		

> Cheap PSU for a cheap budget. EVERY SINGLE REVIEW of it (All 35) have been excellent.



Because everyone that buys it is a n00b  Trust me... if you plan to do _any_ overclocking you will regret buying that PSU when it fries and blows your motherboard... Happened to me when I had a generic 450W.


----------



## Casheti (Jul 26, 2006)

LMAO. Well, I won't be overclocking for a while, the mobo I want is expensive  But when I do get a new mobo, I will buy a better PSU also. Good advice there. But generally cheap PSUs should work fine.


----------



## infrared (Jul 26, 2006)

yeah, should be alright as a temporary one


----------



## Casheti (Jul 26, 2006)

Good good. Like I say, it's all about the money at the moment.


----------



## _33 (Jul 26, 2006)

Casheti said:
			
		

> What do you mean it sucks at those settings on high?? All on full gfx apart from shadows, anisotropic filtering 16x, no AA, it looks awesome, and runs real smooth, mostly, apart from when I'm loading in the woods or something, but it's okay after loading.
> 
> And it runs stable, I have no idea how either. Must be a miracle, or maybe those random crashes and crap I keep getting about twice a day when I try to play BF2 are not my PC, but simply my PSU?? Or maybe just the shit patches they bring out for it??



Personally, I don't feel Oblivion has such incredible graphics, even with all effects turned on, not with the intense slideshow we're getting.  It's a question of how much texture space the renderer is using and how it's programmed to allow good or bad speeds on graphics cards.  It's not the best graphics, and quite possibly the buggiest graphics I have ever seen in a recent game.  On the flip side, Doom 3 (2004) has much better graphics and shadow effects and only requires a GeForce 3 to perform them at a decent framerate.

It's not really a complaint, but how CAN you play Oblivion with no shadows???  It must look like a 1999/2000 game.  I can't understand that with an X800/X850 we have to turn off most graphical effects to have a playable game.  This game has serious GFX engine issues.


----------



## Casheti (Jul 26, 2006)

LMFAO!! I don't play with no shadows. Int is on 4/10, ext on 1/4


----------



## Casheti (Jul 26, 2006)

I play smooth on these settings







SUCK ON THAT!


----------



## _33 (Jul 27, 2006)

Casheti said:
			
		

> I play smooth on these settings
> 
> 
> 
> ...



I bet if I sent you my INI file for Oblivion, you could tell me how much of a slideshow you're getting.  I'll attach it here...  It's a highly tweaked INI file for Oblivion that gives me pretty good framerates and the best graphics I can get.  I spent lots of time tweaking it, based on many sources and experiments.  I use that INI file with no AA and 8X aniso and it's quite playable.  But some outside areas lag.


----------



## Casheti (Jul 27, 2006)

You want me to play Oblivion on an .ini modified for 2GB of RAM?? You're crazy. I'll try it later.

I play with 16X AF and no AA

AHAHAHAH!! Your outside lags; I'll have to check out how much I own you.


----------



## Casheti (Jul 27, 2006)

I put on the new .ini and played Oblivion, but it fell back to default settings, so I can't do the test, unless you tell me where the four pieces of code are that identify the video card, so I can copy and paste my values to trick your .ini into thinking it's for my card.


----------



## _33 (Jul 27, 2006)

Casheti said:
			
		

> I put on the new .ini and played Oblivion, but it fell back to default settings, so I can't do the test, unless you tell me where the four pieces of code are that identify the video card, so I can copy and paste my values to trick your .ini into thinking it's for my card.




 Owwww....  Open a default INI and cut/paste those 4 lines; it will work!  I'm sorry about that one.  But believe me, it won't have anything to do with the 2GB setup, just GFX.


----------



## Casheti (Jul 27, 2006)

I can't remember where the four pieces of code that detect the video card are


----------



## _33 (Jul 27, 2006)

Casheti said:
			
		

> I can't remember where the four pieces of code that detect the video card are



It looks like this:


> [Display]
> uVideoDeviceIdentifierPart1=3619102434
> uVideoDeviceIdentifierPart2=298786317
> uVideoDeviceIdentifierPart3=672688744
> uVideoDeviceIdentifierPart4=902546081
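To reuse someone else's tweaked Oblivion.ini, only those four identifier lines need to come from your own file. A minimal sketch of the swap (key names taken from the snippet above; it works on INI text as strings, and `merge_video_identifiers` is just an illustrative helper name, not anything from the game):

```python
import re

def merge_video_identifiers(own_ini: str, shared_ini: str) -> str:
    """Return shared_ini with the four uVideoDeviceIdentifierPartN lines
    replaced by the values from own_ini, so the game accepts the file
    instead of resetting everything to defaults."""
    merged = shared_ini
    for n in range(1, 5):
        key = f"uVideoDeviceIdentifierPart{n}"
        own = re.search(rf"^{key}=.*$", own_ini, re.MULTILINE)
        if own:
            # carry this card's identifier line over verbatim
            merged = re.sub(rf"^{key}=.*$", lambda m: own.group(0),
                            merged, flags=re.MULTILINE)
    return merged
```

Read your own Oblivion.ini and the shared one (normally under `My Documents\My Games\Oblivion`), run them through this, and write the result back out.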



I just can't believe you're running on a 300W PSU!  A P4 and an overclocked X850XT!!!!  What an achievement!!!


----------



## Casheti (Jul 27, 2006)

HAHAHA!! That was the most pussy graphics I have ever seen. It made me throw up. It was easy as pie to run 

And yes, it is a miracle I run so much off of a 300W PSU. Bow before my amazing achievements, you commoner.

Apparently my card "underachieves" because of this weak-ass PSU; imagine what I could do with full power...


----------



## Casheti (Jul 27, 2006)

Try out my .ini for size. Fool.


----------



## _33 (Jul 27, 2006)

Casheti said:
			
		

> Try out my .ini for size. Fool.



Hahahahahahah That's a pussy INI, hahahaha!!!!!  INTEL Hahahahaha!!!!


----------



## i_am_mustang_man (Jul 27, 2006)

WHAT THE FUCK IS GOING ON IN THIS THREAD?!?!?!

you both enjoy oblivion


so who else thinks the horse armor mod is a piece o shit?
eh?
eh?
eh?


----------



## Casheti (Jul 27, 2006)

I was just making fun of _33 cos his graphics look like shit, and they're so easy to run. He thought it would be a "slideshow". Proved your ass wrong, man. No hard feelings, it's all good. Did you try my settings? I optimised it for 1GB of RAM and gave it as much manual dual core support as possible. Notice how much better it looks when everything is on high??

And tbh, I don't have any horse armour 

O.M.F.G THERE IS A U IN ARMOUR, YOU AMERICANS MESS EVERYTHING UP!! MUM IS SPELT WITH A U, NOT AN O, THERE IS A U IN COLOUR ETC... ETC...


----------



## JC316 (Jul 27, 2006)

Snooze, I run at those settings, with HDR and at 1600x1200.


----------



## bigboi86 (Jul 27, 2006)

Casheti said:
			
		

> I was just making fun of _33 cos his graphics look like shit, and they're so easy to run. He thought it would be a "slideshow". Proved your ass wrong, man. No hard feelings, it's all good. Did you try my settings? I optimised it for 1GB of RAM and gave it as much manual dual core support as possible. Notice how much better it looks when everything is on high??
> 
> And tbh, I don't have any horse armour
> 
> O.M.F.G THERE IS A U IN ARMOUR, YOU AMERICANS MESS EVERYTHING UP!! MUM IS SPELT WITH A U, NOT AN O, THERE IS A U IN COLOUR ETC... ETC...



Damn you're an arrogant asshat.


----------



## Daveburt (Jul 27, 2006)

Brits like to put a "U" in everything Big....  It don't make them bad!!

They're still pissed about the Bouston Teau Paurty.... Hic!!

My 1900XT runs Oblivion really well!!

See, ya Bastads made me post a drunken Message!!


----------



## Tatty_One (Jul 27, 2006)

Casheti said:
			
		

> HAHAHA!! That was the most pussy graphics I have ever seen. It made me throw up. It was easy as pie to run
> 
> And yes, it is a miracle I run so much off of a 300W PSU. Bow before my amazing acheivements, you commoner.
> 
> Apparently my card "underacheives" because of this weak ass PSU, imagine what I could do with full power...



Yeah but one day....maybe soon, you'll be sprinting through the woods in Oblivion to hack off the head of that dark elf king and..............BANG.....PSU overload....bye-bye GFX card/mobo/PCI cards....maybe RAM and HDD too!!!!!  The time is getting ever nearer; you have only lasted in this world this long because you don't overclock your CPU, but that time is coming!


----------



## Tatty_One (Jul 27, 2006)

JC316 said:
			
		

> Snooze, I run at those settings, with HDR and at 1600x1200.



I only have a 17 inch LCD so I play at 1280 x 1024 but MAX everything (although as you know, with Nvidia I cannot have HDR and AA/AF at the same time, so I max AA/AF). The game is on full throttle @ 52FPS (worst) and 65FPS (best)....smooth as a baby's botty.


----------



## Tatty_One (Jul 27, 2006)

Daveburt said:
			
		

> Brits like to put a "U" in everything Big....  It don't make them bad!!
> 
> They're still pissed about the Bouston Teau Paurty.... Hic!!
> 
> ...



Yes but have you not heard the latest??.................the language is called ENGLISH....not AMERICAN!  So it is fair to say we are the experts on the subject; let it rest at that...ohhhh, and you were only able to have the Boston Tea Party because we introduced you to tea!


----------



## Daveburt (Jul 27, 2006)

You're cool Tatty... I wasn't picking on you Brits!!

You can't help it that you speak English with a weird accent!!   

And BTW (truly American!), I don't even drink tea!!    

I'm going to bed soon!!  Don't hate me 'cause I'm a Yankee!


----------



## Casheti (Jul 27, 2006)

LMAO. My PSU will NEVER overload. It's so cheap, it's good. I mean...who the f*ck is Bestec?? Lol

And each part of England has a different accent. Where I come from though, we have the classic British accent. The accents of people who come from Liverpool and other places REALLY piss me off.


----------



## _33 (Jul 27, 2006)

bigboi86 said:
			
		

> Damn you're an arrogant asshat.



I don't mind arrogant, but LIAR is definitely off the scale.


----------



## Tatty_One (Jul 27, 2006)

Casheti said:
			
		

> LMAO. My PSU will NEVER overload. It's so cheap, it's good. I mean...who the f*ck is Bestec?? Lol
> 
> And each section of England has a different accent. Where I come from though, we have the classic British accent. The accents people have that come from Liverpool and other places REALLY piss me off.



It's a brave man that says NEVER!  Like I was never gonna spend a small fortune on a PC build!!!.....woot????


----------



## i_am_mustang_man (Jul 27, 2006)

bestec supplies psu's to hp, emachines, and a lot of compaq retail pc's sold in compusa and officedepot

i love all the diversity in the english language, american is a dialect, just like the british version is a dialect.  i wouldn't tell someone who speaks spanish that when they say toalla they are wrong; it's towel to me, it's toalla to them, and that's awesome.  we can learn from each other.  to me, color is colour, they have the same meaning.  i just worry for my gf, who is prolly going to england to study abroad, and if she writes a paper the prof will grade it and mark her off for writing color instead of colour, which she should get marked off for, imo!

honestly tho, casheti, i would just get a more powerful psu.  500w ones like the one at radioshack for $10 after rebates are great.  plus you can get rush processing and have the money deposited to your paypal account within the week.  they take a small amount, so it's more like $14 for the psu, but that's still f'in sweet!

if you want to keep a lower wattage psu that's up to you, but as time progresses the capacitors will age, and then the psu's real wattage will decrease, and your comp might fry.  it may work great now with anything you can throw at it, but it, like all things, ages. (and it's not like a fine wine, so don't even mention it! lol)

but that's my $.02 (question, do they say, that's my 2 cents in england?  two pence? just wondering!)


----------



## JC316 (Jul 28, 2006)

I don't speak english, I speak Texan.


----------



## infrared (Jul 28, 2006)

Back to the original thread title... 



> Never realized how well the x850xt PE competes with the 1800/1900 cards



tbh, it doesn't compete that well. It's a good card, but doesn't score anywhere near as well as those cards. I'm happy to say that i'm now running 2x x850's in crossfire, and i'm scoring around 11,500 in 3dmark05, which just about keeps up with the x1800/x1900 series cards. A single x850 is way off what even a stock x1800xt would get.


----------



## JC316 (Jul 28, 2006)

Yeah, my point exactly. I have owned both an X850 PRO and an X1800GTO, here were my stock 3dmark05 scores. Both are 500/500 12 pipe cards.

850 - 5100.
1800 - 6300.

Oced to 550/550
850 - 5600
1800 - 7100

The results can't be denied, the X1800 is just flat out better than the X850.


----------



## Tatty_One (Jul 28, 2006)

infrared said:
			
		

> Back to the original thread title...
> 
> 
> 
> tbh, it doesn't compete that well. It's a good card, but doesn't score anywhere near as well as those cards. I'm happy to say that i'm now running 2x x850's in crossfire, and i'm scoring around 11,500 in 3dmark05, which just about keeps up with the x1800/x1900 series cards. A single x850 is way off what even a stock x1800xt would get.



I believe a clocked x1800xt averages about 10000-10500 in 3D Mark 2005, and at stock they do about 9500-10000, so your crossfired 850's exceed that; my overclocked 850xt @ 610/650 gave me 7300.


----------



## Casheti (Jul 28, 2006)

Nah, we NEVER say my two cents in England. I see it on some American programs. What the f*ck does it mean??


----------



## mitsirfishi (Jul 28, 2006)

thing is, how much has it cost you to get 2 x850's, when for that money you could just get an x1800/x1900 xt and clock it


----------



## dduummyy (Jul 28, 2006)

Benchmarks, benchmarks, benchmarks. Does my 850 play everything? Yes. Does your 1800 play everything? Yes. I don't think paying $400 is worth the extra 1000 points in a benchmarking program when most cards perform just fine in all games; unless this is a paid sport, it's nothing other than testosterone badgering. You go benchmark while I enjoy my games.


----------



## infrared (Jul 28, 2006)

mitsirfishi said:
			
		

> thing is how much has it cost you to get 2 x850's and how much you could just get a x1800/x1900 xt and just clock it



Not much. The slave x850 is actually a modded Sapphire x800gto2 with an x850xtpe BIOS. That cost me £100 quite a long time ago. And I bought the Crossfire edition x850 off Warup for £70 ($130), so it's cost me very little.

The problem I face is that the Catalyst drivers use a lot of CPU to enable crossfire, so I score very badly until my 640 is clocked up to 4.7-4.9GHz  I could really do with a nice dual core CPU to make the most of crossfire.


----------



## _33 (Jul 29, 2006)

No matter what, crossfire or no crossfire, the X800/X850 architecture is behind the X1800/X1900.  You'll get a high score with your older Crossfire setup, but that score is eclipsed by the fact the card(s) can't perform the latest graphical achievements/technology (HDR + 3.0 shaders + faster OGL games, mostly).


----------



## Ketxxx (Jul 29, 2006)

GLD said:
			
		

> 1 Brand of PSU to Rule them All!: PC Power & Cooling!


nope, that would be MUSHKIN!


----------



## Daveburt (Jul 29, 2006)

If anyone is interested, I recently bought an X1900XT and got these results:

3D05   11029
3D06   5645

OC'd to 650/775 (best stable so far, haven't played with it much)

3D05   11584
3D06   5970

I have an X850XT (which is now sitting in the closet  ) . I thought about going the Crossfire route, but it just didn't make sense to me.  That's just my opinion.......  I got the 1900XT for $319, I'm pretty happy with it and HDR looks GREAT!

BTW, NICE OC on that Pentium Infrared!!!  You would LOVE Dual Core Goodness Man!!
I'll never go back to single core.... 

I can see you're an Intel guy, but the 3800X2's OC really well, and now that Conroe is out you can pick one up for a song.  Whichever way ya go I'm sure you'll enjoy DualCore once you make the switch!!


----------



## Casheti (Jul 29, 2006)

Just remember, when you play a game on a dual core, and the game does not support dual core technology, it will rip your computer to shreds and make your processor look like a wimp. I unfortunately had to find THAT out from personal experience.


----------



## Daveburt (Jul 29, 2006)

There are Patches to fix that Cash....  Everquest is the only Game I ever had problems with, I don't know about Intel, but with AMD you can just set the Affinity to use one core and it fixes it....

The benefits FAR outweigh figuring out how to fix a few quirks.... Conflicts are few and far between, and always repairable with a little research....


----------



## Casheti (Jul 29, 2006)

Is there something that enables dual core on Battlefield 2?


----------



## Juggernaut1987 (Jul 29, 2006)

Everybody knows that an X1800 series card is more powerful than an X850XT, but!
An X850XT is, what, 2 years old now? And it still keeps up very decently with the X1800 in actual games. Of course it's a little slower, it's older, and it costs 1/4 of what an X1800 series card does. So seriously, an X8*0XT isn't that bad at all. I'm keeping my X800XT for my coming Conroe setup (I have my ASRock Dual 775 VSTA but my E6400 is not even in sight yet )


----------



## Casheti (Jul 29, 2006)

hey juggernaut, how come your bunny has been chopped up into 3 pieces??


----------



## Tatty_One (Jul 29, 2006)

Daveburt said:
			
		

> There are Patches to fix that Cash....  Everquest is the only Game I ever had problems with, I don't know about Intel, but with AMD you can just set the Affinity to use one core and it fixes it....
> 
> The Benefits, FAR outway figuring out how to fix a few quirks.... Conflicts are Few and Far between, and always repairable with a little research....



yes there are "optimizers"  but the reviews I have read suggest that in a game, your CPU will still run about 5% slower (rather than about 10% before) than a single core venice @ 2600.......now unless there are lots of multi core products out there....and /or you dont game then where is the extra money going...down the pan!....by the way, I have a 4200 x2 @2800 so am not an "anti" dual core guy, just stating an unfortunate fact.


----------



## infrared (Jul 29, 2006)

Daveburt said:
			
		

> BTW, NICE OC on that Pentium Infrared!!!  You would LOVE Dual Core Goodness Man!!
> I'll never go back to single core....
> 
> I can see your an Intel guy, but the 3800X2's OC Really well, and now that Conroe is out you can pick one up for a song.  Which ever way ya go I'm sure you'll enjoy DualCore once you make the switch!!



Thanks  One day I'll go dual core! Tight on money at the moment though 

I don't really have any preference between AMD/Intel. AMD has all the cool timings to play with, which really appeals to me, but Intels just overclock like crazy on a good motherboard. Not saying that AMDs don't overclock well, of course.

Nice scores on your x1900 btw.


----------



## Casheti (Jul 29, 2006)

AMD's overclock like gay. Period. An AMD FX-62 Overclocked to the max can't beat a 2.93GHz X6800 Core 2 Duo


----------



## infrared (Jul 29, 2006)

But they do more work per clock cycle... so it doesn't matter if they don't keep up with intels clock for clock


----------



## Casheti (Jul 29, 2006)

Um...blah? Would you care to come on msn and explain?? Sorry, I don't have any porn for you this time. Lol, j/k


----------



## Daveburt (Jul 30, 2006)

That's why you don't pay $1000 for an FX (or X6800 C2D) unless you just don't know any better.   

Buy a 4400 for $250 and clock it to FX speeds!  My wimpy 3800X2 is rated at 2GHz but runs easily and stable at 2.6 (30% OC! That's FX-60 speeds)

I can't say for sure because I'm not that familiar with Intels, but I would be willing to bet their top of the line (Pentium) HotRods don't OC very well either.

As for C2D, that's a whole new animal!  From what I've been reading it's gonna take AMD a while to catch up to that one.... But hey, they've been smoking Intel for the last few years and they should have 65nm plants coming online soon which will probably help...

Either way, We all get better/cheaper processors!!!


----------



## Casheti (Jul 30, 2006)

I would rather buy an X6800 and clock it to a processor which doesn't exist yet..


----------



## noneed4me2 (Jul 30, 2006)

Intel wouldn't have had to make such a fast CPU if AMD's offerings were only so-so. Plus Intel's chipsets are all over the place and have more compatibility problems with regard to what CPU can work with what chipset/mobo combination. I've used Intel since early 486 days, and I am sure I will use them again, but you can't dismiss the ease of compatibility s939 has provided.


----------



## Dippyskoodlez (Jul 30, 2006)

micron said:
			
		

> Oblivion, Far Cry, Splinter Cell Chaos Theory, plus more.




Yay.. Far Cry.. a game where you land multiple headshots and the computer will still be shooting at you... 
yay Splinter Cell.. exciting game.. it comes bundled with everything + dog.

I personally don't like those games, so SM3.0 can bite me


----------



## Dippyskoodlez (Jul 30, 2006)

Casheti said:
			
		

> AMD's overclock like gay. Period. An AMD FX-62 Overclocked to the max can't beat a 2.93GHz X6800 Core 2 Duo




Does it matter when most people run A64 3200+'s anyways?  

"Overclock like gay"?

I mean, sheesh.. an 800MHz overclock using an $80 chip that is the worst stepping ever made... is just gay?

  50% overclock is quite good imo. Especially for a CBBID winnie.

I've done 3ghz on a brand new venice (right when they came out) that cost $150...

Not quite "overclock like gay". Perhaps you should learn to overclock THEN pass judgement sometime.


----------



## Dippyskoodlez (Jul 30, 2006)

Casheti said:
			
		

> I would rather buy an X6800 and clock it to a processor which doesn't exist yet..




Too bad the X6800 isn't out yet either. Please argue with available CPUs?

and yes, the K8L very much exists. The CPUs that will follow the K8L and Conroe already exist.

They just aren't.. DONE.  

But you go ahead and buy that $1000 CPU, and get that extra 10 fps.

I'll be happy getting 10 less fps with my $500 COMPUTER (WHOLE SYSTEM!!!) and buying $500 worth of pizza and games to put my stuff *to USE*


----------



## micron (Jul 30, 2006)

Dippyskoodlez said:
			
		

> I personally don't like those games, so SM3.0 can bite me


That's pretty funny. SM3.0 games are coming out practically every other week, but you'll deny them all too. It's funny to see the things some people will post when they're trying to defend their outdated graphics card 

Hey everybody, cancel SM3.0.......Dippyskoodlez thinks it's a joke


----------



## Frick (Jul 30, 2006)

@OC like gay: Gay "good" or gay "bad"?  My max stable OC with my 3000+ Venice is in the 2.5GHz area, without mods. I once pushed it to 294*9, but the system crashed after POST. So, with the voltmods I'm pretty sure I can push up to 300*9, or more.  I love the way A64s OC. But Core 2 still beats the living h*ll out of everything AMD has. Me like thiiiis much.


----------



## Casheti (Jul 30, 2006)

Um...I dunno. And what's with the AMD names??? They call em a 3000+, but they're not 3.0GHz, so why call em 3000+ ?


----------



## i_am_mustang_man (Jul 30, 2006)

Casheti said:
			
		

> Um...I dunno. And what's with the AMD names??? They call em a 3000+, but they're not 3.0GHz, so why call em 3000+ ?



because the PC arena was dominated (at least when the A64 entered) by Intel, which had 3GHz CPUs; so a 3000+ A64, running at 1.8GHz, is named to show it competes with a 3.0GHz P4

AMD had to conform to the Intel-dominated market, and did pretty well for themselves thusly
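The naming arithmetic can be sketched like this. The 1.67 per-clock factor is just back-derived from the 3000+/1.8GHz example above, not AMD's actual formula, and `pr_rating` is an illustrative helper name:

```python
def pr_rating(clock_ghz: float, per_clock_factor: float = 1.67) -> int:
    """Very rough 'performance rating' model: the model number is not the
    clock, it's the clock scaled by an assumed work-per-clock advantage,
    rounded to the nearest 100. The 1.67 default is a back-of-envelope
    guess derived from the 3000+ example, not an official AMD figure."""
    return round(clock_ghz * per_clock_factor * 1000 / 100) * 100

# An Athlon 64 at 1.8 GHz lands near the "3000+" label:
print(pr_rating(1.8))
```

Under this toy model a 2.0GHz part comes out around "3300+", which is roughly how the real lineup was spaced.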


----------



## Dippyskoodlez (Jul 30, 2006)

micron said:
			
		

> That's pretty funny. Sm3.0 games are coming out practically every other week, but you'll deny them all also. It's funny to see the things some people will post when they're trying to defend their outdated graphics card
> 
> Hey everybody, cancel SM3.0.......Dippyskoodlez thinks it's a joke




And the visual difference... is.... huge?

Yes, you proved my point. Games are continually coming out.

The Geforce Ti4200 used to run games high detail. Now it's forced to run low detail DX8.

Guess where the performance of these cards is rapidly heading?

Oh thats right, low res, med/high detail.  

It's like trying to use DirectX 9 on a GeForce FX 5200. Just plain retarded.


----------



## Casheti (Jul 30, 2006)

I Have A 5200 In My Dell!!!


----------



## Frick (Jul 30, 2006)

i_am_mustang_man said:
			
		

> because the pc area is dominated (was at least when a64 entered) by intel, which as 3GHz cpu, ie, so the 3000+ A64 will compete with it, at 1.8GHz
> 
> AMD had to conform to the intel dominated market, and did pretty well for themselves thusly



As I recall it, they introduced this with the original Athlon CPUs to kick the original P4 (which sucked!) around a bit. Showing off, you could call it. Later on the numbers were quite off, really.. I think AMD and the market simply got used to it.

EDIT: There's nothing bad with the 5200.. if you're not into gaming. At least not after the year 2000.


----------



## i_am_mustang_man (Jul 30, 2006)

yea, exactly.  the naming system started because of that, but they have stuck with it - a step up in performance means +100, or 200 or something like that


----------



## {JNT}Raptor (Jul 30, 2006)

Casheti said:
			
		

> Just remember, when you play a game on a dual core, and the game does not support dual core technology, it will rip your computer to shreds and make your processor look like a wimp. I unfortunately had to find THAT out from personal experience.



Oh Please.  

Single threaded games run just fine on dual core....been playing games on mine for over a year with no issues even remotely like you describe......no dual core patch....no AMD Dual Core Optimizer.

FEAR
Half Life 2
Oblivion
SS2
SOF2
UT04
FarCry
Prey
Soldier Elite

Only game here that has the ability to run multi-threads is Oblivion......which runs just fine at 1024x768 4x AA and 4x AF for me.

I'm guessing your Pentium's to blame....or perhaps your system setup.


----------



## Casheti (Jul 30, 2006)

Yeah, my system is corrupt, and gay, and has too many processes, etc...


----------



## micron (Jul 30, 2006)

Dippyskoodlez said:
			
		

> Yes, you proved my point. Games are continually coming out.
> 
> The Geforce Ti4200 used to run games high detail. Now it's forced to run low detail DX8.
> 
> ...



After reading this reply, I'm embarrassed I even tried having a discussion with you. 

_You_ should actually be embarrassed more


----------



## Tatty_One (Jul 30, 2006)

{JNT}Raptor said:
			
		

> Oh Please.
> 
> Single-threaded games run just fine on dual core....been playing games on mine for over a year with no issues even remotely like you describe......no dual-core patch....no AMD Dual Core Optimizer.
> 
> ...



Bump! Never had a problem with any game on my X2, and I play Oblivion with everything maxed @ 1280x1024.


----------



## Dippyskoodlez (Jul 30, 2006)

micron said:
			
		

> After reading this reply, I'm embarrassed I even tried having a discussion with you.
> 
> _You_ should actually be embarrassed more




Apparently you haven't tried playing a game that uses DirectX 9 shaders on an FX 5200.


----------



## Casheti (Jul 30, 2006)

I have. Call Of Duty 2


----------



## Dippyskoodlez (Jul 30, 2006)

Casheti said:
			
		

> I have. Call Of Duty 2




 

Bet it was quite the pretty slideshow too.


----------



## Casheti (Jul 30, 2006)

Lmao. Yeah, it was. It was on a Pentium 4 2.8GHz, 1.25GB DDR400, and a PCI 256MB GeForce FX 5200. It was unplayable on the lowest DX9 settings, but it was fine on DX7; it lagged wherever there was lots of smoke. Lol


----------



## Casheti (Jul 30, 2006)

1, nothing wrong with me, 2, nothing wrong with me, 3, nothing wrong with me, 4, nothing wrong with me, 1 somethings got to give, 2 somethings got to give, 3 somethings got to give, NOOOOOOOOOOOOOOOOOOOOOOOOOOWWW!!, Let the bodies hit the flo', let the bodies hit the flo', let the bodies hit the, grrrrrrrrroaaah. Let the bodies hit the floor, Let the bodies hit the floor, Let the bodies hit the... FLOOORRRRRRRRR!!!!!!! Skin against skin, blood and bone, Your all by yourself but your not alone, You wanted in, now your here, Driven by hate, consumed by feeeeeeeeaaar


----------



## Dippyskoodlez (Jul 31, 2006)

Casheti said:
			
		

> 1, nothing wrong with me, 2, nothing wrong with me, 3, nothing wrong with me, 4, nothing wrong with me, 1 somethings got to give, 2 somethings got to give, 3 somethings got to give, NOOOOOOOOOOOOOOOOOOOOOOOOOOWWW!!, Let the bodies hit the flo', let the bodies hit the flo', let the bodies hit the, grrrrrrrrroaaah. Let the bodies hit the floor, Let the bodies hit the floor, Let the bodies hit the... FLOOORRRRRRRRR!!!!!!! Skin against skin, blood and bone, Your all by yourself but your not alone, You wanted in, now your here, Driven by hate, consumed by feeeeeeeeaaar




Please don't spam.


----------



## JC316 (Jul 31, 2006)

It's "Consumed by hate, consumed by fear". Wow, this thread has become a spam fest.


----------



## OpTicaL (Jul 31, 2006)

I own an x850xt and it sucks in games with HDR turned on. I can't play Counter-Strike: Source 16v16 on cs_office without lag. Once a lot of nades go off and paras start firing, I start to lag. I don't have this problem on smaller servers, only on big ones.

I run with no AA or AF, HDR off, everything else set to high and a res of 1280x1024.

That's why I suspect the people who say they can run Doom3/Quake4 on ultra quality without severe lag are bullshitting. You must be running your rigs at some savage resolution like 640x480.


----------



## infrared (Jul 31, 2006)

OpTicaL said:
			
		

> I own a x850xt and it sucks in games with HDR turned on.



The x800/x850 series DON'T support HDR...



			
				OpTicaL said:
			
		

> I run with no AA or AF, HDR off, everything else set to high and a res of 1280x1024.
> 
> That's why I suspect the people who say they can run Doom3/Quake4 on ultra quality without severe lag are bullshitting. You must be running your rigs at some savage resolution like 640x480.




I can quite happily run all* games at 6x AA and 14x AF @ 1280x1024, smooth as silk, on a single x850xtpe. If you can't, there's something seriously bottlenecking your computer.


*All games includes D3, Q4, FEAR, Battlefield 2, etc. etc.

EDIT: My single x850 scores over 8k in 3DMark05... check what yours scores before calling everything "bullshit"


----------



## v-zero (Jul 31, 2006)

JC316 said:
			
		

> It's "Consumed by hate, consumed by fear". Wow this thread has become a spam fest.


You can't play Oblivion smoothly with those settings in your sig. I have a replica of that setup here, and it doesn't work...
Same goes for Q4 - it's playable, but not comfy...


----------



## Casheti (Aug 1, 2006)

I want to be loved!


----------



## bigboi86 (Aug 1, 2006)

infrared said:
			
		

> The x800/x850 series DON'T support HDR...
> 
> 
> 
> ...



It's probably his CPU (2.6C, ew..)

Scoring 8k with a single X850 is quite an accomplishment though.


----------



## OpTicaL (Aug 1, 2006)

infrared said:
			
		

> The x800/x850 series DON'T support HDR...
> 
> 
> 
> ...



Doom3 and Quake 4, max AA and AF @ 1280x1024? Ultra settings? Smooth as silk? You gotta be joking. I wonder who's BSing now. The Doom3 developers clearly stated Doom3 needs 512MB of video RAM in order to run at max settings, and here you're saying you can run it *SILKY SMOOTH* with just 256MB? You'd need to give me timedemo screenshots for me to believe you.

If you said your system was CrossFire or SLI and you were running D3 and Q4 silky smooth, I wouldn't have anything to say. An x850xtpe? Silky smooth? Give me a break. You can brag about benchmarks all you want; benchmarks are just numbers, and they don't reflect real-world performance. Performance in games is what matters in the end, not benchmark numbers. Can benchmarks recreate a multiplayer environment in video games? NO.

I score 103.69 fps average on the CS: Source video stress test. So what? Doesn't mean jack if the numbers don't reflect real-world performance. On big servers with 40 players I barely maintain 35fps when everyone is throwing nades.

Just think about it: I own an x850xt myself, there's absolutely no need for me to bash my own card. Get real, man. ATI sucks now and everyone knows it. It was good for a while, but now Nvidia is king.

Just like AMD: it was good for a while and now Intel is king. Heck, even the mainstream E6600 outperforms the FX-62. The Core 2 Extreme X6800 is overkill.



			
				bigboi86 said:
			
		

> It's probably his CPU(2.6c ew.. )
> 
> Scoring 8k with a single X850 is quite an accomplishment though.



Why didn't I upgrade, you might ask? Prescott is a nuclear reactor, and the Pentium D is a generic dual core. AMD? I thought about it, but it wasn't absolutely necessary at the time. I wanted to see what Core 2 had to offer.

Core 2 has arrived, so now it's time to upgrade - that is, once more motherboard support becomes available.


----------



## Frick (Aug 1, 2006)

OpTicaL said:
			
		

> ATI sucks now and everyone knows it. It was good for a while, but now Nvidia is king.



 

Please give me facts that support that theory. Of course, the 7950 GX2 is fast as h*ll, but VERY expensive (and don't mention SLI..  ). I could just as well state that Nvidia sucks because LOTS of people have TONS of problems with their cards.

@infrared: Numbers! Numbers! I need fps!


----------



## OpTicaL (Aug 1, 2006)

Frick said:
			
		

> Please give me facts that support that theory. Of course, the 7950 GX2 is fast as h*ll, but VERY expensive (and don't mention SLI..  ). I could just as well state that Nvidia sucks because LOTS of people have TONS of problems with their cards.
> 
> @infrared: Numbers! Numbers! I need fps!



Facts? There's Google for that. It's not a theory, it's fact. How do you decide which brand is best? You take their most powerful card and pit it against the other's most powerful card. In this case you take 2x 7900GTX in SLI and pit it against ATI's 1900xtx and 1900xt in CrossFire. Results are posted all over Google. Here's a link just in case you're lazy:

http://www.maximumpc.com/2006/06/sapphire_radeon_2.html

Let's not even talk about the 7950 GX2 in Quad SLI; heck, ATI can't even do quad CrossFire. Enough proof for you? Now, you say a lot of people have problems with Nvidia cards. What kind of problems? Hardware? Software? Drivers? Don't make a broad generalization like "LOTS of people have TONS of problems". Back your words up instead of talking out of your behind.


----------



## Juggernaut1987 (Aug 1, 2006)

Whoohoo, heated discussions! 
It's just a clear fact that Nvidia is behind ATI. Quad SLI sucks money and, well, just sucks completely. Nvidia doesn't support FSAA and HDR at the same time, which sucks. I had a Riva TNT2 which sucked even worse than my ATI Rage 3D Pro (yes, the 8MB one). With my ATI card I at least had something to look at, while my TNT2, which was RMA'd 5 times, just put my screen on standby.
At a friend's house I installed a fresh Windows installation, picked up the latest Nvidia drivers for his GeForce and BAM! Display corruption everywhere.

And my last and most powerful truth about why Nvidia is bad:
"The way it's meant to be played": remember the first Tomb Raider Legend benchmarks? 
(for the interested: http://www.bit-tech.net/gaming/2006/04/11/tomb_raider_legend_review/4.html)

I'm not saying ATI is good or perfect, I'm just saying that Nvidia is worse. It's a damn shame ATI teamed up with AMD 

EDIT: And to keep things on topic, the X8*0XT series are aging, but they still kick ass


----------



## OpTicaL (Aug 1, 2006)

Juggernaut1987 said:
			
		

> Whoohoo, heated discussions!
> It's just a clear fact that Nvidia is behind ATI. Quad SLI sucks money and, well, just sucks completely. Nvidia doesn't support FSAA and HDR at the same time, which sucks. I had a Riva TNT2 which sucked even worse than my ATI Rage 3D Pro (yes, the 8MB one). With my ATI card I at least had something to look at, while my TNT2, which was RMA'd 5 times, just put my screen on standby.
> At a friend's house I installed a fresh Windows installation, picked up the latest Nvidia drivers for his GeForce and BAM! Display corruption everywhere.



We're talking about modern-day cards and this guy goes off talking about prehistoric cards and prehistoric problems! Hey Juggernaut, is your head stuck in 1987, or is your computer from 1987?


----------



## Juggernaut1987 (Aug 1, 2006)

1987 is the year I told Nvidia to get the F out of my machine 
And wow! Is SLI from 1987 already? Close: it was available to consumers from February 1998, when 3dfx launched the Voodoo 2


----------



## {JNT}Raptor (Aug 1, 2006)

OpTicaL said:
			
		

> ATI sucks now and everyone knows it. It was good for a while, but now Nvidia is king.
> 
> Just like AMD: it was good for a while and now Intel is king.



AMD and ATI do not suck.....they're just lying low, preparing their responses to the new offerings from Intel and Nvidia.........are you new to computing?....because this is generally the way it goes.

I also run the X850XT.......no, not all games at 1280x1024 6xAA and 14xAF....but...I'm not running a 5000MHz P4 like infrared...which I suspect is what helps him run those settings.....along with a real nice OC on his card. 

Yes...Nvidia and Intel have taken back the top spots from ATI and AMD......for now............that's the way it's supposed to go.....otherwise we're back to Intel and Nvidia trying to corner the market and force everyone over to their side....examples would be Nvidia buying out better but smaller companies and shelving them (ULi, 3dfx), just to name two...or Intel attempting to force the "entire" computer market over to their Rambus technology (remember that?)......it's not a bash session.....but those things happened.

With Intel and Nvidia getting back into the top spots, just look at the recent price drops....it's a buyer's paradise right now......but saying AMD and ATI suck.......is just plain ignorant.

And I'm actually really pleased that ATI and AMD have merged.....ATI now has access to even better tech to build their future GPUs.......which I'm sure has Nvidia feeling it a little more than they're letting on....not to mention the super chipset they can produce in the future....with the rapid acceleration of PC tech, it's only natural to see mergers at this level....I for one cannot wait to see what they churn out for us.

Bottom line.....flip-flopping the top spots helps the consumer by forcing prices on good hardware even lower....so let's enjoy it while we can.


----------



## Casheti (Aug 1, 2006)

Because my hard drive is so clogged up with rubbish, would that make my card perform worse in games, e.g. BF2??? I play it with max settings at 1280x1024 with 6x AA and 16x AF forced externally. In game, AA is set to 2x, but Tray Tools overrides it to 6x


----------



## {JNT}Raptor (Aug 1, 2006)

Casheti said:
			
		

> Because my hard drive is so clogged up with rubbish, would that make my card perform worse in games, e.g. BF2??? I play it with max settings at 1280x1024 with 6x AA and 16x AF forced externally. In game, AA is set to 2x, but Tray Tools overrides it to 6x




It wouldn't affect video, but it would certainly have an effect on files being loaded from your drive.

If you're forcing 6xAA and 16xAF in your driver CP/CCC, then the in-game settings don't matter.....they're being overridden by your CP/CCC settings.....generally, visual settings (AA and AF) should only be forced if the game doesn't expose the setting itself.


----------



## Casheti (Aug 1, 2006)

So... files being loaded from the drive while playing Battlefield 2 load slower?


----------



## Juggernaut1987 (Aug 1, 2006)

Yes, you should always defragment and clean your hard drive. Otherwise programs have to "look" for files all over the drive because of fragmentation. So in theory, loading times in BF2 can increase if your HD gets clogged.


----------



## Casheti (Aug 1, 2006)

I do defrag and cleanup and stuff...but it doesn't help much...if at all.


----------



## {JNT}Raptor (Aug 1, 2006)

The optimal way to run games is from the first partition of a second HD running on its own channel.

The front of the HD is the fastest access area.

On your 300GB drive, how far back is your games partition?....the further back on the drive, the slower the access time.
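The "front of the drive is fastest" point comes from zoned recording: outer tracks pass more sectors under the head per revolution, so sequential throughput falls off toward the end of the disk. A toy model of that falloff (the 60 MB/s outer and 30 MB/s inner figures are made up purely for illustration, not measured from any real drive):

```python
# Toy model of why the first partition on a hard drive reads fastest:
# outer tracks pack more sectors per revolution (zoned bit recording),
# so sequential throughput falls toward the inner (later) part of the disk.
# The 60 MB/s outer / 30 MB/s inner numbers are illustrative only.

def throughput_mb_s(position: float, outer: float = 60.0, inner: float = 30.0) -> float:
    """Rough sequential read speed at a fractional disk position
    (0.0 = start/outer edge, 1.0 = end/inner edge), interpolated linearly."""
    return outer - (outer - inner) * position

for pos in (0.0, 0.5, 1.0):
    print(f"position {pos:.0%}: ~{throughput_mb_s(pos):.0f} MB/s")
```

Real drives step down in discrete zones rather than a smooth line, but the trend is the same: a games partition at the back of a 300GB drive sits on the slow half.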


----------



## Frick (Aug 1, 2006)

OpTicaL said:
			
		

> Facts? There's Google for that. It's not a theory, it's fact. How do you decide which brand is best? You take their most powerful card and pit it against the other's most powerful card. In this case you take 2x 7900GTX in SLI and pit it against ATI's 1900xtx and 1900xt in CrossFire. Results are posted all over Google. Here's a link just in case you're lazy:
> 
> http://www.maximumpc.com/2006/06/sapphire_radeon_2.html
> 
> Let's not even talk about the 7950 GX2 in Quad SLI; heck, ATI can't even do quad CrossFire. Enough proof for you? Now, you say a lot of people have problems with Nvidia cards. What kind of problems? Hardware? Software? Drivers? Don't make a broad generalization like "LOTS of people have TONS of problems". Back your words up instead of talking out of your behind.



ATI doesn't "suck" just because Nvidia's cards are currently faster than ATI's cards. ATI's new X1950XT will be cheaper than the 7950 GX2, and that's a VERY good thing when it comes to "regular" consumers. Dough is president, you know. 

Problems: problems with pre-overclocked cards, problems with drivers (I've always had problems with Nvidia's drivers.. dunno why).. Try Google.

BTW, I'm not a fanboy, but I don't like it when people claim stuff sucks when it really doesn't.. And as some posters already said: the tides will turn. 

@AMD: I like AMD, mainly because I can now get a 3800+ X2 for next to nothing. Nice CPUs without needing a total upgrade. I think AMD will go strong for another couple of months until the DX10 cards are out.. Then, I think, people will do total upgrades and buy everything new at the same time. No question that Core 2 is waaayyy faster than anything AMD has, but with the price slashes they're still attractive.

EDIT:

"So how does Sapphire’s Radeon X1900 XTX stack up? The answer depends on which of a videocard’s several missions you value the most. We’ve praised ATI’s Avivo technology before, and we’ll do it again: Video looks fantastic on Sapphire’s X1900 XTX—much better than it does on XFX’s GeForce 7900 GTX. Avivo renders games more attractive, too: Colors are brighter, richer, more saturated." Maximum PC

Further proof that ATI doesn't suck.


----------



## b1lk1 (Aug 1, 2006)

LMFAO!  Quad SLI is the biggest joke in a long, long time.  There isn't even a CPU out that can relieve the bottleneck, never mind the lack of any useful drivers.  As for Nvidia being king: when was the last time you checked the Futuremark ORB?  All the top benchers are running X1900 CrossFire, as it's the fastest setup going.  While I do agree that benchmarks are purely synthetic and don't reflect real-world use, you cannot deny that X1900s are soundly cleaning Nvidia's clock.  I'm sure you'll deny it anyway, but you're looking with your hands over your eyes.


----------



## Moose1309 (Aug 2, 2006)

b1lk1: you forget the real issue: Quad SLI costs more and sounds cooler.  It has "quad" in the name.    

Never mind that it performs worse than normal SLI most of the time.


----------



## x800professor (Aug 2, 2006)

Moose1309 said:
			
		

> b1lk1: you forget the real issue: quad SLI costs more and sounds cooler.  It has "quad" in the name.
> 
> Nevermind that it performs worse than normal SLI most of the time.



If I win the lottery, I'm going to buy 16 cards and shove them in my tower on my motherboard.  Why?  Because I will have the most expensive system EVER!  I'm also planning on buying 50 core 2 extremes and stacking them on top of each other.


----------



## Daveburt (Aug 2, 2006)

Damn... you folks are getting all riled up!!  

I only checked the thread because someone was bashing my heroes... the "A-Team" (ATI/AMD)....  I should probably just shaddup, I'm kinda buzzed, but I can't help myself  

Intel has definitely rocked AMD's world with the C2D chips... you'll get no argument from me on that point!  But there's a local store selling 4800+ X2s for $320, and I don't have to buy a new mobo or DDR2.....  (sweet!)  Intel finally competes again, woohoo (only took 'em what, 3 years?)

As for Nvidia smoking ATI..... Ummm.... which Nvidia card is it that will do HDR+AA at the same time?


----------



## Daveburt (Aug 2, 2006)

{JNT}Raptor said:
			
		

> AMD and ATI do not suck.....thier just on the downlow preparing thier responses to the new Offerings by Intel and Nvidia.........are you new to computing?....because this Is generally the way It goes.
> 
> I also run the X850XT.......No not all games at 1280x1024 6xAA and 14xAF....but...I'm not running a 5000Mhz P4 like Infared...which I suspect Is what helps him to run those settings.....along with a real nice OC on his card.
> 
> ...




Nice job, Raptor. I would have said the same if I were sober... but I'm sure I wouldn't have been so eloquent...


----------



## Frick (Aug 2, 2006)

Man, what is this, St. Patty's?  

Anyhow, just to get myself on topic, I'm very pleased with my x800RX. OCs like a rat on crack.  600/590 to the people.   I was thinking about buying a 7600GS and modding/OCing it like whatever, as I have plans to go Linux and skip this M$ stuff for the rest of my life. But I might as well wait and buy a budget G80 card. 

So, my x800 does just fine. Good thing the greatest bottleneck in my computer is the memory.


----------



## Tatty_One (Aug 2, 2006)

Juggernaut1987 said:
			
		

> Yes, you should always defragment and clean your harddrive. Otherwise programs have to "look" for the files all over the harddrive because of fragmentation. So in theory loading times in BF2 can increase if your HD gets clogged.



Bump!  I have read that, in general terms, you can halve the fragmentation percentage to estimate the degradation in hard drive access times....for example, if a drive were 20% fragmented it could take up to 10% longer to access a file. This was tested on a 120GB drive, so these very "loose" figures will change depending on drive size. For single-file access we're talking milliseconds, but for multi-file access in a BIG game like BF2 there would probably be a noticeable difference.
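That halve-the-percentage rule is easy to play with in a few lines. A quick sketch (the rule itself is only the loose forum estimate above, and the 30-second baseline load time is a made-up example, not a measured BF2 figure):

```python
# Back-of-the-envelope sketch of the "halve the fragmentation" rule of
# thumb: access-time penalty (%) ~= fragmentation (%) / 2.
# Illustrative numbers only, not measurements.

def access_penalty(fragmentation_pct: float) -> float:
    """Estimated % increase in file access time for a given % fragmentation."""
    return fragmentation_pct / 2.0

def load_time(base_seconds: float, fragmentation_pct: float) -> float:
    """Apply the penalty to a baseline load time (e.g. a level load)."""
    return base_seconds * (1 + access_penalty(fragmentation_pct) / 100.0)

for frag in (0, 10, 20, 40):
    print(f"{frag:2d}% fragmented -> ~{access_penalty(frag):4.1f}% slower, "
          f"a 30s load becomes {load_time(30, frag):.1f}s")
```

So even a badly clogged drive only stretches loads by a modest fraction; it won't touch in-game framerate, which matches the point made earlier that fragmentation affects loading, not video.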


----------



## Casheti (Aug 2, 2006)

Hey... what do you guys think of Maxtor hard drives??? I got one in my Dell... and it's super quiet... am I lucky, or are they really good??


----------



## Tatty_One (Aug 2, 2006)

Casheti said:
			
		

> Hey...what do you guys think of Maxtor Hard Drives??? I got one in my Dell...and it's super quiet...am I lucky...or are they really good??



I have two (80GB) and they're also near silent. They aren't the fastest around by any means, but not the slowest either; in my experience they tend to be reasonably well priced, reliable and quiet!


----------



## Casheti (Aug 2, 2006)

WOW!! That's pretty cool, seeing as mine came in a Dell. lol


----------



## Daveburt (Aug 4, 2006)

Maxtors are definitely good drives!!  I really think if you buy a decent brand you can't go wrong though... 

I've had Maxtors, Seagates, WDs and Samsungs... Actually, now that I think about it, I've never had an HD fail on me... I have a Seagate that got handed down about 5 times (prolly 6 years); in fact it's still a little 40GB puppy holding my music files on the server  (yes, I have music backups on DVD!)

I just hope these Samsungs prove just as dependable (the oldest one is ~9 months old); no problems yet!   ($99 for a 250GB SATA2, how can that be a bad thing?)

I'd still have to give my favorite HD vote to Seagate though............. Maxtor is 2nd!


----------



## Daveburt (Aug 4, 2006)

Just so ya know.... I have a WD 74GB Raptor that I use as a system drive....

Fast as hell, but it sounds like I'm making Jiffy Pop when I'm doing anything drive-intensive, defrag or crunching video....  I can deal with a little noise for the sake of speed though... just me.....


----------



## Casheti (Aug 4, 2006)

I can't believe a £450 Dell came with such a good quality Hard Drive


----------



## OpTicaL (Aug 5, 2006)

Daveburt said:
			
		

> Just so ya know.... I have a WD 74GB Raptor that I use as a system drive....
> 
> Fast as hell, but it sounds like I'm making Jiffy Pop when I'm doing anything drive-intensive, defrag or crunching video....  I can deal with a little noise for the sake of speed though... just me.....



If you're into modding, you can hang your HDDs with industrial-grade rubber bands. That cancels out all vibration; then you'd need a case with sound-dampening material to get rid of the noise. That's what I did with my Lian Li PC-777 case.

On the Quad SLI issue, I don't think ATI is going to release quad CrossFire. They're just going to be the first to release dual-core single-card GPUs in CrossFire, which in a way is much more efficient than running 4 video cards.

The only problem right now with dual-core GPUs is heat dissipation. My OC'd x850xt already runs at 62C under 100% load with a VF700 cooler; just imagine 2 cores. My video cards heat my room when winter comes around.


----------



## Daveburt (Aug 6, 2006)

Sup, OpTicaL.... I've pretty much decided I'm not going to fall into the dual video card trap (SLI or CrossFire).  Not knocking anyone who has that setup, but it takes a lot of cash, and at the rate vid cards are advancing, in 6-12 months there will be a single-card solution that will probably smoke 'em anyway... And let us not forget about DX10... 

I did buy a 1900XT (I know, I'm weak...  )....
Talk about getting HOT!! This thing idled at 52C and went to almost 90C under load!!!

You might want to consider the Zalman VF900-Cu. I was a little leery about buying one because they're so small compared to the heater block the 1900XT had on it, but I couldn't be happier with it!  My idle temps dropped about 15C (just checked, it's 38C) and I've never seen the max go over 69C!!!  

It will raise your system temps a couple of degrees because it doesn't vent out of the case, but I have good ventilation in my case so that's not really a big deal......  Plus it's whisper quiet compared to the stock cooler!!


----------



## OpTicaL (Aug 6, 2006)

I was thinking about a VF900 cooler, but I already have a VF700, and some friends of mine who must have the very best upgraded their VF700s to VF900s and reported an average 3-5C drop. Which is very impressive.


----------

