
MSI GeForce 7900GTO 512MB - The Luck Of The English!

Wow, this 7900GTO just keeps on going - I'm now at 710MHz core / 800MHz memory and still haven't had a single issue after testing with ATITool and 3DMark :D After 5 mins of ATITool's artifact scanner it peaks at 59 degrees.
 
Nice! You are at the point now where you will benefit from running 3DMark05, because mine will do 715/825 but it actually performs worse at those speeds than at 710/820 - it is right on the limit, and once the GPU/VRAM become stressed they drop back.
 
Unfortunately my D805 is STILL holding me back just short of the 10,000 mark - I'll have to look into pushing it even further...
 
I'm running 3DMark05 right now with the CPU @ 2.5GHz and the new RAM at 250MHz @ 2.85V!
 
1st run with the new RAM:
CPU @ 2.5GHz
2GB RAM @ 250MHz 3-4-4-8 (can get better, I hear) @ 2.85V
700/760 on the GTO

I want to break 11,000 by tomorrow night!

10549.jpg
 
Miles away lol, keep going, you'll get there!
 
Congrats on 1100! I need to bench this baby, but I'm not even home :(
 
Nice, Mustang. Do you notice a huge difference between that and your GTO2 in real games? Do you max out more settings in games?
 
Processor speed is holding you back from 11,000 - you would need at least that 3200+ @ 2.8/2.9GHz to hit 11,000. I know I hit 11,100 with an X1800XT 512MB @ 750/1800 and a 3800+ Venice @ 2.88GHz.

Screw 3DMark05 and 06 - what do you score in the F.E.A.R. benchmark at 1600x1200, 4x AA, 16x AF, max settings? Stock @ 700/1600 I hit 30 fps minimum, 48 fps average.
 
The 7900 GTO/GTX are decent cards, but I know for a fact my X1800XTPE 512MB at 700/1600 outperforms them in F.E.A.R., Oblivion, Serious Sam 2, Battlefield 2 and a few other games. The 7900 GTX wins in Doom 3 engine games because it has 24 TMUs (texture mapping units) while the X1800XTPE has only 16 - it's crazy that a 16-pipeline card can outperform a 24-pipeline one lol. I still would never trade this X1800XTPE for a 7900 GTX, EVER. BTW, the Doom 3 engine relies on texture fill rate, which is why the 7900 GTX outperforms the X1800XT/X1900XTX there, but I score 72 fps at 1600x1200, 4x AA, 16x AF, Ultra Quality, and I'm sure a 7900 GTX @ 650/1600 only scores around 74-76 fps, so it isn't much faster. On raw specs the 7900 GTX is more powerful than the X1800XT, but the X1800XT has the ring-bus memory controller and just seems more efficient than NVIDIA's GPU architecture, so the raw power doesn't really matter.
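
To put rough numbers on that fill-rate argument, here's a quick Python sketch using the clocks quoted above (650/1600 for the GTX, 700/1600 for the XTPE) and the 24 vs 16 TMU counts - purely back-of-the-envelope, nothing official:

def texel_fill_rate_gtexels(core_mhz, tmus):
    """Peak texel fill rate in GTexels/s (core clock x number of TMUs)."""
    return core_mhz * tmus / 1000.0

cards = {
    "7900 GTX  (650 MHz, 24 TMUs)": (650, 24),
    "X1800XTPE (700 MHz, 16 TMUs)": (700, 16),
}

for name, (clock, tmus) in cards.items():
    print(f"{name}: {texel_fill_rate_gtexels(clock, tmus):.1f} GTexels/s")

# Roughly 15.6 vs 11.2 GTexels/s - which is why a 24-TMU part can pull ahead
# in a texture-fill-bound engine like Doom 3 despite similar core clocks.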
 
Still good cards for the money though... the money that I wish I had. :(
 
Yeah, NVIDIA cards are decent, but I have had 4 BFG Tech 7800 GTXs die on me - artifacts, or they just won't power up: dead memory, voltage regulators, core artifacts. It just seems ATI cards go through much more quality control testing and use higher quality parts when you inspect both side by side.

Personally, I'm never buying an NVIDIA card again - I still haven't returned my 4th dead 7800 GTX. What's funny is they died just from running Half-Life 2 or overclocking, while I can overclock the hell out of the X1800XT and it still works. I also accidentally dropped my X1800XT while changing cards once and it still works lol.
 
Of course I do, but it's the time of year where I'm waiting for sales near Thanksgiving to buy Christmas presents, you know, and I think of myself last.

And I might get an Opteron 165 dual-core for Christmas to replace my 3200+ @ 2.67GHz, and later on I might go mid-range SLI. So this isn't the time to buy, for me at least - I'm waiting for a CPU first, since my XL runs all of the games I play at 1280x1024 with (mostly) max settings... then I'll get a new GPU.
 
I'm waiting for the R600, no doubt - screw the weak 384-bit G80, the R600 is supposed to have a 512-bit memory interface.
I've got a 3200+ Winchester S939 A64, one of the first 90nm A64s.

It can hit 2.6GHz with the RAM overclocked to 260MHz on Crucial PC4000 - no divider crap.

Is your RAM overclocked, or are you running it downclocked to reach 2.67GHz?
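
For anyone unsure what the divider question means, here's a rough Python sketch of how Socket 939 clocks hang together (CPU = reference/HTT clock x multiplier; at 1:1 the DDR runs at the reference clock, a memory divider runs it slower off the CPU clock). The 260 x 10 case is from my post above; the 2.67GHz case and the way the divider is picked are just illustrative - real BIOS divider tables vary:

import math

def a64_clocks(ref_mhz, multiplier, mem_divisor=None):
    """CPU and memory clock for a Socket 939 A64 (simplified, illustrative model)."""
    cpu = ref_mhz * multiplier
    if mem_divisor is None:        # 1:1 - "no divider crap"
        ram = ref_mhz
    else:                          # divider: RAM = CPU clock / integer
        ram = cpu / mem_divisor
    return cpu, ram

# 3200+ Winchester example from the post: 260 x 10 = 2600 MHz, RAM 1:1 at 260 MHz (DDR520)
print(a64_clocks(260, 10))

# One hypothetical way to reach ~2.67 GHz while holding the RAM at or below stock DDR400:
cpu_target = 267 * 10
print(a64_clocks(267, 10, mem_divisor=math.ceil(cpu_target / 200)))   # RAM ~191 MHz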
 
So, where is your ATI tattoo, exactly? ;)

I would not write off the NVIDIA products too quickly. I find that both companies trade blows in the race forward, so neither dominates for too long. Hopefully ATI can respond in a timely manner this time around and we do not get another X1800XT debacle (too little, too late). This competition only benefits us in the end.

And I would make a move to dual core before any video upgrade. I have a single core Opteron 148 that does 3.0 GHz that just does not cut it any longer (and that is with no divider - the DDR runs native 1:1, 1T at 273 FSB). Dual core has definitely arrived.
 
What's the point of dual core? Only Quake 4, Crysis and maybe a few others support multithreading. The video card is the most important part of a computer - just pick up a cheap Athlon 64 X2, no need for the Conroe BS; even if it is faster, it's not noticeable at high resolutions because games are mostly GPU-limited. I can tell you now my 3200+ and X1800XT will handle Crysis pretty damn well - maybe a few physics effects will have to be disabled, oh well.

Are we forgetting? Radeon 9500 > GF3? Or 9800 XT > 5950 Ultra? Or X800XTPE/X850XTPE > 6800 Ultra? Or X1800XT > 7800 GTX 512MB? Or X1900XTX > 7900 GTX? Lol, time and time again ATI has proven themselves to be better engineers than NVIDIA - I mean, how could NVIDIA make better video cards? ATI's engineers clearly have more experience: ATI was started in 1985, NVIDIA not until 1993. I trust ATI's stuff a lot more than NVIDIA's any day. Kind of funny I had four 7800 GTXs die and no ATI cards die - even my 9800 Pro still works.
 
Quad SLI is pointless right now because NVIDIA's driver team obviously can't get four G71 GPUs to scale properly together in sync. Your CPU isn't holding you back at high resolutions, it's the glitchy drivers. Just upgrade to a Socket 939 Athlon 64 X2 - Conroe motherboards are overpriced at $200+, plus the performance gain over an Athlon 64 isn't that great, so buying a Conroe isn't worth it.
 
Well, when you come to market second by 8 months, your product had better be good. Maybe the X1800XT beat the 7800 GTX a bit, but they released the X1900 a paltry 90 days later, relegating the X1800 to second-tier status a little too quickly.

I just hope the R600 specs leaked to date are not true. If they are, and the leaked specs for the G80 are correct along with the supposed delays on the R600, NVIDIA may have a 3 month plus solo run at the top during the holiday season. Ouch for ATI.

Quad SLI is a DirectX limitation. That API model is a little too old to buffer up enough work for four cards in AFR mode. So, no driver issues there. DirectX10 changes all that, however.
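
A toy Python sketch of that buffering point, if it helps picture it - it just assumes an AFR setup where the application can only keep a few frames in flight (the commonly cited DirectX 9 figure is three pre-rendered frames); it is not NVIDIA's actual driver logic, purely an illustration:

def afr_throughput(num_gpus, max_in_flight, frame_time=1.0, frames=1000):
    """Frames per unit time when frame i goes to GPU i mod N (AFR) and at most
    max_in_flight frames may be queued ahead of completion."""
    gpu_free_at = [0.0] * num_gpus      # when each GPU finishes its current frame
    done = []                           # completion time of each finished frame
    t = 0.0                             # application submit time
    for i in range(frames):
        gpu = i % num_gpus
        if i >= max_in_flight:          # can't submit until frame i-K has completed
            t = max(t, done[i - max_in_flight])
        start = max(t, gpu_free_at[gpu])
        finish = start + frame_time
        gpu_free_at[gpu] = finish
        done.append(finish)
    return frames / done[-1]

for gpus in (1, 2, 4):
    print(gpus, "GPU(s):", round(afr_throughput(gpus, max_in_flight=3), 2), "fps")
# ~1.0, ~2.0 and ~3.0 - the fourth GPU adds nothing once only three frames may be queued.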

As for dual core, the video drivers also have a hand in optimizing for dual core. You simply make some DirectX calls, and the driver pushes the work to both cores. In all of the games I have ever run, a dual core outpaces any single core. The Core 2 Duo broadens that margin even further. I still recall the day when I was happy that my single core Winchester 3200+ reached 2.7 GHz. But that was almost two years ago, on an AGP-based nForce 3 with a 6800 Ultra. Things do march along quite quickly. Hopefully AMD delivers something better, but Intel will not lie down and give up their hard-earned performance crown quite so easily this time around.
 
The 7800 GTX 512MB came out after the X1800XT, and when the two were benchmarked against each other ATI's drivers were not yet optimized. Now, with Catalyst 6.9, the X1800XT is outperforming the 7800 GTX 512MB in just about all games except Doom 3 engine titles. Remember, the R580 (X1900XTX) was finished before the R520 core; a transistor leakage problem on the R520 was the reason for the delay. Plus, the G70 architecture is basically a spruced-up 6800 Ultra - some improvements, but essentially the same architecture, so NVIDIA's drivers were already optimized, while ATI's R520 was totally different from the R420 (X800XT).

On current performance, the X1800XT stomps the 7800 GTX 512MB, and simply murders the 256MB.
 
Proof that ATI's drivers weren't optimized for the R520 X1800XT: when the X1800XT and the 7800 GTX 512MB were first benchmarked against each other, it made the 7800 GTX 512MB seem more powerful, but it's not.

When I first bought my X1800XT in October 2005 it was only scoring around 40 fps at 1600x1200, 4x AA, 16x AF in Doom 3 Ultra Quality; now it's scoring 72 fps at the same settings with the 6.9 drivers versus the launch 5.10 drivers. ATI works wonders with their drivers, seriously.

Here's a recent 7800 GTX 512MB vs X1800XT comparison - look, the X1800XT is beating it badly lol:

http://www.legitreviews.com/article/310/8/

That's a 625/1500 card beating a 7800 GTX 512MB running 550/1700; my X1800XTPE is at 700/1600, so my card beats it by at least another 5-10 fps.
 
I ran my 4000+ at 2.7GHz to bring it into line with Mustang before he received his card and got 11,180 (I have gained about 700 points going from 2.7 to 3.05GHz). You get to a point where there is no bottleneck at all and CPU speed does not help in the slightest - above 3.05GHz I get no additional points, so then it's purely down to system/memory tweaks to get the best RAM speed/latency for maximum system performance.
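
That plateau is easy to picture with a toy model: a frame costs roughly the larger of the CPU time and the GPU time, so once the CPU portion drops below the GPU portion, extra MHz stops showing up in the score. A quick Python sketch with made-up numbers:

def fps(cpu_ghz, cpu_work=20.0, gpu_ms=6.5):
    """cpu_work: pretend ms of CPU work per frame at 1 GHz; gpu_ms: fixed GPU cost.
    Frame time is roughly max(CPU side, GPU side)."""
    cpu_ms = cpu_work / cpu_ghz
    return 1000.0 / max(cpu_ms, gpu_ms)

for ghz in (2.4, 2.7, 3.05, 3.4):
    print(f"{ghz:.2f} GHz -> {fps(ghz):.0f} fps")
# FPS climbs up to around 3.05 GHz, then flatlines once the GPU becomes the limit.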
 
Comparing ATI vs NVIDIA in 3DMark05/06 is pointless - it doesn't prove which video card is more powerful. Example? My 7800 GTX 256MB outscores my X1800XTPE in 3DMark06, yet my X1800XTPE beats it badly in Doom 3 engine games and every other game I own.

Real benchmarks are things like the Doom 3 timedemo (demo1) at 1600x1200, 4x AA, 16x AF, Ultra Quality, or the F.E.A.R. stress test at 1600x1200, 4x AA, 16x AF, max settings.

3DMark is only good for testing a card's core/memory overclocks, not for comparing brands/cards.
 
Didn't realise that I even mentioned ATI or NVIDIA in that quote? 3DMark05 is a benchmark and that is all it is; it does not deal with quality, just raw speed, hence it has AA/AF disabled by default, so you are right in that respect. Where you are wrong, though: I noticed in an earlier thread you said that in Oblivion/F.E.A.R. etc. your card "owns" (I think that's the word you used) the 7900GT/GTX etc. Would you like me to offer you a "real world" review link that specifically pitches the 1800/1900XTX and 7900GTX against each other in Oblivion? Guess what... the 7900GTX pipped it (though not at all resolutions), just ahead of the 1900XTX, and the 1800 was some distance behind! Now, I have an 1800 clocked, I think, faster than yours, albeit with only 256MB of RAM, and I also own the 7900GTO, and visuals aside (which is a separate issue that you have not referred to in any case) the 7900GTO averages 13-18 FPS more at 1280x1024 and, according to that review, more as the resolution gets higher.

Now don't get me wrong, I love my 1800 (and it's clear you are a fanboy!), that's why I have kept it, and I don't doubt it's more than a match for your old 7800GTX, but I never thought the 7800s were that good anyway - in many respects they were a kick-ass uprated version of the 6800, whereas the 79xx series did move the game on somewhat, albeit with problems of their own, I appreciate. But the 1800 cannot match the 7900GTX across the board in most titles; in fact the newer-gen 1900 cards don't in quite a few either. This was never about RED vs GREEN, this was about the virtues of the 7900GTO, which is a new card. There are plenty of old threads about the 1800XT, 1900GT and 1800GTO/2 where you can compare your FPS in Oblivion.
 