Well, in my opinion, with a dual core this will get 6500 or above, which matches my score with a dual core and my X1900XTX. So their mid-range card is performing the same as or faster than ATI's top-end card, for a cheaper price.
But that's not the real 8600GTS score though. You say it's between the 7900 GT and the 7900 GTX, but I bet the results you're looking at for those cards were done with dual cores.
I wouldn't say the card is that fast; assuming its '06 score is with it overclocked, it's just about topped out. You can't rely on a CPU to give additional horsepower for a final score, as very few games use the CPU for much these days; the GPU handles 99% of it. So it wouldn't surprise me if a weaker CPU was deliberately picked for the test system.
Ketxxx said: I wouldn't say the card is that fast; assuming its '06 score is with it overclocked, it's just about topped out. You can't rely on a CPU to give additional horsepower for a final score, as very few games use the CPU for much these days; the GPU handles 99% of it. So it wouldn't surprise me if a weaker CPU was deliberately picked for the test system.
This card isn't for the games of today; it's for DX10 and next-gen games such as Crysis and Alan Wake, which will utilize multiple cores to their fullest extent. Even now, try to play Supreme Commander on a single core and watch the slide show.
1. Last I checked the 8600GTS was a single core card :p
2. If the 8600GTS barely has enough grunt to run current games smoothly, explain to us all again how it's going to have enough horsepower to run DX10 :D Admittedly, DX10 supposedly brings amazing performance improvements, but what do you think programmers are going to do with all the extra power? Most likely whittle it away by dumping additional poorly optimized tasks onto the GPU, or use it to put yet more detail into environments, slowing the card down to a crawl, meaning you'd need a better card anyway. A 128-bit memory interface is just totally outdated and should only be used on low-end cards like the 8300GT.
Ketxxx said: I wouldn't say the card is that fast; assuming its '06 score is with it overclocked, it's just about topped out. You can't rely on a CPU to give additional horsepower for a final score, as very few games use the CPU for much these days; the GPU handles 99% of it. So it wouldn't surprise me if a weaker CPU was deliberately picked for the test system.
But the fact of the matter is that the CPU has a large impact on 3DMark scores, which is what this test is based on.
Wile E said: But the fact of the matter is that the CPU has a large impact on 3DMark scores, which is what this test is based on.
'05 isn't so hugely affected; '06 is more CPU-biased, though. The point I was making is that in real games the CPU has very little to do most of the time, as the GPU handles it.
Ketxxx said: '05 isn't so hugely affected; '06 is more CPU-biased, though. The point I was making is that in real games the CPU has very little to do most of the time, as the GPU handles it.
That's what I was saying earlier about dual cores; not dual GPUs, but CPUs. Next-gen games need a LOT of CPU power, and 3DMark06 reflects that. This guy tested it on a (presumably) slow, single-core Athlon 64. Bench the 8600 with an overclocked C2D, or even an Athlon X2, and you'd get much better results.
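For what it's worth, 3DMark06's overall score explicitly folds the CPU test into a weighted harmonic mean, so a slow single-core chip drags the total down even when the graphics sub-scores are identical. A rough sketch below; the weighting constants are quoted from memory of Futuremark's published formula (treat them as approximate), and the sub-scores are made-up examples:

```python
# 3DMark06 overall score, per Futuremark's published weighting
# (constants quoted from memory -- treat as approximate):
#   GS      = 0.5 * (SM2.0 score + HDR/SM3.0 score)
#   Overall = 2.5 / (1.7 / GS + 0.3 / CPU)
def overall_3dmark06(sm2, sm3, cpu):
    gs = 0.5 * (sm2 + sm3)
    return 2.5 / (1.7 / gs + 0.3 / cpu)

# Identical hypothetical graphics scores, weak vs. strong CPU score:
gpu_scores = (2500, 2600)
print(round(overall_3dmark06(*gpu_scores, cpu=800)))   # -> 2400 (slow single-core)
print(round(overall_3dmark06(*gpu_scores, cpu=2200)))  # -> 3113 (fast dual-core)
```

Roughly a 30% swing in the overall number from the CPU score alone, with the graphics card doing exactly the same work.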
Ketxxx said: Without that 256-bit memory bus the card is already dated.
No it isn't; 128-bit is still perfectly fine for mid-range cards.
The 128-bit cards from the next generation outperform the 256-bit cards from the last generation most of the time.
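The bus-width argument is really a bandwidth argument, and a faster memory clock partly offsets a narrower bus. A quick back-of-the-envelope sketch; the 2000 MHz effective figure comes from the "2GHz DDR" mentioned in this thread, while the 256-bit comparison clock is an assumed example, not a quoted spec:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a given bus width and effective (DDR) clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# 8600 GTS: 128-bit bus with 2000 MHz effective GDDR3 (the "2GHz DDR" above)
print(bandwidth_gbs(128, 2000))  # -> 32.0 GB/s

# A hypothetical last-gen 256-bit card at a slower 1400 MHz effective clock
print(bandwidth_gbs(256, 1400))  # -> 44.8 GB/s
```

So a 128-bit card with fast memory closes much of the raw-bandwidth gap, and architectural improvements can make up the rest in practice.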
Ketxxx said: I wouldn't say the card is that fast; assuming its '06 score is with it overclocked, it's just about topped out. You can't rely on a CPU to give additional horsepower for a final score, as very few games use the CPU for much these days; the GPU handles 99% of it. So it wouldn't surprise me if a weaker CPU was deliberately picked for the test system.
Nothing suggests the score was taken with the card overclocked, and what you don't seem to understand is that they used a weak-ass processor with it, which will greatly affect the final score.
Ketxxx said: If the 8600GTS barely has enough grunt to run current games smoothly, explain to us all again how it's going to have enough horsepower to run DX10 :D Admittedly, DX10 supposedly brings amazing performance improvements, but what do you think programmers are going to do with all the extra power? Most likely whittle it away by dumping additional poorly optimized tasks onto the GPU, or use it to put yet more detail into environments, slowing the card down to a crawl, meaning you'd need a better card anyway. A 128-bit memory interface is just totally outdated and should only be used on low-end cards like the 8300GT.
Who says it barely has enough grunt to run current games?
Even with the terrible processor holding it back, it still managed to outscore all but the highest-end GPUs on the market today.
A stock 7900GT handles any game on the market today at decent settings, and this thing outscores it even with the terrible CPU.
I've seen X1950 Pros that could barely break 5000 in '06 even when paired with strong CPUs.
You can go over to the ORB and look at all the X1950 Pro scores with single-core Athlon 64 chips; actually, only one managed to break 5000.
27 Comments on GeForce 8600 GTS pictured and benchmarked
Very nice though, 2GHz DDR!
I think this would get into the high 6xxxs with a good CPU, putting it with the X1900XTX, which also scores in the 6xxxs.
Edit: I was replying to this:
Now, the 8600 GTS scores between the 7900 GT and the 7900 GTX.
On a side note: my ATITool doesn't look anything like that. I'm gonna double-check to see if I got 0.27. Anyone know what I'm doing wrong?
Ketxxx said: 2. If the 8600GTS barely has enough grunt to run current games smoothly, explain to us all again how it's going to have enough horsepower to run DX10 :D Admittedly, DX10 supposedly brings amazing performance improvements, but what do you think programmers are going to do with all the extra power? Most likely whittle it away by dumping additional poorly optimized tasks onto the GPU, or use it to put yet more detail into environments, slowing the card down to a crawl, meaning you'd need a better card anyway. A 128-bit memory interface is just totally outdated and should only be used on low-end cards like the 8300GT.
I took a second look at the screenshots to make sure, and then WTF!?!? It's '06 :banghead: :banghead: :banghead: