Friday, January 19th 2007
GeForce 8600 only with 128-bit memory interface
There have been reports about NVIDIA's upcoming mainstream DirectX 10 parts. One widely repeated claim was that the 8600 would get a 256-bit memory interface. It now seems that feature will be reserved for the high-end cards: the 8600 series will feature a 128-bit memory interface - no more.
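For perspective on why the bus width matters: peak memory bandwidth is simply the bus width in bytes multiplied by the effective memory clock. A rough sketch of that arithmetic follows; note the 1,400 MHz effective GDDR3 clock is an assumed placeholder, since final 8600 memory speeds have not been confirmed.

# Rough memory-bandwidth arithmetic: (bus width / 8) bytes per transfer
# times the effective memory clock. The 1400 MHz clock is an assumption,
# not a confirmed 8600 spec.
def peak_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gbs(128, 1400))   # ~22.4 GB/s on a 128-bit bus
print(peak_bandwidth_gbs(256, 1400))   # ~44.8 GB/s if it had been 256-bit

At the same memory clock, halving the bus width halves the available bandwidth, which is why the rumored downgrade drew so much attention.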
Source: VR-Zone
38 Comments on GeForce 8600 only with 128-bit memory interface
An 8600 with a 128-bit bus would be seriously crippled, as it would not outperform anything from the last generation.
Plus the 7600GT performed better than that card.
The X1600XT was a disappointment in my eyes.
I bought a Sapphire X1650XT, which on Newegg currently goes for $150, and it overclocked like complete ass. There was literally no memory headroom for overclocking; what it came clocked at was as high as it would go without artifacting like crazy. The core overclocked a tiny 25 MHz, though granted the card was slightly pre-overclocked, so in the end the speeds were 625/1400. What did it get in 3DMark06? 3700.
I sold that card and picked up an XFX 7600GT. It was also overclocked from the factory, coming at 590/1600, but I was able to get it up to 690/1780 completely artifact-free after 2 hours of ATITool scanning. What did it get in 3DMark06? 4100. And the kicker: the 7600GT was only $115 after rebate ($145 before). Not only does the 7600GT outperform the X1650XT, it out-overclocked it too, AND it was cheaper.
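To put that value comparison in plain numbers, here is a back-of-the-envelope points-per-dollar calculation using only the scores and prices quoted above (the metric itself is just my own framing, not anything official):

# Points-per-dollar from the numbers quoted above:
# (3DMark06 score, price paid in USD) for each overclocked card.
cards = {
    "X1650XT @ 625/1400": (3700, 150),
    "7600GT  @ 690/1780": (4100, 115),   # price after rebate
}
for name, (score, price) in cards.items():
    print(f"{name}: {score / price:.1f} 3DMark06 points per dollar")

That works out to roughly 24.7 points per dollar for the X1650XT versus roughly 35.7 for the 7600GT.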
I am sorry, but anyone who thinks anything from the X1600 series is a better value than a 7600GT has got to be a fanboy.
Also, since when do mid-range cards have to outperform the last generation to be good? I could see the complaint if we were talking about high-end cards not outperforming the last generation, but mid-range cards rarely outperform the kings of the previous generation.
However, this must cost them in manufacturing, as these models aren't always, say, chips with faulty pipelines or ones that won't run at the default speed - as proven by the MANY people who can unlock their cards beyond the original high-end card's spec.
Considering this, maybe this time around they might actually release a card that can compete with the equivalent NVIDIA part for a change. It would make sense, as surely those cards they keep making (the GTs/GTOs) cost the same to produce as the full versions (and more than the X600/X700/X1600/X1650s).
Edit: yes, I know the 9500 was basically a lesser 9700 Pro, but it wasn't released because they couldn't compete with NVIDIA.
The two cards have completely different cores. (And the X1600XT = X1650 Pro.)
And pt, you know there isn't much difference between the X1800GTO and the 7600GT, right?
They are both 12-pipe, 12-pixel-shader cards. (So I bet they won't differ much in performance.)
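If you want a quick sanity check on why matching pipe counts suggest similar performance, theoretical pixel fillrate is just pipelines times core clock. A sketch below - the stock clocks (500 MHz X1800GTO, 560 MHz 7600GT) are quoted from memory, so treat them as assumptions:

# Theoretical pixel fillrate = pixel pipelines * core clock (Mpix/s).
# Stock clocks below are from memory, not verified specs.
def pixel_fillrate_mpix(pipes, core_mhz):
    return pipes * core_mhz

print(pixel_fillrate_mpix(12, 500))   # X1800GTO: ~6000 Mpix/s
print(pixel_fillrate_mpix(12, 560))   # 7600GT:   ~6720 Mpix/s

Both land in the same 6-7 Gpix/s ballpark, which is why you'd expect them to trade blows rather than blow each other away.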
www.techspot.com/review/31-radeon-x1650xt-vs-geforce-7600gt/page2.html
lol
I'm totally jk; that's what the forums are there for: discussion and speculation!
Big surprise though: I said the X1650XT gets 3700 in 3DMark06, and the site you provided has the stock one getting 3500 using the same settings, so I guess my overclock netted me 200 points. Too bad my 7600GT overclock netted me about 600. But yeah, I am sure I just got a bad card... :nutkick:
And now that the 7600GT is available for a very low $90, its value can't really be matched.
So how about you get your facts straight before you start trying to argue a point.