# Why are there no Sandybridge reviews using top end C2Q CPUs?



## EastCoasthandle (Apr 12, 2011)

I've looked at a few reviews and found none that include the Q9650, and maybe one or two using a Q9550.  Perhaps I missed it?  Does anyone have a 2500- or 2600-series SB review that compares them against the Q9650 and similar CPUs?

Edit:
Below you will find a few benchmark results showing the differences between the Q9650 at 3.70GHz and the i7 2600K at stock. I'm using a 5870 at 900/1200. I didn't bother overclocking the 2600K. The same drivers were used for both.






*(Benchmark screenshots were attached here; the images are not reproduced. Each set showed the Q9650 OC'd result followed by the i7 2600K stock result; two of the later sets also included the 2600K with HT off, and the final set was SuperPI Mod.)*

(I didn't complete all of the tests in SuperPI Mod.)


----------



## KieX (Apr 12, 2011)

Q6600 and Phenom II X4 close enough?

http://www.anandtech.com/show/4083/...core-i7-2600k-i5-2500k-core-i3-2100-tested/16


----------



## EarthDog (Apr 12, 2011)

Are you just trying to put a number on how much better it is clock for clock or something? That's like two FULL generations ago...


----------



## Molignar (Apr 12, 2011)

Is this closer to what you're looking for? http://www.overclock.net/intel-cpus/963986-pics-bc2-very-large-gains-i5.html This guy did his own comparison between his Q9650 setup and his new i5 2500K setup.


----------



## jimmyz (Apr 12, 2011)

I own a QX9650 as well as a few Nehalems, a Gulftown, and a 2600K. If you're asking whether it is faster clock for clock: yes, by far!! It also has better IPC than even the Gulftowns, and it blows the Nehalems out of the water as well. The only place Sandy Bridge is lacking is that once it hits its wall, that's it. Even LN2 didn't get mine much farther than the 5.35GHz it can do on air.


----------



## EastCoasthandle (Apr 13, 2011)

The Q6600 won't work, as even the Q8000 series beats it, but thanks.  There has been a lot of talk, but it's odd that there are no reviews showing the comparison.  I was looking for either a Q9650 or a Q9550, but the latter runs at a lower clock rate.

I honestly have no idea what the performance difference between the two is.  As noted, I want to see how the 2500/2600 series compares clock for clock against the Q9650, for example both at 3.40GHz or at 4.0GHz.  But there is no such information as of yet.  As for the linked user thread, it's still ongoing, with the OP making a new set of videos.  But it's a start, thanks.
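For anyone running this comparison themselves, one rough way to put a number on "clock for clock" is to divide each benchmark score by the clock speed it was obtained at. A minimal Python sketch, with placeholder scores rather than measured results:

```python
# Rough clock-for-clock comparison: normalize a benchmark score by the clock
# speed it was obtained at. The scores below are PLACEHOLDERS, not measured
# data; substitute your own benchmark results.
def score_per_ghz(score: float, ghz: float) -> float:
    """Score per GHz: higher means more work done per clock cycle."""
    return score / ghz

q9650_per_ghz = score_per_ghz(10000, 3.7)   # hypothetical Q9650 run at 3.70GHz
i2600k_per_ghz = score_per_ghz(13600, 3.4)  # hypothetical 2600K run at stock 3.40GHz

# Relative clock-for-clock advantage of the 2600K over the Q9650:
advantage = i2600k_per_ghz / q9650_per_ghz - 1
print(f"{advantage:.0%}")  # prints "48%" with these placeholder numbers
```

The same ratio works for FPS numbers from the game benchmarks, as long as both chips ran the identical test at identical settings.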


----------



## cadaveca (Apr 13, 2011)

Find me the CPU and I'll gladly do it for ya. Unfortunately, the only 775 chip I have left is a Q6600.


----------



## Delta6326 (Apr 13, 2011)

Q6600 FTW!! I still love mine.

I do agree; I wish reviews included some older CPUs to compare against.


----------



## dank1983man420 (Apr 13, 2011)

If you need me to do anything, I have a 2500K and a Q9650 I can run some tests on.


----------



## EastCoasthandle (Apr 13, 2011)

dank1983man420 said:


> If you need me to do anything, I have a 2500K and a Q9650 I can run some tests on.



Please do if you don't mind.  Below are some suggestions:
Applications:
WinRAR has its own built-in benchmark.
3DMark11
SuperPI Mod


Games:
AVP benchmark
Crysis Benchmark Tool (for Crysis) and/or the FBWH Tool (for Crysis Warhead)
FFXIV Benchmark
Lost Planet 2 Benchmark


----------



## LifeOnMars (Apr 13, 2011)

This should be interesting. I know personally how much of a step up these chips are, having previously had a 3.8GHz Q6600 G0 and a 4GHz 1090T.


----------



## slyfox2151 (Apr 13, 2011)

Just to point out, WinRAR is very RAM dependent last time I checked; latency and bandwidth will have a very large impact on the score.

IntelBurnTest bandwidth should be a fair comparison... or run the TrueCrypt benchmark, or Cinebench.


----------



## dank1983man420 (Apr 13, 2011)

EastCoasthandle said:


> Please do if you don't mind.  Below are some suggestions:
> Applications:
> WinRAR has its own built-in benchmark.
> 3DMark11
> ...





LifeOnMars said:


> This should be interesting. I know personally how much of a step up these chips are having had a 3.8GHz Q6600 GO and a 4GHz 1090T previously.





slyfox2151 said:


> Just to point out, WinRAR is very RAM dependent last time I checked; latency and bandwidth will have a very large impact on the score.
> 
> 
> 
> ...





Sounds good. I would have to do these tomorrow morning on my day off, though, when I have the time. I just downloaded the programs I didn't already have, so that is taken care of. Would you like me to run both chips at stock, both at the same overclocked speed, or both at stock and at matched overclocks?


----------



## TRIPTEX_CAN (Apr 13, 2011)

I'm on the verge of pulling the trigger on an SB upgrade, so there is a good chance I will have something to add to this thread down the road.

I think matching the clocks is the best way to do it. I can say for sure this will be a complete shitkicking in all synthetics, where the Q9650 will show its age.

You should also add BC2 to this comparison, even though it's proven that C2Qs just can't keep up with i5/i7, let alone SB i5/i7.


----------



## EastCoasthandle (Apr 13, 2011)

dank1983man420 said:


> Sounds good. I would have to do these tomorrow morning on my day off, though, when I have the time. I just downloaded the programs I didn't already have, so that is taken care of. Would you like me to run both chips at stock, both at the same overclocked speed, or both at stock and at matched overclocks?



Thanks, I would prefer both at the same OC settings.


----------



## trickson (Apr 13, 2011)

TRIPTEX_CAN said:


> the Q9650 will show its age.



It may be aged, but it is still a great CPU. I love mine and have seen no real reason to upgrade it as of yet. Just a super solid and fast CPU that can take anything I throw at it and then some.


----------



## francis511 (Apr 13, 2011)

http://www.tomshardware.co.uk/chart...compare,2430.html?prod[4785]=on&prod[4413]=on


----------



## dank1983man420 (Apr 13, 2011)

francis511 said:


> http://www.tomshardware.co.uk/chart...compare,2430.html?prod[4785]=on&prod[4413]=on



Damn, that review would have been perfect if they had used an i5 2500K instead of the 2600K. It would have been comparing two 4-core/4-thread processors, instead of the 2600K having the 8-thread advantage.


----------



## silkstone (Apr 13, 2011)

slyfox2151 said:


> Just to point out, WinRAR is very RAM dependent last time I checked; latency and bandwidth will have a very large impact on the score.
> 
> 
> 
> ...



I thought it would be more FSB dependent? Which I guess would also be tied to the RAM, but it uses the cache more than anything else, right?


----------



## newtekie1 (Apr 13, 2011)

2600K vs. Q9650

2500K vs. Q9650

There you go.  Sandy Bridge definitely wins, but unless you are doing video/audio/photo/3D editing, I see no reason to upgrade if you already have a high-end 775 quad.


----------



## AltecV1 (Apr 13, 2011)

DAMN, that is a lot faster than I thought it would be! But now I'm sad, because I just realised how old my CPU is.


----------



## LifeOnMars (Apr 13, 2011)

newtekie1 said:


> 2600K vs. Q9650
> 
> 2500K vs. Q9650
> 
> There you go.  Sandy Bridge definitely wins, but unless you are doing video/audio/photo/3D editing, I see no reason to upgrade if you already have a high-end 775 quad.



Crossfire/SLI scaling is a lot better at the very top end, plus that only shows a handful of games. I've seen improvements in a lot more, and it has been an altogether smoother gaming experience.


----------



## TRIPTEX_CAN (Apr 13, 2011)

trickson said:


> It may be aged, but it is still a great CPU. I love mine and have seen no real reason to upgrade it as of yet. Just a super solid and fast CPU that can take anything I throw at it and then some.



I agree the Q9xx0 quads are/were great chips for their time, but IMO that time has passed when running current high-end multi-GPU configs and intensive encoding tasks.


----------



## EastCoasthandle (Apr 13, 2011)

dank1983man420,
One more thing: please make sure all IQ settings are maxed out. As for resolution, use the highest possible.


----------



## EastCoasthandle (Apr 18, 2011)

OP updated with some results.


----------



## EarthDog (Apr 18, 2011)

Were the Vantage runs done with PhysX disabled, I hope? (Looks like it; just confirming.)


----------



## EastCoasthandle (Apr 19, 2011)

The results are without PhysX.


----------



## 95Viper (Apr 19, 2011)

Thanks, EastCoasthandle... Nice thread...

Hope it produces good info, without any BS.

<subscribed>


----------



## newtekie1 (Apr 19, 2011)

LifeOnMars said:


> Crossfire/SLI scaling is a lot better at the very top end, plus that only shows a handful of games. I've seen improvements in a lot more, and it has been an altogether smoother gaming experience.



Correct, but those who can afford a proper SLI/Crossfire setup that would show a noticeable difference between the two can probably also afford an SB setup to support it.

But if you are still on a C2Q, you're also likely looking at a single-GPU setup, in which case the GPU is going to be the limiting factor, and any game that isn't GPU limited will be getting well over 60FPS anyway, so the CPU limiting the game won't matter.


----------



## EastCoasthandle (Apr 19, 2011)

95Viper said:


> Thanks, EastCoasthandle... Nice thread...
> 
> Hope it produces good info, without any BS.
> 
> <subscribed>



NP. I was looking forward to adding dank1983man420's results, but I have no idea what happened.


----------



## LifeOnMars (Apr 19, 2011)

newtekie1 said:


> Correct, but those who can afford a proper SLI/Crossfire setup that would show a noticeable difference between the two can probably also afford an SB setup to support it.
> 
> But if you are still on a C2Q, you're also likely looking at a single-GPU setup, in which case the GPU is going to be the limiting factor, and any game that isn't GPU limited will be getting well over 60FPS anyway, so the CPU limiting the game won't matter.



Put like that, I see your logic. Guess I'm just impressed at the benefits I have seen with a single-card setup; great chips. But yes, a nice C2Q with a single card will still give more than adequate framerates.


----------



## LAN_deRf_HA (Apr 19, 2011)

My friend went from a Q9xx at 3.4GHz to a 2500K at 4.5GHz and doubled his WoW frame rate in busy areas. So it's more than just clock for clock; you also have to account for the significant overclocking advantage. That's what really widens the gap.


----------



## EastCoasthandle (Apr 19, 2011)

LAN_deRf_HA said:


> My friend went from a Q9xx at 3.4 ghz to a 2500k at 4.5ghz and doubled his wow frame rate in busy areas. So more than just clock for clock, you have to also account for the significant overclock advantage as well. That's what really widens the gap.



SB is better able to handle CPU-side work (rendering, AI, particles, physics, culling, and anything else a particular game may use the CPU for) while still feeding the GPU, and it does a better job of it than the C2Q, even though the C2Q CPUs are capable of handling the same tasks.


----------



## cadaveca (Apr 19, 2011)

The question I still have is how much of that performance boost is due to the faster memory subsystem?


----------



## EastCoasthandle (Apr 19, 2011)

I've added some results from WinRAR.


----------



## TRIPTEX_CAN (Apr 19, 2011)

So... is HT enabled on the SB in your tests? Sorry if you mentioned it and I just can't read.

If HT was on during the WinRAR test, I wonder what the difference would be with it off.


----------



## EastCoasthandle (Apr 20, 2011)

HT was on during those benchmarks/tests. I suppose it would have an effect if it were disabled. In any case, I have one more test to add, and that's the SuperPI Mod results. Enjoy.


----------



## TRIPTEX_CAN (Apr 20, 2011)

I did my SB rebuild last night, and the difference between my Q9550 and the 2500K is huge. Crysis and BC2 now run much smoother and at higher FPS. The Q9x50 chips were good for their time, and unless you're running a high-end GPU it might not be worth upgrading. However, if you're running decent graphics, and especially Crossfire/SLI, the difference is HUGE.

I didn't get much benching done before the swap, but at stock (3.3GHz) the SB system scored 2000 points more in 3DMark06 than my Q9550 at 4GHz.


----------



## EastCoasthandle (Apr 20, 2011)

Some games are showing a noticeable difference (e.g. BC2).


----------



## newtekie1 (Apr 20, 2011)

EastCoasthandle said:


> Some games are showing a noticeable difference (IE: BC2).



BC2 is extremely CPU limited, so a large difference isn't a surprise, but I wonder how noticeable it really is.


----------



## TRIPTEX_CAN (Apr 20, 2011)

newtekie1 said:


> BC2 is extremely CPU limited, so a large difference isn't a surprise, but I wonder how noticeable it really is.



HUGE! Using 11.4 with my 5970 I would average around 45fps on smaller-scale 32-player maps, and with older drivers (10.5) I would average around 55fps with most settings on medium. Since upgrading, I can use 11.4 at the highest possible settings and the game is unbelievably smooth. I now average 80fps (ish), and I haven't even reinstalled Windows yet.

The difference is large and measurable.

Crysis, on the other hand, didn't have a massive increase in measurable performance, but the perceived difference and fluid gameplay are amazing.

I had a few BC2 FRAPS logs saved that I can compare against the 2500K. I'll try to get them posted later today.


----------



## EastCoasthandle (Apr 20, 2011)

With BC2, the OC'd Q9650 can handle it with no problems until the scene gets hectic with smoke, explosions, teammates shooting about, artillery raining down, etc. As long as you keep those to a minimum, the OC'd Q9650 didn't have much of a problem. But once two or more of them show up in any combination, that's when I noticed frame-rate dips (resulting in some stuttering), while the 2600K didn't have a problem.

I also updated the OP showing an example from Starcraft 2.


----------



## BababooeyHTJ (Apr 21, 2011)

If you happen to try out Serious Sam HD, that would be nice; I saw a nice improvement with a moderate overclock on my Lynnfield. Oblivion (especially modded) or even Fallout: New Vegas would be another good one. DarkPlaces (a Quake source port) is another where I saw some serious gains while overclocking, especially with the right mods like pretty water.



newtekie1 said:


> 2600K vs. Q9650
> 
> 2500K vs. Q9650
> 
> There you go.  Sandy Bridge definitely wins, but unless you are doing video/audio/photo/3D editing, I see no reason to upgrade if you already have a high-end 775 quad.



AnandTech's CPU gaming comparisons are awful. Who is running a $300 CPU at 1680x1050 or lower resolutions and low quality settings?


----------



## TRIPTEX_CAN (Apr 21, 2011)

BababooeyHTJ said:


> If you happen to try out Serious Sam HD that would be nice. I saw a nice improvement with a moderate overclock on my Lynnfield. Oblivion (especially modded) or even Fallout New Vegas would be another good one. Darkplaces (Quake source port) is another good one where I saw some serious gains while overclocking especially with the right mods like pretty water.
> 
> 
> 
> Anandtech's cpu gaming comparisons are awful. Who is running a $300 cpu at 1680x1050 or lower resolutions and low quality settings?



At 1680x1050 any CPU limitation will be more obvious. That's why you see some websites using even lower resolutions like 1024x768.


----------



## BababooeyHTJ (Apr 21, 2011)

TRIPTEX_CAN said:


> At 1680x1050 any CPU limitation will be more obvious. That's why you see some websites using even lower resolutions like 1024x768.



I don't think that is necessarily true. In some games, yes, but not all, not by a long shot. The games I mentioned in that quote are an example of that. GTA4 and FSX are two well-known examples of games where the higher the resolution and IQ settings you use, the more CPU limited you can become.

I don't even look at Anand's CPU reviews because of that. It makes the comparison about as useful as any other synthetic benchmark.


----------



## TRIPTEX_CAN (Apr 21, 2011)

Before-and-after FRAPS logs from random BC2 rounds, before and after upgrading to the 2500K. The most recent scores are at even higher settings.

My max became my average. That speaks for itself.

*Before*
2011-04-11 21:04:00 - BFBC2Game
Frames: 3788 - Time: 60000ms - Avg: 63.133 - Min: 49 - Max: 76

2011-04-13 22:14:36 - BFBC2Game
Frames: 4144 - Time: 60000ms - Avg: 69.067 - Min: 45 - Max: 106

2011-04-15 20:38:15 - BFBC2Game
Frames: 3341 - Time: 60000ms - Avg: 55.683 - Min: 18 - Max: 173
*After*

2011-04-21 18:22:17 - BFBC2Game
Frames: 6234 - Time: 60000ms - Avg: 103.900 - Min: 61 - Max: 144

2011-04-21 18:24:22 - BFBC2Game
Frames: 6472 - Time: 60000ms - Avg: 107.867 - Min: 65 - Max: 172

2011-04-21 18:28:50 - BFBC2Game
Frames: 7297 - Time: 60000ms - Avg: 121.617 - Min: 70 - Max: 169
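As a sanity check on the logs above: FRAPS' "Avg" column is just frames divided by elapsed seconds, so the before/after uplift can be recomputed straight from the raw lines. A small illustrative Python sketch (the regex and function name are my own, not from any FRAPS tooling):

```python
import re

def avg_fps(fraps_line: str) -> float:
    """Recompute average FPS from a FRAPS summary line: frames / seconds."""
    m = re.search(r"Frames:\s*(\d+)\s*-\s*Time:\s*(\d+)ms", fraps_line)
    frames, time_ms = int(m.group(1)), int(m.group(2))
    return frames / (time_ms / 1000.0)

before = avg_fps("Frames: 3788 - Time: 60000ms - Avg: 63.133 - Min: 49 - Max: 76")
after = avg_fps("Frames: 6234 - Time: 60000ms - Avg: 103.900 - Min: 61 - Max: 144")

print(round(before, 3))          # 63.133, matching FRAPS' own Avg column
print(round(after / before, 2))  # 1.65: about a 65% higher average after the swap
```

Comparing the worst "before" run (55.7 avg) against the best "after" run (121.6 avg) is more than a 2x swing, though the settings also changed between those runs.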


----------



## EastCoasthandle (Apr 21, 2011)

I've added BC2 and SCII comparisons without HT to the OP.


----------



## Melvis (Apr 27, 2011)

So what I'm seeing here is that the 2600K isn't really any faster in games unless the game is very CPU intensive; even the older Q9650 is still holding its own in all those games and gives enough grunt.

I honestly don't see a point in upgrading if you still have a Q9650. It's not worth it at all.


----------



## newtekie1 (Apr 27, 2011)

Melvis said:


> So what im seeing here is that the 2600K isnt realy any faster in games unless it is very CPU intensive, even the older Q9650 is holding its own still in all those games and gives enough grunt.
> 
> I honestly dont see a point upgrading if you still have a Q9650, not worth it at all.



And really, once you overclock the Q9650 into the 3.6GHz range, which they pretty much all do, it isn't going to hold you back any noticeable amount in any game.

Yes, Sandy Bridge is faster and overclocks better, but in real-world usage the difference won't really be noticeable.


----------



## trickson (Apr 27, 2011)

newtekie1 said:


> And, really, once you overclock the Q9650 to the 3.6GHz range, which they all pretty much do, it isn't going to hold you back any noticeable amount in any game.
> 
> Yes, Sandybridge is faster and overclocks better, but in real world usage the difference won't really be noticeable.



One reason why I still love my Q9650! This is one solid CPU. I do not see a future build replacing what I have now any time soon. One thing I do see is the tech moving far too fast for me to keep up with. Every time I get something "new", I find that in a month it is OLD and slow. I am really sick of trying to keep up and have settled for just keeping what I have.


----------



## cadaveca (Apr 27, 2011)

newtekie1 said:


> Yes, Sandybridge is faster and overclocks better, but in real world usage the difference won't really be noticeable.



Running dual GPUs on Sandy Bridge leads to something like a 30% performance boost over an 1156 board with an i7 870. Just ask Triptex how much faster his games are now.


----------



## newtekie1 (Apr 27, 2011)

cadaveca said:


> Running dual GPUs on Sandybridge leads to like a 30% performance boost over 1156 with i7 870. Just ask triptex how much faster his games are now.



See my post above about dual GPUs.


----------



## cadaveca (Apr 27, 2011)

newtekie1 said:


> See my post above about dual GPUs.





Before the launch I was set on not buying into SB at all, but now that I have it, I really think it's important to highlight how, for specific scenarios, SB is THE way to go, even with cost considered.

I do not see SB getting a lot of hype, but it should. Intel did a good job, and I'm definitely impressed.


----------



## TRIPTEX_CAN (Apr 27, 2011)

I was set on BD until I saw the results of SB. I might have made a mistake, but the clocks and temps speak loudly. SB is really as good as people say. Only those who haven't tried it still defend the C2Q.


----------



## newtekie1 (Apr 27, 2011)

cadaveca said:


> Before the launch I was set on not buying into SB at all, but now that I have it, I really think it's important to highlight how, for specific scenarios, SB is THE way to go, even with cost considered.
> 
> I do not see SB getting a lot of hype, but it should. Intel did a good job, and I'm definitely impressed.



I absolutely agree, and if you are paying for dual GPUs then you should be paying for proper supporting hardware as well. (Says the guy running SLI GTX 460s on a Celeron...)


----------



## LAN_deRf_HA (Apr 28, 2011)

BababooeyHTJ said:


> AnandTech's CPU gaming comparisons are awful. Who is running a $300 CPU at 1680x1050 or lower resolutions and low quality settings?



That res is exactly what makes the AnandTech benches better than most. As others have said, those sorts of tests are typically done at lower resolutions, and 1680x1050 is one of the most common high resolutions in use today. Most complaints I see against their reviews come from people who can't accept just how big the gap is between certain CPUs. I know AMD comes out looking downright useless for gaming, but many seem to forget Phenom II only performed well compared to Phenom I; it still sucks compared to Yorkfield and i5/i7. I mean, even a stock Q6600 gives an X6 a challenge. Hopefully it'll be better with Bulldozer, so I can finally switch teams without feeling like I have to lie to myself to justify it.

The whole issue is of course complicated by which sections of games are being compared. My friend had a bug that left his E5200 multiplier stuck at half speed (1.6GHz instead of 3.2GHz). Just moving around in an empty Crysis map, he only lost about 5 frames with his 5850. Pretty remarkable given it's just a low-cache dual core at such crap speeds, but during AI combat he lost a good 25fps.


----------



## BababooeyHTJ (Apr 28, 2011)

No, that doesn't make any sense at all. Who is buying a $400-500 CPU+motherboard combo to game on a ~$120 monitor at low or medium settings? Those comparisons are about as useful as any other synthetic benchmark. I gloss right over those reviews.

Secondly, as I said, claiming that resolution and IQ settings have no effect on CPU load is just ignorant.

Lastly, where is a four-year-old 2.4GHz quad-core CPU giving a 3.2GHz (not including Turbo Boost) hex-core CPU a run for its money? Maybe in AnandTech's reviews. Phenom II is at worst on par with Yorkfield clock for clock, and in some cases it can outperform Nehalem.


----------



## newtekie1 (Apr 28, 2011)

IMO, both high and low resolutions are important.  Low resolutions take the bottleneck off the GPU and place it on the CPU, so they highlight the real performance difference between CPUs.  Higher resolutions place the bottleneck back on the GPU and show what real-world usage will be like.

Considering 1680x1050 is the second most common monitor resolution today, it certainly is important to include in benchmarks: it gives people with these monitors an idea of what performance they can expect, and it is actually people with these "lower" resolutions that will see the most noticeable difference between CPUs.  Oh, and just FYI, 1280x1024 is the third most common monitor resolution, so that one matters in benchmarks too, and again it will show a larger difference between CPUs than a higher resolution would.


----------



## Melvis (Apr 28, 2011)

LAN_deRf_HA said:


> I know AMD comes out looking downright useless for gaming, but many seem to forget phenom II only performed well compared to phenom 1, it still sucks compared to yorkfield and i5/i7. I mean even a stock Q6600 gives a X6 a challenge. Hopefully it'll be better with bulldozer, can finally switch teams without feeling like I have to lie to myself to justify it.



This is what I've seen before, and it has been said again and again: a Phenom I 9950 is equal to a Q6600, so if you think a Q6600 has any chance of competing with any X6 from AMD, you're sadly mistaken. I chose to go AMD for gaming, and it has been proven that an AMD CPU can hold its own just fine in gaming compared to any Intel CPU, and more so at high res. Not this low-res BS that everyone points to; who in their right mind plays games at 1024x768 on a quad-core CPU? No game I have run so far has ever maxed out my processor; it still has room to breathe.


----------



## LAN_deRf_HA (Apr 28, 2011)

Melvis said:


> This is what ive seen before and it has been told again and again. A Phenom I 9950 is = to a Q6600 so if you think a Q6600 has any chance competing to any X6 from AMD your sadly mistaken. I choose to go AMD for this reason and that's for gaming, it has been proven that an AMD CPU can hold its own just fine in gaming compared to any intel CPU, and more so at high res. Not this low res BS that everyone points to, who in there right minds play games at 1024*768??? using a quad core CPU. My CPU in any game i have run so far has never maxed out my processor once, still has room to breath.



Did you miss the rest of my post? As discussed, 1680 is not low res. http://www.anandtech.com/bench/Product/53?vs=147&i=47.48.49.50

Not bad for a chip they don't even make anymore. I'd say it competes just fine, and even more so its replacement. http://www.anandtech.com/bench/Product/89?vs=147&i=47.48.49.50.59.60.61.62

Again, Phenom II was only decent when compared to Phenom I, which was just inexcusably awful. They needed to add two more cores to actually compete with Intel's worst chips.


----------



## Melvis (Apr 28, 2011)

LAN_deRf_HA said:


> Did you miss the rest of my post? As discussed, 1680 is not low res. http://www.anandtech.com/bench/Product/53?vs=147&i=47.48.49.50
> 
> Not bad for a chip they don't even make anymore. I'd say it competes just fine. Even more so for it's replacement. http://www.anandtech.com/bench/Product/89?vs=147&i=47.48.49.50.59.60.61.62
> 
> Again, Phenom II was decent when compared to phenom I, which was just inexcusably awful. They needed to add 2 more cores to actually compete with Intel's worst chips.



If you're talking purely about gaming benchmarks, then it's a no-brainer that an X6 isn't the best choice (that's why I got a quad core); games at this stage won't use the full power of six cores. But when it comes to everything else, the 1055T will eat that Q6600 alive, and since future games will need more and more cores, the X6 will only extend its lead over time. http://www.guru3d.com/article/phenom-ii-x6-1055t-1090t-review/10

And once again, the 9950 is equal to a Q6600 at stock clocks, so they did not need two more cores to catch up at all. Even the 965 runs games better than my X6.


----------



## LAN_deRf_HA (Apr 28, 2011)

What do you mean, "if you're talking purely about gaming"? Not only is this thread about gaming, you just said you bought AMD for gaming. Now you're saying the X6 isn't the best choice for gaming. Again, I'll bring up my point about how you basically have to lie to yourself to justify buying AMD at this point in time. If you wanted better application performance, you'd have gotten an 1156 for the same price and radically better gaming performance as a bonus. I bought AMD when they were the best; they aren't now, and haven't been for years. I'd encourage anyone supporting current AMD offerings above the $70 price point to reconsider their reasoning. I mean, you just compared a Q6600 to a lower-clocked Phenom I. Phenom I had a severe clock-for-clock disadvantage well outside of gaming too. http://www.anandtech.com/bench/Product/53?vs=23

This isn't news; it was a well-accepted fact at the time, as it should still be now. Reason would not lead someone to buy a Phenom I. You'd have to allow yourself to view things simply as you wished they were to justify that purchase.


----------



## Melvis (Apr 28, 2011)

LAN_deRf_HA said:


> What do you mean "if your talking purely about gaming", not only is this thread about gaming, you just said you bought AMD for gaming. Now you're saying the X6 isn't the best choice for gaming. Again, I'll bring up my point about how you basically have to lie to yourself to justify buying AMD at this point and time. If you wanted better application performance, you'd have gotten a 1156 for the same price and gotten radically better gaming performance as a bonus. I bought AMD when they were the best, they aren't now and haven't been for years. I'd encourage anyone supporting current AMD offerings above the $70 price point to reconsider their reasoning. I mean you just compared a Q6600 to a lower speed phenom 1. Phenom 1 had a severe clock for clock disadvantage, well outside of gaming. http://www.anandtech.com/bench/Product/53?vs=23
> 
> This isn't news. That was a well accepted fact at the time, as it should still be now. Reason would not lead someone to buy a phenom 1. You'd have to allow yourself to view things simply as you wished they were to justify that purchase.



Well, if you're comparing an X6 against anything else for gaming, then that's your own doing; you're the one who brought up the X6 versus an X4 for gaming performance, and an X4 seems to be the better choice for gaming, as I explained above. Lie to myself? I did no such thing. I bought both my CPUs over eight months ago, and for the price, the performance they delivered was unbeatable at the time. And I don't get why you're bringing up newer sockets when this is all about older sockets compared to the 2600K only. You are the one who said, and I quote, "I mean even a stock Q6600 gives a X6 a challenge."

Just a reminder that the Phenom I I posted was, at the time, the fastest AMD quad at 2.6GHz; therefore the so-called "severe" clock-for-clock disadvantage was only 200MHz. http://www.guru3d.com/article/amd-phenom-x4-9950-be-processor-tested/1

I would agree that a Phenom I would not have been the greatest choice; that's why I held off until Phenom II came along, and it proved to be a very worthy contender, giving any C2Q a very good run for its money and, in most cases, better performance per dollar. I should note that I am comparing prices here in AUS, where Intel is a LOT more expensive than AMD.


----------



## BababooeyHTJ (Apr 28, 2011)

newtekie1 said:


> IMO, both high resolutions and low resolutions are important.  Low resolutions take the bottleneck off the GPU and place it on the CPU, so it highlights the real performance difference between CPUs.  However, higher resolution place the bottleneck back on the GPU, and show what real world usage will be like.
> 
> Considering 1680x1050 is the second most common resolution on monitors today, it certainly is important that it is included in benchmarks, it gives people with these monitors an idea of what performance they can expect, and actually it is people with these "lower" resolutions that will see the most noticeable difference between CPUs.  Oh, and just FYI, 1280x1024 is the 3rd most common monitor resolution, so that too is important in benchmarks as well, and again they will show a larger difference between CPUs than a higher resolution would.



I agree with you, but how many people in the market for a $300 CPU like the 2600K will be using 1680x1050? What I don't like is Anand using medium settings. I'll tell you right now that most of the settings in FO3 are CPU intensive on a modern video card.


----------



## newtekie1 (Apr 28, 2011)

BababooeyHTJ said:


> I agree with you but how many people in the market for a $300 cpu like 2600k will be using 1680x1050.



A lot. A lot of people don't think that is a bad resolution. And a lot of people will be upgrading their computer but keeping the LCD they already have, and really, why not do it this way? The monitor is still very good, and if you are someone like me, who spent a pretty penny on their 1680x1050 monitor, you will probably keep it if it still works. Heck, all my monitors are 1680x1050 or less, with the exception of my main machine.


----------



## Crap Daddy (Apr 28, 2011)

For me the best screen is the 22" at 1680x1050, not too large, not too small. Perfect. It also allows you to go for max settings with GPUs that are not quite the most powerful in the world. And a good CPU adds to a great gaming experience. So I think this particular res still has life in it and deserves to be benchmarked.


----------



## BababooeyHTJ (Apr 28, 2011)

newtekie1 said:


> A lot, a lot of people don't think that is a bad resolution.  And a lot of people will be upgrading their computer, but keeping their LCD that they already have, and really why not do it this way?  The monitor is still very good, and if you are someone like me, who spent a pretty penny on their 1680x1050 monitor, they will probably keep it if it still works. Heck, all my monitors are 1680x1050 or less with the exception of my main machine.



And you leave all of the settings on medium while you are at it?


----------



## newtekie1 (Apr 28, 2011)

BababooeyHTJ said:


> And you leave all of the settings on medium while you are at it?



I wasn't disagreeing with you on that point, hence why I didn't address it or even include it in the quote. Why would you assume that just because I disagree with part of your post I disagree with it all? That doesn't make any sense.


----------



## yogurt_21 (Apr 28, 2011)

BababooeyHTJ said:


> *No, that doesn't make any sense at all. Who is buying a $400-500 CPU+motherboard combo to game on a ~$120 monitor* on low or medium settings? Those comparisons are about as useful as any other synthetic benchmark. I gloss right over those reviews.
> 
> Secondly, as I said, saying that the CPU doesn't have any effect on resolution and IQ settings is just ignorant.
> 
> Lastly, where is a four-year-old 2.4GHz quad-core CPU giving a 3.2GHz (not including turbo boost) hex-core CPU a run for its money? Maybe in Anandtech's reviews. Phenom II is at worst on par with Yorkfield clock for clock, and in some cases can outperform Nehalem.



*raises hand* Now, I'm certainly on all maxed-out settings, just at the lower resolution.

Some of us want an actual monitor upgrade, not just a small resolution bump while still ending up with a crappy TN panel.

The cheapest IPS panel that meets my requirements is $500, and that's cash I don't have right now, so I'm still on my 4.5-year-old 1680x1050 monitor.

Besides, take a look at the resolutions in megapixels and there's really a small difference between 1680x1050 at 1.76 megapixels and 1920x1080 at 2.07 megapixels.


| Resolution | Megapixels |
|------------|------------|
| 640x480    | 0.31 |
| 800x600    | 0.48 |
| 1024x768   | 0.79 |
| 1440x900   | 1.30 |
| 1280x1024  | 1.31 |
| 1680x1050  | 1.76 |
| 1600x1200  | 1.92 |
| 1920x1080  | 2.07 |
| 1920x1200  | 2.30 |
| 2048x1152  | 2.36 |
| 2560x1440  | 3.69 |
| 2560x1600  | 4.10 |

You only really get a big jump if you skip up to the uber resolutions.
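The megapixel figures are just width × height divided by a million; a quick illustrative sketch (Python, resolutions taken from the table above):

```python
# Resolution in megapixels: width * height / 1,000,000.
def megapixels(width, height):
    """Pixel count in millions for a given resolution."""
    return width * height / 1_000_000

for w, h in [(1680, 1050), (1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}: {megapixels(w, h):.2f} MP")
# 1680x1050 -> 1.76 MP, 1920x1080 -> 2.07 MP, 2560x1600 -> 4.10 MP
```

Going from 1680x1050 to 1920x1080 is only about 18% more pixels, while 2560x1600 is more than double.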


----------



## BababooeyHTJ (Apr 28, 2011)

I probably shouldn't have even brought up the 1680x1050 part; it's really not that low. It's mostly the low IQ settings that bother me about Anand. They also used to use a lower resolution in their CPU reviews (when I decided to stop paying attention to them) not too long ago, IIRC.


----------



## newtekie1 (Apr 29, 2011)

BababooeyHTJ said:


> I probably shouldn't have even brought up the 1680x1050 part; it's really not that low. It's mostly the low IQ settings that bother me about Anand. They also used to use a lower resolution in their CPU reviews (when I decided to stop paying attention to them) not too long ago, IIRC.



Again, the lower resolution, besides still being extremely popular and hence important, also shows the real difference in performance between the CPUs. That is why lower resolutions and lower settings are used in CPU reviews. If people are reading a CPU review, they want to see how much actual difference there is between CPUs. Quite frankly, if you only look at large resolutions and maxed-out graphics settings, then my Celerons would show little difference compared to my i7.


----------



## LifeOnMars (Apr 29, 2011)

newtekie1 said:


> Again, the lower resolution, besides still being extremely popular and hence important, also shows the real difference in performance between the CPUs. That is why lower resolutions and lower settings are used in CPU reviews. If people are reading a CPU review, they want to see how much actual difference there is between CPUs. *Quite frankly, if you only look at large resolutions and maxed-out graphics settings, then my Celerons would show little difference compared to my i7.*



Absolute tosh :shadedshu Frame rates at any resolution are ruled by the lowest common denominator. If the CPU is slower than the GPU at a higher resolution, you are bottlenecked by the CPU, and vice-versa. Therefore a Celeron compared to an i7 is still going to show a big difference in a lot of games, even at high res.

It's very true that the gap between CPUs is reduced at higher resolutions; normally, though, this is an artificial representation, because you are hitting GPU bottlenecks first.
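That "lowest common denominator" point can be written as a toy model: treat the observed frame rate as capped by whichever stage is slower, the CPU or the GPU. All the rates below are invented for illustration, not measurements from any review:

```python
# Toy bottleneck model: observed frame rate is roughly the minimum of
# what the CPU can feed and what the GPU can render.
def observed_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

# Hypothetical limits (made up): the GPU cap depends on resolution,
# while the CPU cap barely changes with resolution.
fast_cpu, slow_cpu = 150.0, 80.0
gpu_low_res, gpu_high_res = 200.0, 60.0

# Low resolution: the CPU gap is fully visible (150 vs 80 fps).
print(observed_fps(fast_cpu, gpu_low_res), observed_fps(slow_cpu, gpu_low_res))
# High resolution: both systems sit at the GPU's 60 fps cap,
# which is the "artificial" narrowing described above.
print(observed_fps(fast_cpu, gpu_high_res), observed_fps(slow_cpu, gpu_high_res))
```

In this sketch, a CPU review run at low resolution measures the 150-vs-80 difference, while a high-resolution run reports 60 for both and hides it.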


----------



## EastCoasthandle (Apr 29, 2011)

*Here is some food for thought*

A 5850 at 1GHz was used to test a Q9550 vs an i7 860, both at 4.00GHz. The results can be found here. If I recall correctly, an 860 was faster than a 920. Here are some results using a QX9650 vs a 920, at stock and overclocked, but this time using dual GPUs.


----------



## LifeOnMars (Apr 29, 2011)

In that first test it's GPU bottlenecked, plain and simple. Pair a more powerful GPU with the CPU, or even a pair of GPUs, and the better CPU would shine and come into its own. Crap test.


----------



## EastCoasthandle (May 26, 2011)

I think it's clear that having a good CPU is beneficial when gaming. So, if anyone has the age-old question, "Do I buy a CPU or a GPU?": if it's an SB CPU, then get that first and get the GPU later.


----------



## BababooeyHTJ (May 26, 2011)

I kind of agree. I think that a lot of people underestimate the importance of a good cpu for a gaming rig. I'm hoping that after Bulldozer's release we will see some better gaming oriented cpu comparisons.


----------



## Altered (Aug 10, 2011)

This is an excellent thread. Not everyone, as shown in this thread, upgrades at every generation of new CPUs. Me included (see Q6600, LOL).

I too got irritated with the low-resolution benchmarks initially, thinking along similar lines of "WTF, what ID10T is still gaming at those numbers?" But when you are really studying, you should study all aspects of what you're involved with. OK, so we don't actually play at those resolutions, but if we know we can trick the bench into telling us useful info, then we should use it. Put it in the think tank along with the other things we find out, such as ways to stress the GPU. Then study each particular game to be played: some are CPU intensive, others are not. Just keep adding all the data together.

I never base important decisions on one source anyway; several reviews from different sources are always a good thing. The abnormalities usually stick out like a sore thumb and can then be checked in more detail. Basically, the oddball info gets thrown out if it can't be substantiated. As long as it is honest, factual data, it can be used to evaluate, compare, and analyze. Isn't that what we try to do? We want honest, in-depth, factual information from every angle possible to base our decisions on.

I am getting impatient myself waiting on BD, so I recently started thinking about an SB i5 2500K. These are impressive. And I really like the caching thing with the small SSD in front of a large platter drive. This thread is really right up my alley as to whether I should wait to see if BD is a winner or not. If I can be patient.



EastCoasthandle said:


> the age old question, "Do I buy a CPU or a GPU?" If it's an SB CPU then get that 1st then get the GPU later.



Do you still think, in my case (Q6600 @ 3.6 and an HD4870), with BD just around the corner, I should go SB now? Or get a couple of 6850s and see what happens with BD before buying the CPU? The idea of BD coming soon has me wanting to get the cards for my new machine, but that goes against the answer you gave for "Do I buy a CPU or a GPU?"


----------



## BababooeyHTJ (Aug 11, 2011)

Bulldozer will not compete with sandy for gaming. I also doubt that it will be any cheaper than a 2500k. On the other hand I kind of doubt that your Q6600 is holding back your 4870.



KickAssCop said:


> I am loving the new system. The performance increase is phenomenal in the toughest games (GTA IV, Crysis 2, Witcher 2, Gothic 4, etc.). No more hitching either (I had hitching in the above games, including Crysis 2 after the high-resolution textures). Games remain pegged in terms of framerates, and overall I really feel that I was seriously bottlenecked with the Q6600. You can only know when you upgrade.



This is what a user on another forum said about his upgrade from Q6600 to an i5 2500k. He was using a single 6950 as well.

Link


----------



## EastCoasthandle (Aug 11, 2011)

Altered said:


> Do you still think, in my case (Q6600 @ 3.6 and an HD4870), with BD just around the corner, I should go SB now? Or get a couple of 6850s and see what happens with BD before buying the CPU? The idea of BD coming soon has me wanting to get the cards for my new machine, but that goes against the answer you gave for "Do I buy a CPU or a GPU?"



To be fair that was posted back in May.  I haven't been keeping up with BD news.  Is there a release date set for BD yet?


----------



## Melvis (Aug 11, 2011)

BababooeyHTJ said:


> Bulldozer will not compete with sandy for gaming. I also doubt that it will be any cheaper than a 2500k. On the other hand I kind of doubt that your Q6600 is holding back your 4870.



Link?



EastCoasthandle said:


> To be fair that was posted back in May.  I haven't been keeping up with BD news but is there a release date set for BD yet?



Sep 19th, isn't it?


----------



## BababooeyHTJ (Aug 11, 2011)

Melvis said:


> Link?



Do you really think that its going to compete in single or even quad threaded apps like games? Sure its going to be an amazing workstation chip but I wouldn't hold my breath for it being faster than Sandy in gaming.


----------



## EastCoasthandle (Aug 11, 2011)

If that's true then he should wait until the 19th to see how things pan out for performance and prices for either SB or BD.


----------



## Melvis (Aug 11, 2011)

BababooeyHTJ said:


> Do you really think that its going to compete in single or even quad threaded apps like games? Sure its going to be an amazing workstation chip but I wouldn't hold my breath for it being faster than Sandy in gaming.



Well, from the stock clocks I've seen, and with turbo boosting 4 cores up to 4.5GHz, then yes, I think they will. But I'm saying this only for people that don't overclock, and at those speeds I know I wouldn't be.


----------



## BababooeyHTJ (Aug 11, 2011)

Melvis said:


> Well, from the stock clocks I've seen, and with turbo boosting 4 cores up to 4.5GHz, then yes, I think they will. But I'm saying this only for people that don't overclock, and at those speeds I know I wouldn't be.



You make a really good point there. It should be an interesting launch.

I'm just hoping that with this launch someone gets around to doing a good gaming oriented cpu review. There really aren't any out there with sandy or current video cards.


----------



## Altered (Aug 11, 2011)

EastCoasthandle said:


> If that's true then he should wait until the 19th to see how things pan out for performance and prices for either SB or BD.


Yes, I understand the time difference. That is my issue.
So do you think picking up my video cards now would be the better choice, now that we are fairly close (we hope) to getting some info on BD?
I realize a Q6600 isn't going to be much pumpkin, but it would just be until the CPU decision with BD.


----------



## EastCoasthandle (Aug 11, 2011)

Altered said:


> Yes, I understand the time difference. That is my issue.
> So do you think picking up my video cards now would be the better choice, now that we are fairly close (we hope) to getting some info on BD?
> I realize a Q6600 isn't going to be much pumpkin, but it would just be until the CPU decision with BD.



From what I've last read, AMD plans on releasing the 7000 series before the end of the year, so you should hold off on the new video card as well. Regardless of whether you go SB or BD, you will notice an improvement from a Q6600, OC'd or not.


----------



## Altered (Aug 12, 2011)

EastCoasthandle said:


> From what I've last read, AMD plans on releasing the 7000 series before the end of the year, so you should hold off on the new video card as well. Regardless of whether you go SB or BD, you will notice an improvement from a Q6600, OC'd or not.



I went ahead and picked up an XFX 6950 for now; I needed a fix, I was bored. I always end up with "old" hardware. Nothing new here.


----------



## LordJummy (Aug 12, 2011)

Altered said:


> I went ahead and picked up an XFX 6950 for now; I needed a fix, I was bored. I always end up with "old" hardware. Nothing new here.



There's nothing "old" about a 6950. They are still very relevant, and amazing.


----------



## Altered (Aug 12, 2011)

LordJummy said:


> There's nothing "old" about a 6950. They are still very relevant, and amazing.



No, I was implying that in just a few months, when the 7000 series hits the market, it will seem that way. I'm never all on current hardware; at best, a couple of key components will always be 1 generation, if not 2, behind. As long as it gets the job done, I'll deal with it.

I hope it's not old right now, today; I just bought it.


----------



## LordJummy (Aug 12, 2011)

Altered said:


> No, I was implying that in just a few months, when the 7000 series hits the market, it will seem that way. I'm never all on current hardware; at best, a couple of key components will always be 1 generation, if not 2, behind. As long as it gets the job done, I'll deal with it.
> 
> I hope it's not old right now, today; I just bought it.



From what I've heard the 7k series is not just a few months away. It's more like 6 months away minimum.


----------



## Exeodus (Nov 3, 2011)

Delta6326 said:


> Q6600 FTW!! i still love mine.
> 
> I do agree I wish in reviews they had some older cpus to compare on



The first chip I managed to kill was a Q6600.

But it had a 1.3125 VID so it deserved it.


----------



## TRWOV (Nov 22, 2011)

This thread is great. I thought that I might have erred by getting an H61 board and a locked Core i5, but if the Q9650 can hold its own even today, I think this setup will serve me well for years, and when it's actually obsolete it'll make a good replacement for my current HTPC.


----------



## purecain (Nov 23, 2011)

When the first i7 was released, I looked at the performance numbers compared to my old Q9550 and thought... well, this is it... we are definitely into the realms of diminishing gains...

I honestly didn't think I would get that massive an increase in performance from a platform upgrade for the next few years... then Sandy Bridge came out...

I upgraded and I cannot believe the difference... in BF3 I immediately thought... wow!!!

Every game just feels more fluid than anything I have seen before... it's impressive...

Definitely worth upgrading...

If you go Z68 you can buy a 20GB SSD and enable SRT... I have a 60GB Force GT on the way to test; that's meant to be an excellent feature...


----------

