Saturday, November 12th 2011

Sandy Bridge-E Benchmarks Leaked: Disappointing Gaming Performance?

Just a handful of days ahead of Sandy Bridge-E's launch, a Chinese tech website, www.inpai.com.cn (Google translation), has done what Chinese tech websites do best, and that's leak benchmarks and slides, Intel's NDA be damned. They pit the current i7-2600K quad-core CPU against the upcoming i7-3960X hexa-core CPU and compare them in several ways. The take-home message appears to be that gaming performance in BF3 & Crysis 2 is identical, while the i7-3960X uses considerably more power, as one might expect from an extra two cores. The only advantage appears to come from the x264 & Cinebench tests. If these benchmarks prove accurate, then gamers might as well stick with the current-generation Sandy Bridge CPUs, especially as they will drop in price before being end-of-life'd. While this is all rather disappointing, it's best to take leaked benchmarks like this with a (big) grain of salt and wait for the usual gang of reputable websites to publish their reviews on launch day, November 14th. Softpedia reckons that these results are the real deal, however. There are more benchmarks and pictures after the jump.
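The split the leak suggests (clear gains in x264 & Cinebench, none in games) is roughly what Amdahl's law predicts for workloads with different parallel fractions. A minimal sketch, with hypothetical parallel fractions chosen purely for illustration:

```python
# Amdahl's law: speedup from running the parallel fraction of a
# workload on more cores. The fractions below are assumptions,
# not measurements of x264 or any game engine.

def speedup(parallel_fraction, cores):
    """Overall speedup when the parallel part scales across `cores`."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# An encoder that is ~95% parallel gains noticeably from 4 -> 6 cores...
print(f"encoder: {speedup(0.95, 4):.2f}x -> {speedup(0.95, 6):.2f}x")

# ...while a game loop that is only ~30% parallel barely moves.
print(f"game:    {speedup(0.30, 4):.2f}x -> {speedup(0.30, 6):.2f}x")
```

Under these assumed fractions, the encoder goes from roughly 3.5x to 4.8x while the game gains only a few percent, which would match the pattern in the leaked slides.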

Source: wccftech.com

171 Comments on Sandy Bridge-E Benchmarks Leaked: Disappointing Gaming Performance?

#26
Unregistered
Wile E: I do not see a 2500k in those tests.

And the words "in these forums" was never once mentioned prior to this.
You don't have to. Point is, Sandy is faster as long as you don't load it 100%. It's faster per clock, can OC more, and does it at cooler temps.

I don't need to say that either. Just because you can afford a $1000 CPU and have a use for it doesn't mean everyone else does. The 2500k has the best balance right now and is the CPU to get. End of story.
#27
Wile E
Power User
John Doe: You don't have to. Point is, Sandy is faster as long as you don't load it 100%. It's faster per clock, can OC more, and does it at cooler temps.

I don't need to say that either. Just because you can afford a $1000 CPU and have a use for it doesn't mean everyone else does. The 2500k has the best balance right now and is the CPU to get. End of story.
End of whose story? Certainly not mine. The fact is, you cannot speak for everybody, and gaming performance does not represent a CPU's true potential.
Posted on Reply
#28
Unregistered
Wile E: End of whose story? Certainly not mine. The fact is, you cannot speak for everybody, and gaming performance does not represent a CPU's true potential.
That's right, I can't speak for everybody. However, I can speak for the majority. You, on the other hand, are speaking for a minority. How many enthusiasts buy their systems mainly for decoding? There certainly aren't as many of them as there are people who build for gaming.
#29
n-ster
John Doe: That's right, I can't speak for everybody. However, I can speak for the majority. You, on the other hand, are speaking for a minority. How many enthusiasts buy their systems mainly for decoding? There certainly aren't as many of them as there are people who build for gaming.
Except the majority of the LGA 2011 market is servers?
Posted on Reply
#30
Unregistered
n-ster: Except the majority of the LGA 2011 market is servers?
Except this forum and people on similar forums aren't in the corporate market.

Except you're asking me that when you know what we're actually on about.
#31
n-ster
OK, fine. People who buy LGA 2011 will buy it for the extra features more than for the extra performance.
Posted on Reply
#33
Over_Lord
News Editor
I don't see why anyone even needs to buy anything above a Core i7-2600K; it covers more than 90% of gamers and other users.

No point really in spending more than $300 when it performs so admirably well and can be overclocked to beat the crap out of processors with more cores.
Posted on Reply
#34
n-ster
John Doe: Thing is, this platform didn't deliver those either. It was said to, but it didn't. That's why it doesn't look great. It lacks USB 3.0, PCI-E 3.0 (which currently has no use), and so on.

www.tomshardware.com/reviews/core-i7-3960x-x79-performance,3026.html
I'd hardly trust TH to know this for sure... It's said it may be an unlockable feature on some mobos... and there are already 4 USB 3.0 ports from the controllers, which isn't bad.
Posted on Reply
#35
Unregistered
thunderising: I don't see why anyone even needs to buy anything above a Core i7-2600K; it covers more than 90% of gamers and other users.

No point really in spending more than $300 when it performs so admirably well and can be overclocked to beat the crap out of processors with more cores.
Yeah, the 2500k is even better for just games. If you need HT for multi-threaded work, the 2700k sells for only about $10 more. 2700k's seem to do over 5 GHz more easily than the 2600k does.
n-ster: I'd hardly trust TH to know this for sure... It's said it may be an unlockable feature on some mobos... and there are already 4 USB 3.0 ports from the controllers, which isn't bad.
Neither do I trust TH most of the time, but you can't deny that they "got" the hardware. One can say it's legitimate.

With that aside, they say they doubt mobos will have those built in. PCI-E expanders (like Lucid) break the original purpose of the platform. You can have the same on Sandy.
#36
ramcoza
Why a 1680x1050 resolution? What will the game benchmarks look like at 2560x1440? Will it be the same story?
Posted on Reply
#37
Unregistered
ramcoza: Why a 1680x1050 resolution? What will the game benchmarks look like at 2560x1440? Will it be the same story?
It's better to test at a lower res to take the GPU out of the equation. You still need the same amount of CPU power regardless of resolution.
#38
qubit
Overclocked quantum bit
Thread title is fine

New architecture gives same gaming framerates as old architecture = disappointing. What's hard to see?

Of course, we need to wait for the official results on release day to be sure, which is why I pose it as a question.
Posted on Reply
#41
qubit
Overclocked quantum bit
ramcoza: Why a 1680x1050 resolution? What will the game benchmarks look like at 2560x1440? Will it be the same story?
When benching CPU framerates, one should reduce the resolution as much as possible to remove the GPU as the bottleneck. Otherwise, all the CPUs under test top out and show the same performance, even though they actually differ. If anything, 1680x1050 is too high, and that's why they look the same in those slides. I would test at 800x600.
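The reasoning above can be sketched with a toy model: per-frame time is bounded by the slower of the CPU and GPU stages, and GPU time scales roughly with pixel count. All numbers below are assumptions for illustration, not measurements of any real chip:

```python
# Toy bottleneck model: frame time = max(CPU stage, GPU stage),
# where GPU time grows with the number of pixels rendered.

def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    """Framerate under a simple max-of-stages bottleneck model."""
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)
    return 1000.0 / frame_ms

fast_cpu, slow_cpu = 5.0, 6.0  # assumed ms of CPU work per frame
gpu = 8.0                      # assumed ms of GPU work per megapixel

# At 1680x1050 the GPU dominates, so both CPUs post identical numbers...
print(fps(fast_cpu, gpu, 1680, 1050), fps(slow_cpu, gpu, 1680, 1050))

# ...at 800x600 the CPU becomes the bottleneck and the gap shows up.
print(fps(fast_cpu, gpu, 800, 600), fps(slow_cpu, gpu, 800, 600))
```

Under these assumed numbers, both CPUs land at about 71 fps at 1680x1050, while at 800x600 the faster CPU pulls ahead (200 vs ~167 fps), which is exactly why low-res benching exposes CPU differences.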

Regardless, I'll say it again: wait for the official benchies tomorrow before passing judgment.

And welcome to TPU. :toast:
Posted on Reply
#42
entropy13
qubit: When benching CPU framerates, one should reduce the resolution as much as possible to remove the GPU as the bottleneck. Otherwise, all the CPUs under test top out and show the same performance, even though they actually differ. If anything, 1680x1050 is too high, and that's why they look the same in those slides. I would test at 800x600.
800x600:
Posted on Reply
#43
qubit
Overclocked quantum bit
entropy13: 800x600:
i.imgur.com/QVEn9.png
That's hard to believe. You got a link?

If it was true, then one would simply buy the cheapest CPU and be done with it.
Posted on Reply
#44
entropy13
qubit: That's hard to believe. You got a link?

If it was true, then one would simply buy the cheapest CPU and be done with it.
:roll:

You don't know what Touhou is? :laugh:

It's not exactly known for its graphics, considering the games were all made by one guy. So even if I don't have a 3960X yet, I'm sure that such results are true. AFAIK you can't do anything with the frame cap at 60fps, and even an Atom and Intel integrated graphics can play the game.
Posted on Reply
#45
erocker
*
qubit: New architecture gives same gaming framerates as old architecture = disappointing. What's hard to see?

Of course, we need to wait for the official results on release day to be sure, which is why I pose it as a question.
Who actually games at horribly low resolutions anymore? If they do, I don't think they are the marketing target for these chips anyway. Theoretically, if a GPU needs two "lanes" to achieve its maximum performance, adding more "lanes" won't make any difference. Now, unless this new CPU has a GPU that can work with a discrete GPU, or perhaps magic, it won't increase framerates in gaming.
Posted on Reply
#46
qubit
Overclocked quantum bit
entropy13: :roll:

You don't know what Touhou is? :laugh:

It's not exactly known for its graphics, considering the games were all made by one guy. So even if I don't have a 3960X yet, I'm sure that such results are true. AFAIK you can't do anything with the frame cap at 60fps, and even an Atom and Intel integrated graphics can play the game.
No, I've never heard of it before. Enlighten me?

And measuring gaming performance with a software frame cap or vsync on is pretty stupid, isn't it? :shadedshu
erocker: Who actually games at horribly low resolutions anymore? If they do, I don't think they are the marketing target for these chips anyway. Theoretically, if a GPU needs two "lanes" to achieve its maximum performance, adding more "lanes" won't make any difference. Now, unless this new CPU has a GPU that can work with a discrete GPU, or perhaps magic, it won't increase framerates in gaming.
One wouldn't game at it, but one would certainly bench at it when comparing CPU performance, to remove the GPU from the equation.
Posted on Reply
#47
erocker
*
qubit: One wouldn't game at it, but one would certainly bench at it when comparing CPU performance, to remove the GPU from the equation.
Which is a useless way to test a CPU, as the data provided shows no useful and/or real-world benefit. Benchmarks/applications that actually use the CPU show it as being quite a bit better than SB.
Posted on Reply
#48
entropy13
qubit: No, I've never heard of it before. Enlighten me?
Just go check its page on Wikipedia: "Touhou Project."
qubit: And measuring gaming performance with a software frame cap or vsync on is pretty stupid, isn't it? :shadedshu
Depends, though. Comparing one 200+ fps result to another 200+ fps result is pretty stupid too. :laugh:

It should have been obvious what I was trying to point out. There are "benchmarks" where CPU power doesn't really matter at all. ;)

Touhou is just a very extreme and ridiculous example. :laugh:
Posted on Reply
#49
Unregistered
erocker: Which is a useless way to test a CPU, as the data provided shows no useful and/or real-world benefit. Benchmarks/applications that actually use the CPU show it as being quite a bit better than SB.
No, he's right. Numbers are numbers. If one chip is shown to be faster (regardless of resolution), you know it'll have more potential and less chance of a CPU slowdown. Look at this chart, for example; he's limited by the GPU (and the game), so there's no difference.



---------------------

Then see this. At a lower resolution, there's a big difference in frame time (in a more CPU-intensive case, at that). This is a properly done test; the one above isn't:

www.hardwarecanucks.com/charts/index.php?pid=70,76&tid=3
#50
ramcoza
qubit: When benching CPU framerates, one should reduce the resolution as much as possible to remove the GPU as the bottleneck. Otherwise, all the CPUs under test top out and show the same performance, even though they actually differ. If anything, 1680x1050 is too high, and that's why they look the same in those slides. I would test at 800x600.

Regardless, I'll say it again: wait for the official benchies tomorrow before passing judgment.
But I don't get the point. People who buy this series of CPUs will never play at such low resolutions (whoever can afford one should already own at least a top-tier GPU and display). So there's no sense in testing it at low res, even if you have to test its processing power while gaming. Is it fair to test today's CPUs with a decade-old configuration and come to a conclusion from those results? I'm not standing up for SB-E or Intel; I just think it's unfair to come to a conclusion from these low-res benchies. I may be wrong. Anyway, we can find out the real story tomorrow. ;)
qubit: And welcome to TPU. :toast:
Thank you.. :toast:
Posted on Reply