# AMD Throws Down the Gauntlet at Intel for Releasing Biased & Unreliable Benchmarks



## Tsukiyomi91 (Jan 21, 2016)

Permalink: http://www.lowyat.net/2016/92684/amd-calls-out-intel-for-using-biased-and-unreliable-benchmark/
**credits to Lowyat.Net**


----------



## cdawall (Jan 21, 2016)

Instead of bitching every time something doesn't go their way, AMD should just take a leaf out of Intel's book and lie.


----------



## xfia (Jan 21, 2016)

Interesting.. while I do agree with AMD, we all know there are still games and programs people use that could show around a 50% performance difference.


----------



## dorsetknob (Jan 21, 2016)

The gist of that story, as I get it, is:
AMD gripes that they cannot beat Intel.
Some tests show Intel wipes the floor with AMD, and some show they only lose by a smaller margin
(they STILL GET BEAT).
Simple answer: SELL A PRODUCT that beats Intel.


----------



## Red_Machine (Jan 21, 2016)

Even without the benches being stacked in Intel's favour, AMD still loses.  Sounds like sour grapes to me.


----------



## Dethroy (Jan 21, 2016)

A quote from the article:



> which brings us to an important point: PCMark8 measures the overall performance of a particular system, while SYSmark measures raw CPU performance. If one were to compare two processors directly, which software would be more suitable? Food for thought.


Edit: Actions speak louder than words. If Zen doesn't prove to be a worthy alternative to Intel's offerings, AMD's future surely looks gloomy.


----------



## Tsukiyomi91 (Jan 21, 2016)

The problem with AMD is that they're desperate for a comeback after the rather disappointing sales tallied for Q4 2015, and they're not happy about it. So the only way for them to vent their anger is to lament and point fingers at Intel over unrealistic benchmark claims, when they (AMD) are the ones exaggerating claims that their rather old, rehashed chips are faster than Intel's & Nvidia's by a huge, unrealistic margin. With Zen & Polaris coming out, I have a feeling no one will want to help AMD after this brouhaha...


----------



## Bones (Jan 21, 2016)

dorsetknob said:


> The gist of that story, as I get it, is:
> AMD gripes that they cannot beat Intel.
> Some tests show Intel wipes the floor with AMD, and some show they only lose by a smaller margin
> (they STILL GET BEAT).
> Simple answer: SELL A PRODUCT that beats Intel.



And let me deal you a hand of poker with a marked & stacked deck and see how things go.

Intel has for years been stacking the deck against all competitors, period, not just AMD. Since the compiler used for, well... just about everything out there was either made or at least influenced by Intel, the stacking of the deck is built in. No test will ever run as well as it does on an Intel because of it; hence the stacked deck.

Not saying this is the complete cause, since I'm sure AMD's chips are slower IF things are on an even keel.
AMD has been guilty of shooting themselves in the foot several times over the years and can't seem to get the boat to even run straight, much less on course, and that isn't helping either.
Any blame to be had cannot be placed on the proper shoulders, since those responsible have already left the building with suitcases of cash in hand, leaving the mess for those who follow to clean up. Maybe something good will come of their next releases, BUT it's no good if they don't get them out the door into the market, AND if they don't perform as expected, or even close, like BD.

All I can say is these test results don't exactly tell the complete story. If you want to rely on them as the "end-all, be-all," that's OK because it's up to you, but I don't.

If you really want to pay out the ass both ways for your next Celeron CPU, let Intel become the only game in town, period: you Intel fanboys will love giving up your first, second and even third born for the basic CPU package... Don't even ask about the higher models when that time comes... So just hope that time never does come.


----------



## dorsetknob (Jan 21, 2016)

Bones said:


> And let me deal you a hand of poker with a marked & stacked deck and see how things go.
> 
> Intel has for years been stacking the deck against all competitors, period, not just AMD. Since the compiler used for, well... just about everything out there was either made or at least influenced by Intel, the stacking of the deck is built in. No test will ever run as well as it does on an Intel because of it; hence the stacked deck.
> 
> ...



""DID YOU NOT EVEN CONSIDER" That i may not be an INTEL fan boy  i might Be an AMD Fan Boy pleading for AMD to


dorsetknob said:


> Simple answer: SELL A PRODUCT that beats Intel.


----------



## Bones (Jan 21, 2016)

Not a fanboy of either... I happen to own both, and I never accused anyone of being such, so please chill out. 

I was making a point of my own here and I do stand by it. Fanboy or not, I don't care, because in the end that won't make any difference between the two or three or however many companies would be a part of it; each one would gladly take your money. It's a matter of you, or whoever, deciding who gets it.


----------



## RejZoR (Jan 21, 2016)

What I hate more is how the Intel compiler treats AMD CPUs even if they support all the required extensions. If an AMD CPU runs Intel-compiled binaries, it'll automatically be gimped.
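The behavior described above is vendor-keyed dispatch: the compiler's runtime picks a code path from the CPUID vendor string instead of from the feature flags the CPU actually advertises. A minimal sketch of the two policies (the function and its return values are illustrative, not Intel's actual dispatcher):

```python
def pick_kernel(vendor: str, has_sse2: bool, vendor_keyed: bool) -> str:
    """Choose a code path. Under vendor-keyed dispatch, non-Intel CPUs
    fall back to the generic path even when they support the extension."""
    if vendor_keyed:
        if vendor == "GenuineIntel" and has_sse2:
            return "sse2"
        return "generic"
    # Feature-keyed dispatch: trust the CPUID feature flags alone.
    return "sse2" if has_sse2 else "generic"

# An SSE2-capable AMD CPU ("AuthenticAMD" is AMD's real CPUID vendor string):
print(pick_kernel("AuthenticAMD", True, vendor_keyed=True))   # generic (gimped)
print(pick_kernel("AuthenticAMD", True, vendor_keyed=False))  # sse2
```

The point of the complaint: both policies give Intel chips the fast path, but only the feature-keyed one gives it to an AMD chip with the same extensions.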


----------



## Bones (Jan 21, 2016)

RejZoR said:


> What I hate more is how the Intel compiler treats AMD CPUs even if they support all the required extensions. If an AMD CPU runs Intel-compiled binaries, it'll automatically be gimped.



Which happens with most compilers out there, if not all; the industry itself is loaded with them, and you can certainly bet the software used for testing was compiled with them.


----------



## Bill_Bright (Jan 21, 2016)

This is not Intel lying. It is just AMD whining (though with justified reason).

And Intel is not stacking the deck either (not any more than ANY marketing weenie does for the product they are barking). It is just good marketing.

And this is not about being an AMD fanboy or an Intel fanboy either.

Pick any Internet browser and you can find a benchmark or review that puts it on top. Pick an anti-malware program and you can find a test that puts it on top. Pick any pickup truck and you can find a review or award that puts it on top.

All benchmarking programs are synthetic! Even those _marketed_ as representing "real-world" are not really real-world tests - in part because every single one of the estimated 1.5 billion Windows computers in use today becomes unique within the first couple minutes of its first use! Right off the bat, users set up their unique networking, security, backgrounds and Desktop colors, favorite browsers, printers (and other attached hardware), and install their favorite apps. They use their CPUs with their motherboards, their RAM, drives, and their graphics solutions - all of which can affect overall performance.

This is the same marketing propaganda war that happened years ago in the home entertainment industry over amplifier power ratings, distortion levels, frequency responses, and power handling capability of speaker systems. Klipsch claimed their horn based speakers were best because they were the most efficient. Bose claimed their backwards firing speakers were best because they had better "dispersion". AR claimed they were the best because they had the deepest distortion-free bass.

And all were right!

But no one speaker system had the best efficiency, deepest bass, highest highs, widest dispersion, tightest localization, and least distortion too. Plus, everyone's ears are different, as are their tastes in music (a critical criterion when selecting a speaker) - just as every Windows-based computer is different and every computer user's needs are different too.

The problem here is AMD and Intel CPUs should not be compared directly in terms of "performance". This is especially pertinent since AMD announced several years ago it will stop competing head-to-head with Intel.

It is us, computer enthusiasts and the IT press (and smart IT marketing/PR weenies at Intel), who keep perpetuating the problem by constantly comparing the two on performance - forcing AMD to defend itself against unfair press reports and consumer perceptions.

I can't find a good analogy. Maybe it is like comparing an office desk chair to a recliner. They are both chairs. Both need to be comfortable. Both are for sitting and need to be reliable and long lasting. But can one claim to be better than the other? Yes! Depending on the test criteria (benchmark program) you use. But is one best for ALL your sitting needs? Of course not. It is hard to fall out of a recliner if you doze off! 

*AMD makes great CPUs!* They are efficient, powerful, generally very competitively priced, and extremely reliable products you can expect to perform for years and years flawlessly. While I just happen to prefer Intel CPUs for my personal builds, I've used AMDs on many builds with no problems, and still have happy clients.

And just another side note - while AMD CPUs tend to be more affordable, when you factor in the price of the motherboard, RAM, graphics solution, case, PSU, drives, keyboard & mouse, speakers, and monitor, the cost difference becomes much less significant, and basically a wash once you spread that cost over the expected 3 - 5 (or longer) years of service you can expect from that computer.

It will REALLY be interesting to see what happens when/if Samsung buys AMD.


----------



## HD64G (Jan 21, 2016)

The problem here is that people tend to believe Intel is much more powerful at a much higher price, which robs them of their money, since for 60% of Intel's price you can have an AMD CPU with 90% of the performance. Value-for-money comparisons are greatly distorted because of biased benchmarks. And because the majority of people who buy on a budget go for the best value for their money, it hurts AMD very much. Simple, eh?


----------



## Bones (Jan 21, 2016)

Good points Bill. 

I will say Intel back in the day did a brilliant job of advertising - Remember the aliens saying "Peeennntiuuuuummm"?
Intel was the only chipmaker doing so back then; I never saw any ads from AMD, Cyrix, IBM, or anyone else involved in chipmaking at the time. 

They used this to establish the brand name and product line in the same way you'd think of many common products by the same name - And it worked.


----------



## lilhasselhoffer (Jan 21, 2016)

Did anyone read the article?  I ask because the argument seems to be that no matter what, Intel will win.  The article itself draws no such conclusion; it only states that relative performance varies by test type and assumptions.  

What we're seeing instead is that Intel vs. AMD, in a laptop, is relatively close in some synthetic benchmarks and different in others.  The test that is supposedly more relevant to CPUs demonstrates a bigger difference than the tests which exercise multiple aspects of the system.


What has been proven is that in everyday tasks your CPU is less of a factor than other things in your system.  Maybe the processor is 10% slower (figures here are round numbers, not real representations), but if the CPU only accounts for 60% of the total time required to do a task (call data, import it, manipulate it, write new data, repeat), your actual difference in performance is 6%.  

Isn't this an obvious conclusion?  Intel tests with a CPU benchmark so that its superior processing is shown.  AMD wants to test with a whole-system benchmark because even if their CPU is slower, the time loss is sponged up by less expedient procedures.
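The weighting described here is an Amdahl's-law-style calculation: the overall slowdown is the CPU's slowdown multiplied by the CPU's share of total task time. A quick sketch (the figures are illustrative round numbers; a 60% CPU share is assumed so the 6% example works out):

```python
def overall_slowdown_pct(cpu_slowdown_pct: float, cpu_share_pct: float) -> float:
    """Overall task slowdown (%) when only the CPU-bound share of the task
    runs slower; the other stages (I/O, memory, etc.) are unchanged."""
    return cpu_slowdown_pct * cpu_share_pct / 100

# A CPU that is 10% slower, doing work that fills 60% of total task time:
print(overall_slowdown_pct(10, 60))  # 6.0 -> the whole task is only 6% slower
```

This is why a whole-system benchmark compresses CPU differences: the smaller the CPU's share of the task, the less its raw speed shows in the total.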


----------



## cdawall (Jan 21, 2016)

Bill_Bright said:


> It will REALLY be interesting to see what happens when/if Samsung buys AMD.



I hope they do...


----------



## Bones (Jan 21, 2016)

lilhasselhoffer said:


> Did anyone read the article?  I ask because the argument seems to be that no matter what, Intel will win.  The article itself draws no such conclusion; it only states that relative performance varies by test type and assumptions.
> 
> What we're seeing instead is that Intel vs. AMD, in a laptop, is relatively close in some synthetic benchmarks and different in others.  The test that is supposedly more relevant to CPUs demonstrates a bigger difference than the tests which exercise multiple aspects of the system.
> 
> ...



The whole point is as you put it in the last part of your post. 
You can sum it up in several ways with each one giving a different picture of how things are or if you'd like, a "Spin" on things. 

Intel wants to "Prove" it's the best with CPU alone testing. 
AMD wants to prove it's a good product by letting tests involving all aspects of the system tell the tale. 

In reality both as said are right and wrong.
It's a matter of perception to who's looking at the results as presented with the hope your perception will agree with what they want you to see - Based on who's doing the hoping of course.


----------



## truth teller (Jan 21, 2016)

Tsukiyomi91 said:


> **credits to Lowyat.Net**


you misspelled arstechnica.co.uk

People still use SysMark? Aren't better-known alternatives like PerformanceTest, PCMark, or even the "simple" benchmarks in WinRAR/7-Zip just as good?


----------



## cdawall (Jan 21, 2016)

At the end of the day, if you took the stickers off both the FX-8800P and the i5-5200U/6200U, no one would be able to tell the difference in normal system use. I think that is all AMD is getting at. Yet SysMark shows it as something that should be literally half as fast...


----------



## Filip Georgievski (Jan 21, 2016)

I, on the other hand, am a fan of both:
Intel for CPUs and AMD for GPUs.
I look at this from another perspective.

I give Intel the award for being the best CPU-building company right now.
I give Nvidia the award for the best GPU-building company right now.

BUT I give the BEST OVERALL AWARD to AMD for making both quite well, and most of all the BEST COMPETITOR AWARD to AMD for competing in both the CPU and GPU markets and giving Intel and Nvidia a hard time in both.

Imagine if Intel made their own consumer GPUs and Nvidia made their own consumer CPUs.

If we played it like that, equally, AMD would kick both Intel's and Nvidia's asses, since dividing your engineering strength across two fronts is not something one company does with ease.


----------



## newtekie1 (Jan 21, 2016)

So basically what AMD is saying is that we shouldn't really judge the performance of processors based on "raw performance," because that gives too much of a "bias" towards processors with more performance; instead we should just say all processors are good enough for everyday tasks and leave it at that.

Got it.  It's completely stupid, but sure, whatever...

Oh, and let's not forget that AMD isn't guilt-free of cherry-picking benchmarks to show their products are "better" than their rivals'.  So this is really just the pot calling the kettle black.


----------



## NdMk2o1o (Jan 21, 2016)

Haha "throwing the gauntlet" I wouldn't be quite so dramatic  and everyone saying "well then AMD should make a better performing chip than intel...." aren't getting the point, Intel is many more times bigger than AMD, they have the better performing chips there is no need for under handed techniques at all, anyone would think they don't any competition....


----------



## cdawall (Jan 21, 2016)

newtekie1 said:


> So basically what AMD is saying is that we shouldn't really judge the performance of processors based on "raw performance," because that gives too much of a "bias" towards processors with more performance; instead we should just say all processors are good enough for everyday tasks and leave it at that.
> 
> Got it.  It's completely stupid, but sure, whatever...



Except that in raw CPU performance the 8800P isn't really behind the i5s. It depends on the programs and the configured TDP of the CPU.


----------



## newtekie1 (Jan 21, 2016)

cdawall said:


> Except that in raw CPU performance the 8800P isn't really behind the i5s. It depends on the programs and the configured TDP of the CPU.



Obviously it is, since that is exactly what SysMark measures.  It measures raw CPU power, period.  PCMark gives a better idea of real-world use, but if you are just comparing raw compute power, the i5 is way ahead.

But Intel shouldn't really be penalized for creating a more powerful processor with a lot more number-crunching power just because that extra power isn't likely to show itself in basic computing tasks, since those tasks don't use it.

If AMD wants to say Intel and SysMark are biased, then they have to show raw CPU compute benchmarks demonstrating that SysMark is way off on CPU power.  Not use PCMark, which they admit doesn't really stress the CPU that much.  Do some CPU video-encoding benchmarks, some compression benchmarks, some music-encoding benchmarks.  Those types of benchmarks rely on CPU power, will use every bit of it available, and will show the real difference in compute power between the two CPUs.

This "look Microsoft Office isn't any faster on Intel, so Sysmark is biased" shit isn't going to fly.


----------



## Tsukiyomi91 (Jan 21, 2016)

I too used AMD before but moved to Intel for its efficiency, despite it being pricier than what AMD has to offer. With AMD like this, it's rather painful for me to see them stoop this low for a "fight" when they've been lying to their consumers ever since they introduced the "new" FX processors & R9 graphics series to the masses: chips that are just 5-6-year-old designs with very minimal improvements, marketed as brand new. The reason they lost the interest of consumers & reviewers is not their products getting kicked around by other vendors; it's their own arrogance that kills them, for believing an old dog can still learn new tricks.


----------



## cdawall (Jan 21, 2016)

newtekie1 said:


> Obviously it is, since that is exactly what SysMark measures.  It measures raw CPU power, period.  PCMark gives a better idea of real-world use, but if you are just comparing raw compute power, the i5 is way ahead.
> 
> But Intel shouldn't really be penalized for creating a more powerful processor with a lot more number-crunching power just because that extra power isn't likely to show itself in basic computing tasks, since those tasks don't use it.
> 
> ...



Would you say SuperPi is biased towards Intel? SysMark tests quite a bit of single-threaded performance.

All that being said, I looked at an 8800P and promptly purchased a 5500U.


----------



## Tsukiyomi91 (Jan 21, 2016)

@newtekie1 I thought that on certain benches, especially heavy video editing, AMD triumphs over Intel, and that the benchers were right but AMD doesn't listen to them? Sure, their chips really do show some processing muscle in multi-core/multi-threaded tasks, where Intel lags behind. But when it comes to computational power, Intel regains its ground in that league because of one thing: its per-core efficiency is exceptional, unlike AMD, which relies heavily on raw speed.


----------



## cdawall (Jan 21, 2016)

Tsukiyomi91 said:


> @newtekie1 I thought that on certain benches, especially heavy video editing, AMD triumphs over Intel, and that the benchers were right but AMD doesn't listen to them? Sure, their chips really do show some processing muscle in multi-core/multi-threaded tasks, where Intel lags behind. But when it comes to computational power, Intel regains its ground in that league because of one thing: its per-core efficiency is exceptional, unlike AMD, which relies heavily on raw speed.



6th-generation Intel completely changes that. Multithreading finally caught up to AMD's, but for the rest you are more or less correct.


----------



## Tsukiyomi91 (Jan 21, 2016)

If they (AMD) want to save face, I hope they don't exaggerate with claims that their upcoming Zen & Polaris architectures perform "worlds apart" while comparing against older chips from Nvidia & Intel and using botched benches to gain popularity & hype in their favor. But since it's them, after all... they'll go full force with it & the shit will start hitting the fan all over again. If that happens, I won't be putting AMD in any of my new builds.


----------



## newtekie1 (Jan 21, 2016)

cdawall said:


> Would you say SuperPi is biased towards Intel? SysMark tests quite a bit of single-threaded performance.



No I wouldn't.  But I would say SuperPi hasn't been a relevant benchmark in years.

SysMark tests things like media creation and number crunching, two very CPU-intensive tasks, so it is going to show a bigger advantage for the CPU with more compute power.  And, IMO, if you are going to test CPU power, you want to use something that actually uses the CPU as much as possible. PCMark uses relatively CPU-light tasks.  AMD even neglected to pick the PCMark8 Creative test, and instead went with the Work test, which only tests basic office tasks (Word/Excel, browsing, and video playback).  I mean, shit, a Celeron handles those tasks with ease...



Tsukiyomi91 said:


> @newtekie1 I thought that on certain benches, especially heavy video editing, AMD triumphs over Intel, and that the benchers were right but AMD doesn't listen to them? Sure, their chips really do show some processing muscle in multi-core/multi-threaded tasks, where Intel lags behind. But when it comes to computational power, Intel regains its ground in that league because of one thing: its per-core efficiency is exceptional, unlike AMD, which relies heavily on raw speed.



Yes, that certainly can be true for certain workloads with the desktop processors.  But we are looking at laptop CPUs here, which for the FX means they had to really dial the clock speeds back: the 8800P runs at only 2.1GHz, and it is only a quad-core with no L3 cache.  It is a pretty crippled Bulldozer-based processor.


----------



## Bones (Jan 21, 2016)

Tsukiyomi91 said:


> They'll go full force with it & the shit will start hitting the fan all over again.....



I wonder what the budget for each is in buying more fans to either sling it or deflect it somehow?


----------



## cdawall (Jan 21, 2016)

newtekie1 said:


> No I wouldn't. But I would say SuperPi hasn't been a relevant benchmark in years.
> 
> SysMark tests things like media creation and number crunching, two very CPU-intensive tasks, so it is going to show a bigger advantage for the CPU with more compute power. And, IMO, if you are going to test CPU power, you want to use something that actually uses the CPU as much as possible. PCMark uses relatively CPU-light tasks. AMD even neglected to pick the PCMark8 Creative test, and instead went with the Work test, which only tests basic office tasks (Word/Excel, browsing, and video playback). I mean, shit, a Celeron handles those tasks with ease...



Media creation is something the AMD chips excel at... That should say something about the testing.


----------



## qubit (Jan 21, 2016)

Oh this should be good! I'll read it when I get home and break out the popcorn.


----------



## cdawall (Jan 21, 2016)

qubit said:


> Oh this should be good! I'll read it when I get home and break out the popcorn.



It isn't even worth the popcorn, just click bait and another shitty marketing ploy by AMD. Why haven't they fired the entire marketing department again?


----------



## xfia (Jan 21, 2016)

cdawall said:


> It isn't even worth the popcorn, just click bait and another shitty marketing ploy by AMD. Why haven't they fired the entire marketing department again?


I just asked Galaxy, but she didn't know. If only I had Watson or Jarvis for the hard ones.


----------



## Bill_Bright (Jan 21, 2016)

Tsukiyomi91 said:


> If they (AMD) want to save their face


They don't have anything to save face over! I say again, they make great processors that can be expected to last just as long as other brands. And when used with other properly chosen components, can form the foundation for a great computer - whether a game machine, CAD/CAE, server, POS, office, or plain old home computer.

This is just a marketing/PR assault from Intel, amplified by sensationalized reports of unrealistic (not real-world) benchmarking tests.

Making an absolute statement that you won't be using any AMD products in any of your builds is simply falling for the marketing hype! Not only that, it limits your own choices.

When you fall for the marketing hype and restrict yourself to a "sole-source", you feed the monopoly. And monopolies are NEVER good for consumers.

*IF* Intel, AMD, and NVIDIA each made only one CPU and one GPU, then and only then would it make sense to boycott one or the other. But because each produces dozens of processors across a wide range of performance capabilities - all of them very reliable, quality products - it makes no sense not to consider all brands when building or buying a new computer. Why limit your options?

*We need AMD!* We don't need them in head-to-head competition with Intel. But we do need AMD to keep nipping at Intel's heels, forcing Intel to keep looking over their shoulder to ensure AMD does not leap-frog over them again! AMD did that once before, totally spanking and embarrassing Intel. It took Intel nearly 8 years to regain the lead with the Core 2 Duos, and Intel cannot afford to let that happen again.

*We need Intel* to keep advancing the cutting edge but they will lose the incentive if no one is sneaking up behind them. If AMD dies because of bad press and biased comments from IT enthusiasts like us, that will be very bad for all of us.


----------



## newtekie1 (Jan 21, 2016)

cdawall said:


> Media creation is something the AMD chips excel at...That should say something about the testing.



Not in this case.  They've crippled the FX-8800P too much.  The 8-core desktop FX chips are good at media creation, but even they have lagged behind the 4c/4t desktop i7s starting with the Haswell chips.

Now we are comparing a 4-core 2.1GHz AMD, with no L3, to a 2c/4t Intel Broadwell.  The AMD chip is too crippled; they've killed its media-creation ability.

Again, if they wanted to show a bias they would have picked appropriate benchmarks, not a benchmark that uses very little CPU power.  There are plenty of benchmarks out there that give a good idea of actual CPU power they could have picked; PCMark8 Work is not one of them.


----------



## cdawall (Jan 21, 2016)

newtekie1 said:


> Not in this case.  They've crippled the FX-8800P too much.  The 8-core desktop FX chips are good at media creation, but even they have lagged behind the 4c/4t desktop i7s starting with the Haswell chips.
> 
> Now we are comparing a 4-core 2.1GHz AMD, with no L3, to a 2c/4t Intel Broadwell.  The AMD chip is too crippled; they've killed its media-creation ability.
> 
> Again, if they wanted to show a bias they would have picked appropriate benchmarks, not a benchmark that uses very little CPU power.  There are plenty of benchmarks out there that give a good idea of actual CPU power they could have picked; PCMark8 Work is not one of them.



Fair enough I guess I didn't realize just how crippled the 8800P was.


----------



## Bill_Bright (Jan 21, 2016)

newtekie1 said:


> There are plenty of benchmarks out there that give a good idea of actual CPU power they could have picked, PCMark8 Work is not one of them.


I think the point they were making is that IF Intel had used PCMark8, it would have shown that Intel's processor was not the runaway leader Intel was making it out to be.


----------



## newtekie1 (Jan 21, 2016)

Bill_Bright said:


> I think the point they were making is that IF Intel had used PCMark8, it would have shown that Intel's processor was not the runaway leader Intel was making it out to be.



Sure, but then the argument could be made that PCMark8 Work is just as biased towards AMD, since it doesn't actually test CPU power; it uses CPU-light tasks.

Basically the argument AMD is making is that Intel shouldn't say their processors are more powerful, because AMD processors are just as good at Word, Excel, and Internet browsing.  Yeah, sure they are, but any modern processor handles those tasks with ease; you aren't going to see really any difference between processors in those workloads.



cdawall said:


> Fair enough I guess I didn't realize just how crippled the 8800P was.



Just to give an idea of how crippled it is: I use multiple machines for encoding video when I have a whole TV series to encode, delegating a season to each computer.  My A10-6800K is similar to the 8800P in configuration, but actually more powerful and clocked way higher.  Even at 4.6GHz, my A10-6800K is just barely faster at encoding to H.264 than my laptop's i3-3217U, an Ivy Bridge at only 1.8GHz...

When you take the huge L3 away from Bulldozer, its performance starts to tank.


----------



## cdawall (Jan 21, 2016)

If AMD wanted to bitch, they would cry about 3DMark being biased. That is a test the 8800P should win, but doesn't.


----------



## Bill_Bright (Jan 21, 2016)

newtekie1 said:


> Basically the argument AMD is making is that Intel shouldn't say their processors are more powerful, because AMD processors are just as good at Word, Excel, and Internet browsing. Yeah, sure they are, but any modern processor handles those tasks with ease; you aren't going to see really any difference between processors in those workloads.


Exactly my point as well, and why we, as advisors, should not disparage AMD just because we prefer Intel. And I note many AMD chips work great in gaming rigs too. You don't have to have the most powerful rig to have the same amount of fun.


----------



## BiggieShady (Jan 21, 2016)

RejZoR said:


> What I hate more is how the Intel compiler treats AMD CPUs even if they support all the required extensions. If an AMD CPU runs Intel-compiled binaries, it'll automatically be gimped.


That whole fiasco made way for a third-party math library that is optimized for all architectures, dispatching on the supported instruction set rather than looking at the CPUID vendor... The library is called Yeppp ... http://www.yeppp.info/benchmarks.html ... the performance is as impressive as the library is underused.
Here are some comparative AMD vs Intel numbers: http://www.yeppp.info/home/yeppp-performance-numbers/


----------



## newtekie1 (Jan 21, 2016)

Bill_Bright said:


> Exactly my point as well and way we as advisors, should not disparage AMD just because we prefer Intel. And I note many AMDs work great in gaming rigs too. You don't have to have the most powerful rig to have the same amount of fun.



I completely agree.  I can't count the number of Athlon 5350 machines I've built for clients just because all they do is Word, Excel, and Internet.  They don't need anything more, and a 5350 machine is dirt cheap.


----------



## Super XP (Jan 21, 2016)

This is old news, of course. Intel has always played its tricks on benchmarks, especially synthetics. If Intel needs to actually cheat on its benchmarks, that's telling a completely different story. Quite pathetic actually, on their part.


----------



## terroralpha (Jan 21, 2016)

Talk about the pot calling the kettle black... how about those Bulldozer and Fury X benches??? I held off on buying a 980 Ti for like a month and a half for no reason! I even ended up getting one but returned it after a week, after realizing how worthless it was. 

The AMD cheerleaders blamed immature drivers. Here we are, like 6 months later, and the newest drivers have only added a few FPS on average.


----------



## Bill_Bright (Jan 21, 2016)

Super XP said:


> This is old news, of course. Intel has always played its tricks with benchmarks, especially synthetics. If Intel feels the need to actually cheat on its benchmarks, that's telling a completely different story. Quite pathetic actually, on their part.


Now this is BS! You think Intel has the industry cornered on hype and marketing fluff? This plays exactly into what I was saying above: we as enthusiasts need to keep our biases in check to avoid perpetuating falsehoods, as you just did.

Nowhere is there any evidence of "cheating". They never "falsified" any test results - nor did they go "Volkswagen" with the CPUs they sent to testing labs.



terroralpha said:


> Talk about the pot calling the kettle black... how about those Bulldozer and Fury X benches???


Exactly! Pick a product, any product, and their marketing department is going to "spin" the data to make their product look best. That's their job! And how well they do it makes the difference between consumers falling for the hype hook, line and sinker, or taking it with a grain of salt as they should.


----------



## newtekie1 (Jan 21, 2016)

Bill_Bright said:


> Now this is BS! You think Intel has the industry cornered on hype and marketing fluff? This plays exactly into what I was saying above: we as enthusiasts need to keep our biases in check to avoid perpetuating falsehoods, as you just did.



Yep, AMD is pretty much the king right now in marketing BS.

Heck, the video they made and this entire campaign is a perfect example.  They're trying to say their processors are just as good as Intel's by running Excel and Word benchmarks...  Talk about cherry picking benchmarks.


----------



## Bill_Bright (Jan 21, 2016)

newtekie1 said:


> Yep, AMD is pretty much the king right now in marketing BS.
> 
> Heck, the video they made and this entire campaign is a perfect example. They're trying to say their processors are just as good as Intel's by running Excel and Word benchmarks... Talk about cherry picking benchmarks.


And I really don't have a problem with this. Why not tout your strengths? If you provide a product or service, you can't grow your business, let alone stay in business if you don't advertise.


----------



## Tsukiyomi91 (Jan 22, 2016)

The problem with AMD these days is that they exaggerate a lot with botched benches that barely tell half of the actual story, relying instead on cherry-picked benches & claiming them as legitimate results. But when samples reach the hands of honest reviewers, they tell a completely different story. That's no different from a liar who sugar-coats his/her words to run away from the truth.


----------



## Kanan (Jan 22, 2016)

newtekie1 said:


> Yep, AMD is pretty much the king right now in marketing BS.
> 
> Heck, the video they made and this entire campaign is a perfect example.  They're trying to say their processors are just as good as Intel's by running Excel and Word benchmarks...  Talk about cherry picking benchmarks.


But when Intel releases their biased Sysmark benchmarks with every CPU ad, it's ok? Bullshit. Intel does exactly the same shit. All AMD is doing here is proving that Intel is cherry-picking their own shit (Intel knows their CPUs are IPC king and they use software that strictly plays to the strengths of their CPUs to make them shine) and countering with their own cherry-picking, just that AMD's cherry-picked answer is a lot more realistic.

Where in the real world is this CPU 50% faster than the AMD one, as shown in the video? It never is. End of story. And so Sysmark is bullshit, because it proves nothing. Yes, PCMark 8 Work can be done by any CPU, but at least it's a realistic workload and not totally theoretical BS like Sysmark. On top of that, somebody here mentioned this software is optimized for Intel CPUs; wow, so it can't even utilize what the weaker AMD CPUs have. What a mess. This is like being the bigger and stronger one and still playing it dirty. Well, it's normal, it's called capitalism, but still it's not really good. AMD's reaction to that is absolutely OK if you ask me. But it would have been even better if they had just said "Sysmark is not a realistic workload/benchmark; AMD CPUs are about as fast or faster in realistic workloads", simply and without making a po-face. They need to play the game smart and cool, and they don't do that. But it's no news that AMD's marketing isn't up to the task, I guess.

A lot of people here react very positively to whatever (shit) Intel does and very negatively to what AMD does. That's obvious. No wonder many people say AMD is always hated while everything Intel does is automatically good. I think the big problem AMD has is an image/PR problem. They need a solution for that: better marketing.


----------



## arbiter (Jan 22, 2016)

Bill_Bright said:


> The problem here is AMD and Intel CPUs should not be compared directly in terms of "performance". This is especially pertinent since AMD announced several years ago it will stop competing head-to-head with Intel.


Yet all the AMD APU benchmark PR slides that AMD puts out compare their CPU to an Intel one USING GPU-accelerated benchmarks? Kinda funny, isn't it, how AMD loves to use that, when most apps like email and web browsing are for the most part CPU dependent and usually have very little GPU acceleration in them.



lilhasselhoffer said:


> What we're seeing instead is that Intel vs. AMD, in a laptop, is relatively close in some synthetic benchmarks and different in other synthetics. The test that supposedly is more relevant to CPUs demonstrates a bigger difference than those tests which test multiple aspects of the system.


The reason AMD likes PCMark is that the GPU on their APUs is much better than Intel's. That makes scores in PCMark much closer than in SYSmark. As stated above, not all programs people run can use the GPU to speed things up.


Tsukiyomi91 said:


> I thought that on certain benches, especially heavy video editing, AMD triumphs over Intel & the benchers were right but AMD doesn't listen to them? Sure, their chips really do show some processing muscle when it comes to multi-core/threaded tasks where Intel lags behind.


That is due to the GPU being used to speed up processing of the data.


newtekie1 said:


> Yep, AMD is pretty much the king right now in marketing BS.
> Heck, the video they made and this entire campaign is a perfect example. They're trying to say their processors are just as good as Intel's by running Excel and Word benchmarks... Talk about cherry picking benchmarks.


My biggest beef with AMD is the marketing BS they put out. Most benchmarks I've seen that say their processors match Intel are GPU-accelerated ones like BasemarkCL.


Tsukiyomi91 said:


> The problem with AMD these days is that they exaggerate a lot with botched benches that barely tell half of the actual story, relying instead on cherry-picked benches & claiming them as legitimate results


Like I said above, they cherry-pick benchmarks that take advantage of the GPU and say that's fair to do, while if Intel did the same thing they would cry foul.


Kanan said:


> But when Intel releases their biased Sysmark benchmarks with every CPU ad, it's ok? Bullshit. Intel does exactly the same shit. All AMD is doing here is proving that Intel is cherry-picking their own shit (Intel knows their CPUs are IPC king and they use software that strictly plays to the strengths of their CPUs to make them shine) and countering with their own cherry-picking, just that AMD's cherry-picked answer is a lot more realistic


Um, most benchmarks you see for Intel CPUs are from reviews, not Intel themselves. Wonder why that is? Maybe Intel knows they've got a good CPU and independent reviews prove it?


----------



## Kanan (Jan 22, 2016)

arbiter said:


> Um, most benchmarks you see for Intel CPUs are from reviews, not Intel themselves. Wonder why that is? Maybe Intel knows they've got a good CPU and independent reviews prove it?


That's not the topic here. It's about the PR slides with Sysmark numbers that Intel uses in its ads. Also PCMark 8 is a mixed workload of CPU/GPU, yes, but it's not to the point where the GPU of the FX can outshine the i5. It's pretty realistic, a lot more than Sysmark at least. Also, to send some emails, write something in Word or simply browse the internet, you don't need an i5 processor, basically that's what PCMark and the custom AMD benchmark prove too. Basically that i5 processor would be better for workstation + gaming, but both things are unlikely to be done with such small laptop CPUs anyway. The laptops showcased in AMD's video will most likely be used for casual things like internet browsing + watching videos, and for that the FX is good enough too. I think exactly that is the point of AMD in this video.


----------



## Beertintedgoggles (Jan 22, 2016)

Bones said:


> Intel has for years been stacking the deck against all competitors period, not just AMD and since the compiler used for well..... Just about everything out there was either made or at least influenced by Intel, the stacking of the deck is there. Any test(s) period will never run as well as it does on an Intel because of it hence the stacking of the deck.



Is this still the case today?  I'd imagine it would be in AMD's interest to create a compiler of their own but trying to get software companies to create two sets of executables would also be a nightmare.


----------



## Filip Georgievski (Jan 22, 2016)

There are a lot of factors in why CPU X is faster than CPU Y.
AMD's old architecture tends to compete with the newest Intel CPUs on the market. But they are desperate for a change, and we all know that, so let's give AMD some room and look forward to Zen.

On the other hand, Intel wasn't always perfect.
Remember the time when the Athlons beat the shit out of the Pentiums?? I do.
Intel chips are a little bit overrated in the market.


----------



## silentbogo (Jan 22, 2016)

AMD is probably all pissy because no one does benchmarks like these:

[embedded video]

AMD APUs are a no-brainer in the entry-level entertainment laptop market, but when an underperforming FX-8800P laptop (locked at 15W) costs as much as an i5-6200 laptop with a dedicated GT940 - that's when things go bad.


----------



## newtekie1 (Jan 22, 2016)

Kanan said:


> But when Intel releases their biased Sysmark benchmarks with every CPU ad, it's ok?



Again, no one has proved that it is biased. Sysmark measures actual CPU power using highly CPU-intensive loads. AMD tried to claim it was biased by using PCMark8 Work Accelerated, which is an OpenCL benchmark that doesn't put a high load on the CPU. AMD's claim is way more biased and BS than Intel using Sysmark.



Kanan said:


> Also PCMark 8 is a mixed workload of CPU/GPU, yes, but it's not to the point where the GPU of the FX can outshine the i5.



Not PCMark 8 Work Accelerated. Go watch the video again, that is what they used. There is a pretty big reason they picked that specific benchmark. It is basically nothing more than a Microsoft Office benchmark (well, LibreOffice). They didn't use the entire PCMark8 test suite, they specifically picked the least CPU-intensive benchmark possible. No one gives a shit about PCMark8 Work because any processor on the market today can handle Excel and Word; you won't see much difference between the shittiest Celeron and an 8-core i7.

AMD's been relying on their better onboard GPU for years. And that is a fine argument if you want to talk about GPU power. But that's not what we're talking about here, we are talking about CPU power. If they wanted to show how much better the GPU is, show some game benchmarks, show some GPU compute benchmarks. Whatever. Don't run PCMark8 Work to try to mask the fact that while the GPU might be really good, the CPU portion is still way behind Intel. Because the fact is most people don't care how powerful the onboard GPU is as long as it can play back their HD video (and now 4K), which Intel's solution can easily handle. So making the GPU more powerful doesn't really appeal to the mass market.



Kanan said:


> It's pretty realistic



No it isn't, for the reasons I've mentioned. All this is, is AMD shouting "we're just as good as Intel at Excel!!!"



Kanan said:


> a lot more than Sysmark at least.



I would probably agree with that _if_ they ran the entire PCMark8 suite. But they didn't, because they knew the i5 would kick their asses in the other, more CPU-intensive parts.



Kanan said:


> Also, to send some emails, write something in Word or simply browse the internet, you don't need an i5 processor, basically that's what PCMark and the custom AMD benchmark prove too.



Yep, but you also don't need an 8800P.  My Celeron does all of those tasks just fine.  So I guess AMD's, and your, argument is the FX-8800P is no better than a _CELERON._  I mean, if that is the logic you want to go by, sure, we can say that...



Kanan said:


> Basically that i5 processor would be better for workstation + gaming, but both things are unlikely to be done with such small laptop CPUs anyway. The laptops showcased in AMD's video will most likely be used for casual things like internet browsing + watching videos, and for that the FX is good enough too.



I think that is very inaccurate. We are moving into a more mobile and digital world by the day. People are getting rid of their desktop computers completely and moving to lightweight laptops. At the same time they are doing things like ripping music and re-encoding video for their other mobile devices more and more. Heck, my 60-year-old uncle, a man I never thought I'd even see using a computer, now regularly encodes video to burn to DVDs. At every family gathering he is handing out DVDs of his grandkids that he made from movies off his phone. To do that I bought him an i5 laptop. If he was just looking for office tasks he'd still be using his old Pentium 987 laptop; there would really be no reason to upgrade.



Kanan said:


> I think exactly that is the point of AMD in this video.



Sure, I think that is what AMD is trying to make us believe. But it is wrong to look at it that way. Like I've pointed out, the things AMD is saying the FX-8800P is just as good as Intel's i5-5200u at are things that any processor is just as good at. That is a biased way to look at CPU power. If they had done the entire PCMark8 suite, ok, maybe they'd have a point. But picking just one benchmark, the least CPU-intensive one, to try to counter Intel's claims that their *CPUs* are more powerful is more biased and much bigger marketing BS than anything Intel has said.


----------



## xfia (Jan 22, 2016)

They have such friendly hate for each other. Still waiting on the AMD/Intel hybrid APU haha


----------



## silkstone (Jan 22, 2016)

At this point in the game, CPU speed matters very little to the average user. Throughput from the storage device and GPU performance are much bigger factors in determining whether a computer feels slow (to the average user/gamer).
If I were AMD, I'd be focusing my marketing on the price point, and my R&D on getting power usage down.


----------



## Tsukiyomi91 (Jan 22, 2016)

AMD is hampered greatly by power consumption & thermal performance, especially on their higher-end FX & R9 series products. Sure, their APUs kick the shit out of Intel's Iris graphics, but they lose the fight hands down against Intel in terms of efficiency per core, thermal performance & power consumption. If Zen really does sport the lower power consumption & heat output they are claiming, that alone can give them (AMD) a fighting chance.


----------



## Bill_Bright (Jan 22, 2016)

All these accusations of Intel lying or AMD lying are now just getting silly.

When you shop for a PSU, do you go only by the information in the maker's ads and brochures? Or do you go by the results on the professional review sites?

If you say you go by the maker's ads and brochures, then cough up the hook, line and sinkers - and your wallets because I have some prime swamp land in Florida that will be perfect for you. Same with cases, monitors, cars, TVs, home theater receivers, etc.

If you say you do your homework and go by what the professional review sites (that's sites, not site) say, then you are a wise consumer.

And all the talk in this thread is about ALL Intels and ALL AMDs - as if every single product one maker produces is superior to every single product of the other maker.  Yeah right.

All this silly criticism over AMD using this benchmarking program to make their processors shine, or Intel using that benchmarking to make their processors look better is really just wasting everyone's time.



Kanan said:


> That's not the topic here. It's about the PR slides with Sysmark numbers that Intel uses in its ads.


No it's not! It's about an accusation that Intel conspired with SySmark to show significantly better performance for Intel over AMD CPUs.

And BTW, this row over SySmark is nothing new. Both AMD and Nvidia quit the SySmark benchmark group years ago because of unrealistic testing and results that do not reflect real-world scenarios.



silkstone said:


> If I were AMD, I'd be focusing my marketing on the price point, and my R&D on getting power usage down.


That's exactly what they are trying to do. Their new Zen processor is based on 14nm, "stacked" technologies that promise more power in less space with much greater efficiency. And greater efficiency means less heat.


----------



## newtekie1 (Jan 22, 2016)

silkstone said:


> Throughput from the storage device and GPU performance are much bigger factors in determining whether a computer feels slow (to the average user/gamer).



I would argue that GPU power makes little difference to the average user.  Sure, maybe to a gamer, but not the average user.  As far as the average user is concerned, as long as the GPU is powerful enough to playback HD/4K video, they are happy.  Of course we are talking about laptops here, which will likely never have 4K displays, but that won't stop the common user from playing 4K video and thinking it looks "like totally way better man" than 1080p content on their 1080p screen...


----------



## Bill_Bright (Jan 22, 2016)

newtekie1 said:


> would argue that GPU power makes little difference to the average user. Sure, maybe to a gamer, but not the average user. As far as the average user is concerned, as long as the GPU is powerful enough to playback HD/4K video, they are happy. Of course we are talking about laptops here


I say even with a PC, the "average" user does not need a great deal of GPU horsepower. In fact, with a nice CPU and a decent chunk of RAM, integrated graphics on today's boards is more than sufficient for most users to watch videos in HD, play most games, and of course do general Internet surfing and office work.

The fact is, giant corporations, governments, small businesses, and schools buy fairly basic systems with integrated graphics by the 1000s and they work just fine.

And, most users would not even know they had less capable graphics unless there was an identical computer but with a powerful graphics card installed sitting right next to their computer with integrated graphics. Then they _might_ see the difference. I say _might_, because there are so many other variables to performance - including network speed.

I think it important to remember that game developers know most of their game users are on limited budgets and can't afford a $300 (or higher) graphics card (or two!). So they code their games to provide good "game play" on lesser systems. Yeah, the background may not be as detailed and there may be fewer independent objects floating about, but the "game play" will be the same or close enough.


----------



## Kanan (Jan 23, 2016)

newtekie1 said:


> Again, no one has proved that it is biased. Sysmark measures actual CPU power using highly CPU-intensive loads. AMD tried to claim it was biased by using PCMark8 Work Accelerated, which is an OpenCL benchmark that doesn't put a high load on the CPU. AMD's claim is way more biased and BS than Intel using Sysmark.


For that they did another custom benchmark, so it's covered anyway. Also the AMD APU is afaik better in gaming, so it doesn't really matter. See silentbogo's post with that video.



> Not PCMark 8 Work Accelerated. Go watch the video again, that is what they used. There is a pretty big reason they picked that specific benchmark. It is basically nothing more than a Microsoft Office benchmark (well, LibreOffice). They didn't use the entire PCMark8 test suite, they specifically picked the least CPU-intensive benchmark possible. No one gives a shit about PCMark8 Work because any processor on the market today can handle Excel and Word; you won't see much difference between the shittiest Celeron and an 8-core i7.


And "no one" gives a shit about Sysmark either. It's purely artificial and unrealistic; I care more about PCMark than this probably Intel-biased BS. But I agree that PCMark 8 Work is worthless - but well, everyone tries to BS his own thing, so it's basically the same shit Intel does. But again, when Intel does that, nobody is ever attacking Intel for it, but AMD gets hated at every chance possible...



> AMD's been relying on their better onboard GPU for years. And that is a fine argument if you want to talk about GPU power. But that's not what we're talking about here, we are talking about CPU power. If they wanted to show how much better the GPU is, show some game benchmarks, show some GPU compute benchmarks. Whatever. Don't run PCMark8 Work to try to mask the fact that while the GPU might be really good, the CPU portion is still way behind Intel. Because the fact is most people don't care how powerful the onboard GPU is as long as it can play back their HD video (and now 4K), which Intel's solution can easily handle. So making the GPU more powerful doesn't really appeal to the mass market.


I never said I'm only talking about CPU power, we can add GPUs to the discussion. You are right, the video isn't about GPU power, it's about everyday tasks. The CPU may be far behind, but as you said yourself, it's not important, because these PCs are just for everyday tasks and it probably won't matter at any time in the lifetime of that laptop, so... it's not so important. What AMD did was just prove that their AMD-powered laptops are just as good for everyday tasks as Intel-powered laptops. And on top of that: who needs a strong CPU like that, from Intel, with that weak GPU inside it? These laptops aren't balanced; I'd rather take a weaker CPU with a mediocre GPU than a strong CPU with a totally weak GPU. Balance is everything. Or the GPU is more important.



> No it isn't, for the reasons I've mentioned. All this is, is AMD shouting "we're just as good as Intel at Excel!!!"


That's all the video is about; you're just making it more complex than it really is. What AMD intended was a simple thing; our discussion is way beyond that, to the point that it's already become somewhat pointless.



> I would probably agree with that _if_ they ran the entire PCMark8 suite. But they didn't, because they knew the i5 would kick their asses in the other, more CPU-intensive parts.


If they had run every benchmark they would've also included a GPU benchmark or game, and then AMD would've at least won that point. As already said, a balanced system is everything, or, the GPU is more important than the CPU. Take your pick. Reminds me of that thread with the guy with the A8-6600 APU who wanted to upgrade CPU or GPU and almost everybody told him to upgrade the GPU to a GTX 970. It suits this topic somewhat. GPU > CPU. We aren't in the year 2000 anymore.



> Yep, but you also don't need an 8800P.  My Celeron does all of those tasks just fine.  So I guess AMD's, and your, argument is the FX-8800P is no better than a _CELERON._  I mean, if that is the logic you want to go by, sure, we can say that...


Isn't the 8800P already a budget processor? If yes, I don't see your point.



> I think that is very inaccurate. We are moving into a more mobile and digital world by the day. People are getting rid of their desktop computers completely and moving to lightweight laptops. At the same time they are doing things like ripping music and re-encoding video for their other mobile devices more and more. Heck, my 60-year-old uncle, a man I never thought I'd even see using a computer, now regularly encodes video to burn to DVDs. At every family gathering he is handing out DVDs of his grandkids that he made from movies off his phone. To do that I bought him an i5 laptop. If he was just looking for office tasks he'd still be using his old Pentium 987 laptop; there would really be no reason to upgrade.


I don't think it's inaccurate, I think it's the reality. Most people don't do what your uncle does; do you want a counter-example to your example, which really is just an exception? My own father, who has been pretty good with computers for over 30 years, isn't doing anything other than some internet, some movie watching and some lightweight gaming on his 6-year-old PC. Your uncle really is just an exception, a good one that is. And btw, that Pentium laptop was maybe very old and heavy etc.; there are more advances besides CPU/GPU power. Weight, battery lifetime, display, connections, looks etc. etc.



> Sure, I think that is what AMD is trying to make us believe. But it is wrong to look at it that way. Like I've pointed out, the things AMD is saying the FX-8800P is just as good as Intel's i5-5200u at are things that any processor is just as good at. That is a biased way to look at CPU power. If they had done the entire PCMark8 suite, ok, maybe they'd have a point. But picking just one benchmark, the least CPU-intensive one, to try to counter Intel's claims that their *CPUs* are more powerful is more biased and much bigger marketing BS than anything Intel has said.


Maybe that video IS full of shit, yes. As I already said they should have done something else, that video is way too dramatic, they didn't play it cool. First thing: don't do videos like that. Second: if you must, do it cool. Where is the gaming? Where are GPU heavy tasks? That APU IS comparable to the i5 if you compare everything. But they didn't do it, they just compared CPU power and in a poor fashion, I give you that. This just makes the video somewhat silly, but doesn't change the fact that the FX processor is comparable enough to me, or let's say, good hardware too.
A laptop is something you choose by your usage, what you want to do with it. I'd rather take the AMD one if I wanted to play, if not, I'd take the Intel, or I wouldn't care at all, would decide on other things (price etc).

PS: Remember AMD saying "we don't compete with Intel anymore"? What? That video is just that. They shouldn't have done it. Problem is, they wanted to get out of the way by doing APUs, but Intel got into APUs too, so they are again competing with Intel. They did anyway; I think this whole "we don't compete with Intel anymore" was BS talk. As long as they produce x86 CPUs/APUs they compete with Intel and need to be comparable. If they aren't, they lose the fight and cease to exist as a company. I think Zen is the next step in this long "war"; Zen starts the battle again (or at least we hope so). After Zen, they can't even say "we aren't competing with Intel". That's the whole point of Zen: to gain market share back from them.



> At this point in the game, CPU speed matters very little to the average user. Throughput from the storage device and GPU performance are much bigger factors in determining whether a computer feels slow (to the average user/gamer).
> If I were AMD, I'd be focusing my marketing on the price point, and my R&D on getting power usage down.


+1

---
@Bill_Bright :


> All these accusations of Intel lying or AMD lying are now just getting silly.


Every time you enter a thread you start patronizing someone or everybody. Wise men aren't smart-asses, sorry. Pls stop the acting.



> When you shop for a PSU, do you go only by the information in the maker's ads and brochures? Or do you go by the results on the professional review sites?


Nobody said that. And this topic is not about PSUs. Is it so hard to stay on the topic and what was said here? I think you need the big talks for your patronizing act.



> If you say you go by the maker's ads and brochures, then cough up the hook, line and sinkers - and your wallets because I have some prime swamp land in Florida that will be perfect for you. Same with cases, monitors, cars, TVs, home theater receivers, etc.


Your arrogance is somewhat annoying too. Not the first time btw.



> If you say you do your homework and go by what the professional review sites (that's sites, not site) say, then you are a wise consumer.


Maybe if you were someone wise your words would matter, but you aren't. I know how wise men are: they aren't patronizing and aren't smart-asses - and they aren't arrogant. You are far, far away from being wise, so please don't waste your time telling us or me what is wise and what isn't.



> And all the talk in this thread is about ALL Intels and ALL AMDs - as if every single product one maker produces is superior to every single product of the other maker.  Yeah right.


Plain bullshit. Nobody said they talk about "all Intels vs all AMDs".



> All this silly criticism over AMD using this benchmarking program to make their processors shine, or Intel using that benchmarking to make their processors look better is really just wasting everyone's time.


Then go away and don't read it. And don't post here. That "wastes your time" or doesn't it? Strange... I think you are more or less just here (at least this thread) for the patronizing part, but not really to help. Kinda egoistic.



> No it's not! It's about an accusation that Intel conspired with SySmark to show significantly better performance with Intel over AMD CPUs.


Wrong, it's about both and a lot of other things too. Somewhat shortsighted of you.



> And BTW, this row over SySmark is nothing new. Both AMD and Nvidia quit the SySmark benchmark group years ago because of unrealistic testing and results that do not reflect real-world scenarios.


Again some smartassing. And it doesn't help a bit. Intel still uses it in ads for their products. You don't seem to get the point of this thread.


----------



## newtekie1 (Jan 23, 2016)

Kanan said:


> For that they did another custom benchmark, so it's covered anyway.



Again, watch the video.  Their custom benchmark was Excel scripts, they clearly said that.



Kanan said:


> Also the AMD APU is afaik better in gaming, so it doesn't really matter. See silentbogo's post with that video.



1. We aren't talking about gaming performance.
2. We are talking about CPU power.



Kanan said:


> well, everyone tries to BS his own thing, so it's basically the same shit Intel does. But again, when Intel does that, nobody is ever attacking Intel for it, but AMD gets hated at every chance possible...



Wait, you can't be serious.  Do you realize what thread you're in?  This entire thread started because AMD and AMD's fans are attacking Intel for doing it.  Not the other way around.  Get it straight. 



Kanan said:


> I never said I'm only talking about CPU power



This discussion is about CPU power.  It has nothing to do with GPUs.



Kanan said:


> we can add GPUs to the discussion



No you can't.



Kanan said:


> What AMD did was just prove that their AMD-powered laptops are just as good for everyday tasks as Intel-powered laptops.



If that is your measure of equality, then Intel's Celeron line is just as good as AMD's flagship.  See how that logic doesn't work?



Kanan said:


> Isn't the 8800P already a budget processor? If yes, I don't see your point.



No, it competes directly with the i5 line, not the budget line.  The FX-8800P replaces the A10 laptop APUs.  It's their high-end flagship laptop processor.  The entry point for an FX-8800P laptop is ~$600.  That isn't a budget laptop.



Kanan said:


> I don't think it's inaccurate; I think it's the reality. Most people don't do what your uncle does. Do you want a counter-example to your example, which really is just an exception? My own father, who has been pretty good with computers for over 30 years, isn't doing anything other than some internet, some movie watching and some lightweight gaming on his 6-year-old PC. Your uncle is really just an exception - a good one, that is. And btw, that Pentium laptop was maybe very old and heavy, etc.; there are more advances besides CPU/GPU power: weight, battery lifetime, display, connections, looks, etc.



There is a trend towards doing more media and computational work with computers, even the basic ones.  People rip music to iTunes all the time; that is a CPU-intensive task that is definitely quicker on the Intel computer.  Sure, there are still a lot of people for whom the hardest thing the laptop will do is Facebook, but those people are perfectly fine with lower-end laptops.  They don't need, and likely aren't buying, laptops with the FX-8800P or an i5.  They are buying Pentiums, i3s, and A6s/A4s in the $300 range.



Kanan said:


> Maybe that video IS full of shit, yes. As I already said they should have done something else, that video is way too dramatic, they didn't play it cool. First thing: don't do videos like that. Second: if you must, do it cool. Where is the gaming? Where are GPU heavy tasks? That APU IS comparable to the i5 if you compare everything. But they didn't do it, they just compared CPU power and in a poor fashion, I give you that. This just makes the video somewhat silly, but doesn't change the fact that the FX processor is comparable enough to me, or let's say, good hardware too.



That is exactly what I'm saying.  I'm not saying the FX-8800P is a necessarily bad processor.  It just isn't as powerful as the i5 at CPU intensive tasks, but it has its positive points too.  The GPU being one of them.

But then again if I was gaming, for the cost of an FX-8800P laptop, I'd just spend the $100 more and get an i5-5200u with a dedicated GTX950M...

But if your budget was $600, the FX-8800P is the best option if you play games.


----------



## Kanan (Jan 23, 2016)

newtekie1 said:


> Again, watch the video.  Their custom benchmark was Excel scripts, they clearly said that.


Man, I already got your point, it's okay. 



> 1. We aren't talking about gaming performance.
> 2. We are talking about CPU power.


Do you think you can dictate to me, or to everyone in this thread, what we talk about? What are you trying here? I AM talking about GPUs too. End of story. 



> Wait, you can't be serious.  Do you realize what thread you're in?  This entire thread started because AMD and AMD's fans are attacking Intel for doing it.  Not the other way around.  Get it straight.


Yes, I am serious, and I saw that Intel fanboys attacked back and are always hating on AMD for every small mistake they make. I don't think you get what this topic is really about. 



> This discussion is about CPU power.  It has nothing to do with GPUs.


Just your opinion, nothing more. 



> No you can't.


I can, I already did and we already talked about it (plus some others too). 



> If that is your measure of equality, then Intel's Celeron line is just as good as AMD's flagship.  See how that logic doesn't work?


I don't think you got my logic straight. 



> No, it competes directly with the i5 line, not the budget line.  The FX-8800P replaces the A10 laptop APUs.  It's their high end flagship laptop processor.  The entry point for an FX-8800P laptop is ~$600.  That isn't a budget laptop.


Well, maybe it tries to compete with it, but it's something else. It's a mixture of CPU and GPU power; the Intel is more about good CPU power with some GPU on top of it. 



> There is a trend towards doing more media and computational work with computers, even the basic ones.  People rip music to iTunes all the time; that is a CPU-intensive task that is definitely quicker on the Intel computer.  Sure, there are still a lot of people for whom the hardest thing the laptop will do is Facebook, but those people are perfectly fine with lower-end laptops.  They don't need, and likely aren't buying, laptops with the FX-8800P or an i5.  They are buying Pentiums, i3s, and A6s/A4s in the $300 range.


Maybe they do, maybe they don't. People buy the wrong things all the time, or buy higher-performance items to use them longer. Rip music to iTunes all the time? What? I only know people that download music, and that's it. I don't think you are talking about the average user here, or you have a different understanding of them. 



> That is exactly what I'm saying.  I'm not saying the FX-8800P is a necessarily bad processor.  It just isn't as powerful as the i5 at CPU intensive tasks, but it has its positive points too.  The GPU being one of them.
> 
> But then again if I was gaming, for the cost of an FX-8800P laptop, I'd just spend the $100 more and get an i5-5200u with a dedicated GTX950M...
> 
> But if your budget was $600, the FX-8800P is the best option if you play games.


True. But let's agree on the point that AMD really has to do something; Zen really needs to be what was promised, or else... this doesn't get any better. Their whining videos aren't really helping either.


----------



## newtekie1 (Jan 23, 2016)

Kanan said:


> You think you can dictate me or everyone in this thread what we talk about or what are you trying here? I'AM talking about GPUs too. End of story.



When the thread is about CPU power, we talk about CPU power.  If you want to talk about GPU power, that would be off topic, go create another topic on it and talk about it there.



Kanan said:


> Yes I'am serious, and I saw that Intel fanboys attacked back and are always hating on AMD for every small mistake they make. I don't think you get what this topic is really about.



I entirely get what this topic is about. AMD wants to let everyone know their processors are just as good as Intel's at Excel.  And any benchmark that tests real CPU performance is just biased towards Intel.



Kanan said:


> Just your opinion, nothing more.



No, it is literally what this thread is about.



Kanan said:


> I don't think you got my logic straight.



I'm pretty sure I do.  If we are just judging processors on how they handle Excel, they'd all turn out pretty much the same.  That is why you don't see CPU reviews running just Excel benchmarks and judging everything on that.



Kanan said:


> Well maybe it tries to compete with it, but it's somewhat else. Its a mixture of CPU and GPU power, the Intel is more about good CPU power and some GPU on top of it.



Either way, it is definitely not a budget CPU.



Kanan said:


> Maybe they do, maybe they don't. People buy the wrong things all the time, or buy higher-performance items to use them longer. Rip music to iTunes all the time? What? I only know people that download music, and that's it. I don't think you are talking about the average user here, or you have a different understanding of them.



It is just one example. People use these computers for video editing, graphic design, video rendering.  Hell, everyone and their mom has a youtube account where they upload shittily edited videos because they just love the Movie Maker, and they are going to be a star!

Sure, there are people that buy overpowered laptops, but they are also the ones that don't look at reviews and Sysmark scores.  But the people that are looking at these benchmark scores use the laptops for more than just Facebook.


----------



## Kanan (Jan 23, 2016)

> I entirely get what this topic is about. AMD wants to let everyone know their processors are just as good as Intel's at Excel.  And any benchmark that tests real CPU performance is just biased towards Intel.


Not any, just Sysmark. 



> I'm pretty sure I do.  If we are just judging processors on how they handle Excel, they'd all turn out pretty much the same.  That is why you don't see CPU reviews running just Excel benchmarks and judging everything on that.


My point was about more than just that, but let's just settle this, shall we?



> It is just one example. People use these computers for video editing, graphic design, video rendering.  Hell, everyone and their mom has a youtube account where they upload shittily edited videos because they just love the Movie Maker, and they are going to be a star!


Maybe you are right, maybe that Intel CPU is the "smarter choice" (no pun intended) after all. 



> Sure there are people that buy over powered laptops, but they are also the ones that don't look at reviews and sysmark scores.  But the people that are looking at these benchmark scores use the laptops for more than just Facebook.


I said they buy it for future-proofing, not without any sense. Yes, this isn't about average people that don't read reviews or look at scores. But there are people who buy stronger/better laptops/PCs or any hardware (even cars) to use them longer.


----------



## vega22 (Jan 23, 2016)

cdawall said:


> Instead of bitching every time something didn't go their way, AMD should just take a leaf out of Intel's book and lie.



we all know amd can't afford to cover those legal fees like intel :rofl:


----------



## newtekie1 (Jan 23, 2016)

Kanan said:


> Not any, just Sysmark.



Except they failed to even begin to prove it.  Did you know PCMark8 actually has a Home test too?  One that is supposed to more accurately measure the performance of the computer in average home-use tasks?  Odd they didn't even pick that one.  They went with the Excel benchmark to somehow prove Sysmark is biased.  If they are going to claim a benchmark that is designed to test intense CPU performance is biased, they need to use other benchmarks that test intense CPU performance.  Not ones that barely rely on the CPU.



Kanan said:


> Maybe you are right, maybe that Intel CPU is the "smarter choice" (no pun intended) after all.



Honestly, Intel is the smarter choice.  Laptops with the i5 they tested start at the $400 mark.  Laptops with the FX-8800P start at the $600 mark.  And even in AMD's own tests the i5 still performed better than the FX-8800P, so going with the $200 cheaper option that is more powerful is the smart choice.


----------



## de.das.dude (Jan 23, 2016)

Everyone is saying that even without bias Intel beats AMD, but that's not the point. The point is Intel is selling stuff that doesn't perform as they claim, which is sketchy.

But I doubt AMD will win the case, since Intel does not market said products by advertising performance.


----------



## Tsukiyomi91 (Jan 23, 2016)

Intel, at the very least, is honest enough not to boast by printing dodgy benchmarks on their products to begin with... nor do they promote such claims to solidify their footing in the IT market.


----------



## Ralfies (Jan 23, 2016)

I think the last paragraph in the article sums things up nicely.



> Based on AMD’s findings, the FX processor still could not outperform the Core i5 processor. But, what AMD wants consumers to know is that there isn’t a huge performance gap between both processors as indicated by SYSmark, which brings us to an important point: PCMark8 measures the overall performance of a particular system, while SYSmark measures raw CPU performance. If one were to compare two processors directly, which software would be more suitable? Food for thought.



It seems to me AMD's benchmark is far more biased, considering it doesn't single out the product they're actually trying to sell.


----------



## arbiter (Jan 23, 2016)

Tsukiyomi91 said:


> Intel, at the very least, is honest enough not to boast by printing dodgy benchmarks on their products to begin with... nor do they promote such claims to solidify their footing in the IT market.


Yeah, that seems to have been a staple of AMD for the last 4-5 years with their benchmarks. It's been one of the biggest reasons I dislike them: promoting questionable PR benchmark graphs where you can look at them and see they pretty much cheated by using specialized benchmarks, like ones that run using OpenCL.



Ralfies said:


> I think the last paragraph in the article sums things up nicely.
> It seems to me AMD's benchmark is far more biased, considering it doesn't single out the product they're actually trying to sell.



If you read what you quoted, they prefer benchmarks that "measure overall performance", aka take the GPU into account in the score, and AMD's GPU is far better than Intel's. Reality is, if you are only paying $400-600 for a laptop, most people probably aren't looking for something to game on, where that GPU is really gonna matter. So really, they are attacking themselves. It's better to use software that utilizes the part of the CPU that will actually be used, instead of the part that likely won't be, because programs can't. Tell me one email program that is GPU accelerated. Web browsers for the most part aren't either, except for video decoding, and the CPU on the Intel side is more than good enough to do that job anyway.


----------



## Bill_Bright (Jan 23, 2016)

Kanan said:


> And this topic is not about PSUs


No - it's about doing your homework and learning the facts before purchasing. It was an illustration that applies to CPUs too. Sorry you could not understand that and, as such, found it necessary to denigrate the thread with personal insults. I'm arrogant? NOWHERE did I call out anyone _personally_ to criticize them with insults or name-calling because I did not like or disagreed with their comments or opinions. So look in the mirror, bud. 



Kanan said:


> I don't think you get what this topic is really about.


Do you? This topic is not about GPUs either - but you are spending a lot of time on GPUs when Intel isn't even in the discrete (graphics card) GPU business.



Kanan said:


> Plain bullshit. Nobody said they talk about "all Intels vs all AMDs".


But my point was that unless you specify a specific Intel vs. a specific AMD, instead of continually generalizing with just "Intel" and "AMD", then you are! Because as others have noted, AMD has done their share of marketing fluff and hype too. Plus, not every AMD CPU is inferior to every Intel CPU.


----------



## newtekie1 (Jan 23, 2016)

Ralfies said:


> I think the last paragraph in the article sums things up nicely.
> 
> 
> 
> It seems to me AMD's benchmark is far more biased, considering it doesn't single out the product they're actually trying to sell.



The thing is, the PCMark8 benchmark suite is a good, well-rounded suite for the whole computer _if you run all the tests_.  AMD ran the Work test only, which is the least CPU-intense test in the entire suite.  It tests Excel and Word and video chat, and that's pretty much it.  Any CPU on the market can handle those tasks.

At the very least they should have run the PCMark8 Home test, which gives a better idea of home tasks - what the average end user will be doing.  PCMark8 Work by itself is not a well-rounded benchmark.
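To illustrate why the choice of subtests matters so much: composite suites typically fold subtest scores into one number with something like a weighted geometric mean, so leaving out the CPU-heavy workloads pulls the aggregate toward tasks any CPU handles easily. A rough, hypothetical sketch - the subtest names, weights, and scores below are invented for illustration, not actual PCMark8 or SYSmark internals:

```python
from math import prod

def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted geometric mean - a common way benchmark suites
    aggregate normalized subtest scores into a single number."""
    total_w = sum(weights[k] for k in scores)
    return prod(scores[k] ** (weights[k] / total_w) for k in scores)

# Invented numbers: two hypothetical laptops with normalized subtest scores.
# Laptop B trails badly only on the CPU-heavy workloads.
cpu_a = {"excel": 95, "word": 98, "video_chat": 97, "transcode": 100, "render": 100}
cpu_b = {"excel": 92, "word": 96, "video_chat": 95, "transcode": 60, "render": 55}
weights = {"excel": 1, "word": 1, "video_chat": 1, "transcode": 1, "render": 1}

full_a = composite_score(cpu_a, weights)
full_b = composite_score(cpu_b, weights)

# Run only the light "Work"-style subtests and the gap nearly vanishes.
work_only = ["excel", "word", "video_chat"]
work_a = composite_score({k: cpu_a[k] for k in work_only}, weights)
work_b = composite_score({k: cpu_b[k] for k in work_only}, weights)

print(f"full suite: A={full_a:.1f}  B={full_b:.1f}")
print(f"work only:  A={work_a:.1f}  B={work_b:.1f}")
```

With these made-up inputs the full-suite gap is around 20 points, while the Work-only gap shrinks to a couple of points - which is the crux of the complaint about quoting only the Work test.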


----------



## Kanan (Jan 24, 2016)

Bill_Bright said:


> No - its about doing your homework and learning the facts before purchasing. It was an illustration that applies to CPUs too. Sorry you could not understand that, and as such, found it necessary to denigrate the thread with personal insults. I'm arrogant? NO WHERE did I call out anyone _personally_ to criticize them with insults or name-calling because I did not like or disagreed with their comments or opinions. So look in the mirror, bud.


I think I understand more than enough, and more than you think I can understand, but think what you want, fantasize more about it if it makes you happy - again, I don't care. Also, being arrogant is possible without calling anyone out, or talking to someone directly (you talked to EVERYONE in this thread). It's stupid that you think you can be arrogant and patronize everyone and get away with it with such a lousy excuse ("I didn't call names"). lol. I think you are over 40 or 50 years old; your social skills seem pretty limited in that regard, or you know it and are just making excuses. I think it's more likely the second one. 



> Do you? This topic is not about GPUs either - but you are spending a lot of time on GPUs when Intel isn't even in the discrete (graphics card) GPU business.


I already said that this topic is about a lot of things, GPUs too. I don't care about your ignorance either / I don't care if you accept this fact. Intel is in the GPU business; they have a high market share, the highest in the world (% of GPUs sold or owned by users). They are just not in the "discrete" GPU market, but that's not really important here. Still, every CPU they sell with a GPU in it means AMD/NV/other companies (if any) sell one GPU less - basically the same as producing discrete GPUs.

Btw, a discussion is a fluid, moving thing; nailing it down to one specific thing is only possible in theory. You can go into every thread of this forum and see whether people are holding 100% to the respective topic (guess what, they are not). That said, this discussion started with CPUs and moved to other things too; it simply advanced further, nothing special. So I don't see your point in nailing it down, other than to patronize people. I guess you grab every chance to patronize people...



> But my point was unless you specify a specific Intel vs a specific AMD instead of continually generalizing with just "Intel" and "AMD", then you are! Because as others have noted, AMD has done their share of marketing fluff and hype too. Plus, not every AMD CPU is inferior to every Intel CPU.


In some regards I was talking about "all" Intel / AMD - regarding their PR/advertising etc. - yes. But that's it. I never said every Intel is better, or every AMD is better than any Intel CPU, etc. I don't know why you are making this up or what the point of it is. Again, I think you really like to patronize others, or like to deliver a speech, because I don't see the point in it. It's a discussion like every other, nothing to be ashamed of; it doesn't need a person showing up patronizing everyone. End of story - I won't continue this kindergarten with you forever if you can't - for once - accept a mistake on your side.


----------



## OneMoar (Jan 24, 2016)

http://www.gizmodo.com.au/2016/01/amd-takes-a-swing-at-intels-sysmark-benchmarks-misses-completely/
tl;dr: AMD's CPUs are still shit regardless of any compiler optimisations. This boils down to AMD whining like a 4-year-old that just had its toys taken away and was made to sit in a corner.


----------



## arbiter (Jan 24, 2016)

OneMoar said:


> http://www.gizmodo.com.au/2016/01/amd-takes-a-swing-at-intels-sysmark-benchmarks-misses-completely/
> tl;dr: AMD's CPUs are still shit regardless of any compiler optimisations. This boils down to AMD whining like a 4-year-old that just had its toys taken away and was made to sit in a corner.


Read through that; they do make valid points about both sides and about where AMD is wrong in how they promote this. 

I was thinking: if AMD is attacking benchmarks now, I wonder if Zen, which AMD touts as getting them back in the game, might end up being another Bulldozer? I don't want to see AMD die, but really, there is a point where they should tell their PR staff to shut up and quit talking, then tell their R&D to get off their butts.


----------



## Kanan (Jan 24, 2016)

arbiter said:


> Read through that; they do make valid points about both sides and about where AMD is wrong in how they promote this.
> 
> I was thinking: if AMD is attacking benchmarks now, I wonder if Zen, which AMD touts as getting them back in the game, might end up being another Bulldozer? I don't want to see AMD die, but really, there is a point where they should tell their PR staff to shut up and quit talking, then tell their R&D to get off their butts.


I agree about the PR stuff; they should stick to the matter and be objective. Zen, however, can't be like Bulldozer: they already explained the architecture, and it's a standard x86 CPU - no modules, real cores, and up to 8 of them (desktop PC). The plans looked promising, plus it will have HTT. It's likely IPC will be a LOT higher compared to Piledriver or the latest iteration of FX, Excavator, because they strictly focused on IPC this time around. What I'm questioning is: how high will the frequency be? Will it be enough? And what will the pricing be? How about energy consumption? Such things... but I'm not questioning whether it will be another Bulldozer or not. I'm 99.9 percent sure it will not; they learned from it, and the plans already show it's a traditional design like Sandy Bridge.


----------



## JunkBear (Jan 24, 2016)

All this stuff really matters to high-end users, like productivity users and gamers. For everyday use and average people who never really use full power, what matters the most is how much they can keep in their wallets. Budget rigs almost always go for AMD, imho.


----------



## newtekie1 (Jan 24, 2016)

arbiter said:


> Zen which AMD tout's as getting them back in the game might end up being another bulldozer?



Zen's new marketing:

"AMD Zen - 9 Out Of 10 People Can't Tell It Isn't An Intel!*"
*In Microsoft Excel



Kanan said:


> Zen however can't be like Bulldozer, they already explained the architecture, it's a standard x86 CPU, no modules, real cores and up to 8 of them (desktop PC).



Just because it has "real" cores doesn't mean it can't be a performance disappointment.  The original Phenom was "real" cores and it was pretty disappointing.  Though I am hopeful for Zen, Jim Keller being a part of the design is a very good sign.


----------



## Tsukiyomi91 (Jan 24, 2016)

Another problem with AMD is their CPU coolers. I dunno why, but they look really microscopic compared to Intel's if you ask me, and they're terrible at keeping temps at bay for an already hot processor chip. Intel's stock cooler at the very least has a big surface area and a large fan to keep their chips cool.


----------



## hat (Jan 24, 2016)

Both Intel and AMD's stock coolers suck. AMD just runs hotter...


----------



## Bill_Bright (Jan 24, 2016)

hat said:


> Both Intel and AMD's stock coolers suck. AMD just runs hotter...


There certainly are more efficient aftermarket coolers, but in spite of what the aftermarket cooler marketing weenies want us to believe, both Intel and AMD supply quality coolers that are more than adequate for most users. They have to be because they are the ONLY coolers that are warrantied to cool the CPUs they come with. OEM coolers today get a bad rap because OEM coolers of yesteryear were lousy.

OEM coolers are warrantied for 3 full years. And in the rare event a failed cooler somehow destroys the CPU, ONLY OEM coolers cover CPU replacement too. No aftermarket cooler does that. I note the often recommended CM 212 is only warrantied for a lousy 1 year.  Even the OEM TIM pads are MUCH better than those used years ago.

Unless doing extreme overclocking, or you need total silence for a home theater PC, I always recommend users at least try the OEM coolers first. Typically, they are surprised how well and how quietly they work. And then they are surprised that it is typically the GPU, PSU, or case fans they are hearing and not the CPU fan.

They can always swap in an aftermarket cooler later on if not satisfied.

It is, after all, the case's responsibility to supply a sufficient flow of cool air through the case. The CPU fan need only toss up the CPU's heat into that flow. If case cooling is not doing the job with a default clocked CPU and OEM cooler, then the user (and/or computer builder) has failed to properly configure case cooling!!!


Tsukiyomi91 said:


> Another problem with AMD is their CPU coolers. I dunno why, but they look really microscopic compared to Intel's if you ask me, and they're terrible at keeping temps at bay for an already hot processor chip. Intel's stock cooler at the very least has a big surface area and a large fan to keep their chips cool.


Perhaps that is a driving force for AMD to come out with this new cooler. That said, an advantage of the smaller coolers is they fit in slim cases, which seem to be getting more popular. And while size does matter, size is not everything: the composition of the heatsink, the fan's CFM, and of course case cooling matter too.




Kanan said:


> Intel is in the GPU business, they have a high market share, the highest of the world


 Yeah right. And you call me ignorant? Did you even notice I specifically said (I will *bold* it this time for you), "_Intel isn't even in the *discrete* (*graphics card*) GPU business_." Got a link to a current, PCIe Intel graphics card?


Kanan said:


> btw. a discussion is a fluid, moving thing, nailing it down to one specific thing is just in theory possible.


Ah! I see. So it is okay for you to run a topic OT as long as it suits you, but for me to use "selecting PSUs" as an example (NOT an OT topic) for buying a graphics card is not okay, because it does not suit you. Right.  And you call me arrogant?

I am not ashamed of anything I said. Nor did I say anything that was in error - so nothing to admit to. 


Kanan said:


> I won't continue this kindergarten


No more of your name calling! That would be wonderful. Thank you. And I will not respond to your personal affronts any more either.


----------



## arbiter (Jan 25, 2016)

Kanan said:


> I agree about the PR stuff; they should stick to the matter and be objective. Zen, however, can't be like Bulldozer,


What I meant by that is that Bulldozer was heavily promoted as this great new arch, but when it came out it was barely any better than what they had before. They changed the arch, but that doesn't mean it's better until it's shown to be.


----------



## tabascosauz (Jan 25, 2016)

This claim is really annoying. AMD is trying to spin its usual "whole package performance is more important than raw CPU performance" from just a marketing factor into a misleading claim against Intel. Can they just leave it alone and be modest with a "BD is not what we need, so that's why we're working as hard as we can on Zen"? First the FX-8350 was the embodiment of "look, moar cores"; then, when it didn't work, they threw that away and tried to promote FM2+ and their laptop APUs with "CPU performance is irrelevant because our iGPUs are x times more powerful than Intel's iGPUs".

Intel isn't clean, and never has been. But this is just another case of AMD trying to turn attention away from the fact that all BD-based platforms are absolute shit. The IPC difference between Excavator and Broadwell is significant, though probably not 50%; but if you take into account the fact that most laptop manufacturers have tamed Intel ULV CPUs' Turbo while having absolutely no idea how to cool Kaveri / Carrizo properly without axing performance, 50% is not far off the mark.

They even admitted in the very claim that they could not actually beat Intel. So what's the big deal here? *Get beat by a chunk, or get beat by a chunk and half, you're still getting beat, and BD is shit*. Just shut the hell up and work on getting Zen to the market when you need it to and we'll all be just fine.

This is all like appealing a rather unfair mark of 32% on an exam when you should have clearly gotten 35%. The pass threshold is 40%. Fight or no fight, you're still getting sent back to where you came from.


----------



## cdawall (Jan 25, 2016)

Tsukiyomi91 said:


> Another problem with AMD is their CPU coolers. I dunno why, but they look really microscopic compared to Intel's if you ask me, and they're terrible at keeping temps at bay for an already hot processor chip. Intel's stock cooler at the very least has a big surface area and a large fan to keep their chips cool.



Have you ever used an AMD heatpipe cooler? ...The Intel cooler isn't larger in any dimension.



hat said:


> Both Intel and AMD's stock coolers suck. AMD just runs hotter...



In what way? I have never in my entire life as a PC guy seen AMD processors hitting 88-90C on their stock coolers at stock settings (save a bad fan, or some other physical issue). Plenty of Intel CPUs do, however.


----------



## HumanSmoke (Jan 25, 2016)

Tsukiyomi91 said:


> If they (AMD) want to save face, I hope they don't exaggerate with claims that their upcoming Zen & Polaris architectures perform "worlds apart" while comparing against older chips from Nvidia & Intel and using botched benches to gain popularity & hype in their favor. But, since it's them after all... they're gonna go full force with it & shit starts hitting the fan all over again. If that happens, I won't be putting AMD in any of my new builds.


Well, here's hoping the strategy fares better than the last time AMD tried a full-court press on Intel: benchmarking a non-existent processor against cherry-picked opposition with deliberately out-of-date benchmark scores... to the enterprise sector.

(NSFW.....)


----------



## hat (Jan 25, 2016)

cdawall said:


> In what way? I have never in my entire life as a PC guy seen AMD processors hitting 88-90C on their stock coolers at stock settings (save a bad fan, or some other physical issue). Plenty of Intel CPUs do, however.



Hm, I'll give you that much, but I do know that many AMD users, even with good coolers, speak of high temps with their latest processors. Intel might be better off if they didn't use god-knows-what as their TIM between the CPU die and IHS and then try to cool that with a coaster. For what it's worth, I value AMD's stock cooler a good few notches above Intel's...


----------



## cdawall (Jan 25, 2016)

hat said:


> Hm, I'll give you that much, but I do know that many AMD users even with good coolers speak of high temps with their latest processors. Intel might be better off if they didn't use god knows what as their TIM between the CPU die and IHS and then try to cool that with a coaster. For what it's worth, I value AMD's stock cooler a good few notches above Intel's...



What are high temps? AMD users complain about 55C on this forum.


----------



## newtekie1 (Jan 25, 2016)

cdawall said:


> Have you ever used an AMD heatpipe cooler?...The intel cooler isn't larger in any dimension.



Yeah, AMD's heatpipe cooler is way better than anything put out by Intel.



Bill_Bright said:


> They have to be because they are the ONLY coolers that are warrantied to cool the CPUs they come with.



Intel and AMD warranty their CPUs regardless of what cooler you use, as long as the cooler used is adequate.  Intel's high-end processors don't even come with a cooler anymore.



Bill_Bright said:


> OEM coolers today get a bad rap because OEM coolers of yesteryear were lousy.



Intel's stock cooler has only gotten worse over the years.  Most lower-end processors come with short blocks of all-aluminum crap coolers.  The mid-range processors use the same small heatsink, but with a copper core, albeit a hollow copper core.  At least 10 years ago Intel's coolers were tall, giving at least double the surface area, and had solid copper cores.

AMD is pretty much the same; they've been using the same heatpipe cooler since the FX-60, and the solid aluminum ones they give with most of their processors suck compared to it.  Only the high-end FX processors come with the heatpipe cooler.

Sure the stock cooler is fine for stock speeds, but all but the AMD heatpipe one are terrible.


----------



## arbiter (Jan 25, 2016)

cdawall said:


> What are high temps? AMD users complain about 55C on this forum.


Yeah, the AMD 8000 series only runs in the 55-60C range, give or take. The problem is that when the max temp the CPU can run at without frying is like 65-70C or thereabouts, yeah, that is considered high.


----------



## cdawall (Jan 25, 2016)

arbiter said:


> Yeah, the AMD 8000 series only runs in the 55-60C range, give or take. The problem is that when the max temp the CPU can run at without frying is like 65-70C or thereabouts, yeah, that is considered high.



65-70C is still 25+ degrees cooler than Intel.



newtekie1 said:


> Intel and AMD warranty their CPUs regardless of what cooler you use, as long as the cooler used is adequate. Intel's high end processors don't even come with a cooler anymore.



Technically neither do AMD's top end ones. The 9370/9590 are sold WOF, unless you ordered one of the handful packaged with a watercooler.


----------



## Bill_Bright (Jan 25, 2016)

tabascosauz said:


> This claim is real annoying. AMD is trying to spin its usual "whole package performance is more important than raw CPU performance" from just a marketing factor to a misleading claim against Intel. Can they just leave it alone and be modest


The problem is they can't leave it alone, because Intel's marketing machine is running roughshod over AMD with its own marketing "ploys" to smash AMD. AMD is left with no recourse but to defend itself.

AMD and Intel need each other (and we need both) but sadly, they don't see it that way.


newtekie1 said:


> Intel's stock cooler has only gotten worse over the years.


Sorry, but I don't agree with that at all. They are quieter and more efficient. 10 years ago, even with mild overclocking, the OEM coolers (with AMD or Intel) would not be sufficient. Today, OEM coolers are able to support even moderate overclocking - with properly configured case cooling, of course.


----------



## cdawall (Jan 25, 2016)

Bill_Bright said:


> The problem is they can't leave it alone because Intel's marketing machine is running roughshod over AMD with their own marketing "ploys" to smash AMD. AMD is left with no recourse but defend itself.
> 
> AMD and Intel need each other (and we need both) but sadly, they don't see it that way.
> 
> Sorry, but I don't agree with that at all. They are quieter and more efficient. 10 years ago, even with mild overclocking, the OEM coolers (with AMD or Intel) would not be sufficient. Today, OEM coolers are able to support even moderate overclocking - with properly configured case cooling, of course.



That's what happens when the CPU is 1/3rd-1/2 the wattage


----------



## kenkickr (Jan 25, 2016)

Bones said:


> Good points Bill.
> 
> I will say Intel back in the day did a brilliant job of advertising - Remember the aliens saying "Peeennntiuuuuummm"?
> Intel was the only chipmaker back then doing such, never saw any ads from AMD, Cyrix, IBM, or anyone else involved with chipmaking at the time.
> ...



AMD did try though.  Watch this.


----------



## Bill_Bright (Jan 25, 2016)

kenkickr said:


> AMD did try though. Watch this


lol That was good - and true back then. But everything changes and marketing campaigns fade away - otherwise, we would still be watching Geico caveman commercials!


----------



## hat (Jan 25, 2016)

cdawall said:


> That's what happens when the CPU is 1/3rd-1/2 the wattage


I don't see it. CPU TDP has pretty much remained the same throughout my knowledge, through different market segments.


----------



## OneMoar (Jan 25, 2016)

hat said:


> I don't see it. CPU TDP has pretty much remained the same throughout my knowledge, through different market segments.


what you aren't factoring is performance per watt and this is where amd is getting its assbeat
amd cpus are garbage from a performance per watt standpoint and thats really the only thing that matters these days
the more efficient your architecture the better the performance, they go hand in hand
amd is no longer the underdog they are just a old crippled dog that shits all over the house and should be shot .... I am so tired of the debate amd-cpus= shit period they have nothing to offer in the cpu market


----------



## Bill_Bright (Jan 25, 2016)

I agree (though there are more 65W CPUs now) that wattage really does not apply here.


----------



## lilhasselhoffer (Jan 25, 2016)

OneMoar said:


> what you aren't factoring is performance per watt and this is where amd is getting its assbeat
> amd cpus are garbage from a performance per watt standpoint and thats really the only thing that matters these days
> the more efficient your architecture the better the performance, they go hand in hand
> amd is no longer the underdog they are just a old crippled dog that shits all over the house and should be shot .... I am so tired of the debate amd-cpus= shit period they have nothing to offer in the cpu market



Slow clap.

Thank you, for completely missing the point.  Please note that it isn't just you missing the point, but such a brazen opinion deserves a response.


What this is about is a few tests run on two pieces of mobile hardware.  Please, look back at the link.


Argue that Intel is better until you're blue in the face.  Argue that the tests are unfair, and that's the only reason AMD loses.  Neither of those points matter.  

Can we sit down, and really ask what this is?  AMD is arguing that a test is unfair, but their conclusion is that even their "fair" test results in them being beaten.  They've chosen a specific market segment, defined "similar" hardware to test on, and they're not claiming superiority.  What AMD has claimed is that they aren't as far behind Intel as Intel's figures claim.  They've demonstrated it by taking two separate tests, with separate goals, and said Intel isn't fair.  It's a definite step up from claiming that their processors somehow handily beat out Intel.


This is AMD PR.  It's BS, and taking it to the level that we have is...bewildering.  AMD has admitted they're behind.  They've not demonstrated any real data that denies that.  AMD is desperately trying to get PR, and they've resorted to lighting themselves on fire to do it.  Can we please let their stupidity burn out, rather than ignite another flame war between fanboy bases?  Barring that sanity, can we all agree that this isn't AMD claiming superiority, just them claiming to not be completely out of the race by arbitrary standards they set?  No matter what side of this debate you're on, can we at least agree that it's a miracle AMD PR has not been fired en masse?


----------



## cdawall (Jan 25, 2016)

hat said:


> I don't see it. CPU TDP has pretty much remained the same throughout my knowledge, through different market segments.



The P4 670 was the top tier chip in its day: maximum power usage was 148 watts against a 115 W TDP. The QX9775 was a 150 W TDP part; a 6700K is a 91 W CPU, and the 6700T/TE uses 35 W. Wattage has come down, and also remember that under normal usage Intel chips tend to run well under TDP, unlike CPUs of old. Power usage is way down compared to the old days.


----------



## hat (Jan 25, 2016)

OneMoar said:


> what you aren't factoring is performance per watt and this is where amd is getting its assbeat
> amd cpus are garbage from a performance per watt standpoint and thats really the only thing that matters these days
> the more efficient your architecture the better the performance, they go hand in hand
> amd is no longer the underdog they are just a old crippled dog that shits all over the house and should be shot .... I am so tired of the debate amd-cpus= shit period they have nothing to offer in the cpu market



I'm not denying that performance per _watt_ has gotten better, merely pointing out that the watts are still roughly the same. On to your other comment: yes, AMD is behind. Yes, their processors are slower, and use more power doing so. What's worth noting, however, is that benchmarks and high performance computing don't matter to the vast majority of users. We are enthusiasts. Would I use AMD in any of my systems, if I had the money to do it all over again? Probably. Not in my main machine, given the chance, because now I do a lot of encoding. I rip DVDs, and in doing so employ a number of complex high quality processes in which my stock i5 2400 outperforms my overclocked Athlon II x4 by more than double in some cases. No other machine in the house is tasked with such work. AMD would work fine for everything else, on a lower budget. The next most demanding thing is an Elgato capturing PS4 gameplay, which an overclocked Q6600 struggles with but even a stock FX8320 would handle with ease (those bastardized pseudo-cores they tacked on aren't useless in _everything_). To be able to handle that with an Intel, I would need at least an i5, which would cost a good bit more.


----------



## Schmuckley (Jan 25, 2016)

Crying about SYSmark? What about XTU?
Where's the AMD XTU? Yeah..
This is AMD's year.. or not.
If you see my avatar.. yeah, I'm not biased. I go where the performance is.
I have been using both Intel and AMD and Cyrix for a long time.


----------



## Bill_Bright (Jan 25, 2016)

lilhasselhoffer said:


> What AMD has claimed is that they aren't as far behind Intel as Intel's figures claim. They've demonstrated it by taking two separate tests, with separate goals, and said Intel isn't fair. It's a definite step up from claiming that their processors somehow handily beat out Intel.
> 
> This is AMD PR. It's BS, and taking it to the level that we have is...bewildering.


You correctly point out that AMD is accurately noting their CPUs are still being beaten, and that their complaint is just that it is not as severe as Intel wants everyone to believe. Then you claim that is BS! It is not BS. AMD is being truthful. Bewildering that they publish it, maybe, but it is not BS.

I see this (and I may be dating myself but who cares?) as the old Avis car rental commercials where they readily admitted to being in second place behind Hertz. They were being truthful. That said, I am not claiming AMD is trying harder, as Avis did.


----------



## newtekie1 (Jan 26, 2016)

Bill_Bright said:


> Sorry, but I don't agree with that at all. They are quieter and more efficient. 10 years ago, even with mild overclocking, the OEM coolers (with AMD or Intel) would not be sufficient. Today, OEM coolers are able to support even moderate overclocking - with properly configured case cooling, of course.









On the left is a 10 year old Intel stock heatsink I received with my Pentium D 805.  On the right is a current highest end stock Intel heatsink I received with my 4790K.  Please, explain to me how the one on the right is significantly better, more efficient, than the one on the left, and isn't in fact worse.


----------



## TheGuruStud (Jan 26, 2016)

Intel has been doing this even when AMD left them in the dust. They just pay off the mags and review sites. What else is new?

Only fools believed a Pentium 4 was 25% faster than the Athlon 64, LOL. Just like a fool will believe that Intel wins by 200% in every program ever, today.

Intel should have forfeited their entire cash supply to AMD for rigging the market. They were slapped on the wrists.


----------



## cdawall (Jan 26, 2016)

TheGuruStud said:


> Intel has been doing this even when AMD left them in the dust. They just pay off the mags and review sites. What else is new?
> 
> Only fools believed a pentium 4 was 25% faster than the athlon 64 LOL. Just like a fool will believe that intel wins by 200% in every program ever, today.



But they are 200% faster in horribly threaded CPU bound games that only utilize a single core.


----------



## lilhasselhoffer (Jan 26, 2016)

cdawall said:


> But they are 200% faster in horribly threaded CPU bound games that only utilize a single core.



You mean most of the games produced, ever.

You're going to have to help me here, because it seems like most people are indulging in revisionist history.  For most of the history of video gaming, the hardware on the console side has been single core.  Developers code for consoles, as that's the "largest" player base.  Likewise, computer developers haven't really been on the multithreading bandwagon, because of the huge step up in complexity associated with writing threaded applications.  If that isn't apparent, let me list two big publishers that have released games (and underlying engines) that will not run if too many threads are enabled (and, best yet, were put out in the last decade).
Bethesda - Fallout 3/New Vegas (look up the iNumHWThread .ini tweak for confirmation)
Activision - Prototype (Will not run, period, on a system that has more than 2 threads.  Ran fine when only 2 cores were enabled in BIOS)
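For reference, the Bethesda tweak named above is usually described in community modding guides as a small edit to the game's ini file. A sketch of it looks like this; the key names and values come from those guides and should be treated as an assumption, not official Bethesda documentation:

```ini
; Fallout.ini -- community-circulated threading workaround
; (key names per modding guides; not official Bethesda documentation)
[General]
bUseThreadedAI=1   ; enable the engine's own AI threading path
iNumHWThreads=2    ; cap the engine at two hardware threads to avoid hangs
```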


What you're telling me is that it's unfair to compare single threaded performance.  It's unfair because the tests are rigged.  Let's be real, the software is what matters to people playing video games.  If the software is crippled to run competently on only one, or at best two, threads, then why is it unfair to test single threaded performance?  Likewise, it's not really reasonable to ask for a test of a complex system as a whole and interpret the relative performance of your component in that system.  That's what AMD is asking for with the alternative test: they want to put standardized delays into the mix (read: SATA communication, memory access, etc...) so that their deficiencies are less obvious.  Sleazy tactic, especially when your entire data set consists of two pieces of hardware and two tests.  Four data points, and they accuse Intel of unfair practices.  Intel deserves to be challenged, but this is more absurdist comedy than viable challenge.  

Let's give credit where it's due.  AMD took a bold step with Bulldozer, and entirely failed to get enough development behind their unique improvements.  They managed to pull a Netburst out of previous success, and have decided to ride that failure all the way to Zen.  While I can respect sticking to your guns, AMD is well past the point where it's obvious that they bet on the wrong horse.  Let's let this bit of idiocy die.  Let's tell AMD that we're smart enough to realize this is PR crap, and they haven't got a leg to stand on.  Let's ask them to make sure that Zen rectifies their shortcomings, rather than continues this self destructive pattern of absurdist claims that don't match with reality.  Can we please not make this another flame war, based on 4 distinct points of data which even AMD admits prove they have an inferior product?


----------



## TheGuruStud (Jan 26, 2016)

What? I played prototype (smoothly might I add) on my 4 and 8 core AMDs... Lol


----------



## Filip Georgievski (Jan 26, 2016)

Hahaha, I played Prototype 1 and 2 on a Core 2 Duo E7500 and my old 6770 with no problem whatsoever as well.
That is a laugh coming from an AMD CPU.
Try playing Witcher or Dying Light. 
I'm not saying AMD CPUs are bad, because for that money they are worth it, just that Intel's equivalents beat them in most aspects.
Don't get me wrong, I had an Athlon 64 (I can get it out and send you pictures for proof) and it beat the shit out of a Pentium 4.
But now Intel is on top, and that is that.

PS: Fanboys, please go back to your gaming and benchmarks; leave this discussion to the more experienced consumers and people who know what they are talking about.


----------



## Tsukiyomi91 (Jan 26, 2016)

I too used AMD chips before moving to Intel for the sake of performance and efficiency. Price wasn't exactly the issue for me... sure, AMD chips are cheap (no pun intended), but the problem they have is heat generation, whether you use stock or aftermarket coolers, and they tend to eat a lot of juice to keep up with Intel chips that use only a fraction of the power. In real world usage, sure, there isn't much of a difference, but paying your monthly utility bills is a pain in the ass if your all-AMD system runs the whole day. It's also bad if you have all-year long summer heat like over here in MY... So which do you prefer: super-cheap hardware but crazy high monthly electricity bills, OR premium hardware that's easy on the wallet? I would choose the second option if I were you...


----------



## lilhasselhoffer (Jan 26, 2016)

Filip Georgievski said:


> Hahaha, I played Prototype 1 and 2 on a Core 2 Duo E7500 and my old 6770 with no problem whatsoever as well.
> That is a laugh coming from an AMD CPU.
> Try playing Witcher or Dying Light.
> I'm not saying AMD CPUs are bad, because for that money they are worth it, just that Intel's equivalents beat them in most aspects.
> ...



?

Maybe you've cited this incorrectly, but the E7500 was a dual core processor.  Note above, where I said 2 cores worked.

I can't tell if you didn't read, or I'm missing the point.  I tried to play Prototype on a system with a 3930k and a 7970 GPU, both of which should more than handily meet the requirements.  It would get part of the way through the opening cinematic, then hang.  I figured it was some bass-ackwards setting, so off to Google I went.  After half an hour the only fix I could find was related to core count.  Popped into the BIOS, disabled HT and 4 cores, and restarted the system.  Everything was slow, but I turned Prototype on and everything worked.  Tell me, given the provided information, how exactly am I supposed to come to any conclusion except that threaded performance for Prototype was...poor?


If Prototype isn't your particular thing, do read up on Fallout.  There are a number of tweaks to the .ini file that are necessary to get it running on a modern system.  One of the biggest stability tweaks is to set the number of threads available to the game to two, and set those particular threads to highest priority.  Fallout 3 on Windows 7 is an absolute mess otherwise.  If my word, and the huge list of people online experiencing this issue, isn't enough, then let's ask Valve.  It might have gone unnoticed, but Valve has recently added information to Fallout 3 that suggests it doesn't run well on a modern OS.  Their exact words are: "Notice: Fallout 3 is not optimized for Windows 7 and later."

My point was simple: threaded games are a relatively new concept.  Most games were produced prior to the effort to thread programs.  To suggest it's unfair to test single core programs is insane.  It's advantageous to Intel, but that's because Intel didn't chain their products to the need for programmers to fundamentally rework how they code.  AMD did.  AMD put themselves where they are now; they admit to producing a poorer performing CPU, but they want to change the testing conditions to mitigate that difference.  It's crap, and anyone with an ounce of reason should be calling them out.  Four data points with two pieces of hardware isn't justification for claiming Intel is cheating; it's justification for AMD to generate PR in the lead-up to their product release.  

Unfortunately, Zen is about 10 months off.  AMD is generating this fanboy s***storm for no reason other than to stay relevant.  They want to sell off their remaining stock of underperforming (though budget friendly) CPUs before Zen rolls out and obliterates the ability to move them.  Somebody at AMD did the math and discovered you'd need 10 months to sell existing stocks.  Welcome to reality, where the cynical view is generally correct, but AMD banks on the fact that zealots will start flame wars instead of seeing through their thinly disguised attempt to shift inventory before it's completely irrelevant.


----------



## cdawall (Jan 26, 2016)

I normally early adopt amd stuff, but Zen will wait until reviews.


----------



## trog100 (Jan 26, 2016)

once amd tweaked the sleeping tigers tail.. but the tiger woke up and with one swipe of its paw put the upstart back in its place.. nothing has changed since then..

how much longer amd can survive is anybodys guess.. maybe not that much longer even though the market needs them..

if they can offer the same (mid range) performance for less money they may stand a chance but if they do its purely down to intel helping them.. intel could if they so wished lower their prices at any time.. its a rigged market and intel set the rules.. in a way intel needs amd which is probably the only reason amd has survived this long..

when you have only two players true competition doesnt exist.. the leading player sets the rules end of story..

trog


----------



## Bill_Bright (Jan 26, 2016)

@newtekie1 - sorry, but I don't see any images.

That said, not sure pointing out one pair of images makes the case for either of us. There are many variables that come into play here. When it comes to heat sink materials, one tiny change in the composition of the metal alloy can significantly change the efficiency of the heat conduction. One tiny change in the fin shapes, thickness, size and number can make a significant change in the efficiency. Raw materials are much more pure. Manufacturing techniques have greatly improved in recent years to ensure the mating surfaces are free of imperfections and are more perfectly flat. And OEM TIM has much improved too. All these factors together improve cooling efficiency.

In the olden days, we used to "lap" heat sinks and CPU dies to ensure perfect flatness. That is no longer needed.

Fan technologies have greatly improved too. R&D has gone into the aerodynamic shape of the blades (they are tiny wings, after all) to ensure they grab and push more air, instead of just chopping at the air. Better bearings are used too. Better design blades and precision bearings decrease noise too.

And CPUs are much more efficient too. Plus, cases today don't come with a single 80mm case fan as cases 10 years ago commonly did - keeping in mind it is the case's responsibility to provide a sufficient supply of cool air running through the case.

So looking at a couple pictures proves nothing.

*IF* the OEM coolers were as bad as you want us to believe, there would be 100s of millions of overheating computers out there. And that is just not happening.

Again, if you are doing extreme overclocking, if your case cooling is inadequate, or if you are just seeking bragging rights (or want silent running for an HTPC), then by all means go for aftermarket cooling. But if you are just using standard clocking, I say try the OEM cooler. To be sure, I am NOT saying OEM coolers are the panacea for CPU cooling. I am just saying OEM coolers are not crap.

Another thing I've said before - keeping our CPUs adequately cooled is, no doubt, absolutely essential for stable operation and long life. But cooler does not automatically mean better. A CPU running at 35°C will not be more stable, perform better, or last longer than a CPU running 45°C or even 55°C (or even higher, for some CPUs).

You really need to ask yourself, "why would Intel or AMD provide coolers (and warranty them too) that failed to keep their CPUs adequately cooled?" How could they and not go bankrupt?

I've been doing IT tech support for a living for over 40 years, and the idea that OEM coolers are inadequate is just not supported by the facts! It is a myth perpetrated by aftermarket cooler makers, enthusiasts who believe everyone should follow their lead, and their blind followers.

So I say again, give the OEM coolers a chance. Make sure you have adequate case cooling. Keep your case clean of heat trapping dust and you just might be surprised at how efficient, and quiet the OEM coolers can be.

If still not satisfied, then go for an aftermarket cooler.


----------



## cdawall (Jan 26, 2016)

Bill_Bright said:


> *IF* the OEM coolers were as bad as you want us to believe, there would be 100s of millions of overheating computers out there. And that is just not happening.



They are overheating. They just throttle back when it happens. In fact they perform so badly that most OEMs use aftermarket heatsinks _even on the i3s_.


----------



## Bill_Bright (Jan 26, 2016)

That's just BS. So you are claiming now that 100s of millions of user PCs are not running at full speed because they are overheating and the users are not aware of this so they are keeping quiet?   

Yeah right. Nice try.


----------



## cdawall (Jan 26, 2016)

Bill_Bright said:


> That's just BS. So you are claiming now that 100s of millions of user PCs are not running at full speed because they are overheating and the users are not aware of this so they are keeping quiet?
> 
> Yeah right. Nice try.



I wouldn't know, I just work in a tech shop and deal with it on a day to day basis. They can run normal clock speed fine; it is the turbo mode that most of them down clock from. I would also love for you to get me a "normal user" and have him tell me the difference between a CPU that is throttling down to 2.4-2.6 GHz vs 3.9.

and again, as I _already said_, I rarely see an Intel OEM HSF in a prebuilt. Most of them have an aftermarket cooler that is better than what Intel provided.


----------



## cdawall (Jan 26, 2016)

Let's add pictures to prove the point.






This is an unoverclocked Dell with an i7 6700k. Mind telling me which heatsink is better, this or the one @newtekie1 posted? And this is fucking Dell, one of the most returned, worst built consumer level products on the market.


----------



## Bill_Bright (Jan 26, 2016)

Oh yeah. That totally proves everything.  100s of millions of users out there are all a bunch of dumb bunnies.



cdawall said:


> I wouldn't know I just work in a tech shop and deal with it on a day to day basis.


And I own a custom PC and consulting business. I have government and business contracts to support their computers and networks. And I know better than to assume what I see coming into the shop represents what is out there in the real world.

It is clear you are convinced and so are unwilling even to see for yourself.

Oh, and BTW, most (if not all) CPUs today toggle down in speed because they don't need to run at full throttle. Not because they are running too hot. CPUz will show anyone that.

If you want to use an aftermarket cooler - go for it. But when giving advice, if the user will not be doing extreme overclocking, it is bad advice to automatically tell them to spend extra money on an aftermarket cooler if their CPU comes with an OEM cooler.

And for those of you who automatically recommend side firing coolers like the CM 212, think twice. Motherboard designers intentionally surround the CPU socket with other heat generating and heat sensitive devices so they too can take advantage of the outward spreading air flow created by the downward firing OEM coolers. Aftermarket side firing fans do not provide such needed cooling as the fan is up high and blows in only one direction.

Don't take my word for it. See the note *here*, where it says (my *bold underline* added):





> * For cooling the CPU and *its surrounding components*, please install a CPU cooler with a top-down blowing design.



This is just one example from ASRock. Other motherboard makers have posted similar warnings. So, if your CPU does not come with an OEM cooler, or you just choose to use an aftermarket cooler, I recommend you get one that fires in the same orientation as the OEM coolers the designers anticipated to ensure your sensitive motherboard components receive the cooling they need too. This surely will help provide better stability, regulation and component longevity on motherboards that rely on that flow of air.

Now I see no point in discussing this further - unless you can provide real evidence supporting the claim that 100s of millions of CPUs out there have throttled down because the OEM coolers fail to keep them cool.


----------



## cdawall (Jan 26, 2016)

Bill_Bright said:


> Oh yeah. That totally proves everything.  100s of millions of users out there are all a bunch of dumb bunnies.
> 
> 
> And I own a custom PC and consulting business. I have government and business contracts to support their computers and networks. And I know better than to assume what I see coming into the shop represents what is out there in the real world.



I work in the real world... Next time you have a business computer, do me a favor and pop the side panel off. Guarantee it isn't a stock Intel HSF; now, tower cooler vs top-down, that's purely on the manufacturer. I couldn't care less about custom built machines; that is not even close to a majority market share. Want to see which designs work? Pop open an OptiPlex, ThinkCentre or ProDesk. They do not have OEM Intel or OEM AMD HSFs in them for a reason. The stock coolers do not have the ability to cool a machine at max turbo settings, and all of this is a known, well documented thing.




Bill_Bright said:


> Oh, and BTW, most (if not all) CPUs today toggle down in speed because they don't need to run at full throttle. Not because they are running too hot. CPUz will show anyone that.



I know how throttle stop and CnQ works...I don't need you to explain them.



Bill_Bright said:


> Now I see no point in discussing this further - unless you can provide real evidence supporting the claim that 100s of millions of CPUs out there have throttled down because the OEM coolers fail to keep them cool.



This doesn't apply to the vast majority machines because they don't use the crap intel cooler...I think I mentioned not even the OEM's use them what 3 times now?

Also, why did you bring up anything about tower coolers? I literally just said it was a better cooler than the Intel one. Don't really need another one of your random off topic rants trying to change the subject...


----------



## trog100 (Jan 26, 2016)

throttling down on turbo settings is back to front logic.. the base clock is the guarantee clock turbo is an extra.. if conditions are right you get the turbo if they aint you dont get the turbo or at least not so much of it..

and for f-cks sake it aint in intels interest to ship stock coolers that will cause customer problems.. dont get me wrong here.. they will throttle down temp wise if they hit 100 C which many will running daft things like prime95 or even intels own burn in test..

but normal people dont run daft things like prime95.. only enthusiasts do that.. and intel make adequate provisions for that..

view turbo boost for what it is.. a little extra over and above if conditions (the software being run being one of them) are right and it make more sense..

a graphics card behaves in the same way.. you only get the max claimed boost if conditions are right.. sometime they are.. sometimes they aint..

trog


----------



## BiggieShady (Jan 26, 2016)

Bill_Bright said:


> Oh, and BTW, most (if not all) CPUs today toggle down in speed because they don't need to run at full throttle. Not because they are running too hot. CPUz will show anyone that.


I gotta point out here that you are talking about idle and low load situations, where this behavior happens. If it's at full load, then lowering turbo clocks happens only because of temperature or power restrictions. Essentially, you are arguing that stock coolers are only good for idle and low load scenarios.


----------



## cdawall (Jan 26, 2016)

trog100 said:


> throttling down on turbo settings is back to front logic.. the base clock is the guarantee clock turbo is an extra.. if conditions are right you get the turbo if they aint you dont get the turbo or at least not so much of it..
> 
> and for f-cks sake it aint in intels interest to ship stock coolers that will cause customer problems.. dont get me wrong here.. they will throttle down temp wise if they hit 100 C which many will running daft things like prime95 or even intels own burn in test..
> 
> ...



The issue is they don't throttle down to those guaranteed clocks, they drop well below. More of a "this is good, this is good, OH FUCK WE HAVE GONE TOO FAR" approach.


----------



## BiggieShady (Jan 26, 2016)

trog100 said:


> but normal people dont run daft things like prime95.. only enthusiasts do that.. and intel make adequate provisions for that..


Most people who run their CPUs for hours upon hours at 100% load are 3D artists (rendering), crunchers and mathematicians, or scientists in general running all kinds of simulations. (Prime95, btw, is a program for searching for huge prime numbers; it is very efficient, and the worker code is done entirely in assembler.)


----------



## Vario (Jan 26, 2016)

LOL


> According to AMD, SYSmark favours Intel processors since the benchmarking software focuses too much on raw performance instead of real-world usage



Keep making excuses AMD.  The only benchmarks that favor AMD are those with lots of threads required.  And those don't reflect "real world usage" unless you are a server admin or a video editor.

They are still trying to market an 8350 as competitive with an i5, never mind that it is 4 years old at this point.

It's not even price competitive; I'd rather have a used i5 2500K for $100 than any of the AMDs.

AMD dropped the ball with the FX arch and no amount of marketing is going to change that. Four years later, their top gaming processor is still an 8350.


----------



## newtekie1 (Jan 26, 2016)

Bill_Bright said:


> *IF* the OEM coolers were as bad as you want us to believe, there would be 100s of millions of overheating computers out there. And that is just not happening.



http://tpuminecraft.servebeer.com/pictures/heatsinks.jpg

Direct link to the image.

No, they haven't changed much in 10 years, they are just smaller now.  Or in the case of the lower end processors they have removed the copper core and just use an all aluminium heatsink.  They don't use fancy alloys, that would cost too much, it is aluminium and copper, it always has been.

I also didn't say the stock cooler wasn't adequate.  If it wasn't adequate, Intel wouldn't be including it.  That is why they don't include the all aluminium one with the 4790K, because it actually wouldn't be adequate.



Bill_Bright said:


> That's just BS. So you are claiming now that 100s of millions of user PCs are not running at full speed because they are overheating and the users are not aware of this so they are keeping quiet?
> 
> Yeah right. Nice try.



That is the beauty of turbo boost.  Because of the stock cooler, a lot of the time the CPU will not turbo as much, or as long.  This is because of heat build up.  Is this technically overheating and throttling? No.  But many enthusiasts would consider it throttling, because under an aftermarket cooler the CPU would run at the full turbo speed constantly and not have to drop back to the standard speed.
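A rough way to see this turbo sag for yourself, without any extra tools, is to time a fixed chunk of work repeatedly: if the sustained passes run measurably slower than the first cold passes, the CPU is likely dropping out of its peak turbo bin as heat builds up. This is a crude stdlib-only proxy I'm sketching here, not a substitute for actually reading clocks in CPU-Z or Core Temp, and the 1.1 threshold in the comment is just a ballpark assumption:

```python
# Crude turbo-sag check: time a fixed busy-loop repeatedly and compare
# warm passes against the best cold pass (timing proxy only; it cannot
# distinguish clock drops from OS scheduling noise).
import time

def timed_pass(n: int = 1_000_000) -> float:
    # Time one fixed chunk of integer work.
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i
    return time.perf_counter() - start

def throttle_ratio(passes: int = 20, n: int = 1_000_000) -> float:
    # Ratio of average late (warm) pass time to the best early (cold) pass.
    times = [timed_pass(n) for _ in range(passes)]
    baseline = min(times[:3])
    sustained = sum(times[-3:]) / 3
    return sustained / baseline

if __name__ == "__main__":
    # A ratio creeping well above ~1.1 on a stock cooler hints at clock sag.
    print(f"sustained/baseline time ratio: {throttle_ratio():.2f}")
```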



BiggieShady said:


> Most people that run their CPU-s hours upon hours in 100% load are 3D artists (rendering), crunchers and mathematicians (or scientists in general for all kinds of simulations btw. Prime95 is a program for searching for huge prime numbers, very efficient, worker code is completely done in assembler)



Plus distributed computing is more popular than ever. Folding@Home, Seti@Home, WCG.   They all load up the CPU and leave it loaded 24/7.



Bill_Bright said:


> This is just one example from ASRock. Other motherboard makers have posted similar warnings. So, if your CPU does not come with an OEM cooler, or you just choose to use an aftermarket cooler, I recommend you get one that fires in the same orientation as the OEM coolers the designers anticipated to ensure your sensitive motherboard components receive the cooling they need too. This surely will help provide better stability, regulation and component longevity on motherboards that rely on that flow of air.



That's because that AsRock board(and any other board that says that) is a piece of shit.  That is what you get when you claim your 3-Phase motherboard can run 125w CPUs.


----------



## BiggieShady (Jan 26, 2016)

newtekie1 said:


> That is the beauty of turbo boost. Because of the stock cooler, a lot of the time the CPU will not turbo as much, or as long. This is because of heat build up. Is this technically overheating and throttling? No. But many enthusiasts would consider it throttling, because under an aftermarket cooler the CPU would run at the full turbo speed constantly and not have to drop back to the standard speed.


The beauty of turbo boost enabled Intel to make even cheaper stock coolers.


----------



## Bill_Bright (Jan 26, 2016)

BiggieShady said:


> If it's at full load


I agree, "IF" at full load. But it is rare, even for hard core gamers and other enthusiasts (unless, as trog notes, they are doing "daft things" like running prime95) for a CPU to sit at full load for very long.


BiggieShady said:


> Essentially, you are giving an argument that stock coolers are only good for idle and low load scenarios.


No, my argument is, give today's OEM coolers a chance. Try them out before automatically condemning them and telling users seeking your advice to spend more money without even seeing if they meet the users needs first.

I am NOT against aftermarket coolers. I am saying I am not against OEM coolers (for most users) either. We use aftermarket coolers in all our HTPC builds. We've used AiOs and side firing aftermarkets too. But I will look at adding a quality case fan before replacing a CPU cooler if the CPU is getting too hot. And "too hot" is sitting above 60°C for more than a few seconds as my own personal threshold. Often, simply ensuring the case is doing its job is all that is needed. And adding a quality case fan is much cheaper than a new, quality aftermarket cooler.


----------



## cdawall (Jan 26, 2016)

Oem coolers are crap. Period.


----------



## BiggieShady (Jan 26, 2016)

Bill_Bright said:


> No, my argument is, give today's OEM coolers a chance. Try them out before automatically condemning them and telling users seeking your advice to spend more money without even seeing if they meet the users needs first.
> 
> I am NOT against aftermarket coolers. I am saying I am not against OEM coolers (for most users) either. We use aftermarket coolers in all our HTPC builds. We've used AiOs and side firing aftermarkets too. But I will look at adding a quality case fan before replacing a CPU cooler if the CPU is getting too hot. And "too hot" is sitting above 60°C for more than few seconds as my own personal threshold. Often, simply ensuring the case is doing its job is all that is needed. And adding a quality case fan is much cheaper than a new, quality aftermarket cooler.


That's all fine and dandy since it fits your needs. People running long CPU tasks realize very quickly that they need something better than the stock cooler, and better case airflow than what came with the case. You can argue that statistically they are the rare ones in the community, but not on this forum in my experience.


----------



## suraswami (Jan 26, 2016)

Amazing, anytime it's an AMD thread it goes on and on in no time. Dang, we are good at marketing AMD (posts wise).

Intel thread - http://www.techpowerup.com/forums/t...-certain-workloads-company-issues-fix.219167/

not much talk in it


----------



## OneMoar (Jan 26, 2016)

ill just leave this here 
http://semiaccurate.com/2016/01/22/amds-wraith-cooler-is-as-silent-as-they-claim/?sf19707881=1


----------



## OneMoar (Jan 26, 2016)

39 dB, barely audible in the CES convention center ...
nobody needs fan control right ... we all did without it back in 1993


----------



## cdawall (Jan 26, 2016)

OneMoar said:


> ill just leave this here
> http://semiaccurate.com/2016/01/22/amds-wraith-cooler-is-as-silent-as-they-claim/?sf19707881=1



Ok, now I am curious what CPU they have sitting there with S|A printed over it.


----------



## OneMoar (Jan 26, 2016)

SA is trolling hard core


----------



## newtekie1 (Jan 26, 2016)

Bill_Bright said:


> No, my argument is, give today's OEM coolers a chance. Try them out before automatically condemning them and telling users seeking your advice to spend more money without even seeing if they meet the users needs first.



For builds that are going to run stock, they are fine.  But you've made some bold statements.  You've claimed they are better than OEM coolers from 10 years ago, when they aren't, and you've claimed they are OK for overclocking, when they aren't.

I want you to back those claims up.


----------



## OneMoar (Jan 26, 2016)

https://www.facebook.com/AMDGaming/?fref=nf


----------



## cdawall (Jan 26, 2016)

newtekie1 said:


> For builds that are going to run stock, they are fine.  But you've made some bold statements.  You've claimed they are better than OEM coolers from 10 years ago, when they aren't, and you've claimed they are OK for overclocking, when they aren't.
> 
> I want you to back those claims up.



I agree with this.


----------



## dorsetknob (Jan 26, 2016)

Here is a point for Bill to ponder, though it might distract his line of reasoning and argument:

the cooler that is stock and suitable for, say, Canada/US or Europe will be CRAP if you're in Israel or sub-Saharan Africa etc.
Not only do you have to consider the CPU temps, you ALSO HAVE TO CONSIDER THE AMBIENT AIR TEMP in which you hope your cooler will function.
Often your stock cooler won't function as well as an aftermarket cooler in tropical climates.
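The ambient point is first-order additive thermals: at steady state, a given cooler holds a roughly fixed temperature delta over intake air for a given load, so every extra degree of room temperature shows up directly on the CPU. A tiny sketch of that arithmetic (the 55 C delta and the room temperatures are made-up illustrative numbers, and real deltas shift with fan curves):

```python
def cpu_temp(ambient_c: float, cooler_delta_c: float) -> float:
    # Steady-state estimate: CPU temp = intake air temp + the cooler's
    # load delta (first-order model; assumes the delta is load-fixed).
    return ambient_c + cooler_delta_c

# A hypothetical stock cooler holding a 55 C delta at full load:
print(cpu_temp(22.0, 55.0))  # 22 C office  -> 77.0
print(cpu_temp(40.0, 55.0))  # 40 C tropics -> 95.0, near throttle territory
```

Same cooler, same load: adequate in a temperate office, borderline in a hot room with no aircon.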

OK Bill play the next 8 track tape


----------



## trog100 (Jan 26, 2016)

intel seem to think 100 C is okay for their chips.. bill on the other hand seems to think anything over 60 C for more than a few seconds isnt okay..

i aint even going to attempt to argue how hot too hot is but quite clearly there is some disagreement on this..

old cpus rarely die.. ebay is full of used ones.. low temps (whatever they are) are nice but i dont think for one second they are essential.. i reckon intel would agree with me on this one.. else they would change their ways or we would see more failed cpus..

trog


----------



## OneMoar (Jan 26, 2016)

the biggest factor in operating temp tolerances is the fab process. larger process silicon has a lower temperature threshold, that's all there is to it


----------



## Bill_Bright (Jan 26, 2016)

newtekie1 said:


> For builds that are going to run stock, they are fine. But you've made some bold statements. You've claimed they are better than OEM coolers from 10 years ago, when they aren't, and you've claimed they are OK for overclocking, when they aren't.


And you have misquoted me too.

I said for "mild to moderate" overclocking, OEM coolers "with proper case cooling" are adequate. And they are. That does NOT mean they will pass a prime95 test, however.

And I stick by my claim OEM coolers are better than they were 10 years ago. You noted yourself, they haven't changed much. But they have changed a little. And a little change here and a little change there can make a significant change in effectiveness.

So I claim it does not take much to improve them. Purer raw materials, flatter mating surfaces, better TIM, minor changes in fins for more surface area (a critical criterion), and better fans all make the OEM coolers better. Plus better case cooling today and more efficient CPUs all contribute to today's OEM coolers providing "adequate" cooling. Are they the best coolers? NO! But they don't have to be either.



trog100 said:


> bill on the other hand seems to think anything over 60 C for more than few seconds isnt okay..


Another misquote. I said the threshold for ME is 60°C for when I want better cooling. For this system, for example, when I see my temps sitting above 60°C for more than a few seconds, that tells me I need to clean my filters. After that, temps typically drop down to below 40°C most of the time.

I fully understand CPUs can tolerate much higher temps. That just plays into my claim that OEM coolers are fine for them. Note the SS below for my CPU. 105°C for Tj. Max. But I would never allow my temps to get that high.

I'm running with the OEM cooler on an i7 3770 pushed to 4.2GHz, as seen by Core Temp here:







And, my temps are just fine - but with full disclosure - I have AS5 TIM on there. And note the only fan I hear right now is the low drone of my furnace fan which is down in the basement on the other side of the house.

Use aftermarket coolers if you want. I have no problem with that. What I have a problem with is advisors and fellow experts making blanket statements like (1) "OEM coolers are crap" and (2) automatically telling users to spend more money on aftermarket coolers without at least advising them to try the OEMs first.

I know of nobody who wants to spend money when they don't have to.  If they want to, fine! But if they don't want to, they should be advised to at least try the OEM cooler first.


----------



## cdawall (Jan 26, 2016)

Bill_Bright said:


> And you have misquoted me too.
> 
> I said for "mild to moderate" overclocking, OEM coolers "with proper case cooling" are adequate. And they are. That does NOT mean they will pass a prime95 test, however.
> 
> ...



Did you try running something that even remotely loads the CPU? This is an OEM Dell with an OEM Dell heatsink which is a full cm taller with a much better fan on a non-overclocked 4770. It merely has cpuz on stress mode which I have found to run about as warm as a normal gaming load.


----------



## Bill_Bright (Jan 26, 2016)

You are showing us what? I see an OEM Dell with an OEM cooler. A 3.4GHz CPU running at 3.7GHz at a high but tolerable maximum of 72°C (with a max TDP rating of 84° - lots of room left).  And to the point, it is still running and is still stable. What's your point?

It does not prove advisors should be telling every person coming here that OEM coolers are crap, or that they should chuck their OEM coolers without even giving them a chance.  

Just because most of the regulars here are enthusiasts, that does not mean everyone seeking advice here is.


----------



## cdawall (Jan 26, 2016)

Bill_Bright said:


> You are showing us what? I see an OEM Dell with an OEM cooler. A 3.4GHz CPU running at 3.7GHz at a high but tolerable maximum of 72°C (with a max TDP rating of 84° - lots of room left).  And to the point, it is still running and is still stable. What's your point?
> 
> It does not prove advisors should be telling every person coming here that OEM coolers are crap, or that they should chuck their OEM coolers without even giving them a chance.
> 
> Just because most of the regulars here are enthusiasts, that does not mean everyone seeking advice here is.



The Dell cooler is not an Intel one. I have said that over and over and you keep ignoring it to try and prove that you are right, so I'll say it AGAIN: the Intel cooler sucks, not even OEMs use it, and even those Dell coolers are close to throttling. Then you post skewed results to try and prove your point. I posted a CPU that runs cooler, uses less power, and is clocked 600MHz lower, on a better cooler, at higher temps. That proves nothing except that you are full of it.


----------



## newtekie1 (Jan 26, 2016)

Bill_Bright said:


> I said for "mild to moderate" overclocking, OEM coolers "with proper case cooling" are adequate. And they are. That does NOT mean they will pass a prime95 test, however.



If they won't pass Prime, then they aren't adequate.  Also, they aren't adequate for any overclocking.  The stock cooler that comes with the i3-6100 gets the CPU to 60°C in a well ventilated case.  Any overclocking, even mild, will push it over the edge.  And my 4790K would get to 70°C with the stock cooler in a 650D, which doesn't exactly have inadequate airflow.  These are normal loads, not Prime95; Prime95 on my stock 4790K hit 100°C.



Bill_Bright said:


> And I stick by my claim OEM coolers are better than they were 10 years ago. You noted yourself, they haven't changed much. But they have changed a little. And a little change here and a little change there can make a significant change in effectiveness.



You'll note I said they've changed for the worse.  They are smaller, half the height of the coolers from 10 years ago, with the same number of fins (I counted).



Bill_Bright said:


> So I claim it does not take much to improve them. Purer raw materials, flatter mating surfaces, better TIM, minor changes in fins for more surface area (a critical criteria), better fans all make the OEM coolers better. Plus better case cooling today and more efficient CPUs all contribute to today's OEM coolers providing "adequate" cooling. Are they the best coolers? NO! But they don't have to be either.



Again, the raw materials haven't changed.  Unless they've come out with some super alloy, in which case you have to ask why no other heatsink maker has started using it, the raw material is not going to make a difference.

You're right, surface area is a big factor, and the current cooler has half the surface area of the cooler from 10 years ago.  It is half as tall with the same number of fins.  Surface area has decreased dramatically, which is why the current coolers are in fact worse than the coolers from 10 years ago.

It isn't about them being adequate, I never said they weren't, I said they were terrible(and they are) and they aren't better than coolers from 10 years ago.

You haven't provided any evidence to back up your claims that the coolers now are better than coolers from 10 years ago.



Bill_Bright said:


> I'm running with the OEM cooler on an i7 3770 with pushed to 4.2GHz as seen by Core Temp here



That's nice, now show a screenshot with some actual load.  The CPU completely loaded.


----------



## OneMoar (Jan 26, 2016)

the stock intel cooler under prime will easily hit 80 to 100C (don't believe me? search this board for "high cpu temps" or "are my temperatures ok"). at least once a week we get somebody posting temps while using the stock cooler

under nominal (gaming) load expect temps in the high 60s to low 70s

and it's not just about what the max temp is. temperature affects the cpu's ability to turbo: the lower the temp, the longer/more cores will turbo
at this point ...


----------



## suraswami (Jan 27, 2016)

OneMoar said:


> the stock intel cooler under prime will easily hit 80 to 100c (don't believe me search this board for "high cpu temps or "are my temperatures ok' at least once a week we get somebody posting with temps and using the stock cooler
> 
> under nominal(gaming) load expect temps in the high 60s to low 70's
> 
> ...



Lol that's not even a horse!!


----------



## Kanan (Jan 27, 2016)

newtekie1 said:


> Zen's new marketing:
> 
> "AMD Zen - 9 Out Of 10 People Can't Tell It Isn't An Intel!*"
> *In Microsoft Excel
> ...


The original Phenom (Phenom I), yeah, because it had a bug that forced them to disable some of the L3 cache - it also had too little L3 cache to begin with, and it was clocked low. Phenom II solved all these problems though. I'm hopeful for Zen too, I think it will be a good enough design to get AMD some market share back. At least AMD posted some schematics of Zen and at one point said "it performs as expected". Well, that's just words, but the architecture in the schematics looked like a Sandy Bridge copy, or at least some of the folks here commented that. 



Tsukiyomi91 said:


> another problem about AMD is their CPU coolers.I dunno why but it look really microscopic compared to Intel's if u ask me & they're terrible at keeping temps at bay for an already hot processor chip. Intel's stock cooler at the very least has big surface area & large fan to keep their chips cool.


With that comment you outed yourself as an Intel fanboy, even without the 100 other Intel-biased comments you made here - and the thread itself and its whole reason for existing. All AMD-hating, nothing of real importance. Anyone even slightly informed knows that AMD stock coolers are a LOT better than Intel's, but nice try anyway. Btw, heard of the Wraith cooler? Or of the AIO water coolers AMD boxes with their 9590 CPUs? Does Intel have heatpipe coolers? 3x no. The obvious reason Intel stock coolers are so bad is that Intel is so big and powerful they can just do it and get away with it. Simple. And for the same reason AMD can't do it: their power is pretty limited and always was. 



Bill_Bright said:


> Yeah right. And you call me ignorant? Did you even notice I specifically said (I will *bold* it this time for you), "_Intel isn't even in the *discrete* (*graphics card*) GPU business_." Got a link to a current, PCIe Intel graphics card?
> Ah! I see. So it is okay for you to run a topic OT as long as it suits you. But for me to use "selecting PSUs" as an example (NOT a OT topic) for buying a graphics is not okay because it does not suit you. Right.  And  you call me arrogant?


Don't hurt your head when rolling on the ground (it seems that happened already). lol  I know you said "discrete"; it's not important and I even addressed it. Can't read? And I'm the ignorant one? Cool. And no, PSUs are obviously totally misplaced here as an argument because they have nothing to do with CPUs. GPUs have at least a bit to do with CPUs; PSUs - no.



> I am not ashamed of any thing I said. Nor did I say anything that was in error - so nothing to admit too.
> No more of your name calling! That would be wonderful. Thank you. And I will not respond to your personal affronts any more either.


You are just the guy with over 40 years' experience and a God complex, or an "I know everything better than any other person here" complex, which is obviously the same thing. Am I the only person having problems with your statements? No, more like every 2nd person here. Btw, you strike me as kinda Intel-fanboyish; you are most likely the only person on earth who defends their crappy boxed coolers, and most certainly the only person who thinks they got better over time. hahaha, but thanks for the good laugh this late hour.


----------



## dorsetknob (Jan 27, 2016)

Kanan said:


> The obvious problem why Intel stock coolers are so bad,



i suggest that, as intel's CPUs consume less power over successive generations, it's the Cost-cutting Accountants / nickel-and-dime Bastards that are Driving the slimming down of intel coolers

It Costs less to push out a cooler with no Copper insert, or a smaller lighter copper insert
and less fin area, and as long as it's Adequate at its job this cuts Intel's manufacturing costs, allowing them to maintain or increase profits 

Give the customer less and charge the same or more - it's the corporate way

PS AMD are no Better either


----------



## Kanan (Jan 27, 2016)

dorsetknob said:


> i suggest its   that as intel's CPU's consume less power over successive generations
> its the Cost cutting Accountants / nickel and dime Bastards  that are Driving the slimming down of intel coolers
> 
> It Cost less to push out a cooler with no Copper insert or a smaller lighter copper insert
> ...


True. And btw, I have to admit some AMD coolers are shit too - I have an A10 7850K boxed cooler here behind me, and like the A8 3870K cooler I had here too, it's crap. Basically all their APU coolers are too weak for their APUs to run smoothly at 100% (only aluminium in the coolers). That's why I didn't use them and bought aftermarket coolers directly for the PCs I built for my uncles. However, I've had some good AMD coolers too: my A64 X2 3800+ boxed cooler was really good, I was able to overclock with it up to 40% (2000 -> 2800 MHz), or let's say to the limit I could reach with my RAM (200 -> 240 FSB), and it wasn't really loud either. The cooler boxed with my Phenom II 940 was good enough too; it had 2 heatpipes, so obviously a lot of copper compared to the Intel solutions. Obviously it's a matter of which CPUs you buy from AMD, but let's say, if you buy a high-end CPU from AMD you will get a good boxed cooler with it. If you buy an APU, basically not. I'm not so sure about the lower FX CPUs, but I know the ones boxed with the 8350, 9590 (AIO water), 6300 etc. are good enough. But if you buy an Intel CPU it doesn't matter much; all their coolers are basically somewhat bad. I don't say they are terrible, but I wouldn't call them good when they handicap the turbo of the CPU.


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> The original Phenom (Phenom I) yeah, because it had a bug where they had to disable some of the L3 cache - also it had anyway too few L3 cache and it was clocked low.



Even before the bug the performance was disappointing, the bug just makes it worse.



Kanan said:


> Phenom II solved all these problems though.



It did, but that wasn't my point.  My point was that just because they are full real cores that doesn't mean the performance won't be disappointing.  AMD could still screw it up, it is possible.  I'm hoping they don't, but it is possible.



Kanan said:


> Even the slightest informed knows that AMD stock coolers are a LOT better than Intel ones, but nice try anway.



I'd say the heatpipe cooler is better than anything Intel puts out _today_, but you only get that with a select few processors, the 125w AM3+ processors.  Everything else comes with a crappy aluminium block cooler, which is worse than some of Intel's coolers, and about the same as others.  Even their high end APUs come with a crappy all aluminum heatsink.



Kanan said:


> Btw. heard of the Wraith cooler? Or of the AIO water coolers AMD boxes with their 9590 CPUs? Does Intel have heatpipe coolers? 3x No.



I suspect, just like the current heatpipe cooler, the Wraith is only going to come with a select few processors.  Most will continue to get the crappy AMD heatsink, not the heatpipe/wraith.

And Intel has had their own AiO stock liquid coolers too over the years.

Oh, and lets not forget this bad boy Intel bundled with some of their processors 6 years ago:








(I'd like to see Bill_Bright explain how the current stock coolers are better than this, maybe it is his special alloy)



Kanan said:


> The obvious problem why Intel stock coolers are so bad, is, Intel is so big and powerful, they can just do it and get away with it.



Intel gets away with it because their stock cooler is good enough for stock use.  They don't need to do anything better.



Kanan said:


> And same reason why AMD can't do it, their power is pretty limited and always was.



AMD isn't any better.  The cooler that most of their processors comes with is just as bad as Intel's.


----------



## cdawall (Jan 27, 2016)

newtekie1 said:


> (I'd like to see Bill_Bright explain how the current stock coolers are better than this, maybe it is his special alloy)



magic


----------



## OneMoar (Jan 27, 2016)

cdawall said:


> magic


naa its gotta be ponies
its hilarious how amd is trying to convince people that their new "wraith" cooler is something awesome
its the same god damn cooler with 15 more press-fit fins and a shitty high-blade-pitch low-rpm static-speed fan
fucking trash ... no enthusiast or OEM would be caught dead with that garbage on their cpu
if AMD wanted to impress they should have completely redesigned the cooler with a two-level heat pipe array


----------



## cdawall (Jan 27, 2016)

OneMoar said:


> naa its gotta be ponies
> its hilarious how amd is trying to convince people that there new "wraith'' cooler is something awesome
> its the same god dam cooler with 15 more press fit fins and a shitty high-blade pitch low rpm static speed fan
> fucking trash ... no enthusiast or OEM would be caught dead with that garbage on there cpu
> if AMD wanted to impress they should have completely redesigned the cooler with a two level heat pipe array



They just want their name in the news. Nothing more, nothing less. All press is good press, remember.


----------



## newtekie1 (Jan 27, 2016)

OneMoar said:


> its the same god dam cooler with 15 more press fit fins and a shitty high-blade pitch low rpm static speed fan



So basically the FX-60 cooler, but with a fancy light.  But they've improved so much in the last 10 years!


----------



## cdawall (Jan 27, 2016)

newtekie1 said:


> So basically the FX-60 cooler, but with a fancy light.  But they've improved so much in the last 10 years!



Hey now it has a larger fan


----------



## OneMoar (Jan 27, 2016)

cdawall said:


> Hey now it has a larger fan


and more chintzy fins. Oh, and to save weight they removed fan control


----------



## newtekie1 (Jan 27, 2016)

cdawall said:


> Hey now it has a larger fan



Not really compared to the FX-60 cooler, it had a thick 80mm fan on it too.  It was later that they changed to the thin 60mm fan that was a lot louder.






This was the original heatpipe cooler.


----------



## trog100 (Jan 27, 2016)

"Intel gets away with it because their stock cooler is good enough for stock use. They don't need to do anything better."

that about sums it up.. they do what they are intended to do.. reliably keep an intel cpu at temps that will not do any harm or shorten its life.. 

what the f-ck more are they supposed to do.. the only problem i see here is that some just cant grasp the simple fact that the temps intel seem to think are okay are actually okay for real.. intel design their chips and coolers to work up to 100 C.. 

my stock 4790K with its stock intel cooler running small block prime95 quite easily hit 100 C and throttled.. i would guess pretty much as intel intended.. why some have problems with this i havnt a clue.. quite clearly intel are not having problems.. 

the "enthusiast" game seems to have two main components.. one is run a cpu faster than intel intend it to be run at and the other is to run a cpu cooler than intel intend it to be run at..

neither component matters one jot to the vast majority of people that buy or use intel based systems.. dare i call such people "normal".. all "normal" people want is for the bloody things to work.. being as they do work what is the problem.. 

old cpus never die they just get thrown away.. the cpu is probably the most long lasting part of a PC.. 

trog


----------



## cdawall (Jan 27, 2016)

trog100 said:


> my stock 4790K with its stock intel cooler running small block prime95 quite easily hit 100 C and throttled.. i would guess pretty much as intel intended.. why some have problems with this i havnt a clue.. quite clearly intel are not having problems..



Wait what? So it can't run programs and that's ok



newtekie1 said:


> Not really compared to the FX-60 cooler, it had a thick 80mm fan on it too.  It was later that they changed to the thin 60mm fan that was a lot louder.
> 
> 
> 
> ...



I always forget that little bastard was different. I still think the wraith is thicker.


----------



## HumanSmoke (Jan 27, 2016)

cdawall said:


> Wait what? So it can't run programs and that's ok


Come on, stop baiting the guy. How many people do you know that run small FFT Prime95 as anything other than a stress test? Next thing you'll be telling us your favourite entertainment is watching LinX running.

I think you'll find that the Intel HSF isn't the only cooler that allows operation up to the chips design specification. Doesn't the reference 290/290X at default do much the same thing at ~95C ?


----------



## OneMoar (Jan 27, 2016)

look familiar


----------



## suraswami (Jan 27, 2016)

newtekie1 said:


> Not really compared to the FX-60 cooler, it had a thick 80mm fan on it too.  It was later that they changed to the thin 60mm fan that was a lot louder.
> 
> 
> 
> ...



Those coolers also had 2 or more fan manufacturers; CM was the best for low noise, and even on the later 60mm version the CM was best.  I even pulled a loud fan out (for a friend) and replaced it with a spare CM fan that came with my cooler (I didn't use mine, I went AIO).


----------



## trog100 (Jan 27, 2016)

"Wait what? So it can't run programs and that's ok"

prime95 is pretty much only used as a stress tester.. sometimes as a cooler comparison tester.. if i run it now with my pretty good top-flow cooler and de-lidded chip with the case side off (just to make it hard for you) i will still see over 85 C or close to 65 C over room ambient..

4790k at 4.6 gig 1.26 vcore chip delidded 3 C over room ambient while browsing.. up to 65 C over room ambient running silly programs like prime95..

intels own burn in test will do pretty much the same thing.. time you figured out the purpose of certain programs.. they are designed to torture a chip and they do.. 

anyone that lives in hotter climes without aircon will see much higher tempts than i do no matter what cooler is used.. tis what boost and temp throttling is all about..

the only point i am trying to make is that over cooling goes along with overclocking.. as for bill and his 40 C all i can say is nonsense.. said in a polite way of course.. 

trog
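trog's "over room ambient" framing is just a subtraction, but it's the reason his numbers travel between climates. A throwaway sketch (figures are illustrative, not measurements):

```python
# Delta-over-ambient: the cooler-independent way to compare temps.
# All numbers below are illustrative, not measured.
def delta_over_ambient(core_temp_c: float, room_temp_c: float) -> float:
    """Temperature rise above room ambient, in degrees C."""
    return core_temp_c - room_temp_c

# A chip reading 85 C in a 20 C room is 65 C over ambient...
print(delta_over_ambient(85.0, 20.0))   # 65.0
# ...and the same chip at the same load in a 35 C room (no aircon)
# would read ~100 C, which is why hot climates hit throttle points sooner.
print(delta_over_ambient(100.0, 35.0))  # 65.0
```

Same delta, very different absolute reading, which is the point: comparing coolers by absolute temperature alone says as much about the room as about the cooler.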


----------



## Kanan (Jan 27, 2016)

newtekie1 said:


> Even before the bug the performance was disappointing, the bug just makes it worse.
> 
> 
> 
> It did, but that wasn't my point.  My point was that just because they are full real cores that doesn't mean the performance won't be disappointing.  AMD could still screw it up, it is possible.  I'm hoping they don't, but it is possible.


Yeah, like I said, clocks were bad too with Phenom I. And of course they could screw it up, that's always a possibility, but I don't think they are lying when they say "it performs as expected" - beyond that, there are a lot of facts that make it hard to believe it will be a failure again. Plus, this time it's do or die: no more failures allowed. I think they must deliver this time, with Zen, and I think they will.




> I'd say the heatpipe cooler is better than anything Intel puts out _today_, but you only get that with a select few processors, the 125w AM3+ processors.  Everything else comes with a crappy aluminium block cooler, which is worse than some of Intel's coolers, and about the same as others.  Even their high end APUs come with a crappy all aluminum heatsink.


True, I suspected as much. 



> I suspect, just like the current heatpipe cooler, the Wraith is only going to come with a select few processors.  Most will continue to get the crappy AMD heatsink, not the heatpipe/wraith.


Well that was to be expected, I guess.



> And Intel has had their own AiO stock liquid coolers too over the years.


I totally forgot about that. I think it was bundled with their 6-cores, or was at least recommended for them? That's why the 3930K or 3960X, for example, was always WOF: their stock coolers would simply be overwhelmed by its high TDP.



> Oh, and lets not forget this bad boy Intel bundled with some of their processors 6 years ago:
> 
> 
> 
> ...


hahaha, forgot that too. Was it bundled with 1st gen i7? Or just the 6 core parts? I can't remember exactly. 



> Intel gets away with it because their stock cooler is good enough for stock use.  They don't need to do anything better.


Well, if a CPU can't really turbo, or gets handicapped doing it, it's not exactly working as intended I'd say, but that's a matter of point of view.



> AMD isn't any better.  The cooler that most of their processors comes with is just as bad as Intel's.


I guess with the Wraith cooler they will be better. Also, Intel has no AIO water cooling (now); AMD does.



OneMoar said:


> naa its gotta be ponies
> its hilarious how amd is trying to convince people that there new "wraith'' cooler is something awesome
> its the same god dam cooler with 15 more press fit fins and a shitty high-blade pitch low rpm static speed fan
> fucking trash ... no enthusiast or OEM would be caught dead with that garbage on there cpu
> if AMD wanted to impress they should have completely redesigned the cooler with a two level heat pipe array


I don't think they did that. They just made a video that proved it's a lot better than their old boxed coolers and that's it. Nobody at AMD said it's something "awesome". But it's good enough (for a free boxed cooler), we should admit that at least.


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> hahaha, forgot that too. Was it bundled with 1st gen i7? Or just the 6 core parts? I can't remember exactly.



It started coming with the 980x, and continued through the first generation of 2011 processors.



Kanan said:


> Well if a CPU can't really turbo or gets handycapped doing it, it's not exactly working as intended I'd say, but this is open to point of view.



Turbo is supposed to be a quick boost, and that is what it does with the stock cooler.  The processor isn't intended to run a full turbo speed for long periods of time.  With the stock cooler the processor will turbo, but then settle back down to the rated clock speed if the load goes on for a while.  AMD's processors do this with their stock heatsinks as well.
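The settle-back behaviour described above can be sketched as a crude feedback loop. To be clear, this is an illustration with made-up constants, not Intel's actual turbo algorithm:

```python
# Crude sketch of turbo settling under sustained load: the chip holds
# boost clocks until die temperature nears its limit, then drops back
# to the rated base clock. All constants are invented for illustration.
BASE_MHZ, TURBO_MHZ, T_LIMIT = 3600, 4000, 100.0

def next_state(clock_mhz: float, temp_c: float) -> tuple[float, float]:
    heat_in = 0.012 * clock_mhz        # heating scales with clock speed
    heat_out = 0.6 * (temp_c - 25.0)   # cooling scales with delta to ambient
    temp_c += 0.1 * (heat_in - heat_out)
    # Throttle from turbo back to base clock near the temperature limit.
    clock_mhz = BASE_MHZ if temp_c >= T_LIMIT - 5 else TURBO_MHZ
    return clock_mhz, temp_c

clock, temp = TURBO_MHZ, 40.0
for step in range(600):                # sustained all-core load
    clock, temp = next_state(clock, temp)
print(clock, round(temp, 1))           # prints: 3600 97.0
```

The quick boost happens (it starts at turbo), heat builds, and the steady state lands at the base clock just under the limit, which is the behaviour both Intel and AMD stock heatsinks produce.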



Kanan said:


> I guess with the wraith cooler they will be better. Also Intel has no AIO water cooling (now), AMD has.



Not if they only bundle it with the high end processors like they do now.  Plus, Wraith seems like a noise improvement and not really a cooling performance improvement.  But then again, maybe Zen will put out less heat.


----------



## Kanan (Jan 27, 2016)

> Not if they only bundle it with the high end processors like they do now. Plus, Wraith seems like a noise improvement and not really a cooling performance improvement. But then again, maybe Zen will put out less heat.


Still, Intel doesn't bother with a "premium" cooler for their top-end CPUs; AMD does. That IS better, I'd say. And of course it's a noise improvement AND a performance improvement: it can run at higher RPMs too, the fan is simply bigger, and the cooler itself is better. So it simply must perform better.


----------



## RCoon (Jan 27, 2016)

A whole page about stock coolers, stemming from a thread that's originally about Intel receiving "hackusations" from AMD because of synthetic benchmarks.

This place is starting to look like the YouTube comments section.


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> Still, Intel doesn't bother with having a "premium" cooler for their top end CPUs, AMD does.



AMD doesn't bundle any heatsink with their top end CPUs.  Just like Intel.  The only reason the heatpipe cooler is bundled with some processors is because it is necessary to cool them.  If AMD could use the crappy heatsink, they would, and do.



Kanan said:


> That IS better I'd say. And of course it's a noise improvement AND a performance improvement, because they can run on higher RPMs too, the fan is simply bigger, the cooler itself, better. So it must be better performing, simply.



The fan doesn't get faster; they demoed it at full speed, and it is slow. That is how they managed to keep it quiet.


----------



## cdawall (Jan 27, 2016)

AMD does offer the 9370/9590 with an AIO water cooler.


----------



## Kanan (Jan 27, 2016)

newtekie1 said:


> AMD doesn't bundle any heatsink with their top end CPUs. Just like Intel. The only reason the heatpipe cooler is bundled with some processors is because it is necessary to cool them. If AMD could use the crappy heatsink, they would, and do.


They do. The FX 9590 is bundled with water cooling. Also, I'm referring to the FX 8370 or 8350, because the 9590/9370 is just silly. And you don't know what AMD would do if their CPUs had a lower TDP. I think they would bundle their CPUs with Wraith coolers, and that's not the same shit Intel is using. Basically you are contradicting yourself by saying they only use Wraith on the high end, then saying they would use crap coolers if they could. Both can't be true. AMD is not Intel, accept it.


----------



## Bill_Bright (Jan 27, 2016)

newtekie1 said:


> If they won't pass prime, then they aren't adequate.


That's just silly. My truck won't run at 120MPH, uphill, for 10 hours straight either, so the stock cooling must be inadequate too. 

And my example of doing your homework when purchasing PSUs does apply - unlike the claim about "integrated" Intel GPUs. We buy separate PSUs. We buy separate graphics cards. We don't buy separate "integrated" graphics. So yeah, that was ignorant.

trog100 has it right (my *bold* added), 





			
trog100 said:

> "Intel gets away with it because their stock cooler is good enough for stock use. They don't need to do anything better."
> 
> that about sums it up.. they do what they are intended to do.. reliably keep an intel cpu at temps that will not do any harm or shorten its life..
> 
> *the only problem i see here is that some just cant grasp the simple fact that the temps intel seem to think are okay are actually okay for real*.. intel design their chips and coolers to work up to 100 C..



You just hit the nail on the head. As you noted, the only problem here is that some people cannot grasp, or refuse to accept, simple facts! They believe their extreme ways are how everyone uses their computers (hence the "must pass Prime95" comment). Or they grasp at straws, exceptions like living in a desert with no AC. And they are so convinced of their ways, or afraid I might be right, that they are unwilling to even suggest the OEM coolers users have already paid for be tried first. Why? Again, because they want to believe everyone seeking advice here is building a system that must pass a scenario most users will NEVER encounter in the real world: a test that is intentionally designed to abuse systems, a test designed to determine breaking points.

I will clarify one point I said earlier for most OEM coolers. I was in error about better "alloys" on today's OEM coolers. That was wrong and I apologize for that.

What is not wrong is:

- The raw materials used are purer - that is, purer copper and aluminum that allows for better conduction of heat.
- Mating surfaces of both the heatsink and CPU die lid are more perfectly flat for better conduction of heat.
- Purer materials and flatter surfaces result in fewer microscopic pits and valleys in the mating surfaces for better conduction of heat.
- Fin size, shape and layout on today's coolers allow for more surface area for better conduction of heat.
- TIM (even OEM TIM pads) is more advanced for better conduction of heat.
- CPUs are more efficient, generating less heat in the first place.
- And finally, fans and case cooling are better for better extraction of heat.

You guys can deny all that if you like. But all those factors work together to allow today's coolers to provide "adequate" cooling with normal clocking. That's adequate! I have never said "outstanding", "excellent" or even "good" (though I might have said "good enough", but that means adequate).

So, until someone can show REAL tests or reviews to support their claims that all OEM coolers are crap and cannot perform their designed function and therefore must be replaced with an aftermarket cooler, I stick by my claim that OEM coolers today are "adequate" for most users, even with mild to moderate overclocking - assuming a properly configured case that is not blanketed with a layer of heat trapping dust, running abusive benchmarking programs, or sitting out in the desert sun!
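Most of this adequacy argument reduces to one back-of-envelope formula: die temperature ≈ ambient + watts × the cooler's thermal resistance. A sketch with hypothetical resistance values (not manufacturer figures):

```python
# Back-of-envelope adequacy check: T_die ~= T_ambient + P * R_theta,
# where R_theta (C per watt) is the cooler's total thermal resistance.
# The resistance values below are hypothetical, not measured specs.
def die_temp(ambient_c: float, watts: float, r_theta_c_per_w: float) -> float:
    return ambient_c + watts * r_theta_c_per_w

STOCK, TOWER = 0.5, 0.25   # hypothetical R_theta: stock vs tower cooler

# 65 W chip at stock clocks in a 25 C room: the stock cooler is adequate.
print(die_temp(25, 65, STOCK))    # 57.5
# Same chip overclocked to ~130 W: the stock cooler runs near the limit...
print(die_temp(25, 130, STOCK))   # 90.0
# ...while a tower cooler keeps a comfortable margin at the same power.
print(die_temp(25, 130, TOWER))   # 57.5
```

Under these assumptions both sides can be right: "adequate at stock" and "crap for sustained overclocked loads" are the same cooler at different wattages.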


----------



## cdawall (Jan 27, 2016)

Bill_Bright said:


> That's just silly. My truck won't run at 120MPH, uphill, for 10 hours straight either, so the stock cooling must be inadequate too.



Mine will. So will my cobra, maybe you just have a shitty truck


----------



## Kanan (Jan 27, 2016)

Bill_Bright said:


> That's just silly. My truck won't run at 120MPH, uphill, for 10 hours straight either, so the stock cooling must be inadequate too.
> 
> And my example of doing your homework when purchasing PSUs does apply - unlike the claim about "integrated" Intel GPUs. We buy separate PSUs. We buy separate graphics cards. We don't buy separate "integrated" graphics. So yeah, that was ignorant.


Being called ignorant by an ignorant person is somewhat senseless, sorry. I can get a different integrated GPU by simply buying another CPU; you are simply too shortsighted to see that, I guess. I can even skip integrated graphics entirely by going LGA2011 or AMD, and I already did that. And you can talk like a teacher all day about "homework" here; for all I know, my understanding of PCs is better than yours, old man. I don't need your advice at all. Keep up this tone and my answers will get bolder every time. I'll just answer accordingly.



> I will clarify one point I said earlier for most OEM coolers. I was in error about better "alloys" on today's OEM coolers. That was wrong and I apologize for that.


Wow, we should really frame that, because it's maybe the first time ever he admitted being wrong here.



> What is not wrong is,
> 
> The raw materials used are purer - that is, purer copper and aluminum that allows for better conduction of heat.
> Mating surfaces of both the heatsink and CPU die lid are more perfectly flat for better conduction of heat.
> ...


Where is the proof of that? The raw materials are purer? Are you sure? I'm not convinced at all, and I will not just take your word for it. More flat? I don't think that's true; judging by my own experience nothing changed at all there. Fewer microscopic pits and valleys? Hahaha, that's just the point where it gets silly. Maybe so, but I don't think it makes any real difference. And maybe fin size, shape and layout got better on aftermarket coolers, but not so much on Intel boxed ones. Again, where's the proof? As I see it, Intel coolers are cheap, cheaper, the cheapest; they don't care much about their coolers. They just made the simplest trash, just good enough for the CPU not to overheat IF the PC is cooled well enough by the case fans and IF the environment isn't too hot. Why? Because they can: their CPUs advanced so much that they simply don't need good coolers, so Intel didn't invest money to advance them any further.



> CPUs are more efficient, generating less heat in the first place,
> And finally, fans and case cooling are better for better extraction of heat.


Thanks for enlightening us with your special insight that CPUs get better every year. But I don't see where OEM PCs have better case cooling; that's just words. As I see it they are just as bad as before. It's just that CPUs don't need as much cooling as they did in the P4 era, and that helped OEM computers with their simple design and cooling. That's it.

PS. I'm talking about average OEM computers here, not special gaming PC's, I hope that was obvious enough.


----------



## Bill_Bright (Jan 27, 2016)

Oh, I've been wrong before - and admitted it. There's the difference. I am not afraid to admit it when I see real evidence presented. And unlike you, I do not call others names. I said your comment was ignorant, not you.

Since again, no one has shown us any study, professional review site, or white paper that shows all OEM coolers today are "crap" and automatically need replacing, I am done wasting my time. But to be sure, I will continue to inform users of all the facts so they can make their own informed decisions and give them the opportunity to at least try the OEM coolers (which they paid good money for) to see if they meet their needs - instead of imposing biased beliefs on them with blanket statements and scenarios that do not represent real world, or potentially the user's scenarios either.

I will ensure their cases are properly configured and clean before telling them to replace their coolers. I will ensure their temps are within "normal operating ranges". I will not assume what is right for me is right for everyone. To be sure, I do have an open mind and fully accept there are exceptions where OEM coolers are not "adequate". I have never EVER denied that - contrary to what we have seen in others here as trog noted. 

Good day.


----------



## cdawall (Jan 27, 2016)

Bill_Bright said:


> Oh, I've been wrong before - and admitted it. There's the difference. I am not afraid to admit it when I see real evidence presented. And unlike you, I do not call others names. I said your comment was ignorant, not you.
> 
> Since again, no one has shown us any study, professional review site, or white paper that shows all OEM coolers today are "crap" and automatically need replacing, I am done wasting my time. But to be sure, I will continue to inform users of all the facts so they can make their own informed decisions and give them the opportunity to at least try the OEM coolers (which they paid good money for) to see if they meet their needs - instead of imposing biased beliefs on them with blanket statements and scenarios that do not represent real world, or potentially the user's scenarios either.
> 
> ...



What facts have you posted? You posted some skewed temperature numbers and then what, 10-15 posts of pure FUD?


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> They do. FX 9590 is bundled with water cooling.



Not anymore they don't. http://www.newegg.com/Product/Product.aspx?Item=N82E16819113347
The liquid cooler bundle was a limited time thing when the processor was first released.



Kanan said:


> Also I'm referring to the FX 8370 or 8350 because the 9590/9370 is just silly.



Sure, but those aren't high end processors anymore.



Kanan said:


> And you don't know what AMD would do if their CPUs would have a lower TDP. I think they would bundle their CPUs with Wraith Coolers and that's not the same shit Intel is using - basically you are contradicting yourself by saying they only use wraith on highend and then say they would use crap coolers if they could. Both isn't possible. AMD is not Intel, accept it.



I do know, because they are already bundling shit coolers with their lower TDP processors.  Every APU comes with a crappy aluminium cooler, their low end FX processors even come with the crap aluminium heatsink.  If it isn't 125w, it doesn't get a heatpipe cooler.  There is nothing even hinting that Wraith will be any different.

And how is it not possible for them to bundle Wraith with the high end, and crap with the low end? *THAT IS WHAT THEY ARE ALREADY DOING!*


Bill_Bright said:


> That's just silly. My truck won't run at 120MPH, uphill, for 10 hours straight either, so the stock cooling must be inadequate too.



It is.  Because when you take that truck up into the mountains, the engine will die.  Or do what modern cars/trucks do, go into reduced power mode because of the heat.

But Prime isn't like running a car at 120mph for 10 hours uphill. That is something the car will never see. Prime95 is actually a load the processor will see. All my processors see load like that 24/7, because they all either fold or crunch. Prime95 w/ Small FFTs is unrealistic, but normal Prime95 with the default settings is not.
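For anyone who wants to see what "load like that 24/7" means without installing anything, here is a hedged sketch of a Prime95-style all-core burn using only the Python standard library. It is not Prime95 itself, just a way to generate comparable sustained load; run it knowingly:

```python
# Minimal all-core CPU burn, loosely in the spirit of a Prime95 torture
# test: pin every core at 100% with pointless math for a fixed time.
# This is NOT Prime95, just an illustrative sustained-load generator.
import multiprocessing as mp
import time

def burn(seconds: float) -> int:
    """Spin on big-integer math until the deadline; returns iteration count."""
    deadline = time.monotonic() + seconds
    n, x = 0, 3
    while time.monotonic() < deadline:
        x = pow(x, 65537, 2**521 - 1)  # modular exponentiation keeps the ALU busy
        n += 1
    return n

if __name__ == "__main__":
    secs = 0.5  # raise this (e.g. 3600) for a real thermal soak
    with mp.Pool(mp.cpu_count()) as pool:
        counts = pool.map(burn, [secs] * mp.cpu_count())
    print(f"{len(counts)} workers finished, ~{sum(counts)} iterations total")
```

Watching temperatures while this runs (versus idle) shows exactly the folding/crunching scenario described above, minus Prime95's correctness checks.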



Bill_Bright said:


> The raw materials used are purer - that is, purer copper and aluminum that allows for better conduction of heat.



You have no way of proving that, and it isn't even logically correct.  The techniques for processing copper and aluminium haven't changed much at all in 10 years.  And even if they had, the minor difference in purity would not suddenly make a cooler that is half the size with half the surface area suddenly be capable of better cooling.



Bill_Bright said:


> Mating surfaces of both the heatsink and CPU die lid are more perfectly flat for better conduction of heat.



Again, not the case either.  The IHS on the old processors used to be curved, but the heatsink base was flat. If you are going to say "then why did we lap them", well I didn't see many people lapping stock heatsinks, but I saw a bunch lapping aftermarket heatsinks that were already flat.  But even still, this isn't going to make up for the massive loss in surface area.



Bill_Bright said:


> Fin size, shape and layout on today's coolers allow for more surface area for better conduction of heat.



The fin size is half of what it was before, the shape and layout is exactly the same.  My pictures showed exactly this.  So nope.



Bill_Bright said:


> TIM (even OEM TIM pads) is more advanced for better conduction of heat.



TIM can be changed. We are talking about the design of the cooler, not the TIM.  And even still, I'm pretty sure Intel has been using the same TIM on their heatsinks for years.



Bill_Bright said:


> CPUs are more efficient, generating less heat in the first place,



That has nothing to do with your claims that current stock coolers are better than coolers from 10 years ago.



Bill_Bright said:


> And finally, fans and case cooling are better for better extraction of heat.



Again, nothing to do with the actual CPU coolers now being better than 10 years ago.



Bill_Bright said:


> Since again, no one has shown us any study, professional review site, or white paper that shows all OEM coolers today are "crap" and automatically need replacing, I am done wasting my time.



You're the one that seems to think we are saying they need to be replaced.  I've stated several times that they are fine for stock use.

You are the one making the claims that they are better than 10 years ago and allow for overclocking.  When are you going to back those claims up with any type of evidence?  Just because they are adequate doesn't mean they are better than 10 years ago.  They were adequate back then too.  My Toyota Yaris is adequate enough to get me to work and back, it is still a piece of shit.


----------



## cdawall (Jan 27, 2016)

newtekie1 said:


> It is. Because when you take that truck up into the mountains, the engine will die. Or do what modern cars/trucks do, go into reduced power mode because of the heat.



Unless you are towing and trying to do 120mph uphill, the truck still isn't going to overheat. Again, I have a truck that can and has done roughly that on a trip to Vegas, and again on the trip to Colorado. Plenty of mountain passes; 120-130mph was the GPS's "average speed". You could also consider that truck to be overclocked on stock cooling: it makes 400whp and the turbos are cranked to 22PSI. All stock cooling, and even in Texas during the summer it doesn't overheat (short of really heavy-load passes followed by stopping in traffic; IATs are bad at that point).



newtekie1 said:


> But Prime isn't like running a car at 120mph for 10 hours uphill. That is something the car will never see. Prime95 actually is load the processor will see. All my processors see load like that 24/7, because they all either fold or crunch. Prime95 w/ Small FFTs is unrealistic, but just normal Prime95 with the default settings is not.



Prime with small FFTs would be like trying to do 120mph uphill with a trailer, but in all reality, who says a car won't see a load like that? Throw a trailer behind a truck and drive 80mph; it will be harder on the cooling system than doing 120 empty.

To me this comment just shows another thing bill knows nothing about.


----------



## Bill_Bright (Jan 27, 2016)

You want me to prove unicorns don't exist! My facts are your lack of evidence for your unfounded claims!

Where are the millions and millions of over heating computers that must be out there if your claims are true?

You (collectively) started this OT discussion by claiming OEM coolers are "crap" and need to be replaced regardless. You collectively are STILL making such blanket statements with newtekie1 just claiming "Every APU comes with a crappy cooler".

Yet you have shown no professional review site like TPU, Tom's Hardware, Overclocker's, Hard OCP, Anandtech, etc. where they tested and concluded all OEM coolers are "crap" and automatically need to be replaced by all users. No IT site like ZDNet, TechRepublic, CNET has made such a claim. Why not?

Your claims are supported by silly, flimsy excuses like "must pass Prime95" tests, or survive in hot desert environments with no air conditioning.  You show an overclocked Dell with an aftermarket cooler running over 70°C and pretend that proves your point.  Yeah right.

You guys are implying you are all much smarter and more qualified than the extremely well funded developers, researchers and designers at Intel and AMD - suggesting they don't have a clue what they are doing. Show us the evidence your exceptions, your extreme and enthusiasts examples make the rule for everyone - and that all OEM coolers are "crap" and "terrible" and automatically need to be replaced.


----------



## newtekie1 (Jan 27, 2016)

Bill_Bright said:


> My facts are your lack of evidence for your unfounded claims!



Umm...you are the one making claims here.

You want facts that the stock cooler is crap? How about the fact that it lands dead last in pretty much every heatsink comparison:







So, now where is your evidence that CPU coolers are better now than they were 10 years ago? I'm waiting.


----------



## Bill_Bright (Jan 27, 2016)

No, YOU specifically said passing Prime is a must. And YOU specifically just said all APU coolers are crappy.

My claim is OEM coolers are adequate in a properly cooled case.


----------



## dorsetknob (Jan 27, 2016)

Bill_Bright said:


> You show an overclocked Dell with an aftermarket cooler running



"OEM cooler for DELL" bill



Bill_Bright said:


> That's just silly. My truck won't run at 120MPH, uphill, for 10 hours straight



A 1200-mile hill? Tell me where on earth that is, please.

If you say silly things then you're going to be picked up on that!


----------



## cdawall (Jan 27, 2016)

Bill_Bright said:


> You show an overclocked Dell with an aftermarket cooler running over 70°C and pretend that proves your point.  Yeah right.



OEM Dell cooler on a locked 4770 - what overclocking are you talking about? That CPU has turbo; it was operating 200MHz shy of its peak turbo (with all 4 cores loaded, that is the expected speed; this is Intel spec). It was running a simple CPU-Z benchmark, which isn't going to load the CPU like Prime would, or video editing, or CAD rendering; the closest thing I've found it to is a gaming load.


----------



## newtekie1 (Jan 27, 2016)

Bill_Bright said:


> No, YOU, specifically said passing Prime is a must. And YOU specifically just said all APU coolers are crappy.



Yes, passing Prime is a must to consider the cooler adequate. The stock cooler can actually do that. But, again, I never said the stock cooler wasn't adequate; I said it was crap. There is a difference.



Bill_Bright said:


> My claim is OEM coolers are adequate in a properly cooled case.



No, your claim is that stock coolers today are better than they were 10 years ago, and your basis for that was that they allow overclocking.

Again, your claims from your mouth:



Bill_Bright said:


> OEM coolers today get a bad rap because OEM coolers of yesteryear were lousy.



And in response to me saying they've gotten worse over the years you said:



Bill_Bright said:


> Sorry, but I don't agree with that at all. They are quieter and more efficient. 10 years ago, even with mild overclocking, the OEM coolers (with AMD or Intel) would not be sufficient.



So, your claims are:

- OEM coolers are better now than 10 years ago
- They are quieter
- They are more efficient

Back those claims up. And simply the fact that you can slightly overclock the processor with a stock cooler isn't proof of any of that. I could, and did, do that with stock coolers 10 years ago.


----------



## cdawall (Jan 27, 2016)

As for the "APU coolers suck" thing, this one is fine. Zero noise, a tiny aluminum thing that could probably even run passive. It is also the only fan in the entire case.


----------



## Bill_Bright (Jan 27, 2016)

dorsetknob said:


> 1200 mile hill tell me where on earth that is please
> 
> if you say silly things then your going to be picked up on that !


Well, if that is what you think is important in this debate and what you value as your contribution, then so be it.


newtekie1 said:


> the fact that you can slightly overclock the processor with a stock cooler isn't proof of any of that. I could and did do that with stock coolers 10 years ago.


  Thanks. That just made my day.


----------



## Kanan (Jan 27, 2016)

> Sure, but those aren't high end processors anymore.


Then AMD has no high-end processors left. The FX 9590 / 9370 are simply not an option for most users, and I wouldn't exactly call them high end anyway - they are closer to i5 level than i7 level, and light-years short of being comparable to Intel's 6- or 8-cores. But anyway, that's still what I was referring to.



> I do know, because they are already bundling shit coolers with their lower TDP processors.  Every APU comes with a crappy aluminium cooler, their low end FX processors even come with the crap aluminium heatsink.  If it isn't 125w, it doesn't get a heatpipe cooler.  There is nothing even hinting that Wraith will be any different.


What you say is just an estimate; you don't KNOW it (should I look up "knowledge" on Wikipedia for you?). Can you see the future? No. And on top of it all, you are freely comparing their APU line with their CPU line AND making an estimate about future processors on top of that. That's too much for anyone to KNOW. This is kind of the end of the story for me here; you can speculate further if you want, but I don't see the point in discussing this any more.
---


> Oh, I've been wrong before - and admitted it. There's the difference. I am not afraid to admit it when I see real evidence presented. And unlike you, I do not call others names. I said your comment was ignorant, not you.


Again, it's not important what words you use to say things. I can read between the lines, and calling someone ignorant about one thing, again and again, is the same as calling him personally ignorant.



> I will ensure their cases are properly configured and clean before telling them to replace their coolers. I will ensure their temps are within "normal operating ranges". I will not assume what is right for me is right for everyone. To be sure, I do have an open mind and fully accept there are exceptions where OEM coolers are not "adequate". I have never EVER denied that - contrary to what we have seen in others here as trog noted.


So you would match an i7 4790K with a boxed cooler? If yes, that would just be silly. You are overly defending these trash coolers. Nobody here said they aren't "just enough" to run a system, but you are talking them up beyond what they really are.



> You guys are implying you are all much smarter and more qualified than the extremely well funded developers, researchers and designers at Intel and AMD - suggesting they don't have a clue what they are doing. Show us the evidence your exceptions, your extreme and enthusiasts examples make the rule for everyone - and that all OEM coolers are "crap" and "terrible" and automatically need to be replaced.


I didn't claim any of that. I just claimed the Intel coolers could be better, I said they aren't really good, and I said the AMD ones are better; that's it. Stop generalizing.



> My claim is OEM coolers are adequate in a properly cooled case.


But they aren't. I tried using a stock cooler on an A8 3870K back then, and the CPU was overheating; it was too small to cool it properly, and the case was OPEN, so it had more than enough fresh air. After I installed a proper cooler everything was fine. That was the last time I trusted these crappy APU coolers. On the A10 7850K I didn't take the chance; I simply ordered it with an extra cooler from Arctic, and that's it.


----------



## dorsetknob (Jan 27, 2016)

SO SORRY BILL  
i made my contribution earlier and it was on topic
You're the one who started the TRUCK CRAP in the thread 
Way to go Bill


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> Then AMD has no high end processors left.



Pretty much.



Kanan said:


> What you say is just a estimation of things, but you don't KNOW it (should I search up "knowledge" in wikipedia for you?). Can you see the future? No. And on top of it all you are freely comparing their APU line with their CPU line AND doing a estimation of future processors on top of it, too. Thats too much and too much for anyone to KNOW. This is kinda end of story for me here, you can speculate any further if you want I don't see the point in discussing this further however.



You don't know either. But it is best to assume they will continue doing what they are doing today: bundling the cheapest heatsink they can get away with. They also bundled those crappy aluminium heatsinks with their FX processors, not just the APUs, FYI. Your speculation is just as good as mine, except mine is based on history and what they currently do; yours is based on nothing but hopes and dreams.


----------



## Kanan (Jan 27, 2016)

newtekie1 said:


> Pretty much.
> 
> 
> 
> You don't know either.  But it is best to assume they will continue to do what they are doing today.  Bundle the cheapest heatsink they can get away with.  They also bundled those crappy aluminium heatsinks with their FX processors too, not just the APUs.  FYI.  Your speculation is just as good as mine, except mine is based on history and what they currently do, your's is based on nothing other than hopes and dreams.


Hopes and dreams? I couldn't care less about boxed coolers. They aren't in my "hopes and dreams"; nice joke man, I had a good laugh at it.  It's simply not worth using a boxed cooler when you can buy a decent aftermarket one for 8 to 20€. When I bought my Phenom II 940 seven years ago, I bought a cheap cooler from Arctic with it for 20€ (Freezer 64). It was more than enough even with overclocks, and it was quiet. I never used the boxed one, and that was the "good" one with heatpipes and copper. In mid 2015, when I had the chance to use the boxed Intel cooler on an i5 4460 I bought for a friend, I simply bought a CM 212 Evo with it, just to avoid that loud crap; my friend tried to talk me into using it, but I refused. I think everyone sees my point. Boxed coolers are simply not worth it, just to save 10 to 20 bucks.


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> Hopes and dreams? I couldn't care less about boxed coolers. They aren't in my "hopes and dreams" nice joke man, I had a good laugh on it.



Your hopes and dreams are that somehow you can find something that shows AMD is better than Intel.  That AMD isn't just a big company that doesn't care about its customers, that they aren't like Intel.  The sad reality is they are.  Given the chance, they behave just like Intel.


----------



## Vayra86 (Jan 27, 2016)

About those OEM cooling solutions, I'll chime in by siding with newtekie1 in stating that they MUST be able to handle full stress, that is 100% utilization, and that includes AVX instructions too. That is, in 'regular conditions', so with a regular ambient temperature. Common sense dictates you won't get throttle-free operation without any airflow, or with a room temperature beyond 35 degrees C or so.

So yes, Prime95 is a very solid test bench for an OEM cooling solution. I myself tend to use AIDA64 instead because it gets close enough without risking an all too sudden temperature rise. The AIDA testing sequence builds up towards max utilization instead of pushing max utilization from the first second, which is a 'nicer' test bench for newly bought PCs; in case something does seem to be inadequately cooled, it is easy to spot early and prevent any kind of failsafe from having to kick in.

But that does not change a single thing about the fact that when you *buy a CPU that is rated for a certain frequency, as a customer you must be able to expect the CPU to run that frequency without throttling* based on temperature limits. If it does, somewhere along the line the build and design is wrong and for the customer it shouldn't even HAVE TO MATTER what is causing it. OEM cooling solution, bad CPU, whatever the fuck ever. The product delivered does not meet the expectations you should be able to have, bottom line, it is inadequate.

Overclocking of course is out of the question here. But any CPU should be cooled well enough to run full throttle full time at stock clocks. Simple. Every GPU can, that is why Nvidia introduced GPU Boost for example. They deliver a product that is guaranteed to run full throttle at the frequency it is sold with out of the box, and anything above that is silicon lottery. That's how it should be.

In conclusion, I would like to add that every OEM cooler I have seen thus far does its job just fine and then some. I could bring an i5 3570K with the boxed cooler to a steady 4.2 GHz without any trouble; both the headroom of the cooler (I don't even have a good chip, it needs extra volts to get there) and the slight headroom of CPU Vcore are built right in and showed themselves clearly. The CPU would go a tiny bit beyond stock clocks on stock volts, and the cooler would stay well clear of throttling temps. With 1.23 V, the stock cooler / CPU started hitting 80 °C at full load. Still well within operating limits and not half bad for a measly block of metal with an 80 mm fan.

For giggles, I also ran a G3258 on the stock cooler while the fan was stuck in one of the wires. I built the system, noticed the CPU temp rise, and it actually went stable at 84 °C. PASSIVELY, with a single 120 mm intake at the front running at 5 V. Tell me again OEM ain't adequate...
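The criterion above (a CPU sold at a rated frequency must hold that frequency under load) can be checked mechanically from a log of frequency samples. A minimal sketch, with a made-up trace and an assumed 2% measurement tolerance; the numbers are illustrative, not from an actual run:

```python
def detect_throttling(freq_samples_mhz, base_clock_mhz, tolerance=0.02):
    """Return the samples that fall below the rated base clock.

    tolerance absorbs minor measurement jitter (2% by default).
    """
    floor = base_clock_mhz * (1.0 - tolerance)
    return [f for f in freq_samples_mhz if f < floor]

# Illustrative trace for a CPU with a 3400 MHz base clock:
samples = [3800, 3800, 3600, 3400, 3200, 3050, 3400]
print(detect_throttling(samples, base_clock_mhz=3400))  # [3200, 3050]
```

Any non-empty result means the cooling solution failed the "no throttling at stock clocks" bar, regardless of whose cooler it is.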


----------



## Bill_Bright (Jan 27, 2016)

Kanan said:


> So you would match a i7 4790K with a boxed cooler?


  Do exceptions always make the rule for you?

But to answer your question, sure I would! You are implying you are smarter than Intel. So tell us, why would Intel include a cooler in the box if it would not cool the CPU to PCG 2013D specs? It would be cheaper for them to leave it out and not worry about (or pay for) the logistics to design, buy, stock, package and transport the CPU with the cooler. After all, they do sell CPUs without coolers.

I say again, I would try it and see. What a concept, huh? And if not doing extreme overclocking, and if my case was providing an adequate supply of cool air, and my temps were such that my CPU was still stable and not throttling down due to heat, then no need for an aftermarket cooler.



Kanan said:


> But they aren't. I tried using a stock cooler for an A8 3870K back then, and the CPU was overheating


Oh wow! Then clearly you have proven that all OEM coolers are trash. I stand corrected and am in awe of your expertise, extensive sampling and conclusive empirical testing techniques. 



Kanan said:


> and the case was OPEN


  Because side panels don't provide for a "flow" of cool air through the case. Oh wait! That is exactly what they do. 

I give. You guys win. You are clearly much smarter and massively more experienced than all the Intel and AMD engineers and designers and their super computers, and me. And clearly Intel and AMD are perpetrating massive deception and fraud on the rest of us dumb consumers, including governments, the IT media and major institutions and have us all (except you guys, of course) totally bamboozled.


----------



## Kanan (Jan 27, 2016)

newtekie1 said:


> Your hopes and dreams are that somehow you can have something to say AMD is better than Intel.  That AMD isn't just a big company that doesn't care about their customers, that they aren't like Intel.  The sad reality is they are.  Given the chance, they behave just like Intel.


That you don't know. For all I know, through the years AMD is NOT Intel, and your words don't change that fact. Also, as I see it, you view this rather negatively. We will see. Or not. Depends on whether AMD will be remotely strong again some day.



> Do exceptions always make the rule for you?


Where exactly did I say that? You're getting old, old man.



> But to answer your question, sure I would! You are implying you are smarter than Intel. So tell us why would Intel includes a cooler in the box if it would not cool the CPU to PCG 2013D specs? It would be cheaper for them to leave it out and not worry about (or pay for) the logistics to design, buy, stock, package and transport the CPU with the cooler. After all, they do sell CPUs without coolers.


ROFL. First of all, I never said I'm smarter than Intel, I just said I'm smarter than YOU. End of story. I'm not as arrogant as you are; maybe that's the reason you THINK I would compare myself to a company and think myself superior. No, I AM NOT, that is your kind of thinking, NOT mine. Second, Intel tries to make as much money as they can, that's it. And that's what YOU don't seem to understand at all: their CPU coolers are the cheapest shit, to save them money, that's it. Funny guy, how you try to snake your way around this discussion and are still losing it nonetheless.


> I say again, I would try it and see. What a concept, huh? And if not doing extreme overclocking, and if my case was providing an adequate supply of cool air, and my temps were such that my CPU was still stable and not throttling down due to heat, then no need for an aftermarket cooler.


I tried and have seen more than enough boxed coolers. They are okay most of the time, but when you buy an AMD APU (even the top notch one, and I did that twice, with the A8 3870K and the A10 7850K) it's just crap. The same seems to hold true for some FX processors, as newtekie said. Boxed coolers are just crap. Saving a few bucks in exchange for that inferior cooling and noise is just silly. Argue again and again, it won't change my opinion (or the others'), you're just too stubborn, old man. 


> Oh wow! Then clearly you have proven that all OEM coolers are trash. I stand corrected and am in awe of your expertise, extensive sampling and conclusive empirical testing techniques.


Old man, arrogance won't save your day. ^^ Try that patronizing wannabe wise behaviour again, maybe that suited you better.


> Because side panels don't provide for a "flow" of cool air through the case. Oh wait! That is exactly what they do.


hahaha, as I said, your arrogance makes you weak. Of course I tested both, first with the case closed, then open; it didn't change a thing, there was no heat problem around the CPU cooler. The cooler was simply too weak for that strong APU back then.


> I give. You guys win. You are clearly much smarter and massively more experienced than all the Intel and AMD engineers and designers and their super computers, and me. And clearly Intel and AMD are perpetrating massive deception and fraud on the rest of us dumb consumers, including governments, the IT media and major institutions and have us all (except you guys, of course) totally bamboozled.


Blablabla, that's just you, losing and trying to make us look silly. Well, guess what, try again. This senseless bullshit you are posting now, won't win you the discussion. Grow up, old man.


----------



## newtekie1 (Jan 27, 2016)

Kanan said:


> That you don't know. For all I know through the years AMD is NOT Intel and your words doesn't change that fact. Also as I see this you tend to see this rather negatively. We will see. Or not. Depends if AMD will be remotely strong again some day.



I do know.  When they had the lead, they charged $1,000 for their processors, just like Intel.  When they can get away with it, they include the absolute cheapest crappiest coolers they can, they do this to this day.  Nothing AMD has done shows they are any better than Intel.


----------



## cdawall (Jan 27, 2016)

Bill_Bright said:


> Because side panels don't provide for a "flow" of cool air through the case. Oh wait! That is exactly what they do.
> 
> I give. You guys win. You are clearly much smarter and massively more experienced than all the Intel and AMD engineers and designers and their super computers, and me. And clearly Intel and AMD are perpetrating massive deception and fraud on the rest of us dumb consumers, including governments, the IT media and major institutions and have us all (except you guys, of course) totally bamboozled.



Supercomputers do not use anything remotely close to a stock PiB cooler. The IT market and major institutions use prebuilt PCs for the most part, and none of those use AMD or Intel OEM HSFs.


----------



## Bill_Bright (Jan 27, 2016)

Kanan said:


> I'm not as arrogant as you are
> Grow up, old man.


Right, because your total unwillingness to show even a single professional review site, IT mag, or white paper to support your position, just expecting everyone to believe you when you say that all OEM coolers are crap and need replacing, is not any sign of arrogance (or of being smarter than Intel). And because your constant name-calling and personal attacks when someone says something different than you clearly demonstrate a mature attitude and a desire to have a constructive technical debate. Gotcha.

I'm done here. Keep going with your own feculent blather if you feel it further enhances your claims about OEM coolers.


Vayra86 said:


> In conclusion, I would like to add that every OEM cooler I have seen thus far, does its job just fine and then some.
> Tell me again OEM aint adequate...


Sorry, but clearly you must not know what you are talking about either!


----------



## dorsetknob (Jan 27, 2016)

Vayra86 said:


> In conclusion, I would like to add that every OEM cooler I have seen thus far, does its job just fine and then some.
> Tell me again OEM aint adequate...


Bill_Bright said:


> Sorry, but clearly you must not know what you are talking about either!


Shit Bill, he was supporting your view and ""you Diss him""
You're such a wonderful person Bill (said with sarcasm)


----------



## Bill_Bright (Jan 27, 2016)

cdawall said:


> Super computers do not use anything remotely close to a stock PiB cooler.


No kidding???

Designers and engineers don't have stock coolers on themselves either. 

It seems you guys are so intent on jumping on anything said that you are not even taking a moment to understand the meaning of what is said. 

Intel and AMD designers and engineers use their supercomputers to analyze and crunch lots and lots of data to help those engineers and designers determine (1) what the cooling requirements of their CPUs are, and (2) the necessary specs of the coolers to meet those requirements.
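The sizing job described above reduces, at back-of-the-envelope level, to one formula: a cooler is adequate if its case-to-ambient thermal resistance is at most (T_max - T_ambient) / TDP. A toy sketch with illustrative numbers, not actual Intel or AMD spec values:

```python
def max_thermal_resistance(t_case_max_c, t_ambient_c, tdp_w):
    """Highest acceptable cooler thermal resistance, in °C per watt."""
    return (t_case_max_c - t_ambient_c) / tdp_w

# e.g. a 72 °C case limit, 38 °C worst-case intake air, 84 W TDP:
r = max_thermal_resistance(72.0, 38.0, 84.0)
print(round(r, 3))  # 0.405: any cooler rated at or below ~0.4 °C/W suffices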


----------



## Bill_Bright (Jan 27, 2016)

dorsetknob said:


> Shit bill he was supporting your view and ""you Diss him""
> your Such a wonderfull person Bill said with sarcasam


See what I mean? So intent on piling on.  Of course it was sarcasm. That's what the rolling eyes mean. I was not dissing him. Did you notice I even thanked his post? And I was the only one to thank him, BTW.


----------



## 64K (Jan 27, 2016)

I've lived on this planet for a pretty long while and I don't recall ever seeing anyone use the term "feculent blather" before. Well done Bill_Bright.


----------



## Bill_Bright (Jan 27, 2016)

64K said:


> I've lived on this planet for a pretty long while and I don't recall ever seeing anyone use the term "feculent blather" before. Well done Bill_Bright.


Come to Omaha and I'll buy you that beer and a great steak to go with it!

But back to Vayra86's comments, after re-reading it, I change my position and agree with him that a CPU should be able to pass Prime95 at 100% load with normal clocking. I was mistaken to say otherwise. In my defense, I hastily thought of most users who run Prime95, they test the CPU when overclocking. I put 2 and 2 together and got 3. Sorry.

So just as a PSU should maintain specs under full load, so should a CPU, and the accompanying cooling solution too. And I agree too with Vayra86 that OEM coolers can do that just fine - in a properly configured case and "normal" ambient temperatures too.


----------



## cdawall (Jan 27, 2016)

You would be incorrect with that assumption


----------



## CAPSLOCKSTUCK (Jan 27, 2016)

it was worth reading this thread just to learn the word "feculent"


----------



## Bill_Bright (Jan 27, 2016)

cdawall said:


> You would be incorrect with that assumption


Right. Because you say so. I get it. 

So for sure, as per cdawall, THE source for CPU cooling and computer power supplies, users must not assume any power supply they pay good money for will meet published specs either, in spite of what any professional review site like TPU might report.


----------



## CAPSLOCKSTUCK (Jan 27, 2016)

I would like to publicly apologize for being instrumental in starting this thread.


----------



## cdawall (Jan 27, 2016)

Bill_Bright said:


> Right. Because you say so. I get it.
> 
> So for sure, as per cdawall, thee source for CPU cooling and computer power supplies, users must not assume any power supply they pay good money for will meet published specs either - in spite of what any professional review site like TPU might report.




When did I say anything about power supplies? All I'm saying is Omaha doesn't have great steaks


----------



## OneMoar (Jan 28, 2016)




----------



## xfia (Jan 28, 2016)

wow this crap..  lmao
i wanted to stop reading but i couldn't.

please click thank you if you accept @CAPSLOCKSTUCK's apology!


----------



## CAPSLOCKSTUCK (Jan 28, 2016)

I know this isn't very scientific but I want to try and redeem myself through the medium of pics....

I didn't sleep so I'm shaky.
I am sorry about the quality of the pics; everything you need to know is visible, I think.

Here is a Phenom II x2 550 which I have o/c'd to 3.495 GHz; as you can see it isn't in a case.

I have a selection of cooling options: the stock AMD one, one which is stamped Foxconn, and a Xigmatek. (I use the spanner to start the pc, so much easier than poking around with a screwdriver)

I have a temperature gauge with a probe which is under the board.

Here is what it's like at idle with the Foxconn,

and after running the CPU-Z stress test for about 10 minutes,

and the temperature under the board.

I decided not to use the AMD one because in my experience they are not good enough for o/c'd chips.
I decided not to use the Xigmatek because it is "overkill".

Run tests with the stock one; if it doesn't work for you, in your environment, buy a better one if you can afford it.

In my experience HWMon is a good tool to use, as is the funky stress test in CPU-Z.

If you expect me to install all three and run comparative tests, I don't have enough spare time, and the conclusion would be pretty obvious anyway.
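For anyone wanting to crunch readings like these, the comparison boils down to three numbers per cooler: average idle temperature, peak load temperature, and the delta between them. A small helper (the data below is made up for illustration, not the actual readings above):

```python
def summarize(idle_c, load_c):
    """Return (average idle, max load, delta) from lists of °C readings."""
    avg_idle = sum(idle_c) / len(idle_c)
    max_load = max(load_c)
    return avg_idle, max_load, max_load - avg_idle

idle = [34.0, 35.0, 34.5]            # readings before the stress test
load = [58.0, 61.5, 62.0, 61.0]      # ~10 minutes of CPU-Z stress
avg_i, max_l, delta = summarize(idle, load)
print(f"idle {avg_i:.1f} °C, load {max_l:.1f} °C, delta {delta:.1f} °C")
```

Running the same load on each cooler and comparing the deltas gives an apples-to-apples number even if ambient temperature drifts between runs.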


----------



## rtwjunkie (Jan 28, 2016)

newtekie1 said:


> Umm...you are the one making claims here.
> 
> You want facts that the stock cooler is crap, how about the fact that it lands dead last in pretty much every heatsink comparison:



The stock cooler IS actually within spec. It is doing what it was designed to do, and that's good enough for all the non-enthusiasts, which is probably well over 95% of the population.  Just because it's last doesn't make it crap.  It merely makes it last.  Crap would be exceeding 100 °C at stock.

@CAPSLOCKSTUCK shame on you for being behind this!   Irresponsible acts like this WILL go on your permanent record.


----------



## newtekie1 (Jan 28, 2016)

rtwjunkie said:


> The stock cooler IS actually within the specs. It is doing what it was designed to do. And that's good enough for all the non-enthusiast, which is probably well over 95% of the population. Just because it's last, doesn't make it crap. It merely makes it last. Crap would be exceeding 100C on stock.



No, it is still crap, in my opinion.  Like I said, just because it barely manages to do its job doesn't mean it isn't crap.  Again, my Toyota Yaris gets me to work and back, but it is still crap.  That doesn't mean it needs to be replaced; it is doing its job, it is just a piece of shit compared to everything else.

Intel could put a little effort and a few more cents into a better cooler, but over the years they only reduced its size and made it crappier.  The same goes for AMD on everything but the top FX chips.


----------



## cdawall (Jan 28, 2016)

^I agree with this, but I still believe chips should be able to live in turbo without overheating, so 95 °C is not OK with me.


----------



## the54thvoid (Jan 28, 2016)

Ah, now we're delving into the realms of comparative crapness. Which clearly delineates the already blurry line into a zigzag of relative uselessness.
This thread is like a big merry-go-round of slapstick. I'm actually on Bill's side here. Although a stock cooler is generally a piece of shit, it does what it's meant to do, just.


----------



## Kanan (Jan 28, 2016)

newtekie1 said:


> I do know.  When they had the lead, they charged $1,000 for their processors, just like Intel.  When they can get away with it, they include the absolute cheapest crappiest coolers they can, they do this to this day.  Nothing AMD has done shows they are any better than Intel.


It has nothing to do with their CPU prices; AMD never betrayed anyone, and AMD has better coolers as a matter of fact. Intel's CPUs just cope better with bad coolers; that doesn't mean their coolers are better. That's a fundamental misunderstanding on your side. And by the way, don't try to convince me again that AMD is the same as Intel. They aren't. These companies aren't really comparable: Intel is 10x the size, Intel has done things AMD never did, and so on. Don't waste your time, it doesn't make any sense.



Bill_Bright said:


> Right - because your total unwillingness to show even a single professional review site, IT mag, or white paper to support your position, you just expecting everyone believe you because you say that all OEM coolers are crap and need replacing is not any sign of arrogance (or being smarter than Intel). And because your constant name calling and personal attacks when someone says something different than you clearly demonstrates a mature attitude and desire to have a constructive technical debate. Gotcha.


No it isn't. Intel themselves know their coolers are cheap crap, and they KNOW that this crap suffices, and that's why they stick with it. You don't get it, do you? I mean company thinking. It seems they are way smarter than you too. Also, proof was already posted here, so no need for me to do it again; apart from that, every idiot on this planet knows how boxed coolers perform nowadays. Your behaviour isn't any better than mine; you are the one that started this "war" with me in the first place, by attacking me again. You can't leave it, can you? You take every chance to do it, and I just answered accordingly. Go to your wife or a similar person and cry if that helps you, but you are the person with the wrong behaviour here, not me. At the very most I'm guilty of sinking down to your level, but nothing more. I had technical debates when I was 13 with a higher level than this, at an age when you were probably still playing with sand or something. But please try to patronize me again, it just gets funnier every time.  I think you are playing with forces you can't begin to understand here. haha



> I'm done here. Keep going with your own feculent blather if you feel it further enhances your claims about OEM coolers.
> Sorry, but clearly you must not know what you are talking about either!


What did I say about OEM coolers that is wrong? Did I say they are too bad to do their job? No. I just said they are pretty cheap shit, that's it. I don't think you know what I said, and you're making things up. Also, your opinion about boxed coolers is way off; that's why you admitted being wrong after 6 or 7 pages of bullshit.



Bill_Bright said:


> No kidding???
> 
> Designers and engineers don't have stock coolers on themselves either.
> 
> ...


No, we aren't; you are just so unnerving that it has to be this way. Man, at least you are somewhat amusing, but that's the least one can say after so much bullshit.


----------



## trog100 (Jan 28, 2016)

CAPSLOCKSTUCK said:


> it was worth reading this thread just to learn the word "feculent"


 
in simple terms it means "shitty" but i did have to look it up to make sure.. i love bills use of it.. he he

"feculent blather".. brilliant.. the most interesting thing he has said in this entire thread .. only joking bill.. but it will definitely go in my mild insult list.. he he

it makes a nice change from "a load of bollocks".. 

trog


----------



## CAPSLOCKSTUCK (Jan 28, 2016)

trog100 said:


> in simple terms it means "shitty" but i did have to look it up to make sure.. i love bills use of it.. he he
> 
> "feculent blather".. brilliant.. the most intersting thing he has said in this entire thread .. only joking bill.. but it will definitely go in my mild insult list.. he he
> 
> ...




I had to look it up as well, i am going to try and use it every day in conversation.


----------



## newtekie1 (Jan 28, 2016)

the54thvoid said:


> I'm actually on Bill's side here. Although a stock cooler is generally a piece of shit, it does what it's meant to do, just.



That is actually my side. Bill's side is that stock coolers are good, they have improved drastically over the last 10 years, that they are more efficient and quieter, and people just say stock coolers are crap because they were in the past, but they aren't anymore.



Kanan said:


> Has nothing to do with their CPU prices, AMD never betrayed anyone, AMD has better coolers as a matter of fact - just Intels CPUs cope better with bad coolers, doesn't mean their coolers are better. Thats a fundamental misunderstanding on your side. And btw. don't try to convince me again AMD is the same as Intel. They aren't. These companys aren't really comparable, Intel is 10x size, Intel has done things AMD never did, and so on and so on. Don't waste your time, it doesn't make any sense.



Look back and you'll see that I said AMD's heatpipe cooler is the best right now.  Of course, Intel has made better coolers in the past.  Nothing stock released (other than the AiOs) is better than the tower heatsink Intel released back in 2009.

And AMD hasn't betrayed anyone?  Are you serious, or just blinded by fanboyism?  They released drivers that purposely hindered performance AFTER the reviews for their cards were done, because the cards were overheating and dying prematurely.  They sold cards with advertised clock speeds that they knew the cards would never be able to reach with the stock cooling.  Your fundamental misunderstanding seems to be that AMD is some poor little company that has never done wrong and only cares about its customers.  They don't give a shit about their customers, and they certainly have done plenty of wrong.


----------



## xfia (Jan 28, 2016)

CAPSLOCKSTUCK said:


> I had to look it up as well, i am going to try and use it every day in conversation.


you clot muppets 
you're as dumb as your feet (French idiom)
skatina


----------



## CAPSLOCKSTUCK (Jan 28, 2016)

ace ^^^^^^^^


----------



## Bill_Bright (Jan 28, 2016)

newtekie1 said:


> That is actually my side. Bill's side is that stock coolers are good


Now that is feculent blather! NOT ONCE have I ever said OEM coolers are "good". I have been saying over and over again that they are "adequate": able to do the job they were intended to do, assuming proper case cooling and normal clocking (and in some cases, even with mild to moderate overclocking). At best, I said "more than adequate for most users", but never did I say good. And I said they were quality coolers, meaning they will not fail the day after the warranty runs out.

You have not been defending the OEM coolers in this thread! In fact, you said, they "_suck_", "_all but AMD heatpipe are terrible_", only "_gotten worse over the years_". You said "_if they won't pass prime, they aren't adequate_". You said "_they aren't adequate for any (your underline, not mine) overclocking_", "_the current coolers are in fact worse than the coolers from 10 years ago_". You said, "_Every APU comes with a crappy aluminum cooler_".

I've been consistent (except where I freely admitted otherwise) with my claims. You've been flip-flopping.

First you said they are not adequate for any overclocking, then admitted it ("slight") was possible, in fact claiming you did it 10 years ago. I said mild, you said slight - same thing in my book. But a flip from not "any".

Then you flip-flopped again by saying "I never said they weren't adequate" and they were "good enough".

Contrary to what you want us to believe, if a cooler were "crap", it could not do its job. A crappy cooler is NOT "good enough". If something is crap, crappy, terrible, it is worthless.

OEM coolers do their job when used in the operating environment they were intended to be used in. They clearly are not the best coolers out there, but they are not crap either.

So, no. It is not your side. But if you want to stick with "my" claim that today's OEM coolers are adequate for most users, then I welcome you to "my" side.


----------



## newtekie1 (Jan 28, 2016)

Bill_Bright said:


> Now that is feculent blather! NOT ONCE have I ever said OEM coolers are "good". I have been saying over and over again that they are "adequate" - able to do the job they were intended to do, assuming proper case cooling and normal clocking (and in some cases, even with mild to moderate overclocking). At best, I said "more than adequate for most users", but never did I say good. And said they were quality coolers, meaning they will not fail the day after the warranty runs out.



You have claimed they are better than they were 10 years ago.  They were adequate 10 years ago, so now they must be good; that is how it works.  You claim current coolers allow overclocking; they must be at least good to do that.  If you aren't saying they are good, then they must be crap, but you keep arguing they aren't.  You can have it only one way: they are either good, or crap.



Bill_Bright said:


> You have not been defending the OEM coolers in this thread! In fact, you said, they "_suck_", "_all but AMD heatpipe are terrible_", only "_gotten worse over the years_". You said "_if they won't pass prime, they aren't adequate_". You said "_they aren't adequate for any (your underline, not mine) overclocking_", "_the current coolers are in fact worse than the coolers from 10 years ago_". You said, "_Every APU comes with a crappy aluminum cooler_".



Yes, I have said all of that, and yet I've also said over and over they are adequate.  They do their job, just barely.  That still makes them crap and terrible.  Something can be crap and terrible, and still get the job done.



Bill_Bright said:


> First you said they are not adequate for any overclocking, then admitted it ("slight") was possible, in fact claiming you did it 10 years ago. I said mild, you said slight - same thing in my book. But a flip from not "any".



Actually, I said they were capable of slight overclocks 10 years ago, when the coolers were better. Now, they do not allow overclocking, because they are worse now than they were 10 years ago.



Bill_Bright said:


> Then you flip-flopped again by saying "I never said they weren't adequate" and they were "good enough".



I never said they weren't adequate or good enough, so I don't see how this is flip-flopping.  Again, crap and terrible is not the same as inadequate.



Bill_Bright said:


> Contrary to what you want us to believe, if a cooler was "crap", it could not do it's job. A crappy cooler is NOT "good enough". If something is crap, crappy, terrible - it is worthless.



Nope, something can be crap and still get the job done.



Bill_Bright said:


> OEM coolers do their job when used in the operating environment they were intended to be used in. They clearly are not the best coolers out there, but they are not crap either.



No, they are still crap.



Bill_Bright said:


> So, no. It is not your side. But if you want to stick with "my" claim that today's OEM coolers are adequate for most users, then I welcome you to "my" side.



Again, your claims are that coolers today are better than they were 10 years ago.  I'm still waiting for even a shred of evidence from you to support this by the way.

I've never once said stock coolers aren't adequate.  You assumed that when I said they are crap or terrible that means they aren't adequate, that is not the case.  I'm done arguing with you about this.  You make claims, then won't back them up and try to put words in my mouth and try to tell me what I said.


----------



## Hood (Jan 29, 2016)

Bill_Bright said:


> It will REALLY be interesting to see what happens when/if Samsung buys AMD.


Interesting? - I should say so!  Samsung is the one company (besides Apple) that has the spare billions for R&D, and their own foundries besides, which would finally give Intel some serious competition.  Western giant vs Eastern leviathan, where can I buy tickets?  Memory and storage speeds are on the verge of some major architecture upgrades that will change the market forever, and there won't be room for any minor players.  CPUs will finally find a reason to get faster and have more cores, as storage and memory are able to communicate much faster.  AMD should be a bargain right about now, what, a lousy half billion or so?


----------



## Bill_Bright (Jan 29, 2016)

newtekie1 said:


> Again, your claims are that coolers today are better than they were 10 years ago.


It is funny how you focus on such a minor claim, then claim to be the champion of OEM coolers by pretending others are on your side.  It is even funnier that you claim something you consider crap and terrible can still be adequate.

You whine that I have not shown proof. My proof is the millions of users out there running OEM coolers just fine - yet you seem to deny their existence. Meanwhile, you have failed to show one piece of evidence from ANY professional review site, IT mag, or white paper that says OEM coolers cannot fill their intended purpose and therefore must be replaced. And not only that, you are wrong when you say today's coolers do not support overclocking, because it is already happening. Extreme overclocking? Of course not! But mild to moderate? Absolutely!

You are just in denial. You have FAILED to present any study showing the folks at Intel and AMD don't know what they are doing.

My claim is, and always has been that OEM coolers are adequate, in fact, more than adequate for the vast majority of today's users.

Yes, today's OEM coolers are better. You can hand pick one or two images and pretend that proves your point if it makes you feel better. I don't care. The OEM cooler on my i7 has a larger and quieter fan than the one on my old P4. The base has a copper pad for better conduction. Yes, that copper spot is smaller, but guess what? The die of my i7 is much smaller than the die of my old P4. It does not need to be bigger! And that is typical of CPUs today! Gobs of metal overhanging nothing does nothing!

The bearings in the fan are better. The heatsink fins have much more surface area. But again, the real point is as I stated when I first stated it, OEM coolers today do the job they were designed to do. If they were crap, they couldn't.



newtekie1 said:


> Again, crap and terrible is not the same as inadequate.


I guess if you want to have your own definitions for words, then fine. If something is *crappy*, it is worthless. If something is worthless, it cannot do its job. I do not put crappy components in my builds. So because OEM coolers provide adequate heat extraction, I will and do use them unless my client needs silent running, will be doing extreme overclocking, or buys a CPU that does not come with an OEM cooler.

We will at least try the OEM cooler and see if it works satisfactorily before spending extra money that may not need to be spent!



newtekie1 said:


> I'm done arguing with you about this.


That's great. Thanks.


----------



## newtekie1 (Jan 29, 2016)

Bill_Bright said:


> My proof is the millions of users out there running with OEM coolers just fine - yet you seem to deny their existence.



That is in no way proof that OEM coolers today are better than they were in the past.



Bill_Bright said:


> And at the same time you have failed to show one piece of evidence from ANY professional review site, IT mag, or white paper that says OEM coolers cannot fill their intended purpose and therefore must be replaced.



I never made those claims, so I don't have to.  I said they were crap, or crappy, not that they were inadequate.



Bill_Bright said:


> And not only that, you are wrong when you say today's coolers do not support overclocking because it is already happening.



And you haven't shown that they can.  They are already hitting temperatures too high to even sustain turbo for extended periods, and you say they can go further?  That makes no sense.



Bill_Bright said:


> Yes, today's OEM coolers are better.



You still have not posted one piece of evidence to prove that.  You keep saying it, then keep saying you didn't say it.  Prove it or shut up.



Bill_Bright said:


> I guess if you want to have your own definitions for words, then fine. If something is *crappy*, it is worthless. If something is worthless, it cannot do its job. I do not put crappy components in my builds. So because OEM coolers provide adequate heat extraction, I will and do use them unless my client needs silent running, will be doing extreme overclocking, or buys a CPU that does not come with an OEM cooler.



Funny, I clicked your link and went to the definitions of crappy from Dictionary.com and Webster's; neither says anything about being inadequate to perform an intended task.  They say inferior (I've certainly proved that, and even posted evidence from major review sites like you asked: they are inferior to every aftermarket cooler tested).  They say cheaply made (yep, can't get much cheaper than they already are).  Nothing about being inadequate, though.  Maybe Wiktionary?  No, no mention of inadequate there.  Oh look, Macmillan even has a thesaurus, maybe it is in there... no, not there either.  Hey look, UrbanDictionary, maybe they'll save you... oh wait, no, not there either.  Odd, it is almost like you are taking one word and trying to make it mean something else, to make it sound like I said something I didn't.

Maybe if you search through all the definitions you might find one that says inadequate, but none of the major ones do. Crappy =/= Inadequate.  *PERIOD.*  You think it does, but it doesn't.  There are plenty of things that are crap that still get the job done.



Bill_Bright said:


> So because OEM coolers provide adequate heat extraction, I will and do use them unless my client needs silent running, will be doing extreme overclocking, or buys a CPU that does not come with an OEM cooler.
> 
> We will at least try the OEM cooler and see if it works satisfactorily before spending extra money that may not need to be spent!





Bill_Bright said:


> I do not put crappy components in my builds.



If you are putting the stock cooler in, then you are in fact putting crappy components in your builds.  It's OK, I do too, almost always.  I built 6 computers this week, all i3s with stock coolers.  Unless they need something quieter, I rarely use anything other than stock.  It's crap, but it gets the job done.


----------



## Bill_Bright (Jan 29, 2016)

If you don't believe cooler technologies have improved in the last 10 or so years, fine! I don't have an i7 cooler from 10 years ago to photograph side by side with an i7 cooler from today (you don't either!) - not that images prove anything anyway. I explained how my i7 cooler is better than my P4 cooler - not good enough for you, so again, fine!

You claim you don't want to argue, then you keep posting arguments - and IMO, petty ones too. At the same time, you want to take credit for it being your claim that today's coolers are "adequate". Okay. You can have it. But at the same time, you claim they are crappy and admit to using them "almost always"! That makes no sense to me.

You present other dictionaries as proof of your claim ONLY because they do not contain the specific word "_inadequate_". Hogwash! You are just being argumentative.

MW's first definition is "of poor quality". OEM coolers are not of "poor quality". They are quality built, use quality bearings and can be expected to last for years.
Dictionary.com's first definition is "extremely bad". OEM coolers are not extremely bad or they could not do their intended job. The other dictionaries follow suit. And "oh look" what Dictionary.com's sister site, Thesaurus.com has under "crappy". It says "_bad_" and "_inadequate!"_  But I'm guessing you already knew that and decided only yours were the "major" ones.

I don't see "terrible" in your "major" dictionaries - therefore that is proof you are wrong!  Yeah right. Semantics.

And BTW, for your thesaurus, interesting what it says for adequate.

I see no reason to pursue this line of debate any further, so I won't.


----------



## CAPSLOCKSTUCK (Jan 29, 2016)




----------



## hat (Jan 29, 2016)

What happened to this thread?


----------



## dorsetknob (Jan 29, 2016)

It developed into a bitch fight.  Usual suspect(s) involved.


----------



## newtekie1 (Jan 29, 2016)

Bill_Bright said:


> But at the same time, you claim they are crappy and admit to using them "almost always!". That makes no sense to me.



It doesn't have to make sense to you.  You use them too, and they are crappy.  They get the job done.  If the client wants to spend $30 more on a better cooler, fine, but most don't care and the stock cooler works fine for them, so that is what I use.  If I thought the computer was going to fail because of the stock cooler, I wouldn't use it.  But, as I said, it is adequate.  Still crap, but adequate.



Bill_Bright said:


> You present other dictionaries as proof of your claim ONLY because they do not contain the specific word "_inadequate_". Hogwash! You are just being argumentative.



I posted dictionaries straight from your link there, slick.



Bill_Bright said:


> MW's first definition is "of poor quality". OEM coolers are not of "poor quality". They are quality built, use quality bearings and can be expected to last for years.



Bullshit they are quality built.  It doesn't take a quality fan to last for years; that isn't exactly hard to manage.  If you wanted a quality fan, they'd have Deltas slapped on them.  And the fan isn't the only part of the heatsink; the rest of it has to be quality too.  The push pins are absolute shit - unless you are extremely careful when removing and remounting, they break.  The i3 heatsinks are all aluminium and as poor quality as they can get.  The i5/i7 heatsinks have copper in them, but are still about as cheap as you can get.  Heck, even the plastic frame of the fan gets brittle after about a year, making it easy to break if you ever have to remove the heatsink for some reason.



Bill_Bright said:


> Dictionary.com's first definition is "extremely bad". OEM coolers are not extremely bad or they could not do their intended job. The other dictionaries follow suit.



I would consider the absolute worst cooling capacity possible while still technically working "extremely bad".



Bill_Bright said:


> And "oh look" what Dictionary.com's sister site, Thesaurus.com has under "crappy". It says "_bad_" and "_inadequate!"_ But I'm guessing you already knew that and decided only yours were the "major" ones.



It does not say that under the synonym section.  It says it under "related" words, as in words that are similar but don't mean the exact same thing.



Bill_Bright said:


> I don't see "terrible" in your "major" dictionaries - therefore that is proof you are wrong!  Yeah right. Semantics.



Two different descriptions.  It is both crappy and terrible.  Both adjectives apply to the stock cooling, and I've used both to describe it.  However, I've never claimed they mean the same thing.  You have, once again, made another off-base claim: that crappy means the same as inadequate.



Bill_Bright said:


> And BTW, for your thesaurus, interesting what it says for adequate.



I'm not quite sure what point you are trying to make here.


----------



## 64K (Jan 29, 2016)

hat said:


> What happened to this thread?


----------



## CAPSLOCKSTUCK (Jan 29, 2016)




----------



## lilhasselhoffer (Jan 29, 2016)

It's the only way I can express this.  It's a heck of a fun watch, but afterward you feel less human for having enjoyed it as much as you did.


----------



## dorsetknob (Jan 29, 2016)

soon to go to GN   hello GN


----------



## Kanan (Feb 3, 2016)

newtekie1 said:


> That is actually my side. Bill's side is that stock coolers are good, they have improved drastically over the last 10 years, that they are more efficient and quieter, and people just say stock coolers are crap because they were in the past, but they aren't anymore.
> 
> 
> 
> ...



LOL, you called ME an AMD fanboy? Before you post bullshit like that, don't be so lazy - look up my specs first. Now back to your post:

"They released drivers that purposely hindered performance AFTER the reviews for their cards were done, because the cards were overheating and dying prematurely. "

That's not "betrayal" - far from it. That's a solution to prevent the cards from being destroyed, which is something else entirely. Only your interpretation of it is negative, and it's shortsighted as an objective assessment. In reality, it was the only viable solution to the problem.

"They sold cards with advertised clock speeds that they knew the card would never be able to reach with the stock cooling."

You mean the 290X or Nano cards? I'll address both.
290X: again an error in your assessment; you tend to read things negatively, but that just isn't it. AMD cards work the opposite way from Nvidia's: they start fast and clock down if they hit too high a temperature or power draw - working as intended, since the spec says "up to 1000 MHz", not "1000+ MHz". You just didn't understand how the technology works. Maybe you have a philosophical or psychological preference for "clocks higher" over "clocks lower", but in practice there is little tangible difference: AMD starts high and clocks down (and most cards stay at the highest clock when needed, depending on cooling, power consumption, and the individual GPU), while Nvidia starts low and clocks up.
Nano: basically the same, just somewhat different - it clocks down within a second, and it does so because of power consumption, not temperature as with the 290X reference cards.

Notice: AMD cards always work that way (even custom AMD cards that never reach high temperatures); it's no malfunction or "betrayal" as you said. It's somewhat Nvidia-fanboyish of you to post something like that - essentially a fundamental misunderstanding of their technology.
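To make the difference concrete, here is a toy sketch of the two clocking strategies being argued about. All the numbers and thresholds are invented for illustration - real cards use firmware tables and much finer-grained control, nothing this simple:

```python
# Toy model: AMD-style "up to X MHz" (start at the advertised clock, step
# down under thermal pressure) vs Nvidia-style base + boost (guarantee the
# base clock, boost above it when there is headroom).
# All values here are made up for the example.

def amd_style_clock(temp_c, base_mhz=1000, floor_mhz=650, throttle_at=94):
    """Start at the advertised 'up to' clock and throttle down when hot."""
    if temp_c < throttle_at:
        return base_mhz
    # Drop 25 MHz per degree over the throttle point, never below the floor.
    return max(floor_mhz, base_mhz - 25 * (temp_c - throttle_at))

def nvidia_style_clock(temp_c, base_mhz=1000, boost_mhz=1150, boost_below=80):
    """Guarantee the base clock; boost above it while the card stays cool."""
    if temp_c < boost_below:
        return boost_mhz
    return base_mhz

print(amd_style_clock(70))     # cool card: full 1000 MHz
print(amd_style_clock(100))    # hot reference cooler: throttled to 850 MHz
print(nvidia_style_clock(70))  # cool card: boosts to 1150 MHz
print(nvidia_style_clock(90))  # hot card: falls back to the guaranteed 1000 MHz
```

The point of contention in the thread maps directly onto the two function signatures: one advertises its best case ("up to 1000"), the other advertises its worst case (base clock) and treats anything above it as a bonus.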

"Your fundamental misunderstanding seems to be that AMD is some poor little company that has never done wrong, and only cares about their customers."

Didn't say any of that and don't think any of that. Bullshit.

"They don't give a shit about their customers, and they certainly have done plenty of wrong."

Bullshit also. Did they make mistakes? Yes, of course; nobody is perfect. Nvidia made mistakes too. Intel too. Everyone does - it's called being human. Do they not give a shit about their customers? Bullshit. You seem to be an extremely pessimistic person to say things like that. Of course they care, because happy customers mean more money. Just because you don't understand some things doesn't mean everything is bad. Maybe rethink the way you see things; you're not really on the right track.


----------



## newtekie1 (Feb 3, 2016)

Kanan said:


> That's not "betrayal" and far from it. That's a solution for preventing the cards to be destroyed. Something entirely else. Just your interpretation of it is negative and somewhat shortsighted in terms of objective logical assessment. In reality, it was the only viable and good solution to a problem.



They knew of the heat problem before the card was released.  Putting it out on the market with drivers that allowed it to overheat, just to get better reviews, then releasing the "fix" that hinders performance, is betrayal.



Kanan said:


> You mean the 290X or Nano cards? Whatever I'll adress both.
> 290X: again a error in your assessment, as you tend to see things emotionally negative, but that just isn't it. The way AMD cards works is the opposite of Nvidia, they start fast and clock down if reaching too high temperature or power consumption - working as intended, as it says "up to 1000 MHz" not "1000+". You just didn't understand it (the technology and how it works) right.
> Nano: the same basically just somewhat different, as it clocks down in under a second and clocks down because of power consumption, not temperatures as with 290X reference cards.
> 
> Notice: AMD cards always work that way (even custom AMD cards that never reach high temperatures), that's no malfunction or "betrayal" as you said. Somewhat NVidia-fanboyish of you to post something like that, essentially a fundamental misunderstanding of their technology.



No, I completely understand the technology.  It is just betrayal.  Marketing a card as 1000MHz when they know it won't run at a constant 1000MHz is a betrayal.  They are marketing the turbo frequency and not listing a base clock.  That is the wrong way to do it, and it's only done to deceive.



Kanan said:


> Bullshit also. Did they make mistakes? Yes, of course, nobody is perfect. Nvidia did make mistakes too. Intel too. All do. It's called "being human". Do they give a shit about their customers? No, bullshit. Seems you are an extremely pessimistic person, to say things like that. Of course they care, because if customers are happy, they earn more money. And if they are happy, they are happy that they are happy, too. Just because you don't understand some things, it doesn't mean everything is bad. Overthink your way of living or seeing things maybe. You're not really on the right track.



See, I don't put any company ahead of another.  You, on the other hand, seem to want to put AMD on a pedestal. For some reason you think they are better than nVidia and Intel, when the facts show they aren't.  They are just as dirty, but you won't believe that.  In your own words, there is nothing that will make you believe AMD is bad.


----------



## Kanan (Feb 3, 2016)

newtekie1 said:


> They knew of the heat problem before the card was released.  Putting it out on the market with drivers that allowed it to overheat, just to get better reviews, then release the "fix" to hinders performance is betrayal.


Do you have any proof of that? Sorry, but I won't just take your word for it.



> No, I completely understand the technology.  It is just betrayal.  To market at card as 1000MHz, when they know it won't run at a constant 1000MHz is a betrayal.  They are marketing the turbo frequency and not listing a base clock.  That is the wrong way to do it, and only done to deceive.


No, and that's what I meant when I said you don't understand their technology; many people share the same misunderstanding. "Up to 1000 MHz" is not a "turbo clock" - it's the base clock. AMD has no real turbo clocks on their GPUs, because they clock down, not up. What is your understanding of "turbo"? Nvidia has a turbo because its cards clock higher. And it's not there to deceive buyers: when the 290X was released, it was already written as "up to 1000 MHz", not "1000 MHz" or "1000 MHz+". People need to actually read the specs of cards, or better, read reviews before buying something. I'm not saying I like the 290X reference cards, because they clearly overheated and therefore downclocked *from their base clock*, but that doesn't change the fact that 290X cards with good cooling don't have the same problem while using the same tech - so in theory they too would downclock at higher temperatures.



> See, I don't put any company ahead of the other.  You on the other hand seem to want to put AMD on a pedestal. For some reason you think they are better than nVidia and Intel, when the facts show they aren't.  They are just as dirty, but you won't believe that.  In your own words, there is nothing that will make you believe AMD is bad.


No, I just moderated what you said down to "everybody does shit, all humans are faulty" - can't you read? Also, I'm a current Intel and Nvidia user, and most of my GPUs were Nvidia: I've had a GF256, GF3 Ti200 128MB, 7800 GT, 7900 GT, 8600 GT, GTX 260 216, HD 5850, HD 5970, and now a 780 Ti. As you can see, you are going nowhere with calling me a "fanboy". I'm just defending AMD because you seem to hate them; that's really all I'm trying to do.


----------



## Tsukiyomi91 (Feb 4, 2016)

actually, all AMD did was market the words "up to", which is quite vague. It also means their cards aren't guaranteed to reach, let alone exceed, the quoted clocks - which is like telling folks you can't reach 1GHz for "unknown reasons". Isn't that a sort of betrayal of those who bought the product expecting it to perform as it should? If AMD were honest about how old their chips are, everyone would forgive them and give them a second chance to redeem themselves, but no... they insist on reusing them with little or no improvement, and cover up their incompetence with botched benches released to garner publicity. Intel was never like this, nor Nvidia. Sure, those two are premium brands, but at least they guarantee their parts won't heat up your room or kill your electricity bill when you game, edit videos or images, or simply surf the net.


----------



## Bill_Bright (Feb 4, 2016)

Tsukiyomi91 said:


> It also means their cards aren't ready to reach or exceed quoted clocks... .
> Isn't that considered as a sort of betrayal to those who bought their product & expect it to perform as it should?



Cards (any product, for that matter) should definitely meet published/advertised specs ("clocks", in your example) when "used as directed". But it is not fair to expect a product to "exceed" its published specs, or to criticize one that cannot - nor is it a betrayal if a product cannot perform _better than_ advertised. Meeting specs without exceeding them is, if anything, just being honest.


Tsukiyomi91 said:


> Intel wasn't like this nor Nvidia.


I beg to differ. Both have "re-issued" and relabeled products under different model numbers to extend their sales and recoup more of their investment. This is a long-standing practice going back decades - if not to the beginnings of commerce. Think "outlet malls" and "factory seconds".

Remember floppy disks? When they first came out, they were all SSSD (single sided, single density). Then as raw materials could be made purer and manufacturing technologies improved, disk makers (and drive makers) started developing SSDD (single sided, double density). Then they learned to make the other side of the disks usable and they started making DSDD (double sided, double density).

They still sold all types, but they all came off the same production line, manufactured as DSDD. A disk with one side that failed testing was notched (or rather, not notched), labeled, and sold as SSDD. A side that failed the density test got the disk labeled and marketed as single density.

Same thing with RAM makers years ago and still today. RAM devices are produced with the highest densities and speed manufacturing techniques today allow, then tested. Those that cannot meet the fastest speeds and densities, are labeled and marketed as slower speeds and densities.

CPUs and GPUs go through the same thing - some even have cores disabled and are sold as lesser models. Sibling CPU models are made the same, but those that cannot pass muster get "locked" or marked for a slower speed.

This happens all the time to avoid industrial waste (a pain with hazardous materials) and a total loss of any profits.
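The binning flow described above can be sketched roughly like this. The SKU names and thresholds below are invented purely for the example - real binning criteria (voltage curves, leakage, defect maps) are far more involved:

```python
# Toy model of speed/core binning: every die comes off the same line,
# gets tested, and is sold as the highest SKU it qualifies for.
# Bin names and cutoffs are hypothetical.

def bin_chip(max_stable_mhz, working_cores):
    """Sort a tested die into the best SKU it can pass as."""
    if working_cores >= 4 and max_stable_mhz >= 3500:
        return "i7-class (4 cores, high clock)"
    if working_cores >= 4 and max_stable_mhz >= 3000:
        return "i5-class (4 cores, locked to a lower clock)"
    if working_cores >= 2:
        return "i3-class (2 cores enabled, defective cores fused off)"
    return "scrap"  # only a truly dead die becomes industrial waste

# Four dies from the same wafer, with different test results:
tested = [(3700, 4), (3100, 4), (3600, 2), (1200, 1)]
for mhz, cores in tested:
    print(bin_chip(mhz, cores))
```

This is why the practice avoids waste: the same silicon that fails the top bin still generates revenue one or two tiers down instead of being discarded.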



Tsukiyomi91 said:


> Sure, those two are premium brands, but at least they guarantee their parts won't heat up your room or kill your electricity bill when you game, edit videos or images, or simply surf the net.


I am with you here, but there's no deception or dishonesty involved either. Power and heat specs are readily available. Plus, AMD processors (at least CPUs) do tend to cost less too. If you look at any appliance, for example a refrigerator, the most efficient models always cost more. Power supplies are the same way.


----------



## Kanan (Feb 5, 2016)

Tsukiyomi91 said:


> actually, all AMD did was market the words "up to", which is quite vague. It also means their cards aren't guaranteed to reach, let alone exceed, the quoted clocks - which is like telling folks you can't reach 1GHz for "unknown reasons". Isn't that a sort of betrayal of those who bought the product expecting it to perform as it should? If AMD were honest about how old their chips are, everyone would forgive them and give them a second chance to redeem themselves, but no... they insist on reusing them with little or no improvement, and cover up their incompetence with botched benches released to garner publicity. Intel was never like this, nor Nvidia. Sure, those two are premium brands, but at least they guarantee their parts won't heat up your room or kill your electricity bill when you game, edit videos or images, or simply surf the net.


I will disregard that you are heavily Intel/Nvidia-biased and anti-AMD in this post - so not really being objective - and just answer the post itself.

"actually, all AMD did is to market the word "up to", which is quite vague. It also means their cards aren't ready to reach or exceed quoted clocks, which is like telling folks u can't reach 1GHz because of "unknown reasons". Isn't that considered as a sort of betrayal to those who bought their product & expect it to perform as it should?"

It is not that simple.
1. They marketed it that way because their reference cooler was bad: it let them set the base clock to 1000 and market the card as "up to 1000" instead of the 650 to 1000 MHz it really is (afaik, and it varies per card). As a matter of fact, as I already said, custom cards don't have this problem, so the problem is clearly the bad reference cooler, not the tech. Also, since they can't know how many MHz a given card can sustain - every card is different - it was practically impossible to market it as anything other than "up to 1000" without it looking strange.
2. Marketing is marketing. But it's no "betrayal". Companies have always tried to market things as positively as they could; this is no different from what Nvidia and Intel (or any other company) do. Anyone who skipped the crappy reference cards and bought a custom 290(X) did indeed buy a good AMD card - so again, the problem was the cooler, not the tech or the marketing per se.
3. It was possible to run the crappy reference cards at 1000 MHz all the time, just with more noise; that's it. Even overclocking was possible - so it's no "betrayal" at all. Again, everyone who read reviews and wasn't a spontaneous buyer would have known what problems these cards had and bought something else, or waited for the custom 290X. Many people did just that.

"If AMD is honest about how old their chips are, everyone would forgive them & given a second chance to redeem themselves, but no... they insist in reusing with little or no improvements & cover up their incompetence with botched benches & releases it to garner publicity. Intel wasn't like this nor Nvidia."

Everyone does "botched benches"; this whole topic is about botched benches from Intel and AMD, for example. Their chips are old - so what? The 390 is clearly better than the 970, and the 390X is almost as good as, and sometimes better than, the 980 at a clearly lower price. It seems you aren't very well informed, yet you still make bold statements and accusations.

"Intel wasn't like this nor Nvidia."

This is, btw, the fanboy-blabber I was talking about. Anyone with experience in IT knows that everybody has their problems, nobody's perfect, and brands aren't important. What matters are the facts. This is you being overly emotional about things like graphics cards and CPUs. We are talking about *tech* here, not a TV show.

"Sure those 2 guys are premium brands, but at least they guarantee it won't heat up your cool room or even kill your electricity bills when you game, edit videos, image or simply surfing the net."

Oh yes, Nvidia cards and Intel CPUs are more efficient, I'll give you that, but you exaggerate it so much - aside from the "sure those 2 guys are premium brands" [and AMD is not] - that it's obvious how off your comment is. And you are generalizing. The Nano is the most efficient card right now, and it's an AMD card, not Nvidia; the 980 was the most efficient card until the Nano took over at release. Nvidia's Kepler cards are much like AMD's - they aren't much better. It's only the GTX 970 and 980 that are better, and the 970 is still slower than the 390, while the 980 is either too expensive or slower and less efficient than the Nano. That leaves the 980 Ti, which is a really good chip and a clear winner - but that's it. You speak as if Nvidia is worlds ahead; they are not. There is what you say, and then there are the facts, which are quite different. Now, I won't say the same about AMD CPUs - yep, they are crap - but their APUs are good, and some of their CPUs have good price-to-performance value (FX 4350, FX 6300, FX 8320...). Intel is more efficient, yes, and you pay a fat price premium for that too.


----------



## newtekie1 (Feb 5, 2016)

Kanan said:


> You have any proof of that? Sorry that I don't take your word for it.



Well, you've only got two options.

A.) They knew of the heat problem with their stock cards and coolers.  And waited to issue a driver fix until after the reviews were done, so the reviews showed better performance than real world use.

or

B.) They are completely incompetent, and did no testing on the cards before release.

Take your pick, because a card-killing heat issue isn't something you only discover after the card is out.



Kanan said:


> No and that's what I meant when I said you don't understand their technology, many people have that same misunderstanding. "Up to 1000 MHz" is not a "turbo clock" or "Turbo" it's the base clock. AMD has no real turbo clocks with their GPUs, because they clock down and not up. What is your understanding of "Turbo"? Nvidia has a "Turbo" because it clocks higher. And it's not to deceive buyers. When the 290X were released it was already written "up to 1000 MHz" not "1000 MHz" or "1000 MHz+". Just people need to really read specs of cards or better read reviews before buying something. I don't say I like the 290X reference cards, because they clearly overheated and therefore downclocked *from their base clock*, but it doesn't change the facts that 290X with good coolings don't have the same problems and still use the same tech, so essentially would in theory downclock too if they had higher temperatures.



No, I completely understand how the technology works.  Their marketing is deceptive, and the only reason to do it that way is so you can market the card deceptively - just like ISPs that advertise "up to" speeds and then never get anywhere close.  If the only clock speed you give for a part is 1000MHz, it had better run at at least 1000MHz at all times under load.



Kanan said:


> No, I just moderated what you said down to "everybody is doing shit, all humans are faulty" - can't you read? Also I'm a current Intel and Nvidia user and most of my GPUs were Nvidia. I had a GF256, GF3Ti200 128MB, GTX7800GT, 7900GT, 8600 GT, GTX 260 216, HD 5850, HD 5970 and now a 780 Ti. As you see, you are going nowhere with calling me a "fanboy". I'm just defending AMD because you seem to hate them, that's really all I'm trying to do.



No, see, you are a fanboy because you say shit like:

_AMD is not Intel, accept it._

AND

_don't try to convince me again AMD is the same as Intel. They aren't.
_
Implying AMD is somehow better than Intel in morals or something.  

And I must really hate AMD - most of my rigs are AMD; they're listed in my sig if you care to read (can't YOU read?).


----------



## Kanan (Feb 5, 2016)

newtekie1 said:


> Well, you've only got two options.
> 
> A.) They knew of the heat problem with their stock cards and coolers.  And waited to issue a driver fix until after the reviews were done, so the reviews showed better performance than real world use.
> 
> ...


I thought you had dropped it; too bad you didn't, because you won't win this discussion no matter what you do. You're simply not on the winning side here.

"Take your pick, because something like a card killing heat issue isn't something you find out after the card is out."

Of course, this discussion has progressed somewhat further. As you can see from my earlier answers, I already pointed out that companies do whatever they can to present their products as positively as possible. It seems you ignored that post, because your answer is somewhat pointless now.

"No, I completely understand how the technology works. Their marking is deceptive, and the only reason you'd do it that was is so you can market the card in deceptive ways. Just like ISPs that say "up to" speeds and then never achieve anywhere close to those speeds. It is deceptive. If the only clock speed you give for a part is 1000MHz, it better run at at least 1000MHz at all times under load."

No it's not. I already explained why, and I won't do it again just because you ignore my posts. What was bad was their cooler and nothing else. The technology is perfectly fine; almost all of the custom 290X/390X cards prove this. And the way you argued earlier revealed to me that you do NOT understand exactly how it works, because you confused "downclocking from base clock" with "turbo" on AMD tech (and senselessly compared it to Nvidia GPU Boost) and told me that their advertised "turbos" don't work. Now that I've explained it to you, maybe you understand it. Or you just dropped the point. Not important; I see what you are doing, and you won't deceive me.

"No, see you are fanboy because you say shit like.

_AMD is not Intel, accept it._

AND

_don't try to convince me again AMD is the same as Intel. They aren't._

Implying AMD is somehow better than Intel in morals or something. "

Nope, it may seem that way, but you can see from my earlier posts that I'm just somewhat defending AMD, and I blame them for some things too - which is something a fanboy would NEVER do. Plus, I don't care what you think of me. Anyone who really knows me knows that I was NEVER a fanboy; I always bought whatever performed better and NEVER cared much about brands. Not to say I'm a machine - I like AMD more than Intel and maybe more than Nvidia too, but that doesn't make me a fanboy, not even remotely. You have no clue if you think I'm a fanboy, but thanks for the good laugh anyway. That's basically the same thing as calling me dumb, but I'm probably smarter than you.

"And I must really hate AMD, most of my rigs are AMD, they're listed in my sig if you care to read(can't YOU read?)."

Yeah, I can see it now. But coming from 1) a person who ignored my specs and my earlier posts right up until I told him what's in my rig in a post, and 2) someone who can't tell the difference between defending a company and being an idiotic fanboy, I don't care much. I also discovered earlier that you are a pretty negative person. What was it again? Ah yeah, "every company is shit and they give shit about customers". Alright, man. You are way too extreme to call me anything. Get your own things straight first.

PS. No fanboy uses anything other than their favored brands. Do I use AMD hardware now? *No.* Did I switch from a Radeon back to a GeForce, *yes.* As you can see, you're still going nowhere with this bullshit accusation. Drop it, you have lost.


----------



## ShiBDiB (Feb 5, 2016)

Smells of desperation on AMD's part. Everyone knows they've been under-performing, and sinking to finger-pointing at a competitor is petty and shows how bad things have gotten for them.


----------



## newtekie1 (Feb 5, 2016)

Kanan said:


> I thought you had dropped it; too bad you didn't, because you won't win this discussion no matter what you do. You're simply not on the winning side here.



I'm pretty sure I am.



Kanan said:


> Of course, this discussion has progressed somewhat further. As you can see from my earlier answers, I already pointed out that companies do whatever they can to present their products as positively as possible. It seems you ignored that post, because your answer is somewhat pointless now.



What does that have to do with them outright deceiving customers?  Making reviews show false performance numbers by throttling back the card a couple of months after release, so the reviews look better to the consumer?  Remember, you are the one that said, flat out, "AMD never betrayed anyone".  Or did you forget you made that rather bold statement?



Kanan said:


> No it's not. I already explained why, and I won't do it again just because you ignore my posts. What was bad was their cooler and nothing else. The technology is perfectly fine; almost all of the custom 290X/390X cards prove this. And the way you argued earlier revealed to me that you do NOT understand exactly how it works, because you confused "downclocking from base clock" with "turbo" on AMD tech (and senselessly compared it to Nvidia GPU Boost) and told me that their advertised "turbos" don't work. Now that I've explained it to you, maybe you understand it. Or you just dropped the point. Not important; I see what you are doing, and you won't deceive me.



Yes it is, when you market a card as 1000MHz, I don't care if you put the words "Up To" before it, the card should run at 1000MHz.  This is deceptive, period.  If they didn't want to be deceptive, they would have actually come out with a minimum clock the card would run at, you know, just like those other horrible companies Intel and nVidia do...

You can try to explain it away any way you want, but in the computer industry the base clock is the minimum the processor will run at, not the maximum.  Sorry, I don't care what AMD calls it (note: they don't call it the base clock), or what you want to say it is.  I didn't confuse anything; I judge AMD and their technology by the standards of the industry.  If you have a processor that goes up to a certain clock but doesn't stay there due to heat or power consumption, then that clock is a turbo/boost clock.  The base clock is whatever minimum clock speed the processor runs at. Even professional reviewers on very well-known tech sites say they believe the 290X has an "unlisted base clock", and say "to only list the boost clock is being deceitful at best".  But hey, those professional reviewers must be idiots too... I'm sure you are way smarter, and know way more about graphics cards, than they do.

Because, according to you, "AMD never betrayed anyone".



Kanan said:


> Nope, it may seem that way, but you can see from my earlier posts that I'm just somewhat defending AMD, and I blame them for some things too - which is something a fanboy would NEVER do. Plus, I don't care what you think of me. Anyone who really knows me knows that I was NEVER a fanboy; I always bought whatever performed better and NEVER cared much about brands. Not to say I'm a machine - I like AMD more than Intel and maybe more than Nvidia too, but that doesn't make me a fanboy, not even remotely. You have no clue if you think I'm a fanboy, but thanks for the good laugh anyway.



A fanboy is going to defend a company, try to explain away all their deceptive practices, and act like they never even happened.  Sound familiar?  Yeah, it is what you keep doing.  I don't care if you use Intel/nVidia products.  Even fanboys will do that if they aren't completely stupid.  A fanboy doesn't always have to use their favorite brand, but they will defend them and act like they are somehow better than the other companies.  They'll defend clear wrongdoing by the company like it hasn't even happened.  This is what you are doing.  You keep going on: "oh no, they never deceived anyone like Intel and nVidia", "they didn't throttle back overheating cards after the reviews were out, that never happened!", "they weren't deceptive in marketing 1000MHz cards that couldn't actually run at 1000MHz"...

Yeah, the reviewers of the cards, professionals that deal with graphics cards for a living, called the practice deceptive.  But you think AMD never betrayed anyone, you have to defend that statement no matter what.  So those professional reviewers must be completely wrong, right?



Kanan said:


> That's basically the same thing as calling me dumb, but I'm probably smarter than you.



Yeah... ok. This coming from the guy that still can't figure out the multi-quote feature. :rolleyes:



Kanan said:


> Yeah, I can see it now. But coming from 1) a person who ignored my specs and my earlier posts right up until I told him what's in my rig in a post, and 2) someone who can't tell the difference between defending a company and being an idiotic fanboy, I don't care much. I also discovered earlier that you are a pretty negative person. What was it again? Ah yeah, "every company is shit and they give shit about customers". Alright, man. You are way too extreme to call me anything. Get your own things straight first.



I didn't ignore your system specs, I just didn't care what they were.  Your comments and die-hard intent to somehow prove that AMD is better than Intel and nVidia as a company, not product-wise, is what makes you a fanboy.

And you want to talk about ignoring system specs? I would have to click on something to see yours, and I don't care enough to bother.  However, my systems are put out in plain sight, on every post I make.  And yet you failed to read them, and even went so far as to act like I'm the one that doesn't read.  Wow...



Kanan said:


> PS. No fanboy uses anything other than their favored brands. Do I use AMD hardware now? *No.* Did I switch from a Radeon back to a GeForce, *yes.* As you can see, you're still going nowhere with this bullshit accusation. Drop it, you have lost.



Odd, doesn't feel like I've lost.


----------



## Kanan (Feb 5, 2016)

newtekie1 said:


> What does that have to do with them outright deceiving customers?  Making reviews show false performance numbers by throttling back the card a couple of months after release, so the reviews look better to the consumer?  Remember, you are the one that said, flat out, "AMD never betrayed anyone".  Or did you forget you made that rather bold statement?


"Out right deceiving customers" who is bold now? The way I see it and it's just a interpretation, as is what you are doing, is, that they fixed problems with consumer cards later, I'm talking 290X here btw. If you are not, then again I want proof of that, because I don't know what you mean then.



> Yes it is, when you market a card as 1000MHz, I don't care if you put the words "Up To" before it, the card should run at 1000MHz.  This is deceptive, period.  If they didn't want to be deceptive, they would have actually come out with a minimum clock the card would run at, you know, just like those other horrible companies Intel and nVidia do...



It's still not the same, and it's not important whether you can understand and/or accept it. I don't care about your ignorance; it doesn't change the hard fact that it is not the same as saying "1000 MHz". Yes, Nvidia's marketing is better, but no one here doubted that AMD is bad at marketing. That was already mentioned pages ago (by me or someone else).



> You can try to explain it away any way you want, but in the computer industry the Base Clock is the minimum the processor will run at, not the maximum.



Things change; it seems you are old and unable to accept change. Simply put: you are wrong.



> Sorry, I don't care what AMD calls it(note: they don't call it the base clock), or what you want to try to say it is.  I didn't confuse anything, I judge AMD and their technology, on the standards of the industry.



I'm not "trying" anything here. I explained to you what it is and what it is not. See, you are somewhat speculating on the matter, I'm not. Difference between knowledge and speculation. The problem here is that you are somewhat emotional on the matter and that makes this discussion somewhat unnerving and pointless, because you don't accept anything here, AMD is the "evil" for you and so on and so on. Emotional bullshit that is.



> If you have a processor that goes up to a certain clock but doesn't stay there due to heat or power consumption, then that clock is a turbo/boost clock.  The base clock is whatever minimum clock speed the processor runs at. Even professional reviewers on very well-known tech sites say they believe the 290X has an "unlisted base clock", and say "to only list the boost clock is being deceitful at best".  But hey, those professional reviewers must be idiots too... I'm sure you are way smarter, and know way more about graphics cards, than they do.



Well, I read some of those sites too, and I think they are getting it wrong, like you are, yes. *It's no turbo*. Again and again (and again and again...): IF it were a turbo, the custom 290Xs would not start at 1000 MHz or more (because it's their base clock). They would start at 800 and go up to 1000 or more, like GPU Boost from Nvidia. Throttling, or downclocking to save energy and then clocking back up to normal speed, is not the same as a turbo, because that normal speed is its base clock. Is that so hard for you to understand? Is it because you can't accept that it is something different, special to AMD, that you are unable to accept it? Those website authors have the same logical problem, btw.

Edit: A turbo is, btw, a temporary state; the base clock of AMD cards like the custom 290X is not temporary. If you can stress the card, it will stay at 1000 MHz or more and never clock down. Nvidia cards, on the other hand, change their clocks all the time - like CPUs do. See, this is again proof that it's not a turbo. A turbo is temporary. This is a base clock, because on custom cards it is not temporary. The same goes for the Fury X, as long as you can keep the card stressed, at least.

AND there are websites that see it exactly like I do. Oops. That completely destroys your argument, I guess. It doesn't seem that it's me against the world; rather, it's the ones who understand versus the ones who don't.
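One way to frame the disagreement above: given a log of observed clocks under sustained load, does the advertised figure behave like a floor (a base clock) or a ceiling (an "up to"/boost clock)? A purely illustrative sketch; the clock traces are made up:

```python
def classify_advertised_clock(observed_mhz, advertised_mhz):
    """Classify an advertised clock by how observed clocks relate to it."""
    if all(c >= advertised_mhz for c in observed_mhz):
        return "floor"    # card sustains or exceeds it: behaves like a base clock
    if max(observed_mhz) <= advertised_mhz:
        return "ceiling"  # card never exceeds it: behaves like an 'up to' clock
    return "mixed"        # clock moves both above and below the advertised figure

# A well-cooled card holding its advertised clock under load -> floor.
print(classify_advertised_clock([1050, 1050, 1050], 1050))     # floor
# A reference card throttling under its advertised clock -> ceiling.
print(classify_advertised_clock([1000, 950, 880, 840], 1000))  # ceiling
```

By this framing, both sides of the thread can be right about different cards: a reference card makes the advertised figure look like a ceiling, a well-cooled custom card makes it look like a floor.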



> Because, according to you, "AMD never betrayed anyone".



Maybe that statement was wrong, maybe not. Have you provided proof? No. You're an American? What was it again... "innocent until proven guilty". Seems like I win again. And I hope you know what I mean by "proof" - not what you have done up to this point, no. I'm terribly sorry.



> A fanboy is going to defend a company, try to explain away all their deceptive practices and act like they never even happened.  Sound familiar?  Yeah, it is what you keep doing.  I don't care if you use Intel/nVidia products.  Even fanboys will do that if they aren't completely stupid.  A fanboy doesn't always have to use their favorite brand, but they will defend them and act like they are somehow better than the other companies.  They'll defend clear wrong doing by the company like it hasn't even happened.  This is what you are doing.  You keep going on, "oh no, they never deceived anyone like Intel and nVidia" "they didn't retard overheating cards after the reviews were out, that never happened!"  "they weren't deceptive in marking 1000MHz cards that couldn't actually run at 1000MHz"...



Nice try. *claps slowly* But not good enough. No, I'm still no fanboy. I'm just smarter than you and understand things you don't. But again, nice try at dragging me down to your emotional bullshit level. I'm no fanboy, because that is just not my level; you don't get it. And btw, my understanding of a fanatic is much more realistic than yours. YES, fanatics will only buy their favored brand, they will not care about performance, and yes, they will defend everything, and no, they will never accept anything bad about their worshipped brand. You don't get what fanaticism is, do you? No, not even remotely. That's what I meant - I win.



> Yeah, the reviewers of the cards, professionals that deal with graphics cards for a living, called the practice deceptive.  But you think AMD never betrayed anyone, you have to defend that statement no matter what.  So those professional reviewers must be completely wrong, right?



Links, quotes, proof? I don't accept your word. You are way too emotional and subjective for me to believe you.



> Yeah... ok. This coming from the guy that still can't figure out the multi-quote feature. :rolleyes:



lol I used it many times, but talk more bullshit and I will keep laughing at you. 



> I didn't ignore your system spec, I just didn't care what they were.  Your comments and die hard intent to somehow prove that AMD is better than Intel and nVidia as a company, not product wise, is what makes you a fanboy.



1) You just said you are ignorant. Wow, cool. You are an easy opponent. The smart thing would be to ignore NOTHING. Any information is useful. 2) Where did I say that AMD is better than Intel and Nvidia? Nowhere did I say that. Those are just your dumb (negative) emotions making you fantasize these things. Something I already told you before, btw.



> And you want to talk about ignoring system specs, I would have to click on something to see your system specs, and I don't really care enough to bother.  However, my systems are put out in plain site, on every post I make.  And yet, you failed to read them, and even went so far as to try to act like I'm the one that doesn't read.  Wow...



Yep, but did I call you a fanboy? No. I called you a hater and a negative person. You, on the other hand, called me a fanboy, so your argument is pretty pointless, because I didn't do anything wrong. Btw, I've seen your signature many times.



> Odd, doesn't feel like I've lost.


Oh, but you have. Realizing it is something else entirely.


----------



## ShiBDiB (Feb 5, 2016)

this thread is off the rails.. I like it 

CHOO CHOO, the fanboy train is coming!


----------



## newtekie1 (Feb 5, 2016)

Kanan said:


> "Out right deceiving customers" who is bold now? The way I see it and it's just a interpretation, as is what you are doing, is, that they fixed problems with consumer cards later, I'm talking 290X here btw. If you are not, then again I want proof of that, because I don't know what you mean then.



Seriously, you talk down to me, and say you are smarter, and you don't even know what I'm talking about.  I'm not talking about the 290X, but since you seem to think you're so much smarter than me, go figure out what card I'm talking about yourself.



Kanan said:


> It's still not the same, and it's not important whether you can understand and/or accept it. I don't care about your ignorance; it doesn't change the hard fact that it is not the same as saying "1000 MHz". Yes, Nvidia's marketing is better, but no one here doubted that AMD is bad at marketing. That was already mentioned pages ago (by me or someone else).



And AMD's bad marketing is done to deceive.  Anything with "Up To" speeds is meant to deceive.  There is no getting around that fact.  And if I'm "ignorant" then so is every professional reviewer, because they all believe the same thing I do.  I don't think you'd be able to come up with a single respectable review of a 290/290X that calls the marketed speed the "base clock".



Kanan said:


> I'm not "trying" anything here. I explained to you what it is and what it is not. See, you are somewhat speculating on the matter, I'm not. Difference between knowledge and speculation. The problem here is that you are somewhat emotional on the matter and that makes this discussion somewhat unnerving and pointless, because you don't accept anything here, AMD is the "evil" for you and so on and so on. Emotional bullshit that is.



No, you're making stuff up to try to cover your fanboy statements.  AMD doesn't even call it a base clock.  So, yeah, your speculation is wrong.  Professional reviewers don't call it a base clock, and AMD doesn't call it a base clock.  So exactly how am I the one speculating and you the one who isn't?



Kanan said:


> Well, I read some of those sites too, and I think they are getting it wrong, like you are, yes. *It's no turbo*. Again and again (and again and again...): IF it were a turbo, the custom 290Xs would not start at 1000 MHz or more (because it's their base clock). They would start at 800 and go up to 1000 or more, like GPU Boost from Nvidia. Throttling, or downclocking to save energy and then clocking back up to normal speed, is not the same as a turbo, because that normal speed is its base clock. Is that so hard for you to understand? Is it because you can't accept that it is something different, special to AMD, that you are unable to accept it? Those website authors have the same logical problem, btw.



Only a fanboy would use the word "special" to describe thermal/power throttling.  Special implies it is something positive that only they do.  In reality, it is something negative that Intel and nVidia both do when necessary as well.

By the standards of the industry, it is a turbo/boost clock.  Again, I don't care what you want to say it is: if the card is designed to run consistently at a lower speed, and only reach that top speed for short periods before overheating or drawing too much power, then that upper clock is a turbo/boost clock.  The base clock is the minimum clock speed the processor will always run at under load, not the maximum.  The fact that 3rd party cooling allows the cards to run at that speed all the time doesn't change this.  3rd party cooling allows Intel and nVidia processors to run at their turbo/boost speeds constantly too; that doesn't make them the base clock.



Kanan said:


> Links, quotes, proof? I don't accept your word. You are way too emotional and subjective for me to believe you.



You know, you expect me to provide links and proof, how about you?  I don't just take your word.  Show me proof that AMD calls 1000MHz the base clock for the 290X.

As for the proof you want: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review

Oh, what's that, they list the 1000MHz as a boost clock?  Oh, and in the power and temperature page they talk about the unlisted base clock, and the 1000MHz being the turbo.  So there is your proof.

Oh, and here is the architecture explanation taken pretty much directly from the AMD press kit. Funny, they go into block diagrams on how it works, but never once mention this "special feature" you talk about.  You'd think if it was so special, they'd bother to mention it.




Kanan said:


> lol I used it many times, but talk more bullshit and I will keep laughing at you.



And you've either used it wrong or not at all every single time.  But keep on laughing, ignorance is bliss.



Kanan said:


> 1) You just said you are ignorant. Wow, cool. You are an easy opponent. The smart thing would be to ignore NOTHING. Any information is useful.



And yet, you ignore the specs listed in the sig of every post I've made, out in plain sight.  And you want to say I'm ignorant because I didn't click on something to see yours, even though they have nothing to do with the issue.  Ok...



Kanan said:


> 2) Where did I say that AMD is better than Intel and Nvidia? Nowhere did I say that. Those are just your dumb (negative) emotions making you fantasize these things. Something I already told you before, btw.



Umm, I already quoted them.

In defense of why AMD is better than Intel/nVidia, you claim AMD has "never betrayed anyone".  You have said on multiple occasions that AMD is not Intel, the context implying that they are somehow morally better than Intel, because Intel is a big company that can get away with it and AMD isn't.  The entire start of this argument is that you believe AMD will include the Wraith cooler with all their processors, again because Intel is big and evil and can get away with including crap coolers, but AMD is morally better and would never do that (even though they already do).

Oh, and back to that, did you happen to catch the latest news?  Turns out I was right: Wraith coolers will come only with the high-end processors, and the standard aluminum chunk coolers will come with anything 95W and lower.  So I guess we can just drop the whole thing now, because in the end you were in fact wrong.  History repeats itself.  Even though there is nothing anyone can do to change your mind that AMD is not Intel, and that they wouldn't think to include poor-quality coolers with their processors, and that all the processors are getting Wraith coolers, it turns out you're wrong.  So this seems like a good place to leave it and move on.


----------



## Kanan (Feb 6, 2016)

newtekie1 said:


> Seriously, you talk down to me, and say you are smarter, and you don't even know what I'm talking about.  I'm not talking about the 290X, but since you seem to think you're so much smarter than me, go figure out what card I'm talking about yourself.


Nope, I don't care enough to do that. Also, you are losing this point because you are unable to provide information/proof. It seems you don't want to discuss it, then. Your opinion of AMD is way off anyway, so it's unlikely you would have been right with your assumption.



> And AMD's bad marketing is done to deceive.  Anything with "Up To" speeds is meant to deceive.  There is no getting around that fact.  And if I'm "ignorant" then so is every professional reviewer, because they all believe the same thing I do.  I don't think you'd be able to come up with a single respectable review of a 290/290X that calls the marketed speed the "base clock".



I don't care enough to look it up for you, but I assure you on my name that there are reviews that see it as I do. And about marketing: you don't really understand marketing. You've proven that by now. Marketing is ALWAYS somewhat deceptive, ALWAYS. I cannot stress this enough. And basically everyone with the slightest clue about our capitalistic world knows this.

Btw, you made a very, very bold statement here: so EVERY reviewer shares your opinion? I want proof of that. Otherwise you're talking stupid bullshit here.



> No, you're making stuff up to try to cover your fanboy statements.  AMD doesn't even call it a base clock.  So, yeah, your speculation is wrong.  Professional reviewers don't call it a base clock, and AMD doesn't call it a base clock.  So exactly how am I the one speculating and you the one who isn't?


No, I'm not making stuff up. And I don't care that you are too dumb to accept that I'm no fanboy. By now you have proven to me that your social skills are limited, as is your character; that is why you are unable to accept this.
What does AMD call it, then? Proof? Again you are unable to provide information. And it's silly that I always have to ask for it. You are so limited, sorry.



> Only a fanboy would use the word "special" to describe thermal/power throttling.  Special implies it is something positive that only they do.  In reality, it is something negative that Intel and nVidia both do when necessary as well.


Only an idiot would call me a fanboy after I've provided so much proof that I am not, aside from it being more than obvious without me saying anything. As I said, it's your social limitations hindering you. I really pity you by now.



> By the standards of the industry, it is a turbo/boost clock.  Again, I don't care what you want to say it is: if the card is designed to run consistently at a lower speed, and only reach that top speed for short periods before overheating or drawing too much power, then that upper clock is a turbo/boost clock.  The base clock is the minimum clock speed the processor will always run at under load, not the maximum.  The fact that 3rd party cooling allows the cards to run at that speed all the time doesn't change this.  3rd party cooling allows Intel and nVidia processors to run at their turbo/boost speeds constantly too; that doesn't make them the base clock.


I discussed this with a friend of mine who is far smarter than you will ever be, and he shares my opinion, funnily enough - and I didn't take the moral or emotional road to get him to say the same as I do.
I already explained why it's no turbo and I won't do it again, just because you are ignorant or too dumb. Sorry.

Edit: A turbo means additional speed; AMD graphics processors run at reduced speed or base speed, so they have no turbo. AMD CPUs have turbos like Intel/Nvidia ones.

Intel Turbo Boost (Wiki):
*"Intel Turbo Boost* is a technology implemented by Intel in certain versions of its processors *that enables the processor to run above its base operating frequency* via dynamic control of the processor's clock rate."
https://en.wikipedia.org/wiki/Intel_Turbo_Boost

Nvidia GPU Boost (from Wiki):
"GPU Boost is a new feature which is roughly analogous to turbo boosting of a CPU. The GPU is always guaranteed to run at a minimum clock speed, referred to as the "base clock". This clock speed is set to the level which will ensure that the GPU stays within TDP specifications, even at maximum loads.[6] *When loads are lower, however, there is room for the clock speed to be increased without exceeding the TDP. In these scenarios, GPU Boost will gradually increase the clock speed in steps, until the GPU reaches a predefined power target* (which is 170W by default).[7] By taking this approach, the GPU will ramp its clock up or down dynamically, so that it is providing the maximum amount of speed possible while remaining within TDP specifications."

Clearly not the same tech AMD uses. AMD has no turbo.

Also, for the fun of laughing at you, because you don't know what a turbo is (Wiki: Turbocharger):
"A *turbocharger*, or *turbo* (colloquialism), from Greek "τύρβη" ("wake"),[1] (also from Latin "turbo" ("spinning top"),[2]) is a turbine-driven forced induction device *that increases an internal combustion engine's efficiency and power output by forcing extra air into the combustion chamber.*"

Also something an AMD GPU is not doing, because all it does is downclock, not clock higher. Also, if a Fury X had a turbo, it would not always run at 1050 MHz when stressed. It really does that, because it's the graphics card's base clock.
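The GPU Boost behavior described in the Wikipedia quote above (step the clock up while staying under a power target) can be sketched roughly like this. The step size, the linear power model, and all numbers are invented for illustration only:

```python
def gpu_boost(base_mhz, power_target_w, step_mhz=13, watts_per_mhz=0.2):
    """Start at the guaranteed base clock and step upward in increments
    while a (toy, linear) power estimate stays under the power target."""
    clock = base_mhz
    while (clock + step_mhz) * watts_per_mhz <= power_target_w:
        clock += step_mhz  # boost in steps while under the power target
    return clock

# With the 170 W default target mentioned in the quote and this toy
# power model, a 700 MHz base clock steps up to 843 MHz.
print(gpu_boost(700, 170))  # 843
```

This loop only ever moves the clock *up* from the base clock, which is the behavioral difference being argued here: in the "up to" scheme the advertised clock is the starting point and throttling moves down from it.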



> You know, you expect me to provide links and proof, how about you?  I don't just take your word.  Show me proof that AMD calls 1000MHz the base clock for the 290X.



I don't need to. Every shop does it. TPU does it too.
https://www.techpowerup.com/reviews/AMD/R9_290X/
Good enough for me. But of course you will again be unable to understand or accept it. Hint: they use the base clocks there, even if it is not explicitly written down. But as this discussion is more about your ego than about the matter itself, you naturally won't accept my point again.



> As for the proof you want: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review
> 
> Oh, what's that, they list the 1000MHz as a boost clock?  Oh, and in the power and temperature page they talk about the unlisted base clock, and the 1000MHz being the turbo.  So there is your proof.



TPU > anandtech. Me > you.



> Oh, and here is the architecture explanation taken pretty much directly from the AMD press kit. Funny, they go into block diagrams on how it works, but never once mention this "special feature" you talk about.  You'd think if it was so special, they'd bother to mention it.



You're really going crazy over that word "special", aren't you? Sorry, bro, but get a life.



> And you've either used it wrong or not at all every single time.  But keep on laughing, ignorance is bliss.


I don't care much if a limited/ignorant person like yourself calls me ignorant. You're in no position to criticize me.



> And yet, you ignore the specs listed in the sig of every post I've made, out in plain site.  And you want to say I'm ignorant because I didn't click on something to see yours, even though they have nothing to do with the issue.  Ok...


You are ignorant about other things, more likely. And they have very much to do with the issue.



> Umm, I already quoted them.
> 
> In defense of why AMD is better than Intel/nVidia you claim AMD has "never betrayed anyone".  You have said on multiple occasions that AMD is not Intel, the context implying that they are somehow morally better than Intel because Intel is a big company that can get away with it and AMD isn't.
> 
> The entire start of this argument is that you believe AMD will include the Wraith cooler with all their processors, again because Intel is big and evil and can get away with including crap coolers, but AMD is morally better and would never do that (even though they already do).
> 
> Oh, and back to that, did you happen to catch the latest news?  Turns out I was right: Wraith coolers will come only with the high-end processors, and the standard aluminum chunk coolers will come with anything 95 W and lower.  So I guess we can just drop the whole thing now, because in the end you were in fact wrong.  History repeats itself.
> 
> Even though there is nothing anyone can do to change your mind that AMD is not Intel, that they wouldn't think to include poor-quality coolers with their processors, and that all the processors are getting Wraith coolers, it turns out you're wrong.  So this seems like a good place to leave it and move on.


The point of my saying they "never betrayed anyone" was already discussed. I asked for counter-proof and you didn't provide any, so you lose that point too. And AMD is in fact not Intel; even a baby would understand this, but not you. Seems you are somewhat misled. And please keep thinking you are winning anything here, I'm very amused. If you want to end the discussion, don't answer, or accept my points. Either way, you are not in a position to dictate anything here, as you are still pretty wrong in your assumptions.


----------



## CAPSLOCKSTUCK (Feb 6, 2016)

To the whole community.

I am really sorry i played a part in opening this thread.

It is becoming destructive and petty and i think it should be closed.

I promise  i will not do it again.


----------



## Kanan (Feb 6, 2016)

CAPSLOCKSTUCK said:


> To the whole community.
> 
> I am really sorry i played a part in opening this thread.
> 
> ...


It's not your fault. Don't blame yourself.


----------



## CAPSLOCKSTUCK (Feb 6, 2016)

Kanan said:


> It's not your fault. Don't blame yourself.




im not....i am blaming you and others.
Take it to PM and dont make us all look like a bunch of pricks.

It is a nonsense argument which neither "party" will resolve. The AMD/INTEL debate is never a good one.


----------



## Kanan (Feb 6, 2016)

CAPSLOCKSTUCK said:


> im not....i am blaming you and others.
> Take it to PM and dont make us all look like a bunch of pricks.
> 
> It is a nonsense argument which neither "party" will resolve. The AMD/INTEL debate is never a good one.


It's more about "what is a turbo" and what isn't, and whether AMD is betraying anyone or not.

I don't care about "sides" here. But I do care about the truth. And that is all I'm interested in.

Edit: I would continue this in a PM if newtekie accepts, and I welcome any moderator shutting this down. This discussion isn't going anywhere, I guess, so...


----------



## 64K (Feb 6, 2016)




----------



## CAPSLOCKSTUCK (Feb 6, 2016)

64K said:


>



genuinely laughing my head off, i think i broke my chair.


----------

