
Intel Internal Memo Reveals that even Intel is Impressed by AMD's Progress

Counter-Strike at 10 FPS.
That's slideshow territory right there. :laugh:

I remember Cyrix... I mean Shitrix. Damn, I just showed my age. :laugh:
 
We also have VIA Master Race, they even do GPUs.
 
Shitrix, more like it. I remember the days of my first PC with a Shitrix PR300 and a Voodoo Banshee. Counter-Strike at 10 FPS.

Heh.. my first PC was a Cyrix 486-50. Seemed straightforward enough though. Things got more complicated after Pentium.
 
The only way I can figure Intel is that they got complacent after being on top for so long. Tech progression stagnated, and for a long while they were selling 4-core/8-thread CPUs at prices that should have bought 6-core/12-thread parts. It's the same lesson over and over again: when there is a lack of competition, the company on top charges more for less value and tech stagnates.
This.
Intel rested for far too long on their laurels and the urgency to innovate wasn't strong enough.
Now they're *real* interested... urgently.

I love it.

Look at the 8700K, it runs as hot as a mofo (partly due to the paste TIM that they use)
8700K, note the temps
240mm AIO cooled

 
What are your temps when doing MPEG-4 H.264 encoding/transcoding with Handbrake? Because that's when I see higher temps than I'd like.

I’m ripping some DVDs (that I have to stress, I own them!) and when compressing down the content from the raw MKV files I see high temps.

Edit: I've done some research and Handbrake's h.264 encoder does indeed use AVX 256-bit instructions.
Intel said:
Accelerating x265 with Intel® Advanced Vector Extensions 512 (Intel® AVX-512)
The x264 project for Advanced Video Coding (AVC) encoding and the x265 project for High-Efficiency Video Coding (HEVC) encoding are the two widely used media libraries that extensively use multiple generations of SIMD instructions on Intel architecture processors, from MMX technology all the way up to Intel AVX2.
AVX instructions, especially the 256-bit kind, can very much heat a processor up.
 
Yes, it is, and it can be a good one depending on the use case. It can EASILY blow the monetary savings of the CPU (core vs. core and thread vs. thread) right out the door. Overbuying on cores and threads in a datacenter environment can be quite detrimental to the bottom line on many fronts.
AMD sells server CPU models with 8 cores, not only models with 32-64 cores, and at lower prices compared to Intel. If there is a market where more than 2 or 4 cores is a negative, I am pretty sure they can start selling models with fewer cores. So where is the problem?
 
It wouldn't have been so bad years ago. It's remarkable just how few CPU companies/designers are around anymore.
Actually there's quite a lot of them. Just not in the PC business.
 
Actually there's quite a lot of them. Just not in the PC business.

I had some of those in mind too. Up until the recession or a bit before, it seemed like every UNIX company had their own chips. I don't even know what's around anymore except Power.
 
I had some of those in mind too. Up until the recession or a bit before, it seemed like every UNIX company had their own chips. I don't even know what's around anymore except Power.
While it's true those went the way of the Dodo, we now have ARM based designs creeping their way into that market.
 
They should check the security pillar. I think it's made of jello.

I liked that...;) Funny...;) Apropos!

Yes, it is, and it can be a good one depending on the use case. It can EASILY blow the monetary savings of the CPU (core vs. core and thread vs. thread) right out the door. Overbuying on cores and threads in a datacenter environment can be quite detrimental to the bottom line on many fronts.


*cough* For every piece of software that charges by the core, there are at least four competing pieces of software that charge according to other metrics, like, uh, perceived quality and professional reputation, or, uh, compatibility with varying software standards, or even, in some cases, security. Or some combination thereof! Unless the software is in the "my business cannot survive without this software" category, my sincere recommendation is to eschew these "licensed by the x86 core" applications as if they were spreading plague...;)

Intel are scrambling. This is the worst position they've been in since the Athlon 64 era, maybe even worse, as this time they're helmed by a band-aid choice of CEO, Bob Swan: a man with no understanding of the main products Intel develops, up against an MIT powerhouse who earned her PhD in electrical engineering.

Bingo! So much of a company's fate is inextricably tied to the person at the top. After the high point of the A64 at AMD, the point at which the company should have taken off like a bat out of hell, putting the wrong person in the CEO position came darn close to killing the company off instead, which would have been a tragedy and would have set general computing back decades. Thankfully it didn't happen: investors largely kept the clock ticking at AMD until the right CEO could be landed, and she had one foot in engineering, one foot in the consumer markets, and understood AMD's situation perfectly. Lisa Su was ideal on so many levels it defies description. The right ma... er, lady for the job! In spades! Lots of CEO candidates might have identified AMD's problems correctly, but few would have known how to solve them.

CEO mismanagement is the #1 cause of computer companies expiring: Commodore, to pick just one example, Cyrix, and many more. For years AMD had a CEO so utterly clueless about "What to do? Where to go?" that at one point AMD was actually selling Intel servers under the AMD brand! *talk about face palm* I thought at the time it was finally curtains for AMD, as selling servers for Intel had to be the bottom, but then Lisa Su came on deck, ready to knock the ball out of the park, little did I know...;)

AMD simply doesn't have the money to toss away like Intel has and wastes, but the interesting thing is that Intel's mistakes today are going to start exacting penalties with far greater short- and long-term effects on Intel, and it's all because AMD is making Intel look like a quite second-rate tech company. It's AMD executing on a dime these days, certainly not Intel.
ATM Intel has a huge cash war chest to sit on, but the flip side is that a company the size of Intel can burn through money like there's no tomorrow, so standing pat on past performance isn't going to work much longer. Either Intel will make the products people want, or it won't; there's no gray area left. It's a very binary proposition.
 
And we come back to the question of why did Intel put a finance guy in the CEO seat?
 
And we come back to the question of why did Intel put a finance guy in the CEO seat?

Bob Swan hasn't been at the head of the company long enough to be involved in any of the end-point decisions that have put them in the place they're in currently.

We'll need some time with him in the seat to see what comes of his tenure.
 
Cyrix CPUs were very solid BITD.
Really? I remember them being the weakest of the bunch. Ok with integer processing (but so was everybody else), but weakest when it came to FP.
But it would be nice if Cyrix and VIA were still around. Because now fanboys on either side, only have one brand to hate :D
 
What are your temps when doing MPEG-4 H.264 encoding/transcoding with Handbrake?
AVX instructions, especially the 256-bit kind, can very much heat a processor up.
I don't know, because I never use Handbrake. If I do copy a movie (a rare occurrence), I use DVDFab.
 
"maybe providing amd IP to Intels X86 many years ago wasn't such a smart long term move after all? " - voltage

"I think it's about Intel licensing x86 to AMD" - XiGMAKiD

"I think what he's on about was when things went 64-bit, or something...
AMD cross licensed AMD64 with Intel, so they could go on making x86 compatible CPUs." - TheLostSwede

The first one... English isn't my first language either, but I understood what he meant.
Then I wouldn't mind being enlightened as to what you think he meant. We have two different takes on this so far: one seems to think he meant it was a mistake for AMD to license AMD64 to Intel, the other that it was a mistake for Intel to license x86 to AMD. Both would be an oversimplification to the point of meaninglessness, not to mention that the two interpretations see two different failures (one on the AMD side, one on the Intel side).
 
"maybe providing amd IP to Intels X86 many years ago wasn't such a smart long term move after all? "
In the post-486/586 era, Intel needed an IP cross-licensing agreement with AMD as much as AMD needed it (and probably more).
AMD brought x86-64 to the deal, which Intel wanted, and arguably desperately needed.
The other big player, Cyrix, had no IP that Intel wanted or needed, and therefore could not effectively compete with Intel (Pentium) and AMD (Athlon).
 
The other big player, Cyrix, had no IP that Intel wanted or needed, and therefore could not effectively compete with Intel (Pentium) and AMD (Athlon).
If that is what the OP meant, it is still a very limited view of a very complicated issue that has, in the end, little to do with cross-licensing (which both parties and consumers have gained by) and more to do with each company's own engineering staff and innovations. For whom exactly was the AMD64 licensing deal not the "smart long term move after all" in this case? Intel is still the 60+ billion juggernaut vs. the AMD 6+ billion wallflower.

Was AMD supposed to not license AMD64? It was a great implementation, but even back then, with AMD winning nearly every performance and price contest and charging $1k for binned FX chips, Intel had the vast majority of the market secured. If AMD hadn't let them license it, they would have lacked some crucial instruction sets themselves, and Intel might just have gone forward with their own implementation and, considering their lock on the market (through PR, money and shady/illegal actions), might have succeeded even more. Intel without at least a competent AMD would also be the target of many anti-trust investigations and a potential splitting of the company.

Cyrix is also an interesting subject in and of itself, since they reverse-engineered Intel CPUs without actually having an x86 license (very clever, but you can't compete by reverse engineering alone for long). VIA was/is the only other player with such a license. Cyrix went in another direction before AMD64 became a thing.
 
In the post-486/586 era, Intel needed an IP cross-licensing agreement with AMD as much as AMD needed it (and probably more).
AMD brought x86-64 to the deal, which Intel wanted, and arguably desperately needed.
The other big player, Cyrix, had no IP that Intel wanted or needed, and therefore could not effectively compete with Intel (Pentium) and AMD (Athlon).
To add to the context, AMD also didn't have much IP that Intel wanted. But they bought what they needed from DEC and marketed that technology. It was probably a pretty risky move, but it paid off big time.
 
Really? I remember them being the weakest of the bunch. Ok with integer processing (but so was everybody else), but weakest when it came to FP.
Nah, Cyrix CPUs benchmarked fairly well in games and traded wins with all the rest. Their downside was that they ran hotter than AMD, Intel and VIA; even the WinChips ran cooler.
But it would be nice if Cyrix and VIA were still around. Because now fanboys on either side, only have one brand to hate :D
Right? It's not like the market isn't big enough for them now..
 
The cost of getting on par with the freaking Atoms in performance must be huge.
 
I'm glad the full memo is here, since it disappeared from the original Reddit site.
In my opinion, the real challenge to Intel comes from TSMC, rather than really from AMD itself. Their "7nm" may only be equivalent to Intel's 10nm, but they had it ready first.
It's true that Intel has as many people working on software for Intel chips as AMD has employees, period. So Intel has a good Fortran compiler for Intel chips, which is expensive, while AMD has a free Fortran compiler for AMD chips, from the open-source LLVM project, which only runs on Linux. So that's an area where Intel is ahead. But it made me think of the rivalry between IBM and Control Data, and the famous "including the janitor" memo.
More is not always better, even if the small size of AMD is limiting it in some ways.

But it would be nice if Cyrix and VIA were still around.

Given that Microsoft Windows needs an x86 to run on, what would be nice is if the x86 architecture didn't require licensing. So that in addition to SPARC, Sun could have made x86; in addition to PowerPC, IBM could have made x86 (actually, they did for a while, licensing it from Cyrix). Any company able to make a CPU ought to be able to make a CPU that can actually be used: one that can run Windows. And if Windows and the applications for it aren't distributed as source, because it isn't free like Linux, then that means the dominance of Windows has enshrined the dominance of x86. CPU makers can't compete on their merits, if they can't make x86 chips.

Of course, while SPARC and PowerPC and MIPS and Alpha and 680x0 never took the world by storm, ARM was able to get somewhere - but by carving out a new niche, in which it is dominant.

So the remedy for the lack of competition in the chip industry, barring x86 being taken from Intel, would be the government forcing Microsoft to move Windows to RISC-V and abandon x86. That, of course, is hardly likely to happen.
 
Given that Microsoft Windows needs an x86 to run on
That's only because of the need for backward compatibility with decades-old legacy applications. Take the need to support legacy applications out of the equation and suddenly x86 is no longer a requirement, ARM can be used instead. Most modern programs still have source available so yes, they can be compiled for ARM. Will it be painful at first? Yes, but it can be done.
 
Well, yes and no. They didn't become complacent in that sense; rather, they believed they were so far ahead that even though their first 10nm process was a mess, they thought they would have time to fix it, but alas... That's why they started a second 10nm process, as we know. On top of that, they decided to focus on a million other things: FPGAs, AI co-processors, GPUs, wireless data modems (3G/4G, and clearly 5G was another mess), IoT (another huge failure), mobile phone SoCs (failure), photonics and what not. This means they somewhat lost focus on the CPU business, as it was only part of what Intel made. Then they got a huge order for 4G modems (and an expected order for 5G modems) from Apple, due to Apple falling out with Qualcomm, and this ate up a lot of their production capacity. In other words, it's not so much complacency as having too many different businesses that don't quite fit and which used up a lot of resources. If you take a good look, it's not hard to see why Intel are in their current position. Yes, some of their "new" businesses have helped them make more money, but at the same time, they've lost focus on the good old x86 CPU.

Definitely a strategic miss, but I think you'll find it was nothing more than a poor decision in favor of short-term profitability... Bean-counters overriding engineering decisions.

Take a look at the investment Intel made (and reversed) in ASML a few years ago. Those guys were the only ones delivering EUVL and it looked like Intel had shut out the other foundries by buying up all of ASML's upcoming production. Intel didn't just buy a big stake in ASML, they pre-ordered enough of the upcoming production to shut out Samsung and TSMC. Then what happened? I'm sure Intel insiders could comment on this, but it looks like Intel just decided they'd get along just fine without EUV.

Just look back a couple of years at TSMC's roadmap and the indications that Intel was bypassing extra-dense litho for low-end silicon (indicators here and here). I just keep wondering how Intel thought they were going to continue the ramp-up to greater capacity per die without a clear path beyond DUV. Your 10nm plans are adorable, Intel.
 
Not for long. The 5700XT is about to be released and the later Navi (kraken) is coming as well.

Vega was a product initially designed for the compute space, not gaming. With some tweaks you could apply it as a gaming GPU, but with a lot of overhead.
I've only just noticed your response. As much as everyone would like AMD to be competitive, it's not something that's going to happen anytime soon. Nvidia still controls the high end of the market.
 