# Intel Says AMD Did a Great Job (with Ryzen 3000), But Intel CPUs are Still Better



## AleksandarK (Aug 26, 2019)

It is no secret that AMD has scored a huge success with its long-awaited "Zen" CPUs and returned to the PC market stronger than ever. Intel, however, long downplayed AMD's presence and only recently admitted what an impact AMD has made. At this year's Gamescom, Intel started a new campaign against AMD, arguing that Intel's CPUs are still the better performers, with "real-world benchmarks" backing that claim.

"A year ago when we introduced the i9 9900K," says Intel's Troy Severson, "it was dubbed the fastest gaming CPU in the world. And I can honestly say nothing's changed. It's still the fastest gaming CPU in the world. I think you've heard a lot of press from the competition recently, but when we go out and actually do the real-world testing, not the synthetic benchmarks, but doing real-world testing of how these games perform on our platform, we stack the 9900K against the Ryzen 9 3900X. They're running a 12-core part and we're running an eight-core," he adds. "I'll be very honest, very blunt, say, hey, they've done a great job closing the gap, but we still have the highest performing CPUs in the industry for gaming, and we're going to maintain that edge."



 

 

 

 

Here Intel contends that AMD wins in synthetic workloads, while Intel's CPUs win in real-world usage scenarios for applications like Microsoft Office, Adobe Lightroom, Photoshop, and more. Besides claiming better overall productivity performance, Intel also claims a few other trophies in areas like gaming, where the Core i7-9700K "is on par or better" than the AMD Ryzen 9 3900X across many of the games tested.


 

 
In our own testing, we found the claim about gaming performance to be true: Intel's Core i7-9700K did perform better than the Ryzen 9 3900X. However, when it comes to overall performance, which also includes many tasks besides gaming, such as productivity and science workloads, the claim does not hold up.

*View at TechPowerUp Main Site*


----------



## eidairaman1 (Aug 26, 2019)

I find this funny, now intel's talking crap because they are stuck. Childish bs from them.


----------



## laszlo (Aug 26, 2019)

what else could they say to protect their over-priced cpu's   ?


----------



## ratirt (Aug 26, 2019)

eidairaman1 said:


> I find this funny, now intel's talking crap because they are stuck. Childish bs from them.


Oh man, you have no idea. When I read the first paragraph I literally fell off the chair. What a ruse! Intel just can't stomach the fact that AMD products are just as good as theirs, or even better. "Real-world benchmarks" from Intel... I see Intel likes to mess around with catchy phrases and cheap tricks.


----------



## Deathmourne (Aug 26, 2019)

Sure Intel... Sure....


----------



## dyonoctis (Aug 26, 2019)

I guess Puget Systems doesn't know how to benchmark then.


----------



## Chomiq (Aug 26, 2019)

"Real world benchmarks" like the streaming benchmarks that Intel pushed on reviewers and now says that they aren't "real world" once they show improved performance with AMD?


----------



## Crackong (Aug 26, 2019)

The 3950X is coming. Ye know why.


----------



## TheLostSwede (Aug 26, 2019)

Ok, as someone who has worked for a company that was part of BAPCo, which makes SYSmark, I can tell you that this benchmark is tuned to perform better on Intel CPUs.
It has been this way since the start of SYSmark, so I wouldn't read anything into the results coming out of that benchmark.
Using it to claim Intel performs better than AMD is a bunch of crap.

AMD and Nvidia (as well as VIA) were in fact member companies at one point, but not any more.





BAPCo consortium - Wikipedia (en.wikipedia.org)


----------



## dj-electric (Aug 26, 2019)

Commence 13 more pages of people who keep bashing each other for companies that don't care about them


----------



## SIGSEGV (Aug 26, 2019)

intel as usual. LMAO


----------



## Chaitanya (Aug 26, 2019)

dyonoctis said:


> I guess Puget Systems doesn't know how to benchmark then.
> View attachment 130100


Unfortunately, Adobe applications are heavily optimized for Intel, especially the likes of Premiere and Lightroom (which cannot fully utilize multi-core CPUs). Unless Adobe recodes its applications to be vendor-neutral and utilize multi-core CPUs effectively, things aren't going to change on that front.








Video: H.264 Hardware Acceleration in Adobe Media Encoder - Good or Bad? (www.pugetsystems.com)

At first glance, the recent addition of "hardware acceleration" when exporting to H.264 and H.265 in Media Encoder and Premiere Pro provides a huge boost in performance for many users. Unfortunately, it is not a perfect technology and results in lower-quality video than using the standard...


----------



## Blueberries (Aug 26, 2019)

They should be less concerned with convincing people their processors are 1-3% faster and more concerned with AMD's solution being 1-3% slower, with more threads, at a significantly lower price.


----------



## GoldenX (Aug 26, 2019)

Damage control much?
Maybe if they used all that money to get a decent product on time, they would not be this bad now.


----------



## yeeeeman (Aug 26, 2019)

I think this is quite unprofessional on their part. Sure, they have issues, but press like this makes them look even worse. Why do you need to say your product is better if it is indeed better?


----------



## Vayra86 (Aug 26, 2019)

Yes, Intel. You are doing just fine.

Keep at it!


----------



## ShurikN (Aug 26, 2019)

What's the point of these slides when Ryzen 3000 has been out for almost two months and we've all seen how it performs, especially in productivity, where it destroys everything Intel has to offer...
SYSmark uses *real* applications
*Word, Excel, PowerPoint... the ones that would run on a Casio calculator...

That's just embarrassing.


----------



## Lionheart (Aug 26, 2019)

This made me chuckle more than it should have


----------



## Hossein Almet (Aug 26, 2019)

Per Intel's analysis, people shouldn't buy the 9920X, because it doesn't perform as well in MS Office and Adobe Lightroom as the 9900K. What matters more in terms of real-world benefits is connectivity: Ryzen 3000 offers more USB 3.2 Gen 2, PCIe 4.0 and, depending on the motherboard, up to 6 SATA ports.


----------



## Dexiefy (Aug 26, 2019)

Intel in 1 word: Pathetic.


----------



## Xaled (Aug 26, 2019)

Intel's real face and personality.


----------



## biffzinker (Aug 26, 2019)

Crackong said:


> 3950x is coming, Ye know why .


Look out when that drops. Intel is just getting started with the bs excuses.


----------



## LocutusH (Aug 26, 2019)

Technically, they didn't say anything that isn't true.

(No fanboy bullshit here; I also opted for a 3700X.)


----------



## Bwaze (Aug 26, 2019)

But in those "real world" tests the 9900K has no advantage over the 9700K. Why did they make an 8-core, 16-thread processor then, and why are they planning a 10-core, 20-thread one for the beginning of 2020?

To say that the only thing that really uses many cores is Cinebench R20 is frankly pathetic. Who are they targeting with this information? The enthusiastic Microsoft Office extreme-overclocking crowd?


----------



## FordGT90Concept (Aug 26, 2019)

Hey, I'm happy because we actually have competitors now.  We, the consumers, win in this environment.


----------



## Hardware Geek (Aug 26, 2019)

I'ma get me a 9900k cause Intel would never lie about their products. /s


----------



## FreedomEclipse (Aug 26, 2019)

Next thing, Intel will issue another press release saying "my dad will fight your dad" - Stay tuned for more DrAmA


----------



## PanicLake (Aug 26, 2019)

The fun part to me is that the Ryzen 7 3700X, compared to the i9 9900K, costs 2/3 (66%) the price but is only 3-4% slower in CPU relative performance and 5-7% slower in gaming performance.
A total win if you ask me!
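Spelling out that price/performance arithmetic (a rough sketch; the ~$329 and ~$488 launch prices and the relative-performance percentages quoted above are taken as assumptions):

```python
# Rough perf-per-dollar comparison using the figures quoted above
# (approximate launch prices; 3700X assumed ~4% slower overall).
price_3700x, price_9900k = 329.0, 488.0
perf_3700x, perf_9900k = 0.96, 1.00  # 9900K normalized to 1.0

ppd_3700x = perf_3700x / price_3700x  # performance per dollar
ppd_9900k = perf_9900k / price_9900k
ratio = ppd_3700x / ppd_9900k
print(f"3700X delivers {ratio:.2f}x the 9900K's performance per dollar")
```

Even at the pessimistic end of the range (7% slower), the ratio stays well above 1, which is the whole point.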


----------



## punani (Aug 26, 2019)

laszlo said:


> what else could they say to protect their over-priced cpu's   ?



I find it funny that by these statements and boasts they believe they are creating confidence in their products. But for anyone in the business with a brain, the effect is the opposite.


----------



## GeorgeMan (Aug 26, 2019)

Key word: "gaming". I'd say low-res, high-FPS gaming, competitive online only. For everything else they are completely outclassed.


----------



## lexluthermiester (Aug 26, 2019)

eidairaman1 said:


> I find this funny, now intel's talking crap because they are stuck. Childish bs from them.


I don't know that I would go so far as to call it "Childish" per se, but it is the typical marketing spiel.

@AleksandarK's conclusion about Intel's claim being only applicable to certain games is correct. Every review I've read from reputable sources show the same result. If you're gaming and have lots of money, go Intel. If you're doing literally anything else, regardless of budget, go AMD. If you want to game, but have a limited budget, go AMD. This is true as of the time of this comment.


----------



## yotano211 (Aug 26, 2019)

FordGT90Concept said:


> Hey, I'm happy because we actually have competitors now.  We, the consumers, win in this environment.


hell yea, just 2 years ago I was on a laptop with 4 cores and now it's 8 cores.


----------



## Chomiq (Aug 26, 2019)

FordGT90Concept said:


> Hey, I'm happy because we actually have competitors now.  We, the consumers, win in this environment.


Except Intel is stuck in their logic of "our product is the best for gaming and you WILL pay for it" and refuses to adjust their prices. We saw what? A single price drop of $20 on the 9600K, only recently, since Zen 2 launched.


----------



## Hyderz (Aug 26, 2019)

From a consumer's point of view, the current CPU offerings from both parties are good.
If you compare at the same price point, AMD offers more cores and threads at $499 and is a damn good CPU for productivity workloads that utilize all those cores.
Intel, on the other hand, offers great gaming performance with its $499 CPU, but the Ryzen is no slouch when it comes to gaming either.
Buy either platform and you will have a fast desktop.

Competition is great; as consumers we have more choices now. I just hope we see slightly lower prices, as... $499 for a mainstream high-end CPU is rather steep.


----------



## FordGT90Concept (Aug 26, 2019)

Chomiq said:


> Except Intel is stuck in their logic of "our product is the best for gaming and you WILL pay for it" and refuses to adjust their prices. We saw what? A single price drop of $20 on the 9600K, only recently, since Zen 2 launched.


2009-2017: ~$350 would only get you a 4c/8t processor from Intel
early 2017: AMD launched Ryzen 1700 for $329, an 8c/16t processor.
late 2017: Intel launched 8700K in response: 6c/12t for $360
early 2018: AMD launched 2700 for $300, an 8c/16t processor.
late 2018: Intel followed it up with a 9700K: 8c/8t for $374

Because of AMD, Intel went from 8 years of releasing the same damn 4c/8t processor over and over on smaller processes (saving them money) to 8c/8t in just two years... all for roughly the same price! We're getting double the cores from both vendors at no extra cost (actually less, because of inflation)! A win for everyone... except Intel, who has been milking the market dry for over a decade.

My ~$425 spent in 2015 on the 6700K (4c/8t) now goes twice as far at both AMD and Intel, because AMD brought back the competition.
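The cores-per-dollar swing in that timeline is easy to tally (a quick sketch using the approximate launch prices and core counts listed above):

```python
# Cores per $100 at launch, from the approximate prices quoted above.
cpus = [
    ("i7-6700K (2015)", 4, 425),
    ("Ryzen 7 1700 (2017)", 8, 329),
    ("i7-8700K (2017)", 6, 360),
    ("Ryzen 7 2700 (2018)", 8, 300),
    ("i7-9700K (2018)", 8, 374),
]
for name, cores, price in cpus:
    print(f"{name}: {100 * cores / price:.2f} cores per $100")
```

From under 1 core per $100 in 2015 to well over 2 by 2018, the doubling holds regardless of which vendor you pick.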


I don't really care which you buy right now.  They're both a good choice.  The market finally has some balance again.


----------



## employee24601 (Aug 26, 2019)

Wow, the smell of desperation from Intel is almost overpowering.

Intel wants my money, but AMD has earned it.


----------



## TheGuruStud (Aug 26, 2019)

This is like Intel going through the stages of grief lol


----------



## Mephis (Aug 26, 2019)

I don't understand how this is childish on Intel's part. All they did is point out that they still hold a small advantage in single-thread. They said nothing about multi-thread performance, or performance per dollar, or performance per watt. They did exactly what they should do: you take the area where you have an advantage and you market the hell out of it. AMD did it with Bulldozer (first 5 GHz CPU, higher module counts) and they try to do it with their GPUs.

Did people really expect Intel to say "OK guys, AMD put out great CPUs. Time to fold up shop, we are never going to sell another chip again"?


----------



## TheDeeGee (Aug 26, 2019)

If they had any common sense they would have made the 10th Gen use the same socket.

But Intel logic will be Intel logic.


----------



## LocutusH (Aug 26, 2019)

Mephis said:


> I don't understand how this is childish on Intel's part. All they did is point out that they still hold a small advantage in single-thread. They said nothing about multi-thread performance, or performance per dollar, or performance per watt. They did exactly what they should do: you take the area where you have an advantage and you market the hell out of it. AMD did it with Bulldozer (first 5 GHz CPU, higher module counts) and they try to do it with their GPUs.
> 
> Did people really expect Intel to say "OK guys, AMD put out great CPUs. Time to fold up shop, we are never going to sell another chip again"?



It's just trendy to bash Intel nowadays, whether it's logical or not.
AMD fanboys (playing with Cinebench) can take revenge after all these years. Kind of understandable.

If I were only after gaming, I would have bought a 9700K. But I didn't want a dead-end motherboard that gets no more processors, so I ended up with a 3700X. Which is slower in games by a few percent, which I can probably regain by overclocking the RAM and IF. End of story.


----------



## XiGMAKiD (Aug 26, 2019)

Yes yes yes your CPUs are great now get back to the lab and give us 10nm


----------



## Turmania (Aug 26, 2019)

Funny, most teens won't know this, but when AMD had the performance advantage in the single-core days, it was Intel who made the Core 2 Duo, started the multi-core trend, and took back the performance advantage. And let me tell you, back then AMD was asking for more money. Now, after almost two decades of being nowhere near competitive, they have made good progress. Underlying all this is that none of these companies is morally clean.


----------



## TheGuruStud (Aug 26, 2019)

Turmania said:


> Funny, most teens won't know this, but when AMD had the performance advantage in the single-core days, it was Intel who made the Core 2 Duo, started the multi-core trend, and took back the performance advantage. And let me tell you, back then AMD was asking for more money. Now, after almost two decades of being nowhere near competitive, they have made good progress. Underlying all this is that none of these companies is morally clean.



Wut? Your info is pretty faulty.


----------



## windwhirl (Aug 26, 2019)

Bwaze said:


> But in those "real world" tests the 9900K has no advantage over the 9700K. Why did they make an 8-core, 16-thread processor then, and why are they planning a 10-core, 20-thread one for the beginning of 2020?
> 
> To say that the only thing that really uses many cores is Cinebench R20 is frankly pathetic. Who are they targeting with this information? The enthusiastic Microsoft Office extreme-overclocking crowd?




Hey!! Those spreadsheets are stupidly complex and pretty much alive with data and macros! I need to overclock the crap out of the CPU to get things done in time /s


----------



## Dragonsmonk (Aug 26, 2019)

Turmania said:


> Funny, most teens won't know this, but when AMD had the performance advantage in the single-core days, it was Intel who made the Core 2 Duo, started the multi-core trend, and took back the performance advantage. And let me tell you, back then AMD was asking for more money. Now, after almost two decades of being nowhere near competitive, they have made good progress. Underlying all this is that none of these companies is morally clean.



Be that as it may - the AMD from 20 years ago is nowhere near the same as it is today... I am glad to see that Intel is resorting to their usual shenanigans and that we finally have full competition.


----------



## Redwoodz (Aug 26, 2019)

Funny, they didn't mention security anywhere except in the fine print of their graphics, which states: "Benchmarks may not have been run with all current available security patches."
I wonder if they'd still win benches with all updates installed?


----------



## dyonoctis (Aug 26, 2019)

Turmania said:


> Funny, most teens won't know this, but when AMD had the performance advantage in the single-core days, it was Intel who made the Core 2 Duo, started the multi-core trend, and took back the performance advantage. And let me tell you, back then AMD was asking for more money. Now, after almost two decades of being nowhere near competitive, they have made good progress. Underlying all this is that none of these companies is morally clean.


The Core 2 Duo wasn't Intel's first dual-core; the Pentium D was (two Pentium 4s glued together), and AMD released (at least to reviewers) the Athlon 64 X2 just a few weeks later (which actually had two cores on the same die). But yes, while AMD had the performance crown, they asked for more money.

This article gives some insight into the context of 2005; AMD and Intel had had dual-core in mind for a long time, they just had to wait for the manufacturing process to mature before making it a reality:





AMD's dual core Opteron & Athlon 64 X2 - Server/Desktop Performance Preview (www.anandtech.com)


----------



## 64K (Aug 26, 2019)

Poor Intel. I feel sorry for them...........

Not!

They were arrogant and greedy when on top for so many years. Charging for a 4 core 8 thread CPU what they should have been charging for a 6 core 12 thread CPU. Funny how they managed to do that after Ryzen came out. Putting crappy TIM under the heat spreader on their unlocked CPUs to save a few pennies instead of using solder. When overclockers complained about the poor overclocking potential without causing instability Intel's response was, "Then don't overclock".


----------



## Darmok N Jalad (Aug 26, 2019)

dyonoctis said:


> The Core 2 Duo wasn't Intel's first dual-core; the Pentium D was (two Pentium 4s glued together), and AMD released (at least to reviewers) the Athlon 64 X2 just a few weeks later (which actually had two cores on the same die). But yes, while AMD had the performance crown, they asked for more money.
> 
> This article gives some insight into the context of 2005; AMD and Intel had had dual-core in mind for a long time, they just had to wait for the manufacturing process to mature before making it a reality:
> 
> ...


Prescott was also a poor performing product, so Athlon64 was already a better product. Intel knew AMD was about to go dual-core, and they engineered the Pentium D using MCM. Core came along later as an evolution of Pentium M. Pricing usually isn’t as competitive when one product is clearly better, at least at the high end. It’s why nvidia sells $1200 GPUs.


----------



## DeathtoGnomes (Aug 26, 2019)

I'd like to know what Intel considers "real world". Or is this just more Intel trolling of AMD again?


----------



## lexluthermiester (Aug 26, 2019)

DeathtoGnomes said:


> I'd like to know what Intel considers "real world". Or is this just more Intel trolling of AMD again?


It might be a bit of trolling on top of the marketing mumbo-jumbo.


----------



## fynxer (Aug 26, 2019)

Intel's CPU overpricing is a choice they are making, NOT something forced on them by the 14+++++++++ node.

As before, Intel will only drop prices as a last resort.

Their 9700K and 9900K are at least 20-30% overpriced at the moment.

The problem is that AMD will not push Intel too hard on pricing right now, because they also want to make good money.

What I hear from Sweden's biggest computer retailer is that AMD is about to make a big splash on Black Friday this year to own most of the CPU sales.

AMD will give Intel a hard time before Christmas to mess with their 10000-series CPU release at CES.

The more customers AMD can take from Intel before Christmas, the harder it will be for Intel to gain momentum with their 10000-series CPUs; going into the low-sales season of Q1 2020, it will be extra hard to pull off any major sales figures with the new series.

If Intel's sales are really bad before Christmas, it could lead to an overstock of 9000-series CPUs, forcing Intel to push back the 10000 series to Q2 2020 to have time to clear 9000-series stock.


----------



## Midland Dog (Aug 26, 2019)

ratirt said:


> Oh man, you have no idea. When I read the first paragraph I literally fell off the chair. What a ruse! Intel just can't stomach the fact that AMD products are just as good as theirs, or even better. "Real-world benchmarks" from Intel... I see Intel likes to mess around with catchy phrases and cheap tricks.


sorry but gunna have to call you out on this one, legit every benchathon intel wins, every hof entry in the top 3 is intel nv and the only place ive seen amd cpus win hands down is legit cinebench


----------



## biffzinker (Aug 26, 2019)

Midland Dog said:


> sorry but gunna have to call you out on this one, legit every benchathon intel wins, every hof entry in the top 3 is intel nv and the only place ive seen amd cpus win hands down is legit cinebench


If you say so


----------



## xkm1948 (Aug 26, 2019)

With all my love for AMD CPU, especially their Threadripper lines, I kinda agree with Intel on this one.

Been doing quite a lot of R&D on the lab TR platform. I must say although AMD is providing top notch hardware per dollar, their software support leaves a lot to be desired. It takes software and hardware support to make a good ecosystem.

This is just for professional use. I have yet to move to TR for gaming so I cannot say regarding the gaming part.


----------



## Forde (Aug 26, 2019)

As someone who has a 9900K and a 3800X... this isn't wrong, but positioning the 9700K as a strong proc isn't going to end well in 3 years' time.


----------



## svan71 (Aug 26, 2019)

Unlike Intel engineers the past 5 years, most people don't sit and game all day.


----------



## Dragonsmonk (Aug 26, 2019)

xkm1948 said:


> With all my love for AMD CPU, especially their Threadripper lines, I kinda agree with Intel on this one.
> 
> Been doing quite a lot of R&D on the lab TR platform. I must say although AMD is providing top notch hardware per dollar, their software support leaves a lot to be desired. It takes software and hardware support to make a good ecosystem.
> 
> This is just for professional use. I have yet to move to TR for gaming so I cannot say regarding the gaming part.



You mean since none of the software devs bother to optimize for more than Intel?

Well that may be changing soon, but to blame AMD for that is an interesting approach


----------



## Mephis (Aug 26, 2019)

Dragonsmonk said:


> You mean since none of the software devs bother to optimize for more than Intel?
> 
> Well that may be changing soon, but to blame AMD for that is an interesting approach



Who would you like him to fault?

Intel? - are we going to fault them for having the dominant position in the market?

Developers? - of course they are going to optimize for Intel over AMD. Again, this is where market share comes in.

Also, he is talking about support directly from Intel and AMD, not just developers. Intel has a much bigger support system for their clients than AMD does. I remember seeing posts on the interwebs, when the first Epyc CPUs launched, about how major corporations were looking to buy AMD but decided against it because AMD either couldn't or didn't want to let customers get time with engineers and/or architects.


----------



## xkm1948 (Aug 26, 2019)

Well, our lab IS part of the developer community trying to optimize our specific applications for AMD's Zen architecture.

To phrase it better: when a specific programming error occurs that is not listed in AMD's technical whitepaper, contacting AMD tech support is useless, as they offer zero support for developers. Most of the time users are left to trial and error all by themselves.


----------



## Darmok N Jalad (Aug 26, 2019)

xkm1948 said:


> With all my love for AMD CPU, especially their Threadripper lines, I kinda agree with Intel on this one.
> 
> Been doing quite a lot of R&D on the lab TR platform. I must say although AMD is providing top notch hardware per dollar, their software support leaves a lot to be desired. It takes software and hardware support to make a good ecosystem.
> 
> This is just for professional use. I have yet to move to TR for gaming so I cannot say regarding the gaming part.





Mephis said:


> Who would you like him to fault?
> 
> Intel? - are we going to fault them for having the dominant position in the market?
> 
> ...



I’m not sure if this is still true, but I know in the past, Intel compilers had much to do with this performance optimization. In the same manner, nvidia works with developers to optimize their titles for performance. Intel and nvidia dictate the standards, essentially. AMD seems to take the “open” route, like with OpenCL and AMD64 support, and they partner through custom silicon designs. AMD needs a raw performance advantage because they don’t dictate standards or optimize software like the competition. Not saying that is a good thing, but it seems like the different approaches get us these results.


----------



## Mephis (Aug 26, 2019)

Darmok N Jalad said:


> I’m not sure if this is still true, but I know in the past, Intel compilers had much to do with this performance optimization. In the same manner, nvidia works with developers to optimize their titles for performance. Intel and nvidia dictate the standards, essentially. AMD seems to take the “open” route, like with OpenCL and AMD64 support, and they partner through custom silicon designs. AMD needs a raw performance advantage because they don’t dictate standards or optimize software like the competition. Not saying that is a good thing, but it seems like the different approaches get us these results.



I agree with that to an extent, except for one point: AMD64 is not and never was an open standard. You need a license from AMD to implement it. Intel got that from them as part of a deal that opened up Intel's patents to AMD in exchange for AMD's. But you and I couldn't design a CPU using it, the way we could with RISC-V.


----------



## xkm1948 (Aug 26, 2019)

"Open everything" is a wonderful scheme: basically, it is letting everyone else do the R&D for you. Kinda lazy approach, TBH. Not particularly fond of this model. Had tons of trouble back in the day writing programs hoping to use OpenCL on the Fiji architecture. TONS of problems.


----------



## Tomgang (Aug 26, 2019)

Shut up and take my Intel money... while I spend my real money on a Ryzen 9 3950X. In your face, Intel.

So what do I want? Hmm, 8 Intel cores or 12 AMD cores for the same price, or how about 16 Intel cores for about twice the money a 3950X will cost?

Thanks but no thanks, Intel. Intel might win in most games, but not by that much, and besides that AMD wins in almost everything else, from price-performance and raw power to power consumption. Just look at Intel's Comet Lake at up to 125 W just for base clock, while AMD's 16-core is just 105 W, with 6 more cores.

I can easily live with losing 1-15% gaming performance to get twice the cores of the i9 9900K for not much more money, at half the price of the i9 9960X. The 3950X is really the best all-around CPU for those who want a gaming CPU that still packs a punch in workstation and encoding loads. So the 3950X is my CPU of choice. Intel can keep their 14 NM+++++++++++++++++++++++++++++ or however many they are up to now.

This one goes to intel: *This Is So Sad Alexa Play Despacito*


----------



## juiseman (Aug 26, 2019)

fynxer said:


> Intel's CPU overpricing is a choice they are making, NOT something forced on them by the 14+++++++++ node.



Incorrect; they are up to 14++++++++++ now. No big deal; they just forgot 1 more +... lol...


----------



## Nkd (Aug 26, 2019)

yeeeeman said:


> I think this is quite unprofessional from their part. Sure, they have issues, but press like this makes them look even worse. Why do you need to say your product is better if it is indeed better?



This! The company I work for always tells me one thing: respect your competition, don't talk shit about them. Heck, we don't even mention the competition when we show the value in our product and why it's better. If you know your product is better, you don't need to talk down the competition or invest more time in them.



Tomgang said:


> Shut up and take my Intel money... while I spend my real money on a Ryzen 9 3950X. In your face, Intel.
> 
> So what do I want? Hmm, 8 Intel cores or 12 AMD cores for the same price, or how about 16 Intel cores for about twice the money a 3950X will cost?
> 
> ...



The 3950X might not be a bad chip for gaming. Trust me. Turn on Game Mode; it will likely disable one chiplet, and now you have the top-binned chiplet boosting way higher, with 8 cores and SMT. I did this on my 3900X and my cores were boosting way higher in gaming. But the 3950X will have the upper hand, since you get 8 cores that are top-binned. Best of both worlds. It will be fun to see how it performs with Game Mode.


----------



## Easo (Aug 26, 2019)

Oh yeah, Intel, I am so going to buy something noticeably more expensive because of 1-5%.
There is this thing called real life, in which the absolute majority does not have the TOP product but a more mainstream one. And there AMD wins and you lose. It is just that simple.
Go glue your 10nm together instead...


----------



## juiseman (Aug 26, 2019)

Real question here: what is AMD's official description of Game Mode? Does it disable cores or SMT (hyper-threading), or is it just a turbo thing? Why do you have to restart every time before enabling/disabling it? Why can't it be done in Windows?


----------



## Tomgang (Aug 26, 2019)

Nkd said:


> This! The company I work for always tells me one thing: respect your competition, don't talk shit about them. Heck, we don't even mention the competition when we show the value in our product and why it's better. If you know your product is better, you don't need to talk down the competition or invest more time in them.
> 
> 
> 
> The 3950X might not be a bad chip for gaming. Trust me. Turn on Game Mode; it will likely disable one chiplet, and now you have the top-binned chiplet boosting way higher, with 8 cores and SMT. I did this on my 3900X and my cores were boosting way higher in gaming. But the 3950X will have the upper hand, since you get 8 cores that are top-binned. Best of both worlds. It will be fun to see how it performs with Game Mode.



Yeah, the 3950X should have binned chiplets and the highest boost clocks in one package. It should give the best gaming performance while still packing a mean punch in workstation and similar loads. Add some good memory and it should be a winner for best overall CPU, the best mix of gaming and workload. Going to add a G.Skill Trident Z Neo DDR4-3600 CL14-15-15-35 1.40V 32 GB kit with Samsung B-die memory and try a bit higher voltage to get maybe 3733 MHz and/or lower timings. That should be a winning setup, as Ryzen likes high clocks and low timings on memory.


----------



## GreiverBlade (Aug 26, 2019)

Very compelling... I guess my end-of-year buy list will be... dyed... in... red.


Side note... when will they push a microcode patch to patch the patch they patched to disable OC on K CPUs on Win10? Because I am still waiting... I set the OC to 4.4 GHz but it never goes above 3.9 GHz. Well, at least it doesn't BSOD anymore because of that, though...


----------



## dyonoctis (Aug 26, 2019)

xkm1948 said:


> "Open Everything" is a wonderful scheme: basically it is letting everyone else do the R&D for you. Kinda lazy approach TBH. Not particularly fond of this mode. Had tons of trouble back in the days writing programs hoping to use OpenCL on the Fiji arc. TONS of problems.


How hard/costly is developing an API/compiler? My knowledge of compilers is limited, but I know that CUDA became dominant because of the support that Nvidia was able to provide. But can AMD really play catch-up? Even Apple abandoned OpenCL and made Metal, which makes GPGPU more cluttered than before. I wonder how developers would react if another API appeared...


----------



## R-T-B (Aug 26, 2019)

Company says competitor good, but their stuff better.  In other breaking news, water is too wet says man, who drowns.  Tune in for more at 11.


----------



## Muser99 (Aug 26, 2019)

Intel, your CPUs may be a tad faster than AMD's, but your security is NOT up to standard. There are still major flaws in the Skylake-based architecture used in all of your 14 nm CPUs which require a hardware fix, by your own admission. So Intel, get off your "high horse" before you get kicked off, and focus your attention on a new architecture and desktop CPUs fit for the 2020s. You are peddling a decade-old architecture riddled with security issues. Marketing and PR will not fix the facts!


----------



## Vya Domus (Aug 26, 2019)

Darmok N Jalad said:


> I’m not sure if this is still true, but I know in the past, Intel compilers had much to do with this performance optimization.



You know what's ironic? Intel's compilers are notorious for applying optimizations that are detrimental to correctness and for generating code with all sorts of bizarre, unexpected behavior, because they ship with certain flags on by default. You heard that right: their compilers will prioritize speed over everything else *by default*. Let's just say that if you want IEEE-compliant floating point out of the box, you're not going to get it from an Intel compiler. No careful professional will jump on their compilers and let them wreak havoc, especially for something like scientific computing.


----------



## Totally (Aug 26, 2019)

dyonoctis said:


> I guess Puget system don't know how to do benchmark then.
> View attachment 130100



That Intel mouthpiece punctuated just about every sentence with some derivative of "for gaming." Feels like TechPowerUp dropped the ball there by not reflecting that in the title, which should have read:

"Intel Says AMD Did a Great Job (with Ryzen 3000), But Intel CPUs are Still Better... for gaming"


----------



## Vulcansheart (Aug 26, 2019)

Intel has a lot of work to do to get their edge back. AMD is a direct threat now, and the consumers know it no matter how much Intel tries to shrug it off. I switched to the red team this year with a budget 2600X gaming build (my first ever AMD build from scratch), so Intel has a couple years to get their cards right before I do a CPU/mobo refresh and look at their lineup as an option.


----------



## Super XP (Aug 26, 2019)

AMD beat Intel hands down with ZEN, ZEN+ and again ZEN2. AMD will continue to beat Intel with all new upcoming ZEN micro-architectures. 

The only thing Intel needs to do is *STOP having a Temper-Tantrum*.


----------



## AsRock (Aug 27, 2019)

"Intel started a new campaign against AMD "

Not the 1st one either lmao.


----------



## Fluffmeister (Aug 27, 2019)

I bet Intel wish there was a third player, then they would just buy them.


----------



## biffzinker (Aug 27, 2019)

AsRock said:


> "Intel started a new campaign against AMD "
> 
> Not the 1st one either lmao.


Knowing how Intel has reacted to AMD being competitive in the past, I'm certain there's more to follow.


----------



## Athlonite (Aug 27, 2019)

Pft, better single-core speed than AMD? Um, who cares, I haven't used a program that relied on a single core/thread in, like, years.


----------



## R0H1T (Aug 27, 2019)

Fluffmeister said:


> I bet Intel wish there was a third player, then they would just buy them.


There is, except they're owned by the Chinese; Zhaoxin is the name you're looking for.


biffzinker said:


> Knowing how Intel has reacted to AMD being competive in the past, I'm certain there's more to follow.


I'm certain there's a lot happening beneath the surface, which we won't know about probably until the damage is done.


----------



## Camm (Aug 27, 2019)

Is my (sic) better CPU secure?


----------



## 1d10t (Aug 27, 2019)

I suggest they use these images for future marketing materials...







I still don't understand why they keep insisting single thread is important. How about building a single-core CPU with a 6 GHz base and 7 GHz boost to prove their point?

And this "gaming CPU" thing again... *sigh*. I know it's their marketing material, but I feel it's kinda misleading. People with average knowledge on a budget still think a $400 CPU is all you need for gaming, rather than a $400 GPU.
I shouldn't scold them anymore; after all, gaming is all they have left after losing in data center, servers, desktop and (soon) high-end desktop productivity, not to mention the Snapdragon 8cx also knocking on their ULV portables.


----------



## blobster21 (Aug 27, 2019)

Camm said:


> Is my (sic) better CPU secure?











Intel Says AMD Did a Great Job (with Ryzen 3000), But Intel CPUs are Still Better

I'ma get me a 9900k cause Intel would never lie about their products. /s

www.techpowerup.com

Funny they didn't mention security anywhere, except in the fine print of their graphics, which states: "Benchmarks may not have been run with all current available security patches." I wonder if they'd still win the benches with all updates installed?


----------



## Camm (Aug 27, 2019)

blobster21 said:


> Intel Says AMD Did a Great Job (with Ryzen 3000), But Intel CPUs are Still Better
> 
> 
> I'ma get me a 9900k cause Intel would never lie about their products. /s
> ...



The problem I have is that every 3 months there is another major vulnerability in Intel's CPUs, whereas it either doesn't affect AMD, or AMD is much, much less affected. Performance is one thing, and yes, it's important, but I really couldn't give two shits about single-digit differences if my platform isn't at least notionally secure.


----------



## Jism (Aug 27, 2019)

Poor marketing. They rely on the 1% single-thread advantage those CPUs have, but that's about it. Lol.


----------



## ratirt (Aug 27, 2019)

Honestly, gaming performance is what most people will perceive as actual performance. They don't realize that Intel being top dog for so long makes it obvious that games run a tad faster on Intel's CPUs. I'm not saying that's wrong, but think about this: Intel has been the king of gaming for 10 or so years, and now AMD is basically strong competition in gaming. What would happen if we turned it around, and AMD with its Ryzen had been king for 10 years? Intel would be the "Bulldozer" now, with its lousy core count. (Well, you know what I mean, I hope.) AMD has achieved more in 2 years than Intel has in a decade.

In terms of gaming, sure, Intel is fast, and we all know why (it is not IPC, because Zen 2's is better than Intel's), so it is game code execution. Ask yourself this question: is the future 4K gaming, or 720p/1080p at 240 Hz? For me it's 4K, because it looks awesome, and I think most people here will agree with me. This is the direction technology advancement should pursue. The other thing is, while Ryzen is slower at lower resolutions in a given game, at 4K in that same game it is faster than its Intel counterpart. Just drawing a bigger picture here and leaving the conclusions to you people.

The fact that Intel struggles to make processors with more cores (at a price point affordable for most people) means Intel is starting to lose ground, and fast. 10 nm isn't working as it should (we don't know if it will ever come to the desktop market), 5 GHz and up for CPUs is overrated, and more cores are needed, because it has been proven that die shrinks won't bring frequency boosts any more, only degradation. AMD did it right with the core count, because that is the only way left to boost performance. (Developers!! BUCKLE UP AND USE IT!!!) I can bet that 10 years from now (or even 5) AMD will be the top dog if this keeps up, and Intel will be the one throwing shit at AMD, saying "I'm the king of gaming", "real-world benchmarks" and crap like that. It would have been far more productive for Intel to swallow its pride, buckle up and start with an idea of how to make future products better and counter AMD; otherwise Intel will perish with its pride and cheap schemes. For me this is pathetic, and since Intel is playing this card, it means they've got nothing to offer, nor even an idea. The 10th-gen CPUs prove it badly.


----------



## Jism (Aug 27, 2019)

For 1080p at the highest FPS, a fast CPU is mandatory. Intel wins due to its 1% IPC advantage over AMD in single-threaded applications. But that is pretty much no longer a valid reason on its own to go for Intel. AMD is the better product overall, performance- and price-wise. At higher resolutions the GPU starts to be the bottleneck, having more difficulty putting out 200 frames a second. The role of the CPU beyond 1080p isn't that important anymore. 240 Hz gaming is a niche market, not something the everyday casual gamer needs. I doubt you can feel any difference between 120 FPS and 240 FPS, for that matter. Apart from that, do we really notice a task closing one second faster when we're being productive?

CPUs these days are pretty much equal: they do their task fast, and if you need more you buy a bigger/faster model, simple as that. There could be some gains in selecting the memory and the NVMe SSD, but both platforms will show the same performance at some point.


----------



## Vayra86 (Aug 27, 2019)

ratirt said:


> Is 4k gaming a future or 720p/1080p 240Hz? For me 4k cause it looks awesome and I think games and most of people here will agree with me. This is the way technology advancement should pursuit.



Hold on pal!

Whoever claims that the two (high res / high refresh) are mutually exclusive, or that the pursuit of one can, does, or should go at the expense of the other, needs to get his head examined. And you too, if you really think this is the case. This has absolutely nothing to do with the 'Intel gaming lead' or 'code optimized for Intel'.

*Precisely the better-threaded engines for gaming are also the ones capable of pushing higher FPS.* This is not about Ryzen versus Core. It's about shifting away from the dependence on single-threaded performance. Did AMD really push that forward? Or is it just the general trend, that we're now finally ready for it? It's the latter: consoles and mobile devices carry higher core counts, and we have better APIs available (API development also instigated by consoles, btw), so you will see the same development in gaming on (performance) PCs. It's just that simple; it's about the common denominator. In the same vein, now that counts higher than quad are going mainstream and consoles already had 6+ cores available, we see those being used in our 'gaming' CPUs.

Now think back: _despite quad cores being the norm for a decade on PC_, prior to the PS4, games simply did _not scale beyond a single thread_. And even if your OSD said they used more, you didn't gain much FPS from it.

Intel did what it did because there was no market to grab. Besides, their HEDT segment had offered six cores for ages, but nobody jumped on those either. There was never a demand; another writing on the wall was exactly AMD's Bulldozer: '8'-core CPUs with no workload to shine at, while being pretty bad at the workloads most people did use. AMD stacked a few royal failures in that regard, which put them on the bench for a long time.

Your idea that the 'pursuit of technology' is in ANY way going towards gaming wrt the Zen architecture... jesus, man. These are datacenter/server CPUs first and foremost; the rest is bonus. And again, the same goes for Intel. They can yell about their gaming dominance, but that was also just _given to them_ because they dominated the market for a while, and they had their sweet time to fine-tune things for the MSDT segment, much like you see with Ryzen 3000 right now (and lo and behold, the gap is shrinking fast for 'consumer workloads'...).

Let's not overinflate things, before it reads like another Intel press release.


----------



## ratirt (Aug 27, 2019)

Vayra86 said:


> Whoever brought up that the two (high res / high refresh) are to be mutually exclusive or that the pursuit of one, can, does, or should go at the expense of another needs to get his head examined. And you, too, if you really think this is the case. This has absolutely nothing to do with the 'Intel gaming lead' or 'code optimized for Intel'.


As always, you are missing the point just to prove your own. Sure, 4K can have a high refresh rate, but nowadays no card can push 100 FPS or more in 4K gameplay (unless you play Minecraft). It's not like these two are mutually exclusive or mandatory (I never said that), but from my standpoint (and I defend this) I'd rather go 4K at 60 Hz than 1080p at 240 Hz. That is my opinion, so please don't tell me to get examined if you don't get it. The implication I made is about Intel proving its advantage (and some people point out that at 1080p and 240 Hz Intel gets higher FPS, while at 4K AMD and Intel are basically the same). My point here is that Intel has no advantage in 4K FPS (sometimes it even lags behind). Not that 4K can't have a high refresh rate.


Vayra86 said:


> *Precisely the better threaded engines for gaming are also capable of pushing higher FPS.* This is not about Ryzen versus Core. Its about shifting away from the dependence on single threaded applications. Did AMD really push that forward? Or is it just the general trend that we're now finally ready for it?


General trend? What the hell are you talking about? For a decade we were stuck with 4c/8t by Intel, and now we have 8 cores in the desktop market thanks to AMD, and you dare ask what AMD did? Instead of offering more cores, Intel pursued higher frequency, and most games used that advantage instead of cores (that's why we don't have many games using 6 cores, not to mention 12 threads, now). That's so damn obvious to me, and yet there's always you arguing. What a hypocrite you are is beyond belief.
Please stop comparing 3 different markets, and let's focus on PC. Mobile and console are a different story here. Too bad you've missed that.


Vayra86 said:


> Intel did what it did because there was no market to grab. Besides, their HEDT segment already offered six cores for ages, but nobody jumped on those either. There was never a demand, another writing on the wall was AMD's bulldozer exactly; '8' core CPUs with no workload to shine at, while being pretty bad at the workloads most people did use. AMD stacked a few royal failures in that regard which put them on the bench for a long time.


Intel did what it did because it was convenient, not because there was no market for it. First you need to have something to play with, and then you can improve upon it. You get 2 cores and you focus development on that. Intel was never pushing cores but frequency, and developers were following that trend. Only now, 2 years back, has the trend changed: not frequency but cores matter, and developers are just starting to use these resources. First come the resources you can build upon, then the software and developer support, not the other way around. Sure, Intel did offer 6 cores, but at what price? Developers see the market trend, and it wasn't desktop 6c/12t for everyone but a high-end, overpriced product for exclusive users, and you're saying developers would focus on that niche product to develop their software for? AMD's Bulldozer was 8 cores, but it lacked performance. I knew you'd bring that one up, but it is nowhere near where Intel was back then, and no game developer would target more cores and threads on a product that doesn't have the performance.


Vayra86 said:


> Your idea that 'pursuit of technology' is in ANY way going towards gaming wrt to the Zen architecture... jesus man


Where the hell did I say that the pursuit of technology is going towards gaming and the Zen arch?


ratirt said:


> This is the way technology advancement should pursuit


This is related to 4K gaming, which in my eyes is the way to go (because it does look great and the detail level is outstanding), instead of focusing on 720p at 500 FPS, and that also matters in this particular thread, where Intel brags about the performance of its CPUs compared to AMD's. True, but I don't care about that, because for me it is not the way to go. That's my opinion, so stop twisting what I said, because it really sucks.



Vayra86 said:


> Let's not overinflate things, before it reads like another Intel press release.


Then don't. Just simply disagree and move on. You don't have to read it, but maybe others will.


----------



## 64K (Aug 27, 2019)

ratirt said:


> Sure the 4k can have a high refresh rate but still nowadays no card can push 100 or more in 4K gameplay (unless you play Minecraft).



From the review of the RTX 2080 Ti FE here: of a sample of 23 AAA games benched at 4K at the highest settings, 6 averaged over 100 FPS and 3 averaged in the high 90s, almost 100 FPS.









						NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB Review
					

NVIDIA debuted its Turing graphics architecture today, straightaway with the flagship RTX 2080 Ti. This card packs the promise of real-time ray tracing at 4K UHD, besides huge gains in performance. NVIDIA also put out its best cooler design since TITAN, commanding a very high price for some very...




					www.techpowerup.com


----------



## ratirt (Aug 27, 2019)

64K said:


> From the review of the RTX 2080 Ti FE here from a sample of 23 AAA games benched at 4K at highest settings 6 were over 100 FPS average and 3 were high 90s FPS average, almost 100 FPS.
> 
> 
> 
> ...


Yes, you are right. Maybe I should have been more specific: there are games where the 2080 Ti can push 100 FPS or even more. So 3 games at 100 FPS average, with probably dips below 100. That leaves 20 games where the card couldn't go above 100 FPS. With OC, probably more than 3, but still, is that your justification that it can be done? I don't think new games will go as easy on the 2080 Ti as the 3 you have in mind.
Thanks for pointing this out, but I was using 4K gaming for a different purpose than attacking the 2080 Ti's performance.


----------



## dyonoctis (Aug 27, 2019)

ratirt said:


> This is related to the 4k gaming which in my eyes is the way to go (because it does look great and detail level is outstanding) instead focusing on 720p with 500FPS and that also is important to this particular thread when Intel brags about performance of their CPUs compared to AMD. True but I dont care about that cause for me this is not the way to go. That's mine opinion and stop twisting what I said cause it really sucks.


The issue is that right now 1080p is still the mainstream reference, followed closely by 1440p. In my eyes, unless we somehow get a massive jump in GPU performance, we are going to stay stuck having to choose between graphics and higher resolution/refresh rate. Dr. Lisa Su said it herself: right now the software is moving faster than the hardware.


----------



## 64K (Aug 27, 2019)

ratirt said:


> Yes you are right. Maybe I should have been more specific. There are games the 2080Ti can push 100 or even more. So 3 games 100FPS average with probably dips below 100. So that gives you 20 games that the card couldn't go above 100FPS. With OC probably more than 3 but still, is that a justification from your side it can be done? I don't think new games will go that easy on the 2080Ti as these 3 you have in mind.



Well, if you look at minimum FPS, then probably not many AAA games will be over 100 FPS. Still, I think that 9 of the 23 games benched being close to or over 100 FPS average at the highest quality settings is significant.

No, I don't think the 2080 Ti will continue to fare so well in future games, but then there's probably a 7 nm 3080 Ti next year that will.

As I always say, 4K gaming is for the people willing to pay for it, and there are very few of them from what I've seen. People gaming at 4K at extremely high FPS are probably very, very few, but it can be done if you're willing to pay for it.


----------



## ratirt (Aug 27, 2019)

64K said:


> Well, if you look at minimum FPS then probably not many AAA games will be over 100 FPS. Still I think 9 games out of 23 benched were close to 100 FPS average or over at highest quality settings is significant.
> 
> No I don't think the 2080 Ti will continue to fare so well in future games but then there's probably a 7nm 3080 Ti next year that will.
> 
> As I always say, 4K gaming is for the people willing to pay for it and there are very few of them from what I've seen. 4K at extremely high FPS are probably very, very few but it can be done if you're willing to pay for it.





dyonoctis said:


> The issue is that right now 1080p is still the mainstream reference, followed closely by 1440p. In my eyes, unless we somehow get a massive jump in gpu preformance, we are going to stay stuck with having to choose between graphics, or higher resolution/refresh rate. Dr Lisa Su said it herself : right now the software is moving faster than the hardware.



I play at 4K and can't complain, well, in the games that allow me to play at that res with my V64. It does look outstanding, believe me. Even older games look way nicer in 4K.
Anyway, that's not the point, and this thread is for something else. Please, guys, I don't want to argue about what res people prefer or whether the 2080 Ti can push 100 FPS in games. That is not what I wanted to point out here.


----------



## Totally (Aug 27, 2019)

Vayra86 said:


> Hold on pal!
> 
> Whoever brought up that the two (high res / high refresh) are to be mutually exclusive or that the pursuit of one, can, does, or should go at the expense of another needs to get his head examined. And you, too, if you really think this is the case. This has absolutely nothing to do with the 'Intel gaming lead' or 'code optimized for Intel'.
> 
> ...



Do you put your shoes on one at a time, or both at the same time? You can try to throw both on at once, but the results are much better doing one and then the other.



ratirt said:


> I play 4k and can't complain  well games which allow me to play at that res with my v64  It does look outstanding believe me. Even older games look way nicer in 4K.
> Anyway that's not the point and this thread is for something else please guys, I don't want to argue about what res people prefer or if 2080 Ti can push 100 in games. That is not what I wanted to point out here.



I think you should have clarified that 4K is the "immediate" future.


----------



## Vayra86 (Aug 27, 2019)

ratirt said:


> As always you are missing the point just to prove your point. Sure the 4k can have a high refresh rate but still nowadays no card can push 100 or more in 4K gameplay (unless you play Minecraft). It's not like these two are exclusive nor mandatory, never said that) but from my standpoint (an I defend this) I'd rather go 4k and 60hz than 1080p 240hz. That is my opinion so please dont tell me to examine anything if you don't get it. The implication I made is about Intel proving the advantage (and some people point it out 1080p and 240Hz and Intel get higher FPS while in 4K AMD and Intel are basically the same). My point here is, Intel has no advantage in 4k and FPS (sometimes it lacks it). Not that 4k can't have a high refresh rate.
> 
> General trend? What the hell are you talking about? For a decade we were stuck with 4c8t by Intel and now we have 8c in a desktop market thanks to AMD and you dare asking about what AMD did? Instead of offering more cores Intel was pursuing higher frequency and most games were using this advantage instead of cores. ( that's why we don't have many games using 6c not to mention 12 threads now) that's so damn obvious for me and yet there's always you arguing.  What a hypocrite you are is beyond believe.
> Please stop comparing 3 different markets and lets focus on PC please. Mobile and console are a different story here. Two bad you've missed that.
> ...



If you truly believe 'now we have 8 cores thanks to AMD', then yes, get yourself examined. 8-core CPUs were there far earlier than Zen. There simply wasn't a market within the mainstream segment to launch them in, despite AMD trying to. For HEDT, there were: up there you do have nicely threaded workloads and applications. Part of the reason FX processors sucked so hard was that on MSDT there were simply no good workloads for them, and for HEDT, Intel six-cores would already run circles around them. AMD only receives kudos for _bringing the price down on higher core counts_, because they compete again across the whole product stack.

For a decade we were stuck at 4c/8t. And yet games did not scale beyond 1 or 2 threads anyway. Found the reason behind that yet? Because that is proof that the movement to higher core counts for gaming is extremely late to the party: we've had quads for ages now, and games are only recently truly catching up to that, and still many haven't.

Convenient / no market... aren't they the same thing? It's not convenient to make parts you don't sell.

You can be all up in arms about what I've said, but it's not strange and it's not 'making my point'; it's an observation on what you think happened over the last decade, and I think you're wearing the wrong glasses looking back. We need the hardware before we get the software that will fully use it, and then we also need 'the performance'; after all, if nobody asks for 200 FPS gaming, it won't be built. And the better threading of games on the CPU coincides NOT with Zen, but with the console releases.

The _result_ of better threading, then, is that we're no longer tied to single-core processing power, and that in turn enables high-refresh/FPS gaming. 4K is not even a player in this story; you can run that on a potato CPU, so what is it even doing in a Zen topic, one might ask... It's no secret that a CPU will do fine as long as it's not the part bottlenecking you. There is no 'pursuit' to be had for CPUs to enable 4K gaming.

So, back to the final line of my last post: let's not overinflate what happened here with Zen's release, because that is the gist of your story. As if AMD 'enabled' something for gamers. They didn't, and the higher core counts were coming regardless. They gave us back healthy competition, and that's all it is.


----------



## biffzinker (Aug 27, 2019)

Here's an interesting benchmark that doesn't have optimizations favoring one company over the other. I wouldn't have expected the Ryzen 5 3600 to be ahead of the Core i9-9900K, though.



			
Legit Reviews said:
			
		

> Neat Video has been optimized for use on multi-core and multi-CPU systems and supports GPU acceleration. Legit Reviews contacted ABSoft, NeatLab and asked if they have ever worked with AMD, Intel, NVIDIA, Qualcomm or ARM for CPU optimizations and they do not recall any interactions like that over the years. They did acknowledge that NVIDIA and AMD use NeatBench for GPU testing, but no optimizations have been asked for by either company. That is music to our ears as it looks like we have a benchmark that hasn’t been heavily optimized for any one particular company.











						12 CPUs Tested Using Neat Video Noise Reduction Tool w/ NeatBench 5 - Legit Reviews
					






					www.legitreviews.com
				









Puget Systems gets the same scoring with Neatbench 5.













						First Look at AMD Ryzen 3rd Gen CPUs for Video Editing
					

AMD's new Ryzen 3rd generation CPUs just launched with terrific performance improvements across the board. While we don't have the full lineup tested just yet, we wanted to give a first look at what we are seeing in Premiere Pro, After Effects, DaVinci Resolve, and other applications commonly...




					www.pugetsystems.com
				




Did a quick run on my desktop.


----------



## Super XP (Aug 28, 2019)

biffzinker said:


> Here's a interesting benchmark that doesn't have optimizations that favor one company over the other. I wouldn't of expected the Ryzen 5 3600 ahead of the Core i9-9900K though.
> 
> 
> 
> ...


The majority, if not all, synthetic benchmarks favour Intel CPUs. This is a common fact that most people know, which is why real-world benchmarks are the real deal.
Though synthetic benchmarks do have their place, so long as no CPUs are being favoured.



blobster21 said:


> Intel Says AMD Did a Great Job (with Ryzen 3000), But Intel CPUs are Still Better
> 
> 
> I'ma get me a 9900k cause Intel would never lie about their products. /s
> ...


Intel took design shortcuts to squeeze out more performance and got caught with all the security vulnerabilities.
Basically, Intel's security vulnerabilities should be front-page news and should be talked about as much as possible. The security patches disable features that people based purchasing decisions on. By patching their CPUs, they are engaging in false advertising.


----------



## ratirt (Aug 28, 2019)

Vayra86 said:


> If you truly believe 'now we have 8c thanks to AMD', then yes, get yourself examined. 8 core CPUs were there far earlier than Zen. There simply wasn't a market within the mainstream segment to launch them despite AMD trying to. For HEDT, there wére - up there you do have nicely threaded workloads and applications. Part of the reason FX-processors sucked so hard was because on MSDT, there were simply no good workloads for it. And for HEDT, Intel 6 cores would already run circles around them. AMD only receives kudos for _bringing the price down on higher core counts._ Because they compete again across the whole product stack.
> 
> For a decade we were stuck at 4c8t. And yet, games did not scale beyond 1 or 2 threads anyway. Found the reason behind that yet? Because that is proof that the movement to higher core counts for gaming is extremely late to the party, we've had quads for ages now and games are only recently truly catching up to that - and still many haven't.
> 
> ...


Oh boy. I guess buses don't go where you live, do they? Take that avatar off, it is offensive.


biffzinker said:


> Here's a interesting benchmark that doesn't have optimizations that favor one company over the other. I wouldn't of expected the Ryzen 5 3600 ahead of the Core i9-9900K though.
> 
> 
> 
> ...


That is nice. I need to try this on my Ryzen and see what I will get.



Totally said:


> I thin you should have clarified that 4k is "immediate" future


I think this is the way to go. 4K looks just amazing


----------



## Redwoodz (Aug 29, 2019)

Of course AMD's ecosystem is not up to par with Intel's, who have had over 80% market share for the last 15 years. It won't take very long to catch up, though; that Ryzen money fixes all sorts of shortcomings.


----------



## Midland Dog (Aug 31, 2019)

biffzinker said:


> If you say so


oh yeah coz ln2 guys are benching photoshop


----------



## dyonoctis (Aug 31, 2019)

Midland Dog said:


> oh yeah coz ln2 guys are benching photoshop


You don't even know what Puget Systems is, do you? Those guys aren't overclockers; they are system builders for professionals (content creation, science). They don't do overclocking at all, because they'd rather ship a stock but stable system than try to get a few more % from an OC that may or may not be stable.


----------



## Midland Dog (Sep 2, 2019)

dyonoctis said:


> You don't even know what is puget system do you ? those guys aren't overclockers, there are system builder for professional (content creation, science). They don't do overclocking at all, because they do rather ship a stock but stable system, instead of trying to get more % with an oc that may or may not be stable.


lmao, you're an idiot. I specifically said benchathon; I don't care what Puget does in the slightest. If they were doing XOC benching, my point would be proven even harder.

Also, the 3800X is pathetic compared to the 9900K in your supplied bench: a year late and still not as fast*. Vega and the 1080 Ti all over again.
*One point, in a test I don't care about. One point vs. 10 more FPS? I'll take the FPS and live with one point less at STOCK, thanks. OC for OC, a 5 GHz 8-core Skylake trashes any OC Ryzen 3000 can muster.


----------



## lexluthermiester (Sep 2, 2019)

Midland Dog said:


> also the 3800x is pathetic compared to the 9900k


Your understanding needs improvement. Either your understanding of the benchmarks is lacking or your understanding of the definition of the word "pathetic" is lacking.


----------



## dyonoctis (Sep 2, 2019)

Midland Dog said:


> lmao, you're an idiot. I specifically said benchathon; I don't care what Puget does in the slightest. If they were doing XOC benching, my point would be proven even harder.
> 
> Also, the 3800X is pathetic compared to the 9900K in your supplied bench: a year late and still not as fast*. Vega and the 1080 Ti all over again.
> *One point, in a test I don't care about. One point vs. 10 more FPS? I'll take the FPS and live with one point less at STOCK, thanks. OC for OC, a 5 GHz 8-core Skylake trashes any OC Ryzen 3000 can muster.


I just told you that Puget doesn't do world-record benchmarks or any overclocking at all. Those results are at stock speeds, meaning there is no overclocking: they didn't mess with the speed in the BIOS, nor in Ryzen Master. Those results are from reference clocks.


----------



## Midland Dog (Sep 2, 2019)

dyonoctis said:


> I just told you that Puget doesn't do world-record benchmarks or any overclocking at all. Those results are at stock speeds, meaning there is no overclocking: they didn't mess with the speed in the BIOS, nor in Ryzen Master. Those results are from reference clocks.


And Intel barely loses while having the headroom to push all threads up another solid few hundred MHz, whereas the Ryzen stands to lose a few MHz on an all-core OC. Skylake still wins.



lexluthermiester said:


> Your understanding needs improvement. Either your understanding of the benchmarks is lacking or your understanding of the definition of the word "pathetic" is lacking.


1 point more, 1 node smaller, 1 year late, and the same price: pathetic.


----------



## lexluthermiester (Sep 2, 2019)

Midland Dog said:


> 1 point more, 1 node smaller, 1 year late, and the same price: pathetic.


Ah, I see! It's a lack of understanding of context. So let's examine:
9900K
https://www.amazon.com/Intel-i9-9900K-Desktop-Processor-Unlocked/dp/B005404P9I/ ($495)

3800X
https://www.amazon.com/AMD-Ryzen-3800X-16-Thread-Processor/dp/B07SXMZLPJ/ ($399)

Yup, that's the exact same price....

Now let's compare benchmarks, shall we? Since TPU has yet to review the 3800X (unless I missed it), we'll go with the 3700X review, just to be fair:








AMD Ryzen 7 3700X Review (www.techpowerup.com)
AMD's $330 Ryzen 7 3700X is an 8-core, 16-thread CPU that's clocked high enough to compete with Intel's offerings. Actually, its application performance matches even the more expensive Intel Core i9-9900K. Gaming performance has been increased significantly, too, thanks to the improved...
				



Wow! The 9900K beat it by 3%! Gee whiz, the 9900K is sooo kicking the 3700X in the nads....

Sarcasm aside, the only thing "pathetic" here is your inability to do math and your grasp of reality. The 3700X easily matches the 9900K in every metric but one: overclocking. It's over $150 less expensive and performs within 3% of said 9900K. Given that the 3800X is faster than the 3700X, it doesn't take a genius to conclude that the 3800X is very likely bang on with the 9900K for nearly $100 less. Yup, I'll take your brand of "pathetic" all day, every day, thank you very much.
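The arithmetic above can be sanity-checked in a few lines. This is a quick sketch using the prices and the ~3% application-performance delta quoted in this thread; the figures come from the posts above, not from new measurements:

```python
# Figures as quoted in this thread (illustrative, not fresh data)
price_9900k = 495
price_3700x = 330
price_3800x = 399
perf_gap = 0.03  # 9900K leads the 3700X by ~3% in TPU's application tests

print(price_9900k - price_3700x)  # → 165 (the 3700X's price advantage)
print(price_9900k - price_3800x)  # → 96 ("nearly $100 less" for the 3800X)

# Performance per dollar, with the 9900K normalized to 1.00:
ppd_3700x = (1 - perf_gap) / price_3700x * price_9900k
print(f"{ppd_3700x:.2f}")  # ≈ 1.45, i.e. roughly 45% more performance per dollar
```

Even granting the 9900K its 3% lead, the cheaper chip comes out well ahead on performance per dollar.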


----------



## biffzinker (Sep 2, 2019)

If he is 19, I'd chalk it up to immaturity. The immature responses to criticism suggest as much.


----------



## ratirt (Sep 3, 2019)

biffzinker said:


> If he is 19, I'd chalk it up to immaturity. The immature responses to criticism suggest as much.


True that. I still don't get people. Where do they get this stuff from? It's not possible to be so damn blind. The 9900K is "the one" for some people; I simply can't believe it.


----------



## Midland Dog (Sep 3, 2019)

lexluthermiester said:


> Ah, I see! It's a lack of understanding of context. So let's examine:
> 9900K
> https://www.amazon.com/Intel-i9-9900K-Desktop-Processor-Unlocked/dp/B005404P9I/ ($495)
> 
> ...


I stand corrected on pricing, but I still don't see any of the Ryzen chips being worth it. If Intel brought out a quad-core with twice the IPC of Skylake, I would buy that over any Ryzen ever. I don't care for core counts, because most programs would gain more from 1 GHz of extra clocks than from two more cores.



Midland Dog said:


> I stand corrected on pricing, but I still don't see any of the Ryzen chips being worth it. If Intel brought out a quad-core with twice the IPC of Skylake, I would buy that over any Ryzen ever. I don't care for core counts, because most programs would gain more from 1 GHz of extra clocks than from two more cores.


lmao, you just sold me on Intel 110%. OC potential is ALL that I care about; I buy a chip for the most the silicon can happily do, not the most that a company says it "should" do.


----------



## ratirt (Sep 3, 2019)

Midland Dog said:


> I stand corrected on pricing, but I still don't see any of the Ryzen chips being worth it. If Intel brought out a quad-core with twice the IPC of Skylake, I would buy that over any Ryzen ever. I don't care for core counts, because most programs would gain more from 1 GHz of extra clocks than from two more cores.
> 
> 
> lmao, you just sold me on Intel 110%. OC potential is ALL that I care about; I buy a chip for the most the silicon can happily do, not the most that a company says it "should" do.


If "ifs" and "buts" were candy and nuts... How would Intel accomplish that? Twice the IPC of Skylake? Actually, most applications now use more than two cores; I'd say around four, or even more. Given the current state of the CPU industry, with processor frequencies stalling and set to get worse, more cores are the only way to guarantee a performance boost.

Sure, you have to OC it if you want the CPU to keep up with the rest of the pack.


----------



## Midland Dog (Sep 3, 2019)

ratirt said:


> If "ifs" and "buts" were candy and nuts... How would Intel accomplish that? Twice the IPC of Skylake? Actually, most applications now use more than two cores; I'd say around four, or even more. Given the current state of the CPU industry, with processor frequencies stalling and set to get worse, more cores are the only way to guarantee a performance boost.
> 
> Sure, you have to OC it if you want the CPU to keep up with the rest of the pack.


Four or more? lmao, like I said, I'd take the quad-core. Cores DO NOT scale infinitely at all; winning the MT war means having the most powerful individual cores to make up MT perf. Making your core 2% faster scales to all of the cores: 2% per core on a 16-core means +32% overall.


----------



## ratirt (Sep 3, 2019)

Midland Dog said:


> Four or more? lmao, like I said, I'd take the quad-core. Cores DO NOT scale infinitely at all; winning the MT war means having the most powerful individual cores to make up MT perf. Making your core 2% faster scales to all of the cores: 2% per core on a 16-core means +32% overall.


Sure but what's the point if your previous statement was


Midland Dog said:


> I don't care for core counts, because most programs would gain more from 1 GHz of extra clocks than from two more cores.


Make up your mind.
And no, it would not be +32%: a 2% per-core gain yields +2% overall throughput, and an application may scale to only six cores anyway; it depends on the application. The problem with your "1 GHz more" is that 5 GHz is basically the maximum for silicon. Each node shrink will not improve frequency; if anything, it will degrade it, so you won't be able to hit 5 GHz. The only way to increase performance is an IPC increase, which is tough to accomplish. Adding cores is easier (AMD has done it), but the downside is how well an application utilizes those cores.
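The core-scaling point can be made concrete with Amdahl's law. This is a minimal sketch; the 90%-parallel workload is an illustrative assumption, not a measured figure for any real application:

```python
def speedup(parallel_fraction, cores, per_core_gain=1.0):
    """Amdahl's law: overall speedup when `parallel_fraction` of the
    work scales across `cores` and each core is `per_core_gain`x
    faster than the baseline core."""
    serial = 1.0 - parallel_fraction
    return per_core_gain / (serial + parallel_fraction / cores)

# Assume a workload that is 90% parallel (illustrative):
print(round(speedup(0.90, 4), 2))          # → 3.08 on 4 cores
print(round(speedup(0.90, 16), 2))         # → 6.4 on 16 cores, far from 16x
# A 2% per-core gain lifts the whole curve by exactly 2%:
print(round(speedup(0.90, 16, 1.02), 2))   # → 6.53, i.e. 6.4 * 1.02
```

This shows both halves of the argument: a per-core improvement does apply to every core but raises total throughput by only that same percentage, while adding cores helps only as far as the application's parallel fraction allows.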


----------

