Tuesday, December 13th 2016

AMD RYZEN Demo Event - Beats $1,100 8-Core i7-6900K, With Lower TDP

At their Austin, Texas "New Horizon" event, AMD introduced us to live "ZEN" chips working full-tilt, showing us what AMD's passion and ingenuity managed to achieve. The "New Horizon" event was a celebration of what AMD sees as another one of those special, breakthrough moments for a company: work on "ZEN" began four years ago, in 2012, as a completely new design. The focus: building a great machine, increasing IPC by 40% over their previous architecture under the same power constraints, and creating a smart machine that could sense and adapt to its environment and applications so it improves over time. The company's verdict: "ZEN" met or exceeded those goals, with the desktop PC market being home to the very first "ZEN" product.

According to AMD CEO Dr. Lisa Su, the event was named "New Horizon" as a reference to AMD's vision in the computing space: that they are on a journey to bring a new generation of processor technology, and customers, towards a new horizon of computing. Their intention? To connect directly with fans who love PC gaming, whilst doing what AMD does best - pushing the envelope on performance, power, frame-rates and technology. AMD also flaunted their renewed faith in gaming, calling it part of the company's DNA and passion, whilst taking a trip down memory lane: the Athlon Thunderbird, the world's first chip to break the 1 GHz barrier; the launch of their first 64-bit processor; and breaking the 1 TFLOP barrier in computing power with their HD 4850 and 4870 gaming GPUs.
AMD confirmed that CPUs based on their "ZEN" micro-architecture will carry the brand "Ryzen" - a play on the "ZEN" architecture's focus on balance, high performance and low power, while introducing new features. Ryzen is AMD's take on a processor that is both powerful in purpose and efficient in design, and it symbolizes the power of "ZEN" reaching the next horizon in computing. The first product is a new high-performance CPU with 8 cores, 16 SMT threads, a 3.4 GHz+ base clock and 20 MB of combined cache, leveraging all the improvements baked into AMD's new AM4 platform (with 3.4 GHz apparently being the lowest base frequency a consumer-level desktop Ryzen will carry).
To prove their words and commitment to Ryzen's performance, AMD showcased the chip's prowess in a Blender test, pitting a Ryzen CPU at its 3.4 GHz base clock (without Boost) against the consumer market's only other 8-core, 16-thread CPU, Intel's Core i7-6900K, at its stock 3.2 GHz base clock with Boost enabled and no adjustments, "straight out of the box". The verdict: Ryzen matched the 6900K's performance. Dr. Lisa Su was quick to point out the 6900K's $1,100 price tag, though she left an intentional silence at the point where she could have made a bombastic pricing announcement for Ryzen - perhaps keeping her cards close to her chest so as not to let Intel work out any pricing changes to their products (if any), should Ryzen prove deserving of such a response. But the bottom line, and the home run hit by Lisa Su, was the announcement that Ryzen matched Intel's performance at 45 W less TDP - 95 W for Ryzen against 140 W for Intel's 6900K. In another test, a Handbrake transcoding demo, Ryzen transcoded a video in 54 seconds, against 59 seconds on Intel's 6900K processor.
Again at 3.4 GHz, Ryzen was shown "beating the game frame-rates of a Core i7 6900K playing Battlefield 1 at 4K resolution, with each CPU paired with an Nvidia Titan X GPU". Not drawing any more attention than needs to be drawn towards the use of an NVIDIA solution at AMD's own event (which was puzzling, since AMD did show a Ryzen CPU and a VEGA-based graphics card running Star Wars Battlefront's as-of-yet unreleased Rogue One DLC at over 60 fps in 4K), we didn't actually see any reported frame-rates in the Battlefield 1 demo - only that the Ryzen-based system showed considerably less frame-skipping than the Intel solution, with the expected effects that has on the gaming experience.
AMD also announced what constitutes part of Ryzen's beating heart: their SenseMI technology, which includes "Neural Net Prediction" - an artificial intelligence neural network that learns to predict what future pathway an application will take based on past runs - and "Smart Prefetch", which feeds off "Neural Net Prediction", anticipating the data an app needs and having it ready when needed (with these two features alone being responsible for 1/4 of Ryzen's performance uplift, according to Lisa Su). Additionally, AMD announced Ryzen's "Pure Power" and "Precision Boost" features: more than "100 embedded sensors with accuracy to the millivolt, milliwatt, and single degree level of temperature enable optimal voltage, clock frequency, and operating mode with minimal energy consumption", controlling each part of the chip independently, in milliseconds, leveraging "smart logic that monitors integrated sensors and optimizes clock speeds, in increments as small as 25MHz, at up to a thousand times a second". Finishing the pentad of new features was "Extended Frequency Range" (XFR), a temperature-based boost function: the processor knows what temperature it's operating at, enabling higher clock speeds as the system gets cooler (and vice-versa, we'd expect, back down towards the 3.4 GHz base clock).
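The described behaviour amounts to a simple sensor-driven control loop. As a rough illustration only - this is not AMD's actual algorithm, and only the 25 MHz step size and 3.4 GHz base clock come from the announcement, with the temperature target and clock ceiling invented for the sketch - a Precision Boost/XFR-style governor might look like:

```python
# Illustrative sketch of a Precision Boost / XFR style clock governor.
# Only the 25 MHz step and 3400 MHz base clock come from AMD's announcement;
# the thermal target and boost ceiling below are hypothetical.

BASE_CLOCK_MHZ = 3400  # announced minimum consumer base clock
STEP_MHZ = 25          # Precision Boost adjusts in 25 MHz increments
MAX_CLOCK_MHZ = 4000   # hypothetical XFR ceiling
TEMP_TARGET_C = 60     # hypothetical thermal target

def next_clock(current_mhz: int, temp_c: float) -> int:
    """Step the clock up when running cool, down when running hot,
    never dropping below the base clock (the XFR behaviour described)."""
    if temp_c < TEMP_TARGET_C and current_mhz < MAX_CLOCK_MHZ:
        return current_mhz + STEP_MHZ
    if temp_c > TEMP_TARGET_C and current_mhz > BASE_CLOCK_MHZ:
        return current_mhz - STEP_MHZ
    return current_mhz

# Cooler readings push the clock up; hotter ones walk it back toward base.
clock = BASE_CLOCK_MHZ
for temp in [50, 50, 50, 70, 70, 70, 70]:
    clock = next_clock(clock, temp)
print(clock)  # 3400: three steps up, then stepped back down to base
```

The real silicon makes these decisions in milliseconds from a hundred on-die sensors; the sketch only shows the direction of the feedback, not its speed or granularity.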
At the event, AMD showed Ryzen running a VR demo, as well as delivering performance in raytracing, with physically based shaders and materials, HDR, and a grand total of 53 million polygons in a single model. Interestingly, AMD also showed their Ryzen CPU against an Intel 6700K processor overclocked to an unspecified frequency, comparing the chips' performance in streaming DOTA 2: the 6700K showed severe frame-skipping on the streaming screen, while Ryzen handled it beautifully.
As a sendoff, AMD CEO Lisa Su mentioned that Ryzen will come to desktop and notebook solutions (leaving out the server market, which could mean a brand distinction between those solutions), whilst reaffirming that Ryzen's Q1 launch is completely on track - from the only company that has both high-performance CPUs and GPUs. And as an appetizer, the good doctor did say that Ryzen's performance will only improve between now and the promised launch.

205 Comments on AMD RYZEN Demo Event - Beats $1,100 8-Core i7-6900K, With Lower TDP

#76
Assimilator
AscalaphusYou're missing the bigger picture:
It will cost LESS than a i7-6900k as well.
AMD hasn't released any pricing information on Zen/Ryzen yet, so I'd like to know what evidence you're basing this statement on.
efikkanAnd you are just using a single application at the time?
There are OS subsystems, multiple drivers, a web browser and some other stuff you are running, perhaps an anti-virus program and a pile of other "hidden" crap running in the background, in addition to the task you are "working" on. Even if your little game only uses 3-4 threads, more cores will help reduce stutter and other problems, even if it doesn't show up in average framerate benchmarks. Anything less than 6 (real) cores for a power-user today is not going to cut it.
If a "pile of other hidden crap" is slowing down your PC, it's almost certainly not because it's tying up your CPU, it's because it's reading from or writing to a mechanical disk. In which case you can be running a 128-core CPU and your PC as a whole is still gonna be slow. CPUs and operating systems are extremely good at task scheduling and context switching, precisely because that's their main use case.

And what is a "power user"? I consider myself a power user because I generally have half a dozen browser windows, with dozens of tabs in each, open - plus at least two instances of Visual Studio - and I play a game at the same time, and my 4-core 4-thread i5-3570K is still perfectly adequate for that. Power users don't need more cores/threads; people who do video rendering do, and that's a very small percentage of the market, which explains why core counts have stagnated and higher core counts command higher prices.

Let's be honest, AMD has been trying to force the "MOAR CORES BETTAR" Kool-Aid down our throats for years and it hasn't worked out, for the simple reason that writing code that takes advantage of parallelism is difficult, and hence the majority of apps around (old and new) don't do it. High core counts are and will remain a niche market for years to come.
Posted on Reply
#77
LucidStrike
xkm1948I am happy. Even though I am not going to build a new AMD system any time soon, the introduction of more competitive product will force the price of intel's HEDT down. If in one year I can upgrade to 6950X with $500 I will be thrilled!
Unless you already have an LGA 2011-3 board, it doesn't make sense not to at least consider switching to a comparable Zen chip.
Posted on Reply
#78
LucidStrike
renz496that's what people said with 7970 as well. at this point i will just waiting until the actual hardware hit the market.
That said, it's exactly the approach AMD 'just' took with the RX lineup, focusing on gaining marketshare with high price-performance ratios.
Posted on Reply
#79
renz496
evernessinceIt doesn't need to be as fast as the 1060 in every title, average is a better indicator of speed. You can go look anywhere, it's not just hardware canucks re reviewing after AMD's new driver.
hardware canucks test was not made using latest Relive driver either. it depends on what title being tested in the suite. one of TPU's latest GPU reviews that includes both RX480 and GTX1060 (done in mid november; the test on hardware canucks also using AMD november drivers) still puts the 1060 6GB ahead of the RX480 in the relative performance chart. and the changes from the new driver are not that big. from launch driver to relive, both TPU and guru3d show an average of around 2-3% performance improvement for RX480. and i also checked the numbers from 1060 and RX480 launch drivers versus much more recent ones. while RX480 did gain more performance in more recent tests, the same also happened with the 1060. so RX480 is not the only card gaining performance. hardware canucks' chart would probably paint a different result if they included a few new titles that favor nvidia cards, such as Watch Dogs 2 and Dishonored 2.
Posted on Reply
#80
eidairaman1
The Exiled Airman
LucidStrikeUnless you already have an LGA 2011-3 board, it doesn't make sense not to at least consider switching to a comparable Zen chip.
LucidStrikeThat said, it's exactly the approach AMD 'just' took with the RX lineup, focusing on gaining marketshare with high price-performance ratios.
Do not double post, use the multiquote button
Posted on Reply
#81
renz496
rvalenciaPolaris 10 doesn't have double rate 16bit FP feature while both PS4 Pro and Vega 10 has double rate 16 bit FP feature.

Native 16bit FP support is important for incoming Shader Model 6.0.
AFAIK FP16 support in games is not that new. if anything using FP16 probably going to reduce image quality. on mobile (like SoC) the reason to use FP16 is to save power and bandwidth. can you provide the link talking about the importance of FP16 in upcoming SM6?
Posted on Reply
#82
P4-630
qubitWow, seriously? :eek: Do you know which ones they are?
GTA V, CAPS showed it once with his xeon.
Posted on Reply
#83
DeathtoGnomes
There were a few of us that watched the livestream from Discord. Sad more of you all were not there to comment.
Posted on Reply
#84
bug
Hm, AMD is at it again.
Nice play on "Zen with no boost vs intel with boost". When all cores are being used, intel doesn't boost either (except for a short while - a few seconds, till it gets hot).

But overall a solid showing.
Posted on Reply
#85
R-T-B
xkm19488 core 16 threads is finally becoming mainstream. This is definitely good news for everyone. We have been stuck at 4 core 8 threads for far too long.
Agreed, but I expect a 6-core mainstream Zen before 8-core gets there. Still good.
LucidStrikeUnless you already have an LGA 2011-3 board, it doesn't make sense not to at least consider switching to a comparable Zen chip.
His system specs are literally a click away, you know?
Posted on Reply
#86
YautjaLord
Aside from that Handbrake graph for both ZEN & - if my memory serves me well - 6900k, everything else seemed like "What, where's fps counter, no Lisa, Mrs. Su, what sh!t 68fps in 4k in Star Wars Battlefront Rogue One DLC you talk about? Are you guys serious?" Seemed like more of a hype train, but i digress. I'm still hungry for AM4/X370 mobos & benchmarks, oh & i don't buy that Blender demo on ZEN vs 6900k sh!t either. Waiting for you guys to actually benchmark the beast yourselves, man i am really hyped. :laugh:

Other than that - it's good that AMD gets back into competition? Bring on the benchies, who's gonna bench this thing? Bring Kaby Lake & Skylake to compare, 6900k is Skylake right?
Posted on Reply
#87
bug
YautjaLordOther than that - it's good that AMD gets back into competition? Bring on the benchies, who's gonna bench this thing? Bring Kaby Lake & Skylake to compare, 6900k is Skylake right?
6900k is actually Broadwell, but performance-wise, Skylake isn't any faster.

Edit: 6990k is actually 6900k
Posted on Reply
#88
YautjaLord
bug6990k is actually Broadwell, but performance-wise, Skylake isn't any faster.
Now i'm completely confused. They pitted ZEN vs 6990k? I heard something like 6900k. Thanx regardless, maybe it's my hearing & fact that i remember more what CPUs were used in Dota 2 while streaming. 6700k was OC'd to 4.5GHz & struggled (hype?), ZEN & 6900k(?) ran smoothly. And in AMD fashion the bald guy engineer(?) said 6900k cost twice more than ZEN. lol 1100$ or something like that?

P.S. ASUS, MSI & Gigabyte still don't have AM4/X370 mobos in their mobos list. I checked. lol

P.P.S. Loved the ZBrush demo with ray-tracing & 53 million polygons models stuff. :)
Posted on Reply
#90
R0H1T
YautjaLordNow i'm completely confused. They pitted ZEN vs 6990k? I heard something like 6900k. Thanx regardless, maybe it's my hearing & fact that i remember more what CPUs were used in Dota 2 while streaming. 6700k was OC'd to 4.5GHz & struggled (hype?), ZEN & 6900k(?) ran smoothly. And in AMD fashion the bald guy engineer(?) said 6900k cost twice more than ZEN. lol 1100$ or something like that?

P.S. ASUS, MSI & Gigabyte still don't have AM4/X370 mobos in their mobos list. I checked. lol

P.P.S. Loved the ZBrush demo with ray-tracing & 53 million polygons models stuff. :)
Well there's no 6990K, although there was the AMD/ATI 6990, it's just a typo from the poster.
Posted on Reply
#91
ratirt
This ZEN looks pretty promising. Not sure if I should get one when it shows up on the market or wait a bit longer. Seeing what it can do is great, although waiting for reviews and more benchmarks might be a wise idea to clarify what that CPU can really do. Anyway I think I'm getting an AMD platform :) Grrr I can even feel that chill crawling up my spine when thinking of this :) I really wanna go with AMD and try this thing with all the features it gives :)

Just an add.
Finally maybe we will get mainstream 8/16 c/t instead of 4/8 :) Gotta sell my still good 3770k :D
Posted on Reply
#92
TheinsanegamerN
bugHm, AMD is at it again.
Nice play on "Zen with no boost vs intel with boost". When all cores are being used, intel doesn't boost either (except for a short while - a few seconds, till it gets hot).

But overall a solid showing.
:confused:

My Ivy Bridge i5 has 0 issues maintaining its maximum rated boost speed on all four cores without any tinkering on my end. A halfway decent cooler will keep those chips boosting 24/7.
Posted on Reply
#93
bug
TheinsanegamerN:confused:

My Ivy Bridge i5 has 0 issues maintaining its maximum rated boost speed on all four cores without any tinkering on my end. A halfway decent cooler will keep those chips boosting 24/7.
What i5 exactly and what "boost" frequency does it maintain?
Posted on Reply
#94
YautjaLord
Have slightly mixed feelings about the showcasing of BF1 as well, meaning - couldn't for a f*ckin' life of me see the actual fps counters/graphs/stats/etc... - that one can be called a con(?), but the fact guy said both rigs had Titan X[P] in 'em should indicate something. And the fact it ran that not yet released Rogue One SW Battlefront DLC (or any other game) in 4k - good news. Right? Overall, I enjoyed it more with occasional "wtf? you guys serious? dafuq y'all talking about?" moments in-between. :laugh:

P.S. I see 4k LED IPS monitors in my PC store drop prices, time for ZEN-based, air-cooled (for now), 4k gaming/Blender'ing rig. Yeah, i use this awesome, awesome app for some project i have for Carma: R & Carma: Max Damage. Can rival 3DSMax actually & you can't beat the awesome price of free. lol
Posted on Reply
#95
HD64G
renz496hardware canucks test was not made using latest Relive driver either. it depends on what tittle being tested in the suite. one of TPU latest GPU review that include both RX480 and GTX1060 (done in mid november; the test on hardware canucks also using AMD november drivers) the relative performance chart still put 1060 6GB ahead of RX480. and the changes from the new driver is not that big. from launch driver to relive both TPU and guru3d shows average around 2-3% performance improvement for RX480. and i also check the numbers from 1060 and RX480 launch drivers versus much more recent one. while RX480 did gain more performance on much recent test the same also happen with 1060. so RX480 is not the only card gaining more performance. hardware canucks chart probably going to paint a different result if they include a few new tittles that favors nvidia cards such as Watch Dogs 2 and Dishonored 2.
You didn't check the last page in the HW Canucks review, eh? Because there is a comparison table there that perfectly describes the benefit the RX480 had vs the GTX1060 from June 2016 until November's driver release. So, while you are correct that the 1060 also gained from newer drivers, the RX480 gained much more, closing the gap to just over 2% for the games HW Canucks tested and to just over 4% for the game list TPU's @W1zzard uses. And it is an established fact now that GCN clearly beats nVidia's arch there, with the gap being visible for almost all games.
Posted on Reply
#96
ADHDGAMING
renz496if they have better solution they will charge for it. heck even when they not they still charge 1k for 9590.....initially.
Su made it a point to mention the Intel chip's price for a reason. Pay attention, she was dropping clues. It will certainly be cheaper.
Posted on Reply
#97
Assimilator
ADHDGAMINGSu made it a point to mention the Intel chip's price for a reason. Pay attention, she was dropping clues. It will certainly be cheaper.
You have no understanding of how companies work, do you?
Posted on Reply
#98
dozenfury
FYI, another site (AT) did some CSI-ish examination of the presentation, because scores were suspiciously not lining up with where they should be in that Ryzen demo, and they noticed something. In the AMD Ryzen demo, the video shows that the render sample value was changed to 100 instead of the default 200, which dramatically changes the scores. So if you download the demo from AMD and just hit go, you will have the default 200 render sample value and will get a slow score. You will need to change the render sample value to 100 to get an accurate comparison, and that's assuming they also didn't change anything else.

Incredibly sneaky, and more reason to wait for real third-party unbiased benches. I'm very hopeful it turns out to be good, but stuff like this raises some major red flags especially considering AMD's history of letdowns.
Posted on Reply
#99
bug
dozenfuryFYI, another site (AT) they did some CSI-ish examination of the presentation, because scores were suspiciously not lining up with where they should be in that Ryzen demo and they noticed something. In the AMD Ryzen demo, the video shows that the render sample value is changed to 100 instead of the default 200, which dramatically changes the scores. So if you download the demo from AMD and just hit go, you will have the default 200 render sample value and will have a slow score. You will need to change the render sample value to 100 to get an accurate comparison, and that's assuming they also didn't change anything else.

Incredibly sneaky, and more reason to wait for real third-party unbiased benches. I'm very hopeful it turns out to be good, but stuff like this raises some major red flags especially considering AMD's history of letdowns.
This is the reason I mostly avoid these articles altogether. In most cases it spares me the unrealistic hype.
Posted on Reply
#100
Octavean
AscalaphusYou're missing the bigger picture:
It will cost LESS than a i7-6900k as well.
Well, to be fair the bigger picture is that RYZEN will likely cost less than what the Core i7 6900K currently costs. Right now the Intel HEDT line is alone in the market. Reasonable competition in the HEDT segment of the market from AMD setting lower prices on the RYZEN line could simply force Intel to drop prices. Or Intel might further segment their lineup with newer models at lower prices in order to avoid dropping price on their current lineup.

I don't think there is any doubt that Intel can weather the storm of lower prices at least for the short term. In fact Intel could probably lower their prices below AMD pricing and boost performance if need be.

It's good to see there is still some fight in AMD, but a price war could get ugly for them. Not ugly for the consumer, though...

Right now, I already have a Core i7 5820K processor. Theoretically speaking:

If a RYZEN 8-core 16-thread monster came in at about ~$300 to ~$400 USD retail I would be very interested and tempted. However, if Intel were to drop prices on the Core i7 6900K to the same price range, it would be easier and simpler for me to just buy the Core i7 6900K because it's a drop-in upgrade. That is a big "if" though.
Posted on Reply