# What I found about the 5775C eDRAM's impact on gaming performance



## cucker tarlson (Aug 26, 2017)

I found the option in the BIOS to OC my eDRAM as well as disable it. I chose Watch Dogs 2 as my benchmark because of how CPU-dependent that game is. Resolution is 2560x1440 but graphics settings are low to rule out a GPU bottleneck. The results are staggering.

eDRAM @2GHz - 118 FPS


eDRAM disabled - 91 FPS

eDRAM @1800MHz is probably the sweet spot though; it was only 1 fps less than 2000MHz, so basically the same, within margin of error.

The difference was 29.6%, which is huge. I encourage other 5675C/5775C owners to try it, as long as your board supports disabling the L4. Remember, you have to choose a CPU-heavy game and eliminate the graphics card bottleneck for the results to be meaningful. Use HWMonitor64 to check whether the changes actually apply: it should show eDRAM clock and temperature, while when eDRAM is disabled in the BIOS those readings are missing. XTU can also show the current eDRAM reading, but while I've seen it work in one review, it doesn't work for me or some other people I saw posting on Intel's forum; it shows 0MHz whether the eDRAM is disabled, stock, or overclocked. Use an even multiplier like 16/18/20x; the odd ones don't apply.
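If you want to redo the percentage math on your own runs, it's just the ratio of the two FPS readings. A quick Python sketch with my Watch Dogs 2 numbers (the helper name is mine, purely illustrative):

```python
# Percentage gain between two benchmark runs; numbers are the
# Watch Dogs 2 results above.
def pct_gain(baseline_fps: float, test_fps: float) -> float:
    """Improvement of test_fps over baseline_fps, in percent."""
    return (test_fps / baseline_fps - 1.0) * 100.0

print(f"eDRAM @2GHz vs disabled: +{pct_gain(91, 118):.1f}%")  # ~29.7%
```

(Same formula for any pair of runs; just swap in your own baseline and test FPS.)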


----------



## phanbuey (Aug 26, 2017)

Yeah, that was always one of the best chips for gaming... Intel nipped that one real quick.


----------



## cucker tarlson (Aug 26, 2017)

Don't know if I mentioned it, but the test was done twice to make sure I wasn't dreaming. I mean, I knew the 5775C was dope for gaming (see the review), about the same level of performance as a 7700K at lower power draw. But seeing a 30% gain from the L4 alone makes me ask why people still buy 7700Ks for gaming rigs, when the 5775C runs significantly cooler, is just as fast as Kaby Lake, and doesn't require a new platform.


----------



## Frick (Aug 26, 2017)

cucker tarlson said:


> Don't know if I mentioned it, but the test was done twice to make sure I wasn't dreaming. I mean, I knew the 5775C was dope for gaming (see the review), about the same level of performance as a 7700K at lower power draw. But seeing a 30% gain from the L4 alone makes me ask why people still buy 7700Ks for gaming rigs, when the 5775C runs significantly cooler, is just as fast as Kaby Lake, and doesn't require a new platform.



If you're already on LGA1150 it doesn't, but it's also €100 more than the 7700K where I live. There were never many of them, and they were always very expensive.


----------



## cucker tarlson (Aug 26, 2017)

That's a lot. 25-30 euro difference where I live.


----------



## biffzinker (Aug 26, 2017)

Frick said:


> and they always were *very expensive*.


I thought of swapping out the 4790K for a 5775C after the latest firmware update for my ASUS Z97 mobo, but the cost wasn't worth it to me.


----------



## P4-630 (Aug 26, 2017)

I've heard very little about these processors. I don't remember them even being sold in my country, and I didn't read anything on the internet about these odd chips.
Why?


----------



## cucker tarlson (Aug 26, 2017)

The power savings and heat reduction compared to the 4790K are amazing. My 4790K was much warmer and louder.


----------



## biffzinker (Aug 26, 2017)

P4-630 said:


> I've heard very little about these processors. I don't remember them even being sold in my country, and I didn't read anything on the internet about these odd chips.
> Why?


There were reviews posted everywhere when the NDA was lifted.


----------



## P4-630 (Aug 26, 2017)

biffzinker said:


> There were reviews posted everywhere when the NDA was lifted.



Guess I was hibernating at the time lol...


It seems they still sell them in my country:


----------



## phanbuey (Aug 26, 2017)

P4-630 said:


> I've heard very little about these processors. I don't remember them even being sold in my country, and I didn't read anything on the internet about these odd chips.
> Why?



Because they were expensive to make... 128 MB of L4 is kind of a big deal.


----------



## cucker tarlson (Aug 26, 2017)

The CPU was running at the stock 3.7GHz when I tested the L4; overclocking it to 4.2GHz only made a 2 fps difference once the eDRAM was already doing its wonders. Now I'll disable the eDRAM and compare the performance gain from a 14% core OC with eDRAM disabled vs stock clocks with the L4 cache.

Update: got 104 fps at 4.2GHz OC with the L4 disabled.

So let's sum up:

91-95 fps @3.7GHz, eDRAM disabled
102-104 fps @4.2GHz, eDRAM disabled
*117-118 fps @3.7GHz, eDRAM @2GHz*
119-120 fps @4.2GHz, eDRAM @1.8GHz (probably hitting some sort of bottleneck; I wager it's the game itself. I closely monitored the TDP/power limit and it's not that, so it's either the game or the GPU)


12% increase from a 14% core OC. Power draw and temps increased significantly.
29% increase from using 2GHz eDRAM. No observable power draw or temperature increase.

Wonder how much megahurtz and how many watts it'd actually take for a core OC alone to match the increase from the eDRAM. Intel must be wishing they never released this beauty.
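To put a rough number on that question: assuming fps keeps scaling with core clock at the rate I measured (+12% fps from a +14% OC, a big assumption since games stop scaling eventually), a quick extrapolation says a core-only OC would need to hit roughly 5GHz to match the eDRAM gain:

```python
# Linear extrapolation from the numbers above; treat the answer as a
# ballpark, not a promise.
fps_per_clock = 0.12 / 0.14          # measured scaling: +12% fps per +14% clock
edram_gain = 0.29                    # fps gain from eDRAM @2GHz, same clocks
needed_clock_gain = edram_gain / fps_per_clock
print(f"core-only OC needed: ~{3.7 * (1 + needed_clock_gain):.2f} GHz")
```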


----------



## newtekie1 (Aug 26, 2017)

Interesting, I always thought the eDRAM was largely just used by the iGPU and was pretty useless for CPU tasks.

Makes me seriously consider picking up a 5775C...


----------



## MagnyCours (Aug 26, 2017)

Recent open-world games released by Ubisoft are really hard on CPUs for some reason lol. You should try benchmarking GR: Wildlands if you have it; the game hammers my 4790K at 4.8GHz pretty hard.


----------



## cucker tarlson (Aug 26, 2017)

newtekie1 said:


> Interesting, I always thought the eDRAM was largely just used by the iGPU and was pretty useless for CPU tasks.
> 
> Makes me seriously consider picking up a 5775C...


It's accessible to both the CPU and the iGPU. There's not much to be found about its function, but from what I understand it's like a low-latency buffer between the L3 and DRAM. Whatever its purpose, the benefits in CPU-limited scenarios are huge.
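The way a buffer like that pays off can be sketched with a toy average-access-time model; every request the L4 catches is cheaper than going all the way out to DRAM. All numbers below are made up for illustration, not measurements:

```python
# Toy average-memory-access-time model: a hit in a ~40ns L4 beats a
# trip to ~70ns DRAM, so the average cost of an L3 miss drops as the
# L4 hit rate rises. Illustrative numbers only.
def amat_ns(l4_hit_rate: float, l4_ns: float = 40.0, dram_ns: float = 70.0) -> float:
    return l4_hit_rate * l4_ns + (1.0 - l4_hit_rate) * dram_ns

for hit in (0.0, 0.5, 0.8):
    print(f"L4 hit rate {hit:.0%}: {amat_ns(hit):.0f} ns per L3 miss")
```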



MagnyCours said:


> Recent open-world games released by Ubisoft are really hard on CPUs for some reason lol. You should try benchmarking GR: Wildlands if you have it; the game hammers my 4790K at 4.8GHz pretty hard.



Was gonna pick up Wildlands when the price dropped; it did drop but now it's jumped back up, so no go, not gonna pay that much for what we in Poland call "a reheated pork chop".
That said, many new games need ridiculous CPU resources. Watch Dogs 2 is one I like benchmarking, and I found certain parts of Dishonored 2 to be a CPU hog as well. Battlefield 1 is an obvious example, and Deus Ex: Mankind Divided also shows huge demand for CPU resources.


Testing The Witcher 3 now, Novigrad (CPU heavy), 1080p, low:

CPU 3.7GHz, L4 disabled - 130 fps
CPU 4.2GHz, L4 disabled - 141 fps

eDRAM 1.8GHz tests coming in a while.

CPU 3.7GHz, L4 enabled - a whopping 157 fps (+21% over 3.7 L4 disabled)
CPU 4.2GHz, L4 enabled - an even more whopping 175 fps (+24% over 4.2 L4 disabled)


----------



## newtekie1 (Aug 27, 2017)

cucker tarlson said:


> It's accessible to both the CPU and the iGPU. There's not much to be found about its function, but from what I understand it's like a low-latency buffer between the L3 and DRAM. Whatever its purpose, the benefits in CPU-limited scenarios are huge.



Yeah, it seems like it is used strictly as an L4 when the iGPU is disabled.  Interesting...


----------



## Nuckles56 (Aug 27, 2017)

That seriously makes me want to buy one, even though I'd have to buy a board to go with it.


----------



## LAN_deRf_HA (Aug 27, 2017)

I never understood why they didn't include this in later chips. It certainly made a large performance improvement. Wonder if it hurt profit margins.


----------



## Nuckles56 (Aug 27, 2017)

LAN_deRf_HA said:


> I never understood why they didn't include this in later chips. It certainly made a large performance improvement. Wonder if it hurt profit margins.


The L4 cache was too expensive to manufacture and made the CPU too good


----------



## Nordic (Aug 27, 2017)

Nuckles56 said:


> The L4 cache was too expensive to manufacture and made the CPU too good


No way. There is no reason a profit-oriented business would say it was too good if they could sell it for enough money. I could understand too expensive, but if it was truly that good they could charge that much extra for it.

There is something else going on here. Review sites are showing the eDRAM has a minimal performance gain.

If anything, I think the OP is showing how much Watch Dogs utilizes memory speed. These are highly task-specific performance gains in memory-intensive applications.


----------



## cucker tarlson (Aug 27, 2017)

Nordic said:


> No way. There is no reason a profit-oriented business would say it was too good if they could sell it for enough money. I could understand too expensive, but if it was truly that good they could charge that much extra for it.
> 
> There is something else going on here. Review sites are showing the eDRAM has a minimal performance gain.
> 
> If anything, I think the OP is showing how much Watch Dogs utilizes memory speed. These are highly task-specific performance gains in memory-intensive applications.


Read the review. The 5775C at 4.2GHz beats a 7700K at 5GHz consistently (review in my second post) in Battlefield 1, Dishonored 2, Hitman, Deus Ex MD, Witcher 3, Watch Dogs 2, Warhammer and Tomb Raider. I posted my own Witcher 3 results as well, and the L4 gives a 25% improvement there too. Memory speed has an impact on every game when the particular sequence relies on the CPU, but that's not the case I'm making; the eDRAM runs at 1600/1800/2000MHz, as I pointed out numerous times. It's the latency that matters.


----------



## phanbuey (Aug 27, 2017)

cucker tarlson said:


> Read the review. The 5775C at 4.2GHz beats a 7700K at 5GHz consistently (review in my second post) in Battlefield 1, Dishonored 2, Hitman, Deus Ex MD, Witcher 3, Watch Dogs 2, Warhammer and Tomb Raider. I posted my own Witcher 3 results as well, and the L4 gives a 25% improvement there too. Memory speed has an impact on every game when the particular sequence relies on the CPU, but that's not the case I'm making; the eDRAM runs at 1600/1800/2000MHz, as I pointed out numerous times. It's the latency that matters.



+100. Back in the FX days at AMD, the only difference between the 4000+ CPUs and the FX was that the FX had 1MB of L2 cache while the rest had 512KB... and the Semprons had 256KB. You would have to clock the Sempron like 200-300MHz above the FX (which, back when 2.8GHz was the top speed, was a lot) just to get parity in SOME apps. Cache helps a lot; memory latency is really core for games.

You can still run some benchmark apps and see the old Socket 754 nForce4 motherboards' latency in the first 3 places - back when NVIDIA and AMD were friends.

Also, if you look at Skylake-X, the i9-7900X tends to beat/match the 7700K, the 7820X is more or less on par if not somewhat slower, and the 7800X is a bit of a joke. Even in games that only use a few threads, having more cache is absolutely huge.

I think they realized that if they didn't lock down clock speed they would have another 2500K situation on their hands, and shelved this design.


----------



## cucker tarlson (Aug 27, 2017)

Shadow Warrior 2 (sorry about the OSD thing; it randomly switches to a graph mode I used some time ago, dunno why)

4.2GHz, L4 disabled - 210 fps
4.2GHz, L4 @1.8GHz - 240 fps (+14%)

http://imgur.com/a/24weB


4.2GHz, L4 disabled - 200 fps (framerate was freaking out; when the L4 is disabled and the CPU hits a bottleneck the stuttering is pretty bad, frametimes go nuts)
4.2GHz, L4 @1.8GHz - 238 fps (+19%, smooth framerate, consistent frametimes)

http://imgur.com/a/UpQSn


Needs to be said that the L4 helps a lot with preventing stuttering. Back when I was still running a 4790K, hitting 90% or higher on all threads in Watch Dogs 2 equaled pretty obvious stuttering. Now even when I'm doing tests on the low preset and the 5775C is pegged at 90%, the effect ain't half as bad. There are some occasional hitches but the gameplay is much smoother.


Another WD2 location tested:
4.2GHz, L4 @2GHz - 110 fps (+*32.5%*)
4.2GHz, L4 @1.8GHz - 109 fps *(+31.3%)*
4.2GHz, L4 @1.6GHz - 101 fps (+21%)
4.2GHz, L4 off - 83 fps

Seems like a pattern: the higher the CPU usage, the bigger the impact of the L4. +14-19% in Shadow Warrior 2 at 50-60% CPU usage, +30% in WD2 with all threads pegged at 90-100%.


----------



## biffzinker (Aug 27, 2017)

Nordic said:


> There is something else going on here. Review sites are showing the eDRAM has a minimal performance gain.


When Tech Report posted their review of the Skylake 6700K, the Broadwell 5775C showed less drop-off in Sandra cache and memory bandwidth at the 64MB block size (128MB L4 cache).





Higher bandwidth in AIDA64 memory copy.





Improved latency at the 64MB-128MB block size in Sandra cache and memory latency.





Ahead of the 4790K in AIDA64 PhotoWorxx





Project Cars, and so on for the gaming focused testing.

Higher frame rate




Less stuttering





Source link for the rest of the gaming benchmarks.
http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/6


----------



## newtekie1 (Aug 27, 2017)

Nordic said:


> No way. There is no reason a profit oriented business would say it was too good if they could sell it for enough money. I could understand too expensive, but if it was truly that good they could charge that much extra for it.



It wasn't about making the CPU too good; it was entirely about cost.  The eDRAM was a second die on the CPU package, and that die was about 50% the size of the CPU die.  Die space is what makes a CPU expensive to manufacture, and multiple dies on a package add even more expense.


----------



## blobster21 (Aug 27, 2017)

Can you show me a picture of the eDRAM control in the Z97X Gaming BIOS?


----------



## R0H1T (Aug 27, 2017)

cucker tarlson said:


> It's accessible to both the CPU and the iGPU. There's not much to be found about its function, but from what *I understand it's like a low-latency buffer between the L3 and DRAM*. Whatever its purpose, the benefits in CPU-limited scenarios are huge.
> 
> 
> 
> ...


Well, that's what an L4 is, technically I guess. What would be interesting to see is whether RAM speed also has a big effect on any of these tasks. I mean, how do OC L4 + stock RAM vs stock L4 + OC RAM results compare? I'm talking about 2100MHz or above in terms of RAM speed.

Also, how does an OC'd 5775C handle OCing the L4 and RAM? Is there any benefit to doing all three, or is OCing any two of them better?


----------



## Nordic (Aug 27, 2017)

The reviews I read that specifically looked at the eDRAM say it provides marginal improvements.
http://www.legitreviews.com/edram-overclocking-on-the-intel-core-i7-5775c-broadwell-cpu_167425
http://www.pcworld.com/article/2950...ted-desktop-broadwell-has-one-neat-trick.html


----------



## cucker tarlson (Aug 27, 2017)

Nordic said:


> The reviews that I read that specifically looked at the edram say it provides marginal improvements.
> http://www.legitreviews.com/edram-overclocking-on-the-intel-core-i7-5775c-broadwell-cpu_167425
> http://www.pcworld.com/article/2950...ted-desktop-broadwell-has-one-neat-trick.html


Well, I guess they suck then. How can you miss a 30% improvement? I've seen this problem back when I became interested in the topic of RAM speed vs min fps. Most reviewers are just too lazy or incompetent to use a proper methodology. They just run a built-in game benchmark, or worse (oh dear God) a GPU benchmark like Valley, report an average fps increase of 1% between slow and fast RAM, and then the community is totally misled unless someone posts their own results, or a smart reviewer takes the time to do it properly and gets enough credit.



R0H1T said:


> Well, that's what an L4 is, technically I guess. What would be interesting to see is whether RAM speed also has a big effect on any of these tasks. I mean, how do OC L4 + stock RAM vs stock L4 + OC RAM results compare? I'm talking about 2100MHz or above in terms of RAM speed.
> 
> Also, how does an OC'd 5775C handle OCing the L4 and RAM? Is there any benefit to doing all three, or is OCing any two of them better?



Will try soon.  But I'm pretty sure all of them have a big impact.

Frequency increase translates more or less proportionally to added performance; I measured a 12% fps increase from a 14% OC (3700→4200). Then on top of that there's an additional 21% from using the stock 1.6GHz L4, rising to 32% when the L4 is OC'd to 2GHz. Will do a RAM test in a moment.

Tried to do the test but my BIOS is acting up and I'm getting fed up. Will attempt it at a later time.


----------



## OneMoar (Aug 27, 2017)

Threadripper could benefit from eDRAM; they're already wasting a bunch of package space.


----------



## Aquinus (Aug 27, 2017)

newtekie1 said:


> It wasn't about it making the CPU too good, it was entirely because of cost. The eDRAM was a second die on the CPU package, and that die was about 50% of the size of the CPU die. Die space is what makes a CPU expensive to manufacture, and multiple dies on a package adds even more expense.


It's also eDRAM, not SRAM. DRAM is far cheaper to manufacture than the SRAM that standard cache is made of, because you're talking about one transistor and one capacitor versus six transistors for a single bit. It's easier to produce DRAM because there is literally less circuitry to build. Controlling DRAM is the tricky part, but it's not like Intel doesn't already know how to do that with DRAM that isn't on the same package.

Simply put, putting DRAM on the package is smart because you get the benefit of DRAM, which is cost, and the benefit of cache, which is proximity to the "processor" (in the sense of the thing doing the processing, like a core or a hardware thread). The extra cost probably comes from wiring everything up on the package itself, not from producing the eDRAM.
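To put rough numbers on the cell-count difference (these are the textbook 6T SRAM and 1T1C DRAM cell designs, not Intel's actual eDRAM macro):

```python
# Classic cell counts behind "DRAM is cheaper per bit", scaled to the
# 128 MB eDRAM. Textbook cell designs only, for illustration.
bits = 128 * 1024 * 1024 * 8       # 128 MB expressed in bits
sram_transistors = 6 * bits        # six transistors per SRAM bit
dram_transistors = 1 * bits        # one transistor (plus a capacitor) per DRAM bit
print(f"6T SRAM:    {sram_transistors / 1e9:.1f}B transistors")
print(f"1T1C eDRAM: {dram_transistors / 1e9:.1f}B transistors")
```

Six times fewer transistors per bit is exactly why nobody builds a 128MB SRAM cache.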


OneMoar said:


> Threadripper could benefit from eDRAM; they're already wasting a bunch of package space.


This, this, and more this.


----------



## Vario (Aug 27, 2017)

If I recall correctly, the 5775C didn't clock as high as the 4790K, Skylake was right around the corner, and the 5775C wasn't really promoted for gaming at release.  There were some Z97 boards that didn't support it, and it didn't work with Z87.  However, there were plenty of reviews in 2015 extolling the virtues of the Crystal Well eDRAM for gaming.

The best time to buy one was in 2015-2016 when it was in the upper $200s. Now that it's in the upper $300s to low $400s, it's priced so high it's not worth buying over a Kaby Lake, which gives you typical overclocked clock speeds out of the box and lets you run a modern chipset, DDR4, etc.

I see the i5-5675C is in the mid-to-upper $200 range, so it might be alright if you already have a 4690K or 4790K and a compatible Z97 board, and don't mind losing two threads coming from a 4790K.


----------



## biffzinker (Aug 27, 2017)

Vario said:


> The best time to buy one was in 2015-2016 when it was in the upper $200s. Now that it's in the upper $300s to low $400s, it's priced so high it's not worth buying over a Kaby Lake, which gives you typical overclocked clock speeds out of the box and lets you run a modern chipset, DDR4, etc.


Had a look at the historical pricing for the 5775C; I'm not seeing anything where it dipped under $300.
PC Part Picker





Also the end of life status shows the last shipments were in August.
End-of-Life date: Last order date is February 24, 2017
Last shipment date: August 11, 2017



----------



## LAN_deRf_HA (Aug 27, 2017)

Vario said:


> 5775C didn't clock as high as the 4790K


That was part of the appeal: it was topping gaming charts while running much slower and using less power.


----------



## Vario (Aug 28, 2017)

biffzinker said:


> Had a look at the historical pricing for the 5775C; I'm not seeing anything where it dipped under $300.
> PC Part Picker
> 
> ...


Seemed like it was around $280 at Microcenter, IIRC.


----------



## EarthDog (Aug 28, 2017)

Did I miss where a response in one title makes the rest true? I recall reviews on these and didn't see it making much of a difference in games at the time... it didn't in our testing.

What changed... and what other games are we seeing positive results in?


----------



## LAN_deRf_HA (Aug 28, 2017)

EarthDog said:


> Did I miss where a response in one title makes the rest true? I recall reviews on these and didn't see it making much of a difference in games at the time... it didn't in our testing.
> 
> What changed... and what other games are we seeing positive results in?



As long as I saw it in reviews, it was always at or near the top against much faster-clocked chips, even in non-CPU-limited scenarios.






And on a side note it also hasn't been topped on integrated graphics either.


----------



## EarthDog (Aug 28, 2017)

Something like that shows parity across many chips. Looking at the review it came from, it's up there, but you aren't seeing differences like what the OP is seeing, which is my point. That isn't common.

http://www.anandtech.com/show/9320/intel-broadwell-review-i7-5775c-i5-5675c/9

The thread title should add the game for clarity, so as not to imply this happens in all, or even many, games.


----------



## eidairaman1 (Aug 28, 2017)

Well, this does nothing more than prove that if you have a Sandy/Ivy/Haswell/Broadwell, there's no point in upgrading to Skylake/Kaby Lake...


----------



## Nordic (Aug 28, 2017)

cucker tarlson said:


> How can you miss a 30% improvement ?


I am not saying that there are no performance improvements. Legit Reviews showed a 1% improvement, which does sound too small. DDR3 memory scaling on Haswell gave as much as 12% more performance with faster memory in certain games. This eDRAM should show at least 12% in certain memory-intensive games, but 30% across the board sounds outlandish. If eDRAM increased performance by 30% across the board, I would think Intel would ship more processors with it despite the cost.
http://www.anandtech.com/show/7364/memory-scaling-on-haswell/7


----------



## DeathtoGnomes (Aug 28, 2017)

@cucker tarlson OK, this is all good info, but what about power consumption? What's the difference with your tweaking here?


----------



## cucker tarlson (Aug 28, 2017)

EarthDog said:


> Something like that shows parity across many chips. Looking at the review it came from, it's up there, but *you aren't seeing differences like what the OP is seeing, which is my point.* That isn't common.
> 
> http://www.anandtech.com/show/9320/intel-broadwell-review-i7-5775c-i5-5675c/9
> 
> The thread title should add the game for clarity, so as not to imply this happens in all, or even many, games.


Tested on a 290X/980 and you're surprised the games aren't showing differences? There's a GPU bottleneck in all of them, lol, can't you see that?
The title is as clear as day; it says what I found about the performance. How can that be clearer? Mention the games? Uhm, all of them? Did you read the review from PurePC where they test at 1080p with a GTX 1080 in CPU-limited parts of the game? Try that maybe. Here's a teaser







My second post of the thread (page 1; "spis treści" is the table of contents, click to unfold). What's funny is that the reviewer, who always engages in discussion below his articles, said that in the OC vs OC test he overclocked the 5775C core only; the eDRAM was left at stock. With what I assume is a ~10% difference between stock (1.6GHz) and 2GHz, the 5775C should sail way clear of the 7700K.





DeathtoGnomes said:


> @cucker tarlson OK, this is all good info, but what about power consumption? What's the difference with your tweaking here?



The L4 has no impact on power draw. Core OC does, but the CPU sips power anyway. The package wattage reading at 4.2GHz/1.29V with the L4 OC'd to 2GHz is still around 60W max. IBT on very high raises it to around 80W.


----------



## biffzinker (Aug 28, 2017)

The Euler3D CFD benchmark shows what the 5775C is capable of compared to an overclocked 6700K (4.6GHz).

http://www.legitreviews.com/intel-core-i7-6700k-skylake-processor-review_169935/7

There is also overclocking the eDRAM.









http://www.legitreviews.com/edram-overclocking-on-the-intel-core-i7-5775c-broadwell-cpu_167425


----------



## cucker tarlson (Aug 28, 2017)

Nordic said:


> I am not saying that there is no performance improvements. Legit reviews showed a 1% improvement which does sound too small. DDR3 memory scaling on haswell had as high as 12% more performance with faster memory in certain games. This edram should at least show 12% in certain memory intensive games, but 30% across the board sounds outlandish. If edram increased performance 30% across the board, I would think intel would have more processors with it despite the cost.
> http://www.anandtech.com/show/7364/memory-scaling-on-haswell/7


I misrepresented even my own findings in this post, sorry. 30% is only for OC'd eDRAM (2GHz vs stock 1.6GHz). Stock is around 20-25% when the CPU is the limitation but the GPU has a lot of horsepower left.


----------



## EarthDog (Aug 28, 2017)

cucker tarlson said:


> Tested on a 290X/980 and you're surprised the games aren't showing differences? There's a GPU bottleneck in all of them, lol, can't you see that?
> The title is as clear as day; it says what I found about the performance. How can that be clearer? Mention the games? Uhm, all of them? Did you read the review from PurePC where they test at 1080p with a GTX 1080 in CPU-limited parts of the game? Try that maybe. Here's a teaser
> 
> 
> ...


I see blanket assumptions applied. It's not a huge leap, but just knowing that some titles don't scale with Intel clock speeds may curb some of the increases.


----------



## wietjunk (Aug 29, 2017)

Hey, I am new here. Thanks for the awesome thread; I've had this CPU sitting in my room sucking up dust for some time.

There is almost nothing about this CPU on the net. Tomorrow I receive my ASUS ROG Impact VII back from RMA.


----------



## eidairaman1 (Aug 29, 2017)

wietjunk said:


> There is almost nothing about this CPU on the net. Tomorrow I receive my ASUS ROG Impact VII back from RMA.



Get the latest release or beta firmware for it.

By the way, it is against forum rules to post several replies with no one else's in between, so use the edit and delete buttons accordingly. There is a multi-quote button as well.


----------



## wietjunk (Aug 29, 2017)

Ok thanks.


----------



## RichF (Aug 30, 2017)

*Intel’s Skylake lineup is robbing us of the performance king we deserve.* The one Skylake processor I want is the one that Intel isn't selling.


----------



## eidairaman1 (Aug 30, 2017)

wietjunk said:


> Ok thanks.



Combine your posts above please


----------



## cucker tarlson (Aug 30, 2017)

I didn't know the 5775C was so unpopular. Look at the size of the list for the Devil's Canyon ownership club (2 CPUs: 4790K and 4690K) and compare it to the list of 5775C/5675C owners.

http://www.overclock.net/t/1490324/the-intel-devils-canyon-owners-club
http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club

Crazy. Polish enthusiast forums are pretty much barren, but there are still more 5675C/5775C owners there than here. Ppl look at the 5775C in my sig and go like whut bro, but 7700K got moar megahurtzzz



RichF said:


> *Intel’s Skylake lineup is robbing us of the performance king we deserve.* The one Skylake processor I want is the one that Intel isn't selling.


Wow, that 6785R sounds so dope yet so obscure that I'd never even heard of it. Wonder if it's possible to find an OEM/barebones desktop featuring this beast anywhere.


All that remotely resembles a desktop PC and comes with Skylake and 128MB of eDRAM is this NUC:

http://allegro.pl/zestaw-nuc-boxnuc6i7kyk-i7-6770hq-8gb-ssd-1tb-m-2-i6933251324.html

It has a 6770HQ
https://ark.intel.com/pl/products/93341/Intel-Core-i7-6770HQ-Processor-6M-Cache-up-to-3_50-GHz

but it's still BGA

The NUC looks cracking and it does feature USB Type-C/Thunderbolt 3, but it's still gonna be limited to 5GB/s, which will severely cripple any modern high-end or even mid-range card.
That NUC with the Alienware GPU dock would look so sick hooked up to my TV...










Seems like up to a 30% performance hit to an OC'd 980 Ti, but I suppose a GTX 1050 Ti or even a 1060 would be a good choice to go with the external GPU docking station.


----------



## Fouquin (Aug 30, 2017)

LAN_deRf_HA said:


> I never understood why they didn't include this in later chips. It certainly made a large performance improvement. Wonder if it hurt profit margins.





Nuckles56 said:


> The L4 cache was too expensive to manufacture and made the CPU too good



They did include it, though. Skylake still has eDRAM, but it runs off the system agent and is transparent to the system as DRAM. It no longer acts as an L4 behind the LLC, but it does still sit in front of DRAM access if any device needs it.


----------



## cucker tarlson (Aug 30, 2017)

Fouquin said:


> They did include it, though. Skylake still has eDRAM, but it runs off the system agent and is transparent to the system as DRAM. It no longer acts as an L4 behind the LLC, but it does still sit in front of DRAM access if any device needs it.



here's the 5775c one for comparison



Spoiler: pic


----------



## Flaky (Aug 30, 2017)

eDRAM is still present on the SKUs where it is most beneficial: laptops and compact platforms, where there's no overclocking headroom and no means to use higher-clocked RAM.
On current socketed desktop platforms this is not a problem. The faster your main memory, the smaller the benefit from eDRAM, up to the point where main memory is faster and has lower latency than the eDRAM.
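To sketch that crossover with toy numbers (latencies purely illustrative, not measured):

```python
# The eDRAM saves roughly (DRAM latency - eDRAM latency) per hit, so
# as main memory gets faster the benefit shrinks toward zero.
edram_ns = 40.0
for dram_ns in (75.0, 60.0, 45.0, 40.0):
    saving = max(dram_ns - edram_ns, 0.0)
    print(f"DRAM at {dram_ns:.0f} ns -> eDRAM saves {saving:.0f} ns per hit")
```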


----------



## cucker tarlson (Aug 30, 2017)

Flaky said:


> eDRAM is still present on the SKUs where it is most beneficial: laptops and compact platforms, where there's no overclocking headroom and no means to use higher-clocked RAM.
> On current socketed desktop platforms this is not a problem. The faster your main memory, the smaller the benefit from eDRAM, up to the point where main memory is faster and has lower latency than the eDRAM.


Well, if you prefer to spend $100 more on memory than $30 on a CPU, then yes.

Lower latency than eDRAM? I don't think DDR4 can do that. Neither a 7700K nor a 5930K running DDR4-3600 CL14 and some ridiculous (~4500MHz) northbridge speeds can go below 40ns like the 5775C can at stock.

Faster memory read/write? Yes, tuned DDR4-3600 will beat stock eDRAM. A 5775C @4.4GHz with eDRAM @2GHz and some decent DDR3-2133 CL9 should still beat a 5GHz 7700K with DDR4-3600 by a whisker, or at least be as fast at lower cost and power draw.


----------



## Vayra86 (Aug 30, 2017)

With current DDR4 prices I might actually just try to find one of these and put some nice DDR3 next to it for my next build. I've always known the 5775C was special, but seeing it today, especially with regard to latency and min fps, is quite an eye-opener. THANKS for that, man! Topping the charts over a 5GHz 7700K while pushing only 4.2 is just... wow.

Not a huge fan of the i7-on-steroids that is the 7700K. Feels like a weaksauce attempt to get big numbers, like AMD's 9590.

I have a question too (actually seriously researching this now): overclocking seems to cap out in the 4.0-4.4GHz range. Can anyone provide input on how lucky you'd have to be to hit 4.2-4.4? I saw the OP at around 1.375V or so, which is quite a lot... Are these reliably 4.2GHz chips, or is that optimistic?

@cucker tarlson what are your temps for that 4.4 @ 1.38V?


----------



## cucker tarlson (Aug 30, 2017)

The #1 thing I like about the 5775C is how fun it is to overclock. I think it surpasses my old 2500K, and I never expected to say that ever again, certainly not with what Intel has been doing lately. Base clock is just 3300MHz and the 4-core turbo is 3600MHz, which means overclocking it to 4400MHz at 1.38V is a 22.2% increase. In comparison, a 7700K (4.4GHz turbo on 4 cores) would need 5376MHz to reach a 22.2% OC. You can overclock the eDRAM on the 5775C as well, and that comes at no power/temperature penalty while giving you very nice returns in games. It gets even sweeter given that my tests clearly show the returns from using/overclocking the eDRAM grow with CPU frequency, as long as you're not GPU-bottlenecked. All that at incredibly low temperatures and power consumption. During the testing period I ran CPU-intensive games at the low preset with extra voltage and clock speed; CPU load was constantly 90%+, yet the CPU stayed in the low 50s/high 40s and the fan on the D15S was only spinning at 750rpm. Compared to the 4790K I had, that's just ridiculous for summer temperatures. Devil's Canyon at 4.6GHz/1.30V would easily hit the 70s at higher fan speed.
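For anyone checking the headroom arithmetic, here it is spelled out (turbo clocks as above; my 5376MHz figure came from rounding the percentage first, exact math lands a couple of MHz higher):

```python
# Percent OC relative to the stock 4-core turbo, and the clock a
# 7700K would need for the same relative bump.
def oc_percent(turbo_mhz: float, oc_mhz: float) -> float:
    return (oc_mhz / turbo_mhz - 1.0) * 100.0

print(f"5775C 3600 -> 4400 MHz: +{oc_percent(3600, 4400):.1f}%")
print(f"7700K equivalent: {4400 * (4400 / 3600):.0f} MHz")
```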

As for OC, I've seen a few owners claim to hit a wall at 4.0 GHz even with 1.37 V. Most will do 4.2 GHz at 1.35-1.37 V. Quite a few can do 4.3-4.4 GHz below 1.4 V like mine does. The creme de la creme will do 4.5-4.6 GHz, but don't count on that, that's huge luck in the silicon lottery. Pumping up to 1.4 V is pretty much nothing for these CPUs; like I said, the temperatures are very, very moderate.
If you read the Broadwell-C owners' club I posted a link to, you'll see people pumping 1.45 V on water without a delid while maintaining pretty reasonable temperatures, which is just impossible for Devil's Canyon or Kaby Lake.
Especially if you do what I did and exchange 2400 MHz CL11 1.65 V sticks for slower but low-latency 1600 CL8 1.35 V ones, then run them at 2133 CL9 1.512 V. This way you're not contributing as much to the temperature rise under load as 1.65 V would.

I did my tests at 1.3 V/4200 MHz because I was 100% sure that was stable. 4.4 GHz at 1.38 V has been running fine for some time, but for the tests I did here I wanted to run something I was 100% sure was stable. Temps in games rise about 5 degrees for every extra 0.05 V, so expect low 40s at the stock 1.2 V, low 50s at 1.3 V, high 50s/low 60s at 1.38 V. It all depends on ambient and fan settings though. I can run games at 1.38 V with the low-noise adapter (500/600 rpm) but that gives me 67 degrees. Still very neat though, considering that's like 30-40% of the D15S's fan speed.
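The rule of thumb above can be sketched as a rough linear model (the 42 °C baseline at 1.2 V is an assumption for "low 40s"; actual temps depend on ambient, cooler, and fan speed):

```python
def est_game_temp_c(vcore_v: float, base_v: float = 1.20, base_temp_c: float = 42.0) -> float:
    """Rough rule of thumb from the post: ~5 C more in games per extra
    0.05 V above the ~1.2 V stock voltage. The baseline temp is assumed."""
    return base_temp_c + (vcore_v - base_v) / 0.05 * 5.0

print(round(est_game_temp_c(1.30), 1))  # 52.0 -> "low 50s"
print(round(est_game_temp_c(1.38), 1))  # 60.0 -> "high 50s/low 60s"
```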


----------



## EarthDog (Aug 30, 2017)

How doesn't overclocking EDRAM raise power or temps? Physics and all?

If you are going to test temps and power, you may want to move to a more consistent testing method, like a stress test of some sort. Gaming, even if the CPU is at 90%, is way less stressful than stress testing... temps clearly show that.


----------



## cucker tarlson (Aug 30, 2017)

EarthDog said:


> How doesn't overclocking EDRAM raise power or temps? Physics and all?
> 
> If you are going to test temps and power, you may want to move to a more consistent testing method, like a stress test of some sort. Gaming, even if the CPU is at 90%, is way less stressful than stress testing... temps clearly show that.


The increase must be quite low, so it doesn't produce any noticeable changes in the temperature and power draw readings. EDRAM is actually not that fast (quite slow compared to DDR4 or even DDR3, in fact) because it's more low-latency oriented, and I'm not adding any voltage to it either. I'd be more surprised to see a considerable power draw difference than not to see one.

I see you're being quite sour, calling my reporting of gaming temps inconsistent. What's wrong with using game temps as a reference, unless you're just nitpicking? It's only inconsistent if I run too few tests and/or have no idea what I'm actually reporting. And by the way, I mentioned IBT temps in my post as well, so what exactly was your post supposed to contribute to the conversation? Stress testing with IBT produces power draw that I'd never, NEVER see in gaming, which is the topic of this thread, top of the page, capital letters.


----------



## EarthDog (Aug 30, 2017)

Wow, people are just... wow... hackles go up at the slightest sign of questioning anything... What. The. Hell... peeps. 

Anyway, cucker, all I was saying is that it doesn't make sense that overclocking doesn't raise temps... because physics. This may come down to the testing method or many other variables, including increases too small to notice. I was saying to use something MORE CONSISTENT, like a stress test. That's all. Gaming temps, even in the same game, can be all over the place. Variance is higher with variable loads.

Sorry I missed IBT... though it wasn't in post 57 which I responded to. 

Geeeeeeeeeeeeeeeeezus Christ........


----------



## cadaveca (Aug 30, 2017)

cucker tarlson said:


> What's wrong with using game temps as reference unless you're just nitpicking.



Games do not provide consistent loads and typically do not use all of the resources available in a way as to be able to accurately draw conclusions about power usage. You can at best measure the peak over time, but even this peak measurement will be inconsistent.


----------



## cucker tarlson (Aug 30, 2017)

cadaveca said:


> Games do not provide consistent loads and typically do not use all of the resources available in a way as to be able to accurately draw conclusions about power usage. You can at best measure the peak over time, but even this peak measurement will be inconsistent.


Agreed, if you're testing different sequences/locations/games. Not if you're comparing the same location in the same game. That actually reflects changes in power/temperatures pretty well for standard use.



EarthDog said:


> Wow, people are just... wow... hackles go up at the slightest sign of questioning anything... What. The. Hell... peeps.
> 
> Anyway, cucker, all I was saying is that it doesn't make sense overclocking doesn't raise temps... because Physics. This may come down to the testing method or many other variables including not noticeable increases. I was saying to use something MORE CONSISTENT, like a stress test. That's all. *Gaming temps, even in the same game can be all over the place. Variances are higher with variable loads.*
> 
> ...



I wish people reading this thread would use 1/5th of the effort I put into testing to ensure it produces fairly consistent results, not worthless crap. Testing different settings in the exact same game locations is ALL I DO in this thread.


You believe in stress tests and built-in benchmarks, I prefer testing gaming loads and in-game scenarios. Fair enough, opinions vary; you're right in your way just as I'm right in mine. Trust me, I'm doing this for myself more than anyone, just sharing here hoping other 5775c owners will share theirs, but that's been a bit of a downer so far.


----------



## EarthDog (Aug 30, 2017)

cucker tarlson said:


> Stress testing with IBT produces power draw that I'd never, NEVER see in gaming, which is the topic of this thread, top of the page, capital letters.


Considering you brought up the point of power and temps, I find this post curious. I (we) am just responding to what you put out there and trying to help you confirm what you are saying by using a more consistent method. Physics tells us temps and power use will increase. Just because you are not seeing it doesn't mean it isn't there. Not a big deal, just saying. 

Sorry if that didn't "contribute to the conversation", just trying to get clarity on the point you made...

EDIT: As a side note, I am a long-time reviewer/editor (smaller website, freelance reviews for a huge one now), and Dave reviews motherboards here. It's not like we haven't done this before. That isn't to say we can't be wrong... certainly, it happens. I learn just like others do! Just sharing what we know from past testing and established norms. You could be spot on, but unfortunately, due to how games work, the variance is a lot larger than with stress testing, and then there's the physics thing... so I made a suggestion. I'm not sour and not trying to be combative... nothing like that.

I'll leave it alone.


----------



## cucker tarlson (Aug 30, 2017)

Well I guess I should report what I'm NOT seeing, that would be consistent according to you.

/just pulling your leg /

And how is the IBT method more consistent when measuring EDRAM performance in games? Does IBT even take advantage of the EDRAM cache? And how is it anywhere near related to gaming performance?


----------



## EarthDog (Aug 30, 2017)

Power and temps cucker... power and temps. I'm not going down this rabbit hole though cucker. Sorry I even said a word... wow. SMFH.

If you want to see if eDRAM is used in stress testing, enable and disable it like you do for games and compare across a couple of tests (IBT, P95, Intel XTU, OCCT, etc). I am not certain either, but I imagine it would make use of it.


----------



## cucker tarlson (Aug 30, 2017)

LOL, you imagine. Well, it doesn't. Not to any noticeable extent that would be worth proving; the GFlops score in IBT stays the same. But I already knew that. It's just that I never mentioned it before, as it would be irrelevant to the thread, which again is titled EDRAM and performance in games, and as you can see I can be quite anal when people make it about whatever else THEY want it to be about.

I guess I'll make another one: "Does the 5775c get hotter in stress testing than in games? Gee, I don't know, but that's what we're about to find out here. What I can tell you for sure is that it's cucking a lot more power."


----------



## EarthDog (Aug 30, 2017)

Yes, anal... no doubt. I just made a suggestion... jesus christ man..

Just because performance doesn't go up doesn't mean it isn't being used either. This is where fly-by-night testing gets people in trouble... assumptions as a base. You may be right, but it is worth testing, to me. Obviously not to you, and not in this thread.

Good luck. Unsubscribed.


----------



## Vayra86 (Aug 30, 2017)

EarthDog said:


> Power and temps cucker... power and temps. I'm not going down this rabbit hole though cucker. Sorry I even said a word... wow. SMFH.
> 
> If you want to see if eDRAM is used in stress testing, enable and disable it like you do for games and compare across a couple of tests (IBT, P95, Intel XTU, OCCT, etc). I am not certain either, but I imagine it would make use of it.



This would be interesting to see!

In other news, let's try to get along... There is mutual interest here and yet it's turning into a combat zone... RELAX


----------



## cucker tarlson (Aug 30, 2017)

EarthDog said:


> Yes, anal... no doubt. I just made a suggestion... jesus christ man..
> 
> Just because performance doesn't go up doesn't mean it isn't being used either. This is where fly-by-night testing gets people in trouble... assumptions as a base. You may be right, but it is worth testing, to me. Obviously not to you, *and not in this thread*.
> 
> Good luck. Unsubscribed.


Wow I'm glad you actually got it man!


----------



## Kursah (Aug 30, 2017)

Let's keep the drama to yourselves folks, or take it to PM if any of you insist on dragging it on.

I am pretty sure that as adults here everyone can, or should be able to, accept and even welcome criticism and feedback without getting defensive and aggressive about it. Let's all treat each other with respect; that is not a request, it's part of the rules of being a member here. Feel free to review the TPU guidelines, the link is in my sig.

There have been some good points made all around, that's one of the nice things about TPU. We have experience at all levels and all sorts of interesting topics. Let's get along, or infractions will be earned and threads will be closed.


----------



## cucker tarlson (Aug 30, 2017)

Yeah, we're cool, but not all criticism is constructive criticism.

The thread is pretty much dead anyway IMO, sad to admit. I was expecting others to pitch in with their results rather than nitpick irrelevant details in mine.


----------

