Monday, January 12th 2015

G.SKILL Breaks Fastest DDR4 Memory Frequency Overclock at 4255MHz

G.SKILL International Co. Ltd., the world's leading manufacturer of extreme performance memory and solid-state storage, is extremely excited to announce a new memory record for fastest DDR4 memory frequency, set at 4255MHz CL18-18-18! This amazing achievement was attained on the Asus Rampage V Extreme motherboard (X99 chipset) and with the Intel i7-5960X processor, all under sub-zero liquid nitrogen cooling. Below is a screenshot of the record validation by CPU-Z (validation).
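For context on what those timings mean in absolute terms: the first-word CAS latency of a kit follows directly from the data rate and the CL figure. A minimal sketch of that arithmetic in Python, using the values from the announcement:

```python
# Rough first-word CAS latency for a DDR4 kit.
# DDR transfers twice per clock, so the actual I/O clock is data_rate / 2.
def cas_latency_ns(data_rate_mts: float, cl: int) -> float:
    clock_mhz = data_rate_mts / 2        # actual clock in MHz
    cycle_ns = 1000.0 / clock_mhz        # duration of one clock cycle in ns
    return cl * cycle_ns

# The record kit: DDR4-4255 at CL18
print(f"{cas_latency_ns(4255, 18):.2f} ns")  # -> 8.46 ns
```

By this measure the record kit works out to roughly 8.5 ns, versus about 14.1 ns for mainstream DDR4-2133 CL15, so the loose CL18 timing is more than offset by the extreme clock.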

39 Comments on G.SKILL Breaks Fastest DDR4 Memory Frequency Overclock at 4255MHz

#26
Steevo
nelizOh, MS and Intel are big enough to force-feed you their products whether you like it or not, mostly because of their all-but-factual monopoly position. The few tweaks they make to their product, pricing, or positioning are small adjustments to their strategy.

Yep, more GPUs also enjoy the benefits of that memory. It's more the current memory controllers that limit the efficiency of the memory; there was a good article a few years ago on how current IMCs only work at 40-70% efficiency. Remember the days when the memory controller was on the northbridge?
HD-DVD, Betamax, and Windows ME would like a word with you in the corner about this great new investment opportunity.
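On the IMC-efficiency point above: efficiency here means measured throughput divided by the theoretical peak of the memory bus. A minimal sketch of that calculation, with the channel count and the measured figure as illustrative placeholders rather than benchmark results:

```python
# Theoretical peak bandwidth of a DDR4 setup vs. a measured figure.
# Each DDR4 channel has a 64-bit (8-byte) data bus.
def peak_bandwidth_gbs(data_rate_mts: float, channels: int) -> float:
    return data_rate_mts * 8 * channels / 1000.0  # GB/s

peak = peak_bandwidth_gbs(4255, 4)   # quad channel on X99: ~136.2 GB/s
measured = 75.0                      # hypothetical benchmark result, GB/s
print(f"peak {peak:.1f} GB/s, controller efficiency {measured / peak:.0%}")
```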
Posted on Reply
#27
neliz
SteevoHD-DVD, Betamax, and Windows ME would like a word with you in the corner about this great new investment opportunity.
That's why you need to go back and check. The adoption curve is the key here.

In the VHS/Betamax/Video2000 battle, for instance, both Betamax and Video2000 were technologically superior to VHS: more tracks and dual-sided tapes, for example. But VHS won because of its lower price and much higher adoption rate by industry (lower royalties) and consumers (lower price).

With Windows ME, I feel like MS did the perfect thing. Resistance to the switch from the DOS-based shell environments (Win95, 98) to the NT-based environment was high ("they're taking our liberty, freedom, and control of our PC! Bloatware!"). They tried to satisfy both sides and ended up delivering a product that wasn't close to either their old shells or NT. I just kept dual-booting 98 and NT at home because of games and compatibility, but used ME on my notebook at work because I had far fewer USB compatibility issues.

So once everyone was complaining about how bad ME is (me too, don't worry), the only thing Microsoft had to do was blame all the legacy stuff and compatibility issues, then point out how great Windows 2000 is (although I think the old NT 5 had a slightly better interface), and before you know it, people wanted to get rid of ME so badly that they got the next best thing: XP.
They killed off the entire DOS-based environment and managed to transition consumers to their NT environment in just a year.

Name one other company that can transition its users as fast as Microsoft did with Win9x->ME->XP.
Posted on Reply
#28
CAPSLOCKSTUCK
Spaced Out Lunar Tick
Capitalism thrives on innovation and consumerism. Where would we techheads be without it?

I will find the development and adoption of DDR4 interesting, as will many others. When I will buy it, well, that's another question.
Posted on Reply
#30
RichF
nelizThey are called early adopters and they decide which products fail or which technology matures.

Imagine a world where no one bought 4K displays when they were $10,000 because buying them would be "dumb." Do you think display manufacturers would be willing to invest in 4K if they didn't get any return on their investment in the first 5 to 10 years?
4K is dumb, at least for TVs. The HDTV viewing standard should have been 1440p. There shouldn't have been a 1080p at all. 720p should have been replaced by 1440p. That's enough resolution for TV viewing at useful distances.

tftcentral.co.uk/articles/visual_acuity.htm
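The rule of thumb behind that article is that 20/20 vision resolves roughly one arcminute per pixel. A rough sketch of the distance beyond which extra pixels stop being visible, under that assumption (the 55-inch size is just an example):

```python
import math

# Farthest distance (in meters) at which a viewer with ~20/20 acuity
# (one arcminute per pixel) can still resolve individual pixels.
def max_useful_distance_m(diagonal_in: float, horiz_px: int, vert_px: int) -> float:
    aspect = horiz_px / vert_px
    width_m = diagonal_in * 0.0254 / math.sqrt(1 + 1 / aspect ** 2)
    pixel_m = width_m / horiz_px
    return pixel_m / math.tan(math.radians(1 / 60))

# Example: a 55-inch TV at three resolutions
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: pixels blend together beyond ~{max_useful_distance_m(55, w, h):.1f} m")
```

Under that assumption, a 55-inch 1440p panel already out-resolves the eye beyond roughly 1.6 m, which is the gist of the argument for 1440p over 4K at couch distances.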

What gamers actually need are HDTVs that don't have lots of input lag, have good contrast, have good color, have good angles, have fast pixels, don't have IR, and have something like G-Sync or Freesync.

And, 4K is a manufacturer-driven standard, not a consumer-driven one. It was chosen because of how easily manufacturing could transition to it from 1080p. Consumers are pretty much sheep who blindly follow the trends manufacturers push. Enthusiasts are more discerning, to a point, but they aren't the ones driving most tech.
Posted on Reply
#31
Prima.Vera
Completely agree with you, RichF. But imagine that 4K won't be the last standard. The East Asian countries are building their TV networks for 8K by default, so in 20 years or so we will see not 4K TVs but 8K TVs as the standard.
The 4K/8K transmissions will be today's 720p/1080i ones.
Posted on Reply
#32
Sony Xperia S
RichF4K is dumb, at least for TVs. The HDTV viewing standard should have been 1440p. There shouldn't have been a 1080p at all. 720p should have been replaced by 1440p. That's enough resolution for TV viewing at useful distances.
You're mixing things up, and you're actually wrong. Mixing PC standards with TV broadcasting standards is never a good idea.
RichFAnd, 4K is a manufacturer-driven standard, not a consumer-driven one. It was chosen due to its ease of manufacturing transitioning from 1080p.
However, over time, those standards would tend to unify and be close if not one and the same for both.
Prima.VeraBut imagine that 4K won't be the last standard.
They have already said that 8K will be the final standard because it will reach the limits of human vision.
Posted on Reply
#33
Prima.Vera
Sony Xperia SThey have already said that 8K will be the final standard because it will reach human vision abilities.
That depends on the size of the TV and especially the viewing distance. ;)
For a desktop monitor I absolutely don't see any point in going beyond 4K, unless you have bad eyesight and need to sit with your eyes an inch from the screen.
Posted on Reply
#34
neliz
Stop limiting your vision of the future by trying to tie future resolution scaling to today's display technologies.
Posted on Reply
#35
Steevo
nelizThat's why you need to go back and check. The adoption curve is the key here. [...]
Win 95/98/2000; ME was never a real option.
Posted on Reply
#36
FireFox
The Power Of Intel
I don't know if I should laugh or cry.
Posted on Reply
#37
Schmuckley
What do the last 5-6 posts have to do with RAM frequency/performance/throughput?
Posted on Reply
#38
FireFox
The Power Of Intel
ReaderDumbers are always ready to pay more for the numbers.
That's true, I bet there is some dumb person out there already saving up to buy one of those kits :slap:
Posted on Reply
#39
RichF
Sony Xperia SThey have already said that 8K will be the final standard because it will reach the limits of human vision.
I strongly suggest reading the link I posted.
tftcentralIf you consider that you only have maximum visual acuity in the foveola, 8K seems quite a waste; you can only see a circle with a 69-pixel diameter with maximum accuracy at a time out of 7680x4320.

On computer displays there is definitely something to say for 4K. You can display a lot of information simultaneously and you usually only have to focus on a small area at a time, which means the higher detail really has added value. Furthermore, the short viewing distance allows a wide field of view without the need for extremely large displays.

With televisions it’s a different story. To really profit from 4K you’d need an extremely large screen, or sit extremely close. And 8K is just plain ridiculous. For a 250 cm viewing distance you’d need a 595 x 335 cm screen.
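The quoted 8K figure comes out of the same one-arcminute-per-pixel arithmetic; a quick sketch to reproduce it (the article's exact acuity constant evidently differs slightly, so this lands near, not exactly on, its 595 x 335 cm):

```python
import math

# Screen size at which each pixel of a 7680x4320 panel subtends about
# one arcminute at the given viewing distance.
def screen_size_cm(distance_cm: float, horiz_px: int, vert_px: int):
    pixel_cm = distance_cm * math.tan(math.radians(1 / 60))
    return horiz_px * pixel_cm, vert_px * pixel_cm

w, h = screen_size_cm(250, 7680, 4320)
print(f"~{w:.0f} x {h:.0f} cm")  # ~559 x 314 cm, the same ballpark as the quote
```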
Sony Xperia SMixing PC standards with TV broadcasting standards is never a good idea.
So, it's a bad idea for companies to sell 4K monitors, 1080p monitors, and so on? The standards have already been mixed. The only recent exception of particular note is 1440p.

What doesn't make sense is not having standardization. Having HDTV and desktop monitor resolutions be equivalent makes prices drop through larger production volume. (That is the one major good thing about 4K TV, although it does have significant drawbacks.) It also makes beneficial standards easier to implement. The difference between TV and monitor has diminished recently and will continue to diminish now that plasma is dead.

The main issue now is input lag, particularly now that 4K is becoming the standard for both monitors and TVs, which gives console and PC gaming a single resolution to target for top-level performance. TVs also need to adopt something like Freesync/G-Sync, fast pixels, strobing backlights, and so on for better gaming.

Going forward, desktop monitors will differentiate themselves by having wider color gamuts (something TV standards should have focused on rather than 4K), better uniformity, anti-glare treatment (in some cases, especially for office use), sophisticated calibration features like programmable hardware LUTs, and so on. Or, in the case of many of them, low price will be the differentiator.

LCD with LED backlighting, though, is tech that is common to HDTVs and monitors, making small details like those the differentiators. How well OLED will hold up to monitor usage remains to be seen, especially since we have so many minuscule pixels thanks to the 4K craze. If IR and burn-in end up being a significant problem for OLED on the desktop because of 4K, that will certainly be a shame, since LCD can't offer its black level, at least not with a comparable viewing angle range.
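On the programmable hardware LUT point: a LUT is just a per-channel lookup table the display applies to every incoming value, so calibration becomes a matter of rewriting the table. A toy software sketch of the idea (real monitor LUTs run at 10-14 bits of precision inside the display, not in 8-bit software like this):

```python
# Toy 1D LUT: remap 8-bit values to shift effective gamma from 2.2 to 2.4.
# A calibrated monitor's hardware LUT does the same per channel, at higher precision.
lut = [round(255 * (v / 255) ** (2.4 / 2.2)) for v in range(256)]

def apply_lut(pixel_rgb):
    return tuple(lut[c] for c in pixel_rgb)

print(apply_lut((128, 64, 200)))  # each channel remapped through the table
```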
Posted on Reply