
AMD Ryzen Threadripper 1950X Overclocked to 4.1 GHz With Liquid Cooling

You could still game comfortably and smoothly even with an overclocked Bulldozer-class CPU and a GTX 1080 Ti. Knowing how much higher Ryzen's IPC is, I don't think I have to say anything further...
Not sure who you're talking to without a quote... or where you're going with this...

You could, but why would I intentionally limit my 1080 Ti by pairing it with Bulldozer? I mean sure, what's the difference between 100 and 120 fps, but I'd like to not put a glass ceiling on my GPU... even Sandy Bridge would drive the 1080 Ti better than BD.

We've seen Ryzen fall a bit behind in several titles. Get fast memory and that difference mostly dissolves.
 
Zen's gaming performance isn't amazing and may bottleneck RX Vega at 1440p. I have a feeling that, since I don't NEED 12 cores anymore (I'll be a streamer/gamer first, YouTuber second), I'm going to end up getting a Coffee Lake 6-core, which I might sell for a possible 10 nm upgrade early next year; that could actually be a free or even profitable upgrade because of the tax breaks you get when buying a CPU and/or mobo in Holland. My GPU has to be RX Vega because I want a 32" WQHD main monitor, which leaves regular or FreeSync options, and I really want adaptive sync.

:kookoo::kookoo::kookoo:
 
Good luck locking your games at 144 fps. Gamers Nexus couldn't do it in any game; minimum fps was always way below 144, while the 7700K flies. Cheers.

Look how high that 7700K is "flying":

[attached image]
 
You could, but why would I intentionally limit my 1080 Ti by pairing it with Bulldozer?
Because... for some of us peasants, money does not grow on trees... :) Sometimes the GPU comes first when it's time to upgrade...
 
Look how high that 7700K is "flying":

[attached image]
It's practically out of this world, lol

But seriously, yeah, if you already have 2x 1080 Ti then maybe you need the CPU with slightly higher IPC so you don't put "a glass ceiling" on them. You may gain 10 fps in some games. But if you can't even afford those 1080s, what are you doing paying Intel double the money for 'almost' the same fps? Since when do people even care about the amount of fps the CPU gives them? :kookoo:

Never heard this argument before AMD came out with Ryzen; it used to be frowned upon to even compare CPUs by fps in games. And now it suddenly seems like the only thing people do all day. :laugh:

Don't you have any better arguments?

This whole "Intel is better for gaming" argument is probably the stupidest crap I have ever heard. Intel doesn't even produce graphics cards, remember? :laugh: But suddenly an Intel CPU is what you need for more fps in games... Seriously? I don't know who's pushing this, but somebody probably is. If Intel themselves isn't behind all these stupid gaming benchmarks, who else would actually benefit from it?
 
If you're not happy with TR and hate AMD so much, but love Intel and have plenty of cash... just keep calm and wait for

Vodka Lake / X
Sake Lake / X
Coke Lake / X
etc ...

Way too much talk about benchmarking, while forgetting how good a price/performance ratio AMD has with TR.
 
Look how high that 7700K is "flying":

[attached image]

That is some epic massaging of facts to suit a narrative.

In a Q&A regarding Titanfall 2‘s PC release, Respawn confirmed that the game would have a PC cap of 144 frames per second. Although it’s still a limit, 144 FPS should be enough frames to match virtually all PC game players’ monitor refresh rates.

144 is a hard cap. Nothing is going to get higher than that. Getting 143 with an OC is about as good as it gets (literally 99.3% of the cap).
 
Since when do people even test CPUs by fps in games? Who actually thinks this is needed? I remember when all that mattered was not completely bottlenecking your GPU with the CPU, not a 5-10 fps difference, without even accounting for the extreme price premium Intel charges for those 5-10 fps. It used to be all about comparing two GPUs that cost about the same, head to head, in fps. That's what matters. Now we are looking at completely differently priced CPUs and putting them against each other in fps??? I couldn't invent a more ridiculous test if I wanted to.

Like this video:
It's comparing a Ryzen 7 1700 to an i7-7820X, which is double the price in my country, head to head in games! (Of course with the exact same GPU despite the massive price difference!)
That's as smart as making a whole video benchmarking a Radeon RX 580 against a GTX 1080 in games... Don't they see how far apart those price ranges are? :kookoo:
 
You know you have a good product when the anti-AMD crowd is in meltdown mode.

Keep it up, guys. It's a good laugh.
 
I wonder what kind of TDP it had when overclocked this much, i.e. how many watts of heat it was putting out. I still think it's impressive; if only I had $1,000 to spend on a CPU, lol.
 
All these comments always make me laugh.

At the end of the day, you either have an unlimited budget or you don't.

If the budget is limited, most people will buy the best chip for the money,
and currently AMD offers the best value for money.
 
Damn, I needed to get the popcorn out for this thread, it got stupid fast
 
Damn, I needed to get the popcorn out for this thread, it got stupid fast
It will be even better tomorrow when all the benchmarks for ThreadRIPper come out and the same people start freaking out about how it's slightly worse at gaming than a $3000 Xeon or something.
:toast:
 
What do you expect? It's Ryzen with moar cores. It's an overclocked Sandy Bridge in games, a Broadwell-E (sometimes) in benchmarks.

You could say the same about Kaby Lake and Skylake as well, and it would be even more correct because they essentially are overclocked Sandy Bridge.

But it's not a bad thing. An overclocked Sandy Bridge is still more than enough for modern games. Ryzen has the extra cores to boot. It's also very efficient, priced well, and has long term platform support.
 
Which means little to nothing, considering Intel's IPC has been stagnant for over 5 years in the single-thread department.

That's false. IB→HW was 11%, HW→BW was 3%, and BW→SKL was 3%, and now it overclocks to 5 GHz plus with Kaby Lake.
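(Taking those quoted per-generation figures at face value, they compound to roughly 1.11 × 1.03 × 1.03 ≈ 1.18, i.e. about an 18% cumulative IPC gain from Ivy Bridge to Skylake.)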

AMD is still 5 years behind in IPC and a whole 1 GHz behind in clocks. Fact.

I don't get why reality is so hard for people to accept. Ryzen/CCX is not a gaming chip. It sucks ass at it, but it is extremely good at being cost-effective and scaling for massively threaded tasks with minimal clock hit on massive dies.

That is how it was designed and how it works. 100% fact. It just isn't a gaming chip... get over it. It is a stellar workstation/threaded workhorse. Intel's new HEDT sucks at gaming too, just not as badly... jeez.

Stop spamming the internet with fanboy BS that is devoid of reason.

4.1 GHz on the 16-core Threadripper is outstanding. The 7900X already hits a huge wall in terms of temperature and power consumption. If they don't do anything about that, the 16-core and 18-core parts are going to be horrible overclockers.

Anyone buying a $1,000-2,000 CPU will be delidding and using water or phase change, so temps aren't really relevant.
 
The turbo clock is not a baseline clock. It'll NEVER operate at 4 GHz on ALL cores. So when you overclock both to 4 GHz on ALL cores, that means both actually ran at 4 GHz on all cores at all times, something NEITHER does out of the box, turbo or not.

Don't mix this up with the special "All Core Turbo" settings in the BIOS that force the CPU to run turbo clocks on all cores. That's not what any Intel CPU does within factory specs.

Wow, you really don't understand how these chips operate?

At stock it runs at 4 GHz on all cores unless you start hitting the power limit or are running an AVX workload. The base clock is lower on these chips because they use a large AVX-512 multiplier offset.

The "all core turbo" setting in the bios would run all the cores @4.3GHz or 4.5GHz.

EDIT: Also note, many motherboards will boost to the all-core turbo out of the box by default. The Prime had my 7900X at 4 GHz on all cores from the first boot.

The stock all-core turbo speed for the 7900X is also 4 GHz, so it's working as intended whether or not @RejZoR believes it.

Congrats on the new CPU. I'm waiting for the higher-end boards from Asus to drop before I pick up a 7900X. Not sure it would be as fast as my 6900K @ 4.5 unless it can do 4.8+.
 
Ryzen/CCX is not a gaming chip.
It just isn't a gaming chip... get over it. It is a stellar workstation/threaded workhorse. Intel's new HEDT sucks at gaming too, just not as badly... jeez.
Please stop. What does "gaming chip" even mean? Plenty of people now play fine on Ryzen. You know what a gaming chip is? A freaking GPU! Stop this nonsense about CPUs being gaming or non-gaming...... It makes no difference. Have you ever run a game purely on a CPU? They are all non-gaming chips!!! :banghead:

Just stop.

Here's a step by step.
1. Buy Ryzen.
2. Take the bag of money you saved and buy a better GPU. Or a second one.
3. Now you have a gaming chip.
 
The turbo clock is not a baseline clock. It'll NEVER operate at 4 GHz on ALL cores. So when you overclock both to 4 GHz on ALL cores, that means both actually ran at 4 GHz on all cores at all times, something NEITHER does out of the box, turbo or not.

Don't mix this up with the special "All Core Turbo" settings in the BIOS that force the CPU to run turbo clocks on all cores. That's not what any Intel CPU does within factory specs.
Who cares what out-of-the-box stock is? No one buys an unlocked $1,000 chip and runs it at stock unless they're an idiot. Your point is moot.

You think the majority will be delidded? Why?
Also, why delid for phase change? Even on cascade the difference is negligible.

Who spends that much money and does not delid and overclock? A fool, maybe? It is stupid not to delid... you're throwing away performance.

That is not true. Silicon Lottery delids his 7700K for single-stage phase change because it makes a notable difference. He runs his at 5.5-5.7 GHz. There is a thread in his forum where I asked him about it.

Please stop. What does "gaming chip" even mean? Plenty of people now play fine on Ryzen. You know what a gaming chip is? A freaking GPU! Stop this nonsense about CPUs being gaming or non-gaming...... It makes no difference. Have you ever run a game purely on a CPU? They are all non-gaming chips!!! :banghead:

Just stop.

Here's a step by step.
1. Buy Ryzen.
2. Take the bag of money you saved and buy a better GPU. Or a second one.
3. Now you have a gaming chip.

Again, if you seriously game, Ryzen sucks at it. Look at the benchmarks; look at single thread. If you game in VR or at 1440p 120 Hz ULMB like I do, even a 4.8 GHz 6700K can't run many single-thread-bound games at 120 Hz consistently due to single-thread limits. Stop spamming patently false statements about its ability to handle VR and 120 Hz gaming. I wouldn't even use an Intel HEDT chip for it because of the IPC hit, and that's faster than Ryzen.
 
That's false. IB→HW was 11%, HW→BW was 3%, and BW→SKL was 3%, and now it overclocks to 5 GHz plus with Kaby Lake.

AMD is still 5 years behind in IPC and a whole 1 GHz behind in clocks. Fact.

I don't get why reality is so hard for people to accept. Ryzen/CCX is not a gaming chip. It sucks ass at it, but it is extremely good at being cost-effective and scaling for massively threaded tasks with minimal clock hit on massive dies.

That is how it was designed and how it works. 100% fact. It just isn't a gaming chip... get over it. It is a stellar workstation/threaded workhorse. Intel's new HEDT sucks at gaming too, just not as badly... jeez.

Stop spamming the internet with fanboy BS that is devoid of reason.

Anyone buying a $1,000-2,000 CPU will be delidding and using water or phase change, so temps aren't really relevant.
I guess Intel Kaby Lake sucks ass at gaming as well, since it's only 10% ahead.
 