# Are more cores more futureproof for gaming? 2017's Ryzen 1700 put to the test against modern-day 4c/8t i3/Ryzen 3 CPUs and the i7-7700K



## cucker tarlson (Jun 12, 2020)

OC'd, it's about as good as the lowest i3 SKU, the 10100, at stock. The 3300X/7700K beat it pretty comfortably, stock vs. stock or OC vs. OC.


----------



## PooPipeBoy (Jun 12, 2020)

Well that's a bit awkward when everyone on the internet is saying that quad cores are obsolete.


----------



## cucker tarlson (Jun 12, 2020)

PooPipeBoy said:


> Well that's a bit awkward when everyone on the internet is saying that quad cores are obsolete.


They are obsolete; just look at the 1200 or 9100F in this very test: you'll see 40/50 fps where a 4/8 does 80/100.
But they're talking 4c/4t; 4/8 is a totally different ball game.
1700 vs. 10100, you'd never tell a difference if you didn't know which is which.


----------



## theonek (Jun 12, 2020)

The 7700K has been dead a long time; it can't even handle streaming and gaming simultaneously... Multi-core CPUs are up to date for multi-load tasking... Old CPUs are good for office work only...


----------



## cucker tarlson (Jun 12, 2020)

theonek said:


> Old CPUs are good for office work only...


yeah.
and gaming apparently.


----------



## Jetster (Jun 12, 2020)

First off, there is no future proof; second, 4-core is not dead, not even close.


----------



## cucker tarlson (Jun 12, 2020)

Jetster said:


> First off, there is no future proof; second, 4-core is not dead, not even close.


4/4 for gaming? Kinda is.
4/8 is fine, but not for streaming, as the one above pointed out.


----------



## R0H1T (Jun 12, 2020)

Jetster said:


> First off there is no future proof, second 4 core is not dead, not even close


Depends on what you're doing; in tasks that scale well with cores and are generally time-sensitive, it's better to invest in "more" cores!
For gaming, a better GPU is definitely what one *should* invest in, say, over upgrading from a quad to a hexa-core.
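The "scales well with cores" caveat is basically Amdahl's law. A quick Python sketch; the parallel fractions here are illustrative assumptions, not measured values:

```python
# Amdahl's law: speedup on n cores when only a fraction p of the
# work can run in parallel: speedup = 1 / ((1 - p) + p / n).
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A well-threaded workload (p ~ 0.95, e.g. rendering/encoding)
# keeps gaining as cores are added...
for n in (4, 8, 16):
    print(f"p=0.95, {n} cores: {amdahl_speedup(0.95, n):.2f}x")

# ...while a mostly serial game loop (p ~ 0.5) flattens out fast,
# which is why a faster GPU is usually the better spend for gaming.
for n in (4, 8, 16):
    print(f"p=0.50, {n} cores: {amdahl_speedup(0.50, n):.2f}x")
```

At p = 0.5 the curve barely moves past 4 cores, which is the whole argument in one loop.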


----------



## dirtyferret (Jun 12, 2020)

theonek said:


> The 7700K has been dead a long time; it can't even handle streaming and gaming simultaneously... Multi-core CPUs are up to date for multi-load tasking... Old CPUs are good for office work only...











[Definition of hyperbole | Dictionary.com](www.dictionary.com): "obvious and intentional exaggeration."


----------



## TheoneandonlyMrK (Jun 12, 2020)

I love how some take from it that their 7700K is still fine, totally not acknowledging that a 10600K is also not enough of an upgrade over a 1700.

The consoles are nearly here; your 7700K's demise is too.


----------



## R0H1T (Jun 12, 2020)

Oh yeah, the consoles will deal a major blow to quad-core gaming. Still, for the next year or two you'd be fine with an OC'd 4c/8t CPU, probably dialing down a few settings.


----------



## cucker tarlson (Jun 12, 2020)

R0H1T said:


> Oh yeah, the consoles will deal a major blow to quad-core gaming. Still, for the next year or two you'd be fine with an OC'd 4c/8t CPU, probably dialing down a few settings.


Consoles have an 8/16 Ryzen 3000 at 3.5-3.9 GHz.
Look at the OP, find the 3700X, and take that frequency decrease off its score.


----------



## TheoneandonlyMrK (Jun 12, 2020)

cucker tarlson said:


> Consoles have an 8/16 Ryzen 3000 at 3.5-3.9 GHz.
> Look at the OP, find the 3700X, and take that frequency decrease off its score.


Yeah, because that's all the console uses to process, eh?
You think a 7700K at high clocks is capable of imitating 8/16 cores, despite the fact that asset decompression on the PS5 is equivalent to something close to a further 12 cores?
Yep, games will perhaps still run, but nowhere near the fidelity or frame rates.

We've moved on from Intel's defined future; it's not 2012 anymore, and the next generation really is on our doorstep.

You don't think the 7700K is going to limit Big Navi or the 3080, with their PCIe 4.0 bus actually being used effectively?

Dreamer.


----------



## Bill_Bright (Jun 12, 2020)

theonek said:


> Old CPUs are good for office work only...


Bullfeathers! Well, unless you are talking old 486s and Pentiums!

Futureproofing?   

More cores does not suggest a processor is future proofed. 

Does any game "require" more than 4 cores to play? 

GTA 5? Nope.

Resident Evil 7? Nope.

CoD: BO4? Nope.

Assassin's Creed Odyssey? Nope. 

The list goes on. 

In most cases, even the "recommended" processor is only 4 cores. 

Sure! For some games, more cores do provide better performance. But game developers know that many, if not most, gamers don't have the deep pockets to spend on monster CPUs (or GPUs and massive amounts of RAM) for what amounts to a vice, a form of entertainment. So those games are coded to provide great game play with lesser hardware too, and always will.

Futureproofing involves ensuring compatibility and support for future "protocols". Not core count beyond quads. 

One day, maybe, games, operating systems, and other programs may require more than 4 cores just to run. But when that day arrives, we will already be required to replace our motherboards and RAM too. So it is not like we could buy a CPU today with X number of cores and expect it to work on those future motherboards and DDR8 (QDR16?) RAM.


----------



## xtreemchaos (Jun 12, 2020)

I think the 7700K is still doing well if you're just gaming; for everything else, 8c/16t seems the way to go, but I'm biased. I've a 9900K on the way.


----------



## newtekie1 (Jun 12, 2020)

cucker tarlson said:


> They are obsolete; just look at the 1200 or 9100F in this very test: you'll see 40/50 fps where a 4/8 does 80/100.
> But they're talking 4c/4t; 4/8 is a totally different ball game.
> 1700 vs. 10100, you'd never tell a difference if you didn't know which is which.



A quad-core with SMT/HT is still a quad-core there fella.


----------



## TheoneandonlyMrK (Jun 12, 2020)

Bill_Bright said:


> Bullfeathers! Well, unless you are talking old 486 Pentiums!
> 
> Futureproofing?
> 
> ...


Disagree heartily. GTA V shows gains from cores, and it's seven years old and largely irrelevant with regard to future anything.

Assassin's Creed is unplayable at 4K on 4 cores.

BO4, like Apex, is a competitive game MADE especially to be playable on a wide range of OLD and new tech.

Gears 5 and similar or newer titles are what matter in the FUTURE for future-proofing, and a quad may run them, but it gives dismal performance versus an octo.


----------



## ppn (Jun 12, 2020)

In Odyssey and Division 2, to see 2.5 GHz 8 cores or 3.5 GHz 8 threads loaded to 60%, and the GPU bottlenecked at 60%, is just preposterously stupid. So I don't need 8 cores or 8 threads, just an insanely fast, high-IPC 6-core Ocean Cove.


----------



## cucker tarlson (Jun 12, 2020)

newtekie1 said:


> A quad-core with SMT/HT is still a quad-core there fella.


A quad core with no HT is not the same as a quad core with HT. Hello, heard of threads?








ppn said:


> In Odyssey and Division 2, to see 2.5 GHz 8 cores or 3.5 GHz 8 threads loaded to 60%, and the GPU bottlenecked at 60%, is just preposterously stupid. So I don't need 8 cores or 8 threads, just an insanely fast, high-IPC 6-core Ocean Cove.


It is what it is; core count was never a substitute for a better architecture, higher frequency, or lower latency.
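That point (per-core speed beats core count when a game is bound by its heaviest thread) can be put in a toy model; all the IPC and clock numbers below are made up for illustration, not benchmarks:

```python
# Toy model: a game bound by its heaviest thread cares about
# per-core throughput (roughly IPC x clock), not total core count.
def per_core_throughput(ipc: float, ghz: float) -> float:
    return ipc * ghz

# Illustrative numbers only, not measured IPC values:
zen1_eight_core = per_core_throughput(ipc=1.0, ghz=3.9)    # e.g. an OC'd 1700
newer_quad_core = per_core_throughput(ipc=1.25, ghz=4.3)   # e.g. a modern 4c/8t

# The quad wins on the critical path despite having half the cores.
print(newer_quad_core / zen1_eight_core)
```

The extra cores only pay off once the game can actually spread that critical path across them.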


----------



## Bill_Bright (Jun 12, 2020)

theoneandonlymrk said:


> Disagree heartily



Sadly, you clearly did not read what I said and/or did not bother to take just a moment to understand what I said.  Did you see what I said about protocols vs cores? Do you not understand that?

I didn't pull those examples out of my a$$. 

Those are among the most CPU-resource-demanding games.

Obviously, you feel a game is unplayable and unsupported if the hardware does not provide the maximum, #1, world champion, record-holding performance.  And that "obsession" means you will never truly be happy with your hardware because something better and faster is always just around the corner.  

That is NOT a personal criticism, BTW. Just an observation. I used to be that way years ago with audiophile-quality electronics and speakers. Fortunately I came to my senses when I realized the set of speakers I was eyeballing to replace my current (perfectly good) 2 year old speakers cost more than it would to put my kid through 4 years of college. 

Contrary to what you believe, the vast majority of gamers don't think like you! They play games for the immersive enjoyment they get out of that "pastime". And that does NOT make them any less enthusiasts either! And as I also said above, game developers understand that too. They could not stay in business if they coded their games to provide "good game play" (read: immersive entertainment) only when played on the latest and greatest, top-of-the-line, state-of-the-art, umpteen-core processors.


----------



## cucker tarlson (Jun 12, 2020)

Bill_Bright said:


> Sadly, you clearly did not read what I said and/or did not bother to take just a moment to understand what I said.  Did you see what I said about protocols vs cores? Do you not understand that?
> 
> I didn't pull those examples out of my a$$.
> 
> ...


You're wasting time.
The numbers are there, and numbers don't lie. It is pointless to debate raw numbers, let alone "disagree" about them. What is there to disagree with? That 166 is equal to 166?
If he took the time to read any decent review, he'd see that new i3s and R3s (as well as old Skylake/Kaby Lake i7s) are kicking first/second-gen 8-core Ryzen's ass.


----------



## dirtyferret (Jun 12, 2020)

cucker tarlson said:


> Consoles have an 8/16 Ryzen 3000 at 3.5-3.9 GHz.



It's based on the Ryzen 3000 with a variable frequency topping out at 3.5 GHz (one core, or all cores, or in between, and for how long?), so it will be gimped, and the past two consoles also had multi-core CPUs. I find all modern CPUs to be more than capable of handling games today and for several years to come.


----------



## cucker tarlson (Jun 12, 2020)

dirtyferret said:


> It's based on the Ryzen 3000 with a variable frequency topping out at 3.5 GHz (one core, or all cores, or in between, and for how long?), so it will be gimped, and the past two consoles also had multi-core CPUs. I find all modern CPUs to be more than capable of handling games today and for several years to come.


It is still a remarkable upgrade over that trash Jaguar.


----------



## dirtyferret (Jun 12, 2020)

cucker tarlson said:


> It is still a remarkable upgrade over that trash Jaguar.


True, but like you said, the bar was not high.


----------



## cucker tarlson (Jun 12, 2020)

I suggest those who "disagree" with the numbers in the chart go by the long blue and red bars.


----------



## dirtyferret (Jun 12, 2020)

I find it comical how some people ran out and got the Ryzen 1700 to "future proof", only to see the Ryzen 2600 perform better in gaming. So then they got the Ryzen 2700X, only to see the Ryzen 3600 perform better in gaming. So now they have a 3700X/3800X, etc., to "future proof", but what do you think will happen when the Ryzen 4600 gets released? Yet you have an army of Intel i5 owners from the past decade laughing like hyenas every time they launch a game to play. Like Bill stated, there is a difference between playing a game and chasing FPS. One is qualitative while the other is quantitative.


----------



## cucker tarlson (Jun 12, 2020)

dirtyferret said:


> I find it comical how some people ran out and got the Ryzen 1700 to "future proof" only to see the Ryzen 2600 perform better in gaming.  So then they get the Ryzen 2700x only to see the Ryzen 3600 perform better in gaming.  So now they have a 3700x/3800x, etc., to "future proof" but what do you think will happen when the Ryzen 4600 gets released?  yet you have an army of Intel i5 owners from the past decade laughing like hyenas every time they launch a game to play.  Like Bill stated there is a difference between playing a game and chasing FPS.  One is qualitative while the other is quantitative.


Well, they only have themselves to blame for being clueless.
I mean, when Watch Dogs 2 came out in 2016, a game that absolutely *hammers* CPU cores and scales well past 6, and first-gen Ryzens could match a 4790K (pre patch-nerfing), well, that was it really.








[Test procesora AMD Ryzen 7 1700 - Cenowy rywal Core i7-7700K | PurePC.pl](www.purepc.pl): Ryzen 7 1700 vs. Core i7-7700K review (page 39), AMD's cheapest eight-core model. How does it overclock? How does it fare in games against its price-range rivals?




And if they didn't smell that already, when SOTR, an even bigger CPU hog, showed the 2700X still couldn't beat a 9600K, that means they'd probably miss every possible clue.








[Intel Core i5-10600K vs AMD Ryzen 5 3600 - Test procesorów | PurePC.pl](www.purepc.pl): Core i5-10600K vs. Ryzen 5 3600 review (page 47), a duel of six cores and twelve threads. Which processor comes out ahead in games and applications?




Now they're saying the 10600K is not enough 'cause consoles have 8/16.
I wonder how that will turn out, hmm.
I mean, you have the 10600K beating the 3900X already in games that scale even on a damn 9960X, but I guess we have to "see next year".

People are already talking about futureproofing gaming by getting a 3900X. Absolutely clueless, 'cause that goes against the very principle of how Ryzen 3000 works in games. An equal number of cores will do the work on a 3700X and a 3900X; one will just have more cores that are lightly loaded. That's why you only see fractions between them while Intel keeps on scaling. It's just that the 3900X is a better bin; that small difference is frequency.









[Intel Core i5-10600K vs AMD Ryzen 5 3600 - Test procesorów | PurePC.pl](www.purepc.pl): the same review, page 50, gaming results.




9900K to 10900K has better scaling than 3700X to 3900X.
If a 3900X could beat that 9900K, why doesn't it do it already in the games where a 9960X can?


----------



## phanbuey (Jun 12, 2020)

6c/12t will be around forever, especially if you use the GPU for streaming (NVENC), which is getting insanely good these days. The 8700K is going to be the new 2500K: people will finally be coming off of them in 1-2 years, and they will still be performing in the top 10% of benchmarks when OC'd.


----------



## cucker tarlson (Jun 12, 2020)

phanbuey said:


> 6c/12t will be around forever, especially if you use the GPU for streaming (NVENC), which is getting insanely good these days. The 8700K is going to be the new 2500K: people will finally be coming off of them in 1-2 years.


Well, while this isn't necessarily true, it doesn't mean a CPU with a lower clock speed but higher core count will survive what 6/12 can't.
But yeah, it'll be a long time before 6/12 is the new 4/4. Probably, well, longer than I can imagine right now.


----------



## newtekie1 (Jun 12, 2020)

cucker tarlson said:


> A quad core with no HT is not the same as a quad core with HT. Hello, heard of threads?



I'm not saying they perform the same. I'm saying HT/SMT doesn't make it _not_ a quad-core. It's still a quad-core, so your statement that quad-cores are obsolete is wrong, and the video in the OP proves it.

A quad-core with SMT/HT is still a quad-core; it's right in the name. So quad-cores are in fact not obsolete; processors with 4 threads are. But you didn't say that, now did you?


----------



## cucker tarlson (Jun 12, 2020)

newtekie1 said:


> I'm not saying they perform the same. I'm saying HT/SMT doesn't make it _not_ a quad-core. It's still a quad-core, so your statement that quad-cores are obsolete is wrong, and the video in the OP proves it.
> 
> A quad-core with SMT/HT is still a quad-core; it's right in the name. So quad-cores are in fact not obsolete; processors with 4 threads are. But you didn't say that, now did you?


Yeah, but let's not waste time on semantics.


----------



## TheoneandonlyMrK (Jun 12, 2020)

Bill_Bright said:


> Sadly, you clearly did not read what I said and/or did not bother to take just a moment to understand what I said.  Did you see what I said about protocols vs cores? Do you not understand that?
> 
> I didn't pull those examples out of my a$$.
> 
> ...


You know me well... not. I'm gaming on a 6-core laptop with a 2060, but I don't think it'll be all that viable to game on in two years.
Millions and millions of actual gamers already have 8 cores and are getting 8/16.


----------



## cucker tarlson (Jun 12, 2020)

theoneandonlymrk said:


> Millions and millions of actual gamers already have 8 cores and are getting 8/16.


Doesn't really translate into benchmarks, does it?


----------



## Vayra86 (Jun 12, 2020)

Jetster said:


> First off there is no future proof, second 4 core is not dead, not even close



4c/4t is pretty dead if you look at the results, and that is just FPS; let's not even get started on stutter.

4c/8t will do fine _for now_. But I wouldn't recommend building a 4c/8t gaming rig today. Also remember you are looking at pretty static benchmarks here, and not your typical user's PC; any background task will mess with your game performance on 4c/8t.



theoneandonlymrk said:


> You know me well... not. I'm gaming on a 6-core laptop with a 2060, but I don't think it'll be all that viable to game on in two years.
> Millions and millions of actual gamers already have 8 cores and are getting 8/16.



The next console gen won't be faster than what's in a typical high-end PC today. 8/16 is most likely overkill until the PS6/Xbox Super Duper.
6/12 will likely be going the way of today's 4/8 by that time. In my view, CPU demands for gaming progress very slowly, but when they do, you can't stay behind. There are no settings to really tweak to get more out of it.


----------



## cucker tarlson (Jun 12, 2020)

Vayra86 said:


> 4c/8t will do fine _for now_. But I wouldn't recommend building a 4c/8t gaming rig today.


Get a good platform, go all out on the GPU, and get a 3300X for now; it'll do a brilliant job for what it costs. You'll not lock 144 fps in shooters or 100 fps in third-person open-world games. But be happy that 130 dollars bought you 80% of that, 'cause that +20% you're missing in CPU power is gonna cost you a +50% GPU upgrade.


----------



## Vayra86 (Jun 12, 2020)

cucker tarlson said:


> Get a good platform, go all out on the GPU, and get a 3300X for now; it'll do a brilliant job for what it costs. You'll not lock 144 fps in shooters or 100 fps in third-person open-world games. But be happy that 130 dollars bought you 80% of that, 'cause that +20% you're missing in CPU power is gonna cost you a +50% GPU upgrade.



I'd definitely push the budget up to a 6/12 if you're already spending 130... It's not like that will break the bank. I'd be more content with missing out on the last 5% of CPU perf for gaming. 20% is a shit ton; it's also the performance you miss on any GPU upgrade, and that % will go up as GPUs get faster.


----------



## cucker tarlson (Jun 12, 2020)

Vayra86 said:


> I'd definitely push the budget up to a 6/12 if you're already spending 130... Its not like that will break the bank.


Yeah, true.
Even for the simple "why not" reason.
The 3600 and 10400F deliver the best bang-for-the-buck value for gaming atm, hands down.
Still, if you thought that buying a 3600 would net you a big advantage over a 3300X, now you'll be surprised, 'cause you probably won't tell the difference.


----------



## Jetster (Jun 12, 2020)

We have this conversation every time there is a significant jump in technology. Might as well just toss those Athlon X2s in the trash, the C2D E7400 is out. It doesn't mean the old is dead. It means that its limitations are starting to show. Shit, I know people gaming on Sandy Bridge, and I remember what it is like to be poor.


----------



## Vayra86 (Jun 12, 2020)

Jetster said:


> We have this conversation every time there is a significant jump in technology. Might as well just toss those Athlon X2s in the trash, the C2D E7400 is out. It doesn't mean the old is dead. It means that its limitations are starting to show. Shit, I know people gaming on Sandy Bridge, and I remember what it is like to be poor.



Oh, but I don't contest that at all; definitely ride out what you have if that is still working for you. But that is a completely different situation from a new build, isn't it? A new build is about a wise combination of parts going forward.



cucker tarlson said:


> Still, if you thought that buying a 3600 would net you a big advantage over a 3300X, now you'll be surprised, 'cause you probably won't tell the difference.



Yeah, in the happy flow you won't tell the difference. And then you get that typical hard scene or you want to run something alongside it. Oops.

Or, better yet, new game X gets released and you're hyped but without a hexa it runs like shit. All because you had to cheap out by what, 30-40 bucks?


----------



## cucker tarlson (Jun 12, 2020)

I sat on a 3570K until it got completely wrecked by my high-refresh monitor and 980 Ti OC.
I'll probably sit on my 4/8 until this happens again, then move to whatever the sweet spot is at that moment, not more, not less.


----------



## Jetster (Jun 12, 2020)

Vayra86 said:


> Oh but I don't contest that at all - definitely ride out what you have if that is still working for you. But that is completely different situation as a new build, isn't it? A new build is about a wise combination of parts going forward.



Absolutely, I would not invest in a 4 core for gaming at this point


----------



## TheoneandonlyMrK (Jun 12, 2020)

Vayra86 said:


> 4c/4t is pretty dead if you look at the results, and that is just FPS; let's not even get started on stutter.
> 
> 4c/8t will do fine _for now_. But I wouldn't recommend building a 4c/8t gaming rig today. Also remember you are looking at pretty static benchmarks here, and not your typical user's PC; any background task will mess with your game performance on 4c/8t.
> 
> ...


This ideology is wrong, and I'll demonstrate why.

The CPU is not the only part of a system today, and you can check all the tech companies for proof.

It's data, and its movement through a system, that is starting to dictate a total system's performance and power use.

Both consoles are way ahead of any PC out now on those terms: the subsystems, compression, decompression, and transfer.

All of that cannot be done on conventional PCs at this time.

It's like comparing an HDD to an SSD, but this time it's the subsystem and memory hierarchy that will net performance. A four-core with HT will stutter trying to copy it; it just doesn't have the internal bandwidth.


----------



## Vayra86 (Jun 12, 2020)

cucker tarlson said:


> I sat on a 3570K until it got completely wrecked by my high-refresh monitor and 980 Ti OC.
> I'll probably sit on my 4/8 until this happens again, then move to whatever the sweet spot is at that moment, not more, not less.



You might retire on that 5775C if you keep this up 

And yeah the wrecking of 3570k's was real, holy shit. That TW bench I ran back then... still amazes me.


----------



## cucker tarlson (Jun 12, 2020)

Vayra86 said:


> You might retire on that 5775C if you keep this up


I get offers every week. I could swap it for a new 10400F and have beer money left.
It's not like I don't see the point of selling it.
It's more that nothing new appeals to me in the $300 range.


----------



## Vayra86 (Jun 12, 2020)

cucker tarlson said:


> I get offers every week.
> It's not like I don't see the point of selling it.
> It's more about the fact that nothing new appeals to me in the $300 range



Not a huge fan of the current CPU gens either, beyond the performance. They look hot and spiky, and I already feel some of that with this 8700K. Meh. I mean, this CPU is plenty fast... but the trust factor is low, and it's not my OC doing that...


----------



## cucker tarlson (Jun 12, 2020)

I was never fond of buying high-end CPUs for a gaming rig. I'd rather wait than just say "what the hell" and break the bank for a 10700K and Z490, 'cause once I do that, that'll be the new normal.


----------



## Bill_Bright (Jun 13, 2020)

Vayra86 said:


> Oh but I don't contest that at all - definitely ride out what you have if that is still working for you. But that is completely different situation as a new build, isn't it? A new build is about a wise combination of parts going forward.





Jetster said:


> Absolutely, I would not invest in a 4 core for gaming at this point


Budget permitting. But the choice of CPU is also (or should be) dependent on the tasks the system will primarily be used for. Sadly, there are many, and frankly I see this most often with serious gamers, who feel they represent everyone: what they want in their next computer is what everyone should want in their next computer too. That's not how it works.

As noted previously, some games are GPU intensive and, if the budget is not unlimited, benefit more from investing in GPU horsepower than CPU horsepower. And notwithstanding that this topic is about a gaming rig, some users are looking at a futureproof CAD/CAE system, or a system used for other tasks besides playing games.

My point is, when building a totally new system, going for as many cores as you can afford may make sense for some tasks, but it may not be the best option for other users, regardless of the budget.


----------



## cucker tarlson (Jun 13, 2020)

There's some truth to that.
A 4/4 (R3 1200 AF) costs 250 PLN here.
The cheapest 4/8 costs 500 PLN.

What can 250 PLN buy you? A 1660 Super instead of a 1660, 13% faster.

On the other hand, when those 4/4 threads start choking, you'll be losing a lot more than just 13% compared to 4/8.

I'd recommend buying 4/8 unless you are an experienced PC user, you know the trade-off, and you know your specific use scenario won't see 4c/4t being a major bottleneck.
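For the curious, the trade-off above as quick arithmetic (prices from the post above; the 30% fps-loss figure is purely hypothetical):

```python
# Prices quoted in the post (PLN); the fps-loss figure is hypothetical.
cpu_4c4t = 250          # cheapest 4/4 (R3 1200 AF)
cpu_4c8t = 500          # cheapest 4/8
gpu_uplift = 0.13       # 1660 -> 1660 Super, ~13% faster

premium = cpu_4c8t - cpu_4c4t   # what moving to 4/8 costs
print(f"4/8 premium: {premium} PLN, the same money as a {gpu_uplift:.0%} GPU uplift")

# If a choking 4c/4t later drops, say, 30% of frames, that loss dwarfs
# the 13% GPU gain the same 250 PLN would have bought.
hypothetical_cpu_loss = 0.30
print(hypothetical_cpu_loss > gpu_uplift)
```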


----------



## Vario (Jun 13, 2020)

Going to run my 6c/6t 8600K for maybe 3-4 more years and then when DDR5 is mainstream and better offerings abound I will sell off the board, cpu, and ram, and hopefully retain a good amount of value in my pristine parts secondhand.  That is what I did with my 3770K, Z77X-UD3H, and 16GB YK0.


----------



## GorbazTheDragon (Jun 13, 2020)

Yeah, idk, for a lot of games Zen 1 is pretty awful... You can't really futureproof with more cores when the architecture as a whole is just pretty flaky... Otherwise Haswell-E would be a "good" buy for future-proofing.

I honestly don't buy the futureproofing thing as a whole; it just seems really stupid to get fixated on performance that is not even guaranteed to be on the table... You're just speculating that there won't be something better value down the road and that games in the future won't be biased towards newer architectures... Neither of these things has been predictable to any degree on the timeframes talked about for futureproofing.

If you really want to get the best for your money, you have to be willing to regularly swap out hardware and go to the used market. Just buy what is best for your current use case, look maybe a year into the future, and swap out when your requirements change.


----------



## Bill_Bright (Jun 13, 2020)

GorbazTheDragon said:


> I honestly don't buy the futureproofing thing as a whole, just seems really stupid to get fixated on performance that is not even guaranteed to be on the table... You're just speculating...


I agree with this with a couple notable exceptions. And those would be a quality case and power supply. Both can carry one through years of evolving upgrades - to include totally new motherboard/RAM/CPU/graphics card combos. For the case, if a new USB standard comes out, you typically can add a card in back. For the PSU, if you give yourself extra headroom when sizing the PSU, power is not a problem and it is unlikely some new connector will come out that is not supported by an adapter.


----------



## TheoneandonlyMrK (Jun 13, 2020)

GorbazTheDragon said:


> Yeah idk for a lot of games zen1 is pretty awful... You can't really futureproof with more cores when the architecture as a whole is just pretty flakey... Otherwise Haswell E would be a "good" buy for future proofing..
> 
> I honestly don't buy the futureproofing thing as a whole, just seems really stupid to get fixated on performance that is not even guaranteed to be on the table... You're just speculating that there won't be something better value down the road and that games in the future won't be biased towards newer architectures... Neither of these things have been predictable to any degree on a timeframe that is talked about for futureproofing.
> 
> If you really want to get the best for your money, you have to be willing to regularly swap out hardware and go to the used market. Just buy what is best for your current use case, look maybe a year into the future, and swap out when your requirements change.


Consider this: even your version of non-future-proofing is still futureproofing.
It's just that your version is a year long, at which point you're willing to swap a lot of stuff, due in part to your experience and perspective.
But you still put a time it has to work for.

That's far removed from a guy like me, who doesn't mind upgrades per se, but not that often; for me, the back-end platform needs to stay fairly viable for 3-5 years, with upgrades to storage and graphics at shorter intervals.

But again, my perspective.

Nothing is future-proof per se; it all gets superseded in reality.

But it's about a sensible amount of use for whatever amount you are willing to pay for your tech, and the amount of effort you're willing to put in to keep up, and/or pay.


----------



## xman2007 (Jun 13, 2020)

Are we speaking just about gaming? 'Cause the Ryzen 1700 was behind Intel when it released, so it makes sense it still is now. Can a modern i3 game, stream, encode, and perform like a 1700 in general multithreaded productivity? Answer: no. So why would it suddenly become better at gaming? Everyone's acting like a 2020 gaming PC with a first-gen Ryzen gets 40-50 fps in AAA titles with a high-end GPU, when in truth it is perfectly adequate even with a high-end GPU. If you want the ABSOLUTE max FPS, whether it makes a difference or not, then yeah, go Intel, but a Ryzen 1700 with a 2080 Ti is still a high-end gaming PC. Why do Intel people have to keep pushing the same max-FPS BS?

Let's just go back to paying $400 for high-end quad cores; Intel is amazeballs and AMD sux a55. All happy?


----------



## phill (Jun 13, 2020)

I don't think I have ever built a PC and thought, yes, that'll last me a good couple of years. There's always something newer, better, faster around the corner... More often than not, I'd end up buying something different after 3 to 6 months anyway... It's what you do with hobbies, I find: bleed money whenever you would like to buy something for them!

But I think from the results in some of the graphs in the thread is that they are all tested at 1080, where you're going to see the biggest difference between things.  I mean, when Ryzen 1 series released it was a massive bounce back from AMD.  Utterly huge and my god has even the 3rd series now come on and done even more for the CPU market and AMD in general.  
Still the one thing though that I don't see the worry with, is that if 60 fps is the nice happy medium for most people gaming, so what if it then hits 140 to 220 fps??  It's going to only matter on a 120Hz, 144Hz or faster refresh rate panel.  For my personal choice, I'd head to higher resolutions than a higher refresh rate as in the years of gaming I've done, I've never had a problem with just 60Hz.   I understand that there are benefits for 144Hz and the like but I would love to know how many people actually go down that route..  I would actually also love to physically see the difference in front of me.  Seeing things on Youtube I think, only show so much of the story at times...  

Still, back on track a little: if the 1700 (non-X I guess, from the title??) is not showing much promise at 1080p, what about 1440p or 4K? We all know and understand very well, and it's even been pointed out on the first page of this thread, that clock speed, latency and better architecture make a world of difference in gaming. So I'm not sure why you'd put a brand new CPU against a 3-year-old one simply to find out that it lags behind newer CPUs with faster clocks. Is it really headline news that a Ryzen 1700 is behind the latest 10-series Intel CPUs?? Surely that goes without saying, right??

I personally run two 1700Xs with RX 480s in my girls' gaming PCs, and for what they play it's massively overpowered. But then I also have a 2700, 2700X and a 3900X as well. Even with my undervolted 3900X I've no gaming issues that I know of, and I currently game on a triple 1080p panel setup. I wish I had a bigger resolution, but that's for another day I'm sure 
Testing these things is great, but if someone has one of these CPUs and it does everything they ask of it without fault, why should reviews like this make them feel they have a bad setup just because it's not the latest and greatest?

Sorry guys just wished to add in my 2p worth


----------



## GorbazTheDragon (Jun 13, 2020)

Bill_Bright said:


> And those would be a quality case and power supply.


I agree, and if you are into overclocking, RAM is also something good to invest in... A b-die kit from 2017 will have retained basically all its value and is still perfectly good; worst case you might add 2 more sticks. It's another one of those components you can easily keep when switching platforms on a 2-4 year upgrade cycle.



theoneandonlymrk said:


> Consider this , even your version of non future proofing , still is futureproofing.
> Just your version is a year long at which point your willing to swap a lot of stuff due in part to your experience and perspective.
> But you still put a time it has to work for.


In my view it's a case of putting in effort and getting more for your money because of it.



phill said:


> But I think from the results in some of the graphs in the thread is that they are all tested at 1080, where you're going to see the biggest difference between things.


The gaps can get way bigger than those, it varies immensely between games.

In general I find it really irritating when people say "x processor is better for gaming" and buy based on that without actually considering the specific use case... I mean, let's all go out and buy i3 10100s because they're "better for gaming", and then use them with a GTX 1050 Ti or something and a 60Hz panel?

A 1700 performs adequately at 60Hz in the *vast* majority of titles; I don't think that's even a question. For that use case it is a good processor.


----------



## TheoneandonlyMrK (Jun 14, 2020)

GorbazTheDragon said:


> I agree, if you are into overclocking, RAM is also something good to invest in... A b-die kit from 2017 will have retained basically all its value and is still perfectly good, worst case you might add 2 more sticks. But it's another one of these components where you can easily keep it when switching platforms on a 2-4 year upgrade cycle.
> 
> 
> In my view it's a case of putting in effort and getting more for your money because of it.
> ...


I get it, but most other people's outlook on PCs is not the same, and certainly far removed from enthusiasts'.
I would buy stuff monthly if money were no object, so I am far from dead set against your ideology on upgrades, or phill's; I just like to share out what cash I have differently now.
I actually spend more than necessary, TBH, just because: I have three 1TB NVMe drives in RAID 0, despite it making no sense, just to see.


----------



## freeagent (Jun 14, 2020)

You don't future proof with mainstream stuff.. you future proof with HEDT 

Or at least you used to


----------



## tussinman (Jun 14, 2020)

GorbazTheDragon said:


> I honestly don't buy the futureproofing thing as a whole, just seems really stupid to get fixated on performance that is not even guaranteed to be on the table... You're just speculating that there won't be something better value down the road and that games in the future won't be biased towards newer architectures... Neither of these things have been predictable to any degree on a timeframe that is talked about for futureproofing


Agreed. I've seen so many people in the last 15 years try to game the system (buying a weaker dual core over a strong single core because "teh future", buying a slower quad core over a strong dual core because "teh future", etc.), and by the time their extra cores are eventually used they're so far behind on IPC, architecture and clock speed that they didn't really game the system; they're still in the exact same situation of being far behind...


----------



## Vayra86 (Jun 14, 2020)

freeagent said:


> You don't future proof with mainstream stuff.. you future proof with HEDT
> 
> Or at least you used to



Yeah, where is @Tomgang in this story lol

And there are a few others who are still riding old hedt


----------



## Caring1 (Jun 14, 2020)

tussinman said:


> Agreed. I've seen so many people in the last 15 years try to game the system (buying a weaker dual core over a strong single core because "teh future", buying a slower quad core over a strong dual core because "teh future", etc.), and by the time their extra cores are eventually used they're so far behind on IPC, architecture and clock speed that they didn't really game the system; they're still in the exact same situation of being far behind...


Totally agree, and I see this with my old X79 system with 10C/20T.
A modern 6C beats it due to clock speed and IPC.


----------



## Decryptor009 (Jun 14, 2020)

freeagent said:


> You don't future proof with mainstream stuff.. you future proof with HEDT
> 
> Or at least you used to



2700K was still amazing up until 2017.


----------



## John Naylor (Jun 14, 2020)

When I see the words "future proofing" and a Gamers Nexus video in a post, I can't help comparing it to seeing an image of a boy riding a dinosaur in the Creationist Museum.

That being said... try some things yourself:

a)  Play a decent cross-section of games with all cores enabled, then start turning off cores one at a time; stop when the impact on fps reaches about 5%. Surprise! You're using fewer than 4 cores.
b)  Find a card that comes in two versions (3GB/6GB, 4GB/8GB), buy the bigger one, borrow the smaller one. Yes, you can load any number of applications that "suggest" it's using this much VRAM... it's not, but see for yourself. These people have:
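For (a), on Linux you don't even need to toggle cores in the BIOS: you can pin the game's process to a subset of logical CPUs, for example with `taskset -c 0-3 ./game`, or from Python via `os.sched_setaffinity`. A minimal sketch of the idea (the helper name and the 4-core figure are just illustrative; the API is Linux-only):

```python
import os

def limit_cores(n: int) -> int:
    """Pin the current process (and any children it spawns) to the first n
    logical CPUs. os.sched_setaffinity is Linux-only; on Windows you'd use
    `start /affinity` or Task Manager's affinity dialog instead."""
    n = min(n, os.cpu_count())              # don't ask for more CPUs than exist
    os.sched_setaffinity(0, set(range(n)))  # 0 = this process
    return len(os.sched_getaffinity(0))     # CPUs we're now allowed to run on

# e.g. limit_cores(4), then launch the game from this process and compare fps
```

Shrink the allowed set run by run and watch where average fps and frame pacing actually start to drop.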









GTX 770 4GB vs 2GB Showdown - alienbabeltech.com
Do you need 4GB of RAM? We tested EVGA's GTX 770 4GB versus Nvidia's GTX 770 2GB version, at 1920x1080, 2560x1600 and 5760x1080.

Gigabyte GeForce GTX 960 G1 Gaming 4GB review - www.guru3d.com
In this review we check out the 4GB version of the Gigabyte G1 Gaming GeForce GTX 960, with a 2GB vs 4GB VRAM analysis in Alien: Isolation.

Video Card Performance: 2GB vs 4GB Memory - www.pugetsystems.com
The GeForce GTX 680 comes in both 2GB and 4GB variants; does doubling the memory on a video card actually help with game performance?

Is 4GB of VRAM enough? AMD's Fury X faces off with Nvidia's GTX 980 Ti, Titan X - www.extremetech.com
Is 4GB enough for a high-end GPU? We investigated and tested 15 titles to find out.

MSI GTX 1060 Gaming X 3 GB Review - www.techpowerup.com
MSI's GTX 1060 Gaming X 3 GB comes with half the memory, but still brings the big guns in the form of the large dual-fan TwinFrozr cooler. Our review tests whether 3 GB is a viable alternative to 6 GB if you are trying to save some money.
				




In the last one, account for the extra shaders, which give a 6% boost to the 6GB card... but if VRAM mattered, that 6% advantage would grow when moving from 1080p to 1440p; it doesn't. ExtremeTech provides the clearest explanation, but AlienBabelTech slams it home in ways that leave no doubt:

"None of the GPU tools on the market report memory usage correctly... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."

"We began this article with a simple question: "Is 4GB of RAM enough for a high-end GPU?" The answer, after all, applies to more than just the Fury X — Nvidia's GTX 970 and 980 both sell with 4GB of RAM, as do multiple AMD cards and the cheaper R9 Fury. Based on our results, I would say that the answer is yes... First, there's the fact that out of the fifteen games we tested, only four could be forced to consume more than 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this... While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on _any_ current GPU."

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even then there is rarely more than a frame or two of difference... There is one last thing to note with _Max Payne 3_: it would not normally allow one to set 4xAA at 5760×1080 with any 2GB card, as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s."

Now certainly poor console ports can cause problems, and as each article shows, if you work at it you can create situations where there is a problem; but in those cases the GPU is usually overextended before the VRAM is. If I tried to count the number of games I have played that were in any way gimped by < 4 cores or < 4 GB at 1080p/1440p, I could not use all the fingers on one hand.


----------



## GorbazTheDragon (Jun 14, 2020)

@John Naylor on the VRAM thing, it's way too easy to just turn some texture quality settings down to keep VRAM usage reasonable... The only situation where I've seen VRAM be really limiting was trying to run X-Plane with satellite imagery textures on my old 670 (2GB), but that's a real outlier where disproportionately high resolution textures are paired with a given level of GPU performance demand.



Decryptor009 said:


> 2700K was still amazing up until 2017.


Really depends on what... Skylake (and Ryzen) way outperformed Haswell and earlier in frame-time consistency in some new games where AVX instructions became common. R6 Siege, the newer Assassin's Creeds, and IIRC BF1 and later were examples of this. For 60Hz a 2600K/2700K would be fine, but not for high refresh...


----------



## Frick (Jun 14, 2020)

Intel Core i9-10900K and Core i5-10600K "Comet Lake-S" - Test: At the same clock frequency

It's fifth time lucky for the Skylake architecture as Intel enters the test lab with the tenth-generation Core processors of the Comet Lake-S family.

m.sweclockers.com
				




For graphs, and reading.


----------



## Bill_Bright (Jun 14, 2020)

Bill_Bright said:


> And those would be a quality case and power supply. Both can carry one through years of evolving upgrades - to include totally new motherboard/RAM/CPU/graphics card combos.





GorbazTheDragon said:


> I agree, if you are into overclocking, RAM is also something good to invest in...


Nah! I don't agree with this. 

For starters, I don't see where or why being "into" overclocking applies to your claim. Your comment suggests that if you are NOT into overclocking, there's no need to invest in RAM during the initial build. And of course that simply is not true. Even a basic office computer benefits from lots of quality RAM.

For another, there is no assurance RAM you buy today will be supported by your upgraded CPU or upgraded motherboard. I have boxes of perfectly good, but totally obsolete DDR3, DDR2, etc. And no doubt, my current DDR4 will be retired before it dies. 

And third, there's no assurance RAM you buy 3 years from now for your 3 year old motherboard will be compatible (if available!) with your 3 year old RAM you bought with that motherboard! 

So the best way to "futureproof" your RAM and ensure you will always have plenty of RAM (at least with that motherboard), is to buy more RAM than you think you will ever need during the initial build. But again, be prepared to toss it all should you decide to upgrade your motherboard. 

As long as ATX remains the industry "form factor" standard, the case and PSU are the only components you can be 99.9% certain will carry over to a new build. You might be able to carry over your drives - "IF" their capacities are still sufficient. You'll probably be able to carry over the keyboard, mouse, monitor and speakers too.


----------



## fevgatos (Jun 14, 2020)

PooPipeBoy said:


> Well that's a bit awkward when everyone on the internet is saying that quad cores are obsolete.


They are obsolete. Quad cores without hyperthreading are dead for any high end gaming; they can't deliver consistent framerates in CPU demanding games, especially in a PC that runs a bunch of background stuff. 4/4 is the absolute low end. And honestly, what's the point of buying a 4/4 when you can get a 6/12 for like 85 euros (1600 AF)?


----------



## Vario (Jun 14, 2020)

Bill_Bright said:


> As long as ATX remains the industry "form factor" standard, the case and PSU are the only components you can be 99.9% certain will carry over to a new build. You might be able to carry over your drives - "IF" their capacities are still sufficient. You'll probably be able to carry over the keyboard, mouse, monitor and speakers too.


^ Heatsink on mainstream Intel has been a reliable constant too, LGA1156, 1155, 1150, 1151, 1200.  So it might be worth buying a quality heatsink and carrying it forward.


----------



## GorbazTheDragon (Jun 14, 2020)

Bill_Bright said:


> Your comment suggests if you are NOT into overclocking, there's no need to invest in RAM during the initial build.


It does not. I am merely suggesting that it is a component you can spend a bit extra on for performance, and one you can carry over between systems - obviously barring the case where you are buying at the end of a DDR life cycle.



Bill_Bright said:


> Even a basic office computer benefits from lots of quality RAM.



You're completely missing the point on this one... I'm saying that in 2016-2019 you should have considered spending on a 2x8 b-die kit over 2x4, or over 2x8 of coinflip-IC 3000C15-bin memory... This obviously implies that you will be overclocking the memory to make use of its better OC capability.

I'm not going to open an argument on how much memory you need as it is use case specific.



Vario said:


> ^ Heatsink on mainstream Intel has been a reliable constant too, LGA1156, 1155, 1150, 1151, 1200.



Noctua and some other manufacturers generally send out mounting hardware for new (or old) sockets.


----------



## Bill_Bright (Jun 14, 2020)

Vario said:


> ^ Heatsink on mainstream Intel has been a reliable constant too, LGA1156, 1155, 1150, 1151, 1200. So it might be worth buying a quality heatsink and carrying it forward.


Good point! Not to mention many aftermarket coolers support both Intel and AMD. So even if your upgrade path involves changing platforms, a wise cooler purchase could support you for years (assuming the fan bearings hold out!). 


GorbazTheDragon said:


> You're completely missing the point on this one...


I didn't miss anything. You didn't explain yourself completely. You specifically said, "_if you are into overclocking, RAM is also something good to invest in..._" While that may be true, what if you are into serious gaming but not overclocking? Or what if you are into CAD/CAE, graphics editing or other tasks? Would they not benefit from investing in RAM too? Of course they would! We can't read minds to know what you meant or should have said in the first place. I appreciate that you're now attempting to clarify, but...


GorbazTheDragon said:


> It does not


Yes, your comment did. 


			
GorbazTheDragon said:


> I am merely suggesting that it is a component you can spend a bit extra on for performance that you can carry over between systems.


And sorry, but that is STILL inaccurate! You are now saying RAM, and suggesting in particular "performance" RAM, is something you can carry over between systems? NO!!!! There is no assurance of that at all!

Everybody (well, just about everybody!) knows that not all RAM is supported by all CPUs or all motherboards/chipsets. So I again point to the pile of perfectly good, but useless-to-me, PC3-12800 and PC3-17000 RAM sitting on my shelf. Even today's DDR4 support, where compatibility across platforms and chipset/CPU families is orders of magnitude better than it was with DDR3, is still not universal across all DDR4 platforms. So there is no guarantee the DDR4 you buy today will work on your next motherboard.

But I will also point to the 5 year old power supply sitting on my shelf that, if necessary, I can toss into any of the 6 computers I have in this house and be certain it supports those systems. And I can even install it in the 2013 Corsair Carbide 300R case sitting in the closet and know it will all fit too.

And for the record, spending "_a bit extra for performance_" does NOTHING to ensure compatibility in the future either. So I am sorry again, but what you are claiming just isn't a given. It may be true for a specific one-off anecdotal scenario, but that certainly does not make it a rule or a generality. However, paying "a bit extra for quantity" during the initial build does indeed help ensure support down the road - at least with that motherboard. But that is still no guarantee the RAM will carry over between systems.

So I'll stand by my original comments about futureproofing:


Bill_Bright said:


> Futureproofing involves ensuring compatibility and support for future "protocols".
> 
> So the best way to "futureproof" your RAM and ensure you will always have plenty of RAM (at least with that motherboard), is to buy more RAM than you think you will ever need during the initial build. But again, be prepared to toss it all should you decide to upgrade your motherboard.


----------



## GorbazTheDragon (Jun 14, 2020)

Even on DDR3, if you exclude some of the very high voltage (1.8V+) kits, memory compatibility from X58 through Haswell was excellent. The same BBSE or PSC-X kit would work fine across all those platforms; the only limitation would be the capacity of 1Gbit ICs.

In the case of DDR4, only some very early dinosaur ICs were really dropped from support, and just about everything will run a kit of b-die, simply because of how consolidated the industry is nowadays. 2x8GB single-rank kits are so absurdly common that unless you went for some very early X99-era stuff, it will be on a familiar PCB and will work fine on modern DDR4 platforms. Additionally, 8Gbit ICs are adequate capacity-wise for the foreseeable future, and I would expect to need to transition to DDR5 before 4x8 becomes a limiting factor...

Therefore, buying a 2x8 b-die kit back when Skylake launched was a perfectly sensible futureproofing expenditure (provided you intend to use the OCing capability of the kit), because at the time it was a) obvious that DDR5 was not on the horizon, b) 2x8 enables upgrading to 4x8, which was foreseeably adequate capacity-wise, c) b-die is an exceptionally well performing 8Gb IC and higher capacity ICs are extremely unlikely to time better, and d) pricing at the time relative to typical capacity requirements was very good.

Pointing out that investing in late DDR3 or DDR2 is useless because it gets made irrelevant by the next DDR revision is pointing out the obvious. It would be just like saying investing a lot in a late socket 775 motherboard is pointless because 1156/1366 are on the horizon and won't be compatible; it's self-evident.


----------



## cucker tarlson (Jun 14, 2020)

dirtyferret said:


> I find it comical how some people ran out and got the Ryzen 1700 to "future proof" only to see the Ryzen 2600 perform better in gaming.  So then they get the Ryzen 2700x only to see the Ryzen 3600 perform better in gaming.  So now they have a 3700x/3800x, etc., to "future proof" but what do you think will happen when the Ryzen 4600 gets released?  yet you have an army of Intel i5 owners from the past decade laughing like hyenas every time they launch a game to play.  Like Bill stated there is a difference between playing a game and chasing FPS.  One is qualitative while the other is quantitative.


about this.
I wonder how many people who bought into that futureproofing myth with e.g. the Zen 1700 are actually doing something that a lower clocked 8/16 like the 1700 does better than a higher clocked 6/12 like the 1600X.



xman2007 said:


> Are we speaking just about gaming?


yes,like the title says.
gaming only.
not even gaming+streaming.



xman2007 said:


> but a Ryzen 1700 with a 2080 Ti is still a high end gaming PC


yes,it's got a 2080Ti so it is high end gaming still.



xman2007 said:


> why do Intel luvvies have to keep pushing the same max FPS BS


all reviews are unanimous; even the AMD-fanbase ones show the same results. A modern day 4/8 is nipping at 6/12 heels in gaming and can outperform older 8/16.


----------



## Bill_Bright (Jun 14, 2020)

GorbazTheDragon said:


> Even on DDR3, if you exclude some of the very high voltage (1.8V+) kits, memory compatibility from X58 through Haswell was excellent. The same BBSE or PSC-X kit would work fine across all those platforms; the only limitation would be the capacity of 1Gbit ICs.


So what? Will that DDR3 work on your DDR4 motherboard? Nope. So that comment, and your attempt to rationalize your claims, has NOTHING to do with futureproofing. Neither, in reality, does anything else you said. For example,



GorbazTheDragon said:


> Pointing out that investing in late ddr3 or ddr2 is useless because it gets made irrelevant by the new DDR revision is pointless, you're pointing out the obvious.




So me pointing out "the obvious", that old types of RAM do NOT provide futureproofing "between systems" (your words, not mine), justifies your claim that "for overclockers" investing in "performance" RAM (again, your words) does provide futureproofing???



I'm done here.


----------



## Tomgang (Jun 14, 2020)

Vayra86 said:


> Yeah, where is @Tomgang in this story lol
> 
> And there are a few others who are still riding old hedt



Amen to that

While X58 was not cheap back then, seen in the long run it really was a great investment. The same platform for 11 years; I never believed at the time it would last me so long.

Although at the start of the year I was planning to move to a Ryzen 9 3950X setup, covid-19 came and ruined those dreams. Luckily X58 is a solid and long lived platform with good quality motherboards, so it just keeps running day after day.

Also, with an NVMe SSD, a GTX 1080 Ti, SATA 3 and USB 3.0, and with the i7 980X at 4.4 GHz performing much like a stock (non-AF) Ryzen 1600, it is still a decent overall performer for games, as long as you don't need 120 fps or more constantly. For those like me who just want above 60 fps, it's great as an all-around 1080p gamer and for normal desktop use.

For those interested in how well X58 performs in modern games, see this video from Tech Yes City on YouTube. It contains a few game benchmarks as well.


----------



## cucker tarlson (Jun 14, 2020)

GorbazTheDragon said:


> You can't really futureproof with more cores when the architecture as a whole is just pretty flakey... Otherwise Haswell E would be a "good" buy for future


well, TBH the 5960X is still no slouch against the likes of Coffee/Comet Lake i5s/i7s









Test procesora Intel Core i7-8700K - Premiera Coffee Lake | PurePC.pl

Intel Core i7-8700K review - Coffee Lake launch (page 34): the i7-8700K vs AMD Ryzen and the rest of the field, debuting the new Coffee Lake architecture with more cores and higher performance.

www.purepc.pl
				




was thinking of getting one in 2017 and frankly, if I had, I wouldn't be thinking about an upgrade now


----------



## GorbazTheDragon (Jun 14, 2020)

Hmm, maybe it was only in BFV rather than BF1 where they went AVX-heavy, but definitely in R6 Siege it's hard to get good frame pacing on any Haswell/Broadwell part. And that's a game played competitively, so it's a case where high refresh is actually relevant.

Either way, you are generally worse off on the HEDT parts for high refresh gaming, because a lot of the candidate titles are memory reliant and (aside from Zen 2 TR) you generally lose performance to the worse memory latency on X99/X299 compared to the mainstream platforms...


----------



## cucker tarlson (Jun 14, 2020)

GorbazTheDragon said:


> Either way generally you are worse off on the HEDT parts for high refresh gaming.


of course.
but they can deliver great performance in gaming too.just not perf/dollar


----------



## Vayra86 (Jun 14, 2020)

GorbazTheDragon said:


> Hmm maybe it was only in bfv rather than bf1 where they went avx heavy, but definitely in R6 siege it's hard to get good frame pacing on any haswell/broadwell part. And that's a game played competitively so it's a case where high refresh is actually relevant.
> 
> Either way generally you are worse off on the HEDT parts for high refresh gaming because a lot of the titles which are candidates are memory reliant and (aside from zen2 tr) generally you lose performance due to the worse memory latency on x99/x299 compared to the mainstream platforms...



Yeah, and at the same time a DDR3 rig on HEDT is still going to do just fine, while a DDR3 rig on MSDT would most likely be obsolete by now. Part of that is the quad channel memory/higher bandwidth alongside the core count; it just has breathing room in case some load demands it. It's only fairly recently (since DDR4) that there has been a renewed focus on RAM speeds and lower latency. Early DDR4 most certainly was not focused on latency... and already showed bigger gains over MSDT DDR3 rigs. It's part, or most, of the reason Skylake was a jump from Broadwell, or became one as people figured it out and faster RAM kits came out.









The Intel 6th Gen Skylake Review: Core i7-6700K and i5-6600K Tested

www.anandtech.com
				




These were early days, and already a basic DDR4-2666 kit was showing gains over DDR3, despite being higher latency than the 1600 CL8-9 kits that were pretty standard for DDR3 XMP.

Don't disagree on the overall idea for high refresh gaming though, because you also want a lower core count at higher frequency rather than many cores at a lower one. This is, however, something the current MSDT lineup has circumvented handily with highly dynamic boosting; realistically, what we have on MSDT right now is optimal for any sort of gaming setup, anywhere in the CPU stack except rock bottom. But this was not always the case; CPUs were balanced for entirely different workloads not too long ago.

So really, in 2020, if you're looking for a CPU for gaming you don't have to make any trade-offs anymore. You can have a core count that will be sufficient going forward, superb single core clocks, and fast RAM to support that performance, all in one box.


----------



## evernessince (Jun 14, 2020)

Bill_Bright said:


> Bullfeathers! Well, unless you are talking old 486 Pentiums!
> 
> Futureproofing?
> 
> ...



You can use any 4c/8t CPU from the last 10 years and it will run the game. Your criterion is "does the game run?" whereas most gamers' criterion is "does the game run as smoothly as possible?"

In a majority of games that 7700K is going to be pegged at 100% utilization, meaning any sort of background process or other software is going to hobble your frame-rate. 4c/8t is nice in a sterile benchmark environment, but I'll pass on the stuttering and on being restricted to running one program at a time, thank you.


----------



## Bill_Bright (Jun 14, 2020)

evernessince said:


> You can use any 4c/8t CPU from the last 10 years and it will run the game.


Wow! You sure do NOT know what futureproofing means!!!!  That's too bad. 

Can you put any 4c/8t CPU made 10 years ago (or even 5 years ago) in any motherboard made today? Nope! What if we assume the same socket? Then still, nope! So guess what? That CPU was not "futureproof" after all.


evernessince said:


> Your criterion is "does the game run?"


No it isn't. Don't try to put words in my mouth - you clearly don't know how, therefore are not good at it. 

First, I am not tunnel visioned and narrow minded. My requirements go way beyond non-productive games. 

Second, my criterion is (as I said above, but it flew over your head), "_does it support today's (and tomorrow's) protocols/industry standards?_" But accepting the fact that many enthusiasts are gamers, if that involves games, then: "_does it still provide good game play?_" - that is, "_is it still entertaining?_" Because contrary to what you, and apparently some others here, think, computers are NOT just game machines, nor is good "game play" only possible with the maximum possible FPS.

And FTR, I never, not once, EVER said, suggested or implied stuttering was acceptable. So keep your interpretations of other people's comments to yourself, thank you.


----------



## GorbazTheDragon (Jun 14, 2020)




----------



## newtekie1 (Jun 15, 2020)

cucker tarlson said:


> yeah but let's not waste time on semantics.



Wait, what? It's not semantics, it's literally the facts.  Quad-Cores aren't obsolete, your statement is wrong.


----------



## phill (Jun 15, 2020)

To a degree, I feel even single-core CPUs have their place. It might not be what it used to be, but would you expect it to be?

Of course things move on, and if they didn't that wouldn't be very helpful at all... I stand by my choice of hardware for my girls' rigs, just as I do with what I have for my own PC. But as with everything for anyone buying into the 'PC Master Race', it probably comes down to two things above all, budget and personal choice, and on those two things no one is wrong. Because if at the end of the day it's doing everything someone needs it to do, then how can it be wrong? There might be a few variances in what you could pick and choose, but again, that'll be down to personal choice 

Chill guys, it's just hardware


----------



## Iceni (Jun 15, 2020)

Bill_Bright said:


> speakers too



I bought my speakers in 1999. I've had to rebuild the wired remote; outside of that they work just as well as the day I bought them. Creative Cambridge Audio DTT2200 5.1.

Who would have thought that older tech could actually be more futureproof? At the time of purchase, digital speakers (digital crossovers and amps) were becoming a thing; I can't name a single person with digital speakers who is still using a set as old as my almost 20-year-old analogue ones.


----------



## Vario (Jun 15, 2020)

Iceni said:


> I bought my speakers in 1999. I've had to rebuild the wired remote. Outside of that they work just as well as the day I bought them. Creative Cambridge Audio DTT2200 5.1
> 
> Who would have thought that older tech could actually be more future proof. At the time of purchase digital speakers (digital crossovers and amps) were becoming a thing, I can't name a single person who got digital speakers that is using a set as old as my almost 20 years old analogue ones.


Had a similar 4.1 version, the Cambridge FPS2000, while it continued to work, I eventually got tired of it and donated it to a charity sale a year ago.  The digital component was always a hassle requiring a PCI Soundblaster card so over the years I switched to using them in analog instead.

I am still using some mid 90s beige Acoustic Research two channel speakers.


----------



## RandallFlagg (Jun 15, 2020)

Sounds to me like a lot of folks are having a tough time coming to grips with reality.

What reality?

#1 : A modestly OC'd i7-8700K with a good MB and RAM owns your Ryzen (and it doesn't matter which Ryzen) on games.  
#2 : Games, like many complex applications, get limited benefit from additional threads beyond a certain point; that seems to be ~4.
#3 : #2 isn't likely to change anytime soon, as it doesn't have anything to do with availability of multi-core hardware.
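Point #2 is basically Amdahl's law. A quick toy sketch of it (the 70% parallel fraction below is just an illustrative guess, not a measured number for any game):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: speedup = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# If ~70% of a frame's CPU work parallelizes (hypothetical figure),
# each doubling of cores buys less than the previous one:
for n in (2, 4, 8, 16):
    print(f"{n} cores: {amdahl_speedup(0.7, n):.2f}x")
```

In that scenario you get roughly 2.1x at 4 cores but still under 3x at 16, which is why clocks and IPC keep mattering more than core count past a point.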


----------



## phanbuey (Jun 15, 2020)

RandallFlagg said:


> Sounds to me like a lot of folks are having a tough time coming to grips with reality.
> 
> What reality?
> 
> ...



this post...


----------



## jlewis02 (Jun 15, 2020)

The 7700k is on its very last legs as a useable cpu for today's standards.
It will still be a gaming-only cpu for a bit longer.
I have tried to stream and game with it and it is terrible at it unless you drop the quality of the game and the stream.
I would have something different but I only paid $160 for my entire system with a new cpu cooler and a case that I didn't even need.
Waiting for tax money next year to upgrade.


----------



## R0H1T (Jun 15, 2020)

RandallFlagg said:


> Games, like many complex applications, get limited benefit from additional threads beyond a certain point; *that seems to be ~4.*


No, I'd like to see where you pulled this from


----------



## tygrus (Jun 15, 2020)

Newer 8+ real cores x higher freq are always going to win today, but that doesn't mean the old CPU was a bad choice at the time. The old high cores x lower freq can be competitive with newer fewer cores x higher freq. But if you're going to upgrade today for gaming, don't bother with 4 cores even with HT/SMT; 6 cores will become the sweet spot. Should focus on, can we play it with enjoyable FPS with enjoyable image quality? I think another $250 on GPU will provide more benefit than $250 on CPU but YMMV.


----------



## Xex360 (Jun 15, 2020)

Their tests are too academic and don't offer a good idea on the real performance, for example playing BF1 online and offline is totally different, the CPU is much more stressed online.
I only watch those videos for fun, Hardware Unboxed are doing a decent job, but the rest are obsessed with academic results, it's like testing a car with slicks or nitrous.


----------



## thesmokingman (Jun 15, 2020)

Xex360 said:


> Their tests are too academic and don't offer a good idea on the real performance, for example playing BF1 online and offline is totally different, the CPU is much more stressed online.
> I only watch those videos for fun, Hardware Unboxed are doing a decent job, but the rest are obsessed with academic results, it's like testing a car with slicks or nitrous.



And if one wanted one could setup benchies to prove whatever pov one has.


----------



## Bill_Bright (Jun 15, 2020)

phill said:


> As if at the end of the day it's doing everything someone needs it to do, then how can it be wrong?


Exactly!


Iceni said:


> I bought my speakers in 1999. I've had to rebuild the wired remote. Outside of that they work just as well as the day I bought them. Creative Cambridge Audio DTT2200 5.1
> 
> Who would have thought that older tech could actually be more future proof. At the time of purchase digital speakers (digital crossovers and amps) were becoming a thing, I can't name a single person who got digital speakers that is using a set as old as my almost 20 years old analogue ones.


My Logitech THX Z560 4.1 surround sound speakers are about the same age (2001) and still work great too. I thought about replacing them with 5.1 but I only use them for music (and Windows sounds but who cares about them). So with music, not having a center speaker is perfect. And besides, I have a two-monitor setup and a center speaker would not work - at least for me. 

As for "digital" speakers - there's no such thing! Audio is "analog". Period. No exceptions. You can't hear a "1" or "0", unless someone speaks them - and then the waveform you hear is analog. "Digital speakers" is nothing more than another inaccurate "marketing term" for a speaker system with the DACs (digital to analog converters) located in its internal/integrated electronics, as Iceni accurately noted. Otherwise, the speakers (and amplifiers) are exactly the same but with the DACs located on the sound card (or motherboard's integrated sound). 
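To illustrate that round trip (a toy sketch only; 16-bit is just the common consumer bit depth, not tied to any particular speaker set):

```python
import math

BITS = 16  # quantizer resolution; 16-bit is the usual consumer depth

def quantize(x: float, bits: int = BITS) -> int:
    """ADC side: map an analog amplitude in [-1, 1] to an integer code."""
    levels = 2 ** (bits - 1) - 1
    return round(x * levels)

def reconstruct(code: int, bits: int = BITS) -> float:
    """DAC side: turn the integer code back into an analog level -
    the only thing the amp and speaker cone ever see."""
    levels = 2 ** (bits - 1) - 1
    return code / levels

# Peak of a 1 kHz sine: the "digital" part is just the integer code;
# the waveform you actually hear is the reconstructed analog value.
peak = math.sin(math.pi / 2)   # analog amplitude 1.0
code = quantize(peak)          # -> 32767
print(code, reconstruct(code))
```

Whether that reconstruct() step happens on the sound card or inside the speaker cabinet is the entire difference the marketing label points at.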


Vario said:


> I am still using some mid 90s beige Acoustic Research two channel speakers.


 And I still have some vintage 70s Acoustic Research AR-3a speakers that are still world class speakers.  


Xex360 said:


> For example playing BF1 online and offline is totally different, the CPU is much more stressed online.


Why? The CPU does not care where its next set of instructions come from. It just wants them and wants them now. The problem with online vs offline is the very restrictive bottleneck created by the bandwidth of the local networks, Internet connection, and distant end servers. That's a lot of potential for considerable latency issues. 

With offline, especially with lots of RAM and a very fast local drive [hopefully SSD], the CPU can get its next set of instructions now, with essentially negligible latency issues. 


tygrus said:


> Should focus on, can we play it with enjoyable FPS with enjoyable image quality?


Right. And though what is enjoyable or not is very subjective, it should be determined by the "game play" and not benchmark scores.


----------



## TheoneandonlyMrK (Jun 15, 2020)

In a year, there will be a game a quad core cannot play, I would say.

Let's see what some of the quad core supporters are saying then, I'll be listening.

Many tout games that have been around for years, or highlight the performance at 1080p.

The future isn't relative to the past, it evolves from it.

For the last seven years a quad core and a Polaris was THE baseline all games were made to hit, because it vaguely equals both consoles.

The baseline is getting raised in 3-6 months.

If you buy a comparable pc at that time it still won't have equal tech to the consoles; their subsystem smashes pc equivalents.

But that pc should hit most minimum specs for the next 7 years; that would be my idea of future proof.
Even that's getting superseded within a year.

But quads will not cut it soon IMHO.

The baseline decides not Pcmr.


----------



## Xex360 (Jun 15, 2020)

Bill_Bright said:


> Exactly!
> My Logitech THX Z560 4.1 surround sound speakers are about the same age (2001) and still work great too. I thought about replacing them with 5.1 but I only use them for music (and Windows sounds but who cares about them). So with music, not having a center speaker is perfect. And besides, I have a two-monitor setup and a center speaker would not work - at least for me.
> 
> As for "digital" speakers - there's no such thing! Audio is "analog". Period. No exceptions. You can't hear a "1" or "0", unless someone speaks them - and then the waveform you hear is analog. "Digital speakers" is nothing more than another inaccurate "marketing term" for a speaker system with the DACs (digital to analog converters) located in its internal/integrated electronics, as Iceni accurately noted. Otherwise, the speakers (and amplifiers) are exactly the same but with the DACs located on the sound card (or motherboard's integrated sound).
> ...


At least on BF1 it's not the case, you can see the CPU is being hammered much more during 64 players matches compared to single player. 
But my point is, we need to have more realistic reviews, where the review focuses on a realistic use case for the product. You don't mix an i9 with a rx5500, the same way you don't mix an i3 with a 2080ti and high end RAM, unless the goal is purely academic, which is interesting and entertaining, but I believe the goal of reviews is to help us make purchase decisions.


----------



## dirtyferret (Jun 15, 2020)

jlewis02 said:


> The 7700k is on its very last legs as a useable cpu for today's standards.


Is there a confirmed standard by the PC gaming industry that I am not aware of?



jlewis02 said:


> I have tried to stream and game with it and it is terrible at it unless you drop the quality of the game and stream.


Then your standard of useable (sic) is more demanding than just playing PC games, which is fine, but understand you are moving the goal post on your demands beyond just simply playing games.


----------



## Bill_Bright (Jun 15, 2020)

theoneandonlymrk said:


> In a year, there will be a game a quad core cannot play, I would say.


Why would game developers force their customers to ditch millions and millions of quad core processors and spend $100s more for a 6+ core processor, and possibly a new motherboard to support it, which may require new RAM and new Windows license too (if new motherboard is needed)? Their sales would plummet, if ever high in the first place. 

I do agree with you that in the future, to appease the gaming enthusiasts who have the necessary deep pockets, many games will be coded where the absolute best gaming performance can only be achieved with hexa-core, octa-core, deca-core or more CPUs.  But I contend quad-core CPUs will still be able to play them for many years to come. Maybe there will be fewer objects, fewer players, less detailed backgrounds, a smaller field of vision. But good game play (for those not obsessed with the best possible FPS scores on benchmarks) will still be possible. 

NO DOUBT for many, those limitations will be unacceptable. I totally get that! But that does not make the quad core CPUs "obsolete". That won't happen until hexa-core or octa-core CPUs are routinely used in entry-level PCs and notebooks. And for now, both AMD and Intel are still making dual-core processors which are still widely used today - even with many games. 

I think it is important to remember that, even here at TPU, and contrary to what some may believe and may want everyone else to believe too, *it is NOT all about playing games!!!*



Xex360 said:


> the CPU is being hammered much more during 64 players matches compared to single player.


Oh, now I see what you are saying but I am sticking with my argument. I agree that your on-line scenario is more CPU intensive, but I contend it is not because it is on-line. It is because there are 64 "real-life" players, and not just a single player playing against the computer. There's just more crunching going on and not because it is on-line. 



Xex360 said:


> But my point is, we need to have more realistic reviews


I agree with this too but I just don't think there could ever be enough reviews. The ATX Form Factor standard is a wonderful blessing for PC consumers - especially for those of us who build our own. It gives us 1000s of options from 100s and 100s of different manufacturers we can setup in millions of different configurations. No other industry lets us use parts from 100s of different manufacturers with confidence they will all fit and connect together with standard dimensions, standard screws, screw hole locations, connectors, voltages, etc. and then know they will work. 

So my point is, it would be great if the professional review sites could review all possible configurations, but it is just not possible. There's not enough time, not enough money and not enough space. 
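Just to put rough numbers on it (the part counts below are made up for illustration; real retail catalogs are bigger):

```python
from math import prod

# Hypothetical counts of mainstream retail choices per slot -
# illustrative only; the point is the multiplicative blow-up.
choices = {
    "cpu": 30,
    "motherboard": 50,
    "ram_kit": 40,
    "gpu": 60,
    "storage": 25,
    "psu": 20,
}

configs = prod(choices.values())
print(f"{configs:,} distinct builds")   # 1,800,000,000 distinct builds
```

Even benchmarking one configuration per second, around the clock, would take over 57 years to cover those - hence cross-referencing multiple reviews is the only practical option.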

Plus, different games and different "productivity software" (remember, it is not all about games!) perform better on different hardware. Your favorite game or program may perform better on a "the other brand" CPU. Or Fred may prefer a different game which plays better on that platform. There's just no way any review site could cover all the truly realist scenarios - even the common ones. There are just too many. 

So all we, as consumers, can do is read as many reviews as possible, then cross our fingers and go from there. 



Xex360 said:


> you don't mix an i9 with a rx5500, the same way you don't mix an i3 with a 2080ti and high end RAM, unless the goal is purely academic,


Yes you do - and it is because the review needs to be academic (objective), not subjective. If you are reviewing a CPU, you don't want the RAM or graphics card to bottleneck the CPU. If you are reviewing a graphics card, you don't want the card constantly waiting on the CPU or RAM. 



dirtyferret said:


> but understand you are moving the goal post on your demands then just simply playing games.


That's the problem, isn't it? We each have our own opinion for where to put that goal post.


----------



## FordGT90Concept (Jun 15, 2020)

With both consoles going 8c/16t...
I wouldn't buy anything less for a gaming computer.  If you change computers every 2 years or so, 4/6 core is still fine but AAA games launching in 2021/2022 might require minimum 8c.


----------



## cucker tarlson (Jun 15, 2020)

FordGT90Concept said:


> With both consoles going 8c/16t...
> I wouldn't buy anything less for a gaming computer.  If you change computers every 2 years or so, 4/6 core is still fine but AAA games launching in 2021/2022 might require minimum 8c.


Really ? How much gaming performance is that 8/16 CPU packing at 3.5 g allcore ? 1800x level ?
Check where 1800x is against recent 6/12 cpus like 10600kf oc
Less than 8/16 is fine for 2 years,right......just like 1700x was futureproof and 3 years later it's outperformed by amd's own 3300x


----------



## TheoneandonlyMrK (Jun 15, 2020)

Bill_Bright said:


> Why would game developers force their customers to ditch millions and millions of quad core processors and spend $100s more for a 6+ core processor, and possibly a new motherboard to support it, which may require new RAM and new Windows license too (if new motherboard is needed)? Their sales would plummet, if ever high in the first place.
> 
> I do agree with you that in the future, to appease the gaming enthusiasts who have the necessary deep pockets, many games will be coded where the absolute best gaming performance can only be achieved with hexa-core, octa-core, deca-core or more CPUs.  But I contend quad-core CPUs will still be able to play them for many years to come. Maybe there will be fewer objects, fewer players, less detailed backgrounds, a smaller field of vision. But good game play (for those not obsessed with the best possible FPS scores on benchmarks) will still be possible.
> 
> ...


Progress.

The baseline is going up, as I said.

It's fine if we disagree.

TBF to your point, you simply won't see some games ported to pc because they can't be done without compromise.

And sure, some devs will still design for quads, I didn't say all games won't work in a year, I said some won't be possible to run on quads and I stand by it.

Some new console games will not be portable to quads at all, and some will need more than four cores to run adequately for minimum gameplay.

Oh and this is a thread about future proofing for games, obviously quads will retain some useful power but they're the new single core chip to me in two years, ie the absolute minimum to use.


----------



## newtekie1 (Jun 15, 2020)

FordGT90Concept said:


> With both consoles going 8c/16t...
> I wouldn't buy anything less for a gaming computer.  If you change computers every 2 years or so, 4/6 core is still fine but AAA games launching in 2021/2022 might require minimum 8c.



Everyone said that the last generation too. The fact is the extra cores are going largely unused.


----------



## cucker tarlson (Jun 15, 2020)

newtekie1 said:


> Everyone said that the last generation too. The fact is the extra cores are going largely unused.


especially on ryzen 3000 !
it's supposed to use only the best within one ccx.


----------



## TheoneandonlyMrK (Jun 15, 2020)

newtekie1 said:


> Everyone said that the last generation too. The fact is the extra cores are going largely unused.


Last generation , which last we on about, pre ryzen.
The baseline was 8 jaguar cores, a quad kept up.
A quad has no chance of even managing the data decompression of ps5(not even half the performance of a ps5 decompression engine) never mind 8/16 zen2 cores at 3.6.

Exactly how much could they scalpel out of gta6 to make that quad work, would it even be the same game, sure as shit wouldn't be half the experience, you know the experience the game designer aimed for on ps5 and Xbox series X.

Keep covering your ears shouting lala la y'all but realities will kick in soon enough.


----------



## cucker tarlson (Jun 15, 2020)

theoneandonlymrk said:


> Last generation , which last we on about, pre ryzen.
> The baseline was 8 jaguar cores, a quad kept up.


English ?


theoneandonlymrk said:


> A quad has no chance of even managing the data decompression of ps5(not even half the performance of a ps5 decompression engine) never mind 8/16 zen2 cores at 3.6.
> Exactly how much could they scalpel out of gta6 to make that quad work, would it even be the same game, sure as shit wouldn't be half the experience, you know the experience the game designer aimed for on ps5 and Xbox series X.
> Keep covering your ears shouting lala la y'all but realities will kick in soon enough.


that's why they're in consoles

btw you already had to upgrade your last gen 8 core ryzen to keep up with gaming oriented 6 cores,now that 3300x is out it'd lose to that quad with SMT too


----------



## Bill_Bright (Jun 15, 2020)

theoneandonlymrk said:


> Progress.
> 
> The baseline is going up, as I said.


I agree but progress is primarily driven by user demand. And there just is not that big of demand - yet. Thus, that baseline may be going up, but it sure isn't quickly. Otherwise, dual-core processors would not still be so available from both Intel and AMD. 



theoneandonlymrk said:


> TBF to your point you simply won't see some games ported to pc because they can't without compromise.


I personally don't think consoles or console games should be a part of this discussion. It is not like consumers have a wide variety of choices when it comes to the CPU put in consoles. Nor can consumers readily swap out or upgrade console CPUs or motherboards. 

Frankly, IMO it is because gaming consoles have such great performance and console games perform so well on those consoles that the gaming PC niche market remains a niche market. But that's for another discussion, IMO. 

It is cucker tarlson's thread and not mine so not really my say. But I think we should limit this discussion to CPUs that will be used in PCs.


----------



## cucker tarlson (Jun 15, 2020)

Bill_Bright said:


> I personally don't think consoles or console games should be a part of this discussion.


that's one thing.

another is - do people really dig into reviews or do they base everything they say here on a whim ? cause it seems to me an 8 core 3700x isn't really faster than 3600 in games and 3900x isn't faster than 3700x in turn except for differences that reflect frequency since they're each higher binned.

let me ask you this question - if we have a game that can scale on 9960x higher than 10900k,why don't 12 or 16 core ryzen 3000's scale past 9700k.








Test procesora Intel Core i9-10900K - Nowy król wydajności w grach ("The new king of gaming performance") - www.purepc.pl


----------



## TheoneandonlyMrK (Jun 15, 2020)

Bill_Bright said:


> I agree but progress is primarily driven by user demand. And there just is not that big of demand - yet. Thus, that baseline may be going up, but it sure isn't quickly. Otherwise, dual-core processors would not still be so available from both Intel and AMD.
> 
> I personally don't think consoles or console games should be a part of this discussion. It is not like consumers have a wide variety of choices when it comes to the CPU put in consoles. Nor can consumers readily swap out or upgrade console CPUs or motherboards.
> 
> ...


Many things affect the games that come out on pc, but there are few games not made for consoles primarily and pc secondary; to ignore such in a debate that's essentially about gaming performance is very near sighted.

It's specifically the set state of the old and next generation consoles which leads devs to use them as a baseline, far more consumer friendly and accessible to the masses.

But as I said we can have different opinions.

As for dualies, wtaf, granny needs to surf, Alice has a cookbook to read, but no one's enjoying AAA games on one are they; they're pointless for gaming, why mention them.


Also you can read what review you want, there's enough about to sway any angle, but they're all testing present hardware on already-released software.

Is it that informative about future performance? Not really, you can't know what performance in what specific area a game will want.
Rtx could really take off with next generation instead of being under utilized.
There are games on the way that are designed to push, not just meet, Pcmr expectations.


----------



## RandallFlagg (Jun 15, 2020)

Just thought I'd leave this here:


----------



## cucker tarlson (Jun 15, 2020)

RandallFlagg said:


> Just thought I'd leave this here:
> 
> View attachment 159129


hitman 2 is unusually single threaded for a new AAA game (kinda,the series is a mockery of what it used to be)

the point is even in heavily multithreaded game a new 4/8 can match or beat a slower 8/16
3300x beats it even in 1% and 0.1%.By *17%*


----------



## TheoneandonlyMrK (Jun 15, 2020)

I think I just failed to grasp what this threads about.

It's really an old quad owner's last lap round the block, arms waving, congratulations thread.


Game's of the passed and three year old CPU's only please eh.

Future game proof  = old game bench wins


Hahaaaaaaaaa


Pahh useless thread on the edge of nonsense.


----------



## cucker tarlson (Jun 15, 2020)

theoneandonlymrk said:


> Game's of the passed and three year old CPU's only please eh.


how ?
with division 2 and rdr2 
hitman 2 and f1 2019,how exactly are they "game's of the passed" ?



theoneandonlymrk said:


> three year old CPU's only please eh.


no.only the 8 cores are old here.well,kinda.they're 3 years old.so not really old tbh.
the 4/8 cpus are new i3s and r3s
did you not read the title of the thread ? after 5 pages of posting here ?



theoneandonlymrk said:


> Pahh useless thread on the edge of nonsense.



why ? it's just a test.
it's pretty weak to whine over numbers you don't like on a tech forum like TPU.


----------



## Fizban (Jun 15, 2020)

theonek said:


> 7700k is long time dead, it can't even handle streaming and gaming simultaneously.... Multi cores are up to date fo multi load tasking.... Old cpu's are good for office work only....




7700K is fine for a lot more than "office work".


theoneandonlymrk said:


> Disagree heartily ,gtaV shows gains from cores and is Seven years old and largely irrelevant with regards future anything.
> 
> Assassin's creed is unplayable at 4k on 4 cores
> 
> ...



4K is a silly measurement for most people though. Almost no one plays at 4K, since even an RTX 2080 Ti can't manage 60 fps in a fair few games.


----------



## cucker tarlson (Jun 15, 2020)

Fizban said:


> 7700K is fine for a lot more than "office work".


not for someone who doesn't bother reading the OP apparently but feels fine about spamming the thread from a double account.


----------



## TheoneandonlyMrK (Jun 15, 2020)

Fizban said:


> 7700K is fine for a lot more than "office work".
> 
> 
> 4K is a silly measurement for most people though. Almost no one plays at 4K, since even an RTX 2080 Ti can't manage 60 fps in a fair few games.


Total rubbish, my Vega 64 does 60fps@4k in most games quite easily, and I have gamed at 4k on a 1060 6GB.
4k is easy, it's only hard at ultra settings.

You expect me to appreciate your viewpoint when some of you are stuck in your own perspective; 1080p/1440p@144hz is a niche and no more important than the 4k niche you're denouncing as unimportant.

Funny because even the old consoles targeted 4k not 1080p yet Pcmr be like nah 1080p240hz or stfu.

But regardless you missed my point, that being that there are games and users now that a quad serves poorly, simple fact.

And that you can't measure how useful something will be on unreleased software, developed for more powerful systems, ie the next generation consoles, ie the future,
by running some old games, some over 7 years old; they're not the future.


----------



## RandallFlagg (Jun 15, 2020)

cucker tarlson said:


> no.only the 8 cores are old here.well,kinda.they're 3 years old.so not really old tbh.
> the 4/8 cpus are new i3s and r3s
> did you not read the title of the thread ? after 5 pages of posting here ?



Agree with your post but the big thing coming out is not so much that the 1700 didn't fare well against newer 4-core CPUs, it's that it also doesn't fare well in newer games against older 4-core CPUs from the era the 1700 was released in. 

Now I will also say I don't think anyone is truly pumping 4-core CPUs for the future.  But pumping 8C/16T is likely to be a big fat miss for gamers, just as it has been.  

We've had 4-core CPUs for consumers for a *very* long time.  The Q6600 was released in Q1 2007 - more than 13 years ago - and it is really only in the last few years that we've seen games where more than 2 cores significantly helped.  This whole argument occurred back then too, and the "Moar Cores" crowd was wrong then.   They are wrong now too.  

If history is a guide, games might get actual use from 8 cores around 2030 or so.  Probably about the time the PS 5 and Xbox X are being replaced.  I'll be long since retired.  This whole thread is nothing more than a lesson that the more things change,   the more they stay the same.


----------



## cucker tarlson (Jun 15, 2020)

theoneandonlymrk said:


> Total rubbish, my Vega 64 does 60fps@4k in most games quite easily, I have gamed at 4k on a 1060 6GB.
> 4k is easy


nah,it's just the games you play then


theoneandonlymrk said:


> But regardless you missed my point, that being that there are games and users now that a quad serves poorly, simple fact.


absolutely.
I replaced my 4790k 4.6ghz cause it was getting too slow.
but it's not like a 3300x really.it's 7 years old.



theoneandonlymrk said:


> By running some old games , some over 7 years old, they're not the future.



the hell are you on about again.
this test has rdr2,f1 2019 and division 2 and those were the ones I really was referring to.



RandallFlagg said:


> Now I will also say I don't think anyone is truly pumping 4-core CPUs for the future.  But pumping 8C/16T is likely to be a big fat miss for gamers, just as it has been.


it's called getting what is needed.
futureproofing with a slower architecture that you're trying to compensate for with cores works worse than a fast cpu with fewer cores. this is pretty much everything this thread is about.


----------



## Bill_Bright (Jun 15, 2020)

cucker tarlson said:


> another is - do people really dig into reviews or do they base everything they say here on a whim ?


Depends on the person. I wasn't born with a silver spoon in my mouth. I was a single parent, full time active duty military, carrying a full load at college. An investment that finally led to a great job that now lets me splurge a little now and then on a few of the niceties of life. But that does not mean I want to waste my money, or even have money to waste. I don't. So I do my homework. I read the professional reviews. I cull out the stupid user reviews that downrate a product because FedEx delivered it next door instead of my house. 

But there certainly are some who buy on a whim, or they like the color, or they buy based solely on the opinion of one person who may or may not have done their homework. I can't answer for them. I can only hope to give them sound advice. 


cucker tarlson said:


> cause it seems to me an 8 core 3700x isn't really faster than 3600 in games and 3900x isn't faster than 3700x in turn except for differences that reflect frequency since they're each higher binned.


I don't disagree with that at all. There are many products that are definitely better "on paper" and in benchmarks tests and no doubt, marketing weenies (and fanboys) use that information (often expertly) to their advantage to sell their products over the other guys. And no doubt, the placebo effect plays a role in human perception and consumer satisfaction too. 

It often takes a willing open mind, and a bit of self-discipline to separate the perceptions created by the "fluff" and hype from the reality of the real world. And not everyone is willing, or able to see and accept that reality. Do better specs on paper always equate to better game play and most importantly (at least IMO) greater entertainment value? Nope. 



cucker tarlson said:


> let me ask you this question - if we have a game that can scale on 9960x higher than 10900k,why don't 12 or 16 core ryzen 3000's scale past 9700k.


I can't answer that. No demand for it? IDK If a Bugatti can go 300MPH, why can't all other cars do at least 200MPH? It is just a matter of throwing in a couple extra gears, right? 



theoneandonlymrk said:


> to ignore such in a debate that's essentially about gaming performance is very near sighted.


Near sighted? LOL 

And I say, as I have all along in this debate, that futureproofing a CPU is much more than just games and "gaming performance"! That's despite the subject title of this thread. 

And FTR, intentionally putting less emphasis on something is not ignoring it. 

A future proof CPU absolutely must be supported by all future motherboards that use that socket (which lasts a lifetime too). The futureproof CPU must support all future RAM types, sizes, speeds and timings. It must support all future protocols for all future I/Os. And of course, that means graphics protocols too - at least for those CPUs with integrated graphics. 

Even when being "near sighted" and focusing on gaming performance, futureproofing does NOT imply the best possible gaming performance long into the future. If it did, AMD would only need to produce one futureproof CPU and Intel would only need to produce one futureproof CPU. Why would there be a need for entry-level, middle of the road, and top tier CPUs if only "the best possible gaming performance" was acceptable? 

Ludicrous? Yes. Kinda how this thread is now.  

The reality is, the only way a CPU can be futureproof is if it is firmware coded and not hard coded. But even then, there are physical limits governed by the Laws of Physics that will still put a clock on any CPU's effective lifespan.


----------



## cucker tarlson (Jun 15, 2020)

Bill_Bright said:


> Depends on the person. I wasn't born with a silver spoon in my mouth. I was a single parent


me too.
couldn't really afford any of my current hobbies,didn't even dream of them when I was a teenager.



Bill_Bright said:


> I can't answer that. No demand for it? IDK If a Bugatti can go 300MPH, why can't all other cars do at least 200MPH? It is just a matter of throwing in a couple extra gears, right?


I just meant it's not how ryzen 3000 works in games since it's getting so much mention from Mr.K

Like Vayra said,8/16 is just an overkill cpu for a console and has no bearing on PCs where there are several generations of cpus from two makers,unlike one cpu in a whole console generation.
You can't say that pc gamers need that very cpu that is in the console.There's plenty others.Slower,same and faster.


----------



## TheoneandonlyMrK (Jun 15, 2020)

RandallFlagg said:


> Agree with your post but the big thing coming out is not so much that the 1700 didn't fare well against newer 4-core CPUs, it's that it also doesn't fare well in newer games against older 4-core CPUs from the era the 1700 was released in.
> 
> Now I will also say I don't think anyone is truly pumping 4-core CPUs for the future.  But pumping 8C/16T is likely to be a big fat miss for gamers, just as it has been.
> 
> ...


Despite the delusional, Intel aren't in charge anymore.
They're not getting away with 10 years of quads anymore.
What was will be so again but this time there's some weight behind it, and that weight isn't intel.

AMD has Ryzen, and if the fools at Intel agreed with you, wouldn't they still be pushing quads with enhanced caches and frequency specifically made for gaming?

Don't you think they could? Because they sure as shit tried, yet haven't progressed that design past one generation, ie the enhanced cache 5570 I think it was, though I could have that wrong and I'm on a phone. I'll leave it, people like a tangential point to pull someone on anyway.

@op Vega64 4k (your kind of example for future proof; the Vega does 60 FPS steady at 4k max settings bar AA on GTAV), easy, yes it is.

You haven't tried so you know what, nothing.


----------



## Sithaer (Jun 15, 2020)

phill said:


> I don't think that I have ever built any PC I have ever had and thought, yes, that'll last me a good couple of years..  There's always something newer, better, faster around the corner...   More often than not before hand, I'd end up buying something different after 3 to 6 months anyways...  It's what you do with hobbies I find..  Bleed money when you would like to buy something for it!
> 
> But I think from the results in some of the graphs in the thread is that they are all tested at 1080, where you're going to see the biggest difference between things.  I mean, when Ryzen 1 series released it was a massive bounce back from AMD.  Utterly huge and my god has even the 3rd series now come on and done even more for the CPU market and AMD in general.
> Still the one thing though that I don't see the worry with, is that if 60 fps is the nice happy medium for most people gaming, so what if it then hits 140 to 220 fps??  It's going to only matter on a 120Hz, 144Hz or faster refresh rate panel.  For my personal choice, I'd head to higher resolutions than a higher refresh rate as in the years of gaming I've done, I've never had a problem with just 60Hz.   I understand that there are benefits for 144Hz and the like but I would love to know how many people actually go down that route..  I would actually also love to physically see the difference in front of me.  Seeing things on Youtube I think, only show so much of the story at times...
> ...



Pretty much how I also feel/think about this.

My previous i3 4160 system lasted me a little over 3 years since it was all I needed for the games I was playing at the time.

Pretty sure I will do the same if not more with my current 1600x based system I built in May 2018, so it's only 2 years old, and I still see no reason to upgrade the CPU since I don't care about high refresh/competitive gaming.

I've upgraded my monitor ~1 year ago and I had the choice to go for a budget-ish 144Hz 1080p one or go for a higher res/better panel one.
Picked a 75 Hz 29" Ultrawide cause it gives me a nicer immersive feel in the singleplayer games I mainly play and I'm totally happy with it._ 'sure the games not properly supporting this aspect ratio can be annoying but that can be fixed most of the time with tweaks'_

I also have a global fps limit of 74 applied _'monitor freesync range is 40-75',_ so as long as my CPU is capable of pushing those frames its all good with me.
So like you said the difference between say 100-150 fps means nothing to me really.

Most likely I will keep this CPU until 2022 or so then see how the budget-mid range/best price-performance ratio hardware is and then maybe grab some upgrade. _'if I have saved up money that is'_


----------



## cucker tarlson (Jun 15, 2020)

theoneandonlymrk said:


> Despite the delusional, Intel aren't in charge anymore.
> They're not getting away with 10 years of quads anymore.
> What was will be so again but this time there's some weight behind it, and that weight isn't intel.
> 
> ...


not exactly a thread about who is in charge and who is a fool or delusional or whatever your opinion is on anything really.
the topic is "can a slower cpu with more cores outperform a faster with fewer cores in new games" and the answer is "not really".

btw I weirdly agree with you on your signature.I too think ampere is tesla/quadro only and hopper is next gen gaming.


----------



## 95Viper (Jun 15, 2020)

Hello everyone.

Let's keep it on topic.
Have a nice clean conversation.

Thank You and Have a Good Day


----------



## c2DDragon (Jun 15, 2020)

Cyberpunk 2077 might tell if my 6700k is obsolete. Same goes for my 1080Ti.
For now, I don't feel the need to upgrade until I get a slap in the face like when I saw my 3570k bottlenecking my GTX 970 years ago.

I'm not amazed by those 8c/16t+ because games I play are working perfectly fine and upgrading now wouldn't change my gaming experience which is perfectly fine right now.
For me, having more cores doesn't mean more futureproof for gaming. It's all about the whole architecture: IPC, cache, latency. If you take a 10c/20t right now, I'm quite sure it will be outperformed in less than 3 years by new CPUs that give your system higher lows, averages and peak FPS in games.

If consoles are going 8c/16t it may mean that PC gamers would have to upgrade to 8c/16t CPUs for an optimal experience soon, especially because it's well known that PC ports of console games are not well done most of the time but that's another story (the beefier, the better for those).
I guess the minimal specs for gaming will be 4c/8t and 8c/16t the recommended in 2021/2022.
If it's for gaming only I see no point investing (if it's really the good word) in more than 8c/16t.
I wouldn't recommend getting a 4c/8t right now if you want to keep it for 5 years+.


----------



## dirtyferret (Jun 15, 2020)

cucker tarlson said:


> Really ? How much gaming performance is that 8/16 CPU packing at 3.5 g allcore ? 1800x level ?


DF performed a test on a Ryzen 3700x and it was at the Ryzen 5 1500x (4c/8t) level for both single and multi core. Granted, they used an actual Ryzen 3700x, and AMD, Sony, and MS have all stated the console CPU is "based" on Ryzen architecture, not an actual Ryzen desktop CPU.  Historically console CPUs use 18-25W max, so one can easily assume the new console CPU will have a cut down cache and may not be able to peak at 3.5GHz across all cores; in fact I'll be shocked if it can. 

That said I will give AMD the benefit of the doubt and assume developers can optimize the console CPU to perform at Ryzen 1600 levels (original AE not AF).



c2DDragon said:


> Cyberpunk 2077 might tell if my 6700k is obsolete.



My personal opinion is the game won't be a massive CPU drain, as CD Projekt Red have always done a good job on CPU optimization (they got the AMD FX-8 to compete with the i5-2500k).  The GPU side?  I'm sure they will offer a setting that brings every video card out there to its knees (remember Ubersampling and HairWorks?)


----------



## c2DDragon (Jun 15, 2020)

dirtyferret said:


> My personal opinion is the game won't be a massive CPU drain, as CD Projekt Red have always done a good job on CPU optimization (they got the AMD FX-8 to compete with the i5-2500k).  The GPU side?  I'm sure they will offer a setting that brings every video card out there to its knees (remember Ubersampling and HairWorks?)


Yep, a 3080Ti will be welcome I think  and it means getting a new CPU to push it the way it's meant to be


----------



## cucker tarlson (Jun 15, 2020)

dirtyferret said:


> DF performed a test on a Ryzen 3700x and it was at the Ryzen 5 1500x (4c/8t) level for both single and multi core. Granted, they used an actual Ryzen 3700x, and AMD, Sony, and MS have all stated the console CPU is "based" on Ryzen architecture, not an actual Ryzen desktop CPU.  Historically console CPUs use 18-25W max, so one can easily assume the new console CPU will have a cut down cache and may not be able to peak at 3.5GHz across all cores; in fact I'll be shocked if it can.
> 
> That said I will give AMD the benefit of the doubt and assume developers can optimize the console CPU to perform at Ryzen 1600 levels (original AE not AF).
> 
> ...


good point


----------



## FordGT90Concept (Jun 15, 2020)

cucker tarlson said:


> Really ? How much gaming performance is that 8/16 CPU packing at 3.5 g allcore ? 1800x level ?
> Check where 1800x is against recent 6/12 cpus like 10600kf oc
> Less than 8/16 is fine for 2 years,right......just like 1700x was futureproof and 3 years later it's outperformed by amd's own 3300x


3700X (Zen 2) 8c/16t 3.6 GHz
PlayStation 5 (Zen 2) 8c/16t 3.5 GHz
Xbox Series X (Zen 2) 8c/16t 3.6 GHz

Tim Sweeney was already talking about doing things like raytraced audio with all those threads in Unreal Engine 5.



newtekie1 said:


> Everyone said that the last generation too. The fact is the extra cores are going largely unused.


They had 8x1.6 GHz Jaguar cores with no SMT.
There is easily four times more compute power in the next generation consoles.
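
The "four times more compute" claim can be sanity-checked with rough numbers. This is only a back-of-the-envelope sketch: the Zen 2 vs Jaguar per-clock multiplier below is an assumed illustrative figure, not a measured one.

```python
# Rough sketch of the "easily four times more compute" claim.
# The IPC multiplier is an assumption for illustration, not measured data.
jaguar = 8 * 1.6          # last-gen: 8 Jaguar cores at 1.6 GHz, no SMT
zen2 = 8 * 3.5            # next-gen: 8 Zen 2 cores at 3.5 GHz base (PS5 figure above)
ipc_multiplier = 2.0      # assumed Zen 2 per-clock advantage over Jaguar

speedup = (zen2 * ipc_multiplier) / jaguar
print(f"~{speedup:.1f}x")  # ≈ 4.4x, before even counting SMT
```

Clock speed alone gives about 2.2x; any plausible per-clock advantage pushes the total well past 4x, which is why SMT on top of that is almost a bonus.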


----------



## cucker tarlson (Jun 15, 2020)

FordGT90Concept said:


> 3700X (Zen 2) 8c/16t 3.6 GHz
> PlayStation 5 (Zen 2) 8c/16t 3.5 GHz
> Xbox Series X (Zen 2) 8c/16t 3.6 GHz


boost ?


----------



## FordGT90Concept (Jun 15, 2020)

That's base clocks across the board, allegedly.  Series X supposedly can boost to 3.8 GHz.


----------



## cucker tarlson (Jun 15, 2020)

FordGT90Concept said:


> That's base clocks across the board, allegedly.  Series X supposedly can boost to 3.8 GHz.


and 3700x?
4.2-4.4ghz


----------



## dirtyferret (Jun 15, 2020)

FordGT90Concept said:


> That's base clocks across the board, allegedly.  Series X supposedly can boost to 3.8 GHz.


Desktop CPUs have access to desktop motherboards, coolers, and power supplies.  The console version will have none of that.


----------



## FordGT90Concept (Jun 15, 2020)

cucker tarlson said:


> and 3700x?
> 4.2-4.4ghz


And?  3600X is 2c/4t short.  That small difference in boost isn't going to compensate for 25% less hardware resources.  3700X is the closest thing in consumer space to what they have.  If you want to "future proof" that's the bare minimum going forward for gaming.



dirtyferret said:


> Desktop CPUs have access to desktop motherboards and power supplies, the console version will have none of that.


What does that have to do with anything?


----------



## dirtyferret (Jun 15, 2020)

FordGT90Concept said:


> What does that have to do with anything?



Console CPUs have far more in common with laptop CPUs than they do desktop CPUs when it comes to power draw, power management and temp control.  If Ford (no pun intended) said they were coming out with a motorcycle that has a V-6 engine _based _on the F-150, do you think that motorcycle would be getting a 3.3L V-6 259hp engine that weighs and costs as much as the motorcycle itself, or something similar to a 90cu-in V-6 you would find on a Honda or Harley motorcycle?


----------



## Xex360 (Jun 16, 2020)

Bill_Bright said:


> Why would game developers force their customers to ditch millions and millions of quad core processors and spend $100s more for a 6+ core processor, and possibly a new motherboard to support it, which may require new RAM and new Windows license too (if new motherboard is needed)? Their sales would plummet, if ever high in the first place.
> 
> I do agree with you that in the future, to appease the gaming enthusiasts who have the necessary deep pockets, many games will be coded where the absolute best gaming performance can only be achieved with hexa-core, octa-core, deca-core or more CPUs.  But I contend quad-core CPUs will still be able to play them for many years to come. Maybe there will be fewer objects, fewer players, less detailed backgrounds, a smaller field of vision. But good game play (for those not obsessed with the best possible FPS scores on benchmarks) will still be possible.
> 
> ...


I understand your point, but still their results are misleading for BF1 and certainly BFV.
As for proper reviewing I don't consider those reviews as objective, they are too artificial (like testing cars with slicks). It is understandable that we can't test every configuration out there, but we could have some realistic configs like low end, mid- range, high-end and premium, while including the current methodology as reference.
Maybe my definition of review is different to theirs; I believe that reviews are there to help us make informed purchase decisions, and unfortunately those kinds of reviews fail to do so.


----------



## Palladium (Jun 16, 2020)

Current Intel 4C/4Ts in current games are still screaming fast versus whatever CPUs were running Half-Life 1 up until 2002.


----------



## cucker tarlson (Jun 16, 2020)

FordGT90Concept said:


> And?  3600X is 2c/4t short.  That small difference in boost isn't going to compensate for 25% less hardware resources.  3700X is the closest thing in consumer space to what they have.  If you want to "future proof" that's the bare minimum going forward for gaming.
> 
> 
> What does that have to do with anything?


Again,ppl need to read instead of coming with their mind made up.

3700x is a fraction faster in games,and that's with a higher clock.How is that futureproof.Take 3600x against 3700x and they're the same now.

Unlike 1600->3600 and so on.more cores is not really futureproof if they can't deliver now.we already have games that scale on 8 or 10 cores.

In those 3 years since r1700 came out it gained nothing over 7700k or r5 1600 while 3600 and 3300x are beating it in single AND multi threaded games.


----------



## Bill_Bright (Jun 16, 2020)

Xex360 said:


> As for proper reviewing I don't consider those reviews as objective, they are too artificial (like testing cars with slicks). It is understandable that we can't test every configuration out there, but we could have some realistic configs like low end, mid- range, high-end and premium, while including the current methodology as reference.


I did say reviews "need to be" objective. Some are, many are not. Some so called reviews are just reprints of the marketing hype provided by the manufacturer.  We do need more reviews, but someone has to pay for them. And they MUST be without influences. Consumer Reports, for example, sends out "secret shoppers" and buys their review samples from Amazon, Walmart, Best Buy, etc. instead of getting review samples from the manufacturers that might have been cherry-picked and/or specially tweaked to provide good review results. But many review sites just don't have the budget to buy the samples. 

So as consumers, we have to do our homework and try to read as many different reviews as possible.


----------



## TheoneandonlyMrK (Jun 16, 2020)

dirtyferret said:


> Console CPUs have far more in common with laptop CPUs than they do desktop CPUs when it comes to power draw, power management and temp control.  If Ford (no pun intended) said they were coming out with a motorcycle that has a V-6 engine _based _on the F-150, do you think that motorcycle would be getting a 3.3L V-6 259hp engine that weighs and costs as much as the motorcycle itself, or something similar to a 90cu-in V-6 you would find on a Honda or Harley motorcycle?


Yeh, because laptop Ryzen isn't stomping Intel out of that market.
Tim Sweeney says the Xbox clocks are locked to those baseline speeds as a minimum, same with its GPU.

Ryzen can very, very easily manage those clocks on conservative power, as the laptops alone prove.

No PC has special compression hardware = 13 extra Zen cores either, btw.

@cucker tarlson you're still pulling old benchmark scores out to argue a point about unreleased technology, to me a fail, no worries, keep up the good work.


----------



## cucker tarlson (Jun 16, 2020)

why are we talking consoles again Mr.K ?


----------



## TheoneandonlyMrK (Jun 16, 2020)

cucker tarlson said:


> why are we talking consoles again Mr.K ?


Future, it's decided by them, not you.

You're the one talking three year old tech with games made over the last ten years to a spec that was viable now; games released over the last few years mean nothing to the future.

Neither are three year old CPUs relevant.

Games being made now are not all targeting the performance a quad puts out.

Debating with blinders on might be your thing, it isn't mine.


----------



## dirtyferret (Jun 16, 2020)

theoneandonlymrk said:


> Yeh because laptop ryzen isn't stomping intel out of that market.



What does intel or "AMD vs Intel" in laptops have to do with console CPUs????


----------



## cucker tarlson (Jun 16, 2020)

theoneandonlymrk said:


> Future, it's decided by them, not you.
> 
> You're the one talking three year old tech with games made over the last ten years to a spec that was viable now; games released over the last few years mean nothing to the future.
> 
> ...


yes but that's not really the topic see.
the topic WAS old big cpus vs new small cpus until you absolutely derailed the thread with console supremacy nonsense.

and "future is decided by consoles" is not something that is really proved in cpu gaming tests or something that depends on your opinion.Ppl buy a 9900k for the performance over 8700k NOW,not for futureproofing really.

my links not good enough for you ? here,have more from computerbase.3300x is just as good as 3700x or 3600.That's how Ryzen 3000 works in games.But how would you know that.

Intel Core i5-10400F im Test: Benchmarks in Spielen und Anwendungen - www.computerbase.de
waiting for yours since you've been spamming since page one, but you really do not overdo it with data I must say



dirtyferret said:


> What does intel or "AMD vs Intel" in laptops have to do with console CPUs????


and what does it have to do with the topic of big old vs small new in PCs ?   



theoneandonlymrk said:


> Neither are three year old CPU's relevant.


what ?
8700k is 3yrs old and still kicks ass
6700k is 5yrs old and it's still a solid performer
not to mention 2013 cpus like haswell-e that still do very well,far from being "irrelevant"

it seems like anything that is not same core count as a console is irrelevant to you.


----------



## EarthDog (Jun 16, 2020)

theoneandonlymrk said:


> Games being made now are not all targeting the performance a quad puts out.


they aren't? You'll find a lot more titles choke on 4c systems than 6c/12t+... it isn't all, clearly, but we've seen plenty of titles show notably less performance when using a 4c/4t or 4c/8t cpu.


----------



## cucker tarlson (Jun 16, 2020)

EarthDog said:


> they aren't? You'll find a lot more titles choke on 4c systems than 6c/12t+... it isn't all, clearly, but we've seen plenty of titles show notably less performance when using a 4c/4t or 4c/8t cpu.


I think "targeting XX cores" is just plain too broad and impossible to prove.
there's different cpus,architectures and so many different variables.
look at ryzen 3100 vs 3300x and see what a ccx trick has done to gaming performance.incredible.

if Mr.K insists on putting it that way,he may wanna produce some damned data so we know exactly how many cores are pc games targeted for.


----------



## EarthDog (Jun 16, 2020)

Of course there is always some magic under the hood that can make it different... especially with AMD and CCXs... that said, it's pretty clear quad cores are still slower than higher core count, faster CPUs... though this (obviously, but seeing the crowd it needs to be said) varies by game, res, settings, etc. A 25% increase in one title is nice, but that still leaves it well behind similar architectures with higher core counts.


----------



## cucker tarlson (Jun 16, 2020)

EarthDog said:


> Of course there is always some magic under the hood that can make it different... especially with AMD and CCXs... that said, it's pretty clear quad cores are still slower than higher core count, faster CPUs... though this (obviously, but seeing the crowd it needs to be said) varies by game, res, settings, etc. A 25% increase in one title is nice, but that still leaves it well behind similar architectures with higher core counts.


well,they are

but "games are now targeted for XX cores", "because of XX cores on consoles", and claiming a total equivalency between core count on various architectures from various cpu makers, is just plain false.


----------



## TheoneandonlyMrK (Jun 16, 2020)

cucker tarlson said:


> yes but that's not really the topic see.
> the topic WAS old big cpus vs new small cpus until you absolutely derailed the thread with console supremacy nonsense.
> 
> and "future is decided by consoles" is not something that is really proved in cpu gaming tests or something that depends on your opinion.Ppl buy a 9900k for the performance over 8700k NOW,not for futureproofing really.
> ...


They're going that way soon, so in the title, the word future proof makes all my comments valid, but I'll leave you to your blinkered view if that's what you want.

It is your thread so you dictate what's said, second thoughts hell no, public forum.

Like how dirtyferret bemoans the performance at lower watts of Ryzen, so I give a viable demonstrating example of why you're wrong: laptops.


Then we get why mention laptops.

If it's got a core it could be relevant to the debate.


----------



## cucker tarlson (Jun 16, 2020)

theoneandonlymrk said:


> *They're going that way soon *so In the title, the word future proof made all my comments valid but I'll leave you to your blinkered view if that's what you want.
> 
> It is your thread so you dictate what's said, *second thoughts hell no*, public forum.


well,this is the problem.
it's not really an opinion thread unless it's backed by tests,neither has anything to do with consoles.
you are welcome to come up with data to prove they're going to be obsolete "soon".

seems like that 2700x went that way when their own 3300x kicked its butt in games.


I admire your console spirit tho.It's been 6 pages and you never referred to any of the tests provided once.

All you do is react "haha" and keep saying "consoles are the future" in a thread that has nothing to do with that.


----------



## TheoneandonlyMrK (Jun 16, 2020)

cucker tarlson said:


> well,this is the problem.
> it's not really an opinion thread unless it's backed by tests,neither has anything to do with consoles.
> you are welcome to come up with data to prove they're going to be obsolete "soon".
> 
> ...


So why the title? It includes future proof. What future, some delusional non-existent one where PCs stay the same?

I corrected your delusional waffle about years-old games above.

Some games.

Keep stamping those feet.

You looking forward to playing Resident Evil 8, Cyberpunk etc on that quad?

Two games being made now, only one has the slightest chance of a quad hitting minimum specs.

Future does not equal past.

Plus finally, you're pushing an argument about future proofing based on the FACTS of old tech over the last few years. Others could believe this bs, same as back in the day with the same argument in favour of fast single cores, then duals, then pure non-HT i5s; they're all in the past now for gaming as a whole.
So will HT quads be soon; the last few years were the very start of the climb away from quads and a time of relative stagnation.

That stagnation is just about to be stepped away from totally, leaving quads with HT behind; to ignore that is disingenuous to the premise of future proof.


----------



## RandallFlagg (Jun 16, 2020)

theoneandonlymrk said:


> Future , it's decided by them not you.




Mmmm Kay.  Time to put consoles in their place.

So let's talk numbers.  Totaling up the top 3 consoles for this generation (2013+), we have a grand total of ~200M units sold.  That's a lot, isn't it?  Let's keep something in mind though: over that time frame most console owners have bought more than 1 console.  I personally own 3 Xbox One consoles.  

That said, let's take a look at PC sales...

PC unit shipments worldwide | Statista - www.statista.com
Estimated over 261 million PCs were sold in 2019.  

Let's put this in perspective.  Roughly every 9 months, more PCs are sold than all the major consoles sold combined going back 7-8 years (2012/2013): Xbox One, PlayStation 4, Nintendo Switch, Wii U.  

Let's talk about software.

There are estimated to be 1 Billion (yes, one BILLION) Microsoft Office users.  For every major console manufactured in the past 7-8 years, whether still in use or not, there are 5 people using a Microsoft Office product.

On the gaming front, did you know that each month 67 million unique people play League of Legends?  Compare to the 200M consoles sold in the past 7 years.  This is just one game.

So I have a real issue with all this talk about games driving PC sales, the hyper-focus on AAA title performance, and so on.  This is not the stuff that the massive multi-billion person PC market focuses on.  It's just a small sector within that massive market.  

What the majority of people use their PCs for is evident if you're an adult.  Pay your bills.  Buy stuff online.  Comparison shop online.  Facebook / Twitter etc.  Sell your stuff on craigslist, ebay, whatever.  Apply for credit.  Sell your car.  Buy a car. Office applications.  Read the news.  Watch youtube.   On and on.

I will give props to TPU for including at least some benchmarks in their reviews that reflect these massive, overwhelmingly common use cases.  Most sites don't even pretend that these use cases exist.  I am so sick of Cinebench and x264 encoding being the benchmark of performance.  If those were the measures of success in 2011, everyone would have gotten an FX-8XXX chip.
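
The "roughly every 9 months" claim above follows directly from the quoted figures. A quick sanity check, using only the numbers stated in the post (not independently verified):

```python
# Back-of-the-envelope check of the sales figures quoted above.
# Inputs are the numbers from the post itself, not independently verified.
pc_sales_2019 = 261_000_000       # PCs sold in 2019 (per the Statista figure quoted)
consoles_2013_on = 200_000_000    # major consoles sold this generation (as quoted)

# How many months of PC sales equal the whole generation's console sales?
months_to_match = consoles_2013_on / (pc_sales_2019 / 12)
print(f"{months_to_match:.1f} months")  # prints "9.2 months"
```

So at the 2019 rate, PC shipments match 7-8 years of console sales in just over 9 months, which is the ratio the post is leaning on.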


----------



## cucker tarlson (Jun 16, 2020)

doesn't matter what the sales numbers are
new console cpu is a low frequency r3000 part.it wouldn't beat a 3600 were it on desktop.
none of the stuff that mrK has spammed this thread with is either backed by tests or really relevant.
3300x and 10320 are already trading blows with 3600.r4000/RKL 4/8 are gonna beat that futureproof console cpu to death.not to mention zen 4000 6 cores with all six in one ccx and higher IF clock.your futureproof next gen 8/16 is gonna get caught with its pants down in less than a year since launch.


----------



## TheoneandonlyMrK (Jun 16, 2020)

RandallFlagg said:


> Mmmm Kay.  Time to put consoles in their place.
> 
> So lets talk numbers.  Totaling up the top 3 consoles for this generation (2013+), we have a grand total of ~200M units sold.  That's a lot, isn't it?  Lets keep something in mind though, over that time frame most console owners have bought more than 1 console.  I personally own 3 Xbox One consoles.
> 
> ...


90% of that was office PC related, irrelevant, including the figures.

League of Legends plays on a potato, the type of potato a lot own, so it's not surprising that people play what they can with what they have.

Every console was bought just to game on, office tat be damned.

Seems like you're all picking whatever argument fits your bill while minimising any detractors with tangential nonsense which, when tasked on it, is irrelevant for me to mention.

Great arguing style.

This has f all to do with office or other use cases etc etc.

Future proof gaming, it's in the thread title.


----------



## cucker tarlson (Jun 16, 2020)

theoneandonlymrk said:


> Future proof gaming, it's in the thread title.


you're starting to catch on


----------



## RandallFlagg (Jun 16, 2020)

theoneandonlymrk said:


> 90% of that was office PC related, irrelevant, including the figures.
> 
> League of Legends plays on a potato, the type of potato a lot own, so it's not surprising that people play what they can with what they have.
> 
> ...



Intel owns and has always owned gaming performance.  Anyone saying otherwise can't read a chart and is devoid of any critical thinking skills.  A 3 year old 7700K will slap brand new Ryzens on the vast majority of gaming benchmarks and a 2 year old 8700K will absolutely lay waste to them.

AMD lost that battle a long time ago and has *never* won it.    If your primary use case is video games and you buy AMD, you messed up.

My response was in answer to the assertion that consoles drive PC games.  They really don't.  Most PC gamers are not playing the games that poster thinks they are, nor are they playing the games commonly used at these  review sites.


----------



## cucker tarlson (Jun 16, 2020)

RandallFlagg said:


> Intel owns and has always owned gaming performance.  Anyone saying otherwise can't read a chart and is devoid of any critical thinking skills.  A 3 year old 7700K will slap brand new Ryzens on the vast majority of gaming benchmarks and a 2 year old 8700K will *absolutely lay waste* to them.




how about "8700k still has not lost a single step against the much newer and higher core count ryzen 3000 in gaming" conveys the same thought but without fecal matter references


----------



## TheoneandonlyMrK (Jun 16, 2020)

RandallFlagg said:


> Intel owns and has always owned gaming performance.  Anyone saying otherwise can't read a chart and is devoid of any critical thinking skills.  A 3 year old 7700K will slap brand new Ryzens on the vast majority of gaming benchmarks and a 2 year old 8700K will absolutely lay waste to them.
> 
> AMD lost that battle a long time ago and has *never* won it.    If your primary use case is video games and you buy AMD, you messed up.
> 
> My response was in answer to the assertion that consoles drive PC games.  They really don't.  Most PC gamers are not playing the games that poster thinks they are, nor are they playing the games commonly used at these  review sites.


In reality though, the thin, nigh invisible margin of gamers that actually play at 144hz 1080-1440p are a small minority, as are those that need an extra 5-10 FPS.
So for 95% of gamers it does not matter which has max FPS when the games are in fact running fine regardless.
You're in a niche with short term memory and a bias against the future it seems.

Plus you are getting a bit biased, us-v-them-ish; who made brands the point?

I thought it was the viability of more cores for future proofing in debate, not. Hahaaaa,

Was always an AMD troll fest here.


----------



## cucker tarlson (Jun 16, 2020)

theoneandonlymrk said:


> I thought it was the viability of more cores for future proofing in debate, not. Hahaaaa,


it was
until somebody brought up console cpus....and spammed the thread......

I wonder what is a smaller minority - pc 144hz gamers or people who buy an 8/16 for gaming


----------



## RandallFlagg (Jun 16, 2020)

cucker tarlson said:


> how about "8700k still has not lost a single step against the much newer and higher core count ryzen 3000 in gaming" conveys the same thought but without fecal matter references



Uhh.. "Lay Waste" does not refer to fecal matter.  It refers to mass destruction.  I can see how it might be interpreted to mean fecal matter though...


----------



## xkm1948 (Jun 16, 2020)

Gaming sure. Productivity that can leverage good multithreading definitely good to have the 8 cores.


----------



## RandallFlagg (Jun 16, 2020)

theoneandonlymrk said:


> In reality though, the thin, nigh-invisible margin of gamers that actually play at 144Hz 1080p-1440p are a small minority, as are those that need an extra 5-10 FPS.
> So for 95% of gamers it does not matter which has max FPS when the games are in fact running fine regardless.
> You're in a niche with short-term memory and a bias against the future, it seems.
> 
> ...



Not hardly.  I've been seeing people like you post about more cores for over 20 years.  Dual Celerons.  Athlon X2.  X4.  Phenom.  FX.  I had them all.  

As far as future proof on games goes (thread topic), so you are ok being 10-20% slower in FPS because people don't have top end 2080s.  I get it.

So what happens next year when the 3060 performs on par with this years 2080?  That still not gonna matter?

How about year after next when the 4050 outperforms this years 2080?   

Where's your future proofing?  Still relying on games becoming multi-threaded?  All that means is that you do not understand how software works.


----------



## TheoneandonlyMrK (Jun 16, 2020)

RandallFlagg said:


> Not hardly.  I've been seeing people like you post about more cores for over 20 years.  Dual Celerons.  Athlon X2.  X4.  Phenom.  FX.  I had them all.
> 
> As far as future proof on games goes (thread topic), so you are ok being 10-20% slower in FPS because people don't have top end 2080s.  I get it.
> 
> ...


And you prove yourself wrong.

You had all those chips. Why not still? Because the naysayers you heard were right: at a certain point they were dropped pretty fast as useless.

Then some waffle about GPUs. TANGENTIAL alarm sound!

But on those, are they standing still? No, we agree on something.

And all those new GPUs are going to need, and be able to leverage, far more SYSTEM bandwidth than an old quad has available, like I said pages ago.

You don't understand systems.

They always have a weakest link.

And before you start, I'm not convinced any pc made now will be viable in five years personally.

I would say though that any i7 or 8 core ryzen isn't a bad buy imho, and they weren't three years ago, I think at different times you can expect a different amount of use out of a system.

At the start of the Core era a q6600 would end up lasting someone years, same with nehalem and many others.

But sometimes things are in such a state of flux due to innovation, nodes and competitive pressures that to expect more than a few years was hopeful.
As is now.
Intel and AMD have stuff coming near term that could spell the R7 1700's, and any quad's, march to irrelevance.

Of course if we all just decided to play League of Legends, WoW and CS:GO for life, your argument that this system (a quad) is valid holds weight eternally.

I now have two q6600 keyrings btw.


----------



## Rahnak (Jun 16, 2020)

RandallFlagg said:


> Not hardly. I've been seeing people like you post about more cores for over 20 years. Dual Celerons. Athlon X2. X4. Phenom. FX. I had them all.



You say you had them all but you also say Intel has always owned gaming performance.



RandallFlagg said:


> If your primary use case is video games and you buy AMD, you messed up.



Strongly disagree. There are other considerations besides performance when purchasing a cpu, such as pricing.



RandallFlagg said:


> So what happens next year when the 3060 performs on par with this years 2080? That still not gonna matter?



That's some very strong wishful thinking.


Legit question: How many of you don't have any programs running when you play games?


----------



## cucker tarlson (Jun 16, 2020)

Rahnak said:


> Legit question: How many of you don't have any programs running when you play games?


all the time
a dozen windows open in firefox on my secondary monitor
all the gaming software from logitech,afterburner,xtu,sound blaster,apo and geforce experience


----------



## EarthDog (Jun 16, 2020)

Rahnak said:


> Legit question: How many of you don't have any programs running when you play games?


If I can close anything, I will... though the reality is I don't have to (16c/16t, 32GB RAM, etc.).

That said, even with a 6c/12t CPU, I don't see much of a difference if I have Chrome up with a dozen tabs and I'm watching YT/Twitch etc. on the other monitor. 8c/16t is a great spot to be in today and for the next few years. I'm hoping now that we see more cores/threads in a console we'll see quick adaptation moving forward, but I'm not holding my breath on that being a dominant driver of cores and threads either.


----------



## danbert2000 (Jun 16, 2020)

Are people forgetting that the consoles will have some of the processor locked up for the OS? And that Microsoft said that most devs were looking at running without hyperthreading in order to hit the higher clockspeed? If Xbox Series X games are going to run on 7 cores, then I think 4c/8t will work just fine at 60 fps for the foreseeable future. It's also very unlikely that devs are going to peg a 3700X-style CPU when running games. Especially with raytracing in the mix, these will likely be completely GPU bound. Which means that my 5775C will likely still play all the games I want for 5+ years at 60 fps.





Full Xbox Series X specs: 3.8GHz Zen 2 CPU, 16GB GDDR6, 52CU Navi GPU (www.tweaktown.com): "Microsoft reveals full specs of the Xbox Series X, uncovering beastly performance of its next-gen Xbox."




"The Xbox Series X's CPU has two modes: It can run at 3.8GHz on up to 8 cores of the Zen 2 CPU with SMT off. With simultaneous multi-threading on, developers can hit 3.6GHz using all 8-core 16 threads of the Zen 2 CPU. Digital Foundry says the system won't have a boost clock mode to raise frequencies and perf is locked in these modes. Microsoft expects most developers to use the non-SMT option to ensure more streamlined backward compatibility with current Xbox One games, which are designed for 7 cores."


----------



## cucker tarlson (Jun 16, 2020)

danbert2000 said:


> Which means that my 5775c will likely still play all the games I want for 5+ years at 60 fps.


let's not go crazy on the christmas letter.


----------



## EarthDog (Jun 16, 2020)

danbert2000 said:


> with current Xbox One games, which are designed for 7 cores."


With this said, I amend a previous statement.

If games have been designed for 7 cores for however long the Xbox One has been out, I have even less faith this will push things forward faster on PC...

enjoy the core wars, people... yawn.


----------



## Rahnak (Jun 17, 2020)

danbert2000 said:


> Microsoft expects most developers to use the non-SMT option to ensure more streamlined backward compatibility with current Xbox One games, which are designed for 7 cores.



Yes.. but backwards compatibility won't last forever. I'll give it a year, two max. And that's really pushing it. Microsoft expects devs to pick the higher clocks mostly for launch titles (or those not very demanding games like 2D platformers).
Sony had to put a hardware module for data decompression on the PS5 because doing it on the CPU would take an outrageous number of cores. Not sure what the strategy is on the Microsoft side. But don't worry, some studios will find a way to put those cores to use.



EarthDog said:


> If games have been designed for 7 cores for God knows how long the Xbox one is out, i have even less faith this will push things forward faster on PC...



Those were really, really weak cores though. Netbook grade stuff.


----------



## oxrufiioxo (Jun 17, 2020)

dirtyferret said:


> Console CPUs have far more in common with laptop CPUs then they do desktop CPUs when it comes to power draw, power management and temp control.  If Ford (no pun intended) said they were coming out with a motorcycle that has an V-6 engine _based _on the F-150 do you think that motorcycle would be getting a 3.3L V-6 259hp engine that weighs and costs as much as the motorcycle itself or something similar to a 90cu-in V-6 you would find on a honda or harley motorcycle?



yeah and the Laptop ryzen 8 core can beat a 9700k in R20 at lower clocks than what the consoles run them at so........


----------



## RandallFlagg (Jun 17, 2020)

Rahnak said:


> You say you had them all but you also say Intel has always owned gaming performance.



And that is incompatible how?  I do not buy just for games, in fact my next PC will not have a GPU anywhere near capable of stressing a modern 4C/4T CPU.  I pointed out that Intel does better on the types of applications people actually use (Office, browsers), but the topic of the thread is how moar coars has fared over time in games.  So I am talking about games.  Most of these so called 'productivity' arguments are flaccid though.    It is *AMAZING* how similar *benchmarks* of the FX-8320/8350 look to current *benchmarks* between Ryzen 3700X/3800X vs i7-10700/700K.  AMD high core count CPUs have always done well in applications that do the same thing over and over - Handbrake, and Cinebench.  Those are their flagship benchmarks.   They have always sucked at applications that do otherwise unpredictable branches (like games).  

Thing is, back then we knew most people didn't run Handbrake or do Video editing, so it was given little credence when AMD did well on those same benchmarks.  Today, freakishly, we are told it is important and those same benchmarks are front and center on virtually every review site.  AMDs marketing has worked.  Group think, herd mentality, that is a thing.  








Rahnak said:


> Strongly disagree. There are other considerations besides performance when purchasing a cpu, such as pricing.



There's also longevity, TCO (total cost of ownership).   Just look at these threads, I keep seeing 8700K owners saying "I have no reason to upgrade" while 2700X owners constantly say "I can't wait should I go for 3600X now or...".   I even see people with 3770K and 4770K saying they don't feel any need.  Don't see too many FX-8350 users around.  What does this suggest to you?



Rahnak said:


> That's some very strong wishful thinking.



It's based on what's happened in the past. Look at TPUs database.  2060 = 1080 (100% match).  1060 = 95% of 980 performance.  The only laggard was the 960 vs 780.  There's no reason to think that won't happen again given that the next gen will get arch, memory, and process tech boosts.  



Rahnak said:


> Legit question: How many of you don't have any programs running when you play games?



How often do you switch to one of those browsers or apps and actually use it during intense game play?   How much CPU do these apps use while your PC is running a game?   I found that the main thing those apps use is RAM, and staying away from swap is important.  It is likely that the correct answer to your implied assumption is 32GB RAM, not 8 cores.


----------



## cucker tarlson (Jun 17, 2020)

oxrufiioxo said:


> yeah and the Laptop ryzen 8 core can beat a 9700k in R20 at lower clocks than what the consoles run them at so........


what does cinebench have to do with motorcycles?



RandallFlagg said:


> How often do you switch to one of those browsers or apps and actually use it during intense game play?   How much CPU do these apps use while your PC is running a game?   I found that the main thing those apps use is RAM, and staying away from swap is important.  It is likely that the correct answer to your implied assumption is 32GB RAM, not 8 cores.


absolutely.
a game will use 6-8, maybe 10 gigs.
on several occasions I came close to filling 16gb with other stuff while a game was running, but the cpu still ran fine and games played smoothly



Rahnak said:


> Strongly disagree. There are other considerations besides performance when purchasing a cpu, such as pricing.


resale value is as important as pricing.



RandallFlagg said:


> There's also longevity, TCO (total cost of ownership).   Just look at these threads, I keep seeing 8700K owners saying "I have no reason to upgrade" while 2700X owners constantly say "I can't wait should I go for 3600X now or...".   I even see people with 3770K and 4770K saying they don't feel any need.  Don't see too many FX-8350 users around.  What does this suggest to you?


there's even one that constantly spams with 8 cores being futureproof because of consoles but upgraded 2700x to 3000 even tho he plays at 4K/60


----------



## FordGT90Concept (Jun 17, 2020)

cucker tarlson said:


> Take 3600x against 3700x and they're the same now.


Because games are made for 8-core Jaguar processors which run great on 3.0+ GHz quad-core w/ SMT (8 concurrent threads).  New games are likely to spawn 16 threads.


Here's an example of what I'm talking about:  PlayStation 3/Xbox 360 era, the consoles had about 512 MiB total RAM.  The games that came to PC were almost exclusively 32-bit and very, very few went past 4 GiB of memory because of it.  Enter Xbox One and PlayStation 4 which had 8+ GiB of memory, 64-bit has become the norm, and many games consistently use >4 GiB of VRAM alone.  AMD even declared the 4 GiB cards obsolete with the 5500 XT launch.

Consoles are the lowest common denominator.  Whenever they make a leap ahead, software follows in porting to PC.  We witnessed this with memory and tearing down the 32-bit barrier in gaming.  We're likely to witness it again in terms of CPU performance with this coming generation.


----------



## oxrufiioxo (Jun 17, 2020)

Sorta a pointless thread really... Some users might be able to get another 3-4 years out of a 9900K, but I already want something faster, so for me it lasted less than 2 years before wanting more performance. Not very futureproof.

I have a 6700K-based system and when paired with a 2080 Ti it sucks, but if you have a slower gpu, maybe even a 2070-level one, it's more than enough.

Also a lot of people bought a 7600K over Ryzen 1600/1700s, and 3 years later that was a terrible choice, but I guess they got that extra 5-10% for a couple years so there is that at least... I'm sure for some it's still more than enough, while others who have upgraded to a faster gpu are stuttering away with their 4 threads......




cucker tarlson said:


> what does cinebench have to do with it motorcycles ?



He was harping on about how laptop vs desktop cpus aren't comparable cuz of powah etc, when at this point in 2020 we already have ryzen-based laptop 8 cores that beat or match a 9600K/9700K in all-core workloads at stock settings.  The Jaguar cores in last-gen systems couldn't even beat a 2500K, so comparing this generation vs last generation is pointless.


----------



## cucker tarlson (Jun 17, 2020)

FordGT90Concept said:


> Because games are made for 8-core Jaguar processors w


no.because of how r3000 works.
and pc games are not made for jaguar processors
they scale on 9900k and 10900k,even +10 cores too


----------



## Rahnak (Jun 17, 2020)

RandallFlagg said:


> And that is incompatible how?



Because it's not true. AMD has beaten Intel in the past. Granted, it was a long time ago, but it happened.



RandallFlagg said:


> There's also longevity, TCO (total cost of ownership). Just look at these threads, I keep seeing 8700K owners saying "I have no reason to upgrade" while 2700X owners constantly say "I can't wait should I go for 3600X now or...". I even see people with 3770K and 4770K saying they don't feel any need. Don't see too many FX-8350 users around. What does this suggest to you?



Yeah, those are important things, no doubt. I kept my 2500k for 8 years and it served me well. And sure, I can give you a couple suggestions:
1. FX line was crap. Everyone knows it. Not sure why you're bringing it up.
2. Ryzen 1000/2000 series owners consider upgrading because the performance gains from Ryzen 3000 on certain situations is substantial and because they don't have to switch motherboards. It's a _relatively _low cost upgrade.
3. Upgrading a 3770K/4770K is not a low cost upgrade. It requires a new motherboard and ram as well. Probably new cooler. If they could drop anything faster in those motherboards they would do it in a heartbeat.
4. The reason I didn't upgrade my 2500k for so long (besides cost) was because Intel didn't offer me enough of a performance boost. Not until Ryzen came along and made Intel step up (but between a 9700K and 3700X I chose the latter for performance/value reasons).



RandallFlagg said:


> It's based on what's happened in the past. Look at TPUs database. 2060 = 1080 (100% match). 1060 = 95% of 980 performance. The only laggard was the 960 vs 780. There's no reason to think that won't happen again given that the next gen will get arch, memory, and process tech boosts.


760 -> 680 Nope
960 -> 780 Nope
1060 -> 980 Sure, pascal was a massive performance leap
2060 -> 1080 The thing about the 2060 is that the MSRP was just $30 lower than the 1070, so I'm gonna say this is a nope as well.

Now don't get me wrong, I hope you're right and I'd love to see a 3060 at $349 in the range of a 2080. And with the process shrink I believe it's totally doable. I certainly expect a massive RT performance uplift. I just don't see enough competition from AMD to push nvidia that hard. I hope I'm wrong though.



RandallFlagg said:


> How often do you switch to one of those browsers or apps and actually use it during intense game play? How much CPU do these apps use while your PC is running a game? I found that the main thing those apps use is RAM, and staying away from swap is important. It is likely that the correct answer to your implied assumption is 32GB RAM, not 8 cores.



Personally, not that often, but it happens. If I had a second monitor, I'd be constantly doing other stuff there. Let me ask another question then. It's fine to buy more ram than you currently need, but buying more cores than you currently need is not..? Kinda weird.
Now, it's easier to use more RAM than to use more cores, I am fully aware of that. But it's not impossible and I'm fairly optimistic it will happen, especially now that consoles have more to play with (them being the lowest common denominator and all).



cucker tarlson said:


> resale value is as important as pricing.



Yeah, Intel has much better resale value, not gonna argue that. But I feel like, and I could be way off here, that the percentage of people that sell their system is pretty small. And those that do, it's usually somewhere within the first 3 years of ownership if they intend to get any meaningful amount back.
Are you gonna sell your system when you upgrade?


----------



## cucker tarlson (Jun 17, 2020)

Rahnak said:


> Are you gonna sell your system when you upgrade?


of course

though in parts
cpu is gonna sell like hotcakes
ram too

board will take longer and not gonna get much back.but it was the cheapest part.


----------



## FordGT90Concept (Jun 17, 2020)

cucker tarlson said:


> no.because of how r3000 works.
> and pc games are not made for jaguar processors
> they scale on 9900k and 10900k,even +10 cores too


Games run fine on an i7-6700K today because they're still made for Jaguar processors.  That's going to change with PS5/XSX launch and that's my point: the bar is being raised a lot.

9900K will be fine because it's more potent than 3700X.  10900K might be okay but that base clock is concerning.


----------



## cucker tarlson (Jun 17, 2020)

FordGT90Concept said:


> Games run fine on a 6700K today because they're still made for Jaguar processors.  That's going to change with PS5/XSX launch and that's entirely my point: the bar is being raised a lot.


if 6700k is good cause "they are made for jaguar" then why does fx run like shieeeeeeeeeeeet

and my point is that a pc cpu (e.g. 8700K) that is faster than another pc cpu (e.g. 3700x) is not gonna be made obsolete by an even slower cpu they'll put in consoles.


----------



## FordGT90Concept (Jun 17, 2020)

Because it takes more clocks to do the same instructions.  More clocks consumed = more time consumed rendering each frame = fewer frames per second.  Real time rendering is a race against time.


----------



## cucker tarlson (Jun 17, 2020)

FordGT90Concept said:


> Because it takes more clocks to do the same instructions.


how many more? what is the actual, exact difference?


----------



## FordGT90Concept (Jun 17, 2020)

There was a nice chart that broke it all down but I can't find it.  This has the specific cycles per instruction for many architectures:


			https://www.agner.org/optimize/instruction_tables.pdf
		


Example: 32-bit IMUL in Bulldozer takes two cycles where in Ivy Bridge, the same instruction would only take one cycle.


----------



## cucker tarlson (Jun 17, 2020)

how do you even know new games will be written for 8/16 mode not 8/8 mode



FordGT90Concept said:


> There was a nice chart that broke it all down but I can't find this.  This has the specific cycles per instruction for many architectures:
> 
> 
> https://www.agner.org/optimize/instruction_tables.pdf
> ...


so you can't tell me the exact difference between jaguar and fx and skylake yet you cite that very difference as the reason.
ivy is not part of this discussion in any sense.


----------



## FordGT90Concept (Jun 17, 2020)

cucker tarlson said:


> how do you even know new games will be written for 8/16 mode not 8/8 mode


for (int i = 0; i < ProcessorCount; i++)
  new Worker();

Run that code on a single-thread processor and you'll use one thread.  Run it on a 16-thread processor and you'll get 16.

Also, the reason why Sony and Microsoft put a vastly better processor in is because Jaguar's anemic processing power was the #1 complaint developers had on developing for their respective consoles.  Both consoles now get what is effectively a desktop processor with SmartShift technology to balance power load between GPU and CPU components of the APU.




cucker tarlson said:


> so you can't tell me the exact difference between jaguar and fx and skylake yet you cite that very difference as the reason.


They're all on that sheet (Jaguar, Bulldozer, Piledriver, Steamroller, and Excavator).  Compare all you want.  There's dozens of instructions listed for every processor.  Jaguar begins on page 111.

You asked "why does fx run like shieeeeeeeeeeeet" and I showed you why, by comparing FX to a chip from the era it was competing against.


----------



## cucker tarlson (Jun 17, 2020)

FordGT90Concept said:


> Compare all you want.


not how this works.

a r7 2700 class console cpu will make pc cpus obsolete - I'll believe it when I see it.

but judging how 3300x makes 3600 and 3700x sweat I'd wager that we'd sooner see r4000/RKL 4/8 cpus beat 3700x than we'll see that console cpu make smaller chips obsolete.

I get that console games have to be optimized for one exact cpu model for the entire generation that lasts years.
I don't get why you think pc games have to be optimized for one specific cpu too.


----------



## FordGT90Concept (Jun 17, 2020)

Belief has nothing to do with it:

Inside Xbox Series X: the full specs (www.eurogamer.net): "This is it. After months of teaser trailers, blog posts and even the occasional leak, we can finally reveal firm, hard …"





> Microsoft is promising a 4x improvement in both single-core and overall throughput over Xbox One X - and CPU speeds are impressive, with a peak 3.8GHz frequency. This is when SMT - or hyper-threading - is disabled. Curiously, developers can choose to run with eight physical cores at the higher clock, or all cores and threads can be enabled with a lower 3.6GHz frequency. Those frequencies are completely locked and won't adjust according to load or thermal conditions - a point Microsoft emphasised several times during our visit.


XSX is 3.8 GHz w/o SMT (8 threads, 7 available for game) and 3.6 GHz w/ SMT (16 threads, 15 available for game).









Inside PlayStation 5: the specs and the tech that deliver Sony's next-gen vision (www.eurogamer.net): "Sony has broken its silence. PlayStation 5 specifications are now out in the open with system architect Mark Cerny deli…"





> Discussing the nature of CPU and GPU clock speeds is going to require some careful explanation because Cerny actually described frequencies as being 'capped'. For the CPU, 3.5GHz is at the top end of the spectrum, and he also suggests that this is the typical speed - but under certain conditions, it can run slower.


PS5 is up to 3.5 GHz w/ SMT (16 threads).

PS5 is the weaker of the two; XSX clearly has a more expensive APU and likely higher power demands.


----------



## cucker tarlson (Jun 17, 2020)

it does at this point, cause we don't have any tangible real-world testing data.

and to get this straight - you're the one that believes it. I said I'll have to see it.


----------



## Vya Domus (Jun 17, 2020)

FordGT90Concept said:


> Tim Sweeney was already talking about doing things like raytraced audio with all those threads in Unreal Engine 5.



I think he was referring to that custom RDNA CU for that, you don't want to do any kind of raytracing on a CPU.



RandallFlagg said:


> I've been seeing people like you post about more cores for over 20 years.  Dual Celerons.  Athlon X2.  X4.  Phenom.  FX.  I had them all.



The fact is, if one had picked a Core 2 Quad over a Core 2 Duo back in the day, they would have gotten way more mileage out of it, despite the fact that the Core 2 Duo likely outperformed the quad core at the time of their release. It happened then and it's going to happen again; it's inevitable, more cores = more performance in the long run. Despite how everyone desperately tries to disprove it, there is such a thing as futureproofing.


----------



## Rahnak (Jun 17, 2020)

@cucker tarlson With consoles getting effectively around 4x the cpu power, you don't see requirements on the PC side going up when games designed from the ground up for the next generation come out?
Or do you suppose we're good on 4c/8t cpus for another console gen?


----------



## EarthDog (Jun 17, 2020)

Rahnak said:


> Or do you suppose we're good on 4/8t cpus for another console gen?


4c/8t is already long in the tooth on some titles....


----------



## FordGT90Concept (Jun 17, 2020)

Vya Domus said:


> I think he was referring to that custom RDNA CU for that, you don't want to do any kind of raytracing on a CPU.


You don't need many rays for sound; you need a lot for light.  AMD debuted tech to do it back in 2013:








AMD TrueAudio - Wikipedia (en.wikipedia.org)



Can be done on CPU, ASIC, or GPGPU.


Virtually no games implemented it, because there weren't any hardware resources to spare on consoles and it was an unnecessary risk on PCs.  Now, with an abundance of processing power on consoles, there's no reason not to do it.  TrueAudio Next will likely become mainstream over the next decade.


----------



## TheoneandonlyMrK (Jun 17, 2020)






cucker tarlson said:


> if 6700k is good cause "they are made for jaguar" then why does fx run like shieeeeeeeeeeeet
> 
> and my point is that a pc cpu (e.g. 8700K) that is faster than another pc cpu (e.g. 3700x) is not gonna be made obsolete by an even slower cpu they'll put in consoles.


Still stamping those feet, while staring through those tinted glasses. FX is 8 years old btw, but still relevant in your eyes to a debate about the future lol. hahaaaaaa

thought I would leave this here, a short video with common sense involved, not bias.


----------



## Assimilator (Jun 17, 2020)

I agree with @cucker tarlson for the simple reason that very few applications are capable of scaling with available thread count, and games aren't among those applications.

Applications like Blender and CineBench are outliers, because they have predictable workloads that don't depend on user input, and thus can easily be partitioned to use as many threads as possible. Crucially, those threads don't depend on what the other threads are doing: they run to completion, then a final worker thread collects all their output and packages it up into a final rendered object.

This is completely different to game engines, which are built around reacting to what the player is doing. I'm not going to go into implementation details, but at the end of the day, the more threads your engine has, the more dependencies those threads will have on each other, and the more complex and time-consuming it becomes to keep those threads in sync with the overall game engine state.

To put it more bluntly: scaling the number of threads in a user-input-dependent application linearly will always result in more time taken to keep the application's state consistent, and will generally result in an exponential increase in the difficulty of keeping that state consistent. *This is why adding more cores or threads does not magically improve performance.*

My personal opinion is that most general-purpose game engines will only ever be able to use up to 8 threads effectively. Games might spawn additional threads depending on their needs (e.g. an RPG with many NPCs could spin up more threads to handle those NPCs' AI calculations) but those extra threads will be relatively lightweight and simple compared to the main engine threads.

The fact that Microsoft is offering an 8c/8t higher clockspeed option for devs bolsters my belief that 8 is the magic number. If they're right, and I'm right, 4c/8t will be more than sufficient for the next half-decade.

But something crucial to remember is that core and thread count alone aren't a useful metric: ye olde i7-2600K may be a 4c/8t part but it's nowhere near the performance of an R3 3300X. As Microsoft and I believe, clockspeed (or rather IPC) is still king over high thread counts, at least when it comes to games.


----------



## cucker tarlson (Jun 17, 2020)

theoneandonlymrk said:


> Still stamping those feet, while staring through those tinted glasses. FX is 8 years old btw, but still relevant in your eyes to a debate about the future lol. hahaaaaaa
> 
> thought I would leave this here, a short video with common sense involved, not bias.


Read the thread, it's you and ford who brought up consoles and jaguar cores. I'm reporting your troll activity.

It's a discussion on how last-gen big cpus are doing against new small chips, not how you futureproof a console.

You've been spamming this thread since the beginning with irrelevant opinions


----------



## EarthDog (Jun 17, 2020)

Assimilator said:


> My personal opinion is that most general-purpose game engines will only ever be able to use up to 8 threads effectively. Games might spawn additional threads depending on their needs (e.g. an RPG with many NPCs could spin up more threads to handle those NPCs' AI calculations) but those extra threads will be relatively lightweight and simple compared to the main engine threads.


Since it's patch Tuesday for pubg, I played COD Warzone last night... each of my 16 cores (HT disabled) was pegged equally, with the cpu hitting 70-80% use. One of the first games I've seen do that.



Assimilator said:


> If they're right, and I'm right, 4c/8t will be more than sufficient for the next half-decade.


Define sufficient? Again, we're seeing several titles today that are held back by the CPU. I'd hate to own a 4c/8t part today and game on it, let alone in 5 years... no way.


----------



## Vya Domus (Jun 17, 2020)

Assimilator said:


> *This is why adding more cores or threads does not magically improve performance.*



Here's where most people are just wrong. It does "magically" increase performance; games use concurrency throughout their engines. Basically every thread spawned can and will run on other cores automatically. You can take any CPU, disable half of its cores, and you'll always see performance degrade to some extent, no matter how many cores that CPU had to begin with.

Now whether or not that performance differential is big is questionable, but it's there.
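That "it's there, but maybe small" differential is essentially Amdahl's law. A quick sketch, with the caveat that the 60% parallel fraction is an invented number, purely illustrative:

```python
def amdahl_speedup(parallel_fraction, cores):
    # Ideal speedup when only part of the per-frame CPU work
    # scales across cores; the serial remainder never shrinks.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Suppose 60% of a frame's CPU time parallelizes. Halving the core
# count from 8 to 4 costs something, but nowhere near 2x:
s8 = amdahl_speedup(0.6, 8)   # ~2.11x over one core
s4 = amdahl_speedup(0.6, 4)   # ~1.82x over one core
```

So disabling half the cores always shows up in the numbers, but how much depends entirely on how parallel the engine's frame work actually is.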


----------



## TheoneandonlyMrK (Jun 17, 2020)

cucker tarlson said:


> Read the thread, it's you and Ford who brought up consoles and Jaguar cores. I'm reporting your troll activity.
> 
> It's a discussion on how last-gen big CPUs are doing against new small chips, not how you futureproof a console.
> 
> You've been spamming this thread since the beginning with irrelevant opinions.


If opinions don't match or align with your story they're irrelevant, nice.
I've been on topic every post!?

More cores, more future-proof gaming.


----------



## dirtyferret (Jun 17, 2020)

So while we await the countdown on this thread being locked, perhaps we should start a new thread and have a courteous discussion on a less volatile subject like AMD vs Nvidia, air vs water cooling, the current political situation, and/or religion?


----------



## cucker tarlson (Jun 17, 2020)

theoneandonlymrk said:


> More cores, more future-proof gaming.


page 8, still didn't read the OP or watch that video, and is proud to shove his opinion as facts.
I linked more - never bothered either.

how is that physics-on-CPU-cores thing doing? it's been 6 months. did we get any games? or even announcements?



EarthDog said:


> Id hate to own a 4c/8t part today and game, nonetheless 5 years...no way.


if you're talking a fast 4/8 with fast RAM you're fine, but don't expect to lock high-refresh-rate vsync at 100/120.
but neither can you do it on a slower 8/16 - you need both fast and many cores for that.


----------



## TheoneandonlyMrK (Jun 17, 2020)

cucker tarlson said:


> page 8, still didn't read the OP or watch that video, and is proud to shove his opinion as facts.
> I linked more - never bothered either.
> 
> how is that physics-on-CPU-cores thing doing? it's been 6 months. did we get any games? or even announcements?
> ...


Oh I read it all.

Start out with a confused argument, end with one.

8 pages and you still don't accept that the future isn't set in stone.

8 pages and you're still trying to push old-game benches made for current systems as proof that you're set for five years on a quad.

8 pages of you denying that change is upon us because it doesn't suit your narrative.

If I agreed, which I don't, we wouldn't be debating still.


----------



## Rahnak (Jun 17, 2020)

@cucker tarlson The answer to the OP is fairly simple. Mostly yes. In the video you posted in the OP, and comparing apples to apples, the 7700K overclocked to 5.1GHz is consistently behind the stock 8700K/10600K.


----------



## Vya Domus (Jun 17, 2020)

Rahnak said:


> the 7700K overclocked to 5.1GHz is consistently behind the stock 8700K/10600K.



Nooooo. The truth! No amount of single-core performance can make up for having extra cores, which means it's no longer the primary limiting factor.

Let's be real, this was never about more cores being more future-proof, this was just another AMD vs Intel shitshow.


----------



## cucker tarlson (Jun 17, 2020)

Rahnak said:


> @cucker tarlson The answer to the OP is fairly simple. Mostly yes. In the video you posted in the OP, and comparing apples to apples, the 7700K overclocked to 5.1GHz is consistently behind the stock 8700K/10600K.


2500K v. FX-8 - faster core wins
7700K v. 1700 - faster core wins
8700K v. 3700X - faster core wins

AMD caught up MOSTLY due to IPC, clocks and latency, not cores. Look at the 3300X and you see the whole truth.


----------



## FordGT90Concept (Jun 17, 2020)

Assimilator said:


> This is completely different to game engines, which are built around reacting to what the player is doing. I'm not going to go into implementation details, but at the end of the day, the more threads your engine has, the more dependencies those threads will have on each other, and the more complex and time-consuming it becomes to keep those threads in sync with the overall game engine state.


Each thread is made to do less, so collectively they get done in less time.  The last generation of consoles (Xbox One and PlayStation 4) made game developers update engines for parallelism because the Jaguar cores, individually, were weak; collectively, they're pretty dang strong.  The next generation expands on parallelism while making every thread process faster.  In other words, the groundwork for 8-16 threads in games was already laid.  They're not going to remove parallelism from engines just because the next generation has higher throughput.  They're going to expand upon it, doing things they couldn't do before because of hardware constraints.
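The "individually weak, collectively strong" point is just throughput arithmetic. A toy comparison with invented numbers (the 1.0 and 1.8 per-core rates are not real benchmarks):

```python
def collective_throughput(cores, per_core_rate):
    # Total work per unit time if the workload parallelizes perfectly.
    return cores * per_core_rate

jaguar_like = collective_throughput(8, 1.0)  # eight weak cores
fast_quad = collective_throughput(4, 1.8)    # four cores, each ~1.8x faster

# On perfectly parallel work, the eight weak cores come out ahead -
# which is why engines built for them already scale past 4 threads.
```

Of course this only holds for the parallel portion of the work; any serial remainder favors the faster cores, which is the other half of this thread's argument.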



Assimilator said:


> The fact that Microsoft is offering an 8c/8t higher clockspeed option for devs bolsters my belief that 8 is the magic number.


That's mostly for Xbox One emulation.


----------



## cucker tarlson (Jun 17, 2020)

Vya Domus said:


> Nooooo. The truth ! No amount of single core performance can make up for having extra cores which means that is no longer the main limiting factor.
> 
> Let's be real, this was never about more cores being more future proof, this was just another AMD vs Intel shitshow.


No, you turned it into one for damage control.


----------



## Vya Domus (Jun 17, 2020)

FordGT90Concept said:


> That's mostly for Xbox One emulation.



Exactly, I doubt they're ever going to explicitly choose to disable SMT. SMT helps with overhead; engines are heavily concurrent.


----------



## Rahnak (Jun 17, 2020)

cucker tarlson said:


> 2500K v. FX-8 - faster core wins
> 7700K v. 1700 - faster core wins
> 8700K v. 3700X - faster core wins
> 
> AMD caught up MOSTLY due to IPC, clocks and latency, not cores. Look at the 3300X and you see the whole truth.



So what is your question? Because it seems like @Vya Domus is right.

In the title you asked "More cores more futureproof for gaming ?" and the answer is mostly yes, so long as the game engine supports them. But the narrative you're pushing seems to be "look how modern 4 cores are better than first-gen Zen 8 cores," and this post above was full-on Intel > AMD.

But yeah, you have the gist of it. Having cores for the sake of cores in gaming is meaningless. You also need IPC and clocks. Zen was a massive leap from the FX lineup but still not close enough to Intel. Zen 2 is much closer, slightly ahead on IPC but behind on clocks. And there's also the fact that Intel's architecture works better in gaming, with its lower latency and whatnot.

Also, the 3300X is a little special since it's a single-CCX design.

EDIT: typos


----------



## cucker tarlson (Jun 17, 2020)

Rahnak said:


> So what is your question? Because it seems like @Vya Domus is right.
> 
> In the title you asked "More cores more futureproof for gaming ?" and the answer is mostly yes, so long as the game engine supports them. But the narrative you're pushing seems to be: "Look how modern 4 cores are better than first gen Zen 8 cores." and this post above was full on Intel > AMD.
> 
> ...


well, not really that special. special for a ryzen, that's true.

it was full intel v. amd as this thread (and the OP video) is about PC processors, 1st-gen ryzen vs 3rd-gen ryzen/kaby lake specifically.
not console marketing or xbox specs like some guys are pushing.


----------



## EarthDog (Jun 17, 2020)

cucker tarlson said:


> if you're talking a fast 4/8 with fast RAM you're fine, but don't expect to lock high-refresh-rate vsync at 100/120.
> but neither can you do it on a slower 8/16 - you need both fast and many cores for that.


I'm talking any. I don't like to put glass ceilings on anything where possible. If my system's primary use is gaming, I want the GPU to always be the bottleneck. Running a 4c/8t system, that won't be the case, as there are already several/dozens of titles that show performance increases over a 4c/8t chip... regardless of clocks and generation/IPC. Imagine what that looks like in a couple of years, let alone 5...

Saying a 4c/8t CPU will be good in five years is like saying a 4c/4t CPU would be good today 5 years ago. Sure, it works, it will play games... likely some titles at 60 FPS too... but it will hold FPS back in many titles, surely. There are also those who game at high refresh/FPS at 1080p, where each FPS counts, so you don't want your CPU holding you back...

4c/8t is long in the tooth today, and could be downright painful in some games 3-5 years from now.


----------



## neatfeatguy (Jun 17, 2020)

Anyone arguing this stuff is stupid.

The idea is to build a system that'll give you acceptable performance for however long it satisfies your needs.

If a 5/6/7 year old CPU and GPU handles gaming for someone to their liking and gives them performance they're happy with - no problem here.
If a top end new CPU and multiple GPUs are required for someone and they have to upgrade with each and every iteration that comes out - then that's what they need to have performance they desire.

Neither is wrong.
Performance is relative to the individual, nothing more, nothing less. To tell folks they're doing it wrong, or that they must upgrade, or that newer is better, is dumb. The same goes for wanting to "future proof" - it can't be done, because as time goes on something in the system will eventually become obsolete, and if something becomes obsolete, it wasn't future proof.


----------



## cucker tarlson (Jun 17, 2020)

EarthDog said:


> I'm talking any. I don't like to put glass ceilings on anything. If my system's primary use is gaming, I want the GPU to always be the bottleneck. Running a 4c/8t system, that won't be the case, as there are already several/dozens of titles that show performance increases over a 4c/8t chip... regardless of clocks and generation/IPC.
> 
> Saying a 4c/8t CPU will be good in five years is like saying a 4c/4t CPU would be good today 5 years ago. Sure, it works, it will play games... likely some titles at 60 FPS too... but it will hold FPS back, surely. There are also those who game at high refresh/FPS at 1080p, where each FPS counts.


well, I never said anything that flattering about 4c/4t or 4/8.

we would also like not to put a ceiling on that like you, but few of us can get a 2066 i9.
for me it's usually the budget options for the cpu - and therefore the question: save on a smaller chip for gaming, or spend more for "futureproof"?
get a 3600 for 800pln now and then another 800pln ryzen 5 4000/5000 later, or get a 3800x for 1500pln now as "futureproof" - that is a relevant question for us plebs.
the console mob would die defending the higher core count, but I think buying into smaller chips with new architectures is just better.


----------



## EarthDog (Jun 17, 2020)

cucker tarlson said:


> well, I never said anything that flattering about 4c/4t or 4/8.
> 
> we would also like not to put a ceiling on that like you, but few of us can get a 2066 i9.


I didn't say you did, lol...

Like me? Da FUQ? What does that have to do with it? You can, at minimum, raise (if not remove) the glass ceiling by running modern 6c/12t or 8c/16t CPUs for any titles that can use more than 4c/8t... which, as I am trying to share, is already an issue with some titles today, and with more and more as time goes on.


----------



## cucker tarlson (Jun 17, 2020)

EarthDog said:


> I didn't say you did, lol...
> 
> Like me? Da FUQ? What does that have to do with it? You can, at minimum, raise (if not remove) the glass ceiling by running modern 6c/12t or 8c/16t CPUs for any titles that can use more than 4c/8t...which as I am trying to share is already an issue with some titles today, but more and more as time goes on.


remove it, no - but raise it comfortably on e.g. a 3700x.
but then again, look where a 3300x is relative to that 3700x and answer the question I posted earlier: for most of us it's get a 3600 for 800pln now and then another 800pln ryzen 5 4000/5000 later, or get a 3800x for 1500pln now as "futureproof".


----------



## EarthDog (Jun 17, 2020)

cucker tarlson said:


> and answer the question I posted earlier:


I thought I did... but you missed it trying to herd the alley cats.

Clearly a 3300x is going to run out of steam well before a 3700x in any titles where core/thread counts go above 4c/8t. A little CCX magic doesn't make up for core/thread count when they are utilized.


----------



## Rahnak (Jun 17, 2020)

cucker tarlson said:


> it was full intel v. amd as this thread (and the OP video) is about PC processors, 1st-gen ryzen vs 3rd-gen ryzen/kaby lake specifically.
> not console marketing or xbox specs like some guys are pushing.



I think you should have phrased your question differently then.



neatfeatguy said:


> Anyone arguing this stuff is stupid.



Being able to have a discussion over anything is important, stupid as it may be. And a forum is just the place to do it, man.



cucker tarlson said:


> but then again,look where a 3300x is relative to that 3700x and answer the question I posted earlier: for most of us it's get 3600 for 800pln now and then another 800pln ryzen 5 4000/5000 later or get a 3800x for 1500pln now as "futureproof"



Impossible to answer without benchmarks of the 4000 series. AMD is promising another big jump, and so far they haven't been far off from their promises on Ryzen. It also depends on how long you plan to keep your CPU. I picked the 3700X over the 3600 because I plan to keep it for another 8 years or so, and the 2 extra cores give me a little more breathing room. Time will tell if I'll need them or not.


----------



## cucker tarlson (Jun 17, 2020)

Rahnak said:


> I think you should have phrased your question differently then.
> 
> 
> 
> ...


I gave the thread a proper title, the SKUs are on it.


----------



## TheoneandonlyMrK (Jun 17, 2020)

cucker tarlson said:


> I gave the thread a proper title, the SKUs are on it.


English?

Nothing is future proof indefinitely.

And "future proof for gaming" is in the title too.
An eternally debatable, perspective-aligned debate.


----------



## cucker tarlson (Jun 17, 2020)

theoneandonlymrk said:


> English?.
> 
> Nothing is future proof indefinitely.
> 
> ...


It was autocorrect, you know it.

Not really, the scope was narrow and the data was there.

And for the newer gens of i5s vs R9 the data is there too.

Yes, nothing lasts forever, but there are smart and smarter approaches to managing your money.

What would you rather do? Replace a 3600 with a 5nm r5 in 3 years' time, or ride a 3800x for six?


----------



## EarthDog (Jun 17, 2020)

cucker tarlson said:


> What would you rather do? Replace a 3600 with a 5nm r5 in 3 years' time, or ride a 3800x for six?


Depends on the user's goals/budget... this isn't a black/white issue.


----------



## TheoneandonlyMrK (Jun 17, 2020)

"more core's future proof for gaming?"


Seems like a straight forward question.

I'm with Earthdog , depends on the buyer and gamer.

And what game's come out and are successful.


----------



## cucker tarlson (Jun 17, 2020)

it'll be clearer for mrK if he takes a look at CPU tests instead of sending us nvidia-sponsored videos.
the 3600 is already so close to the 3800x they're almost indistinguishable. the next gen of 6/12 cpus is gonna easily stroll past that. the 10600kf already does.



EarthDog said:


> Depends on the user's goals/budget........this isn't black/white issue.


well, the goal is kinda given in the title.
let's not debate the topic after 10 pages, alright?
and the budget for a 3700x is the same as getting a 3600 now, selling it, and getting a newer-architecture r5.
idk why mrK is laughing, he did it with 2700x->3800x.
wasn't that 2700x futureproof enough for 60 fps, mrK?


----------



## Rahnak (Jun 17, 2020)

Both AMD and Intel are pretty equal in terms of platform longevity right now. Both sockets have one more generation available to them. What you choose depends on your needs, but this is how I would go about it if I were buying right now:

For a lot of competitive games, or games that benefit from very high fps: Intel. Otherwise some money can be saved by going with AMD.
If I were keeping the CPU for ~4 years, a 6-core would probably be just fine. More than that, I'd probably go for 8 just to be safe. And if performance was no longer adequate, I'd still have the option of a newer-gen CPU and/or a SKU with more cores.

Another option is waiting for Zen 4 / Rocket Lake and see what's what. AMD is promising another sizeable jump and Rocket Lake should be plenty interesting as well. Then build the best you can with what you can afford.

But you know your needs better than anyone.

There's always something nicer around the corner. AM5 should use DDR5 and I figure Intel will do the same with whatever socket is next so the upgrade after that will require more investment.


----------



## TheoneandonlyMrK (Jun 17, 2020)

cucker tarlson said:


> it'll be clearer for mrK if he takes a look at CPU tests instead of sending us nvidia-sponsored videos.
> the 3600 is already so close to the 3800x they're almost indistinguishable. the next gen of 6/12 cpus is gonna easily stroll past that. the 10600kf already does.
> 
> 
> ...


So Digital Foundry are sponsored by Nvidia? Proof please.

What platform did DF settle on? Ryzen, PCIe 4!



2700x to what? Don't assume to know what I do or did, or anyone else for that matter; you're wrong, see.

I went 2600 to 3800X, and liked what I got.

But I have owned a 1700X, 2700X, 2600, 3800X,
as well as shitloads of Intel chips; presently a G3500 (unsure), 8750, i7 2600K, q6600 x2 (in keyrings cos useless quads),
and about four more I can't even remember now.

Because I refurb and sell PCs.

Getting a bit personal, aren't you? What I or you use affects this debate, does it? No. Though it's clear your system has you biased in its favour alone.


----------



## EarthDog (Jun 17, 2020)

cucker tarlson said:


> well,the goal is kinda given in the title.


lol, mofo, I just answered the question I quoted, lol...


----------



## bogmali (Jun 17, 2020)

This thread has turned into a troll fest, as well as "if I don't agree with you, I'll report you".

For you peeps that like to report stuff: don't escalate it by replying and contradicting why you reported it in the first place. Shutdown commenced.


----------

