People seem to have completely forgotten (or are not old enough to remember, or had no interest in PCs at the time) that Intel had massive problems with Hyper-Threading when it was introduced. We were seeing performance losses of up to 20% due to Intel's new architecture.
The same happened when the Athlon came out: Athlon processors needed massive waves of patches simply because the architecture was new.
People said Bulldozer was terrible (again, a new architecture) and would become obsolete so fast that it was better to buy an i3 instead. From what I have seen at AdoredTV, Bulldozer at worst did not fall any further behind the i5/i7 over time, and at best it now beats the 2500K. People who bought an i3 back then on that advice were probably forced to buy yet another new CPU, because the i3 became completely obsolete.
No matter how well prepared you are, there will be problems at launch, especially when you try to push something new.
Also, I really hope people stop using games as CPU benchmarks, or at least take the results with a grain of salt. They are simply unreliable, for two reasons:
-Processor-specific optimizations and API issues.
-The number of variables involved and the quality of the testing done by reviewers.
1) Processor-specific optimizations and API issues:
-Anyone on a modern CPU knows you cannot play a lot of old DOS games, because they simply run too fast or glitch out. You need either a fixed-up release from GOG.com or a program that slows the CPU down enough for those games.
-Bethesda's game physics go wild as soon as you go over 60 fps, and GTA 5 starts misbehaving once a certain frame rate is exceeded (see the sketch after this list).
-You cannot play Mass Effect 1 without glitches on modern AMD CPUs, because it automatically forces a 3DNow! code path whenever an AMD CPU is detected, and no modern AMD CPU supports 3DNow!.
-There are several games where the results flip depending on which API is used (DX11 vs DX12).
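The DOS-game and over-60-fps issues above boil down to the same root cause: the game advances its simulation per rendered frame (or per CPU cycle) instead of per unit of real time, so a faster CPU literally makes the game run faster. Here is a minimal Python sketch of the difference; the update_physics step and the numbers are made up for illustration, not code from any actual game:

```python
import time

TICK = 1.0 / 60.0  # fixed simulation step: 60 updates per second

def update_physics(state, dt):
    # Toy integration step; a real game would do collision, AI, etc.
    state["x"] += state["vx"] * dt
    return state

def run_fixed_timestep(state, seconds):
    """Simulation speed is tied to wall-clock time, not to frame rate."""
    accumulator = 0.0
    last = time.perf_counter()
    end = last + seconds
    while time.perf_counter() < end:
        now = time.perf_counter()
        accumulator += now - last
        last = now
        # Run as many fixed-size ticks as real time has actually passed.
        while accumulator >= TICK:
            state = update_physics(state, TICK)
            accumulator -= TICK
    return state

def run_frame_tied(state, frames):
    """Simulation speed scales with frame rate: the old-DOS-game problem.
    A faster CPU renders more frames, so the game itself speeds up."""
    for _ in range(frames):
        state = update_physics(state, TICK)  # one full 'tick' per rendered frame
    return state
```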
2) The number of variables involved and the quality of the testing done by reviewers:
-A tester usually has no clue which 'part' of the game he or she is actually stressing in the first place. At some points it is draw calls getting wrecked, at others it is physics, or something else entirely.
One example is Gamers Nexus's testing of Watch Dogs 2. The author tests the game by putting the player character on a nearly empty street and barely moving back and forth for 30 seconds, repeated three times. Obviously this does not stress the CPU much, because he never tests an in-city environment with lots of objects and interactive elements moving around. Not to mention that driving a car brings several new issues of its own, none of which are accounted for in the Gamers Nexus review. In an environment like that, the test is more a renderer-pipeline speed test than a CPU test.
-BF1 multiplayer is never the same twice, because every match plays out differently. Some sites test multiple rounds and average them (as in the sketch below), while others don't.
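For what it's worth, averaging several passes and reporting the spread is not hard. A minimal Python sketch with made-up fps numbers follows; nothing here comes from any reviewer's actual data:

```python
import statistics

def summarize_runs(fps_per_run):
    """Aggregate several benchmark passes of the same scene.
    A single pass of a multiplayer match or a scripted walk can land
    anywhere inside this spread, which is why one run proves little."""
    mean = statistics.mean(fps_per_run)
    stdev = statistics.stdev(fps_per_run) if len(fps_per_run) > 1 else 0.0
    return mean, stdev

# Hypothetical average-fps results from five passes of the same test scene.
runs = [112.4, 104.8, 118.1, 109.5, 101.2]
mean, stdev = summarize_runs(runs)
print(f"mean {mean:.1f} fps, +/- {stdev:.1f} fps across {len(runs)} runs")
```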
While productivity programs are not free of this either, games in particular really suffer from optimization issues. In essence, those game reviews mostly tell us that game developers have spent more time optimizing for Intel CPUs than for AMD CPUs.
Now, that may be a valid argument for preferring an Intel CPU, since it still means you will get (or appear to get) more performance from it today. But you cannot say the AMD CPU itself is bad for gaming, because to a certain degree the CPU is not what determines gaming performance. Unless it really lags behind like Bulldozer did, it is not appropriate to say Ryzen sucks at gaming.
And Ryzen is not like Bulldozer, which really was far behind everything else. If we set gaming aside, Ryzen is right at the top. In gaming I only see maybe a 10% performance loss, 15% at most, on results that already average 100+ fps. That is more than acceptable. Not to mention that, as AdoredTV showed, this does not mean Ryzen will fall further behind over time either (the opposite is probably more likely).
Cores vs single-thread: I still remember when I was buying a 2600K for my PC, people told me it was a waste of money since the 2500K would give the same performance for less money. These days that is clearly no longer true. Even back in the Sandy Bridge days, games were slowly moving toward multi-core designs, and that shift has accelerated massively since AMD took hold of the console market and pushed chips with many cores but weak single-core performance into the consoles.
Finally, after several years, I am looking at building a new PC. I really have no allegiance to any company. I have flip-flopped between Nvidia and Radeon/AMD for graphics cards, and I have never bought an AMD CPU for my own PC simply because they were not really viable whenever I happened to be shopping for one.
I will wait for a while. I am getting a 1080 Ti for my new PC (a custom-cooled one, of course), so I still have some time left to see how things play out. But unless Intel suddenly gives me a nice octa-core processor at a decent price in the meantime, it looks like I will be buying an AMD CPU for the first time since 1994.