# AMD FX-8350 - "Piledriver" for AMD Socket AM3+



## cadaveca (Oct 19, 2012)

Without much fanfare, AMD is launching yet another new product, this time the Piledriver-based AM3+ CPUs. Ready as a drop-in replacement for current 9-series AMD motherboards, the FX-8350 not only offers a bit of a surprise in performance, but also a surprise your wallet will like!



----------



## btarunr (Oct 23, 2012)

It's a more than decent processor for its price. I'd definitely get one if I were upgrading from Core 2 Quad, Phenom II, or even Core i5/i7 LGA1156.


----------



## f22a4bandit (Oct 23, 2012)

Great job on the review. Piledriver seems like a nice alternative if you don't want to spend as much cash. Good improvement on the process.


----------



## trickson (Oct 23, 2012)

Great review! Looks like AMD has a winner. Some more refinement and such, and this could be just what AMD and the AMD community have been looking for.


----------



## devguy (Oct 23, 2012)

Nice review, but Cadaveca, isn't Intel on the 22nm process, not the 28nm one? I believe AMD is planning on moving its new Fusion chips to 28nm, so maybe that's where the confusion is?


----------



## Melvis (Oct 23, 2012)

btarunr said:


> It's a more than decent processor for its price. I'd definitely get one if I were upgrading from Core 2 Quad, Phenom II, or even Core i5/i7 LGA1156.



Totally agree. I've been holding off with my 965, praying that this CPU would be good enough to upgrade to, and it is. The FX-8350 is my next upgrade for sure.


----------



## reverze (Oct 23, 2012)

This is finally a great alternative to the current i5 lineup; if you bought anything less than the 3570K there was too much of a drop in performance.

Now that the 8350 is so close in gaming performance (what most of us buy top-end CPUs for), I can advise people to get one and save money on the CPU and mobo, and either keep the money in their pocket or spend the savings on a better video card, a combo that will beat the i5 with its slower video card.

The only thing I would like to see is refreshes of high-end AMD mainboards with more USB 3 ports and UEFI support.

I feel this CPU brings AMD back to the "if you are a gamer, go AMD" days, where the performance these FX chips bring is all you need, and you could never tell the difference by spending more on an i5/i7.


----------



## HumanSmoke (Oct 23, 2012)

> The Intel 3770K costs nearly twice as much, but doesn't offer twice the performance


Isn't the natural comparison the 3570K? i.e. the Intel CPU closest in price, the CPU that also appeared in the review comparison.

For more impact I'd go with:
_"The Intel 3960X costs nearly five times as much, but doesn't offer five times the performance"_
Still trudging through the other reviews. Seems like some reviewers didn't get a lot of time to do them.


----------



## [H]@RD5TUFF (Oct 23, 2012)

Surprise... err, wait, nope, still the same garbage as the last round of AMD chips. When will AMD get their act together and make a real chip? :shadedshu


----------



## btarunr (Oct 23, 2012)

reverze said:


> The only thing I would like to see is refreshes of high-end AMD mainboards with more USB 3 ports and UEFI support.



All ASUS AMD 9-series chipset motherboards have UEFI. Quite a few MSI, Biostar, and ASRock motherboards (entry-thru-performance) have it as well. It's just Gigabyte's 9-series boards that stick to ye olde AwardBIOS. They do feature "HybridEFI" if you want to boot from large volumes, though.


----------



## reverze (Oct 23, 2012)

btarunr said:


> All ASUS AMD 9-series chipset motherboards have UEFI. Quite a few MSI and ASRock motherboards (entry-thru-performance) have it as well. It's just Gigabyte's 9-series boards that stick to ye olde AwardBIOS. They do feature "HybridEFI" if you want to boot from large volumes, though.



Ah, that's probably why I thought there was a general lack of UEFI: I usually buy Gigabyte boards. Thanks for clearing that up.


----------



## Nordic (Oct 23, 2012)

I wish I could see some comparisons to the 2500K, given they are the exact same price.


----------



## Melvis (Oct 23, 2012)

james888 said:


> I wish I could see some comparisons to the 2500K, given they are the exact same price.



http://www.guru3d.com/articles_pages/amd_fx_8350_processor_review,1.html


----------



## seronx (Oct 23, 2012)

cadaveca said:


> -snip-


Nice review.

cadaveca, you probably should note somewhere why you typed 28-nm for Intel. Like:

1.  Metal/interconnect layers are comparable to 28-nm from other foundries.
2.  Even though the name of the node is 22-nm, it actually has a physical gate (fin) length of 26-28 nanometers.
3.  etc.

While you did say 28-nm for AMD: the successor to Vishera is Vishera 2.0 and Viperfish (die name).

Vishera 2.0 will be on 32-nm SHP but will most likely have some form of Steamroller in it, while Viperfish will be on some form of 28-nm/22-nm FinFET from IBM and GlobalFoundries.
Vishera 2.0 = 2013.
Viperfish = 2014.


----------



## cadaveca (Oct 23, 2012)

devguy said:


> Nice review, but Cadaveca, isn't Intel on the 22nm process, not the 28nm one? I believe AMD is planning on moving its new Fusion chips to 28nm, so maybe that's where the confusion is?



Whups. 



HumanSmoke said:


> Isn't the natural comparison the 3570K? i.e. the Intel CPU closest in price, the CPU that also appeared in the review comparison.
> 
> For more impact I'd go with:
> _"The Intel 3960X costs nearly five times as much, but doesn't offer five times the performance"_
> Still trudging through the other reviews. Seems like some reviewers didn't get a lot of time to do them.




Why? The i7-3770K and the FX-8350 are currently the best CPUs you can get for their respective sockets. To get the best from Intel costs twice as much..but doesn't get you twice the performance. I will not argue that Intel is faster..it is, but AMD has price/performance sealed up in the <$200 market.

And no, there was not a lot of time...I am sure many had to wait for a BIOS as I did, if they had a board in the first place. Seems funny to me that AMD gave me so much for FM2, but so little for this launch...perhaps this shows what is really more important to them...? I am not sure.



seronx said:


> Nice review.
> 
> cadaveca, you probably should note somewhere why you typed 28-nm for Intel. Like:
> 
> ...



No, it's a typo, will fix in a moment.


EIDT: FIX'd.


----------



## Nordic (Oct 23, 2012)

Melvis said:


> http://www.guru3d.com/articles_pages/amd_fx_8350_processor_review,1.html



On top of this, AnandTech has a review also.

In my opinion the 8350 outperforms Intel in multithreaded apps quite nicely for the price. For single-threaded performance Intel will be king for the foreseeable future. AMD does have a very competitive platform. I need more single-threaded performance and am very happy with my 2500K in that regard. If I needed more multithreaded performance I would definitely get an 8350.


----------



## seronx (Oct 23, 2012)

I see the opposite, james888; AMD is the king of single-threaded applications. It just happens that code that runs on both cores utilizes the FPU best.

With single-threaded applications one core can process 8 instructions per cycle, while in dual-core mode both cores can process only 4 instructions each.


----------



## Ghost (Oct 23, 2012)

I liked Omega's reviews more. More comprehensive and less biased.


----------



## darkangel0504 (Oct 23, 2012)

Thanks for the review.

I need to upgrade my BIOS and buy an FX-8350.


----------



## btarunr (Oct 23, 2012)

[H]@RD5TUFF said:


> When will AMD get their act together and make a real chip? :shadedshu



"Never." In my informed opinion, these are the very last CPUs from AMD. There are only APUs and enterprise CPUs in the foreseeable future. Enjoy Intel chips till the PC form-factor fades out.


----------



## cadaveca (Oct 23, 2012)

Ghost said:


> I liked Omega's reviews more. More comprehensive and less biased.



My review would have been more comprehensive if I'd had more than a week with a working BIOS, and if AMD had provided me with more than a bare chip in an envelope.

Unfortunately, it seems Omega is a bit too busy with other aspects of life, as does happen, so what you see is what you get.

Numbers hold no bias. If you can show me some bias in my review, please do.



btarunr said:


> "Never." In my informed opinion, these are the very last CPUs from AMD. There are only APUs and enterprise CPUs in the foreseeable future. Enjoy Intel chips till the PC form-factory fades out.



I agree, in part. I do not think CPUs have much place left in home systems. Power users can use server-grade parts, provided there is some way to unlock OC ability.

However, I do not think the desktop form factor is going anywhere long-term..it will merely lose its luster for many for some time.


----------



## Nordic (Oct 23, 2012)

seronx said:


> I see the opposite, james888; AMD is the king of single-threaded applications. It just happens that code that runs on both cores utilizes the FPU best.
> 
> With single-threaded applications one core can process 8 instructions per cycle, while in dual-core mode both cores can process only 4 instructions each.



I admittedly do not know much (or anything) about CPU architecture. I can read graphs though. Why does Intel do better in most games, even if only slightly? The game I play most right now, NS2, is heavily CPU-dependent and is pretty much single-threaded. I have trouble keeping 50 fps sometimes at 4.5GHz. I am pretty sure I would have a harder time with the 8350.

Also, multithreaded is the future. It is only a matter of time.


----------



## Melvis (Oct 23, 2012)

james888 said:


> On top of this, AnandTech has a review also.
> 
> In my opinion the 8350 outperforms Intel in multithreaded apps quite nicely for the price. For single-threaded performance Intel will be king for the foreseeable future. AMD does have a very competitive platform. I need more single-threaded performance and am very happy with my 2500K in that regard. If I needed more multithreaded performance I would definitely get an 8350.



Your link sends me back to Guru3d  

Agreed, your 2500K is great and not worth changing over to this CPU; it's more so for us Phenom II users that wish to catch up to those SB users. I think overall this new CPU is a good buy. For me personally it's a good upgrade; I've had a 990FX board sitting here for almost 6 months waiting for a good CPU to drop into it, and this is what I'll be getting.


----------



## seronx (Oct 23, 2012)

btarunr said:


> "Never." In my informed opinion, these are the very last CPUs from AMD.


There are a couple of generations after this one.

Zambezi(2011) -> Vishera(2012) -> Vishera 2.0(2013)

All Orochi...eight cores

Unknown 1.0(2014) -> Unknown 1.1(2015)

All Viperfish...ten cores

^---
Quad-lane DDR4
20 MB L2 cache + 4 MB Northbridge cache (IOMMU/HMMU coherency)
So far, the socket is going to be AM3+ or the long-thought-dead Socket G3.
Other awesome things come with Viperfish.





james888 said:


> Why does Intel do better in most games, even if only slightly? The game I play most right now, NS2, is heavily CPU-dependent and is pretty much single-threaded. I have trouble keeping 50 fps sometimes at 4.5GHz. I am pretty sure I would have a harder time with the 8350.
> 
> Also, multithreaded is the future. It is only a matter of time.


I think it is due to the faster cache and lower-latency interconnect that Intel wins. The problem with Bulldozer is that it is so focused on not making errors that it actually needs a higher clock rate. For AMD it takes too long to get the instructions to the cores, executed, and written back, since the L2 is so slow.


----------



## cdawall (Oct 23, 2012)

[H]@RD5TUFF said:


> Surprise... err, wait, nope, still the same garbage as the last round of AMD chips. When will AMD get their act together and make a real chip? :shadedshu



Which part is the garbage part? Power consumption is down, performance is up, and the CPUs clock well; not to mention they are drop-in upgrades for most people with anything from a Phenom II to the "garbage" Bulldozer chips.


----------



## btarunr (Oct 23, 2012)

seronx said:


> There are a couple of generations after this one.
> 
> Zambezi(2011) -> Vishera(2012) -> Vishera 2.0(2013)
> 
> ...



Most of what you're talking about is Opteron/enterprise/MCMs. Sure, there was a Bulldozer > Piledriver > Steamroller roadmap, but that related to the micro-architecture, not necessarily how AMD implemented it.


----------



## seronx (Oct 23, 2012)

btarunr said:


> No client CPUs in the future. Most of what you're talking about is Opteron/enterprise/MCMs.


Vishera, Vishera 2.0, Viperfish, Trinity, Trinity 2.0, Kaveri say otherwise.


----------



## darkangel0504 (Oct 23, 2012)

Will Steamroller support AM3+?


----------



## btarunr (Oct 23, 2012)

seronx said:


> Vishera, Vishera 2.0, Viperfish, Trinity, Trinity 2.0, Kaveri say otherwise.



Trinity, Trinity 2.0, and Kaveri are not CPUs; they're APUs. My original assertion was that I don't see AMD client CPUs in the future, only APUs and enterprise CPUs. I've never heard of Vishera 2.0. The only client product roadmaps we're seeing relate to APUs.


----------



## seronx (Oct 23, 2012)

darkangel0504 said:


> Will Steamroller support AM3+?


Steamroller will support AM3+ in the form of Orochi Rev. E, much like Thuban in the Phenom II era. The true Steamroller CPU, Viperfish, won't be shown till 2014. The reason I believe it was delayed is that it is going to use 22-nm node/28-nm interconnect FinFETs, much like how 14-XM uses the 14-nm node/20-nm interconnect.


----------



## Nordic (Oct 23, 2012)

Melvis said:


> Your link sends me back to Guru3d
> 
> Agreed, your 2500K is great and not worth changing over to this CPU; it's more so for us Phenom II users that wish to catch up to those SB users. I think overall this new CPU is a good buy. For me personally it's a good upgrade; I've had a 990FX board sitting here for almost 6 months waiting for a good CPU to drop into it, and this is what I'll be getting.


Fixed the link. I would not consider side-grading/upgrading right now. My 2500K is plenty right now. I just am... I don't know... just curious as to what AMD has compared to what I've got. I do the same comparisons to higher-up Intel CPUs too.



seronx said:


> I think it is due to the faster cache and lower-latency interconnect that Intel wins. The problem with Bulldozer is that it is so focused on not making errors that it actually needs a higher clock rate. For AMD it takes too long to get the instructions to the cores, executed, and written back, since the L2 is so slow.


That's because this base architecture was primarily designed for servers, right? The way you put it, it sounds like an easy fix.


----------



## TheLaughingMan (Oct 23, 2012)

AMD FX-8350 Vishera 4.0GHz (4.2GHz Turbo) Socket AM...

Retail is set at $220. Not bad, but I think $5 too high. I expect a price drop by Christmas.


----------



## johnnyfiive (Oct 23, 2012)

It's nice to finally see what PD can do.


----------



## seronx (Oct 23, 2012)

james888 said:


> That's because this base architecture was primarily designed for servers right? The way you put it, it sounds like an easy fix.


The fix was to get rid of the L3 altogether and simply use the L2.

Trinity: 4MB of L2 (2 * 2)
Kaveri: 8MB of L2 (4 * 2)
Orochi: 8MB of L2 + 8MB of L3 = 16 MB of total cache (2 * 4 + 2 * 4)
Viperfish: 20 MB of L2 (4 * 5)

So far there are two modes, based on the slides for Kaveri and Viperfish:
High Performance mode
Low Performance mode

HPM -> Full 20/8 MB of L2 @ 32 - 40 cycles of latency
LPM -> Half 10/4 MB of L2 @ 16 - 20 cycles of latency

This way it can appease both server workloads and mainstream workloads. The cache is completely unified as well, so all cores can access the L2.


----------



## TheLaughingMan (Oct 23, 2012)

You go on with the speculation about processors 2 and 3 years away, Seronx. I am going to turn off my FX-8350 and go to sleep.


----------



## cadaveca (Oct 23, 2012)

seronx said:


> The fix was to get rid of the L3 altogether and simply use the L2.
> 
> Trinity: 4MB of L2(2 * 2)
> Kaveri: 8MB of L2(4 * 2)
> ...



This is not on topic. Please feel free to start a new thread for your discussion outside of the FX-8350. Thanks.


----------



## seronx (Oct 23, 2012)

cadaveca said:


> This is not on topic. Please feel free to start a new thread for your discussion outside of the FX-8350. Thanks.


The FX-8350 does a good job; it continues the trend.


----------



## Melvis (Oct 23, 2012)

james888 said:


> Fixed the link. I would not consider side-grading/upgrading right now. My 2500K is plenty right now. I just am... I don't know... just curious as to what AMD has compared to what I've got. I do the same comparisons to higher-up Intel CPUs too.
> 
> 
> That's because this base architecture was primarily designed for servers, right? The way you put it, it sounds like an easy fix.



Ta, yea, I wouldn't either if I was in your shoes. I just did a number count using Guru3D's review of the 2600K vs the FX-8350 (minus the gaming results) and it was a tie, so in my books that's a win, or should I say what Bulldozer should have been in the first place.


----------



## Lionheart (Oct 23, 2012)

Melvis said:


> Ta, yea, I wouldn't either if I was in your shoes. I just did a number count using Guru3D's review of the 2600K vs the FX-8350 (minus the gaming results) and it was a tie, so in my books that's a win, or should I say what Bulldozer should have been in the first place.



Aaaww, hells yeah bro, looks like we both know what our new platform is going to be.

Btw, I was hoping you would be on Steam so I could spam you with AMD news.


----------



## [H]@RD5TUFF (Oct 23, 2012)

james888 said:


> I wish I could see some comparisons to the 2500K, given they are the exact same price.



The 2500K is a dead chip and won't be sold for much longer, so that's not a very good comparison; you should compare with the 3570K.


----------



## HumanSmoke (Oct 23, 2012)

The load power consumption figures seem all over the place.

PC Perspective: 15.3% worse than FX-8150 
eTeknix: 5.9% worse than FX-8150
X-bit : 3.9% worse than FX-8150
TechSpot: 0.8% worse than FX-8150
Tech Report and Extreme Tech: 0% difference
Hardware Heaven: 0.4% better than FX-8150
Hexus: 1.6% better than FX-8150
Anandtech: 2% better than FX-8150
Hardware Canucks: 3.2% better than FX-8150
Hot Hardware: 4.7% better than FX-8150
[H]OCP: 4.8% better than FX-8150
Tom's Hardware: 5.8% better than FX-8150
Legit Reviews: 7.4% better than FX-8150
TechPowerUp: 20.9% better than FX-8150

The median/mean seem ballpark near enough the same for both SKUs, but the range is pretty damn loose.
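For what it's worth, feeding the percentages above into a few lines of Python (sign convention assumed here: positive = FX-8350 measured worse than the FX-8150) supports the "same ballpark, loose range" reading:

```python
import statistics

# Load-power deltas (FX-8350 vs FX-8150, %) from the reviews listed above.
# Positive = FX-8350 drew more at the wall, negative = it drew less.
deltas = [15.3, 5.9, 3.9, 0.8, 0.0, 0.0, -0.4, -1.6, -2.0,
          -3.2, -4.7, -4.8, -5.8, -7.4, -20.9]

median = statistics.median(deltas)   # -1.6
mean = statistics.mean(deltas)       # about -1.66
spread = max(deltas) - min(deltas)   # 36.2 percentage points

print(f"median {median:+.1f}%, mean {mean:+.2f}%, spread {spread:.1f} points")
```

Median and mean both land within a couple of percent of zero, while the extremes sit 36 points apart, which is exactly the "loose range" complaint.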


----------



## drdeathx (Oct 23, 2012)

In the summary it says the 3770K costs almost twice as much. The 8350 is pitted against the 3570K, FYI, not the 3770K.


----------



## drdeathx (Oct 23, 2012)

Melvis said:


> Ta, yea, I wouldn't either if I was in your shoes. I just did a number count using Guru3D's review of the 2600K vs the FX-8350 (minus the gaming results) and it was a tie, so in my books that's a win, or should I say what Bulldozer should have been in the first place.





The 2600K is old news and going buh-bye.


----------



## cadaveca (Oct 23, 2012)

drdeathx said:


> In the summary it says the 3770K costs almost twice as much. The 8350 is pitted against the 3570K, FYI, not the 3770K.



Explained above. But...meh.



HumanSmoke said:


> The load power consumption figures seem all over the place.
> 
> PC Perspective: 15.3% worse than FX-8150
> eTeknix: 5.9% worse than FX-8150
> ...



Are the scores for all sites relatively the same?


----------



## cdawall (Oct 23, 2012)

HumanSmoke said:


> The load power consumption figures seem all over the place.
> 
> PC Perspective: 15.3% worse than FX-8150
> eTeknix: 5.9% worse than FX-8150
> ...





			
pcperspective said:

> AMD Radeon HD 5870 1GB
> 
> 2 x 4GB GSkill DDR-2 1866 memory @ 9.10.9.28 latencies
> 
> ...





			
eteknix said:

> Asus Crosshair V Formula
> AMD FX-8350
> Corsair Vengeance 1866MHz 16GB
> Corsair H80
> ...





			
xbit said:

> AMD FX-8350 (Vishera, 8 cores, 4.0-4.2 GHz, 4 x 2 MB L2, 8 MB L3);
> ASUS Crosshair V Formula (Socket AM3+, AMD 990FX + SB950);
> Memory: 2 x 4 GB, DDR3-1866 SDRAM, 9-11-9-27 (Kingston KHX1866C9D3K2/8GX).
> Graphics card: NVIDIA GeForce GTX 680 (2 GB/256-bit GDDR5, 1006/6008 MHz).
> ...





			
techspot said:

> - x2 4GB G.Skill DDR3 PC3-14900 (CAS 8-9-8-24)
> - Asrock Fatal1ty 990FX Professional (AMD 990FX)
> - OCZ ZX Series 1250w
> - Crucial m4 256GB (SATA 6Gb/s)
> - Gigabyte GeForce GTX 580 SOC (1536MB)



Should I continue or do you get the picture? System power consumption is measured at the wall, so any variation in the system can give wildly different power consumption.
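To illustrate the at-the-wall point: the same CPU-only difference shows up as a different wall-socket number depending on what else is in the box and the PSU. A minimal sketch, with entirely made-up wattages and efficiencies:

```python
# Why at-the-wall measurements scale the same CPU delta differently.
# All wattages and efficiency figures below are illustration values only.

def wall_watts(cpu_w: float, rest_w: float, psu_eff: float) -> float:
    """AC draw = DC power of CPU plus rest of system, divided by PSU efficiency."""
    return (cpu_w + rest_w) / psu_eff

CPU_DELTA = 15.0  # assume both test rigs see the same 15 W CPU-only difference

for rest_w, psu_eff in [(150.0, 0.87), (320.0, 0.80)]:
    base = wall_watts(125.0, rest_w, psu_eff)
    more = wall_watts(125.0 + CPU_DELTA, rest_w, psu_eff)
    rise = 100.0 * (more - base) / base
    print(f"rest {rest_w:.0f} W, PSU {psu_eff:.0%}: +{more - base:.1f} W, {rise:.1f}% rise")
```

A less efficient PSU inflates the absolute watt gap, and a hungrier GPU/rest-of-system dilutes the percentage, so identical chips can read anywhere from a few percent to double digits apart across differently built test rigs.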


----------



## [H]@RD5TUFF (Oct 23, 2012)

HumanSmoke said:


> The load power consumption figures seem all over the place.
> 
> PC Perspective: 15.3% worse than FX-8150
> eTeknix: 5.9% worse than FX-8150
> ...



I'm not sure how they managed to get the power consumption up that much; to me that means there is a flaw in their capacitors making them draw too much.


----------



## cadaveca (Oct 23, 2012)

cdawall said:


> Should I continue or do you get the picture? System power consumption is measured at the wall, so any variation in the system can give wildly different power consumption.



Sure, but if all you are doing is swapping the CPU...

Interesting, to say the least... I wonder how people are doing OC-wise and such too...

The difference is probably in how those numbers were TESTED, not the CONFIGURATIONS.


----------



## Melvis (Oct 23, 2012)

Lionheart said:


> Aaaww hells yeah bro looks like we both know what our new platform is going to be
> 
> Btw I was hoping you would be on steam so I could spam you with AMD news



I've already got most of mine here, just waiting for the damn CPU.

Yea, I knew you would do that if I was on Steam; too busy working to be able to play games today.


----------



## cdawall (Oct 23, 2012)

cadaveca said:


> Sure, but if all you are doing is swapping the CPU...
> 
> Interesting, to say the least... I wonder how people are doing OC-wise and such too...



Even within each CPU, power consumption varies wildly. Look at your own Intel chips in comparison to anyone else's. A high-volt 8350 against a low-volt 8150 will look way worse than two low-volt chips.


----------



## cadaveca (Oct 23, 2012)

cdawall said:


> Even within each CPU, power consumption varies wildly. Look at your own Intel chips in comparison to anyone else's.



Sure, and again, I seem to be doing far better, I think. 



Anyway, I've just got the one board here: tested the 8350, then tested the 8150, waited for a BIOS, got the BIOS, retested the 8150, then retested the 8350.


----------



## Melvis (Oct 23, 2012)

drdeathx said:


> 2600K is old news and going buh bye.



Who cares if it's old news or not? The point is that it's as good as a 2600K, which Bulldozer was meant to be last year, and the 2600K is not a slow CPU either, am I right?


----------



## Bjorn_Of_Iceland (Oct 23, 2012)

Too late. I got a 2500k for cheaps instead.


----------



## NC37 (Oct 23, 2012)

Kinda wish you would have included some Phenom comparisons. Especially since BD couldn't even beat Phenoms in many benches.


----------



## cadaveca (Oct 23, 2012)

NC37 said:


> Kinda wish you would have included some Phenom comparisons. Especially since BD couldn't even beat Phenoms in many benches.



No chip, sorry. If I had one...it would have only added several hours of testing...on a BIOS that is only a week old...but I woulda done it.


----------



## cdawall (Oct 23, 2012)

cadaveca said:


> No chip, sorry. If I had one...it would have only added several hours of testing...on a BIOS that is only a week old...but I woulda done it.



Send me that 8350 and you can have my B97 for testing


----------



## HumanSmoke (Oct 23, 2012)

cadaveca said:


> Sure, but if all you are doing is swapping the CPU...


That was also my thought. Any other parameter is bound by the different system setups. All I was really interested in was the relative gain over the FX-8150 (stock vs stock and clock vs clock)


cadaveca said:


> Interesting, to say the least... I wonder how people are doing OC-wise and such too...


5GHz with the AMD AIO "watercooler". 5.1-5.2GHz if you like your CPUs crispy (1.5+V).


cadaveca said:


> The difference is probably in how those numbers were TESTED, not the CONFIGURATIONS.


There is variance in what constitutes "power consumption - load", but that's to be expected. Most seemed to use some common sense: x264, multi-tasking, multithreaded gaming...but of course there are always going to be some that view anything short of LinX as a cheat; probably the same kind of people that think transient peak load under OCCT represents "real world" testing. 
My point was that if the system was the same, the cooling was equal and adequate (to take throttling out of the equation), and the test between the two CPUs was from the same review on any given site, then the only other parameters of contention are the methodology in measuring power draw, and variation between CPUs, which I tried to eliminate by using a larger pool of reviews. An outlier would probably be an early ES vs a late stepping, but that would likely show up in comparative testing in earlier reviews.


----------



## [H]@RD5TUFF (Oct 23, 2012)

Melvis said:


> Who cares if it's old news or not? The point is that it's as good as a 2600K, which Bulldozer was meant to be last year, and the 2600K is not a slow CPU either, am I right?



It completely matters. Comparing a brand-new chip to one that's over a year old and not even sold anymore is hardly a fair comparison; again, all comparisons of the 8350 should be done against the 3570K!


----------



## cadaveca (Oct 23, 2012)

HumanSmoke said:


> An outlier would probably be an early ES vs late stepping- but that would likely show up in comparitive testing in earlier reviews.



Yeah, I'm kinda lost here as to why my numbers are so far out of the mix...but oh well. Either I got a really good 8350...or perhaps 8150? I am not sure....

A quick look at chip pics suggests we all got the same batch? I wonder who gets the highest clocks...I'm gonna have to have another go at it when I get some time.


----------



## Melvis (Oct 23, 2012)

[H]@RD5TUFF said:


> It completely matters. Comparing a brand-new chip to one that's over a year old and not even sold anymore is hardly a fair comparison; again, all comparisons of the 8350 should be done against the 3570K!



Not to me it doesn't. In my eyes the 2600K is a very fast CPU (regardless of whether it's new or not), and in fact still faster than the 3570K, is it not? So I'm quite happy to compare the 3570K to the FX-8350; doesn't worry me in the least.


----------



## Absolution (Oct 23, 2012)

It's good that the 8350 finally manages to beat the Intel 2500K, and in most cases is similar to the 2600K (not in gaming though).

But those processors are really old. Is there any advantage (feature-wise, besides 8 cores) to AMD's alternative now? (it being a bit late, but finally overcoming those Intels)

Also sad to see that memory performance is still lower than the 2500/2600.

http://www.guru3d.com/articles_pages/amd_fx_8350_processor_review,17.html

Can someone explain whyyyyy


----------



## eidairaman1 (Oct 23, 2012)

Great review, cadaveca.

Looks like the major improvements were in idle power consumption. Pricing is pretty good, actually.

Now we wait for Steamroller.


BTW, FYI:

Computer Hardware, AMD, Processors - Desktops, So...

(Thank OneMoar for the link above)

$220 for the 8350


----------



## HumanSmoke (Oct 23, 2012)

eidairaman1 said:


> 195 for the 8350


$220


----------



## eidairaman1 (Oct 23, 2012)

HumanSmoke said:


> $220



Oops, ya, I misread lmao (must have looked at the 8320).


----------



## NeoXF (Oct 23, 2012)

Colour me impressed... almost thinking of switching from an i7-3770...

BTW... I've been reading tons of news and stuff about AMD's plans in the past weeks (maybe not so much in the past few days, though), but I don't remember anything about Vishera 2.0 or Trinity 2.0, the Orochi die, and so on... can someone get me up to speed? A link, maybe, too.


----------



## gourygabriev (Oct 23, 2012)

I am planning to build a new rig to replace my aging C2Quad, and my only problem with jumping on this chip is the power consumption needed to use it. I found out that my monthly consumption should be less than 500 kWh if I want to stay in my 13-cent-per-kWh billing tier. If I go over, my rate changes to 18 cents per kWh. I could stomach the added cost if the 18-cent rate only applied to the excess... but it won't. It becomes the new rate for my whole monthly energy consumption. They say it's to give people an incentive to save power, but since I am already at about 490 kWh a month on average, I don't want the added cost. Between my younger sister's and my gaming habits, we account for about 6-7 hours of usage a day, so I might end up getting the i7, since the difference in power consumption would most likely cover the price difference between the two processors in about 4 months of usage.
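The tier cliff described above is easy to sketch: the 13/18-cent rates and the 490/500 kWh figures come from the post, while the 60 W load difference and 6.5 hours/day below are made-up illustration numbers.

```python
# Sketch of the billing-tier cliff: crossing 500 kWh in a month re-rates
# the ENTIRE month at 18 cents/kWh instead of 13 cents/kWh.

def monthly_bill(kwh: float) -> float:
    rate = 0.13 if kwh <= 500 else 0.18
    return kwh * rate

print(monthly_bill(490))  # 63.70 -- current 490 kWh average, low tier

# A hypothetical 60 W wall-power difference over 6.5 h/day adds roughly
# 11.7 kWh/month, enough to push a 490 kWh average over the 500 kWh line.
extra_kwh = 60 / 1000 * 6.5 * 30
print(extra_kwh)          # 11.7

print(monthly_bill(490 + 20))  # 91.80 -- 20 kWh more costs $28.10 extra
```

So a fairly small sustained load difference can cost far more on the bill than it looks like at the socket, which is the poster's whole concern.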


----------



## Hustler (Oct 23, 2012)

9 out of 10...you've got to be fucking joking, right?

"Low" but adequate single-threaded performance will not appeal to some.

You said it...in this regard the chip is still a dog...utterly useless for emulation.



From a reviewer who knows what they're talking about...

"If you are looking to upgrade a full system then it's impossible to recommend. It's too slow, it draws too much power, it's too hot. It's just not worth it."

"We so wanted this to be a return to form for AMD. This is the best they have to offer, and they are still a mile behind the competition."

"For AMD, start with a fresh sheet of paper."


----------



## blibba (Oct 23, 2012)

For me, the only demanding thing my computer does is games. And I'm not sure I can get over this:


----------



## tacosRcool (Oct 23, 2012)

A much better CPU for the price!


----------



## DaC (Oct 23, 2012)

Well, it's a good CPU IMO.
It's certainly not enough to compete with Intel for us average gamers/users, but it sure has its uses for a different type of user, who works with applications that will benefit a lot from AMD's 8-core advantage.

As for me, I'll just look for a 2600K to upgrade from my G620.


----------



## Super XP (Oct 23, 2012)

This is a step in the right direction. Good work AMD. 
I am now itching to buy one


----------



## Melvis (Oct 23, 2012)

Hustler said:


> "If you are looking to upgrade a full system then it's impossible to recommend. It's too slow, it draws too much power, it's too hot. It's just not worth it.""



Yea, of course it's slow, the 2600K is one mega-slow CPU, hey? Power draw is down a good margin from BD, so yea, totally terrible. Shit yes, 53C is smoking hot, omg. Under $200, yea, spot on man, WAY overpriced, what were they thinking? :shadedshu


----------



## blibba (Oct 23, 2012)

Melvis said:


> Yea, of course it's slow, the 2600K is one mega-slow CPU, hey? Power draw is down a good margin from BD, so yea, totally terrible. Shit yes, 53C is smoking hot, omg. Under $200, yea, spot on man, WAY overpriced, what were they thinking? :shadedshu



What kind of response is this?

The 8350 is slower than much cheaper Intel chips in a lot of the benchmarks that _some_ people actually care about - see the 99th percentile Skyrim graph above. In addition, it uses far more power than any recent competitive Intel offering of similar performance. Particularly when overclocked, using more power translates to getting hotter.

I happen to agree that the 8350 is not impossible to recommend, but instead is only recommendable to a certain type of buyer (one who does not prioritise games, who does prioritise certain highly threaded tasks, who does not stress the CPU enough for the electricity bill to eliminate the savings). But when you put forward an argument like that, you destroy any chance of persuading anybody of its merits.


----------



## NC37 (Oct 23, 2012)

Still waiting for that "under $200" price. Guess maybe after launch week the price might finally get there.

Is there going to be an 8320 review too? Curious about the overclocking potential, what with the 500 MHz stock clock difference.


----------



## blibba (Oct 23, 2012)

NC37 said:


> Still waiting for that "under $200" price. Guess maybe after launch week the price might finally get there.
> 
> Is there going to be an 8320 review too? Curious about the overclocking potential, what with the 500 MHz stock clock difference.



See Anandtech/Techreport.


----------



## hardcore_gamer (Oct 23, 2012)

How about a 6300 review? I think it is a good budget gaming CPU.


----------



## blibba (Oct 23, 2012)

hardcore_gamer said:


> How about a 6300 review ? I think it is a good budget gaming CPU.



I think the 4300 is the pick of the bunch for gamers, as long as you don't care about the power use. Certainly I'd give it some strong consideration if I was replacing my CPU/mobo tomorrow, but as it stands it's too small an upgrade over what I have.


----------



## Fourstaff (Oct 23, 2012)

Nice improvements, but still not enough to cover the power bill deficit :/


----------



## NC37 (Oct 23, 2012)

blibba said:


> See Anandtech/Techreport.



Anand has some on it but they didn't test overclock potential. Thanks anyway; hopefully they'll post something later.


----------



## Melvis (Oct 23, 2012)

blibba said:


> What kind of response is this?
> 
> The 8350 is slower than much cheaper Intel chips in a lot of the benchmarks that _some_ people actually care about - see the 99th percentile Skyrim graph above. In addition, it uses far more power than any recent competitive Intel offering of similar performance. Particularly when overclocked, using more power translates to getting hotter.
> 
> I happen to agree that the 8350 is not impossible to recommend, but instead is only recommendable to a certain type of buyer (one who does not prioritise games, who does prioritise certain highly threaded tasks, who does not stress the CPU enough for the electricity bill to eliminate the savings). But when you put forward an argument like that, you destroy any chance of persuading anybody of its merits.



Sarcasm at its finest 

Link? From what I'm seeing it's the opposite, as shown here: http://www.guru3d.com/articles_pages/amd_fx_8350_processor_review,1.html Skyrim? Are you serious? You're going to base what you're saying on a game? Oh dear god.

Over here in the land of mega ass expensive (Australia) a 2600K will cost you $300, so if this 8350 is at $200-ish then to ME that's a lot cheaper for the same performance, don't you think?

I agree it's not for everyone, but a lot of PII users or even the odd i5 user might jump to this.

FYI, just because a CPU uses more power doesn't mean it will run hotter; the review shows 53c maxed out on all 8 cores, while a 2600K will hit 70c plus. (Stock coolers of course)


----------



## blibba (Oct 23, 2012)

Melvis said:


> Sarcasm at its finest
> 
> Link? From what I'm seeing it's the opposite, as shown here: http://www.guru3d.com/articles_pages/amd_fx_8350_processor_review,1.html Skyrim? Are you serious? You're going to base what you're saying on a game? Oh dear god.



I feel like you didn't read my post. "The 8350 is slower than much cheaper Intel chips in a lot of the benchmarks that *some* people actually care about...". For a lot of people, me included, the only time the CPU ever gets a real work out is in gaming. So I couldn't really care less how it performs elsewhere, as long as it's adequate and sips power.



Melvis said:


> Over here in the land of mega ass expensive (Australia) a 2600K will cost you $300, so if this 8350 is at $200-ish then to ME that's a lot cheaper for the same performance, don't you think?
> 
> I agree it's not for everyone, but a lot of PII users or even the odd i5 user might jump to this.



It's going to be quite a small upgrade from either of those, or even a sidegrade, unless you do a lot of encoding. But sure, depending on your workload, it might well be the same performance as a 2600K for you.



Melvis said:


> FYI, just because a CPU uses more power doesn't mean it will run hotter; the review shows 53c maxed out on all 8 cores, while a 2600K will hit 70c plus.



Well, of the electrical energy that any CPU uses, >99% is converted to heat. If the 8350 runs cooler than a chip using less power, that's likely to be to do with a superior stock cooler, or better interface between the chip and IHS.



NC37 said:


> Anand has some on it but they didn't test overclock potential. Thanks anyways, hopefully they'll post something later.



Apologies, I misunderstood your query. I thought you just wanted to see how the architecture scales with clockspeed.


----------



## DaC (Oct 23, 2012)

Melvis said:


> Sarcasm at its finest
> I agree it's not for everyone, but a lot of PII users or even the odd i5 user might jump to this.



This is a point much missed here on TPU..... remember the PC industry is not all about us enthusiasts all the time...

AMD has kept backward compatibility for a long time, which is great... a pure WIN over Intel...

sckt 1156.... sckt 1155.... and yes, there's another one coming.... 
Put one pin.... take one pin and there you go.... new motherboard needed...


----------



## blibba (Oct 23, 2012)

DaC said:


> This is a point much missed here on TPU..... remember PC industry is not all about us enthusiasts all the time...
> 
> AMD has kept backward compatibility for a long time, which is great... a pure WIN over Intel...
> 
> ...



As a Phenom II user, I have no upgrade options beyond 1100T on my motherboard. But I agree that in general AMD manages this better. Long may it continue.


----------



## NC37 (Oct 23, 2012)

Melvis said:


> I agree it's not for everyone, but a lot of PII users or even the odd i5 user might jump to this.



I'd say more PII. If you're on an i5 I don't see a reason to switch unless you really need the multithreading. For that matter, it's cheaper to upgrade to an i7 since you don't need to buy a new board to go with it. 

I'm a gamer, but I also do enough video encoding that I'll get use out of these. Piledriver performance is where Bulldozer should have been. Since I'm a PII user I'll jump on this. Hold over till Excavator, then weigh up Intel's picks post-Haswell.


----------



## Melvis (Oct 23, 2012)

blibba said:


> I feel like you didn't read my post. "The 8350 is slower than much cheaper Intel chips in a lot of the benchmarks that *some* people actually care about...". For a lot of people, me included, the only time the CPU ever gets a real work out is in gaming. So I couldn't really care less how it performs elsewhere, as long as it's adequate and sips power.



Ok, fair enough, I get what you mean there and I can understand that; if it's mainly gaming then yes, it would be a lot wiser to go with an Intel chip. For me personally, as I already purchased the motherboard months and months back, this is really the only chip I can upgrade to easily that will show me the biggest gains in games, transcoding, etc.



blibba said:


> It's going to be quite a small upgrade from either of those, or even a sidegrade, unless you do a lot of encoding. But sure, depending on your workload, it might well be the same performance as a 2600K for you.



I agree it's not going to be massive, but at this time it's your best bet IF you wanted to upgrade, and for me, having performance around a 2600K is good enough.




blibba said:


> Well, of the electrical energy that any CPU uses, >99% is converted to heat. If the 8350 runs cooler than a chip using less power, that's likely to be to do with a superior stock cooler, or better interface between the chip and IHS.



True, and I honestly think it might come down to the stock cooler, as AMD's stock 4-copper-pipe heatsink is far better than Intel's stock cooler, that's for sure. Guess the only way to really tell is to swap coolers and find out? 

@NC37 I totally agree with you man


----------



## Fatal1ty39 (Oct 23, 2012)

Will there be a CPU scaling review for the FX-8350 like Wizzard did before for the AMD FX-8150?


----------



## nt300 (Oct 23, 2012)

This is a very good release from AMD. In some games the new CPU gains over 60% in performance. I think I found my upgrade 


Absolution said:


> It's good that the 8350 finally manages to beat the Intel 2500k and in most cases is similar to the 2600k. (not in gaming though  )
> 
> But those processors are really old. Is there any advantage (feature wise / besides 8 cores) to AMD's alternative now? (being it a bit late but finally overcoming those intels)
> 
> ...


It's because AMD needs to completely redesign it, IMO. By the time Excavator comes out they will have it done, according to rumours.


----------



## blibba (Oct 23, 2012)

nt300 said:


> This is a very good release from AMD. In some games the new CPU gains over 60% in performance. I think I found my upgrade



I think you're better off sticking with your existing CPU or getting an FX4*** for games.


----------



## EarthDog (Oct 23, 2012)

Excellent review... though some things to note:

* In the conclusion, Intel is on a 22nm process not 28nm
* In the conclusion, you mention the 3770K is almost twice as much. It's $330 vs $219 (Newegg). That's not even close to twice as much (to me anyway). If you want to compare MSRP it's $195 vs $313. Closer to 33% more.

If anyone would like to see this CPU under LN2, you can find that HERE. 

EDIT: 





cadaveca said:


> No, it's a typo, will fix in a moment.
> 
> 
> EIDT: FIX'd.


Still shows 28nm in the conclusion....


----------



## MasterInvader (Oct 23, 2012)

It's time to sell my 8150 

The next objective: get a good sample [8350] and push the OC beyond 5GHz 24/7


----------



## os2wiz (Oct 23, 2012)

EarthDog said:


> Excellent review... though some things to note:
> 
> * In the conclusion, Intel is on a 22nm process not 28nm
> * In the conclusion, you mention the 3770K is almost twice as much. It's $330 vs $219 (Newegg). That's not even close to twice as much (to me anyway). If you want to compare MSRP it's $195 vs $313. Closer to 33% more.
> ...



Except you're wrong, dead wrong. The Newegg price is $25 above MSRP and will only be there for perhaps 2 weeks before they're forced to lower it to $195. They pulled the same crap when Bulldozer was released. That is why at that time I bought my CPU from Amazon for $30 less. $195 will be the price when the dust settles and the supply chain is full. You can bet your undies on that.


----------



## os2wiz (Oct 23, 2012)

blibba said:


> I think you're better off sticking with your existing CPU or getting an FX4*** for games.



Disagree, better off getting the FX-8350 for gaming. Turn Turbo Boost off and run all 8 cores. It will do quite well in all but the most finicky settings in poorly coded single-threaded games.
Overclock to 4.8-5.0 GHz if you have a decent liquid cooler.


----------



## Hillbilly (Oct 23, 2012)

EarthDog said:


> If anyone would like to see this CPU under LN2, you can find that HERE.




Thanks. 7.3GHz and under 2V. Looks like fun.


----------



## EarthDog (Oct 23, 2012)

os2wiz said:


> Except you're wrong, dead wrong. The Newegg price is $25 above MSRP and will only be there for perhaps 2 weeks before they're forced to lower it to $195. They pulled the same crap when Bulldozer was released. That is why at that time I bought my CPU from Amazon for $30 less. $195 will be the price when the dust settles and the supply chain is full. You can bet your undies on that.


Perhaps you want to look at my post again, since I compared both current pricing and MSRP, but here it is again since you clearly missed it. 

$313 MSRP (3770K) vs $195 MSRP (FX-8350) = 38% (still not what I would call close to 50%; it's MUCH closer to 33%)

Current pricing - $330 (3770K) vs $220 (FX-8350) = 33%


----------



## xenocide (Oct 23, 2012)

blibba said:


> I think you're better off sticking with your existing CPU or getting an FX4*** for games.



If you wanted a quad-core AMD CPU a Phenom II would be better than an FX4xxx CPU in almost all situations.  I'd say go for the FX8320 since it's unlocked anyway, OC it to 8350 speeds or better, and use the savings to buy a nicer cooler.  Or if gaming go for an i5-3570K if you want the OCability or lower end IB-based i5's since the i5-3570 was able to go blow for blow with the 8350 in this review.  If you wanted the best Price\Performance the FX-6300 seems pretty solid as well and is only $10 more than the equivalent FX43xx CPU.



os2wiz said:


> Except you're wrong, dead wrong. The Newegg price is $25 above MSRP and will only be there for perhaps 2 weeks before they're forced to lower it to $195. They pulled the same crap when Bulldozer was released. That is why at that time I bought my CPU from Amazon for $30 less. $195 will be the price when the dust settles and the supply chain is full. You can bet your undies on that.



Which he addressed but you chose to ignore. He said the price on Newegg for the i7 was $330 and the FX-8350 was $220, then gave the MSRP for both: $313 and $195 respectively. If you had read his whole post you would have seen he pointed out that Newegg was selling them for $25 over MSRP, and I imagine more etailers will for the first few weeks.


----------



## blibba (Oct 23, 2012)

os2wiz said:


> Disagree , better off getting the 8350 FX for gaming. Off with Turboboost, run all 8 cores. It will do quite well in all but the most finicky settings in the poorly coded single-thrreaded games.
> Overclock to 4.8-5.0 GHZ if you have a decent liquid cooler.



The FX4 are faster in nearly every low-FPS game at stock, and overclock further on any given cooler. I do not think that there are any circumstances in which an FX8 is good value in a gaming-focused build.


----------



## TheLaughingMan (Oct 23, 2012)

EarthDog said:


> Excellent review... though some things to note:
> 
> * In the conclusion, Intel is on a 22nm process not 28nm
> * In the conclusion, you mention the 3770K is almost twice as much. It's $330 vs $219 (Newegg). That's not even close to twice as much (to me anyway). If you want to compare MSRP it's $195 vs $313. Closer to 33% more.
> ...



Dave was not just talking about the CPU. He was talking about the total build, which would be CPU and motherboard (keeping everything else the same). Intel motherboards with similar features and specs tend to be $40 to $70 more expensive. So you are looking at $379 for the AMD CPU/motherboard and $550 for Intel, which is then closer to a +70% price difference.


----------



## Ravenas (Oct 23, 2012)

I will be updating to this processor on release date. I have an AM3+ motherboard already, so I'm quite happy AMD decided to stick with the AM3+ stock.


----------



## EarthDog (Oct 23, 2012)

> Dave was not just talking about the CPU. He was talking about the total build which would be CPU and motherboard (keeping everything else the same). Motherboards for Intel with similar features and specs tend to be $40 to $70 more expensive in price. So you are looking at an AMD system for $379 for CPU/Motherboard and Intel for $550 which is then closer to +70% price difference.


Oh, he didn't mention that in PM. In fact he stated he used his LOCAL pricing compared to, I'm guessing, the MSRP (which isn't a fair comparison IMO: MSRP or big vendors or bust). 

Also, I think your math is incorrect... 379 * 1.45 = ~$550, so it's 45% more than the AMD system, not the 70% you stated (379 * 1.7 = ~$644).
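To make the base of that percentage explicit, here's a quick sketch of the arithmetic (the $379 and $550 build totals are the figures quoted in the posts above, not verified prices):

```python
# Percentage difference depends on which price you use as the base.
# Hypothetical build totals from the thread: $379 AMD CPU+board vs $550 Intel.
amd_build = 379
intel_build = 550

# How much more the Intel build costs, relative to the AMD build:
extra_over_amd = (intel_build - amd_build) / amd_build
print(f"Intel build is {extra_over_amd:.0%} more than AMD")  # ~45%, not 70%

# A true +70% difference would put the Intel build at:
print(round(amd_build * 1.70))  # 644
```

Same dollar gap, but the claimed +70% only works out if you inflate the Intel total well past $550.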


----------



## os2wiz (Oct 23, 2012)

EarthDog said:


> Perhaps you want to look at my post again, since I compared both current pricing and MSRP but here it is again since you clearly missed it.
> 
> $313 MSRP(3770K) vs $195 MSRP(FX-8350) = 38% (still not what I would call close to 50%, its MUCH closer to 33%)
> 
> Current pricing - $330 (3770K) vs $220 (FX-8350) = 33%



118/195 = 60% more for the i7-3770K. They are both correct, but I think my percentage is the one someone who is deciding to purchase will look at. The 3770K is 60% more money for only a 10-15% average performance improvement. So I am not dense; perhaps some self-examination is in order on your part???
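Both percentages in this back-and-forth are arithmetically right; they just use different bases. A minimal sketch with the MSRPs quoted in the thread:

```python
# The $118 gap between the $313 i7-3770K and $195 FX-8350 MSRPs,
# expressed against each base in turn.
intel_msrp = 313
amd_msrp = 195
gap = intel_msrp - amd_msrp  # 118

saving_vs_intel = gap / intel_msrp  # "the AMD chip is ~38% cheaper"
premium_vs_amd = gap / amd_msrp     # "the Intel chip is ~60% more"

print(f"{saving_vs_intel:.1%} / {premium_vs_amd:.1%}")  # 37.7% / 60.5%
```

A buyer comparing against the cheaper chip sees the 60% figure; a reviewer discounting from the dearer one sees 38%. Neither number is wrong.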


----------



## cadaveca (Oct 23, 2012)

TheLaughingMan said:


> Dave was not just talking about the CPU. He was talking about the total build which would be CPU and motherboard (keeping everything else the same). Motherboards for Intel with similar features and specs tend to be $40 to $70 more expensive in price. So you are looking at an AMD system for $379 for CPU/Motherboard and Intel for $550 which is then closer to +70% price difference.



Nah, local at time of writing the 3770K was $379, which is $10 short of double. Of course, I had no idea of actual retail pricing for the AMD chips when I wrote the review.


----------



## EarthDog (Oct 23, 2012)

cadaveca said:


> Nah, local at time of writing the 3770K was $379, which is $10 short of double. Of course, I had no idea of actual retail pricing for the AMD chips when I wrote the review.


Like I said in PM, you can't compare retail at your local yokel store vs MSRP...


----------



## cadaveca (Oct 23, 2012)

EarthDog said:


> Like I said in PM, you can't compare retail at your local yokel store vs MSRP...



I can compare whatever I want, thanks; that's my opinion. 

Since MSRP isn't the actual sale price, it's purely subjective thinking anyway. Or do you write conclusions seconds before the article goes live? 


I actually wrote most of this review weeks ago. Got the chip nearly 3 weeks ago to the day. 


It was very interesting to see everyone guessing at stuff prior to the launch, to be honest.


----------



## Super XP (Oct 23, 2012)

EarthDog said:


> This guy os2wiz is batting ZERO in this thread so far... LOL!


Useless post. You already explained his error. Give him a chance to post a thank you 

Bulldozer FX-8120 @ 4.40GHz w/ 8- Cores w/ 1.375v. (Beyond Stable) 
Will be going up for sale soon


----------



## EarthDog (Oct 23, 2012)

os2wiz said:


> 118/195 = 60% more for the i7-3770K. They are both correct, but I think my percentage is the one someone who is deciding to purchase will look at. The 3770K is 60% more money for only a 10-15% average performance improvement. So I am not dense; perhaps some self-examination is in order on your part???


Perhaps I should, my bad... But I'm still not dead wrong, since I gave CORRECT values to base it off of even if my math was a horrible fail!! LOL! So let's both take our feet out of our mouths. 



cadaveca said:


> I can compare whatever I want, thanks, that's my opinion.
> 
> SInce MSRP isn't actual sale price, it's purely subjective thinking anyway. OR do you write conclusions seconds before the article goes live?
> 
> ...


You can do whatever the hell you want, but basing it off your LOCAL Canadian pricing for Intel vs MSRP isn't painting a remotely accurate picture. MSRP vs MSRP, or retail vs retail that isn't up in bumblefunk Canada, would have been better IMO.


----------



## cadaveca (Oct 23, 2012)

EarthDog said:


> You can do whatever the hell you want, but basing it off your LOCAL Canadian pricing vs MSRP isn't painting a remotely accurate picture.



Yes and no. Like, I get your point, but I actually expected retail pricing to match MSRP.

The 3770K list price now in the US should be $359. And since I said "about" I don't need exact math... so there!


----------



## os2wiz (Oct 23, 2012)

EarthDog said:


> This guy os2wiz is batting ZERO in this thread so far... LOL!



No, you are not batting at all. Why is he using the price difference over the Intel price when he should be using the price difference over the AMD chip price? As a potential buyer, the 3770K is over 60% higher in price. He is trying to minimize the appearance of the difference by saying the AMD chip is 38% less than the Intel. The old saying is that figures don't lie, but liars sure can figure. I am not accusing you of lying, because you didn't. But you did manipulate the data to make it appear that one would save only 38% by buying AMD, when it is just as accurate, and more relevant to the cash-strapped buyer, to see the Intel chip as 60% more than the 8350. Both are truthful, but one is deceptive.

  I am not batting zero; I am using my brain. More than I can say for you. Don't insult me or laugh at me again on this forum or I will scientifically make you look a lot smaller in other people's eyes than you want to. Feel the pain??


----------



## EarthDog (Oct 23, 2012)

It's not $359 in the USA... Not at Newegg ($330), not at TigerDirect ($320), or Amazon ($330).


----------



## os2wiz (Oct 23, 2012)

Super XP said:


> Useless post. You already explained his error. Give him a chance to post a thank you
> 
> Bulldozer FX-8120 @ 4.40GHz w/ 8- Cores w/ 1.375v. (Beyond Stable)
> Will be going up for sale soon



But it was not an error. And his post was not only useless but INSULTING. I will NOT tolerate that kind of abuse. No one should.


----------



## EarthDog (Oct 23, 2012)

You are right... that 2nd post was not needed, and I do apologize. I was put on the defensive immediately with your incorrect "you are wrong, dead wrong" statement and that set the tone... let's be clear about that. You called me out first, kiddo (and you were wrong/didn't read my post).

/threadjack. If you want to continue complaining, I have a PM box that is empty and waiting.


----------



## os2wiz (Oct 23, 2012)

cadaveca said:


> Yes and no. Like, I get your point, but I actually expected retail pricing to match MSRP.
> 
> The 3770K list price now in the US should be $359. And since I said "about" I don't need exact math... so there!



Retail pricing for the AMD chip will match MSRP in a matter of weeks. It takes time for the supply chain to be filled up. Once pent-up demand is met the price will drop. In any case I got it for a total of $210.90 from BLT. I refuse to be ripped off by Newegg.


----------



## os2wiz (Oct 23, 2012)

xenocide said:


> If you wanted a quad-core AMD CPU a Phenom II would be better than an FX4xxx CPU in almost all situations.  I'd say go for the FX8320 since it's unlocked anyway, OC it to 8350 speeds or better, and use the savings to buy a nicer cooler.  Or if gaming go for an i5-3570K if you want the OCability or lower end IB-based i5's since the i5-3570 was able to go blow for blow with the 8350 in this review.  If you wanted the best Price\Performance the FX-6300 seems pretty solid as well and is only $10 more than the equivalent FX43xx CPU.
> 
> 
> 
> Which he addressed but you chose to ignore.  He said the price on Newegg for the i7 was $330 and the FX-8350 was $220.  Then said the MSRP for both; $313 and $195 respectively.  If you had read his whole post you would have seen he pointed out that Newegg was selling them for $25 over the MSRP--and I imagine more etailers will for the first few weeks.



It's not only the MSRP vs Newegg price; that was the sidebar. The main point I made is that he is using the price difference over the Intel price. That is deceptive. The price difference should be over the AMD price, to show how much more, percentage-wise, the Intel chip is. This is basic arithmetic; where is the disconnect???


----------



## EarthDog (Oct 23, 2012)

os2wiz said:


> It's not only the MSRP vs Newegg price; that was the sidebar. The main point I made is that he is using the price difference over the Intel price. That is deceptive. The price difference should be over the AMD price, to show how much more, percentage-wise, the Intel chip is. This is basic arithmetic; where is the disconnect??


You are correct... but perhaps read the entire thread before you reply again, as I have already conceded my math was a fail in that example. Can you(we) move on?


----------



## Tatty_One (Oct 23, 2012)

Talk and counter-talk about pricing, MSRP, e-tailer prices and such is starting to get a little tiresome now, and to a degree is detracting from the CPU reviewed here. Whilst I appreciate pricing is important, and comparisons will always be made, this "tit4tat" approach kinda feels a bit like a broken record now, so could we please get back on track and talk about a decent review of a decent CPU? (notice even I was careful there!) ..... thank you.


----------



## cadaveca (Oct 23, 2012)

os2wiz said:


> It's not only the MSRP vs Newegg price; that was the sidebar. The main point I made is that he is using the price difference over the Intel price. That is deceptive. The price difference should be over the AMD price, to show how much more, percentage-wise, the Intel chip is. This is basic arithmetic; where is the disconnect???



hmm.

195 x2 = 390.

local IVB price = 379.


You guys are arguing about silly stuff here, next up is infractions!

I was using a "generalization", BTW.


----------



## os2wiz (Oct 23, 2012)

EarthDog said:


> You are right... that 2nd post was not needed... and I do apologize. I was put on the defensive immediately wih your incorrect "you are wrong, Dead wrong' statemet and that set the tone... lets be clear about that. you called me out first kiddo (and was wrong/didnt read my post).
> 
> /threadjack. You want to continue complaining, I have a PM box that is empty and waiting.



And also apologize for stating my analysis of pricing was wrong. My math was 100% on the money. My point was focused on your percentage-of-price-difference claim. Don't try to shift it to a secondary issue. Admit my logic is correct and yours was deceptive, understating the price difference. Then apologize again for being insulting. Then I'll bury this issue and we can move on.


----------



## Norton (Oct 23, 2012)

On topic: thanks for a great review, Dave. I will be buying an 8350 to update the 8120 in one of my WCG crunchers


----------



## EarthDog (Oct 23, 2012)

It just makes no sense to me to base that comparison off your inflated LOCAL Canadian prices. Both NCIX and newegg.ca have it for $320/$330 respectively. I mean, as you mentioned it's up to you of course, but... it makes no sense at all to compare such a ripoff price for a 3770K with MSRP.

Oh well.


----------



## cadaveca (Oct 23, 2012)

EarthDog said:


> It just makes no sense to me to base that comparison off your inflated LOCAL Canadian prices. Both NCIX and newegg.ca have it for $320/$330 respectively. I mean, as you mentioned it's up to you of course, but... it makes no sense at all to compare such a ripoff price for a 3770K with MSRP.
> 
> Oh well.



It makes even less sense to dwell on it when the FX-8350 is meant to be compared with the i5-3570K. It was never meant to be an exact number, or I would have given an exact figure instead of "nearly twice". In fact, that's about the only time I even refer to the 3770K, and I was thinking about the best you could buy from either company.

Anyway, you've stated your opinion here a couple of times now, so...


----------



## Disparia (Oct 23, 2012)

Nice, I may upgrade to an FX-8350... when it's cheaper. With four users in the house, I want to move towards smaller and more efficient desktops and servers. No mITX boards, and mATX boards only with the older chipsets, means I won't be putting much money towards the AM3+ platform.

So I'm not unhappy with AMD, just that there are so many more board options on the Intel side.


----------



## cadaveca (Oct 23, 2012)

Jizzler said:


> So I'm not unhappy with AMD, just that there are so many more board options on the Intel side.



Good point. There are not many 9-series boards available right now other than ATX ones. 

Quite a few good options under $100 though.


----------



## badtaylorx (Oct 23, 2012)

cdawall said:


> Which part is the garbage part? Power consumption is down, performance is up, and the CPU's clock well, not to mention they are drop in upgrades for most people with anything from a Phenom II to the "garbage" Bulldozer chips.



  dude, don't feed 'em.....


----------



## GC_PaNzerFIN (Oct 23, 2012)

Still way too slow single-thread performance, and a very uncertain performer in general. Sometimes very fast, sometimes very slow. Sure, it is better than old Bulldozer, but I would find it hard to recommend this over the 3570K for general use, i.e. gaming.


----------



## MxPhenom 216 (Oct 23, 2012)

Seems like a pretty solid chip I must say. Nice work AMD.


----------



## cadaveca (Oct 23, 2012)

GC_PaNzerFIN said:


> Still way too slow single-thread performance, and a very uncertain performer in general. Sometimes very fast, sometimes very slow. Sure, it is better than old Bulldozer, but I would find it hard to recommend this over the 3570K for general use, i.e. gaming.



Having used both for several weeks as daily systems, I have no problems recommending the FX-8350. Plain and simple: it's cheaper, and when it loses in performance, that difference in price more than makes up for it, I think.


If you already have an SKT1155 system, no, these chips aren't for you. If you are buying a new Windows 8 PC, as many will be doing very soon, that difference in price can save you quite a bit overall, and you'll end up with a cheaper system that is just as capable.

Very few things are single-threaded at this point, and even Intel is pushing multi-threading to devs.

Also notice that I listed my Intel i7-3820 out of my gaming rig for sale. Guess what will replace it?


----------



## GC_PaNzerFIN (Oct 23, 2012)

cadaveca said:


> Also notice that I listed my Intel i7-3820 out of my gaming rig for sale. Guess what will replace it?



3970X?


----------



## TheoneandonlyMrK (Oct 23, 2012)

cadaveca said:


> It was very interesting to see everyone guessing at stuff prior to the launch, to be honest.



Yeah, I had thought clock mesh was in, apparently not :shadedshu. Still a good review. I was eagerly looking forward to buying an 8350, but I'm homeless come the weekend so that plan's back on the back burner. I must say though, in all the threads where AMD bashing goes on, I've been sat with a 960T, apparently some old tat that can't do much these days with only 50 gops of oomph, and it still ultras the world at 1080p while people call many a better chip rotten. Fussy odd world, isn't it?

Cheers for the read, Dave. Moar pages next time please, it ran out too quick as ever



cadaveca said:


> Also notice that I listed my Intel i7-3820 out of my gaming rig for sale. Guess what will replace it?




A recommendation indeed..


----------



## cadaveca (Oct 23, 2012)

GC_PaNzerFIN said:


> 3970X?






I no has Intel contact. I'd love one of those though, but alas...I still buy my Intel chips @ retail.


harr harr.



theoneandonlymrk said:


> Yeah, I had thought clock mesh was in, apparently not :shadedshu. Still a good review. I was eagerly looking forward to buying an 8350, but I'm homeless come the weekend so that plan's back on the back burner. I must say though, in all the threads where AMD bashing goes on, I've been sat with a 960T, apparently some old tat that can't do much these days with only 50 gops of oomph, and it still ultras the world at 1080p while people call many a better chip rotten. Fussy odd world, isn't it?
> 
> cheers for the read dave , moar pages next time please ,it ran out too quick as ever
> 
> ...



Unfortunately there was limited time for testing due to waiting for the BIOS, then having to retest everything with the new BIOS, waiting for drivers... etc. AMD did a good job of keeping things quiet before the launch. I asked every single OEM for a board to test with... not one answered. And AMD didn't send one. Oh well, thankfully I kept one on hand... I just recently cleaned out a tonne of boards since the wife was starting to eye them up a bit too much...


----------



## cdawall (Oct 23, 2012)

GC_PaNzerFIN said:


> Still way too slow single thread performance. And very uncertain performer in general. Sometimes very fast, sometimes very slow. Sure, it is better than old Bulldozer but I would find it hard to recommend this over 3570K for general use ie. gaming.



Why? Because some poorly coded games show a 10 millisecond difference? If we are going off of that, all of the Phenom II owners had better just sit where they are. Single-threaded IPC is dead, welcome to 2010. I couldn't care less if a Pentium MMX chip beat it in single-threaded IPC; that is not what these chips are designed to do, and the sheer fact that they are even keeping up is astounding. 

When we break into actual multithreading







Somehow AMD is beating the $350 processor and competing with the $999 one.

Even in Cinebench, which is immensely Intel-biased.






AMD beats all of the chips in its price class.






Hmmm yet again






And WTF is this? Something optimized for all the different CPUs out there (not just Intel), and the AMD core kicks ass. Likely due to AVX instructions, but still.



cadaveca said:


> Very few things are single-threaded at this point, and even Intel is pushing multi-threading to devs.



Weird, considering that for the past 5-6 years everyone has had dual cores or better.



			
cadaveca said:

> review snip



I noticed that in your system builds the Intel boxes had 2133 MHz RAM in one and 2666 MHz RAM in the other, while the AMD had 1866 MHz RAM. What was the actual clock on all of the RAM kits?


----------



## cadaveca (Oct 23, 2012)

cdawall said:


> Why? Because some poorly coded games show a 10 millisecond difference? If we are going off of that, all of the Phenom II owners had better just sit where they are. Single-threaded IPC is dead, welcome to 2010. I couldn't care less if a Pentium MMX chip beat it in single-threaded IPC; that is not what these chips are designed to do, and the sheer fact that they are even keeping up is astounding.
> 
> When we break into actual multithreading
> 
> ...



1600 MHz for the 3770K, 1600 MHz for the 3570K. Board BIOSes made XMP profiles not work, unfortunately, and since 1866 MHz is what the FX-8350 supports by default and 1600 MHz is the IVB default, the speeds were appropriate.

3770K was 11-11-11-28, 3570K was 9-9-9-24.


----------



## cdawall (Oct 23, 2012)

cadaveca said:


> 1600 MHz for the 3770K, 1600 MHz for the 3570K. Board BIOSes made XMP profiles not work, unfortunately, and since 1866 MHz is what the FX-8350 supports by default and 1600 MHz is the IVB default, the speeds were appropriate.



Thanks, that's what I was curious about. 

My mind is still blown by that zlib benchmark. I cannot wait until Cinebench is recoded to allow AMD to use AVX.


----------



## cadaveca (Oct 23, 2012)

cdawall said:


> Thanks, that's what I was curious about.
> 
> My mind is still blown by that zlib benchmark. I cannot wait until Cinebench is recoded to allow AMD to use AVX.



I think it has been, but the build was not released in time for testing... I think AMD gave me a build on Friday?

You can see in the power consumption that, when doing certain tasks, the software isn't using all of the chip it can, in some instances.


----------



## tehehe (Oct 23, 2012)

Still bad for games. If multithreading in gaming is the future, then I will buy an adequate CPU when that future comes, not now. These CPUs are simply ahead of their time, and that's not a compliment.


----------



## cadaveca (Oct 23, 2012)

tehehe said:


> Still bad for games.



Compared to what?

I cannot say it's bad for gaming. Been playing a fair bit of BF3 and some Rocksmith as of late though, and not much else.


----------



## BlackOmega (Oct 23, 2012)

[H]@RD5TUFF said:


> Suripse . . ..  err wait nope still the same garbage as the last round of AMD chips, when will AMD get their act together and make a real chip.:shadedshu



Same garbage? Hardly.

There's been so much improvement over BD that it's fairly striking. 1866 MHz memory stock! That's huge in and of itself. Need I remind you that AMD chips net huge performance gains from OCing the memory.
Performance is up, power consumption is down, and for a lot of 990 users it's a direct drop-in.

Is it going to compete with Intel? Only if it's priced accordingly. Performance-wise, AMD simply can't compete; Intel spends more on R&D in a month than AMD's YEARLY budget, so it's no surprise that Intel chips are faster. Actually, it's rather impressive that a relatively small company like AMD can compete at all.


Regarding humansmoke's comment saying that using anything less than LinX to stress test is "cheating", I find it rather funny. LinX/IBT are good for testing Intel silicon, not AMD. From my own and some friends' testing of AMD overclocks, those two tests totally suck at testing AMDs and have been removed from my stress-testing regimen.
A friend ran 18 hours of LinX on his AMD (max everything) and passed with flying colors. He turned on F@H and it BSOD'd within 5 minutes.
Prime95 and F@H are still the best stress tests for AMDs. Too bad that a little-known stress tester, S&M, didn't continue into the Windows 7 era. It would, by far, get my AMD silicon the hottest.


----------



## cdawall (Oct 23, 2012)

cadaveca said:


> I think it has been, but the build was not released in time for testing... I think AMD gave me a build on Friday?
> 
> You can see in the power consumption that, when doing certain tasks, the software isn't using all of the chip it can, in some instances.



Are you planning an update with games, new benchmarks etc?



tehehe said:


> Still bad for games. If multithreading in gaming is the future, then I will buy an adequate CPU when that future comes, not now. These CPUs are simply ahead of their time, and that's not a compliment.



Well, for the $100-200 you save over a 3770K, which is apparently the best processor on Earth according to team Intel, you can take the 7870 you budgeted for and buy a 7970 instead. The good news is that instead of a 0.5-5 FPS difference, now you are getting a 15-20 FPS difference. So which is the better value for games in your book? A 3770K with a midrange card, or an 8350 with a high-end card? I sure plan on the high-end card.


----------



## cadaveca (Oct 23, 2012)

cdawall said:


> Are you planning an update with games, new benchmarks etc?



Provide me a list of games with automated benchmarks and I will gladly run them, provided I own the title. Play-through benchmarks are not an option.


I just purchased Sleeping Dogs in the Steam sale, so that might be added soon. I need to check and see if these games show differences first, etc. I do tend to vet my tests to make sure they actually show true performance differences, or if they show garbage.

Like Sniper Elite V2... I bench it for every motherboard... it shows ZERO differences, but for CPU testing, with big speed differences, it does. So it's in my CPU/APU reviews, but not board or memory reviews. It might make the cut for memory... testing is still underway.

Heck, I'm re-testing an APU board right this second because of the driver updates. The SB driver improves SATA numbers for FM2, apparently.


----------



## STCNE (Oct 23, 2012)

cadaveca said:


> Provide me a list of games with automated benchmarks and I will gladly run them, provided I own the title. Play-through benchmarks are not an option.
> 
> 
> I just purchased Sleeping Dogs in the Steam sale, so that might be added soon. I need to check and see if these games show differences first, etc. I do tend to vet my tests to make sure they actually show true performance differences, or if they show garbage.
> ...



GTA IV and ARMA 2 have built-in benchmarks like that, and they tend to be CPU-heavy games. It would be interesting to see how the chip stands up. GTA IV needed a pretty beefy CPU last I played it (over a year ago). It ran decently on my old i5 750, but was only smooth after I did some OCing.


----------



## cdawall (Oct 23, 2012)

cadaveca said:


> Provide me a list of games with automated benchmarks and I will gladly run them, provided I own the title. Play-through benchmarks are not an option.
> 
> 
> I just purchased Sleeping Dogs in the Steam sale, so that might be added soon. I need to check and see if these games show differences first, etc. I do tend to vet my tests to make sure they actually show true performance differences, or if they show garbage.
> ...



Crysis and DiRT Showdown have them. I will have to look around for what else does.


----------



## cadaveca (Oct 23, 2012)

cdawall said:


> Crysis and DiRT Showdown have them. I will have to look around for what else does.



Yeah, perhaps start a thread? We can poll other users, etc. I'm always open to changing testing; this and the APU review are my first CPU-type reviews, so I am definitely open to changing things a bit.


I just can't devote too much time to testing though, and each test MUST show an actual difference. If there is less than 1 FPS difference, I won't use it. If the results are not repeatable, I won't use it. If I cannot affect results by tweaking system speeds, I won't use it.  


I'm kind of a benchmarking snob these days. However, I've been so busy with reviews the past year or so that I am not really up to date on what's out there game-wise.
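Those three cut-off rules can be sketched as a quick script. Here is a minimal, hypothetical Python check; the thresholds, run counts, and function name are illustrative, not the review's actual procedure:

```python
from statistics import mean, stdev

def benchmark_is_usable(baseline_runs, tweaked_runs,
                        min_delta_fps=1.0, max_spread_fps=0.5):
    """Apply the three vetting rules: repeatable results, a real
    (>1 FPS) difference, and sensitivity to system-speed tweaks."""
    # Rule: results must be repeatable (low spread across repeated runs).
    if stdev(baseline_runs) > max_spread_fps or stdev(tweaked_runs) > max_spread_fps:
        return False
    # Rules: tweaking system speeds must move the average
    # by more than the ~1 FPS noise floor.
    return abs(mean(tweaked_runs) - mean(baseline_runs)) > min_delta_fps

# A benchmark that barely moves when the system is tweaked gets cut:
print(benchmark_is_usable([60.1, 60.2, 60.0], [60.5, 60.6, 60.4]))  # False
print(benchmark_is_usable([60.1, 60.2, 60.0], [72.3, 72.1, 72.4]))  # True
```

In practice each FPS list would come from repeated runs of the same automated benchmark at two different system speeds.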


----------



## cdawall (Oct 23, 2012)

cadaveca said:


> Yeah, perhaps start a thread? We can poll other users, etc. I'm always open to changing testing; this and the APU review are my first CPU-type reviews, so I am definitely open to changing things a bit.
> 
> 
> I just can't devote too much time to testing though, and each test MUST show an actual difference. If there is less than 1 FPS difference, I won't use it. If the results are not repeatable, I won't use it. If I cannot affect results by tweaking system speeds, I won't use it.
> ...



ARMA II requires some very high-end SSDs to really show differences. Crysis is the easiest one and shows tons of difference. Same goes for DiRT. I will post a couple of quickies between stock and 3.9 GHz on my chip, in a new thread obviously.

http://www.techpowerup.com/forums/showthread.php?p=2755212#post2755212

^Thread for discussion^


----------



## EarthDog (Oct 23, 2012)

cadaveca said:


> Compared to what?
> 
> I cannot say it's bad for gaming. Been playing a fair bit of BF3 and some Rocksmith as of late though, and not much else.


Have you had a chance to compare BF3 on the 8350 and 3570K/3770K at the same clock and memory speeds? That should isolate things as best as possible to show CPU differences in gaming. I don't think there will be much difference with midrange cards, but when you jump up, and especially in SLI/CrossFire, I have a hunch that the Intel will still pull away as it has in the past. 

This is in OLD LINK, but a reference point nonetheless. I'm sure it will depend on the title, resolution and cards used, but... it was true two years ago with less powerful CPUs... I don't imagine it has changed much now? No clue... but it would be interesting to see actual H2H competition as best we can (same memory capacity/speed/timings, same CPU clock speeds, and of course the same GPU).


----------



## cadaveca (Oct 23, 2012)

EarthDog said:


> Have you had a chance to compare BF3 on the 8350 and 3570K/3770K at the same clock and memory speeds? That should isolate things as best as possible to show CPU differences in gaming. I don't think there will be much difference with midrange cards, but when you jump up, and especially in SLI/CrossFire, I have a hunch that the Intel will still pull away as it has in the past.



The 3570K and 8350 are neck-and-neck with 7950 CrossFire at stock, which is what I used for testing.


I have not done a "clock-for-clock" comparison between Intel/AMD, as I see no point. Then the voltages might be different, and then those could be compared... I'm not into comparing how every aspect of the chips is different. I am more interested in showing them as they will be used, rather than in hypothetical situations. With hypotheticals, I might as well post nothing but Futuremark benchmarks.


----------



## suraswami (Oct 23, 2012)

cadaveca said:


> Provide me a list of games with automated benchmarks and I will gladly run them, provided I own the title. Play-through benchmarks are not an option.
> 
> 
> I just purchased Sleeping Dogs in the Steam sale, so that might be added soon. I need to check and see if these games show differences first, etc. I do tend to vet my tests to make sure they actually show true performance differences, or if they show garbage.
> ...



Nice review Dave.

People like me use these desktop parts for home-brewed servers, VMs and databases for study and practice. Any chance you can test the 8350's SQL Server 2012 multi-threaded performance and VMware multi-VM performance?


----------



## EarthDog (Oct 23, 2012)

Wait, aren't they usually BOTH used overclocked, even here at TPU? Isn't CrossFire/SLI gaming a 'real' situation? I couldn't care less about the synthetics...

I mean, I expect that with a 600 MHz clock-speed difference you won't see much difference, but what about both of these at 4.6 GHz? A common overclock for the 3570K and likely the 8350? What do games show across CPU-limited resolutions (say 1280x1024) and GPU-limited ones (say 1920x1200)? In the words of Led Zeppelin, does 'the song remain the same'?


----------



## tehehe (Oct 23, 2012)

cadaveca said:


> Compared to what?
> 
> I cannot say it's bad for gaming. Been playing a fair bit of BF3 and some Rocksmith as of late though, and not much else.



Compared to identically priced 2500k. If a game is cpu intensive then Intel wins. Otherwise it's a tie. So Intel is better for people that game a lot - like me. AMD is bad in this competition as far as games are concerned.


----------



## cadaveca (Oct 23, 2012)

tehehe said:


> Compared to identically priced 2500k. If a game is cpu intensive then Intel wins. Otherwise it's a tie. So Intel is better for people that game a lot - like me. AMD is bad in this competition as far as games are concerned.



I do not really know of many games that are really CPU-limited, though. Maybe five...

And then we are talking about FPS differences that do not make a real difference to actual game playability.


Numbers are one thing, but actual experience offered is another.


----------



## Maban (Oct 23, 2012)

How does it do at 5GHz with F@H?


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> Wait, aren't they usually BOTH used overclocked, even here at TPU? Isn't CrossFire/SLI gaming a 'real' situation? I couldn't care less about the synthetics...



A lot of people don't overclock as much as you think, especially in CrossFire and SLI when the temps really start to climb.



EarthDog said:


> I mean, I expect that with a 600 MHz clock-speed difference you won't see much difference, but what about both of these at 4.6 GHz? A common overclock for the 3570K and *likely the 8350?* What do games show across CPU-limited resolutions (say 1280x1024) and GPU-limited ones (say 1920x1200)? In the words of Led Zeppelin, does 'the song remain the same'?



The 8350 in every single benchmark review posted has done 5 GHz stable. Why do you think they are only going to get 4.6 GHz now? Is there some Intel conspiracy in your pocket that is going to make them unstable if they start performing better? I expect to see the actual later retail chips in the 5.5 GHz range without problems. Dave got his up to 5 GHz on the AMD AIO water cooling, which is no better than an H70; that is about the same cooling power as a high-end air cooler. Real water cooling will only do better.


----------



## Dent1 (Oct 23, 2012)

It comes down to balance, too.

Compare similarly priced Intel and AMD solutions: Intel wins the majority of games, but loses in a landslide in most multithreaded work-related tasks. Looking at Guru3D, amongst a few other review sites, the AMD solution wins the majority of multithreaded tasks, which is a huge positive. IMO the AMD solution is a clear winner for anyone using their PC for work more than 40% of the time.




tehehe said:


> Compared to identically priced 2500k. If a game is cpu intensive then Intel wins. Otherwise it's a tie. So Intel is better for people that game a lot - like me. AMD is bad in this competition as far as games are concerned.



In your case, a lot of the gaming benchmarks were single-threaded or old games. For somebody holding onto their rig for up to 5 years, AMD might still be the optimal solution for gaming too, as with every new release we're seeing more emphasis on multi-core compatibility. In the next couple of years, if games utilise cores the way Cinebench does, Piledriver could stand a great chance at dominating gaming benches too, with maturity. 

In closing, the 2500K might dominate gaming today, but it might lose by a greater margin to Piledriver tomorrow. Think about it.


----------



## EarthDog (Oct 23, 2012)

LOL, it was just a random number cdawall, no Intel conspiracy! Fine, jack up the 3570K to 4.9 GHz or 5 GHz like I occasionally run mine at (H100). The point was that it has a 600 MHz head start, so I would expect things to be a lot closer than if both were clocked the same... regardless of that clock speed.

Though I doubt retail chips are going to hit 5.5 GHz 24/7 stable with 'normal' cooling (ambient water or less). I could be wrong, and actually hope I am... until then, it's all speculation.


----------



## cadaveca (Oct 23, 2012)

EarthDog said:


> Fine, jack up the 3570K to 4.9Ghz or 5Ghz like I occasionally run mine at (H100).



My 3570K does no more than 4.5 GHz.

It's very interesting how there is such a difference between my 3770K and my 3570K, too. I can run 1.4 V through the 3570K no problem, and not break 85°C.


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> LOL, it was just a random number cdawall, no Intel conspiracy! Fine, jack up the 3570K to 4.9 GHz or 5 GHz like I occasionally run mine at (H100). The point was that it has a 600 MHz head start, so I would expect things to be a lot closer than if both were clocked the same... regardless of that clock speed.
> 
> Though I doubt retail chips are going to hit 5.5 GHz 24/7 stable with 'normal' cooling (ambient water or less). I could be wrong, and actually hope I am... until then, it's all speculation.



Your chip is a good one; as with Dave, most do not go over 4.5 GHz. So what is to say a good 8350 isn't going to do 5.5? We already saw this with the Thuban and Deneb chips: "good" chips could run 4.5 GHz stable on air, while "normal" chips did no more than 4 GHz. That is the same 500 MHz difference. With your complete and utter lack of knowledge of the facts, why are you speculating? If you really want to get silly about it, most people do not overclock, so the performance gains in multithreading at stock speeds make a massive difference. 



cadaveca said:


> My 3570K does no more than 4.5 GHz.
> 
> It's very interesting how there is such a difference between my 3770K and my 3570K, too. I can run 1.4 V through the 3570K no problem, and not break 85°C.



See that is a lot more normal.


----------



## YautjaLord (Oct 23, 2012)

Now that the PD reviews flood the interwebz (& TPU - thanks a lot cadaveca), the question is: where's the "AMD FX (Piledriver) OCers Club" thread? Even more important: has anyone tested it with the same cooling as me, i.e. TR's Venomous X & AS5? Or does the LCS that comes with the FX-8350 fare better than what I have?  

P.S. I might change my mobo to a Crosshair V Formula & OS to Win 7 Ultimate 64-bit for this CPU quite soon.


----------



## Batou1986 (Oct 23, 2012)

Yay so this will be a good upgrade to my fx-4100.


----------



## EarthDog (Oct 23, 2012)

cdawall said:


> Your chip is a good one; as with Dave, most do not go over 4.5 GHz. So what is to say a good 8350 isn't going to do 5.5? We already saw this with the Thuban and Deneb chips: "good" chips could run 4.5 GHz stable on air, while "normal" chips did no more than 4 GHz. That is the same 500 MHz difference. With your complete and utter lack of knowledge of the facts, why are you speculating? If you really want to get silly about it, most people do not overclock, so the performance gains in multithreading at stock speeds make a massive difference.
> 
> 
> 
> See that is a lot more normal.


I bin chips and must be lucky. I haven't had one (out of 10) do less than 4.5 GHz (ambient water)... voltage walls, and therefore temperatures, on these stupid TIM-below-the-IHS chips do tend to go up after that, yep.



> My 3570K does no more than 4.5 GHz.
> 
> It's very interesting how there is such a difference between my 3770K and my 3570K, too. I can run 1.4 V through the 3570K no problem, and not break 85°C.


As I'm sure you know, it's all about the leakage. 



> With your complete and utter lack of knowledge of the facts, why are you speculating?


I'm sorry, my what? Any need for this disparaging comment? I suppose I deserve it for calling out that other guy... but let's stop, eh? I'm not an AMD guy, but I read the same forums you do and am not a muppet... 

Anyway, they could, I said I hope I'm wrong, what gets you off my dick? Agreeing with you that it could hit 5.5 GHz? Ok... It could commonly hit 5.5 GHz... only time will tell. WAIT! I already said that... hmmmmmm.


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> I bin chips and must be lucky. I haven't had one (out of 10) do less than 4.5 GHz (ambient water)... voltage walls, and therefore temperatures, on these stupid TIM-below-the-IHS chips do tend to go up after that, yep.



I used to heavily bin chips, hence why my first-batch 1090T was kicking around the 4.5 GHz area. Heck, I have gotten my B97 eBay chip up to 4.6 GHz on an H70... I really do see these chips clocking much higher than listed in benchmarks. I am more on board for the better IMC in them than anything else. 

My personal issue with them is that there is no good Mini-ITX board for AM3 out there. Zotac has an 890GX board, but it only has an x1 PCI-E slot. I need a full slot and 125 W support. I wish I could snag one for my deployment box, but it looks like I will be dealing with my little X3440 @ 4.2 :shadedshu.



EarthDog said:


> I'm sorry, my what? Any need for this disparaging comment? I suppose I deserve it for calling out that other guy... but let's stop, eh? I'm not an AMD guy, but I read the same forums you do and am not a muppet...



You can take it as you want; I meant it as: you are an Intel guy with very little AMD experience, sitting and preaching that AMD is not as good. 



EarthDog said:


> Anyway, they could, I said I hope I'm wrong, what gets you off my dick? Agreeing with you that it could hit 5.5 GHz? Ok... It could commonly hit 5.5 GHz... only time will tell.



They could do a lot of things. You don't know, and your blatant disregard for reading the reviews posted is obvious.


----------



## EarthDog (Oct 23, 2012)

I'm not an Intel guy... outside of the fact that I use their CPUs (but not a fanboy, which is how I thought you meant that). If AMD performance matched Intel, specifically for benchmarking, I would be ALL OVER THEM. Performance does drive me since I do benchmark, so it's clear why I own Intel, as they do better at HWBot in 3D/2D.

I don't recall saying they weren't as good either. Put it back in your pants man, there is no battle here, just a (futile?) attempt to figure out more things about its performance. I've read our review, I've read this one. I see that in single-threaded performance it's lacking, but it does well in multithreaded work that isn't FPU-heavy. Its pricing is incredible, making it a valid choice for anything these days. 

Where am I wrong in that opinion?


----------



## cadaveca (Oct 23, 2012)

EarthDog said:


> As I'm sure you know, it's all about the leakage.



Yeah, but the change between the two is so large... so much greater than anything I am really used to. I didn't see that with SNB at all.


I am going to ask AMD for a few more chips. Perhaps we can get some clocking going over the winter. You guys game for some challenges?


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> I'm not an Intel guy... outside of the fact that I use their CPUs (but not a fanboy, which is how I thought you meant that). If AMD performance matched Intel, specifically for benchmarking, I would be ALL OVER THEM. Performance does drive me since I do benchmark, so it's clear why I own Intel, as they do better at HWBot in 3D/2D.
> 
> I don't recall saying they weren't as good either. Put it back in your pants man, there is no battle here, just a (futile?) attempt to figure out more things about its performance. I've read our review, I've read this one. I see that in single-threaded performance it's lacking, but it does well in multithreaded work that isn't FPU-heavy. Its pricing is incredible, making it a valid choice for anything these days.
> 
> Where am I wrong in that opinion?



Look closer at the FPU benchmarks AMD does poorly in and you will notice that all AMD chips do poorly in them. It has nothing to do with AMD being weak at FPU; it has to do with specific benchmarks not using the technology available to them. 

I would be willing to wager quite a large bet that in any multithreaded benchmark that allows AMD to utilize the technology at hand, instead of backdooring anything that isn't "GenuineIntel", AMD will perform better than its competition. What needs to happen is for some open x64 stuff to come out for encoding as well as video games, instead of this Intel-branded, only-works-well-on-Intel situation we see now.


----------



## os2wiz (Oct 23, 2012)

tehehe said:


> Compared to identically priced 2500k. If a game is cpu intensive then Intel wins. Otherwise it's a tie. So Intel is better for people that game a lot - like me. AMD is bad in this competition as far as games are concerned.



Not quite true. In most games the frame rate is a good 60 or higher. There are a few where it drops down to 40 at some points with the most intensive settings. I can't see you complaining about a handful of poorly designed and poorly threaded games. Why not complain to the software developers about their pathetically poor design? Multithreaded games are the wave of the future; single threading is an old, poor design and is dying.


----------



## EarthDog (Oct 23, 2012)

While I do agree that not using the proper instructions is a huge deal, and SOME tests show big differences while others do not (Cinebench showing little difference regardless of CPUID), I also think that it is an architectural thing too. As I'm sure you know, each Intel core (well, that WAS a core until AMD changed the definition, I guess) has an FPU and an integer unit, whereas AMD's 'modules' have two integer units and one FPU(?). So I would imagine it goes both ways, since compared to an Intel chip it has the same number of FPUs (a quad with HT) as an 'octo' core (by AMD's definition).


----------



## TheoneandonlyMrK (Oct 23, 2012)

cadaveca said:


> Yeah, but the change between the two is so large... so much greater than anything I am really used to. I didn't see that with SNB at all.
> 
> 
> I am going to ask AMD for a few more chips. Perhaps we can get some clocking going over the winter. You guys game for some challenges?



I'd love to see the waterblocked version OC'd with some grr.


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> While I do agree that not using the proper instructions is a huge deal, and SOME tests show big differences while others do not (Cinebench showing little difference regardless of CPUID),



Cinebench is the only encoding benchmark that shows Intel ahead of the pack. It does not allow AMD processors to use AVX as a whole. Whenever AMD uses AVX, it quite honestly demolishes the competition, much like back in the P4 days when NetBurst ate video encoding up. Now these chips have a lot less of a performance drop in other applications than the P4 did.



EarthDog said:


> I also think that it is an architectural thing too. As I'm sure you know, each Intel core (well, that WAS a core until AMD changed the definition, I guess) has an FPU and an integer unit, whereas AMD's 'modules' have two integer units and one FPU(?). So I would imagine it goes both ways, since compared to an Intel chip it has the same number of FPUs (a quad with HT) as an 'octo' core (by AMD's definition).



I am sure some of it is an architectural difference, which is why we are seeing AMD run well in multithreaded benchmarks, terribly in single-threaded IPC, and mediocre in a handful of honestly biased benchmarks. 

As I said before, there is a reason AMD chips are being picked up for the server market. They are not bad, and in a massively multithreaded environment that is properly coded to make use of not only the new core hierarchy but also the technology available (AVX, SSE, etc.), they are actually quite good, oftentimes substantially better than an Intel alternative. It really comes down to using the right encoders that allow use of all of the parts of the AMD cores. Like Dave said, watching the power consumption during benchmarks, it is blatant that the cores are idling through things instead of being run up like they should.


----------



## EarthDog (Oct 23, 2012)

cdawall said:


> Cinebench is the only encoding benchmark that shows Intel ahead of the pack. It does not allow AMD processors to use AVX as a whole. Whenever AMD uses AVX it quite honestly demolishes the competition. Much like back in the P4 days when netburst ate video encoding up. Now these have a lot less of a performance drop in other applications vs P4.
> 
> 
> 
> ...


As far as the AVX goes, I have no idea. If that has to do with CPUID and such, the link I provided used a generic CPUID to force the use of all instructions, and Cinebench appears to show no favorites there. If it's beyond that, I will admit I have no clue.

I hear ya... and appreciate the information, cdawall.


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> As far as the AVX goes, I have no idea. If that has to do with CPUID and such, the link I provided used a generic CPUID to force the use of all instructions, and Cinebench appears to show no favorites there. If it's beyond that, I will admit I have no clue.
> 
> I hear ya... and appreciate the information, cdawall.



http://www.agner.org/optimize/blog/read.php?i=49

There is a little information there on the CPUID issue I was talking about. As for your generic CPUID, you are correct: there is zero optimization for a CPUID that doesn't exist. The issue is that when run on a processor that supports AVX, Cinebench will let Intel CPUs that support AVX utilize it while not allowing the AMD ones to do so.
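The dispatch pattern being described can be sketched like this. A hypothetical Python illustration of vendor-gated feature dispatch; the function names are made up for the example, and real dispatchers do this in compiled code via the CPUID instruction's vendor string and feature flags:

```python
# Sketch of vendor-gated dispatch: the fast path is only taken when the
# vendor string is "GenuineIntel", even if another vendor's CPU reports
# AVX support. This is the behavior being criticized above.

def render_scalar(data):
    return sum(x * x for x in data)   # slow fallback path

def render_avx(data):
    return sum(x * x for x in data)   # stand-in for the vectorized AVX path

def pick_renderer(vendor, has_avx):
    # The feature flag alone is not enough; the vendor string is checked too.
    if vendor == "GenuineIntel" and has_avx:
        return render_avx
    return render_scalar

# An AMD chip that reports AVX support still lands on the scalar path:
print(pick_renderer("AuthenticAMD", True).__name__)   # render_scalar
print(pick_renderer("GenuineIntel", True).__name__)   # render_avx
```

A vendor-neutral dispatcher would branch on `has_avx` alone, which is the fix being asked for in this thread.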


----------



## os2wiz (Oct 23, 2012)

EarthDog said:


> While I do agree that not using the proper instructions is a huge deal, and SOME tests show big differences while others do not (Cinebench showing little difference regardless of CPUID), I also think that it is an architectural thing too. As I'm sure you know, each Intel core (well, that WAS a core until AMD changed the definition, I guess) has an FPU and an integer unit, whereas AMD's 'modules' have two integer units and one FPU(?). So I would imagine it goes both ways, since compared to an Intel chip it has the same number of FPUs (a quad with HT) as an 'octo' core (by AMD's definition).



I'll agree that the decoders that send the data to each module have been particularly limiting in floating-point usage. That is unquestionable, and it will be addressed in Steamroller; you can't expect all issues to be corrected in one generation. Steamroller will add another decoder to each module, I believe, to correct this issue. You could not accomplish that without first going down to the 28 nm process. I am confident, in spite of the mantras, that AMD will survive this down cycle and will then be able to deliver significant improvements in a timely fashion. It had to get its house in order before moving forward in any revolutionary way. I think AMD has survived the worst of a painful reorganization and hopefully will be able to add engineering and marketing staff in another 12 months.


----------



## EarthDog (Oct 23, 2012)

cdawall said:


> http://www.agner.org/optimize/blog/read.php?i=49
> 
> There is a little information there on the CPUID issue I was talking about. As for your generic CPUID, you are correct: there is zero optimization for a CPUID that doesn't exist. The issue is that, when run on a processor that supports AVX, Cinebench will let Intel CPUs use AVX while not allowing the AMD ones to do so.


OK, so it is CPUID... (read that Agner link, know that, mentioned it already above, and it was linked in my link to OCF; thank you again though!).

That said, if you look at the Cinebench test, none of the CPUIDs he used showed a difference. Am I wrong in thinking that this shows no bias, since there are no changes regardless of CPUID? Or, since he used an Atom CPU or something, would my thinking be off because it doesn't have AVX extensions (guessing here)?

Feel free to PM as we are drifting a bit...


----------



## ChristTheGreat (Oct 23, 2012)

Thanks for the review 

There is one thing I'd like to know about the power consumption: what program did you use for full system load, and how was the full-system figure measured? I do think that 100 W is kind of low... My rig at idle, with one HD 6950, 2 hard drives, and 1 SSD, would be about 70-80 W, taken at the wall with a Kill-A-Watt (by the way, my UPS reports about the same wattage).

Thanks if you can answer 

BTW, AMD has some good performance. In some other reviews the games aren't that much better; it's still a lot behind Intel in a lot of games, but in multi-threaded work it is quite good (single-thread, Intel seems to be faster). These CPUs are workstation/server parts at best; I would still use Intel for a low-power/performance desktop.


----------



## cdawall (Oct 23, 2012)

EarthDog said:


> OK, so it is CPUID... (read that, know that... it was linked in my link, thank you again though!)
> 
> That said, if you look at the Cinebench test, none of the CPUIDs he used showed a difference. Am I wrong in thinking that this shows no bias, since there are no changes regardless of CPUID? Or, since he used an Atom CPU or something, would my thinking be off because it doesn't have AVX extensions (guessing here)?
> 
> Feel free to PM as we are drifting a bit...



This should be the last post unless shenanigans happen, but yes, you are correct: since the Atom lacks a huge number of instruction sets, there will be no variation.


----------



## EarthDog (Oct 23, 2012)

SHENS! Thank you for bringing it back to a respectable, intelligent conversation free of disparaging remarks. This was fruitful IMO.


----------



## cdawall (Oct 23, 2012)

ChristTheGreat said:


> Thanks for the review
> 
> There is one thing I'd like to know about the power consumption: what program did you use for full system load, and how was the full-system figure measured? I do think that 100 W is kind of low... My rig at idle, with one HD 6950, 2 hard drives, and 1 SSD, would be about 70-80 W, taken at the wall with a Kill-A-Watt (by the way, my UPS reports about the same wattage).
> 
> ...









Even in CrossFire, 7970s pull less wattage at idle than your 6950.



EarthDog said:


> SHENS! Thank you for bringing it back to a respectable, intelligent conversation free of disparaging remarks. This was fruitful IMO.



Now we can't have that!


----------



## HumanSmoke (Oct 23, 2012)

theoneandonlymrk said:


> Yeah, I had thought the clock mesh was in; apparently not :shadedshu


YW


----------



## LTUGamer (Oct 23, 2012)

Much better than Bulldozer, but still far from the Core i5 3470.
9.0 is too much.


----------



## suraswami (Oct 23, 2012)

cdawall said:


> Cinebench is the only encoding benchmark that shows Intel ahead of the pack. It does not allow AMD processors to use AVX as a whole. Whenever AMD uses AVX it quite honestly demolishes the competition. Much like back in the P4 days when netburst ate video encoding up. Now these have a lot less of a performance drop in other applications vs P4.
> 
> 
> 
> ...



I guess AMD needs to pay these 'benchmark' coders to make use of AMD's tech more, or more efficiently. Again, $ should be spent on these so-called marketing tactics. Push $ into their as*es and the programs will start to favor AMD.


----------



## cadaveca (Oct 23, 2012)

ChristTheGreat said:


> Thanks for the review
> 
> There is one thing I'd like to know about the power consumption: what program did you use for full system load, and how was the full-system figure measured? I do think that 100 W is kind of low... My rig at idle, with one HD 6950, 2 hard drives, and 1 SSD, would be about 70-80 W, taken at the wall with a Kill-A-Watt (by the way, my UPS reports about the same wattage).
> 
> Thanks if you can answer



Better yet, a pic:

In that power bar are the PC (on the Kill-A-Watt clone), lamp, monitor, and stereo. That bar plugs into its own circuit @ 15 A/120 V, as well.







What I report is the average over a 1-hour period of a customized CPU-based load.

What's really amazing is that this system draws no more than 400 W gaming, with dual 7950s!


----------



## Jhelms (Oct 23, 2012)

Even though many call the 8150 garbage, since owning one and pushing it I love the chip (currently at 4.8 GHz stable). I loved it enough, and was pleased enough with the new reviews, that I ordered an 8350 and am quite excited to see what I can do with her! Not a fanboy; I own several Intel-based computers. I just love flogging my AMD gear, as it takes it and smiles.


----------



## Super XP (Oct 23, 2012)

LTUGamer said:


> Much better than Bulldozer but far away from Core i5 3470
> 9.0 is too much


Absolutely not. 9 out of 10 hits the spot quite nicely.
You can build around an AM3+ platform for super cheap and still get respectable gaming, multimedia, etc. performance without breaking the bank. These new Piledriver cores are the best bang for your dollar right now.


----------



## _Zod_ (Oct 23, 2012)

Now if we could just get developers to stop using compilers that are optimized for Intel at AMDs expense, AMD might start looking a bit more appealing.


----------



## Jhelms (Oct 23, 2012)

Whoa, thought that was the 8350 at first. The 8320 price is $179.99.

Also newegg is cheaper. Micro Center wants $5.99 shipping / Newegg is free


----------



## Super XP (Oct 23, 2012)

Garage1217 said:


> Whoa, thought that was the 8350 at first. The 8320 price is $179.99.
> 
> Also newegg is cheaper. Micro Center wants $5.99 shipping / Newegg is free


No, Newegg wants $6.99 S&H if you ship it to Canada.


> ATInsider • 8 hours ago • parent −
> AMD took a huge risk by developing this unique modular architectural style of design that can highly be expanded by a lot. They've once again pushed innovation to its limits. This is in fact a huge win for AMD and for the PC industry.
> 
> AMD now has the ability to plaster cores upon cores as each process node gets smaller. They've taken out the guess work and have a working product. All they now need to do is continue to refine the design. Piledriver is a testament that this CPU can and will eventually have the balls to equal and surpass Intel in the high end.
> ...


http://news.softpedia.com/news/AMD-s-Steamroller-To-Be-Faster-than-Intel-Haswell-289980.shtml


----------



## ChristTheGreat (Oct 23, 2012)

cadaveca said:


> Better yet, a pic:
> 
> In that power bar is PC in kill-a-watt clone, lamp, monitor, and stereo. That bar plugs into it's own circuit @ 15a/120V, as well.
> 
> ...



Okay nice!

Thanks for sharing this mate


----------



## cadaveca (Oct 24, 2012)

ChristTheGreat said:


> Okay nice!
> 
> Thanks for sharing this mate



I also use a Zalman 8-pin meter for the 8-pin numbers:

http://www.zalman.com/eng/product/Product_Read.php?Idx=417


I calibrated this unit with a Fluke-brand clamp meter.


----------



## os2wiz (Oct 24, 2012)

Super XP said:


> No, Newegg wants $6.99 S&H if you ship it to Canada.
> 
> http://news.softpedia.com/news/AMD-s-Steamroller-To-Be-Faster-than-Intel-Haswell-289980.shtml



Well, it would be a great stride forward, but let's not bet the house on it yet. A 45% improvement over Piledriver is asking a lot; 45% against Bulldozer would mean almost a 30% increase over Piledriver. That is quite rare in the CPU world from one generation to the next. I'll be happy with a 20-25% improvement over Piledriver.


----------



## EarthDog (Oct 24, 2012)

> What's really amazing is that this system draws no more than 400 W gaming, with dual 7950s!


People dramatically overestimate power requirements too... is that 400 W rating at the wall? If so, you have to take into account the PSU efficiency as well. If it's a 90% platinum unit, that is really around a 360 W load.

I pull 400 W (wall) with two 7850s overclocked (1.25 V @ 1150/1350) and a 3770K at 4.8 GHz.
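The wall-versus-load correction above is simple arithmetic; a quick sketch (the 90%-efficiency figure is just the example from the post, not a measured value):

```python
def dc_load_watts(wall_watts: float, efficiency: float) -> float:
    """Convert a wall (AC) meter reading to the DC load the PSU delivers."""
    return wall_watts * efficiency

if __name__ == "__main__":
    # 400 W on the kill-a-watt through a ~90%-efficient platinum unit
    print(dc_load_watts(400, 0.90))  # roughly 360 W actually drawn by the parts
```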


----------



## Thefumigator (Oct 24, 2012)

Does anyone know how low these clock down in Windows' power-saver mode?

My Phenom clocks down to 1 GHz in power saver; as far as I know every other AMD processor stayed the same, except Llano, where they clock even lower, to 600 MHz or something like that.


----------



## cadaveca (Oct 24, 2012)

EarthDog said:


> People dramatically overestimate power requirements too... is that 400 W rating at the wall? If so, you have to take into account the PSU efficiency as well. If it's a 90% platinum unit, that is really around a 360 W load.
> 
> I pull 400 W (wall) with two 7850s overclocked (1.25 V @ 1150/1350) and a 3770K at 4.8 GHz.



I did post a pic of what I use to test that... last page.


----------



## fwix (Oct 24, 2012)

Nice review, even if you did not include some multithreaded games (BF3/MoH multiplayer in an empty fixed scene, Crysis 2, GTA 4), but the principal info is here.



EarthDog said:


> with 2 7850's overclocked



Well, an OC'd 7850 at 1.25 V uses more power than any 7950 :D
Also, I would say a gaming PC on an AMD platform (GPU + processor) these days is really good if we're talking price/performance.

I have an 1100T @ 4.1 GHz (CPU-NB 3, HT 2.6) with a GTX 590 (OC'd to 580 defaults), and I really can't see the difference versus a 2500K @ 4.8 (my friend has one) in the majority of games. Perhaps Skyrim and Crysis 2 feel more stable with Intel, but you can shrink the difference by reducing the LOD ratio/view distance.
So it's not really a big deal to own an Ivy or Sandy if you want to play games at the best price/performance (I know not every gamer overclocks the CPU to play, but the majority do :D).
Also, I would say these days we are not CPU limited so much as GPU limited. Test any game with a good engine, like Unreal Engine, Frostbite, CryEngine, etc.: you can't be CPU limited with any quad core at 60-75 Hz with vsync on (I even tested a Core i3 2120 with a 580/7850, and in modern games the GPU is at 90-99% all the time with vsync off). So for those wondering about performance in games: just OC any quad core you have to 4+ GHz and you can't really see the difference. Only if you play games at 120 fps is an OC'd Intel the best choice.


----------



## ChristTheGreat (Oct 24, 2012)

EarthDog said:


> People dramatically overestimate power requirements too... is that 400 W rating at the wall? If so, you have to take into account the PSU efficiency as well. If it's a 90% platinum unit, that is really around a 360 W load.
> 
> I pull 400 W (wall) with two 7850s overclocked (1.25 V @ 1150/1350) and a 3770K at 4.8 GHz.




Those 7850s have low power consumption. My rig at full CPU + GPU 100% load (Prime95 + FurMark) draws about 340-380 W at the wall.


----------



## johnspack (Oct 24, 2012)

I'm still munching popcorn... I did AMD from the Am286 to the Athlon 64 X2... I really want a reason to come back!


----------



## Absolution (Oct 24, 2012)

Could performance be better with Windows 8?

I think the AMD 8350 is a good choice:

+ More cores
+ Close to, if not better, performance than the Intel counterpart (gaming)
+ Supporting AMD (because competition is better)

But I still can't get over these :/

- Memory rates are slow
- Power consumption when overclocked is kinda high
- No idea how long the AM3+ socket will last; perhaps this gen of CPUs will be the last, so another -ve versus Intel for AM3+ users


----------



## cdawall (Oct 24, 2012)

Absolution said:


> But I still can't get over these :/
> 
> - Memory rates are slow
> - Power consumption when overclocked is kinda high
> - No idea how long the AM3+ socket will last; perhaps this gen of CPUs will be the last, so another -ve versus Intel for AM3+ users



- A BIOS issue as much as a CPU issue. Check how the other boards perform before calling this one; trust me on that. 1600 CL6 differs by over 1000 MB/s between my M4A78T-E and CHIII.

- It is a performance PC. If you want low power consumption, get a 45 W chip and leave it stock.

- It is a better bet than going with 1155, seeing how Haswell and Broadwell are slated for LGA1150.


----------



## geon (Oct 24, 2012)

Brilliant CPU.

For a developer, being able to test how your code stretches across 8 threads is pretty sweet.

BTW, all benches are done by running 4 threads on the i5 and 8 threads on BD; try running 8 threads on the i5 too if you want to have fun. In most applications performance will degrade, and that's why the BD architecture is great for servers.

Video compression on BD is great, as long as the alternative tech doesn't really work: it either produces low quality or doesn't have good support.

I'm pretty sure games work very nicely, since regular users don't have $1000 video cards; if the GPU limits performance, it will limit AMD and Intel in the same way.

I don't see single-core performance as an issue. More and more applications are optimized for multi-core, and the few single-core applications people still use are limited by other factors which benchmarks never take into account (like optical drive speed in the case of LAME or iTunes; if that were factored in, both the i5 and BD would encode on the fly).

Power is indeed an issue, but once AMD moves to a lower node we should see a good improvement. Also, electricity is pretty cheap right now and I doubt anybody will notice the CPU's consumption on the bill. You can also be green by turning off your PC when you don't need it.

I think there is hope for AMD, although I've read some amazing stuff about the next-gen Intels.


----------



## HumanSmoke (Oct 24, 2012)

cdawall said:


> --It is a better bet than going with 1155. Seeing how Haswell and Broadwell are slated for LGA1150.


Can't see how you arrive at that.
Even if Haswell arrives in Q2 2013, that doesn't make Ivy Bridge (or Sandy Bridge, for that matter) redundant overnight, so come Q2 2013 the situation is still pretty much the same as it is now.
Steamroller is either going to be backwards compatible with existing AM3+ (great if you want to upgrade the CPU but not the board, not so great if it means that a better memory subsystem, an integrated PCH, etc. fall by the wayside), or it isn't.

The way I see it, going into Q2-3 2013, the buyer now will be sitting on the present tech:

AMD: cheaper overall*, ~2-4 more SATA 6 Gb/s ports, higher power consumption, no SSD caching, no native USB 3.0, no PCI-E 3.0**, (and no SAS if comparing to X79 or selected Z77), ECC memory support, native tri/quad single-GPU support.

Intel: more expensive in comparison to the 8320 and the 6- and 4-core AMD parts (* 3570K + Z75 isn't far away from 8350 + 990X/FX in pricing), fewer native SATA 6 Gb/s ports, an iGP (handy for troubleshooting, or as a placeholder between "proper" GPUs unless you have a 1440p/1600p screen).

Unless you're in the market for a better iGP, I'd venture that an IB system is still going to be pretty competitive in 2013 versus Haswell... there's also no guarantee that AMD's Steamroller timetable incurs no slippage; it's not as if AMD don't have prior form.

** Thinking ahead: while PCI-E 3.0 might be a non-event ATM, it would still likely be a selling point in the resell market if you were looking at upgrading, and of course a higher-bandwidth card of the next gen allied with a CPU-physics game might make the difference more than academic.


----------



## os2wiz (Oct 24, 2012)

*Possible for 2 editions of Steamroller, one AM3+, the other AM4?*



HumanSmoke said:


> Can't see how you arrive at that.
> Even if Haswell arrives Q2 2013, that doesn't make Ivy Bridge (or Sandy Bridge for that matter) redundant overnight, so come Q2 2013 the situation is still pretty much the same as it is now.
> Steamroller is either going to be backwards compatible with existing AM3+ (great if you want to upgrade the CPU but not the board, not so great if it means that a better memory subsystem, an integrated PCH, etc. fall by the wayside), or it isn't.
> 
> ...



I do see memory bandwidth as a serious wall limiting performance on AM3+. Why not have two choices for Steamroller, one AM3+ and the other AM4, at the same availability date?
Is it possible? Obviously the AM4 version would give better memory bandwidth with DDR4 and an improved memory controller. It would cost more. The AM3+ version would be less of a change, cheaper to produce, and give a lower performance improvement. HyperTransport should be replaced on AM4; it is not an efficient technology at this point.


----------



## eidairaman1 (Oct 24, 2012)

os2wiz said:


> I do see memory bandwidth as a serious wall limiting performance on AM3+. Why not have two choices for Steamroller, one AM3+ and the other AM4, at the same availability date?
> Is it possible? Obviously the AM4 version would give better memory bandwidth with DDR4 and an improved memory controller. It would cost more. The AM3+ version would be less of a change, cheaper to produce, and give a lower performance improvement. HyperTransport should be replaced on AM4; it is not an efficient technology at this point.



Rest assured, when SR comes about, DDR4 will be in play and SR will be DDR3/4 compatible, so it would be AM4; meaning Stars, BD, and PD won't be able to support it, but AM3+ users will have another gen of chips to play with.


----------



## HumanSmoke (Oct 24, 2012)

os2wiz said:


> I do see memory bandwidth as a serious wall limiting performance on AM3+. Why not have two choices for Steamroller, one AM3+ and the other AM4, at the same availability date?


In theory it sounds fine- market segmentation hasn't hurt Intel.
In practice, it sounds like attempting to keep a foot in both camps- which again is fine, it lessens risk with new tech. It also seems to be a diluting of R&D to keep one avenue linked to backwards compatibility. Seems like a retrograde step to me for a company with a mandate to balance the books.

AMD's continued survival (without a new investor) is likely linked to OEM's. The DIY crowd don't rate highly enough to warrant that kind of investment. I'd also argue that AMD aren't above jettisoning backwards compatibility when circumstances allow ( FM1's lifespan for example), so if OEM's are the key to increased revenue then it follows that AMD will need to pander to their requirements- which basically boils down to feature checkboxes...and I'm not convinced that a slightly tweaked 1090FX/X falls into that category- especially if Intel end up integrating WiFi and god knows what other acronyms into the Shark Bay platform.


----------



## os2wiz (Oct 24, 2012)

HumanSmoke said:


> In theory it sounds fine- market segmentation hasn't hurt Intel.
> In practice, it sounds like attempting to keep a foot in both camps- which again is fine, it lessens risk with new tech. It also seems to be a diluting of R&D to keep one avenue linked to backwards compatibility. Seems like a retrograde step to me for a company with a mandate to balance the books.
> 
> AMD's continued survival (without a new investor) is likely linked to OEM's. The DIY crowd don't rate highly enough to warrant that kind of investment. I'd also argue that AMD aren't above jettisoning backwards compatibility when circumstances allow ( FM1's lifespan for example), so if OEM's are the key to increased revenue then it follows that AMD will need to pander to their requirements- which basically boils down to feature checkboxes...and I'm not convinced that a slightly tweaked 1090FX/X falls into that category- especially if Intel end up integrating WiFi and god knows what other acronyms into the Shark Bay platform.



It is only to give one last cycle for AM3+ motherboard users before obsolescence, not a parallel development
option.


----------



## nt300 (Oct 24, 2012)

I think Piledriver will be the last CPU supported on AM3+, and Steamroller may be on a new socket with DDR4. But we're talking mid to late 2013.


----------



## EarthDog (Oct 24, 2012)

Memory bandwidth is not an issue on any platform with dual channel and up. Increasing memory speed shows very little gain in 99% of applications outside of benchmarking. I liken it to a fire hose and a garden hose... if you have a 1 GPM flow (data) in a garden hose (bandwidth), simply changing to a fire hose (a larger bandwidth value, say DDR4 or quad channel) doesn't mean your static 1 GPM flow magically increases. Bandwidth is just the pipe; the CPU isn't saturating RAM's bandwidth these days at dual-channel 1600 MHz.
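For a back-of-envelope sense of how big the "pipe" being discussed is, theoretical peak bandwidth works out as transfer rate x 8 bytes per 64-bit channel x number of channels (a rough sketch; sustained throughput is well below this):

```python
def peak_bandwidth_gbs(mt_per_s: int, bus_bytes: int = 8, channels: int = 2) -> float:
    """Theoretical peak memory bandwidth in GB/s for a 64-bit bus per channel."""
    return mt_per_s * bus_bytes * channels / 1000

if __name__ == "__main__":
    # Dual-channel DDR3-1600: 1600 MT/s x 8 bytes x 2 channels
    print(peak_bandwidth_gbs(1600))  # 25.6 GB/s
    # Dual-channel DDR3-2133 for comparison
    print(peak_bandwidth_gbs(2133))  # ~34.1 GB/s
```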


----------



## os2wiz (Oct 24, 2012)

EarthDog said:


> Memory bandwidth is not an issue on any platform with dual channel and up. Increasing memory speed shows very little gain in 99% of applications outside of benchmarking. I liken it to a fire hose and a garden hose... if you have a 1 GPM flow (data) in a garden hose (bandwidth), simply changing to a fire hose (a larger bandwidth value, say DDR4 or quad channel) doesn't mean your static 1 GPM flow magically increases. Bandwidth is just the pipe; the CPU isn't saturating RAM's bandwidth these days at dual-channel 1600 MHz.



But we are talking about 1866 MHz and 2133 MHz, which I use. I was told on the ROG website by an Asus technical staffer that at 2133, even with loosened timings, it would be near impossible to get my HT link speed up to 2600. He was right: even raising CPU-NB voltage and NB voltage couldn't get a stable run at 2400 or 2600, so it sits at 2200. So memory speed definitely impacts HT speed. If I lowered it to 1600, I bet the HT link speed could be raised to 2600.


----------



## Durvelle27 (Oct 24, 2012)

nt300 said:


> I think Piledriver will be the last AM3+ CPU support. And the Steamroller may be on a new socket with DDR4. But we talk in mid to late 2013.



No, Steamroller will be on AM3+.

"AMD has stated that the FX-series will receive a Socket AM3+ version of a Steamroller based CPU in 2014"






Edit:

http://www.theinquirer.net/inquirer/news/2208525/amd-sticks-with-socket-am3-for-steamroller


----------



## TheLaughingMan (Oct 24, 2012)

nt300 said:


> I think Piledriver will be the last AM3+ CPU support. And the Steamroller may be on a new socket with DDR4. But we talk in mid to late 2013.



Late 2013, and the only thing I can be sure about for Steamroller is that AMD is currently working on it being a 28 nm chip.


----------



## os2wiz (Oct 24, 2012)

TheLaughingMan said:


> Late 2013 and the only thing I can be sure about for Steamroller is AMD is currently working on it being a 28nm chip.



I think the story about 2014 and AM3+ has never been confirmed by an AMD spokesperson, so I think it is still an open book on many Steamroller details, like motherboard features, memory, etc.


----------



## tacosRcool (Oct 24, 2012)

Durvelle27 said:


> No Steamroller will be on AM3+
> 
> "AMD has stated that the FX-series will receive a Socket AM3+ version of a Steamroller based CPU in 2014"
> 
> ...



That's a long life for a socket


----------



## Norton (Oct 24, 2012)

TheLaughingMan said:


> Late 2013 and the only thing I can be sure about for Steamroller is AMD is currently working on it being a 28nm chip.



I wouldn't be surprised if a Piledriver refresh pops up early next year at 28 nm. I believe AMD is trying to get their GPUs and CPUs down to the same node (it should be more cost-effective that way, anyway).

My $0.02....


----------



## Durvelle27 (Oct 24, 2012)

tacosRcool said:


> That's a long life for a socket



Well, AMD is known to have long-life sockets, like AM2/AM2+.


----------



## EarthDog (Oct 24, 2012)

os2wiz said:


> But we are talking about 1866 MHz and 2133 MHz, which I use. I was told on the ROG website by an Asus technical staffer that at 2133, even with loosened timings, it would be near impossible to get my HT link speed up to 2600. He was right: even raising CPU-NB voltage and NB voltage couldn't get a stable run at 2400 or 2600, so it sits at 2200. So memory speed definitely impacts HT speed. If I lowered it to 1600, I bet the HT link speed could be raised to 2600.


That's probably true... but I was solely talking about memory speeds and bandwidth here, not the HT link speed and how higher memory speeds may force you to lower it. I see the correlation and what you are getting at, though. There isn't anything like that to mess with in the Intel world. But again, standing alone, memory bandwidth is not close to being saturated with dual channel.


----------



## cadaveca (Oct 24, 2012)

EarthDog said:


> That's probably true... but I was solely talking about memory speeds and bandwidth here, not the HT link speed and how higher memory speeds may force you to lower it. I see the correlation and what you are getting at, though.



For me, memory bandwidth plays a large role in multi-GPU performance, and in a very perceptible way. I agree with your sentiment about memory performance in a big way for everything else, though.


----------



## ChristTheGreat (Oct 24, 2012)

HumanSmoke said:


> Can't see how you arrive at that.
> Even if Haswell arrives Q2 2013, that doesn't make Ivy Bridge (or Sandy Bridge for that matter) redundant overnight, so come Q2 2013 the situation is still pretty much the same as it is now.
> Steamroller is either going to backwards compatible with existing AM3+ (great if you want to upgrade the board but not the CPU, not so great if it means that a better memory subsystem, an integrated PCH etc. fall by the wayside)
> 
> ...



AMD had native USB 3.0 before Intel... my P67 doesn't have native USB 3.0...


----------



## Nirutbs (Oct 24, 2012)

Thanks as always for the review...
Nice chip, but it cannot take me from Intel.


----------



## EarthDog (Oct 24, 2012)

cadaveca said:


> For me, memory bandwidth plays a large role in multi-GPU performance, and in a very perceptible way. I agree with your sentiment about memory performance in a big way for everything else, though.


I don't run multi-GPU setups, so I never experienced that. Thanks for the heads up! I have CrossFireX on the test bench as we speak... I will see what happens when I cut back from 2666 to 1600 MHz tonight.

That still fits into the 99% thing, though (OK, 97% according to Steam, LOL?), as most aren't using multiple cards anyway.


----------



## cdawall (Oct 24, 2012)

HumanSmoke said:


> Can't see how you arrive at that.
> Even if Haswell arrives Q2 2013, that doesn't make Ivy Bridge (or Sandy Bridge for that matter) redundant overnight, so come Q2 2013 the situation is still pretty much the same as it is now.



How about the fact that Intel hasn't given more than two generations on the step-down chipset in forever? LGA1156 lasted what, 3 years (late '09-'12)? 1155 is on schedule for maybe 3 as well. AM3 and later AM3+ have been out a while, and all 890-series and up boards support BD and PD. AM3+ will be supported a while longer; it is already on AMD's roadmaps to stay. So which is the better buy to you: the one where you are out another $150+ for a mobo and $200+ for a CPU, or a drop-in Steamroller? I am sticking with AMD for that alone.



HumanSmoke said:


> Steamroller is either going to backwards compatible with existing AM3+ (great if you want to upgrade the board but not the CPU, not so great if it means that a better memory subsystem, an integrated PCH etc. fall by the wayside)



Memory subsystem meaning what, that the IMC is integrated into the CPU? Even 790FX can give high RAM clocks with subsystem timings changed. PCH? It isn't Intel; the north and south bridges on AMD boards support plenty of PCI-e lanes, and manufacturers just dump standalone controllers onboard. I have a several-year-old 790FX Gigabyte board with USB 3.0, two 16x slots, and 10 SATA ports that supports everything through Thuban. The 890FX model is the same way, with BD and PD support. What is the downside to keeping the old board? It sure isn't a performance or feature issue.



HumanSmoke said:


> The way I see it, going into Q2-3, 2013, the buyer now will be sitting on the present tech



AM3+ is present tech, as is Piledriver. With all of the rumors of every company complaining about multithreading, which is the better-performing chip for the future: the one with a monolithic die fighting for single-core IPC, or something that multithreads like a champ?



HumanSmoke said:


> AMD: Cheaper overall*, ~2-4 more SATA 6GB ports, higher power consumption, no SSD caching, no native USB3.0, no PCI-E 3.0**, (and no SAS if comparing to X79 or selected Z77), ECC memory support, native tri/quad single-GPU support.
> 
> Intel: More expensive in comparison to 8320,6 and 4 core AMD (* 3570K + Z75 isn't far away from 8350 + 990X/FX in pricing), fewer native SATA 6GB ports, iGP (handy for troubleshooting or placeholder between "proper" GPU's unless you have a 1440p/1600p screen)



Try to be correct on this one: the AMD 970 allows for almost all of the same options as a Z77, let alone a Z75, for $69.99. So for $289.98 you have a CF/SLI-supporting dual-8x motherboard and an FX-8350. The cheapest dual-8x board I could find for Intel was a Z77, so that breaks out to $334.98: 15% more expensive for the same features. Oh, and AMD has better performance in multithreading and is mildly worse in games, if at all.



HumanSmoke said:


> Unless you're in the market for a better IGP  I'd venture that an IB system is still going to be pretty competitive in 2013 versus Haswell...there's also no guarantee that AMD's Steamroller timetable incurs no slippage...not as if AMD don't have prior form.



There is no guarantee Intel's timetable will not slip; it has happened before and can happen again.



HumanSmoke said:


> ** Thinking ahead, while PCI-E 3.0 might be a non-event ATM, it would still likely be a selling point in the resell market if you were looking at upgrading, and of course a higher-bandwidth card of the next gen allied with a CPU-physics game might make the difference more than academic.



Current cards don't use the bandwidth available in PCI-E 2.0, let alone what is available in 3.0. Future cards are not likely to change that; but either way, dual 8x slots will hamper the Intel side of things, as it lacks the lanes to transfer data between the cards. Hence why a 4x slot can handle a single card, but sucks when used for SLI or CrossFire.
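For context on the lane-count argument, per-direction slot bandwidth scales roughly like this (a sketch using the commonly cited effective rates of ~0.5 GB/s per PCI-E 2.0 lane and ~1 GB/s per 3.0 lane; encoding overhead is already folded into those figures):

```python
# Approximate usable per-lane bandwidth, GB/s per direction.
# 2.0: 5 GT/s with 8b/10b encoding; 3.0: 8 GT/s with 128b/130b encoding.
PER_LANE = {"2.0": 0.5, "3.0": 1.0}

def slot_bandwidth(gen: str, lanes: int) -> float:
    """Per-direction bandwidth of a PCI-E slot in GB/s."""
    return PER_LANE[gen] * lanes

if __name__ == "__main__":
    print(slot_bandwidth("2.0", 16))  # 8.0 GB/s: a full 2.0 x16 slot
    print(slot_bandwidth("2.0", 8))   # 4.0 GB/s: dual x8, still plenty for one card
    print(slot_bandwidth("2.0", 4))   # 2.0 GB/s: the x4 slot that hurts in CrossFire
```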


----------



## ensabrenoir (Oct 24, 2012)

*Credit where credit is due....*

My gosh....... it's.... NOT... a.... DeLorean. 2 points, AMD. Welcome to the '90s... no, I mean... the '20s don't sound cool.


----------



## os2wiz (Oct 24, 2012)

> There is no guarantee Intel's timetable will not slip; it has happened before and can happen again.

I am predicting right now that Haswell will not come out in the 2nd quarter of 2013 as Intel predicts. That is fluff Intel has produced to freeze the CPU market in their favor and keep any potential customers from jumping to AMD. Going from a 22 nm process to 18 nm is a very big jump; each move down in process size gets that much more difficult to execute.
I am predicting that Haswell will not be out before the 4th quarter. I give any of you permission to wipe the egg off my face if I am wrong.


----------



## EarthDog (Oct 24, 2012)

os2wiz said:


> There is no guarantee Intel's timetable will not slip it has happened before and can happen again.
> 
> 
> 
> ...


QFPermanence. 

Production for these is supposed to be starting in December. If that holds true and yields are acceptable, you should have a rag handy. They already showed ONE at IDF, I thought.

Last I heard, Haswell was still 22 nm... I bet you want to change that statement now, since there isn't a die shrink... too late, you were quoted. 

That said, Q2 2013, if they get ramped up in Dec/Jan, isn't off the charts... especially since they are staying on the 22 nm process.


----------



## os2wiz (Oct 24, 2012)

EarthDog said:


> QFPermanence.
> 
> Production for these is supposed to be starting in December. If that holds true and yields are acceptable, you should have a rag handy. They already showed ONE at IDF, I thought.
> 
> ...



 I read that Haswell did in fact have a die shrink to 18 nm. Can anyone else corroborate the process size for Haswell? My statement of delay was solely predicated on a die shrink. If I am wrong about the die shrink, all bets are off.


----------



## EarthDog (Oct 24, 2012)

> If I am wrong about the die shrink all bets are off.


   :shadedshu


Intel (leaked) roadmaps say 22nm, Intel SAID 22nm..........just Google it. Not a lot comes up for 18nm, but when you Google 22nm it's all over from reputable sources. Below is one... from an Intel interview

http://www.bit-tech.net/news/hardware/2012/09/12/intel-haswell-core/1

EDIT: Broadwell I believe may be the next shrink.


----------



## cdawall (Oct 24, 2012)

os2wiz said:


> I read that Haswell did in fact have a die shrink to 18 nm. Can anyone else corroborate the process size for Haswell? My statement of delay was solely predicated on a die shrink. If I am wrong about the die shrink, all bets are off.



Wiki still lists 22 nm; the stumbling block I see is the swap to LGA 1150. The PCH is shrinking from 65nm to 32nm (finally).


----------



## cadaveca (Oct 24, 2012)

Man, you guys.






Tick tock, then tick tock,

That's the plan of Intel's process clock.



:shadedshu


----------



## EarthDog (Oct 24, 2012)

cadaveca said:


> Man, os2wiz.
> 
> 
> 
> ...


Fixed. I didn't botch up that fact and (eventually/potentially) lose a bet because of it!!! But then again, he did renege...

And that rhymed!


----------



## os2wiz (Oct 24, 2012)

EarthDog said:


> Fixed. I didn't botch up that fact and (eventually/potentially) lose a bet because of it!!! But then again, he did renege...
> 
> And that rhymed!



Come on, that is a bit unfair. If my whole argument about them missing their projection is based on a process shrink that is not happening, how did I renege? I was wrong about the process shrink, yes. Give me a break.


----------



## EarthDog (Oct 24, 2012)

You made a bet... and pulled it back = renege. It doesn't matter what stipulations/reasons you came up with in your head to have the fortitude to post that comment... that was your REASON for making the bet... you should have your facts straight before opening yourself up to that (well aware it's a simple mistake, just giving you the business!). 

You are lucky this isn't a bookie and it WAS a gentleman's bet... a Vegas bookie wouldn't tolerate you attempting to renege because you thought Eli Manning was coming back off injury but found out after you made the bet he wasn't. Never make a bet without knowing for sure what you are actually betting on! But I'll send you a rag regardless. I'm not wiping off the egg, I'm chucking it at your grill for making a bet based off of misinformation! 

(Eli isn't injured, just picked someone you may know, being a NYer).

Anyway, you are off the hook, just giving you shhhhhhhhhhhhhhhh.


----------



## nt300 (Oct 24, 2012)

os2wiz said:


> There is no guarantee Intel's timetable will not slip it has happened before and can happen again.
> 
> I am predicting right now that Haswell will not come out in the 2nd quarter of 2013 as Intel predicts. That is fluff Intel has produced to freeze the CPU market in their favor and keep potential customers from jumping to AMD. Going from a 22 nm process to 18 nm is a very big jump. Each move down in process size gets that much more difficult to execute.
> I am predicting that Haswell will not be out before the 4th quarter. I give any of you permission to wipe the egg off my face if I am wrong.


Steamroller should easily be the CPU to directly compete with Haswell.


----------



## brandonwh64 (Oct 24, 2012)

nt300 said:


> Steamroller should easily be the CPU to directly compete with Haswell.



Proof?


----------



## cdawall (Oct 24, 2012)

brandonwh64 said:


> Proof?



Proof it wont?


----------



## brandonwh64 (Oct 24, 2012)

cdawall said:


> Proof it wont?



Both really. Will it?


----------



## EarthDog (Oct 24, 2012)

Boy, you can tell who 'dress to the left' and 'dress to the right'. NOBODY FOOKN KNOWS PEOPLE! (= both sides to the middle)


----------



## brandonwh64 (Oct 24, 2012)

EarthDog said:


> Boy, you can tell who 'dress to the left' and 'dress to the right'. NOBODY FOOKN KNOWS PEOPLE! (= both sides to the middle)



NT300 seems to; that's why I asked for proof.


----------



## erocker (Oct 24, 2012)

Why are people spouting nonsense about make-believe CPUs in an FX-8350 CPU review? Stay on topic.


----------



## cdawall (Oct 24, 2012)

brandonwh64 said:


> Both really. Will it?



I say I don't care; I would still rather look at that than another 3-year socket.



EarthDog said:


> Boy, you can tell who 'dress to the left' and 'dress to the right'. NOBODY FOOKN KNOWS PEOPLE! (= both sides to the middle)



Oh I know, but nothing says it won't. Hell, they could dominate everything ever created, but no one will know until they release.


----------



## brandonwh64 (Oct 24, 2012)

cdawall said:


> I say I don't care; I would still rather look at that than another 3-year socket.



I don't either, but I thought it would be funny just to see what he could come up with. HAHAHAHAH


----------



## os2wiz (Oct 24, 2012)

EarthDog said:


> You made a bet... and pulled it back = renege. It doesn't matter what stipulations/reasons you came up with in your head to have the fortitude to post that comment... that was your REASON for making the bet... you should have your facts straight before opening yourself up to that (well aware it's a simple mistake, just giving you the business!).
> 
> You are lucky this isn't a bookie and it WAS a gentleman's bet... a Vegas bookie wouldn't tolerate you attempting to renege because you thought Eli Manning was coming back off injury but found out after you made the bet he wasn't. Never make a bet without knowing for sure what you are actually betting on! But I'll send you a rag regardless. I'm not wiping off the egg, I'm chucking it at your grill for making a bet based off of misinformation!
> 
> ...



 I don't go to bookies or loan sharks. Just the sharks from Wall Street, your neighborhood B of A and JP Morgan-Chase.
 If I am going to get a broken arm for something, I'd rather get it fighting the bosses and their corrupt system. I got a heart attack from the consequences of that. No regrets. They may have some after what I and my co-workers put them through.


----------



## erocker (Oct 24, 2012)

erocker said:


> Why are people spouting nonsense about make-believe CPUs in an FX-8350 CPU review? Stay on topic.



I won't say it again. I will take action if you cannot keep on topic.


----------



## os2wiz (Oct 24, 2012)

By the way, I saw a retail price drop online. Tiger Direct has the FX-8350 for $204.50. You have to pay $2.99 for slow shipping, though. Still better than $219.99 at rip-off Newegg.

TechSpot gave a terrible hack review of the CPU. He used primarily synthetic benchmarks with a lot of single-threaded games and about 3 or 4 applications. Written by an Aussie named Steve. Not nearly as thorough or favorable a review as I saw on Tom's Hardware.
I told him so on the site. His only reply was "have a nice day." No effort to defend his review or justify his findings, which were not generally upheld by other testing. I found no intellectual process explained in his review. Tom's broke down everything and explained every nuance in their testing procedure and results.


----------



## os2wiz (Oct 24, 2012)

erocker said:


> I won't say it again. I will take action if you cannot keep on topic.



Man, I posted before I saw your admonition. Sorry. Must have crossed paths at the same time. I was composing the off-topic reply when you were posting yours, so I did not see it.


----------



## fullinfusion (Oct 24, 2012)

^ oh get back to work Erocker 

Great review Dave 

If I still had AMD like I used to, I'd upgrade to Piledriver, but really nothing is enticing me about this CPU over the older 8150... sure it's a bit faster, but... Bulldozer was what pissed me off so much that I sold my AMD rig and went Intel! People go on about overclocking and I just shake my head. AMD still can't touch Intel. I'm keeping an open mind here, but really: Piledriver @ stock 4.0-4.2 GHz vs. a stock 3.5 GHz 3770K as an example... sure the numbers are close, but clock the i7 up to the same speed as the AMD and Intel HAMMERS Piledriver big time... It's not even a pissing match any more... It's like a 'Vette vs. a Chevette. 

I must say I'm looking forward to upgrading to Z77 and a 3770K in the next month. 

And sorry AMD, you didn't win me back this time around, and I don't think you ever will. 

I'd rather spend a bit more money and know what I have than spend less and wish I'd spent more. 


All in all, good job AMD.


----------



## HumanSmoke (Oct 24, 2012)

cdawall said:


> How about Intel hasn't given more than two generations on the step down chipset in forever?


So what? My nephew uses my old hand-me-down 4 year old QX9650/X48 Rampage on LGA 775. It certainly isn't embarrassed by any AMD CPU performance.


cdawall said:


> So which is the better buy to you? The one you are out another $150+ for a mobo and $200+ for a CPU or drop in steamroller? I am sticking with AMD for that alone.


The one that delivers on time.
On a personal note, I like new stuff. If Haswell delivers, I might just buy it...but then, I tend to upgrade yearly- probably why Intel make so much money!


cdawall said:


> What is the downside to keeping the old board because it sure isn't a performance or feature issue.


The same can be said for any full featured Z77 and X79...or are you too myopic to see past a brand?


cdawall said:


> AM3+ is present tech as is Piledriver. With all of the *rumors of every company complaining about multithreading * which is a better performing chip for the future? The one with a monolithic die fighting for single core IPC or something that multithreads like a champ?


Because AMD is the only company to evolve their CPU design? Because Intel CPUs lack multi-threaded performance? Because the next 6-12 months are going to see exponential growth in software tailored to the Bulldozer architecture? Because AMD can stick to a timetable and their performance estimates?
While you're answering those, maybe you can provide links to support your supposition of "every company complaining about multithreading".


cdawall said:


> Try and be correct on this one AMD 970 allows for almost all of the same options as a Z77 let alone Z75...


So what? How many people use every feature on a motherboard? 


cdawall said:


> Oh and AMD has better performance in multithreading, mildly worse in games if any


Bleat on about dual x8 boards (presumably for CFX/SLI) and rave about "mildly worse" in gaming. Sounds about right.

I'd think that more people might look at options such as choice in the mATX/ITX form factor, onboard WiFi and WiDi, SSD caching and the like.


cdawall said:


> There is no guarantee Intel's timetable will not slip it has happened before and can happen again.


True, but what's AMD's track record? I know which company I'd trust more to adhere to their timetable. 


cdawall said:


> Current cards don't use the bandwidth available in PCI-E 2.0 let alone what is available for 3.0. Future cards are not likely to change that


Rubbish. Dual GPU cards and CFX/SLI are already impacting the electrical restriction of PCI-E 2.0


cdawall said:


> but either way dual 8x slots will hamper the Intel side of things as it lacks the lanes themselves to transfer the data between the cards


I see we've reached the limit of your knowledge.
PCI-E x16 2.0 @x16 = 80GT/sec (8GB/sec) * 80% (8b/10b encode) = 64GT/sec (6.4GB/sec)
PCI-E x16 3.0 @x8 = 64GT/sec (6.4GB/sec)* 98.46% (128b/130b encode)= 63.6GT/sec (6.36GB/sec)
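That encoding arithmetic can be sanity-checked with a short script. This is only a sketch using the PCI-SIG per-lane raw rates (5 GT/s for 2.0, 8 GT/s for 3.0); the function name and framing are mine, not from the post above:

```python
# Effective one-direction PCI-E bandwidth after line-encoding overhead.
# PCI-E 2.0 uses 8b/10b encoding (80% efficient); 3.0 uses 128b/130b (~98.5%).

def effective_gbps(gt_per_sec: float, lanes: int, payload_bits: int, line_bits: int) -> float:
    """Usable GB/s per direction: raw Gbit/s scaled by encoding efficiency."""
    raw_gbit = gt_per_sec * lanes                  # one transfer carries one bit per lane
    usable_gbit = raw_gbit * payload_bits / line_bits
    return usable_gbit / 8                         # bits -> bytes

gen2_x16 = effective_gbps(5.0, 16, 8, 10)      # PCI-E 2.0, 16 lanes
gen3_x8 = effective_gbps(8.0, 8, 128, 130)     # PCI-E 3.0, 8 lanes

print(f"PCI-E 2.0 x16: {gen2_x16:.2f} GB/s")   # 8.00 GB/s
print(f"PCI-E 3.0 x8:  {gen3_x8:.2f} GB/s")    # 7.88 GB/s
```

Whichever convention you use for the raw figures, the conclusion is the same as the post's: a 3.0 x8 link delivers essentially the same usable bandwidth as a 2.0 x16 link.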


----------



## Fourstaff (Oct 24, 2012)

HumanSmoke said:


> Rubbish. Dual GPU cards and CFX/SLI are already impacting the electrical restriction of PCI-E 2.0



http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html


----------



## cadaveca (Oct 24, 2012)

HumanSmoke said:


> I see we've reached the limit of your knowledge.
> PCI-E x16 2.0 @x16 = 80GT/sec (8GB/sec) * 80% (8b/10b encode) = 64GT/sec (6.4GB/sec)
> PCI-E x16 3.0 @x8 = 64GT/sec (6.4GB/sec)* 98.46% (128b/130b encode)= 63.6GT/sec (6.36GB/sec)



Total bandwidth is not the answer. So we've reached the limit of yours as well, eh?



Fourstaff said:


> http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html




Single-GPU is not dual-GPU or tri-GPU or quad-GPU. W1zz shows the difference with one card... it's worse with two, because not only do you have traffic for each card, but also traffic between the cards themselves.


----------



## N3M3515 (Oct 24, 2012)

Nice perf per clock increase!!



> the FX-8510 offered


FX-8*15*0


----------



## Fourstaff (Oct 24, 2012)

cadaveca said:


> Single-GPU is not dual-GPU or tri-GPU or quad-GPU. W1zz shows the difference with one card... it's worse with two, because not only do you have traffic for each card, but also traffic between the cards themselves.



If scaling is linear (should it be? I don't know, I am just assuming here), I am not convinced that the 1% drop in performance between PCI-E 2.0 x8 and x16 is indicative of us hitting the wall. I am also not sure if dual-GPU cards require more than 2x what a single-GPU card does, and on top of that I have absolutely no knowledge of how much the GPUs talk to each other outside the CF/SLI bridge. Enlighten me


----------



## cadaveca (Oct 24, 2012)

Fourstaff said:


> If scaling is linear (should it be? I don't know, I am just assuming here), I am not convinced that the 1% drop in performance between PCI-E 2.0 x8 and x16 is indicative of us hitting the wall. I am also not sure if dual-GPU cards require more than 2x what a single-GPU card does, and on top of that I have absolutely no knowledge of how much the GPUs talk to each other outside the CF/SLI bridge. Enlighten me



Nah, I've done testing, and then sold the cards. It's best you test yourself.

Scaling between cards is NOT linear with more than two for sure, and not quite linear with just two. You tell me why it isn't, and you'll answer the question you just asked.


I am no expert, so I'll leave them to explain the ins and outs of why... because I've been saying this for years, but nobody seems to agree... but then you ask those people if they've ever run such configs, and the answer is no.

I feel it's better you educate yourself than relying on me with this subject in particular.
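For anyone who does run their own tests, the non-linear scaling being discussed is easy to quantify from any two benchmark runs. A minimal sketch (the FPS figures below are hypothetical, purely for illustration):

```python
# Multi-GPU scaling efficiency: measured FPS versus ideal linear scaling.

def scaling_efficiency(fps_single: float, fps_multi: float, n_gpus: int) -> float:
    """Fraction of ideal (linear) scaling achieved; 1.0 means perfect scaling."""
    return fps_multi / (fps_single * n_gpus)

# Hypothetical example: one card does 60 FPS, two cards together do 105 FPS.
eff = scaling_efficiency(60.0, 105.0, 2)
print(f"Scaling efficiency: {eff:.0%}")  # the shortfall is driver and inter-card latency overhead
```

Anything below 100% is the overhead being argued about here; with three or four cards the number typically drops further.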


----------



## TheoneandonlyMrK (Oct 24, 2012)

fullinfusion said:


> Id rather spend a bit more money and know what I have, then spend less and wished I spent more to have more



Well, you could always HumanSmoke it and move on to the next system each year. Nice job you must have, Smokey; bit of a waste though, a new system each year just to slate AMD in threads.

My main rig's listed; it has 3 cards in it and x8 for the CrossFire setup. I tried x16 but can't keep that config with this card mix, and I only lose 2-4 fps out of a 60-80 avg on x8. The x4 for the PhysX card doesn't hinder it either, nor does the immense CPU I have. I'ma get me one, Dave; I'll tell you how it clocks on mad water cooling. I might do a tap OC run too, perma-fresh cold water.


----------



## fullinfusion (Oct 24, 2012)

theoneandonlymrk said:


> Well, you could always HumanSmoke it and move on to the next system each year. Nice job you must have, Smokey; bit of a waste though, a new system each year just to slate AMD in threads.

Forgive my ignorance in not understanding your British slang. What were you getting at?


----------



## Fourstaff (Oct 24, 2012)

cadaveca said:


> I feel it's better you educate yourself than relying on me with this subject in particular.


Precious little on the web, and I can never get my two friends with 670s together at any one time to test. 

Either way, I feel like there is a need for people with the equipment and knowledge (*hint* W1zz *hint*) to do a writeup on CF/SLI scaling on top of PCIe 3.0/2.0. I wonder if different platforms (X79 etc.) react differently.


----------



## Super XP (Oct 24, 2012)

cadaveca said:


> Total bandwidth is not the answer. So we've reached the limit of yours as well, eh?
> 
> 
> 
> ...


Can we get clock-for-clock comparisons of the FX-8150 vs. FX-8350, perhaps at 3.60GHz and 4.0GHz?
Thanks,


----------



## cadaveca (Oct 24, 2012)

Fourstaff said:


> Precious little on the web, and I can never get my two friends with 670s together at any one time to test.
> 
> Either way, I feel like there is a need for people with equipment and knowledge (*hint* Wiz *hint*) to do some writeup on CF/SLi scalings on top of PCIe 3.0/2.0. I wonder if different platforms (x79 etc) react differently.



Different platform? Not really. I mean, there are the PLX PEX8747 or whatever chips used to provide similar bandwidth on each platform, and I did some testing of course, but the same basic things applied.

What I can say for sure is that the very best result when it comes to scaling was with three cards, and only on the Gigabyte X79-UD5, the only X79 board I found to offer three links direct to the CPU with no bridge chips of any kind, and it easily out-performed any other board by 5-8%.

I did mention this to W1zz, and we kinda agreed that those other chips in the link also add latency just by being there, which is something commonly mentioned anyway when it came to NVIDIA's NF200. So naturally, adding another device in the link, namely another GPU, is going to add latency too.


So, because this performance problem exists with multiple GPUs, and drivers seemingly cannot deal with it effectively, running more than two GPUs doesn't make much sense, even with Eyefinity... the added latency of the third card at that res is a killer, IMHO.

The whole lack of perfect scaling when adding GPUs says it all, I think.



Super XP said:


> Can we get clock-for-clock comparisons of the FX-8150 vs. FX-8350, perhaps at 3.60GHz and 4.0GHz?
> Thanks,




Why? I did it at 5.0 GHz. I do not have the time for such things unfortunately, as it doesn't really add much usable info to purchasing advice.


----------



## HumanSmoke (Oct 25, 2012)

theoneandonlymrk said:


> well you could allways human smoke its ass and move on to the next system each year, nice job you must have smokey, bit of a waste though ,a new system each year just to slate Amd  in threads


Chef - the one that sets the menu.
No worries on the upgrade score, pal. Turning over a rig a year means I can resell for good cash because everything is still covered by warranty.
I build systems in my spare time - mainly bespoke watercooled gamers, w/c'ed and OC'd workstations for stock traders, and budget all-purpose builds for the deaf community, since the internet is an important access point to keep them in touch with the community (I'm profoundly deaf myself, so I generally know what the requirements are). I also really love building systems - more than using them, I think. To me, upgrading and building is adult LEGO, and I refuse to limit my enjoyment simply because someone else's idea of empirical cost effectiveness differs from my own.

As for slating AMD... only the BoD and the myopic Walter Mitty fanboy base that dwell in an alternate reality where AMD can do no wrong and benevolently rules the technological world.


theoneandonlymrk said:


> I might do a tap OC run too, perma-fresh cold water.


That should do wonders for limiting galvanic corrosion.

And to whoever burbled on about AMD having native USB 3.0 support... you'll find that only FM1/FM2 have that. The 700, 800 and 900 series southbridges don't - they rely on third-party controllers.


Fourstaff said:


> Either way, I feel like there is a need for people with equipment and knowledge (*hint* Wiz *hint*) to do some writeup on CF/SLi scalings on top of PCIe 3.0/2.0. I wonder if different platforms (x79 etc) react differently.


Already done. Here's an example from HardOCP. Bear in mind that the PCI-E 3.0 system they are using runs at x16/x16 (CFX/SLI) or x16/x8/x8 (3-way CFX/SLI). There are examples on the net which show lesser gains, and also greater gains (mainly using dual-GPU cards at high res).


cadaveca said:


> Total bandwidth is not the answer. So we've reached the limit of yours as well, eh?


That was never the point of my answer. The question/accusation posed by cdawall was that AMD's full 2.0-spec x16 implementation was somehow superior to Intel's x8 3.0 (the fact that there are plenty of Intel x16/x16 3.0-spec boards around seems to have escaped him). Of course there are other factors involved - game and driver coding to limit stalls in the CPU, GPU and memory subsystems come to mind - but there's not a lot of point flying off on a tangent when you're supplying answers to a specific statement.


----------



## eidairaman1 (Oct 25, 2012)

Honestly, who cares? It's out, it draws less power, and it runs faster than BD, so what's it matter anymore? These chips will replace BD totally and make room for SR, so let's end the nonsense.

(Beating a dead horse is only fun for maybe 2 minutes, then it's boring; some start sounding like broken records around here)


----------



## cedrac18 (Oct 25, 2012)

eidairaman1 said:


> Honestly, who cares? It's out, it draws less power, and it runs faster than BD, so what's it matter anymore? These chips will replace BD totally and make room for SR, so let's end the nonsense.
> 
> (Beating a dead horse is only fun for maybe 2 minutes, then it's boring; some start sounding like broken records around here)



Amen. It's funny, the people buying new GPUs and CPUs every year are probably the same ones slamming people buying iPads and MacBook Pros every year.


----------



## os2wiz (Oct 25, 2012)

cedrac18 said:


> Amen. It's funny, the people buying new GPUs and CPUs every year are probably the same ones slamming people buying iPads and MacBook Pros every year.



Enough said, you're right.


----------



## eidairaman1 (Oct 25, 2012)

os2wiz said:


> Well, I slam iPad and iPhone users, because they overpay just to be cool. The product is not a better productivity enhancer than a Samsung Android or a Toshiba Excite 10" Android tablet like I have. Saving $400 over an iPad 3 and being able to do everything I want on my tablet gives me a weird feeling of satisfaction. I can gloat at the fools standing in line and wasting hard-earned money on a status symbol.



Dude, knock it off, OK? There was no need to say any more. :shadedshu




cedrac18 said:


> Amen. It's funny, the people buying new GPUs and CPUs every year are probably the same ones slamming people buying iPads and MacBook Pros every year.



Ya, I hear ya. I use a Galaxy S 1 as my phone, a 7-year-old laptop as my mobile platform, and my sig rig at home. The phone definitely needs a fix; the laptop runs Skype just fine and doesn't overheat like the new ones on the market.


----------



## DaedalusHelios (Oct 25, 2012)

I see the 3570k and the FX8350 at roughly the same price. 
3570k = $230 on average
FX-8350 = $220 on average

Proper OC motherboards are roughly the same price too.

So why would you buy a processor like the FX-8350 that uses far more power and suffers from poor single-threaded performance, which is what we use most day-to-day? The multi-threaded performance was roughly equivalent.

Why are you guys talking about Apple? lol


----------



## os2wiz (Oct 25, 2012)

DaedalusHelios said:


> I see the 3570k and the FX8350 at roughly the same price.
> 3570k = $230 on average
> FX-8350 = $220 on average
> 
> ...



That difference will grow in about 2 weeks as the supply chain fills and the FX-8350 price gravitates toward the MSRP, which is the natural progression. In a month the difference will be about $30. Remember, prices are always inflated at product introductions.


----------



## eidairaman1 (Oct 25, 2012)

os2wiz said:


> That difference will grow in about 2 weeks as the supply chain fills and the FX-8350 price gravitates toward the MSRP, which is the natural progression. In a month the difference will be about $30. Remember, prices are always inflated at product introductions.



Yup, initial profits.


----------



## DaedalusHelios (Oct 25, 2012)

Also, I realize saying "far more power" is a bit of an overstatement. I should have said "considerably more."


----------



## eidairaman1 (Oct 25, 2012)

DaedalusHelios said:


> Also I realize saying far more power is a bit of an overstatement. I should have said considerably more.



If you put it that way. But ya, the first thing I noticed from Dave's review is that the power draw is down, which is a plus, because with that, clock ramping can happen on this design/stepping/OPN.

Hopefully they are taking the time, research, and care in development on test models of SR (IF they have made tangible prototypes).


----------



## os2wiz (Oct 25, 2012)

eidairaman1 said:


> If you put it that way. But ya, the first thing I noticed from Dave's review is that the power draw is down, which is a plus, because with that, clock ramping can happen on this design/stepping/OPN.
> 
> Hopefully they are taking the time, research, and care in development on test models of SR (IF they have made tangible prototypes).



So when are you ordering yours? I paid $210.90 at blt.com, but it's not in stock; I won't have it in less than 12 days. I am looking to pay less and get it in the next 4-5 days. Anybody got a good source? I almost ordered from Tiger Direct for $205.50. The economy shipping is $2.99 more, but delivery can take as long as 9 days, so I said no. If you guys know a better alternative, I'll cancel the order at blt and jump at it. I just refuse to pay Newegg the bloated price of $219.99, $25 above MSRP. I called them and bellyached about it. I have been a regular customer of theirs. As an enthusiast I have spent almost $10,000 there in the past 5 years. I told them they were price gouging like they did when Bulldozer was introduced. They blew me off. I'm just a drop in the sea to them.


----------



## eidairaman1 (Oct 25, 2012)

os2wiz said:


> So when are you ordering yours? I paid $210.90 at blt.com, but it's not in stock. I am looking to pay less and get it in the next 4-5 days. Anybody got a good source? I almost ordered from Tiger Direct for $205.50. The economy shipping is $2.99 more, but delivery can take as long as 9 days, so I said no.



Once things are golden in my life. I was thinking of eventually taking over my bro's machine that I built him last year. It probably won't have PD in it, as it has an unlocked PII.


----------



## DaedalusHelios (Oct 25, 2012)

eidairaman1 said:


> Once things are golden in my life. I was thinking of eventually taking over my bro's machine that I built him last year. It probably won't have PD in it, as it has an unlocked PII.



What do you mean? How old are you if you don't mind saying.....


----------



## eidairaman1 (Oct 25, 2012)

DaedalusHelios said:


> What do you mean? How old are you if you don't mind saying.....



check your PM


----------



## TheoneandonlyMrK (Oct 25, 2012)

Re tap water cooling



HumanSmoke said:


> That should do wonders for limiting galvanic corrosion.



And I care :shadedshu. I'm on TPU; max is the endgame at all times, dude, albeit max with what I have. I'm not calling you out for obsessive hardware buying, just the trolling, fella.

I am a different end user to you, and one you might consider, because an FX-8350 is right on plan for an upgrade to my system (and many like mine) and I doubt I'll be unhappy at 150 notes... period. I also don't foresee any issues continuing to run a CrossFire setup with a PhysX card and also a PCIe SSD, something Intel wanted mega money for at the time I rebuilt, and still wants proper money for now. I'd be happier with a better allocation of lanes, or manual adjustment, but ah well.


----------



## HumanSmoke (Oct 25, 2012)

theoneandonlymrk said:


> I'm not calling you out for obsessive hardware buying, just the trolling, fella


Bullshit. If trolling were a reason to get fired up, you'd have jumped all over cdawall's posting. Some of the guy's "facts" come straight out of Fantasia.
Personally, run whatever you want - this is supposedly a tech enthusiast community, and as such, I've not denigrated a single item of hardware in 290+ posts here (unlike some), because my interest is hardware, and not a _brand of hardware_... probably why I was pretty much certain what PD was bringing to the table some time before launch


----------



## eidairaman1 (Oct 25, 2012)

Well, 8 GHz was broken with the 8350, all modules and cores enabled.


----------



## cdawall (Oct 25, 2012)

HumanSmoke said:


> So what? My nephew uses my old hand-me-down 4 year old QX9650/X48 Rampage on LGA 775. It certainly isn't embarrassed by any AMD CPU performance.



You sure about that? Put 2 GPUs on it or stress the memory out and see how well that FSB does. If you want, I am sure I can find you a stack of benchmarks showing how that one goes. 



HumanSmoke said:


> The one that delivers on time.
> On a personal note, I like new stuff. If Haswell delivers, I might just buy it...but then, I tend to upgrade yearly- probably why Intel make so much money!



No kidding, you buy Intel; I honestly didn't see that one coming.



HumanSmoke said:


> The same can be said for any full featured Z77 and X79...or are you too myopic to see past a brand?



Except with Z77 and X79 you cannot just buy a brand new CPU every year; you will be buying a brand new motherboard as well. Or can *you* not see past a brand?



HumanSmoke said:


> Because AMD are the only company to evolve their CPU design ? because Intel CPU's lack multi-threaded performance ? Because the next 6-12 months are going to see an exponential growth in software tailored to Bulldozer architecture ? Because AMD can stick to a timetable and their performance estimates ?



What are you trying to get at with this one? You own a P6-design chip which dates back to the Pentium 3. It was designed in the middle of single-core needs. We are well past that. Software doesn't have to be tailored for Bulldozer; all it has to do is allow Bulldozer to use AVX. Weird how there is a CPUID flag in a lot of programs that blocks AMD even when it supports the technology. The rest of what you said is fluff.




HumanSmoke said:


> While you're answering those, maybe you can provide links to support your supposition of "every company complaining about multithreading".



How about Intel's guide to multithreading that they released for free for everyone to use, in hopes of getting programmers to multithread their apps, or this one where a programmer was cornered by Intel. Would you like some more?




HumanSmoke said:


> So what? How many people use every feature on a motherboard?



That is probably the dumbest argument for why Intel's $100+ boards don't offer any improvement over AMD's $60 mobos.



HumanSmoke said:


> Bleat on about dual x8 boards (presumeably for CFX/SLI) and rave about "mildly worse" in gaming. Sounds about right.



That is already considered a pretty bad review, but it figures *you* would find it. I am quite glad to see a $130-more-expensive processor perform better when not only overclocked higher, but also using a $50-more-expensive motherboard. For $180 I could buy a lot of things to make up for the frames. Like better cooling for a higher overclock, or better video cards that perform better.

What I find most interesting is that the 3770K they clocked up hit higher than most at 4.8GHz, and the 8350 fell short of every other review out there. Gee, I do wonder if that review might possibly have been a bit biased to one side.



HumanSmoke said:


> I'd think that more people might look at options such as choice in the mATX/ITX form factor, onboard WiFi and WiDi, SSD caching and the like.



SSD caching is available on ONE chipset. Not exactly making headway. AMD also has options for onboard WiFi.



HumanSmoke said:


> True, but whats AMD's track record ? I know which company I'd trust more to adhere to their timetable.



That's nice. 



HumanSmoke said:


> Rubbish. Dual GPU cards and CFX/SLI are already impacting the electrical restriction of PCI-E 2.0



Electrical restriction meaning what? Power consumption or performance? TPU's own review shows negligible difference between 2.0 and 3.0 PCI-e X16



HumanSmoke said:


> I see we've reached the limit of your knowledge.
> PCI-E 2.0 x16 = 80GT/sec * 80% (8b/10b encode) = 64Gbit/sec (8GB/sec)
> PCI-E 3.0 x8 = 64GT/sec * 98.46% (128b/130b encode) = 63Gbit/sec (7.88GB/sec)



I see we have substantially exceeded yours, as has been pointed out to you quite bluntly already. It isn't all about the bandwidth, but how it is delivered. The number of available lanes is a substantial hindrance to CrossFire and SLI performance. Hence why people do not want an x16/x4 mobo and instead look for x8/x8.



HumanSmoke said:


> Bullshit. If trolling was a reason to fired up, you'd have jumped all over cdawall's posting. Some of the of the guys' "facts" come straight out of Fantasia.



Also, if you would like to continue insulting not only myself but many other members of this forum, I am sure the moderators will have no issue escorting you off of it.



HumanSmoke said:


> Personally, run whatever you want- this is supposedly a tech enthusiast community, and as such, I've not denigrated a single item of hardware in 290+ posts here (unlike some), because my interest is hardware, and not a _brand of hardware_...probably why I was pretty much certain what PD was bringing to the table some time before launch



The post you quoted is you naming things that you don't think AMD will have. I fail to see how that proves anything in your favor.


----------



## drdeathx (Oct 25, 2012)

eidairaman1 said:


> Honestly, who cares? It's out, it draws less power, and it runs faster than BD, so what does it matter anymore? These chips will replace BD totally and make room for SR, so let's end the nonsense.
> 
> (Beating a dead horse is only fun for maybe 2 minutes, then it's boring; some start sounding like broken records around here)





LOL, that's why this is called a forum...


----------



## HumanSmoke (Oct 25, 2012)

cdawall said:


> You sure about that? Put 2 GPU's on it or stress the memory out and see how well that FSB does. If you want I am sure I can find you a stack of benchmarks showing how well that ones goes.


Don't need them, thanks; I ran a few myself back in the day. My last two LGA775 systems (the QX9650 w/ HD 5850BE CFX, and the Q9400 @ 417FSB with no bump in Vcore (1.2375V) at 3.33GHz + EP45-DS4P + GTX 280 SLI/HD 5850 CFX and later, briefly, 580 SLI w/SLI hack) were perfectly stable every day they ran.


cdawall said:


> Except with Z77 and X79 you cannot go buy a brand new CPU every year; you will be buying a brand new motherboard as well


And this is a problem for the enthusiast builder in what way?


cdawall said:


> What are you trying to get at with this one?


I thought it was obvious. I was asking questions.


cdawall said:


> You own a P6 design chip which dates back to Pentium 3. It was designed in the middle of single core needs. We are well past that.


So Intel CPU design is dead? Well, if it is you can rejoice. Don't save me a pew in your church just yet.


cdawall said:


> That is already considered a pretty bad review


And why is that? Because it doesn't bolster your argument? How about this one:







cdawall said:


> but figures *you* would find it. I am quite glad to see a $130 more expensive processor perform better when not only overclocked more


Better than what??? Maybe you can show me where I said anything about a comparison regarding performance, show me where I disparaged PD's performance. You seem to be making some straw man argument and trying to put forward an idea that AMD are the future and Intel's CPU's don't cut it.
All I've mentioned is a personal choice based on the local resell market (a scarcity/lateness to local market for AMD waterblocks doesn't help the enthusiast here either), and a general opinion of marketability from an OEM standpoint. You're the one with a Go-AMD or go home mentality.


cdawall said:


> What I find most interesting is the 3770K they clocked up hit higher than most at 4.8ghz and the 8350 fell short...


WTF are you talking about? the PCI-E 2.0 vs 3.0 article only featured Intel systems.


cdawall said:


> SSD caching is available on ONE chipset. Not exactly making headway.


Well no, actually it's both the chipsets you talked about earlier (Z77 and likely X79)...


cdawall said:


> Except with Z77 and X79 you cannot go buy a brand new CPU every year; you will be buying a brand new motherboard as well


...as well as Z68 and Z75 and H77 and Q77


cdawall said:


> AMD also has options for onboard WiFi.


Must be very prevalent if the only example you could find was a discontinued board. You might also note that my quote was related to what the majority of prospective computer buyers might look for (as opposed to "enthusiasts"). So, if you believe that as a marketing bulletpoint, multithreading is a better drawcard than WiFi, WiDi etc. etc. for the masses...


HumanSmoke said:


> *I'd think that more people might look at options* such as choice in the mATX/ITX form factor, onboard WiFi and WiDi, SSD caching and the like.


...then we indeed see marketing for the masses from a different perspective


cdawall said:


> Electrical restriction meaning what?


What it usually means...data I/O


cdawall said:


> TPU's own review shows negligible difference between 2.0 and 3.0 PCI-e X16


I thought that had been made relatively clear when I mentioned *dual-GPU *and *multi-GPU*


cdawall said:


> The number of available lanes is a substantial hamper on crossfire and SLi performance. Hence why people do not want a 16x/4x mobo instead looking for an 8x/8x.


SLI isn't available for any motherboard with an x16/x4 lane assignment, and four lanes mechanical/electrical does tend to bottleneck *some enthusiast-level cards* in *some* CPU-intensive games. We've (I've) already established that the difference between PCI-E 2.0 x16 @ x16 and PCI-E 3.0 x16 @ x8 is nominal for bandwidth, and favours the 3.0 spec for latency (encoding overhead).
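For reference, the arithmetic behind that encoding-overhead comparison can be sketched in a few lines (per-lane link rates come from the PCIe 2.0/3.0 specs; the helper function itself is just illustrative):

```python
# Effective one-direction PCIe bandwidth, accounting for line-code overhead.
# Per the PCI-SIG specs: PCIe 2.0 = 5 GT/s per lane with 8b/10b encoding,
# PCIe 3.0 = 8 GT/s per lane with 128b/130b encoding.

def effective_gbps(gt_per_lane, lanes, payload_bits, coded_bits):
    """Payload gigabytes per second for one direction of the link."""
    raw_gbit = gt_per_lane * lanes                 # GT/s == Gbit/s on the wire
    payload_gbit = raw_gbit * payload_bits / coded_bits
    return payload_gbit / 8                        # bits -> bytes

pcie2_x16 = effective_gbps(5.0, 16, 8, 10)     # ~8.0 GB/s
pcie3_x8  = effective_gbps(8.0, 8, 128, 130)   # ~7.88 GB/s
print(f"PCIe 2.0 x16: {pcie2_x16:.2f} GB/s")
print(f"PCIe 3.0 x8 : {pcie3_x8:.2f} GB/s")
```

So PCIe 3.0 at x8 lands within roughly 1.5% of PCIe 2.0 at x16, which is why the two configurations benchmark so close together.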


cdawall said:


> Also if you would like to continue to insulting not only myself...


No offense intended, but if I see opinion paraded as fact I tend to note it as such.


cdawall said:


> The post you quoted is you naming things that you don't think AMD will have I fail to see how that proves anything in your favor.


The reply was in response to SuperXP, whose assertion was that the PD benches circulating weren't PD but a Bulldozer revision. My assertion was/is that PD is basically a Bulldozer revision (as opposed to a full respin), and as such isn't incorporating RCM, which, with its lowering of the power envelope, was one of the distinguishing characteristics touted between Bulldozer and (the original) Piledriver. That plan now seems to be a tweak to accelerate PD's entry into the marketplace, and it possibly leaves room for a second revision *should* Steamroller be delayed (by GloFo's 28nm bulk process ramp speed or otherwise).


----------



## eidairaman1 (Oct 25, 2012)

drdeathx said:


> LOL, thats why this is called a forum...



thank you Lt Sarcasm


----------



## EarthDog (Oct 25, 2012)

cdawall said:


> SSD caching is available on ONE chipset.


It's at least on Z68 and Z77... not sure about P67... so that is at least two. It's a neat feature that AMD should have, IMO: useful for the 'demographic' that posts here and other n00blet home users that want a kick in the arse but can't afford a decent-sized SSD. One would think, with AMD's crosshairs firmly locked on the bang-for-your-buck crowd, that they would at least attempt to have something similar.

But yeah, it just came out on those chipsets, so.................. I wouldn't expect it to be on any others, LOL!


----------



## EarthDog (Oct 25, 2012)

cadaveca said:


> For me, memory bandwidth plays a large role in multi-GPU performance, and in a very perceptible way. I agree with your sentiment about memory performance in a big way for everything else, though.


@ Dave,

I tested out memory last night in 3DMark Vantage and 3DMark 11. With two 7850s and a 3770K at 4GHz (Z77 MPower), my score dropped 100 points in 3DMark 11. In Vantage it dropped a few hundred. Here is the thing, though: in BOTH tests, it lost those points because of the CPU and PhysX tests. The GPU scores remained the same.

For games, in BF3 and Batman, I saw negligible differences (1FPS). So at least with these lower-powered cards, it doesn't make a difference at all. I will have 7970s on the test bench here in a few days, and will try that again.


----------



## cadaveca (Oct 25, 2012)

EarthDog said:


> @ Dave,
> 
> I tested out memory last night in 3DMark Vantage and 3DMark 11. With two 7850s and a 3770K at 4GHz (Z77 MPower), my score dropped 100 points in 3DMark 11. In Vantage it dropped a few hundred. Here is the thing, though: in BOTH tests, it lost those points because of the CPU and PhysX tests. The GPU scores remained the same.
> 
> For games, in BF3 and Batman, I saw negligible differences (1FPS). So at least with these lower-powered cards, it doesn't make a difference at all. I will have 7970s on the test bench here in a few days, and will try that again.



Play a real game and watch minimums. Overall FPS does not increase, but minimums do not go as low. And yes, 3DMark 11 will show the same 3D score.


----------



## EarthDog (Oct 25, 2012)

BF3 and Batman are real games. Minimums changed negligibly as well in the games... I'm thinking it's game/card dependent. Perhaps the 7850s aren't pumping enough data through to saturate the pipe...

(Anyway, not about FX, my bad... we can yap in PM if you like)


----------



## drdeathx (Oct 25, 2012)

eidairaman1 said:


> thank you Lt Sarcasm





Just a friendly reminder!


----------



## eidairaman1 (Oct 25, 2012)

drdeathx said:


> Just a friendly reminder!



ya and that was me being friendly and funny


----------



## drdeathx (Oct 25, 2012)

eidairaman1 said:


> ya and that was me being friendly and funny





ME BALLS HURT!


----------



## cdawall (Oct 25, 2012)

HumanSmoke said:


> -random banter with zilch to do with FX 8350 review-



I think all of your posts here can be summed up with this one simple question.

Which pen do you like better?


----------



## eidairaman1 (Oct 25, 2012)

drdeathx said:


> ME BALLS HURT!



LMAO!


----------



## drdeathx (Oct 25, 2012)

cdawall said:


> I think all of your posts here can be summed up with this one simple question.
> 
> Which pen do you like better?
> 
> ...





I wouldn't go *that* far yet, but to tell you the truth, $200 for the 8350 is priced right, and AMD surprised me a bit here.


----------



## Super XP (Oct 25, 2012)

DaedalusHelios said:


> I see the 3570k and the FX8350 at roughly the same price.
> 3570k = $230 on average
> FX-8350 = $220 on average
> 
> ...


The best Socket AM3+ motherboard is the ASUS Crosshair V Formula; an Intel equivalent mobo would set you back a lot more. Combine this with the FX-8350 @ $200 and you can buy a better graphics card with the savings. Intel setups are overpriced; Intel mobos can go up to $500+ a board.


----------



## EarthDog (Oct 25, 2012)

Super XP said:


> The best Socket AM3+ motherboard is the ASUS Crosshair V Formula; an Intel equivalent mobo would set you back a lot more. Combine this with the FX-8350 @ $200 and you can buy a better graphics card with the savings. Intel setups are overpriced; Intel mobos can go up to $500+ a board.


Let's not mix oil and water... X79 boards may cost $500, but Z77 boards are cheaper by and large; $450 is the most, and that is because of Thunderbolt... BLEH. Thing is, you don't remotely need such a motherboard (on either platform). For example, at the same price as 'high end' boards like the CHVF, you can buy a Z77 MSI MPower or ASRock OC Formula. To that end, the ASRock Z77 Extreme4 will handle daily-driver clocks (4.5GHz+) easily, and it comes in at $130.

Apples to apples, however, the Z77 MVF is $280: $50 more than a CHVF.


----------



## cdawall (Oct 25, 2012)

EarthDog said:


> Let's not mix oil and water... X79 boards may cost $500, but Z77 boards are cheaper by and large; $450 is the most, and that is because of Thunderbolt... BLEH. Thing is, you don't remotely need such a motherboard (on either platform). For example, at the same price as 'high end' boards like the CHVF, you can buy a Z77 MSI MPower or ASRock OC Formula. To that end, the ASRock Z77 Extreme4 will handle daily-driver clocks (4.5GHz+) easily, and it comes in at $130.



As does the Biostar TA990FXE for $109. Now, that may not sound like much to some, but the couple of bucks between it and the CPU adds up to the difference between a graphics card using the stock cooler and one using an aftermarket cooler.

Or the ASRock 990FX for $119, which includes 8GB of DDR3-1600 right now; nice deal on that one.


----------



## EarthDog (Oct 25, 2012)

cdawall said:


> As does the Biostar TA990FXE for $109. Now, that may not sound like much to some, but the couple of bucks between it and the CPU adds up to the difference between a graphics card using the stock cooler and one using an aftermarket cooler.
> 
> Or the ASRock 990FX for $119, which includes 8GB of DDR3-1600 right now; nice deal on that one.


Right. The Extreme4 was just $110 with the same free RAM (expired). I see the (your) point here: $20 matters to a lot of people.

My point was just to pop the balloon of $500 Intel boards being anywhere close to relevant within this context; much lower-end boards on both sides will be fine... which brings Helios's point back to relevance as well (price; not getting into the performance thing).


----------



## cadaveca (Oct 25, 2012)

EarthDog said:


> which brings Helios's point back to relevance as well (price - not getting in to the performance thing).



Price is perfectly suited to performance. This is really a $269 chip(IMHO), but cost of ownership due to higher power draw = $200 is perfect.

AMD isn't messing around. Kinda reminds me of when 939 launched, actually...


----------



## drdeathx (Oct 25, 2012)

Super XP said:


> The best Socket AM3+ motherboard is the ASUS Crosshair V Formula. An Intel equivalent mobo would set you back a lot more. Combine this with the FX-8350 @ $200 and you can buy a better grhics card with the savings. INTEL setups are overpriced. INTEL mobos can go upgo $500+ a board





LOL, who says the Crosshair is the best motherboard? Please explain why.


----------



## cadaveca (Oct 25, 2012)

drdeathx said:


> LOL, who says the Crosshair is the best motherboard? Please explain why.



BIOS for memory clocking? If memory clocking/tweaking is truly important to you, ASUS wins hands down right now.

And that goes across ALL platforms. They have one damn good BIOS engineer.


----------



## drdeathx (Oct 25, 2012)

EarthDog said:


> Lets not mix oil and water... X79 boards may cost $500, but Z77 boards are cheaper by in large. $450 is the most and that is because of thunderbolt... BLEH. Thing is you dont remotely need such a motherboard (on either platform). For example, on 'high end' boards at the same price as the CHVF, you can buy a z77 MSI Mpower or Asrock OC Formula. To that end, the Asrock Z77 extreme 4 will handle daily driver clocks (4.5Ghz+) easily and that comes in at $130.
> 
> Apples to apples however the Z77 MVF is $280. $50 more than a CHVF.



Not all X79 boards are $500. The CPU performance is exactly the same on any Z77 or X79 motherboard; overclocking capabilities may be 100-200MHz better on a top-shelf board, but that's it. Gigabyte makes two X79 motherboards at the $300 price point. Just saying. Yes, Z77 boards are cheaper, and just above the AMD platform.

You cannot compare Intel boards to AMD boards. AMD boards do not have an Intel controller, plus they do not have PCIe 3.0. The raw extra performance is much better on the Z77 platform (SATA speed, PCIe speed, memory speeds), which really does not cost a heck of a lot more than the AMD platform. You can get a Z77 motherboard for $125 and, at Microcenter, a Core i5 3570K for $179, making Intel Z77 + Ivy Bridge cheaper than AMD. The i7 3770K at Microcenter is $279, making a potential Intel rig only $80 more than an 8350. Just sayin'.


----------



## TheoneandonlyMrK (Oct 25, 2012)

cadaveca said:


> Quote:
> Originally Posted by drdeathx
> LOL, who says the Crosshair is the best motherboard? Please explain why.
> 
> ...



Amen, brother, and that's another reason I like AMD chips at the minute: there is much more that can be tweaked and tuned. You could spend weeks getting it just so.


----------



## EarthDog (Oct 25, 2012)

drdeathx said:


> Not all X79 boards are $500. The CPU performance is exactly the same on any Z77 or X79 motherboard; overclocking capabilities may be 100-200MHz better on a top-shelf board, but that's it. Gigabyte makes two X79 motherboards at the $300 price point. Just saying. Yes, Z77 boards are cheaper, and just above the AMD platform.
> 
> You cannot compare Intel boards to AMD boards. AMD boards do not have an Intel controller, plus they do not have PCIe 3.0. The raw extra performance is much better on the Z77 platform (SATA speed, PCIe speed, memory speeds), which really does not cost a heck of a lot more than the AMD platform. You can get a Z77 motherboard for $125 and, at Microcenter, a Core i5 3570K for $179, making Intel Z77 + Ivy Bridge cheaper than AMD. The i7 3770K at Microcenter is $279, making a potential Intel rig only $80 more than an 8350. Just sayin'.


Yes, I know. I didn't mention that point as it's not relevant to the conversation. BUT there is one $600 X79 mobo, LOL! On average, X79 boards are much more expensive than Z77; the platform has been out for a year with only two mobos under $200 and none under $100. BUT it's the enthusiast platform vs. the mainstream one, so...

CPU performance IS different between those platforms DEPENDING ON THE CPU USED. For example, if someone got X79 and an i7 3820 (fool!) vs. a 3770K, the 3770K wins out with its faster clocks and IPC, as quad-channel memory really makes no difference for the most part. Now, if you use a 2600K and a 3820, performance should be the same, as they are both SB chips and quad-channel memory still doesn't matter.

Also, SATA speeds are the same (SATA III), and so are memory speeds... in fact, IB has a more robust IMC than SB-E chips, so memory can be faster speed-wise, though bandwidth (absolutely useless over dual channel for 99% of things) is greater. No point in a bigger pipe (quad channel) with the same amount of water flowing through it (data); it won't move faster.


----------



## cdawall (Oct 25, 2012)

EarthDog said:


> Right. The extreme 4 was just $110 with same same free ram (expired). I see the(your) point here. $20 matters to a lot of people.
> 
> My point was just pop the balloon of $500 intel boards being anywhere close to relevant within this context and much lower end boards on both sides will be fine... which brings Helios's point back to relevance as well (price - not getting in to the performance thing).



I agree 100%. The only downside on the Intel side is that to get some of the features available on even that $109 AMD board, you need X79; Z77 simply lacks PCIe lanes. Intel really did cut Z77 down in comparison. I wish NVIDIA still had a chipset division to make some "better" Intel boards.


----------



## erocker (Oct 25, 2012)

cdawall said:


> I agree 100%. The only downside on the Intel side is that to get some of the features available on even that $109 AMD board, you need X79; Z77 simply lacks PCIe lanes. Intel really did cut Z77 down in comparison. I wish NVIDIA still had a chipset division to make some "better" Intel boards.



The lack of PCIe lanes (if running dual video cards) makes almost zero difference, especially when comparing to a competing chipset like the 9 series from AMD.


----------



## drdeathx (Oct 25, 2012)

EarthDog said:


> Yes, I know. I didn't mention that point as it's not relevant to the conversation. BUT there is one $600 X79 mobo, LOL! On average, X79 boards are much more expensive than Z77; the platform has been out for a year with only two mobos under $200 and none under $100. BUT it's the enthusiast platform vs. the mainstream one, so...
> 
> CPU performance IS different between those platforms DEPENDING ON THE CPU USED. For example, if someone got X79 and an i7 3820 (fool!) vs. a 3770K, the 3770K wins out with its faster clocks and IPC, as quad-channel memory really makes no difference for the most part. Now, if you use a 2600K and a 3820, performance should be the same, as they are both SB chips and quad-channel memory still doesn't matter.
> 
> Also, SATA speeds are the same (SATA III), and so are memory speeds... in fact, IB has a more robust IMC than SB-E chips, so memory can be faster speed-wise, though bandwidth (absolutely useless over dual channel for 99% of things) is greater. No point in a bigger pipe (quad channel) with the same amount of water flowing through it (data); it won't move faster.





The 3820 has 10MB of L3 cache; the 2600K has 8MB.


----------



## EarthDog (Oct 25, 2012)

> Z77 simply lacks PCI-e lanes...


I hear this a lot... but to what end? Who needs them? Tri-SLI+ people, perhaps (benching). Some Z77 boards also use a PLX chip to provide more lanes; yes, latency is added, but performance is only a couple of percent different with that vs. native lanes on X79 (which, until IB-E comes out, is still PCIe 2.0, IIRC). Tests here show that at x8/x8 you barely lose a thing.

Technically you are correct, but the differences in performance are negligible for dual-card people regardless, so I guess: does it really matter?



drdeathx said:


> The 3820 has 10MB of L3 cache; the 2600K has 8MB.


Bears poop in the woods... what's your point?

Performance is performance. I don't care if I have a boosted rice rocket that runs 12's or an N/A big-block car that runs 12's... they still run 12's; they just get there a different way.

Look: with a 200MHz clock-speed advantage and a cache advantage, it trades punches with a 2600K (AND the 3820 can use 40W more!!!).



erocker said:


> I think so. Price/performance, even with Intel being more expensive is still on Intel's side. Much more so. You're right with x8 + x8, the thing is with AMD and x16 + x16 performance can be up to 50% less.. Not to mention single GPU performance with Bulldozer/Steamroller doesn't compare to a similarly priced chip such as the 2500K. Of course I'm talking gaming whether that's important to you or not.


Indeed... I have my blinders on today.. 

thanks!


----------



## erocker (Oct 25, 2012)

EarthDog said:


> I guess does it really matter?



I think so. Price/performance, even with Intel being more expensive is still on Intel's side. Much more so. You're right with x8 + x8, the thing is with AMD and x16 + x16 performance can be up to 50% less.. Not to mention single GPU performance with Bulldozer/Steamroller doesn't compare to a similarly priced chip such as the 2500K. Of course I'm talking gaming whether that's important to you or not.


----------



## cdawall (Oct 25, 2012)

erocker said:


> The lack of PCI-E lanes (if running dual video cards) makes almost zero difference especially when comparing to a competing chipset like the 9 series from AMD.



I agree; however, what happens when you are not using those PCIe lanes for video cards? A fully loaded system with two video cards at x8/x8 plus a RAID card at x16 is something you can do on AMD without a PLX chip. Will most users take advantage of that? Heck no. However, for an encoder box I for one would be looking at a Piledriver chip to begin with, since they perform better in that exact instance, and the lane configuration is much more usable for that. I feel like a comparison including a big fancy RAID card would show major differences.



EarthDog said:


> I hear this a lot... but to what end? Who needs them? Tri+ SLI people perhaps (benching). Z77 also has a PLX chip to give more lanes. Yes, latency is added, however, performance isnt but a couple % different with that vs native lanes on X79 (which until IB-E comes out is still PCIe2.0 IIRC). Tests here show 8x/8x you barely lose a thing.



I am already looking at swapping to tri-SLI myself. It isn't the most effective way to fix my lack of graphics power, but I have three GTX 470's, so why not.

I honestly just find it frustrating that Intel specifically cuts down Z77 to try to force people into paying more for X79. AMD has proven that for under $150 you can have just about every X79 feature, yet in that price range on Intel you don't have those options. You get handed a constantly replaced socket (which is EOL soon, with LGA1150 coming out) and a CPU that is better suited to single-threaded IPC rather than multithreading. While I will not argue that, at this time, a 3570K can beat an 8350 pretty handily in quite a few games, what is to say that's not changing? We are already seeing it in other apps; games are next.



EarthDog said:


> Technically you are correct, but the differences in performance are negligible for dual card people regardless so, I guess does it really matter?



I don't think that specifically does; I think the all-around package does. When Steamroller comes out, it will be nice to take a brand new $200-250 chip and drop it onto a 990FX motherboard. Will it be the latest and greatest board? NOPE, but as we see now it already has all the PCIe lanes and features available, which might make the board still heavily competitive then.



erocker said:


> I think so. Price/performance, even with Intel being more expensive is still on Intel's side. Much more so. You're right with x8 + x8, the thing is with AMD and x16 + x16 performance can be up to 50% less.. Not to mention single GPU performance with Bulldozer/Steamroller doesn't compare to a similarly priced chip such as the 2500K. Of course I'm talking gaming whether that's important to you or not.



Here is my thing on the gaming: are you over 60FPS? Well, cool. I know I cannot see the difference between 60FPS and 100FPS, and considering my monitors are OC'd to 65Hz, I know I will never be able to see the difference, because the monitors can't even display it.






It doesn't even do that badly. The only game that is terrible is Skyrim, which I don't play, so I could care less.

Now take that same 2500K and transcode a video.






There seems to be more of a performance difference in that particular instance than in the video games of the same review...


----------



## EarthDog (Oct 25, 2012)

You are asking them to cater to the EXTREME minority who use more than two cards, or two cards plus an incredibly expensive x16 RAID card (a proper one, anyway). Those people should go X79. Simple to me. I don't go to Nissan begging them to shoehorn the Maxima motor into a Sentra either.

As far as the other stuff, we can both list specific examples to fit our talking points, as was done here (and out of drdeath's generic context)... but I don't want to get into that (I specifically mentioned that point). Each person buying a PC needs to understand his/her own needs and budget, then choose a platform accordingly.

As for you, or anyone: NO FKING way would I tri-SLI three 470's. Too much power, too much heat, and tri-SLI+ is not very efficient. If you have 'em, I guess, but it's not worth the hassle to me. I'd rather sell those bad boys and grab a 7970 + 12.11 at this time. I WILL BE perfectly happy with that setup at 2560x1440 playing BF3 on ultra (680 Lightning currently).


----------



## erocker (Oct 25, 2012)

cdawall said:


> I agree however what happens when you are not using those PCI-E lanes for video cards. A full loaded system with two video cards in 8x/8x with a raid card at 16x is something you can do on AMD without a PLX chip. Will most users take advantage of that? Heck no. However in an encoder box I for one would be looking at a Piledriver chip to begin with since they perform better in that exact instance, but the lane configuration is much more usable for that. I feel like a comparison showing a big fancy raid card would show major differences.
> 
> 
> 
> ...



You kinda cherry-picked a couple of benchmarks there, but I see your point. Then again, talking about not needing anything over 60 fps and then talking about using two video cards and a RAID controller makes no sense to me. As a gamer I'd still take the 2500K (or even a lesser Intel chip) over any AMD offering, which really kinda sucks for me.


----------



## cdawall (Oct 25, 2012)

EarthDog said:


> You are asking them to cater to the EXTREME minority who uses more than 2 cards or two cards plus an incredibly expensive 16x RAID card (a proper one anyway). Those that should go X79. Simple to me. I dont go to Nissan begging them to shoehorn the Maxima Motor in a Sentra either.
> 
> As far as the other stuff, we can both list specific examples to fit our talking points as was done here (and out of drdeath's generic context)... but I dont want to get in to that (specifically mentioned that point). Each person buying a PC needs to understand his/her own needs and their budget, then choose a platform accordingly.
> 
> As for you, or anyone, NOFKING way would I tri SLI 3 470's. Too much power, too much heat, and tri SLI+ is not very efficient. If you have em, I guess. But not worth the hassle to me. Id rather sell those bad boys and grab a 7970 + 12.11 at this time. I WILL BE perfectly happy with that setup on 2560x1440 playing BF3 on ultra.



There are more people running Bulldozer rigs to encode and transcode than you think. They are good at it, just like P4's were. The Bulldozer and Piledriver rigs do some other things well also, but encoding is their bread and butter. It is, as far as I am concerned, a server chip released on a desktop board.

As for the GTX 470's: I don't pay for electricity (in fact, you probably do), so the more the merrier.



erocker said:


> You kinda cherry picked a couple benchmarks there. But I see your point. Then again talking about not needing anything over 60 fps and then talking about using two video cards and a raid controller makes no sense to me. As a gamer I'd still take the 2500k (or even a lesser Intel chip) over any AMD offerings which really kinda sucks for me.



I picked the multithreaded benchmarks it is designed to compete in. Single-threaded IPC is dead; I could care less if a Pentium with MMX beats it. As for the multi-card and SAS... multipurpose rig? Video games and video encoding seem simple to me.


----------



## erocker (Oct 25, 2012)

cdawall said:


> Single-threaded IPC is dead



I don't believe it is at all. Plenty of applications say otherwise.


----------



## m1dg3t (Oct 25, 2012)

Good review, Dave! As usual. I sure am glad to see AMD making headway with their new tech; Intel is becoming complacent with the lack of direct competition. IVB TIM fiasco, anyone?

From my experience it was/is always more cost effective to build an AMD-based system; you generally got more for your $$$. It also seems that AMD has always catered more to the "tweakers", as they always include a plethora of options to play with. I recall the days when AMD was "wiping the floor" with Intel; they weren't long ago... The shoe is on the other foot now, and soon the cycle will repeat. Nature of the beast.

Keep up the good work, Dave


----------



## cdawall (Oct 25, 2012)

erocker said:


> I don't believe it is at all.  Plenty of applications say otherwise.



Then disable all but one module on Piledriver and clock it up. People have hit 5.5+ GHz on 4170's with both modules enabled; I am sure you can get even higher with Piledriver. The same thing would fix the terribly coded Skyrim.


----------



## erocker (Oct 25, 2012)

cdawall said:


> Then disable all but one module on Piledriver and clock it up. People have hit 5.5+ GHz on 4170's with both modules enabled; I am sure you can get even higher with Piledriver. The same thing would fix the terribly coded Skyrim.



Did that with an 8150. Single IPC still isn't good.


----------



## cdawall (Oct 25, 2012)

erocker said:


> Did that with an 8150. Single IPC still isn't good.



Clock it higher and make sure to kick the bus up while you're at it. Pure multiplier overclocks don't help as much.


----------



## cadaveca (Oct 25, 2012)

cdawall said:


> Clock it higher and make sure to kick the bus up while you're at it. Pure multiplier overclocks don't help as much.



....


8350 is not 8150.

or 965BE.


----------



## erocker (Oct 25, 2012)

cdawall said:


> Clock it higher and make sure to kick the bus up while you're at it. Pure multiplier overclocks don't help as much.



Did that; single IPC is still substandard compared to its competition. Besides, I sold that chip for a much more capable 3770K. Oh well, I'm glad some people find AMD's CPU offerings up to their needs. They just don't meet mine. While the 8350 shows improvement over the 8150, it's still not enough for me when I can run my CPU at stock and not have any issues (or just have more FPS) in gaming.

http://www.hardocp.com/article/2012/10/22/amd_fx8350_piledriver_processor_ipc_overclocking/5


----------



## cadaveca (Oct 25, 2012)

erocker said:


> They just don't meet mine.



You upgrade regularly.


For those on a budget who don't already have socket 1155 Intel chips, the 8350 is a damn mighty good option.


For you...nah, not so much.


Kinda silly to even discuss. 



However, I swapped out my i7-3820 for an FX-8350.


----------



## erocker (Oct 25, 2012)

cadaveca said:


> You upgrade regularly.
> 
> 
> For those on a budget who don't already have Intel chips, the 8350 is a damn mighty good option.
> ...



Well sure! If you have the chipset to run it, it's a logical upgrade.


----------



## drdeathx (Oct 25, 2012)

EarthDog said:


> Bears poop in the woods.... what's your point?
> 
> Performance is performance. I don't care if I have a boosted rice rocket that runs 12's, or an N/A big block car that runs 12's... they still run 12's, they just get there a different way.
> 
> thanks!



The point is 2 MB more of L3 cache. What did you miss?

Level 3, or L3, cache is specialized memory that works hand-in-hand with the L1 and L2 caches to improve computer performance.


----------



## EarthDog (Oct 25, 2012)

cdawall said:


> As for the GTX 470's I don't pay for electricity (in fact you probably do) so the more the merrier.


PAYING for it isn't the problem for me... It's the fact that it uses WAY more power than it needs to for similar performance... the fact that scaling is poor at tri-SLI+, the heat dump, and the potential problems multi-GPU setups run into in games (no profiles, to name one). 

Oh well, I digress. 



drdeathx said:


> The point is 2 MB more of L3 cache. What did you miss?
> 
> Level 3, or L3, cache is specialized memory that works hand-in-hand with the L1 and L2 caches to improve computer performance.


I'm not missing a thing. It trades punches with a 2600K *even though the 2600K has less cache and a 200 MHz clock-speed deficit*. So yippee, it has more cache, but its performance is what it is WITH the extra cache, so the amount of cache doesn't matter on its own. Did any review disable the cache for comparison? LOL, no... so again, that point was completely irrelevant. Are you with me now? 



cadaveca said:


> Price is perfectly suited to performance. This is really a $269 chip(IMHO), but cost of ownership due to higher power draw = $200 is perfect.
> 
> AMD isn't messing around. Kinda reminds me of when 939 launched, actually...


I missed this post.. Yes, correct, it's priced right, but that was never the issue (right? it wasn't in my head! ). 

I don't find this remotely like 939. In the s939 days, AMD led in performance nearly across the board... it was also MUCH easier to compare apples to apples at that point, as it was single vs. single and dual vs. dual core, and I think, IIRC, pricing was in the ballpark. These days, it depends on who you talk to as to what is relevant.. Let's not forget, though, that the original FX chips were priced terribly, just like Intel's "X" chips are these days.

Is it fair to compare an 8-'core' AMD vs. an Intel quad/quad with HT or hex with HT, price notwithstanding? Is it fair to go solely by price, ignoring core/thread count? It's dizzying!!


----------



## bpgt64 (Oct 25, 2012)

Replacing my Phenom X4 955 with this in my work ESXi box. But alas, I am cheap... 8320.


----------



## cdawall (Oct 25, 2012)

erocker said:


> Did that; single IPC is still substandard compared to its competition. Besides, I sold that chip for a much more capable 3770K. Oh well, I'm glad some people find AMD's CPU offerings up to their needs. They just don't meet mine. While the 8350 shows improvement over the 8150, it's still not enough for me when I can run my CPU at stock and not have any issues (or just have more FPS) in gaming.
> 
> http://www.hardocp.com/article/2012/10/22/amd_fx8350_piledriver_processor_ipc_overclocking/5



I despise [H] and he despises AMD, so why bother reading that?


----------



## Super XP (Oct 25, 2012)

erocker said:


> Did that; single IPC is still substandard compared to its competition. Besides, I sold that chip for a much more capable 3770K. Oh well, I'm glad some people find AMD's CPU offerings up to their needs. They just don't meet mine. While the 8350 shows improvement over the 8150, it's still not enough for me when I can run my CPU at stock and not have any issues (or just have more FPS) in gaming.
> 
> http://www.hardocp.com/article/2012/10/22/amd_fx8350_piledriver_processor_ipc_overclocking/5


It's an improvement nevertheless, and it runs at lower voltage too. AMD is headed in the right direction; they just have to ensure Steamroller blows both Bulldozer and Piledriver out of the water in 2013.


----------



## m1dg3t (Oct 25, 2012)

cdawall said:


> I despise [H]



You are too kind sometimes! [H]ardforum is one of the biggest pieces of poop I have ever come across on the net. The only thing worse than their colour scheme is the majority of their community. 



Super XP said:


> It's an improvement nevertheless, and it runs at lower voltage too. AMD is headed in the right direction; they just have to ensure Steamroller blows both Bulldozer and Piledriver out of the water in 2013.



YES!! Too much head in butt disease in the world these days.


----------



## erocker (Oct 25, 2012)

Super XP said:


> It's an improvement nevertheless, and it runs at lower voltage too. AMD is headed in the right direction; they just have to ensure Steamroller blows both Bulldozer and Piledriver out of the water in 2013.



One can keep hoping, I guess. Wishful thinking doesn't make me buy a product though. I think there's more of a chance of AMD leaving the desktop CPU market, especially in light of how horrible their management has been. Yes, I am disappointed with Piledriver and with AMD as a whole. Not to mention how much money I lost on their company due to their incompetence.


----------



## Ikaruga (Oct 26, 2012)

Thanks for the review. 

It's a very good chip for that price indeed, but let's not forget that this is not an entry-level CPU. People who are going for this kind of performance can't ignore the speed and efficiency that Ivy Bridge has to offer. That being said (and while I personally prefer Ivy Bridge over Piledriver atm), I still think that this aggressive pricing will pay off for AMD in the long run.

*ps.:* You accidentally listed an "i5-3770K" in Test System 4


----------



## Am* (Oct 26, 2012)

Glad my prediction was right -- this IS Bulldozer done right.

If I were still on AM3, I'd buy this CPU without any hesitation. If it had been released a little over a year ago, I probably never would have gone with Sandy Bridge for my build.


P.S. Any chance of including the 2500K in the comparisons? The way I see it, especially in the 3D benchmarks, Ivy Bridge could be edging out the FX-8350 more than it should due to PCI-E 3.0. The 2500K would be fair game.


----------



## DaedalusHelios (Oct 26, 2012)

cdawall said:


> I despise [H] and he despises AMD so why bother reading that.



Well, from the review posted by erocker, Kyle said,



> "Is Vishera a better part than Zambezi? Yes it is. And in some areas of performance, quite a lot. AMD has done some great things with Piledriver when compared to its previous Bulldozer architecture. AMD can surely claim a victory in that arena and kudos to its engineers for doing so."



I don't like Kyle personally, but I wouldn't say he is super biased when it comes to computer hardware. Are the most vocal members of [H] staff conservative like Glenn Beck politically? Yes, and it shows up in their news and general sentiment, unfortunately. That, and they don't treat people with respect for the most part. :shadedshu 

I enjoy their trade forum and marketplace though. They moderate their marketplace very well and I am thankful for that.


----------



## cdawall (Oct 26, 2012)

DaedalusHelios said:


> I don't like Kyle personally, but I wouldn't say he is super biased when it comes to computer hardware. Are the most vocal members of [H] staff conservative like Glenn Beck politically? Yes, and it shows up in their news and general sentiment, unfortunately. That, and they don't treat people with respect for the most part.
> 
> I enjoy their trade forum and marketplace though. They moderate their marketplace very well and I am thankful for that.



Somehow he managed to do an entire benchmark run where AMD lost. The only processor at stock speed was the FX-8350, and yet you think it was unbiased. Last time I checked, unlocked or not, most people do NOT overclock. A proper benchmark should at bare minimum have shown stock vs. stock performance, with a section dedicated to clock vs. clock, which we all knew going in AMD would lose. Hence it is a completely worthless review. 

As I said in the other FX-8350 thread, it completely depends on the benchmark:

*(benchmark screenshots)*

Weird how other websites can allow AMD to show higher performance than Intel and Kyle cannot. There are also many more benchmarks that show the FX-8350 beating the 3770K in multithreading. If you want, I can go hunt them all down; then everyone can just revel at how much better at "games" Intel is.


----------



## Ikaruga (Oct 26, 2012)

cdawall said:


> Weird how other websites can allow AMD to show higher performance than Intel and Kyle cannot. There are also many more benchmarks that show the FX 8350 beating the 3770K in multithreading. If you want I can go hunt them all down, then everyone can just revel at how much better in "games" Intel is.



*-* TPU's review uses Win7. IIRC, Windows 8 has some additional SMT optimization which might produce different results in (rare) heavily multithreaded cases. 

*-* Also: sadly, developers are still not (or can't be) putting enough effort into making their code more SMT friendly, and efficiently and fully feeding 8 cores won't just happen by "itself", so 80-90% of the time you are going to have to rely on the performance your CPU can give you with 1-3 threads. I wish everything would just run on as many cores as the CPU has, but that's still not how it is today.


----------



## EarthDog (Oct 26, 2012)

cdawall said:


> As I said in the other FX 8350 thread it completely depends on the benchmark


I don't go to [H] for any reason (PSU reviews, maybe)... but do they use these same benchmarks for Intel CPUs? If not, you have a point; otherwise, it's a pretty big jump to say they did that intentionally, ya know? I mean, look around at other sites and see they all use different things. In no way can one say they cherry-picked benchmarks if they have used the same benchmarks for a while..

Perhaps it's time to admit we landed on the moon, there was only one shooter of JFK, and the 9/11 building collapse was caused by the terrorist planes and not the government demolishing them...


----------



## cdawall (Oct 26, 2012)

EarthDog said:


> I don't go to [H] for any reason (PSU reviews, maybe)... but do they use these same benchmarks for Intel CPUs? If not, you have a point; otherwise, it's a pretty big jump to say they did that intentionally, ya know? I mean, look around at other sites and see they all use different things. In no way can one say they cherry-picked benchmarks if they have used the same benchmarks for a while..
> 
> Perhaps it's time to admit we landed on the moon, there was only one shooter of JFK, and the 9/11 building collapse was caused by the terrorist planes and not the government demolishing them...



Actually, he did skip any benchmark that showed AMD performing better. Take a look at the 3960X review:

*(benchmark screenshots)*

Those are just a couple of the benchmarks he skipped.


----------



## EarthDog (Oct 26, 2012)

Ooooof..... wow. Well, that was almost a year ago; what did they use when IB was released in April? It's possible they changed it up since then? I don't know; clearly we see it differently until one of us can prove something.

EDIT: the IB review doesn't have those either... http://www.hardocp.com/article/2012/04/23/intel_ivy_bridge_processor_ipc_overclocking_review/1


----------



## HumanSmoke (Oct 26, 2012)

cdawall said:


> There are also many more benchmarks that show the FX 8350 beating the 3770K in multithreading. If you want I can go hunt them all down


You can argue that point 'til the cows come home- best case scenario is win some, lose some, even in an *ideal* test environment (no Intel compiler + multithreaded). The problem being that to realize any notable performance lead over IB you'd have to aggregate a very esoteric software suite, in a niche OS market, and not be overly concerned about power usage... which would seem counter-productive given that software utilization is geared towards heavily multi-threaded and CPU-intensive workloads.

*GREAT* / *NOT GREAT* *(benchmark charts)*

[source]

BTW: If anyone was interested in Windows 8 vs Windows 7 performance for Piledriver, I'd suggest a read of the AMD affiliated Planet3DNow review.


----------



## Ikaruga (Oct 26, 2012)

HumanSmoke said:


> BTW: If anyone was interested in Windows 8 vs Windows 7 performance for Piledriver, I'd suggest a read of the AMD affiliated Planet3DNow review.



Thanks for that link. It's nice to see that Win7 is still performing better, even if the difference is marginal. 

ps.: at least Win8 still has a nice free Pinball


----------



## cdawall (Oct 26, 2012)

HumanSmoke said:


> You can argue that point 'til the cows come home- best case scenario is win some, lose some, even in an *ideal* test environment (no Intel compiler + multithreaded). The problem being that to realize any notable performance lead over IB you'd have to aggregate a very esoteric software suite, in a niche OS market, and not be overly concerned about power usage... which would seem counter-productive given that software utilization is geared towards heavily multi-threaded and CPU-intensive workloads.
> *GREAT*
> http://img.techpowerup.org/121026/piledriver 0.png
> *NOT GREAT*
> ...



Look closely at the compiler information at the bottom of those images. In the top one, where AMD performed better, you will notice it used the gcc options -lssl -lcrypto -lm -lz -fopenmp -lcrypt -ldl, while the one where AMD performs worse only uses the gcc options -O3 -march=native. What is interesting about that is it shows what both sets of processors are allowed to use, and oddly enough you appear to have linked proof that AMD is not as good at single IPC. Did you happen to figure that out on your own? I do not think I have seen that mentioned here before.


----------



## EarthDog (Oct 26, 2012)

OR........ they are completely different benchmarks using different compiling options/abilities? (No idea... just an ignoramus looking at it, but it seems an easy thing, true or not, for AMD users to fall back on.) Do you know what those options do? If so, can you explain them, as I don't know?


----------



## cdawall (Oct 26, 2012)

EarthDog said:


> OR........ they are completely different benchmarks using different compiling options/abilities?
> 
> You wear your green glasses so proud!!!



Same compiler, different benchmarks: one is testing single IPC, the other is testing true multithreading. You wear your blue ones like a champ.


----------



## EarthDog (Oct 26, 2012)

There has to be some balance to the force, bubba... I really feel the green team, at the slightest smell of blue, sure does get their hackles up in a hurry, and most of the time for little reason. Seems like beaten-wife syndrome... flinching all the time for little reason, LOL!

And BULLSHIT do I wear blue glasses... I'm just trying to get to the bottom of this, but it's tough with the incessant pissing contests some people present, ya know? I'm looking for knowledge and explanation, and right now, all I see, due to lack of knowledge (on my end), are two different benchmarks using two different options, and I have no idea why... Do you? Can you explain it please instead of leaving a cliffhanger for those that don't know? What if you used option a on bench B and option b on bench A?

Thanks.


----------



## cdawall (Oct 26, 2012)

EarthDog said:


> There has to be some balance to the force, bubba... I really feel the green team, at the slightest smell of blue, sure does get their hackles up in a hurry, and most of the time for little reason.
> 
> And BULLSHIT do I wear blue glasses... I'm just trying to get to the bottom of this, but it's tough with the incessant pissing contests some people present, ya know? I'm looking for knowledge and explanation, and right now, all I see, due to lack of knowledge, are two different benchmarks using two different options, and I have no idea why... Do you?



I honestly don't wear green ones. As we post away, I am finishing up my Intel box. It does play games better, so I am bringing it with me on this deployment. Most games are still heavily single-IPC bound. AMD does perform better in encoding; as I said before, it lags in games but is still able to produce more than 60 FPS at normal resolutions. So that makes it your choice. Do you want more FPS that you cannot see on the vast majority of monitors, or do you want higher performance when you do encoding?


----------



## EarthDog (Oct 26, 2012)

That didnt answer the questions I had above... at all. LOL!


----------



## cdawall (Oct 26, 2012)

EarthDog said:


> That didnt answer the questions I had above... at all. LOL!



Skimmed the post and missed it. They use different options to force single core usage vs multithreading.


----------



## EarthDog (Oct 26, 2012)

cdawall said:


> Skimmed the post and missed it. They use different options to force single core usage vs multithreading.


Keep going... that doesn't explain why it matters within this context... please don't make me extract the information. If you don't know what it does, you don't know... 

I mean, single-threaded bench, single-threaded option... multi = multi... where is the problem????


----------



## cdawall (Oct 26, 2012)

EarthDog said:


> Keep going... that doesn't explain why it matters within this context... please don't make me extract the information. If you don't know what it does, you don't know...
> 
> I mean, single-threaded bench, single-threaded option... multi = multi... where is the problem????



There is no problem. I was simply pointing out that he introduced no new information; everyone reading this thread knows that Intel excels in single IPC while AMD excels in multithreading, so why bother posting yet another benchmark that shows the same thing? May as well go back to the review from [H] and look at HyperPi.


----------



## EarthDog (Oct 26, 2012)

cdawall said:


> There is no problem. I was simply pointing out that he introduced no new information; everyone reading this thread knows that Intel excels in single IPC while AMD excels in multithreading, so why bother posting yet another benchmark that shows the same thing? May as well go back to the review from [H] and look at HyperPi.


Huh, my English must be getting rusty (native speaker, LOL!)... seemed like you thought it was a problem.....



> Look closely at the compiler information at the bottom of those images. In the top one, where AMD performed better, you will notice it used the gcc options -lssl -lcrypto -lm -lz -fopenmp -lcrypt -ldl, while the one where AMD performs worse only uses the gcc options -O3 -march=native.



Not knowing most of what that was, further explanation was warranted to clarify your meaning.


----------



## HumanSmoke (Oct 26, 2012)

cdawall said:


> Do you want more FPS that you cannot see on the vast majority of monitors or do you want higher performance when you do encoding?


Some people might want both. And as for true multithreading for encoding:

*(benchmark charts)*

I'd also note that the newer build of x264 actually improves IB in relation to PD... by 7% on the first pass and 0.5% on the second pass... so much for progress, eh!
...and H.264 just for comparison


----------



## Super XP (Oct 26, 2012)

Once again it shows the FX-8350 does very well.


----------



## os2wiz (Oct 27, 2012)

Super XP said:


> Once again it shows the FX-8350 does very well.



Correctamon, Gaston.

I decided I'm canceling the preorder I put in with BLT. They changed the expected arrival date of the processors in their warehouse from 11/02/12 to 11/24/12. That means I would not get it until just before December 1. That was just too much. It also means not only are they near the bottom of the supply chain, but the processor is selling out very fast from the preferred large vendors like Newegg (not available now), Amazon (not available now), and Tiger Direct. Earlier in the day Tiger Direct had the price at $229.00, up $10. I checked a couple of hours ago; it was available again and the price was back to $219.00. I broke down and ordered it, even though that price rubs me the wrong way. From how I read the market, the supply chain will not be saturated for at least 3-4 more weeks. That means pricing will not drop for another month. Couldn't wait that long. I paid a few extra dollars to get it second day. Should have it by next Wednesday. Hopefully BIOS 1703 for the Crosshair V will be stable. I had some blue screens with it when I first installed it. I adjusted some memory timings and I'm keeping my fingers crossed.


----------



## Super XP (Oct 27, 2012)

What I would do is keep your setup at its default speeds, such as your RAM, until you get your FX-8350. I know for a fact the Crosshair V Formula sets your RAM speed wrong when you leave it on auto for some reason.


----------



## os2wiz (Oct 27, 2012)

Super XP said:


> What I would do is keep your setup at its default speeds, such as your RAM, until you get your FX-8350. I know for a fact the Crosshair V Formula sets your RAM speed wrong when you leave it on auto for some reason.


 
I relaxed the second, third, and fourth values in my DOCP profile. It seems a lot better now.
I did run a bunch of benchmarks and it didn't crash. 


By the way, I have a Gigabyte Radeon 6950 2 GB graphics card. About 6 months ago I decided to load a program that said it would unlock unused shaders and essentially convert the card to a 6970. Of course this was done by modding the BIOS. They asked me to save the old BIOS and I opted to save it. I never noticed any real gain from the modding, but I had some issues with crashes in the games I played. So now of course I have a generic AMD BIOS. Gigabyte has a utility to flash the BIOS with any updates, but of course now it does not recognize my card. I used that modding utility to restore it back to a 6950, and it used a generic 6950 BIOS instead of restoring the original BIOS I thought I had saved. Do you think there is any way Gigabyte would help me restore my Gigabyte BIOS?


----------



## Super XP (Oct 27, 2012)

os2wiz said:


> I relaxed the second, third, and fourth values in my DOCP profile. It seems a lot better now.
> I did run a bunch of benchmarks and it didn't crash.
> 
> 
> By the way, I have a Gigabyte Radeon 6950 2 GB graphics card. About 6 months ago I decided to load a program that said it would unlock unused shaders and essentially convert the card to a 6970. Of course this was done by modding the BIOS. They asked me to save the old BIOS and I opted to save it. I never noticed any real gain from the modding, but I had some issues with crashes in the games I played. So now of course I have a generic AMD BIOS. Gigabyte has a utility to flash the BIOS with any updates, but of course now it does not recognize my card. I used that modding utility to restore it back to a 6950, and it used a generic 6950 BIOS instead of restoring the original BIOS I thought I had saved. Do you think there is any way Gigabyte would help me restore my Gigabyte BIOS?


Personally, no, and the reason is I had mobo issues with them, such as a bad BIOS chip, and their customer support along with their tech support sucked big time. That was a while back. 

What I would do is create a new thread on this Site and ask for assistance. There's many here that are very experienced in this sort of thing.


----------



## os2wiz (Oct 28, 2012)

erocker said:


> Did that with an 8150. Single IPC still isn't good.



Well, it appears the Korean overclocker got all 8 cores up to over 8 GHz. That is unheard of. He had over 1.8 volts on the CPU core. BD could never do that. This chip is much better than people are aware of. It needs to improve architecturally for the next release, and I believe it will. Early sales of this CPU are very good. That is why the price has not really come down near MSRP. Both Amazon and Newegg are sold out as of yesterday. The various areas AMD is branching out to will bring in revenue and stop the financial hemorrhaging. I see a better future for CPU enthusiasts who want real competition, lower prices, better products, and BETTER software as well, software that leverages the true capabilities of all these multicore CPUs, whether Intel or AMD.


----------



## os2wiz (Oct 28, 2012)

EarthDog said:


> There has to be some balance to the force bubba... I really feel, the green team, at the slightest smell of blue, sure does get their hackles up in a hurry and most of the time for little reason. Seems like beaten wife syndrome... flinching all the time for little reason, LOL!
> 
> And BULLSHIT do I wear blue glasses...Im just trying to get to the bottom of this but its tough with the incessant pissing contests some people present, ya know? Im looking for knowledge and explanation, and right now, all I see due to lack of knowledge (on my end), are two different benchmarks using two different options and I have no idea why... Do you? Can you explain it please instead of leaving a cliff hanger for those that dont know? What if you used option a on bench B and option b on bench a?
> 
> Thanks.


 
Well, that beaten wife has reason to flinch; she never knows when that drunken sot of a husband will inflict bodily harm or death upon her. That is not paranoia but healthy reflexive caution.


----------



## DaedalusHelios (Oct 28, 2012)

os2wiz said:


> Well, that beaten wife has reason to flinch; she never knows when that drunken sot of a husband will inflict bodily harm or death upon her. That is not paranoia but healthy reflexive caution.



What is with this domestic abuse talk? You guys beat your wives or something?

I just had a friend have to leave her husband over that, so try to keep in mind it is a horrible act and it may be distasteful to trivialize it in a public forum.

AMD is looking a little better but still behind in many ways. I don't think they will stay in the race long enough to catch up. It's sad, but it looks like it's true.


----------



## os2wiz (Oct 28, 2012)

Super XP said:


> Personally, no, and the reason is I had mobo issues with them, such as a bad BIOS chip, and their customer support along with their tech support sucked big time. That was a while back.
> 
> What I would do is create a new thread on this site and ask for assistance. There are many here that are very experienced in this sort of thing.



Thanks, I'll survive with the generic BIOS from this website that I flashed. By the way, I believe I mentioned that I've had a fair number of blue screens, with about 3 different reasons given and some with no reason given. I read a post somewhere on this site where somebody had an issue with my model of Corsair PSU that caused them to get blips or spikes, and they went and bought a Kingwin. So I ordered one from outletpc.com: a Kingwin LSP 650W PSU with platinum certification. It has great reviews on many websites and was well worth the extra money over a Corsair. It is extremely efficient, uses a lot less power, and is whisper quiet. I believe this will solve many of the bluescreen problems I have experienced. The current PSU is about 2 1/2 years old and may be on a slow march toward end of life. My power should be spike-free, and it should save juice as well. Cost $169.99 with free slow shipping. Man, I am getting very excited; by Wednesday I will have my rejuvenated rig up with the Corsair H100 liquid cooler and the new CPU. The Kingwin PSU will probably arrive the following week, but that is OK with me.
The more news we hear day by day brings greater reassurance that the FX-8350 will be the beginning of AMD's transformation and an end to the hemorrhaging in the desktop CPU market. I believe their CEO, in spite of these brutal layoffs, will position AMD to be able to start rehiring people in another year.


----------



## os2wiz (Oct 28, 2012)

DaedalusHelios said:


> What is with this domestic abuse talk? You guys beat your wives or something?
> 
> I just had a friend have to leave her husband over that, so try to keep in mind it is a horrible act and it may be distasteful to trivialize it in a public forum.
> 
> AMD is looking a little better but still behind in many ways. I don't think they will stay in the race long enough to catch up. It's sad, but it looks like it's true.



Not at all. My wife is a tough cookie with a heart of gold. I was just showing that the beaten-wife syndrome was incorrectly described by our friend. I understood his intent, but he expressed it somewhat incongruously. I am just nitpicking. Just showing the old man's mind is a lot sharper than some people think.


----------



## erocker (Oct 28, 2012)

Please use the "multi-quote" button when you want to quote multiple people.

Thanks!


----------



## os2wiz (Oct 28, 2012)

erocker said:


> Please use the "multi-quote" button when you want to quote multiple people.
> 
> Thanks!



Yes, now I know why you have that there!! You learn something new every day!


----------



## EarthDog (Oct 30, 2012)

os2wiz said:


> Well, it appears the Korean overclocker got all 8 cores up to over 8 GHz. That is unheard of. He had over 1.8 volts on the CPU core. BD could never do that. This chip is much better than people are aware of. It needs to improve architecturally for the next release, and I believe it will. Early sales of this CPU are very good. That is why the price has not really come down near MSRP. Both Amazon and Newegg are sold out as of yesterday. The various areas AMD is branching out to will bring in revenue and stop the financial hemorrhaging. I see a better future for CPU enthusiasts who want real competition, lower prices, better products, and BETTER software as well, software that leverages the true capabilities of all these multicore CPUs, whether Intel or AMD.


BD could do that the same way, FYI. (Speaking of voltages.) 

I think only a few boards can disable these modules, unlike BD where most can. So part of the reason you see scaling with that many cores is that it's possible they don't have a board that can disable them in the first place. If one disables them, I would bet you can see the upper 8GHz range like you did with BD when it had modules disabled. BD, good chips, can bench into the upper 7GHz range with all modules enabled, I believe (of course this is under LN2 or greater).


----------



## Super XP (Oct 30, 2012)

EarthDog said:


> BD could do that the same way, FYI. (Speaking of volages).
> 
> I think only a few boards can disable these modules, unlike BD where most can. So part of the reason you see scaling with that many cores is that it's possible they don't have a board that can disable them in the first place. If one disables them, I would bet you can see the upper 8GHz range like you did with BD when it had modules disabled. BD, good chips, can bench into the upper 7GHz range with all modules enabled, I believe (of course this is under LN2 or greater).


Most if not all Socket AM3+ boards can disable modules. What I am looking for is the ability to disable individual cores within modules. I want to bench 4 cores via 4 modules versus 2 modules w/ 4 cores. The problem is that so far only beta BIOSes have this ability, and they are not 100% optimised or stable. 

I believe the 4 cores via 4 modules would outperform 4 cores via 2 modules especially with Piledriver CPU's.


----------



## blibba (Oct 30, 2012)

Super XP said:


> I believe the 4 cores via 4 modules would outperform 4 cores via 2 modules



This is certainly true. Said solution would provide the floating-point performance of at least an FX8, with the performance elsewhere of at least an FX4.

I see no reason to expect this to be especially true of Piledriver, though.

I would be intrigued as to the power consumption of such an endeavour.


----------



## EarthDog (Oct 30, 2012)

Super XP said:


> Most if not all Socket AM3+ boards can disable modules.


For BD (like I specifically mentioned). 

I believe (someone correct me if I am wrong), not all boards with Vishera/PD in them can do that yet??? BIOS updates are needed on some boards. I read that in one of the 8GHz articles IIRC.


----------



## os2wiz (Oct 30, 2012)

Super XP said:


> Most if not all Socket AM3+ boards can disable modules. What I am looking for is the ability to disable individual cores within modules. I want to bench 4 cores via 4 modules versus 2 modules w/ 4 cores. The problem is that so far only beta BIOSes have this ability, and they are not 100% optimised or stable.
> 
> I believe the 4 cores via 4 modules would outperform 4 cores via 2 modules, especially with Piledriver CPUs.



Interesting theory. Hopefully somebody will find a way to do this and still maintain stability. I can understand why you think it would perform better. Each core would have the full cache and decoder to itself, instead of sharing it with its twin.


----------



## EarthDog (Oct 30, 2012)

Not a theory... a practice. This was done with BD.


----------



## cadaveca (Oct 30, 2012)

EarthDog said:


> Not a theory... a practice. This was done with BD.



And still isn't a very smart line of thought, either, just like with BD. If you want Phenom II, go buy Phenom II.


----------



## os2wiz (Oct 30, 2012)

Super XP said:


> Most if not all Socket AM3+ boards can disable modules. What I am looking for is the ability to disable individual cores within modules. I want to bench 4 cores via 4 modules versus 2 modules w/ 4 cores. The problem is that so far only beta BIOSes have this ability, and they are not 100% optimised or stable.
> 
> I believe the 4 cores via 4 modules would outperform 4 cores via 2 modules, especially with Piledriver CPUs.



Nice theory and undoubtedly true if it is possible to isolate one core in each module. Let me know if progress is made on that front.


----------



## os2wiz (Oct 30, 2012)

EarthDog said:


> For BD (like I specifically mentioned).
> 
> I believe (someone correct me if I am wrong), not all boards with Vishera/PD in them can do that yet??? BIOS updates are needed on some boards. I read that in one of the 8GHz articles IIRC.



Certainly on all 990 FX motherboards one has that ability.


----------



## EarthDog (Oct 30, 2012)

> And still isn't a very smart line of thought, either, just like with BD. If you want Phenom II, go buy Phenom II.


Monetarily speaking you are spot on.



> Certainly on all 990 FX motherboards one has that ability.


Perhaps. I'm not sure what board was used in the article... I just know they could not disable cores on w/e board they were using, for w/e reason.


----------



## cadaveca (Oct 30, 2012)

I'll never understand why people buy new stuff, and then complain that it's not like the old stuff. Being different is what makes it new. Trying to change it into the old stuff...when you already had the old stuff...man...I'll never understand.


And yes, I was speaking from a financial perspective.


----------



## EarthDog (Oct 30, 2012)

Me either... nostalgia perhaps? Trying to get that IPC feeling back again? 

OSwiz - double posting machine, LOL!


----------



## os2wiz (Oct 30, 2012)

EarthDog said:


> Not a theory... a practice. This was done with BD.



How could it have been done, when the BIOS doesn't permit you to activate only one core per module?
Unless you are referring to some modded BIOS that proved to be unstable.


----------



## TRWOV (Oct 30, 2012)

os2wiz said:


> How could it have been done,when the bios doesn't permit you to activate only 1 core per module?
> Unless you are referring to some modded bios that proved to be unstable.



I'm fairly sure that the Asus Crosshair allowed that. I think [H] ran some tests with one core per module activated, and they used that motherboard if I recall correctly.


----------



## os2wiz (Oct 30, 2012)

TRWOV said:


> I'm fairly sure that the Asus Crosshair allowed that. I think [H] ran some tests with one core per module activated, and they used that motherboard if I recall correctly.



No, not correct. I have an Asus Crosshair V and you can turn modules on and off, not individual cores.


----------



## cadaveca (Oct 30, 2012)

os2wiz said:


> No, not correct. I have an Asus Crosshair V and you can turn modules on and off, not individual cores.



There was a special BIOS for doing this. I don't mean to be rude here, but google is your friend on this one, for sure.


I did not review the BD CPUs, I actually assembled a package by buying all the parts separately, but ending up with the same review kit as all those other guys and gals got.


I tried the whole core-in-module disable. It was one BIOS only. Was it slightly faster? Sure. But then you lose out on the whole idea of these chips, which is many cores, not a quad-core.


----------



## os2wiz (Oct 30, 2012)

cadaveca said:


> There was a special BIOS for doing this. I don't mean to be rude here, but google is your friend on this one, for sure.
> 
> 
> I did not review the BD CPUs, I actually assembled a package by buying all the parts separately, but ending up with the same review kit as all those other guys and gals got.
> ...



  Yes, I never thought it would be good as an overall solution, but in some instances it may have delivered better single-thread performance. I knew it had to be a modded BIOS. I had installed all the official and some beta builds and none of them actually provided that capability. I enjoy mild experimentation, but I have NOT the patience or money to be on the bleeding edge all the time. I enjoy your posts. They are intelligent, witty, sometimes humorous, and sometimes carry an edge of sublime sarcasm. Your technical knowledge is well beyond mine, as is that of many here, who probably work in the industry in some fashion. I feel I learn, sometimes by listening to you all, and sometimes by arguing what I think is right, even if it is not wholly correct. I have never been afraid to express myself. I would hate to live in a world where people are not free to express themselves, such as in the despotic workplace, where those who toil are treated like drones with nothing to contribute. Despotism MUST be crushed when it oppresses the multitudes as it is doing now globally. Enough said on this score.


----------



## cadaveca (Oct 30, 2012)

os2wiz said:


> Yes, I never thought it would be good as an overall solution, but in some instances it may have delivered better single-thread performance. I knew it had to be a modded BIOS. I had installed all the official and some beta builds and none of them actually provided that capability.



Modded by ASUS staff.




> I enjoy mild experimentation, but I have NOT the patience or money to be on the bleeding edge all the time.



That's what got me into doing this in the first place. I have an extra amount of time right now, planned extra time, so doing hardware reviews and playing with hardware in general was just what I had set out to do.

Actually getting into it though, was hard, and I just happened to be at the right place at the right time...and W1zz gave me a chance.



> I enjoy your posts. They are intelligent, witty, and sometimes humorous, and sometimes with a edge of sublime sarcasm.  Your  technical knowledge is well beyond mine as are many here , who probably work in the industry in some fashion.




Thanks. Really, I don't know much more than anyone else. I do, however, play with a lot of hardware, and always have. Overclocking memory and playing with timings has always been the thing I enjoyed most out of this hobby, and then going sub-zero for cooling opened a whole new realm of possibilities...expensive ones.  That didn't last long.

I do not work for anyone in the industry, period. I do home exterior refinishing, vinyl siding actually, but live in Canada, where vinyl can only be installed about 8 months out of the year. I work for myself, get paid according to the work I do, and then take the winter off. Needing/wanting extra money in the winter led to me doing many other things, but siding is what I do. I'm gonna go back to school soon though, so I can do HVAC in the winter. I enjoy working on people's homes too much, making people's visions reality...

Anyway, that's me in a nutshell. I have 4 kids, my wife has a pretty successful executive/engineering career, and childcare is expensive, so the past 2 years, and one year more, I've been home full time caring for my kids...it's been like one long long winter.

I can't sit around and do nothing, so doing reviews works well for me. And I get to share my experiences, I get some info from the companies that make this stuff, and then get to relate that back to you guys. I put my own spin on motherboard reviews, showing what's really what on the board's surface, but really, anyone with a small bit of knowledge on what numbers to google can do just the same as I do. It's just research, really, and isn't something I personally see as any real skill. It takes time, and I have a lot of that.


I'm just happy I can do something useful with my time.  No thanks needed. 


The only advantage I have, really, is all this time. Unfortunately, that time has now itself come up short...I'm just simply getting too many product samples, and end up doing nothing but reviews.

But the more samples I get...the more I learn. Really, I'm learning just as much as you are


----------



## os2wiz (Oct 30, 2012)

*Struggle with Struggle Against*



cadaveca said:


> Modded by ASUS staff.
> 
> "That's what got me into doing this in the first place. I have an extra amount of time right now, planned extra time, so doing hardware reviews and playing with hardware in general was just what I had set out to do.
> 
> ...



    Time is not an issue for me; I am retired as of May. So I enjoy some finagling, but I do have my limits. When it gets a little hairy for me, my stress levels go up and I have to pull back. Bad heart. I like to stretch my mind. I love the learning-teaching aspect of life and this is just a small piece of the grander scheme of things.

  I worked 32 years as a diagnostic medical sonographer (ultrasound tech). I enjoyed the work, my co-workers, and the patients. I come from the working class, though I am formally educated, and have no personal ambition to be rich, just to be comfortable in an unassuming way. I use every opportunity to engage people in discussion, whether it is science, computers, politics, or organizing to change the world. I love the ride of life and have an indomitable spirit.


----------



## jihadjoe (Oct 31, 2012)

The new FX processors were on TR's podcast, and an interesting point raised was that while the TDP difference suggests a 48W gap between Intel and AMD, in actual practice the measured difference at the wall was about 100W.

That's HUGE, especially if you're running your computer 24/7 for one reason or another (servers, folding nuts, etc). 100W 24/7 at 15c per kWh is $11 a month in electricity. That's $130+ a year or roughly the difference in platform cost compared to an Ivy Bridge i7; if you keep your rig for 2 years an Ivy i7 actually ends up being cheaper to own+operate.

Where I live, the applied electricity rate is actually close to 30c per kWh, so that's a whopping $260 difference in electricity alone for just one year. It actually makes running an FX processor a rather silly option for me.
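The arithmetic above is easy to sanity-check. A minimal sketch (the 100W delta and the $0.15/$0.30 per-kWh rates come from the post; the function name is just illustrative):

```python
# Cost of a constant extra power draw, run continuously.
# Energy (kWh) = watts / 1000 * hours; cost = kWh * rate.

def extra_cost(delta_watts, rate_per_kwh, hours):
    """Dollar cost of drawing delta_watts extra for `hours` hours."""
    kwh = delta_watts / 1000 * hours
    return kwh * rate_per_kwh

# 100 W gap, 24/7, at $0.15 per kWh:
monthly = extra_cost(100, 0.15, 24 * 30)   # ~$10.80 a month
yearly = extra_cost(100, 0.15, 24 * 365)   # ~$131 a year
# At $0.30 per kWh the yearly figure roughly doubles, to ~$263.
```

So the post's "$11 a month / $130+ a year at 15c" and "~$260 at 30c" figures check out, assuming the rig really does run flat-out 24/7.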


----------



## Jhelms (Oct 31, 2012)

Yeah, however the vast majority of us do not run our PCs 24/7. 

Also here is a good average power consumption benchmark from guys known to be Intel biased  
http://www.tomshardware.com/reviews/fx-8350-vishera-review,3328-16.html

8350 to i7 3770K was around 53W


----------



## cdawall (Oct 31, 2012)

jihadjoe said:


> The new FX processors were on TR's podcast, and an interesting point raised was that while the TDP differences suggests a 48W gap between Intel and AMD, in actual practice the measured difference at the wall was about 100W.
> 
> That's HUGE, especially if you're running your computer 24/7 for one reason or another (servers, folding nuts, etc). 100W 24/7 at 15c per kWh is $11 a month in electricity. That's $130+ a year or roughly the difference in platform cost compared to an Ivy Bridge i7; if you keep your rig for 2 years an Ivy i7 actually ends up being cheaper to own+operate.
> 
> Where I live, the applied electricity rate is actually close to 30c per kWh, so that's a whopping $260 difference in electricity alone for just one year. It actually makes running an FX processor a rather silly option for me.



Depends on the server farm as to whether they even care. Thing is, as it sits, AMD offers more threads per board. That does matter, not to mention that in those specific scenarios AMD typically performs better.


----------



## os2wiz (Oct 31, 2012)

jihadjoe said:


> The new FX processors were on TR's podcast, and an interesting point raised was that while the TDP differences suggests a 48W gap between Intel and AMD, in actual practice the measured difference at the wall was about 100W.
> 
> That's HUGE, especially if you're running your computer 24/7 for one reason or another (servers, folding nuts, etc). 100W 24/7 at 15c per kWh is $11 a month in electricity. That's $130+ a year or roughly the difference in platform cost compared to an Ivy Bridge i7; if you keep your rig for 2 years an Ivy i7 actually ends up being cheaper to own+operate.
> 
> Where I live, the applied electricity rate is actually close to 30c per kWh, so that's a whopping $260 difference in electricity alone for just one year. It actually makes running an FX processor a rather silly option for me.



Actually, if you are not running a 24/7 business and are essentially using your computer only for defined tasks, the best way to compare usage would be energy per computation. That would be the most meaningful measure of energy efficiency.
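The energy-per-task idea above can be sketched in a few lines. Note the wattage and runtime numbers below are made up purely for illustration, not measurements of any real CPU:

```python
# Energy per completed task: joules = average watts x seconds.
# A chip that draws more power but finishes sooner can still use
# less energy per task. All figures here are hypothetical.

def joules_per_task(avg_watts, seconds):
    """Energy consumed to complete one task, in joules."""
    return avg_watts * seconds

cpu_a = joules_per_task(195, 120)  # faster but hungrier: 23400 J
cpu_b = joules_per_task(145, 180)  # slower but leaner:   26100 J
# Here the hungrier chip actually wins on energy per task.
```

This is why idle-to-idle "race to finish" comparisons can flip the conclusion you'd draw from wall-power readings alone.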


----------



## Super XP (Oct 31, 2012)

cadaveca said:


> I'll never understand why people buy new stuff, and then complain that it's not like the old stuff. Being different is what makes it new. Trying to change it into the old stuff...when you already had the old stuff...man...I'll never understand.
> 
> And yes, I was speaking from a financial perspective.


I think you misunderstood my point, it was and still is pure curiosity that I would like to perform this comparison. AMD speaks highly of this sharing within its new unique design. 

Personally if disabling 1 core per module proves to be a better performer, I would still keep it running at all 8-cores 

Now last I heard its all up to the board and bios. Currently not possible on the Crosshair V Formula unless you use a hacked bios.


----------



## os2wiz (Oct 31, 2012)

Super XP said:


> I think you misunderstood my point, it was and still is pure curiosity that I would like to perform this comparison. AMD speaks highly of this sharing within its new unique design.
> 
> Personally if disabling 1 core per module proves to be a better performer, I would still keep it running at all 8-cores
> 
> Now last I heard its all up to the board and bios. Currently not possible on the Crosshair V Formula unless you use a hacked bios.



Correctamon, Gaston!


----------



## xenocide (Oct 31, 2012)

Super XP said:


> I think you misunderstood my point, it was and still is pure curiosity that I would like to perform this comparison. AMD speaks highly of this sharing within its new unique design.



It was tested on Bulldozer and it did result in slightly higher performance--because there was no bottleneck at the scheduler.


----------



## os2wiz (Nov 2, 2012)

cadaveca said:


> [page=Introduction]
> 
> 
> I just got my CPU today, an FX-8350. The batch number is 1236 PGN. Does anyone know if this is one of the better batches that overclock better?
> ...


----------



## eidairaman1 (Nov 3, 2012)

Considering this only launched recently, we don't know. Why not compare it to the overclocker who OC'd to 8.1GHz on all 8 cores?



os2wiz said:


> cadaveca said:
> 
> 
> > [page=Introduction]
> ...


----------



## os2wiz (Nov 3, 2012)

I hope that was a joke. I have a Corsair H100 lying here ready for installation with the CPU, not an LN2 setup. Installation is Monday, when the power supply unit is supposed to arrive. I think 4.8GHz is my likely target.


----------



## lordjohn (Nov 3, 2012)

I ordered 8120 with 990 combo in july, now 8350 with 970 combo cost same as what I paid in july. I should have waited


----------



## Super XP (Nov 3, 2012)

lordjohn said:


> I ordered 8120 with 990 combo in july, now 8350 with 970 combo cost same as what I paid in july. I should have waited


Why? Your 990 is superior to the 970. If you really want the 8320 or 8350, just sell your 8120, add a little more cash and you got yourself a Piledriver. That is what I am planning when the time is right.



os2wiz said:


> I hope that was a joke. I have a Corsair H100 lying here ready for installation with the cpu, not a LN2 setup.. I am having installation Monday when the power supply unit is supposed to arrive. I think 4.8 GHZ is my likely target.


What, with an H100? I already hit 4.40GHz running all 8 cores with a minor bump in volts, 1.375v. I can push her more, but what for? For me to go any higher running 8 cores, I would have to up the volts a lot, something I am not willing to do. And I too have the H100. 

I can easily see 5.40GHz with either of the 8320/50 CPUs with a mild bump in volts. If I can get 4.40GHz with the 8120 at 1.375v, I would imagine I can get 4.80GHz to 5.00GHz also at 1.375v. But you never know.


----------



## os2wiz (Nov 3, 2012)

Super XP said:


> Why? Your 990 is superior to the 970. If you really want the 8320 or 8350, just sell your 8120, add a little more cash and you got yourself a Piledriver. That is what I am planning when the time is right.
> 
> 
> What with a H100? I already hit 4.40GHz running all 8-Cores with a minor bump in volts 1.375v. I can push her more, but what for. For me to go any higher running 8-Cores, I would have to up the volts a lot, something I am not willing to do. And I too have the H100.
> ...



Right now with the FX-8150 I have an H60 cooler, which is much smaller than the H100. I found on Bulldozer that 4.4GHz with my setup was the highest stable overclock I could achieve. I guess if the H100 was in there right now, I could get 4.6GHz. I also believe I can get 4.8 without difficulty. I'll give 5.0 a shot, but I really don't want to tax the voltage much either. Records are fine, but I wouldn't want to shorten CPU life pushing higher voltages over a prolonged period. How high is your HT throughput? Right now mine is 2200.
I would have to slow my memory more to get anywhere near 2600. That is the problem with the AMD design: memory bandwidth comes at the expense of system throughput and vice versa. I think the next-generation CPU will come with a new system replacing HyperTransport, a totally new motherboard design for the full-fledged Steamroller. The refresh to Piledriver may be on AM3+, but in 2014 the full-fledged Steamroller will usher in many changes. I must say Vishera is selling very well right now. If it can maintain high sales, AMD will stabilize and the accelerated improvements will be coming to all of us.


----------



## Super XP (Nov 3, 2012)

My FX-8120 setup.

HTT @ 2600 MHz
NB @ 2400 MHz
CPU @ 1.375v
DDR3 @ 1866 MHz (16GB)
CPU @ 4400 MHz w/ 8-Cores

The IMC has been redone and enhanced with the FX-8320/50's versus Bulldozer.


----------



## os2wiz (Nov 3, 2012)

Super XP said:


> My FX-8120 setup.
> 
> HTT @ 2600 MHz
> NB @ 2200 MHz
> ...



Nice setup. I didn't do my homework properly about CrossFire. I didn't realize that an extra card is an extra 350 watts. I have no intention of using an extra 350 watts while I game a couple of hours a day. So now I will have to sell those 2 HD 6970 cards as well as my original HD 6950 card, so I just ordered a new HD 7950 card. With the new drivers I will be getting about 30 to 40% better performance without the expense of 2 cards and the ongoing electricity costs. I got it from Newegg for $299, free shipping, and no taxes. Plus 4 free games. I will sell off the 3 cards on Tuesday after my new card arrives. I'll take a small hit over what I paid on eBay just to move them quickly.



lordjohn said:


> I ordered 8120 with 990 combo in july, now 8350 with 970 combo cost same as what I paid in july. I should have waited



The 990 board should be a better performer than a 970 board. Why don't you put the 8350 on the 990 board?



Super XP said:


> My FX-8120 setup.
> 
> HTT @ 2600 MHz
> NB @ 2200 MHz
> ...


 
I see you have HT Link speed at 2600. Perhaps my error was thinking the north bridge frequency had to match the HT Link speed. Apparently that was a misconception on my part. I will adjust that tomorrow. Thanks for the lowdown.


----------



## cdawall (Nov 3, 2012)

Hmmm...Maybe people jumped too soon to say the FX line sucks at games


----------



## os2wiz (Nov 4, 2012)

cdawall said:


> http://img.techpowerup.org/121103/Capture001.jpg
> 
> Hmmm...Maybe people jumped too soon to say the FX line sucks at games




Yes, BF3 apparently has some multithreading, unlike most of the crappy gaming code that most people blindly put their dollars down for.


----------



## DaedalusHelios (Nov 4, 2012)

cdawall,

You meant "this game" not "games" right?

"Games" implies more than one. The "s" in "games" makes it plural.


----------



## os2wiz (Nov 4, 2012)

DaedalusHelios said:


> cdawall,
> 
> You meant "this game" not "games" right?
> 
> "Games" implies more than one. The "s" in "games" makes it plural.



The truth is that this CPU will do very well on probably a dozen or so well-coded games out of the hundreds of well-known games out there. I have a turn-based strategy game called Russia Under Siege, about the Russian Civil War of 1919-1922. It most likely will do well under Piledriver because it is essentially a grand database that would have to have been multithreaded. Action and shoot-em-up games can be multithreaded, like BF3 is, and therefore give better overall performance for the game itself, or they can be single-threaded and work well only with the Intel CPU. Apparently Intel has won the "hearts and minds" of most game designers, or more accurately their wallets. I am sure they give seed money to many of these gaming firms, which influences their design decisions. That is what monopolists like John D. Rockefeller or Bill Gates would do. If I were totally obsessed about this issue I would spend the time to investigate; I am not an investigative reporter and have other interests in life, so I don't. When Steamroller releases, the single-threaded performance gap should close a great deal, and AMD's superior multithread performance edge will increase substantially as well. Sixteen months of surviving without capitulation by AMD will lead to a scare in Intel's leadership. Intel's pricing structure will be smashed and the future will be up for grabs.
  By the way, the fact that AMD can deliver over 90 fps in BF3 shows that it can do so in any game that is designed to be multithreaded. It is the single thread, and the single thread alone, a moribund and antiquated programming concept, that prevents high frame rates by seizing control of the CPU. Intel solves this by throwing lots more dollars at decoders and cache size. Yes, it works, but it is economically an unnecessary waste of resources, if only design and code discipline could seize control of the minds that run the software industry.


----------



## Frick (Nov 4, 2012)

cdawall said:


> http://img.techpowerup.org/121103/Capture001.jpg
> 
> Hmmm...Maybe people jumped too soon to say the FX line sucks at games



Doesn't that mean it's GPU bound?


----------



## Super XP (Nov 4, 2012)

Frick said:


> Doesn't that mean it's GPU bound?


Not quite. I believe the software is the real culprit. 
There were a few games that performed better on Intel, but with most I see no big difference.


----------



## cdawall (Nov 4, 2012)

DaedalusHelios said:


> cdawall,
> 
> You meant "this game" not "games" right?
> 
> "Games" implies more than one. The "s" in "games" makes it plural.



As has been said, it will depend on the game; that was just one of the many multithreaded games out there. Crysis is another, and Crysis 3 looks like it will be as well. Hopefully this is the way all games will be going. I would rather be prepped for the future, since it appears developers are finally getting their collective heads out of their collective butts and coding things correctly.



Frick said:


> Doesn't that mean it's GPU bound?



More than likely yes, but on that note, even when GPU bound, AMD has had lackluster performance due to lacking single-core IPC, making Intel chips look immensely better performing. This is yet more proof they are not...


----------



## Jhelms (Nov 6, 2012)

Very nice... In Batman Arkham City, I run physx maxed and let the CPU handle the physx part. My 8150 and 1100T both would not go over 31fps, generally 30fps. Just benchmarked my overclocked 8350 and hit 36fps. Nice increase! Benchmark is rough, in game it is visually glass smooth.


----------



## blibba (Nov 6, 2012)

os2wiz said:


> I hope that was a joke. I have a Corsair H100 lying here ready for installation with the cpu, not a LN2 setup.. I am having installation Monday when the power supply unit is supposed to arrive. I think 4.8 GHZ is my likely target.




For sub-zero overclocking, you want high-leakage. For conventional cooling, you want low-leakage.


----------



## cadaveca (Nov 6, 2012)

blibba said:


> For sub-zero overclocking, you want high-leakage. For conventional cooling, you want low-leakage.



True, but can you tell me why?


----------



## blibba (Nov 6, 2012)

cadaveca said:


> True, but can you tell me why?



No! I just wanted to point out that batches used in record-breaking OCs might not be the best to go for.

If I were to make an intelligent guess, I'd say that high-leakage parts, though generating more heat, get it out of the CPU more effectively. On LN2 you don't really care how much heat you generate, so this is preferable. On water or air, thermal limits are normally the absolute limits. How close did I get?


----------



## cadaveca (Nov 6, 2012)

blibba said:


> No! I just wanted to point out that batches used in record-breaking OCs might not be the best to go for.
> 
> If I was to make an intelligent guess, I'd say that high-leakage parts, though generating more heat, get it out of the CPU more effectively. On LN2 you don't really care how much heat you generate, so this is preferable. On water or air, thermal limits are normally the absolute limits. How close did I get?



a chip has to leak...


it takes in all that wattage it consumes and, really, converts it to heat. Not much else.

silicon is a semiconductor, and its leakage changes according to temperature. Cool a good "air" chip too much, and it won't be able to leak the power as heat sufficiently, and then that can cause errors or stalls, or maybe even damage? Not sure on that last bit. It simply prevents you from being able to remove the heat fast enough.


Anyway, so a "very leaky" chip, cooled to the same temps, has greater ability for leakage at the same temps, and ergo, can go a bit further before it hits that critical point where it's just too cold.


This is over and above the ability of the chip to be effectively cooled and handle the extra wattage (by removing the heat via leakage...).

handling that balance is where the real skill in sub-zero is. If you spend some time looking into LN2 cooling, you'll see how important it is that pots hold a specific temperature for a long time...this is a big reason for that need.


----------



## cdawall (Nov 6, 2012)

cadaveca said:


> a chip has to leak...
> 
> 
> it takes in all that wattage it consumes, and really, converts it to heat. Not really much else, really.
> ...



See, the way I always looked at it was you have to have high leakage to lose some of the wattage. Low-leakage chips are the ones that normally fry when they are pushed to 1.8-2v under LN2. I always assumed it was the voltage drop across the chip that kept them alive at high voltage.


----------



## cadaveca (Nov 6, 2012)

cdawall said:


> See the way I always looked at it was you have to have high leakage to loose some of the wattage. Low leak chips are the ones that normally fry when they are pushed to 1.8-2v under LN2. I always assumed it was the voltage drop across the chip that kept them alive at high voltage.



I'm not exactly sure what you are referring to, simply most likely because we are using different terminology.



voltage drop across a chip is kinda important too, if we are talking about the same thing, but as far as I know, it's all about leakage affecting the ability to remove heat when going really low. A chip that leaks more, you can pull more heat out of.


Do keep in mind, it's not like I went to school for this stuff, or I've had any of the "pro" clockers help...me raging over everyone's lack of help was what got me banned from XS for a year. Later I understood that the stuff I really wanted to know, was covered by NDA for those that really did know. 

I mean, really, I'm just making shit up here. I dunno wtf is the truth, honestly. I still don't get any more answer than I did before doing reviews, really


----------



## cdawall (Nov 6, 2012)

cadaveca said:


> I'm not exactly sure what you are referring to, simply most likely because we are using different terminology.



Could be I tend to look at things a little differently thanks to work and school.



cadaveca said:


> voltage drop across a chip is kinda important too, if we are talking about the same thing, but as far as I know, it's all about leakage affecting the ability to remove heat when going really low. A chip that leaks more, you can pull more heat out of.



AFAIK a chip that leaks more would have more voltage drop, so we are talking about the same thing, I think.



cadaveca said:


> Do keep in mind, it's not like I went to school for this stuff, or I've had any of the "pro" clockers help...me raging over everyone's lack of help was what got me banned from XS for a year. Later I understood that the stuff I really wanted to know, was covered by NDA for those that really did know.



I have had some help, thanks to blatant disregard for NDAs, which I helped with in all honesty. Let's just say "cdawall" is on file with AMD@Dallas.



cadaveca said:


> I mean, really, I'm just making shit up here. I dunno wtf is the truth, honestly.



Sounds like you are correct for the most part. The end of it all is there was a reason the Phenom II TWKR chips were high-leakage silicon.


----------



## cadaveca (Nov 6, 2012)

cdawall said:


> Sounds like you are correct for the most part. The end of it all is there was a reason the Phenom II TWKR chips were high-leakage silicon.



I know that...long before TWKR chips were made, I was asking AMD reps directly for high-leakage chips for OC'ers..the chips they'd normally bin as useless, which was EXACTLY what AMD did with the TWKR chips. Which is why I say that the "TWKR" chip was my idea...

I "got" all this stuff long ago...as far as I understand, it's just something about silicon in general, and not really anything new. Where those critical points are has changed over time, but the general nature of what this behavior is, has not.



And really..just me making shit up...if I'm close, I guess that's because it's logical, and Occam's Razor wins over all.  Pure luck. 

I mean really, bin a bunch of chips under LN2, no matter the chip, and this trend emerges, which is where I got the idea from. For all I know, it could be some board thing, memory...some PLL...I don't have a clue, really.


----------



## cdawall (Nov 6, 2012)

The old 945ES chips were the same way as the TWKR models. Every single one seemed binned more or less the same: stable on water at 4-4.2GHz, but requiring 1.65V on average.


----------



## swaaye (Nov 6, 2012)

cdawall said:


> http://img.techpowerup.org/121103/Capture001.jpg
> 
> Hmmm...Maybe people jumped too soon to say the FX line sucks at games



CPU performance doesn't seem to affect the game much. It's probably very GPU bound at high detail + 1080p even with a 7950 3GB.
http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/7


----------



## cdawall (Nov 6, 2012)

swaaye said:


> CPU performance doesn't seem to affect the game much. It's probably very GPU bound at high detail + 1080p even with a 7950 3GB.
> http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/7
> http://techreport.com/r.x/amd-fx-8350/bf3-fps.gif
> http://techreport.com/r.x/amd-fx-8350/bf3-99th.gif





cdawall said:


> More than likely yes, but on that note even being GPU bound AMD has had lackluster performance due to a lacking single core IPC making Intel chips look immensely better performing. This is yet more proof they are not...



Already said that


----------



## xenocide (Nov 6, 2012)

cdawall said:


> Already said that



It's worth noting that BF3 is also probably the most heavily threaded game on the market.  If I recall, it's optimized for up to 10 or 12 threads--leaning towards 12, since that would make it ideal for really any CPU on the market (6-core Intels with HT).  I've said it before and I'll say it again: if you can max the threads on a BD/PD CPU, they are good options for the price.  The problem is, unless you're working in rendering or play specific games that are heavily threaded (a lot of strategy, simulation, and a handful of FPSs), you're not hitting the best-case scenario that makes the product worthwhile.

For most run-of-the-mill games, Intel CPUs handily outperform AMD options simply because those games only use 1-4 threads, which, combined with Intel's superior IPC, makes the AMD parts look bad.
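As a rough illustration of what "optimized for up to 12 threads" means in practice, here's a toy job-system sketch (hypothetical Python, nothing from DICE's actual engine; the 12-thread cap and all names here are assumptions based on the recollection above):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Assumed cap, taken from the "up to 10 or 12 threads" recollection above.
MAX_ENGINE_THREADS = 12

def pick_worker_count() -> int:
    """Use every hardware thread available, up to the engine's cap."""
    return min(os.cpu_count() or 1, MAX_ENGINE_THREADS)

def update_entity(eid: int) -> int:
    # Stand-in for per-entity work (AI, animation, physics, etc.).
    return eid * 2

def run_frame(entity_ids: range) -> list[int]:
    """Fan per-entity jobs out across the worker pool each frame."""
    with ThreadPoolExecutor(max_workers=pick_worker_count()) as pool:
        return list(pool.map(update_entity, entity_ids))
```

The point is only the sizing logic: an FX-8350 fills all eight integer cores, a 6-core Intel chip with HT fills twelve threads, and a dual-core chip caps out at two, which is why thread-heavy titles separate these CPUs.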


----------



## os2wiz (Nov 6, 2012)

cadaveca said:


> I know that...long before TWKR chips were made, I was asking AMD reps directly for high-leakage chips for OC'ers..the chips they'd normally bin as useless, which was EXACTLY what AMD did with the TWKR chips. Which is why I say that the "TWKR" chip was my idea...
> 
> I "got" all this stuff long ago...as far as I understand, it's just something about silicon in general, and not really anything new. Where those critical points are has changed over time, but the general nature of what this behavior is, has not.
> 
> ...



 So what happens when they start fabricating chips from carbon instead of silicon? I think IBM is close to a major breakthrough on non-silicon technology.


----------



## cadaveca (Nov 6, 2012)

os2wiz said:


> So what happens when they start fabricating chips from carbon instead of silicon? I think IBM is close to a major breakthrough on non-silicon technology.



I dunno, man. That's way over my head, honestly. I suppose the same might apply, but I dunno.


----------



## swaaye (Nov 6, 2012)

xenocide said:


> It's worth noting that BF3 is also probably the most heavily threaded game on the market.  If I recall it's optimized for up to 10 or 12 threads--leaning towards 12 since that would make it ideal for really any CPU on the market (6 core Intel's with HTing).


It does not appear to matter much. It is either GPU bound or the engine is behaving inefficiently.  It does appear that dual-core chips are at a bit of a disadvantage, though, judging from the Pentium result.


----------



## Super XP (Nov 7, 2012)

So in other words, BF3 is a well designed game taking advantage of our new multi-core hardware. BF3 makes other games look like bad code.


----------



## os2wiz (Nov 7, 2012)

Super XP said:


> So in other words, BF3 is a well designed game taking advantage of our new multi-core hardware. BF3 makes other games look like bad code.



You are so right about this. I just installed my AMD HD 7950 this afternoon. It's testing a lot faster than my previous 3DMark scores, and frame rates are significantly higher also. It's a good deal now: AMD has a promotion where you get 3 quality games free for download, Crysis 3 and 2 or 3 others. I already sold my original HD 6950 card and am about to unload the 2 HD 6970s I had bought before I reconsidered the CrossFire solution. Tomorrow my platinum-certified PSU will arrive. I'm hoping that resolves the few blue screens I get whose source I can't determine.


----------



## xenocide (Nov 7, 2012)

Super XP said:


> So in other words, BF3 is a well designed game taking advantage of our new multi-core hardware. BF3 makes other games look like bad code.



Yes!  That is why BF3 deserved more appreciation than it got.  The PC version was coded by a separate team and optimized to take advantage of high-performance multi-core processors not present on consoles, as well as to use DX10/11 exclusively.


----------



## blibba (Nov 7, 2012)

xenocide said:


> high performance multi-core processors not present on Consoles



We're not counting the 3.2GHz 8-core chip in the PS3 as high-performance multi-core?


----------



## xenocide (Nov 7, 2012)

blibba said:


> We're not counting the 3.2GHz 8-core chip in the PS3 as high-performance multi-core?



I don't believe the Cell was clocked at 3.2GHz; I know the tri-core in the 360 was.

EDIT:  Apparently it was 3.2GHz, but at least one of those cores was disabled at all times, and the architecture and development tools made it nearly impossible to optimize code properly--not to mention the terrible memory allocation.


----------



## blibba (Nov 7, 2012)

xenocide said:


> The Cell I don't believe was clocked at 3.2GHz, I know the Tri-Core from the 360 was.
> 
> EDIT:  Apparently it was 3.2GHz, but at least one of those cores was disabled at all times, and the architecture and development tools made it nearly impossible to optimize code properly--not to mention the terrible memory allocation.



Still, getting the most out of a PS3 requires at least as much SMP expertise as getting the most out of a quad-core in a PC.


----------



## badtaylorx (Nov 7, 2012)

just got one from Newegg for my lil' cuz's Sabertooth.....

THEY SENT ME A USED CHIP!!!!! and man am i pissed

didnt even bother to clean off all of the TIM........currently reading them the riot act


----------



## eidairaman1 (Nov 7, 2012)

badtaylorx said:


> just got one from Newegg for my lil' cuz's Sabertooth.....
> 
> THEY SENT ME A USED CHIP!!!!! and man am i pissed
> 
> didnt even bother to clean off all of the TIM........currently reading them the riot act



just send it back or state that the price should be slashed for a used good


----------



## ozborne (Nov 8, 2012)

humansmoke said:


> isn't the natural comparison the 3570k ?...i.e. The intel cpu closest in price..the cpu that also appeared in the review comparison.
> 
> For more impact i'd go with:
> _"the intel 3960x costs nearly five times as much, but doesn't offer five times the performance"_
> still trudging through the other reviews. Seems like some reviewers didn't get a lot of time to do the reviews.



we all see the reviews but here is a fact ( lga 1155 = 1155 pin connectors the latest the lga 2012 = 2012 pin connectors that is there top range prossesor sockett, so we will jump strait into the facts lga 1156 then 1155 now 2012 here is what amd will have 10 core opteron already 12 core with 12 dim rows per cpu so what is the cpu and the next step in amd's bag, 1 is that pile driver is on a sockett with under 1000 pins so wher am i going with this (the fact is that the amd am3+ sockett is so under crunched that it can no longer compare speed tests in the new intel socketts as there is not nearly enough pins to transfer the data from main board to cpu etc all intel cpus have more than 150 more pins that amd's am3+ so its going faster fact, but g 34 g35 magny is hear and is in servers all over the world, so we will be crunching it like this , magny sockett g34 = almost 3000 pin connects the cpu is massive that means on a new 22nm they can smack in over 30 cores on one cpu.  Dont get me wrong i like both products but all the sockett changes that intel have made have upset lots of my clients, and end of life seems to soon, where amd no problems , the problem i have is that when amd starts pumping out g34 with a 10 core therbane mixed bulldozer and merged into a piledriver that has 2980 pin connectors then we can acutual take mesure , but for now yes amd is keeping every thing very quiet like intel did when athlon x2 was the king and amd had a very long party and intel pumped out core 2 duo .  In my personal feeling about this ,is that amd have the fastest cpu just not in the sockett it needs to be inn.

Thankyou


----------



## ensabrenoir (Nov 8, 2012)

ozborne said:


> we all see the reviews but here is a fact ( lga 1155 = 1155 pin connectors the latest the lga 2012 = 2012 pin connectors that is there top range prossesor sockett, so we will jump strait into the facts lga 1156 then 1155 now 2012 here is what amd will have 10 core opteron already 12 core with 12 dim rows per cpu so what is the cpu and the next step in amd's bag, 1 is that pile driver is on a sockett with under 1000 pins so wher am i going with this (the fact is that the amd am3+ sockett is so under crunched that it can no longer compare speed tests in the new intel socketts as there is not nearly enough pins to transfer the data from main board to cpu etc all intel cpus have more than 150 more pins that amd's am3+ so its going faster fact, but g 34 g35 magny is hear and is in servers all over the world, so we will be crunching it like this , magny sockett g34 = almost 3000 pin connects the cpu is massive that means on a new 22nm they can smack in over 30 cores on one cpu.  Dont get me wrong         snip      In my personal feeling about this ,is that amd have the fastest cpu just not in the sockett it needs to be inn.
> 
> Thankyou



So amd is like a dragster trying to drift........... Yeah I get it.    Im impressed they put out a nice product and all but seriously ... Even u can hear it...tick tock...tick.......its coming so enjoy
Youve got about ten minutes left


----------



## cdawall (Nov 9, 2012)

ozborne said:


> we all see the reviews but here is a fact ( lga 1155 = 1155 pin connectors the latest the lga 2012 = 2012 pin connectors that is there top range prossesor sockett, so we will jump strait into the facts lga 1156 then 1155 now 2012 here is what amd will have 10 core opteron already 12 core with 12 dim rows per cpu so what is the cpu and the next step in amd's bag, 1 is that pile driver is on a sockett with under 1000 pins so wher am i going with this (the fact is that the amd am3+ sockett is so under crunched that it can no longer compare speed tests in the new intel socketts as there is not nearly enough pins to transfer the data from main board to cpu etc all intel cpus have more than 150 more pins that amd's am3+ so its going faster fact, but g 34 g35 magny is hear and is in servers all over the world, so we will be crunching it like this , magny sockett g34 = almost 3000 pin connects the cpu is massive that means on a new 22nm they can smack in over 30 cores on one cpu.  Dont get me wrong i like both products but all the sockett changes that intel have made have upset lots of my clients, and end of life seems to soon, where amd no problems , the problem i have is that when amd starts pumping out g34 with a 10 core therbane mixed bulldozer and merged into a piledriver that has 2980 pin connectors then we can acutual take mesure , but for now yes amd is keeping every thing very quiet like intel did when athlon x2 was the king and amd had a very long party and intel pumped out core 2 duo .  In my personal feeling about this ,is that amd have the fastest cpu just not in the sockett it needs to be inn.
> 
> Thankyou



First of all, please proofread your post. LGA 2011 is Intel's top-end socket. The reason for the extra pins is the onboard PCI-E controller; it has next to nothing to do with actual core logic or speed. G34 only has 1944 pads on the CPU and 1974 on the socket itself, so yet again, no, you are incorrect: it does not have 3K+ pins. It has fewer than Intel's top socket, and all of that with dual CPU dies, a multi-CPU interconnect, and quad-channel memory.

LGA 1156 was replaced by LGA 1155, which is being replaced by LGA 1150; obviously more pins is not better.



ensabrenoir said:


> So amd is like a dragster trying to drift........... Yeah I get it.    Im impressed they put out a nice product and all but seriously ... Even u can hear it...tick tock...tick.......its coming so enjoy
> Youve got about ten minutes left



What?


----------



## anubis44 (Nov 19, 2012)

btarunr said:


> All ASUS AMD 9-series chipset motherboards have UEFI. Quite a few MSI, Biostar, and ASRock motherboards (entry-thru-performance) have it as well. It's just Gigabyte's 9-series boards that stick to ye olde AwardBIOS. They do feature "HybridEFI" if you want to boot from large volumes, though.



You mean, it's just Gigabyte boards that stick to the simple, easy-to-use, no-learning-required, tried-and-true, reliable BIOS. Yeah, they should really be faulted for that. After all, when you want to change your BIOS settings, there's such a massive improvement when your mouse is enabled, even though you still have to type the numbers into the field most of the time.

Whatever. Next you'll want to be able to hook up a mouse to your wristwatch to change the time because the buttons are too difficult to use.


----------



## xenocide (Nov 21, 2012)

Yes, because switching from the BIOS system--developed in *1979*--to the much more up-to-date UEFI is only for the sake of using a mouse in the menus.  UEFI makes huge improvements that were long overdue.  Being able to use a mouse is kind of just icing on the cake, and most people probably still use their keyboards anyway.  BIOS sucks, simple fact.  It's old, it's outdated, it was never intended to survive this long in the first place, and there are substantially better alternatives.


----------



## cdawall (Nov 21, 2012)

xenocide said:


> Yes, because switching from the BIOS system--developed in *1979*--to the much more up-to-date UEFI is only for the sake of using a mouse in the menus.  UEFI makes huge improvements that were long overdue.  Being able to use a mouse is kind of just icing on the cake, and most people probably still use their keyboards anyway.  BIOS sucks, simple fact.  It's old, it's outdated, it was never intended to survive this long in the first place, and there are substantially better alternatives.



Screw you all I would much rather a good simple BIOS than these UEFI shenanigans.


----------



## johnspack (Nov 21, 2012)

cdawall said:


> Screw you all I would much rather a good simple BIOS than these UEFI shenanigans.



LOL!!!!!   found my first quote worthy quote!


----------



## Dent1 (Nov 22, 2012)

blibba said:


> We're not counting the 3.2GHz 8-core chip in the PS3 as high-performance multi-core?



I hate to bump this thread, but I had to correct you.

The PS3 had 1 core and 8 threads. It was not 8 core.


----------



## blibba (Nov 22, 2012)

Dent1 said:


> I hate to bump this thread, but I had to correct you.
> 
> The PS3 had 1 core and 8 threads. It was not 8 core.



How you define cores and threads is very arbitrary; x86 cores are unusual in that it's been so clear-cut for so long. A GTX 680, for example, could be considered a 1-, 8-, or 1536-core processor. The PS3 is closer to what you'd normally consider an 8-core than to what you'd normally consider a single-core.


----------



## Super XP (Nov 22, 2012)

blibba said:


> How you define cores and threads is very arbitrary. X86 cores are unusual in that it's been so clear-cut for so long. A GTX 680, for example, could be considered as a 1, 8 or 1536 core processor. The PS3 is closer to what you'd normally consider an 8-core than it is to what you'd normally consider a single-core.


Check it out, 


> The PlayStation 3 uses the Sony, Toshiba, IBM-designed Cell microprocessor as its CPU, which is made up of *one 3.2 GHz PowerPC-based "Power Processing Element"* (PPE) and *eight Synergistic Processing Elements* (SPEs). The eighth SPE is disabled to improve chip yields. Only six of the seven SPEs are accessible to developers, as the seventh SPE is reserved by the console's operating system.


----------



## blibba (Nov 22, 2012)

Super XP said:


> Check it out,



I think that's pretty compatible with what I said, except for the bit about one core being disabled to improve yields, which I do remember from the time now that you mention it.

In any case, the point is that the PS3 requires effective SMP programming to make the most of its hardware to at least the same extent that a typical gaming PC does.

http://store.steampowered.com/hwsurvey/cpus/


----------



## xenocide (Nov 24, 2012)

blibba said:


> How you define cores and threads is very arbitrary. X86 cores are unusual in that it's been so clear-cut for so long. A GTX 680, for example, could be considered as a 1, 8 or 1536 core processor. The PS3 is closer to what you'd normally consider an 8-core than it is to what you'd normally consider a single-core.



It's similar to Bulldozer cores in some regards.  Bulldozer has up to 8 "cores", but you could make a pretty strong argument for it being up to 4 "cores".


----------



## Super XP (Nov 24, 2012)

xenocide said:


> It's similar to Bulldozer cores in some regards.  Bulldozer has up to 8 "cores", but you could make a pretty strong argument for it being up to 4 "cores".


I don't know; I see Bulldozer as 8 full cores compared to the way the PS3 generates extra cores and to Intel's Hyper-Threading, but yes, it's not like the Phenom II's.


----------



## Aquinus (Nov 24, 2012)

xenocide said:


> It's similar to Bulldozer cores in some regards. Bulldozer has up to 8 "cores", but you could make a pretty strong argument for it being up to 4 "cores".



Then how do you explain the linear scaling of compute power with each core? Hyper-threading doesn't let the extra threads scale as well as having "real cores", so the benefit is highly variable and doesn't always provide an extra core's worth of compute.

If you look at what happened to AMD processors, single-threaded applications took a hit, but multi-threaded applications that could use the cores worked very well. Consider for a moment the performance boost in multi-threaded applications as AMD optimizes the core, shrinks the die, and crams more cores onto it.

By sharing the floating-point unit (keep in mind that some FP-intensive applications are starting to get programmed on GPUs now, not a lot, but they're cropping up), AMD can optimize what the CPU needs to be good at. A lot more integer math than floating-point math gets done in a processor for the average user, and generally speaking, unless you're doing a lot of parallel floating-point operations, you won't take a huge performance hit, because at that point you should be considering OpenCL for large amounts of data. I think AMD is hoping you will get or use an AMD GPU to improve your floating-point performance, because there are huge benefits to be had when you can make your FP code run in parallel. Obviously the industry isn't there yet, but it will be before you know it.

Also consider all the optimizations this architecture could still use; it is new, and AMD needs time to work out the bugs. All things considered, I think they're doing the best they can against Intel given their revenue and usable income.
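The scaling argument can be sketched with a toy measurement (hypothetical Python, purely illustrative; an integer-only workload like this is the kind of thing Bulldozer's eight integer cores handle independently, but the numbers you get depend entirely on your machine):

```python
import multiprocessing as mp
import time

def int_work(n: int) -> int:
    """CPU-bound, integer-only task: no floating point, so no shared-FPU contention."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed_run(workers: int, chunks: int = 8, n: int = 200_000) -> tuple[float, int]:
    """Run `chunks` independent tasks on `workers` processes; return (seconds, checksum)."""
    start = time.perf_counter()
    with mp.Pool(processes=workers) as pool:
        results = pool.map(int_work, [n] * chunks)
    return time.perf_counter() - start, sum(results)

if __name__ == "__main__":
    t1, c1 = timed_run(workers=1)
    t8, c8 = timed_run(workers=8)
    assert c1 == c8  # identical work either way
    print(f"1 worker: {t1:.2f}s  8 workers: {t8:.2f}s")
```

On a true 8-core part the 8-worker run approaches an 8x speedup for work like this, while with Hyper-Threading the second thread per core typically adds far less, which is the variability described above.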


----------



## buildzoid (Dec 1, 2012)

*gaming CPUs*

I just read a performance overview of the new FX series on Guru3D, and basically the FX 4300/6300 will give you very similar FPS for less power and cash than the FX 8350/8320. Considering the architecture is almost identical across these CPUs, they will overclock to almost the same levels. The 6300 costs $75 less than the 8350 ($40 less than the i5 3570K) and the 4300 $80 less, and since games only use a few threads (four at most), you could get the 6300 or the 4300, take it to 4.5-5.0GHz, and draw a little over 3/4 or 1/2 the power. The 6300 is a bit slower/faster in pretty much all benchmarks against the i5 3570K, so if I wanted a 1- or 2-GPU gaming rig I'd go AMD.


----------



## Steevo (Dec 1, 2012)

Aquinus said:


> Then how do you explain the linear scaling of compute power to each core? Hyper-threading doesn't enable the extra threads to scale as well as having "real cores" so the benefit is highly variable and doesn't always provide an extra core worth of compute.  *snip*



Careful with the "real core" issues there. 

AMD "cores" aren't even "real cores": take one out by itself and it won't operate; it requires the rest of the shared hardware its twin is using. 


AMD and ATI have both had a history of making hardware to do things that software was not ready for, and about 50% of the time or better it flopped.


----------



## Ghost (Dec 1, 2012)

buildzoid said:


> I just read a preformance overview of the new FX series on guru3D and basically the FX 4300/6300 will give you very similar FPS at less power and cash than FX 8350/8320 now considering the architecture is almost identical for these CPU they will overclock to almost the same levels the 6300 cost 75$ less than the 8350(40$ less than i5 3570k) the 4300 80$ less and since games only use a few threads a max of 4 then you could get the 6300 or the 4300 and get it to 4.5-5.0Ghz with a little over 3/4 or 1/2 power consumption. the 6300 is a bit slower/faster in pretty much all benchmarks against the i5 3570k so If I wanted 1 or 2 GPU only gaming rig I'd go AMD.



You got something wrong. An i5 will give better performance in games than any AMD CPU. The i5 3570K costs the same, draws less power, and is around 20-30% faster than the FX-8350 in games. That's with a single GPU; a dual high-end system would be severely bottlenecked by the FX.

Here's FX-8350 vs i7 3770K with 2x HD 7970s. i5 is identical to i7 in most games.
http://vr-zone.com/articles/amd-fx-...hz--multi-gpu-gaming-performance/17494-1.html


----------



## Dent1 (Dec 1, 2012)

Ghost said:


> You got something wrong. i5 will give better performance in games than any AMD CPU. i5 3570K costs the same, draws less power and is around 20-30% faster than FX-8350 in games. That's with single GPU. Dual high-end system would be severely bottlenecked by FX.
> 
> Here's FX-8350 vs i7 3770K with 2x HD 7970s. i5 is identical to i7 in most games.
> http://vr-zone.com/articles/amd-fx-...hz--multi-gpu-gaming-performance/17494-1.html




But how many people can afford one 7970, let alone two for CrossFire? Whilst I'd agree there is a deficiency in high-end multi-GPU gaming performance, few users will opt for such an expensive setup.

I think the broader message buildzoid was trying to convey was the similarity in gaming performance between the FX 4300, 6300, 8350 and 8320, which is sort of true.



buildzoid said:


> the 6300 is a bit slower/faster in pretty much all benchmarks against the i5 3570k so If I wanted 1 or 2 GPU only gaming rig I'd go AMD.



This part I'm in disagreement with. Gaming, no. Everything else, maybe.


----------



## Steevo (Dec 1, 2012)

APUs are still better bang for the buck if you are going to game.


----------



## os2wiz (Dec 2, 2012)

Ghost said:


> You got something wrong. i5 will give better performance in games than any AMD CPU. i5 3570K costs the same, draws less power and is around 20-30% faster than FX-8350 in games. That's with single GPU. Dual high-end system would be severely bottlenecked by FX.
> 
> Here's FX-8350 vs i7 3770K with 2x HD 7970s. i5 is identical to i7 in most games.
> http://vr-zone.com/articles/amd-fx-...hz--multi-gpu-gaming-performance/17494-1.html



You are making a generalization that just does not hold up with many of the newer games. The FX-8350 betters the i5 3570K and equals the i7 3770K in games like Battlefield 3 and Sleeping Dogs. In poorly designed single-threaded games, or games that are CPU bound, it may be a different story. But more and more of the better games are taking advantage of multiple cores, as many as 8!!!  Wake up, a new day is here.


----------



## os2wiz (Dec 2, 2012)

Super XP said:


> Check it out,


 
The PowerPC was a great chip. It's too bad IBM is pretty much out of the CPU business; they had great fabrication and great design teams. They also had the most advanced desktop and server operating system: OS/2. Far better design than Windows, and the Workplace Shell GUI was better than Windows' as well. They made the mistake of trusting Microsoft, and it cost them big time.


----------



## EarthDog (Dec 2, 2012)

Ghost said:


> You got something wrong. i5 will give better performance in games than any AMD CPU. i5 3570K costs the same, draws less power and is around 20-30% faster than FX-8350 in games. That's with single GPU. Dual high-end system would be severely bottlenecked by FX.
> 
> Here's FX-8350 vs i7 3770K with 2x HD 7970s. i5 is identical to i7 in most games.
> http://vr-zone.com/articles/amd-fx-...hz--multi-gpu-gaming-performance/17494-1.html





os2wiz said:


> You are making a generalization that just does not hold up with many of the newer games. The FX-8350 betters the i5 3570K and equals the i7 3770K in games like Battlefield 3 and Sleeping Dogs. In poorly designed single-threaded games, or games that are CPU bound, it may be a different story. But more and more of the better games are taking advantage of multiple cores, as many as 8!!!  Wake up, a new day is here.



http://www.overclockers.com/amd-fx-8350-piledriver-gaming-comparison


----------



## os2wiz (Dec 2, 2012)

EarthDog said:


> http://www.overclockers.com/amd-fx-8350-piledriver-gaming-comparison



There is absolutely nothing factually wrong with what I stated. I qualified my statement, if you bothered to read it in its entirety. I said many, not most, of the new games are being designed for multi-core processors. I said that is becoming an increasing trend, and it is. I know what the reports say, and it does not negate my observations. Analyze, my friend; rote responses lack analysis.


----------



## EarthDog (Dec 2, 2012)

I don't recall calling anyone out (reads post again to be sure...NOPE)... just adding a link for you guys to chew on. Put it back in your pants, there is no need for that...in fact, doesn't my link actually support what you are saying as far as the performance goes?


----------



## Super XP (Dec 2, 2012)

EarthDog said:


> I dont recall calling anyone out (reads post again to be sure...NOPE)... just adding a link for you guys to chew on. Put it back in your pants, there is no need for that...in fact, doesnt my link actually support what you are saying as far as the performance goes?


That is why you posted the link, right? This clearly shows, Once Again, that in GAMING both Intel and AMD are competitive. Anybody thinking otherwise is delusional.


----------



## EarthDog (Dec 2, 2012)

I posted it as it has information they are talking about... that's it!


----------



## Frick (Dec 2, 2012)

EarthDog said:


> I posted it as it has information they are talking about... that's it!



*sarcasm*


----------



## buildzoid (Dec 8, 2012)

What I was aiming at with my post was to point out that in games the FX 6300/4300 will be practically equal to the FX 8350 while costing less and consuming less power. If the 8350 isn't too far behind the i5 3570K but consumes too much power and doesn't need all its cores, then the 6300 and 4300 make ideal cheap gaming CPUs, because you're only losing cores, which won't mean anything in most games. Since they run on less power they should hit higher overclocks than the 8350 and match the i5 in gaming, especially cheap i5s like the 3470.


----------

