# Show me that a Q6600 is no good now



## Shambles1980 (May 6, 2014)

OK,
basically I know the Q6600 is considered ancient in terms of computing power.
But I just can't seem to justify an i3 or low-cost i5, because I simply don't see how they can actually be that much better.
I don't mean the highest-end i5s and i7s; I mean the run-of-the-mill, "I don't have much money but let's try to get up to date" buys, spending as little as possible to get more performance.

Anyway, I have always overclocked computers; it's just something I always did. I started off by deliberately setting the jumpers wrong and hoping the house didn't burn down, and now I try to balance performance against longevity and, most importantly, having a stable system that I can game on for 16 hours straight if I so wish.

So here we go.

My Q6600 stable OC on air:
3.8-3.9 GHz on the CPU is probably possible, but not with this RAM and the voltage restrictions on this board.
I tweaked what I could.
I had the RAM up to 1000 MHz (it's DDR2-800), but I had to pump a lot of voltage into it, and I did not like that.

The long and short of it is, what I was comfortable overclocking to with the voltages I would use and what the RAM would allow was:
FSB 412
CPU 3.7 GHz
RAM 824 MHz
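As a quick sanity check, those three numbers hang together, assuming the Q6600's stock 9x multiplier and a 1:1 FSB:RAM divider (which the figures imply; both are assumptions on my part, not stated above):

```python
# Sanity check of the overclock figures above, assuming the Q6600's
# stock 9x multiplier and a 1:1 FSB:RAM divider (implied by the numbers).

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    """CPU core clock = front-side bus * multiplier."""
    return fsb_mhz * multiplier

def ddr2_effective_mhz(mem_clock_mhz: float) -> float:
    """DDR2 transfers twice per clock, so effective rate = 2 * memory clock."""
    return 2 * mem_clock_mhz

print(core_clock_mhz(412, 9))    # 3708.0 MHz, i.e. the "3.7 GHz" above
print(ddr2_effective_mhz(412))   # 824.0 MHz effective, matching "RAM 824"
```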



GPU 1150 MHz
GDDR 1300 MHz (1350 gave some purple artifacts)
http://www.techpowerup.com/gpuz/kwkxx/

This then gave me the following scores:
Ice Storm
114107
Cloud Gate
13295
Fire Strike
4811
http://www.3dmark.com/3dm/3016922
With this setup as it is, I can get full usage out of my CPU and GPU when gaming with lots of bells and whistles in Thief at 1280x1024:
lows of 55 fps, highs of 100+, average about 65-ish I'd say.

I may try to get the CPU stable at 3.75 GHz and the RAM at 1001 MHz, but the latencies suffer a lot when I go to 1000 MHz, and I should probably just get DDR2-1066 RAM if I want to push it any further.

The GPU I am pretty happy with (I don't want artifacts), and at 60°C full load the fan sits at about 25-30%, so that's always good.
Honestly, though, even if I cranked it up as far as it could go for a single test, with all the fans I have wired up cooling the whole system at full throttle,
I don't think I would be able to squeeze any more than 5000 out of the Fire Strike test.
And I don't see the point in an overclock you can't use every day.

Now, what I would like to see are results from lower-end i5s and i3s (or AMD APU equivalents) that have run the 3DMark test.
If you have a 7850 or the Nvidia equivalent (is it a 660? not sure), I would be most interested in those results. Valid tests preferred, no GPU physics.


-=EDIT=-

If you just want to see what happened when I tested the Q6600 at 3.0 GHz vs an FX-8120 locked at 3.1 GHz (with occasional throttling down to 2.8 GHz), here it is:




Shambles1980 said:


> OK, did all my samples for the Q6600.
> After trying all my benchmark software, there's nothing I can do to make the minimum fps better, so that is the CPU: at the same resolution with lower quality, the minimum fps is the same; the maximum goes up, and obviously as a result so does the average. That's when using Fire Strike, and since I can't change the settings on that, I will keep the variables as low as possible by using all quality settings at defaults and only changing resolutions.
> 
> I chose to use the 3 Valley tests at 1080p.
> ...


----------



## Vario (May 7, 2014)

You are playing at 1280x1024!  You should be able to run everything at that resolution.  At 1920x1080 or higher you would see slowdown.

What is your 3dmark11 physics score?


----------



## natr0n (May 7, 2014)

As long as you can still play modern games, there's nothing to worry about.


----------



## Aquinus (May 7, 2014)

An i5 or better will probably outperform your Q6600, but whether it makes a difference to your experience depends on the game. Do you feel you need more performance? That's really the question you should ask yourself when it comes to upgrading, because if you don't need it, it's a waste.


Vario said:


> You are playing at 1280x1024!  You should be able to run everything at that resolution.  At 1920x1080 or higher you would see slowdown.
> 
> What is your 3dmark11 physics score?


Benchmarks have shown that higher resolutions are less CPU-dependent than lower resolutions. The CPU generally becomes a bottleneck as the framerate gets higher; lower resolutions put less stress on the GPU and more on the CPU, because the GPU gets its work done faster at lower resolutions.


----------



## Solaris17 (May 7, 2014)

Does it do what you want it to?

Honestly, people were playing games at 1080p back in the Q6600's day, and it's still the standard. Regardless of what 4K elitists will tell you, Eyefinity and multi-monitor gaming setups still don't have nearly the market saturation people claim.

I used to be into the whole crazy-OC, water-cooling, two-PSUs, down-to-the-fan-RPM business. Then I realized I wanted to pay off my house, and in the end all I got to say was that I beat a 17-year-old's 3DMark score; it wasn't worth the money anymore, and honestly the PC did everything I wanted it to.

I can sit here and tell you to get a 3.6 GHz 8-core i7 machine with 32 GB of RAM and SLI Titans, but it won't do a thing for you if all you do is log into AOL Desktop 9.7.


----------



## Shambles1980 (May 7, 2014)

I can quite happily game at 1024x768 provided it is smooth; 1280x1024 is perfectly adequate for me. Although I wouldn't mind being able to put absolutely everything on full at those resolutions, click V-Sync, and never drop below 60 fps.

I also like to overclock, always have done. It's just that having kids (one is a teen, the other will be 3 in December) makes money tight, lol, so I'm stuck with my old CPU. I can't see it surviving another GPU upgrade, to be honest. Right now it's as balanced as I can make it, with full usage of GPU and CPU available. I don't go in for console gaming; I only ever really liked Forza. I get much more pleasure from my computer.

Anyway, in regard to my 3DMark 11 physics score:
http://www.3dmark.com/3dm11/8304206
Physics Score
3916
Combined Score
3780

I'm not really that fussed about physics scores; the chances of a game coming along and needing that much of the CPU for physics are pretty slim. Even so, there they are.
I am much more interested in the Fire Strike results of the new 3DMark.

As for whether I think I need more performance: yes, I do. Can I afford what I want? No, not really.
So that's what I'm trying to find out: what low-cost i3 or i5 MAY be viable for me as an upgrade. I would need a board, RAM, and CPU, and money constraints mean that I simply look at the things, compare the numbers, and read reviews and forums. And I think, well, OK, but to get performance "in the real world" that I would consider an upgrade, I'm looking at spending a lot of money.

What I do see a lot is people saying the Q6600 will bottleneck an HD 6950 or HD 7770, but really it won't, not unless the game needs a better CPU, because I can run stress tests on my GPU and CPU at the same time and get them both to 100%.
But I'm pretty sure I'm at about the limit of the CPU's ability, and it's just not quite where I want to be.
I don't need the best of the best. But I would like to see that my opinion of LGA 775 being the greatest socket ever created, and the Q6600 the best processor Intel ever made, is now obsolete and it's time to move on.
The only problem with that is the old performance-to-money equation. I have this much performance now (outlined above), which is "free", so to justify replacing it with exactly the same performance, the replacement would need to be "free" too, or it's a waste of money. So I need to take that theory, apply it to similar results, weigh up the costs and gains, and decide which to get, or just stay the same for now.
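That cost-for-gain reasoning can be put into a rough formula. A minimal sketch; the new-build score and prices below are made-up placeholders, not quotes from anyone in the thread:

```python
# Rough sketch of the "current performance is free" upgrade logic above.
# All concrete numbers in the example call are hypothetical placeholders.

def cost_per_gained_point(current_score: float, new_score: float,
                          new_cost: float, resale_value: float) -> float:
    """Effective price paid per benchmark point gained over the current,
    already-paid-for ("free") setup, after selling the old parts."""
    gain = new_score - current_score
    if gain <= 0:
        raise ValueError("No gain: by this logic, any price is a waste.")
    return (new_cost - resale_value) / gain

# e.g. Fire Strike 4811 now, a hypothetical new build scoring 6000
# for £300, with £140 recouped by selling the old parts:
print(cost_per_gained_point(4811, 6000, 300.0, 140.0))
```

The guard clause is the whole argument in one line: a replacement that gains nothing is a waste at any price.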


----------



## THE_EGG (May 7, 2014)

OK, well, I don't know how much this will help, but I moved from a Q8400 to an i3-3220 in one of my 'spare' rigs that friends use for games when they come over. I found that the machine seemed a lot more snappy and responsive, and also much, much cooler; I think the upgraded RAM helped with that too. Gaming performance seemed to improve, but I can't really say by how much because I also upgraded the GPU at the same time to a 7850 1GB (coming from a 5770). That being said, you can't really overclock an i3, so yeah...

Maybe you could look at a second-hand 2500K or 3570K?


----------



## Shambles1980 (May 7, 2014)

THE_EGG said:


> OK, well, I don't know how much this will help, but I moved from a Q8400 to an i3-3220 in one of my 'spare' rigs that friends use for games when they come over. I found that the machine seemed a lot more snappy and responsive, and also much, much cooler; I think the upgraded RAM helped with that too. Gaming performance seemed to improve, but I can't really say by how much because I also upgraded the GPU at the same time to a 7850 1GB (coming from a 5770). That being said, you can't really overclock an i3, so yeah...
> 
> Maybe you could look at a second-hand 2500K or 3570K?



If you find the time to download the new 3DMark and post the results, that would be very useful; those specs are very close to mine, and I also think bus speeds play a bigger part than the raw numbers imply.


----------



## patrico (May 7, 2014)

I have the same; gonna upgrade very soon, getting jealous of fast machines lol


----------



## kn00tcn (May 7, 2014)

Just... ignore everything 3DMark, period!

Do you encode videos or render in After Effects or 3D tools? If yes, you'll notice any CPU boost.

For games, you need specifically demanding (or unoptimized) ones, so Battlefields, large open worlds, MMOs, etc.

One thing to consider is all those things that use only one thread, or just a couple; in that case you want faster per-core performance, although it's great that you overclocked.

You could also simply look at your CPU and GPU usage during various workloads and decide whether you need a boost or not.


----------



## Vario (May 7, 2014)

Shambles1980 said:


> Now, what I would like to see are results from lower-end i5s and i3s (or AMD APU equivalents) that have run the 3DMark test.
> If you have a 7850 or the Nvidia equivalent (is it a 660? not sure), I would be most interested in those results. Valid tests preferred, no GPU physics.





Shambles1980 said:


> Anyway, in regard to my 3DMark 11 physics score:
> http://www.3dmark.com/3dm11/8304206





> P6092 with AMD Radeon HD 7850(1x) and Intel Core 2 Quad Processor Q6600
> Graphics Score
> 7547
> Physics Score
> ...



The only reason I mentioned 3DMark11 is that I have some comparisons here. This doesn't show your Q6600 as being no good; rather, it shows it as still very relevant:

(2010 era) Phenom II 965BE @ 4.0 GHz with a 7850 at its 24/7 clocks: 1,180 MHz core, 1,480 MHz mem
http://www.3dmark.com/3dm11/5304049
P6129 with AMD Radeon HD 7850(1x) and AMD Phenom II X4 965 
Graphics Score
6792
Physics Score
4810
Combined Score
4645

(2011 era) i5-2550K deliberately underclocked to 3.1, with RAM at 1333 and a stock 7850 (this should simulate the lower-end i5 comparison you wanted):
http://www.3dmark.com/3dm11/8305675
P6047 with AMD Radeon HD 7850(1x) and Intel Core i5-2550K Processor 
Graphics Score
6073
Physics Score
5959
Combined Score
5993

(2011 era) stock i5-2550K, stock RAM, stock 7850
http://www.3dmark.com/3dm11/8305547
P6207 with AMD Radeon HD 7850(1x) and Intel Core i5-2550K Processor 
Graphics Score
6071
Physics Score
6722
Combined Score
6560

i5 2550k @ 4.4 with 7850 stock
http://www.3dmark.com/3dm11/7980370
P6407 with AMD Radeon HD 7850(1x) and Intel Core i5-2550K Processor 
Graphics Score
6054
Physics Score
8560
Combined Score
6817

i5 2550k at 4.8 with 7850 at stock
http://www.3dmark.com/3dm11/8305614
P6459 with AMD Radeon HD 7850(1x) and Intel Core i5-2550K Processor 
Graphics Score
6056
Physics Score
9165
Combined Score
6845

i5 2550k at 4.8 with 7850 at 1050 core (Catalyst CC simple overclock)
http://www.3dmark.com/3dm11/8305634
P7445 with AMD Radeon HD 7850(1x) and Intel Core i5-2550K Processor 
Graphics Score
7140
Physics Score
9138
Combined Score
7778

i5 2550k at 4.4 with 7850 at 1100 core (custom BIOS; the card was not as stable with BIOS overclocking as with driver OC)
http://www.3dmark.com/3dm11/7227143
P7599 with AMD Radeon HD 7850(1x) and Intel Core i5-2550K Processor 
Graphics Score
7389
Physics Score
8550
Combined Score
7973

(2012 era) i7 3770k at 4.5 with 7850 stock
http://www.3dmark.com/3dm11/7916796
P6452 with AMD Radeon HD 7850(1x) and Intel Core i7-3770K Processor 
Graphics Score
5935
Physics Score
10906
Combined Score
6731

i7 3770k at 4.5 with 7850 at 1050 core oc
http://www.3dmark.com/3dm11/7728675
P7456 with AMD Radeon HD 7850(1x) and Intel Core i7-3770K Processor 
Graphics Score
6980
Physics Score
10986
Combined Score
7692


edit: added a stock 2550K/7850 run and your system above for easy comparison!
edit2: added a deliberately underclocked 2550K at 3.1 with RAM at 1333 to simulate a low-end i5
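Pulling just the physics (CPU) scores out of the runs above and setting them against the Q6600's 3916 from the OP's 3DMark11 link makes the spread easier to see. A small sketch that only restates the thread's own numbers:

```python
# Physics (CPU) scores from the runs listed above, relative to the
# Q6600's 3916 from the OP's 3DMark11 result. Numbers are restated
# from this thread, not new measurements.

Q6600_PHYSICS = 3916

physics_scores = {
    "Phenom II 965 @ 4.0": 4810,
    "i5-2550K @ 3.1":      5959,
    "i5-2550K stock":      6722,
    "i5-2550K @ 4.4":      8560,
    "i5-2550K @ 4.8":      9165,
    "i7-3770K @ 4.5":      10906,
}

for cpu, score in physics_scores.items():
    gain_pct = 100 * (score / Q6600_PHYSICS - 1)
    print(f"{cpu}: {score} ({gain_pct:+.0f}% vs Q6600)")
```

Note how the graphics scores in the runs above barely move with the CPU (all roughly 6000-7500, set by the 7850); it's the physics column that separates the chips.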


----------



## Fourstaff (May 7, 2014)

It's very simple: when you cannot stand your current computer, you upgrade! No need for anyone to convince you to upgrade; you will know when you have to.


----------



## RCoon (May 7, 2014)

Fourstaff said:


> It's very simple: when you cannot stand your current computer, you upgrade! No need for anyone to convince you to upgrade; you will know when you have to.


 
The saying "if it ain't broke, fix it until it is" comes to mind. The Q6600 is still a ballin' chip.


----------



## remixedcat (May 7, 2014)

My sis uses one for video production and it's passable. She uses an older version of adobe premiere though.


----------



## Vario (May 7, 2014)

RCoon said:


> The saying "if it ain't broke, fix it until it is" comes to mind. The Q6600 is still a ballin' chip.



IMO, FPS games that are graphically intense but not CPU-intense won't care much whether you run a Q6600 or a Phenom II, but strategy games like SC2 would be very different. If the OP is happy with it, keep using it. I liked my Phenom II a lot. The Intel 1155 stuff is noticeably faster at everything though, both gaming and general Windows use.

My friend now has my Phenom II + ASRock 970 Extreme3 and a 5870; it is very capable. Other than its inability to run RAM over 1333 at stock, it competes with AMD's current lineup.

edit:

doing one final run with the i5 stuck at 3.1 and RAM at 1333 to simulate a low-end i5-2400.


----------



## micropage7 (May 7, 2014)

remixedcat said:


> My sis uses one for video production and it's passable. She uses an older version of adobe premiere though.


Yep. If you buy or build a PC, just figure out what your needs are; many people are still happy with their old PCs.


----------



## Mussels (May 7, 2014)

I went from a Q6600 -> E8400 -> Xeon E3120 -> 1090T X6 -> i5.


Every single step of the way I saw a noticeable performance increase, and I overclocked every last one.

Your CPU is enough to run modern games, but performance is far from what it could be...


3DMark doesn't tell you how real games perform. Going from my 1090T to this CPU gave me a 30-40% FPS boost, for 50% less power used.


----------



## Easy Rhino (May 7, 2014)

My main rig is a Q6600 and it performs very nicely for all of my needs. I play Total War: Shogun and Company of Heroes on high settings at 1080p and they run flawlessly. I am not even thinking about upgrading for another year.


----------



## Mussels (May 7, 2014)

Easy Rhino said:


> My main rig is a Q6600 and it performs very nicely for all of my needs. I play Total War: Shogun and Company of Heroes on high settings at 1080p and they run flawlessly. I am not even thinking about upgrading for another year.



Remind me to play that with you, for two reasons:


1. I love that game

2. I want to point out every time you lag


----------



## Sasqui (May 7, 2014)

I'm still using an E8600 (Core 2 Duo Wolfdale).  I've got it at 4.3 GHz, and it played Crysis with a 5870 at 1920x1200 comfortably.  There are other games that are much more CPU-bound.


----------



## Easy Rhino (May 7, 2014)

Mussels said:


> remind me to play that with you for two reasons.
> 
> 
> 1. i love that game
> ...



Which game?


----------



## grunt_408 (May 7, 2014)

As long as it can run Minesweeper, Minecraft, etc., you're good to go at that res...


----------



## Kyuuba (May 7, 2014)

Man, your best choice is an i5. I had a 2500 non-K (that's second gen) and it was excellent, dude, running with a PNY GTX 280; I played so many games on it, including Metro 2033 back in 2011 at max settings at 1280x720, and it ran flawlessly.


----------



## Mussels (May 7, 2014)

Easy Rhino said:


> Which game?



Company of Heroes, sorry.


I also just built an E6600 system with a 9600GT out of my spare parts, thanks to this thread reminding me I had them; I should run a benchmark or two...


----------



## Kursah (May 7, 2014)

Mussels said:


> ...
> 
> 3DMark doesn't tell you how real games perform. Going from my 1090T to this CPU gave me a 30-40% FPS boost, for 50% less power used.



That, plus degrading performance at my preferred settings and resolutions, would necessitate an upgrade. That, or what's happened the last couple of times since my Q9650 and i5-760 builds: selling my rigs, because I have friends who always want them. It gives me an easy out to upgrading... hence the Haswell build.


----------



## MT Alex (May 7, 2014)

Keep your CPU and get a 1080p monitor, for the love of Pete.  I can't fathom still playing at such poor resolutions.  Besides that, the new monitor will dictate which areas of your rig you need to improve.  Seriously, a new monitor is one of the biggest "wow" upgrades anyone can do for their system.


----------



## Easy Rhino (May 7, 2014)

Mussels said:


> company of heroes, sorry.



I don't lag in that game?


----------



## Mussels (May 7, 2014)

Easy Rhino said:


> I don't lag in that game?



Challenge accepted.


----------



## Shambles1980 (May 7, 2014)

MT Alex said:


> Keep your CPU and get a 1080p monitor, for the love of Pete.  I can't fathom still playing at such poor resolutions.  Besides that, the new monitor will dictate the areas of your rig you need to improve.  Seriously, a new monitor is one of the biggest "wow" upgrades anyone can do to their system.


What makes you think that my monitor can only do those resolutions, lol?
It's just the resolution that I chose.

This is a 1080p60 monitor.


----------



## Toothless (May 7, 2014)

Shambles1980 said:


> What makes you think that my monitor can only do those resolutions, lol?
> It's just the resolution that I chose.
> 
> This is a 1080p60 monitor.


So then why run it at such a low resolution? All I can think of is "GIANT ICONS."


----------



## Easy Rhino (May 7, 2014)

Mussels said:


> Challenge accepted.



I have played with people who lag, and it is either because their internet sucks or because they have their graphics turned up too high. Neither applies to me.


----------



## MxPhenom 216 (May 7, 2014)

Lightbulbie said:


> So then why run it at such a low resolution? All I could think of is "GIANT ICONS."



Yep, and a resolution that isn't even your monitor's aspect ratio. Seriously, run at your monitor's native resolution.

Once you do, you might find you do need an upgrade in both the CPU and GPU departments.


----------



## Toothless (May 7, 2014)

MxPhenom 216 said:


> Yep, and a resolution that isn't even your monitor's aspect ratio. Seriously, run at your monitor's native resolution.
> 
> Once you do, you might find you do need an upgrade in both the CPU and GPU departments.


Maybe, maybe not. You might only have to turn down settings in games. It's really the resolution that gives it the quality. I used to play at 1600x900 on highest until I tried 1080p at medium-low; it still looked better than 1600x900.


----------



## eidairaman1 (May 7, 2014)

Vario said:


> IMO, FPS games that are graphically intense but not CPU-intense won't care much whether you run a Q6600 or a Phenom II, but strategy games like SC2 would be very different. If the OP is happy with it, keep using it. I liked my Phenom II a lot. The Intel 1155 stuff is noticeably faster at everything though, both gaming and general Windows use.
> 
> My friend now has my Phenom II + ASRock 970 Extreme3 and a 5870; it is very capable. Other than its inability to run RAM over 1333 at stock, it competes with AMD's current lineup.
> 
> ...


That motherboard can run 1600 MHz RAM. My bro has an Extreme4 with an X2 555 unlocked to an X4 955/B55.


----------



## lilhasselhoffer (May 7, 2014)

...So you want someone to show you the value of an upgrade with synthetic benchmarks.  I'm not sure if you're trolling or unaware of what reality is.

Yes, the Core 2 processors were excellent.  They introduced the idea of decent multi-core processors, were fun to overclock, and can still hang with modern parts assuming a hefty overclock.  They also have that pesky need for a memory controller on the board, are limited to SATA II connections, and reach all the way back to PCIe 1.0.


If you want to see the case for giving up on a Q6600, then you need to compare apples to apples.  You're comparing an approximately $851 CPU (at launch) to a sub-$150 processor.  If you bought a 4960X today, I'd expect it to still be viable in five years as well.  Of course, once you compare the Q6600's numbers to the 4960X's, you'll be a bit disappointed.

If you want a reason to go from a Q6600 to an i3, it's because you've got a laptop.  If you want a reason why the Q6600 is still viable, it's because you invested in the most powerful chip available at the time.  This thread is based on a false premise.  By this logic, an average-sized man is a giant among midgets.  You are starting from a faulty assumption, therefore no conclusion you reach is worth looking at.


----------



## Shambles1980 (May 7, 2014)

I'm a child of the era where role-playing games just had text on the screen, lol, and I still think Super Mario Bros. 3 has great graphics. But when it comes to modern computers, anything lower than 1024x768, even by my extra-low standards, looks like poop, while anything above 1024x768 looks just fine to me; I honestly can't tell the difference.
(My desktop is at 1080p, though.)
The only thing I can tell the difference with is 1080p vs 1080i, and 720p looks better than 1080i, but anyway.
I am looking at some board/RAM/CPU bundles on the internet, but I don't seem to see anything that is a good enough improvement for the money for me to then convince the wife I need to spend that much.

Maybe I should just hang back, wait for second-gen 14nm CPUs to hit the streets, and pick up a decent used 22nm setup then.
Can't be that far off now, I guess.

As for "I bought the Q6600 when I did":
that's a bit different. I upgraded from a Pentium D, which was LGA 775; the D was a very low-end model, but I bought the best motherboard I could at the time, which allowed me to buy a Q6600 later.


----------



## Kursah (May 7, 2014)

With that level of graphics, you'll waste your money unless you want to game on an integrated GPU, which an i5 would be great for, imho. My kids game on my G/F's i3 + 4GB DDR3 at 1920x1080 using the onboard Intel HD 3000-series iGPU for graphics. It saves quite a bit of power over a dedicated card and is more efficient than older motherboard-based onboard solutions. Her i3 has been great; it's a hyperthreaded part, and I think the chip + board + 4GB DDR3 was sub-$200 or so.

For your level of gaming, you could gain some efficiency and build an ITX rig that is almost console-sized, that could meet or beat your current build, save on power, run cooler, and get the job done. Food for thought.


----------



## TheMailMan78 (May 7, 2014)

Solaris17 said:


> Does it do what you want it to?
> 
> Honestly people were playing 1080p games back in the Q6600 day and its still the standard, regardless of what 4k elitists will tell you i-finity and multi monitor gaming setups still dont have nearly the market saturation people claim it does.
> 
> ...


Pretty much. My days of constant upgrades are over. No game really taxes what I have now.


----------



## Shambles1980 (May 7, 2014)

I can still play the games I want to play, but there are still those occasional 5 fps dips that take me down to less than 60 fps. (Those are the ones I have noticed with AB logging the fps as I play; some areas could be even worse.)
I also like to play RTS games, racing sims, and WW2 combat flight simulators. I haven't gotten to the point where I'm saying "well, I just can't play it like this", but I'm sure it's coming, and soon too.
I also like games like Fallout: New Vegas, and I always have the view distance maxed out; I like to be able to see things I want to shoot before they see me.
And I do occasionally, once in a blue moon, go play some multiplayer FPS, but the days when FPSes were the best thing in the world have long since passed for me, and I much prefer a first-person-perspective role-playing game:
Morrowind/Skyrim/STALKER, and I may end up trying the online version(s), although I think STALKER Online has been abandoned.

And if we are talking emulation:
the Q6600 with this GPU only barely runs Dolphin, so games like Resident Evil are at acceptable levels. That is a limitation of the software and not the hardware, but more power would fix it.

So yes, I think I need more power. I don't think I need that much more power, and I would generally prefer to get a slightly better CPU with no onboard GPU and use a graphics card.


----------



## Kursah (May 7, 2014)

Most Intel CPUs come with onboard GPUs... you can simply disable them, though. Even my 4770K has an onboard GPU; it's actually a nice backup to have should your main GPU fail.

It sounds like you want to justify an upgrade but don't at the same time. What's your budget? Let's get some build ideas out there for you and set you free to research them.


----------



## Shambles1980 (May 7, 2014)

That's the point, lol: I don't have a budget. I need to find some newer stuff that outperforms my current stuff enough that I can tell the wife "look how much better it is", whilst at the same time costing as little as possible so she doesn't kill me, lol.
If I can think up an inexpensive board/CPU/RAM combo that would be noticeably better than my current setup with the same GPU, I would sell what I have now on fee-bay, hope to get a bit of funds from that, and then cover the rest.
I'd say that my CPU and board + RAM should be worth about £70; I sold my 5770 the other day and some RAM today, so that would take me up to about £140, and then I would need to cover the rest.
And the only things I can find in the £160 range really are not an upgrade. Sure, it's a faster bus speed, but there's no way I can convince the wife it was worth all the hassle, let alone £20 plus the money from what I sold. (Incidentally, anything I sell becomes money, and somehow that money then becomes hers; and if I buy her something and she sells it, that money is also hers.)
Looking at what I'm looking at, it seems I need to spend over £300 to get where I want to get, which means I would have to cover £200 of the cost from my own pocket.

This is why I was asking for benches of lower-end i3s and i5s, preferably with the same GPU or similar.
That way I could possibly see something I had been overlooking and be able to get the build price down.


----------



## MxPhenom 216 (May 7, 2014)

It's so hard finding benchmarks of rigs similar to yours to compare against. I don't think any of us are really going to be able to convince you to upgrade or not, but you are about 5 generations of Intel CPUs behind and about 2 or 3 AMD generations behind.

Not only would you get a good performance jump in FPS alone, but there are improvements in other areas too.

Motherboard chipsets are better: UEFI BIOS, SATA 6 Gb/s, DDR3, overall lower power consumption, less heat, completely new CPU architectures, PCIe 3.0 support, etc.

Unless you want to overclock, you could do a build with an i5-4430 or 4570, a good H87 motherboard, and 8GB of DDR3 memory at 1600 MHz or higher, which would come out to just below $400 USD.

Or you could jump over to an AMD FX build with an FX-6300/8320/8350, and it would be even cheaper than the Intel platform.

Honestly, you are due for an upgrade regardless, IMO. But then again, you need to be able to make that judgement yourself.

I am just one of the few who tries to have the best, most recent stuff each year even if I don't need it. I'm at the point where being behind even a single generation bothers me, especially on GPUs. I just have a big obsession with hardware is all, haha.


----------



## Kursah (May 7, 2014)

Your wife won't care about benches. Be straight up: if you want to upgrade, fine; if not, then don't.

I hope you can get the funds to upgrade if that's what you feel you need to do. I think that Q6600 is still a damn good chip for an older Core 2. Sure, an i3 or i5 will be better, even noticeably so... but where they really ring in is the power savings, imho. That you can justify to the wife: lower power consumption, lower heat output, etc.

No offense, but it sounds like you need to take your balls out of her purse on this one!

It sounds like you deserve a new machine without major hassle! I hope you can do it, but seeking benchmarks isn't the answer, imho. Those are very subjective and won't correlate with an average gaming experience. The best way would be to get it, build it, and do an A/B comparison if you care to, or sell your old stuff. You won't find direct comparisons. You can Google CPU comparisons at sites like CPUBoss, and there are GPU comparison sites too. But really, it's all a mixed bag.

New hardware = more powerful + more efficient, and both are noticeable.


----------



## eidairaman1 (May 7, 2014)

Are you trying to prove to yourself that you don't need an upgrade, or that you do?


----------



## Shambles1980 (May 7, 2014)

Trying to prove that I do, lol.

It's just that this has been so very different from the past:
the 286 had to be upgraded to play anything at all, really.
Skip forward, and then the AMD 333 MHz with 3DNow!: easy decision.
Then they went and smashed it all with the first 1 GHz CPU. Again, easy to argue it was needed.
Then skip forward a bit and I'm on a Pentium 4 486; the future was coming, so I traded over to a Pentium D, making sure I got an LGA 775 board with the highest FSB I could find.
Then came the Q6600 at a native 1066 FSB and 9x multiplier, and even then there were boards that could do 1333 native, and 1600. It was a very easy choice to make:
4 cores, and actual cores using a proper architecture at that, not just two Pentium 4 cores glued together. The Q6600 was probably the easiest upgrade choice anyone ever had to make.

And now here we are, 6 years later? 2-3 motherboards changed, 4-5 GPU upgrades, and still the same CPU. The thing has been run like a mule all this time and still keeps on kicking. And I just can't seem to justify the cost of upgrading; every other upgrade I ever did was easy, a simple matter of "if I want to do this, then I need that".

The truth is I want to upgrade, but I need it to be a realistic upgrade. I don't want to pay and get the same or only slightly better performance, and it seems that if I don't pay a lot, that is what I will end up with. And I don't want to pay £300 (probably $570) for the parts I would want in order to consider it an actual upgrade.

I think my problem is that my last upgrade came when Intel finally said, "OK, AMD, you have had your fun; now this is what we can do,"
and I keep expecting that sort of step up for the money.

Maybe I should try to hold out for 14nm and see if I can then get a second-gen i5 for the money I want to spend. But if I do that, then what I have will be worth even less, so it will probably end up equating to the same cost.

Anyway, that, along with a wife, is my problem.

P.S.

Looking at everything, an i5-2500K with a decent board and 8-16GB of RAM is what I'd be looking at as something that's actually an upgrade,
which is where I'm getting the £360-ish price tag from.
The lowest-end systems I could muster together range from £180 or so, but I don't see any of them as an upgrade.


----------



## Kursah (May 7, 2014)

Shambles1980 said:


> trying to prove that i do lol.



Get a new monitor that runs 1920x1080... now your games don't run right. Also, power consumption is higher than that of a new build: an 80+ PSU, lower-voltage DDR3, a smaller-fab, lower-voltage CPU that runs cooler and more efficiently, and a newer mainboard with improved chipset efficiency too. Haswell-based boards also no longer need expensive, power-consuming CPU VR circuits, just some more basic filtering of the overall input voltage, because Haswell CPUs have internal VRM control for the CPU cores, which is more efficient as well.

What you need to do is research what you want or think you can afford...and list all the positive aspects. Find some pro reviews that rate them well...find some good reviews from users on forums...post link after link for her to see...and say you're getting it for all the right reasons. Gaming at 1080p and lower power consumption/higher efficiency = win.


----------



## Jetster (May 8, 2014)

Q6600s go for $50. There is a reason for this. But if you're happy with what you have, then play on.


----------



## Toothless (May 8, 2014)

High-grade i3 / low-grade i5 for Intel // Athlon 760K / A6 APU from AMD
4GB DDR3 1333MHz RAM
Decent micro-ATX board, or go smaller.
Low-power PSU
Small little case

Honestly, a little rig like that will be cheaper overall than what you got that Q6600 for.


----------



## Toothless (May 8, 2014)

Or this.. Sadly I can only get U.S. prices.





If anyone has a better list, by all means please edit this. I'm still a student in the hardware school.


----------



## MT Alex (May 8, 2014)

Shambles1980 said:


> what makes you think that  my monitor can only do those resolutions lol?
> its just the resolutions that i chose..
> 
> this is a 1080p60 monitor.



Then you probably don't need to upgrade at all, because you aren't doing it right.


----------



## Toothless (May 8, 2014)

We're not trying to make fun of or insult you, OP. We're just saying that you're using your monitor "improperly." And if you want to upgrade, then you should decide with or against it. We're just here to help.


----------



## eidairaman1 (May 8, 2014)

I only just recently started piecing a machine together. I'm going from an Athlon XP to an FX setup; see sig rig.

Also, you might as well save cash for a Haswell-E setup.



Shambles1980 said:


> trying to prove that i do lol.
> 
> its just this has been so very different to the past.
> 286 had to be upgraded to play anything at all really.
> ...


----------



## Vario (May 8, 2014)

OP, if you want to upgrade, you should consider an i5 4670K. That's probably the best choice for you:
1) Intel is so far beyond AMD's single-thread performance that you would be obsolete much sooner with an AMD setup
2) you don't need hyperthreading for games, so i5 is probably the way to go instead of i7
3) Haswell (4670K) is possibly 5% faster than Ivy Bridge (3570K) for games at any given clock
4) latest socket, and you can buy it with a Z97 board now that they are being released.

If you don't think it's worth it, then don't talk yourself into it; keep what you have. There will always be something better coming out just around the corner.

Another option is buying a used 2500K and motherboard combo from a forum and selling your Q6600 + motherboard to offset the price. The 2500K isn't far behind a 4670K as far as games are concerned.

You don't need more than 8GB of RAM.


----------



## xBruce88x (May 8, 2014)

here's what i got with my old Phenom II 920

http://www.3dmark.com/fs/2116500

Just over 4000


----------



## HalfAHertz (May 8, 2014)

I've got a cheaper i5 and a 7850; I could try running some benchmarks for you. To be honest, as others have said, the Q6600 was the high end of its time, so anything less than an i5 4670K wouldn't be worth it. It might perform slightly better than what you have, but it won't last as long as your Q6600 has, and you'd soon need another upgrade.

I think you should get a small SSD (100-120GB is good for Windows + some apps + a few games) with the money you have saved, because I saw a huge speed-up from that, and then start saving for a longer-lasting upgrade.


----------



## Shambles1980 (May 10, 2014)

Any reason not to get an AMD FX-6300?


----------



## Jetster (May 10, 2014)

Shambles1980 said:


> any reason not to get a AMD FX-6300 ?



Because it can't hold a candle to an Intel i5.


----------



## Vario (May 10, 2014)

Shambles1980 said:


> any reason not to get a AMD FX-6300 ?


Because its a joke?


----------



## TheoneandonlyMrK (May 10, 2014)

Really? I disagree. Compared to a Q6600 (my last Intel chip), my Phenom quad was noticeably quicker in all tasks, and my FX is a leap ahead of a Q6600 and in all gaming loads is more than adequate for the next five to ten years. Go cheap, go FX-83##, live happy. I have..


----------



## Lopez0101 (May 10, 2014)

My old, old computer had a Q6600. It was a great chip, but it would be crushed by my current computer. As others have mentioned, it's not just the CPU's performance, but all the other newer technology that comes with newer chipsets: USB 3.0, PCI-E 3.0, SATA III, etc. Back then, mobos still had IDE headers as standard and DDR3 was just coming out; mine had both DDR2 and DDR3 slots.

My Q6600 and 4850 would be hopeless in any modern game trying to run at 1080p. Also, running an LCD below its native resolution does sacrifice some visual clarity, no matter what your graphics settings are. Running less than native on my monitor makes edges look a little bit fuzzy.


----------



## RejZoR (May 10, 2014)

If the Q6600 is reasonably overclocked, I think it can still be a pretty good CPU. Stock, not so much.


----------



## Toothless (May 10, 2014)

Vario said:


> Because its a joke?


It's a good budget CPU.. I'll be running it soon.


----------



## Shambles1980 (May 10, 2014)

Game-Debate says it's about 10% better than an i5 2400 3.1GHz,
1% better than an i5-2500K (at stock),
and 54% better than my Q6600 (at stock)..

What about an FX-8120 then? Is that any better? (I have a feeling you guys will say no, because Game-Debate says it's only 3% better than the FX-6300.)

It looks like, at these prices, the only options I really have for a step up by the margin I expect, for the price I'm willing to pay, would be one of those.
For some reason the i5-2400 and 2500K are still ridiculously expensive, considering the % of performance difference I would actually notice in the real world..

I can buy a new FX-6300 + new board + 4GB of new RAM for the price of one and a half second-hand i5-2500K chips.
For 1% less performance in general gaming, that does seem like a bit of a steep price.
I have noticed that the USA prices aren't nearly as ludicrous as the UK prices, but shipping + customs charges then put it back in the "you must be stupid" price bracket.


----------



## Toothless (May 10, 2014)

Shambles1980 said:


> game debate say its about 10% better than an i5 2400 3.1Ghz,
> 1% better than an i5-2500k (at stock)
> and 54% better than my q6600 (at stock)..
> 
> ...


Game Debate isn't a good place to compare. Looking at benchmarks against other CPUs is about the best you can get.


----------



## Lopez0101 (May 10, 2014)

It's pretty well known Intel's chips are overpriced, and that's because AMD can't compete with their higher-end chips, so they can charge whatever they want. But really, the upper FX chips are going to serve you just fine for gaming. I just built my system with a 4770K and I kind of wish I'd gone with an AMD setup and saved myself several hundred dollars. The only downside to going with an FX, to me, was the older chipset at the time I bought my system.


----------



## Toothless (May 11, 2014)

Lopez0101 said:


> It's pretty well known Intel's chips are overpriced and that's because AMD can't compete with their higher-end chips, so they can charge whatever they want. But, really, the upper FX chips are going to serve your just fine for gaming. I just built my system with a 4770k and I kind of wish I'd gone with an AMD setup and saved myself several hundred dollars. The only downside with going with an FX, to me, is the older chipset, at the time I bought my system.


AMD also likes to shove a lot of wattage into their chips. And single-threaded apps don't run as well as they do on Intel chips.


----------



## TheoneandonlyMrK (May 11, 2014)

Yet the hundred notes I saved turned a 7850 GPU into a 7970, and all modern games for the next ten years are AMD x64 x8, go figure. And wattage-schmottage, OCers care little for such petty nonsense.


----------



## eidairaman1 (May 11, 2014)

Check my sig rig. It's a work in progress, but once I'm done it will be a solid gaming/media-crunching machine.


----------



## Jetster (May 11, 2014)

Either way is fine. All I can say is, my 3770K w/ two 7950s up against an 8350 w/ two 7970s: I scored 10% higher on 3DMark and used 100 fewer watts. But both systems run well.

But I would take an i5 4670K over an 8350 anytime, and it's only what, $30 more?


----------



## remixedcat (May 11, 2014)

If you want larger icons, text, etc., you can just turn the DPI up, or right-click > View > Large icons.


----------



## Shambles1980 (May 11, 2014)

remixedcat said:


> if you want to have larger icons, text, etc you can just turn the DPI up or just right click>view>large icons


lol, my desktop is at 1920x1080, it's just games that aren't.


----------



## TheoneandonlyMrK (May 11, 2014)

Jetster said:


> Ether way is fine. All i can say is my 3770K w/ 2 - 7950s up against a 8350 w/ 2 - 7970s I scored 10% higher on 3d mark. And used 100 less watts. But both systems run good
> 
> But I would take a i5 4670K over a 8350 anytime and its only what $30 more


Only a tool would believe a four-core has more longevity than an eight-core at this moment. And drop the Crossfire and I've no doubt my rig would claw that ten percent back at 5GHz, albeit with 2-300 watts more power used. And no, I'm not arsed; cost was king when I upgraded.


----------



## Vario (May 11, 2014)

theoneandonlymrk said:


> Only a tool would believe a four-core has more longevity than an eight-core at this moment. And drop the Crossfire and I've no doubt my rig would claw that ten percent back at 5GHz, albeit with 2-300 watts more power used. And no, I'm not arsed; cost was king when I upgraded.


The 8350 is not a conventional 8-core; it has 4 modules, each with 2 integer cores sharing 1 floating-point unit.
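That module layout can be put into a tiny illustrative model (a sketch of the counts being argued about, nothing more):

```python
# Toy model of the Bulldozer/Piledriver module layout described above:
# 4 modules, each with 2 integer cores sharing 1 floating-point unit.
MODULES = 4
INT_CORES_PER_MODULE = 2
FPUS_PER_MODULE = 1

int_cores = MODULES * INT_CORES_PER_MODULE  # what AMD markets as "8 cores"
fpus = MODULES * FPUS_PER_MODULE            # shared FP resources

print(f"{int_cores} integer cores, {fpus} shared FPUs")  # 8 integer cores, 4 shared FPUs
```

This is exactly why both sides of the thread are half right: 8 physical integer cores, but only 4 FP units.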


----------



## TheoneandonlyMrK (May 11, 2014)

And a PS4/Xbone has the same 8 integer and float units; an i5 has four of each.


----------



## Vario (May 11, 2014)

theoneandonlymrk said:


> Only a tool would believe a four-core has more longevity than an eight-core at this moment. And drop the Crossfire and I've no doubt my rig would claw that ten percent back at 5GHz, albeit with 2-300 watts more power used. And no, I'm not arsed; cost was king when I upgraded.





theoneandonlymrk said:


> And a ps4 xbone has the same 8 integer and float units , an I5 has four of each


What's your point? The 8350 is obsolete now. It's slower than the last 3 generations of Intel; how would it make sense to be less obsolete in the future? It's not a true 8-core, and core count doesn't dictate performance. I hope you use 7-Zip a lot, because at least it's good at that.
The 8350 is basically the modern equivalent of the Pentium 4. If I were buying on a budget I'd buy used.

Historical example: if you had bought an i7 920 in 2008, you could still run things as well as an 8350 at present, and you would have been running it 6 years longer. That's much more longevity for your money. I don't see how 6 years from now an 8350 will be worthwhile.

Example of an i7 at 3.3 vs an 8350:
http://www.anandtech.com/bench/product/99?vs=697


----------



## Toothless (May 11, 2014)

Guise, don't forget Intel has hyperthreading feeding off those godly cores; they've perfected single-core/multi-core performance. People go to AMD for cheaper chips that, in some games, work better than Intel. It's all in how you want to do something.


----------



## qu4k3r (May 11, 2014)

I have a Q6600 OC'd @3.2GHz; it was able to play some games (Far Cry 3, BF3, Hitman: Absolution, GRID 2, Metro 2033). Maybe those games are not the most CPU-demanding, but it ran them nicely. I can barely feel any difference when I compare the OC'd Q6600 vs an FX-6300 at stock frequency, or an FX-6300 @4GHz.

I agree the Q6600 is an old, power-hungry chip, but after reading these two articles I can verify it is still a good chip nowadays in terms of performance for a basic gaming PC.
http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487.html
http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584.html


----------



## Shambles1980 (May 11, 2014)

Well, I decided that for the money I want to spend, and the performance I wanted to gain for that money, I had to go AMD, so I went along and bought used..
It's a Gigabyte GA-78LMT-S2P and an FX-8120. I figured a 6300 would have been better really, but prices are what they are, and with the wife being quite tight I'm lucky to have pulled off any sort of purchase. I only have 4GB (2x2) 1333 for now; I'll get that changed to 2x4 or 2x8 in the next few weeks or in a month or so.
Not really unhappy about it; the whole lot cost less than a second-hand i5-2500K..

I do realize that I got a Bulldozer and not a Piledriver, and I'm sure I will be put in my place for that. But everywhere I looked, this chip is better than a Q6600. Hopefully some magic with Mantle and junk like that can get more out of the unused cores (doubt it), or at least the GPU can utilize them.

But I spent less than expected, which is always nice, and the wife even seems happy, as she had seen my £300+ wishlist, then my AMD £160 wishlist, and in the end I managed to get the stuff I needed for more than £60 under my lowest build price.
I really think it's not too bad of a build for less than £100.


----------



## Jetster (May 11, 2014)

You will be happy. It will bring new life to your GPU


----------



## Vario (May 12, 2014)

Shambles1980 said:


> well decided that for the money i want to spend. and the performance i wanted to gain for that money i had to go amd. went allong and bought used..
> so its a gigabyte GA-78LMT-S2P and fx-8120 Figured a 6300 would have been better really but prices are what they are and wife being quite tight im lucky to have pulled any sort of purchace off. only having 4gb 2x2 1333 for now. will get that changed to 2x4 or 2x8 in the next few weeks or in a month or so.
> not really unhappy about it whole lot cost less than a second hand i5-2500k..
> 
> ...


Hopefully it's faster than the Q6600.


----------



## Toothless (May 12, 2014)

OP will see a speed boost. I'd like to hear some feedback on the multitasking performance coming from that rig.


----------



## Misaki (May 12, 2014)

The Q6600 is still a good chip, but incredibly hot. If you don't need to upgrade, don't do it. I have a Phenom II X6 1045T and I don't want to upgrade. I've seen how newer Intel chips perform (i7 860, i5 2500, i5 4670K) and it's not a big difference in my apps and games.

Of course the Q6600 is much slower, but you're talking about games, not Blender, Sony Vegas, Photoshop, etc.


----------



## TheoneandonlyMrK (May 12, 2014)

Vario said:


> Hopefully its faster then the q6600.


Since all along you have been speculating, whereas I've recounted experience: he will be better off. And my earlier point is that games will use all 8 integer cores of even my obsolete CPU, so eight cores is where the entry point needs setting now for long-term PC gamer ownership, IMHO.
An i5 has not got 8 even with HT, so an i7 is Intel's best bet at this point IMHO, or an 8-core AMD. You can call it and me what you want, but I'll still be right, Vario, and DDR3 alone smoothed out some frame-rate inconsistency for me, so the OP should be quite happy with his new PC.
Oh, and I have an i7 920; it's not bad and will still be adequate for most games, but unsurprisingly, given its age, it does not outperform an 8350, or an 8150 for that matter.
Still a good chip for my Steam box though.


----------



## Toothless (May 12, 2014)

Basically, an i5 is just an i7 w/o HT. A lot of these chips even come off the same die, sold with HT or cores disabled due to issues with HT and/or that core.


----------



## TheoneandonlyMrK (May 12, 2014)

I suppose I did sound unsure.
I know all that; I've been around a while, mate, ty.
And because I was there when all this shit came about, I already advised as well as it's possible to do.
No console uses HT; both main consoles (for the foreseeable future) have 8 AMD cores. Four good Intel cores may well out-compute 8 low-power AMD cores, but if games are made to use 8 efficiently, then four probably won't cut it in a few years when game devs start maxing the new console architectures.


----------



## OneMoar (May 12, 2014)

Let's not forget the memory he's running is DDR2; that alone warrants an upgrade. He's getting passable performance because he's running at 1280x1024, most likely on an old CRT.
The cheapest i5/B85 with DDR3 1333 will run circles around it for about ~300.


----------



## Toothless (May 12, 2014)

theoneandonlymrk said:


> I suppose I did sound unsure.
> I know all that , , I've been around a while mate ty.
> And because I was there when all this shit came about I already advised as well as its possible to do.
> No console uses Ht  both main (for the foreseeable future) consoles have 8 amd cores, four good intel cores may well out compute 8 low power amd cores but if games are made to use 8 efficiently then four probably won't cut it in a few years when game devs start maxing the new console architectures.


Depending on when the next console hardware update will be, I'm sure Intel could get a slip in. 

(Let's start picking out the new names for new consoles)


----------



## TheoneandonlyMrK (May 12, 2014)

Easy: PS4 and Xbox 360, doh. I'd vote 32 ARM64 cores in each with 8 x86 side by side and a 4000-core GPU each, all on one die, but that'd be 5-10 years off, when TSMC and GloFo have 10nm sorted.


----------



## Shambles1980 (May 12, 2014)

lmao, how many times..
My monitor is a 1080p60; my desktop resolution is 1920x1080.

It's only games I change the resolution on.
I have done it for years; it's just easier to see stuff off in the distance and tell that it's something I need to shoot than it is to see the same thing at a teenie-tiny scale.
Sure it looks a little bit better (not enough better to make the fuss about it that you do, though), but I don't have a giant monitor, it's something in the region of 22" I'd guess.. and I'm not that close to it either lol...

I feel like I should start gaming at 320x240 just to wind you up lol.

As for the CPU, I am looking at it as a quad-core with some AMD version of HT.


----------



## MxPhenom 216 (May 12, 2014)

I don't know; that 8120 might have been the worst choice you could have made.


----------



## Lopez0101 (May 12, 2014)

Shambles1980 said:


> just easier to see stuff off in the distance



Not sure what games you're playing, but increasing the res in a game doesn't make everything smaller like it does icons on the desktop, unless you're playing an isometric-view, pixel-based game like Diablo 2.


----------



## rtwjunkie (May 12, 2014)

Lopez0101 said:


> Not sure what games you're playing, but increasing the res on a game doesn't make everything smaller like it does icons on the desktop. Unless you're playing an isometric view, pixel based game like Diablo 2.


 
Indeed. It adds more pixels, which increases detail.


----------



## 64K (May 12, 2014)

Shambles1980 said:


> I feel like i should start gaming at 320x240 just to wind you up lol.



320x240? That takes me back to my C-64 days, but I think it was 320x200 of pixelated bliss.


----------



## Shambles1980 (May 12, 2014)

Well, motherboard, CPU, and RAM for about $160 USD can't really be a bad choice, IMO. A second-hand i5-2500K would cost me more here, and that's without a motherboard or RAM (I did look). For a board, RAM, and an i5 I would have been looking at about $507; is it really 350% better? "I know it isn't, by the way."
So if all we are looking at is bragging rights, better single-threaded performance, and a mixed bag of equal-to, worse-than, and better-than gaming performance depending on the game, I'd rather save the money lol. (It would be stupid not to, wouldn't you?)

But I do think a 6300 would have been better than an 8120.
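That price-to-performance argument can be roughed out in a few lines. A sketch only: the prices are the approximate figures from the post above, and the "roughly equal" performance assumption comes from the Game-Debate numbers quoted earlier in the thread, so none of this is benchmarked.

```python
# Rough price-vs-performance comparison using the figures quoted in the thread.
# All inputs are assumptions taken from the posts, not measured benchmarks.

amd_cost = 160.0    # FX-8120 + board + 4GB RAM, approx USD (from the post)
intel_cost = 507.0  # used i5-2500K + board + RAM, approx USD (from the post)

price_ratio = intel_cost / amd_cost

# Game-Debate's claimed stock gaming gap: FX-6300 ~1% ahead of an i5-2500K,
# so treat the two builds as roughly equal in claimed gaming performance.
print(f"Intel build costs {price_ratio:.2f}x as much")  # ~3.17x
print("for roughly the same claimed gaming performance")
```

So the Intel build carries roughly a 3x price premium on these assumed numbers, which is the gap the post is weighing up.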


----------



## Toothless (May 12, 2014)

Shambles1980 said:


> lmao how many times.
> my monitor is a 1080p 60. my desk top resolution is. 1920x1080
> 
> its only games i change the resolution on.
> ...


There is no AMD version of Hyper-Threading.
And if you play at 320x240, expect people to throw NVIDIA TNTs at you in the most loving manner.


----------



## Toothless (May 12, 2014)

Shambles1980 said:


> well mother board cpu and ram for about $160 usd cant really be a bad choice imo. a second hand i5-2500k would cost me more here and thats without a mother board or ram. (i did look), for a board ram and an i5 i would have been looking at about  $507 is it really 350% better? "i know it isnt by the way"
> so if all we are looking at is bragging rights. better single threaded performance, and a mixed bag of even equal to worse than and better than gaming performance depending on the game, id rather save the money lol. (would be stupid not to wouldnt you?)
> 
> but i do think a 6300 would have been better than an 8120.


In some benchmarks the 6300 does beat the 8120, and it's also cheaper from what I've seen.


----------



## Vario (May 12, 2014)

Lightbulbie said:


> In some benchmarks, the 6300 does beat the 8120, and is also cheaper from what I've seen.


The only advantage the 8120 has over the 6300 is the quad-module configuration; the 6300 is faster otherwise. But since you wanted advice, I'd advise you strongly not to buy an 8120. Buy an Athlon X4 760K if you are going budget.


----------



## eidairaman1 (May 12, 2014)

Stick with the 8120


----------



## TheoneandonlyMrK (May 12, 2014)

Vario said:


> The only advantage the 8120 has over the 6300 is the quad-module configuration; the 6300 is faster otherwise. But since you wanted advice, I'd advise you strongly not to buy an 8120. Buy an Athlon X4 760K if you are going budget.


An Athlon quad to swap out a Q6600? Seriously, your advice is below worthless.
A Q6600 is as good a quad as is necessary for most present ported games; as for the future, a quad anything will suck and 8 cores won't, simple.


----------



## Delta6326 (May 12, 2014)

I run a Q6600 with a 7870 LE and have yet to run into problems with games at 1080p; always 35+ fps, usually close to 50. I'd wait for Skylake.


----------



## Vario (May 12, 2014)

theoneandonlymrk said:


> An Athlon quad to swap out a Q6600? Seriously, your advice is below worthless.
> A Q6600 is as good a quad as is necessary for most present ported games; as for the future, a quad anything will suck and 8 cores won't, simple.


Can you tell us anything else from the future, most exalted seer?


----------



## Shambles1980 (May 12, 2014)

I just want to reiterate that I don't see the AMD 8xxx as 8 cores; I really think of them as quad-cores with an AMD version of hyperthreading.
So really it's kind of an i7, but one that only manages to play ball with lower-end i5s for the most part, though it can compete with a 2500K in some situations (not single-threaded though), and if I think of them like that then they do seem very poor in comparison..
But they do have enough grunt to power decent GPUs and can definitely let me play the games I want to play, how I want to play them.
I won't achieve higher-than-needed maximums, although that is never what I care about; what I do care about is the minimum fps. All I want to do is game at a constant 60fps: vsync on, double/triple buffering, tweak the settings and the res just right, and never drop below 60fps.

I may need to up the GPU a bit; I'm yet to find that out. But I'm pretty sure the CPU has enough of what is needed to keep me ticking over at what I want to do.

And at the least I will have some DDR3 RAM now, so that's one less item on the things-needed-to-upgrade list.
I may well get a 2500K or similar when 14nm tech arrives.
What I have done here is basically swap my old board, CPU, and RAM for this board, CPU, and RAM, which gives me one less item to upgrade later, bringing my costs down, and the money for the Q6600 setup is about the same as the money spent on this..


----------



## Toothless (May 12, 2014)

theoneandonlymrk said:


> Athlon quad to swap out a q6600 seriously your advice is below worthless.
> A q6600 is as good a quad as is necessary for most present ported games as for the future a quad anything will suck balls and 8 cores wont simple.


No need to be insulting now.

The Q6600 is a good CPU, yes, but it's also old. The X4 760K won't do a good job as an upgrade; at LEAST an FX-6300 would work. If you want to go higher, then pick an i5 and up.


----------



## Vario (May 12, 2014)

Lightbulbie said:


> No need to be insulting now.
> 
> The Q6600 is a good CPU, yes. But it's also old. The X4 760k won't do a good job for an upgrade. At LEAST a FX-6300 for an upgrade would work. If you want to go higher, then pick an i5 and up.


Still waiting for the OP to find that his 8120 is slower than the Q6600 (@3.8) for games. Bulldozer is horrid.

I only mentioned the 760K as it's cheap: a sidegrade that would get him faster and more RAM, a newer mobo, etc.


----------



## GreiverBlade (May 12, 2014)

Vario said:


> The only advantage the 8120 has over the 6300 is the quad-module configuration; the 6300 is faster otherwise. But since you wanted advice, I'd advise you strongly not to buy an 8120. Buy an Athlon X4 760K if you are going budget.


I ran an X4 760K setup; for the price, it performed pretty well in nearly all the games I played (with an R9 270, followed by a GTX 580, and then an R9 270X), and I tested the 6300 and saw nearly no difference game-wise versus the 760K.
I also tested an A10-7700K (since I had my X4 760K on an FM2+ A88X mobo). Of course my E8500 paired with an R9 270X does well (heck, even with the R7 240 it has now), and I also had an i7 920 and an SLI setup paired with a Xeon E3-1275V2.

Yet if it were a Q9650 and not a Q6600, I would say wait for the next gen, as it would be a pretty capable quad (a Q9550 OC'd to 4GHz is near an i5 760 on the perf level, and that's a pretty capable CPU even for current games).

In the end I chose the cheaper solution and still all my games run smooth... damn money, you are a pain, you ruin everything!


----------



## Shambles1980 (May 12, 2014)

Upgrading for me isn't a quick deal with a wife lol..
My next upgrade will probably be slower than this one, and most likely will be an i3 + motherboard, which I will help fund by selling this board and the 8120..

But then I will have DDR3 RAM, not DDR2 (from this "for all purposes free" upgrade), and a board that will support the i5-2500K (and/or better, depending on what I see at what prices)..
And then I'm just a CPU away from where you guys think I need to be,
having spread the cost out, whilst still being able to game, and inevitably spending less money.

It's the same method I used to upgrade from a Pentium 4 to a Q6600, via a Pentium D on LGA 775.
I really can't just say "that's what I want" and buy all those parts now.
Back then I didn't have a wife, but I was young and broke, so the basic premise is still the same: spread out the costs and get what I can, when I can.

Who knows, the 8120 may well be fine and I won't want to upgrade. But at least I do have the DDR3 RAM now and a system to use.
And I will probably find an i3 + board for a similar price to what I can sell the 8120 + board for,
but I would not have found an i3 + board + RAM for the same price I could sell my Q6600 + DDR2 RAM for..
I'm working on the premise that an 8-core system will hold its value better (on eBay) than a quad-core.


----------



## Toothless (May 12, 2014)

Vario said:


> still waiting for the OP to find that his 8120 is slower than the q6600 (@ 3.8) for games.  Bulldozer is horrid.
> 
> I only mentioned the 760k as its cheap, a sidegrade that would get him faster and more ram, newer mobo etc


I can see that, and updated memory and whatnot would be a good start for an upgrade. I've looked around and seen that L3 cache really helps in games, which most (if not all) Athlons lack. But for budget gaming? I can see that happening. My X4 620 just ran AC4 @1080p at around 40-45 fps paired with a 660.


----------



## Vario (May 12, 2014)

Lightbulbie said:


> I can see that, and updated memory and whatnot would be a good start for an upgrade. I've looked around and  saw that L3 cache is really good in games, which most (if not all) Athlons lack. But for budget gaming? I can see that happening. My X4 620 just ran AC4 @1080p at around 40-45 fps paired with a 660.


The 620 was a Propus, essentially a Phenom II (Deneb) without L3. Current Athlons also lack L3; they are similar to a 4300 but without the L3. The 620 was a true quad-core, whereas the current Athlons are dual-module. You are right that the L3 does help.


----------



## Jetster (May 12, 2014)

http://www.futuremark.com/hardware/cpu


----------



## 64K (May 12, 2014)

Since we're discussing a CPU from a gaming standpoint I thought I'd throw this in to consider.

http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-5.html


----------



## GreiverBlade (May 12, 2014)

64K said:


> Since we're discussing a CPU from a gaming standpoint I thought I'd throw this in to consider.
> 
> http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-5.html


Thanks, now I have confirmation that a Q9650 to replace the E8500 in my DC7900 is a good idea... if I can find one cheap, that is.


----------



## Dent1 (May 12, 2014)

Vario said:


> still waiting for the OP to find that his 8120 is slower than the q6600 (@ 3.8) for games.  Bulldozer is horrid.
> 
> I only mentioned the 760k as its cheap, a sidegrade that would get him faster and more ram, newer mobo etc



Not strictly true. Bulldozer isn't much slower than Piledriver; Piledriver is a small update. To say Bulldozer is horrid is to say Piledriver is also horrid.

Also, the Bulldozer FX-8120 reviews were done in 2011. The Bulldozer would stand a much better chance with newer games from 2014 onwards than the Q6600 would.


----------



## Dent1 (May 12, 2014)

Vario said:


> Whats your point?  The 8350 is obsolete now.  Its slower than the last 3 generations of Intel.  How would it make sense to be less obsolete in the future?  Its not a true 8 core and cores don't dictate performance.  I hope you use 7zip a lot because at least its good at that.
> 8350 is basically the modern equivalent of the Pentium 4.  If I were buying on a budget I'd buy used.




The 8350 does have 8 cores. You obviously have no clue about technology; it's Intel that has virtual cores.




Vario said:


> Historical example: if you bought an i7 920 in 2008.  You could still run things as fine as an 8350 at present and you would have been running it 6 years longer.  That's much more longevity for your money.  I don't see how 6 years from now an 8350 will be worth while.
> 
> 
> 
> ...



Looking at that review, the FX 8350 actually wins a fair few and is quite competitive in others. You obviously can't read where it says "lower is better".


----------



## MxPhenom 216 (May 13, 2014)

Dent1 said:


> The 8350 does have 8 cores.  You obviously have no clue about technology. It's Intel that have virtual cores.
> 
> 
> 
> ...



It is not a traditional REAL 8-core though. Each module has 2 cores. It is hardware based, you're right, not like Intel's virtual cores.

It's just two different ways of going about a multi-threaded CPU architecture.

Really, the debate about how good Bulldozer is or isn't is like beating a dead horse, at least from what I could see of the last page or two of this thread.

AMD has already faced that fate, and is planning to release new CPUs without this module architecture.

The OP already got the 8120, so there's no going back now. Just move along.


----------



## Vario (May 13, 2014)

Dent1 said:


> The 8350 does have 8 cores.  You obviously have no clue about technology. It's Intel that have virtual cores.
> 
> 
> 
> ...



Read the gaming benchmarks, the i7 crushes it on those and the OP is a gamer.

*Dragon Age: Origins - 1680 x 1050 - Max Settings (no AA/Vsync)*
Frames per Second - Higher is Better
166.6
139.2

*Dawn of War II - 1680 x 1050 - Ultra Settings*
Frames per Second - Higher is Better
80.3
70.5

*World of Warcraft*
FRAPS Runthrough - FPS - Higher is Better
104.1
91.5


----------



## Dent1 (May 13, 2014)

MxPhenom 216 said:


> It is not a traditional REAL 8 Core though. Each module has 2 cores. It is hardware based you're right, not like intel's virtual.



You mean it's not a traditional microprocessor design. The cores still exist and are physical.



MxPhenom 216 said:


> It's just two different ways of going about a multi-threaded CPU architecture.


Yes



MxPhenom 216 said:


> Really the debate about it, and how good bulldozer is or isn't good is like beating a dead horse. Atleast from what I could see from the last page or 2 of this thread.
> 
> The OP already got the 8120 so there's no going back now. So just move along.



I didn't read the entire thread. Just wanted to reply to the most outlandish comments.


----------



## MxPhenom 216 (May 13, 2014)

GreiverBlade said:


> thanks now i have the confirmation that a Q9650 to replace  the E8500 for my DC7900 is a good idea  if i can find one cheap, that is...



Dude, the Q9550/9650 were awesome chips! My buddy had a Q9550 that did ~500FSB on a Gigabyte UD3P board. He kept that chip for so long.


----------



## Dent1 (May 13, 2014)

Vario said:


> Read the gaming benchmarks, the i7 crushes it on those and the OP is a gamer.
> 
> *Dragon Age Origins - 1680 x 1050 - Max Settings (no AA/Vsync)*
> Frames per Second - Higher is Better
> ...



I like how you ignore the parts of the review where the FX 8350 was spanking the i7 to focus on the gaming section, which comprises only 3 games?

Also, 166.6 FPS vs 139.2 FPS, 80.3 FPS vs 70.5 FPS, and 104.1 FPS vs 91.5 FPS are very small performance gaps considering the price point of both CPUs at the time.
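For scale, here's a quick back-of-the-envelope check of those three gaps (assuming, per the post above, the higher figure in each pair is the i7 and the lower the FX 8350):

```python
# FPS pairs quoted above, per game: (faster chip, slower chip)
results = {
    "Dragon Age: Origins": (166.6, 139.2),
    "Dawn of War II": (80.3, 70.5),
    "World of Warcraft": (104.1, 91.5),
}

for game, (fast, slow) in results.items():
    gap = (fast - slow) / slow * 100  # percent advantage of the faster chip
    print(f"{game}: {gap:.1f}% faster")
```

So the lead is roughly 14-20% in these three titles, which is the "small gap" being argued about.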


----------



## xenocide (May 13, 2014)

theoneandonlymrk said:


> Athlon quad to swap out a q6600 seriously your advice is below worthless.
> A q6600 is as good a quad as is necessary for most present ported games as for the future a quad anything will suck balls and 8 cores wont simple.


 
Except you're wrong.  The Athlon X4 750K/760K are on par with the FX-6300's performance for less than the FX-4300's price.  On top of that, it's a newer chipset, so you get PCI Express 3.0 for cheap, SATA 3, USB 3, and it all uses DDR3, which is handily better than the bottlenecked mess that is LGA775 at this point.  I am curious what _most_ games are, because a lot of the games out there benefit heavily from CPU bumps: any MMO, any RTS, any simulation game, and quite a few more.  I suppose by "most" you mean most FPSes, but even the likes of BF3/4 and the Metro series benefit greatly from CPU performance.

Most of the newest AMD quad-cores (APUs being an exception) are at least on par with Nehalem-level Intel CPUs, which handily outperformed the Q6600.  All that being said, I would recommend an i3 or i5 over anything AMD has to offer, because you get all the shiny features AM3+ lacks for a decent price, and if you don't plan on overclocking you can get decent CPUs for solid prices.  In GPU-bound games an i5 is on par with an FX-8350, and in most CPU-bound games it blows it away, especially games that favor Intel's architecture like MMOs and RTSes.


----------



## Vario (May 13, 2014)

Dent1 said:


> I like how you ignore the parts of the review where the FX 8350 was spanking the i7 to focus on the gaming section which comprises of only 3 games?
> 
> Also 166FPS vs 139FPS, 80.30 FPS v 70.5 FPS, 104.1 FPS v 91.5  FPS  is a very small performance gap considering the price point of both CPUs at the time.


Yes, because the OP is a gamer.  Not sure what you mean by "at the time"; it came out 4 years earlier!

My argument was merely that, for gaming, buying a good processor instead of a budget processor pays off in the long run and will save you money.  If he buys a Bulldozer it will cost him more money in the long run.

For example, he bought a Q6600, and that chip has lasted him a long time.  If he had bought a budget CPU he'd have already upgraded by now.  The Q6600 launched in 2007 and is still relevant today, 7 years later.  Do you really think an FX 8120 will be relevant in 2021?

It is an argument meant as a retort to this comment:


theoneandonlymrk said:


> Only a tool would believe a four core has more longevity then an eight core in this moment and drop the xfire and ive no doubt my rig would claw that ten percent back at 5ghz yet 2-300watts more power used and no im not assed cost was king when I upgraded


----------



## Toothless (May 13, 2014)

Thread went from helping the OP to the ancient Intel/AMD battle.

Y'all starting to make me regret my FX-6300.


----------



## Dent1 (May 13, 2014)

Vario said:


> Yes because the OP is a gamer.  Not sure what you mean by "at the time", it came out 4 years earlier!
> 
> My argument was merely to prove that for gaming, buying a good processor instead of buying a budget processor, pays off in the long run and will save you money.  If he buys a bulldozer it will cost him more money in the long run.
> 
> ...



So you're saying a budget AMD can't last a long time? I've had my Athlon II X4 620 since 2009 and it's still going strong. Your point is invalid.


Secondly, I have not seen any evidence that a Q6600 Kentsfield can outperform an FX 8120 in gaming or otherwise. You needed a Yorkfield Q9xxx just to match the Phenom II X4.  The Q6600 would stand no chance against a Bulldozer FX 8120. In single-threaded performance Bulldozer was often on par with the Phenom II X4 series, so logically you'd need a Q9xxx to hang with a Bulldozer in single-threaded gaming on its worst day.


----------



## Toothless (May 13, 2014)

Dent1 said:


> So you're saying a budget AMD can't last a long time? I've had my Athlon II X4 620 since 2009 and it's still going strong. Your point is invalid.
> 
> 
> Secondly, I have not seen any evidence that a Q6600 Kentsfield can outperform an FX 8120 in gaming or otherwise. You needed a Yorkfield Q9xxx just to match the Phenom II X4.  The Q6600 would stand no chance against a Bulldozer FX 8120. In single-threaded performance Bulldozer was often on par with the Phenom II X4 series, so logically you'd need a Q9xxx to hang with a Bulldozer in single-threaded gaming on its worst day.


I gotta agree on the 620, and not only does it last long; it's also pretty durable against the abuse thrown at it. I've had pins bent, had it fly off a heatsink and into a bathroom sink, had heatsinks fall on it, fan failure while running. The list can go on.

While yes, it does its job and does it well, I wanted something that could give me more of a multitasking boost (hence the 6300).

I also have an AMD-based laptop and good lordy, it WON'T die, even OC'ing it, playing with the voltage, running it at 100C. AMD has some chips that simply won't give up. But so does Intel.

My old school was running 5-7 year old CELERONS in their desktops. And don't get me started with what idiots do to those school machines.

On the matter of how long a CPU can last, let's just toss that topic out the window until we see FX-8xxx or i7-3xxx/i7-4xxx chips starting to blow up in a few years. My bet is that AMD has some FX-9750 sitting in their pockets, just waiting to give someone a 250W hug. 
(A joke about their FX-9xxx series, which is a big waste of 220W in my opinion)


----------



## Vario (May 13, 2014)

Dent1 said:


> So you're saying a budget AMD can't last a long time? I've had my Athlon II X4 620 since 2009 and it's still going strong. Your point is invalid.
> 
> 
> Secondly, I have not seen any evidence that a Q6600 Kentsfield can outperform an FX 8120 in gaming or otherwise. You needed a Yorkfield Q9xxx just to match the Phenom II X4.  The Q6600 would stand no chance against a Bulldozer FX 8120. In single-threaded performance Bulldozer was often on par with the Phenom II X4 series, so logically you'd need a Q9xxx to hang with a Bulldozer in single-threaded gaming on its worst day.


So with that 620 you are able to run modern games maxed without bottlenecking your graphics card?  CPUs rarely stop running "strong"; they just become obsolete. The i7 920 could max out games in 2008 and can still max out games now.  I was posting that to refute claims by theoneandonlymrk.

You're seizing on the wrong argument here.  If Bulldozer is on par with the Phenom II X4, and his Q6600 @ 3.8 gets nearly the same physics score as my Phenom II, then why is he upgrading to a Bulldozer?

Kentsfield and Yorkfield aren't all that different: http://www.anandtech.com/show/2362/4

edit:
his old system: http://www.3dmark.com/3dm11/8304206
my old system: http://www.3dmark.com/3dm11/5304049
an 8120 and 7850: http://www.3dmark.com/3dm11/3148764
another 8120 and 7850: http://www.3dmark.com/3dm11/5000159

doesn't look like an upgrade to me


----------



## TheoneandonlyMrK (May 13, 2014)

xenocide said:


> Except you're wrong.  The Athlon X4 750K/760K are on par for the FX-6300's performance for less than the FX-4300 CPU's.  On top of that it's a newer chipset so you get PCI-express 3 for cheap, SATA 3, USB3, and it all uses DDR3--which is handily better than the bottlenecked mess that is LGA775 at this point.  I am curious about what _most_ games are, because a lot of the games out there benefit heavily from CPU bumps.  Any MMO, any RTS, any Simulation game, and quite a few more.  I suppose by any you mean most FPS's, but even the likes of BF3/4 and the Metro series benefit greatly from CPU performance.
> 
> Most of the newest AMD quad-cores (APU's being an exception) are on par for Nahelem level Intel CPU's at least, which handily outperformed the Q6600.  All of that being said, I would recommend an i3 or i5 over anything AMD has to offer because you can get all the shiney features AM3+ lacks for a decent price and if you don't plan on overclocking can get decent CPU's for solid prices.  In GPU bound games an i5 is on par for an FX-8350, and in most CPU-bound games it blows it away--especialyl games that favor Intel's architecture like MMO's and RTS's.


Utter twoddle read my replys I talk of ffuture games not yesterdays and especially games that favour amd will run better in the future .


----------



## Toothless (May 13, 2014)

Vario said:


> So with that 620 you are able to run modern games maxed without bottlenecking your graphics card?  CPUs rarely stop running "strong", they just become obsolete. i7 920 could max out games in 2008 and can still max out games now.  I was posting that to refute claims by the oneandonlymrk.
> 
> You're seizing on the wrong argument here.  If the Bulldozer is on par with the Phenom II x4, and his Q6600 @ 3.8 gets nearly the same physics score as my Phenom II, then why is he upgrading to a Bulldozer.
> 
> ...


Just because a benchmark score is the same doesn't mean they'll perform the same in a game. My 620 was BEATEN by an A6 3420M (quad-core @ 2.6GHz lost to a quad-core @ 1.5GHz).

Now, granted, the GPU was a bit different. If I got a more powerful fan/cooling pad for my laptop, I could run it at or slightly below my current desktop setup (620 with a GTX 660 OC).

I'm sure my 620 could EASILY beat the 3420M in a program benchmark, but hey, whatever floats your sinking canoe. I'd say let's just show the OP some gaming benchmarks with different GPU combos and let him decide. This turned into a CPU war way too quickly.


----------



## xenocide (May 13, 2014)

theoneandonlymrk said:


> Utter twoddle read my replys I talk of ffuture games not yesterdays and especially games that favour amd will run better in the future .


 
That argument is garbage, because those CPUs will be outperformed by cheaper alternatives in the future.  There's no such thing as futureproofing, at least not past a year, and at a massive premium.  Do you really think that in 2-10 years, when all games can run on 8 threads, the FX-8350 will still be relevant?  By the time threading gets that high, AMD and Intel will have released CPUs infinitely more powerful.  It's silly to buy a CPU today for standards that won't exist for 2+ years (and I assure you, you might see a slight increase in heavily threaded games, but you definitely won't for at least that long, if not longer).


----------



## RCoon (May 13, 2014)

theoneandonlymrk said:


> Utter twoddle read my replys I talk of ffuture games not yesterdays and especially games that favour amd will run better in the future .


 
i5 quads are the choice of most gamers today, and will continue to be until some drastic change in APIs. Games are not getting more and more CPU bound; they are becoming more and more reliant on GPU cores, because of the way APIs work.

As for games that favour AMD processors, I can probably count them on one hand. Unless you're playing RTS (in which case an AMD 8350 is about as useful as a generic i5), games don't give a damn about your CPU unless you're running something that's ancient tech, like the original Core 2 Duos. My 8350 performed identically to my 3570K before I upgraded; I see no point in the 8 cores, especially when they both cost the same and the Intel offers better single-core performance. Granted, my 8350 whoops the Intel in multithreaded software like parsing and RARing, but besides that I never use my 8350 for gaming since I got the 4670.

Consoles have 8 cores, but they don't use all 8 for gaming. Half of them are used for running the 3 OSes that the consoles strangely run, so they're actually running games on around ~4 cores anyway.

OP could happily buy a 760K (which overclocks like nobody's business on air cooling), or a semi decent i3/i5 and not notice any difference between the two, unless something is pumping his PC with physics effects or thousands of calculations (see: RTS genre).

Future games will likely be just the same as they are today, largely badly made, ported, maybe use some proprietary API for some GPU or other. By then, different classes of processors will be out to cater for them. The future takes a long time, as does hardware recently. I'd buy an i5 and probably not bother upgrading it for 5 years. You can't talk about future games, because you have no idea what the future holds. I know I won't be upgrading my processor for a few years.

Bottom line is, if the OP is playing manshoot, foot to ball, or mandrive, the Q6600 is probably fine at 1080p, but in essence ANY new processor and motherboard might give him a few extra frames, plus better RAM and better onboard everything. I played Metro: Last Light on a 750K for christ's sake; paired with a 780 there was very little performance to be gained.


----------



## Dent1 (May 13, 2014)

Vario said:


> So with that 620 you are able to run modern games maxed without bottlenecking your graphics card?  CPUs rarely stop running "strong", they just become obsolete. i7 920 could max out games in 2008 and can still max out games now.  I was posting that to refute claims by the oneandonlymrk.




Yes, I am able to max out my games, hence why I've felt no need to upgrade since 2009. Crysis 1-2, Max Payne 3, Battlefield 2-4, TR Underworld, Serious Sam 3, Skyrim, Tomb Raider (2013) etc., just to name a few, all run at maxed or near-maxed detail, no lag, very good frame rates. I even have a 5850 CF setup, which at the time would be considered high end, and no sign of bottlenecking.




Vario said:


> You're seizing on the wrong argument here.  If the Bulldozer is on par with the Phenom II x4, and his Q6600 @ 3.8 gets nearly the same physics score as my Phenom II, then why is he upgrading to a Bulldozer.
> 
> Kentsfield and yorkfield aren't all that different. http://www.anandtech.com/show/2362/4



Firstly, that is an Extreme Edition Kentsfield. Much more powerful than the regular series. Your point is invalid.

I would like to see some gaming benchmarks of the "horrid" Bulldozer FX 8120 against a Q6xxx Kentsfield side by side. If you can't provide that, I won't entertain you any further.




Vario said:


> edit:
> his old system:http://www.3dmark.com/3dm11/8304206
> my old system: http://www.3dmark.com/3dm11/5304049
> an 8120 and 7850: http://www.3dmark.com/3dm11/3148764
> ...



Who cares about physics scores? Earlier you said your argument hinged on gaming, right? Now you're using a synthetic benchmark to argue the case? A higher physics score doesn't mean it will translate into better performance in reality. And you need a cross-section of tests from an independent reviewer, not just a cherry-picked 3DMark run.


----------



## TheoneandonlyMrK (May 13, 2014)

Bottom line guys is the Op has spent cheap Already so chill with the two page rants and your still wrong i have had my 8350 two years and in five ill put a fiver it Will play any console port still. 
 I heart intel but your pushing it a bit an I 5 simply Will be inadequate in more games sooner imho get over it.


----------



## RCoon (May 13, 2014)

theoneandonlymrk said:


> Bottom line guys is the Op has spent cheap Already so chill with the two page rants and your still wrong i have had my 8350 two years and in five ill put a fiver it Will play any console port still I heart intel but your pushing it a bit an I 5 simply Will be inadequate in more games sooner imho get over it.


 
All I hear is "you're wrong because reasons". You're providing no proof (or grammar for that matter).

Also as a point to note, telling people to "get over it" is really not providing any insightful thoughts to this thread. It's just an invitation to argue with someone who refuses to be debated with "because reasons".


----------



## Shambles1980 (May 13, 2014)

You're missing the point lol..
For what I sold my old setup for (board, CPU and RAM) I got this setup (board, CPU and RAM).
It is a slight upgrade (and, as stated in some of my earlier stipulations, at the same cost as I sold my old stuff, "so free").
But the important part is I now have DDR3 RAM and fewer limitations than my previous LGA775 setup. This, as I said, was virtually a direct swap, piece for piece, and I really had to do the transition now or never. If I held on to my Q6600 and DDR2 till 14nm came along, I believe the value would have dropped to a point where it could not get me anything worth getting.
Regardless of that, I now have DDR3 RAM, so IF I decide I need to upgrade I only need a board and CPU, which brings my upgrade cost down.

Like I said, if I do decide to get a board and CPU I will hold off till 14nm and then grab a board and i3 for cheap, and offset the costs by selling the 8-core AMD, which I'm sure will hold its price better on eBay (I don't see it becoming much more worthless any time soon lol). It will probably drop performance a bit, but then I would upgrade the CPU later on.

For the time being, however, the whole system is going to be better than the Q6600, with the ability to use better GPUs more effectively.

Also, if we are talking 80fps vs 70fps when all I care about is 60fps, it really seems pretty stupid to spend $400 more for 10fps I won't see because of v-sync.


----------



## Jetster (May 13, 2014)

Facts are facts, and no one can tell what the future brings. Bulldozer was supposed to bring an evolution in gaming CPUs, and we saw what happened there. So you buy the best CPU you can for the money and go with it.


----------



## RCoon (May 13, 2014)

Shambles1980 said:


> also if we are talking 80fps vs 70 fps when all i care about is 60fps.. it really seems pretty stupid to spend $400 more for 10fps i wont see because of v synk..


 
Max FPS does not matter, but minimum FPS does. You might get 20,000 FPS at peak, but if you're getting 2 FPS as a minimum, you're not going to be a happy camper. That being said, if you're on an 8120 or whatever it was, I'd stick with that as opposed to selling it and buying an i3... wait until you have the monies for an i5, otherwise the sidegrade is a worthless waste of time and money.
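The min-vs-max point is easy to illustrate with a toy example (the frame-rate samples below are made up, not from any benchmark): a run with a higher average can still feel far worse if it has deep minimum-FPS dips.

```python
# Two hypothetical FPS traces: "spiky" has the higher average but stutters hard.
smooth = [58, 60, 62, 59, 61, 60]
spiky = [120, 118, 2, 119, 3, 118]

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    avg = sum(trace) / len(trace)
    print(f"{name}: avg {avg:.0f} FPS, min {min(trace)} FPS")
```

The spiky trace averages 80 FPS to the smooth trace's 60, yet its 2-3 FPS minimums are what the player actually notices.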



Jetster said:


> So you buy the best CPU you can for the money and go with it


Hurray for users with common sense.


----------



## TheoneandonlyMrK (May 13, 2014)

Feck .
The Op is so spot on here im a leave it to him.
RCOON im not on here to argue, the guy wanted advice , ive owned a q6600@3.4 -5ghz a phenom quad after that then an 8350 and an i7920  and I am telling you for defffffo what I saw and id agree it is not the apex of performance but it will do fine for many more years. 
Gramma tart calling = just funny to me.


----------



## RCoon (May 13, 2014)

theoneandonlymrk said:


> I am telling you for defffffo what I saw


 
"because reasons"

Seriously, sit and consider your posts before you drop them, some people are trying to be helpful, you just sound like a 14 year old that discovered the chans.


----------



## Shambles1980 (May 13, 2014)

Seeing as this has turned into an Intel vs AMD thing lol.
I'm going to throw some cats in amongst the pigeons..

OK, let's look at Steam, and APIs, and the fact that Steam want to make their own OS. Then let's look at Mantle, which is a better-functioning API because it's easier to get to the actual hardware.
Now let's think about things like OpenGL/CL.
MS are being slow and frankly rather useless with their DX updates, and they also force an OS change with these upgrades.
Hardware vendors nvidia/ati would prefer not to be confined to DirectX. Other OSes that can't use MS would rather not use DirectX. Steam would rather not use DirectX so Steam Box can work properly.

I'm sure developers are the only thing holding the switch back.
OpenGL is better for hardware-level access,
and I think the AMD chips are better suited for that situation..

BUT by the time that happens "which I'm sure it eventually will" these chips will be obsolete lol.


----------



## GreiverBlade (May 13, 2014)

xenocide said:


> Except you're wrong.  The Athlon X4 750K/760K are on par for the FX-6300's performance for less than the FX-4300 CPU's.


As I wrote a bit above... I tested both and there was no difference for me.



xenocide said:


> On top of that it's a newer chipset so you get PCI-express 3 for cheap


well only with a Kaveri APU



Vario said:


> Yes because the OP is a gamer.  Not sure what you mean by "at the time", it came out 4 years earlier!


I will have to go with: no, the i7 doesn't crush an 8350 in games... as long as the average FPS is 30 or above (oh well, let's say 60) you don't notice it... everything above that is just self-gratification and a little future proofing. I mean, c'mon... will 2014 games put current configs such as my FM2+ setup on their knees? Especially when you notice medium settings and high settings have little to no difference in eye candy (OK, I might not be objective, since I game only at 1080p 60Hz most of the time, no FXAA). 

unless you go for RTS... 



Shambles1980 said:


> for the time however the whole system is going to be better than the q6600 with the ability to use better gpu's more effectivly.
> 
> also if we are talking 80fps vs 70 fps when all i care about is 60fps.. it really seems pretty stupid to spend $400 more for 10fps i wont see because of v synk..


Exactly... well, on V-sync I differ... I never activate it...
And indeed that will still be better than the Q6600; as I said, if you had a Q9650 it would have been a bit different.


----------



## TheoneandonlyMrK (May 13, 2014)

We're not having a serious debate mate, we're mearly bickering about bs and massaging egos. 
I don't normally resort to such language in forums and apologise to any casual reader for any offence but it was chosen by me and I stand by it.
Back when I first joined this forum I was probably a bit over zealous and a smarty pants but I think I have matured a bit and don't try to get involved in heavy never ending debate anymore it gets to be pointless but im not up for being patronised.
And I only piped up in the first place due to experience not opinion but experience,  I don't mind people who dissagree with me But I do have a beef with people who think we can all update every year ,and we should all, always go intel or those that think you can't game on amd gear and enjoy the experience.


----------



## brandonwh64 (May 13, 2014)

I have owned a Q6600 at the same time as a Phenom II 965BE. The 965BE murdered it in gaming: I saw FPS increases of about 20 FPS with an AMD 5850. I then upgraded to an i7 920 and got at least 15+ FPS more than with the Q6600, so yes, there are better CPUs out there that will stomp the Q6600 in gaming.

If it were me and I was on a budget, I would look into getting a used P67/Z68 and a used 2500K. I saw a P67 the other day for $50 shipped with no issues, and a 2500K for $100 shipped. That would be a badass upgrade for $150.


----------



## Tatty_One (May 13, 2014)

Cleaned up. Stop the language and the crap please.  I appreciate that at times things can get heated, but I hate getting to the stage where infractions need to be issued, and we crossed that line a few posts ago sadly......thank you.


----------



## Dent1 (May 13, 2014)

Tatty_One said:


> Cleaned up, stop the language and the crap please,  I appreciate at times things can get heated but I hate getting to the stage where infractions need to be issued and we crossed that line a few posts ago sadly......thank you.





theoneandonlymrk said:


> We're not having a serious debate mate, we're mearly bickering about bs and massaging egos.
> I don't normally resort to such language in forums and apologise to any casual reader for any offence but it was chosen by me and I stand by it.
> Back when I first joined this forum I was probably a bit over zealous and a smarty pants but I think I have matured a bit and don't try to get involved in heavy never ending debate anymore it gets to be pointless but im not up for being patronised.
> And I only piped up in the first place due to experience not opinion but experience,  I don't mind people who dissagree with me But I do have a beef with people who think we can all update every year ,and we should all, always go intel or those that think you can't game on amd gear and enjoy the experience.



A few trolls have aggravated a British person into resorting to foul language. We should all turn the other cheek and stop replying (including myself).


----------



## Tatty_One (May 13, 2014)

Dent1 said:


> A few trolls have aggravated a British person into resorting to foul language. We should all turn the other cheek and stop replying (including myself).


  Thanks, but British has nothing to do with it. Can we get back on topic now?


----------



## Shambles1980 (May 13, 2014)

brandonwh64 said:


> I have owned a Q6600 at the same time with a phenom II 965BE. The 965BE murdered it in gaming. I seen FPS increases with a AMD 5850 of about 20 FPS. I then upgraded to a I7-920 and got atleast 15+ fps more than with the Q6600 so yes there are better CPU's out there that will stomp the q6600 in gaming.
> 
> If it were me and I was on a budget, I would look into getting a used P67/Z68 and a used 2500K. I seen a P67 the other day for 50$ shipped with no issues and a 2500K for 100$ shipped. That would be a bad ass upgrade for 150$.




I had a look and there was no way I could get an i5. The cheapest second-hand i5 2500K was £90 ($151), and that was used. I would then have needed DDR3 RAM, which on average, even used, was £40 ($68),
and then I would have needed a board as well, which at the low end would have been £50 ($85).
That's all used, which would have taken me up to $300.
(P.S. The £90 i5 sold pretty fast; they were mostly £140 ish ($236),
which would take an average second-hand i5 build up to the $390 mark.)
So no, there was no way I could make a second-hand i5 rig for $150; the CPU alone is more than that here in the UK.

But I was able to get an AMD CPU, board and DDR3 RAM for less than the cost of an i5 (about $150), which incidentally is about the same as I got for my Q6600 + board + DDR2 RAM.

The USA prices are a lot lower than the UK prices on these chips, but after shipping and customs charges you're back in the region of "too expensive".
If we had the USA prices for the hardware here it would be an easy no-brainer,
but when you're looking at almost $400 compared to $150, choosing an i5 becomes a difficult thing to justify.
I would not have been able to just produce the $200 or so difference to get the i5 build, and the only way I can get a 2nd-gen i5 or i7 is by going via an i3 with a good board later on. It really is the only way I can spread the costs out.
I would have bought an i3 now instead of the AMD option, but the boards you can get with them aren't that good (usually some generic Dell pull), and it's a more modern socket than AM3+,
so I have to wait for 14nm for the upgrades to come through, which will put the boards out there for a cheaper price.
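For anyone checking the maths, here is a sketch of that cost comparison (the ~1.69 $/£ rate is inferred from the £90 ≈ $151 figure above; all prices are the used UK ones quoted):

```python
GBP_TO_USD = 1.69  # rough rate implied by the GBP 90 / USD 151 figure above

# Used i5 build, prices in GBP as quoted
used_i5_build = {"i5 2500K (cheapest listing)": 90, "DDR3 RAM": 40, "low-end board": 50}

total_gbp = sum(used_i5_build.values())            # 180
total_usd = total_gbp * GBP_TO_USD                 # ~304, the "$300" figure

# With the more typical GBP 140 CPU price instead of the one-off GBP 90 listing
typical_usd = (total_gbp - 90 + 140) * GBP_TO_USD  # ~389, the "$390 mark"

print(f"used i5 build: GBP {total_gbp} ~ USD {total_usd:.0f} (typical: USD {typical_usd:.0f})")
```

Against the ~$150 AMD swap, that is roughly a $150-240 premium for the i5 route, which is the whole argument.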


----------



## Toothless (May 13, 2014)

Dent1 said:


> Yes I am able to max out my games, hence why I've felt no need to upgrade it since 2009. Crysis 1-2, Max Payne 3, Battlefield 2-4, TR Underworld, Serious Sam 3, Skyrim, Tombraider (2013) etc just to name a few all run maxed out detail or  near maxed detail, no lag, very good frame rates. I even have 5850CF which at the time would be considered high end and no sign of bottlenecking.


Tell me your secret on BF3. Is it your overclock? I can't max it out with my setup.


----------



## RCoon (May 13, 2014)

Lightbulbie said:


> Tell me your secret on BF3. Is it your overclock? I can't max it out with my setup.


 
Looks like he's on a fairly low resolution on a 19" monitor. Certainly, not having to drive 1080p would probably allow him to run max settings.


----------



## Dent1 (May 13, 2014)

Lightbulbie said:


> Tell me your secret on BF3. Is it your overclock? I can't max it out with my setup.



I believe it is my 5850 CF.

5850 CF can outperform a single GeForce GTX 295, which was Nvidia's uber high-end card back in 2009. 5 years later, 5850 CF is still a midrange contender, on par with today's AMD 7850.

Which video card did you have at the time?

Edit:

I'm sure the OC does help too. Plus my resolution is only 1440x900.


----------



## brandonwh64 (May 13, 2014)

Shambles1980 said:


> i had a look and there was no way i could get an i5. the cheapest second hand i5 2500k was £90 ($151) that was used. i would then have needed ddr3 ram which on avarage even used was £40 ($68)
> and then i would have needed board as well. which at the low end would have been £50 (£85)
> thats all used. which would have taken me up to $300
> (p.s the £90 i5 sold pretty fast, they were mostly £140 ish ($236)
> ...



Ahh OK, I just realized you were in the UK. Yeah, US prices on used equipment are a lot lower.


----------



## Bucho (May 13, 2014)

Hi guys, just signed up @ techpowerup.

Sorry for this long post coming up ...

I started PC gaming in late 1991 with a 386SX, and since then I've been hooked on PC hardware, software (games) and overclocking. I've built a lot of systems, since I had the chance to do so in my dad's company and for many friends (mainly the gaming aspect), and I've now worked as an IT engineer at a company for more than 13 years.

@Shambles1980
What's a "Pentium 4 486" you mentioned twice? I know a 486 (early to mid 90s) and a Pentium 4 (early 2000s).
And you were comparing a Pentium D and a Q6600 as "4 cores. and them being actual cores using a proper architecture and not just 2 pentium 4 cores glued together". That's not really true. Yes, a Pentium D is just two Pentium 4 cores together, and a Core 2 Duo is a true native dual core, but a Core 2 Quad is again just a "glued together" Core 2 Duo. AMD's Phenom was the first true native quad core for the end-customer market.

Back to the Topic ...

Your problem / upgrade wish is pretty common, since a lot of people I know (including myself) still sit on their old quad CPUs (Core 2, first Core i gen, and Phenom or Athlon X4) and ask themselves if they should upgrade, and if so, what they should get. If money were no matter they would not even have such an old system in the first place, and they wouldn't think twice about upgrading.
The main problem (discussed over and over all around the internet) is that CPU performance gains have (kind of) flattened over the past ~8 years. I say _kind of_ because IPC (instructions per cycle) kept improving and the core count increased in certain classes of CPUs. So why don't we see the big boost?
One thing is that CPU clock speeds didn't increase that much, at least not like in the days of the 486, Pentium/II/III/4, K6 and Athlon. Back then the MHz doubled and tripled from generation to generation, and the IPC got better too. No wonder an Athlon 1400 was almost twice as fast as a 700MHz Athlon, right? Or a 486DX2 66MHz almost twice as fast as a 33MHz one, at least in raw benchmarks where the bus speed, memory speed and other things didn't bottleneck.
The other thing is that quad cores are still the mainstream and performance CPUs. That has shifted a little: back in early 2007 you could already buy a quad core, okay, that was high end like the Socket 2011 CPUs (6 core) are now, but quad core became mainstream around mid 2008.
Sure, if you pay A LOT you can go out and buy an 8 or even 10 core Xeon and put it on a Socket 2011 board, but there is no mainstream/performance-class 6 or 8 core out there (besides the Phenom II X6).
And it seems like this is going to stay that way, judging by the latest news that Broadwell (the Haswell shrink) and even Skylake (the successor of Haswell/Broadwell) will still be quad core on the mainstream/performance Socket 1150/1151.

So as long as we don't see more programs that benefit from more than 4 cores, and no dramatic increase in clock speed, performance for the regular desktop PC will not increase as fast as it did in the years of the 486, Pentium and Athlon CPUs.

@Shambles1980
I don't think your upgrade was the best you could do. Okay, it was rather cheap, but that mainboard IS OUTDATED, with a very old chipset. It has no USB 3.0, no SATA 6Gb/s ports, no PCI-Express 3.0, only two memory sockets, and the overclocking capability is questionable since it does not seem to support a faster CPU than the FX-8300 (so no FX-8320, 8350 ...).
Memory at DDR3-1333 already counts as overclocked according to its specs.
The FX-8120 isn't a bad CPU, but programs have to make good use of it. Battlefield 4 and Windows 8/8.1 seem to like the FX CPUs, but the FX-8120 most of the time gets beaten by an i3 in games.

The prices of the i5-2500K are still high even used, because they are still very good and fast CPUs. The difference between Sandy Bridge (i5-2500K) and Ivy Bridge (i5-3570K) at the same clock speed is just a few percent, and the recent Haswell (i5-4670K) at the same clock speed is only 10-15% faster than Sandy Bridge. And the i5-2500K was a good overclocker, so 4.5GHz can be achieved with almost every chip (and sometimes way more). Same with Ivy Bridge and Haswell ... so it's not like you could OC that i5-4670K to 5GHz and beyond.

So my advice would have been that you get a used i5-2500K or i5-3570K. Yes that would have been more expensive but you can overclock them and be happy for the next few years.

Back to the Core 2 Quad ... since I own one (more than one, actually) I know what it can and can't do. I have a Q6700 on an EP45-DS4 and a Xeon X3370 (that's a Q9650) on an EP45-UD3P. Right now I am playing around with a Xeon X5460 with a mod on my UD3P; that's a Socket 771 CPU, basically a Q9650 but with a 3.16 GHz stock clock.
I have the Xeons clocked at 4.0 GHz and I have no problems playing any games. Okay, I use a GTX 680, which is a little overkill for this system, but at least I can use the highest settings in any game. But this Core 2 Quad at 4.0 GHz can't really keep up with an i5-2500K at stock. Sometimes it is as fast, but it has FPS drops that the i5 doesn't have.
So your quad at 3.6, like you said if I remember right, should be quite capable too, but any i5-2xxx would be way faster in any situation. And like I said, since you like to overclock: if you could get your hands on a used 2500K or 3570K, you would be out of reach of any Core 2 Quad or Phenom II, no matter how high you clock those. It would be sad if that weren't the case, since these i5 CPUs are a good 2-3 years younger.

The problem is the price ... Intel wants you to pay if you want to overclock. So you at least need a K model, and those start at around 180 EUR where I live and still sell for about 100 EUR used (the i5-2500K, for example).
A pretty good CPU is the i3-4130; it's about 90 EUR and has good performance for that price. But as I said, you can't really overclock it, and in programs/games that make good use of 4 or more cores an i5 is always faster.
On the AMD side the Athlon X4 760K is a nice CPU for only 70 EUR. You can overclock it pretty easily and the boards are cheap too, but it lacks raw single-core power compared to an i3 or i5. But that is another price range anyway.

Since you already made your choice and got that FX-8120, try to make the best of it. In some cases it should be faster than your old Core 2 Quad, but don't be disappointed if it sometimes isn't.

Since this was also mentioned: I think and hope that the current generation of consoles pushes developers to make use of more cores and to better support AMD's module-based architecture, so that the PC can benefit from it.

EDIT:
Uh and @Shambles1980
I really don't get your point in "I am using a 1080p monitor but run games in 1024 or 1280".
Is it still a CRT? Then I would understand, but running below native resolution on an LCD is usually awful. 3D games will be the same size on the screen no matter what resolution you choose, but the detail is highest at the native resolution. So I can't follow your argument that "I see them earlier if they are far away on lower resolutions".
The only thing I can think of is that you get more FPS at a lower resolution, and maybe that's why you prefer it.
So again, to point it out: an LCD has to interpolate lower resolutions up to its native resolution, so you get a kind of smoothing / anti-aliasing, but built from less detail! Regular anti-aliasing works the other way around, as does downsampling.


----------



## Shambles1980 (May 13, 2014)

Pentium 4 came on 2 sockets, 486 and 775. I was with the 486 socket version..


----------



## Dent1 (May 13, 2014)

Bucho, very informative post.

Most of it I believe is accurate and a reasonable assessment, mainly in regards to the OP's board being too outdated.

The only part I disagree with is the claim that an i3 can outperform the FX 8120 in games. If you look at old reviews from Bulldozer's release, in games the i3 seemed more on par with the FX 6100, and the FX 8120 could actually hang with and beat the i5 xx in games.

http://www.guru3d.com/articles_pages/amd_fx_8150_8120_6100_and_4100_performance_review,10.html

Small sample, consisting of only Crysis 2 and Far Cry 2, but you get my point.


Other than that I enjoyed your post and don't want to take away from its analysis.



Shambles1980 said:


> pentium 4 came on 2 sockets. 486 and 775. i was with the 486 socket version..



Actually, it came in 3 sockets: Socket 423, Socket 478, and LGA 775.

But without getting petty, socket LGA 775 had very few single-core P4s in its range. The real kick-off was the Pentium Ds, which were essentially two P4s glued together to make a dual core.


----------



## Shambles1980 (May 13, 2014)

I'm not a huge fan of game-based benchmarks taken with FRAPS and such.
They rarely use my kind of settings. It's usually either max out the settings until the GPU is overworked, so you get average frame rates within 4/5 fps of each other across the board, or low settings, which loads up the CPU so the CPU becomes the fps limit.
They never take the time to do what a normal person does and tweak the settings for the best visuals possible at their resolution whilst not hitting GPU or CPU limits.
That's what people actually do when they game, IMO.
They (or at least I) set the thing to max and see what the lowest fps is; if it's not good enough, then I tweak the settings.
I think bench tests should do that: find the settings that let them play the game smoothly, with no frame-rate dips, with vsync on,
then report what those settings were for each system. (I have a feeling that with a setup like that, the difference between the CPUs would be almost non-existent unless it was a CPU-heavy game.)

So at the end of the day I like to look at synthetic benches, Futuremark and so on, as they inevitably use the same settings (performance) and you can see how many fps each setup achieved at those settings.
Again, that's not perfect either, but at least it's a more level field.

But I really do think correct tests should read like:
"I have an i5-2500k and a 6950 on this system and tried to get the following games to play at a constant frame rate. These are the settings I had to use to have 0 frames dropped.
The 8120 and 6950 achieved the same results at these settings."
Then I would be happy with bench tests.

I really don't see the point in being told that this card can do 400fps at this resolution, with some drops here and there.
All I want to know is what settings the system has to use to achieve a constant 60fps.


----------



## Bucho (May 13, 2014)

Dent1 said:


> Bucho, very informative post.
> 
> Most of which I believe is accurate and a reasonable assessment. Mainly in regards to the OP board being too outdated.
> 
> ...



@*Shambles1980*
Yes, like Dent1 said, the P4 came in those 3 sockets, so you may have mixed something up there. I guess you mean 478, since that was the most common one for the P4.

@Dent1
In that link there is no i3? But an i3-2xxx should be close to that i5-661 ... maybe a little faster.
Check out this link and click through the games:
http://www.tomshardware.co.uk/gaming-processor-frame-rate-performance,review-32628-4.html

Frame times are not as good as on the FX-8xxx CPUs ... I guess that's because it is only a dual core with HT. Oh, and of course in multithreaded applications like rendering, converting and so on, the FX are better than that i3.


----------



## Mussels (May 13, 2014)

Wow, this thread's still going on?


The OP has upgraded; I'd love to hear his feedback on performance.

Myself, I upgraded from a 1090T to this i5, and benchmarks and reviews didn't show much of a performance difference... but the games I play do. It's not about the maximum FPS, it's the minimum. Getting rid of the worst moments, when the game lagged out every now and then, is what makes upgrades worth it.


----------



## Dent1 (May 13, 2014)

Bucho,

Nothing can be derived from the Metro 2033 review, as they are all within the margin of error; the FX and i3 ranges are within <1FPS of each other. It's actually a draw. I don't think these benchmarks stress the CPU enough to show bigger separation.

You're right, the i3-2xxx would be more powerful than the i5-5xx in single-threaded applications because it's the newer architecture (Westmere vs Sandy Bridge).

I would think the Bulldozer FX would be more geared towards competing with Westmere, whilst the Piledriver FX competes with Sandy. For example the i5 661 (dual core) Westmere gets destroyed by the Bulldozer FX. The Sandy or Ivy would paint a different story.

http://www.guru3d.com/articles_pages/amd_fx_8350_8320_6300_processor_4300_performance_review,10.html


Edit:

Also the FX Piledriver seems to outperform the i3 Ivy Bridge and the i7 Nehalem in games too.

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/5
http://www.anandtech.com/bench/product/697?vs=47


----------



## MxPhenom 216 (May 13, 2014)

RCoon said:


> Max FPS does not matter, but minimum FPS does. You might get 20000 FPS at peak, but if you're getting 2FPS as a minimum, you're not going to be a happy camper. That being said, if you're on an 8120 or whatever it was, i'd stick with that as opposed to selling it and buying an i3... Wait until you have the monies for an i5, otherwise the sidegrade is a worthless waste of time and money.
> 
> 
> Hurray for users with common sense.



Unless he aims for the new unlocked i3 or pentium chips being released with the devils canyon parts.


----------



## GreiverBlade (May 13, 2014)

Shambles1980 said:


> pentium 4 came on 2 sockets. 486 and 775. i was with the 486 socket version..


Never seen a skt 486... it is 478 (mostly Prescott) and 423 (the first-series Willamette P4s). Edit: woopsies, already corrected  

Also, if I can run Crysis 3 or Battlefield 4 with decent fps and not much slowdown on an X4 760K/A10 7700K + an R9 270X, an i3 can do it ... but it's double the price of the 760K for the first decent i3 (the A10 is priced similarly but has a stronger IGP; still, that's not the point for the OP).


----------



## Shambles1980 (May 13, 2014)

Yep, I meant 478. The 8120 hasn't arrived yet, but I will post some comparisons when I can.
I'll run some benchies on the Q6600 tonight at 3.0 (whilst I still have it) with the GPU at stock.
I already have the higher benches up here.
Then I'll compare them to the 8120.
Game-wise I'll have a look at what settings let me play Thief at a constant 60fps,
but I don't really have much else installed now (just that and Gothic 3); I've been uninstalling games and freeing up space ready for the fresh install when the 8120 gets here.

Incidentally, I see that there is a KB update (2 parts) for Windows 7 x64. After checking up on it, it seems it was originally pulled (because MS only released the 1st part of the fix), but then they re-released it with the correct 2 parts, which need to be installed in the correct order.
It does seem to have improved some single-threaded processes, so I will be trying with and without the update.


----------



## Toothless (May 13, 2014)

Dent1 said:


> I believe it is my 5850 CF.
> 
> 5850 CF can outperform a  single GeForce GTX 295, which was Nvidia's uber high end card back in 2009.  5 years later, the 5850 CF is still a midrange contender, on par with today's AMD 7850
> 
> ...


My GPU is an MSI GTX 660 OC, and I play at 1080p. I can't really OC my CPU due to the crappy motherboard I'm currently stuck with.


----------



## Dent1 (May 13, 2014)

Shambles1980 said:


> yep i mean 478. the 8120 hasnt arrived yet but i will post some comparisons when i can.
> il run some some benchies on the Q6600 tonight at 3.0 (whilst i still have it) and gpu at stock.
> i already have the higher benches up here.
> then il compare them to the 8120.
> ...



I would be very interested to see the results of your benchmarks. It will render the trolls and AMD haters mute lol


----------



## Shambles1980 (May 13, 2014)

I really don't know if that's the case. At best I hope to demonstrate perfectly playable games at more than acceptable levels; raw numbers probably won't amount to much, but it will be interesting to see some different benches.

It's probably good to know that I am truly neutral when it comes to hardware. I believe that, each in their own time, AMD and Intel have made the best CPU; it's just that they have never both made the best CPUs at the same time.

I have also gone back and forth between Nvidia and ATI when it comes to GPUs. It has, however, been a while since I changed to ATI, and the reason I did was because they removed the ability to use overscan and underscan. But I am getting to the point where I may well go back to Nvidia, as I do seem to get a lot of cards with issues from ATI of late (mostly due to PowerPlay and not being able to disable it).


----------



## TheoneandonlyMrK (May 13, 2014)

Dent1 said:


> I would be very interested to see the results of your benchmarks. It will render the trolls and AMD haters mute lol


Likewise here.


----------



## Toothless (May 13, 2014)

Okay, just something to go towards Dent1. I played BF3 at both resolutions and honestly... the 620 can't hold up at 1080p. Your 1440x900 does give me 70-100fps, but the quality works no magic. The game simply looks better at 1080p, and the frame difference is about 10-30 frames. Though I'm still playing at 40-60fps at medium-high settings (at stock clocks).


----------



## Shambles1980 (May 13, 2014)

-=Edit=-
Valley seems to revert to default settings when you hit Benchmark, so my original post is kind of pointless.
The only thing I can change is the resolution, which doesn't really help with what I want to do. Anyway, here is the original post, but the only relevant part of it is the resolutions I use.

------
ORIGINAL POST
-------


OK, doing some benchies right now.
I'm only using Valley, as I figured it will do DX11, DX9 and OpenGL.

settings are
quality high
single monitor
AA off
1920x1080


----------



## Dent1 (May 13, 2014)

Lightbulbie said:


> Okay, just something to go towards Dent1. I played BF3 on both resolutions and honestly.. The 620 can't hold up to a 1080p. Your 1440x900 does give me 70-100fps, but the quality does no magic. The game simply looks better at 1080p and the frame difference is about 10-30 frames. Though I'm still playing at 40-60fps at medium-high settings. (At stock clocks)



Although I run @ only 1440x900, I do apply 4x anti-aliasing to give the impression of running 1080p whilst not suffering the big frame-rate penalty. If you're running AA @ 1080p, this could be eating your performance; AA is a frame-rate killer at high resolution.

You'd get better performance at a lower resolution with AA than at 1080p with no AA.


----------



## Toothless (May 13, 2014)

Dent1 said:


> Although I run @ only 1440x900, I do apply 4x anti aliasing to give the impression of running 1080p whilst not suffering from the big frame rate penalty. If you're running AA @ 1080p this could be eating your performance. AA is a frame rate killer at high resolution.
> 
> You'd get better performance at lower resolution with AA than 1080p with no AA.


I never use AA.  I'll have to give it another try.


----------



## Dent1 (May 13, 2014)

Lightbulbie said:


> I never use AA.  I'll have to give it another try.



It's very addictive. AA literally straightens all the jagged lines in games. It's hard to go back afterwards.

View the image below


----------



## Toothless (May 13, 2014)

Dent1 said:


> Its very addictive. AA, literally straightens all the jagged lines in games. Its hard to go back afterwards.
> 
> View the image below


Lemme ramp up all of my fans and try this. I've been having issues due to it being like, 80F outside and we have no AC.


----------



## Shambles1980 (May 14, 2014)

OK, did all my samples for the Q6600.
After trying all my bench software, there's nothing I can do to make the minimum fps better, so that is the CPU: at the same res with lower quality, the min fps is the same; the maximum goes up and, as a result, so does the average. That's when using Fire Strike, and since I can't change the settings on that, I will keep the variables as low as possible by leaving all quality settings at defaults and only changing resolutions.

I chose to use the 3 Valley tests at 1080p,
GPU test 2, the physics test, and the combined test of 3DMark's Fire Strike bench,
and Ice Storm GPU test 1 and physics test (3DMark).

Fire Strike is hard on GPU and CPU usage, Valley isn't that bad, and Ice Storm is gentle.
So that's tough, normal, and easy in terms of GPU/CPU demands, at the same res and same quality settings. (I think that's about as fair as I can make it, removing as many variables as possible.)

The Q6600 results are in for now; the 8120 will get the same tests when it gets here.

Valley 1920x1080 (full screen)


min:
18.3 (dx11)
15.1(dx9)
15.0 (opengl)

Max:
73.6(dx11)
76.5(dx9)
58.3 (opengl)

average:
42.6 (dx11)
41.2(dx9)
32.1(opengl)

The 3DMark results are in the form of PNG image(s), so I will post those here directly. 1920x1080 (full screen).

Fire strike:




IceStorm:





I don't like meaningless abstract scores, so it's min fps, max fps and average.

Pretty sure that this resolution + settings is a reasonable place to compare. The 3.0GHz "25% OC" on the Q6600 should be something every board and even stock cooling can handle, so it's a fair place to set it.
If you want different resolutions, choose one and I will try to do that before this gets packed up.
And if you want the Q6600 maxed out, those results are already in my 1st post.

When I add the 8120 it will have the same amount of RAM, and to be fair I will OC it 25%, or to the highest safe setting I think everyone can achieve without burning the house down or having a home sounding like a wind farm (whichever setting is lower).

I made a ridiculously basic HTML page. I may improve on it if I care to do so, but I really probably won't, lol.
Here is the web page (it has the Valley results in HTML form): http://shambles1980.x10.mx/


Thief 2014 results.

There really is not much I can do with the Q6600 clocked at 3.0 and the GPU at its stock OC; the game simply can't run at a constant 60fps no matter what settings I change.
Anyway, the closest I could get to the best graphics balance with acceptable gameplay constantly gave me 60fps, dropping as low as 44fps for a short while.

The test was done in the jeweler's store early in the game. This covered the low requirements whilst indoors with most lights off (some rooms were left lit for transitional lighting),
plus a quick burst outside into the rain and lightning, where 3 guards find me and start a chase, increasing the action.
This resulted in drops to 44fps during the high-action outside run, but only for a second or 2 when running down the open length of the outside alley.

Definitely more than playable, and if I did not have Afterburner running to tell me the fps, I doubt I would have noticed.
I tried all kinds of settings and resolutions; what I ended up settling on as the best all-round compromise was:

Full screen: on
Exclusive full screen: on
v-sync: double buffer
res: 1280x720
Refresh rate 60hz

field of view: 90
texture: Normal
4x AA
shadow quality: normal
parallax occlusion mapping: off
ssaa: off
fxaa: off
Contact hardening shadows: off
tessellation: off

The frames of the bench results were:
15.1 minimum and
63.9 max,
average of approximately 33.
(I have to approximate the average by watching MSI Afterburner, as the delayed start reports min fps as 0 when it is really 15. The bench reports the average as 29.8.)

(Same settings at 1080p gave max frames of 44.8 in the bench test, btw.)
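A quick sketch of why the delayed start skews the reported average the way it does. The trace below is illustrative made-up data, not the actual benchmark log; the point is just how a few bogus 0-fps samples at startup drag the mean down:

```python
# Illustrate how a handful of 0-fps samples from a delayed start
# pull the reported average below the "real" in-game average.

def average_fps(samples, drop_zeros=False):
    """Mean fps; optionally discard bogus 0-fps startup samples."""
    vals = [s for s in samples if s > 0] if drop_zeros else samples
    return sum(vals) / len(vals)

# Hypothetical trace: three 0-fps startup samples, then a steady 33 fps.
trace = [0, 0, 0] + [33] * 27

print(average_fps(trace))                    # 29.7 -- dragged down, like the bench's 29.8
print(average_fps(trace, drop_zeros=True))   # 33.0 -- the average you actually played at
```

Eyeballing Afterburner, as the post does, amounts to the `drop_zeros=True` version.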

That completes all the tests for the Q6600. The in-game drops to 44 are, IMO, CPU bottlenecking, as I can't get them any higher by editing settings, but I was able to keep the average speeds higher with the settings I chose. A higher resolution gives me more drops into the 50's, which is then the GPU.
These settings work best for the GPU, but at the CPU's 3.0GHz the lowest frames will never be higher.
This is further proven by higher low-fps points when running at 3.6GHz.

So now I'm off to use one of my Core 2 Duo systems whilst I wait for my new board to arrive.

The next update will be the 8120 results.

OK, so the Thief bench test, same settings as before. The 8120 did a little better, but not really enough to warrant much fanfare.
As usual the game reports min fps as 0 because of the delayed start, but MSI Afterburner reported it as 14.5 (0.6 slower than the Q6600).
But throughout the test the 8120 was able to stay consistently higher, and ended up with a 63.5 max (lower than the Q6600)
and approximately 36 average (which is quite a bit better). Given the lower max frames (but the same 0 frames reported), to achieve this the 8120 had to be more consistent, with fewer low drops during the test.
The actual average reported by the bench was 33.9 (4.1fps better than the Q6600 achieved, with higher max frames).

So it looks pretty good for the 8120 so far in terms of consistency.

The actual game:
again it was reasonably close, with the 8120 being slightly more consistent at staying at the 60fps requested of it. But it too suffered some drops, mostly into the 50's, and once dipped down to 41.5 (slightly lower than the Q6600's lowest point). I must stress, however, that this happened for one small second, once, whereas I could repeat the 44fps drop with the Q6600 every time I ran down the open length of the courtyard. The 8120 only did it once, and managed to stay at 60.1 fps when I ran down the same path again a few times (the dip was possibly throttling due to the VRM).

The only conclusion I can take from this is that with this board you would notice no real difference gaming at these settings between this CPU at stock and a Q6600 overclocked to 3.0.
Again I need to stress that this is a slightly unfair test, due to the fact that this board is 3+1 phase and will occasionally throttle the CPU to 2.8 when under load.

On to the synthetic tests.
Valley bench:

min:
15.1 (dx11)
15.4(dx9)
14.6 (opengl)

Max:
74.1(dx11)
78.1(dx9)
66.1 (opengl)

average:
41.6 (dx11)
40.3(dx9)
32.5(opengl)

From here we again see that these 2 processors are very close, provided the Q6600 is overclocked to 3.0 and the FX-8120 is locked to 3.1GHz and gets throttled down due to the stupid 3+1 power phase.
(How often it gets throttled to 2.8 I don't know, but if we assume it never does, and was always running at 3.1 with turbo mode off, then the 8120 is at least on par with a Q6600.)
As I said earlier, this test is very biased in favour of the Q6600 due to the stupid 3+1 power phase board, which I will be replacing.

any way 3d mark.




Here the 8120 seems to do pretty much exactly the same as the Q6600 in the GPU test, which would imply that the GPU is the limiting factor in this test.
Surprisingly (at least to me), the 8120 actually performs the physics tests better than the Q6600, even when hindered.
And in the combined test it's a more even playing field, but the 8120 at 3.1 seems able to keep the frames at a more consistent rate compared to the Q6600.





The Q6600 performs quite a bit better here in terms of actual raw numbers, but the graph seems to show the 8120 is more consistent even when hindered.
But in the physics test the 3.0 Q6600 is outright better than the 8120 at 3.1,
which is strange, as in the previous physics test it was the other way around (again, this could be due to throttling).


So, with this setup and the 8120 locked at 3.1GHz, with occasional throttling down to 2.8GHz due to the stupid motherboard, and being unable to use built-in features such as turbo mode,
the 8120 and the 25% OC'd Q6600 seem to be about even.

This is actually a better result for the 8120 than I expected once I found out I was stuck with a 3+1 phase board that throttles down under load.
If the 8120 had been allowed to use its turbo mode, then the Q6600 would not have kept up at 3.0; at 3.6, I think it probably would have.

So in the end, I have to say that a stock 8120 would be better than a 25% OC'd Q6600, because it would be able to use its turbo mode and hit 4GHz.
Unfortunately I cannot test that until I get at least a 6+2 board.

But these CPUs running at very similar clock speeds are really close in terms of performance per MHz. I would have to say the Q6600 has slightly better performance per MHz, but the 8120 on the right board can surpass the Q6600 in terms of overclocking.
And if we are looking at stock cooling all round, with the correct boards for each CPU, then the 8120 would be the winner, simply due to the higher clock speeds.

Is it an upgrade?
Well, if you did like me and simply swapped your Q6600 for the 8120, then yes, I guess it is.
If you would need to pay more for the 8120 setup than you could get for your Q6600, then probably not, unless you cannot overclock your Q6600.

Hope to get the new board soon and update this with more valid results.


----------



## OneMoar (May 15, 2014)

The 8120 will walk all over the Q6600. It will use twice the power of an i7 and be 30% slower than an i7, but it will walk all over a 775 chip. I recently built a budget Core i5 and R7 265x system and it blows the AMD chips out of the water for about 100 dollars more, and the difference in power consumption will more than account for that eventually.
Don't get me wrong, it's fine to settle for an AMD chip; hell, in the majority of cases you won't see any difference.


----------



## Delta6326 (May 15, 2014)

Hey @Shambles1980, what volts are you running on your Q6600 for 3GHz? I used to have mine OC'd a long time ago; currently running stock though. Been thinking of trying again.


----------



## HalfAHertz (May 15, 2014)

OneMoar said:


> The 8120 will walk all over the Q6600. It will use twice the power of an i7 and be 30% slower than an i7, but it will walk all over a 775 chip. I recently built a budget Core i5 and R7 265x system and it blows the AMD chips out of the water for about 100 dollars more, and the difference in power consumption will more than account for that eventually.
> Don't get me wrong, it's fine to settle for an AMD chip; hell, in the majority of cases you won't see any difference.



That's true. Don't forget that the AMDs are overclockable and come in at the same price as the more affordable non-overclockable i3s and i5s. So even though they perform somewhat subpar at stock, if cooling and power usage are not too big an issue, you can reach and even surpass the Intel performance for the same price.

The old Phenom IIs and Athlon IIs needed an extra 300 to 600MHz to match a Nehalem i7, depending on the task; Bulldozer and Piledriver need about an extra 800 to 1200MHz to perform the same as an Ivy Bridge / Haswell. That's still a reachable goal.

Most people on TPU like to compare AMD's offerings only to the K series, and forget that outside of the US there's a really stark price difference between the lower-end i3s, i5s and AMDs versus the overclockable K-series i5s and i7s. $100+ is quite a bit of money in a lot of places.

If I do some back-of-the-napkin calculations and take the worst case possible: if at full load an AMD used 100W more than an Intel, and your computer ran at full load 8 hours a day, that'd be about $58.4 per year at $0.2 per kWh (which seems to be a good average). So you'd need at least 2 years under this unrealistic load to cover the difference in price just in electricity.
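The napkin math above checks out; here is the same estimate as a tiny sketch, using the same assumed inputs (100 W extra draw, 8 hours/day at full load, $0.20 per kWh):

```python
# Back-of-the-napkin yearly electricity cost of a higher-draw CPU,
# reproducing the estimate in the post above.

def yearly_power_cost(extra_watts, hours_per_day, price_per_kwh, days=365):
    """Extra electricity cost per year for the given power delta."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * days
    return kwh_per_year * price_per_kwh

cost = yearly_power_cost(100, 8, 0.20)
print(f"${cost:.1f} per year")  # $58.4 per year, matching the post
```

Halve the hours or the wattage delta and the payback period doubles, which is why the power-consumption argument rarely decides a budget build.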


----------



## Mussels (May 15, 2014)

HalfAHertz said:


> That's true. Don't forget that the AMDs are overclockable and come in at the same price as the more affordable non-overclockable i3s and i5s. So even though they perform somewhat subpar at stock, if cooling and power usage are not too big an issue, you can reach and even surpass the Intel performance for the same price.
> 
> The old Phenom IIs and Athlon IIs needed an extra 300 to 600MHz to match a Nehalem i7, depending on the task; Bulldozer and Piledriver need about an extra 800 to 1200MHz to perform the same as an Ivy Bridge / Haswell. That's still a reachable goal.
> 
> ...




Don't forget that some non-K Intels can, in fact, overclock. I've got 600MHz out of mine.


----------



## HalfAHertz (May 15, 2014)

Mussels said:


> Don't forget that some non-K Intels can, in fact, overclock; I've got 600MHz out of mine.



You're right, I forgot about that. But I think it only works on specific motherboards; I'm not sure.

Edit: Drat, my CPU has only 2 unlockable bins.


----------



## Shambles1980 (May 15, 2014)

Delta6326 said:


> Hey @Shambles1980, what volts are you running on your Q6600 for 3GHz? I used to have mine OCed a long time ago. Currently running stock though. Been thinking of trying again.



You can usually get 3.0 on stock voltage (1.28v, maybe 1.3v).
I can't just tell you what voltages to use, though. What you tell the BIOS and what it actually ends up sending to the CPU are never exactly the same: if you told my old motherboard (an Abit Quad GT) to send 1.556v, the chip only ever got 1.48v (even when it wanted more), and you had to go up to 1.6v in the BIOS to get 1.54v on the actual CPU.
But even a 3.7 OC never needed to draw more than 1.5v; I wouldn't recommend anything over 1.55v, though, and the more power you send, the hotter it gets. So it's simply a matter of starting at the lowest voltages, running OCCT and watching the temps. If it errors with good temps, the CPU needed more power.

But 3.0 is easy: just set 333 on the FSB, keep the same multiplier and don't change the voltages, and it should easily get to Windows. Then run OCCT and see if you get an error on any of the cores. If you do, go one step up on the voltage and try again until you don't get the error.
That's the best way to do it. Remember to change your RAM/FSB ratio:
some boards will let you run the RAM at 800MHz on a 333 FSB, others would have you at 900+ or 667. When you're testing for stable voltages, always keep your RAM at stock or slower speeds.
If your board supports an FSB of 1600, then 400 x 8 is a nice 3.2 OC, as it brings the FSB speed up to the max and you can easily set the RAM to 800MHz at 2:1. But 3.2 and up is where you start to need some decent cooling if it's something you want to use every day.
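
The clock arithmetic in the advice above can be sketched like this (core clock = FSB × multiplier; DDR2 is double-pumped, so its effective speed is twice the memory clock; ratio naming varies by BIOS):

```python
# Core clock = FSB (MHz) x multiplier. DDR2 is double data rate, so its
# effective speed is twice the memory clock. The Q6600's 9x multiplier
# and the FSB values are the ones used in the post.

def core_clock_ghz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier / 1000.0

def ddr2_effective(mem_clock_mhz):
    return mem_clock_mhz * 2  # double data rate

print(core_clock_ghz(333, 9))   # ~3.0 GHz, the easy first step
print(core_clock_ghz(400, 8))   # 3.2 GHz on a 1600 (400MHz) FSB
print(ddr2_effective(400))      # 800, i.e. DDR2-800 with RAM clock at FSB speed
```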




HalfAHertz said:


> That's true. Don't forget that AMDs are overclockable and come in at the same price as the more affordable non-overclockable i3s and i5s. So even though they perform somewhat subpar at stock, if cooling and power usage aren't too big an issue, you can reach and even surpass Intel performance for the same price.
> 
> The old Phenom IIs and Athlon IIs needed an extra 300 to 600MHz to match a Nehalem i7, depending on the task; Bulldozer and Piledriver need about 800 to 1200MHz more to perform the same as an Ivy Bridge/Haswell. That's still a reachable goal.
> 
> ...



On the overclocking front: I have ordered a Xigmatek Aegir SD128264, which should let me get a nice stable OC out of the 8120.


----------



## Bucho (May 15, 2014)

OneMoar said:


> The 8120 will walk all over the Q6600. It will use twice the power of an i7 and be 30% slower than an i7, but it will walk all over a 775 chip. I recently built a budget Core i5 and R7 265x and it blows the AMD chips out of the water for about 100 dollars more, and the difference in power consumption will more than account for that eventually.
> Don't get me wrong, it's fine to settle for an AMD chip; hell, in the majority of cases you won't see any difference.



At stock clocks that's true for sure, since the Q6600 only has 2.4GHz and a low FSB of 266MHz (so even DDR2-533 or 667 memory is fast enough, and higher-speed memory or DDR3 would make no difference). The 8120 has 3.1GHz and a Turbo up to 4.0 (!) GHz with no FSB, since the memory controller is in the CPU, and it supports (and even makes some good use of) DDR3-1866 RAM.
Then the 8120 has 8 threads it can process, so (and that's the thing with these FX CPUs anyway) it HEAVILY depends on how well the program utilizes the FX (Bulldozer) architecture.
BUT if the Q6600 is overclocked to 3.6GHz+, the FX-8120 @ stock will not always be faster, especially in programs that only make use of one or two threads and/or are older and not optimized for the FX in any way. Then again, that FX-8120 can be overclocked too, but you need a decent board and cooling for that, and also, depending on your GPU, a decent power supply, since the FX needs some power when overclocked (but the Q6600 needs a lot too if overclocked at a higher VCore).

Anyway, I am very interested in the scores Shambles1980 will post, but for "real life" tests he only used Thief. 3DMark at least shows what the CPU/GPU combo is theoretically capable of. Most games will be a different story anyway.

@Delta6326
You have that Q6600 of yours at stock? Really?
You have a good board that should overclock great, a (way too) strong PSU, a decent cooler and then you even throw that HD7870 at that poor CPU.

At least set your FSB to 333MHz to get 3.0GHz out of that Q6600. If it is a G0 revision it may work at this setting even with the default VCore, or maybe a little more. Oh, and remember that you do NOT want to leave most of the voltage settings on AUTO (where the board raises these values as it likes) when you overclock, since these ASUS boards tend to use way more voltage on AUTO than what is required.
If that CPU needs a higher VCore to be stable, you might want to enable LLC (loadline calibration). With that you will have only a small VDroop (the difference in VCore between IDLE and LOAD), so you can maybe even lower the VCore a little, but the system will need more power and produce more heat under LOAD.

@HalfAHertz
Yes, that's a big plus for AMD. You can get a pretty decent FX-8320 NEW for about 116 EUR (about 158 USD) where I live. At least that used to be a good offer about a year ago (it was ~130 EUR back then), but Intel wasn't sleeping, and their Haswell CPUs are about 10-15% faster than their Ivy Bridge CPUs.
An FX-6300 for about 86 EUR (about 117 USD) is also a good offer.

Intel CPUs were always a little more expensive, at least if you wanted a bigger CPU or one that can be overclocked. 
The crazy thing is that Intel can really price its performance and high-end CPUs at almost anything it wants. That started right after the launch of the Sandy Bridge CPUs. The i5-2500K dropped to below 170 EUR (about 232 USD) right after its release in early 2011, and in early 2012 it went up again to 190-200 EUR!!! (about 260-270 USD) right before the release of the i5-3570K, which sold for about 215 EUR (about 293 USD); that price held up to mid 2013, right before the Haswell CPUs were released.
It seems like Intel does not see a reason to drop its prices.

About that frequency thing... check out some reviews of the FX-9590. As you surely know, that's just an overclocked FX-83xx, and almost all of those can be clocked to ~4.8 GHz and even a bit more. Yet at those clocks they are often still slower than an i5-4670K @ stock, and that's sad. In some tests they really shine, but most of those are applications like encrypting/decrypting/packing/converting and so on; in games the i5 usually is faster.


----------



## Shambles1980 (May 15, 2014)

The benches are all I could use, sorry; I had uninstalled pretty much everything getting ready for the new setup. I did set Thief up in the fairest way possible (a real-world scenario where someone would be adjusting the settings to best suit the system they had), and I will do the same for the 8120.
The other three benches (Valley in DX11, DX9 and OpenGL; Fire Strike; and Ice Storm) should help even out the actual performance numbers. The graphs in the 3DMark tests will be a better indication than overall scores.
I would hope to see a graph with smaller spikes and dips and more constant averages. The 8120 arrived today, but the RAM hasn't yet, which annoys me.

The Q6600 G0 will easily run at 1600 (400), so that's a motherboard limitation, and there are plenty of LGA 775 boards that support a 1600 FSB. The big attraction of the Q6600 was its low 266 FSB with a high multiplier, which meant more boards could overclock it easily, whereas at the time most CPUs had a lower multiplier and a higher FSB requirement.
400x9 on the G0 Q6600 does need a lot of cooling, though, and I was never able to get the Q6600 stable higher than 3.7GHz. I could get it higher than 3.7 and bench test or game, but it wasn't OCCT stable, and if it's not OCCT stable then it's not stable (IMO).


----------



## Mussels (May 15, 2014)

HalfAHertz said:


> You're right, I forgot about that. But I think it only works on specific motherboards; I'm not sure.
> 
> Edit: Drat, my CPU has only 2 unlockable bins.



It's socket 1155, with a P- or Z-series board.

Since it's the older socket, the OP could have hunted up something to suit there. I got this setup (CPU, mobo + 4GB RAM) for $120 second hand, and it's crapped all over my old AMD setup.

He's got his AMD setup now (or is it still coming?) and I'm just awaiting more feedback from him.


----------



## lilhasselhoffer (May 15, 2014)

Bucho said:


> Hi guys, just signed up @ techpowerup.
> ...
> The main problem (which has been discussed over and over all around the internet) is that CPU performance increases have (kind of) flattened over the past ~8 years. I say _kind of_ because IPC (instructions per cycle) performance increased and the core count also increased in certain classes of CPUs. So why don't we see the big boost?
> One thing is that CPU clock speeds didn't increase that much, at least not like in the days of the 486, Pentium/II/III/4 and the K6 and Athlon. Back then the MHz doubled and tripled from generation to generation and the IPC got better. No wonder an Athlon 1400 was almost twice as fast as a 700MHz Athlon, right? Or a 486DX2 66MHz almost twice as fast as a 33MHz one? At least in some raw benchmarks where the bus speed, memory speed and other things didn't bottleneck.
> ...



Welcome to TPU!


Pleasantries aside, I can agree with pretty much everything you're saying except what I've quoted.  The simple truth behind my reservation is that you're missing the forest for the trees.

If it were 2004, I would be swearing, because I'd have to install another two servers to run the remote system for the six new employees the company just hired.  They'd require their own additional hardware, I'd have to get everything configured, and after all of this I'd still have to find some way to make it all fit into both the server room and my budget.  Luckily, it's 2014.  I've got two 6-core, 12-thread Xeon processors chugging along with a dozen different virtual machines plugging away.  I can carve off enough resources to run another instance, and my CPU chugs along with a slightly higher load to distribute.

That very same Xeon processor would need at least three Core-architecture systems to replace it on core count alone, not to mention a tangle of update-hungry machines.  Any one person's inability to see improvements in processor speed is their own failing, not evidence of processor stagnation.  Likewise, CPU utilization is rarely as simple as people give it credit for.

This said, the OP asks about one usage scenario.  I've got an unknown set of games, with unknown optimizations.  Some of the cited titles were developed for a functionally single-core machine and ported to more powerful systems.  I'm avoiding taxing my graphical subsystem by running at low resolution, and I'm avoiding taxing my CPU by running games that don't place a heavy burden on it.  As such, I see no reason to switch from my high-end Core processor to a low-end i-series processor.  

You cite the great innovations of the past in greater frequencies, but seem to forget why frequency increases have functionally become minute.  Frequency increases actually stopped yielding huge results almost a decade ago.  The real innovations that allowed the CPU market to improve were better miniaturization, new materials, and more efficient chip designs that increased IPC.  I'd gladly pit one of the Athlon CPUs against the P4 chips released at the same time and demonstrate how better design won out over raw frequency.  The lesson holds true today, when AMD FX processors attain high frequencies far more easily but are beaten by Intel offerings at significantly lower frequencies.


Your point about core count is accurate, but misguided.  The sad truth is that gaming is currently (though hopefully will not be for too much longer) more about consoles than about PCs.  This was driven by publishers, who have treated the PC market as second-class citizens for years.  There are games out there that have made use of high-end multi-core processors.  We have had the capability to use all that extra power for years, but the cost to use it has been deemed prohibitively high.  Since the Core processors were introduced we've been hearing about how games will soon come out that use all those extra CPU cores.  Our reality is that these games exist, but don't have enough traction yet to make them the norm.


The OP's point that a Q6600 is "good enough" is valid, but misguided.  My calculator has as much computational power as the Apollo rockets' systems.  That was good enough to get people to the moon and back, but isn't good enough to perform tasks a modern cellular phone does regularly.  "Good enough" is a cop-out, and replacing your BMW with a Geo Metro is not a solution.  Comparing a Q6600 to an i3 is misleading, but if all you measure is fuel efficiency then the comparison looks just peachy.


----------



## Delta6326 (May 15, 2014)

OK, thanks for the info. I used to have it OCed, but when I switched over to W8 I got rid of the OC and couldn't remember where I started with the voltage.


----------



## CounterZeus (May 15, 2014)

Shambles1980 said:


> I'm a child of the era where role-playing games just had text on the screen, lol, and I still think Super Mario Bros. 3 has great graphics. But when it comes to modern computers, anything lower than 1024x768, even by my extra-low standards, looks like poop. Anything above 1024x768 looks just fine to me, lol; I honestly can't tell the difference.
> (my desktop is at 1080p though)
> The only thing I can tell the difference with is 1080p vs 1080i, and 720p looks better than 1080i. But anyway,
> I am looking at some board, RAM and CPU bundles on the internet, but I don't seem to see anything that's a good enough improvement for the money that I could then convince the wife I need to spend that much..
> ...




The Q6600 is old, and even a 2nd-gen i5 runs circles around it. If you can't run 1080p (I can't believe you actually run things at 1024*768 and don't notice the difference at 1080p), it's worth upgrading. The Pentium D was high end when it came out, but it aged very rapidly. I ran all my games at 1920*1080 on a Pentium D 830 back in the day when I got my new screen, except for Crysis. Before that I ran 1280*1024, which was a huge upgrade from 1024*768 for me.

Sounds to me like the best upgrade you can do is your wife, so you can buy more PC components.


----------



## Shambles1980 (May 15, 2014)

My wife is not getting upgraded, lol.
I don't game at 1024x768; it's usually 1280x1024, although I should use 1280x800 to stay in the correct aspect ratio. 
Honestly, though, while I don't see much of a graphics improvement in anything 720p and up, I do find anything below 1024x768 horrible. Anything above that is pretty much all the same to me. 

@*lilhasselhoffer* 
It's not true that the resolutions did not stress the 7850. As stated, the test run on the Q6600 was balanced to remove GPU bottlenecks and, as best I could, the CPU bottleneck. 
With the settings listed, the GPU did not bottleneck (these settings are a lot lower than what I usually play the game at), but the Q6600 at 3.0 just did not have enough grunt.

The Q6600, even at 3.7, was at 84% even on the options screen, and in-game it would only get used more. 
So while the chosen settings could utilize the GPU at 100%, if the CPU was taxed a bit more than usual the GPU would take a hit. At 3.7 the Q6600 did have enough power to keep the low frame drops in the 50s; at 3.0 it lost 9-11fps at the low end, which is quite a lot when you're talking <60 fps.

Thief itself is a pretty badly optimized game, if you ask me. 
Sometimes it will use 4 cores, sometimes only 3. It's not very good at distributing the workload and has an excessively high dependency on the CPU (I can only imagine due to Garrett's physics needing to be able to bump things and the lighting needing to cast shadows).
I do believe this could have been intentional, though: Mantle would be showcased on the game to show improvements, while in reality a good % of the increase would have come from better utilization of the hardware, not because of Mantle but simply from better optimization. 

I will not be using Mantle to test the 8120. It will be the same game, same version, and I even saved the game just before starting the initial test, so it will be taken from the exact same place and in-game time. 

If I had all my games installed I probably would not have chosen Thief as my test-bed game, but with the method used to test (set the options up to best suit the hardware), and the fact that it has a built-in benchmark utility, I think it will serve as a reasonable stand-in. 
My main interest will be in the Fire Strike tests, though; they are very demanding and should show the limitations of the CPUs.
I added the Ice Storm tests as a bench for lower requirements.
So I do feel like I have covered all the bases for a fair result to be shown.. 

Just waiting on the RAM to arrive and then I can test the 8120. It should be interesting to see the results.


----------



## Toothless (May 15, 2014)

Shambles1980 said:


> I don't game at 1024x768; it's usually 1280x1024, although I should use 1280x800 to stay in the correct aspect ratio.
> Honestly, though, while I don't see much of a graphics improvement in anything 720p and up, I do find anything below 1024x768 horrible. Anything above that is pretty much all the same to me.


I think I just died a little inside. 

Well, then again, there are people who can't see the difference between 30fps and 60fps. If you don't see a quality improvement, then I guess gaming at a lower rez and getting more frames sure helps. How big is your monitor? 24in?


----------



## lilhasselhoffer (May 15, 2014)

Shambles1980 said:


> My wife is not getting upgraded, lol.
> I don't game at 1024x768; it's usually 1280x1024, although I should use 1280x800 to stay in the correct aspect ratio.
> Honestly, though, while I don't see much of a graphics improvement in anything 720p and up, I do find anything below 1024x768 horrible. Anything above that is pretty much all the same to me.
> 
> ...



You misunderstand, and seem to be consciously avoiding reality.  Allow me to resolve the difference in our understanding.

84% usage is sandbagging.  100% usage, while seeing how many valid FPS can be generated, is testing.  If you say you are happy at lower resolutions, then you are not testing your system's power.

My TI-84 calculator can run a Galaga clone.  My computer can run the same.  Does that imply the two are equals?  No.  You only test what a system is capable of by pushing settings to the maximum and seeing when you can no longer do something.  In gaming, this means adding all the post-processing effects and measuring whether you get a playable frame rate.  Synthetic benchmarks are just that: attempts to standardize performance readings.

If you don't agree with this, then you aren't really testing; what you are doing is playing with yourself.  You set the rules, box in the performance, and reset the rules to get the outcome you want.  Don't like the FPS being generated?  Just crank the resolution down.  Don't like the synthetic benchmarks?  Crank the frequency up for a single viable test run, and get big numbers to prove that whatever you think is right is right.  This isn't objective reasoning; it's putting blinders on.


I challenge you to lose the blinders and come back to reality.  If you've got to set the resolution lower than native on a screen, you're creating blurriness.  If you can't tell the difference between 720p and 1080p, you're not viewing the screen properly.  What I'm seeing is someone growing older, with vision getting less acute.  I've seen the same before in my parents.  They got glasses, and suddenly the difference between 720p and 1080p was clear.  The price of a Blu-ray movie suddenly made sense, because the visual quality was so much better.  Saying that improved resolutions don't provide anything more is admitting that you need some help.

The Q6600 is adequate for what you've cited.  What you've cited is a severely limited facet of gaming.  If you have so little ambition, then you cannot argue that a replacement is needed.  When you open up your standards, you can see why the Q6600 isn't really living up to modern standards.


----------



## Shambles1980 (May 15, 2014)

I don't know if you missed the part where I said that higher resolutions and/or higher display settings resulted in more drops below 60fps, which means the GPU was then the component slowing down the system
(resulting in more drops below 60, mostly into the 50s).

With reduced resolutions and lower graphics settings, the 60fps target was achieved for a longer
period.
The settings chosen were the ones that best suited the 7850 at 900/1200. The fps only dropped down to 44 when the CPU was taxed, and changing the settings could not get rid of this 44fps low point. This removed the GPU from the equation as the reason I could not sustain 60fps, and confirmed the CPU as the reason for the 44fps dips (further confirmed by the 55fps dips at 3.7GHz).

Now, if you don't think that people in the real world adjust settings for the best balance of quality and frames, then I think you are the one who needs to remove the blinkers.
I set the game up to run as efficiently as possible on the hardware. Increasing the resolution dropped the frame rates; that's a simple fact.
If the 8120 can achieve more constant frames at the same or better settings, then it is better. If it can't, then it isn't.
It's a pretty simple premise.

The 3 benches are set the same for the testing process. I chose 1080p because for some reason some people seem to think it makes a difference to image quality (personally I really don't see it), but as these are synthetic benches they can be set to a default quality at 1080p to see which CPU results in the better output.

I think you have a misconception that at 1080p you cannot be bottlenecked because the CPU is under less load, but the fact of the matter is, at least with the game I tested, 1080p caused the GPU to be the reason the fps dropped. At 720p the GPU was able to handle the workload at the outlined settings.
You seem to be confusing a set of tests designed to compare 2 things as evenly and as close to a real-world setting as possible with what you would want a system to do ("run all games at full res, no issues").

What you propose is to set everything to the highest settings and then say, "well, I can't play it like that, end of test."
My method is to set it up so it is playable with the MINIMUM amount of frame drops under my target of 60fps, then use those settings as a comparison for the 8120 when the RAM arrives.
I don't care if I can get 8000fps in Gauntlet on my PC when I could only get 20fps on my old system. All I want to know is what it takes to play at a constant 60fps with no drops, and which system can achieve that at the highest settings. The highest settings at which the Q6600 came closest to achieving this were outlined previously; any higher and it lowered fps, any lower and it did not change, so that is the sweet spot.
It's a simple premise: if the 8120 can run at those same settings without ever dropping below 60, or if it is able to run with only low points of 50+, then it handled the same workload better.
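
The tuning method described above amounts to a simple search: step the quality settings up while the benchmark's minimum FPS stays at or above the 60fps target, and keep the highest settings that still pass. A minimal sketch, where `measure_min_fps` is a hypothetical hook around a game's built-in benchmark and the result numbers are made up for illustration:

```python
# Sketch of the tuning method described above: step the quality preset up
# as long as the benchmark's minimum FPS stays at the 60fps target.
# measure_min_fps(preset) is a hypothetical hook that runs the game's
# built-in benchmark at a preset and returns the lowest recorded FPS.

def find_sweet_spot(presets, measure_min_fps, target=60):
    """Return the highest preset whose minimum FPS meets the target."""
    best = None
    for preset in presets:  # presets ordered lowest -> highest quality
        if measure_min_fps(preset) >= target:
            best = preset
        else:
            break  # anything higher will only drop more frames
    return best

# Toy stand-in results, for illustration only:
fake_results = {"low": 75, "medium": 68, "high": 61, "very high": 52}
print(find_sweet_spot(["low", "medium", "high", "very high"],
                      fake_results.__getitem__))  # -> high
```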

The difference being that I don't just test 2 CPUs and say, "well, I can't play this maxed out with either one of these, so don't bother buying one."
Instead I can say "the 8120 actually works better at these settings by not dropping below 60fps at all," or I can say
"the Q6600 was better, as it achieved the 60fps target and only dropped to 44fps where the 8120 would drop to 30."

Testing the power of something when it blatantly is not powerful enough to do the things you are testing is a bit redundant.
In what reality do people stick everything on max, try to play, get 4fps, say "well, I can't play that," and just not adjust any settings until they can?
This is one of the fundamental issues I have with bench tests: they are all flawed. They never simply test at what settings and resolution the setup works best and then compare the settings.

If benchmarks stated that game A runs at a constant 60fps with v-sync on at these settings and this resolution with CPU A and GPU A,
but CPU B and GPU B can do exactly the same at those (or other) settings and resolution, then that would be a better test.

There really is little reason to test the maximum possible settings and show that this system runs at 14fps and that one at 15fps,
when in reality, at settings tweaked by someone using it, there could be a difference of 20fps at the actual settings people with that setup would probably use.


----------



## Toothless (May 15, 2014)

Well, duh, you're going to get under 60fps if you crank the resolution up. You say you run 720p at maxed settings, which ANY modern mid-to-high-range GPU can handle, but when you keep those maxed settings and bump to 1080p, it's REALLY EASY TO SEE THAT ANY MID-TO-HIGH-RANGE GPU WILL HAVE ISSUES (not ultra range, which is 680 and up). This is why games have "Settings." I run an MSI GTX 660 OC with an AMD Athlon II X4 620. Now, while my CPU isn't as powerful as yours, I can still maintain 50-70FPS in BF3 multiplayer at 1080p. Why? Because I don't go shitfaced with cranking the settings up. There is a HUGE difference between 720p and 1080p. (Source:

- Skyrim on a GT 220 @720p on a 900p monitor. Lowest possible settings. Pixelated. 20-40 FPS. (Aged and low-grade GPU.)

- Skyrim on a GTX 660 OC @720p on a 900p monitor. Ultra settings. Not as many noticeable pixels. High FPS.

- Skyrim on a GTX 660 OC @1080p on a 1080p monitor. Lowest possible settings. No major noticeable pixels. High FPS.

- Also ran at Ultra settings + HD packs. FPS capped due to the CPU being aged and low grade.)

The only reason you don't see a difference is because you crank the settings so high that you can't really see the 720p pixels, which is fine. (AA will do that)

If you're going to game at such a low resolution, then upgrading is worthless, but it's too late for that. You're getting an upgrade. Great. You can probably run games better. Great. Now turn off your high-assed settings and try gaming at 1080p. Still see no difference? Fine, then stay at 720p. But by no means is there ANY logical reason to say there is no difference between 1080p and 720p. I have bad eyesight at times due to medical reasons, and I can still see the difference. Maybe you're using your monitor wrong, or your eyesight isn't as good as it used to be. Who knows?

All in all, you didn't need an upgrade, because you play at settings that even a low-end i3 can run easily.  You don't push your hardware to the point of quality because you're bent on quantity, but that's your choice, not ours.


----------



## TheHunter (May 15, 2014)

This should be a very good example, Q6600 vs 4770K:
http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247


----------



## Shambles1980 (May 15, 2014)

Well, at the settings I play at, the CPU I had still couldn't keep the fps at a constant 60 with v-sync on. Given that is the only thing I care about, and that no matter how I tweak the settings it simply is not possible to achieve that with this CPU, how is it I did not need an upgrade?

I can lower the settings and play at 1080p, but the frames would still be less than my target. 
I honestly prefer a constant 60fps with no drops to a higher resolution (which I really can't tell the difference with) and dropping frames every time I move, or a stutter when I try to turn around fast. 
AA does make the game look better, so perhaps that is why I don't see the difference between 720p and 1080p: at 1080p I can't have AA on without getting frame drops; at 720p I can have AA on, it looks just as good to me, and I don't get the frame drops.


----------



## Toothless (May 15, 2014)

Shambles1980 said:


> Well, at the settings I play at, the CPU I had still couldn't keep the fps at a constant 60 with v-sync on. Given that is the only thing I care about, and that no matter how I tweak the settings it simply is not possible to achieve that with this CPU, how is it I did not need an upgrade?


Vsync uses more resources... Just keep it off.


----------



## Shambles1980 (May 15, 2014)

Lightbulbie said:


> Vsync uses more resources... Just keep it off.


Triple-buffered v-sync uses more resources than no v-sync?
I was pretty sure that it saved frames up so that it could display them, reducing drops...

Or, more accurately, it continues to draw frames even when the front buffer doesn't need them, because it has room to do so in the 3rd buffer, which means you don't get the performance hit at all.


----------



## Toothless (May 15, 2014)

You really aren't that tech savvy, are you? V-sync forces the GPU to put out a set number of frames instead of rendering naturally. You'll ONLY see 60fps because of your monitor, assuming it is 60Hz; you'll only see as many frames as your monitor's refresh rate. The ONLY time you need v-sync is if you're getting frame issues while gaming, such as frames ripping and tearing. Most gamers leave v-sync off because it doesn't hurt to have 120fps, since all you'll see is 60 and no more.


----------



## Shambles1980 (May 15, 2014)

Lightbulbie said:


> You really aren't that tech savvy, are you? Vsync forces the GPU to put out a certain amount of frames instead of naturally. You'll ONLY see 60fps due to your monitor, assuming that it is a 60hz. You'll only see as many frames as the hertz your monitor will give out. The ONLY time you need Vsync is if you're getting frame issues while gaming. Such as frames ripping and tearing. Most gamers leave Vsync off because it doesn't hurt to have 120fps, because all you'll see is 60 and no more.



All I see is 60fps ("that's all anyone can see on a 60Hz monitor"),
but the game renders as many frames as it can when using triple buffering.
If Fraps or similar counted the actual frames the card was rendering, and not the frames sent to the monitor every refresh, it would show 120fps, or 300, or whatever it happens to be with triple-buffered v-sync. But it only counts the actually refreshed frames, which is 60, because it only looks at the front buffer.

Triple buffering lets the card render your 120-300fps, so it works to its full potential, but the front buffer only refreshes once per monitor frame, giving you the same performance as no v-sync but without tearing (the buffers hold the latest frame to display exactly when you need it, because the GPU never stopped rendering).

There is also limited-time queuing, which is also sometimes called triple buffering. That is not the same, but in most instances it works just as well; it can, however, sometimes display an old frame, which would cause input lag.
Real triple buffering always displays the latest frame, so you always get the same performance as no v-sync but with 0% chance of tearing.
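
A toy model of the behaviour described above, assuming a 60Hz monitor and a GPU that happens to render 150fps; the numbers are illustrative, not measurements:

```python
# Toy model of triple-buffered v-sync as described above: the GPU keeps
# rendering into the back buffers at its own rate, and at every monitor
# refresh the display grabs the newest completed frame. Frames that were
# rendered but never shown are simply discarded (no tearing, no stall).

def simulate(render_fps, refresh_hz, seconds=1):
    rendered = int(render_fps * seconds)      # GPU never stalls
    refreshes = int(refresh_hz * seconds)
    shown = []
    for r in range(1, refreshes + 1):
        # index of the newest frame finished by this refresh instant
        newest = int(render_fps * (r / refresh_hz))
        shown.append(newest)
    dropped = rendered - len(set(shown))      # frames never displayed
    return rendered, len(shown), dropped

rendered, displayed, dropped = simulate(render_fps=150, refresh_hz=60)
print(rendered, displayed, dropped)  # -> 150 60 90
```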

v-sync alone (not tripple or double buffer) however forces the card to impose an artifical delay. this will leave you slower than your opponent in a first person shooter for instance. it could be 10ms slower displaying the image on your screen than it was on your opponents screen if they had no v-synk.
in that situation then yes you are right using vsynk is slower because it is delaying the image.. (but given i have repetedly said tripple buffering v-sync that point it totally mute)

Also with no v-sync on and if you dont see tearing then you gained nothing from not having it on any way as the frame that was just updated is pretty much the same as the one that was their any way.

but  please explain to me how tripple buffering is bad because im not tech savvy enough to get your point.
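The "10ms slower" figure above is roughly what the refresh interval predicts. A quick sketch of the arithmetic, assuming a 60Hz monitor (all numbers illustrative):

```python
# Back-of-envelope latency for plain (double-buffered) v-sync at 60 Hz.
REFRESH_HZ = 60
frame_interval_ms = 1000 / REFRESH_HZ    # ~16.7 ms between refreshes

# A frame finished just after a refresh waits almost a whole interval
# before it can be shown; on average it waits about half of one.
worst_case_wait_ms = frame_interval_ms   # ~16.7 ms
average_wait_ms = frame_interval_ms / 2  # ~8.3 ms, in the ballpark of "10 ms"

print(f"worst case: {worst_case_wait_ms:.1f} ms, average: {average_wait_ms:.1f} ms")
```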


----------



## Toothless (May 15, 2014)

Shambles1980 said:


> triple buffering lets the card render your 120-300fps so it works to its full potential, but the front buffer only refreshes once per monitor frame, giving you the same performance as no v-sync but without tearing..
> 
> .



Okay so you're making your GPU work harder than it should. Tell me why that's a good thing.


----------



## lilhasselhoffer (May 15, 2014)

Shambles1980 said:


> well at the settings i play at, the cpu i had still couldn't keep the fps at a constant 60 with v-sync on. and given that is the only thing i care about, and no matter how i tweak the settings it simply is not possible to achieve that with this cpu, how is it i did not need an upgrade?
> 
> also i can lower the settings and play at 1080p, but the frames would still be less than my target..
> i honestly prefer a constant 60fps with no drops over a higher resolution (which i really can't tell the difference with) with dropped frames every time i move, or a stutter when i try to turn around fast.
> AA does make the game look better, so perhaps that is why i don't see the difference between 720p and 1080p: at 1080p i can't have AA on and i still get frame drops; at 720p i can have AA on, it looks just as good to me, and i don't get the frame drops..




I'm going to take one last go at trying to explain this to you.  My Raspberry-Pi can run Doom.  It can run 1080p video.  It can play games.  Using your logic, my Raspberry-Pi is as good as an i3 or i5 processor, because at these specific settings it works.  It also beats out the Intel Gigabit ethernet controller on my board, because it can connect to a dozen different sensors.  My NIC only connects to one thing, and only transmits data using one protocol.

Is the nature of your argument apparent yet?  If not, then I can also prove that my Ti-84 calculator beats the pants off your q6600.


Now, stop putting numbers ahead of everything else.  Gaming isn't about one processor beating another by two points in a synthetic test, it's about being able to play the game.  You crank all of your settings to high, fire up the game, and play.  If the frame rate drops, you dial back the post-processing features, not the pixel count.  If you have to slide the resolution down then you don't have enough power to adequately play the game.  You've already said this is what you do, and still have 84% CPU utilization.  All the synthetic benches in the world cannot hide the fact that your system isn't up to the task of providing real gaming performance any more.


None of this even touches on faster interconnects (PCI-e 2.0 and above), and the new instruction sets available in new systems.  There's nothing like good new optimized code to show you how much quality a new processor and platform can deliver.  You are welcome to deny this, bury your head in the sand, and continue believing the q6600 is more than a venerable relic.  Needless to say, I would be getting a new system to replace this aging one if ...it was my choice.  Honestly, the Core 2 processors are past their prime.  They aren't bad, but a comparison between an $800 Core2Quad and a $100 i3 today is foolish.  I made the same point in a previous post, but it seems to have fallen on deaf ears.  4960X versus q6600 is a rather stark difference.  

Consider that a test of processing power is not a test for gaming.  The numbers already quoted are computational values.  My way of showing you that a q6600 isn't good any more is to set that monitor to native resolution, set the graphics to moderate settings, and let you see the crappy frame rates.  By lowering resolution to playable levels you hide the age of your CPU.  It's hard to see the inadequacies of a painting if you keep it in a dark room, just like it's hard to see a processor's shortcomings if you never push it to deliver a better performance.


Edit starts at the ellipsis.  Lost consciousness before finishing the post.


----------



## Shambles1980 (May 15, 2014)

lilhasselhoffer said:


> I'm going to take one last go at trying to explain this to you.  My Raspberry-Pi can run Doom.  It can run 1080p video.  It can play games.  Using your logic, my Raspberry-Pi is as good as an i3 or i5 processor, because at these specific settings it works.  It also beats out the Intel Gigabit ethernet controller on my board, because it can connect to a dozen different sensors.  My NIC only connects to one thing, and only transmits data using one protocol.
> 
> Is the nature of your argument apparent yet?  If not, then I can also prove that my Ti-84 calculator beats the pants off your q6600.
> 
> ...



i think you're simply missing my point, because my point seems to be exactly the same as yours, lol. we just seem to word it differently.
i have already bought the upgraded system, for what i could afford (if you read the full thread).
After that, people were stating that my Q6600 may well have been better than the 8120 i bought to replace it.
so after a few heated words between a few other forum members, i decided the best thing to do was to run a series of tests on the q6600 with the outlined test sequences (the one game i still had installed and 3 bench tests),
and then run the exact same tests on the 8120 when it arrived (still waiting on the ram).
hopefully this will show that either the q6600 is in fact better than the 8120, they are about the same, or the 8120 is better than the q6600.
at no point did i say either of these processors was as good as or better than an i7, lol.. 

i set the game at the settings i did because they were the settings that best suited the gpu and cpu, the actual sweet spot for them.

I actually said the cpu was at 84% usage at the options screen.. it was fully used in the game, and the gpu also gets to 99%.. So it was configured properly.
i do believe that what has happened here is you did not fully read the thread and what has changed along the way. but that is something i often do.


----------



## Shambles1980 (May 15, 2014)

Lightbulbie said:


> Okay so you're making your GPU work harder than it should. Tell me why that's a good thing.


How is it working harder than having v-sync off?
i really don't see what you're trying to get at here..


----------



## Bucho (May 15, 2014)

lilhasselhoffer said:


> Welcome to TPU!


Thx



lilhasselhoffer said:


> (Xeon Servers)
> 
> The lack of any one person's ability to see improvements in processor speed is their own failing, not that of processor stagnation.  Likewise, CPU utilization is rarely as simple as people credit it.



Okay, hold on, you didn't quote all I wrote, and maybe (since English is not my native language [I live in Austria, that's in central Europe]) I can't express myself the way I want. So please excuse my errors that may lead to a misunderstanding.
I did mention that you can slap a 10-core Xeon on your average desktop Socket 2011 board, but if you are the average customer you will have no benefit from it. Of course there are some programs that can utilize all cores, and so that CPU would be proportionally faster than a dual or quad core. And yes - it is true that in the professional / server segment (that's what I meant when I wrote "in certain classes") performance increased just the same as it did back then. It is a huge improvement, like you said, to have only one server handling the needs of what used to require multiple machines.

We are talking about the average user here; most of them just use the PC to browse the internet, communicate, play games or write/paint stuff. Then there is the user at work who uses the PC for communication, general office use, development (programming), and so on.
In 99.9% of cases, neither benefits from more cores or more CPUs.
In this case we are talking about a gamer who probably uses his PC 50/50 for games/internet.



lilhasselhoffer said:


> You cite the great innovations of the past in greater frequencies, but seem to forget why frequency increases have functionally become minute.  Frequency increases actually stopped yielding huge results almost a decade ago.  The real innovation that allowed the CPU market to improve was better miniaturization, new materials, and increases in the efficiency of chip design that increased IPC.  I'd gladly pit one of the Athlon CPUs versus the P4 chips released at the same time, and demonstrate how better design won out over raw frequencies.  This lesson holds true today, when AMD FX processors far more easily attain high frequencies, but are beaten by the Intel offerings with significantly lower frequencies.


I wasn't praising high MHz or pointing it out as the messiah of performance, but rather explaining why we saw and felt quite a big boost in performance when changing a system or CPU. I mean, we went from 4.77MHz (IBM XT) to 8+ (286) to 16+ (386) ... up to 3800 MHz (Pentium 4 HT) quite seamlessly (at least with Intel CPUs). CPU architectures change and have their limits, or let's say frequency ranges where they perform best (maybe that's because of timings, latency and the general structure of how the CPU/whole system is built). So IPC AND frequency rose quite a lot as the architectures got better (and in the case where that wasn't quite true (Netburst) they only dramatically raised the frequency).

I have been there all along the way and had a Pentium III-S 1.4GHz that outperformed a Willamette P4 (with SD-RAM) even up to 1.8GHz. But after a while I bought a P4 2.8GHz with HT and 200MHz FSB and clocked it even higher, and was happy, because the increase was there thanks to the big MHz and the overall changes (faster memory, bus system and so on). Then I got myself a nice Socket 479 adapter and a Pentium-M, clocked that to 2.6GHz, and no P4 under 4GHz stood a chance (in games and even some programs too). I didn't mind the MHz because the performance was there.

Then they switched to dual cores and even quad cores, new architectures (Core 2 and finally Core i), but the core count stayed the same and the frequency was only slowly rising. Okay, games and applications had to catch up with dual and even four-core support ... but it seems they are still doing that, and there are many out there that only use one core.



lilhasselhoffer said:


> Your point about core count is accurate, but misguided.  The sad truth is that gaming is currently (though hopefully will not be for too much longer) more about consoles than about PCs.  This truth was motivated by publishers, who have treated the PC market as second citizens for years.  There are games out there that have made use of high end multi-core processors.  We have had the capability to use all that extra power for years, but the cost to use it has been deemed prohibitively high.  Since the Core processors were introduced we've been hearing about how games will be coming out soon that use all those extra CPU cores.  Our reality is that these games exist, but don't have enough juice yet to make them the norm.


But that's the problem ... if we have nothing to feed the cores with, they are useless (for gaming needs). And better IPC alone, since frequencies have been almost the same for the last few years, doesn't really bring the Ohhhs and Ahhhs we need to be motivated to buy new PCs.
-So you have a quad core - fine ... more cores are barely needed nowadays (I hope that changes in the future, or maybe not to "need" them but to benefit from more cores)
-You do not have the oldest generation (so the IPC is okay)
-and/or frequencies are high
Why do I need to upgrade? For 30-40% more CPU power (that maybe doesn't even translate into performance in games or apps) and slightly lower power usage?

And remember, I am not talking about server or workstation class PCs ... multi-CPU desktop systems existed many, many years ago, like dual Pentium Pro, Pentium II/III and Celeron, Athlon MP and so on, but they were ignored for gaming all the way along. I am sure (at least for the last few years) a lot of developers moved their focus to console games, like you said. With the PS3 and XB360 that slowly changed and games started using more than one core. I faced that with my Pentium-M back when GTA IV came out on PC. I couldn't really play it because of that single core, no matter what settings I used (the GPU was an overclocked GeForce 7900GT 512MB AGP). 
Today there is no game that I can't play (even quite decently) with my ~6 year old Core 2 Quad (overclocked quite well).
Think of trying to play GTA IV (Dec 2008) with a late 2002/early 2003 CPU. That would have been a P4 2.66/2.8GHz with a 133MHz FSB, or an Athlon XP 2600+/2800+.
Take that 6 year time period at any date back then and try to play a top game on a 6 year old CPU.

I still remember when we were converting DVDs to DivX on our P III 500MHz ... that took all night (a good 6-8 hours or so).
A few years later, still on a single core P4 or Athlon 64 with high MHz, we did the same in like 2-3 hours. Then with a dual core in like 1.5 hours, and now with a fast quad core in like 40 minutes.
You see, those jumps were big every time, thanks to better IPC, the rise in MHz and finally the core count.
All I am trying to say is that for the average user/gamer with a good quad core there seems to be no need to upgrade, and it looks to stay that way for some time yet, since even Skylake will have only 4 cores in the mainstream market.
That's why even a used i5-2500K sells at quite a high price: if you have one, I bet you can stay with it for the next 2-3 years and have no problems playing the latest games.


----------



## Toothless (May 15, 2014)

Shambles1980 said:


> How is it working harder than having v-sync off?
> i really dont see what your trying to get at here..


You're forcing your GPU to maintain 60fps, and to render 3x that. (Or so you say)

Having your CPU/GPU at 99% constantly in a game is not good for the health of the hardware.


----------



## Bucho (May 15, 2014)

Shambles1980 said:


> ...
> After that, people were stating that my Q6600 may well have been better than the 8120 i bought to replace it.
> so after a few heated words between a few other forum members, i decided the best thing to do was to run a series of tests on the q6600 with the outlined test sequences (the one game i still had installed and 3 bench tests),
> and then run the exact same tests on the 8120 when it arrived (still waiting on the ram).
> ...



That link @TheHunter  posted is quite interesting:
http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247

Funny to see that in games at 1080p and high settings both systems are close together and the GPU is the limiting factor. At lower res and/or lower settings the i7 goes up and away.
Since most people will play at 1080p and rather high settings, they may not get that huge an upgrade with the i7. But like you said, what is important to you (and I think to me too) is the minimum FPS. Too bad he didn't show those and only tested two games.
Keep in mind that the Q6600 was at stock and the i7 4770K too, so it's 4 old cores @ 2.4GHz against 4 new cores (and HT) @ 3.5+GHz, plus all the goodies around it: no 266MHz FSB, no slow DDR2 RAM, a faster PCI-E bus ...


----------



## Bucho (May 15, 2014)

Lightbulbie said:


> You're forcing your GPU to maintain 60fps, and to render 3x that. (Or so you say)
> 
> Having your CPU/GPU at 99% constantly in a game is not good for the health of the hardware.



Triple buffering in combination with v-sync doesn't mean it renders 60fps three times, but that it renders at most 3 frames ahead of the monitor sync.

And why is it not good for the hardware to run at 100%???
That's the way it should be; otherwise you are not using one of your components' (CPU/GPU) full potential.
(Like using a Titan Black edition with a Celeron D, or an i7-4960X with a GeForce 6200 PCI-E for 3D gaming)

As long as temperatures and voltages are within limits, these components will not degrade much faster than running them at, say, 60%.


----------



## Shambles1980 (May 16, 2014)

Lightbulbie said:


> You're forcing your GPU to maintain 60fps, and to render 3x that. (Or so you say)
> 
> Having your CPU/GPU at 99% constantly in a game is not good for the health of the hardware.



no, you're missing the point of triple buffering entirely..
with no v-sync, your graphics card sends its rendered frames to the monitor as it renders them.. let's say, for argument's sake, your card can render a constant 180fps and your monitor is 60Hz.
in that instance your gpu is sending 3x more frames than the monitor can show.
this can cause tearing, because your monitor may well start to display frame 3, but as it does so frame 4 has been sent to it, so the top of the picture is frame 3 and the bottom of it is frame 4. if these 2 frames are a bit different from each other then you will see tearing; if they were almost identical then you won't see any..

now, what triple buffering does is add 2 back buffers behind the front buffer. frame 1 goes to the monitor; the card still renders images like it did with no v-sync, but they go to a back buffer ("still no monitor refresh"). the card renders another image and sends that to the buffer ("still no refresh"), then renders another image and sends it to the monitor because it refreshed.. the 2 buffered images were never used, but were rendered in case they were needed..

now, limited-time queuing works very similarly, but is more of a double buffer: it saves the images that couldn't be displayed due to a refresh, and after a set time deletes them. this works almost exactly the same, but if the screen refreshes before that set amount of time then it will display an image from the queue. that image would be considered old, as the gpu could have rendered a new one instead.
in reality though, limited-time queuing works very well.

double buffering is a bit like limited-time queuing and triple buffering, but worse. as it only has 2 buffers, it sends image 1 to the monitor and image 2 to the buffer, and then it sits there till it can be displayed. the gpu was not able to buffer any more images and did not send out a fresher one, so the image from a double buffer is almost always old. you can also use double buffering with no v-sync, which is kind of pointless because then you can still get tearing.

so the card isn't working any harder. it's just preventing tearing, with the bonus of not delaying images.
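The flow described above can be sketched as a toy model: the GPU renders freely into back buffers, and at each refresh the newest completed frame is flipped to the front. The 180fps/60Hz numbers are the hypothetical ones from the post, not measurements:

```python
# Toy model of triple-buffered v-sync: which GPU frame each refresh shows.

def frames_shown(gpu_fps, refresh_hz=60, seconds=1):
    """Return the GPU frame number displayed at each monitor refresh."""
    shown = []
    for r in range(refresh_hz * seconds):
        # newest frame the GPU has finished by this refresh (integer maths
        # keeps the 180/60 case exact)
        latest = gpu_fps * (r + 1) // refresh_hz
        shown.append(latest)
    return shown

shown = frames_shown(gpu_fps=180)
# 60 refreshes per second; frames 1, 2, 4, 5, ... are rendered but never
# displayed, which is why fraps-style counters still read 60fps.
print(len(shown), shown[:5])  # 60 [3, 6, 9, 12, 15]
```

Every displayed frame is the freshest one available, which is the "same performance as no v-sync, but no tearing" point.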


----------



## TheoneandonlyMrK (May 16, 2014)

Since he is also benching default settings as well as his own speculative style, it's all good imho


----------



## crazyeyesreaper (May 16, 2014)

meh, Q6600: play Rome 2 Total War or Shogun 2 Total War maxed out with larger armies and watch it crawl.  10-12 FPS in melee vs 29-30 on the newer Haswell chips, lol


----------



## Toothless (May 16, 2014)

Bucho said:


> Triple buffer in combination with v-sync doesn't mean it renders 60fps three times but to render max. 3 frames ahead of monitor sync..



THANK YOU. At least I can understand the way you explained it.


----------



## OneMoar (May 16, 2014)

don't forget: to get an extra 800MHz out of an 8150 you need a board with the VRMs to handle it. At 4.4GHz an FX chip is going to be pushing a 200W TDP, to say nothing of the cooling required


----------



## HalfAHertz (May 16, 2014)

That q6600 vs i7 4770k comparison was pretty interesting. The q6600 held up surprisingly well in gaming at 19 x 12 for a 7 year old cpu. However, I think the Cinebench test was pretty telling. The q6600 was getting only 0.69 points per core. Last time I checked, the 8120 was around 0.90+ in that one. That's a 30% increase in single-threaded cases, and considering the amd has twice the number of cores, things are looking good for the OP.


----------



## Shambles1980 (May 16, 2014)

it's not the best board, but it is listed as being able to oc. only 4+1 though, so really not going to get that much out of an 8120 at all (like i said a few pages ago, a 63xx would have been better all round, but this is what i could afford).
i will be adding heat sinks to the vrms, but i don't expect to see crazy overclocks.
as for the cpu itself, that's getting a Xigmatek Aegir SD128264, so that's covered; but with the board being 4+1 i don't expect much.
possibly around 4.2, maybe more, maybe less, if i add heat sinks and an 80mm fan for the vrms,
but somewhere in that general area, i think, is the target for 4+1.

pretty sure i have some heat sinks here that will do the job nicely, but i haven't looked at the board properly yet. hopefully it won't be difficult to keep the transistors cool


----------



## HalfAHertz (May 16, 2014)

Shambles1980 said:


> it's not the best board, but it is listed as being able to oc. only 4+1 though, so really not going to get that much out of an 8120 at all (like i said a few pages ago, a 63xx would have been better all round, but this is what i could afford).
> i will be adding heat sinks to the vrms, but i don't expect to see crazy overclocks.
> as for the cpu itself, that's getting a Xigmatek Aegir SD128264, so that's covered; but with the board being 4+1 i don't expect much.
> possibly around 4.2, maybe more, maybe less, if i add heat sinks and an 80mm fan for the vrms,
> ...



I'd expect that your 8120 at 4.2 GHz will deliver similar performance to my 3330 at 3 GHz. I'm very happy with my 3330 and an hd7850 and I think you made a good choice.


----------



## Shambles1980 (May 16, 2014)

HalfAHertz said:


> I'd expect that your 8120 at 4.2 GHz will deliver similar performance to my 3330 at 3 GHz. I'm very happy with my 3330 and an hd7850 and I think you made a good choice.


well, my heat sink will take a while to arrive (just checked the estimated delivery date), but the ram should arrive today...
i can add a temporary heat sink to the vrms for now; it may just need cutting down some more. but with stock cooling i won't try any oc just yet.
i do have my eye on some 6+2 boards, which should let me unleash more of the power. i don't mind spending a bit on that and recouping some of it from this board, so it should work out ok.


edit

having looked at the board properly today, it only has 4 chokes near the cpu, which to me says it's 3+1, though it does have high quality transistors. but if it really is 3+1 then this board is not for overclocking, regardless of what people on the internet say. And i will need the transistors heat-sinked just to run at stock speeds without throttling (at least i think so). i haven't got that much experience with split power phases, but my understanding is that if the power circuit is protected (this board's is) it will throttle down and/or turn off due to heat.
I may need to do some more research on the board, but at least for now i will treat it as under-powered and not an overclocker, as i would rather get back some of the money on it because i did not blow it up, and get a reasonable 6+2.

i find it strange that the internet says the board is 4+1, but from what i see on the board it's a 3+1


----------



## TheoneandonlyMrK (May 16, 2014)

The memory phase is likely separate from the CPU VRMs.


----------



## Shambles1980 (May 16, 2014)

not so sure. just set it up, went to set it as an oc fixed at 3.1 (should be stock), turned off the turbo stuff and amd's version of speed step, so it should sit at a constant 3.1 provided the vrm isn't getting hot.
started occt ("can't really give it a proper run, as occt doesn't detect core temps, so i can't trust it to stop before any damage")
and it dipped to 2.8 at one point, which means it throttled due to the power phase being over-worked (that's at stock), so this has to be a 3+1..
Which will make the tests somewhat unfair. Like i said though, i do have a 6+2 i'm trying to buy right now,
but there is no way i can oc with this board.
i added an 80mm fan over the transistors, which should help, so it should not dip under normal gaming and stuff. but if it can't run occt without throttling then it's 3+1.
I didn't add a heat sink to the transistors though; the one i was going to use is a bit short.
i do have some memory heat sinks ordered which should fit, but if i get the 6+2 then it won't matter..

anyway, i'll run some tests with it now. but it is going to be hindered by the board's vrm.


----------



## OneMoar (May 16, 2014)

afaik if it's a G0 chip you should be able to get 4GHz out of it with reasonable voltage. A well-binned chip will do 4.0 with 1.45V or less; I knew a guy that hit 4.5 with 1.52V under water


----------



## Shambles1980 (May 16, 2014)

ok, so the Thief bench test, same settings as before. The 8120 did a little better, but not really enough to warrant much fanfare.
as usual the game reports min fps as 0 because of the delayed start, but msi afterburner reported it as 14.5 (0.6 slower than the q6600).
but throughout the test the 8120 was able to stay consistently higher, and ended up with 63.5 max (lower than the q6600)
and approximately 36 average (which is quite a bit better). given the lower max frames (but the same 0 min frames reported), to achieve this the 8120 had to be more consistent, with fewer low drops during the test.
the actual average reported by the bench was 33.9 (4.1fps better than the q6600 achieved with higher max frames).

so it looks pretty good for the 8120 so far, for being more consistent..

the actual game..
again it was reasonably close, with the 8120 being slightly more consistent at staying at the 60fps requested of it. but it too suffered some drops, mostly into the 50's, and once it dipped down to 41.5 (slightly lower than the q6600's lowest point). i must stress, however, that this was for one small second, once, whereas i could repeat the 44fps drop with the q6600 every time i ran down the open length of the courtyard. the 8120 only did it once, and managed to stay at 60.1 fps when i ran down the same path again a few times (possibly throttling due to the vrm).
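The "lower peak but better average" result makes sense once you average it out. A quick sketch with made-up per-second fps samples (hypothetical numbers, not the actual logs):

```python
# One run with a higher peak but repeated drops, one with a lower peak
# but steadier delivery -- the steadier run wins on average.
spiky  = [100, 100, 44, 44, 44, 44, 44, 44, 44, 44]   # q6600-style samples
steady = [60, 60, 60, 60, 60, 60, 60, 60, 60, 50]     # 8120-style samples

def avg(samples):
    return sum(samples) / len(samples)

print(avg(spiky), avg(steady))  # 55.2 59.0
```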

The only conclusion i can take from this is that with this board and cpu, you would notice no real difference gaming at these settings with this cpu at stock vs an overclocked q6600 at 3.0.
Again i need to stress that this is a slightly unfair test, due to the fact that this board is 3+1 and will occasionally throttle the cpu to 2.8 under load.

on to the synthetic tests.
valley bench:

|         | DX11 | DX9  | OpenGL |
|---------|------|------|--------|
| Min     | 15.1 | 15.4 | 14.6   |
| Max     | 74.1 | 78.1 | 66.1   |
| Average | 41.6 | 40.3 | 32.5   |

from here we again see that these 2 processors are very close, provided the q6600 is overclocked to 3.0 and the fx-8120 is locked to 3.1GHz and gets throttled down due to the stupid 3+1 power phase.
(how often it gets throttled to 2.8 i don't know, but if we assume it never does and was always running at 3.1 with turbo mode off, then the 8120 is at least on par with a q6600)
As i said earlier, this test is very biased in favour of the q6600, due to the stupid 3+1 power phase board, which I will be replacing..

anyway, 3D Mark.





here the 8120 seems to do pretty much exactly the same as the q6600 in the gpu test, which would imply the gpu is the limiting factor in this test.
Surprisingly (at least to me), the 8120 actually performs the physics tests better than the q6600, even when hindered.
and in the combined test it's a more even playing field, but the 8120 at 3.1 seems able to keep the frames at a more consistent rate than the q6600..





the q6600 performs quite a bit better here in terms of actual raw numbers, but the graph seems to show the 8120 is more consistent even when hindered..
but in the physics test the 3.0 q6600 is outright better than the 8120 at 3.1..
Which is strange, as in the previous physics test it was the other way around (again, this could be due to throttling).


so.. with this setup, and the 8120 locked at 3.1GHz with occasional throttling down to 2.8GHz ("due to stupid motherboard") and unable to use built-in features such as turbo mode,
the 8120 and the 25% oc'd q6600 seem to be about even...

this is actually a better result for the 8120 than i expected when i found out i was stuck with a 3+1 phase board and it throttled down under load.
If the 8120 had been allowed to use its turbo mode, the q6600 would not have kept up at 3.0; at 3.6, though, i think it probably would have.

so in the end, i have to say that a stock 8120 would be better than a 25% oc'd q6600, because it would be able to use its turbo mode and hit 4GHz.
unfortunately i cannot test that until i get at least a 6+2 board.

but these cpus, running at very similar clock speeds, are really close in terms of power per MHz. I would have to say the q6600 has slightly better performance for its MHz, but the 8120 on the right board can surpass the q6600 in terms of overclocking.
and if we are looking at stock cooling all round, with the correct boards for each cpu, then the 8120 would be the winner, simply due to the higher clock speeds.

Is it an upgrade?
Well, if you did like me and simply swapped your q6600 for the 8120, then yes, i guess it is..
if you would need to pay more for the 8120 setup than you could get for your Q6600, then probably not, unless you cannot overclock your q6600.

hope to get the new board soon and update this with more valid results..



OneMoar said:


> afaik if it's a G0 chip you should be able to get 4GHz out of it with reasonable voltage. A well-binned chip will do 4.0 with 1.45V or less; I knew a guy that hit 4.5 with 1.52V under water



with the G0 Q6600 it's not really the cpu that's the issue (well, temps are, but that's something you need to cater for). to oc the q6600 really high you need a board that supports a 1600 FSB natively and can then overclock past that with ease.. 

a 1333 board can usually oc to about 1600 and then will really struggle to go above that; some of them can't even do that. and if your board was a native 1066, then you would be aiming for 1333 or better as your overclock, and may not even manage that. 

the q6600 really tops out at about 3.7 on a decent aftermarket air cooler, imo (if you want it totally stable in occt). i doubt 1.45V would give 4GHz stable on a G0 regardless of the board; even at 3.75 my G0 wanted to draw 1.48V to be occt stable, and that was unsustainable on air.  
4GHz is more than possible with a G0 on the right board with good liquid cooling ("i don't mean a closed loop system"), but it would more than likely want to draw 1.5V+ under 100% load on all 4 cores. i think you could do it at 1.55 or lower.
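The FSB figures above follow from standard Core 2 clock arithmetic: core clock = FSB × multiplier (the Q6600's default multiplier is 9), and the "rated" FSB is the quad-pumped value (nominal ratings like 1066 are rounded from 266.67MHz):

```python
MULTIPLIER = 9  # Q6600 default

for fsb in (266, 333, 412, 445):
    rated = fsb * 4               # quad-pumped: 266 -> ~1066, 333 -> ~1333
    core_mhz = fsb * MULTIPLIER
    print(f"FSB {fsb}MHz (rated ~{rated}) -> {core_mhz}MHz core")

# FSB 266 -> 2394MHz core  (stock ~2.4GHz)
# FSB 412 -> 3708MHz core  (the 3.7GHz overclock mentioned earlier)
# FSB 445 -> 4005MHz core  (~4GHz needs a board happy well past a rated 1600)
```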


----------



## OneMoar (May 16, 2014)

note: for the money you spent on the 8150 you could have an H81 and an i5 4750 
http://www.newegg.com/Product/Product.aspx?Item=N82E16813128649
http://www.newegg.com/Product/Product.aspx?Item=N82E16819116897
for gaming this setup will do better than the 8150 
FX 8150: 150.00 USD
AM3+ motherboard with at least a 6+2 phase VRM to support the 8150: 75.00
________
H81 motherboard: 60.00
Intel Core i5: 200.00 
so the price difference is really about 35.00 
ofc there are other differences, but you will save that 35.00 USD within the first 3 months in power consumption alone. if you can't use all 8 threads of the 8150 then it's pointless to own it


----------



## Shambles1980 (May 16, 2014)

OneMoar said:


> note: for the money you spent on the 8150 you could have an H81 and an i5 4750
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813128649
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819116897
> for gaming this setup will do better than the 8150
> ...




i don't know how many times i have said this, lol,
but the cheapest USED i5 cpu here in the uk costs more than i spent on this system, and that is without a motherboard or ram. in fact the cheapest used i5 that i saw costs more than that i5 you linked to does new. But again, that i5 you linked to costs more than i spent on this whole system (and it comes with no ram or motherboard).

if i was in the usa it would be different, but i am not, lol.
If the difference in price had been as low as it is in the usa, it would have been a no-brainer. but given it was more than double the money here, it's very difficult to justify.

----------



## RCoon (May 16, 2014)

OneMoar said:


> the 35.00 usd in price you will save that within the first 3 months in power consumption



Not likely, it would be closer to around $12 in a year in savings. The cost in power of both processors at stock clocks is a few dollars a year difference.
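The arithmetic behind that can be sketched in a few lines. The wattages, daily hours and electricity rate below are assumptions picked for illustration, not measured figures; plug in your own numbers:

```python
# Rough annual power-cost comparison between an FX-8150 and a Haswell i5.
# All inputs are assumptions for the sketch: TDPs used as sustained draw,
# 4 hours of load per day, and an assumed $0.12/kWh electricity rate.

FX_WATTS = 125          # FX-8150 TDP (upper bound on sustained draw)
I5_WATTS = 84           # Haswell i5 TDP
HOURS_PER_DAY = 4       # assumed daily gaming/load hours
RATE_PER_KWH = 0.12     # assumed USD per kWh

def yearly_cost(watts):
    """Energy cost of running at `watts` for HOURS_PER_DAY, every day, for a year."""
    kwh = watts / 1000 * HOURS_PER_DAY * 365
    return kwh * RATE_PER_KWH

savings = yearly_cost(FX_WATTS) - yearly_cost(I5_WATTS)
print(f"Yearly savings: ${savings:.2f}")  # → Yearly savings: $7.18
```

With those assumed inputs the gap is single-digit dollars per year, so the payback period on a $35 price difference is years, not months.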


----------



## OneMoar (May 16, 2014)

RCoon said:


> Not likely, it would be closer to around $12 in a year in savings. The cost in power of both processors at stock clocks is a few dollars a year difference.


not here it isn't lol
power bill is 289 a month here and I haven't even run the AC yet


----------



## RCoon (May 16, 2014)

OneMoar said:


> not here it isn't lol
> power bill is 289 a month here and I haven't even run the AC yet



Good god.


----------



## Shambles1980 (May 16, 2014)

Well, I updated my first post in this thread to have the results all in one place, which is easier to find.
I still don't think it's a fair test, but the 8120 did better than I expected given the limitations the board forced on it.


----------



## Bucho (May 16, 2014)

Hi Shambles1980 and thanks for the test results.
I mentioned earlier that the board you got really is outdated and rather lower-end. The AMD 700-series chipsets were the 740G, 760G, 780G/780V, 785G and 790GX; those are the ones with integrated graphics. They were usually used in µATX boards and were not really meant for overclocking. The ones without an integrated GPU were the 770, 790X and 790FX, and there you could find a few great boards for overclocking.
But there are way better and newer boards out there (with more features and lower power use) based on the 990X/FX chipset, or some with the 970. The 990 boards are kind of expensive, but the 970 ones should be a lot cheaper. You might want to check the 970 models out before you buy, because some of them are no good for overclocking.

As for the results ... they don't surprise me. In some cases, at the same frequency, the 8120 should be better, but only if the program is optimized for it or its threads. I guess Battlefield 4 would show better results with the FX at these speeds.
And like you said, the FX will probably reach higher OC frequencies, but you need a good board and good cooling, as well as a decent PSU, for that.
Still, the Q6600 @ 3.6-3.7 will be a close match for that FX-8120 OCd to about 3.8GHz.


----------



## Shambles1980 (May 16, 2014)

If the Q6600 was at 3.7 and the 8120 at 3.8, I think the Q6600 would actually be better for all-round performance.
But to get a G0 Q6600 to 3.7 stable you do need to spend money.

I am looking at used 990FX boards right now, hoping to see one for a decent price and trade out this board for that.

I am also looking for cheap used 1155 Intel boards. The CPUs are still too expensive for now, which means not many people are buying the boards used, because buying them separately is usually more expensive than a used bundle.
I'm hoping to buy really cheap and then later on get a 2nd/3rd gen i5 for a good price.
As my wife is American, I can search the American eBay market and hope to get a cheap one there; a CPU should fit into one of the boxes they send our daughter every so often.

That project is an "if I find them" situation though..

This board I have right now..
Simply sux. It's 3+1 phase. The internet told me it was 4+1; more research shows only the 3.1 rev was 4+1, and the later revisions for some reason went to 3+1, with the supported processor TDP dropped from 125w to 95w. So really the 8120 should only run at 2.8, and it's only staying at stock speeds for as long as it is because of the 80mm fan I put in place.
On a different note, some "hypothetical" questions made me change some settings, and I can run this at 3.6GHz with no throttling at stock voltages if I have it running as 6 cores.
So that's how I'll have it running for the time being. It should help with single-threaded apps until I get a better board..


----------



## Vario (May 16, 2014)

Try to find an i5 2400 (not the 2400S) and a cheap 1155 motherboard.  You can even use the stock Intel cooler.  It's going to cost you less in the long run.  Sell the 8120 and AM3 board.

edit: the i5-2300, 2310 and 2320 can also be found cheap; there's a 2320 for $130 on eBay, for example.


----------



## Shambles1980 (May 16, 2014)

Vario said:


> Try to find an i5 2400 (not 2400s) and a cheapo 1155 motherboard.  You can even use the stock intel cooler.  Its going to cost you less in the long run.  Sell the 8120 and am3.




Working on it, but the 2400 is close enough to the same price as the 2500K here that I may as well get the 2500K (£10 difference).

Anyway, the board was listed as 4+1 phase when it's most definitely 3+1, so I'm seeing if they will do something about that.
I am trying to get a 6+2 board right now.
I have seen a really cheap board for about £25 that would let me put a 2nd/3rd gen i5 in it if I got one, and a really crappy Celeron G-something-or-other I could use meanwhile. But again, the i5s are just still too expensive right now.


----------



## Toothless (May 16, 2014)

Okay, so I'm sitting here with an older desktop that has a Q6700 paired with an 8600 GTS. It's the exact same chip as the Q6600 but with a 0.22GHz higher clock. It runs every day-to-day thing smoothly, and I'll be testing some games on it soon.


----------



## Shambles1980 (May 16, 2014)

The 8600 will be the issue with that setup. The best card I paired with my Q6600, imo, was the 5770; it had a 3870 and a 4870 prior to that. All of those cards are much better than the 8600, and the 3870 was not really that great. I would have gone for a 6950, as I thought at the time it was about the limit of the Q6600 (and the card possibly wouldn't have shown its full potential), but I did not do that. The 7850 I now have came along at a really good price with pretty much the same performance, so I thought I'd try it. It seems the 7850 is a bit too much card for the Q6600, which has then led to all this lol.
I think I still have the 4870 here somewhere.


----------



## Toothless (May 16, 2014)

Shambles1980 said:


> the 8600 will be the issue with that setup. the best card i paird with my q6600 IMO was the 5770, it had a 3870 and a 4870 prior to that all of those cards however are much better than the 8600. and the 3870 was not really that great.
> i think i still have the 4870 here some where.


I know of a few games that are more CPU-based than GPU.


----------



## Shambles1980 (May 16, 2014)

It's just that the 8600 is not going to do that quad any favors imo. I still think they are great processors; they just have to be properly balanced with the rest of the system to get the best out of everything.
A 5770 would be a night-and-day difference.


----------



## Toothless (May 16, 2014)

Shambles1980 said:


> its just the 8600 is not going to do that quad any favors imo. i still think they are great processors. they just have to be properly balanced with the rest of the system to get the best out of everything.
> a 5770 would be a night and day difference.


I get the fact that the GPU is weak. I think I know that. I'm not testing the GPU; I'm testing the limits of the CPU in a few games. I'm not going to go out to get a GPU for a rig that 

1. Isn't mine. Though the owner wants me to push it to see how far it will go.
2. Isn't a gaming rig. It is a multimedia/streaming desktop that has been recently restored.
3. Will probably never see a game after the tests are completed.


----------



## SaltyFish (May 17, 2014)

And here I am still rocking a QX6850 (along with a GTX 560 Ti) and playing games at 1920x1200...

It's very tempting for me to upgrade, but with DDR4 on the horizon I wonder if I should skip DDR3 completely and hold out until then: Haswell-E, Skylake, or whenever AMD rolls out their DDR4 socket (they've been waiting on DDR4 since there isn't much point in a new DDR3 socket now).

If you plan on OC'ing the living daylights out of your FX-8120 for longevity, you'd probably want to aim for a slightly higher-end motherboard. If your wife is heading to the USA anytime soon, you might even be able to score a flagship motherboard for less. Regardless, I'm looking forward to the OP's results on whatever new fancier board.


----------



## Toothless (May 17, 2014)

SaltyFish said:


> And here I am still rocking a QX6850 (along with a GTX 560 Ti) and playing games at 1920x1200...
> 
> It's very tempting for me to upgrade, but with DDR4 on the horizon I wonder if I should skip DDR3 completely and hold out until then: Haswell-E, Skylake, or whenever AMD rolls out their DDR4 socket (they've been waiting on DDR4 since there isn't much point in a new DDR3 socket now).
> 
> If you plan on OC'ing the living daylights out of your FX-8120 for longevity, you'd probably want to aim for a slightly higher-end motherboard. If your wife is heading to the USA anytime soon, you might even be able to score a flagship motherboard for less. Regardless, I'm looking forward to the OP's results on whatever new fancier board.


Basically, the old Q6600/Q6700/QX6850s are still powerhouses; you just need the right set of hardware around them. The only reason to upgrade from them is DDR4, unless you need more power right before a major tech update.


----------



## Lopez0101 (May 17, 2014)

Or more native SATA III, USB 3.0 and eSATA ports. And don't forget if you want an M.2 or mSATA slot on the mobo. The CPU itself is not the only reason to upgrade; chipset upgrades make a big difference as well.

My Q6600 box back home would need a video card that's a couple of generations newer to game at 1920x1080 at a graphics level I'd be okay with. And more RAM, haha.


----------



## Dent1 (May 17, 2014)

crazyeyesreaper said:


> meh Q6600 play Rome 2 Total War or Shogun 2 Total War maxed out with larger armies watch it crawl.  10-12 FPS in melee vs 29-30 of the newer Haswell chips lol



29-30 FPS on a Haswell isn't exactly good. That is actually terrible considering it's Intel's flagship chip and it's unable to achieve 60FPS in Total War: Shogun 2, which is like 3 years old!



Shambles1980 said:


> if the q6600 was 3.7 and the 8120 was 3.8 I think the Q6600 would actually be better for all round performance. but to get a G0 q6600 to 3.7 stable you do need to spend money.



Then you'd be massively wrong. The Phenom II X4 9xx can match the Q9xxx at equal clock speeds, so the FX 8120 would walk over it too, let alone the Q6600.

Anandtech: http://www.anandtech.com/show/2715/10
_Compared directly to the Q9550, the Phenom II X4 940 is a strong competitor. It had better average frame rates in CrossFire mode than the Q9550 in three titles, tied in one, and finished behind the Q9550 by about 2%~7% in the other three games.

When it came to actual game play experiences, we thought the Phenom II 940 was clearly the better choice in Company of Heroes: Opposing Fronts and Crysis Warhead due to minimum frame rate advantages and fluidity of game play. In the five other titles, we could not tell any real differences in the quality of game play between the Phenom II 940 and Core 2 Quad Q9550. Except for Far Cry 2_


----------



## Shambles1980 (May 17, 2014)

You could argue a Phenom II is better than an 8120 all round as well though.. so I stick by my statement.


----------



## Dent1 (May 17, 2014)

Shambles1980 said:


> you could argue a phenom II is better than an 8120 all round as well though.. so i stick by my statement



Then you'd be even more massively wrong.

The Bulldozer architecture is Deneb's successor. In the worst-case scenario, where the architecture doesn't scale, the performance will be virtually the same, +/- about <10% in either direction. But overall the Bulldozers consistently outperform by a bigger margin, as shown in this link ----> http://www.anandtech.com/bench/product/362?vs=434  (I used the FX-8150 vs 980BE as they have similar clock speeds.)

Now, 2 posts ago I quoted Anandtech saying the Phenom II X4 940 is on par with the Q9550 (Yorkfield). The link above shows the Phenom II X4 980 BE getting outperformed by the FX-8150 by a bigger margin at best and performing virtually the same at worst. So it would be impossible for the older Q6600 (Kentsfield) to outperform its successor the Q9550, or the Phenom II X4 940 BE or 980 BE, let alone an FX-8150 or anything from the Bulldozer 8-core family.


----------



## crazyeyesreaper (May 17, 2014)

Dent1 said:


> *29-30 FPS on a Haswell isn't exactly good, that is actually terrible considering its Intel's flagship chip and its unable to achieve 60FPS on Total War: Shogun 2 which is like 3 years old!*
> 
> 
> 
> ...



That's the thing though: show me another game series that allows 10,000-50,000 soldiers on screen, individually animated, fighting each other lol.

The average frame rate is much higher; the 29-30 is the MINIMUM.

So the Phenom II / Q6600 / Q9550 / FX 8000 series etc. all have minimums in the 11-15 range, which happens A LOT in melee battles. Haswell sitting at 29-30 means that in those intense, awesome situations your gameplay is fluid.

Still, old tech is old tech.

The Q6600, while serviceable, is still extremely dated and shows its age when paired with modern graphics cards.


----------



## Bucho (May 17, 2014)

Lightbulbie said:


> Basically the old Q6600/Q6700/QX6850's are still powerhouses, just the perfect set of hardware is needed. The only reason to upgrade from those is DDR4. Unless you have the need to have more power right before a major tech update.



I wouldn't make DDR4 THE reason. DDR4 will not make a huge improvement in performance when looking at Intel CPUs and the average customer who buys a PC for internet/office use or gaming. Right now there is not much difference between DDR3-1333 and DDR3-2133: in synthetic benchmarks and tests you can see it, but in real life and games the difference is almost zero (a few percent at most).
I guess DDR4 is more important for servers, since they plan to build single modules with large capacities, and the modules use less power / lower voltage. And in server workloads even the extra speed may be more relevant, where there are memory-intensive applications or several VMs running and accessing the memory at the same time.

About the Q6xxx CPUs ... yes, they are still quite capable, but we should not forget that the second/third gen i5s (2xxx and 3xxx models) are almost 40% faster, and a fourth gen i5 (4xxx) is about 50% faster, at the same clock speed. Sure, depending on the program the difference is sometimes way smaller but sometimes also way bigger. So the only way to keep up a little is overclocking. A Q6600 at stock is pretty slow compared to anything new you can buy at the moment from Intel; even a Celeron G1820 will be faster most of the time (at least if only up to 2 cores are used), and the i3-4xxx finishes a stock Q6600 off entirely.
http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487.html (there are some Ivy Bridge generation CPUs in the test, plus a Q9550, which is already faster than a Q6600: 433MHz more frequency, a 333MHz instead of 266MHz FSB, more cache (12MB instead of 8MB) and a newer architecture).
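To see what those rough per-clock uplifts (~40% for 2nd/3rd gen, ~50% for 4th gen) imply, here is a back-of-the-envelope "effective clock" sketch. The stock clocks and uplift factors are ballpark assumptions for illustration, not benchmark results:

```python
# Crude "effective clock" model: real clock times an assumed per-clock
# (IPC) uplift factor, with an overclocked Q6600 as the 1.0 baseline.

def effective_ghz(clock_ghz, ipc_factor):
    return clock_ghz * ipc_factor

q6600_oc = effective_ghz(3.7, 1.0)   # overclocked Q6600, the baseline
i5_2500  = effective_ghz(3.3, 1.4)   # 2nd-gen i5 at an assumed 3.3GHz stock, ~40% uplift
i5_4670  = effective_ghz(3.4, 1.5)   # 4th-gen i5 at an assumed 3.4GHz stock, ~50% uplift

print(f"Q6600 @ 3.7 : {q6600_oc:.2f}")   # 3.70
print(f"2nd-gen i5  : {i5_2500:.2f}")    # 4.62
print(f"4th-gen i5  : {i5_4670:.2f}")    # 5.10
```

Even a heavily overclocked Q6600 lands well short of a stock 2nd-gen i5 under this model, which is the point being made above.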



Lopez0101 said:


> Or more native SATAIII, USB3.0, eSATA ports. Don't forget if you want an M.2 or mSATA slot on the mobo. A CPU itself is not the only reason to upgrade, chipest upgrades make a big difference as well.
> 
> My Q6600 box back home would need to have a videocard that's a couple generations newer to game at 1920x1080 at a graphics level I'd be okay with. And more RAM, haha.



eSATA is not a selling point, since you had that back then with P35 chipset boards too. And M.2 and mSATA slots, in my opinion, are not really worth it on desktop boards; on ITX, where space (in the case) is often scarce, they're a nice thing. Instead I prefer SATA Express (basically and technically the same interface as M.2, so two SATA 6Gb ports slapped together, but the ports can still be used as plain SATA ports). But then again, M.2 usually comes shared with two SATA ports, so it wouldn't make a difference.
With the rest I agree: if you can make use of them, SATA III / 6Gb for SSDs, native USB 3.0 and faster PCI-E (3.0) slots are nice gains when upgrading an old platform.



Dent1 said:


> Then you'd be even more massively wrong.
> 
> The Bulldozer architecture is the Deneb's successor. At worst  case scenario where the architecture doesn't scale the performance will be virtually the same  +/- about <10% in either direction. But overall the where the Bulldozer's would consistently outperform by a bigger margin.  As shown in this link here ----> http://www.anandtech.com/bench/product/362?vs=434  (used the FX8150 vs 980BE as they have similar clock speeds)
> 
> ...



Check out some more benches between the FX 8120 or 8150 and the Phenom II. In gaming (and that's what we are talking about here) the FX is often even slower than the Phenom II. Your link only has 4 games; in 3 of them the Phenom is faster, and the 4th is StarCraft II, which only uses two cores, so the turbo of the FX can kick in.
So again, no doubt in most applications and programs that are optimized for the FX, use its many threads, or play to its strengths (compression, converting, encrypting), the FX is better. In games it is usually slower.
That said, this might change now, since some new games (BF4, maybe Watch Dogs ...) are better optimized for the AMD module architecture (maybe because of the new consoles, which use it too and have more threads).


----------



## Shambles1980 (May 17, 2014)

I did not want to say anything, but seeing as it's been pointed out already:
that chart shows the Phenom beating the 8150 quite soundly in a lot of areas. It trails behind a bit in some multi-threaded applications, but even then it still wins at video encoding, which in reality 8 cores should win at. Also, if we take into consideration that the 8150 is faster than the 8120, the Phenom would beat the 8120 by even more..
So, like I said, you could argue that all round the Phenom is better (better single-threaded, better for more games, more things can utilize it well).
So I still stick to what I said: clock for clock I think the Q6600 is faster than an 8120, but only just. That's considering single-threaded apps, games that don't use 8 cores, and so on.
"Normal usage."
But the 8120 can be clocked higher and does have the added benefit of updated board features (although some LGA 775 boards could support DDR3, and some did come with DDR3).
And I think it would be more of a fair test to compare the 95w 8120 vs the G0 Q6600, but I do not have the 95w revision. I'm sure that one has to be better than the 125w.

On the side of the 8120, I do think it is able to maintain a steadier throughput than the Q6600. The Q6600 seems to fluctuate a lot more, whereas the 8120 at the same clock may provide slightly slower performance but a more even distribution, compared to the highs and lows of the Q6600.
So slightly slower but more consistent at that speed, vs slightly faster but fluctuating..
I think I prefer consistency, but clock per clock I still think the Q6600 is faster. However, as I like a constant rate, the 8120 is probably better for me, although disappointingly similar clock for clock compared to the ancient Q6600.

-=edit=-

I think I have to buy a
GA-78LMT-USB3.
It's only 4+1 phase but is recommended for OC (all revisions).
It should let me get the CPU up to 4GHz, possibly 4.2. It has protected circuitry, so I won't fry the VRM if it can't handle the work; pretty sure it will be OK with an 80mm fan plus the heatsinks it already has.
I would have liked 6+2 or 8+1, but given I'm trying to do this as cheaply as possible, my choice is pretty limited.
Good news however: I can get that board new for less than £40. I was already given a partial refund on this 3+1 board, because it honestly shouldn't have this CPU in it, so if I can get a few £ back selling this board on, the new board should only have cost £15 or so, which really isn't bad given I'm still nicely under budget even after buying aftermarket cooling.
However, the extra 4GB of RAM will be going on the "next time" list.

The main issues with the board are the lack of SATA 3 and only PCI-E 2.0 16x, but that's not going to slow any of my components down, as I don't use an SSD, and the 7850 is not going to lose anything on 2.0 vs 3.0.


----------



## Dent1 (May 17, 2014)

Bucho said:


> Check out some more benches between the FX 8120 or 8150 and the Phenom II. In gaming (and that's what we are talking here about) the FX often even is slower than the Phenom II. Your link only has 4 games and in 3 the Phenom is faster and the 4th is StarCraft II that only uses two cores so the turbo of the FX can kick in.
> 
> So again, no doubt in most applications and programs that are optimized for the FX or use his many threads or his strenght (compression, converting, crypting) the FX is better. In games he is usually slower. That said this might change now since some new games (BF4, maybe Watch Dogs ...) are better optimized for the AMD module architecture (maybe because of the new consoles that use that too and have more threads).




Actually those 3 games show virtually the same performance and are within the margin of error. It actually supports my claim that at worst the Bulldozer will perform virtually the same as the Phenom II in games, and in extreme cases <10% in either direction.

Dragon Age: 121.4 FPS vs 118.4 FPS = within margin of error. Draw.
World of Warcraft: 80.6 FPS vs 77.7 FPS = within margin of error. Draw.
StarCraft 2: 42.5 FPS vs 47.8 FPS = winner, FX 8120.
Dawn of War: 60.1 FPS vs 51.5 FPS = winner, Phenom II 980 BE.

All that can be concluded from the gaming segment is that the frame rates in these "old" games are virtually the same. You can't say the Phenom II is faster in games, as that isn't what the results read; there isn't enough separation between the two to conclude anything meaningful.

Now also keep in mind the Phenom II 980 BE is actually running 100MHz higher, so I'm giving it the benefit of the doubt. But overall, beginning to end, when you factor in the entire review, the Bulldozer is faster.
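For what it's worth, those four results run through a quick percent-difference check. The 5% "draw" threshold is an arbitrary choice to illustrate the argument, not an established statistical cutoff:

```python
# Percent difference between the two CPUs for each game, with an
# arbitrary 5% threshold below which we call it a draw.
results = {
    "Dragon Age":        (121.4, 118.4),   # (Phenom II 980 BE, FX 8120)
    "World of Warcraft": (80.6, 77.7),
    "StarCraft 2":       (42.5, 47.8),
    "Dawn of War":       (60.1, 51.5),
}

DRAW_THRESHOLD = 5.0  # percent; an illustrative cutoff, not a standard one

for game, (phenom, fx) in results.items():
    diff_pct = (fx - phenom) / phenom * 100
    if abs(diff_pct) < DRAW_THRESHOLD:
        verdict = "draw"
    elif diff_pct > 0:
        verdict = "FX wins"
    else:
        verdict = "Phenom wins"
    print(f"{game}: {diff_pct:+.1f}% -> {verdict}")
```

Under that threshold the first two games come out as draws (-2.5% and -3.6%) and the last two as clear wins in opposite directions (+12.5% FX, -14.3% Phenom), which matches the verdicts above.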

Edit:
Shambles. When does your CPU arrive?


----------



## Vario (May 17, 2014)

Shambles1980 said:


> i did not want to say anything but seeing as its been pointed out already.
> that chart shows the phenom beating the 8150 quite soundly in a lot of areas, it trails behind a bit in some multi threaded applications but even then still wins with video encoding which in reality 8 cores should win at. also if we take in to consideration that the 8150 is faster than the 8120, then the phenom would only further beat that..
> so like i said you could argue that all round  the phenom is better (better single threaded, better for more games. more things can utilize it better)
> so i still stick to what i said clock for clock i think the q6600 is faster than an 8120 but only just. thats considering single threaded apps. games that dont use 8 cores. and so on.
> ...


Don't buy another cheap motherboard if you're doing an FX.  They really need all the VRM they can get; 4+1 isn't enough unless you were doing a 4300.


----------



## Shambles1980 (May 18, 2014)

Dragon Age: 121.4 FPS vs 118.4 FPS = within margin of error. Draw.
3 FPS = draw, because it was in the Phenom's favour.

World of Warcraft: 80.6 FPS vs 77.7 FPS = within margin of error. Draw.
3.1 FPS = within margin of error, because it was in the Phenom's favour.

StarCraft 2: 42.5 FPS vs 47.8 FPS = winner, FX 8120.
5.3 FPS = "absolutely smashed it! the Bulldozer is the clear winner"



lol, sorry for the sarcasm. It just seems that your margin of error extends to the precise point the Phenom is able to beat the 8150 by.. Something tells me 10 FPS would have been within the margin of error if the Phenom was 10 FPS faster..

My point however still stands: you can argue that a Phenom is faster... (as is being proven), and I specifically stated the 8120, which is slower than the 8150. By your "margin of error rules" that would undoubtedly turn all the areas where you say "Phenom, margin of error, draw" into "Phenom wins", and the one where the FX wins by a massive 2 FPS above your imposed margin of error would be reduced to a draw.. (by your margin of error rules).

All of this is utterly pointless however, because like I said, you can argue the Phenom is faster. Doesn't mean it's true and doesn't mean it isn't. It simply means you can have a valid argument about it, pitching apples vs oranges all day and never really agreeing on it.

So I still stand by my statement that a Q6600 clock for clock is faster than an 8120.
And I also still stand by my statement that I believe the 8120 is able to keep up the same throughput at a slightly slower pace than the Q6600, which seems to fluctuate more in its work..
The end result is 2 processors that at the same speed perform almost exactly the same in almost all tasks.



Vario said:


> Don't buy another cheap motherboard if your doing an FX.  They really need all the VRM they can get.  4+1 isn't enough unless you were doing a 4300.



That's my thinking too, but in all honesty this setup is not going to be a long-term build.
I chose the aftermarket cooler I did because it works very well on both AMD and i5s.

The long-term goal is to get an i5.
The short-term goal was to get rid of the Q6600 whilst I could still get newer tech for the same money. (Did that.)

Now I need a board on which I can actually run the 8120 at at least its stock potential, for as close to nothing as possible..
I spent a while looking, and the only board I can buy for next to nothing is the GA-78LMT-USB3.

I am also trying to buy a 1155 board with 2nd/3rd gen i5 support, which I will stash away for a while. The boards aren't that expensive right now, because the CPUs are expensive, so people just seem to buy pre-built or bundle deals.

Once I have the 1155 board I will roll this 8120 + board over, aiming for the £100 mark, which is a bit less than I see them going for on eBay and a bit more than I paid for the CPU and the RAM. I'll keep the RAM and spend about £100 on an i5 2500K.

This way, what's happened is I have turned my £70 worth of hardware (the Q6600 parts) into £100 + 4GB of DDR3 RAM. And I really do think if I had held on to that system much longer, I would have been risking getting even less for it.
What I hope will eventually happen is that, through a series of seemingly useless upgrades that cost me nothing, I can in fact get myself an i5 + motherboard + RAM for the cash price of just a motherboard, with the rest coming from selling the upgrades.

It also means that I have a system I can use upstairs whilst I try to find an inexpensive 1155 board.

Typing it out like that, it seems like a lot of effort to save a few quid and spread out the costs lol. But I think I prefer doing it the difficult way.
As for the cheap 4+1 board:
I could in theory just not bother with it and sell this board + CPU on..
But honestly, what kind of lowlife would knowingly sell on a 3+1 board with an 8120..
(Apparently Scan would lol. They were the ones who sold it to the person who sold it to me.)


----------



## Vario (May 18, 2014)

Shambles1980 said:


> thats my thinking too. but in all honestly this set up is not going to be a long term build.
> i chose the aftermarket cooler i did because it works very well on both amd and i5's
> 
> the long term goal is get an i5.
> ...



Sell the 3+1 board and the 8120 separately.  The less money you spend on getting the 8120 working, the better; as a budget processor it fails, because it requires a premium motherboard.

Alternatively, you could try to find a cheap Phenom II X4 and use it with that board; I got one 2 years ago for $60 brand new.  A Phenom II still requires decent motherboard power delivery, though.


----------



## Shambles1980 (May 18, 2014)

Vario said:


> Sell the 3+1 and the 8120 separately.  The less money you spend on getting the 8120 working the better, as a budget processor it fails due to requiring a premium motherboard.
> 
> Alternatively you could try to find a cheap Phenom II x4 and use it with that board, I got one 2 years ago for $60 brand new.  Phenom II still requires decent motherboard power delivery though.



You're right, I could still in theory get around £100 for this board + the CPU separately.
I may hold off on the other board then; just find a 1155 board cheap, then sell these 2 separately and fund the i5. It works out about the same, it just may take a bit longer to sell the board and CPU separately.


----------



## Vario (May 18, 2014)

Shambles1980 said:


> your right i could still in theory get around £100 for this board + the cpu separate.
> I may hold off on the other board then, just find a 1155 board cheap, then sell these 2 separate and fund the i5. works out about the same just may take a bit longer to sell the board and cpu separate.


1155 chips require so little power that you can run a real budget board, like an H61.


----------



## Shambles1980 (May 18, 2014)

Well, I just spent £20 on the cheapest 1155 board I could see that supported 2nd/3rd gen i5.
The thing will get lost in my case, no doubt about that lol.

I'll list the 8120 and board on eBay and hope they sell at the same time so I can get an i5.
I really don't want to be without a PC upstairs for long.


----------



## Vario (May 18, 2014)

Shambles1980 said:


> well i just spent £20 on the cheapest 1155 board i could see that supported 2/3gen i5.
> the thing will get lost in my case no doubt about that lol.
> 
> il list the 8120 and board on ebay and hope they sell at the same time so i can get an i5
> really dont want to be without a pc up stairs for long


Sounds good.  Which 1155 board?


----------



## eidairaman1 (May 18, 2014)

Honestly, I don't care what is better or not. Building a PC myself is considerably better than a prebuilt because of the cost and being able to select what I want.

If y'all care to, check out my signature rig. I'm still ordering parts, but once everything is here I'll build it and post pics. So far all the fans are in the case, and the RAM/CPU are in the mobo. Bear in mind I'm going from an Athlon XP and a Celeron Core 2 based laptop. The reasons why I'm building are as follows: there is no Windows 7 AGP GART driver for the NF2 Ultra 400 from nvidia (otherwise I'd be rockin' a 4670 AGP lol), plus the Celeron laptop lags worse than an AGP card on YouTube.


----------



## Shambles1980 (May 18, 2014)

MSI H61M-P20-G3.
It will end up upgraded to something bigger; not a fan of teenie tiny boards lol.


----------



## Dent1 (May 18, 2014)

Shambles1980 said:


> Dragon age, 121.4 FPS vs 118.4 FPS  = within margin for error. Draw.
> 3fps = draw because it was in phenoms favour
> 
> World of Warcraft, 80.6 FPS vs 77.7FPS = within margin for error. Draw.
> ...



Don't be silly. Do you know what margin of error means?

A victory by 1-5 FPS is within the margin of error: if you run the same benchmark 10 times, the frame rate and results will vary each time.

The results will never be exactly the same. No piece of hardware in the system (memory, CPU, GPU) will run at its most precise optimal level on every test run, and even at the OS level the software is always using resources differently and assigning variables differently.  Unless there is a consistent, large separation, it falls within the margin of error. It's not about saying one CPU is faster than the other; it's about using common sense. A 3 FPS and a 3.1 FPS separation don't tell us anything meaningful, except that the two CPUs perform virtually the same and the margin of error applies.
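To put a number on that run-to-run wobble, here are ten hypothetical runs of the same benchmark (the FPS figures are invented for the example) pushed through Python's `statistics` module:

```python
# Illustration of run-to-run variance: ten made-up runs of one benchmark.
# If the spread between runs is about as big as the gap between two CPUs,
# that gap by itself tells you nothing.
import statistics

runs = [118.9, 121.6, 120.2, 119.4, 122.1, 120.8, 118.5, 121.0, 119.9, 120.6]

mean = statistics.mean(runs)
stdev = statistics.stdev(runs)

print(f"mean {mean:.1f} fps, stdev {stdev:.1f} fps")  # mean 120.3 fps, stdev 1.2 fps
```

With a spread like that, a single-run "victory" of ~3 FPS sits only a couple of standard deviations out, so you'd want several runs before calling a winner.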




Shambles1980 said:


> so i still stand by my statement that a Q6600 clock for clock is faster than an 8120.
> and i still also stand by my statment that i believe the 8120 is able to keep the same through put up at a slightly slower pace than the q6600 which seems to fluctuate more in its work..
> the end result is 2 processors that at the same speed preform almost exactly the same in almost all tasks.



I can't honestly see the Q6600 being faster clock for clock; perhaps a Q9xxx would be on the same level. But let's talk hypothetically: even if the Q6600 is marginally faster clock for clock, that doesn't mean the end performance will be better. Reviews are run in near-perfect conditions, e.g. a freshly formatted HDD, newly installed OS, etc. In reality a normal user has significantly more background applications running and a less optimised OS: an antivirus scanner running or updating, a Steam or Origin client updating in the background, Skype, TeamSpeak, Fraps, etc., along with their everyday applications cached in memory. The Bulldozer FX 8-core would give the better experience there. Would you bank on a 3 FPS margin of error holding up under heavy stress, with no chance the 8-core pulls ahead?

We've already seen a margin of error of 3 FPS between the two. Run Crysis 3 on a bloated, spyware- and adware-infested PC with your antivirus doing a full HDD scan, whilst converting a video file and listening to a music playlist in the background, whilst also recording the game with screen-capture software, and I can guarantee the Q6600's FPS would dive while the Bulldozer 8-core FX stayed consistent.


----------



## Vario (May 18, 2014)

Shambles1980 said:


> msi h61m-p20-g3
> will end up upgraded to something bigger not a fan of teenie tiny boards lol


Good choice (if a bit minimalist). It should be fine. The only downside is SATA II.


----------



## Vario (May 18, 2014)

Dent1 said:


> We've already seen a margin or error of 3 FPS between the two, run Crysis 3 on a bloated spyware and adware infested PC with your antivirus software doing a full HDD scan, whilst converting a video file and listening to a music playlist in the background, whist also recording the game using a screen capture software and I can guarantee the Q6600's FPS would dive and the Bulldozer 8 core FX would stay consistent.


What about standing on my head while wearing a space suit and driving a school bus using my feet?


----------



## Dent1 (May 18, 2014)

Vario said:


> What about standing on my head while wearing a space suit and driving a schoolbus using my feet?



Thank you for that invaluable contribution.

Seriously, when we game we don't have the luxury of a newly installed OS with minimal background apps, like in the reviews.


If you've only got a 3 FPS margin-of-error yield, it's already gone once you factor in that the average gamer runs background apps whilst doing other tasks.

Then factor in that every year those background tasks require more resources and become better optimised for multi-core CPUs in general.

Then factor in that every year games become better optimised for multi-core CPUs in general.


That means the 3 FPS margin-of-error yield of a quad core gets eroded year on year relative to, say, a hexacore or octocore of similar IPC, whereas the hexacore or octocore would gain yield year on year, or at least maintain its performance for longer.


----------



## Shambles1980 (May 18, 2014)

Just did a pre-emptive purchase of an i5-2500K now.
Just hope the FX-8120 sells soon or I will be short on cash and have an 8120 I won't be using.


----------



## Shambles1980 (May 18, 2014)

Dent1 said:


> Thank you for that invaluable contribution.
> 
> Seriously, when we game we don't have the luxury of having a newly installed OS, with minimal background apps like in the reviews.
> 
> ...



I have no issues with 8 cores, or 6 cores, or 4 cores with HT, or however you want to describe the number of threads you can process, but Bulldozer just isn't that great for its generation in per-clock-cycle power. It's not bad really, but the amount of actual processing power per core per clock is at early-2007 standards; for a CPU released almost 5 years later (late 2011) that's a tiny bit disappointing.
If we looked at the Q6600 as if it were a single-core CPU, and the 8120 as also single-core, and they were running at the same speed, the Q6600 would be faster. I have no doubt about that at all..
The 8120 does, however, have more grunt; from what I've seen it's able to plough through the work more consistently.
It's a bit like a car with high bhp and low torque vs. a car with high torque and lower bhp.
The high-bhp, lower-torque car is faster, but if you strap a heavy load to the back of both, the high-torque car would consistently manage to get close to its top speed, whereas the high-bhp, low-torque car would fluctuate more depending on hills and similar; at the end of the course they would end up pretty close..


----------



## Dent1 (May 18, 2014)

Shambles1980 said:


> If we looked at the Q6600 as if it were a single-core CPU, and the 8120 as also single-core, and they were running at the same speed, the Q6600 would be faster. I have no doubt ..



It doesn't necessarily work like that. If there were a way to disable the other 7 cores of the FX-8120, and 3 of the cores of the Q6600, and then disable all the shared memory subsystems so it's really 1 core vs. 1 core, then maybe. But it's impossible, because a multi-core processor still borrows resources from its other cores or modules. For example, the way Bulldozer shares its L2+L3 resources between its cores collectively makes it impossible to treat it as a single core. Even in single-threaded scenarios the other modules are still active storing variables.

Effectively, in a single-threaded situation it's still partially working as in a multi-threaded scenario, which makes it inefficient there. This is the risk AMD took: stagnant single-threaded performance in exchange for enhanced multi-threaded performance. Not a bad move considering that's the way software is going.

Also, IMO we don't need more single-threaded performance. You've already pointed out that your old Q6600 is enough for today's games, and I've had my Athlon II X4 since 2009 and it's plenty. Why do you care what is faster core for core? We're heading in a direction where it's about overall performance and multi-threading.


----------



## Shambles1980 (May 19, 2014)

lol, I did not say I cared; I said I think the Q6600 is faster clock for clock,
just from my observations running both chips.
Either way, an i5-2500K is on the way now. I just hope it's actually fast enough that I don't think "may as well have kept the Q6600",
because honestly the Q6600 really wasn't doing it for me any more.


----------



## Vario (May 19, 2014)

Shambles1980 said:


> lol, I did not say I cared; I said I think the Q6600 is faster clock for clock,
> just from my observations running both chips.
> Either way, an i5-2500K is on the way now. I just hope it's actually fast enough that I don't think "may as well have kept the Q6600",
> because honestly the Q6600 really wasn't doing it for me any more.


I am pretty sure it will be; assuming everything works perfectly, you should be set. The 2500K is the best gaming processor of recent years (considering cost, overclockability, and game performance). It's only ~8.8% slower per clock than a 4670K and overclocks much higher.
source: http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1129&page=14
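That "slower per clock" figure is just a benchmark score divided by clock speed. A quick sketch with hypothetical numbers (the real figures are in the ocaholic link; the scores below are illustrative only, not taken from it):

```python
def per_clock(score, clock_ghz):
    """Normalise a benchmark score by clock speed for a rough per-clock comparison."""
    return score / clock_ghz

# Hypothetical single-threaded scores at stock base clocks (illustrative only).
i5_2500k = per_clock(130.0, 3.3)   # ~39.4 points per GHz
i5_4670k = per_clock(147.0, 3.4)   # ~43.2 points per GHz

deficit = 1.0 - i5_2500k / i5_4670k
print(f"2500K is ~{deficit:.1%} slower per clock")
```

The logic behind Vario's point is that even if the newer chip wins per clock by single digits, a 2500K that overclocks several hundred MHz higher can close or reverse the gap.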


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> lol, I did not say I cared; I said I think the Q6600 is faster clock for clock,
> just from my observations running both chips.
> Either way, an i5-2500K is on the way now. I just hope it's actually fast enough that I don't think "may as well have kept the Q6600",
> because honestly the Q6600 really wasn't doing it for me any more.



You could have gotten either processor, i5-2500K or FX-8120; coming from a Q6600 it's a big leap. You won't have any regrets.

The only thing I would suggest is maybe upgrading from 4GB to 8GB of RAM, but other than that, with a 7850 the rig will be well balanced.


----------



## Bucho (May 19, 2014)

Dent1 said:


> ...
> We've already seen a margin of error of 3 FPS between the two. Run Crysis 3 on a bloated, spyware- and adware-infested PC with your antivirus doing a full HDD scan, whilst converting a video file and listening to a music playlist in the background, whilst also recording the game with screen-capture software, and I can guarantee the Q6600's FPS would dive while the Bulldozer 8-core FX stayed consistent.



You forgot: ... while watching a Blu-ray, editing some huge photos in PS and adding filters, while constantly updating Facebook, Twitter and all that shit on what I am doing at the moment, and video-chatting on Skype; oh, and don't forget the huge SQL database, webserver and forum software running in the background with hundreds of people accessing it at the same time.

I get your point, and I agree that the FX will probably handle multitasking way better than that (not even native) quad-core Q6600. But come on, nobody is playing a heavy 3D game while doing a full virus/spyware scan while converting a video while listening to music and recording the game while they're at it. And even if they were, I think the first thing to become limiting would be the hard drive speed, not the CPU.

Phenom II vs. FX
And yes, the few percent by which the Phenom II is faster is almost within the error margin. BUT at least it seems to be a little faster, since in 3 out of 4 tests the "error margin" leans towards the Phenom.
There are some tests out there showing that a Phenom II core at the same clock speed is indeed sometimes faster than an FX core. But usually they compare an X6 to an FX-8150, so with all cores enabled it's a little hard to compare, since that's 6 cores vs. 4 modules with 8 threads.

Both at 4.2GHz Turbo OFF:
http://wccftech.com/amd-bulldozer-f...i-x6-1100t-clocktoclock-benchmark-comparison/
(There is only Stalker in there for gaming that shows the same results, but the other tests show that the PII is faster)

Both at 3.3GHz
http://www.pcper.com/reviews/Proces...r-Unearth-AMD-Victory/FX-versus-Phenom-Perf-0

My guess is that all of these CPUs (Q6600, Q9xxx, Phenom II X4 9xx and even X6, as well as the FX-61xx/81xx) at the same clock speed are extremely close together when it comes to gaming. The downside of the Core 2 series is that they have to be overclocked to reach a decent speed, and they still have an FSB that limits the memory interface (so DDR2 vs. DDR3 doesn't make a difference); but since the FSB is the screw you have to turn to overclock at all, at high FSB speeds that bottleneck becomes less relevant. The Phenom II has nice clock speeds already, uses DDR3, and (at least the Black Editions) can be overclocked quite well (with a decent board, but that counts for all three CPU types). The FX already has high clock speeds too (at least the FX-63xx and 83xx models), can be overclocked even higher, and without a doubt should reach the highest OC. Its downside is that it seems to have the weakest single-threaded performance at the same clock speed, but it will surely benefit if a program is optimised for more than 4 cores or for the AMD module architecture, so it may be the most "future proof" of them.

@Shambles1980
The MSI H61M-P20 is one of the cheapest socket 1155 boards out there, but it will work even with an i7 without throttling or anything. The downsides are that it has no SATA 6Gb/s ports (which only matters if you use an SSD), no USB 3.0 ports (again, only important if needed), and only two DDR3 slots (you'd have to swap your modules for new ones if you want to upgrade).
Oh, and you will NOT be able to overclock that 2500K, since you need a P67, Z68 or Z75/Z77 board to do that, as far as I know.

But I guess you will be happy with your combo (H61, 2500K, 4GB DDR3 and the HD 7850) for now. You should see noticeably better FPS and scores across all games and apps.


----------



## Shambles1980 (May 19, 2014)

I did get an 8120; it was not a big leap..


----------



## Vario (May 19, 2014)

Bucho said:


> @Shambles1980
> The MSI H61m-p20 is one of the cheapest socket 1155 boards out there, but it will work even with a i7 without throttling or anything. Downside is that it has no SATA 6GB ports (only counts if you use a SSD), no USB 3.0 ports (again, only important if needed) and only two DDR3 slots (you have to exchange you modules for new ones if you want to upgrade).
> Oh and you will NOT be able to overclock that 2500K since you need a P67, Z68 or Z75/Z77 board to do that as far as I know.
> 
> But I guess you will be happy with your combo (H61, 2500K, 4GB DDR3 and the HD7850) for now. You should get noticeable better FPS and scores around all games and apps.



-Better FPS and scores are exactly what the OP wants. This setup should deliver that.
-Now the OP just needs 2x4GB of RAM and he's set for the next couple of years. Even 1333 RAM is enough; Intel chips don't really benefit as much from faster RAM.


			
ocaholic said:

> Overclocking the Core i5-2500K to 4.5 GHz makes the performance with our "low-preset" go up by 12.4 percent but when it comes to the high-preset the increase in performance is only 2.1 percent.


-The testing done by ocaholic even indicates that overclocking this core architecture gives minimal gains at higher video fidelity (3.3/3.7 turbo vs. 4.5).
-Also, the BIOS manual shows a multiplier setting, "Adjust CPU Ratio", so it may be possible to overclock.
-Be sure to check the socket when you receive the motherboard to make sure the pins are all perfect; modern Intel boards don't have as robust a socket design as AMD. Don't lose the black plastic socket cover, it's required for RMA.
-Make sure your RAM voltage is 1.5 V; you can go up to 1.65 V but it's not recommended.


----------



## Tatty_One (May 19, 2014)

Shambles1980 said:


> I have no issues with 8 cores, or 6 cores, or 4 cores with HT, or however you want to describe the number of threads you can process, *but Bulldozer just isn't that great for its generation in per-clock-cycle power. It's not bad really, but the amount of actual processing power per core per clock is at early-2007 standards; for a CPU released almost 5 years later (late 2011) that's a tiny bit disappointing.*
> If we looked at the Q6600 as if it were a single-core CPU, and the 8120 as also single-core, and they were running at the same speed, the Q6600 would be faster. I have no doubt about that at all..
> The 8120 does, however, have more grunt; from what I've seen it's able to plough through the work more consistently.
> It's a bit like a car with high bhp and low torque vs. a car with high torque and lower bhp.
> The high-bhp, lower-torque car is faster, but if you strap a heavy load to the back of both, the high-torque car would consistently manage to get close to its top speed, whereas the high-bhp, low-torque car would fluctuate more depending on hills and similar; at the end of the course they would end up pretty close..


 
Forgive me, but surely you knew that before you bought it? It seems this very lengthy thread has moved from what you have (hence, I assume, the title) to what you need, to what you want. It's simple really; look at it this way: Bulldozer will pretty much give most people what they need, but it won't give a fair few people what they want. You started in the former and have managed to move through to the latter since you started this thread!


----------



## Shambles1980 (May 19, 2014)

I know of an app that can change the multiplier on a board, as I had to use it on a dc7700 when I changed the CPU: the microcode had been added in a BIOS update but it did not use the correct multiplier, and the BIOS did not allow any changes to it (CrystalCPUID). Whether it can change the multiplier on H61 is something I don't know, but the dc7700 did not allow changing the multiplier and it worked on that (probably won't). Either way I will eventually get a better board; I don't like the tiny ones. I did not get a full ATX case to use a teenie tiny board that gets lost in there lol.
Also, I will want 4x RAM slots regardless.
I have managed to sell this motherboard now; just waiting for the FX-8120 to sell so my wife stops being mad at me before I start looking for a replacement board.

I don't plan on using CrossFire; I haven't been a fan of that even back when Voodoo were using the idea ("even though not many people seem to remember that's where it all started").
I prefer a single GPU with more power over multiple GPUs ("I don't like paying 100% more for a 30-50% increase, and I don't like the lower compatibility").
So I don't mind only 1x PCIe x16.
SATA I'm not really fussed about, though I do need at least 4x SATA II ports because I use quite a few mechanical drives. I have seen boards with 4x SATA II and 2x SATA III, so one of those would probably be a good idea, as it would leave me scope for a decent-sized SSD or hybrid as well.


----------



## Shambles1980 (May 19, 2014)

Tatty_One said:


> Forgive me, but surely you knew that before you bought it?  it seems that this very lengthy thread has moved from what you have (hence I assume the title) to what you need,  to what you want.  It's simple really, look at it this way, Bulldozer will pretty much give most people what they need but it won't give a fair few people what they want, you started in the former and have managed to move through to the latter since you started this thread!



I know; I was kidding myself when I was thinking I could make do with "what I need".
If "what I need" could have been enough, then the Q6600 would have been fine and I never would have made the thread.
It seems I made this thread hoping people would convince me that I actually needed what I wanted.
It did not pan out that way, so I started to upgrade in the cheapest way possible.
The 8120 was better, but it was a huge disappointment for having twice the cores, similar clock speeds, and being 5 years younger. All it did was confirm my fears that if I wanted to upgrade, it had to be at least an i5-2500.
I know, me too; I probably won't be happy with that either lol. But that really is the maximum I am able to go to, and it's probably pushing it by quite a bit.
The Q6600 is the only CPU I was ever actually happy with; probably why I kept it for so long.
If the 8120 had been released at the same time as the Q6600 and I had bought it instead, I think I would have been just as happy with it until the exact same point I decided it was finally too old.

I do think that 6-7 years out of a single CPU is pretty darn good, and I don't know if I will ever again be happy using the same CPU for that long.
But given how we seem to have stalled on clock speeds, how the benefits of smaller process nodes are not translating into that much performance, and how 64-bit really doesn't speed things up compared to 32-bit (even though technically you should be able to do twice the work per clock because you can feed in twice as much data), it's quite possible that current-gen CPUs, paired with the right GPUs, will be able to do everything even in 10 years' time.
So who knows, I guess..

As for the 8120:
I did expect it to be a bit faster than it was, to be honest, and I was not expecting miracles, but it really was lacklustre. If I had been moving from a Pentium D to an 8120, it would have been more like what I was expecting; but from a Q6600 to an 8120, there's really not enough difference to justify it unless it's a straight swap.
Lucky for me it was a straight swap, so in the transition I gained some DDR3 RAM; if it wasn't for that, it would have left a very sour taste.


----------



## Tatty_One (May 19, 2014)

I had a really good D0-stepping 6600 that would run at 4 gig stable and I loved it. I still only kept it for around 18 months; then the Q9650 came out, and at the time that was a monster. I might even have been the first here to get one, and that did 4.4 gig 24/7. That stayed again for about 18 months, and then Bloomfield came; got me a 920, and then the D0-stepping 930s came out. I still have it now in my main rig going strong nearly 6 years later, running at 4.3 gig, but alas, it's almost time for it to go.


----------



## TheoneandonlyMrK (May 19, 2014)

Shambles1980 said:


> I did get an 8120; it was not a big leap..


In all, mate, you took what you thought was a good deal but wasn't, and then didn't stretch its legs in the right way in the right mobo, but whatever.
I was out of this conversation ages ago.
Oh, and I would wager I'll get six years out of this 8350 @ 4.8GHz despite crunching 24/7.


----------



## Shambles1980 (May 19, 2014)

83xx is Piledriver; 81xx is Bulldozer..
Piledriver is probably a bit better. I don't know why you seem to be offended because the 8120 simply wasn't enough better for me compared to the Q6600..
Who knows, if I had gotten a 63xx I may have been happy enough; I did mention that a 63xx would probably have been a better all-round choice at the time.
I gave the 8120 the best chance I could under the circumstances, but I don't feel that even at 4/4.2GHz it would be enough faster than a Q6600 (which I could easily get to 3.7 stable on air) for me to have been happy with it.

I can read all the benchmarks and see all the contradicting evidence on the internet.
But when I put both processors against each other in the situations where I use them, plus some synthetic tests, the facts were simple: the 8120 was able to sustain a more constant workload at a slightly slower pace, whereas the Q6600 could do the workload but would fluctuate from slightly faster to slightly slower and end up at about the same.

I really don't know how you can seemingly be offended by simple, straight observations. The two processors are just too similar to consider the 8120 an upgrade, and there is no way it will be a viable processor in 2-3 years; to me it isn't right now.
Like I said, however, you have a Piledriver, not a Bulldozer, so that may well be enough of a difference.
Personally (to me at least) the 8 cores of the AMD system are currently only as good as the Core 2 Quad system. So if I'm upgrading, I have to go for an Intel i5 with a board that will let me use a 3rd-gen i7 later, if mainstream use of 8 cores becomes more of a thing.
It's simply not a viable option to stick with an 8120 right now as an upgrade, limiting myself to AM3+ CPU upgrades, when I can go i5 with scope to upgrade to a later-gen i7 with nothing other than a CPU swap.
There is no way I could have gotten more than 6 months out of the 8120, even with a 12+2 board.


----------



## Vario (May 19, 2014)

These guys are really testy and easily offended; I told you the same thing a few pages back and dent1, mrk, and the rest kinda went nuts. If you point out the obvious you are an "Intel fanboy" these days (never mind that I have owned at least 5 AMD processors over the years, including socket 939, socket A, and AM3).

Consoles have been using sophisticated multithreading schemes for a long time; the PS3 and Xbox 360 had many threads (the Cell and Xenon architectures), yet it hasn't pervaded PCs. I doubt the current consoles will make much of a change either. Besides, one should buy based on what is available and possible now, not what might be in the future; that's a foolish move, and recommending someone buy a processor that competes at best with 2008 technology to somehow future-proof 2014+ makes no sense. Programmers can make games with brilliant graphics that run on the 2500K (a 3-year-old processor) with ease using one or two threads; why bother making a game that needs 8, except to justify some worthless Bulldozer purchase?

The whole idea of Bulldozer as a budget processor is crazy too: it requires a more expensive motherboard, a more expensive power supply, and a more expensive CPU cooler.

AMD got rid of the FX series architect Mike Butler and brought back Jim Keller, who was chief architect back when AMD made great, class-leading CPUs. They aren't even making any new AM3+ FX chips beyond the existing range; it's dead.


----------



## MxPhenom 216 (May 19, 2014)

Dent1 said:


> 29-30 FPS on a Haswell isn't exactly good, that is actually terrible considering its Intel's flagship chip and its unable to achieve 60FPS on Total War: Shogun 2 which is like 3 years old!
> 
> 
> 
> ...



@crazyeyesreaper was talking about the new Total War: Rome II game, not Shogun.


----------



## crazyeyesreaper (May 19, 2014)

It applies to both games, as they share the same engine.

Older tech runs them like crap in large melee fights.

Nehalem / Core 2 / Phenom / FX etc. all run the game well below what Sandy Bridge / Ivy Bridge / Haswell can do.

Hell, in Shogun 2 going from Sandy to Ivy gave a 20% boost across the board, due to IPC increases.


----------



## TheoneandonlyMrK (May 19, 2014)

While you lot swap chips like they are trainers, I'll be gaming on. Have fun.


----------



## Shambles1980 (May 19, 2014)

theoneandonlymrk said:


> While you lot swap chips like they are trainers, I'll be gaming on. Have fun.


lol, if I changed trainers every 7 years I'd have some mighty sore feet by now..


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> i do think that 6/7years out of a single cpu is pretty darn good. and i dont know if i will ever again be happy using the same cpu for that long.



Agreed, but realise that is because the Q6600 was released at a time when people were NOT sold on quad cores. I can remember all the doubters saying there wasn't a need for a quad core, that the Q6600 barely beat out an E6600 dual core, etc. Guess what happened? Games like Crysis, Far Cry 2, GTA IV, BF2 and BF:BC came out and rendered dual cores incapable. Gamers were forced to compromise considerably on settings or upgrade to a quad core.

Years later, the guys who opted for the Qxxxx series, Athlon II X4 or Phenom II X4/X6 are sitting happy. Those that had the Exxxx dual cores are eating crow.

The same thing is happening now; we are not sold on octocores... the cycle begins again.




MxPhenom 216 said:


> @crazyeyesreaper was talking about the new Total War: Rome II game, not Shogun.





crazyeyesreaper said:


> Meh, have a Q6600 play Rome II: Total War or Shogun 2: Total War maxed out with larger armies and watch it crawl. 10-12 FPS in melee vs. 29-30 on the newer Haswell chips lol


http://www.techpowerup.com/forums/t...600-is-no-good-now.200571/page-9#post-3108604


----------



## Shambles1980 (May 19, 2014)

It's not that I'm not sold on 8 cores.. the truth is nothing is truly 8 cores.. AMD have their way and Intel have HT.
So right now, to me, an 8-core 8xxx is a quad core with a form of HT, an i5 is a quad core, and an i7 is a quad core with Hyper-Threading.
If a quad-core i5 is better than a Q6600, and the Q6600 is about the same as an 8120, then it's better for me to get an i5; then later on, IF 8 cores become useful during the time span in which an 81xx or 83xx is still viable, I can get a 3rd-gen i7 and be much better off than if I had stayed with the 8120. Furthermore, in the meantime the i5 will be much better for me right now, and if 8 cores don't become important within the next 4 years I will still be much better off.

I opted for AMD because I believed it could have performed better than it did, but I'm not one of these people who buy something that sucks and then pretend they did the right thing and all is well and they're super happy with it.
I have no problem saying: OK, I thought perhaps people were exaggerating when they said it sucks, I thought it was fanboys, but honestly, compared to what I had it was not that much better, and certainly not a viable upgrade. In all honesty I could have bought a Phenom and felt the same; but at least I'm not making a huge loss, and it does allow me to buy the i5 rig for less money. No matter how much I wanted it to be better, wanting does not make it happen.
I don't have my i5 yet; who knows, maybe I'll say something similar about that when I get it, and end up saying that the Q6600 is still viable and pointless to upgrade from if you can OC to 3.7.


----------



## Jetster (May 19, 2014)

Shambles1980 said:


> its not that im not sold on 8 cores.. truth is nothing is truley 8 cores.. Amd have their way and intel have HT



Ivy Bridge-E has 6 cores and 12 threads. Xeons have 8 cores and 16 threads. And the 8150 is a native 8-core CPU. And Haswell-E will have 8 cores and 16 threads.


----------



## Vario (May 19, 2014)

theoneandonlymrk said:


> While you lot swap chips like they are trainers, I'll be gaming on. Have fun.


Maybe the people swapping chips have experience with a variety of chips, therefore can actually make a good recommendation.


----------



## Shambles1980 (May 19, 2014)

Jetster said:


> Ivy-E has 6 cores and 12 threads. Xeon has 8 cores and 16 threads.  And the 8150 is a native 8 core CPU. And Haswell -E will have 8 cores 16 threads


That's nice to know, but I'm kind of funny...
I look at this..





and I see 4 blocks with 2 cores in each, sharing the same everything. To me that's a quad split mechanically and not a true 8-core, with 4 caches (not 8).

Then this POWER7 from IBM is an 8-core if you ask me..




again, compared to the Bulldozer floorplan..




I know a lot of people will argue it's a true 8 cores, but I can plainly see it's a quad with some trickery (to me the big giveaway is the 4 sets of L2/L3 rather than 8).
Just like the i7..




4 cores.. but Hyper-Threading..

I honestly cannot call Bulldozer 8 cores, just like I can't call an i7 8 cores.
I never saw the floor plan of the 8-core Xeon, but if it's 16 threads then it has to be 8 cores with HT.
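The whole disagreement boils down to what you count: integer cores, shared FPU/cache blocks, or hardware threads. Here is a toy model of the chips being compared; the counts are the commonly cited ones (4 Bulldozer modules each with 2 integer cores sharing one FPU and L2; POWER7 with 8 cores at SMT4), so treat it as a sketch rather than die-verified data:

```python
# Rough topology models: modules, integer cores and FPUs per module,
# and SMT threads per core. Figures are commonly cited specs.
chips = {
    "Q6600":   {"modules": 4, "int_cores": 1, "fpu": 1, "smt": 1},
    "FX-8120": {"modules": 4, "int_cores": 2, "fpu": 1, "smt": 1},  # shared FPU/L2 per module
    "i7-2600": {"modules": 4, "int_cores": 1, "fpu": 1, "smt": 2},  # Hyper-Threading
    "POWER7":  {"modules": 8, "int_cores": 1, "fpu": 1, "smt": 4},
}

for name, c in chips.items():
    cores = c["modules"] * c["int_cores"]
    threads = cores * c["smt"]
    fpus = c["modules"] * c["fpu"]
    print(f"{name}: {cores} integer cores, {fpus} FPUs, {threads} threads")
```

Counted by integer cores the FX-8120 is an octocore; counted by FPU/L2 blocks it looks like Shambles' quad; counted by threads alone it ties the Hyper-Threaded i7, which is exactly why the two sides keep talking past each other.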


----------



## TheoneandonlyMrK (May 19, 2014)

Vario said:


> Maybe the people swapping chips have experience with a variety of chips, therefore can actually make a good recommendation.


I fix and own a progression of PC types, and my reasoning is based largely on experience, but my rigs are what they are.


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> It's not that I'm not sold on 8 cores.. the truth is nothing is truly 8 cores.. AMD have their way and Intel have HT.
> So right now, to me, an 8-core 8xxx is a quad core with a form of HT, an i5 is a quad core, and an i7 is a quad core with Hyper-Threading.



This is incorrect. AMD's architecture has genuine cores within.

It's only Intel's Hyper-Threading that uses the concept of virtual cores: two hardware threads sharing one physical core.




Shambles1980 said:


> I opted for AMD because I believed it could have performed better than it did, but I'm not one of these people who buy something that sucks and then pretend they did the right thing and all is well and they're super happy with it.



It's OK. You don't have to justify your decision. I'm not necessarily saying which CPU to buy.


Edit:

Shambles, going by the charts above, the i7 shows only 4 cores, which is why it's a genuine quad core. The Hyper-Threading element can't be seen, as it's virtual; it doesn't exist physically.

The Bulldozer chart shows 8 cores. So Bulldozer isn't an octocore even though it has 8 cores in the chart? Your logic is flawed.


----------



## Shambles1980 (May 19, 2014)

OK, I decided to edit the Bulldozer floor plan to show you (not accurately) what it would need to look like for me to call it an 8-core.

Please compare it to the real floor plan posted above.

Undoubtedly I would then be told it's 16 cores, though, because it has 8 modules..

However, I'm sure I'm not the only person in this world who looks at the real floor plan and sees a 4-core CPU.


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> OK, I decided to edit the Bulldozer floor plan to show you (not accurately) what it would need to look like for me to call it an 8-core.
> 
> Please compare it to the real floor plan posted above.
> 
> ...



Who made you qualified enough to say what an 8-core architecture looks like?

You can point to each of the 8 cores within the 4 modules. You can touch it and feel it, so it's a real 8-core processor. You can't do that with Intel, because 4 of its "cores" are logical, not physical.


----------



## Shambles1980 (May 19, 2014)

you can look at the floor plan and see that there is no way you can split 1 module into 2 and come out with 2 identical cores..
draw a line through any of the modules anywhere you want and you can't make 2 identical halves, or mirror images of each other..

but like i said, the main giveaway is the number of blocks of cache.
as shown with the true 8 core ibm power, each core has a block of cache.
and even the i7 has 4 blocks, 1 for each core, although they are shared.
whereas the bulldozer only has 4 for its supposed 8 cores.. so it's a quad with trickery..


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> you can look at the floor plan and see that there is no way you can split 1 module into 2 and come out with 2 identical cores..



What are you talking about? You can clearly see 4 modules, each module with 2 physical cores.


----------



## Shambles1980 (May 19, 2014)

i clearly see 4 modules (one in each corner) that you cannot split into 2 mirror or exact images of themselves, although each module is an exact copy or mirror image of the next module, just like a core,
and 4 sets of cache





i dont see how its anything other than quad..


----------



## Jetster (May 19, 2014)

Just because they share cache does not mean they don't exist. I see 8 cores.


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> i clearly see 4 modules (one in each corner) that you cannot split into 2 mirror or exact images of themselves, although each module is an exact copy or mirror image of the next module, just like a core,
> and 4 sets of cache
> 
> i dont see how its anything other than quad..



You mean that each module is like a traditional dual core?

Except with L3 cache linking them together?


----------



## Shambles1980 (May 19, 2014)

ok i'll try again..
for me to call it 8 cores, each module would look like this..





Which you can see in my edited floor plan..
whereas each module actually looks like this..





With all the best will in the world i can't call that 2 cores in 1 module..
i would, however, let you call it a much better design for a single core,

because it's a single core with twice as many pipelines..


----------



## Jetster (May 19, 2014)

Anyway the i5 2500K in gaming when overclocked will beat the 8150


----------



## Dent1 (May 19, 2014)


Shambles1980 said:


> With all the best will in the world i cant call that 2 cores in 1 module..



Why not? Are you saying those two cores are not physical? They don't exist? They're a figment of our imagination?

Or are you saying that those two cores are broken and can't do any arithmetic and logic?


----------



## Shambles1980 (May 19, 2014)

im saying that one module is 1 core with 8 pipelines.. because everything else is shared,
instead of having 2 true cores like in the 1st pic.
there is a HUGE difference.
but it is a better way to hyper thread, provided you had 2 of those in each module.
But then you would argue it's 16 cores because it can do 16 threads..

thats just how i see it though (im sure many many others do too). call me a purist if you want.


----------



## Dent1 (May 19, 2014)

Shambles1980 said:


> im saying that one module is 1 core with 8 pipelines.. because everything else is shared,
> instead of having 2 true cores like in the 1st pic.
> there is a HUGE difference.
> but it is a better way to hyper thread, provided you had 2 of those in each module.



Incorrect. Each core per module is independent, hence why each has its own independent L1 reserve. The resources _CAN_ optionally be shared (L2).




Shambles1980 said:


> But then you would argue its 16 cores because it can do 16 threads



Incorrect. The FX 8xxx only supports 1 thread per core. So it's 8 threads total.

I can't believe you made a CPU purchase over the last few days without even understanding AMD's product.


----------



## Shambles1980 (May 20, 2014)

i can't believe that you think that module is 2 actual cores.
the ONLY things not shared in that module are the integer schedulers and the pipelines.
Fp scheduler is shared.
fmac shared, and halved compared to having 2 real cores.
fetch shared.
decode shared.
only one L2 per module instead of 1 per core (so halved again).

how can you say it has 2 real cores in the module when it blatantly only has 1, with twice as many pipelines?

I don't understand how having the correct amount of 1 thing for 2 cores, then halving the amount of 2 other things and sharing everything else, makes it 2 cores.

hey, i have 2 oranges here. well, i cut them both in half and kept 2 of the 8 segments from the parts i discarded, but hey, it's still 2 oranges in amd land, even though if i tried to put them back together i'd have 1 orange, 4 pipelines and a scheduler.. because the rest of the orange isn't here.
(the pipelines and the scheduler would be the 2 segments not discarded, btw, in case you missed it)


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> i cant believe that you think that module is 2 actual cores.



It's not just me that thinks this; it's actual fact, documented in many technology articles and websites, as well as AMD's very own technical specification.

Why should we believe you rather than the facts written by professionals?


----------



## Shambles1980 (May 20, 2014)

well amd aren't going to say:
"We wanted to make 8 cores, but it's a bit expensive, so what we did was make a quad core, throw in an extra scheduler and pipelines, and call it 8 cores. hope no one notices whilst we think up something to do with this module design, because it sure as fudge isn't going to compete with an i7."

i have pointed out all my reasoning why i say it's not a true 8 cores. i gave a coherent analogy, and even stated that this is my opinion, in my eyes.
I simply can't call it 8 cores when it blatantly isn't. if amd want to redefine what a core is, that's fine, but i am not about to change the definition because it's better for their marketing. and im rather disappointed that some people have..

IMO if they had just said it's a quad core with extra pipelines, then people would not have been so disappointed with the performance


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> well amd aren't going to say.



Well, according to you, the FX 8xxx could be "16 threaded". Wouldn't AMD say this to sell more units? Your own logic makes no sense.

Also, even if AMD doesn't say it, I'm sure the thousands of independent technology reviewers would.



Shambles1980 said:


> We wanted to make 8 cores but its a bit expensive so what we did was make a quad core throw in an extra scheduler and pipe lines and call it 8 cores.



Says who? Says what independent source? If you insist on using yourself as a source, can you please upload your electrical engineering degree.


----------



## Shambles1980 (May 20, 2014)

see the picture that i showed with what an 8 core from amd would look like. That was AMD's initial idea.
then see the one that has 1 core with more pipelines? That's what they ended up doing..
you work it out..


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> well amd aren't going to say.
> We wanted to make 8 cores, but it's a bit expensive, so what we did was make a quad core, throw in an extra scheduler and pipelines, and call it 8 cores. hope no one notices whilst we think up something to do with this module design, because it sure as fudge isn't going to compete with an i7
> 
> i have pointed out all my reasoning why i say its not true 8 cores. i gave a coherent analogy, and even stated that this is my opinion in my eyes.
> ...


You can call it a giant single core if you want, that's fine. But you CANNOT say that it's fact. I saw all 8 cores there. Either you just want to argue or you're being ignorant to the people who have been trying to help you.


----------



## Shambles1980 (May 20, 2014)

Lightbulbie said:


> You can call it a giant single core if you want, that's fine. But you CANNOT say that it's fact. I saw all 8 cores there. Either you just want to argue or you're being ignorant to the people who have been trying to help you.



you can say it's 8 cores.
But if it looks like a quad core,
has the same amount of L2, fmac, fp scheduler, fetch and decode as a quad core,
and has the performance of a quad core,
then to me it's a quad core.

anyway, as this is getting to be a rather long and somewhat futile debate,
i will say:
im wrong, it so is an 8 core cpu. omg, how did i not see that


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> you can say it's 8 cores.
> But if it looks like a quad core,
> has the same amount of L2, fmac, fp scheduler, fetch and decode as a quad core,
> and has the performance of a quad core,
> ...


Acting like a child doesn't help your case.


----------



## Shambles1980 (May 20, 2014)

there is no case lol.
there is what there is, and then bashing my head on the wall..
I gave up on bashing my head and will let you call it 8 cores.
just don't expect me to do the same lol.

just to point out, i am actually still trying to sell this 8120. and i wouldn't say all i have said just to be obnoxious, because anyone finding this thread from google and reading it who may have potentially bought my 8120 probably would not after reading this.
Im just stating what i see, how i see it. And i don't really care if it's detrimental to the sale of my cpu, because i believe it is true.


----------



## Toothless (May 20, 2014)

We call it an 8-cored CPU because it is an 8-cored CPU. Where is your education coming from? Your degree in KnowingEverythingAboutAMD?


----------



## Shambles1980 (May 20, 2014)

Lightbulbie said:


> We call it an 8-cored CPU because it is an 8-cored CPU. Where is your education coming from? Your degree in KnowingEverythingAboutAMD?



Ahh, i see, being childish does help your case?

im sorry, i forgot that saying "this is a futile debate where i am bashing my head on a wall, so I concede to you" means that the debate continues, but in a puerile manner.

so in that case:
my dad could beat up your dad, and my dad says im right!


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> see the picture that i showed with what a 8 core of amd would look like. That was AMD's initial idea.
> then see the one that has 1 core with more pipelines? Thats what they ended up doing..
> you work it out..



OK. You are smarter than the technology community and AMD's thousands of staff who work in research and development.

----

Expert Reviews
These cores are arranged in four modules of two cores each. Each core within the module has its own level 1 cache, and each two-core module has a block of level 2 cache that the two cores share. All four modules then have access to a pool of level 3 cache. In the multi-threaded video-encoding and multitasking benchmarks, where you would expect the *processor's eight cores* to shine

Toms Hardware
Brute-forcing performance with higher clock rates and as many as *eight integer clusters* allows FX-8350 to snag a second-place finish in the threaded benchmark.

guru3d.com
Right bellow in line will be the FX-8120 that also packs *eight processing cores* and other similar features, but comes clocked at 3.1GHz

Wikipedia:
The modular architecture consists of multithreaded shared L2 cache and FlexFPU, which uses simultaneous multithreading. Each physical integer core, two per module, is *single threaded*, in contrast with Intel's Hyperthreading, where two virtual simultaneous threads share the resources of a single physical core.[8]

Extreme Tech
The good news on Piledriver is that the* eight-core *Vishera CPU is an unambiguous improvement on ‘Dozer.

PC Pro
In practice, it means AMD's top-end FX-8150, with four modules, can process* eight threads* at once.

PC Mag
The FX-8150's eight processing cores (the first consumer chip so equipped)

hardwarecanucks
With *8 cores on tap*, we think the initial FX-series offerings will certainly have the multi-threaded aspect covered


----------



## Shambles1980 (May 20, 2014)

I could go and google all the reviews that state "i7 yada yada utilizing all 8 cores"..
but you know that's a waste of time too..


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> I could go and google all the reviews that state i7 yada yada utilizing all 8 cores..
> but you know thats a waste of time too..



No you can't, because the i7 doesn't have 8 cores.

I would welcome you to try, because you will make yourself look even more ignorant.


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> I could go and google all the reviews that state i7 yada yada utilizing all 8 cores..
> but you know thats a waste of time too..


As if every bloke that knows even a LITTLE about hardware is always going to say "hyperthreaded virtual core"


----------



## Shambles1980 (May 20, 2014)

Dent1 said:


> No you can't because i7 it doesn't have 8 cores.
> 
> I would welcome you do try, because you will make yourself look even more ignorant.



it has 4 cores but hyper threading, so in a review utilizing bench tests and apps like occt that stress cores for stability, it's easy enough for a reviewer to type out "occt stressed out all 8 cores of the i7" when really it's only 4 cores.

just like it's easy enough for them to type out "8 amd cores", rather than 4 cores with extra pipelines


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> it has 4 cores but hyper threading so in a review utilizing bench tests and apps like occt that stress cores for stability. its easy enough for a reviewer to type out occt stressed out all 8 cores of the i7. when really its  only 4 cores.
> 
> just like its easy enough for them to type out 8 amd cores, rather than 4 cores with extra pipelines


But the FX-8xxx / FX-9xxx are actual 8-cored CPUs..


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> just like its easy enough for them to type out 8 amd cores, rather than 4 cores with extra pipelines



So you're saying that ALL the independent technology reviewers would rather spread misinformation because they are too lazy to type "4 cores with extra pipelines"? Do you think they would risk a lawsuit for such a ridiculous reason?

You sound more ridiculous with every post.


----------



## eidairaman1 (May 20, 2014)

Honestly, I'd suggest a 4350, 6300, or 8320,

a robust 970 motherboard, and 8-16GB of 1600-2133 RAM.

http://www.gigabyte.com/products/product-page.aspx?pid=4717#ov


----------



## Toothless (May 20, 2014)

eidairaman1 said:


> Honestly Id Suggest a 4350,6300,8320,
> 
> Robust 970 Motherboard 8-16GB of 1600-2133 Ram.


I love that timing.


----------



## Shambles1980 (May 20, 2014)

i already said this,
but if i had 2 oranges with 8 segments each, cut them in half, pulled out all the segments, and threw away 6 segments, then claimed i still had 2 oranges, i would be wrong.
i would have one orange and 2 segments..

that's what a bulldozer module is..
one core with 2 components of a second core squashed into it..


----------



## eidairaman1 (May 20, 2014)

Lightbulbie said:


> I love that timing.



Just giving suggestions for a build, look at my signature rig...


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> its easy enough for a reviewer to type out "occt stressed out all 8 cores of the i7." when really its  only 4 cores.



Please find me a review that says that. If it's easy for reviewers to say that, you won't have trouble posting a supporting link.




Shambles1980 said:


> i already said this.
> but if i had 2 oranges with 8 segments each. cut them in half.  pulled out all the segments, threw away 6 segments. then claimed i still had 2 oranges i would be wrong.
> i would have one orange and 2 segments..
> that's what a bulldozer module is..
> one core with 2 components of a second core squashed in to it..



I think you have a genuine confusion. At first I thought you were being ignorant; the confusion lies in the fact that you don't know the difference between a core and a thread, or, more importantly, what defines a core.


----------



## Shambles1980 (May 20, 2014)

Dent1 said:


> Please find me a review that says that. If its easy for reviewers to say that, you won't have trouble posting a supporting link.
> 
> 
> 
> ...


it's not me that's confused as to what a core actually is..

even amd in their initial designs had it laid out as a true 8 core. it just got changed along the way.
it's not my fault they managed to trick people.

if you call 4 pipelines a core.. then so be it, it's 8 core. but that's not what a core is..


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> its not me thats confused as to what a core actually is..



OK, in your own words, what is the difference between a core and a thread?

PS. I'm still waiting for you to post links to reviews saying "occt stressed out all 8 cores of the i7." or similar.


----------



## Shambles1980 (May 20, 2014)

i have MULTIPLE times pointed out what a core is. (perhaps you didn't read or look at the pictures, and that's why you still contradict what i say?)

a thread is a set of instructions sent to be processed.. not at all the same thing.

now you explain to me, in your own words, how 4 pipelines make a whole core.
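The core/thread distinction both sides keep circling can be shown directly in code: a thread is a software construct, so a program can create far more of them than the machine has cores, and the OS simply time-slices them. A minimal sketch (the thread count of 32 is arbitrary):

```python
# Threads are software: we can spawn 32 of them regardless of how many
# cores the CPU has, and the OS schedules them onto the available cores.
import os
import threading

results = []
lock = threading.Lock()

def work(i):
    # Each thread runs this tiny "set of instructions" and records its id.
    with lock:
        results.append(i)

threads = [threading.Thread(target=work, args=(i,)) for i in range(32)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results), "threads ran on", os.cpu_count(), "logical CPUs")
```

All 32 threads complete even on a single-core machine; how many run truly in parallel is what the core count (and hyper threading) determines.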


----------



## eidairaman1 (May 20, 2014)

Info for y'all:

The 8300 series are 8 cores, they just share cache resources

http://www.cpu-world.com/CPUs/Bulldozer/AMD-FX-Series FX-8350.html


----------



## Dent1 (May 20, 2014)

Shambles1980 said:


> now you explain to me in your own words how 4 pipelines make a whole core.



You obviously have no clue what a pipeline is.

A pipeline sends data to the components, like a wire.

4 pipelines are used because they give 4x the bandwidth of a single pipeline (in theory), thus allowing more bandwidth or throughput to travel bidirectionally, and thus reducing bottlenecks.

The pipeline has nothing to do with the core or what a core is. Sheesh.

If you think the 4x pipelines in the image represent the core, you are reading the image wrong. I was right, you were confused.


----------



## eidairaman1 (May 20, 2014)

Even more info

http://www.cpu-world.com/CPUs/Core_2/index.html

http://www.cpu-world.com/CPUs/Core_i7/index.html


----------



## Shambles1980 (May 20, 2014)

Dent1 said:


> You obviously have no clue what a pipline is.
> 
> A pipeline sends data to the components, like a wire.
> 
> ...



ok i'll tell you again what a bulldozer core consists of..

Fetch
Decode
integer scheduler
FP scheduler
2 Fmac
1 set of pipelines
l1 d cache
1 l2
1 l3

That is 1 core.

what the bulldozer has in a module is.

1 fetch
1 decode
2 integer scheduler
1 fp scheduler
2 fmac
2 sets of pipe lines
2 L1 d cache
1 L2
1 L3

that's 1 core, with 2 sets of pipelines and 2 integer schedulers, and 1/2 the amount of fmac it should have for 2 cores.

amd's initial design was to have

2 x Fetch
2 x Decode
2 x integer scheduler
2 x FP scheduler
4 x Fmac
2 x set of pipelines
2 x l1 d cache
2 x l2
1  l3

But that is not what they did..

I don't know why you find it so hard to grasp that this is the reason i don't call it 8 cores.

(initial design)







(what they settled on)






this is a depiction of 1 module.. there are 4 of these on a chip.
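Shambles' component lists can be tallied mechanically. The sketch below simply encodes his own counts from the lists above (his claims, not AMD's documentation) and reports which resources a module shares versus fully duplicates relative to two independent cores:

```python
# Toy tally of the poster's resource lists: what one Bulldozer module has
# versus what two fully independent cores would need (his "1 core" list, doubled).

module = {            # per the "what they settled on" list
    "fetch": 1, "decode": 1, "integer scheduler": 2, "fp scheduler": 1,
    "fmac": 2, "pipeline set": 2, "l1d cache": 2, "l2 cache": 1, "l3 cache": 1,
}
two_full_cores = {    # per the "1 core" list, multiplied by two
    "fetch": 2, "decode": 2, "integer scheduler": 2, "fp scheduler": 2,
    "fmac": 4, "pipeline set": 2, "l1d cache": 2, "l2 cache": 2, "l3 cache": 2,
}

# A resource is "shared" if the module has fewer than two cores' worth of it.
shared = [k for k in module if module[k] < two_full_cores[k]]
duplicated = [k for k in module if module[k] == two_full_cores[k]]
print("shared:", shared)
print("duplicated:", duplicated)
```

Run as written, the tally shows the integer schedulers, pipeline sets, and L1 data caches duplicated, with the front end, FPU resources, and L2/L3 shared; whether that still counts as "two cores" is exactly the point being argued.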


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> ok il tell you again what a bulldozer core consists of..
> 
> Fetch
> Decode
> ...


Your logic just makes me go.


----------



## eidairaman1 (May 20, 2014)

Considering you're so hellbent on this subject, instead of wasting any more time just get the cpu you want; otherwise you'll be waiting indefinitely. A Core i5 would do what you want.


----------



## Dent1 (May 20, 2014)

eidairaman1 said:


> Considering youre so hellbent on this subject instead of writing anymore just get the cpu you want. A core i5 would do for what you want.



Not sure if you're following. He has gone off topic. He is arguing nonsense claims.


a.) Bulldozer 8 core isn't a real 8 core processor.
b.) Bulldozer 8 core is really 16 threads.
c.) That two cores within a module count as 1 core?
d.) It's impossible for it to be an octocore because each core only has 4 pipelines?
e.) Says AMD isn't telling us the truth that the FX8xxx is an octocore.
f.) Claims reviewers acknowledge the i7 as being an 8 core and often refer to it as one (but won't provide a link).
g.) Thinks AMD's 8 cores are as fake as hyper threading.
h.) Says he knows better than AMD's research and development team.
i.) Says he knows better than independent technology reviewers.
j.) Implies he knows better than the technology community.

I can go all the way up to "z", but I will stop here.

This guy is crazy.


----------



## Shambles1980 (May 20, 2014)

Dent1 said:


> Not sure if you're following. He has come off topic. He is arguing  a nonsense claims.
> 
> 
> a.) Bulldozer 8 core isn't a real 8 core processor
> ...


you have a total inability to comprehend what you read..

A (correct, i said that)
B (totally wrong; i said that if they used that module design as it is now, "the one i say is a quad core", as a true 8 core, it could do 16 threads, and you would call it 16 cores)
c (i have repeatedly shown you it is 1 core with 2 components from a second core added; even amd's initial design vs what they ended up doing shows you that)
d (each MODULE has 2 sets of pipelines, which just makes it 1 core with extra pipelines)

I suggest you go back, read, and try to comprehend what has been said, because you haven't comprehended a single word..


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> you have a total inability to comprehend what you read..
> 
> A (correct i said that)
> B (totally wrong, i said if they used that module design as it is now "the one i say is a quad core as a true 8 core" it could do 16 threads and you would call it 16 cores)
> ...


What you've posted and argued is complete and utter BS. You keep making things up and calling them fact.


----------



## Shambles1980 (May 20, 2014)

yeah, sure i do.
if you sleep better at night thinking that, then that's ok.
just don't research it, ok?

i will never ever understand the ignorance of fan boys..
off to play a game on the AMD fx-8120.
It can do that, and it can do it just fine. it can do a whole bunch of other stuff fine too.

but it's not 8 core.


----------



## Toothless (May 20, 2014)

Shambles1980 said:


> i will never ever understand the ignorance of fan boys..


Who here is fanboying? We're sitting here with the facts on the plate and you're cooking some crap while thinking you're the master chef.

As for me, I own both AMD and Intel, and the only reason I go with AMD all of the time is due to lower budgets and/or friends wrecking their computers, which I then get to fix and keep.

Would I go with an Intel-based rig if I had the money? Certainly. 

Do I recommend AMD to people just as much as I recommend Intel? Always.

The only real fanboy here is you with your ignorance.


----------



## MxPhenom 216 (May 20, 2014)

Wow, I have never seen a dead horse get beat so bad.


----------



## 64K (May 20, 2014)

Shambles1980, it seems to me that if it weren't an 8 core processor, then Intel would be all over AMD for being dishonest in selling them as 8 core CPUs.


----------



## Vario (May 20, 2014)

Amazing how things have changed in the past couple years:



			
2007 said:

> AMD and Intel currently have divergent views on how to architect a quad-core desktop processor. Intel's quad-core parts - for desktop systems as well as servers - are based on a pair of dual-core chips put together on a single substrate - an arrangement know as a multi-chip module (MCM). The quad-core processors are equipped with either 8MiB or 12MiB L2 cache - 4MiB/6MiB per chip - and accessed via a common front-side bus (FSB) whose bandwidth is currently limited to 10.6GiB/s (1333MHz).
> 
> Cache contention and access latency can be problematic with four cores potentially thrashing away and requiring data to be streamed from main memory through a discrete memory-controller hub (MCH) on the motherboard, via the FSB. However MCM has benefits of its own. First, the time to market is reduced and that's contributed to Intel's one-year lead over AMD in desktop quad-core x86 processors. In addition, each dual-core assembly is smaller that a quad-core, so yields are improved and costs reduced.
> 
> AMD on the other hand reckons that 'native' quad-core is the way to go - using a monolithic piece of silicon to house all four cores, cache, and, of course, integrated memory-controller.


http://hexus.net/tech/reviews/cpu/1...henom-9600-vs-intel-core-2-quad-q6600/?page=2



			
2012 said:

> The basic building block of Bulldozer is the dual-core module, pictured below. AMD wanted better performance than simple SMT (ala Hyper Threading) would allow but without resorting to full duplication of resources we get in a traditional dual core CPU. The result is a duplication of integer execution resources and L1 caches, but a sharing of the front end and FPU. AMD still refers to this module as being dual-core, although it's a departure from the more traditional definition of the word. In the early days of multi-core x86 processors, dual-core designs were simply two single core processors stuck on the same package. Today we still see simple duplication of identical cores in a single processor, but moving forward it's likely that we'll see more heterogenous multi-core systems. AMD's Bulldozer architecture may be unusual, but it challenges the conventional definition of a core in a way that we're probably going to face one way or another in the not too distant future.
> 
> The bigger issue with Bulldozer isn't one of core semantics, but rather how threads get scheduled on those cores. Ideally, threads with shared data sets would get scheduled on the same module, while threads that share no data would be scheduled on separate modules. The former allows more efficient use of a module's L2 cache, while the latter guarantees each thread has access to all of a module's resources when there's no tangible benefit to sharing.
> 
> This ideal scenario isn't how threads are scheduled on Bulldozer today. Instead of intelligent core/module scheduling based on the memory addresses touched by a thread, Windows 7 currently just schedules threads on Bulldozer in order. Starting from core 0 and going up to core 7 in an eight-core FX-8150, Windows 7 will schedule two threads on the first module, then move to the next module, etc... If the threads happen to be working on the same data, then Windows 7's scheduling approach makes sense. If the threads scheduled are working on different data sets however, Windows 7's current treatment of Bulldozer is suboptimal.



http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested

*Lesson: the definition of a core is pure marketing. AMD says it has 8 cores, so it has 8 cores. An AMD FX core is different from an Intel core, different from a Core 2 MCM core, and different from a K10 core. However, that doesn't make it not a core.*
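The Windows 7 behaviour the AnandTech excerpt describes can be sketched in a few lines. This is a toy model with assumed core numbering, not actual scheduler code: on an FX-8150, cores 0-7 pair up into modules (0,1), (2,3), (4,5), (6,7), and the question is which module each thread lands on.

```python
# Toy comparison of the two scheduling policies from the AnandTech excerpt,
# reporting the module index each thread is assigned to.

def naive_schedule(n_threads):
    """Windows 7 style: fill cores in order, so modules get packed first."""
    return [core // 2 for core in range(n_threads)]

def spread_schedule(n_threads, n_modules=4):
    """Module-aware: give each thread its own module before doubling up."""
    return [t % n_modules for t in range(n_threads)]

# Four independent threads: naive packing uses only 2 modules (contending for
# each module's shared front end), while spreading uses all 4.
print(naive_schedule(4))   # → [0, 0, 1, 1]
print(spread_schedule(4))  # → [0, 1, 2, 3]
```

With 8 threads the two policies converge (every module holds two threads either way), which is why the scheduling patch mattered most at partial load.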


----------



## HalfAHertz (May 20, 2014)

Shambles, I think you need to look at the problem from another angle. AMD's Bulldozer design was ingenious because it tried to revolutionize the way a CPU works. Their goal was to make a more efficient processor. To do that, they dissected a core into its basic components and asked the question "What are the most heavily used parts of the core?". They decided to beef up only those parts and share the rest of the components, so that they keep utilization at 100% and don't have parts of the core sitting idle. That way they could have a smaller footprint and a cheaper design that could be more efficient. Now you have to understand that when so many complex functions happen in 1/1000000 of a second, it's very hard to do simple things like scheduling, prefetching, thread aligning and so on, so the fact that AMD actually made it work with so many shared resources is nothing short of a miracle.

The only problem is that they optimized it only for certain workflows, and it looks like AMD did not choose those workflows carefully enough. Still, if you look at reviews and see benchmarks for file or video compression or image and video rendering, you'd see that those silly modules of theirs actually manage to keep up with the i5s and the i7s, even though Intel's CPUs are produced on a much more advanced process node and pack more transistors.

The other thing is that Intel's HT actually parks one thread mid-flight when it reaches a decoder that needs more than 1 cycle to finish the task, and picks up another thread that can run on a different decoder along the pipeline. So even though Intel has done an amazing job and manages to switch threads mid-flight almost instantly, HT still introduces some minute latencies. AMD's design, on the other hand, can have the two threads run in parallel on the same module because each of those integer units has its own dedicated decoders. The only problem is that because their core is weaker, it's not fast enough to see any benefits.
They're both trying to solve the same problem of more efficient multi-threading in two different ways, but because Intel has a better micro-architecture and better manufacturing facilities, their SMT design seems better even though it's not.
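A toy throughput model makes HalfAHertz's point about shared front-end resources concrete. The decode width of 4 and the instruction counts below are invented for illustration, not AMD's real figures:

```python
# Toy model: one shared decoder of fixed width feeding one or two cores in a
# module. With two threads resident, each gets half the front-end width, so
# each thread takes twice as long as it would with the module to itself.
import math

def cycles_needed(instructions_per_thread, n_threads, decode_width=4):
    """Cycles for a module whose shared decoder splits its width among threads."""
    per_thread_width = decode_width / n_threads
    return math.ceil(instructions_per_thread / per_thread_width)

print(cycles_needed(100, 1))  # → 25  (one thread gets the whole decoder)
print(cycles_needed(100, 2))  # → 50  (two threads share it)
```

The module still finishes 200 instructions in those 50 cycles, so total throughput holds, which is the sharing bet the design makes: per-thread speed is traded for utilization of otherwise-idle hardware.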


----------



## Tatty_One (May 20, 2014)

I think this thread has run its course and has now turned into a chat room full of inaccuracies and misinterpretation, and is therefore closed.


----------

