# AMD Details Bulldozer Processor Architecture



## btarunr (Aug 24, 2010)

AMD is finally embracing a truly next-generation x86 processor architecture, built from the ground up. AMD's current architecture, the K10(.5) "Stars", is an evolution of the more market-successful K8, but it never saw the same success, overshadowed as it was by competing Intel architectures. AMD has codenamed its latest design "Bulldozer", and it features an x86 core design radically different from anything we've seen from either processor giant. With this design, AMD thinks it can outdo both the HyperThreading and multi-core approaches to parallelism in one shot, as well as "bulldoze" through serial workloads with a broad set of eight integer pipelines per core (compared to three on K10 and four on Westmere). Two almost-independent blocks of integer processing units share a common floating point unit with two 128-bit FMACs. 

AMD is also working on a multi-threading technology of its own to rival Intel's HyperThreading, one that exploits Bulldozer's split integer units backed by the shared floating point design. AMD believes this design is so efficient that each SMT worker thread can be deemed a core in its own right, and could in turn be backed by further competing threads per "core". AMD is also working on another micro-architecture codenamed "Bobcat", a scaled-down relative of Bulldozer with which it will take on the low-power, high performance-per-Watt segments that extend from all-in-one PCs all the way down to hand-held devices and 8-inch tablets. We will explore the Bulldozer architecture in some detail. 






*Bulldozer: The Turbo Diesel Engine*
In many respects, the Bulldozer architecture is comparable to a diesel engine: lower RPM (clock speed), higher torque (work done per cycle). When implemented, Bulldozer-based processors could outperform competing processor architectures at much lower clock speeds, thanks to one critical area AMD seems to have finally addressed: instructions per clock (IPC). Unlike the 65 nm "Barcelona" or 45 nm "Shanghai" architectures, which raised performance synthetically by other means (such as backing the cores with a level-3 cache and upping the uncore/northbridge clock speeds), the 32 nm Bulldozer features a genuinely broad integer unit: eight integer pipelines split into two portions, each portion having its own scheduler and L1 data cache. 
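The diesel-engine analogy boils down to simple arithmetic: sustained throughput is IPC multiplied by clock speed, so a wider core can win at a lower clock. A toy illustration (all figures are hypothetical, not AMD's):

```python
# Throughput = instructions-per-clock x clock speed in GHz, giving
# billions of instructions per second (GIPS). Figures are made up
# purely to illustrate the trade-off, not taken from AMD.
def throughput_gips(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

narrow_high_clock = throughput_gips(ipc=2.0, clock_ghz=3.6)  # ~7.2 GIPS
wide_low_clock = throughput_gips(ipc=3.0, clock_ghz=2.8)     # ~8.4 GIPS

# The wider, slower-clocked design comes out ahead.
assert wide_low_clock > narrow_high_clock
```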





*Parallelism: A Radical Approach?*
Back when analysts were pinning high hopes on the Barcelona architecture, those hopes were fueled by early reports suggesting that AMD was using wide 128-bit floating point units, leading analysts to believe AMD may have conquered its biggest nemesis, floating point performance, and with it pure math-crunching ability. That wasn't quite how it turned out: the processor's overall number-crunching ability had been pegged to its floating point performance alone, ignoring the integer units. 





AMD split the eight integer pipelines into two blocks, each block having four pipelines, its own integer scheduler, and its own L1 data cache. These constitute the lowest level of "dedicated" components, dedicated to individual processor threads. Between the two sits a shared floating point unit with two 128-bit FMACs, arbitrated by a floating point scheduler. The fetch/decode stage, the L2 cache, and the FPU constitute the "shared" components. 





AMD is implementing a simultaneous multithreading (SMT) technology: each set of "dedicated" components (in this case, an integer unit) handles a thread of its own while sharing the remaining components with the other integer unit, effectively making each set of dedicated components a "core" in its own right. Under this scheme, the basic building block of the Bulldozer die is a "module", a set of two such cores, and the Bulldozer die (chip) features n modules depending on the model.
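The dedicated-versus-shared split described above can be sketched as a toy data model; the component counts follow the article's description, but the classes themselves are purely illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class IntegerCluster:
    """'Dedicated' per-thread components: four integer pipelines,
    an integer scheduler, and an L1 data cache."""
    pipelines: int = 4

@dataclass
class Module:
    """A 'module': two integer clusters sharing the fetch/decode
    front-end, the L2 cache, and a two-FMAC floating point unit."""
    clusters: tuple = field(
        default_factory=lambda: (IntegerCluster(), IntegerCluster()))
    shared_fmacs: int = 2

    @property
    def logical_cores(self) -> int:
        # Each cluster is exposed to the OS as a core in its own right.
        return len(self.clusters)

# A chip is n modules; a four-module die exposes eight cores.
chip = [Module() for _ in range(4)]
assert sum(m.logical_cores for m in chip) == 8
```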



 

So now you have an eight-core chip with a much smaller die size and transistor count than a hypothetical 32 nm K10 8-core processor would have. It is unclear whether AMD wants to push SMT further down to the "core" level and run two threads simultaneously over the dedicated components, but one thing is certain: AMD has embraced SMT in some form. In all this, the chip-level parallelism is transparent to the operating system; it simply sees a fixed number of logical processors, with no special software or driver requirement.
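Because the parallelism is exposed as ordinary logical processors, existing software simply counts CPUs the usual way; for example, in Python:

```python
import os

# A four-module Bulldozer chip would simply appear as eight logical
# processors; no special driver is needed to count or schedule on them.
n = os.cpu_count()
print(f"OS-visible logical processors: {n}")
```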

So in one go, AMD boosted its integer performance. A thread either makes use of one integer unit with its four pipelines, or deals with both integer units, arbitrated by the fetch/decode stage and the shared FPU. 

*Outside the modules*
At the chip level, there is a large L3 cache, a northbridge that integrates the PCI-Express root complex, and an integrated memory controller. Since the northbridge is completely on the chip, the processor no longer needs a HyperTransport link to deal with the rest of the system. It connects to the chipset (now relegated to a southbridge, much like Intel's Ibex Peak) using A-Link Express, which, like DMI, is essentially a PCI-Express link. It is important to note that all modules and extra-modular components are present on the same piece of silicon. Because of this design change, Bulldozer processors will come in entirely new packages that are not backwards compatible with older AMD sockets such as AM3 or AM2(+). 



 

*Expectations*
Not surprisingly, AMD isn't talking up Bulldozer as the next big thing since dual-core processors (something it did with Barcelona). AMD currently has 8-core and 12-core processors codenamed "Magny-Cours", which are multi-chip modules of Shanghai (4-core) and Istanbul (6-core) dies. AMD expects an 8-core Bulldozer implementation (built from four modules) to deliver 50% higher performance-per-watt than Magny-Cours.
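Performance-per-watt is a ratio, so the 50% figure alone says nothing about absolute speed: it can be cashed in as more work at the same power, or the same work at lower power. A quick hypothetical calculation (all numbers invented for illustration):

```python
def perf_per_watt(perf: float, watts: float) -> float:
    return perf / watts

# Hypothetical baseline: 100 units of work at 115 W (made-up numbers).
baseline = perf_per_watt(100.0, 115.0)

# +50% perf-per-watt can mean 50% more work at the same power...
more_work = perf_per_watt(150.0, 115.0)
# ...or the same work at roughly two-thirds the power (~76.7 W).
less_power = perf_per_watt(100.0, 115.0 / 1.5)

assert abs(more_work / baseline - 1.5) < 1e-9
assert abs(less_power / baseline - 1.5) < 1e-9
```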





*Market Segments*
As AMD's slides show, the modular design allows AMD to create different products simply by varying the number of modules on the die. With this, AMD will have processors ready for most PC and server market segments, from desktop PCs, enthusiast-grade PCs, and notebooks to servers. AMD expects to have a full-fledged lineup in 2011; the first Bulldozer CPUs will be sold to the server market.


*Hot Chips 22 Presentation by AMD on the Bobcat Architecture*
Below are as-is slides from AMD's Hot Chips presentation on the Bobcat architecture.



 

 

 

 

 

 

 

 

 

 

 

 

 

 

 





----------



## afw (Aug 24, 2010)

Nice ... hoping to go AMD for my next build


----------



## Lionheart (Aug 24, 2010)

Interesting read, I hope this bulldozes Intel's next gen cpu's out the water


----------



## kajson (Aug 24, 2010)

If they can stay at the same price level with this product as they're selling their current products at, Intel might finally have something to really start worrying about.

After overtaking Nvidia in the gfx department, that really would be something...


----------



## WhiteLotus (Aug 24, 2010)

Lower clock speeds but better math crunching abilities... interesting.

And am I alone in thinking these will be big chips? What with everything on them...


----------



## LAN_deRf_HA (Aug 24, 2010)

I hope that A) these actually can beat an i7 in real benchmarks and B) that an equally amazing chipset comes with it. I haven't been impressed by amd chipset performance since... ever. Also hoping for good memory latency. Bandwidth shouldn't be a challenge with quad channel.


----------



## Fourstaff (Aug 24, 2010)

Can we has performance numbers instead?


----------



## Atom_Anti (Aug 24, 2010)

"_50% higher performance-per-watt compared to Magny-Cours._"

Magny-cours mean Phenom II?


----------



## naram-sin (Aug 24, 2010)

well, I mentioned "some form of HT" a couple of weeks ago, thinking that it would be a good way to go for AMD. or at least one page in their book... it seems to me that this will be (finally!) something new in AMD clan... good for them/us!


----------



## toyo (Aug 24, 2010)

Judging by the number of comments until mine, I can tell there's lots of scepticism about AMD's new CPU line... I guess they just delayed it for too long.

However, it seems they were kinda broke, and maybe it is only the 5800 series success that put Bulldozer back on the drawing board. I thought they abandoned the project for lack of resources or something.

Whatever the reality is, I hope it is worth waiting... AMD deserves a high-end CPU that will kick Intel line in the arse... it's maybe time for another performance crown switch like in good old Athlon vs Pentium days...


----------



## Valdez (Aug 24, 2010)

Atom_Anti said:


> "_50% higher performance-per-watt compared to Magny-Cours._"
> 
> Magny-cours mean Phenom II?



No.



> AMD currently does have an 8-core and 12-core processors codenamed "Magny-Cours", which are multichip modules of Shanghai (4-core) and Istanbul (6-core) dies.


----------



## PaNiC (Aug 24, 2010)

*Wtf amd*

Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
FUCK YOU AMD


----------



## Zubasa (Aug 24, 2010)

PaNiC said:


> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD


Stop being an emo.

AMD never forced anyone to buy 890FX boards.
You can drop a Phenom II X6 in a 790FX board and it works fine. 
The fact is, many 890FX boards (with more features) were released at a lower price than the original 790FX boards, so your comment makes no sense at all.


----------



## pantherx12 (Aug 24, 2010)

PaNiC said:


> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD



Pfffft, you assumed you needed it; no one else did.


----------



## naram-sin (Aug 24, 2010)

toyo said:


> Judging by the number of comments until mine, I can tell there's lots of scepticism about AMD's new CPU line... I guess they just delayed it for too long.
> 
> However, it seems they were kinda broke, and maybe it is only the 5800 series success that put Bulldozer back on the drawing board. I thought they abandoned the project for lack of resources or something.
> 
> Whatever the reality is, I hope it is worth waiting... AMD deserves a high-end CPU that will kick Intel line in the arse... it's maybe time for another performance crown switch like in good old Athlon vs Pentium days...



I think you are only partially right, regarding the "drawing board". I think the "delay" was actually caused by an almost complete redesign of the architecture, which is an obvious prerequisite for AMD's future... The old one is just that: old, even too old...

This one sounds really fresh and new, and I think that only with new ideas and new tech can AMD kick anything/anyone... but the bucket...


----------



## PaNiC (Aug 24, 2010)

Zubasa said:


> Stop being an emo.
> 
> AMD never forced anyone to buy 890FX boards.
> You can drop a Phenom II X6 in a 790FX board and it works fine.
> The fact is many 890FX boards (with more features) are released at a lower price than the original 790FX boards, so you comment makes no sense at all.



your old girl is emo
9 month support and only 3 new chips evan if i pay $20 for my crosshair iv it would still be a rip off


----------



## naram-sin (Aug 24, 2010)

PaNiC said:


> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD



Ooooo, a hit an' a miss... 

I think this would bear any meaning if it was a lament on LGA1155 and the "other guy".. you know? with an "I"? 
just joking


----------



## Zubasa (Aug 24, 2010)

PaNiC said:


> your old girl is emo
> 9 month support and only 3 new chips evan if i pay $20 for my crosshair iv it would still be a rip off


LOL go slap that at Asus.

My friends Gigabyte 790GX board (from the Phenom I days) still supports the new X6s.
The 790FX chipsets is even older than that.

The 890FX chipset is just an update to replace the old 790FX, and there is nothing wrong with that.
The same way that the X48 replaced the X38. The difference is the 790FX is already well over 2 years old when the 890FX is released.


----------



## Melvis (Aug 24, 2010)

Did I read that right? It will be 50% faster than the Magny-Cours? How well does a Magny-Cours compare to an i7?


----------



## Fourstaff (Aug 24, 2010)

We all know that the 890FX is a bridge chipset, somewhere which is neither here nor there (in the sense that 790FX can do pretty much everything it can, but it cannot support future chipsets), so why the hate for it now? After all, its just an alternative to the 7xx boards. 

^ Magny-Cours, not "Cores"


----------



## mastrdrver (Aug 24, 2010)

So since this does away with HT Link does nVidia have a license for A-Link or does it end up being the same thing with Intel's DMI connection where they can't use it?

If they can't that would mean the end of nVidia chipsets on AMD (and all pc related from here forward). Wonder if they will allow SLI on AMD boards? My guess would be no since they are a graphics competitor, but only time will tell.


----------



## PaNiC (Aug 24, 2010)

Zubasa said:


> The same way that the X48 replaced the X38. The difference is the 790FX is already well over 2 years old when the 890FX is released.



X48 realsed for the die shink core 2 65nm to 45nm and got bunch of new chips
i own a X48 and have nothing bad to say about it. 
X48 was board i had befor upgrading to my 890fx, i could have gone 1366 drop a i7 930 and later i7 970.


----------



## Lionheart (Aug 24, 2010)

PaNiC said:


> X48 realsed for the die shink core 2 65nm to 45nm and got bunch of new chips
> i own a X48 and have nothing bad to say about it.
> X48 was board i had befor upgrading to my 890fx, i could have gone 1366 drop a i7 930 and later i7 970.



I hope a mod deals with your stupid ass soon, all your doin is annoying ppl and calling AMD shit when you should be calling yourself that you little troll


----------



## pantherx12 (Aug 24, 2010)

CHAOS_KILLA said:


> I hope a mod deals with your stupid ass soon, all your doin is annoying ppl and calling AMD shit when you should be calling yourself that you little troll



Actually I don't think he's bothering anyone at all, he's hardly offensive.

He's entitled to say what he likes after all


----------



## mastrdrver (Aug 24, 2010)

PaNiC said:


> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD



Just a heads up but Lost Circuits is reporting that Bulldozer will be backward compatible with AM3.

You might just want to wait until later today when AMD makes their speech at Hot Chips to see if they verify this.


----------



## naram-sin (Aug 24, 2010)

That sounds amazing as well as incredible. Especially since NB is on CPU and a there is the A-link...

I think I can already see a new problem here, with the _drop-by readers_. HT for Hyper Threading and not Hyper Transport... 

IMHO, but I don't think they should cram to much of their "legacy" tech inside this kind of novelty... I think that Bulldozer deserves a new platform from scratch...

EDIT:


wolf said:


> buddy you just got to get over that.
> 
> on a lighter note, bring on the performance numbers! from all that info this chip sounds very promising, to many market segments.



I agree on the first part. 

On the second part, this is a kind of "paper (re)roll-out". So, only number that could come out is probably a number of slides from that presentation. 

But, I get you. I am also eager to see how (and where) this baby goes...


----------



## wolf (Aug 24, 2010)

PaNiC said:


> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD



buddy you just got to get over that.

on a lighter note, bring on the performance numbers! from all that info this chip sounds very promising, to many market segments.


----------



## PaNiC (Aug 24, 2010)

pantherx12 said:


> Pfffft, you assumed you needed it; no one else did.



i only got this now. a bit slow
i assumed nothing, i was mislead by AMD shitty roadmaps



> I hope a mod deals with your stupid ass soon, all your doin is annoying ppl and calling AMD shit when you should be calling yourself that you little troll



could you be anymore random scrub


----------



## btarunr (Aug 24, 2010)

mastrdrver said:


> Just a heads up but Lost Circuits is reporting that Bulldozer will be backward compatible with AM3.
> 
> You might just want to wait until later today when AMD makes their speech at Hot Chips to see if they verify this.



Nope, we talked to AMD, they said it won't be backwards compatible. The way the chip is designed, it won't even theoretically work on platforms designed for K8/K10.


----------



## pantherx12 (Aug 24, 2010)

Well no one else was, they made it very clear the x6 processors would have backwards compatibility.

There was a lot of news from board manufacturers as well saying basically saying " ours support x6 too LULZ!"

It just seems you may have missed everything we read.


----------



## Tatty_One (Aug 24, 2010)

PaNiC said:


> X48 realsed for the die shink core 2 65nm to 45nm and got bunch of new chips
> .



Weak argument, P35 and x38 chipset boards would still run 45nm chips with a simple BIOS update.


----------



## btarunr (Aug 24, 2010)

PaNiC said:


> X48 realsed for the die shink core 2 65nm to 45nm and got bunch of new chips



That's laughably misinformed.


----------



## PaNiC (Aug 24, 2010)

pantherx12 said:


> Well no one else was, they made it very clear the x6 processors would have backwards compatibility.
> 
> There was a lot of news from board manufacturers as well saying basically saying " ours support x6 too LULZ!"
> 
> It just seems you may of just missed everything we read.



yeah most of the people taking up amd side are intel owners


----------



## Kantastic (Aug 24, 2010)

PaNiC said:


> yeah most of the people taking up amd side are intel owners



And this helps your argument how?


----------



## PaNiC (Aug 24, 2010)

btarunr said:


> That's laughably misinformed.



it came out to support 1600fsb thats about it but there was still the die shinrk after so i see that as just


----------



## pantherx12 (Aug 24, 2010)

> I think that you are talking about Magny-Course as one being compatible to AM2(+), as well did they, the mnfcts... And this is Bulldozer, right? From the slides you can see that they are presenting it on 8-core processor. Or am I missing a point...?



Yeah, you're missing the point a bit; that was aimed at PaNiC. He said he was misled by AMD regarding the 800 chipset and the X6 processors.

However, I was saying there was a lot of information to be accessed from various sources.


----------



## naram-sin (Aug 24, 2010)

pantherx12 said:


> Yeah your missing the point a bit, that was aimed at panic, he said he was mislead by AMD regarding 800 chip-set and the x6 processors.
> 
> How ever I was saying there was lot of information to be accessed from various sources.



Yeah, I saw later that I did... and deleted the stupid post. 

But, I agree on the second one. Thanks for clearing that up.

And I was just thinking about replying on his... issues too. Because this is getting out of hand and going seriously off-topic.


----------



## Tatty_One (Aug 24, 2010)

PaNiC said:


> it came to support 1600fsb thats about it but there was still the die shinrk after so i see that as just



Your ignoring the facts, your very argument about the 890 also applies to Intel equally, I have no particular loyality to either side, your attack on the 890, whilst in part justified, coupled by your defence of the x48 makes no sense, the x48 gave us a new and improved chipset with DDr3 support etc etc but was not a neccesity to run 45nm chips just like as you mentioned the 890 is not a neccessity, the 890 also brings some added enhancements....... all I am saying is there are definate similarities that you choose to overlook for the sake of your argument.


----------



## PaNiC (Aug 24, 2010)

Tatty_One said:


> Your ignoring the facts, your very argument about the 890 also applies to Intel equally, I have no particular loyality to either side, your attack on the 890, whilst in part justified, coupled by your defence of the x48 makes no sense, the x48 gave us a new and improved chipset with DDr3 support etc etc but was not a neccesity to run 45nm chips just like as you mentioned the 890 is not a neccessity, the 890 also brings some added enhancements....... all I am saying is there are definate similarities that you choose to overlook for the sake of your argument.



but for what? 2 chips? intel had more then 30 after x48


----------



## Bloodcrazz (Aug 24, 2010)

> Originally Posted by PaNiC
> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD



I do agree with him. I also have this problem. I also got a AM3 socket knowing that AMD themselves quoted that the Bulldozer was going to be for AM3 socket.
And knowing now that it will not be a AM3 socket has got me feeling like i got riped off.
Its just like they said it for marketing purporses to sell AM3 cpu's, to make every one think theres a upgrade at the end of the year.
Now there saying its not AM3.


----------



## Tatty_One (Aug 24, 2010)

PaNiC said:


> but for what? 2 chips? intel had more then 30 after x48



Your right there, but how many of them would run on an x48 and not an x38?  I am not disagreeing with the argument, just the facts you have presented to support that argument, you are right, you just presented it wrongly if you get my meaning?


----------



## pantherx12 (Aug 24, 2010)

Bloodcrazz said:


> I do agree with him. I also have this problem. I also got a AM3 socket knowing that AMD themselves quoted that the Bulldozer was going to be for AM3 socket.
> And knowing now that it will not be a AM3 socket has got me feeling like i got riped off.
> Its just like they said it for marketing purporses to sell AM3 cpu's, to make every one think theres a upgrade at the end of the year.
> Now there saying its not AM3.




Just how long did AMD maintain backwards compatibility ? 


Time to move on, this is the tech world we're talking about things go quickly.

People will always be disappointed because something new comes out, it's just how things work with computers.

Can you link me to where AMD said it would be AM3 by the way? 

I've been following bulldozer for a bit and certainly never read anything like that, although other sources did have rumours about the socket.

I don't think AMD made any announcements at all.


----------



## naram-sin (Aug 24, 2010)

pantherx12 said:


> ...
> Can you link me to where AMD said it would be AM3 by the way?
> 
> I've been following bulldozer for a bit and certainly never read anything like that, although other sources did have rumours about the socket.
> ...



Me too. I was under the impression that Bulldozer would settle on LGA1207 or Socket F, even from the first announcements... definitely not AM3. And this is beside the point: a completely new architecture demands new components and interfaces, it's nature's law. I just hope this situation won't resemble the one we had with s754 and s939, that's all.

Meaning, I think it would be of greater use to focus all resources on the new platform and completely abandon further development of the old ones, no matter how many pennies there are to gain on old inventory. Maybe the new lithography will help here, with lower costs and higher yields...


----------



## PaNiC (Aug 24, 2010)

pantherx12 said:


> Just how long did AMD maintain backwards compatibility ?
> 
> 
> Time to move on, this is the tech world we're talking about things go quickly.
> ...



amd stated many times that it will be am3r2


----------



## Bloodcrazz (Aug 24, 2010)

pantherx12 said:


> Just how long did AMD maintain backwards compatibility ?
> 
> 
> Time to move on, this is the tech world we're talking about things go quickly.
> ...



Hear is a link.
http://en.wikipedia.org/wiki/Bulldozer_(processor)


----------



## PaNiC (Aug 24, 2010)

Tatty_One said:


> Your right there, but how many of them would run on an x48 and not an x38?  I am not disagreeing with the argument, just the facts you have presented to support that argument, you are right, you just presented it wrongly if you get my meaning?



x48 also had a close to 2 year life span not 9 months


----------



## wolf (Aug 24, 2010)

PaNiC said:


> amd stated many times that it will be am3r2



show me the links. with no links your just ranting.



Bloodcrazz said:


> Hear is a link.
> http://en.wikipedia.org/wiki/Bulldozer_(processor)



that link is moot, wikipedia isn't a statement from AMD, nor does the reference list show a link to AMD officially stating it will be on the AM3 socket.


----------



## pantherx12 (Aug 24, 2010)

PaNiC said:


> amd stated many times that it will be am3r2





I think you guys misunderstood me, can I have a link to a page were AMD or a representative of AMD stated that it will be a AM3 or a variation of AM3?


----------



## PaNiC (Aug 24, 2010)

wolf said:


> show me the links. with no links your just ranting.
> 
> 
> 
> that link is moot, wikipedia isn't a statement from AMD, nor does the reference list show a link to AMD officially stating it will be on the AM3 socket.



theres has been a few press conference/computer shows where they have said this. just google it
http://www.anandtech.com/show/2871/2


----------



## WhiteLotus (Aug 24, 2010)

It's your argument you google it.


If this is a new socket, wonder if they will change cooling attachment design?


And also I am loving this "modular" design, that mean they can have numerous types of the processor, all at different prices. Say a low end one wouldn't have much but a top end would have everything. Or am I missing the point?


----------



## naram-sin (Aug 24, 2010)

PaNiC said:


> amd stated many times that it will be am3r2



Most of things that can be found on the net are readers' guesses. At one point, AMD did mention am3r2, but hadn't claimed that there will be any backward compatibility. Or at least I can't find any.

But, we should keep in mind that this is not exactly what AMD initially intended. This is a result of going back to the drawing board and coming up with something (almost) completely new... or at least upgraded... and better (judging by these info)! And I say: good for them! And for most of us! If they came up with an idea that is great (superior, even, at least compared to previous one) but requires certain changes in other areas, then do it.

Otherwise, it would be like cramming a jet-engine inside a trailer...


----------



## Tatty_One (Aug 24, 2010)

PaNic, you have made your point.... thank you, let's move back on topic now and talk about Bulldozer and try and get over obselete chipsets (or not) as the case may be......   I appreciate sockets are relative, lets just not harp on about rip off previous socket support.


----------



## pantherx12 (Aug 24, 2010)

PaNiC said:


> theres has been a few press conference/computer shows where they have said this. just google it
> http://www.anandtech.com/show/2871/2



That's fairly dated but fair enough


----------



## Valdez (Aug 24, 2010)

wolf said:


> show me the links. with no links your just ranting.
> 
> 
> 
> that link is moot, wikipedia isn't a statement from AMD, nor does the reference list show a link to AMD officially stating it will be on the AM3 socket.


----------



## Atom_Anti (Aug 24, 2010)

Ok guys, we got two important pieces of information:
-not backwards compatible with AM3,
-50% higher performance-per-watt compared to Shanghai and Istanbul.

Can anyone tell me how much faster a Core i7 or a Phenom II is than Shanghai or Istanbul?


----------



## the_wolf88 (Aug 24, 2010)

> Really why did AMD bring out 890fx chipset, just for the shitty 1090t it was nothing the 790fx couldn't handle. you ripped off customers and you don't have enuf to do that . i hope your shitty chip sucks.
> FUCK YOU AMD



At least AMD motherboards are compatible with all their processors so I can upgrade the CPU anytime I want ! Not like Intel core ix which force you to upgrade to one generation only !


----------



## WarEagleAU (Aug 24, 2010)

I am glad they are going with a new chipset and socket. Bless them for being backwards compatible for so long. I don't mind spending more money for this new architecture. Also, that Lost Circuits link was saying that the new processor, chipset and socket would be backwards compatible with AM3 procs; at least that is how I took it, I could be wrong.

Cant wait to see this come out. IF it gets closer to the i7 I will be more than happy.


----------



## JATownes (Aug 24, 2010)

I understand that this was initially planned to run on AM3, but as technology advances, changes have to be made.  Look at LGA1366 from Intel, it is EOL next year to make way for LGA2011.  AM2+ and AM3 have had a very long run, but it is time to be replaced.  

Lets face it though.  If you are worried about upgrade costs, a 1090t on a CHIV will last a few more years without the need for an upgrade.  And if you are one who needs the "latest and greatest" changing sockets is just a way of life.  It happens, and we should be thankful it happens, or we would all still be stuck on DDR with single cores.  SKT754 & SKT939 had a very short life span, and AMD rectified that (IMO) with the AM2+/AM3 socket (And the PHII with dual memory controllers).  I am hopeful that their next socket will have an equally impressive life span.


----------



## Anarchy0110 (Aug 24, 2010)

Can't wait for this one to shoots out  AMD FTW !!!!


----------



## Valdez (Aug 24, 2010)

Atom_Anti said:


> Ok guys, we got two important pieces of information:
> -not backwards compatible with AM3,
> -50% higher performance-per-watt compared to Shanghai and Istanbul.
> 
> Can anyone tell me how much faster a Core i7 or a Phenom II is than Shanghai or Istanbul?



They're not comparing it to Shanghai and Istanbul, but to Magny-Cours, which is an 8- or 12-core processor (based on the Deneb core, like Phenom II).

More details: http://products.amd.com/en-us/OpteronCPUResult.aspx?f1=AMD+Opteron™+6100+Series+Processor


----------



## link2009 (Aug 24, 2010)

I'm sorry admins but allowing "PaNiC"'s posts in this thread was a mistake and it has slightly reduced my respect for the site. Every other member except him are trying to contribute to the topic at hand while he is trolling everyone and insulting left and right.

There are mature people here and respect should be given to everyone.

Now to be on topic... I really hope that Bulldozer pans out to be what everyone is making it out to be, a real-world competitor to the i7 and (possibly) Sandy bridge.


----------



## wolf (Aug 24, 2010)

Valdez said:


> http://images.anandtech.com/reviews/cpu/amd/FAD2009/desktoproadmap.jpg



that's all I was asking for, thank you


----------



## Anarchy0110 (Aug 24, 2010)

Btw, not backwards compatible with AM3??? C'mon


----------



## naram-sin (Aug 24, 2010)

JATownes said:


> I understand that this was initially planned to run on AM3, but as technology advances, changes have to be made.  Look at LGA1366 from Intel, it is EOL next year to make way for LGA2011.  AM2+ and AM3 have had a very long run, but it is time to be replaced.
> 
> Let's face it though.  If you are worried about upgrade costs, a 1090T on a CHIV will last a few more years without the need for an upgrade.  And if you are one who needs the "latest and greatest", changing sockets is just a way of life.  It happens, and we should be thankful it happens, or we would all still be stuck on DDR with single cores.  SKT754 & SKT939 had a very short life span, and AMD rectified that (IMO) with the AM2+/AM3 socket (and the PhII with dual memory controllers).  I am hopeful that their next socket will have an equally impressive life span.



Best point regarding the backward compatibility of Bulldozer, so far. 
_...can't make a scene if you don't have the green..._


----------



## PaNiC (Aug 24, 2010)

link2009 said:


> I'm sorry, admins, but allowing "PaNiC"'s posts in this thread was a mistake, and it has slightly reduced my respect for the site. Every member except him is trying to contribute to the topic at hand, while he is trolling everyone and insulting left and right.
> 
> There are mature people here and respect should be given to everyone.
> 
> Now to be on topic... I really hope that Bulldozer pans out to be what everyone is making it out to be: a real-world competitor to the i7 and (possibly) Sandy Bridge.



Trolling would mean I'm an Intel fanboy or something; I'm just making my point that there was no need for the 890FX, or to mislead people like AMD did.
Fact: people call me an emo first, and I own an AMD. I'm not a fanboy.


----------



## Valdez (Aug 24, 2010)

PaNiC said:


> Trolling would mean I'm an Intel fanboy or something; I'm just making my point that there was no need for the 890FX, or to mislead people like AMD did.



In fact, we have a lot of chipsets we don't need... and a lot of processors and GPUs... this market is just like this, face it.

Anyway, it's your job (if you are interested) to choose the best for your needs. If you chose wrong, it's only your fault.


----------



## Bloodcrazz (Aug 24, 2010)

link2009 said:


> I'm sorry, admins, but allowing "PaNiC"'s posts in this thread was a mistake, and it has slightly reduced my respect for the site. Every member except him is trying to contribute to the topic at hand, while he is trolling everyone and insulting left and right.
> 
> There are mature people here and respect should be given to everyone.
> 
> Now to be on topic... I really hope that Bulldozer pans out to be what everyone is making it out to be: a real-world competitor to the i7 and (possibly) Sandy Bridge.



I don't think that "PaNiC" insulted anyone at all.
I'm pretty sure he was having a go at AMD, not at any person.
It's called freedom of speech, hence this being a FORUM.


----------



## inferKNOX (Aug 24, 2010)

WhiteLotus said:


> It's your argument you google it.
> 
> 
> *If this is a new socket, wonder if they will change cooling attachment design?*
> ...


+1!!
I really don't want to have to change from my Prolimatech Mega Shadow! Although I'm sure they'll just release a fresh retention kit for the new socket.



PaNiC said:


> lolwat


Can you just quit it? We're not interested in your rant! :shadedshu
I can't believe people like you who complain about AMD's new CPU not being backwards compatible, when they've been making all their others backwards compatible for so long! Get over it; the world can't be on hold forever. The only way to make significant changes is to make things radically different and not backwards compatible.
“Man cannot discover new oceans unless he has the courage to lose sight of the shore.” -Andre Gide


This CPU sounds like it's gonna be great and I'm really looking forward to it!


----------



## wolf (Aug 24, 2010)

this thread has been going downhill since page 1.


----------



## JATownes (Aug 24, 2010)

PaNiC said:


> Trolling would mean I'm an Intel fanboy or something; I'm just making my point that there was no need for the 890FX, or to mislead people like AMD did.
> Fact: people call me an emo first, and I own an AMD. I'm not a fanboy.



Just to be clear: the AMD 890FX was added to include SATA 6Gb/s and USB 3.0... IIRC, those are the only real changes, and they were needed for some people running these devices.


----------



## JATownes (Aug 24, 2010)

PaNiC said:


> my rant isn't about the chip being backwards compatible, it's about AMD saying it was going to be and then changing their mind after I paid top dollar for *what is now a useless board*.
> 
> please read before you post... and yet I'm trolling



You could always send it to me


----------



## PaNiC (Aug 24, 2010)

JATownes said:


> Just to be clear: the AMD 890FX was added to include SATA 6Gb/s and USB 3.0... IIRC, those are the only real changes, and they were needed for some people running these devices.



USB 3 isn't native


----------



## naram-sin (Aug 24, 2010)

PaNiC said:


> Trolling would mean I'm an Intel fanboy or something; I'm just making my point that there was no need for the 890FX, or to mislead people like AMD did.



That is completely untrue. Many needed 8xx chipsets, and in many cases it is these chipsets that have made way for better results from AM3 processors. Google some comparative reviews of 7xx and 8xx chipsets. The 8xx chipsets are more advanced (new tech) and have better peripheral performance than the 7xx chipsets, especially since the 7xx series has been around almost since the AM2 socket...

Btw, there was an "issue" with core unlocking. 

With all of that and, as previously said, this wasn't misleading, this was just a change of plans...

Now, I'll stop MY ranting. OaO

EDIT:


PaNiC said:


> USB 3 isn't native



SATAIII is.


----------



## Bloodcrazz (Aug 24, 2010)

inferKNOX said:


> +1!!
> I really don't want to have to change from my Prolimatech Mega Shadow! Although I'm sure they'll just release a fresh retention kit for the new socket.
> 
> 
> ...



If you read the earlier posts, he was talking about AMD saying they're releasing Bulldozer on the AM3 socket, and AMD changed their minds.
After millions of people bought AM3-socketed CPUs believing they would be compatible.
And now they have changed the socket on Bulldozer.
By law, AMD have given false and misleading information regarding Bulldozer.


----------



## Valdez (Aug 24, 2010)

PaNiC said:


> USB 3 isn't native



It doesn't make any difference... oh, maybe it costs $1 more...


----------



## JATownes (Aug 24, 2010)

I understand it is not native, but have you seen any 790FX board with it, or has it only been added to 890FX?

If you didn't need SATA III or USB 3, why didn't you grab a 790FX board?  They are AM3 as well, and "should have supported Bulldozer" too, according to your argument.  You grabbed the most expensive board out there.  I made my argument; I am now done.

On-topic: Bring on BULLDOZER!!!


----------



## pantherx12 (Aug 24, 2010)

Bloodcrazz said:


> By law, AMD have given false and misleading information regarding Bulldozer.



...No, they have not; a roadmap is not a legally binding statement.

But anyway, chaps who have issues with the AM3 thing wanna take your discussion to a new thread instead? 


All it takes is clicking the start new topic button : ]


----------



## inferKNOX (Aug 24, 2010)

Bloodcrazz said:


> *If you read the earlier posts*, he was talking about AMD saying they're releasing Bulldozer on the AM3 socket, and AMD changed their minds.
> After millions of people bought AM3-socketed CPUs believing they would be compatible.
> And now they have changed the socket on Bulldozer.
> By law, AMD have given false and misleading information regarding Bulldozer.



I read the whole thread before posting.
AMD did not officially announce anything, so people riding on rumors made a mistake, just as pantherx12 said.
Anyway, it's not like they can't sell it and upgrade; it's not a consumable.


----------



## naram-sin (Aug 24, 2010)

Bloodcrazz said:


> If you read the earlier posts, he was talking about AMD saying they're releasing Bulldozer on the AM3 socket, and AMD changed their minds.
> After millions of people bought AM3-socketed CPUs believing they would be compatible.
> And now they have changed the socket on Bulldozer.
> By law, AMD have given false and misleading information regarding Bulldozer.



In essence, it said they "plan", not "we take a pledge to ensure backward comp...", you get the idea. And I can't believe that anyone bought the board only to be able to upgrade it two years later... that, my good sir, is foolish. I get a new mobo more often than my car gets new tires. And it's never at a premium. Anyone that upgraded wanted 8xx because it was a larger number than 7xx, because it had from-scratch support for DDR3, and because it supported new interfaces...

You AMD boyz are just spoiled, what with AM2/AM2+/AM3 backwards compatibility... 



inferKNOX said:


> I read the whole thread before posting.
> AMD did not officially announce anything, so people riding on rumors made a mistake, just as pantherx12 said.
> Anyway, it's not like they can't sell it and upgrade; it's not a consumable.



Good frelling point.


----------



## Valdez (Aug 24, 2010)

Bloodcrazz said:


> If you read the earlier posts, he was talking about AMD saying they're releasing Bulldozer on the AM3 socket, and AMD changed their minds.
> After millions of people bought AM3-socketed CPUs believing they would be compatible.
> And now they have changed the socket on Bulldozer.
> By law, AMD have given false and misleading information regarding Bulldozer.



I think only a very low percentage of buyers chose an AM3 board because of Bulldozer. Anyway, backwards compatibility is not a guarantee; it never was. Do you remember the AM2/+/3 compatibility? There were a lot of AM2 boards which could accept the Phenom II series, but the socket isn't the only thing that matters. There are other things, like BIOS size, power design, and the quality of manufacturer support.

If Bulldozer were backwards compatible with the AM3 socket, and you had a popular AM3 board from a decent manufacturer, you'd likely have BIOS support for the new processor.


----------



## newtekie1 (Aug 24, 2010)

I'm interested to see how this performs.  I really hope it has something for Intel, because their high end prices have been insane thanks to no competition.

But people need to not be fooled.  If this really does outperform Intel, don't think AMD is going to keep prices low; they will jack prices up just as fast as Intel when they can.  The only reason prices are cheap now is because they have to be cheap.



PaNiC said:


> X48 was released for the die shrink of Core 2 from 65 nm to 45 nm, and got a bunch of new chips.
> I own an X48 and have nothing bad to say about it.
> X48 was the board I had before upgrading to my 890FX; I could have gone 1366, dropped in an i7 930 and later an i7 970.





PaNiC said:


> Trolling would mean I'm an Intel fanboy or something; I'm just making my point that there was no need for the 890FX, or to mislead people like AMD did.
> Fact: people call me an emo first, and I own an AMD. I'm not a fanboy.



I hate to be the one to burst your bubble, but the move to the 800 series was done to add USB 3.0 and SATA 6Gbps natively to the chipsets.

On the other hand, the move from X38 to X48 was for nothing other than a name change.  In fact, some X38 boards could be flashed to X48 without a problem, because the manufacturers simply put a sticker on the X38 board and called it an X48.  Yes, 1600 FSB was officially supported with the X48 chipset and not the X38, but anyone that ran an X38 knows it easily did 1600 FSB.


----------



## btarunr (Aug 24, 2010)

Valdez said:


> http://images.anandtech.com/reviews/cpu/amd/FAD2009/desktoproadmap.jpg



Old roadmap is old. AMD told us it's a different socket just last week.


----------



## link2009 (Aug 24, 2010)

Bloodcrazz said:


> I don't think that "PaNiC" insulted anyone at all.
> I'm pretty sure he was having a go at AMD, not at any person.
> It's called freedom of speech, hence this being a FORUM.





inferKNOX said:


> Can you just quit it? We're not interested in your rant! :shadedshu
> I can't believe people like you who complain about AMD's new CPU not being backwards compatible, when they've been making all their others backwards compatible for so long! Get over it; the world can't be on hold forever. The only way to make significant changes is to make things radically different and not backwards compatible.





wolf said:


> this thread has been going downhill since page 1.



I think everyone knows he is trolling.


----------



## Valdez (Aug 24, 2010)

btarunr said:


> Old roadmap is old. AMD told us it's a different socket just last week.



Well, this is the newest roadmap so far (for 2011)...

Which socket did AMD tell you? Is it an existing or a new socket?


----------



## inferKNOX (Aug 24, 2010)

btarunr said:


> Old roadmap is old. AMD told us it's a different socket just last week.



Ooohh... what is the socket called?!... tell us, tell us, tell us, tell us!! *Giant Goo-Goo Eyes*


----------



## CDdude55 (Aug 24, 2010)

> Because of this design change, Bulldozer processors will come in totally new packages that are not backwards compatible with older AMD sockets such as AM3 or AM2(+).



Finally a new socket!

Hopefully we can actually see some performance numbers from these things pretty soon; getting tired of AMD fanboys creaming their pants over speculation.


----------



## JATownes (Aug 24, 2010)

Valdez said:


> Well, this is the newest roadmap so far (for 2011)...
> 
> Which socket did AMD tell you? Is it an existing or a new socket?





inferKNOX said:


> Ooohh... what is the socket called?!... tell us, tell us, tell us, tell us!! *Giant Goo-Goo Eyes*



If I am not mistaken, they call it "Socket G"...anyone know if I am remembering that correctly?


----------



## inferKNOX (Aug 24, 2010)

JATownes said:


> If I am not mistaken, they call it "Socket G"...anyone know if I am remembering that correctly?



Do you have details? Does it take pins, or does it have them and contact the CPU like Intel's? Are the physical dimensions the same? Number of contact points/pins/whatever you call them?

Really concerned about a possible different HS retention layout...


----------



## JATownes (Aug 24, 2010)

I cannot remember where I read that, but I think it was a derivative of the Socket G34 used for the Opteron Magny-Cours.

Socket G34 also supports 4-channel DDR3, so it seems like a natural extension to modify the socket for consumer desktop usage.


----------



## btarunr (Aug 24, 2010)

This is pure FUD, but:

I personally expect the consumer Bulldozer package to have roughly 1000-1400 pins, just not arranged like AM3/AM2 or compatible with them. 

The processor may continue to have a dual-channel DDR3 memory controller (maybe with higher memory clock support, 1833 MHz, to give higher bandwidth).

The processor will need pins to put out 40 PCI-Express 2.1 lanes (incl. the A-Link III, which is x4).

No HyperTransport pins on the consumer packages. The 2P/4P Opteron package might be bigger, as it needs pins for one or two 16-bit HyperTransport links (to neighbouring sockets).


----------



## PaNiC (Aug 24, 2010)

newtekie1 said:


> I'm interested to see how this performs.  I really hope it has something for Intel, because their high end prices have been insane thanks to no competition.
> 
> But people need to not be fooled.  If this really does outperform Intel, don't think AMD is going to keep prices low; they will jack prices up just as fast as Intel when they can.  The only reason prices are cheap now is because they have to be cheap.
> 
> ...



Most of this was covered in older posts.
As for the new AMD chip, it will be good and AMD knows that, hence the socket forcing people to upgrade. It seems Intel doesn't have much; they put all their eggs in the Larrabee basket, and anything they bring out will only be a plan B.


----------



## bear jesus (Aug 24, 2010)

JATownes said:


> I cannot remember where I read that, but I think it was a derivative of the Socket G34 used for the Opteron Magny-Cours.
> 
> Socket G34 also supports 4-channel DDR3, so it seems like a natural extension to modify the socket for consumer desktop usage.




1974 pins, that must be one fat package. If Bulldozer gives some good performance, then quad memory channels might be quite useful for some virtual machines. I just hope we find out for sure what socket it will be, and the spec.

I admit I am pretty glad I held off on going for an AM3 board. I'm still running an AM2 790FX/SB600 Asus M3A32-MVP that's on its third processor (currently an AM3 Phenom II). To be honest, I would be happy with an all-new socket, as I have been very happy with my AM2 upgrades and I saw little point going AM3 yet unless Bulldozer was compatible.

Either way, I want more info on Bulldozer and Sandy Bridge, as I'm getting the full-system-upgrade itch (not that I really need more power).


----------



## DigitalUK (Aug 24, 2010)

btarunr said:


> This is pure FUD, but:
> 
> I personally expect the consumer Bulldozer package to have roughly 1000-1400 pins, just not arranged like AM3/AM2 or compatible with them.
> 
> ...




I think I remember seeing that Bulldozer was going to have quad-channel memory; I could be wrong. But I can't wait for Bulldozer, I've been waiting for years. Wish I could pre-order now...


----------



## mdsx1950 (Aug 24, 2010)

Wonder if these chips will kick Intel's buttocks.


----------



## CDdude55 (Aug 24, 2010)

Man, 8 cores and 16 threads just seems so pointless right now. The software isn't keeping up.


----------



## bear jesus (Aug 24, 2010)

CDdude55 said:


> Man, 8 cores and 16 threads just seems so pointless right now. The software isn't keeping up.



For the most part, right now it's almost useless, but there are more and more things that are getting more heavily threaded. One thing I would be interested in is how a Folding@home SMP client would do on a 16-thread Bulldozer, although I don't know if it would scale over all 16 threads, or do it well... I guess time will tell.


----------



## pantherx12 (Aug 24, 2010)

CDdude55 said:


> Man, 8 cores and 16 threads just seems so pointless right now. The software isn't keeping up.



8 or 16 cores XD

I think it is; there have been massive improvements in the past 5 months or so.

I've plenty of apps that max out my cores.


----------



## CDdude55 (Aug 24, 2010)

bear jesus said:


> For the most part, right now it's almost useless, but there are more and more things that are getting more heavily threaded. One thing I would be interested in is how a Folding@home SMP client would do on a 16-thread Bulldozer, although I don't know if it would scale over all 16 threads, or do it well... I guess time will tell.





pantherx12 said:


> 8 or 16 cores XD
> 
> I think it is; there have been massive improvements in the past 5 months or so.
> 
> I've plenty of apps that max out my cores.




No doubt software is moving forward and allowing more cores and threads to be utilized, but it's still not that widespread. Then again, it depends on what you do; I mean, for an average consumer or gamer, more than 2 cores still isn't needed in most programs. But on the flip side, the people doing massively CPU-intensive things might actually need an 8- or 16-core CPU. There have definitely been improvements, but not enough to warrant a CPU with more than 4 cores. But even if the software isn't fully caught up, it's inevitable that hardware will just keep getting better regardless (and that's what can be disappointing sometimes).
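The "software isn't keeping up" point has a classic formalization in Amdahl's law: if only a fraction p of a program's work can run in parallel, n cores give a speedup of 1/((1-p)+p/n), so extra cores hit diminishing returns fast. A minimal sketch; the 60% parallel fraction is a made-up illustrative figure, not a measurement of any real workload:

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical workload where 60% of the work is parallelizable:
for n in (2, 4, 8, 16):
    print(n, "cores ->", round(amdahl_speedup(0.6, n), 2), "x")
```

With p = 0.6, even 16 cores yield under 2.3x, which is why heavily threaded CPUs only pay off for software with a very high parallel fraction.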


----------



## bear jesus (Aug 24, 2010)

CDdude55 said:


> No doubt software is moving forward and allowing more cores and threads to be utilized, but it's still not that widespread. Then again, it depends on what you do; I mean, for an average consumer or gamer, more than 2 cores still isn't needed in most programs. But on the flip side, the people doing massively CPU-intensive things might actually need an 8- or 16-core CPU. There have definitely been improvements, but not enough to warrant a CPU with more than 4 cores. But even if the software isn't fully caught up, it's inevitable that hardware will just keep getting better regardless (and that's what can be disappointing sometimes).



Very true, but I've had the bulk of my current machine for over 2 years. I would hope a full system upgrade next year would give me at least a couple more years, if not 3, and I hope at some point in that time software will start to catch up to what will by then be an out-of-date machine lol


----------



## Batou1986 (Aug 24, 2010)

Kudos to AMD for finally trying something completely new. I hope these can at least compete with Intel for a change.

And congrats to PaNiC for making my ignore list on your first day. Way to go :shadedshu


----------



## Mindweaver (Aug 24, 2010)

What! You mean I can't use my NEW 1055T in the new chipset MBs built for these processors?.. j/k, don't ban me Mussels! hehehe

Wow, this has been the best news of the day! I can't wait to own the newness! (Maybe Intel will be pushed to bring out *Core24veinticuatro*!)


----------



## 3volvedcombat (Aug 24, 2010)

Well, you guys that are crying about the Bulldozer cores being on a different platform: that's what you get for buying without being responsible and thinking about it.

I would not go purchase something unless it's out already, or all the details, roadmaps, and even sites pre-listing the hardware at a certain price pop up.

You guys went and purchased 890FX boards expecting 8 cores on them, when there were barely quad cores, and Thuban dies were barely expected 3-4 months later.

w00t on the epic purchases some of you guys made. I say live with your 890FX platform, because it could still be a while before Bulldozer comes out. If you THINK about it, AMD wants money right now, so they're not just going to completely disown the 6-core and 4-core chips which are out right now. So if they release it even later than they say, that's good.

Because they're just trying to get the brilliant quad cores and Thuban cores sold, until they displace them with 200-300 dollar Bulldozer chips.

You DO NOT WANT them to release Bulldozer anytime soon, or I smell a 500 dollar AMD CPU.

Even the thought of a 400 dollar AMD CPU scares me.

Let them sell the chips right now, get some profit, stop production on some Deneb chips, lower the prices of the Thuban dies and 890FX motherboards, and then release a whole new platform at previous Thuban die prices, and I'll be happy they're staying cheap with such epic 8-core designs.


----------



## HolyCow02 (Aug 24, 2010)

Sounds completely awesome. Can't wait to see some numbers on these when they come out, whenever that will be.


----------



## Mindweaver (Aug 24, 2010)

They will have to beat the i7 to price a chip over 300 bucks. I see them putting out a feeler chip to show the new chip can beat an i7 before they just put out a $500 chip. But that being said, I CAN'T WAIT! hehehe. These are great days we are living in, my friends!


----------



## Valdez (Aug 24, 2010)

If it won't be AM3 compatible, I'm sure AMD will release new 32 nm steppings for the AM3 platform based on the current generation.


----------



## largon (Aug 24, 2010)

Finally a new socket. 
I was kinda worried, since until today it appeared that Bulldozer would land on AM3. 
Now the quad channel RAM claim also makes sense. 

PS. 
Laughing at that one silly kid who's pissed at AMD.


----------



## mastrdrver (Aug 24, 2010)

btarunr said:


> This is pure FUD, but:
> 
> I personally expect the consumer Bulldozer package to have roughly 1000-1400 pins, just not arranged like AM3/AM2 or compatible with them.
> 
> ...





DigitalUK said:


> I think I remember seeing that Bulldozer was going to have quad-channel memory; I could be wrong. But I can't wait for Bulldozer, I've been waiting for years. Wish I could pre-order now...



1866 MHz is supposed to be supported from the get-go with Llano, so I suspect Bulldozer will support it too. Also, only dual channel is known to be supported. That quad-channel thing is for high-end desktop Sandy Bridge.


----------



## suraswami (Aug 24, 2010)

I guess that this 'Bulldozer' is mainly aimed at the server market, where they have lost almost all of their market share.  Wherever I go I see Intel-dominated servers, as against Opty-dominated servers a few years back.  This is the place where both these companies make the most money.

New socket = better, in my opinion, as they don't need to form a bubble and work within it.  Would have been nice to get a proc for the AM3 socket, but as such I am tired of the existing AM2...AM3 sockets.

If this processor really shines (without the lapping), AMD will be back to making money on their CPUs.

Go AMD Go


----------



## Valdez (Aug 24, 2010)

mastrdrver said:


> That quad-channel thing is for high-end desktop Sandy Bridge.



Well, AMD can have quad channel too; it's not Intel-only tech.


----------



## techtard (Aug 24, 2010)

Good thing I read this; I was about to pull the trigger on an 890FX mobo, a 1055T, and some RAM. 
I guess my trusty old frankensteined system can live for another few months until we get some performance numbers on Bulldozer.

If it lives up to the hype, then maybe it will force Intel to cut prices and we all win.


----------



## wahdangun (Aug 24, 2010)

CDdude55 said:


> Man, 8 cores and 16 threads just seems so pointless right now. The software isn't keeping up.



If only NVIDIA allowed proper coding of PhysX and allowed more cores to be utilized, instead of using that ancient x87 code, we would surely be seeing superb PhysX effects WITHOUT needing any GPU power, so the GPU could concentrate on flexing its muscle in the graphics department and our hexacore CPUs wouldn't be a waste of silicon. 


Bring on Bullet physics, and please, Intel, optimize that Havok engine so it will use all six cores.


----------



## OneCool (Aug 24, 2010)

I want it!!

HD 6870 = I want it!!


----------



## crazyeyesreaper (Aug 24, 2010)

I STILL can't see why ppl are bitching about 890FX. I mean, for fucking goodness' sake, Bulldozer won't be here for another full goddamn year; get over it. 890FX will have nearly a 2-year life span by then as well, so everyone needs to stop bitching; 2 years in the tech world is a long goddamn time. I look at it this way: by the time Bulldozer comes out I will have fully enjoyed and used my hardware to its fullest and gotten my money's worth from it. Seriously... 2007-2011: my machine's base configuration will be 4 years old by the time Bulldozer hits. People are seriously going to complain about socket longevity? I went from an Athlon X2 to a 940BE, then jumped to a 965 and grabbed a used mobo. Seriously, 4 years on a machine, in terms of being an enthusiast, is a long-ass time. The tech world constantly moves forward, just as the sun rises and sets and the world spins. If you don't like those facts of life, go play Russian roulette with all the chambers filled.

More on topic: I don't mind a new socket either; it will be a nice change. Hopefully they fix the CPU HSF mounting issues currently seen on 939/AM2/AM2+/AM3. It would be nice to have more freedom of heatsink choice without it crippling my RAM selection due to tall heatspreaders.


----------



## erocker (Aug 24, 2010)

Who's complaining? The current socket has had a good and long life, I see no reason to complain. 



crazyeyesreaper said:


> I'm talking about the guy who bitched for nearly half the thread about 890FX... if he used his brain he would realize that when Bulldozer does come out, 890FX will be nearly 2 years old.



It's funny how one person can ruin a thread. Ignore the trolls.


----------



## CDdude55 (Aug 24, 2010)

wahdangun said:


> If only NVIDIA allowed proper coding of PhysX and allowed more cores to be utilized, instead of using that ancient x87 code, we would surely be seeing superb PhysX effects WITHOUT needing any GPU power, so the GPU could concentrate on flexing its muscle in the graphics department and our hexacore CPUs wouldn't be a waste of silicon.
> 
> 
> Bring on Bullet physics, and please, Intel, optimize that Havok engine so it will use all six cores.



But the whole point of PhysX is that the GPU does the physics processing instead of the CPU. Whether or not a CPU can utilize all of its cores is a matter of software taking advantage of those cores. How is PhysX holding the CPU back? Even if PhysX is poorly coded, how would that affect the CPU? I don't understand how PhysX could in any way be holding back a part that it has nothing to do with.

But of course, everything has to be Nvidia's fault right.:shadedshu


----------



## crazyeyesreaper (Aug 24, 2010)

I'm talking about the guy who bitched for nearly half the thread about 890FX... if he used his brain he would realize that when Bulldozer does come out, 890FX will be nearly 2 years old.


----------



## DannibusX (Aug 24, 2010)

AMD needs a socket change to compete.  I have no problems with it.  They've done a great job keeping their processors backward compatible to this point.  Who says that future processors won't be backward compatible with whatever socket that Bulldozer gets?


----------



## crazyeyesreaper (Aug 24, 2010)

AMD said it, lol ^. But yeah, seriously: AM2 to AM2+ to AM3. By the time Bulldozer gets here we'll have had nearly 5 years of the same socket; it was indeed time for a change.


----------



## CDdude55 (Aug 24, 2010)

Don't see what the big deal is with them changing sockets. Sockets get old and eventually need to be replaced with a newer one with better tech on board. I don't see why someone would get mad at that.


----------



## wolf (Aug 24, 2010)

Remember that if you still have an AM2+ or AM3 socket you will still get whatever new processors AMD releases until Bulldozer; I doubt they are going to stop in their tracks from now until it and its new platform are released.

I for one am looking forward to plopping a 6-core into my 785G in about 9-12 months.


----------



## CDdude55 (Aug 24, 2010)

wolf said:


> Remember that if you still have an AM2+ or AM3 socket you will still get whatever new processors AMD releases until Bulldozer; I doubt they are going to stop in their tracks from now until it and its new platform are released.
> 
> I for one am looking forward to plopping a 6-core into my 785G in about 9-12 months.



I agree.

I also hope the 980X or i7 970 drops down in price eventually so I can go 6-core too.


----------



## cadaveca (Aug 24, 2010)

So, new socket? That makes me VERY happy.


----------



## Super XP (Aug 24, 2010)

IMO, a 50% increase in throughput over Phenom II does not necessarily mean a 50% performance increase, with, yes, 33% more cores. 
It sounds like AMD chose their wording very intelligently.
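For what it's worth, the arithmetic behind that reading is simple: dividing the claimed 50% throughput gain by the 33% core-count increase (6 Thuban cores to 8 Bulldozer cores) leaves only a modest implied per-core improvement. A back-of-the-envelope sketch, assuming throughput scales linearly with core count, which is an idealization rather than how real workloads behave:

```python
# Implied per-core gain from AMD's "+50% throughput" claim, assuming
# linear scaling with core count (an idealization, not a benchmark).
total_gain = 1.50        # claimed throughput vs. a 6-core Phenom II
core_ratio = 8 / 6       # 8 cores vs. 6 cores (+33%)
per_core_gain = total_gain / core_ratio
print(f"implied per-core gain: {(per_core_gain - 1) * 100:.1f}%")
# -> implied per-core gain: 12.5%
```

So under the most generous scaling assumption, the claim implies only a low-double-digit per-core uplift, which is consistent with the careful wording.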


----------



## cavemanthreeonesix (Aug 25, 2010)

Glad to hear some concrete news about Bulldozer. A new platform is definitely a positive move forward, imho; K10 has definitely peaked, so hopefully they can move forward from that.

Only downside is I'm split on whether the Crosshair IV Extreme is going to be worth getting anymore, if it ever comes out...


----------



## crazyeyesreaper (Aug 25, 2010)

You've got a year plus, so it might be, might not; up to you.


----------



## mastrdrver (Aug 25, 2010)

FWIW, on Bulldozer being backwards compatible, Tech Report is saying they "expect compatibility... although specifics about that are still murky."

Notably, though, they confirmed that the chips are compatible with the current C32 and G34 sockets on the server side. Maybe that is why they think there might be a chance of AM3 compatibility?



Valdez said:


> well, amd can have quad channel too, it's not intel only tech



You're absolutely right, but the real question is why? Not even the 6-core/12-thread Westmere chips fully use triple channel on the high-end desktop, though it would be a great feature on server systems.

Now some will say what about the gpu being moved on die. With the rumor of a maximum of ~400 SP being on a Llano some would argue that adding more channels would be beneficial here. Thing is more channels require more die room being reserved for that.

Why not just support higher frequencies instead? Save die space for something else (or just take the space savings and the power savings too). If I'm not mistaken, dual channel with 1866 MHz memory has higher bandwidth than the current Intel triple channel on the 9xx chips (obviously this is generally speaking).
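As a rough sanity check of that claim, here's a back-of-envelope sketch of the theoretical peaks (assuming DDR3-1066, Intel's official triple-channel spec for the i7-9xx; faster triple-channel kits would change the comparison):

```python
# Back-of-envelope peak DDR3 bandwidth: channels * transfer rate (MT/s) * 8 bytes/transfer.
# DDR3-1066 for the i7-9xx triple-channel case is an assumption for illustration.
def peak_bandwidth_gbs(channels, mt_per_s):
    """Theoretical peak bandwidth in GB/s for a DDR3 config (64-bit, 8-byte channels)."""
    return channels * mt_per_s * 8 / 1000

dual_1866 = peak_bandwidth_gbs(2, 1866)    # dual-channel DDR3-1866
triple_1066 = peak_bandwidth_gbs(3, 1066)  # triple-channel DDR3-1066

print(f"dual-channel DDR3-1866:   {dual_1866:.1f} GB/s")    # ~29.9 GB/s
print(f"triple-channel DDR3-1066: {triple_1066:.1f} GB/s")  # ~25.6 GB/s
```

So on paper, dual-channel DDR3-1866 does edge out spec-speed triple-channel DDR3-1066.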


----------



## Super XP (Aug 25, 2010)

AMD's diagram is interesting: it shows two L3 caches and two NBs for a 4-module, 8-core Bulldozer CPU. What do you guys make of this? Could this be how we get the quad-channel integrated memory controller previously rumoured, in the form of 2 x dual-channel IMCs?

Mark my words, if Bulldozer is indeed based on a quad-channel interface, it's going to perform like a Bulldozer!!!!


----------



## trickson (Aug 25, 2010)

I just hope they finally have the right stuff .


----------



## JATownes (Aug 25, 2010)

Super XP said:


> AMD's diagram is interesting, it shows TWO L3 cache(s) and a TWO NB's for a 4 module, 8 core Bulldozer CPU. What do you guys make of this? Can this be how we get the so called *Quad-Channel Integrated Memory controller previously rumoured, in terms of 2 x Dual-Channel IMC's*.
> 
> Mark my words, if Bulldozer is indead based on a Quad-Channel interface, it's going to perform like a Bulldozer!!!!



From what I understand this is correct.  Looking for source...
Edit: 





> We’d be surprised if a Bulldozer APU had more than the four memory channels of a Magny-Cours CPU, but not that it would be quicker – Magny-Cours CPUs are comprised of two 6-core CPU dies, so the quad-channel memory controller is really two dual-channel units split across the two dies rather than one homogenous mega-controller.



Source

So if Magny-Cours is two dual-channel IMCs, it stands to reason that Bulldozer will implement the same tech.


----------



## 1c3d0g (Aug 25, 2010)

Yawn. Let's see how this performs first, which is hopefully not as bad as Intel's craptastic Atoms...


----------



## mastrdrver (Aug 25, 2010)

JATownes said:


> From what I understand this is correct.  Looking for source...
> Edit:
> 
> Source
> ...



Seeing as Bulldozer is said to work in G34 and C32 sockets, showing dual memory and northbridge controllers isn't surprising. I think C32 is AMD's 2P format, from which the 1P high-end desktop Bulldozer will be derived.


----------



## W1zzard (Aug 25, 2010)

i added the full slide deck from Hot Chips Conference to the first post


----------



## Wile E (Aug 25, 2010)

All I want is for AMD to be able to compete on the high end desktop market again.


----------



## Atom_Anti (Aug 25, 2010)

Where is the information that the northbridge is completely on the chip coming from?

No other site is saying anything like that; they say it will be AM3 compatible!


----------



## Hayder_Master (Aug 25, 2010)

OK, OK, AMD, you're going to do great CPUs, but what about those damn AMD motherboard chipsets, any improvement?


----------



## TheMailMan78 (Aug 25, 2010)

hayder.master said:


> ok ok ok AMD u going to do great CPU's but what about damn AMD mother boards chips any improve



What's wrong with the 890?


----------



## Deleted member 67555 (Aug 25, 2010)

New boards will have DX11.


----------



## inferKNOX (Aug 25, 2010)

crazyeyesreaper said:


> More on topic i dont mind a new socket either will be a nice change hopefully they fix the *cpu HSF mounting issues currently seen on 939 AM2 AM2+ AM3* would be nice to have more freedom of heatsink choice without it crippling my ram selection due to tall heatspreaders


I think it's more a RAM-is-too-close-to-CPU issue, rather than a CPU HSF mounting issue.



Valdez said:


> well, *amd can have quad channel* too, it's not intel only tech


I'm against the idea of having quad channel.
Needing 4 RAM sticks to get the benefit of the higher speed sucks when you think about one of them dying on you. At the least they could, if possible, make it work in different modes: with 2 sticks it switches to dual channel, 3 sticks triple channel, 4 sticks quad channel.

*EDIT:*


hayder.master said:


> ok ok ok AMD u going to do great CPU's but what about damn AMD mother boards chips any improve


No doubt those chipsets will have PCIe 3.0 integrated & USB 3.0 at the very least.


----------



## Deleted member 67555 (Aug 25, 2010)

Sabine: Mainstream mobile platform based on the Llano APU, which will see a quad-core Stars-based CPU and DirectX 11-class graphics processor tied together on the same piece of silicon, manufactured using 32 nm lithography. Sabine is expected to arrive in 2011.

Brazos: Ultra low-power mobile platform based on the Ontario APU, which will see a dual-core Bobcat-based CPU and DirectX 11-class graphics processor tied together on the same piece of silicon. Brazos is expected to arrive in 2011, and will allow AMD to drive netbooks, along with form factors the company’s hardware hasn’t yet appeared in (possibly tablets).

Scorpius: Enthusiast desktop platform based on AMD’s Zambezi processor and discrete graphics (AMD, of course, specifies an ATI GPU). The platform requires a quad-core CPU or higher, DDR3 memory, and a revised Socket AM3 interface. Availability is expected in 2011.

Lynx: Mainstream desktop platform based on AMD’s Llano APU. It’ll feature up to four CPU cores, a single graphics core (integrated onto the APU, naturally), and DDR3 memory. Availability is expected in 2011.

Components:

Llano: This is going to be AMD’s first APU, combining a quad-core Stars-based CPU and DirectX 11-class GPU on a single piece of silicon. It’ll be manufactured using a 32 nm SOI process, support DDR3 memory, and include core-level power gating. Because there are brand new capabilities in play here, it should surprise no one that Llano will drop into a new socket interface. Availability is expected in 2011.

Ontario: While the Llano APU absorbs much of AMD’s risk in shifting to 32 nm manufacturing (since it employs a familiar CPU microarchitecture and more mature manufacturing process), Ontario will be the first APU to employ AMD’s Bobcat CPU microarchitecture. Ontario is manufactured at 40 nm, armed with DirectX 11-class graphics, and expected in 2011.

Zambezi: Per AMD, Zambezi will be the first desktop processor based on the company’s Bulldozer architecture. Featuring as many as eight cores, Zambezi-based offerings will incorporate as many as four processor “modules.” AMD plans to use 32 nm manufacturing, and early reports suggest Socket AM3 compatibility (along with DDR3 memory support). Zambezi is not an APU, but rather is meant to be paired with discrete graphics.

Interlagos/Valencia: Code-names for AMD’s upcoming 16-core and eight-core Opteron processors, respectively, both based on the Bulldozer microarchitecture. Interlagos will drop into the existing G34 interface, while Valencia is C32-compatible. Both families will be manufactured using 32 nm SOI lithography, will support DDR3 (including load-reduced DIMMs and 1.25 V memory modules), and are expected in 2011.


----------



## Hayder_Master (Aug 25, 2010)

TheMailMan78 said:


> Whats wrong with the 890?



Can you compare it with X58?


----------



## Deleted member 67555 (Aug 25, 2010)

hayder.master said:


> can u compare it with X58



I think with a matched processor you could; the MSI 890FXA-GD70 AM3 comes to mind.


----------



## TheMailMan78 (Aug 25, 2010)

hayder.master said:


> can u compare it with X58



Apples to oranges. Nvidia will never let AMD/ATI run SLI native on a 890 or "990" chipset. If so then yeah.


----------



## inferKNOX (Aug 25, 2010)

jmcslob said:


> Scorpius: Enthusiast desktop platform based on AMD’s Zambezi processor and discrete graphics (AMD, of course, specifies an ATI GPU). The platform requires a quad-core CPU or higher, DDR3 memory, *and a revised Socket interface*. Availability is expected in 2011.
> 
> Components:
> 
> Zambezi: Per AMD, Zambezi will be the first desktop processor based on the company’s Bulldozer architecture. Featuring as many as eight cores, Zambezi-based offerings will incorporate as many as four processor “modules.” AMD plans to use 32 nm manufacturing, and *reports suggest Socket AM3 incompatibility* (along with DDR3 memory support). Zambezi is not an APU, but rather is meant to be paired with discrete graphics.



corrected


----------



## Bloodcrazz (Aug 25, 2010)

btarunr said:


> That's laughably misinformed.



Misinformed? ROFL. The new socket you're talking about is for the server platform.
Zambezi will be AM3+ (AM3r2).
Sigh, who's misinformed now?
http://www.extremetech.com/article2/0,2845,2368186,00.asp


----------



## nt300 (Aug 25, 2010)

Remember, AMD is sort of new to the chipset game despite ATI's previous knowledge. 790FX was a start, and a good one. The 890FX was better, yes, and helped AMD gain more experience. And now the upcoming 990FX, or what some may call the Bulldozer chipset, should be what they've been leading up to for years, and should be feature-rich just like Intel's high-end chipsets.


----------



## cadaveca (Aug 25, 2010)

Bloodcrazz said:


> misinformed rofl, the new socket your talking about. is for the server platform.
> Zambezi will be am3+(am3r2).
> sigh whos misinformed now.
> http://www.extremetech.com/article2/0,2845,2368186,00.asp





btarunr said:


> Old roadmap is old. AMD told us it's a different socket just last week.



I dunno... personally, I find ExtremeTech not that "techie". They are a Ziff Davis publication, no?

I tend to trust BTA here. The way I look at it, the only way I see decent performance boost overall is with a new socket, so I won't ever use current boards with these upcoming chips...seems a waste of possible resources.


AMD has said for a long time that Zambezi would be AM3+, not exactly the same as the current AM3 socket. That suggests to me that because of separate NB/memory controllers, they can disable some functionality on these chips as needed (AMD's "Modular Design").

Oh, and by the way, the August 24th article hosted on that site makes no mention of socket plans, except this:



> AMD also told us that *it will introduce a new AM3+ socket for consumer versions of Bulldozer CPUs*. AM2 and AM3 processors will work in the AM3+ socket, but Bulldozer chips will not work in non-AM3+ motherboards.
> .


----------



## jpierce55 (Aug 25, 2010)

toyo said:


> Judging by the number of comments until mine, I can tell there's lots of scepticism about AMD's new CPU line... I guess they just delayed it for too long.
> 
> However, it seems they were kinda broke, and maybe it is only the 5800 series success that put Bulldozer back on the drawing board. I thought they abandoned the project for lack of resources or something.
> 
> Whatever the reality is, I hope it is worth waiting... AMD deserves a high-end CPU that will kick Intel line in the arse... it's maybe time for another performance crown switch like in good old Athlon vs Pentium days...



I have skepticism due to all the hype they put on the Phenom, a really sorry processor for the hype. I hope AMD pulls it off. I doubt it will be up to par with Intel, but maybe a little closer, and hopefully a bit cheaper. I hope my next cpu is an AMD.


----------



## Bloodcrazz (Aug 25, 2010)

cadaveca said:


> I dunno...personally, i find ExtremeTech not that "techie". They are a Ziff Davis pulication, no?
> 
> I tend to trust BTA here. The way I look at it, the only way I see decent performance boost overall is with a new socket, so I won't ever use current boards with these upcoming chips...seems a waste of possible resources.
> 
> ...



AMD was saying this pre-890FX; why wouldn't AMD then fit this new AM3r2 socket on the 890?
It says AM3+ will work with older chips, but new chips won't work on AM3.
890FX will be the base for Scorpius like was planned all along; most sites are saying this, not just that one, plus people that were at Hot Chips.
So he was misinformed.


----------



## cadaveca (Aug 25, 2010)

It's hard to judge how accurate any info from AMD is, at this point.

AMD officials were hyping 3 GHz Phenom chips. This never materialized. They set a precedent there... for lying about future products (they WERE supposed to have a policy of not discussing future products, I thought).

So, until we have retail products, I am skeptical of any of this info, and if they use 890FX for Zambezi, I definitely won't be buying.



What has me really curious is the motivation behind such details being released so early... why? Six months away from any launch, at the least, and they are pimping products in the public domain? Seems fishy to me.


----------



## Bloodcrazz (Aug 25, 2010)

http://www.tomshardware.com/reviews/bulldozer-bobcat-hot-chips,2724-2.html
Scorpius: Enthusiast desktop platform based on AMD’s Zambezi processor and discrete graphics (AMD, of course, specifies an ATI GPU). The platform requires a quad-core CPU or higher, DDR3 memory, and a revised Socket AM3 interface. Availability is expected in 2011.
Tom's is wrong too.
lol, everyone turned on panic, rofl, but panic was the only one that was right


----------



## cadaveca (Aug 25, 2010)

Tom's said:
			
		

> likely, Socket AM3 desktop platforms as well



:shadedshu

There is so much conflicting info out there...makes you wonder...


----------



## Steevo (Aug 25, 2010)

I'm ready for a new board, so who cares. I have been through a 9850, and would like to offload my 940 to my parents and get an X6, if Asus will ever pull their heads out of their ass and get a BIOS done. 


I didn't expect this board to actually last this long for me, and I could drop in a higher-end X4 part and still be happy. I have been building AMD for the last few years now, and the fact that I can still get chips to fit older boards for a cheap performance upgrade, and/or a board to fit an older chip, is amazing after my days as an Intel man.

I have quite a few Intel chips at home from when a board died, but nothing to do with them, and they are now worthless as the boards don't work. I still have a S939 board and chip at home.


----------



## Bloodcrazz (Aug 25, 2010)

I was reading over this, and it's like people live in a dream world. 
A $300 AMD beating a $1000 Intel? Well, if it did beat it, it wouldn't be $300, rofl.
I don't think people remember the Athlon FX, and unlike the Intel Extreme, a new one used to come out every few months.


----------



## crazyeyesreaper (Aug 25, 2010)

Actually, a $300 CPU does beat a $1000 CPU on occasion, granted it's OC vs stock, but the major point here is you're on an enthusiast site. We don't buy parts at the ridiculous high end of the spectrum all the time; many buy the best bang for buck and OC the shit out of it. Example: why would you buy a 980X if a 920 OCed gives you the same performance for $700 less? Granted, that 980X when overclocked will walk away, yes, but at a $700 premium does it make sense to get it? No it doesn't, not for most.

It's why AMD is still around today: they offer good enough and close enough at a lower price. They don't beat Intel's counterparts very often, but they put up a damn good show of it, and a 1090T at 4 GHz+ is a damn good chip for $300.

And sure, we remember the Athlon FX, but it was out at a time when Intel was still shitting out P4s and AMD had the performance crown. Having the crown means you can charge more money for your shit, especially if you have good marketing.


----------



## CDdude55 (Aug 25, 2010)

crazyeyesreaper said:


> actually $300 cpu does beat a $1000 cpu on occassion granted its OC vs stock but the major point here is your on an enthusiast site we dont by parts of ridiculous high end spectrums all the time many buy the best bang for buck and oc the shit out of it. example why would you buy a 980x if a 920oced gives u the same performance for 700 less granted that 980 when overclocked will walk away yes but at $700 premium does it make sense to get it no it dosent not for most.
> 
> its why AMD is still around today they offer good enough and close enough at a lower price they dont beat intels counterparts very often but they put up a damn good show of it and a 1090T at 4ghz+ is a damn good chip for $300.
> 
> And sure we remember the athlon FX but it was out in a time when intel was still shitting out P4s and AMD had the performance crown. having the crown means you can charge more money for your shit especially if you have good marketing



A lot depends on the architecture, actually; a stock 980X still kicks the shit out of a 1090T at 4 GHz, and it still kills an overclocked 920. Then again, this was proven only in programs that were actually multithreaded. In the programs that weren't multithreaded, the 920 and 980X generally perform exactly the same, with AMD still slightly lagging behind (but with the better price). AMD is still around today because, yes, they provide cheaper chips that give you enough performance most of the time. Intel's recent chips are very powerful, but yes, the price is a bit higher; then again, how much higher? I mean, a 1090T is $296 on Newegg and an i7 930 is actually _cheaper_, sitting at $290... and you get better performance. Then again, one could debate that the surrounding parts, including the motherboard, could be more expensive. Then again, if you've spent almost $300 on a CPU, why not get a nice mobo to go along with it?

It depends what you want: AMD definitely offers the best bang for buck, but if you're after all-out performance, that's where Intel shines. In multithreaded games and benchmarks, nothing can touch the 980X (especially when overclocked). The current AMD CPUs will always give you _good enough_ performance, and whether or not that's what you want depends on the person.

If Bulldozer is cheap and can actually beat an i7 this time, then I'll be moving to that platform.


----------



## crazyeyesreaper (Aug 25, 2010)

Well, that's my point: most apps still don't use more than 2-4 threads. Intel has better clock-for-clock performance, it can't be denied, but the major selling point for AMD is the fact a 1090T can be dropped into a previous-gen CFX or SLI mobo that costs $100, whereas that i7 930 is forced to be paired with an X58 (and I'm just going with new prices, not what we can find used). Triple channel is higher priced too, and as the 1156 socket proved, triple channel is unneeded for i7: the 750 and 860 tend to run neck and neck with the 920 in a lot of situations, and surprisingly the 750 tends to do slightly better in games with the same GPU on occasion as well, which is rather interesting.

http://www.anandtech.com/bench/Product/146?vs=46

As shown there, 1090T vs i7 940, both trade blows off and on, and the i7's only dominance comes in Far Cry 2 (and let me tell you, it's a LANDSLIDE in favor of the i7 in that game). But the point is, if I drop down to the 920 it becomes even more in favor of the 1090T, though that's more due to the higher clock rate helping make up for lesser clock-for-clock performance.

But at the end of the day it's whatever fits the bill for what's needed. With the $50 rebate plus Bing cashback Tiger Direct had a while ago, a 1090T could be grabbed for $225, which is a damn good deal. But if I had to go Intel now I'd go 1156, due to the fact the 860 tends to perform a tad better than the 920, the mobo is cheaper, and dual-channel DDR3 is more than enough; it will still hang with the 1090T or any AMD CPU, as we all know.

And yes, same here: if Bulldozer is revolutionary and performs great I'll switch as well, but I'm not holding my breath. Call me a skeptic.


----------



## pantherx12 (Aug 25, 2010)

I'm holding onto my current system until bulldozer as well.

Xeon just about keeps me satisfied, and the 5770 is okay.... for now....
The newest fanciest games I get between 22-35 fps D:


----------



## largon (Aug 25, 2010)

> Originally Posted in *1st post*
> 
> 
> Outside the modules
> At the chip-level, there's a large L3 cache, *a northbridge that integrates the PCI-Express root complex*, and an integrated memory controller. *Since the northbridge is completely on the chip, the processor does not need to deal with the rest of the system with a HyperTransport link. It connects to the chipset (which is now relegated to a southbridge, much like Intel's Ibex Peak), using A-Link Express, which like DMI, is essentially a PCI-Express link.* It is important to note that all modules and extra-modular components are present on the same piece of silicon die. *Because of this design change, Bulldozer processors will come in totally new packages that are not backwards compatible with older AMD sockets such as AM3 or AM2(+).*


Where did this text come from? Is it a quote from a slide, or text released by AMD? The bolded, highlighted parts are controversial, to say the least. I see nothing in the slides on any site that would state there is an on-die PCIe controller, DMI, or even a new socket.


----------



## TheMailMan78 (Aug 25, 2010)

pantherx12 said:


> I'm holding onto my current system until bulldozer as well.
> 
> Xeon just about keeps me satisfied, and the 5770 is okay.... for now....
> The newest fanciest games I get between 22-35 fps D:



5770=weak. Upgrade that bitch.


----------



## pantherx12 (Aug 25, 2010)

TheMailMan78 said:


> 5770=weak. Upgrade that bitch.



No monies right now else I would!

Hopefully getting a job soon at CEX ( Computer shop! WOOOO) so expect my rig to have lots of upgrades if I get the job XD


----------



## Wile E (Aug 25, 2010)

crazyeyesreaper said:


> actually $300 cpu does beat a $1000 cpu on occassion granted its OC vs stock but the major point here is your on an enthusiast site we dont by parts of ridiculous high end spectrums all the time many buy the best bang for buck and oc the shit out of it. example why would you buy a 980x if a 920oced gives u the same performance for 700 less granted that 980 when overclocked will walk away yes but at $700 premium does it make sense to get it no it dosent not for most.
> 
> its why AMD is still around today they offer good enough and close enough at a lower price they dont beat intels counterparts very often but they put up a damn good show of it and a 1090T at 4ghz+ is a damn good chip for $300.
> 
> And sure we remember the athlon FX but it was out in a time when intel was still shitting out P4s and AMD had the performance crown. having the crown means you can charge more money for your shit especially if you have good marketing


And if AMD takes the lead again, do you really expect them to give you $300 chips that outperform Intel's $1000 chips? No, they won't. The point he is making is that if they can compete in the high end, they will charge $1000 for those cpus, just like Intel. They've already demonstrated this in the past with the FX series. You will no longer have $300 kick ass chips.

And the $700 was worth every penny to me. 



crazyeyesreaper said:


> well thats my point most apps still dont use more then 2-4 threads Intel has better clock to clock performance its cant be denied but major selling point for amd is the fact a 1090T can be dropped into a previous gen cfx or sli mobo that costs $100   where as that i7 930 is forced to be paired with a x58 and im just going with new prices not what we can find used that and triple channel is higher priced and as the 1156 socket proved triple channel is uneeded for i7. as the 750 and 860 tend to run neck and neck with the 920 in alot of situations and surprisingly the 750 tends to do slightly better in games with the same gpu on occasion as well which is rather interesting
> 
> http://www.anandtech.com/bench/Product/146?vs=46
> 
> ...


I still fail to understand why people use games to test cpu performance. 4 year old cpus still game just fine. It's kind of a pointless test for cpu power.

But, to be honest, I was looking forward to Thuban at first, then I found out it only matches i7 quads clock for clock in the stuff I do. That's when I skipped on a gfx upgrade, and went with the 980X instead. 4870X2 is still plenty for most games, but my QX wasn't doing the trick for me anymore. 

I really hope Bulldozer lives up to expectations tho. Competition at the high end will do us some justice. I'll sell my rig and go AMD if they can pull ahead.


----------



## Bloodcrazz (Aug 25, 2010)

crazyeyesreaper said:


> im talking about the guy who bitched for nearly half the thread about 890fx... when if he used his brain he would realize that when bulldozer does come out  890fx will be nearly 2 years old



Err, 890FX came out a few months ago (Q2 2010) and Bulldozer is coming in the first half of 2011. I'm no math king, but I'm sure that's 2 years.
ROFL


----------



## crazyeyesreaper (Aug 25, 2010)

Bulldozer probably won't hit mainstream (aka non-server) markets until the end of 2011; after all, the leaked benches above are most likely server chips, since that's what they're comparing performance-wise. Now, I'm no genius, but there were 12-core AMD server CPUs long before a 6-core ever came to the desktop market. By the time Bulldozer comes to the market in force it will be over a year. So people can still bitch, moan, and complain; if you don't like it, too bad, those companies still don't give a shit. Also, 32 nm bulk has been skipped and we're going to 28 nm, which still isn't in full swing and won't be for some time.

Then let's not forget the switch to GlobalFoundries, etc.; who's to say they won't hit a snag? Fact is, by the time Bulldozer is in full swing it will be 2012, in my honest opinion, and in that situation it's still nearly 2 years for 890FX. And honestly, I'm not butthurt I didn't buy an 890FX; I paid $110 for a 790FX. You get what you pay for, and in the tech world you pay a premium for the latest and greatest.


----------



## Bloodcrazz (Aug 25, 2010)

crazyeyesreaper said:


> bulldozer probably wont hit maintstream aka non server markets untill the end of 2011 after all the leaked benchs above are most likely server chips since thats what there comparing performance wise now im no genius but there were 12 core amd server cpus long before a 6core ever came to the desktop market.  by the time Bulldozer comes to the market in force it will be over a year. Still so ppl can still bitch moan and complain if u dont like it to bad those companies still dont give a shit also 32nm bulk has been skipped and were going to 28nm which still isnt in full swing and wont be for some time
> 
> then lets not forget the switch to global foundries etc whos to say they wont hit a snag fact is by the time bulldozer is in full swing it will be 2012 in my honest opinion and in that situation its still nearly 2 years for 890fx and honestly im not butt hurt i didnt by an 890fx i paid $110 for a 790fx u get what u pay for and in the tech world u pay the price for the lastest and greatest



Where are you reading this? Zambezi will be 32 nm.
It's ATI that skipped 32 nm, going to 28 nm.


----------



## Super XP (Aug 26, 2010)

Bloodcrazz said:


> where are you reading this. zambezi will be 32nm.
> its ati that skipped 32nm going to 28nm.


Sounds like an old forum post with misinformation. I remember reading this somewhere way back.


----------



## cadaveca (Aug 26, 2010)

Super XP said:


> Sounds like an old forum post with misinformation. I remember reading this somewhere way back.



Nah, that's pretty accurate. TSMC, which makes ATI's chips, is skipping 32 nm, and that's official, so only GloFo, or maybe Chartered, could produce 32 nm chips for ATI (I'm unsure of Chartered's capacity, or even ability, to do 32 nm).

While I'd personally love to see TSMC lose ATI's business, I doubt GloFo actually has the capacity to produce 32 nm VGA chips without affecting CPU or chipset output.


----------



## nt300 (Aug 26, 2010)

cadaveca said:


> Nah, that's pretty accurate. TSMC, that makes ATI chips, is skipping 32nm, and that's official, so only GLoFOr or maybe Chartered, could produce 32nm chips for ATi(unsure of Chartered's capacity or even ability to do 32nm).
> 
> While I'd personally love to see TSMC lose ATi's business, I doubt GloFo actually has the capacity to produce 32nm vga chips, without affecting cpu or chipset outputs.


I read GlobalFoundries also cancelled 32 nm and already moved to 22 nm & 20 nm. 
And here's the link.
http://www.xbitlabs.com/news/other/display/20100401144643_Globalfoundries_Scraps_32nm_Bulk_Fabrication_Process.html


----------



## cadaveca (Aug 26, 2010)

nt300 said:


> I read GlobalFo also cancel 32nm and already move to 22nm & 20nm.
> And heres the link.
> http://www.xbitlabs.com/news/other/display/20100401144643_Globalfoundries_Scraps_32nm_Bulk_Fabrication_Process.html



Thanks, dude, I do remember this, but I'll point out something:



> All of our efforts around *next-gen graphics and wireless* are focused on 28nm with HKMG and we no longer have a 32nm bulk process.



Notice there's no mention of CPU processes. They are different products.


----------



## inferKNOX (Aug 27, 2010)

Bloodcrazz said:


> http://www.tomshardware.com/reviews/bulldozer-bobcat-hot-chips,2724-2.html
> Scorpius: Enthusiast desktop platform based on AMD’s Zambezi processor and discrete graphics (AMD, of course, specifies an ATI GPU). The platform requires a quad-core CPU or higher, DDR3 memory, and a revised Socket AM3 interface. Availability is expected in 2011.
> toms wrong 2
> lol everyone turn on panic rofl but panic was the only one that was right





cadaveca said:


> :shadedshu
> 
> There is so much conflicting info out there...makes you wonder...


That is just the same word-for-word statement as in post #139, which I commented on in post #143; nothing new.
It's a simple case of copy-paste, which doesn't make it any more right or wrong than the rest, and neither adds to nor subtracts from the level of misinformation out there.
@Bloodcrazz: trolling is never right, even if he was trying to prove some point.

@Everyone Else: AMD sure is taking long to announce the HD6000s officially if they're coming out as soon as Oct/Nov '10; do you think they're trying to squeeze the announcement as close as they can to the Bulldozer for the sake of the Scorpius platform?


----------



## a_ump (Aug 27, 2010)

Idk why, but I have a feeling that AMD/ATI are going to blow us the _f*ck!_ away. Idk if it'll be purely in GPU specs, but I've noticed a lot of HD 5xxx series issues that have been ongoing for some time, esp. xfire. 

I know ATI has that rep, but they did well with the HD 4xxx series drivers, so it makes me think a good portion of their time is being spent on the HD 6xxx series: mad optimizations, improved shader loading, more accurate CCC Overdrive with voltage control, at least 85% xfire scaling. I do expect some of what I've said to happen, but I definitely don't think it all will; that would be too perfect and would make Nvidia shit themselves lol


----------



## inferKNOX (Aug 27, 2010)

Yeah, that would be nice.
Just as I spoke though: ATI Radeon HD 6000 Series GPU Codenames Surface
Still unofficially however I take it... :-/
TIME FOR SOME MARKETING, AMD! That's where Intel and nVidia trump AMD/ATI almost _every time_, and I say almost, not because there's been a time I've seen otherwise, but because I just assume they _must_ have marketed better _at least once_!


----------



## wahdangun (Aug 27, 2010)

CDdude55 said:


> But the whole point of Physx is so the GPU does the physics processing instead of the CPU. whether or not a CPU can utilize all of it's cores is a matter of software taking advantage of those cores. How is physx holding the CPU back?, even if physx is poorly coded, how would that effect the CPU?. I don't understand how in anyway physx could be holding back a part that it has nothing to do with.
> 
> But of course, everything has to be Nvidia's fault right.:shadedshu



the whole point of PhysX is so we can have more realistic effects, and btw it really takes some GPU processing power to process PhysX effects, with the result that we have to tone down our in-game settings (lower AA/detail) while our hexacore CPUs sit useless (because no games can use more than 4 threads). so if PhysX was coded properly we wouldn't need to burn our precious GPU power on PhysX, and the CPU would be more useful,

so nvidia should make it like Havok but optimize it further by supporting more than quad core,


----------



## pantherx12 (Aug 27, 2010)

Actually the latest PhysX supports multicore processing by default, as many cores as you have.

It still uses old instruction sets mind you.

an 8600GT for example has twice the performance of my CPU at the moment.

I'd expect my CPU to be as good as an 8600GT if PhysX used newer instruction sets on the CPU.


----------



## TheMailMan78 (Aug 27, 2010)

The fact these new CPUs will use AM3 sockets makes me a little sad. You cannot push the envelope and maintain current compatibility, especially on a 2 year old socket.


----------



## nt300 (Aug 27, 2010)

cadaveca said:


> Thanks, dude, I do remeber this, but I'll point out something:
> 
> Notice no mention of cpu processes. They are different products.


Yes good point, thanks


----------



## cadaveca (Aug 27, 2010)

nt300 said:


> Yes good point, thanks



I think the real point there is that once again, AMD isn't exactly forthcoming with PRECISE information, ever. Or maybe it's those reporting...I am unsure since everyone in those circles is so "buddy-buddy" at this point.


----------



## nt300 (Aug 27, 2010)

So Bulldozer CPUs will not work in old AM3 motherboards, but old AM3 CPUs will work in new AM3+ motherboards. I hope AMD does not mess up the DDR3 scaling, because dual-channel is not enough to feed 8 Bulldozer cores.



> *Desktop Bulldozer Processors Will Require New Platforms - AMD.
> AMD Zambezi to Use AM3+ Platforms*
> http://www.xbitlabs.com/news/cpu/display/20100826225852_Desktop_Bulldozer_Processors_Will_Require_New_Platforms_AMD.html
> 
> Advanced Micro Devices said that its next-generation desktop processors code-named Zambezi will use socket AM3+ platforms, which will be backwards compatible with the firm's existing AM3 products. While the latter is an advantage for the platform, it may be a disadvantage for eight-core processors based on Bulldozer micro-architecture...............


----------



## cadaveca (Aug 27, 2010)

I'm not buying any of it. Let's wait for some motherboards to surface before deciding who's got the right story...I think these guys aren't all talking to the same people @ AMD, and the guys they are talking to, aren't exactly up to date on all the pertinent info. Idiots.


----------



## Deleted member 67555 (Aug 27, 2010)

I'm thinking the first round will be like the Phenom II 920 and 940, but after that they will all be AM3r2-only CPUs


----------



## Super XP (Aug 28, 2010)

New platform, no problem I am looking forward to buying a new mobo.


----------



## Neo4 (Sep 25, 2010)

nt300 said:


> So Bulldozer CPUs will not work with old AM3 motherboards but old AM3 cpus will work in new AM3+ motherboards. I hope AMD does not mess up the DDR3 scaling because Dual-channel is not enough to feed 8 bulldozer cores.



Dual channel memory is more than enough, and Intel proved it with socket 1366: triple channel designs were an unnecessary expense. Why do you think they went back to dual channel? Read the reviews; it wasn't just the expense. (By the way, read the reviews on the real-world impact of RAM speed as well.)

And how could current AM3 designs support a radical and completely new design never before tried by ANY CPU manufacturer, one that doesn't require a Northbridge chipset because it's built into the CPU itself? If current boards supported "Bulldozer" then it would just be a rehash of "Stars" and little faster than what AMD has now, despite the die shrink to 32 nm, which will certainly allow higher clocks and lower TDPs. It certainly wouldn't have a chance against Intel's current and future processors. Allowing current CPUs to work in the Bulldozer boards to come is far more generous than anybody should expect, and far more than the Intel camp would ever allow.

AMD, I strongly suspect, has a major new performance boost coming with Bulldozer, and it's going to strike with even more impact because they will downplay it right up to the day it's released to the server market next April or so. Remember when AMD shocked everybody with how much faster the 4000 video series was than the 3000 series, by keeping a low profile up until the day they went on sale? By next August, regular peeps like us will be able to purchase hardware from NewEgg, probably no more expensive than current AMD hardware, and all we'll need to upgrade our boxes will be a new board and CPU. Next year at this time TechPowerUp, HardOCP, Anandtech and all the other hardware review sites will be gushing their enthusiasm for what AMD will have accomplished.

Exciting times my friends, when you think that you can just buy a new board that supports Bulldozer, use your current Phenom II, and buy a Bulldozer CPU later when you have the cash. That's a pretty painless and inexpensive upgrade path compared to ChipZilla.


----------



## JF-AMD (Sep 25, 2010)

People seem to be really caught up in how many channels of memory there are, and not necessarily how efficient those channels perform.

What if you had 2 channels that could perform the same as 3?  Would you still demand 3 or would you be ok with 2?

It's the same thing with thermals on servers.  Intel is at 32nm but their best 2P power score (@ 100% utilization) is 174W.  Ours is 126W (on a 45nm process). I have people try to convince me that 32nm is an advantage because you have lower power consumption.

It's not about the technology, it's about the output.


----------



## cheezburger (Sep 25, 2010)

JF-AMD said:


> People seem to be really caught up in how many channels of memory there are, and not necessarily how efficient those channels perform.
> 
> What if you had 2 channels that could perform the same as 3?  Would you still demand 3 or would you be ok with 2?
> 
> ...



that's because in the computer world everything is accelerated by pure brutal force, not efficiency. if you can do the same performance as an Intel that consumes 174W while only using 126W, why not increase to 174W and crush Intel? i don't understand your logic at all.


----------



## bear jesus (Sep 25, 2010)

JF-AMD said:


> People seem to be really caught up in how many channels of memory there are, and not necessarily how efficient those channels perform.



I think that some people, myself included, just assumed that each channel is limited more by the ram than anything else, thus assumed that the only way to get more performance is to add more channels.

I'm still interested in the idea of a quad memory channel bulldozer (preferably interlagos) for a home server, partly as i assume that with so many cores and with running multiple virtual machines it would benefit from the extra channels, although really i don't have a clue what would be needed memory-bandwidth-wise or if i would have a need for so many channels.


----------



## btarunr (Sep 25, 2010)

cheezburger said:


> that's because in the computer world everything is accelerated by pure brutal force, not efficiency.



That's exactly what JF is talking about. "Pure brutal force" counts, not what goes into creating it. So if Bulldozer's client SKU uses, say, dual-channel DDR3-1866 as its memory standard (since 1866 MHz 1.5V bulk DIMMs are a reality), it makes up for the memory bandwidth that triple-channel DDR3-1066 (the Core i7 official memory standard) gets from its third channel. It's the same point as 256-bit high-speed GDDR5 (AMD) vs. 384-bit low-speed GDDR5 (NVIDIA). 
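Rough peak-bandwidth math for that trade-off (a sketch assuming the standard 64-bit-per-channel DDR3 bus and theoretical peaks, not measured throughput):

```python
# Each DDR3 channel is 64 bits (8 bytes) wide, so
# peak GB/s = megatransfers/s * 8 bytes * channels.

def peak_bandwidth_gbs(mts: float, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s for a DDR3 configuration."""
    return mts * 1e6 * 8 * channels / 1e9

dual_1866 = peak_bandwidth_gbs(1866, 2)    # dual-channel DDR3-1866
triple_1066 = peak_bandwidth_gbs(1066, 3)  # triple-channel DDR3-1066

print(f"dual-channel DDR3-1866:   {dual_1866:.1f} GB/s")   # ~29.9 GB/s
print(f"triple-channel DDR3-1066: {triple_1066:.1f} GB/s") # ~25.6 GB/s
```

On paper, two fast channels actually edge out the three slow ones here; real throughput depends on the memory controller, which was JF's efficiency point.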

And you're wrong, efficiency is God in the server world.


----------



## JF-AMD (Sep 25, 2010)

Because there are large companies that buy tens of thousands of servers and all they care about is the absolute lowest power possible so that they can have the largest number of threads with the lowest watts per thread.  Think of very large cloud companies.

As a matter of fact, these customers routinely underclock their processors because the proportional drop in power is greater than the drop in performance, leading to better performance per watt.

Not every application requires performance.  As a matter of fact, because only ~5% of the processors bought are top bin (ours and intel's), you can actually say that 95% of the customers want something other than raw performance (either price/performance or performance/watt.)  It is pretty simplistic to think that performance is the only vector that matters.  It's akin to asking a hybrid car owner what the 0-60mph time is or asking a sports car owner what the gas mileage is.

There are plenty of different usage models in the market and the "raw performance at all costs" is ~5% of the market.  At best.
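The underclocking trade-off above can be sketched with purely hypothetical numbers (none of these are AMD figures; the point is only that power falls faster than performance does):

```python
# Hypothetical illustration: dropping clock (and voltage) cuts power
# disproportionately, so performance-per-watt rises even as raw
# performance falls -- exactly the trade cloud buyers optimize for.

stock = {"perf": 100.0, "watts": 95.0}        # made-up baseline
underclocked = {"perf": 85.0, "watts": 65.0}  # made-up: -15% perf, ~-32% power

def perf_per_watt(cfg: dict) -> float:
    """Simple throughput-per-watt metric."""
    return cfg["perf"] / cfg["watts"]

print(f"stock:        {perf_per_watt(stock):.2f} perf/W")        # ~1.05
print(f"underclocked: {perf_per_watt(underclocked):.2f} perf/W") # ~1.31
```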


----------



## JF-AMD (Sep 25, 2010)

bear jesus said:


> I think that some people, myself included just assumed that each channel is limited more by the ram than anything else thus assumed that the only way to get more performance is to add more channels.
> 
> I'm still interested in the idea of a quad memory channel bulldozer (preferably interlagos) for a home server partly as in a way i assume that with so many core's and with running a multiple virtual machines it would benifit from the extra channels, although really i dont have a clue what would be needed memory bandwith wise or if i would have a need for so many channels.



Actually, you find that 3 channels is in reality less efficient.  I could get into the long math of it, but let me cut to the chase: Everything in the computer world is based on even numbers.  3 channels of memory is the odd man out and is not handled the same way.  Plus you don't get to do some things on the server side like advanced ECC unless you have even numbers of channels.


----------



## cadaveca (Sep 25, 2010)

btarunr said:


> And you're wrong, efficiency is God in the server world.



AMD's process uses less current than Intel's, and this is a huge advantage for AMD(not like I haven't said that before). I think they have the efficiency thing down pat already...and hopefully Bulldozer brings that brute force. The two things together = 1 killer chip.


----------



## bear jesus (Sep 25, 2010)

JF-AMD said:


> Actually, you find that 3 channels is in reality less efficient.  I could get into the long math of it, but let me cut to the chase: Everything in the computer world is based on even numbers.  3 channels of memory is the odd man out and is not handled the same way.  Plus you don't get to do some things on the server side like advanced ECC unless you have even numbers of channels.



Really i'm expecting to have to choose between 2 or 4 channels for the server, mainly depending on performance, along with either 8 or 16 cores. But it is good to know that a triple-channel based server would not be a good idea for my wants/needs.


----------



## DigitalUK (Sep 25, 2010)

JF-AMD said:


> People seem to be really caught up in how many channels of memory there are, and not necessarily how efficient those channels perform.
> 
> What if you had 2 channels that could perform the same as 3?  Would you still demand 3 or would you be ok with 2?



i think JF knows something he's not telling us...lol  blink twice if it's dual channel


----------



## bear jesus (Sep 25, 2010)

DigitalUK said:


> i think JF knows something hes not telling us...lol  blink twice if its dual channel



I'm pretty sure he knows a lot that he can't tell us  we are just lucky he is doing a good job at kind of telling us things without telling us certain things... if that makes any sense lol.


----------



## CDdude55 (Sep 25, 2010)

DigitalUK said:


> i think JF knows something hes not telling us...lol  blink twice if its dual channel



lol, there's a lot he probably can't tell us, even if he's only the server guy.

I think if they can make it efficient and get near or more memory bandwidth while only using two channels, then i'm all fine with that. As said, i think on the server side of things efficiency is very important, but client wise, triple channel is more than enough even if it's not as efficient.


----------



## DigitalUK (Sep 25, 2010)

yea i've complete faith in amd to deliver the goods, that's why i'm saving now. it's just JF has hinted at least 3 times or more that 3 channels could be less efficient. his blog is also Very interesting, memory noted there as well.
AMD hit man knocking my door soon.


----------



## mastrdrver (Sep 26, 2010)

JF-AMD said:


> People seem to be really caught up in how many channels of memory there are, and not necessarily how efficient those channels perform.
> 
> What if you had 2 channels that could perform the same as 3?  Would you still demand 3 or would you be ok with 2?
> 
> ...



You sir are crazy! Everyone knows that 3 channels pwns all and 4 is teh win! 

 j/k

I'm glad we could get some clarity on this straight from the horse's mouth (so to speak).


----------



## Wile E (Sep 26, 2010)

All I care about is how much overclocked performance it can achieve within the heat output my cooling setup is able to manage. I don't care how it's achieved, only that it is.

I just want to know how it performs and how it overclocks.

If it's better than Intel, my next rig is AMD. If not, I stick with Intel. That's that.


----------



## bear jesus (Sep 26, 2010)

Wile E said:


> All I care about is how much performance it can achieve within the heat output my cooling setup is able to manage. I don't care how it's achieved, only that it is.
> 
> I just want to know how it performs and how it overclocks.
> 
> ...



I have to agree, i really want to go with some water cooling with my next cpu upgrade so i am really hoping that bulldozer will oc well under water.


----------



## Super XP (Sep 26, 2010)

Neo4 said:


> Dual channel memory is more than enough and Intel proved it with socket 1366 and triple channel designs being an unnecessary expensive. Why do you think they went back to dual channel?
> 
> ...


Is that a shared NB?


----------



## JF-AMD (Sep 27, 2010)

Super XP said:


> Is that a shared NB?
> http://www.xbitlabs.com/images/news/2009-11/amd_bulldozer_scheme.jpg



Shared at the die level, just like the L3


----------



## Super XP (Sep 27, 2010)

What I was trying to point out is that the NB is still integrated into Bulldozer, just like the IMC.


----------



## TheMailMan78 (Sep 29, 2010)

cadaveca said:


> I think the real point there is that once again, AMD isn't exactly forthcoming with PRECISE information, ever. Or maybe it's those reporting...I am unsure since everyone in those circles is so "buddy-buddy" at this point.



I think that's the case with any major company. You never show all the goods. It can give you an edge or hide your flaws.


----------



## cadaveca (Sep 29, 2010)

TheMailMan78 said:


> I think thats the case with any major company. You never show all the goods. It can give you an edge or hide your flaw.



Well, JF-AMD is posting here now, so maybe he'll end that confusion.


----------



## TheMailMan78 (Sep 29, 2010)

cadaveca said:


> Well since, then, JF-AMD is posting here now, so maybe he'll end that confusion.



I wouldn't hold my breath. TPU don't sign the man's paycheck.


----------



## CDdude55 (Sep 29, 2010)

TheMailMan78 said:


> I wouldn't hold my breath. TPU don't sign the mans paycheck.



I agree.

I think it's going to be just like the Palit rep that used to post here: they'll come and correct some people and do some clean-up work, but after a day or so they'll never log back in again (unless something big with their company happens and they have to regulate).


----------



## trt740 (Sep 29, 2010)

That's not how it worked. The Palit guy posted for a long time, but during the USA financial meltdown Palit closed up shop in the USA and he lost his job. So that's not really a good comparison, but we get your point.


----------



## cadaveca (Sep 29, 2010)

TheMailMan78 said:


> I wouldn't hold my breath. TPU don't sign the mans paycheck.



NO, but AMD does, and his job is marketing...and according to him, we are included in that marketing. Likewise, it should be his job to correct any misconceptions based on data he has given. After all, something is better than nothing(well, not to me, but ya know.)


----------



## CDdude55 (Sep 29, 2010)

cadaveca said:


> NO, but AMD does, and his job is marketing...and according to him, we are included in that marketing. Likewise, it should be his job to correct any misconceptions based on data he has given. After all, something is better than nothing(well, not to me, but ya know.)



Even if that is the case, he as a ''marketer'' has a lot of other big fish to catch. Like i said, he came here for a day or so to regulate on some rumors or all-out false facts, and then bam, you won't see him here possibly until everyone is speculating about another new architecture or another new line-up of cards or CPUs.

But who knows, he may keep posting here and doing his best to answer some of our questions. I just don't think first on his list is ''Stay logged into TPU and regulate on speculation and rumors for weeks on end''.




trt740 said:


> Thats not how it worked Palit guy posted for along time but during the USA financial melt down Palit closed up shop in the USA and he lost his job. So thats not really a good comparison but we get your point.



Ahhh, that's a good point.


----------



## cadaveca (Sep 29, 2010)

OH, I agree...BullDozer seems to be his "baby". 


Time will tell though...I hope for way more than I should, but am always prepared for the worst.  I know he's travelling right now, for example...


----------



## bear jesus (Sep 29, 2010)

I think we need to remember JF posts on other forums, has a blog he writes/replies to, he has a job (that i assume takes him away from home at times) and a life in general. i'm sure he will be around in the future, but i'm also pretty sure he is busier than most of us 

I just wish it was not so far from the server chips' launch; i hated waiting through Istanbul's release for Thuban to come out and show some overclocking numbers.


----------



## wahdangun (Sep 29, 2010)

yeah i hope it's not as long as Shanghai to Deneb or Istanbul to Thuban, i hate to wait that long.

and btw, does anyone know if amd will launch a fusion chip with bulldozer cores?


----------



## Neo4 (Sep 29, 2010)

trt740 said:


> Thats not how it worked Palit guy posted for along time but during the USA financial melt down Palit closed up shop in the USA and he lost his job. So thats not really a good comparison but we get your point.



Too bad about Palit USA and the rep who posted here. Glad you can still buy them, as I have a Sonic Platinum 1 gig that does some crazy numbers in Folding@Home, and it's running default speed undervolted to 0.950v. I may even still have it when I buy my Bulldozer mobo next year.


----------



## TheMailMan78 (Sep 29, 2010)

Green or Red I hope both sides bring it soon. These prices are getting out of hand.


----------



## JF-AMD (Sep 29, 2010)

If you look at my forum coverage, I am on about 8 or 10 different ones and have thousands of responses.

While "marketing" is my job, forums are not in my job description and I don't get paid to be here.  You'll notice that I specifically try to stay away from speculation on the competitor, I only deal with facts and you also should note that I start very few threads.  I generally only respond to questions because I don't want to be a shill for my products.

I do most of my forum surfing after hours.  For those of you that think this is the workday right now, I am in Tokyo right now and it is not even breakfast time.  Been in forums for the past 2-3 hours.

Time will tell, reputations are not earned overnight.

What was the original question you were asking about? I can't track that back. If someone can repost, I will get you an answer.


----------



## TheMailMan78 (Sep 29, 2010)

JF-AMD said:


> If you look at my forum coverage, I am on about 8 or 10 different ones and have thousands of responses.
> 
> While "marketing" is my job, forums are not in my job description and I don't get paid to be here.  You'll notice that I specifically try to stay away from speculation on the competitor, I only deal with facts and you also should note that I start very few threads.  I generally only respond to questions because I don't want to be a shill for my products.
> 
> ...



Did cadaveca create the TWKR chip?


----------



## cadaveca (Sep 29, 2010)

TheMailMan78 said:


> Did cadaveca create the TWKR chip?










I didn't say I created it, I said it was my idea.  

Normally I'd call you a troll, but that's seriously funny.

Now wouldn't you be upset if he said yes.


----------



## wahdangun (Sep 30, 2010)

TheMailMan78 said:


> Green or Red I hope both sides bring it soon. These prices are getting out of hand.



yupz, and it's really retarded, without NV the prices are just gonna get worse. i hope AMD still has some sense in their brain, and doesn't price it like there's no tomorrow


----------



## mastrdrver (Sep 30, 2010)

While I don't visit XS too often, I'm usually browsing S/A, and it seems like JF posts quite a bit over there (at least to me). Though that forum can get highly technical at times.



JF-AMD said:


> If you look at my forum coverage, I am on about 8 or 10 different ones and have thousands of responses.



Best one I remember recently is when you got called out over on OC.net for being a noob since your post count is so low. 

Btw, can you comment on whether all AM3 chips will be able to work in the Bulldozer socket for non-server types like us?

I've just never seen anyone clarify if all AM3 chips will work or just certain ones (why some would and some wouldn't, outside of BIOS support, wouldn't make sense to me, but maybe there is something more than what is currently known). Trying to be as general with that question in regards to future stuff as I can, but I'm sure you know what I'm trying to get at.



TheMailMan78 said:


> Did cadaveca create the TWKR chip?



bwahahahaha


----------



## JF-AMD (Sep 30, 2010)

Actually I don't know the answer to this.


----------



## bear jesus (Sep 30, 2010)

mastrdrver said:


> can you comment on whether all AM3 chips will be able to work in the Bulldozer socket for non server types like us?



If the desktop socket for Bulldozer is AM3+, would it be logical to assume that it should be similar to socket AM2 and AM2+, likely having similar compatibility issues and the same kind of trade-offs?

If that is correct, then shouldn't all AM3 chips be able to work in an AM3+ board, and all AM3+ chips be able to work in an AM3 board, with the only incompatibilities brought in by board makers depending on their BIOS updates?

I would hope so, as over recent years and between 2 computers i have used an AM2 chip on an AM2+ board, an AM2+ chip on an AM2 board, and am currently using an AM3 chip on an AM2+ board (of course i also had matching chips to sockets); it has been one of the big reasons it has been so easy for me to stay with AMD.


----------



## crazyeyesreaper (Sep 30, 2010)

bulldozer won't work in AM3, period, end of story. but the pin layout may be such that an AM3 cpu can be dropped in and work; we'll have to wait and see for now tho


----------



## bear jesus (Sep 30, 2010)

crazyeyesreaper said:


> bulldozer wont work period in AM3 end of story there but the pin layout may be such that an AM3 cpu can be dropped in and work we have to wait and see for now tho



I would say there goes my processor/board swapping, but to be honest i don't intend to get an AM3 board anymore (mainly due to asus being a pain in the butt and taking forever with the crosshair IV extreme). i just intended to wait for bulldozer, and if it's good enough to tempt me away from intel's offerings then i will be buying a current board to go with a current chip. either way though i'm really looking forward to an all-out upgrade to my gaming rig


----------



## mastrdrver (Sep 30, 2010)

From what I've understood, AM3 CPUs will work in the Bulldozer socket. Whether it will be called AM3+, or if that name is for Llano, idk. Though Bulldozer is not backward compatible with AM3 boards; this has been made clear since the initial releases from Hot Chips.


----------



## scaminatrix (Sep 30, 2010)

JF-AMD said:


> People seem to be really caught up in how many channels of memory there are, and not necessarily how efficient those channels perform.
> 
> What if you had 2 channels that could perform the same as 3?  Would you still demand 3 or would you be ok with 2?



New chipset with dual channel DDR4 support next year? 990FX?


----------



## JF-AMD (Sep 30, 2010)

DDR4 in 2011?  You need to do a little more research on that one.


----------



## scaminatrix (Sep 30, 2010)

JF-AMD said:


> DDR4 in 2011?  You need to do a little more research on that one.



My optimism will be the death of me!


----------



## bear jesus (Sep 30, 2010)

scaminatrix said:


> New chipset with dual channel DDR4 support next year? 990FX?



unfortunately it should be around 2015 before we see DDR4. would be nice to have some DDR4 next year, but i would be happy with some DDR3 if i can push it over 2000MHz on an amd board.


----------



## nt300 (Sep 30, 2010)

scaminatrix said:


> New chipset with dual channel DDR4 support next year? 990FX?


More like very late 2012 to 2013; that was already confirmed by many RAM manufacturers. I still think DDR3 has a lot more legs to run on, and now that prices are coming down it's very good.


----------



## JF-AMD (Oct 1, 2010)

RAM transitions are generally overestimated by the manufacturers. There is the "introduction" timeframe and the "mass acceptance" timeframe.

The earliest availability date typically means huge price premiums, spotty supply and less than stellar capabilities until the companies get their processes in line.


----------



## cheezburger (Oct 1, 2010)

bear jesus said:


> unfortunatly it should be around 2015 we see DDR4, would be nice to have some DDR4 next year but i would be happy with some DDR3 if i can push it over 2000mhz on an amd board.



why bother going DDR4 when you realize it's going to be 20+ cycles in latency......

what we really need is to fix the latency per clock on current ram technology, not just add more clock


----------



## Wile E (Oct 1, 2010)

trt740 said:


> Thats not how it worked Palit guy posted for along time but during the USA financial melt down Palit closed up shop in the USA and he lost his job. So thats not really a good comparison but we get your point.



I miss Palit Guy and my free hardware. 



cheezburger said:


> why bother go DDR4 when you realize it's going to be 20+ in cycle latency......
> 
> now all we need is to fix these latency per clock on the current ram technology. not just more clock



Latency per clock on DDR3 is already better than both DDR2 and DDR. To get the same absolute latency as my DDR3 ram on DDR2, you would have to run it at CAS4 1066MHz or CAS5 1333MHz. Not too many ram kits could do that stable and live for very long, and none were sold with those as their stock speeds.
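That equivalence works out if you convert CAS latency to absolute nanoseconds (a sketch; the exact DDR3 kit isn't stated, but DDR3-1600 CL6 is one configuration that matches the DDR2 equivalents given):

```python
# Absolute CAS latency (ns) = CAS cycles / memory clock, where the
# memory clock is half the DDR data rate (DDR transfers twice per clock).

def cas_ns(data_rate_mts: float, cas_cycles: int) -> float:
    """Absolute CAS latency in nanoseconds for a DDR data rate and CL."""
    clock_mhz = data_rate_mts / 2
    return cas_cycles / clock_mhz * 1000  # cycles / MHz -> ns

print(cas_ns(1600, 6))  # DDR3-1600 CL6 -> 7.5 ns (assumed kit)
print(cas_ns(1066, 4))  # DDR2-1066 CL4 -> ~7.5 ns
print(cas_ns(1333, 5))  # "DDR2" at 1333 CL5 -> ~7.5 ns
```

So the higher CAS numbers on DDR3 are offset by the faster clock: the absolute wait in nanoseconds is the same or lower.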


----------



## Neo4 (Oct 1, 2010)

Wile E said:


> I miss Palit Guy and my free hardware.
> 
> Must have been nice that..
> 
> Latency per clock on DDR3 is already better than both DDR2 and DDR. To get the same latency per clock as my ram on DDR2, you would have to run it at CAS4 1066Mhz, or CAS5 1333Mhz. Not too many ram kits could do that stable and live for very long, and none were sold with those as their stock speeds.



Timing and speed matter little in real-world applications and games. All the reviews show only a very few frames per second difference. It takes fast L1, L2 and L3 cache to keep our hungry processors fed. Compared to that, system memory is dog slow, and only hard drives and optical drives are slower. Thank goodness solid state drives are slowly taking over both those antique mechanical technologies.


----------



## largon (Oct 1, 2010)

DDR4 can't come soon enough. 

Miniaturization of CPUs, and as such IMCs, has brought problems regarding memory vDDQ voltages; remember the i7 doesn't run safe with RAM vDD > 1.65V? You can't have huge voltage differences between the IMC and the rest of the core, or things between those two parts of the die go *poof*. Even ULV DDR3 running a 1.35V vDDQ will cause a conflict with CPU core vDDs sooner or later, and CPU vDDs are continuously going down... Anyways, more aggregate bandwidth never hurts, and considering GPUs are getting more and more integrated into the CPU, in the near future the industry will be screaming for faster RAM.


----------



## Wile E (Oct 2, 2010)

Neo4 said:


> Timing and speed matter little in real world applications and games. All the reviews show only a very few frames per second difference. It takes fast L1, L2 and L3 cache to keep our hungry processors feed. Compared to that, system memory is dog slow and only hard drives and optical drives are slower. Thank goodness solid state drives are slowly taking over both those antique mechanical technologies..


I know it makes little difference. Just commenting on his apparent misunderstanding of RAM performance. Good DDR3 has both lower real-world latency and higher bandwidth than both DDR1 and DDR2. I was just speaking in terms of the hardware's raw abilities, not the effect it has on our apps. 

And yes, getting free hardware to OC to death was a blast. lol.



largon said:


> DDR4 can't come soon enough.
> 
> Miniaturization of CPUs and as such, IMCs, has brought problems regarding memory DDQ voltages; remember i7 doesn't run safe with RAM vDD > 1.65v? You can't have huge voltage differences between IMC and the rest of the core or things between them two parts of the die go *poof*. Even ULV DDR3 running a 1.35V vDDQ will cause a conflict with CPU core vDDs, sooner or later. And since CPU vDDs are continuously going down... Anyways, more aggregate bandwidth never hurts, and considering GPUs are getting more and more integrated in the CPU, so in near future the industry will be screaming for faster RAM.



I also wouldn't mind seeing RAM and core speeds match; I bet that would help latency nicely. Having both RAM and CPU locked at 4 GHz (for example) has to have some sort of positive benefit in overall performance.


----------



## JF-AMD (Oct 3, 2010)

largon said:


> DDR4 can't come soon enough.



What if it is slower, higher latency and more expensive?  Will you make the jump then?

You don't need the newest technology, you need the best technology.  I haven't seen enough on DDR4 to make me wish it was here any time sooner.  And, it's quite a ways off.


----------



## Super XP (Oct 3, 2010)

DDR3 is more than enough. I think a good set of DDR3-1866 with ultra-low timings is perfect. That should be enough for at least 2+ years of solid gaming, with a nice 4GB x 4 = 16GB.


----------



## bear jesus (Oct 3, 2010)

I'm still on 1066 MHz DDR2 at CAS 5 and it's still serving me very well. I will be happy if I can get around 2000 MHz DDR3 on my next board, and would be happy to wait the few years until DDR4 is in mass production.


----------



## JF-AMD (Oct 3, 2010)

Anyone that raced out to get DDR3 when it came out was treated to a pretty significant price premium, and the first rounds were at 800 MHz, maybe 1066 MHz, but definitely no 1333 MHz.  It took until the first process node change to get prices and speeds in line.

Memory is one area where being an early adopter rarely has a benefit.


----------



## largon (Oct 3, 2010)

As was the case with at least DDR2 and DDR3, one can reasonably expect that DDR4 will be worse than DDR3 at the start, but that doesn't invalidate my statement. It didn't take that long for DDR2 and DDR3 to clearly overcome DDR1 and DDR2, respectively. 

I'm not going to be among the first adopters... Hell, I'm still using DDR2, and _personally_ I don't see any compelling reason to go DDR3 until, of necessity, I do a platform overhaul sometime in 2011.


----------



## JF-AMD (Oct 3, 2010)

Then your statement should have been "_volume second-generation_ DDR4 can't come soon enough".


----------



## Neo4 (Oct 3, 2010)

largon said:


> As was the case with at least DDR2 and DDR3, one can reasonably expect that DDR4 will be worse than DDR3 at the start, but that doesn't invalidate my statement. It didn't take that long for DDR2 and DDR3 to clearly overcome DDR1 and DDR2, respectively.
> 
> I'm not going to be among the first adopters... Hell, I'm still using DDR2, and _personally_ I don't see any compelling reason to go DDR3 until, of necessity, I do a platform overhaul sometime in 2011.



Seriously, DDR4 is pie in the sky as far as I'm concerned. When I was younger I'd have been fired up about it, but not any more. The hardware performance numbers don't lie, and each and every memory architecture advancement has been a great big yawn. CPUs, GPUs and the new SSDs are where all the performance action is, and I can't wait for a year from now when I'll be able to jump on the Bulldozer platform.


----------



## HalfAHertz (Oct 4, 2010)

Interesting discussion. What I'd like to ask JF is whether AMD is going to introduce any Llano-based APUs for the server market. I'd guess OpenCL programmers would be pretty interested in those if they are competitive in GFLOPS/watt.


----------



## JF-AMD (Oct 4, 2010)

I actually cover that in my blog.  APUs for the server market might happen, but not in the near term.  There is a lot of work that has to happen on the software side first before we start embedding GPUs into CPUs.

Today customers want threads.  There is a definite need for GPGPU technology, but for now the speeds/sizes that customers want make them difficult to integrate into a CPU package.  There is also the issue of CPU:GPU ratio, which is different by application.


----------



## Super XP (Oct 5, 2010)

I agree the software needs to catch up but that is not stopping Intel.


----------



## JF-AMD (Oct 5, 2010)

That was a server statement that I made, not a client statement.


----------



## Neo4 (Nov 13, 2010)

Super XP said:


> DDR3 is more than enough. I think a good set of DDR3-1866 is perfect with ultra low timings. That should be enough for at least 2+ years for solid gaming with a nice 4GB x 4 = 16GB.



http://www.xbitlabs.com/articles/memory/display/phenom-ii-x6-ddr3-2000.html

Conclusion

The main thing we have discovered in today's tests is that DDR3-2000 SDRAM is indeed possible on Socket AM3 systems. We now know the prerequisites for that: 1) any Phenom II X6 processor, 2) any of the many mainboards based on AMD’s 800 series chipsets, and 3) specially optimized memory modules.

As you can see, the most difficult requirement is to get such optimized memory. We were lucky to have a dual-channel 4GB kit from G.Skill (F3-16000CL7D-4GBFLS) which proved to be capable of working as DDR3-2000 on our Socket AM3 testbed. This memory kit is not without downsides, of course. For example, the modules are rather large because of the cooling elements, but we don't want to find fault with them as there are almost no alternatives available on the market. If you want high-speed DDR3 for your overclocked Phenom II X6-based computer, we do recommend this memory kit from G.Skill.

Well, you shouldn’t be disappointed if you don’t find DDR3-2000 modules compatible with the Phenom II X6 as the performance benefits of such memory over DDR3-1600 only amount to 1-2% while memory kits like the G.Skill F3-16000CL7D-4GBFLS are some 50% more expensive. So, we are prone to regard the use of DDR3-2000 modules in an overclocked Socket AM3 system as a luxury rather than a necessity.

Although the optimized modules have no problems working with Phenom II X6 processors as DDR3-2000, there are obvious problems with AMD's memory controller in general. The highest memory frequency this controller permits is much lower than what you can get with Intel processors.

Hopefully, AMD will revise its memory controller so that the company’s upcoming Bulldozer and other architectures will work with high-speed memory without any limitations and reservations, especially as JEDEC-approved speeds of DDR3 SDRAM modules may go as high as 2000 and more megahertz in the very near future.


----------



## mastrdrver (Nov 14, 2010)

Yet again XBit Labs proves they don't understand AMD at all.

They also have another article showing there is negligible to no improvement when overclocking the CPU-NB. I think that speaks for itself.


Back to the topic............
Everyone see the little Bulldozer preview AMD did on their Youtube channel?


----------



## Wile E (Nov 14, 2010)

But they are right in claiming that running 2000 MHz RAM provides little, if any, benefit, regardless of platform. 1600 CAS 6 is better than 2000 CAS 8, for example. So the true value of running at 2000 MHz depends on both timings and price; the gains are small, with significant increases in cost at these levels.

But of course, CAS7 @ 2000 is better still, and some darn good sticks.


----------



## hat (Nov 14, 2010)

largon said:


> DDR4 can't come soon enough.
> 
> Miniaturization of CPUs, and as such IMCs, has brought problems regarding memory vDDQ voltages; remember the i7 doesn't run safe with RAM vDD > 1.65 V? You can't have huge voltage differences between the IMC and the rest of the core, or things between those two parts of the die go *poof*. Even ULV DDR3 running a 1.35 V vDDQ will conflict with CPU core vDDs sooner or later, and CPU vDDs are continuously going down... Anyway, more aggregate bandwidth never hurts, and with GPUs getting more and more integrated into the CPU, the industry will soon be screaming for faster RAM.



I thought it was the QPI voltage that had to be in line with the RAM voltage, not CPU core voltage.


----------



## Nick89 (Nov 14, 2010)

WhiteLotus said:


> Lower clock speeds but better math crunching abilities... interesting.
> 
> And am I alone in thinking these will be big chips? What with everything on them...



I'm hoping they will be either 32nm or 28nm.


----------



## Steevo (Nov 14, 2010)

Burn the Intel infidels. 


I am not upgrading again until either Bulldozer is real, and really competitive on price and performance, or it fails and I go back to Intel.


I need more video processing power for m2ts and 1080p with effects in Adobe and Pixela, and ATI has failed me on that front too. So green and blue might be my new colors if they don't pull their shit together by next spring.


----------



## TheMailMan78 (Nov 14, 2010)

Honestly I'm fine with DDR2 and low timings. My board can run DDR2 @1333. But currently I run at 1067. But look at my timings.


----------



## bear jesus (Nov 14, 2010)

TheMailMan78 said:


> Honestly I'm fine with DDR2 and low timings. My board can run DDR2 @1333. But currently I run at 1067. But look at my timings.



I kind of agree. One of the things making it easy for me to wait on DDR3 is the timings: when I move to DDR3 I want 8GB across 2 modules, and the best that's easily available to me is 2000 MHz at 9-10-9-27. I'm sure running it slower than 2 GHz would let me lower the timings, but coming from 5-5-5-15 DDR2 I would want at least CAS 6 or 7 with DDR3.

I hope in the coming months more memory will be released with lower timings.


----------



## CDdude55 (Nov 14, 2010)

bear jesus said:


> I kind of agree, one of the things making it easy for me to wait to go with dd3 is the timings, when i move to ddr3 i want 8gb across 2 modules and the best that's easily available to me is 2000mhz at 9-10-9-27, I'm sure running it slower than 2ghz would possibly let me lower the timings but going from 5-5-5-15 ddr2  i would kind of want at least cas 6 or 7 with ddr3.
> 
> I hope in the coming months more memory will be released with lower timings.



Ya, as always, the timings will get lower as time goes on. Currently I'm running 6GB of DDR3 at 1333 with timings of 7-7-7-20, which is a decent balance for me.


----------



## Neo4 (Nov 14, 2010)

CDdude55 said:


> Ya, as always the timings will get lower as time goes on. Currently im running 6GB DDR3 sticks at 1333 with timings of 7-7-7-20,which is a decent balance for me.



Won't get much better than what you have there unless you want to spend a ton of money with little performance improvement. My whole point all along is that RAM is RAM and you might as well get the most value because there's no point in getting high dollar stuff. Manufacturers could justify that cost if there was a real world improvement of at least 10% or more. I guess if you have deep pockets though...


----------



## CDdude55 (Nov 14, 2010)

Neo4 said:


> Won't get much better than what you have there unless you want to spend a ton of money with little performance improvement. My whole point all along is that RAM is RAM and you might as well get the most value because there's no point in getting high dollar stuff. Manufacturers could justify that cost if there was a real world improvement of at least 10% or more. I guess if you have deep pockets though...



It really depends. Each memory standard adds more bandwidth, speed, etc., and even if there isn't a significant boost in performance it's still going to be the standard on high-end platforms, so you pretty much have no choice in the uber high-end market. I definitely agree, though: get what you need for a good price, as currently you aren't going to see a big difference. Even for me, I could have easily stayed with DDR2 800 memory, but if I was going to upgrade anyway and had the money, I might as well splurge on the latest standard.


----------



## bear jesus (Nov 14, 2010)

CDdude55 said:


> Ya, as always the timings will get lower as time goes on. Currently im running 6GB DDR3 sticks at 1333 with timings of 7-7-7-20,which is a decent balance for me.



That reminds me, I'm sure I read somewhere that the latency with DDR3 is lower than with DDR2, so the timings are not really comparable; DDR3 CAS 7 is more like DDR2 CAS 6 or 5, so that is a good balance of size, speed and timings. I only want a dual-channel 8GB, CAS 6, 2 GHz DDR3 kit, as I'm a whore 

*edit*
With faster RAM, as far as I know, the only time you see a big improvement is when something needs more bandwidth than is available. I only want this much bandwidth so it can feed 6 or 8 cores and multiple virtual machines running 24/7 while carrying on all other normal usage.


----------



## TheMailMan78 (Nov 14, 2010)

I mean as far as I know the AM2, AM2+, AM3 don't even support tri-channel.


----------



## bear jesus (Nov 14, 2010)

TheMailMan78 said:


> I mean as far as I know the AM2, AM2+, AM3 don't even support tri-channel.



They don't, and from what I have been reading, desktop versions of Bulldozer will be dual channel as well.


----------



## InnocentCriminal (Nov 14, 2010)

I got all swallowed up when the details of the original Phenom started to circulate. The idea behind a completely native quad-core was and still is fantastic. I convinced myself that Phenoms were going to kick some serious ass, but unfortunately that didn't pan out. Learning from my experiences, I'm going to hold out until more information comes out on Bulldozer. 

I'm with Wile E on this one: I want the best components for my rig(s), and if Bulldozer can compete with Intel on both performance _and_ price, then it'll be my next purchase. I'm also hoping AMD have some fight in them, and if the deal with Apple is true, that'll be some much-needed revenue that'll hopefully give them the resources to make all the right moves, resulting in more competitive prices for the consumer... me!


----------



## LAN_deRf_HA (Nov 14, 2010)

Neo4 said:


> Won't get much better than what you have there unless you want to spend a ton of money with little performance improvement. My whole point all along is that RAM is RAM and you might as well get the most value because there's no point in getting high dollar stuff. Manufacturers could justify that cost if there was a real world improvement of at least 10% or more. I guess if you have deep pockets though...



That's why I think the unlocked uncore is the best feature of 1366. It lets you make cheap memory that only does tight timings at low speed perform like it's running at a much higher clock.


----------



## mastrdrver (Nov 15, 2010)

You all realize that DDR3 has lower latencies than DDR2, right?


----------



## bear jesus (Nov 15, 2010)

mastrdrver said:


> You all realize that DDR3 has lower latencies than DDR2, right?





bear jesus said:


> The latency with ddr3 is lower than with ddr2 so the timings are not really the same so ddr3 cas 7 is more like ddr2 cas 6 or 5



But I have no idea how much lower; I was just guessing.


----------



## CDdude55 (Nov 15, 2010)

mastrdrver said:


> You all realize that DDR3 has lower latencies than DDR2, right?



When did that happen? lol

Here's a DDR2 OCZ kit with CAS 5 timings: http://www.amazon.com/dp/B0017SA5ZY/?tag=tec06d-20

And here's the DDR3 version of that same kit with higher speed but still higher timings (CAS 7): http://www.amazon.com/dp/B0013HC36S/?tag=tec06d-20

Still fairly high from what I'm looking at; which DDR3 kits are lower?


----------



## bear jesus (Nov 15, 2010)

CDdude55 said:


> When did that happen? lol
> 
> Here's an DDR2 OCZ kit of memory with cas 5 timings: http://www.amazon.com/dp/B0017SA5ZY/?tag=tec06d-20
> 
> And here's the DDR3 version of that same kit with higher speed but still higher timings (cas 7): http://www.amazon.com/dp/B0013HC36S/?tag=tec06d-20



He means that CAS 5 on DDR3 has lower latency than CAS 5 on DDR2, as apparently the clock cycles are shorter, but I have not really read into it.


----------



## mastrdrver (Nov 15, 2010)

CDdude55 said:


> When did that happen? lol
> 
> Here's an DDR2 OCZ kit of memory with cas 5 timings: http://www.amazon.com/dp/B0017SA5ZY/?tag=tec06d-20
> 
> ...



The timings given for any DIMM are only meaningful relative to its clock period (in nanoseconds).

Timings != latency

DDR3-1600 CAS 9 has a lower latency than DDR3-1333 CAS 9 because the clock period of DDR3-1600 is 1.25 ns while DDR3-1333 is 1.50 ns, hence you get 11.25 ns for 1600 and 13.5 ns for 1333.

DDR2-667 CAS 4 has a latency of 12 ns because the clock period of the DIMM is 3 ns, more than double that of DDR3-1600.

Secrets of PC Memory

Read that and understand this:

DDR clock period (tCLK):
DDR 200 is 10.0ns
DDR 266 is 7.52ns
DDR 333 is 6.02ns
DDR 400 is 5.00ns

DDR2 clock period (tCLK):
DDR2 400 is 5.00ns
DDR2 533 is 3.76ns
DDR2 667 is 3.00ns
DDR2 800 is 2.50ns
DDR2 1066 is 1.876ns

DDR3 clock period (tCLK):
DDR3 1066 is 1.876ns
DDR3 1333 is 1.50ns
DDR3 1600 is 1.25ns
DDR3 1866 is 1.07ns
DDR3 2000 is 1.00ns

To find the *latency* of any timing on a DIMM, take the DIMM's tCLK and multiply it by the timing's listed cycle count.
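
That rule can be sketched in a few lines of Python (an illustrative helper of my own, not anything official; it just encodes tCLK = 2000 / data-rate-in-MT/s, rounded to the nominal values in the table above, and latency = tCLK × cycles):

```python
def tclk_ns(data_rate_mts):
    """Nominal clock period in ns for a DDR-family data rate in MT/s."""
    return round(2000.0 / data_rate_mts, 2)

def latency_ns(data_rate_mts, cycles):
    """Real latency in ns of a timing listed in clock cycles."""
    return tclk_ns(data_rate_mts) * cycles

print(latency_ns(1333, 9))  # DDR3-1333 CAS 9 -> 13.5 ns
print(latency_ns(1600, 9))  # DDR3-1600 CAS 9 -> 11.25 ns
print(latency_ns(667, 4))   # DDR2-667  CAS 4 -> 12.0 ns
```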


----------



## CDdude55 (Nov 15, 2010)

Very good info guys, thanks.


----------



## TheMailMan78 (Nov 15, 2010)

mastrdrver said:


> The timings given by any dimm is only relative to its speed (in nanoseconds).
> 
> Timings =! latency
> 
> ...



Going by this, and the fact that an AMD platform is only dual channel, going from DDR2 to DDR3 should have very little benefit. Am I correct?


----------



## Wile E (Nov 15, 2010)

AMD gets a small boost from DDR3.

DDR3 is worth it to me just because it is getting cheap, and it runs a lot cooler.


----------



## Deleted member 67555 (Nov 15, 2010)

I guess if you consider 7-12% overall small..

Sure, some programs won't see any improvement at all.

But to be honest and fair, it's better to increase your storage transfer rates before your memory.


----------



## Wile E (Nov 15, 2010)

jmcslob said:


> I guess if you consider 7-12% overall small..
> 
> Sure some programs wont even have improvement at all.
> 
> But tbh an fair it's better to increase your storage transfer rates before your memory.



7-12% in synthetics maybe. Not nearly that much in real world apps.


----------



## mastrdrver (Nov 16, 2010)

One of the reasons (there are others too) for going to DDR3 (if I understand what I've read correctly) is that 4GB is the limit for DDR2 DIMMs because of the way data is retrieved. With DDR it was 2GB.

Again, I'm not 100% sure on that part, but it's what I've come to understand from my reading. I'm also not 100% sure why the way data is retrieved has anything to do with the size limit of the DIMM.

If anyone wants to make their mind bleed trying to understand memory, read: Everything you always wanted to know about sdram memory but were afraid to ask on Anandtech. Their article ASUS ROG Rampage Formula: Why we were wrong about the Intel X48 may need reading first to be able to follow along in the other.

I got to about page 4 before I said f- this, and then got saved on page 5 by the Youtube video. I kind of skimmed from there because it was stretching my mind trying to follow along.


----------



## lane (Nov 16, 2010)

bear jesus said:


> I kind of agree, one of the things making it easy for me to wait to go with dd3 is the timings, when i move to ddr3 i want 8gb across 2 modules and the best that's easily available to me is 2000mhz at 9-10-9-27, I'm sure running it slower than 2ghz would possibly let me lower the timings but going from 5-5-5-15 ddr2  i would kind of want at least cas 6 or 7 with ddr3.
> 
> I hope in the coming months more memory will be released with lower timings.



I have lower memory access latency (in ns, per AIDA64) with 1600 MHz modules @ 9-9-9-24 than I had with ultra-low-latency DDR2 (Micron D9).

Now I use 1600 MHz 6-8-6-24 DDR3 modules in triple channel, and they keep CAS 6 until 1800 MHz, and CAS 7-8 @ 2000 MHz. 

You can't compare latencies between different types of memory (DDR1, 2 or 3) using CAS; it's a false idea.

To give you an idea: SiSoft Sandra gives me 29 GB/s bandwidth with my DDR3 @ 1600 MHz (CPU @ 200x20). How much do you get with DDR2? 15 GB/s? (This has nothing to do with latencies; it's just for information.)


----------



## mastrdrver (Nov 16, 2010)

Even worse, the wiki articles on DDR/DDR2/DDR3 confuse memory latency with timings, claiming that higher timings mean higher latencies. :shadedshu


----------



## Wile E (Nov 17, 2010)

mastrdrver said:


> Even worse is that the wiki articles on ddr/ddr2/ddr3 confuses memory latency with timings *making the claim that higher timings means higher latencies.* :shadedshu



Well, if all else is equal, it does.


----------



## mastrdrver (Nov 17, 2010)

Problem is they rarely are.

Best quote:


> While the typical latencies for a JEDEC DDR2 device were 5-5-5-15, the standard latencies for the JEDEC DDR3 devices are 7-7-7-20 for DDR3-1066 and 7-7-7-24 for DDR3-1333.



The DDR3 article mixes up latency with the DIMM clock cycles (tCLKmin) needed to complete an action (CAS, RAS, etc.). That would only be true if tCLKmin were equal to 1 ns, which only happens at 2000 MHz for DDR3. Just two paragraphs later they get it right:



> As with earlier memory generations, faster DDR3 memory became available after the release of the initial versions. DDR3-2000 memory with 9-9-9-28 latency (9 ns) was available in time to coincide with the Intel Core i7 release.[7]  CAS latency of 9 at 1000 MHz (DDR3-2000) is 9 ns, while CAS latency of 7 at 667 MHz (DDR3-1333) is 10.5 ns.



The first part should say "typical timings" instead of "typical latencies", because it makes it sound like 5-5-5-15 is the latency, but it's not. It's just the number of cycles needed to complete each action at the given speed (i.e. 800 MHz for DDR2). The latency changes once the speed goes to 1066 MHz for DDR2, and it would be a mistake to say 5-5-5-15 equals the latencies.

Example:

DDR2 timings of 5-5-5-15 at
-------------------------------------------------
1066 MHz is about 9.4 -  9.4 -  9.4 -  28.1 (in nanoseconds)
800 MHz is about 12.5 - 12.5 - 12.5 - 37.5 (in nanoseconds)

DDR3 timings of 7-7-7-20 at
-----------------------------------------------------
1066 MHz is about 13.1 - 13.1 - 13.1 - 37.5 (in nanoseconds)
1333 MHz is about 10.5 - 10.5 - 10.5 - 30 (in nanoseconds)

This also shows that DDR2-1066 at those timings actually has lower latency than DDR3-1333, and that unless you have some bandwidth-sensitive real-world application there is no real gain in going to 2000 MHz DDR3 CAS 9 DIMMs, since at 9 ns their CAS latency is almost the same as the slower kits'. The difference only shows up if the program moves a lot of data in and out of memory; otherwise it's just wasted (as in most cases). This is also why lowering the refresh interval (tREF) makes the system "feel" fast and more responsive: the system waits less for data to be refreshed, and small changes show up quicker.
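
The same arithmetic can be reproduced in a couple of lines of Python (my own illustrative helper, using the nominal tCLK values from the table earlier in the thread):

```python
def timings_in_ns(data_rate_mts, timings):
    """Convert cycle-count timings to nanoseconds at a given data rate (MT/s)."""
    tclk = round(2000.0 / data_rate_mts, 2)  # nominal clock period in ns
    return [round(tclk * t, 1) for t in timings]

print(timings_in_ns(800, (5, 5, 5, 15)))   # DDR2-800  5-5-5-15 -> 12.5 ns CAS
print(timings_in_ns(1333, (7, 7, 7, 20)))  # DDR3-1333 7-7-7-20 -> 10.5 ns CAS
print(timings_in_ns(2000, (9, 9, 9, 28)))  # DDR3-2000 9-9-9-28 ->  9.0 ns CAS
```

Which makes the point numerically: the CAS 9 kit at 2000 MHz actually has a slightly lower CAS latency in nanoseconds than the CAS 7 kit at 1333 MHz.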


----------



## Wile E (Nov 17, 2010)

mastrdrver said:


> Problem is they rarely are.
> 
> Best quote:
> 
> ...



I understand that. Was essentially just pointing out that your statement was a little vague.


----------



## mastrdrver (Nov 17, 2010)

Ok yea I see what you were saying. Man, I was failing pretty hard last night at reading comprehension.

If anything maybe that last post adds something I missed in the other one.


----------



## nt300 (Nov 19, 2010)

What about quad-channel DDR3-1866 for Bulldozer? Why all the talk about dual channel?


----------



## bear jesus (Nov 19, 2010)

nt300 said:


> What about quad-channel DDR3-1866 for Bulldozer? Why all the talk about dual channel?



From what I have read, the desktop Bulldozer chips will be dual channel, and only the socket G34 server parts will have quad-channel memory.


----------



## JF-AMD (Nov 19, 2010)

G34 is quad channel 1600.


----------



## bear jesus (Nov 19, 2010)

JF-AMD said:


> G34 is quad channel 1600.



Thanks for pointing that out; I admit I did not have a clue what speed it was.

I don't know what that equates to in GB/s, but I'm sure quad channel at 1600 MHz will give plenty of bandwidth. I just hope the desktop and server Bulldozer chips do well against what Intel will be offering, at least in price/performance ratio.
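
The rough math is easy enough: each DDR3 channel is 64 bits (8 bytes) wide, so the theoretical peak is the transfer rate times 8 bytes times the channel count. A back-of-envelope sketch (my own helper, illustrative only; real-world throughput sits well below this peak):

```python
def peak_bandwidth_gb_s(data_rate_mts, channels):
    """Theoretical peak in GB/s: MT/s * 8 bytes per transfer * channels."""
    return data_rate_mts * 8 * channels / 1000.0

print(peak_bandwidth_gb_s(1600, 4))  # quad-channel DDR3-1600 -> 51.2 GB/s
print(peak_bandwidth_gb_s(1600, 2))  # dual-channel DDR3-1600 -> 25.6 GB/s
```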


----------



## Super XP (Nov 20, 2010)

bear jesus said:


> From what i have read the desktop bulldozer cores will be dual channel and only the socket G34 server bulldozer cores will have quad channel memory.


That defeats the purpose of Bulldozer excelling in memory performance. I heard it was quad-channel to feed those 8 cores.


----------



## pantherx12 (Nov 20, 2010)

Super XP said:


> That defeats Bulldozer's purpose to excell in Memory performance. I heard it was Quad-Channel to feed those 8 cores.



It has plenty of bandwidth for desktop use.

(according to JF, anyway)


----------



## TAViX (Nov 20, 2010)

What about the hexa-channel DDR3 thing AMD kept talking about?


----------



## CDdude55 (Nov 20, 2010)

JF said quad channel is going to be implemented on the server side of things. Desktop users will probably still be on dual channel, but with a reworked IMC for better bandwidth.


----------

