# AMD Zambezi "Bulldozer" Desktop CPU Roadmap Revealed



## btarunr (Nov 18, 2010)

AMD's next-generation PC processor architecture, codenamed "Bulldozer", which seeks to challenge Intel's best, is set to make its desktop PC debut in Q2 next year with a desktop processor die codenamed "Zambezi". AMD is targeting all market segments, including an enthusiast-grade 8-core segment, a performance 6-core segment, and a mainstream 4-core segment. The roadmap reveals that Zambezi will make its entry with the enthusiast-grade 8-core models first, starting with 125 W and 95 W models, trailed by 6-core and 4-core ones. 

A couple of other architectural details revealed: Zambezi's integrated memory controller (IMC) supports DDR3-1866 as its standard memory type, just as Deneb supports DDR3-1333 as its standard. DDR3-1866, or PC3-14900 as it's technically known, will churn out 29.8 GB/s in dual-channel mode; that's higher than triple-channel DDR3-1066 (25.6 GB/s), which is the official memory standard of Intel's Core i7 LGA1366 processors. The 8-core and 6-core Zambezi models feature 8 MB of L3 cache, while the 4-core ones feature 4 MB. Another tidbit you probably already knew is that existing socket AM3 processors are forward-compatible with AM3+ (Zambezi's socket), but Zambezi processors won't work on older AM3/AM2(+) socket motherboards.
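The bandwidth figures above follow from the standard DDR3 arithmetic: transfer rate in MT/s, times the 8-byte (64-bit) bus per channel, times the channel count. A quick sketch in Python (the function name is ours, and the figures are decimal GB, which is how the post rounds them):

```python
def ddr3_bandwidth_gb_s(megatransfers_per_s, channels):
    """Peak theoretical bandwidth in decimal GB/s:
    MT/s x 8 bytes per transfer per channel x channel count."""
    return megatransfers_per_s * 8 * channels / 1000

dual_ddr3_1866 = ddr3_bandwidth_gb_s(1866.67, 2)    # ~29.9 GB/s, the article rounds to 29.8
triple_ddr3_1066 = ddr3_bandwidth_gb_s(1066.67, 3)  # ~25.6 GB/s, LGA1366's official spec
```

The point of the comparison: two channels of DDR3-1866 narrowly out-run three channels of DDR3-1066.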





*View at TechPowerUp Main Site*


----------



## 1Kurgan1 (Nov 18, 2010)

Impressed that the 8 cores will be the ones landing first, wonder what prices look like.

I think I'll be getting a mobo first then maybe a proc, we'll see though.


----------



## Fourstaff (Nov 18, 2010)

Release AM3+ boards pls thx bai. And please get the Bulldozer out earlier than Q3, because intel's Sandy Bridge is going to make Bulldozer look like Nvidia's GTX480 when it was released.


----------



## pantherx12 (Nov 18, 2010)

+ one to release boards now.

Then I can get am3 cpu to tide me over : ]


----------



## btarunr (Nov 18, 2010)

Fourstaff said:


> because intel's Sandy Bridge is going to make Bulldozer look like Nvidia's GTX480 when it was released.



There's no way of saying that right now.


----------



## Imsochobo (Nov 18, 2010)

btarunr said:


> There's no way of saying that right now.



i've heard otherwise in the server market.
that amd is the one who will be doing a proper smack!


----------



## (FIH) The Don (Nov 18, 2010)

btarunr said:


> Another tidbit you probably already knew is that existing socket AM3 processors are forward-compatible with AM3+ (Zambezi's socket), but Zambezi processors won't work on older AM3/AM2(+) socket motherboards.



the best thing about AMD


----------



## [H]@RD5TUFF (Nov 18, 2010)

More cache PLZ DAMMIT!

Also meh.

I do not think these will be the saving grace AMD needs.


----------



## pantherx12 (Nov 18, 2010)

[H]@RD5TUFF said:


> More cache PLZ DAMMIT!
> 
> Also meh.
> 
> I do not think these will be the saving grace AMD needs.





Cache has massive diminishing returns .


----------



## 1Kurgan1 (Nov 18, 2010)

[H]@RD5TUFF said:


> More cache PLZ DAMMIT!
> 
> Also meh.
> 
> I do not think these will be the saving grace AMD needs.



Why do they need a saving grace? No, of course they aren't going to push as much out to the market as Intel, but they never have. Ever since the Phenom IIs came out, and especially since the 6-cores landed at great prices, AMD has been doing fine.


----------



## [H]@RD5TUFF (Nov 18, 2010)

pantherx12 said:


> Cache has massive diminishing returns .



I disagree; more cache extends the useful lifespan of the processor.



1Kurgan1 said:


> Why do they need a saving grace? No they of course aren't going to push as much out to the market as Intel, but they never have. The PII's, especially since the 6 cores landed at great prices, infact, since PII's in general came out, AMD's been doing fine.



Doing fine isn't the same as competing. The best part of ATI (now AMD) stepping up its game was that prices came down. That's why I want AMD to be a real competitor, and not just the go-to for budget, because I want more powerful chips at cheaper prices.


----------



## Yellow&Nerdy? (Nov 18, 2010)

This might be a dumb question, but does the 8-core have 4 "Bulldozer cores" or 8? Because a "Bulldozer core" is two conventional cores stitched together.

Next year will be interesting with Sandy Bridge, Bulldozer, Kepler and Southern Islands.


----------



## Melvis (Nov 18, 2010)

That is some impressive dual-channel performance; they weren't joking when they said they could get it to outperform (or at least match) triple-channel. Very nice.


----------



## blibba (Nov 18, 2010)

Yellow&Nerdy? said:


> This might be a dumb question, but does the 8-core have 4 "Bulldozer cores" or 8? Because a "Bulldozer core" is two conventional cores stitched together.
> 
> Next year will be interesting with Sandy Bridge, Bulldozer, Kepler and Southern Islands.



I'm also confused by how they're judging core count on the new system.


----------



## 1Kurgan1 (Nov 18, 2010)

[H]@RD5TUFF said:


> Doing fine isn't the same as competing. The best part of ATI (now AMD) stepping up its game was that prices came down. That's why I want AMD to be a real competitor, and not just the go-to for budget, because I want more powerful chips at cheaper prices.



Like I said before, AMD has never sold in the quantities that Intel has; they were much later to the game, don't have the budget, and maybe never will. Hoping for that is like wishing for the moon, well, maybe not quite that bad.


----------



## pantherx12 (Nov 18, 2010)

blibba said:


> I'm also confused by how they're judging core count on the new system.




An 8-core is 4 Bulldozer modules. 

Unless they've changed how they're doing things in the past 6 months.


----------



## HalfAHertz (Nov 18, 2010)

Yellow&Nerdy? said:


> This might be a dumb question, but does the 8-core have 4 "Bulldozer cores" or 8? Because a "Bulldozer core" is two conventional cores stitched together.
> 
> Next year will be interesting with Sandy Bridge, Bulldozer, Kepler and Southern Islands.



It has 4 modules with 2 cores each, for a total of 8.


----------



## meirb111 (Nov 18, 2010)

*I was thinking the TDP would be lower, but it's not.*

TDP is the same as Phenom II: 125 W / 95 W.


----------



## NdMk2o1o (Nov 18, 2010)

meirb111 said:


> tdp is the same as phenom II 125w-95w



Oh no, their new 8-core uses the same amount of power as their last 4-core. 95-125 W is standard nowadays for any x4+ processor. 

Do we have confirmation that they have something similar to Hyper-Threading, and how it actually works?


----------



## pantherx12 (Nov 18, 2010)

NdMk2o1o said:


> Oh no, their new 8-core uses the same amount of power as their last 4-core. 95-125 W is standard nowadays for any x4+ processor.
> 
> Do we have confirmation that they have something similar to Hyper-Threading, and how it actually works?





As far as I know, there's no Hyper-Threading-like technology.

However, each Bulldozer module (2 cores in the OS) can fully process two threads at the same time.


----------



## NdMk2o1o (Nov 18, 2010)

pantherx12 said:


> As far as I know, there's no Hyper-Threading-like technology.
> 
> However, each Bulldozer module (2 cores in the OS) can fully process two threads at the same time.



Soooooo is one Bulldozer "module" just a dual-core chip? I seem to recall one of AMD's criticisms of Intel's quads was that they weren't "true" quad-cores like Phenom....


----------



## FishHead69 (Nov 18, 2010)

*Hail All Glory to Octocore*


----------



## Fourstaff (Nov 18, 2010)

btarunr said:


> There's no way of saying that right now.



True, but then again, we have seen previews of Sandy Bridge and it's going to be a tall order for Bulldozer to "bulldoze" Intel. Pricing may be the saviour this time round, though. History has shown that a new architecture never works as well as intended (R600, P4 HT, Fermi, Phenom I, etc.), so I am not going to get my hopes up. The Zacate previews look good, I must admit.

Edit: can 1 Bulldozer module handle 1 thread, or will 50% of the module sit idle?


----------



## JF-AMD (Nov 18, 2010)

Modules contain two integer cores.  Each integer core has its own integer pipeline, so they can simultaneously execute 2 threads (unlike hyperthreading which can handle 2 threads, but has only 1 set of integer pipelines, so it really only executes one thread at a time.)

We will not market modules, we will only market cores; modules are how the designers lay out the processor, but that will not be part of the marketing or naming.
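The distinction JF-AMD draws can be put in back-of-envelope terms. This sketch is purely illustrative (the function is ours, the unit counts come from this thread): a Bulldozer module pairs two integer cores with a set of integer pipelines each, while an SMT core exposes two threads over a single set of integer pipelines.

```python
def simultaneous_integer_threads(units, integer_pipelines_per_unit):
    """Threads that can be executing integer work at the same instant,
    as opposed to merely being resident on the hardware."""
    return units * integer_pipelines_per_unit

# 8-core Zambezi: 4 modules, each with 2 integer cores -> 8 concurrent threads
zambezi_8_core = simultaneous_integer_threads(4, 2)

# A 4-core SMT chip: 8 hardware threads visible to the OS, but each core
# has only 1 set of integer pipelines -> 4 truly concurrent integer threads
smt_quad = simultaneous_integer_threads(4, 1)
```

In other words, both designs can look like 8 logical processors to the OS, but only the module design executes 8 integer threads simultaneously.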


----------



## Fourstaff (Nov 18, 2010)

JF-AMD said:


> Modules contain two integer cores.  Each integer core has its own integer pipeline, so they can simultaneously execute 2 threads (unlike hyperthreading which can handle 2 threads, but has only 1 set of integer pipelines, so it really only executes one thread at a time.)
> 
> We will not market modules, we will only market cores; modules are how the designers lay out the processor, but that will not be part of the marketing or naming.



So if I have 1 thread, then I can only use 1 integer pipeline?


----------



## NdMk2o1o (Nov 18, 2010)

Fourstaff said:


> History has shown that a new architecture never works as well as intended (R600, P4 HT, Fermi, Phenom I, etc.)



Wow, you just completely disregarded Athlon and Core. The AMD 5-series was said to be a "new architecture", as was the i7 (although, regardless, they are all built on previous technologies of a sort).


----------



## Fourstaff (Nov 18, 2010)

NdMk2o1o said:


> Wow, you just completely disregarded Athlon and Core. The AMD 5-series was said to be a "new architecture", as was the i7 (although, regardless, they are all built on previous technologies of a sort).



That is true, but then again, that brings the tally to 50% at best. Still not impressive enough. If you are referring to AMD's ATI 5xxx series of graphics cards, it's just another step in the evolution of the R600; they did not design it from scratch. And from what I remember, the i7 is derived from the server processors. So, all in all, it's just the Core 2 and Core ix processors that have been done right in the last few years.


----------



## bear jesus (Nov 18, 2010)

Great to see that 1866 MHz is the standard for the new cores, as that should hopefully mean getting over 2 GHz will be much easier than on the current-generation AMD CPUs.

I really can't wait to see how they do against Sandy Bridge, even if I'm unsure I can wait long enough. Worst case is I buy something from Sandy Bridge and find out I should sell it and go with a Bulldozer core... at least I could keep the RAM.


----------



## Tatty_One (Nov 18, 2010)

I intend to upgrade to this next year when released, probably just a 6 core offering, I am very pleased with the system I have now but I have had it for 18 months and it's been far too long since I used AMD so I need a change, things look good so far.

I will be interested, however, in how their 6-core offerings compete with the current Intel Gulftown models, because if they are close, it's fairly certain that Intel will remain competitive with their future releases.


----------



## TheMailMan78 (Nov 18, 2010)

No new socket? Well that just sucks. This is starting to smell of fail.


----------



## Lionheart (Nov 18, 2010)

I just jizzed in my pants, is that normal


----------



## Frick (Nov 18, 2010)

TheMailMan78 said:


> No new socket? Well that just sucks. This is starting to smell of fail.



Why?


----------



## btarunr (Nov 18, 2010)

Fourstaff said:


> True, but then again, we have seen previews of Sandy Bridge and it's going to be a tall order for Bulldozer to "bulldoze" Intel.



There is potential for AMD to do that, it has done that in the past. Think of it as the Ashes, with Intel being the Aussie team.


----------



## Lionheart (Nov 18, 2010)

btarunr said:


> There is potential for AMD to do that, it has done that in the past. Think of it as the Ashes, with Intel being the Aussie team.



hahahahaha nice one, Aussie team cry too much


----------



## Fourstaff (Nov 18, 2010)

btarunr said:


> There is potential for AMD to do that, it has done that in the past. Think of it as the Ashes, with Intel being the Aussie team.



Sincerely hope that's the case, competition is always everybody's friend. I just don't have much faith in AMD right now.


----------



## bear jesus (Nov 18, 2010)

TheMailMan78 said:


> No new socket? Well that just sucks. This is starting to smell of fail.


That's a lie


btarunr said:


> existing socket AM3 processors are forwards-compatible with *AM3+ (Zambezi's socket)*



AM3+ is a new socket (yes, I do know what you meant though)


----------



## 1Kurgan1 (Nov 18, 2010)

Fourstaff said:


> True, but then again, we have seen previews of Sandy Bridge and it's going to be a tall order for Bulldozer to "bulldoze" Intel. Pricing may be the saviour this time round, though. History has shown that a new architecture never works as well as intended (R600, P4 HT, Fermi, Phenom I, etc.), so I am not going to get my hopes up. The Zacate previews look good, I must admit.
> 
> Edit: can 1 Bulldozer module handle 1 thread, or will 50% of the module sit idle?



Pricing is what matters here, it's like the 980X to 1095t comparisons. It's obviously not at all a fair comparison, to either chip as the audiences are far different, the 980X is faster, and the 1095t is easier to obtain. And with this being a tech site, even here 980X's are rare, I mean don't get me wrong, I like to look at the top dog reviews for all of the best hardware out there, but I can't think of anytime in the foreseeable future where I would be able to justify a $1000 processor. I'd rather see them in the $200 - $350 range, and able to do everything great, like my 1055t, I love the thing and I got it day 1 for $150.



JF-AMD said:


> Modules contain two integer cores.  Each integer core has its own integer pipeline, so they can simultaneously execute 2 threads (unlike hyperthreading which can handle 2 threads, but has only 1 set of integer pipelines, so it really only executes one thread at a time.)
> 
> We will not market modules, we will only market cores; modules are how the designers lay out the processor, but that will not be part of the marketing or naming.



 It seems we have a lurker, right from AMD, nice to see


----------



## JF-AMD (Nov 18, 2010)

Fourstaff said:


> So if I have 1 thread, then I can only use 1 integer pipeline?



Correct.  Just like every other architecture out there. 

Splitting a thread over multiple integer cores would create a scheduling nightmare.  You'd spend so much time going back and forth over the two cores that you would lose cycles and efficiency.

Imagine this scenario:

A+B=C
C*4=D
D+E=F
F*G=H
H+I=J

It is all incredibly linear.  You could take A+B and run it on one core, but the second core handling that thread would have to wait to get C, so you lose a cycle for that second core. Then you do C*4 on the second cycle, but that second core is still sitting idle.  And you probably lose a cycle of overhead now because the second core is asking, "do you have anything for me?"  The checksum to sync the two cores takes a cycle, so now you are at 3 cycles, you have only run 2 instructions on the 2 cores, and one has sat idle 100% of the time.

Obviously you can see where this goes: nowhere, fast. The only place it would work is if a thread was extremely parallel, but there would still be some checksum/synchronization that would eat up overhead and create latency/inefficiency.

So then you think, well, if we have inefficiency and pipelines sitting idle, why not use SMT to load multiple threads on a single core and take advantage of the pipeline gaps/stalls?

Then you are breaking up all of the threads across multiple cores, creating gaps in pipelines and then trying to load multiple threads to fill in those gaps.  Definitely not efficient.
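The linear chain in the post above can be written out directly. The input values below are arbitrary (they are not from the post), but the structure is the point: every step reads the previous step's result, so there is never a second instruction ready to hand to a second core.

```python
# Arbitrary inputs for the five-step dependency chain described above.
A, B, E, G, I = 1, 2, 3, 4, 5

C = A + B   # step 1; nothing else can start until C exists
D = C * 4   # step 2 depends on step 1
F = D + E   # step 3 depends on step 2
H = F * G   # step 4 depends on step 3
J = H + I   # step 5 depends on step 4
```

Five instructions, five strictly serial steps: splitting this one thread across two integer cores removes no cycles and only adds synchronization overhead, which is the scheduling nightmare being described.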



1Kurgan1 said:


> It seems we have a lurker, right from AMD, nice to see



I am not a lurker, I contribute


----------



## cadaveca (Nov 18, 2010)

I'll be back in this thread in April, 6 months from now, when these cpus launch.

Toodles!





EDIT: yeah, I'll give you that, JF-AMD, you are always on top of the Bulldozer threads, not quite lurking, and I personally appreciate the info.


But ya know, I'll take the contribution of a board and a few CPUs...


----------



## JF-AMD (Nov 18, 2010)

I have stacks of processors on my desk right now.  But if I start giving them away, I'll get hammered with requests.  Plus, they are server processors.


----------



## 1Kurgan1 (Nov 18, 2010)

JF-AMD said:


> I am not a lurker, I contribute



I meant lurker in a good way; you're to AMD as Batman is to Gotham. Seldom seen, but banishing evil when you are. It's just really nice to see some sort of representation from a large corporation actually taking part in our forums; nice to know you guys do care what we loonies think.


----------



## cadaveca (Nov 18, 2010)

JF-AMD said:


> I have stacks of processors on my desk right now.  But if I start giving them away, I'll get hammered with requests.  Plus, they are server processors



Heh. Well, when you want an unbiased opinion of those CPUs, you know where to send them. 

I personally don't care if it's server or desktop... I just need a board with a PCI-E 16x slot. Those Tyan G34 boards with 4x PCI-E slots seem right up my alley, actually.

It's really the entry price that keeps most of us away from server-based products, even though they seem to be built for a far greater workload. For me, really, it's the memory pricing. 

Anyhow, crunchers need cores.


But if you find some of my posts from 5-6 years ago, you'll see that I was posting about every house having a server, and then using "thin" clients to access that server and its grunt. If that server with 4x VGAs could run 4x games... served to various parts of my house, man, I'm in. I know the grunt is there... it could have 24 cores and 4x VGAs...

I've got 4 kids, so I need 6 PCs. But 6 full-size PCs generate a lot of heat, and suck back a lot of power, too.

The house is already wired for it, too. All I need is to dump a server into the mechanical room; the ethernet and such is all there and waiting.



Oh, and I'm special, you know. Just because you help me out doesn't mean others are special too...

Don't let the fact that I'm special in the head deter you.  Heck, I knew who you were before anyone else here. LOL.


----------



## JF-AMD (Nov 18, 2010)

Wow, never heard it explained that way.  I am just selfish.  When people post rumors and lies about my products I can either stamp it out quickly online or spend the next 3 months fielding emails from the press and sales guys where I have to refute the same stuff over and over and over.


----------



## cadaveca (Nov 18, 2010)

JF-AMD said:


> Wow, never heard it explained that way.  I am just selfish.  When people post rumors and lies about my products I can either stamp it out quickly online or spend the next 3 months fielding emails from the press and sales guys where I have to refute the same stuff over and over and over.



We appreciate it. Really. I personally think you're doing the right thing. 


Now, where's my cpus?


----------



## Steevo (Nov 18, 2010)

I need a better-clocking chip for my water loop, so I can send my 940 to my parents (they are heating their house with an even crappier 9850). I only use AMD systems after dealing with underpowered, stuttering Intel systems that I have to put video cards in anyway.

Plus ATI raped me on an X1800XT, big time, and my ass still hurts; I sold it for a work Intel build for $50 a short while later. Then I couldn't even fold on it.


----------



## bear jesus (Nov 18, 2010)

Steevo said:


> I need a better-clocking chip for my water loop, so I can send my 940 to my parents (they are heating their house with an even crappier 9850). I only use AMD systems after dealing with underpowered, stuttering Intel systems that I have to put video cards in anyway.
> 
> Plus ATI raped me on an X1800XT, big time, and my ass still hurts; I sold it for a work Intel build for $50 a short while later. Then I couldn't even fold on it.



Sounds like you have had some pretty bad luck. 

I admit if the 8 core bulldozers (or something from sandy bridge) overclock well on water it will be enough to push me to my first custom loop.
I have been loving the H50 but it's hardly water cooling compared to a custom loop with a nice fat 480mm rad


----------



## Steevo (Nov 18, 2010)

3.71 isn't bad, really. My parents' 4xxx-series card was on par with my old 4850 until memory became the factor, but they could use the extra CPU horsepower, and I could use it for rendering video. Last night a CCC video conversion took 40 minutes because it wasn't running on the GPU, but on the CPU. 

The poor support and broken-promises type of continued fucking is starting to piss me off. Unless AMD comes out with something great, I will have an Intel/Nvidia machine: Adobe acceleration, faster CPU-based rendering, better memory support. I spent money again and again, and each time the promises ATI/AMD make are broken. CCC GPU-based encoding? Sure, for a few formats that there are freeware converters everywhere for, so I can convert that already-small MPG for my phone. How about something hard, like the ability to hardware-render M2TS from my Canon? Nope, gotta run that on the CPU. Oh yeah, it's slower than two-series-old Intel chips. That is unless you want to purchase this software, and oh, it only works on some things, so you still need all these other codecs that don't work together and cause issues. 


You want to accelerate videos and upscale them with the hardware you purchased? That costs extra.


----------



## bear jesus (Nov 18, 2010)

Steevo said:


> 3.71 isn't bad really. My parents 4XXX series card was on par with my old 4850 untill memory became the factor. but they could use the extra CPU horsepower, and I could use it rendering video. Last night a "CCC video took 40 minutes as it wasn't running on the GPU, but on the CPU."
> 
> The poor support, and broken promises type of continued fucking is starting to piss me off. Unless AMD comes out with something great I will have a Intel/Nvidia machine, Adobe acceleration, faster CPU based rendering, better memory support. I spent money again and again and each time the promises ATI/AMD make are broken, CCC GPU based encoding? Sure for a few formats that there are freeware converters everywhere for, so I can convert that already small mpg onto my phone. How about something hard like the ability to hardware render M2TS from my canon? Nope, gotta run that on the CPU. Oh yeah, its slower than two series old Intel chips. That is unless you want to purchase this software, and oh, it only works on some things, so you still need all these other codecs that don't work together and cause issues.
> 
> ...



It seems like no matter what you do, every company will find a way to accidentally screw you over. I admit the only reason I can be so happy with AMD is that I don't really do that much with my computer; right now some gaming is the only thing that puts any of my hardware to use, and the rest of the time it's just watching video and poking around the net.

I really hope that whatever brands your next upgrade involves, things work out better than they have been for you.


----------



## PVTCaboose1337 (Nov 18, 2010)

Here I made this:

We can only hope...


----------



## OneCool (Nov 18, 2010)

JF-AMD said:


> I have stacks of processors on my desk right now.  But if I start giving them away, I'll get hammered with requests.  Plus, they are server processors



Aw maaan!!! 


I wanna build a server.


----------



## HillBeast (Nov 19, 2010)

I don't care if people call me a troll, but I'm just gonna say this: Fusion looks fail.

Looking at the preliminary reviews of Brazos on AnandTech, it looks like AMD has spent the last 5 or 6 years bragging and not actually doing any work. All this is is an HD5000/HD6000 GPU strapped to an underpowered and poorly designed CPU. If that's all Fusion is meant to be, then sorry mate, but I'm going Intel. I have been waiting 5 years for Fusion, and seeing it's not remotely what they were cracking it up to be, it just seems like AMD made a processor that would have been good 5 years ago but they held it back for far too long.

This would have been a good fight against Nehalem, but Sandy Bridge? Ha. And don't go on at me about Intel GMA being a piece of fail, because obviously you haven't read the reviews of the early Sandy Bridge: that thing takes on low-end HD5000 series cards, and those things aren't too bad.

AMD, just release Fusion already so we can see it fail. It's clearly ready for mass market and has been for quite some time.


----------



## NdMk2o1o (Nov 19, 2010)

HillBeast said:


> I don't care if people call me a troll, but I'm just gonna say this: Fusion looks fail.



Be careful what you say, dude, you're being watched lol. I am an Intel spy claiming to run AMD 

On a serious note, I really do hope Bulldozer brings back the old Athlon days; the AMD Athlon kicked the P4's butt all over the show and it was great. Unfortunately, since then AMD hasn't really been able to compete with C2, C2Q and i7, though I believe it will happen, if not with Bulldozer then soon after. 

Which is a good thing for all of us, cause it means kick-ass chips at the best prices, which is a win for everyone. Looking forward to Bulldozer personally!!


----------



## LAN_deRf_HA (Nov 19, 2010)

btarunr said:


> will churn out 29.8 GB/s in dual-channel mode



Sounds like Sandy Bridge. The MaxxMem thread may get a bit more competitive.


----------



## JF-AMD (Nov 19, 2010)

HillBeast said:


> I don't care if people call me a troll, but I'm just gonna say this: Fusion looks fail.
> 
> Looking at the preliminary reviews of Brazos on Anandtech it looks like AMD has spent the last 5 or 6 years bragging and not actually doing any work. All this is, is a HD5000/HD6000 GPU strapped to an underpowered and poorly designed CPU. If that's all Fusion is meant to be then sorry mate, but I'm going Intel. I have been waiting 5 years for Fusion and to see it's not remotely what they were cracking it up to be, it just seems like AMD made a processor that would have been good 5 years ago but they have held it back for far too long.
> 
> ...



You are looking at it all wrong.  That product is not designed to compete with Sandy Bridge; Zambezi competes with Sandy Bridge.

Comparing Ontario to Sandy Bridge is like looking at a Corvette and saying, "but I don't think it will pull my boat...."


----------



## HillBeast (Nov 19, 2010)

JF-AMD said:


> You are looking at it all wrong.  That product is not designed to compete with sandybridge, Zambezi competes with sandybridge.
> 
> Comparing Ontario to sandybridge is like looking at a corvette and saying "but I don't think it will pull my boat...."



Um, by Sandy Bridge I mean the microarchitecture behind all of Intel's next-generation CPUs. They WILL be putting out a low-end chip that will compete against Ontario. Zambezi will just be going up against the Core i7 2xxx and Core i5; Ontario will have to take on the Core i3. I doubt they will dominate, considering what I've seen both parties show so far.


----------



## lashton (Nov 19, 2010)

Also, the Phenom II was designed to compete with the Core 2 Quads, NOT the Core i7; it's just the timing that makes everyone relate the two.


----------



## TheGuruStud (Nov 19, 2010)

JF-AMD said:


> You are looking at it all wrong.  That product is not designed to compete with sandybridge, Zambezi competes with sandybridge.
> 
> Comparing Ontario to sandybridge is like looking at a corvette and saying "but I don't think it will pull my boat...."



I wouldn't even bother with these kinds of guys. It's probably smyrgyl from the zone.


----------



## Wile E (Nov 19, 2010)

(FIH) The Don said:


> the best thing about AMD



How so? You still have to buy a new board to use the new cpu. It's not any different than Intel.


And I can't wait to see what these can do. I hope it's not yet another let down by AMD. I want to see competition on the high end, dammit.


----------



## Steevo (Nov 19, 2010)

I am tired of the hype and fail from AMD too. If you aren't going to compete, just say so, but don't try to put spin on this shit, especially on a tech site. 


If your CPUs are slower, they had better be marketed accordingly, with a price to match, or at least some redeeming features. And I totally love how you just ignore users who have legit issues with being raped by your company so you can make somewhat witty comparisons.


----------



## Imsochobo (Nov 19, 2010)

JF-AMD said:


> I have stacks of processors on my desk right now.  But if I start giving them away, I'll get hammered with requests.  Plus, they are server processors



Hi.
I've gotten a "few" CPUs from AMD before, but I'm not here to ask about that ;P
I'll only be interested if there's an AM3 Bulldozer.
But anyway:

Will vMotion be forward-compatible with Bulldozer? In VI 4.1 EVC mode, the non-3DNow! (Gen 3) baseline lists future AMD CPUs.
So will vMotion actually work between a Gen 3 Opteron/PHII and a Bulldozer? :O

And a second question:
will there be any "difference" between server and desktop Bulldozer in terms of feature support, quad-channel for servers, dual for desktops?


----------



## 1Kurgan1 (Nov 19, 2010)

Wile E said:


> How so? You still have to buy a new board to use the new cpu. It's not any different than Intel.



It's a stepping stone. Back when I had an AM2 setup and AM2+ came out, I bought an AM2+ chip and used it in my AM2 board, then later got an AM2+ board. Then AM3 came out, and once again I bought an AM3 chip and used it in my AM2+ board, then finally upgraded to my AM3 board. Just now you will be able to buy the new board first, then the chip. It's nice not having to invest in a new mobo/CPU at the exact same time.



Steevo said:


> I am tired of the hype and fail from AMD also. If you aren't going to compete just say so, but don't try and put spin on this shit, especially on a tech site.
> 
> If your CPU's are slower they better be marketed accordingly, and with a price to match, or at lease some redeeming features. And I totally love how you just ignore users who have legit issues with being raped by your company so you can make somewhat witty comparisons.



What fail? So far they haven't said they would contend with $1000 chips, and so far their prices are very competitive.


----------



## Wile E (Nov 19, 2010)

1Kurgan1 said:


> It's a stepping stone. Like back when I had an AM2 setup, then AM2+ came out, so I bought an AM2+ chip and used it in my AM2 board, then later I got an AM2+ board. Then AM3 came out, and once again, I bought an AM3 chip and used it in my AM2+ board, then I finally upgraded to my AM3 board. Just now you will be able to buy the new board, then buy the chip. It's nice not having to invest in a new mobo/cpu at the exact sametime.
> 
> 
> 
> What fail? So far they haven't said they would contend with $1000 chips, and so far their prices are very competitive




You can't use this in your AM3 board. The end result is the same: you need to buy both the CPU and board to use this architecture. I bought my Intel rig a piece at a time too; I just used my old rig until I had all the pieces I needed. Same thing.

And not being able to at least come close to competing on the high end is kind of a fail.


----------



## 1Kurgan1 (Nov 19, 2010)

Wile E said:


> You can't use this in your AM3 board. The end result is the same. You need to buy both the cpu and board to use this architecture.
> 
> And not being able to at least come close to competing on the high end is kind of a fail.



I said this time it's the opposite: you can't use the new chip in the old board, but I can use the new board with my old chip. So I don't have to buy them both at the same time.


----------



## NdMk2o1o (Nov 19, 2010)

JF-AMD said:


> You are looking at it all wrong. That product is not designed to compete with Sandy Bridge; Zambezi competes with Sandy Bridge.
> 
> Comparing Ontario to sandybridge is like looking at a corvette and saying "but I don't think it will pull my boat...."



Without giving too much away (I'm sure you can only say so much about Bulldozer), surely you have seen its performance. How does it stack up against the Phenom architecture?


----------



## Wile E (Nov 19, 2010)

1Kurgan1 said:


> I said this time it's the opposite: you can't use the new chip in the old board, but I can use the new board with my old chip. So I don't have to buy them both at the same time.



I didn't buy my i7 rig all at once either. I just used my old rig in the meantime. Same thing, same results. You'll get no performance benefit from buying an AM3+ board for your current CPU, so it's absolutely no different than just using an old rig until you have all the parts.

Sorry, it's not any different. To use the cpu, you need a new board.


----------



## WhiteLotus (Nov 19, 2010)

I'd just rather they release the damn thing already. Then all this useless speculation would go away.


Though who wants to bet the release will coincide with a new Graphics card of some sort... like the series.


----------



## Frizz (Nov 19, 2010)

By the time Intel releases Sandy Bridge I'll have swapped boards 2-3 times. I'm definitely looking towards AMD and the AM3+ boards, as it seems they will last more than one or two generations of CPUs.


----------



## Peter1986C (Nov 19, 2010)

Wile E said:


> And not being able to at least come close to competing on the high end is kind of a fail.



You seem to fail to understand that the OEM market is hugely important; that is where retailers and manufacturers get their biggest sales figures.
BTW, I still see a lot of Core 2 series computers advertised by electronics and computer stores. Especially among affordable (500-700 euro) laptops I still see plenty of E-series Pentium CPUs alongside the Core 2 Duos and i3s. Desktops aren't much different; I'd guess it's 50% i3, 50% C2D/C2Q. And I think a lot of people would gladly buy such a combo simply because it is Intel stuff (not caring about the exact model, as if they actually had a clue). A lot of people don't care how PCs work, let alone how to build one themselves, and especially not about high-end parts that often cost as much per part as an entry or mid-level OEM machine.

Llano is an attempt to get a much better foothold in the OEM market, with both a CPU and a GPU that are expected to be economical and powerful enough to compete with the dual-core systems equipped with IGPs currently on the market.

Zambezi will probably (at least, I guess so) leave the i3s in the dust, while competing with the i5 and with those i7 CPUs that don't cost roughly 800-1000 euro. In addition, I expect Zambezi to overclock far better than the Sandy Bridge i7s, for the simple reason that Intel will make overclocking all but impossible from a technical point of view. I forget how exactly Intel is going to do that, though, and I am currently too lazy to look it up.


----------



## Badlands (Nov 19, 2010)

Well, for me AMD and Gigabyte are the best things since cotton candy... I have an AM2 mobo that started with an AM2 Athlon 6000 dual core, then upgraded to the AM2+ Phenom X4 9550 (really disappointed with that one), then jumped to the AM3 Phenom II X6 1090T. (All I needed was a BIOS update for my upgrades.) The 6-core has brought new life to an ageing system. AMD and Gigabyte have been good to me with the constant mobo support of an old PC build. I think they made sure I got my money's worth out of their products, and with that being said, I don't need to compare the performance of Bulldozer against Sandy Bridge to wonder what my next build will be. The way I figure it, when I upgrade to an AM3+ mobo, I will have an upgrade path for 3 years or so. (It's what I got from my last purchase.) How many Intel products can say that, with sockets 775, 1156, 1366, or the ........???


----------



## TheMailMan78 (Nov 19, 2010)

Does anyone know why they are still running dual channel on the desktop when the server class can be quad channel? I am in no way an expert, but it sounds like this could be a major bottleneck.



Wile E said:


> How so? You still have to buy a new board to use the new cpu. It's not any different than Intel.
> 
> 
> And I can't wait to see what these can do. I hope it's not yet another let down by AMD. I want to see competition on the high end, dammit.



Yeah, but then you have people like me. I'm already running a 1090T and I don't want to jump to the first gen of Bulldozer. However, I do want a new mobo. I'll be able to buy a new mobo with an upgrade path and still use my old components until I'm ready to upgrade again. THAT is what's great about AMD.


----------



## HalfAHertz (Nov 19, 2010)

TheMailMan78 said:


> Does anyone know why they are still running dual channel on the desktop when the server class can be quad channel? I am in no way an expert, but it sounds like this could be a major bottleneck.
> 
> 
> 
> Yeah, but then you have people like me. I'm already running a 1090T and I don't want to jump to the first gen of Bulldozer. However, I do want a new mobo. I'll be able to buy a new mobo with an upgrade path and still use my old components until I'm ready to upgrade again. THAT is what's great about AMD.



Because your Windowz doesn't eat up 128 GB of RAM a second?
http://en.wikipedia.org/wiki/High-performance_computing


----------



## JF-AMD (Nov 19, 2010)

Client workloads rarely saturate the memory bus. We'll be adding ~50% more throughput than current designs.

Benchmarks and real life usage are 2 different things.

For every person that is bottlenecked on today's systems there are a million others on the interwebs that are not even saturating 1/3 of their memory bandwidth.

People are getting hung up on the number of channels instead of focusing on the amount of bandwidth they can actually achieve and the amount of bandwidth their applications require.
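The channels-vs-bandwidth point is easy to sanity-check against the numbers in the article (a quick sketch; the 64-bit channel width and decimal GB/s convention are the usual DDR3 assumptions):

```python
# Theoretical peak DDR3 bandwidth: transfers/s x 8 bytes per 64-bit channel x channels.
# Uses decimal GB/s, as memory module ratings do.
def peak_bandwidth_gbs(mega_transfers_per_s: float, channels: int) -> float:
    return mega_transfers_per_s * 8 * channels / 1000

# Dual-channel DDR3-1866 (PC3-14900) vs triple-channel DDR3-1066 (PC3-8500):
print(round(peak_bandwidth_gbs(1866.67, 2), 1))  # ~29.9 GB/s
print(round(peak_bandwidth_gbs(1066.67, 3), 1))  # 25.6 GB/s
```

So two faster channels out-run three slower ones, which is exactly the point: what matters is the total bandwidth you can achieve, not the channel count.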


----------



## TheMailMan78 (Nov 19, 2010)

JF-AMD said:


> Client workloads rarely saturate the memory bus. We'll be adding ~50% more throughput than current designs.
> 
> Benchmarks and real life usage are 2 different things.
> 
> ...



Cool. Nice clear answer for a joker like me. So it's not how many bottles of beer you've got; it's the size of the bottle?


----------



## JF-AMD (Nov 19, 2010)

Exactly. Would you rather have two 7-ounce Little Kings or a 22-ounce Stone IPA?

One option is twice as many bottles; that must be better, right?


----------



## Kovoet (Nov 19, 2010)

And now AMD are trying to make me homesick. Damn them!


----------



## TheMailMan78 (Nov 19, 2010)

JF-AMD said:


> Exactly. Would you rather have two 7-ounce Little Kings or a 22-ounce Stone IPA?
> 
> One option is twice as many bottles; that must be better, right?



Nice. Carry on then.


----------



## Wile E (Nov 20, 2010)

Chevalr1c said:


> You seem to fail to understand that the OEM market is hugely important; that is where retailers and manufacturers get their biggest sales figures.
> BTW, I still see a lot of Core 2 series computers advertised by electronics and computer stores. Especially among affordable (500-700 euro) laptops I still see plenty of E-series Pentium CPUs alongside the Core 2 Duos and i3s. Desktops aren't much different; I'd guess it's 50% i3, 50% C2D/C2Q. And I think a lot of people would gladly buy such a combo simply because it is Intel stuff (not caring about the exact model, as if they actually had a clue). A lot of people don't care how PCs work, let alone how to build one themselves, and especially not about high-end parts that often cost as much per part as an entry or mid-level OEM machine.
> 
> Llano is an attempt to get a much better foothold in the OEM market, with both a CPU and a GPU that are expected to be economical and powerful enough to compete with the dual-core systems equipped with IGPs currently on the market.
> ...


Competing at the top = better market exposure. I understand fully about OEM, I just don't care. I buy top end; if they can't compete there, it's sort of a fail in my book.



TheMailMan78 said:


> Does anyone know why they are still running dual channel on the desktop when the server class can be quad channel? I am in no way an expert, but it sounds like this could be a major bottleneck.
> 
> 
> 
> Yeah, but then you have people like me. I'm already running a 1090T and I don't want to jump to the first gen of Bulldozer. However, I do want a new mobo. I'll be able to buy a new mobo with an upgrade path and still use my old components until I'm ready to upgrade again. THAT is what's great about AMD.


Or you can just use what you have, since a new mobo won't really help your performance, and save more money for a second-gen AM3+ mobo and chip. My point stands: you need a new mobo to use the chip, so it's not any different.


----------



## Imsochobo (Nov 20, 2010)

Hell, I'm running my memory at PC-5300 speeds (it's originally 1150 MHz 5-4-4-12) and an old Phenom II 940 at stock with 2x 5850 CrossFire.
No problems. Start-up lag because I run my memory so slowly, but I'm too lazy....

Games run as well as on any other rig; I rarely hit games I have issues with... more like never.
Save some money and buy yourself an SSD and you're better off.

Video cards change more often than CPUs. I've been using this setup for two years, and many friends of mine only just moved away from 5600+'s and 2 GHz Core 2 Duos running with 5970s, 5870 CrossFire and GTX 460 SLI.
Games finally taxed their old, old CPUs.
Just saying: CPUs are overrated, and so is memory a lot of the time.


----------



## JF-AMD (Nov 20, 2010)

Wile E said:


> Competing at the top =  better market exposure. I under stand fully about OEM, I just don't care. I buy top end, if they can't compete there, it's sort of a fail in my book.



Actually, few, if any people buy at the top.  The concept of a "halo" brand is only to make people feel good about their purchases AFTER they have bought their product.  Since more than 95% of the world does not buy top bin products, that concept never follows through.

Here is the proof.  (please ignore this if you are one of the very few people that actually buys the $1000 desktop CPU.)

You have a choice between 2 CPUs.

Company 1 has the highest-benchmarking product, but it costs $1000. You can only afford $500. You have 2 processors to choose from in the $500 budget range:

Company 1 has one that scores 450 on the one benchmark that really matters to you.
Company 2 has one that scores 600 on the one benchmark that really matters to you.

Do you:

A. Buy the slower CPU from company 1, even though you are getting less for your money
B. Buy the faster CPU from company 2

The problem with the idea of a halo brand is that 80%+ of the world is what the market calls "processor unaware." Ask 20 people at random and you'll find the vast majority don't know what kind of processor they have. So having that $1000 top-score processor isn't helping sell to that market.

Of the 20% of the market that is "processor aware", they are smart enough to compare performance of the choices in their budget and buy the best one.

Trust me, people don't buy the Chevy Malibu because the Corvette is the fastest car; they buy it because it has the best features for their budget.
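The budget thought experiment above reduces to a one-line score-per-dollar comparison (the scores are the hypothetical figures from the post, not real benchmark data):

```python
# Both chips fit the $500 budget; the rational pick maximizes score per dollar.
budget = 500
scores = {"Company 1": 450, "Company 2": 600}  # the one benchmark that matters to you

best = max(scores, key=lambda company: scores[company] / budget)
print(best, scores[best] / budget)  # Company 2, 1.2 points per dollar
```

At a fixed budget the halo product never enters the calculation; only the chips you can actually afford are compared.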


----------



## Fourstaff (Nov 20, 2010)

JF-AMD said:


> Trust me, people don't buy the Chevy Malibu because the Corvette is the fastest car; they buy it because it has the best features for their budget.



Or they see that Mercedes and buy it because it's a Mercedes, therefore it must be good. I completely understand what you mean, but I have seen a lot of people go "I want Intel, because they are the best," completely ignoring everything else, value for money included. Just like girls going after the next fashion fad: they see the pros (or "stars") wearing brand X, then they buy brand X regardless of quality and everything else.


----------



## HillBeast (Nov 20, 2010)

JF-AMD said:


> Actually, few, if any people buy at the top.  The concept of a "halo" brand is only to make people feel good about their purchases AFTER they have bought their product.  Since more than 95% of the world does not buy top bin products, that concept never follows through.



So AMD's idea of competing is to assume what we want? I WANT a more powerful CPU because I am a video editor and I NEED the performance. I don't want to be sitting around waiting for preview files to render. I don't want to wait for hours while I encode 1080p at 10+ Mbit/s.

You think it's better for me to get a lower-end CPU and put up with it so you can satisfy the cheapskates? Intel have the SMART idea: 3 main processor lines: low power, mainstream, and enthusiast. AMD's idea is: low power, cheapskate. I want the enthusiast grade, even if only to get some competition against Intel so I can see lower prices on the enthusiast chips.


----------



## JF-AMD (Nov 21, 2010)

I am not telling anyone what to buy.  I am just pointing out the reality of the market.


----------



## bear jesus (Nov 21, 2010)

HillBeast said:


> So AMD's idea of competing is to assume what we want? I WANT a more powerful CPU because I am a video editor and I NEED the performance. I don't want to be sitting around waiting for preview files to render. I don't want to wait for hours while I encode 1080p at 10+ Mbit/s.
> 
> You think it's better for me to get a lower-end CPU and put up with it so you can satisfy the cheapskates? Intel have the SMART idea: 3 main processor lines: low power, mainstream, and enthusiast. AMD's idea is: low power, cheapskate. I want the enthusiast grade, even if only to get some competition against Intel so I can see lower prices on the enthusiast chips.



In what way is anyone or anything suggesting you get a lower-end CPU? Just like everyone else, you will buy what suits your needs and budget. If anything, you should stop being a cheapskate and shell out for an i7 970 or 980X (j/k).

I have to ask though: why are you using a CPU to encode video when you could be using your much faster GPU?


----------



## kirtar (Nov 21, 2010)

bear jesus said:


> I have to ask though, why are you using a CPU to encode video when you could be using your much faster GPU?


Probably because GPU encoding solutions still don't match the quality of CPU-based solutions (at a given bitrate). I've tried things like Cyberlink MediaEspresso, and to get the same quality as x264 I have to use probably 3 times the bitrate. Of course, this is mainly relevant for final output rather than previews, unless it's absolutely critical that previews have full quality.
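That "roughly 3x the bitrate for equal quality" observation translates directly into file size. A quick sketch, using a hypothetical two-hour 1080p project at the 10 Mbit/s figure mentioned elsewhere in the thread:

```python
# Constant-bitrate file size: bitrate (bits/s) x duration, converted to decimal GB.
def file_size_gb(bitrate_mbps: float, duration_s: float) -> float:
    return bitrate_mbps * 1e6 * duration_s / 8 / 1e9

duration = 2 * 3600                      # a two-hour project, in seconds
cpu_encode = file_size_gb(10, duration)  # x264 on the CPU at 10 Mbit/s -> 9.0 GB
gpu_encode = file_size_gb(30, duration)  # ~3x the bitrate for equal quality -> 27.0 GB
print(cpu_encode, gpu_encode)
```

Which is why the bitrate penalty matters for final output: the GPU encode that looks the same costs you roughly three times the storage.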

In any case, JF-AMD is definitely right in that a very small part of the market buys at the top of the market.  In addition, I wouldn't find it difficult to believe that the vast majority of people don't know what exact processor is in their systems.  However, I would add that most people probably do know what brand of processor is in their system (especially on OEM systems that include the brand's shield stickers on the machine), and that the public perception of the brand is important.

Honestly, if AMD took control of the midrange CPU market, which probably covers most of the consumer market, that would probably get more attention than just topping the performance segment because the vast majority of people aren't building their own machines, and most also don't buy at the top of the line.  

Quite honestly, when I'm considering what hardware to buy, I almost always look at what I call the low end of the high end. What I mean by this is that I look at the cheapest of the high-performance line (e.g. the i7 920 when the i7s came out, and the Phenom II X6 1055T instead of the 1090T). I don't care much about the other market segments besides the price points directly adjacent to what I am considering.


----------



## bear jesus (Nov 21, 2010)

kirtar said:


> Probably because GPU encoding solutions still don't match the quality of CPU-based solutions (at a given bitrate). I've tried things like Cyberlink MediaEspresso, and to get the same quality as x264 I have to use probably 3 times the bitrate.



I had no idea; I have not encoded video on a GPU rather than a CPU for years, so I just assumed GPU-accelerated programs had the same features, including bitrate options.
That kind of makes GPU encoding pointless, really.

I just wish the desktop Bulldozer cores were out sooner so I could pick my CPU, motherboard and RAM upgrade, as I know I'm going to be very tempted by Sandy Bridge if the K versions are not too highly priced.


----------



## kirtar (Nov 21, 2010)

bear jesus said:


> I had no idea; I have not encoded video on a GPU rather than a CPU for years, so I just assumed GPU-accelerated programs had the same features, including bitrate options.
> That kind of makes GPU encoding pointless, really.
> 
> I just wish the desktop Bulldozer cores were out sooner so I could pick my CPU, motherboard and RAM upgrade, as I know I'm going to be very tempted by Sandy Bridge if the K versions are not too highly priced.


Actually there is a point: acceleration. If all you need is a quick output to give you a basic idea of what you have, GPU-accelerated encoding is fine because it does output faster, and the size is more or less irrelevant because it's probably not sticking around. However, for a final output I would use a CPU-based encoder until GPU-accelerated encoders begin to catch up.

Also, this is just based off of my opinion of the software available to me. For all I know there could be some super top-secret industrial software that works just fine.


----------



## Wile E (Nov 21, 2010)

JF-AMD said:


> Actually, few, if any people buy at the top.  The concept of a "halo" brand is only to make people feel good about their purchases AFTER they have bought their product.  Since more than 95% of the world does not buy top bin products, that concept never follows through.
> 
> Here is the proof.  (please ignore this if you are one of the very few people that actually buys the $1000 desktop CPU.)
> 
> ...



Again, I don't give a shit about the market. No top end = fail in my book.

But since you want to talk about the overall market:

Being able to claim the fastest CPU gives you the ability to advertise as such, boosting sales of the lower parts through brand recognition. Chevy advertising the Corvette ZR-1 or Camaro SS most certainly does bring people in the door for lower-end vehicles. It generates buzz and excitement, very useful selling tools. Or, for a better example, Cadillac having the world's fastest production V8 sedan with the CTS-V. Hell, most people I talk to don't know any Caddy models, but they sure as hell know about the CTS-V or Escalade, both top-end "parts".

AMD has been feeding us the same spin you just got done feeding me for years. Sorry, but it's just plain horseshit. Having a strong top end does boost sales by bolstering brand image.



kirtar said:


> Actually there is a point: acceleration. If all you need is a quick output to give you a basic idea of what you have, GPU-accelerated encoding is fine because it does output faster, and the size is more or less irrelevant because it's probably not sticking around. However, for a final output I would use a CPU-based encoder until GPU-accelerated encoders begin to catch up.
> 
> Also, this is just based off of my opinion of the software available to me. For all I know there could be some super top-secret industrial software that works just fine.



CPU still produces the best quality. GPU encoders are great for encoding for portable devices though, where absolute quality isn't necessary due to the limited displays.


----------



## f22a4bandit (Nov 21, 2010)

Wile E said:


> Again, I don't give a shit about the market. No top end = fail in my book.
> 
> But since you want to talk about the overall market:
> 
> ...



Did you take any marketing classes, or graduate with a degree in marketing/advertising/public relations? Your argument is very good, and definitely spot on.

Intel has a great share of the market because of their advertising. I can't remember seeing an AMD commercial on the television in...ever. Maybe there have been, but honestly the way I heard about AMD was through the grapevine and online reviews.

Every person's budget is going to differ from one to the next. I do agree that AMD should compete with a big-dog processor again, because that can greatly increase competition and drive a nice price war like we see with graphics cards.


----------



## bear jesus (Nov 21, 2010)

Wile E said:


> Again, I don't give a shit about the market. No top end = fail in my book.
> 
> But since you want to talk about the overall market:
> 
> ...



I don't think that applies to AMD, as they don't advertise in mainstream media. OK, I admit I don't know if they still do, but they never used to back when I paid attention to mainstream media.


----------



## Fatal (Nov 21, 2010)

Building my own computer a few years back helped me understand how computers work. Then once I got into overclocking, well, that made me the junkie I am today. In the few years I have been building computers I have gained much knowledge from forums and trial and error. I now come to TPU all the time because there are many here that know their stuff. I have not had the chance to work with an Intel system; this is due to the pricing, so I stuck with AMD. Now that I have had a good taste of what can be done, all I care about is speed. With that being said, I am waiting, as some are, to see what AMD comes up with this time. I was not willing to pay higher prices before, but I want a faster overall system. I have been eyeing the i7s and for now am holding off; if, as some here believe, AMD turns out to be a letdown, I will have my answer as to what system I will build next.


----------



## department76 (Nov 21, 2010)

Sounds like I'll happily be buying an AM3+ board to drop my 965 into, to last me until the X8 prices are reasonable.

Ty, AMD, for the sensible (and fiscally intelligent) upgrade path.


----------



## bear jesus (Nov 21, 2010)

department76 said:


> sounds like i'll happily be buying an AM3+ board to drop my 965 into, to last me until the X8 prices are reasonable.
> 
> ty, AMD, for the sensible (and fiscally intelligent) upgrade path.



That's a good point. I kept being tempted to get an AM3 board to put my 965 into, but ended up deciding to wait for Sandy Bridge and Bulldozer. I could just jump from AM2+ to AM3+ if Bulldozer is worth it... although by the time it comes out, I'm sure I will be wanting a new CPU as well.


----------



## pr0n Inspector (Nov 21, 2010)

department76 said:


> sounds like i'll happily be buying an AM3+ board to drop my 965 into, to last me until the X8 prices are reasonable.
> 
> ty, AMD, for the sensible (and fiscally intelligent) upgrade path.



What exactly is wrong with holding onto your money until both the board (or even newer boards) and the chip are out? Can you not control your spending urges?


----------



## bear jesus (Nov 21, 2010)

pr0n Inspector said:


> What exactly is wrong with holding onto your money until both the board (or even newer boards) and the chip are out? Can you not control your spending urges?



To many people, including me, hardware upgrades are like a drug addiction; just getting a little hit, even if it does not really do much, is worth it.


----------



## Nesters (Nov 21, 2010)

pr0n Inspector said:


> What exactly is wrong with holding onto your money until both the board (or even newer boards) and the chip are out? Can you not control your spending urges?



Well, sometimes the current mobo is crap or, for example, doesn't support CrossFire, but you want more GPU power.

Usually you can find other reasons why you should buy a new mobo.


----------



## meirb111 (Nov 21, 2010)

pr0n Inspector said:


> What exactly is wrong with holding onto your money until both the board (or even newer boards) and the chip are out? Can you not control your spending urges?



Another reason would be to wait for a price drop instead of buying the day after a product is released. Patience is a virtue!


----------



## Steevo (Nov 21, 2010)

GPU-based rendering in Adobe uses the same set of instructions the CPU uses for lighting effects, saturation, hue, audio control, etc.... unless I am mistaken.

But still, 24 Mbps takes forever to render with three effects. Why? Lazy companies promise, and by the time they have sold their products they have already moved on to their next set of promises they fully intend to break.

I'm downloading MediaEspresso 6 to try the GPU part again; the last time I tried it I had issues with video corruption.

Yeah, hardcore acceleration.


----------



## HillBeast (Nov 21, 2010)

kirtar said:


> Probably because GPU encoding solutions still don't match the quality of CPU-based solutions (at a given bitrate). I've tried things like Cyberlink MediaEspresso, and to get the same quality as x264 I have to use probably 3 times the bitrate. Of course, this is mainly relevant for final output rather than previews, unless it's absolutely critical that previews have full quality.



Precisely. GPU-based solutions are very dodgy. They work for one GPU, and on another they just do weird things. I have spent hours upon hours trying to get MediaEspresso to work on my 5870 and got nowhere. It simply left the GPU acceleration box greyed out. I tried new drivers, installing Stream, reinstalling MediaEspresso, installing the latest Avivo... nothing. Eventually I figured there was no point, seeing as it can't do Premiere projects and that is what I care about.

Of course I would be using AMD's plugin that uses a Radeon to render your videos, but they had to go and require an AMD CPU now that they're AMD instead of ATI. I'd rather not get a slower CPU just for the sake of a GPU plugin that more than likely won't help me much.



Steevo said:


> GPU-based rendering in Adobe uses the same set of instructions the CPU uses for lighting effects, saturation, hue, audio control, etc.... unless I am mistaken.



Yeah, they have all those nice features, but the feature I really want is Premiere project file support. Only Adobe can do that. Running it through the likes of MediaEspresso will only slow me down, because I am running it through a second encoder, and it will also lose quality as a result. Fewer encodes = higher quality.


----------



## Steevo (Nov 21, 2010)

Exactly. I want full hardware support, not this occasional use of the GPU that amounts to a whole minute of savings on a two-hour project.

ATI/AMD isn't cutting it. They promised this shit and fail to deliver, but if you listen to them they tell you all about this digital dream they have. It is more like a nightmare: buy this, buy that, buy this, buy that, and oh wait, it only does this one type of file from 1998, doesn't work with this anymore, this driver breaks that, but if you buy this new thing everything will be great!!!


----------



## TheoneandonlyMrK (Nov 21, 2010)

bear jesus said:


> why are you using a CPU to encode video when you could be using your much faster GPU?



Exactly, +1; I shoulda read on.

PS: HillBeast, NV are quite good for GPU encoding. Have you considered a cheap GT 240 just for encoding? It's quick, and you could get some hybrid PhysX bonus points too; just disable ATI encoding and you're off.

I am held in awe by no company; they're all shit at times. Hence I've got AMD/ATI and Intel playing very nicely in my PC today with some fairly stale Abit stuff, lol.


----------



## bear jesus (Nov 21, 2010)

HillBeast said:


> Of course I would be using AMD's plugin that uses a Radeon to render your videos, but they had to go and require an *AMD CPU* now that they're AMD instead of ATI. I'd rather not get a slower CPU just for the sake of a GPU plugin that more than likely won't help me much.



Wow, that's like the biggest fail I have heard of in weeks; I assumed it worked with an Intel CPU.


----------



## HillBeast (Nov 21, 2010)

theoneandonlymrk said:


> PS: HillBeast, NV are quite good for GPU encoding. Have you considered a cheap GT 240 just for encoding? It's quick, and you could get some hybrid PhysX bonus points too; just disable ATI encoding and you're off.



I could do that, but it would mean taking my GTX 285 out.

Already tried NVIDIA encoding, and it's the same deal: a little faster, but still no Adobe and no point in it.

CPU encoding is the best and will always be the best, as there are no driver issues. We need a CPU which is a TRUE fusion of a CPU and a GPU, like Cell, but not so fail and with better single-threading.



bear jesus said:


> Wow, that's like the biggest fail I have heard of in weeks; I assumed it worked with an Intel CPU.



Nah, the AMD plugin for Premiere only works with an AMD CPU + ATI Radeon <- it will never be an AMD Radeon. Even AMD themselves seem to think so when you look at the driver download section: http://www.amd.com/au/Pages/AMDHomePage.aspx


----------



## Peter1986C (Nov 21, 2010)

Wile E said:


> Escalade, both top end "parts"



Are you sure? As a car to use in town it's far from a smart choice (with all that traffic, a Smart, Citroën DS3, Volkswagen Golf, Ford Focus etc. are more practical IMHO), and if you need an off-road vehicle you are better off with a Range Rover or a Jeep.
And IIRC from a review, the Escalade was all plastic inside (as most US cars are).


----------



## Steevo (Nov 21, 2010)

HillBeast said:


> I could do that, but it would mean taking my GTX285 out
> 
> Already tried NVIDIA encoding, and it's the same deal: a little faster, but still no Adobe and no point in it.
> 
> ...



Nvidia still works with CS4 and up, the full version. However, some forum members there have been able to use an ATI card and get the same effects by starting the computer and Adobe with the Nvidia card attached and primary, then, once it is open and running, switching back to the ATI card. So Adobe & Nvidia are just as guilty as ATI is at this shit.

I have AMD/ATI and the Stream encoder/decoder don't work with the program they are supposed to work with. So AMD can kiss my ass with their digital dream. Once they make good on the promises I have paid for with my last five cards from them, I might consider them again, but really I am moving to Nvidia once they get their issues fixed.


----------



## HillBeast (Nov 21, 2010)

Steevo said:


> Nvidia still works with CS4 and up, the full version. However, some forum members there have been able to use an ATI card and get the same effects by starting the computer and Adobe with the Nvidia card attached and primary, then, once it is open and running, switching back to the ATI card. So Adobe & Nvidia are just as guilty as ATI is at this shit.
> 
> I have AMD/ATI and the Stream encoder/decoder don't work with the program they are supposed to work with. So AMD can kiss my ass with their digital dream. Once they make good on the promises I have paid for with my last five cards from them, I might consider them again, but really I am moving to Nvidia once they get their issues fixed.




It's really not a matter of getting acceleration inside Premiere; I want the acceleration in Media Encoder when I'm exporting my work. It doesn't matter what I have there, because Adobe can't be bothered putting it in OpenCL or CUDA or DirectCompute or whatever you want to favour. Odd that they CUDA'd up Premiere CS5 and RUINED that, but left Media Encoder alone. I guess programming encoders to work with OpenCL (or whatever library you favour) just isn't easy, and what you get is really low-quality footage, not something I want.


----------



## pantherx12 (Nov 22, 2010)

Steevo said:


> Nvidia still works with CS4 and up, the full version; however, some of the forum members there have been able to use an ATI card and get the same effects by starting the computer and Adobe with the Nvidia card attached and primary, then switching back to the ATI card once it is open and running. So Adobe & Nvidia are just as guilty as ATI is at this shit.
> 
> I have AMD/ATI, and the Stream encoder/decoder doesn't work with the program it is supposed to work with. So AMD can kiss my ass with their digital dream. Once they make good on the promises I have paid for with my last five cards from them, I might consider them again, but really I am moving to Nvidia once they get their issues fixed.




Did you actually install Stream?

It's not installed by default, so a lot of things don't work if you just try running them.

You have to download the SDK to get it working.

Shit flies when decoding on my GPUs!


----------



## Wile E (Nov 22, 2010)

Chevalr1c said:


> Are you sure? As a car to use in town it's far from a smart choice (with all that traffic, a Smart, Citroën DS3, Volkswagen Golf, Ford Focus, etc. are more practical IMHO), and if you need an off-road vehicle you'd better get a Range Rover or a Jeep.
> And IIRC from a review, the Escalade was all plastic inside (as most US cars are).



1.) I'm in the US. They work fine here. Our roads and parking spaces are plenty large enough for SUVs.

2.) Like it or not, they are one of *Cadillac's* top of the line premium models.

3.) I didn't compare it to other makers. I used Cadillac as a parallel to AMD. I didn't say it was the best car in the world; I used the Escalade to prove the point that people know what brand it is because of the premium models. I never once mentioned the value of the car, or compared it to other makers.

You have completely missed the point of the exercise.


----------



## Peter1986C (Nov 23, 2010)

Wile E said:


> 1.) I'm in the US. They work fine here. Our roads and parking spaces are plenty large enough for SUVs.



Trying to get through a downtown district of a European city is often a different matter though. Like sh** through a funnel, so most folks try to keep the size of their turds to a limit. 
Certain cities even charge fees if people wish to drive through the downtown district (to discourage car use and encourage people to go by bike, scooter, mass transit, or whatever) because of the traffic congestion. So SUV drivers are, when using such vehicles downtown, quite frowned upon, because they hinder the other traffic so much.

And in Asia the traffic is often even worse than in Europe.



Wile E said:


> 2.) Like it or not, they are one of *Cadillac's* top of the line premium models.



I should have realised that. My bad.



Wile E said:


> 3.) I didn't compare it to other makers. I used Cadillac as a parallel to AMD. I didn't say it was the best car in the world, I used Escalade to prove the point that poeple know what brand it is because of the premium models. I never once mentioned the value of the car, or compared it to other makers.


If you want to make such a point, better pick a more globally known car brand as a parallel (Asian or Euro brands, like Hyundai, Toyota, Mercedes, BMW, etc.). Cars like the Escalade are barely known here; I actually only know it because of TV. The only US brands selling well in Europe are Ford and General Motors (under the names of Chevrolet (mostly former Daewoo models) and Opel/Vauxhall).



Wile E said:


> You have completely missed the point of the exercise.



Not completely my fault, IMHO, if your parallels don't work that well for non-Americans. I mean, I can try my best of course, but there is some chance of "failure".


----------



## Wile E (Nov 25, 2010)

Chevalr1c said:


> Trying to get through a downtown district of a European city is often a different matter though. Like sh** through a funnel, so most folks try to keep the size of their turds to a limit.
> Certain cities even charge fees if people wish to drive through the downtown district (to discourage car use and encourage people to go by bike, scooter, mass transit, or whatever) because of the traffic congestion. So SUV drivers are, when using such vehicles downtown, quite frowned upon, because they hinder the other traffic so much.
> 
> And in Asia the traffic is often even worse than in Europe.


I understand that. But I don't live there, so I can't give any good examples for you. Most of the forum is from the US, so that's just what I'm used to catering to, that's all.



Chevalr1c said:


> I should have realised that. My bad.
> 
> 
> If you want to make such a point, better pick a more globally known car brand as a parallel (Asian or Euro brands, like Hyundai, Toyota, Mercedes, BMW, etc.). Cars like the Escalade are barely known here; I actually only know it because of TV. The only US brands selling well in Europe are Ford and General Motors (under the names of Chevrolet (mostly former Daewoo models) and Opel/Vauxhall).


See, you just made my point. If not for the exposure of the top of the line Cadillac model on TV, would you have known about it? Same principle can hold true for AMD. If they put out a product that gets exposure because it rests at the top end, they'll get more brand recognition.



Chevalr1c said:


> Not completely my fault, IMHO, if your parallels don't work that well for non-Americans. I mean, I can try my best of course, but there is some chance of "failure".


I agree, not really your fault at all. Like I said, I'm just putting it into perspective for the majority of this forum. Feel free to substitute an example that better applies to your region.


----------



## Steevo (Nov 25, 2010)

I installed the 115 MB Stream package. It made no difference in Adobe; it allowed for 3% more use of the GPU over previous, so a total of 12%, to transcode a 1.81 GB 24 Mbps 1080i M2TS file to 1080i MPEG4/DivX. It was still insanely slow. 


So I can either shoot at a lower resolution than a few-year-old Canon common-format high-def camcorder can manage and go back to '90s formats, or suck it up and continue to spend days on projects. 


Yeah, ATI/AMD can suck it.


----------



## hobgoblin351 (Nov 25, 2010)

So, since I would like to upgrade to Bulldozer but can't wait that long, it seems that an upgrade to an AM3+ board with an AM3 chip is the way to go. Then I can upgrade to Bulldozer in time, when my pockets are deeper.  Has anyone heard of the AM3+ boards being worked on, or release dates? I can't seem to find anything on them.  And if JF-AMD has a stack of chips on his desk, then is it safe to assume that ASUS, GIGABYTE, and the rest of them have some and are working on putting out AM3+ boards?  Why have we heard nothing about them?


----------



## Fourstaff (Nov 25, 2010)

hobgoblin351 said:


> So, since I would like to upgrade to Bulldozer but can't wait that long, it seems that an upgrade to an AM3+ board with an AM3 chip is the way to go. Then I can upgrade to Bulldozer in time, when my pockets are deeper.  Has anyone heard of the AM3+ boards being worked on, or release dates? I can't seem to find anything on them.  And if JF-AMD has a stack of chips on his desk, then is it safe to assume that ASUS, GIGABYTE, and the rest of them have some and are working on putting out AM3+ boards?  Why have we heard nothing about them?



AM3+ mobos will be out in the second quarter, which is still a looooooong way from where we are right now.


----------



## HillBeast (Nov 25, 2010)

hobgoblin351 said:


> So, since I would like to upgrade to Bulldozer but can't wait that long.



Just wait. It's not that hard. I want the funds to buy parts for my new big project but don't have them at the moment. I'm not going to rob a bank to get them. Just wait for it to come out, and wait for Bulldozer to come out before deciding you want it. Planning to buy something before there are even benchmarks of it means the only reason you want it is marketing speak. It may arrive and be awesome, but it could also arrive and be a piece of crap.

Just wait.


----------



## LightningHertz (Dec 27, 2010)

*Give me Memory Controllers, or Give Me... A solar calculator*



JF-AMD said:


> Client workloads rarely saturate the memory bus.  We'll be adding ~50% more throughput than current designs.
> 
> Benchmarks and real life usage are 2 different things.
> ...



...and rightfully they should. '50% more processing power' was the description given for the Opteron replacement, and even then, that doesn't say anything at all about the rest of the hardware. The number of channels of lower-speed, lower-latency _dedicated_ memory per core is what gives real-world throughput, not just bench numbers.

(I apologize ahead of time for any meanderings that may be momentary rants, lol. I mean to mostly be informative, and hopefully with some humor, but atm I'm a bit upset with AMD   )

   People can't saturate their memory bandwidth because it can't be done. The bus is fine; the access is what is lacking. The problem is hardware limitations when you try to address the same target with too many cores, and greater RAM speed is almost moot: fewer dedicated memory paths than the number of cores causes contention among cores, among many other things I mention below. You can still fill your bucket (memory) slowly, with a slow hose (low # of RAM channels @ higher speed = higher memory controller & strap latencies, memory latencies, core contention, bank, rank, and controller interleaving, all while refreshing, strange ratios, etc.), but this is not 'performance' when it is time that we want to save. I can't just be excited that I can run 6 things 'ok.'
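To put rough numbers on that bucket-and-hose picture, here is a toy calculation (a minimal sketch with illustrative figures; the equal-split model is an assumption for illustration, and real controllers behave far less simply):

```python
# Toy model of cores contending for a shared memory bus (illustrative only;
# real controllers interleave, reorder, and buffer far better than this).

def channel_bandwidth_gbs(transfer_rate_mt_s, bus_width_bytes=8):
    """Peak bandwidth of one DDR3 channel: MT/s times 8 bytes per transfer."""
    return transfer_rate_mt_s * bus_width_bytes / 1000.0  # GB/s

def per_core_share_gbs(channels, transfer_rate_mt_s, cores):
    """Naive equal split of total peak bandwidth among cores all streaming at once."""
    total = channels * channel_bandwidth_gbs(transfer_rate_mt_s)
    return total / cores

# Dual-channel DDR3-1866 shared by an 8-core Zambezi:
print(round(per_core_share_gbs(2, 1866.67, 8), 1))  # each core's naive share, in GB/s
```

Under this crude split, ~29.9 GB/s of peak bandwidth leaves each of 8 cores with under 4 GB/s, before any latency or turn-taking is accounted for.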

Case in point:
   Look at this study performed by Sandia National Laboratories in Albuquerque, N.M. (Please note that they are terming multi-core systems 'supercomputers'):

https://share.sandia.gov/news/resou...lower-supercomputing-sandia-simulation-shows/


   In a related occurrence, look at how the 'CPU race' topped out. More speed in a single core just resulted in a performance-to-consumption ratio that made a really good shin-cooker. Instead, the answer was a smaller die process with low-power, moderate-speed SMP cores, much like nVidia's CUDA or ATI's Stream. Memory controllers/RAM are no different.

   What good is a ridiculously fast DDR HT-bus when you can't send solid, concurrent dedicated data streams down it to memory because of all the turn-taking and latencies? It's like taking turns with water spigots down a huge hose that has a small nozzle on it.
You can't achieve high throughput with current (and near-future) configurations. Notice that as soon as we said "Yay! Dual-core AND dual channel ram!" it quickly became, "what do you mean "controller interleave?" ...But then - the fix: Ganged memory at half the data width. 
...And there was (not) much rejoicing. (yay.)

Why did they do this? 
(My opinion only): It increasingly looks like they won't ever, ever give you what you want, and it doesn't matter who the manufacturer is. They need you NEEDING the next model after only 6 months. Look at these graphics cards today. I almost spit out my coffee when I read the benchmarks for some of the most recent cards, priced from $384 to $1100. At 800 MHz and up, some sporting dual GPUs and high-speed 1 GB-2 GB GDDR5, getting LESS THAN HALF of the framerates of my 5-year-old 600 MHz, DDR3 512 MB, PCI-E card; same software, with all eye candy on, and my processor is both older and slower than those showcased. It's obviously not a polygon-math issue. What's going on? Are we going backwards? I can only guess that CUDA and Stream have cores that fight over the memory with a bit-width that is still behind.

   I also do 3d animation on the same system that I game with, transcode movies, etc, etc, etc. So far, I have tested both Intel and AMD multi-core multi-threading with real-world, specifically compiled software _only _(thanks to Intel's filthy compiler tricks.) Engaging additional cores just results in them starving for memory access in a linear fashion. In addition, so far all my tests suggest that *no more than 4GB of DDR3-1066 @ 5-5-5-15-30 can be filled on dual channel... at all, on 2 through 6 core systems.* (On a side note: WOW- my Intel C2D machine, tested with non-Intel compiled (lying) software, performs like an armless drunk trying to juggle corded telephones with his face.)

   Anyway, the memory speed needed for current configurations would be well over what is currently available to even match parallel processing performance for the 3:1 [core : mem controller] ratio when you're done accounting for latencies, scheduling, northbridge strap, throughput reduction due to spread-spectrum because of the high frequency, etc, etc. 

So in conclusion, more parallel, lower-speed, low-latency controllers and memory modules (with additional, appropriate hardware) could deliver a system with a far greater level of real, _usable_ throughput. Because I would much prefer, but will most likely not be able to afford, a Valencia quad-core (a server core... for gaming too? And then only if it had concurrent memory access), it looks like I'm giving up before Bulldozer even gets here. 

6 cores and 2 channels for Zambezi? No thanks.
I'm tired of waiting, both for my renders _and_ for a PC that renders 30 seconds of footage in less than 3 days. 

(One final aside): About these 'modules' on Bulldozer: wouldn't that sting even more if it's the 'core A passes throughput to core B' design that appeared in some of the first dual cores? Time will tell.

~Peace


----------



## Steevo (Dec 27, 2010)

LightningHertz said:


> ...and rightfully they should. '50% more processing power' was the description given for the Opteron replacement; and even then, that doesn't say anything at all about the rest of the hardware. Number of channels of lower speed, lower latency _dedicated_ memory per core is what gives real-world throughput, not just bench numbers.
> 
> (I apologize ahead of time for any meanderings that may be momentary rants, lol. I mean to mostly be informative, and hopefully with some humor, but atm I'm a bit upset with AMD   )
> 
> ...



While some of your post was accurate, your old graphics card theory is bullshit.

Your eye candy is due to DX rendering paths; your older card will render in DX 8 or 9 at OK framerates, but the newer cards will struggle to render all the advanced features of DX11 that make the small differences.


Try HL2 CM 10 with all high settings. I can do it, why can't you?

But yes, you are right about core starvation. I see it happen on my system; increasing the core speed helps alleviate the problem to a small degree, just the effect of reduced latencies. Adding more RAM won't help, and higher RAM speed won't help. We are entering the era of needing one stick of 2GB RAM per core on its own memory path, or being able to read and write to different parts of RAM and track what becomes available to read and write, and where it is, for the next step in the process. Almost like RAM RAID.
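The "RAM RAID" idea is close to what channel interleaving already does: physical addresses are striped across channels the way blocks are striped across disks in RAID 0. A minimal sketch, assuming a hypothetical 64-byte stripe and an unganged two-channel layout:

```python
# Minimal sketch of address-to-channel striping, the DRAM analogue of RAID 0.
# The 64-byte granularity and 2-channel count are assumptions for illustration.

STRIPE = 64      # bytes per stripe (one cache line)
CHANNELS = 2

def channel_for(addr):
    """Which channel services a physical address under simple striping."""
    return (addr // STRIPE) % CHANNELS

# Consecutive cache lines alternate channels, so even a single streaming
# core can draw on both channels instead of queueing behind one:
print([channel_for(a) for a in range(0, 256, STRIPE)])  # alternates 0, 1, 0, 1
```

The per-core dedicated path Steevo asks for would instead partition addresses by core rather than stripe them, trading aggregate flexibility for predictable per-core bandwidth.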


----------



## LightningHertz (Dec 27, 2010)

Steevo said:


> While some of your post was accurate, your old graphics card theory is bullshit.
> 
> Your eye candy is due to DX rendering paths; your older card will render in DX 8 or 9 at OK framerates, but the newer cards will struggle to render all the advanced features of DX11 that make the small differences.
> 
> Try HL2 CM 10 with all high settings. I can do it, why can't you?




Steevo said:


> But yes, you are right about core starvation. I see it happen on my system; increasing the core speed helps alleviate the problem to a small degree, just the effect of reduced latencies. Adding more RAM won't help, and higher RAM speed won't help. We are entering the era of needing one stick of 2GB RAM per core on its own memory path, or being able to read and write to different parts of RAM and track what becomes available to read and write, and where it is, for the next step in the process. Almost like RAM RAID.




   Hi, Steevo. I'm glad that you are also able to reproduce some of these observations.

   I feel I need to clear some things up, though, before firing off with 'bullshit':

1) I developed no such theory; it was purely a facetious 'speculation' based upon a real-world observation. As usual, lack of inflection in writing causes these problems. I will annotate such comments in the future, seeing as the words 'I can only guess' don't appear to get that idea across.
2) DirectX 11 rendering paths have nothing to do with DX 9 benchmarks of DX 9 games just because they are run on DX11-compatible hardware.
3) I am only familiar with HL2. I don't know why you're asking why I can't run something I neither mentioned nor tried. But now that you bring it up: DX 11, like DX 10, was touted as being easier for compatible hardware to render and thus requiring less processing power. Why then, if you want to compare DX9 to 10 or 11, are the respective framerates much lower in these title add-ons, with even more powerful hardware at the same resolutions?

Thanks.
Have a nice day.


----------



## nt300 (Feb 23, 2011)

So we have a quad-channel DDR3 IMC for server/workstation CPUs and dual-channel DDR3 for desktop. That really sucks; AMD should have stuck with quad-channel support to further boost memory performance, which today they greatly lack.



> - Native DDR3-1866 Memory Support [8]
> - Dual Channel DDR3 Integrated Memory Controller (Support for PC3-15000 (DDR3-1866)) for Desktop, *Quad Channel DDR3* Integrated Memory Controller (support for PC-12800 (DDR3-1600) and Registered DDR3)[9] *for Server/Workstation* (New Opteron Valencia and Interlagos)


http://en.wikipedia.org/wiki/Bulldozer_(processor)#Microarchitecture
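For scale, the peak numbers behind those two configurations work out as follows (simple peak-rate arithmetic; real-world throughput is lower):

```python
# Peak-bandwidth comparison of the two Bulldozer IMC configurations quoted above
# (theoretical maxima: channels x transfer rate x 8-byte bus width per channel).

def peak_gbs(channels, transfer_rate_mt_s, bus_width_bytes=8):
    return channels * transfer_rate_mt_s * bus_width_bytes / 1000.0

desktop = peak_gbs(2, 1866.67)  # dual-channel DDR3-1866, ~29.9 GB/s
server = peak_gbs(4, 1600.0)    # quad-channel DDR3-1600, 51.2 GB/s
print(round(desktop, 1), round(server, 1))
```

So the server parts get roughly 70% more peak bandwidth than the desktop parts, which is why the dual-channel choice draws complaints even if client workloads rarely saturate it.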


----------



## JF-AMD (Feb 23, 2011)

So, Intel was at triple channel.  Performance in many cases showed no appreciable difference between dual channel and triple channel.  With SB they moved back to dual channel.

AMD is delivering dual channel with up to 50% greater throughput than current products. Isn't that a better option?

As to quad channel on the desktop, if triple channel was not clearly a product differentiator, why would quad be any better?  Sometimes people get caught up in the specs but they don't focus on the output.  

If dual channel and quad channel were about the same in throughput would you rather have dual channel with lower cost and lower power or quad with higher cost and higher power?


----------



## cdawall (Feb 23, 2011)

JF-AMD said:


> So, Intel was at triple channel.  Performance in many cases showed no appreciable difference between dual channel and triple channel.  With SB they moved back to dual channel.
> 
> AMD is delivering dual channel with up to 50% greater throughput than current products. Isn't that a better option?
> 
> ...



Is Sandy Bridge considered in the 50% greater throughput? Because it has already shown vast improvements over previous Core and Thuban products.


----------



## JF-AMD (Feb 23, 2011)

Beats me; I am a server guy. I have no idea what the throughputs look like on the client side.


----------



## cdawall (Feb 23, 2011)

JF-AMD said:


> beats me, I am a server guy, I have no idea what the throughputs look like on the client side.



Haha, OK, thanks for at least being honest with me.


----------



## AlphaGeek (Apr 22, 2011)

[H]@RD5TUFF said:


> More cache PLZ DAMMIT!
> 
> Also meh.
> 
> I do not think these will be the saving grace AMD needs.



The only thing that will allow AMD to compete and save them from their slump is developing a mainboard/CPU that can handle triple-channel RAM. And even that won't last now, since Intel has leaked rumors of quad-channel RAM!!!


----------



## Jack Doph (Apr 22, 2011)

AlphaGeek said:


> The only thing that will allow AMD to compete and save them from their slump is developing a mainboard/CPU that can handle triple-channel RAM. And even that won't last now, since Intel has leaked rumors of quad-channel RAM!!!



That's unlikely to make a real-world difference - hence the step Intel took in going back to dual-channel.
AMD already has quad-channel, just not for the desktop.
What would make a proper difference is if dual-channel could be coupled to a per-core setup.


----------



## AlphaGeek (Apr 22, 2011)

Jack Doph said:


> That's unlikely to make a real-world difference - hence the step Intel took in going back to dual-channel.
> AMD already has quad-channel, just not for the desktop.
> What would make a proper difference is if dual-channel could be coupled to a per-core setup.



This is very true. I would love to see anybody, AMD OR Intel, do a per-core setup with their RAM, dual, triple, or quad channel!!!


----------

