# AMD Demonstrates Graphics Processing Power of Llano Fusion APUs



## btarunr (Jun 2, 2010)

AMD demonstrated its first Fusion APU (accelerated processing unit), a "fusion" of a processor and a graphics processor. The first such processor in the works, codenamed Llano, is built on 32 nm silicon fabrication technology and fuses a quad-core processor with a DirectX 11 compliant GPU. AMD's Rick Bergman showed off a wafer of Llano APUs, but the demonstration didn't stop there. Bergman surprised the press by demonstrating the APU running Aliens vs. Predator in DirectX 11 mode at a reasonable level of detail. Find a video of the demo at the source.



 



*View at TechPowerUp Main Site*


----------



## Imsochobo (Jun 2, 2010)

exceeded my expectations!


----------



## mastrdrver (Jun 2, 2010)

Awesome sweetness!


----------



## Imsochobo (Jun 2, 2010)

mastrdrver said:


> Awesome sweetness!



I expect these will rock the world of notebooks! God, I can't wait now!
Want specs, specs, specs; if it's got 320 shaders it's going to be very, very good!


----------



## caleb (Jun 2, 2010)

Name is very "original". The first thing that popped into my mind was Arithmetic Processing Unit


----------



## v12dock (Jun 2, 2010)

wow very nice


----------



## btarunr (Jun 2, 2010)

caleb said:


> Name is very "original". The first thing that popped into my mind was Arithmetic Processing Unit



That component is called an ALU, not an APU. ALU stands for Arithmetic Logic Unit.


----------



## caleb (Jun 2, 2010)

Yep it is.


----------



## Fourstaff (Jun 2, 2010)

1. AM3?
2. All-in-one, or in the style of the i5-661?
3. How much?
4. Crossfire?


----------



## btarunr (Jun 2, 2010)

Fourstaff said:


> 1. AM3?
> 2. All-in-one, or in the style of the i5-661?
> 3. How much?
> 4. Crossfire?



Not AM3.
Yes.
An entire range, up to high-end.
Dunno.


----------



## kaneda (Jun 2, 2010)

I like this. 

me like this long time


----------



## Imsochobo (Jun 2, 2010)

btarunr said:


> Not AM3.
> Yes.
> An entire range, up to high-end.
> Dunno.



Add: built on K10.5, not Bulldozer.


----------



## Yellow&Nerdy? (Jun 2, 2010)

Wow, that's a lot more than I expected. Will definitely revolutionize the notebook market. Let's hope that Bulldozer exceeds our expectations too.


----------



## _JP_ (Jun 2, 2010)

OK, I'm confused... a "fusion" between a GPU and a CPU... results in an APU... and what exactly is that?

The i5 has the CPU core physically divided from the GPU core... does this one have that too?
I've got to look into this.

And HA! I knew it! New socket!


----------



## Imsochobo (Jun 2, 2010)

_JP_ said:


> OK, I'm confused... a "fusion" between a GPU and a CPU... results in an APU... and what exactly is that?
> 
> The i5 has the CPU core physically divided from the GPU core... does this one have that too?
> I've got to look into this.
> ...



Bulldozer will come on AM3, if rumors are true.

Of course they have to dump AM3; the NB will be built in, plus a GPU, and current chipsets can't handle those things.


It's all one die, not like the i5. Intel will also be ready, although most likely a tad later and a lot slower (we all know Intel on the GPU performance side).


----------



## AsRock (Jun 2, 2010)

_JP_ said:


> OK, I'm confused... a "fusion" between a GPU and a CPU... results in an APU... and what exactly is that?
> 
> The i5 has the CPU core physically divided from the GPU core... does this one have that too?
> I've got to look into this.
> ...



Intel's CPU core sits next to the GPU core on the package. With AMD, they're both combined as one piece of silicon.


----------



## theubersmurf (Jun 2, 2010)

_JP_ said:


> And HA!, I knew it! New socket!


Finally, they should have come up with a new socket when they started supporting DDR3.


----------



## slyfox2151 (Jun 2, 2010)

They did, it was called AM3


----------



## theubersmurf (Jun 2, 2010)

slyfox2151 said:


> They did, it was called AM3


The AM3 socket isn't any different than the AM2/AM2+ socket, it's got a new designation to indicate it supports DDR3.


----------



## _JP_ (Jun 2, 2010)

theubersmurf said:


> The AM3 socket isn't any different than the AM2/AM2+ socket, it's got a new designation to indicate it supports DDR3.



Yeah, it only has 2 more pins than 939 and one more than AM2/AM2+.

That is good news! The fact that it's only one die is good for cooling purposes and price.


----------



## Frick (Jun 2, 2010)

theubersmurf said:


> Finally, they should have come up with a new socket when they started supporting DDR3.



AMD keeping its sockets around is a great thing.


----------



## TVman (Jun 2, 2010)

So it's like the Cell CPU?


----------



## Atom_Anti (Jun 2, 2010)

Awesome, I just canceled buying my new laptop, I will wait for it!


----------



## Zubasa (Jun 2, 2010)

_JP_ said:


> Yeah, it only has 2 more pins than 939 and one more than AM2/AM2+.
> 
> That is good news! The fact that it's only one die is good for cooling purposes and price.


Actually, no.
The AM3 socket has 938 pins, which is 2 fewer than the 940 of AM2(+).


----------



## wahdangun (Jun 2, 2010)

Wow, I hope mobo makers can use Hydra on one of these babies so this great power never goes to waste (if we want a more powerful GPU).


----------



## Techtu (Jun 2, 2010)

I was thinking of buying a new CPU soon, but now that I've seen this, I might just wait another couple of months so the prices of the Phenom II's come down a little more 

Great to see AMD finally putting their backbone in something though


----------



## theubersmurf (Jun 2, 2010)

Frick said:


> AMD keeping its sockets around is a great thing.


Admittedly, I forgot about their having removed pins for the AM3 CPUs. But in all honesty, when the AM3s were released, having them use the same socket caused more problems than it solved; I recall tons of people having problems upgrading. They should also switch to something like a land grid array from pins. Yes, keeping the same socket makes for an inexpensive upgrade, but I don't think people's confusion and the related difficulties were worth it. Just my opinion.


----------



## theubersmurf (Jun 2, 2010)

Tech2 said:


> I was thinking of buying a new CPU soon, but now that I've seen this, I might just wait another couple of months so the prices of the Phenom II's come down a little more
> 
> Great to see AMD finally putting their backbone in something though


My suspicion is that this isn't going to be an enthusiast architecture, more of an almost integrated system.


----------



## Lionheart (Jun 2, 2010)

Looks pretty awesome, hope it does well


----------



## theubersmurf (Jun 2, 2010)

The Computex presentation in question. Haven't had a chance to watch it yet.


----------



## nascasho (Jun 2, 2010)

Yeah yeah... but will it run Cry... _*gets shot*_


----------



## _JP_ (Jun 2, 2010)

Zubasa said:


> Actually, no.
> The AM3 socket has 938 pins, which is 2 fewer than the 940 of AM2(+).



Well, actually no.
The AM3 *socket* has 941 pins, it's the AM3 *processors* that have 938 pins.
They made it like this due to compatibility.
If you still don't believe me, go count them.


----------



## Trigger911 (Jun 2, 2010)

_JP_ is right; the AM2 has 1 more pin than AM3. This is pretty sweet news; I can't wait for a few years to pass and see how this goes.


----------



## a_ump (Jun 2, 2010)

Cool, I like seeing AMD beat Intel to it. Bet Intel was "SON OF A BITCH!!!! We should have been doing this instead of Larrabee!!!"


----------



## EastCoasthandle (Jun 2, 2010)

Just think about it for a minute: a world that no longer needs discrete graphics solutions. I think that may happen if both AMD and Intel push for this as the only solution for OEMs. Here's an example:
someone buys a Dell that has the same graphical power as your home-built, top-of-the-line PC at 2/3 the cost (well, I can only hope that those APUs, or whatever they decide to call them, don't cost that much). That would change most enthusiasts' opinions on how they decide to build their PCs. Who is going to pay (hopefully) more for a separate CPU and video card when they can get the combination of both in one package at a cheaper price? Heck, even if having both comes at slightly slower performance.

Now I'm not talking about 1st gen APUs either.


----------



## a_ump (Jun 2, 2010)

True... but upgradability would be the deciding factor in that, and there's no telling how much a future APU may cost, so you can't depend on that; discrete graphics cards, however, are predictable on price.


----------



## Techtu (Jun 2, 2010)

Personally, I think AMD have hit the nail on the head here, and we will see them surpass Intel in the coming years, unless Intel simply builds a better version.


----------



## a_ump (Jun 2, 2010)

Tech2 said:


> Personally, I think AMD have hit the nail on the head here, and we will see them surpass Intel in the coming years, unless Intel simply builds a better version.



That's the big problem with Intel, though: they have no *great* GPU experience or knowledge, not compared to what AMD got when merging with ATI. So the chances of Intel taking the lead in the APU market... very slim, IMO.


----------



## Techtu (Jun 2, 2010)

But... this only means one thing: who has Intel got to "team up" with if they are going to compete with AMD on this road? I see the only option being Nvidia, which may lead to great things. I'm just hoping AMD keeps trumps on this one.


----------



## koorosh (Jun 2, 2010)

Yeah AMD rocks! I knew it



EastCoasthandle said:


> Just think about it for a minute, a world that no longer needs discrete graphics solutions.  I think that may happen if both AMD and Intel both push for this as the only solution for OEMs.



Of course we'll still have the old classified market segments. After all, all of this is to sell us products.


----------



## lyndonguitar (Jun 2, 2010)

This would be great for small PCs and laptops.


----------



## theubersmurf (Jun 2, 2010)

Tech2 said:


> But... this only means one thing: who has Intel got to "team up" with if they are going to compete with AMD on this road? I see the only option being Nvidia, which may lead to great things. I'm just hoping AMD keeps trumps on this one.


They're probably not teaming up any time soon. Only if this puts financial pressure on both do I think they would lay down their arms and join forces.


----------



## ArmoredCavalry (Jun 2, 2010)

Nice, I wouldn't be surprised to see combined GPU/CPUs take over the market eventually.

Having a separate "video card" can't last forever. Just look at the sound card.

Eventually, I think CPUs will just get much better at generalized tasks, including graphics processing.

Then you will have a CPU processing video/sound, so what comes next? Maybe physics cards really are the business to be in... maybe...


----------



## Deleted member 24505 (Jun 2, 2010)

If they don't kick out too much heat, they might be OK for lappys.


----------



## EastCoasthandle (Jun 2, 2010)

ArmoredCavalry said:


> Nice, I wouldn't be surprised to see combined GPU/CPUs take over the market eventually.
> 
> Having a separate "video card" can't last forever. Just look at the *NIC and hard drive controller* cards.


Fix'd!
But to be honest, that's true. You can't expect to have discrete video cards forever. Eventually that will be consolidated into a smaller, more efficient package of some kind. If AMD/Intel don't do it, someone else will. If that were the future, we would see something like this being more popular for laptops, for example, IMO.

Look at the market now: we don't need these big discrete VCs for iPods, iPads, touch-sensitive phones, etc. The market is moving away from that to something smaller and more efficient.


----------



## suraswami (Jun 2, 2010)

One word - 'awesome'


----------



## Kitkat (Jun 2, 2010)

The lowest Bulldozer should end AM3, along with "Phenom III", a shrunk-down Phenom II that we already know is coming.

Even though he said what game it could power, I'm sure the intention is just to show it's very capable; I'm sure it won't be directly marketed at gaming. "Oh, and it can even play ______."


Here is video: http://pc.watch.impress.co.jp/docs/news/event/20100602_371640.html


Also shows off their IE APU acceleration.


EDIT: AND AN XGP!!!!


----------



## wahdangun (Jun 2, 2010)

So this is why AMD grabbed the PhysX co-founder; maybe we will see rapid development of the Stream SDK.


----------



## Imsochobo (Jun 2, 2010)

theubersmurf said:


> Admittedly, I forgot about their having removed pins for the AM3 CPUs. But in all honesty, when the AM3s were released, having them use the same socket caused more problems than it solved; I recall tons of people having problems upgrading. They should also switch to something like a land grid array from pins. Yes, keeping the same socket makes for an inexpensive upgrade, but I don't think people's confusion and the related difficulties were worth it. Just my opinion.



The problem was: you stuck the CPU in the socket.
You boot up the PC.
Maybe it detects it as an eng sample.
Go to the mobo webpage.
DL the BIOS.
Flash.

Solved.
This is the same if there is a new chip for an LGA1156 platform; doesn't matter, you still need the same update.
So that's not a valid statement, in my opinion. I stuff AM3 CPUs into AM2 boards and they work. Yes, AM2, with AGP and nForce 3 (good old ASRock); it worked as easily as any other CPU upgrade.


----------



## theubersmurf (Jun 2, 2010)

tigger said:


> If they don't kick out too much heat, they might be OK for lappys.


If you watch the video I linked to, they talk about versions as low as 25 watts.


----------



## a_ump (Jun 2, 2010)

25 watts is quite a bit for a CPU; however, if it can pull off the performance of, say, a Mobility HD 4670 or 5770, then that's damn good. The detail in that screenshot looks good, but I can't tell how smoothly it was running.


----------



## EastCoasthandle (Jun 2, 2010)

I wouldn't say it's a lot for an APU solution. How much power does it take to run an equivalent CPU-and-GPU solution?


----------



## theubersmurf (Jun 2, 2010)

Imsochobo said:


> The problem was: you stuck the CPU in the socket.
> You boot up the PC.
> Maybe it detects it as an eng sample.
> Go to the mobo webpage.
> ...


I didn't actually have problems with it myself; it's the number of others who couldn't seem to figure out how to get an AM2 or AM2+ CPU to work in an AM3 mobo (since they already had a chip), or didn't bother to flash, or flashed badly, or whatever.


----------



## theubersmurf (Jun 2, 2010)

EastCoasthandle said:


> I wouldn't say it's a lot for a APU solution.  How much power does it take to power an equivalent CPU and GPU solution?


I seem to recall my Athlon X2 5000+ being about 89 watts or similar? Maybe it was 125, but I haven't used it in a while. I think 25 watts is easily something a lithium-ion laptop battery could sustain, though I think that figure was targeted at netbooks, not so much laptops.

They said they wanted four-core x86 processing, but didn't say how many SPs are on the die, so I'm not sure.


----------



## Imsochobo (Jun 2, 2010)

a_ump said:


> 25 watts is quite a bit for a CPU; however, if it can pull off the performance of, say, a Mobility HD 4670 or 5770, then that's damn good. The detail in that screenshot looks good, but I can't tell how smoothly it was running.



Let me remind you:
Quad-core: 15 W
GPU: 5 W
Chipset: 5 W

Well, then we're down to an 8-lane PCI-E chipset from AMD, a single core at 1.6 GHz, and no GPU has that TDP.

It's basically impossible with a dedicated GPU. A 5770 is too high; a 4670 is more the right direction, maybe slightly less due to memory performance.
Rather excited. Even though I'm not a huge laptop fan, I would like to see better performance on laptops. Intel has really improved, but AMD is going to bring it to another level with GPU performance.
OpenCL may really kick off if Intel manages to bring out "acceptable" performance at least in terms of GPGPU; 100 gigaflops would be a huge improvement for an app when the part is there anyway.
The only loser is Nvidia; GF9400-class parts may get really obsolete! (Huge profits there!)

And for high-end PCs the GPU part could be used for OpenCL, so no matter what it's useful. PhysX on the GPU-on-the-CPU in high-end PCs, although Nvidia wouldn't let us do that anyway.


----------



## WarEagleAU (Jun 2, 2010)

AMD, that is just sweet news: DX11 on a chip, especially if it can run the game like that at decent frame rates. I definitely would want a desktop version (HTPC) and a notebook to boot.


----------



## manchesterutd81 (Jun 2, 2010)

I wish I knew if this will support DDR3 or DDR5.

I'm ready for DDR5... and I hope AMD will jump to it for the next gen of motherboards and processors...


----------



## F1reFly (Jun 3, 2010)

Considering video cards get upgraded far more often and more easily, I doubt AMD will ever want to completely kill off the discrete video card business. I suspect the graphical power of these will remain at the lower end for many years.


----------



## kaneda (Jun 3, 2010)

TVman said:


> So it's like the Cell CPU?



If you mean the Cell Broadband Engine, no, not in the slightest.

It should be thrown in with the high-performance computing components, though.


Maybe~


----------



## pr0n Inspector (Jun 3, 2010)

Am I getting this right: this only needs to connect to RAM, VRAM, and the southbridge?


----------



## FordGT90Concept (Jun 3, 2010)

_JP_ said:


> OK, I'm confused... a "fusion" between a GPU and a CPU... results in an APU... and what exactly is that?


Auxiliary Power Unit. 

They need a better acronym like CPUWG (Central Processing Unit With Graphics).




manchesterutd81 said:


> I wish I knew if this will support DDR3 or DDR5.
> 
> I'm ready for DDR5... and I hope AMD will jump to it for the next gen of motherboards and processors...


I imagine it uses the main system RAM just like every other IGP.


----------



## CounterZeus (Jun 3, 2010)

This is very interesting; even cheap laptops could get decent performance. My mate's NV7000M just sucks... he can barely run Warcraft 3 with a mobile Athlon X2 and that crap IGP.
If Intel and Nvidia teamed up, we'd have a serious CPU/GPU war on our hands (hopefully with better prices for us customers).


----------



## Frick (Jun 3, 2010)

CounterZeus said:


> This is very interesting; even cheap laptops could get decent performance. My mate's NV7000M just sucks... he can barely run Warcraft 3 with a mobile Athlon X2 and that crap IGP.
> If Intel and Nvidia teamed up, we'd have a serious CPU/GPU war on our hands (hopefully with better prices for us customers).



I assume it's an Intel IGP. They do suck.


----------



## HalfAHertz (Jun 3, 2010)

I'm really interested in the memory configuration of this. Will it have two memory controllers, one for the CPU part and one for the GPU part? Will the GPU have on-board memory like AMD's current IGPs? So many questions come to mind...


----------



## a_ump (Jun 3, 2010)

It'd be sweet/innovative if they shared the memory controller. That would definitely show real integration right there.


----------



## HalfAHertz (Jun 3, 2010)

It would be great if we could see a 256-bit bus split 50/50 between the two, and as someone else said earlier maybe DDR5 - but I highly doubt that. The easiest thing to do is have dedicated GDDR5 sideport memory on the motherboard and then dual-channel DDR3 for the CPU.
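For rough perspective on that split, peak theoretical bandwidth is just bus width in bytes times transfer rate. The figures below are purely hypothetical examples (dual-channel DDR3-1333 for the CPU side vs. a narrow 64-bit GDDR5 sideport at 4000 MT/s), not anything AMD has announced:

```python
# Peak theoretical memory bandwidth: (bus width in bytes) * (transfer rate).
def bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: float) -> float:
    """Peak bandwidth in GB/s for a given bus width (bits) and rate (MT/s)."""
    return bus_width_bits / 8 * transfer_rate_mts / 1000

# Hypothetical figures, chosen only to illustrate the comparison:
ddr3 = bandwidth_gbs(128, 1333)           # two 64-bit DDR3-1333 channels
gddr5_sideport = bandwidth_gbs(64, 4000)  # 64-bit GDDR5 sideport at 4 GT/s

print(f"dual-channel DDR3-1333: {ddr3:.1f} GB/s")        # ~21.3 GB/s
print(f"64-bit GDDR5 sideport:  {gddr5_sideport:.1f} GB/s")  # 32.0 GB/s
```

Even a narrow sideport can out-run the shared system memory, which is why sideport memory keeps coming up in these discussions.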


----------



## Trigger911 (Jun 3, 2010)

Frick said:


> I assume it's an Intel IGP. They do suck.



I think he said it was an AMD system, so he's probably got an Xpress 200-series IGP.


----------



## a_ump (Jun 3, 2010)

I doubt it; probably more along the lines of an actual discrete card's GPU, like an HD 5550 or something along those lines. Just a guess.


----------



## PCpraiser100 (Jun 3, 2010)

I wonder what OCing would be like???


----------



## Techtu (Jun 3, 2010)

PCpraiser100 said:


> I wonder what OCing would be like???



That's actually something no one questioned until now... and now you've really got me thinking... Would overclocking even be something we'd be able to do on these chips? Lots and lots of questions, and only time will tell.


----------



## theubersmurf (Jun 3, 2010)

Tech2 said:


> That's actually something no one questioned until now... and now you've really got me thinking... Would overclocking even be something we'd be able to do on these chips? Lots and lots of questions, and only time will tell.


In the presentation, they talk about "Black" editions that would be unlocked for enthusiasts. But I wonder about it myself. Two different clock speed sections on a die? How would that work?


----------



## mastrdrver (Jun 4, 2010)

Talking about how many shaders this thing has: rumors are pretty consistent that the max will be around 400. Basically, a 5570 on a CPU die (since it has the slower DDR3 instead of GDDR5 like the 5670). Combine that with up to a quad-core Phenom minus the L3 and you have Llano, or what was just demoed.

AM3+ (aka AM3r2) is probably coming when this thing debuts, or with Bulldozer. There is a slide that has been posted on several sites that shows it. This is probably what the 8-series chipsets have been designed for.

Also, anyone remember the talk about quad-channel AMD CPUs? If they start pushing up the size of the GPU on the die with the CPU, you're going to need a wider bus or else you'll end up with a bandwidth-limited APU. Although the GPU is supposed to become more integrated, like how the integer processing unit was 10+ years ago.


----------



## erocker (Jun 4, 2010)

theubersmurf said:


> Two different clock speed sections on a die? How would that work?



It'll work. They basically do it now with the CPU and the memory controller.


----------



## AsRock (Jun 4, 2010)

Frick said:


> I assume it's an Intel IGP. They do suck.



"Suck"? That depends on how you look at it. Laptops should be kept for non-gaming, and when they make them for gaming, you have to expect them to blow up due to the crappy cooling a laptop can have. Just a way for them to make money off people by claiming they can cope with the heat they produce.

Now if you had it in a PC, that's a different matter.


----------



## HalfAHertz (Jun 4, 2010)

I guess you can always set the frequency of the GPU as a multiple of the HT speed?


----------



## theubersmurf (Jun 4, 2010)

erocker said:


> It'll work. They basically do it now with the CPU and the memory controller.


I guess that's true, hadn't even thought about that.


----------



## Soylent Joe (Jun 4, 2010)

caleb said:


> Name is very "original". The first thing that popped into my mind was Arithmetic Processing Unit



The first thing that popped into my mind when I saw the game screenshot and him holding that big disc was Alien Processing Unit. Might as well be alien; this thing looks sweet.


----------



## GSG-9 (Jun 4, 2010)

HalfAHertz said:


> I guess you can always set the frequency of the GPU as a multiple of the HT speed?



I guess they would go from having an FSB:RAM ratio to an FSB:RAM:Graphics type of ratio; one more variable.
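Something like this sketch, assuming (purely hypothetically, with made-up multiplier values) that every domain hangs off AMD's customary 200 MHz reference clock via its own multiplier:

```python
# Hypothetical clock-domain sketch: each block derives its frequency from
# one reference clock via its own multiplier, so raising the reference
# clock moves every domain at once, while per-domain multipliers allow
# each block to be tuned independently.
REF_CLOCK_MHZ = 200.0  # AMD's customary reference clock

# Made-up multipliers, for illustration only.
multipliers = {"cpu": 16.0, "memory": 6.67, "gpu": 3.5}

def domain_clocks(ref_mhz: float) -> dict:
    """Effective clock (MHz) per domain for a given reference clock."""
    return {name: ref_mhz * mult for name, mult in multipliers.items()}

stock = domain_clocks(REF_CLOCK_MHZ)  # cpu -> 3200.0 MHz at stock
overclocked = domain_clocks(220.0)    # raising the reference lifts all domains
```

In this scheme, overclocking one part in isolation just means changing that domain's multiplier, which is roughly how CPU and memory-controller clocks are already handled.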


----------

