Friday, August 22nd 2008

First AMD Fusion Specifications Hint Toward RV710 Specs

AMD Fusion could well be the first CPU to feature a graphics processor core. It will incorporate a graphics processor with specifications identical to those of the RV710. The CPU will be based on the 45 nm silicon fabrication process and manufactured at the Taiwan Semiconductor Manufacturing Company (TSMC). The GPU will be called "Kong". Here are its specifications:
  • Core frequency of 600~800 MHz
  • 128-bit wide memory bus (DDR3, with SidePort support)
  • 40 Stream Processors
  • 8 TMUs, 4 ROPs
  • DirectX 10.1 support
  • UVD
The GPU will not connect to the rest of the processor using HyperTransport; instead, a new interface referred to as "Onion" will replace it. Keeping with the "green" naming scheme, the GPU will use a "Garlic" memory interface to enhance read/write performance and reduce latencies. Performance-wise, it is expected to perform 50% better than the RS780 chip. The addition of this GPU will increase power consumption by 5~8 W (load) and 0.4~0.6 W (idle). This is the first implementation of such a design methodology; AMD wants to step into the pool only after dipping its toes.
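For a sense of scale, here is a rough back-of-the-envelope sketch of what a 128-bit DDR3 bus could deliver; the DDR3-1333 effective data rate is an illustrative assumption, not a figure from the article:

```python
# Rough peak-bandwidth estimate for the "Kong" IGP's 128-bit DDR3 bus.
# The DDR3-1333 transfer rate is an assumption for illustration only;
# the article does not specify the memory speed.

BUS_WIDTH_BITS = 128        # from the spec list above
TRANSFERS_PER_S = 1333e6    # assumed DDR3-1333 effective rate (MT/s)

# Peak bandwidth = bus width in bytes * transfers per second
bandwidth_gb_s = (BUS_WIDTH_BITS / 8) * TRANSFERS_PER_S / 1e9
print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~21.3 GB/s
```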
Source: IT.com.cn

33 Comments on First AMD Fusion Specifications Hint Toward RV710 Specs

#1
From_Nowhere
This will be awesome for laptops and HTPCs.

As long as it doesn't smell like an onion or a clove of garlic.
#2
tkpenalty
From_Nowhere: This will be awesome for laptops and HTPCs.

As long as it doesn't smell like an onion or a clove of garlic.
Garlic = ring bus. This will definitely put the Intel Atom to shame, as the K8 architecture excels at lower clock speeds; moreover, there's an IGP that's equivalent to, what, today's low-end GPUs? It's faster than my old 9550 :(
#3
Mussels
Freshwater Moderator
garlic and onion? wtf?
#4
From_Nowhere
Mussels: garlic and onion? wtf?
To make the vampiric Intel cry and run away? Dunno, just nicknames.
#5
laszlo
so the future is here already; I expected this to happen after 2010, but it seems we won't need standalone GPUs in a few years; this is bad news for NVIDIA and also for ATI, because OEM-built PCs (not the expensive ones) have weak GPUs anyway, so those can be spared now ...
#6
btarunr
Editor & Senior Moderator
Mussels: garlic and onion? wtf?
= stinks when raw, yummy when cooked.

I'd say garlic = SidePort, so even before implementing DDR3, or on cheap boards without it, the DDR3 SidePort memory should make use of memory chips present on the motherboard.
#7
Mussels
Freshwater Moderator
I can see a few problems arising from this setup.

#1. Motherboard needs support - otherwise you have no outputs
#1.1. You're really screwed if you want more/different outputs (adding HDMI, etc.)
#2. What will happen if you add an unsupported video card (NVIDIA, for example)? Will it be able to be disabled?

bonus#. This will be fun for ITX and really tiny systems. Only one heatsink needed...
#8
Wile E
Power User
Mussels: I can see a few problems arising from this setup.

#1. Motherboard needs support - otherwise you have no outputs
#1.1. You're really screwed if you want more/different outputs (adding HDMI, etc.)
#2. What will happen if you add an unsupported video card (NVIDIA, for example)? Will it be able to be disabled?

bonus#. This will be fun for ITX and really tiny systems. Only one heatsink needed...
Or alternatively, the gfx core could be used strictly as a GPGPU device. It can offload physics and stuff like that. They would just need to make sure its driver registers it as a co-processor instead of a GPU in this situation; that way, if you wanted to add an NV card in Vista, you could retain some sort of functionality from the GPU core in the CPU and not have to worry about Vista disabling one of your gfx drivers.
#9
btarunr
Editor & Senior Moderator
Mussels: I can see a few problems arising from this setup.

#1. Motherboard needs support - otherwise you have no outputs
Yup.
Mussels: #1.1. You're really screwed if you want more/different outputs (adding HDMI, etc.)
Why? Isn't it the same with boards with onboard graphics? Aren't you equally screwed if you need, say, HDMI and the board gives you only D-Sub? (D-Sub - DVI - HDMI using MacGyver dongles won't work; you'd need wiring on a DVI port that conveys HDMI, as happens with, say, ATI cards that come with DVI-HDMI dongles.)
Mussels: #2. What will happen if you add an unsupported video card (NVIDIA, for example)? Will it be able to be disabled?
The same as what happens when you use one on a 780G board: nothing. Fusion gives out video through the board's connectors. When using ATI Hybrid Graphics, your monitor plugs into the connector on the board, not the card(s), so I'm not sure that on boards without display out(s) the output of Fusion would go to a graphics card. (I'm just guessing.)
Mussels: bonus#. This will be fun for ITX and really tiny systems. Only one heatsink needed...
Yes: ITX, notebooks... only ~0.5 W (for the graphics) at idle is awesome.
#10
tkpenalty
Mussels: I can see a few problems arising from this setup.

#1. Motherboard needs support - otherwise you have no outputs
#1.1. You're really screwed if you want more/different outputs (adding HDMI, etc.)
#2. What will happen if you add an unsupported video card (NVIDIA, for example)? Will it be able to be disabled?

bonus#. This will be fun for ITX and really tiny systems. Only one heatsink needed...
#1. Fusion uses a different socket anyway
#1.1 Read above
#2. Duh.
#11
Mussels
Freshwater Moderator
hey, I never stated they were facts or anything guys, just stating the obvious really.

Using a different socket is good, I wasn't aware of that.
#12
Tatty_Two
Gone Fishing
Wile E: Or alternatively, the gfx core could be used strictly as a GPGPU device. It can offload physics and stuff like that. They would just need to make sure its driver registers it as a co-processor instead of a GPU in this situation; that way, if you wanted to add an NV card in Vista, you could retain some sort of functionality from the GPU core in the CPU and not have to worry about Vista disabling one of your gfx drivers.
Yes, my thoughts... leading to the true concept of multiple hybrid GPUs in a single system, a more cost-effective way of enhancing performance... providing, of course, the CPUs and motherboards are not too expensive.
#13
Unregistered
I think these chips would need bigger coolers than a CPU alone, though; you have a hot CPU and a hot GPU in the same package.

Nice idea though.
#14
Mussels
Freshwater Moderator
tigger69: I think these chips would need bigger coolers than a CPU alone, though; you have a hot CPU and a hot GPU in the same package.

Nice idea though.
"The addition of this GPU will increase power consumption by 5~8 W (load) and 0.4~0.6 W (idle)"

When you factor in that the average CPU wattage is around 70 W (TDP) these days, it's basically irrelevant to temps.
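A quick sanity check of that point (the ~70 W TDP is the commenter's ballpark figure; the 5~8 W delta comes from the article above):

```python
# How much does a 5-8 W GPU add to an assumed ~70 W CPU TDP?
# 70 W is the commenter's ballpark, not an official specification.
cpu_tdp_w = 70.0
for gpu_delta_w in (5.0, 8.0):
    pct = gpu_delta_w / cpu_tdp_w
    print(f"+{gpu_delta_w:.0f} W on {cpu_tdp_w:.0f} W TDP = {pct:.0%} increase")
# Output:
# +5 W on 70 W TDP = 7% increase
# +8 W on 70 W TDP = 11% increase
```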
#15
WhiteLotus
This looks like a very interesting method of doing things, and I think it can dramatically improve the HTPC without doing much.

As for NVIDIA and ATI losing out on discrete graphics above^: ATI and AMD are the same company now, so ATI won't be losing out at all, and AMD will have a great selling point as well. This is something Intel can't do (I think), so AMD could be the sole provider of such chips to that market.
#16
laszlo
WhiteLotus: As for NVIDIA and ATI losing out on discrete graphics above^: ATI and AMD are the same company now, so ATI won't be losing out at all, and AMD will have a great selling point as well. This is something Intel can't do (I think), so AMD could be the sole provider of such chips to that market.
just my 2 cents

imagine a 22 nm chip with a dual-core CPU and an HD 4870; would you buy a discrete graphics card? Because this will happen, and we already have mobos with RAM; I don't think it's hard to put 1 GB of RAM on a mobo
#17
mdm-adph
laszlo: so the future is here already; I expected this to happen after 2010, but it seems we won't need standalone GPUs in a few years;
There pretty much always has been, and always will be, standalone video. :p There's always going to be a need for a discrete, ultra-powerful, dedicated video solution -- maybe it'll become more expensive and in the domain of workstations in the future, but it'll always be there.
#18
Unregistered
I love the names AMD comes up with: Venice, Brisbane, Shanghai, and now Garlic, Onion. Lol, let's see tomatoes and oranges and apples too.
#19
MilkyWay
What happens when you upgrade the CPU? Do you have to upgrade the GPU at the same time?

What happens when the GPU is outdated but the CPU is still okay, or the opposite?

These questions lead me to the conclusion that they only want to get rid of the IGP, not GPUs altogether.
#20
jydie
MilkyWay: What happens when you upgrade the CPU? Do you have to upgrade the GPU at the same time?

What happens when the GPU is outdated but the CPU is still okay, or the opposite?

These questions lead me to the conclusion that they only want to get rid of the IGP, not GPUs altogether.
I agree with your conclusion... This will probably be used in low cost (basic) systems, laptops, Home Theater Systems, etc. If you want to do some decent gaming, then you will need a good video card. The way I see it, this simply moves the GPU for integrated graphics from the motherboard to the CPU.

This will be great for laptops!! If you can focus most of the heat buildup to one area, then I would think that more efficient cooling methods would follow.
#22
MrMilli
AMD Fusion could well be the first CPU to feature a graphics core?
To name a few before Fusion: the Cyrix MediaGX (later AMD Geode), VIA Mark CoreFusion, and Intel Timna (cancelled, though)! So it's really not the first of its kind.

Fusion will probably be an MCM (multi-chip module) design, which would imply that only the GPU will be produced at TSMC and the CPU in AMD's own fabs.

To answer MilkyWay: Fusion will have PCI-E integrated, so you could add a discrete card.
#23
substance90
Too bad, I don't see this getting into netbooks and UMPCs! The VIA Nano is also better than the Atom, but still, do you know a single device using the Nano?
#24
Cuzza
btarunr: ...gives you only D-Sub? (D-Sub - DVI - HDMI using MacGyver dongles won't work; you'd need...
That's what I'm talking about! MacGyver can do anything! Sorry, this is totally off topic but:

#25
WarEagleAU
Bird of Prey
It's just awesome that they are pulling this off about two years before anyone thought. I'd like to see beefier CPUs with beefier GPUs, but this is a start. Imagine the power savings with these chips.