Monday, December 17th 2007

AMD Claims First “Swift” Fusion Processor Due in Second Half 2009

Advanced Micro Devices was once again unsure about exactly when it can release its highly anticipated code-named Fusion processor during its meeting with financial analysts on Thursday. Based on the latest indications from the world's second-largest x86 chipmaker, the products, which combine general-purpose and graphics cores, will slip to the second half of 2009. The concept chip that combines general-purpose and graphics computing capabilities, usually known as Fusion, is now called an Accelerated Processing Unit (APU), according to a presentation by Mario Rivas, executive vice president of the computing solutions group at AMD. "I am happy to announce the birth of a new category, the Accelerated Processing Units. The 'new AMD' now has access to excellent IP on CPUs, excellent IP on graphics processing units and second to none chipsets. The integration of all these parts and our uniqueness - customer centric innovation - create the APU," said Mr. Rivas.

The first APU, which is "on track to market in 2H 2009", is code-named Swift and features two or three general-purpose x86 cores based on AMD's new-generation micro-architecture (the same used in Phenom processors), a graphics core based on an "existing high-end discrete" design (possibly the ATI Radeon HD 3800), a DDR3 memory controller and a PCI Express bus controller. The chip will be made using 45nm process technology.

"The first APU platform is code-named Swift. It gives you the choice of technologies for high-confidence volume production ramp. We want to re-use as much [IP] as possible to accelerate our quality [qualification] and time to market. So, we have an AMD Stars CPU core, the graphics core that is based on the present high-end discrete GPU core and leverages the North Bridge that is presently found in Griffin, the CPU of the Puma platform. It will be our second 45nm generation product, so the maturity of the [production technology] will be proven. It is done on the current SOI design rules, which is the process that we know how to build on very well," Mr. Rivas explained.

Initially the company indicated that Fusion processors "are expected in late 2008/early 2009", and it anticipated using them across all of the chipmaker's "priority computing categories", including laptops, desktops, workstations and servers, as well as in "consumer electronics and solutions tailored for the unique needs of emerging markets". A little later the company said that the first generation of Fusion chips would be aimed at laptops and that production would start in early 2009. This time AMD claims that the actual chips will reach the market only in the second half of 2009, which may mean the product will not launch commercially until Q4 2009. Still, the company said it is minimizing all risks and hopes to deliver the product on time.

"By optimizing the choice of IP blocks we have less risks and faster time to market in the second half of 2009," claimed , executive vice president of computing solutions group at AMD.
Source: X-bit Labs

12 Comments on AMD Claims First “Swift” Fusion Processor Due in Second Half 2009

#1
Dangle
This is EXACTLY what we've been waiting for from AMD+ATI!!! Let's just hope it's a high-performance chip.
#2
Polaris573
Senior Moderator
theonetruewill: Edit on title required. AMD, not MD. After correction I will delete this post.
Thanks, and no need to delete the post if you don't want to. Please correct me as quickly as possible when I make mistakes.
#4
MilkyWay
Look, really, this is a mixed bag. To me these processors would only be good in a laptop/notebook, because you can't change the GPU in a laptop/notebook, so combining them is a good idea to save space and power. Then there are the media PCs that use an IGP, so combining the GPU/CPU also seems like a good idea there, if it's good enough for media.

Then there are normal PCs, where this sucks because we need performance. If we wanted to upgrade the CPU it would also change the GPU, and if we wanted a new GPU we'd have to change the CPU as well. Lastly, there's the oddity of having a graphics card alongside a GPU/CPU, which seems like a waste of having both. The only thing I could see is the graphics card working with the integrated GPU to increase performance, but then why would we need a CPU that can do graphics processing?

Seems a bit stupid and gimmicky to me, as well as the heat and power usage of that thing.
BTW, I have an AMD processor; I just don't want it to turn into another 3dfx.
#5
Dangle
MilkyWay: this is stupid because if we wanted to upgrade the cpu it also changes the gpu
Yeah, but look at the bright side. If you want to upgrade your PC next year, you just buy a CPU/GPU, and you're golden. What if the prices come down a lot on these as well? PCs can be smaller and cooling them will be cheaper. Liquid cooling will be a snap! Also, there's no reason these couldn't be as powerful as a separate GPU. Look at physics, for example - Ageia PhysX cards are crap if you've got a decent CPU. I think this is an awesome idea and I'm certain this is the future of computing.
#6
mdm-adph
Rambotnic: What is IP?
It means "Intellectual Property." It's a more politically correct way of saying "copyright" today.
#7
InnocentCriminal
Resident Grammar Amender
I'm going to piss on most bonfires here (without intent), but don't expect Fusion to be blisteringly quick. Imagine a budget CPU with the capabilities of the current X1250 or slightly better, as the CPU and GPU are going to be forced to share access to your rig's RAM, unlike today's dedicated cards. I doubt we'll see them handle high resolutions with anisotropic filtering and anti-aliasing.

If developers are keen to get their fingers in the pie, the Fusion platform could be an extremely efficient and programmable piece of technology, one capable of reconfiguring itself on the fly for everything we want it to do, from conventional high-def decoding to 3D graphics and physics.

I'm really interested in the idea of Fusion as I'm extremely keen on SFFs (small form factor PCs). I think these could be hugely popular if they're implemented correctly and supported by software devs, not just AMD. I honestly think that if DAMMIT can get interest/backing from more devs than just Valve, then DX10+ class graphics could very well be a possibility at a later date. Honestly, in a few years I doubt we'll have separate CPUs and GPUs - they'll be the same thing.
#8
Wile E
Power User
MilkyWay: Look, really, this is a mixed bag. To me these processors would only be good in a laptop/notebook, because you can't change the GPU in a laptop/notebook, so combining them is a good idea to save space and power. Then there are the media PCs that use an IGP, so combining the GPU/CPU also seems like a good idea there, if it's good enough for media.

Then there are normal PCs, where this sucks because we need performance. If we wanted to upgrade the CPU it would also change the GPU, and if we wanted a new GPU we'd have to change the CPU as well. Lastly, there's the oddity of having a graphics card alongside a GPU/CPU, which seems like a waste of having both. The only thing I could see is the graphics card working with the integrated GPU to increase performance, but then why would we need a CPU that can do graphics processing?

Seems a bit stupid and gimmicky to me, as well as the heat and power usage of that thing.
BTW, I have an AMD processor; I just don't want it to turn into another 3dfx.
I don't think the GPUs on these are intended to be used only for graphics processing. In a laptop, maybe, but in more serious computers or servers I think they'll be used to improve floating-point calculations (where these GPUs excel), and perhaps for specialized apps programmed to take advantage of the GPU core's strengths. I don't think we need to worry about enthusiast-class computers losing their dedicated GPUs any time soon.
#9
btarunr
Editor & Senior Moderator
Let's hope it rolls out soon. We're tired of waiting for that one "magical chip" from AMD.
#10
effmaster
Could this have anything to do with why AMD recently announced no new set of graphics cards until 2009? Seems a bit of a coincidence, huh? AMD knows what it's doing, I think. :rolleyes: The real question is whether they can survive long enough for their plan to be implemented. ;);)
#11
panchoman
Sold my stars!
Polaris573: based on AMD's new-generation micro-architecture (the same that is used in Phenom processors)
Yeah, okay, I thought the whole new Firestream platform and stuff was gonna be totally kickass, but I stopped reading after that part... AMD is screwed.
#12
kwchang007
If implemented right, maybe, just maybe, this'll work. Brilliant for laptops, servers, HTPCs, and low-powered, office-oriented desktops. For the performance-oriented desktop, this is going to suck.