Wednesday, May 21st 2008

AMD Confirms GDDR5 for ATI Radeon 4 Series Video Cards

AMD today announced the first commercial implementation of Graphics Double Data Rate, version 5 (GDDR5) memory in its forthcoming next generation of ATI Radeon graphics card products. The high-speed, high-bandwidth GDDR5 technology is expected to become the new memory standard in the industry, and that same performance and bandwidth is a key enabler of The Ultimate Visual Experience, unlocking new GPU capabilities. AMD is working with a number of leading memory providers, including Samsung, Hynix and Qimonda, to bring GDDR5 to market.

Today's GPU performance is limited by the rate at which data can be moved on and off the graphics chip, which in turn is limited by the memory interface width and die size. The higher data rates supported by GDDR5 - up to 5x that of GDDR3 and 4x that of GDDR4 - enable more bandwidth over a narrower memory interface, which can translate into superior performance delivered from smaller, more cost-effective chips. AMD's senior engineers worked closely with industry standards body JEDEC in developing the new memory technology and defining the GDDR5 spec.
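As a rough sketch of that arithmetic, peak bandwidth is simply the per-pin data rate multiplied by the interface width. The Python below is only an illustration; the per-pin rates are assumptions derived from the multipliers quoted above, not published figures for any specific product.

    # Sketch of the bandwidth arithmetic behind the claim above. Per-pin
    # data rates are illustrative assumptions derived from the quoted
    # multipliers (GDDR5 ~5x GDDR3, ~4x GDDR4), not real product specs.

    def peak_bandwidth_gb_s(rate_gbps_per_pin: float, bus_width_bits: int) -> float:
        """Peak memory bandwidth in GB/s: per-pin rate times bus width, bits to bytes."""
        return rate_gbps_per_pin * bus_width_bits / 8

    BUS_WIDTH = 256  # interface width in bits, held constant for comparison

    for name, rate in [("GDDR3", 1.0), ("GDDR4", 1.25), ("GDDR5", 5.0)]:
        print(f"{name} at {rate} Gbps/pin on a {BUS_WIDTH}-bit bus: "
              f"{peak_bandwidth_gb_s(rate, BUS_WIDTH):.0f} GB/s")

    # At 5x the per-pin rate, GDDR5 needs only one-fifth of the bus width
    # for the same bandwidth -- hence "more bandwidth over a narrower
    # memory interface" from smaller, more cost-effective chips.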

"The days of monolithic mega-chips are gone. Being first to market with GDDR in our next-generation architecture, AMD is able to deliver incredible performance using more cost-effective GPUs," said Rick Bergman, Senior Vice President and General Manager, Graphics Product Group, AMD. "AMD believes that GDDR5 is the optimal way to drive performance gains while being mindful of power consumption. We're excited about the potential GDDR5 brings to the table for innovative game development and even more exciting game play."

The introduction of GDDR5-based GPU offerings continues AMD's tradition of technology leadership in graphics. Most recently, AMD was first to bring a unified shader architecture to market, first to support Microsoft DirectX 10.1 gaming, first to move to smaller process nodes like 55 nm, first to integrate HDMI with audio, and first to support double-precision floating-point calculation.

AMD expects that PC graphics will benefit from the increase in memory bandwidth for a variety of intensive applications. PC gamers will have the potential to play at high resolutions and image quality settings, with superb overall gaming performance. PC applications will have the potential to benefit from fast load times, with superior responsiveness and multi-tasking.

"Qimonda has worked closely with AMD to ensure that GDDR5 is available in volume to best support AMD's next-generation graphics products," said Thomas Seifert, Chief Operating Officer of Qimonda AG. "Qimonda's ability to quickly ramp production is a further milestone in our successful GDDR5 roadmap and underlines our predominant position as innovator and leader in the graphics DRAM market."

GDDR5 for Stream Processing
In addition to the potential for improved gaming and PC application performance, GDDR5 also holds a number of benefits for stream processing, where GPUs are applied to address complex, massively parallel calculations. Such calculations are prevalent in high-performance computing, financial and academic segments among others. AMD expects that the increased bandwidth of GDDR5 will greatly benefit certain classes of stream computations.

New error detection mechanisms in GDDR5 can also help increase the accuracy of calculations by identifying errors and re-issuing commands until valid data is received. This capability provides a level of reliability not available with other GDDR-based memory solutions today.
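The announcement doesn't detail the mechanism, but the general detect-and-retry idea can be sketched in a few lines of Python. This is a toy software stand-in: real GDDR5 computes a CRC per data burst in hardware and the memory controller replays failed transfers, and every function name below is hypothetical.

    # Toy illustration of GDDR5-style detect-and-retry: checksum each
    # transfer and re-issue it on a mismatch. In real hardware this is a
    # per-burst CRC checked by the memory controller; here it is plain
    # software with a deliberately flaky transfer function.
    import random
    import zlib

    def unreliable_transfer(data: bytes) -> bytes:
        """Stand-in for a memory transfer that occasionally flips a byte."""
        if random.random() < 0.3:
            i = random.randrange(len(data))
            data = data[:i] + bytes([data[i] ^ 0xFF]) + data[i + 1:]
        return data

    def read_with_retry(data: bytes, max_retries: int = 8) -> bytes:
        expected_crc = zlib.crc32(data)
        for _ in range(max_retries):
            received = unreliable_transfer(data)
            if zlib.crc32(received) == expected_crc:
                return received  # checksum matched: data is known-good
            # mismatch: an error was detected, so re-issue the transfer
        raise RuntimeError("persistent transfer errors")

    print(read_with_retry(b"framebuffer burst"))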
Source: AMD

135 Comments on AMD Confirms GDDR5 for ATI Radeon 4 Series Video Cards

#26
CDdude55
Crazy 4 TPU!!!
ATI is the only thing holding AMD together. They are lucky they bought them in time.
#27
ningen
Oi oi, I'm not a fanboy and don't want to instigate. I do remember the Radeon 9700, you know.

It's just the facts, though - you look at the specs and wow, GDDR5, wide bus... but the GeForces still win. Even if ATI takes one game, it loses in ten others. There are more driver problems with ATI too, and the pricing kinda shows what's what as well.
#28
Valdez
erocker: Not bribed. What I don't get is why ATI isn't getting together with developers a little more. Can Nvidia's "The Way It's Meant to Be Played" program really be legal? It would be like Ford getting together with OPEC and formulating a gasoline that only works well with Ford cars. Sure, it's smart of Nvidia to do it, however it hurts competition. The end result is that it hurts the consumer.
After the "bribing", ATI's presence isn't welcomed by the developers anymore ;) ATI doesn't have the money, so they're happy if the final version of the game at least gets tested on an ATI card...
#29
erocker
*
candle_86: So Nvidia supplies their compilers to a company so their product works well on Nvidia. AMD is more than welcome to do that, and they did in the past as ATI. AMD has never done this - how many titles have you seen say "runs best on AMD" besides the Far Cry 64-bit patch? AMD needs to get their compilers out so games can be optimized for their cards also; if they can't do that, then who cares. The Way It's Meant to Be Played program is a program for companies to get access to the compiler and receive sponsorship from Nvidia to help fund the game. AMD is always welcome to do it also, they just won't.
Exactly. Plus, AMD needs to kick themselves in the arse and get their marketing department going. As far as "running best on" goes, that's all just marketing hoopla. AMD should have plenty of money to "grease the palms" of game developers.
#30
candle_86
CDdude55: ATI is the only thing holding AMD together. They are lucky they bought them in time.
I'd have to say the Sempron and low-end X2 do a lot better for them; they are still in many OEM and office PCs sold, and are faster than the Intel lot in that price range. ATI is getting beaten in every price segment right now.
#31
candle_86
erocker: Exactly. Plus, AMD needs to kick themselves in the arse and get their marketing department going. As far as "running best on" goes, that's all just marketing hoopla. AMD should have plenty of money to "grease the palms" of game developers.
Agreed. A lot of "runs best on Intel" games back in the day did the opposite, but it just means they had the compiler for it. If AMD did this I might take notice, but all I see them doing is screwing up.
#32
Valdez
candle_86: So Nvidia supplies their compilers to a company so their product works well on Nvidia. AMD is more than welcome to do that, and they did in the past as ATI. AMD has never done this - how many titles have you seen say "runs best on AMD" besides the Far Cry 64-bit patch? AMD needs to get their compilers out so games can be optimized for their cards also; if they can't do that, then who cares. The Way It's Meant to Be Played program is a program for companies to get access to the compiler and receive sponsorship from Nvidia to help fund the game. AMD is always welcome to do it also, they just won't.
What you've written here is illogical... Do you have any source saying that AMD doesn't (want to) provide any help to developers?
#33
EastCoasthandle
CDdude55: Now I am confused. Should I get an 8800 series card like the 8800 GT, or one of the 4 series cards? I have a 680i SLI SE motherboard, so I don't know if an ATI card will work so well on my motherboard :confused:. Still can't wait for GDDR5 :)
Confused? Confused? Either you wait and see if the 4870 is indeed faster than Nvidia's current offering, or you remain loyal no matter what! There's nothing to be confused about at this point.
#34
jbunch07
erocker: Not bribed. What I don't get is why ATI isn't getting together with developers a little more. Can Nvidia's "The Way It's Meant to Be Played" program really be legal? It would be like Ford getting together with OPEC and formulating a gasoline that only works well with Ford cars. Sure, it's smart of Nvidia to do it, however it hurts competition. The end result is that it hurts the consumer. Oh, and I'm glad that ATI is at least using GDDR5, and furthering their advancement. Hopefully this will pay off well in the performance department.
Those were my thoughts... it doesn't seem right when Nvidia does that. :shadedshu
AMD/ATI doesn't do that AFAIK, but I'm sure it's just a marketing thing Nvidia has going...

Anyway, I'm glad to see that GDDR5 is confirmed! :)
#36
Valdez
Catalyst 8.5 is out btw :)
#37
PVTCaboose1337
Graphical Hacker
I am liking the GDDR5 route, and also the 28 people viewing this thread (must be important!). Good thing ATI will make a comeback.
#38
CDdude55
Crazy 4 TPU!!!
EastCoasthandle: Confused? Confused? Either you wait and see if the 4870 is indeed faster than Nvidia's current offering, or you remain loyal no matter what! There's nothing to be confused about at this point.
But if it is faster it will be more expensive. Also, I don't know how good ATI's drivers are on nForce chipset boards.
#39
Valdez
CDdude55: But if it is faster it will be more expensive. Also, I don't know how good ATI's drivers are on nForce chipset boards.
I'm using an HD 3870 with an nForce 570 Ultra, and I've never had problems with that ;)
#40
Millenia
CDdude55: But if it is faster it will be more expensive. Also, I don't know how good ATI's drivers are on nForce chipset boards.
But if it offers a major performance upgrade, it's a LOT cheaper and a better choice to use somewhat more expensive RAM than to build monolithic cards with gazillions of transistors, if it yields the same end result.

At least power consumption should be lower that way :p
#41
EastCoasthandle
CDdude55: But if it is faster it will be more expensive. Also, I don't know how good ATI's drivers are on nForce chipset boards.
You could have easily visited your motherboard's forum and gotten a good answer to that. Valdez's statement should key you in on what the deal is, IMO.
#42
[I.R.A]_FBi
I'm guessing GDDR5 is cheaper than a 512-bit bus?
#43
EastCoasthandle
[I.R.A]_FBi: I'm guessing GDDR5 is cheaper than a 512-bit bus?
Well, think of it like this:
You have a water-cooling setup and want to decide on the size of the tubing. You can go with 3/4" inner diameter tubing, but you run the risk of a slower flow rate due to the pump's barbs only being 1/2" and its power output (more/less). Or you can get tubing with a 7/16" inner diameter (slightly smaller than 1/2"), which should maximize your flow rate. I believe this is what the following means:
Today's GPU performance is limited by the rate at which data can be moved on and off the graphics chip, which in turn is limited by the memory interface width and die size. The higher data rates supported by GDDR5 - up to 5x that of GDDR3 and 4x that of GDDR4 - enable more bandwidth over a narrower memory interface, which can translate into superior performance delivered from smaller, more cost-effective chips.
If I am wrong, could someone clarify this? :o
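For what it's worth, that trade-off can be put in rough numbers. The chip I/O width and per-pin rates below are illustrative assumptions, not the specs of any actual parts:

    # Back-of-the-envelope look at "fast-narrow vs. slow-wide": equal
    # bandwidth, very different board complexity. All figures here are
    # illustrative assumptions, not the specs of any real card.

    CHIP_IO_BITS = 32  # a common I/O width per GDDR DRAM chip

    def board_config(bus_width_bits: int, rate_gbps_per_pin: float):
        chips = bus_width_bits // CHIP_IO_BITS  # DRAM chips needed to fill the bus
        bandwidth_gb_s = rate_gbps_per_pin * bus_width_bits / 8
        return chips, bandwidth_gb_s

    for label, width, rate in [("512-bit GDDR3 @ 2.0 Gbps/pin", 512, 2.0),
                               ("256-bit GDDR5 @ 4.0 Gbps/pin", 256, 4.0)]:
        chips, bw = board_config(width, rate)
        print(f"{label}: {chips} DRAM chips, {bw:.0f} GB/s")

    # Same 128 GB/s either way, but the narrower bus needs half the DRAM
    # chips, half the GPU memory pads, and far fewer PCB traces -- which
    # is why GDDR5 can work out cheaper than a 512-bit bus.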
#44
Kirby123
ningen: ATI's always better on paper - faster memory, clocks, DX 10.1 support... yet Nvidia cards are still better.
That is all opinion from my point of view. My X2 matches up to the 9800 GX2 easily; my max 3DMark06 score was higher than the 9800 GX2's, with the same parts, just different video cards.
The cards are built for gaming, and there isn't a demanding enough game to really tell which is better. Right now I beat Nvidia's 9800 series by 8 FPS at the same settings. It all comes down to the config of your PC
(I am only running one card right now, too).
#45
CDdude55
Crazy 4 TPU!!!
Kirby123: That is all opinion from my point of view. My X2 matches up to the 9800 GX2 easily; my max 3DMark06 score was higher than the 9800 GX2's, with the same parts, just different video cards.
The cards are built for gaming, and there isn't a demanding enough game to really tell which is better. Right now I beat Nvidia's 9800 series by 8 FPS at the same settings. It all comes down to the config of your PC
(I am only running one card right now, too).
That's weird considering the 9800 GX2 is more expensive. :eek:
#46
WarEagleAU
Bird of Prey
Looks like the 4 series will be mighty nice. I may need to sell my 3870 and buy a 4870. Of course, I'd love to CrossFire two of them, even though I don't have a need for it.
#47
Kirby123
Price doesn't matter; it's all about your setup. The 9800 GX2 and the 3-series X2 are basically equal, and I recommend the X2 over the 9800 GX2. Right now the best setup out there is the ATI CrossFireX quad-GPU setup. My friend from CSS got a 3DMark06 score of 45k... one main reason is the quad Extreme he has at 4.8 GHz with 8 GB of RAM. But his friend got the 9800 GX2 in SLI and his was only 40k. CrossFireX is the best thing out right now, from what I have seen at least.
#48
imperialreign
jbunch07: Those were my thoughts... it doesn't seem right when Nvidia does that. :shadedshu
AMD/ATI doesn't do that AFAIK, but I'm sure it's just a marketing thing Nvidia has going...

Anyway, I'm glad to see that GDDR5 is confirmed! :)
ATI has worked with developers as well, and usually, in the very, very few games where they've worked closely with developers, we tend to see the tables turned - i.e. Call of Juarez. Even games that are developed closely with ATI but still brandish the TWIMTBP logo run fairly equally between the two (i.e. FEAR, when it came out, and even up to today).

But those games are few and far between - Nvidia likes to pander to the poor, struggling game developers, who need the money.

ATI cards have proven capable of keeping up without "optimized" game code, which says a lot, IMO. If we were to scrap all hardware-specific optimization in game code, both companies would be on a level playing field.

The way I see it, though, I wouldn't be surprised if Nvidia is the next to come under antitrust investigation. They're still up to shady tactics, and odd coincidences pop up now and then - like Assassin's Creed and DX10.1... funny how much better ATI's 10.1-capable cards were running it compared to Nvidia's, and then DX10.1 gets removed?! :wtf:

Makes you wonder why Nvidia refused to support 10.1 to begin with, even though it's meant to run better than initial DX10... and possibly that refinement is why ATI chose to be the only hardware manufacturer to support it...
#49
CDdude55
Crazy 4 TPU!!!
ATI works a little with Valve; that's why their logo is in the pause menu of HL2:EPx. And Nvidia's logo is in Crysis. But it's not like it actually matters.
#50
ningen
You don't play 3DMark, so who cares. 9800 GX2 vs 3870 X2 tests (at least those I've seen) show that the cards are pretty much on par, but... ATI sometimes experiences heavy FPS drops with AA. Maybe drivers, maybe Nvidia optimizations... whatever. In the end, I'd rather play than indulge in conspiracy theories, just as I'd buy a card that works better for a variety of games instead of clinging to the few that my GPU shines in.

GDDR5 doesn't seem like something to drool over, either. I mean, isn't it like DDR3 RAM? Frequency increases and so do the timings, and in the end the performance increase is nothing much, really.
Same with graphics cards: we're up to GDDR5 already, but the cards will still develop at the usual, moderate pace. The 4 series is supposed to be what, up to 50% faster than the 3 series? Within the usual cycle, I'd say.
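On the frequency-vs-timings point, the arithmetic does bear that out for access latency; the kits below are illustrative typical retail parts, used only as an example:

    # Quick check of the frequency-vs-timings point: absolute CAS latency
    # is cycles divided by clock, so higher CAS counts at higher clocks
    # largely cancel out. Kit figures are illustrative, not a benchmark.

    def cas_latency_ns(cl_cycles: int, data_rate_mt_s: int) -> float:
        io_clock_mhz = data_rate_mt_s / 2       # DDR transfers twice per I/O clock
        return cl_cycles / io_clock_mhz * 1e3   # cycles / MHz = us, then to ns

    print(f"DDR2-800  CL5: {cas_latency_ns(5, 800):.2f} ns")   # 12.50 ns
    print(f"DDR3-1600 CL9: {cas_latency_ns(9, 1600):.2f} ns")  # 11.25 ns

    # Transfer rate doubles while first-word latency barely moves, so
    # bandwidth-hungry workloads gain and latency-bound ones see little.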