Thursday, April 24th 2008

ATI Radeon HD 4800 Series Video Cards Specs Leaked

Thanks to TG Daily we can now talk about the soon-to-be-released ATI HD 4800 series of graphics cards in more detail. One week ahead of its presumed release date, general specifications of the new cards have been revealed. All Radeon HD 4800 cards will use the 55nm TSMC-produced RV770 GPU, which includes over 800 million transistors, 480 stream processors or shader units (96+384), 32 texture units, 16 ROPs, a 256-bit memory controller (512-bit for the Radeon 4870 X2) and native GDDR3/4/5 support, as reported before. At first, AMD's graphics division will launch three new cards - Radeon HD 4850, 4870 and 4870 X2:
  • ATI Radeon HD 4850 - 650MHz/850MHz/1140MHz core/shader/memory clock speeds, 20.8 GTexel/s (32 TMU x 0.65 GHz) fill-rate, available in 256MB/512MB of GDDR3 memory or 512MB of GDDR5 memory clocked at 1.73GHz
  • ATI Radeon HD 4870 - 850MHz/1050MHz/1940MHz core/shader/memory clock speeds, 27.2 GTexel/s (32 TMU x 0.85 GHz) fill-rate, available in 1GB GDDR5 version only
  • ATI Radeon HD 4870 X2 - unknown core/shader clock speeds, available with 2048MB of GDDR5 memory clocked at 1730MHz
The 4850 256MB GDDR3 version will arrive as the successor of the 3850 256MB with a price in the sub-$200 range. The 4850 512MB GDDR3 should retail for $229, while the 4850 512MB GDDR5 will set you back about $249-269. The 1GB GDDR5 powered 4870 will retail between $329-349. The flagship Radeon HD 4870 X2 will ship later this year for $499.
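For readers who want to sanity-check the leak, the fill-rate figures quoted above follow directly from the unit counts and clocks, and the memory clocks give a ballpark for bandwidth. A quick back-of-the-envelope sketch (our own arithmetic, not TG Daily's, and it assumes the quoted memory clock is the effective per-pin data rate, which may not be how the leak counts GDDR5):

```python
# Back-of-the-envelope figures from the leaked specs. Assumption (not from
# the article): the quoted memory clock is the effective per-pin data rate.

def texel_fillrate(tmus, core_mhz):
    """Texel fill rate in GTexel/s."""
    return tmus * core_mhz / 1000

def pixel_fillrate(rops, core_mhz):
    """Pixel fill rate in GPixel/s."""
    return rops * core_mhz / 1000

def mem_bandwidth(data_rate_mhz, bus_bits):
    """Memory bandwidth in GB/s."""
    return data_rate_mhz * bus_bits / 8 / 1000

cards = {
    "HD 4850 (GDDR5)": dict(core=650, mem=1730, bus=256),
    "HD 4870 (GDDR5)": dict(core=850, mem=1940, bus=256),
}

for name, c in cards.items():
    print(name,
          f"{texel_fillrate(32, c['core']):.1f} GTexel/s,",
          f"{pixel_fillrate(16, c['core']):.1f} GPixel/s,",
          f"~{mem_bandwidth(c['mem'], c['bus']):.0f} GB/s")
# HD 4850 (GDDR5) 20.8 GTexel/s, 10.4 GPixel/s, ~55 GB/s
# HD 4870 (GDDR5) 27.2 GTexel/s, 13.6 GPixel/s, ~62 GB/s
```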
Source: TG Daily

278 Comments on ATI Radeon HD 4800 Series Video Cards Specs Leaked

#176
yogurt_21
Mussels: Nvidia has a 71% share in discrete graphics. That is a fact, although not one that has been linked to in this thread. NV would have more errors by sheer numbers alone. I got a bit annoyed earlier, but it's not a fact that can really be argued - yeah, Intel has more than Nvidia, but Nvidia has more than AMD - I don't see people choosing Intel video because it crashes less.
I never argued the discrete market share; I argued that applying a 71% market share to Nvidia in that article was ridiculous because, as others have shown, when Intel is in the equation, Nvidia doesn't have anywhere near that amount. And the article didn't even say it focused on graphics, it just reported the number of crashes due to drivers; AMD/ATI, Intel, Nvidia and VIA all have drivers for motherboards as well as drivers for graphics. This will also include any TV tuners or other add-on devices any of the above companies make that have separate drivers. You also have to remember that the specs of the systems were not shown, meaning any Joe Schmoe who wanted to upgrade his nForce motherboard with an AMD Athlon CPU to Vista is factored in there. You also have to factor in any user who decided to download an Nvidia Quadro or ATI FireGL driver for their GeForce or Radeon, or any users who tried to install the wrong motherboard drivers.

All in all, you have to take the article with a grain of salt. I mean, it came off of 158 pages of support tickets without any real system info. That's inconclusive in any book.
#178
grndzro
GSG-9: They have been doing it since right after the 9800... I love ATI, but come on, let's be realistic. You can't just say they're not going to; there's at least a feasible chance they will.
Who the hell are you kidding?
I'm still running Crossfire 1950XTX.
I've never had driver problems
I have never run across a game yet that I can't max out the graphics and get 70+ fps.
(Not counting Crysis....it is an unoptimized pos that should have been scrapped)

I did work for Dell as a technician. Nvidia and Microsoft fought for over a year about who was supposed to foot the bill for rewriting the Nvidia Vista drivers. They absolutely sucked during that time.
Another point is that many prebuilt systems never report their crashes to MS; they are reported to their respective companies, which are never counted in the polls... and are 80% Nvidia.
The Nvidia/Microsoft fiasco over the Vista drivers put Nvidia quite a ways behind on Vista compatibility.

Another thing to consider is that ATI-Tool is far more complex at tuning ATI cards and offers far more options for us ATI fanboys than any Nvidia overclocking tool, and that alone creates far more crashes than would normally happen were we content to just use the ATI control panel. And most ATI users do use ATI-Tool for tweaking their graphics.
#179
HTC
grndzro: Who the hell are you kidding?
I'm still running Crossfire 1950XTX.
I've never had driver problems
I have never run across a game yet that I can't max out the graphics and get 70+ fps.
(Not counting Crysis....it is an unoptimized pos that should have been scrapped)

I did work for Dell as a technician. Nvidia and Microsoft fought for over a year about who was supposed to foot the bill for rewriting the Nvidia Vista drivers. They absolutely sucked during that time.
Another point is that many prebuilt systems never report their crashes to MS; they are reported to their respective companies, which are never counted in the polls... and are 80% Nvidia.
The Nvidia/Microsoft fiasco over the Vista drivers put Nvidia quite a ways behind on Vista compatibility.

Another thing to consider is that ATI-Tool is far more complex at tuning ATI cards and offers far more options for us ATI fanboys than any Nvidia overclocking tool, and that alone creates far more crashes than would normally happen were we content to just use the ATI control panel. And most ATI users do use ATI-Tool for tweaking their graphics.
By any chance, did you bother to read post #180? You know: the post right before yours?
#180
Megasty
HTC: By any chance, did you bother to read post #180? You know: the post right before yours?
Of course not :rolleyes:

Before this thread completely goes to hell: those specs seem to have a few areas of focus that would cause some to think that they are at least twice as fast as their 3800 counterparts. The ROP count kills that thought right off. The cards will still pwn, but if the architecture had allowed for 24 or 32 ROPs, they would have been out of this world. I do have to commend them for fixing the TMU mess, as that screwed us over more than anything else :rockout:
#181
BumbRush
Mussels: both have. repeatedly.

Aimed at no one in particular:
ANYTHING YOU SAY ABOUT NV OR ATI SUCKING CAN BE APPLIED EQUALLY TO THE OTHER ONE. PLEASE STOP THIS REPETITIVE FANBOI CRAP.
But at least ATI didn't put out an FX-class line of cards :P
#182
GSG-9
BumbRush: But at least ATI didn't put out an FX-class line of cards :P
Nope, ATI released the 'VE' series (Radeon 7000)
#183
BumbRush
imperialreign: nVidia pushes out 2+ beta drivers a month. Usually they only have one alpha release a month. They're on par with ATI; only difference is that ATI doesn't release beta drivers left and right like nVidia does - instead, they rely heavily on feedback crews, and consumer feedback (us) for driver development. If there's an issue they're trying to resolve, we typically see either a hotfix or a beta release.

Now, if we start calling beta drivers "official" driver releases - then yeah, I'll definitely admit that nVidia releases more drivers than ATI does.



And saying that ATI is lucky to see monthly driver releases anymore is absolutely ridiculous - and you know that, man - ATI has been following the same 1 official driver release per month schedule since, what? 2004/2005? We all know round about when the next driver is rolling out, there's no guessing or hoping involved. If there was any evidence that ATI would start cutting back to quarterly or bi-monthly driver releases, we would've seen or heard evidence of that already.

I understand there's a debate going on, but in the heat of a debate one's comments can start coming across as very fanboyish - not calling you a fanboi, newtekie1, but IMO that quote on the driver releases very much sounded that way.
Initial release: January 19, 2004 (Ver. 4.1, Pkg. ver. 7.97)

So yeah, Nvidia's support is so much better... yet most of their beta drivers don't support all the cards that use the same chips (G92), and they are NOT official drivers, they are BETA.

Personally I have an 8800 GT, and really, I wish the 3870 had been out when I got my card, because that's what I would have, and I wouldn't have had to reinstall x64 Windows 4 times to get the system usable.
#184
eidairaman1
The Exiled Airman
HTC: By any chance, did you bother to read post #180? You know: the post right before yours?
I didn't have problems with drivers until the 7.8s and higher; once the hotfix came out I switched to those and had no problems.

Beyond that, please let's get back on topic; this topic is not about drivers but ATi's Radeon 4 line.
#185
btarunr
Editor & Senior Moderator
BumbRush: But at least ATI didn't put out an FX-class line of cards :P
They put out HD2000 series instead :P
#186
flashstar
On Wikipedia it says that the GT200 has 1000 GFLOPS of processing power. The RV770 is estimated to have 1000 as well. It appears that the current situation between ATI and Nvidia will not change much, with the exception of ATI being out first this time in the contest between GT200 and RV770.

What will matter is how competitive ATI's pricing is. I'm betting that we will see a major price drop from ATI on the RV770 in Q3 2008 with the release of GT200 products.
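For what it's worth, the leaked numbers land right around that figure if you assume each stream processor can issue one multiply-add (2 FLOPs) per clock, as on the HD 2000/3000 series ALUs - a rough sketch, not an official spec:

```python
# Rough peak-shader-throughput estimate from the leaked specs. Assumption
# (not in the post): each stream processor retires one multiply-add
# (2 FLOPs) per clock, as on the HD 2000/3000 series ALUs.

def peak_gflops(stream_processors, shader_ghz, flops_per_clock=2):
    return stream_processors * flops_per_clock * shader_ghz

print(peak_gflops(480, 1.05))    # leaked HD 4870: ~1008 GFLOPS
print(peak_gflops(480, 0.85))    # leaked HD 4850: ~816 GFLOPS
print(peak_gflops(320, 0.775))   # HD 3870, for comparison: ~496 GFLOPS
```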
#187
BumbRush
GSG-9: Nope, ATI released the 'VE' series (Radeon 7000)
The VE and SE cards are kinda like the MX line: just weak cards for people who want cheap.

Look how many stupid fools bought GF4 MX cards thinking "it's a GeForce 4, it's gotta be good".

The VE was bad for games, but it played DVDs very well, and OLD games were OK; hell, the MPEG decoding on them really helped slower systems play DVDs in old-school HTPC/MPC systems :)
btarunr: They put out HD2000 series instead :P
At least the HD2000 cards are capable of doing what they advertise, even if they lose performance to AA and such.

The FX line CAN'T play DX9 games worth a damn. I know, it's the one thing Nvidia did that truly ticked me off: they sold me a top-of-the-line "DX9" card that turned out to be utterly unable to play DX9 games... unless you like a 4fps slideshow...

Meh, I hope the 4800s turn out to be kickass :)
#188
btarunr
Editor & Senior Moderator
In the same way, I should be able to play any DX10 game at appropriate settings on any HD2000 series card. I can't play Crysis on even the lowest setting on an HD2400 Pro... a 4fps slideshow... which is what I don't like.
#189
BumbRush
The 2400 is NOT a gaming card though, just like the 8400 isn't a gaming card; they are made for business and work/video-playback systems. Wanting to play any game other than some 10-year-old stuff on a low-end "value" card is like wanting to use a Geo Metro to tow a 24-foot boat ;)
#190
btarunr
Editor & Senior Moderator
BumbRush: The 2400 is NOT a gaming card though, just like the 8400 isn't a gaming card; they are made for business and work/video-playback systems. Wanting to play any game other than some 10-year-old stuff on a low-end "value" card is like wanting to use a Geo Metro to tow a 24-foot boat ;)
I will use your logic and equate the HD2400 to the FX 5200 (which, in its line, couldn't play DX9 games).

No more FX / HD2000 discussion. Barring the HD2900 series, the HD2000 line was as much a hollow promise to consumers as GeForce FX was.
#191
eidairaman1
The Exiled Airman
The 5200 could run NFSU and U2 fine at normal settings, it just couldn't pump graphics; every card has its niche.

But TBH I think this topic needs to be locked, as it has gotten way off track.
#192
MrMilli
Megasty: Of course not :rolleyes:

Before this thread completely goes to hell: those specs seem to have a few areas of focus that would cause some to think that they are at least twice as fast as their 3800 counterparts. The ROP count kills that thought right off. The cards will still pwn, but if the architecture had allowed for 24 or 32 ROPs, they would have been out of this world. I do have to commend them for fixing the TMU mess, as that screwed us over more than anything else :rockout:
Ati really doesn't need more ROP's because they do AA in the shaders (unlike nVidia). The 3870 already proves that since it's more competitive at higher resolutions.
www.computerbase.de/artikel/hardware/grafikkarten/2008/test_asus_radeon_hd_3850_x2/20/#abschnitt_performancerating_qualitaet
Check out 2560x1600 ... ATI is short on TMU's not ROP's.
#193
Megasty
MrMilli: Ati really doesn't need more ROP's because they do AA in the shaders (unlike nVidia). The 3870 already proves that since it's more competitive at higher resolutions.
www.computerbase.de/artikel/hardware/grafikkarten/2008/test_asus_radeon_hd_3850_x2/20/#abschnitt_performancerating_qualitaet
Check out 2560x1600 ... ATI is short on TMU's not ROP's.
The choking point in the 3800 series was definitely the TMUs. The relatively poor performance of the 3870 came from only having 16 TMUs. The card would fly when you first started gaming and slow to a crawl when you entered an environment with a lot going on. The 3870 X2 nearly solved that problem while showing the ability of a card that has 32 total TMUs and ROPs, although they work across 2 GPUs. With the doubling of the TMUs, ATI has tackled the problem head-on. They still have half the TMUs of current Nvidia cards, which they tried to offset with a ridiculous number of basic shaders, but that's an architecture issue, isn't it :toast:
#194
DarkMatter
MrMilli: Ati really doesn't need more ROP's because they do AA in the shaders (unlike nVidia). The 3870 already proves that since it's more competitive at higher resolutions.
www.computerbase.de/artikel/hardware/grafikkarten/2008/test_asus_radeon_hd_3850_x2/20/#abschnitt_performancerating_qualitaet
Check out 2560x1600 ... ATI is short on TMU's not ROP's.
And I think the architecture is only going to get better with coming generations in that respect. Since AA is done in the shaders, it takes up X shading power. Let's do a bold speculation based on the performance hit when AA is enabled and say it takes up 80 SPs on the HD3870 at a given resolution (25% of its power; it's just to give an example). At the same resolution the HD4 series is going to need the same power, 80 SPs, but the difference is:

480 - 80 = 400
320 - 80 = 240

We have 50% more shaders, but that translates to 66% more free shaders, and this will go up as we add shaders. IMO dedicated hardware (more ROPs) is still better, but Ati is going to improve, that's for sure. 66% more shaders clocked 35% higher (1050Mhz / 775 Mhz) translates to 125% more performance:

P x 1,66 x 1,35 = 2,249 P

Interestingly, we have more or less the same improvement in the texture mapping area, double the units clocked a bit higher:

Texture fillrate -> TFR x 2 x 850/775 = 2,19 TFR

I guess they are aiming at ~2,20X the performance of HD3 series.

And I really hope we are correct and Ati's new generation comes with a 100% improvement or greater, since according to the leaked specs Nvidia's chip IS going to be twice as fast as G92, since it doubles everything.

We need Ati back on the high-end market.
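Spelled out as a quick script (the 80-SP AA cost and the 775/1050 MHz shader clocks are the assumptions from the post above, not confirmed specs):

```python
# DarkMatter's speculation, spelled out. The 80-SP AA cost and the
# 775/1050 MHz shader clocks are assumptions from the post, not specs.

AA_COST_SPS = 80

hd3870_free = 320 - AA_COST_SPS            # 240 SPs left for rendering
hd4870_free = 480 - AA_COST_SPS            # 400 SPs left for rendering

shader_gain = hd4870_free / hd3870_free    # ~1.67x usable shaders
clock_gain = 1050 / 775                    # ~1.35x shader clock
print(round(shader_gain * clock_gain, 2))  # ~2.26x shader throughput (the post rounds to ~2.25x)

texture_gain = 2 * 850 / 775               # twice the TMUs at a slightly higher clock
print(round(texture_gain, 2))              # ~2.19x texture fill rate
```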
#195
mandelore
DarkMatter: And I think the architecture is only going to get better with coming generations in that respect. Since AA is done in the shaders, it takes up X shading power. Let's do a bold speculation based on the performance hit when AA is enabled and say it takes up 80 SPs on the HD3870 at a given resolution (25% of its power; it's just to give an example). At the same resolution the HD4 series is going to need the same power, 80 SPs, but the difference is:

480 - 80 = 400
320 - 80 = 240

We have 50% more shaders, but that translates to 66% more free shaders, and this will go up as we add shaders. IMO dedicated hardware (more ROPs) is still better, but Ati is going to improve, that's for sure. 66% more shaders clocked 35% higher (1050Mhz / 775 Mhz) translates to 125% more performance:

P x 1,66 x 1,35 = 2,249 P

Interestingly, we have more or less the same improvement in the texture mapping area, double the units clocked a bit higher:

Texture fillrate -> TFR x 2 x 850/775 = 2,19 TFR

I guess they are aiming at ~2,20X the performance of HD3 series.

And I really hope we are correct and Ati's new generation comes with a 100% improvement or greater, since according to the leaked specs Nvidia's chip IS going to be twice as fast as G92, since it doubles everything.

We need Ati back on the high-end market.
then double that up for the 4870x2 :eek: :nutkick:
#196
newtekie1
Semi-Retired Folder
grndzro: Another thing to consider is that ATI-Tool is far more complex at tuning ATI cards and offers far more options for us ATI fanboys than any Nvidia overclocking tool, and that alone creates far more crashes than would normally happen were we content to just use the ATI control panel. And most ATI users do use ATI-Tool for tweaking their graphics.
Have you ever even used Rivatuner? It is far more complicated than ATItool.
#197
lemonadesoda
DarkMatter: ... lots of calcs leading to the conclusion of 2.2x performance.
Given the same architecture, higher clocks, and more shaders, I think these are the performance implications:

1./ Broadly similar performance at standard resolutions, e.g. 1280x1024, with no AA/FSAA effects, since there are no architectural changes
2./ General improvement in line with the clock-for-clock increases, 10-20%
3./ The increase to 32 TMUs will mean that the cards won't CHOKE at higher resolutions. They will be able to handle 1920x1200 without hitting the wall
4./ Currently you can dial up 4x AA without any performance hit. With the extra shaders you can now do the same at 1920x1200
5./ With the extra shaders, you will be able to dial up 8x or 16x at 1280x1024 without a significant hit.
6./ The GPU will run hotter and require more power
7./ Compensated by using GDDR5 memory that will require less power and run a bit cooler

Net net... get the GDDR5 model.

Will there be a "jump" in performance like we saw between the x19xx series and hd38xx? No.
#198
flashstar
lemonadesoda: Will there be a "jump" in performance like we saw between the x19xx series and hd38xx? No.
That is where I have to disagree with you. If performance doesn't "jump", ATI will fail. Then AMD will be very vulnerable to a buyout from some other company and then who knows what will happen. ATI knows that it has to be at least on par with the GT200.
#199
eidairaman1
The Exiled Airman
Beyond shaders, ROPs and TMUs, there is the matter of basic transistor density.
DarkMatter: And I think the architecture is only going to get better with coming generations in that respect. Since AA is done in the shaders, it takes up X shading power. Let's do a bold speculation based on the performance hit when AA is enabled and say it takes up 80 SPs on the HD3870 at a given resolution (25% of its power; it's just to give an example). At the same resolution the HD4 series is going to need the same power, 80 SPs, but the difference is:

480 - 80 = 400
320 - 80 = 240

We have 50% more shaders, but that translates to 66% more free shaders, and this will go up as we add shaders. IMO dedicated hardware (more ROPs) is still better, but Ati is going to improve, that's for sure. 66% more shaders clocked 35% higher (1050Mhz / 775 Mhz) translates to 125% more performance:

P x 1,66 x 1,35 = 2,249 P

Interestingly, we have more or less the same improvement in the texture mapping area, double the units clocked a bit higher:

Texture fillrate -> TFR x 2 x 850/775 = 2,19 TFR

I guess they are aiming at ~2,20X the performance of HD3 series.

And I really hope we are correct and Ati's new generation comes with a 100% improvement or greater, since according to the leaked specs Nvidia's chip IS going to be twice as fast as G92, since it doubles everything.

We need Ati back on the high-end market.
#200
eidairaman1
The Exiled Airman
flashstar: That is where I have to disagree with you. If performance doesn't "jump", ATI will fail. Then AMD will be very vulnerable to a buyout from some other company and then who knows what will happen. ATI knows that it has to be at least on par with the GT200.
What put them behind schedule was the 2900 line; the 3800 came about due to the power draw of the 2900. Many drivers later, the 2900 is a good card if you have the power to run it. The Radeon 4 series is on schedule according to ATi.