Thursday, April 24th 2008
ATI Radeon HD 4800 Series Video Cards Specs Leaked
Thanks to TG Daily we can now talk about the soon-to-be-released ATI HD 4800 series of graphics cards in more detail. One week ahead of its presumed release date, the general specifications of the new cards have been revealed. All Radeon 4800 graphics cards will use the 55nm RV770 GPU produced by TSMC, which includes over 800 million transistors, 480 stream processors or shader units (96+384), 32 texture units, 16 ROPs, a 256-bit memory controller (512-bit for the Radeon 4870 X2) and native GDDR3/4/5 support, as reported before. At first, AMD's graphics division will launch three new cards - Radeon HD 4850, 4870 and 4870 X2:
Source:
TG Daily
- ATI Radeon HD 4850 - 650MHz/850MHz/1140MHz core/shader/memory clock speeds, 20.8 GTexel/s (32 TMU x 0.65 GHz) fill-rate, available in 256MB/512MB of GDDR3 memory or 512MB of GDDR5 memory clocked at 1.73GHz
- ATI Radeon HD 4870 - 850MHz/1050MHz/1940MHz core/shader/memory clock speeds, 27.2 GTexel/s (32 TMU x 0.85 GHz) fill-rate, available in 1GB GDDR5 version only
- ATI Radeon HD 4870 X2 - unknown core/shader clock speeds, available with 2048MB of GDDR5 memory clocked at 1730MHz
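For reference, the quoted texel fill-rates are just texture units multiplied by core clock. A quick sketch (using only the leaked figures above) reproduces both numbers:

```python
# Texel fill-rate = texture unit count x core clock (GHz).
# Figures are from the leaked specs above; treat them as unconfirmed.
def texel_fillrate(tmus: int, core_ghz: float) -> float:
    """Return peak texel fill-rate in GTexel/s."""
    return tmus * core_ghz

print(texel_fillrate(32, 0.65))  # HD 4850: 20.8 GTexel/s
print(texel_fillrate(32, 0.85))  # HD 4870: 27.2 GTexel/s
```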
278 Comments on ATI Radeon HD 4800 Series Video Cards Specs Leaked
All in all you have to take the article with a grain of salt. I mean, it came from 158 pages of support tickets without any real system info. That's inconclusive in any book.
forums.techpowerup.com/showthread.php?p=766039#post766039
I'm still running Crossfire 1950XTX.
I've never had driver problems
I have never run across a game yet that I can't max out the graphics and get 70+ fps.
(Not counting Crysis....it is an unoptimized pos that should have been scrapped)
I did work for Dell as a technician. Nvidia and Microsoft fought for over a year about who was supposed to foot the bill for rewriting the Nvidia Vista drivers. They absolutely sucked during that time.
Another point is that many prebuilt systems never report their crashes to MS; they are reported to their respective manufacturers, which are never counted in the polls... and are 80% Nvidia.
The Nvidia/Microsoft fiasco over the Vista drivers set Nvidia back quite a ways in Vista compatibility.
Another thing to consider is that ATITool is far more involved at tuning ATI cards and offers far more options for us ATI fanboys than any Nvidia overclocking tool, and that alone creates far more crashes than would normally happen if we were content to just use the ATI control panel. And most ATI users do use ATITool for tweaking their graphics.
Before this thread completely goes to hell: those specs have a few areas of focus that would lead some to think the cards are at least twice as fast as their 3800 counterparts. The ROP count kills that thought right off. The cards will still pwn, but if the architecture had allowed for 24 or 32 ROPs, they would have been out of this world. I do have to commend them for fixing the TMU mess, as that screwed us over more than anything else :rockout:
So yeah, Nvidia's support is so much better... yet most of their beta drivers don't support all the cards that use the same chip (G92), and they are NOT official drivers, they are BETA.
Personally I have an 8800 GT, and really, I wish the 3870 had been out when I got my card, because that's what I would have, and I wouldn't have had to reinstall x64 Windows 4 times to get the system usable.
Beyond that, please let's get back on topic. This thread is not about drivers but ATI's Radeon 4 line.
What will matter is how competitive ATI's pricing is. I'm betting we will see a major price drop from ATI on the RV770 in Q3 2008 with the release of GT200 products.
Look how many stupid fools bought GF4 MX cards thinking "it's a GeForce 4, it's gotta be good".
The VE was bad for games, but it played DVDs very well, and OLD games were OK. Hell, the MPEG decoding on them really helped slower systems play DVDs for old-school HTPC/MPC builds :) At least the HD 2000 cards are capable of doing what they advertise, even if they lose performance to AA and such.
The FX line CAN'T play DX9 games worth a damn. I know it's the one thing Nvidia did that truly ticked me off: sold me a top-of-the-line "DX9" card that turned out to be utterly unable to play DX9 games... unless you like a 4 fps slideshow...
meh, i hope that the 4800's turn out to be kickass :)
No more FX / HD 2000 discussion. Barring the HD 2900 series, HD 2000 was as much a hollow promise to consumers as GeForce FX was.
But TBH I think this topic needs to be locked, as it has gone way off topic.
www.computerbase.de/artikel/hardware/grafikkarten/2008/test_asus_radeon_hd_3850_x2/20/#abschnitt_performancerating_qualitaet
Check out 2560x1600... ATI is short on TMUs, not ROPs.
480 - 80 = 400
320 - 80 = 240
We have 50% more shaders, but that translates to 66% more free shaders, and this gap will widen as more shaders are added. IMO dedicated hardware (more ROPs) is still better, but ATI is going to improve, that's for sure. 66% more shaders clocked 35% higher (1050 MHz / 775 MHz) translates to roughly 125% more performance:
P x (400/240) x (1050/775) ≈ 2.26 P
Interestingly, we get more or less the same improvement in the texture mapping area: double the units, clocked a bit higher:
Texture fill-rate -> TFR x 2 x (850/775) ≈ 2.19 TFR
I guess they are aiming at ~2.2x the performance of the HD3 series.
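The back-of-the-envelope estimate above can be reproduced in a few lines of Python. All figures come from the leaked specs, and the 80 "reserved" shaders are the poster's own assumption, not a confirmed number:

```python
# Rough shader-throughput scaling, HD 4870 (480 SP @ 1050 MHz) vs.
# HD 3870 (320 SP @ 775 MHz). Assumes 80 shaders are tied up with
# fixed work, per the post above; treat all inputs as unconfirmed.
RESERVED = 80

def scaling(new_shaders, old_shaders, new_mhz, old_mhz):
    """Relative shader throughput after subtracting reserved shaders."""
    unit_ratio = (new_shaders - RESERVED) / (old_shaders - RESERVED)
    clock_ratio = new_mhz / old_mhz
    return unit_ratio * clock_ratio

shader_gain = scaling(480, 320, 1050, 775)  # (400/240) * (1050/775)
texture_gain = 2 * 850 / 775                # double the TMUs, higher clock
print(f"shaders: {shader_gain:.2f}x, texturing: {texture_gain:.2f}x")
# prints "shaders: 2.26x, texturing: 2.19x"
```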
And I really hope we are right and ATI's new generation comes with a 100% improvement or greater, since according to leaked specs Nvidia's chip IS going to be twice as fast as G92 - it's double everything.
We need Ati back on the high-end market.
1./ Broadly similar performance at standard resolutions (e.g. 1280x1024) with no AA/FSAA, since there are no architectural changes
2./ General improvement in line with clock-for-clock increases 10-20%
3./ The increase to 32 TMUs will mean the cards won't CHOKE at higher resolutions. They will be able to handle 1920x1200 without hitting a wall
4./ Currently you can dial up 4x AA without any performance hit. With the extra shaders you can do the same at 1920x1200 now
5./ With the extra shaders, you will be able to dial up 8x or 16x AA at 1280x1024 without a significant hit.
6./ The GPU will run hotter and require more power
7./ Compensated by using GDDR5 memory that will require less power and run a bit cooler
Net net... get the GDDR5 model.
Will there be a "jump" in performance like we saw between the x19xx series and hd38xx? No.