
ATI Radeon HD 3800 Series Specs, Photos and Logos

Argoon

New Member
Joined
Oct 21, 2007
Messages
2 (0.00/day)
the card can be used as a dedicated physics unit, just like the R600 in multi crossfire setups

Hi all. Mandelore, you forgot to mention that the game has to support HavokFX from Havok, and ONLY HavokFX, not the Havok 5 engine, for this to work, and so far I know of no game that does.

And any PS 3.0 card can accelerate HavokFX.
 
Joined
May 12, 2006
Messages
491 (0.07/day)
Location
A small crate floating in the Pacific..
Processor Athlon64 X2 3800+ @ 2.6Ghz
Motherboard BioStar TForce4 SLI (939)
Cooling Stock Opteron Heatpipe Cooler
Memory 4x512 Crucial BalistiX, @ DDR520, 2.5-3-3-8
Video Card(s) ATI, x1900 AIW (600/620)
Storage 80GB SATA2, 320GB eSATA2
Display(s) AGM 19" Widescreen LCD, 4ms
Case Aspire X-Dreamer (Black)
Audio Device(s) Auzentech: X-Meridan (Best Soundcard Ever)
Power Supply Rosewill 550W 2x12V
Software XP Professional X64 & Ubuntu 7.10 x86
I think this is GAY... what is wrong with AMD? Why would they move to a new series when they pretty much just gained some steam with the HD 2K series...

AMD is hoping to beat the GF 8000 series with these higher-clocked cards, that's all. Is it right? No. Are they doing it anyway? Yes.

The trouble is we can all be mad about how misleading it is, but at the end of the day, will that stop us from buying if the price/performance is good? No.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
There are going to be an awful lot of mid/mid-high cards on the market to choose from in a couple of months; it will be interesting to see how they all fit into a pricing strategy:

8800GTS 320 G80
8800GTS 640 G80
8800GT G92 256
8800GT G92 512
8800GTS 640 with extra pipes (112) G80

etc etc

2900pro
2950pro
2900xt
2950xt???
2952.5 proX :D

I am lost already :cry:
 

JC316

Knows what makes you tick
Joined
Jan 24, 2006
Messages
9,397 (1.36/day)
System Name Budget Gaming
Processor AMD FX6300
Motherboard Gigabyte 880GMA-USB3
Cooling Coolermaster Hyper 212+
Memory 8GB Ripjaws DDR3 1600
Video Card(s) HD7850 1GB
Storage 1TB Sata2
Display(s) Acer 24" LED
Case Generic black
Audio Device(s) Stock onboard
Power Supply FSP Aurum Gold 650W
Software Windows 7 Home Premium 64bit
Hmmm, should I be mad that I just recently bought an HD2900 Pro?

I'm not. The 2900 pro is awesome. The 2850 does look nice though, but I will still stick with my 2900.
 
Joined
May 16, 2007
Messages
2,368 (0.37/day)
Location
Nova Scotia, Canada
System Name Main rig
Processor AMD Ryzen 5800X
Motherboard Asus Strix X570-E Motherboard
Cooling Castle 240EX v2 AIO
Memory 2x16GB GSkill Trident Z RGB 3600MHz CL16
Video Card(s) Powercolor 7900XTX Red Devil
Storage 2x1TB M.2 - 3TB Spinny Boi
Display(s) Alienware AW3423DWF
Case Lian Li 011 Air Mini
Audio Device(s) UAD Apollo
Power Supply Thermaltake GF1 ARGB 850W
Mouse Steelseries Aerox 5 Wireless
Keyboard Steelseries Apex 7 TKL blue switches
VR HMD Quest 3
Software Windows 11
I'm not. The 2900 pro is awesome. The 2850 does look nice though, but I will still stick with my 2900.

Correction......3850
This numbering is STILL GAY
 
Joined
May 19, 2007
Messages
7,662 (1.19/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
how long is it?
 

GrapeApe

New Member
Joined
May 4, 2007
Messages
33 (0.01/day)
Location
The Canadian Rockies
Processor Pentium T7300 - 2GHz
Motherboard Intel® 965PM
Cooling teeny tiny fans
Memory 2GB DDR2 - 667
Video Card(s) Mobility Radeon HD2600(Pro) - 256 @ 550/650
Storage 100GB Fujitsu 5400 RPM & ext WD 80GB DualOption & 250GB MyBook Pro & 80GB Vantec NAS & 320GB NAS
Display(s) 17" CrystalView LCD 1440x900.
Audio Device(s) Audigy 2ZS PCMCIA
Software Geoworks
Benchmark Scores 4,000 Bungholiomarks.
AMD has finally lost it. Using a higher number for a brand-new, next-gen flagship video card has been standard practice in the video card business forever, for example Radeon 7200-8500-9700-X800-X1800-X2900 (X meaning "10"), GeForce 2-3-4-5800-6800-7800-8800, Voodoo 1-2-3-4-5, and so on and so forth. But now AMD is misleading the consumer, making them think this GPU model is technologically a generation above the previous GPU model when it's not....

What was the generational difference between the R9800 and X800 other than some extremely slight tweaks? I guess at least it was a different codenamed part. And what was the big generational difference between the GF6800 and GF7800 (aka the NV47/48)? Oh yeah, I forgot, nV changed that last one to the G70 as if it was a big change. :slap:

Frankly, I don't know what they hope to achieve with this. What's next, AMD? PR ratings like you use for your processors? Radeon 4000+, anyone?? :shadedshu

Well that seems to be it, just like the craptacular GMA X3000.
It's not much different than any other naming scheme. As long as there's some rhyme or reason to it, it'll work. That it's an HD2900 or 3870 doesn't really matter as much as whether or not it's worth the money. I don't care if they call it the AMD Corvette as long as it outperforms the AMD Chevette for the price. ;)
 

Terantek

New Member
Joined
Oct 22, 2007
Messages
2 (0.00/day)
Processor AMD Athlon X2 4200 @ 2.7Ghz (from 2.2) stock cooling
Motherboard Asus A8n Sli Deluxe
Memory 2GB DDR (2.5-3-3-5)
Video Card(s) Inno3d 8800GTS
Storage 320 SATAII + 120 SATA + 80 SATA
Display(s) 32" Viewsonic LCD HDTV
Audio Device(s) Creative X-Fi Extreme Music / Logitech z-5500D 5.1
Power Supply 450W Generic
Benchmark Scores 3DMark 06: 8400
The Inquirer has an article claiming a 2400 MHz memory clock - that's pretty insane! Also, I wonder if there is any performance to be gained by increasing the stream processor clock... I know the 2900 XT had about double the number of stream processors of the 8800 but around half the clock speed on said processors. Maybe they did something like this to justify a 3xxx model number... I guess we'll see how the benchies turn out.
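A rough back-of-the-envelope check of what that rumoured clock would mean, assuming RV670 keeps a 256-bit bus (the bus width here is an assumption on my part, not a confirmed spec):

# Rough peak memory bandwidth from the rumoured figures (assumptions, not confirmed specs)
bus_width_bits = 256          # assumed bus width
effective_clock_mhz = 2400    # effective (DDR) data rate claimed by the Inquirer piece

bandwidth_gb_s = (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9
print(f"Theoretical peak bandwidth: {bandwidth_gb_s:.1f} GB/s")   # ~76.8 GB/s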
 

GrapeApe

New Member
Joined
May 4, 2007
Messages
33 (0.01/day)
Location
The Canadian Rockies
Processor Pentium T7300 - 2GHz
Motherboard Intel® 965PM
Cooling teeny tiny fans
Memory 2GB DDR2 - 667
Video Card(s) Mobility Radeon HD2600(Pro) - 256 @ 550/650
Storage 100GB Fujitsu 5400 RPM & ext WD 80GB DualOption & 250GB MyBook Pro & 80GB Vantec NAS & 320GB NAS
Display(s) 17" CrystalView LCD 1440x900.
Audio Device(s) Audigy 2ZS PCMCIA
Software Geoworks
Benchmark Scores 4,000 Bungholiomarks.
Are you kidding? A die shrink is laughable... look at the G70 vs G71, the only difference is a die shrink

You do realize that's not the only difference, eh!?! :shadedshu

They also got rid of 25million transistors yet were able to keep the same # of shaders/TUs/ROPs/etc. :eek:

Still not sure what they got rid of (drop FX12 support? :confused: )

So not only do they get a process-reduction benefit, they also cut transistors, which also helps with heat and power, which often helps clock limits. The RV670 likely benefits from a similar change, but it depends on what else they added or changed (TMUs/ROPs) in addition to what we already know (UVD/SM4.1) while taking some other things away.

Process reduction alone isn't beneficial, though, if it isn't an efficient reduction, because as you decrease trace size and increase density, you increase the potential for noise, which you overcome with more voltage, which usually leads to more heat/power.
But as 80nm and 65/55nm are completely different processes, it's not just an optical shrink, it's a complete move, which gives them the chance to change the layout, hopefully to something with the potential to get a little closer to those 1GHz numbers in all those early R600 rumours from way back when.

Now if they want to keep power consumption low then it would be best to have lower clocks (likely the single-slot solution), but the fact that they are going to have a dual-slot model shows that they are going to push what they can hard, which would increase heat/power while getting higher speeds/performance. This may be so that they can get the fab savings over the HD2900 and possibly replace the 512MB model, and at least the PRO, with a cheaper-to-make high-end part. They have a lot of potential if they have fewer issues than they reportedly had with the TSMC 80nm HS process.
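A rough sketch of the voltage/heat trade-off described above, using the standard first-order dynamic power relation (P proportional to C x V^2 x f); the voltages and clocks below are purely illustrative, not RV670 figures:

# First-order dynamic power: P ~ capacitance x voltage^2 x frequency
def rel_power(voltage, clock_mhz, ref_voltage=1.2, ref_clock_mhz=742):
    return (voltage / ref_voltage) ** 2 * (clock_mhz / ref_clock_mhz)

# Illustrative numbers only: a clock bump alone vs a clock bump that needs extra voltage
print(f"+10% clock at the same voltage: {rel_power(1.2, 816):.2f}x power")   # ~1.10x
print(f"+10% clock with ~8% more volts: {rel_power(1.3, 816):.2f}x power")   # ~1.29x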
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
The Inquirer has an article claiming a 2400 MHz memory clock - that's pretty insane! Also, I wonder if there is any performance to be gained by increasing the stream processor clock... I know the 2900 XT had about double the number of stream processors of the 8800 but around half the clock speed on said processors. Maybe they did something like this to justify a 3xxx model number... I guess we'll see how the benchies turn out.

A very good point. IMO one of the few weaknesses of the 2900XT is the fixed shader clock; not only do the current NVidia cards' shader clocks rise with the core clocks, but now, with the latest release of RivaTuner, you can independently raise the shader clock completely unlinked from the core. If ATi can integrate something like that into their architecture, then I think that, potentially, with their cards' extra stream processors, they could really get some extra performance.
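To put rough numbers on that stream-processor-versus-clock point, here is a minimal sketch using the commonly cited figures for the two cards; it counts one MADD (2 FLOPs) per SP per clock and ignores G80's extra MUL, so treat it as illustrative only:

# Simplified peak shader throughput: SP count x shader clock x FLOPs per SP per clock
def peak_gflops(sp_count, shader_clock_mhz, flops_per_sp=2):   # 2 = one MADD per clock
    return sp_count * shader_clock_mhz * flops_per_sp / 1000

hd2900xt  = peak_gflops(320, 742)    # R600: shaders run at the core clock
gf8800gtx = peak_gflops(128, 1350)   # G80: shaders run on their own, much faster clock

print(f"HD 2900 XT: {hd2900xt:.0f} GFLOPS")    # ~475
print(f"8800 GTX  : {gf8800gtx:.0f} GFLOPS")   # ~346 (MADD only)

Which is why an independently raised shader clock shifts that balance quickly, and why an unlinked shader clock on R600-style hardware would be interesting.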
 

GrapeApe

New Member
Joined
May 4, 2007
Messages
33 (0.01/day)
Location
The Canadian Rockies
Processor Pentium T7300 - 2GHz
Motherboard Intel® 965PM
Cooling teeny tiny fans
Memory 2GB DDR2 - 667
Video Card(s) Mobility Radeon HD2600(Pro) - 256 @ 550/650
Storage 100GB Fujitsu 5400 RPM & ext WD 80GB DualOption & 250GB MyBook Pro & 80GB Vantec NAS & 320GB NAS
Display(s) 17" CrystalView LCD 1440x900.
Audio Device(s) Audigy 2ZS PCMCIA
Software Geoworks
Benchmark Scores 4,000 Bungholiomarks.
Except that the HD2900 isn't really shader power hampered as much as texture and ROP/hdwr AA limited. The HD2900 already competes well with the GF8800GTX/Ultra in shader power, and demonstrates this well when there's no need for AA or texture loads are low. Look at the GF8800 when it is forced to do shader based AA like that called for in DX10, performance flips then when the texture and ROP loads aren't stressed, but the shaders are.

Having faster shaders would be nice, as would faster everything, but the question is whether you could have the current composition at much faster speeds. There are already a bunch of components working outside of core clock, but how easy is it to implement on those 320SPUs/64shader-cores, and also what's the benefit vs power/heat cost. Personally I'd prefer the opposite of the G80 vis-a-vis the R600 series, faster TMUs/ROPs to make up for the lack of numbers and different composition.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Except that the HD2900 isn't really shader power hampered as much as texture and ROP/hdwr AA limited. The HD2900 already competes well with the GF8800GTX/Ultra in shader power, and demonstrates this well when there's no need for AA or texture loads are low. Look at the GF8800 when it is forced to do shader based AA like that called for in DX10, performance flips then when the texture and ROP loads aren't stressed, but the shaders are.

Having faster shaders would be nice, as would faster everything, but the question is whether you could have the current composition at much faster speeds. There are already a bunch of components working outside of core clock, but how easy is it to implement on those 320SPUs/64shader-cores, and also what's the benefit vs power/heat cost. Personally I'd prefer the opposite of the G80 vis-a-vis the R600 series, faster TMUs/ROPs to make up for the lack of numbers and different composition.

Yes, it does compete well, you're right; my point is it has twice the SPs, and with a little work it could be a fair bit quicker!
 
Joined
Mar 17, 2005
Messages
1,052 (0.15/day)
Location
Ards
System Name Jezebelle
Processor i5 4690K
Motherboard MSI Z97M-G43
Cooling Noctua NH-U12
Memory 2x8GB Ballistix Sport
Video Card(s) GTX 1070 Ti
Storage Sandisk Ultra II 480GB, Sammy 1TB F3
Display(s) BenQ Xl2411Z
Case Fractal Define Mini
Audio Device(s) Onboard Realtek 7.1 HD, does the job!
Power Supply Enermax Infiniti 720W
Keyboard Logitech G15
Software Winblows 10 Pro
I think this is GAY... what is wrong with AMD? Why would they move to a new series when they pretty much just gained some steam with the HD 2K series...

I personally think that calling an inanimate object without gender a homo' is gay in itself!
:slap:
Long live ati :rockout:
 

GrapeApe

New Member
Joined
May 4, 2007
Messages
33 (0.01/day)
Location
The Canadian Rockies
Processor Pentium T7300 - 2GHz
Motherboard Intel® 965PM
Cooling teeny tiny fans
Memory 2GB DDR2 - 667
Video Card(s) Mobility Radeon HD2600(Pro) - 256 @ 550/650
Storage 100GB Fujitsu 5400 RPM & ext WD 80GB DualOption & 250GB MyBook Pro & 80GB Vantec NAS & 320GB NAS
Display(s) 17" CrystalView LCD 1440x900.
Audio Device(s) Audigy 2ZS PCMCIA
Software Geoworks
Benchmark Scores 4,000 Bungholiomarks.
I understand that, but if the bottleneck is in the back-end and not the shaders, then your benefit is still limited. It's still a benefit, but it would be like overclocking your QX9650 to 4GHz in UT3 while still being stuck with a ChromeS27: your computer may be better able to handle the game's core needs, but you still can't translate that benefit out to your display because of a bottleneck further down the path. Same problem with the R600: its biggest weaknesses are in the back-end, not its core shader power.

That's not to say it's without benefit, overclocked SPUs would help a bit with shader-based AA, but it's still heavily TMU and ROP limited at any significant setting used by top-end cards.

I don't disagree that faster SPUs will improve some things, but my main point is that's not its biggest weakness, and what is the cost of your OC? It's already a very power hungry and pretty warm VPU without increasing the speed of the SPUs (those increases you seek don't come at zero cost there). I think that level of power is for next year's games, not really our current batch (although Crysis may prove otherwise if geometry is cranked as high as we hope).
So like I said, personally I'd prefer to see them focus on the back-end for any expenditure of power/heat or even transistors, since that's their current Achilles' heel.
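That back-end-bottleneck argument can be sketched as a very crude pipeline model where the slowest stage sets the frame time; the millisecond figures below are made up purely to show the shape of the argument, not measured data:

# Crude per-frame model: the slowest stage limits the frame rate
def frame_time_ms(shader_ms, texture_ms, rop_ms):
    return max(shader_ms, texture_ms, rop_ms)

baseline  = frame_time_ms(shader_ms=10, texture_ms=14, rop_ms=15)   # back-end limited
shader_oc = frame_time_ms(shader_ms=8,  texture_ms=14, rop_ms=15)   # +25% shader speed

print(f"Baseline       : {baseline} ms/frame")
print(f"Shader OC only : {shader_oc} ms/frame (no gain, the ROPs are still the limit)")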
 
Joined
May 19, 2007
Messages
7,662 (1.19/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
how long is it?
 

thomasxstewart

New Member
Joined
Feb 4, 2006
Messages
8 (0.00/day)
Location
WASHINGTON,dc
TOP $ till Nvidia PCIe 2.0

:toast: Well, it's good if DX10.1 comes in stronger, especially with TWICE the bandwidth. Yet tears of pain & "how can they charge sooo much" come to mind. Well, until Nvidia PCIe 2.0 pokes a new high score, if THEY can. It's HOTTT!!!

Signed: PHYSICIAN THOMAS STEWART VON DRASHEK M.D.
 
Joined
Aug 10, 2006
Messages
4,413 (0.66/day)
Processor Intel Core i7-7700K
Motherboard ASUS ROG Strix Z270E Gaming
Cooling Arctic Cooling Freezer i11
Memory 4x 8GB DDR4 Corsair Vengeance LPX @ 2133MHz
Video Card(s) 2x NVIDIA GTX 1080 Ti FEs
Storage 512GB SSD, 2x2TB HDD
Display(s) AOC U2879VF, AOC G2260VWQ6
Case Corsair 750D Airflow Edition
Power Supply EVGA Supernova 850G
Software Windows 10 x64 Pro
Interesting. AMD are really stepping it up, especially since NVIDIA admitted defeat earlier.
 
Joined
May 16, 2007
Messages
2,368 (0.37/day)
Location
Nova Scotia, Canada
System Name Main rig
Processor AMD Ryzen 5800X
Motherboard Asus Strix X570-E Motherboard
Cooling Castle 240EX v2 AIO
Memory 2x16GB GSkill Trident Z RGB 3600MHz CL16
Video Card(s) Powercolor 7900XTX Red Devil
Storage 2x1TB M.2 - 3TB Spinny Boi
Display(s) Alienware AW3423DWF
Case Lian Li 011 Air Mini
Audio Device(s) UAD Apollo
Power Supply Thermaltake GF1 ARGB 850W
Mouse Steelseries Aerox 5 Wireless
Keyboard Steelseries Apex 7 TKL blue switches
VR HMD Quest 3
Software Windows 11
Okay... the person who thought to release a new series is gay! LONG LIVE ATi though... I wuv them!
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.11/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
You do realize that's not the only difference, eh!?! :shadedshu

They also got rid of 25million transistors yet were able to keep the same # of shaders/TUs/ROPs/etc. :eek:

Still not sure what they got rid of (drop FX12 support? :confused: )

So not only do they get a process-reduction benefit, they also cut transistors, which also helps with heat and power, which often helps clock limits. The RV670 likely benefits from a similar change, but it depends on what else they added or changed (TMUs/ROPs) in addition to what we already know (UVD/SM4.1) while taking some other things away.

Process reduction alone isn't beneficial, though, if it isn't an efficient reduction, because as you decrease trace size and increase density, you increase the potential for noise, which you overcome with more voltage, which usually leads to more heat/power.
But as 80nm and 65/55nm are completely different processes, it's not just an optical shrink, it's a complete move, which gives them the chance to change the layout, hopefully to something with the potential to get a little closer to those 1GHz numbers in all those early R600 rumours from way back when.

Now if they want to keep power consumption low then it would be best to have lower clocks (likely the single-slot solution), but the fact that they are going to have a dual-slot model shows that they are going to push what they can hard, which would increase heat/power while getting higher speeds/performance. This may be so that they can get the fab savings over the HD2900 and possibly replace the 512MB model, and at least the PRO, with a cheaper-to-make high-end part. They have a lot of potential if they have fewer issues than they reportedly had with the TSMC 80nm HS process.

I did know that, but I didn't think breaking into the tech stuff would benefit as much as the raw difference in clocks between the cards ;) which really isn't too much of a difference as far as series changes go
 

effmaster

New Member
Joined
Aug 22, 2007
Messages
1,327 (0.21/day)
Location
Rocket City, Alabama (Huntsville)
Processor Core Duo T2250 1.73 GHZ
Memory 2 GB DDR2
Video Card(s) NVIDIA GEForce Go 7600 256 MB
Storage 120 GB
Display(s) 17 inch notebook
Case metallic grey
Audio Device(s) Creative xmod (though that doesnt really count as a sound card lol)
Power Supply power plug and battery
Software Lots of it
I did know that, but I didn't think breaking into the tech stuff would benefit as much as the raw difference in clocks between the cards ;) which really isn't too much of a difference as far as series changes go

Raw clock speeds don't always mean a chip is the best just because they're higher than before. AMD proved this to Intel, after all, and Intel responded with lower-clocked processors, namely the Core 2 Duo, which was an amazing proc and still is to this day :rockout::rockout::rockout:
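The clock-speed point boils down to the usual performance ~ IPC x clock relation; the IPC numbers below are made-up illustrations, not measured values:

# Performance is roughly instructions-per-clock x clock, not clock alone
def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

high_clock_design = relative_perf(ipc=1.0, clock_ghz=3.8)   # NetBurst-style: high clock, low IPC (illustrative)
core2_style       = relative_perf(ipc=1.8, clock_ghz=2.4)   # Core 2-style: lower clock, higher IPC (illustrative)

print(f"High-clock design  : {high_clock_design:.1f}")
print(f"Core 2-style design: {core2_style:.1f}")   # comes out ahead despite 1.4 GHz less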
 

General

New Member
Joined
Oct 23, 2007
Messages
2 (0.00/day)
Processor Intel E6750 @3.2Ghz
Motherboard Gigabyte GA_P35_DS3R
Cooling 4x12mm fans (+stock case cooling)
Memory OCZ 2GB XTC Series DDR2
Video Card(s) BFG 7800GTX
Storage Seagate Barracuda 2x500GB HD
Display(s) Samsung SM-226BW LCD
Case Silverstone TJ-09 W
Audio Device(s) X-Fi Xtreme Gamer 7.1
Power Supply OCZ 700W
Software XP home
Benchmark Scores Do it later
What was the generational difference between the R9800 and X800 other than some extremely slight tweaks? I guess at least it was a different codenamed part. And what was the big generational difference between the GF6800 and GF7800 (aka the NV47/48)? Oh yeah, I forgot, nV changed that last one to the G70 as if it was a big change. :slap:



Well that seems to be it, just like the craptacular GMA X3000.
It's not much different than any other naming scheme. As long as there's some rhyme or reason to it, it'll work. That it's an HD2900 or 3870 doesn't really matter as much as whether or not it's worth the money. I don't care if they call it the AMD Corvette as long as it outperforms the AMD Chevette for the price. ;)

They're apparently using these numbers to get rid of the 'XT, XTX, PRO, GT' suffixes that your average customer simply doesn't understand. Much easier to look on a website, see a card that says 3870, and say

'wooo, that must be better than a 2950' or whatever the hell they end up calling these cards.

Power requirements on those R600s were just insane; for a lot of people (myself included) that was the only reason I went with an nVidia card.

However, this really does tempt me, I must say =] I'm doing a brand spanking new system for Christmas (too bad I will miss out on the new CPUs and 790 SLI chipset :()

Martyn
 
Joined
Aug 16, 2004
Messages
3,285 (0.44/day)
Location
Sunny California
Processor AMD Ryzen 7 9800X3D
Motherboard Gigabyte Aorus X870E Elite
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6000MHz Corsair Vengeance
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 990 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply beQuiet Straight Power 12 1500W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
What was the generational difference between the R9800 and X800 other than some extremely slight tweaks? I guess at least it was a different codenamed part. And what was the big generational difference between the GF6800 and GF7800 (aka the NV47/48)? Oh yeah, I forgot, nV changed that last one to the G70 as if it was a big change. :slap:

For starters, the X800 (R420) had twice the pixel shaders (16ps vs 8ps) than the 9700 (the original R300), 50% more vertex shaders (6vs vs 4vs) than the 9700, about 45% more transistors than the 9700 (160 million vs 107 million), a new fabrication process (.13 micron vs .15 micron) supported SM2b and the 9700 supported 2a, supported the PCIe platform and the 9800 was AGP only, it was the first Ati card that supported Crossfire, and the clocks for both memory and GPU core were about 50% higher than the 9700's clocks. Even though the R430 was an evolutionary step from the fantastic R300 core, it offered a performance leap anywhere from 40% up to 120% depending on the game or benchmark and the resolution/effects used, not just some "extremely slight tweaks" as you can see. :slap:

Now, the GF7800 (G70) supported 50% more pixel shaders than the GF6800 (NV42, not 47/48 :confused:) (24ps vs 16ps), 33% more vertex shaders than the GF6800 (8vs vs 6vs), about 40% more transistors than the GF6800 (302 million vs 222 million), 20% higher memory and GPU clocks than the GF6800, supported transparency adaptive AA, supported multiple GPUs on a single board (aka 7950GX2) and even though by the numbers there didn't appear to be so much of a difference between both cards, you could get a performance leap anywhere from 30% to more than 100% depending on the benchmark or game and the resolution/effects used.

As you can see, both examples you quote, clearly were more than worthy of having a new numerical denomination when compared to their previous gen counterparts :rolleyes:
 

GrapeApe

New Member
Joined
May 4, 2007
Messages
33 (0.01/day)
Location
The Canadian Rockies
Processor Pentium T7300 - 2GHz
Motherboard Intel® 965PM
Cooling teeny tiny fans
Memory 2GB DDR2 - 667
Video Card(s) Mobility Radeon HD2600(Pro) - 256 @ 550/650
Storage 100GB Fujitsu 5400 RPM & ext WD 80GB DualOption & 250GB MyBook Pro & 80GB Vantec NAS & 320GB NAS
Display(s) 17" CrystalView LCD 1440x900.
Audio Device(s) Audigy 2ZS PCMCIA
Software Geoworks
Benchmark Scores 4,000 Bungholiomarks.
Well it was an illustration of a similar application of hyperbole, but for the heck of it let's continue on...

For starters, the X800 (R420) had twice the pixel shaders...

Quantity alone doesn't show an improvement in the architecture, nor the need for a name change, and hence the resistance people seem to have to the new name. An increase in numbers within a 'generation' has happened before: the X1800 -> X1900 increased the shader count threefold and the transistor count by 20%, yet didn't get its own X2K numbering, while the GF3 -> GF4 was only about a 10% difference but got its own generation.

a new fabrication process (.13 micron vs .15 micron)

The fab process wasn't new, just new to the high end; 130nm and 130nm low-k dielectric were already used on the R9600P/XT. And by that same reasoning you could argue the RV670 deserves a name change, skipping a node and going from optical shrink to optical shrink, so that's like two process changes, and it will be the first part built on the new node from any IHV. So it kinda proves my point more than disproves it, although I don't think the fab process matters so much as the results.

supported SM2b and the 9700 supported 2a,

Actually, the FX cards were PS2.0a, not the R3xx, which was PS2.0 and PS2.0 extended; the R420 was PS2.0b. There are more differences between PS2.0a and either PS2.0 or 2.0b than between 2.0 and 2.0b themselves, which differ only slightly in their upper limits.

supported the PCIe platform and the 9800 was AGP only,

So the PCX5900 should've been the PCX6800 based on that argument? Or does native vs non-native matter, where the GF6800 PCIe (NV45) becomes the GF7800, instead of the later NV47?

it was the first Ati card that supported Crossfire,

Only after its refresh, when it became the R480, and actually after it was demoed on X700s before you could even buy X850 master cards. So should the R480 have become the X1800 based on that argument?
BTW, R9700s were doing multi-VPU rendering on E&S SimFusion rigs long before nV even had their new 'SLi', and even before Alienware demoed their ALX, so I'm not sure how relevant multi-VPU support is.

and the clocks for both memory and GPU core were about 50% higher than the 9700's clocks.

But only about 20% more than the R9800XT core, and the core was slower than the R9600XT. And if it was speedboost alone then the GF5900 -> 6800 jump shouldn't have gotten a generational name change as it went down in speed.

Even though the R430 was an evolutionary step from the fantastic R300 core, it offered a performance leap anywhere from 40% up to 120% depending on the game or benchmark and the resolution/effects used, not just some "extremely slight tweaks" as you can see.

A performance increase doesn't need dramatic architecture changes; the R9800XT offered larger performance differences over the R9700, as did the X1900 over the X1800, depending on the game/settings. But what constitutes a significant enough change?

Now, the GF7800 (G70) supported 50% more pixel shaders than the GF6800 (NV42, not 47/48 :confused:) (24ps vs 16ps),

The original GF6800 was the NV40, not the NV42 which was the 110nm GF6800 plain 12PS model, and if you don't know what the NV47/48 was in reference to, perhaps you shouldn't bother replying, eh? :slap:

supported multiple GPUs on a single board (aka 7950GX2)

Actually, that was multiple GPUs on TWO boards (you could actually take them apart if you were so inclined) but a single PCIe socket; you probably should've referred to the ASUS Extreme N7800GT Dual. Also, the GF6800 supported multiple VPUs on a single board as well; guess you never heard of the Gigabyte 3D1 series (both GF6800 and 6600):
http://www.digit-life.com/articles2/video/nv45-4.html

As you can see, both examples you quote, clearly were more than worthy of having a new numerical denomination when compared to their previous gen counterparts :rolleyes:

I think both my examples were pretty illustrative of why it's too early to complain about numbering schemes, since similar examples have occurred in the past, especially when most of the people complaining really don't know enough about them to complain in the first place.

BTW, I'm just curious if those who have a problem with the HD3xxx numbering scheme have a similar problem with the GF8800GT and potential GF8800GTS-part2 numbering scheme causing conflicts with the current high-end?

Personally, I only dislike the new numbering scheme if they got rid of the suffixes and replaced them with numbers to play down to the dumbest consumers in the marketplace.
That, to me, focuses on people who don't care anyway and will still buy an HD4100 with 1GB of 64-bit DDR2 memory because the number and VRAM size are higher than the HD3999 with 512MB of 512-bit XDR/GDDR5 memory, which may outperform it 5:1 or whatever. Those are the same people who are simply better served by a chart printed on the box by the IHV showing the performance positioning of the part, more than by changing an existing numbering scheme. :banghead:
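For what it's worth, the hypothetical HD4100-vs-HD3999 example above works out like this on paper; both cards are hypotheticals from the post, and the data rates below are assumed purely to show how far apart the specs would be:

# Peak bandwidth (GB/s) of the two hypothetical cards from the post above
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz / 1000

cheap_card = bandwidth_gb_s(64, 800)     # "HD4100": 1GB of 64-bit DDR2, assumed 800 MT/s
fast_card  = bandwidth_gb_s(512, 3200)   # "HD3999": 512MB on a 512-bit bus, assumed 3200 MT/s

print(f"Hypothetical HD4100: {cheap_card:.1f} GB/s")    # ~6.4
print(f"Hypothetical HD3999: {fast_card:.1f} GB/s")     # ~204.8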
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Is Alec back????
 
Joined
Aug 16, 2004
Messages
3,285 (0.44/day)
Location
Sunny California
Processor AMD Ryzen 7 9800X3D
Motherboard Gigabyte Aorus X870E Elite
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6000MHz Corsair Vengeance
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 990 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply beQuiet Straight Power 12 1500W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
Quantity alone doesn't show an improvement in the architecture, nor the need for a name change, and hence the resistance people seem to have to the new name. An increase in numbers within a 'generation' has happened before: the X1800 -> X1900 increased the shader count threefold and the transistor count by 20%, yet didn't get its own X2K numbering, while the GF3 -> GF4 was only about a 10% difference but got its own generation.


Yes, Ati did that previously with the X1800~X1900 series: both used very different architectures and yet both had the same generational numeration, but in that case the consumer was not misled; you got a product that didn't dramatically improve performance over the previous flagship video card, so Ati decided to just go with the X1900 numeration. That was the old Ati, and I preferred that to what they do now.

In this case, you get almost the same GPU from an architectural standpoint (smaller fabrication process, DX10.1 support, which is worthless beyond being one more bullet point to add to the feature list), yet most uninformed consumers will think this is a whole new card because of the next-gen denomination (HD3800 > HD2900), when in reality it will have about the same performance at a cheaper price point than the "previous gen" card.

This is akin to what nVidia did many years ago with the GeForce 4 MX, which was a GeForce 2 MX with higher clocks and a new name, even though the GeForce 4 Ti series was a lot faster than the MX series and had support for pixel and vertex shaders. Or the same as Ati did when they introduced the 9000 and 9200 series, which only supported DX 8.1 compared to the other, fully DX9, "genuine" R9x00 cards. Or the X600, X300 and X700 cards, which used the X denomination but were just PCIe versions of the 9600/9700 series.



The fab process wasn't new, just new to the high end; 130nm and 130nm low-k dielectric were already used on the R9600P/XT. And by that same reasoning you could argue the RV670 deserves a name change, skipping a node and going from optical shrink to optical shrink, so that's like two process changes, and it will be the first part built on the new node from any IHV. So it kinda proves my point more than disproves it, although I don't think the fab process matters so much as the results.

The card that introduced the 9x00 series was the R300-based 9700, not the RV350/360. It has been common practice in the video card industry for many years for manufacturers to migrate to a smaller fab process for the mainstream GPU series of a given generation before using that smaller process for the next-gen flagship video cards, just as the HD3800 is a mainstream, smaller-fab-process version of the HD2900. Sorry, but this kinda disproves your point in any case...


So the PCX5900 should've been the PCX6800 based on that argument? Or does native vs non-native matter, where the GF6800 PCIe (NV45) becomes the GF7800, instead of the later NV47?

I was just using an example of another feature available on the X8x0 series that wasn't available on the R3x0 series (the two architectures you decided to quote), just to prove that all those features combined don't add up to just "some extremely slight tweaks" between both generations...


Only after its refresh, when it became the R480, and actually after it was demoed on X700s before you could even buy X850 master cards. So should the R480 have become the X1800 based on that argument?
BTW, R9700s were doing multi-VPU rendering on E&S SimFusion rigs long before nV even had their new 'SLi', and even before Alienware demoed their ALX, so I'm not sure how relevant multi-VPU support is.

Another feature available to consumers on X8x0 cards first; add it to the list of features that don't add up to "some extremely slight tweaks". It doesn't matter if the US government used four 9800XT cards working in parallel for a flight simulator, or if Alienware showed some vaporware, if the consumer cannot access that technology with the product they have in their hands at any given moment.


But only about 20% more than the R9800XT core, and the core was slower than the R9600XT. And if it was speedboost alone then the GF5900 -> 6800 jump shouldn't have gotten a generational name change as it went down in speed.

A performance increase doesn't need dramatic architecture changes; the R9800XT offered larger performance differences over the R9700, as did the X1900 over the X1800, depending on the game/settings. But what constitutes a significant enough change?

Once again, Ati introduced the R9x00 series with the R300-based 9700 Pro. All other R9x00 models (except for the R9000 and the R9200) shared the same basic architecture with different features, clocks and fab processes; that's precisely my point.

The original GF6800 was the NV40, not the NV42 which was the 110nm GF6800 plain 12PS model, and if you don't know what the NV47/48 was in reference to, perhaps you shouldn't bother replying, eh? :slap:

So what? I made a mistake because the GF6800GS has an NV42 core; at least I didn't quote two cores that were never available for sale :slap:

Nvidia's NV47 never existed

Nvidia has canned NV48

The truth of the matter is AMD can name these cards whatever they want, they could name them Radeon HD4000+ for all I care, but it will always be controversial when you raise the expectations of the consumer and they pay for something that won't quite live up to what they expected. See what happened to the GeForce 4 MX and Radeon 9200 users. :shadedshu
 