Wednesday, October 17th 2007

ATI RV670 is Radeon HD 3800 Series

According to VR-Zone, AMD/ATI's RV670-based cards will officially be branded as part of the Radeon HD 3000 series. Previously, these cards were expected to launch as the Radeon HD 2950 series, and today one of AMD's AIB partners, GeCube, has even listed the Radeon HD 3800 on its website. AMD has also decided to drop the PRO, XT, GT, and XTX suffixes from its future HD 3000 series graphics cards; instead, the last two digits of the model number will indicate the card's performance tier. For example, the RV670PRO will be known as the Radeon HD 3850, while the RV670XT will be known as the Radeon HD 3870 when they launch on November 15th.
Source: VR-Zone
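
To make the rebranding concrete, here is a minimal sketch (in Python, purely illustrative) of the codename-to-retail mapping described in the report above. The only pairings included are the two named by VR-Zone, and the "last two digits indicate the performance tier" rule is the article's stated convention, not an official AMD specification.

```python
# Illustrative sketch of the reported RV670 rebranding scheme (per the VR-Zone report above).
# Only the two pairings named in the article are included; anything beyond that is unknown.

CODENAME_TO_RETAIL = {
    "RV670PRO": "Radeon HD 3850",
    "RV670XT": "Radeon HD 3870",
}

def performance_tier(model_name: str) -> int:
    """Return the last two digits of the model number, which the new scheme
    reportedly uses to signal relative performance instead of a suffix."""
    digits = "".join(ch for ch in model_name if ch.isdigit())
    return int(digits[-2:])

if __name__ == "__main__":
    for codename, retail in CODENAME_TO_RETAIL.items():
        # e.g. "RV670PRO -> Radeon HD 3850 (tier 50)"
        print(f"{codename} -> {retail} (tier {performance_tier(retail)})")
```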

37 Comments on ATI RV670 is Radeon HD 3800 Series

#1
lemonadesoda
Some marketing people at AMD need to fall on their swords.

RV670 is only a small update. Hardly an order of magnitude difference! The X2950 name, if anything, was probably overstating it; X2920 would have been better.

I fondly remember the days of Radeon 7500, 8500, 9700... each time doubling or tripling performance. And all the $millions spent on teaching people about Pro, XT, XTX... all being thrown to the wind for another naming convention!

I hope that the new GPU, R7xx or whatever they call it, will go back to a more sensible naming convention, and "reset" the counters.
Posted on Reply
#2
InfDamarvel
It would have been a good idea..... years ago. Now let's think about this, ATI: you still have products called the 9600, 9800, etc. out there. You really do have noobs confused because you're horrible with your names.
Posted on Reply
#3
Jimmy 2004
Maybe a typo on the GeCube site? It could easily have been supposed to be HD 2800, which might make more sense considering AMD always used to have the 9800, X800, etc., but if the RV670 does perform better, then that would be misleading. AMD should really have released the HD 2800s first rather than the HD 2900, and released the rest under 2900.
Posted on Reply
#4
a111087
I like that they dropped those XT, XTX suffixes... less confusion for new people.
Posted on Reply
#5
mdm-adph
That's got to be a typo, right? I mean, it's only a revision...

I hope ATI/AMD aren't that eager to forget the HD 2000 line -- they're not that bad.
Posted on Reply
#6
jocksteeluk
mdm-adph: That's got to be a typo, right? I mean, it's only a revision...

I hope ATI/AMD aren't that eager to forget the HD 2000 line -- they're not that bad.
I think they have already forgotten about the 2000 products, at the high end at least; maybe they will keep the 2000 name for the lower-end products.
Posted on Reply
#7
WarEagleAU
Bird of Prey
I think this might suggest that it's not just a revision, but that they have added real changes to it. That's the only justification I can see for the new naming scheme.
Posted on Reply
#8
Easy Rhino
Linux Advocate
If anything, the new naming scheme makes it more complicated!
Posted on Reply
#9
Disparia
Very nice!

Death to the suffix!
Posted on Reply
#10
ccleorina
WTF..... I just got two HD 2900 XTs.... Now HD 3000? DAMN.... :banghead:
Posted on Reply
#11
magibeg
Maybe ATI will just skip ahead to the next generation and surprise everyone with a full hd3000 series. hd3900 with 640 stream processors ;)
Posted on Reply
#12
ccleorina
magibeg: Maybe ATI will just skip ahead to the next generation and surprise everyone with a full hd3000 series. hd3900 with 640 stream processors ;)
I hope it comes with 640 SPs + 4x 8-pin PCI-E power connectors..... :banghead:
Posted on Reply
#13
a111087
ccleorina: I hope it comes with 640 SPs + 4x 8-pin PCI-E power connectors..... :banghead:
No, PCI-E 2.0 will provide more power to the video card = no big need for additional connectors.
Posted on Reply
#14
ccleorina
a111087: No, PCI-E 2.0 will provide more power to the video card = no big need for additional connectors.
Hope so, man....... :roll: Can't wait to see if the HD3K can kick NVIDIA's ass :rockout:
Posted on Reply
#15
ccleorina
magibeg: Maybe ATI will just skip ahead to the next generation and surprise everyone with a full hd3000 series. hd3900 with 640 stream processors ;)
Why don't they make a dual-core graphics card? :toast:
Posted on Reply
#16
eidairaman1
The Exiled Airman
You know what, too many of you are too harsh on the company for making a major step. Honestly, in my opinion, the naming conventions were getting too f!@#$%^ long to pronounce.
Posted on Reply
#17
Deleted member 3
ccleorina: Why don't they make a dual-core graphics card? :toast:
They already exist, in fact, they have for many years. Even 3DFX made them.
Posted on Reply
#18
Tatty_Two
Gone Fishing
a111087: No, PCI-E 2.0 will provide more power to the video card = no big need for additional connectors.
Yep..... but we don't want any more power consumption; we want faster cards with higher speeds, using less power and therefore producing less heat. Surely that's one of the main reasons for shrinking the fabrication process. If you need to raise the power requirements to gain the speed with each generation, then IMO that isn't really an advance in technology.......... leading to the year 2050, when we all own our very own nuclear power plant to power up our DX25 graphics cards! :eek:
Posted on Reply
#19
Seany1212
Tatty_One: Yep..... but we don't want any more power consumption; we want faster cards with higher speeds, using less power and therefore producing less heat. Surely that's one of the main reasons for shrinking the fabrication process. If you need to raise the power requirements to gain the speed with each generation, then IMO that isn't really an advance in technology.......... leading to the year 2050, when we all own our very own nuclear power plant to power up our DX25 graphics cards! :eek:
Agreed on sooo many levels :toast: (to gfx manufacturers: :slap:)
Posted on Reply
#20
AsRock
TPU addict
ccleorina: I hope it comes with 640 SPs + 4x 8-pin PCI-E power connectors..... :banghead:
:laugh::laugh:
Posted on Reply
#21
effmaster
Seany1212: Agreed on sooo many levels :toast: (to gfx manufacturers: :slap:)
If only they would listen to us and actually create lower power requirements.

For shame, NVIDIA :shadedshu and for shame, ATI/AMD :shadedshu

More power means more connectors, which means more space taken up on the graphics card, which means less room to innovate. I'm not sure, but I do hope this is the 3000 series of cards from ATI/AMD, since this will give them a head start on NVIDIA; NVIDIA won't be releasing their new series of cards until early 2008.
Posted on Reply
#22
Disparia
From what I've gathered...

PCIe 2.0 slot = 150w.

If a card has a greater demand, then a single 150w 8-pin connector will be present (they want to do away with the 6-pin for simplicity).

I guess you can call it the "300w soft cap" - that hasn't changed (6-pin + 8-pin + PCIe = 300w). They're just shifting more of it to the motherboard (PCIe 2.0 + 8-pin = 300w).


-edit-

And these days we know that isn't true...
Posted on Reply
#23
Tatty_Two
Gone Fishing
Jizzler: From what I've gathered...

PCIe 2.0 slot = 150w.

If a card has a greater demand, then a single 150w 8-pin connector will be present (they want to do away with the 6-pin for simplicity).

I guess you can call it the "300w soft cap" - that hasn't changed (6-pin + 8-pin + PCIe = 300w). They're just shifting more of it to the motherboard (PCIe 2.0 + 8-pin = 300w).
Very true.....but my point is....300W is wwwwwwwaaaaaaayyyyyyyyyyyyyy too much!
Posted on Reply
#24
largon
What's the talk about 300W?

This thing is also PCIe 1.1a compliant, so the card's spec has an absolute maximum power consumption of 150W.
Posted on Reply
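
To put the wattage figures from the last few posts in one place, below is a minimal sketch that tallies the power budgets being discussed. The numbers are the ones quoted by Jizzler and largon in this thread (75W from a PCIe 1.x slot, 75W per 6-pin plug, 150W per 8-pin plug, and a rumored 150W PCIe 2.0 slot); they are not official PCI-SIG quotations, and as Jizzler's later edit notes, the 150W slot figure did not pan out.

```python
# Rough tally of the graphics-card power budgets discussed in this thread.
# The wattages are the figures quoted above, not official PCI-SIG numbers;
# the 150W PCIe 2.0 slot value in particular turned out to be wrong.

SOURCES_W = {
    "PCIe 1.x slot": 75,
    "PCIe 2.0 slot (rumored)": 150,
    "6-pin connector": 75,
    "8-pin connector": 150,
}

def budget(*sources: str) -> int:
    """Sum the wattage of the named power sources."""
    return sum(SOURCES_W[name] for name in sources)

if __name__ == "__main__":
    # largon's point: a PCIe 1.1a-compliant card with one 6-pin plug tops out at 150W.
    print(budget("PCIe 1.x slot", "6-pin connector"))                    # 150
    # Jizzler's "300W soft cap" the old way: slot + 6-pin + 8-pin.
    print(budget("PCIe 1.x slot", "6-pin connector", "8-pin connector")) # 300
    # ...and the rumored new way: a beefier slot plus a single 8-pin.
    print(budget("PCIe 2.0 slot (rumored)", "8-pin connector"))          # 300
```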
#25
nINJAkECIL
The name change is more confusing than the old name... and it sounds ugly too....
Which sounds cooler:
1. 3800 XT, or
2. 3850 :lol:
Posted on Reply