Wednesday, October 17th 2007

ATI RV670 is Radeon HD 3800 Series

According to VR-Zone, AMD/ATI's RV670 will officially launch as the Radeon HD 3800 series. Previously rumored to arrive as the Radeon HD 2950 series, the new name has already surfaced: one of the AIB partners, GeCube, has listed the Radeon HD 3800 on its website. AMD has also decided to drop the PRO, XT, GT, and XTX suffixes from its upcoming HD 3000-series graphics cards; instead, the last two digits of the model number will indicate the card's performance. For example, the RV670PRO will be known as the Radeon HD 3850 and the RV670XT as the Radeon HD 3870 when they launch on November 15th.
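The mapping is straightforward; as a purely illustrative sketch (the Python below is ours, not anything AMD ships), the old variant suffixes translate to model numbers like this:

```python
# Hypothetical illustration of the new naming scheme: the old variant
# suffixes become the last two digits of the model number, which now
# indicate the card's performance tier.
RETAIL_NAMES = {
    "RV670PRO": "Radeon HD 3850",
    "RV670XT":  "Radeon HD 3870",
}

def performance_tier(model: str) -> int:
    """Last two digits of a Radeon HD model number (higher = faster)."""
    return int(model.split()[-1]) % 100

for codename, retail in RETAIL_NAMES.items():
    print(f"{codename} -> {retail} (tier {performance_tier(retail)})")
```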
Source: VR-Zone

37 Comments on ATI RV670 is Radeon HD 3800 Series

#26
Tatty_Two
Gone Fishing
largon: What's the talk about 300W?

This thing is also PCIe 1.1a compliant, so the card's spec has an absolute maximum power consumption of 150W.

The guy was talking about power requirements in general and showing how 300W was made available.
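For context, both wattage figures are just the standard PCI Express power budgets added together; a quick illustrative sketch (the values are the well-known spec limits, the Python itself is plain arithmetic):

```python
# Standard PCI Express power budgets, in watts.
SLOT_POWER = 75    # an x16 slot can deliver up to 75 W
SIX_PIN = 75       # one 6-pin auxiliary connector
EIGHT_PIN = 150    # one 8-pin auxiliary connector

# A PCIe 1.1 card fed by the slot plus one 6-pin connector:
print(SLOT_POWER + SIX_PIN)              # 150 W -- the cap largon cites

# How 300 W "was made available": slot + 6-pin + 8-pin.
print(SLOT_POWER + SIX_PIN + EIGHT_PIN)  # 300 W
```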
#27
Darkrealms
DanTheBanjoman: They already exist, in fact, they have for many years. Even 3DFX made them.

3DFX, WOOT! They got screwed by their management... >:(

I hope ATI/AMD is looking ahead (I don't really like ATI's cards); there needs to be more competition for Nvidia and Intel.
#28
adamnfs
dual core graphics cards.
DanTheBanjoman: They already exist, in fact, they have for many years. Even 3DFX made them.

Not really. I've had 3dfx cards like the Voodoo5 5500, and a card like the ATI Radeon 3870 X2, but I wouldn't call those dual-core GPUs; they're two separate GPUs on the same card. A dual-core GPU, as I mean it, would be like an Athlon X2 or Phenom X4 CPU: two or more cores on a single die, aka dual or quad core.

Multiple cores on a single chip, not two separate chips on one board, should also be easier to cool. The 3870 X2 I had overheated because the fan pushed air through the first heatsink, warming it, so by the time it reached the second GPU the air was already hot and the second chip overheated. A dual-chip card is not a dual-core one; dual-core basically means two processors on one die, and since dual- and quad-core CPUs are already being made, GPUs could be built the same way.
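The cooling point can be made concrete with the basic heat-transfer relation dT = P / (m_dot * c_p): air that has absorbed the first GPU's heat reaches the second heatsink pre-warmed. A minimal sketch, under made-up but plausible assumptions (the 80 W and 15 CFM figures are illustrative, not 3870 X2 measurements):

```python
# Why serial airflow hurts the downstream GPU: the air's temperature
# rise across a heatsink is dT = P / (m_dot * c_p).
# All numbers below are illustrative assumptions, not measured values.

C_P_AIR = 1005.0     # specific heat of air, J/(kg*K)
AIR_DENSITY = 1.2    # kg/m^3 at room temperature

gpu_power = 80.0     # W dissipated by the first GPU (assumed)
airflow_cfm = 15.0   # cooler airflow in CFM (assumed)

# Convert CFM to a mass flow rate in kg/s.
m3_per_s = airflow_cfm * 0.000471947
mass_flow = m3_per_s * AIR_DENSITY

# Temperature rise of the air after passing the first heatsink.
delta_t = gpu_power / (mass_flow * C_P_AIR)

ambient = 25.0
print(f"Air reaching the second GPU: {ambient + delta_t:.1f} C "
      f"(+{delta_t:.1f} C over ambient)")
```

Even under these generous assumptions the downstream GPU starts roughly 9 °C behind, which is the effect described above; a single-die multi-core GPU under one heatsink would avoid the serial pre-heating entirely.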
#29
zithe
918-day bump. Sorry, couldn't resist; 'twas an epic bump, after all. Welcome to TPU, by the way. :toast:
#31
Steevo
I smell the smell of a bump that smells smelly.
#33
LAN_deRf_HA
It would be slightly less absurd if it weren't worded so redundantly...
#34
sneekypeet
not-so supermod
Let dying threads die. Either post on topic or move along. This constant spamming of threads, for whatever reason, needs to stop. Please be mature and move along instead of posting more irrelevant replies. I don't want to start handing out points for such offenses, but I have no real issue doing so. Let's not have to go there, shall we?
#35
a111087
lol, can anyone else recall what they forgot to say 3 years ago?
#36
SNiiPE_DoGG
I predict the HD3k series will have a slight performance advantage over the 2900 XT
#37
a111087
SNiiPE_DoGG: I predict the HD3k series will have a slight performance advantage over the 2900 XT
dude! are you from the future?! :roll: