
ATI Radeon HD 5670 Pictured, Detailed, and Tested

Maybe I'm wrong about bandwidth in one sense, seeing as my old ATI X1650 Pro had 20GB/s of bandwidth and my HD4650 only has 14GB/s but runs a lot better. Then again, the old card had 256MB of RAM and the 4650 has 512MB, so maybe that's why it runs better at high res, and also because of the change from pixel pipelines to stream processors.
 
I generally have older PCs or hand-me-downs...

Right now I'm running a Dell Precision 650 with an AGP 2400 Pro @ 700MHz.

I play Torchlight, Left 4 Dead, Battlefield Heroes, BioShock... but I want to get back into PC gaming and pick up some newer titles. Besides, I'm having to run all my games at lower than 720p and with a lot less detail, so they just don't look as good.

I might be getting a newer Dell soon, one that has a PCI-E slot... so this would be perfect. Though a 4650 would still be a huge step up from my 2400 Pro, the 5670/5650 would be that much sweeter.
 
I can't knock my HD4650, it's been a good card and runs all the games I want to play nicely enough. It's even the slowest version of the 4650 too, with 800MHz RAM instead of 1000MHz.

I do like the look of this new 5670, though, but I'm going to pass and just get a 5770 after Xmas, plus a new 23/24" screen.
 
I want them to change it to a single-slot design so I can fit it in some pre-builts that are lying about.
 
The bandwidth on this card is 64GB/s, which is 6GB/s less than a 9800GTX+. I understand that this is a $100 video card, but it really just doesn't kill it the way a DX11 card should.

Whoever said it was $100? Traditional pricing for ATi mid-range would put this thing closer to the $79 mark. Also, the thing only uses a 128-bit bus, so the fact that it's anywhere close to the bandwidth numbers put out by the 9800GTX+ is pretty fantastic, actually. Keep in mind, this thing costs ATi a fraction of what a GTX+ costs nVidia...
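
For reference, peak memory bandwidth is just bus width times effective memory clock. A minimal sketch of the math (the effective clocks below are assumed ballpark figures for these cards, not confirmed specs):

```python
# Peak bandwidth = bus width (bytes) x effective memory clock (MT/s).
# Clocks here are assumed ballpark figures, not confirmed specs.

def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_mt_s / 1000

# HD 5670: 128-bit GDDR5, assumed 4000 MT/s effective -> 64.0 GB/s
print(bandwidth_gb_s(128, 4000))
# 9800GTX+: 256-bit GDDR3, assumed 2200 MT/s effective -> 70.4 GB/s
print(bandwidth_gb_s(256, 2200))
```

Under those assumptions the numbers line up with the thread: GDDR5 on a 128-bit bus lands within about 6GB/s of a card with twice the bus width.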
 
Whoever said it was $100? Traditional pricing for ATi mid-range would put this thing closer to the $79 mark. Also, the thing only uses a 128-bit bus, so the fact that it's anywhere close to the bandwidth numbers put out by the 9800GTX+ is pretty fantastic, actually. Keep in mind, this thing costs ATi a fraction of what a GTX+ costs nVidia...

Doesn't the Radeon HD 5750 more or less fit into the 9800GTX+/GTS 250/HD 4850 niche already? I don't think ATI would want to place its performance too closely to an existing card. I think we can expect at best "better than 9800GT", so I guess performance slightly lower than the 4770 (or more or less equivalent to the China-only Radeon HD 4750).
 
I generally have older PCs or hand-me-downs...

Right now I'm running a Dell Precision 650 with an AGP 2400 Pro @ 700MHz.

I play Torchlight, Left 4 Dead, Battlefield Heroes, BioShock... but I want to get back into PC gaming and pick up some newer titles. Besides, I'm having to run all my games at lower than 720p and with a lot less detail, so they just don't look as good.

I might be getting a newer Dell soon, one that has a PCI-E slot... so this would be perfect. Though a 4650 would still be a huge step up from my 2400 Pro, the 5670/5650 would be that much sweeter.

Instead of buying a machine, why not get your hands "dirty"?
 
That's just it, I don't buy the machines; some of these PCs are just given to me, and I can fix almost anything, so I tend to use what I can get my hands on until I take care of the debt I have... I've been building PCs since the 486DX days. It's just that money is very tight right now, but that shouldn't last too much longer.
 
True. This is a gamer card, a low-to-mid-budget one.

Non-gamers are content with IGPs.

This card will be able to play anything you throw at it in the next 3 years, and older games like COD4 can be maxed out easily.
Non-gamers might need more than IGPs: if you want smooth HD playback you need at least a 4350, and a 4550 will be better.
Go watch some real 1080p movies and you're gonna love the stutter on the IGP.

This card is more like a general-purpose card that can play games, and "playing" doesn't mean it's gonna be smooth.
Sure, you can play a game @30FPS, but it just won't be a decent experience.
Anything below a 5750 doesn't really cut it.

Call of Duty 4 is a joke; why buy a new DX11 card to play DX9 games that are more than 2 years old by now?
Go get DiRT 2 and see how this card holds up with games "3 years later".
Pricing will be the key, and we have yet to see how this card performs in real games.

BTW I can play some shitty old games on my IGP, so do I now have a "gamer IGP"? :nutkick:
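
To put the 30FPS point above in concrete terms, a throwaway frame-time calculation (pure arithmetic, nothing card-specific):

```python
# Frame-time budget at a given frame rate: how long each frame may take.
for fps in (30, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 30 FPS leaves a 33.3 ms budget per frame, so any hitch is far more
# visible there than inside the 16.7 ms budget you get at 60 FPS.
```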
 
Stupid question, but isn't there supposed to be another 5600 series card coming out, something about XT and Pro?

This is true, I've been able to max out AA with no FPS hit for a few years now.
I was talking about AF, not AA.

I'm just really crossing my fingers for a low-profile version! I want to play games with more detail, but I don't want to spend the money on a 9800GT low-profile plus a new PSU when something like this would do the job without the need for a high-end PSU.
I recently bought a PNY 9800GT Green Edition 512MB GDDR3 card from Best Buy for only 65 bucks, running on a 250W PSU. The 5670 I plan to run with a 350W PSU.
 
ATI doesn't do XT and Pro anymore.

For XT it's xx70, and for Pro it's xx50.
 
Stupid question, but isn't there supposed to be another 5600 series card coming out, something about XT and Pro?

Most likely HD 5670 is the Redwood-XT, and HD 5650 is the Redwood-Pro.
 
Non-gamers might need more than IGPs: if you want smooth HD playback you need at least a 4350, and a 4550 will be better.
Go watch some real 1080p movies and you're gonna love the stutter on the IGP.

You don't need a graphics card to play H.264 1080p movies! What year are you living in? 2007?

I own an Acer Ferrari One. It has an Athlon Neo L310 (an old 65nm dual-core A64 @ 1.2GHz) and an HD3200 IGP, underclocked to 380MHz, which uses the old UVD 1 engine.
It plays every 1080p movie I throw at it, with no stutter whatsoever.
My HTPC, with a desktop HD3200 IGP, played Blu-rays with no stuttering at all.

Even the single-core Atom is capable of flawless 1080p playback with the ION IGP.

Nowadays, a discrete card is useless for most non-gamers. The IGPs (HD3200/4200 and GeForce 8300/9300) have everything needed for a full computing experience: more than sufficient 3D acceleration for OS gimmicks, complete video acceleration, dual-monitor output, HDMI sound passthrough, etc.

That's why dirt-cheap low-end graphics cards are gradually ceasing to exist. There's no market for them in new PCs.

This card is more like a general-purpose card that can play games, and "playing" doesn't mean it's gonna be smooth.
Sure, you can play a game @30FPS, but it just won't be a decent experience.
Anything below a 5750 doesn't really cut it.

Do you have the slightest idea of what the average PC gamer's rig looks like?

First of all, most PCs nowadays are notebooks, not desktops. The best GPU you find in mid-range laptops right now is the HD4650 with DDR3. Before that was out and popular (half a year ago), it was the 32sp 9600M GT.

Buy a high-end laptop and you get a GeForce GTX 260M or GTX 280M. Those are G92b chips corresponding to an underclocked 9800GT and 9800GTX.

This card should perform around the level of a 9800GT. That means the HD5670 will be a lot faster than most PC gaming rigs when it comes out.

And even on desktops: pay attention to the next Steam hardware survey and you'll see how far behind DX10 most people are. Or visit the GameSpot forums; you'll find many REAL gamers who buy new games every week and play them all on outdated hardware. They only change their hardware when the current setup refuses to run recent games.

Developers know this too. They won't marginalize 95% of their customers in a few years.

Call of Duty 4 is a joke; why buy a new DX11 card to play DX9 games that are more than 2 years old by now?
Go get DiRT 2 and see how this card holds up with games "3 years later".
Pricing will be the key, and we have yet to see how this card performs in real games.

BTW I can play some shitty old games on my IGP, so do I now have a "gamer IGP"? :nutkick:

First off, you seem to have some issues with older games. Just because a game is over a year old, you call it "shitty old" and a "joke"? Now that's a joke in itself.
COD4 is a "joke" that still sells hundreds of copies worldwide and is used for hundreds of gaming tournaments. I'm replaying KOTOR, with graphics settings maxed out, on my subnotebook. It's not a shitty game; it's one of the best games I've ever played.

How will this card perform in 3 years? Easy: as well as an HD2600 XT performs now, which is damn fine if you ask me. Sure, you can only crank the settings up to medium or low, but it'll play every game and still provide a satisfactory gaming experience.
 
ATI doesn't do XT and Pro anymore.

For XT it's xx70, and for Pro it's xx50.

ATI/AMD still very much uses the naming system, but now it's reserved for GPU code names, while the larger four-digit number is the model name. For example, RV620LE is the code name commonly used by ATI and its vendors internally, while we know that card as the Radeon HD 3450. It's good to know the code names of the various GPUs around, as they give you greater insight beyond just the "branding" of the card. For example, to the average computer-illiterate person, the Radeon HD 4200 IGP must be more powerful than the Radeon HD 3450 because the model number is bigger, but if we look at the code name for the Radeon HD 4200, we see that its designation is also RV620LE: essentially, they are the same GPU with a different model name.

Another example of this is the Radeon HD 4730, which was mainly released in Europe (although some did make it to North America). We would assume this card is similar to the Radeon HD 4770, which has the code name RV740, but if we look at the code name, the Radeon HD 4730 is called the RV770CE! That means it's not part of the same RV740 series as the 4770 but is derived from the Radeon HD 4800 series, whose code name (excluding the 4890) is RV770. Therefore we know the 4730 is actually a cut-down 4830/4850/4870, rather than being related to the 4770 as one would believe from the model name.
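
To make the model-name vs. code-name point concrete, here's a small lookup table built from the examples in this post (illustrative only; the mappings are the ones claimed above):

```python
# Retail model name -> internal GPU code name, per the examples above.
# The lesson: a bigger model number doesn't imply a newer or faster chip.
MODEL_TO_CODENAME = {
    "Radeon HD 3450": "RV620LE",
    "Radeon HD 4200": "RV620LE",   # same silicon, newer branding
    "Radeon HD 4770": "RV740",
    "Radeon HD 4730": "RV770CE",   # cut-down RV770, not an RV740 derivative
}

for model, codename in MODEL_TO_CODENAME.items():
    print(f"{model:>16} -> {codename}")
```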
 
I get 37 to 45 FPS with my crappy HD4650 on the DiRT 2 demo at 1280x1024 (the max res of my screen) with high and some ultra settings. The only settings that kill my card are AA and shadows, so with shadows set to medium and AA turned off it runs fine. So I think this new 5670 will be a nice budget card.
 
Nice, I've got that exact same IGP (yes, HD3200) and I do notice the stutter when playing HD movies ;)
It depends on the video itself; the IGP does suffer stutters at higher compression rates and in scenes with higher bitrates.

Oh, so you think I was calling COD4 a shitty old game? :slap:
I guess you're the one implying it is. :laugh:

LOL, even the 8600GT from that era plays like shit on most newer games.
Now you actually believe the HD2600XT is still "enough"? :eek:
Cards of that generation are barely enough for the DX9 games that were already out at the time.
DX10 doesn't even really come into play here. The 8600GT was at most as good as the X1950GT.

Talking about notebooks, when were mid-range notebooks ever intended for gaming?
As far back as I can remember, "gaming notebooks" are almost exclusively high-end models.
 
Nice, I've got that exact same IGP (yes, HD3200) and I do notice the stutter when playing HD movies ;)
It depends on the video itself; the IGP does suffer stutters at higher compression rates and in scenes with higher bitrates.

It doesn't stutter at all. Not at resolutions up to 1080p anyway, no matter what compression or bitrate you play.

Do you know how to enable DXVA for MKV movies? Maybe you're not using any hardware acceleration at all.
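
On the DXVA question: one quick way to check whether a clip really decodes in hardware is to push it through ffmpeg with the DXVA2 hwaccel and watch the log. A rough sketch, assuming a Windows ffmpeg build with dxva2 support on PATH, and a placeholder filename:

```python
# Test hardware (DXVA2) decode of an MKV clip via ffmpeg on Windows.
# "movie.mkv" is a placeholder; requires ffmpeg with dxva2 support on PATH.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hwaccel", "dxva2", "-i", "movie.mkv", "-f", "null", "-"],
    capture_output=True, text=True,
)
# ffmpeg logs to stderr; DXVA init failures or silent software-decode
# fallback warnings will show up here.
print(result.stderr[-1000:])
```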



Oh, so you think I was calling COD4 a shitty old game? :slap:
I guess you're the one implying it is. :laugh:

LOL, even the 8600GT from that era plays like shit on most newer games.
Now you actually believe the HD2600XT is still "enough"? :eek:

Yes, the HD2600XT and the 8600GT still provide an enjoyable experience for all games.
I can afford higher-performing hardware, but if I couldn't, I'd still play PC games with a graphics card like those.
Don't forget that either of them has more mojo than the GPUs you find in the PS3 or X360.



Talking about notebooks, when were mid-range notebooks ever intended for gaming?
As far back as I can remember, "gaming notebooks" are almost exclusively high-end models.

Name it what you want; most gamers use laptops, not desktops. I can assure you there are a lot more people playing recent games on sub-1000€ laptops than there are on mid-range desktops.
 
It doesn't stutter at all. Not at resolutions up to 1080p anyway, no matter what compression or bitrate you play.

Do you know how to enable DXVA for MKV movies? Maybe you're not using any hardware acceleration at all.

Yes, the HD2600XT and the 8600GT still provide an enjoyable experience for all games.
I can afford higher-performing hardware, but if I couldn't, I'd still play PC games with a graphics card like those.
Don't forget that either of them has more mojo than the GPUs you find in the PS3 or X360.
You can't compare consoles and PCs like this, dude.
Consoles run highly optimized code, and that allows them to run games largely on their beefy CPUs.
The G70-series GPU in the PS3 is only used for output and some 3D work like hardware skinning and its ROPs.

This in turn leads to exactly why old cards like the HD2600XT simply aren't enough anymore:
console ports are generally not optimized for PC, so they require much more powerful hardware to do the same job.
Have you ever played a game on a console and seen it stutter?
At least I know I have. :shadedshu
Sometimes even console games aren't written well, and players notice slowdowns in certain scenes.
 
You can't compare consoles and PCs like this, dude.
Consoles run highly optimized code, and that allows them to run games largely on their beefy CPUs.
The G70-series GPU in the PS3 is only used for output and some 3D work like hardware skinning and its ROPs.

This in turn leads to exactly why old cards like the HD2600XT simply aren't enough anymore:
console ports are generally not optimized for PC, so they require much more powerful hardware to do the same job.
Have you ever played a game on a console and seen it stutter?
At least I know I have. :shadedshu
Sometimes even console games aren't written well, and players notice slowdowns in certain scenes.

Most games are written on the PC and then ported over to Xbox and PS3. Also, there is software that automatically optimizes the program for each platform.
 
Most games are written on the PC and then ported over to Xbox and PS3. Also, there is software that automatically optimizes the program for each platform.
Consoles and PCs don't even run the same 3D API,
so I highly doubt a converter can automatically optimize for each platform, especially when you consider the wide range of hardware found in PCs.
(It's likely you can convert the calls from one API to another, but that isn't optimization, is it?)
 
Consoles and PCs don't even run the same 3D API,
so I highly doubt a converter can automatically optimize for each platform, especially when you consider the wide range of hardware found in PCs.
(It's likely you can convert the calls from one API to another, but that isn't optimization, is it?)

It wasn't me who said it; I read it in a magazine, in an article by the developers of Dead Space.
 
Weak card. Good cruncher, maybe. I'm sure it won't carry any DX11 games at the resolutions and quality settings any real gamer is looking for.

I don't think it's weak, considering my 4670 can still run almost all the latest games at 1920x1080 with the highest settings at around 30 FPS, which is generally accepted as playable (except Crysis, which is only playable at 1280x720 on high settings).

And that makes me believe the 5670, with 80 more SPs and double the bandwidth, can perform better and will last at least 2 or 3 years, given the slow progress toward photo-realistic graphics nowadays (most developers make games that are playable on platforms like the PS3 and Xbox 360; in fact, CryEngine 3 was developed so Crysis 2 can run on multiple platforms).

Considering that, you shouldn't be afraid that this 5670 can't hold up to the latest games over the next 3 years. Sure, this isn't the card that gets rich gamers hyped, but for those on a budget it's the best bang for the buck.
 
Quick question: how much power will this 5670 draw, about 59W do you think?
Because I plan to buy this card to go along with a 300W PSU (secondary rig).
 
Quick question: how much power will this 5670 draw, about 59W do you think?
Because I plan to buy this card to go along with a 300W PSU (secondary rig).

The 5850 uses about 10 watts at idle and about 100 at max, so this will be much, much less.
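
For a rough sanity check on the 300W question, back-of-the-envelope wattage math goes a long way. A minimal sketch; every figure below is an assumption for illustration, not a measured number:

```python
# Back-of-the-envelope PSU load estimate (all wattages are assumptions).
system_watts = {
    "CPU": 95,                  # typical mid-range desktop CPU TDP
    "HD 5670 (estimated)": 60,  # ballpark from this thread, not a confirmed TDP
    "motherboard + RAM": 40,
    "drives + fans": 25,
}

total = sum(system_watts.values())
print(f"Estimated peak load: {total} W")            # ~220 W
print(f"Headroom on a 300 W PSU: {300 - total} W")  # ~80 W
```

Under those assumptions a decent 300W unit has headroom to spare, though a cheap PSU's real-world capacity may be well below its label.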
 