Wednesday, December 2nd 2009
ATI Radeon HD 5670 Pictured, Detailed, and Tested
AMD's lower-mainstream DirectX 11 compliant graphics card slated for Q1-2010, the ATI Radeon HD 5670, has been pictured and detailed, sourced from a [H]ardOCP HardForum community member. The HD 5600 series is based on a 40 nm GPU codenamed "Redwood". Going by the specifications the GPU-Z screenshot shows, it has a SIMD engine scaled down by 50%, with 400 stream processors, while it retains a 128-bit wide GDDR5 memory interface and 16 ROPs. Assuming the clock speeds shown in the screenshot are the reference speeds, they are 775 MHz for the core and 1000 MHz for the 1 GB of memory (resulting in 64 GB/s of memory bandwidth).
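For reference, the 64 GB/s figure follows straight from those two numbers; a minimal sketch of the arithmetic, assuming GDDR5's usual quad-data-rate signalling:

# Memory bandwidth from the reported HD 5670 specs.
# GDDR5 transfers data at 4x the command clock shown by GPU-Z.
def gddr5_bandwidth_gb_s(memory_clock_mhz, bus_width_bits):
    transfers_per_s = memory_clock_mhz * 1e6 * 4    # effective transfer rate
    bytes_per_transfer = bus_width_bits / 8         # bus width in bytes
    return transfers_per_s * bytes_per_transfer / 1e9

print(gddr5_bandwidth_gb_s(1000, 128))  # -> 64.0 GB/s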
An engineering sample of the card has also been pictured, revealing a red-colored PCB that breaks away from the black PCB scheme of the rest of the HD 5000 series. The card draws all its power from the PCI-Express slot. The GPU cooler consists of a simple heatsink with radially-projecting metal fins, in which a fan is nested. Output connectivity includes DVI, HDMI, and D-Sub, though the leads behind the D-Sub connector show that offering a DisplayPort in its place might be possible.

The user also put the card through two tests, in a performance comparison with the Radeon HD 4670 512 MB, the card this one replaces. The test-bed comprised an Intel Core i5 750 @ 2.66 GHz, a Gigabyte P55M-UD2, and 4 GB of OCZ DDR3-1333 memory. The first test was the Street Fighter IV benchmark at 1600 x 1200, with no AA and 16x AF. The HD 5670 scored 10,473 points, with an average frame-rate of 95.35 fps; the HD 4670, on the other hand, scored 8,559 points with an average frame-rate of 65.35 fps. Next was the Unigine Heaven Demo v1.0, DirectX 10 (SM 4.0), at 1024 x 768, windowed. With the CPU running at 2.53 GHz (according to the screenshot), the HD 5670 scored 859 points with a 34.1 fps frame-rate, while the HD 4670, with the CPU running at 2.66 GHz, scored 699 points with 27.8 fps.
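Taking only the figures reported above, the deltas work out to roughly a 22-46% lead for the HD 5670; a quick sketch of that arithmetic (no numbers beyond those quoted are assumed):

# Relative uplift of the HD 5670 over the HD 4670, from the reported scores.
results = {
    "Street Fighter IV score":   (10473, 8559),
    "Street Fighter IV avg fps": (95.35, 65.35),
    "Heaven 1.0 score":          (859, 699),
    "Heaven 1.0 avg fps":        (34.1, 27.8),
}
for name, (hd5670, hd4670) in results.items():
    print(f"{name}: +{(hd5670 / hd4670 - 1) * 100:.1f}%")
# -> SF4 score +22.4%, SF4 fps +45.9%, Heaven score +22.9%, Heaven fps +22.7%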
Source:
HardForum
80 Comments on ATI Radeon HD 5670 Pictured, Detailed, and Tested
right now i'm running a dell precision 650 with an agp 2400pro @700mhz
i play torchlight, left 4 dead, battlefield heroes, bioshock...but i want to get back into pc gaming and pick up some newer titles...and besides, i'm having to run all my games at lower than 720p res and a lot less detail so they just don't look as good
i might be getting a newer dell soon, one that has a pci-e slot...so this would be perfect, though a 4650 would still be a huge step up from my 2400pro...the 5670/5650 would be that much sweeter
i do like the look of this new 5670 tho but ima pass and just get a 5770 after xmas and a new 23/24" screen.
www.fudzilla.com/content/view/16681/1/
Go watch some real 1080p movies and you're gonna love the stutter on the IGP.
This card is more like a general-purpose card that can play games, and by playing I don't mean it's gonna be smooth.
Sure you can play a game @30FPS but it just won't be a decent experience.
Anything below a 5750 doesn't really cut it.
Call of Duty 4 is a joke, why buy a new DX11 card to play DX9 games that are more than 2 years old by now?
Go get Dirt2 and see how this card holds up with games "3 years later".
The pricing will be the key, and we have yet to see how this card performs in real games.
BTW I can play some shitty old games on my IGP, so do I now have a "gamer IGP"? :nutkick:
For XT it's xx70, and Pro is xx50
I own an Acer Ferrari One. It has an Athlon Neo L310 (old 65nm dual core A64 @ 1.2GHz) and a HD3200 IGP, underclocked to 380MHz, which uses the old UVD 1 engine.
It plays every 1080p movie I throw at it, with no stutter whatsoever.
My HTPC, with a desktop HD3200 IGP, played Blu-Rays with no stuttering at all.
Even the single core Atom is capable of flawless 1080p playback with the ION IGP.
Nowadays, a discrete card is useless for most non-gamers. They (HD3200/4200 and GeForce 8300/9300) have everything needed for a full computing experience: more than sufficient 3D acceleration for OS gimmicks, complete video acceleration, dual monitor output, HDMI sound passthrough, etc.
That's why dirt-cheap low-end graphics cards are gradually ceasing to exist. There's no market for those in new PCs. Do you have the slightest idea of what the average PC gamer's rig looks like?
First of all, most PCs nowadays are notebooks, not desktops. The best GPU you find in mid-range laptops right now is the HD4650 with DDR3. Before that was out and popular (half a year ago), it was the 32sp 9600M GT.
You buy a high-end laptop and you get a GeForce GTX 260M or a GTX 280M. Those are G92b chips corresponding to an underclocked 9800GT and 9800GTX.
This card should have a performance around a 9800GT. This means the HD5670 will be a lot faster than most PC gaming rigs when it comes out.
And even on desktops, pay attention to the next hardware survey from Steam and you'll know how far behind most people are from DX10. Or take a visit to the GameSpot forums. You'll find many REAL gamers who buy new games every week and play them all, using outdated hardware. They only change their hardware when the current one refuses to run recent games.
Developers know this too. They won't marginalize 95% of their customers in a few years. First off, you seem to have some issues with older games. Just because the game is over a year, you call it "shitty old" and a "joke"? Now that's a joke itself.
COD4 is a "joke" that still sells hundreds of copies worldwide and is used for hundreds of gaming tournaments. I'm replaying KOTOR, with graphic settings maxed out, in my subnotebook. It's not a shitty game. It's one of the best games I ever played.
How will this card perform in 3 years? Easy: as well as an HD2600 XT performs now, which is damn fine if you ask me. Sure, you can only crank the settings up to medium or low, but it'll play every game and still provide a satisfactory gaming experience.
Another example of this is the Radeon HD 4730, which was mainly released in Europe (although some did make it to North America). You would assume that this card is similar to the Radeon HD 4770, which has the codename RV740, but if you look at the codename, the Radeon HD 4730 is called the RV770CE! This means it's not part of the same RV740 series as the 4770, but is instead derived from the Radeon HD 4800 series, whose codename (excluding the 4890) is RV770. Therefore we know that the 4730 is actually a cut-down 4830/4850/4870, rather than being related to the 4770 as one would believe from the model name.
It depends on the video itself; the IGP does suffer stutters at higher compression rates and in scenes with higher bitrate.
Oh so you think I am talking about COD4 as a shitty old game? :slap:
I guess you are the one implying it is. :laugh:
LOL, even the 8600GT from that time plays like shit on most newer games.
Now you actually believe the HD2600XT is still "enough"? :eek:
Cards of that generation are barely enough for the DX9 games that were already out at that time.
DX10 doesn't even really come into play here. The 8600GT was at most as good as the X1950GT.
Talking about notebooks, when are mid-range notebooks intended for gaming?
As far back as I can remember, "gaming notebooks" are almost exclusively high-end models.
Do you know how to enable DXVA in mkv movies? Maybe you're not using any hardware acceleration at all. Yes, the HD2600XT and the 8600GT still provide an enjoyable experience for all games.
I can afford higher performing hardware but if I couldn't, I'd still play PC games with a graphics card like those.
Don't forget that either of them has more mojo than the GPUs you find in either PS3 or X360. Name it what you want, most gamers use laptops, not desktops. I can assure you there are a lot more people playing recent games in sub-1000€ laptops than there are with mid-end desktops.
Consoles run highly optimized code, and that allows them to run games largely on their beefy CPUs.
The G70-series GPU in the PS3 is only used for output and some 3D duties, like hardware skinning, and for its ROPs.
This in turn leads to exactly the point of why old cards like the HD2600XT simply aren't enough anymore.
Console ports are generally not optimized for PC, therefore they require much more powerful hardware to do the same job.
Have you actually played games on a console and seen them stutter?
At least I know I did. :shadedshu
Sometimes even console games aren't written well, and players can notice slowdowns in certain scenes.
So I highly doubt that a converter can automatically optimize for each platform.
(It is likely that you can convert the calls from one API to another, but that isn't optimization, is it?)
Especially when you consider the wide range of hardware there is on the PC.
and that makes me believe the 5670, with 80 more SPs and double the bandwidth, can perform better and will last at least 2 or 3 years, thanks to the slow progress toward photo-realistic graphics nowadays (most developers make games that are playable on platforms like the PS3 and Xbox 360; in fact, CryEngine 3 is developed so it can run Crysis 2 on multiple platforms).
considering this fact, you shouldn't be afraid that this 5670 can't hold up to the latest games in the next 3 years. sure, this is not the card that makes rich gamers hyped, but for those on a budget, this is the best bang for the buck.
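For a rough on-paper comparison behind that claim, here is a minimal sketch of theoretical shader throughput and bandwidth; the HD 4670 reference figures (320 stream processors at 750 MHz, roughly 32 GB/s) are assumptions not stated in the article:

# Illustrative theoretical comparison; HD 4670 figures are assumed reference specs.
def gflops(stream_processors, core_clock_ghz):
    # Each stream processor can issue a multiply-add (2 FLOPs) per clock.
    return stream_processors * 2 * core_clock_ghz

hd4670 = {"gflops": gflops(320, 0.750), "bandwidth_gb_s": 32}
hd5670 = {"gflops": gflops(400, 0.775), "bandwidth_gb_s": 64}
print(hd4670)  # {'gflops': 480.0, 'bandwidth_gb_s': 32}
print(hd5670)  # {'gflops': 620.0, 'bandwidth_gb_s': 64}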
Because I plan to buy this card to go along with a 300 W PSU (secondary rig)?
As far as I know, the HD 5850 peaks out at around 150W and usually loads at around 110W :p
This card I would say is around 40W, and it peaks out around 60ish maybe.
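A rough budget check, if those ballpark figures are anywhere close; the CPU TDP and platform overhead below are assumptions for illustration, not numbers from this thread:

# Rough power-budget check for a 300 W PSU (illustrative numbers only).
psu_watts = 300
card_peak = 60     # estimated HD 5670 peak draw from the post above
cpu_tdp = 95       # assumed mainstream quad-core TDP
platform = 50      # assumed motherboard, RAM, drives, fans
total = card_peak + cpu_tdp + platform
print(f"Estimated load: {total} W, headroom: {psu_watts - total} W")
# -> Estimated load: 205 W, headroom: 95 W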