
HIS Radeon HD 6850 1 GB

W1zzard

AMD's new Radeon HD 6850 arrives at an extremely affordable $179 price point, which raises the question of whether it can become the new weapon of choice for budget-oriented gamers. In our testing we see excellent performance that enables gaming at full HD resolution in DirectX 11.

Problem? It'd be sweet to get a card with 1120 shaders instead of 960...
Hopefully a few of these make it out into retail :D
 
Great article. You write very well, something I admire in a technology reviewer.
 
Wow, that's really funny. They hid that LITTLE heatsink under that HUGE shroud. Good to know it doesn't need a beastly cooler, but kinda humorous at the same time.
 
Great card.... Too bad it performed worse than the 5850.... At least it's better than the GTX 460... but then the 460 has good overclocking potential...
 
I must say I'm very impressed with their midrange cards outdoing the top tier cards from 6 months ago. I'm pumped for 69xx. Good work Wizz on the reviews, and the V-mod section is an AWESOME addon :)
 
Great card.... Too bad it performed worse than the 5850.... At least it's better than the GTX 460... but then the 460 has good overclocking potential...
The HD 6850 is priced at $179 and the HD 5850 was $259 at release, so why exactly is it bad that it performs like this?

Stupid naming scheme!
 
This sample seems a little odd. I'm curious how you managed to reconfigure the shaders, as I thought they were now physically disabled during production.

Is it possible that this sample was somewhat crippled in its overclocking due to the problem with the shaders? I could be wrong; I just assumed that the shaders that should have been disabled could be ones that were not, and thus it was unable to hit anywhere near the speeds of the other two chips you reviewed.

I really have no idea what I'm talking about. I don't have a clue how disabling SPs works or how it is decided which SPs are the ones that need to be disabled :confused:

Great review as always; these five reviews are making for great morning reading. One question though: how big are the heatpipes on this cooler? They look so massive in the picture that they make me think 10 mm, but I didn't think those were being used on any cooler. Are they possibly 8 mm?

*edit*
Is it just me who thinks the heatpipes look big? It could just be that these cards are so much smaller than almost every other card I have looked at recently... either that or I just need more coffee :laugh:
 
Are the temps posted correct?

I can't understand how it can stay so incredibly cool (45 °C) and then, after a mild OC, jump 30 degrees up.
 

Attachments: temps.jpg
Are the temps posted correct?

I can't understand how it can stay so incredibly cool (45 °C) and then, after a mild OC, jump 30 degrees up.

fixed
 
Great review, as always. The card definitely lived up to expectations. It's listed at a big German e-tailer for only 149.99€! That's lower than the 1 GB GTX 460, but the performance is better. And the card overclocks pretty well too. This is the new best price/performance card out there.
 
Nice review as always W1zz, on all the 68xx cards.

Do you think it would be possible to include a Source engine game in the benches, such as L4D or something, anything??? :)

The only other constructive criticism I can give would be to include some of the other ATI x8xx-series cards in the benches, such as the 3850, 3870, 4850, and 4870. I think this would have been awesome in here http://www.techpowerup.com/reviews/HIS/Radeon_HD_6850/7.html since there are probably a ton of people still running them, and it would be really nice to see where their card stands against the newer cards, especially in a Source engine benchmark :)

I mean, you have the test bed out? Might as well keep on chugging ;) C'mon, Source engine benchmark, can I start a petition?? ;)
 
Nice review as always W1zz, on all the 68xx cards.

Do you think it would be possible to include a Source engine game in the benches, such as L4D or something, anything??? :)

The only other constructive criticism I can give would be to include some of the other ATI x8xx-series cards in the benches, such as the 3850, 3870, 4850, and 4870. I think this would have been awesome in here http://www.techpowerup.com/reviews/HIS/Radeon_HD_6850/7.html since there are probably a ton of people still running them, and it would be really nice to see where their card stands against the newer cards, especially in a Source engine benchmark :)

I mean, you have the test bed out? Might as well keep on chugging ;) C'mon, Source engine benchmark, can I start a petition?? ;)

I admit I do look out for Source engine benchmarks on other sites, but it's kind of pointless, as even my 4870 spends a lot of time over 100 fps in L4D2, so any mid- to high-end card now just pushes further past 100 fps. If anything, the only really useful Source engine benchmarks would be ones run on triple-screen setups.
 
My favorite part of these tests is the performance summary, but I like the 3DMark03 test too. It illustrates well the difference between the cards, and probably also the manufacturers not optimizing drivers for this old benchmark program :toast:.

Other:
A Hungarian test site found that the older Radeons are significantly faster with the 10.10 beta driver than they were before. So those who would like to trade their HD 58xx for an HD 68xx should factor that in; the difference might be bigger than what we got here.
 
Maybe the sample was a cut-down 6870 with some defective parts....
 
Maybe the sample was a cut-down 6870 with some defective parts....

If there were defective parts, the card would not come with 1120 shaders.
 
Maybe there is a possibility to unlock any HD 6850 to an HD 6870?

W1zzard, did you save the BIOS from this 'faulty' HD 6850?
 
Scores high in perf/$ and perf/W in the 19x12 summary, nice. Will probably buy this as my last desktop upgrade before I go all-out laptop. :D

Thanks for the review, much appreciated!
 
If there were defective parts, the card would not come with 1120 shaders.

SO... does this mean there is the possibility of BIOS-flashing a 6850 to a 6870?!?!

One can only hope...
 
SO... does this mean there is the possibility of BIOS-flashing a 6850 to a 6870?!?!

One can only hope...

When AMD makes the GPUs, they get tested: does it make XT clocks, do all the XT shaders work? If not, it ends up in another bin called "Pro". When board partners order the GPUs to build their boards, AMD will modify the fuses in the ASIC (think Intel multiplier) to put it into the right shader configuration. These fuses can not be changed.

If for some reason the fuses do not get applied in production (like on our sample), there is the additional feature of having fuses in the BIOS that change the shader config. In that case, by flashing a different BIOS you could change the shader config.

But this does not happen for retail boards anymore; for years all AMD GPUs have been fused at the ASIC level and not via BIOS. As mentioned in the article, my GPU is marked as an engineering sample and AMD probably forgot to fuse it correctly. Since HIS did expect the GPUs to be fused, they did not change the config via BIOS.

Again, for retail, don't expect any unlocking.
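
To picture the fusing logic above, here is a minimal sketch in Python (made-up names and values, not AMD's actual firmware logic): permanent ASIC fuses win whenever they have been burned, and only an unfused chip falls back to whatever the BIOS fuse table says, which is why a BIOS flash can change an unfused engineering sample but not a retail card.

[code]
# Hypothetical illustration of the fuse priority described above.
# Names and values are made up; the real logic lives in hardware/firmware.

XT_SHADERS = 1120   # full Barts XT (HD 6870) config
PRO_SHADERS = 960   # cut-down Barts Pro (HD 6850) config

def effective_shader_count(asic_fuses_burned, asic_fused_shaders, bios_fused_shaders):
    """Resolve the shader config a card actually runs with."""
    if asic_fuses_burned:
        # Retail case: the config is locked in silicon,
        # so flashing a different BIOS changes nothing.
        return asic_fused_shaders
    # Unfused ASIC: the BIOS fuse table wins, so a different
    # BIOS really would change the shader config.
    return bios_fused_shaders

# Retail HD 6850: fused to Pro in silicon, stuck at 960 regardless of BIOS.
print(effective_shader_count(True, PRO_SHADERS, PRO_SHADERS))    # 960

# The engineering sample: ASIC left unfused, BIOS still at the XT config.
print(effective_shader_count(False, PRO_SHADERS, XT_SHADERS))    # 1120
[/code]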
 
I admit I do look out for Source engine benchmarks on other sites, but it's kind of pointless, as even my 4870 spends a lot of time over 100 fps in L4D2, so any mid- to high-end card now just pushes further past 100 fps. If anything, the only really useful Source engine benchmarks would be ones run on triple-screen setups.

Yes, but I have a 120 Hz (120 fps) Samsung 2233RZ, and at this point in time the few games I play all run on the Source engine, so it's kinda nice to see how newer cards perform compared to the older 48xx and 38xx series (temp-, power-, noise-, and performance-wise).
 
When AMD makes the GPUs, they get tested: does it make XT clocks, do all the XT shaders work? If not, it ends up in another bin called "Pro". When board partners order the GPUs to build their boards, AMD will modify the fuses in the ASIC (think Intel multiplier) to put it into the right shader configuration. These fuses can not be changed.

If for some reason the fuses do not get applied in production (like on our sample), there is the additional feature of having fuses in the BIOS that change the shader config. In that case, by flashing a different BIOS you could change the shader config.

But this does not happen for retail boards anymore; for years all AMD GPUs have been fused at the ASIC level and not via BIOS. As mentioned in the article, my GPU is marked as an engineering sample and AMD probably forgot to fuse it correctly. Since HIS did expect the GPUs to be fused, they did not change the config via BIOS.

Again, for retail, don't expect any unlocking.

Can you sell me that unlocked 6850 card for $90 CAD?? :laugh: n/m the 6870
 
Looks like a great deal in CrossFire, nice work W1zz.
 
Yes, but I have a 120 Hz (120 fps) Samsung 2233RZ, and at this point in time the few games I play all run on the Source engine, so it's kinda nice to see how newer cards perform compared to the older 48xx and 38xx series (temp-, power-, noise-, and performance-wise).

That's a good point: with more and more 120 Hz monitors coming out, the target for them is 120 fps instead of 60 fps, even more so with 3D.

I have been looking around for some Source engine benchmarks tonight, as I often play L4D2 with friends and have been trying to get an idea of high-res performance, mainly looking at 2560x1600 since it's the closest to the res I will be running (2560x1600 is 4,096,000 pixels and the res I will be running is 5040x1050, so 5,292,000; about 1.2 million off, but it's as close as most reviews go :laugh:), and at that kind of res the fps really takes a hit.
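
For anyone comparing resolutions the same way, the arithmetic is just width times height. A throwaway Python snippet (the resolution list is my assumption of typical review resolutions, not taken from this review) ranks them by how close their pixel counts come to a 5040x1050 setup:

[code]
# Rank candidate review resolutions by how well their pixel counts
# approximate a 5040x1050 triple-screen setup (5,292,000 pixels).
target = 5040 * 1050

review_resolutions = ["1280x1024", "1680x1050", "1920x1200", "2560x1600"]

def pixels(res):
    w, h = map(int, res.split("x"))
    return w * h

for res in sorted(review_resolutions, key=lambda r: abs(pixels(r) - target)):
    print("%s: %10d pixels (off by %9d)" % (res, pixels(res), abs(pixels(res) - target)))
# 2560x1600 comes out closest, about 1.2 million pixels short.
[/code]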
 
After taking a second look at the power consumption of the 6850, I'm not sure these new cards really are more power-efficient. At maximum, which I'm guessing is the unrealistic stress-testing run, there's a clear gap, sure. But at all the other points it's only a few watts below the 5850 while also being slower than the 5850. Bump the clocks up to match a 5850 and the power usage would probably be the same, if not more, on average. So price-efficient, yes, but not really a huge step up power-wise... unless you're always running some sort of GPU-compute program. I'd still pick a 68 over a 58, but I just got the initial impression that there was more of a power-usage difference.
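
The efficiency argument here boils down to performance per watt, i.e. average fps divided by average board power. A quick sketch with purely hypothetical numbers (not figures from the review) shows how drawing a few watts less while also being a bit slower can net out to nearly identical efficiency:

[code]
# Hypothetical numbers, only to illustrate the perf-per-watt point;
# the real figures are in the review's performance and power charts.
def perf_per_watt(avg_fps, avg_watts):
    return avg_fps / avg_watts

hd5850 = perf_per_watt(avg_fps=60.0, avg_watts=110.0)   # ~0.545 fps/W
hd6850 = perf_per_watt(avg_fps=55.0, avg_watts=102.0)   # ~0.539 fps/W

print("HD 5850: %.3f fps/W" % hd5850)
print("HD 6850: %.3f fps/W" % hd6850)
# Nearly identical: being slower cancels out drawing slightly less power.
[/code]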
 