# AMD Radeon HD 6800 Series Specifications Leaked



## btarunr (Oct 19, 2010)

Specifications of the upcoming Radeon HD 6800 series GPUs have been doing the rounds for the last couple of days, and ChipHell.com has finally managed to leak an alleged press deck for the HD 6800 series that discloses the GPUs' specifications and some key features AMD will introduce with this generation. Judging by the slides, AMD seems to have stepped up performance per die size in a big way (up to a 35% increase in performance per mm²), with some reconfiguring of key components. It also redesigned the GPUs for up to a 100% increase in tessellation performance, new image-quality enhancements, a new video acceleration engine (UVD 3), and a redesigned display I/O with 2nd-generation Eyefinity technology that lets users of standard variants drive up to six displays with a single card.

Specifications of the HD 6870 are: 1120 stream processors, 32 ROPs, 56 TMUs, 256-bit GDDR5 memory interface holding 1 GB, clock speeds of 900/1050(4200) MHz core/memory(effective), and idle/max board power of 19W/151W. For the HD 6850, it's 960 stream processors, 32 ROPs, 48 TMUs, 256-bit GDDR5 memory interface holding 1 GB, clock speeds of 775/1000(4000) MHz, idle/max board power of 19W/127W.
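As a quick sanity check, the leaked memory numbers are internally consistent: GDDR5 is conventionally quoted at four data transfers per memory-clock cycle, so the effective rates and the implied peak bandwidth work out as below (a minimal sketch; the function names are mine, not AMD's):

```python
# GDDR5 convention: 4 data transfers per memory-clock cycle.
def gddr5_effective_mhz(memory_clock_mhz):
    return memory_clock_mhz * 4

# Peak bandwidth = effective transfer rate x bus width in bytes.
def peak_bandwidth_gbps(memory_clock_mhz, bus_width_bits):
    return gddr5_effective_mhz(memory_clock_mhz) * 1e6 * (bus_width_bits // 8) / 1e9

print(gddr5_effective_mhz(1050))       # HD 6870: 4200, matching the slide
print(gddr5_effective_mhz(1000))       # HD 6850: 4000
print(peak_bandwidth_gbps(1050, 256))  # HD 6870: 134.4 GB/s implied
```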




----------



## HTC (Oct 19, 2010)

btarunr said:


> Specifications of the upcoming Radeon HD 6800 series GPUs have already been doing rounds for the last couple of days, and ChipHell.com finally managed to leak an alleged press-deck of the HD 6800 series that discloses the GPUs' specifications and some key features that AMD will introduce with this generation. What can be said looking at the slides is that AMD seems to have stepped up performance/die-size big time (up to 35% increase in performance per mm²), with some reconfiguring of key components. It also redesigned the GPUs to have up to 100% increase in tessellation performance, new image-quality enhancements, a new video acceleration engine (UVD 3), and a redesigned display IO with 2nd Gen. Eyefinity technology that can let users of standard variants drive up to six displays with a single card.
> 
> Specifications of the HD 6870 are: 1120 stream processors, 32 ROPs, 56 TMUs, 256-bit GDDR5 memory interface holding 1 GB, clock speeds of 900/1050(4200) MHz core/memory(effective), and idle/max board power of 19W/151W. For the HD 6850, it's 960 stream processors, 32 ROPs, 48 TMUs, 256-bit GDDR5 memory interface holding 1 GB, clock speeds of 775/1000(4000) MHz, idle/max board power of 19W/127W.



19W idle? DAMN!!!!!


----------



## wolf (Oct 19, 2010)

Swish specs! Better performance than the 5850 is claimed, which is promising, but I had hoped for closer to 5870 performance from the 6870. Maybe driver updates will bridge that gap, and heck, I ain't seen a review yet.

Maybe some CrossFire improvements too?


----------



## afw (Oct 19, 2010)

Ok ... this is really irritating ... I just wanna see some benchmarks ...


----------



## vinhak49 (Oct 19, 2010)

*good job AMD*

They're still just specs; let's see how far they improve over the HD 5870 and HD 5850.


----------



## cadaveca (Oct 19, 2010)

I think these are still 5-D, and they added extra shaders to the existing Juniper design because of the extra pad space required for the shift to 256-bit memory. I expect a small increase over 4890 performance, around 20% (800 + 160 extra shaders = +20%, plus some extra because of the faster memory bus). With the new dispatch processor and tessellation thrown in, we might see a big impact from just those changes, with no shader change.

I do not see the 5870/5850 being obsoleted... the whole renaming and such will keep 5850/5870 value up.
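The +20% figure above is just the shader-count ratio, under cadaveca's (stated, unverified) assumption that performance scales roughly linearly with shader count:

```python
# cadaveca's back-of-envelope estimate: 960 shaders vs an 800-shader design,
# assuming performance scales linearly with shader count (his assumption).
old_shaders = 800
new_shaders = 800 + 160
gain = (new_shaders - old_shaders) / old_shaders
print(f"{gain:.0%}")  # 20%
```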


----------



## KainXS (Oct 19, 2010)

probably but I wonder how much tessellation performance they have now


----------



## cadaveca (Oct 19, 2010)

KainXS said:


> probably but I wonder how much tessellation performance they have now



Double the previous gen. Not sure if it's two full "old units" or a completely new design, though.


----------



## buggalugs (Oct 19, 2010)

afw said:


> Ok ... this is really irritating ... I just wanna see some benchmarks ...



There are some AMD benchmarks here (images not preserved):

http://forums.overclockers.com.au/showpost.php?p=12435812&postcount=1981



The 460 (768 MB/1 GB) is on average 25%-ish slower than the 6850/6870, even in games like Metro 2033 that use tessellation. The 460 is doomed.


----------



## cadaveca (Oct 19, 2010)

Thanks, that basically confirms these are still 5-D, and that there are now two setup/tessellation engines.

The slide says 12-14 SIMD engines, which MUST be 5-D (960/12 = 80), shows a separate tessellator, etc... confirms the 100% increase... ugh... this is exactly what I expected.


----------



## a_ump (Oct 19, 2010)

Maybe. Why is it that you think they're still 5-D? It's been well explained and strongly rumored to be 4-D. I'm pretty sure that's actually a fact.


----------



## buggalugs (Oct 19, 2010)

vinhak49 said:


> They're still just specs; let's see how far they improve over the HD 5870 and HD 5850.



It's not a replacement for the 5870/5850. The 6850/6870 are about the same as the 5850/5870, except with double the tessellation performance.

The 6850/6870 are meant to replace the 5770 performance bracket and take on the Nvidia 460... which they do.


----------



## HTC (Oct 19, 2010)

Here are the benches from AMD (images not preserved):

Got these from the link provided in post #9 of this topic.


----------



## cadaveca (Oct 19, 2010)

a_ump said:


> Maybe. Why is it that you think they're still 5-D? It's been well explained and strongly rumored to be 4-D. I'm pretty sure that's actually a fact.



They could not have used the rumoured 4-D design, added the new features shown in those slides, plus increased the bus width, and still kept power within those confines (<150 W for the 6850/5850).

What they have done to get the increase in performance is add two setup engines, separating the die into two and making it like a real dual-core GPU, with each half having its own tessellation engine. This way they can literally double the GPU usage (needed, based on my own experience with Cypress, and on poor CrossFire scaling exposing less than 60% GPU usage per card).

See, I know they have issues keeping shaders filled; that would fix it perfectly, and it also explains the higher geometry throughput.

This is just a simple refresh.

If it WAS 4-D, then 800 shaders should basically double the performance of the previous gen, but this is not the case (i.e., the 6870 would be faster than the 5870). I mean, I may be wrong, but I highly doubt they could do this and stay pin-compatible with Cypress... it would require a different power layout for the socket, IMHO. 4-D would also require the setup engine change, but I do not think we'd have SIMDs with 80 units any more.


----------



## HalfAHertz (Oct 19, 2010)

cadaveca said:


> I think these are still 5-D, and they added extra shaders to the existing Juniper design because of the extra pad space required for the shift to 256-bit memory. I expect a small increase over 4890 performance, around 20% (800 + 160 extra shaders = +20%, plus some extra because of the faster memory bus). With the new dispatch processor and tessellation thrown in, we might see a big impact from just those changes, with no shader change.
> 
> I do not see the 5870/5850 being obsoleted... the whole renaming and such will keep 5850/5870 value up.



I think you are wrong. Look at the slides: on the fourth they show a legend map of the die, and you can see the 14 SIMDs. Each of them consists of 5 texture units, and if you look carefully, you can see 4 texture cores in each TU, which hold the SPs. So if we say the number of SPs per core is x, then x × 4 × 5 × 14 = 1120, which means x = 4.

Edit: I am guessing the 6850 has two SIMDs disabled, which would look like 4 SPs × 4 TCs × 5 TUs × 12 SIMDs = 960 SPs.
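The counting argument above can be sanity-checked with a one-liner (the layout itself is HalfAHertz's reading of the slide, not a confirmed spec):

```python
# SPs per texture core x texture cores per TU x TUs per SIMD x SIMD count.
def stream_processors(sps_per_core, cores_per_tu, tus_per_simd, simds):
    return sps_per_core * cores_per_tu * tus_per_simd * simds

print(stream_processors(4, 4, 5, 14))  # 1120, the leaked HD 6870 count
print(stream_processors(4, 4, 5, 12))  # 960, the leaked HD 6850 count
```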


----------



## mechtech (Oct 19, 2010)

vinhak49 said:


> They're still just specs; let's see how far they improve over the HD 5870 and HD 5850.



In some areas it may not improve over the 5870, as that has 1600 shaders!!


----------



## buggalugs (Oct 19, 2010)

Haha, you guys stop posting them, I already did it!! lol


----------



## ebolamonkey3 (Oct 19, 2010)

HTC said:


> 19W idle? DAMN!!!!!



What's the idle power usage on the 5850 and 5870?


----------



## cadaveca (Oct 19, 2010)

HalfAHertz said:


> I think you are wrong. Look at the slides: on the fourth they show a legend map of the die, and you can see the 14 SIMDs. Each of them consists of 5 texture units, and if you look carefully, you can see 4 texture cores in each TU, which hold the SPs. So if we say the number of SPs per core is x, then x × 4 × 5 × 14 = 1120, which means x = 4.
> 
> Edit: I am guessing the 6850 has two SIMDs disabled, which would look like 4 SPs × 4 TCs × 5 TUs × 12 SIMDs = 960 SPs.



I see, in the red vertical boxes (a single SIMD), 16 smaller boxes. Each of those 16 boxes holds the actual shaders. Now, to me, 16 × 4 = 64, and 12 × 64 is not enough.

I've blown up the pic you refer to as large as possible, and I do not see anything to indicate what you are saying. Keep in mind, my 30-inch does a good job of displaying enlarged images.

Time will tell, though... a short wait is all that is left.

Funny, many said the cards were going to launch today, and I said no (I actually said I doubted it was possible because of constraints in memory production). Look... no real launch today... just some pictures...


----------



## HTC (Oct 19, 2010)

ebolamonkey3 said:


> What's the idle power usage on the 5850 and 5870?



It's 18 W for the reference 5850.

Strangely enough, I thought it was more... must have confused it with some other card, I guess.


----------



## afw (Oct 19, 2010)

So the 6870 is around 20% faster than a 1 GB GTX 460 and the 6850 is around 25% faster than a 768 MB GTX 460 (according to the graphs)... hmmm... but IMHO I don't think these will be competitive at $249 and $199, 'cos currently the GTX 460s are selling for $220 and $170 (1 GB/768 MB). Hope they'll release them at a lower price...
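afw's complaint can be put in numbers. A rough sketch: the prices are the rumored/street ones from this thread, and the ~20/25% leads come from AMD's own graphs, so nothing here is verified:

```python
# Relative performance per dollar, each Radeon vs. its GTX 460 rival.
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

# HD 6870 (~20% faster, rumored $249) vs GTX 460 1 GB (~$220 street)
print(perf_per_dollar(1.20, 249) / perf_per_dollar(1.00, 220))  # ~1.06
# HD 6850 (~25% faster, rumored $199) vs GTX 460 768 MB (~$170 street)
print(perf_per_dollar(1.25, 199) / perf_per_dollar(1.00, 170))  # ~1.07
```

At those prices the new cards would offer only ~6-7% better performance per dollar, which is why the rumored pricing looks unaggressive.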


----------



## btarunr (Oct 19, 2010)

ebolamonkey3 said:


> what's the idle power usage on the 5850 and 5870?



25w/27w.


----------



## cadaveca (Oct 19, 2010)

btarunr said:


> 25w/27w.



Only when the clocks drop down to 157/300, which isn't the case in most configs now, thanks to the dual-monitor flicker problem. My main card idles at 400/900, even with only one installed, but the slave will drop down to 157/300.

And there's the source of my cursor corruption, too.


----------



## btarunr (Oct 19, 2010)

cadaveca said:


> Only when the clocks drop down to 157/300, which isn't the case in most configs now, thanks to the dual-monitor flicker problem.



Idle wattage obviously is measured on the GPU's default 2D clock profile, both by AMD and NVIDIA.


----------



## buggalugs (Oct 19, 2010)

cadaveca said:


> Only when the clocks drop down to 157/300, which isn't the case in most configs now, thanks to the dual-monitor flicker problem.



Well, to be fair, most users (95%?) still only have one screen. I don't know if anyone has figures, but I suspect Eyefinity users are a tiny number overall.


----------



## cadaveca (Oct 19, 2010)

btarunr said:


> Idle wattage obviously is measured on the GPU's default 2D clock profile, both by AMD and NVIDIA.



I know... but the default isn't the same now as it was at launch. I've seen an increase of a bit over 10 °C at idle thanks to the clock increase.

Buggalugs, I have been running one monitor for some time now, as I RMA'ed one card a bit over two weeks ago. My idle clocks are 400/900, UVD clocks are 600/1250, and full speed is 900/1250.


----------



## filip007 (Oct 19, 2010)

I hope they have drivers ready for the 35% increase in speed they claim on the first slide!


----------



## the54thvoid (Oct 19, 2010)

cadaveca said:


> I know... but the default isn't the same now as it was at launch. I've seen an increase of a bit over 10 °C at idle thanks to the clock increase.
> 
> Buggalugs, I have been running one monitor for some time now, as I RMA'ed one card a bit over two weeks ago. My idle clocks are 400/900, UVD clocks are 600/1250, and full speed is 900/1250.



I disabled ULPS for my cards and then pressed 'default' in CCC ATI Overdrive, which reset the clocks back to 157/300, if you're interested.


----------



## btarunr (Oct 19, 2010)

cadaveca said:


> I know... but the default isn't the same now as it was at launch. I've seen an increase of a bit over 10 °C at idle thanks to the clock increase.



You altered its settings, resulting in increased power draw/voltages. If you use it according to its default specifications, idle power draw will be as mentioned. There is no false marketing on the part of the GPU manufacturer. Both manufacturers rate their idle power draws (and we test to prove it) under the same conditions.


----------



## cadaveca (Oct 19, 2010)

btarunr said:


> You altered its settings, resulting in increased power draw/voltages. If you use it according to its default specifications, idle power draw will be as mentioned. There is no false marketing on the part of the GPU manufacturer. Both manufacturers rate their idle power draws (and we test to prove it) under the same conditions.



I didn't alter any settings. See the post above yours... it was a driver edit that increased the idle clocks. I cannot remember which driver it was, maybe March/April, but there were TONNES of people complaining about it. I quite intentionally haven't modified anything, because of all the issues I had. You're more than welcome to come over to my house and see what's up.

I don't even unlock the lock in CCC... no VGA overclock here. Yes, I have overclocked my cards, but I use a completely separate OS for that.

EDIT: You may be right though, BTA, and it's the overclocked BIOSes from XFX that are on my cards causing the issue... but whatever the problem is, it has nothing to do with anything I personally did.


----------



## N3M3515 (Oct 19, 2010)

So, if the 6870 is equal to or slightly faster than a 5850 (just like I said it would be, in another thread), it would be a huge win if its direct competitor is the GTX 460 1 GB (even OC'ed).
Which means Nvidia could lower the price on the GTX 460, nice!
Or AMD could raise the price of the 6870, lol.

An overclocked 6870 would be on par with a GTX 470 or slightly faster (depending on overclocking room, of course).


----------



## EastCoasthandle (Oct 19, 2010)

"New and improved image quality features.  AA and AF".  Is that implying the MLAA mode?


----------



## largon (Oct 19, 2010)

^At least there's hope they fixed the HD 5000 series' broken AF... 

But I think the most interesting detail is the die shot of a Cypress on one of 'em slides.


----------



## cadaveca (Oct 19, 2010)

EastCoasthandle said:


> "New and improved image quality features.  AA and AF".  Is that implying the MLAA mode?



Yeah, there was a slide referring to a new AA mode... can't remember the specifics, though. I want to call it morphing AA or something... morphological?


----------



## JATownes (Oct 19, 2010)

If these specs are accurate, my aging 4850's will be replaced in the next couple of weeks with a pair of 6870s.  WOO-HOO!!!


----------



## the54thvoid (Oct 19, 2010)

cadaveca said:


> I didn't alter any settings. See the post above yours... it was a driver edit that increased the idle clocks. I cannot remember which driver it was, maybe March/April, but there were TONNES of people complaining about it. I quite intentionally haven't modified anything, because of all the issues I had. You're more than welcome to come over to my house and see what's up.
> 
> I don't even unlock the lock in CCC... no VGA overclock here. Yes, I have overclocked my cards, but I use a completely separate OS for that.
> 
> EDIT: You may be right though, BTA, and it's the overclocked BIOSes from XFX that are on my cards causing the issue... but whatever the problem is, it has nothing to do with anything I personally did.



Hey man, I can back you up. When I installed 10.9, defaults went up to 400/900. They started back at 157/300. I don't OC my gfx cards. It was/is a driver issue; perhaps it only affects CrossFire setups that way?


----------



## Animalpak (Oct 19, 2010)

Those benches make me laugh. How can you believe them?? Bahaha.


----------



## R4PT0R (Oct 19, 2010)

*HD 6870 price*

Check this page: http://www.compusa.com/applications/SearchTools/item-details.asp?EdpNo=6799926&CatId=3585 According to that, the 6870 is gonna be around $270. That lands it in the HD 5830/GTX 460 zone.


----------



## Ross211 (Oct 19, 2010)

Here's hoping that AMD releases the "Real Meat & Potatoes" HD 6000 series (6970 & 6990) this week.


----------



## cadaveca (Oct 19, 2010)

the54thvoid said:


> Hey man, I can back you up. When I installed 10.9, defaults went up to 400/900. They started back at 157/300. I don't OC my gfx cards. It was/is a driver issue; perhaps it only affects CrossFire setups that way?



Honestly, I do not know, but I do know it's one of the things I am looking to see "fixed" with these cards.

I'm not really too concerned with the performance of these cards. I'm more focused on the outstanding issues that I and many others are still left dealing with when using the 5-series cards.

The 5870 is a great card, performance-wise. I've not had a single issue since I sent my second card off for RMA... the idle clock thing is a bit frustrating, but not really an issue, as it solves the flicker/artifacting problems. AND I really do think that THAT particular behaviour, the new ULPS settings in the driver, is what causes the cursor corruption.

See, the thing is, on many ASUS motherboards the lower PCI-E 16x slot is actually the main slot for dual VGAs, but if you plug your monitor into the lower card, you don't see any image onscreen until Windows boots and loads the VGA driver. So the monitor gets plugged into the top card... the slave card... which gets the ULPS clocks of 157/300. The main card, the lower card, gets the 400/900 idle clocks... but doesn't have anything plugged into it, so that particular fix fails, and I'm still left with cursor corruption and flickering screens.

Why ASUS does that with the PCI-E slots I do not know. I don't even know if that's the actual cause, but it could be... just one of the many possibilities that has come up over the last... damn... almost 13 months now.


----------



## erocker (Oct 19, 2010)

I never got increased idle clocks with my 5850s in CrossFire through the 10.9s. After the 10.7s (maybe), idle clocks increased only when I overclocked the cards.


----------



## cadaveca (Oct 19, 2010)

Well, that's it... my cards are overclocked out of the box, so maybe that's the issue... I honestly don't have the slightest idea. You also edited your 2D clocks, erocker...

All I know is that on a fresh OS I installed over the weekend, I do not get 157/300 idle clocks, and I have not plugged in a second card, nor anything other than my single 3008WFP.

If I could actually find a cause for these issues, they wouldn't be important. The biggest problem with all of it is that so many users have problems while a few do not.


----------



## pantherx12 (Oct 19, 2010)

Cadaveca, does this occur with a single monitor also?

(Making sure Eyefinity is switched off properly.)

I think Eyefinity ups the idle clocks due to the flicker problem.

If not, I've no clue as to why it's going up at all D:


----------



## cadaveca (Oct 19, 2010)

Yeah, I've never connected more than one monitor on the current OS.

Here's a good thread about the idle clock issues, grey screen, and flicker:

HD 5XXX, 2D Clocks & Video

Several users edited their idle clocks and fixed the issues, and then, if I remember correctly, ATI rolled that into the driver... there were other single-card, single-GPU users complaining about the higher clocks... some got them lowered by resetting to factory defaults in CCC, others did not.

Anyway, "we" increased the 2D clocks to fix some issues that I hope aren't a problem in low-power mode with the new cards.


----------



## KainXS (Oct 19, 2010)

I bet wiz is sitting with a 6870 right now:shadedshu


----------



## btarunr (Oct 19, 2010)

largon said:


> But I think the most interesting detail is the die shot of a Cypress on one of 'em slides.



If you look closely, that die shot is of a Barcelona Opteron.


----------



## Yukikaze (Oct 19, 2010)

KainXS said:


> I bet wiz is sitting with a 6870 right now:shadedshu



Two, probably


----------



## cadaveca (Oct 19, 2010)

btarunr said:


> If you look closely, that die shot is of a Barcelona Opteron.



I didn't want to step up and say anything, but I kinda wondered about that too, 'cause it really does look like a quad-core.

Good catch.


----------



## N3M3515 (Oct 19, 2010)

Yukikaze said:


> Two, probably



And two 6850


----------



## NeSeNVi (Oct 19, 2010)

Actually, the HD 6850 doesn't look bad... those 127 W could be even less :>


----------



## erocker (Oct 19, 2010)

cadaveca said:


> Well that's it...my cards are overclocked out of the box, so maybe that's the issue..I honestly don't have the slightest. You also edited your 2D clocks, erocker...



I stopped editing 2D clocks after the driver came out that raised them when overclocking. With that driver I noticed that while playing video (where I had problems), the clocks would actually rise to "video mode" instead of staying at 157/300 like with previous drivers. The only thing I did differently was always disabling ULPS mode.


----------



## Ross211 (Oct 19, 2010)

KainXS said:


> I bet wiz is sitting with a 6870 right now:shadedshu





N3M3515 said:


> And two 6850



And two 6970 & 6990.  lol.

Probably under NDA ;~(


----------



## AddSub (Oct 19, 2010)

Wow, so soon. These are like 2-3 times faster than top end Radeon 5xxx series, _rite_, _rite_? Awsomumme! AMD ftw!!! 

Anywho...


----------



## cadaveca (Oct 19, 2010)

erocker said:


> Only thing different I did was always disable ULPS mode.



Ah.

Anyway, it doesn't matter too much as it pertains to the 6-series... let's just hope there's no fiddling with clocks on these ones. I think 100/300 are the new low-power clocks, and you'd hope AMD wouldn't make the same mistake twice... but you never know.


It's actually looking like a whole lot has changed... so much so that I'm actually kinda curious to see how it all pans out.


----------



## wahdangun (Oct 19, 2010)

Just a 35% increase? Hmm, if I remember correctly, didn't AMD promise to double the performance of the HD 5770?


----------



## OneCool (Oct 19, 2010)

cadaveca said:


> Ah.
> 
> 
> 
> It's actually looking like a whole lot has changed..so much so I'm actually kinda curious to see how it all pans out.




Same here. I figured AMD would have just overclocked the 5xxx series and stamped 6xxx on it.

Looking forward to the release now.


----------



## largon (Oct 19, 2010)

btarunr said:


> If you look closely, that die shot is of a Barcelona Opteron.


Wow, I just saved it and glanced around. Thought it looked like nothing I expected to see on Cypress, more like a garbled mess than distinct structures. Then I saw your comment and thought, "No way. I should have seen K10 right away." 

Then I realized it's actually _two images superimposed._ 
It's Barcelona _and RV770_ mushed into one:

I did the same in PS, just for the giggles (images not preserved).

----------



## LAN_deRf_HA (Oct 19, 2010)

afw said:


> So the 6870 is around 20% faster than a 1 GB GTX 460 and the 6850 is around 25% faster than a 768 MB GTX 460 (according to the graphs)... hmmm... but IMHO I don't think these will be competitive at $249 and $199, 'cos currently the GTX 460s are selling for $220 and $170 (1 GB/768 MB). Hope they'll release them at a lower price...



Exactly. At every opportunity, AMD chooses not to start a price war they could easily win, just to slot in under or above Nvidia products. It's starting to feel like we aren't benefiting much from the competition these days.


----------



## CDdude55 (Oct 19, 2010)

HTC said:


> Here are the benches from AMD:
> 
> http://img249.imageshack.us/img249/866/66553985.jpg
> 
> ...



I never trust the benchmarks from the actual manufacturers, because for some reason their own cards always seem to come out on top.

Still waiting for some benchmarks and reviews of these cards.


----------



## Edito (Oct 19, 2010)

I'm slowly getting into the AMD realm; those designs look fantastic, and for me that's very good... I was an Nvidia fan because of the designs, drivers, and slogan, lol, but the new AMD cards are just fantastic... they only need to fix the drivers...


----------



## Benetanegia (Oct 19, 2010)

CDdude55 said:


> I never trust the benchmarks from the actual manufacturers, because for some reason their own cards always seem to come out on top.
> 
> Still waiting for some benchmarks and reviews of these cards.



You mean like this one, where they claimed the HD 5870 would be faster than the GTX 295? (Image not preserved.)

Yeah, comparing the two slides doesn't do the HD 6870 any favors.


----------



## [H]@RD5TUFF (Oct 19, 2010)

Meh, this looks more or less like a low-power version of the 5 series; I'll wait for the 7 series. People make such a big deal about power these days, when the difference is a whopping 20 dollars a year on your power bill.



Edito said:


> I'm slowly getting into the AMD realm; those designs look fantastic, and for me that's very good... I was an Nvidia fan because of the designs, drivers, and slogan, lol, but the new AMD cards are just fantastic... they only need to fix the drivers...



Better drivers will never happen; they're too concerned with keeping up with/trying to beat Nvidia, and their drivers show it. They always have, and always will. At least Nvidia's drivers work.


----------



## the54thvoid (Oct 19, 2010)

To be fair, they have a slide (number 67) that shows configuration settings, so it would probably be quite clear that they tweaked things to get that. At least in saying 'see slide 67 for settings' they're maintaining some 'credibility'.

But I agree, manufacturer slides mean nothing. Bear in mind, though, that there aren't actual releases yet... these are still leaks.

Will they beat the 460s? Probably. Will they be priced the same? Like shit they will.

And why are we quibbling about the Barts ones? I want Cayman, ffs!


----------



## CDdude55 (Oct 19, 2010)

Benetanegia said:


> You mean like this one, where they claimed the HD 5870 would be faster than the GTX 295?
> 
> http://i25.tinypic.com/2emg9zs.jpg
> 
> Yeah, comparing the two slides doesn't do the HD 6870 any favors.



Let me rephrase my statement for that slide:

> I never trust the benchmarks from the actual manufacturers, because for some reason their own cards always seem to come out on top or win the majority of performance tests, which are of course tailor-made in favor of their own cards' design.


----------



## the54thvoid (Oct 19, 2010)

[H]@RD5TUFF said:


> Better drivers will never happen; they're too concerned with keeping up with/trying to beat Nvidia, and their drivers show it. They always have, and always will. At least Nvidia's drivers work.



Yeah, 'cos this never happened...

http://www.zdnet.com/blog/hardware/warning-nvidia-19675-drivers-can-kill-your-graphics-card/7551

And their gfx chips never had problems...

http://www.tomshardware.com/news/HP-Having-Nvidia-Chipset-Issues,7669.html

And for the record, I CTD playing MoH with my 5850s, so they're not perfect either. Can we have a third gfx company, please?


----------



## f22a4bandit (Oct 19, 2010)

the54thvoid said:


> Yeah cos this never happened...
> 
> http://www.zdnet.com/blog/hardware/warning-nvidia-19675-drivers-can-kill-your-graphics-card/7551
> 
> ...



You'll have your wish when Intel gets their GPUs right.


----------



## JATownes (Oct 19, 2010)

f22a4bandit said:


> You'll have your wish when Intel gets their GPUs correct



By then we will all be flying cars and living on Mars though.


----------



## [H]@RD5TUFF (Oct 19, 2010)

the54thvoid said:


> Yeah cos this never happened...
> 
> http://www.zdnet.com/blog/hardware/warning-nvidia-19675-drivers-can-kill-your-graphics-card/7551
> 
> ...



I never stated they were perfect either, but you will be hard pressed to find anyone saying they have never had an issue with ATI cards and drivers, whereas the vast majority of Nvidia users have never had one problem. Personally I run both, and the quality of Nvidia's drivers trumps AMD's in every way. Not to mention CCC is a bunch of jumbled garbage and forces needless menu navigation.


----------



## f22a4bandit (Oct 19, 2010)

JATownes said:


> By then we will all be flying cars and living on Mars though.



That's far earlier than I thought they'd perfect their first real GPU. I was thinking closer to flying at 99.99% (repeating, of course) the speed of light and settling somewhere near Alpha Centauri.


----------



## aj28 (Oct 20, 2010)

[H]@RD5TUFF said:


> Better drivers will never happen; they're too concerned with keeping up with/trying to beat Nvidia, and their drivers show it. They always have, and always will. At least Nvidia's drivers work.



People need to stop whining about AMD's drivers. Not that anecdotal evidence means anything, but I've never had a persistent issue caused by their software. Ever. And I've been using them regularly since before CCC. Never had to revert to an old version, either. And no, I don't regularly run Driver Sweeper or anything of that nature when I upgrade... In fact, the first issue I've had with graphics drivers in years occurred when I purchased a GT240 and started getting constant crashes, BSODs, and driver resets on a clean install of W7 64-bit. Needless to say, after trying multiple driver revisions, it was sold to purchase my HD5770...

Basically, I think your statement is a little strong, and more than a little exaggerated.

All that being said, 32 ROP FTW! They look like nice cards, but unfortunately I think I'll be passing on this round...


----------



## MxPhenom 216 (Oct 20, 2010)

I'll wait for Nvidia's GTX 475/485 and their supposed GTX 580.

The ATI 6 series looks good, but only time will tell with their drivers.


----------



## MxPhenom 216 (Oct 20, 2010)

aj28 said:


> People need to stop whining about AMD's drivers. Not that anecdotal evidence means anything, but I've never had a persistent issue caused by their software. Ever. And I've been using them regularly since before CCC. Never had to revert to an old version, either. And no, I don't regularly run Driver Sweeper or anything of that nature when I upgrade... In fact, the first issue I've had with graphics drivers in years occurred when I purchased a GT240 and started getting constant crashes, BSODs, and driver resets on a clean install of W7 64-bit. Needless to say, after trying multiple driver revisions, it was sold to purchase my HD5770...
> 
> Basically, I think your statement is a little strong, and more than a little exaggerated.
> 
> All that being said, 32 ROP FTW! They look like nice cards, but unfortunately I think I'll be passing on this round...



Don't be so ignorant. Just because you haven't had problems with AMD software doesn't mean you need to go flaming people who mention they have. I had quite a few when I had an HD 5870. I had a lot of problems with microstuttering on a single card, and every driver release either made it worse or just pooped all over the fps I was getting before. Once the ATI 10.7 and 10.8 releases turned out to be garbage, I ditched it, got my 470, and have never been happier.


----------



## Steevo (Oct 20, 2010)

[H]@RD5TUFF said:


> I never stated they were perfect either, but you will be hard pressed to find anyone saying they have never had an issue with ATI cards and drivers, whereas the vast majority of Nvidia users have never had one problem. Personally I run both, and the quality of Nvidia's drivers trumps AMD's in every way. Not to mention CCC is a bunch of jumbled garbage and forces needless menu navigation.



I use both also, and have only had a few issues with ATI drivers. Nvidia has cheated with their drivers so many times it's not even funny, just trying to get the upper hand.

Stability: most of that is not due to the drivers, but to the users and their expectations/overclocks, game instability, and other system issues. I'm an admin for quite a few systems, and almost all have ATI cards, about 7 have Nvidia, and a few have Intel IGPs. I have had zero issues with any. Your comment about CCC is partly just you being unfamiliar with it, and partly it being total shit to navigate. I hate the display settings; it's almost like they purposely hide certain settings. Right-clicking the second display to configure options is just ridiculous.


----------



## CDdude55 (Oct 20, 2010)

Before my 470, I owned both an HD 4870 and a 5770; both cards ran fine with the drivers I installed on them, and overclockability, stability, and performance were great. I think only if you really analyze the drivers will you find something lacking. Gaming-wise, nothing was holding me back in the driver department with those cards.

The GeForce drivers have also been very good to me with my GTX 470 and much older 8600 GTS.



aj28 said:


> but unfortunately I think I'll be passing on this round...



Same here. (though i still want to see benchmarks, reviews, and unboxing videos/pics of this line up )


----------



## [H]@RD5TUFF (Oct 20, 2010)

Steevo said:


> I use both also, have only had a few issues with ATI drivers. Nvidia has cheated with their drivers so many times its not even funny, just trying to get the upper hand.
> 
> Stability, most of that is not due to the drivers, but the users and their expectations/overclocks, games instability, other system issues. I'm a admin for quite a few systems, and almost all have ATI cards, about 7 have Nvidia, and a few have IGP from Intel. I have had zero issues with any, and your comment about CCC is just your being unfamiliar with it. And it being total shit to navigate. I hate the displays settings, it is almost like they purposefully hide certain settings. Right click second display to configure options is just retarded.



I've been running ATI cards since I bought a Sapphire 9600XT, and Nvidia since I bought an Asus GeForce 2, so I am familiar with both, and I never had a single issue with my AMD drivers back then, whereas the Nvidia card would BSOD (which turned out to be a poorly soldered capacitor). Yeah, I've had 2-3 computers at once since I was 13.  All that said, AMD could increase its market share by a lot if it would just nut up, admit its drivers are crap, and fix the damn problem. They could make customer service a priority, something Nvidia seems to have forgotten about as of late.


----------



## HTC (Oct 20, 2010)

CDdude55 said:


> *I never trust the benchmarks from the actual manufactures, because for some reason their own cards always seem to come out on top*..
> 
> Still waiting for some benchmarks and reviews of these cards..



True that.

Personally, I don't care who wins or who loses: I just want a card that uses low power (idle), is quiet, runs at low temps (stock cooler), and has performance, in that order.

When I bought this card, nVidia didn't use as little power as ATI and ran way hotter, which is why I went with ATI instead of nVidia, but I would have gone the other way if the tables were reversed.


What I do care about, however, is whether ATI gets a big lead on nVidia, because that would put ATI in Intel's position: able to sell their products at much higher prices (compared to now).

If ATI passes nVidia by a bit, that's fine. If ATI trails nVidia by a bit, that's fine too. If either company opens a big gap over the other, that's BAD ... *BAD*, I tell you ...


----------



## EastCoasthandle (Oct 20, 2010)

[H]@RD5TUFF said:


> Never stated they were perfect either, but you will be hard pressed to find anyone saying they have never had an issue with ATI cards and drivers, whereas the vast majority of Nvidia users have never had one problem. Personally I run both, and the quality of Nvidia drivers trumps AMD's in every way. Not to mention the CCC is a bunch of jumbled garbage and forces needless menu navigation.



Really, not one problem?  I guess you were unaware of the stuttering/latency issues, etc., that many complained about when the 460 was released.  There are other threads in other forums that discussed this, but I thought two would be enough.

Now that that's out of the way: neither is better than the other, as both have their problems from time to time.  However, with AMD adding better AF and MLAA to CCC, I can't see why those wouldn't be noteworthy additions.  That suggests they are trying to make some improvements on the driver front (in terms of added functionality).  As for stability, we will have to see as time goes on after the 6800 series release.

Edit:
Oh, btw, Cat 10.10 will be released this week.  So let's start with that.


----------



## N3M3515 (Oct 20, 2010)

Benetanegia said:


> You mean like this one where they claimed HD5870 would befaster than GTX295?
> 
> http://i25.tinypic.com/2emg9zs.jpg
> 
> Yeah, comparing both slides doesn't do a favor to the HD6870.



For some reason I still believe a stock 6870 will be faster in every game (or like 80% of them), at every resolution from 1680x1050 up, than a stock GTX 460 1GB, just like the 5850 is right now....
And that percentage will increase due to the tessellation adjustments.

Bottom line: 
* The 5850 IS 8.5% faster (in the resolutions that MATTER) than a stock GTX 460. source: source
* The 5850 HAS 20% better performance per watt than the GTX 460 1GB. source: source
* The 6870 will be slightly faster than the 5850
* The 6870 will have 2x the tessellation performance of a 5850
* The 6870 will have better performance per watt than a 5850
* The 6870 IS cheaper to produce than a 5850 (in case Nvidia lowers the price of the GTX 460, AMD won't have a problem lowering prices to stay competitive)

So, the 5850 is faster than the GTX 460, and the 6870 will be faster than the 5850: 6870 > 5850 > GTX 460.
According to this, an overclocked GTX 460 is 6% faster than a stock 5850, and 13% faster if further overclocked; that's where things get interesting.
So a stock GTX 460 1GB will be no match for a stock 6870.
A typically overclocked GTX 460 1GB will be equal in performance to a stock 6870 (my guess).
Further overclocking the GTX 460 1GB would make it some 7-8% faster than a stock 6870.
How much can the 6870 be overclocked? Of that I have no idea.
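Relative-performance claims like these chain by multiplication, not addition. A quick sketch in Python (the 8.5%, 5%, and 13% figures are the rumors and review numbers quoted in this thread, not confirmed specs):

```python
# Chain the relative-performance claims from the thread (all rumored/benchmarked
# figures, not official numbers). Everything is normalized to a stock GTX 460 1GB.
gtx460 = 1.00                 # baseline: stock GTX 460 1GB
hd5850 = gtx460 * 1.085       # 5850 claimed ~8.5% faster at the resolutions that matter
hd6870 = hd5850 * 1.05        # 6870 rumored slightly (~5%) faster than a 5850

# An aggressively overclocked GTX 460 is claimed to be ~13% over a stock 5850:
gtx460_oc = hd5850 * 1.13

print(f"stock HD 6870 vs stock GTX 460: {hd6870 / gtx460:.3f}x")
print(f"OC GTX 460 vs stock HD 6870:    {gtx460_oc / hd6870:.3f}x")
```

With those inputs the overclocked GTX 460 lands about 7.6% ahead of a stock 6870, which is where the 7-8% figure above comes from.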


----------



## [H]@RD5TUFF (Oct 20, 2010)

EastCoasthandle said:


> Really, not one problem?  I guess you were unaware of the stuttering/latency issues, etc., that many complained about when the 460 was released.  There are other threads in other forums that discussed this, but I thought two would be enough.
> 
> Now that that's out of the way: neither is better than the other, as both have their problems from time to time.  However, with AMD adding better AF and MLAA to CCC, I can't see why those wouldn't be noteworthy additions.  That suggests they are trying to make some improvements on the driver front (in terms of added functionality).  As for stability, we will have to see as time goes on after the 6800 series release.
> 
> ...



Fanboy = fail

AMD = more problems

Why is it that more people complain about AMD drivers than Nvidia drivers when Nvidia has the larger market share?


----------



## CDdude55 (Oct 20, 2010)

[H]@RD5TUFF said:


> Why is it more people complain about AMD drivers than Nvidia drivers when Nvidia has a larger market share?



Because the majority doesn't do what we the minority do; the programs and games we run can be largely affected by the drivers we run and how well they are coded. The majority isn't going to see a difference in how fast their email opens by switching to new drivers, so they aren't affected.


----------



## btarunr (Oct 20, 2010)

[H]@RD5TUFF said:


> Fanboy = fail
> 
> AMD = more problems
> 
> Why is it that more people complain about AMD drivers than Nvidia drivers when Nvidia has the larger market share?



Heineken has a larger market share than Sam Adams, that doesn't change the fact that Heineken is pissier. Market share is hardly ever an argument if you're talking product quality.


----------



## a_ump (Oct 20, 2010)

Um.... AMD has more market share than Nvidia now. Thought that was understood, lol. It's been like this since Q1 2010.

http://xtreview.com/addcomment-id-13179-view-AMD-graphic-market-share-higher-than-NVIDIA.html


----------



## HXL492 (Oct 20, 2010)

Better than the 5850 but 25% less silicon?

OH noes the silicon reserves are running out!


----------



## f22a4bandit (Oct 20, 2010)

If it works just as well, or better, with better efficiency, then I'm all for it. Looking forward to possibly buying a new card to replace my 4850.


----------



## btarunr (Oct 20, 2010)

HXL492 said:


> Better than the 5850 but 25% less silicon?
> 
> OH noes the silicon reserves are running out!



Smaller die area = lower cost = lower price.
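A smaller die is cheaper for two compounding reasons: more dies fit on a wafer, and (for a fixed defect density) a larger fraction of them are defect-free. A back-of-the-envelope sketch; the die areas are the rumored ~255 mm² Barts vs ~334 mm² Cypress figures, while the wafer cost and defect density are made-up illustrative values:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude dies-per-wafer estimate: wafer area / die area,
    minus the standard edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost=5000, defects_per_mm2=0.004):
    """Cost per defect-free die under a simple Poisson yield model.
    wafer_cost and defects_per_mm2 are illustrative assumptions."""
    yield_frac = math.exp(-defects_per_mm2 * die_area_mm2)
    good_dies = dies_per_wafer(die_area_mm2) * yield_frac
    return wafer_cost / good_dies

barts, cypress = 255, 334   # rumored die areas in mm^2
print(f"Barts:   ~${cost_per_good_die(barts):.0f} per good die")
print(f"Cypress: ~${cost_per_good_die(cypress):.0f} per good die")
```

With these made-up inputs the smaller die comes out at roughly half the cost per good die, which is exactly the point: a 25% die-area cut compounds through dies-per-wafer and yield.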


----------



## bear jesus (Oct 20, 2010)

I'm intrigued by the multi-stream transport hub in the last slide. Does that mean an MST hub connected to the card via a DisplayPort 1.2 port lets you connect all the monitors for Eyefinity over DisplayPort, VGA, DVI, or HDMI, with only a single cable running out of the card?

If that's right and it actually works well, it would make choosing monitors for an Eyefinity setup much easier, although the price of the hub may be silly, just like the active DisplayPort converters used to be.

From the specs it looks like the 6870 would be a very nice upgrade over my 4870... but I think I will still be waiting for the 6970.


----------



## wahdangun (Oct 20, 2010)

[H]@RD5TUFF said:


> Fanboy = fail
> 
> AMD = more problems
> 
> Why is it more people complain about AMD drivers than Nvidia drivers when Nvidia has a larger market share?



Please don't call others fanboys when you are clearly the fanboy. It doesn't matter if it's ATI or Nvidia; they each have their own driver problems. 

So please stop the trolling.


----------



## Steevo (Oct 20, 2010)

[H]@RD5TUFF said:


> I've been running ATI cards since I bought a Saphire 9600XT, and Nvidia since I bought a Asus Geforce 2, so I am familiar with both, and I never had a single issue with my AMD drivers back then where as the Nvidia would BSOD (which turned out to be a poorly soldered capacitor). Yeah I've had 2-3 computers all at once since I was 13.  All that said AMD could increase it's market share by a lot if it would just nut up and admit it's drivers are crap, and fix the damn problem, they could make customer service a priority something Nvidia seems to have forgotten about as of late.



2 or 3 whole computers? Really?

I spent almost 70K at Newegg in the last year doing builds for work, clients, friends, family..... A lot of others here play with tens of thousands of dollars in hardware for their personal rigs. 

I have owned every ATI card series except the 2XXX series, as it truly was shit. I have owned GeForce cards, and recently helped with a friend's 460 upgrade from an older Nvidia card. It works well, but I could hardly navigate their control panel. He likes it, so ace for him; he has only used Nvidia and has the perception that they are always better than ATI/AMD at everything. So despite a card with better online video upscaling/display, lower power requirements, cooler operation, and faster performance in his games being available for less money, he chose the more expensive option because it seemed to be the better product.


----------



## a_ump (Oct 20, 2010)

I guess nobody read my post.... AMD has the majority of the graphics market share now, NOT Nvidia.


----------



## Benetanegia (Oct 20, 2010)

N3M3515 said:


> For some reason I still believe a stock 6870 will be faster in every game (or like 80% of them), at every resolution from 1680x1050 up, than a stock GTX 460 1GB, just like the 5850 is right now....
> And that percentage will increase due to the tessellation adjustments.
> 
> Bottom line:
> ...



I guess you made a typo, and where you say HD6850 you mean HD6870, because everything suggests that an HD6850 will NOT be faster than an HD5850, not by a long shot.

Comparing the GTX460 to the HD6870 is pointless anyway, because Nvidia will not position the GTX460 against the HD6870 once it is released. They will release a GTX475, or whatever they call the full-blown GF104 card. That card, with a 15% higher shader count and probably a 10-20% higher clock (750-800 MHz), will be about 25-40% faster than the GTX460, making it substantially faster, decidedly faster than the GTX470 and maybe even faster than the HD5870.

It's true that in the $200 bracket AMD will have a clear advantage in perf per cost of manufacturing and perf/die area, but why? Only because, by releasing their $200 card later, they have been able to position their full-blown chip against a crippled part. It's the same when they say that the HD6870 is better in perf/die area than the HD5850: they are comparing a fully enabled chip against a crippled and underclocked part, while counting the die area of the fully enabled Cypress die. But oh well, that's the same stupid mistake that so many people have always made with the HD3870 vs 8800GT, HD4870 vs GTX260, etc.



a_ump said:


> i guess no body read my post....AMD has the majority of the graphics marketshare now NOT Nvidia



I'll wait until numbers are posted, but that's most probably false. In Q2 AMD had 52% market share, true, but I would hardly call that the majority.

No official numbers have been posted that include GTX460 sales, not to mention the GTS450 and lower cards. Meaning that AMD achieved that 52% with the complete HD5000 family vs. GF100 + GT240/220, etc. It's nice to see competition, and wanting AMD to be ahead is nice and all, but keep it real. The Steam hardware survey makes it very clear that the GTX460 has sold A LOT more than any other card. In just 2 months its DX11 user base has increased to 5.5%, or more than 1/6 of what the HD5800 series (3 cards combined) has sold over a complete year. 

Not only that, but every single AMD card has seen a decline in DX11 share, while every Nvidia card has had an increase, which clearly suggests that as of late Nvidia is outselling AMD. Also consider that Nvidia still sells DX10 cards, while AMD doesn't anymore.


----------



## N3M3515 (Oct 20, 2010)

Benetanegia said:


> I guess you made a typo, and where you say HD6850 you mean HD6870, because everything suggests that an HD6850 will NOT be faster than an HD5850, not by a long shot.
> 
> Comparing the GTX460 to the HD6870 is pointless anyway, because Nvidia will not position the GTX460 against the HD6870 once it is released. They will release a GTX475, or whatever they call the full-blown GF104 card. That card, with a 15% higher shader count and probably a 10-20% higher clock (750-800 MHz), will be about 25-40% faster than the GTX460, making it substantially faster, decidedly faster than the GTX470 and maybe even faster than the HD5870.
> 
> ...



Yeah, I meant 6870.

Anyway, if Nvidia positions the GTX 460 1GB at a lower price point, let's say 200 USD, then it would compete with the 6850 and beat it. You really think AMD would let that happen? No way.

Crippled or not, I'm comparing price points, so that's what matters.

GTX475? I haven't seen any info on when it will be released, not even rumors, so let's stick to the present.

Right now AMD has the upper hand, that's irrefutable, just like Nvidia had it when the GTX 460 was released, talking about the mainstream.

When the GTX475 is near release, then we can talk about it. (Maybe by the time that happens, Cayman will be out too, so the argument goes on and on...)

In the end, we win if prices go down, so let's hope Nvidia does what you say and puts the GTX 460 at 200 USD; that would be awesome.

EDIT: there is one at 210 USD, woot! here


----------



## pantherx12 (Oct 20, 2010)

Bitch bitch bitch. Both clearly have issues. I've only ever had problems with Nvidia cards; do I assume all of their cards have problems?

No.

I probably had hardware conflicts.


Such a stupid argument to have.

If the cards work for you, who gives a crap; if they didn't work for you, don't assume all the cards are going to be the same.


----------



## bear jesus (Oct 20, 2010)

Benetanegia said:


> In Q2 AMD had 52% market share, true, but I would hardly call that the majority.




ma·jor·i·ty
1.
the greater part or number; the number larger than half the total


But you are right about what the numbers include; the fact that the higher-selling (lower-priced) cards are not included means a lot. It is also very hard to accurately track GPU market share, as it changes so fast with new products and people switching between companies.
Some of that 52% could have just been people waiting for cards like the 460, or people with low-end ATI GPUs could have been waiting for the 450. I guess in a way we will never know for sure unless one company really pulls away from the other again in market share.



pantherx12 said:


> Bitch bitch bitch, both have issues clearly, I've only ever had problems with nvidia cards, do I assume all of their cards have problems ?
> 
> No.
> 
> ...



Damn straight


----------



## N3M3515 (Oct 20, 2010)

You know what would be great? If the 6850 is 97% of a GTX 460 1GB in performance xD


----------



## Benetanegia (Oct 20, 2010)

N3M3515 said:


> Yeah, i meant 6870
> 
> Anyway if nvidia positions GTX460 1GB at a lower price point, lets say 200 USD, then it would compete with 6850 and beat it, you really think amd would let that happen? no way.
> 
> ...



Yeah, I already said that AMD has the upper hand in that segment, because Barts was specifically designed for that segment. GF104 was designed to compete with/be close to Cypress or the GTX470 (maybe not deliberately), but they have only released it in a heavily crippled fashion so that it does not eat into GTX470 sales (let's avoid the discussion about whether that makes sense or not; for whatever reason it does make sense to them, maybe because of a higher "justifiable" ASP?). Now that Barts will make the GTX470 almost obsolete, and now that they have probably cleared GF100 inventories, I'm sure they will release the full chip. The rumor mill says that Nvidia has a warehouse full of "something" awaiting its time for release, "something" being cards of unknown nature, and those are probably the GTX475.

The reason you have heard nothing tangible about the card yet is that Nvidia probably prefers to keep it that way until they have to release it, which is what happened with G92. If you remember the release of G92, the parallelism with GF104 is amazing. Back then Nvidia released the crippled part first, so that it didn't eat into 8800 GTX sales, and many people thought that Nvidia could not make the fully enabled part; the mere existence of the 8800 GTS was nothing more than rumor, almost confirmed rumor, but rumor after all. And that's exactly what is happening today with GF104, IMO. Back then AMD didn't have anything to compete with (here the parallelism ends), whereas now it's Nvidia who will have to compete, but the reason for the secrecy is the same.



bear jesus said:


> ma·jor·i·ty
> 1.
> the greater part or number; the number larger than half the total



Yeah correct. My mistake. I was going by the main meaning in Spain, whose translation is "most".


----------



## N3M3515 (Oct 20, 2010)

Benetanegia said:


> Yeah, I already said that AMD has the upper hand in that segment, because Barts was specifically designed for that segment. GF104 was designed to compete with/be close to Cypress or the GTX470 (maybe not deliberately), but they have only released it in a heavily crippled fashion so that it does not eat into GTX470 sales (let's avoid the discussion about whether that makes sense or not; for whatever reason it does make sense to them, maybe because of a higher "justifiable" ASP?). Now that Barts will make the GTX470 almost obsolete, and now that they have probably cleared GF100 inventories, I'm sure they will release the full chip. The rumor mill says that Nvidia has a warehouse full of "something" awaiting its time for release, "something" being cards of unknown nature, and those are probably the GTX475.
> 
> The reason you have heard nothing tangible about the card yet is that Nvidia probably prefers to keep it that way until they have to release it, which is what happened with G92. If you remember the release of G92, the parallelism with GF104 is amazing. Back then Nvidia released the crippled part first, so that it didn't eat into 8800 GTX sales, and many people thought that Nvidia could not make the fully enabled part; the mere existence of the 8800 GTS was nothing more than rumor, almost confirmed rumor, but rumor after all. And that's exactly what is happening today with GF104, IMO. Back then AMD didn't have anything to compete with (here the parallelism ends), whereas now it's Nvidia who will have to compete, but the reason for the secrecy is the same.



Let's hope you're right, for our sakes lol
I want 5850 perf at 199 USD, yay!


----------



## TAViX (Oct 20, 2010)

So this is actually the equivalent of the next gen 57xx series??


----------



## MikeX (Oct 20, 2010)

TAViX said:


> So this is actually the equivalent of the next gen 57xx series??



The die design is different, with less silicon. The shaders are more efficient too. And yeah, these are like 5850s trying to perform near a GTX 480.


----------



## Mr McC (Oct 20, 2010)

Benetanegia said:


> Yeah correct. My mistake. I was going by the main meaning in Spain, whose translation is "most".



"Mayoría" and "mayor parte" also mean the majority, I don't understand where the confusion arose.


----------



## bear jesus (Oct 20, 2010)

Mr McC said:


> "Mayoría" and "mayor parte" also mean the majority, I don't understand where the confusion arose.



I think it was just me being pedantic about the definition; at such a small percentage past half, it hardly seems like a majority to most people unless the term is used with its exact meaning. 

Back on topic though: we won't know the real in-game speeds until Friday, so to me most of this is just talking to pass the time until the reviews arrive.


----------



## Mr McC (Oct 20, 2010)

bear jesus said:


> I think it was just me being pedantic about the definition, with being such a small percentage into the majority it hardly seams that way to most unless the term is used with the exact meaning.



I see, he (she?) was drawing attention to the fact that AMD's majority share of the market is not really that significant.


----------



## bear jesus (Oct 20, 2010)

Mr McC said:


> I see, he (she?) was drawing attention to the fact that AMD's majority share of the market is not really that significant.



You are probably right. Also, with such a tiny margin, it could easily be possible for Nvidia to have the majority again, since those numbers came out before the 460 and 450 cards.


----------



## Mr McC (Oct 20, 2010)

bear jesus said:


> You are probably right. Also, with such a tiny margin, it could easily be possible for Nvidia to have the majority again, since those numbers came out before the 460 and 450 cards.



That is undoubtedly true, but the imminent release of the 6xxx series will put a lot of pressure on Nvidia and it seems that they will be late to the ball yet again. That's bad for all of us in terms of pricing and everything indicates that AMD will extend its market lead, assuming that the new cards sell well.


----------



## bear jesus (Oct 20, 2010)

Mr McC said:


> That is undoubtedly true, but the imminent release of the 6xxx series will put a lot of pressure on Nvidia and it seems that they will be late to the ball yet again. That's bad for all of us in terms of pricing and everything indicates that AMD will extend its market lead, assuming that the new cards sell well.



True, it will be bad for short-term pricing, but I hope it will just push Nvidia harder to make even better chips as soon as possible. I would hope Nvidia brings out a refresh while on 40nm to try and hold off AMD until they both get onto 28nm; hopefully when it comes to the 28nm chips they can get them out pretty close together and give us some great chips at great prices.


----------



## stupido (Oct 20, 2010)

Benetanegia said:


> Rumor mill says that Nvidia *has a warehouse full of "something" awaiting its time for release*, "something" being cards of *unknown nature* and those are probably the GTX475.



(me in the dark with candle light under my chin):
beeee afraaaaaiiddd... very very affraaaiiidd... a beast of unknown nature to be releasssssed... bwahahahahahahahahaha... 

==============

Did I scare you? 

No offence, I'm just trying to make a joke...


----------



## N3M3515 (Oct 20, 2010)

Mr McC said:


> That is undoubtedly true, but the imminent release of the 6xxx series will put a lot of pressure on Nvidia and it seems that they will be late to the ball yet again. That's bad for all of us in terms of pricing and everything indicates that AMD will extend its market lead, assuming that the new cards sell well.



Do you guys realize Hemlock will soon have gone a year without competition?
That's how late Nvidia has been.... Why don't they release a dual GTX 460?? Antilles is coming in 2 months...
Nvidia needs to release the GTX475 and the GTX460x2 (or whatever the name is) soon.

PS: I was thinking, do you remember the launch MSRP of the 5850? 250 USD. It didn't stay there, but technically we're getting slightly more performance (5%) for the same price with the 6870, so the price of the 6870 should be 199 USD.


----------



## Benetanegia (Oct 20, 2010)

stupido said:


> (me in the dark with candle light under my chin):
> beeee afraaaaaiiddd... very very affraaaiiidd... a beast of unknown nature to be releasssssed... bwahahahahahahahahaha...
> 
> ==============
> ...



Nah, no offense taken. I didn't mean that a beast is going to be released, just that I have read that Nvidia does have a warehouse that has been filling up with something, and no one seems to know what. It could be low-end GT410s, if those even exist, or teddy bears for all we know. Of course, those being "GTX475"s or 512 SP "GTX485"s makes more sense, but who knows.

@ Mr McC and bear jesus

Yes, I meant that I wouldn't call 52% vs 48% a majority, with majority == most of it.


----------



## jamsbong (Oct 21, 2010)

First of all, it looks like tessellation performance is improved, and I expect it to be in the GTX 460-470 region.

Man, I wonder who the clever people are who timed all of this correctly? I mean, the 5xxx series is very much a DX10+ card with the complete DX11 feature set; the 5xxx cards were never really going to perform well in a native DX11 game. ATI was still successful though, since we have seen very few native DX11 games, which means 5xxx owners will most likely be playing DX10 or DX9 games.

Now the 6xxx series will be out, and it is meant to be a full-scale DX11 card. Like Nvidia, ATI had to sacrifice some shader (DX10-related) performance and fill that gap with tessellation performance (DX11). However, this is the perfect time for a real DX11 game card, since there are more native DX11 games to come, including the famous 3DMark11.

I have very high expectations for this weekend's launch!


----------

