# Sapphire R9 285 Dual-X OC 2 GB



## W1zzard (Aug 28, 2014)

Today, AMD launches the Radeon R9 285 based on their brand-new Tonga GPU, which replaces Tahiti, promising to be faster and more power efficient. Sapphire's board comes with a custom cooler and is overclocked out of the box for some extra performance.



----------



## darkangel0504 (Sep 2, 2014)

How sad is this card: high power consumption and really weak performance.


----------



## v12dock (Sep 2, 2014)

Come on AMD...


----------



## bubbleawsome (Sep 2, 2014)

RIP Tahiti. After 2 years AMD has finally replaced it. With something no better.


----------



## buildzoid (Sep 2, 2014)

WTF is this? It sucks. It's barely faster than a 760 while pulling much more power, and more importantly it's even hungrier than the original HD 7970. WTF AMD, what are you doing? Just sell lower-clocked, cheaper R9 280Xs instead of trying to make a new GPU that no one needs or wants.


----------



## Lionheart (Sep 2, 2014)

Only an 8.0?  ..... Was expecting at least a 9.0 from the performance you get from it, which brings me to this!

"Quite slow, performance not improved"    Uuuuuuhh, what? This card is aimed at the GTX 760 and beats it in every benchmark... What were you expecting?
That power consumption needs some fixing though. Wtf AMD!!!


----------



## ISI300 (Sep 2, 2014)

The chip has 5 billion transistors:
http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2


----------



## Sempron Guy (Sep 2, 2014)

It was meant to replace the R9 280, so performance should be either on par or faster. It does the former, but on less memory bandwidth, with added features that the R9 290 and 260 series have, and it's a bit more power efficient. The price is almost the same, plus added game bundles. Where's the hate coming from?


----------



## LAN_deRf_HA (Sep 2, 2014)

Looking at these charts, it's a real shame NVIDIA didn't base the 760 on the 670, which rules the midrange but can only be had second-hand now.

Looking at the broader picture, this whole market has been kind of stagnant and boring for way too long if a two-year-old card still rules the midrange. You'd think in two years AMD would have at least beaten it in efficiency, but no. Seriously, what's the deal here? To make matters worse, the die shrinks aren't coming till next year.


----------



## W1zzard (Sep 2, 2014)

Lionheart said:


> Only an 8.0?  ..... Was expecting at least a 9.0 from the performance you get from it, which brings me to this!
> 
> "Quite slow, performance not improved"    Uuuuuuhh, what? This card is aimed at the GTX 760 and beats it in every benchmark... What were you expecting?
> That power consumption needs some fixing though. Wtf AMD!!!



http://www.newegg.com/Product/Product.aspx?Item=N82E16814500302

That's the problem for the R9 285, and the main reason for the low score/no award


----------



## Steevo (Sep 2, 2014)

They choked the memory too much to compete with the 770, which, in performance per dollar and per watt, is really where they needed to be to make it. Unless they plan on making a new spin of Tonga to drop power consumption, this card is meh; find a 7970 and clock it up and you get better performance and close power consumption.


----------



## Undyingghost (Sep 2, 2014)

W1zzard said:


> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500302
> 
> That's the problem for the R9 285, and the main reason for the low score/no award



Yeah, but even that 770 has only 2 GB of VRAM; we need to get away from those cards. I would still take my 280X over any of these.


----------



## Casecutter (Sep 2, 2014)

Well, this is a disappointment... still at TSMC, with their costs and the same old process. It really is a marker for the coming round; I was hoping we'd see at least some things get mixed up... but it's more of the same. As I saw it in early 2014, this year was looking to be a bust; then Tonga came into the rumor mill and I hoped it would offer variety and some new diversity. NOT!

There's not much hope on price either, as it appears to be the same die size.


----------



## W1zzard (Sep 2, 2014)

Undyingghost said:


> Yeah, but even that 770 has only 2 GB of VRAM; we need to get away from those cards. I would still take my 280X over any of these.


We need to get away from these VRAM legends. This card is for 1080p, for which 2 GB is perfectly fine, even for 1440p


----------



## Disparia (Sep 2, 2014)

On a positive note, it's the best of the ITX cards? MSI has a GTX 760 and an R9 270X at 170 mm, and Sapphire has announced an R9 285, so if the jump in power usage is acceptable to the user, you can't do better in that size range.


----------



## HalfAHertz (Sep 2, 2014)

Steevo said:


> They choked the memory too much to compete with the 770, which, in performance per dollar and per watt, is really where they needed to be to make it. Unless they plan on making a new spin of Tonga to drop power consumption, this card is meh; find a 7970 and clock it up and you get better performance and close power consumption.



They made the memory bus 40% more efficient, read the article: http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2


----------



## Rahmat Sofyan (Sep 2, 2014)

At 256-bit it can reach 384-bit performance, and with a lower memory clock... isn't that a good point at all? The conclusion needs to be revised, I guess.

Or was this the problem: the R9 285 was tested on 14.30 Beta 2?

Compare that to 14.7 Beta 2.


----------



## Sony Xperia S (Sep 2, 2014)

The card has some improvements but depending on gaming applications, performance fluctuates from being higher to being lower compared to the R9 280.
Hope is to improve over new driver releases.

Tessellation is improved, and also:

*New video decode block. The R9 285 also includes a new video decoder block for full hardware decode of H.264 4K streams*. H.264 base, main, and high profile up to 5.2 are all supported. That means the new block can handle decoding 4096×2304 streams at 60 fps. There’s also a new fixed function video transcoder unit (VCE) that supports full hardware encoding to H.264, but AMD didn’t provide additional details on features or capabilities that distinguish this new unit from previous generation hardware.


----------



## Steevo (Sep 2, 2014)

HalfAHertz said:


> They made the memory bus 40% more efficient, read the article: http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2


Synthetic vs. real world. Perhaps if all you play is a fill-rate game that's great, but in the real world a 12% memory overclock resulted in a direct 12% increase in performance, despite a 17% core clock increase; all else equal, it looks like it's starving for data. Or perhaps they need new drivers... like we haven't heard that before.

I would rather they had waited to get the hardware out if they had issues with the software. But I don't think that's the case here; I believe they were trying to design a new performance chip for a new process node, it isn't ready, so they rushed out this turd. It's the same size as the chip it's replacing, has only marginally better energy consumption, offers mediocre performance for the money, and is unneeded.
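The inference above can be framed as a toy model (my own illustrative sketch, not anything from the review): if a fraction `s` of the workload is gated by memory bandwidth, overall scaling is roughly `s * mem_scale + (1 - s) * core_scale`, so a 12% memory overclock yielding a direct ~12% fps gain implies `s` is close to 1, i.e. the card is starving for data.

```python
# Crude linear bottleneck model: what fraction of performance tracks the
# memory clock vs. the core clock? Illustrative only; "s" is a hypothetical
# bandwidth-limited share, not a measured quantity.
def relative_perf(core_scale: float, mem_scale: float, s: float) -> float:
    """Relative performance when a share s (0..1) of work is bandwidth-bound."""
    return s * mem_scale + (1 - s) * core_scale

# 17% core OC + 12% memory OC, fully bandwidth-bound (s = 1):
fully_bound = relative_perf(1.17, 1.12, s=1.0)  # ~1.12, matching the observed gain

# If the card were evenly balanced (s = 0.5) we'd expect ~14.5% instead:
balanced = relative_perf(1.17, 1.12, s=0.5)     # ~1.145
```

That the observed gain sits at the memory-only end of this range is what suggests the 256-bit bus is the limiter here.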


----------



## the54thvoid (Sep 2, 2014)

Sempron Guy said:


> It was meant to replace the R9-280 so performance either on par or faster. It does the former but that on lesser memory bandwidth, with added features like the R9-290 and 260s has and is a bit power efficient. Price is almost the same plus added game bundles. Where's the hate coming from?



I question where your love is coming from. It's being released as an efficient replacement for the 280(X) variant. It kind of fails... A bit less than a Titan running 6 GB of memory? I've got nothing against red, but let's face it, the card's a bit 'meh'. If you think it's great, fair enough, but its perf/watt is mediocre as well.

EDIT - misread: a bit more consumption than a Titan.


----------



## alwayssts (Sep 2, 2014)

What did you guys expect?  Cue wall of text.

Memory bw (for compute units) is tied to both this arch's biggest boon and largest drawback...as they also do special function...been saying it for years.  This seems to have improved the bw thirst some (larger cache per CU or something to that effect?), probably very purposely so 2048 at ~10xx/7000 actually makes sense when using all shaders (for extra compute effects) vs on their old design where it only would have been enough for the ideal shader-special function/rop efficiency of most games; closer to 18xxsp (roughly 2/3 Hawaii)...that would've been a HUGE waste, especially given how amd is now marketing their stack.  More on that in a second.

Hynix memory might clock slightly better (and hence scale performance more linearly), but still...Pretty close to a 7970ghz or so best-case it would appear, which will probably also be the stock performance of 285x.  I don't think anyone would be surprised when after they release 285x, this becomes very close to $200-220 and 285x replaces it at around to slightly above this price (perhaps starting even higher).  This intermediate period is likely purely for stretching profitability out:  285x can't compete with 770 because of bandwidth and power efficiency (outside of compute performance) issues; this part can compete with 760 (especially when compute is heavily involved).  760 is currently around this price, so they start with a msrp reflecting their strongest aspect.  285x must be cheaper than 770 by the ratio of base performance, even if their 'msrp' tries to inflate it to a proportionate level it is faster in compute.  This hand has been tipped as their game-plan given amd tried to not only market this product performance via Firestrike benches, but also the rest of their stack using the same compute-heavy bench while expounding their value as higher (of course it is, price has to reflect avg base performance in the market).  Given all that, plus gm204, 285/760 prices will fall, 285x price will fall.  770 will probably stay similar depending on gm204.  Do not be surprised when I estimate: this is your typical refresh amd has always done...it's just being marketed differently so they can adjust price of new products with the market to their actual ideal price, rather than start there and bring margin to zero.  

Give it a little time...even if you don't agree on the logic I think they're using, prices usually fall 15% after launch (on lower-end/refresh parts) relatively quickly.  Always have as far back as I can remember.  The difference between old amd/ati and new amd (after Rory Read) is new amd figures this drop into their launch msrp...now they just found a way to justify it.

I'll add the part you're looking for, btw... Looking at conceivable ranges, this is probably the smallest amount realistically AMD could have claimed makes a better base part. If you take compute away, it's only the better choice at exactly the same price. In other words: straight competition... and knowing AMD, it'll probably actually be cheaper all said and done.







| Card | GPU Clock | Memory Clock | Performance |
|------|-----------|--------------|-------------|
| ASUS GTX 760 DC Mini | 1130 MHz | 1810 MHz | 84.3 FPS |
| MSI GTX 760 HAWK | 1225 MHz | 1880 MHz | 90.9 FPS |
| MSI GTX 760 Gaming | 1180 MHz | 1960 MHz | 87.8 FPS |
| NVIDIA GTX 760 | 1140 MHz | 1760 MHz | 79.2 FPS |
| Gigabyte GTX 760 OC | 1190 MHz | 1850 MHz | 86.6 FPS |
| Palit GTX 760 JetStream | 1140 MHz | 1750 MHz | 82.9 FPS |
| EVGA GTX 760 SC | 1220 MHz | 1840 MHz | 88.8 FPS |
| ASUS GTX 760 DC II OC | 1175 MHz | 1840 MHz | 86.5 FPS |


----------



## mroofie (Sep 2, 2014)

Lol dat power consumption 
Beware incoming Amd fanboys


----------



## Hilux SSRG (Sep 2, 2014)

Excellent review W1zzard, I appreciate you comparing this new release to the older 670/680s and 7950/7970/7970 GHz series.

I really thought this card would be close to 280X performance levels. I knew there were going to be big power-efficiency gains, but damn, this card is just not worth the $250. Huge disappointment. This card will be selling in the $200 range within a few months.


----------



## Steevo (Sep 2, 2014)

alwayssts said:


> What did you guys expect?  Cue wall of text.
> 
> Memory bw (for compute units) is tied to both this arch's biggest boon and largest drawback...as they also do special function...been saying it for years.  This seems to have improved the bw thirst some (larger cache per CU or something to that effect?), probably very purposely so 2048 at ~10xx/7000 actually makes sense when using all shaders (for extra compute effects) vs on their old design where it only would have been enough for the ideal shader-special function/rop efficiency of most games; closer to 18xxsp (roughly 2/3 Hawaii)...that would've been a HUGE waste, especially given how amd is now marketing their stack.  More on that in a second.
> 
> ...




A lot of words; the card still sucks. Honestly, is this what we do now, here? Post reasons why a card that is mediocre, with slightly better power draw and temps, is the "replacement" for an already good product lineup?

Well sir, you could buy a 2013 or 2012 car with all the same real-world features and spend less, or you could buy the 2014 car with shiny new chrome trim around the cup holders. Sure, it's a bit more and goes slower, but it's quieter since it goes slower. Ayyy, now there's a game changer; we have been looking at this all wrong. Who wants faster? You could crash and get hurt. Let's all go slower, you know, island time.


----------



## Sempron Guy (Sep 2, 2014)

the54thvoid said:


> I question where your love is coming from.  It's being released as an efficient replacement for the 280(x) variant.  It kind of fails...  A bit less than a Titan running 6Gb memory?  I've got nothing against red but let's face it, the card's a bit 'meh'.  If you think it's great, fair enough but it's perf/watt is mediocre as well.
> 
> EDIT - misread, a bit more consumption than a Titan.



People getting shocked at the results like it is news. The card runs a bit faster than the 280 - check. It consumes a bit less power than the 280 - check. It's not like they built a hype right from the start that the 285 will offer "phenomenal" power efficiency and performance.


----------



## Sony Xperia S (Sep 2, 2014)

Steevo said:


> A lot of words, card still sucks.



The card doesn't suck; the problem is just that you don't see its obvious potential.

I like it because of more future-proof 4K hardware encode/decode support and very good gaming performance as shown in the review in the following titles:

Bioshock Infinite, 
Diablo III: Reaper of Souls, 
Metro: Last Light, 
Thief, 
Watch_Dogs, 
Wolfenstein: The New Order

Just improve your vision; this card is, after all, just a better replacement for the R9 280. No one forces you to accept it as anything else.


----------



## MrStim (Sep 2, 2014)

Why use old drivers for the review?
Won't the newer drivers be more optimized? Time issue?
I don't agree with the review's conclusion, either.

R9 285: 14.30 Beta 2


----------



## Hilux SSRG (Sep 2, 2014)

Sony Xperia S said:


> The card doesn't suck; the problem is just that you don't see its obvious potential.
> 
> I like it because of more future-proof 4K hardware encode/decode support and very good gaming performance as shown in the review in the following titles:
> 
> ...



How many of those games are AMD optimized similar to NVidia's "Gameworks"?  Look at Wolfenstein and you will see the disappointment.


----------



## Big_Vulture (Sep 2, 2014)

Lol, I'm just laughing when AMD says something big. They said earlier it would be as efficient as Maxwell.


----------



## Steevo (Sep 2, 2014)

Sony Xperia S said:


> The card doesn't suck; the problem is just that you don't see its obvious potential.
> 
> I like it because of more future-proof 4K hardware encode/decode support and very good gaming performance as shown in the review in the following titles:
> 
> ...


I see the light: a card aimed at 1080p gaming can't even make 50 FPS in Watch_Dogs. Or is it Watch Dogs... http://www.tomshardware.com/reviews/amd-radeon-r9-285-tonga,3925-9.html

Very good gaming support, where a card essentially made in 2012 sinks its battleship. If we had this card first and they then released the 280X as an improvement at a lower cost, we would cheer, but no, it's a slower card at a higher price.

Future-proof 4K, http://www.ultrahdtv.net/movies/page/2/ because we all want to watch ink drops.

By the time 4K is an actual player in the market, once Blu-ray players are updated, in late 2015 when we finally have a possible disc to support it, and once all the hundreds of other pieces fall into place, we will worry about what the standards are, and I am going to guess this card won't be the thing to own then; it will probably be left behind as both companies move ahead.

Wait, a puppy's bath..... so cute, so awesome, I will now buy an entire 4K setup just to watch a short movie about a puppy's bath. Thank god you showed me the light!!!

http://www.imdb.com/list/ls051448087/

I hear Breakfast at Tiffany's was mastered in 8K, making it even more future-proof.


----------



## Eagleye (Sep 2, 2014)

MrStim said:


> why use old drivers for the review?
> wont the newer drivers be more optimized? time issue?
> Dont agree with the review conclusion as well.
> 
> R9 285: 14.30 Beta 2



The Gigabyte R9 285 WindForce OC and its 176 W average for gaming comes in almost 40 W lower than a moderately overclocked AMD Radeon R9 280 reference graphics card.

191 W average (torture test), OpenCL and DirectCompute.

AMD Catalyst 14.7 specifically would have this card competing much better.


----------



## Steevo (Sep 2, 2014)

Eagleye said:


> The Gigabyte R9 285 WindForce OC and its 176 W average for gaming comes in almost 40 W lower than a moderately overclocked AMD Radeon R9 280 reference graphics card.
> 
> 191 W average (torture test), OpenCL and DirectCompute.
> 
> AMD Catalyst 14.7 specifically would have this card competing much better.



It's the same driver everyone else is using, 14.300.

W1zz tests power use right at the card too; there will always be some variation between manufacturers and GPU dies.

A whole 40 W lower? So an end user could buy a much smaller PSU? Save lots of money?


----------



## newtekie1 (Sep 2, 2014)

Very disappointing card.  Same price as a 770, but consumes significantly more power and performs noticeably worse.  Just disappointing is all I can say.


----------



## Assimilator (Sep 2, 2014)

Sempron Guy said:


> It's not like they built a hype right from the start that the 285 will offer "phenomenal" power efficiency and performance.



Um, actually, AMD was throwing the phrase "GTX 760 killer" around like nobody's business. And a GTX 760 killer this card is most certainly not.


----------



## Assimilator (Sep 2, 2014)

p.s. @W1zzard Catalyst 14.8 WHQL is out, when you gonna update the download links on TPU main page?


----------



## Casecutter (Sep 2, 2014)

Windows 8.1 can use Catalyst 14.7 Beta 2, while Win7 still calls for Catalyst 14.6 Beta.

Those saying the 770's pricing is similar... here in the States the bulk are still >$300; that one Zotac is a rare occurrence.

Honestly, this will not move pricing... well, only the price of these 285s! When AMD can drop a 295X2 down to $999, this card, with a lower-cost PCB and memory, needs to be $210 with, say, a $20 rebate in play.


----------



## The Mac (Sep 2, 2014)

Sempron Guy said:


> People getting shocked at the results like it is news. The card runs a bit faster than the 280 - check. It consumes a bit less power than the 280 - check. It's not like they built a hype right from the start that the 285 will offer "phenomenal" power efficiency and performance.



Agreed. It's a little more expensive than a 280, so there will be some great deals on the 280, reducing inventory.

The 280 is no longer in production per AMD, so this is a great market strategy to clear the channel of 280s and replace them with all the new tech.

Not sure what all the fuss is about; it was rumored to be a Pitcairn replacement, but instead it's a Tahiti replacement.

Sounds like a win to me.

The flagship is coming with Fiji; this was NEVER supposed to be it.


----------



## W1zzard (Sep 2, 2014)

Assimilator said:


> p.s. @W1zzard Catalyst 14.8 WHQL is out, when you gonna update the download links on TPU main page?


Not listed on amd.com. Yes, I know it's floating around the internet, but I'd rather wait a day for the official release + release notes.


----------



## HumanSmoke (Sep 2, 2014)

Sempron Guy said:


> It's not like they built a hype right from the start that the 285 will offer "phenomenal" power efficiency and performance.


You mean like this [image] translating to this [image]?

AMD claims "up to 15%" faster, so I guess *0.37%* does, at least semantically, fall into that range.


----------



## The Mac (Sep 2, 2014)

Wrong res; it needs to be 1440p.

This may change at that res.

My guess is AMD also used no AA.


----------



## the54thvoid (Sep 2, 2014)

The Mac said:


> Wrong res; it needs to be 1440p.
> 
> This may change at that res.
> 
> My guess is AMD also used no AA.








So between 3% at 1080p and 3.7% at 1600p. Excuse HumanSmoke's arithmetic.  And AMD performs better with AA, but that might just be the 4 GB cards?

Let's just agree it's meh.... The 290X is awesome; let's not muddy AMD's waters with a slightly revamped 7970-esque performer.

I can't wait for the GTX 980 to come out, because I'll gladly stick my boot into that as well.


----------



## GhostRyder (Sep 2, 2014)

newtekie1 said:


> Very disappointing card.  Same price as a 770, but consumes significantly more power and performs noticeably worse.  Just disappointing is all I can say.


Are you referring to the GTX 760? Because I cannot find a GTX 770 for less than 300... Well, I just found one for 274 from Zotac, but that's the only one I've seen below 300.



Steevo said:


> It's the same driver everyone else is using, 14.300.
> 
> 
> 
> ...


That is what I've been seeing around, though it seems to depend on the card itself. Roughly 40 watts less to get around 280X performance is still pretty nice and at least shows some improvement while we wait for the next generation. I am just more disappointed in the lack of RAM...

It was never meant to be the biggest game changer ever... just meant to show performance-per-watt improvements and give something to eventually replace the dated 280/X series.



Assimilator said:


> Um, actually, AMD was throwing the phrase "GTX 760 killer" around like nobody's business. And a GTX 760 killer this card is most certainly not.


It seems to be pretty consistently better than the 760, so I would say it's the better option, seeing as how they are priced roughly the same.



Assimilator said:


> p.s. @W1zzard Catalyst 14.8 WHQL is out, when you gonna update the download links on TPU main page?


I had not seen that yet 0_0


----------



## HumanSmoke (Sep 2, 2014)

the54thvoid said:


> So between 3% at 1080p and 3.7% at 1600p. Excuse HumanSmoke's arithmetic


You're in the habit of comparing overclocked cards with stock as a viable benchmark comparison?
Stock vs. stock is 27.1 fps (R9 285) vs. 27.0 fps (GTX 760), which by my reckoning is 0.37%.
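The 0.37% figure is just the relative fps difference; a trivial sanity check (fps numbers from the post above):

```python
# Relative performance difference as a percentage of the baseline card.
def pct_diff(new: float, old: float) -> float:
    return (new - old) / old * 100

r9_285, gtx_760 = 27.1, 27.0  # stock fps figures quoted above
print(round(pct_diff(r9_285, gtx_760), 2))  # 0.37
```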


GhostRyder said:


> Are you referring to the GTX 760?  Because I cannot find a GTX 770 for less than 300...Well I just found one for 274 from Zotac but thats the only one ive seen blow 300.


Probably depends where you shop. NCIX has a couple of non-reference models at a quick glance: PNY GTX 770 ($265)  and an EVGA ACX ($280 after MIR)


----------



## Mathragh (Sep 2, 2014)

Well, at least this card is an excellent test platform for their color compression tech, which apparently improves memory bandwidth efficiency by around 40%, thus negating a big chunk of the deficit caused by the 256-bit bus.
I can see that tech being especially important for APUs, where bandwidth really is quite limited, and I guess it's also more economical for big cards.
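For scale, here is a back-of-the-envelope sketch of what that claim would mean, using the commonly quoted specs (256-bit @ 5.5 Gbps for Tonga, 384-bit @ 5.0 Gbps for the R9 280); the flat 40% uplift is the marketing figure, not a measured one:

```python
# Raw GDDR5 bandwidth in GB/s from bus width (bits) and per-pin data rate (Gbps).
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

tonga_raw  = bandwidth_gbs(256, 5.5)   # 176.0 GB/s (R9 285)
tahiti_raw = bandwidth_gbs(384, 5.0)   # 240.0 GB/s (R9 280)

# Applying the claimed ~40% compression uplift to Tonga:
tonga_eff = round(tonga_raw * 1.4, 1)  # 246.4 GB/s "effective"
```

If the 40% held in practice, the narrower bus would be a wash against Tahiti's 240 GB/s, which is presumably the point of the design.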

Furthermore, I guess this card is also a bit more future-proof (when bought with at least 4 GB) if you reckon tessellation will be more important in the future, as this card has as much tessellation power as the 290(X).

Apart from that, I also feel that this card is a bit of a let-down, especially when it comes to the promised power improvements. However, with an apparently equally sized die and even more transistors than Tahiti, I guess this is what you get.
Other sites sometimes got power-use numbers quite different from here, like hardware.info, where their card used even less than the 760. In this regard I really think AMD did quite a bit of damage by only releasing non-stock, factory-overclocked cards to reviewers, potentially with voltages way over spec (although the voltages are comparable to other non-X stock cards). It's sad to see them shooting themselves in the foot like this, by either overpromising or doing a bad job of handing out proper cards.

On a last note, I found that the first page of the review had a bit of a negative undertone. I might be alone in this, but I thought it worth mentioning, as I don't think that was the intention. I can certainly see why it would be there, but at least leave us in ignorance while we crunch through your review!

Thanks for the review as always!


----------



## revin (Sep 2, 2014)

Thank you W1zz!
It seems a 3 GB 280X is better for most games etc.
Isn't the 280X a better deal?
I was thinking of going 7970/GHz since it'd be a good improvement and prices are coming down, but the 3 GB 280Xs have caught my eye the last couple of months.

So what am I missing?? { Please be cordial }


----------



## GhostRyder (Sep 2, 2014)

Mathragh said:


> Well, at least this card is an excellent test platform for their color compression tech, which apparently improves memory bandwidth efficiency by around 40%, thus negating a big chunk of the deficit caused by the 256-bit bus.
> I can see that tech being especially important for APUs, where bandwidth really is quite limited, and I guess it's also more economical for big cards.
> 
> Furthermore, I guess this card is also a bit more future-proof (when bought with at least 4 GB) if you reckon tessellation will be more important in the future, as this card has as much tessellation power as the 290(X).
> ...


I agree, there seem to be many discrepancies in the power-usage numbers. It may have something to do with the selection of overclocked-edition cards, as stated. Maybe we will see things more clearly as more of these come out. But some card makers way overestimate power needs and volt their cards way further than needed!


----------



## Mathragh (Sep 2, 2014)

GhostRyder said:


> I agree, there seem to be many discrepancies in the power-usage numbers. It may have something to do with the selection of overclocked-edition cards, as stated. Maybe we will see things more clearly as more of these come out. But some card makers way overestimate power needs and volt their cards way further than needed!


Aye, and apparently there are also some 290(X) cards suffering from very high power use accompanied by higher temperatures at the moment. It's only some cards, and only with the latest driver, but it makes one wonder if something similar isn't happening here.
It would explain both why the power use is higher than what we were led to believe it should be, and why the card is actually drawing even more than the PCI spec allows.
I wonder if this is why AnandTech hasn't posted its review yet.


----------



## HumanSmoke (Sep 2, 2014)

revin said:


> It seems a 3 GB 280X is better for most games etc.
> Isn't the 280X a better deal?


Probably.... and likely to get better, since AMD just confirmed that the 280 is EOL* (after an active product run of .5 months), so it stands to reason that when the 285X releases (probably as soon as the GTX 9xx are announced) it will do the same for the 280X.

* 





> R9 280 is going away – you’ll find some pretty amazing deals online, so get it while you can! - Evan Groenke, Product Manager AMD desktop graphics


----------



## W1zzard (Sep 2, 2014)

GhostRyder said:


> Because I cannot find a GTX 770 for less than 300... Well, I just found one for 274 from Zotac, but that's the only one I've seen below 300.



I've seen the ZOTAC card twice in the last 2 weeks around $275 on Newegg (guess it's been there for these 2 weeks but don't know for sure). I hear other cards trade spots with it from time to time. So there is a single GTX 770 available for $275 all the time


----------



## Darksword (Sep 2, 2014)

Might as well spend $269.99 for an R9 280X and get a faster card with 3GB VRAM.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202046

I knew the R9 285 would be a pointless card.


----------



## GhostRyder (Sep 2, 2014)

Mathragh said:


> Aye, and apparently there are also some 290(X) cards suffering from very high power use accompanied by higher temperatures at the moment. It's only some cards, and only with the latest driver, but it makes one wonder if something similar isn't happening here.
> It would explain both why the power use is higher than what we were led to believe it should be, and why the card is actually drawing even more than the PCI spec allows.
> I wonder if this is why AnandTech hasn't posted its review yet.


Seems that is the case with 14.7 for some reason, but the cards I hear referenced with the issue are the aftermarket OC editions. I run PowerColor OC editions, but my temps did not change that I noticed (granted, they are at my own custom clocks under liquid). It's likely how much voltage is set, and how, that's causing the issue (or it's somehow related to the recent fixes for 4K, if that makes sense lol).



W1zzard said:


> I've seen the ZOTAC card twice in the last 2 weeks around $275, I hear other cards trade spots with it from time to time. So there is a single GTX 770 available for $275 all the time


Never really paid enough attention to Zotac, which is part of why I've probably missed this deal, mostly after their flops on the 5XX series (specifically the LC editions), but that price is pretty legit if it's a common thing. Most of the 770s are minimum $300+, which makes that Zotac a good deal. I find it funny, though, because most of the 760s cost around that price point, so buying a 760 would be pointless as well because of the performance hop up to the 770.

Maybe I need to pay more attention to Zotac again if they are offering great deals like that!


----------



## RealNeil (Sep 2, 2014)

Thanks for the review W1zzard. The market will adjust and this card may be a much better deal before long.


----------



## Jeffredo (Sep 3, 2014)

If the GTX 750 Ti is even remotely an example of what Maxwell can do in performance per watt, AMD may be in huge trouble in another six months or so.


----------



## MikeMurphy (Sep 3, 2014)

The performance of the 285 is still in the realm of high-end. This is not "mid-range" performance, and it is certainly relevant beyond 1080p. Being an avid overclocker, I surprised myself when I _downclocked_ my 280X Toxic, given it has never once come even close to dipping below 60 fps at 1200p. How is that mid-range?

Sure, it's not the giant leap in power efficiency that Maxwell will bring us, but be reasonable. The overclocks aren't bad, either.

Having said that, I expect Maxwell will be the better overall product.


----------



## Steevo (Sep 3, 2014)

HumanSmoke said:


> You're in a habit of comparing overclocked cards with stock as a viable benchmark comparison?
> Stock vs stock is 27.1 fps (R9 285) vs 27.0 fps (GTX 760) - which by my reckoning is 0.37%
> 
> Probably depends where you shop. NCIX has a couple of non-reference models at a quick glance: PNY GTX 770 ($265)  and an EVGA ACX ($280 after MIR)


W1zz dropped the clocks, as did other sites; perhaps that was the 15% AMD was talking about, overclocked vs. stock. lol

Bang for the buck all around still goes to the 280X; it enters the cavities of competitive and same-line models and performs acts of violence upon their physical selves and their high-priced ways, figuratively speaking of course.


----------



## ISI300 (Sep 3, 2014)

I'm just starting to realize how shallow and empty TPU reviews are. No in-depth architecture analysis, and what looks to be a template reused on all graphics card reviews. Meanwhile, other hardware sites dedicate at least two pages to architecture.
Also, it's not clear exactly what settings you are using, which makes it hard to reproduce.
For instance, you've not mentioned AMD's new color compression.
In short, a pile of bulls**t.


----------



## HumanSmoke (Sep 3, 2014)

ISI300 said:


> I'm just starting to realize how shallow and empty TPU reviews are. No in-depth architecture analysis, and what looks like to be a template to reuse on all graphics card reviews.....Also it's not clear exactly what settings you are using, that makes it hard to reproduce.


Probably the same settings that are used on the earlier reviews, like the 295X2 when you said "Such a great review." 


ISI300 said:


> For instance, you've not mentioned AMD s new color compression.


Probably because it's secondary to the actual results. If it's so fantastic, why is the 2GB/256-bit R9 285 basically equalling the performance of a two-and-a-half-year-old 2GB/256-bit GTX 670? Until the tech actually shows up as a tangible difference maker, it's of more academic interest. If you're highlighting future possibilities, then why not ask why Tonga doesn't include H.265 transcode or HDMI 2.0? I'm pretty sure reviewers have their own rationale for what they deem important or not, but just because it's in the release kit doesn't make it a highlight-able feature. I seem to remember that just about every HD 2900XT review had some obligatory blurb about its tessellation engine. Until it shows results that set it apart from its peers, it means little.


----------



## Steevo (Sep 3, 2014)

Don't feed the trolls, and obvious troll is obvious. 

People complained about driver testing, W1zz retested with new driver, and it gained less than 1%.
People complain about how this site and reviews were AMD fanboyish.
People complain about how this site and reviews were Nvidia fanboyish.
People complain about what games are used.
People complain about the power consumption tests (and fail to realize two identical cards may differ by a significant percentage due to sensor, GPU, VRM, and other NORMAL variations)


The reason other sites use GPU-Z, and the reputable sites link to one another, is how great a job is being done. If one or two people want to get their panties all knotted up and chafing their delicate asses, they can collectively and politely leave and go start their own site; we have had enough people try that.


----------



## xorbe (Sep 3, 2014)

ISI300 said:


> I'm just starting to realize how shallow and empty TPU reviews are. No in-depth architecture analysis, and what looks like to be a template to reuse on all graphics card reviews. Meanwhile the rest of hardware sites dedicate at least 2 pages for architecture.



Yet those are the very pages I skip over in other reviews. Then I go read the other reviews. No need for them all to be the same.

Card is faster than 760, slower than 670.  Somehow I will sleep soundly tonight!


----------



## mroofie (Sep 3, 2014)

xorbe said:


> Yet those are the very pages I skip over in other reviews.  Then go read the other reviews.  No need for them to all be the same.
> 
> Card is faster than 760, slower than 670.  Somehow I will sleep soundly tonight!


The fps don't agree with you?
The GTX 760 has the same performance.


----------



## HumanSmoke (Sep 3, 2014)

mroofie said:


> the fps doesn't agree with you ?
> Gtx 760 has same performance


----------



## ISI300 (Sep 3, 2014)

HumanSmoke said:


> Probably the same settings that are used on the earlier reviews, like the 295X2 when you said "Such a great review."
> 
> Probably because its secondary to the actual results. If it's so fantastic why is the 2GB/256-bit R9 285 basically equalling the performance of a two-and-a-half-year-old 2GB/256-bit GTX 670 ? Until the tech actually shows up as a tangible difference maker it's more academic interest. If you're highlighting future possibilities then why not ask why Tonga doesn't include H.265 transcode or HDMI 2.0. I'm pretty sure reviewers have their own rationale for what they deem important or not, but just because it's in the release kit doesn't make it a highlight-able feature - I seem to remember that just about every HD 2900XT review had some obligatory blurb about its tessellation engine. Until it shows results that set it apart from its peers, it means little.


I wasn't referring to this one particular review. As for the 295X2 review, it was a known Hawaii GPU on a single card, and I was just checking in for PCB shots and power consumption numbers.
Tonga has some very interesting new features that AMD is being coy about discussing with the press. Some websites, such as techreport.com, try their best to probe the manufacturers for details.
Although I will admit that in some cases TPU really shines, such as in the range of tested products and its power consumption (esp. Blu-ray playback) tests. But having the same basic layout for reviews year after year comes across as lazy.


----------



## BiggieShady (Sep 3, 2014)

ISI300 said:


> But having the same basic layout for reviews for year after year comes across as lazy.



No, it does not ... layout is something people get used to ... changing it often would come across as crazy.


----------



## W1zzard (Sep 3, 2014)

ISI300 said:


> But having the same basic layout for reviews for year after year comes across as lazy.


We do make small incremental updates to our reviews from time to time, if you feel certain sections could be improved let me know.

I'm with HumanSmoke on features that don't matter to you. At least 99.99% of people buy discrete graphics cards for gaming, so that's what we are focusing on.

What non-gaming graphics card technology do you use? Right now, the only technology where I would like to bolster our testing is accelerated video rendering using advanced shaders i.e. MadVR. But it's not easy to come up with a testing methodology.

Oh, and we received our card from AMD on Thursday last week.

Let's see.
- Mantle: 3 games available, broken on R9 285.
- 4K H.264 decoder: irrelevant without 4K content, no software available for testing. No H.265 decode
- Compute: zero real life applications that consumers use, accelerated video encode has horrible quality compared to CPU, double precision completely irrelevant in the consumer space, barely relevant elsewhere unless you buy Teslas
- Tessellation performance: covered by game testing, which actually matters to you
- DeltaColor compression: interesting, too bad all AMD said about it is on that slide: http://img.techpowerup.org/140903/Capture3957.jpg i.e., nothing but a mention that it exists
- Architecture: no arch diagram provided by AMD, no die size, no transistor count. I asked for it but got no reply.
- DX12: AMD's fine print says "The DX12 specification and support for it are subject to change without notice".
- Freesync: vaporware
- TrueAudio: cmon
- Raptr: really?


----------



## bubbleawsome (Sep 3, 2014)

I'll admit I would like to see compute (see: F@H) performance. I remember AnandTech used to do F@H but I'm not sure if they still test that. Not really sure how you bench F@H, though. Do you just throw the card at a work unit for three or so days until the PPD evens out?

I'd like it because no one mentioned how much the 770 and 7970 even out in compute. I figured the 7970 would blow it away but it didn't.
¯\_(ツ)_/¯
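One way to decide when PPD has "evened out" is to track the running average across completed work units and stop once the last few running averages agree within a small tolerance. A minimal sketch of that idea — the function name, window/tolerance values, and all PPD numbers are made up for illustration, not from any real F@H client:

```python
# Sketch: decide when a points-per-day (PPD) estimate has stabilized.
# A real run would read completed-work-unit credits and timestamps
# from the client's log instead of using these fabricated samples.

def ppd_stabilized(samples, tolerance=0.02, window=3):
    """Return the running-average PPD once the last `window` running
    averages agree within `tolerance` (relative), else None."""
    averages = []
    total = 0.0
    for i, ppd in enumerate(samples, start=1):
        total += ppd
        averages.append(total / i)      # running average after i work units
    if len(averages) < window:
        return None                     # not enough data yet
    recent = averages[-window:]
    mean = sum(recent) / window
    if all(abs(a - mean) / mean <= tolerance for a in recent):
        return mean                     # settled: report the stable figure
    return None

# Per-work-unit PPD estimates over roughly three days (fabricated):
samples = [118000, 121500, 119800, 120400, 120100, 120000]
print(ppd_stabilized(samples))
```

With a 2% tolerance the estimate settles after a handful of work units here; noisier cards (or work units with very different credit values) would simply need a longer run before the check passes.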


----------



## Frick (Sep 3, 2014)

Here it seems it'll cost pretty much exactly as much as the 760s, which makes it a better deal, I assume.

EDIT: BTW, something that has always annoyed me with the reviews is the lack of paragraphs in the conclusion. There are paragraphs, but there is no space between them of any kind.



Spoiler



AMD's new Tonga-based Radeon R9 285 leaves me with mixed feelings. Performance has really seen no significant improvements since Tahiti, i.e. the R9 280(X). Sapphire's overclocked board performs exactly as well as the HD 7970, which is almost three years old now. AMD did not send us a reference board, so I had to clock the Sapphire card down to reference clocks, which made the R9 285 3% slower than the Sapphire R9 285 Dual-X OC and 5% faster than the GTX 760. Compared to the AMD Radeon HD 7970 GHz Edition, Sapphire's overclocked card is over 10% slower. I'm looking at 1080p here, which I believe to be the ideal resolution for the R9 285. While AMD tries to promote 1440p gaming on the card, I think it's just too slow for a really good gaming experience at the resolution. Overall performance considered, I'd say the R9 285 is an excellent card for 1080p gaming at maximum details.
Feature-wise, not much that really affects you has changed. Tonga adds support for TrueAudio, which has been available on Bonaire and Hawaii before and really didn't impress anyone. You can now use FreeSync in games on Tonga, but there are no FreeSync monitors, so we'll have to see if it'll be worth it. Mantle has been on the market for a while now. With some adoptions into games, it will hopefully pick up in the future. Last but not least, an improved display controller now allows you to run a 3-monitor EyeFinity setup without an active DisplayPort adapter.
In light of NVIDIA's Maxwell architecture, I expected AMD to introduce massive power consumption optimizations, but Tonga doesn't bring much in that regard. Single-monitor idle power consumption is improved, reaching levels that are more appropriate for 2014. Multi-monitor and Blu-ray power consumption is still bad, much worse than with any NVIDIA product. Gaming efficiency has improved by roughly 10% over the old Tahiti GPU, but that's not enough. The R9 28x Series (including the R9 285) is still the least power-efficient series on the market as NVIDIA's new Maxwell architecture seems to be twice as efficient during gaming.
Sapphire's custom cooler does a good job keeping the card cool, delivering decent temperatures that match those we've seen from the R9 280(X). Idle noise levels are fine, but could be quieter. While fan noise during gaming has improved, it's still not where I'd like it to be, as it doesn't beat competing products. In gaming, the two fans will definitely be noticeable.
AMD's MSRP for the R9 285 is $249, and we expect the Sapphire R9 285 Dual-X OC to retail for around $260, which isn't too bad, but not good enough to take over the market. Right now, you can find GTX 770 cards discounted to an amazing $275, and these cards are significantly faster, quieter, and more power efficient. Another option would be to look at used HD 7970/R9 280X class cards that compete well with the R9 285 at better, used prices. This means AMD has to reduce their pricing to somewhere below the $230 mark to really get things moving. AMD's latest game bundle also includes loads of titles, old and new, you could pawn off to save some cost.



That's sort of tight.



Spoiler



AMD's new Tonga-based Radeon R9 285 leaves me with mixed feelings. Performance has really seen no significant improvements since Tahiti, i.e. the R9 280(X). Sapphire's overclocked board performs exactly as well as the HD 7970, which is almost three years old now. AMD did not send us a reference board, so I had to clock the Sapphire card down to reference clocks, which made the R9 285 3% slower than the Sapphire R9 285 Dual-X OC and 5% faster than the GTX 760. Compared to the AMD Radeon HD 7970 GHz Edition, Sapphire's overclocked card is over 10% slower. I'm looking at 1080p here, which I believe to be the ideal resolution for the R9 285. While AMD tries to promote 1440p gaming on the card, I think it's just too slow for a really good gaming experience at the resolution. Overall performance considered, I'd say the R9 285 is an excellent card for 1080p gaming at maximum details.

Feature-wise, not much that really affects you has changed. Tonga adds support for TrueAudio, which has been available on Bonaire and Hawaii before and really didn't impress anyone. You can now use FreeSync in games on Tonga, but there are no FreeSync monitors, so we'll have to see if it'll be worth it. Mantle has been on the market for a while now. With some adoptions into games, it will hopefully pick up in the future. Last but not least, an improved display controller now allows you to run a 3-monitor EyeFinity setup without an active DisplayPort adapter.

In light of NVIDIA's Maxwell architecture, I expected AMD to introduce massive power consumption optimizations, but Tonga doesn't bring much in that regard. Single-monitor idle power consumption is improved, reaching levels that are more appropriate for 2014. Multi-monitor and Blu-ray power consumption is still bad, much worse than with any NVIDIA product. Gaming efficiency has improved by roughly 10% over the old Tahiti GPU, but that's not enough. The R9 28x Series (including the R9 285) is still the least power-efficient series on the market as NVIDIA's new Maxwell architecture seems to be twice as efficient during gaming.
Sapphire's custom cooler does a good job keeping the card cool, delivering decent temperatures that match those we've seen from the R9 280(X). Idle noise levels are fine, but could be quieter. While fan noise during gaming has improved, it's still not where I'd like it to be, as it doesn't beat competing products. In gaming, the two fans will definitely be noticeable.

AMD's MSRP for the R9 285 is $249, and we expect the Sapphire R9 285 Dual-X OC to retail for around $260, which isn't too bad, but not good enough to take over the market. Right now, you can find GTX 770 cards discounted to an amazing $275, and these cards are significantly faster, quieter, and more power efficient. Another option would be to look at used HD 7970/R9 280X class cards that compete well with the R9 285 at better, used prices. This means AMD has to reduce their pricing to somewhere below the $230 mark to really get things moving. AMD's latest game bundle also includes loads of titles, old and new, you could pawn off to save some cost.



The space!


----------



## ISI300 (Sep 3, 2014)

W1zzard said:


> We do make small incremental updates to our reviews from time to time, if you feel certain sections could be improved let me know.
> 
> I'm with Humansmoke on features that don't matter to you. At least 99.99% of people buy discrete graphics cards for gaming, so that's what we are focusing on.
> 
> ...


I do agree with you on many of the points you make (GPU compute being practically useless for consumers, Mantle not being worth the hype, 4K H.264 irrelevant for now, etc.).
I have checked in to other mainstream sites to see if there's something extra floating about, but found nothing except for the features you highlighted.
But while I was reading the review, some questions arose that weren't answered for me. Like, would the DeltaColor compression have any impact on, or reduce, video memory usage? Without knowing whether this new architecture is purportedly any more efficient in VRAM usage than the prior one, every review website (except some, TR for instance) is deducting points for its 2 gigs of VRAM. Damage over at TR thinks that the 285 writes compressed data into the frame buffer. Seeing how much more efficient the 285 is in the 3DMarkV fill-rate test according to this review:
http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2
would the 285 be more efficient in VRAM allocation?


Damage said:

> Note that the frame buffer info stored in Tonga's DRAM is natively compressed, presumably giving it a smaller footprint. I'd like to test the 285 in 4K. I suspect it could get by almost as well as a 3GB Tahiti in some cases.


What I mean is it's good to know the base architecture (with as much info as one can gather) before giving a score and calling it a day. 
BTW, respect for the extensive list of tested cards and resolutions!


----------



## W1zzard (Sep 3, 2014)

ISI300 said:


> DeltaColor compression



Nobody really knows. I posted the slide above that has all the info from AMD regarding that feature.

2 GB of VRAM is plenty for all games at all playable settings at 1080p.


----------



## DaedalusHelios (Sep 3, 2014)

W1zzard said:


> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500302
> 
> That's the problem for R9 285, and main reason why low score/no award



Or even http://www.newegg.com/Product/Product.aspx?Item=N82E16814127741 which is even cheaper.


----------



## W1zzard (Sep 3, 2014)

DaedalusHelios said:


> Or even http://www.newegg.com/Product/Product.aspx?Item=N82E16814127741 which is even cheaper.


The Zotac was $275 until today. I guess AMD went to Newegg and cried, so they bumped the price.


----------



## GhostRyder (Sep 3, 2014)

Steevo said:


> Don't feed the trolls, and obvious troll is obvious.
> 
> People complained about driver testing, W1zz retested with new driver, and it gained less than 1%.
> People complain about how this site and reviews were AMD fanboyish.
> ...


Most people rant about one being significantly better in all situations or make fun of the other because of their own insecurities. If you haven't noticed, the same few people make fun of certain companies on a daily basis regardless of the topic at hand.


mroofie said:


> the fps doesn't agree with you ?
> Gtx 760 has same performance


? It seems to be faster in most situations than a GTX 760, which was the target of this card, though it's only relevant to people shopping at that price point and not to people looking for the king of GPUs.



DaedalusHelios said:


> Or even http://www.newegg.com/Product/Product.aspx?Item=N82E16814127741 which is even cheaper.


$274 when we were discussing yesterday.



ISI300 said:


> I do agree with you on many of the points you make (GPU compute being practically useless for consumers, Mantle not being worth the hype,  4K H.264 irrelevant for now, etc.)
> I have checked in to other mainstream sites to see if there's is something extra floating about, but nothing except for the features you highlighted.
> But, while I was reading the review, there where some questions that arose that weren't answered for me. Like, would the DeltaColor compression have any impact or reduce the amount of video memory usage? Without knowing whether this new architecture is purportedly any more efficient in VRAM usage compared to the prior or not, every review website (except some, TR for instance) are deducting points for it's 2 gigs of VRAM. Damage over at TR thinks that the 285 writes compressed data into the frame buffer. Seeing as how much more efficient the 285 is in 3DMarkV fillrate test according to this test:
> http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2
> ...


It is his testing methodology... @W1zzard does it his way, the way he thinks is best, and follows more of what the majority of people are looking for at said price point and in those areas in general. If you are specifically referencing 4K being necessary, I doubt any of the sub-$300 cards, even in tri/quad setups, would be worthwhile for playing at those resolutions anyhow. This card was aimed up at the 1440p/1600p areas (probably with the 4GB aftermarket variants and a second card) and mostly at the 1080p area at that price point for gamers, which is what the review focused on. There are plenty of reviews out there that check different things; not every review/reviewer can cover every possible scenario for every single card. If people tried, they would be stuck for weeks on end testing just one card...


----------



## revin (Sep 3, 2014)

ISI300 said:


> ,  4K H.264 irrelevant for now, etc.)


@W1zzard  Just to clarify, it's only because of the lack of testing software, and so you can set a repeatable standard.
Correct?  I feel that that quote may be out of context to what you are trying to relay.

4K is relevant, and I know when you find how to approach it you will be just as diligent.

It's not fair to hear comments negative toward W1zzard's methodology when he, and most all TPU reviewers, are far more diligent in a "review", not pacifying some brand or rep to get brownie points.

They never have; even when the CPU TIM issue came up, it still revealed "some" flaw, even if the sample was "free". Doesn't mean they (TPU) need to butter anyone up................................


----------



## W1zzard (Sep 3, 2014)

4K video: raise your hand if you have watched a real movie in 4K, not at the cinema. Raise your second hand if you watched it via computer.  Nobody. 

I have no plans to test 4K movie playback in any way for the foreseeable future.

My comment on video playback testing was for 1080p output with MadVR (if you don't know what it is, Google and start using it).


----------



## DaedalusHelios (Sep 3, 2014)

W1zzard said:


> The Zotac was $275 until today, I guess AMD went to Newegg and cried, so they bump the price



My mistake, sorry about that. 

I don't see why NVIDIA doesn't make 6GB or 4GB 780 Ti cards. I guess every major player makes weird decisions. I believe W1zzard is correct in his assessment of the R9 285. If it had instead been a more robust card than the 280X, featuring more memory bandwidth instead of less, with even higher core clocks or perhaps more shaders, then they would have been better off. Nobody looks to AMD for power efficiency given what we know Maxwell will bring to the table.


----------



## W1zzard (Sep 3, 2014)

Rest of the 4K graphs (due to 10 attachment limit)

I don't have that for other cards in the R9 285 segment


----------



## W1zzard (Sep 3, 2014)

oh and


----------



## GhostRyder (Sep 3, 2014)

W1zzard said:


> Rest of the 4K graphs (due to 10 attachment limit)
> 
> I dont have that for other cards in the R9 285 segment


Well at least WoW is playable at 4k on the card


----------



## W1zzard (Sep 3, 2014)

GhostRyder said:


> Well at least WoW is playable at 4k on the card


Not for any serious raiding with your own guild + spell effects + monsters + dont stand in fire


----------



## GhostRyder (Sep 3, 2014)

W1zzard said:


> Not for any serious raiding with your own guild + spell effects + monsters + dont stand in fire


So in other words, stand in the corner, watch people walk by, and acknowledge that it's pretty at least 

Err... "pretty" might be relative to the situation, since the graphics are not exactly the highest of standards today, lol (but then again, that's not the point of WoW)


----------



## RealNeil (Sep 3, 2014)

I read reviews on four sites. This is one of them. I haven't been around here for very long, but I like it here, and the reviews are part of the reason why. (I ended up here chasing down a creep that cheated me)

Anyone can post ~picking the flyshit out of the pepper~, but actually do some reviews and you're gonna realize that they're all different and that variables exist every time. 
I like being able to depend on a familiar review format. It makes it easier to find the info I want to see. (what interests me and what doesn't) 

Thanks for the time and effort you put into the reviews here.


----------



## Sony Xperia S (Sep 3, 2014)

W1zzard said:


> oh and



A few dozen of these votes are mine, because it has been open for quite a while and you can vote every day.

How representative is this result and how many real unique votes are there?

So, if I understand you correctly, you will wait until 4K becomes mainstream and then you will begin paying more serious attention? Right?


----------



## N3M3515 (Sep 3, 2014)

lol, this review is like totally opposite to the one on the tech report


----------



## W1zzard (Sep 3, 2014)

Sony Xperia S said:


> So, if I understand you correctly, you will wait until 4K becomes mainstream and then you will begin paying more serious attention? Right?


http://www.techpowerup.com/reviews/AMD/R9_295_X2/

Plenty of 4K in there


----------



## Ravenas (Sep 3, 2014)

I believe you meant to say the R7 260x on page 1 correct?


----------



## W1zzard (Sep 3, 2014)

Ravenas said:


> I believe you meant to say the R7 260x on page 1 correct?


fixed


----------



## Ravenas (Sep 3, 2014)

I'm seeing that the Asus R9 285 Strix and the Gigabyte variant on Tom's are drawing approximately 176W (average).

http://www.tomshardware.com/reviews/amd-radeon-r9-285-tonga,3925-11.html

These are somewhat lower than what has been reported here for the Sapphire variant (189W avg.), but still in line with a card that draws a lot of power. I was hoping to see this card in the 150W range.


----------



## The Von Matrices (Sep 3, 2014)

Ugh, why does AMD continue to use a shim that is higher than the core?  It forces you to use special heatsinks that are incompatible with every other GPU on the market.  The only reason I can imagine they did it in this GPU is to maintain compatibility with the 79x0 and 280 heatsinks.


----------



## HumanSmoke (Sep 3, 2014)

Sony Xperia S said:


> A few dozen of these votes are mine because it is open for quite a while and you can vote every day.
> How representative is this result and how many real unique votes are there?


Can't speak for anyone else, but I voted once only....to keep within the spirit of the thing y'know?

I guess if I had the facebook mentality I'd be checking boxes like a demented Bingo player- but I don't so I don't.


Sony Xperia S said:


> So, if I understand you correctly, you will wait until 4K becomes mainstream and then you will begin paying more serious attention? Right?


I'd guess that W1zzard bases his bench suite on reliable (repeatable) bench scenarios, game popularity/*relevance*, and feedback from the site membership - admitting to spamming the poll doesn't seem like advancing your position.


----------



## W1zzard (Sep 3, 2014)

HumanSmoke said:


> admitting to spamming the poll doesn't seem like advancing your position.


Forum users can only vote once on the poll; unregistered visitors, once per IP per 24 hours. These polls are a fun thing, they have no significance for us, so I really don't care if there is cheating.

I keep forgetting about changing the poll to a new one ..
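The throttle described above (registered users vote once per poll, guests once per IP per 24 hours) could be sketched like this. A hypothetical in-memory version for illustration only, not TPU's actual forum code:

```python
import time

# Sketch of the stated voting rules: a registered user may vote once
# per poll, ever; an unregistered visitor may vote once per IP per day.
DAY = 24 * 60 * 60  # seconds

class PollThrottle:
    def __init__(self):
        self.voted_users = set()   # user ids that have already voted
        self.ip_last_vote = {}     # ip -> unix timestamp of last vote

    def try_vote(self, user_id=None, ip=None, now=None):
        """Record a vote if allowed; return True if it counted."""
        now = time.time() if now is None else now
        if user_id is not None:                  # registered: once only
            if user_id in self.voted_users:
                return False
            self.voted_users.add(user_id)
            return True
        last = self.ip_last_vote.get(ip)         # guest: once per IP/day
        if last is not None and now - last < DAY:
            return False
        self.ip_last_vote[ip] = now
        return True
```

Under these rules a guest's repeat vote within the same day is rejected, but the same IP counts again once 24 hours have passed — which is exactly how "a few dozen votes" from one person accumulate over weeks.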


----------



## HumanSmoke (Sep 3, 2014)

N3M3515 said:


> lol, this review is like totally opposite to the one on the tech report


Look around, you'll see variances all over the place. TR's power consumption...




Looks bad, doesn't it? If you're the kind of person who skips straight to the results, you couldn't really tell that the 285 card is overclocked, and the same goes for the gaming benchmarks...




These don't have normalized stock clocks as either an indicator that the tested card is OC'ed, or base performance as W1zzard supplied (i.e., an idea of OC scaling perf. vs. power consumption). Which is the better methodology?


Ravenas said:


> I'm seeing that the Asus R9 285 Strix and the Gigabyte variant on Tom's are drawing approximately 176W (average).


Well, I see the Gigabyte results, but I don't see the Asus results.
Another point I'd highlight is that not all cards are created equal.


----------



## Sony Xperia S (Sep 3, 2014)

HumanSmoke said:


> admitting to spamming the poll doesn't seem like advancing your position.





W1zzard said:


> forum users can only vote once on the poll, unregistered once per ip per 24 hours. these polls are a fun thing, they have no significance for us, so i really don't care if there is cheating.



I am not spamming or cheating the poll but using my officially given right to cast my vote as much as I would like to. Calm down.



HumanSmoke said:


> I'd guess that W1zzard bases his bench suite on both reliable (repeatable) bench scenario's, game popularity/*relevance*, and feedback from the site membership - admitting to spamming the poll doesn't seem like advancing your position.





W1zzard said:


> http://www.techpowerup.com/reviews/AMD/R9_295_X2/
> 
> Plenty of 4K in there



I'm not speaking about gaming at 4K, but about all the other services where GPUs can and should be used: television, movies, videos on YouTube, etc...


----------



## GhostRyder (Sep 3, 2014)

Sony Xperia S said:


> I don't speak about gaming at 4K but all other services where GPUs can and should be used - television, movies, videos in YouTube, etc...



Well, if that's all you care about, I'm pretty sure any of the cards with DP 1.2 can run 4K just fine (60 Hz effective). Some NVS 510s we have running in the office support it, and videos really do not utilize much video power.


----------



## Sony Xperia S (Sep 3, 2014)

GhostRyder said:


> Well if thats all you care about, im pretty sure any of the cards that contain DP 1.2 can run 4k just fine (60hz effective).  Some NVS 510's in the office we have running support it and videos really do not utilize much video power.



Yes, probably you are right. I was just assuming the ideal case where GPU and CPU load stay at ~1% during 4K playback. But if it is already possible, or even not that bad when those are loaded at several dozen percent, then I agree with you. I didn't know that.


----------



## overpass (Sep 3, 2014)

All the GTX 770s linked here have had their promotion periods expire... all over $300, at least in Canada 
I think the amount of flak this review gets is due to the rating system's reliance on numbers (and decimals, for Chrissake). For example, what determines that it's an 8.0 and not an 8.2 or 8.4? Or not a 7.9 or 7.8? Or a 7.95 rounded up to 8 because the reviewer was feeling good that day? (Crossing into 7 territory from 8 is like crossing the Rubicon or the Styx; I can see W1zzard thought real carefully about that!) I am sure this has been discussed to death already, but take from it what you will, and make an informed choice using this review and others is all.


----------



## Steevo (Sep 4, 2014)

W1zzard said:


> 4K video: raise your hand if you have watched a real movie in 4K, not at the cinema. raise your second hand if you watched it via computer.  nobody
> 
> I have no plans to test 4K movie playback in any way for the forseeable future.
> 
> My comment on video playback testing was for 1080p output with MadVR (if you don't know what it is, Google and start using it).




I tried, and it crashes. Is it incompatible with the 64-bit version of MPC?


Interestingly enough, the ATI-provided video encoding software from a few years ago provides another GPU-assisted rendering method with MPC: 0-5% CPU use, 0-60% GPU use depending on file scaling and scene complexity. Attempting to play the same clip with CPU-bound enhancements resulted in a single core maxed out and jumpy playback.









Sony Xperia S said:


> I don't speak about gaming at 4K but all other services where GPUs can and should be used - television, movies, videos in YouTube, etc...



Most don't use it, as they don't understand that it needs to be turned on and needs some filters; their built-in Intel graphics are crap, Netflix's Silverlight plugin doesn't allow true hardware acceleration, YouTube's smorgasbord of file types and streaming does more to hurt quality than to help, and other limitations make it almost impossible to test. 

I have an ATI 750HD tuner card, where encrypted digital cable makes it virtually worthless. OTA programming benefits from hardware acceleration, as does plugging in an old VHS or DVD player, but why do that when it already has a DVD player in it, and backup copies can be had through the pipes that be. 

I own and use a 1080p camcorder and camera, and my new cell does 4K that looks great at 1080p but is, I am sure, meh at actual 4K. You are trying to start a movement on something we aren't sure when and how is going to be adopted: no delivery service, no content, no reason as of yet. Few cards can push that many pixels without issues in gaming; google 4K tiled-screen issues and read up.


----------



## Ravenas (Sep 4, 2014)

Steevo said:


> I own and use a 1080p camcorder and camera, and my new cell phone does 4K that looks great at 1080p but is, I'm sure, meh at actual 4K. You are trying to start a movement around something we aren't sure when or how will be adopted: no delivery service, no content, no reason as of yet. Few cards can push that many pixels without issues while gaming; google 4K tiled-screen issues and read up.



Yes, but do the apps running on your 4K phone actually support 4K, or are they just stretched, like most Android apps?


----------



## Ravenas (Sep 4, 2014)

Power consumption results confirmed.

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-285-dual-x-oc-review/18/


----------



## Steevo (Sep 5, 2014)

Ravenas said:


> Yes, but do the apps running on your 4K phone actually support 4K, or are they just stretched, like most Android apps?



Don't double post.

It records 4K video, it is 1080 resolution. 4K screen resolution on a phone is stupid, no need for that pixel density.


----------



## Ravenas (Sep 5, 2014)

Steevo said:


> Don't double post.
> 
> It records 4K video, it is 1080 resolution. 4K screen resolution on a phone is stupid, no need for that pixel density.



Is it really considered a double post if it's posted 8 hours later and has nothing at all to do with the other post? Hmm...


----------



## RealNeil (Sep 5, 2014)

I have a question about the reviewed GPU.
On page 4, A Closer Look, there is this:  "A BIOS switch is also available. It lets you switch between a legacy and UEFI BIOS and acts as a safeguard should something go wrong during a BIOS flash."

Is the switch a two-position switch? Does it make the card operate at different speeds or voltages? In the past, I have seen these serve as a stock-speeds-and-voltages BIOS in one position and accelerated settings in the other.


----------



## W1zzard (Sep 5, 2014)

RealNeil said:


> I have a question about the reviewed GPU.
> On page 4, A Closer Look, there is this:  "A BIOS switch is also available. It lets you switch between a legacy and UEFI BIOS and acts as a safeguard should something go wrong during a BIOS flash."
> 
> Is the switch a two-position switch? Does it make the card operate at different speeds or voltages? In the past, I have seen these serve as a stock-speeds-and-voltages BIOS in one position and accelerated settings in the other.


It's a 2-way switch and switches between two BIOSes with the same clocks and voltages. Sapphire's website claims it's a UEFI toggle, which I haven't verified.


----------



## RealNeil (Sep 5, 2014)

So can you get a look at the card's BIOS settings? (and maybe tweak them too?)


----------



## 1d10t (Sep 6, 2014)

This card is definitely a no-go. I would take the aging R9 280X any day over this.


----------



## shark974 (Sep 6, 2014)

What a horribly biased review! First off, the author commends Nvidia's Maxwell architecture... except it's a horrible performer in the 750 Ti, and AMD's much older architecture runs circles around Nvidia's latest and greatest at its price point. I mean, AMD utterly destroys the Maxwell 750 Ti. The only thing Maxwell has going for it is a few watts less power usage, and frankly, who cares? Enthusiast gamers who buy $200+ cards care about FPS and value. Power consumption should be only a minor concern as long as it's within reason, unless you are building an ITX system or something, which is obviously a tiny minority. Personally, I will take the higher-wattage, more powerful card every day of the week.

Also, he compares it to the "3-year-old 7970"... but that card was $450. Again: Nvidia introduced its Maxwell architecture with a card that is MUCH slower than its ancient Fermi-based parts, and much slower than Tonga. Was he complaining then? I doubt it. So why whine at AMD for doing the same thing?

Getting 7970 performance for $249 is not bad, and more to the point, this is old news; both AMD and Nvidia have been pretty stagnant on the performance front and doing lots of rebranding for a while now.

I'll quote Tom's Hardware for a much less biased take:



> _This is not a situation that we expected to see. Based on the specifications alone, we thought that the Radeon R9 285 would fall behind the Radeon R9 280 when it comes to average game performance. In actual benchmarks it comes very close, sometimes beating and sometimes losing to the Radeon R9 280 by a small amount, and slightly besting its predecessor on average. This indicates that AMD's new lossless color compression scheme is effective enough to compensate for the raw memory bandwidth deficit that the 256-bit Radeon R9 285 suffers compared to the 384-bit Radeon R9 280. That alone is an impressive technical accomplishment.
> 
> From the perspective of a gamer, the Radeon R9 285 isn't quite as impressive when compared to the Radeon R9 280 it will replace. Sure, performance-per-watt has improved significantly, and it's nice to have access to new features such as TrueAudio, a revamped 4K-compatible UVD/VCE, and bridgeless CrossFire. But the most important metric to a gamer is FPS-per-dollar, and this doesn't change much vs. the Radeon R9 280.
> 
> ...
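To put the quoted bandwidth deficit into numbers, here is a quick sketch (the memory specs are assumed from the cards' spec sheets: 256-bit at 5.5 Gbps effective for the R9 285, 384-bit at 5.0 Gbps for the R9 280):

```python
# Raw GDDR5 bandwidth: bus width in bits / 8 (bytes) * effective data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

r9_285 = bandwidth_gb_s(256, 5.5)  # 176.0 GB/s
r9_280 = bandwidth_gb_s(384, 5.0)  # 240.0 GB/s

# The 285 gives up roughly a quarter of the 280's raw bandwidth
deficit = 1 - r9_285 / r9_280      # ~0.267
print(f"{r9_285:.0f} GB/s vs {r9_280:.0f} GB/s ({deficit:.0%} less)")
```

That roughly 27% gap is what the lossless color compression has to claw back for the two cards to end up about even.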



And more:



> _We'd also like to address the GeForce GTX 760's position on our average performance chart. Seeing the 760 sit just *below* the Radeon R9 270X wasn't something we expected, and this situation caused us to re-run the majority of benchmarks and cross-reference them with other tests we've taken. The results were consistent, though, and we believe this situation is the result of a combination of factors including some newer game titles such as Thief (that favor the GCN architecture), mixed with some high-detail settings that don't sit well with the GeForce GTX 760's 192 GB/s of memory bandwidth (bandwidth that doesn't benefit from lossless color compression). We wouldn't count out the GeForce GTX 760, a card that uses even less power than the Radeon R9 285, but we're beginning to wonder if it's time to re-assess its position over a wider range of benchmarks._



So Nvidia is behind, and they're falling further behind, as the 760 now trails the 270X.

Then the author goes on to include this unasked-for advertisement for Nvidia in the review conclusion:

_Right now, you can find GTX 770 cards discounted to an amazing $275, and these cards are significantly faster, quieter, and more power efficient.    _

Well, no. Actually, on Newegg the GTX 770 goes for $309+, generally $329 and up, unless you want Zotac or open-box. I am not sure where the author is finding these 770 deals, but rest assured, whatever podunk shady retailer laden with mail-in rebates you'll never get he found them at, AMD probably has them beat. Comparing a $249-MSRP card with a year-old card at a $329 street price? Real fair. I bet the Tonga price will fall a lot too after it's been out a while, especially as long as the 770 has been.

At least as good a buy as the 770 is the R9 280X, which according to TechPowerUp itself is only a little slower than the 770 (106% vs. 114%), much cheaper (Newegg street price ~$280 vs. $329), and has 3GB of RAM vs. 2GB, which is a big deal IMO, as 2GB is on the verge of limiting. Nvidia has been shorting customers on RAM for years; ask all the owners of 1.5GB 580s and 1.2GB 570s, back when AMD was already at 2GB with the 6970, which owner is feeling more future-proof right now. Or my brother, who paid $650 for a GeForce 780 with... a whopping 3GB of RAM, the same amount that has been in AMD's $250 cards for years. He's thinking of getting a 4K monitor and, well, he got the old Nvidia RAM shaft: his GPU is fine, but that RAM could be a future problem. A nice $650 spent with Nvidia on a possible future 4K paperweight because they wanted a couple of dollars more profit. Personally, I wouldn't buy a 2GB card for more than $250. And I do fault AMD a little for only 2GB in the 285, but it's only a $249 card, so it's kind of understandable, nothing like what Nvidia has been doing the last few years, where their $600+ cards skimp on RAM.

Basically, I don't see the 770 as any great deal; it looks like it slots in about right. Somewhat faster (but not much) than the 285 at a significantly higher price, both with only 2GB of RAM. A little faster than the 280X, more expensive, less RAM. Cheaper than the 290, slower, half the RAM. So it slots in about right between all those cards, which I have to admit is pretty rare, for an Nvidia card to actually give fair value against AMD, so maybe that's why the author was so amazed? I don't care about noise (aftermarket coolers are quiet) or power consumption (they are all fine).


And I left out the R9 280. It is significantly slower than the 770 (according to Tom's summary, the 770 is about 19% faster than the R9 280), so it may not be a comparable product for some, but it's MUCH cheaper (street price as little as $210) and has 3GB of RAM vs. the 770's two, and I've already made my feelings clear about the extreme importance of that. PERSONALLY, I would take the extra 1GB of future-proofing RAM over the 19% performance even if they were the same price, let alone with the 280 being $120 cheaper. The 280 is probably the best deal of any of the cards mentioned, IMO. Plus you get Star Citizen and Alien: Isolation free, which is an AMAZING game bundle IMO, though I admit this will vary per user; some may not care about these games, although I'd venture it's objectively far better than what Nvidia currently bundles. Nvidia appears to be giving away Borderlands: The Pre-Sequel, which at least for me is not very exciting (very poor graphics, as it's based on the PS3/360), but again, this could vary by user, and at least it's not an old game. For me there's zero contest.
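The FPS-per-dollar point is simple arithmetic; here's a sketch using the street prices above ($210 for the 280, $329 for the 770) and the ~19% performance gap from Tom's summary:

```python
# Relative performance per dollar, with the R9 280 as the 100 baseline
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

r9_280 = perf_per_dollar(100, 210)   # baseline, at $210 street
gtx_770 = perf_per_dollar(119, 329)  # ~19% faster, at $329 street

advantage = r9_280 / gtx_770 - 1     # the 280's value edge, ~32%
print(f"R9 280 delivers {advantage:.0%} more performance per dollar")
```

So even ignoring the extra 1GB of RAM and the game bundle, the 280 is about a third better on pure FPS per dollar.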


----------



## The Von Matrices (Sep 6, 2014)

shark974 said:


> What a horribly biased review! First off, the author commends Nvidia's Maxwell architecture... except it's a horrible performer in the 750 Ti, and AMD's much older architecture runs circles around Nvidia's latest and greatest at its price point. I mean, AMD utterly destroys the Maxwell 750 Ti. The only thing Maxwell has going for it is a few watts less power usage, and frankly, who cares? Enthusiast gamers who buy $200+ cards care about FPS and value. Power consumption should be only a minor concern as long as it's within reason, unless you are building an ITX system or something, which is obviously a tiny minority. Personally, I will take the higher-wattage, more powerful card every day of the week.
> 
> Also, he compares it to the "3-year-old 7970"... but that card was $450. Again: Nvidia introduced its Maxwell architecture with a card that is MUCH slower than its ancient Fermi-based parts, and much slower than Tonga. Was he complaining then? I doubt it. So why whine at AMD for doing the same thing?
> 
> ...



If you think the mediocre review was because of some Nvidia conspiracy, then you are completely missing the point.

AMD's biggest competition is with itself.  Last month you could have bought an R9 280X at the same performance and the same price as you can buy an R9 285 today.  All the reviewers agree that this launch does absolutely nothing to change the competitive landscape of the market, and that is why it is so disappointing.


----------



## 1d10t (Sep 8, 2014)

shark974 said:


> What a horribly biased review! ...and blah, blah...








Any author is entitled to whatever opinion they please... but let's face reality: the R9 285 doesn't make any dent. 
Oh, and welcome to the forums, btw 



The Von Matrices said:


> If you think the mediocre review was because of some Nvidia conspiracy, then you are completely missing the point.
> 
> AMD's biggest competition is with itself.  Last month you could have bought an R9 280X at the same performance and the same price as you can buy an R9 285 today.  All the reviewers agree that this launch does absolutely nothing to change the competitive landscape of the market, and that is why it is so disappointing.



+1 to this.


----------



## dj-electric (Sep 8, 2014)

For those who haven't noticed, GTX 770 prices went up sharply.


----------



## Steevo (Sep 8, 2014)

Nvidia adjusted them according to the performance/price of this turd. 

Perhaps AMD could use some compute clusters to calculate our disappointment in their mediocre midrange replacement, then follow that up with how they expect a chip larger than the one it's replacing to make them less money. This is Phenom all over again.


----------



## The Von Matrices (Sep 8, 2014)

Steevo said:


> they expect a chip larger than the one it's replacing to make them less money. This is Phenom all over again.



You are forgetting that the GPU in the R9 285 is not a fully enabled die, while the GPU in the R9 280X is.  Comparing partially enabled die to partially enabled die, the R9 285 clearly outperforms the R9 280.  The leaks indicate that there will be an R9 285X using a fully enabled Tonga GPU, which is expected to outperform the R9 280X, as it should, considering the larger die size.


----------



## Steevo (Sep 8, 2014)

The Von Matrices said:


> You are forgetting that the GPU in the R9 285 is not a fully enabled die, while the GPU in the R9 280X is.  Comparing partially enabled die to partially enabled die, the R9 285 clearly outperforms the R9 280.  The leaks indicate that there will be an R9 285X using a fully enabled Tonga GPU, which is expected to outperform the R9 280X, as it should, considering the larger die size.



Considering where it sits, it's a piss-poor move overall. They should expect to sell more of these than 290s, and since they don't really make cards, just the silicon that goes on them, a smaller piece of a wafer whose cost stays the same keeps more money in their pocket. From a business perspective, if they sell midrange to high-end at 3 to 1 on a well-developed process node, they should be paying complete attention to the midrange and should have cut this die by the 25% it could have been. The number of cut shaders (1/8) and the as-yet-unknown memory interface of the full die make me guess that it may only reach 290X levels in full form, merely allowing for energy-consumption benefits and cheaper components. Other than a few watts dispersed as heat, where is the real benefit? It seems the same or greater energy savings could have been had merely by refining the Tahiti silicon with a respin, better clock management, and really cutting out the unneeded parts.
http://forums.guru3d.com/showthread.php?t=364277
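The die-economics point can be roughed out with the usual first-order dies-per-wafer estimate. A sketch, assuming a 300 mm wafer, Tonga's widely reported ~359 mm² die, and the hypothetical 25% smaller die suggested above (yield and binning ignored):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    # First-order estimate: wafer area / die area, minus an edge-loss term
    d = wafer_diameter_mm
    gross = math.pi * (d / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * d / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

tonga = dies_per_wafer(359)          # full Tonga: ~161 candidate dies
shrunk = dies_per_wafer(359 * 0.75)  # 25% smaller die: ~221 candidate dies
print(tonga, shrunk)
```

At a fixed wafer cost, that is roughly 37% more candidate dies per wafer, which is the margin being left on the table.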

This is the reason AMD cards have higher power draw at everything but single-screen idle. Chop the whole UVD block out, make it a secondary chip wired to the memory bus and output buffer logic, and stop screwing around, or put a second clock generator on the damn board for it.

Perhaps W1zzard can check whether the core downclocks on this board too: run something like Heaven in windowed mode, then open YouTube or the like and see if the core clocks drop for UVD. That would be interesting.


----------



## Sony Xperia S (Sep 14, 2014)

_I agree - Tonga is not bad, but on the other hand it does not change anything substantially compared to Tahiti. This would have been a nice result 1 - 1.5 years after the introduction of Tahiti, but that was almost 3 years ago! *The last time a GPU company showed no real progress after 3 years, they went out of business shortly afterwards*..._


----------



## Fluffmeister (Sep 17, 2014)

No Tonga XT coming?

http://videocardz.com/52441/did-amd-just-cancel-radeon-r9-285x


----------



## RealNeil (Sep 17, 2014)

AMD and NVIDIA go back and forth all the time. They respond to one another's releases and then are responded to in kind.

My R9-280X OC 3GB cards are awesome for the price, and they run CrossFire without any problems. But my older GTX-680s in SLI are pretty cool too.
I just get whatever I can afford, whenever I can. If either company wants loyalty, they should buy a dog.

You can expect both companies to release bigger, better GPUs for the holidays. I look forward to that.


----------



## Sony Xperia S (Sep 18, 2014)

Fluffmeister said:


> No Tonga XT coming?
> 
> http://videocardz.com/52441/did-amd-just-cancel-radeon-r9-285x



No, we should not jump to the wrong conclusions; logically, it doesn't make much sense not to bring fully enabled chips to our market.

However, I'm more interested in why TPU has no review of or performance comparison with the R9 280.

Basically, there is no 280 but there is a 285; there is no 285X but there is a 280X. Very confusing, as if something is indeed going on behind the scenes.


----------



## W1zzard (Sep 18, 2014)

Sony Xperia S said:


> However, I'm more interested in why TPU has no review of or performance comparison with the R9 280.


Because AMD didn't send me an R9 280


----------



## RealNeil (Sep 18, 2014)

W1zzard said:


> Because AMD didn't send me an R9 280



That's an excellent reason!


----------



## Sony Xperia S (Sep 18, 2014)

RealNeil said:


> That's an excellent reason!



Actually, that doesn't answer my question, because it is not the original reason but more likely a consequence.

Why didn't AMD send the card? And why does TPU rely on them when it can get the card from the AIB partners? What was AMD's idea? What is going on inside?


----------



## W1zzard (Sep 18, 2014)

Sony Xperia S said:


> Why didn't AMD send the card? And why does TPU rely on them when it can get the card from the AIB partners? What was AMD's idea? What is going on inside?


I have no idea. AIBs didn't sample R9 280.


----------



## Sony Xperia S (Sep 18, 2014)

W1zzard said:


> I have no idea. AIBs didn't sample R9 280.



Well, we can put it somewhere around the GTX 660 Ti and GTX 760, and a little below the R9 285.
Perhaps around 5% slower (on average) than the R9 285.


----------



## nunyabuisness (Oct 1, 2014)

I think it's time to abandon AMD. No real advancements in CPU or GPU tech IN YEARS; all they seem to do is throw more power at their problems. 
 They are still stuck in the 2D-transistor world, still in the 28/32 nm size range. It's just disappointing 

Also, TechPowerUp: if you are going to review a video card such as the 285, how about including the card it's replacing, the 280, for comparison? You included the 280X, which offers no comparison.


----------



## Sony Xperia S (Oct 1, 2014)

nunyabuisness said:


> I think it's time to abandon AMD. No real advancements in CPU or GPU tech IN YEARS.



Now, after so much time has passed, it indeed looks like a severe mistake that they threw their money at ATi.

Instead, they should have put those financial resources into engineering talent, the way Intel designs its own graphics.

Whatever; yes, I agree that AMD is in deep trouble, but maybe in 5-6 months they will release something from the R9 300 series.


----------



## RealNeil (Oct 4, 2014)

nunyabuisness said:


> I think its time to abandon AMD~~~~~~~~~~~~~



Ha! That was good! ...I laughed and snorted coffee through my nose when I read that.



nunyabuisness said:


> Also, TechPowerUp: if you are going to review a video card such as the 285, how about including the card it's replacing, the 280, for comparison? You included the 280X, which offers no comparison.



I think that the reviewer already commented on the fact that he only had certain cards on hand to test with.


----------

