# AMD Radeon R9 290 4 GB



## W1zzard (Nov 5, 2013)

AMD is launching their new Radeon R9 290 today. It comes with slightly weaker specifications than the R9 290X, but can compete with its bigger brother in benchmarks. Thanks to the fantastic price of just $399, the card is also extremely affordable.

*Show full review*


----------



## v12dock (Nov 5, 2013)

Wow what an incredible price! Awesome review as usual


----------



## Mussels (Nov 5, 2013)

Nice speed, but doesn't this make the 290X redundant?


----------



## Novulux (Nov 5, 2013)

Alright well, I'm definitely picking up this card. Hopefully it comes with some games.


----------



## Slomo4shO (Nov 5, 2013)

Great price, $100 less for 780 performance... an echo from 2 weeks ago?


----------



## Pandora's Box (Nov 5, 2013)

I'd be interested in seeing a review where they take all these cards and overclock them and re-evaluate.


----------



## Nordic (Nov 5, 2013)

With how these cards throttle, I really want to see how they do with custom coolers. For most of us, I think that is where these cards will really begin to look good.

Hey w1zz, does the 290 perform pretty much the same as the 290x because it didn't throttle as much?


----------



## W1zzard (Nov 5, 2013)

james888 said:


> Hey w1zz, does the 290 perform pretty much the same as the 290x because it didn't throttle as much?



Yes, the 290X in quiet mode throttles very often, and the 290 non-X throttles less due to its higher maximum fan speed setting


----------



## sweet (Nov 5, 2013)

The 290X is still the killer with a better cooling solution. But Titan performance at $400? You can't go wrong with the 290


----------



## radrok (Nov 5, 2013)

Wow, this thing pisses so hard on Nvidia's parade that their logo has lost some of its green in favor of yellow today. 

GREAT review, great card, bad cooler, and a very competitive price. 

This plus a good AIB cooler infusion and it'll become even more of a nightmare for the competition than it already is. 

Good job AMD. Bravo.


----------



## THE_EGG (Nov 5, 2013)

hmmmm next card for me!! Although only if it comes out with a decent third-party/aftermarket cooler. That noise is insane!


----------



## HumanSmoke (Nov 5, 2013)

Awesome bang-for-buck card.
Pity AMD enforces a waiting period before AIBs can release their own proprietary designs. A Windforce3 or Vapor-X on day 1 would have my money


----------



## manofthem (Nov 5, 2013)

I've been waiting for this review; thanks W1zz, great work yet again 
I'm pretty much sold at this price.  While the price of the 290x was steep, the 780 price drop looked attractive, but now that this is coming out, I think this will have to be my next card!  Slap a water block on it and I'll be good to go, no need to worry about heat and noise


----------



## Mussels (Nov 5, 2013)

Pandora's Box said:


> I'd be interested in seeing a review where they take all these cards and overclock them and re-evaluate.



Can't really be done accurately, since every card OCs differently. You can have one card that's unstable at 1050 MHz while another is stable at 1100 MHz, and unless a reviewer spends days testing each card, they'll never get each card to its absolute maximum, which is why most reviews only skim over the OCing side of things.


----------



## btarunr (Nov 5, 2013)

HumanSmoke said:


> Awesome bang-for-buck card.
> Pity AMD enforce a waiting period before AIB's can release their own proprietary designs. A Windforce3 or Vapor-X on day 1 would have my money



They'll be out by the month's end, don't worry. Expect WindForce 450W, TF4, and DCU2 triple-slot.


----------



## SIGSEGV (Nov 5, 2013)

What an incredible card... 
It reminds me of the GTX 670 back then... 

I wonder if my EK full water block for the 290X is compatible with the 290.. anyone? 
Damn, this card is tempting me hard to get 2 of these and crossfire them instead of getting 2 290Xs..


----------



## manofthem (Nov 5, 2013)

SIGSEGV said:


> what an incredible card...
> it reminds me with gtx 670 back then...
> 
> i wonder if my ek full water block for 290x compatible with 290.. anyone?
> damn, this card is tempting me hard to get 2 of this and crossfire them instead getting 2 290x..



I gotta say we think alike.    I'm pretty sure I'm steered toward the non-X now, and hope to get 2 down the road.


----------



## haswrong (Nov 5, 2013)

Why delay everything to Christmas? I don't do Christmas.. I just want to upgrade my graphics card with a non-ref design.. so worn out..




SIGSEGV said:


> i wonder if my ek full water block for 290x is compatible with 290.. anyone?


Why would it not be? The chip only lacks some TMUs; everything else should be the same, me guesses..


----------



## Shinshin (Nov 5, 2013)

So you knew the price would be $400 all that time..... 

Good card, and I like that AMD is back in the high-end GPU market.
We need AMD to sit down and design a CPU now and do the same against Intel.

The customers will win!


----------



## Totally (Nov 5, 2013)

Looks like the 8800GT all over again. I'm hoping a Sapphire Toxic  version like the 270/280X shows up quickly to take my money.


----------



## Rebel333 (Nov 5, 2013)

Is it possible the 290X's performance will be much better with a custom cooler/heatsink?

It would have been interesting to see a BF4 test too.


----------



## mascotzel (Nov 5, 2013)

Why is this important enough to be listed as a negative on a 2013 high-end GPU (no other current card has it anyway)?


> No analog VGA outputs


----------



## btarunr (Nov 5, 2013)

mascotzel said:


> Why is this important in a 2013 highend GPU to be listed as a negative (not another card has this anyway)?



Because noobs still buy/own $100 21.5-inch 1080p monitors that only have D-Sub.


----------



## Nihilus (Nov 5, 2013)

Wow, slam dunk for AMD. It would be that much better with a custom cooler. Hilarious that team green is trying so hard to win on the top end when AMD crushes them up the middle. A GTX 770 Ti would be a better move than a $700 GTX 780 Ti


----------



## Totally (Nov 5, 2013)

mascotzel said:


> Why is this important in a 2013 highend GPU to be listed as a negative (not another card has this anyway)?



Well, if you're still using an old CRT with no desire or need to upgrade, that is kind of a big negative.


----------



## hardcore_gamer (Nov 5, 2013)

Tom's Hardware replaced the stock cooler with an Arctic Accelero Xtreme III. It gave a 13% improvement in performance (without manual overclocking) and much better noise levels.

AMD could have scored a perfect 10 if they'd equipped this card with a good cooler.


----------



## mastershake575 (Nov 5, 2013)

Damn that's a great price. 

Even a third-party cooler version will only cost $425-450, and that will easily allow clocks of 1050-1100 MHz (this card has SERIOUS potential; I hope AMD's production on this card is high and that it sells well).


----------



## The Von Matrices (Nov 5, 2013)

I like the price, but I'm unhappy that AMD panicked and made a bad decision in not including a "quiet" mode.  It's the same PCB and has the same switch; it's just disabled.  I have a feeling the company didn't like the 290X reviews being confusing with the two modes, so they just ditched the "quiet" mode to make their performance consistent across reviews at the expense of noise.  Now the R9 290 has no "quiet" mode, and unless you want to manually tune fan control (not many people do) you're stuck with the noise.  *49dB all the time is unacceptable.*


----------



## Pumper (Nov 5, 2013)

Pandora's Box said:


> I'd be interested in seeing a review where they take all these cards and overclock them and re-evaluate.



Have a look at the LinusTechTips reviews on YouTube. They always include maxed-out OC benchmarks.


----------



## mascotzel (Nov 5, 2013)

btarunr said:


> Because noobs still buy/own $100 21.5-inch 1080p monitors that only have D-Sub.


That is not what I meant.
Which high-end card in the last two years has had a VGA output?!
None.
Despite this, you only put it as a negative on the Radeon R9 290(X) series.


----------



## btarunr (Nov 5, 2013)

mascotzel said:


> Which high end card in the last two years has had a VGA output?!



GeForce GTX Titan, GTX 690, HD 7990, HD 7970 GHz Edition, etc.


----------



## Totally (Nov 5, 2013)

mascotzel said:


> That is not what I mean.
> Which high end card in the last two years has had a VGA output?!



Just about all of them. Even the Titan does. Ever heard of a DVI->VGA adapter?


----------



## mascotzel (Nov 5, 2013)

That is not a VGA output. An adapter is an adapter.
They say in the review that they received the cards as bulk, not as retail.


----------



## Lionheart (Nov 5, 2013)

Good work AMD







Now we just gotta wait for non-reference cards; I don't want a hair dryer for a video card


----------



## Totally (Nov 5, 2013)

mascotzel said:


> That is not a VGA output.An adapter is an adapter.
> They say in the review that they received the cards as bulk, not as retail.



What I was alluding to is that analog VGA output was passed via the DVI port's analog pins using an adapter. With the R9 series cards, THERE IS NO VGA OUTPUT, PERIOD.


----------



## W1zzard (Nov 5, 2013)

Totally said:


> Ever heard of a DVI->VGA adapter.



You can't use those adapters; they require analog VGA pins on the DVI port, which are not present on the R9 290 series


----------



## Totally (Nov 5, 2013)

W1zzard said:


> You can't use these adapters, they require special analog VGA pins on the DVI port, which are not present on R9 290 Series



I brought that up because the previous guy was saying prior cards lacked VGA, so the R9 shouldn't be an exception. I was trying to subtly say "Bro, just because you don't see it doesn't mean it's not there" about cards like the Titan, 7970, and so on...


----------



## zzzaac (Nov 5, 2013)

I'm craving some non-reference 290s and 290Xs

I wonder.... will you be able to CrossFire a 290 and a 290X?


----------



## btarunr (Nov 5, 2013)

mascotzel said:


> That is not a VGA output.An adapter is an adapter.
> They say in the review that they received the cards as bulk, not as retail.



It is a VGA output. The DVI connectors on those cards feature pins for that VGA output, which an adapter merely rearranges into the 15-pin D-Sub connector layout. That adapter won't work on an R9 290/290X, because those VGA pins are absent on an R9 290/290X. It will work on a GTX Titan. 

The summary of our argument is: 
- People still buy/use cheap 21.5-inch 1080p monitors that only feature VGA input
- Those people use DVI to D-Sub adapters
- Those adapters work on every other high-end graphics card (launched in the past 2 years)
- Those adapters will not work on R9 290/290X
- That warrants a demerit


----------



## Over_Lord (Nov 5, 2013)

Radeon 290 is - - - WOW!

Waiting for AIBs to work their magic with an aftermarket cooling solution


----------



## The Von Matrices (Nov 5, 2013)

Nihilus said:


> hilarious that team green is trying so hard to win on the top end when amd crushes them up the middle.  a gtx 770 ti would be a better move than a $700 gtx 780 ti



AMD knows that NVidia already has GK104 pushed to the limit in GTX 770; there's no way to make a higher performance GK104 card.  A GTX 770Ti would have to be a GK110 cut down even more than a GTX 780, and I'm not sure NVidia wants a 4th GK110 SKU.  This would be a repeat of the GTX 260, which was such a large chip for such a low price that investors speculated that NVidia was losing money on each card sold.  I think all NVidia can do is cut the price of GTX 780 even more and just cede the $400 range to AMD.  A $449 GTX 780 would be reasonable considering that most GTX 780 cards are custom overclocked variants that perform better than the stock GTX 780 benchmarks indicate.


----------



## Protagonist (Nov 5, 2013)

Wow, go AMD. Now I feel like a tool; Nvidia used me, and I was a fool to fall for it. All this just shows how much the GK104 was a mid-range chip and not high end, but AMD did not have a clear enough response at the time. I still feel aaaarrrrrr... will wait for 20 nm. Nvidia should price the 780 Ti @ $499 as it's not that efficient either, the 780 @ $399, and the 770 @ $299 where it was always meant to be. That's the way it was meant to be played.


----------



## The Von Matrices (Nov 5, 2013)

hardcore_gamer said:


> Tomshardware replaced the stock cooler with an Arctic Accelero Xtreme III. It gave 13% improvement in performance (without manual overclocking) and much better noise levels.
> 
> AMD could have scored a perfect 10 if they've equipped this card with a good cooler.



The reviews are confusing in this regard, and W1zzard didn't mention it.  Unlike the 290X, *for the 290 there is basically no performance to be gained by just replacing the stock cooler.  The card is not throttling at stock speeds* as long as you are using a case with at least moderate airflow; of course really hot climates will experience different conditions.  You have to overclock to get performance gains, unlike the 290X.  This is the reason the card nearly matches the 290X.  Put a custom cooler on the 290X or increase its fan speed and the performance gap will widen at stock clock speeds.

AMD, at the last minute, increased the default fan speed from 40% (290X's "quiet" mode) to 47% (close to 290X's "uber" mode) through the 13.11 beta8 drivers.  This stops throttling in almost all conditions and increases the card's performance at the expense of insane noise.  The Tom's Hardware article tested at the 40% fan speeds, which causes the card to throttle, and that is the main reason they can claim increased performance through a better cooler.  They must have a really hot bench or a high stock voltage card too, because no other review I have yet seen said that card throttled at the 47% fan speed.  For anyone using the default 47% fan speed, then there should be no performance increase at stock settings by using a custom cooler.


----------



## HammerON (Nov 5, 2013)

Very impressive card(s) for the price
This will change the price structure even more


----------



## the54thvoid (Nov 5, 2013)

Although the noise is unacceptable for their top-tier cards (a bit naive imo), the price/performance is awesome. It's taken the attractive price of the 780 out of the equation. The only relevant problem is the reference design. I couldn't buy a ref card - just too noisy. I think if you're a water cooler, AMD just gave you no option. I wonder if NV will rethink the pricing of the upcoming 780 Ti? If not, they'll suffer for their arrogance.


----------



## Crap Daddy (Nov 5, 2013)

Fantastic performance for $400. But as it is, this card is not something you would want to put in your machine. It is simply unacceptable to have this level of noise in the house. We'll have to wait and see aftermarket cards and the real price asked for a proper 290.


----------



## HammerON (Nov 5, 2013)

It will be interesting to see if Nvidia stays with the ~$700.00 for the 780Ti...


----------



## symmetrical (Nov 5, 2013)

The Von Matrices said:


> The reviews are confusing in this regard, and W1zzard didn't mention it.  Unlike the 290X, *for the 290 there is basically no performance to be gained by just replacing the stock cooler.  The card is not throttling at stock speeds* as long as you are using a case with at least moderate airflow; of course really hot climates will experience different conditions.  You have to overclock to get performance gains, unlike the 290X.  This is the reason the card nearly matches the 290X.  Put a custom cooler on the 290X or increase its fan speed and the performance gap will widen at stock clock speeds.
> 
> AMD, at the last minute, increased the default fan speed from 40% (290X's "quiet" mode) to 47% (close to 290X's "uber" mode) through the 13.11 beta8 drivers.  This stops throttling in almost all conditions and increases the card's performance at the expense of insane noise.  The Tom's Hardware article tested at the 40% fan speeds, which causes the card to throttle, and that is the main reason they can claim increased performance through a better cooler.  They must have a really hot bench or a high stock voltage card too, because no other review I have yet seen said that card throttled at the 47% fan speed.  For anyone using the default 47% fan speed, then there should be no performance increase at stock settings by using a custom cooler.



All true and good, but jesus 49dB 

But safe to say once they let their AIB partners come out with some nice dual or triple fan coolers, it's going to be a hit.


----------



## RCoon (Nov 5, 2013)

After all of this and the eventual release of AIB partner cards, I'm more interested in what NVidia intends to do to save themselves. I somehow doubt the 780ti is going to solve the problem of the 290 and 290X, as they're in different price leagues. It's either pull out another ridiculous boost/ti card for that price point (which I personally see as very likely), or bring down prices by a substantial amount.
It's also weird how AMD punched themselves in the balls with the 290, in terms of reference cards. Anyone buying a reference card without water would be stupid to buy a 290X over a 290.


----------



## N3M3515 (Nov 5, 2013)

hardcore_gamer said:


> Tomshardware replaced the stock cooler with an Arctic Accelero Xtreme III. It gave 13% improvement in performance (without manual overclocking) and much better noise levels.
> 
> AMD could have scored a perfect 10 if they've equipped this card with a good cooler.



Lol, that's more or less what I was expecting from a non-reference.


----------



## Mathragh (Nov 5, 2013)

I think you might've wrongly copied the maximum power use again.

Apart from that, thanks for the review! I think it's telling that they changed the fan settings at the last moment. They really seem determined to push Nvidia out of its comfortable position, forcing them to lower their margins by a lot. I can only hope AMD doesn't incur too big of a margin cut themselves this way. 
I suppose they at least have the advantage of selling a smaller piece of silicon, something Nvidia can never properly compete on with their huge Titan chip.


----------



## Lionheart (Nov 5, 2013)

I need my fellow TPUers' advice.. Does anyone think an i7 940 at my current speed would bottleneck an R9 290? I game at 1080p... & yes, I'm aware this card is made for higher resolutions, but 1080p is still good enough for me, plus I would like to max out new demanding games in the future


----------



## sweet (Nov 5, 2013)

the54thvoid said:


> Although the noise is unnacceptable for their top tier cards,  bit naive imo, the price performance is awesome. It's taken the attractive price of the 780 out the equation. Only relevant problem is the reference design. I couldnt buy a ref card - just too noisy. I think if you're a water cooler, AMD just gave you no option. I wonder if NV will rethink the pricing of the upcoming 780ti? If not, they'll suffer for their arrogance.



Or you can spend ~$50 and be happy with this:
http://www.ebay.com.au/itm/160486152943?ssPageName=STRK:MEWAX:IT&_trksid=p3984.m1423.l2649
A guy on overclock.net tested a 290X with the Gelid Icy rev 2: 52°C at full load at stock


----------



## d1nky (Nov 5, 2013)

Can the shaders be unlocked on these cards, just like on older series?


----------



## happita (Nov 5, 2013)

It looks like AMD just caused a price war. I can see it now: all R9 cards will be on backorder for 2 months or more because they probably won't be able to meet demand (or will willingly NOT meet demand so they can drive prices on these higher).


----------



## the54thvoid (Nov 5, 2013)

This is the Tom's Hardware link for the Accelero cooler. 

http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html

TBH, hearing that fan for the first time - jesus, that's an awful 'card'. AMD, wtf were you thinking  

We need some dBA numbers for a good comparison, but the Accelero definitely helps; at 12 V it isn't 'quiet', though, and given the heat given off by the card, it will push air coolers to the limit I think.

EDIT:  They have this info 



> Even at 7 V, the upgraded Radeon R9 290 is barely louder at prolonged full load than the stock versions are at idle



and they say this: 



> So when can we expect the custom-built cards to address our concerns? We hear they’re being held back until more is known about GeForce GTX 780 Ti—and this is entirely plausible.


----------



## Shihab (Nov 5, 2013)

Question: What's the ambient temperature of the environment those cards were tested at?


----------



## Mathragh (Nov 5, 2013)

Lionheart said:


> I need my fellow Tpuer's advice.. Does anyone think an i7 940 at my current speed would bottleneck a R9 290? I game at 1080p... & yes I'm aware this card is made for higher resolutions but 1080p still good enough for me plus I would like to max out new demanding games in the future



It might bottleneck it in some games that are really single-threaded, but judging by benchies from modern-day games like Crysis 3 and Battlefield 3, I'd reckon that CPU will be relevant for quite a while yet. Even more so if Mantle takes off, which would decrease the load on that CPU even more (supposedly, at least).

I think you're fine!

Edit: so probably, from this moment on, the relative performance of your CPU in the newest games should actually go up as more engines become multithreaded and more efficient (up to a certain point, at least).


----------



## the54thvoid (Nov 5, 2013)

Shihabyooo said:


> Question: What's the ambient temperature of the environment those cards were tested at?



Probably not as hot as Sudan...


----------



## Fourstaff (Nov 5, 2013)

Did this card just obsolete all the cards more expensive than it?


----------



## Mathragh (Nov 5, 2013)

Fourstaff said:


> Did this card just obsoleted all the cards more expensive than it?



Probably only with the current coolers on both the 290X and non-X.
Since the 290X was shown to continuously throttle, and thus often run at clocks at least 10% lower, I'd say that once there are R9 290s and 290Xs on the market with custom coolers, you'll once again see a bigger difference.


----------



## alwayssts (Nov 5, 2013)

The Von Matrices said:


> I like the price, but I'm unhappy that AMD panicked and made a bad decision in not including a "quiet" mode.  It's the same PCB and has the same switch; it's just disabled.  I have a feeling the company didn't like the 290X reviews being confusing with the two modes, so they just ditched the "quiet" mode to make their performance consistent across reviews at the expense of noise.  Now the R9 290 has no "quiet" mode, and unless you want to manually tune fan control (not many people do) you're stuck with the noise.  *49dB all the time is unacceptable.*



I think it's more that they found exactly the heat dissipation required (and, by extension, power draw) across a sample range to keep the 290 performing at the level of the 780. It is likely intentional that it is similar to the level quiet mode was tuned for on the 290X. This is supported by the simple calculation of the 47/40 fan-speed ratio (+17.5%) versus the shader×clock difference (+16%). It performs 1% better than the 780, which backs up the reality that the chip is heat limited.
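That proportionality is easy to sanity-check with quick arithmetic. A sketch, where the 47%/40% fan duties come from this thread, and the 2816/2560 shader counts and 1000/947 MHz boost clocks are the published 290X/290 specs:

```python
# Sanity check of the fan-speed vs. throughput proportionality argued above.
# 290 default fan duty: 47%; 290X "quiet" mode fan duty: 40%.
fan_ratio = 47 / 40                              # 1.175 -> +17.5% fan speed

# Shader*clock throughput gap:
# 290X = 2816 shaders @ 1000 MHz boost, 290 = 2560 shaders @ 947 MHz boost.
throughput_ratio = (2816 * 1000) / (2560 * 947)  # ~1.162 -> +16%

print(f"fan speed: +{(fan_ratio - 1) * 100:.1f}%")
print(f"shader*clock: +{(throughput_ratio - 1) * 100:.1f}%")
```

The two ratios do land within a couple of percent of each other, which is what makes the "tuned to a heat budget" reading plausible.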


----------



## the54thvoid (Nov 5, 2013)

On a more topical note (I'm sure folk can cherry pick; come on, you Reds!), a lot of reviewers are attacking what AMD have done with the last-minute 47% fan increase. They're saying AMD seemed to have panicked at the 780 price cut and made the 290 give the same performance as its flagship at the sacrifice of noise. (Guru3D must test in another room - they find the noise 'average'.) I'd agree with what Anand says: if you bought this as a vanilla card with no intention of customising it, you'd be massively unhappy with the noise. They don't even recommend it on that alone, despite saying it's turned the price/performance war on its head with what is stellar performance at such a low price.


----------



## 1d10t (Nov 5, 2013)

What I see is absolute winning...




With a lower power requirement than the 7970 GE, the R9 290 is only 40% slower than 7970 CF/7990.
I'll take two, please 



SIGSEGV said:


> what an incredible card...
> it reminds me with gtx 670 back then...
> 
> i wonder if my ek full water block for 290x is compatible with 290.. anyone?
> damn, this card is tempting me hard to get 2 of this and crossfire them instead getting 2 290x..



Trust me, it'll fit perfectly. And what did I say to you about R9 290 CF the other day


----------



## hardcore_gamer (Nov 5, 2013)

Mathragh said:


> I suppose they atleast have the advantage of selling a smaller piece of silicon, something which Nvidia can never properly compete on with their huge titan-chip.



Yep. Titan's die is ~26% larger than the 290X's. AMD managed to achieve Titan's performance with a smaller chip. They can afford to undercut Nvidia's pricing.
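For what it's worth, the commonly quoted die areas (~561 mm² for GK110/Titan, ~438 mm² for Hawaii) put the gap slightly higher, closer to 28%; a quick check, treating those figures as approximate rather than official:

```python
# Rough die-area comparison using commonly quoted figures
# (approximations, not official AMD/Nvidia data).
gk110_mm2 = 561.0   # GK110 (GTX Titan)
hawaii_mm2 = 438.0  # Hawaii (R9 290/290X)

size_gap = gk110_mm2 / hawaii_mm2 - 1  # ~0.28 -> GK110 roughly a quarter larger
print(f"GK110 is ~{size_gap * 100:.0f}% larger than Hawaii")
```

Either way, the cost argument holds: the smaller die gives AMD more pricing headroom per chip.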


----------



## zsolt_93 (Nov 5, 2013)

Sooo, we will have a 770 Ti and a 780 Ti incoming from what I see. Not sure what will happen to the normal 780, though; it might get demoted to 770 Ti and get a bit of an OC to beat this 290 thing. Surely Nvidia has been given a lesson. The Nvidia cards are not bad by any means, but they seem totally out of place price-segment-wise. They need to bring out at least 2 new cards and drop prices 1-2 tiers on the older ones to stay with AMD, or they are in for some trouble.


----------



## alwayssts (Nov 5, 2013)

Just out of curiosity... what is the unofficial TDP of the 780?

Is it 250 W + 10% (275 W) on most cards?  

I ask because *what matters is the max TDP and efficiency within that TDP.* 

As it sits, according to Wiz's tests it looks like the 780 is roughly 7-10% more efficient than the 290 (at 1080p), with a proportionally lower max TDP (the 290's is 300 W), hence they should be relatively similar overall when not limited by heat. In essence, $100 vs. 20-30 W at load (at 'realistic' resolutions/settings not limited by ROPs/buffer) when you have a decent cooler and overclock it.

I would not be surprised if this changes with a 780 refresh that has a similar 300 W TDP as well... that would make sense. Make the thing boost to ~1085-1112/6008 stock, i.e. 1.5x a 680 (which would match/beat the 290X), and leave room to max out the clock/RAM (another ~10%).

These games revolving around max TDPs and the full potential of chips being held back have got to stop. The bullshit keeps getting deeper and people perpetually feel screwed.

The fact AMD is using these crap coolers almost seems like a reverse-Tahiti play. Instead of limiting the chip more and more through voltage/binning/whatever to let it seep into lower markets, it almost seems they are limiting the cards by heat in prep to unleash decently-cooled models (at full potential at stock, let alone overclocked/volted) against the one or two new GK110 models. Pretty conspiracy-theorist, I suppose, but that is sure what it looks like.


----------



## Primalz (Nov 5, 2013)

That pricing is just excellent! So it's going for US $399, which means here in Australia they should be about $550. The cheapest GTX 780 is going for $700 AU, and the 290X is the same price. So is it worth paying an extra $150 for potentially 5 fps...


----------



## The Von Matrices (Nov 5, 2013)

I think it's worth noting that, for all the people complaining about Nvidia ripping people off with their pricing, at this point the people who got most ripped off are those that paid $549 for an R9 290X two weeks ago.  At least Titan and the 780 could sustain their prices for months after their respective releases.


----------



## hardcore_gamer (Nov 5, 2013)

The Von Matrices said:


> At least Titan and the 780 when could sustain their prices for months after their respective releases.



I agree. A sustained rip-off is way better than a short-term rip-off.


----------



## the54thvoid (Nov 5, 2013)

The Von Matrices said:


> I think it's worth noting that for all the people complaining about NVidia ripping people off with their pricing, at this point the people who got most ripped off are those that paid $549 for a R9 290X two weeks ago.  At least Titan and the 780 when could sustain their prices for months after their respective releases.



This is what annoyed Anand.  The 290 driver revision made the 290X redundant.  If they had left the 290 at 40% fan speed AMD would have a proper product stack.  As it is you have two cards that go toe to toe, except one is noisier and cheaper.



hardcore_gamer said:


> I agree. Sustained rip-off is way better than a short-term rip-off.



Titan was not a rip off unless you couldn't afford one.  Money is money; it is there to be used.  If somebody has enough for a product and can afford it without detriment to their household budget, it is not a rip off if there is no competing, compelling alternative.

When the 780 was released, Titan as a gaming card not only became a rip off, it also became irrelevant.
Until the 290X release, the 780 could only be bested (single GPU) by a Titan.  And even then, custom 780s (Classifieds) could OC and beat an OC'd Titan.
When the 290X release happened, 780 prices dropped to compete. 
With the release of the 290, the 290X became irrelevant.  You could also argue it's a rip off.  Either way, both AMD cards need custom solutions to realise their potential.

But to round it up fairly... if you buy a Titan now, for gaming, you really are crazy.  And unless the 780 Ti works miracles and impressively beats the custom-cooled 290Xs (or 290s), it is a rip off too at its suggested price.

But again, let's state: a rip off is a product that has no peer.  Any price can be charged - you pay for that level of performance.  If folk can't understand that basic premise of a free economy, you ought to live in a hippy commune.  
On the same hand, if you put yourself in debt for a graphics card, you're also probably needing time in that very same commune.


----------



## Phobia9651 (Nov 5, 2013)

hardcore_gamer said:


> I agree. A sustained rip-off is way better than a short-term rip-off.



LOL, +1 to you sir!

I'm still curious about what is going to happen to Never Settle. 
Nvidia actually has a decent bundle with Splinter Cell: Blacklist, AC4, and Arkham Origins, while AMD has none at all.


----------



## alwayssts (Nov 5, 2013)

1d10t said:


> What i see is an absolute winning...
> 
> http://tpucdn.com/reviews/AMD/R9_290/images/power_maximum.gif




With complete love and respect, erm... 1d10t... I hope you enjoy playing Furmark to achieve those lower power levels.

I made that mistake at first as well.  Gaming (which rarely maxes out a card 100% like Furmark does) is 7.5-10% worse according to that same page.


----------



## micropage7 (Nov 5, 2013)

No analog VGA output as a minus point? Since the card is $399, anyone who buys it should have DVI at least.


----------



## pr0fessor (Nov 5, 2013)

Nice card. It sure is a better buy than the supposedly 15-percent-better R9 290X, with its extra heat, noise, and power consumption. If I didn't have two 7950 cards, I would be very interested in two custom cards of this model. But the upgrade is not worth it, and DX11.2 doesn't matter for my DX11.1 system. I guess...


----------



## Frick (Nov 5, 2013)

Mussels said:


> nice speed, but doesnt this make the 290x redundant?



Yeeah, it feels like this is the card the 290x should have been.. With a better cooler.

I find it all sorts of hilarious though.


----------



## the54thvoid (Nov 5, 2013)

pr0fessor said:


> Nice Card. It sure is better than the 15 per cent better card R9 290X with more heat, noise and power consumption. If I hadn't two 7950 cards, I would be very interested in two custom cards of this number. But the upgrade is not worth it and the DX11.2 doesn't matter for my DX11.1 System. I guess...



I'd say it is.  One of these will match your dual 7950's.  Two will blow them away.  Go on, help the global economy!


----------



## Cool Vibrations (Nov 5, 2013)

The Von Matrices said:


> I think it's worth noting that, for all the people complaining about Nvidia ripping people off with their pricing, at this point the people who got most ripped off are those who paid $549 for an R9 290X two weeks ago. At least the Titan and the 780 could sustain their prices for months after their respective releases.



Do you own a GTX 780/Titan? As an ex-Titan owner, I can confirm this is true. However, you cannot deny that both of those cards were absolutely overpriced. If I had known AMD had this ace up their sleeve, I wouldn't have bought anything at all. Anyone who would justify purchasing a GTX 780/780 GHz/780 Ti at this point in time, at their current price points, is just being delusional. There is no way I'll be buying any Nvidia card in the near future after being burned like that.


----------



## pr0fessor (Nov 5, 2013)

the54thvoid said:


> I'd say it is.  One of these will match your dual 7950's.  Two will blow them away.  Go on, help the global economy!



No need. I will consider it later, when the custom cards are out. Battlefield 4 runs quite nicely on my system, and I don't have a 4K monitor.

On that note, why is there no BF4 benchmark in the charts when the game has already been out for a few days?


----------



## RCoon (Nov 5, 2013)

Cool Vibrations said:


> Do you own a GTX 780/Titan? As an ex-Titan owner, I can confirm this is true. However, you cannot deny that both of those cards were absolutely overpriced. If I had known AMD had this ace up their sleeve, I wouldn't have bought anything at all. Anyone who would justify purchasing a GTX 780/780 GHz/780 Ti at this point in time, at their current price points, is just being delusional. There is no way I'll be buying any Nvidia card in the near future after being burned like that.



I have no intention of swapping out my 780 OC for a 290 or 290X that performs only marginally better and will heat up every component inside my ITX case. I wouldn't buy another 780 now, but I'm not going to renounce god and tell all the sinners who bought Titans and 780s that they should burn their GPUs on the pyre and buy AMD instead.

You buy whatever the best card is at the time for your budget. For many people that was the Titan or 780.


----------



## DarkOCean (Nov 5, 2013)

When the 780 dropped to $500, I told myself it would be awesome for the 290 to be $400. Now even I want one of these.
Now NV needs to drop the 780 price again, lol.


----------



## H82LUZ73 (Nov 5, 2013)

I must say I was right: this is the card to get over the 290X... but wait, I heard the ASUS R9 290X BIOS will flash onto these cards and up the volts to the GPU/memory, equaling the 290X in overclocks and memory performance. At $399, what a steal.


----------



## RCoon (Nov 5, 2013)

H82LUZ73 said:


> I must say I was right: this is the card to get over the 290X... but wait, I heard the ASUS R9 290X BIOS will flash onto these cards and up the volts to the GPU/memory, equaling the 290X in overclocks and memory performance. At $399, what a steal.



From OCUK:



> Quote:
> Originally Posted by OldCoals
> Can all the cards be flashed with the Asus BIOS?
> 
> Yes.


----------



## buggalugs (Nov 5, 2013)

axis007 said:


> That pricing is just excellent! So it's going for US $399, which means here in Australia they should be about $550. The cheapest GTX 780 is going for $700 AU, and the 290X is the same price. So is it worth paying an extra $150 for potentially 5 FPS...



Hi, you need to shop around more. There are GTX 780s for $649, $639, and one listed for $599, although that one is on pre-order (@PCCG).

Anyway, I agree the 290X/290 has great pricing. We will probably pay around $629 for the 290X and $479 for the 290 here.

Wow, and some people say AMD is not competitive. Nvidia has a problem in that, because they overcharged for the 780 and Titan, it's not easy for them to just drop the price by such a large amount to compete. It's not a good look for cards that have only been out a couple of months to drop in price that much; it makes it obvious they were charging too much if they can afford to sell them much cheaper.

I think all they can do is ditch the Titan completely, drop the price of the 780, and bring out the 780 Ti as a temporary solution at the high end... then start fast-tracking new GPUs ASAP. The Nvidia 7-series might have a very short lifespan.


----------



## LAN_deRf_HA (Nov 5, 2013)

Now this one I like better, mainly because at that price you can get a top-end cooler (and it really needs to be top-end) and stop the throttling, so in theory you'd have better performance than a 290X for less money.

On a related note, I'm not so sure aftermarket coolers will save the day here. When you max out the fan on the cooler that comes with a custom card, any of them, on any top-end card, it usually does not cool as well as the stock blower maxed out. I remember my 580 could get down to 62°C at 1000 MHz on the core with the stock blower at 100%; that's pretty incredible, and pretty much only a CLC mod or ACX3 could have matched it. Custom-card coolers are considered better because they run cooler for less noise, but they don't have the same top-end performance as the blowers. The one saving grace might be that if the throttle point is really high, they won't have to work wonders, just get it into the 80s.


----------



## Mussels (Nov 5, 2013)

btarunr said:


> Because noobs still buy/own $100 21.5-inch 1080p monitors that only have D-Sub.



as long as it's compatible with DVI-VGA adaptors (and comes with one) then i don't see it as a negative at all.


edit: saw on a later page that they don't support it.


----------



## claylomax (Nov 5, 2013)

Mussels said:


> as long as its compatible with DVI-VGA adaptors (and comes with one) then i dont see it as a negative at all.



It's not, and this is what so many people misunderstand: you could attach an adaptor, but the card cannot output analog VGA.

Btarunr explained it better:

- People still buy/use cheap 21.5-inch 1080p monitors that only feature VGA input
- Those people use DVI to D-Sub adapters
- Those adapters work on every other high-end graphics card launched in the past 2 years
- Those adapters will not work on the R9 290/290X
- That warrants a demerit


----------



## SIGSEGV (Nov 5, 2013)

urza26 said:


> LOL, +1 to you sir!
> 
> I'm still curious about what is going to happen to Never Settle.
> Nvidia has actually a decent bundle with Splinter Cell: Blacklist, AC4 and Arkham Origins, while AMD has none at all.



Yes, currently AMD is only offering a limited BF4 bundle on their 290X card. However, I'm sure Never Settle will come back with various AAA titles.


----------



## 1d10t (Nov 5, 2013)

alwayssts said:


> With complete love and respect erm...1d10t...I hope you enjoy playing furmark to achieve those lower power levels.
> 
> I made that mistake at first as well.  Gaming (which rarely maxes a card 100% like furmark) is 7.5-10% worse according to that same page.



Furmark? So '90s. I don't even use it. Why bother?


----------



## RCoon (Nov 5, 2013)

SIGSEGV said:


> yes. currently amd only offering limited BF4 bundle on their 290x card. However, i'm sure that never settle will come back with various AAA title.
> 
> http://s16.postimg.org/drmpyqd05/Thief_AMD_Gaming_Evolved_600x333.png



I don't think the new Thief is a very attractive game to offer in a bundle. So far the game has been heavily criticised, and the devs have even removed things from it because the feedback was a colossal failure. I expect the Mantle-adopting companies' games to show up in the bundle, Oxide-engine games and such.


----------



## Eroticus (Nov 5, 2013)

RCoon said:


> I don't think the new Thief is a very attractive game to offer in a bundle. So far that game has been heavily criticised, and even the devs have removed stuff from the game because all their feedback was a colossal failure. I expect those Mantle-adopting companies to see their games drop in the bundle, Oxide engine games and such.



404$: Not Found... wrong topic ~.~ go back to your area.

Ohh, Batman, soo awesome a game, LMAO... I'd prefer to get BF4 over all 3 Nvidia games ~.~


----------



## ChristTheGreat (Nov 5, 2013)

Umm, I guess this is the card I want!


----------



## qubit (Nov 5, 2013)

With a price/performance ratio like this, I reckon this card is gonna fly off the shelves. Good, time to squeeze NVIDIA.


----------



## BarbaricSoul (Nov 5, 2013)

qubit said:


> With a price/performance ratio like this, I reckon this card is gonna fly off the shelves. Good, time to squeeze NVIDIA.



You're probably right. I foresee a lot of people buying 290X cards because of the 290 being sold out; that is, unless the non-ref 290X cards really take off with better coolers. But then again, non-ref 290 cards are also coming (no proof, just common sense).


----------



## qubit (Nov 5, 2013)

BarbaricSoul said:


> You're probably right. I foresee a lot of people buying 290X cards because of the 290 being sold out; that is, unless the non-ref 290X cards really take off with better coolers. But then again, non-ref 290 cards are also coming (no proof, just common sense).



I'll bet NVIDIA is dreading those non-ref 290s!  Just imagine, all that performance at a silly price without the noise and the throttling. I can't wait.

Maybe a bit of stiff competition will put an end to grossly overpriced $1000 cards from NVIDIA?


----------



## BarbaricSoul (Nov 5, 2013)

qubit said:


> Maybe a bit of stiff competition will put an end to grossly overpriced $1000 cards from NVIDIA?



I truly hope so. Damn, $700+ for a video card, just to play games? I have bought entire cars for less than that. $500 is steep enough.


----------



## brandonwh64 (Nov 5, 2013)

Looks to be my next card purchase!


----------



## the54thvoid (Nov 5, 2013)

SIGSEGV said:


> Yes, currently AMD is only offering a limited BF4 bundle on their 290X card.





Eroticus said:


> 404$ Not found ... wrong topic ~.~ go back to ur area.
> 
> Ohh batman soo awesome game LMAO ... i prefer to get BF4 on all 3 nVidia games ~.



This is the issue I harped on about before. BF4 is *NOT FREE* with 290X cards, at least not in the UK. It's not bundled. You pay £427-485-ish for a plain boxed R9 290X, or £505-515-ish for the BF4 edition. The card is the same, no frills. That's a premium of £78 worst case, or £30 if you're silly enough to pay maximum dosh for either variant. That's not free.

Jeeeeez.
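
For anyone checking the maths, that premium range is just the spread between the two price bands (a throwaway sketch; the GBP figures are the rough "ish" street prices quoted above, not exact listings):

```python
# Rough UK street prices for the R9 290X quoted above (GBP, "ish" figures)
plain_min, plain_max = 427, 485    # plain boxed card
bundle_min, bundle_max = 505, 515  # BF4 edition

# Compare like for like: cheapest vs cheapest, dearest vs dearest
worst_case_premium = bundle_min - plain_min  # cheapest BF4 edition vs cheapest plain
max_dosh_premium = bundle_max - plain_max    # dearest BF4 edition vs dearest plain

print(worst_case_premium, max_dosh_premium)  # 78 30
```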


----------



## the54thvoid (Nov 5, 2013)

Double post required to add, in the spirit of fairness, that OcUK have a BF4 MSI 290X for only £30 above their base model, at £469.


----------



## RCoon (Nov 5, 2013)

This is what Gibbo claims from OCUK:

_I got 1200Mhz core then with the reference cooler with ZERO throttling. All benchmarkers run with fans at 100% when going for maximum overclock, as such with the fan at such speeds and card clocked to 1200MHz it remains sub 85c. At stock clocks it is very easy to keep an R290X from throttling, you just set a GPU temp limit of 90c and set the fan maximum fan speed of 60%, this keeps it both quiet/reasonable and prevents throttling. _

W1z, can you confirm this, or is he exaggerating to sell more plasma hoovers? I'm fairly certain a 60% fan speed is not "reasonably" quiet.
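
To make Gibbo's claim concrete, here's a toy model of the throttling behavior being discussed: clocks stay at full boost until the user-set temperature limit is reached, then shed in proportion to the overshoot. All numbers are illustrative, and the proportional rule is my own simplification, not how PowerTune actually works:

```python
# Toy model of temperature-limit throttling, not AMD's actual algorithm.
TEMP_LIMIT_C = 90   # user-set GPU temperature limit (per Gibbo's settings)
BASE_CLOCK = 662    # hypothetical floor clock, MHz
BOOST_CLOCK = 947   # R9 290's maximum boost clock, MHz

def effective_clock(gpu_temp_c: float) -> int:
    """Return the clock (MHz) this toy model runs at for a given temperature."""
    if gpu_temp_c < TEMP_LIMIT_C:
        return BOOST_CLOCK  # under the limit: full boost
    # over the limit: shed clocks in proportion to the overshoot (toy rule)
    overshoot = gpu_temp_c - TEMP_LIMIT_C
    return max(BASE_CLOCK, BOOST_CLOCK - int(overshoot * 50))

print(effective_clock(85))  # 947 -> no throttling below the limit
print(effective_clock(94))  # 747 -> clocks shed once the limit is exceeded
```

The point of the sketch: a higher fan speed keeps the GPU under the limit longer, so the card spends more time at full boost, which is exactly why the 290's 47% fan profile throttles less than the 290X in Quiet mode.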


----------



## Zen_ (Nov 5, 2013)

Really impressive results and prices on both 290 cards, but too bad AMD cheaped out on the heatsink to hit a price point, or whatever the reason was. How long will it take for non-reference cards to come to market? If you look at the PCB layouts of both 290 cards and a 7970, they look almost the same, so it couldn't be that hard for board partners to slap heatsinks they already have onto the 290. Or do they wait until they can produce a custom PCB as well?


----------



## BarbaricSoul (Nov 5, 2013)

Zen_ said:


> Really impressive results and price on both the 290 cards, but too bad AMD cheaped out on the heatsink to hit a price point, or whatever the reason is for it. How long will it take for non-reference cards to come to market? If you look at the PCB layout of both 290 cards and a 7970 they look almost the same, so it couldn't be that hard for card partners to slap heatsinks they already have on the 290. Or do they wait until they can produce a custom PCB as well?



Don't know about the 290, but non-ref 290X cards should be available towards the end of next month.

http://www.techpowerup.com/forums/showthread.php?t=193358

And considering 290X water blocks are compatible with the 290, I'd bet the non-ref heatsinks will also be compatible.

http://www.techpowerup.com/forums/showthread.php?t=193891


----------



## Shihab (Nov 5, 2013)

the54thvoid said:


> Probably not as hot as Sudan...



That's what worries me: if the card _idles_ at 47°C in a (supposedly) cold environment, what should people in areas like mine expect? Throttling in 2D mode? :|


----------



## Mussels (Nov 5, 2013)

Shihabyooo said:


> That's what worries me, if the card _idles_ at 47c in a -supposedly- cold environment, what should people in areas such as mine expect? Throttling in 2D mode? :|



oh please. down here it hits 45C in summer and then your card turns out to be spiders.


----------



## BarbaricSoul (Nov 5, 2013)

Mussels said:


> and then your card turns out to be spiders.



huh?


----------



## Mussels (Nov 5, 2013)

BarbaricSoul said:


> huh?



everything is spiders in australia.


----------



## qubit (Nov 5, 2013)

Mussels said:


> everything is spiders in australia.



Black widow spiders?


----------



## Pandora's Box (Nov 5, 2013)

Well it seems once you overclock the 290/290X/GTX 780 the 780 comes out on top:

AMD Radeon R9 290 Unboxing & Review - YouTube


----------



## RCoon (Nov 5, 2013)

Pandora's Box said:


> Well it seems once you overclock the 290/290X/GTX 780 the 780 comes out on top:
> 
> AMD Radeon R9 290 Unboxing & Review - YouTube



That's because the 780 doesn't throttle nearly as much; it has a more expensive, better reference cooler.


----------



## Big_Vulture (Nov 5, 2013)

So this would have been the Radeon HD 8950 if the stupid new naming scheme hadn't come along, yeah? Good to see AMD back in business with budget prices; I hope Nvidia fans regret paying twice what their cards are actually worth. Nvidia is very greedy!

Oh, never mind, I found the explanation on another site:

290x = HD8970
290 = HD8950
280x = HD8870
280 = HD8850
270x = HD8770
270 = HD8750
260x = HD8670
260 = HD8650


----------



## jormungand (Nov 5, 2013)

Great review once again, W1zz! Now waiting for non-reference cards; let's see what Asus, Gigabyte, and MSI have prepared to cool and OC this card properly. I don't mind if they come in GHz versions for $10-$20 more; that would be a great offer considering 780 prices. 94°C doesn't interest me at all, nor does any card above 81°C.
Nvidia, now is when the real fight begins!


----------



## Durvelle27 (Nov 5, 2013)

Big_Vulture said:


> So this would have been the Radeon HD 8950 if the stupid new naming scheme hadn't come along, yeah? Good to see AMD back in business with budget prices; I hope Nvidia fans regret paying twice what their cards are actually worth. Nvidia is very greedy!
> 
> Oh, never mind, I found the explanation on another site:
> 
> ...



You do know that the HD 8000 series were rebadges, and only for OEMs? The R7/R9 cards are mostly new chips, except for the R9 280X and R9 270X.


----------



## scmpj (Nov 5, 2013)

Don't know why this 290 doesn't appear in the main search on Newegg, but here is the link to what I think is the last 290 available online.

PowerColor AXR9 290 4GBD5-MDH/OC Radeon R9 290 4GB...


----------



## Steevo (Nov 5, 2013)

So far in this thread I have read:

Nvidia fanbois angry because their precious Nvidia has competition, which has made their purchase of a drastically overpriced Titan seem foolish to the tune of almost $600. So now they are crying about how the 290 has made the "X" redundant and stupid. Congrats on catching that; did you not notice it when the 780 and Titan were overpriced? Or do the tears help you see better?

People complaining about AMD tweaking drivers and fan speeds, because apparently they aren't allowed to, since that makes it not fair!!!

People complaining about the 49 dBA of noise a massively underpriced card makes when it shatters the price/performance ratio by 30%.

People complaining about the lack of VGA? I threw mine away; I'd only keep a VGA-ready card around for older computers or for when I played with BIOSes on cards.

Spend the $399 plus $60 for an awesome cooler and love it, or underclock it and enjoy the performance and silence.
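
That price/performance claim is easy to sanity-check (a rough sketch; the dollar figures and the assumption of performance parity come from numbers floated in this thread, not from review data):

```python
# Assume the R9 290 and the post-cut GTX 780 perform roughly the same,
# so only price separates them. Prices are the ones floated in this thread.
prices = {"R9 290": 399, "GTX 780": 499}

# Performance-per-dollar advantage of the 290 over the 780
advantage = prices["GTX 780"] / prices["R9 290"] - 1
print(f"{advantage:.0%}")  # 25%
```

That lands at about 25% with these numbers; a 30% figure presumably compares against the 780's pre-cut price or the $549 290X instead.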


----------



## Fairlady-z (Nov 5, 2013)

I wonder if we're able to flash the 290X BIOS onto the 290? That would be killer, like with the 6950/6970 reference cards.


----------



## [Ion] (Nov 5, 2013)

Well, at a full 50% faster than the HD 7950, this is a very tempting upgrade indeed.


----------



## jormungand (Nov 5, 2013)

Steevo said:


> So far in this thread I have read.
> 
> Nvidia fanbois angry because their precious Nvidia has competition, which has made their purchase of a drastically overpriced Titan seem foolish to the tune of almost $600. So now they are crying about how the 290 has made the "X" redundant and stupid. Congrats on catching that; did you not notice it when the 780 and Titan were overpriced? Or do the tears help you see better?
> 
> ...



I agree with you that AMD has done a great job bringing prices down, and it's a great card for the money, but:

1. noise is a problem
2. temperature is a problem
3. throttling is another
4. drivers are a problem too
5. power consumption is, once again, a problem

When people are spending $400-plus on a card, they don't want trouble, or to spend more money on a new PSU or a $60+ aftermarket cooler. Many here, like me, like this card's price and performance, but it nearly hits $400, so it's five problems weighed against your money, and that tips the balance, and your pocket. The smartest wait for the big bang; like I said before, I hope Asus, Gigabyte, MSI, and the others do a good job with this card and give us what we want:

performance, cooling, OC, and a good price.
I really enjoy the AMD/Nvidia battle; it makes us all happy.


----------



## Steevo (Nov 5, 2013)

Will a MCW60-R fit it?

I happen to have one hanging around on my old 5870.....


----------



## MT Alex (Nov 5, 2013)

It makes me mad to even look at the 770 in my case


----------



## NeoXF (Nov 5, 2013)

RCoon said:


> W1z, can you confirm this, or is he over exaggerating to sell more plasma hoovers? I'm fairly certain 60% fans are not "reasonably" quiet.



Even if he could confirm it, why would I, or anyone, take his word for it? The best scenario is to test it yourself.



Zen_ said:


> Really impressive results and price on both the 290 cards, but too bad AMD cheaped out on the heatsink to hit a price point, or whatever the reason is for it. How long will it take for non-reference cards to come to market? If you look at the PCB layout of both 290 cards and a 7970 they look almost the same, so it couldn't be that hard for card partners to slap heatsinks they already have on the 290. Or do they wait until they can produce a custom PCB as well?



It's not their decision, it's AMD's. Not sure why; maybe something related to emptying reference stock first (plus the fact that the reference cooler looks so bad next to custom ones that nobody would buy reference cards afterwards).


----------



## Steevo (Nov 5, 2013)

jormungand said:


> 1. noise is a problem
> 2. temperature is a problem
> 3. throttling is another
> 4. drivers are a problem too
> ...



1) If you look at the $400 market, you could underclock the card, and it would remain silent while still providing better price/performance.
2) Temp: not sure how that's a problem. The card has been designed to run at 95°C, and I'm sure you won't need to bake it every few months like an Nvidia card to keep it working.
3) This card doesn't throttle. Read, comprehend, post.
4) Drivers? Oh, like when Nvidia killed cards? They have just as many issues; I look in their forums too.
5) If you are running a PSU with only 40 W to spare, you don't need a new PSU, you need to choose another hobby.


Your post reminds me of someone who buys a sports car but can't afford to drive it.


----------



## Pandora's Box (Nov 5, 2013)

MT Alex said:


> It makes me mad to even look at the 770 in my case



Having used AMD cards in the past, I look at it this way: I have had zero issues with my 780 SLI setup. Sure, I paid a premium for it, but I think it was worth it. I wish the guys buying these cards luck. Myself? I'll never buy AMD cards again; way too many driver issues.


----------



## Rebel333 (Nov 5, 2013)

jormungand said:


> I agree with you that AMD has done a great job bringing prices down, and it's a great card for the money, but:
> 
> 1. noise is a problem
> 2. temperature is a problem
> ...



Only power consumption is a real problem from that list; all the others can be fixed with a custom cooler, and maybe a re-paste with IC Diamond could make it really good! And why would drivers be a problem?


----------



## NeoXF (Nov 5, 2013)

jormungand said:


> [...]
> 4. drivers are a problem too
> 5. power consumption is, once again, a problem
> [...]



Sorry, but that's an instant red flag for me regarding you. You're either a) an Nvidia fanboy or, more likely, b) a regular, level-headed person who's been indoctrinated by one.

Driver problems are waaay blown out of proportion, especially in this day and age; get with the times. I've seen and had way more problems with GeForce drivers than with anything else.

And anyone commenting on power usage for enthusiast-level hardware is kind of in the wrong league... Not to mention the so-called "issue" is way overblown: half the reviews I've seen (i.e., the half that isn't biased and has a clue about fair power testing) show an Uber-mode R9 290X barely drawing 10 W more than a Titan.


----------



## Steevo (Nov 5, 2013)

Sorry, I may have been wrong; does the card actually throttle, W1zz?


----------



## NeoXF (Nov 5, 2013)

Steevo said:


> Sorry, I may have been wrong; does the card actually throttle, W1zz?



No, why would it? It has lower power usage and the same cooler as the R9 290X, plus the 47% fan-speed profile that was supposedly one of the "improvements" Catalyst 13.11 beta v8 brought.

Edit: Well, to be fair, it might in some extreme cases; I remember seeing a graph showing fluctuations between 935 and 948 MHz, whereas the R9 290X in Uber mode held a constant 1000 MHz. I suppose it all depends on ambient temperature.


----------



## Frick (Nov 5, 2013)

RCoon said:


> I don't think the new Thief is a very attractive game to offer in a bundle. So far the game has been heavily criticised, and the devs have even removed things from it because the feedback was a colossal failure. I expect the Mantle-adopting companies' games to show up in the bundle, Oxide-engine games and such.



Waitwaitwaitwait, hold on here. Exactly what have they cut? Are you telling me they're making a proper Thief game and not the stupid Thief-lookalike-lite it looked like?


----------



## newtekie1 (Nov 5, 2013)

Steevo said:


> Sorry, I may have been wrong; does the card actually throttle, W1zz?



According to his conclusion, the card does still throttle:



> In our testing, the card barely throttled and ran above 925 MHz almost all the time.


----------



## AsRock (Nov 5, 2013)

the54thvoid said:


> This is the issue I harped on about before. BF4 is *NOT FREE* with 290X cards, at least not in the UK. It's not bundled. You pay £427-485-ish for a plain boxed R9 290X, or £505-515-ish for the BF4 edition. The card is the same, no frills. That's a premium of £78 worst case, or £30 if you're silly enough to pay maximum dosh for either variant. That's not free.
> 
> Jeeeeez.
> 
> http://img.techpowerup.org/131105/Untitled.png



When is something FREE? One way or another you pay for it, and I'd like to know where it says it's free to begin with.


----------



## TheThirdRace (Nov 5, 2013)

Concerning the VGA "minus" point...



btarunr said:


> The summary of our argument is:
> People still buy/use cheap 21.5-inch 1080p monitors that only feature VGA input
> Those people use DVI to D-Sub adapters
> Those adapters work on every other high-end graphics card (launched in the past 2 years)
> ...



Your logic is sound when talking about a $150 card, but we're talking about a $400 card... Furthermore, virtually every monitor from the last 5 years has DVI and/or HDMI. Add to that, you'd be a fool to shell out $400 on a video card and not have a monitor that can actually keep up.

I have no problem with missing VGA being a minus, just not on this kind of card. It's like saying your new 4K TV should get a minus because it doesn't have an analog input... You'd be crazy to use that combination in the first place.


----------



## btarunr (Nov 5, 2013)

TheThirdRace said:


> Concerning the VGA "minus" point...
> 
> 
> 
> Your logic is sound when talking about a $150 card, but we're talking about a $400 card...



People upgrade graphics cards; they rarely upgrade monitors. Besides, the lure of a 21.5-inch 1080p LCD around the $100 mark can be very strong for noobs, who would want to transfer the cost savings from the monitor to the graphics card. Such displays often only feature D-Sub inputs. Then they stick in an aftermarket adapter only to find that not only does it not work, it won't even fit into the DVI connectors (they're DVI-D, which lacks the analog guide pins).


----------



## TheoneandonlyMrK (Nov 5, 2013)

btarunr said:


> People upgrade graphics cards, they rarely upgrade monitors. Besides, the lure of a 21.5-inch 1080p LCD display around the $100 point can be very strong for noobs, who would want to transfer cost-savings from monitor to graphics card. Such displays often only feature D-Sub inputs.


I get what you're saying, but he was right: only an uneducated newbie would team an R9 up with a monitor so old it hasn't got DVI or HDMI, and someone trying to save money by buying a cheap, shitty old monitor is unlikely to buy a high-end GPU. But I suppose, as with every release, the opposing team's fanbois will jump on any little feature drop like it's major.
Foolish, though, because I cheaped out on a 70-quid 22" Hanns G, and guess what, it's got DVI; so the issue of a cheap monitor on a dear card isn't much of an issue, imho.


----------



## newtekie1 (Nov 5, 2013)

theoneandonlymrk said:


> I get what you're saying, but he was right: only an uneducated newbie would team an R9 up with a monitor so old it hasn't got DVI or HDMI, and someone trying to save money by buying a cheap, shitty old monitor is unlikely to buy a high-end GPU. But I suppose, as with every release, the opposing team's fanbois will jump on any little feature drop like it's major.
> Foolish, though, because I cheaped out on a 70-quid 22" Hanns G, and guess what, it's got DVI; so the issue of a cheap monitor on a dear card isn't much of an issue, imho.



You forget that there are still people, like me, who use more than one monitor and have a secondary monitor that only has VGA.


----------



## RCoon (Nov 5, 2013)

Frick said:


> Waitwaitwaitwait, hold on here. Exactly what have they cut? Are you telling me they're making a proper Thief game and not the stupid Thief-lookalike-lite it looked like?



They completely removed the level-up system from the game after the intense backlash they received from testers. They changed a bunch of other things too, but I don't remember them off the top of my head. But no, it's not a real Thief game _YET_. Still a Thief lookalike for now.

I just assume they'll offer Lichdom, Thief, and that other game featuring their weird audio thing in their game bundles. Seems like a good way for them to scratch each other's backs, right?



NeoXF said:


> Even if he could confirm it, why would I, or anyone, take his word for it? The best scenario is to test it yourself.



Sir, are you trying to get me to buy one of these?


----------



## FX-GMC (Nov 5, 2013)

newtekie1 said:


> You forget that there are still people, like me, who use more than one monitor and have a secondary monitor that only has VGA.



I also fit into this category.  How noobish of me.


----------



## TheThirdRace (Nov 5, 2013)

theoneandonlymrk said:


> I get what you're saying, but he was right: only an uneducated newbie would team an R9 up with a monitor so old it hasn't got DVI or HDMI, and someone trying to save money by buying a cheap, shitty old monitor is unlikely to buy a high-end GPU. But I suppose, as with every release, the opposing team's fanbois will jump on any little feature drop like it's major.
> Foolish, though, because I cheaped out on a 70-quid 22" Hanns G, and guess what, it's got DVI; so the issue of a cheap monitor on a dear card isn't much of an issue, imho.



Exactly my point. You kinda have to work hard to find a VGA-only monitor today...

Also, I wouldn't blame Ferrari if a buyer decided to put $100 tires on their car and caused it to handle badly. That would be the buyer's fault, not Ferrari's.


----------



## FX-GMC (Nov 5, 2013)

TheThirdRace said:


> Exactly my point. You kinda have to work hard to find a VGA-only monitor today...
> 
> Also, I wouldn't blame Ferrari if a buyer decided to put $100 tires on their car and caused it to handle badly. That would be the buyer's fault, not Ferrari's.



Bad analogy: you can still put $100 tires on it if you want to.

Also, why are you assuming someone is going to buy a VGA-only monitor instead of already owning one?


----------



## johnnyfiive (Nov 5, 2013)

Man... stuff just keeps getting better.


----------



## technogiant (Nov 5, 2013)

Never mind all this Quiet and Uber mode stuff, or the 47% fan on the 290 non-X version... what do these things do with the blower pegged at 100%? I don't care about the noise; I wear headphones.


----------



## Brusfantomet (Nov 5, 2013)

Managed to get one of the last 290Xs here in Norway on Sunday, and now the 290 arrives with better availability (at the moment) and a lower price. Oh well, I am gonna check whether it throttles under water first (my XSPC Raystorm GPU block fits, according to XSPC).

Otherwise I may have to exercise some European consumer rights and return it.


----------



## TheThirdRace (Nov 5, 2013)

FX-GMC said:


> Bad analogy: you can still put $100 tires on it if you want to.
> 
> Also, why are you assuming someone is going to buy a VGA-only monitor instead of already owning one?



Technically you can put 100$ tires on your Ferrari, but it's unusable. Most "luxury" cars prevent that by using some kind of special rims with the "right" balance. For example, you could use a normal rim to mount the tires of an Audi, but your car would wobble on the road. If a 30K Audi does that, I doubt a Ferrari would let you put whatever you want... especially from a company that does a research on you before they accept to put you on the waiting list to buy one.

So, yes my analogy wasn't perfect in every single details, but you're missing the point. A 400$ card has higher standards a low class card doesn't. You can't blame the company for not making their top of the line product for the bottom line users. You want top of the line, use top of the line. A better analogy would be blaming Ferrari because the engine doesn't use fuel with the lowest concentration of octane. If you can't afford the fuel, you can't afford the car, it's not the company's fault.

As for assuming the VGA monitor isn't already owned, I didn't. But eventually you have to realize technology moves on even if you don't. You can't expect every piece of hardware in the next century to support a technology that was already obsolete 10 years ago. Sorry, but it doesn't work that way. If you're willing to pay 400$ for a video card, it comes with the requirement of having a monitor that will give you what you paid for. VGA doesn't give you what you paid for your video card, simple as that.

And for the record, I'm no AMD fanboy. I'm currently a happy owner of a GTX 770, and it would take a complete turnaround to get me back into the AMD camp after suffering for years with the bad driver interface. I simply support the idea that a $400 card should require a monitor that wasn't built in the VHS era...


----------



## The Von Matrices (Nov 5, 2013)

hardcore_gamer said:


> I agree. A sustained rip-off is way better than a short-term rip-off.



I argue that it is.  With any luxury product the price is completely ridiculous and a rip-off if you consider the cost of materials.  But luxury products also have prestige and because of that maintain a high resale value.  There is a reason to pay more for a product initially if the depreciation is lower and you can recoup the extra cost at resale.

Do I think that Titan and the original 780 were priced too high?  Yes.  But at least they held their resale value for a long time.  There are a lot of happy Titan and 780 owners who bought their product on launch day and don't regret their purchase because prices remained stable for 6-9 months.  I don't expect any technology product to be an investment, but it's ridiculous that the 290X could only justify its price for all of two weeks.

I don't sympathize with the people who bought a 780 directly before the 290X launch, because that was just a dumb idea knowing that its competitor's launch was imminent.  But I do sympathize with the people who bought a 290X at launch and could have gotten the same performance for $150 less just by waiting two weeks.  The 290's performance was just completely unforeseeable, and AMD screwed 290X launch purchasers by releasing it in this revised, faster form.



Brusfantomet said:


> Managed to get one of the last 290X cards here in Norway on Sunday, and now the 290 comes with better availability (at the moment) and a lower price. Oh well, I am gonna check if it throttles under water first (my XSPC Raystorm GPU block fits, according to XSPC).
> 
> Otherwise I may have to exercise some European consumer rights and return it



I sympathize with you.  I doubt you're the only one who feels this way.


----------



## PopcornMachine (Nov 5, 2013)

Don't understand all the talk about VGA output.  Who would buy a $400+ video card to run VGA?  Talk about stretching a point just to find a negative.

The real issues are the heat and power usage, and bang for buck.

Was hoping for better on the first two, but more than satisfied with the value.

Going to put a water block on my next card regardless, and I'm really interested in seeing what this one does with proper cooling.


----------



## The Von Matrices (Nov 5, 2013)

I think the VGA output argument is pretty clear:

If you use VGA for your primary monitor then you're out of luck with the R9 290; however if you really want VGA you can buy a $25 DisplayPort to VGA adapter.  But for the naysayers, remember that the 290 has two true dual link DVI ports in exchange for the lack of a VGA port.  I'd say that for the number of people using VGA as their primary monitor compared to the number of 2560x1440 monitor users that's a good tradeoff.  

If you use a VGA monitor as a second, non-gaming monitor, then you don't need to plug it into the R9 290.  Almost every modern processor has integrated graphics.  Just plug it into the VGA port of integrated graphics and you have the VGA port for your second monitor.


----------



## newtekie1 (Nov 5, 2013)

TheThirdRace said:


> As for assuming the VGA monitor isn't already owned, I didn't. But eventually you have to realize technology moves on even if you don't. You can't expect every piece of hardware in the next century to support a technology that was already obsolete 10 years ago. Sorry, but it doesn't work that way. If you're willing to pay 400$ for a video card, it comes with the requirement of having a monitor that will give you what you paid for. VGA doesn't give you what you paid for your video card, simple as that.



You are still assuming the VGA monitor is used as the primary.  I've got a very nice 27" 2560x1440 monitor as my primary monitor, but I use a very nice 1680x1050 monitor as a secondary that only has VGA.



The Von Matrices said:


> I think the VGA output argument is pretty clear:
> 
> If you use VGA for your primary monitor then you're out of luck with the R9 290; however if you really want VGA you can buy a $25 DisplayPort to VGA adapter.  But for the naysayers, remember that the 290 has two true dual link DVI ports in exchange for the lack of a VGA port.  I'd say that for the number of people using VGA as their primary monitor compared to the number of 2560x1440 monitor users that's a good tradeoff.
> 
> If you use a VGA monitor as a second, non-gaming monitor, then you don't need to plug it into the R9 290.  Almost every modern processor has integrated graphics.  Just plug it into the VGA port of integrated graphics and you have the VGA port for your second monitor.



Both of those still lead to not having native VGA as being a negative. Not something that weighs heavily on the final score, I'm sure, but a negative that is worth mentioning.  Now if they included the Displayport adapter with the card, that would make the negative go away.


----------



## Steevo (Nov 5, 2013)

The Von Matrices said:


> With any luxury product the price is completely ridiculous and a rip-off if you consider the cost of materials.  But luxury products also have prestige.



Thanks for the quote.

http://www.thefreedictionary.com/prestige

I don't think computer hardware qualifies as prestige, unless you mean prestige = penile compensation.

Good for Titan owners on having a large epeen that says "I have enough money to be the same as everyone else who has enough money to waste, I'm pretentious!"

Edit, autocorrect.


----------



## TRWOV (Nov 5, 2013)

400? oh boy, oh boy


----------



## theonedub (Nov 5, 2013)

More Nvidia price cuts on the way?


----------



## newtekie1 (Nov 5, 2013)

theonedub said:


> More Nvidia price cuts on the way?



I hope!


----------



## The Von Matrices (Nov 5, 2013)

Steevo said:


> Thanks for the quote.
> 
> http://www.thefreedictionary.com/prestige
> 
> ...



So you only read the first three sentences of my argument and made a silly comment based on it?  If you're going to do that then my time is wasted on you.  My argument was not about buying a product for prestige but how prestigious products hold resale value.


----------



## HTC (Nov 5, 2013)

The Von Matrices said:


> I think the VGA output argument is pretty clear:
> 
> If you use VGA for your primary monitor then you're out of luck with the R9 290; *however if you really want VGA you can buy a $25 DisplayPort to VGA adapter.*  But for the naysayers, remember that the 290 has two true dual link DVI ports in exchange for the lack of a VGA port.  I'd say that for the number of people using VGA as their primary monitor compared to the number of 2560x1440 monitor users that's a good tradeoff.
> 
> If you use a VGA monitor as a second, non-gaming monitor, then you don't need to plug it into the R9 290.  Almost every modern processor has integrated graphics.  Just plug it into the VGA port of integrated graphics and you have the VGA port for your second monitor.



Dude: it has been explained that even VGA adapters don't work on the 290, because the 290 doesn't have the analog part required for the adapter to work.

Personally, though I don't consider it a con myself, I understand why it's there: many people checking reviews tend to go straight to the power consumption and/or performance pages, skipping the card's details, and if this info isn't in the pros/cons, they never see it.

As for the card: I'm surprised AMD only charges $400 for this kind of performance, while I'm disgusted by their choice of cooler. I still maintain that a better cooler (no need for a good one, just not a very bad one) would make this card fly off shelves WAY easier, IMO.


----------



## The Von Matrices (Nov 5, 2013)

HTC said:


> Dude: it has been explained that even the VGA adapters don't work on 290 because the 290 doesn't have the analog part required for the adapter to work.



Don't criticize anyone when you don't know what you're talking about.  DisplayPort is a purely digital interface.  This active adapter converts digital DisplayPort to analog VGA.  There is no need for analog outputs on the card itself for this adapter to work.  Only passive DVI to VGA adapters do not work.


----------



## TheThirdRace (Nov 5, 2013)

HTC said:


> Dude: it has been explained that even the VGA adapters don't work on 290 because the 290 doesn't have the analog part required for the adapter to work.



Oops yourself... He said you can buy a DisplayPort to VGA adapter for $25, not DVI to VGA.



The Von Matrices said:


> Don't criticize anyone when you don't know what you're talking about.  DisplayPort is a purely digital interface.  This active adapter converts digital DisplayPort to analog VGA.  There is no need for analog outputs on the card itself for this adapter to work.  Only passive DVI to VGA adapters do not work.



Taken from the page: "The DP-to-VGA adapter uses an integrated chip to provide an active digital to analog conversion".

Doesn't sound like you need analog on the video card to make it happen. Are you 100% sure you're right?


----------



## HTC (Nov 5, 2013)

The Von Matrices said:


> Don't criticize anyone when you don't know what you're talking about.  DisplayPort is a purely digital interface.  This active adapter converts digital DisplayPort to analog VGA.  There is no need for analog outputs on the card itself for this adapter to work.  Only passive DVI to VGA adapters do not work.





TheThirdRace said:


> Oops yourself... He said you can buy a Display Port to VGA adapter for 25$, not DVI to VGA.



I stand corrected.

Anybody can make mistakes, and it seems I just made one.


----------



## Casecutter (Nov 5, 2013)

$400...! And it spars with a 780. Love this competition. 

I'm sure AMD was astonished Nvidia went straight to the $500 price point (I was), but they said fine... we can bring it right to you and still be more profitable.  Sure, the noise hurts again, but if AMD can have AIBs offering custom coolers by the end of November, they'll be in a good place. 

Even today I wouldn't be against spending some $50-100 on top of the $400 for an aftermarket air cooler or water block, or I'd even love to see one with a simple "Red Mod" (although I haven't yet read if/how well that works).  If you can keep it below 94°C and shave off a good amount of dBA while achieving overclocks like W1zzard's, a reference unit is definitely worth considering.

The only way Nvidia can strike back at this price point is with a GK110, perhaps a future cut-back part with only 10 or 11 SMX units enabled, to retain a decent revenue return.  Obviously they would offset the lost performance with higher clocks, but how would that measure up in terms of perf/watt?  I suppose it doesn't matter as long as it stays competitive; AMD gave Nvidia plenty to fall back on in terms of "efficiency".  That said, if Nvidia can't stay at or near parity in perf/watt, it really won't be a good return volley.  Perchance a cut-down GK110 can't be brought to market in viable form without showing the limits of that huge Kepler die?


----------



## the54thvoid (Nov 5, 2013)

Steevo said:


> Good for Titan owners on having a large epeen that says "I have enough money to be the same as everyone else who has enough money to waste, I'm pretentious!"



It's a shame that people like you, who can contribute so much to TPU, end up resorting to lowbrow insults.  I'm not pretentious, by the way; PC hardware is my hobby.  

http://www.thefreedictionary.com/pretentious

pre·ten·tious
adj.
1. Claiming or demanding a position of distinction or merit, especially when unjustified.
2. Making or marked by an extravagant outward show; ostentatious. See Synonyms at showy.

I bought a Titan to come as close as I could to the performance of my CrossFired 7970s (which were hampered by frame pacing in a few major titles I played).  It gets really fucking boring explaining that to people who prefer to jump to childish, ill-informed insults.


----------



## Nihilus (Nov 5, 2013)

My only gripe is that AMD should have lowered the fan speed and clocks, kept a good value, and then showed it off as a monster overclocker once the right coolers were attached.  It didn't need to match the GTX 780, only the GTX 770.


----------



## Steevo (Nov 5, 2013)

Casecutter said:


> $400...! While it spars with a 780, love this competition.
> 
> I'm sure AMD was astonished Nvidia went straight-away to the $500 price (I was), but they said fine... we can bring it right to you and still be more profitable.  Sure again noise hurts, but if AMD can have AIB's offering customs by end of November they’ll be in a good place.
> 
> ...



It does speak volumes about the efficiency of this GPU die: roughly 28% smaller and just as fast, if not a little faster, clock for clock. 

Most people don't seem to understand that a smaller die surface hugely increases what it takes to keep the chip cool; the thermal conductivity of copper, even with a vapor chamber on it, is at its limit in a reasonable two-slot configuration. 

I am fairly certain that if we put the cooler from a Titan or 780 on it, it would still have the same heat dissipation problem. The W/mm² is the issue, in conjunction with the thermal delta and the hysteresis of the thermal load.

The extra power consumption comes from the higher utilization of the shader cores.
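To put rough numbers on the W/mm² point, here's a minimal sketch. The die areas are approximate public figures for the two chips, and the wattage is an assumed ballpark for heat dissipated through the die, not a measured value:

```python
# Rough heat-flux comparison: pushing similar wattage through a
# smaller die raises the W/mm^2 the cooler has to pull away.
# Die areas are approximate public figures; the power figure is an
# assumed ballpark heat load, used only to illustrate the point.

hawaii_area_mm2 = 438.0  # Hawaii (R9 290/290X) die area, approx.
gk110_area_mm2 = 561.0   # GK110 (Titan/GTX 780) die area, approx.
gpu_power_w = 230.0      # assumed heat load through the die

hawaii_flux = gpu_power_w / hawaii_area_mm2
gk110_flux = gpu_power_w / gk110_area_mm2

print(f"Hawaii: {hawaii_flux:.2f} W/mm^2")
print(f"GK110:  {gk110_flux:.2f} W/mm^2")
print(f"Hawaii heat flux vs GK110: {hawaii_flux / gk110_flux:.0%}")
```

At equal wattage the smaller die's heat flux scales with the inverse area ratio, so Hawaii's cooler has to move roughly a quarter more heat per square millimeter than GK110's does.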



the54thvoid said:


> It's a shame that people like you that can contribute so much to TPU end up resorting to low brow insults.  I'm not pretentious by the way, PC hardware is my hobby.
> 
> http://www.thefreedictionary.com/pretentious
> 
> ...



I could approach this from a few different ways.

It's great that you bought a Titan; I'm glad it has performed well for you. I don't mean to be insulting, and I hope you have modded it to push the hardware; most of us do. It's what brought me here to TPU: editing BIOSes on cards and pushing the limit. I didn't mean any particular owner or user, just the standard crappers.

It irks me and many others when thread crapping happens: not the occasional jab in good fun, but honest thread crapping. When people who don't own the hardware, or who just parrot what they read, keep reposting the same single-instance issues as absolute truth, it becomes annoying and detrimental to the thread and to new users looking at the reviews here.  

If it bothers someone that AMD is selling a product, then don't buy it; post something constructive, or one post about how it's warmer than you might like. But the horse has been beaten into burger and already made into lasagna.



The Von Matrices said:


> So you only read the first three sentences of my argument and made a silly comment based on it?  If you're going to do that then my time is wasted on you.  My argument was not about buying a product for prestige but how prestigious products hold resale value.



Huh... sorry? 

These aren't collector's items; it's computer hardware. I bought my X1800 XT brand new and sold it less than a year later for pennies on the dollar of what I paid for it. Now the GTX 480 can be had for $50 or less, roughly 1/10th of what it was new. Not much other than phones depreciates this fast, and the high end depreciates fastest.
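As a quick sanity check on those figures (the launch MSRP below is the approximate GTX 480 launch price; the resale number is the ballpark going rate cited above):

```python
# Depreciation check for the GTX 480 example. The launch MSRP is an
# approximate figure; the resale price is the rough number cited in
# the post, so treat the result as illustrative only.

gtx480_launch_usd = 499.0  # approximate launch MSRP
gtx480_resale_usd = 50.0   # rough used price cited above

retained = gtx480_resale_usd / gtx480_launch_usd
print(f"GTX 480 retains about {retained:.0%} of its launch price")
```

That works out to about 10% retained, i.e. the "roughly 1/10th" figure above.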


----------



## W1zzard (Nov 5, 2013)

The Von Matrices said:


> Don't criticize anyone when you don't know what you're talking about.  DisplayPort is a purely digital interface.  This active adapter converts digital DisplayPort to analog VGA.  There is no need for analog outputs on the card itself for this adapter to work.  Only passive DVI to VGA adapters do not work.



Confirmed, this adapter will work to output analog VGA from R9 290 Series.


----------



## The Von Matrices (Nov 5, 2013)

Nihilus said:


> My only gripe is that AMD should have lowered the fan speed and clocks, kept a good value, and then showed it off as a monster overclocker once the right coolers were attached.  It didn't need to match the GTX 780, only the GTX 770.



You describe exactly what AMD originally intended to do with this card.  The only thing that changed at the last minute was fan speed; the rest (clock speed, voltage, shaders) was already set.  So the "original" R9 290 would be the massive overclocker; the current R9 290 has its overclocking potential diminished because the throttling has been removed.  It's almost as if AMD is admitting that its cooler is horrible and increased the cooling (noise be damned) to match what a custom cooled R9 290 would perform like because it knew it wouldn't be selling the reference card with the stock heatsink for very long.


----------



## happita (Nov 5, 2013)

I simply could not resist. I found some stock at the egg, Sapphire only atm..

Get em while they're hot!!!!
SAPPHIRE 100362SR Radeon R9 290 4GB GDDR5 Video Ca...


----------



## Steevo (Nov 5, 2013)

The Von Matrices said:


> You describe exactly what AMD originally intended to do with this card.  The only thing that changed at the last minute was fan speed; the rest (clock speed, voltage, shaders) was already set.  So the "original" R9 290 would be the massive overclocker; the current R9 290 has its overclocking potential diminished because the throttling has been removed.  It's almost as if AMD is admitting that its cooler is horrible and increased the cooling (noise be damned) to match what a custom cooled R9 290 would perform like because it knew it wouldn't be selling the reference card with the stock heatsink for very long.



AMD recently announced they are going to use Sapphire for all their branded cards anyway, so I am thinking they are moving to a fab-less hardware design company instead of having money tied up in manufacturing.


----------



## W1zzard (Nov 5, 2013)

Steevo said:


> AMD recently announced they are going to use Sapphire for all their branded cards anyway, so I am thinking they are moving to a fab-less hardware design company instead of having money tied up in manufacturing.



PCPartner (who own Sapphire) has made ATI/AMD reference boards since like forever. AMD does not have a fab for their GPU production, all AMD GPUs are made at TSMC Taiwan


----------



## Casecutter (Nov 5, 2013)

Steevo said:


> It does speak volumes about the efficiency of this GPU die, 28% roughly smaller and just as fast if not a little faster clock for clock.
> 
> Most people don't seem to understand that smaller surface of the die poses a huge increase in what it take to keep it cool, the thermal conductivity of copper and even with a vapor chamber on it is at its limit to stay in a reasonable size 2 slot configuration.
> 
> ...



Amen… Exactly what I said in the 290X review! AMD took it right to the edge in a bunch of ways. They're packaging a bunch of stuff into a small space while retaining a cost-effective chip.  Once they got risk production working, they really started seeing that the computed layout and parameters of a design don't always come back from production as planned.  Look at GK100: it didn't make it on its first spin; it had to be revamped to some degree and eventually transitioned into GK110.  

Bear in mind, I don't think AMD, beyond early hypotheses, ever gave hard consideration to a 28nm Hawaii chip until perhaps the end of 2012.  Given that, accomplishing production in 10 months is huge.  Both groups of consumers should be ecstatic that AMD accomplished what they did, because had they not, the enthusiast level would've remained unobtainable to a huge number of consumers.


----------



## DarkOCean (Nov 5, 2013)

happita said:


> I simply could not resist. I found some stock at the egg, Sapphire only atm..
> 
> Get em while they're hot!!!!
> SAPPHIRE 100362SR Radeon R9 290 4GB GDDR5 Video Ca...
> ...



Gone; only PowerColor and XFX (which is 5% more expensive) in stock atm.


----------



## Steevo (Nov 5, 2013)

W1zzard said:


> PCPartner (who own Sapphire) has made ATI/AMD reference boards since like forever. AMD does not have a fab for their GPU production, all AMD GPUs are made at TSMC Taiwan



I remember they made the majority of their boards, but haven't they been outsourcing some of it to other vendors as well until recently?

http://www.amd.com/us/press-releases/Pages/amd-selects-sapphire-2013oct01.aspx

I guess they moved their FirePro line exclusively to Sapphire.


----------



## the54thvoid (Nov 5, 2013)

Steevo said:


> I could approach this from a few different ways.
> 
> Its great that you bought a Titan, I'm glad it has performed well for you. I don't mean to be insulting, I hope you have modded it to push the hardware, most of us do. Its what brought me here to TPU, editing BIOS's on cards and pushing the limit. I didn't mean any particular owner or user, just standard crappers.
> 
> ...



Thanks for taking the time to be constructive with your reply - genuinely appreciate it.  I get just as pissed as you do about thread crapping, but sometimes I guess carpet-bombing statements hit the wrong targets.

The Hawaii architecture is genuinely impressive.  The bloody cooler, though, as a reference design, does blow (too much).  Most reviewers agree.

Any other argument, such as power draw or Mantle being a closed API, or the usual regurgitated driver shite, is used by ignorant folk.

What AMD have done is mostly fantastic, and the pricing has forced Nvidia to make changes.  This is all good.  The only negative for most folk is the noise from the reference design.  Custom designs should alleviate that (especially if they allow it to run at about 80-90 degrees to keep noise down), but for me it's technically a non-issue as I'd put a water block on it regardless.

Thanks again for the reply.


----------



## Steevo (Nov 5, 2013)

the54thvoid said:


> Thanks for taking the time to be constructive with your reply - Genuinely appreciate it.  I get just as pissed as you do about thread crapping but sometimes I guess carpet bombing statements hit the wrong targets.
> 
> The Hawaii architecture is genuinely impressive.  The bloody cooler though as a reference design does blow (too much).  Most reviewers agree.
> 
> ...



No problem. 

I am hoping to get back into more modding this year or next. I have squeezed every ounce of power out of this build and need more on the GPU front. That said, I would love to see what a custom-designed board with more power to the die and good water cooling could do. Considering it's showing less than 1.2 V on the core, there should be headroom left.


----------



## The Von Matrices (Nov 5, 2013)

Casecutter said:


> Bear in mind I don't think AMD other than early hypothesis, ever gave hard considerion to a 28Nm Hawaii chip until perhaps end of 2012.   Given they accomplished production in 10 months is huge.  Both groups of consumers have to be ecstatic that AMD accomplished what they did, because had they not the enthusiast level would've remain unobtainable to huge amount of consumers.



I'm not so sure your timeline is reasonable.  The GTX 680 and GK104 were released in March 2012 and shocked AMD with their performance (they shocked NVidia too).  Yes, AMD released the 7970 GHz Edition to compete, but surely they had to know that NVidia still had GK110 to release and that Tahiti wasn't going to be able to compete with it.

Either AMD recognized this issue and started designing Hawaii in mid 2012 (which means that they had 16+ months to produce the chip) or AMD was complacent and started designing Hawaii in late 2012 as you said.  In this latter case I would scold AMD more for being complacent than credit them for designing a chip in such a short time.  If it only takes them 10 months to design the Hawaii chip they could have started at the release of the GTX 680 and we would have had the R9 290X by February 2013, which is actually before the release of Titan.


----------



## Mathragh (Nov 5, 2013)

The Von Matrices said:


> I'm not so sure about your timeline being reasonable.  GTX 680 & GK104 were released in March 2012 and shocked AMD with its performance (it shocked NVidia too).  Yes, AMD released the 7970 GHz edition to compete, but surely they had to know that NVidia still had GK110 to release and that Tahiti wasn't going to be able to compete with it.
> 
> Either AMD recognized this issue and started designing Hawaii in mid 2012 (which means that they had 16+ months to produce the chip) or AMD was complacent and started designing Hawaii in late 2012 as you said.  In this latter case I would scold AMD more for being complacent than credit them for designing a chip in such a short time.  If it only takes them 10 months to design the Hawaii chip they could have started at the release of the GTX 680 and we would have had the R9 290X by February 2013, which is actually before the release of Titan.



Unless they did just that, but were then aiming for the 20nm node, which at the start of 2013 was decided not to be viable in time. I think this was said over at TechReport in the comments of their 290 review, without actual proof being provided. Your post made me think of it though, and judging by the state of affairs over at TSMC, I would regard it as at least possible, since it would mean they had to redo quite a bit to adapt this chip for 28nm.

All very much speculation though (which I personally love to indulge in on matters like these).


----------



## TheoneandonlyMrK (Nov 5, 2013)

newtekie1 said:


> You forget that there are still people, like me, who use more than one monitor and have a secondary monitor that only has VGA.



Well, I forgot no one. Use your old card in the second PCIe slot and run your monitoring display on that; my point stands that legacy support is not always necessary.
Mobos often don't have a proper PS/2 keyboard port, just USB, which doesn't always work as well when you're trying to F8 a Windows boot or get into the BIOS, but hey, I must have missed everyone bitching about that.
My monitor cost 70 quid; just buy another, you just spent 6-8 times that amount on a GPU.


----------



## newtekie1 (Nov 5, 2013)

theoneandonlymrk said:


> Well, I forgot no one. Use your old card in the second PCIe slot and run your monitoring display on that; my point stands that legacy support is not always necessary.
> Mobos often don't have a proper PS/2 keyboard port, just USB, which doesn't always work as well when you're trying to F8 a Windows boot or get into the BIOS, but hey, I must have missed everyone bitching about that.
> My monitor cost 70 quid; just buy another, you just spent 6-8 times that amount on a GPU.



And having to spend any money at all to buy anything extra to retain functionality of every other card is a con.  There is no getting around that.


----------



## THE_EGG (Nov 5, 2013)

happita said:


> I simply could not resist. I found some stock at *the egg*, Sapphire only atm..



Last time I checked, I don't have any.  Wish I did though.

Looks like Powercolor, MSi and XFX are now in stock though.


----------



## Slomo4shO (Nov 5, 2013)

happita said:


> It looks like AMD just caused a price war. I see it now, all R9 cards will be on backorder for 2 months or more because they probably won't be able to meet demand (or willingly NOT meet demand so they can drive the prices on these higher)



You don't increase prices when you are trying to gain market share.


----------



## qubit (Nov 5, 2013)

newtekie1 said:


> And having to spend any money at all to buy anything extra to retain functionality of every other card is a con.  There is no getting around that.



I really hate to agree with you on anything NT, but I do agree here. 

This omission is hardly a dealbreaker and can be worked around with that little $25 active adapter, but it's still annoying. It's the same situation as when mobo ports got removed over time, such as IDE and PCI, preventing your old but useful kit from being used. It's the price of progress, I guess.


----------



## The Von Matrices (Nov 6, 2013)

newtekie1 said:


> And having to spend any money at all to buy anything extra to retain functionality of every other card is a con.  There is no getting around that.





qubit said:


> I really hate to agree with you on anything NT, but I do agree here.
> 
> This omission is hardly a dealbreaker and can be worked around with that little $25 active adapter, but still annoying. It's the same situation as when mobo ports got removed over time such as IDE and PCI, for example, preventing your old, but useful, kit from being used. It's the price of progress, I guess.



I can see how it would turn off some people, but I strongly believe AMD made the right choice.  The improvements AMD made to its display controller (4 links allowing 2x DL-DVI as well as three clock generators allowing 3x SL-DVI) will help more people than the lack of VGA will hurt.

I don't deny that VGA users exist or that they aren't potential customers; I just argue that the tradeoff for a better digital display controller was worth it.  I don't see this as any different from other changes in the past.  Back with the GTX400/HD5000 series both manufacturers got rid of S-Video and RCA outputs completely in order to allow more digital displays, and there were complaints but they were a small minority.  The market is always moving to newer standards, and no matter what standard you're talking about, retaining legacy compatibility with new products has always meant buying adapters.

I don't think this means that AMD will abandon VGA on the low end just yet, but AMD has begun the march toward the death of VGA, and the high end is a good place to start.  To VGA I say good riddance.  I'll be glad when VGA dies as the standard to connect to projectors in meeting rooms; they always have long, low quality VGA cables.  That results in a ton of noise on the video signal, which is very distracting when trying to show a presentation containing static content and high contrast.



Mathragh said:


> unless they did just that, but were then aiming for the 20nm node, which at the start of 2013 was decided not to be viable in time. I think this was said over at techreport in the comments of the 290 review without providing actual proof. Your post made me think of it though, and judging by the state of affairs over at TSMC i would regard it as possible atleast, since it would mean that they had to redo quite a bit to adapt this chip for 28nm.
> 
> All very much speculation though(which i personally love to indulge in on matters like these



Yeah, it is completely speculation, but speculation is fun! 

What you describe is exactly what AMD admitted happened with Cayman/69xx.  It was designed for 32nm but was moved back to 40nm because 32nm was delayed (and eventually cancelled).  So I could believe that AMD wanted to take advantage of 20nm for its next high-end chip, but at some point realized that 20nm was too far away and a 28nm chip was needed to suffice until 20nm chips were available.  I think this happened to NVidia this time too, considering that the rumors suggest the initial Maxwell chips are 28nm instead of 20nm.


----------



## qubit (Nov 6, 2013)

Yeah, I don't disagree with you Von; as I said, it's the price of progress and that's a nice expansion of the situation. 

Another similar situation occurred when Apple became the first to abandon the floppy disk drive; remember those horrible, slow, unreliable old things? It was pretty inconvenient for quite a while, but in the end they just weren't needed anymore, even for BIOS updates or HDD utilities. Note that I've never bought Macs, so this never affected me. When the FDD was eventually removed from PC motherboards, I barely noticed, having removed my own FDD ages before.


----------



## HalfAHertz (Nov 6, 2013)

Now AMD's TrueAudio makes a whole lot of sense. It's the only thing that can distract you from the horrible noise during gaming


----------



## TheoneandonlyMrK (Nov 6, 2013)

HalfAHertz said:


> Now AMD's True Audio makes a whole lot of sense. It's the only thing that can distract you from the horrible  noise during gaming



See, at least this is genuinely funny. Bravo.


----------



## Xzibit (Nov 6, 2013)

HalfAHertz said:


> Now AMD's True Audio makes a whole lot of sense. It's the only thing that can distract you from the horrible  noise during gaming



Those of us with wives and girlfriends who are used to the hair dryer running for an hour before they leave the house don't mind.

For those without, I guess it bothers them more, since the noise is mysterious.


----------



## 15th Warlock (Nov 6, 2013)

Oh AMD, you naughty you...

I have to give it to the driver team, loud, but deadly performance, there's really no reason to get a 290X for $150 more... 

Oh well, such is life in the fast lane. It's amazing to get so much performance for such a low price. Let's wait and see the response from the green team, but for now: Bravo, AMD, you deserve it.

I'm thinking of getting an Accelero Hybrid for my 290X to rid it of the noise and throttling, but if you do the math, getting a 290 makes much more sense: $399 + $104 for hybrid water cooling. You can't beat that combination; then you would have a silent and deadly video card.


----------



## xenocide (Nov 6, 2013)

Xzibit said:


> Those of us with wifes and girlfriends who are use to the hair dryer for 1hr before going out of the house don't mind.
> 
> Those without I guess it bothers them more since the noise is mysterious.



That's kind of a moot comparison.  I have never dated a girl who dried her hair near my computer while I was gaming; generally such things are done in the bathroom so they can quickly transition into doing their hair with a mirror present, and the doors in the way muffle the sound substantially.  I don't think I could deal with a GPU pumping out a constant 50+ dB; it would drive me insane.  The 95°C load temps also make me a little concerned. Granted, AMD has said the cards were designed to operate at these temperatures, but I just don't like the idea of my GPU being as hot as a car engine...


----------



## MxPhenom 216 (Nov 6, 2013)

15th Warlock said:


> Oh AMD, you naughty you...
> 
> I have to give it to the driver team, loud, but deadly performance, there's really no reason to get a 290X for $150 more...
> 
> ...



Yep, the difference between the 290 and 290x is probably the smallest I have ever seen. The $150 does not match that small performance difference at all. If I was in the market for a card now, the 290 would be it. Possibly 2, and throw them on water.


----------



## Nordic (Nov 6, 2013)

If neither card throttled how much of a gain would the 290x have over the 290? Anyone have any idea?


----------



## TheoneandonlyMrK (Nov 6, 2013)

james888 said:


> If neither card throttled how much of a gain would the 290x have over the 290? Anyone have any idea?



9-15% actually but effective well.


----------



## Xzibit (Nov 6, 2013)

xenocide said:


> That's kind of a moot comparison.  I have never dated a girl that dried her hair near my computer while I was gaming--generally such things are done in the bathroom so they can quickly transition into doing their hair with a mirror present, *and those doors in the way generally muffle the sound substantially.  I don't think I could deal with a GPU pumping out a constant 50+ dB,* it would drive me insane.  The 95c load temps also make me a little concerned, granted AMD has said the cards were designed to operate at these temperatures, I just don't like the idea of my GPU being as hot as a car engine...



It's 49 dBA at 100 cm with a case panel off, and depending on which panel it is, he could be reflecting the noise towards himself.

Unless you're planning on running it like that, the noise, no matter what card it is, will be muffled as soon as you put the panel on. How much it's muffled depends on your case and whether your exhaust is pointed at a wall or object that reflects the sound back to you.

As far as the temps go, you could do what you did with your CPU and get (or wait for) a non-reference cooler.
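As a rough illustration of the distance point, here's a back-of-the-envelope sketch assuming free-field inverse-square falloff (about 6 dB per doubling of distance; real rooms with panels and reflections will differ, which is exactly the point above):

```python
import math

def spl_at_distance(spl_ref_db: float, ref_cm: float, target_cm: float) -> float:
    """Free-field estimate: SPL falls by 20*log10(d2/d1) dB with distance."""
    return spl_ref_db - 20 * math.log10(target_cm / ref_cm)

# 49 dBA measured at 100 cm, estimated at a ~2 m seating distance:
print(round(spl_at_distance(49.0, 100, 200), 1))  # ~43.0 dBA
```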


----------



## anubis44 (Nov 6, 2013)

Totally said:


> Well if you're still using an old CRT with no desire or need to upgrade that is kind of a big negative.



And who, pray tell, is this freak of whom you speak? Who is still x-raying himself unnecessarily, pointing an electron gun at his brain, and squinting manically at a by-now blurry-as-hell CRT? 

Even I, Mr. Trinitron himself, finally caved in and bought 3 22" LCDs back in 2011 as my last Viewsonic 19" trinitron monitor finally started to distort and bleed colour. If you're still somehow using a CRT, I have one word for you: melanoma. 

Spare yourself the brain cancer and get rid of that tumor-in-a-box once and for all! And while you're at it, make sure the replacement flat screen monitor you buy has a DVI or HDMI input.


----------



## W1zzard (Nov 6, 2013)

anubis44 said:


> Even I, Mr. Trinitron himself, finally caved in and bought 3 22" LCDs back in 2011



and I actually searched if that person ever existed and invented the monitors


----------



## de.das.dude (Nov 6, 2013)

dat price :O


----------



## anubis44 (Nov 6, 2013)

W1zzard said:


> and I actually searched if that person ever existed and invented the monitors



Really? A few of my friends used to call me 'Mr. Trinitron' back in the day, as I always insisted on a Trinitron picture-tube monitor. I even demanded one at work, when such a request would very likely be seen as verging on the unreasonable. 

I couldn't stand the non-Trinitron tubes; they weren't sharp enough or horizontally flat enough for me. People might have thought I was being a snob back then for insisting on Trinitron, but my eyesight is still good at 44 years old, and most of them are wearing glasses. With the advent of >19" LCDs at sane prices, I finally saw an alternative good enough for me to switch. Mind you, I still kind of miss the extra colour dynamics the fluorescing picture elements of the Trinitron tube provided over LCD, which gave game elements a slightly more cartoonish tinge (Diablo II never looked as good as on a Trinitron monitor). Kind of like the difference between colour on a plasma display and an LCD.


----------



## Blín D'ñero (Nov 6, 2013)

From the review:


			
REVIEW said:

> Display connectivity options include two DVI ports, one HDMI port, and one DisplayPort. You may use all outputs at the same time, so triple-monitor surround gaming is possible with one card.


Is that true? I may use 4 outputs at the same time? Isn't the HDMI shared with one DVI-D?
At least on the old cards, (for instance HD5870): triple monitor Eyefinity is only possible with 2 x DVI together with (native) DP (or an active adapter), not with the HDMI. 


			
REVIEW said:

> Please note that the DVI outputs no longer support analog monitors. AMD also improved their display controller hardware, so you can now use three HDMI/DVI monitors at the same time without having to buy an active DP-to-DVI adapter (this was a requirement to providing the TMDS clock signal for the third card on previous generation cards) .


Oh now it is three.
_"(this was a requirement to providing the TMDS clock signal *for the third card* on previous generation cards)"_
For the third card???? Doesn't it mean for the third monitor?


----------



## anubis44 (Nov 6, 2013)

Blín D'ñero said:


> From the review:
> Is that true? I may use 4 outputs at the same time? Isn't the HDMI shared with one DVI-D?
> At least on the old cards, triple monitor Eyefinity is only possible with 2 x DVI together with (native) DP (or an active adapter), not with the HDMI.



I'm running 3x22" LG monitors in Eyefinity off one Gigabyte 7950, using the single DVI port and two $30 mini DisplayPort --> DVI adapters connected to the two mini DisplayPorts, without any problems. Eyefinity set up without issues.


----------



## W1zzard (Nov 6, 2013)

You no longer need an active adapter. And yes it's supposed to mean "monitor", not "card"


----------



## alexsubri (Nov 6, 2013)

I think I found my next card!


----------



## anubis44 (Nov 6, 2013)

W1zzard, it goes without saying that we are all dying to read reviews for the custom cooled versions of the R9 290(X). 

Any ideas yet when you MAY have a card to review from Gigabyte, Asus, MSI, etc.? Even a blatant guess on your part would be appreciated!


----------



## Recus (Nov 6, 2013)

W1zzard, are you familiar with this?

http://techreport.com/news/25609/up...9-290x-cards-may-be-slower-than-press-samples
http://www.fudzilla.com/home/item/33066-toms-finds-retail-r9-290x-card-runs-much-slower


----------



## Eroticus (Nov 6, 2013)

Recus said:


> W1zzard, are you familiar with this?
> 
> http://techreport.com/news/25609/up...9-290x-cards-may-be-slower-than-press-samples
> http://www.fudzilla.com/home/item/33066-toms-finds-retail-r9-290x-card-runs-much-slower



A Swedish magazine already debunked this. They bought over-the-counter samples, tested them against review samples, and they were just as fast. Absolutely no difference.

More likely, whoever got bad results didn't have enough cooling and the card went into thermal throttling.

http://www.sweclockers.com/artikel/...der-for-golden-samples-sweclockers-undersoker


----------



## Blín D'ñero (Nov 6, 2013)

Eroticus said:


> A Swedish magazine already debunked this. They bought over-the-counter samples, tested them against review samples, and they were just as fast. Absolutely no difference.
> 
> More likely, whoever got bad results didn't have enough cooling and the card went into thermal throttling.
> 
> http://www.sweclockers.com/artikel/...der-for-golden-samples-sweclockers-undersoker



So you haven't read the article you link to? You are not aware of the extensive update on that page?


----------



## PopcornMachine (Nov 6, 2013)

Eroticus said:


> A Swedish magazine already debunked this. They bought over-the-counter samples, tested them against review samples, and they were just as fast. Absolutely no difference.
> 
> More likely, whoever got bad results didn't have enough cooling and the card went into thermal throttling.
> 
> http://www.sweclockers.com/artikel/...der-for-golden-samples-sweclockers-undersoker



good info .... thanks


----------



## W1zzard (Nov 6, 2013)

anubis44 said:


> W1zzard, it goes without saying that we are all dying to read reviews for the custom cooled versions of the R9 290(X).
> 
> Any ideas yet when you MAY have a card to review from Gigabyte, Asus, MSI, etc.? Even a blatant guess on your part would be appreciated!



I have no concrete info, my guess is later this year



Recus said:


> W1zzard, are you familiar with this?



While I have no data, I doubt these results. It's probably bad case ventilation or user error. I uploaded the press BIOSes for R9 290 and R9 290X to our BIOS collection on launch day, so everybody can verify.


----------



## PopcornMachine (Nov 6, 2013)

There is an article on the store samples at Tech Report: http://techreport.com/news/25609/up...9-290x-cards-may-be-slower-than-press-samples

One poster there showed his card throttling down to ~750 MHz until he upped the fan speed to 50%.

Probably due to the local case/environment, or it may be that some chips are more sensitive to the 95°C temp.

If I get a 290, and I probably will, it will get a water block.


----------



## Eroticus (Nov 6, 2013)

PopcornMachine said:


> There is an article on the store samples at Tech Report: http://techreport.com/news/25609/up...9-290x-cards-may-be-slower-than-press-samples
> 
> One poster there showed his card throttled down to ~750MHz until he upped the fan speed to 50%.
> 
> ...




"More likely is that whoever got a bad results didn't have enough cooling and that the card went into thermal throttling."



> Variations exist, but do not affect everyone
> With another day of data at hand, it is clear that differences between graphics cards exist. The PowerColor sample combines a slightly slower fan with a slightly higher GPU voltage, which together result in lower frequencies once the temperature limit kicks in.
> 
> This, however, is limited to Quiet mode, where the fan speed is locked to a maximum of 40 percent. In Über mode, which is what SweClockers uses for reviews, the performance difference is nonexistent: both the press sample and the PowerColor card can run at peak frequency even for long periods.
> ...




AMD's reference coolers have always sucked... the only decent cooler AMD ever made was the one on the 7990, that's all.


----------



## manofthem (Nov 6, 2013)

PopcornMachine said:


> if I get a 290, and I prob will, it will get a water block.



Do it, do it, do it


----------



## mastershake575 (Nov 6, 2013)

Few things I noticed:

1. Third-party cooler versions should be out in the next couple of weeks, so I don't know why people are jumping ship. I would rather wait 3-5 weeks and pay only $25-50 more to get better temps and a guaranteed 1050-1100 MHz overclock (a third-party 290 at $425-450 clocked at 1050-1100 MHz with good temps seems like a no-brainer, and I would for sure wait for that). 

2. The $400 price tag gets me excited, since I won't have to upgrade until 20nm comes out (mid to late next year). Either the 290 or the 20nm equivalent should easily be $300 by late next year (if CrossFire users are having good results, I might just have to buy two at that price).


----------



## The Von Matrices (Nov 6, 2013)

Eroticus said:


> A Swedish magazine already debunked this. They bought over the counter samples and tested against review samples and they were just a fast. Absolutely no difference.
> 
> More likely is that whoever got a bad results didn't have enough cooling and that the card went into thermal throttling.
> 
> http://www.sweclockers.com/artikel/...der-for-golden-samples-sweclockers-undersoker





Blín D'ñero said:


> So you haven't read the article you link to? You are not aware of the extensive update on that page?



I don't know if anyone else read the entire article, but it specifically talks about AMD making the 290X *louder* in the next driver update, to be released tomorrow.  To quote (translated):



			
Sweclockers said:

> At AMD, work is reportedly underway on the problem. The company has a solution on the table and promises a driver update within 24 hours which, by its own admission, should minimize the variances.
> 
> We will be releasing a driver in the next 24 hours that corrects this behavior by normalizing all fan behavior. The 290X in Quiet mode should be at 2200 rpm, and the 290 should be at 2650 rpm.
> 
> ...



Their press copy had too low a fan speed, which resulted in less performance in Quiet mode but also less noise (the fan speed on their press card was 10% lower than the setpoint, and on the retail card 13% lower).  *That means that for many users, the driver update will make their cards louder* than they already were before the update.  10-13% may not seem like much, but it is certainly not welcome on an already loud card.
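To put those percentages in perspective, a quick sketch; the 2200 rpm Quiet-mode setpoint is from the AMD statement quoted above, and the 10%/13% shortfalls are the cited measurements. Note the relative fan-speed increase users will hear after the fix is slightly larger than the shortfall itself:

```python
SETPOINT_RPM = 2200  # 290X Quiet-mode fan setpoint per the quoted AMD statement

def increase_after_fix_pct(shortfall_pct: float) -> float:
    """If a card ran shortfall_pct below setpoint, the driver raises the fan
    back to setpoint; return the relative rpm increase the user will hear."""
    measured = SETPOINT_RPM * (1 - shortfall_pct / 100)
    return 100 * (SETPOINT_RPM - measured) / measured

print(round(increase_after_fix_pct(10), 1))  # press card: ~11.1% faster fan
print(round(increase_after_fix_pct(13), 1))  # retail card: ~14.9% faster fan
```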

W1zzard, when the update ships, could you retest the 290X's performance and noise in Quiet mode, and the noise level in Uber mode?  They may be higher than when you tested them originally.


----------



## Am* (Nov 6, 2013)

All I can say is thanks AMD. Now we have $500 performance in a $400 card, minus 5% here or there.


Your move Nvidia; this card just demolished the value of your entire mid-high end lineup -- hope you're prepared for another wave of price cuts, because in another month or so you won't have a hope in hell of charging a $20 premium against any AMD card of similar performance, let alone $100. 780 Ti hits tomorrow and we'll see if it is of any value to anyone, because from what I've read so far ($699 price for a 3GB SKU), it looks like another worthless e-peen card not worth jackshit to anyone. If it is as overpriced as I think it will be, I will more than likely be switching camps this time -- even if I have to put my 3D Vision crap aside for another while.


----------



## Fluffmeister (Nov 6, 2013)

nVidia cards have been selling just fine with a hefty price premium for donkey's years, even with AMD bundling countless games.

Now AMD... stick a decent cooler on your cards, thanks.


----------



## radrok (Nov 6, 2013)

Fluffmeister said:


> nVidia cards have been selling just fine with a hefty price premium for donkeys already, even with AMD bundling countless games too.



That's because there was a huge performance gap before the 290x/290.


----------



## dom99 (Nov 6, 2013)

Got my 290 today, can't wait to stretch its legs. I'm aiming for 1.1 GHz on stock voltage, like I've seen quite a few others hitting.


----------



## Casecutter (Nov 6, 2013)

The Von Matrices said:


> I'm not so sure about your timeline being reasonable. GTX 680 & GK104 were released in March 2012 and shocked AMD with its performance (it shocked NVidia too). Yes, AMD released the 7970 GHz edition to compete, but surely they had to know that NVidia still had GK110 to release and that Tahiti wasn't going to be able to compete with it.


I don't think AMD really saw a GK110 on the gaming market as a reality until, say, mid Q3 2012, when Tesla cards started to come to light and showed the promise of great efficiency. I think AMD knew Nvidia might need time to amass binned chips and to see what yields would look like: would they use chips with 12-14 SMXs enabled in anything less than the professional SKUs? If they had to use even more heavily cut-down parts, would the perf/watt still look good? AMD was looking toward the end of 2013 with a 20nm part...

Then, as Mathragh said, both shoes dropped near the end of 2012!  TSMC communicated that 20nm risk production was pushed back to the end of Q1 2014 (now some figure later); Nvidia got the same memo, but Kepler wasn't as affected. Titan (14 SMX) was pushed forward to ship within 3 months, while they amassed a war chest of 12-SMX parts that didn't provide the efficiency required of the Tesla professional products.  

AMD needed something it could rebrand, but also a true enthusiast part to anchor the cascading product stack it had planned. So the whole "we won't be releasing any new cards for the bulk of 2013" line went out. The other thing was that Titan seemed overwhelming at first, but 25% more performance for $600 wasn't that intimidating. AMD was already designing the 20nm layout to spin on 28nm production.  

http://www.techpowerup.com/180009/no-new-gpus-from-amd-for-the-bulk-of-2013.html


----------



## Recus (Nov 7, 2013)

Blín D'ñero said:


> So you haven't read the article you link to? You are not aware of the extensive update on that page?



He thinks that denying the problem fixes the problem.



Am* said:


> http://i290.photobucket.com/albums/ll264/ahrinin/MotherOfGod.jpg
> 
> All I can say is thanks AMD. Now we have $500 performance in a $400 card, minus 5% here or there.
> 
> ...



PMS again.


----------



## Tatty_One (Nov 7, 2013)

theoneandonlymrk said:


> 9-15% actually but effective well.



So when non-reference coolers do become available, that becomes very realistic compared with, say, the performance difference between the Titan and 780.


----------



## mhadina (Nov 7, 2013)

*bottleneck*

How stupid would it be to pair it with an overclocked AMD 4-core APU and 8 gigs of RAM?


----------



## Mussels (Nov 7, 2013)

mhadina said:


> How stupid would it be to pair it with an overclocked AMD 4-core APU and 8 gigs of RAM?



i dont think you'd get 100% out of the GPU, but it would still be plenty fast.


----------



## Tatty_One (Nov 7, 2013)

mhadina said:


> How stupid would it be to pair it with an overclocked AMD 4-core APU and 8 gigs of RAM?



Depends on which APU; some will overclock to 4.5 GHz+, so there should not be any bottleneck there, though at stock, possibly. To be fair, some lower-end 4-core Intel offerings that ship at much lower clocks than the A8s and A10s, despite being faster clock for clock, would also be a bottleneck to some degree.


----------



## jormungand (Nov 7, 2013)

Steevo said:


> Your post reminds me of someone who buys a sports car but can't afford to drive it.



I've been running $400+ cards for a long time. I always wait for the coolest and best-performing card, and at the end of the war I make my choice. What I stated is that the non-reference cards may come out better built for all concerned. IMO!!!!


----------



## Am* (Nov 7, 2013)

Recus said:


> PMS again.



I know, I know -- it's your time of the month yet again, so by all means continue to pointlessly quote other people's posts and forget the fact that this is a USER FORUM with people posting their opinions on stuff -- which you can ignore, debate against or kindly GTFO. You're excused for this week, or until your tampon tantrum blows over, whichever comes first.


----------



## ChristTheGreat (Nov 7, 2013)

I don't know if it has been posted, but: http://www.techspot.com/review/736-amd-radeon-r9-290/page8.html



> I took the IceQ X2 cooler off the HIS Radeon R9 280X and stuck it on our R9 290 sample. Cooling was dramatically improved. The FurMark stress test maxed out at 76 degrees while the card never exceeded 63 degrees in Crysis 3 and Battlefield 4. So it seems as expected the board partners will be able to solve the heat issues of the reference card.



It's interesting.


----------



## Mathragh (Nov 8, 2013)

Another way to go might be: 

http://i.imgur.com/aSzp66V.jpg

Apparently someone at techreport did this and had no more throttling problems, even with lower noise. lol.

Though one has to wonder if it's worth the lost warranty, and whether there isn't a cleaner way of achieving this, like getting rid of the whole bracket.


----------



## RCoon (Nov 8, 2013)

mhadina said:


> How stupid would it be to pair it with an overclocked AMD 4-core APU and 8 gigs of RAM?



<- see system specs + 8GB extra RAM.
1440p and 0 complaints.



Mathragh said:


> Another way to go might be :
> http://i.imgur.com/aSzp66V.jpg
> 
> 
> ...



Link? I'd love to know just how much thermal improvement there was because of this.


----------



## Mathragh (Nov 8, 2013)

RCoon said:


> Link? I'd love to know just how much thermal improvement there was because of this.



He didn't post benchies, just made the claim that it solved all his crossfire problems. I would also like to see some benchies, tbh. lol. 
However, I suspect it's a bit too late for any comparison between the former state and the current state.


----------



## HTC (Nov 8, 2013)

Mathragh said:


> Another way to go might be :
> http://i.imgur.com/aSzp66V.jpg
> 
> 
> ...



Quick: someone SEND THIS INFO to AMD 


I did something similar to my Obsidian 650D case by cutting open the front vent mesh that came with it, as even with the fix Corsair sent me, the noise was too noticeable for my liking. This way, the noise went down quite a bit, to much more tolerable levels.


----------



## Steevo (Nov 8, 2013)

Mathragh said:


> Another way to go might be :
> http://i.imgur.com/aSzp66V.jpg
> 
> 
> ...



I have never liked the AMD/ATI EMI shield plate. Perhaps someone should do a completely open one with carbon fiber to improve the look and flow


----------



## erocker (Nov 8, 2013)

Mathragh said:


> Another way to go might be :
> http://i.imgur.com/aSzp66V.jpg
> 
> 
> ...



That's not a bad idea. I think it could have been done much cleaner had they taken a little more time. It will be interesting to see if anyone else does this.


----------



## N3M3515 (Nov 9, 2013)

About the noise catastrophe everyone is talking about: my own card, in its review here on TPU, is rated 48 dB at full load; that's 1 dB less than the R9 290 and 2 dB less than the X variant.
I honestly can't even hear the card's fan when playing...
link: http://www.techpowerup.com/reviews/MSI/HD_7870_HAWK/27.html
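For what a 1-2 dB gap means physically, here's a small sketch using the standard dB conversions (perceived loudness is subjective, as the replies note; a rough rule of thumb is that +10 dB sounds about twice as loud):

```python
def pressure_ratio(delta_db: float) -> float:
    """Sound-pressure ratio for a given dB difference (20 dB per decade)."""
    return 10 ** (delta_db / 20)

def power_ratio(delta_db: float) -> float:
    """Acoustic-power ratio for a given dB difference (10 dB per decade)."""
    return 10 ** (delta_db / 10)

# 48 dB vs. 50 dB: a 2 dB gap
print(round(pressure_ratio(2.0), 2))  # ~1.26x the sound pressure
print(round(power_ratio(2.0), 2))     # ~1.58x the acoustic power
```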


----------



## HTC (Nov 9, 2013)

N3M3515 said:


> *About the noise catastrophe everyone is talking about, my own vcard on the review here in tpu is rate 48db at full load, that's 1 point less than R9 290 and 2 less than the X variant.*
> I honestly can't even hear the vcard's fan when playing...
> link: http://www.techpowerup.com/reviews/MSI/HD_7870_HAWK/27.html



Not necessarily so, dude: different people can use the same cards and have different opinions about the noise they make, regardless of the actual sound level.

Think of it this way: noise is in the ear of the listener


----------



## qubit (Nov 9, 2013)

N3M3515 said:


> About the noise catastrophe everyone is talking about, my own vcard on the review here in tpu is rate 48db at full load, that's 1 point less than R9 290 and 2 less than the X variant.
> I honestly can't even hear the vcard's fan when playing...
> link: http://www.techpowerup.com/reviews/MSI/HD_7870_HAWK/27.html



A lot of it depends on whether the fans make more of a windrush sound, or if they have an irritating whine or buzz to them.

It also depends on the particular sample you have, your case and your own perception of noise along with the environment that card is used in.



HTC said:


> Not necessarily so, dude: different people can use the same cards and claim different opinions about the noise it makes, regardless of the actual sound level.
> 
> Think of it this way: noise is in the ear of the listener



And this too.


----------



## N3M3515 (Nov 10, 2013)

Well, at least I now know what all the fuss is about.


----------



## rangerone766 (Nov 10, 2013)

This or the 290X will be my Xmas present to myself this year, along with a full-cover water block. I couldn't care less about the noise; I just need a beast of a card to carry me another few years. My GTX 470 is still running strong, still maxes out BF3 and gives a steady 62 fps with vsync on.


----------



## PopcornMachine (Nov 10, 2013)

My 290 does make what I would call a loud hum at full load.

But it does not sound like the turbine that my 6950 did with the stock cooler.

However, it got a waterblock for that precise reason.  

This one will too with the added benefit of performing more consistently.


----------



## SIGSEGV (Nov 13, 2013)

wrong place
---


----------



## purecain (Nov 14, 2013)

will it take a 290x bios and open up those shaders... that's the question...


----------



## xorbe (Nov 14, 2013)

purecain said:


> will it take a 290x bios and open up those shaders... that's the question...



Yes and no respectively, iirc.


----------



## manofthem (Nov 14, 2013)

purecain said:


> will it take a 290x bios and open up those shaders... that's the question...



Xorbe said it. W1zz's answer, from another thread:


W1zzard said:


> no, the shaders are not locked via bios, I checked on my press board


----------



## Eroticus (Nov 16, 2013)

AMD R9-290 Shader Unlock - YouTube


Original Source 

http://hwzone.co.il/hardware/news/142169

Did these idiots make the same mistake again???? xD ....


----------



## H82LUZ73 (May 17, 2014)

Just got an AMD email, and here is the web version. Notice anything? I think it's cool to see our W1zz being quoted by them in it: http://links.em.experience.amd.com/...=NzI2ODE0NDk4NTIS1&j=MzIwOTQ1NzkxS0&mt=1&rt=0


----------

