# Choose R9 290 Series for its 512-bit Memory Bus: AMD



## btarunr (Dec 3, 2014)

In one of its first interviews since the GeForce GTX 900 series launch, AMD maintained that its Radeon R9 290 series products are still competitive. Speaking with TweakTown, Corporate Vice President of Global Channel Sales Roy Taylor said that gamers should choose the Radeon R9 290X "with its 512-bit memory bus" at its current price of US $370. He stated that the current low pricing of the R9 290 series is due to "ongoing promotions within the channel," and that AMD didn't make an official price adjustment on its end. Taylor dodged questions on when AMD plans to launch its next high-end graphics products, whether they'll level up to the GTX 900 series, and whether AMD is working with DICE on "Battlefield 5." You can find the full interview at the source link below.





*View at TechPowerUp Main Site*


----------



## dj-electric (Dec 3, 2014)

There's no need to tell AMD that the GTX 900 launch critically damaged the R9 290 series' viability. They know that. Luckily, the R9 290's street price is at about $270, which is somewhat compelling to those who insist on red; it still has the better price-to-performance ratio. Now, power, drivers, and at-launch optimization? Well...

All that's left to do is wait for their next series, I guess.


----------



## ZoneDymo (Dec 3, 2014)

damn it amd, I want a new card, GET TO IT


----------



## Steevo (Dec 3, 2014)

Considering the number of high-end cards sold out again, both companies are doing well enough.


----------



## KarymidoN (Dec 3, 2014)

When AMD learns to optimize energy consumption and makes a decent stock cooling system, then Nvidia will have competition. Today AMD has stratospheric energy consumption, poorly optimized drivers, and ridiculously high temperatures (I have a CrossFireX setup).


----------



## Jstn7477 (Dec 3, 2014)

The GM204 is a little lean on memory bandwidth, even with 8GHz GDDR5 (I can hit ~90% memory controller utilization in Furmark at ~1300MHz core clock), but it still eats my R9 290 for breakfast.

It's like saying "choose AMD FX-8350 for 8 integer cores" when a Core i5 has the same amount of FPUs and is completely competitive with half the integer cores.
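Jstn7477's bandwidth point is easy to put in numbers: peak memory bandwidth is just the bus width in bytes multiplied by the effective memory data rate. A quick sketch using the two cards' stock effective GDDR5 rates (this ignores real-world efficiency and Maxwell's color-compression tricks):

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * data_rate_gtps

# R9 290X: 512-bit bus, 5 GT/s effective GDDR5
print(peak_bandwidth_gbps(512, 5.0))  # 320.0 GB/s
# GTX 980: 256-bit bus, 7 GT/s effective GDDR5
print(peak_bandwidth_gbps(256, 7.0))  # 224.0 GB/s
```

So Hawaii really does have over 40% more raw bandwidth on paper; the catch, as the benchmarks show, is that raw bandwidth alone doesn't decide game performance.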


----------



## phanbuey (Dec 3, 2014)

If you can't get an R290, you can maybe find a 2900XT somewhere, that too has a 512-bit memory bus, and everyone knows more bits is better.


----------



## LinkPro (Dec 3, 2014)

I would have gotten an R9 290 or 290X had I not known about their black screen issue and all the driver-related problems. Did research for a couple months and it still seems like "I will get a beast of a card if I get lucky". I was on AMD for 2 generations - 5830 and 6950 - the 5830 had the black screen problem and driver crashes for a while and it was really annoying while the 6950 was a beast. Until AMD finally figures out how to write proper drivers I will stay with nVidia. It's 2014 and they still manage to churn out drivers that work worse compared to previous versions. Surely nVidia has problems too but they don't seem to affect me that much.


----------



## ManofGod (Dec 3, 2014)

LinkPro said:


> I would have gotten an R9 290 or 290X had I not known about their black screen issue and all the driver-related problems. Did research for a couple months and it still seems like "I will get a beast of a card if I get lucky". I was on AMD for 2 generations - 5830 and 6950 - the 5830 had the black screen problem and driver crashes for a while and it was really annoying while the 6950 was a beast. Until AMD finally figures out how to write proper drivers I will stay with nVidia. It's 2014 and they still manage to churn out drivers that work worse compared to previous versions. Surely nVidia has problems too but they don't seem to affect me that much.



I have had an X850 Pro, X1900 Pro, 2 x 2900 Pro Crossfire, HD 4870 - HD 4890 Crossfire, 2 x HD 6950 and I now have a R9 290X Reference model. (XFX) So far, I have not had any driver issues, black screens, gray screens, BSODs (Unless I created them myself) nor any other issues. The drivers are actually quite good so far, not perfect but good enough for me.

Last Nvidia product I owned was a 6600GT. I just do not see any reason to switch when I have not personally had any issues. Also, going with the 512 Bit bus on these cards is a good reason to get one if you need the bandwidth. Oh, and I did have a 9800 Pro back in the day as well without any issues that I can remember. Honestly, I think problems come down more to hardware combinations than anything else.


----------



## Fx (Dec 3, 2014)

Dj-ElectriC said:


> There's no need to tell AMD that the GTX 900 launch critically damaged the R9 290 series' viability. They know that. Luckily, the R9 290's street price is at about $270, which is somewhat compelling to those who insist on red; it still has the better price-to-performance ratio. Now, power, drivers, and at-launch optimization? Well...
> 
> All that's left to do is wait for their next series, I guess.



I can't remember the last time I had an issue with AMD drivers. Oh wait, yeah I can, 4 years ago in 2010, and it was an easy fix. Before that, I can't remember...


----------



## cedrac18 (Dec 3, 2014)

The $190 used price on eBay is my price point. Thank you Nvidia; I have never and will never pay more than $200 for a single component.


----------



## Lionheart (Dec 3, 2014)

Lolz thanks AMD but I will stick with my GTX 970 

Give me Windows 10, DX12, more Mantle support, a 12- or 16-core CPU based on the Excavator architecture, games that actually take advantage of those cores, and the R9 390 series, and I will definitely consider purchasing your hardware. Nom nom!


----------



## Sony Xperia S (Dec 3, 2014)

Avago, Freescale, LG, MediaTek, NVIDIA, Renesas, and Xilinx are all on the list with projects waiting for TSMC's 16 nm mass production later next year.

Where is AMD? 

Speaking nonsense. :lol:


----------



## jigar2speed (Dec 3, 2014)

This looks bad, Roy has no idea what he is talking about.


----------



## qubit (Dec 3, 2014)

phanbuey said:


> If you can't get an R290, you can maybe find a 2900XT somewhere, that too has a 512-bit memory bus, and everyone knows more bits is better.


lol, that's funny, I nearly spilled my coffee. I have a 2900XT with its 512-bit bus and an 8800 GTX with only a 384-bit bus and the 8800 GTX eats it for breakfast.

AMD had better up the overall performance of its current cards rather than trying to bamboozle techies with bus width.


----------



## SteveS45 (Dec 3, 2014)

Personally, I don't think AMD is the only camp with driver issues. Both camps, in my opinion, are equally meh.
I have two systems: one with an R9 280X and an HD 7970 in CrossFire, and one with a new MSI Gold (Bronze) edition GTX 970.

The GTX 970 has been having a lot of problems over DisplayPort, with screen tearing after coming back from sleep. Google "GTX 900 DisplayPort tearing/black screen"; a lot of people have the same problem. And sometimes switching from Nvidia Surround to normal triple-monitor mode, or vice versa, causes a BSOD on Windows 7.

On the HD 7970 side, I wouldn't say AMD has better or flawless drivers; we all know they don't. But I don't see the Nvidia drivers as superior in any way.

So I think that driver- and feature-wise, both camps are equally meh.


----------



## GAR (Dec 3, 2014)

It's all marketing; of course I'm going to tell you my product is better than my competition's. That being said, the 290 is a good card, but it produces too much heat and uses too much power (100 watts more, to be exact), heating up other components in your PC because of the massive heat the PCB produces. The GTX 970 is the sweet spot right now for a GPU, and the 512-bit bus is meaningless when a card with a 256-bit bus beats it; even at 4K the GTX 980 is equal to or faster than the 290X. Not to mention the GTX 970/980 overclock like beasts: 1400 MHz+ on all cards, with most hitting 1500+ on stock cooling, even the reference models. G-Sync is an amazing feature, and I for one cannot go back to a non-G-Sync monitor.


----------



## Kaapstad (Dec 3, 2014)

I use both GTX 980s and R9 290Xs.  I have 4 of each and find the 980s are better most of the time @1080p and the 290Xs are better most of the time @4K.

With these cards it comes down to what games and resolutions you use as there is no clear winner.


----------



## f2bnp (Dec 3, 2014)




----------



## chinmi (Dec 3, 2014)

I changed from a 290X to a 970. It may not be faster, but it sure is more convenient.
For me the 970 is cooler, consumes fewer watts, and is quieter than my 290X.
All of those non-performance improvements really make the 970 beat my old 290X.
And sadly, my experience with the 290X was not great. First I got one with Elpida memory, which is prone to artifacts at high memory clocks, and I got the black screen problem under heavy use. It's not a PSU issue, because my AX1200i is enough to power two of those cards. Thankfully I could RMA my first card and get a new one, but it was still an Elpida one, so FML. I RMA'd it again and asked for a non-Elpida one, and finally I got one. That card is perfect: no black screens or artifacts. But the noise and heat really turned me off, so that's why I quickly switched to a 970 when I got the chance.
The 290X sure has more bits, but it's not faster; it's louder, it's hotter, and it consumes more power.
So for me, 512-bit vs. 256-bit is not a deal breaker.


----------



## Animalpak (Dec 3, 2014)

AMD, show me your FreeSync tech and do a better job cooling down your GPUs, and I may think about buying a Radeon card.


----------



## Eukashi (Dec 3, 2014)

When HBM technology lands in Radeon, the memory bandwidth problem will be solved.
There will be no need to enlarge the secondary cache the way Maxwell does.


----------



## Naito (Dec 3, 2014)

btarunr said:


> AMD maintained that its Radeon R9 290 series products are still competitive.



This to me sounds like they are in no rush to bring out their next generation of GPUs.


----------



## eroldru (Dec 3, 2014)

f2bnp said:


>


Yeah, and I wonder why people are so mean to AMD. At least they try to please the gaming community with low prices and good products. Maybe they're not the best, but the price is great. Hell, people, do you remember the $700 GTX 780 Ti? And the $3,000 Titan Z?


----------



## Sony Xperia S (Dec 3, 2014)

Naito said:


> This to me sounds like they are in no rush to bring out their next generation of GPUs.





eroldru said:


> Yeah, and I wonder why people are so mean to AMD. At least they try to please the gaming community with low prices and good products. Maybe they're not the best, but the price is great. Hell, people, do you remember the $700 GTX 780 Ti? And the $3,000 Titan Z?



AMD's products at the moment are competitively inferior, and that's why the prices are what they are; both companies are responsible for that.

The bad things for AMD are yet to come if they are indeed "not in a rush".


----------



## Aquinus (Dec 3, 2014)

Sony Xperia S said:


> AMD's products at the moment are competitively inferior, and that's why the prices are what they are; both companies are responsible for that.
> 
> The bad things for AMD are yet to come if they are indeed "not in a rush".


Performance graphics are not the majority of AMD or nVidia sales. Also power consumption isn't too far off the mark (even though the AMD GPU does consume more.) Ever use an APU? They're nifty little CPUs that most consumers will be happy with. So maybe you should take your fanboy hat off and understand that AMD and nVidia have a lot more markets than just performance 3D.

All in all, I think memory interface width has little to do with my decision to buy a new GPU. If AMD doesn't release something new and half decent soon, I'll be switching to the green camp. Not because AMD is bad, but because nVidia (like Intel) has been making decent progress unlike AMD who seems to be milking everything for what they're worth.

Also, on the side: I could never see myself paying more than 300-350 USD for a GPU, which puts the GTX 970 in a really sweet spot compared to AMD's aging products.


----------



## Eroticus (Dec 3, 2014)

Useless comments, AMD...

Just release the 390X =] and make the green team cry again for one more year =D


----------



## arbiter (Dec 3, 2014)

eroldru said:


> Yeah, and I wonder why people are so mean to AMD. At least they try to please the gaming community with low prices and good products. Maybe they're not the best, but the price is great. Hell, people, do you remember the $700 GTX 780 Ti? And the $3,000 Titan Z?



Well, the 290X would probably still be selling at $500+ if Nvidia hadn't released the GTX 970 at $330. So the most recent price drops were forced by Nvidia.

Nvidia likely kept the bus at 256-bit on 28 nm Maxwell to keep the chip small and cheap to make. You can probably expect the die shrink to 20/16 nm (whichever it is) to carry at least a 384-bit bus, maybe even 512-bit. One thing's for sure, though: Nvidia is not sitting on its arse doing nothing.


----------



## NC37 (Dec 3, 2014)

Jstn7477 said:


> It's like saying "choose AMD FX-8350 for 8 integer cores" when a Core i5 has the same amount of FPUs and is completely competitive with half the integer cores.



The i5 can't match the FX in heavy multitasking. The only area where an FX actually matches an i7, or beats some i7s, is heavy multithreading: video encoding, etc. If AMD had Intel's per-core performance along with their multithreading ability, Intel would be blown right out of the market.

And yeah, everyone likes to rag on AMD for falling so far behind. But if AMD weren't around, Intel would probably still be floundering with Pentium 4s, nVidia would still be riding G92s forever, and who knows what chips the current consoles would be using. Not to mention Intel would never have started investing in IGPs that don't insanely suck.

Competition is good, even if AMD kinda sucks at it right now.


----------



## arbiter (Dec 3, 2014)

NC37 said:


> The i5 can't match the FX in heavy multitasking. The only area where an FX actually matches an i7, or beats some i7s, is heavy multithreading: video encoding, etc. If AMD had Intel's per-core performance along with their multithreading ability, Intel would be blown right out of the market.



Might want to check some benchmarks; the FX doesn't beat an i7. Even the 9590 at 5 GHz still loses to a 4770K at stock clocks. Same as Nvidia, Intel likely isn't sitting around either.


----------



## dj-electric (Dec 3, 2014)

Fx said:


> I can't remember the last time I had an issue with AMD drivers. Oh wait, yeah I can, 4 years ago in 2010, and it was an easy fix. Before that, I can't remember...



Then allow others to remind you: there were, and still are, problems that are user-wide, stuff that nobody with a red card could avoid.


----------



## bobalazs (Dec 3, 2014)

Except for the fact that the GTXes don't need that wide a memory bus; they're built to run better without it.


----------



## las (Dec 3, 2014)

Who cares if it's 512-bit when it loses to 256-bit cards in pretty much all games, while using more watts and running hotter? It takes a lot more fan RPM to cool these GPUs compared to Maxwell.

My old 290 was a terrible card. My new 970 is whisper quiet even at 1567/2000, and the performance in Far Cry 3, Dragon Age: Inquisition, and BioShock Infinite, which are the games I'm playing right now, is much better, with zero stuttering and strange FPS drops.

AMD needs to step up their game. I seriously hope their upcoming GPUs do better than the 290 series. I've had many AMD cards, but the 290 series was the worst of all time, IMO. It seemed so rushed to counter GK110.


----------



## Eroticus (Dec 3, 2014)

las said:


> Who cares if it's 512-bit when it loses to 256-bit cards in pretty much all games, while using more watts and running hotter? It takes a lot more fan RPM to cool these GPUs compared to Maxwell.
> 
> My old 290 was a terrible card. My new 970 is whisper quiet even at 1567/2000, and the performance in Far Cry 3, Dragon Age: Inquisition, and BioShock Infinite, which are the games I'm playing right now, is much better, with zero stuttering and strange FPS drops.
> 
> AMD needs to step up their game. I seriously hope their upcoming GPUs do better than the 290 series. I've had many AMD cards, but the 290 series was the worst of all time, IMO. It seemed so rushed to counter GK110.



Yeah!!! Finally, after one year, Nvidia beats AMD in single-GPU!!! Wow, amazing!!! Please remember that the 295X2 is still the most powerful single card on the market, and the last time Nvidia held that spot was with the 690, three years ago? =O

I've owned my 290X for over a year... and you guys got your new cards months ago.

I think, yeah, you finally deserve to control the market after one long year =] for some months, like always....

Finally the green team is not only posting useless facts about "but Nvidia saves energy" blabla...


----------



## sergionography (Dec 3, 2014)

Dj-ElectriC said:


> Then allow others to remind you: there were, and still are, problems that are user-wide, stuff that nobody with a red card could avoid.



Yes, because you go online to some forum about a common problem, and that automatically means "everyone has problems with red." I have owned mostly Nvidia cards since the days of the FX and had my share of troubles too. I had a GTS 250 that gave me a heck of a lot of driver headaches; then I got an HD 5770, which was flawless and is still running today in one of my older rigs. After the 5770 I built a newer rig with a GTX 460, which also had many driver issues and died on me after two years, starting with screen artifacts until it started BSODing and became useless. Now, to be honest, even with all of this I won't go out of my way to bash either AMD or Nvidia, because while I told you about these problems, I hardly told you what caused them. First and foremost, on both the GTS 250 and the GTX 460 I used to overclock the life out of them, and on those specific rigs my Windows install was completely bloated, as it was kind of a test rig where I constantly switch hardware around. So before you keep blaming AMD or Nvidia, just remember how fragile Windows' driver system is in general; unless you have a fresh, clean Windows install when you get new hardware, don't complain about drivers.

And as for the previous comments about how every Nvidia fan here is obsessed with efficiency: it's completely beyond me how all of a sudden that's the number-one priority, because if that's the case then we can easily say AMD was completely superior up until Kepler. Though I hardly think any Nvidia fan will see it that way, because efficiency only matters if Nvidia does it. Totally biased, for sure.

As for everyone complaining about the 290X being inefficient: no card is efficient when clocked to its limits (everyone is comparing uber mode). Remember AMD has a configurable TDP, which no one seems to be considering here. Set the TDP to 175-200 W and then compare it to a GTX 970; I'm sure performance won't be too far off, since in the clock-speed world that extra 10-15% clock speed at the top could very well mean 30%+ extra power consumption.
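That last clock-versus-power claim follows from the usual first-order dynamic-power model, P ∝ f·V², since a higher clock generally also requires a higher voltage. A sketch with illustrative numbers (the +12% clock / +8% voltage pairing is an assumption for the example, not a measured figure):

```python
def power_scale(clock_ratio: float, voltage_ratio: float) -> float:
    """First-order dynamic power model: P is proportional to f * V^2."""
    return clock_ratio * voltage_ratio ** 2

# Hypothetical: a 12% clock bump that needs an 8% voltage bump
increase = (power_scale(1.12, 1.08) - 1) * 100
print(f"{increase:.0f}% more power")  # prints "31% more power"
```

Real chips also have static leakage, so the actual curve near the top of the voltage/frequency range is usually even steeper than this model suggests.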


----------



## las (Dec 3, 2014)

Eroticus said:


> Yeah!!! Finally, after one year, Nvidia beats AMD in single-GPU!!! Wow, amazing!!! Please remember that the 295X2 is still the most powerful single card on the market, and the last time Nvidia held that spot was with the 690, three years ago? =O
> 
> I've owned my 290X for over a year... and you guys got your new cards months ago.
> 
> ...



Ehh... the 780 Ti beat the 290X. Sigh.
http://www.techpowerup.com/reviews/AMD/R9_295_X2/24.html

Even an OCed 780 rivaled an OCed 290X, because the scaling and OC headroom were better.

The 295X2 is the worst dual card in recent times: the pump rattles and the coil whine is insane.

Proof:

http://www.hardwarecanucks.com/foru...2-graphics-card-coil-whine-investigation.html

http://www.techpowerup.com/reviews/AMD/R9_295_X2/23.html

35 dB at idle, lmao. But who buys dual cards anyway...


----------



## Cataclysm_ZA (Dec 3, 2014)

Kaapstad said:


> I use both GTX 980s and R9 290Xs.  I have 4 of each and find the 980s are better most of the time @1080p and the 290Xs are better most of the time @4K.
> 
> With these cards it comes down to what games and resolutions you use as there is no clear winner.



You have four each for now, until you're also subject to our racist load shedding and Eskom claims them all after some lovely surges.


----------



## Eroticus (Dec 3, 2014)

las said:


> Ehh... the 780 Ti beat the 290X. Sigh.
> http://www.techpowerup.com/reviews/AMD/R9_295_X2/24.html
> 
> Even an OCed 780 rivaled an OCed 290X, because the scaling and OC headroom were better.
> ...




Do you remember how much the 780 Ti 3GB cost vs. the 290X? $200-250 more?

Pump what? I haven't heard of or had any problem with the AMD pump.

35 dB? Are you using the stock fans? Really? If you have enough money for this card, you have enough money to replace the stock fans.

I think the temperatures are pretty amazing for a dual-Hawaii GPU with "pump problems" and stock fans...

"Meh, who's using dual cards?"

Not poor people like you, for sure =]


----------



## las (Dec 3, 2014)

Eroticus said:


> Do you remember how much the 780 Ti 3GB cost vs. the 290X? $200-250 more?
> 
> Pump what? I haven't heard of or had any problem with the AMD pump.
> 
> ...



You are clueless. It's the pump, not the fans, that is noisy, like on all cheap CLCs. Besides that, the coil whine is the real issue here.

Yeah, I'm sooo poor. That's why I'm only using an i7-5930K at 4.8 GHz on custom water.


----------



## Eroticus (Dec 3, 2014)

las said:


> You are clueless. It's the pump, not the fans, that is noisy. Besides that, the coil whine is the real issue here.
> 
> Yeah, I'm sooo poor. That's why I'm only using an i7-5930K at 4.8 GHz on custom water.



Ahh, that's why you are crying ;P Two stock 295s still put up a better score than your custom water-cooled setup =[ Awww... But OK, blame the ones who buy dual cards for being "useless" while you run a custom water-cooling setup and get 2 fps more for it =]


----------



## las (Dec 3, 2014)

Eroticus said:


> Ahh, that's why you are crying ;P Two stock 295s still put up a better score than your custom water-cooled setup =[ Awww... But OK, blame the ones who buy dual cards for being "useless" while you run a custom water-cooling setup and get 2 fps more for it =]



 You are the one whining here


----------



## Frick (Dec 3, 2014)

arbiter said:


> Might want to check some benchmarks; the FX doesn't beat an i7. Even the 9590 at 5 GHz still loses to a 4770K at stock clocks. Same as Nvidia, Intel likely isn't sitting around either.



I just want to point out that there are actually benchmarks in which the FX is faster than even a 3960X, and some more where the FX and the 3770 trade punches, and then there are ones where the FX is slower than the lowliest i3. How useful is this in real life? Only the customer can answer that.

EDIT: And the fanboyism (a word I hate because it's stupid, but it actually makes sense to use it here) in this forum is getting dumb. Just look at your wallet, then look at some data, then purchase whatever gets your job done in the shortest amount of time or gives you the biggest numbers. Or buy from the company you're loyal to, but for the love of god be realistic about it.


----------



## ZoneDymo (Dec 3, 2014)

arbiter said:


> Might want to check some benchmarks; the FX doesn't beat an i7. Even the 9590 at 5 GHz still loses to a 4770K at stock clocks. Same as Nvidia, Intel likely isn't sitting around either.



What I find so hilariously stupid is that people compare two products in totally different price classes. You don't compare a Honda S2000 to a Ferrari Enzo, do you? Both are sports cars, but that's about it.


----------



## ZoneDymo (Dec 3, 2014)

Eroticus said:


> Ahh, that's why you are crying ;P Two stock 295s still put up a better score than your custom water-cooled setup =[ Awww... But OK, blame the ones who buy dual cards for being "useless" while you run a custom water-cooling setup and get 2 fps more for it =]



Are you 12 or what?


----------



## dj-electric (Dec 3, 2014)

People will hold their mighty swords and protect their purchases until their last breath escapes them. Humanity in a nutshell.


----------



## Frick (Dec 3, 2014)

Dj-ElectriC said:


> People will hold their mighty swords and protect their purchases until their last breath escapes them. Humanity in a nutshell.



I do that too, but it's OK because *I'm* the one doing it.


----------



## Sony Xperia S (Dec 3, 2014)

Aquinus said:


> Ever use an APU?



Usually I try to avoid them, but for a secondary system (a laptop) used for entertainment like browsing and watching movies, possibly yes.



Aquinus said:


> So maybe you should take your fanboy hat off and understand that AMD and nVidia have a lot more markets than just performance 3D.



Yes, AMD has more products in its portfolio, but the overall condition of the company is catastrophic. Remember that just a decade ago, ATi and AMD as separate corporations dwarfed the current size of AMD.

And please do not make this about me personally, because there are many others who would say the same and you would not say anything to them.


----------



## GhostRyder (Dec 3, 2014)

Well, AMD is not far from the truth with that comment; the 290X is still pretty close to the GTX 980's performance, and even beats it at times depending on the game. Some of the recent benchmarks here show that, plus the 512-bit bus is good for high resolutions, which is where AMD generally aims its top-end cards anyway (Eyefinity, 1440p, 4K, etc.). Most of the top cards from recent years will do 1080p Ultra gaming without breaking a sweat, which makes either side a good buy for that, so long as it's at least a mid-range card or better; it's only at 1440p and above that the real comparisons come into play.

Right now Nvidia has great power efficiency due to improvements in the performance of the CUDA cores in Maxwell, which allows less to be more in that area. But the GTX 980 was not really any leap other than the improved power efficiency and the 4 GB of GDDR5 over its 780 Ti counterpart, which is somewhat shocking. If you consider that the GTX 670 and 680 bashed the previous GTX 580 last time around, the GTX 980 versus 780 Ti comparison does not look as good. When we can see plenty of games where the GTX 980 and R9 290X are neck and neck, even trading the top single-GPU position back and forth, things like that cause the market to move at a slower pace.


----------



## LinkPro (Dec 3, 2014)

ManofGod said:


> I have had an X850 Pro, X1900 Pro, 2 x 2900 Pro Crossfire, HD 4870 - HD 4890 Crossfire, 2 x HD 6950 and I now have a R9 290X Reference model. (XFX) So far, I have not had any driver issues, black screens, gray screens, BSODs (Unless I created them myself) nor any other issues. The drivers are actually quite good so far, not perfect but good enough for me.
> 
> Last Nvidia product I owned was a 6600GT. I just do not see any reason to switch when I have not personally had any issues. Also, going with the 512 Bit bus on these cards is a good reason to get one if you need the bandwidth. Oh, and I did have a 9800 Pro back in the day as well without any issues that I can remember. Honestly, I think problems come down more to hardware combinations than anything else.



Then you must be really lucky. I was googling the issue, and there are people already on their third RMA of their 290s with the black screen still not going away. My 5830 was also fixed with a driver update, but that came three months after I bought it.

That reminds me, I actually was on AMD for three generations (the 4870 as well). Its fan failed after a year or so, but that's more Asus's fault. I wouldn't mind going back to red, so let's see if their 300 series is any good. I game at 1080p, so a flagship card is overkill anyway; it pretty much comes down to which looks better and is more reliable, and so far nVidia is winning for me.


----------



## RealNeil (Dec 3, 2014)

Naito said:


> This to me sounds like they are in no rush to bring out their next generation of GPUs.



Maybe they are in a rush, but nothing is ready yet. In this situation, one would expect a public statement like that.
But I can't imagine them not being extremely motivated at this point. Taylor's assertion that AMD didn't make an official price adjustment on its end seems a little suspect to me.
Team Green's offerings are power-saving and good performers.

I like the GTX 970 for its performance and power saving, but I'm still reading about coil whine on a lot of forums.
I could buy a pair of 970s right now if I wanted to, but I'll wait to see what develops first. In a few months (if I can save enough) I can either get two GTX 980s or take a serious look at whatever AMD releases.


----------



## Sasqui (Dec 3, 2014)

f2bnp said:


>



I'm no fan of Nvidia, but looking at some of the TPU reviews by W1zzard, the 970 and 980 really shine at idle: around 5 W for a 970 vs. 40 W for a 290. They're just as impressive in overall power consumption, and Nvidia did the pricing right.

Circling back to the Titan vs. the 290X: who was winning then? Duh...


----------



## Aquinus (Dec 3, 2014)

Sasqui said:


> I'm no fan of Nvidia, but looking at some of the TPU reviews by W1zzard, the 970 and 980 really shine at idle: around 5 W for a 970 vs. 40 W for a 290. They're just as impressive in overall power consumption, and Nvidia did the pricing right.
> 
> Circling back to the Titan vs. the 290X: who was winning then? Duh...


Another reason why I'm considering a 970. AMD's multi-monitor idle consumption is garbage in comparison until you get down to the R7 cards. Considering I'm writing code most of the time on my machine, I'd say that saving 50 watts or so over what I have now would be tangible over time, since my tower is on about 14-16 hours a day and the GPUs aren't loaded 90% of the time.

All in all, I think we can all agree that the GTX 970/980 are ahead of the curve because it's new technology. AMD is behind because they haven't released anything new for quite some time. I will change my stance if they release something new but until then, I just see an aging lineup next to a cutting edge one offered by nVidia.
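The savings Aquinus describes are easy to sanity-check. A back-of-the-envelope sketch, where the 50 W difference, 15 hours a day, and the $0.12/kWh electricity price are all assumed round numbers:

```python
def annual_savings(watts_saved: float, hours_per_day: float, price_per_kwh: float):
    """Return (kWh saved per year, money saved per year)."""
    kwh = watts_saved / 1000 * hours_per_day * 365
    return kwh, kwh * price_per_kwh

kwh, dollars = annual_savings(50, 15, 0.12)
print(f"{kwh:.1f} kWh/year, ${dollars:.2f}/year")  # prints "273.8 kWh/year, $32.85/year"
```

The real figure depends on the GPU's actual idle draw and local electricity rates, but the order of magnitude (tens of dollars a year) holds.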


----------



## the54thvoid (Dec 3, 2014)

f2bnp said:


>



Very selective.






Vanilla flavour GTX 980 = <160 watts.


----------



## EarthDog (Dec 3, 2014)

KarymidoN said:


> When AMD learns to optimize energy consumption and makes a decent stock cooling system, then Nvidia will have competition. Today AMD has stratospheric energy consumption, poorly optimized drivers, and ridiculously high temperatures (I have a CrossFireX setup).


Give them time to get their next gen out and see what happens. This is a new arch from NVIDIA, while AMD's has been out for quite some time now.

And lol @ AMD with its 512-bit bus that matters to the 0.01% of people who rock 4K or 3x 4K monitors... Oy. What a marketing machine they are, preying on the ignorance of the consumer (OK, both have done this, to be fair).


----------



## VictorLG (Dec 3, 2014)

SteveS45 said:


> Personally I don't think AMD is the only camp with a driver issue.  Both camps in my opinion are equally meh.
> I have two systems one with a R9 280X and HD7970 in crossfire, and a new MSI Gold(Bronze) edition GTX970.
> 
> The GTX970 has been having a lot of problems on Display Port with screen tearing after coming back from sleep state.  Google GTX900 Display Port tearing/black screen, a lot of people have the same problem.  And sometimes switching from Nvidia surround to normal triple monitor or vice versa causes BSOD on Windows7.
> ...




Your problems with the 970 are MSI's fault, not Nvidia's. They even stated that there will be a new BIOS release to fix some of the problems, especially the fan rotation and output ones.

I'm quite disappointed with MSI and with AMD video cards; drivers for the 290 series are bad, and I had hardware problems too.

I migrated back to Nvidia (EVGA GTX 980 SC) and now I'm quite happy with my gaming experience again.


----------



## f2bnp (Dec 4, 2014)

the54thvoid said:


> Very selective.
> 
> 
> 
> ...




You call me very selective, yet you then showcase a chart with averages from AnandTech.


----------



## midnightoil (Dec 4, 2014)

Lots of people are screaming that AMD are done for and unrecoverably far behind, based on the relative performance between the 290/290X and the 970/980 (neither being true).

It'll be interesting to see what they think when NVIDIA legitimately have zero answer to AMD's next cards for more than 12 months.


----------



## EarthDog (Dec 4, 2014)

f2bnp said:

> You call me very selective, yet you then showcase a chart with averages from anandtech.


That's from here... not AnandTech... 

And your comparison is asinine, as it's not even comparing the same damn game. In order to make the comparison empirical and have ANY value to it, they need to be tested across the same exact thing. 



midnightoil said:


> Lots of people screaming that AMD are done for and unrecoverably far behind with the relative performance between 290&290X / 970&980 (neither being true).
> 
> It'll be interesting to see what they think when NVIDIA legitimately have zero answer for AMDs next cards for more than 12 months.


An answer? They one-up themselves every time something new comes out, and occasionally answer with a mid-gen bump (think 7970 GHz Edition or 780 Ti, etc.). That debate, to me, is hilarious because both sides can be right; it just depends on what the poster thinks was released 'first' and what was the 'response'...


----------



## midnightoil (Dec 4, 2014)

EarthDog said:


> That's from here... not andand...
> 
> And your comparison is asinine as its not even comparing the same damn game. In order to make the comparison empirical and have ANY value to it, they need to be tested across the same exact thing.
> 
> An answer? They one up themselves every time something new comes out. Occasionally each answer with a bump in mid gen (think 7970 Ghz edition or 780ti... etc. That debate, to me, is hilarious because both sides can be right, it just depends on what the poster thinks was released 'first' and what was the 'response'...



That can't happen this time.  The big marketing spiel for new cards is high resolutions, namely 4K and 2560x1440.  At high resolutions in particular, the HBM cards will blow GDDR5 cards out of the water.  NVIDIA backed the wrong horse and had to ditch their stacked memory plans.  They've been redesigning their future architectures to use the AMD-designed HBM... for some time in 2016.

Also, if AMD do turn out to be using 20nm... that'll be a disaster for NVIDIA.  They don't have any designs that can launch on 20nm anymore.

This is the first time in many years that there will be a big inter-generational leap in performance, and the first time the other firm won't be able to catch up for a long time.


----------



## EarthDog (Dec 4, 2014)

2560x1440/1600 doesn't really need HBM. 4K, OK. But hell, 256-bit cards plow through 2560x1440 with plenty of AA (assuming the card has the VRAM capacity to support it). Not to mention the efficiency improvements of Maxwell's memory architecture, which offers a fair amount more effective bandwidth thanks to its updated memory compression.
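For what it's worth, the raw numbers behind the bus-width argument are easy to check. A minimal sketch, using the stock memory clocks of each card; the ~25% compression saving is NVIDIA's own average claim for Maxwell, so treat it as an assumption rather than a measurement:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate per pin.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw peak GDDR5 bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

r9_290x = peak_bandwidth_gbs(512, 5.0)  # 512-bit @ 5 Gbps = 320.0 GB/s
gtx_980 = peak_bandwidth_gbs(256, 7.0)  # 256-bit @ 7 Gbps = 224.0 GB/s

# Maxwell's delta color compression narrows the raw gap; the ~25% average
# saving is NVIDIA's marketing figure, assumed here purely for illustration.
gtx_980_effective = gtx_980 / (1 - 0.25)  # roughly 299 GB/s "effective"
```

So even granting the marketing figure, the 290X keeps a raw-bandwidth lead; the thread's real question is whether anything short of 4K can use it.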



> Also, if AMD do turn out to be using 20nm ... that'll be a disaster for NVIDIA. They don't have any designs that can launch on 20nm anymore.


Wouldn't TSMC and NVIDIA's launch timing have something to do with it? I recall TSMC having delays in moving to their 20nm node, essentially forcing NVIDIA to design Maxwell on 28nm instead of 20nm. The 980 and 970, much like the 670/680, were not the 'full' core implementations. I would imagine there are full Maxwell chips upcoming. While those may be more of an incremental improvement, that still leaves AMD with what I imagine to be around a 15-20% performance gap to close. While that isn't impossible, they need to bring their big-boy pants to the table with their new generation. That said, here's hoping we see that. 



> This is the first time in many years that there will be a big inter-generational leap in performance, and the first time *the other firm won't be able to catch up for a long time.*


Only time will tell, but, I haven't seen much to make me believe that will happen... but again, I hope so for the sake of competition and innovation.


----------



## renz496 (Dec 4, 2014)

midnightoil said:


> Lots of people screaming that AMD are done for and unrecoverably far behind with the relative performance between 290&290X / 970&980 (neither being true).
> 
> *It'll be interesting to see what they think when NVIDIA legitimately have zero answer for AMDs next cards for more than 12 months.*



Wow. Do you have facts to back up that statement, or just your delusional assumption?

As usual, AMD marketing is fun to watch, but I think this one is still okay. It's better than "you guys should hold off on buying the 900 series because we are the future of gaming and our 285 is faster than the GTX 760".


----------



## qubit (Dec 4, 2014)

Forget all these hires benchmarks  for a moment, I wanna see one at 1024x768 just for giggles.  I want to see framerates at 500-1000fps in some old game to demonstrate just how far graphics performance has come.


----------



## Recus (Dec 4, 2014)

So 24 wheels is better than 4, right? Right?


----------



## SIGSEGV (Dec 4, 2014)

midnightoil said:


> Also, if AMD do turn out to be using 20nm ... that'll be a disaster for NVIDIA.  They don't have any designs that can launch on 20nm anymore.



I doubt AMD will use a 20nm node for their next-gen GPU. In my opinion, AMD will jump to Samsung's 14nm FinFET in Q2 or Q3 2015 instead of TSMC's 16nm FinFET.

AMD's current GPUs remain competitive (at their current price/perf), and only green-team warriors say otherwise.


----------



## alwayssts (Dec 4, 2014)

Eukashi said:


> when the HBM technology is loaded into RADEON, a problem in a memory band is cleared.
> there is no need to increase the secondary cache as Maxwell.



Except if the R9 390X is indeed 4096 SPs, 512 GB/s (which would be four 1GB HBM stacks operating at 128 GB/s each) would really only be good up to around 1120 MHz (if it's like Hawaii), or around 1200 MHz if it uses the compression tech we saw in the R9 285.  With or without factoring in scaling (96-97%), that doesn't touch big Maxwell (at probably a fairly similar die size, even if Fiji is granted slightly smaller on the same process)... and you can bet your butt we'll see a '770'-like GM204 (or a really weak-sauce butchered big Maxwell SKU) if its stock clock is 1 GHz.  While this method for bandwidth would work for a 28nm or even 20nm part using their current arch, compared to what is possible on 16nm it's not nearly enough if they want to actually compete.

The reason is that 4096 SPs generally won't be used to their full extent in core gameplay, closer to ~3800 (just as you saw with the 280X vs. GK104, or 7950/280 vs. 7970/280X scaling at half scale), and when you figure whatever that number is divided by the 2560 effective units in GM204, plus the fact that it can do 1500 MHz BECAUSE of having such secondary cache... that ain't good.  BTW, this is why big Maxwell is essentially 3840 units ([128 SP + 32 SFU] * 24), the same way GK104 was essentially 1792 ([192 + 32] * 8)... because the optimal count for 32/64 ROPs is right around there.  Slightly higher in GK104's case (hence why the 280X was slightly faster per clock), but that was a fairly small chip that could expect decent yields.  Slightly lower in big Maxwell's case, but I'd be willing to bet most parts sold will be under that threshold (which is still less than one shader module).

What's unfortunate is that while excessive compute and high bandwidth are good for certain things (like TressFX etc.), it's still a better play to generally have fewer units than what the ROPs can handle in most core gaming situations, as it's more power/die/bandwidth efficient (again, see GK104 vs. the 280X), and if need be scale the core clock so all units (texture, ROPs, etc.) run at an optimal ratio.  If we essentially get a 2x 280X just because AMD has the bandwidth to do so (and clock speeds won't allow a more efficient core config with a higher clock to saturate it, similar to their more recent bins that generally do ~1100 MHz), they are kind of missing the big picture in an effort to pull out all the stops and create something slightly faster through brute force... It'll be Tahiti vs. GK104 all over again, on a literally slightly larger scale.  

All they are doing is moving the goalposts with CUs and bandwidth, more or less as they have since R600, when a fundamental efficiency change is sorely needed.  I'm talking like when they went to VLIW4 instead of VLIW5 (when the average call was 3.44 SPs), the move to 4x16 with a better scheduler, or, to a lesser extent, what they did with compression in the 285.  Even if the bandwidth problem is solved for another generation (and even that's arguable when larger-than-4GB framebuffers are quickly going to become normal and HBM won't see that for a year or more, not to mention GM200 will literally be out of their league if on the same process), the fundamental issue is the lack of architectural evolution to cope with outside factors (bandwidth, process node capabilities) not lining up with what they currently have.  Some of that is probably tied to and hamstrung by their ecosystem (HSA, APUs, etc.), but I still think it primarily comes down to a lack of resources in their engineering department over the last couple to few years.
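That clock ceiling can be sanity-checked with a crude linear model: assume a GPU's bandwidth demand scales with shader count times core clock, and treat Hawaii (2816 shaders at 1000 MHz on 320 GB/s) as roughly bandwidth-balanced. Both assumptions are loose, so this is a back-of-the-envelope sketch, not a real model:

```python
# Back-of-the-envelope: if bandwidth demand scales linearly with
# shader count * core clock, what clock can a given memory bus feed?
# Baseline: Hawaii (R9 290X), 2816 shaders at 1000 MHz on 320 GB/s,
# assumed to be roughly bandwidth-balanced -- a loose assumption.
def supported_clock_mhz(shaders: int, bandwidth_gbs: float,
                        base_shaders: int = 2816,
                        base_clock_mhz: float = 1000.0,
                        base_bw_gbs: float = 320.0) -> float:
    demand_scale = shaders / base_shaders    # more shaders -> more demand
    bw_scale = bandwidth_gbs / base_bw_gbs   # wider/faster bus -> more headroom
    return base_clock_mhz * bw_scale / demand_scale

# A hypothetical 4096-shader part on 512 GB/s of first-generation HBM:
print(round(supported_clock_mhz(4096, 512.0)))  # ~1100 MHz
```

Which lands in the same ballpark as the ~1120 MHz figure above; 285-style compression would push the ceiling somewhat higher.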

I truly think Roy knows all this (that they currently are in a bad place and the immediate future doesn't appear fabulous either), but his job is his job, and I respect that.


----------



## EarthDog (Dec 4, 2014)

qubit said:


> Forget all these hires benchmarks  for a moment, I wanna see one at 1024x768 just for giggles.  I want to see framerates at 500-1000fps in some old game to demonstrate just how far graphics performance has come.


hires?

You would need a helluva overclocked CPU to reach those speeds, as that is not remotely a GPU-limited resolution.

Run 3DMark 01 though.


----------



## qubit (Dec 4, 2014)

Oh, just try the original Unreal Tournament from 1999 on modern high-end hardware. It really does reach framerates like that, and it's so fast that the game's speed actually varies erratically and looks quite ridiculous. 

Modern games of course wouldn't go that fast.

@Recus Both of those pictures are pretty cool.


----------



## arbiter (Dec 4, 2014)

Anyone who says Nvidia has no answer to AMD's GPUs for 12 months, with zero proof to back the statement: you, sir, are a complete AMD tool. 

Anyone who says Nvidia has no 20nm GPU and that it'll be a disaster if AMD goes with it, likewise with zero proof to back the statement: you, sir, are also a complete AMD tool.


----------



## EarthDog (Dec 4, 2014)

qubit said:


> Oh, just try the original Unreal Tournament from 1999 on modern high end hardware. It really does reach framerates like that and it's so fast, that the game's speed actually varies erratically and looks quite ridiculous.
> 
> Modern games of course wouldn't go that fast.
> 
> @Recus Both of those pictures are pretty cool.


Ahh, so it takes a 15-year-old game that an iGPU could run that fast to make that point. Gotcha.


----------



## midnightoil (Dec 4, 2014)

arbiter said:


> Anyone that says Nvidia has no answer for AMD's gpu for 12 months and has 0 proof to back statement, you sir are complete AMD tool.
> 
> Anyone that says Nvidia has no 20nm gpu and is a disaster if AMD goes with it and has 0 proof as well to back their statement, you sir are as well a Complete AMD Tool.



They don't. It's a fact.  Everyone knew they were a bit behind AMD with stacked memory anyway.  However, in 2013, when they cancelled HMC entirely and decided to shift to the AMD-designed, Hynix-backed HBM, we knew for sure that unless AMD delayed their HBM products enormously, NVIDIA wouldn't be able to compete for a while.  HMC-based Volta was canned and replaced with HBM-based Pascal, which is tentatively scheduled for H2 '16.

NVIDIA have no 20nm.  It's a fact.  We don't know if AMD do.  Personally I think it's unlikely, but it may transpire.


----------



## EarthDog (Dec 4, 2014)

> NVIDIA have no 20nm. It's a fact.


You claim it's a fact, but... how do you know it's a fact? You haven't supported that assertion with any links.

As I said, NVIDIA was planning on the shrink for Maxwell, but TSMC not being ready delayed it, essentially forcing them to stick with 28nm for this release. Not resting on their laurels, I think they did a pretty damn good job of increasing IPC, memory efficiency, and power consumption on that same process to bring out the well-received 9 series. With that in mind, I think NVIDIA will be in a position to catch up sooner rather than later IF AMD brings a game changer to the table. 

Remember, NVIDIA has also brought a die shrink to the same platform before (the GTX 260 from 65nm to 55nm, IIRC). Who's to say they don't have the 20nm plans still on the shelf, ready to go?


----------



## Slizzo (Dec 4, 2014)

f2bnp said:


>



How about comparisons of GPUs running at stock speeds? That's what both AMD and nVidia spec; it's hardly their fault if the board partners are running the GPUs out of spec.


----------



## 64K (Dec 4, 2014)

If AMD goes to the 20nm process with their GPUs, then I don't see how Nvidia can compete by staying on the 28nm process, but maybe Maxwell is efficient enough that they can. I have heard the rumors too that Nvidia is going to wait until next year to go to the 16nm process, but how the hell will TSMC be ready for that when they couldn't get the 20nm process down? I don't know. There are some crazy rumors flying around. Here's one of them:

http://www.kitguru.net/components/g...0nm-process-technology-jump-straight-to-16nm/


----------



## EarthDog (Dec 4, 2014)

> but maybe Maxwell is that efficient to where they can.


As I posted above, look at what they did with it already on the 980... several percent faster than the 780 Ti, with a narrower memory bus and fewer CUDA cores, while using almost 33% less power (~100W less) than a 780 Ti. Perhaps they wrung the rag dry though...?


----------



## yogurt_21 (Dec 4, 2014)

Recus said:


> So 24 wheels is better than 4, right? Right?


If you're transporting something larger than a thumb drive, yes, yes it is. 

The 512-bit bus is nice and all, but obviously memory bandwidth hasn't really been the bottleneck for a long time now. The superior performance of the 970 and 980 in most cases makes them the better buy. But your comparison fails; you're comparing things of very different natures.


----------



## KarymidoN (Dec 4, 2014)

EarthDog said:


> Give them time to get their next gen out and see what happens. This is new arch from NVIDIA, while AMD's has been out for quite some time now.
> 
> ANd lol @ AMD with its 512MB bus that matters to the .01% of people that rock 4K or x3 4K monitors... Oye. What a marketing machine they are. Preying on the ignorance of the consumer (ok, both have done this to be fair).



It was the same with the R9 2XX series: I expected them to become more economical with energy and they were not; I hoped they would become less hot and noisy, but they did not... AMD focuses on competitive pricing and reasonably good performance; the problem is that the reference coolers are horrible and the custom models are not attractive.
If by February 2015 AMD has not launched a new GPU that really brings an improvement in these two areas, then I'll throw my two R9 270Xs in the trash and buy an ASUS Strix GTX 980. I am Brazilian, and here NVIDIA cards are more expensive than AMD's; the 980 Strix costs about US $1,200, but I have no choice if I want to improve my system without having to change the power supply. I don't want to have to buy a 1200W power supply for new R9 300 series cards.


----------



## midnightoil (Dec 4, 2014)

EarthDog said:


> You mention its a fact, but... how do you know its a fact? You haven't supported that assertion with any links.
> 
> As I said, NVIDIA was planning on the shrink for Maxwell, but TSMC not being ready delayed it essentially forcing them to stick with 28nm for this release. Not resting on their laurels, I think they did a pretty damn good job of increasing IPC, memory efficiency, and power consumption on that same process to bring out the well received 9 series. With that in mind, I think NVIDIA will be in a position to catch up sooner rather than later IF AMD brings a game changer to the table.
> 
> Remember, NVIDIA has also brought to the table a die shrink in the same platform (GTX 260 from 55nm to 45nm IIRC). Who's to say that don't have the 20 nm plans still on the shelf ready to go????



Because it is a fact.  NVIDIA's 20nm plans relied entirely on the canned 20nm HP process at TSMC.  They're now waiting for FinFET to be ready.

AMD weren't relying on that (20nm HP), but may have canned 20nm due to delays, improvements in 28nm, cost, not wanting to go to another node with TSMC when they want to shift production to GF, and HBM providing large power savings anyway.


----------



## EarthDog (Dec 4, 2014)

> Because it is a fact.


LOL, enough already... Bring something to the table, friend. 

To say they had "no plans" when they had plans and scrapped them because TSMC wasn't ready on that node... Do you think they burned them so they don't exist?? I would imagine the rumor of NVIDIA skipping it and going to 14nm is true also... but it depends on the fab, whether it's ready, whether yields are good, etc. It wouldn't make any sense to me that they scrapped their 20nm plans and put ALL their eggs in that one basket.


----------



## 64K (Dec 4, 2014)

KarymidoN said:


> It was the same with the R9 2XX series, I expected they become more economic in energy and were not, I hoped they become less hot and noisy, but they were not ... AMD focuses on Competitive price and reasonably good performance, the problem is that the reference Coolers are Horrible and custom models are not attractive.
> If until February 2015 AMD not launch a new GPU that really bring an improvement in these two requirements then throw my two R9 270X in the trash and buy a GTX 980 from ASUS Strix, I am Brazilian and here the NVIDIA cards are more expensive than AMD, 980 Strix costs about US $ 1200, but I have no choice if you want to improve my system without having to change the power supply, do not want to have to buy a power supply 1200w for new cards from R9 300 series.



You would only be hurting yourself if you throw those cards in the trash. You could sell them. What resolution are you gaming at? A single card solution might be a better choice for you. You wouldn't need a 1200 watt PSU anyway to run whatever will be the equivalent of your crossfire R9 270X cards in the R9 300 series. The power consumption will probably be about the same but you will get more performance. If you're gaming at 1440p or less then one high end card would be enough. Start a thread in the Graphics Card forum and maybe we can sort this out if you want to.


----------



## midnightoil (Dec 5, 2014)

EarthDog said:


> LOL, enough already.. Bring something to the table friend.
> 
> To say they had "no plans" when they had plans and scrapped them because of TSMC not being ready on that node...  Do you think they burned them and don't exist??  I would imagine the rumor of NVIDIA skipping it and going to 14nm is true also... but it depends on the fab and if its ready and yields are good, etc. It wouldn't make any sense to me that they scrapped their plans for 20nm and put ALL their eggs in that basket.



I've contributed the facts we do know.  You've contributed nothing.

I've no idea what you're even talking about. 20nm isn't one homogeneous process.  NVIDIA had designs for TSMC's HP 20nm process, but that was shelved a long time ago.  Their existing designs (Maxwell & Kepler) won't work on an LP process. Pascal might have, if 20nm LP bulk planar was what they were aiming at, but it's ages away and is surely aimed at FinFET.  So they might as well have burned them, because there's no process available to them that they can produce them on... unless you're proposing that they release half a dozen semi-functional prototypes for $15m each?

AMD have had iterations of their GPU designs for both HP and LP.  Because 20nm was so delayed, so capacity-constrained and so expensive, their TSMC 20nm LP options are probably shelved... but maybe not; we'll soon see.


----------



## renz496 (Dec 5, 2014)

midnightoil said:


> *I've contributed the facts we do know.*  You've contributed nothing.
> 
> I've no idea what you're even talking about. 20nm isn't one homogenous process.  NVIDIA had designs for TSMC's HP 20nm process, but that was shelved a long time ago.  Their existing designs (Maxwell & Kepler) won't work on an LP process. Pascal may have if LP 20nm bulk planar was what they were aiming at, but it's ages away and is surely aiming at FINFET.  So they might as well have burned them, because there's no process available to them that they can produce them on ... unless you're proposing that they release half a dozen semi-functional prototypes for $15m each?
> 
> AMD has had iterations of their GPU designs for both HP and LP.  Because 20nm was so delayed, so capacity constrained and so expensive, their TSMC 20nm LP options probably are shelved ... but maybe not - we'll soon see.



Bro, don't confuse the rumors we've heard so far with plain reality or FACT. Since you insist it is a fact, give a link to prove it: not some rumor article, but an official Nvidia statement that they will skip the 20nm node altogether. Same with your claim that AMD designed their discrete GPUs for both HP and LP processes: give a link to prove it.


----------



## Kissamies (Dec 5, 2014)

I remember from the HD 2900 XT that bus width alone doesn't do the magic... What a stupid response from AMD. :/


----------



## plonk420 (Dec 5, 2014)

cedrac18 said:


> The $190 used on Ebay is my price point. Thank you Nvidia, i have never and will never pay more than $200 for a single component.



I got my 290 new for $215 on Black Friday.



Dj-ElectriC said:


> Now, power and drivers or at-launch optimization? well...



What do you mean, drivers? Yeah, the panel is effing slow, but I can't think of a time when I've had driver issues other than OpenCL when trying to mine cryptocoins (and I usually run freakin' olllld drivers).


----------



## KarymidoN (Dec 5, 2014)

64K said:


> You would only be hurting yourself if you throw those cards in the trash. You could sell them. What resolution are you gaming at? A single card solution might be a better choice for you. You wouldn't need a 1200 watt PSU anyway to run whatever will be the equivalent of your crossfire R9 270X cards in the R9 300 series. The power consumption will probably be about the same but you will get more performance. If you're gaming at 1440p or less then one high end card would be enough. Start a thread in the Graphics Card forum and maybe we can sort this out if you want to.



I use two monitors (1080p each). I am not disappointed with the performance, but with the temperature and the noise... I've never used Nvidia, and I'm sure AMD will release cards more powerful than the GTX 900 series, but AMD never invests in lowering energy consumption and reducing GPU noise, and for this reason I will opt for Nvidia's 900 series instead of the R9 300...


----------



## siki (Dec 6, 2014)

LinkPro said:


> Until AMD finally figures out how to write proper drivers I will stay with nVidia. .



If AMD (ATI) hasn't figured that out by now, they never will.
I mean, this story about ATI and drivers goes back to the nineties.


----------



## MxPhenom 216 (Dec 6, 2014)

renz496 said:


> bro don't confuse the rumor we heard so far as a plain real reality or FACT. since you insisting it is a fact then give the link to prove it. not some rumor article but nvidia official statement that they will skip 20nm node altogether. same with your claim about AMD design their discrete GPU on both HP and LP process. give the link to prove it.



This. There has been no official statement from either company on the state of 20nm. It is an assumption plus a rumor from an article that spread across the internet and is now, it seems, interpreted as fact. Fact of the matter: unless you are in the industry, you really don't know much about it.


----------



## ManofGod (Dec 6, 2014)

I love this thread, it is very entertaining. I have had my XFX R9 290 reference version for about 13 months of enjoyment. Nvidia fans claim their 970 is the bestest that ever was and that AMD neveressess bothers with reducing power consumption.  (All the while petting their card and saying, "My precious.") Oh well, it is your money, have fun with it. 

Oh, and my reference R9 290, flashed to a 290X, is not too hot or loud at all. Of course, I have a proper case and do not have the card 2 inches from my ear, so there is that. The 512-bit memory bus does help and does not require 10 Billion Gigahurts memory speed. But you needed a card with Hynix memory to avoid the potential problems that did occur. (Mine has Hynix memory and has zero problems.)

That is ok, I am sure Nvidia will take real good care of you all. (Except for the nForce 3 debacle and the Nvidia chips failing in laptops just past a year, but oh well, forgive and forget, eh?)



Recus said:


> So 24 wheels is better than 4, right? Right?


Yes, that 24 wheel real truck is better than that 10 inch model car.


----------



## MxPhenom 216 (Dec 6, 2014)

You should try editing your posts instead of double posting.


----------



## TheinsanegamerN (Dec 6, 2014)

ManofGod said:


> I love this thread, it is very entertaining. I have had my XFX R9 290 reference version for about 13 months of enjoyment. Nvidia fans claim their 970 it is the bestest that ever was and AMD neveressess bother with reducing power consumption.  (All the time well petting their card and saying, "My precious.") Oh well, it is your money, have fun with it.
> 
> Oh, and my reference R9 290, flashed to a 290X is not to hot or loud at all. Of course, I have a proper case and do not have the card 2 inches from my ear so there is that. The 512 bit memory bus does help and does not require 10 Billion Gigahurts memory speed. But, you needed the one with Hynix memory to avoid any potential problems that did occur. (Such as the one that has Hynix memory on mine and has zero problems.)
> 
> ...


Well, consider the fact that Nvidia's second-tier card is faster than AMD's fastest while pulling half the power, and without needing a massive memory bus to accomplish it, and it's no wonder people say the 970 is better, BECAUSE IT IS. The 2900 XT had a 512-bit bus as well, and it didn't matter when the GPU couldn't use the bandwidth.
I mean, you could track down a 290X that has specific memory so you could overclock it on that 512-bit bus, or you could just get the 970, since it's still faster anyway and you don't have to worry about what kind of memory it has.
And don't act like AMD chips in laptops never fail either (the MacBook Pros and iMacs with AMD chips had the same problem), and the nForce 3 was OVER A DECADE AGO, so yeah, it doesn't really matter now.


----------



## hadesjoy72 (Dec 7, 2014)

SteveS45 said:


> Personally I don't think AMD is the only camp with a driver issue.  Both camps in my opinion are equally meh.
> I have two systems one with a R9 280X and HD7970 in crossfire, and a new MSI Gold(Bronze) edition GTX970.
> 
> The GTX970 has been having a lot of problems on Display Port with screen tearing after coming back from sleep state.  Google GTX900 Display Port tearing/black screen, a lot of people have the same problem.  And sometimes switching from Nvidia surround to normal triple monitor or vice versa causes BSOD on Windows7.
> ...



Ahh... a voice of reason in an otherwise chaotic thread... thank you, sir.


----------



## Sony Xperia S (Dec 7, 2014)

LinkPro said:


> Until AMD finally figures out how to write proper drivers I will stay with nVidia.



AMD videocards give you better image quality and it's pretty obviously proven here:

https://pcmonitors.info/reviews/aoc-i2369vm/

> We also briefly tested the monitor using an Nvidia GTX 670 just to see if there were any obvious colour differences. *As usual the Nvidia card sent out completely the wrong colour signal to the monitor, washing out colours and hugely reducing contrast.* To rectify this and make everything look as it should for PC use the following steps should be taken in Nvidia Control Panel.


----------



## EarthDog (Dec 8, 2014)

Sony Xperia S said:


> AMD videocards give you better image quality and it's pretty obviously proven here:
> 
> https://pcmonitors.info/reviews/aoc-i2369vm/


There is a difference between adjusting gamma and the like and the image quality produced by drivers. 

That is like putting two identical TVs on the shelf next to each other and saying one is inferior because it wasn't calibrated as well as the other. 

Come on now...


----------



## ayazam (Dec 16, 2014)

remember me when the r9 290/x comes out? was it a year ago?
yet you all still comparing it, for god sakes...


----------



## Tatty_One (Dec 16, 2014)

ayazam said:


> remember me when the r9 290/x comes out? was it a year ago?
> yet you all still comparing it, for god sakes...


How could anyone remember you?  This is your first post


----------



## ayazam (Dec 16, 2014)

Tatty_One said:


> How could anyone remember you?  This is your first post


 yes and im sorry... 
i just feel 'need' to make an account thou i love TPU  since fermi came to world...


----------



## xenocide (Dec 17, 2014)

Sony Xperia S said:


> AMD videocards give you better image quality and it's pretty obviously proven here:
> 
> https://pcmonitors.info/reviews/aoc-i2369vm/


 
So they configured it using an AMD card initially, then just plopped an Nvidia card in and were *shocked* to find the color and gamma settings were off... oh wait, the line after the one you highlighted shows exactly how to correct it (assuming you for some reason swap out your video cards regularly). That's a stretch of an accusation at best. I could always take the low-hanging fruit and say something to the tune of "does it matter what the image quality is if the game never even _runs_?"


----------



## Sony Xperia S (Dec 18, 2014)

xenocide said:


> So they configured it using an AMD card initially, then just plopped an Nvidia card in there are *shocked* to find out the color and gamma settings were off... oh, wait, the line after the one you highlighted points out they showed you exactly how to correct it (assuming you for some reason swap out your video cards regularly).  That's a stretch accusation at best.  I could always take the low hanging fruit and say something to the tune of "does it matter what the image quality is if the game never even _runs_?"



Nope, I think you are just confused. Nvidia cards give false colours in all cases: when you turn on your monitor for the first time, and also when you have used whatever before to adjust it...


----------



## EarthDog (Dec 18, 2014)

Sony Xperia S said:


> Nope, I think you are just being confused. Nvidia cards give false colours in all cases, when you turn on your monitor for the first time, and also when you used whatever before to adjust...


LOL.


----------



## Tatty_One (Dec 18, 2014)

Sony Xperia S said:


> Nope, I think you are just being confused. Nvidia cards give false colours in all cases, when you turn on your monitor for the first time, and also when you used whatever before to adjust...



Some of what you have said "may" have been true in the analogue age, when output quality was pretty much down to the quality of the RAMDAC, but in our digital age you are way out. In fact it's technically so unlikely that, if you were a betting man, you would be better off putting your money on a chocolate horse in a desert horse race.

Take a look at the link below, which tests two different cards in detail over a DVI digital connection. I just want to quote one piece that explains why, technically, there should not be any differences unless of course settings have been deliberately tampered with...

_"The problem is that when you learn a bit about how graphics actually work on computers, it all seems to be impossible. There is no logical way this would be correct. The reason is because a digital signal remains the same, no matter how many times it is retransmitted or changed in form, unless something deliberately changes it._

_In the case of color on a computer, you first have to understand how computers represent color. All colors are represented using what is called a _*tristimulus*_ value. This means it is made up of a red, green, and blue component. This is because our eyes perceive those colors, and use that information to give our brains the color detail we see._

_Being digital devices, that means each of those three colors is stored as a number. In the case of desktop graphics, an 8-bit value, from 0-255. You may have encountered this before in programs like Photoshop that will have three sliders, one for each color, demarcated in 256 steps, or in HTML code where you specify colors as #XXYYZZ, where each pair of characters is a color value in hexadecimal (FF in hex is equal to 255 in decimal)._

_When the computer wants a given color displayed, it sends that tristimulus value to the video card. There the video card looks at it and decides what to do with it based on the *lookup table* in the video card. By default, the lookup table doesn’t do anything; it is a straight line, specifying that the output should be the same as the input. It can be changed by the user in the control panel, or by a program such as a monitor calibration program. So by default, the value the OS hands the video card is the value the video card sends out over the DVI cable._

_What this all means is that the monitor should be receiving the same digital data from either kind of card, and thus the image should be the same. Thus it would seem the claim isn’t possible"_
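The pipeline the quote describes can be sketched in a few lines. This is my own illustration, not from the linked thread; the hex parser and the identity lookup table are assumptions used only to show why, by default, the value the OS supplies is the value sent over the cable:

```python
def parse_hex(s):
    """Parse an HTML-style '#XXYYZZ' colour into its (R, G, B) tristimulus value."""
    s = s.lstrip('#')
    return tuple(int(s[i:i + 2], 16) for i in (0, 2, 4))

# The default LUT is a straight line: output equals input.
identity_lut = list(range(256))

def apply_lut(rgb, lut=identity_lut):
    """Each 8-bit channel (0-255) is looked up in the table independently."""
    return tuple(lut[c] for c in rgb)

rgb = parse_hex('#FF8000')
print(rgb)             # (255, 128, 0)
print(apply_lut(rgb))  # (255, 128, 0) -- unchanged, so the DVI signal matches the OS value
```

Only if the table is altered (by the control panel or a calibration program) would the output differ from the input.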

I will be honest now and say that in the past I too have believed there to be visual differences. There may well have been, but it is highly likely that those differences were caused by me, in as much as settings, different cabling etc.; in a "clean" environment it seems there are none.

_http://hardforum.com/showthread.php?t=1694755_

_Apologies..... I allowed a thread derail to derail me!_


----------



## Sony Xperia S (Dec 18, 2014)

EarthDog said:


> LOL.



I honestly do NOT understand what exactly you are laughing at. Nvidia drivers send signals to any display which differ from what ATi Catalyst does.
Actually, I use the default colour, brightness, gamma, etc. settings on my AMD rig, and I have to manually reduce brightness and adjust contrast accordingly on my Nvidia machine because the screen image looks unnatural.

What is so funny about it? Maybe that you didn't get anything from the article itself and the methodology they used? 



Tatty_One said:


> _Apologies..... I allowed a thread derail to derail me!_



No apologies at all, man.

The thread should be: 

*Choose R9 290 Series for its superior image quality compared to competition's: AMD*


----------



## Tatty_One (Dec 18, 2014)

Sony Xperia S said:


> I honestly do NOT understand what exactly you are laughing at. Nvidia drivers send signals to any display which differ from what ATi Catalyst does.
> Actually, I use the default colour, brightness, gamma, etc. settings on my AMD rig, and I have to manually reduce brightness and adjust contrast accordingly on my Nvidia machine because the screen image looks unnatural.
> 
> What is so funny about it? Maybe that you didn't get anything from the article itself and the methodology they used?
> ...



One problem with that... it simply isn't true, and I am an AMD man!  If you read my post you will see that unless you tamper with settings it's pretty much technically impossible.


----------



## Sony Xperia S (Dec 20, 2014)

Tatty_One said:


> One problem with that... it simply isn't true, and I am an AMD man!  If you read my post you will see that unless you tamper with settings it's pretty much technically impossible.



First, what you quoted about "the computer" doing this and that... it is bullshit, because it is not the computer but the software it is running.
Even if you have the OS wanting something, or the game wanting something, you always have the driver from the corresponding manufacturer, which acts like a translator and a link between the hardware itself, the OS, the game, etc.

And what the hell does the fact that you are an AMD man have to do with your wrong point?

It isn't a "may"; it is a fact.


----------



## Tatty_One (Dec 20, 2014)

Sony Xperia S said:


> First, what you quoted about "the computer" doing this and that... it is bullshit, because it is not the computer but the software it is running.
> Even if you have the OS wanting something, or the game wanting something, you always have the driver from the corresponding manufacturer, which acts like a translator and a link between the hardware itself, the OS, the game, etc.
> 
> And what the hell has the fact that you are an AMD man anything to do with your wrong point?
> ...


No, actually, the facts speak for themselves. If you have seen the visuals from both cards and feel that AMD reproduces better, then yes, the drivers are a factor, but only if users have set their own preferences using those drivers, and every individual user who does so has their own visual preferences: some set for performance, some for quality. Therefore you can't really make comparisons unless of course you have seen every setup. If CCC, NVidia Control Panel or any 3rd-party software is not used, then, as the link says, colour reproduction defaults to the OS for commands, hence the "clean" method for testing digital colour reproduction, which can be the only "safe" method.  I mentioned I am an AMD man because you simply come across as a fanboi who refuses to acknowledge facts and seems therefore blinded by your own misguided logic.

The last time I had an NVidia card it was a 560Ti; I also had an HD5850 in my 2nd rig at the time. I set my settings to highest quality on both cards, and whilst my monitors were different, of course, they were both DVI-cabled, of the same resolution and similar quality, and I could see no differences. Perhaps you have seen differences in setups you have seen, but as I said, that may well have been down to individual settings, so in this case we can agree to disagree.


----------



## Sony Xperia S (Dec 21, 2014)

The difference is pretty well stated in the article:

"As usual the Nvidia card sent out completely the wrong colour signal to the monitor, washing out colours and hugely reducing contrast."

They never said that it was their individual preset, but the default setting when you have an Nvidia card. 

Basically you are doing three things:

1. Accusing the writers at https://pcmonitors.info/reviews/aoc-i2369vm/ of lying and arguing with them;
2. Claiming something that is irrelevant: that if you uninstall Nvidia's drivers, then you will see one and the same image quality.
And even then, I am not sure whether that's true, because Microsoft also have their own drivers and I'm not sure whether they are one and the same for both vendors;
3. Confirming what I am saying.


----------



## Aquinus (Dec 21, 2014)

Sony Xperia S said:


> AMD videocards give you better image quality and it's pretty obviously proven here:
> 
> https://pcmonitors.info/reviews/aoc-i2369vm/


To quote your own source:


> To rectify this and make everything look as it should for PC use the following steps should be taken in Nvidia Control Panel.





> If you want to avoid having to do these little tweaks in the graphics driver then make use of the monitor’s DisplayPort.



So not only is there a workaround, the issue doesn't affect DisplayPort connections at all. That's a pretty piss-poor reason to choose AMD over nVidia: one insignificant issue, and one that can be corrected at that.


Sony Xperia S said:


> What is so funny about it? Maybe that you didn't get anything from the article itself and the methodology they used?


That is a review of a monitor, or did I miss something? The name "nVidia" doesn't show up anywhere in that article except for the quote you posted:


> We also briefly tested the monitor using an Nvidia GTX 670 just to see if there were any obvious colour differences. As usual the Nvidia card sent out completely the wrong colour signal to the monitor, washing out colours and hugely reducing contrast. To rectify this and make everything look as it should for PC use the following steps should be taken in Nvidia Control Panel.



So how about finding some "testing methodology" that actually tests what you're complaining about, instead of cherry-picking irrelevant articles?

The thread title should be:
*Factless bantering about AMD and NVidia GPUs.*
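For what it's worth, the "washing out colours and hugely reducing contrast" symptom quoted above is consistent with a driver defaulting to limited-range ("TV") RGB, where the full 0-255 signal is compressed into 16-235. That cause is my assumption, not something the review states, but the effect is easy to sketch:

```python
def full_to_limited(v):
    # Compress a full-range channel value (0-255) into limited/"TV"
    # range (16-235). Black is lifted and white is dimmed, which a PC
    # monitor expecting full range displays as washed-out, low-contrast colour.
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0))    # 16  -- "black" is no longer black
print(full_to_limited(255))  # 235 -- "white" is no longer full white
```

Which would also fit the observation that switching the range setting in the control panel, or using DisplayPort, makes the difference disappear.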


----------



## Tatty_One (Dec 21, 2014)

I think you are misunderstanding me and the link I posted; perhaps I didn't explain myself very well, so I will try again... If you use either card, and with it CCC, NVidia Control Panel, a 3rd-party piece of software, or simply your monitor's calibration (through software or monitor controls), there are too many variables to accurately assume in every case that one's image or colour reproduction is better than another's, so we use our eyes of course, and they tell us.  As a simple example: if we were both playing BF4 in the same house, with the same monitors, the same cabling etc., and you were running a 290X while I was running, let's say, a 970, you could walk into my room, look at me gaming and say "damn, my image quality seems to be better than yours", and you may well be right. But unless you know what presets and settings I had invoked through the control panel (yes, at the driver level) it's moot; it is very difficult to reproduce all of those factors I mentioned earlier in the real world.

I am not accusing any article of lying, merely stating that in order to do a clean comparison, and hence remove all of those variables, the link I posted sought to test at the hardware level, and there was no difference. I acknowledge what you are saying about the software (driver) level, but it is not easily and fairly measurable; in my opinion, in different scenarios with different games/apps it's likely there will be different results.  My "old" eyes really didn't see any differences when I had the two rigs.  Also, even when you are using driver-level software, it is still the OS that is communicating with the hardware and software; the command for both systems would be the same, as my link discussed, but with NVidia Control Panel or CCC the execution would differ depending on settings.

So in summary, all I am saying is that there are too many variables _in my opinion_ at the software level to accurately reflect visuals for ALL users of these graphics cards, hence why we seem to get different views.

Perhaps we should now move back to the topic of the bus size of AMD cards!


----------



## Steevo (Dec 21, 2014)

Thread needs cleaning, so much crap. 

Also, look up the GTX 970 utilization issues Nvidia is having. 
Look in the other "game ready" thread: Nvidia doesn't give the options for bit depth and other enhancements. 

I would still take one in a heartbeat, just to see if I liked it of course.


----------



## BiggieShady (Jan 2, 2015)

Steevo said:


> Also, look up the GTX 970 utilization issues Nvidia is having.


Thanks for this; I didn't know people have to modify the low-power-state voltages in the card's BIOS to get it stable at low loads when overclocked... even if it's a factory overclock.


----------

