Tuesday, December 2nd 2014
Choose R9 290 Series for its 512-bit Memory Bus: AMD
In one of its first interviews since the GeForce GTX 900 series launch, AMD maintained that its Radeon R9 290 series products are still competitive. Speaking with TweakTown, Corporate Vice President of Global Channel Sales Roy Taylor said that gamers should choose the Radeon R9 290X "with its 512-bit memory bus" at its current price of US $370. He stated that the current low pricing of the R9 290 series is due to "ongoing promotions within the channel," and that AMD didn't make an official price adjustment on its end. Taylor dodged questions on when AMD plans to launch its next high-end graphics products, on whether they'll measure up to the GTX 900 series, and on whether AMD is working with DICE on "Battlefield 5." You can find the full interview at the source link below.
Source:
TweakTown
AMD weren't relying on that (20nm HP), but may have canned 20nm due to delays, improvements in 28nm, cost, not wanting to go to another node with TSMC when they want to shift production to GF, or HBM providing large power savings anyway.
To say they had "no plans" when they had plans and scrapped them because TSMC wasn't ready on that node... Do you think they burned them so they don't exist?? I would imagine the rumor of NVIDIA skipping it and going to 14nm is true as well... but that depends on the fab, whether it's ready, whether yields are good, etc. It wouldn't make any sense to me for them to scrap their plans for 20nm and put ALL their eggs in that basket.
I've no idea what you're even talking about. 20nm isn't one homogeneous process. NVIDIA had designs for TSMC's HP 20nm process, but those were shelved a long time ago. Their existing designs (Maxwell & Kepler) won't work on an LP process. Pascal might have, if LP 20nm bulk planar was what they were aiming at, but it's ages away and is surely aiming at FinFET. So they might as well have burned them, because there's no process available to them that they can produce them on... unless you're proposing that they release half a dozen semi-functional prototypes for $15m each?
AMD has had iterations of their GPU designs for both HP and LP. Because 20nm was so delayed, so capacity-constrained, and so expensive, their TSMC 20nm LP options are probably shelved... but maybe not; we'll soon see.
I mean, this story about ATI and the drivers goes back to the nineties.
Oh, and my reference R9 290, flashed to a 290X, is not too hot or loud at all. Of course, I have a proper case and don't have the card two inches from my ear, so there is that. The 512-bit memory bus does help, and it doesn't require 10 Billion Gigahurts memory speed. But you needed a card with Hynix memory to avoid the potential problems that did occur. (Mine has Hynix memory and has zero problems.)
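To put rough numbers on the bus-width point: peak GDDR5 bandwidth is roughly the effective data rate times the bus width, divided by eight bits per byte. A quick back-of-the-envelope sketch in Python (stock 5 GT/s on the R9 290X vs. 7 GT/s on the GTX 970; the function here is just illustrative):

    # Rough peak GDDR5 bandwidth: data rate (GT/s) * bus width (bits) / 8 bits per byte
    def peak_bandwidth_gbs(data_rate_gtps, bus_width_bits):
        return data_rate_gtps * bus_width_bits / 8

    print(peak_bandwidth_gbs(5.0, 512))  # R9 290X: 320.0 GB/s at a modest 5 GT/s
    print(peak_bandwidth_gbs(7.0, 256))  # GTX 970: 224.0 GB/s despite faster memory

So the wide bus buys bandwidth headroom without needing exotic memory clocks.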
That is OK, I am sure Nvidia will take real good care of you all. (Except for the nForce 3 debacle, and the Nvidia chips in laptops failing just past the one-year mark, but, oh well, forgive and forget, eh?) Yes, that 24-wheel real truck is better than that 10-inch model car.
I mean, you could track down a 290X that has the specific memory so you could overclock it on that 512-bit bus, or you could just get the 970, since it's still faster anyway and you don't have to worry about what kind of memory it has.
And don't act like AMD chips in laptops never fail either (the MacBook Pros and iMacs with AMD chips had the same problem), and the nForce 3 was OVER A DECADE AGO, so yeah, it doesn't really matter now.
pcmonitors.info/reviews/aoc-i2369vm/
That is like saying that of two identical TVs sitting next to each other on the shelf, one is inferior because it wasn't calibrated as well as the other.
Come on now...
Yet you're all still comparing it, for God's sake...
I just felt the 'need' to make an account, though I have loved TPU :lovetpu: since Fermi came into the world...
Take a look at the link below, which tests two different cards in detail via a DVI digital connection. I just want to quote one piece that talks about why, technically, there should not be any differences unless settings have been deliberately tampered with:
"The problem is that when you learn a bit about how graphics actually work on computers, it all seems to be impossible. There is no logical way this would be correct. The reason is because a digital signal remains the same, no matter how many times it is retransmitted or changed in form, unless something deliberately changes it.
In the case of color on a computer, you first have to understand how computers represent color. All colors are represented using what is called a tristimulus value. This means it is made up of a red, green, and blue component. This is because our eyes perceive those colors and use that information to give our brains the color detail we see.
Being digital devices, that means each of those three colors is stored as a number. In the case of desktop graphics, an 8-bit value, from 0-255. You may have encountered this before in programs like Photoshop, which have three sliders, one for each color, demarcated in 256 steps, or in HTML code where you specify colors as #XXYYZZ; each pair of characters is a color value in hexadecimal (FF in hex is equal to 255 in decimal).
When the computer wants a given color displayed, it sends that tristimulus value to the video card. There the video card looks at it and decides what to do with it based on its lookup table. By default, the lookup table doesn't do anything; it is a straight line, specifying that the output should be the same as the input. It can be changed by the user in the control panel, or by a program such as a monitor calibration program. So by default, the value the OS hands the video card is the value the video card sends out over the DVI cable.
What this all means is that the monitor should be receiving the same digital data from either kind of card, and thus the image should be the same. Thus it would seem the claim isn't possible."
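The quoted pipeline is easy to sketch in a few lines of Python, purely as an illustration (identity_lut and apply_lut are made-up names here, not any real driver API): an 8-bit tristimulus value pushed through the default, identity lookup table comes out bit-for-bit unchanged.

    # Each color is an 8-bit tristimulus value: (red, green, blue), each 0-255.
    # The default video-card lookup table is a straight line: output == input.
    identity_lut = list(range(256))

    def apply_lut(rgb, lut):
        # Run each of the three channels through the lookup table.
        return tuple(lut[channel] for channel in rgb)

    color = (0xFF, 0x80, 0x00)  # "#FF8000" in HTML notation; FF hex == 255 decimal
    assert apply_lut(color, identity_lut) == color  # identical digital data goes out over DVI

Unless the user or a calibration program edits that table, both cards hand the monitor exactly the same numbers.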
I will be honest now and say that in the past I too have believed there to be visual differences. There may well have been, but it is highly likely that those differences were caused by me, in terms of settings, different cabling, etc.; in a "clean" environment it seems there are none.
hardforum.com/showthread.php?t=1694755
Apologies... I allowed a thread derail to derail me!
Actually, I use the default colour, brightness, gamma, etc. settings on my AMD rig, and I have to manually reduce brightness and adjust contrast accordingly on my Nvidia machine because the screen image looks unnatural.
What is so funny about it? Maybe that you didn't get anything from the article itself and the methodology they used? :D No apologies needed at all, man.
The thread title should be:
Choose R9 290 Series for its Superior Image Quality Compared to the Competition's: AMD