Wednesday, October 16th 2013

Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

Here are results from the first formal review of the Radeon R9 290X, AMD's next-generation flagship single-GPU graphics card. Posted by Chinese publication PCOnline.com.cn, it sees the R9 290X pitted against the GeForce GTX TITAN and GeForce GTX 780. An out-of-place fourth member of the comparison is the $299 Radeon R9 280X. The tests present some extremely interesting results. Overall, the Radeon R9 290X is faster than the GeForce GTX 780, and trades blows with, or in some cases surpasses, the GeForce GTX TITAN. The R9 290X performs extremely well in 3DMark: Fire Strike, and beats both NVIDIA cards in Metro: Last Light. In other tests, it's halfway between the GTX 780 and GTX TITAN, leaning closer to the latter in some of them. Power consumption, on the other hand, could either dampen the deal or be a downright dealbreaker. We'll leave you with the results.
More results follow.

Source: PCOnline.com.cn

121 Comments on Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

#51
Ghost
sweetAnother victim of the scheme pulled by nVidia's dynamic boost :ohwell:
Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.
Posted on Reply
#52
Am*
the54thvoidI have read (up to this morning at least) every page of the GTX Titan owners thread at OCN. Everybody accepts that the Titan is hamstrung by BIOS limits but even without that, if temps are controlled it hits about 1097-1137MHz before throttling. FTR, a Titan at 1137MHz is f*cking fast.
It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs -- the problem is Titan hits its board power limits long before it gets anywhere near 'unsafe' temperatures. Maybe Nvidia will release the full GK110 shortly after the R290X? Will be interesting to see what it will be capable of if that is indeed the case.
the54thvoidAnd I couldn't agree more with you. If you want massive frame rates on BF4 at 1440p res or higher, it looks like 290X IS the way forward. But bear in mind my 1136MHz Titan averages 60fps at Ultra settings (at 1440p).
Thanks for this. With my GTX 460 OC'd to hell and back and on its last legs, crawling through BF4 at medium-ish settings at 40-60FPS @ 1080p, your post really makes me look forward to my next upgrade. I finally hope to use my 120Hz monitor to its full extent soon. :rockout:
Posted on Reply
#53
sweet
GhostOh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.
nVidia's scheme is much more sophisticated. Even though the boost clock says 900 MHz, every stock Titan will run at 993~1016 MHz because of Kepler boost. nVidia just knows how to cheat at benchmarks.
Posted on Reply
#54
Casecutter
First, no one buys this level of card for 1920x1080. These results could also come from reviewers not having the actual/final release drivers for the R9 290X, so most of those numbers are... not worth speculating over.

Don't put too much into the Furmark consumption test; unless there's a score showing the amount of "work" that produced it, it's hardly significant or earth-shattering. The one thing it might indicate is that the cooler has some headroom.

But these numbers do hold to what I've been saying: it can soundly beat a 780, while it will spar with Titan depending on the title. Metro was one Nvidia had owned, but not so much anymore.

If AMD holds to what they've indicated and done with the re-badge prices, they'll have a win!
We wait...
Posted on Reply
#55
1d10t
dom99No, I've seen these before and I think it's some bloke in China who claims to have a card and benched it.
thanks for pointing out, lad :toast:
Dj-ElectriCBTW, I feel that test results will be more in favor of the R9 290X at 2560x1440, that's where it belongs
AMD should make this statement for their card...TO THE EYEFINITY AND BEYOND :laugh:
Novuluxvideocardz.com/images/2013/10/AMD-R9-290X-Performance.jpg

videocardz.com/46785/amd-radeon-r9-290x-performance-charts-emerge
And another graph that has questionable results? R9-290 might be an awesome card if this is true...
Jeez... those graphs look very tempting. Bummer, I don't know whom I should trust now. I need salvation...
thematrix606Most of the gamers use 1080, of course they will be used for 1080. Why in hell would you think otherwise?
Have you never played beyond 60Hz on a monitor? And they barely even reach 60FPS in Crysis 3 and other games!
Most gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and score-bitching worshipers will look otherwise.
RCoon...Also the fact that in the steam survey, NVidia cards took the highest share of video cards in systems, should tell you enough about how their Q4 results will turn out...
Most of the Steam user base consists of pre-built PCs, average Joes and regular Janes who bought a marketing-induced product. They only know three things:
- anything that costs more is better.
- any product that dominates the market is always faster.
- and the worst... buying the product that gets a commercial in a game's opening scene will make your game stable.
Let me guess, Intel's Havok and nVidia's old motto The Way It's Meant To Be Played are the definite winners here. Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970 :wtf:
Posted on Reply
#56
erocker
*
sweetnVidia's scheme is much more sophisticated. Even though the boost clock says 900 MHz, every stock Titan will run at 993~1016 MHz because of Kepler boost. nVidia just knows how to cheat at benchmarks.
Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes into play.
Posted on Reply
#57
the54thvoid
Super Intoxicated Moderator
erockerDoesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes into play.
Was about to post same. Boost applies in all scenarios. There is no cheating at all, just misguided sentiment.
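As an aside, for anyone unfamiliar with how a boost algorithm like this picks its clock, here is a minimal Python sketch of the general idea: step the clock up in small bins while temperature and board power stay under their targets. It is not NVIDIA's actual implementation; the bin size, targets, ceiling, and scaling factors below are invented for illustration only.

```python
# Illustrative sketch only -- NOT NVIDIA's actual GPU Boost code.
# Idea: raise the clock in small bins while temperature and board power
# remain under their targets; the advertised "boost clock" is a typical
# value, not a hard cap.

BASE_CLOCK_MHZ = 836      # GTX TITAN base clock
BIN_MHZ = 13              # Kepler adjusts clocks in roughly 13 MHz steps
TEMP_TARGET_C = 80        # default GPU Boost 2.0 temperature target
POWER_TARGET_W = 250      # board power limit
MAX_CLOCK_MHZ = 1006      # top bin of this (hypothetical) voltage/clock table

def pick_boost_clock(gpu_temp_c: float, board_power_w: float) -> int:
    """Return the highest clock bin the card would sustain under this load."""
    clock = BASE_CLOCK_MHZ
    while clock + BIN_MHZ <= MAX_CLOCK_MHZ:
        # Made-up scaling: pretend temperature and power creep up with each bin.
        projected_temp = gpu_temp_c + (clock - BASE_CLOCK_MHZ) * 0.02
        projected_power = board_power_w + (clock - BASE_CLOCK_MHZ) * 0.15
        if projected_temp >= TEMP_TARGET_C or projected_power >= POWER_TARGET_W:
            break
        clock += BIN_MHZ
    return clock

# A cool, lightly loaded card runs well above the advertised boost clock...
print(pick_boost_clock(gpu_temp_c=65, board_power_w=200))  # 1005
# ...while a hot, power-limited card settles much closer to base clock.
print(pick_boost_clock(gpu_temp_c=79, board_power_w=245))  # 875
```

The same loop runs whether the load is a benchmark or a game, which is why the boost behaviour isn't benchmark-specific.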
Posted on Reply
#58
Am*
CasecutterFirst no one buys this level of card for 1920x1080p.
How do you work that out when current games like Far Cry 3, Crysis 3, Metro LL etc under the highest settings, cannot exceed 60FPS @1080p average on the most expensive single-GPU cards? Or how about the fact that next gen consoles will struggle to run launch titles natively @1080p (BF4 will run @720p)? You'll be surprised to know that the vast majority buying these cards will be gaming @1080p, whether it is across 1 or 3 panels. I am using a 27" 2560x1440 IPS panel right alongside my 1080p panel -- guess what, I still game on the TN panel due to the higher framerate, better response time and less input lag. It definitely makes more sense to game @1080p, at least competitively, since IPS benefits mostly do not apply much to gaming (viewing angles don't matter since I sit right in front of the monitor, and neither does the superior colour accuracy, since most games these days tend to have a really limited colour palette anyway -- BF3 has a blue tint to everything, BF4 seems to have a grey tint etc).
Posted on Reply
#59
ManosHandsOfFate
sweetThere is no doubt that this card is above Titan clock to clock. However, this beast will consume a bunch of wattage and that tiny fan is not really reliable. The card is capable at the top, but it is not perfect. Hope the custom versions will be available soon.
What are you talking about? There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.
Posted on Reply
#60
Rei86
Damn, that power graph. I'm a power user so I don't really care about how much power it'll suck up, but it still makes me rethink the PSU I'm running.

Still starving for a review of these things.
Posted on Reply
#61
TheoneandonlyMrK
ManosHandsOfFateWhat are you talking about? There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.
Obviously there are plenty who doubt the R9 290X is faster; this is yet another example of something you can throw Steam stats at ;) (not me), most have Nvidia apparently :)

Even when it's out, and Mantle's out, you're still going to be able to find a fair few who would knock it, even if it were 50% faster.

Stop trying to figure out something that's mathematically provable and proven; it's about "application" anyway, and more importantly W1zzard's application of it (R9 290X) that really matters :D
Posted on Reply
#62
HumanSmoke
Am*It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs
Firstly, Titan (and the identical power delivery reference GTX 780) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...

Yup, that's some foot shooting right there.

Secondly, Nvidia have had two salvage parts collecting revenue and dominating the review benchmarks for the same length of time. Bonus point question: When was the last time Nvidia released a Quadro or Tesla card that didn't have a GeForce analogue of equal or higher shader count?*

Quadro K6000(2880 core) released four days ago
Tesla K40(2880 core) imminent

* Answer: Never
Posted on Reply
#63
TheHunter
AssimilatorWTF is the point of measuring power consumption in FurMark? Give us the power consumption in the tested applications/games alongside the frame rates so we can draw a useful conclusion about the power vs FPS numbers.

I'm also skeptical about Hawaii's performance in general. It still seems to be slower than the GK110 clock-for-clock, so there's nothing stopping nVIDIA from releasing a "780 Ultra" or somesuch with 1GHz core clock, which will then blow R290/X out of the water.
Yeah, I bet they tested an ES with what was probably improper driver/BIOS TDP protection for such apps (OCCT, Afterburner, Furmark)...

I mean, if you removed that on GK110 it would be the same 400-450 W for sure.
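For what Assimilator is asking, the useful figure is frames per watt measured in the same title. A minimal sketch of that calculation (the numbers below are placeholders, not figures from this review):

```python
# Sketch of a per-game efficiency comparison: pair each card's average FPS
# with its measured board power in the same title, then compare FPS per watt.
# The values below are hypothetical placeholders, not data from this review.

measurements = {
    # card name: (average FPS, measured power draw in watts)
    "Card A": (62.0, 250.0),
    "Card B": (66.0, 300.0),
}

for card, (fps, watts) in measurements.items():
    print(f"{card}: {fps:.1f} FPS at {watts:.0f} W -> {fps / watts:.3f} FPS per watt")
```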
Posted on Reply
#64
arbiter
So it's only as fast as a stock-clocked Titan in most things? So what would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to OC Titans with little to no problem?
Posted on Reply
#65
Blín D'ñero
arbiter[...] So what would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to OC Titans with little to no problem?
The result: Titan would be way OV'priced...
Titan $ 999,- (newegg), whereas R9 290X $600 ~650 (expected)
Posted on Reply
#66
ManosHandsOfFate
Blín D'ñeroThe result: Titan would be way OV'priced...
Titan $ 999,- (newegg), whereas R9 290X $600 ~650 (expected)
Probably closer to $700, if not more... :(
Posted on Reply
#67
Casecutter
Am*It definitely makes more sense to game @1080p, at least competitively
Competitively, okay, that's a point.
ManosHandsOfFateProbably closer to $700, if not more...
There's no way AMD can think $700; they have to move enough to get an ROI on a new part, it's not like these are geldings just needing a home. AMD needs to ensure they can recoup engineering and set-up costs while spreading that over enough wafers. I don't see this as a boutique product; I think they see it as a full-production offering, no different than Tahiti was two years ago.

I'm wanting a $550 MSRP.
Posted on Reply
#68
SIGSEGV
1d10tOnly enthusiasts and score-bitching worshipers...

Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970 :wtf:
LOL
so true...

Even my father would prefer to choose an Intel Celeron (I'm not sure if it's dual-core) and its GPU rather than an AMD A8-4 series (4 cores) and its iGPU.

@Casecutter: count me in, I would be happy to get them in CFX and put them under water.
Posted on Reply
#69
xorbe
Tell me about power while playing a game, not Furmark ... who knows which is throttling more.
Posted on Reply
#70
thematrix606
CasecutterFirst no one buys this level of card for 1920x1080p.
And yet again, another 60Hz monitor owner. Please go back to your cave. :banghead::banghead::banghead:
1d10tthanks for pointing out, lad :toast:
Most gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and score-bitching worshipers will look otherwise.
So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.
Posted on Reply
#71
HammerON
The Watchful Moderator
thematrix606And yet again, another 60Hz monitor owner. Please go back to your cave. :banghead::banghead::banghead:



So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.
And what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.
Posted on Reply
#72
1d10t
SIGSEGVLOL
so true...
Even my father would prefer to choose intel celeron (i'm not sure it's dual cores) and its gpu rather than amd A8-4 series (4 cores) and its igpu.
I know you won't... even you show up in the AMD Lounge... hahaha :laugh:
Not even a year and I already miss everyone from the last gathering @ Bandung :toast:
thematrix606So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.
Why would someone spend $600 or $1,000 for a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.
HammerONAnd what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.
Might have something to add: no matter how fast your monitor, Windows only sees it at 60Hz. It's in the panel, not in the OS or the graphics card.
FYI, I have a 240Hz panel and yet still have severe judder :shadedshu
Posted on Reply
#73
the54thvoid
Super Intoxicated Moderator
1d10tWhy would someone spend $600 or $1,000 for a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.
Ah, the naivety of youth. I enjoy having both. ;)
Posted on Reply
#74
Ahhzz
tiggerI love the smell of new hardware in the morning.

Whether these are better than Nvidia cards or not, I still love it when new GPUs come out; all the speculation and "discussion" is interesting and sometimes fun to read.

Personally for me, a 280x or a 7970.
I'm with you on that. A 280x is looking more and more like the bee's knees.
Posted on Reply
#75
1d10t
the54thvoidAh, the naivety of youth. I enjoy having both. ;)
It's nice to know you had a better life than mine, sir; may God always bless your family and guide you in your hardest times :toast:
Although in my early 30s I'm still at a shitty job with minimum wages and barely make a living, I still believe God will have pity on me so I could date someone and make my own family someday :)
Posted on Reply