# AMD GPU'14 Event Detailed, Announces Radeon R9 290X



## btarunr (Sep 25, 2013)

AMD announced the Radeon R9 290X, its next-generation flagship graphics card. Based on the second generation of the Graphics Core Next micro-architecture, the card is designed to outperform everything NVIDIA has at the moment, including a hypothetical GK110-based graphics card with 2,880 CUDA cores. It's built on the new "Hawaii" silicon, with four independent tessellation units, close to 2,800 stream processors, and 4 GB of memory. The card supports DirectX 11.2 and could offer an inherent performance advantage over NVIDIA's GPUs in games such as "Battlefield 4," which will also be included in an exclusive pre-order bundle. The card will be competitively priced against NVIDIA's offerings. We're awaiting more details.






*LiveBlog*
"Join the Red Team," AMD SVP of client products, to the press
"The hair in Tomb Raider changed that game (TressFX)." Riiight.
"We are changing the gaming industry"
AMD is gunning for Ultra HD (3840 x 2160); the R9 290X supports UHD @ 60 Hz
"New consoles raise the bar with gaming"
"It's been a great two years with the HD 7000 series"
AMD announced the R9 series for enthusiasts and gamers; R7 for price-conscious buyers
AMD lists out a new product stack
R7 250 and R7 260 under $150, R9 270X under $200, R9 280X at $299, and the R9 290X
R9 290X Battlefield 4 Edition
Three pillars of the R9 series: GCN 2.0, 4K, and TrueAudio
- DirectX 11.2 and improved energy efficiency
- >5 TFLOP/s compute power
- >300 GB/s memory bandwidth
- >4 billion triangles/sec
- >6 billion transistors


4X more pixels
- Everyday Ultra HD made possible
- AMD first to officially support 4K
- Auto-configures 4K resolutions
TrueAudio technology
- Revolutionizes game audio and gives artists creative freedom
- Hundreds more audio channels
- Working with new audio codec developers
- Positional audio
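For a sense of what "positional audio" involves computationally, here is a toy sketch of distance attenuation plus constant-power stereo panning; this is purely illustrative and not AMD's (or any middleware's) actual model:

```python
import math

def positional_gains(listener, source, ref_dist=1.0, rolloff=1.0):
    """Toy positional audio: inverse-distance attenuation plus a
    constant-power stereo pan. Illustrative only."""
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = max(math.hypot(dx, dy), ref_dist)
    # Inverse-distance rolloff (the shape OpenAL uses by default)
    atten = ref_dist / (ref_dist + rolloff * (dist - ref_dist))
    # Horizontal angle of the source mapped to a pan position in [-1, 1]
    pan = math.sin(math.atan2(dx, dy))
    # Constant-power pan law keeps perceived loudness steady across the arc
    left = atten * math.cos((pan + 1) * math.pi / 4)
    right = atten * math.sin((pan + 1) * math.pi / 4)
    return left, right
```

A source straight ahead comes out centered; one off to the right mostly feeds the right channel, and both gains fall off with distance.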


Not sold on TrueAudio. Nothing Dolby didn't try in the '80s, or Yamaha/Creative in the '90s.
Most game engines already have positional audio
TrueAudio is based on "25 years of brain research"
The Wwise and FMOD game audio engines support TrueAudio
The industry dumped the idea of buying $100 sound cards for exclusive EAX 5.0 years ago
TrueAudio reduces CPU load: Audiokinetic (Wwise creators)


And finally, the good parts:


"Ruby in that video looks like an older Linda Hamilton" - W1zzard




AMD's press release for the Radeon R9 and R7 series:





> AMD today unveiled the AMD Radeon R9 290X, R9 290, R9 280X, R9 270X, R7 260X and R7 250 graphics cards, AMD's first GPUs in a new era of gaming defined by UltraHD displays, renewed vigor in game engine development, and a new generation of gamers that expect a more immersive entertainment experience. AMD also introduced the world to Mantle and AMD TrueAudio technology, the latest innovations that redefine the GPU by enabling both gamers and game developers with unprecedented audio and performance enhancements for compatible games.
> 
> "The AMD Radeon R9 and R7 Series graphics cards are new GPUs for a new era in gaming," said Matt Skynner, corporate vice president and general manager, Graphics Business Unit, AMD. "This era is shaped by ultra-resolution gaming and an exciting new generation of highly-anticipated games like 'Battlefield 4.' But it's also an era shaped in a very powerful way by our own Unified Gaming Strategy; we've teamed up with the world's top game developers to establish a comprehensive portfolio of games that you can maximize to their full potential only with AMD Radeon graphics."
> 
> ...



- AMD announced its alternative to GeForce Experience: the Gaming Evolved app, built in partnership with Raptr.


DICE confirmed DirectX 11.1 optimization and 64-bit binaries for BF4, which can take advantage of 8-core CPUs.
AMD is developing its own low-level 3D API in partnership with DICE, called Mantle.



*View at TechPowerUp Main Site*


----------



## ChristTheGreat (Sep 25, 2013)

So, 512 memory bus is back!


----------



## DualAmdMP (Sep 25, 2013)

Welcome back 512Bit Video card


----------



## RCoon (Sep 25, 2013)

I sure hope this card performs better than their livestream does. It's not very live, and it's only an hour late.


----------



## n0tiert (Sep 25, 2013)

"... the card is designed to outperform everything NVIDIA has at the moment...."

we'll see as soon as W1zzard gets his hands on that maid


this goes down like oil


----------



## Tatty_One (Sep 25, 2013)

n0tiert said:


> "... the card is designed to outperform everything NVIDIA has at the moment...."
> 
> this goes down like oil



It is, however, difficult for them to predict what the other side might have in the future, although I do sell crystal balls on eBay.


----------



## HumanSmoke (Sep 25, 2013)

Where are the CrossFire fingers?
Did AMD decide to handle inter-card communication completely over the PCI-E bus?


----------



## TheoneandonlyMrK (Sep 25, 2013)

n0tiert said:


> "... the card is designed to outperform everything NVIDIA has at the moment...."
> 
> we´ll see as soon W1zzard get his hands on that maid



That review will probably be sooner than you think; I mean, has anyone heard from W1zzard lately? Lucky git.

Woot, I can watch the match in peace now.


----------



## MxPhenom 216 (Sep 25, 2013)

And from what I have been reading, it is supposed to cost about as much as the Titan.....


----------



## erocker (Sep 25, 2013)

HumanSmoke said:


> Where's the Crossfire fingers ?
> AMD decide to handle inter-card communication completely over the PCI-E bus ?



Could be wood screws, man. lol. 

Interesting to see whether it's a mock-up or something new.


----------



## n0tiert (Sep 25, 2013)

HumanSmoke said:


> Where's the Crossfire fingers ?
> AMD decide to handle inter-card communication completely over the PCI-E bus ?



Yep, announced to be sent over the PCI-E bus (XDMA).


----------



## Crap Daddy (Sep 25, 2013)

HumanSmoke said:


> Where's the Crossfire fingers ?
> AMD decide to handle inter-card communication completely over the PCI-E bus ?



Either this, or they forgot something while hurrying to start the livestream, which is not starting.


----------



## n0tiert (Sep 25, 2013)

Moar pictures

http://translate.google.de/translat...t-grafikkarte-radeon-r9-290x-1309-101784.html


----------



## Ahhzz (Sep 25, 2013)

HumanSmoke said:


> Where's the Crossfire fingers ?
> AMD decide to handle inter-card communication completely over the PCI-E bus ?



http://www.techradar.com/news/compu...u-the-radeon-r9-290x-1183950?src=rss&attr=all
_Most notable it seems is that gone are the Crossfire connectors. Crossfire will now be handled via PCI Express. Additionally, the Radeon R9 290X will sport two DVI outputs, a DisplayPort output, and an HDMI output._


----------



## Crap Daddy (Sep 25, 2013)

Hopefully you can watch it live here. One hour late.

http://www.livestream.com/amdlivestream?t=637320


----------



## TheoneandonlyMrK (Sep 25, 2013)

http://www.livestream.com/amdlivestream?t=637320

on NOW and working


----------



## cheesy999 (Sep 25, 2013)

btarunr said:


> The hair in Tomb Raider changed that game (TressFX)" Riiight



Hey, TressFX made an improvement to the game that you could actually see, which at least puts it above the hordes of identical-looking AA modes the graphics card industry has been selling us lately.


Though the whole frame rate hit does make it very questionable.


----------



## HumanSmoke (Sep 25, 2013)

Pricing on all cards except the one everyone is interested in.

/Damp squib


----------



## Delta6326 (Sep 25, 2013)

Dang some good pricing!

R9 290X I could see $399


----------



## Prima.Vera (Sep 25, 2013)

Hello baby!


----------



## theonedub (Sep 25, 2013)

Delta6326 said:


> Dang some good pricing!
> 
> R9 290X I could see $399
> http://img.techpowerup.org/130925/Capture1289.jpg
> ...



If they were going to undercut Nvidia that hard, they would've announced it immediately. I'd guess less than 0.01% chance of a sub $400 price on that card.


----------



## Delta6326 (Sep 25, 2013)

Is it me, or are those Fire Strike scores low? Or do you think those are on Extreme?



theonedub said:


> If they were going to undercut Nvidia that hard, they would've announced it immediately. I'd guess less than 0.01% chance of a sub $400 price on that card.





Yeah, true, I forgot about the R9 290. So R9 290 at $399, R9 290X at $499?


----------



## HumanSmoke (Sep 25, 2013)

Delta6326 said:


> Dang some good pricing!
> 
> R9 290X I could see $399



Quite simply, No.Way.

I'd almost certainly say that $399 would be reserved for the R9 290 (Hawaii Pro).

I don't think there's any way the 290X costs $399; that would mean the salvage Hawaii 290 sits only $50 above the 280X.


----------



## theonedub (Sep 25, 2013)

Delta6326 said:


> Is it me or are those firestrike scores low? or do you think those on extreme?
> 
> Yeah true I forgot about the R9 290 so R9 290 $399, R9 290X $499?



I think $499 is where everyone interested in the card wants it to be, but $599 is the more likely outcome.


----------



## dom99 (Sep 25, 2013)

All I want to know is the price!


----------



## PopcornMachine (Sep 25, 2013)

Yeah, but can it play "South Park: The Stick of Truth"?

http://www.guru3d.com/news_story/south_park_the_stick_of_truth_destiny_trailer.html


----------



## TRWOV (Sep 25, 2013)

btarunr said:


> The card will be competitively priced against NVIDIA's offerings.



Read: $899.


----------



## erocker (Sep 25, 2013)

TRWOV said:


> Read: $899.



What?! No.


----------



## Crap Daddy (Sep 25, 2013)

Price floating around the web is $600.


----------



## Akrian (Sep 25, 2013)

R9 290X...$499 pleease


----------



## Casecutter (Sep 25, 2013)

Go with $550 please! I want a Big Clang...


----------



## Akrian (Sep 25, 2013)

GPUs....audio technology....say wut ? 0_o


----------



## Pedro Lisboa (Sep 25, 2013)

*Price: $600*

The price of the new R9 290X is $600.
That's a bargain compared with the GTX 780's reference price of $649.
The R9 290X beats the NVIDIA GTX 780 without mercy.
I hope NVIDIA will drop the price of the GTX 780 to $550.


----------



## cadaveca (Sep 25, 2013)

Hmmm....

AMD makes CPUs, GPUs, and chipsets already.

Now they make audio...


That'd make a full AMD SoC truly possible now. Everything, including I/O (USB 3.0), can be made in-house... and in a single "unified" chip.

Personally, I don't care too much about the new GPUs; I care about the tech. The audio bits may not seem that interesting, but looking forward, it's very exciting indeed (at least to me).


----------



## xvi (Sep 25, 2013)

So, they're claiming two things:

Outperforms NVIDIA's offerings
Competitively priced against NVIDIA's offerings

Does not compute. One does not simply offer better performance for less than the competitor.


----------



## erocker (Sep 25, 2013)

xvi said:


> Does not compute. One does not simply offer better performance for less than the competitor.



see: 9xxx series, 5 series, 6 series


----------



## Crap Daddy (Sep 25, 2013)

xvi said:


> So, they're claiming two things :
> 
> Outperforms nVidia offerings
> Competitively priced against NVIDIA's offerings
> Does not compute. One does not simply offer better performance for less than the competitor.



They haven't claimed to outperform NVIDIA's offerings. They claim they have "their most powerful GPU yet." They haven't announced any price.


----------



## GC_PaNzerFIN (Sep 25, 2013)

Little birds tell: 8000 points in 3DMark Fire Strike. The GTX 780 does 8500.


----------



## Delta6326 (Sep 25, 2013)

Get to the GPU's!


----------



## xvi (Sep 25, 2013)

As per the main article,


btarunr said:


> the card is designed to outperform everything NVIDIA has at the moment


----------



## Akrian (Sep 25, 2013)

Couldn't they find a better speaker?


----------



## sunweb (Sep 25, 2013)

Crap Daddy said:


> They haven't claimed they outperform Nvidia's offering. They claim they have "their most powerful GPU yet". They haven't announced any price.



But they did.


----------



## Casecutter (Sep 25, 2013)

GC_PaNzerFIN said:


> Little birds tell: 8000 points in 3DMark Fire Strike. gtx 780 does 8500.


Little birds... are?

3DMark Fire Strike, Extreme preset:
Radeon R9 290X flagship: 4816 points
Titan scores between 4500-4700 points
GTX 780 scores 4400-4500 points
7970 scores 3300-3500 points

http://wccftech.com/amd-hawaii-gpu-performance-exposed-3dmark-faster-gtx-titan/#ixzz2fwRvjl00


----------



## N3M3515 (Sep 25, 2013)

Tatty_One said:


> It is however difficult for them to predict what the other side might have in the future, although I do sell crystal balls on Ebay



Well, it clearly says "at the moment."


----------



## GC_PaNzerFIN (Sep 25, 2013)

Casecutter said:


> Little birds... are?
> 
> 3DMark Firestrike in Extreme preset
> Radeon R9 290X Flagship 4816 Points
> ...



Sorry, I'd rather trust my source at the event than some random internet leak.


----------



## N3M3515 (Sep 25, 2013)

Delta6326 said:


> Dang some good pricing!
> 
> R9 290X I could see $399
> http://img.techpowerup.org/130925/Capture1289.jpg
> ...



$399 is for the R9 290.


----------



## Crap Daddy (Sep 25, 2013)

sunweb said:


> But they did.



If you had the patience to watch this trainwreck of a presentation, which is still going on, you probably noticed that at no point did they say they have the fastest card in the universe, which I'm sure they would have said if that were the case.


----------



## Ponkata (Sep 25, 2013)

*Rip*

RIP Nvidia


----------



## sunweb (Sep 25, 2013)

Crap Daddy said:


> If you had the patience to watch this trainwreck of a presentation, which still goes on, you probably noticed that they did not say at any moment that they have the fastest card in the universe, which I'm sure they would have said it if that is the case.


You said that they didn't announce the price; I said they did.

Ruby killed Titan lol


----------



## xvi (Sep 25, 2013)

erocker said:


> see: 9xxx series, 5 series, 6 series



Well, okay. The 9xxx series is a good point. My 9800 Pro was amazing back in the day.

...and I suppose it's true that NVIDIA's Titan and 700 series have been out for a while. I worry the price won't quite be in check, though. NVIDIA's crazy prices aren't an excuse for AMD's prices to be crazy too.


----------



## TheoneandonlyMrK (Sep 25, 2013)

Akrian said:


> Could they found a better speaker ?



Who d'ya want, Tom Cruise? OK, so there's no ultimate nerd to worship; buy Apple then.


----------



## Wrigleyvillain (Sep 25, 2013)

Nah. John Travolta. In 1977.


----------



## Akrian (Sep 25, 2013)

theoneandonlymrk said:


> who dya want tom cruise ,ok so no ultimate nerd to worship  buy apple oh



Too low, bro, too low (Apple, that is).

Well, a speaker should be able to hold your interest, not stutter all the time.


----------



## TheoneandonlyMrK (Sep 25, 2013)

Akrian said:


> Too low bro, too low ( Apple that is).
> 
> Well, a speaker should be able to hold your interest, not stutter all the time.



It's the content, not the nerd waffle. Plus I'm doing three, no, four things now: two vids, one net, and a guitar. Happy days.


----------



## Gradius2 (Sep 25, 2013)

Let's hope for $499.


----------



## Casecutter (Sep 25, 2013)

GC_PaNzerFIN said:


> Sorry, I rather trust my source in that event rather than some random internet leak.


I'm sorry, I didn't say anything about it being trusted or fact. I just pointed to a source that's posting information. Notice I never called it data. And since when did some random TPU poster trump an internet site with an article? BTW, that's two months old; I can't keep anything straight, clicking through the craziness.


This slide sure looks like close to a full bar above the 7000 series; let's wait for data!
http://www.techpowerup.com/img/13-09-25/23.jpg


----------



## Delta6326 (Sep 25, 2013)

One click to optimize... Sure sounds familiar.


----------



## GC_PaNzerFIN (Sep 25, 2013)

Casecutter said:


> while when did some random TPU poster, trump a internet site with an article?



http://muropaketti.com/live-amd-gpu14-tech-day-havaiji

Yeah? It's on an internet site now too. I'm sure you can find the numbers there with Google Translate or something. I would trust AMD too; why would they downplay their own card?


----------



## AsRock (Sep 25, 2013)

WOW, they surely got the wrong guy for that stream lol.. in about 5 minutes he lost around 600 viewers, and he's boring the crap out of me too.

Oh well, back to my game; I'll read up on this later.


----------



## Crap Daddy (Sep 25, 2013)

Casecutter said:


> I'm sorry I didn't say anything being trusted or fact.  Just pointed to a source who's posting information. Notice I never call it Data, while when did some random TPU poster, trump a internet site with an article?  BTW… that’s 2 months old, can't keep anything straight, clicking throughout the craziness.
> 
> 
> This slide sure looks like a close to almost full-another bar above 7000, let's wait for data!
> http://www.techpowerup.com/img/13-09-25/23.jpg



A stock GTX 780 does 8500.


----------



## andresgriego (Sep 25, 2013)

512-bit bus... woohoo, finally!

AMD should never again make the mistake of not having an uber card with every launch. 

Almost want to sell my 680 Lightning...


----------



## TheoneandonlyMrK (Sep 25, 2013)

AsRock said:


> WOW, they surly got the wrong guy for that stream lol..  in about 5 minutes he just lost around 600 viewers and boring the crap out of me too.
> 
> O well back to my game i'll read up on this later



He did, and it's a shame... you missed some exciting BF4 news (still on) and AMD's bare-metal driver API news. Roll on Mantle. And who was it waffling about a bare-metal API a while ago, too?


----------



## Delta6326 (Sep 25, 2013)

He sucks at BF4...
Should have put it on hardcore mode.


----------



## TheoneandonlyMrK (Sep 25, 2013)

Delta6326 said:


> He sucks at BF4...
> Should have put it on hardcore mode.



Koduri is quite the guru, and Mantle (more specifically, 9x draw calls per second) might just have stomped NVIDIA out. :shadedshu


----------



## GC_PaNzerFIN (Sep 25, 2013)

theoneandonlymrk said:


> koduri is quite the guru and mantle(more specificaly 9x draw calls per sec) might just have stomped nvidia outshadedshu



Just what we need: more vendor-specific DSP audio and rendering APIs.


----------



## TheoneandonlyMrK (Sep 25, 2013)

GC_PaNzerFIN said:


> Just what we need. More vendor specific DSP audio and rendering API.



That's what I said when PhysX came out, but fair's fair, eh? I just got a PhysX card.

And that's all-console plus one PC vendor specific; devs are going to love that, one API for all. NVIDIA probably has something similar in the works. Well, hopefully.


----------



## Delta6326 (Sep 25, 2013)

...I missed the end. Did they announce more specs and $$$??


----------



## The Von Matrices (Sep 25, 2013)

Just to note, because AMD never directly said it: the R9 290 series has the only new core. All the other cards are rebrands of existing cores, although the clock speeds may be changed. This would mean the R9 280 is Tahiti, the R9 270 is Pitcairn, the R7 260 is Bonaire, and the R7 250 is Cape Verde.

The good news is that this almost surely means all these "new" features will be enabled through driver updates rather than some fundamental hardware change in the 2xx series, and thus we 7xxx owners can enjoy the benefits (maybe with the exception of the audio DSP).

What I can't understand is how AMD can promote an anachronism like the R7 250. This is their 2014 lineup and they're still selling cards with 1 GB of memory?! For all this talk about high resolutions and detailed effects, they certainly don't support them throughout the entire lineup.


----------



## Ralfies (Sep 25, 2013)

This makes it seem like the R7 260X is new silicon. If not, why wouldn't they add TrueAudio to the other rebrands as well?


----------



## The Von Matrices (Sep 26, 2013)

Ralfies said:


> http://www.techpowerup.com/img/13-09-25/46.jpg
> Does this mean R7 260X is new silicon along with Hawaii?



Probably not.  It's most likely Bonaire (aka 7790), which is also a "second generation" GCN chip.


----------



## Roph (Sep 26, 2013)

Eh, so AMD's "Mantle" thing is basically like 3dfx's Glide? 

Even as an AMD user I'm not sure I like the sound of a proprietary API like that. I'd much rather see advancements in their drivers / DirectX to get performance closer to the metal instead.


----------



## PopcornMachine (Sep 26, 2013)

I was expecting reviews tonight, but I guess it's not to be.

When will NDA be lifted?


----------



## GC_PaNzerFIN (Sep 26, 2013)

PopcornMachine said:


> I was expecting reviews tonight, but I guess it's not to be.
> 
> When will NDA be lifted?



This info is also under NDA.


----------



## Slomo4shO (Sep 26, 2013)

That was a disappointing three hours... still no confirmation of specs and no pricing on the top-tier cards...

There had better be some reviews available before October 3rd. I, for one, am not going to blindly preorder something without knowing how it performs...


----------



## PopcornMachine (Sep 26, 2013)

GC_PaNzerFIN said:


> This info is also under NDA.



Dammmmit! :shadedshu


----------



## LAN_deRf_HA (Sep 26, 2013)

Whats the deal with the non-X cards?


----------



## Xzibit (Sep 26, 2013)

Roph said:


> Eh, so AMD's "Mantle" thing is basically like 3DFX's Glide?
> 
> Even as an AMD user I'm not sure I like the sound of a proprietary API like that. I'd much rather see advancements in their drivers / DirectX to get performance closer to the metal instead.



This could be some ingenious way of getting NVIDIA to buy AMD... it worked for 3dfx. 

I don't like the proprietary nature of it, but it works with GCN and future generations. GPUs and APUs through SoCs will take advantage of it, and it might give them an edge in the growing mobile market via Apple iOS, Android, Windows, and Linux.


----------



## adulaamin (Sep 26, 2013)

If it's under $500, it's gonna be the first time that I'm gonna buy a card on release.


----------



## GC_PaNzerFIN (Sep 26, 2013)

LAN_deRf_HA said:


> Whats the deal with the non-X cards?



Some things are disabled on the GPU. Think of it like the HD 7950 compared to its X equivalent, the HD 7970.


----------



## Slomo4shO (Sep 26, 2013)

adulaamin said:


> If it's under $500, it's gonna be the first time that I'm gonna buy a card on release.



I would still wait until Black Friday.


----------



## EpicShweetness (Sep 26, 2013)

*!!!! So much info !!!!*

Yet nothing I want to hear! Where's the damn price and specifications?


----------



## LAN_deRf_HA (Sep 26, 2013)

GC_PaNzerFIN said:


> Some things disabled on the GPU. Think of it like HD 7950... compared to X HD 7970.



That's even worse than NVIDIA's shitty "Ti vs. non-Ti" naming scheme. Name distinctions should be loud and clear.

It always feels like they're trying to trick people who aren't paying attention into buying their cards over the competition: they saw a review of a better card with a very similar name beating the competition, and here it is on sale for much less money than that competition. When in actuality they're not getting the performance they're expecting.


----------



## AsRock (Sep 26, 2013)

theoneandonlymrk said:


> he did and its a shame,,, you missed some exciting Bf4 news (still on)and amd bare metal driver APi news , roll on Mantle and who was it wafflin bare metal api a while ago too



Not at all. BF is just plain boring to me, always has been, so I guess I was right to stop watching lol... I am much more an OFP/ArmA fan.

I do wonder about this comment, as HDMI only supports 8 channels, and 16 in the newer version:



> Hundreds of more audio channels


----------



## TheoneandonlyMrK (Sep 26, 2013)

AsRock said:


> Not at all BF is just plain boring to me always has been so i guess i was right to stop watching lol... I am much more a OFP\Arma fan.
> 
> I do wounder about this comment as HDMI only supports 8 channels and 16 on the newer version



He mentioned hundreds of audio channels, but for me it's the 9x draw calls (180,000-240,000 instead of 20,000-30,000 per second) that make Mantle sound impressive.
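Taking those per-second figures at face value, the per-frame budget at 60 FPS is easy to sketch (the numbers below just restate AMD's claim; they are not measurements):

```python
def calls_per_frame(calls_per_second, fps=60):
    # Draw calls the CPU can submit per rendered frame at a target FPS
    return calls_per_second // fps

baseline = calls_per_frame(25_000)         # mid-range of the 20-30k figure
with_mantle = calls_per_frame(25_000 * 9)  # the claimed 9x improvement
```

That works out to roughly 400 draws per frame versus a few thousand, which is what the 9x claim would mean for scene complexity.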


----------



## 15th Warlock (Sep 26, 2013)

I hadn't been excited about new tech in an AMD card in years, but this new card packs some really interesting features. I wonder if it'll be worth an upgrade when it comes out...

Some interesting tech is coming to the PC in the next few months: SteamOS, Mantle, and finally a new focus on audio tech. I despised MS for killing audio acceleration with Vista; I'm glad to see it come back with a vengeance.


----------



## PopcornMachine (Sep 26, 2013)

So all we really got today was prices on all but the newest cards, no benchmarks, and a boatload of marketing BS.

I mean a big boatload.  A shipload.

Really turned off by this.  Audio?  They spent most of the time talking about audio!?

Not what anybody wanted to hear.


----------



## acperience7 (Sep 26, 2013)

It's getting really hard not to scratch my upgrade itch... especially if this card is $550 or lower. Here's hoping!

Love the cooler design, BTW.


----------



## Fluffmeister (Sep 26, 2013)

PopcornMachine said:


> So all we got today really was a prices on all but the newest cards, no benchmarks, and a boatload of marketing BS.
> 
> I mean a big boatload.  A shipload.
> 
> ...



I see what you did there.


----------



## ChristTheGreat (Sep 26, 2013)

Can't wait to see reviews vs. the actual market (my HD 7950 vs. at least the 290 and 290X), 'cause I won't switch for 10%, as my card is overclocked.


----------



## TRWOV (Sep 26, 2013)

So Mantle is like writing in assembler back in the '90s?


----------



## 15th Warlock (Sep 26, 2013)

PopcornMachine said:


> So all we got today really was a prices on all but the newest cards, no benchmarks, and a boatload of marketing BS.
> 
> I mean a big boatload.  A shipload.
> 
> ...



Yes, I would've liked a little more substance on what makes this new card tick, but there's still a lot of info under NDA that'll be slowly released over the following months.

Unfortunately, IT companies are going for more bang in their PR escapades as opposed to more substance. That's the effect of living in a post-Jobs world, and that's the problem: most companies want to copy some of Apple's style, but not all of them have the resources to organize a press conference and release products a few days afterwards, so they leave us wanting more...

I, for one, am excited about the audio-related tech. I mean, with the Oculus Rift finally getting us closer to VR games from a visual perspective, I sure am glad someone's paying attention to the audio department.


----------



## dir_d (Sep 26, 2013)

I hope Mantle takes off; I'd love to play BF4 on Linux with Mantle.


----------



## Xenturion (Sep 26, 2013)

Excellent. They finally dropped the requirement of an active DisplayPort adapter for 'legacy' monitors in EyeFinity. (Presumably eliminating the screen tearing you get on one monitor from using an adapter; not to mention them dying) I suspect a 290 or 280X will be my next card. I'll wait for the reviews, but the healthy helpings of VRAM on both cards should be ideal for triple monitors. Plus, excellent compute performance never hurts.


----------



## awesomesauce (Sep 26, 2013)

I miss you, ATI!! Where is the dual GPU?


----------



## The Von Matrices (Sep 26, 2013)

Xenturion said:


> Excellent. They finally dropped the requirement of an active DisplayPort adapter for 'legacy' monitors in EyeFinity. (Presumably eliminating the screen tearing you get on one monitor from using an adapter; not to mention them dying) I suspect a 290 or 280X will be my next card. I'll wait for the reviews, but the healthy helpings of VRAM on both cards should be ideal for triple monitors. Plus, excellent compute performance never hurts.



Where did you see this information about the removal of the active DisplayPort adapter requirement? I didn't see it anywhere in the presentation.


----------



## haswrong (Sep 26, 2013)

So how do I plug headphones into this new Radeon sound card? Time to remove the Sound Blaster? 

Why is the exhaust blocked by another DVI port, with several vents set to blow hot air inside the case? The cooler cover sucks; the simple one from the first leak rocked. But the DVI connector is the problem; there should have been a single row of 6x Mini-DP. 

Who the hell designed this hardware? One step forward, two steps back. No benchmarks, no price, no buy. Bye, AMD, bye.


----------



## net2007 (Sep 26, 2013)

Man, if this thing really beats the Titan I'm selling mine and buying four of these. Hurry up, EK, and release full-cover acrylic blocks.


----------



## erocker (Sep 26, 2013)

haswrong said:


> so how do i plug the headphones into this new radeon soundcard? time to remove the soundblaster?
> 
> why is the outtake blocked by another dvi port and several vents are going to blow hot air inside the case? the cooler cover sucks, the simple one from the first leak rocked. but the dvi connector is the problem, there should have been one line of 6x mini-dp..
> 
> who the hell designed this hardware? one step forward, two steps back. no benchmarks, no price, no buy. bye, amd, bye.



Your heatsink/shroud speculation is just that: speculation. Obviously benchmarks, pricing, and performance numbers are coming. Though I, too, am disappointed that these things weren't covered. It's not very long until release, so we'll see more info any time now.


----------



## TRWOV (Sep 26, 2013)

I like the fact that this will push the 7970 into sub-$300 territory.


----------



## The Von Matrices (Sep 26, 2013)

TRWOV said:


> I like the fact that this will push the 7970 into the <$300 territory



It might in the short term, when the 7970 is discontinued, but I wouldn't be so sure about Tahiti going below $300 in the longer term. The R9 280X is almost surely a rebrand of the 7970 with higher clock speeds, and the higher clocks combined with the "newness" of the card will keep its price up. Think of the GTX 770 vs. GTX 680 for comparison.


----------



## Bansaku (Sep 26, 2013)

*Ya... no.*

Not really impressed by the Fire Strike rating, as I get 12000; I think I will keep my HIS IceQ X2 HD 7950 3GB CrossfireX setup. Worth the upgrade if you have a 6000 series or older, but ya, no thanks. But hey, maybe I will change my mind once the card(s) are released and I see real benchmarks.


----------



## okidna (Sep 26, 2013)

Only 8000 (no, not a Fire Strike score): http://bit.ly/BF4AMD


----------



## Nordic (Sep 26, 2013)

Honestly, I think the audio news is the most interesting. I wonder how they will implement it.


----------



## mrwizard200 (Sep 26, 2013)

I almost pulled the trigger on a 7970, but since the R9 280X is just a "refresh," I will wait. 
As for today's event, I am pretty excited about the 290 and 290X. While not a game changer, these should provide some challenge to NV. I am crossing my fingers that the 290 (non-X) gets a $399 price tag. I would so jump on that.


----------



## HumanSmoke (Sep 26, 2013)

james888 said:


> I think the audio news is the most interesting honestly. I wonder how they will implement it.


Hopefully the AMD driver team aren't thinking the same thing...still, they probably need a new challenge after getting all their other issues sorted ou...nvm.


----------



## Prima.Vera (Sep 26, 2013)

What's with the TrueAudio?? I thought 3D hardware sound was dead and buried...


----------



## droopyRO (Sep 26, 2013)

Quick question: what do they mean by FireStrike score? The overall score (the one in orange font) or the Graphics score? 10q


----------



## TRWOV (Sep 26, 2013)

Prima.Vera said:


> What's with the TrueAudio?? I thought 3D hardware sound was dead and buried...



Windows 8 has hardware acceleration:







Windows Vista and 7 moved DirectSound to a software stack:







It's worth noting that OpenAL does support hardware acceleration even in W7. Only DirectSound was affected.


----------



## Prima.Vera (Sep 26, 2013)

Yeah, but except for DiRT and DiRT 2, what other games do you know from the past 3-4 years that use OpenAL or hardware 3D sound?? Hell, they are not even implementing software 3D sound in games. I think only Path of Exile has some software-based 3D sound, but that's about it...


----------



## HumanSmoke (Sep 26, 2013)

droopyRO said:


> Quick question: what do they mean by FireStrike score? The overall score (the one in orange font) or the Graphics score? 10q


Overall score (Orange font). Performance preset (as opposed to Extreme)


----------



## BiggieShady (Sep 26, 2013)

I like what they did with the Mantle API layers slide.


----------



## Xzibit (Sep 26, 2013)

This Saints Row IV Eyefinity demo was on the R9 290 (non-X)






The new Final Fantasy XIII was on the next one over


----------



## the54thvoid (Sep 26, 2013)

A lot of bluster without the refreshing breeze.  This is worse than an infamous wood screw Nvidia launch.  I know they never said this was the R9 290X launch date but they didn't even manage to tease us with it.  One Firestrike bench (arguably lower than a GTX 780) and a picture without any more info.  Meh....


----------



## Pedro Lisboa (Sep 26, 2013)

*Epic disaster*

"A lot of bluster without the refreshing breeze. This is worse than an infamous wood screw Nvidia launch. I know they never said this was the R9 290X launch date but they didn't even manage to tease us with it. One Firestrike bench (arguably lower than a GTX 780) and a picture without any more info. Meh....".

The boys and girls who thought AMD would show the R9 290X card... got nothing.
This event was an epic disaster.


----------



## Xzibit (Sep 26, 2013)

MAXIMUMPC - AMD: R9 290X Will Be "Much Faster Than Titan in Battlefield 4"



> When we asked what this means in real-world terms, he stated, “with Battlefield 4 running with Mantle (AMD’s new graphics API), the card will be able to ‘ridicule’ the Titan in terms of performance.”


----------



## HammerON (Sep 26, 2013)

I am really impressed that this thread has remained civil.
This might be a first for a thread discussing the release of a new AMD or NVIDIA GPU.
I am looking forward to W1zzard's review and the final price of the 290X.


----------



## Recus (Sep 26, 2013)

Xzibit said:


> MAXIMUMPC - AMD: R9 290X Will Be "Much Faster Than Titan in Battlefield 4"



After reading more comments (mostly at Anandtech), I'm "shocked" at how open-standards supporters jumped on the proprietary Mantle API. Looks like OpenCL and DirectX will go bankrupt.


----------



## Mombasa69 (Sep 26, 2013)

n0tiert said:


> Yep, announced to be sent via the PCI-E bus (XDMA)



It's just a demo card, you can clearly see where the crossfire connections will go.


----------



## HumanSmoke (Sep 26, 2013)

Xzibit said:


> MAXIMUMPC - AMD: R9 290X Will Be "Much Faster Than Titan in Battlefield 4"



So, for the ultimate battlefield 4 experience, you'll need

Origin

Windows 8 (or 8.1) 

Crossed fingers that the Mantle API plays nice with WDDM


----------



## BiggieShady (Sep 26, 2013)

HumanSmoke said:


> So, for the ultimate battlefield 4 experience, you'll need
> 
> Origin
> 
> ...




As I see it, the version of Frostbite in BF4 should have both DirectX and Mantle renderers ... it's to be expected that the Mantle renderer will be buggier at first.

Now, with AMD trying to reduce draw call overhead with the Mantle API, I wonder if Nvidia is adding an ARM core to the Maxwell GPU to issue draw calls directly on the GPU (in hardware, with no overhead at all).
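For anyone wondering why draw call overhead matters: each call pays a roughly fixed CPU/driver cost, so batching or instancing shrinks the bill. A toy sketch, with all numbers invented purely for illustration:

```python
# Toy model of CPU-side draw call overhead. All numbers are made up;
# real per-call costs vary by driver, API, and hardware.
CALL_OVERHEAD_US = 20  # hypothetical CPU cost per draw call, in microseconds

def cpu_submit_cost(num_objects, objects_per_call=1):
    """CPU time (us) spent issuing draw calls for one frame."""
    calls = -(-num_objects // objects_per_call)  # ceiling division
    return calls * CALL_OVERHEAD_US

# 10,000 objects drawn one per call vs. instanced in batches of 500:
print(cpu_submit_cost(10_000))       # -> 200000 (microseconds)
print(cpu_submit_cost(10_000, 500))  # -> 400
```

The point of a thinner API like Mantle (or of an on-GPU command processor) is to shrink that fixed per-call constant, not to make the GPU itself faster.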


----------



## refillable (Sep 26, 2013)

Stupid naming scheme


----------



## TheoneandonlyMrK (Sep 26, 2013)

BiggieShady said:


> As I see it, the version of Frostbite in BF4 should have both DirectX and Mantle renderers ... it's to be expected that the Mantle renderer will be buggier at first.
> 
> Now, with AMD trying to reduce draw call overhead with the Mantle API, I wonder if Nvidia is adding an ARM core to the Maxwell GPU to issue draw calls directly on the GPU (in hardware, with no overhead at all).



Mantle is compatible with DX, and both APIs will be supported; just one will be quicker, from what they say.

Lots of negative vibes from a few. WTF, is AMD not allowed to differentiate? Are only Intel and Nvidia allowed closed standards? BS moaning. And Nvidia won't have an answer for a few years, IMHO.
Trouble is brewing for Nvidia; for Intel, not so much, but their driver team will be getting a right earful as we speak.


----------



## buggalugs (Sep 26, 2013)

Can't wait. New hardware, new features, great performance; what more do we want?


----------



## HumanSmoke (Sep 26, 2013)

theoneandonlymrk said:


> Mantle is compatible with DX, and both APIs will be supported; just one will be quicker, from what they say


"They" being the company with a vested interest in saying the API will be quicker. Surprised?


theoneandonlymrk said:


> Lot of neg vibes from a few wtf is amd not allowed to differentiate


From the reaction I've seen, it probably stems from an almost complete absence of hard facts about the implementation. The kind of people that frequent tech forums are somewhat inured to PR-speak. The other less-than-ebullient reaction seems to stem from AMD's rather mixed success rate with new features. Some work well, some not so much: the much-hyped, quickly forgotten VCE? HDMI/sound issues? Enduro? Frame pacing? It makes people cautious.


theoneandonlymrk said:


> are only intel and nvidia allowed closed standards


Isn't Mantle being touted as an open standard ?


theoneandonlymrk said:


> Bs moaning


Pot.Kettle.Black. I can provide examples if you like.


theoneandonlymrk said:


> and nvidia wont have an answer for a few Years imho


Do they need to if Mantle is an open standard ?


theoneandonlymrk said:


> Trouble is brewing for Nvidia; for Intel, not so much, but their driver team will be getting a right earful as we speak.


Maybe AMD's driver and software team could make them an instructional video !


----------



## BiggieShady (Sep 26, 2013)

theoneandonlymrk said:


> Mantle is compatible with DX, and both APIs will be supported; just one will be quicker, from what they say



Not quite ... it's compatible only in the sense that it uses the same shading language (HLSL).


----------



## 63jax (Sep 26, 2013)

i don't see any CF connector on the card???


----------



## springs113 (Sep 26, 2013)

63jax said:


> i don't see any CF connector on the card???



It is there, in the very same spot it normally is... look closely: there's just no cutout where the CrossFire connector/adapter normally slides on, but the markings with the metal/copper contacts are there.


----------



## HumanSmoke (Sep 26, 2013)

springs113 said:


> It is there, in the very same spot it normally is... look closely: there's just no cutout where the CrossFire connector/adapter normally slides on, but the markings with the metal/copper contacts are there.


I was kind of under the impression that the CrossFire bridge was incompatible with 4K resolution; something about the maximum playfield/screen resolution that can be transferred being ~4 megapixels. I know the CrossFire bridge isn't overly friendly for bandwidth (2.5 GT/s), which I assumed was why all the talk of the transfers being made over the PCI-E bus.

BTW: Here's a pretty clear picture of the Crossfire finger contacts you are talking about.






EDIT: Just found this at Tech Report. 


> We noted this fact way back in our six-way Eyefinity write-up: the card-to-card link over a CrossFire bridge can only transfer images up to four megapixels in size. Thus, a CrossFire team connected to multiple displays must pass data from the secondary card to the primary card over PCI Express. The method of compositing frames for Eyefinity is simply different. That's presumably why AMD's current frame-pacing driver can't work its magic on anything beyond a single, four-megapixel monitor.
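That four-megapixel ceiling is easy to sanity-check against common resolutions; a quick sketch (the limit value is the approximate figure cited above, not an official spec):

```python
BRIDGE_LIMIT_PIXELS = 4_000_000  # ~4 MP cap reportedly imposed by the CrossFire bridge

def needs_pcie_transfer(width, height):
    """True if a frame exceeds the bridge limit and must go over PCI Express."""
    return width * height > BRIDGE_LIMIT_PIXELS

print(needs_pcie_transfer(1920, 1080))  # -> False (~2.1 MP fits on the bridge)
print(needs_pcie_transfer(2560, 1600))  # -> True  (~4.1 MP just exceeds it)
print(needs_pcie_transfer(3840, 2160))  # -> True  (UHD is ~8.3 MP)
```

Which lines up with UHD and multi-monitor Eyefinity being the cases where the PCI-E path (XDMA) is needed.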


----------



## okidna (Sep 26, 2013)

HumanSmoke said:


> Maybe AMD's driver and software team could make them an instructional video !



Totally agreed.  They still need to make a better OVERALL driver, not a driver that only works for specific cards under specific conditions.

So I don't have to do all of this : http://www.techpowerup.com/forums/showpost.php?p=2986178&postcount=39


----------



## lZKoce (Sep 26, 2013)

Wow, six pages of comments. I just wanted to say: I am digging the looks. Really fancy right there. They remind me of one of my all-time favourite cards, the ASUS ROG GTX 285 Matrix Edition. I wouldn't mind even more bling. You know what they say: performance may win the fight, but style wins the crowd.


----------



## Ahhzz (Sep 26, 2013)

63jax said:


> i don't see any CF connector on the card???



http://www.techpowerup.com/forums/showpost.php?p=2985764&postcount=14


----------



## Yellow&Nerdy? (Sep 26, 2013)

I kind of hoped for something at the $249 mark, but the R9 270X looks like a pretty solid card. 5500 in Firestrike "Standard" is around the same as a GTX 760, while costing less. If we're lucky, it will also overclock like a madman. Since BF4 is AMD-optimized, it's definitely a no-brainer for me.

Now we just have to wait for availability, hoping to grab one before BF4 releases.


----------



## springs113 (Sep 26, 2013)

I need EK, or preferably Swiftech, to come out with a block, and I am selling my 780 as long as this is on par with it. I may end up keeping my 780 and giving it to my wife or brother, or just getting selfish, keeping it for myself, and putting it in my HTPC case.


----------



## Naito (Sep 26, 2013)

I don't think a proprietary API (Mantle) has much of a place in today's world. You ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware that has ~1/3* of the market, let alone what percentage of those cards are capable of running it?


----------



## Crap Daddy (Sep 26, 2013)

Naito said:


> I don't think a proprietary API (Mantle) has much of a place in today's world. You ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware that has ~1/3* of the market, let alone what percentage of those cards are capable of running it?



You are right, it doesn't. But is it worth spending some money to make Nvidia look bad in one AAA game? AMD sez yes. Thing is, when the 290X is in the hands of the reviewers, Mantle-spiced BF4 will not be available yet.


----------



## nemesis.ie (Sep 26, 2013)

PopcornMachine said:


> So all we got today really was a prices on all but the newest cards, no benchmarks, and a boatload of marketing BS.
> 
> I mean a big boatload.  A shipload.
> 
> ...



I was thrilled to hear about it - it could really add to the immersion, not to say that more info on the gfx side would not have been nice too.


----------



## Nordic (Sep 26, 2013)

Naito said:


> I don't think a proprietary API (Mantle) has much of a place in today's world. You ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware that has ~1/3* of the market, let alone what percentage of those cards are capable of running it?



Aren't all the consoles running AMD hardware? Will they be using Mantle on the consoles? If so, then they would have their foot in the door. All console ports would use this.


----------



## WaX (Sep 26, 2013)

james888 said:


> Aren't all consoles running amd hardware? Will they be using mantle on the consoles? If so then they would have their foot in.



PS4 and XBO natively support Mantle through AMD's GCN based gfx cards. Both PC and Consoles will benefit from this new API. Thank you AMD.


----------



## Ahhzz (Sep 26, 2013)

Xenturion said:


> Excellent. They finally dropped the requirement of an active DisplayPort adapter for 'legacy' monitors in EyeFinity. (Presumably eliminating the screen tearing you get on one monitor from using an adapter; not to mention them dying) I suspect a 290 or 280X will be my next card. I'll wait for the reviews, but the healthy helpings of VRAM on both cards should be ideal for triple monitors. Plus, excellent compute performance never hurts.



As I just installed my adapter last night on my 24" Asus, to finally setup Eyefinity... *sigh*.... Oh well, I won't be in the market for one until next year, (closer to Star Citizen release) anyway....


----------



## NeoXF (Sep 26, 2013)

HumanSmoke said:


> Pricing on all cards except the one everyone is interested in.
> 
> /Damp squib



I can assure you not everyone is interested in "just" that...



Akrian said:


> GPUs....audio technology....say wut ? 0_o



GPUs are used to farm virtual gold (coins)... or to crack your damn passwords and whatnot...
Surprised by first-hand audio integration on them... why?



GC_PaNzerFIN said:


> Little birds tell: 8000 points in 3DMark Fire Strike. gtx 780 does 8500.



On an overclocked i7, yes. However, I'm willing to bet AMD stupidly does its in-house runs with stuff like an A10-6800 or an FX-8350 at most.

Edit: Examples
-> Stock R7970 + stock FX-9590 -> ~5700 marks in FS
-> Stock R7970 + stock i7-3820  -> ~7100 marks in FS
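The CPU sensitivity here is down to how 3DMark composes its result: the overall number blends the GPU and CPU sub-scores via a weighted harmonic mean, so a slow CPU drags the headline figure even with identical graphics results. A sketch with illustrative weights and made-up sub-scores (neither is 3DMark's official data):

```python
def overall_score(graphics, physics, combined,
                  w_g=0.75, w_p=0.15, w_c=0.10):
    """Weighted harmonic mean of the three sub-scores.
    The weights here are illustrative assumptions, not official values."""
    return 1.0 / (w_g / graphics + w_p / physics + w_c / combined)

# Same hypothetical graphics sub-score, paired with a slower vs. faster CPU:
print(round(overall_score(9000, 6000, 4000)))   # -> 7500
print(round(overall_score(9000, 12000, 5000)))  # -> 8633
```

So an 8000-ish slide score on an FX-class test rig and an 8500+ run on an overclocked i7 are not directly comparable.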


----------



## cadaveca (Sep 26, 2013)

NeoXF said:


> On a overclocked i7, yes.



Nah, a stock 3960X or 4770K does 8700 no problem... in fact, much closer to 9000, to be honest. I have a 7970 and a GTX 780, plus the modern CPUs, plus the 3DMark Firestrike benches already done, and can link them easily enough.

But that's beside the point. We have no clockspeed info, no shader count, and no real testing done. Those scores could change.

Anyway, to me "immersive" = VR, and I suppose that AMD has a reason for going this way, although the real use might not be immediately apparent. What does strike me as odd about the whole thing is that Microsoft seems to be one step ahead of AMD here, and that the boatload of Win7 users who are not likely to change OS any time soon, plus some hardware restrictions, will make TrueAudio a difficult sell for some time yet.


----------



## Nordic (Sep 26, 2013)

I think the consoles are the ace in the hole here. If both the Xbox and the PS4 use this and Mantle, all the AAA console ports will have it too. To me it seems AMD is really leveraging its position in consoles to push its new tech.

TressFX was open to anyone. I think all of this new stuff will be too. Of course, it will run best on AMD hardware.


----------



## haswrong (Sep 26, 2013)

Naito said:


> I don't think a proprietary API (Mantle) has much of a place in today's world.



It could get some attention if AMD manages to show at least a 20% performance increase in Crysis 3 using Mantle as opposed to running it via DirectX. It never hurts to become independent of Microsoft.

I hope AMD doesn't drop OpenGL support too...

I have a different question: how does one utilize the audio features? Does the graphics card communicate with a sound device, or what?


----------



## PopcornMachine (Sep 26, 2013)

The CrossFire bridge isn't complete yet. The card isn't ready yet.

Hence the extended discussion about audio and nothing concrete about graphics ability.

A very disappointing pile of crap presented yesterday.


----------



## NeoXF (Sep 26, 2013)

cadaveca said:


> Nah, a stock 3960X or 4770K does 8700 no problem... in fact, much closer to 9000, to be honest. I have a 7970 and a GTX 780, plus the modern CPUs, plus the 3DMark Firestrike benches already done, and can link them easily enough.
> 
> But that's beside the point. We have no clockspeed info, no shader count, and no real testing done. Those scores could change.
> 
> Anyway, to me "immersive" = VR, and I suppose that AMD has a reason for going this way, although the real use might not be immediately apparent. What does strike me as odd about the whole thing is that Microsoft seems to be one step ahead of AMD here, and that the boatload of Win7 users who are not likely to change OS any time soon, plus some hardware restrictions, will make TrueAudio a difficult sell for some time yet.



My info is based on HWBot submissions, so no monkey business; also, I'm not talking about GTX 780 scores...
Point being, I wouldn't swear by it, but I'm pretty sure AMD's scores are based on an FX-8350 (FX-9590 tops) system (PCI-Express 2.0, as a reminder), which doesn't mean much when benching a lower-end card, but will hold back considerably faster cards, in synthetics especially.



haswrong said:


> It could get some attention if AMD manages to show at least a 20% performance increase in Crysis 3 using Mantle as opposed to running it via DirectX. It never hurts to become independent of Microsoft.
> 
> I hope AMD doesn't drop OpenGL support too...
> 
> I have a different question: how does one utilize the audio features? Does the graphics card communicate with a sound device, or what?



Drop OGL support, as in drop Linux support? LOL, never.
Especially since there's a huge chance that Valve's Steam Machines/SteamOS will be based on AMD APUs.


Edit: I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.


----------



## BiggieShady (Sep 26, 2013)

NeoXF said:


> I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.



Bloatware for those that don't like to (or worse, that don't know how to) configure their games themselves. There's my comment.


----------



## haswrong (Sep 26, 2013)

NeoXF said:


> Edit: I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.



What's to comment on anyway? AMD are the best of the best! At least that's what they told me during the livestream.

Well, it's a caddish data-mining operation for monitoring what programs users run (games at this stage, but there's certainly much more info to mine). I wonder what or who decides the optimal game setting (users may broadcast their settings back to AMD, but the most-used setting doesn't have to be the smoothest)... and AMD = copycat, as with the boost/turbo feature.


----------



## TheoneandonlyMrK (Sep 26, 2013)

haswrong said:


> What's to comment on anyway? AMD are the best of the best! At least that's what they told me during the livestream.
> 
> Well, it's a caddish data-mining operation for monitoring what programs users run (games at this stage, but there's certainly much more info to mine). I wonder what or who decides the optimal game setting (users may broadcast their settings back to AMD, but the most-used setting doesn't have to be the smoothest)... and AMD = copycat, as with the boost/turbo feature.



Only a fool of the Spinal Tap variety would believe Nvidia invented boost/turbo.

For years all chips have had performance steps etched into their low-level microcode, and there is no difference at all between how that and Boost work; the only change is thermal/power monitoring. Err, no, actually they were doing that anyway too, just not as well.

My guitar amp goes to 11 too, BTW; I Tipp-Exed it on, it's way better now.

Oh, and I understand it all, no worries. I get that they use it better now to auto-OC :laugh: when a game is not loading the card up as much in heat or load terms, and that's why I don't own a GK104-based card.


----------



## KainXS (Sep 26, 2013)

At least these will ram the prices of everything down.

Thank you, AMD.


----------



## haswrong (Sep 26, 2013)

theoneandonlymrk said:


> Only a fool of the Spinal Tap variety would believe Nvidia invented boost/turbo.
> 
> For years all chips have had performance steps etched into their low-level microcode, and there is no difference at all between how that and Boost work; the only change is thermal/power monitoring. Err, no, actually they were doing that anyway too, just not as well.



OK, I didn't notice until it became an advertised feature. I usually only change graphics cards every 4-5 years, to my excuse...




KainXS said:


> At least these will ram the prices of everything down.
> Thank you, AMD.



Would be nice if it rammed down the price of RAM too, somehow...


----------



## TheoneandonlyMrK (Sep 26, 2013)

Crap Daddy said:


> You are right, it doesn't. But is it worth spending some money to make Nvidia look bad in one AAA game? AMD sez yes. Thing is, when the 290X is in the hands of the reviewers, Mantle-spiced BF4 will not be available yet.



:shadedshu 



PopcornMachine said:


> crossfire bridge isn't complete yet.  card isn't ready yet.
> 
> hence the extended discussion about audio and nothing concrete about graphics ability.
> 
> very disappointing pile of crap presented yesterday.



You've clearly been on holiday the last two weeks and didn't see Mantle at all. No CrossFire dongle is required (or usable) at UHD resolutions, so it works over PCIe.



Naito said:


> I don't think a proprietary API (Mantle) has much of a place in today's world. You ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware that has ~1/3* of the market, let alone what percentage of those cards are capable of running it?



In a world where all games bar maybe a quarter run on AMD hardware, this rocks.
Your third is just PC; add all the Sony, Xbox, and Nintendo gamers and you have something that makes so much sense it would be stupid not to do it. Though I agree other APIs should still be coded for; IMHO just OpenGL and not DX. In fact, I'm going to say it: five years and DX is a dodo.


----------



## the54thvoid (Sep 26, 2013)

Just ran Firestrike for giggles at 876Mhz core (modded bios removed boost) on a Titan with cpu @ 4.4GHz.

http://www.3dmark.com/3dm/1283922?


----------



## springs113 (Sep 26, 2013)

the54thvoid said:


> Just ran Firestrike for giggles at 876Mhz core (modded bios removed boost) on a Titan with cpu @ 4.4GHz.
> 
> http://www.3dmark.com/3dm/1283922?



I get higher with a 4770k @4.5 and my hydro copper on stock


----------



## TheoneandonlyMrK (Sep 26, 2013)

springs113 said:


> I get higher with a 4770k @4.5 and my hydro copper on stock



get a room will ya's:shadedshu


----------



## PopcornMachine (Sep 26, 2013)

theoneandonlymrk said:


> You've clearly been on holiday the last two weeks and didn't see Mantle at all. No CrossFire dongle is required (or usable) at UHD resolutions, so it works over PCIe.



Where'd you pull this out of?

So no one running UHD (whichever definition of that you use) will need or be able to use multiple cards????

Another site clearly said this is an early engineering sample as the crossfire connects were not complete.


----------



## TheoneandonlyMrK (Sep 26, 2013)

PopcornMachine said:


> Where'd you pull this out of?
> 
> So no one running UHD (whichever definition of that you use) will need or be able to use multiple cards????
> 
> Another site clearly said this is an early engineering sample as the crossfire connects were not complete.



Sorry mate, I've already read most of these threads once and I'm not looking again (no time), but it's stated by others as the reason frame pacing is not working on multi-monitor Eyefinity setups, and it makes sense, as you can force dongle-less CrossFire now; it works even on my 5's.


----------



## Nordic (Sep 26, 2013)

PopcornMachine said:


> Where'd you pull this out of?
> 
> So no one running UHD (whichever definition of that you use) will need or be able to use multiple cards????
> 
> Another site clearly said this is an early engineering sample as the crossfire connects were not complete.



I searched the thread for the term "crossfire" and found it really quickly.

http://www.techradar.com/news/compu...u-the-radeon-r9-290x-1183950?src=rss&attr=all


> Notably, the Crossfire connectors are gone. Crossfire will now be handled via PCI Express. Additionally, the Radeon R9 290X will sport two DVI outputs, a DisplayPort output, and an HDMI output.


----------



## the54thvoid (Sep 26, 2013)

springs113 said:


> I get higher with a 4770k @4.5 and my hydro copper on stock



Screenshot of Afterburner or Precision? Unless you're running a boost-disabled BIOS, you'll find your card is running far higher boost clocks. This isn't a pissing contest; it was posted to show that a Titan at its basic advertised boost of 876 MHz (held there via Svl7's V3 boost-disabled BIOS) still gets about 8500 in Firestrike, compared to what the PR slides for the R9 290X show (<8000).

My Titan at 1202/7000 gets 11101 in Firestrike.


----------



## TheoneandonlyMrK (Sep 26, 2013)

the54thvoid said:


> Screenshot of Afterburner or Precision? Unless you're running a boost-disabled BIOS, you'll find your card is running far higher boost clocks. This isn't a pissing contest; it was posted to show that a Titan at its basic advertised boost of 876 MHz (held there via Svl7's V3 boost-disabled BIOS) still gets about 8500 in Firestrike, compared to what the PR slides for the R9 290X show (<8000).
> 
> My Titan at 1202/7000 gets 11101 in Firestrike.



The PR slides merely indicate, but you run with it.


----------



## springs113 (Sep 26, 2013)

the54thvoid said:


> Screenshot of Afterburner or Precision? Unless you're running a boost-disabled BIOS, you'll find your card is running far higher boost clocks. This isn't a pissing contest; it was posted to show that a Titan at its basic advertised boost of 876 MHz (held there via Svl7's V3 boost-disabled BIOS) still gets about 8500 in Firestrike, compared to what the PR slides for the R9 290X show (<8000).
> 
> My Titan at 1202/7000 gets 11101 in Firestrike.



I'm running my card stock... when I get home I will. I fell into the top bracket in 3DMark the last time I checked, not the second tier that you are in.


----------



## Horrux (Sep 26, 2013)

Roph said:


> Eh, so AMD's "Mantle" thing is basically like 3DFX's Glide?
> 
> Even as an AMD user I'm not sure I like the sound of a proprietary API like that. I'd much rather see advancements in their drivers / DirectX to get performance closer to the metal instead.



No, it's a very GOOD thing. Games will be developed using that API, and then possibly also ported to DX11. But AMD card users will be able to run it natively, because all the consoles use x86 and GCN.


----------



## NeoXF (Sep 26, 2013)

BiggieShady said:


> Bloatware for those that don't like to (or worse, that don't know how to) configure their games themselves. There's my comment.



Again, I ask myself why I even remotely expect an objective answer from a forum packed full of so-called "enthusiasts". Being an enthusiast who loves to configure every little nut and bolt of your system and its software/games doesn't mean the world revolves around you, anything related to you, or, much less, the things you "own". I know a ton of people who either don't know how, don't have the time... or just want TO GAME, nothing less, nothing more, instead of messing about with every little non-gameplay aspect of their games on a PC... and trust me, they outnumber us by a lot.

So please, people, spare me these comments about how useless this or that is, what is or isn't bloatware, or how this and that is a data-mining conspiracy (which might very well be true, but seriously, just unplug your internet and be done with it if you're that desperate). I'd like to read more valid opinions than that on what is supposed to be a very decent forum...


----------



## Xzibit (Sep 26, 2013)

I wouldn't get too hung up on that PR slide of the R9 290X.







If you move your eyes a little to the left, you can see all the cards have 8 semi-transparent bars to fill, across the board.

If you look at the bottom of the graph, it says:
*"APPROVED FOR PUBLIC DISTRIBUTION"*


I put it down to cleaver play with Photoshop, with a hint of mystery to keep you guys talking and speculating.


----------



## Fluffmeister (Sep 26, 2013)

I think you're giving AMD too much credit.


----------



## Xzibit (Sep 27, 2013)

Fluffmeister said:


> I think you're giving AMD too much credit.



If I wanted to give them credit I'd go on and on in several forums talking about how accurate a PR slide was.



Never mind the devoted few who despise anything non-green and always dismiss information as a PR stunt, and who are now doing a 180, singling out this sole PR slide as the only accurate slide AMD has released in their lifetime.

It's amusing to see the blatant hypocrisy that comes from convenience, to advance one's own viewpoint.



/morepopcornoncouch


----------



## Fluffmeister (Sep 27, 2013)

Hehe indeed, you should read what you just typed, take a long hard look in the mirror and say this three times:

nVidia didn't steal my first born....
nVidia didn't steal my first born....
...

Just messing, I have to ask though, if AMD don't know the performance of their own cards, who truly does? They do seem to suck at PR, I'll give you that.

Let's hope you're right, for your own sanity at least.

/morepopcorn


----------



## HumanSmoke (Sep 27, 2013)

Xzibit said:


> I wouldn't get to hung up on that PR slide of the R9 290X...[_snip_]...I put it up to a cleaver play with Photoshop with a hint of mystery to keep you guys talking and speculation.


So, the AMD slide might be bullshit?
Quite possibly... which leaves open the obvious possibility that that slide isn't the only one. Maybe a batch of slides, or the whole presentation, was a snow job? Smoke and mirrors, aided by carefully picked speakers specifically chosen to render the audience into a stupor as misdirection.

You may be on to something.

As cleaver play goes, though, it leaves a lot to be desired. This is cleaver play


----------



## Xzibit (Sep 27, 2013)

HumanSmoke said:


> So, the AMD slide might be bullshit?
> Quite possibly... which leaves open the obvious possibility that that slide isn't the only one. Maybe a batch of slides, or the whole presentation, was a snow job? Smoke and mirrors, aided by carefully picked speakers specifically chosen to render the audience into a stupor as misdirection.
> 
> You may be on to something.
> ...



Didn't say it was BS. I just mocked it for what it is: a PR slide. If there is no other information to corroborate it, it's not that informative to me, other than being what they want to show.

I'm not going to jump to the conclusion that it's a definitive answer. Far too many unknown variables.

I'll keep my speculation to a minimum.

I think it was somewhat clever. Don't you think?
Sure seemed clever, the way it had you talking about it.
For a person who hates anything AMD, you sure talk about it a lot, so kudos to them for making an Nvidia die-hard talk about it.
I'll await your thesis on it.

/patientlyawaitsforndatolift


----------



## Fluffmeister (Sep 27, 2013)

It's a shame we don't have more info to go on, maybe they should hold some sort of PR event on I dunno.... Hawaii.


----------



## Xenturion (Sep 27, 2013)

The Von Matrices said:


> Where did you see this information about the removal of the active displayport adapter requirement? I didn't see it anywhere in the presentation.





btarunr said:


> In addition, gamers more accustomed to AMD Eyefinity multi-display technology will be freed to use virtually any combination of display outputs when connecting matching monitors to the DVI or HDMI outputs on their system.



I also looked through the slides, but I couldn't find it there either. It's nice that they've followed Nvidia's lead because removing the requirement of DP definitely reduces the entry price of multi-display setups.


----------



## The Von Matrices (Sep 27, 2013)

Xenturion said:


> I also looked through the slides, but I couldn't find it there either. It's nice that they've followed Nvidia's lead because removing the requirement of DP definitely reduces the entry price of multi-display setups.



If this is true, I don't see the price as the major advantage. Single-link active DisplayPort adapters are now less than $20. However, the third TMDS link would eliminate the tearing on the third screen that occurs with an active DisplayPort adapter.


----------



## HumanSmoke (Sep 27, 2013)

Fluffmeister said:


> It's a shame we don't have more info to go on, maybe they should hold some sort of PR event on I dunno.... Hawaii.


LOL. Well played.
I'd come to much the same conclusion some time before the event, so I wasn't unduly surprised. AMD's modus operandi hasn't changed for years.


Xzibit said:


> For a person who hates anything AMD you sure talk about it a lot, so kudos to them for making an Nvidia die-hard talk about it.
> I'll await your thesis on it.


Sure no problem. Already published almost a year ago.

Makes a change from being labelled an AMD fanboy


----------



## Xzibit (Sep 27, 2013)

HumanSmoke said:


> Sure no problem. Already published almost a year ago.
> 
> Makes a change from being labelled an AMD fanboy



Still pandering your collage I see.

Hope that wasn't your highlight since it contains nothing about the R9 290X

They are some interesting post on that slide I posted above that you made though 

Maybe follow this guys advice.


Fluffmeister said:


> Sucks to live in middle earth then, move closer to civilization.




/waitingforndatolift


----------



## HumanSmoke (Sep 27, 2013)

Didn't take you long to revert to type with the personal shit I see.


Xzibit said:


> Still pandering your collage (?) I see.
> Hope that wasn't your highlight since it contains nothing about the R9 290X [what part of "Already published almost a year ago" was so difficult to understand?]
> They are some interesting post on that slide I posted above that you made though[Interpreter required]


Oooh, sick burn coming from the guy that uses sentence structure such as " _They are some interesting post on that slide I posted above that you made though_" and can't complete a simple post without mangling the grammar, punctuation, and sentence structure.:shadedshu

 Colour me abashed.


----------



## HammerON (Sep 27, 2013)

Alright you two. Enough of the bickering. Move along please.


----------



## Prima.Vera (Sep 27, 2013)

Forgive the ignorance, but WHEN will the cards be available for testing and benching? I mean in the office before public release.


----------



## the54thvoid (Sep 27, 2013)

I'm just amused that certain peeps are saying AMD's very own PR slide showing the Fire Strike scores is unreliable.  I stated I wanted the new card to beat Titan, so I could go and get a card I could tamper with more than my current castrated card.  So to me, the first sneak peek is a disappointment, not a victory for Nvidia.

Folk really ought to stop reading posts in such combative ways.  Only a dick would _want_ AMD to fail with a new piece of tech, and they would have no place on a tech forum like this.


----------



## RCoon (Sep 27, 2013)

the54thvoid said:


> I'm just amused that certain peeps are saying AMD's very own PR slide showing the Fire Strike scores is unreliable.  I stated I wanted the new card to beat Titan, so I could go and get a card I could tamper with more than my current castrated card.  So to me, the first sneak peek is a disappointment, not a victory for Nvidia.
> 
> Folk really ought to stop reading posts in such combative ways.  Only a dick would _want_ AMD to fail with a new piece of tech, and they would have no place on a tech forum like this.



Seconded. I want the R9 290X to be a viable alternative to all the voltage-locked 780s/Titans, so we can get back to serious overclocking on water without having to mod this and that just to avoid hitting arbitrary limits.


----------



## HumanSmoke (Sep 27, 2013)

the54thvoid said:


> I'm just amused that certain peeps are saying AMD's very own PR slide showing the Fire Strike scores is unreliable.


It's not a road any company would want to go down, IMO. Short-term strategic gain generally isn't worth the long-term loss of goodwill.


the54thvoid said:


> I stated I wanted the new card to beat Titan, so I could go and get a card I could tamper with more than my current castrated card.  So to me, the first sneak peek is a disappointment, not a victory for Nvidia.


I wouldn't really even see it as a disappointment. If, on the balance of benchmarks, it trades blows with the GTX 780/Titan, then that is well and good, depending upon price, because there are going to be apps (OpenCL etc.) and games where the Radeon will be faster in any case. In the end, it doesn't really matter what the performance over and above the opposition, or its own product stack, is unless the price is right. Price it at $600 or $650, as has been mentioned in various places, and it pretty much doesn't change anything (Zotac and PNY are both in that price range, from a quick Google) except adding the option of another feature set.


the54thvoid said:


> Folk really ought to stop reading posts in such combative ways.


Seconded


the54thvoid said:


> Only a dick would _want_ AMD to fail with a new piece of tech, and they would have no place on a tech forum like this.


Seems more a case of one group using the available information to ascertain a possible/probable level of performance, and another group being a little defensive over the findings. I don't see a lot of people wishing AMD away... more a case of tempering expectation with the information given by the vendor.

There's always been an irrational train of thought that the next architecture is going to be the one, the [insert huge percentage] quantum leap reminiscent of the ~1994-99 era, and it just ain't going to happen. Even if the technology breakthroughs were available, it is still two companies carving up a cake that used to have fifty companies with their hands in it.


----------



## BiggieShady (Sep 27, 2013)

NeoXF said:


> I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.





NeoXF said:


> So please, you people, spare me these comments of how useless this or that is, what is bloatware or not how this and that is a data mining conspiracy (which might very well be true, but seriously now, just unplug your internet and be done with it if you're that desperate). I'd like to read more valid opinions than that on what is supposed to be a very decent forum...



You have a funny notion of indecency.

I'm sorry you didn't like my comment. I hope all the comments you ask for turn out to be ones you actually like... let's cross our fingers and wish it really, really hard... and if it doesn't work for some reason, look on the bright side: at least you're not that surprised anymore.


----------



## Xzibit (Sep 27, 2013)

the54thvoid said:


> I'm just amused that certain peeps are saying AMD's very own PR slide showing the Firestrike scores is unreliable.



I guess I was confused, since everyone was saying R9 280X = 7970 GHz.

Back in March Brandonwh64 posted this valid score (old drivers)







Unless I'm reading his data wrong, he's using an original 7970 boosted to 7970 GHz stock clocks, and the numbers just didn't match up to me.

That would mean the AMD PR slide is off by 400 to 600 points with the R9 280X. Since they seemed to be that far off on a supposed refresh, it raised an eyebrow.

Silly me, for not taking things at face value.


----------



## PopcornMachine (Sep 27, 2013)

james888 said:


> Searched the thread with the term "crossfire" and found it really quick.
> 
> http://www.techradar.com/news/compu...u-the-radeon-r9-290x-1183950?src=rss&attr=all



That is speculation at this point.  AMD said nothing about this.

Maybe true, but why wouldn't they say that at the big info press conference thing?

Seems more noteworthy than audio.


----------



## NeoXF (Sep 27, 2013)

Xzibit said:


> I guess I was confused, since everyone was saying R9 280X = 7970 GHz.
> 
> Back in March Brandonwh64 posted this valid score (old drivers)
> 
> ...








That's a 1050 MHz, 1100 MHz Turbo, stock 6000 MHz VRAM HD 7970 GHz... so pretty much what we should expect from a 280X... sooo... yeah...
(Of course, you also have to take into account the 4.5 GHz hexacore i7 platform it's running on.)

And a 1000-point discrepancy is nothing to sneeze at... I imagine the R9 290X (an even more CPU-limited GPU) is off by even more in those claims.
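For what it's worth, the back-of-the-envelope scaling being argued over here can be sketched in a few lines. All the numbers below are hypothetical (an illustrative baseline score, plus the stock 7970's 925 MHz core clock), and real Fire Strike scores don't scale perfectly linearly with core clock, so this is only a rough upper-bound estimate:

```python
# Back-of-the-envelope estimate: scale a known graphics score by the
# core-clock ratio to guess what a re-clocked card "should" score.
# Linear scaling is an upper bound; it ignores memory bandwidth and
# CPU limits, which is exactly why real results land lower.

def scaled_score(base_score: float, base_clock_mhz: float, new_clock_mhz: float) -> float:
    """Linear clock-scaling estimate of a benchmark score."""
    return base_score * (new_clock_mhz / base_clock_mhz)

# Example: a stock HD 7970 at 925 MHz with a hypothetical score of 7500,
# estimated at a 1050 MHz boost clock.
estimate = scaled_score(7500, 925, 1050)
print(round(estimate))  # 8514
```

If the measured score of the faster-clocked card comes in well under this linear estimate, the gap is down to those other bottlenecks, which is the "CPU-limited" point being made above.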


----------



## Nordic (Sep 27, 2013)

PopcornMachine said:


> That is speculation at this point.  AMD said nothing about this.
> 
> Maybe true, but why wouldn't they say that at the big info press conference thing?
> 
> Seems more noteworthy than audio.



I was simply answering your question of where he pulled that information from.


----------



## PopcornMachine (Sep 27, 2013)

james888 said:


> I was simply answering your question of where he pulled that information from.



Thanks for the info.  I heard something different.  Turns out AMD hasn't really said yet. 

If they'd been clearer on the important stuff, it would have saved all of us a lot of trouble.


----------



## TheoneandonlyMrK (Sep 27, 2013)

HumanSmoke said:


> Seems more a case of one group using the available information to ascertain a possible/probable level of performance, and another group being a little defensive over the findings. I don't see a lot of people wishing AMD away... more a case of tempering expectation with the information given by the vendor.
> 
> There's always been an irrational train of thought that the next architecture is going to be the one, the [insert huge percentage] quantum leap reminiscent of the ~1994-99 era, and it just ain't going to happen. Even if the technology breakthroughs were available, it is still two companies carving up a cake that used to have fifty companies with their hands in it.



Guess what: yet again I think you're wrong. That next evolutionary step is going to come fairly soon (1-2 years): a node down, involving TSV-enhanced chips and additional memory near the die, plus perhaps some serial compute power. BTW, this is only my opinion, don't get too excited, and that's not a vendor-specific rumour IMHO.


----------



## HumanSmoke (Sep 28, 2013)

theoneandonlymrk said:


> Guess what: yet again I think you're wrong. That next evolutionary step is going to come fairly soon (1-2 years): a node down, involving TSV-enhanced chips and additional memory near the die, plus perhaps some serial compute power. BTW, this is only my opinion, don't get too excited, and that's not a vendor-specific rumour IMHO.



Well, I certainly hope you're right. But knowing how the IHVs operate, they seldom extend themselves unless they need to, and mono/duopolies tend to advance incrementally in order to extract the most from their herd. As little as 5-6 years ago the determining factor was how big and complex you could make a GPU, and you were limited by the fab tooling. That really isn't the case anymore: litho tools have a reticle size that allows for 800+ mm² (858 mm² for the ASML TWINSCAN, for example), yet the only company taking advantage of the litho advancement in die size is Intel (Xeon Phi is supposedly 650-700 mm²), and pretty much only to advance its standard/integrated product line: basically the same tactics Intel used twenty years ago with ProShare.
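As a quick sketch of that reticle arithmetic (the 858 mm² TWINSCAN field size is the figure cited above; the die areas are the post's own rough estimates, not official numbers):

```python
# Compare approximate die areas against a lithography reticle (field) limit.
RETICLE_LIMIT_MM2 = 858  # ASML TWINSCAN field size, per the figure cited above

# Rough, illustrative die-area estimates (mm^2), not official specs.
dies_mm2 = {
    "Xeon Phi (upper estimate)": 700,
    "large contemporary GPU": 550,
}

for name, area in dies_mm2.items():
    headroom = RETICLE_LIMIT_MM2 - area
    print(f"{name}: {area} mm^2 leaves {headroom} mm^2 of reticle headroom")
```

The point being that even the biggest contemporary chips sit well inside what the tooling can image in one exposure, so die size is now a business decision rather than a fab limit.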

Anyhow, always good to hear another's opinion. Here's to a brighter future.


----------

