# nVidia Fanboy Here - Moving on from GTX 760 2GB - Should I go AMD?



## shovenose (Jun 5, 2016)

In any other situation I would buy green without even looking at red... but my MSI Z87M-G43 motherboard does NOT support SLI, only CrossFire (I didn't notice this when I bought it). My i5-4590, 16GB of DDR3, 2TB HDD, and 60GB Intel SSD cache are still performing well for me, so I don't see the need to replace my whole computer yet. I also really don't want to change motherboards, since I'd have to reinstall my OS and I really don't want to do that. So either I buy one new nVidia card or I go with two new AMD cards.

I want to spend about $400 to get the best performance across all of my games. My main priority is GTA V for PC. In the past, I've noticed that nVidia GPUs played GTA IV PC better, but I'm not sure if this is still the case for GTA V?

My thought is to wait another month or so for the Radeon RX480 to come out and buy two of them. Is this the best use of my $400 graphics card budget?

I was looking at what's currently on the market as far as nVidia goes, and it's either a GTX 970 for around $300 or a GTX 980 for around $500... so I can't stay within my budget. If my motherboard supported SLI I'd buy one GTX 970 now and another next year... but that's simply not an option. I did see the GTX 1070, which should cost about $400... would that be a good option?

Nothing that AMD currently sells (R7, R9, Fury) interests me. So, should I buy two RX 480s or one GTX 1070? Your input is much appreciated!


----------



## TheoneandonlyMrK (Jun 5, 2016)

Just get a 1070. Anything else won't be green enough for a fanboy, would it? Plus, the 1070 should be a nice jump in performance.


----------



## shovenose (Jun 5, 2016)

theoneandonlymrk said:


> Just get a 1070. Anything else won't be green enough for a fanboy, would it? Plus, the 1070 should be a nice jump in performance.



AMD has sucked for a really long time, but with these new cards and their new Zen stuff they might actually be competitive again, and if that's the case, I would like to support them. That is why I am considering them. I guess we should wait for official benchmarks, but I'm still curious whether people think the GTX 1070 or two RX 480s would be the better choice. Once again, my primary concern, since you seem to have missed it and instead decided to start making fun of my brand preference, is how GTA V will perform.


----------



## 64K (Jun 5, 2016)

I would say don't be in a hurry. Let's see the benches from both camps and then decide. If you want to stick with Nvidia no matter what, then the 1070 would be a good sub-$400 choice and will no doubt smoke that 760 at 1080p that you have right now. The VRAM upgrade will be necessary soon as well.

Edit: Never look at going Crossfire right off the bat. It's better to get the single most powerful card you can within your present budget and look at Crossfire if you have to later on down the road.


----------



## erocker (Jun 5, 2016)

shovenose said:


> AMD has sucked for a really long time, but with these new cards and their new Zen stuff they might actually be competitive again, and if that's the case, I would like to support them. That is why I am considering them. I guess we should wait for official benchmarks, but I'm still curious whether people think the GTX 1070 or two RX 480s would be the better choice. Once again, my primary concern, since you seem to have missed it and instead decided to start making fun of my brand preference, is how GTA V will perform.


Unless you want to wait until around the end of the year, or get two 480Xs at the end of this month (meh, CrossFire), I would think the 1070 is the way to go.


----------



## newtekie1 (Jun 5, 2016)

shovenose said:


> My thought is to wait another month or so for the Radeon RX480 to come out and buy two of them. Is this the best use of my $400 graphics card budget?



No, you are going to want to avoid crossfire at all costs.  Reason?  Your motherboard and AMD's decision to use the PCI-E bus for crossfire data sent between the two cards.  Sure, it technically supports crossfire, but the slots will dramatically cripple it.

If anything, wait until the RX480 is out, if it is truly as powerful as some think it will be, then it might trigger a price drop on the 1070.  If it isn't really that powerful, then just go with a single 1070 and call it a day.


----------



## Zenith (Jun 5, 2016)

Bought a 980 Ti a month ago; if I were buying at this moment I'd buy a GTX 1070 for sure. Less heat, newer tech, better OC, less power consumption, 980 Ti performance, cheaper...


----------



## dieselcat18 (Jun 5, 2016)

Zenith said:


> Bought a 980 Ti a month ago; if I were buying at this moment I'd buy a GTX 1070 for sure. Less heat, newer tech, better OC, less power consumption, 980 Ti performance, cheaper...



Same here... just purchased a 980 Ti last month. If I had waited, I would have looked at the 1070 for a bit less money... though I'm pleased with the price & performance of the 980 Ti I got atm.


----------



## Devon68 (Jun 5, 2016)

I would probably go with a 1070. While you could buy two RX 480s, you should wait and see how they perform. Either way, I would usually go with one card instead of two.


----------



## HD64G (Jun 5, 2016)

I'd get the 480 if it's at 390X/980 performance level, since its price is a steal. If by next year some better options exist, sell it for two-thirds of its price and get something newer and more powerful. The 1070 is good atm, but I'm not sure how good the value is, since it will sink in price once the performance models come out. Normal for every new gen of hardware. And as for CF by getting 2x 480s, let's wait for the reviews, especially for DX12 games.


----------



## ZoneDymo (Jun 5, 2016)

newtekie1 said:


> No, you are going to want to avoid crossfire at all costs.  Reason?  Your motherboard and AMD's decision to use the PCI-E bus for crossfire data sent between the two cards.  Sure, it technically supports crossfire, but the slots will dramatically cripple it.
> 
> If anything, wait until the RX480 is out, if it is truly as powerful as some think it will be, then it might trigger a price drop on the 1070.  If it isn't really that powerful, then just go with a single 1070 and call it a day.



"but the slots will dramatically cripple crossfire"

Got any link to where you got that info? Because that to me seems like a bunch of BS, seeing how much data PCI-E slots can transfer, way more than any card uses atm.


----------



## cdawall (Jun 5, 2016)

ZoneDymo said:


> "but the slots will dramatically cripple crossfire"
> 
> Got any link to where you got that info? Because that to me seems like a bunch of BS, seeing how much data PCI-E slots can transfer, way more than any card uses atm.



The second slot is PCIe 2.0 x4; it will cripple performance somewhat. My suggestion would be for him to run a single good card and avoid xfire/SLI as a whole.

This is single-card scaling; crossfire will only exacerbate the issue.
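For context on why that x4 slot matters, here is a rough back-of-the-envelope sketch of theoretical one-direction PCIe bandwidth per generation, using the standard per-lane signaling rates and encoding overheads (the slot layout shown is the mismatch discussed above, not a measurement):

```python
# Theoretical one-direction PCIe bandwidth, accounting for each
# generation's encoding overhead (8b/10b for 1.x/2.0, 128b/130b for 3.0).
GT_PER_S = {"1.1": 2.5, "2.0": 5.0, "3.0": 8.0}   # transfers/s per lane
ENCODING = {"1.1": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}

def bandwidth_gbs(gen: str, lanes: int) -> float:
    """Usable bandwidth in GB/s: GT/s x encoding x lanes, bits -> bytes."""
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8

# The mismatch under discussion: a primary x16 3.0 slot vs. a
# chipset-fed x4 2.0 slot.
print(f"x16 3.0: {bandwidth_gbs('3.0', 16):.2f} GB/s")  # 15.75 GB/s
print(f"x4  2.0: {bandwidth_gbs('2.0', 4):.2f} GB/s")   # 2.00 GB/s
```

Nearly an 8x gap in raw link bandwidth, before any crossfire traffic is layered on top.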


----------



## LightningJR (Jun 5, 2016)

The 1070 is an obvious choice, but if you have no issues waiting then why not wait? I have no experience with SLI or Crossfire, but if the rumors are true then two 480s should be top in price/performance (when the game takes advantage of it). Wait for the benchies.


----------



## Dethroy (Jun 5, 2016)

Do you game on 1 or all 3 of your monitors? If you game on a single monitor, the RX 480 will handle every game @1080p easily. But if you play on 3 monitors or plan on upgrading to ultrawide, 1440p, or 4K, then I'd look in the green direction.
Anyway, I'd await the reviews of the RX 480(X) cards before making my decision.


----------



## arbiter (Jun 5, 2016)

LightningJR said:


> The 1070 is an obvious choice, but if you have no issues waiting then why not wait? I have no experience with SLI or Crossfire, but if the rumors are true then two 480s should be top in price/performance (when the game takes advantage of it). Wait for the benchies.


The only rumors are the ones put out by AMD, so take that with a grain of salt. Independent reviewers will likely find two 480s at around the same performance as a 1070. If you get a 1070 for $400, then you'd have two 4GB 480s at that price; the same in 8GB could be $100 more for the 8GB premium. The other thing is 300W vs. 150W power draw.


Dethroy said:


> Anyway, I'd await the reviews of the RX 480(X) cards before making my decision.


^ Pretty much the best advice when looking at any AMD card nowadays. AMD's PR gives you very little to believe anymore when they speak.



64K said:


> Edit: Never look at going Crossfire right off the bat. It's better to get the single most powerful card you can within your present budget and look at Crossfire if you have to later on down the road.



My stance now on CF and even SLI: since it's going to be in the hands of game devs now because of DX12, it's probably best to avoid both, since it will probably be at least a year or two before game devs can sort it out and make it work properly.


----------



## newtekie1 (Jun 5, 2016)

ZoneDymo said:


> "but the slots will dramatically cripple crossfire"
> 
> Got any link to where you got that info? because that to me seems like a bunch of bs seeing as how much data PCI-E slots can transfer, way more then any card uses atm.



The 2nd PCI-E x16 slot on his motherboard is only an x4 2.0 slot.  This PCI-E scaling test is already showing a single card losing performance in that type of slot; add Crossfire communication to that and performance is going to suffer greatly.


----------



## jaggerwild (Jun 5, 2016)

I'd suggest waiting it out. First, about getting a card: the current 1070 will be old soon, and it's a mid-tier card to begin with. If you feel the need to spend, buy an SLI/CF-capable motherboard so that down the road, whatever card you do buy, you can then SLI or CF.


----------



## thesmokingman (Jun 5, 2016)

newtekie1 said:


> The 2nd PCI-E x16 slot on his motherboard is only an x4 2.0 slot.  This PCI-E scaling test is already showing a single card losing performance in that type of slot; add Crossfire communication to that and performance is going to suffer greatly.




Pro tip for you: any multi-GPU setup will run like ass with seriously mismatched slot widths, whether it be Green or Red, ignoring the fact that his mobo is seriously under-engineered for the task.


----------



## Grings (Jun 6, 2016)

If you consider yourself an nVidia fanboy, I'd certainly say wait to see 1070 pricing.

Given that the 1080 has released and non-Founders editions are around the same price as 980 Tis, I would expect the 1070 to be at 980 prices, at least for the first month or so.

This is also why (I think) we see talk of 1060/1065 cards with a 256-bit memory bus: they will come priced closer to 970 prices than 960 prices.


----------



## arbiter (Jun 6, 2016)

thesmokingman said:


> Pro tip for you: any multi-GPU setup will run like ass with seriously mismatched slot widths, whether it be Green or Red, ignoring the fact that his mobo is seriously under-engineered for the task.


I think Nvidia has a limit built in so that an x4 slot won't work for SLI.


----------



## qubit (Jun 6, 2016)

Stay green and go for the 1070, but wait until AMD releases the new models first to hopefully get a decent price drop. Let AMD prove themselves with their next-gen cards.

As NT said, this latest version of CrossFire is crippled, so don't bother with it. You could invest in a new NVIDIA SLI-capable mobo down the line, although SLI has issues too, so I advise that you just stay with your current mobo and one card.


----------



## cdawall (Jun 6, 2016)

arbiter said:


> I think Nvidia has a limit built in so that an x4 slot won't work for SLI.



It'll work it just won't work well.


----------



## newtekie1 (Jun 6, 2016)

thesmokingman said:


> Pro tip for you: any multi-GPU setup will run like ass with seriously mismatched slot widths, whether it be Green or Red, ignoring the fact that his mobo is seriously under-engineered for the task.



That is correct; I'm not saying that isn't an issue on both sides. However, the problem is worse with AMD now that the cards do all communication through the PCI-e slots.


----------



## 5DVX0130 (Jun 6, 2016)

Well, $400 won't buy you a 1070, that much is certain. Also expect it to be pretty much sold out and overpriced into oblivion for (at least) the first two months.

The RX cards are still largely a mystery, and while they should launch at the end of June, we only have speculation till then. At the same time, do note that if the card turns out to be as good as suggested, expect the 10x0 story to repeat (hopefully it does) and cards to be sold out for a while.

The best suggestion I can give you... unless you are in a hurry, just wait till August/September and make up your mind then based on reviews. By then stock and prices should have stabilized, drivers matured, any issues popped up, and custom cards made an appearance.


----------



## Kanan (Jun 6, 2016)

64K said:


> I would say don't be in a hurry. Let's see the benches from both camps and then decide. If you want to stick with Nvidia no matter what, then the 1070 would be a good sub-$400 choice and will no doubt smoke that 760 at 1080p that you have right now. The VRAM upgrade will be necessary soon as well.
> 
> Edit: Never look at going Crossfire right off the bat. It's better to get the single most powerful card you can within your present budget and look at Crossfire if you have to later on down the road.


If you're just into GTA V and the likes (AAA games), it's no problem to go with a CrossFire RX 480 system; general support for it will also be better in the future, and DX12 multi-adapter will make things easier.

Edit: if your mainboard isn't suited for multiple graphics cards, it's better to go with a single strong GPU. A GTX 1070 maybe, or a good 980 Ti.


----------



## arbiter (Jun 6, 2016)

5DVX0130 said:


> Well, $400 won't buy you a 1070, that much is certain. Also expect it to be pretty much sold out and overpriced into oblivion for (at least) the first two months.


That is why you don't buy from 3rd-party price gougers. The same thing will likely happen with AMD's card. You won't get one for $200 to start with.



Kanan said:


> If you're just into GTA V and the likes (AAA games), it's no problem to go with a CrossFire RX 480 system; general support for it will also be better in the future, and DX12 multi-adapter will make things easier.


If the game devs do it right, which we probably won't see for at least a year or more, since CF/SLI is such a small part of the market.


----------



## RealNeil (Jun 6, 2016)

arbiter said:


> AMD's PR gives you very little to believe anymore when they speak



That's true, and it's their own fault.
With that mainboard, a single card solution is the best option.


----------



## OneMoar (Jun 6, 2016)

Wait for the benchmarks. I wouldn't trust AMD as far as I could throw them.


----------



## R-T-B (Jun 6, 2016)

Depends on what you want to spend.

AMD may actually be competitive on the new process in the $200ish space.  I doubt it has much to say for itself beyond that for a few months yet.


----------



## thesmokingman (Jun 6, 2016)

newtekie1 said:


> That is correct; I'm not saying that isn't an issue on both sides. However, the problem is worse with AMD now that the cards do all communication through the PCI-e slots.




Proof? 

Where is this negativity toward AMD's use of crossfire through the PCIe bus coming from? Can you show where it is worse?


----------



## cdawall (Jun 6, 2016)

thesmokingman said:


> Proof?
> 
> Where is this negativity toward AMD's use of crossfire through the PCIe bus coming from? Can you show where it is worse?



When you have a PCI-e 2.0 4X slot it will be worse.


----------



## thesmokingman (Jun 6, 2016)

cdawall said:


> When you have a PCI-e 2.0 4X slot it will be worse.




Worse than what? I find it hard to believe that one could tell the difference with it all running like ass. This belabored point is just to try to make something look worse than it already is imo.


----------



## cdawall (Jun 6, 2016)

thesmokingman said:


> Worse than what? I find it hard to believe that one could tell the difference with it all running like ass. This belabored point is just to try to make something look worse than it already is imo.



Than cards with bridges? It looks worse than it already is because it is worse. That's how this works now: all data is transferred over PCIe; there is nothing but PCIe.


----------



## cdawall (Jun 6, 2016)

R-T-B said:


> Crossfire had bridges not that long ago.  I don't think they removed them lightly.  They probably weighed it and judged the PCIe bandwidth to be adequate in nearly all cases.



In cases with at least a PCIe 3.0 x8 bus it's a non-issue. The assumption was likely made that people running two 290s (at least) would be using a board with specs that matched that price point.


----------



## thesmokingman (Jun 6, 2016)

cdawall said:


> Than cards with bridges? It looks worse than it already is because it is worse. That's how this works now: all data is transferred over PCIe; there is nothing but PCIe.




One would never run with mismatched slots at those extremes so the point is moot.


----------



## R-T-B (Jun 6, 2016)

thesmokingman said:


> One would never run with mismatched slots at those extremes so the point is moot.



Uh, OP has a mobo with one I think?  That's where this all spawned from?


----------



## thesmokingman (Jun 6, 2016)

R-T-B said:


> Uh, OP has a mobo with one I think?  That's where this all spawned from?




Yea, he has a wacky board alright. It's terribad to run things that mismatched, let alone mismatched across the chipset, which is what I'm assuming his board is set up as (x4 off the SB).


----------



## OneMoar (Jun 6, 2016)

Actually, it doesn't matter, so yea...
PCIe link speeds have very little impact on performance.
This has been tested numerous times; in fact, the PCIe scaling test gets re-run with every generation of cards:
https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/


----------



## Vayra86 (Jun 6, 2016)

theoneandonlymrk said:


> Just get a 1070. Anything else won't be green enough for a fanboy, would it? Plus, the 1070 should be a nice jump in performance.



/thread.

Though going from a 760 to a 1070 is kind of a break from the norm. That's a full, and expensive, tier up. Albeit a worthwhile tier to be in.

I would definitely take the 1070 over 2x RX 480. Whoever cooked that up at AMD needs to be shot immediately. Only idiots will buy into 2x mid-tier crossfire right away.


----------



## arbiter (Jun 6, 2016)

OneMoar said:


> Actually, it doesn't matter, so yea...
> PCIe link speeds have very little impact on performance.
> This has been tested numerous times; in fact, the PCIe scaling test gets re-run with every generation of cards:
> https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/


There was very little to almost no impact on a GTX 980; however, when they tested the Fury X in the same test there was an impact, in some games more than others. The AMD test is more relevant for AMD cards since they use the PCI-e bus to talk to the other card.
https://www.techpowerup.com/reviews/AMD/R9_Fury_X_PCI-Express_Scaling/1.html


----------



## R-T-B (Jun 6, 2016)

OneMoar said:


> Actually, it doesn't matter, so yea...
> PCIe link speeds have very little impact on performance.
> This has been tested numerous times; in fact, the PCIe scaling test gets re-run with every generation of cards:
> https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/



That very article does show that PCIe 2.0 x4 has a performance impact.  It's likely even worse if it's chipset-bound.

EDIT:  And it's far worse in the Fury X article, yeah...


----------



## medi01 (Jun 6, 2016)

Between crossfire 480s and a single 1070, provided the price is similar, you should go for the latter (this coming from a user perceived as pro-AMD).
DX12 might or might NOT improve things with regard to multi-GPU; it's too early to know for sure.

You could also go for a single 480, which should keep you at decent 1080p and most 1440p for at least a year or so; then, later on, switch to the next-gen chip.

Or if you want to go for nVidia, wait for the 1060. It is unlikely to be better than the 480 (as the 960 is inferior to the 380) but it will be close to it anyhow.



Zenith said:


> Bought a 980 Ti a month ago; if I were buying at this moment I'd buy a GTX 1070 for sure. Less heat, newer tech, *better OC*, less power consumption, 980 Ti performance, cheaper...


The part marked in red is hands-down false.

I wonder how much you paid for that 980 Ti a month ago.

If you OC it to 1450+ MHz, you are already at 1080 levels; going from a 980 Ti to a 1070 looks weird.




R-T-B said:


> AMD may actually be competitive on the new process in the $200ish space.


Even this gen AMD is competitive in most areas (380 > 960, 390 > 970, 390x/Fury Nano > 980), except OC-ed 980Ti levels, where it has nothing to show.




Vayra86 said:


> Whoever cooked that up at AMD needs to be shot immediately.


They just needed a placeholder for the compare-to-the-competitor part of the show, imo; they are unlikely to seriously push it.
I agree with the rest of your post.


----------



## R-T-B (Jun 6, 2016)

medi01 said:


> Even this gen AMD is competitive in most areas (380 > 960, 390 > 970, 390x/Fury Nano > 980), except OC-ed 980Ti levels, where it has nothing to show.



I know.  I'm speaking specifically of the new process node.


----------



## newconroer (Jun 6, 2016)

64K said:


> I would say don't be in a hurry. Let's see the benches from both camps and then decide. If you want to stick with Nvidia no matter what, then the 1070 would be a good sub-$400 choice and will no doubt smoke that 760 at 1080p that you have right now. The VRAM upgrade will be necessary soon as well.
> 
> Edit: Never look at going Crossfire right off the bat. It's better to get the single most powerful card you can within your present budget and look at Crossfire if you have to later on down the road.



Yes, wait.

The 1080/1070 are a bit overhyped for what you pay, and the new Red camp products might end up being a better investment for general/mainstream personal use.


----------



## TheoneandonlyMrK (Jun 6, 2016)

Your comment stands for AMD vs. Nvidia at all times imho, but it never seems to matter to nvidia fanboys, so I'm sticking with getting a 1070 for the OP. Plus, if you go AMD you might ruin the blue and green colour coding in your case.


----------



## thesmokingman (Jun 6, 2016)

arbiter said:


> There was very little to almost no impact on a GTX 980; however, when they tested the Fury X in the same test there was an impact, in some games more than others. The AMD test is more relevant for AMD cards since they use the PCI-e bus to talk to the other card.
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X_PCI-Express_Scaling/1.html





R-T-B said:


> That very article does show that PCIe 2.0 x4 has a performance impact.  It's likely even worse if it's chipset-bound.
> 
> EDIT:  And it's far worse in the Fury X article, yeah...




Far worse? The math doesn't seem to support that.

980


> Real performance losses only become apparent in x8 1.1 and x4 2.0, where the performance drop becomes noticeable with around 15%. We also tested x4 1.1, though of more academic interest, and saw performance drop by up to 25%, an indicator that PCIe bandwidth can't be constrained indefinitely without a serious loss in performance.



Fury


> Real performance losses only become apparent in x8 1.1 and x4 2.0, where the performance drop becomes noticeable with around 6-10%. We also tested x4 1.1, though of more academic interest, and saw performance drop by up to 20%, an indicator that PCIe bandwidth can't be constrained indefinitely without a serious loss in performance.
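Putting those two quoted conclusions side by side as a quick sanity check (a minimal sketch; the Fury X figure below uses the upper end of its quoted 6-10% range):

```python
# Quoted single-card performance drops at x8 1.1 / x4 2.0 from the two
# TPU scaling conclusions above (percent of full-speed performance lost).
drop_pct_at_x4_gen2 = {"GTX 980": 15, "Fury X": 10}

for card, pct in drop_pct_at_x4_gen2.items():
    print(f"{card} at x4 2.0: retains about {100 - pct}% of full performance")
```

By these single-card numbers, the Fury X actually loses slightly less than the 980 in the constrained slot, which is the point being made here.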


----------



## Frag_Maniac (Jun 6, 2016)

I seriously would NOT buy into 480 Crossfire, especially just because you made an oversight in purchasing your MB.

It should be an obvious red flag to most that AMD uses just ONE title, Ashes of the Singularity, to demonstrate 480 Crossfire performance.

The last decent card AMD made was the 7970. Ever since then, Nvidia has dominated on performance, reliability, and even price.

Just wait until the aftermarket 1070s release. I'm betting you will be able to get one for around $400.


----------



## R-T-B (Jun 6, 2016)

thesmokingman said:


> Far worse? The math doesn't seem to support that.
> 
> 980
> 
> ...



My memory apparently was wrong.  Have a thanks.


----------



## OneMoar (Jun 7, 2016)




----------



## moproblems99 (Jun 7, 2016)

After my experience with dual cards, I will never do it again.  I will buy one really good card and use whatever money is left to buy whiskey.


----------



## wiak (Jun 7, 2016)

Wait for AMD's new lineup and benchmarks; your CPU might be your biggest bottleneck if you were going for a 1070.
How about an RX 480 plus more memory, or a faster/bigger SSD?

There are only 3 cards you should be looking at this summer:

RX 480 ($199)
GTX 1070 ($449)
GTX 1080 ($799)
We are all suspecting AMD to have yet another Polaris 10 card up their sleeve that is faster than the RX 480.
Simply wait a month or two; AMD is expected to release the RX 480 on the 29th of June, and there is also the PC Gaming Show on the 13th,
where AMD might reveal some more info on new cards.

AMD has been pretty tight-lipped this time on their cards, no big leaks at all.
Upgrading from a GTX 760 to an RX 480 or 480X will be a hell of an upgrade.


----------



## arbiter (Jun 7, 2016)

wiak said:


> GTX 1070 ($449)
> 
> GTX 1080 ($799)


Um, the 1080 is not $799; stop quoting 3rd-party price gougers. When AIB card makers get their cards out, you'll be able to get a 1080 for pretty close to $600. The 1070 will likely be around $400. The FE of the 1080 is $699, not $799.


----------



## Azumay (Jun 7, 2016)

moproblems99 said:


> After my experience with dual cards, I will never do it again. I will buy one really good card and use whatever money is left to buy whiskey.



absolutely the best quote in this thread


----------



## ViperXTR (Jun 7, 2016)

Similar query as the OP, and also a $400 budget: GTX 660 OC going to either a GTX 1070 or an RX 480 (or an RX 480X/490 if there is one). Either way, both will be a huge jump for me. Some say a 1070 will be overkill for my current 1080p HDTV and 900p monitor, but I would like to max out all details at those resolutions, and I don't intend to upgrade my GPU for years to come (I got my GTX 660 a few months after release way back), though I might go to a next-gen i5 or whatever the Zen CPUs may bring next year.

Same thoughts about multi GPU, id prefer a fast single card over SLI/CF, my current board doesn't support either anyway.

Also, since I'm using a testbed-like layout on my current setup (Aerocool Dead Silence), I'm wondering if a blower-type cooler might be better than the custom ones.


----------



## Jub (Jun 7, 2016)

newtekie1 said:


> No, you are going to want to avoid crossfire at all costs.  Reason?  Your motherboard and AMD's decision to use the PCI-E bus for crossfire data sent between the two cards.  Sure, it technically supports crossfire, but the slots will dramatically cripple it.


PCI-E cripples crossfire? Why would you say this when it is totally untrue?
I ran benches only last week as I was flashing my 290X crossfire cards with different custom BIOSes I had altered. I ran the Heaven bench maxed out with 8xAA and extreme tessellation etc.
A single card was 57 fps. Put the other card in and ran it again: 112 fps. Yeah, that sure is crippled, right?
As for the original question: I would go single card if possible, but I also reckon waiting a few more months to see what AMD Vega brings would be a wise move. DX12 and Vulkan will more likely favour AMD hardware due to those APIs being Mantle-based.


----------



## thesmokingman (Jun 7, 2016)

Azumay said:


> absolutely the best quote in this thread




It's obvious that one should always buy the single fastest card one can afford. The problem is, what do you do when the single fastest card isn't fast enough? The question of single vs. dual is simple at first, but not when you get into large-resolution setups. And now we have 4K 144Hz panels coming out soon; lol, the levels just got higher.
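To put rough numbers on "the levels just got higher", here is a pixel-throughput comparison (the display targets are illustrative, and this ignores per-pixel rendering cost; it is only a relative sketch):

```python
# Pixels per second a GPU must deliver at each display target,
# relative to a 1080p 60 Hz baseline.
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

BASE = pixels_per_second(1920, 1080, 60)  # 1080p @ 60 Hz baseline

targets = {
    "1440p @ 144 Hz": pixels_per_second(2560, 1440, 144),
    "4K @ 144 Hz": pixels_per_second(3840, 2160, 144),
}
for name, pps in targets.items():
    print(f"{name}: {pps / BASE:.1f}x the pixel rate of 1080p60")
```

The 4K 144 Hz case works out to 9.6x the baseline pixel rate, which is exactly why the single fastest card may still not be fast enough.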


----------



## R-T-B (Jun 7, 2016)

Jub said:


> PCI-E cripples crossfire? Why would you say this when it is totally untrue?



Yes, a PCIe 2.0 x4 slot provided by the chipset as OP has will cripple crossfire.


----------



## thesmokingman (Jun 7, 2016)

R-T-B said:


> Yes, a PCIe 2.0 x4 slot provided by the chipset as OP has will cripple crossfire.




I think that's obvious even without this thread, but some have been using that as a reason to take a dig at AMD for using the PCIe bus for crossfire. I'm pretty sure the poster you are quoting is talking about that, and not x4 slots off the chipset.


----------



## medi01 (Jun 7, 2016)

Price/perf of the cards, based on current leaks/lowest prices (from the ng thread); note how the 1070 is roughly on par with the 390/970:









Frag Maniac said:


> It should be an obvious red flag to most that AMD uses just ONE title, Ashes of the Singularity, to demonstrate 480 Crossfire performance.


Nah, they leaked Doom performance, for instance, and in the show they showed how the 480 runs other games.

They used only one title to take on the 1080.



RealNeil said:


> it's their own fault.








Yeah, why would you believe them, when there is another camp, with wooden screws; those are to be trusted.


----------



## arbiter (Jun 7, 2016)

medi01 said:


> Price perf of the cards, based on current leaks/lowest prices (from ng thread), note how 1070 is roughly on par with 390/970:





medi01 said:


> Nah, they leaked Doom performance, for instance, in the show they've shown how 480 runs other games.


Um, without any independent testing this graph is pretty much worthless. They "leaked" Doom performance, but that doesn't say much since we don't know much about settings and such. And AOTS performance, as the other person said, you can just toss in the trash.


----------



## Frick (Jun 7, 2016)

I haven't read the whole thread, but at what resolution are you playing? 1080p, or do you plan on playing on three monitors? If just 1080p, I would honestly just wait for the 480(s) and see how they do.


----------



## medi01 (Jun 7, 2016)

arbiter said:


> Um, without any independent testing this graph is pretty much worthless.


No, definitely not worthless (especially the Pascal vs. previous-gen part), merely unconfirmed for the 480.
Oh, and in this graph the 480 is assumed to be C4-ish, so between the 970 and 980, not between the 980 and Fury.


----------



## Agility (Jun 7, 2016)

This thread interests me. As an R290 Crossfire user, I am beginning to regret it, with shitty drivers coming from AMD's side and stupid game makers siding with one camp (e.g. Tomb Raider, Witcher 3).

Since it's not mentioned: does one GTX 1080 wreck an R290 crossfire? Anyone know?


----------



## GreiverBlade (Jun 7, 2016)

shovenose said:


> AMD has sucked for a really long time



mmhhhh? .... 7870 GHz? 7970? R9 290/290X? R9 390/390X? mmmm? (the Fury was a mild disappointment but still a fine card ...)

i mean ... my actual 980 did not offer me anything more than my previous 290 did ... although I did get the 290 for $150, and the 980 I have is priced at $620 where I am ... and I don't even consider a 1070 a tempting upgrade (nor even the 1080)


lucky that i am from neither side.



arbiter said:


> Um, the 1080 is not $799; stop quoting 3rd-party price gougers. When AIB card makers get their cards out, you'll be able to get a 1080 for pretty close to $600. The 1070 will likely be around $400. The FE of the 1080 is $699, not $799.



ahah ... a non-Founders 1080 is 895 CHF where I am, and Founders are 759 CHF (925.22$ and 784.63$ respectively). The price intended by Nvidia (aka the new reference of $699, not $599, since the reference Founders cards are $699) will never be seen by the end consumer ... but rather the prices I stated previously.



Agility said:


> _Since it's not mentioned, does 1 GTX1080 wreck a R290 x-fire? Anyone know?_


Probably... but it would be better if some other CFX user answered (strangely, they'll each have a different experience and opinion, I think). Also: is it needed? Even a solo 290X is still plenty for now and probably for a while yet (so is a 980, etc.).

What I mean is: depending on the price you paid, if you aren't an "OMG I want the latest cardz, older ones are sh*t no matter the price" type and your current setup gives you satisfactory results, there is nothing to worry about (if I hadn't gotten my 980, I would still be using my 290, probably until Vega's release).


----------



## Agility (Jun 7, 2016)

Yes. Crossfire is giving me a headache, and I am going back to a single card (AMD Crossfire makes me not want to do it again). I'm not sure about SLI, but do people have the same issues there? (Flickering, frame stuttering, etc.)


----------



## medi01 (Jun 7, 2016)

Agility said:


> does 1 GTX1080 wreck a R290 x-fire?


If CF works, they'd be roughly the same; if not, then of course.
Think of the 1080 as a well-OCed 980 Ti.


----------



## GreiverBlade (Jun 7, 2016)

Agility said:


> Yes. Crossfire is giving me a headache, and I am going back to a single card (AMD Crossfire makes me not want to do it again). I'm not sure about SLI, but do people have the same issues there? (Flickering, frame stuttering, etc.)


I had SLI (never CFX, despite my both-brands history), and a single card was better in every case... SLI and CFX are not, IMHO, reliable tech, and that's mainly the fault of the game developers, not "shitty AMD drivers"... NVIDIA is far worse on the driver side, and I speak from personal experience (and I only mean the driver side, not multi-GPU support... SLI tends to be a little more viable than CFX, but that was quite some time ago).


----------



## RealNeil (Jun 7, 2016)

thesmokingman said:


> I'm pretty sure the poster you are quoting is talking about that and not x4 slots off the chipset.



Isn't the OP's board using an X4 slot for the crossfire setup? That's all it has, right?



GreiverBlade said:


> but it would be better if some other CFX user would answer



I have R9-290X Sapphire Tri-X cards in crossfire. My best 980Ti (Gigabyte G1 Gaming Edition) beats them in a lot of benchmarks, but they all play games about the same. (fast and smooth)
Crossfire 290X cards are a decent solution while waiting for new products to arrive.

This is a Heaven result for two Sapphire TriX R9-290X in Crossfire. (both are at 8X speed on the PCI-E bus)


----------



## thesmokingman (Jun 7, 2016)

RealNeil said:


> Isn't the OP's board using an X4 slot for the crossfire setup? That's all it has, right?
> 
> 
> 
> ...




Yea, which is why crossfire/sli is a terrible idea for him and not recommended on that point alone. Speaking of which, it's probably time for the OP to move up to a better foundation.


----------



## HD64G (Jun 7, 2016)

arbiter said:


> The only rumors are the ones put out by AMD, so take them with a grain of salt. Likely independent reviewers will see two 480s at around the same performance as one 1070. If you can get a 1070 for $400, then you can have two 4GB 480s at that price; getting the same 8GB could be $100 more for the 8GB premium. The other thing is 300W vs. 150W draw.



1) The RX 480 won't go on sale at $250 but at $230, according to all the info (not rumours).

2) TDP isn't power consumption. So the RX 480 is more likely to consume 110-130W, judging by its die size and 14nm process along with the 1266MHz clock. If OCed by much it will consume more, but it will also go up against more costly and stronger GPUs at a fraction of their price.

So, let's wait for a review to judge the RX 480, eh? And we should hope for it to be the best GPU ever in its class, as that would lower the prices of more powerful cards.


----------



## Agility (Jun 7, 2016)

Hmm, thanks for the feedback. It literally tears my heart apart to see The Witcher 3's item icons flickering non-stop, which annoys the crap out of me. And it being an "NVIDIA" game, with their stupid GameWorks and all, I still can't understand why developers side with one company and create a stink-hole for the other fanboy camp... I mean... seriously? What era are we living in?


----------



## Assimilator (Jun 7, 2016)

I would wait until the RX 480(X) is actually released and benchmarked before dropping any monies on a new graphics card. I was very impressed when I went from a 760 to a 970, and the 480 is supposed to exceed the 970's performance, so it could be a very attractive option at $200.

As for the GTX 1070, you know exactly what you're getting: 980 Ti-beating performance for much less cash and much less heat. Given that, I foresee the 1070 being in very short supply for quite some time after its launch, which will push the price up enough that $400 may not cover it.



shovenose said:


> I really don't want to change motherboards since I'd have to reinstall my OS and I really don't want to do that



You don't. My current Windows 7 install started out life on an Asrock Z77 board, then I went to a Gigabyte board, then to a different Asrock board, and finally to the current Gigabyte board I have. As long as you keep the same chipset, and your vital BIOS options like RAID config are the same on both old and new board, and you plug the same SATA cables into the same ports on the new board, everything should just work. There will probably be some driver installs necessary, and Windows will require you to reactivate, but that's literally it.

(For the record, my current Windows 7 install has remained the same over: 4 different motherboards; 2 different CPUs; 3 different boot hard disks/SSDs (disk cloning FTW); 5 different graphics cards; and remains rock solid. The only instability/blue screens I've had were when I clocked the CPU or graphics card too far.)


----------



## newtekie1 (Jun 7, 2016)

thesmokingman said:


> I think that's obvious even without this thread, but some have been using that as a reason to take a dig at AMD for using the pcie bus to perform crossfire. I'm pretty sure the poster you are quoting is talking about that and not x4 slots off the chipset.



That is why I specifically said, "Sure, it [the motherboard] technically supports Crossfire, *but the slots* will dramatically cripple Crossfire."

I made it very clear in my first post that the issue is the motherboard and its poor slot arrangement, and that Crossfire communication over the PCI-E bus just makes that issue worse; the root problem is the motherboard and its x4 2.0 slot.


----------



## RealNeil (Jun 7, 2016)

Does anyone know how much bandwidth crossfire communication uses? (just wondering)


----------



## thesmokingman (Jun 7, 2016)

RealNeil said:


> Does anyone know how much bandwidth crossfire communication uses? (just wondering)



That's hard to say, but from testing there is no loss. IMO, there are probably more pressing things to worry about than lamenting AMD's XDMA engine.



> True to their promises, AMD has delivered a PCie based Crossfire implementation that incurs no performance penalty versus CFBI, and on the whole fully and sufficiently resolves AMD’s outstanding frame pacing issues.



http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/4
https://community.amd.com/community/gaming/blog/2015/05/11/modernizing-multi-gpu-gaming-with-xdma


----------



## newtekie1 (Jun 7, 2016)

RealNeil said:


> Does anyone know how much bandwidth crossfire communication uses? (just wondering)



According to what I've read, the Crossfire bridge provided 0.9GB/s of bandwidth. However, AMD stopped using the Crossfire connectors because they didn't provide enough bandwidth for Crossfire at 4K resolution. So we can assume 4K Crossfire communication needs more than 0.9GB/s.



thesmokingman said:


> That's hard to tell but from testing there is no loss. Imo, there are probably other pressing things to worry about then lamenting AMD's XDMA engines.
> 
> 
> 
> ...




No loss when both slots are x16 3.0, where there is plenty of extra bandwidth. A PCI-E 3.0 x16 slot gives almost 16GB/s of bandwidth; a PCI-E 2.0 x4 slot gives only 2GB/s, with nearly half of that going toward Crossfire communication... But you just keep believing that won't make a bad situation worse.
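[Editor's note] Both sets of per-lane figures being argued over here can be reproduced from the published PCIe signaling rates and line-encoding overheads. A quick sketch (the function name is just for illustration, not from any library):

```python
# Rough per-direction PCIe bandwidth from signaling rate and line encoding.
# PCIe 1.1/2.0 use 8b/10b encoding (20% overhead); PCIe 3.0 uses 128b/130b.

def pcie_bandwidth_gbps(gen: float, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe generation and lane count."""
    rates = {1.1: (2.5, 8 / 10), 2.0: (5.0, 8 / 10), 3.0: (8.0, 128 / 130)}
    gt_per_s, efficiency = rates[gen]
    gbits_per_s = gt_per_s * efficiency * lanes  # Gb/s after encoding overhead
    return gbits_per_s / 8                       # convert to GB/s

print(pcie_bandwidth_gbps(2.0, 4))   # x4 2.0  -> 2.0 GB/s
print(pcie_bandwidth_gbps(3.0, 16))  # x16 3.0 -> ~15.75 GB/s, the "almost 16GB/s" figure
```

The 32GB/s (and 16GB/s for 2.0 x16) figures in some of the quoted sources are the same links counted in both directions at once; per direction they are ~15.75GB/s and 8GB/s, which is why different sources appear to disagree.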


----------



## thesmokingman (Jun 7, 2016)

newtekie1 said:


> According to what I've read, the Crossfire link provided 0.9GB/s of bandwidth.  However, AMD stopped using the Crossfire connectors because they didn't provide enough bandwidth for 4k resolution crossfire.  So 4k Crossfire communication we can assume is more than 0.9GB/s.
> 
> No loss when both slots are x16 3.0, where there is plenty of extra bandwidth.  A PCI-E x16 3.0 slot gives almost 16GB/s of bandwidth.  A PCI-E 2.0 x4 only gives 2GB/s.  *Half of that going towards Crossfire communication*... But you just keep believing that won't make a bad situation worse.




Back to making crap up again? x16 3.0 = 32GB/s, not 16GB/s. Where is your proof that it takes up half the bandwidth? BTW, AMD's move to XDMA was in anticipation of 4K, not because they ran out of bandwidth with bridges; they chose to move to XDMA instead of increasing bridge bandwidth, the route NVIDIA took.


----------



## RealNeil (Jun 7, 2016)

Maybe one day they'll have some sort of WiFi connection between them,..........................LOL!

I wonder if those High Bandwidth connectors are just a double connection using both of the finger connectors on their GPUs at the same time?
Can you just use two regular SLI connectors and get the same bandwidth?


----------



## moproblems99 (Jun 7, 2016)

thesmokingman said:


> Where is your proof that it takes up half the bandwidth?



I think all he was pointing out is that the OP's motherboard only has a 2GB/s PCIe 2.0 x4 slot. Seeing as the cross-communication takes up at least 0.9GB/s, that is approximately half the bandwidth for that slot in particular. Not every slot.


----------



## thesmokingman (Jun 7, 2016)

moproblems99 said:


> I think all he was pointing out is that the OP's motherboard only has a 2gb/s pcie 2.0 x 4 slot.  Seeing as how the cross communication takes up at least .9gb/s, that is approximately half the bandwidth for that slot in particular.  Not every slot.




The 0.9GB/s is the maximum of the bridge interface, NOT the amount of bandwidth actually used.

BTW, if you want, read the links below to see an NVIDIA tri-580 setup get neutered by an x4 link. Back story: HardOCP ran tri-SLI 580s against a 6990 + 6970 TriFire. They did not realize, however, that the third slot on their motherboard was an x4 link off the southbridge... lmao. Their forums were ablaze over this obvious oversight. They re-ran it on a P67 WS Revo with an NF200, which multiplexes to even PCIe slot widths, i.e. 4x x8 PCIe 2.0, versus their previous x16/x16/x4. The end result of the redux? The 580 tri-SLI was back up where it was supposed to be, beating the 6990 + 6970.

Original 
http://www.hardocp.com/article/2011/04/28/nvidia_geforce_3way_sli_radeon_trifire_review/2

Redux
http://www.hardocp.com/article/2011/05/03/nvidia_3way_sli_amd_trifire_redux/2


----------



## newtekie1 (Jun 7, 2016)

thesmokingman said:


> Back to making crap up again? X16 3.0 = 32gb/s not 16gb/s. Where is your proof that it takes up half the bandwidth? Btw, AMD moving to XDMA was in anticipation of 4K, not because they ran out of bandwidth with bridges. They chose to move to XDMA instead of increasing bridge bandwidth like the route Nvidia stuck.





thesmokingman said:


> The .9gb/s is the maximum of the bridge interface, NOT the amount of bandwidth used.



OMFG!  It is right in the link *YOU POSTED!*



			
The article you posted said:
			
		

> the purpose of the CFBI link is to transfer completed frames to the master GPU for display purposes...the CFBI has enough bandwidth to pass around complete 2560x1600 frames at over 60Hz...
> For a 3x1080p setup frames are now just shy of 20MB/each, and for a 4K setup frames are larger still at almost 24MB/each. With frames this large *CFBI doesn’t have enough bandwidth* to transfer them at high framerates – realistically you’d top out at 30Hz or so for 4K – requiring that AMD go over the PCIe bus for their existing cards.
> 
> *PCIe 3.0 x16 has 16GB/sec* available versus .9GB/sec for CFBI



Also, it seems even Wikipedia says 16GB/s(actually 15.75GB/s). So it seems you are the one making shit up, not me.  Get your facts straight.

And simple math tells you it takes up half the bandwidth. PCI-E 2.0 x4 = 2GB/s. Crossfire communication now takes up >0.9GB/s. 2 - (>0.9) = <1.1GB/s, or about half of 2GB/s. Was that too hard for you to follow?

The simple fact is that the Crossfire link did not have enough bandwidth. Yes, of course AMD switched in anticipation of 4K: the link didn't have enough bandwidth to support 4K. The bandwidth was the problem. Crossfire at 4K@60Hz requires ~1.4GB/s of bandwidth, which would almost completely saturate an x4 2.0 slot.

Now, the OP knows Crossfire with his motherboard is a bad idea, so we can move on.


----------



## thesmokingman (Jun 7, 2016)

newtekie1 said:


> OMFG!  It is right in the link *YOU POSTED!*
> 
> 
> *Also, it seems even Wikipedia says 16GB/s(actually 15.75GB/s). So it seems you are the one making shit up, not me.  Get your facts straight.
> ...




Actually the link I posted says this:



> XDMA is designed for optimal performance with systems running PCI Express 2.0 x16 (16GB/s), PCI Express 3.0 x8 (16GB/s), or PCI Express 3.0 x16 (32GB/s).



And other sources...



> *Base Clock Speed:* PCIe 3.0 = 8.0GHz, PCIe 2.0 = 5.0GHz, PCIe 1.1 = 2.5GHz
> *Data Rate:* PCIe 3.0 = 1000MB/s, PCIe 2.0 = 500MB/s, PCIe 1.1 = 250MB/s
> *Total Bandwidth:* (x16 link): PCIe 3.0 = 32GB/s, PCIe 2.0 = 16GB/s, PCIe 1.1 = 8GB/s
> *Data Transfer Rate:* PCIe 3.0 = 8.0GT/s, PCIe 2.0= 5.0GT/s, PCIe 1.1 = 2.5GT/s



http://www.trentonsystems.com/applications/pci-express-interface/



> Gen 3.0 x16 slot of yours can handle a total bandwidth of 32 GB/s (bi-directional) in PCIe Gen 3.0 mode



http://www.guru3d.com/articles-pages/pci-express-scaling-game-performance-analysis-review,2.html


And finally:



> As noted by Anandtech in their comprehensive analysis of XDMA, the bandwidth of an external bridge is just 900MB/s



Looks like they are stating that that is the bandwidth of the bridge, not how much of it is actually used.


----------



## newtekie1 (Jun 8, 2016)

thesmokingman said:


> Actually the link I posted says this:
> 
> And other sources...
> 
> ...




Except Crossfire communication is not bi-directional, so the speed is 16GB/s. We don't rate Gigabit Ethernet as 2Gb/s even though it can do 1Gb/s in each direction, and that is why most sources don't say PCI-E 3.0 x16 is 32GB/s. The maximum speed you will ever get on a single transfer is 16GB/s; that is how data rates/bandwidth are rated. The rated bandwidth of a connection is not doubled just because it is a full-duplex connection. So, sure, you can find some sources that say 32GB/s, but they are wrong.



thesmokingman said:


> Looks like they are stating that that is the bandwidth of the bridge not how much is used regardless.



No, in the article you posted they actually go into the bandwidth needed to transfer frames from the secondary GPU to the primary for display, and they specifically state that the bandwidth required had exceeded what the old Crossfire link could provide.

A single 4K frame takes up 24MB; a single 1080p frame takes up 6MB. So even 1080p@144Hz is right at the limit of the old Crossfire link, requiring 864MB/s (and that would still be almost half the bandwidth of a PCI-E 2.0 x4 link), while 4K@60Hz would require 1,440MB/s. Even 1440p@100Hz would require 1,050MB/s.
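[Editor's note] The frame-size arithmetic in this post is easy to sanity-check. A small sketch, assuming 24-bit color (3 bytes per pixel) and 1MB = 2^20 bytes, which is the assumption that reproduces the 6MB/24MB per-frame figures quoted from the article:

```python
# Bandwidth needed to ship completed frames to the master GPU in AFR Crossfire,
# assuming 3 bytes per pixel (24-bit color) and MB meaning 2**20 bytes.

def frame_mb(width: int, height: int) -> float:
    """Size of one rendered frame in MB."""
    return width * height * 3 / 2**20

def transfer_mb_per_s(width: int, height: int, hz: int) -> float:
    """MB/s needed to move every frame to the display GPU at a given refresh rate."""
    return frame_mb(width, height) * hz

print(frame_mb(1920, 1080))                # ~5.9 MB per 1080p frame
print(frame_mb(3840, 2160))                # ~23.7 MB per 4K frame
print(transfer_mb_per_s(1920, 1080, 144))  # ~854 MB/s for 1080p@144Hz
print(transfer_mb_per_s(3840, 2160, 60))   # ~1424 MB/s for 4K@60Hz
```

The 864MB/s and 1,440MB/s figures in the post come from rounding the frame sizes up to 6MB and 24MB first; exact pixel counts give slightly lower numbers, but the conclusion about a 2GB/s x4 2.0 slot is the same.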

I've wasted enough time educating you.  I'm moving on now.  Post more all you want, I won't respond to you.


----------



## moproblems99 (Jun 8, 2016)

thesmokingman said:


> Btw, if you want read the link below to see an Nvidia tri 580 system get neutered via an x4 link. Back story, Hardocp ran trisli 580 vs 6990+6970 trifire. They however did not realize that the 3rd slot on their MB was an X4 link off the southbridge... lmao. Their forums were ablaze from this obvious oversight. They re-ran it again on a P67 WS Revo with NF200 giving a multiplexed even pcie slot widths, ie. 4x X8 pcie 2.0 versus their previous X16/X16/X4. The end result of the redux? the 580 trisli was back up to where it was supposed to be, beating the 6990+6970.



I'm only going to add one thing: nobody was bashing AMD for using PCIe for GPU cross-communication. We were only saying that it is a problem in a PCIe 2.0 x4 slot because of that slot's limited bandwidth. In every other situation it works just fine.


----------



## scevism (Jun 8, 2016)

Nvidia

The End.


----------



## medi01 (Jun 8, 2016)

scevism said:


> Nvidia
> 
> The End.



Said 780Ti user.
/chuckle

http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/3/


----------



## arbiter (Jun 8, 2016)

medi01 said:


> Said 780Ti user.
> /chuckle
> 
> http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/3/


The 290X, from 2013, is still in AMD's 2016 lineup as the 390X: just renamed, rebadged, and claimed to be a new GPU.
Looking at the numbers, the 780 Ti's performance hasn't changed that much in most cases.


----------



## R-T-B (Jun 8, 2016)

medi01 said:


> Yeah, why would you believe them, when there is another camp, with wooden screws, those are to be trusted.



Not that it changes things much, but they were WOOD screws (a type of metal screw), not screws actually made of wood (wooden screws).


----------



## medi01 (Jun 9, 2016)

arbiter said:


> 290x from 2013 to a 2016, still in amd's line up as the 390x just renamed


Uh, what about no?
390 non-X is faster than 290x, for starters.
"just renamed" is a lie.



arbiter said:


> Lookin at numbers performance of 780ti hasn't changed that much in most cases


If anything, it gives you a glimpse of how cards from both manufacturers age.
The 290, besides taking over the lead by a solid margin, was also the cheaper card.


----------



## arbiter (Jun 9, 2016)

medi01 said:


> Uh, what about no?
> 390 non-X is faster than 290x, for starters.
> "just renamed" is a lie.


It was a rebadge. The 290(X) was rebadged as the 390(X). They changed the clocks, but that doesn't change the fact that it was a rebadge, just like you would point out that the 680 was rebadged as the 770 with a clock bump.


----------



## medi01 (Jun 9, 2016)

arbiter said:


> it was a rebadge. 290(x) was rebadged to be a 390(x)


Orly? Then why is the "rebadged" 290X faster than the 290X? Oh, and it consumes less power too?
And what was "rebadged" into the 390X?

A rebadge/"just a rename" is LITERALLY that: exactly the same card with a different name.

What you are referring to is actually a "refresh": same architecture, different cards.


----------



## R-T-B (Jun 9, 2016)

medi01 said:


> Orly? And why is rebadged 290x faster than 290x? Oh and consume less power too?
> And what was "rebadged" to 390x?
> 
> Rebadge/"just rename" is LITERALLY that, exactly the same card with a different name.
> ...



290x and 390x are confirmed to be identical silicon wise.  The memory difference is the best argument for them being different, if anything is.


----------



## medi01 (Jun 9, 2016)

R-T-B said:


> 290x and 390x are confirmed to be identical silicon wise.


Citation needed.


----------



## cdawall (Jun 9, 2016)

medi01 said:


> Citation needed.


Would you like me to go pull the coolers off my 290 and 390? Hell, my 290's PCB is even marked to accommodate 8GB... it's just a binned 290 chip. They clock a little higher and come clocked higher out of the box, with 8GB.


----------



## newtekie1 (Jun 9, 2016)

medi01 said:


> Citation needed.



http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/

3rd paragraph.


----------



## Grings (Jun 9, 2016)

cdawall said:


> Would you like me to go pull the cooler off of my 290 and 390? Hell my 290 pcb is even marked to accommodate 8gb...its just a binned 290 chip. They clock a little higher and come clocked higher put of the box with 8gb.



Yup, I have the PowerColor 390. I would assume that, as with your 290, the PCB is an LFR29FA (ver 1.0); it's pretty obvious what the "R29" bit of that model number stands for.


----------



## EarthDog (Jun 9, 2016)

medi01 said:


> Orly? And why is rebadged 290x faster than 290x? Oh and consume less power too?
> And what was "rebadged" to 390x?
> 
> Rebadge/"just rename" is LITERALLY that, exactly the same card with a different name.
> ...


It doesn't consume less power; it consumes more. It's faster because of the increased clock speeds (and the voltage to support them) and the additional RAM. There are reviews that clocked them back to matching speeds and saw exactly the same performance.



medi01 said:


> Citation needed.


Really? I am surprised that the biggest AMD supporter on this forum doesn't know the differences were in memory capacity and clock speeds only (and there were 8GB 290/290X cards out there... it just takes a BIOS flash to change the speeds, but it's still a 290X/390X. I mean, you don't consider overclocked versions of cards to be a "290X.5", do you????!!!).

http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/#.V1m7KdkrK70



> Here's the rub, these video cards, the R9 390 and R9 390X, are inherently re-brands of the Radeon R9 290 and 290X respectively. Re-branding video cards is not a new concept, NVIDIA has done this plenty of times, and so has AMD. In fact, the Radeon R9 280X is a re-brand of AMD Radeon HD 7970 GPUs.
> 
> 
> 
> *When the AMD Radeon R9 290X was released, it finally had architectural improvements over the 7970 that increased the GCN version a bit. However, that was back in 2013 that the AMD Radeon R9 290X codenamed (Hawaii) was released. So here we are two years later, and basically we are getting a re-brand of Hawaii with the R9 390 and 390X. This time, no architectural improvements.*


----------



## R-T-B (Jun 10, 2016)

medi01 said:


> Citation needed.



Look back in TPU news a bit.  Citation really not needed.


----------



## cdawall (Jun 10, 2016)

Next people are going to need proof that the 280/280x isn't just a rebadged 7950/7970


----------



## Vayra86 (Jun 10, 2016)

cdawall said:


> Next people are going to need proof that the 280/280x isn't just a rebadged 7950/7970



Now my whole world is burning. They re-released the same card with a different name? OH MY GOD

Sorry, it's friday.


----------



## EarthDog (Jun 10, 2016)

I guess the citations scared him off.


----------



## medi01 (Jun 10, 2016)

newtekie1 said:


> http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/
> 
> 3rd paragraph.



Oh, I see:

"The Radeon R9 390X *is based on the "Hawaii" silicon* (now referred to as "Grenada" without any silicon changes) and features the same core-configuration as the R9 290X. "

"290x and 390x are confirmed to be *identical silicon wise.* The memory difference is the best argument for them being different, if anything is." - is now confirmed.


Let me repeat it again (as some seem to have forgotten the original point): a rebadge is LITERALLY a renaming of an existing product. The only thing that changes in a rebadge is the name.

As opposed to a "refresh", which has different physical characteristics, be it core clock or memory clock, with possible minor tweaks to the chip itself.


----------



## EarthDog (Jun 10, 2016)

medi01 said:


> As opposed to "refresh", which is having different physical characteristics, be it core clock or mem clock, with possible minor tweaks to the chip itself.


Oh... so you DO consider a card with more memory and higher clocks to be a different card?! So by your logic, a 290X with 8GB and higher clocks is a "290X.5"? It's still a 290X, bub... you make no sense and are in denial. And I suppose so is everyone like this: http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977.html

Would you look at that... a 290X with 8GB, and overclocked!!! And they STILL call it a 290X!!!!

There are no tweaks to the arch.


----------



## CAPSLOCKSTUCK (Jun 10, 2016)

GPU-Z for the HD 7970 reads:


AMD Radeon R9 200 / HD 7900 Series


----------



## RealNeil (Jun 10, 2016)

This deserves a little clarification. I have been using a pair of Sapphire 4GB R9-290X Tri-X cards in Crossfire in one of my PCs for over a year.
Performance has been good, and they play all of ~my~ games without fault.

I was recently offered a deal on a pair of brand new Sapphire 8GB R9-390X Toxic cards to replace them.
I figured that they're brand new, they have double the memory, and "Toxic" instead of Tri-X. So I bought them.

Here are the results that I got running the Heaven benchmark with exactly the same settings, on the same system, with the same driver.

(Benchmark screenshots were attached here.)

I leave it open to your own interpretation.
Needless to say these new Toxic cards are probably going back.


----------



## EarthDog (Jun 11, 2016)

What did you clarify?

My interpretation is that the Toxic is clocked at 1120MHz core while the Tri-X is at 947MHz. I'm not surprised it's beating the slower-clocked card.

I'd keep the toxic since it's clocked higher and has more vram.


----------



## cdawall (Jun 11, 2016)

EarthDog said:


> My interpretation is that the toxic is clocked at 1120 mhz core while the tri x is 947. I'm not surprised it's beating the slower clocked card.
> 
> I'd keep the toxic since it's clocked higher and has more vram.



and I would just overclock the trix


----------



## Caring1 (Jun 11, 2016)

The "upgrade" just isn't worth the minimal points for so many dollars.


----------



## EarthDog (Jun 11, 2016)

Doesn't the Toxic have a better VRM on it?


----------



## RealNeil (Jun 11, 2016)

Since they're already here, I might keep them for the 8GB instead of 4GB onboard.
I'm disappointed that there isn't much improvement for the money.

Now I have four R9-290X OC cards to sell.
Maybe I'll get a couple of GTX-1070s


----------



## GreiverBlade (Jun 11, 2016)

RealNeil said:


> Now I have four R9-290X OC cards to sell.
> Maybe I'll get a couple of GTX-1070s


Naaaaahhhhh, not worth it...

Keep them, and wait for AMD's next release; a single 290X (the 8GB, but even the 4GB) is already enough for many things.

Here where I live, four 290Xs sold = ~600 CHF = *one* 1070, and maybe a Snickers if they're out of Vaseline...


----------



## HD64G (Jun 11, 2016)

RealNeil said:


> Since they're already here, I might keep them for the 8GB instead of 4GB onboard.
> I'm disappointed that there isn't much improvement for the money.
> 
> Now I have four R9-290X OC cards to sell.
> Maybe I'll get a couple of GTX-1070s



Sell now the 3 easiest to go and keep the money and one of them in order to get the best possible of 1080Ti and Vega when they get to normal prices 2-3 months after their releases.


----------



## shovenose (Jun 11, 2016)

Hey guys, a lot has happened since I posted. And I can see a lot has happened in this thread 

My 2TB Seagate drive started giving me trouble (well, my friend dropped my whole PC from about waist height at Lanfest Sacramento last year, so it's been a long time coming), and so I rebuilt my computer. I started over with a clean install of Windows 7 on a new Crucial 480GB SSD with a new 3TB HGST drive for data storage. I did like the Intel SSD caching with my little 60GB Kingston V300 that I put more than 5TB writes on in just a few months, but all the same, I'm happy with the new arrangement. I did move the pagefile to the HDD to increase the lifespan of the SSD. Anyway, this time around I'll be sticking with Windows 7 and not upgrading to Windows 10, so DirectX 12 is out of the question.

I also replaced the motherboard with a new MSI Z97S SLI Krait Edition. This motherboard supports both SLI and CrossFire so I have more options in the future. Full specs: https://www.msi.com/Motherboard/Z97S-SLI-Krait-Edition.html

I have an opportunity to get a second EVGA GTX760 2GB for $75 - should I do that? I kind of used my budget up buying the new SSD, HDD, and motherboard, so new GPU(s) are out of the question for at least 6 months.

I should clarify: while I have 3x 1080p monitors, I only play games on one of them, not across all three. And somebody in this thread mentioned color scheme; well, my case doesn't have a side window, so it doesn't matter.

And for the person who said I don't have to reinstall my OS when changing motherboards: not only did you not realize I was using Intel Smart Response Technology SSD caching, but Windows 10 is a whole different beast.


----------



## cdawall (Jun 11, 2016)

I wouldn't suggest that z97 krait board to my worst enemy.


----------



## shovenose (Jun 11, 2016)

cdawall said:


> I wouldn't suggest that z97 krait board to my worst enemy.


Why?


----------



## HD64G (Jun 11, 2016)

shovenose said:


> Hey guys, a lot has happened since I posted. And I can see a lot has happened in this thread
> 
> My 2TB Seagate drive started giving me trouble (well, my friend dropped my whole PC from about waist height at Lanfest Sacramento last year, so it's been a long time coming), and so I rebuilt my computer. I started over with a clean install of Windows 7 on a new Crucial 480GB SSD with a new 3TB HGST drive for data storage. I did like the Intel SSD caching with my little 60GB Kingston V300 that I put more than 5TB writes on in just a few months, but all the same, I'm happy with the new arrangement. I did move the pagefile to the HDD to increase the lifespan of the SSD. Anyway, this time around I'll be sticking with Windows 7 and not upgrading to Windows 10, so DirectX 12 is out of the question.
> 
> ...


If you have $75 for a used 760, you'd be better off selling yours and getting the RX 480 for just a bit more money, IMHO. That would be a GREAT upgrade in performance.


----------



## BiggieShady (Jun 11, 2016)

Agility said:


> The Witcher 3 item icons flickering


Completely offtopic but I have three words for you ... borderless windowed mode


----------



## Zubasa (Jun 11, 2016)

BiggieShady said:


> Completely offtopic but I have three words for you ... borderless windowed mode


That basically just shuts off Crossfire, which defeats the whole point.
In that case, you are better off just setting Crossfire to off in the profile, since Crimson allows a profile per game.


----------



## Agility (Jun 12, 2016)

BiggieShady said:


> Completely offtopic but I have three words for you ... borderless windowed mode


And run in single R290?? 



Zubasa said:


> That basically just shuts off Crossfire, which defeats the whole point.
> In that case, you are better off just setting Crossfire to off in the profile, since Crimson allows a profile per game.


I do get a performance improvement in Crossfire; it's just the stuttering when moving, plus the icons randomly flicker (HP bar, EXP bar, heck, even the item bar). It annoys me at times, which makes me want to jump to a single GPU.


----------



## cdawall (Jun 12, 2016)

Agility said:


> I do get performance improvement in crossfire, it's just the stuttering when moving + icons randomly flickers (HP bar, EXP bar, heck even the Item bar). It annoys me at times, which makes me wanna jump to single GPU.



I have run Crossfire for a long time and never seen that specific issue... it sounds like a game with a poor Crossfire profile, or with no Crossfire profile at all.


----------



## Caring1 (Jun 12, 2016)

cdawall said:


> I wouldn't suggest that z97 krait board to my worst enemy.





shovenose said:


> Why?


I'd like to know also, as I haven't heard anything bad about those boards, apart from looking cheap.


----------



## cdawall (Jun 12, 2016)

Caring1 said:


> I'd like to know also, as I haven't heard anything bad about those boards, apart from looking cheap.



I have had nothing but issues out of them from a service perspective. They look cheap because they are. Between it and the cheap Asus Z97-A/Z170-A, they are the most common dead boards I get in the shop. Anything from losing USB ports to dead PCIe slots.

I currently use one in a Hackintosh with a 4770. It has behaved OK, but if you don't restart it every few days the USB ports stop auto-detecting and the LAN drops constantly until it's rebooted.

Also, if you use the M.2 port you lose all but two SATA ports on the board. Which is stupid.


----------



## shovenose (Jun 12, 2016)

Caring1 said:


> I'd like to know also, as I haven't heard anything bad about those boards, apart from looking cheap.



Look cheap it does, but the reviews were alright, and it wasn't DOA... so better than an Asus in that regard. I've actually never had an MSI board failure, and for a time I managed a small datacenter consisting of an odd mix of consumer and enterprise hardware, most of the consumer-grade stuff built on MSI motherboards. No issues. It was the Asus boards that would die randomly or would just be DOA.

So yes, I'd love to hear why this board is a piece of shit.


----------



## newtekie1 (Jun 12, 2016)

shovenose said:


> have an opportunity to get a second EVGA GTX760 2GB for $75 - should I do that?



I wouldn't. Stick with the single 760 for now, when you can afford a single newer card, grab it. Put the $75 towards a new card.


----------



## crazy098 (Jun 12, 2016)

@shovenose I created this account just to reply to you. I used to be a fan of AMD video cards. That was until I tried to overclock my HD 7970. It's been a while, but from what I can remember (it's late and I'm not using Google), AMD claimed that the 7970 had no limits on overclocking. However, I can tell you after monitoring the card during testing that the clocks were locked at a certain point. I was trying to reach the same clocks as the R9 290X, as it's just a 7970 with slightly increased clocks. I had read that you could reflash the 7970's BIOS to that of the 290X, as long as you used the one matching your VRAM vendor, and figured I'd do that to unlock the card. My card was one of the original 7970s, and after the reflash it would boot into Windows and get recognized as a 290X, but my system would always crash whenever I went to play a game. So even that didn't work. My card has two BIOS chips, so no damage was done. However, due to AMD's claim regarding overclocking, which was the main reason I bought the card, and my experience in real life, I will never buy an AMD product or recommend them to anyone again.


----------



## HD64G (Jun 12, 2016)

crazy098 said:


> @shovenose I used to be a fan of AMD video cards. That was until I tried to overclock my HD 7970. ... I was trying to reach the same clocks as the R290X, as it's just a 7970 with slightly increased clocks. ... I will never buy an AMD product or recommend them to anyone again.



The 7970 is the same chip as the 280X, not the 290X. That's why the crash happened when you flashed the BIOS to the wrong one.


----------



## Aquinus (Jun 12, 2016)

crazy098 said:


> @shovenose I used to be a fan of AMD video cards. That was until I tried to overclock my HD 7970. ... I had read that you could reflash the BIOS on the 7970 to that of the 290X as long as you used the one that went along with your vram vendor and figured I'd do that to unlock the card. ... I will never buy an AMD product or recommend them to anyone again.


Sounds like someone got pissed off because they didn't know what they were doing. Thanks for registering to tell us how you made some pretty critical mistakes. You really have no one to blame but yourself; it's not AMD's fault that you didn't learn how to use the overclocking software before you decided to flash the *wrong BIOS* to your card.


----------



## crazy098 (Jun 12, 2016)

No, it was the right one. I can show you the eBay listing I made for the card. Like I said, it's late and I didn't feel like checking facts. Regardless, the point I was making is that the clocks are locked when AMD said they weren't; you can google that for yourself. You can see from the pic I took for the listing that the device ID is 1002-6798, which is the 7970/280X.


----------



## Aquinus (Jun 12, 2016)

crazy098 said:


> I didn't feel like checking facts.


Ah yes, those pesky facts. 

I seriously haven't heard of a single person here at TPU having an issue with locked clocks on modern AMD cards. My first ATi card was a 9200, and even that overclocked. Every ATi/AMD card I've had since then has been able to as well: 9200, X800 GT, 2600 XT, 4850, 6870, and the 390. Also, you did use the wrong BIOS if you flashed a 290X BIOS to your 280X-class card, because they're different cores, with different memory controllers and different CU counts. Now, I haven't had voltage control on every one of those cards, but that isn't the problem at hand; it's locked clocks. Everything about the two chips, beyond both being GCN, is different, so yeah, it was the wrong BIOS and it would kill the card. You can't even flash a 290X BIOS onto a 290, because the extra CUs are laser-cut. If you were having trouble overclocking, you were doing it wrong.

Hell, I had a Mobility Radeon HD 3650 in an old laptop and I could overclock even that, so I suspect you were using the wrong tools or didn't configure something properly, because I've found that Afterburner almost always provides limits well higher than what the card is even capable of achieving.

As I said before, you have no one to blame but yourself.


----------



## crazy098 (Jun 12, 2016)

Aquinus said:


> Ah yes, those pesky facts.
> 
> I seriously haven't heard of a single person here at TPU having an issue with locked clocks on modern AMD cards. ... If you were having trouble overclocking, you were doing it wrong.
> 
> ...



The device ID for a 290X is 1002-67B0 or -67B1. However, I was wrong in blaming AMD and the clocks: https://www.google.com/search?q=hd+7970+clocks+locked&ie=utf-8&oe=utf-8 It was voltage-locked, and only by certain manufacturers: http://forum.giga-byte.co.uk/index.php?topic=10222.0 http://forums.anandtech.com/showthread.php?t=2300069
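For anyone wanting to double-check a card's identity before flashing anything: on Linux, `lspci -nn` prints each device's PCI vendor:device ID in brackets (e.g. `[1002:6798]` for Tahiti). A minimal parsing sketch; the sample lines below are illustrative stand-ins for real `lspci` output, not captures from the card discussed above:

```python
import re

# Hypothetical sample of `lspci -nn` output (illustrative only).
SAMPLE = """\
01:00.0 VGA compatible controller [0300]: Advanced Micro Devices, Inc. [AMD/ATI] Tahiti XT [Radeon HD 7970] [1002:6798]
02:00.0 VGA compatible controller [0300]: Advanced Micro Devices, Inc. [AMD/ATI] Hawaii XT [Radeon R9 290X] [1002:67b0]
"""

def vga_device_ids(lspci_output):
    """Return the vendor:device IDs of VGA controllers in `lspci -nn` output."""
    ids = []
    for line in lspci_output.splitlines():
        if "VGA compatible controller" in line:
            # The final bracketed token on the line is the vendor:device pair.
            match = re.search(r"\[([0-9a-f]{4}:[0-9a-f]{4})\]\s*$", line)
            if match:
                ids.append(match.group(1))
    return ids

print(vga_device_ids(SAMPLE))  # ['1002:6798', '1002:67b0']
```

If the reported ID doesn't match the BIOS you're about to flash, stop; that mismatch is exactly the scenario described above.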


----------



## TheoneandonlyMrK (Jun 12, 2016)

Aquinus said:


> Ah yes, those pesky facts.
> 
> I seriously haven't heard of a single person here at TPU having an issue with locked clocks on modern AMD cards. ... If you were having trouble overclocking, you were doing it wrong.
> 
> ...


I had a 7970 Matrix edition; its clocks, like many cards', had a hard top limit (which, to be fair, you couldn't reach via OC, as you said, but it's there), and they all, without fail, had a top core voltage lock at 1.25 V that would not budge without a BIOS flash. MSI Afterburner claimed it upped this but actually didn't, and even then it took using the LN2 version on mine to get to 1.3 V on the core. In short, they were shit to overclock compared to the 5870 I swapped out, for example.

But this thread's been derailed to high heaven already, so for the third time: just get a GTX 1070, or better still save up and get a 1080. CrossFire or SLI are not for you at this time, OP.

And I'd recommend an RX 480, but to me, a green-team fan who has allowed themselves to become biased will always find fault in a card from a company they spend time hating on, so what's the point.


----------



## Tatty_One (Jun 12, 2016)

crazy098 said:


> The device ID for a 290X is 1002-67B0 or -67B1. However, I was wrong in blaming AMD and the clocks: https://www.google.com/search?q=hd+7970+clocks+locked&ie=utf-8&oe=utf-8 It was voltage-locked, and only by certain manufacturers: http://forum.giga-byte.co.uk/index.php?topic=10222.0 http://forums.anandtech.com/showthread.php?t=2300069


Yes, there were a few AIBs that locked voltage (XFX springs to mind). A whole host of owners joined here trying the VBE7 tool to amend their BIOS for more voltage, but alas, if the card comes with a locked voltage controller, no number of attempts is going to get you there. I may have been lucky: my old 280X ran at 1160 with a voltage hike, and my current 290X runs at the same.


----------



## cdawall (Jun 12, 2016)

Don't know what his issue was; my 7950s are flashed to 280s, have unlocked voltage, and are clocked to 1150. I've never had a single issue out of them in the three or four years I've had them. Sounds like a big box of user error and picking the wrong BIOS.


----------



## Caring1 (Jun 12, 2016)

crazy098 said:


> @shovenose ... I will never buy an AMD product or recommend them to anyone again.


You gave one pathetic reason, which sounds more like sour grapes.
I suppose if you get one bad nVidia card you'll never buy them again either.


----------



## Aquinus (Jun 12, 2016)

crazy098 said:


> The device ID for a 290X is 1002-67B0 or B1. However, I was wrong in blaming AMD and the clocks. https://www.google.com/search?q=hd+7970+clocks+locked&ie=utf-8&oe=utf-8 It was voltage locked and was only done by certain manufacturers: http://forum.giga-byte.co.uk/index.php?topic=10222.0 http://forums.anandtech.com/showthread.php?t=2300069


That sounds more correct. I have gotten pissed off at some GPUs for not having voltage control but, with that said, the best-overclocking GPU I ever had didn't have it (a GeForce 8600 GTS).

I do think that in this day and age control over voltages should be considered a necessity but, as you said, that certainly isn't AMD's fault.


cdawall said:


> Don't know what his issue was my 7950's are flashed to 280, have unlocked voltage and are clocked to 1150. Never had a single issue out of them in the 3 or 4 years I have had them. Sounds like a big box of user error and picking the wrong bios.


Voltage control depends on the vendor and if the controller is accessible via I2C in software.


----------



## cdawall (Jun 12, 2016)

Aquinus said:


> Voltage control depends on the vendor and if the controller is accessible via I2C in software.



100% correct, and I read reviews prior to purchasing my parts to make sure I got one that had control. Research: it's what separates the good flashes from the bad.


----------



## BiggieShady (Jun 12, 2016)

Zubasa said:


> That basically just shuts off Crossfire, which defeats the whole point.
> In that case, you are better off just setting Crossfire to off in the profile, since Crimson allows a profile per game.





Agility said:


> And run on a single R9 290??



I thought it was an AMD issue in general, not only in CrossFire... it figures, though (silly me); otherwise the backlash would have been enormous.


----------



## arbiter (Jun 12, 2016)

crazy098 said:


> @shovenose I used to be a fan of AMD video cards. That was until I tried to overclock my HD 7970. ... I was trying to reach the same clocks as the R290X, as it's just a 7970 with slightly increased clocks. ... I will never buy an AMD product or recommend them to anyone again.


Um, the 290X is a different GPU than the 7970, so it's not just a 7970 with a clock bump. Unless you mean the original 7970, which had a core clock of 925 MHz until Nvidia dropped the GTX 680, to which AMD responded by up-clocking the 7970 to 1 GHz. The only other card the 7970 became is the 280X, but if I remember right the 280X was clocked 50 MHz lower than the 7970.


----------



## deviantyo88 (Jun 12, 2016)

shovenose said:


> Nothing that AMD currently sells (R7,R9,Fury) interests me. So, should I buy two RX 480s or one GTX 1070? Your input is much appreciated!



Don't buy either. If you have money for a 1070, wait a while for the RX 490/490X; AMD might release some 1070/1080 competitors, maybe $100 cheaper. They really need market share and sales, so I wouldn't be surprised if they announced a 490/490X soon after the 480 release.
Also, older cards will drop in price; used 980s and 980 Tis are probably already being sold cheap.


----------



## Moofachuka (Jul 10, 2016)

I had dual 7970s and they lasted me four years. They aged very well compared to the GTX 680. Now I've got a 1070 and I'm also very happy. The 7970s ran very hot and limited my CPU OC (i7 980 @ 4.0 GHz) because they took up most of my 850 W of power. Now with my MSI 1070 Gaming X, my card OCs to 2100 on the core and my i7 980 can finally OC to 4.3 GHz. =)


----------



## Nergal (Jul 11, 2016)

tl;dr: can someone make the OP (and me) a list of all the pros and cons of the NVIDIA, AMD, and neutral camps?
I'm not certain the OP gets any value from being bombarded with these comments, tbh.


But to answer his question(s):

1) YES, you can go AMD if you want.

=> Quality-wise, the AIBs matter more, as do any specific problems that pop up (so always read reviews of the specific brand and model you want to buy, and search for problems relating to that specific card, e.g. an MSI RX 480 or a Gigabyte GTX 1080 G1).

2) If you want to spend about $450, your best bet is a GTX 1070.
3) If you want to spend less (and upgrade other stuff with the money), get an RX 480.
4) Spending more (GTX 1080) is ludicrous at the moment (Vega/Ti coming).


----------



## Frick (Jul 11, 2016)

Nergal said:


> tl;dr: can someone make the OP (and me) a list of all the pros and cons of the NVIDIA, AMD, and neutral camps?



Here's what you do:

1. Specify your budget
2. Specify your priorities (i.e. how important are noise and overclocking)
3. Get the fastest thing you can find within your budget and according to your priorities


----------



## Nergal (Jul 11, 2016)

Frick said:


> Here's what you do:
> 
> 1. Specify your budget
> 2. Specify your priorities (i.e. how important are noise and overclocking)
> 3. Get the fastest thing you can find within your budget and according to your priorities



Well, I was hoping someone would take the time to actually create a complete pro/con comparison of the firms, not the cards.
He isn't asking about a specific card, but about AMD vs NVIDIA as a whole and how they relate to each other. In effect, as an NV fanboy, he is trying to consider the opposition and wants general, well-informed information.


----------



## Vayra86 (Jul 12, 2016)

Nergal said:


> Well, I was hoping someone would take the time to actually create a complete pro/con comparison of the firms, not the cards.
> He isn't asking about a specific card, but about AMD vs NVIDIA as a whole and how they relate to each other. In effect, as an NV fanboy, he is trying to consider the opposition and wants general, well-informed information.



Not really; he is moving on from one GPU to the next and wants buyer's advice. That is all, and any NV/AMD comparison is just flamebait; we all know this. There is no point either: you judge a company by its products' performance, and everything else is entirely personal and pretty useless info. GPUs are for gaming; you want the GPU that serves that purpose best, based on your personal preferences such as budget, resolution, and the games you play. Simple, effective.


----------



## Tsukiyomi91 (Jul 12, 2016)

If the OP has a budget of around $400 for a GPU, the answer is rather obvious: a GTX 1070, with some balance left over. If it's less than that, the GTX 980 still has its charms for 1440p gaming.


----------



## Nergal (Jul 12, 2016)

Vayra86 said:


> Not really; he is moving on from one GPU to the next and wants buyer's advice. That is all, and any NV/AMD comparison is just flamebait; we all know this. There is no point either: you judge a company by its products' performance, and everything else is entirely personal and pretty useless info. GPUs are for gaming; you want the GPU that serves that purpose best, based on your personal preferences such as budget, resolution, and the games you play. Simple, effective.



I thought it would be a touchy subject and no one wants to get burned.
Bit of a shame, though.



Tsukiyomi91 said:


> If the OP has a budget of around $400 for a GPU, the answer is rather obvious: a GTX 1070, with some balance left over. If it's less than that, the GTX 980 still has its charms for 1440p gaming.



Perhaps a second-hand GTX 980 could be a good option.
Certainly not a new one; that would be ludicrous! (The pricing is way too high.)

Don't forget it only has 4 GB. There may come a time, not so far in the future, when at 1440p the 8 GB RX 480 performs better than a GTX 980.


----------



## Tsukiyomi91 (Jul 12, 2016)

The non-Ti 980 still performs well at 1440p; you can check the performance charts TPU publishes. For future-proofing, a single GTX 1070 does the job, lasting a good three years... with where our games are going, that is...


----------



## Tsukiyomi91 (Jul 12, 2016)

The reference-model GTX 1070 (non-Founders Edition) is selling for around MYR 2k or so over here in Malaysia, which is quite decent for its performance. The diff between FE & non-FE is the GPU chip, that's all.


----------



## Vayra86 (Jul 12, 2016)

Nergal said:


> I thought it would be a touchy subject and no-one wants to get burned.
> Bit of a shame thou.
> 
> 
> ...



Hey buddy, why don't you enlighten us with your pro/con list then? If you can't or won't, you are just baiting, and that is all. I suspect you can do better, so please be better than this.

On topic: it's quite clear that 4 GB is more than enough for this performance level, the GTX 980 included. You won't be running 4K on these cards, and 1440p has MORE than enough with 4 GB. Maybe in SLI/CrossFire, but even then these cards will be bandwidth-starved before they can push 8 GB. A 256-bit bus, and one compression layer instead of the second pass you get on Pascal; keep that in mind ^^
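On the bandwidth point: peak memory bandwidth is simply bus width times effective data rate, so these numbers are easy to sanity-check yourself. A quick sketch, plugging in GTX 980-class figures (256-bit bus, GDDR5 at a 7 GHz effective rate), which reproduces that card's quoted 224 GB/s:

```python
def memory_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8        # 256-bit bus -> 32 bytes
    transfers_per_sec = effective_clock_mhz * 1e6  # effective (data) rate
    return bytes_per_transfer * transfers_per_sec / 1e9

print(memory_bandwidth_gbs(256, 7000))  # 224.0 GB/s, GTX 980 class
```

Pascal's improved delta compression stretches that raw figure further, which is the "second pass" point above; the formula only gives the physical ceiling.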


----------



## Aquinus (Jul 12, 2016)

Tsukiyomi91 said:


> Diff between FE & non-FE is the GPU chip, that's all.


I think you mean the difference is the PCB and its components as the GPU itself between FE and non-FE is the same.


----------



## EarthDog (Jul 13, 2016)

Aquinus said:


> I think you mean the difference is the PCB and its components as the GPU itself between FE and non-FE is the same.


It was opposite day Aquinas... you need to log in more. Missed that memo.


----------



## Kissamies (Jul 13, 2016)

I moved from 670 OC to 290, and I've liked this pretty much, though the reference card's blower is terrible.


----------



## Nergal (Jul 13, 2016)

Vayra86 said:


> Hey buddy, why don't you enlighten us with your pro/con list then?  If you can't or won't you are just baiting and that is all, I suspect you can do better, so please be better than this.



I have a genuine interest in those views, but I can't make a pro/con list myself. As you can perhaps tell from my profile, I have been out of tech circulation for many years (think back to AGP). There is a reason my posts are limited to economic insights & predictions and recent developments only.

I had an inkling that the AMD vs NV topic was somewhat flammable, but it seems so touchy that I can't even ask for a short list of technical achievements, important moments, and differences between the companies. Which is ouch, and points to an almost bitter competition (among the fans).



Vayra86 said:


> On topic; it's quite clear that 4GB is more than enough for this performance level, and for the GTX 980 too. You won't be running 4K on these cards and 1440p has MORE than enough with 4GB. Maybe in SLI/Xfire, but even then, these cards will be bandwidth starved before they can push 8GB. 256-bit and one compression layer instead of the second pass that you get on Pascal, keep that in mind ^^



I have seen many say that 4 GB isn't enough for 1440p, and even claims that 4 GB will be pushed at 1080p in the near future. And just today I saw the new Doom benchmark at 1440p with the new update, where an RX 480 is almost as fast as a 980 Ti.

So the OP could look into a second-hand 980, a new RX 480, or a GTX 1060 (waiting on benchmarks and AIB delivery times for the last). Buying a new 980 at this point is no good on cost/performance (while yes, the card itself is still quite good, just not to buy new).


----------



## Aquinus (Jul 13, 2016)

EarthDog said:


> It was opposite day Aquinas... you need to log in more. Missed that memo.


I guess that's what happens when I work 50-60 hour weeks.


----------



## Tsukiyomi91 (Jul 13, 2016)

The whole AMD vs Nvidia thing is always a heated topic, regardless of what we ask, even for something as simple as "which is good?". I'd rather let the cards do the talking, as they will tell us how well they perform. It's easier that way, and conflicts between the two sides stay minimal. An old-stock GTX 980 is OK, but like @Nergal says... it's just not a good time to buy second-generation Maxwell with the emergence of Pascal cards.


----------



## Recon-UK (Jul 14, 2016)

thesmokingman said:


> Pro tip for you, any multi-gpu will run like ass with seriously mismatched slot widths, whether it be Green or Red, ignoring the fact that his mb is seriously under engineered for the task.


SLI won't work with an x4 slot.



moproblems99 said:


> After my experience with dual cards, I will never do it again.  I will buy one really good card and use whatever money is left to buy whiskey.


I ran 9800 GTX+ SLI; one card ended up dying as soon as I loaded up a game on the setup for the first time...
4870 CrossFire was TERRIBLE; 480 SLI was good, except I could see individual frames being processed (this is hard to explain and is not microstutter).

With 480 SLI it was amazing in some games, but in others you could see the cards processing together; basically, the way a normal refresh looks to me versus what 480 SLI was producing was noticeably different. I did not like it, anyway.

Single card FTW.


----------



## Moofachuka (Jul 14, 2016)

AMD = cheaper and weaker, but ages better; longer support; more heat, more power consumption, more noise in general; more future-proof because of the support.

Nvidia = more expensive; faster at the high end; doesn't age as well; cooler; less power consumption; support usually lasts up to maybe 3 years; higher resale value.

Both have fanboys, and both have drivers of equal quality, give and take.


----------



## Recon-UK (Jul 14, 2016)

Moofachuka said:


> *AMD = cheaper and weaker* - REALLY? But ages better; longer support; more heat, more power consumption, more noise in general; more future-proof because of the support.
> 
> Nvidia = more expensive; faster at the high end; doesn't age as well; cooler; less power consumption; support usually lasts up to maybe 3 years; higher resale value.
> 
> Both have fanboys, and both have drivers of equal quality, give and take.



What side of Uranus did you spawn from?


----------



## Moofachuka (Jul 14, 2016)

Recon-UK said:


> What side of Uranus did you spawn from?



Weaker as in comparing their fastest flagship cards


----------



## Recon-UK (Jul 14, 2016)

Moofachuka said:


> Weaker as in comparing their fastest flagship cards



I thought you were bashing the actual hardware, my bad.


----------



## cdawall (Jul 14, 2016)

Moofachuka said:


> Weaker as in comparing their fastest flagship cards



Typically their flagship cards have been faster since the 3870 X2.


----------



## Nergal (Jul 14, 2016)

Thanks for this short insight.
And see, no one got burned.

I am pondering the idea that NVIDIA is perhaps very happy with AMD as a competitor.
Just follow my hunch here.

NV seems consistently a step ahead on its tech.
An example is the new NV cards:
they released the 1070 & 1080, and BAM, they're the best out there.
AMD comes up with a very well-aimed strike (pricing) with their new tech, and NV immediately releases an answer that was only meant to come out a few months later.

Oops, miscalculation from NV.


But doesn't this just mean they are playing the part of the one reacting to AMD's actions?
It makes me wonder how much tech they develop and then choose not to research further because AMD isn't anywhere near it yet. It wouldn't be cost-effective to pump money into something far better than the competition requires (unneeded).

AMD would go broke, and NV wouldn't have its nice, easy competition anymore.

Without AMD, NV (and Intel) would still be forced to do R&D, but, having no clue what people would expect, might pump more money into it (or less), not knowing what the future would bring.

So yes, NV is happy with an easy-to-anticipate sole competitor that brings stability and predictability to the market, while also taking the burden of acting first out of their hands and (most of the time) keeping the no. 1 position.



I know this is an oversimplification, but could this view hold some truth?
Is AMD indeed a bit weaker overall, and is NV not developing things as well as it could?
Has there been "proof" or examples of this in the past?

*=> This is not an attempt at bashing AMD or vilifying NV, just a hunch about a market dynamic; please treat it as such.*


----------

