# Radeon 7 is Released, What Would You Buy?



## xkm1948 (Jan 10, 2019)

See the choices


----------



## hat (Jan 10, 2019)

I don't think we have enough information to make any decisions right now. "Radeon 7" looks promising, but all we have is marketing slides from CES... we won't know how it stacks up until the mighty W1zzard tells us. I'm also not sure if Radeon 7 is a one-off product, or if there's an entire product stack coming with it.

For me, though, I'm probably buying a great big heap of nothing. I'd love to get a new Radeon and shove my 1070s off to my "server", but unfortunately that machine only has one PCI-E slot, so that's a non-starter without also buying a motherboard... an AM3 motherboard, at that. And a power supply upgrade would also be mandatory. That's a whole lot of money I don't have, to solve a problem that doesn't really exist... so I voted "Nothing, I am good with current GPU".


----------



## cdawall (Jan 10, 2019)

With the way performance looks I'm buying some water blocks and keeping my current setup.


----------



## Apocalypsee (Jan 10, 2019)

Already using a Vega 56 and only at 1080p; I'm good for another year or two.


----------



## m&m's (Jan 10, 2019)

I voted "Nothing, I am disappointed with current stack of GPUs" because of the current prices. $700 for the Radeon 7 or $700 for the RTX 2080. Here in Canada, that will probably translate to ~$1,000 CAD... lol. Not interested. I don't care if it's 3x faster than the previous gen; I'm not paying that much for a GPU, on principle. I believe GPU and NAND manufacturers are pushing the envelope a little too much, so I don't give them money. I know some people here are going to say I'm stupid and whatnot, but as a consumer the only way I have to express my opinion to them is by not giving them my monies, so I don't. I might buy a mid-range GPU like an RTX 2060 6GB if it ever goes below the $300 mark.


----------



## GoldenX (Jan 10, 2019)

Last option.
Radeon 7 only showed performance numbers, no innovation.
Plus, thanks to Ngreedia, we now have yet another GCN card, this time at $700. Great. Thank you.


----------



## phanbuey (Jan 10, 2019)

https://venturebeat.com/2019/01/09/nvidia-ceo-rival-amds-graphics-chip-performance-is-underwhelming/
https://www.pcworld.com/article/333...x-2080-will-crush-amds-underwhelming-gpu.html


“The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”


----------



## MrGenius (Jan 10, 2019)

If I hadn't just bought my Vega 64 I'd be all over it like stink on a gorilla. But since I did just buy the Vega... I can't justify the expense ATM (or any time in the near future).



phanbuey said:


> https://venturebeat.com/2019/01/09/nvidia-ceo-rival-amds-graphics-chip-performance-is-underwhelming/
> https://www.pcworld.com/article/333...x-2080-will-crush-amds-underwhelming-gpu.html
> 
> 
> “The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”


And we're going to listen to what that dumbass has to say about it why? Or rather, how are we stupid enough to listen to that idiot? Or... I could go on and on. Dude's on a sinking ship and claiming everything's hunky-dory. Time to shut yer yap, go lay down for a while, and lick your wounds, Jensen. Quit acting like an overzealous dickhead.


----------



## hat (Jan 10, 2019)

phanbuey said:


> https://venturebeat.com/2019/01/09/nvidia-ceo-rival-amds-graphics-chip-performance-is-underwhelming/
> https://www.pcworld.com/article/333...x-2080-will-crush-amds-underwhelming-gpu.html
> 
> 
> “The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”


Wouldn't it be only natural for the competing CEO to say such things? Nvidia's RTX has been received by most as underwhelming... a little eye candy that has already been done via rasterisation, with a giant performance hit, at a high cost... both in money terms and in die size. On top of that, ray tracing is supposed to be a standard feature in DX12, so AMD should be able to do it too, without specialized hardware (tensor cores). Granted, it remains to be seen what the red team's ray tracing looks and runs like... As for DLSS, it seems to be an even bigger flop than RTX ray tracing. It's "fast" upscaling that looks like crap, and it has to have special support on a per-game basis. Not hating though, just not impressed with the tech. Not sure if AMD will have an answer to that, but I don't suppose it would be that hard to do upscaling. They already do it for "4K" consoles with much weaker hardware...


----------



## OneMoar (Jan 10, 2019)

nope, because I know what the power consumption/heat output is going to be.
I also know there is absolutely no way it's actually competitive with the 2080 except in a handful of cherry-picked DX12 titles (did you catch how blocky the Forza screenshot they showed was, lol)

AMD should have trashed GCN a long time ago, yet here we are with more of the same

and I am sorry, being just barely competitive with the competitor's middle-high tier part with your very best high-end part is NOT SOMETHING TO BE PROUD OF. why do people have such trouble with that concept?
if your best is only about as good as your competitor's B game, you have a problem

it's like saying, yo, my Honda is almost as fast as your 2.0L BMW 3 Series


----------



## xkm1948 (Jan 10, 2019)

GCN is so old, but they just have to stretch it out as long as possible until its successor comes out.



OneMoar said:


> nope, because I know what the power consumption/heat output is going to be. I also know there is absolutely no way it's actually competitive with the 2080 except in a handful of cherry-picked DX12 titles



Vulkan more than likely


----------



## Hockster (Jan 10, 2019)

I think there's a reason half the GPU team has jumped ship over the last few months.


----------



## Zubasa (Jan 10, 2019)

OneMoar said:


> nope, because I know what the power consumption/heat output is going to be.
> I also know there is absolutely no way it's actually competitive with the 2080 except in a handful of cherry-picked DX12 titles (did you catch how blocky the Forza screenshot they showed was, lol)
> 
> AMD should have trashed GCN a long time ago, yet here we are with more of the same
> ...


Or they could fix the geometry limit on GCN; that's the least they can do.
$699 is hard to swallow when a 2080 can be bought at that price right now, supports all the bells and whistles, and will soon have VESA Adaptive-Sync support.
A tried and proven card vs. a promise on a slide from the company that makes the card.



Hockster said:


> I think there's a reason half the GPU team has jumped ship over the last few months.


Also look at all the extra grey hair Lisa Su has got; that is what the dumpster fire known as RTG does to a CEO.


----------



## OneMoar (Jan 10, 2019)

xkm1948 said:


> GCN is so old, but they just have to stretch it out as long as possible until its successor comes out.
> 
> 
> 
> Vulkan more than likely



and we all know why AMD does better in those, because their architecture is choked by their inability to keep the SPs busy enough to warrant having 2500 of them

and having them sit idle comes with the penalty of massive power consumption and wasted die space

so again, comparatively, Nvidia's arch does a lot more with a lot fewer physical execution units, so what does that tell you?
think about that for a second, kids, and let that sink in: Nvidia cards are faster with a lot fewer 'cores'

yeah, Nvidia are dicks and overpriced, but they're the market leader. they have the market on lock and can do whatever they please because there is nobody to challenge them, and that is completely AMD's fault


----------



## sam_86314 (Jan 10, 2019)

Voted last option.

Was hoping for AMD to somehow deliver GTX 1080 performance for less than $300, like Polaris did against Maxwell, but instead they claim RTX 2080/GTX 1080 Ti performance for the same price.

Looks like we'll just have to wait and see what Intel comes up with next year.


----------



## ShurikN (Jan 10, 2019)

phanbuey said:


> https://venturebeat.com/2019/01/09/nvidia-ceo-rival-amds-graphics-chip-performance-is-underwhelming/
> https://www.pcworld.com/article/333...x-2080-will-crush-amds-underwhelming-gpu.html
> 
> 
> “The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”


Leather jacket looks a bit irritated. Must be from all the success RTX cards are having.


----------



## moproblems99 (Jan 10, 2019)

OneMoar said:


> its like saying yo my honda is almost as fast as your 2.0L BMW three series



Doesn't stop anyone from buying those piles of crap, does it? And then look what they do... put RGB all over them.


----------



## sepheronx (Jan 10, 2019)

m&m's said:


> I voted "Nothing, I am disappointed with current stack of GPUs" because of the current prices. $700 for the Radeon 7 or $700 for the RTX 2080. Here in Canada, that will probably translate to ~$1,000 CAD... lol. Not interested. I don't care if it's 3x faster than the previous gen; I'm not paying that much for a GPU, on principle. I believe GPU and NAND manufacturers are pushing the envelope a little too much, so I don't give them money. I know some people here are going to say I'm stupid and whatnot, but as a consumer the only way I have to express my opinion to them is by not giving them my monies, so I don't. I might buy a mid-range GPU like an RTX 2060 6GB if it ever goes below the $300 mark.



We are getting raped in prices here in Canada.


----------



## Space Lynx (Jan 10, 2019)

sepheronx said:


> We are getting raped in prices here in Canada.



yeah, but you also make more money in Canada, so it evens out really. I mean, a gas station employee working the cash register in America makes what, $7.25 to $8 an hour, and in Canada they probably make $10-11 an hour.

bad example, I know, but that seems to be the case across the board (except for nurses, who get paid loads more in America)


----------



## sepheronx (Jan 10, 2019)

lynx29 said:


> yeah, but you also make more money in Canada, so it evens out really. I mean, a gas station employee working the cash register in America makes what, $7.25 to $8 an hour, and in Canada they probably make $10-11 an hour.
> 
> bad example, I know, but that seems to be the case across the board (except for nurses, who get paid loads more in America)



Prices on average are much higher here, though. Too much IMO. I mean, my home, which is 1,600 sq ft, cost us almost $400,000 CAD. That is far too much. Mix in everything else and we end up living paycheck to paycheck. My wife and I combined make more than the average Canadian household, yet we are house poor.

It's just that I walk away with $6K personal debt rather than the average, which extends far above $10K. I end up buying used components in order to save.


----------



## Xzibit (Jan 10, 2019)

lynx29 said:


> yeah, but you also make more money in Canada, so it evens out really. I mean, a gas station employee working the cash register *in America makes what, $7.25 to $8 an hour,* and in Canada they probably make $10-11 an hour.
> 
> bad example, I know, but that seems to be the case across the board (except for nurses, who get paid loads more in America)



That was in 2015; the average minimum wage in 2018 was $10.50, higher in certain states.


----------



## INSTG8R (Jan 10, 2019)

Still happy with my performance from Vega @ 1440p, but now that I'm overclocking, power consumption went out the window. It's still running cool doing it though, so I couldn't care less.



sepheronx said:


> We are getting raped in prices here in Canada.


Try Norway... (Ex-Pat Canuck) you don't want to know what I paid for my Vega; I got in on a "sale" and it was literally the only one in the country at the time...


----------



## londiste (Jan 10, 2019)

So, doesn't the critique of the RTX series apply 100% here?
We have had that exact performance at the same price for practically two years:
the GTX 1080 Ti since March 2017 and the RTX 2080 since September.


----------



## Zubasa (Jan 10, 2019)

londiste said:


> So, doesn't the critique of RTX series apply 100% here?
> We have had that exact performance at the same price for practically 2 years.
> GTX 1080Ti in March 2017 and RTX 2080 since September.


It does, except according to AMD's own slides the Radeon VII is barely faster than the 2080 and maintains the TDP of the Vega 64.
So you have a card with basically the same price and performance, higher power consumption, and fewer features (RTX is questionable, but it's still an extra option).
What's worse is the 2080 is readily available already.


----------



## Vya Domus (Jan 10, 2019)

phanbuey said:


> https://venturebeat.com/2019/01/09/nvidia-ceo-rival-amds-graphics-chip-performance-is-underwhelming/
> https://www.pcworld.com/article/333...x-2080-will-crush-amds-underwhelming-gpu.html



Oh boy, is he a damsel in distress. He even goes on to indirectly shit on one of Nvidia's own products, GeForce Now, saying that streaming will never work. Must be getting real pissed that Google chose AMD over them, and salty that they also had their first go at 7nm. This is truly hilarious; he sounds like an angry fanboy, not a CEO.


----------



## erocker (Jan 10, 2019)

Besides The Division 2, what two other games are coming with Radeon 7?


----------



## INSTG8R (Jan 10, 2019)

erocker said:


> Besides The Division 2, what two other games are coming with Radeon 7?


DMC 5 and RE 2


----------



## RCoon (Jan 10, 2019)

MrGenius said:


> Quit acting like an overzealous dickhead


Take your own advice.


----------



## Space Lynx (Jan 10, 2019)

INSTG8R said:


> DMC 5 and RE 2



all 3 of the free games look pretty boring to me. I watched some YouTube gameplay videos of all 3. /shrug. I'm going to pass on them myself.


----------



## jboydgolfer (Jan 10, 2019)

I'm hoping AMD has a beastly dGPU this time around. It's getting boring picking between AMD's latest top-of-the-line GPU or Nvidia's top of the line from 3 generations ago for the best performance in my budget. I personally don't feel a new GPU release has to consist of all-new dynamic shadow puppet rendering technology or light ray pixel smoothing, or whatever stupid firestarting gimmick they come up with.

AMD needs to release something that competes with the competition's GPU from THIS year and not a few years back, and Nvidia needs the competition so it actually starts needing to try. Once performance is posted online by several 3rd parties, then a choice can be made.


----------



## INSTG8R (Jan 10, 2019)

lynx29 said:


> all 3 of the free games look pretty boring to me. I watched some YouTube gameplay videos of all 3. /shrug. I'm going to pass on them myself.


Really, only Div 2 holds a mild interest; the first one was truly the best-crafted world I've ever had the pleasure to wander. The bullet-sponge mechanics were the real turnoff. I don't do hack and slash, so DMC means nothing to me, and I could take or leave RE.


----------



## qubit (Jan 10, 2019)

There should be an option for NVIDIA to make it a fair poll.


----------



## sepheronx (Jan 10, 2019)

INSTG8R said:


> Still happy with my performance from Vega @ 1440p, but now that I'm overclocking, power consumption went out the window. It's still running cool doing it though, so I couldn't care less.
> 
> 
> Try Norway... (Ex-Pat Canuck) you don't want to know what I paid for my Vega; I got in on a "sale" and it was literally the only one in the country at the time...



I can only imagine. European nations can get horribly screwed because of VAT and other little taxes on everything. So I can sympathize.

Edit:

I would like to share my 2 kopeks on this.

There are people here who are very overzealous in their comments about AMD and the Radeon 7, where they borderline call it trash because it doesn't compete against the 2080 Ti. OK, it's fair to complain that AMD may not have come out with a top-tier, top-of-the-line part to compete. But competing with the option just below it is good for me. Good for most. Competition eventually lets them fight it out on price too. The issue is, AMD is still using HBM memory, which is expensive to say the least. If they went GDDR6 it might drop the price point by at least $100. Maybe I am exaggerating though.

I am not the man who buys top of the line. Mid-tier is good for me. Something like an RTX 2060 is right at the sweet spot, if it wasn't for the price. The price should be more like $280-$300 IMO. Now I am intrigued to see what AMD comes out with in that category. My only gripe is the wattage. Otherwise, I won't crap on AMD just because they don't have a 2080 Ti competitor. I really don't care about that category, and judging by the average consumer, I'd say they don't care either.

I don't have a favorite brand. Favorite brands are for chumps. I go for the best price-to-performance. Simple as that.


----------



## Space Lynx (Jan 10, 2019)

sepheronx said:


> I can only imagine. European nations can get horribly screwed because of VAT and other little taxes on everything. So I can sympathize.
> 
> Edit:
> 
> ...



they also have universal healthcare and no anxiety about being able to afford their medicine.

take your pick: a cheaper graphics card you buy once every 4-5 years, or an anxiety-free life...


----------



## londiste (Jan 10, 2019)

sepheronx said:


> Mid-tier is good for me. Something like an RTX 2060 is right at the sweet spot, if it wasn't for the price. The price should be more like $280-$300 IMO. Now I am intrigued to see what AMD comes out with in that category.


Wait no more, the RX 590 came out less than two months ago with a $279 MSRP, right in your sweet spot.


----------



## Space Lynx (Jan 10, 2019)

londiste said:


> Wait no more, the RX 590 came out less than two months ago with a $279 MSRP, right in your sweet spot.



yep, and once overclocked it does do quite well for that price point actually, minus the heat and power draw. better to be patient and wait for cheap 7nm parts in 6 months


----------



## Vayra86 (Jan 10, 2019)

phanbuey said:


> https://venturebeat.com/2019/01/09/nvidia-ceo-rival-amds-graphics-chip-performance-is-underwhelming/
> https://www.pcworld.com/article/333...x-2080-will-crush-amds-underwhelming-gpu.html
> 
> 
> “The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”



"If we _can_ turn on DLSS or ray tracing, we crush it." "Too bad we can't, most of the time, and people generally don't seem to care about it in the first place."

Oh sorry Huang, did I steal your thoughts?

Oh shit, it gets even better.

This is also interesting, and kinda echoes... well, everything that's been said about RTX. It falls upon the 2060 to 'make the masses ready', apparently. The lowest-performing RTRT card. Apparently Nvidia is ready _now_; after all, we have BFV... and Final Fantasy... and...? A sad product stack is pretty damn sad, I'd say, when _only one product has good sales potential_.


----------



## sepheronx (Jan 10, 2019)

londiste said:


> Wait no more, the RX 590 came out less than two months ago with a $279 MSRP, right in your sweet spot.



That is true, but I would rather wait to see what the new chips come out as in the mid-range spot. The RX 590 was just a 1060 with GDDR5X, imo.


----------



## Athlonite (Jan 10, 2019)

INSTG8R said:


> Still happy with my performance from Vega @ 1440p, but now that I'm overclocking, power consumption went out the window. It's still running cool doing it though, so I couldn't care less.
> 
> 
> Try Norway... (Ex-Pat Canuck) you don't want to know what I paid for my Vega; I got in on a "sale" and it was literally the only one in the country at the time...



try New Zealand on for size:
Sapphire Radeon RX Vega 64 Limited Edition HDMI 3xDP 8GB *$1,583.55*


----------



## Flyordie (Jan 10, 2019)

Can't honestly answer the poll.

A liquid-cooled version, with all 64 CUs unlocked and rocking a 1750-1800MHz clock at an $800 price point... maybe?

I mean, right now, the Vega 64 Liquid is $500 shipped.

Don't get me wrong, Vega II is good. My Vega clocks to 1800MHz easily, and if the clocks on the VII are what they said they were, 1800MHz, then it (the VII) is at least 18% faster overall in gaming.

So I'll wait for the fully unlocked cards, if they come. Other than that... my Vega 64 is fine.


----------



## kurosagi01 (Jan 10, 2019)

If the cooling is right then I may consider purchasing around spring time; trying to keep a Vega 64 cool on the reference cooler is quite a noisy business when running it at maximum fan speed lol.
But my gut feeling is telling me I may need to change my CPU first, and I'm going to be replacing my RAM kit with 3000MHz sticks, which I should have got from the start.


----------



## Vayra86 (Jan 10, 2019)

lynx29 said:


> they also have universal healthcare and no anxiety about being able to afford their medicine.
> 
> take your pick: a cheaper graphics card you buy once every 4-5 years, or an anxiety-free life...



This man gets it. 

As for those graphics cards... I paid 450 EUR for my brand spanking new 1080... no complaints.


----------



## purecain (Jan 10, 2019)

if I didn't have the V I'd be buying two of these to CrossFire... praying the overclocking performance numbers are decent... otherwise Nvidia are going to keep pushing up the price of GPUs…


----------



## xtreemchaos (Jan 10, 2019)

too much money for me, I'll stick with my 2x 580s for now. if in a year or so they come down 20% I might try a 7 for fun. charl.


----------



## Absolution (Jan 10, 2019)

CES was a disappointment for me as far as a Vega 56 owner goes.

I could sell my V56 and go for an RTX 2060, but I don't want a card with 6GB of RAM.

On the other end, I was hoping for a 150W die shrink of the V56 in the form of a 3080 at $350 USD, and that didn't happen either.

Looks like I will be holding onto my Vega for a bit longer.

Unless... Radeon VII is somewhat competitive and triggers a price war.


----------



## newtekie1 (Jan 10, 2019)

Biased poll is biased. No option for the RTX 2080? Seriously!?

Even with Radeon 7, the RTX 2080 is still looking like the better buy between the two based on what we know so far. Performance will be about the same, the price is the same, and the RTX 2080 offers more features (ray tracing). Yeah, ray tracing isn't utilized much yet, but it is something the RTX has over the Radeon, while the Radeon offers nothing extra at this point.


----------



## Space Lynx (Jan 10, 2019)

Absolution said:


> CES was a disappointment for me as far as a Vega56 owner goes.
> 
> I could sell my V56 and go for an RTX 2060, but I don't want a card with 6GB of RAM.
> 
> ...



it also might overclock really well. we really have no idea yet.


----------



## NdMk2o1o (Jan 10, 2019)

jboydgolfer said:


> I'm hoping AMD has a beastly dGPU this time around. It's getting boring picking between AMD's latest top-of-the-line GPU or Nvidia's top of the line from 3 generations ago for the best performance in my budget. I personally don't feel a new GPU release has to consist of all-new dynamic shadow puppet rendering technology or light ray pixel smoothing, or whatever stupid firestarting gimmick they come up with.
> 
> AMD needs to release something that competes with the competition's GPU from THIS year and not a few years back, and Nvidia needs the competition so it actually starts needing to try. Once performance is posted online by several 3rd parties, then a choice can be made.


So your only choice is a Vega 7 or a 980 Ti? umm, ok....

Seems you haven't read the thread or the forums for the last 2 days; it competes with the 2080, so we're told, but we won't know any different until reviews on Feb 7th.


----------



## Durvelle27 (Jan 10, 2019)

I'm waiting for navi


----------



## spectatorx (Jan 10, 2019)

Durvelle27 said:


> I'm waiting for navi


Same here. Before seeing your post I was about to write a post about Navi. AMD's roadmaps showed Navi in 2019 and a new, non-GCN architecture a year later. My plan for this year until now was to buy a Navi GPU, but now I have no idea what to do, as I am totally confused. This whole VII seems to be a pointless GPU, though it still makes more sense than the 590. The announced price also turns me away, a lot. A $700 USD MSRP is too much for a consumer GPU; make it $500 with the performance of a 2080/1080 Ti and then we can talk. Such an MSRP is probably a result of analyzing sales during the cryptocurrency mining boom, when people were paying insane money for GPUs (800 euros for a 580...), and the card itself is not worth that much.

I would say I could wait for Navi, but seeing such a price for a card which was never planned before (the VII), I think prices for Navi will be even higher, unless they come to the realization that they must lower prices. I wonder how long I will stick with my R9 380 waiting for a worthy replacement in performance and price. Maybe I will soon buy a 580 for 250 euros or a Vega 64 for 500, but I would love to have Navi with the performance of a 2080 Ti for 500 euros, as it should be.


----------



## phanbuey (Jan 10, 2019)

Durvelle27 said:


> I'm waiting for navi


Same. I am hoping this is just AMD putting a known design through 7nm to get the hang of the new fab, and that Thanksgiving 2019 is Navi.


----------



## Vayra86 (Jan 10, 2019)

phanbuey said:


> Same. I am hoping this is just AMD putting a known design through 7nm to get the hang of the new fab, and that Thanksgiving 2019 is Navi.



I'm still missing how Navi will change GCN, apart from splitting up the resources and using an interconnect to glue them together. I'm still missing how this is not just a repackaged Vega chopped into smaller blocks, mainly meant for the next console upgrade.


----------



## phanbuey (Jan 10, 2019)

Vayra86 said:


> I'm still missing how Navi will change GCN, apart from splitting up the resources and using an interconnect to glue them together. I'm still missing how this is not just a repackaged Vega chopped into smaller blocks, mainly meant for the next console upgrade.



I don't think you're missing anything. I think Navi will allow AMD to make a Rome-like chiplet design that's much more powerful than a monolithic GPU, using a tweaked GCN core.


----------



## Space Lynx (Jan 10, 2019)

Durvelle27 said:


> I'm waiting for navi





spectatorx said:


> Same here. Before seeing your post I was about to write a post about Navi. AMD's roadmaps showed Navi in 2019 and a new, non-GCN architecture a year later. My plan for this year until now was to buy a Navi GPU, but now I have no idea what to do, as I am totally confused.
> ...



this card does seem like a placeholder, just to ease people's fears that they are quitting GPUs. though I am not sure why they couldn't just rush Navi instead of this; they are both 7nm...


----------



## Dante Uchiha (Jan 10, 2019)

Where is the option "Wait for reviews before making the choice"? It looks like an interesting GPU for those who use 3D modeling software.


----------



## londiste (Jan 10, 2019)

phanbuey said:


> I don't think you're missing anything. I think Navi will allow AMD to make a Rome-like chiplet design that's much more powerful than a monolithic GPU, using a tweaked GCN core.


Except for some dude named David Wang saying Navi will definitely not be MCM.


			
https://www.pcgamesn.com/amd-navi-monolithic-gpu-design said:

> But we recently spoke with David Wang, the new SVP of engineering for AMD’s Radeon Technologies Group (RTG), and there’s pretty much zero chance that’s going to be worked into next year’s Navi GPUs.
> ...
> It’s definitely something AMD’s engineering teams are investigating, but it still looks a long way from being workable for gaming GPUs, and definitely not in time for the AMD Navi release next year. “We are looking at the MCM type of approach,” says Wang, “but we’ve yet to conclude that this is something that can be used for traditional gaming graphics type of application.”


----------



## xkm1948 (Jan 10, 2019)

Well, technically you can take the choice “higher tier GPUs” to mean the 2080, 2080 Ti and Titan RTX.


----------



## Vayra86 (Jan 10, 2019)

phanbuey said:


> I don't think you're missing anything. I think Navi will allow AMD to make a Rome-like chiplet design that's much more powerful than a monolithic GPU, using a tweaked GCN core.



The difference is that Zen is also a more efficient architecture, and it is optimized around a specific clock range that competes with other offerings.

GCN is neither of those things, and cutting it up sure as hell won't improve perf/watt on its own. The only escape then would be lower clocks, to sit right in the efficiency sweet spot (see Vega undervolts/underclocks and/or the RX 590's perf/watt loss versus its predecessors), and even that would not be enough to produce something sub-300W with decent high-end perf. I'm also not seeing how GCN fixes the latency that even Zen suffers from. GPU tasks are highly time-critical and everything is realtime; it's not like a decode/render task you can just take a few seconds longer on, like CPUs do.


----------



## 8bitgamer757 (Jan 10, 2019)

Can someone tell me what GCN is? Every time I see it I think you're talking about the GameCube.


----------



## spectatorx (Jan 10, 2019)

OK, there is one thing I missed in the specification. The card (VII) has a lot of HBM VRAM, which is awesome. The thing that caught even more of my attention is the number of ROP units: 128. Now I'm wondering, and it seems like this is the first consumer single-GPU card ever released with such a high ROP count. Please correct me if I am wrong.

The specification for this card is odd IMO and seems to orient the card not towards consumers but towards "prosumer", professional tasks. Especially if we compare it with the Vega 64, the specification becomes even weirder, as there are fewer CUs but more of everything else.

The Vega 64 is already comparable with the 1080; the VII with this specification should be better than the 2080 Ti in most titles. I will be surprised if it's different.


----------



## londiste (Jan 10, 2019)

8bitgamer757 said:


> Can someone tell me what GCN is? Every time I see it I think you're talking about the GameCube.


Graphics Core Next, the base architecture of current AMD cards.
Historically it goes back to select models in the HD 7000/8000 series, and it is still (albeit considerably improved) the base of AMD GPUs. This also includes the GPUs in both Xbox One (including S and X) and PS4 (including Pro) consoles. AMD has said Navi should be the last GCN GPU before a completely new architecture in Arcturus.
https://en.wikipedia.org/wiki/Graphics_Core_Next



spectatorx said:


> Vega 64 is already comparable with the 1080; with this specification, the VII should be better than the 2080 Ti in most titles. I will be surprised if it turns out differently.


Curb your enthusiasm. AMD says it should perform around the RTX 2080 (not the RTX 2080 Ti) in games. They said it, the slides showed it, and their results from 25 games - in their comparison of Vega 64 and Radeon VII, where the VII is 29% faster - show the same thing. This should be the basic expectation for now. We will know more when third-party reviews appear; that is February 7 at the latest.

The ROP count is a(n educated) guess at this point. There are no specs available about it.


----------



## spectatorx (Jan 10, 2019)

londiste said:


> ...
> 
> Curb your enthusiasm. AMD says it should perform around the RTX 2080 (not the RTX 2080 Ti) in games. They said it, the slides showed it, and their results from 25 games - in their comparison of Vega 64 and Radeon VII, where the VII is 29% faster - show the same thing. This should be the basic expectation for now. We will know more when third-party reviews appear; that is February 7 at the latest.
> 
> The ROP count is a(n educated) guess at this point. There are no specs available about it.



https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699


----------



## londiste (Jan 10, 2019)

spectatorx said:


> https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699


The thing is, it is just an educated guess; it has not been confirmed. Vega 20 is in the MI50/60, but their spec sheets have nothing on that either. Well, can't blame them - ROPs are not really relevant for deep learning cards.


----------



## Vayra86 (Jan 10, 2019)

Right now I might almost change my vote to 'Radeon 7' just so I can send Jensen Huang a pic of me having bought one, while tossing away my 1080.

I'd still sell it off after that to get something decent


----------



## xkm1948 (Jan 10, 2019)

Vayra86 said:


> Right now I might almost change my vote to 'Radeon 7' just so I can send Jensen Huang a pic of me having bought one, while tossing away my 1080.
> 
> I'd still sell it off after that to get something decent



Just do it! Buy a Radeon 7 just to piss off Nvidia. Dr. Su will be pleased.


----------



## WhiteNoise (Jan 10, 2019)

Nope. I'm good. Currently running a 1080Ti, 1080 and 980 in my three gaming rigs.


----------



## eidairaman1 (Jan 10, 2019)

phanbuey said:


> Same, I am hoping this is just AMD putting a known design through 7nm to get the hang of the new fab, and that the big thing for 2019 is Navi.



Just like Fury was a test bed for Vega, Vega is a test bed for Navi and beyond.


----------



## xkm1948 (Jan 10, 2019)

eidairaman1 said:


> Just like Fury was a test bed for Vega, Vega is a test bed for Navi and beyond.



And Navi will be a test bed for?


----------



## Zubasa (Jan 10, 2019)

xkm1948 said:


> And Navi will be a test bed for?


Pixie Dust.


----------



## xkm1948 (Jan 10, 2019)

Zubasa said:


> Pixie Dust.



So when do the test-bed products end and we get official, non-test-bed stuff?


----------



## Brusfantomet (Jan 10, 2019)

I must say I was tempted when reading about it, but looking at it now, the two 290Xs I have in CrossFire are still doing fine at the moment - water cooling keeps them at around 55 °C under load while overclocked. But depending on pricing and performance, I might change.



INSTG8R said:


> Still happy with my performance from Vega @ 1440p, but now that I'm overclocking, power consumption went out the window. But it's still running cool doing it, so I couldn't care less.
> 
> 
> Try Norway... (ex-pat Canuck) - you don't want to know what I paid for my Vega, and I got in on a "sale" where it was literally the only one in the country at the time...



What you are missing is that the card has what is, as near as makes no difference, a 5-year warranty from the store - all hardware purchased here does.



lynx29 said:


> They also have universal healthcare and no anxiety about being able to afford their medicine.
> 
> Take your pick: a cheaper graphics card you buy once every 4-5 years, or an anxiety-free life...



The average yearly salary is 60,500 USD; if computers are your hobby, you can get a card more often than every 5 years.



sepheronx said:


> I can only imagine. European nations can get horribly screwed because of VAT and other little taxes on everything. So I can sympathize.



The cheapest 2080 at the moment is 885 USD, and that is with the 25% VAT included.


----------



## theonek (Jan 10, 2019)

Well, there is a difference between announced and released, no? So there will be more time until we see the actual card and actual tests of it...


----------



## HTC (Jan 10, 2019)

I'd buy nothing.

My GPU budget is €250 - €300 if I stretch it.

Nvidia is out of the picture because I dislike their business practices and, because I vote with my wallet, I go AMD. But since AMD doesn't have a new card that:

A - is within my budget
B - has a good enough performance increase to justify the upgrade
C - isn't loud and doesn't consume too much power

I don't go AMD either.

Besides, I don't game that much, and I'm not one of those who desperately needs eye candy to be able to game anyway.


----------



## phanbuey (Jan 10, 2019)

HTC said:


> Besides, I don't game that much, and I'm not one of those who desperately needs eye candy to be able to game anyway.



It's a disease really.  Must... have... candy for eyes.


----------



## silentbogo (Jan 10, 2019)

xkm1948 said:


> Just do it! Buy a Radeon 7 just to piss off Nvidia. Dr. Su will be pleased.


I know you'll buy it. It's your long-awaited MI25, only now on 7nm, for the consumer segment, and 1.5 years late.


----------



## fullinfusion (Jan 10, 2019)

xkm1948 said:


> So when do the test-bed products end and we get official, non-test-bed stuff?


My question to you is, what is your fascination with AMD cards? I understand you feel butthurt because you jumped all over Fury before there was much information about the product.

I was all for getting a Fury as well, but I asked our memory and motherboard reviewer, and he flat out told me don't bother, keep what you have as it's better and faster. So I did... plus, that saved me how much $$? lol

The day that I would spend $1,200 on a GPU is the day that I buy a bag of suckers, aka lollipops, and put them in my pocket and carry them around with me every single day of the week - or rather, every week - just to remind me of how much of a sucker I am for spending that kind of bread.

Now to get back on topic. Would I buy a Radeon 7? Absolutely! But not until I do my research and see the reviews, and if the price is right, then hell yeah.

But someone pointed out: why not add Nvidia into the mix to keep it fair? When I clicked on this thread I didn't expect anything to do with Nvidia, nor do I care.

But isn't RTX a test bed all on its own, one that was catching fire on a lot of customers' cards and causing artifacts? And is RTX going to be in all games? Do you just stop mid-game and stare at a reflection like a deer in the headlights? I guess time will tell, eh. I'd call that a test bed over GCN any day.


----------



## Splinterdog (Jan 10, 2019)

Nothing. I already spent my annual budget last year on a big Ryzen upgrade.
If I had the spare cash, I'd replace my H100i, which is getting on a bit now.


----------



## eidairaman1 (Jan 10, 2019)

fullinfusion said:


> My question to you is, what is your fascination with AMD cards? I understand you feel butthurt because you jumped all over Fury before there was much information about the product.
> 
> I was all for getting a Fury as well, but I asked our memory and motherboard reviewer, and he flat out told me don't bother, keep what you have as it's better and faster. So I did... plus, that saved me how much $$? lol
> 
> ...



If I were buying a card now, it'd be Vega 2.


----------



## xkm1948 (Jan 11, 2019)

silentbogo said:


> I know you'll buy it. It's your long-awaited MI25, only now on 7nm, for the consumer segment, and 1.5 years late.



lol nope. Are you going to though?


----------



## Space Lynx (Jan 11, 2019)

I mean, if the Radeon 7 is on eBay when they have that $100-off site-wide promo, and I can get it for $599 with free shipping and no tax, then I might consider it.


----------



## Zubasa (Jan 11, 2019)

lynx29 said:


> I mean, if the Radeon 7 is on eBay when they have that $100-off site-wide promo, and I can get it for $599 with free shipping and no tax, then I might consider it.


So far it looks like it is another AMD-only card, like the Frontier Edition.
AMD wants your every cent to cover that HBM cost.


----------



## silentbogo (Jan 11, 2019)

xkm1948 said:


> lol nope. Are you going to though?


Probably not. It won't be available in my area for sure. Never even got to see Vega 56 or 64 with my own eyes.
If anything, CPU/MoBo upgrade is a priority for now. Just waiting for Ryzen 3000-series to hit the shelves.


----------



## ppn (Jan 11, 2019)

A direct shrink of Vega 10 on 7nm would be ~200 mm². Add faster 8GB of HBM2 at 672 GB/s, price it at 350 euro like the 2060, have it perform like an overclocked 2070, and we may have a deal. Instead, Vega 20 costs like a 2080 and performs like a 2070 OC @ 2 GHz at double the power - what is the point?


----------



## londiste (Jan 11, 2019)

ppn said:


> A direct shrink of Vega 10 on 7nm would be ~200 mm². Add faster 8GB of HBM2 at 672 GB/s, price it at 350 euro like the 2060, have it perform like an overclocked 2070, and we may have a deal. Instead, Vega 20 costs like a 2080 and performs like a 2070 OC @ 2 GHz at double the power - what is the point?


This is very, very close to a direct shrink of Vega 10. Vega 20 has 5.6% more transistors - for the additional memory controllers and 1:2 FP64 support. Taking out that 5.6%, it would still be roughly a 313 mm² die.
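The ~313 mm² figure follows from simple arithmetic, assuming Vega 20's commonly reported ~331 mm² die area (a number not stated in this thread):

```python
# Back out a "pure shrink" die size from the 5.6% extra-transistor figure.
# The 331 mm^2 Vega 20 area is an assumption from public reports,
# not something confirmed in this thread.
vega20_area = 331.0        # mm^2, assumed Vega 20 (7nm) die size
extra_transistors = 0.056  # Vega 20's extra transistors vs Vega 10

pure_shrink = vega20_area / (1 + extra_transistors)
print(f"{pure_shrink:.0f} mm^2")  # ~313 mm^2
```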


----------



## ppn (Jan 11, 2019)

They added a lot of empty space separating the core from the memory controllers, if you look at it, and doubling the memory interface from 2048-bit bloats the design. I'd prefer a much nicer direct shrink: the core itself has undergone a 2.4x shrink, no need to add anything, just bump clocks 30% and voila - 2080 levels. But instead we have this, 50% bigger than it needs to be.


----------



## qubit (Jan 11, 2019)

I wouldn't swap my GTX 1080 for any AMD graphics card at the moment. When they become competitive again I'll consider it.


----------



## Space Lynx (Jan 11, 2019)

It's hard to swallow the power consumption versus Nvidia, that's for sure; that is a lot of money over 3 years of electric bills if you end up gaming or folding a lot.


----------



## INSTG8R (Jan 11, 2019)

lynx29 said:


> It's hard to swallow the power consumption versus Nvidia, that's for sure; that is a lot of money over 3 years of electric bills if you end up gaming or folding a lot.


And I’ll never understand the obsession with power. Performance and thermals are what's important; if it takes 500W to get there, I couldn't care less - it's literally pennies a year, and your card sits idle consuming single-digit watts most of the time.


----------



## londiste (Jan 11, 2019)

lynx29 said:


> It's hard to swallow the power consumption versus Nvidia, that's for sure; that is a lot of money over 3 years of electric bills if you end up gaming or folding a lot.


Cooling, noise, and the sizing of other things like the PSU, sure - but does anyone really care about the actual power cost?

The RTX 2080 is at 225W vs the Radeon VII's assumed (and likely, because the card is essentially identical to the MI50) 300W. That's a 75W difference at full power.
Assuming both cards run at full load for an entire year (an entirely unrealistic scenario for a consumer card), that is a difference of 657 kWh.
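The 657 kWh figure is just the wattage gap run flat-out for a year (a minimal sketch; the 225W/300W numbers are the post's assumptions, not measurements):

```python
# Reproduce the arithmetic above: a constant 75W gap, run 24/7 for a year.
gap_w = 300 - 225          # assumed Radeon VII board power minus RTX 2080 TDP
hours_per_year = 365 * 24  # full-load hours in a year
kwh = gap_w * hours_per_year / 1000
print(kwh)  # 657.0 kWh
```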


----------



## INSTG8R (Jan 11, 2019)

londiste said:


> Cooling, noise, and the sizing of other things like the PSU, sure - but does anyone really care about the actual power cost?
> 
> The RTX 2080 is at 225W vs the Radeon VII's assumed (and likely, because the card is essentially identical to the MI50) 300W. That's a 75W difference at full power.
> Assuming both cards run at full load for an entire year (an entirely unrealistic scenario for a consumer card), that is a difference of 657 kWh.


My Vega won't go over 300 unless I force it to; on the high BIOS it's 275W, 240W on the low one. Without actually overclocking, it would never go over 300W.


----------



## Vya Domus (Jan 11, 2019)

lynx29 said:


> It's hard to swallow the power consumption versus Nvidia, that's for sure; that is a lot of money over 3 years of electric bills if you end up gaming or folding a lot.



Yeah, I am convinced everyone ready to drop $700 on a card will count every penny that goes to their electricity bill for years to come.


----------



## xkm1948 (Jan 11, 2019)

Vya Domus said:


> Yeah, I am convinced everyone ready to drop $700 on a card will count every penny that goes to their electricity bill for years to come.



I do agree with this. For flagship GPUs, power is definitely of lesser concern. Performance and noise level should be the No. 1 concerns for high-end cards.


----------



## Space Lynx (Jan 11, 2019)

I disagree. Some of us have small incomes and our main hobby is gaming; just because we like to enjoy our main hobby to the max doesn't mean I should have to accept higher electric bills when better alternatives are out there, which there are with Nvidia. You're better off paying $100 more for a 2080 because within a year of heavy usage it will have paid that $100 back in cheaper electric bills while still being slightly faster, so I am not sure I get your point at all.


----------



## Vayra86 (Jan 11, 2019)

INSTG8R said:


> And I’ll never understand the obsession with power. Performance and Thermals are important if it takes 500W to get there I could care less it’s literally pennies a year and your card sits idles consuming single digits most of the time.



Apart from noise and the heat to dissipate out of a case, there is definitely a cost aspect.

Let's make it a very simple bit of math. Consider the Radeon 7 versus 2080 situation; if the Radeon 7 is 50-100 dollars cheaper, you could say it's a great deal at the same performance. But if you consider 4 hours of gaming per day for, let's say, 300 days per year, that is 1,200 hours per year. Say you keep the R7 for two years, and let's say on average there is a 50W gap between the two cards at the same performance. 2,400 h x 50 W = 120 kWh @ €0.25/kWh = €30. That is over two years - and I reckon most keep this card for longer, either in the same or another hand-me-down rig. Or if you game more than 4h x 300d per year... it's easy to get pretty damn close to the money you saved beforehand.

Power matters, and cost is an aspect you simply cannot ignore, especially not when matching up cards to see which is a better deal. I'm not advocating that one should pick Nvidia because 'it's better for the environment' or something; in that sense power is indeed irrelevant. But as a cost aspect, it's a real thing worth factoring into a purchase.
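The back-of-the-envelope math above can be wrapped in a tiny helper (the 50W gap, the hours, and the €0.25/kWh price are the post's assumptions, not measured figures):

```python
def extra_cost_eur(gap_w: float, hours_per_day: float, days_per_year: float,
                   years: float, eur_per_kwh: float) -> float:
    """Extra electricity cost (EUR) of a card drawing gap_w more watts at load."""
    kwh = gap_w * hours_per_day * days_per_year * years / 1000
    return kwh * eur_per_kwh

# 50W gap, 4 h/day, 300 days/year, kept for 2 years, at EUR 0.25/kWh:
print(extra_cost_eur(50, 4, 300, 2, 0.25))  # 30.0
```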


----------



## Space Lynx (Jan 11, 2019)

Vayra86 said:


> Apart from noise and the heat to dissipate out of a case, there is definitely a cost aspect.
> 
> Let's make it a very simple bit of math. Consider the Radeon 7 versus 2080 situation; if the Radeon 7 is 50-100 dollars cheaper, you could say it's a great deal at the same performance. But if you consider 4 hours of gaming per day for, let's say, 300 days per year, that is 1,200 hours per year. Say you keep the R7 for two years, and let's say on average there is a 50W gap between the two cards at the same performance. 2,400 h x 50 W = 120 kWh @ €0.25/kWh = €30. That is over two years - and I reckon most keep this card for longer, either in the same or another hand-me-down rig. Or if you game more than 4h x 300d per year... it's easy to get pretty damn close to the money you saved beforehand.
> 
> Power matters.



Well, I like to do Folding@home once in a while too - not all the time, just sometimes... that's 24-hour usage... so yeah, Nvidia is the only option for that, just because of the electric bill. And I do game 4-6 hours in long spurts sometimes; I don't watch a single minute of TV, ever - never been my thing, only gaming. So yeah.


----------



## Vya Domus (Jan 11, 2019)

Vayra86 said:


> Apart from noise and the heat to dissipate out of a case, there is definitely a cost aspect.
> 
> Let's make it a very simple bit of math. Consider the Radeon 7 versus 2080 situation; if the Radeon 7 is 50-100 dollars cheaper, you could say it's a great deal at the same performance. But if you consider 4 hours of gaming per day for, let's say, 300 days per year, that is 1,200 hours per year. Say you keep the R7 for two years, and let's say on average there is a 50W gap between the two cards at the same performance. 2,400 h x 50 W = 120 kWh @ €0.25/kWh = €30. That is over two years - and I reckon most keep this card for longer, either in the same or another hand-me-down rig. Or if you game more than 4h x 300d per year... it's easy to get pretty damn close to the money you saved beforehand.
> 
> Power matters, and cost is an aspect you simply cannot ignore, especially not when matching up cards to see which is a better deal. I'm not advocating that one should pick Nvidia because 'it's better for the environment' or something; in that sense power is indeed irrelevant. But as a cost aspect, it's a real thing worth factoring into a purchase.



I am sure you know this, but I am going to point it out again: a lot of people have a fixed amount of cash they can spend on something at a time. If you only have $700, that's it; no matter how much logic you can derive from this power-usage argument, it won't change your spending capacity.


----------



## Space Lynx (Jan 11, 2019)

Vya Domus said:


> I am sure you know this, but I am going to point it out again: a lot of people have a fixed amount of cash they can spend on something at a time. If you only have $700, that's it; no matter how much logic you can derive from this power-usage argument, it won't change your spending capacity.









That was me at checkout a couple of weeks ago for an EVGA RTX 2080 Black Edition: free shipping, no tax, with the site-wide promo code eBay does every month or so. Sometimes they run that promo code multiple times in a single month. Next time I catch this deal, I am biting the bullet! AMD can shove that $700 where the sun don't shine.


----------



## Vya Domus (Jan 11, 2019)

Good for you, buddy. Am I supposed to care about that or what?


----------



## kapone32 (Jan 11, 2019)

I have my Sapphire Vega 64 on a Bykski water block, and since AMD's 18.2.2.2 driver update added the ability to auto-undervolt the card, I applied it. I then did a Fire Strike run this morning and saw the card running at 1787 MHz on the GPU clock! I was hyped for Vega 7, but since I saw that, with a minimum of 150 FPS in both graphics tests, I think I will put my confidence in software updates decreasing the gap between AMD and the green team. I will probably replace my cards with Navi. Right now my main concern is to find a reference Vega 64 to replace my Gigabyte Vega 64. Is anyone interested in a trade? I will probably create a post for that last statement anyway.



Vayra86 said:


> Apart from noise and the heat to dissipate out of a case, there is definitely a cost aspect.
> 
> Let's make it a very simple bit of math. Consider the Radeon 7 versus 2080 situation; if the Radeon 7 is 50-100 dollars cheaper, you could say it's a great deal at the same performance. But if you consider 4 hours of gaming per day for, let's say, 300 days per year, that is 1,200 hours per year. Say you keep the R7 for two years, and let's say on average there is a 50W gap between the two cards at the same performance. 2,400 h x 50 W = 120 kWh @ €0.25/kWh = €30. That is over two years - and I reckon most keep this card for longer, either in the same or another hand-me-down rig. Or if you game more than 4h x 300d per year... it's easy to get pretty damn close to the money you saved beforehand.
> 
> Power matters, and cost is an aspect you simply cannot ignore, especially not when matching up cards to see which is a better deal. I'm not advocating that one should pick Nvidia because 'it's better for the environment' or something; in that sense power is indeed irrelevant. But as a cost aspect, it's a real thing worth factoring into a purchase.




Only if you don't undervolt your card - and there is no information, other than opinions, that the Vega 7 will pull 300 watts.


----------



## phanbuey (Jan 11, 2019)

I think the main issue is that there is any discussion about it at all... AMD gave us a choice (which is great) for the mid-high end, but they could have made it a no contest.

Got $500? The Radeon 7 is a no-brainer: 2080 performance at a 2070 price!

But now you have to factor things in and it's a 50/50 - you can really go either way. It just lets Nvidia completely off the hook for its ridiculous RTX pricing.


----------



## londiste (Jan 11, 2019)

kapone32 said:


> Only if you don't undervolt your card - and there is no information, other than opinions, that the Vega 7 will pull 300 watts.


Let me introduce you to a Radeon VII without video outputs: https://www.amd.com/en/products/professional-graphics/instinct-mi50


----------



## Space Lynx (Jan 11, 2019)

Vya Domus said:


> Good for you, buddy. Am I supposed to care about that or what?



I'm just saying there are still options if you are a US customer and patient with eBay's site-wide promo codes.


----------



## xkm1948 (Jan 11, 2019)

7nm is expensive; 16GB of HBM is also expensive. I think at $699 AMD barely makes money. Radeon 7 feels more like a statement than actual competition - same as Vega and Fury: to show consumers they are still in the high-end game.

TBH, a limited supply run also seems likely. If you really like AMD, then Radeon 7 feels like the final evolved form of GCN. It is probably gonna be the only high-end option until whatever comes after Navi.

For computing, it is kinda hard to say. I made quite a lot of attempts at using my old Fury X for OpenCL-based genome alignment acceleration. When you can get it working, it IS VERY FAST. However, the learning curve is quite steep, and there are not a whole lot of bioinformaticians coding in OpenCL whom I can get help from. Maybe it would be better for other people doing ML or AI - but then again, without good TensorFlow support, it is another steep learning curve.


----------



## Tomgang (Jan 11, 2019)

For me the correct answer would be both "nothing, I am good with current GPU" and "nothing, I am disappointed with current stack of GPUs".

Well, I have a GTX 1080 Ti, so there really isn't anything worth replacing it with. The RTX 2080 has less VRAM, is more expensive, and the performance difference is not worth the trouble. The RTX 2080 Ti is way too expensive, ray tracing costs way too much performance, and the AMD Radeon VII does not impress me either. I'm good with my GTX 1080 Ti.


----------



## phanbuey (Jan 11, 2019)

Tomgang said:


> For me the correct answer would be both "nothing, I am good with current GPU" and "nothing, I am disappointed with current stack of GPUs".
> 
> Well, I have a GTX 1080 Ti, so there really isn't anything worth replacing it with. The RTX 2080 has less VRAM, is more expensive, and the performance difference is not worth the trouble. The RTX 2080 Ti is way too expensive, ray tracing costs way too much performance, and the AMD Radeon VII does not impress me either. I'm good with my GTX 1080 Ti.



The 1080 Ti reminds me of the 8800 GTX a bit...

If you bought one when they first came out, you were set for 2.5-3 years; it was still top dog until the GTX 280 came out.

Until Nvidia goes 7nm, it probably won't be worthwhile to upgrade it.



xkm1948 said:


> 7nm is expensive; 16GB of HBM is also expensive. I think at $699 AMD barely makes money. Radeon 7 feels more like a statement than actual competition - same as Vega and Fury: to show consumers they are still in the high-end game.
> 
> TBH, a limited supply run also seems likely. If you really like AMD, then Radeon 7 feels like the final evolved form of GCN. It is probably gonna be the only high-end option until whatever comes after Navi.
> 
> For computing, it is kinda hard to say. I made quite a lot of attempts at using my old Fury X for OpenCL-based genome alignment acceleration. When you can get it working, it IS VERY FAST. However, the learning curve is quite steep, and there are not a whole lot of bioinformaticians coding in OpenCL whom I can get help from. Maybe it would be better for other people doing ML or AI - but then again, without good TensorFlow support, it is another steep learning curve.



I don't understand why they did that - I am sure they had a good reason to, but 16GB is completely unnecessary at this performance level. I wonder how much they could have saved with just 8GB. A $550 8GB R7 vs a $699 2080 would be a solid matchup.


----------



## xkm1948 (Jan 11, 2019)

Nvidia’s 7nm flagship will probably be where most 1080 Ti owners upgrade - assuming it won't sell for $2000 by then.


----------



## Tomgang (Jan 11, 2019)

phanbuey said:


> The 1080ti reminds me of the 8800gtx a bit...
> 
> If you bought one when they first came out, you were set for 2.5-3 years, it was still top dog until the GTX 280 came out 2.5 years later.
> 
> Until nvidia goes 7nm it probably wont be worthwhile upgrading it.



I think you are spot on there. I got my card in July 2017, and the GTX 1080 Ti was released in March 2017, so I have had my card for around a year and a half now. The RTX 2080/2080 Ti released in September last year, and at the rate Nvidia releases new-gen cards, a replacement for the RTX 2000 cards is coming by the end of 2019 at the earliest - and I think an early 2020 release is probably more plausible. With the current prices RTX goes for, I'm keeping my GTX 1080 Ti for sure. Don't be surprised if I'm still on the GTX 1080 Ti in 2020 as well. So yeah, claiming to keep the card for 3 years seems spot on this time.

If I even get a new card before that, it's probably a second GTX 1080 Ti for some SLI fun. Compared to the RTX cards, the GTX 1080 Ti was really spot on in my opinion when it released, and maybe the best high-end card Nvidia has released: cheap compared to the Titan, plenty of VRAM, and it still performed very close to the Titan at almost half the price. That is not the case with the RTX 2080 Ti. Yes, it's faster than the 1080 Ti, but the price is stupid, it still has the same VRAM size, and I don't need ray tracing.


----------



## Vya Domus (Jan 11, 2019)

xkm1948 said:


> I made quite a lot of attempts at using my old Fury X for OpenCL-based genome alignment acceleration. When you can get it working, it IS VERY FAST. However, the learning curve is quite steep, and there are not a whole lot of bioinformaticians coding in OpenCL.



That's not the fault of AMD, or anyone really. It also doesn't help that the largest GPU provider in the world - which is technically also the largest provider of OpenCL-capable devices - Nvidia, won't support anything past OpenCL 1.2 and provides zero tools for debugging. If they don't bother to make this environment more usable, why would AMD put in the effort?


----------



## HTC (Jan 11, 2019)

Vya Domus said:


> That's not the fault of AMD, or anyone really. It also doesn't help that the largest GPU provider in the world - which is technically also the largest provider of OpenCL-capable devices - Nvidia, won't support anything past OpenCL 1.2 and provides zero tools for debugging. *If they don't bother to make this environment more usable, why would AMD put in the effort?*



To entice more customers to their offerings?


----------



## Zubasa (Jan 11, 2019)

HTC said:


> To entice more customers to their offerings?


Exactly, it is a shame all that compute power is wasted because AMD didn't put in enough effort.



phanbuey said:


> I don't understand why they did that - I am sure they had a good reason to, but 16GB is completely unnecessary at this performance level. I wonder how much they could have saved with just 8GB. A $550 8GB R7 vs a $699 2080 would be a solid matchup.


The thing is, Vega 20 is a 4096-bit GPU that requires 4 stacks of HBM2.
They can either use 4 stacks of 4GB HBM2 - RX Vega already uses 2x 4GB - or somehow source 2GB stacks of HBM2 just to make a single low-volume product.
Nothing in AMD's product stack uses 2GB HBM2 stacks right now; in fact, I am not sure anyone is using 2GB HBM2 stacks at all.
I doubt it would actually save much cost in the end.


----------



## Vya Domus (Jan 11, 2019)

HTC said:


> To entice more customers to their offerings?



It's simply not a burden they should carry exclusively. I reckon they have done enough already.


----------



## Zubasa (Jan 11, 2019)

Vya Domus said:


> It's simply not a burden they should carry exclusively. I reckon they have done enough already.


Nvidia is in the dominant position right now; it is a conflict of interest for them to support OpenCL over CUDA.
CUDA has the bonus of locking down the market for Nvidia so that they can charge whatever they want.



sil3ntearth said:


> Gonna stick with my 8GB 580. It works fine for my 1080p monitor. Until this thing catches fire or stops running 1080p well, I'm happy with it.


My Vega 56 did indeed literally go up in flames.


----------



## sil3ntearth (Jan 11, 2019)

Gonna stick with my 8GB 580. It works fine for my 1080p monitor. Until this thing catches fire or stops running 1080p well, I'm happy with it.



Zubasa said:


> Nvidia is in the dominant position right now; it is a conflict of interest for them to support OpenCL over CUDA.
> CUDA has the bonus of locking down the market for Nvidia so that they can charge whatever they want.
> 
> 
> My Vega 56 did indeed literally go up in flames.



That comes with the territory if you own an AMD GPU.  It's an amazing personal space heater.


----------



## xkm1948 (Jan 11, 2019)

Vya Domus said:


> That's not the fault of AMD, or anyone really. It also doesn't help that the largest GPU provider in the world - which is technically also the largest provider of OpenCL-capable devices - Nvidia, won't support anything past OpenCL 1.2 and provides zero tools for debugging. If they don't bother to make this environment more usable, why would AMD put in the effort?





Vya Domus said:


> It's simply not a burden they should carry exclusively. I reckon they have done enough already.




From the standpoint of us researchers, we need hardware solutions that are relatively easy to code for and optimize. I mean, yeah, if I had multiple computer science grad students helping me optimize OpenCL code, then Radeon's GCN-based solution would be wicked good, as the cards ARE faster at the hardware level. But I am just by myself, and most of my colleagues don't even know how to use R, let alone code in OpenCL.

We are spending taxpayer dollars, so at the end of the day it is also not the researchers'/users' burden to develop more OpenCL-based GPGPU applications.

I hate to say it, but for our bioinformatics-based applications, CUDA and Nvidia's solution "just works".


----------



## phanbuey (Jan 11, 2019)

xkm1948 said:


> From the standpoint of us researchers, we need hardware solutions that are relatively easy to code for and optimize. I mean, yeah, if I had multiple computer science grad students helping me optimize OpenCL code, then Radeon's GCN-based solution would be wicked good, as the cards ARE faster at the hardware level. But I am just by myself, and most of my colleagues don't even know how to use R, let alone code in OpenCL.
> 
> We are spending taxpayer dollars, so at the end of the day it is also not the researchers'/users' burden to develop more OpenCL-based GPGPU applications.
> 
> I hate to say it, but for our bioinformatics-based applications, CUDA and Nvidia's solution "just works".


I keep hearing the same sentiment from the AI crowd.

They're there to solve a problem, not to spend all day trying to get the hardware to run.


----------



## xkm1948 (Jan 11, 2019)

phanbuey said:


> I keep hearing the same sentiment from the AI crowd.
> 
> They're there to solve a problem, not to spend all day trying to get the hardware to run.



Yes, exactly this.

Similarly, this is also why I have built multiple Threadripper systems for multiple molecular biology labs. The performance brought by TR is simply unmatchable for the price we pay. A TR CPU needs no code optimization, as everything is already done at the compiler level and our Python scripts just work right out of the box. Plug in the CPU, plug in the RAM, install Linux, and off we go! Zero minutes wasted on getting things working.


----------



## the54thvoid (Jan 11, 2019)

xkm1948 said:


> Nvidia’s 7nm flagship will probably be where most 1080Ti owners upgrade. Assuming it won’t sell for $2000 by then.



I'm a bit pissed about the price inflation, but AMD's response is pretty shit tbh. For years, people supported AMD for 'keeping it real', but they tried to go toe-to-toe with Nvidia on pricing (Fury X, Vega 64), and now they're doing the same with a top-tier card that's not as good as a 2080 Ti. They've thrown in the towel on the value proposition, which is pretty poor. It's as if they're trying to emulate the CPU sector where, arguably, Intel has reacted fairly aggressively (I can now buy an 8-core/16-thread Intel CPU for £470; it used to be £900). Except Nvidia hasn't done the same, so AMD is stuck with a lower-tier GPU price-matched against Nvidia's 2nd (or 3rd, including the Titan RTX) best GPU.  

Personally, I only game recreationally*, so I won't pay more than £700-800 for an awesome GPU. Radeon 7 is NOT that GPU.  

*mid-forties, so too middling to be good at multiplayer or, as is often the case, too drunk to hit anything.


----------



## Zubasa (Jan 11, 2019)

the54thvoid said:


> I'm a bit pissed about the price inflation but AMD's response is pretty shit tbh. For years, people supported AMD for 'keeping it real' but they tried to go toe-to-toe with Nvidia on pricing (Fury X, Vega 64) and now they're doing the same with their top tier card that's not as good as a 2080ti. They've thrown the towel in on the value proposition which is pretty poor. It's as if they're trying to emulate the CPU sector where, arguably, Intel has reacted fairly aggressively (I can now buy an 8/16 core/thread Intel CPU for £470- used to be £900). Except Nvidia hasn't done the same, so AMD is stuck with lower rate GPU at a price matched comparison with Nvidia's 2nd (or 3rd including Titan RTX) best GPU.
> 
> Personally, I only recreationally game*, so I won't pay more than £700-800 for an awesome GPU. Radeon 7 is NOT that GPU.
> 
> *mid-forties, so too middling to be good at multiplayer or as is often the case - too drunk to hit anything.


If you also include computing tasks, there is also "Poor Volta", which is just straight up better in FP32 and FP64 than the Radeon 7, brute force or not.
It also has a good amount of HBM2, which everyone seems to think automatically means something beyond memory bandwidth.
So yes, that would make the Radeon 7 compete with nVidia's 4th best GPU.


----------



## sil3ntearth (Jan 11, 2019)

the54thvoid said:


> I'm a bit pissed about the price inflation but AMD's response is pretty shit tbh. For years, people supported AMD for 'keeping it real' but they tried to go toe-to-toe with Nvidia on pricing (Fury X, Vega 64) and now they're doing the same with their top tier card that's not as good as a 2080ti. They've thrown the towel in on the value proposition which is pretty poor. It's as if they're trying to emulate the CPU sector where, arguably, Intel has reacted fairly aggressively (I can now buy an 8/16 core/thread Intel CPU for £470- used to be £900). Except Nvidia hasn't done the same, so AMD is stuck with lower rate GPU at a price matched comparison with Nvidia's 2nd (or 3rd including Titan RTX) best GPU.
> 
> Personally, I only recreationally game*, so I won't pay more than £700-800 for an awesome GPU. Radeon 7 is NOT that GPU.
> 
> *mid-forties, so too middling to be good at multiplayer or as is often the case - too drunk to hit anything.



The Radeon 7 also confused me for that reason. It seems to go against everything they've tried to accomplish with the RX, TR, and Ryzen stuff. Realistically, the R7 should cost $400, with higher-end variants coming out down the road. $700 for that card is insane.


----------



## Zubasa (Jan 11, 2019)

sil3ntearth said:


> The Radeon 7 also confused me for that reason.  It seems to go against everything they've tried to accomplish with the RX, TR, and Ryzen stuff.  Realistically, R7 should cost $400 with higher end variants coming out down the road.  $700 for that card is insane.


TBH it wouldn't feel so awkward if they had even priced it at $649 or something.
But no, they have to price it toe to toe with the 2080, which is already regarded as very overpriced.
The fact that Turing does everything Vega does, with extra gimmicks stacked on top, doesn't help either.

Lisa Su spent an entire hour repeating the word gaming 100x, but in the end we got a compute card with no Pro drivers? 



sil3ntearth said:


> It seems to be they priced it at $699 because they figured if Nvidia could do it, they could too which is incredibly disappointing.   Problem is, I don't think the RTX cards (aside from probably the 2060) are all that popular and have only served to hurt their reputation in large part because of the insane price.   This is a terrible time in the history of gaming to be asking people to pay that much for a video card.


Again, for gaming there is nothing the Radeon 7 offers that the 2080 didn't already offer months ago.
It is odd that AMD figures it is fine to sell a card as expensive as nVidia's overpriced offerings but with fewer features.

AMD did a better marketing job for nVidia than even nVidia's own marketing team could come up with.


----------



## sil3ntearth (Jan 11, 2019)

Zubasa said:


> TBH it wouldn't feel so awkward if they even price it at $649 or something.
> But no, they have to price it toe to toe with the 2080 which is already regarded to be very overpriced.



It seems they priced it at $699 because they figured if Nvidia could do it, they could too, which is incredibly disappointing. Problem is, I don't think the RTX cards (aside from probably the 2060) are all that popular, and they have only served to hurt their reputation, in large part because of the insane price. This is a terrible time in the history of gaming to be asking people to pay that much for a video card.



Zubasa said:


> TBH it wouldn't feel so awkward if they even price it at $649 or something.
> But no, they have to price it toe to toe with the 2080 which is already regarded to be very overpriced.
> The fact that Turing does all everything Vega does and has extra gimmicks stack on top doesn't help as well.
> 
> Lisa Su spend an entire hour repeating the word gaming 100x, but in the end we got a compute card with no Pro drivers?



Sure, higher profit margin.


----------



## RealNeil (Jan 11, 2019)

Without actual reviews to look at, this question may be a little too soon to ask.
In theory, I've always found myself rooting for the underdog, AMD. 
This can result in diminishing returns, as we experienced with Bulldozer and others, but supporting AMD has always been a necessary counterpoint to Intel's and NVIDIA's monopolistic tendencies.
We all know what happens without any meaningful competition in the marketplace.

The success of Ryzen 1 and 2 was gratifying to me, and I hope for more of the same. 
Even if this VII GPU doesn't damage NVIDIA's stranglehold on the market, I'll buy a few of them, just to support AMD's efforts.


----------



## sil3ntearth (Jan 11, 2019)

RealNeil said:


> Without actual reviews to see, this question may be a little too soon to ask.
> In theory, I've always found myself rooting for the underdog, AMD.
> This can result in diminishing returns as we experienced with Bulldozer and others, but supporting AMD has always been a necessary counterpoint to Intel and NVIDIA's monopolistic tendencies.
> We all know what happens without any meaningful competition in the marketplace.
> ...



At least for me it's not so much the performance of the R7, but the price point.  I think that's why people are so disappointed.


----------



## Deleted member 178884 (Jan 11, 2019)

It's very interesting: even at the price point it's at, the 2080 here in the UK still costs more. If I were buying a new GPU, it'd be a Radeon 7. However, I'm happy with my 1080 Ti FTW3, and I've just put it under an Alphacool NexXxoS waterblock.


----------



## RealNeil (Jan 11, 2019)

sil3ntearth said:


> At least for me it's not so much the performance of the R7, but the price point.  I think that's why people are so disappointed.


AMD prices seem to adjust after release. I believe it will happen again.


----------



## xkm1948 (Jan 11, 2019)

RealNeil said:


> Without actual reviews to see, this question may be a little too soon to ask.
> In theory, I've always found myself rooting for the underdog, AMD.
> This can result in diminishing returns as we experienced with Bulldozer and others, but supporting AMD has always been a necessary counterpoint to Intel and NVIDIA's monopolistic tendencies.
> We all know what happens without any meaningful competition in the marketplace.
> ...


I am assuming you are going to crossfire then? Would be sweet to see how CF works for Radeon 7


----------



## RealNeil (Jan 11, 2019)

xkm1948 said:


> I am assuming you are going to crossfire then? Would be sweet to see how CF works for Radeon 7


Yes, I probably will.


----------



## xkm1948 (Jan 11, 2019)

RealNeil said:


> Yes, I probably will.



Haven’t checked CF in a while. Do they still need bridges like SLI nowadays?


----------



## vega22 (Jan 11, 2019)

Xfire hasn't needed a bridge since PCIe gen3 became a thing, dude.

Poll is way too premature.


----------



## RealNeil (Jan 12, 2019)

xkm1948 said:


> Haven’t check CF in a while. Do they need bridges as SLI nowadays?


No, they don't. Last time I used Crossfire Bridges was with my R9-280X OC cards.


----------



## TheoneandonlyMrK (Jan 12, 2019)

RealNeil said:


> Without actual reviews to see, this question may be a little too soon to ask.
> In theory, I've always found myself rooting for the underdog, AMD.
> This can result in diminishing returns as we experienced with Bulldozer and others, but supporting AMD has always been a necessary counterpoint to Intel and NVIDIA's monopolistic tendencies.
> We all know what happens without any meaningful competition in the marketplace.
> ...


I agree with waiting for reviews; I wouldn't buy before reading a fair few, personally.

At least they're trying. Let's hope some driver work unlocks some performance too.

I can't help but wonder how it would compare clock for clock with Vega 10.


----------



## xkm1948 (Jan 12, 2019)

theoneandonlymrk said:


> I can't help but wonder how it would compare clock for clock with vega 10.


Asking the real question here. Can one manually disable a certain number of CUs on a Vega 64 to test GPU “IPC”?
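A rough clock-for-clock comparison doesn't strictly require disabling CUs: GCN's peak FP32 throughput scales as CUs × 64 lanes × 2 FLOPs per clock × clock speed. A toy sketch of the paper math (64 and 60 are the published CU counts for Vega 64 and Radeon VII; the common 1.5 GHz clock is an arbitrary assumption for the comparison):

```python
# Rough GCN peak-FP32 model: each CU has 64 shader lanes,
# and each lane can retire a fused multiply-add (2 FLOPs) per clock.
def peak_fp32_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

# Clock-for-clock comparison at an assumed common 1.5 GHz:
vega64 = peak_fp32_tflops(64, 1.5)      # Vega 64: 64 CUs
radeon_vii = peak_fp32_tflops(60, 1.5)  # Radeon VII: 60 CUs

print(f"Vega 64:    {vega64:.2f} TFLOPS")
print(f"Radeon VII: {radeon_vii:.2f} TFLOPS")
print(f"VII/Vega64 ratio: {radeon_vii / vega64:.4f}")
```

Any genuine per-CU ("IPC") improvement would show up as real benchmarks deviating from that ~0.94 paper ratio, though memory bandwidth and driver differences muddy the comparison.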


----------



## eidairaman1 (Jan 12, 2019)

sil3ntearth said:


> Gonna stick with my 8GB 580.  It works fine for my 1080P monitor.  Until this thing starts on fire or stops running 1080P well, I'm happy with it.
> 
> 
> 
> That comes with the territory if you own an AMD GPU.  It's an amazing personal space heater.



Same can be said about RTX 2000 series


----------



## Space Lynx (Jan 12, 2019)

RealNeil said:


> AMD prices seem to adjust after release. I believe it will happen again.



Yeah, the 1800X was $500 at launch, and within 6 months it could be had on eBay, brand new, for around $300 on sale. AMD has historically never held its value like Intel and Nvidia. But I'm with Neil on this one: we need to support AMD a bit longer, I think, because I suspect they will be going toe to toe with the other companies within a couple of years; they just need to optimize Navi on 7nm or replace GCN in late 2020. Also, Intel is entering the GPU race in late 2020... it could be a very interesting time. Nvidia is milking everyone while they can, because they know that by late 2020 or early 2021 it's going to be an entirely different ballgame.


----------



## Poul-erik (Jan 12, 2019)

sepheronx said:


> We are getting raped in prices here in Canada.




Here in Denmark we have to pay 6,200 Danish kroner, which is about $950, for a 2080, and that's when they're on offer.

One US dollar is 6.47 Danish kroner.


----------



## Space Lynx (Jan 12, 2019)

Poul-erik said:


> Here in Denmark we have to pay danish kronen 6200, it is  $ 950 for a 2080 and that's when they are on offer.
> 
> a US dollar is 6.47 Danish kroner




How much money do you make an hour, though? The middle class, I mean. It tends to even out when you look at all of that.


----------



## Poul-erik (Jan 12, 2019)

lynx29 said:


> how much money do you make an hour though? the middle class i mean. it tends to even out when you look at all of that



For the middle class, I will say that here on this little island "Langeland" where I live, we can get $23 US an hour; it is ca. 150 Danish kroner.


----------



## Space Lynx (Jan 12, 2019)

Poul-erik said:


> the middle class, i will say here on this little island "Langeland" where i live we kan get  23$ an hour  US, it is ca. 150 Danish kronen.



And the vast majority of people where I live make $9-13 US an hour, and we have high insurance premiums on top of that. So you still win in the end, by far.


----------



## ppn (Jan 12, 2019)

Every hour of playing games on an overly expensive video card and PC is a loss of whatever you make per hour, and a minimum of $25, because anybody with an IQ of 101 can earn that, even lower if you lay tiles or do hard physical work or something. So the price of video cards is irrelevant; the question is why anybody would be willing to waste the better half of their life doing that.


----------



## Space Lynx (Jan 12, 2019)

ppn said:


> Every hour of playing games on an overly expensive Vcard and PC is a loss of what ever you make per hour, and a minimum of 25$ because everybody with an iq of 101 can earn that, even lower if you tile tiles or do hard physical or something. So the price of videocards is irrelevant, the question is why would anybody be willing to waste the good half of his life doing that.



You underestimate how privileged you are just because you were born in Denmark. I have a Master's degree, and in Indiana the only jobs I can find are still $30k a year. Your government helps the wages in your country a lot. I don't know what laws they passed; I'm just saying that in some countries the living conditions really are a lot better, with more opportunity.


----------



## Poul-erik (Jan 12, 2019)

lynx29 said:


> You underestimate how privileged you are just because you were born in Denmark. I have a Master's Degree in Indiana and only jobs I can find are 30k a year still. Your government helps the wages in your country, a lot. I don't know what laws they passed, just saying, some countries the living conditions really are a lot better and more opportunity.



Perhaps you will not believe it, ha ha, but our government is quite the same as your Republicans. Denmark is a good place to live, but even though we have it all, there are still many in Denmark who think we have it bad. OK, we have had it a little better, but compared to many other countries we are lucky.


----------



## Darmok N Jalad (Jan 12, 2019)

I just bought a used Nitro+ RX 480 (whose specs essentially make it a 580), and I undervolt the GPU and OC the VRAM, so I’m not likely to move on anything unless this card dies. I game on a 4K 50” TV, but I’m fine with 1080p or 1440p so long as I get my 60fps.



Zubasa said:


> Pixie Dust.


Rumor has it Pixie Dust OCs really well if you have good cooling, but I’m waiting for reviews first.


----------



## Vario (Jan 12, 2019)

Going a few years on the 1060 6GB before I upgrade. Maybe 2020 at the earliest but likely 2021+.


----------



## delshay (Jan 14, 2019)

Just to make user(s) aware, there are a few games out there that require more than 8GB of VRAM at 4K max settings. RE2 is one of them, and it needs 14GB of VRAM.

So AMD was right to fit 16GB of HBM2 to the Radeon VII.


----------



## londiste (Jan 14, 2019)

delshay said:


> Just to make user(s) aware, there are few games out there that require more than 8GB of VRAM @4K max settings. RE2 is one of them & it needs 14GB of VRAM.
> 
> So AMD was right to fit 16GB HBM2 to Vega 7..
> 
> View attachment 114524


RE2 does not need 14GB of VRAM. You can make it want to use that much, but that is different from requiring or needing it. There is a setting for Texture Quality which effectively sets the size of the dynamic texture pool. This setting goes from 0.25GB to 8GB. Textures are streamed in and out of this part of allocated memory, and provided that the algorithm for this is halfway decent and the space is even close to enough, this will not cause a performance hit or stutter.

At 1440p with everything maxed, except Texture Quality (the aforementioned dynamic pool) and Image Quality (resolution scaling/supersampling), the VRAM usage is said to be a little over 5GB. With the Texture Quality setting at 2GB (High), the VRAM usage is shown as 7.10GB. I ran through most of the areas in the demo with these settings, and the maximum VRAM usage I logged was 6.3GB.

Granted, 4K will affect VRAM usage, and with resolution scaling the "Image Quality" setting will do the same. In 4K the memory usage was shown somewhere north of 6GB, and 1440p with 200% Image Quality was close to 7GB. From what I can see, the Radeon VII/GTX 1080 Ti/RTX 2080 performance class will not be good for 4K anyway. I played a little on an RTX 2080 (8GB VRAM), and at 1440p with the settings described above it ran at 50-70 fps. Changing the Image Quality setting scaled as expected for resolution scaling: 50% (720p) ran at ~150fps and 200% (5K-ish) ran under 30fps. No stutters or anything at any of these resolutions.
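The streaming-pool behaviour described above can be illustrated with a toy LRU cache. Everything below (the class, texture names, and sizes) is made up for illustration; it is a sketch of the streaming idea, not RE2's actual allocator:

```python
from collections import OrderedDict

class TexturePool:
    """Toy fixed-size texture pool: streams textures in,
    evicting the least-recently-used ones when the budget is exceeded."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.cache = OrderedDict()  # name -> size_mb, oldest first

    def request(self, name, size_mb):
        if name in self.cache:            # already resident: mark as recently used
            self.cache.move_to_end(name)
            return "hit"
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)  # evict the LRU texture
            self.used_mb -= evicted_mb
        self.cache[name] = size_mb        # stream the texture in
        self.used_mb += size_mb
        return "streamed"

pool = TexturePool(budget_mb=2048)       # the "High (2GB)" setting
pool.request("wall_diffuse", 512)
pool.request("floor_normal", 1024)
pool.request("zombie_albedo", 1024)      # forces eviction of wall_diffuse
print(pool.used_mb, list(pool.cache))
```

The takeaway: a bigger pool setting only reduces evictions and re-streaming; the game still runs inside a smaller budget, which is why "uses 14GB" is not the same as "needs 14GB".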


----------



## delshay (Jan 14, 2019)

londiste said:


> RE2 does not need 14gb of VRAM. You can make it want to use that much but that is different from requiring or needing it. There is a setting for Texture Quality which effectively sets the size of dynamic texture pool. This setting goes from 0.25GB to 8GB. Textures are streamed in and out of this part of allocated memory and provided that the algorithm for this is halfway decent and space is even close to enough this will not cause performance hit or stutter.
> 
> At 1440p with everything maxed - except Texture Quality (the aforementioned dynamic pool) and Image Quality (resolution scaling/supersampling) - the ram usage is said to be a little over 5 GB. With Texture Quality setting at 2GB (High), the RAM usage is shown to be 7.10 GB. Ran through most of the areas in the demo with these settings and the maximum VRAM usage I logged was 6.3 GB.
> 
> Granted, 4K will affect VRAM usage and with resolution scaling the "Image Quality" setting will do the same. In 4K the memory usage was shown somewhere north of 6GB and 1440p with 200% Image Quality had close to 7GB. From what I can see, Radeon7/GTX1080Ti/RTX2080 performance class will not be good for 4K anyway. Played a little on RTX2080 (8GB VRAM) and at 1440p with settings described above it ran from 50-70 fps. Changing the Image Quality setting scaled as expected for resolution scaling. 50% (720p) ran at ~150fps and 200% (5K-ish) ran under 30 FPS. No stutters or anything in any of these resolutions.



Thanks for the reply. I got this info and screenshot from a thread on Reddit. I've not tested the demo myself, but I will look into it.


----------



## Recus (Jan 14, 2019)

delshay said:


> Just to make user(s) aware, there are few games out there that require more than 8GB of VRAM @4K max settings. RE2 is one of them & it needs 14GB of VRAM.
> 
> So AMD was right to fit 16GB HBM2 to Vega 7..
> 
> ...



AMD just bribes developers to use more VRAM.

Digital Foundry's review will probably show there is no big difference between the console and PC version textures.

As for the poll: 28% more performance for a 40% higher price. It's a no-brainer to skip 7nm and Turing and wait for 7nm EUV.


----------



## delshay (Jan 14, 2019)

Division 2 is another game the Reddit thread claims needs 11GB of VRAM in order to play at 4K max settings. Can someone please check and provide a screenshot if possible?


----------



## eidairaman1 (Jan 14, 2019)

delshay said:


> Just to make user(s) aware, there are few games out there that require more than 8GB of VRAM @4K max settings. RE2 is one of them & it needs 14GB of VRAM.
> 
> So AMD was right to fit 16GB HBM2 to Vega 7..
> 
> View attachment 114524



This just happens to be a Frontier Edition/Instinct accelerator



Recus said:


> AMD just brides developer to use more VRAM.



Baseless statement


----------



## Vayra86 (Jan 18, 2019)

vega22 said:


> xfire hasn't needed a bridge since pcie gen3 became a thing dude.
> 
> poll is way too premature.



I think the poll results are interesting: a vast majority is in no need of an upgrade. Says a lot about the incentive the current gen of cards provides, I think. And of course you can easily assume that half of that Radeon 7 vote is actually trolling.


----------



## delshay (Feb 13, 2019)

Optional Firmware update released.

https://www.amd.com/en/support/radeonvii-vbios-eula


----------



## ArbitraryAffection (Feb 13, 2019)

20 series and Radeon VII are a whole lot of 'meh'. Too expensive and not enough performance uplift. I voted for I am disappointed with the current stack of GPUs.


----------



## Dbiggs9 (Feb 13, 2019)

I should have my VII this Friday. Replacing a GTX 770


----------



## purecain (Feb 13, 2019)

Good luck with the silicon lottery. I hope you get a really nice example...
I look forward to seeing the results on here.


----------



## John Naylor (Feb 14, 2019)

I never buy anything within 3 months of release... so there will be more data available by then... but looking at what we know now from TPU's review...

As of now, per TPU's test over 21 games, the 2080 is 14% faster overall, so no... it's not competing with the 2080. The 2070 is the best comparison, being 94% as fast as the Radeon VII.

Compared to a 2070 in a new build, it's offering us 6% more performance than a reference 2070, or 2% more than the MSI Gaming Z, and they both OC the same 8.2%... so what might offset that 2%? My son is planning a May build, perhaps with a 2070 at 1440p... so things to consider:

a)  He'll need a PSU 80-100 watts larger. My go-to would be a Seasonic Focus Plus Gold*, and moving from a 650 to a 750 is +$20 (*the VII may have an issue with these PSUs). With an EVGA G3, the difference is +$13... a Corsair RMx is +$60.
b)  With the extra heat, he'll want an extra 140mm fan (we base fan count on one 140mm, 1200 rpm fan per every 75-100 watts); add $15.
c)  It uses 80 watts more in gaming, and at 30 hours a week of gaming that's $34 a year, or $136 over 4 years, for me... I pay far more than the US average, so call it $62 for the average Joe/Josephine.
d)  It's 30 dBA louder at idle and 13 dBA louder under load (that's about 2.5 times as loud): a deal killer for me, but my son uses headphones and maybe won't care.
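The electricity figure in (c) is easy to reproduce. The 80 W delta and 30 hours/week come from the post; the two $/kWh rates below are assumptions picked to roughly match the quoted dollar amounts, not published tariffs:

```python
# Running-cost delta for a card that draws extra watts during gaming.
def power_cost(delta_watts, hours_per_week, usd_per_kwh, years):
    kwh = delta_watts / 1000 * hours_per_week * 52 * years
    return kwh * usd_per_kwh

# 80 W extra at 30 h/week of gaming, over 4 years:
high_rate = power_cost(80, 30, 0.27, 4)   # roughly the poster's rate
avg_rate  = power_cost(80, 30, 0.125, 4)  # roughly the US average rate
print(f"4-year cost at $0.27/kWh:  ${high_rate:.0f}")
print(f"4-year cost at $0.125/kWh: ${avg_rate:.0f}")
```

At those assumed rates the 4-year totals come out near $135 and $62, in line with the figures in the post.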

Since the OP's question is "what would ***I*** buy"...

The 2070 Gaming Z is $550... the Radeon VII is $700 "on paper"... but with all costs considered, you gotta add $20 for the larger PSU + $15 for the extra fan + $136 for the extra electricity, a total of $171, bringing my total investment (here) to $871. The question then, ***for me***, from a financial perspective, is: is an average 2% fps performance gain worth $321? The truth is I'd never get that far to evaluate it. I recognize that many won't care and pay half the electric rates I do, but the noise thing is a deal killer for me. On top of that, at comparable sound levels, we could:

a)  Drop the $45 Scythe Fuma (-$45) from his planned build.
b)  Buy a 3x120mm Swiftech AIO for the CPU (+$165).
c)  Buy a water block for the 2070 (+$140).
d)  Buy extra coolant, tubing and fittings for the card (+$35).

I'd expect a delta-T of 7C (water 30 to ambient 23) with fans at max speed under 100% load... 10C at 1600 rpm is more likely when stress testing. In peak gaming, I expect the fans to stay at 38 to 40 dBA. Way too high for my taste, but my son might go for it since he wears headphones and may be inclined toward the RGB blingie bling. However, I'm not expecting much improvement on water from either of these cards. I'm sure we'll know before the 3 months is up, though. The water cooling, I expect, will make up some but not all of that 2%, but we'd still be $26 ahead of the Radeon VII alternative with a quieter 2070 installation.

But no, it's not a purchase I would or have to make today... I would wait out the 3-month "let's see what happens" period, but it doesn't apply since he's not building till the end of May anyway. It will be interesting to see what the MSI version does, particularly whether it has passive 0 dBA cooling at idle speeds and whether driver and/or AIB improvements pick up the performance. But from what I see now... they'd have to drop the price to at least $550, pick up performance, vastly improve the noise issue and, finally, provide gaming bundles or other incentives to offset the ancillary costs to make it attractive.


----------



## purecain (Feb 25, 2019)

The performance is decent (if erratic) at 2080 levels... I would support AMD if I weren't already happy. The power draw is high, but the performance is good. 
I hope as many people as possible choose the Radeon 7.


----------



## GamerGuy (Feb 25, 2019)

Heh, how could I have missed this thread? I got a Radeon VII the day after it was released; a few were physically available in the various shops in the IT mall in my neck of the woods. I love it! AMD fanboy I am! I also have 3x Vega 64 (one was RMA'ed, and the replacement card is in my brother's apartment in Toronto), as well as a Leadtek GTX 1080 (just to center myself). Actually, when a card is this good and cheap at the price I got it for, it's a no-brainer.


----------



## cucker tarlson (Feb 25, 2019)

GamerGuy said:


> Heh, how could I have missed this thread?   I had gotten a Radeon VII on the day after it was released, a few were physically available in the various shops in the IT mall in my neck of the woods. I love it! AMD fanboy I am! I also have 3x VEGA64 (one's RMA'ed, the replacement card's in my brother's apartment in Toronto), as well as a Leadtek GTX1080 (just to center myself  actually when a card is this good and cheap at the price I'd gotten it, it's a no brainer)


In reality, how does the AMD cooling on the VII compare to quality AIB coolers on Vega?


----------



## GamerGuy (Feb 25, 2019)

cucker tarlson said:


> in reality,how does the amd cooling on vii compare to quality aib ones on vega ?


Ugh, I'm replaying Metro Exodus on my CF Vega 64 rig (I finished it on my Radeon VII rig, got a bad ending, so I'm playing it again, but less Rambo style), so I haven't used my other rig. I'll play Strange Brigade and let you know what sort of temps I get. It is a tad loud and I have set a custom fan curve, but I recall not exceeding 72C, if memory serves.


----------



## las (Feb 25, 2019)

Waiting for Nvidia 3000 series @ 7nm+

Maybe AMD surprises with Navi


----------



## RealNeil (Feb 25, 2019)

Navi is what I'll be buying next and it will be a pair of them. Then I'll sell the two Vega-64s.


----------



## zenlaserman (Feb 25, 2019)

As much as I admire powerhouses like the easy max-IQ gameability of the 2080 Ti and the Radeon VII, flagship king of (usually) underutilized GCN, I am by nature a modest fellow. I used to be a die-hard PC enthusiast, spending more money than I should have on PC hardware I didn't really need... then many years passed, and I realized nothing got better except the attempts to glue your face to a monitor.

Like many members of this forum, I'm one of those kinda people with an encyclopedic knowledge of PCs that makes normal people do a Bill the Cat face. But maybe a time's gonna come when you realize that real life has better graphics than any game you will ever play while you are alive, and if you think that is a lie, you're a noob at life. I know men sick of real-life war, and I was sick of shooters when Q3 was already old. Hardcore gamers are some of the most selfish people around; I know from experience.

All that said, when I game, I game with an RX 460 2GB I found for $90 on Amazon about 2 years ago. I love it.
I have a 1080p monitor and run most games at 1080p or 720p on Medium. Undervolted and no power connector.


----------



## the54thvoid (Feb 25, 2019)

zenlaserman said:


> I am by nature a modest fellow.





zenlaserman said:


> I'm one of those kinda people with an encyclopedic knowledge of PCs that makes normal people do a Bill the Cat face.



Modest you say? As for the remarks about 'noobs at life', many people can appreciate both highly immersive gaming experiences and the real world around us.

You can be both into realistic graphics and real life.


----------



## Vayra86 (Feb 25, 2019)

zenlaserman said:


> As much as I admire powerhouses like the easy max-IQ gameability of the 2080Ti and the Radeon VII, flagship king of (usually) underutilized GCN, I am by nature a modest fellow.  I used to be a die-hard PC enthusiast, spending more money I should have on PC hardware I didn't really need....then many years passed and I realized nothing got better but attempts to glue your face to a monitor.
> 
> Like many members of this forum, I'm one of those kinda people with an encyclopedic knowledge of PCs that makes normal people do a Bill the Cat face. But maybe, a time's gonna come when you realize that real life has better graphics than any game you will ever play while you are alive, and if you think that is a lie, you're a noob at life.  I know men sick of real-life war, and I was sick of shooters when Q3 was already old.  Hardcore gamers are some of the most selfish people around, I know from experience.
> 
> ...



To me this reads more as 'gaming is old to you' and you're tired of being in that upgrade rat race. Though it's a nice, refreshing bit of input; I can appreciate that, for sure. Some perspective as well on our relentless chase for realistic graphics, indeed.


----------



## zenlaserman (Feb 25, 2019)

Normally, I don't have much to say in here; it'll get drowned out by those who can out-piss-and-moan me. But I have a lil bit of time.



the54thvoid said:


> Modest you say?



I like to think of myself that way. I drive a 35-year-old German car that I maintain myself, with 300K miles, which still manages 30MPG when I can be a fast asshole and draft fast assholes at 85MPH; a 56-year-old truck that I've owned for 25 years. I've never had a credit card or smartphone, and I live in a portable tiny house I built. The 2-stoplight town I live in has nothing but 25MPH speed limits and rolling hills all around, so I just use my bicycle to go grocery shopping. I'm not snooty, not a hipster, and not looking down my nose at anyone. I try to stay humble and be actually useful. It took me a long time, but I learned that I always felt happiest when I'm helping other people.

I'm just lamenting the greed of people today and contributing a lil bit to the whine.



> As for the remarks about 'noobs at life', many people can appreciate both highly immersive gaming experiences and the real world around us.



Sure, I remember throwing coins at arcade machines and getting immersed. I learned to drive manual like a boss on Hard Drivin'. I 'member when it took a $3k PC to run Quake III Arena over 60fps at 1600x1200, and the giggles I had when I ran it at over 700 with an OC'd E4300 in friggin' software mode. I still remember the first time I played through Resident Evil on the PlayStation; a video game had never scared me before. FEAR quickly did it again. There are many great games, and there's a helluva lotta worse and more expensive things one can do than play video games. I started playing them in 1980.



> You can be both into realistic graphics and real life.



You need to get out more. Your thinking appears to be binary, and therefore based on mutual exclusivity. Seriously, I've left and come back to this forum so many times over the years, only to see your ass still posting and steadily turning into some kind of borg; it's like you forget there are things beyond 0 and 1. I've enjoyed many of your posts over the years, dude. But anyway, my point is that graphics are irrelevant when it comes to a good game. C'mon, Pong? Too many games rehash the same old shit over and over, just polish it up more, and get people to buy ever more wasted computing power to do it. It doesn't matter how power-efficient the architecture is when you get down to looking at how many transistors, ticking over a billion times a second, it takes to make a shadow look pretty. Just for scale, 1 billion seconds is nearly 32 years.



Vayra86 said:


> To me this reads more as 'gaming is old to you' and you're tired of being in that upgrade/rat race. Though its a nice refreshing bit of input, can appreciate that for sure. Some perspective as well in our relentless chase to realistic graphics, indeed.



Thanks. I know there are old timers here who 'member this whole hardware rat race 10-15 years ago; much has changed, but then, much has NOT. 10 years ago, a 10 y/o PC was a 1GHz single core. Now, a 10 y/o PC can still be relevant most of the time. Sure, we have some amazingly pretty graphics and OMG cores now, but the GPU power required to run all that eye candy at high res seems like such a waste. Realistic hair and particle storms, bleh. I mean, when we first saw 1-billion-transistor GPUs, that was something special. The original Far Cry, Crysis... even GTA IV blew my mind visually, not to mention Skyrim with mods. Everything beyond that seems superfluous; gameplay is largely the same as it ever was.

....and most of the deskchair-CEO kids and elitist hardware snobs spouting "nVidia should have done this" and "AMD should have done that" don't realize that most of this shit is planned out 1½-2 years in advance. As a result, it's ego-driven ultracrepidarian spam, spam, spam all over the place in today's forums. Now if you'll excuse me, I think I saw someone on my lawn! *shuffles off grumbling*


----------



## cucker tarlson (Feb 25, 2019)

zenlaserman said:


> As much as I admire powerhouses like the easy max-IQ gameability of the 2080 Ti and the Radeon VII, flagship king of (usually) underutilized GCN, I am by nature a modest fellow.  I used to be a die-hard PC enthusiast, spending more money than I should have on PC hardware I didn't really need... then many years passed and I realized nothing got better but the attempts to glue your face to a monitor.
> 
> Like many members of this forum, I'm one of those kinda people with an encyclopedic knowledge of PCs that makes normal people do a Bill the Cat face. But maybe, a time's gonna come when you realize that real life has better graphics than any game you will ever play while you are alive, and if you think that is a lie, you're a noob at life.  I know men sick of real-life war, and I was sick of shooters when Q3 was already old.  Hardcore gamers are some of the most selfish people around, I know from experience.
> 
> ...


I haven't had enough of the chase for the best IQ and framerate yet, but at some point there'll have to come a time when all I care about in a game is the writing and mechanics, and I'll enjoy that with a low-budget rig too.
I had a long period in my life when I stopped buying enthusiast hardware, but I lost interest in gaming completely during that time too. It's tough to stand on your own feet here in PL; when you're young in my country, you are pretty much f****ed for years before you can really get comfortable enough to get at least some of your life's passions back.


----------



## Super XP (Feb 25, 2019)

zenlaserman said:


> As much as I admire powerhouses like the easy max-IQ gameability of the 2080 Ti and the Radeon VII, flagship king of (usually) underutilized GCN, I am by nature a modest fellow.  I used to be a die-hard PC enthusiast, spending more money than I should have on PC hardware I didn't really need... then many years passed and I realized nothing got better but the attempts to glue your face to a monitor.
> 
> Like many members of this forum, I'm one of those kinda people with an encyclopedic knowledge of PCs that makes normal people do a Bill the Cat face. But maybe, a time's gonna come when you realize that real life has better graphics than any game you will ever play while you are alive, and if you think that is a lie, you're a noob at life.  I know men sick of real-life war, and I was sick of shooters when Q3 was already old.  Hardcore gamers are some of the most selfish people around, I know from experience.
> 
> ...


Fair enough. For me, picture quality in 1440p PC gaming is the highest of priorities. That is why I'm using an RX 580 8GB GPU; even my previous Radeon GPU ensured the best PQ for gaming.

My next upgrade will be a Navi GPU (the RX 500 series replacement), though I'll check reviews, including TPU's, before a final decision. All I'm looking for is a performance boost over my RX 580, though I'm sure we will see 2080-like performance for about $250 to $300 if the rumours & speculation end up being right.


----------



## overvolted (Feb 25, 2019)

I used to feel compelled to always buy the best card out there. Once the novelty wore off and I realized that all I was really getting out of it in practice was a better 3DMark score, I decided to go with what actually does the job I need it to do.

Honestly, I see no point in spending 700 dollars on a video card either. That's just ridiculous.
This time around, I went with an ASUS Strix RTX 2060. Open box even...for 320 bucks.

Extremely happy and no buyers remorse. It's an awesome performer.
For 1440p gaming, that's all you need. By the time it struggles with anything, you'll probably have a completely new machine anyway.


----------



## moproblems99 (Feb 26, 2019)

Super XP said:


> That is why I'm using an RX 580 8GB GPU



1440p, with a 580?  Picture quality?  I could see 1080p, but I'm having a hard time seeing success there.  I wouldn't run anything less than my Vega 56 at 1440p.  Can't imagine a 580.



Super XP said:


> I'm sure we will see 2080-like performance for about $250 to $300 if the rumours & speculation end up being right.



I dunno about that.  2070 Ti-level performance costs $699 from AMD.  I have some concerns about Navi now.


----------

