
AMD Announces the Radeon VII Graphics Card: Beats GeForce RTX 2080

It has about the same performance as the 2080, matching its price, with no ray tracing support?

I was hoping AMD would bring a real threat to Nvidia's inflated prices, but that's not the case...

I know this is because of the amount of HBM on this card, but they have a new fab process, which means they can carve more GPUs out of a silicon wafer, and yet they chose to keep the CU count smaller than Vega 64 (or equal, in a future XT version?), when they could've increased sheer rasterizing performance by taking advantage of the smaller node, beating the 2080 to a pulp and matching or surpassing the 2080 Ti. I mean, I get they're not betting on ray tracing, and that's reasonable, but why not push the rasterizing envelope then?

Idk, same performance at same price, more memory, less features... It's gonna be an uphill battle.

An ideal scenario, in my opinion, would've been: Vega VII with 50 more CUs, thanks to the ability to cram more transistors into a smaller space, and a lower price than the 2080 by matching its 8GB buffer. Now that would've made Nvidia shit their pants...

Idk, maybe next year

Doesn't it have 2x more ROPs than before?

Very little investment went into Vega with the die shrink to be competitive. They didn't have to invest 10+ years like the competition did for around a 20% uplift. If you look at both camps' architectures, they've been using prosumer designs and just tweaking them along the way.
 
AdoredTV apparently is pissed off by the Radeon VII as well. When a diehard AMD fan gets angry with your GPU and calls it "shit", you know how f*cked up the product is.

 
Doesn't it have 2x more ROPs than before?

I think we can both agree that even doubling the number of ROPs won't match the sheer performance gain from adding more CUs, especially since these new CUs will run at a higher clock.

Instead, we see a regression in the number of CUs and, yes, double the ROPs, but no clear gains from an almost 40% smaller fab process.

It's mind-boggling. Nvidia screwed up by dedicating more silicon to RT and tensor units, silicon that could've gone to more CUDA cores and more performance, but they bet the farm on ray tracing and DLSS.

If you ask me, AMD is making a very similar mistake: fewer CUs, but double the HBM2, raising the price to the point where they can't be competitive with Nvidia like they've always been.

A missed opportunity for both camps, but hindsight is 20/20.
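For what it's worth, the "almost 40% smaller" point can be sanity-checked with some back-of-envelope math. The die sizes below are the commonly reported figures for Vega 10 and Vega 20, not official numbers from this announcement, so treat them as assumptions:

```python
# Rough die-shrink math using commonly reported die sizes (assumed figures)
vega10_mm2 = 495   # Vega 64 (14 nm)
vega20_mm2 = 331   # Radeon VII (7 nm)

shrink = 1 - vega20_mm2 / vega10_mm2
print(f"{shrink:.0%} smaller die for a similar CU count")  # ~33% smaller
```

So the die shrank by roughly a third while the CU count actually went down, which is the core of the complaint above.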
 
$700 for a shrunk Vega doesn't have me excited as a Vega owner. 16GB of HBM2 sounds great, as does the triple-fan setup. I'll wait for the reviews, but more money for old technology won't have me reaching for my wallet.
 
I think we can both agree that even doubling the number of ROPs won't match the sheer performance gain from adding more CUs, especially since these new CUs will run at a higher clock.

Instead, we see a regression in the number of CUs and, yes, double the ROPs, but no clear gains from an almost 40% smaller fab process.

It's mind-boggling. Nvidia screwed up by dedicating more silicon to RT and tensor units, silicon that could've gone to more CUDA cores and more performance, but they bet the farm on ray tracing and DLSS.

If you ask me, AMD is making a very similar mistake: fewer CUs, but double the HBM2, raising the price to the point where they can't be competitive with Nvidia like they've always been.

A missed opportunity for both camps, but hindsight is 20/20.

Tbh, this looks like AMD taking advantage of the situation to unload some inventory that it was otherwise going to write off.
 
AdoredTV apparently is pissed off by the Radeon VII as well. When a diehard AMD fan gets angry with your GPU and calls it "shit", you know how f*cked up the product is.

Naw, he just knew it wasn't anything to get excited about, which was pretty well understood months ago when they called it "Vega 20." Die shrinks are never miracle workers.

Tbh, this looks like AMD taking advantage of the situation to unload some inventory that it was otherwise going to write off.
The 60-CU count certainly suggests that. All of the good chips are binned for Radeon Instinct products. If Radeon Instinct demand falls off, we might see AMD debut a 64-CU gaming card.


AMD is betting the gaming house on Navi.
 
Well, at least I wasn't expecting anything great from AMD for a while. Still, this is good because it'll force price reductions (fingers crossed).
 
Tbh, this looks like AMD taking advantage of the situation to unload some inventory that it was otherwise going to write off.

Who knows. I mean, they probably have enough Radeon Instinct MI50 GPUs around to make it profitable to sell them in mainstream cards.

You might be onto something.
 
The CEO states it's competitive with the RTX 2080, not that it outright beats it (like the title of this news post says), except in one Vulkan benchmark, and they strategically made sure that benchmark wasn't Wolfenstein II; Turing cards show very strong performance in Wolfenstein II, especially with VRS.

A shrunk Vega splitting the difference in CUs, double the memory, overclocked to use the same 300W... double the ROPs too, yet not a lot to show for it? Well, it's a stopgap, a cheap one that cost them little to no R&D, and they get to use dies that didn't make the cut for Instinct products. Not a whoooole lot to get excited about, especially since they're following Nvidia's price point for that level of performance, without RT/DLSS/VRS(?) and at significantly higher power consumption, and thus probably more heat to dissipate.

I do believe, however, this card will do decently as the resolution increases; as a vague example, let's say it might trail a 2080 by 5% at 1440p but lead a 2080 by 5% at 2160p across more than 3 games. These vendor slides are always whimsical as hell; Nvidia and AMD both do it.

As always I eagerly await W1z's full review before truly being able to judge the card against the competition.
 
Well, this will likely be my card coming up. We'll see. Wish it was only 8GB and about $600 but whatever, hope they are making a good chunk on them. I can't imagine they are with the cost of HBM.
 
I'm thrilled that for at least once it looks like AMD has gotten rid of their squirrel cage blowers on their reference cards.
 
It doesn't have Nvidia's RTX-comparable tech, but at least it has 16GB of HBM that some people will love.
 
So people want AMD's 7nm to compete against Nvidia's 12nm???

Just because they're ahead in getting a contract for TSMC's 7nm doesn't mean they're better...

Only when Nvidia releases 7nm will it be a real comparison.
 
The RTX 2080 has an 8-pin + 6-pin. Not that 2x8-pin means a 375W TDP; e.g., the RTX 2080 Ti has two of them, and the TDP for that is 260/250W (FE/reference).
Sorry, I went by the first 2080 review I found, which was the Zotac...
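To make the connector point concrete: those figures come from the per-connector limits in the PCIe spec (75W from the slot, 75W per 6-pin, 150W per 8-pin), which are ceilings on what the board *can* draw, not what it *does* draw. A quick sketch:

```python
# PCIe power-delivery ceilings per the spec (maximums, not actual draw):
CONNECTOR_W = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(*connectors):
    """Sum the slot's spec limit plus each auxiliary connector's limit."""
    return CONNECTOR_W["slot"] + sum(CONNECTOR_W[c] for c in connectors)

print(max_board_power("8pin", "8pin"))  # 2x8-pin card: 375 W ceiling
print(max_board_power("8pin", "6pin"))  # 8+6-pin card: 300 W ceiling
```

Which is why a 2x8-pin 2080 Ti can still have a 250-260W TDP: the connectors just leave headroom.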
 
AMD is betting the gaming house on Navi.

This day turned out to be fun. Jensen comes off as angry with AMD and Intel in some of his interviews.


Lisa isn't responding with hate, but she did drop some interesting tidbits in there:

“I think ray tracing is an important technology, and it’s something we’re working on as well, both from a hardware and software standpoint,” Su said. “The most important thing, and that’s why we talk so much about the development community, is technology for technology’s sake is okay, but technology done together with partners who are fully engaged is really important.”
Nvidia has received some criticism from enthusiasts concerning the price of its RTX cards and the relative lack of game support at present. Su indicated that building a development ecosystem was important.
Later, Su expanded on her thought. “I don’t think we should say that we are ‘waiting,’” Su said, in response to this reporter’s question. “We are deep in development, and that development is concurrent between hardware and software.”
“The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready,” Su added.

Might allude to the recent announcement that the PlayStation's upcoming Gran Turismo was working on in-house ray tracing.
 
Why 60 CU instead of 64 CU? AMD giving themselves wiggle room in terms of manufacturing? Or was it designed with 60 from the start to make room for more memory controllers?
It's not like those extra CUs do anything but make the card run hotter and pull more power.
Just look at Vega 56 vs 64, they perform the same at equal clocks 99% of the time.
Hell, even the Fury vs Fury X is like that.
These chips are mostly geometry-limited in games; throwing more CUs at them won't make anything better.
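The Vega 56 vs 64 comparison can be put in numbers. On paper, GCN FP32 throughput scales linearly with CU count (64 shaders per CU, 2 FLOPs per clock), so at equal clocks Vega 64 should be ~14% faster; the point above is that this paper gap largely fails to show up in games. The 1.5 GHz clock below is just an illustrative assumption:

```python
# Theoretical FP32 throughput: CUs x 64 shaders/CU x 2 FLOPs/clock x clock (GHz)
def fp32_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

vega56 = fp32_tflops(56, 1.5)  # Vega 56 at an illustrative 1.5 GHz
vega64 = fp32_tflops(64, 1.5)  # Vega 64 at the same clock
print(vega56, vega64)          # 10.752 12.288 -> ~14% more TFLOPS on paper
```

If games were compute-bound on these chips, that ~14% would translate into frame rate; since it mostly doesn't, the geometry-limited argument holds.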
 
It's not like those extra CUs do anything but make the card run hotter and pull more power.
Just look at Vega 56 vs 64, they perform the same at equal clocks 99% of the time.
Hell, even the Fury vs Fury X is like that.
These chips are mostly geometry-limited in games; throwing more CUs at them won't make anything better.

Yeah, GCN is too compute-focused instead of graphics-focused. Sadly, very few developers use GCN for compute, as CUDA dominates the GPU acceleration market.


But hey, I bet this Radeon VII mines crypto coins like no tomorrow! 16GB of freaking HBM2 at 1TB/s bandwidth?? If crypto coins take off again, these cards will sell like gold!
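That 1TB/s figure checks out from the memory config: four HBM2 stacks, each on a 1024-bit bus. The 2 Gbps per-pin rate below is an assumed effective transfer rate consistent with the quoted bandwidth:

```python
# Back-of-envelope HBM2 bandwidth: stacks x bus width x per-pin rate
stacks = 4
bus_width_bits = 1024   # per HBM2 stack
data_rate_gbps = 2.0    # assumed effective transfer rate per pin

bandwidth_gbs = stacks * bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(bandwidth_gbs)  # 1024.0 GB/s, i.e. the quoted ~1 TB/s
```

That bandwidth, plus the 16GB capacity, is exactly what memory-hard mining algorithms care about.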
 
TBH, power consumption is literally the last thing I care about. I can run my card at 240W, or now, with the new Wattman, 300+W. At the end of the day it's about performance. If it took 450W I still wouldn't care, and assuming 2x8-pin automatically means high power consumption is just an assumption at this point. My card on release had 3x8-pin, but that card no longer exists and is now a 2x8. Bottom line for me: power consumption means very little and actual performance matters a lot. The difference is literally pennies a year, so why should I care?
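Whether the difference is really "pennies" depends on your usage and local electricity rate; both numbers below are assumptions you can swap for your own:

```python
# Yearly cost of extra GPU power draw. Hours/day and $/kWh are assumptions.
def yearly_cost_usd(extra_watts, hours_per_day=2.0, usd_per_kwh=0.13):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(round(yearly_cost_usd(60), 2))   # a ~60 W delta at 2 h/day of gaming
print(round(yearly_cost_usd(150), 2))  # a worst-case ~150 W delta
```

At casual gaming hours the gap is a few dollars a year; heavy daily use or expensive electricity pushes it higher, but it stays small next to the card's price.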
 
Might allude to the recent announcement that the PlayStation's upcoming Gran Turismo was working on in-house ray tracing.
I think the only way that's true is if PlayStation 5 is completely ray-traced. It's not impossible... but it is unlikely. If it is, NVIDIA is going to lose at its own game. Sony wouldn't be interested in anything that can't do 10+ GR/s for less than 100W. NVIDIA is using in excess of 200W to get that result. Sony would also be concerned about the cost side of it and would reject huge, monolithic dies like Turing.
 
AdoredTV apparently is pissed off by the Radeon VII as well. When a diehard AMD fan gets angry with your GPU and calls it "shit", you know how f*cked up the product is.

seeing this hack so pissed is literally the best thing about Radeon 7 :roll:

The card is good for now, if it can really match the 2080 and has 16GB onboard. The problem will be 2020, when 7nm Nvidia will have NO competition. NONE.
 
seeing this hack so pissed is literally the best thing about Radeon 7 :roll:

The card is good for now, if it can really match the 2080 and has 16GB onboard. The problem will be 2020, when 7nm Nvidia will have NO competition. NONE.
We know little about Navi, which will also be 7nm. Slate AMD all you like, but they're ahead of the game on process node.
 
We know little about Navi, which will also be 7nm. Slate AMD all you like, but they're ahead of the game on process node.
Which gives them Nvidia's 16nm-gen performance at the enthusiast level (Vega VII vs. 1080 Ti), and probably Nvidia's 12nm-gen performance with 7nm Navi, if it can match the 2060/2070.
 
Depending on the FP64 performance, this could be a hell of a card for compute, though it would have been perfect if it came with PCIe 4.0.
 