# AMD Radeon RX 7900 XTX



## W1zzard (Dec 12, 2022)

Navi 31 is here! The new $999 Radeon RX 7900 XTX in this review is AMD's new flagship card based on the wonderful chiplet technology that made the Ryzen Effect possible. In our testing we can confirm that the new RX 7900 XTX is indeed faster than the GeForce RTX 4080, but only with RT disabled.

*Show full review*


----------



## Lightofhonor (Dec 12, 2022)

Performance is a lot closer to 4080 than I had imagined.


----------



## AirplaneA1 (Dec 12, 2022)

Uh huh, pretty much what I expected before.


----------



## TheLostSwede (Dec 12, 2022)

Lightofhonor said:


> Performance is a lot closer to 4080 than I had imagined.


In a positive or negative way?


----------



## Vayra86 (Dec 12, 2022)

Is it possible the Frametime Analysis page is not filled yet? Not seeing graphs.


----------



## Pumper (Dec 12, 2022)

I just can't understand how AMD always manages to fuck up low-load power draw at launch.

The RT performance hit being pretty much identical to the 6000 series is a bummer. Intel did a better job on their first try, for fuck's sake.

I see that the performance/dollar also includes the 7900 XT and, as expected, the lower-end card is the worse value GPU. Disappointing.


----------



## Chaitanya (Dec 12, 2022)

Impressive, but the XT seems to be the unwanted child of the two, with quite bad pricing (relative to the XTX).


----------



## Crackong (Dec 12, 2022)

Same performance as 4080, 8GB more VRAM, not sized as a brick, and $200 cheaper


----------



## terroralpha (Dec 12, 2022)

TheLostSwede said:


> In a positive or negative way?


Very negative. This means no price disruptions. 

Also, can we talk about idle power on these AMD cards? What the hell happened?


----------



## Tech Ninja (Dec 12, 2022)

Isn't it time for an RT average chart? Also, what about FSR 2 vs DLSS 3.0?


----------



## Fouquin (Dec 12, 2022)

Annoying to see the power draw issues with multi-monitor are still present after all this time, but as before perhaps they can address that with driver fixes. Otherwise this looks exactly as AMD said it would be; direct competition for the 4080 at a lower price. You can certainly see where they pushed the power draw back up vs RDNA2. It doesn't look nearly as competitive on the efficiency front as previous generations, but it's not SO bad. Since AMD offers an instant undervolt option in the Adrenalin utility it's not going to be hard for people to get the card back down closer to 320-350W peak.



Pumper said:


> RT performance hit being pretty much identical to 6000 series is a bummer



Did we look at the same review? It's matching the 3090 Ti quite often, and a couple times sits between the 4080 and 4090 in RT. The 6000 series is languishing miles away in RT performance, not even remotely "identical" performance. Judging by results such as Far Cry, Resident Evil, and Watch Dogs this looks like a solution can be found in driver optimizations.


----------



## Vayra86 (Dec 12, 2022)

16% RT gap, on par in raster, with more games pushing past the 4080 than there are games ending up worse.

$999 is priced right in relation to the 4080, but not right in absolute terms. The 7900 XT OTOH is half a tier below the 4080 but has virtually the same perf/$.
Overall this makes AMD's offering on the (too-) pricey side IMHO, much like Nvidia's.

Guess I'm saving that 13th month for now


----------



## Pumper (Dec 12, 2022)

nicamarvin said:


> Color me Surprised, W1zzard Blatant Biased yet again. Pairing a AMD GPU with an Intel CPU. He think he is slick. He knows AMD's GPU perform the best with AMD CPUs... I am done with W1zzard's antics.


Is this a shitty attempt at a joke?


----------



## Lightofhonor (Dec 12, 2022)

TheLostSwede said:


> In a positive or negative way?


I guess negative? Haha, though that sounds harsh. Figured the XT and XTX would split the 4080, which they do, but not as evenly as I thought. Drivers need some fine-wine work anyway, so it could change.


----------



## Psychoholic (Dec 12, 2022)

So matches the 4080 while being quite a bit cheaper, Nicely done imo.


----------



## _JP_ (Dec 12, 2022)

Tbh, I'm positively surprised by the card and it's great to see a seriously good flagship from AMD, but I cannot get over the price. Again, this is a flagship, not a halo product, so that $1K makes it rather unappealing.


nicamarvin said:


> Color me Surprised, W1zzard Blatant Biased yet again. Pairing a AMD GPU with an Intel CPU. He think he is slick. He knows AMD's GPU perform the best with AMD CPUs... I am done with W1zzard's antics.


Provide data to back up your claims or stop the trolling, please.


----------



## birdie (Dec 12, 2022)

This is a disaster for the consumer.

Instead of soundly beating NVIDIA at least in raster AMD offers comparable performance at the comparable price while not offering unique distinguishing features like e.g. DLSS 3.0. And RTRT performance is again hugely lacking though AMD has managed to reach ... Ampere levels of performance.

I hate both NVIDIA and AMD. It looks like both companies are in cahoots and are no longer interested in advancing the gaming industry and graphics.

You want more performance? You pay proportionally more money. This is _not_ how the GPU industry worked for the previous 20 years. This is just disgusting.

And what's up with multi-monitor power consumption? They had 5 years to perfect the architecture and we're looking at 103 W? WTF AMD?

This card doesn't disrupt anything. It's a mockery of competition.

This is a bloody duopoly.


----------



## gridracedriver (Dec 12, 2022)

My prediction that in RT it would be equal to the 3090 Ti and just 15% behind the 4080 was 100% true.
Good work AMD, an incredible performance-cost ratio.

For the first iteration of a chiplet GPU die, it's a very good result, with costs almost halved.


----------



## Vayra86 (Dec 12, 2022)

nicamarvin said:


> Color me Surprised, W1zzard Blatant Biased yet again. Pairing a AMD GPU with an Intel CPU. He think he is slick. He knows AMD's GPU perform the best with AMD CPUs... I am done with W1zzard's antics.


Don't let the door hit you on the way out!


----------



## W1zzard (Dec 12, 2022)

Vayra86 said:


> Is it possible the Frametime Analysis page is not filled yet? Not seeing graphs.


coming soon, just finished xt conclusion, grabbing a slice of pizza and then frametimes.



Tech Ninja said:


> Isnt it time for a RT average chart?  Also what about FSR 2 vs DLSS 3.0?


I read your mind earlier today, check the RT page


----------



## Vayra86 (Dec 12, 2022)

birdie said:


> This is a disaster for the consumer.
> 
> Instead of soundly beating NVIDIA at least in raster AMD offers comparable performance at the comparable price while not offering unique distinguishing features like e.g. DLSS 3.0.
> 
> ...


This is the bloody idiot customer base paying too much for three generations on end. So we get a fourth that stretches that rationale just a bit further.

What the hell did you expect? This is on US. And on US alone. I wonder how many cards are left on shelves right now... The fact is, we enable 60% margins by paying them, instead of waiting for a deal we really feel is fair. For what? Gaming. A luxury.

It's like gaming itself: you get what you pay for.

We've lived most of our lives in an age of 'free money' - financial crises notwithstanding - and the turning point is now. Let's see the turn - if you pay $1K+ for a GPU in 2022-2023, you've lost the plot, it's that simple.


----------



## Fouquin (Dec 12, 2022)

birdie said:


> Instead of soundly beating NVIDIA at least in raster AMD offers comparable performance at the comparable price while not offering unique distinguishing features like e.g. DLSS 3.0. And RTRT performance is again lacking thought AMD has made some great strides.



I suppose if you ignore FSR 2.0 being available in a lot of modern games and being functional, sure. No distinguishing feature there. RT perf got a nice bump; they've never been this close to on par, even in optimized games.


----------



## TheLostSwede (Dec 12, 2022)

nicamarvin said:


> Color me Surprised, W1zzard Blatant Biased yet again. Pairing a AMD GPU with an Intel CPU. He think he is slick. He knows AMD's GPU perform the best with AMD CPUs... I am done with W1zzard's antics.


Ah yes, a very much expected comment now that the test system was switched, because loads of people complained about the Ryzen 7 5800X...


----------



## wolf (Dec 12, 2022)

I must say I was hoping for more, and AMD's slides led me to expect more, and other reviewers have mentioned driver bugs... I doubt this will do much for prices in this segment...

I am underwhelmed, relatively speaking RDNA2 feels stronger than this.


----------



## Legacy-ZA (Dec 12, 2022)

Pleasantly surprised, that being said, still too expensive.


----------



## yeeeeman (Dec 12, 2022)

Seems like chiplet GPUs are not as easy as they seem; for sure, not as "easy" as CPUs were.
Unfortunately for AMD and for us, they will have to drop prices to sell these. The worst part is that Nvidia can keep their halo prices, because yeah, the RTX 4090 is unbeatable, so they can ask whatever they want for it.

In any case, RDNA2 was a bigger increase partly because RDNA1 was crap and full of bugs, and partly because Nvidia didn't have a great round with the 30 series cards, which were made on a poor process, Samsung 8 nm. Heck, Nvidia left a lot of potential raster performance on the table with the 4090, because they spent quite a bit of their transistor budget on making DLSS 3 a reality.


----------



## Kaleid (Dec 12, 2022)

Compact and loud go together. Too bad; why can't they design these things better?
But other than RT performance it's quite great.

The XT draws only about 20 W more than the 6800 XT; that's a big increase in efficiency.


----------



## 3211 (Dec 12, 2022)

Vayra86 said:


> 16% RT gap, on par in raster, with more games pushing past the 4080 than there are games ending up worse.
> 
> 999 is priced right in relation to the 4080. But not priced right. The 7900XT OTOH is half a tier below the 4080 but has virtually the same perf/$.
> ...



It's 16% in W1zzard's cherry-picked, AMD-favoured, small sample selection. In actual fact, taking a broader and more diverse sample, the 4080 is 26% faster in RT than the XTX and nearly 50% faster than the XT, and the 4090 is 75% faster than the XTX and more than double the XT. In ray tracing.

You also have to laugh at W1zzard's conclusion, "faster in rasterization". Look at the average: 124 fps vs 125 fps. ))) Yeah, that's faster in raster all right. "MUCH lower price" - yeah, 1000 vs 1200. Gotta accentuate that "much".


----------



## kapone32 (Dec 12, 2022)

Well, I have a 6800 XT, and from what I can see the XTX is about 60% faster overall than my card. That is enough for me, along with the additional 8GB of VRAM. People can wax on all they want about 40 series cards; give me AMD's overall package any day before Nvidia's. With my 5800X3D this thing will give me that smile you get the first time you use compelling hardware. I want the Aqua from ASRock too, so I don't have to wait for a block to arrive from Slovenia or Germany.


----------



## Ayhamb99 (Dec 12, 2022)

While my prediction that the 7900 XTX would compete with the 4090 was wrong (in traditional rasterization at least it's not that far off, but for ray tracing Nvidia is still the way to go), this is definitely going to force an MSRP cut for the 4080.



nicamarvin said:


> Color me Surprised, W1zzard Blatant Biased yet again. Pairing a AMD GPU with an Intel CPU. He think he is slick. He knows AMD's GPU perform the best with AMD CPUs... I am done with W1zzard's antics.


Of course another group of idiots is going to complain about the CPU choice after a long time of whining about how the "5800X is a massive bottleneck".

smh


----------



## Nkd (Dec 12, 2022)

LMAO. W1zzard found a way to call this a bad value, bang on it for price, and still pitch people the 4080 at the end? I get the review, but the conclusion really feels like it was written by someone in the forum who loves Nvidia.


----------



## kapone32 (Dec 12, 2022)

TheLostSwede said:


> Ah yes, very much expected comment now that the test system was switched, because loads of people complained about the Ryzen 7 5800X...


There are actually 5800X numbers in the Gaming charts


----------



## Nkd (Dec 12, 2022)

Lightofhonor said:


> Performance is a lot closer to 4080 than I had imagined.


Why don't you look at the chart at the end and see which games matter? It's faster in more games than it's not.


----------



## Toss (Dec 12, 2022)

How about PRODUCTIVITY TESTS?
x264, AV1, H.265
ENCODING? DECODING?


----------



## spnidel (Dec 12, 2022)

Nkd said:


> LMAO. Wizzard found a way to call this a bad value and bang on it for price and pitch people 4080 still at the end? I get the review but conclusion really feels like someone in the forum who loves nvidia.


you don't get it man, the 8GB less VRAM and 16% better RT performance is worth the extra $200
as for the performance itself... this is Vega 64 all over again. Not impressed; the 6800 XT was a far better product.
given that the 7900 XT is only *33%* faster than a 6800 XT, I can only expect the 7800 XT to be 20% faster than the 6800 XT - a lame fucking generational leap, and I am 100% sure the 7800 XT will have a higher MSRP than the 6800 XT did

Vega 64 all over again, unbelievable. AMD needs to stop smoking crack and lower the fucking prices


----------



## btk2k2 (Dec 12, 2022)

So AMD claimed a 54% perf/watt increase over the 6900 XT with both the 7900 XTX and 6900 XT running at 300 W. At full power in this benchmark (and others) it seems like it can't even hit 50% faster than the 6900 XT with 355 W of power.

Either those 55 W are doing practically nothing, or AMD cherry-picked really, really hard; in either event I am disappointed with AMD for being this misleading.

EDIT: I also think a Civ 6 FPS benchmark is a total waste of a slot that could be used for a different game.
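The perf/watt complaint above can be checked with quick arithmetic. The speedup and board-power figures below are rough assumptions based on numbers mentioned in this thread, not exact review results:

```python
# Rough perf/watt sanity check. All figures are approximate assumptions:
# AMD's slide claimed +54% perf/watt at iso-power (300 W vs 300 W),
# while reviews measured roughly +45% performance at 355 W board power.
claimed_iso_power_gain = 0.54

measured_speedup = 1.45          # 7900 XTX vs 6900 XT at 4K (approximate)
power_ratio = 355 / 300          # stock board power vs 6900 XT board power

stock_perf_per_watt_gain = measured_speedup / power_ratio - 1
print(f"perf/watt gain at stock: {stock_perf_per_watt_gain:.0%}")  # about 23%
```

Even granting a ~45% speedup, the perf/watt gain at stock power lands far below the marketed 54%, which is the gap being complained about; the 54% figure would only hold at a cherry-picked iso-power operating point.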


----------



## CGLBESE (Dec 12, 2022)

Thanks for the review. I had hoped AMD would price its cards around $150 lower... but I was wrong 

It's a bit *meh* for this price tag


----------



## kapone32 (Dec 12, 2022)

birdie said:


> This is a disaster for the consumer.
> 
> Instead of soundly beating NVIDIA at least in raster AMD offers comparable performance at the comparable price while not offering unique distinguishing features like e.g. DLSS 3.0. And RTRT performance is again hugely lacking though AMD has managed to reach ... Ampere levels of performance.
> 
> ...


I bet you that by this time next year these cards will be competing with the 40 series at every level. AMD is not sitting on its laurels and has refined its software suite. As I have said before (I have a 3060), what Nvidia gives you in comparison is laughable. I will also say that, versus a 6800 XT, this thing seriously shows just how good they are.


----------



## neatfeatguy (Dec 12, 2022)

nicamarvin said:


> Color me Surprised, W1zzard Blatant Biased yet again. Pairing a AMD GPU with an Intel CPU. He think he is slick. He knows AMD's GPU perform the best with AMD CPUs... I am done with W1zzard's antics.


Clearly you didn't go through the review very well. W1zzard has results for the 4080 and 7900 XTX paired with a 5800X in all the game and relative performance graphs. Is it because it was a 5800X and not maybe a 7700X or 7900X, is that why you're so bitter?

It's sad to see people cry on both ends of the spectrum. W1zzard received grief for not using the fastest gaming CPU (arguably the 13900K) when the 4080 launched, and now he's getting shit for not using an AMD CPU, when in actuality every graph in this review includes data for the GPUs paired with the 5800X.

Sorry he can't do a review with every single CPU out there while keeping a constant baseline for his review work over multiple years. If you're so unappreciative of the work he and other review sites do because they're not using the hardware you want, then by all means, go spend thousands on testing hardware and hundreds of hours creating benchmarks for us. Once you do, we'll be sure to bitch about how we don't like the methods you used.


----------



## W1zzard (Dec 12, 2022)

_JP_ said:


> Provide data to backup your antics or stop the trolling, please.


It is on my list, but don't think I'll be able to test this before xmas (50+ games RX 7900 XTX 13900K vs 7700X)


----------



## P4-630 (Dec 12, 2022)

Just playing a simple youtube video?


----------



## the54thvoid (Dec 12, 2022)

Folks can critique the review as much as they want, but do not make it personal. You'll get points for that.


----------



## W1zzard (Dec 12, 2022)

P4-630 said:


> Just playing a simple youtube video?


I'm testing a video file, not a stream, so I have perfect repeatability, independent of internet speed, YT shenanigans, codec changes etc. But yes, should be for YT as well


----------



## TheLostSwede (Dec 12, 2022)

kapone32 said:


> There are actually 5800X numbers in the Gaming charts


Transition period so people can compare with older results. Makes sense.


----------



## ARF (Dec 12, 2022)

This is an absolute failure by AMD. The 4090 is 22% faster - much worse than RX 6900 XT vs RTX 3090, where the performance difference was only 7%.

Why is RDNA 3 so damn slow? 

AMD has a chance if it:
1. lowers the prices
2. releases Navi 32 with very close performance at a much, much lower price


----------



## iO (Dec 12, 2022)

Overall a good card but at 1150€ too expensive, just like all new cards...

And that 5800x bottleneck in some games is astonishing.


----------



## mb194dc (Dec 12, 2022)

Just reduce the price by $400 and you've got a winner...

Personally, I don't think 1080p is a relevant resolution for cards costing $1k-odd, either. They're aimed at 4K or even higher.


----------



## spnidel (Dec 12, 2022)

if AIB cards don't overclock far better than the reference cards to compensate for the shitty generational leap, then this is another Vega 64 moment for AMD. so fucking disappointing.
"up to 1.7x faster than the 6950 XT" my fucking ass


----------



## luches (Dec 12, 2022)

Overall disappointed. The cards are not competitive enough to move the market and prices. Why does it feel like these two giants always rely on misinformation to hype their cards, and always end up pricing them in a way that doesn't threaten each other!
Anyway, I pray the video playback and multi-monitor power usage is a driver issue and gets fixed! 90 W on playback is insane!


----------



## the54thvoid (Dec 12, 2022)

ARF said:


> This is an absolute failure by AMD. 4090 is 22% faster - that's much worse than RX 6900 XT vs RTX 3090 in which case the performance difference was only 7%.
> 
> Why is RDNA 3 so damn slow?
> 
> ...



I'd say, why is the 4090 so damn fast? That's a helluva card (too pricey for me) and was always going to be a reach for AMD to catch it.

I think the XTX is good but the XT would be better at $800 max (or ideally, £799 UK equiv.)


----------



## HD64G (Dec 12, 2022)

The CPU bottlenecks very hard in some games, so the ~45% uplift vs the 6900 XT at 4K is pessimistic. For any game utilising the GPU properly it gets closer to 60%, and in some games even 80% over the 6900 XT (per other online reviews). So the up-to-54% higher efficiency is true.


----------



## P4-630 (Dec 12, 2022)

W1zzard said:


> I'm testing a video file, not a stream, so I have perfect repeatability, independent of internet speed, YT shenanigans, codec changes etc. But yes, should be for YT as well



88W @8K ?  Or even 88W @ 1080P ?


----------



## EatingDirt (Dec 12, 2022)

Fouquin said:


> Did we look at the same review? It's matching the 3090 Ti quite often, and a couple times sits between the 4080 and 4090 in RT. The 6000 series is languishing miles away in RT performance, not even remotely "identical" performance. Judging by results such as Far Cry, Resident Evil, and Watch Dogs this looks like a solution can be found in driver optimizations.


Look at the (% changes) for the 6950 XT versus the 7900 XTX. The 7900 XTX's efficiency in ray tracing is basically on par with the 6000 series. This is disappointing, especially as they specifically said they worked on ray-tracing performance for this architecture. It looks like the architecture changes to increase ray-tracing performance didn't translate well into real-world game ray-tracing performance.


----------



## phanbuey (Dec 12, 2022)

I definitely think it's in a good spot in the terrible performance/value high end segment... that being said used 6900xt and 3080's / 3080Tis looking amazing right now.


----------



## lightning70 (Dec 12, 2022)

The performance difference is smaller than claimed. With a good overclock, the RTX 4080 can be brought up to this card's raster performance.
Pros are more up-to-date video outputs, more memory and more efficient power consumption. A future driver update should remove the extra multi-monitor power draw. There may be some increase in performance as well.


----------



## Psychoholic (Dec 12, 2022)

We should probably keep in mind that while AMD has experience with chiplets in CPUs, GPUs are surely a bit more complex.
Since this is a totally new architecture (GPU chiplets), I feel like a LOT of driver optimizations are still to come.


----------



## gffermari (Dec 12, 2022)

No GPU is bad if it is priced accordingly. But these are not priced well.
$150+ lower and we'd be sold on AMD, regardless of the unacceptable RT numbers.


----------



## Xuper (Dec 12, 2022)

spnidel said:


> if AIBs don't overclock far better than reference cards to compensate for the shitty generational leap, then this is another vega 64 moment for AMD. so fucking disappointing.
> "up to 1.7x faster than 6950 xt" my fucking ass


It's your fault for believing it; don't blame AMD, or you're trolling.


----------



## GreiverBlade (Dec 12, 2022)

Pricing aside, AMD did it right... performance and RT are competitive (RT is a bit irrelevant for me, as it brings nothing IMHO, but it's still good to see them making progress in that direction), the reference cards look drop-dead gorgeous, 2.5 slots, and no HPWR connector needed.

IMHO, they did better than Nvidia, and certainly better than I expected. If I hadn't done right by buying an RX 6700 XT this year, when the price dropped significantly, I would definitely go for an XT or XTX, although not at their current prices... but I would have been forced to, because I suspect the RX 7700 XT will not be a worthy successor (having the same syndrome the 4070 will have: reduced specs from one generation to another). The RX 7700 XT and 4070 feel like they belong to the 7600 XT and 4060 line in terms of specs (to me the 7900 XT should be the 7700 XT).

In terms of price, the XTX should be $850 and the XT $750 at most... (well, even at that MSRP, in Switzerland they would be 900 (XT) and 1000 (XTX) anyway; I guess I will see them around 1100 (XT) and 1200 (XTX)).

I can skip a gen again (or more... after all, I stayed 5-6 years with my 1070).


----------



## xenosys (Dec 12, 2022)

AMD really messed up with their presentation. They had a lot of people believing these cards would be around 1.5-1.7x the performance of a 6950 XT, when in reality it's closer to 1.35-1.4x. Based on their presentation figures I'd expected they would be further ahead of the 4080 in raster performance (around 15%), and in RT performance closer to a 3090/3090 Ti and well behind the 4080, which ended up being the case.

Based on the reviews, and the relative performance of the 6000/3000 series GPUs at the prices you can currently pick them up for, the 7900 XTX should be an $899 product and the 4080 a $999 one.


----------



## ARF (Dec 12, 2022)

the54thvoid said:


> I'd say, why is the 4090 so damn fast? That's a helluva card (too pricey for me) and was always going to be a reach for AMD to catch it.
> 
> I think the XTX is good but the XT would be better at $800 max (or ideally, £799 UK equiv.)



I don't think the 4090 is fast - it's normal for a generational architecture update and its transistor count.
But:
4090 - 78B transistors for 122% performance
7900 XTX - 58B transistors for 100% performance

Actually, the 7900 XTX has higher performance per transistor, just too few of them.

The 7900 XTX is bad for other reasons, too:
- too high power consumption at idle (13 W), multi-monitor (103 W), video playback (88 W) and V-sync (127 W)
- extremely high price tag
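The perf-per-transistor claim above is just arithmetic on the post's own round numbers (relative performance and billions of transistors as quoted, not exact die figures):

```python
# Performance per billion transistors, using the figures quoted above.
cards = {
    "RTX 4090":    {"rel_perf": 122, "transistors_b": 78},
    "RX 7900 XTX": {"rel_perf": 100, "transistors_b": 58},
}

perf_per_bt = {name: c["rel_perf"] / c["transistors_b"] for name, c in cards.items()}
advantage = perf_per_bt["RX 7900 XTX"] / perf_per_bt["RTX 4090"] - 1
print(f"XTX advantage per transistor: {advantage:.0%}")  # about 10%
```

So per transistor the XTX does come out ahead by roughly 10%; the 4090's lead comes from simply having ~34% more of them.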


----------



## W1zzard (Dec 12, 2022)

P4-630 said:


> 88W @8K ?  Or even 88W @ 1080P ?


See the test notes, it's "4K 30 FPS video that's encoded with H.264 AVC at 64 Mbps bitrate"

edit: just tested it, YT 4K = ~100 W, 1440p = 100 W too, 1080p = 50 W, 720p = 21 W
edit2: added to conclusion, great question


----------



## AnotherReader (Dec 12, 2022)

A very disappointing effort from AMD so far. I was expecting the 7900 XTX to be around 90% of a 4090 in traditionally rasterized games, but it is barely beating the 4080. On top of that, the horrendous multi monitor and video playback power consumption is infuriating. I hope the latter is a driver bug. Given that RDNA3 is very different from RDNA2, maybe driver optimizations will increase the distance from the 4080 over time. All in all, I think I'll be snagging a 6800XT on the cheap instead of buying a 7700 XT.


----------



## Shatun_Bear (Dec 12, 2022)

gffermari said:


> No gpu is bad if it is priced accordingly. But these are not priced well.
> 150$+ down and we are sold to AMD regardless the unacceptable RT numbers.



'Unacceptable RT performance' - that's quite something.

First of all, I guess you consider all 3000-series cards 'unacceptable' in RT too, since they offer similar performance there. Secondly, RT is still a gimmick; Nvidia has made you believe it is worth paying a premium for 16% more of it.


----------



## Fluffmeister (Dec 12, 2022)

phanbuey said:


> I definitely think it's in a good spot in the terrible performance/value high end segment... that being said used 6900xt and 3080's / 3080Tis looking amazing right now.



Tell me about it, I went from a GTX 980 Ti to a RTX 3080 FE for a total of £450. Very happy


----------



## 3x0 (Dec 12, 2022)

Expected better raster performance vs 4080.

@W1zzard In the GPU-Z screenshot, revision C8 indicates that there were many revisions, correct? Compared to RDNA2's C0/C1 for most of the cards. Or is it just some arbitrary info?


----------



## dgianstefani (Dec 12, 2022)

Shatun_Bear said:


> 'Unacceptable RT performance' that's quite something.
> 
> First of all, I guess you consider all 3000-series cards 'unacceptable' too in RT since they offer similar performance there. Secondly, RT is still a gimmick, Nvidia has made you believe it is worth paying a premium for 16% more of it.


3000 series cards being released in late 2022 with a $1000 price tag would be unacceptable.

Maybe people are getting tired of AMD settling to be as fast as NVIDIA's previous gen every release.


----------



## W1zzard (Dec 12, 2022)

3x0 said:


> @W1zzard In the GPUz screenshot, Revision C8 indicates that there were many revisions, correct? Compared to RDNA2 C0/1 for most of the cards.


No. Since AMD has started to run out of device IDs (the 744C number), they are separating some SKUs by the revision ID. It's an arbitrarily selected number that's just different on some cards; e.g. the XT and XTX share the same device ID but have different revisions. Different SKUs, like the Pro cards, still get their own device ID, and there the revision ID is 0 or some other seemingly random value. It has absolutely nothing to do with respins/ASIC revisions or similar on recent AMD cards.

I hope that makes sense, happy to explain more
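A sketch of that scheme: SKUs sharing device ID 744C are told apart only by their revision ID. The XTX's C8 value comes from the GPU-Z screenshot discussed above; the second table entry is a made-up revision value purely for illustration.

```python
# Hypothetical lookup keyed on (PCI device ID, revision ID), illustrating
# how two SKUs can share device ID 0x744C and differ only by revision.
PCI_SKUS = {
    (0x744C, 0xC8): "Radeon RX 7900 XTX",  # revision from the GPU-Z screenshot
    (0x744C, 0xCC): "Radeon RX 7900 XT",   # hypothetical revision value
}

def identify_sku(device_id: int, revision_id: int) -> str:
    """Resolve a SKU name, falling back when the ID pair is unknown."""
    return PCI_SKUS.get((device_id, revision_id), "Unknown SKU")

print(identify_sku(0x744C, 0xC8))  # Radeon RX 7900 XTX
```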


----------



## john_ (Dec 12, 2022)

This reminds me of Bulldozer. RDNA3 looks like Bulldozer to me. That dual-ALU(?) architecture (please correct me here, I am probably describing it wrong) doesn't seem to work as expected and probably needs much more work from the software team to optimize it. That's why I bring up Bulldozer: it had the same issue, with dual ALUs and one FPU. It was losing even to Phenom in some cases, and it wasn't offering what someone would expect from an "eight core" CPU. The same here: in some cases we see performance more like 6144 SPs than what we would expect from 12288 SPs. For example, F1 22 at 4K: the 6900 XT at 132 fps, the 7900 XT at 152 and the 7900 XTX at 183 fps. Who wasn't expecting the 7900 XTX at 200+ minimum before the reviews? Those power consumption problems with multi-monitor and video playback also scream "rushed, immature drivers". The software team probably had other priorities to fix (meaning performance, obviously) than power consumption.

Let's hope they can fix it, or Nvidia will be selling RTX 4070s like hot cakes. Even the RTX 4080 might start selling better after today's reviews.


----------



## ARF (Dec 12, 2022)

AMD seems to have abandoned ray-tracing optimisation. 55% faster means being a generation or two ahead.


----------



## phanbuey (Dec 12, 2022)

john_ said:


> This reminds me of Bulldozer. RDNA3 looks like Bulldozer to me. That dual ALUs(?) architecture (please correct me here, I am probably describing wrong) doesn't seem to work as expected and probably needs much more work from the software team to optimize it. That's why I talk about Bulldozer. Bulldozer had the same issue. Dual ALUs and one FPU. It was losing even to Phenom in some cases or it wasn't offering what someone would expect from an "eight core" CPU. The same here. We see more like 6144 SPs performance in some cases than what we would expect from 12288 SPs. For example. F1 22 at 4K. 6900 XT at 132 fps, 7900XT at 152 and 7900XTX at 183 fps. Who wasn't expecting the 7900XTX at 200+ minimum before the reviews. Those power consumption problems with multi monitor and video playback also scream "rushed, immature drivers". The software team probably had other priorities to fix (meaning performance obviously) than power consumption.
> 
> Let's hope they can fix it, or Nvidia would be selling RTX 4070's like hot cakes. Even RTX 4800 might start selling better after today's reviews.



This is more like Zen 1 IMO - they're sacrificing efficiency for the first-gen modular design, but it has the potential to help them scale (or realize massive efficiencies) down the line. The first Zen had the same issues.

If they had just die-shrunk and optimized RDNA2 they would have gotten better results, for sure, but they're opting for the Zen strategy.

I definitely think it's priced OK relative to the 4080 - I don't agree with the reviews saying "Nvidia has to lower prices", since Nvidia does have more features and better RT performance; to me the 4080 and 7900 XTX are at parity.


----------



## spnidel (Dec 12, 2022)

Xuper said:


> It's your fault , don't blame on AMD or you're trolling


???
1.7x the performance of the 6950 XT at 4K in Cyberpunk would at least put the 7900 XTX on par with the 4090, yet it's only 1.42x faster - same story in most other games.

113 / 84 ≈ 1.35x the fps
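The ratio is easy to verify from the two FPS figures cited above (taken from the post, stated there as Cyberpunk 2077 4K numbers):

```python
# Speedup from the Cyberpunk 2077 4K figures quoted above.
xtx_fps = 113    # RX 7900 XTX
r6950_fps = 84   # RX 6950 XT

speedup = xtx_fps / r6950_fps
print(f"{speedup:.2f}x")  # 1.35x - well short of the marketed "up to 1.7x"
```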


----------



## ARF (Dec 12, 2022)

phanbuey said:


> This is more like zen 1 IMO - they're sacrificing efficiency for the 1st gen modular design, but it has the potential to help them scale (or realize massive efficiencies) down the line.  The first zen had the same issues.
> 
> If they just die shrunk and optimized RDNA 2 they would have gotten better results, for sure, but they're opting for the Zen strategy.



Down the line now means 2 years in the future, 4 years in the future, 6 years in the future.
I bet they will change the architecture, and no one will give them time for this.

Remember that AMD has only 8% market share today, which will soon be overtaken by Intel Arc.


----------



## igralec84 (Dec 12, 2022)

Will be interesting to see how it undervolts; if the 4090 can achieve its default FPS under 340 W at 0.95 V, the 4080 can probably go below 300 W too without much performance loss. 

It has crossed my mind a couple of times to sell the 4090 if I don't start using it for other stuff than gaming, get a 7900 XTX and pocket the 700-800€ difference, turn on FreeSync and play at 120 Hz instead of 144 Hz, and hope I don't regret it if/when some good RT games come out that I'll play.


----------



## phanbuey (Dec 12, 2022)

ARF said:


> Down the line now means 2 years in the future, 4 years in the future, 6 years in future.
> I bet they will change the architecture, and no one would give them time for this.
> 
> Remember that AMD has only 8% market share today, which will soon be overtaken by Intel Arc.



I'm not sure Arc is even on the radar. The only competitor to Nvidia is AMD - Arc is 3 gens behind.


----------



## Xuper (Dec 12, 2022)

spnidel said:


> ???
> 1.7x the performance of the 6950 XT at 4K in Cyberpunk would at least put the 7900 XTX on par with the 4090, yet it's only 1.42x faster - same story in most other games.
> 
> 
> ...


I knew something was wrong with their marketing, so I didn't believe it. Anyone who takes those claims (either AMD's or Nvidia's) at face value deserves to be unhappy.


----------



## Fleurious (Dec 12, 2022)

Wow, performance is honestly what I was hoping for.

Price is still too high, but at least there is competition for the 4080.


----------



## THU31 (Dec 12, 2022)

Good raster value compared to the RTX 40 series, but that's about it.

Performance per dollar is at the same level as the previous generation. And even if this is the future and we have to get used to it, at this moment in time it is unacceptable.

I just do not see how they are envisioning this future. Who will pay $1000 for mid-range cards in a few years?


----------



## Ferrum Master (Dec 12, 2022)

I want a bench of this thing under water. Does it scale well while running cool?


----------



## usiname (Dec 12, 2022)

People have short memories






Portal "3x" performance


----------



## Argyr (Dec 12, 2022)

nicamarvin said:


> Color me surprised, W1zzard blatantly biased yet again. Pairing an AMD GPU with an Intel CPU. He thinks he is slick. He knows AMD's GPUs perform best with AMD CPUs... I am done with W1zzard's antics.


Do we also need to get AMD SSDs, power supplies, keyboards and energy drinks for it to run better?


----------



## Zareek (Dec 12, 2022)

This is a really impressive showing by AMD if you compare it to the 4080. For $200 less, you get better performance out of the box in most of the tested games. RT performance isn't on the same level, but it seems like a massive jump versus RDNA2. It looks like AMD still has a crapload of driver work to do. A few games are really not scaling as expected, the multi-display idle bug is back, and what is up with the video decoding power consumption? I'm willing to bet AMD works this stuff out with drivers.


----------



## birdie (Dec 12, 2022)

usiname said:


> People have short memories


The graph you're referring to *uses DLSS 3.0 with frame generation which TPU's review doesn't account for*. People's memory is OK. Your argument is squarely invalid.


----------



## dgianstefani (Dec 12, 2022)

usiname said:


> People have short memories
> View attachment 274150
> View attachment 274151
> Portal "3x" performance
> View attachment 274152


Without FG it's almost 2x, with FG it's close to 4x.


----------



## RedBear (Dec 12, 2022)

Shatun_Bear said:


> 'Unacceptable RT performance' that's quite something.
> 
> First of all, I guess you consider all 3000-series cards 'unacceptable' too in RT since they offer similar performance there. Secondly, RT is still a gimmick, Nvidia has made you believe it is worth paying a premium for 16% more of it.


AMD is believing that as well, if they're making you pay a premium for performance comparable to the previous Nvidia generation... If Nvidia didn't offer any generational upgrade in performance that would have been unacceptable as well (after all the time and money that they've spent in researching and promoting it).


----------



## Xex360 (Dec 12, 2022)

Another disappointing product due to price. This card should have obliterated the 4080 in raster to justify its price (RT is still stupid; even the 4090 and its RT cores get slaughtered).
They are just milking us.


----------



## dir_d (Dec 12, 2022)

I guess those rumors about AMD hardware bugs were right. In the AMD slides, I swear AMD themselves said this card would push 3 GHz. If it actually could have reached that frequency, I think it could have been a competitor to the 4090. Right now the card is kinda all over the place in FPS, frame times, and power consumption. The card was clearly not ready, but they pushed it out anyway. Let's see what they can do to salvage this in the next couple of months with drivers.


----------



## Tropick (Dec 12, 2022)

If anything this launch has made me very happy that I was able to snag a 6950XT OCF on the cheap for black friday.


----------



## Shatun_Bear (Dec 12, 2022)

dgianstefani said:


> 3000 series cards being released in late 2022 with a $1000 price tag would be unacceptable.
> 
> Maybe people are getting tired of AMD settling to be *as fast as NVIDIA's previous gen every release*.



So you've just taken what I said to the next level. Instead of just considering RT performance a major factor in a card's value, you are wholly defining a card by its RT performance when you say these are 'as fast as Nvidia's previous gen', ignoring its actual (raster) performance. That's crazy framing.

The XTX is faster than Nvidia's previous gen; it's only in RT that there is a 16% gap with the current gen.


----------



## neatfeatguy (Dec 12, 2022)

EatingDirt said:


> Look at the (% changes) for the 6950 XT versus the 7900 XTX. The 7900 XTX's efficiency in raytracing is basically on par with the 6000 series. This is disappointing, especially as they specifically said they worked on raytracing performance for this architecture. It looks like the architecture changes to increase raytracing performance didn't translate well into real-world game raytracing performance.



Remember that even Nvidia didn't improve on RT performance. Ampere had around a 40-45% drop in performance when you turned on RT, and so does Ada.

Neither company improved upon RT performance when compared to last gen. The only reason things look better is simply because of the rasterization performance gains. I think it's sad that neither company made any improvements with RT... although it doesn't really matter to me, because I don't care about RT. I never use it in any games that support it, because for me it really doesn't make any difference. You're moving too fast through the games to really take in the small visual changes that RT brings.


----------



## gffermari (Dec 12, 2022)

Shatun_Bear said:


> 'Unacceptable RT performance' that's quite something.
> 
> First of all, I guess you consider all 3000-series cards 'unacceptable' too in RT since they offer similar performance there. Secondly, RT is still a gimmick, Nvidia has made you believe it is worth paying a premium for 16% more of it.


Unacceptable for a top-tier product that costs $1,000 and will be with us for the next two+ years.
It's a gimmick to you. I value it a lot more than raster performance.


----------



## mechtech (Dec 12, 2022)

Looks like they didn't repeat the 6800/6900 sweetness.

Maybe they should have just added additional shaders to RDNA2 and made it bigger?

For the price, I guess it's OK.

@W1zzard 2% difference at 4K between an AMD and an Intel CPU. Who was whining and complaining about that? Not even worth changing the test bed for. 1080p, OK, but running these cards at that resolution would be ridiculous.
I guess there was no way to do confirmation testing on the DP and HDMI versions? I may have missed it, but for media testing, which codec(s) did you use? Do these cards offer full 10-bit colour output?


----------



## john_ (Dec 12, 2022)

phanbuey said:


> This is more like zen 1 IMO - they're sacrificing efficiency for the 1st gen modular design, but it has the potential to help them scale (or realize massive efficiencies) down the line.  The first zen had the same issues.
> 
> If they just die shrunk and optimized RDNA 2 they would have gotten better results, for sure, but they're opting for the Zen strategy.
> 
> I definitely think it's priced ok according to the 4080 -- i don't agree with the reviews that "nvidia has to lowe price" since they do have more features and better RT performance, to me the 4080 and 7900xtx are at parity.


No, it's Bulldozer. AMD was promising higher performance than what the reviews show. I think they were hoping to improve their drivers considerably in this last month. But they didn't. Or we will have to accept that their marketing numbers were best-case scenarios close to the border of being lies. Zen 1 was a new architecture that was offering tremendous IPC gains over Bulldozer (which probably was the easy part), while also incorporating a first step into chiplet design. Zen 1 was a clear jump into the future. RDNA3 is a big "What The Hell?".

Here we have an architecture that probably fails to be utilized at its full potential. Like what we were seeing with Bulldozer. As you say, if they had shrunk RDNA2 they might have gotten better results. The same was true back when Bulldozer came out. Me and others were thinking that a 32nm Phenom III with 8 cores could have been a better product than an "eight core" Bulldozer.

It would have been priced OK if it was clearly beating the RTX 4080. Nvidia only has to drop the RTX 4080's price by $200 and the RX 7900 XTX is DOA at $1,000. Or release the RTX 4070 Ti at $800 and see that card beating the RX models in RT performance. People will go that direction.

Damn, 2 years of Nvidia monopoly and Intel coming to bite AMD from behind. At least they will be selling plenty of APUs.


----------



## dgianstefani (Dec 12, 2022)

Shatun_Bear said:


> So you've just taken what I said to the next level. Instead of just considering RT performance a major factor in a cards value, you are wholly defining a card based on its RT performance by saying these are 'as fast as Nvidia's previous gen', ignoring it's actual performance (raster). That's crazy framing.
> 
> The XTX is faster than Nvidia's previous gen, it's only in RT there is a 16% gap with the current gen.


Engines are updated or are being updated to have RT as a default replacement for baked lighting. This process is moving in one direction.

I meant what I said.


----------



## Metroid (Dec 12, 2022)

What about a memory temperature benchmark? The 7900 XTX has 384-bit GDDR6 while the 4080 has 256-bit GDDR6X. Does 384-bit GDDR6 run cooler than 256-bit GDDR6X?

The greatest thing about the 7900 GPUs is the included DisplayPort 2.1, finally. Other than that, the 7900 XTX's price is a little lower but power consumption a little higher (5 nm vs 4 nm); performance per price is pretty much the same. I would still prefer AMD for the DisplayPort 2.1. However, this DP 2.1 is only 54 Gbps maximum, not the full 80 Gbps, and that is bad. Full HDMI 2.1 is 48 Gbps, and both AMD and Nvidia have the full 48 Gbps on HDMI 2.1. AMD is counting on DSC here; DSC is only for DisplayPort 1.4a and up, HDMI 2.1 doesn't have DSC.


----------



## phanbuey (Dec 12, 2022)

john_ said:


> No, it's Bulldozer. AMD was promising higher performance than what the reviews show. I think they were hoping to improve their drivers considerably in this last month. But they didn't. Or we will have to accept that their marketing numbers were best-case scenarios close to the border of being lies. Zen 1 was a new architecture that was offering tremendous IPC gains over Bulldozer (which probably was the easy part), while also incorporating a first step into chiplet design. Zen 1 was a clear jump into the future.
> 
> Here we have an architecture that probably fails to be utilized at its full potential. Like what we were seeing with Bulldozer. As you say, if they had shrunk RDNA2 they might have gotten better results. The same was true back when Bulldozer came out. Me and others were thinking that a 32nm Phenom III with 8 cores could have been a better product than Bulldozer.
> 
> ...



I'm a huge fan of new Intel, but they're not even close to 'biting' AMD from behind. If Intel can match the 6800 XT with Battlemage next year I will be impressed, but I doubt it.

This architecture does have similarities to Bulldozer, but at the end of the day it pushes the FPS needed to compete with the 80 class.

They will need to drop the price, for sure -- but hopefully the silicon savings allow for that.


----------



## btk2k2 (Dec 12, 2022)

Xuper said:


> I knew something was wrong with their marketing, so I didn't believe it. Anyone who takes those claims (either AMD's or Nvidia's) at face value deserves to be unhappy.



There is wrong and there is that.

Small samples always have a certain level of inaccuracy, but claiming a 70% bump in CP2077 and then getting just 43% at HUB vs a 6950 XT, and 63% here vs a 6900 XT, is utterly misleading. It is such a shame as well, because AMD was building a track record of being trustworthy in their marketing performance claims (they were with Zen 2, Zen 3, Zen 4, RDNA, RDNA2), so for them to do this totally destroys years and years of hard work.


----------



## birdie (Dec 12, 2022)

neatfeatguy said:


> Remember that even Nvidia didn't improve on RT performance. Ampere had around a 40-45% drop in performance when you turned on RT, and so does Ada.
> 
> Neither company improved upon RT performance when compared to last gen. The only reason things look better is simply because of the rasterization performance gains. I think it's sad that neither company made any improvements with RT... although it doesn't really matter to me, because I don't care about RT. I never use it in any games that support it, because for me it really doesn't make any difference. You're moving too fast through the games to really take in the small visual changes that RT brings.



NVIDIA did improve RT performance _by a lot_ for Ada Lovelace relative to Ampere, it's just the base frame rate that increased as well. If RT performance was the same, generational improvements for RT titles would be a _lot smaller_ than for other purely rasterized games. Math requires respect.

Some examples from a Russian website which runs some purely artificial benchmarks:


----------



## oxrufiioxo (Dec 12, 2022)

Way overhyped... Now I know why the 4080 is so terribly priced, and now we will likely get an $800-900 4070. Way to go, AMD.


The only hope is that AIB cards perform way better.


----------



## swirl09 (Dec 12, 2022)

Not sure what people were expecting, given that so many comments are talking about disappointment :S

It's slightly faster than a 4080, while cheaper. Did you think AMD was retaking the performance crown while charging half as much? lol

Nice card IMO.


----------



## 3x0 (Dec 12, 2022)

john_ said:


> No, it's Bulldozer. AMD was promising higher performance than what the reviews show. I think they were hoping to improve their drivers considerably in this last month. But they didn't. Or we will have to accept that their marketing numbers were best-case scenarios close to the border of being lies. Zen 1 was a new architecture that was offering tremendous IPC gains over Bulldozer (which probably was the easy part), while also incorporating a first step into chiplet design. Zen 1 was a clear jump into the future. RDNA3 is a big "What The Hell?".


Bulldozer was in some cases slower than the previous gen Phenoms. That was the majority of the disappointment behind Bulldozer. Is RDNA3 slower in some cases than RDNA2?


----------



## Denver (Dec 12, 2022)

Is there any logical reason to measure performance per watt based on CP2077 and not the overall average?


----------



## dgianstefani (Dec 12, 2022)

Denver said:


> Is there any logical reason to measure performance per watt based on CP2077 and not the overall average?


I'd reason it's because CP2077 is notoriously heavy on both CPU and GPU.


----------



## Dristun (Dec 12, 2022)

Well, it's... okay, I guess? However, with these numbers Huang can either simply continue the giga-milking mode or deliver a fatality just by dropping the 4080 to $1,000. Unfortunate for everyone on the consumer side. GG to AMD for enabling better margins _for themselves _if they're really saving by going with chiplets.


----------



## spnidel (Dec 12, 2022)

john_ said:


> No, it's Bulldozer.
> 
> Here we have an architecture that probably fails to be utilized at its full potential.


the closest comparison then would be Vega 64
if you ask me, the 7900 XT and XTX should've been priced at $699 and $799 respectively.
this performance is nowhere near worth $1k, when going from a 5700 XT to a 6800 XT netted you a 2x increase in performance at $650 MSRP
here you go from a 6800 XT to a 7900 XT and get... 30% more fps for $250 more - abysmal generational leap


----------



## Bomby569 (Dec 12, 2022)

swirl09 said:


> Not sure what people were expecting that so many comments are talking about disappointment :S
> 
> Its slightly faster than a 4080, while cheaper. Did you think AMD were retaking the performance crown while charging half as much? lol
> 
> Nice card IMO.



Compared to death, a bloody nose is an improvement; neither is a good outcome.


----------



## jabbadap (Dec 12, 2022)

neatfeatguy said:


> Remember that even Nvidia didn't improve on RT performance. Ampere had around a 40-45% drop in performance when you turned on RT, and so does Ada.
> 
> Neither company improved upon RT performance when compared to last gen. The only reason things look better is simply because of the rasterization performance gains. I think it's sad that neither company made any improvements with RT... although it doesn't really matter to me, because I don't care about RT. I never use it in any games that support it, because for me it really doesn't make any difference. You're moving too fast through the games to really take in the small visual changes that RT brings.


It did, but there are no games using SER by default yet... 

Sad to say, but RDNA2 was more competitive with Ampere than RDNA3 is with Ada (probably just because of TSMC).


----------



## 1d10t (Dec 12, 2022)

To sum it up, pretty much underwhelming. I'm not an electrical engineer, but I was already skeptical about dual-issue SIMD on a modular GCD/MCD design, which I suspect poses additional latency or cache-related issues. Still, RT performance is quite good despite the lack of dedicated hardware, but sadly it's a feature I never use. Good job, though, for a first try at chiplet design.


----------



## mb194dc (Dec 12, 2022)

Probably a higher power limit and tweaking the voltage and boost curve will provide quite a decent performance boost, and maybe get it close to the 4090. Some AIB cards have three 8-pin connectors, so ~500 W versions are possible within spec. 

They'll probably be pretty expensive and obviously hot, though...


----------



## Bomby569 (Dec 12, 2022)

1d10t said:


> To sum it up, pretty much underwhelming. I'm not an electrical engineer, but I was already skeptical about dual-issue SIMD on a modular GCD/MCD design, which I suspect poses additional latency or cache-related issues. Still, RT performance is quite good despite the lack of dedicated hardware, but sadly it's a feature I never use. Good job, though, for a first try at chiplet design.



Nvidia is also on a newer node, so that should explain some of the differences, I think


----------



## Pumper (Dec 12, 2022)

Fouquin said:


> Did we look at the same review? It's matching the 3090 Ti quite often, and a couple times sits between the 4080 and 4090 in RT. The 6000 series is languishing miles away in RT performance, not even remotely "identical" performance. Judging by results such as Far Cry, Resident Evil, and Watch Dogs this looks like a solution can be found in driver optimizations.


That's not the point. Look at the % of FPS drop vs. raster. It's almost exactly the same as the previous gen. Of course it's 50% faster than a 6900 XT in RT, because it's also 50% faster in raster.







6900XT - 61% drop with RT on.
7900XTX - 58% drop with RT on.

Where is the performance increase? There is none, it's in the margin of error. AMD is not even at the RTX 2000 level yet. Embarrassing.
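To be clear about the metric: what's being compared is the fraction of raster performance lost once RT is enabled, not absolute RT frame rates. A minimal sketch (the FPS pairs are made up so the drops come out at the 61%/58% figures above):

```python
def rt_hit(raster_fps: float, rt_fps: float) -> float:
    """Fraction of rasterization performance lost when ray tracing is enabled."""
    return 1 - rt_fps / raster_fps

# Hypothetical FPS pairs chosen to reproduce the quoted drops.
print(f"6900 XT:  {rt_hit(100, 39):.0%} drop with RT on")   # 61%
print(f"7900 XTX: {rt_hit(150, 63):.0%} drop with RT on")   # 58%
```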


----------



## dgianstefani (Dec 12, 2022)

Pumper said:


> That's not the point. Look at the % of FPS drop vs. raster. It's almost exactly the same as the previous gen. Of course it's 50% faster than a 6900 XT in RT, because it's also 50% faster in raster.
> 
> 
> 
> ...


Do you understand that if you increased raster by 50% and RT performance stayed the same, you would have zero improvement in RT frame rate?

The RT improvements are in line with the raster improvements; that does not mean zero improvement in RT, lmao.


----------



## spnidel (Dec 12, 2022)

LOL, the 7900 XT is a JOKE of a card even compared to the 6950 XT: 10% faster than a 6950 XT. What kind of generational leap is this??? A fucking joke.
If it cost $700, OK, that'd be a great card, but at $900? This is dogshit.


----------



## tfdsaf (Dec 12, 2022)

Realistically I expected the 7900 XTX to be a bit faster; I certainly expected it to be 10% better than the 4080 on average. It is much closer to the 4080, anywhere from 15% slower to 25% faster depending on the game, and on average just 5% faster than the 4080. Still a great result, mind you, as it costs $1,000, which is at least $200 less than the 4080; and considering most custom models of the 4080 are around $1,400, that is pretty much $400 cheaper and about 5% faster! 

So, a great result overall, but slightly disappointing as well, because I thought it would be faster than it ended up being. 

There are clearly performance issues in certain games that are getting really poor results, so hopefully AMD can sort these out in the coming weeks and improve performance in those titles! With AMD and how they operate, I fully expect these GPUs to gain at least 5% more performance over the next 2-3 months, and the power draw issues with media playback and multi-monitor to be fully resolved!


----------



## Acesbong (Dec 12, 2022)

birdie said:


> This is a disaster for the consumer.
> 
> Instead of soundly beating NVIDIA at least in raster AMD offers comparable performance at the comparable price while not offering unique distinguishing features like e.g. DLSS 3.0. And RTRT performance is again hugely lacking though AMD has managed to reach ... Ampere levels of performance.
> 
> ...



There needs to be a price-fixing investigation; I don't know how they have gotten away with this for so long.


----------



## ARF (Dec 12, 2022)

3x0 said:


> Bulldozer was in some cases slower than the previous gen Phenoms. That was the majority of the disappointment behind Bulldozer. Is RDNA3 slower in some cases than RDNA2?



Probably, if you normalise for the change in shader count and frequency.
Look at how close the 7900 XT is to the 6950 XT: 126 vs 113 is literally the same performance.




AMD Radeon RX 7900 XTX review - The Witcher III Wild Hunt (guru3d.com)


----------



## MojoTheJester (Dec 12, 2022)

I'm surprised at just how big the differences are with the different CPU at 1080p compared to at 4K. Makes me wonder how low my scores would be with this GPU, as I have a Ryzen 5 2600X.


----------



## MxPhenom 216 (Dec 12, 2022)

Crackong said:


> Same performance as 4080, 8GB more VRAM, not sized as a brick, and *$200 cheaper*


For now...


----------



## Luminescent (Dec 12, 2022)

That chiplet bullshit seems to benefit cost cutting more than bringing anything revolutionary like performance or lower power draw.
Basically, Nvidia beat them everywhere: performance, power efficiency, productivity. The only place left where AMD can claim to compete is *price.*
And again with the drivers: that 103 W multi-monitor power consumption is crazy, and 88 W for video playback.


----------



## Pumper (Dec 12, 2022)

dgianstefani said:


> Do you understand that if you increase raster by 50%, and RT performance stays the same, you would still have zero improvements in RT frame rate?
> 
> The RT improvements are in line with the raster improvements, this does not mean zero improvements in RT lmao.


Keep pretending you don't understand what I'm saying if that makes you feel better.


----------



## 3x0 (Dec 12, 2022)

ARF said:


> Look at how close the 7900 XT is to 6950 XT  126 vs 113 is literally the same performance.


*XTX* vs 6950XT


----------



## WhoDecidedThat (Dec 12, 2022)

Come on NVIDIA, release a compact 4080 with dual 8-pin power connectors, it's efficient enough. Who knows... maybe the 4080 sales will stop languishing so much with a smol card.


----------



## geniekid (Dec 12, 2022)

@W1zzard Nice job with the review.  

No real surprises for me, except noise-at-load which was disappointing just because the 6900XT was so incredible in this regard.


----------



## Akkedie (Dec 12, 2022)

Incredibly disappointing. They really needed to beat the 4080 in raster across the board for it to even have a chance, because once we start counting RT, it's a complete bloodbath, never mind DLSS 3. Seeing RDNA 3 not improve the RT % performance loss compared to RDNA 2 is just baffling. GG


----------



## Neo_Morpheus (Dec 12, 2022)

terroralpha said:


> This means no price disruptions.


That can be fixed easily: stop giving Nvidia money, regardless of what they have to offer. This is the sh!t: people want AMD to produce a 4090 killer but offer it at US$500, just so Nvidia is "forced" to cut prices so the rabid followers can still give Nvidia their money.


birdie said:


> Instead of soundly beating NVIDIA


Aah yes, you want a 4090 killer that only sells for US$500.


birdie said:


> AMD offers comparable performance at the comparable price


Maybe I read another review, but AMD is cheaper with equal or faster performance.


birdie said:


> not offering unique distinguishing features like e.g. DLSS 3.0


Those are not "unique features", they are "lock-in features" which I have always avoided instead of desiring them.


birdie said:


> And RTRT performance is again hugely lacking


The number of games that support RT is simply minuscule compared to the immense library of games (that many of us haven't played yet) that don't have it and don't need it.

Of the games that have it, very, very few give you the visual reward RT is supposed to deliver, and at an insane performance hit.

Wait, aren't you the same "birdie" that is always trashing AMD and kissing Nvidia's behind at Phoronix?

Anyway, AMD should rename the 7900 XT to 7800 XT, and both GPUs (7900 XT and XTX) should be priced lower. But then again, it seems that nobody really complains about how high the 3090 Ti and 4090 MSRPs were, so why not ask for the moon at this point?


----------



## W1zzard (Dec 12, 2022)

Denver said:


> Is there any logical reason to measure performance per watt based on CP2077 and not the overall average?


Power is measured on a different machine than the normal performance. Testing all games would take forever, more than perf/watt is worth. I picked CP because it's highly GPU bound and everybody definitely has their drivers well-optimized for that title.
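The metric itself is just average FPS divided by average board power on the same run. A sketch, with made-up numbers for illustration:

```python
def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    """Frames delivered per watt of board power; higher is better."""
    return avg_fps / avg_power_w

# Hypothetical Cyberpunk 2077 4K results, for illustration only.
cards = {"Card A": (84.0, 356.0), "Card B": (95.0, 304.0)}
for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.3f} FPS/W")
```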


----------



## Acesbong (Dec 12, 2022)

dir_d said:


> I guess those rumors about AMD hardware bugs were right. In the AMD slides, I swear AMD themselves said this card would push 3 GHz. If it actually could have reached that frequency, I think it could have been a competitor to the 4090. Right now the card is kinda all over the place in FPS, frame times, and power consumption. The card was clearly not ready, but they pushed it out anyway. Let's see what they can do to salvage this in the next couple of months with drivers.


I dunno, the overclocking results don't seem to scale so well with frequency.


----------



## tfdsaf (Dec 12, 2022)

jabbadap said:


> It did, but there's no games using SER at default yet...
> 
> Sad to say but rdna2 was more competitive with ampere than rdna3 is to ada(probably just because of TSMC).


Nvidia's cards use the 4 nm process while AMD is using the 5 nm and 6 nm processes, but Nvidia is paying more, and thus we see a $1,600 4090 and a $1,200 4080, while AMD is able to offer slightly cheaper cards.

I don't think Nvidia can reduce the price of either of the 4000-series GPUs by much, as they would likely be facing selling at cost if they lowered the RTX 4080 to, say, $900; anything less than that and I think they are likely to be selling at a loss.

I think the RTX 4090 can be sold for quite a bit less before essentially selling at cost, but Nvidia is not going to do that! I believe they could lower the RTX 4090 to $1,100 and sell it at cost. 

Again, people also need to consider R&D, marketing, testing, driver and feature development, etc. when accounting for the cost of these GPUs. If it's purely the cost of producing them, plus packaging, plus shipping, then you can easily go down to something like $400 for the RTX 4090, but you have to understand that they need to leave leeway for AIB partners to make money, and you have to account for the cost of developing these cards in the first place! 

I think AMD is in a slightly better position to enter a price war with Nvidia, but they are also limited in how "cheap" they can go. 

To me the 7900 XTX is extremely competitive with the RTX 4080: on average about 5% faster and $200 cheaper! We also see several games in which the 7900 series clearly has performance issues, and AMD is going to have to address these in upcoming driver releases. I'm certain these cards will gain at least 5% more performance in the next few months as AMD irons out issues and optimizes some games through drivers. We clearly see in every past generation that their GPUs gain significant performance over time through driver optimization!

Nvidia, on the other hand, pretty much never gains performance in subsequent driver releases, and in some cases actually loses performance as Nvidia fixes stability issues in various games!


----------



## btk2k2 (Dec 12, 2022)

ARF said:


> Probably if you normalise for the shaders quantity change and frequency.
> Look at how close the 7900 XT is to the 6950 XT: 126 vs 113 is literally the same performance.
> 
> View attachment 274162
> AMD Radeon RX 7900 XTX review - The Witcher III Wild Hunt (guru3d.com)








To be honest, this is about where I figured the cards would rank on the average metrics, give or take a bit: the 7900 XTX comfortably sitting between the 4080 and 4090, with the 7900 XT close to the 4080 on average.

What it shows me is that AMD is not extracting all the available ILP at the moment, so some games see crap gains and others are pretty damn good. I am sure that if the clocks were as high as AMD wanted to begin with, then even in the likes of Witcher 3, where the ILP is pretty low and not making good use of the dual-issue shaders, raster performance would be above that of the 4080; and in cases where AMD has extracted ILP for their shader design, you get situations where the XTX matches or even beats the 4090.


----------



## W1zzard (Dec 12, 2022)

Pumper said:


> Where is the performance increase? There is none, it's in the margin of error.


I feel like my error is considerably lower than that. I think the improvements are in those 2-3%.


----------



## john_ (Dec 12, 2022)

3x0 said:


> Buldozer was in some cases slower than the previous gen Phenoms. That was the majority of the disappointment behind Buldozer. Is RDNA3 slower in some cases than RDNA2?


Dual ALUs in Bulldozer, dual ALUs in RDNA3 (again if I am describing it wrong, please feel free to correct me).

That's why I was thinking Bulldozer.
As for performance, it's not much faster. It's not as fast as AMD was promising in its slides. In fact, if we use the 12288 number for the SPs, performance is nowhere near where we should be expecting it. So calling RDNA3 faster is technically correct, but not in a meaningful way.


----------



## W1zzard (Dec 12, 2022)

MojoTheJester said:


> I'm surprised at just how big the differences are with the different CPU at 1080p compared to at 4k. Makes me wonder how low my scores would be with this GPU, as I have a Ryzen 5 2600x

Intel Core i7-13700K Review - Great at Gaming and Applications
With the Core i7-13700K, Intel has built a formidable jack-of-all-trades processor. Our review confirms that it offers fantastic application performance, beating the more expensive Ryzen 9 7900X, and in gaming it gets you higher FPS than any AMD processor ever released, delivering an experience...
www.techpowerup.com

You should be able to extrapolate from that data and this review. But these cards are simply not made for 1080p; they are super CPU-bound at that resolution. You can see it even with the 13900K, where results bunch up against an invisible wall (like Borderlands 3 at 1080p).


----------



## terroralpha (Dec 12, 2022)

Dristun said:


> Well, it's... okay, I guess? However with these numbers Huang can either simply continue the giga-milking mode or deliver a fatality just by dropping 4080 to a $1000. Unfortunate for everyone on the consumer side. GG to AMD for enabling better margins _for themselves _if they're really saving by going with chiplets.


from my experience, people don't really care about the details. the only thing that seems to move product is brand perception. for example, i was at a microcenter about 3 weeks ago, and saw two different individuals walk out with $700 RTX 3070ti cards (1 asus strix and 1 msi suprim) when they could have got a 6900XT at the same store for only $650. the sales people tried to steer them in the right direction but they were not having it. same trend everywhere. newegg sold out of $800 RTX 3080s long before selling out their $700-$750 RX 6900 XTs.

perception trumps all else and in light of that i don't see huang taking his foot off the "moar money" gas pedal. as long as they have the halo product (rtx 4090) people will just mindlessly flock to them. and i'm not an nvidia hater by any stretch of imagination. i bought a 3090 on release day myself.


----------



## W1zzard (Dec 12, 2022)

btk2k2 said:


> To be honest this is where I figured the cards would rank on the average metrics give or take a bit.


I think Guru3D is testing the in-game benchmark, not the actual game. Maybe AMD optimized their shader compiler for dual issue only for the integrated benchmarks?


----------



## GhostRyder (Dec 12, 2022)

Well, I have to say I am somewhat happy with its performance, as it is better in the way most people are actually going to use it. But I am disappointed that it's not further above the 4080, though I had a feeling it wouldn't be based on the way AMD was advertising it. Memory overclocking was interesting, as regular overclocking barely helps, if at all. I wonder if they are being held back by relying on the Infinity Cache instead of GDDR6X memory. I get that they chose it because of that memory's power consumption, but the performance jump from overclocking the memory was pretty significant.

As for everyone complaining about ray tracing performance: in all seriousness, I want to know if people are actually using the tech in the small list of games that support it. I get that it's something to compare and talk about when lining up all the cards, but let's be frank, the performance drop, even with DLSS enabled, is still huge to this day. The only card that delivers reasonable performance with everything enabled is the 4090, and even it struggles in many of the games.


----------



## defaultluser (Dec 12, 2022)

HD64G said:


> CPU bottlenecks very hard in some games so the ~45% uplift vs 6900XT @4K is pessimistic. For any game utilising the GPU properly it get closer to 60% and in some games even at 80% over the 6900XT (on other online reviews). So the up to 54% higher efficiency is true.




well, they stepped up this time to a Raptor Lake i9 (so the 4K results should be pretty GPU-limited)

*maybe swap 1080p for 5K, just for the top-end cards; also, another new AMD-smackdown RT title is missing from this test: Portal RTX!*


----------



## dir_d (Dec 12, 2022)

Acesbong said:


> I dunno, the overclocking results don't seem to scale so well with frequency.


That's why I think there is something wrong with the hardware; the slides don't add up. Where's the performance-per-watt and the raster performance they showed? It just seems like something went wrong and AMD did not hit the marks they were aiming for in the slides. Either that, or the drivers and AMD's marketing are really bad this time around.


----------



## Bomby569 (Dec 12, 2022)

terroralpha said:


> from my experience, people don't really care about the details. the only thing that seems to move product is brand perception. for example, i was at a microcenter about 3 weeks ago, and saw two different individuals walk out with $700 RTX 3070ti cards (1 asus strix and 1 msi suprim) when they could have got a 6900XT at the same store for only $650. the sales people tried to steer them in the right direction but they were not having it. same trend everywhere. newegg sold out of $800 RTX 3080s long before selling out their $700-$750 RX 6900 XTs.
> 
> perception trumps all else and in light of that i don't see huang taking his foot off the "moar money" gas pedal. as long as they have the halo product (rtx 4090) people will just mindlessly flock to them. and i'm not an nvidia hater by any stretch of imagination. i bought a 3090 on release day myself.



People and reviewers keep throwing that at buyers, but how many people has AMD burned in the last couple of years? It's not exactly like that "perception" came out of thin air.


----------



## spnidel (Dec 12, 2022)

btk2k2 said:


> What it shows to me is that AMD are *not extracting all the ILP available at the moment so some games see crap gains and others are pretty damn good*. I am sure if the clocks were as high as AMD wanted to begin with even the likes of Witcher 3 where the ILP is pretty low and not making good use of the dual issue shaders the raster performance would be above that of the 4080 and *then in cases where AMD have extracted ILP for their shader design you get situations where the XTX matches or even beats the 4090*.


vega 64 all over again. not even joking. I'm sick of AMD repeating the same mistakes over and over again. have they not learned?


----------



## Dristun (Dec 12, 2022)

By the way, regarding RT performance, checking the RDNA3 architecture slides — there's only stuff about efficiency and algorithm optimizations there for RT. So I suppose AMD's RT cores are still not doing BVH traversal and final ray shading, offloading them to the standard cores just like RDNA2 did — hence the performance drop in games is more or less the same. Maybe once game devs wrap their heads around the architectural improvements they can squeeze out a bit more, but if the cores are still not as "complete" as Nvidia's and Intel's, there are no surprises here.



terroralpha said:


> from my experience, people don't really care about the details. the only thing that seems to move product is brand perception. for example, i was at a microcenter about 3 weeks ago, and saw two different individuals walk out with $700 RTX 3070ti cards (1 asus strix and 1 msi suprim) when they could have got a 6900XT at the same store for only $650. the sales people tried to steer them in the right direction but they were not having it. same trend everywhere. newegg sold out of $800 RTX 3080s long before selling out their $700-$750 RX 6900 XTs.
> 
> perception trumps all else and in light of that i don't see huang taking his foot off the "moar money" gas pedal. as long as they have the halo product (rtx 4090) people will just mindlessly flock to them. and i'm not an nvidia hater by any stretch of imagination. i bought a 3090 on release day myself.



Lamentable! Also I can't see AMD ever beating Nvidia's halo product, I don't think they have the r&d budget. Plus 4090 isn't even a full die.


----------



## W1zzard (Dec 12, 2022)

defaultluser said:


> well, they stepped-up this time to Raptor Lake i9 (so the 4k should be pretty GPU-limited)


Just to clarify, I've retested every single card on the new rig; the 6900 XT numbers were tested last week on the 22.11.2 drivers


----------



## Luminescent (Dec 12, 2022)

Neo_Morpheus said:


> That can be fixed easily, stop giving nvidia money, regardless of what they have to offer. This is the sh!t, people want AMD to produce a 4090 killer, but offer it at US$500, just so Nvidia is "forced" to cut prices so the rabid followers would still give nvidia their money.


I'm gonna leave this here
Top Nvidia shareholders 
Vanguard Group Inc. representing 7.7% of total shares outstanding​BlackRock Inc. representing 7.2% of total shares outstanding​Top AMD shareholders
Vanguard Group Inc. representing 8.28% of total shares outstanding​BlackRock Inc. representing 7.21% of total shares outstanding​


----------



## OfficerTux (Dec 12, 2022)

Some people here sound like AMD has personally insulted them by releasing this abysmal disgrace of a product. If you don't like it, don't buy it. It's that simple.

Here in Germany the cheapest RTX 4080 is 1350€, if AMD actually manages to have XTX cards in stock tomorrow for 1150€ that would be 200€ or ~15% cheaper for the same raster performance.

I think I'm finally going to replace my Vega 64, just need a waterblock to go along with the card. Does anybody know when alphacool releases theirs?


----------



## spnidel (Dec 12, 2022)

Luminescent said:


> I'm gonna leave this here
> Top Nvidia shareholders
> Vanguard Group Inc. representing 7.7% of total shares outstanding​BlackRock Inc. representing 7.2% of total shares outstanding​Top AMD shareholders
> Vanguard Group Inc. representing 8.28% of total shares outstanding​BlackRock Inc. representing 7.21% of total shares outstanding​


hahaha, OK, the prices now make total sense


----------



## Argyr (Dec 12, 2022)

tfdsaf said:


> Nvidia on the other hand pretty much never gain performance in subsequent driver releases and in some cases actually lose performance as Nvidia fix stability issues in various games!


categorically false


----------



## BigMack70 (Dec 12, 2022)

Maybe I'm weird but I just don't understand how any of these $900-1200 cards make any sense to purchase. If I'm going to be spending four figures (or just shy of it) on a graphics card, I want the best. And none of these are the best - the 4090 trounces them.

I either want the best, or I want some value, and these cards exist in a no-man's land for me. Where are the great new GPUs between $500-700? Nowhere in sight. Both AMD and Nvidia are refusing to move the needle on performance in what was historically the high-end pricing tier.


----------



## dir_d (Dec 12, 2022)

OfficerTux said:


> Some people here sound like AMD has personally insulted them by releasing this abysmal disgrace of a product. If you don't like it, don't by it. It's that simple.
> 
> Here in Germany the cheapest RTX 4080 is 1350€, if AMD actually manages to have XTX cards in stock tomorrow for 1150€ that would be 200€ or ~15% cheaper for the same raster performance.
> 
> I think I'm finally going to replace my Vega 64, just need a waterblock to go along with the card. Does anybody know when alphacool releases theirs?


For me the card would be fine if it was consistent, but it's not; it's all over the place. Some people have been burned when AMD never fixed the issues. With the 6000 series AMD fixed the issues, but why can't they get it right the first time? Every year AMD has issues; at $1k we should at least get a consistent card.


----------



## Neo_Morpheus (Dec 12, 2022)

Luminescent said:


> I'm gonna leave this here
> Top Nvidia shareholders
> Vanguard Group Inc. representing 7.7% of total shares outstanding​BlackRock Inc. representing 7.2% of total shares outstanding​Top AMD shareholders
> Vanguard Group Inc. representing 8.28% of total shares outstanding​BlackRock Inc. representing 7.21% of total shares outstanding​


Interesting, but those two only account for less than 20% of the shares, so who holds the other 80%, and are they also in cahoots?


----------



## TheinsanegamerN (Dec 12, 2022)

Bomby569 said:


> People and reviewers keep throwing that at buyers. How many people was AMD burned in the last couple of years? It's not exactly like "perception" came out of think air.


Every time someone points that out you start hearing the cope brigade: "oh well, nvidia has the same problems (lolno)", "oh, nvidia is evil", "gamers just don't care about AMD", etc.

Few are willing to admit that AMD dug that hole for themselves, starting in 2006 when they overpaid by over $2 billion for ATi, and the subsequent decade of crap drivers and poorly supported releases earned AMD the reputation of the budget brand. Even recently we had the RDNA downclocking bug and the first-gen APUs relying on OEMs for drivers, both of which required pressure from tech media to fix.

Also, on a side note: did you know AMD still sells Bulldozer chips for Chromebooks? They suck, and they have that nice red AMD sticker to draw your attention to who made this slow garbage. That's helping the public image. Like you said, perception is everything, and AMD's optics have not been all that great, historically speaking


BigMack70 said:


> Maybe I'm weird but I just don't understand how any of these $900-1200 cards make any sense to purchase. If I'm going to be spending four figures (or just shy of it) on a graphics card, I want the best. And none of these are the best - the 4090 trounces them.
> 
> I either want the best, or I want some value, and these cards exist in a no-man's land for me. Where are the great new GPUs between $500-700? Nowhere in sight. Both AMD and Nvidia refusing to move the needle on performance in what was historically the high end pricing tier.


What you consider the best, or having some value, is subjective. For those who chase high frame rates but don't use RT there are multiple high-end GPUs to choose from; those who want 4K are going to need more power than 1440p, etc. There's also the angle of these high-end GPUs lasting a long time, whereas mid-range GPUs will need to be replaced more often to handle newer games at settings higher than poverty tier. Those 480s and 1060s are not holding up as well as the 1080 Ti in modern games, for example.


Neo_Morpheus said:


> Interesting, but they only count for less than 20% of shareholders, so who has the other 80% and are they also in cahoots?


Didn't you know? Without BlackRock or Vanguard, AMD would be selling the 7900xtx for $250 to compete with the $300 4090, and all CPUs would have a billion cores!


----------



## Drash (Dec 12, 2022)

£630 all UK taxes included for a MSI 6900 XT Trio X seems like a great deal now. Spends most of its time at 2500MHz, and I'm playing CP2077 Ultra with Ultra ray tracing at 40-100 fps (1080p). Wish I'd bought 2 as the wife "needs" something similar for the Witcher 3 remaster. I'd love a 4090 if someone gave it to me because prices are stupid, and it now looks like the stupid is set to continue a bit longer.


----------



## W1zzard (Dec 12, 2022)

Luminescent said:


> I'm gonna leave this here
> Top Nvidia shareholders
> Vanguard Group Inc. representing 7.7% of total shares outstanding​BlackRock Inc. representing 7.2% of total shares outstanding​Top AMD shareholders
> Vanguard Group Inc. representing 8.28% of total shares outstanding​BlackRock Inc. representing 7.21% of total shares outstanding​


Umm .. you understand what Blackrock and Vanguard are?


----------



## phanbuey (Dec 12, 2022)

BigMack70 said:


> Maybe I'm weird but I just don't understand how any of these $900-1200 cards make any sense to purchase. If I'm going to be spending four figures (or just shy of it) on a graphics card, I want the best. And none of these are the best - the 4090 trounces them.
> 
> I either want the best, or I want some value, and these cards exist in a no-man's land for me. Where are the great new GPUs between $500-700? Nowhere in sight. Both AMD and Nvidia refusing to move the needle on performance in what was historically the high end pricing tier.



QFT. This is exactly how the majority of the market feels, IMO. If I'm gonna drop $1000-1300, I'm not going to want to sacrifice; otherwise I'm getting a $550 6800 XT with 3 games and calling it a day.

Both of these need a 20% price cut to make any sense.


----------



## OfficerTux (Dec 12, 2022)

dir_d said:


> For me the card would be fine if it was consistent, but it is not and its all over the place. Some people have been burned where AMD never fixes the issues. in the 6k series AMD fixed the issues but why cant they get it right, every year AMD has issues at 1k we should at least get a consistent card.


That's valid criticism. A premium-priced product should also be of premium quality.

The fact that there are only two (three with Intel) companies left which can design GPUs with the current performance shows how complex it has become, so "getting it right every year" might not be as easy as it sounds. I hope AMD can again deliver some "fine wine TM" and improve consistency over all games / applications with driver updates.


----------



## Tropick (Dec 12, 2022)

ARF said:


> Probably if you normalise for the shaders quantity change and frequency.
> Look at how close the 7900 XT is to 6950 XT  126 vs 113 is literally the same performance.
> 
> View attachment 274162
> AMD Radeon RX 7900 XTX review - The Witcher III Wild Hunt (guru3d.com)


And that's a stock 6950XT... if you spend some time tuning the thing you could easily squeeze another 6-7% out of it, even more with a high-end board partner model. My 6950XT OCF can sustain 2.7-2.75GHz on the core and 2.35GHz on the memory, and I snagged that thing for a relatively low $730 after rebate on Black Friday. I haven't done any solid testing, but I'd imagine it just about 1:1's a 7900XT in raster. Definitely not AMD's best showing...
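The "normalise for shader count and frequency" idea in the quote can be written out. The FPS figures are the Guru3D Witcher 3 numbers quoted above (7900 XT = 126, 6950 XT = 113); the clocks and the dual-issue-doubled shader count are rough assumptions on my part, so treat the result as illustrative only:

```python
# Per-shader, per-clock throughput: FPS / (shaders * GHz).
# The 7900 XT count treats each dual-issue ALU pair as two shaders,
# the same accounting behind the 12288 figure for the XTX; clocks are
# assumed ballpark game clocks, not measured values.

def fps_per_shader_ghz(fps, shaders, ghz):
    return fps / (shaders * ghz)

xt_7900 = fps_per_shader_ghz(126, 10752, 2.4)   # dual-issue-doubled count
xt_6950 = fps_per_shader_ghz(113, 5120, 2.31)

# How much more work each "shader-GHz" does on RDNA2 under this accounting:
print(round(xt_6950 / xt_7900, 2))
```

Under the doubled-shader accounting RDNA2 looks almost twice as productive per shader-GHz here, which is just another way of saying the second issue slot is rarely being filled in this title.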


----------



## Frick (Dec 12, 2022)

W1zzard said:


> Umm .. you understand what Blackrock and Vanguard are?



Shady corporations hell bent on world domination, anyone saying anything else is in their employ! (this is a joke)

In any case ... a big yawn from me. Announced prices here €1083 and €1193. 4080's go for €1500+, so if they actually manage to sell for that it'd be good I suppose but still. Yawn, with a bit of blergh.


----------



## phill (Dec 12, 2022)

From what I could see and understand from the review, AMD have really made a decent card. 

Yes, the drivers seem to have a bit of an issue when it comes to wattage while watching a video or something, but meh, that will get sorted, I'm sure. For the cost, looks and everything else, I'd gladly take an aftermarket card

Thank you W1zzard for a great review as always


----------



## vMax65 (Dec 12, 2022)

Both good and bad for me. The power usage is surprising, as I thought the 7900XTX would be far more efficient, and at the same time I was hoping for a bit more of a beating given to the 4080 so that Nvidia would have no choice but to drop prices... Hopefully Nvidia will drop prices, but I fear it will only be a small cut. Overall though, the 7900XTX has done well on the RT side by getting to the 3090/Ti level, which is not bad at all. The 7900XT though is a bust, and I just cannot understand why it was priced only $100 less, other than to take a leaf out of Nvidia's playbook. Just pay the $100 more for the XTX.

Bottom line, these are still too expensive as the 7900XTX should have been no more than $899 and the 7900XT at $759. The 4080 also should have been no more than $899 and that is still taking the you know what...


----------



## rrrrex (Dec 12, 2022)

0.1 V is too low; you need at least 0.5-0.6 V to open a transistor. The clocks aren't real either.


----------



## andreiga76 (Dec 12, 2022)

Neo_Morpheus said:


> That can be fixed easily, stop giving nvidia money, regardless of what they have to offer. This is the sh!t, people want AMD to produce a 4090 killer, but offer it at US$500, just so Nvidia is "forced" to cut prices so the rabid followers would still give nvidia their money.
> 
> Aah yes, you want a 4090 killer that only sells for US$500.
> 
> ...


We want AMD to have ANYTHING to compete with the 4090 at the same price. That way we all win through competition: Nvidia would have to drop the price of the 4090 and release a 4090 Ti with the full chip unlocked at a $1600 price, AMD would do the same, and so on. But at this moment AMD has nada, zip, zero, because their engineers couldn't compete with Nvidia's.


----------



## terroralpha (Dec 12, 2022)

ARF said:


> Probably if you normalise for the shaders quantity change and frequency.
> Look at how close the 7900 XT is to 6950 XT  126 vs 113 is literally the same performance.
> 
> View attachment 274162
> AMD Radeon RX 7900 XTX review - The Witcher III Wild Hunt (guru3d.com)



reminds me of the fury and the subsequent vega cards. when you downclocked a vega 64 card to fury x clock speeds, the performance was identical.


----------



## wNotyarD (Dec 12, 2022)

If AMD's recent history is anything to go by, then we can expect two things:
1 - FineWine™ will make both 7900s a bit better, especially that low load power consumption;
2 - The chiplet design will improve in two generations time, one at the earliest (see the jump from Ryzen 1000 to 3000 series).


----------



## Tropick (Dec 12, 2022)

TFW the 7900XTX, pushed balls to the wall, beats top-end RDNA2 by a whopping ~43 FPS in raster.


----------



## N3M3515 (Dec 12, 2022)

Psychoholic said:


> So matches the 4080 while being quite a bit cheaper, Nicely done imo.


Matches an incredibly overpriced gpu.........
So the 7900XTX is, in reality, a 7800XT and should be $700.

This gpu is overpriced by $300.

Same shit as the 4080.
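For what it's worth, the pricing argument can be put in perf-per-dollar terms using this thread's own numbers ($999 vs $1199 MSRP, a ~3% raster gap); a back-of-envelope sketch, not a benchmark:

```python
# Relative raster value: both prices are MSRPs and the ~3% gap is the
# figure cited in this thread, so this is a framing device, not data.

def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

xtx = perf_per_dollar(1.03, 999)
rtx_4080 = perf_per_dollar(1.00, 1199)

# Raster-per-dollar advantage of the XTX over the 4080 at MSRP:
print(round(xtx / rtx_4080, 2))
```

Roughly a 24% value edge at MSRP, which is real but says nothing about whether either baseline price is reasonable, which is the actual complaint here.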


----------



## TheinsanegamerN (Dec 12, 2022)

terroralpha said:


> reminds me of the fury and the subsequent vega cards. when you downclocked a vega 64 card to fury x clock speeds, the performance was identical.


Don't forget getting a $100 bump UP in price after a disappointing launch. Lmfao.....

The 7900xtx at least is a good price. The upgrade from the 6950xt is respectable, ignoring the driver issues. The 7900xt is DOA at $900 though; that should have been an $800 card max.


----------



## iGigaFlop2 (Dec 12, 2022)

Well, while it's a good product, I don't think the $200 difference will sway people from buying the 4080. It's closer in performance to the XTX than I thought it would be, and it has the superior feature set and better ray tracing. When you're spending $1000, what's $200 more? I still think it's a great GPU, but the XT is a dud; it should've been $200 cheaper or more.


----------



## 3x0 (Dec 12, 2022)

TheinsanegamerN said:


> The 7900xtx at least is a good price. The upgrade from the 6950xt is respectable, ignoring the driver issues. The 7900xt is DOA at $900 though, that should have been a $800 card max.


Well, considering the price difference between the 6900 XT and 6800 XT, compared to which the 7900 XT is similarly cut down, the 7900 XT should have been a 7800 XT priced at no more than $700.


----------



## rojo (Dec 12, 2022)

Frick said:


> Shady corporations hell bent on world domination, anyone saying anything else is in their employ!
> 
> In any case ... a big yawn from me. Announced prices here €1083 and €1193. 4080's go for €1500+, so if they actually manage to sell for that it'd be good I suppose but still. Yawn, with a bit of blergh.



Damn right! Them and the WEF

And I agree with others in this thread: AMD should have priced these new cards at $100 to $200 less than they are now.


----------



## N3M3515 (Dec 12, 2022)

kapone32 said:


> from what I can see the XTX is about 60% faster overall than my card


60% faster for 60% more money..............................LOL


----------



## BigMack70 (Dec 12, 2022)

TheinsanegamerN said:


> What you consider the best or having some value is subjective.



I get what you're saying, but I don't fully agree. The GPU market has been pretty consistent for 10 years that high end GPUs are available between $500-700. GPUs costing $1k+ have _always _been premium products with sky high pricing only justified by the fact that they were the best. Until now, no company has tried to sell a value proposition north of $800. This is the first time we have ever seen companies try to release $1000 graphics cards that are solidly second tier products.

So, I think there are objective criteria by which we can say "these products make zero sense to purchase". The only way these products make sense is if you are willing to agree with Nvidia and AMD that the crypto-pandemic GPU price explosion has set a new normal for the graphics card market.

I do not agree. Consoles are not exploding in price. CPUs are not exploding in price. The Steam Deck is not exploding in price. Only graphics cards (and motherboards, sort of). And it's _objectively _nonsensical. I understand that some people will make subjective purchasing decisions and be happy with these products, and good for them - buy what you want and enjoy it.

But second tier GPUs have no business being this expensive. AMD and Nvidia could almost certainly sell these products around $700-800 and turn a profit. They just don't want to, and they're betting there are enough consumers willing to accept the "new normal" that they will be able to sell product at a historically ridiculous price level. They might be right, but that doesn't make it suddenly true that there is value in these purchases.

I expect RDNA 3 will be sitting on shelves the same way the 4080 is sitting on shelves.


----------



## Pumper (Dec 12, 2022)

GhostRyder said:


> As for everyone complaining about Ray Tracing performance.  In all seriousness I want to know if people are seriously using the tech in the small list of games it supports.


That's not really the point of the complaints. AMD claimed that RDNA3 will have improved RT performance vs. 6000 series, but it ended up being the same.


----------



## Bomby569 (Dec 12, 2022)

no matter how much you polish a turd or compare it to an even bigger turd, a turd is a turd


----------



## Vayra86 (Dec 12, 2022)

the54thvoid said:


> I'd say, why is the 4090 so damn fast? That's a helluva card (too pricey for me) and was always going to be a reach for AMD to catch it.
> 
> I think the XTX is good but the XT would be better at $800 max (or ideally, £799 UK equiv.)


This. RDNA3 is fine, it's just priced too high.

Same applies to Nvidia.

As for the early hardware behaviour, like the high idle TDP and that high peak clock which then quickly drops 200 MHz... that looks fixable, and it's likely there is some performance left on the table, especially given its voltage under load vs fan speed vs temps: hotspot only at 74-ish degrees and GPU at 58 C. That's extremely low!

Idle usage is obviously related to MCM. This is just new territory, and the memory or core might not clock down aggressively enough. Not exciting or new, imho. Could it have been more refined at launch, though? I certainly think it should have been. First impressions matter.


----------



## Psychoholic (Dec 12, 2022)

N3M3515 said:


> Matches an incredibly overpriced gpu.........
> So, the 7900XTX, in reality is a 7800XT, and should be $700.
> 
> This gpu is overpriced by $300.
> ...



Oh, I'm aware the 4080 is overpriced, I own one, lol

Just saying, not a bad showing from the XTX considering the price VS the 4080.


----------



## kapone32 (Dec 12, 2022)

N3M3515 said:


> 60% faster for 60% more money..............................LOL


Sorry, I got my 6800xt when they were $1400 in Canada, if you don't remember. Yes, they are expensive, but that does not mean they are not worth it. It doesn't matter what you say anyway; I will be buying a card regardless. Complaints about an AMD card's performance in a day-one review should always carry the caveat that, based on history, this card will be faster in 2 years than it is today, and more efficient too.


----------



## N3M3515 (Dec 12, 2022)

spnidel said:


> you don't get it man, the 8gb less VRAM and 16% better RT performance is worth the extra $200 bucks


lol...................how can you justify a $1200 gpu? BOTH the 4080 and the 7900XTX are rip-offs. People thought the 7900XTX was going to be well priced because everyone believed it would be +20% faster than the 4080.



kapone32 said:


> Sorry I got my 6800xt when they were $1400 in Canada. If you don't remember. Yes they are expensive but that does not mean they are not worth it. It doesn't matter what you say anyway I will be buying a card regardless. Complaining about an AMD card's performance on a day one review should always have the caveat that based on history this card will be faster in 2 years than it is today and more efficient too.


Sorry, i'll be keeping my 6800XT for another gen i guess.


----------



## Cheeseball (Dec 12, 2022)

They need to get their drivers to read the monitor's EDID properly so that idle clocks are on par with both NVIDIA and Intel. Using CRU is still the workaround for that issue.

Other than that, Adrenalin is pretty much fine for what it is (aside from Enhanced Sync being useless nowadays).






Yes, of course I've reported this issue through their Bug Report tool. It's actually been 3 years and there's still no notable movement on it. (Find my 5700 XT thread)
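One commonly offered explanation for why CRU helps, stated here as an assumption rather than AMD's confirmed root cause: VRAM can only reclock cleanly during vertical blanking, so display modes with tight blanking intervals keep memory pinned at full clock, and CRU lets you extend the blanking in the EDID. The arithmetic:

```python
# Microseconds of vertical blanking per frame for a given mode. The 4K60
# timing below is the standard CTA-style 2250-total-line mode; the 4K144
# line uses illustrative reduced-blanking numbers, not an exact spec mode.

def vblank_us(v_total, v_active, refresh_hz):
    """Time per frame spent in vertical blanking, in microseconds."""
    frame_us = 1_000_000 / refresh_hz
    return frame_us * (v_total - v_active) / v_total

print(round(vblank_us(2250, 2160, 60), 1))   # ~666.7 us of blank time
print(round(vblank_us(2222, 2160, 144), 1))  # a tight high-refresh mode leaves far less
```

If the reclock takes longer than the blank, the driver's safe option is to never drop the memory clock, which matches the high multi-monitor and high-refresh idle draw people are reporting.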


----------



## ChaoticG8R (Dec 12, 2022)

I'm optimistic that the board partners will push the XTX even further. It's fairly obvious that Nvidia has put some incredible effort into ensuring their in-house cooler/design is competitive with AIBs, while AMD's feels more like a true bare-minimum "reference" design.

I assume we'll have some AIB reviews tomorrow on launch, @W1zzard, or is AMD going to stagger them again like before? If you can't give a yes or no due to NDA, can you at least acknowledge that?

Thanks in advance


----------



## TheoneandonlyMrK (Dec 12, 2022)

Luminescent said:


> I'm gonna leave this here
> Top Nvidia shareholders
> Vanguard Group Inc. representing 7.7% of total shares outstanding​BlackRock Inc. representing 7.2% of total shares outstanding​Top AMD shareholders
> Vanguard Group Inc. representing 8.28% of total shares outstanding​BlackRock Inc. representing 7.21% of total shares outstanding​


Brilliant post, again... Not.
I think the thread you're after is the actual Callisto Protocol poor-performance thread.
You can shitpost in peace there.

The OP equals a 3090 Ti in ray tracing, edges the 4080 in 4K raster, and is cheaper than the competition. Not bad; not the outright win many (I) wanted, but not bad.

Come on Santa, you shit, sort it out. (Unlikely, but I can dream.)


----------



## Ravenas (Dec 12, 2022)

Performance should stretch further on AIB cards, given their three and four 8-pin connectors; you will see larger power draw along with larger performance increases.

Energy efficiency: a clear advantage over the 4080 in rasterization and price. Ray tracing at 4K still isn't ready for prime time.

@W1zzard The argument that you should buy a 4080 over a 7900 XTX for ray tracing is ludicrous in my opinion. Averaging 29 FPS at 4K on the 4080 versus 21 FPS on the 7900 XTX in Cyberpunk is comical. It's like that almost across the board, game to game.

Well-deserved Editor's Choice.


----------



## Luminescent (Dec 12, 2022)

I have a fan.


----------



## kapone32 (Dec 12, 2022)

N3M3515 said:


> Sorry, i'll be keeping my 6800XT for another gen i guess.


That is fine; it is your money to do with as you please. I play TWWH3, and when I saw 145 FPS in the review I was sold.


----------



## N3M3515 (Dec 12, 2022)

AnotherReader said:


> barely beating the 4080.


It's identical in raster, unless you can detect a 3% difference?
RT nvidia +16%.

But both are $700 gpus.



kapone32 said:


> I saw 145 FPS in the review I was sold.


Yeah, that's sad, it's like approving those price hikes. Next gen 8900XTX $1500. And you'll see 185 FPS in TWWH3 and be sold again.


----------



## Bjorn_Of_Iceland (Dec 12, 2022)

If I was in the market for a GPU, I'd rather go for a used RTX3090 than a 7900 XTX tbh.


----------



## Alpha_Lyrae (Dec 12, 2022)

Ah, yes, launch driver bugs; very typical of AMD. You know, more of you should be demanding "FineWine" performance at launch and not 6-12 months later. It simply means the driver compiler is still not optimized for RDNA3 after all this development time; yes, it's resource intensive, but most people base decisions on launch performance.

Overall, RDNA3 isn't terrible; however, we should all remember that this generation AMD and Nvidia are both on TSMC again. Nvidia is using "4N", an Nvidia-optimized N4(P?), while AMD is using N5 (also with AMD-customized libraries) for the GCD and N6 for the MCDs.

They're not quite on equal terms: N4P (closest to 4N) offers 11% more performance and 22% more efficiency over N5 (just 4% and 7% over N5P). Density is only a 6% improvement vs N5. These are TSMC's own numbers. AMD could refresh the Navi 31 GPU in 8-12 months on N4P at higher wafer cost, and that would close the pricing gap between an eventual AMD 7950 XTX and an eventual Nvidia RTX 4080 Ti.

Nvidia screwed themselves with Ampere by going to Samsung 8LPP.


----------



## Vayra86 (Dec 12, 2022)

Kaleid said:


> Compact and loud go together. Too bad, why can't they design these things better?
> But other than RT performance it's quite great.
> 
> The XT is only about 20w more than 6800xt, that's a high increase in efficiency


39 dBA at a 58°C core temp is pretty neat I'd say; you can't look at noise in isolation imho.

These either OC to the moon or can run pretty damn silent under load as configured in the review; on the OC page you can see the XTX hitting near 3 GHz.

It does not, however, extract a whole lot of FPS from that in Heaven, and I don't think all of it is attributable to CPU limitations. The lack of refinement we see there echoes throughout the review; apparently it clocks very aggressively at the beginning of a run and is quickly forced to pull back to remain within the 350 W power target.



kapone32 said:


> That is fine it is your money to do with as you please. I play TWWH3 and when I saw 145 FPS in the review I was sold.


Yeah, somehow that game cripples my 1080 too; I'm looking at 30 FPS on the campaign map. Still wondering why it drops so low; TWWH2 does 50% more on average. It's not exactly paid off by a massive increase in fidelity...



N3M3515 said:


> It's identical in raster, unless you can detect a 3% difference?
> RT nvidia +16%.


The RT and raster results are pretty wide apart in some cases, but if you consider that and there is still a 3% advantage for AMD overall, that's meaningful. Some games push 20% more FPS, while the biggest loss is what, 15% in favor of Nvidia? The number of games where it scores better is also higher. Given the lacking optimization elsewhere in the product, it's safe to say that gap can widen further in AMD's favor. Not a given, but definitely plausible.


----------



## Makaveli (Dec 12, 2022)

Toss said:


> how about PRODUCTIVITY TESTS?
> x264, AV1, H.265
> CODING, ENCODING, DECODING?


----------



## AnotherReader (Dec 12, 2022)

N3M3515 said:


> It's identical in raster, unless you can detect a 3% difference?


We agree; you didn't quote the full sentence.


> I was expecting the 7900 XTX to be around 90% of a 4090 in *traditionally rasterized* games, but it is barely beating the 4080


----------



## kapone32 (Dec 12, 2022)

N3M3515 said:


> Yeah, that's sad, it's like approving those price hikes. Next gen 8900XTX $1500. And you'll see 185 FPS in TWWH3 and be sold again.


You can call it what you want. Who knows; it depends, but yes, I do buy AMD cards. Even the Vega 64 has been worth every single penny, and my RX 570 is still going strong too. I am not taking food out of my mouth to buy a $1000 GPU, so it really is a moot point. I do not like the prices, but a PS5 is $999 all day long, and pre-builts with cards like the 3090 and Ti versions are still $4000+, so it is what it is.


----------



## medi01 (Dec 12, 2022)

Considerably lower ray tracing performance than RTX 4080

Cough.








Ravenas said:


> Ray Tracing in 4K still isn’t ready for prime time yet.


No offense, 3090Ti.


----------



## Sir Alex Ice (Dec 12, 2022)

So what happens if I connect a Dell U2720Q or a similar monitor with a USB Type-C DisplayPort input to the USB Type-C DisplayPort 2.1 output on the 7900 XTX?


----------



## Vayra86 (Dec 12, 2022)

medi01 said:


> Considerably lower ray tracing performance than RTX 4080
> 
> Cough.
> 
> ...


Yeah the RT perf should not be a deal breaker on its own, I do agree.

But then you see that the perf/$ gap to the 4080 is also exactly 17%, and it kinda does get you thinking. That balance isn't tipping decisively in AMD's favor; after all, RDNA3 is missing the mark on a few other aspects, most notably idle power, plus a historical drawback: people are seeing history repeat, with cards released that could have been refined further.

And to be fair to them, I see that too. New technology, blah blah, fantastic, but if it only serves to brute-force your way to the sub-top of the leaderboards while leaving refinement on the table, that ain't good. RDNA2 was great exactly because it did tick all those boxes. And this is a literal repeat of what we've seen since Fury: HBM never materialized in the gaming space. If MCM isn't paying off big time, what's it doing here? After all, if it enables cheaper products by being non-monolithic, why aren't we seeing that?

I'm kinda withholding judgment because I expect two things in the near future:
- price drops
- driver refinement


----------



## 75Vette (Dec 12, 2022)

Greetings. Awesome review as always... small error though:

4090 is $2400 now???

This is in all the charts on that page: AMD Radeon RX 7900 XTX Review - Disrupting the RTX 4080 - Performance per Dollar | TechPowerUp


----------



## Cheeseball (Dec 12, 2022)

Sir Alex Ice said:


> So what happens if I connect a Dell U2720Q or similar monitor that has a Type-C display port using the Type-C DisplayPort 2.1 in the 7900 XTX?


It should just consider it a normal DisplayPort connection.


----------



## Denver (Dec 12, 2022)

N3M3515 said:


> It's identical in raster, unless you can detect a 3% difference?
> RT nvidia +16%.
> 
> But both are $700 gpus.
> ...


Why? Did you roll a die and decide on that price? The reality is that, contrary to what you think, these GPUs don't have the very high profit margins of the past; $700 should be more or less the cost of production. That's over $300 just for memory (optimistic). It would be great if they could maintain this performance with just 12 GB or 16 GB of VRAM, so it could cost $150-200 less.

My main and only complaint is the huge variation in performance between games; in some cases it even beats the 4090, and in others it loses by about 20% to the 4080. Plus, it seems to be a rule that games based on Unreal Engine run badly on AMD GPUs; I wonder if AMD doesn't have money left over to do something about it.

Anyway, it's still the most interesting product AMD has released in recent years; too bad there aren't decent games to effectively use so much computational power.


----------



## Dristun (Dec 12, 2022)

Denver said:


> Anyway, it's still the most interesting product AMD has released in recent years, too bad there aren't decent games to effectively use so much computational power.


IMO the 6900 XT was way more interesting - actually faster than Nvidia's best in raster, at a time when ray tracing was less relevant and in far fewer games. This gen feels like a genuine step back again - they can't even convincingly beat a cut-down die, while still lagging in features (or gimmicks, if you wish; it doesn't really matter, because one has them and the other doesn't, at least right here and right now without FSR3 and so on).


----------



## Vayra86 (Dec 12, 2022)

Denver said:


> Why? Did you roll a dice and decide on that price? The reality is that contrary to what you think these GPUs don't have very high profit margins like in the past, $700 should be more or less the cost of production. Over $300 just for memory (optimistic), it would be great if you could maintain that performance with just 12Gb or 16gb of vram, so it could cost $150-200 less
> 
> My main and only complaint is in relation to the huge variation in performance difference between games, in some cases it even beats the 4090 and in others it loses by about 20% to the 4080. Plus, It seems a rule that games based on the Unreal Engine run bad AMD gpus, I wonder if AMD doesn't have money left over to do something about it.
> 
> Anyway, it's still the most interesting product AMD has released in recent years, too bad there aren't decent games to effectively use so much computational power.


Historically Nvidia has run margins of 40-60%, and they're peaking above 60% today. So strangely enough, as nodes shrink, both revenue and margins shoot up, not one at the cost of the other.

$700 cost of production? Got a source, or are you just pulling it from what's been going around the internet in random blurbs? I'm interested if you have real info. And if AMD has to cut into its margins so much more than Nvidia with its monolithic approach, what the fuck is AMD even on right now? Crack? Or is this the only way they can realistically move forward? Or is it an investment in the future? If so, they should damn well make sure to release it in a better state.

The story just doesn't make sense. If MCM yields better on smaller nodes, it should end up cheaper.

Note: here's AMD's gross margin. Note that the CPU side has been chasing the chiplet route for a while now too.


----------



## john_ (Dec 12, 2022)

phanbuey said:


> I'm a huge fan of new intel, but they're not even close to 'biting' AMD from behind. If intel can match the 6800xt with battlemage next year I will be impressed, but I doubt it.
> 
> This architecture does have similarities to bulldozer, but at the end of the day it pushes the FPS needed to compete with the 80 class.
> 
> They will need to drop price, for sure -- but hopefully the silicon savings allow for that.


They don't need to reach 6800 XT level performance. Nvidia is securing the high end, and Intel will take market share from AMD in the low-to-mid range by selling to big OEMs. Intel can sell multiple times more product than AMD to OEMs, even inferior product. Intel is also doing something smart: it's investing in RT performance from the beginning. AMD is the only one of the three GPU manufacturers that still treats RT performance as secondary. That's a mistake in my opinion, and now that they're having trouble destroying the RTX 4080 in raster, it's going to cost them heavily in sales. I can already see a $100 price drop, at least for the XT, even if Nvidia doesn't drop the RTX 4080's price and/or the RTX 4070 Ti comes in at $800. I agree with you on this one.


----------



## Fluffmeister (Dec 12, 2022)

Dristun said:


> IMO 6900XT was way more interesting - actually faster than nvidia's best in raster, and a time when ray-tracing was less relevant and in way fewer games. This gen feels like a geniune step back again - they can't even comprehensibly beat a cut-down die while still lagging in features (or gimmicks, if you wish, it doesn't really matter because one has them and the other doesn't, at least right here and right now without FSR3 and so on).



Yeah, that's a good point: the 4080 isn't using a fully enabled AD103, and more worryingly, neither is the 4090 close to fully using AD102. A fully enabled AD102, presumably on the 4090 Ti, is gonna bitch-slap everything (and melt wallets).


----------



## W1zzard (Dec 12, 2022)

rrrrex said:


> 0.1 V is too low, you need at least 0.5-0.6 V to open a transistor. The clocks aren't real either.


This is what the monitoring circuitry in the card returns, I have no way to disprove these results



Tropick said:


> TFW the 7900 XTX pushed balls to the wall beats top-end RDNA2 by a whopping ~43 FPS in raster.


5800X vs 13900K though. Check the deltas for cards present in both tests



Vayra86 said:


> These either OC to the moon


Actually these don't OC at all, as mentioned in the review. When you change GPU clock you get 3 results:
- No change in performance
- Loss in performance
- Crash

What you don't get is any performance gain.

What works for "OC" is undervolting or raising the power limit: either way, the boost algorithm increases clocks on its own, without you manually touching clocks.

Others would say "OC is broken", if that's the way it works now.



75Vette said:


> Greetings. Awesome review as always... small error though:
> 
> 4090 is $2400 now???


That's the current market price. I tried to find one at lower price this morning, not possible, all sold out.


----------



## Vayra86 (Dec 12, 2022)

W1zzard said:


> Actually these don't OC at all, as mentioned in the review. When you change GPU clock you get 3 results:
> - No change in performance
> - Loss in performance
> - Crash
> ...


Yeah I noticed that, and the GPU is also at peak power all the time. Looks remarkably similar to Zen4.

What do you estimate the chances of this changing with future updates?


----------



## Ciubaka (Dec 12, 2022)

Just registered to say that I am still very disappointed in idle power consumption. I am quite sure it doesn't apply to multi-monitor setups only; the same goes for high-refresh-rate displays as well. It was THE reason I sold my 6800 XT: it drew around 45 watts at 4K 120 Hz with no way to lower that (I sent AMD a bug report, but the response was along the lines of "tough cookies"), and now the 7900 series draws around 100 watts. I highly doubt AMD will fix that, just as the 6800 XT was left "as is" (and I'm pretty sure it still is; can anyone confirm what it's like nowadays with a 4K display at 120 Hz?). I really like AMD, but it's always some detail that kills it for me.


----------



## spnidel (Dec 12, 2022)

N3M3515 said:


> lol...................how can you justify a $1200 gpu?, BOTH the 4080 and the 7900XTX are ripoffs. People thought 7900XTX was going to be well priced because everyone believed it would be +20% faster than the 4080.


I can't believe you thought I was being serious in the message you quoted


----------



## medi01 (Dec 12, 2022)

For what it's worth, a curious comment:

https://twitter.com/i/web/status/1602305370667507712


----------



## john_ (Dec 12, 2022)

oxrufiioxo said:


> Way overhyped... Now I know why the 4080 is so terribly priced and now we will likely get an 800-900 usd 4070 way to go AMD.


Nvidia's prices have nothing to do with AMD failing to offer an RTX 4080 killer at $1000. Nvidia's prices have everything to do with people willing to pay those prices, and with people blaming Nvidia's competition for the prices instead of Nvidia itself.

Let's say AMD's RDNA3 was a total mess: buggy hardware that needed new revisions to work. If Nvidia had come out with the prices we see today and NO ONE was buying (say they had managed to sell 12,000 RTX 4090s instead of 120,000), the RTX 4090's price would have been much lower today. And without AMD's help.

And please: if you want AMD to build better products, be ready to actually buy them. Hoping AMD builds better products so you can keep paying Nvidia only secures one thing: more expensive GPUs in the future.


----------



## qlum (Dec 12, 2022)

I wonder how multi monitor power consumption scales here.
I know from experience that while nvidia has relatively low power consumption with 2 monitors, if you use 3 or 4 it is already a lot higher.
I personally use 4 so I would be quite interested to see how this scales.


----------



## AnotherReader (Dec 12, 2022)

qlum said:


> I wonder how multi monitor power consumption scales here.
> I know from experience that while nvidia has relatively low power consumption with 2 monitors, if you use 3 or 4 it is already a lot higher.
> I personally use 4 so I would be quite interested to see how this scales.


Multi monitor is terrible in TPU's more realistic scenario: monitors with different resolutions and refresh rates. If you have similar screens, then it's better, at least according to Computerbase:


----------



## Jism (Dec 12, 2022)

Nice review @W1zzard - must have taken ages to work through those benchmarks and compile the numbers.

It peaks at 360 W for a higher-end card. Far more efficient than an Nvidia card, if you ask me. The GPU running at less than 1 volt means there's still headroom.


----------



## qlum (Dec 12, 2022)

AnotherReader said:


> Multi monitor is terrible in TPU's more realistic scenario: monitors with different resolutions and refresh rates. If you have similar screens, then it's better, at least according to Computerbase:
> 
> View attachment 274201


Interesting. There really isn't one multi-monitor use case; there are a lot of them. So at least a clear qualifier of what is actually being tested would help a lot here.
I personally run 2x 144 Hz 1440p, 1x 155 Hz 1440p, and 1x 60 Hz 1200x1920 (portrait).


----------



## pavle (Dec 12, 2022)

Quite powerful pixel cannons, these new AMD cards.
Despite all the culling optimizations, they don't do so well in ray tracing. I wonder how they would do in Portal with RTX, where the RX 6900 XT manages only 2 FPS?
Multi-monitor consumption seems high, looking at the Computerbase.de review at least (two-monitor test under the heading "Multi-Monitor gut, Youtube schlecht", i.e. "multi-monitor good, YouTube bad", on page 5); I thought AMD had fixed that.
And they are also quite expensive in Europe: 1050 and 1150 Euros...


----------



## Braegnok (Dec 12, 2022)

Great review!

Exactly what I needed to see before launch tomorrow.


----------



## Ando (Dec 12, 2022)

The multi-monitor/video playback power use is shockingly high. It must be the VRAM clock shooting up, like on the RX 6000 series. I wonder if the huge increase relative to last gen is related to the MCDs? Hopefully they can fix it.

The reviews make me glad I pulled the trigger on a $500 6800 months ago rather than waiting like people suggested. Pricing is only marginally better than the Navi 21 cards, and Navi 31 is just too much money at launch for what it's offering. I really do think they'll need to drop the price of both models by at least $100 in the coming months.
Navi 32 might be really interesting, though, for people on RX 5000/RTX 2000 and older. I can see a smaller GCD with 4 MCDs beating the 6800 XT and likely still profiting at the same $650 launch price, which is easier for people to stomach. Nvidia will probably struggle to compete there with monolithic dies, judging by what they wanted for the 4070 Ti. And inb4 "AMD will be greedy": if the 7900 series doesn't sell well, they will have no choice but to recognize reality and price aggressively.
I doubt the 7800 XT will be a further cut-down N31 die; all signs point to good yields. Makes sense, as the GCD is so much smaller than recent high-end GPUs. I hope AMD focuses more on this strategy instead of going for a monumental chiplet GPU; for a future upgrade I'd prefer something more modest with modest power draw.


----------



## fluxc0d3r (Dec 12, 2022)

The other day I almost bought a 6650 XT for $250 brand new off Amazon for my HTPC/driving-sim rig, but decided to wait for the budget 7000-series lineup from AMD. I hear they have a $400 7000-series card with RX 6800 performance coming soon.

AMD is currently dominating the budget-to-midrange market, with their 6650 XT and 6700 XT cards being much cheaper. Most budget users don't care about RT performance; they'd probably turn it off anyway.


----------



## ARF (Dec 12, 2022)

fluxc0d3r said:


> The other day, I almost bought a 6650XT for $250 brand new off Amazon for my HTPC/driving sim rig, but decided to wait for the budget 7000 series line up from AMD. I here they have a $400 7000-series card with RX 6800 performance coming out soon.
> 
> AMD is currently dominating in the budget to midrange market with their 6650XT and 6700XT cards being much cheaper. Most budget users don't care about RT performance, probably turn it off anyways.



People also eat junk food because they think it's delicious and maybe cheaper, but that doesn't make it right.
I would never touch a 6650 or 6700. Let them rot on the shelves!


----------



## wheresmycar (Dec 12, 2022)

Thanks for the reviews @W1zzard 

A special note to AMD and NVIDIA (OUR SAVIOURS) as always nowadays... great cards!! but for the price NO THANK YOU VERY VERY MUCH.... stick it where the sun don't shine!


----------



## N3M3515 (Dec 12, 2022)

Vayra86 said:


> The RT results AND raster results are pretty wide apart in some cases, but if you consider that and then still consider there's 3% in advantage of AMD, that's meaningful. Some games push 20% more FPS, while the biggest loser is what, 15% in favor of Nvidia. The number of games is also higher where it scores better. Given the lacking optimization elsewhere in the product, its safe to say that gap can increase further in favor of AMD. Not a given, but definitely plausible.


I mean, at those price points none of these GPUs are recommendable imho. If I had a gun pointed at me and had to choose, of course I would save the $200 and go for AMD, but those prices are stupid. And I repeat: the 7900 XTX is only that in name; in reality it is a 7800 XT competing with the 4080. Price: $700 max.


----------



## mahoney (Dec 12, 2022)

Don't worry guys, AMD FineWine is back for this gen. In a year's time we're gonna see a 10% performance uplift vs the 4080.


----------



## Taisho (Dec 12, 2022)

The mystery of high 4000 series prices is finally solved:
1. AMD lost by over 20% in raster performance (admittedly at slightly lower power draw). The RTX 4090 is the only option for smooth 4K native, or 4K RT upscaled.
2. The ray tracing performance gap is exactly the same as in the previous gen, as both companies made only minimal improvements in RT vs non-RT performance, the RTX 4090 being the most efficient. AMD's RT is again, depending on the game and resolution, either unplayable or a huge sacrifice to make.
3. The "power efficiency" crown that AMD's marketing tried to sell us turned out to be one huge load of BS. Radeons were more efficient in the previous gen, especially the ones competing with Nvidia's high end; now the tables have turned. It's the most disappointing part of this release to me personally.
4. We still need to wait for custom models to confirm this, but based on the reference models, it looks like AMD cards will again be hotter/louder at the same TDP. It was always the same with Ryzen vs Intel CPUs.

After Nvidia solved its overhead issues in DX12, there seem to be no practical reasons to buy AMD over Nvidia this gen, apart from price.


----------



## Denver (Dec 12, 2022)

Vayra86 said:


> Historically Nvidia has ran margins of 40-60% and they're peaking above 60% today. So strangely enough as nodes shrink, both revenue and margins shoot up, and not one at cost of the other.
> 
> 700 cost of production? Got source, or just pulling it out of what's been going around the internet in random blurbs? I'm interested if you have real info. And if AMD has to cut into its margins so much more than Nvidia doing its monolithic approach, what the fuck is AMD even on right now? Crack? Or is this the only way they can realistically move forward? Or is it an investment in the future? If so, they should damn well make sure to release it in a better state.
> 
> ...


Of course, I'm not just throwing out random information for fun.

The cost per 5 nm wafer is $17k. Considering the yield (similar to 7 nm) and the 300 mm² die of the main GPU chip (GCD), we get about 137 usable chips, costing about $124 each. Then we have six cache chips (MCDs) costing about $12 each. So far the cost is at $196.









GDDR6 significantly more expensive than GDDR5 - www.guru3d.com
"Electronic Components Dealers lists various Micron GDDR5 and GDDR6 chips, pricing for 2,000 units. From these it can be seen that GDDR6 is currently much more expensive than GDDR5 at the moment."
				




It's hard to know precisely the cost of 20 Gbps GDDR6 now, but 3 years ago, before the chaos and inflation, slow 14 Gbps memory was almost $12/GB, so let's say it's $14/GB.

24 GB × $14 = $336, + $196 = $532.

Now add the cost of all the other components and the logistical complexity of the modular design, and we land somewhere closer to $700, ignoring the cost of development and driver support for years to come.

Anyway, selling CPUs is a much better deal for AMD: small chips, very high margins, low investment required in support, etc.
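For the curious, the arithmetic above is easy to sketch in a few lines. All inputs are this post's rough assumptions (wafer price, good dies per wafer, per-chip and per-GB costs), not confirmed AMD figures:

```python
# Back-of-the-envelope BOM sketch using the post's assumed figures.
wafer_cost = 17_000    # $ per N5 wafer, as claimed above
good_gcds = 137        # usable ~300 mm^2 GCDs per wafer after yield, as claimed
mcd_cost = 12          # $ per N6 MCD, as claimed
gddr6_per_gb = 14      # $ per GB of 20 Gbps GDDR6, the post's guess

gcd = wafer_cost / good_gcds        # ~$124 per GCD
silicon = gcd + 6 * mcd_cost        # ~$196 for the GCD plus 6 MCDs
memory = 24 * gddr6_per_gb          # $336 for 24 GB of VRAM
total = silicon + memory            # before board, cooler, margins
print(round(gcd), round(silicon), round(total))  # 124 196 532
```

The remaining gap to the claimed ~$700 is the board, cooler, packaging, and logistics, which the post does not itemize.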


----------



## N3M3515 (Dec 12, 2022)

Denver said:


> it's still the most interesting product AMD has released in recent years, too bad there aren't decent games to effectively use so much computational power.


The 6800 XT and 6900 XT were MUCH more interesting, and actually had excellent MSRPs.



fluxc0d3r said:


> I here they have a $400 7000-series card with RX 6800 performance coming out soon.


Yeah, keep dreaming. Make that at least $500.


----------



## Denver (Dec 12, 2022)

N3M3515 said:


> 6800XT and 6900XT were MUCH more interesting and actually had excelent msrp.
> 
> 
> Yeah, keep dreaming. Make that at least $500.


Of course, they are selling well below their original launch price, perhaps even at a loss. It's a great time to shop at a discount while supplies last.


----------



## ARF (Dec 12, 2022)

Denver said:


> Of course, they are selling well below their original launch price, perhaps even at a loss. It's a great time to shop at a discount while supplies last.



I doubt they are selling anything at a loss. But they should definitely lower the wages that pay for yachts and private jets.


----------



## champsilva (Dec 12, 2022)

Toss said:


> how about PRODUCTIVITY TESTS?
> x264, AV1, H.265
> CODING, ENCODING, DECODING?


I couldn't find any such tests.


----------



## Gica (Dec 12, 2022)

Crackong said:


> Same performance as 4080, 8GB more VRAM, not sized as a brick, and $200 cheaper


And:
- It exceeds the 4080 in power consumption.
- It's weak in ray tracing (a 16% gap at 4K can be the difference between merely tolerable and smooth).
- It's possibly weak in productivity as well (we're waiting for the Puget tests).
- Huge multi-monitor consumption (4x more (!!!), though I think that can be fixed in drivers).

They will not fix the ray tracing performance with drivers; it's clear they are a generation behind Nvidia here.
So you save $200 only by giving something up.


----------



## AnotherReader (Dec 12, 2022)

Denver said:


> Of course, I'm not just throwing out random information for fun.


GDDR6 should be cheaper now; IIRC, @W1zzard estimated about $55 for 8 GB of GDDR6 around the RDNA 2 launch. The N5 wafer costs are probably an overestimate; they don't line up with AMD's own graphs:





Assuming around $10k for N7, N5 should be less than $15k per wafer. The oft-quoted costs are from the time of launch, when Apple was the only customer. The MCDs, in particular, should be very cheap: with a defect rate of 0.09 per square cm and a 37.5 mm² die size, the yield should be around 1,500 good dies per wafer, so all six shouldn't cost AMD more than $40. With 50% gross margins, the GCD and six MCDs should be sold to partners for around $300. The total board cost seems to be around the $500-550 mark. Shipping and retailer margins should still allow a profitable product at $800 or so.
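The MCD estimate above can be sanity-checked with a standard zero-defect Poisson yield model and the classic gross-dies-per-wafer approximation. The defect rate (0.09/cm²) and die size (37.5 mm²) are from the post; the choice of formulas and the 300 mm wafer are my assumptions, and this simplified version lands a bit above the post's ~1,500 figure, i.e. the same ballpark:

```python
import math

def die_yield(defects_per_cm2: float, area_mm2: float) -> float:
    # Poisson model: probability that a die has zero defects.
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

def gross_dies(area_mm2: float, wafer_mm: float = 300.0) -> int:
    # Wafer area divided by die area, minus an edge-loss correction term.
    r = wafer_mm / 2.0
    return int(math.pi * r * r / area_mm2
               - math.pi * wafer_mm / math.sqrt(2.0 * area_mm2))

mcd_area = 37.5                          # mm^2, from the post
y = die_yield(0.09, mcd_area)            # ~0.97, i.e. MCD yields are excellent
good = int(gross_dies(mcd_area) * y)     # ~1,700 good MCDs per wafer
print(f"yield={y:.3f}, good dies per wafer={good}")
```

With anywhere near that many good dies per wafer, even a pessimistic N6 wafer price puts each MCD in the single-digit dollars, consistent with the "all six under $40" claim.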


----------



## ARF (Dec 12, 2022)

AnotherReader said:


> Assuming around $10 k for N7, N5 should be less than $15 k per wafer. The oft quoted costs are costs at the time of launch when Apple was the only customer. The MCDs, in particular, should be very cheap. With a defect rate of 0.09 per square cm and 37.5 mm^2 die size, the yield for each should be around 1500 per wafer. All 6 shouldn't cost more AMD than $40. With 50% gross margins, the GCD and 6 MCDs should be sold to the partners for around $300. The total board cost seems to be around the $ 500 - $ 550 mark. Shipping and the margins for retailers should allow a profitable product even at $ 800 or so.



$10k for an N7 wafer is too much. I bet it's $7-8k today.

The price of a 5nm wafer from TSMC is a whopping $16,988 - HardwarEsfera


----------



## AnotherReader (Dec 12, 2022)

ARF said:


> 10K for N7 wafer is too much. I bet it's 7-8K today.
> 
> View attachment 274214
> The price of a 5nm wafer from TSMC is a whopping $ 16.988 - HardwarEsfera


I suspect you are right, but good, reliable information is hard to come by. From comments by industry insiders, N7 prices were about right at the time of that estimate, N5 were too high, and N16 were too low.


----------



## sweet (Dec 12, 2022)

Beats that brick of a 4080 in every game I play! Nice one, AMD.

Now if only those bloody retailers stick to that price...


----------



## chowow (Dec 12, 2022)

Good show, AMD, but for 200 bucks more I will take the 4080 for Nvidia's features: DLSS, ray tracing, better power efficiency, and a quieter card.


----------



## Denver (Dec 12, 2022)

AnotherReader said:


> GDDR6 should be cheaper now; IIRC, @W1zzard estimated about $55 for 8 GB of GDDR6 around RDNA 2 launch. N5 wafer costs are probably an over estimate; they don't line up with AMD's own graphs:
> 
> 
> Assuming around $10 k for N7, N5 should be less than $15 k per wafer. The oft quoted costs are costs at the time of launch when Apple was the only customer. The MCDs, in particular, should be very cheap. With a defect rate of 0.09 per square cm and 37.5 mm^2 die size, the yield for each should be around 1500 per wafer. All 6 shouldn't cost more AMD than $40. With 50% gross margins, the GCD and 6 MCDs should be sold to the partners for around $300. The total board cost seems to be around the $ 500 - $ 550 mark. Shipping and the margins for retailers should allow a profitable product even at $ 800 or so.


Really? How accurate is this information, @W1zzard? All the articles I can find point to even higher prices:

"And according to an online marketplace called Digi-Key, GDDR6 chips from Micron are now selling for around US$13-16 per GB"









Very high GDDR6 costs blamed for such high GPU prices - www.neowin.net
"AMD recently launched the RX 6600 XT for $379, a price higher than many expected. The company's justification is the doubling of memory prices, which is indeed true, according to price listings online."

Graphics Memory Prices Expected to Increase Next Quarter - www.tomshardware.com
"GDDR6 to the moon!"


----------



## Gica (Dec 12, 2022)

chowow said:


> good show AMD but for  200 bucks I will take the 4080 nvidia's features DLSS,ray-tracing Better power efficiency,quieter card


Correct.
And it's hard for me to believe that AMD can beat the 4090 with the 7950XTX.
With deficiencies in ray tracing (and very likely in content creation too), no DLSS, and only rasterization performance to offer, you cannot ask the same price for a product. My impression is that Nvidia forced AMD to release a product that could still have used more work.


----------



## mahoney (Dec 12, 2022)

john_ said:


> *Nvidia prices have nothing to do with AMD falling to offer an RTX 4080 killer at $1000.* Nvidia prices have everything to do with people willing to pay those prices. Nvidia prices have everything to do with people accusing Nvidia's competition for the prices and not Nvidia itself.
> 
> Let's say that AMD's RDNA3 was a total mess. A buggy hardware that needed new revisions to work. Nvidia comes out with the prices we see today and NO ONE was buying. I mean let's say they have managed to sell 12000 RTX 4090s instead of 120000. RTX 4090's price will have been much lower today. And without AMD's help.
> 
> And please. If you want AMD to build better products, just be also ready to buy them. Hoping AMD to build better products, so you can keep paying Nvidia, only secures one thing. More expensive GPUs in the future.


You think? Look at the 3000 cards' MSRPs. It's like they knew that AMD had something in store.
3070 $499 < 6800 $579
3080 $699 ≈ 6800 XT $649
3090 $1499 ≥ 6900 XT/6950 XT $999-$1100


----------



## Space Lynx (Dec 12, 2022)

I am impressed, well done AMD, this is exactly what I wanted: more frames. I couldn't care less about ray tracing.


----------



## nikoya (Dec 12, 2022)

@W1zzard thank you for the hard work, man!

you are truly a GPU demi-god and clearly deserve a throne at the GPU Pantheon.


----------



## Darmok N Jalad (Dec 12, 2022)

Based on the commentary that the reference 7900xtx hits the power cap and throttles, it makes me suspect that all these AIB cards with 3 8-pins and massive coolers won’t have that 6% clockspeed hit. They will probably have a factory OC on them too.


----------



## N3M3515 (Dec 12, 2022)

Space Lynx said:


> I am impressed, well done AMD, this is exactly what I wanted: more frames. I couldn't care less about ray tracing.


6800XT $650 msrp -------- 7900XTX $1000 msrp, 51% higher performance for 53% higher price. Price/performance: crapola.

Exactly the same crap nvidia did with the 4080.
And don't get me started on the 7900 XT.
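For what it's worth, the value math in that comparison checks out; a quick sketch using the MSRPs and the ~51% performance figure from the post:

```python
# MSRPs and relative performance from the post above
old_price, new_price = 650, 1000   # 6800 XT vs 7900 XTX
perf_gain = 1.51                   # 7900 XTX is ~51% faster

price_ratio = new_price / old_price        # ~1.54, i.e. ~54% more expensive
value_change = perf_gain / price_ratio     # performance per dollar, new vs old

print(f"price up {price_ratio - 1:.0%}, perf per dollar {value_change - 1:+.0%}")
```

So performance per dollar is actually about 2% worse, generation over generation.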


----------



## efikkan (Dec 12, 2022)

Once again, we see AMD hurt by their own hype.

I think it's fine that AMD offers comparable rasterizing performance to RTX 4080 at $200 less, even with poor RT performance. Not all gamers need RT performance yet, so it's fine to let each buyer decide which is best for them. My only objection here is the price of both, $800 for RTX 4080 and $700 for RX 7900 XTX would have been a more fair price.



wNotyarD said:


> If AMD's recent history is anything to go by, then we can expect two things:
> 1 - FineWine™ will make both 7900s a bit better, especially that low load power consumption;


Unfortunately, many people are still making excuses for AMD (whether it's subconscious or not).
I'm disappointed to see even this review tries to make excuses for "missing driver optimizations". This is the sort of stuff I've come to expect from the likes of Hardware Unboxed and LTT, but not TPU.
Normally, architectural changes are the first to be implemented in a new driver, normally long before engineering samples are done. The only reason to postpone implementation of a core architectural feature is if there is some issue with the hardware. So I see no reason to expect any significant change from driver updates, at least not anything to make the product compete at a higher performance tier.

If anything, we should expect AMD's launch drivers to be the most mature. Their architectural changes tend to be more conservative and the corresponding driver changes relatively minor. AMD also have far less gimmicks in their drivers.
Except for the odd bug here and there, what we see now is probably what we will see 3 and 6 months from now too.



Alpha_Lyrae said:


> Ah, yes, launch driver bugs; very typical of AMD. You know, more of you should be demanding “FineWine” performance at launch and not 6-12 months later.


AMD FineWine is a myth. We've heard this nonsense since the 200/300 series, 400/500 series, Vega series and so on. So many expect significant performance to be unleashed "shortly" after release, but it never happens. E.g., the RX 480/580 didn't outclass the GTX 1060 back in the day, and it still doesn't today.
We need to judge products for what they are, not what they are portrayed as in some "fanboy utopia".



Alpha_Lyrae said:


> That simply means that the driver compiler is still not optimized for RDNA3 after all of this development time, and yes, it’s resource intensive, but most people base decisions on launch performance.


Driver compiler? Optimized for RDNA3?
The driver compiler is just a normal compiler, either MSVC, GCC or LLVM.
If you're thinking of the shader compiler, which is a part of the runtime driver, it must be tailored to the GPU architecture, otherwise it will simply not work.


----------



## murr (Dec 12, 2022)

Nice, This will be my next video card.


----------



## N3M3515 (Dec 12, 2022)

efikkan said:


> My only objection here is the price of both, $800 for RTX 4080 and $700 for RX 7900 XTX would have been a more fair price.


Finally, someone with a bit of sense on this forum.


----------



## Taisho (Dec 12, 2022)

Crackong said:


> Same performance as 4080, 8GB more VRAM, not sized as a brick, and $200 cheaper


The "brick" argument strikes again. I can assure you that the size of the GPU is not audible unlike the high-RPM motion of the fans of these cheaply-made 7900 XTX and 7900 XT reference models. $200 cheaper? It's two completely different cooling tiers, so a fair part of this amount is justified only by that - we are talking about flagship products. And then comes ray tracing, power efficiency, driver stability, and DLSS having an edge over FSR. I'm not trying to justify the high price tag of 4080 by any means, for me the asking prices of all these cards are absurd.


----------



## AusWolf (Dec 12, 2022)

Exactly what I expected: 4080 level performance for less money. Nice! 

It's a shame RT performance isn't on par, though not a big deal just yet.


----------



## Space Lynx (Dec 12, 2022)

N3M3515 said:


> 6800XT $650 msrp -------- 7900XTX $1000 msrp, 51% higher performance for 53% higher price. Price/performance: crapola.
> 
> Exactly the same crap nvidia did with the 4080.
> And don't get me started on the 7900 XT.



Have you not been to a grocery store lately? Everything has doubled in price. You are lucky its only $999, lmao.


----------



## Taisho (Dec 12, 2022)

Fouquin said:


> I suppose if you ignore FSR 2.0 being available to a lot of modern games and being functional, sure. No distinguishing feature there. RT perf got a nice bump, they've never quite been this close to on-par even in optimized games.


This is completely incorrect. Look at RT data again. The relative RT vs non-RT performance changed only marginally from the previous generation, both for AMD and NVIDIA. The only GPU that shows what we can call a slight generational improvement in RT vs non-RT is RTX 4090, so we could say that AMD is now lagging behind even more.


----------



## neatfeatguy (Dec 12, 2022)

Space Lynx said:


> Have you not been to a grocery store lately? Everything has doubled in price. You are lucky its only $999, lmao.



That's not entirely true. Not everything has doubled. Some things haven't gone up much, while others have nearly quadrupled.
Ground turkey - almost priced the same as it was 2 years ago (the price moved up maybe $1, give or take a little)
Eggs - around $1.25 for a dozen 2 years ago. Today, right around $4.50

I don't know what to think about GPU prices except that they're too rich for my blood these days. Perhaps the manufacturers think these high prices are still okay so they keep testing the water, or maybe costs really have gone up that much, or a mix of both. Thankfully, I'm not in the market for one.


----------



## RedBear (Dec 12, 2022)

neatfeatguy said:


> I don't know what to think about GPU prices except that they're too rich for my blood these days. Perhaps the manufacturers think these high prices are still okay so they keep testing the water, or maybe costs really have gone up that much, or a mix of both. Thankfully, I'm not in the market for one.


My favourite theory is that they are still trying to get rid of the previous generation GPUs that they couldn't sell to cryptominers. If this is actually the case prices should drop to relatively saner levels soon enough (but still higher than before the pandemic, mind you, if anything because of inflation and US-China trade wars).


----------



## Taisho (Dec 12, 2022)

RedBear said:


> My favourite theory is that they are still trying to get rid of the previous generation GPUs that they couldn't sell to cryptominers. If this is actually the case prices should drop to relatively saner levels soon enough (but still higher than before the pandemic, mind you, if anything because of inflation and US-China trade wars).


I think the prices will drop, especially for the 7900 XT and to a lesser extent the RTX 4080, but the RTX 4090 will stay close to where it is now. It has no competition and is the end game for productivity, AI, and servers. The only true 4K gaming offering on the market.


----------



## N/A (Dec 12, 2022)

> There's no heatpipes in use.


Impressive, next-level assembly. No heatpipes inside. And factory undervolted to 0.85 V, so not much room for improvement there.


----------



## MxPhenom 216 (Dec 12, 2022)

N3M3515 said:


> lol...................how can you justify a $1200 gpu?, BOTH the 4080 and the 7900XTX are ripoffs. *People thought 7900XTX was going to be well priced because everyone believed it would be +20% faster than the 4080.*
> 
> 
> Sorry, i'll be keeping my 6800XT for another gen i guess.


No shot anyone with a rational brain actually believed this.

If Nvidia or AMD releases a card that is considerably faster than the competition's counterpart SKU, you cannot expect that card to be "well priced" (whatever that means; I'm assuming significantly cheaper. Is $200 that?). Especially these days, neither company in its right mind would do shit like that. No one should be surprised that either company marks its prices this way if its cards are reasonably competitive. People have got to have a mental divorce from the idea that AMD is the "fair/well priced" company.

Both companies are bat shit insane with pricing nowadays (many factors in play here that many don't realize though).

Again, no one thought the 7900 XTX was going to be that much faster than the 4080 when AMD announced prices (or at least I hope no one did). That should have prompted very different questions and assumptions.


----------



## tfdsaf (Dec 12, 2022)

TheinsanegamerN said:


> Every time someone points that out you start hearing the cope brigade "oh well nvidia has the same problems (lolno)" "oh nvidia is evil" "gamers just dont care about AMD" ece.
> 
> Few are willing to admit that AMD dug that hole for themselves, starting in 2006 when they overpaid by over $2 billion for ATi, and the subsequent decade of crap drivers and poorly supported releases gained AMD the reputation of the budget brand. EVen recently we had the rDNA downlcokcing bug and the first gen APUs relying on OEMs for drivers, both of which required pressure from tech media to fix.
> 
> ...



Nvidia just 2 years ago released the 1030 with DDR4 RAM under the same name and everything as the GDDR5 1030! They did the same just 3 weeks ago with the RTX 3060 8GB: it's a cut-down card from the actual 3060, but you don't see that info anywhere! They are literally selling a 3050 with the same name as a 3060!

Should I mention the dirty tactics they play on reviewers and how they try and bully them into positive reviews? Maybe we should talk about all of their naming schemes and even undercutting their AIB partners for more profits! 

How about right now and their issues with the 16pin connector melting? 

How about when their driver bricked people's GPUs? Or the driver that left so many people unable to boot into Windows?

Again, ALL of the Nvidia crap and BS and incompetence or greed or scams and whatnot never gets brought up! Nvidia is literally 10 times worse than AMD in terms of bad drivers, bad products, misleading products, scams, false narratives, false advertisement, anti-consumer fraud, monopolistic tendencies, etc...

And your utter BS about a $200 7900 XTX is actually mentally insane. Even if you measure JUST the production cost of the 7900 XTX or RTX 4090, that alone is over $300! Do you forget about R&D, testing, marketing, transportation, driver and feature development, packaging, royalties to various groups and consortiums, profit-margin leeway for AIB partners, etc...?

You can NOT claim that AMD or Nvidia can sell these cards at $250. The 4 nm and 5 nm process nodes are extremely expensive; the nodes are new and therefore more error-prone, which means fewer GPUs come from a single wafer compared to a more mature process; the GPUs are generally bigger to make; etc...

Literally everything you said in your post is complete and utter FUD!


----------



## Scircura (Dec 12, 2022)

- Noisy
- Power hog on desktop
- Frame pacing all over the place

I would not replace my RTX 3070 FTW3 with a 7900 XTX even if someone gave me one for free.

Thanks W1zzard for the reviews and for re-testing on Raptor Lake. I also appreciate the more detailed undervolt discussion on the overclocking page.

Unrelated: I would LOVE to see a perf/W comparison of these cards at various undervolt limits. Something like this news post from a couple weeks ago.


----------



## Chrispy_ (Dec 12, 2022)

Faster than a 4080 for less, and the raytracing is plenty good enough in the games where it doesn't murder framerate. It's about where we expected it, given that AMD's reveal last month was always going to be cherry-picked games. Performance may increase as drivers get updated for the new compiler but I don't think that matters, because if the $999 price is real, the cheapest 4080 is almost $1400 and the XTX is already faster than it _today_.

For the games where the 7900 XTX struggles with ray tracing, _so do the 4080 and 3090 Ti._ I'm with @W1zzard here: ray tracing is the future, but for now we're all stuck in the present.


----------



## MxPhenom 216 (Dec 12, 2022)

Chrispy_ said:


> Faster than a 4080 for less, and the raytracing is plenty good enough in the games where it doesn't murder framerate. It's about where we expected it, given that AMD's reveal last month was always going to be cherry-picked games. Performance may increase as drivers get updated for the new compiler but I don't think that matters, because if the $999 price is real, the cheapest 4080 is almost $1400 and the XTX is already faster than it _today_.
> 
> For the games where the 7900XTX struggles with raytracing, _so do the 4080 and 3090Ti. _I'm with @Wizzard here, Raytracing is the future, but for now we're all stuck in the present.


I think 4080 suggested pricing is soft from Nvidia, as they wait to see where chips fall once AMD releases theirs. So, I am not putting a whole lot of weight in the idea that the 7900XTX is cheaper, maybe for its first week, assuming any of the rumors that Nvidia is planning price drops later this month is real.


----------



## Opex (Dec 12, 2022)

See ya in 5-7 years GPU companies. Maybe by then you'll come up with something that's actually worth getting for the mid/higher-mid tier users.

So much hype and shitting on NVIDIA and in the end there's no mic drop. Just a "standard" mid/higher-mid tier card with bloated prices/margins.


----------



## TheinsanegamerN (Dec 12, 2022)

tfdsaf said:


> Nvidia just 2 years ago released the 1030 with DDR4 RAM under the same name and everything as the GDDR5 1030! They did the same just 3 weeks ago with the RTX 3060 8GB: it's a cut-down card from the actual 3060, but you don't see that info anywhere! They are literally selling a 3050 with the same name as a 3060!
> 
> Should I mention the dirty tactics they play on reviewers and how they try and bully them into positive reviews? Maybe we should talk about all of their naming schemes and even undercutting their AIB partners for more profits!
> 
> ...


And there it is, right on cue! Pure whataboutism to justify AMD releasing an upper mid-range part at halo-tier pricing.

Pro tip: Copium does not a competitive GPU brand make, AMD needs to be competitive to make a profit, you cannot charge premium prices for second tier products.


tfdsaf said:


> And UR utter BS about $200 7900XTX is actually mentally insane, even if you measure JUST the production cost of the 7900xtx or RTX4090 just those are over $300, just the production cost! Do you forget about R&D, testing, marketing, transportation, drivers and features development, packaging, paying royalties to various groups and consortiums, profit margin leeway for AIB partners, etc...
> 
> You can NOT claim that AMD or Nvidia can sell their cards at $250, the price for 4nm and 5nm process node is extremely expensive, the nodes are new and therefore more error prone so that means less GPU's comes from a single wafer compared to a more mature process, the GPU's are generally bigger to make, etc...


Bruh, BRUH  



Did you just take that $250 joke seriously    LMFAO


tfdsaf said:


> Literally everything you said in your post is complete and utter FUD!


Legend says if you say FUD five times the crypto bros will visit you in your sleep and give you a free JPEG.


----------



## wolf (Dec 13, 2022)

Shatun_Bear said:


> 'Unacceptable RT performance' that's quite something. First of all, I guess you consider all 3000-series cards 'unacceptable' too in RT since they offer similar performance there.


Well, that bar was set 2 years ago; expecting more 2 years later is hardly strange.


neatfeatguy said:


> Nvidia didn't improve on RT performance


Well when you dig a little deeper, it's plainly evident they did, like in @birdie 's subsequent post.


Neo_Morpheus said:


> Wait, arent you the same "birdie" that is always trashing AMD and kissing nvidias behind at Phoronix?


Wait, aren't you the same "Neo Morpheus" that is always trashing Nvidia and kissing AMD's behind at Techspot?

Welcome to Techpowerup!


----------



## AusWolf (Dec 13, 2022)

wolf said:


> Well that bar was set 2 years ago, we expected more 2 years later, strange.
> 
> Well when you dig a little deeper, it's plainly evident they did, like in @birdie 's subsequent post.


If the previous gen was let's say, 40% slower with RT enabled vs disabled, and the current gen is also 40% slower, then there was no improvement, was there? From this point of view, I see no improvement on the RT front with either AMD or Nvidia. Increasing the number of RT cores together with the number of more traditional shader cores is not an improvement.


----------



## Dimitriman (Dec 13, 2022)

I wish AMD would put more effort into the RT and creator side of things. This is where Nvidia commands its premium, and without competition there, prices from both companies will slot on top of each other rather than compete directly.

It would have been awesome if one of the chiplets was a dedicated silicon just for RT and multimedia, call it something cool like "the reality engine" and make it so good that it actually beat Nvidia's offering. 

Right now the GPU industry just feels meh:
- monster cards that won't fit in your PC and will break your mobo
- no level playing field in the RT and creator arena
- 600 W power cables with an incinerate function
- minimum $1k after tax for anything new
- resources focused on upscaling and fake frames rather than RT and raw power

In the end, I feel like Nvidia and AMD are having a private party but I was not invited. That kinda summarizes my vibe with these launches.


----------



## DemonicRyzen666 (Dec 13, 2022)

AusWolf said:


> If the previous gen was let's say, 40% slower with RT enabled vs disabled, and the current gen is also 40% slower, then there was no improvement, was there? From this point of view, I see no improvement on the RT front with either AMD or Nvidia. Increasing the number of RT cores together with the number of more traditional shader cores is not an improvement.


It varies: in some games it's 3-5% faster, in others it's 1-3% slower.

Doubling the instruction rate to two instructions per clock, double what RDNA2 had, is pointless if they also doubled the shaders. They left a single lone RT core that needs far more instructions for the new BVH traversal work, but barely gave it any more. The shaders soaked up all the extra instructions per clock, so they seem wasted.


----------



## AusWolf (Dec 13, 2022)

Dimitriman said:


> I wish AMD would put more effort into the RT and creator side of things. This is where Nvidia commands its premium, and without competition there, prices from both companies will slot on top of each other rather than compete directly.
> 
> It would have been awesome if one of the chiplets was a dedicated silicon just for RT and multimedia, call it something cool like "the reality engine" and make it so good that it actually beat Nvidia's offering.
> 
> ...


I completely agree. The only thing you left out is Nvidia's complete disregard of the entry level, and AMD's _"okay, fine, here's something, just leave me alone"_ edition 6400.



DemonicRyzen666 said:


> It varies: in some games it's 3-5% faster, in others it's 1-3% slower.
> 
> Doubling the instruction rate to two instructions per clock, double what RDNA2 had, is pointless if they also doubled the shaders. They left a single lone RT core that needs far more instructions for the new BVH traversal work, but barely gave it any more. The shaders soaked up all the extra instructions per clock, so they seem wasted.


Exactly. Meanwhile, Nvidia has the same-looking shader cores with the same-looking RT cores as before. One step forward and one step back from AMD, no change from Nvidia.

It seems I made a good choice buying the 6750 XT a couple weeks ago. This generation is pretty lacklustre from both companies. Okay, we have a bit more performance, but so what?


----------



## wolf (Dec 13, 2022)

AusWolf said:


> If the previous gen was let's say, 40% slower with RT enabled vs disabled, and the current gen is also 40% slower, then there was no improvement, was there?


If that were what was happening, sure, there would be no improvement. And in some games that's what we see, but in others, especially newer ones and certainly Portal RTX, the gain in RT performance outstrips the gain in raster performance. Ada absolutely has stronger RT than Ampere when the extra capability can be and is leveraged. W1zzard's testing suite doesn't show much of this, but examples are out there.


----------



## AusWolf (Dec 13, 2022)

wolf said:


> If that were what was happening, sure there would be no improvement. And in some games, that's what we see, but in others, especially newer ones and certainly Portal RTX, the difference in RT perf outstrips the increase in Raster Perf, Ada absolutely has stronger RT than Ampere when and if the extra capability can be/is leveraged. W1zards testing suite doesn't show much of this, but examples are out there.


Except that Portal RT isn't a "game" in the traditional sense. It's a tech demo made by Nvidia for Nvidia cards.

The other examples in the post are purely RT benchmarks measured against Ampere. That's not how you measure the improvement in RT.

1. You need a game.
2. You measure performance with RT on,  then with RT off.
3. You take the difference in percentage.
4. You compare that percentage with the last generation.
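The four steps above can be sketched in a few lines; the frame rates here are made-up placeholders purely to illustrate the comparison:

```python
def rt_hit(fps_rt_off: float, fps_rt_on: float) -> float:
    """Fractional performance lost by enabling RT in the same game."""
    return 1 - fps_rt_on / fps_rt_off

# Hypothetical numbers for one game, last generation vs current generation
last_gen = rt_hit(fps_rt_off=100, fps_rt_on=60)   # 40% hit
this_gen = rt_hit(fps_rt_off=160, fps_rt_on=96)   # also a 40% hit

# The same relative hit means no architectural RT gain, only more raw speed
print(f"last gen: {last_gen:.0%} hit, this gen: {this_gen:.0%} hit")
```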


----------



## Icon Charlie (Dec 13, 2022)

@W1zzard, first off I want to say thank you so very much for the review of both video cards. It confirms that THIS is the website for tech, not YouTube and its questionable vloggers (you all know where I stand on almost all of the so-called techies, so I won't go there right now). As stated before, Silicon Valley is in my back yard. I see these people who do and create things.

The information that I gathered before your review is close to what you have reported. You really work hard to give us reliable information. Heh, I should know; I was a technology reporter doing similar stuff 20+ years ago, so I know how important, and at times stressful, it is to make sure your data is correct and helpful to us, The Masses.

In this day and age of "talking heads", questionable reporting on just about everything, and gimmicks to get people to buy garbage, it is so damned good that I can come to a site and get RELIABLE information.

I thank you for your hard work and I hope others take the time to say thanks as well.


----------



## wolf (Dec 13, 2022)

AusWolf said:


> Except that Portal RT isn't a "game" in the traditional sense. It's a tech demo made by Nvidia for Nvidia cards.
> 
> The other examples in the post are purely RT benchmarks measured against Ampere. That's not how you measure the improvement in RT.


From where I'm sitting, it's precisely a way to measure improvement to RT performance. And I disagree with dismissing those results which frankly are valid in context.

For argument's sake, let's try your way: there are games where it's a wash, broadly equal to Ampere, and there are ones that show a reasonable difference, even without alterations to game code to leverage Ada's improvements like Portal RTX has; at least they haven't said so, IIRC.









"AMD Radeon RX 7900 XTX Review" (www.techspot.com): The Radeon RX 7900 XTX is a pretty good GPU, at least relative to its GeForce competitor, but whether or not it's worth $1,000 will depend on...




Marvel's Guardians of the Galaxy
3090 has 48.4% of RT off perf with RT on
4090 has 52.4% of RT off perf with RT on ~8% better result than Ampere

Cyberpunk 2077
3090 has 45.5% of RT off perf with RT on
4090 has 54.2% of RT off perf with RT on ~19% better result than Ampere
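Those two comparisons reduce to the ratio of RT-on retention between the cards; a quick check of the arithmetic, using only the percentages quoted above:

```python
# RT-on performance as a fraction of RT-off performance, from the post
retention = {
    "Guardians of the Galaxy": {"3090": 0.484, "4090": 0.524},
    "Cyberpunk 2077":          {"3090": 0.455, "4090": 0.542},
}

# Generational improvement: how much more of its raster perf Ada keeps with RT on
improvements = {game: r["4090"] / r["3090"] - 1 for game, r in retention.items()}

for game, imp in improvements.items():
    print(f"{game}: {imp:+.0%}")
```

Which reproduces the ~8% and ~19% figures.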


----------



## systemBuilder (Dec 13, 2022)

Dimitriman said:


> I wish AMD would put more effort into the RT and creator side of things. This is where Nvidia commands it's premium, and without competition there, prices for both companies will slot on top of each other rather than compete directly.
> 
> It would have been awesome if one of the chiplets was a dedicated silicon just for RT and multimedia, call it something cool like "the reality engine" and make it so good that it actually beat Nvidia's offering.
> 
> ...


I think your reality has been distorted by the "news-hype" cycle.
It's true that the only high-end cards with reasonable power consumption and size are 3080, 7900xt, 7900xtx, and maybe the 6900xt.
But if you freeze frame rates to 100 fps then a 4080 only uses 170 watts of power, which is extremely good, and only 6800xt can compete on "Quiet".
It may be a gigantic oversized 3.5-slot lump in your computer, but joules per frame rendered on 4080 is second to NONE.
And RT really doesn't matter. It's just a FUD-hype technique by Nvidia.
Ever hear of ANYONE making a kill shot due to a puddle reflection?  No?  Of course not!
NVENC and OpenCL matter, and AMD is finally matching NVidia on OpenCL with the 7900 XTX.

I have bought 2 high-performance video cards this year (3080, 3070 TI), and both times it was hell and exasperation.
One was from the newegg shuffle during card rationing, and I won a much-hated 3070 TI for ONLY 50% over MSRP.
One was 30d before xmas when I won a NEW 3080 on amazon for ONLY 15% over MSRP.
These companies have to stop lying with lowball/fake MSRP prices, I think the FTC should step in.

Listen, both Nvidia and AMD are now using the same semiconductor foundries. Since Nvidia has the engineers from SGI and 3dfx that INVENTED the 3D gaming industry, did you expect AMD to beat them on price AND performance? Really? Now that EVERYONE is using the same chip factories, the designs are probably not that different any more! Going forward, you get what you pay for. A 30 TFLOPS 3080 for $800 is a screaming deal compared to a 5 TFLOPS RX 580 costing $300 just 3-4 years ago. I know, because I have bought both of them. The 30 TFLOPS card is 6x faster for 2.67x the money. It may not follow Moore's law, but hopefully some competition from Intel will help (although I feel like Raja Koduri is an idiot and AMD is better off without him; it was a brilliant move to dump him on Intel's graphics division...).
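The energy-per-frame and compute-per-dollar claims in that post reduce to simple ratios; all figures below are taken from the post itself:

```python
# Power-per-frame claim: a 4080 capped at 100 fps drawing 170 W
watts, fps = 170, 100
joules_per_frame = watts / fps              # 1.7 J per rendered frame

# TFLOPS-per-dollar comparison: 3080 ($800, 30 TFLOPS) vs RX 580 ($300, 5 TFLOPS)
tflops_ratio = 30 / 5                       # 6x the compute
price_ratio = 800 / 300                     # ~2.67x the price
value_ratio = tflops_ratio / price_ratio    # ~2.25x the compute per dollar

print(f"{joules_per_frame} J/frame, {value_ratio:.2f}x TFLOPS per dollar")
```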


----------



## Arkz (Dec 13, 2022)

Nice to see improvements, but they've been working on this for a few years and it comes out of the door with 100 W idle on dual monitors... the hell? Not as big an improvement over the 6900 XT as I was hoping, and RT is still a fair bit behind current Nvidia, but at least they're closing the gap. At the moment, though, all prices are just daft. I thought my £660 3080 was expensive when I got it, but compared to today's price for performance 2 years later, maybe it wasn't that bad. Not that I plan on upgrading any time soon anyway, and the way prices are going I'll probably still be using this 3080 in 5 years.


----------



## AusWolf (Dec 13, 2022)

wolf said:


> From where I'm sitting, it's precisely a way to measure improvement to RT performance. And I disagree with dismissing those results which frankly are valid in context.


Those results measure performance in general, not the improvement in RT core performance specifically. Of course they're valid, just not in this context. You can't cram 60% more RT cores onto a GPU die and brag about your RT cores being improved, because they're not - there's just more of them.



wolf said:


> For arguments sake, lets try your way, there are games where it's a wash and broadly equal to Ampere, and there are ones that show a reasonable difference, even without alterations to game code to leverage Ada's improvements like Portal RTX does, at least they haven't said so iirc.
> 
> 
> 
> ...


A good point. I guess there is some improvement in some cases. Just not enough (improvement and cases), imo.


----------



## Richards (Dec 13, 2022)

Disgusting how the 5800X bottlenecks the 7900 XTX and 4080. The 5800X should never have been used.


----------



## Fluffmeister (Dec 13, 2022)

Richards said:


> Disgusting how the 5800X bottlenecks the 7900 XTX and 4080. The 5800X should never have been used.



Yeah, bloody AMD!


----------



## wolf (Dec 13, 2022)

AusWolf said:


> Those results measure performance in general, not the improvement in RT core performance specifically. Of course they're valid, just not in this context.


I still disagree; it's measuring like-for-like products in like-for-like testing, when we know that in rasterization the 4090 is on average about 60-70% faster than a 3090, but in RT testing it's ~2x faster. I'm happy to leave it there though; if you think those results lack merit in this context, that is your prerogative.


AusWolf said:


> A good point. I guess there is some improvement in some cases. Just not enough (improvement and cases), imo.


I agree it's not as large as I would have hoped, but I get the impression lessening that hit is quite the uphill battle for both camps. Gotta inch forward every time.


----------



## evernessince (Dec 13, 2022)

Looks like another generation I'll be keeping my 1080 Ti.  Neither AMD nor Nvidia's cards are competing on price.  At this point that's painfully clear.  Instead AMD and Nvidia price around each other, ensuring that they both extract maximum profits while customers are left wondering how much they'll have to spend.



Makaveli said:


>



I didn't watch the video but that title is terrible clickbait.


----------



## shovenose (Dec 13, 2022)

Alright, it's been decided. I put in an offer on a used Dell OEM 6900 XT on eBay. I want that specific card because it's a true dual-slot card, so I can retain my PCIe x1 front panel USB-C adapter. If that offer is accepted by tomorrow morning when the 7900 series launches, great, I got a 6900 XT. If not, I will retract my offer and buy the 7900 XTX, yay for credit cards. Too bad for my front panel USB-C port at that point LOL. Like I commented on the XT review, I honestly really wanted a 7900 XT, but the price/performance is a joke compared to the XTX. If it were $799 or $849 I would have gone for it, but you lose over 15% performance for a 10% discount, which is NOT cool with me.


----------



## konga (Dec 13, 2022)

There's something weird going on with the 4090 results in this review. It is getting lower frame rates at 4K in a lot of games than it got with a 5800X in the original 4090 review. What's the explanation for this?


----------



## gmn 17 (Dec 13, 2022)

I'm going to wait for better drivers and a 3D V-Cache version of this GPU, hopefully out by northern-hemisphere summer 2023


----------



## TheinsanegamerN (Dec 13, 2022)

AusWolf said:


> I completely agree. The only thing you left out is Nvidia's complete disregard of the entry level, and AMD's _"okay, fine, here's something, just leave me alone"_ edition 6400.
> 
> 
> Exactly. Meanwhile, Nvidia has the same-looking shader cores with the same-looking RT cores as before. One step forward and one step back from AMD, no change from Nvidia.
> ...


Oh God, don't get me started on the total lack of low-end stuff. I went with an RX 560 LP years ago instead of a 1050 Ti, and felt the 1650 wasn't enough of an upgrade. 

What do I get? No cards from Nvidia that fit the 75 W TDP required for low-profile cards, Intel's Arc A380 being a mess and requiring BIOS settings my old OptiPlex media machine doesn't have, and AMD throwing out the laughingstock RX 6400 when they have mobile GPUs like the 6650M or 6700S that would have done the job FAR better as 75 W desktop parts. 

My only hope now is that some Chinese brand takes the aforementioned mobile GPUs and makes a low-profile model out of them


----------



## wolf (Dec 13, 2022)

TheinsanegamerN said:


> What do I get? No cards from Nvidia that fit in the 75w tdp required for low profile cards


Well, there was _one _option for those that must have LP, but you'd have needed relatively deep pockets for it: the RTX A2000. Absolutely though, they could have made a 3050 LE or something LP with a 'sane' price tag; they just chose not to.

Interestingly, this generation I feel like laptop GPUs are going to be really potent from both camps given the levels of efficiency on display. I reallllly hope that means at least one new LP card from both camps that raises the bar significantly over the GTX 1650/RX 6400


----------



## Space Lynx (Dec 13, 2022)

gmn 17 said:


> I'm going to wait for better drivers and 3dv cache version of this GPU hopefully out by NH summer 2023



I never thought of that before. Is that actually going to be a thing? 3D cache like on the 5800X3D will be on GPUs?


----------



## sepheronx (Dec 13, 2022)

Good review wiz

What a power-hungry, overpriced piece of shit of a card launch.

I'll be skipping this gen


----------



## AusWolf (Dec 13, 2022)

TheinsanegamerN said:


> Oh God, don't get me started on the total lack of low end stuff. I went with a rx 560lp years ago instead of a 1050ti, and felt the 1650 wasn't enough of an upgrade.
> 
> What do I get? No cards from Nvidia that fit in the 75w tdp required for low profile cards, Intel's ARC a380 being a mess and requiring bios settings my old optiplex media machine doesn't have, and AMD throwing out the laughingstock rx 6400 when they have mobile gpus like the 6650m or 6700s that would have done the job FAR better as 75w desktop parts.
> 
> My only hope now is some Chinese brand takes aforementioned mobile GPUs and makes a low profile model out of them


The 6400 is not a bad card with its 55-ish W TBP, if your system has PCI-e 4.0, or if you don't mind sacrificing some performance in some games with 3.0. It's just too expensive for what it is.


----------



## LuxZg (Dec 13, 2022)

Tech Ninja said:


> Isnt it time for a RT average chart?  Also what about FSR 2 vs DLSS 3.0?





W1zzard said:


> I read your mind earlier today, check the RT page



How about an RT price/perf chart, dollars per RT frame, etc.? Should be 5-10 minutes of work now that you have the averages and prices from the other charts. Well, hopefully, unless you draw the charts by hand


----------



## Gica (Dec 13, 2022)

tfdsaf said:


> Nvidia just 2 years ago released the 1030 with DRR4 ram under the same name and everything as the 1030 GDDR5! They just did the same just like 3 weeks ago with the RTX 3060 8GB, its a cut down card from the actual 3060, but you don't have that info anywhere! They are literally selling a 3050 with the same name as a 3060!
> 
> Again ALL of the Nvidia crap and BS and incompetence or greed or scams and whatnot never gets brought up! Nvidia is literally 10x times worse than AMD in terms of bad drivers, bad products, misleading products, scams, false narratives, false advertisement, anti consumer fraud, monopolistic tendencies, etc...
> 
> ...


The RTX 3060 8GB uses the same chip as the 12GB version (GA106-302-A1 versus GA106-300-A1). The same number of SMs, CUDA cores, Tensor cores, etc., even the same frequencies. Only the amount of memory and the bandwidth differ. The original chip was modified only on the bus (128 versus 192 bits) to support 8GB of vRAM. Theoretically, the 8GB version is cheaper.
GT1030 remains a wanted video card because there is nothing in the area that can compete with it (I'm not talking about gaming) and be more energy efficient. AMD has left such a huge void in the low cost area because even the ancient GT 710 survives and sells well.
What do you say about this:
See reviews with ryzen 3000 (example), you like the performance and buy a 3xx0*G*. At home, you find that you have a processor from the old series, 2000, and an integrated graphics that only shines in games.
You advertise PCIe 4 for the 5700X without saying that it brings zero performance gain.
I can continue until I reach the Bulldozer series, 4-6-8 cores on paper, 2-3-4 in the processor.
Let's be serious. nVidia is a small child next to AMD among these donkeys.


----------



## ratirt (Dec 13, 2022)

Not that bad for the 7900 XTX, but I honestly expected a bit more in the overall score for that card. Not to mention the 7900 XT is in a very weird place, to be honest. The price should not go above $800 for sure in the current market, but prices should come down for both RDNA3 and Ada, because these are still ridiculous. There is an improvement for the 7900 XTX over the 6900 XT though: around 40% better for the same price (MSRP). But I feel AMD could have done slightly better and gotten closer to the 4090 rather than the 4080.
I'm skipping both vendors this time. I think that is the only right decision this time around.


----------



## bearClaw5 (Dec 13, 2022)

In the RT tests is DLSS or any upscaling used?


----------



## W1zzard (Dec 13, 2022)

efikkan said:


> I'm disappointed to see even this review tries to make excuses for "missing driver optimizations". This is the sort of stuff I've come to expect from the likes of Hardware Unboxed and LTT, but not TPU.
> Normally, architectural changes are the first to be implemented in a new driver, normally long before engineering samples are done. The only reason to postpone implementation of a core architectural feature is if there is some issue with the hardware. So I see no reason to expect any significant change from driver updates, at least not anything to make the product compete at a higher performance tier.
> 
> Driver compiler? Optimized for RDNA3?
> ...


This time it's different. The "architecture" (instruction set) really doesn't matter, I agree with you. What matters is that the new units are dual issue, so the HLSL shaders that get compiled must use a special compilation path that optimizes, merges and combines instructions and data in a way so that they become aligned with how the architecture performs best. This is a difficult task that often requires hand-optimized code.

Random paper that I pulled up to give you an idea: http://www.ai.mit.edu/projects/aries/Documents/vliw.pdf

"Compiler" in this context means "shader compiler" of course, not the compiler that builds the driver code running on your PC

The good thing is that due to the way shader compiling and uploading works, AMD already has pretty much all the shader code that anyone ever compiled, so they can run statistical analysis on it, etc, they briefly touched on this in the press briefings. This is probably worth a separate thread, feel free to start one and link me.
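To give a feel for the scheduling problem a dual-issue shader compiler faces, here is a deliberately simplified sketch. The instruction format and the pairing rule are invented for illustration; real RDNA3 (VOPD) pairing constraints are far stricter.

```python
# Toy sketch of dual-issue instruction pairing: pack independent
# (dest, srcs) instructions two-per-cycle where data hazards allow.

def pair_for_dual_issue(instructions):
    """Greedily pack instructions into dual-issue slots.

    Two instructions may co-issue only if neither reads or writes
    the register the other writes.
    """
    remaining = list(instructions)  # don't mutate the caller's list
    scheduled = []
    i = 0
    while i < len(remaining):
        first = remaining[i]
        partner = None
        for j in range(i + 1, len(remaining)):
            cand = remaining[j]
            independent = (first[0] not in cand[1] and
                           cand[0] not in first[1] and
                           cand[0] != first[0])
            if independent:
                partner = j
                break
        if partner is not None:
            scheduled.append((first, remaining.pop(partner)))
        else:
            scheduled.append((first, None))  # second issue slot wasted
        i += 1
    return scheduled

prog = [
    ("v0", ("v1", "v2")),  # v0 = v1 * v2
    ("v3", ("v0", "v4")),  # reads v0 -> cannot pair with the first op
    ("v5", ("v6", "v7")),  # independent -> fills the first op's 2nd slot
]
print(pair_for_dual_issue(prog))  # 3 ops issue in 2 cycles instead of 3
```

The point of the sketch: the same three instructions take two or three cycles depending purely on how the compiler orders and pairs them, which is why hand-tuned compiler work matters so much for this architecture.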



bearClaw5 said:


> In the RT tests is DLSS or any upscaling used?


No upscaling, would be an unfair comparison otherwise


----------



## Space Lynx (Dec 13, 2022)

I don't think I'm going to go for the 7900 XTX after all. I was excited for it, but a grand is a lot of money, double the price of my 6800 XT, and honestly most of my games already cap out 165 Hz / 165 fps 1440p gaming anyway.

It would have been nice playing AC Valhalla and other more demanding games at 165 fps 165hz ultra though. but I suppose turning down a couple settings to hit that 165 fps isn't the end of the world.

I also don't like that there is no zero-fan mode anymore??? Fans don't stop when idle, and it has a very high idle power draw. Hmm, it just seems like a rushed product. I think AMD should have waited a bit longer before releasing this card.


----------



## Neo_Morpheus (Dec 13, 2022)

wolf said:


> Wait, aren't you the same "Neo Morpheus" that is always trashing Nvidia and kissing AMD's behind at Techspot?


Yes, and many other other sites and the nvidia trashing is based on valid reasons. Feel free to find them there.


wolf said:


> Welcome to Techpowerup!


Well, thank you!


----------



## Melvis (Dec 13, 2022)

Drivers, drivers, drivers... A new arch is going to take time to get all the bugs and performance issues sorted. Clearly the performance is there and the price-to-performance compared to Nvidia is good, and honestly who gives a F about ray tracing... but like all things AMD when it comes to their cards... drivers! Give it 6-12 months and the margin will close.

I won't be buying this card; anything over $600 is a big nope from me. I'm having a hard time wanting to spend $569 on a 5800X3D... sighs, and that's on sale at $100 off! FFS


----------



## Gica (Dec 13, 2022)

Space Lynx said:


> I think AMD should have waited a bit longer before releasing this card.


They had no choice. It is not good for the image or the profit to launch high-performance video cards only after the competition has launched almost its entire series, from premium to low cost. Taliban fans can wait for nVidia or AMD, but the "neutrals" are the most numerous and will buy what they find in stores, not ads.
They released an XTX version now because they had nowhere else to go. There's no chance for a 7950 XTX to surpass the 4090, and nVidia has kept room for a possible 4090 Ti. The RTX 4090 does not use the entire chip.


----------



## W1zzard (Dec 13, 2022)

Space Lynx said:


> I also don't like that there is no zero fan anymore??? Fans don't stop when idle


The fans do stop when idle. They just keep spinning for a relatively long time during cooldown, which might have confused some reviewers. See my chart on the Temps page.


----------



## the54thvoid (Dec 13, 2022)

Well, I'm sort of bummed by this generation. AIB cards will be interesting for noise etc, but price will be >1k, probably even for XT. Prices are ruining PC DIY community. Despite what many odd people think, not everyone can afford gfx cards at these prices. And the trickle down isn't working. A bloody 3070ti (Nvidia's 8th fastest card, and a generation old) is still 600-700 quid   . I'm changing things in life right now and I can't justify these prices. They're responsible for my growing disinterest in PC hardware. It's great for those that can (buy a 4090, or 4080) but given what I'm seeing in reviews, the _AMD cards need to be a lot cheaper_. Frankly, Nvidia can still go screw themselves until they're cheaper too.

Another generation bypassed until prices collapse.


----------



## Hofnaerrchen (Dec 13, 2022)

RDNA3 is an improvement over RDNA2, but should nVIDIA decide to match the RTX 4080's pricing with the 7900 XTX, then the nVIDIA product will be the better choice in my opinion. Not only for RT, but even more for efficiency reasons. Something nVIDIA will most likely focus its marketing on. So far RDNA3 strongly reminds me of Vega: comparable performance at the cost of (much) higher power consumption. Somewhat disappointing. 

The good thing though: Might force both parties to reduce prices much earlier than expected. Good for customers.


----------



## Space Lynx (Dec 13, 2022)

the54thvoid said:


> Well, I'm sort of bummed by this generation. AIB cards will be interesting for noise etc, but price will be >1k, probably even for XT. Prices are ruining PC DIY community. Despite what many odd people think, not everyone can afford gfx cards at these prices. And the trickle down isn't working. A bloody 3070ti (Nvidia's 8th fastest card, and a generation old) is still 600-700 quid   . I'm changing things in life right now and I can't justify these prices. They're responsible for my growing disinterest in PC hardware. It's great for those that can (buy a 4090, or 4080) but given what I'm seeing in reviews, the _AMD cards need to be a lot cheaper_. Frankly, Nvidia can still go screw themselves until they're cheaper too.
> 
> Another generation bypassed until prices collapse.



A variety of hobbies is the healthiest way to be. I'm actually surprised how easy it was for me to say no to the 7900 XTX today; it was consistent after every review I looked into, and not something I was expecting. I'm happy with my 6800 XT, especially since I'll be a 1440p gamer for a while to come and I have so much backlog there is legit no reason for any more power. Ray tracing still doesn't interest me, so no factor there.

In the mean time, I want to keep saving up the rest of my money and future earnings to go back to England to my fiance, that is my true happiness. I wish it was easier to get a work visa in England, I'd be with her right now still.


----------



## gridracedriver (Dec 13, 2022)

The RX 7900 XTX's voltage/frequency curve practically has the handbrake on: it averages 2.6 GHz at only 0.90-0.95 V.






For comparison curve RTX 4080




Average 2.75 GHz at 1.05 V

Besides the software side of dual issue, there is hardware optimization left a little behind; between driver fixes and a clock fix, could a future x50 refresh easily gain +20% over these x00 cards?

We will see if N32 still has these problems.


----------



## Chomiq (Dec 13, 2022)

It's like 90% of the time AMD is playing catch-up. On top of that, we're still seeing issues that have been present since at least the 5xxx series (multi-monitor power draw that was supposed to be resolved by a driver update).


----------



## Space Lynx (Dec 13, 2022)

Chomiq said:


> It's like 90% of the time AMD is playing catch-up. On top of that, we're still seeing issues that have been present since at least the 5xxx series (multi-monitor power draw that was supposed to be resolved by a driver update).



I do think AMD would be better off hunkering down on what they do best, instead of feeling the need to always have something Nvidia has exclusivity to. Like ray tracing: let Nvidia have it. All the time and resources spent on that could have been spent on a more polished product.

AMD's message could have been simple: if you want ray tracing, buy Nvidia, but here is what we can offer you. Personally, I just want the smoothness of 165 fps / 165 Hz gaming at medium to high settings. Unfortunately, some games like RDR2 still can't hit that, even with the bells and whistles turned off.


----------



## Bwaze (Dec 13, 2022)

Good, thorough review, thanks to all involved.

But these releases only confirm the new reality: "upper midrange" is now $1000-1200. Yeah, AMD is a bit cheaper, but with slower ray tracing, less powerful upscaling, less usability outside gaming, and various recurring bugs (very high multi-monitor idle power draw), it also offers a bit less value.

I'm sure they'll be able to sell all the cards they will be able to make, but I don't think these products are in any way forcing Nvidia to lower the prices of RTX 4080 and upcoming RTX 4070 Ti...

I also don't think that roughly matching Nvidia prices is the way to greater market share. But I think that's OK by AMD, they know how many cards they can produce.


----------



## Flanker (Dec 13, 2022)

Nice. No upgrade itch for me


----------



## medi01 (Dec 13, 2022)

pavle said:


> Despite all the culling optimizations, they don't do so good in ray tracing


This sort of comment blows my mind.
I expected a way bigger gap than just 16%.
Actually, AMD has closed the gap.

It is raster performance that I expected to be better, to be honest.

Especially given the chip sizes: 380 mm² vs a mixed salad of 530 mm² total, with the biggest chip at 300 mm² or so.

But then, the 4090 is only 22% faster. Diminishing returns, I guess. And very likely FineWine will happen shortly.



Gica said:


> Huge multimonitor consumption


Only in weird setups, it seems (and I guess that is how it slipped through QA). Formidably low power consumption in the dual-monitor setup test by ComputerBase.




john_ said:


> Nvidia prices have nothing to do with AMD falling to offer an RTX 4080 killer at $1000.


People keep repeating it, but I simply do not see it.

Last gen NV was forced to DROP A TIER on their GPUs (remember that weird 10GB 3080 vs 12GB 3060? Guess why that happened)

This gen NV was forced to "unlaunch" the smaller 4080, it stacked up so badly against RDNA3.

So what we have is a product $200 cheaper than the 4080 that is a bit faster at raster and 14% slower at RT.
Mix "features" in, use RT perf as the metric, and you get roughly the same perf/$.

Amazing for NV is that:
It is just 380mm2 vs 500+ on competitor (mkay, chiplets even it up a bit... or not that much?)
It only has 16GB of that more expensive VRAM vs 24GB on competitor.

-----------------------------------------------------

Had anyone napkin math-ed chip costs, 4080 vs 7900XTX?


----------



## Bomby569 (Dec 13, 2022)

I can't get my head around how people even consider this good value. Never mind the paper launch (especially in Europe, as usual); on reddit the price for an AIB model was 1300€

		https://www.reddit.com/r/Amd/comments/zgqi13
and it doesn't seem farfetched at all

I get that most people on TPU don't really represent your typical average GPU buyer, but damn, are you all driving Lambos or some shit?

Reviewers are even worse.

Looking at prices for last-gen Nvidia and AMD cards in Europe, this is going to insane lengths, and I doubt it's sustainable.


----------



## Prima.Vera (Dec 13, 2022)

Those cards should have been priced 500$ and 750$ in my opinion.
The new trend in prices is a disaster for the consumer... sad


----------



## Bwaze (Dec 13, 2022)

$1000 + VAT + higher seller margins than in US should be no less than 1200 EUR in Germany with 19% VAT, of course higher elsewhere.

But that's base model price, AIB partners will of course also offer models with much higher prices.
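That arithmetic can be sanity-checked in a couple of lines; the ~1:1 USD/EUR exchange rate and the 5% extra seller margin below are assumptions for illustration, not figures from the post.

```python
# Rough check of the EU launch-price arithmetic above.
msrp_usd = 999
usd_to_eur = 1.0   # late-2022 exchange rate was close to parity (assumed)
vat_de = 0.19      # German VAT
margin = 0.05      # assumed extra EU seller margin vs. US pricing

price_eur = msrp_usd * usd_to_eur * (1 + vat_de) * (1 + margin)
print(round(price_eur))  # lands comfortably above the 1200 EUR floor
```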



Bomby569 said:


> I think at this rate looking for prices or last gen cards for nvidia and amd in Europe, this is going to insane lengths and i doubt it's sustainable.




Price / performance wise RX 7900 XTX is better than RTX 4080, but yeah, only on par with previous generation - so ZERO price / performance increase.

What's worrying is that RX 7900 XT has worse price / performance than XTX and last gen cards - what does that tell us about upcoming midrange cards? Don't bother, buy RX 6000's?


----------



## RainingTacco (Dec 13, 2022)

Toss said:


> how about PRODUCTIVITY TESTS?
> x264, AV1 H265
> CONDING ENGOCING DECODING VIRGINING?



It's crap; rendering time (Blender) is double that of the 4080.


----------



## Fluffmeister (Dec 13, 2022)

This card oozes buyer's regret to me. I mean, if you're already in the market for a card costing a grand, you might as well spend a bit more and get the better all-round product in the 4080.


----------



## mb194dc (Dec 13, 2022)

Fluffmeister said:


> This card oozes buyer's regret to me. I mean, if you're already in the market for a card costing a grand, you might as well spend a bit more and get the better all-round product in the 4080.



Most use cases are better off with last gen than either... Only if you want the top RT performance or to game at 4K 120 fps+ is it worth bothering with the new gen cards.  

My suspicion is both teams cards sit gathering dust on the shelves largely once Christmas is done.


----------



## Bwaze (Dec 13, 2022)

Fluffmeister said:


> This card oozes buyer's regret to me. I mean, if you're already in the market for a card costing a grand, you might as well spend a bit more and get the better all-round product in the 4080.



In Europe I doubt there will even be any price difference. RTX 4080s that were offered for 1500 EUR are now priced from 1340 EUR; we've already had two price cuts. And they're still all available.


----------



## Acesbong (Dec 13, 2022)

These cards would be great and exciting at 599 and 699€. People are ripping Nvidia for its naming scheme, but AMD has done the exact same thing: these are 7800-series cards being price gouged. Considering the economics of the world at the moment, who still thinks AMD are the good guys? Just as bad as Nvidia.


----------



## z1n0x (Dec 13, 2022)

RDNA2 was a big step in the right direction. We were hopeful that AMD is getting back on track to properly compete with Nvidia, and we get this. Good to know that RTG is still a shitshow. Jensen can go back to cheap Samsung process to maximize profits. All in all Another Major Disappointment.


----------



## Bwaze (Dec 13, 2022)

Who knows, maybe we'll see our own "E.T. The Game" event: a video game at Christmas 1982 so bad that it almost singlehandedly caused the collapse of Atari VCS console sales, causing three years of nearly zero sales in computer gaming - sales of games and hardware fell by 97%!


----------



## ARF (Dec 13, 2022)

Space Lynx said:


> I do think AMD would be better office hunkering down on what they do best, instead of feeling the need to always have something Nvidia has exclusivity too. Like ray tracing, let Nvidia have it, all the time and resources spent on that could have been spent on a more polished product.
> 
> AMD's message could have been simple, if you want ray tracing buy Nvidia, but here is what we can offer you. Personally, I just want the smoothness of 165 fps 165hz gaming at medium to high settings. Unfortunately, some games like RDR2 still can't hit that, even with the bells and whistles turned off.



Ray-tracing is part of the M$ DirectX specification now. AMD cannot afford to simply not support DX 12.


----------



## Bomby569 (Dec 13, 2022)

ARF said:


> Ray-tracing is part of the M$ DirectX specification now. AMD cannot afford to simply not support DX 12.


DX12 ultimate you mean


----------



## wNotyarD (Dec 13, 2022)

Space Lynx said:


> Variety of hobbies is the healthiest way to be, I am actually surprised how easy it was for me to say no to the 7900 XTX today, it was consistent after every review I looked in to, and not something I was expecting. I'm happy with my 6800 XT, especially since I am a 1440p gamer for awhile to come and I have so much backlog, there is legit no reason for anymore power. Ray tracing still doesn't interest me, so no factor there.


Same here, only with a 3070 instead of your 6800XT. It just runs anything at 1440p over 120 fps, which makes my FreeSync monitor just fluid.
No need for anything new right now, and I'm not even taking new tech absurd prices into account for that conclusion.


----------



## ARF (Dec 13, 2022)

I think that this *Navi 31* (306 sq. mm central die) is a second-tier chip which is equivalent to Radeon RX 7800 in all honesty.
Which means that AMD desperately needs two things:
1. to lower the prices to $500 and $700 now;
2. to release an RTX 4090 competitor with larger die size - *Navi 30* with 450 sq. mm central die.


----------



## RainingTacco (Dec 13, 2022)

ARF said:


> 2. to release an RTX 4090 competitor with larger die size - *Navi 30* with 450 sq. mm central die.


Isn't AMD using a smaller node to compete with Nvidia due to AMD's less advanced infrastructure? If so, I guess the price per square mm is higher for AMD, and hence they can't go for die-size parity.


----------



## ARF (Dec 13, 2022)

RainingTacco said:


> Isn't AMD using smaller node to compete with nvidia due to AMD less advanced infrastructure? If so, i guess the price per square mm is higher for AMD, and hence they can't go on surface parity for their dies.



No, the price per square mm is lower for AMD because it uses two older processes compared to nvidia. 7nm+ for the small chiplets and "5nm" for the central die, while nvidia uses "4nm" ("5nm"+) for the whole chip.


----------



## RainingTacco (Dec 13, 2022)

ARF said:


> No, the price per square mm is lower for AMD because it uses two older processes compared to nvidia. 7nm+ for the small chiplets and "5nm" for the central die, while nvidia uses "4nm" ("5nm"+) for the whole chip.



Ahhh, I mistook it for the older Nvidia generation; thanks for clarifying that the 4000 series is on the smaller node.


----------



## OfficerTux (Dec 13, 2022)

How can they lower the price to $500 if 20 GB of GDDR6 alone costs ~$300:





www.digikey.com




(16 Gbit / 2 GB per chip; AMD uses 10 chips on the XT and 12 on the XTX)

The board and other components (excluding the GPU chip(s)) will probably cost another 150$ (especially power supply chips are very expensive and hard to get at the moment).

This does not even include engineering, distribution, sales and support. Honestly I do not think that the margins on these cards are as high as people here claim. They could probably go down by another 100$, but everything below that seems unrealistic to me.

Edit: Yes I know, AMD will get better prices for components than the average Joe on Digikey, but I think the order of magnitude is correct.
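For anyone who wants to redo the napkin math, a tiny sketch. The ~$30 per 16 Gbit (2 GB) chip is the retail figure implied by the post's ~$300-for-20 GB claim; AMD's contract pricing would be considerably lower.

```python
# Napkin math for the VRAM cost estimate above, at assumed retail prices.
price_per_chip_usd = 30                     # one 16 Gbit (2 GB) GDDR6 chip
configs = {"7900 XT": 10, "7900 XTX": 12}   # chips per card, per the post

for card, chips in configs.items():
    vram_gb = chips * 2
    cost = chips * price_per_chip_usd
    print(f"{card}: {vram_gb} GB GDDR6 ~ ${cost} at retail")
```

The XT line works out to $300 for 20 GB, matching the figure quoted above; the XTX adds two more chips on top of that.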


----------



## big_glasses (Dec 13, 2022)

Perf is about what was expected - decent - but the "new" horrid pricing remains.
Won't change this gen anyway, though I had decided that before any of the cards were out. Don't see a reason to upgrade every gen; it just ain't worth it to me. 



Bomby569 said:


> I can't get my head around how people even consider this a good value. Nevermind the paper launch ( especially on Europe as usual) on reddit the price for a aib model was 1300€
> 
> __
> https://www.reddit.com/r/Amd/comments/zgqi13
> ...


It's "reasonable" or "good value" because the 40x0 series is even worse/more expensive :/
And the ~1300€ for the EU should be expected, sadly: 999 + tax + AIB tax + store profit


----------



## Bomby569 (Dec 13, 2022)

OfficerTux said:


> How can they lower the price to 500$ if 20Gb of GDDR6 alone cost ~300$:
> 
> 
> 
> ...



the 1080 Ti had 11GB and cost $699; I doubt there is any difference in the % of total card cost represented by the memory between the two cards.


----------



## elemelek (Dec 13, 2022)

Would be nice to get some h264,h265,AV1 encode performance benchmarks for AMD and nVidia cards.
Or not many people are interested in it, so reviewers don't bother with that?

edit:
dropping it here in case someone is interested: https://techgage.com/article/amd-radeon-rx-7900-xt-radeon-rx-7900-xtx-creator-review/
Seems 7900s are faster than Ada in handbrake. nice


----------



## TheinsanegamerN (Dec 13, 2022)

wolf said:


> Well, there was _one _option for those that must have LP, but you'd have needed relatively deep pockets for one, the RTX A2000. Absolutely though they could have made a 3050LE or something LP with a 'sane' price tag, they just chose not to.
> 
> Interestingly this generation I feel like Laptop GPU's are going to be really potent from both camps given the levels of efficiency on display, I reallllly hope that means at least one new LP card from both camps that raises the bar significantly over the GTX1650/RX6400


The A2000 would have been a wonderful 3050ti, especially with the LP blower cooler that nobody has made for consumers yet. But certainly not at $600. 

Laptops are increasingly where it's at; given the prices of desktop hardware, a gaming laptop is a viable alternative, which is totally nuts. 


AusWolf said:


> The 6400 is not a bad card with its 55-ish W TBP, if your system has PCI-e 4.0, or if you don't mind sacrificing some performance in some games with 3.0. It's just too expensive for what it is.


Well, the 1650 wasn't worth the $159 upgrade price years ago; paying $159 for an AMD card that cannot consistently beat said 1650 is definitely out of the question. 

From the limited 4GB framebuffer to terrible PCIe 3.0 system performance (which mine is), the 6400 was a slap in the face. I was so hopeful the 6GB A380 would be better than it was.


----------



## OfficerTux (Dec 13, 2022)

Bomby569 said:


> the 1080ti had 11GB and cost 699$, i doubt there is any difference in the the % of the total card cost represented by the memory, between the 2 cards.


So that's roughly half of what the 7900 has; add another $150 for the DRAM and factor in 10% inflation since 2017, and you arrive at what we have now.

I am not saying that I like the current prices, I'd really like to go back to 700$ for a high-end GPU, but with the current market situation I think that is highly unrealistic.


----------



## Bomby569 (Dec 13, 2022)

OfficerTux said:


> So that's roughly half of what the 7900 has, so add another 150$ for the DRAM and factor in 10% of inflation since 2017 and you arrive at what we have now.
> 
> I am not saying that I like the current prices, I'd really like to go back to 700$ for a high-end GPU, but with the current market situation I think that is highly unrealistic.



What are you on about? That's not how RAM pricing works; a couple of years back you paid similar prices for 4GB of RAM, then 8GB, then 16GB on your PC.

So a 486 with 8MB of RAM cost what? 10 cents?


----------



## Bwaze (Dec 13, 2022)

TheinsanegamerN said:


> Laptops are increasingly where its at, given the prices of desktop hardware a gaming laptop is a viable alternative, which is totally nuts.



Is it really? I helped choose a relatively fancy Gigabyte A5 K1: Ryzen 5600H, 32 GB RAM, RTX 3060 high TGP. It was on sale for less than 800 EUR; laptops with similar configurations are about 1300 EUR normally. And I wasn't impressed. The blowers aren't really able to cool the full 180 W this thing can burn, so it's thermally throttled. Loud, and a 3060 in laptops is slower than a desktop 3050. It's actually so loud it's distracting while wearing ear-covering headphones, and distracting to others across the house.


----------



## TheinsanegamerN (Dec 13, 2022)

Bwaze said:


> Is it really? I helped choose a relatively fancy Gigabyte A5 K1: Ryzen 5600H, 32 GB RAM, RTX 3060 high TGP - it was on sale for less than 800 EUR, laptops with similar configuration are about 1300 EUR normally. And I wasn't impressed. Blowers aren't really able to cool the full 180W this thing can burn, so it's temp throttled. Loud, and 3060 in laptops is slower than desktop 3050. It's actually so loud it's distracting while wearing ear covering headphones, and distracting to others across the house.


According to Notebookcheck, that's one of the loudest laptops they've tested at 57-58 dB. I'll give them props for the removable battery, but that tells me the Clevo barebones that laptop is built on are very old, from a bygone era, which is reflected in the thermal and noise tests.





www.notebookcheck.net




Most of their gaming notebooks are between 47-51 dB these days. Laptops like the Legion 7 Gen 7 are rather impressive; they can handle the power of a 6900HS and a 6850M XT without throttling and at under 50 dB.

It's also different strokes: I didn't mind my old Alienware 15 R2 at full tilt when gaming, but I also had it under the desk hooked up to an external monitor, so your setup and mileage will vary.

For reference, here's another laptop from the same company (Gigabyte), same screen size, with a larger GPU and a more power-hungry CPU, that only hits 52 dB and averages 48 dB under load:





www.notebookcheck.net


----------



## OfficerTux (Dec 13, 2022)

Bomby569 said:


> what are you on about. That's not how any RAM prices work, a couple of years back you paid similar prices for 4GB RAM, then 8GB, 16GB on your PC.
> 
> So a 486 with 8MB or RAM cost what? 10 cents.


Well yes, the overall price per GB of DRAM went down, but GDDR6 is more expensive than GDDR5X, and I am pretty sure the price per GB has not halved since 2017. Unfortunately I was not able to find any numbers to back my claims up, so I hereby admit defeat.

Anyway, in 20 minutes the sale for the 7900s will go live in my country, so I am excited for the real world prices.


----------



## sepheronx (Dec 13, 2022)

Bomby569 said:


> I can't get my head around how people even consider this a good value. Nevermind the paper launch ( especially on Europe as usual) on reddit the price for a aib model was 1300€
> 
> __
> https://www.reddit.com/r/Amd/comments/zgqi13
> ...


No, it's just a high level of cope from the members here. I don't think it's much different on Reddit either.

The instant they say "it's RT, RT doesn't matter", you instantly realize it's copium to the max. More blinding than RT itself.


----------



## DemonicRyzen666 (Dec 13, 2022)

wolf said:


> Marvels guardians of the galaxy
> 3090 has 48.4% of RT off perf with RT on
> 4090 has 52.4% of RT off perf with RT on ~8% better result than Ampere
> 
> ...



Don't you mean:

+4% in Marvel's Guardians of the Galaxy

+8% in Cyberpunk 2077

Newer card's percentage drop minus older card's percentage drop = improvement in the architecture.

A lower % drop means better performance; a higher % drop means worse performance than the previous cards.
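For what it's worth, both readings fall out of the same numbers; they just answer different questions. A quick sketch of the arithmetic, using wolf's Guardians of the Galaxy figures quoted above:

```python
# Fraction of RT-off performance retained with RT enabled
# (the figures quoted above).
ampere_retained = 0.484  # RTX 3090
ada_retained = 0.524     # RTX 4090

# Absolute difference in percentage points: the "+4%" reading.
points = (ada_retained - ampere_retained) * 100

# Relative improvement: the "~8% better result" reading.
relative = (ada_retained / ampere_retained - 1) * 100

print(f"{points:.1f} percentage points")  # 4.0
print(f"{relative:.1f}% relative")        # 8.3
```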


----------



## Chomiq (Dec 13, 2022)

Acesbong said:


> These cards would be great and exciting at 599 and 699€. People ripping Nvidia for naming scheme, AMD has done the exact same thing. These are 7800 series cards being price gouged. Considering the economics of the world at the moment , who still thinks AMD are the good guys? Just as bad as Nvidia.


I do agree that both 7900 and 4080 series are not worth their asking price and if it wasn't for this whole mining bonanza price creep both Nvidia and AMD would have been laughed out for pricing them at around or above $1k.
Seeing how we still see frequent sales on 6000-series from AMD and 30-series from Nvidia we won't see any price drop for the next 2-3 months, at least.


----------



## N3M3515 (Dec 13, 2022)

ratirt said:


> There is an improvement for the 7900xtx over 6900xt though. Around 40% better for the same price (MSRP)


That would be fine *IF* the 4080 did not exist. The problem is that the 7900 XTX replaces the 6900 XT only in price; in performance it actually replaces the 6800 XT. Also, the 4080 is overpriced by at least $450. So... the 7900 XTX is also overpriced, by $300.



Bwaze said:


> what does that tell us about upcoming midrange cards? Don't bother, buy RX 6000's?


kind of feels that way.


----------



## ratirt (Dec 13, 2022)

N3M3515 said:


> That would be fine *IF* 4080 did not exist. The problem is that the 7900XTX is the replacement of the 6900 XT but only in price. In performance it actually replaces the 6800XT. Also, the 4080 is overpriced by at least $450. So........7900XTX is also overpriced, by $300.


I'm not sure how you concluded that the 7900 XTX replaces the 6800 XT. I went with the notion of flagship to flagship (chip to chip), so I have not considered the 6950 XT, since that is literally the same chip as the 6900 XT, just better binned and with increased memory bandwidth. The 7900 XTX is a new chip release, not a better-binned 7900 XT.
No matter what your take is, the price stayed the same for the 7900 XTX vs the 6900 XT. Anyway, the price is not great and it should have been lower for sure.


----------



## Arco (Dec 13, 2022)

Let the games begin! AMD's queue is live right now. Hopefully I get one.


----------



## N3M3515 (Dec 13, 2022)

ratirt said:


> I'm not sure how did you conclude that the 7900xtx replaces the 6800xt?


I don't know how to explain it more clearly: it does not matter that the 7900 XTX is the fastest Radeon, BECAUSE it competes with the 4080 in performance. Guess which GPU competed with the 3080 in performance last gen?
AMD can call it whatever they want; performance does not lie.


----------



## Braegnok (Dec 13, 2022)

Arco said:


> Let the games. BEGIN! AMD is queuing right now. Hopefully I get one.



I checked 12 stores and all are sold out; one store has a coming-soon pre-order.




On eBay, however, scalpers have several listings, starting at $1,449.00 and going up to $1,749.00.

I'm in the AMD live queue; we'll see what happens.


----------



## Arco (Dec 13, 2022)

Braegnok said:


> I checked 12 stores and all are sold out, one store has a coming soon, pre-order.
> 
> View attachment 274309
> 
> ...


Haha aha, you thought. All of them were gone instantly except for the XT.


----------



## kapone32 (Dec 13, 2022)

Every card is sold out everywhere. The ASRock Aqua is $2,100 CAD. I am getting the Gigabyte and a block from Alphacool; I should be able to save $500-600.



kapone32 said:


> Every card sold out everywhere. The As Rock Aqua is $2100 CAD. I am getting the Gigabyte and a block from Alphacool. I should be able to save $500-600.


I can buy an XT from AMD, but I want the XTX.


----------



## Braegnok (Dec 13, 2022)

What a joke. After waiting in the queue for 45 minutes... @Arco, you're correct, the XTX went out of stock instantly on launch day!


----------



## AnotherReader (Dec 13, 2022)

In Canada, the XTX was available at retail for a small markup over the US price: $1,399 CAD, or about 1,033 USD. AMD's queue was quick, but the XTX is sold out there too. AMD's own price was closer to the US price: $1,364.66 CAD, or about 1,008 USD.


----------



## Arco (Dec 13, 2022)

I will just chill with my 1060 3GB and enjoy other games until the 4090 is in stock.


----------



## ThrashZone (Dec 13, 2022)

Braegnok said:


> I checked 12 stores and all are sold out, one store has a coming soon, pre-order.
> 
> View attachment 274309
> 
> ...


Hi,
Nothing new on first release 
Call me in 6 to 12 months


----------



## neatfeatguy (Dec 13, 2022)

If I were in the market for a new GPU right now, I'd rather pick up a 6950 XT at my local Micro Center for about $780. It falls just 5-10% behind the 7900 XT, and the power draw is very similar. I'd save myself a good $100 for a card that's just about the same as a 7900 XT.


----------



## shovenose (Dec 13, 2022)

Who got one? I was at Newegg at 6:01, and by the time I got to add it to my cart and check out, they were already all gone. Oh well. I decided that going from SATA to NVMe for my boot drive would be worth it, so I ordered a Samsung 980 Pro 1TB with the $100 savings and bought an XT instead. Overall it will make me happier than the XTX would have, since I use my PC for more than just gaming, and happiness per dollar matters too.


----------



## Arco (Dec 13, 2022)

shovenose said:


> Who got one? I was at Newegg at 6:01 and by the time I got to add it to my cart and check out they were already all gone. Oh well, I decided that going from SATA to NVMe for my boot drive would be worth it so I ordered a Samsung 980 Pro 1TB with the $100 savings and bought an XT instead. Overall it will make me happier as I use my PC for more than just gaming than if I'd bought the XTX and happiness per dollar matters too.


At this point, I'm just waiting until the 4090 is in stock. Time be damned. The GTX 1060 3GB plays Doom Eternal at 4k 60 fps low anyways somehow.


----------



## OfficerTux (Dec 13, 2022)

I was able to order a 7900 XTX in Germany for 1,150€ from notebooksbilliger.de; let's see if they actually have enough stock to supply all orders.

The price on AMD.com was 1,125€, but those were sold out before I was out of the queue.

Some shops are now selling them for as high as 1,300€ (for a reference design card); custom designs go even higher, which makes them highly uninteresting since an RTX 4080 can be had for 1,340€.

Edit: For people in Germany: The offer from NBB is still live: https://www.notebooksbilliger.de/powercolor+rx+7900xtx+mba+24gb+789047?nbb=pn.&nbbct=1002_10#Q0C10#Q0C10#Q0C10 Sold out now


----------



## dir_d (Dec 13, 2022)

Fluffmeister said:


> This card oozes buyers regret to me, I mean if you're already in the market for a card costing a grand, you might as well spend a bit more and get the better all round product of 4080.


My suspicion is that these cards are priced to push previous-gen inventory; that's why so few cards are available. Push the price up through limited availability, and if that's too much, you sell previous-gen stock. It's a win-win for them.


----------



## Fasola (Dec 13, 2022)

Bwaze said:


> Who knows, maybe we'll see our own "E. T. The Game" event, a video game in Christmas 1982 so bad that it almost singlehandedly caused the collapse of Atari VCS console sales, causing three year of nearly zero sales in computer gaming - sales of games and hardware fell by 97%!


Perhaps we'll be digging in landfills for high-end GPUs in a couple of years.


----------



## OfficerTux (Dec 13, 2022)

AMD has just released the official driver for the 7900 series.

From the release notes (https://www.amd.com/en/support/kb/release-notes/rn-rad-win-22-12-1-rx7900):
Highlights
- Support for Radeon™ RX 7900 Series Graphics.

Known Issues
- Corruption may be encountered when using Virtual Super Resolution with multi-display configurations [Resolution targeted for 22.12.2].
- A system crash may be observed when changing display modes with 4-display configurations [Resolution targeted for 22.12.2].
- *High idle power has situationally been observed when using select high resolution and high refresh rate displays.*
- Intermittent app crashes or driver timeout may occur when using Radeon Super Resolution with some extended display configurations.
- Video stuttering or performance drop may be observed during gameplay plus video playback with some extended display configurations.
- Stuttering may be observed in UNCHARTED™ 4: A Thief’s End during the opening game sequence.
- While loading Marvel’s Spider-Man: Miles Morales™ an app crash or driver timeout may occur after enabling ray tracing settings [Resolution targeted for 22.12.2].


----------



## Tatty_One (Dec 13, 2022)

My three main go-to online retailers in the UK all have XTs in stock, but none of them have XTXs; the XTs start at £899.


----------



## neatfeatguy (Dec 13, 2022)

My local Micro Center was out of stock on the XTX within 20 minutes of opening. They still have some XT listed in stock, but I don't know how many exactly.


----------



## kapone32 (Dec 13, 2022)

It would seem that everyone wants the XTX. I can see the price of the XT coming down in the coming weeks. I just hope there is a restock coming next week; I will check Newegg at midnight to see if they restock the XTX.


----------



## 75Vette (Dec 13, 2022)

W1zzard said:


> That's the current market price. I tried to find one at lower price this morning, not possible, all sold out.



By that logic the newly reviewed XFX 7900 XTX Merc 310 should be listed at $2,149 USD: it's sold out on Newegg, and that is the cheapest eBay buy-it-now price.

It seems kind of confusing to use different standards for different cards.

You also can't get a $1,200 4080 anywhere, yet that is still the price listed in the comparison tables.


----------



## wEeViLz (Dec 13, 2022)

Hope this pushes the 4080 closer to the $800-$900 mark.


----------



## efikkan (Dec 13, 2022)

W1zzard said:


> This time it's different. The "architecture" (instruction set) really doesn't matter, I agree with you. What matters is that the new units are dual issue, so the HLSL shaders that get compiled must use a special compilation path that optimizes, merges and combines instructions and data in a way so that they become aligned with how the architecture performs best. This is a difficult task that often requires hand-optimized code.
> 
> Random paper that I pulled up to give you an idea: http://www.ai.mit.edu/projects/aries/Documents/vliw.pdf


I'm quite familiar with VLIW. AMD's family of architectures prior to GCN, TeraScale, used several variants of this type of design. Shader compilers have no issue with this; they are tailored to optimize the shader programs from IR to native instructions specific to each GPU architecture. This isn't something new, but PR people can absolutely make it sound fancy.

AMD (and the others) certainly have the ability to override shader programs with their own, but this is rarely done for a whole host of reasons; including the difficulty as most are in IR, the risk of introducing bugs or glitches, the sheer number of shader programs (could be hundreds in a single game), game patches overriding this effort, and the difficulty for developers to debug the game with these "optimizations". I say "optimizations" in quotes, because hand-optimized shaders are usually about removing complexity without sacrificing too much fidelity to notice, unlike the shader compiler which retains the original logic.

AMD certainly have a huge repository of shader programs, which is what they use to develop their shader compilers.

There are games which do compile shaders during rendering. I haven't checked why, but it's certainly conceivable that some generate countless permutations of shaders based on object properties to save logic inside the shaders.
And the complexity and number of shaders in games are going up, so it's inconceivable that AMD would hand-optimize most of them; it would probably be hundreds of thousands, for each GPU architecture. It would still be a lot even if you limited it to only the top games.
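To make the dual-issue point concrete, here's a toy sketch (my own illustration, nothing like AMD's actual compiler): the compiler has to find pairs of independent instructions to co-issue, e.g. with a greedy pass over the instruction stream:

```python
# Toy model of a dual-issue pairing pass. Each instruction is
# (dest, sources). Two adjacent instructions can co-issue only if
# neither reads the other's destination. Real shader compilers are
# far more constrained (register banks, legal opcode pairs, etc.).
def pair_for_dual_issue(instrs):
    bundles, i = [], 0
    while i < len(instrs):
        a = instrs[i]
        if i + 1 < len(instrs):
            b = instrs[i + 1]
            independent = a[0] not in b[1] and b[0] not in a[1]
            if independent:
                bundles.append((a, b))  # co-issue both in one cycle
                i += 2
                continue
        bundles.append((a,))            # dependency: issue alone
        i += 1
    return bundles

prog = [("r0", ("r1", "r2")),  # r0 = r1 op r2
        ("r3", ("r4", "r5")),  # independent -> pairs with the above
        ("r6", ("r0", "r3"))]  # depends on both -> issues alone
print(pair_for_dual_issue(prog))
```

A dependency-heavy shader ends up with mostly single-instruction bundles, which is exactly why the dual-issue units can sit underutilized without compiler (or hand) tuning.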


----------



## AusWolf (Dec 13, 2022)

TheinsanegamerN said:


> The A2000 would have been a wonderful 3050ti, especially with the LP blower cooler that nobody has made for consumers yet. But certainly not at $600.
> 
> Laptops are increasingly where its at, given the prices of desktop hardware a gaming laptop is a viable alternative, which is totally nuts.
> 
> ...


Not to mention the A380 didn't even come in a half-height (LP) version, which it should have, imo. There was news about an A310, but 1. it's crap based on specs, and 2. I haven't seen it anywhere; just like the GeForce GT 1010, it was a paper launch.


----------



## nikoya (Dec 13, 2022)

OfficerTux said:


> Anyway, in 20 minutes the sale for the 7900s will go live in my country, so I am excited for the real world prices.



I couldn't get my hands on an XTX at the Day 1 midnight opening (6 h ago) in China. There were around 40-50K people queuing on the Jingdong (JD.com) online store.
I didn't win the lucky draw.


----------



## Dr. Dro (Dec 13, 2022)

Pumper said:


> RT performance hit being pretty much identical to 6000 series is a bummer. Intel did a better job on their first try for fuck's sake.



But moooom! Raja Koduri is an incompetent fool, Reddit told me so!!! Blame Raja for RDNA3 being bad, and wait for Vega RDNA 4! In the meantime, downgrade to 22.5.1, that will fix the bad RT performance.



nikoya said:


> I couldn't get my hand a XTX on Day 1 opening midnight (6h ago) in China. there were arround 40-50K people queuing globally on Jingdong online store.
> I didn't pass the lucky draw



That is probably for the best, my friend. You might want to wait until the initial driver and firmware bugs are resolved in a couple of months' time. By then we'll also have custom models with 3 or 4 8-pin inputs and liquid cooling, possibly with a chip revision (like the Navi 21 XTXH), and that's when you should buy one. That's when I'll decide what I do: whether I flip my 3090 and buy one, or keep it and pass it to my secondary rig, etc.

One thing is for sure, I cannot afford an RTX 4090 (as I can't buy two of them, that is my golden rule), and buying one would be ridiculously unwise at the present moment.


----------



## gmn 17 (Dec 14, 2022)

Space Lynx said:


> I never thought of that before, is that actually going to be a thing? 3d cache like on 5800x3d will be on gpu's?


Yeah, there are some rumours about it; it could be happening.








AMD Radeon RX 7990 XTX Specs
AMD Navi 31, 3599 MHz, 6144 Cores, 384 TMUs, 192 ROPs, 24576 MB GDDR6, 3000 MHz, 384 bit
www.techpowerup.com


----------



## AusWolf (Dec 14, 2022)

All 7900 XT and XTX cards are sold out at Scan UK. Some XT models are available at Overclockers UK, but the rest are sold out along with all XTX models.

Meanwhile, the 4080 is seeing massive discounts at Scan.

Who would have thought?


----------



## terroralpha (Dec 14, 2022)

shovenose said:


> Who got one? I was at Newegg at 6:01 and by the time I got to add it to my cart and check out they were already all gone. Oh well, I decided that going from SATA to NVMe for my boot drive would be worth it so I ordered a Samsung 980 Pro 1TB with the $100 savings and bought an XT instead. Overall it will make me happier as I use my PC for more than just gaming than if I'd bought the XTX and happiness per dollar matters too.



I got a reference XFX 7900 XTX from Best Buy for $900, which dropped just before 5 AM US Eastern time, with a price mistake to boot. So far it looks like they are honoring it; my pickup date is the 21st.


----------



## shovenose (Dec 14, 2022)

terroralpha said:


> i got a reference xfx 7900xtx from best buy for $900, which dropped just before 5am us east time and with a price mistake to boot. so far looks like they are honoring it. my pick up date is the 21st.


I noticed that they had the pricing wrong, but by the time I looked (6 AM Pacific time) they were all sold out.


----------



## Icon Charlie (Dec 14, 2022)

I agree with this person's comments, as he says certain things in a politically correct manner.

But from where I'm standing, if I were running AMD, I would want to fire Dr. Su for the blatant misinformation about these cards.

I'm pretty sure they are making HUGE profits on this card because of the move to the chiplet format. So IMHO you are paying another AMD tax, just like you are paying for the motherboard chipset.

I don't like limited launches, and IMHO THIS IS ONE. Bragging rights for "selling out of product" are just getting old.

This is all her damned fault for letting the misinformation happen. ALL OF IT.


----------



## terroralpha (Dec 14, 2022)

AusWolf said:


> All 7900 XT and XTX cards are sold out at Scan UK. Some XT models are available at Overclockers UK, but the rest are sold out along with all XTX models.
> 
> Meanwhile, the 4080 is seeing massive discounts at Scan.
> 
> Who would have thought?



I don't know how to break this to you, but there aren't any "discounts" going on in the UK. The cheapest 4080 at Scan right now, with the "discount", is about $1,475 USD, or about $1,230 before VAT. Most models are at $1,600+, while the Asus 4080 Strix is about $1,900 USD with VAT.

Overclockers UK was charging 2,400 pounds, not USD but pounds, for the MSI 4090 Suprim. That's $3,000 USD. You guys are getting eBay prices straight from the vendor. Meanwhile, the AMD cards were actually all priced at or close to MSRP.

I'm not defending Nvidia or the 4080; in fact, I think only a person with severe brain trauma would buy a 4080. The point I'm trying to drive home here is that Overclockers, Box, Scan, Laptops Direct, etc. were all basically scalping you guys, and these "discounts" are finally bringing a few low-tier Zotac and Palit models sorta close to MSRP. But most cards are still waaaaay too overpriced. Maybe when prices on the nicer cards start coming down to where they are supposed to be, then they'll sell.


----------



## AusWolf (Dec 14, 2022)

terroralpha said:


> i don't know how to break this to you but there aren't any "discounts" going on in the UK. the cheapest 4080 at scan right, with the "discount", is about $1475 USD, or about $1230 before VAT. most models at $1600+ while the asus 4080 strix is about $1900 USD with VAT.
> 
> overclockers UK was charging 2400 pounds, not USD but pounds, for the msi 4090 suprim. that's $3000 USD. you guys are getting ebay prices straight from the vendor. meanwhile, the AMD cards were actually all prices at or close to MSRP.
> 
> i'm not defending nvidia or the 4080. in fact, i think only a person with severe brain trauma would buy a 4080. the point i'm trying to drive here is that overclockers, box, scan, laptops direct, etc, were all basically scalping you guys and these "discounts" are finally bring a few low tier zotac and palit models sorta close to MSRP. but most cards still waaaaay too overpriced. maybe when prices on the nicer cards start coming to down to where they are supposed to be, maybe then they'll sell.


I guess you're not familiar with UK/EU import tariffs. Basically, every single piece of computer hardware is a lot more expensive here than in the US. The general formula used to be _"1 USD MSRP = 1 GBP retail"_, but it's been worse than that in the last couple of years. Scan UK's £1,189 Palit 4080 GameRock is actually a good price relative to world prices of the 4080 in general, believe it or not.


----------



## Bwaze (Dec 14, 2022)

Yeah, that's about equivalent to the 1,340 EUR Manli and Palit RTX 4080s in Germany. Those are really 200 EUR cheaper than they were at launch, and very close to AMD 7900 XTX prices.


----------



## Gica (Dec 14, 2022)

AusWolf said:


> Meanwhile, the 4080 is seeing massive discounts at Scan.


And it is the same here. Almost all available 4080s lost ~$200 in 48 hours.
Maybe it's time for the "intelligent" people who offended me to apologize. I spent too much time explaining to them that only demand and competition regulate prices, not MSRP. If demand drops dramatically and competition is strong, prices will drop even below MSRP.
PS. The sudden price drop was not imposed by Nvidia. Sellers decided it, just as they decided to sell well above MSRP, based on demand. A new player is in the game, and the time of limitless speculation (driven, however, by demand) has ended.


----------



## mama (Dec 14, 2022)

gmn 17 said:


> yeah some rumours about it but could be happening
> 
> 
> 
> ...


*"...2x 8-pin power connectors, with power draw rated at 405 W maximum".  *Really???


----------



## Bwaze (Dec 14, 2022)

But is it enough for buyers to start grabbing the "heavily discounted" cards? 

It's still an x080, a card class that had a $699 MSRP before this travesty, and with a normal generational performance leap, the same as the GTX 1080 and RTX 3080 brought.


----------



## mama (Dec 14, 2022)

Icon Charlie said:


> I agree with this person's comments as he says certain things in a PC type manner.
> 
> But from where I am standing If I was running AMD ,I would want to fire Dr.Su for the blatant misinformation of these upcoming cards.
> 
> ...


What misinformation?  The projections were correct.  "Up to" doesn't mean in every scenario.  Seriously, people who express this amount of outrage are either Nvidia shills or trolls.


----------



## AusWolf (Dec 14, 2022)

Icon Charlie said:


> I agree with this person's comments as he says certain things in a PC type manner.
> 
> But from where I am standing If I was running AMD ,I would want to fire Dr.Su for the blatant misinformation of these upcoming cards.
> 
> ...


I watched 7 minutes of the video, but I can't take any more. Basically, the guy ran a poll asking where people expected the 7900 XTX to land compared to the 6950 XT, and the majority said around 50% faster. Let's ignore the fact that the lowest option he included was 40%, so the poll was flawed by design. Now he says it's a shit card because it's only 35% faster, which means people didn't get what they wanted according to his manipulative poll. What about the fact that its MSRP is 100 USD lower? That makes its price-to-performance ratio 1.485 times better, meanwhile Nvidia keeps the price-to-performance ratio of its cards stagnant at roughly 1:1 versus the previous gen. The 7900 XTX is even cheaper than the 4080, which is bigger, needs a new adapter, and is slower except in RT. Yet the 7900 XTX is a shit card. Right...

And then you say that Dr. Su, under whose leadership AMD came back from near-bankruptcy, should be fired because the 7900 XTX doesn't live up to the expectations of some people on some YouTuber's Facebook page. Bloody hell, you've got some imagination there!


----------



## Dirt Chip (Dec 14, 2022)

Coming late to the party (me to this thread, not the card).
So basically, if you're into RT you go with NV and pay the NV RTX tax, or if you only need raster you pay the (smaller, but still real) AMD MCM tax.

Wonderful choice to make.


----------



## _JP_ (Dec 14, 2022)

In stock at one place in Portugal, but the price, though:

Otherwise:

Again, for a card that's not a halo product, this price is not appealing at all.
IMO, in this country, this launch makes the RX 6800 XT and the RX 6700 XT the most appealing cards to get, even if they're still €200 above what would be reasonable.


----------



## AusWolf (Dec 14, 2022)

Dirt Chip said:


> Coming late to the party (me on this thred, not the card).
> So basically if you are into RT you go with NV and oay the NV-RTX tax, or if you are only need raster you pay the (less, but still) AMD chiplet tax.
> 
> Wonderful choice to make


There wasn't even a chiplet tax on RDNA 2. You might just call it... price.  

My conclusion is different: if you have more money than brains, and want the best, buy the 4090. If you want superior RT, buy the 4080. If you value your money, buy the 7900 XTX. The XT is meant for no one.


----------



## Dirt Chip (Dec 14, 2022)

AusWolf said:


> There wasn't even a chiplet tax on RDNA 2. You might just call it... price.
> 
> My conclusion is different: if you have more money than brains, and want the best, buy the 4090. If you want superior RT, buy the 4080. If you value your money, buy the 7900 XTX. The XT is meant for no one.


It's the same trick NV pulled by pricing the 4090 to be seen as 'the better value' vs the 4080, just on a smaller scale.


----------



## AusWolf (Dec 14, 2022)

Dirt Chip said:


> It's the same trick that NV did by with pricing 4090 to be seen 'the better value' vs 4080, but in a smaller scale


Except that the 4080 wasn't out when the 4090 launched, so they only had the 30-series to compare it to, which obviously ended extremely well _*khm_...


----------



## InVasMani (Dec 14, 2022)

XTX is looking like a rather solid card, but apparently no supply to satisfy the demand right now.


----------



## Taisho (Dec 14, 2022)

mama said:


> What misinformation?  The projections were correct.  "Up to" doesn't mean in every scenario.  Seriously, people who express this amount of outrage are either Nvidia shills or trolls.


AMD promised power efficiency leadership 3 weeks before RTX 4090 was available. Correct projections my arse.


----------



## N3M3515 (Dec 14, 2022)

People do not care about price hikes... yikes. It's like GPUs are a drug for them; they don't care how much it costs, they must have it. Compulsive addiction.


----------



## Taisho (Dec 14, 2022)

N3M3515 said:


> People do not care about price hikes........yikes. It's like gpus are a drug for them, they don't care how much it costs, they must have it. Compulsive addiction.


I can understand gamers who don't care about money and buy a 4090, or who stretch their budget and buy a 4090 anyway, because it's a unique card: the only one that offers a great 4K experience in basically all titles without upscaling.

What I can't understand is paying fat dollars for GPUs that are only suitable for 1440p gaming. God forbid you turn RT on.


----------



## Dr. Dro (Dec 14, 2022)

Icon Charlie said:


> I agree with this person's comments as he says certain things in a PC type manner.
> 
> But from where I am standing If I was running AMD ,I would want to fire Dr.Su for the blatant misinformation of these upcoming cards.
> 
> ...



I don't think Lisa herself had anything to do with this; it's the marketing departments that make the press decks. I will give them credit for RDNA 2 being a competent graphics architecture, and really, if you look at it objectively, RDNA 3 is quite well architected as well. Ada takes a tried-and-true approach to chip design, and with so many more execution units, it's no wonder that AD102 is the faster design.

The GCD on Navi 31 has 96 compute units, vs. the 142 SMs in a full AD102 and the 128 in the cut-down configuration used in the RTX 4090; it's just not a GPU of the same caliber or size, and to that extent I believe AMD's hardware engineers did quite well. It's a shame it doesn't equal the 4090 in raster. I expected it to, given that Navi 21 did match and even exceed GA102 there, but they didn't hit that mark this time around. Even then, this is not a bad product, and I strongly believe you will be hard pressed to find a game that will not run at a steady 4K 60 fps on this with every single graphics option you can think of.

Despite the chiplet approach, the bill of materials for this GPU is significantly higher than NVIDIA's on your average RTX 4080 design. NV could probably undercut their cards by $600 and still make a profit; the same likely cannot be said of AMD here.

I have several criticisms to level at the company that are orders of magnitude more important than this one.


----------



## medi01 (Dec 14, 2022)

Bomby569 said:


> I can't get my head around how people even consider this a good value. Nevermind the paper launch ( especially on Europe as usual) on reddit the price for a aib model was 1300€
> 
> __
> https://www.reddit.com/r/Amd/comments/zgqi13
> ...


The cheapest I see is 1,499 EUR for the XFX version.

The cheapest 4080 is a Palit, at 1,350 EUR (which makes it "only" 1,134 EUR before VAT; curiously, that's below MSRP).
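(The before-VAT figure is just the gross price divided by one plus the VAT rate; with Germany's 19% VAT, a quick sketch:)

```python
def net_price(gross, vat_rate):
    """Strip VAT from a gross retail price."""
    return gross / (1 + vat_rate)

# 1,350 EUR shelf price at Germany's 19% VAT rate
print(round(net_price(1350, 0.19)))  # 1134
```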



_JP_ said:


> In stock at one place, in Portugal, but the price tho:
> View attachment 274412
> Otherwise:
> View attachment 274413
> ...


But Portugal's VAT is 23%, so the price looks right.


----------



## HD64G (Dec 14, 2022)

People need to become realistic. The 4090 was out of reach in average performance for the reference 7900 XTX, since it has so many more SMs. The RTX 4080 is already slower and will fall even further behind in a few months with driver updates on a totally new arch. Losing some power draw and efficiency to nVidia's 4 nm node is normal for N31 made on 5 nm. Thing is, +3 GHz is doable, and the price relative to performance is better than any nVidia card on the market today. Also, RT performance is equal to the best of the RTX 30 series, which is not so bad, since those are still selling while being clearly slower and less efficient than both AMD's 6900 and 7900 GPU series. Bugs will be ironed out, and then the 7900 series will pick up its pace, especially for anyone who likes to OC. My 5c.


----------



## N3M3515 (Dec 14, 2022)

Taisho said:


> I can understand gamers who don't care about money and buy a 4090 or stretch their budget and also buy a 4090 because it's a unique card, the only that offers a great 4K experience in basically all titles without upscaling.
> 
> What I can't understand is paying fat dollars for GPUs that are suitable only for 1440p gaming. God forbid turning RT on.


Yeah, I mean, the 4090 is kind of for rich people. The halo product, the e-peen. And it increased by $100, about 6%. But the 4080 got a +70%, +$500 price increase. Did people suddenly get their salaries increased by 70%?
And the fake 7900XTX competes only with the 4080 but got the price of the 6900XT, which competed with the 3090. WTF.

I only see junkie behavior honestly.


----------



## InVasMani (Dec 14, 2022)

She brought a company from near bankruptcy to leading the S&P 500 over the likes of Apple, Google, Amazon, Microsoft, Facebook, Nvidia, Intel, etc., as an Asian woman in America. If she doesn't inspire a whole generation of youth in this country and around the globe, I don't know who the hell could.


----------



## CapNemo72 (Dec 14, 2022)

So, I just got my Sapphire 7900 XTX (reference design) delivered.
Its price (everything was sold out in 10 minutes) was 1179 CHF (7.7% tax included) here in Switzerland. That is around 1185 USD without tax, so almost 200 USD over MSRP.

The packaging is very basic and feels so cheap that it is disgusting for something that costs that much. I bought it for a new build, but not sure if I should keep it.

The graphics card itself feels really solid and has a healthy heft. The design is nice (I like simple designs, not "in your face").

EDIT: Of course, 4080s are going for 1450+ USD (no tax), so the price difference is still there.


----------



## medi01 (Dec 14, 2022)

InVasMani said:


> She brought a company from near bankruptcy to leading the S&P 500 over the likes of Apple, Google, Amazon, Microsoft, Facebook, Nvidia, Intel, etc., as an Asian woman in America. If she doesn't inspire a whole generation of youth in this country and around the globe, I don't know who the hell could.


Dude, let me break this to you:

1) Asian women are the group that has out-earned every other group for the last 3 quarters straight in the US. Diversity quotas work wonders, you know.
2) I recall there was that guy... whose name starts with J and surname with K, who did some amazing stuff at AMD, I was told.


----------



## shovenose (Dec 14, 2022)

CapNemo72 said:


> So, I just got my Sapphire 7900 XTX (reference design) delivered.
> Its price (everything was sold out in 10 minutes) was 1179 CHF (7.7% tax included) here in Switzerland. That is around 1185 USD without tax, so almost 200 USD over MSRP.
> 
> The packaging is very basic and feels so cheap that it is disgusting for something that costs that much. I bought it for a new build, but not sure if I should keep it.
> ...


As long as it arrived safely who cares if the box feels cheap, you're just going to throw it away next week anyway.


----------



## Auer (Dec 14, 2022)

medi01 said:


> Dude, let me break this to you:
> 
> 1) Asian women are the group that has out-earned every other group for the last 3 quarters straight in the US. Diversity quotas work wonders, you know.
> 2) I recall there was that guy... whose name starts with J and surname with K, who did some amazing stuff at AMD, I was told.


LOL

Afraid of diversity?


----------



## Easo (Dec 14, 2022)

Card is OK, but... Yeah, the price. It's still just too much. Well, I guess mid and higher end gaming is becoming a bigger privilege after all, sad as it is.


----------



## AusWolf (Dec 14, 2022)

Easo said:


> Card is OK, but... Yeah, the price. It's still just too much. Well, I guess mid and higher end gaming is becoming a bigger privilege after all, sad as it is.


It's exactly the same price the 6900 XT launched at. Mid-to-high-end gaming starts with the x700 tier, imo (or x600 if you don't mind "lower" resolutions, like 1080p).



Auer said:


> LOL
> 
> Afraid of diversity?


I read "diversity quotas", not "diversity". Those are two entirely different things, you know.


----------



## Avro Arrow (Dec 14, 2022)

I see so much negativity about this card from some people, and while I do understand it, I'm also shaking my head at how ridiculous some of it is. A video card is made good or bad by its competition, and nVidia has done a GREAT job of making Radeons look good in spite of themselves this gen. Don't blame AMD for the pricing, because the consumers are really to blame. AMD raises their prices because they know that enough people will buy nVidia no matter the price (which is another serious issue with the GPU market), which means nVidia will be rolling in it, and AMD has to rake in more to keep up with nVidia's huge R&D budget! Consumers need to start blaming _themselves_ for this situation because it sits squarely on our shoulders. WE are in control of the market, but we haven't been in control of _ourselves_. That's the biggest issue imaginable in any industry marketplace, not just this one.

It's time for a reality check (and I don't care if you don't like it because it's the truth):

*Universe-Level Stupid -* If you're just a gamer and have an RTX 3090, you're almost as dumb as it gets - and you would be the dumbest if the RTX 3090 Ti didn't exist. This is because it means that you spent an extra $800 (MORE THAN DOUBLE) over the price of the RTX 3080 for an absurd 13% performance increase. Again, this is assuming that you managed to get it at MSRP, because if you didn't, you could have paid as much as an extra $1,500! That's a stupid amount of money to spend just to stroke your ego and be able to say that you had "the fastest video card in the world" for a maximum of 15 months before the RTX 3090 Ti came out. The only people who make you look smart are the gamers who bought the RTX 3090 Ti. _You_ are the biggest reason that Jensen keeps pushing the prices up, because if you show that you're willing to pay it, why wouldn't he?

*Galaxy-Level Stupid - *If you're just a gamer and have an RX 6900 XT (or RX 6950 XT for that matter), then you are still a big part of the problem. If you managed to get your card at MSRP, it means that you spent $350 to get a measly 9% performance advantage over the RX 6800 XT. If you didn't manage to get it at MSRP, you could have paid as much as an extra $700 over the price of the RX 6800 XT. You're not the dumbest class of people I'm profiling here, but you're solidly in second place because you demonstrated that you're willing to pay WAY more for a video card than you really had to (even at MSRP), and AMD saw that you were willing to do this. They remember this when they do market research and analysis, and they conduct themselves accordingly.

*Constellation-Level Stupid - *If you're just a gamer but paid the exorbitant prices of the RTX 3080 12GB or RTX 3080 Ti, I don't need to explain to you what I just explained to the first two levels.

*System-Level Stupid - *If you're just a gamer and have an RX 6800/XT or RTX 3080 but you paid more than MSRP, you still showed that you were willing to pay way too much for a video card, which also helped fuel the fire, even if not to the same extent as the first two categories. *I am in this category myself, guilty of paying a good deal more than I should have, so don't think that I'm trying to play innocent here.* The fact that I did this made me swear at myself in the mirror a couple of times after the initial joy of having the card wore off and I realised that I should've just stuck with my RX 5700 XT. The only reason the people in this category are smarter than those in the previous categories is that, as crazy as it sounds, they chose to be screwed over LESS than the people in the first two categories, for whatever that's worth.

*Planet-Level Stupid - *If you're just a gamer and you're rocking a GeForce GPU _below_ an RTX 3080, then you have no right to complain about value because at whatever price you paid (or often less than you paid), there's a Radeon that absolutely DEMOLISHES your GeForce card in performance.  The only reason that you're smarter than the first three categories is that you most likely paid a crap-tonne less than they did and that's a good thing because it didn't make the tech executives' eyes get as big as the first three categories. 

*BS Excuse #1 - Ray-Tracing - *RT is barely usable on cards below the RTX 3080. Just to be playable, you have to compensate with DLSS, and as good as DLSS is, it's no match for native resolution and never has been. _You_ chose to throw too much money at nVidia for the card you own, and you showed Jensen that it doesn't matter how crappy nVidia's low end is, because you're willing to screw yourself over just to get a video card in a green box.

*BS Excuse #2 - Radeon Drivers - *Radeon drivers haven't been problematic in years.  I'm sure that 99% of the so-called "driver issues" from Radeons were either because the owner didn't think to use DDU or it wasn't the drivers in the first place, it was just an ID10T error.  I myself had issues with my RX 5700 XT when I first bought it, but it was a hardware problem, a power distribution problem that went away when I did an RMA to XFX.  I had no driver issues with my HD 7970s, R9 Furies, RX 5700 XT or my RX 6800 XT.  That's over ten years of Radeon use with no driver problems.  I don't believe that I was just lucky because as redundant as it sounds, I've never been lucky enough to be lucky.

*Universe-Level Geniuses - *If you're a gamer and use an RX 6700 XT or any Radeon below that, or you use an old GeForce GTX 1000-series card (or older) that you bought years ago, then congratulations, because you're the smartest consumers in the room and far smarter than I am. You read the writing on the wall and made the smart choice to just get _something_ with which to weather the storm while taking the least amount of financial damage possible, or to stick with what you had (which is what I should've done) while still being able to game decently. You ignored the marketing BS and got only what you needed to survive. If everyone was like you, we wouldn't be in this mess AT ALL.

I could've been a *Universe-Level Genius* but I ultimately failed because I wanted the first gamer-grade Radeon reference card to NOT use a blower cooler and I got it for $500CAD less than what I could get any other RX 6800 XT for at the time.  I tried to excuse the purchase away with this fact, but it was, at the end of the day, an emotional purchase and I should've known better because I'm not a kid, I'm a grown man.  Therefore, I hang my head in shame as being *System-Level Stupid*.  I am part of the reason that video cards have become so expensive.  The thing is, we can turn this around if we vow to be better the next time we purchase a video card (or CPU for that matter) and remember that the thrill of owning it is very short-lived.  If we demonstrate a gravitation to lower priced cards with better value and leave the expensive cards with crappy values on the shelves, sure it will take a couple of generations, but this problem will eventually solve itself.

If you really don't care if the situation resolves itself or not, then you're probably a sociopath and are beyond redemption.  Personally, I'm glad that I was able to come to this conclusion because admitting that a problem exists is always the first step towards solving it.  Oh yeah, the other thing is, if you bought a GeForce card, then you have no right to whine about _anything_ that AMD does because you chose to support the other side and they don't owe you a damn thing.  On the other hand, if you own a Radeon, then you have no reason to whine about anything that nVidia does because you've already avoided that bullet.

Goodnight everyone!


----------



## Lost_Troll (Dec 14, 2022)

There is a hot fix driver out now for the 7900 series, 22.12.1.

AMD Driver Link


----------



## Dirt Chip (Dec 15, 2022)

medi01 said:


> 2) I recall there was that guy... whose name starts with J and surname with K, who did some amazing stuff at AMD, I was told.


John Kennedy?
The first J(F)K that came to mind...


----------



## sepheronx (Dec 15, 2022)

Dirt Chip said:


> John Kennedy?
> The first J(F)K that came to my mind..


Yeah, Lee Harvey Intel had him assassinated to make sure AMD would truly not progress, though rumor has it that Nvidia was the guy on the grassy knoll.


----------



## kapone32 (Dec 15, 2022)

Avro Arrow said:


> I see so much negativity about this card from some people, and while I do understand it, I'm also shaking my head at how ridiculous some of it is. A video card is made good or bad by its competition, and nVidia has done a GREAT job of making Radeons look good in spite of themselves this gen. Don't blame AMD for the pricing, because the consumers are really to blame. AMD raises their prices because they know that enough people will buy nVidia no matter the price (which is another serious issue with the GPU market), which means nVidia will be rolling in it, and AMD has to rake in more to keep up with nVidia's huge R&D budget! Consumers need to start blaming _themselves_ for this situation because it sits squarely on our shoulders. WE are in control of the market, but we haven't been in control of _ourselves_. That's the biggest issue imaginable in any industry marketplace, not just this one.
> 
> It's time for a reality check (and I don't care if you don't like it because it's the truth):
> 
> ...


That is an interesting dissertation on penis envy, on applying common sense to a purely leisure activity, and on the fruits of applying the greed coefficient to graphics cards while games keep gaining fidelity.


----------



## sepheronx (Dec 15, 2022)

N3M3515 said:


> People do not care about price hikes... yikes. It's like GPUs are a drug for them; they don't care how much it costs, they must have it. Compulsive addiction.


Here is another thing bothering me, and it's about the price.

People are going to ruin it for everyone else by buying these overpriced gadgets. The problem is that if features like RT - even basic RT functions - can only be used on the very best of the best, then the technology may not progress as fast, since only the few people who can afford the tech will get the benefit of it. To the bean counters it won't seem important, because not many can afford to use it, so it either barely gets used or stays accessible only to the wealthy. And it will get worse, because the companies know that in the end, people will pay abysmal prices for this feature anyway.

We will see what happens when Unreal Engine 5.1 becomes more widely used and how well all those features that require RT will run on current-gen cards. But we already know what the Silent Hill 2 remake's requirements are.


----------



## LuxZg (Dec 15, 2022)

fluxc0d3r said:


> The other day, I almost bought a 6650XT for $250 brand new off Amazon for my HTPC/driving sim rig, but decided to wait for the budget 7000-series lineup from AMD. I hear they have a $400 7000-series card with RX 6800 performance coming out soon.
> 
> AMD is currently dominating in the budget to midrange market with their 6650XT and 6700XT cards being much cheaper. Most budget users don't care about RT performance, probably turn it off anyways.


This is me! I mean, I read these hyped reviews mostly to gauge what next-gen tech is bringing to the market, but I really can't (won't!!) go for anything over 350€ this decade! I got a company laptop with a 5800X + RTX 3060 that cost 1000€; I am in no way interested in a graphics card that costs more than a whole PC (with screen!!).

Anyway, I'd be going for a 6650/6700 if it weren't for this laptop (a surprise upgrade), and now I'll need to wait for something like a 7650/7700 to see how it goes. If they pull an Nvidia and price it at $400, that will translate to 500€, and I'm solidly out of the picture. Anything below wouldn't be an upgrade anyway.

So for me, it's sit and wait, and wait and wait; eventually price/perf will hit the spot where it's OK for me. I still have a couple thousand games to play without RT in my library, and this laptop can max them all at 3440x1440.

I wish more people would sit and wait; it would bring prices down way faster - see how Ryzen 7000 is being discounted so soon after launch. We can do it, we just need to stop accepting 60% margins on graphics cards and CPUs when the rest of the world works fine with 10-20%. Then the 7900XTX would be a $700 card, and the 7600XT would be a $200 card.


----------



## Dirt Chip (Dec 15, 2022)

Avro Arrow said:


> *Universe-Level Stupid
> 
> Galaxy-Level Stupid
> 
> ...


That's a hefty 'genius-stupidity' tier list you've got there, man, second only to the overly complicated NV-AMD tier structure of GPUs.
You are quite decisive about what and how people NEED to shop, as if you had full knowledge of every person's inner reasons. I find it somewhat arrogant, and a case of falling into your own genius-stupidity system.

I'm one of your supposed "Universe-Level Geniuses" with a GTX 970 while using a 4K monitor for gaming (among other things), but I can totally understand and justify someone who opts to spend $1000-2000 on a GPU alone. To many, gaming is a hobby, and for enjoying their precious spare time some will pay big, beyond what is calculated as 'right' - they can afford it (either they save up specially for it or just have the money), so they do. That's simple. Nothing wrong with that, imo.
I agree, and apply it myself, that you don't need all the eye candy and max settings to enjoy a good game, but to each his own. I don't crown people as geniuses or idiots, and I don't try to educate anyone - surely not by throwing around insulting labels or suggesting a drastic psychological condition - and it sounds like you are trying to. The use of sharp and decisive terms only highlights the weakness of your argument, and determinism is a cheap way out of insecurity.
I might just add that going to either extreme is probably not the right way, and that stance might do well with a reality check of its own.

I don't see the point of blaming (only) the people for the high cost of hardware nowadays. Above all, in the market we are in, imo, fierce competition is the only major force that can balance high prices. I also don't blame the gaming community for having a capitalistic market, a duopoly market, a stock-exchange market and a post-COVID-19 market - the root causes, as I see them, of the current inflated price situation.

If anything, buying a product out of brand consideration and not by your own specific needs (and you play the NV vs. AMD vs. the world game quite seriously) is what I might call (though I try not to) unwise and not in your own favor. It will get you a bad deal in the short term and will boomerang on you in the long run. You can't play the loyal-support or hate-boycott game with these kinds of companies; they will try to exploit that very attempt (I wrote about it at length in the past).

I also reject the notion that someone doesn't have a moral right to complain about X because he chose Y. To complain is one of man's greatest needs and enjoyments, so I won't go about denying that healthy pleasure to anyone. In the end, whining is caring.
We are not in control of the market, although we have some (small) amount of influence on it as individual consumers. To go further than that and believe that we consumers alone can make all the change is naïve and unrealistic to the point of fantasy. Trying to be in control of that will lead you to despair or worse - to losing all control whatsoever.

Shop color-blindly, following your specific needs at a given time and within your own budget after prioritizing it.



sepheronx said:


> Yeah, Lee Harvey Intel had him assassinated to make sure AMD would truly not progress, though rumor has it that Nvidia was the guy on the grassy knoll.


Yep, and the evidence is that with 'RT on', and only on an NV GPU, you can see the real murderer in the video playback.


----------



## LuxZg (Dec 15, 2022)

Denver said:


> Of course, I'm not just throwing out random information for fun.
> 
> The cost per 5nm wafer is $17k. Considering the yield, similar to 7nm, and the 300mm2 die of the main GPU chip(GCD), we have about 137 usable chips, costing about $124 each. Now we also have 6x cache chips(MCDs) costing about $12 each. So far the cost is at U$ 196.
> 
> ...


This may be true, although the information is still guesswork, but it ignores a huge elephant in the room: the fact that both TSMC and the memory makers ALSO have 60% margins! And when AMD buys chips that already carry a 60% margin, then puts its own 60% on top, things go bananas real quick. I'm gonna repeat that none of these companies are in the red; they are accumulating more and more money at an alarming rate! Meanwhile, even some big names of this world (non-IT) are happy with 25%. Just pop any big company name into Google search followed by "gross margin" and check the trends. It is THE reason for $1000 cards (and $700 "midrange", for God's sake). 60% on top of 60% on top of 60%...

But hey, as long as the world has a couple million people ready to buy $800-2500 cards, these companies will keep milking...
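For what it's worth, the quoted per-chip math does check out arithmetically. Here is a quick sanity-check sketch - every input is the quoted post's guess ($17k per wafer, ~137 good GCDs, $12 MCDs), not a confirmed figure - plus what stacked 60% gross margins do to that ~$196 of silicon:

```python
# Sanity-check the quoted Navi 31 cost estimate and the effect of stacked margins.
# All input numbers are guesses from the quoted post, not confirmed figures.

wafer_cost = 17_000        # claimed TSMC 5nm wafer price, USD
good_gcds_per_wafer = 137  # claimed usable 300 mm^2 GCDs after yield
mcd_cost = 12              # claimed cost per cache die (MCD)
mcds_per_card = 6

gcd_cost = wafer_cost / good_gcds_per_wafer
silicon_cost = gcd_cost + mcds_per_card * mcd_cost
print(f"GCD: ${gcd_cost:.0f}, total silicon: ${silicon_cost:.0f}")  # ~$124 and ~$196

# A 60% gross margin means price = cost / (1 - 0.60), i.e. 2.5x cost.
# Stacking that margin at two levels of the supply chain compounds quickly:
def with_margin(cost, margin=0.60):
    return cost / (1 - margin)

print(f"one 60% margin:  ${with_margin(silicon_cost):.0f}")
print(f"two 60% margins: ${with_margin(with_margin(silicon_cost)):.0f}")
```

Two stacked 60% margins already turn ~$196 of silicon into well over $1200 before the board, memory, cooler and retail cut even enter the picture, which is the point about compounding.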


----------



## Dirt Chip (Dec 15, 2022)

LuxZg said:


> So for me, it's sit and wait, and wait and wait; eventually price/perf will hit the spot where it's OK for me. I still have a couple thousand games to play without RT in my library, and this laptop can max them all at 3440x1440.
> 
> I wish more people would sit and wait; it would bring prices down way faster - see how Ryzen 7000 is being discounted so soon after launch. We can do it, we just need to stop accepting 60% margins on graphics cards and CPUs when the rest of the world works fine with 10-20%. Then the 7900XTX would be a $700 card, and the 7600XT would be a $200 card.


I'm with you on that 'sit and wait' train (or actually just watching all the trains go by as I stay put happily at home), moving from an aging i5-2400 and FHD monitor to a top-of-the-flop 13900K + 4K monitor, all while keeping the same ancient GTX 970.
Many good games are still on my 'to play' list that will do just fine with that (uncommon) configuration (I just need NV CUDA for work, and the old cards' CUDA is the same as the new).


----------



## LuxZg (Dec 15, 2022)

N3M3515 said:


> 6800XT $650 msrp -------- 7900XTX $1000 msrp, 51% higher performance for 53% higher price. Price/performance: crapola.
> 
> Exactly the same crap nvidia did with the 4080.
> And don't get me started on the 7900 XT.



Let me start with 7900XT:

738€ for 6800XT vs 1286€ for 7900XT right here right now (lowest local price this moment).

142 FPS average at 1440p vs 184 FPS by TPU review.

So: over 74% more money for under 30% more FPS.
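A quick check of those ratios (the prices are my local ones at the moment of writing; the FPS numbers are the TPU 1440p averages):

```python
# Relative price and performance of the 7900 XT vs the 6800 XT,
# using the local prices and TPU 1440p averages quoted above.
price_6800xt, price_7900xt = 738, 1286   # EUR, lowest local prices at the time
fps_6800xt, fps_7900xt = 142, 184        # TPU 1440p average FPS

extra_cost = price_7900xt / price_6800xt - 1
extra_perf = fps_7900xt / fps_6800xt - 1
print(f"+{extra_cost:.0%} money for +{extra_perf:.0%} FPS")
```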


----------



## Dirt Chip (Dec 15, 2022)

LuxZg said:


> This may be true, although the information is still guesswork, but it ignores a huge elephant in the room: the fact that both TSMC and the memory makers ALSO have 60% margins! And when AMD buys chips that already carry a 60% margin, then puts its own 60% on top, things go bananas real quick. I'm gonna repeat that none of these companies are in the red; they are accumulating more and more money at an alarming rate! Meanwhile, even some big names of this world (non-IT) are happy with 25%. Just pop any big company name into Google search followed by "gross margin" and check the trends. It is THE reason for $1000 cards (and $700 "midrange", for God's sake). 60% on top of 60% on top of 60%...
> 
> But hey, as long as the world has a couple million people ready to buy $800-2500 cards, these companies will keep milking...


I wish TSMC would get the same 'hate treatment' NV is getting for gouging prices.
The new dogma will be something like "don't buy overpriced TSMC, only GlobalFoundries (or Intel...)".
LOL all the way to absurdity.

Point is, going forward, price will go up faster than silicon shrinks (read: performance increases), and that situation is a change from what we were used to until now.

AMD did well by going with the efficient chiplet/MCM design to offset the price a bit, but I'm not sure it will hold for their next gen.
NV is still monolithic and charging more for RT performance (rightfully, I might add) but will probably stay the higher "premium" option next gen too.
All in all, AMD can safely keep riding the price-increase wave led by NV, keeping the underdog 'Robin Hood' image all while selling tons of $1000+ GPUs. Very sleek.


----------



## AusWolf (Dec 15, 2022)

Dirt Chip said:


> I wish TSMC would get the same 'hate treatment' NV is getting for gouging prices.
> The new dogma will be something like "don't buy overpriced TSMC, only GlobalFoundries (or Intel...)".
> LOL all the way to absurdity.


I don't think they will, unfortunately. People know Intel, Nvidia and AMD, but not many people know TSMC. TSMC can do whatever they want in the background while the first three absorb all the hate.


----------



## ARF (Dec 15, 2022)

LuxZg said:


> Let me start with 7900XT:
> 
> 738€ for 6800XT vs 1286€ for 7900XT right here right now (lowest local price this moment).
> 
> ...



This is a very unfair manipulation: price one thing absurdly so that the other, still heavily overpriced, looks like somewhat "decent value".



AusWolf said:


> I don't think they will, unfortunately. People know Intel, Nvidia and AMD, but not many people know TSMC. TSMC can do whatever they want in the background while the first three absorb all the hate.



All will absorb the hate. Vote with your wallet!

I think I will cancel my intention to get an RX 7900 XT 20 GB as long as it stays over 600 bucks, which is the limit I would pay for this junk.


----------



## AusWolf (Dec 15, 2022)

ARF said:


> This is a very unfair manipulation: price one thing absurdly so that the other, still heavily overpriced, looks like somewhat "decent value".
> 
> 
> 
> ...


I think the XTX is fair value, but I agree with you on the XT. It's too expensive.


----------



## Denver (Dec 15, 2022)

LuxZg said:


> This may be true, although the information is still guesswork, but it ignores a huge elephant in the room: the fact that both TSMC and the memory makers ALSO have 60% margins! And when AMD buys chips that already carry a 60% margin, then puts its own 60% on top, things go bananas real quick. I'm gonna repeat that none of these companies are in the red; they are accumulating more and more money at an alarming rate! Meanwhile, even some big names of this world (non-IT) are happy with 25%. Just pop any big company name into Google search followed by "gross margin" and check the trends. It is THE reason for $1000 cards (and $700 "midrange", for God's sake). 60% on top of 60% on top of 60%...
> 
> But hey, as long as the world has a couple million people ready to buy $800-2500 cards, these companies will keep milking...


It must be on those margins that they can build factories costing tens of billions of dollars and keep advancing new processes to meet demand.


----------



## HD64G (Dec 15, 2022)

LuxZg said:


> Let me start with 7900XT:
> 
> 738€ for 6800XT vs 1286€ for 7900XT right here right now (lowest local price this moment).
> 
> ...


Wait a few weeks and check prices again. The XT will drop below 1K, and then it will be good value for money. Early price gouging is too usual to be worth making a topic of.


----------



## Avro Arrow (Dec 15, 2022)

kapone32 said:


> That is an interesting dissertation on penis envy, on applying common sense to a purely leisure activity, and on the fruits of applying the greed coefficient to graphics cards while games keep gaining fidelity.


Thank you.  I'm glad you liked it.  It was quite an effort to compose, believe me!


Dirt Chip said:


> That's a hefty 'genius-stupidity' tier list you've got there, man, second only to the overly complicated NV-AMD tier structure of GPUs.


Well, it did take a pretty long time to compose it because there were a lot of thoughts to collect together and make cohesive.


Dirt Chip said:


> You are quite decisive about what and how people NEED to shop, as if you had full knowledge of every person's inner reasons.


I worked for years at Tiger Direct. I know EXACTLY how people NEED to shop because I know the tricks that are played on them. Clearly, you've never worked in the industry, because you're using the layman's "infinite reasons for buying" argument, which falls completely flat when addressing why people buy video cards. You might want to re-read what you quoted, because I began each explanation with "If you're JUST a gamer" (without the caps), which means that I was excluding people who wanted the cards for any combination of gaming and content creation or professional workloads. If you had paid attention, you would've seen that and wouldn't have responded as you have.


Dirt Chip said:


> I find it somewhat arrogant, and a case of falling into your own genius-stupidity system.


Arrogant people don't publicly admonish themselves.  That's another lesson for you.


Dirt Chip said:


> I'm one of your supposed "Universe-Level Geniuses" with a GTX 970 while using a 4K monitor for gaming (among other things), but I can totally understand and justify someone who opts to spend $1000-2000 on a GPU alone. To many, gaming is a hobby, and for enjoying their precious spare time some will pay big, beyond what is calculated as 'right' - they can afford it (either they save up specially for it or just have the money), so they do. That's simple. Nothing wrong with that, imo.


Except that if your experience is not improved by spending more money, then spending more money is plain stupid. If you were to buy an RX 6900 XT, your gaming experience wouldn't be discernibly better than if you were to buy an RX 6800 XT. If you can play your games maxed out at 4K either way, there's not going to be an objective difference. When I worked at TD, I used to have people come in and ONLY want a GTX 280. It was quite obvious that they were immature egomaniacs who lived at home and didn't pay rent. That kind of mentality generally disappears as you become a man, unless there's something wrong with you, because being a grownup means becoming more objective and reasonable. These are assets when it comes to decision-making and thus assets to survival. The people who never seem to develop these assets are usually just too stupid to be able to.


Dirt Chip said:


> I agree, and apply it myself, that you don't need all the eye candy and max settings to enjoy a good game, but to each his own. I don't crown people as geniuses or idiots, and I don't try to educate anyone - surely not by throwing around insulting labels or suggesting a drastic psychological condition - and it sounds like you are trying to.


Again, you do realise that I admonished myself, right? I don't know what kind of gaslighting you're trying to do here, but I INCLUDED MYSELF in the post. You clearly did no more than skim the whole thing, and that has been demonstrated repeatedly by your response.

You do realise that the market is morbidly close to collapsing into a monopoly, right?  No, I guess you don't.  The GPU market has never been more in danger of losing a major player than it is right now.  What's _really_ ominous is that this has happened _despite_ the fact that Radeon hasn't been this competitive with GeForce since 2015 with the R9 Fury.  People can no longer say "If AMD would only..." because AMD already HAS "only..." and it made absolutely no difference.  In fact, nVidia's market domination _accelerated_ during the RX 6000 / RTX 30 years!  It honestly looks like you're going to have the choice between broken Intel ARC (if Intel decides to stick around) or GeForce cards priced into the stratosphere.


Dirt Chip said:


> I don't see the point of blaming (only) the people for the high cost of hardware nowadays. Above all, in the market we are in, imo, fierce competition is the only major force that can balance high prices.


Ordinarily yes, but, not in a duopoly.  Duopolies aren't very stable and AMD has given nVidia the fiercest competition that nVidia has seen since 2015 and not only has it not helped to balance the market, nVidia's domination accelerated.  So, reality has proven your words false.


Dirt Chip said:


> I also don't blame the gaming community for the capitalist market, the duopoly market, the stock exchange market and the post-COVID-19 market - the root causes, as I see them, of the current inflated price situation.


And you call me naive?  Look, I don't care if you don't blame them because you're still dead-wrong.  If the only people who bought these insanely-priced cards were those who needed them, like how it was mostly rich people and prosumers who bought Titan cards, then we wouldn't be here because there aren't enough of them to make the impact that has been felt.  Insanely-priced Titans existed back in the days of normal gaming GPU pricing.  Only gamers are numerous enough for that and it is they who have made this impact.


Dirt Chip said:


> If anything, buying a product out of brand consideration and not by your own specific needs (and you play the NV vs. AMD vs. the world game quite seriously) is what I might call (though I try not to) unwise and not in your own favour. It will get you a bad deal in the short term and will boomerang on you in the long run. You can't play the loyal-support or hate-boycott game with these kinds of companies; they will try to exploit you on that very attempt (I wrote about it at length in the past).


Once again, your insistence on ignoring the words used invalidates your response.  The words "If you're only a gamer" nullified this section of your post before you typed it, so you completely wasted your effort.


Dirt Chip said:


> I also reject the notion that someone doesn't have a moral right to complain about X because he chose Y. Complaining is one of man's greatest needs and enjoyments, so I won't go around denying that healthy pleasure to anyone. In the end, whining is caring.


I don't remember saying that I care what you reject.  I DO remember saying that I don't care if you don't like it because it's still the truth.  You don't seem to _want_ it to be the truth, but that changes absolutely nothing.

- If you want Radeon to get better, stop buying nVidia cards because magic doesn't actually exist and they need money to do so.
- If you're mad at something that nVidia did but bought a GeForce card anyway, you are your own worst enemy.
- If nVidia does something you don't like but you have a Radeon card, you have no right to complain.  All you can do is say "I ain't buying nVidia."
- If you're "unsatisfied" with something that AMD does but you have a GeForce card, you still have no right to complain because you gave your money to nVidia, the company that already owns the market.  You didn't try to make a positive difference when you had the chance so who cares what you think?  It's like listening to a four year-old crying about the fact that he tripped on his laces but refuses to tie his shoes.  It's just stupid.


Dirt Chip said:


> We are not in control of the market, although we have some (small) amount of influence on it as individual consumers.


You're dead wrong.  We are in COMPLETE control of the market because video cards are not a NEED.  Every time a card gets sold, it is because a consumer CHOSE to purchase it.  If you don't buy a video card, you don't get sick or die.  The only markets that can control us are FOOD, MEDICINE, CLOTHING and SHELTER which is why those markets are the most heavily regulated.  Everything else is just a want, a luxury.  EVERY purchase you make that doesn't involve one of those four is based on self-control, period.  Most people automatically know that so welcome to the human condition.  Talk about crying about first-world problems, eh?


Dirt Chip said:


> To go further than that, as to believe only us consumers can make all the change, is naïve and unrealistic to the point of a fantasy-land story. Trying to be in control of that will lead you to despair, or worse - to losing all control whatsoever.


It's only unrealistic because people don't have self-control.  I admitted that I myself was also unable to control myself so I speak from experience.


Dirt Chip said:


> Shop colour-blind, following your specific needs at a given time and according to your budget after prioritising it.


Sure, and completely ignore any possible negative impact that it might have on the industry and market as a whole.  That might work in a typical market but video cards have been, until this year, a duopoly.  A duopoly is a terrible thing because it's extremely unstable.  If one competitor fails, then ALL consumers lose and there's no going back.  

A whopping 88% of the video cards that people currently own are nVidia cards.  Fact-check that if you like, I don't make things up.  If recent trends don't change, you won't have to worry about shopping colour-blindly because there will only be ONE colour left and that ONE colour will smile as you pay $3,000USD for a mid-tier video card before 2030.  You have to imagine that AMD is already wondering if making Radeon cards is worth it because your "colour-blind" shopping doesn't actually exist.  The vast majority of people who buy video cards buy nVidia regardless.  If you buy a Radeon today, you'll still be able to buy a GeForce tomorrow.


----------



## N3M3515 (Dec 15, 2022)

Avro Arrow said:


> Thank you.  I'm glad you liked it.  It was quite an effort to compose, believe me!
> 
> Well, it did take a pretty long time to compose it because there were a lot of thoughts to collect together and make cohesive.
> 
> ...


Couldn't have said it better. 
This situation of ever increasing gpu prices is out of control.


----------



## LuxZg (Dec 15, 2022)

Dirt Chip said:


> I wish TSMC would get the same 'hate treatment' NV is getting for price gouging.
> The new dogma will be something like "don't buy overpriced TSMC, only GlobalFoundries (or Intel...)".
> LOL all the way to absurdity.
> 
> ...



There's one thing that will eventually save us. New nodes cost a lot, but their implementation is paid off within the first years. And 4 nm/5 nm will stay near the top for a while as further miniaturisation slows, so production costs should then start to fall. So I say: keep your GPUs as long as you can, and buy 4/5 nm GPUs in a couple of years when the price gets right. Ignoring a few outliers, hardware is way ahead of software these days, and that won't change anytime soon. Be smart, save money.


----------



## Neo_Morpheus (Dec 15, 2022)

Avro Arrow said:


> A whopping 88% of the video cards that people currently own are nVidia cards.


There was a time that it was for good reasons, but we are now paying for those "good reasons" and worse, people are ignoring the really bad reasons why we should stop giving Nvidia money.


Avro Arrow said:


> You have to imagine that AMD is already wondering if making Radeon cards is worth it because your "colour-blind" shopping doesn't actually exist.


I am afraid of that, and I'll be brutally honest: if they left the GPU market, I wouldn't blame them.


Avro Arrow said:


> Fact-check that if you like, I don't make things up. If recent trends don't change, you won't have to worry about shopping colour-blindly because there will only be ONE colour left and that ONE colour will smile as you pay $3,000USD for a mid-tier video card before 2030.


I have been saying the same thing, just in less eloquent words, about what we are walking towards.


Avro Arrow said:


> The vast majority of people who buy video cards buy nVidia regardless. If you buy a Radeon today, you'll still be able to buy a GeForce tomorrow.


Yeah, but no: I hate Nvidia so much that it will take a miracle of biblical proportions for me to give them money again.

Can't edit the post, but saying it down here: Welcome to TPU!!


----------



## AusWolf (Dec 15, 2022)

Avro Arrow said:


> Thank you.  I'm glad you liked it.  It was quite an effort to compose, believe me!
> 
> Well, it did take a pretty long time to compose it because there were a lot of thoughts to collect together and make cohesive.
> 
> ...


What a comprehensive, well thought-out assessment of the market situation!

A few years ago, I completely agreed with the "_shopping for brand is bad_" and "_colourblind shopping is the way_" sentiments, but when 88% of the market is owned by a single company, one is inclined to make a stand and say "_I'll give up that 10% extra RT performance because I want to buy a midrange GPU under a grand for my next rig as well_".

When Intel Arc debuted, Linus (from Linus Tech Tips) heavily defended it and urged people to buy an A770 despite its many problems. I completely understand why.


----------



## sepheronx (Dec 16, 2022)

I wanted to buy an A770 but couldn't get one in my shithole


----------



## AusWolf (Dec 16, 2022)

sepheronx said:


> I wanted to buy a a770 but couldn't get one in my shithole


Same here.


----------



## sepheronx (Dec 16, 2022)

AusWolf said:


> Same here.


I just looked, and there are 10 available on the other end of the city. I may just grab one tomorrow


----------



## AusWolf (Dec 16, 2022)

sepheronx said:


> I just looked and there is 10 avail on other end of the city. I may just grab it tomorrow


Here, only Ebuyer has it for £380. It's a bit too much, imo.


----------



## Bwaze (Dec 16, 2022)

Maybe when the new generation low and midrange Nvidia and AMD cards launch, the £380 A770 will look better.  And prices might even go up.

When the RTX 4080 launched, the prices of the RTX 3080, 3090 and their Ti variants actually went up, if Geizhals.eu is anything to go by. And it's not because the stores are running out of them; most of the models seem widely available.


----------



## LuxZg (Dec 16, 2022)

A770 for 436-486€ over here. 345€ gets me RX6650. 375€ for RTX3060. (Croatia, taxes and all included)
It certainly makes it hard to support 3rd player with that kind of pricing. If it was 350€ I'd honestly consider it, even though that's on the edge of my (imaginary) red line for new GPU costs. Though in reality - all three should be around 300€ with taxes. But 350€ is... close enough in this crazy world.
I hope they get better volumes by the time the B770 hits the market, so the price gouging by retailers also stops.


----------



## AusWolf (Dec 16, 2022)

LuxZg said:


> A770 for 436-486€ over here. 345€ gets me RX6650. 375€ for RTX3060. (Croatia, taxes and all included)
> It certainly makes it hard to support 3rd player with that kind of pricing. If it was 350€ I'd honestly consider it, even though that's on the edge of my (imaginary) red line for new GPU costs. Though in reality - all three should be around 300€ with taxes. But 350€ is... close enough in this crazy world.
> I hope they get better volumes by the time the B770 hits the market, so the price gouging by retailers also stops.


There is literally one single retailer that sells the A770 here in the UK, for £380 (about €410-420). It's very hard to justify even for a hoarder like myself when the 6650 XT starts at £299. I really wanted to get one (an A770), but when I saw the price, I just threw a few quid on top and got myself a 6750 XT instead.


----------



## Dirt Chip (Dec 16, 2022)

Avro Arrow said:


> Thank you.  I'm glad you liked it.  It was quite an effort to compose, believe me!
> 
> Well, it did take a pretty long time to compose it because there were a lot of thoughts to collect together and make cohesive.


Indeed, kudos for that. Above all, I appreciate a well-thought-out read.



Avro Arrow said:


> I worked for years at Tiger Direct.  I know EXACTLY how people NEED to shop because I know the tricks that are played on them.  Clearly, you've never worked in the industry because you're using the layman's "infinite reasons for buying" argument which falls completely flat when addressing why people buy video cards.  You might want to re-read what you quoted because I began each explanation with "If you're ONLY a gamer" (without the caps) which means that I was excluding people who wanted the cards for any combination of gaming and content creation or professional workloads.  If you had paid attention, you would've seen that and wouldn't have responded as you have.


I didn't mean people who do more than gaming. My point is about those who mainly or only game.


Avro Arrow said:


> Arrogant people don't publicly admonish themselves.  That's another lesson for you.


And yet, to say conclusively, without any doubt, that YOU know exactly what every consumer NEEDS to purchase is, imo, arrogance, because you can't positively know each one's reasons.
You might give yourself more credibility by publicly admonishing yourself, but the bold statement is very patronising. Knowing every trick NV/AMD play on them doesn't grant you insight into everyone's soul. Thinking it does is beyond arrogance. Do you really think you know every person so well that you can dictate what's best for him?


Avro Arrow said:


> Except that if your experience is not improved by spending more money, then spending more money is plain stupid.  If you were to buy an RX 6900 XT, then your gaming experience wouldn't be discernably better than if you were to buy an RX 6800 XT.  If you can play your games maxxed-out at 4K, then there's not going to be an objective difference.  When I worked at TD, I used to have people come in and ONLY want a GTX 280.  It was quite obvious that they were immature egomaniacs who lived at home and so didn't pay rent.  That kind of mentality generally disappears as you become a man unless there's something wrong with you because being a grownup means becoming more objective and reasonable.  These are assets when it comes to decision-making and thus assets to survival.  The people who never seem to develop these assets are usually just too stupid to be able to.


I think you are too quick to mark people as stupid if they don't follow what you think is right for them / your logic. If they saved the money (a good quality in itself, to save for something), prioritised their PC components and their life in general, know they're paying extra to get only a small bump in fps, and still decide they want to spend it that way, I don't see any reason to convince them otherwise (unless they pay the extra in order to "support" the company - with that I don't agree at all).
I'm sure, and agree, that many don't do that proper thinking before buying, but that's their lesson to learn and not yours to decide. Maybe your long experience with many snotty, childish people (or adults) has tinted you a bit into thinking everyone is like that, so you somehow try to control/save all of humanity (read: the gamer community) from its irrationality. No point going down that road.


Avro Arrow said:


> Again, you do realise that I admonished myself, right?  I don't know what kind of gaslighting you're trying to do here but I INCLUDED MYSELF in the post.  You clearly do more than skim over the whole thing and that has been demonstrated repeatedly by your response.


I don't think that admonishing yourself gives you the moral right to do it to others. You're free to do it to yourself, but using that to direct it at others is not cool.


Avro Arrow said:


> You do realise that the market is morbidly close to collapsing into a monopoly, right?  No, I guess you don't.  The GPU market has never been more in danger of losing a major player than it is right now.  What's _really_ ominous is that this has happened _despite_ the fact that Radeon hasn't been this competitive with GeForce since 2015 with the R9 Fury.  People can no longer say "If AMD would only..." because AMD already HAS "only..." and it made absolutely no difference.  In fact, nVidia's market domination _accelerated_ during the RX 6000 / RTX 30 years!  It honestly looks like you're going to have the choice between broken Intel ARC (if Intel decides to stick around) or GeForce cards priced into the stratosphere.


I think you are overreacting with that morbid vision of AMD disappearing (they have good products in various market segments, including consumer GPUs, to keep them going IF they are serious about it, just like Intel with Arc) and are too afraid of monopoly (in the context of GPUs as a recreational gaming tool, not as a general rule about monopolies).
It sounds like your honest concern drives you, imo, to think the wrong way about how to fix this problem (AMD quitting consumer GPUs). Favouring a company of AMD's magnitude (a giant worldwide corporation listed at the top of the NASDAQ, just like NV and Intel) is a futile thing to do. They are not a charity case like, say, a local ice cream maker you want to help by favouring them over Nestlé. They will use, and are using, your sense of 'let's help the underdog before it collapses' to exploit you on exactly that. Do favour AMD if their product is right for you personally, from a selfish perspective, at an equal or better price - but not in every case, and not if the rival has a product that is better for you and/or costs less.


Avro Arrow said:


> Ordinarily yes, but, not in a duopoly.  Duopolies aren't very stable and AMD has given nVidia the fiercest competition that nVidia has seen since 2015 and not only has it not helped to balance the market, nVidia's domination accelerated.  So, reality has proven your words false.


Be that as it may, it doesn't matter, imo. If AMD can't pull it off in the long run, then, according to the market we are in, better sooner than later. Don't try to postpone the inevitable. All your goodwill won't be enough to keep them afloat if they can't come up with a winning GPU product for so long - and note that they have done it with Zen, so there's no fundamental problem here.
If their best is not enough in the GPU department, then close it off and let someone else try - those are the rules of the market, or do you want to change those as well?
I think they are in a much better place than you try to depict, and I don't agree we are on the road to a monopoly - the other way around: AMD will stay and Intel will join.
Anyway, none of that is a reason to choose other than what's best for me personally.
Moreover, choosing a product that is not the best for you out of the idea that you will help (a company of AMD's magnitude) will only lead to the opposite in the long run, and might leave you heartbroken. Your so-called support on a charity basis will enable them to keep the same direction for longer. The real force for dramatic change at that level of business is losing money, and propping them up to postpone the collapse will only postpone a change - a change that could lead them to make a product that will swing the market. They have proven they can, as I said, with Zen.


Avro Arrow said:


> And you call me naive?  Look, I don't care if you don't blame them because you're still dead-wrong.  If the only people who bought these insanely-priced cards were those who needed them, like how it was mostly rich people and prosumers who bought Titan cards, then we wouldn't be here because there aren't enough of them to make the impact that has been felt.  Insanely-priced Titans existed back in the days of normal gaming GPU pricing.  Only gamers are numerous enough for that and it is they who have made this impact.


Most of us gamers don't really need a tenth of what we've got, but we still want it. Do you really try to battle that "I want it" need?
And no, I don't think the gaming community shapes the face of the market. We are but a small-to-medium segment enjoying the fruits of much larger, more lucrative professional segments. It's not up to us foot soldiers to reshape the market, and trying will only lead to unnecessary frustration.
I am in favour of helping people make more conscious decisions about what hardware they really need, but blaming them or painting them as idiots isn't the way to go, imo. So, if anything, you perpetuate the current situation by insulting people.


Avro Arrow said:


> Once again, your insistence on ignoring the words used invalidates your response.  The words "If you're only a gamer" has this section of your post nullified before you typed it so you completely wasted your effort.


I stand by it even if your sole purpose is to game all day and night, as was my original intention.


Avro Arrow said:


> I don't remember saying that I care what you reject.  I DO remember saying that I don't care if you don't like it because it's still the truth.  You don't seem to _want_ it to be the truth, but that changes absolutely nothing.


You clearly don't care, but your so-called indisputable truth is, as I see it, only derived from a narrow point of view of frustration/concern about the current situation. You basically say no one can complain about anything if he has made a choice on the matter. I profoundly disagree. You don't care, but you care a lot to say it at length. And you keep contradicting yourself by complaining about how stupid everyone is - you also made a choice, hence by your own rule you are forbidden to complain. You are aware of that, but keep doing it anyway. (Or maybe I didn't understand you on that one... I do make mistakes, not a rare thing.)


Avro Arrow said:


> - If you want Radeon to get better, stop buying nVidia cards because magic doesn't actually exist and they need money to do so.


Profoundly disagree. Supporting AMD blindly as you suggest will make NV even stronger, or, if something does change in the long run, you will see AMD turning into NV very quickly (see the Zen 3 and Zen 4 case study).

As for "If you're mad at something that nVidia did but bought a GeForce card anyway, you are your own worst enemy": ruling out NV as an option *might* leave you with a lesser option and/or a higher cost for what is best for you. Plus, you will pay for the hope of seeing a change, and only get upset in return for that money when the change, if any, doesn't materialise the way you hoped. So part of your payment goes towards the promise of a better future for GPUs. Are you OK with that? Hopes and promises in the hands of a giant global company? No thank you - that is very bad advice to give. They will get money by making the better product that suits most people, not through charity support.


Avro Arrow said:


> - If nVidia does something you don't like but you have a Radeon card, you have no right to complain.  All you can do is say "I ain't buying nVidia."


You have every right to complain, and thinking you shouldn't shows how frustrated you are. A no-complaints dogma doesn't mix well with worldwide forums, so what are you doing here?
Also, denying the right to complain is an easy way to block any unpleasant feelings associated with the cause of the complaint. Are you in favour of that too?


Avro Arrow said:


> - If you're "unsatisfied" with something that AMD does but you have a GeForce card, you still have no right to complain because you gave your money to nVidia, the company that already owns the market.  You didn't try to make a positive difference when you had the chance so who cares what you think?  It's like listening to a four year-old crying about the fact that he tripped on his laces but refuses to tie his shoes.  It's just stupid.


There's no reason not to complain about a choice you made - it's actually a very important part of not making that bad choice again. How can you improve or learn anything otherwise?
Who is the stupid one: he who was wrong, complains (part of the process of understanding what went wrong) and then changes, or he who stays the same no matter what happens around him?


Avro Arrow said:


> You're dead wrong.  We are in COMPLETE control of the market because video cards are not a NEED.  Every time a card gets sold, it is because a consumer CHOSE to purchase it.  If you don't buy a video card, you don't get sick or die.  The only markets that can control us are FOOD, MEDICINE, CLOTHING and SHELTER which is why those markets are the most heavily regulated.  Everything else is just a want, a luxury.  EVERY purchase you make that doesn't involve one of those four is based on self-control, period.  Most people automatically know that so welcome to the human condition.  Talk about crying about first-world problems, eh?


Nope, gaming very much is a needed thing for some people. It's a kind of escapism, and there is plenty to escape from nowadays. It also takes care of many social needs and other basic non-materialistic needs. Maybe this is the root cause of your view: trying to dissociate that need from people, and disregarding it as something they can do without, is a fundamental misunderstanding of human beings.
Those needs come shortly after water, food and shelter. Denying them will lead to very bad outcomes if not treated carefully.
Although plausible in theory, it is very, very, very far from feasible to be in COMPLETE control of the market unless you are in COMPLETE control of all gamers alike. Does anyone have that magnitude of control?
I don't think so...


Avro Arrow said:


> It's only unrealistic because people don't have self-control.  I admitted that I myself was also unable to control myself so I speak from experience.


Yep, people tend not to have unlimited self-control (more than a few have very little when it comes to GPU purchases) and many lack patience, 'cause we are just humans, man. Don't be so harsh on yourself for that. Trying to control THAT human trait with force is as futile as it gets.


Avro Arrow said:


> Sure, and completely ignore any possible negative impact that it might have on the industry and market as a whole.  That might work in a typical market but video cards have been, until this year, a duopoly.  A duopoly is a terrible thing because it's extremely unstable.  If one competitor fails, then ALL consumers lose and there's no going back.


We part ways on the notion that an individual has any responsibility other than choosing what is best for him (that is, to learn, hear different opinions and then decide if and what to buy). In the capitalist market we are in, you are better off not trying to save it or direct it in any way; much larger forces will smash you to pieces. We might want it to be different, but that doesn't change the fact that it's not the way things are.


Avro Arrow said:


> A whopping 88% of the video cards that people currently own are nVidia cards.  Fact-check that if you like, I don't make things up.  If recent trends don't change, you won't have to worry about shopping colour-blindly because there will only be ONE colour left and that ONE colour will smile as you pay $3,000USD for a mid-tier video card before 2030.  You have to imagine that AMD is already wondering if making Radeon cards is worth it because your "colour-blind" shopping doesn't actually exist.  The vast majority of people who buy video cards buy nVidia regardless.  If you buy a Radeon today, you'll still be able to buy a GeForce tomorrow.


I believe you about that 88%, but I don't see how we are falling into a monopoly anytime soon, and even if we are, that doesn't deter me one bit. I know I'll be able to adjust accordingly, as will everyone else. *Colour-blind shopping is the single best thing AMD can have right now, as they do have a very good product to offer.* Maybe it's not the current way people shop, but if anything, that's what can "save" AMD (if anyone thinks they need saving at all).

But please don't choose AMD out of pity or out of concern for the fragile duopoly market. Those arguments sound much like the RT argument, or the Radeon driver argument for not going AMD; they are actually even more "coupons from thin air". That way of thinking - cheerleading AMD / ruling out NV - besides not helping AMD in the long run, might lead you to choose wrongly in other areas of life. So even if you do choose to only go AMD, check from time to time that it hasn't crept into other aspects of your life, or you will choose wrongly over and over again.
Choose AMD if it's the right product for you right now, and leave brand-loyalty considerations to places where they matter.


----------



## N3M3515 (Dec 16, 2022)

Dirt Chip said:


> But please don`t choose AMD out of pity or out of concern to the fragile duopoly market.


AMD has solid products, solid prices and better value (quick example: EVERY GPU from the RX 6800 XT down). Still, ignorant people choose nvidia due to mindshare. I really don't see a bright future for AMD if most people just want them to:

- Develop a better product
- Faster
- With the same or more features

just so that nvidia lowers prices, and then they buy nvidia anyway......



Dirt Chip said:


> shop "colour-blind" is the single most best thing AMD can have now, as they do have very good product to offer


Most people don't care or don't know.


----------



## Gica (Dec 17, 2022)

N3M3515 said:


> AMD has solid products, solid prices, better value(quick example: EVERY GPU FROM RX 6800 XT and below). Still ignorant people choose nvidia due to mindshare. I really don't see a bright future for them if most people just want them to:
> 
> Develop a better product
> Faster
> ...


AMD and nVidia have adjusted their prices according to each other's offerings. My opinion is that each of them has enough information about the performance and value of the competitor's product, and that's how they set their MSRPs.
1. Rasterization.
Where the AMD video card is better, do you think you will see the difference between (e.g.) 120 FPS and 130 FPS?
2. Ray Tracing.
Yes, here you can see the differences, and AMD is still suffering.
3. Upscaling.
nVidia: DLSS + FSR
AMD: only FSR
And DLSS remains superior to FSR, so it is the right choice for an RTX owner.
4. Rendering.
OptiX (nVidia RTX only) > CUDA (all nVidia cards) > OpenCL (AMD)
Although the RX 7000 clearly outperforms the RX 6000, the RTX 4000 destroys everything. We're not talking about 10-20%; we're talking 200-300% above AMD's best offering.
5. Encoding & streaming.
Software support is clearly in nVidia's favour, with NVENC and/or CUDA present in every respected application.
I'll add that in this segment, AMD still struggles from time to time with drivers; the example that comes to mind is Vega with VP9: the decoder works in a standalone player but not on YouTube.

Far be it from me to say that choosing an AMD video card is wrong, but many see AMD as the only correct choice, and that smells a bit like fanboyism.



----------



## Space Lynx (Dec 17, 2022)

N3M3515 said:


> 6800XT $650 msrp -------- 7900XTX $1000 msrp, 51% higher performance for 53% higher price. Price/performance: crapola.
> 
> Exactly the same crap nvidia did with the 4080.
> And don't get me started on the 7900 XT.



You need to keep in mind that the 6800 XT often sells brand new for $530 to $540 now. I got my MSI TRIO 6800 XT for $540; at that price point it's the best bang for buck on the market. Overclocked, I'm only 100 points away from an RTX 3090 in the 3DMark test.
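To put that street-price argument in numbers, here's a minimal sketch of performance-per-dollar using only figures from this thread (the ~51% gap between the 6800 XT and 7900 XTX, $540 street price vs. $999 MSRP); the index values are illustrative, not TPU benchmark data:

```python
# Hedged sketch: relative performance per dollar at the prices quoted above.
# Index 100 = RX 6800 XT; the ~51% gap and both prices come from this thread.

def perf_per_dollar(perf_index: float, price_usd: float) -> float:
    """Performance index units bought per dollar."""
    return perf_index / price_usd

rx6800xt = perf_per_dollar(100.0, 540.0)   # street price quoted above
rx7900xtx = perf_per_dollar(151.0, 999.0)  # ~51% faster at $999 MSRP

# The cheaper card delivers roughly 23% more performance per dollar.
print(f"6800 XT value advantage: {rx6800xt / rx7900xtx - 1:.0%}")
```

By this measure the MSRP flagship has to drop well below $999 before it matches the discounted last-gen card on pure value.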


----------



## N3M3515 (Dec 17, 2022)

Gica said:


> 1. Rasterization.
> Where the AMD video card is better, do you think you will see differences between (eg) 120 FPS and 130 FPS?


RX 6800 XT $540 ----- AVG FPS @4K 95.1 fps
RTX 3070 $569 ----- AVG FPS @4K 72.7 fps
Very noticeable.
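Those two numbers can also be framed as dollars per average 4K frame; a quick sketch, using only the prices and averages quoted above:

```python
# Hedged sketch: cost per unit of average 4K framerate for the two cards above.

def dollars_per_frame(price_usd: float, avg_fps: float) -> float:
    """Price paid per average fps delivered."""
    return price_usd / avg_fps

rx6800xt = dollars_per_frame(540, 95.1)  # roughly $5.7 per fps
rtx3070 = dollars_per_frame(569, 72.7)   # roughly $7.8 per fps

print(f"RX 6800 XT: ${rx6800xt:.2f}/fps, RTX 3070: ${rtx3070:.2f}/fps")
```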


Gica said:


> 2. Ray Tracing
> Yes, here you can see the differences and AMD is still suffering.


The RX 6800 XT is slightly faster on average.
*Note: Doom Eternal @4K with RT:
RX 6800 XT ----- 83 fps
RTX 3070 ----- 18 fps
Very noticeable.


Gica said:


> 3. Upscaling
> nVidia: (DLSS + FSR)
> AMD: only FSR
> And DLSS remains superior to FSR and is the right choice for an RTX owner.


Close enough to not care.


Gica said:


> 3. Renderings
> OptiX (only nVidia RTX) > CUDA (only all nVidia cards) > OpenCL (AMD)
> Although the RX7000 clearly outperforms the RX6000, the RTX 4000 destroys everything. We're not talking about 10-20%, we're talking about 200-300% above AMD's best offer.
> 4. Encodings & streaming.
> ...


I only use gpus for gaming, so this point is moot for me and for 90% of gamers.


----------



## AusWolf (Dec 17, 2022)

Gica said:


> 1. Rasterization.
> Where the AMD video card is better, do you think you will see differences between (eg) 120 FPS and 130 FPS?


You're right, although when it's AMD with the lower performance, some people immediately call it shit. With Nvidia, it doesn't matter?

Besides, it does matter at lower price points, where you can have 40 instead of 30 FPS for the same price, or enjoy 60 FPS for a lower price.



Gica said:


> 2. Ray Tracing
> Yes, here you can see the differences and AMD is still suffering.


Yes. Whether that matters or not is up to personal interpretation.



Gica said:


> 3. Upscaling
> nVidia: (DLSS + FSR)
> AMD: only FSR
> And DLSS remains superior to FSR and is the right choice for an RTX owner.


Says who? Personally, I think FSR 2 is on par with DLSS 2.

I also think that upscaling should only be used as a last resort, to squeeze the last few FPS out of your system that you need to run your game at the desired quality settings.



Gica said:


> 3. Renderings
> OptiX (only nVidia RTX) > CUDA (only all nVidia cards) > OpenCL (AMD)
> Although the RX7000 clearly outperforms the RX6000, the RTX 4000 destroys everything. We're not talking about 10-20%, we're talking about 200-300% above AMD's best offer.
> 4. Encodings & streaming.
> ...


I know nothing about this, so all I say is that if you need these, buy Nvidia.

Personally, I don't find anything wrong with people buying overpriced Nvidia products due to a genuine need (Optix, or any professional work). I only facepalm at people who buy a 3060 instead of a 6600 XT or 6650 XT purely for gaming because they don't know any better. Or at people who buy a 4090 only because that's the most expensive shiny new thing out there.


----------



## N3M3515 (Dec 17, 2022)

AusWolf said:


> I only facepalm at people who buy a 3060 instead of a 6600 XT or 6650 XT purely for gaming because they don't know any better


RTX 3050 VS RX 6600 -------- 6600 is *29% faster, yet 26% cheaper*
RTX 3050 VS 6650XT --------- Same price, 6650xt is *57% FASTER*
RTX 3060 VS 6700XT --------- Same price, 6700xt is *27% faster *(and don't get me started on the 8GB version of the 3060)
RTX 3060Ti VS 6800 ---------- 6800 is 8% more expensive BUT *29% faster, *slightly faster at RT. Double the ram.
RTX 3070 VS 6800XT --------- 6800XT 8% more expensive BUT *26% faster, *slightly faster at RT. Double the ram.
RTX 3070Ti VS 6800XT --------- 6800XT *$100 cheaper and 18% faster, *equal or slightly worse at RT. Double the ram.
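The "X% faster, Y% cheaper" comparisons above all reduce to a single performance-per-dollar ratio. A minimal sketch of that arithmetic, using the poster's figures purely as illustrative inputs (the function name is hypothetical):

```python
def value_ratio(perf_gain_pct: float, price_delta_pct: float) -> float:
    """Perf-per-dollar of card B relative to card A.

    perf_gain_pct:  how much faster B is than A (e.g. 29 for "29% faster").
    price_delta_pct: how much more expensive B is (negative if cheaper).
    Returns a number > 1.0 when B is the better value.
    """
    return (1 + perf_gain_pct / 100) / (1 + price_delta_pct / 100)

# RX 6600 vs RTX 3050: 29% faster AND 26% cheaper
print(round(value_ratio(29, -26), 2))  # 1.74x the perf/$
# RX 6800 vs RTX 3060 Ti: 29% faster, but 8% more expensive
print(round(value_ratio(29, 8), 2))    # 1.19x the perf/$
```

The "same price" rows are the degenerate case: with `price_delta_pct = 0`, the value ratio is just the raw performance gain.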


----------



## shovenose (Dec 17, 2022)

N3M3515 said:


> RTX 3050 VS RX 6600 -------- 6600 is *29% faster, yet 26% cheaper*
> RTX 3050 VS 6650XT --------- Same price, 6650xt is *57% FASTER*
> RTX 3060 VS 6700XT --------- Same price, 6700xt is *27% faster *(and don't get me started on the 8GB version of the 3060)
> RTX 3060Ti VS 6800 ---------- 6800 is 8% more expensive BUT *29% faster, *slightly faster at RT. Double the ram.
> ...



Yeah, the stingy VRAM is really irritating. A 3080 shouldn't have just 10GB.


----------



## wolf (Dec 18, 2022)

Luminescent said:


> I have a fan.


Indeed you do, always amusing to me when people are exactly what they accuse others of being, and add layers of rudeness on top.

Pretty keen to revisit the 7900s in a few months; hopefully sanity prevails after Xmas, both camps start making prices in the meat of the range more appetizing, and RDOA3 gets time to fix a few more egregious issues.


----------



## Gica (Dec 18, 2022)

N3M3515 said:


> RX 6800 XT $540 ----- AVG FPS @4K 95.1 fps
> RTX 3070 $569 ----- AVG FPS @4K 72.7 fps
> Very noticeable.
> 
> ...


Sir, around here 3070 is cheaper than 6800XT. Last year, when I bought the 3070 Ti for 800 euros (offer from Germany), you couldn't even find the 6700XT cheaper (well, LHR versus full mining).
You are comparing apples with pears to an old version.
The MSRP at 3000 and 6000 was a bad joke, the video cards selling for at least double that.

P.S. In DOOM, the 3070 renders 116 FPS, not 18, without ray tracing and DLSS. And you won't see any difference if a Radeon renders more frames.
P.S. Maybe you can find games where the 3070 is equal to the 6900XT (not the 6800XT) with ray tracing on. If you don't succeed, I'll help you. I notice that you can only find the ones that put Radeon in a good light.

For 1440p, Ultra Nightmare, *RT ON, DLSS OFF*, you can watch the material for the 3070 Ti. It is "only" 1440p because these video cards are recommended for this resolution. In 4K, a memory limitation appears, as well as a performance limitation with ray tracing ON, on all AMD video cards from the 6000 series, the most powerful of them performing at most equal to the ancient 2080 Ti.


----------



## N3M3515 (Dec 18, 2022)

Gica said:


> Sir, around here 3070 is cheaper than 6800XT


Maybe there, but *USA is the biggest market* so.....invalid argument.


Gica said:


> P.S. In DOOM, the 3070 renders 116 FPS, not 18. Without ray tracing and DLSS. And you won't see any difference if a Radeon renders more frames.


I took the data from *TPU*, *without any upscaling* tech.


Gica said:


> PS Maybe you can find games where 3070 is equal to 6900XT (not 6800XT) with ray tracing on. If you don't succeed, I'll help you. I notice that you can only find those that put Radeon in good light.


Again, according to *TPU's* own metrics, the 6800 XT is slightly better than the 3070 at RT on average. I'm not stating opinions, just *FACTS.*


----------



## Fasola (Dec 18, 2022)

N3M3515 said:


> Again, according to *TPU's* own metrics, the 6800 XT is slightly better than the 3070 at RT on average. I'm not stating opinions, just *FACTS.*


In fact, the 6800 XT is equal to the 3070 *Ti* at 1080p and 1440p, and *faster* at 4K across the games that TPU tested in this very review.


----------



## Gica (Dec 18, 2022)

It is totally inappropriate to compare those two video cards now, as if the 4000 and 7000 series hadn't just been released. Last year, AMD prices were sky-high and their video cards, in most cases, were nowhere to be found. The discussion is for discussion's sake, since a new generation has been launched. And the 5600X now costs ~900 RON (~$190), but last year it did not drop below 1500 RON (over $300).

P.S. The 3070 (Ti) and even the 6800XT should only be compared at 1080p and 1440p. In 4K, both can achieve a decent FPS only by compromising on details in heavy AAA games. If you also activate ray tracing, they are weak at this resolution.

P.S. The screenshot shows the cheapest video cards available now in Romania (I did not search through all the stores; you may find other good offers, AMD or Nvidia). We are talking about the entry-to-midrange class. The idea is that you aren't tied to the USA: you buy the desired product from anywhere, waiting for the tempting offers. In the current case, the 3070 is ~$125 cheaper than the 6800XT, and the anomaly from the beginning of the year still prevails, with certain 6600 models being cheaper than the scrap 6500XT.
As far as I'm concerned, I had the 6800XT as an alternative to the 3070 Ti, but I would have had to pay ~300 euros more. We are talking about the summer of 2021.
Although you deny them, the other advantages of an nVidia video card are undeniable. You also denied the importance of ray tracing until AMD came up with something similar. And you still deny, but the situation will change only if AMD takes the crown in this segment.
Peace to you!


----------



## TheinsanegamerN (Dec 18, 2022)

Neo_Morpheus said:


> There was a time that it was for good reasons, but we are now paying for those "good reasons" and worse, people are ignoring the really bad reasons why we should stop giving Nvidia money.


People are not going to pay a premium price for sub-tier products. That's just simple reality. If you can't compete on performance, features, or price, you get left out. Simple as. Oh yeah, and you need to be able to DELIVER said product when it's needed, not 8 months later.


Neo_Morpheus said:


> I am afraid of that and I will be brutally honest, if they left the GPU market, I would not blame them.


AMD has made their own bed over the years. At some point you have to compete more than once in a blue moon if you want to be a "premium" brand.


----------



## Neo_Morpheus (Dec 19, 2022)

TheinsanegamerN said:


> People are not going to pay a premium price for sub-tier products. That's just simple reality. If you can't compete on performance, features, or price, you get left out. Simple as. Oh yeah, and you need to be able to DELIVER said product when it's needed, not 8 months later.
> 
> AMD has made their own bed over the years. At some point you have to compete more than once in a blue moon if you want to be a "premium" brand.


By your logic, we won't have to bother about premium items, since it will be only Nvidia, and we already know how they price their wares when they don't have "competition".

The 5060 will be around US$1200, the 5070 will be $1600, $2200 for the 5080, and they won't even bother releasing a 5090, because we won't have any options but to buy Nvidia.

I think I will need a new hobby by then.


----------



## AusWolf (Dec 19, 2022)

Neo_Morpheus said:


> By your logic, we won't have to bother about premium items, since it will be only Nvidia, and we already know how they price their wares when they don't have "competition".
> 
> The 5060 will be around US$1200, the 5070 will be $1600, $2200 for the 5080, and they won't even bother releasing a 5090, because we won't have any options but to buy Nvidia.
> 
> I think I will need a new hobby by then.


This.

Besides, anyone who says AMD isn't competitive right now is either (1) hung up on the RT hype a bit too much, (2) thinks that 5700-series driver problems are still around when they're not, (3) needs OptiX or other professional creator stuff, or (4) just likes to recite what mindless Nvidia fans spew all over the internet because it sounds cool.


----------



## vMax65 (Dec 19, 2022)

Luminescent said:


> I'm gonna leave this here
> Top Nvidia shareholders
> Vanguard Group Inc. representing 7.7% of total shares outstanding​BlackRock Inc. representing 7.2% of total shares outstanding​Top AMD shareholders
> Vanguard Group Inc. representing 8.28% of total shares outstanding​BlackRock Inc. representing 7.21% of total shares outstanding​


I also read that Jen-Hsun and Lisa Su are related...



Fluffmeister said:


> This card oozes buyers regret to me, I mean if you're already in the market for a card costing a grand, you might as well spend a bit more and get the better all round product of 4080.


Sad but true...


----------



## Avro Arrow (Dec 19, 2022)

Neo_Morpheus said:


> There was a time that it was for good reasons, but we are now paying for those "good reasons" and worse, people are ignoring the really bad reasons why we should stop giving Nvidia money.


I think that it's because most people who buy video cards don't have enthusiast-level knowledge of what's going on.  To them, it's like choosing between GE and Whirlpool.


Neo_Morpheus said:


> I am afraid of that and I will be brutally honest, if they left the GPU market, I would not blame them.


Nor would I.  Consumers have proven time and again that they don't deserve what AMD tries to do.


Neo_Morpheus said:


> I have been saying that but in other less eloquent words as to what we are walking towards to.


Don't sell yourself short Neo.  You care passionately about the industry as a whole and I've always respected you for that.


Neo_Morpheus said:


> Yeah, but no, I hate Nvidia so much that it will take a miracle of biblical proportions to give them money again.


Like Jensen getting hit by a bus?  


Neo_Morpheus said:


> Cant edit the post, but saying it down here: Welcome to TPU!!


Thanks bro.  I've always admired this place and I'm glad that I finally got on board!



N3M3515 said:


> Couldn't have said it better.
> This situation of ever increasing gpu prices is out of control.


It's a really sad state of affairs, especially since this is self-inflicted.  We just have to be willing to say "no".  I've resolved to use my RX 6800 XT and R7-5800X3D until I really can't use them anymore.


AusWolf said:


> What a comprehensive, well thought-out assessment of the market situation!
> 
> A few years ago, I completely agreed with the "_shopping for brand is bad_" and "_colourblind shopping is the way_" sentiments, but when 88% of the market is owned by a single company, one is inclined to make a stand and say "_I'll give up that 10% extra RT performance because I want to buy a midrange GPU under a grand for my next rig as well_".


Yep.  It's just called "seeing the bigger picture".  If you buy a Radeon today, you can still buy a GeForce tomorrow.  The inverse is not necessarily true.


AusWolf said:


> When Intel Arc debuted, Linus (from Linus Tech Tips) heavily defended it and urged people to buy an A770 despite its many problems. I completely understand why.


I don't believe that for a second because LTT is one of the major reasons why the market is the way it is.  He has admitted to having a lot of "friends" at Intel (or at least, people he genuinely likes) which is why he keeps supporting them.


sepheronx said:


> I wanted to buy a a770 but couldn't get one in my shithole


I don't know why, but that's funny as hell!  


sepheronx said:


> I just looked and there is 10 avail on other end of the city. I may just grab it tomorrow


I dunno.  I have a hard time supporting Intel after all of their illegal and anti-consumer practices.  Intel's got plenty of cash and doesn't need our support.


Dirt Chip said:


> I think you are over-reacting with that morbidity of AMD disappearing (they have good products at various market segments, including consumer GPUs, to keep them going IF they are serious about that, just like Intel with Arc) and too afraid of monopoly (that is in the context of GPUs as a recreational gaming tool, not as a general rule about monopoly).


Call it "reacting before things become overly-dire".  Most people aren't reacting at all and that's even worse.


Dirt Chip said:


> It sounds like your honest concern drives you, imo, to think the wrong way about how to fix this problem (AMD quitting consumer GPUs). Favoring a company of the magnitude of AMD (a giant worldwide corporation listed at the top of NASDAQ, just like NV and Intel) is a futile thing to do. They are not a charity case like, say, a local ice cream maker you want to help by favoring them over Nestlé ice cream. They will and are using your sense of 'let's help the underdog before it collapses' to exploit you on exactly that. Do favor AMD if their product is right for you personally, from a selfish perspective, at equal or better price, but not in any case, and not if the rival has a product that is better for you and/or costs less.


I'm not favouring them per se.  I've just grown to hate nVidia over the years.  Saying to favour AMD if their product is right for me personally is a good mantra to have.  The problem is that many people for whom AMD _would_ be the better choice aren't making it.


Dirt Chip said:


> Profoundly disagree. Supporting AMD blindly as you suggest will make NV even stronger, or in the long shot that something changes, you will see AMD turning into NV very quickly (see the Zen 3 and Zen 4 case study). If you're mad at something that nVidia did but bought a GeForce card anyway, you are your own worst enemy. Ruling out NV as an option *might* leave you with a lesser option and/or a higher cost than what is best for you.


I've never seen a Radeon be higher cost than a GeForce on the same level so that's a strawman argument at best.


Dirt Chip said:


> Nope, gaming is very much a needed thing for some people.


Absolutely ridiculous.  Video games didn't exist for hundreds of thousands of years and we're still here.


Dirt Chip said:


> But please don`t choose AMD out of pity or out of concern to the fragile duopoly market.


I don't choose AMD strictly out of concern for the fragile duopoly, I choose AMD because at the same price point, their products out-perform nVidia's.  I said that because people who _should_ choose AMD are choosing nVidia and they need to wake up.


Gica said:


> Sir, around here 3070 is cheaper than 6800XT. Last year, when I bought the 3070 Ti for 800 euros (offer from Germany), you couldn't even find the 6700XT cheaper (well, LHR versus full mining).


That's fair and in that situation, I don't blame you.


Gica said:


> You are comparing apples with pears to an old version.
> The MSRP at 3000 and 6000 was a bad joke, the video cards selling for at least double that.


Yes I know, but after the hell had subsided, the Radeon cards were below MSRP while the GeForce cards weren't (and still aren't as far as I know).  The fact that the RX 6800 XT was _hundreds less_ than the RTX 3080 didn't seem to matter.  I was using the "after mining" time period as my template.


Gica said:


> For 1440p, Ultra Nightmare, *RT ON, DLSS OFF*, you can watch the material for 3070 Ti. It is "only" 1440p because these video cards for this resolution are recommended. In 4K, memory limitation appears as well as performance limitation with ray tracing ON on all AMD video cards from the 6000 series, the most powerful of them having performances at most equal to the ancient 2080Ti.


And yet, people still paid over $2,000 for the RTX 2080 Ti back then.  Do you see how people just go nuts for nVidia's top card and marketing even when, as you say, it's really not that great?  Also, that's a title that has RT when most titles don't.


----------



## Gica (Dec 19, 2022)

Avro Arrow said:


> Yes I know, but after the hell had subsided, the Radeon cards were below MSRP while the GeForce cards weren't (and still aren't as far as I know).  The fact that the RX 6800 XT was _hundreds less_ than the RTX 3080 didn't seem to matter.  I was using the "after mining" time period as my template.


The collapse of crypto caught them with huge unsold stocks. Happy are those who buy at a low price; unhappy are those who bought at higher prices and cannot sell the product on the second-hand market for even 50% of the cost paid. How much can you sell a 5600X for, purchased for $300+, if it is now being sold new for less than $200?


----------



## TheoneandonlyMrK (Dec 20, 2022)

AMD attempts to defuse controversy around RDNA 3 GPUs and 'broken' feature: "Shader pre-fetching code is experimental and therefore not supposed to be working, AMD assures us" (www.techradar.com)




So a guy called Kepler said it's broken, and AMD refuted this.


----------



## InVasMani (Dec 20, 2022)

Avro Arrow said:


> I think that it's because most people who buy video cards don't have enthusiast-level knowledge of what's going on.  To them, it's like choosing between GE And Whirlpool.
> 
> Nor would I.  Consumers have proven time and again that they don't deserve what AMD tries to do.
> 
> ...



The few the proud the woke the circuit bent robots those whom avoided Avro Arrow's attempt at responding to ALL THE THINGS!! All your base are belong to us...you didn't think you'd get off that easily did you?


----------



## N3M3515 (Dec 20, 2022)

So........RTX 3080 was more expensive to make than the 4080?












I was right to vote $800 as the price of the 4080. 
Seems like AMD did something similar with the 7900 series.


----------



## Luminescent (Dec 20, 2022)

The communicated inflation rate in the US is bullshit; it is much higher, and Nvidia and AMD know it.
They printed a lot of money in the "bug" time, and even more now with the war and the current people in charge.


----------



## Avro Arrow (Dec 20, 2022)

Gica said:


> If the collapse of the crypto caught them with huge unsold stocks. Happy are those who buy at a low price, unhappy are those who bought at higher prices and cannot sell the product on the second-hand market for even 50% of the cost paid. How much can you sell a 5600X, purchased for $300+, if it is now being sold, new, for less than $200?


I'm not sure what you're trying to say here.  I mean, sure, if you bought when prices were high (like I did with my RX 6800 XT), then yeah, you're pretty much screwed.  I don't know what that has to do with Radeon cards being a far better value than GeForce cards though.  Even Tom's Hardware did an article called AMD Graphics Cards Are the Better Value at Every Price Point, with the URL being "amd-graphics-cards-are-better-value-than-nvidia".

This wasn't exactly news to me because historically, it has been true that if a Radeon and a GeForce are the same price, the Radeon will be superior.



N3M3515 said:


> So........RTX 3080 was more expensive to make than the 4080?
> View attachment 275262
> 
> 
> ...


Yeah, but I think that AMD put their prices higher because they were astonished to see how many fools were lining up to throw $1600 at Jensen for a goddamn VIDEO CARD.  They're a corporation after all and if consumers demonstrate that they're willing to pay $1600, then paying $1000 should be a piece of cake for them.

Personally, I think that if the RTX 4090 hadn't sold out like it did but was treated more the way that the RTX 4080 is being treated (which is what _should_ have happened), I think that the RX 7900 XT would be no more than $700 today.

It's like I've been saying all along.  These exorbitant prices are our own fault.


----------



## N3M3515 (Dec 20, 2022)

Avro Arrow said:


> Yeah, but I think that AMD put their prices higher because they were astonished to see how many fools were lining up to throw $1600 at Jensen for a goddamn VIDEO CARD


I agree, though the 4090 is not the first to do that; there's precedent, and that's why the top of the top gets a pass. But apart from that, the rest of the lineup should not get away with it.
The RX 7900 XTX, I insist, is also a rip-off, to a lesser degree than the 4080; performance-wise it replaces the $650 6800 XT.


----------



## InVasMani (Dec 21, 2022)

GPUs in general are priced too high today.


----------



## Gica (Dec 21, 2022)

In all your agitation, you remained frozen in 1990. Gentlemen, today the video card is not only used for FPS in gaming. Does AMD support OptiX or CUDA? Does its enc/dec module match the value of Nvidia's? Considering the price of video cards, and that no respected creator is looking at any Radeon, the demand is huge in this segment. Oh, and let's not forget DLSS, which is completely missing from AMD.
You gave the example of Doom Eternal. What performance does an AMD video card get with DLSS ON? Ah, it can't handle that technology! So how does it perform with FSR? Ah, the game doesn't have it implemented.
In addition to this, gentlemen, we are at the end of 2022. If we are still investing a lot of money, we should invest it intelligently, and investing intelligently does not mean buying video cards released two years ago. I'm just saying.


----------



## N3M3515 (Dec 21, 2022)

Gica said:


> In all your agitation, you remained frozen in 1990. Gentlemen, today, the video card is not only used for fps in gaming. Does it support AMD OptiX or CUDA? Does the enc/dec module match nVidia's value? For the price of video cards, taking into account that no respected creator is looking at any Radeon, the demand is huge in this segment. Oh, and let's not forget DLSS, which is completely missing from AMD.
> You gave the example of Doom Eternal. What performance does an AMD video card get with DLSS ON? Ah, can't stand technology! So how does it perform with FSR? Ah, the game doesn't have this implemented.
> In addition to this, gentlemen, we are at the end of 2022. If we are still investing a lot of money, we are investing it intelligently, and intelligently does not match the purchase of video cards released two years ago. I'm just saying.


None of what you just said justifies current prices, period.


----------



## TheoneandonlyMrK (Dec 21, 2022)

Gica said:


> In all your agitation, you remained frozen in 1990. Gentlemen, today, the video card is not only used for fps in gaming. Does it support AMD OptiX or CUDA? Does the enc/dec module match nVidia's value? For the price of video cards, taking into account that no respected creator is looking at any Radeon, the demand is huge in this segment. Oh, and let's not forget DLSS, which is completely missing from AMD.
> You gave the example of Doom Eternal. What performance does an AMD video card get with DLSS ON? Ah, can't stand technology! So how does it perform with FSR? Ah, the game doesn't have this implemented.
> In addition to this, gentlemen, we are at the end of 2022. If we are still investing a lot of money, we are investing it intelligently, and intelligently does not match the purchase of video cards released two years ago. I'm just saying.


Yeah, no PhysX either. Go full retard or GTFO.


----------



## Avro Arrow (Dec 21, 2022)

N3M3515 said:


> I agree, though the 4090 is not the first to do that, there's precedent and that's why the top of the top gets a pass. But apart from that the rest of the lineup should not get away with it.
> The RX 7900 XTX, I insist, is also a rip-off, to a lesser degree than the 4080,


Absolutely!  It's a TOTAL rip-off!  The problem is that nVidia pretty much gave AMD the green light to do it.


N3M3515 said:


> performance-wise it replaces the $650 6800 XT.


I don't really agree with that because it is their level-9 card which still makes it a better value than the RX 6900 XT before it (which was a terrible value in itself compared to the RX 6800 XT).  I think that's the card that it's replacing.

However, I do NOT give them a pass for the existence of the RX 7900 XT because THAT card really should be called the RX 7800 XT.  This use of "XTX" is just plain stupid.  The RX 7900 XTX should have been called the RX 7900 XT and the RX 7900 XT should have been called the RX 7800 XT.  If they wanted an XTX card, they should have reserved that nomenclature for a liquid-cooled refresh card and called it the RX 7950 XTX.

What they've done with their naming scheme here is pitiful and makes me think that they're on the wrong drugs in Santa Clara.  I know for a fact that several people who actually work at ATi in Markham are VERY pissed that AMD did this.


----------



## N3M3515 (Dec 21, 2022)

Avro Arrow said:


> I don't really agree with that because it is their level-9 card


That's the trick they play to justify the price increase. But the logic I'm using here is the same as last gen: 6800 XT vs 3080, they were neck and neck at raster, just the same as it is now with the 7900 XTX and the 4080. Literally the same scenario; AMD just chose to name it 7900. And the conclusion is: if the 4080 is only worth $800 (the majority of TPU members voted that, and the Gamers Nexus video also showed it), then the 7900 XTX is worth no more than $700 - $750.


----------



## Dirt Chip (Dec 22, 2022)

One can also disregard naming completely, as these companies play with it at will (which it actually is): reshuffling, adding, and changing by the gen and within a running gen (see NV's 'SUPER' suffix, which might make a comeback anytime).
Just go perf/cost at a given price range that suits you. So simple, and it will save you the trouble of doing name/perf and name/cost calculations (unless you enjoy it).

As time goes by, these companies teach us, thoroughly, to take their word (= name nomenclature) for less and less each gen.
By now, any informed shopper should know to give the 'name' of the product only small importance, and certainly not to shop by it.


----------



## N3M3515 (Dec 22, 2022)

Dirt Chip said:


> Just go perf/cost at a given price range that suits you


That's what I do; performance/$ just doesn't lie.


----------



## Avro Arrow (Dec 22, 2022)

N3M3515 said:


> That's the trick they play to justify the price increase. But the logic i'm using here is the same as last gen: 6800xt vs 3080, they where neck and neck at raster, just the same as it is now with the 7900XTX and the 4080. Literally the same scenario, amd just choose to name it 7900. And with the conclusion being: if the 4080 is only worth $800(majority of tpu members voted that, and gamers nexus video also showed it). Then the 7900XTX is worth no more than $700 - $750.


Well, the difference there is that AMD isn't under any obligation to make their part numbers match nVidia's or correspond with them in any way, shape or form.  AMD has the right to refer to their fastest card as level-9 because that's what they've done for at least 15 years.  Just because nVidia has some pie-in-the-sky card that they call level-9 doesn't mean that AMD has to follow suit.  They're a different company and nVidia doesn't set the standards for them when it comes to nomenclature.  Just because AMD's level-9 card matches nVidia's level-8 doesn't mean that AMD should call it a level-8 card.  That's where your logic falls apart.

Remember that the RX 6900 XT was only a paltry 9% faster than the RX 6800 XT so AMD's use of "7900" could easily be considered applicable for the XTX.  At the same time, the use of "7900" for their second-tier card was wholly inappropriate because their second-tier card is supposed to use the number 8, something that has been true since ATi came out with the HD-series of cards (whenever there actually was a 9-series card in their lineup).  The last Radeon that I can think of that had more than one card at level-9 was the X1000-series back in 2005-07 but ATi's naming scheme back then was completely different.  Moreover, they had over 40 different cards released in those two years, something that we'll probably never see again.

So, my problem isn't with the use of 7900 for their level-9 card, my problem is with them also referring to what is really their level-8 card as level-9.  I also agree that the RX 7900 XTX should be no more than $700 but at the same time, the RTX 4080 is even more egregiously-priced so we can't really blame AMD for trying to get as much as they could for the RX 7900 XTX while still making it considerably cheaper than the RTX 4080.  The reason that the RX 7900 XTX is $1000 is because the RX 6900 XT was also $1000 and people still bought it.  The price should've been a lot lower but it's hard to complain about Radeon pricing when the pricing of the level-8 GeForce is $200 higher than the price of the level-9 Radeon.  We should also remember that the RX 6900 XT was competitive in performance with the RTX 3090.  Every time I see some dumb reviewer comparing the RTX 3080 with the RX 6900 XT I'm like "WTF are you doing?  They're not even aimed at each other!" and I think that they do it to make nVidia look not so overpriced, which is dishonest as hell.

You know, one _could_ say that AMD did the right thing with their level-9 card by keeping the price the same.  I have no real problem with the pricing of the XTX, just the nomenclature.  When it comes to pricing, I _really_ have a problem with AMD raising the price of the level-8 card by $250 and re-naming it in a lame attempt to cover up the fact that they've essentially done this.  This is quite bad but again, it's hard to be really mad at AMD when what nVidia did with their level-8 card was literally _twice_ as bad.  It's a real mess that isn't helped at all by the fact that the only reviewer with the balls to condemn and seriously ridicule nVidia for this is Steve Burke of Gamers Nexus.  I honestly believe that _everyone_ should be piling up on them for what they're doing.  Even worse than that are the fools who are willing to buy them at this price, giving Jensen the validation that he craves to keep prices artificially high.


----------



## N3M3515 (Dec 22, 2022)

Avro Arrow said:


> Even worse than that are the fools who are willing to buy them at this price, giving Jensen the validation that he craves to keep prices artificially high.


True. I saw someone on YouTube comment on the 4080, "for $1000 I would buy it with no remorse", LOL. It is so incredibly overpriced that even at minus $200 it's still up there. Well, I don't see any incentive to replace my 6800 XT unless the 4080/7900 XTX goes down to $750 at most. And even then, I don't know if my 5900X would bottleneck them at 1440p/144Hz.


----------



## Bwaze (Dec 23, 2022)

"Stockholm syndrome is a coping mechanism to a captive or abusive situation. People develop positive feelings toward their captors or abusers over time. This condition applies to situations including child abuse, coach-athlete abuse, relationship abuse and sex trafficking."


----------



## gffermari (Dec 23, 2022)

The target is for the x80/x800 cards to be at $999/€999.
Mobile phone companies have been selling the same thing for ages, every year, at that price and above.

Anyone would do the same in Lisa's/Jensen's shoes, watching crap technology being sold at ridiculous prices.
And no one disagrees that GPU development is way harder than anything else (at the consumer level).

I think the problem will be solved in the next two or three gens, when both companies have chiplets and can produce GPUs for every pocket.
A monolithic design doesn't let you sell anything other than high-end products at a good profit.

Anyway, although I want to upgrade my GPU and could cover the cost of a 7900 XTX/4080, it's a matter of principle.
Some friends were lucky enough to buy a 3080 for £649. Taking all the factors into account, my limit for the next x80 is 799.


----------



## TheoneandonlyMrK (Dec 23, 2022)

gffermari said:


> The target is for the x80/x800 cards to be at $999/€999.
> Mobile phone companies have been selling the same thing for ages, every year, at that price and above.
> 
> Anyone would do the same in Lisa's/Jensen's shoes, watching crap technology being sold at ridiculous prices.
> ...


Fair point, but in the wrong thread; there's a 4080 thread for Nvidia-only buyers to complain about price.

Or are you one of those "AMD should be competitive or I can't get my cheap Nvidia" types?


----------



## gffermari (Dec 23, 2022)

TheoneandonlyMrK said:


> Fair point, but in the wrong thread; there's a 4080 thread for Nvidia-only buyers to complain about price.
> 
> Or are you one of those "AMD should be competitive or I can't get my cheap Nvidia" types?



Looool. I think my post fits both topics.

I had a 5700 XT (and RX 470, 7970, 6950, 4850, X1950 Pro, etc.) before buying the 2080 Ti, so no.
I would buy a 7900 XTX if it were at the 6800 XT's MSRP.





----------



## Avro Arrow (Dec 23, 2022)

N3M3515 said:


> True, I saw someone on YouTube comment on the 4080, "for $1000 I would buy it with no remorse", LOL. It is so incredibly overpriced that even at $200 less it's still up there. Well, I don't see any incentive to replace my 6800 XT unless the 4080/7900 XTX goes down to $750 at most.


I also have an RX 6800 XT and I plan to keep using it for many years to come. It's a great performer and won't hit any VRAM limitations like my R9 Fury did. I think it'll be at least a decade before 16GB isn't enough, and that's assuming there will ever be a time when 16GB isn't enough. I say that because it's already more than enough for Unreal Engine 5, which is essentially photo-realistic. You can't get much better than that.


N3M3515 said:


> And even then, I don't know if my 5900X would bottleneck them at 1440p/144 Hz.


Actually, the RX 6800 XT would be the bottleneck there, especially at 1440p, not the R9-5900X. See, the RX 6800 XT is the performance equal of the RTX 3080, and TechPowerUp has these results for the R9-5900X with an RTX 3080. I'll use 1080p because the game is completely GPU-bottlenecked at 1440p:




So you see here that your R9-5900X is tied with my R7-5800X3D with an RTX 4080, so our two rigs would be equal in performance because we both have RX 6800 XTs.  However, if we had a more powerful card, like an RTX 3090 Ti for example:




All other things being equal, your CPU's output increased by 7.5%, which means that the RTX 3080 was the bottleneck here. While it's true that Cyberpunk is pretty hard on GPUs, this is only 1080p, and if the RTX 3080 is the bottleneck at 1080p in Cyberpunk, then it would definitely be the bottleneck in _almost all_ modern titles at 1440p. Exceptions would be strategy games like Civilization, which rely almost entirely on the CPU for game speed. I often joke that the ideal setup for Civilization 6 would be a Threadripper coupled with an RX 580.
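That swap-the-GPU logic can be sketched as a quick check. This is illustrative only: the function name, the FPS numbers, and the 2% noise threshold are all my assumptions, not anything from TPU's methodology.

```python
def bottleneck_check(fps_baseline_gpu: float, fps_faster_gpu: float,
                     noise_threshold: float = 0.02) -> str:
    """Infer the limiting component from a same-CPU, same-resolution GPU swap.

    If swapping in a faster GPU raises the frame rate beyond run-to-run
    noise, the GPU was the limit; if FPS barely moves, the CPU was.
    """
    scaling = fps_faster_gpu / fps_baseline_gpu - 1.0
    return "GPU-bound" if scaling > noise_threshold else "CPU-bound"

# Hypothetical numbers mirroring the ~7.5% uplift described above:
print(bottleneck_check(100.0, 107.5))  # GPU-bound
print(bottleneck_check(144.0, 145.0))  # CPU-bound
```

The same question can also be answered from the other side: dropping the resolution instead of swapping GPUs and watching whether FPS climbs tells you the same thing.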

If the performance increases at the same resolution with a faster GPU, it means that the CPU isn't the bottleneck so your R9-5900X is just fine.  Even with the bottleneck, you're still looking at ~60FPS at 1440p with the RX 6800 XT in Cyberpunk 2077, certainly nothing worth upgrading any time soon.  That game is often considered to be a "worst-case-scenario" for Radeon cards so other games would be even better.  Here's a Cyberpunk demo at 1440p using an R9-5950X (which is pretty much the same as the R9-5900X in gaming) and an RX 6800 XT:


----------



## Ravenas (Dec 25, 2022)

I was able to get one of these directly from AMD. Great pricing and shipping for sure. Just now installing the drivers on Win11. Not much luck on the Linux side outside of Ubuntu and its flavors.

Ubuntu and Red Hat always get their hands on these first.


----------



## Ripcord (Dec 25, 2022)

Seems like a good offering from AMD: the same RT performance as a 3090 Ti and as fast as a 4080 in everything else, all for less money while using less power! What's not to like?


----------



## N3M3515 (Dec 25, 2022)

Ripcord said:


> what's not to like


The price


----------



## Swak (Jan 1, 2023)

N3M3515 said:


> The price


The 7900 XTX is fine.  It's 47% faster than the 6900 XT while having the same launch MSRP, even after two bad years of inflation.  If people want to fight the good fight and try to lower prices, sure, have at it, but some people seem to think 7900 XTX should really be a 7800 XT, which is delusional at best.  The "halo" cards for both AMD and nvidia have never been the best bang for your buck, except the 1080 Ti, which was clearly a mistake on nvidia's part.

The real problem is the 7900 XT.  Right now the 7900 XT is only a 34% improvement over the 6800 XT, while costing $900.  AMD won't release a 7800 XT that performs better than a 7900 XT, so you are probably looking at best around a 30% improvement going from a 6800 XT to a 7800 XT, at who knows what cost.  For comparison, I think the 5700 XT to 6700 XT was a 35% improvement at a 20% price hike; if they followed that trend, it would put the 7800 XT at slightly higher performance than the 7900 XT while costing $780 MSRP.


----------



## Gica (Jan 2, 2023)




----------



## N3M3515 (Jan 2, 2023)

Swak said:


> seem to think 7900 XTX should really be a 7800 XT, which is delusional at best. The "halo" cards for both AMD and nvidia have never been the best bang for your buck


People are not "thinking" that, and amd does not have a "halo" product because they decided to name it like one. A "halo product" has the highest performance at the highest price, like the 4090 this gen, and last gen 3090 and 6900 xt. The 6900xt traded blows with the 3090. Forward two years and now the best from amd is tied with the x80 class gpu from nvidia, that's not a halo product, that's a x80 class gpu like the 6800XT. AMD is just taking advantage of the ultra high price increace of the x80 class gpu from nvidia and pricing to match.


Swak said:


> The 7900 XTX is fine. It's 47% faster than the 6900 XT


39% according to this site, and 31% vs the previous top of the line, the 6950 XT. The 4090 is 45% faster than the 3090 Ti; that is a true generational leap (and that's not even the highest).


Swak said:


> the 7900 XT is only a 34% improvement over the 6800 XT, while costing $900


*27% improvement* (again, according to TPU) for a 38% increase in price. It would need to be $750 at the very least for the increase in performance to be higher than the increase in price; in that case it would be 27% faster for 15% more money. That would still be a shitty deal, but at least not a ripoff. The 7900 XT and 7900 XTX are clearly not for 6800 XT owners, or for people who care about price/performance.
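For what it's worth, the arithmetic behind those percentages can be sketched like this (the rounded MSRPs of $650 for the RX 6800 XT and $900 for the RX 7900 XT are my assumptions):

```python
# Rough price/performance math for the 7900 XT vs the 6800 XT.
# Assumed (rounded) MSRPs: RX 6800 XT $650, RX 7900 XT $900.
old_price, new_price = 650, 900
perf_uplift = 0.27  # ~27% faster per TPU's relative-performance numbers

price_increase = new_price / old_price - 1.0
print(f"price increase: {price_increase:.0%}")  # 38%

# Highest price at which the price hike still matches the performance gain:
breakeven = old_price * (1 + perf_uplift)
print(f"break-even price: ${breakeven:.0f}")  # $826

# The $750 asked for above stays well under that:
print(f"price increase at $750: {750 / old_price - 1.0:.0%}")  # 15%
```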


Swak said:


> you are probably looking at best around a 30% improvement going from a 6800 XT to a 7800 XT


Actually, *17 - 20%* would be more realistic, and for no less than $700.


----------



## kapone32 (Jan 2, 2023)

N3M3515 said:


> 39% according to this site. And 31% vs previous top of the line 6950 XT. 4090 is 45% faster than the 3090Ti, that is a true generational leap(and not the highest).


Seriously? Can we come back to this in a year?


----------



## Gica (Jan 3, 2023)

Are you hoping for a miracle from the drivers?


----------



## Avro Arrow (Jan 3, 2023)

Gica said:


> In all your agitation, you remained frozen in 1990. Gentlemen, today, the video card is not only used for fps in gaming. Does it support AMD OptiX or CUDA? Does the enc/dec module match nVidia's value? For the price of video cards, taking into account that no respected creator is looking at any Radeon, the demand is huge in this segment. Oh, and let's not forget DLSS, which is completely missing from AMD.
> You gave the example of Doom Eternal. What performance does an AMD video card get with DLSS ON? Ah, can't stand technology! So how does it perform with FSR? Ah, the game doesn't have this implemented.
> In addition to this, gentlemen, we are at the end of 2022. If we are still investing a lot of money, we are investing it intelligently, and intelligently does not match the purchase of video cards released two years ago. I'm just saying.


Do you work for nVidia?  You're spouting their BS extremely well for someone who doesn't.

CUDA - Are you a student or professional in the CAD market?  No?  So who cares?
DLSS - Is no better than FSR because you wouldn't be able to tell which is which if they weren't side by side with you looking at the screen through a magnifying glass at a game moving at 25% speed.  What an absolute joke, a joke that you have either fallen for or have a vested interest in.

It didn't take long for me to no longer be able to take you seriously between your nonsensical way of stringing words together coupled with your clear lack of gaming experience.  On to the list you go.  I prefer to forget that people like you exist.


----------



## Gica (Jan 4, 2023)

CUDA is everywhere, my friend.
For example, for you, in rendering: OptiX (RTX only) -> CUDA (nVidia cards only) -> OpenCL (AMD). If the program does not support OptiX, you can confidently use CUDA, which is much more efficient and better implemented than OpenCL.
I don't know what reviews you've read, but I must not be from this Earth if FSR=DLSS. Even if that were true, nVidia cards support both.

For example, I have the privilege of choosing what I want with an nVidia card in Cyberpunk, Shadow of the Tomb Raider, RDR 2 and many, many others.
In Doom Eternal, and not only there, you don't have the FSR option.
"The gentle lamb sucks from two sheep", a Romanian proverb that decided the winner of the great battle: 5700X versus 2070 Super.


----------



## N3M3515 (Jan 4, 2023)

Gica said:


> For example, I have the privilege of choosing what I want with an nVidia card


And that's the exact reason your comments have zero relevance on price; you're like the top 1% of the income bracket. Most people complaining hold one of these two points of view:

1. They have the money, hard-earned by working (money does not grow on trees), and do not agree with a $350 - $500 increase just because AMD or Nvidia wants to earn more.
2. They had their budget capped at $800, which always got them an x80 GPU, and now have to settle for x70 performance, because their income did not increase by 70%+.

This is not about AMD vs Nvidia anymore.

Hope you understand something very basic (common sense).


----------



## Gica (Jan 5, 2023)

Or
Proverb: I'm too poor to buy cheap if it means compromise.
You keep saying that I don't understand, but your blinders don't let you see that the market regulates everything. You complain too much that your old 256-line CRT costs less than the new 4K OLEDs.


----------



## kapone32 (Jan 5, 2023)

Gica said:


> Or
> Proverb: I'm too poor to buy cheap if it means compromise.
> You keep saying that I don't understand, but your blinders don't let you see that the market regulates everything. You complain too much that your old 256-line CRT costs less than the new 4K OLEDs.


The market, my ass. We are quick to forget that if mining were still viable, you would not be able to buy a 4080, and the 7900 XT is (was) not selling for the same reason: it was too expensive for the market. The hardest 7900 XTX to get will be the 3x8-pin variants, and I expect a $100-200 difference in price for those units. They must be selling, too, because the price has already risen by $300 on Newegg for, of all things, the Sapphire reference edition. You can expect a bump across the rest of the AIB cards at first as well; the Pulse is currently $70 more than the Nitro, which makes no sense, but those are the prices listed from when the cards launched.


----------



## Gica (Jan 5, 2023)

As demand for RX cards is high here, the difference between the 4080 and the 7900 XTX (the only available model) is only ~$60, not $200. I wouldn't look at $60 when the 4080 is clearly above the 7900 XTX at ... I won't repeat them, but they are features.
The market regulates everything, gentlemen. No one is forcing you to buy, and you won't have a heart attack if you don't. What the hell, are you North Koreans undercover?


----------



## kapone32 (Jan 5, 2023)

Gica said:


> As demand for RX cards is high here, the difference between the 4080 and the 7900 XTX (the only available model) is only ~$60, not $200. I wouldn't look at $60 when the 4080 is clearly above the 7900 XTX at ... I won't repeat them, but they are features.
> The market regulates everything, gentlemen. No one is forcing you to buy, and you won't have a heart attack if you don't. What the hell, are you North Koreans undercover?


Yes, yes, we know about those features. You seem to forget that not everyone has your specific use case in mind. This is a thread for the 7900 XTX. Don't you, with no specs listed, have a 4080?



Gica said:


> I don't know what reviews you've read, but I'm not from Earth if you have FSR=DLSS. Even if it were, nVidia cards support both.


Yep, thanks to Nvidia, that company that cares so much about gamers, as they keep showing us.


----------



## Gica (Jan 5, 2023)

As the magic of cheap fuel ended in 1970 (as if), maybe you should also inspect the other end of the glass: video cards are not expensive now; they were very cheap until now. The Cinderella story for only $600 is over. It remains to be seen how many will adapt to reality.


----------



## HTC (Jan 5, 2023)

Avro Arrow said:


> I also have an RX 6800 XT and I plan to keep using it for many years to come. It's a great performer and won't hit any VRAM limitations like my R9 Fury did. I think it'll be at least a decade before 16GB isn't enough, and that's assuming there will ever be a time when 16GB isn't enough. I say that because it's already more than enough for Unreal Engine 5, which is essentially photo-realistic. You can't get much better than that.
> 
> Actually, the RX 6800 XT would be the bottleneck there, especially at 1440p, not the R9-5900X. See, the RX 6800 XT is the performance equal of the RTX 3080, and TechPowerUp has these results for the R9-5900X with an RTX 3080. I'll use 1080p because the game is completely GPU-bottlenecked at 1440p:
> 
> ...



There's a flaw in your comparison: @W1zzard's tests are CUSTOM, and I don't know exactly how HU tests their games.

You can compare them directly ONLY WHEN all things are equal (except the processor, of course, in this specific case), but since they aren't, you can't.

Either you compare both tests from TPU or both tests from HU, and EVEN THEN it assumes ALL ELSE is the same (from drivers to Windows version, etc.).


----------



## Avro Arrow (Jan 6, 2023)

HTC said:


> There's a flaw in your comparison: @W1zzard's tests are CUSTOM, and I don't know exactly how HU tests their games.
> 
> You can compare them directly ONLY WHEN all things are equal (except the processor, of course, in this specific case), but since they aren't, you can't.
> 
> Either you compare both tests from TPU or both tests from HU, and EVEN THEN it assumes ALL ELSE is the same (from drivers to Windows version, etc.).


The differences between the test benches aren't big enough to be significant, so I don't worry about it.


----------

