
What is Radeon RX 7000 series for you? Please elaborate

What is Radeon RX 7000 series for you?

  • Success

    Votes: 49 30.8%
  • Disappointment

    Votes: 42 26.4%
  • So-so

    Votes: 68 42.8%

  • Total voters
    159
  • Poll closed.

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
I don't think they believe they can sell a $1,200-1,500 GPU, which is probably why they decided on a ~300 mm² GCD. Only the most die-hard AMD fans would likely buy it.

Exactly. A large die would have to sell for high prices, and they know that the people willing to shell out that much are more likely to buy Nvidia.

But AMD really wants to be seen as a premium brand. With their mindset it will not happen. Their mindset revolves around only one thing - make everything as cheap as possible - and in the process they make fatal mistakes: first it was HBM memory when it was not needed and/or imposed VRAM limitations (hello, 4 GB R9 Fury X, or the Radeon VII's crazy 1,024 GB/s memory throughput), then this memory cache (MCDs on separate chips) to compensate for somewhat lacking memory throughput.

It's like they have difficulty understanding how to design a proper graphics architecture to begin with...
 
  • Like
Reactions: N/A
Joined
Nov 26, 2021
Messages
1,769 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
But AMD really wants to be seen as a premium brand. With their mindset it will not happen. Their mindset revolves around only one thing - make everything as cheap as possible - and in the process they make fatal mistakes: first it was HBM memory when it was not needed and/or imposed VRAM limitations (hello, 4 GB R9 Fury X, or the Radeon VII's crazy 1,024 GB/s memory throughput), then this memory cache (MCDs on separate chips) to compensate for somewhat lacking memory throughput.

It's like they have difficulty understanding how to design a proper graphics architecture to begin with...
You're right with one exception: Radeon VII. It was derived from a compute product and for that purpose, its bandwidth was justified.
 
Joined
Sep 17, 2014
Messages
23,101 (6.10/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It isn't a technical limitation but rather an economic one, even if self-imposed.
It's really just common sense, as much as it is about not jumping headfirst into an unknown race with Nvidia to fight for an RT crown that's actually a randomly moving target.

Common sense, however, is not common these days. And most of big tech has lost all sense altogether; it's simply about profit maximization. Your stuff doesn't have to be better, you just buy the narrative that it is. In the current camp battle we have a whole marketing line set up around the availability of a software feature that is hardware specific and game specific and does not come with the box you buy it in. Without it, the entire lineup of cards is measurably and noticeably slower at a higher price. People are buying it.

Commercially, a lot of this is really AMD's fault for not doing better, and at the same time, you also have to applaud that down-to-earth approach. They make GPUs to run games on, first and foremost; everything else is 'we'll see when we get to it'. Now that RDNA3 actually does run all of it, you have to wonder how far away from a solid approach AMD really is. They have always, since the first RT-capable GPU, said they weren't going to go big on it until it landed in midrange. Well, with current-day Ada and its 8GB midrange, you can safely forget about that until 2025.

I'm honestly not seeing AMD being that far behind: sure, they can't meet the expectations of a 450W 4090, but beyond-3090 RT performance is hardly all that's worth looking at.

And what's more, they do it on a 529 mm² die where Nvidia needs over 600 mm² on a more expensive node. Strategically this really is a much better position than having to market DLSS3, because without it, it's really not going anywhere Ampere didn't already go.

But AMD really wants to be seen as a premium brand. With their mindset it will not happen. Their mindset revolves around only one thing - make everything as cheap as possible - and in the process they make fatal mistakes: first it was HBM memory when it was not needed and/or imposed VRAM limitations (hello, 4 GB R9 Fury X, or the Radeon VII's crazy 1,024 GB/s memory throughput), then this memory cache (MCDs on separate chips) to compensate for somewhat lacking memory throughput.

It's like they have difficulty understanding how to design a proper graphics architecture to begin with...
Cheaper means you have a better competitive outlook, especially if you do not have the premium / top-brand status. The result is that the cost per frame on AMD's end is very often lower, especially today.
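
For what it's worth, cost per frame is a trivial calculation from any review's price and average-FPS numbers. A minimal sketch below; the prices and frame rates are placeholder values purely to show the arithmetic, not measured results.

# Cost-per-frame sketch. Prices and average FPS are placeholder values
# for illustration only, not benchmark data.
cards = {
    "Card A": {"price_usd": 999.0, "avg_fps": 95.0},
    "Card B": {"price_usd": 1199.0, "avg_fps": 104.0},
}

for name, c in cards.items():
    cost_per_frame = c["price_usd"] / c["avg_fps"]
    print(f"{name}: ${cost_per_frame:.2f} per FPS of average performance")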
 
Last edited:

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
The result is that the cost per frame on AMD's end is very often lower, especially today.

I don't think that anyone here really cares about the "cost per frame". Gamers buy based on other considerations and never go deeper into these analyses, which are maybe job tasks for some trainees working at AMD themselves.
What really matters is sales, overall market share, and the overall (top) management, and to be honest, I would not be very optimistic about AMD's outlook, particularly in the graphics card business.
 
Joined
Feb 24, 2023
Messages
3,494 (4.95/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
I really wish AMD fans stopped playing this game of pretend that everything is fine and that Nvidia's not a billion light years ahead of them
I'm not an AMD fan; they disgust me even more than nVidia does.

The point is that the drivers are okay. The hardware is not. Prices are completely detached from reality. There is not a single RX 7000 card that deserves to be above $600. They are extremely behind in RT and perf/W - not one, but at least THREE generations behind Ada.

AMD pretending their next-gen graphics are actually next-gen and competitive, whilst having been light years behind nVidia in revenue for ages, is off. Shitposting prices as if their products were superior is just complete overkill. This is why I wish their CEOs the hardest reality check ever. It seems like neither Lisa nor Huang has had a reality check since the last century. But Huang's next-gen GPUs, at least, are superior to his last-gen GPUs in some ways, despite him trying hard to make this generation as bad as possible.
 
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
I'm not an AMD fan; they disgust me even more than nVidia does.

The point is that the drivers are okay. The hardware is not. Prices are completely detached from reality. There is not a single RX 7000 card that deserves to be above $600. They are extremely behind in RT and perf/W - not one, but at least THREE generations behind Ada.

AMD pretending their next-gen graphics are actually next-gen and competitive, whilst having been light years behind nVidia in revenue for ages, is off. Shitposting prices as if their products were superior is just complete overkill. This is why I wish their CEOs the hardest reality check ever. It seems like neither Lisa nor Huang has had a reality check since the last century. But Huang's next-gen GPUs, at least, are superior to his last-gen GPUs in some ways, despite him trying hard to make this generation as bad as possible.

Don't worry, I didn't target you specifically, I just threw that in the wind ;)

Drivers could be "okay" as in minimum stability, but they're definitely not in great shape. I agree with you otherwise, 100%.
 
Joined
Dec 31, 2020
Messages
1,089 (0.73/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
And what's more, they do it on a 529 mm² die where Nvidia needs over 600 mm² on a more expensive node. Strategically this really is a much better position than having to market DLSS3, because without it, it's really not going anywhere Ampere didn't already go.

What does that mean? Nvidia needs just under 379 mm², fully enabled with 10,240 CUDA cores, to gain the additional 5% needed to compete with Navi 31 - the 4080 Super.
So they don't even need the 100%-quality dies that cover only 2% of the wafer; they can comfortably rely on the partially defective dies that make up the other 98%.
And if they install GDDR7 on the 4080, it will blow Navi out of the water.
The 600 mm²+ die is severely bottlenecked by the memory.
In the case of the 3070 / 4070, the latter got 12% more bandwidth; from the 10,240-CUDA 3080 Ti to the 4080, bandwidth got downgraded by 22%, from 912 to 716 GB/s,
and the 4090 has over 50% more CUDA cores than the 3090 at essentially the same bandwidth - that can't be good.
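
To put rough numbers on that bandwidth-versus-shader-count point, here's a quick sketch using commonly listed spec-sheet figures (treat them as approximate; it's the ratios that matter, not the decimals).

# Memory bandwidth per 1,000 CUDA cores, from public spec-sheet figures
# (approximate values, listed only to illustrate the Ampere -> Ada shift).
specs = {
    "RTX 3070":    {"cuda": 5888,  "bw_gb_s": 448},
    "RTX 4070":    {"cuda": 5888,  "bw_gb_s": 504},
    "RTX 3080 Ti": {"cuda": 10240, "bw_gb_s": 912},
    "RTX 4080":    {"cuda": 9728,  "bw_gb_s": 717},
    "RTX 3090":    {"cuda": 10496, "bw_gb_s": 936},
    "RTX 4090":    {"cuda": 16384, "bw_gb_s": 1008},
}

for name, s in specs.items():
    per_thousand = s["bw_gb_s"] / (s["cuda"] / 1000)
    print(f"{name}: {per_thousand:.0f} GB/s per 1,000 CUDA cores")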
 
Joined
Jul 18, 2016
Messages
520 (0.17/day)
System Name Gaming PC / I7 XEON
Processor I7 4790K @stock / XEON W3680 @ stock
Motherboard Asus Z97 MAXIMUS VII FORMULA / GIGABYTE X58 UD7
Cooling X61 Kraken / X61 Kraken
Memory 32gb Vengeance 2133 Mhz / 24b Corsair XMS3 1600 Mhz
Video Card(s) Gainward GLH 1080 / MSI Gaming X Radeon RX480 8 GB
Storage Samsung EVO 850 500gb ,3 tb seagate, 2 samsung 1tb in raid 0 / Kingdian 240 gb, megaraid SAS 9341-8
Display(s) 2 BENQ 27" GL2706PQ / Dell UP2716D LCD Monitor 27 "
Case Corsair Graphite Series 780T / Corsair Obsidian 750 D
Audio Device(s) ON BOARD / ON BOARD
Power Supply Sapphire Pure 950w / Corsair RMI 750w
Mouse Steelseries Sesnsei / Steelseries Sensei raw
Keyboard Razer BlackWidow Chroma / Razer BlackWidow Chroma
Software Windows 1064bit PRO / Windows 1064bit PRO
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
What does that mean? Nvidia needs just under 379 mm², fully enabled with 10,240 CUDA cores, to gain the additional 5% needed to compete with Navi 31 - the 4080 Super.
So they don't even need the 100%-quality dies that cover only 2% of the wafer; they can comfortably rely on the partially defective dies that make up the other 98%.
And if they install GDDR7 on the 4080, it will blow Navi out of the water.
The 600 mm²+ die is severely bottlenecked by the memory.
In the case of the 3070 / 4070, the latter got 12% more bandwidth; from the 10,240-CUDA 3080 Ti to the 4080, bandwidth got downgraded by 22%, from 912 to 716 GB/s,
and the 4090 has over 50% more CUDA cores than the 3090 at essentially the same bandwidth - that can't be good.

A fully enabled AD103 is not necessary to achieve this, as the 4080 is already equal to or better than Navi 31, save for a rare few occasions that heavily favor the RDNA design. The raw memory bandwidth is lower in Ada cards, but due to the much larger L2 cache, the performance is actually quite a bit stronger - and I'd know, as I came from the 3090 to the 4080 and there's basically no workload in which the 4080 doesn't slap it silly. Ironically, this is a trick from AMD's playbook: it's how Navi 21, despite its anemic 512 GB/s of bandwidth, could keep up with the 3090's 936 GB/s and even the 3090 Ti's >1 TB/s.
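
As a rough illustration of the large-cache point: if a fraction of memory requests hit the on-die cache, only the misses go out to VRAM, so the apparent bandwidth scales by roughly 1 / (1 - hit rate). The hit rates below are invented for illustration, not measured figures.

# Back-of-the-envelope model: effective bandwidth ~= raw VRAM bandwidth
# divided by the cache miss rate. Hit rates here are assumed, not measured.
def effective_bandwidth(raw_gb_s: float, cache_hit_rate: float) -> float:
    return raw_gb_s / (1.0 - cache_hit_rate)

# Example: a 717 GB/s card with a hypothetical 50% hit rate behaves like
# ~1434 GB/s of raw bandwidth for cache-friendly workloads.
print(effective_bandwidth(717, 0.50))
print(effective_bandwidth(512, 0.40))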

The RTX 4080 is faster in RT (accounting for W1zz's newest review of the Taichi White, on more recent drivers and with a strong aftermarket design to help the 7900 XTX's case), has tremendously faster AI acceleration (from Tom's - the 4070 Ti slaps it silly even with the XDNA tech in N31), and is better at content creation (Puget Systems' content creation benchmarks). On average, the 7900 XTX is about 15% slower than the 4080, lacks all of its ecosystem features (I won't say DLSS 3 is a make-or-break feature, but AMD hasn't delivered FSR 3.0 yet - they're holding it up), lacks creator-focused studio drivers, doesn't support innovative optimizations like SER, and has lower power efficiency in general (again from W1zz's review of the Taichi White). Regarding production work, I also have to note that despite the 7900 XTX's rather strong DaVinci performance, its HW decoder can't handle 4:2:2 video files, so that kind of excludes it by default if you're working with high-end video cameras' output. Otherwise, it's decent for video now - thank the Lord AMD worked on this.

In general, the 7900 XTX is about a match for the previous-generation RTX 3090 Ti when put through an all-round test. The 4090 is simply in a league of its own right now, adding another 20-25% on average on top of the 4080. It's not a bad card, but it is very much in third place right now.
 
Last edited:
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
I think this article needs to be shown here to make the discussion better.

Yes, I'm well aware that the idle power issue has mostly been patched by AMD by now (it is still higher than the 4080's, but it always will be; this can't be helped, the MCDs simply use more power), but that doesn't change the general idea - and remember, the 4080 is technically a 320W card versus a 355W card.

It's just this: even if one plugs their ears and ignores everything but rasterized games, where RDNA 3 does more or less well (an astounding 2% faster at a 10% higher power target in W1zz's suite), it's just not on the same level. Who knows how it will age, but by then, does that really matter? The people who bought their Nvidia cards enjoyed them through their prime while those who bought 7900 XTXs waited for patches and bug fixes (see: the Ratchet and Clank issues - not the first time and it won't be the last), so really, I don't see the point. Ultimately, this last point is why I ended up not getting the XTX, and I had it in my cart when I decided not to.
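
As a quick sanity check on that "2% faster at a 10% higher power target" remark, the implied efficiency delta is simple division (the percentages are just the ones quoted above, not an independent measurement).

# Relative perf/W implied by the figures quoted above:
# +2% performance at +10% power.
relative_performance = 1.02
relative_power = 1.10
relative_perf_per_watt = relative_performance / relative_power
print(f"Relative perf/W: {relative_perf_per_watt:.2f} "
      f"(~{(1 - relative_perf_per_watt) * 100:.0f}% worse)")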
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
Chipsandcheese has an article about Navi 31.

Navi 31's theoretical memory throughput is 960 GB/s (default, up to 1056 GB/s when OCed), but the tests show its maximum is around 750-790 GB/s.

Is this the reason for the worsening performance when scaling up screen resolutions?
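
The 960 GB/s figure falls straight out of bus width and memory speed. A quick sketch, assuming the commonly listed 384-bit bus with 20 Gbps GDDR6 for the reference card (22 Gbps would account for the overclocked figure):

# Theoretical GDDR6 bandwidth = bus width (bits) / 8 * data rate (Gbps).
# The 384-bit bus and the 20 / 22 Gbps data rates are assumed spec values.
def gddr_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(gddr_bandwidth_gb_s(384, 20))  # 960.0 GB/s at stock
print(gddr_bandwidth_gb_s(384, 22))  # 1056.0 GB/s, the overclocked figure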

From VRAM, both AMD architectures enjoy very good bandwidth at low occupancy. RDNA 3 starts with a small advantage that gets larger as more WGPs come into play. From another perspective, RDNA 2’s 256-bit GDDR6 setup could be saturated with just 10 WGPs. RDNA 3’s bigger VRAM subsystem can feed more WGPs demanding full bandwidth. Nvidia has more trouble with VRAM bandwidth if only a few SMs are loaded, but takes a lead at higher occupancy.

[Attached chart from the Chips and Cheese article: VRAM bandwidth vs. number of active WGPs/SMs]

 
Joined
Nov 15, 2020
Messages
955 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
I have used both the XTX and XT 7900s. The thing about those cards is what they are for. If you play at 1440p and had a 6800XT, you might be whelmed by the 7900 series, but if you have a 4K 144Hz screen, prepare to be blown away. The 20 and 24 GB of VRAM are excellent and the driver updates have refined the cards. There is no game in which those cards cannot provide high refresh rates at this point. I have even gotten some of the new games that everyone complains about and wonder what all the noise is about. The other thing is that upscaling means nothing to me, and the way the GPUs (like the 6000 series) maintain boost clocks in modern games is great. Even games that are CPU bound like Grip (1 core) still produce 140+ FPS with the GPU at 99%.

The funniest thing about the 7900XT is that in my country it is $100 more than the 6900XT, $400 cheaper than the 7900XTX and a whopping $1200 less than the 4090, while the 4070 Ti is within $50 of it with 8 GB less VRAM. Obviously the 4060 Ti 16GB has demonstrated that the GPU must be up to snuff to take advantage of the VRAM buffer, but I am happy with my 20GB frame buffer, and it is pushed by my CPU as well in modern games.

The AIB cards are all great at OC. One of the truths is that AMD did allow AIBs to gain an advantage over the reference models by allowing for higher boost clocks. You can expect owners of Gigabyte, ASRock, Asus, MSI and Sapphire cards to all have 3 GHz OCs. As much as people may think that chiplets are a pain, you can also consider that the first 3 months of 2023 were spent integrating those chiplets with the unified GPU driver, as it has been shown that even mobile GPUs use the same driver (6600, 6600M).

To be brutally honest, this system I have has me addicted to gaming, and the colours, resolution, refresh rate and pixel fill rate all have something to do with it. I had a 5800X3D and 6800XT with 3600 14-15... RAM, and from the day I installed my X670E system I have not looked back. You can search my posts and see that about 2 weeks ago I pulled the MB I had (X570S Ace Max) out of the closet and did an RMA for it to go into my HTPC. Yes, the 7900XT and 7900X3D made someone like me put arguably the best AM4 board in storage for 6 months.

The narrative does affect AMD though. The common ones are now ridiculous.

1. Drivers are problematic. Truth: AMD drivers are rock solid, and out of the last 20 driver releases perhaps 3 have caused issues.
2. Software is not as good as Nvidia's. Truth: AMD software is the best of all 3 GPU makers and will make you a better PC user if you take the time to use it right. I bought a 3060-based laptop for my 50th and could not believe the software package has looked exactly the same since the GTS 450.
3. Ray tracing is not as good. Truth: The 7900XTX is just as fast at RT as a 3090 and about $400 cheaper too. Other than the 4090 (which is $2500 on sale), no other Nvidia card is faster in RT, so.
4. Power draw is high. Truth: How many pixels a second does 4K require to be put on screen? It consumes about 1/3 more power than a 6800XT for a 40-70% improvement at 4K.
5. OC of the GPU is dead. Truth: When you see your GPU singing at 3 GHz with literally one click, it is great. If you bought a 6500 or 6x50 GPU, you knew that they provided another 12-15% over the 6x00 cards thanks to a higher boost clock, so the 7000 series has some of that sauce too.
As to point 4 (power consumption), that may now be fixed. Check here: AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled | TechPowerUp

I'm not an AMD fan; they disgust me even more than nVidia does.

The point is that the drivers are okay. The hardware is not. Prices are completely detached from reality. There is not a single RX 7000 card that deserves to be above $600. They are extremely behind in RT and perf/W - not one, but at least THREE generations behind Ada.

AMD pretending their next-gen graphics are actually next-gen and competitive, whilst having been light years behind nVidia in revenue for ages, is off. Shitposting prices as if their products were superior is just complete overkill. This is why I wish their CEOs the hardest reality check ever. It seems like neither Lisa nor Huang has had a reality check since the last century. But Huang's next-gen GPUs, at least, are superior to his last-gen GPUs in some ways, despite him trying hard to make this generation as bad as possible.
"But Huang's next-gen GPUs, at least, are superior to last-gen GPUs in some ways despite trying hard to make this generation as bad as possible"
Are they? Look at the full stack. Starting from the 4070 Ti and down, they all needed a rename to a lower tier of performance. The 4070 Ti should have been the 4070, etc. Cheaping out on VRAM was not good for this generation. Other than the 4090, which is a halo product, the pricing is abominable throughout.
 
Joined
Aug 13, 2010
Messages
5,501 (1.04/day)
I had the chance to work with both the RX 7900 XT and RX 7900 XTX very early.

I voted disappointment not because of performance, not even because of the higher power consumption these GPUs exhibit under mild loads compared to the more tame NVIDIA products.
My disappointment comes from AMD completely giving up on its underdog mentality when the market needed it the most. "Showing them how it's done" simply does not exist for the Radeon brand, at a time when NVIDIA is probably at its most dominant market position in its history.

We got new products that are priced way beyond the products they replaced, with the rude mention of the RX 6950 XT's COVID-inflated launch price as some sort of anchor for why the RX 7900 XTX was priced like it was.
Working with the RX 7600 thinking it was priced at 299 USD, only to learn in the 24 hours before the review went up that it would be priced slightly lower, had repercussions; it was a price drop that told us how unconfident AMD was in its strategy. The RX 7900 XT's 100-dollar MSRP distance from the XTX model was just salt in open wounds.

I want Radeon's old spunk and market disruption back, even if it means using hardware that may lack some of the features the competitor uses. No - not even despite that, but because of it. Show us you can come out with a new product that entirely phases out old inventory and competes aggressively, instead of overpricing at first and then making the slowest backtrack in product pricing ever. The current economy has AMD in a stranglehold.
 
Joined
Feb 24, 2023
Messages
3,494 (4.95/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Look at the full stack. Starting from the 4070 Ti and down, they all needed a rename to a lower tier of performance.
True that. What I wanted to say is Ada is much better than Ampere in terms of performance per watt. 4070 aka 4080, 4060 Super/Ti aka 4070 Ti aka cancelled 4080 12 GB, 4060 aka 4070, 4050 aka 4060 Ti, 4040 series aka 4060... That's also where Ada is superior to Ampere. Superior in being overpriced and put in the wrong rank XD
 
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
I want Radeon's old spunk and market disruption back, even if it means using hardware that may lack some of the features the competitor uses. No - not even despite that, but because of it. Show us you can come out with a new product that entirely phases out old inventory and competes aggressively, instead of overpricing at first and then making the slowest backtrack in product pricing ever. The current economy has AMD in a stranglehold.

You already have this, sans the spunk and defiant attitude, I'm afraid. After all, why undercut if it's not going to sell anyway?
 
Joined
Aug 13, 2010
Messages
5,501 (1.04/day)
You already have this, sans the spunk and defiant attitude, I'm afraid. After all, why undercut if it's not going to sell anyway?
That's a good question, but I'm afraid not too much can be done - nothing besides cutting product prices to the point where you have to wonder whether you should make them in the first place. I can't see the RX 7900 XT at 649 USD happening fast enough.
 
Joined
Jun 2, 2017
Messages
9,689 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I had the chance to work with both the RX 7900 XT and RX 7900 XTX very early.

I voted disappointment not because of performance, not even because of the higher power consumption these GPUs exhibit under mild loads compared to the more tame NVIDIA products.
My disappointment comes from AMD completely giving up on its underdog mentality when the market needed it the most. "Showing them how it's done" simply does not exist for the Radeon brand, at a time when NVIDIA is probably at its most dominant market position in its history.

We got new products that are priced way beyond the products they replaced, with the rude mention of the RX 6950 XT's COVID-inflated launch price as some sort of anchor for why the RX 7900 XTX was priced like it was.
Working with the RX 7600 thinking it was priced at 299 USD, only to learn in the 24 hours before the review went up that it would be priced slightly lower, had repercussions; it was a price drop that told us how unconfident AMD was in its strategy. The RX 7900 XT's 100-dollar MSRP distance from the XTX model was just salt in open wounds.

I want Radeon's old spunk and market disruption back, even if it means using hardware that may lack some of the features the competitor uses. No - not even despite that, but because of it. Show us you can come out with a new product that entirely phases out old inventory and competes aggressively, instead of overpricing at first and then making the slowest backtrack in product pricing ever. The current economy has AMD in a stranglehold.
In the strange world we live in, my 7900XT was actually about $300 less than my 6800XT. I see it differently: even though AMD has tried to abandon us for dGPUs, their 6000 series cards are selling quite well due to the reduction in price across most of the stack. I wish the 6700XT was 300 Canadian, but then you wouldn't be able to buy one; all the other cards below that (not the 6700) are strictly for 1080p high.

Unfortunately, those days of $99 GPUs that beat Nvidia's $200 offering are gone. I remember buying a 7950XT for $199 and just going wow, as it was the Sapphire Vapor-X variant. The 6500XT is actually a great 1080p card but should be that $99 GPU. I can say though that I have never been happier with my machine. I do lament CrossFire and PCIe lanes on MBs though.
 
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
That's a good question, but I'm afraid not too much can be done - nothing besides cutting product prices to the point where you have to wonder whether you should make them in the first place. I can't see the RX 7900 XT at 649 USD happening fast enough.

And that's sad, because even with all the shortcomings, at 649 this would be an excellent product. Just not at 900 or thereabouts.
 
Joined
Aug 13, 2010
Messages
5,501 (1.04/day)
And that's sad, because even with all the shortcomings, at 649 this would be an excellent product. Just not at 900 or thereabouts.
We're not too far off with some of the models that go for 749-779 USD at the moment.
I don't think these prices are bad, but they only serve to tip people who are already on the fence towards the red side, rather than convince the average Joe that this choice is better than the competing product.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
The Radeon RX 6800 XT launched almost 3 years ago for $650, and we have yet to see its proper next-gen replacement.
Today the only thing you can buy for this money ($650) is the Radeon RX 6950 XT, which is barely (15-ish%) faster.

The Radeon RX 7900 XT, which is the card that could replace the RX 6800 XT, launched way too overpriced, and we don't know when it'll fall to its predecessor's price point.
 
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
We're not too far off with some of the models that go for 749-779 USD at the moment.
I don't think these prices are bad, but they only serve to tip people who are already on the fence towards the red side, rather than convince the average Joe that this choice is better than the competing product.

In America, perhaps, but the prices, while lower than Nvidia's, have been nowhere near that appetizing everywhere else, I'm afraid.

The Radeon RX 6800 XT launched almost 3 years ago for $650, and we have yet to see its proper next-gen replacement.
Today the only thing you can buy for this money ($650) is the Radeon RX 6950 XT, which is barely (15-ish%) faster.

The Radeon RX 7900 XT, which is the card that could replace the RX 6800 XT, launched way too overpriced, and we don't know when it'll fall to its predecessor's price point.

The 7900 XT is its replacement. AMD knocked the Navi 31 SKUs up a tier to justify the increased MSRP on this SKU specifically, foreseeing it being popular for the same reason the 6800 XT was.

It didn't pan out that way, and it didn't help that they missed every single performance expectation for this silicon.

Things continued to look bad with Navi 33, and this time they answered with a series of pre-launch, last-minute price cuts that were enacted on the hour, after reviews were already done and filmed, as we've come to know from GN and HUB - from 329 (earliest rumors) to 299 to 269... Not only that, but since this new graphics card did not provide ANY measurable improvement over its predecessor, they've also knocked it DOWN an SKU so they could compare it to the cut-down RX 6600, creating an even wider gap in the product stack that still goes unfilled as of today. The truth remains that the RX 7600 is a replacement for the RX 6650 XT as a fully enabled processor. It cannot get faster than that; any eventual 7600 XT would need to be carved out of Navi 32.
 
Joined
Nov 26, 2021
Messages
1,769 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
In America, perhaps, but the prices, while lower than Nvidia's, have been nowhere near that appetizing everywhere else, I'm afraid.



The 7900 XT is its replacement. AMD knocked the Navi 31 SKUs up a tier to justify the increased MSRP on this SKU specifically, foreseeing it being popular for the same reason the 6800 XT was.

It didn't pan out that way, and it didn't help that they missed every single performance expectation for this silicon.

Things continued to look bad with Navi 33, and this time they answered with a series of pre-launch, last-minute price cuts that were enacted on the hour, after reviews were already done and filmed, as we've come to know from GN and HUB - from 329 (earliest rumors) to 299 to 269... Not only that, but since this new graphics card did not provide ANY measurable improvement over its predecessor, they've also knocked it DOWN an SKU so they could compare it to the cut-down RX 6600, creating an even wider gap in the product stack that still goes unfilled as of today. The truth remains that the RX 7600 is a replacement for the RX 6650 XT as a fully enabled processor. It cannot get faster than that; any eventual 7600 XT would need to be carved out of Navi 32.
I doubt they will use Navi 32 for a 7600 XT. The 7600 is clocked lower than the silicon is capable of to meet a power target of 150 W. Clocking it higher à la the 6650 XT and equipping it with 20 Gbps memory would be sufficient for a 7600 XT. Just like Nvidia, they will choose to redefine the performance gap between the 7600 and the XT rather than use a bigger GPU for a 7600 XT.
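
For a sense of scale, the memory-speed bump alone is easy to quantify; a small sketch below, assuming the 7600's 128-bit bus and its stock 18 Gbps GDDR6 (spec-sheet figures, so treat them as approximate).

# Bandwidth gain from faster GDDR6 on an (assumed) 128-bit bus:
# stock 18 Gbps versus the 20 Gbps suggested above.
bus_width_bits = 128
stock_gb_s = bus_width_bits / 8 * 18   # 288 GB/s
faster_gb_s = bus_width_bits / 8 * 20  # 320 GB/s
print(f"{stock_gb_s:.0f} GB/s -> {faster_gb_s:.0f} GB/s "
      f"(+{(faster_gb_s / stock_gb_s - 1) * 100:.0f}%)")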
 
Joined
Dec 25, 2020
Messages
7,425 (4.96/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) RTX A2000 (soon: Palit GeForce RTX 5090 GameRock)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
I doubt they will use Navi 32 for a 7600 XT. The 7600 is clocked lower than the silicon is capable of to meet a power target of 150 W. Clocking it higher à la the 6650 XT and equipping it with 20 Gbps memory would be sufficient for a 7600 XT. Just like Nvidia, they will choose to redefine the performance gap between the 7600 and the XT rather than use a bigger GPU for a 7600 XT.

That could be, but it would inevitably come at the cost of a much higher power footprint, in the most sensitive slice of the market. That'd be an oof.
 
Joined
Feb 24, 2023
Messages
3,494 (4.95/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Clocking it higher à la the 6650 XT and equipping it with 20 Gbps memory would be sufficient for a 7600 XT
The 3070 Ti, which is a couple percent faster than the 3070 whilst eating a whole hundred watts more: "Finally, a competition!"
 