
Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

Joined
Dec 25, 2020
Messages
6,782 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
If it has flagship or very near flagship performance, it is a lite version of the flagship.

A quality which can be bestowed upon the RTX 3080 all the same. But no.
 
Joined
Jun 2, 2017
Messages
9,174 (3.35/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech G7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64; Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
A quality which can be bestowed upon the RTX 3080 all the same. But no.
Can you even use a 3080 for 4K gaming with new games?
 
Joined
Dec 25, 2020
Messages
6,782 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Can you even use a 3080 for 4K gaming with new games?

You cannot use a 6800 XT either, since it performs so poorly. Despite having only 10 GB, the 3080 still outperforms it. From W1zz's latest game performance review:

 

SailorMan-PT

New Member
Joined
Jul 7, 2024
Messages
13 (0.09/day)
Location
Stuttgart
@SailorMan-PT - please post in English, or your next posts will be removed. This is an English speaking forum, thank you.

Please post in English (for writing in German).
Does that mean that my jaw has to drop in front of the 4090's performance with no consideration towards its price or power consumption? Sorry, not gonna happen.

Since you mentioned it, even the 7900 XTX is way above the limit of what I consider a sensible price for a GPU. That's why I'm not affected by AMD's decision of not making a halo RDNA 4 GPU. If they can pull off a decent midrange card, I'm game.
Well, after watching and studying several benchmarks, there are cards that serve the mid-range market. In terms of technical value relative to price, I would consider the AMD Radeon GPUs better, because of their advantage in VRAM capacity and memory bus width: 16 GB + 256-bit vs. 12 GB + 192-bit.
Those are the Radeon RX 7800 XT and the RX 7900 GRE. I don't think I have to mention the respective NVIDIA GPUs, since they're well known.
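For what it's worth, that bus-width difference maps directly onto bandwidth. A minimal sketch, taking the RTX 4070 as the implied 12 GB/192-bit comparison (an assumption, since the NVIDIA cards were left unnamed) and using the commonly published per-pin data rates as illustrative inputs:

```python
# Rough sketch: effective memory bandwidth = (bus width in bytes) * data rate.
# The per-pin data rates below are commonly published figures, used here
# for illustration rather than taken from a spec sheet.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin  # bytes per transfer * gigatransfers/s

print(f"RX 7800 XT (256-bit, 19.5 Gbps): {bandwidth_gb_s(256, 19.5):.0f} GB/s")
print(f"RTX 4070   (192-bit, 21.0 Gbps): {bandwidth_gb_s(192, 21.0):.0f} GB/s")
```

On those numbers it works out to roughly 624 GB/s versus 504 GB/s in the 7800 XT's favour.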



@SailorMan-PT - please post in English, or your next posts will be removed. This is an English speaking forum, thank you.

Please post in English.
Yeah, I understand, but my smartphone translated the English into German. When I typed the reply, it appeared in German, so I thought I had written it in English. It was one big misunderstanding.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
AMD may lose a golden opportunity to beat Nvidia this year

A year and a half after the launch of RDNA 3, AMD’s graphics card lineup has grown a little stagnant — as has Nvidia’s. We’re all waiting for a new generation, and according to previous leaks, AMD was getting ready to release RDNA 4 later this year. Except that now, we’re hearing that it might not happen until CES 2025, which is still six months away.
Launching the new GPUs in the first quarter of 2025 is a decision that could easily backfire, and it’s never been more important for AMD to get the timing right. In fact, if AMD really decides to wait until January 2025 to unveil RDNA 4, it’ll miss out on a huge opportunity to beat Nvidia.
...
But AMD’s gaming segment appears to be bleeding money. Its gaming revenue dropped by 48% year-over-year, and even AMD itself doesn’t expect it to get better.
Jean Hu, AMD’s CFO, recently talked about how the subpar sales of its Radeon GPUs affected the gaming segment in a big way. The company predicts that the revenue in that segment will continue to decline. Where’s AMD going to make money then? It’s simple: From its data center and client segment.

 

TeamMe

New Member
Joined
May 24, 2023
Messages
8 (0.01/day)
NVIDIA ray tracing is a con. I mean, even Cyberpunk on a 4090 looks nowhere near as good as the CGI in old movies like The Lord of the Rings. AMD shouldn't bother with ray tracing; even on NVIDIA, the performance degradation is too big. AMD should keep on being a Rasta man and use HDR like The Last of Us Part I on PC. AMD should focus on getting higher polygon-count models and more detailed textures…
 
Joined
Dec 25, 2020
Messages
6,782 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
NVIDIA ray tracing is a con. I mean, even Cyberpunk on a 4090 looks nowhere near as good as the CGI in old movies like The Lord of the Rings. AMD shouldn't bother with ray tracing; even on NVIDIA, the performance degradation is too big. AMD should keep on being a Rasta man and use HDR like The Last of Us Part I on PC. AMD should focus on getting higher polygon-count models and more detailed textures…

Following this absurd logic, every single API feature developed in the past 20 years is a con, since you can get 70% of the job done with DirectX 9.0c anyway.
 
Joined
Feb 21, 2006
Messages
2,225 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21:9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
You cannot use a 6800 XT either, since it performs so poorly. Despite having only 10 GB, the 3080 still outperforms it. From W1zz's latest game performance review:

That is at max settings.

If you drop them, you can use a 3080 or a 6800 XT at 4K.
 
Joined
Dec 25, 2020
Messages
6,782 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That is at max settings.

If you drop them, you can use a 3080 or a 6800 XT at 4K.

That much seems obvious. The argument there (in this necro'd thread) was that the 6800 XT is superior for 4K gaming; it's not.
 

TeamMe

New Member
Joined
May 24, 2023
Messages
8 (0.01/day)
Following this absurd logic, every single API feature developed in the past 20 years is a con, since you can get 70% of the job done with DirectX 9.0c anyway.
The effect/difference ray tracing adds is subtle, whereas the performance degradation is anything but; that statement is beyond reproach…
 
Joined
Dec 25, 2020
Messages
6,782 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
The effect/difference ray tracing adds is subtle, whereas the performance degradation is anything but; that statement is beyond reproach…

I didn't specifically mention RT to begin with, but I disagree. It's not something that can be truly appreciated unless you're already playing at 4K or higher, IMO. The atmospherics in Metro Exodus, for example, or more recently the full path tracing in Wukong: these look absolutely phenomenal on the right setup.

At the end of the day, people make a big deal of RT because their hardware can't handle it and they cannot afford a suitable upgrade.
 
Joined
Oct 12, 2005
Messages
708 (0.10/day)
I disagree. It's not something that can be truly appreciated unless you're already playing at 4K or higher, IMO. The atmospherics in Metro Exodus, for example, or more recently the full path tracing in Wukong: these look absolutely phenomenal on the right setup.

At the end of the day, people make a big deal of RT because their hardware can't handle it and they cannot afford a suitable upgrade.
It's great advice to actually play something instead of just peeping at screenshots.

The best-looking thing we could do for a screenshot is very high-quality baked lighting that isn't dynamic at all. It doesn't matter; screenshots don't move. This is also why so many old games still look good and seem to have good lighting in screenshots: it's all static.

When you get to dynamic lighting, the non-RT techniques have many flaws. They can look somewhat good, but they fall apart easily. RT does a much better job in those scenarios. It makes the lighting much more realistic, which at the same time makes it less obvious. That's a strange thing, but it's the case. As said previously, it's a style you go for.

Right now, I think the main problem with path tracing and RT in general is not really the performance impact; it's the quality of the denoiser. Movies just brute-force that issue with many more rays and heavy offline denoising algorithms.

DLSS 3.5 is a bit better on that front, but it still has many flaws. A quality denoiser will be key to making RT look like what we see in movies.

As for the performance impact, it's just the current impact. Most of the shaders we use today would destroy the first few generations of GPUs that supported them. In two or three generations, even mid-range cards should have plenty of power to run RT at 1080p.
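To put the rays-versus-denoiser point in concrete terms, here's a toy sketch in Python (purely illustrative, not a renderer; the "scene" is just averaging uniform random numbers): Monte Carlo noise falls roughly as 1/sqrt(N), which is why offline film renderers can brute-force the problem with thousands of rays per pixel while real-time RT gets a handful of rays and has to lean on the denoiser.

```python
# Toy illustration (not a renderer): Monte Carlo noise vs. rays per pixel.
# The "lighting integral" here is just averaging uniform random samples
# around a true value of 0.5; the stddev of the estimate shrinks ~ 1/sqrt(N).
import random
import statistics

def render_pixel(samples: int) -> float:
    # Average N random "ray contributions" for one pixel
    return sum(random.random() for _ in range(samples)) / samples

for n in (1, 4, 64, 1024):
    estimates = [render_pixel(n) for _ in range(1000)]
    print(f"{n:5d} rays/pixel -> noise (stddev) ~ {statistics.stdev(estimates):.4f}")
```

Quadrupling the ray count only halves the noise, which is exactly the scaling wall that makes a good denoiser cheaper than more rays.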
 
Joined
Jun 11, 2019
Messages
616 (0.31/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
If MLID or RGT say something good, then at least 80% of the time it's guaranteed to be late and slow again. Only when these two declare that AMD has lost it should we all start waiting for AMD to pull out a card of 9700XT/7970 levels of greatness.
 
Joined
Feb 21, 2006
Messages
2,225 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21:9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
the 6800 XT is superior for 4K gaming; it's not.
Neither of them is good for 4K without dropping settings, so you are correct.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
If MLID or RGT say something good, then at least 80% of the time it's guaranteed to be late and slow again. Only when these two declare that AMD has lost it should we all start waiting for AMD to pull out a card of 9700XT/7970 levels of greatness.

The current situation is so bad anyway, because the current lineup is already 2 years old and needs optimisations and updates for 2024/2025 usage.
The only thing that matters now is to release something (anything) new, no matter if it's good or bad, then adjust pricing accordingly, so that at least some stock moves off the warehouse shelves.
 
Joined
Feb 20, 2019
Messages
8,285 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The current situation is so bad anyway, because the current lineup is already 2 years old and needs optimisations and updates for 2024/2025 usage.
The only thing that matters now is to release something (anything) new, no matter if it's good or bad, then adjust pricing accordingly, so that at least some stock moves off the warehouse shelves.
Why are you so bad at super-basic, incredibly easy-to-check facts, Arf?

[Attached: table of AMD, NVIDIA, and Intel GPU launch dates]


Nothing in the current AMD 7000-series lineup is 2 years old yet; most of it isn't even 1 year old yet, and the AMD 7000-series is younger than the RTX 40-series and the Intel Arc series.
 
Joined
Jan 14, 2019
Messages
12,341 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Why are you so bad at super-basic, incredibly easy-to-check facts, Arf?

[Attached: table of AMD, NVIDIA, and Intel GPU launch dates]

Nothing in the current AMD 7000-series lineup is 2 years old yet; most of it isn't even 1 year old yet, and the AMD 7000-series is younger than the RTX 40-series and the Intel Arc series.
Also, what if it's old? What matters is that it still works, right?
 
Joined
Aug 26, 2021
Messages
377 (0.32/day)
That much seems obvious. The argument there (in this necro'd thread) was that the 6800 XT is superior for 4K gaming; it's not.
I own a 6800 XT with the snot overclocked out of it. It benchmarks faster than a 7900 GRE, and what you are saying is correct.
 
Joined
Sep 17, 2014
Messages
22,453 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I own a 6800 XT with the snot overclocked out of it. It benchmarks faster than a 7900 GRE, and what you are saying is correct.
Throw a UE5 game like Black Myth at a 4090 and it's not a real 4K killer either... it manages, at best ;) OTOH, throw Minesweeper at a 6800 XT and it'll do 4K120...

4K cards don't exist and never have, nor will they. There is only a game and its performance. GPUs move along doing what they can, and especially now with dynamic 'anything' in graphics... if you put up with higher latency and lower IQ, you can have 4K on many cards...
 
Joined
Aug 26, 2021
Messages
377 (0.32/day)
Throw a UE5 game like Black Myth at a 4090 and it's not a real 4K killer either... it manages, at best ;) OTOH, throw Minesweeper at a 6800 XT and it'll do 4K120...

4K cards don't exist and never have, nor will they. There is only a game and its performance. GPUs move along doing what they can, and especially now with dynamic 'anything' in graphics... if you put up with higher latency and lower IQ, you can have 4K on many cards...
True, it's a weird situation; hardware currently seems off. I'm hoping to upgrade to Blackwell, but I'd want 30% more performance than a 4090 to feel comfortable parting with the cash a 5090 will cost. The 6800 XT is a hell of a card, so much so that I kept it after going through three 7900 XTX cards that all had issues with drivers and latency. Software is too important, and my 6800 XT and four months of hell with the 7900 XTX have shown me that AMD still sucks in that department; the Adrenalin software is terrible in so many ways. I'm going back to NVIDIA; the premium is worth it IMO. But I'll be putting my 6800 XT on the wall with my other favourite cards.
 
Joined
Feb 20, 2019
Messages
8,285 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
20 months old, give or take 3-4 months, while the design process can be traced back at least 3 or 4 years earlier.
So, a 2018 thing. :D
Yes, I literally posted a table with all of the launch dates for you, ranging from 6 to 20 months. 20 months is not the "already 2 years" that you wrote, which quite unambiguously means >24 months.

As for the design process being 2018, what does that have to do with anything? Every CPU and GPU in the last 30 years has been in development for multiple years before launch. Once again, you're spouting nonsense. Please stop, or at least do a basic sanity check on whether what you type is sane, relevant, or worthwhile.
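If anyone wants to sanity-check the month arithmetic, here's a quick sketch; the launch dates are the commonly reported ones, and the reference date is an assumption, roughly when this thread was active:

```python
# Quick age check in months; launch dates are commonly reported figures and
# the reference date is an assumption (roughly when this thread was active).
from datetime import date

launches = {
    "RX 7900 XTX": date(2022, 12, 13),
    "RTX 4090": date(2022, 10, 12),
    "RX 7800 XT": date(2023, 9, 6),
}
today = date(2024, 8, 25)  # assumed reference date

for card, launched in launches.items():
    months = (today.year - launched.year) * 12 + (today.month - launched.month)
    print(f"{card}: ~{months} months old")
```

On those assumptions the RDNA 3 flagship lands at roughly 20 months, well short of the >24 months that "already 2 years old" implies.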
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Yes, I literally posted a table with all of the launch dates for you, ranging from 6 to 20 months. 20 months is not the "already 2 years" that you wrote, which quite unambiguously means >24 months.

As for the design process being 2018, what does that have to do with anything? Every CPU and GPU in the last 30 years has been in development for multiple years before launch. Once again, you're spouting nonsense. Please stop, or at least do a basic sanity check on whether what you type is sane, relevant, or worthwhile.

The one spouting nonsense is you.
You're counting the period between a product going out into the wild and the present moment, which is the wrong way; the right way to count a product's age is from its tape-out or its set-in-stone feature-set decision. Between that moment and the physical release to the wild, there can be multiple other milestones, feature-set updates, etc.

Also, what if it's old? What matters is that it still works, right?

Wrong. AMD's market share is close to non-existent in the OEM market, which means that while the products do indeed physically work, they are market failures.

88% of all GPU shipments are NVIDIA; the rest is AMD and Intel.

[Attached: chart of GPU shipment market share]

 