
AMD Retreating from Enthusiast Graphics Segment with RDNA4?

Joined
Aug 25, 2021
Messages
1,125 (0.98/day)
Amen.
So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
There are serious doubts about that, unless both companies substantially improve RT performance across all classes of GPUs, and fast, without breaking customers' piggy banks.
Is the 4060 Ti capable of "acceptable RT performance" for $500? No. Even the 4070 chokes with RT in more demanding titles and becomes a stuttering mess. So RT performance on mainstream GPUs is still in its infancy. Raster is dead - long live raster.

I know. This entire generation, from both AMD and Nvidia, is branded at least one tier higher up the product stack than it should be.

RX 7900 XTX should be 7900 XT
RX 7900 XT should be 7800 XT
RX 7600 should be 7400 XT

RTX 4090 should be RTX 4080 Ti
RTX 4080 should be RTX 4070
RTX 4070 Ti should be RTX 4060 Ti
RTX 4070 should be RTX 4060
RTX 4060 Ti should be RTX 4050 Ti
RTX 4060 should be RTX 4050
Names are less important. The marketing departments of both companies use them to confuse people and make comparisons harder. We need to take official names at face value for what they are, and simply keep a healthy distance from them by comparing performance and features instead.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
Names are less important. The marketing departments of both companies use them to confuse people and make comparisons harder. We need to take official names at face value for what they are, and simply keep a healthy distance from them by comparing performance and features instead.

Names are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand-new S-Class Mercedes. Would you agree to that? Of course not!
 
Joined
Jun 11, 2020
Messages
571 (0.36/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
Names are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand-new S-Class Mercedes. Would you agree to that? Of course not!
But if Mercedes could, they would love to put in a 1.4 l turbocharged engine that gives you the same "feel" as the 5 l V8 and charge the same as the previous model. This is essentially what Nvidia is doing: finding new, more efficient ways to generate graphics but not passing the savings on to consumers, since it's still a "premium" product.
 
Joined
Feb 18, 2005
Messages
5,701 (0.79/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Until there's a GPU and engine capable of full path tracing at 60 fps minimum, rasterization and/or hybrid rendering will never be dead. Unless either company can magically quintuple RT performance gen to gen, we're years away from that being any sort of reality.
You've very obviously never looked at RTX 4090 ray-traced benchmarks. It is capable of over 60 FPS in 4K in every title tested but one. There's no need for them to quintuple RT performance each generation when it's increasing by close to 50% in 4K generation-on-generation.

Inb4 you try to move the goalposts with "I meant a mainstream GPU": the 4060 Ti handily beats my 2080 Ti in RT; that's equivalent RT performance moving from the ultra-high-end down to the upper mid-range in a mere two generations. No reason to expect that to slow down anytime soon.
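
As a rough sanity check on that (purely illustrative, and assuming the ~50% per-generation figure above holds), compounding 1.5x uplifts reaches the "quintuple" mark in about four generations:

```python
import math

# Illustrative only: if RT performance grows ~50% per generation,
# how many generations until a cumulative 5x ("quintuple") uplift?
per_gen_uplift = 1.5   # assumed gen-on-gen RT gain at 4K (figure from the post above)
target = 5.0           # the "quintuple" from the quoted post

gens = math.log(target) / math.log(per_gen_uplift)
print(f"~{gens:.1f} generations to reach a cumulative 5x RT uplift")
# ~4.0 generations, i.e. roughly eight years at a two-year cadence
```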

@Assimilator RT/full path tracing being THE way is years off IMHO, and even then indie raster games will still happen, so I disagree.
Sure, rasterised games will continue to be made, but rasterisation performance of new hardware won't continue to increase, because it makes far more sense to spend that precious die space on a technology that isn't on terminal life support. Once games that actually matter (i.e. not indie ones) start using RT rendering exclusively, nobody will care about rasterisation performance ever again.

There are serious doubts about that, unless both companies substantially improve RT performance across all classes of GPUs, and fast, without breaking customers' piggy banks.
Is the 4060 Ti capable of "acceptable RT performance" for $500? No. Even the 4070 chokes with RT in more demanding titles and becomes a stuttering mess. So RT performance on mainstream GPUs is still in its infancy. Raster is dead - long live raster.
Which is why you turn on DLSS.
 
Joined
Apr 12, 2013
Messages
7,444 (1.77/day)
But if Mercedes could, they would love to put in a 1.4 l turbocharged engine that gives you the same "feel" as the 5 l V8 and charge the same as the previous model. This is essentially what Nvidia is doing: finding new, more efficient ways to generate graphics but not passing the savings on to consumers, since it's still a "premium" product.
But wouldn't you rather get a 1000 mile range SS battery powered EV instead?
 
Joined
Apr 14, 2018
Messages
626 (0.26/day)
You've very obviously never looked at RTX 4090 ray-traced benchmarks. It is capable of over 60 FPS in 4K in every title tested but one. There's no need for them to quintuple RT performance each generation when it's increasing by close to 50% in 4K generation-on-generation.

Inb4 you try to move the goalposts with "I meant a mainstream GPU": the 4060 Ti handily beats my 2080 Ti in RT; that's equivalent RT performance moving from the ultra-high-end down to the upper mid-range in a mere two generations. No reason to expect that to slow down anytime soon.


Sure, rasterised games will continue to be made, but rasterisation performance of new hardware won't continue to increase, because it makes far more sense to spend that precious die space on a technology that isn't on terminal life support. Once games that actually matter (i.e. not indie ones) start using RT rendering exclusively, nobody will care about rasterisation performance ever again.


Which is why you turn on DLSS.

There are no goalposts to move. 99.99% of games use either traditional rendering or hybrid rendering.

The list of "path traced" games (actual ray tracing) is so infinitesimally small it's not even worth mentioning. Last time I checked, the 4090 was averaging, what, 12 fps at 4K in CP2077 with Overdrive settings (path tracing). So yes, they absolutely need to quintuple it, if not more. Keep living in some warped green reality.
 
Last edited:
Joined
Mar 10, 2010
Messages
11,878 (2.23/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Names are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand-new S-Class Mercedes. Would you agree to that? Of course not!
Oh dear.

That's exactly what's happening with ICE cars this generation.

1.0-litre engines masquerade as "30" badges; Audi is the first with this, but won't be the last.

We had BOGOF deals and we bought; then twofers and 3-for-2 deals.


Now marketers are literally selling less for more, all over.
 
Joined
Jun 11, 2020
Messages
571 (0.36/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
But wouldn't you rather get a 1000 mile range SS battery powered EV instead?
Sure, but not at the price that something like that will cost when first released. I'd give it a few years to mature/get costs down.
 
Joined
Jul 7, 2019
Messages
895 (0.46/day)
On the subject of Tegras, it was an easy deal for Nintendo and Nvidia. No one wanted Nvidia's Tegras, so Nintendo could buy them in bulk for cheap, convert them into a basic handheld, and make bank. Heck, the first few generations of Switches were easily hackable because they were literally modified Tegra tablets. For Nvidia, their throwaway SoC now had some value and the development cost was already paid for, so it was easy to just sell Nintendo more as needed. It's also why it's taken forever for a Switch 2 to become a thing: it requires Nintendo to fork over the cash to get Nvidia to produce a one-off SoC design for them, as Nvidia doesn't really do custom solutions. Rather, Nvidia prefers to make customers conform to fit its ecosystem, much like Apple does.

Meanwhile, for a broader audience, the Steam Deck, ROG Ally, and the various other handhelds all run on either custom or semi-custom AMD APUs/SoCs, and AMD can support this because it has a dedicated team that can customize a solution specifically for a buyer, which is also what allowed AMD to win over MS and Sony after both got burned by Nvidia. The PS3 did end up with an Nvidia GPU, but Nvidia's unwillingness to do a truly custom design for Sony helped push Sony toward ATI/AMD for the PS4, and MS couldn't convince Nvidia to produce a custom GPU for the Xbox 360 either, leading MS to strike a deal with ATI instead - one that carried over when AMD acquired ATI.

Which brings to mind another point: given that mobile gaming is on the rise now that there's the infrastructure to support it (the PS Vita was too far ahead of its time), AMD refocusing on this growing market could be one reason for the rumors that AMD is aiming only for the mid-range. Barely a few months go by before we hear of a new handheld with an AMD SoC, or of AMD's SoCs edging into the space where Intel NUCs used to reign, so it makes sense to saturate that market with AMD IP and get more code optimized for AMD, just as AMD has been doing in the enterprise/corporate sector via EPYC and Threadripper. Of course, with RDNA3 having proven that MCM GPUs are possible, AMD could now combine multiple mid-range chiplets into a high-end product rather than producing a separate line of chiplets just for high-end purposes, and continue to refine the MCM approach that way (more so if they delve into 3D V-Cache, or even an FPGA or mini-CPU block, as some rumors claim).
 
Joined
Apr 12, 2013
Messages
7,444 (1.77/day)
If AMD+Exynos takes off, QC/Apple could be in some trouble at the top end. Apple will remain the most profitable phone maker for years to come, though perhaps not necessarily on the back of their uber-powerful and efficient SoCs.
 
Joined
Jan 14, 2019
Messages
11,924 (5.67/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Names are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand-new S-Class Mercedes. Would you agree to that? Of course not!
So you would buy a Mercedes S-class with a low-performance 1.4 l engine just for its name?

No. DLSS should always be treated as an additional perk to be used at your convenience, not as a 'must-turn-on' feature that masks hardware deficiencies just to bring performance up to a barely acceptable level in demanding scenarios.
Finally someone with common sense! :toast:
 
Joined
Dec 25, 2020
Messages
6,359 (4.56/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Tough, then I will not be their customer.
A 5080 for $1,200 can pass my test only if it has: 24 GB VRAM, a 50% uplift in 4K over the 4080, and DisplayPort 2.1 ports (including one USB-C).

Which is likely going to happen, as the 4080 met those same criteria over the 3080. USB-C is likely not coming back: Turing had it, and while AMD caught up to that in RDNA 2 (RDNA 1 did not have it), NVIDIA removed the port in Ampere. I understand NVIDIA's reasoning: the trend is for HMDs to go fully wireless, and with the extremely high bandwidth and low latency afforded by next-gen networking protocols like Wi-Fi 7, I can easily see true wireless HMDs debuting sometime soon.

@Assimilator RT full path tracing being THE way is years off IMHO and yet even then indy raster game's will happen, I disagree then.

Which brings the conversation full circle: as I pointed out initially, indie raster games already run well on Pascal, often on Maxwell, and in some cases even on Fermi and earlier as it stands today. A focus on increased rasterization performance is not necessary; the RTX 4080 is knocking on the realm of terapixel and teratexel fill rates (at roughly 300 Gpixel/s and 800 Gtexel/s, its raster fill rates are high enough that, targeting a 1440p display and in many cases even a 4K display, they'd never be the bottleneck), and the 4090 actually ventures into that domain - its texture fill rate is so high that you no longer measure it in gigatexels per second; you can safely use the tera scale for that.
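
To put those fill-rate numbers in perspective, a back-of-the-envelope sketch using the figures above and an assumed, deliberately generous overdraw factor (the overdraw value is an illustration, not a measurement):

```python
# Back-of-the-envelope: can ~300 Gpixel/s of raster fill keep up with a 4K panel?
pixels_per_frame = 3840 * 2160        # 4K UHD
refresh_hz = 144
overdraw = 10                         # assumed allowance for overdraw and extra passes

required = pixels_per_frame * refresh_hz * overdraw    # pixels per second
available = 300e9                                      # ~300 Gpixel/s (RTX 4080 figure above)

print(f"required ~{required / 1e9:.1f} Gpixel/s, available ~{available / 1e9:.0f} Gpixel/s")
# required ~11.9 Gpixel/s vs ~300 Gpixel/s -> pixel fill rate is nowhere near the bottleneck
```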

I'll see your 1070 and raise you a Nintendo Switch. It's crazy what devs can run on that thing. It has the specs of a flagship phone from 2013!

Indeed, and while we won't claim that the Switch doesn't deliver a compromised experience (it most definitely does), it can pull off some wondrous things. I'd say the NieR: Automata port to Switch is one of the most marvelous achievements ever, as the game was clearly designed to run on a much more robust system.

On the subject of Tegras, it was an easy deal for Nintendo and Nvidia. No one wanted Nvidia's Tegras, so Nintendo could buy them in bulk for cheap, convert them into a basic handheld, and make bank. Heck, the first few generations of Switches were easily hackable because they were literally modified Tegra tablets. For Nvidia, their throwaway SoC now had some value and the development cost was already paid for, so it was easy to just sell Nintendo more as needed. It's also why it's taken forever for a Switch 2 to become a thing: it requires Nintendo to fork over the cash to get Nvidia to produce a one-off SoC design for them, as Nvidia doesn't really do custom solutions. Rather, Nvidia prefers to make customers conform to fit its ecosystem, much like Apple does.

I don't think it's even that: the console is selling well, the ecosystem is rich, and developers and gamers alike are interested in it. Thus there's little need to rush the release of a new system - does Nintendo want a repeat of the Wii U? Even the move to high definition didn't make it as popular as the original Wii, because the U didn't have the original's mojo - and mojo is Nintendo's specialty.

Speaking of exploits, though, it'd be absolutely hilarious if the PS5 got jailbroken because of the newly discovered Zen 2 Zenbleed vulnerability... which AMD seems to be having a high degree of difficulty patching; some Zen 2 platforms will only receive AGESA updates next year.

If AMD+Exynos takes off, QC/Apple could be in some trouble at the top end. Apple will remain the most profitable phone maker for years to come, though perhaps not necessarily on the back of their uber-powerful and efficient SoCs.

I don't see it happening... at least it didn't with Samsung, which has already gone back to Qualcomm SoCs and even scored a few tailored, buffed-up "Snapdragon for Galaxy" versions of the 8 Gen 2. The Exynos 2200 has the RDNA 2-derived Xclipse 920, and it's about 10% faster than the Snapdragon 888's GPU - keeping in mind the 888 was released in late 2020 and is used in 2021 flagships (Galaxy S21, Z Flip 3/Z Fold 3).

Amen.

So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.

And that's pretty much it. The GPU is evolving and the hyperfixation on raster is clearly a way to stonewall and deflect from the reality AMD currently finds itself in, more accurately described by:

This Is Fine GIF
 

The Jniac

New Member
Joined
Aug 8, 2023
Messages
6 (0.01/day)
Very disappointing if true, although I definitely understand that it might not be financially feasible to produce a high-end card every year or so. I wonder if it might be better for AMD to switch to an alternating cycle where they produce a new generation of cards one year, complete with high-end cards, and then do the budget/midrange cards the other year.

I *really* do not want to be stuck with only nVidia producing high-end cards.
 
Joined
Jan 14, 2019
Messages
11,924 (5.67/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I'm not against RT by any means, but if it's really as important as we're led to believe, then why do both AMD and Nvidia offer the same ratio of raster-to-RT hardware and performance as they did in their last generation?
 

The Jniac

New Member
Joined
Aug 8, 2023
Messages
6 (0.01/day)
Another read of this is that the high-end Radeon will be built from multiple smaller dies.
Exactly what I was thinking. AMD has really been leaning into building stuff out of multiple chiplets as opposed to monolithic designs, so at least with my minimal knowledge of chip design and manufacturing, it makes sense to bring that to GPUs as well.
 
Joined
Apr 14, 2018
Messages
626 (0.26/day)
Which is likely going to happen, as the 4080 met those same criteria over the 3080. USB-C is likely not coming back: Turing had it, and while AMD caught up to that in RDNA 2 (RDNA 1 did not have it), NVIDIA removed the port in Ampere. I understand NVIDIA's reasoning: the trend is for HMDs to go fully wireless, and with the extremely high bandwidth and low latency afforded by next-gen networking protocols like Wi-Fi 7, I can easily see true wireless HMDs debuting sometime soon.



Which brings the conversation full circle: as I pointed out initially, indie raster games already run well on Pascal, often on Maxwell, and in some cases even on Fermi and earlier as it stands today. A focus on increased rasterization performance is not necessary; the RTX 4080 is knocking on the realm of terapixel and teratexel fill rates (at roughly 300 Gpixel/s and 800 Gtexel/s, its raster fill rates are high enough that, targeting a 1440p display and in many cases even a 4K display, they'd never be the bottleneck), and the 4090 actually ventures into that domain - its texture fill rate is so high that you no longer measure it in gigatexels per second; you can safely use the tera scale for that.



Indeed, and while we won't claim that the Switch doesn't deliver a compromised experience (it most definitely does), it can pull off some wondrous things. I'd say the NieR: Automata port to Switch is one of the most marvelous achievements ever, as the game was clearly designed to run on a much more robust system.



I don't think it's even that: the console is selling well, the ecosystem is rich, and developers and gamers alike are interested in it. Thus there's little need to rush the release of a new system - does Nintendo want a repeat of the Wii U? Even the move to high definition didn't make it as popular as the original Wii, because the U didn't have the original's mojo - and mojo is Nintendo's specialty.

Speaking of exploits, though, it'd be absolutely hilarious if the PS5 got jailbroken because of the newly discovered Zen 2 Zenbleed vulnerability... which AMD seems to be having a high degree of difficulty patching; some Zen 2 platforms will only receive AGESA updates next year.



I don't see it happening... at least it didn't with Samsung, which has already gone back to Qualcomm SoCs and even scored a few tailored, buffed-up "Snapdragon for Galaxy" versions of the 8 Gen 2. The Exynos 2200 has the RDNA 2-derived Xclipse 920, and it's about 10% faster than the Snapdragon 888's GPU - keeping in mind the 888 was released in late 2020 and is used in 2021 flagships (Galaxy S21, Z Flip 3/Z Fold 3).



And that's pretty much it. The GPU is evolving and the hyperfixation on raster is clearly a way to stonewall and deflect from the reality AMD currently finds itself in, more accurately described by:

This Is Fine GIF

The market, both hardware and software, clearly proves the opposite of this conclusion/delusion.

Full path tracing becoming the norm, let alone being playable on a relatively affordable GPU, is still several years away. It's easy to forget that tech snobs (myself included) have a massively warped view of the GPU market. 4090/4080/3090/6900 XT/7900 XTX owners aren't even remotely common outside the tech-forum niche.
 
Joined
Dec 25, 2020
Messages
6,359 (4.56/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
The market, both hardware and software, clearly proves the opposite of this conclusion/delusion.

Full path tracing becoming the norm, let alone being playable on a relatively affordable GPU, is still several years away. It's easy to forget that tech snobs (myself included) have a massively warped view of the GPU market. 4090/4080/3090/6900 XT/7900 XTX owners aren't even remotely common outside the tech-forum niche.

It's not a delusion. It takes several generations for new standards to be widely adopted. Just because we now have third-generation ray-tracing cards that can finally more or less path trace without instantly croaking (as long as you have the highest-end models) doesn't mean that the several generations' worth of traditional hardware will just up and disappear. People are happily using, and will continue to use, their GTX 1080 Tis, like I brought up earlier - and as that hardware ages, not having superfluous eye candy is a very small price to pay if you're not interested in the bleeding edge.

When DirectX 11 cards finally arrived and unified shaders (introduced three hardware generations earlier with G80) started to become mandatory, because games began using what were at the time advanced GPU computing technologies, it didn't mean that DirectX 9 games suddenly disappeared. Indeed, they were still widely released, with some high-profile releases coming out as late as 2014 (Borderlands: The Pre-Sequel), still fully compatible with Windows XP. By that time XP was 13 years old and just out of its quite protracted extended support stage, and Windows 7 and DirectX 11 were well over half a decade old. And by then... a lot of people were still happily gaming on GTX 200 series and HD 4000 cards.

Due to the enormous complexity of undertaking this evolution and market realities (video game fidelity has more or less plateaued because of ever-rising development costs and the struggle to raise prices to end users), the recent crypto and AI bubbles, as well as economic hardship across the wider user base, have obviously slowed adoption tremendously, which brings my point full circle yet again: if you don't really care about or need the newest graphics technologies, you can happily stay with your Pascal card until something else comes along.

However, if you want to compete at the high end, where your primary market is technology enthusiasts and people who want to experience the latest and greatest, you must deliver the goods. If AMD retreats to the midrange and offers modest support for these newer techs while it builds its portfolio, that could prove to be quite a wise move - albeit at the expense of its loyal fanbase, whose loyalty will be put to the test: will they still buy AMD hardware if AMD doesn't stake a claim to the upper range?
 
Joined
Mar 10, 2010
Messages
11,878 (2.23/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Yeah, no - Polaris, 5700 XT... it would be the norm.

My take is that they're busy.
 
Joined
Jun 2, 2017
Messages
8,844 (3.28/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I have owned both the XT and the XTX, and they are both absolutely fine for gaming. Raster is dead? I guess Baldur's Gate 3 is focused on RT and DLSS? I guess I don't enjoy my games because I don't use or have those features. People talking about chiplets being done are not appreciating what Ryzen did and how parallel processing could be even faster with chiplets. The best is today's HU video comparing the 6800 XT to the 4060 Ti 16 GB - and guess what? The fact (for me) is that people who put DLSS and RT in the same sentence should have at least a 4080 if they're going to complain about the performance of AMD. The other thing is that the 7900 XT (Sapphire Pulse) is still the same price on Newegg today as it was at launch. The XTX, on the other hand, can be found for up to $200 less than launch price.

The narrative is so entrenched that even with the launch announcement of the 7800 XT with 16 GB, people will still say the 7900 XT should be called the 7800 XT. The truth is gaming has never been more enjoyable, with nostalgic, high-fidelity, unique experiences and new content coming all the time. Baldur's Gate 3 and Armored Core 6 are two that come to mind, but there are also plenty of games from before the age of DLSS to enjoy, like Shadow of War, Grim Dawn, and plenty of racing games like the Grid series. If you want the arcade feel, Everspace 2 and Redout 2 are fun to play, but even Red Faction Guerrilla Remastered is sweet for arcade satisfaction.
 
Joined
Apr 14, 2018
Messages
626 (0.26/day)
It's not a delusion. It takes several generations for new standards to be widely adopted. Just because we now have third-generation ray-tracing cards that can finally more or less path trace without instantly croaking (as long as you have the highest-end models) doesn't mean that the several generations' worth of traditional hardware will just up and disappear. People are happily using, and will continue to use, their GTX 1080 Tis, like I brought up earlier - and as that hardware ages, not having superfluous eye candy is a very small price to pay if you're not interested in the bleeding edge.

When DirectX 11 cards finally arrived and unified shaders (introduced three hardware generations earlier with G80) started to become mandatory, because games began using what were at the time advanced GPU computing technologies, it didn't mean that DirectX 9 games suddenly disappeared. Indeed, they were still widely released, with some high-profile releases coming out as late as 2014 (Borderlands: The Pre-Sequel), still fully compatible with Windows XP. By that time XP was 13 years old and just out of its quite protracted extended support stage, and Windows 7 and DirectX 11 were well over half a decade old. And by then... a lot of people were still happily gaming on GTX 200 series and HD 4000 cards.

Due to the enormous complexity of undertaking this evolution and market realities (video game fidelity has more or less plateaued because of ever-rising development costs and the struggle to raise prices to end users), the recent crypto and AI bubbles, as well as economic hardship across the wider user base, have obviously slowed adoption tremendously, which brings my point full circle yet again: if you don't really care about or need the newest graphics technologies, you can happily stay with your Pascal card until something else comes along.

However, if you want to compete at the high end, where your primary market is technology enthusiasts and people who want to experience the latest and greatest, you must deliver the goods. If AMD retreats to the midrange and offers modest support for these newer techs while it builds its portfolio, that could prove to be quite a wise move - albeit at the expense of its loyal fanbase, whose loyalty will be put to the test: will they still buy AMD hardware if AMD doesn't stake a claim to the upper range?

There's no reason to put all their eggs in the RT/path-tracing basket. Even Nvidia can't provide a viable and affordable solution for the non-existent library of path-traced games. Both Nvidia and AMD have bigger fish to fry in the AI and server spaces.

Rasterization and hybrid ray tracing are here to stay as the main means of rendering for a while. AMD isn't "stonewalling" anything.
 
Joined
Dec 25, 2020
Messages
6,359 (4.56/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
There's no reason to put all their eggs in the RT/path-tracing basket. Even Nvidia can't provide a viable and affordable solution for the non-existent library of path-traced games. Both Nvidia and AMD have bigger fish to fry in the AI and server spaces.

Rasterization and hybrid ray tracing are here to stay as the main means of rendering for a while. AMD isn't "stonewalling" anything.

Read again, that was directed at their "loyal fans".
 
Joined
Feb 8, 2022
Messages
269 (0.27/day)
Location
Georgia, United States
System Name LMDESKTOPv2
Processor Intel i9 10850K
Motherboard ASRock Z590 PG Velocita
Cooling Arctic Liquid Freezer II 240 w/ Maintenance Kit
Memory Corsair Vengeance DDR4 3600 CL18 2x16
Video Card(s) RTX 3080 Ti FE
Storage Intel Optane 900p 280GB, 1TB WD Blue SSD, 2TB Team Vulkan SSD, 2TB Seagate HDD, 4TB Team MP34 SSD
Display(s) HP Omen 27q, HP 25er
Case Fractal Design Meshify C Steel Panel
Audio Device(s) Sennheiser GSX 1000, Schiit Magni Heresy, Sennheiser HD560S
Power Supply Corsair HX850 V2
Mouse Logitech MX518 Legendary Edition
Keyboard Logitech G413 Carbon
VR HMD Oculus Quest 2 (w/ BOBO VR battery strap)
Software Win 10 Professional
I'll see your 1070 and raise you a Nintendo Switch. It's crazy what devs can run on that thing. It has the specs of a flagship phone from 2013!
My Nexus 6 was more powerful and it was 32 bit! One of the first 1440p phones with a GPU powerful enough to push Android without stuttering!
 