
AMD Retreating from Enthusiast Graphics Segment with RDNA4?

Joined
Feb 18, 2023
Messages
244 (0.38/day)
Yes, but I'd say it's because you have a Core i9 13th gen now and a Core i7 8th gen then.
Yes, but I switched my 1070 to a 3070 Ti and then upgraded the CPU, so for a while I had a Core i7 8th gen with a 3070 Ti. The CPU helped, but the GPU helped way more.
 
Joined
Dec 25, 2020
Messages
6,693 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
You'll hit the physics wall before you end up anywhere close to full RT, wanna bet on that?

It's well known we're at the end of silicon and to continue onwards new materials will need to be used, but I would never bet against the tech industry, that is most unwise ;)
 
Joined
Apr 12, 2013
Messages
7,525 (1.77/day)
I'd rather go with Physics.
Manhattan Project Oppenheimer GIF by GIPHY News
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim all push, cpu ek suprim, Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
It's well known we're at the end of silicon and to continue onwards new materials will need to be used, but I would never bet against the tech industry, that is most unwise ;)
You're being extreme. Silicon is still being worked on and with, and will be for at least seven more generations of shrink; it likely won't be replaced quickly or easily. Few materials have a suitably workable bandgap, and even GaN devices are very new and niche, though they're in the lead experimentally over other silicon alternatives.

I actually think innovation, possibly photonics, will extend the viability of silicon way beyond the Angstrom era, thanks to the vast body of working processes already developed for silicon fabrication.
 
Joined
Dec 25, 2020
Messages
6,693 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
You're being extreme. Silicon is still being worked on and with, and will be for at least seven more generations of shrink; it likely won't be replaced quickly or easily. Few materials have a suitably workable bandgap, and even GaN devices are very new and niche, though they're in the lead experimentally over other silicon alternatives.

I actually think innovation, possibly photonics, will extend the viability of silicon way beyond the Angstrom era, thanks to the vast body of working processes already developed for silicon fabrication.

If anything that kind of reinforces my point, though. It won't be that long.
 
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
USB-C is likely not coming back
The USB-C video port on a GPU may actually evolve in the future to transmit USB and/or PCIe data alongside video.

Asus has recently shown an NVMe drive attached to a GPU's PCB transmitting data over the PCIe link. By extension, it's enough to install a USB4 controller on the GPU's PCB to inject USB and PCIe data flowing to/from the USB-C port alongside the DisplayPort video data.
 
Joined
Dec 25, 2020
Messages
6,693 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
The USB-C video port on a GPU may actually evolve in the future to transmit USB and/or PCIe data alongside video.

Asus has recently shown an NVMe drive attached to a GPU's PCB transmitting data over the PCIe link. By extension, it's enough to install a USB4 controller on the GPU's PCB to inject USB and PCIe data flowing to/from the USB-C port alongside the DisplayPort video data.

This is already possible; the CPU graphics on my motherboard are wired to a USB-C DisplayPort source. But even so, it's still a rather unusual connector for monitors.
 

yannus1

New Member
Joined
Jan 30, 2023
Messages
23 (0.03/day)
I remember a rumour that Intel would stop Arc production after first gen. It wasn't true.
 
Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It's well known we're at the end of silicon and to continue onwards new materials will need to be used, but I would never bet against the tech industry, that is most unwise ;)
How about using one GPU only for raster, and one only for RT, similarly to the earliest physics accelerators before Nvidia bought Ageia, the company that made them (or 3DFX cards that didn't have 2D support). ATX is basically just a ton of unused space in a modern gaming PC, so why do raster and RT have to be on the same chip, or even the same card?
 
Joined
Dec 25, 2020
Messages
6,693 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
How about using one GPU only for raster, and one only for RT, similarly to the earliest physics accelerators before Nvidia bought Ageia, the company that made them (or 3DFX cards that didn't have 2D support). ATX is basically just a ton of unused space in a modern gaming PC, so why do raster and RT have to be on the same chip, or even the same card?

I'm sure it was thought of, but I doubt we have any interconnect technology fast enough for that, and it'd also need perfect synchronization with the GPU itself... seems like its own can of worms to me.
 
Joined
Jun 6, 2022
Messages
622 (0.69/day)
well it doesn't beat 4090 in any other game... valhalla is just a heavy amd game. your refusal to admit it doesn't even beat it in one game though is troubling
He is AMD Taliban and you have nothing to discuss with him. I have him on ignore.


The RTX 4090 is the undisputed king at the moment. An expensive king, it's true, and the price is the only weapon these Taliban have while they deny the array of technologies that increase the value of an Nvidia video card. Anyway, their desperation shows in how they use a game sponsored by AMD to hide the drama in all the others.

Something funny that stuck in my mind from a well-known compatriot's review of the RTX 4060 Ti / RX 7600:
4060 Ti - driver installation and games running without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
He also found he had to use Nvidia software to capture the AMD card's results in games without a built-in benchmark utility, because AMD has nothing similar.

I extracted his results from the two reviews because he "omitted" to compare them; they were too unfavourable to AMD. How do you compare the 34 fps the RX 7600 gets in Cyberpunk (igp disaster) with the 111 fps of the 4060?
The cards were tested only at 1080p.
RX7600 RT_FSR.jpg

4060 RT_DLSS3.jpg


The sources are here
 
Last edited:
Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
He is AMD Taliban and you have nothing to discuss with him. I have him on ignore.

The RTX 4090 is the undisputed king at the moment. An expensive king, it's true, and the price is the only weapon these Taliban have while they deny the array of technologies that increase the value of an Nvidia video card. Anyway, their desperation shows in how they use a game sponsored by AMD to hide the drama in all the others.

Something funny that stuck in my mind from a well-known compatriot's review of the RTX 4060 Ti / RX 7600:
4060 Ti - driver installation and games running without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
He also found he had to use Nvidia software to capture the AMD card's results in games without a built-in benchmark utility, because AMD has nothing similar.

I extracted his results from the two reviews because he "omitted" to compare them; they were too unfavourable to AMD. How do you compare the 34 fps the RX 7600 gets in Cyberpunk (igp disaster) with the 111 fps of the 4060?
The cards were tested only at 1080p.

The sources are here
Comparing anything with DLSS 3 FG on sounds more like an Nvidia advert than a review to me. One must not forget about the possible input latency issues with FG.
 
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
This is already possible, the CPU graphics on my motherboard are wired to an USB-C DisplayPort source. But even if that is the case, it's still a rather unusual format for monitors.
iGPUs have been able to use USB-C for quite some time. I have a Z390 motherboard from ASRock with three monitor outputs from the CPU: DP, HDMI and Thunderbolt 3.

The new thing would be installing a USB4 controller on the GPU's PCB, so that the USB-C port carries not only DP video data but other protocols too. We have never had that on a GPU.
 
Joined
Dec 25, 2020
Messages
6,693 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
iGPUs have been able to use USB-C for quite some time. I have a Z390 motherboard from ASRock with three monitor outputs from the CPU: DP, HDMI and Thunderbolt 3.

The new thing would be installing a USB4 controller on the GPU's PCB, so that the USB-C port carries not only DP video data but other protocols too. We have never had that on a GPU.

Aye, I get you. It's a possibility, I suppose, although it'll largely depend on how USB4 is received, given that the 3.x versions never even supplanted USB 2.0 (likely due in part to the branding mess).

Sadly, on the MEG Z690 Ace the USB-C port that carries a video signal doesn't work with HDMI. I bought a USB-C to HDMI dongle on Amazon to use after I sold my 3090 while the 4080 was on the way; unfortunately, it did not work at all... then I read in the manual that the port only supports DisplayPort. I guess even us nerds need to RTFM sometimes. Fortunately, it's not all a waste: it works just fine with my laptop, and it provides a way to use the integrated Radeon graphics with an external display, since the laptop's native HDMI port is wired directly to the RTX 3050.
 
Joined
Jun 11, 2020
Messages
573 (0.35/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
If AMD really wants this chiplet approach to work for GPUs, they need to be more aggressive in securing advanced nodes from TSMC, or they'd better be sure Samsung can really deliver and go as advanced as possible with them.

They also really need their FG tech to be as good as DLSS 3. I just wish Nvidia would toss us 3000-series people a bone with FG... but I guess I'll have to wait for the AMD/Intel solutions...
 
Joined
Dec 25, 2020
Messages
6,693 (4.69/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
They also really need their FG tech to be as good as DLSS 3. I just wish Nvidia would toss us 3000-series people a bone with FG... but I guess I'll have to wait for the AMD/Intel solutions...

I bet that shortly after FSR 3.0 goes public, if it's really hardware-agnostic and you can run it on a GTX 980, they'll announce a "DLSS 3.5" with some Ada improvements and the "new ability to run on Ampere, Turing, Pascal and Maxwell", with the implication that it's the send-off gift for the 900 series. It'd basically be a repeat of what they did with the image-sharpening feature: they added it as far back as Kepler shortly after AMD rolled it out to Polaris and announced "other GPUs coming soon", stealing the spotlight.
 
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
If AMD really wants this chiplet approach to work for GPUs, they need to be more aggressive in securing advanced nodes from TSMC, or they'd better be sure Samsung can really deliver and go as advanced as possible with them.
It's no and no currently.
At TSMC, Apple always has priority and currently uses 90% of 3 nm capacity. Plus, for the first time they don't pay for defective dies, which suggests 3 nm yields are lower than expected, perhaps 65-70% at the moment.

At Samsung, yields on the GAAFET 3 nm node are unknown; it's unclear whether they're at 50 or 60% currently, which is low. Nvidia really had problems with them on 8 nm.

Everybody wants to be on a cutting-edge node for their most advanced products, but it takes a few years to improve yields towards 90%. It's a painfully slow process...
 
Last edited:
Joined
Jun 11, 2020
Messages
573 (0.35/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
It's no and no currently.
At TSMC, Apple always has priority and currently uses 90% of 3 nm capacity. Plus, for the first time they don't pay for defective dies, which suggests 3 nm yields are lower than expected, perhaps 65-70% at the moment.

At Samsung, yields on the GAAFET 3 nm node are unknown; it's unclear whether they're at 50 or 60% currently, which is low. Nvidia really had problems with them on 8 nm.

Everybody wants to be on a cutting-edge node for their most advanced products, but it takes a few years to improve yields towards 90%. It's a painfully slow process...
That's why I'm saying they need to be more aggressive in securing an advanced node. It doesn't need to be the most advanced, but they cannot afford a disparity with Nvidia/Intel. It's an investment that pays off. Isn't that supposed to be an advantage of the chiplet approach, that you get better yields because each chip is not as big/complex?
 
Joined
Jul 10, 2022
Messages
340 (0.39/day)
Location
France
Processor AMD Ryzen 7 5700X3D
Motherboard MSI MPG B550I GAMING EDGE WIFI Mini ITX
Cooling Noctua NH-U12S Chromax Black
Memory Corsair Vengeance RGB Pro SL 32 GB (2 x 16 GB) 3600MHz CL18
Video Card(s) AMD RX 6750XT Reference Design
Storage 2.5 TB 2.5" SSD / 3 TB HDD
Display(s) ASUS 27" 165HZ VG27WQ / Vertical 16/10 iiyama 25" 75Hz ProLite XUB2595WSU-B1
Case be quiet! Dark Base 700 RGB
Audio Device(s) PSB Alpha P3 / LOXJIE A30 Amp
Power Supply EVGA SuperNOVA 650 GA
Mouse Cooler master MM720
Keyboard Roccat horde
VR HMD Oculus Rift S (please Valve, release a new headset)
Software Windows 10
Something funny that stuck in my mind from a well-known compatriot's review of the RTX 4060 Ti / RX 7600:
4060 Ti - driver installation and games running without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
Did he use DDU? It's well known that Nvidia and AMD drivers don't like each other. Nothing surprising.

It makes me wonder if there are situations where it works fine. Like starting fresh with AMD and then going Nvidia, does that work? Or starting fresh with Nvidia and then going AMD? I've never seen experiments on this; maybe Guru3D has some tests.
He also found he had to use Nvidia software to capture the AMD card's results in games without a built-in benchmark utility, because AMD has nothing similar.
I'm pretty sure I've seen a performance summary somewhere in Adrenalin. Aren't a ton of tests made without any manufacturer software anyway?
 
Last edited:
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
That's why I'm saying they need to be more aggressive in securing an advanced node. It doesn't need to be the most advanced, but they cannot afford a disparity with Nvidia/Intel. It's an investment that pays off.
I don't think AMD could be more "aggressive". They are currently TSMC's second most preferred client, with access to several of the latest customised nodes.

People need to realise that Apple pays in advance, essentially funding entirely new fabs for TSMC's next best node. No one else has that kind of money. Perhaps Nvidia in two years.

AMD has secured 3 nm for several Zen 5 products, such as the Turin and Turin Dense CPUs. Server takes priority for the latest and greatest nodes.

There is currently no major disparity between AMD, Nvidia and Intel in process node. Intel is behind in server chips; Nvidia is ahead in AI chips.

Isn't that supposed to be an advantage of the chiplet approach, that you get better yields because each chip is not as big/complex?
Yes, they get a higher yield of good chiplets per wafer due to their smaller size, but a 3 nm wafer itself is so much more expensive at the moment that only Apple can afford the capacity they booked and paid for two years ago.
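The yield point above can be made concrete with the textbook Poisson die-yield model: the probability that a die is defect-free falls exponentially with its area, so cutting one big die into small chiplets raises the fraction of good silicon per wafer. A minimal sketch, with a made-up defect density purely for illustration (not a real TSMC figure):

```python
import math

def die_yield(area_mm2: float, defect_density: float) -> float:
    """Poisson model: probability that a die of the given area has zero defects."""
    return math.exp(-area_mm2 * defect_density)

D0 = 0.0015  # defects per mm^2 -- hypothetical number, for illustration only

monolithic = die_yield(600, D0)  # one big 600 mm^2 GPU die
chiplet = die_yield(100, D0)     # one 100 mm^2 chiplet

print(f"600 mm^2 monolithic die yield: {monolithic:.1%}")  # ~40.7%
print(f"100 mm^2 chiplet yield:        {chiplet:.1%}")     # ~86.1%
```

Note the subtlety: a package needing six good 100 mm² chiplets has yield 0.861^6 ≈ 0.41, the same as the monolithic die under this model. The real win is that a defect now scraps a 100 mm² chiplet instead of a whole 600 mm² die, so far less wafer area is wasted per defect, and chiplets can also mix nodes (cache/IO dies on cheaper processes). None of that changes the per-wafer price problem described above, though.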
 
Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I bet that by shortly after FSR 3.0 goes public, if it's really completely hardware agnostic and you can run it on a GTX 980, they'll just announce "DLSS 3.5" with some Ada improvements and "new ability to run on Ampere, Turing, Pascal and Maxwell", with the implication that's going to be the sendoff gift for the 900 series GPUs... it'd be basically a repeat of what they did with the image sharpening feature, they added it as far back as Kepler shortly after AMD rolled it to Polaris and announced "other GPUs were coming soon", basically stealing the spotlight there.
Nvidia has already said that the hardware necessary to make DLSS 3 work is there in Turing and Ampere, just "probably" not fast enough to do it at the proper speed (whatever that means). That, to me, is a hint that DLSS 3 for Ampere and Turing is coming soon - probably when Ada sales have reached or exceeded Nvidia's expectations.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
I remember a rumour that Intel would stop Arc production after first gen. It wasn't true.
That's not what the rumour was saying. The rumour said Intel would do just one or two cards each generation, with very low ambitions, instead of a complete lineup competing at all levels. To all appearances, that turned out to be true. But that doesn't mean this RDNA4 rumour will turn out to be true as well; hopefully not.
 
Joined
Apr 12, 2013
Messages
7,525 (1.77/day)
There were multiple rumours; I'm sure MLID/WTF Tech probably cooked up a few :ohwell:
 