• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

Joined
Sep 17, 2014
Messages
22,441 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I mean, once OC'd, I beat an XTX by 200 points, which would have me tying a 4090 in Assassin's Creed Valhalla at 1440p, according to TechSpot's reviews. A $580 GPU (which I got on sale) vs. a $1,700 GPU... I mean, yeah, it's only one game, but still.
Ehh, yeah. Next you're going to say the 7900 XT took you to Mars. It's not "only one game"; it's not a single game.
Let's not exaggerate and try to see things for what they are. There is nearly 25% between the XT and the XTX :) There are no OCs on a 7900 XT worth more than 15% perf, and even then you're doing something special.
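The arithmetic behind that objection is easy to check. A quick sketch with normalized numbers, using only the percentages quoted above (the ~25% gap and the 15% OC ceiling are the figures from this thread, not measurements):

```python
# Sanity check of the claim above, with illustrative normalized numbers.
stock_xt = 100.0        # normalized 7900 XT performance
gap_to_xtx = 0.25       # ~25% XT-to-XTX gap cited above
max_oc_gain = 0.15      # generous ceiling for a 7900 XT overclock

oc_xt = stock_xt * (1 + max_oc_gain)   # best-case overclocked XT
xtx = stock_xt * (1 + gap_to_xtx)      # stock reference XTX

print(oc_xt >= xtx)  # False: a 15% OC cannot close a 25% gap
```

With those inputs, even a ceiling-grade overclock leaves the XT about 8% short of a stock XTX.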
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,214 (4.66/day)
Location
Kepler-186f
Ehh, yeah. Next you're going to say the 7900 XT took you to Mars. It's not "only one game"; it's not a single game.
Let's not exaggerate and try to see things for what they are. There is nearly 25% between the XT and the XTX :) There are no OCs on a 7900 XT worth more than 15% perf, and even then you're doing something special.

Can't find the TechSpot article right now, but this is it here: the 4090 gets beaten by 10-15 FPS in Valhalla by the XTX, and my XT OC'd beats a reference XTX in this game... so you do the math.

 
Joined
Sep 17, 2014
Can't find the TechSpot article right now, but this is it here: the 4090 gets beaten by 10-15 FPS in Valhalla by the XTX, and my XT OC'd beats a reference XTX in this game... so you do the math.

No, I don't care; it's irrelevant, plus YouTuber bullshit. There's a 1.5-tier gap between the 7900 XT and the 4090. We all know the top end runs into CPU bottlenecks, and this is an eternal Ubisoft open-world shitstorm, so you do the math ;) You're "beating" an XTX by extrapolating your setup to what's in a YouTube video, in a single game that is obviously not limited by the GPU.
 
Joined
Sep 17, 2014
Well, it doesn't beat the 4090 in any other game... Valhalla is just a heavily AMD-favoured game. Your refusal to admit it doesn't even beat it in one game, though, is troubling.
It doesn't. The 4090 is limited by the game. This is like starting Minesweeper and saying your Intel iGPU is as fast as that 4090. Come on, man.
 
Joined
Feb 18, 2023
Messages
244 (0.38/day)
Your issues may be limited to the RX 6400, which is a pretty middling card. Even something like an RX 6600 would've given you a much better impression.
My old 1050 Ti didn't have those issues and was way slower; my GT 1030 (my backup card) doesn't have those issues either.

So saying that the 6600 will not have those issues is nuts.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
What are you talking about? The 6600 is a perfectly fine 1080p card.

Cards with 8 GB are really bad, and everyone tells you this. Listen:



The RX 6600 is a low-end, poorly performing card for today's games, only good for yesterday's games, which require less VRAM, maybe 4 or 6 GB.
 
Joined
Dec 25, 2020
Messages
6,745 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Not sure how you can logically say that, when my 7900 XT goes toe to toe with your 4080 for half the price (if you factor in the most recent Prime Day sales). 14-19% FPS gains since launch, drivers that are rock solid (for me, anyway), all my games smooth as butter. Considering the price I paid, I can't complain, and kudos to AMD for the driver improvements; I expect more will come.


Until it doesn't. And that's fundamentally the problem with AMD's GPUs. By buying a Radeon, you forgo your right to the front seat. Nvidia currently supports, and supports well, all of the technologies that make a modern graphics card what it is; by owning a Radeon, you give up on every one of them. Case in point: the hyperfixation AMD fans have on legacy raster graphics should make it painfully obvious that Radeon is lacking in the other departments. Being 2% faster than a reference 4080 at a 10% higher power target in W1zzard's review suite can't even be called a win for the 7900 XTX.

The hardware just isn't up to snuff: you don't get matrix-multiplication units; the hardware video encoder, while no longer completely awful, doesn't support full-chroma or 4:2:2 video, which hinders its usefulness for video production because the GPU is incompatible with the high-quality codecs used by modern cameras; you lose out on nearly all of the niceties and eye candy Nvidia has developed and maintained over the years; and you relegate yourself to last-gen raytracing performance... And if we go by MSRP, congrats, you got 200 bucks off a 20-24 GB GPU that can't run anything that would make that video memory worthwhile. In the meantime, Nvidia has figured out on-the-fly shader execution reordering, and even has an early implementation of motion interpolation, which increases latency but can be countered somewhat with Nvidia Reflex. Well, I promise I won't tell anyone about the mess that Radeon's Anti-Lag is. Oops, I guess I did :oops:

Then there's the other thing: you got 19% more FPS since launch, and that's pretty great! The problem is, Nvidia is also constantly improving its own software's performance. Reviews are best referenced close to the hardware's launch, or with a newer review using newer software; that's why I use W1zz's Taichi White XTX review as my reference.

At the end of the day, what matters is that you, and you alone, are happy. But if you carefully evaluate my reasoning, you'll see what I get for the difference in MSRP, the $300 that separate a 7900 XT from a 4080: accounting for all the performance gains, the far richer feature set, and the constant stream of certified drivers plus the Studio drivers for content creation (available to all RTX GPUs), it all adds up enough for me to personally lean heavily towards Nvidia's offering. Strictly as a Windows user, anyway... Linux is the green team's Achilles heel, mostly because everything that makes a GeForce deliver the experience it can is closed source.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
It's the card to buy in its price range, IMO, unless you really need RT for some reason.
I don't. Pure raster first, good hardware and better prices, then software perks. Once Nvidia gets this combination right, I will buy an Nvidia card again.
 
Joined
Dec 25, 2020
I don't. Pure raster first, good hardware and better prices, then software perks. Once Nvidia gets this combination right, I will buy an Nvidia card again.

It'll never happen; it's the opposite of the direction the market is headed. We achieved enough raster performance back with Pascal, and AMD caught up with RDNA. You'll find the GTX 1080 can still run practically any game exceptionally well if you leave RT, modern accurate lighting and occlusion algorithms, high-precision soft shadows, etc. (you know, the newer technologies) off. And I'll double down on my point:


This guy ran a 2023 AAA test battery on the vanilla GTX 1070, which has slower 8 Gbps GDDR5 (cutting memory bandwidth from the 1080's 320 GB/s to 256 GB/s) and 25% of GP104's SMs disabled (15 of 20 units present). The newest games with the most sophisticated rendering techniques only begin to drop to a "passable" rank once you reach Cyberpunk 2077; Warzone, Days Gone, God of War, Apex... they're all highly playable even on this cut-down card from 2016.
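The bandwidth figures quoted there follow directly from the memory spec: per-pin data rate times bus width, divided by 8 to convert bits to bytes. A quick sketch (the rates and the 256-bit bus are the standard GP104 numbers):

```python
# GB/s = per-pin rate (Gbps) * bus width (bits) / 8 bits-per-byte
def mem_bandwidth(rate_gbps, bus_bits):
    return rate_gbps * bus_bits / 8

print(mem_bandwidth(10, 256))  # 320.0 -> GTX 1080, 10 Gbps GDDR5X
print(mem_bandwidth(8, 256))   # 256.0 -> GTX 1070, 8 Gbps GDDR5
```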
 

ARF

Joined
Jan 28, 2020
Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.

I wouldn't vote against that. It'd definitely be a welcome change.
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Cards with 8 GB are really bad, and everyone tells you this. Listen:



The RX 6600 is a low-end, poorly performing card for today's games, only good for yesterday's games, which require less VRAM, maybe 4 or 6 GB.
Yeah, basically the whole internet is loudly saying that 8 GB is crap, but I haven't run into a single situation where it's really a limiting factor at 1080p. It's like everybody telling me the sky is red and the grass is purple, and I'm the idiot for not believing it.

Similarly, I haven't run into any situation where my 6750 XT can't deliver a stable 60 FPS using only half of its power limit. Yes, I have other 8 GB (and even 4 GB) GPUs, and they're fine.
 

bug

Joined
May 22, 2015
Messages
13,764 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
$200 is an RX 6600 8GB; $500 is an RX 6800 XT three(!!) years after its release.
$200 is an RTX 3050 8GB; $500 is an RTX 3070 Ti 8GB more than two(!!) years after its release.

What are you going to do with these cards? Slide-show "gaming" at 1080p? :banghead:

I don't think the $1,000+ cards must go; actually the opposite: everyone should focus on them and try to buy only them. Instead of upgrading every year or two, just buy that $1,000 beast and stay with it for the next five to seven(!!) years with ease.
Yes, but with increased competition, I would expect more capable cards to be squeezed into that price range.
Plus, Nvidia's $400 4060 Ti is really a $200-250 card; look at the PCB pictures. AMD is probably no better.
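The "buy the $1,000 beast and keep it" argument is really a cost-per-year claim. A rough sketch with made-up prices (the $450 midrange figure and the six-year horizon are assumptions for illustration, not numbers from the thread):

```python
# Cost-per-year comparison: one flagship kept long-term vs. serial midrange upgrades.
YEARS = 6
flagship_per_year = 1000 / YEARS       # one $1,000 card kept six years
midrange_per_year = 450 * 3 / YEARS    # three $450 cards over the same six years

print(round(flagship_per_year))  # 167
print(round(midrange_per_year))  # 225
```

On these assumptions the flagship is cheaper per year, though the comparison ignores resale value and the midrange path's fresher warranty and features.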
 
Joined
Dec 29, 2022
Messages
222 (0.32/day)
Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.
They might. But a lot of people will still buy Nvidia for its features and better drivers.
Nvidia is even working on neural textures to reduce VRAM usage and improve texture quality.
AMD really needs to come up with something spectacular if they want more market share.
Example: you live in Central/Western Europe, you earn 2,500 Euros a month, and rent plus food gets you close to 1,500 Euros. That leaves 1,000 Euros for other spending, so you can easily save up for a 4070 in two months... It's not the end of the world, and Nvidia KNOWS that! That's why Nvidia has more market share: people can still afford its products.
It's Apple reloaded: "Apple is expensive!", yet where I live 1 out of 4 phones is an iPhone (even older generations).
It is what it is... People want "the best of the best" of everything: phone, car, wife/husband. :laugh: But you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. I never understood the need for the highest level of performance; you rarely need it. I know a guy who only plays CS:GO and LoL on an RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (I got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).
Maybe it's time to read more and play less... Just saying...
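That budgeting example reduces to simple arithmetic. A sketch with hypothetical figures (the GPU price and the fraction of the surplus actually set aside are assumptions; only the income and living-cost numbers come from the post):

```python
import math

surplus = 2500 - 1500             # EUR left each month after rent and food
gpu_price = 650                   # assumed 4070 street price in EUR
saved_per_month = surplus * 0.35  # set aside only part of the surplus

print(math.ceil(gpu_price / saved_per_month))  # 2 months of saving
```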
 
Joined
Feb 18, 2023
Messages
244 (0.38/day)
They might. But a lot of people will still buy Nvidia for its features and better drivers.
Nvidia is even working on neural textures to reduce VRAM usage and improve texture quality.
AMD really needs to come up with something spectacular if they want more market share.
Example: you live in Central/Western Europe, you earn 2,500 Euros a month, and rent plus food gets you close to 1,500 Euros. That leaves 1,000 Euros for other spending, so you can easily save up for a 4070 in two months... It's not the end of the world, and Nvidia KNOWS that! That's why Nvidia has more market share: people can still afford its products.
It's Apple reloaded: "Apple is expensive!", yet where I live 1 out of 4 phones is an iPhone (even older generations).
It is what it is... People want "the best of the best" of everything: phone, car, wife/husband. :laugh: But you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. I never understood the need for the highest level of performance; you rarely need it. I know a guy who only plays CS:GO and LoL on an RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (I got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).
Maybe it's time to read more and play less... Just saying...

I mostly play LoL; I have a 3070 Ti and a 13th-gen Core i9, and I play at 4K. If I open Discord, I get random FPS drops. So just because we play a game that's supposed to run on a microwave doesn't mean it can run on anything.

Before that I had a 1070 with an 8th-gen Core i7, and that machine couldn't play LoL at 4K without drops. I tried a 3060 with 12 GB of VRAM and it was way better, but still not perfect. So anything higher than 1080p will require a good video card.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
It'll never happen; it's the opposite of the direction the market is headed. We achieved enough raster performance back with Pascal, and AMD caught up with RDNA. You'll find the GTX 1080 can still run practically any game exceptionally well if you leave RT, modern accurate lighting and occlusion algorithms, high-precision soft shadows, etc. (you know, the newer technologies) off. And I'll double down on my point:


This guy ran a 2023 AAA test battery on the vanilla GTX 1070, which has slower 8 Gbps GDDR5 (cutting memory bandwidth from the 1080's 320 GB/s to 256 GB/s) and 25% of GP104's SMs disabled (15 of 20 units present). The newest games with the most sophisticated rendering techniques only begin to drop to a "passable" rank once you reach Cyberpunk 2077; Warzone, Days Gone, God of War, Apex... they're all highly playable even on this cut-down card from 2016.
Amen.

So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it comes down entirely to the console GPUs not being capable of acceptable RT performance. Assuming AMD mostly addresses that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
 
Joined
Apr 14, 2018
Messages
655 (0.27/day)
Amen.

So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it comes down entirely to the console GPUs not being capable of acceptable RT performance. Assuming AMD mostly addresses that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.

Until there's a GPU and engine capable of full path tracing at a 60 FPS minimum, rasterization and/or hybrid rendering will never be dead. Unless either company can magically quintuple RT performance gen over gen, we're years away from that being any sort of reality.
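The "magically quintuple RT performance" point can be framed as compound growth: at a given per-generation uplift, how many generations until a 5x total? Both the 5x target and the uplift figures here are assumptions for illustration:

```python
import math

# Generations needed to reach a cumulative performance target,
# given a constant per-generation uplift factor.
def gens_needed(target=5.0, per_gen=1.5):
    return math.ceil(math.log(target) / math.log(per_gen))

print(gens_needed(per_gen=1.5))  # 4 generations at +50% per gen
print(gens_needed(per_gen=2.0))  # 3 generations at 2x per gen
```

At roughly two to three years per GPU generation, even the optimistic case lands most of a decade out, which is the poster's point.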
 
Joined
Feb 18, 2023
Messages
244 (0.38/day)
Discord is known for causing performance issues on any system, regardless; it's not graphics-related.

It was worse when I had my 1070; now it's much less of an issue. It's still there, but at least it's not as annoying as it was with my previous setup.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.
You are asking AMD to sell you their top card for the price of a 4070 Ti. It's a joke.
Perhaps you could ask Nvidia to sell you a 4080 for $850, and then ask AMD to sell the 7900 XTX for $799.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
They might. But a lot of people will still buy Nvidia for its features and better drivers.
Nvidia is even working on neural textures to reduce VRAM usage and improve texture quality.
AMD really needs to come up with something spectacular if they want more market share.
Example: you live in Central/Western Europe, you earn 2,500 Euros a month, and rent plus food gets you close to 1,500 Euros. That leaves 1,000 Euros for other spending, so you can easily save up for a 4070 in two months... It's not the end of the world, and Nvidia KNOWS that! That's why Nvidia has more market share: people can still afford its products.
It's Apple reloaded: "Apple is expensive!", yet where I live 1 out of 4 phones is an iPhone (even older generations).
It is what it is... People want "the best of the best" of everything: phone, car, wife/husband. :laugh: But you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. I never understood the need for the highest level of performance; you rarely need it. I know a guy who only plays CS:GO and LoL on an RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (I got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).
Maybe it's time to read more and play less... Just saying...
You sort of prove yourself wrong:

You didn't need the best.
Not everyone buys the best.
The Nvidia 4060 isn't the best.

Life goes on; still no surprise.

You prove people are fickle and buy favourite names.

@Assimilator: RT full path tracing being THE way is years off, IMHO, and even then indie raster games will happen, so I disagree.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
It'll never happen; it's the opposite of the direction the market is headed.
Tough; then I will not be their customer.
A 5080 at $1,200 can pass my test only if it has 24 GB of VRAM, a 50% uplift at 4K over the 4080, and DisplayPort 2.1 ports (including one USB-C).
 

ARF

Joined
Jan 28, 2020
Yes, but with increased competition, I would expect more capable cards to be squeezed into that price range.
Plus, Nvidia's $400 4060 Ti is really a $200-250 card; look at the PCB pictures. AMD is probably no better.

I know. This entire generation, from both AMD and Nvidia, is rebranded at least a tier up the product stack.

RX 7900 XTX should be 7900 XT
RX 7900 XT should be 7800 XT
RX 7600 should be 7400 XT

RTX 4090 should be RTX 4080 Ti
RTX 4080 should be RTX 4070
RTX 4070 Ti should be RTX 4060 Ti
RTX 4070 should be RTX 4060
RTX 4060 Ti should be RTX 4050 Ti
RTX 4060 should be RTX 4050
 
Joined
Jun 11, 2020
Messages
573 (0.35/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 platium
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
It'll never happen; it's the opposite of the direction the market is headed. We achieved enough raster performance back with Pascal, and AMD caught up with RDNA. You'll find the GTX 1080 can still run practically any game exceptionally well if you leave RT, modern accurate lighting and occlusion algorithms, high-precision soft shadows, etc. (you know, the newer technologies) off. And I'll double down on my point:


This guy ran a 2023 AAA test battery on the vanilla GTX 1070, which has slower 8 Gbps GDDR5 (cutting memory bandwidth from the 1080's 320 GB/s to 256 GB/s) and 25% of GP104's SMs disabled (15 of 20 units present). The newest games with the most sophisticated rendering techniques only begin to drop to a "passable" rank once you reach Cyberpunk 2077; Warzone, Days Gone, God of War, Apex... they're all highly playable even on this cut-down card from 2016.

I'll see your 1070 and raise you a Nintendo Switch. It's crazy what devs can run on that thing. It has the specs of a flagship phone from 2013!
 