
AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Oct 27, 2020
Messages
797 (0.53/day)
If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full die (336 TMUs/TCs, up from 304), raise the clocks from 2505MHz to 2610MHz (the RTX 3080 12GB's clocks) and the TDP to 350W, and it would be just 10% slower than the RX 7900 XTX in classic raster but faster in ray tracing. That would probably be enough, given Nvidia's brand awareness.
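Back-of-the-envelope, that upgrade would be worth roughly +15% on paper (assuming performance scales linearly with unit count and clock, which it won't quite):

```python
# Hypothetical full-AD103 RTX 4080 vs the announced configuration.
tmus_announced, tmus_full = 304, 336
mhz_announced, mhz_full = 2505, 2610
uplift = (tmus_full / tmus_announced) * (mhz_full / mhz_announced)
print(f"{uplift:.3f}x (~+{uplift - 1:.0%})")  # 1.152x (~+15%)
```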
It seems RDNA3's SPs, despite doubling FP32 TFLOPS per clock by being dual-issue, are yielding a smaller performance uplift than desired (of course AMD will tell you it's early days, and that with driver updates and newer games optimized for the RDNA3 architecture it will get better...).
In any case, even if 6nm Navi 33 can hit the same 3GHz clocks as the 5nm models and the reference model boosts to, say, 2.85GHz, it likely won't be more than 1.5X the 6600 XT at FHD, so it won't match the 6900 XT's FHD performance; at QHD the RX 6800 XT will be much faster (and at 4K even the RX 6800 will be faster too).
The RX 6800 is $479 and the 6800 XT $535 on Newegg right now, and both are 16GB cards. I would advise anyone looking for ≤$499 cards to buy during the Black Friday/Cyber Monday offers; in 4K raster performance/$, those deals will likely be a wash or better compared with the Q1 2023 releases (full Navi 32 at $649, for example), going by the upcoming RDNA3 models' SRPs vs the RDNA2 Black Friday deals.
 
Joined
Feb 20, 2019
Messages
8,331 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Same, my 3080 continues to impress me 2 years in, but the desire for more performance can never be truly quenched. I'll be looking closely at the 7900 XTX after release for sure, hoping to see it shake up the market a bit and force more compelling prices from Nvidia around that price point too.
I'll also be looking at the XTX pretty closely. The number of titles I play that actually have meaningful ray tracing is just two, and it's not as if RDNA3 can't raytrace; it's just not going to be 4090-tier (or possibly even 4080-tier) when it comes to full lighting/shadow/occlusion ray tracing. If you stick to raytraced reflections only, the AMD hardware is pretty competitive.

My interest will be in getting an XTX and tuning it to see if I can run it at 200-250W without losing too much performance. If the 7900 XTX can't manage that, perhaps the 7800 XT will do. My HTPC is currently rocking a 6700 10GB which sips about 125W under full load after an undervolt, and I really don't think my case or slim furniture can handle anything more than 200-250W. The irony is that it sits right underneath the only 4K display in the house, so it needs the most GPU power.
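For anyone wanting to try the same thing, a minimal sketch of capping board power on an AMD card under Linux (assumes the amdgpu driver; the exact hwmon path varies per system, and it needs root):

```python
# Sketch: set an amdgpu board power cap via sysfs (power1_cap is in microwatts).
from pathlib import Path

def set_power_cap(watts: int, card: str = "card0") -> None:
    hwmon = next(Path(f"/sys/class/drm/{card}/device/hwmon").glob("hwmon*"))
    (hwmon / "power1_cap").write_text(str(watts * 1_000_000))

set_power_cap(225)  # target the 200-250 W window discussed above
```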
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
It seems RDNA3's SPs, despite doubling FP32 TFLOPS per clock by being dual-issue, are yielding a smaller performance uplift than desired (of course AMD will tell you it's early days, and that with driver updates and newer games optimized for the RDNA3 architecture it will get better...).
And there we have it again: claims of better performance with better drivers and games.
It's the same old claim that AMD (or Intel) will catch up to Nvidia with better software over time, but it never happens. If driver overhead were holding back performance, we would see a progressively growing overhead with the higher-tier cards, holding them back to the point where high-end cards become almost pointless. The performance figures we've seen so far do not indicate this, and when reviews arrive, we can probably discredit that claim completely.

And no, (PC) games are not optimized for specific GPU architectures. Games are written using DirectX or Vulkan these days, and neither is tailored to a specific GPU architecture. Games may have some exclusive features requiring specific API extensions, but these don't skew the benchmark results.

BTW, did anyone catch the review embargo?

The RX 6800 is $479 and the 6800 XT $535 on Newegg right now, and both are 16GB cards. I would advise anyone looking for ≤$499 cards to buy during the Black Friday/Cyber Monday offers; in 4K raster performance/$, those deals will likely be a wash or better compared with the Q1 2023 releases (full Navi 32 at $649, for example), going by the upcoming RDNA3 models' SRPs vs the RDNA2 Black Friday deals.
I agree, I expect some good deals from both makers, so people had better set some price notifications to grab the best deals.
 
Joined
Apr 30, 2011
Messages
2,712 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full die (336 TMUs/TCs, up from 304), raise the clocks from 2505MHz to 2610MHz (the RTX 3080 12GB's clocks) and the TDP to 350W, and it would be just 10% slower than the RX 7900 XTX in classic raster but faster in ray tracing. That would probably be enough, given Nvidia's brand awareness.
It seems RDNA3's SPs, despite doubling FP32 TFLOPS per clock by being dual-issue, are yielding a smaller performance uplift than desired (of course AMD will tell you it's early days, and that with driver updates and newer games optimized for the RDNA3 architecture it will get better...).
In any case, even if 6nm Navi 33 can hit the same 3GHz clocks as the 5nm models and the reference model boosts to, say, 2.85GHz, it likely won't be more than 1.5X the 6600 XT at FHD, so it won't match the 6900 XT's FHD performance; at QHD the RX 6800 XT will be much faster (and at 4K even the RX 6800 will be faster too).
The RX 6800 is $479 and the 6800 XT $535 on Newegg right now, and both are 16GB cards. I would advise anyone looking for ≤$499 cards to buy during the Black Friday/Cyber Monday offers; in 4K raster performance/$, those deals will likely be a wash or better compared with the Q1 2023 releases (full Navi 32 at $649, for example), going by the upcoming RDNA3 models' SRPs vs the RDNA2 Black Friday deals.
Maybe you underestimate the gap between the 4090 and the 4080; it is close to 40%. Nothing can make the 4080 get close to even the 7900 XT. This time nVidia went all in to keep the crown with their halo GPU, and the rest will get annihilated in performance, power draw and value for money alike (saved only in RT, and only in the games that fully utilise it). My 5c.
 

optichippo

New Member
Joined
Nov 6, 2022
Messages
1 (0.00/day)
What we can extrapolate is that the 7900 XTX should have around 50-60% more performance than the 6900 XT in pure rasterization, so without RT and all that FSR/DLSS bullshit. So look at some 6900 XT benchmarks and take a guess. Its RT performance might not be up to par with the 4090, since it's about a generation behind in that regard, but it's still up to 50% greater than before, which probably puts it in the ballpark of the 3090/3090 Ti's RT performance, still far below that of the 4090.
Then take the price into account: it would probably be far superior to the 4080 in most games barring RT performance, and it's also cheaper, $999 vs $1199, so it's definitely the better choice IMO. The 7900 XTX might not be targeting the 4090, especially at its price point, but rather the 4080.
This right here is what I've suspected and have been telling friends and coworkers. I believe the 7900 XTX is going to be more on par with the 4080. Now, if it does happen to come somewhat closer to the 4090, then Nvidia is going to be in a bunch of trouble due to pricing and DisplayPort 2.1. I still have zero clue as to why Nvidia skipped out on including DP 2.1, which in itself is a selling point.

BTW the 8K reference is not really 8K (7680x4320 = 33.2M pixels) but a widescreen 8K (7680x2160 = 16.6M pixels)
Yes, you are correct! 8K is not 4K x2; that is not how it works, and I can see how someone might be tricked into thinking it, which is clearly what AMD is doing here. 8K is 4K x4.
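The raw pixel counts make it obvious:

```python
# True 8K is four times the pixels of 4K, not two.
uhd_4k  = 3840 * 2160   # 8,294,400 pixels (~8.3 MP)
true_8k = 7680 * 4320   # 33,177,600 pixels (~33.2 MP)
wide_8k = 7680 * 2160   # 16,588,800 pixels (~16.6 MP), AMD's "8K" ultrawide
print(true_8k / uhd_4k, wide_8k / uhd_4k)  # 4.0 2.0
```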
 
Joined
Oct 27, 2020
Messages
797 (0.53/day)
Maybe you underestimate the gap between the 4090 and the 4080; it is close to 40%. Nothing can make the 4080 get close to even the 7900 XT. This time nVidia went all in to keep the crown with their halo GPU, and the rest will get annihilated in performance, power draw and value for money alike (saved only in RT, and only in the games that fully utilise it). My 5c.
The theoretical difference (I don't mean the FP32 diff: 512/336) between the 4090 and a full AD103 is close to +40%, I agree; let's say +39% as an example.
My assumption is that on the 5800X TPU testbed with the current game selection, the RTX 4090 falls around 10% short of its potential.
For example:
RTX 4090 theoretical 4K: 139%
RTX 4090 realized 4K: 125% (-10%)
Full AD103 with the quoted clocks: 100%
I may be wrong; we will see what actual performance the current RTX 4080 config (304 TMUs/TCs vs 336, and clocked -4% vs my proposed AD103 specs) achieves relative to the RTX 4090.
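Spelling out my assumption in code (the -10% realization factor is my guess, not a measurement):

```python
# Performance indices normalized to a hypothetical full AD103 at 100%.
full_ad103 = 100.0
theo_4090 = full_ad103 * 1.39   # +39% theoretical advantage
real_4090 = theo_4090 * 0.90    # assumed -10% realization on the 5800X testbed
print(theo_4090, round(real_4090, 1))  # 139.0 125.1
```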
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
I'm waiting to see the AMD flagship that sells for $1000 and offers the performance of the 4090. It would be a pleasant surprise, but it's not Lisa Su's style.

I don't know why so many people don't believe that AMD can undercut with good pricing quite substantially. After all, the chiplets were made exactly to cut costs.
AMD has a ~300 sq. mm die vs Nvidia's die that is twice as large. Of course AMD's product costs around 60-70% of Nvidia's to make.
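A rough sketch of the cost math (the wafer price and defect density are illustrative guesses, not TSMC figures; yield uses a simple Poisson approximation):

```python
import math

def cost_per_good_die(die_mm2, wafer_cost=15000, defects_per_cm2=0.1):
    """Dies per 300 mm wafer (edge loss ignored) with Poisson yield e^(-D*A)."""
    wafer_area = math.pi * (300 / 2) ** 2        # mm^2
    dies = wafer_area // die_mm2
    die_yield = math.exp(-defects_per_cm2 * die_mm2 / 100)
    return wafer_cost / (dies * die_yield)

print(f"~300 mm2 die: ${cost_per_good_die(300):.0f} per good die")
print(f"~600 mm2 die: ${cost_per_good_die(600):.0f} per good die")
# The 2x-larger die ends up ~2.7x the cost, since yield drops with area too.
```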
 
Joined
Dec 26, 2020
Messages
382 (0.26/day)
System Name Incomplete thing 1.0
Processor Ryzen 2600
Motherboard B450 Aorus Elite
Cooling Gelid Phantom Black
Memory HyperX Fury RGB 3200 CL16 16GB
Video Card(s) Gigabyte 2060 Gaming OC PRO
Storage Dual 1TB 970evo
Display(s) AOC G2U 1440p 144hz, HP e232
Case CM mb511 RGB
Audio Device(s) Reloop ADM-4
Power Supply Sharkoon WPM-600
Mouse G502 Hero
Keyboard Sharkoon SGK3 Blue
Software W10 Pro
Benchmark Scores 2-5% over stock scores
Impressive if it's really like that, given the 95W lower TDP and the much, much lower price. If this continues down the stack, Lovelace will be the biggest joke Nvidia has made in a while...
 
Joined
May 31, 2016
Messages
4,437 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I really can't say the prices are great for AMD, since the 7900 XT is going to be $899. That is a lot. NV is just a kick in the teeth for consumers, and that is not the end of the story, since there will be a 4090 Ti, I suppose. Anyway, I will wait for the reviews; RT performance is still not a deal breaker, but not a winning deal either. We are getting there, but we are not there yet.
I only hope the AMD GPUs are as fast as advertised.

Impressive if it's really like that, given the 95W lower TDP and the much, much lower price. If this continues down the stack, Lovelace will be the biggest joke Nvidia has made in a while...
Some reviewers claim it is already a big joke and a cash grab.
 
Joined
May 21, 2009
Messages
237 (0.04/day)
AMD has already stated that the 7900 XTX is a video card meant to challenge the RTX 4080, not the RTX 4090.
 
Joined
Feb 22, 2022
Messages
606 (0.59/day)
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Custom Watercooling
Memory G.Skill Trident Z Royal 2x16GB
Video Card(s) MSi RTX 3080ti Suprim X
Storage 2TB Corsair MP600 PRO Hydro X
Display(s) Samsung G7 27" x2
Audio Device(s) Sound Blaster ZxR
Power Supply Be Quiet! Dark Power Pro 12 1500W
Mouse Logitech G903
Keyboard Steelseries Apex Pro
If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full die (336 TMUs/TCs, up from 304), raise the clocks from 2505MHz to 2610MHz (the RTX 3080 12GB's clocks) and the TDP to 350W, and it would be just 10% slower than the RX 7900 XTX in classic raster but faster in ray tracing. That would probably be enough, given Nvidia's brand awareness.
Sooo, basically all they have to do is make a new GPU. Would this be a 4080 Super or Ti? Because I promise you that unless the 4080 launch is next summer, they are already manufactured.

Yes, you are correct! 8K is not 4K x2; that is not how it works, and I can see how someone might be tricked into thinking it, which is clearly what AMD is doing here. 8K is 4K x4.
It is either half-height 8K or 4K ultrawide. Pick your poison! :p
 
Joined
Apr 30, 2011
Messages
2,712 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
AMD has already stated that the 7900 XTX is a video card meant to challenge the RTX 4080, not the RTX 4090.
They did the same for the 6900 XT vs the 3080, but it ended up matching the 3090 at 1440p & 4K
[attached charts: relative performance at 1440p and 4K, showing the 6900 XT matching the RTX 3090]
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
The theoretical difference (I don't mean the FP32 diff: 512/336) between the 4090 and a full AD103 is close to +40%
No.
The 4080 being 60% of the 4090 means that if the 4080 is 100%, the 4090 is 166%.
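The arithmetic, for anyone who wants to check it:

```python
# If the 4080 delivers 60% of the 4090, the 4090's lead is the reciprocal:
ratio = 1 / 0.60
print(f"{ratio:.3f}x, i.e. +{ratio - 1:.0%}")  # 1.667x, i.e. +67%
```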
 
Joined
Nov 26, 2021
Messages
1,702 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Joined
Oct 26, 2022
Messages
57 (0.07/day)
Exactly, and that (7950 XTX..?) will probably arrive around the time of the 4090 Ti.
And yes, the RT performance of the 7900 XTX is known (per AMD's claims) to be ~1.8x that of the 6950 XT, which would place it around the 3090/Ti.
It's just math: +50% per CU, +20% more CUs.
1.0 x 1.5 = 1.5, then 1.5 x 1.2 = 1.8x
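In code, the two gains compound multiplicatively (the +50% per CU and +20% CU count are AMD's claims above; the multiplication is the only step added):

```python
# 7900 XTX vs 6950 XT ray-tracing estimate: the gains multiply, not add.
per_cu_gain = 1.50     # AMD's claimed ~+50% RT throughput per CU
cu_increase = 96 / 80  # 96 CUs vs the 6950 XT's 80 = 1.2x
print(round(per_cu_gain * cu_increase, 2))  # 1.8 -> "~1.8x"
```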
7950/7970 XTX >|~|< 4090 Ti > 4090 ~/a little > 7900 XTX > 4080 16GB >|~|< 7900 XT > "Unlaunched" 4080 12GB.

It will be very compelling to see how it all plays out once they have all been released.
 
Joined
Sep 17, 2014
Messages
22,638 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I beg to differ, don't fall for the memes.
However, it's objective to say that Nvidia's drivers are worse on Linux.
It shows you've been retired for a while now, because this is absolute nonsense.

That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense coming from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by ideology: people think one is completely free and open and the other is proprietary and evil, when the reality is that both are partially open. The truth is the "open" Mesa/Gallium drivers are bloated and heavily abstracted, full of workarounds, and a complete mess.
.. and this is the truth.

Maybe you underestimate the gap between the 4090 and the 4080; it is close to 40%. Nothing can make the 4080 get close to even the 7900 XT. This time nVidia went all in to keep the crown with their halo GPU, and the rest will get annihilated in performance, power draw and value for money alike (saved only in RT, and only in the games that fully utilise it). My 5c.
That is exactly why it appears they cancelled the 4080 12GB. Initially I thought they had to reposition because of their OWN marketing (after all, how vague is such a large gap between two 4080 cards, with a different VRAM capacity to boot; these just aren't two similar cards in any way), but with the 7900 XTX performance estimates out the door, we can easily defend the idea that the 4080 12G turd was pulled back because otherwise AMD would have had a much better story at the high end all the way through. After all, if they drop the number to an x70, AMD's 'faster' cards are no longer all competing with (and in many cases performing above) 4080s. It's a better marketing reality.

It is also highly likely the 4080 16G will get repositioned in MSRP. $200 or even $300 extra just for somewhat better RT performance is steep. Too steep, and that's giving Nvidia the benefit of the doubt that the 4080 won't get eclipsed by AMD's 7900 XT (yes, XT). I honestly think the 4080 is going to be only situationally equal, and overall lower, in raster performance, and even the 7900 XT will be highly competitive with it, given the tiny gap between it and the XTX.
 
Joined
Jan 11, 2013
Messages
1,237 (0.28/day)
Location
California, unfortunately.
System Name Sierra
Processor Core i5-11600K
Motherboard Asus Prime B560M-A AC
Cooling CM 212 Black RGB Edition
Memory 64GB (2x 32GB) DDR4-3600
Video Card(s) MSI GeForce RTX 3080 10GB
Storage 4TB Samsung 990 Pro with Heatsink NVMe SSD
Display(s) 2x Dell S2721QS 4K 60Hz
Case Asus Prime AP201
Power Supply Thermaltake GF1 850W
Software Windows 11 Pro
Since you started the "nobody" talk, let me deliver my suggestion: nobody should pay so much for a gaming device, only for professional reasons. And 1440p is a great res for everyone. 4K will not become mainstream even in 10 years, since most people (>80%) aren't, and will not be, willing to spend so much on the monitor and GPU combo needed.
Well, I for one find my new 4K monitor fantastic compared to 1080p. That said, I wonder how much of a difference it would have been vs 1440p. I just don't think your statement is valid; just because you can't justify spending so much on gaming doesn't mean nobody can. I'm not made of money, I drive a 13-year-old Ford Escape and live in a rented room, but I am saving up for a powerful GPU to match my monitor. Why? Because I love when games are pretty. Especially MSFS 2020, which will be amazing once I get a card that can run 4K smoothly.
 
Joined
Sep 17, 2014
Messages
22,638 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Well, I for one find my new 4K monitor fantastic compared to 1080p. That said, I wonder how much of a difference it would have been vs 1440p. I just don't think your statement is valid; just because you can't justify spending so much on gaming doesn't mean nobody can. I'm not made of money, I drive a 13-year-old Ford Escape and live in a rented room, but I am saving up for a powerful GPU to match my monitor. Why? Because I love when games are pretty. Especially MSFS 2020, which will be amazing once I get a card that can run 4K smoothly.
It's what you settle for in the end; we all make our choices. But there are also just the laws of physics and ergonomics: 4K is not required in any way to get high graphical fidelity. I run a 3440x1440 (close enough, right...;)) panel, and it's really the maximum height that's comfortable to view; the width is already 'a thing', and only a curve makes it a good fit. 4K has 400 extra pixels in width and 720 in height.

4K struggles with efficiency because you're basically wasting performance on pixels you'll never notice at the supposedly ideal view distance. You'll make a choice between a performance sacrifice for no uptick in graphical fidelity, versus sitting closer to see it all and killing your neck/back. At longer view distances, you can make do with a lower res for the exact same experience. Another issue is scaling: 4K requires it for small text, or it's just unreadable.

It's a thing to consider ;) Not much more; the fact remains 4K is becoming more mainstream, so there's simply more on offer, specifically also OLED. But the above is where the statement '1440p is enough' really comes from. It's a sweet spot, especially for a regular desktop setting. Couch gaming follows a different ruleset, really. But do consider the advantages too: I can still play comfortably at 3440x1440 on a GTX 1080... (!) 4K is going to absolutely murder this card, though. Jumping on 4K is tying yourself to a higher expense to stay current on GPU, or sacrificing more FPS for the wanted IQ.
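To put numbers on that trade-off (plain pixel arithmetic):

```python
# Relative pixel load of common desktop resolutions.
uw_1440 = 3440 * 1440   # 4,953,600
qhd     = 2560 * 1440   # 3,686,400
uhd_4k  = 3840 * 2160   # 8,294,400
print(f"4K is {uhd_4k / uw_1440:.2f}x the pixels of 3440x1440")  # 1.67x
print(f"4K is {uhd_4k / qhd:.2f}x the pixels of 1440p")          # 2.25x
```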

Some reviewers claim it is already a big joke and a cash grab.
Nvidia has every opportunity to tweak the lineup, and the better half isn't even out... They always ran the risk of misfires because they release first.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
4K is becoming more mainstream

Only the PC environment is lagging behind reality; 4K 100% dominates the markets in which it is allowed to develop.
Also, AMD already markets an "8K" (i.e. 4K ultrawide) experience with the Radeon RX 7900 series.

A local store with offers (quantity of offers is in brackets):

[attached screenshot: local store monitor listings by resolution, with offer counts in brackets]
 
Joined
Oct 27, 2020
Messages
797 (0.53/day)
Sooo, basically all they have to do is make a new GPU. Would this be a 4080 Super or Ti? Because I promise you that unless the 4080 launch is next summer, they are already manufactured.
All GPCs are active in the RTX 4080; they just disabled some SMs. All Nvidia has to do is re-enable them on the AD103 dies that can be fully utilized (the rest can be used in future cut-down AD103-based products), and also increase the clocks for the full AD103 parts.
And anyway, my point wasn't what Nvidia will do, but what it could achieve based on AD103's potential...

No.
The 4080 being 60% of the 4090 means that if the 4080 is 100%, the 4090 is 166%.
According to the leak, even an OC'd cut-down RTX 4080 (304 TCs enabled vs the 336 TCs of my higher-clocked full AD103 config...) appears to be only 20% slower than the RTX 4090 in the 3DMark Time Spy Performance preset and 27% slower in the Extreme 4K preset...
You do your math, I will do mine!
For example, the theoretical shading performance delta alone is useless for extracting the performance difference between two models; it's much more complex than that...
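Converting those leaked deltas into the 4090's lead over the cut-down card (the leak numbers themselves are unverified):

```python
# "X% slower" is not the same number as the faster card's lead.
for preset, slower in [("Time Spy Performance", 0.20), ("Time Spy Extreme 4K", 0.27)]:
    lead = 1 / (1 - slower) - 1
    print(f"{preset}: 4090 leads by {lead:.0%}")
# -> Time Spy Performance: 4090 leads by 25%
# -> Time Spy Extreme 4K: 4090 leads by 37% (close to the ~+39% theoretical gap)
```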

[attached screenshot: leaked 3DMark Time Spy results]
 
Joined
Sep 17, 2014
Messages
22,638 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Only the PC environment is lagging behind reality; 4K 100% dominates the markets in which it is allowed to develop.
Also, AMD already markets an "8K" (i.e. 4K ultrawide) experience with the Radeon RX 7900 series.

A local store with offers (quantity of offers is in brackets):

[attached screenshot: local store monitor listings by resolution, with offer counts in brackets]


Context, man; you might need to look that word up.

These posts make no sense whatsoever. The 4080 isn't in the correct place in that chart, obviously, and 'local store offers' tell just about jack shit about where 4K is for gaming. It's marketing; you can find IoT devices like a fridge with '4K support'.

Resolution was, is, and will always be highly variable, especially now with FSR/DLSS. There is also a resolution for every use case; it's not true that the only way is up. Enough is enough.
 