
NVIDIA GeForce RTX 5080 Founders Edition

Joined
Jan 9, 2025
Messages
34 (1.55/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550 Gaming Plus
Cooling Deepcool LT720 360mm
Memory 32GB Corsair Dominator Platinum DDR4 3600Mhz
Video Card(s) Zotac Nvidia RTX 3090
Storage Western Digital SN850X
Display(s) Acer Nitro 34" Ultrawide
Case Lianli O11 Dynamic Evo White
Audio Device(s) HiVi Swans OS-10 Bookshelf Speakers
Power Supply Seasonic Prime GX-1000 80+ Gold
Mouse Razer Ultimate | Razer Naga Pro
Keyboard Razer Huntsman Tournament Edition
RT performance is really inconsistent across titles. The 7900 XTX lost a whopping 9% vs. the 3090 Ti since the 4070 review, and is now even getting beaten by the 3090, when it was 15% faster before.

Just a guess, could it be because of the introduction of ray reconstruction?
 
Joined
May 29, 2017
Messages
547 (0.20/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Airpulse SW8
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
If AMD also won't be aggressive with its pricing, these will be a very long and boring two years. That basically means sitting four years on a single generation for me, since a refresh doesn't count.
 
Last edited:
Joined
Sep 6, 2013
Messages
3,483 (0.84/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Noctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Well, Multi-Frame Generation, for now, hardly justifies the price premium. Maybe if they iron out the kinks and it becomes a better story, like how DLSS 1.0 evolved to DLSS 4.0
I was thinking the same, but it will probably be harder to see the kind of evolution that DLSS has seen. And the problem is that even with DLSS, the image quality degradation gets swept under the carpet of "a must-have because it's better than the alternatives and increases framerates". What if multi-frame generation ends up the same: a feature that degrades image quality even further, but is hidden under the same carpet? Even with a version 4.0 of multi-frame generation in a few years, we will be dealing with worse image quality than what we get today. We will probably get better lighting, because frame rates with path tracing will be acceptable, but in the end we will have ghosting and artifacts and who knows what else, with just better(?) lighting.
AMD could pull a rebrand of the RX 7000 series, like they did in the past, lower the asking price a notch, and just keep the RX 9070 XT as their top offering, undercutting the 5080 in price while keeping their latest posture of "no high-end stuff".
There: competition through price, with minimal overhead for AMD.
I'm still crossing my fingers that Intel will go after the 300-400€ price range with an offering above the B580, and maybe a full BMG-20 die to take a shot at 499€.

As for this launch, I guess for now I'll just keep recommending Nvidia's 40-series, whenever prices drop due to discounts, for anyone still on Turing or earlier.
Maybe they don't want to, or can't, sell much. Maybe they don't get enough wafers from TSMC to cover the gaming market, and they prefer to sell little at a profit rather than a lot without one. Even that "a lot without a profit" would be questionable: if there is demand and they don't have wafer supply, prices will go up and everyone will accuse AMD of doing the same s_it Nvidia does by limiting supply. And unfortunately we've all seen that when people suspect AMD of doing the same as Nvidia, all the negativity turns on AMD and Nvidia gets away with it.

Intel needs to fix its manufacturing. What we see now, the limited availability of the B570 and B580, would never have happened if Intel could build those chips in its own fabs. The market would be flooded with those cards. I don't think they have enough wafer supply from TSMC either. I think only two companies enjoy enough wafers from TSMC: Apple and Nvidia.
 
Joined
Sep 10, 2018
Messages
7,537 (3.23/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Really good question. For the short term they will be included in the RT section, and not in the "regular" games list. Eventually these two sections will be merged, because RT will become the standard, and I'll have separate summary charts: avg all, raster only, RT only. At least that's what I came up with while thinking about the problem. Happy to discuss this further, please start a new thread.

Have you thought about removing Elden Ring and Doom Eternal from the RT benchmarks? I know why they are there, but then we won't have people saying they are holding back the 5080... By my napkin math it would only add about 2% vs the 4080 Super if you did, so the gap would still be disappointing.
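For anyone curious, here is a rough sketch of that kind of napkin math; the per-game ratios below are made-up placeholders rather than TechPowerUp's actual results, and only illustrate how dropping two light-RT titles shifts a relative-performance average:

# Rough sketch of the napkin math: how dropping two titles shifts a
# relative-performance average. All per-game ratios below are made-up
# placeholders, not TechPowerUp's actual results.
from statistics import geometric_mean

# Hypothetical 5080-vs-4080-Super ratios per RT title (1.10 = +10%)
rt_results = {
    "Cyberpunk 2077":  1.14,
    "Alan Wake 2":     1.13,
    "F1 24":           1.12,
    "Ratchet & Clank": 1.11,
    "Elden Ring":      1.04,  # assumed to drag the average down (light RT)
    "Doom Eternal":    1.05,  # same assumption
}

full_avg = geometric_mean(rt_results.values())
trimmed = {k: v for k, v in rt_results.items()
           if k not in ("Elden Ring", "Doom Eternal")}
trimmed_avg = geometric_mean(trimmed.values())

print(f"All titles:        +{(full_avg - 1) * 100:.1f}% vs. 4080 Super")
print(f"Without the two:   +{(trimmed_avg - 1) * 100:.1f}% vs. 4080 Super")
print(f"Change in the gap:  {(trimmed_avg / full_avg - 1) * 100:.1f}%")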
 
Joined
Mar 23, 2012
Messages
586 (0.12/day)
Processor Intel i9-9900KS @ 5.2 GHz
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i Elite
Memory 32GB Viper Steel Series DDR4-4000
Video Card(s) RTX 3090 Founders Edition
Storage 2TB Sabrent Rocket NVMe + 2TB Intel 960p NVMe + 512GB Samsung 970 Evo NVMe + 4TB WD Black HDD
Display(s) 65" LG C9 OLED
Case Lian Li O11D-XL
Audio Device(s) Audeze Mobius headset, Logitech Z906 speakers
Power Supply Corsair AX1000 Titanium
Similar to GTX 770 but worse than GTX 580! It's a clear refresh, nothing more.
480 --> 580 and 680 --> 770 weren't even new generations of GPU, making Blackwell look even more pathetic. Those were just Fermi --> Fermi and Kepler --> Kepler.

Lowest generational performance jump ever (at least back to the early '00s when I started paying attention) for a new architecture.
 
Joined
Oct 12, 2005
Messages
726 (0.10/day)
From what I see, the Nvidia/AMD sponsored-title thing these days isn't so much about performance as about what is included in the game. It's not uncommon for a game to only have DLSS or FSR.

We really need a vendor-agnostic solution soon.
 
Joined
Dec 6, 2022
Messages
571 (0.73/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
It has been done like that since forever; I need to cut off at some point, usually at 125% and 75% relative performance. But good point, let me add the 3080, it's an important comparison.

Graphs are rerendering now
Question: do you remember how Anandtech (RIP) had semi-interactive benchmark results, which would help you compare your CPU or GPU (not combined, only one of the two) with newer results?

Would such a tool be possible given all the data that you have collected?

Just place a big disclaimer stating that it won't be 100% accurate for obvious reasons (different hardware combinations), but it would still offer some kind of info for everyone.
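As a minimal sketch of what such a lookup could do on top of already-published relative-performance averages (the cards and numbers below are placeholders, not W1zzard's actual data, and the different-test-system caveat applies):

# Minimal sketch of an Anandtech-Bench-style comparison built on published
# relative-performance averages. Entries are placeholders, not actual review
# data; cross-review comparisons carry the "different test systems" caveat.

# Average relative performance at 4K, normalized to an arbitrary baseline = 100
relative_perf_4k = {
    "RTX 5080": 100,
    "RTX 4080 Super": 89,
    "RX 7900 XTX": 85,
    "RTX 3090": 62,
}

def compare(gpu_a: str, gpu_b: str) -> str:
    a, b = relative_perf_4k[gpu_a], relative_perf_4k[gpu_b]
    return f"{gpu_a} is roughly {a / b:.2f}x the {gpu_b} at 4K (ballpark only)"

print(compare("RTX 5080", "RTX 3090"))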
 
Joined
Jan 10, 2022
Messages
1 (0.00/day)
It feels like a reach to call this "highly recommended".

This release feels like a giant nothing burger. I don't really care about frame gen when it makes latency worse. I want the card to actually render the stupid game at a reasonable frame rate to begin with. Sure, we can use DLSS, which is yet another band-aid unless you're using it to render at a higher resolution than your display, but that also results in artifacts, and I don't want to have to use it.

Not to mention, an xx80 card used to be $750-800 adjusted for inflation, not $1,000 and sure as hell not $1,200. So saying it's $200 less than a card that was absurdly overpriced by $400 isn't really something that belongs in the positives column. The same price as the Super still isn't great, because it's still $200 too much.
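As a back-of-the-envelope version of that inflation point (the launch MSRPs are the commonly cited figures; the cumulative inflation multipliers to 2025 dollars are rough assumptions, not official CPI data):

# Back-of-the-envelope inflation check on past xx80 launch prices.
# MSRPs are the commonly cited launch figures; the inflation multipliers
# to 2025 dollars are rough assumptions, not official CPI numbers.
xx80_launches = {
    # card: (launch MSRP in USD, assumed cumulative inflation factor to 2025)
    "GTX 1080 (2016)": (599, 1.32),
    "RTX 2080 (2018)": (699, 1.26),
    "RTX 3080 (2020)": (699, 1.22),
    "RTX 4080 (2022)": (1199, 1.09),
}

for card, (msrp, factor) in xx80_launches.items():
    print(f"{card}: ${msrp} at launch ~= ${msrp * factor:.0f} in 2025 dollars")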

The ONLY reason anyone should buy this card is if the RTX 5070 Ti and RX 9070 XT just aren't enough for what you're doing and you can justify hundreds of dollars for a tiny improvement. Nobody with a previous-gen 4070 Ti, 4080, or 7900 XT/GRE/XTX should buy it at this price and with this lack of performance gains.

I waited years to buy this generation, and now I'm probably going to wait until the next generation so that I can buy something that's remotely worth the money. Maybe.
 
Joined
May 13, 2024
Messages
31 (0.12/day)
Processor Ryzen 7 5800X3D
Motherboard MSI Pro B550M-VC Wifi
Cooling Thermalright Peerless Assassin 120 SE
Memory 2x16GB G.Skill RipJaws DDR4-3600 CL16
Video Card(s) Asus DUAL OC RTX 4070 Super
Storage 4TB NVME, 2TB SATA SSD, 4TB SATA HDD
Display(s) Dell S2722DGM 27" Curved VA 1440p 165hz
Case Fractal Design Pop Air Mini
Power Supply Corsair RMe 750W 80+ Gold
Mouse Logitech G502 Hero
Keyboard GMMK TKL RGB Black
VR HMD Oculus Quest 2
Question: do you remember how Anandtech (RIP) had semi-interactive benchmark results, which would help you compare your CPU or GPU (not combined, only one of the two) with newer results?

Would such a tool be possible given all the data that you have collected?

Just place a big disclaimer stating that it won't be 100% accurate for obvious reasons (different hardware combinations), but it would still offer some kind of info for everyone.
Isn't that basically what the comparison box in the GPU database is?
 
Joined
Dec 10, 2019
Messages
29 (0.02/day)
Location
United Kingdom
Processor 13700K 5.6GHz 8/16
Motherboard PRO Z690-A DDR4
Memory V10 2x16GB 4100MHz CL15
Video Card(s) 3080 Ti Gamerock
Benchmark Scores 3DMark CPU Pr. Test /No-L. Nitrogen 13700K @6.0 1300-14543 (5.)
I feel a bit disappointed in this GPU :/ . I feel like this card does not deserve the 5080 naming and would have been better called, at best, a 5070 Ti. That's my personal opinion. Don't like it...
That's right, this is the real 5070 Ti upsold. This card could have received a 330W TDP without any meaningful performance loss.
The 5080 Ti will be the legit 5080: a heavily cut-down GB202 die somewhat faster than the 4090, with a fake, seemingly good MSRP that in practice will still be high, plus likely artificial scarcity on top.

At this point Nvidia is not that hard to predict in its market behaviour.
 
Last edited:
Joined
Dec 6, 2022
Messages
571 (0.73/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
Isn't that basically what the comparison box in the GPU database is?
Not entirely.

One, it's not interactive.

Two, as stated by W1zzard, he removes old ones after a certain threshold is reached.
 

calhau

New Member
Joined
Jan 21, 2025
Messages
6 (0.60/day)
Getting mad at consumers/customers here is a really weird take. Monopolies have nearly complete power over their customer base. The 4080 12GB / 4070 Ti had competition from AMD. The 5080 does not. This isn't a "why are customers so braindead" moment, it's a "monopolies suck and we need AMD or someone else to become competitive" moment.

Nvidia has zero competition on the high end, and the AI bubble means they will sell cards at whatever price Nvidia wants even if gamers completely ignore them. Blaming consumers for that is a little... odd.
I'm not mad, I'm baffled!

Nvidia's monopoly doesn't mean that you can't criticise their practices, especially when it worked before.

The problem here is that this community (the one with interest in and knowledge of GPUs), which was vocal about the same problem when it was obvious, is now only mildly disappointed with this card, when it's the same thing happening again, only worse.

Word of mouth counts in the PC market; if the worst this card gets is a "nothing special", it will sell well.
 
Joined
May 13, 2024
Messages
31 (0.12/day)
Processor Ryzen 7 5800X3D
Motherboard MSI Pro B550M-VC Wifi
Cooling Thermalright Peerless Assassin 120 SE
Memory 2x16GB G.Skill RipJaws DDR4-3600 CL16
Video Card(s) Asus DUAL OC RTX 4070 Super
Storage 4TB NVME, 2TB SATA SSD, 4TB SATA HDD
Display(s) Dell S2722DGM 27" Curved VA 1440p 165hz
Case Fractal Design Pop Air Mini
Power Supply Corsair RMe 750W 80+ Gold
Mouse Logitech G502 Hero
Keyboard GMMK TKL RGB Black
VR HMD Oculus Quest 2
That's right, this is the real 5070 Ti upsold. This card could have received a 330W TDP without any meaningful performance loss. The 5080 Ti will be the legit 5080, a heavily cut-down GB202 die which will be somewhat faster than the 4090, with a fake, seemingly good MSRP which in practice will still be high, plus likely artificial scarcity on top.

At this point Nvidia is not that hard to predict in its market behaviour.

It's always hilarious reading these posts. Yes, the $3 trillion company should listen to you, random internet user, regarding how to name and price their products. Your word is the law, you have the divine right to declare that this particular GPU must be named what you choose and coincidentally priced at exactly the amount of money in your checking account. Everybody knows the fundamental constants of the universe: the speed of light, the mass of a proton, the gravitational constant, and that the second-largest chip can only be used to make an xx70 card. We all learned that in school.

Come on man. The names are just branding, and Nvidia is divvying up the chips and pricing them according to what they think the market can handle. If they're wrong, they'll correct it (like they did with the 4070 Ti and 4080). It's fine to not be happy about the price or the performance compared to the last gen, but it's just silly to quibble over the name. And again, if the price is too high, the "street prices" will eventually drop and Nvidia will readjust the lineup.
 
Joined
Jul 31, 2024
Messages
204 (1.11/day)
Processor AMD Ryzen 7 5700X
Motherboard ASUS ROG Strix B550-F Gaming Wifi II
Cooling Noctua NH-U12S Redux
Memory 4x8G Teamgroup Vulcan Z DDR4; 3600MHz @ CL18
Video Card(s) MSI Ventus 2X GeForce RTX 3060 12GB
Storage WD_Black SN770, Leven JPS600, Toshiba DT01ACA
Display(s) Samsung ViewFinity S6
Case Fractal Design Pop Air TG
Power Supply Corsair CX750M
Mouse Corsair Harpoon RGB
Keyboard Keychron C2 Pro
VR HMD Valve Index
Really don't think 'not increasing price' compared against mid-gen refreshes (or more accurately, course corrections) is any kind of upside. Like its price, that is a nominal trait. Not good or bad. Just the extant status quo. If this thing was $2 cheaper than the 4080S at MSRP, sure. Whatever. But it's not. It's a holding pattern, and for barely 12ish% improvement overall and almost nothing for perf/watt? Just not worth it.

See y'all in about 8 months if/when Blackwell SUPER gets announced.
 
Last edited:
Joined
Mar 23, 2012
Messages
586 (0.12/day)
Processor Intel i9-9900KS @ 5.2 GHz
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i Elite
Memory 32GB Viper Steel Series DDR4-4000
Video Card(s) RTX 3090 Founders Edition
Storage 2TB Sabrent Rocket NVMe + 2TB Intel 960p NVMe + 512GB Samsung 970 Evo NVMe + 4TB WD Black HDD
Display(s) 65" LG C9 OLED
Case Lian Li O11D-XL
Audio Device(s) Audeze Mobius headset, Logitech Z906 speakers
Power Supply Corsair AX1000 Titanium
I'm not mad, I'm baffled!

Nvidia's monopoly doesn't mean that you can't criticise their practices, especially when it worked before.

The problem here is that this community (the one with interest in and knowledge of GPUs), which was vocal about the same problem when it was obvious, is now only mildly disappointed with this card, when it's the same thing happening again, only worse.

Word of mouth counts in the PC market; if the worst this card gets is a "nothing special", it will sell well.
I think you are attributing to word of mouth what can only be properly attributed to marketplace competition. If there is no competition for the 5080 except for the even more hilariously expensive 4090 and 5090, word of mouth won't matter.

Just look at the Ampere generation. There was 100% word-of-mouth agreement on tech forums that the crypto-scam prices were absurd. It did nothing: you either got lucky with your F5 key when there was a restock, or you paid scalper prices if you wanted a card.
 
Joined
Jun 19, 2024
Messages
363 (1.61/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Just because there was no outcry doesn't mean Nvidia is going to have any more success than if there was an outcry.
You'll have to check the sales for that. $1,000 for any computer part is too much, so anyone who buys such a thing must convince themselves it's worth it somehow.
I have no idea how anyone will convince themselves a 5080 is worth it, in the same way that a 4080 wasn't either. These are simply rip-offs. If they aren't sexy, they won't sell, unless the market is held captive (which it sort of is).

Explain the RTX 4090, which absolutely nobody needs, and yet it has outsold anything AMD has produced in the last several years. That tells me that $1,000 isn't too much. Price is only part of the value equation.

Just a guess, could it be because of the introduction of ray reconstruction?

I think RR might be slower than other denoisers, just much higher quality.
 

calhau

New Member
Joined
Jan 21, 2025
Messages
6 (0.60/day)
Just because there was no outcry doesn't mean Nvidia is going to have any more success than if there was an outcry.
You'll have to check the sales for that. $1,000 for any computer part is too much, so anyone who buys such a thing must convince themselves it's worth it somehow.
I have no idea how anyone will convince themselves a 5080 is worth it, in the same way that a 4080 wasn't either. These are simply rip-offs. If they aren't sexy, they won't sell, unless the market is held captive (which it sort of is).
The outcry worked, didn't it?

No one can say how well it will sell, but it will sell a lot better than if there had been an outcry.

For a lot of people the perception of a product is more important than the product itself, especially when it's expensive.

Nvidia's monopoly was only possible because of the inflated perception of their products among the wider public.

I built a decent number of PCs for other people when I was younger, and I can tell you that people can be oblivious on this subject.
I had people asking me to put in more expensive parts even when, for their use case, it was just wasting money, or to use Nvidia because they thought Radeon cards were all slower and broke easily.
 

calhau

New Member
Joined
Jan 21, 2025
Messages
6 (0.60/day)
I think you are attributing to word of mouth what can only be properly attributed to marketplace competition. If there is no competition for the 5080 except for the even more hilariously expensive 4090 and 5090, word of mouth won't matter.

Just look at the Ampere generation. There was 100% word-of-mouth agreement on tech forums that the crypto-scam prices were absurd. It did nothing: you either got lucky with your F5 key when there was a restock, or you paid scalper prices if you wanted a card.
On tech forums, yes, but on the other hand the average Joe was bombarded by people and social media telling them crypto was free money; still, some didn't make that mistake because of the tech forums.

For example, I personally dissuaded six people from buying Vegas for crypto, people who had been convinced by others to try their hand at it.

Perception of a product is very important. If everyone on tech forums is just mildly disappointed with the 5080, there's no chance the wider public will notice; and if they don't notice, they will buy, when at least some of them wouldn't if they had a notion of what's happening.

And for Nvidia what matters is profit; if the card sells less but the increased margin outweighs that, it's great for them. The only way to get lower prices out of a monopoly is to create a bad perception of its products so that sales dip.
 
Joined
Sep 27, 2008
Messages
1,221 (0.20/day)
Performs a bit better than the 4080 Super, but also uses a bit more power.

It's like a... 4080 Ti? 4080 Super Duper?
 
Joined
Mar 23, 2012
Messages
586 (0.12/day)
Processor Intel i9-9900KS @ 5.2 GHz
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i Elite
Memory 32GB Viper Steel Series DDR4-4000
Video Card(s) RTX 3090 Founders Edition
Storage 2TB Sabrent Rocket NVMe + 2TB Intel 960p NVMe + 512GB Samsung 970 Evo NVMe + 4TB WD Black HDD
Display(s) 65" LG C9 OLED
Case Lian Li O11D-XL
Audio Device(s) Audeze Mobius headset, Logitech Z906 speakers
Power Supply Corsair AX1000 Titanium
Nvidia's monopoly was only possible because of the inflated perception of their products among the wider public.

What sort of revisionist history is this? The last time AMD was competitive with the top GPU from Nvidia was the Radeon HD 7970... 13 years ago...

Nvidia has a monopoly because AMD has been producing second-class graphics cards for over a decade.
 
Joined
Apr 30, 2020
Messages
1,046 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
If you really are a game(s) programmer then you know we've reached the end of 3D rasterization. Everyone - Nvidia, AMD, Sony, Microsoft, Epic, etc. - has told you this.



You're lying again.
Complete bull crap about a rasterization limit, or a game like Chernobylite wouldn't gain any performance in supported mGPU mode with two RTX 4090s.

Now that I can agree with.

There was a comment above about RT handling the draw distance. No, that is more of a bandwidth and throughput problem.
You cannot overcome that problem when the ROPs share all of those resources at the same render time. Hence, when Nvidia said they lowered inter-RT-core latency by half or more, I said it wouldn't matter for the final render, and I was correct.
 
Last edited:

Contra

New Member
Joined
Jan 26, 2025
Messages
9 (1.80/day)
alright, i'll feed the troll. what would you recommend to me if i had $800 to $1200 to spend?
Of course the 5070, because its performance exceeds the 4090's!

You do these reviews, and we all read them; you don't need to take users for naive simpletons, as Jensen does. You should just say outright that you will not receive any future cards for review if you do not present this wretchedness in a better light than it actually deserves.

And you definitely know the answer to your question, but you wrote it wrong ;)

The other baffling thing with RDNA3 was the shaders. They doubled the shader count per CU for, apparently, zero benefit. The RT performance per CU isn't any better than RDNA2, but it DOES bloat the die sizes up for no reason.
It seems you are out of the loop, even though you are posting on this enthusiast forum!
If you read the C&C reviews, you would know that it was not the number of shaders that was doubled but the number of ALUs in them, as AMD has repeatedly pointed out. And yes, they do work and increase performance under very specific, known conditions. And yes, this has nothing to do with the HW RT cores.
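To make the "very specific conditions" part concrete, here is a deliberately simplified sketch of the dual-issue idea: two FP32 ops per lane can only be packed into one issue slot when they are independent and both fall into a narrow instruction subset. The eligibility set and dependency rules below are illustrative assumptions, not the actual RDNA3 VOPD encoding constraints.

# Deliberately simplified sketch of dual-issue pairing. The eligible opcode
# set and the dependency check are illustrative assumptions, not the real
# RDNA3 VOPD encoding rules.
ELIGIBLE = {"fma", "mul", "add", "mov"}  # assumed dual-issue-capable subset

def can_dual_issue(op1, op2):
    """Each op is (opcode, destination_register, set_of_source_registers)."""
    n1, d1, s1 = op1
    n2, d2, s2 = op2
    independent = d1 != d2 and d1 not in s2 and d2 not in s1
    return n1 in ELIGIBLE and n2 in ELIGIBLE and independent

# Two independent multiplies: candidates for packing into one issue slot.
print(can_dual_issue(("mul", "v0", {"v4", "v5"}),
                     ("mul", "v1", {"v6", "v7"})))  # True
# The second op consumes the first op's result: must issue serially.
print(can_dual_issue(("mul", "v0", {"v4", "v5"}),
                     ("add", "v2", {"v0", "v8"})))  # False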
 
Joined
May 13, 2024
Messages
31 (0.12/day)
Processor Ryzen 7 5800X3D
Motherboard MSI Pro B550M-VC Wifi
Cooling Thermalright Peerless Assassin 120 SE
Memory 2x16GB G.Skill RipJaws DDR4-3600 CL16
Video Card(s) Asus DUAL OC RTX 4070 Super
Storage 4TB NVME, 2TB SATA SSD, 4TB SATA HDD
Display(s) Dell S2722DGM 27" Curved VA 1440p 165hz
Case Fractal Design Pop Air MIni
Power Supply Corsair RMe 750W 80+ Gold
Mouse Logitech G502 Hero
Keyboard GMMK TKL RGB Black
VR HMD Oculus Quest 2
Complete bull crap about a rasterization limit, or a game like Chernobylite wouldn't gain any performance in supported mGPU mode with two RTX 4090s.
He's not talking about a limit on the performance of GPUs. Obviously you can just keep making bigger GPUs with more cores and more memory to keep calculating more polygons faster.

He's talking about the end of 3D rasterization as a means to improve how games look and work. We obviously reached the point years ago where adding more and more polygons to an asset is just not worth the diminishing returns. If you want proof, just look at The Last of Us on PS4 vs "PS5 Pro Enhanced." Looks the exact same despite a 10x increase in compute power. Just dumping more shaders into a GPU so the blade of grass can have 8000 polygons instead of just 5000 is silly.

Ray tracing actually improves how games look, which is why Nvidia is going all-in on ray tracing hardware and AI models to mimic it instead of improving their shader architecture.
 
Joined
Jun 18, 2021
Messages
2,640 (2.00/day)
NVIDIA's FE cooler uses five double-length heatpipes. The heatsink provides cooling not only for the GPU, but also for the memory chips and VRM circuitry.

This is a much simpler cooling design than on the RTX 5090, which used liquid metal and a vapor-chamber.

Isn't it basically the same?
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,056 (2.48/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Isn't it basically the same?
That's what I thought, and what's being reported by other reviewers: exact same cooler, vapor chamber, just no liquid metal.
 