
NVIDIA GeForce RTX 5080 Founders Edition

Joined
Jun 10, 2021
Messages
127 (0.10/day)
This might be a controversial take, but I think the 5080 is an insanely good card. Just overpriced on a price-to-performance scale.

16GB VRAM @ $1k is just cancer. Wish they used 3GB dies. Could justify paying $1k for a 24GB version.
 
Joined
Apr 30, 2020
Messages
1,046 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
He's not talking about a limit on the performance of GPUs. Obviously you can just keep making bigger GPUs with more cores and more memory to keep calculating more polygons faster.

He's talking about the end of 3D rasterization as a means to improve how games look and work. We obviously reached the point years ago where adding more and more polygons to an asset is just not worth the diminishing returns. If you want proof, just look at The Last of Us on PS4 vs "PS5 Pro Enhanced." Looks the exact same despite a 10x increase in compute power. Just dumping more shaders into a GPU so the blade of grass can have 8000 polygons instead of just 5000 is silly.

Ray tracing actually improves how games look, which is why Nvidia is going all-in on ray tracing hardware and AI models to mimic it instead of improving their shader architecture.
Visually this may be true, but read my other comment. These cards are designed as hybrids, with both rasterization & RT hardware. They cannot gain any RT improvements without also increasing rasterization.
There have been only minor gains in RT efficiency over the generations. For Nvidia it's been a steady ~6% gain when matched on ROPs, shaders, tensor & RT cores. Even AMD cards had to increase rasterization while gaining only minor efficiency, around 8% to 10% for RT per generation, when matched the same way.
 

calhau

New Member
Joined
Jan 21, 2025
Messages
6 (0.60/day)
What sort of revisionist history is this? The last time AMD was competitive with the top GPU from Nvidia was the Radeon HD 7970... 13 years ago...

Nvidia has a monopoly because AMD has been producing second-class graphics cards for over a decade.
290 disagrees.
But that's not the point. Nvidia created the fastest cards most of the time, and that somehow created the perception among the wider public that all Nvidia cards were better at every tier. Most people didn't buy high-end cards, and even with a better Radeon option available they went for Nvidia nonetheless. You can't imagine the stigma Radeon cards had in my personal experience building PCs for others.

Competition can't be judged just from the point of view of halo products IMO. If Nvidia didn't have such overwhelming mindshare, Radeon would be more profitable and competition would be much closer.
 
Joined
Jul 13, 2016
Messages
3,449 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
This is definitely up there with one of the worst GPU gens. The performance increase seems to come entirely from the faster VRAM (except for the 5090, with its higher core count and bus size).

Every launch, Nvidia keeps redefining what "good value" is by messing with the SKUs and pricing, to the point where people start thinking a $1,000 xx80 GPU with an 8% improvement over last gen, no efficiency improvement, no increase in VRAM, no improvement to anything else, and an uptick in power consumption is good. They forget that this tier was $700 two gens ago, washed away by the ridiculous $1,200 price tag of the 4080, and that carried a much larger performance uplift.

The lack of an improvement in price-to-performance bodes poorly for gamers: as games become more demanding, the cost to play them at the same settings increases as a result. Normally this "tock" generation is when we'd see an improvement in price-to-perf, but unfortunately Nvidia has decided to keep all those savings for itself. This is what a monopoly looks like.
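The price-to-performance point above is easy to sanity-check with a quick sketch. These are purely illustrative round numbers, not measured data: a $700 xx80 two generations ago as the 1.0x baseline, a $1,200 4080 at roughly 1.5x, and a $1,000 5080 about 8% faster than the 4080.

```python
# Illustrative price/performance comparison with hypothetical round
# numbers; real figures vary by game and resolution.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance per $100 spent."""
    return relative_perf / price * 100

cards = {
    "old xx80 ($700)": perf_per_dollar(1.00, 700),   # baseline tier
    "4080 ($1200)":    perf_per_dollar(1.50, 1200),  # big price hike
    "5080 ($1000)":    perf_per_dollar(1.62, 1000),  # ~8% over 4080
}
for name, value in cards.items():
    print(f"{name}: {value:.3f} perf per $100")
```

Under these assumed numbers, two full generations net only a ~13% perf-per-dollar gain over the $700 tier, which is the stagnation the post is describing.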

If NVIDIA makes that same mistake, some other player will murder them eventually, like what happened with Intel and Ryzen

I really don't think so. The whole reason AMD could crack into the CPU market is that their CPUs could run almost every piece of software along with its respective features.

The same doesn't apply in the GPU market. In a hypothetical scenario where Nvidia gets lazy, AMD still has to overcome the fact that large swathes of software and/or software features either only run on Nvidia or use an Nvidia-accelerated framework. AMD could release a significantly faster product and it would be irrelevant to many professionals who have to use Nvidia because that's what their software essentially requires.

AMD essentially needs to out-execute Nvidia for at least 6 years to try to break the stranglehold Nvidia has, and frankly that's a tall ask for any company no matter how competent. All that assumes Nvidia doesn't play any more tricks, either. They've already gotten away with GPP, and they have no issues throwing money at companies to add features that nuke competitors' performance. I'm sure if Nvidia felt threatened they would go right back to doing that. Mind you, they already exert significant control over which features are included in games to begin with, so they inherently have an advantage when designing their next-gen products, given they know which features they are going to pay game devs to implement. AMD and Intel have to fight from the back foot against a much larger competitor with much more money.

AMD hasn't really cracked all of Intel's markets either; just look at the games OEMs play in the laptop market. They are still keeping AMD CPUs out of higher-end products. I don't think people realize just how hard it is to expect AMD to crack a monopoly as strong as Nvidia's.
 
Last edited:
Joined
Aug 26, 2021
Messages
411 (0.33/day)
Oh dear, what a totally depressing product. AMD allowed Nvidia to do this by abandoning the high end this gen, and now we can all see what the future would be like if Nvidia had zero competition; in fact it would probably be worse.

I do wonder if Nvidia cut the 5080 down too much and left it vulnerable to the 9070. Sure, it won't be a match, but if it's within 10% at 60% of the price, the smart buyers will pick it up instead and let Nvidia keep their fake frames.
 

Contra

New Member
Joined
Jan 26, 2025
Messages
9 (1.80/day)
If you really... etc
We are not talking about that. You do not know what RT is, and you also do not know that HW-RT cores are not needed to calculate it in real time: all of this works fine on regular shaders, provided, of course, they are not as miserable and outdated as in the RTX 40/50 series. In general, if instead of these RT cores they had inserted large register files and shown a little love for the shaders, those shaders would calculate these RT effects at the same speed or faster.

W1zzard
Really good question. For the short term they will be included in the RT section, and not in the "regular" games list. Eventually these two sections will be merged, because RT will become the standard, and I'll have separate summary charts: avg all, raster only, RT only. At least that's what I came up with while thinking about the problem. Happy to discuss this further; please start a new thread.
Even our respected reviewers do not know that UE4 GI and UE5 Lumen are real RT, without the need for HW-RT, and do not list these games as RT, although they should have done so ;)
 
Last edited:
Joined
Oct 19, 2022
Messages
278 (0.33/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
I mean, you could have honestly said: "It is a very big disappointment; 15% after 2 years is literally the worst gen-over-gen performance increase we've ever seen. It should have cost $800 to really make sense."
Nvidia needs to be called out for being that scummy. If people thought the 5090 was a bad value, the 5080 is an even worse value lol
 
Joined
Jul 13, 2016
Messages
3,449 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
This might be a controversial take, but I think the 5080 is an insanely good card. Just overpriced on a price-to-performance scale.

16GB VRAM @ $1k is just cancer. Wish they used 3GB dies. Could justify paying $1k for a 24GB version.

The same argument could be made about nearly any product. At the end of the day, products have set prices and are judged based on them.

Oh dear, what a totally depressing product. AMD allowed Nvidia to do this by abandoning the high end this gen, and now we can all see what the future would be like if Nvidia had zero competition; in fact it would probably be worse.

I do wonder if Nvidia cut the 5080 down too much and left it vulnerable to the 9070. Sure, it won't be a match, but if it's within 10% at 60% of the price, the smart buyers will pick it up instead and let Nvidia keep their fake frames.

Don't start with "AMD did this" nonsense. AMD is not responsible for the actions of Nvidia and is not obligated to make high end cards so you can buy Nvidia cheaper.
 
Joined
Oct 19, 2022
Messages
278 (0.33/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
Worst generational performance uplift award?

If we wait a bit for the real world prices to materialize, not this fake MSRP, I think this could beat the GTX 1080 Ti -> RTX 2080 by a mile!

:p
Absolutely! Worst generational uplift ever... Even the AI performance is disappointing: Lovelace has 4x the AI performance of Ampere, while Blackwell only has 2.5x the AI performance of Lovelace.

Sure, for people playing single-player games like Cyberpunk 2077, Alan Wake 2, God of War, Horizon, etc., the 5090 can "get you 4K@240fps" with MFG, but the latency and input lag are definitely not the same as at a real 240 fps.
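The latency point can be illustrated with a toy model. The simplifying assumption here is that input is only sampled once per really rendered frame, so generated frames improve smoothness but not responsiveness:

```python
# Toy model of why MFG's "240 fps" doesn't respond like real 240 fps.
def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds."""
    return 1000.0 / fps

rendered_fps = 60                            # frames the GPU actually renders
mfg_factor = 4                               # MFG 4x interpolation
displayed_fps = rendered_fps * mfg_factor    # 240 fps shown on screen

print(f"Displayed: {displayed_fps} fps")
print(f"Real 240 fps frame time: {frame_time_ms(240):.1f} ms")
print(f"MFG responsiveness floor: {frame_time_ms(rendered_fps):.1f} ms")
```

The screen updates every ~4 ms, but the game still reacts to input on the ~17 ms cadence of the rendered frames, and in practice frame generation adds some queueing delay on top of that.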
 
Joined
Mar 5, 2024
Messages
137 (0.41/day)
What a joke of a GPU. Total garbage, and this is the first time I've ever said that about any GPU. Even about the 4060, or the barely-any-better 4080 Super... I didn't say a thing.

This card is insane on so many levels. It delivers the performance of an old card, but 2-3 years later. Higher price too. It's nuts. Nvidia is finally a horrible GPU company. Until now, you could still argue. This is the generation where you simply cannot defend them. They crossed the line with the ENTIRE lineup. So many things are wrong. Even the power draw... VRAM... scalping... AIB... I can honestly name 50 DIFFERENT issues. What a TOTAL fail of a generation. You rarely get to see this; at least I haven't. Even the worst cases before were still better LOL.
 
Joined
Aug 18, 2014
Messages
58 (0.02/day)
4080 Ti Super

They couldn't wait, so they released the 4000 again

I don't remember another 80 tier being weaker than the previous gen top of the line
 
Joined
Mar 5, 2024
Messages
137 (0.41/day)
I usually NEVER criticize the reviews here, and I always agree with them, but... um... "Highly Recommended" for this?! Too bad... I expected more from my favorite website. I buy all my stuff based on your recommendations. I'm not even joking. Ofc I read about them, I don't buy stuff blindly. I check the stats and all... but still.

Reading the review, seeing the terrible stats... and then getting a Highly Recommended at the end felt really bad. If there was ever a time to not recommend an Nvidia card, it's probably now. Just knowing the shortage + AIB tax... some of these cards will cost more than $1,600, which could get you a 4090... which IS faster than the 5080. Idk what else to say or think. I know, 4090s don't cost that anymore, but still. What a horrible video card.
 
Joined
Oct 19, 2022
Messages
278 (0.33/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
If NVIDIA makes that same mistake, some other player will murder them eventually, like what happened with Intel and Ryzen
Definitely. They're only doing it because neither AMD nor Intel is going to release a high-end/enthusiast GPU this generation. Let's hope UDNA and even 3rd-gen Intel Arc really kick ass, so that Nvidia goes hard with the RTX 60 series.

I don't know why, but I'm pretty sure Blackwell was originally planned for TSMC 3nm, and for whatever reason (price? capacity?) Nvidia decided it was not worth it.
Also, a lot of companies seem to be skipping 3nm to go straight to 2nm, so I wonder whether the RTX 60 series will be on TSMC 2nm and therefore get a substantial performance increase, or whether they will cheap out again by using a TSMC 3nm node like N3P or N3X.
 
Joined
Dec 28, 2012
Messages
4,142 (0.94/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
I usually NEVER criticize the reviews here, and I always agree with them, but... um... "Highly Recommended" for this?! Too bad... I expected more from my favorite website. I buy all my stuff based on your recommendations. I'm not even joking. Ofc I read about them, I don't buy stuff blindly. I check the stats and all... but still.

Reading the review, seeing the terrible stats... and then getting a Highly Recommended at the end felt really bad. If there was ever a time to not recommend an Nvidia card, it's probably now. Just knowing the shortage + AIB tax... some of these cards will cost more than $1,600, which could get you a 4090... which IS faster than the 5080. Idk what else to say or think. I know, 4090s don't cost that anymore, but still. What a horrible video card.
Bro, it's a GPU review. Calm down. You don't have to buy one if you don't like it.
40% faster than a 3080 4 years later and $300 more.

RIP cost/performance.
So what you're saying is it's great news, as everyone who has bought a GPU in the last 4 years is still good for at least another 2?
 
Joined
Dec 31, 2020
Messages
1,089 (0.73/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
4080 Ti Super

They couldn't wait, so they released the 4000 again

I don't remember another 80 tier being weaker than the previous gen top of the line

Only a 4080 super-duper, not enough for a Ti moniker. It would need 12,288 CUDA cores for a Ti, something tangible.
 
Joined
May 13, 2024
Messages
31 (0.12/day)
Processor Ryzen 7 5800X3D
Motherboard MSI Pro B550M-VC Wifi
Cooling Thermalright Peerless Assassin 120 SE
Memory 2x16GB G.Skill RipJaws DDR4-3600 CL16
Video Card(s) Asus DUAL OC RTX 4070 Super
Storage 4TB NVME, 2TB SATA SSD, 4TB SATA HDD
Display(s) Dell S2722DGM 27" Curved VA 1440p 165hz
Case Fractal Design Pop Air MIni
Power Supply Corsair RMe 750W 80+ Gold
Mouse Logitech G502 Hero
Keyboard GMMK TKL RGB Black
VR HMD Oculus Quest 2
I visially this maybe true but read my other comment. These cards are designed as hybird with both rastizafion & Rt. They cannot gaint any Rt improvements without now increasing rastization.
The has been molinor gains in rt effiency over the generatiojs for nvidia its been a stead 6% gain when matced to rops shaders tenors & rt cores. Even Amd cards had to increase rasterization while gaining minor effinecy or around 8% to 10% for RT per generation when match the same way.

What are you talking about? It's hard to understand what you're even saying. They can't improve RT performance without adding more shader cores? It's the exact opposite. The whole reason RT decreases frame rate is that there aren't enough RT cores to do all the RT calculations. If Nvidia made a GPU that had 2x as many RT cores and 30% fewer shader cores, you'd see massive improvements in RT and especially path-tracing performance.

But until all major game engines change to ray-traced lighting only, rasterization performance will still be important, so Nvidia is just trying to balance RT and shader cores. I would not be surprised if they significantly increased the number of RT cores in the next generation.
 
Joined
Sep 10, 2018
Messages
7,537 (3.23/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
40% faster than a 3080 4 years later and $300 more.

RIP cost/performance.

Not sure what you are looking at, but it's 60% faster at 1080p, up to 73% faster at 4K, for 43% more money...

That's not even the worst comparison.

So what you're saying is it's great news, as everyone who has bought a GPU in the last 4 years is still good for at least another 2?

Pretty sure someone would notice going from 56 to 96 fps.

Also, in 2025 money the 3080 would have been $850, but we all know that for 8-12 months it was double that...
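The disagreement between the two sets of numbers above comes down to the same ratio. A quick sketch using the figures quoted in the posts, treated as assumptions rather than verified benchmarks:

```python
# Change in performance-per-dollar given a perf gain and a price increase,
# using the figures quoted in the posts above (assumed, not verified).
def value_change(perf_gain: float, price_increase: float) -> float:
    """Fractional change in perf/$; inputs are fractions (0.60 = +60%)."""
    return (1 + perf_gain) / (1 + price_increase) - 1

# 3080 ($700) -> 5080 ($1000) is a +43% price increase.
print(f"+40% perf claim:   {value_change(0.40, 0.43):+.1%} perf/$")
print(f"+60% perf (1080p): {value_change(0.60, 0.43):+.1%} perf/$")
print(f"+73% perf (4K):    {value_change(0.73, 0.43):+.1%} perf/$")
```

A +40% gain for +43% more money would indeed mean perf/$ went slightly backwards, while the 60-73% figures imply a modest improvement; which reading is "RIP cost/performance" depends on which benchmark numbers you accept.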
 
Joined
Jun 1, 2010
Messages
456 (0.09/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
GeForce RTX 508.3%.
More like 4080S.12% or 4080S GDDR7.
The winners here are people who weren't on Ada already. Upgrading gen-to-gen has always been a bad decision. No price increase despite inflation is nice, considering the lack of competition. I'm also surprised they squeezed out 10% better efficiency on the same node. Perks of power-efficient GDDR7 vs GDDR6X, I guess.

Yep. Much like the RTX 3000 series, this also seems oriented at an audience that has not upgraded their HW in a while.

But it's still confusing why it's only 10% better efficiency, considering GDDR7 seems significantly more energy- and thermally efficient than GDDR6X.
Technically yes, but it's maxed-out GB203 silicon, so there probably won't be a 5080 "Super" unless they cut down the GB202 die, and if that's the case it would more than likely be a 5080 Ti, not a 5080 Super.
You really have to look at it as: the 4080 Super is what the 4080 should have been.
Fully enabled AD103 for $1000 vs fully enabled GB203 for $1000, naming convention aside, so it's really only a 13% improvement.
There's barely any improvement outside of the better VRAM. Considering how long nVidia was milking people with cut-down "debris" silicon in non-Super cards before they went with "full" Super SKUs, there's no doubt the 5080 Ti won't be seen for a while.
nGreedia shifted the SKUs up with the RTX 4000 series, and now they have done the same with the RTX 5000 series. The RTX 5070 is going to be such a letdown for so many; they will see the xx70 name, but it won't be one.
AMD did absolutely the same, right after nVidia backpedaled on their "4080 12GB" BS, which shamed them (nV) into backing down and agreeing to release it as the 4070 Ti.

This is AMD, after all, that has "entrenched" nVidia's BS pricing. Had they acted independently and set their own prices, nVidia would have been caught unprepared (despite what people used to say about nVidia simply lowering prices to undercut AMD), because nVidia won't reduce prices past a certain point, as they won't give up their 70% margins any time soon. Don't be fooled.
AMD had basically the advantage at every point:
1. They used to be successful with profit margins of 40% and below.
2. Their silicon is on older, more refined, mature, and much cheaper nodes, so they have more room to undercut nVidia, which uses bleeding-edge silicon. Yet they prefer to match nVidia's prices.
3. Their chips are much smaller as well, so this is another point on which to undercut nVidia's counterparts. Yet the same as written above applies here too.
4. As mentioned above, AMD could have kept RDNA3 monolithic (or just refreshed RDNA2 and called it 2.5) by moving it to a superior node and adding good enc/dec performance, power efficiency, and maybe RT. Or just skipped MCM RDNA3 altogether, making it the "temporary" generation instead of RDNA4, and made way for RDNA5/UDNA sooner.

Better than anything AMD will produce in the next 2-4 years.
If AMD's RTG is even still a thing in the next year or so. They might as well be gone while they work on RDNA5/UDNA, or whatever comes next.
So you are basically arguing that if the 4080S hadn't launched, this would be a better card? That fundamentally does not compute; it's illogical to make such an argument.

Nobody cares about raster performance anymore, especially not in the high end.
Don't get me wrong, RT is totally the future, no doubts here. But on the other hand, all the SKUs with RT should have been considered "prototypes", and their R&D should have been done behind closed doors (much like AI/LLMs), up until the very moment they were able to do full RTRT, with all RT effects, at native 2160p or higher, with no upscalers/denoisers, at 60 fps minimum. Until then, this is just turning buyers into beta testers and gouging them with capped, inferior silicon.
Genuinely, I don't think you need the /s tag.
They've proven they will take the greed option over the market-share option time and time again. If the 9070 XT is even vaguely competitive with the 5080, don't be surprised! I hope, foolishly, that the price is in the $480-$550 bracket, as leaked, but I'm not really going to be surprised in the slightest if it's $800.

I'm a pessimist because that way I'm rarely disappointed. By expecting an $800 9070 XT, I'm not going to get upset or annoyed that I waited all this time for nothing of value.
I'm a cynic because my hope and optimism for big tech to do anything that's not self-serving has been crushed, almost without exception, over the last 25 years of working in this industry.
This is not a cynical approach; this is being realistic. The reason the GPU/IT market has ended up in this utter cesspool is people being (overly) naive. People used to buy into the BS that companies fed them. So much coping, wishful thinking, and hype allows these companies' margins to explode with each product generation's launch.

I don't say AMD, or any other company, shouldn't get its deserved profit and pay. But any company which seeks success and "growing the mindshare", especially after making such a claim, should price its products fairly and realistically: cover the R&D expenses, the BOM for the card, and a bit of margin (10%-20% would be enough to penetrate the market), and not set BS prices to satisfy the "loaded" investors/shareholders. At this point, anything that satisfies the latter will inevitably lead to RTG and AMD brand damage anyway. Everyone has to have realistic expectations about the situation, the brand, the company, and the state of the market, including shareholders. They won't get a penny from a dead brand.
AMD has fumbled some significant aspect of every single GPU launch for 15 years. None of their recent moves behind the scenes inspire confidence they know what they are doing with RDNA 4. Expect cards with a mild price/performance advantage in raster vs the Nvidia competition, meaningfully worse RT performance, and a dramatically worse software feature set. They seem super content with lousy second-rate products which fit that description, and I imagine there's some incompetent exec over there with constant shocked Pikachu face that they are permanently stuck at like 10% market share.
They had all the chances and possibilities. RT performance is a joke and a complete disgrace at this stage. Give it another five years and it might be another thing.
AMD's problems right now are power efficiency, codec support, encoding/decoding performance and quality, and allocation. They need to flood the market. Right now their GPUs are not only overpriced as heck, they are barely present as well. Their stock is a drop compared to nVidia's ocean. There's no need for a top-notch node for a "temporary" GPU µarch generation.
It's like one part of AMD puts in the effort and achieves some amazing results, and another part just shatters it all, halts every achievement, and destroys the progress.
 
Joined
Jul 5, 2013
Messages
28,862 (6.83/day)
Holy 15 pages of comments Batman!! This is what happens when you're late to the party..

My 2 cents: this should not have been the 5080. It should have been a 5070 Ti. This is not a $1000 card; $850 is what this card should be priced at.

@ Nvidia, @ Jensen Huang,
Drop the ball did ya? Or did you not actually witness the perf numbers first hand? Before making statements like the ones you made, you should make sure they are truthful and accurate. Otherwise you end up looking like a liar that no one can take seriously.
 
Last edited:
Joined
Jun 13, 2022
Messages
44 (0.05/day)
Bro, it's a GPU review. Calm down. You don't have to buy one if you don't like it.

So what you're saying is it's great news, as everyone who has bought a GPU in the last 4 years is still good for at least another 2?
Yeah that's one way of looking at it. I'm definitely not upgrading anytime soon.
 
Joined
Jan 9, 2025
Messages
34 (1.55/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550 Gaming Plus
Cooling Deepcool LT720 360mm
Memory 32GB Corsair Dominator Platinum DDR4 3600Mhz
Video Card(s) Zotac Nvidia RTX 3090
Storage Western Digital SN850X
Display(s) Acer Nitro 34" Ultrawide
Case Lianli O11 Dynamic Evo White
Audio Device(s) HiVi Swans OS-10 Bookshelf Speakers
Power Supply Seasonic Prime GX-1000 80+ Gold
Mouse Razer Ultimate | Razer Naga Pro
Keyboard Razer Huntsman Tournament Edition
I usually NEVER criticize the reviews here, and I always agree with them, but... um... "Highly Recommended" for this?! Too bad... I expected more from my favorite website. I buy all my stuff based on your recommendations. I'm not even joking. Ofc I read about them, I don't buy stuff blindly. I check the stats and all... but still.

Reading the review, seeing the terrible stats... and then getting a Highly Recommended at the end felt really bad. If there was ever a time to not recommend an Nvidia card, it's probably now. Just knowing the shortage + AIB tax... some of these cards will cost more than $1,600, which could get you a 4090... which IS faster than the 5080. Idk what else to say or think. I know, 4090s don't cost that anymore, but still. What a horrible video card.

Overreaction much? You made it sound like W1zzard ran over your dog, backed up the car, and ran over it again. :rolleyes:
 
Joined
Dec 26, 2006
Messages
3,919 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
@W1zzard

Sorry if this has been asked, but are all the PCBs like this?

Edit: just looked at the Galax review; that PCB looks more run-of-the-mill.

So is it about 50/50, or?

 
Joined
Feb 21, 2006
Messages
2,264 (0.33/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.12.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Seeing the poor uplift for this, I'm baffled by AMD's decision to go and sit in a corner. Just redoing a slightly larger 7900 XTX on N4 would have probably been enough to beat this.
For AMD to release a high-end card this gen, it would need to beat the 4090, not just be a few percent faster than the 5080, which is itself still slower than the 4090.
 
Joined
Jun 1, 2010
Messages
456 (0.09/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
Thank gaud that the benchmarks were tested with the 572 driver. Grazie
Disappointing to see that the 5080 doesn't outperform the 4090 in the games and at the resolution that matter to me. I should have kept my 4090.
It was written on the wall. If you consider that the top SKU, the 5090, has barely 20% more "raw"/"true" performance uplift over the 4090, then there's simply no way for the 5080 to do any better. Especially considering that with almost every new generation, the performance gap between the **90 and the other SKUs like the **80 only increases.

Besides, it is always better to value what one has right now, especially in these uncertain times. There's an old wise saying: "whoever trades has nothing". It basically means that each deal can leave you worse off than what you already had.

Had I a 7900 GRE, a 7800 XT, or at least a 7700 XT, heck, even a 7600, I would be the happiest man alive.

AIB partners will simply make the reference model and other cheaper SKUs in extremely low numbers, and use their parts for higher-end cards that carry a larger profit margin. They have already said that Nvidia set the MSRP too low, and the number of allocated chips is also very low.

A perfect combination for scalping season.
Exactly. The "exciting" HW becomes even more scarce and scalped. The problem is, this gen/refresh of nVidia cards has yet again become the victim/prey of its own compute power, much like during the several crypto crazes. It's much more interesting to the AI crowd than to gamers, who will cope with their existing aging HW. nVidia knows this and uses every opportunity to gouge, because only wealthy clients can justify this price hike.
And it was nVidia who broke the old rule that compute GPUs were only Quadro/Tesla, never gaming cards like GeForce, the latter being prohibited from using the "enterprise" driver. It was nVidia who let prosumers, designers, and video editors buy and use the somewhat inferior gaming graphics cards for compute and professional workloads.
The emergence of CUDA was the first bell signaling nVidia's ambition to become a compute/enterprise monster.
 
Last edited: