System Name | S.L.I + RTX research rig |
---|---|
Processor | Ryzen 7 5800X3D |
Motherboard | MSI MEG ACE X570 |
Cooling | Corsair H150i Elite Capellix |
Memory | Corsair Vengeance Pro RGB 3200 MHz 32 GB |
Video Card(s) | 2x Dell RTX 2080 Ti in SLI |
Storage | Western Digital SATA 6.0 SSD 500 GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2 |
Display(s) | HP X24i |
Case | Corsair 7000D Airflow |
Power Supply | EVGA G+ 1600 W |
Mouse | Corsair Scimitar |
Keyboard | Corsair K55 Pro RGB |
Visually this may be true, but read my other comment. These cards are designed as hybrids with both rasterization & RT. They cannot gain any RT improvements without also increasing rasterization.

He's not talking about a limit on the performance of GPUs. Obviously you can just keep making bigger GPUs with more cores and more memory to keep calculating more polygons faster.
He's talking about the end of 3D rasterization as a means to improve how games look and work. We obviously reached the point years ago where adding more and more polygons to an asset is just not worth the diminishing returns. If you want proof, just look at The Last of Us on PS4 vs "PS5 Pro Enhanced." Looks the exact same despite a 10x increase in compute power. Just dumping more shaders into a GPU so the blade of grass can have 8000 polygons instead of just 5000 is silly.
Ray tracing actually improves how games look, which is why Nvidia is going all-in on ray tracing hardware and AI models to mimic it instead of improving their shader architecture.
290 disagrees.

What sort of revisionist history is this? The last time AMD was competitive with the top GPU from Nvidia was the Radeon HD 7970... 13 years ago...
Nvidia has a monopoly because AMD has been producing second-class graphics cards for over a decade.
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ASRock X670E Taichi |
Cooling | Noctua NH-D15 Chromax |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | MSI RTX 4090 Trio |
Storage | P5800X 1.6TB, 4x 15.36TB Micron 9300 Pro, 4x WD Black 8TB M.2 |
Display(s) | Acer Predator XB3 27" 240 Hz |
Case | Thermaltake Core X9 |
Audio Device(s) | JDS Element IV, DCA Aeon II |
Power Supply | Seasonic Prime Titanium 850w |
Mouse | PMM P-305 |
Keyboard | Wooting HE60 |
VR HMD | Valve Index |
Software | Win 10 |
If NVIDIA makes that same mistake, some other player will murder them eventually, like what happened with Intel and Ryzen
We are not talking about that. You do not know what RT is, and you also do not know that HW-RT cores are not needed to calculate it in real time. All of this works fine on regular shaders, provided they are not as miserable and outdated as in the RTX 40/50 series. In general, if large register files and a little love for shaders had gone in instead of these RT cores, they would calculate these RT effects at the same speed or faster.

If you really... etc
Even our respected reviewers do not know that UE4 GI and UE5 Lumen are real RT, without the need for HW-RT, and do not list these games as RT, although they should have done so.

Really good question. For the short term they will be included in the RT section, and not in the "regular" games list. Eventually these two sections will be merged, because RT will become the standard, and I'll have separate summary charts: avg all, raster only, RT only. At least that's what I came up with while thinking about the problem. Happy to discuss this further; start a new thread please.
Processor | AMD Ryzen 7 9800X3D |
---|---|
Motherboard | MSI MPG X870E Carbon Wifi |
Cooling | ARCTIC Liquid Freezer II 280 A-RGB |
Memory | 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30) |
Video Card(s) | MSI GeForce RTX 4090 SUPRIM Liquid X |
Storage | Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink |
Display(s) | AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1) |
Case | CoolerMaster H500M (Mesh) |
Audio Device(s) | AKG N90Q with AudioQuest DragonFly Red (USB DAC) |
Power Supply | Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1 |
Mouse | Logitech G PRO X SUPERLIGHT |
Keyboard | Razer BlackWidow V3 Pro |
Software | Windows 10 64-bit |
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ASRock X670E Taichi |
Cooling | Noctua NH-D15 Chromax |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | MSI RTX 4090 Trio |
Storage | P5800X 1.6TB, 4x 15.36TB Micron 9300 Pro, 4x WD Black 8TB M.2 |
Display(s) | Acer Predator XB3 27" 240 Hz |
Case | Thermaltake Core X9 |
Audio Device(s) | JDS Element IV, DCA Aeon II |
Power Supply | Seasonic Prime Titanium 850w |
Mouse | PMM P-305 |
Keyboard | Wooting HE60 |
VR HMD | Valve Index |
Software | Win 10 |
This might be a controversial take, but I think the 5080 is an insanely good card, just overpriced on a price-to-performance scale.
16GB VRAM @ 1k is just cancer. Wish they used 3GB dies. Could justify paying 1k for a 24GB version.
Oh dear what a totally depressing product. AMD allowed Nvidia to do this by abandoning the high-end this gen and now we can all see what the future would be like if Nvidia had zero competition and in fact probably worse.
I do wonder if Nvidia cut the 5080 down too much and left it vulnerable to the 9070. Sure, it won't be a match, but if it's within 10% at 60% of the price, the smart buyers will pick it up instead and let Nvidia keep their fake frames.
Processor | AMD Ryzen 7 9800X3D |
---|---|
Motherboard | MSI MPG X870E Carbon Wifi |
Cooling | ARCTIC Liquid Freezer II 280 A-RGB |
Memory | 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30) |
Video Card(s) | MSI GeForce RTX 4090 SUPRIM Liquid X |
Storage | Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink |
Display(s) | AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1) |
Case | CoolerMaster H500M (Mesh) |
Audio Device(s) | AKG N90Q with AudioQuest DragonFly Red (USB DAC) |
Power Supply | Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1 |
Mouse | Logitech G PRO X SUPERLIGHT |
Keyboard | Razer BlackWidow V3 Pro |
Software | Windows 10 64-bit |
Absolutely! Worst generational uplift ever... Even the AI performance is disappointing. Lovelace has 4x the AI performance vs Ampere... Blackwell only has 2.5x more AI performance vs Lovelace.

Worst generational performance uplift award?
If we wait a bit for the real world prices to materialize, not this fake MSRP, I think this could beat the GTX 1080 Ti -> RTX 2080 by a mile!
Processor | AMD Ryzen 7 9800X3D |
---|---|
Motherboard | MSI MPG X870E Carbon Wifi |
Cooling | ARCTIC Liquid Freezer II 280 A-RGB |
Memory | 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30) |
Video Card(s) | MSI GeForce RTX 4090 SUPRIM Liquid X |
Storage | Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink |
Display(s) | AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1) |
Case | CoolerMaster H500M (Mesh) |
Audio Device(s) | AKG N90Q with AudioQuest DragonFly Red (USB DAC) |
Power Supply | Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1 |
Mouse | Logitech G PRO X SUPERLIGHT |
Keyboard | Razer BlackWidow V3 Pro |
Software | Windows 10 64-bit |
Definitely. They're only doing it because neither AMD nor Intel are going to release a High-End/Enthusiast GPU this generation. Let's hope UDNA and even Intel Arc 3rd Gen really kick ass, so Nvidia really goes hard with the RTX 60 series.

If NVIDIA makes that same mistake, some other player will murder them eventually, like what happened with Intel and Ryzen
System Name | Skunkworks 3.0 |
---|---|
Processor | 5800x3d |
Motherboard | x570 unify |
Cooling | Noctua NH-U12A |
Memory | 32GB 3600 MHz |
Video Card(s) | asrock 6800xt challenger D |
Storage | Sabrent Rocket 4.0 2TB, MX500 2TB |
Display(s) | Asus 1440p144 27" |
Case | Old arse cooler master 932 |
Power Supply | Corsair 1200w platinum |
Mouse | *squeak* |
Keyboard | Some old office thing |
Software | Manjaro |
bro it's a GPU review. Calm down. You don't have to buy one if you don't like it.

I usually NEVER criticize the reviews here, and I always agree with them, but... um... "highly recommended" for this?! Too bad... I expected more from my favorite website. I buy all my stuff based on your recommendations. I'm not even joking. Ofc I read about them, I don't buy stuff blindly. I check the stats and all... but still.

Reading the review, seeing the terrible stats... and then getting a Highly Recommended at the end felt really bad. If there was a time to not recommend an Nvidia card, it's probably now. Just knowing the shortage + AIB tax... some of these cards will cost more than $1,600, which could get you a 4090... which IS faster than the 5080. Idk what else to say or think. I know, 4090s don't cost that anymore, but still. What a horrible video card.
So what you're saying is it's great news, as everyone who has bought a GPU in the last 4 years is still good for at least another 2?

40% faster than a 3080, 4 years later, and $300 more.

RIP cost/performance.
Processor | E5-4627 v4 |
---|---|
Motherboard | VEINEDA X99 |
Memory | 32 GB |
Video Card(s) | 2080 Ti |
Storage | NE-512 |
Display(s) | G27Q |
Case | DAOTECH X9 |
Power Supply | SF450 |
4080 Ti Super
They couldn't wait, so they released the 4000 again
I don't remember another 80-tier card being weaker than the previous gen's top of the line.
Processor | Ryzen 7 5800X3D |
---|---|
Motherboard | MSI Pro B550M-VC Wifi |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 2x16GB G.Skill RipJaws DDR4-3600 CL16 |
Video Card(s) | Asus DUAL OC RTX 4070 Super |
Storage | 4TB NVME, 2TB SATA SSD, 4TB SATA HDD |
Display(s) | Dell S2722DGM 27" Curved VA 1440p 165hz |
Case | Fractal Design Pop Air Mini |
Power Supply | Corsair RMe 750W 80+ Gold |
Mouse | Logitech G502 Hero |
Keyboard | GMMK TKL RGB Black |
VR HMD | Oculus Quest 2 |
Visually this may be true, but read my other comment. These cards are designed as hybrids with both rasterization & RT. They cannot gain any RT improvements without also increasing rasterization.

There have been only minor gains in RT efficiency over the generations for Nvidia; it's been a steady ~6% gain when matched to ROPs, shaders, tensor & RT cores. Even AMD cards had to increase rasterization while gaining only minor efficiency, around 8% to 10% for RT per generation, when matched the same way.
System Name | His & Hers |
---|---|
Processor | R7 5800X/ R7 7950X3D Stock |
Motherboard | X670E Aorus Pro X/ROG Crosshair VIII Hero |
Cooling | Corsair H150i Elite/ Corsair H115i Platinum |
Memory | Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk |
Video Card(s) | Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090 |
Storage | Lots of SSDs |
Display(s) | A whole bunch OLED, VA, IPS..... |
Case | 011 Dynamic XL/ Phanteks Evolv X |
Audio Device(s) | Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B |
Power Supply | Seasonic Ultra Prime Titanium 1000w/850w |
Mouse | Logitech G502 Lightspeed/ Logitech G Pro Hero. |
Keyboard | Logitech - G915 LIGHTSPEED / Logitech G Pro |
40% faster than a 3080 4 years later and $300 more.
RIP cost/performance.
So what you're saying is it's great news, as everyone who has bought a GPU in the last 4 years is still good for at least another 2?
System Name | Very old, but all I've got ® |
---|---|
Processor | So old, you don't wanna know... Really! |
More like a 4080S +12%, or a 4080S GDDR7. GeForce RTX 508.3%.
Yep. Much like the RTX 3000 series, this also seems oriented at the audience that hasn't upgraded their HW in a while.

Winner here is people who weren't on Ada already. Upgrading gen-to-gen has always been a bad decision. No price increase despite inflation is nice, considering the lack of competition. I'm also surprised they squeezed out 10% better efficiency despite the same node. Perks of power-efficient GDDR7 vs GDDR6X, I guess.
There's barely any improvement outside of better VRAM. Considering how long nVidia was milking people with cut-down "debris" silicon on non-Super cards before they went with "full" Super SKUs, there's no doubt a 5080 Ti won't be seen for a while.

Technically yes, but it's maxed out the GB203 silicon, so there probably won't be a 5080 Super unless they cut down the GB202 die, and if that's the case it would more than likely be a 5080 Ti, not a 5080 Super.

You really have to look at it as the 4080 Super being what the 4080 should have been.

Fully enabled AD103 for $1000 vs fully enabled GB203 for $1000, naming convention aside. So it's really only a 13% improvement.
AMD did absolutely the same, right after nVidia backpedaled on their "4080 12GB" BS, when the shaming actually made them (nV) pull it back and agree to release it as the 4070 Ti.

nGreedia shifted the SKUs up with the RTX 4000 series, and now they have done the same with the RTX 5000. The RTX 5070 is going to be such a letdown for so many; they will see the xx70 name, but it won't be one.
If AMD's RTG is even still a thing in the next year or so, that is. They might as well be gone while they work on RDNA5/UDNA, or whatever comes next.

Better than anything AMD will produce in the next 2-4 years.
Don't get me wrong, RT is totally the future, no doubts here. But on the other hand, all the SKUs with RT should have been considered "prototypes", and their R&D should have been done behind closed doors (much like AI/LLM), up until the very moment they could do full RTRT, with all RT effects, at at least 2160p native, no upscalers/denoisers, at at least 60 fps. Until then, this is just turning buyers into beta testers, and gouging them with capped, inferior silicon.

So you are basically arguing that if the 4080S hadn't launched, this would be a better card? This is fundamentally not computing; it's illogical to make such an argument.
Nobody cares about raster performance anymore, especially not in the high end.
This is not a cynical approach; this is being realistic. The reason the GPU/IT market has ended up in this utter cesspool is people being (overly) naive. People used to buy into the BS that companies fed them. So much coping, wishful thinking, and hype, which lets these companies' margins explode at each product generation's launch.

Genuinely, I don't think you need the /s tag.
They've proven they will take the greed option over the market share option time and time again. If the 9070XT is even vaguely competitive with the 5080, don't be surprised! I hope, foolishly, that the price is in the $480-$550 bracket, as leaked - but I'm not really going to be surprised in the slightest if it's $800.
I'm a pessimist because that way I'm rarely disappointed. By expecting an $800 9070XT I'm not going to get upset or annoyed that I waited all this time for nothing of value.
I'm a cynic because my hope and optimism for big-tech to do anything that's not self-serving has be crushed, almost without exception over the last 25 years of working in this industry.
They had all the chances and possibilities. RT performance is a joke and a complete disgrace at this stage. Give it another five years, and it might be another thing.

AMD has fumbled some significant aspect of every single GPU launch for 15 years. None of their recent moves behind the scenes inspire confidence that they know what they are doing with RDNA 4. Expect cards with a mild price/performance advantage in raster vs the Nvidia competition, meaningfully worse RT performance, and a dramatically worse software feature set. They seem super content with lousy second-rate products which fit that description, and I imagine there's some incompetent exec over there with a constant shocked-Pikachu face that they are permanently stuck at like 10% market share.
Yeah, that's one way of looking at it. I'm definitely not upgrading anytime soon.

bro it's a GPU review. Calm down. You don't have to buy one if you don't like it.

So what you're saying is it's great news, as everyone who has bought a GPU in the last 4 years is still good for at least another 2?
Processor | AMD Ryzen 7 5800X3D |
---|---|
Motherboard | MSI MPG B550 Gaming Plus |
Cooling | Deepcool LT720 360mm |
Memory | 32GB Corsair Dominator Platinum DDR4 3600 MHz |
Video Card(s) | Zotac Nvidia RTX 3090 |
Storage | Western Digital SN850X |
Display(s) | Acer Nitro 34" Ultrawide |
Case | Lianli O11 Dynamic Evo White |
Audio Device(s) | HiVi Swans OS-10 Bookshelf Speakers |
Power Supply | Seasonic Prime GX-1000 80+ Gold |
Mouse | Razer Ultimate | Razer Naga Pro |
Keyboard | Razer Huntsman Tournament Edition |
I usually NEVER criticize the reviews here, and I always agree with them, but... um... "highly recommended" for this?! Too bad... I expected more from my favorite website. I buy all my stuff based on your recommendations. I'm not even joking. Ofc I read about them, I don't buy stuff blindly. I check the stats and all... but still.

Reading the review, seeing the terrible stats... and then getting a Highly Recommended at the end felt really bad. If there was a time to not recommend an Nvidia card, it's probably now. Just knowing the shortage + AIB tax... some of these cards will cost more than $1,600, which could get you a 4090... which IS faster than the 5080. Idk what else to say or think. I know, 4090s don't cost that anymore, but still. What a horrible video card.
Processor | Ryzen 5700x |
---|---|
Motherboard | Gigabyte X570S Aero G R1.1 BiosF5g |
Cooling | Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm |
Memory | Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A) |
Video Card(s) | AMD RX 6800 - Asus Tuf |
Storage | Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX |
Display(s) | LG 27UL550-W (27" 4k) |
Case | Be Quiet Pure Base 600 (no window) |
Audio Device(s) | Realtek ALC1220-VB |
Power Supply | SuperFlower Leadex V Gold Pro 850W ATX Ver2.52 |
Mouse | Mionix Naos Pro |
Keyboard | Corsair Strafe with browns |
Software | W10 22H2 Pro x64 |
System Name | The Expanse |
---|---|
Processor | AMD Ryzen 7 5800X3D |
Motherboard | Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc. |
Cooling | Corsair H150i Pro |
Memory | 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die) |
Video Card(s) | XFX Radeon RX 7900 XTX Magnetic Air (24.12.1) |
Storage | WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB |
Display(s) | LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz |
Case | Fractal Design Meshify S2 |
Audio Device(s) | Creative X-Fi + Logitech Z-5500 + HS80 Wireless |
Power Supply | Corsair AX850 Titanium |
Mouse | Corsair Dark Core RGB SE |
Keyboard | Corsair K100 |
Software | Windows 10 Pro x64 22H2 |
Benchmark Scores | 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d |
For AMD to release a high-end card this gen, it would need to beat a 4090, not just be a few percent faster than a 5080, which is still slower than a 4090.

Seeing the poor uplift for this, I'm baffled by AMD's decision to go and sit in a corner. Just redoing a slightly larger 7900 XTX on N4 would probably have been enough to beat this.
System Name | Very old, but all I've got ® |
---|---|
Processor | So old, you don't wanna know... Really! |
It was written on the wall. If you consider that the top SKU, the 5090, has barely a 20% "raw"/"true" performance uplift over the 4090, then there's simply no way for the 5080 to do any different/better. Especially considering that with almost every next generation, the performance gap between the **90 and other SKUs like the **80 only increases.

Thank gaud that the benchmarks were tested with the 572 driver. Grazie
Disappointing to see that the 5080 doesn't outperform the 4090 in the games and resolutions that matter to me. I should have kept my 4090.
Exactly. The "exciting" HW, becomes even more scarce and scalped. The problem is, this gen/refresh of nVidia cards, has yet again become te victim/pre of it's compute power, much like during several crypro-crazes. And it's much more interesting for AI crowd, than the gamers, that will cope with their existing aging HW. nVidia knows this and uses evey opportunity to gouge, because only wealthy clients can justfy this price hike.AIB partners will simply make Reference model and other cheaper SKUs in extremely low numbers, and use their parts to make higher end cards that have larger margin of profit. They have already said that Nvidia made MSRP too low, and number of allocated chips also very low.
A perfect combination for scalping season.