
RTX 5080 - premature review - it sucks

Joined
Jan 14, 2019
Messages
14,234 (6.41/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanical
VR HMD Not yet
Software Linux gaming master race
Wouldn't surprise me at all if they separated the cards and you'd need to buy a rasterization card plus a utility card that does all the specialty stuff like AI, RT, etc. $1,000 for the raster card but only $500 for the utility card, lol. What a deal.
That's two separate architectures, which increases R&D costs significantly. Not gonna happen. Even AMD is moving towards unifying their gaming and compute architectures after RDNA 4.

Nvidia had something similar with the GTX 16-series as an intermediary solution, with mixed success.
 
Joined
Mar 29, 2023
Messages
1,237 (1.81/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
That's two separate architectures, which increases R&D costs significantly. Not gonna happen. Even AMD is moving towards unifying their gaming and compute architectures after RDNA 4.

Nvidia had something similar with the GTX 16-series as an intermediary solution, with mixed success.

The 16 series would have been a huge success if they hadn't only made them as low-tier cards. A chip the same size as the 2080 Ti with only raster cores would have outperformed the 2080 Ti by a lot, and sold like hotcakes. Nvidia was obviously aware of this, and thus didn't do it.
 

Rungar

New Member
Joined
Feb 8, 2025
Messages
3 (1.50/day)
That's two separate architectures, which increases R&D costs significantly. Not gonna happen. Even AMD is moving towards unifying their gaming and compute architectures after RDNA 4.

Nvidia had something similar with the GTX 16-series as an intermediary solution, with mixed success.
Perhaps, but they weren't up against a technology wall like they are now. If AMD isn't even bothering and the gains are trivial, somebody knows something. Once the general public knows they're up against the wall, the "new revolutionary solution" will be to sell you two cards, not one, each with an eye-watering price tag! After all, RT, AI, frame gen, etc. are completely optional.

The smart move would be for AMD to move the RT and AI-type stuff to the CPU, in place of onboard graphics, and let their cards do the brute-force work. It wouldn't be long before there was a dedicated RT core, a few AI cores, frame-gen cores, etc. Makes sense, since higher core counts aren't that useful in games at the moment.
 
Joined
Aug 3, 2006
Messages
299 (0.04/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
I'll try again (same post, same topic).

LTT, Hardware Unboxed, and Gamers Nexus 100% have all those cards.

A random YouTube guy most likely does not have that hardware. This is in response to the nonsense video linked in post #337: open that channel via the link in post #337, scroll down, and pick a random video.

View attachment 383957


Do you really think someone can afford such hardware?
(I saw this a few minutes ago; I will not share where.)
Estimated YouTube revenue for 200,000 views on a tech video channel: about $100. That works out to $0.0005 per view.
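As a quick sanity check of that estimate (taking the poster's own rough figures of $100 per 200,000 views as a given, not verified numbers):

```python
# Per-view revenue from the poster's rough figures.
total_revenue_usd = 100
views = 200_000

per_view = total_revenue_usd / views
print(per_view)  # 0.0005 USD per view, i.e. a $0.50 CPM
```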

I love this clickbait nonsense from that channel:


i5 3470 vs i5 13400F - 11 Years Difference


i7 1st 870 vs i7 14th 14700K - 14 Years Difference

--

I sold my AM4 platform in early 2023.

I therefore bought, and sold again quite soon, an MSI GTX 960 4GB card, which I used with a Ryzen 5800X / Ryzen 3 3100.

That card had issues with Subnautica even at the lowest settings, and big issues with the game Encased.
I own The Last of Us Part 1 and Star Wars Jedi: Survivor; those games are much more demanding than Subnautica.

Just nonsense.

You subtly accuse him of lying by saying he can't afford the hardware, when it literally says in his profile that he owns a computer store. His estimated income from YouTube is $1k–16k per month. Besides all of that, what exactly is off about the video that makes you skeptical? That the 7900 XTX is that close to the 5080? Because that actually is the case, and these videos are nice for actually seeing what the cards are rendering, rather than just looking at a chart.
 