
NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Substantial power bump per tier? Completely skippable then

looks a lot like Ada SuperDuper

It's a 10%-ish increase, likely due to them wanting to push it a bit since the node isn't significantly better.

Ada can be really power tuned with almost no loss in performance; guessing it'll be the same with this.

My 4090 at 350 W loses almost no performance, definitely not enough to be noticed without using an overlay; at 320 W it's less than 5%.

Guessing the 5090 is going to be some monstrosity that is borderline HPC, with a price to match.
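For anyone curious about doing the same power tuning, here's a minimal sketch (illustrative only, not the poster's exact method) of applying a lower power limit through NVIDIA's NVML Python bindings; the 350 W figure comes from the post above, and the same thing can be done with nvidia-smi -pl 350 or an Afterburner slider.

```python
# Minimal sketch: read and lower the GPU power limit via the NVML Python
# bindings (pip install nvidia-ml-py). Setting the limit needs admin/root
# rights, and the 350 W target is purely illustrative.
import pynvml

TARGET_WATTS = 350

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

name = pynvml.nvmlDeviceGetName(handle)
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"{name}: default {default_mw / 1000:.0f} W, "
      f"allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")

# Clamp the request to what the VBIOS allows, then apply it.
request_mw = max(min_mw, min(TARGET_WATTS * 1000, max_mw))
pynvml.nvmlDeviceSetPowerManagementLimit(handle, request_mw)
print(f"Power limit set to {request_mw / 1000:.0f} W")

pynvml.nvmlShutdown()
```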
 
Joined
Jun 14, 2020
Messages
3,474 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It's a 10%-ish increase, likely due to them wanting to push it a bit since the node isn't significantly better.

Ada can be really power tuned with almost no loss in performance; guessing it'll be the same with this.

My 4090 at 350 W loses almost no performance, definitely not enough to be noticed without using an overlay; at 320 W it's less than 5%.

Guessing the 5090 is going to be some monstrosity that is borderline HPC, with a price to match.
I'm running mine at 320 W with some OC'd memory, and it's 2-3% faster than stock, lol :D

I have no idea why people care about stock power draw, you can just change it in 5 seconds; seriously, who really cares?
 
Joined
Dec 6, 2022
Messages
385 (0.53/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Artic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
500 W TDP for the RTX 5090! Yikes... and I thought my RX 7900 XTX with a 350 W TDP was overkill. When gaming at 3440x1440, ultra settings in Starfield, it does spike to 510 W, but averages 350 W during a session.

Only the hardcore nvidia fans will bear this utter engineering junk. It will have 800-watt spikes and will require 1200-watt PSUs, especially when coupled with a 300-watt Intel oven. :kookoo:
The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption, but magically that is not an issue with Ngreedia in this case. Granted, these are rumors, but still.

Personally, I hated AMD pricing and naming though. Each one should’ve been named a tier less and priced accordingly.

For example, the 7900 XTX should’ve been the 7800 XT and priced at $799.

Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates and packed-in games.
 
Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I'm running mine at 320 W with some OC'd memory, and it's 2-3% faster than stock, lol :D

I have no idea why people care about stock power draw, you can just change it in 5 seconds; seriously, who really cares?

Not every sample will do that though, but I'd bet almost every 4090 can be dropped to 350 W with almost no loss in performance.

Mine boosts to almost 2900 MHz stock and stays there, which is my guess as to why there is some slight performance drop at 350 W; it sticks around 2600 MHz.
 
Joined
Aug 10, 2020
Messages
316 (0.20/day)
Since the 4070 is 200 W TDP and the 4070 Super is 220 W TDP, keeping the 5070 at 220 W TDP sounds good. It'll be a solid performance/watt increase from the new architecture and lithography bump.
 
Joined
May 31, 2016
Messages
4,437 (1.43/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I'd need a much bigger (real) increase to be impressed. Please, no more fake teraflops!
I doubt there will be anything of that kind.
Rather, a performance increase at the expense of power consumption. Really, I'm not holding my breath.
 
Joined
Jun 14, 2020
Messages
3,474 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption,
No, not because of power consumption, because of efficiency. These two are different. Put a 7900 XTX and a 4080 both at 250 W and see which one is faster. That's efficiency. Power draw is irrelevant; a card can draw 2 kilowatts and it's fine, if you can limit it to 300 W and have it still be fast, no issue.
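To make the distinction concrete, here's a tiny worked example (hypothetical numbers; "card A" and "card B" are placeholders, not benchmark results) of what efficiency means when two cards are capped to the same power:

```python
# Efficiency is performance per watt, not the absolute power draw on the box.
def fps_per_watt(avg_fps: float, avg_watts: float) -> float:
    """Average frames per second divided by average board power."""
    return avg_fps / avg_watts

# Two made-up cards, both capped to 250 W in the same scene.
results = {
    "card A @ 250 W": fps_per_watt(avg_fps=112.0, avg_watts=250.0),
    "card B @ 250 W": fps_per_watt(avg_fps=96.0, avg_watts=250.0),
}
for card, eff in results.items():
    print(f"{card}: {eff:.3f} fps/W")
# Whichever card renders more frames at the same cap is the more efficient
# design; how high its stock power limit happens to be is a separate question.
```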

Not every sample will do that though, but I'd bet almost every 4090 can be dropped to 350 W with almost no loss in performance.

Mine boosts to almost 2900 MHz stock and stays there, which is my guess as to why there is some slight performance drop at 350 W; it sticks around 2600 MHz.
The card is memory limited, so yeah, the core dropping clocks doesn't impact performance. Just clock your memory up a bit and it should be faster than stock. I'm running +1400 on mine.
 
Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
The card is memory limited, so yeah, the core dropping clocks doesn't impact performance. Just clock your memory up a bit and it should be faster than stock. I'm running +1400 on mine.

Mine is OC'd to +1200, which is measurable in benchmarks like Time Spy, but in actual games it makes about a 1-2% difference when logging over an hour. For me, clock speed matters more; I see the most gains at 3 GHz, but I'm not running my card locked to 600 W lmao...

It does flip back and forth depending on the game played, though, and on whether I'm running path tracing or pure rasterization.
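For reference, this is roughly how that kind of hour-long logging can be done without a vendor overlay; a rough sketch using the NVML Python bindings (assumes the nvidia-ml-py package is installed; the one-second interval and the CSV file name are arbitrary choices, not anything the poster used).

```python
# Sample core clock, memory clock and board power once per second and dump a
# CSV that can be averaged later. Stop logging with Ctrl+C.
import csv
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "core_mhz", "mem_mhz", "power_w"])
    start = time.time()
    try:
        while True:
            core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
            power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # mW -> W
            writer.writerow([round(time.time() - start, 1), core, mem, power_w])
            time.sleep(1)
    except KeyboardInterrupt:
        pass

pynvml.nvmlShutdown()
```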
 
Joined
Aug 2, 2012
Messages
1,987 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
He doesn't know either. There's nothing really keeping a user of an ATX 3.0 PSU from using these new cards. The connector, even though it's not the newer revision adopted with ATX 3.1, will still work.
Thought as much, as the female connector hasn't changed and is compatible with both revisions.
 
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Try Horizon Zero Dawn. Yeah, the old one. Medium just destroys the graphics, it removes all shadows, etc. I used FSR high instead of native medium on my laptop and it looked way better.
It's from 2017. I highly doubt you need medium graphics to run it properly with a 7900 XT.

It's a 10%-ish increase, likely due to them wanting to push it a bit since the node isn't significantly better.

Ada can be really power tuned with almost no loss in performance; guessing it'll be the same with this.

My 4090 at 350 W loses almost no performance, definitely not enough to be noticed without using an overlay; at 320 W it's less than 5%.

Guessing the 5090 is going to be some monstrosity that is borderline HPC, with a price to match.
It's 10% now, but let's not forget that power requirements have been steadily increasing ever since Turing. The GTX 1060 had a TDP of 120 W. That's x50 card territory now.

I'm running mine at 320 W with some OC'd memory, and it's 2-3% faster than stock, lol :D

I have no idea why people care about stock power draw, you can just change it in 5 seconds; seriously, who really cares?
Because that's what you see before you boot into Windows, install your tools and apply your own power limit, maybe?
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption, but magically that is not an issue with Ngreedia in this case. Granted, these are rumors, but still.

Personally, I hated AMD pricing and naming though. Each one should’ve been named a tier less and priced accordingly.

For example, the 7900 XTX should’ve been the 7800 XT and priced at $799.

Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates and packed-in games.

I have been saying this about the names for years. AMD never hears us.
But it's definitely not only their fault; it's the whole market against them, when we desperately need their healthy competition.
 
Joined
Dec 6, 2022
Messages
385 (0.53/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Artic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
When we desperately need their healthy competition.
But that's the thing, I do think that the 7000 series is competitive up to the 4080.

The “problems”, at least for gamers, are the gimmicks, like FSR, DLSS and RT.

FSR and DLSS plus fake frames are there to cheat over native rendering and hide their perhaps limited performance at the given resolutions.

And the RT hype/push is simply the influencers (formerly known as reviewers) earning their free 4090s. We have fewer than 5 games that properly use RT, yet we are made to believe otherwise.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
But that's the thing, I do think that the 7000 series is competitive up to the 4080.

I don't think that the RX 6400, RX 6500 XT and RX 7600 are exactly competitive. The first two are junk leftovers from some hybrid laptop systems, which need a CPU + dGPU in order to use the CPU's media engine as the default video accelerator, while the latter is too weak - it's a 6nm chip with 0% performance improvement over the older RX 6600/6650 XT.

AMD doesn't innovate, it simply follows nvidia's lead. This is in fact a duopoly, with all the legal consequences that arise from it.

The “problems”, at least for gamers, are the gimmicks, like FSR, DLSS and RT.

FSR and DLSS plus fake frames are there to cheat over native rendering and hide their perhaps limited performance at the given resolutions.

And RT is simply the influencers (formerly known as reviewers) earning their free 4090s. We have fewer than 5 games that properly use RT, yet we are made to believe otherwise.

I don't listen to their marketing BS. I use the classic approach - use low and medium settings where needed to achieve high enough FPS.
 
Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
It's from 2017. I highly doubt you need medium graphics to run it properly with a 7900 XT.


It's 10% now, but let's not forget that power requirements have been steadily increasing ever since Turing. The GTX 1060 had a TDP of 120 W. That's x50 card territory now.


Because that's what you see before you boot into Windows, install your tools and apply your own power limit, maybe?

As the gains from the process node diminish, that is just the reality: the days of getting both a massive performance increase and lower power at the same time are gone, for people who actually want progress when it comes to performance. I am sure whatever the 5060/5070 end up being, anyone who cares will be able to slim down the power target quite a bit.

The 5090 will likely be 80-100% faster than the 3090 Ti at the same or lower power, though, so anyone who wants to pay for it can still get massive performance-per-watt improvements.
 
Joined
Dec 6, 2022
Messages
385 (0.53/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Artic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
I don't think that the RX 6400, RX 6500 XT and RX 7600 are exactly competitive. The first two are junk leftovers from some hybrid laptop systems, which need a CPU + dGPU in order to use the CPU's media engine as the default video accelerator, while the latter is too weak - it's a 6nm chip with 0% performance improvement over the older RX 6600/6650 XT.
My bad, I do agree with that. I should have said the 7800 XT and the infinite variations of the 7900. :)
AMD doesn't innovate, it simply follows nvidia's lead. This is in fact a duopoly, with all the legal consequences that arise from it.
Well, it depends on what you call innovations, but at the same time, I'm still willing to give them a bit more time to get the Radeon group up to the same level as the CPU group.

People forget that less than 10 years ago the company was almost dead (thanks in huge part to Intel's dirty and illegal actions), so they bet everything on Ryzen and are now building up Radeon and other stuff.

You might say CUDA, but remember that AMD bet on OpenCL and everyone bailed; ngreedia sabotaged it in favor of CUDA, etc.

So it does take time, and they simply need a bit more.
I don't listen to their marketing BS. I use the classic approach - use low and medium settings where needed to achieve high enough FPS.
Sadly, you and I are in the minority with that train of thought.
 
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I don't think that the RX 6400, RX 6500 XT and RX 7600 are exactly competitive. The first two are junk leftovers from some hybrid laptop systems, which need a CPU + dGPU in order to use the CPU's media engine as the default video accelerator, while the latter is too weak - it's a 6nm chip with 0% performance improvement over the older RX 6600/6650 XT.
At least it's a fair bit cheaper than the equally bad 4060. I'm not saying it's great, I'm just trying to find some positives, what little there is.
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
As the gains from the process node diminish, that is just the reality: the days of getting both a massive performance increase and lower power at the same time are gone, for people who actually want progress when it comes to performance. I am sure whatever the 5060/5070 end up being, anyone who cares will be able to slim down the power target quite a bit.

The 5090 will likely be 80-100% faster than the 3090 Ti at the same or lower power, though, so anyone who wants to pay for it can still get massive performance-per-watt improvements.
I wouldn't be too quick to say that. AMD isn't really trying, so we only have Nvidia, ergo Huang's blue eyes, to believe this. Nvidia is simply in a position to say this right now. And look how it works for their margins.

Meanwhile, on CPUs, where a real competitor exists and both are REALLY trying to make the best CPUs, we see lots of progress irrespective of the node: chiplets, interconnects, big-little, X3D... and then we see that power really doesn't have to keep going up. Only when a design isn't quite there, such as Intel's E-cores, do we see how far the power budget needs to go to remain competitive.

And hey, look at GPUs. Even upscaling could be considered such a new tech, and it does change the playing field. Too bad Nvidia enforces VRAM/bandwidth product limitations and segmentation, plus an RT push, to make you believe otherwise.
 
Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I wouldn't be too quick to say that. AMD isn't really trying, so we only have Nvidia, ergo Huang's blue eyes, to believe this. Nvidia is simply in a position to say this right now. And look how it works for their margins.

Meanwhile, on CPUs, where a real competitor exists and both are REALLY trying to make the best CPUs, we see lots of progress irrespective of the node: chiplets, interconnects, big-little, X3D... and then we see that power really doesn't have to keep going up.

Yeah, but gaming specifically is slowing down quite a bit. A lot of that has to do with developers, for sure, but we are seeing 5-10%-ish per-year improvements there while getting 50%+ at the top for GPUs each generation, every two years.

AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5...

MT performance in general has been nice, and 3D V-Cache keeps AMD competitive with Intel in gaming, but Intel hasn't shipped a new desktop arch since 2022 and AMD is barely faster depending on the gaming suite benchmarked; not really impressive to me.

To me it feels like decent pricing after launch, of course, has been the only real winner with CPUs; at launch they've all been overpriced as well.

It's really just the $500-and-under market that has gone to shite with graphics cards and pricing in general, not actual improvements at the top generation after generation.
 
Joined
Aug 25, 2023
Messages
374 (0.81/day)
System Name Personal computers
Processor Ryzen 7000, 8000 & 9000 series
Motherboard 3 x B650 boards
Cooling Deep Cool, Cooler Master, Thermal take & Stock air coolers
Memory 5 kits of DDR5 - G.Skill Flare X5, Team T-Create, Adata & XPG Lancer, Patriot Viper
Video Card(s) Asus TUF gaming RX 7900 XTX OC edition / iGPUs
Storage 1 + 2TB T-Force Cardea A440 pro / 2 x Kingston KC3000 1TB / PNY 1TB M.2 / WD 250GB M.2
Display(s) 34 " / 32" / 27" LCDs
Case MSI MPG Sekira 100R / Silverstone Redline mATX / Antec C8
Audio Device(s) Asus Xonar AE 7.1 + Audio Technica -AD500X / Onboard + Creative 2.1 soundbar
Power Supply Corsair RM1000x V2 / Corsair RM750x V2 / Thermaltake 650W GF1
Mouse MSI Clutch GM20 Elite / CM Reaper /
Keyboard Logitech G512 Carbon / MSI G30 Vigor / Ttesports Challenger Duo
The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption, but magically that is not an issue with Ngreedia in this case. Granted, these are rumors, but still.

Personally, I hated AMD pricing and naming though. Each one should’ve been named a tier less and priced accordingly.

For example, the 7900 XTX should’ve been the 7800 XT and priced at $799.

Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates and packed-in games.
Then there is the fact that a lot of RX 7900 XTX cards are factory overclocked as well, and even then, in the majority of cases even more OC headroom can be found with little if any extra power draw, making them even better value for money.
 
Joined
Apr 12, 2013
Messages
7,536 (1.77/day)
AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5.
Not really, no. For desktops I'd still say the biggest issue is bandwidth, and until that is addressed, either with a 4-channel memory controller or something else, the practical limits at the top end will remain roughly the same. Zen is still a pretty lean core, so they can go even wider, but then they will need to make it work with even higher memory speeds or a wider memory interface at the lowest end, which includes desktops.
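As a rough illustration of the bandwidth point (back-of-the-envelope theoretical peaks, not measurements, and the 16-core count is just a hypothetical top desktop part):

```python
# Theoretical peak bandwidth of two 64-bit DDR5 channels, divided per core.
def dual_channel_bw_gbs(mt_per_s: int, bus_bits: int = 128) -> float:
    """Peak bandwidth in GB/s for a 128-bit (dual-channel) DDR5 interface."""
    return mt_per_s * (bus_bits // 8) / 1000

CORES = 16  # hypothetical 16-core desktop part
for speed in (5600, 6000, 8000):
    bw = dual_channel_bw_gbs(speed)
    print(f"DDR5-{speed}: ~{bw:.0f} GB/s total, ~{bw / CORES:.1f} GB/s per core")
```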
 
Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Not really, no. For desktops I'd still say the biggest issue is bandwidth, and until that is addressed, either with a 4-channel memory controller or something else, the practical limits at the top end will remain roughly the same. Zen is still a pretty lean core, so they can go even wider, but then they will need to make it work with even higher memory speeds or a wider memory interface at the lowest end, which includes desktops.

While I am not going to sit here and say 16 cores isn't enough for any mainstream desktop platform (it is), the R7 and R5 have really stagnated in both core counts and MT performance. I was shocked at how bad the 7800X3D was at MT; it honestly didn't feel all that different from, and in some cases was worse than, my 5800X outside of gaming.

Honestly, it was so bad outside of gaming that the headache of the 7950X3D became very appealing.

Hopefully 9000 or 11000 fixes that, but I guess until Arrow Lake is shown it's hard to know; it looks like it will also not see a very impressive MT boost... But if the 9700X loses to a 14700K in MT, that is pretty embarrassing considering how old the Raptor Lake core is, if they still want to price it like an i7... Now, if AMD shifts pricing down to say $329, sure, then it's fine.
 
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Yeah, but gaming specifically is slowing down quite a bit. A lot of that has to do with developers, for sure, but we are seeing 5-10%-ish per-year improvements there while getting 50%+ at the top for GPUs each generation, every two years.

AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5...

MT performance in general has been nice, and 3D V-Cache keeps AMD competitive with Intel in gaming, but Intel hasn't shipped a new desktop arch since 2022 and AMD is barely faster depending on the gaming suite benchmarked; not really impressive to me.

To me it feels like decent pricing after launch, of course, has been the only real winner with CPUs; at launch they've all been overpriced as well.

It's really just the $500-and-under market that has gone to shite with graphics cards and pricing in general, not actual improvements at the top generation after generation.
I guess there's also the fact that one doesn't need more than 8 cores in a mainstream desktop, just like one doesn't need more than a 6700 XT for 1440p and below with sensible graphics settings and FPS expectations. The enthusiast range expanded downwards to the x80, or even x70 level while game hardware requirements haven't increased rapidly like we saw in the '90s and early 2000s.
 
Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I guess there's also the fact that one doesn't need more than 8 cores in a mainstream desktop, just like one doesn't need more than a 6700 XT for 1440p and below with sensible graphics settings and FPS expectations. The enthusiast range expanded downwards to the x80, or even x70 level while game hardware requirements haven't increased rapidly like we saw in the '90s and early 2000s.

Same with what Nvidia has done with VRAM on the lower tiers; I hate seeing stagnation in any form... The low end matters just as much as, if not more than, the high end, and without year-on-year improvements we will see stagnation kinda like the half-decade-plus of mainstream quad cores. But beyond something like Hellblade 2 we might be hitting a limit of what rasterization can do, and only true path tracing or something that hasn't been invented yet will actually lead to any meaningful improvements going forward.

At the very least, I think both you and I can agree they really need to come up with something hardware-agnostic that is much better than TAA at the same performance hit lol...
 
Joined
Jun 19, 2024
Messages
110 (0.67/day)
More interested to see if they stopped sniffing glue; meaning, if this generation will be affordable and have more than enough VRAM.

Have you seen their revenue growth? I’ll take some of that glue please!

With the GPU alone sucking 500 W, this time around I won't be surprised to find a 1 kW PSU being the bare minimum for high-end WS builds (with a single GPU).

Meh. I'm already using 1300 W server supplies. Don't cheap out on power if you want a stable system.

I wouldn't be so rude as to point the finger at normal people who value the better product.

If slow, hot and lacking features is how you define the better product, maybe it's best to have a finger pointed at you. Unless you think Intel makes better CPUs than AMD?
 