
AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors


freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,161 (3.97/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Whatever, bro. I like having two sleeved 8-pins on my 3080 instead of that rushed connector.


*for now* :D Nah, just kidding. Your card isn't that power hungry either.
I would have no worries about plugging in a 600 W GPU either; Team Red is just soft and scared :D
 

AcE

Joined
Dec 3, 2024
Messages
366 (10.46/day)
I like the new connector, it works well for me :)
Reasonable people will be reasonable. :)))) Fact is, millions use the connector just fine; forums like these are just the loud minority, and that's it. And those criticising the connector are mostly people who have never used it and are just FUD-ing.

Team Red is just soft and scared :D
This is a sales tactic, my friend, nothing else. As you can see, a lot of people like to use the old connectors, and with that card the new connector simply isn't needed.

btw, "Team Red" used 2x 8-pin on a dual-GPU card that drew as much as 600 W and was only drivable by the best PSUs, so no, they're a lot of things, but not scared. ;)
 
Joined
Jul 15, 2006
Messages
1,334 (0.20/day)
Location
Noir York
Processor AMD Ryzen 7 5700G
Motherboard Gigabyte B450M S2H
Cooling Scythe Kotetsu Mark II
Memory 2 x 16GB SK Hynix CJR OEM DDR4-3200 @ 4000 20-22-20-48
Video Card(s) Colorful RTX 2060 SUPER 8GB GDDR6
Storage 250GB WD BLACK SN750 M.2 + 4TB WD Red Plus + 4TB WD Purple
Display(s) AOpen 27HC5R 27" 1080p 165Hz curved VA
Case AIGO Darkflash C285
Audio Device(s) Creative SoundBlaster Z + Kurtzweil KS-40A bookshelf / Sennheiser HD555
Power Supply Great Wall GW-EPS1000DA 1kW
Mouse Razer Deathadder Essential
Keyboard Cougar Attack2 Cherry MX Black
Software Windows 10 Pro x64 22H2
I'm fine with an 8-pin or a couple of those on a GPU. I really hope the pricing of this card is reasonable. I don't want to buy another 3080; I know it's a damn good GPU, but 10 GB isn't much by the look of things, even at 1080p, for the foreseeable future.
 
Joined
Aug 21, 2013
Messages
1,966 (0.47/day)
As long as one 8-pin is enough, all is good.
If you need 2 or 3 of those, better go with the new standard IMO.
On most RDNA4 cards it will likely be enough. Even two is OK.
Ain't gonna be much faster than that, even by today's ridiculous standards of +5% being a whopping upgrade. I'd rather skip this generation. 9070 XT is unlikely to be significantly faster than 3090 (3090 Ti if we're feeling really ambitious) and your 3080 isn't really far behind. More sense in waiting for 4080 series or better GPUs to become affordable.
Laugh of the day. Waiting for 4080 performance to be affordable, from Nvidia?
From the last few years' flagships, only 2080 Ti performance can be considered affordable these days.
And that card released seven years ago. 3090 Ti performance may become affordable next year if the new cards keep prices in check. That would be five years, and that's still a big "if".
4090 performance won't be affordable until 2030, going by previous examples taking 5+ years.
This matters in like four games and in five more if we talk absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway) and 3080 has the edge to stay solid in every other title. Especially the ones where DLSS is the only upscaler that works correctly. I would've agreed if that was a comparison with an 8 GB GPU but 10 GB is nowhere near obsolete, also 320-bit bus really helps a lot.
The effect of VRAM limitations cannot be measured in average FPS alone like TPU does. No offense to W1zzard here, but the issue is more complex, as it also requires frame time analysis for every game at every setting, and this is hard to read and takes forever to benchmark.

Still, those who have done it, like Daniel Owen with the 8 GB and 16 GB 4060 Ti, found the 8 GB card having significantly worse frametimes. And it wasn't just four games. In fact, in another thread I did a breakdown of TPU's own performance tests from this year. This was for 8 GB cards and I'll quote it here:
I looked at all the performance benchmark reviews he has posted for this years games.
11 games in total. At 1080p max settings (though not in all games, and without RT or FG) the average memory usage is 7614 MB.
7 games stay below 8 GB at those settings. 4 games go over it.
6 games were run at the lowest settings, 1080p, no RT/FG, and despite that, half of them (3) still go over 8 GB even at these low settings.

Anyone looking at these numbers and seeing how close the average is to the 8 GB limit should really think twice before buying an 8 GB card today.
Next year, likely more than half of the tested games will surpass 8 GB even at 1080p low, no RT/FG, and you have to remember that RT and FG both increase VRAM usage even more. To say nothing of frametimes on those 8 GB cards. Even if the entire buffer is not used up, the frametimes already take a nosedive, or in some cases textures simply refuse to load.

Links:
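To put that average in perspective, here's a quick back-of-the-envelope sketch (assuming an 8 GB card exposes the full 8192 MB, which is optimistic; the 7614 MB figure is from the breakdown above):

```python
# Rough headroom check against an 8 GB frame buffer (8192 MB assumed usable).
avg_usage_mb = 7614        # average 1080p usage across the 11 tested games (quoted above)
capacity_mb = 8 * 1024     # 8 GB card

headroom_mb = capacity_mb - avg_usage_mb
print(headroom_mb)                              # 578 MB left on average
print(round(100 * avg_usage_mb / capacity_mb))  # ~93% of the buffer already used
```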

The leaks we got suggest 9070 XT just barely outperforming 7900 GRE which is roughly 3090/3090 Ti area. This is faster than 3080, sure, but it's not a lot of difference.
That's an 11% increase over the 7900 GRE. About average these days. If it reaches 3090 Ti level then it will be 32%, which is way above average these days and can be considered good. Except for you, it seems, because it's AMD.
This thing will be barely any faster than the 4-year-old 6900 XT in raster, let alone the 7900 XTX that it's supposed to beat for half the price.
Who said it was supposed to beat the XTX? Stop setting false expectations. Current leaks suggest 7900 GRE performance. Not the XT, and much less the XTX.
Old news. Are you seeing any reports from the past months? No.
Doesn't mean it's not still happening. Did terrorist attacks stop because they weren't in the news? No.
If the media gets saturated with the same news, it tends to fade into the background once the initial panic has died down.
The new AMD GPU is still slow and a bad upgrade.
Buying a new GPU only to get more VRAM, without getting more performance, is just stupid.

Better to go 5070 Ti to get a performance boost.

It's only AMD fans who don't know what VRAM allocation and VRAM usage mean, right?

People who never ever buy Nvidia write this trash and BS about VRAM. That's how it goes atm in every tech forum.
Butthurt fans can't take it when Nvidia is top dog here and topics are full of VRAM/price BS from AMD fans.

Better to sound rational than to talk BS about prices and VRAM 24/7 like some butthurt AMD fans.
You are the perfect embodiment of the person in this meme:
8GB.jpg
Which doesn't contradict what I just said. It's barely noticeable. Significant starts from 100%.
Name me the last time a new card offered a 100% performance increase.

From what I remember, it was the 6800 XT over the 5700 XT at 92% according to TPU, but that's also a bit of an unfair comparison, as the 5700 XT was decidedly a midrange card, like RDNA4 will be, while the 6800 XT was a high-end card with a much higher price and specs.

Before that, I could find the 4870 over the 3870 at 119%.
And going even further back, the 8800 GTX over the 7900 GTX, but I don't have percentages, as it was that long ago.

So while 100% has happened a few times in history, it's extremely rare. These days the best we can hope for is around +40%, like the 1080 Ti vs 980 Ti or the 4090 vs 3090 Ti.

I would not say 4090 owners called its performance increase over the 3090 Ti "barely noticeable".
This is just you preempting whatever AMD comes up with as "barely noticeable".
By your own logic, Nvidia's performance upgrades are also "barely noticeable", as most don't even reach the rare 40% mark.
 
Joined
Feb 24, 2023
Messages
3,241 (4.75/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
From the last few years' flagships, only 2080 Ti performance can be considered affordable these days.
If you're broke then yes. The 3080 Ti goes for 450-ish on the aftermarket. It's not ideal, but it's very reasonable for a GPU that can handle anything at 1080p and, if we don't go for the heaviest RT, at 1440p. Or 4K if we don't care for RT at all.
A 4080 going for 500-ish is affordable, and that's the way things will be circa '27. Or, maybe, just maybe, even '26.
because it's AMD.
Couldn't care less. Current gen NVIDIA offerings are also price/performance rubbish, with 4070 Ti upwards being the pinnacle.
Name me the last time a new card offered a 100% performance increase.
Why should I? My point implies the buyer has got an X GPU that offers 100% performance and then, when they are up to upgrade their PC, they buy a Y GPU that offers at least 200% performance for that to be considered a real upgrade in my book. Times when $X GPU of today doubled the performance of $X GPU of yesterday are about 17 years old at this point, give or take. Which doesn't matter because no one said you must buy a new GPU every time something new is released.
By your own logic, Nvidia's performance upgrades are also "barely noticeable", as most don't even reach the rare 40% mark.
True. I hate everything about the state of affairs in NVIDIA SKUs, too. However, as an effective monopolist, NVIDIA are within their rights to do so. AMD should declare a price war, invent something useful that NVIDIA cards cannot do, or do anything else that's impressive to at least save what they've got left of their market share. What they do, however, is release products that barely outperform similarly priced NVIDIA GPUs (no more than 20%, and not even always the case) in the most AMD-favouring scenarios (no upscaling, no RT, no frame generation; things that AMD GPUs of all existing generations do MUCH worse than equally priced NVIDIA SKUs and, what's even funnier, some Intel ones).

Buy a one-trick pony for 500 or a well-rounded GPU for 600? If AMD's plan was to upsell NVIDIA GPUs, they overdid it.
 
Joined
Jun 10, 2014
Messages
3,006 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I don't want to buy another 3080; I know it's a damn good GPU, but 10 GB isn't much by the look of things, even at 1080p, for the foreseeable future.
Why do you people have these irrational fears? Where do you get your figures to determine a very good GPU is bad just because of a number you know nothing about?

And why should VRAM size increase so drastically between generations anyway? Does each pixel on your screen need exponentially more data in order to be rendered?
Let's do some math for a moment:
Consider 4K (3840x2160). Now assume we're rendering a perfect scene with high details, we run 8xMSAA (8 samples per pixel), and we assume every object on-screen has 4 layers of textures, every sample interpolates on average 4 texels, and every object is unique, so every texel is unique, resulting in a whopping 128 average samples per rendered pixel (this is far more than any game would ever do). It will still total just 3037.5 MB uncompressed*. (Keep in mind I'm talking as if every piece of grass, rock, etc. is unique.) So when considering a realistic scenario with objects repeating, lots of off-screen nearby objects cached (mip-levels and AF), etc., ~5 GB of textures, ~1.5 GB of meshes and ~1 GB of temporary buffers would still not fill a VRAM size of 8 GB, let alone 10 GB. Throw in 12-bit HDR, and it would still not be that bad.
*) Keep in mind that with MSAA, lots of the same texels will be sampled. And normal maps are usually much lower resolution and are very highly compressible.
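For anyone who wants to check that figure, here's a minimal sketch of the arithmetic; the one added assumption is 3 bytes per uncompressed texel (24-bit colour), which is what makes the 3037.5 MB number work out:

```python
# Back-of-the-envelope texel budget for the worst-case scene described above.
# Added assumption: 3 bytes per uncompressed texel (24-bit colour).
width, height = 3840, 2160      # 4K render target
msaa_samples = 8                # 8x MSAA
texture_layers = 4              # texture layers per on-screen object
texels_per_sample = 4           # texels interpolated per sample
bytes_per_texel = 3             # assumed uncompressed 24-bit texels

fetches_per_pixel = msaa_samples * texture_layers * texels_per_sample  # 128
total_bytes = width * height * fetches_per_pixel * bytes_per_texel

print(fetches_per_pixel)          # 128 samples per rendered pixel
print(total_bytes / (1024 ** 2))  # 3037.5 MB of unique, uncompressed texel data
```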

So the only logical conclusion would be that if a game struggles with 10 GB VRAM at 1080p, either the game is very poorly designed or the driver is buggy. And as we usually see in such comparisons, it's usually another bottleneck slowing it down.

The effect of VRAM limitations cannot be measured in average FPS alone like TPU does. No offense to W1zzard here, but the issue is more complex, as it also requires frame time analysis for every game at every setting, and this is hard to read and takes forever to benchmark.
If a game is actually running out of VRAM, and the GPU starts swapping just because of that, the FPS wouldn't just drop a few percent, or have slightly higher variance in frame times, it would completely collapse.
When Nvidia releases their usual refreshes of GPUs with slightly higher clocks and memory speeds, and yet they keep scaling in 4K, we can safely conclude that VRAM size isn't a bottleneck.
So whenever these besserwissers on YouTube make their click-bait videos about an outlier game which drops 20-30%, it's a bug, not a lack of VRAM.
 
Joined
Aug 12, 2010
Messages
143 (0.03/day)
Location
Brazil
Processor Ryzen 7 7800X3D
Motherboard ASRock B650M PG Riptide
Cooling Wraith Max + 2x Noctua Redux NF-P14r + 2x NF-P12
Memory 2x16GB ADATA XPG Lancer Blade DDR5-6000
Video Card(s) Powercolor RX 7800 XT Fighter OC
Storage ADATA Legend 970 2TB PCIe 5.0
Display(s) Dell 32" S3222DGM - 1440P 165Hz + P2422H
Case HYTE Y40
Audio Device(s) Microsoft Xbox TLL-00008
Power Supply Cooler Master MWE 750 V2
Mouse Alienware AW320M
Keyboard Alienware AW510K
Software Windows 11 Pro
tldr then: if it ain't broke, don't fix it.
 
Joined
Jun 2, 2017
Messages
9,487 (3.42/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
What confuses me about the VRAM argument is that it was one of the main positives about Intel's new card vs the 4060. Now all of a sudden VRAM does not matter. What happened to all the pain communicated when Hogwarts Launched?
 
Joined
Dec 1, 2022
Messages
303 (0.39/day)
Reasonable people will be reasonable. :)))) Fact is, millions use the connector just fine; forums like these are just the loud minority, and that's it. And those criticising the connector are mostly people who have never used it and are just FUD-ing.
People not wanting to use it are reasonable too. And many didn't want to use it because the first version was a fire hazard, it got updated to 12V-2x6 for a reason.
This is a sales tactic, my friend, nothing else. As you can see, a lot of people like to use the old connectors, and with that card the new connector simply isn't needed.
Not needing a new PSU is a good thing, and the new power connector wasn't needed for any card except the 4090.
btw, "Team Red" used 2x 8-pin on a dual-GPU card that drew as much as 600 W and was only drivable by the best PSUs, so no, they're a lot of things, but not scared. ;)
How many years ago was that?
And btw, "team green" had 2x 8-pin connectors on the RTX 30 series cards and they were tripping the overcurrent limit on power supplies. Yeah, I know it's cool to call AMD users here stupid, but some of us didn't want to deal with the risks, because Nvidia didn't allow AIBs to use a standard connector that's proven to be safe.
 
Joined
Jan 14, 2019
Messages
13,074 (5.98/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Or 100 to 150 fps... :)
Do you notice that? Personally, I don't. Anything above 50-80 FPS (depending on the game) is invisible to me, especially with VRR enabled.

I regretfully admit that I have never used Linux. After my last experience trying to install Win 11 24H2 on my laptop, I am so done with Windows. Now I just need to find the time to make the switch.
No problem. Just pop over there. There'll be plenty of people (myself included) to help. :)
 
Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
People not wanting to use it are reasonable too. And many didn't want to use it because the first version was a fire hazard,
Since you just called yourself reasonable, can you give me some examples of the 12VHPWR causing fires?
 
Joined
Nov 27, 2023
Messages
2,614 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
Unfortunately, Nvidia created this 'standard' because they are planning 600W GPUs.
NVidia didn't really create any standards; they are not the ones who do so. The Ampere connector was proprietary bullshit. The 12VHPWR and the new 12V-2x6 are PCI-SIG created specs with, yeah, input from NVidia and Dell, among others, and were incorporated by Intel into the new ATX spec. The whole point of the exercise was minimizing footprint and creating something better suited to higher power consumption GPUs. The whole hysteria is baffling to me. It works. We know it works. We know the revised version is impossible to "melt" even on fucking purpose. It was tested. People who know more about power supplies and connectors, like John Gerow, have confirmed so. It's fine.
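For a sense of scale, here's a rough per-pin current comparison under the commonly cited ratings (150 W over three 12 V pins for a PCIe 8-pin, 600 W over six 12 V pins for 12VHPWR/12V-2x6); treat the numbers as a sketch of the nominal spec, not measurements:

```python
# Rough per-pin current at the assumed nominal ratings (not measured values).
def amps_per_pin(watts, volts=12.0, current_pins=3):
    return watts / volts / current_pins

print(round(amps_per_pin(150, current_pins=3), 2))  # PCIe 8-pin: ~4.17 A per 12 V pin
print(round(amps_per_pin(600, current_pins=6), 2))  # 12VHPWR / 12V-2x6: ~8.33 A per 12 V pin
```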

Do you notice that? Personally, I don't. Anything above 50-80 FPS (depending on the game) is invisible to me, especially with VRR enabled.
I prefer a tad higher, but it depends on the genre and, realistically, anything above 120 for single-player games and 240 for MP ones is much of a muchness for most players.
 
Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
NVidia didn't really create any standards; they are not the ones who do so. The Ampere connector was proprietary bullshit. The 12VHPWR and the new 12V-2x6 are PCI-SIG created specs with, yeah, input from NVidia and Dell, among others, and were incorporated by Intel into the new ATX spec. The whole point of the exercise was minimizing footprint and creating something better suited to higher power consumption GPUs. The whole hysteria is baffling to me. It works. We know it works. We know the revised version is impossible to "melt" even on fucking purpose. It was tested. People who know more about power supplies and connectors, like John Gerow, have confirmed so. It's fine.
No, it's much more likely that nvidia is shipping cards that will ALL eventually melt (according to the above post) and has to pay the cost of RMA for all of them. Makes sense man :roll:
 
Joined
Dec 1, 2022
Messages
303 (0.39/day)
What confuses me about the VRAM argument is that it was one of the main positives about Intel's new card vs the 4060. Now all of a sudden VRAM does not matter. What happened to all the pain communicated when Hogwarts Launched?
It seems like people quickly forgot about the stuttering, poor frame timing and awful-looking textures required to run Hogwarts Legacy on cards with 8 GB VRAM. There is more to game testing than FPS, and games keep progressing, yet people keep insisting 8 GB is fine, so game devs have to compensate for the majority of hardware configs.
Since you just called yourself reasonable, can you give me some examples of the 12VHPWR causing fires?
There's plenty of examples of the connector melting; melting means something is getting hot enough to short out or catch fire. Someone else posted vids from Northridge Fix, and some of those examples have melted power connectors. I've also seen cards with connectors melted off of the PCB; it's dangerous when the connector gets so hot it melts the solder, and there is no safety mechanism to shut the system down before it gets that hot.
It amazes me that team green users want to ignore logic so hard to defend their favorite brand that they're saying things like "but it's not in the news". You do realize things happen without news coverage, right?
NVidia didn't really create any standards; they are not the ones who do so. The Ampere connector was proprietary bullshit. The 12VHPWR and the new 12V-2x6 are PCI-SIG created specs with, yeah, input from NVidia and Dell, among others, and were incorporated by Intel into the new ATX spec. The whole point of the exercise was minimizing footprint and creating something better suited to higher power consumption GPUs. The whole hysteria is baffling to me. It works. We know it works. We know the revised version is impossible to "melt" even on fucking purpose. It was tested. People who know more about power supplies and connectors, like John Gerow, have confirmed so. It's fine.
Nvidia was likely the main company pushing for a new power connector, since they needed something to fit their weirdly shaped cards. And yes, Intel is a part of PCI-SIG, but funnily enough, they haven't been using the new connector either.
We only know the new revised connector works; there is plenty of evidence the initial version was garbage, didn't even lock into place with a solid enough retention clip, and couldn't be bent to fit in a reasonably sized case. As for someone working for a PSU company saying it's fine, that is expected; I'd rather trust third-party reviewers to tell me it's fine.
 
Last edited:
Joined
Jan 14, 2019
Messages
13,074 (5.98/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
It seems like people quickly forgot about the stuttering, poor frame timing and awful-looking textures required to run Hogwarts Legacy on cards with 8 GB VRAM. There is more to game testing than FPS, and games keep progressing, yet people keep insisting 8 GB is fine, so game devs have to compensate for the majority of hardware configs.
Hogwarts Legacy is a funny game. I remember Hardware Unboxed testing it with a 4 GB and an 8 GB 6500 XT, with the 4 GB showing strange artefacts, texture pop-ins and other oddities, while the 8 GB one didn't.

So then, I popped my 4 GB 6500 XT into my PC to see it for myself, and honestly, I couldn't notice anything weird... which was... weird.
 
Joined
Nov 27, 2023
Messages
2,614 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
I'd rather trust third-party reviewers to tell me it's fine.
Okay. W1zz says it's fine. He had no issues with any version of the connector, across dozens of cards and thousands of replugs. Any new goalposts you would like to choose?
 
Joined
Aug 21, 2013
Messages
1,966 (0.47/day)
If you're broke then yes. The 3080 Ti goes for 450-ish on the aftermarket. It's not ideal, but it's very reasonable for a GPU that can handle anything at 1080p and, if we don't go for the heaviest RT, at 1440p. Or 4K if we don't care for RT at all.
A 4080 going for 500-ish is affordable, and that's the way things will be circa '27. Or, maybe, just maybe, even '26.
So now everyone who won't buy a GPU over 450 is broke?
The 3080 Ti is pointless. 450+ for a five-year-old 12 GB GPU? What a deal! /s
Until AMD's or Intel's midrange soundly beats the 4080, there's little reason for it to cost 500 in '26.
Why should I? My point implies the buyer has got an X GPU that offers 100% performance and then, when they are up to upgrade their PC, they buy a Y GPU that offers at least 200% performance for that to be considered a real upgrade in my book. Times when $X GPU of today doubled the performance of $X GPU of yesterday are about 17 years old at this point, give or take. Which doesn't matter because no one said you must buy a new GPU every time something new is released.
You're the one who brought up this ridiculous number. Now can you provide examples?
Sure, by that logic I can upgrade from an iGPU to a 4090 and get 1000%, but that's not what the average person does.
Buy a one-trick pony for 500 or a well-rounded GPU for 600? If AMD's plan was to upsell NVIDIA GPUs, they overdid it.
Well rounded with limited VRAM and weak RT perf?
If a game is actually running out of VRAM, and the GPU starts swapping just because of that, the FPS wouldn't just drop a few percent, or have slightly higher variance in frame times, it would completely collapse.
When Nvidia releases their usual refreshes of GPUs with slightly higher clocks and memory speeds, and yet they keep scaling in 4K, we can safely conclude that VRAM size isn't a bottleneck.
So whenever these besserwissers on YouTube make their click-bait videos about an outlier game which drops 20-30%, it's a bug, not a lack of VRAM.
No, it does not. Frametimes go haywire before anything else. That's the first indication that something is wrong. FPS drops come after that.
Keep scaling? What are you talking about? I provided TPU's own data. Average VRAM usage is 7.6 GB this year at 1080p with no DLSS/FG/RT, sometimes on low settings. This will only continue to increase.
Oh sure, just blame the games. It's all a bug...
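To show what that looks like in numbers, here's a minimal sketch of the kind of frame-time analysis being asked for; the trace below is made up purely to illustrate the method:

```python
# Minimal sketch of frame-time analysis on a hypothetical trace (milliseconds).
# The values are invented to illustrate how stutter hides in the average.
frametimes_ms = [16.7] * 95 + [45.0, 52.0, 60.0, 48.0, 55.0]  # 5 stutters in 100 frames

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)

# 99th-percentile frame time ("1% low" style metric): sort and take the slowest 1%.
p99_ms = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms))]
one_percent_low_fps = 1000 / p99_ms

print(round(avg_fps, 1))              # ~54 FPS average: only a modest drop from 60
print(round(one_percent_low_fps, 1))  # ~16.7 FPS at the 99th percentile exposes the stutter
```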
 
Joined
Dec 12, 2016
Messages
2,003 (0.68/day)
...invent something useful that NVIDIA cards cannot do, or do anything else that's impressive...
AMD has done four things along these lines already:

1. Powerful console SoC for Sony and Microsoft. The business is cyclic so revenues are down right now until the next Xbox and PS.
2. License graphics tech to smartphone companies like Samsung. Nvidia can't or won't do this.
3. Powerful laptop/SFF SoC combining both CPU and GPU IP. Strix Halo is coming and Nvidia is a long way off from creating their own Apple M# competitor.
4. New chip configurations like stacked ICs, interposers and chiplets. Having experience with these configs paves the way to the future when node shrinks become impossible and monolithic chips are no longer viable. Instinct already uses chiplets.

Not everything GPU related is an RGB desktop gaming rig product. In addition, AMD is working on cool Xilinx follow-ons, and their 3D cache chips are awesome. Instinct is also powerful, but we know Nvidia has a big head start here. Finally, if AMD and Intel push Nvidia out of the sub-$500 discrete GPU space with RDNA4 and Battlemage, their market shares will go up.
 
Joined
Jul 31, 2024
Messages
521 (3.26/day)
People should understand that there are manufacturing processes, and some of what comes out of them is problematic.

The first post I see is the usual "I do not have a problem" post, with the subject
chosen from:
*) operating system XY
*) connector XY
*) product XY
(which implies that the product is totally fine - that 100% of the products are free from defects)

The graphics card manufacturers are just lazy. I know other areas where you have to write 8D reports, recall, scrap, and pay fines for defective connectors.

I wrote here and elsewhere about the "defective cables" from my Enermax power supply. You may look for that topic and read it; I tried to explain the topic in more detail there.
 

AcE

Joined
Dec 3, 2024
Messages
366 (10.46/day)
People not wanting to use it are reasonable too. And many didn't want to use it because the first version was a fire hazard, it got updated to 12V-2x6 for a reason.
Being non-pragmatic isn't reasonable, sorry. And it was never a "fire hazard" unless you mishandled the connector. A little "burning" isn't a fire, btw. The newer connector is idiot-proof, while the other was not; the difference is that the older connector needed you to make sure the cable was properly inserted, while the newer one won't work if you are unable to properly install a cable. :) But even a revision of the current one made this a fact. If you have a 4090 from 2023 or 2024, the likelihood is high that you have one of the idiot-proof ones.
Not needing a new PSU is a good thing, and the new power connector wasn't needed for any card except the 4090.
Depends on the definition of need. Imprecise: the 4080 also needed it because of how Nvidia's reference card was constructed, so you're wrong.
And btw, "team green" had 2x 8-pin connectors on the RTX 30 series cards and they were tripping the overcurrent limit on power supplies.
Source and proof for that? (X) None, you're making it up. :)

Again, millions of RTX 40 series users have 0 problems with the connector; a loud minority won't change the facts. And fantasies won't turn into facts. All connectors are safe if properly used, end of story.
 
Last edited:
Joined
Jun 10, 2014
Messages
3,006 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
No, it does not. Frametimes go haywire before anything else. That's the first indication that something is wrong. FPS drops come after that.
Keep scaling? What are you talking about? I provided TPU's own data. Average VRAM usage is 7.6 GB this year at 1080p with no DLSS/FG/RT, sometimes on low settings. This will only continue to increase.
Oh sure, just blame the games. It's all a bug...
Graphics cards can successfully swap out data that isn't used, but if they start to swap on a frame-by-frame basis, it goes from totally fine to totally unplayable very quickly; there isn't a large margin where there's a lot of stutter but unaffected averages. By the time it truly starts swapping, the frame rate will drop sharply, and any reviewer will notice this.
But when faster graphics cards with the same amount of VRAM keep scaling fine, VRAM isn't the issue; those are the facts.
 
Joined
Jan 14, 2019
Messages
13,074 (5.98/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
AMD has done four things along these lines already:

1. Powerful console SoC for Sony and Microsoft. The business is cyclic so revenues are down right now until the next Xbox and PS.
2. License graphics tech to smartphone companies like Samsung. Nvidia can't or won't do this.
3. Powerful laptop/SFF SoC combining both CPU and GPU IP. Strix Halo is coming and Nvidia is a long way off from creating their own Apple M# competitor.
4. New chip configuration like stacked ICs and chiplets. Having experience with these two configs paves the way to the future when node shrinks become impossible and monolithic chips are no longer viable.
5. Powerful APUs for handheld consoles and SFF devices,
6. Open source technologies like FSR that run on anything,
7. Open source drivers that come integrated into the Linux kernel, making life on Linux a lot easier with an AMD GPU.

Who said AMD doesn't have anything on its own?
 
Joined
Dec 12, 2016
Messages
2,003 (0.68/day)
5. Powerful APUs for handheld consoles and SFF devices,
6. Open source technologies like FSR that run on anything,
7. Open source drivers that come integrated into the Linux kernel, making life on Linux a lot easier with an AMD GPU.

Who said AMD doesn't have anything on its own?
I was just about to come back and add handhelds. The only handheld with Nvidia is the Switch and that SoC is old, old, old. Even the upcoming Nvidia SoC in the Switch 2 is over two years old.
 
Joined
Dec 1, 2022
Messages
303 (0.39/day)
Being non-pragmatic isn't reasonable, sorry. And it was never a "fire hazard" unless you mishandled the connector. A little "burning" isn't a fire, btw. The newer connector is idiot-proof, while the other was not; the difference is that the older connector needed you to make sure the cable was properly inserted, while the newer one won't work if you are unable to properly install a cable. :)
No, it's called being practical; if it isn't broken, don't fix it, as others in this thread have said. Oh yeah, just a little burning, nothing to worry about, lol, besides ruining an expensive GPU, or your whole house, from a connector getting hot enough to melt solder.

And mishandling isn't the issue; the issue is that the connector wasn't idiot-proof. The old 8-pin connector is idiot-proof, because either it's not plugged in all the way and the system won't boot, or it's plugged in and you have a running system, and the 8-pin connector didn't have any issues with melting or burning unless you bought a completely garbage PSU.
Depends on the definition of need. Imprecise: the 4080 also needed it because of how Nvidia's reference card was constructed, so you're wrong.
The definition of need being having the new connector; too many adapters are just untrustworthy IMO. The 4080 didn't need it with a 320 W TDP.
Source and proof for that? (X) None, you're making it up. :)

Again, millions of RTX 40 series users have 0 problems with the connector; a loud minority won't change the facts. And fantasies won't turn into facts. All connectors are safe if properly used, end of story.
You're welcome to go look it up; you never post proof of your claims anyway, so why should I even bother?
Okay. W1zz says it's fine. He had no issues with any version of the connector, across dozens of cards and thousands of replugs. Any new goalposts you would like to choose?
He isn't using the connector in a long-term PC that's up and running for any length of time; IMO a review test bench doesn't count. I want to see a reviewer actually use a card with it in a system, the way most people actually use a graphics card. You must be getting quite the workout from those goalposts, btw.

Edit: Thanks for the laugh reacts; this just confirms how reasonable and mature Nvidia diehards are. Disappointing coming from a mod, though.
 
Last edited:
Joined
Jul 26, 2024
Messages
281 (1.70/day)
People should understand that there are manufacturing processes, and some of what comes out of them is problematic.

The first post I see is the usual "I do not have a problem" post, with the subject
chosen from:
*) operating system XY
*) connector XY
*) product XY
(which implies that the product is totally fine - that 100% of the products are free from defects)

The graphics card manufacturers are just lazy. I know other areas where you have to write 8D reports, recall, scrap, and pay fines for defective connectors.

I wrote here and elsewhere about the "defective cables" from my Enermax power supply. You may look for that topic and read it; I tried to explain the topic in more detail there.
Would it make you happier if my card had burned down, implying that all of them have an issue?
There was no implication of anything there, except maybe that what you're doing is spreading FUD; those connectors are fine as long as you make sure you connect them properly.
 