Tuesday, December 31st 2024

AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors

AMD will continue using traditional PCI Express power connectors for its upcoming Radeon RX 9000 series RDNA 4 graphics cards, according to recent information shared on the Chiphell forum. While there were some expectations that AMD would mimic NVIDIA's approach, which requires the newer 16-pin 12V-2×6 connector for its GeForce RTX 50 series, the latest information suggests a more traditional power approach. AMD plans to release its next generation of graphics cards in the first quarter, but most technical details remain unknown. The company's choice to stick with standard power connectors follows the pattern set by its recent Radeon RX 7900 GRE, which demonstrated that conventional PCI Express connectors can adequately handle power demands of up to 375 W. The standard connectors also eliminate the need for adapters, a feature AMD could highlight as an advantage. An earlier leak suggested that the Radeon RX 9070 XT can draw up to 330 W of power at peak load.
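The wattage figures above follow directly from the PCIe power-delivery budgets. As a minimal sketch (the helper name is ours, not from any spec), assuming the commonly cited CEM limits of 75 W from the x16 slot, 75 W per 6-pin, and 150 W per 8-pin plug:

```python
# Commonly cited PCIe CEM power budgets: 75 W from the x16 slot,
# 75 W per 6-pin plug, 150 W per 8-pin plug.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_budget(six_pins=0, eight_pins=0):
    """Maximum rated board power for a given connector loadout (illustrative helper)."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# Two 8-pins plus the slot give the 375 W ceiling the RX 7900 GRE demonstrated:
print(board_power_budget(eight_pins=2))  # 375
# which comfortably covers the leaked 330 W peak draw of the RX 9070 XT.
```

This is why a 330 W card needs no new connector: the conventional complement already has roughly 45 W of rated headroom.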

Intel reportedly cited similar reasons for using standard power connectors on its Arc "Battlemage" graphics cards, suggesting broader industry support for maintaining existing connection standards. NVIDIA takes a different approach, reportedly requiring all board partners to use the 12V-2×6 connector for the RTX 50 series and removing the option of traditional PCI Express power connectors. In contrast, AMD's decision gives its manufacturing partners more flexibility in their design choices, and MBA (Made by AMD) reference cards don't enforce the new 12V-2×6 power connector standard. Beyond the power connector details and a general release timeframe pointing to CES, AMD has revealed little about the RDNA 4 architecture's capabilities. Only the reference card's physical appearance and naming scheme appear to be finalized, leaving questions about performance specifications unanswered; early underwhelming performance leaks should be treated as unreliable until final drivers and optimizations land.
Sources: Chiphell, via HardwareLuxx

133 Comments on AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors

#101
AcE
Hecate91No it's called being practical, if it isn't broken don't fix it as others in this thread have said.
Then why waste your time in a tech forum if you're so extremely conservative that you want to hinder advancement? :laugh: No, you're just against it because of misinformation.
Hecate91Oh yeah just a little burning, nothing to worry about lol, besides ruining an expensive GPU or your whole house from a connector getting hot enough to melt solder.
If you are unable to install a GPU maybe leave it to professionals? Again millions of people did it right, a few people didn’t, and you’re so unreasonable to side with those. :laugh:
Hecate91And mishandling isn't the issue
Oh too bad it absolutely and 100% is. Gamers Nexus confirmed this, as did other tech tubers.
Hecate91The definition of need being
You’re not defining the language my friend.
Hecate91You're welcome to go look it up,
Too bad it's your argument, and if you won't do the work to back your words up, I will just again say you made it up. :) The RTX 30 series had minimal problems along its run. And I always backed up my words; you also made that up. :) Unlucky. Contrary to you, I never make extreme statements to begin with, so I never run the risk of saying something I can't back up.
Posted on Reply
#102
evernessince
AcEIt's fine because with a 330 W max it can easily get away with 2x 8-pin connectors, though this isn't particularly great; they won't need to include an adapter then. That's it. NVIDIA, on the other hand, needs to use the new connector because its high-end GPUs are too power hungry and would need 4x 8-pin connectors, which is too much; they want a small PCB for their reference cards and there's no room for that. This is the whole reason they invented the small connector in the first place, for the small PCB design of the 3090 back then. That connector was capable of just 300 W; it then evolved into the industry-standard 600 W connector used on PSUs, the 3090 Ti, and the 40 series. And that one evolved into the new one that is now used with newer PSUs and, soon, the 50 series.
The lack of PCB space theory explains why they couldn't use 4 8-pins but it creates obvious questions like:

1) Why not simply have better safety margins on the new connector (12VHPWR, 12V2X6)? Surely you aren't implying they didn't have enough space for a slightly more beefed up connector.
2) What's stopping them from using 2 connectors?

I don't see any engineer not asking these obvious questions, and it's quite apparent the decisions regarding the safety margin and quality of this new connector were not made by the engineers.

Reading through this thread, a lot of the comments for the new connector (and, ironically, VRAM) seem to boil down to "it isn't an issue for me or anyone I know and therefore it isn't an issue". On those issues, though, both sides can be correct at the same time: it not being an issue for you and the people you know doesn't mean it isn't an issue. It essentially amounts to people with ownership entitlement and feelings punching down the opinions of others and ignoring facts.

Regardless of opinions, we should all be able to agree on two things: 1) Any increase in failure rate, particularly in a power cable, is unacceptable. 2) It would be better for customers if Nvidia included more VRAM.

If you cannot agree on the two above, your feelings have overwritten facts.
Posted on Reply
#103
freeagent
Like installing your CPU vertically, then wondering why it smokes when you turn it on.

Blame the manufacturer for making such a stupid design where that could happen.
Posted on Reply
#104
AcE
evernessince1) Why not simply have better safety margins on the new connector (12VHPWR, 12V2X6)? Surely you aren't implying they didn't have enough space for a slightly more beefed up connector.
The ironic thing is that the first connector NVIDIA invented, the one for the vanilla 3090 with a max output of only 300 W, was fine; AFAIK no problems were reported. The problems started with the 4090, with people who did not properly insert the connector. Gamers Nexus tried to reproduce this failure but was hardly able to, having to leave the cable halfway (!) in, so obviously even clumsy users would have seen it. But why be clumsy in the first place when installing a GPU? It's ultra-obvious, especially with expensive cards, that you must be cautious, and they weren't. With a later revision they fixed it by shortening the sense pins, so now when the cable isn't fully seated the GPU simply has no power connection and won't even start. = Idiot-proof, as I mentioned before.
evernessince2) What's stopping them from using 2 connectors?
This would've been necessary only with the old 12-pin connector of the 3090, but they hastily developed it into the 3090 Ti's connector, which has double the power limit at 600 W and a 660 W risk limit; beyond that it will eventually start to burn. So why not? Because, again, they want to use one connector so everything is tidier, easier, and simpler. 1 connector > 2-4. If you have one of the newer PSUs, it's great; if not, well, lol, have fun with the adapters and that mess, but at least it works (I'm using it like that).
evernessinceI don't see any engineer not asking these obvious questions and it's quite apparent decisions were made not by the engineers in terms of the safety margin or quality of this new connector.
Who else? The engineers made this, and they did not test it enough to make it idiot-proof from the get-go. Only rev 2 is, along with the brand-new one released not long ago for the RTX 50 series. The difference is that the old connector was much simpler and built to be idiot-proof, while this one was much less so. Less simple, less idiot-proof. It approached the level of the old connector only after 2022, with rev 2.
evernessinceFor those issues though, both sides can be mutually correct without being wrong.
That's not how life works. Or science. It is based on facts. Those who say the connector is bad have either 1) no proper reasons (they just don't like it and don't want it, plain and simple) or 2) improper reasons like burning, which is due to user error. That makes it quite obvious who in this thread has the correct statement and who doesn't. Facts > *. This isn't personal and this isn't down to opinion; you're mistaken if you think so.
evernessinceRegardless of opinions, we should all be able to agree on two things: 1) Any increase in failure rate, particularly in a power cable, is unacceptable. 2) It would be better for customers if Nvidia included more VRAM.
The fact people brought the VRAM argument into this again shows how lost these people are; if they were right they would lean back and relax. ;) VRAM is off topic here, so I'm not addressing it further, and the mods should delete that stuff. And I agree with 1); that's why it was fixed after 2022. It seems a lot of people here are not well informed and missed that.
Posted on Reply
#105
igormp
Onasi@Hecate91
Okay. How about two 3090s that have been running in two workstations in my lab at work for three years now? In fairly shitty cramped Dell cases, by the way. Working off mid as hell PSUs, too. Fucking bizarrely, my workplace still stands and nothing burned down. I am sure it’s just a fluke, though, and my experience is irrelevant. As I have been reliably informed, after all:
Your 3090s have that weird connector? I thought that was only on the Founders 3090 Ti, not the regular model.
Both of mine have regular 2x8-pin, which I run off a 850W PSU.
Posted on Reply
#106
MrMeth
Macro DeviceIn the perfect world, they would've been squeezing ~330 W off one 8-pin (AWG14) so add ~70 W from the PCI-e interface on top of that and only the hungriest GPUs would've needed more than one.
But I doubt it'll be enough for the 9070 XT.

This matters in like four games and in five more if we talk absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway) and 3080 has the edge to stay solid in every other title. Especially the ones where DLSS is the only upscaler that works correctly. I would've agreed if that was a comparison with an 8 GB GPU but 10 GB is nowhere near obsolete, also 320-bit bus really helps a lot.

The leaks we got suggest 9070 XT just barely outperforming 7900 GRE which is roughly 3090/3090 Ti area. This is faster than 3080, sure, but it's not a lot of difference.
That's not true. I own both a 3080 10 GB and a 12 GB, and there are games that run better on the 12 GB card. The consoles both have more VRAM than 12 GB, so it's only a matter of time until a 3080 doesn't cut it anymore, and it won't be because of processing power, it will be because of VRAM.
Posted on Reply
#107
AusWolf
MrMethThe consoles both have more Vram then 12 gig
Which ones? As far as I know, neither the PS5 (Pro) nor the Xbox whatever-is-the-latest-model has any VRAM, only 16 GB of system RAM.
Posted on Reply
#108
JustBenching
Hecate91There's plenty of examples of the connector melting; melting means something is getting hot enough to short out or catch fire. Someone else posted vids from Northridge Fix, and some of those examples have melted power connectors. I've also seen cards with connectors melted off of the PCB. It's dangerous when the connector gets so hot it melts the solder and there is no safety mechanism to shut the system down before it gets that hot.
It amazes me that team green users want to ignore logic so hard to defend their favorite brand that they're saying things like "but it's not in the news". You do realize things happen without news coverage, right?
So there are 0 cases of fire, glad we agree. Therefore calling it a fire hazard is, well, let's just call it not accurate.

But as you've said yourself, just because it's not in the news doesn't mean it ain't happening, happy reading. These are not 12vhpwr, I assure you, they are the safe non fire hazardous 8pins :D

www.reddit.com/search/?q=8pin+melted+connector&cId=3fb1a911-885f-44b8-83dc-15ccf54a651e&iId=9f25cd7f-5b64-43e5-b785-31556aea5336
Posted on Reply
#109
Hecate91
AcEThen why waste your time in a tech forum if you're so extremely conservative that you want to hinder advancement? :laugh: No, you're just against it because of misinformation.
Why should I want advancement just for the sake of something being new? I'm against something not being well tested enough when something more robust and proven exists.
AcEIf you are unable to install a GPU maybe leave it to professionals? Again millions of people did it right, a few people didn’t, and you’re so unreasonable to side with those. :laugh:
So now you're going to resort to insults and accuse me of not being able to install a GPU, nice.
The issue is the connector is designed with a lesser margin of safety, and less tolerance compared to the 8 pin connector.
Saying it works fine for you is just confirmation bias and bringing your feelings for the brand you like into the argument.
AcEOh too bad it absolutely and 100% is. Gamers Nexus confirmed this, as did other tech tubers.
Gamers Nexus was the only one to scientifically test the connector, probably more than NVIDIA did, because it seems like if the engineers had tested the connector correctly they would've found its issues. Although scientific testing doesn't always reflect real-life usage, again, there is little margin for safety compared to the 8-pin connector.
AcEYou’re not defining the language my friend.
Neither are you.
AcEToo bad it's your argument, and if you won't do the work to back your words up, I will just again say you made it up. :) The RTX 30 series had minimal problems along its run. And I always backed up my words; you also made that up. :) Unlucky. Contrary to you, I never make extreme statements to begin with, so I never run the risk of saying something I can't back up.
The thread is already off topic and I'm not going to discuss something off topic with links. But if you consider 3090s getting bricked by a game, and the 3090 Ti overheating because of VRAM placed on the back of the card, minimal issues, then it's clear you want to bring feelings into it. And I would consider saying 8 GB is enough, without proof, an extreme statement, because many tech YouTubers have debunked that.
JustBenchingSo there are 0 cases of fire, glad we agree. Therefore calling it a fire hazard is, well, let's just call it not accurate.

But as you've said yourself, just because it's not in the news doesn't mean it ain't happening, happy reading. These are not 12vhpwr, I assure you, they are the safe non fire hazardous 8pins :D

www.reddit.com/search/?q=8pin+melted+connector&cId=3fb1a911-885f-44b8-83dc-15ccf54a651e&iId=9f25cd7f-5b64-43e5-b785-31556aea5336
If you have to go looking for a melted 8-pin connector, then you should know what the problem is. Like I said, it's confirmation bias from people who want to ignore the facts.
Posted on Reply
#110
evernessince
AcEThe ironic thing is that the first connector NVIDIA invented, the one for the vanilla 3090 with a max output of only 300 W, was fine; AFAIK no problems were reported. The problems started with the 4090, with people who did not properly insert the connector. Gamers Nexus tried to reproduce this failure but was hardly able to, having to leave the cable halfway (!) in, so obviously even clumsy users would have seen it. But why be clumsy in the first place when installing a GPU? It's ultra-obvious, especially with expensive cards, that you must be cautious, and they weren't. With a later revision they fixed it by shortening the sense pins, so now when the cable isn't fully seated the GPU simply has no power connection and won't even start. = Idiot-proof, as I mentioned before.
You kind of answer your own question: an issue exists with this connector that doesn't exist with others. People do not set out to damage their very expensive cards.

FYI there are reports of 3090 connectors burning up as well: nvidia/comments/yj4sga
That it wasn't as prevalent was probably due to the defect in the original 12VHPWR connector that allowed it to run only partially plugged in, and to the fact that the connector was not as widely rolled out. 12V2X6 claims to fix that particular defect.
AcEWho else? The engineers made this, and they did not test it enough to make it idiot-proof from the get-go. Only rev 2 is, along with the brand-new one released not long ago for the RTX 50 series. The difference is that the old connector was much simpler and built to be idiot-proof, while this one was much less so. Less simple, less idiot-proof. It approached the level of the old connector only after 2022, with rev 2.
There are several forces outside the engineers designing a project that get significant influence over a product's design.
AcEThat's not how life works. Or science. It is based on facts, so if those that say that the connector is bad, 1) no proper reasons, they just don't like it, dont wan't it, plain and simple 2) then improper reasons like burning, which is due to user error, make it quite obvious who in this thread has the correct statement and who not. Facts > *.
"me and my friends don't have an issue with the connector" is not mutually exclusive with statements like "this connector has a higher failure rate". Both can be true. One is an opinion while the other is looking at the broader picture using data.
AcEThis isn't personal and this isn't down to opinion, you're mistaken if you think so.
You are mistaken, I pointed out that comments for the new connector are based on opinion because they are in fact opinions. The evidence against the connector is data driven, it shows the connector has a higher failure rate.

You agree later in your comment with my point that any increase in failure rate in a power cable is unacceptable. If you have data that shows otherwise please share. Otherwise you've presented a contradiction here.
AcEThe fact people brought the VRAM argument into this again shows how lost these people are; if they were right they would lean back and relax. ;) VRAM is off topic here, so I'm not addressing it further, and the mods should delete that stuff. And I agree with 1); that's why it was fixed after 2022. It seems a lot of people here are not well informed and missed that.
Sigh, there are plenty of examples of TPU and HWUB benchmarks showing VRAM usage exceeding 8GB (used, not allocated). There are videos showing the ill effects of such.

Regardless of whether a game is seeing a performance dip or not, it's wild that you cannot agree that consumer cards would be better off with a bit more VRAM.
Posted on Reply
#111
AcE
Hecate91Why should I want advancement just for the sake of something being new? I'm against something not being well tested enough when something more robust and proven exists.
It is fine you don't get the point of 1 cable over 2-4, and I won't explain it to you. ;)
Hecate91So now you're going to resort to insults and accuse me of not being able to install a GPU, nice.
Wrong, the way I formulated it was as a question. But sure, "everything" is an "insult" nowadays. ;)
Hecate91The issue is the connector is designed with a lesser margin of safety, and less tolerance compared to the 8 pin connector.
The issue is you're talking about 2022 connectors, but we are in the year 2025~. ;) Interestingly, the few people who bought the 3090 Ti had no such issues, as this was first reported with the 40 series. Seems those users properly installed the cable!
Hecate91Saying it works fine for you is just confirmation bias and bringing your feelings for the brand you like into the argument.
Too bad my words are based on facts and science; it has nothing to do with anything else. I would totally agree if you had a point. "Feelings for the brand", how funny; I'm too old to care about brands anymore. I like some, I fanatic none. Maybe you're projecting your own issues onto me? It's pretty typical nowadays. ;) I'm not you.
Hecate91Gamers Nexus was the only one to scientifically test the connector,probably more than Nvidia did because it seems like if the engineers correctly tested the connector they would've found issues with it. Although scientific testing doesn't always apply to real life usage, again there is little margin for safety compared to the 8 pin connector.
Again, there's more than enough safety on the connector since rev 2, from after 2022. How many times have I mentioned that now? Ignorant much?
Hecate91The thread is already off topic and I'm not going to discuss something off topic with links. But if you consider 3090's getting bricked from a game, and the 3090Ti overheating because of VRAM being placed on the back of the card minimal issues then its clear you want to bring feelings into it. And I would consider saying 8GB is enough without proof, as being an extreme statement because many tech youtubers have debunked that.
A lot of statements without proof again. And all brands have issues; I never said it was perfect, I merely talked down your extremism. ;) Your takes are just not very sensible.
Posted on Reply
#112
Zazigalka
Legitimate question: is this connector really that much of a concern, or are some of AMD's damage-control people taking their best shot at scaring others with possible issues on high-end Blackwell, knowing AMD will sit this round out?
Posted on Reply
#113
AcE
evernessinceYou kind of answer your own question
I didn't ask a question, read properly next time. ;)
evernessinceFYI there are reports of 3090 connectors burning up as well:
Too bad this is with custom connectors, and one in a zillion doesn't prove your moot point. As the other guy mentioned, even examples of 8-pin connectors burning and whatnot can be found. ;) You're making a fallacious argument of utterly low relevance.
evernessince12V2X6 claims to fix that particular defect.
Again, maybe try to read properly next time: the older connector already fixed that with rev 2, after 2022, not only the 12V2X6.
evernessinceThere are several forces outside the engineers designing a project that get significant influence over a product's design.
So you think they noticed shortcomings and said, "Oh fine! Let's release it anyway!"? Unlikely. More likely it's what I already said: the product was tested in a limited manner; this is called quality assurance. And as it was rushed for the 3090 Ti, we already know why that happened: there wasn't more time.
evernessince"me and my friends don't have an issue with the connector" is not mutually exclusive with statements like "this connector has a higher failure rate". Both can be true. One is an opinion while the other is looking at the broader picture using data.
And what exactly does this have to do with my statements? Nothing. It was never about "me and my friends" for me; go back and reread, perhaps. Generally you seem to have issues reading; after multiple times it's pretty obvious now.
evernessinceYou are mistaken, I pointed out that comments for the new connector are based on opinion because they are in fact opinions. The evidence against the connector is data driven, it shows the connector has a higher failure rate.
And water is wet, you said nothing new here.
evernessinceSigh, there are plenty of examples of TPU and HWUB benchmarks showing VRAM usage exceeding 8GB (used, not allocated). There are videos showing the ill effects of such.
Sigh, you're still off topic, and VRAM is still a non-issue on most cards (meaning: nearly all cards) ever released, within their relevant time frame of usage. Go discuss it with people who have too much time; I have already said plenty on the matter. Go read the other thread on this topic. ;)
Posted on Reply
#114
Macro Device
TomorrowSo now everyone who won't buy a GPU over 450 is broke?
Not over. Around. Do you not know what "ish" means?
Anyway, with the progress slowing down, this is not a biggie. Compare spending $250 on a GPU that will run out of steam in 3 years (RTX 2080 Ti, roughly $83 a year) and spending $450 on a GPU that'll be fine for 6 years (RTX 3080 Ti, roughly $75 a year).
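The dollars-per-year framing above is simple amortization. As a quick sketch of that arithmetic (the function name is ours, prices and lifespans as stated in the post):

```python
def cost_per_year(price, years_of_service):
    """Amortized GPU cost: purchase price spread over its useful life (illustrative)."""
    return price / years_of_service

print(round(cost_per_year(250, 3)))  # 83  (the $250 / 3-year example)
print(round(cost_per_year(450, 6)))  # 75  (the $450 / 6-year example)
```

On this framing, the pricier card is cheaper per year as long as it stays usable proportionally longer.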
TomorrowYou're the one who brought up this ridiculous number. Now you can provide examples?
1. You need to prove it's ridiculous.
2. I don't wanna explain my point in even more basic language than I already did.
But if you insist, then let's say some guy bought a GTX 1080 Ti in 2017 and at some point in 2024 upgraded to a 4080, because he only just now ran out of games that run fine on the 1080 Ti. This roughly translates to a +170% base performance boost with many more features supported. Thus, a serious upgrade. Massive, even. The same guy will be fine with his 4080 until around the mid-2030s (when it dies) and will upgrade only because his GPU has passed. Why do I think so? Because in the current state of the market, nothing suggests rapid gen-to-gen improvements.
TomorrowWell rounded with limited VRAM and weak RT perf?
Not limited enough to be a concern, unless we're talking about the 4070 Ti, which is too expensive for the VRAM it sports.
RT performance in AMD GPUs doesn't even exist so whatevs.
DavenAMD has done four things along these lines already:
I was talking about the discrete desktop GPU market ONLY. This is the market where AMD is happily committing suicide, and I don't know how or why that could be unclear from my statement.
Posted on Reply
#115
3valatzy
This investigation digs deep into the 12VHPWR and 12V-2x6 specifications, highlighting the contradictory design documents leading to confusion as manufacturers cut corners. We talk about the CableMod recall of its angled adapters, including a deep-dive failure analysis into its solutions that we hired a lab for. We also cover PCIe 6/8-pin melting failures of the past and the differences with 12VHPWR. For this content, we collaborated with Aris of Cybenetics (Hardware Busters), Der8auer, Elmor of Elmor Labs, and others to fact check the research. With the NVIDIA GeForce RTX 5090 on the horizon, now is a good time to revisit the 12VHPWR standard (and its follow-ups, like 12V-2x6) to try and come to an understanding as to what it all means. This also covers the differences between 12VHPWR and 12V-2x6, alongside other connector standards.
Posted on Reply
#116
AusWolf
Hecate91Why should I want advancement just for the sake of something being new? I'm against something not being well tested enough when something more robust and proven exists. [...]
evernessinceYou kind of answer your own question, an issue exists with a certain connector that doesn't with others. [...]
Guys, I wouldn't bother replying to our friend any further if I were you. He will only deflect, keep insisting that you're wrong with no proof, and resort to offensive language upon failure. A total waste of your time.
Posted on Reply
#117
eidairaman1
The Exiled Airman
katziIf the prices are good, I might end up with a Radeon GPU to replace my 3080.

My last Radeon GPU was a 4890 that I overclocked to 1 GHz, LOL.
You missed out; the 290, what a card.
Posted on Reply
#118
evernessince
AusWolfGuys, I wouldn't bother replying to our friend any further if I were you. He will only deflect, keep insisting that you're wrong with no proof, and resort to offensive language upon failure. A total waste of your time.
Yep, I realized that after he started hurling insults. The thread should be locked. Amazing that these are adults.
Posted on Reply
#119
AusWolf
evernessincethese are adults.
I'm not so sure about that.
Posted on Reply
#120
freeagent
Piggy-backed 8-pin connectors were a bad idea; plenty of burned-up plugs from less-than-stellar setups, some involving cable extensions too.
Posted on Reply
#121
AcE
AusWolfGuys, I wouldn't bother replying to our friend any further if I were you. He will only deflect, keep insisting that you're wrong with no proof, and resort to offensive language upon failure. A total waste of your time.
Defamation, and a great example of projection as well. ;) Say about others what you are yourself. Good thing I put this guy on block; absolutely unreasonable, with very limited tech knowledge, yet he thinks he's an expert on everything.
Posted on Reply
#122
Hecate91
AusWolfHogwarts Legacy is a funny game. I remember Hardware Unboxed testing it with a 4 GB and an 8 GB 6500 XT, with the 4 GB showing strange artefacts, texture pop-ins and other oddities, while the 8 GB one didn't.

So then, I popped my 4 GB 6500 XT into my PC to see it for myself, and honestly, I couldn't notice anything weird... which was... weird.
Is Hogwarts Legacy just that buggy? I remember hearing it was buggy, so I haven't bought the game to try it for myself.
In TPU's testing, the Intel Arc B580 does pull ahead of the 4060 Ti 8 GB in Hogwarts Legacy at 1440p.
AusWolfGuys, I wouldn't bother replying to our friend any further if I were you. He will only deflect, keep insisting that you're wrong with no proof, and resort to offensive language upon failure. A total waste of your time.
Agreed, I'm done replying to them after he resorted to throwing insults. They keep asking for proof even though people have posted proof, yet he hasn't provided any to support his claims.
Posted on Reply
#123
somebadlemonade
ZazigalkaLegitimate question: is this connector really that much of a concern, or are some of AMD's damage-control people taking their best shot at scaring others with possible issues on high-end Blackwell, knowing AMD will sit this round out?
So the big issue was that not enough testing was done on the new connector, and it had a lower-than-usual safety factor; anything less than 25% for power connections is genuinely bad.

Both sides are butt-hurt about it. NVIDIA made a new connector with the same safety factor but a better-designed interface between the connections to allow better contact.

TL;DR: it's partly marketing and partly reasonable to call out the issue with a new, unproven connector that has a lower-than-usual safety factor.
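The safety-factor point can be made concrete with back-of-envelope per-pin math. A sketch, assuming commonly cited terminal ratings (roughly 8 A per pin for the 8-pin's Mini-Fit style terminals and roughly 9.5 A per pin for 12VHPWR; real ratings vary by terminal and wire gauge):

```python
def per_pin_amps(watts, volts, supply_pins):
    """Current through each 12 V supply pin at full rated load."""
    return watts / volts / supply_pins

def headroom(rated_amps, load_amps):
    """Safety margin: fraction by which the pin rating exceeds the actual load."""
    return rated_amps / load_amps - 1

eight_pin = per_pin_amps(150, 12, 3)  # 8-pin: 150 W over 3 supply pins, about 4.2 A each
hpwr = per_pin_amps(600, 12, 6)       # 12VHPWR: 600 W over 6 supply pins, about 8.3 A each

print(f"8-pin headroom:   {headroom(8.0, eight_pin):.0%}")  # roughly 92%
print(f"12VHPWR headroom: {headroom(9.5, hpwr):.0%}")       # roughly 14%, under a 25% rule of thumb
```

Under these assumed ratings, the 8-pin runs at about half its per-pin limit while 12VHPWR at 600 W sits much closer to its ceiling, which is the margin difference the thread keeps circling.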
Posted on Reply
#124
AcE
evernessinceYep, I realize that after he started hurling insults. Thread should be locked. Amazing that these are adults.
"If you can't win a argument, just go out by pretending the other guy is the problem." ;)
Hecate91Agreed, I'm done replying to them after he resulted to throwing insults,
And again, I didn't insult you, and I provide proof for all the arguments I make. You can keep crying; if you can't provide proof for your extreme statements, many, many people will rightly call you out, not just me. :)
Posted on Reply
#125
AusWolf
Hecate91Is Hogwarts Legacy just that buggy? I remember hearing it was buggy, so I haven't bought the game to try it for myself.
In TPU testing on the Intel Arc B580, it does pull ahead of the 4060Ti 8GB on Hogwarts Legacy at 1440P.
I'm not sure if I'd call it buggy, or just rather weird. All I'm saying is that I had no issues running it on a 4 GB 6500 XT while Steve from HUB did.
Posted on Reply