# The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs



## btarunr (Jul 16, 2020)

Over the past few days, we've heard chatter about a new 12-pin PCIe power connector for graphics cards, particularly from Chinese-language publication FCPowerUp, which included a picture of the connector itself. Igor's Lab also did an in-depth technical breakdown of the connector. TechPowerUp has some new information on this from a well-placed industry source. The connector is real, and will be introduced with NVIDIA's next-generation "Ampere" graphics cards. The connector appears to be NVIDIA's brainchild, and not that of any other IP or trade group such as the PCI-SIG, Molex or Intel. It was designed in response to two market realities: high-end graphics cards inevitably need two power connectors, and it would be neater for consumers to have a single cable than to wrestle with two; while lower-end (<225 W) graphics cards can make do with one 8-pin or 6-pin connector.

The new NVIDIA 12-pin connector has six 12 V and six ground pins. Its designers specify higher-quality contacts on both the male and female ends, which can handle higher current than the pins of 8-pin/6-pin PCIe power connectors. Depending on the PSU vendor, the 12-pin connector can even split in the middle into two 6-pin halves, and could be marketed as "6+6 pin." The point of contact between the two 6-pin halves is kept level so they align seamlessly.





As for power delivery, we have learned that the designers will also specify the cable gauge, and with the right combination of wire gauge and pins, the connector should be capable of delivering 600 watts of power (so it is not simply 2 x 75 W = 150 W, nor a straight scaling of the 6-pin rating). Igor's Lab published an investigative report yesterday with some numbers on cable gauge that help explain how the connector could deliver far more power than a combination of two common 6-pin PCIe connectors.
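As a sanity check on that figure (our own back-of-envelope math, not from NVIDIA or any spec), 600 W across six +12 V pins works out to roughly 8.3 A per pin, which is why the higher-quality contacts and specified wire gauge matter:

```python
# Back-of-envelope check: what current must each pin carry to deliver 600 W?
# Assumes the rumored layout of six +12 V pins with six ground returns.
VOLTAGE = 12.0        # volts on each power pin
POWER_PINS = 6        # +12 V pins in the rumored 12-pin connector
TARGET_POWER = 600.0  # watts, the claimed capability

current_per_pin = TARGET_POWER / (VOLTAGE * POWER_PINS)
print(f"{current_per_pin:.2f} A per pin")  # -> 8.33 A per pin
```

That is well above the per-pin current implied by the 75 W / 150 W ratings of the 6-pin and 8-pin PCIe connectors, so ordinary contacts and thin wire would not be sufficient.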

Looking at the keying, we can see that it will not be possible to connect two classic 6-pins to it. For example, pin 1 is square on the PCIe 6-pin, but on NVIDIA's 12-pin it has one corner angled. It also won't be possible to use odd combinations like 8-pin + 4-pin EPS, or similar: NVIDIA made sure people won't be able to connect their cables the wrong way.

On the topic of the connector's proliferation, in addition to PSU manufacturers launching new generations of products with 12-pin connectors, most prominent manufacturers are expected to release aftermarket modular cables that plug into their existing PSUs. Graphics card vendors will include ketchup-and-mustard adapters that convert 2x 8-pin to 1x 12-pin, while most case/power manufacturers will release fancy aftermarket adapters with better aesthetics.

*Update 08:37 UTC*: I made an image in Photoshop to show the new connector layout, keying and voltage lines in a single, easy-to-understand graphic.

*View at TechPowerUp Main Site*


----------



## londiste (Jul 16, 2020)

It is not quite a 2x 6-pin connector; the keying is different. 
Considering the sense pins and the usual one reserved pin, would it really have 6 +12V pins?


----------



## Vayra86 (Jul 16, 2020)

londiste said:


> It is not quite 2x6-pin connector, keying is different.



Oh man, it's a Big little connector then? 2 weak plugs and 10 strong ones? Alderpere?


----------



## cucker tarlson (Jul 16, 2020)

well, not the worst of ideas to have even the 3x 8-pin cards like the Lightning run off a single connector instead of this mess


----------



## W1zzard (Jul 16, 2020)

londiste said:


> keying is different


Indeed .. for example pin 12 isn't square, let me check with my sources


----------



## kayjay010101 (Jul 16, 2020)

Wait, so it's real now again? But the FCPowerUp article was made up! If you actually read the original article, the last sentence: 以上的内容都是我编的。translates to "I fabricated all the content above". This is very confusing. How could FCPowerUp make up something the same day that other sources claim that exact, made up stuff, is actually true? Is it just a crazy coincidence? What's happening?


----------



## cucker tarlson (Jul 16, 2020)

kayjay010101 said:


> Wait, so it's real now again? But the FCPowerUp article was made up! If you actually read the original article, the last sentence: 以上的内容都是我编的。translates to "I fabricated all the content above". This is very confusing. How could FCPowerUp make up something the same day that other sources claim that exact, made up stuff, is actually true? Is it just a crazy coincidence? What's happening?


by "made" he means he sketched the picture himself


----------



## kayjay010101 (Jul 16, 2020)

cucker tarlson said:


> made he means he sketched a picture himself


The single picture wouldn't be "all the content" though?


----------



## dont whant to set it"' (Jul 16, 2020)

The wheel has been invented before this.


----------



## Vya Domus (Jul 16, 2020)

> The connector was designed in response to two market realities - that high-end graphics cards inevitably need two power connectors; and it would be neater for consumers to have a single cable than having to wrestle with two; and that lower-end (<225 W) graphics cards can make do with one 8-pin or 6-pin connector.



You know what's really neat: God knows how many customers not having to buy new PSUs or juggle a million adapters. I can't believe how they spun this around into something beneficial; the connectors we have now work fine.


----------



## BoboOOZ (Jul 16, 2020)

Well, graphics card power consumption is only going up, so they might as well ensure that power delivery is flawless. Otherwise, many users might complain about their card's "buggy drivers" crashing.

Still, I wouldn't like to change my PSU just yet.


----------



## fynxer (Jul 16, 2020)

*The question is whether these aftermarket 12-pin cables will be free for existing high-end Gold/Platinum/Titanium PSU owners* from big PSU manufacturers like Corsair, EVGA and SeaSonic, among others.

For reference, when a new CPU mounting standard is introduced by Intel/AMD, CPU cooler manufacturers like Noctua usually send out free adapter kits to existing cooler owners.

If we must rely on an included 8+8-pin to 12-pin adapter from the graphics card manufacturer, the cable clutter will be even worse than before.

Also, these 8+8-pin to 12-pin adapters included with the graphics cards will probably all look different and be of varying quality.


----------



## dj-electric (Jul 16, 2020)

This reeks of lobbyism from you-know-who in the connector market.
Dont @ me.

At least getting 600W+ from one connector will be a thing.


----------



## cucker tarlson (Jul 16, 2020)

fynxer said:


> *Question is will these aftermarket cables be free for existing high end PSU owners* from big PSU manufacturers like Corsair, EVGA, SeaSonic among others.


you need a connector on the psu not just the cable
all modern psus have 8-pin connectors for PEG,not 6 -pin
it's a nice thing for sure,but if it gets released there will be normal 2x8-pin versions for sure.



dj-electric said:


> This reeks of lobbyism from you-know-who in the connector market.
> Dont @ me.
> 
> At least getting 600W+ from one connector will be a thing.


lol,connector market.
why wouldn't this be psu makers lobbying nvidia to use that connector to make ppl replace current psus ?


----------



## Emu (Jul 16, 2020)

For 600 W you are going to want 4x 8-pin PCIe power feeding the 12-pin NVIDIA connector. You could probably get away with less, but you would have no guarantee that you won't melt your cables and/or connectors. I think the last thing NVIDIA or any of its partners wants is more video cards catching fire (well, causing fires)...


----------



## Vya Domus (Jul 16, 2020)

BoboOOZ said:


> Well, graphic cards power consumption is only going up, so they might as well ensure that power delivery is flawless.



Funny how when AMD has a power hungry GPU every one thinks power consumption is everything but when Nvidia hints at an upcoming power hungry atrocity everyone's cool with it.


----------



## cucker tarlson (Jul 16, 2020)

Vya Domus said:


> Funny how when AMD has a power hungry GPU every one thinks power consumption is everything but when Nvidia hints at an upcoming power hungry atrocity everyone's cool with it.


really? or is it just when amd has a power hungry card that loses to the power efficient card? and is massively late, to add to that.


----------



## CandymanGR (Jul 16, 2020)

GPUs should go the way of efficiency. 600 W power requirements for GPUs? It should have been the opposite.
Plus this connector is too heavy and it will sag the cards down. 

Genius!


----------



## BoboOOZ (Jul 16, 2020)

Vya Domus said:


> Funny how when AMD has a power hungry GPU every one thinks power consumption is everything but when Nvidia hints at an upcoming power hungry atrocity everyone's cool with it.


They're the market leader...
Things might change in the future, I heard AMD had finally changed all the graphics marketing team. That was long due.


----------



## kayjay010101 (Jul 16, 2020)

CandymanGR said:


> GPU's should go the way of efficiency. 600W power requierements for gpus ? Should have been the opposite.
> Plus this connector is too heavy and it will sag the cards down.
> 
> Genious!


Nobody said a 600 W power requirement for GPUs. All this says is that the new connector is _capable_ of delivering up to 600 W. We won't see any 600 W monsters any time soon.

How do you know the connector is heavier than what's already out there? The dual 8-pin connector that's currently on 2080 Tis should weigh more. It's the same plastic and the same wires, just a different connector.


----------



## londiste (Jul 16, 2020)

Why does this connector necessarily mean anything about power consumption? 2x6-pin or 6+8-pin has been the norm for high-end cards (and lately midrange cards) for a long while. This 12-pin thing is likely going to end up equal to 6+8-pin solution, just in a single cable. 

Current throughput relies on several factors; what is currently known does not give us a complete picture of what the connector will end up being. "Capable of up to 600 W" is a strange claim with so many unknowns.

The connector pinouts are not quite accurate. 
While technically most PSUs provide +12V on pin 2 of the 6-pin connector, that is not the spec, and pin 5 accordingly is a sense pin. Officially, the 6-pin connector has 2 +12V pins.
Similarly, the 8-pin connector includes 2 sense pins (4 and 6) and has 3 +12V pins.
I bet the 12-pin connector will end up with 5 +12V pins.
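The pin accounting above can be laid out side by side (a sketch based purely on londiste's numbers; the 12-pin row is his speculation, and the news post instead claims 6 +12V / 6 GND with no sense pins mentioned):

```python
# Pin budgets per londiste's reading of the PCIe spec.
# The "12-pin (?)" row is speculation, not a published spec.
connectors = {
    # name:        (+12V pins, sense pins, ground pins)
    "6-pin PCIe":  (2, 1, 3),
    "8-pin PCIe":  (3, 2, 3),
    "12-pin (?)":  (5, 2, 5),   # his bet; the article claims 6 +12V / 6 GND
}

for name, (v12, sense, gnd) in connectors.items():
    total = v12 + sense + gnd
    print(f"{name}: {v12}x +12V, {sense}x sense, {gnd}x GND = {total} pins")
```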


----------



## cucker tarlson (Jul 16, 2020)

this may be a fancy connector on top of that fancy dual sided cooler methinks


----------



## CandymanGR (Jul 16, 2020)

Scarlet Witch


kayjay010101 said:


> Nobody said 600W power requirement for GPUs. All this said is the new connector is _capable _of delivering upto 600W. We won't see any 600W monsters any time soon



You create something that you will need soon. You don't create something that you "might" need. That's how technology works. If it wasn't needed, it wouldn't have been made.



kayjay010101 said:


> How do you know the connector is heavier than what's already out there? The dual 8pin connector that's currently on the 2080 Ti's should weigh more. It's just plastic and the same wires, it's just a different connector.



The dual connector splits the weight of itself AND of the plug in half. And it already sags the card. I use math and logic, but I doubt anyone else here can. Right?


----------



## cucker tarlson (Jul 16, 2020)

CandymanGR said:


> Scarlet Witch
> 
> 
> You create something that you will need soon. You dont create something that you "might" need.  Thats how technology works. If it wasnt needed, it wouldn't have been made.
> ...


yes but the total weight of the thing you attach to the gpu is the same whether you split it or not
and it's the cooler that sags the card, not the connector, unless you're routing your cables wrong


----------



## Flanker (Jul 16, 2020)

kayjay010101 said:


> Wait, so it's real now again? But the FCPowerUp article was made up! If you actually read the original article, the last sentence: 以上的内容都是我编的。translates to "I fabricated all the content above". This is very confusing. How could FCPowerUp make up something the same day that other sources claim that exact, made up stuff, is actually true? Is it just a crazy coincidence? What's happening?


Nah you're right, it's a troll post on his blog


cucker tarlson said:


> made he means he sketched a picture himself


Nope, it means exactly "I made all this shit up"

How the hell does stuff like this end up on TPU lol
Oops, turned out to be correct


----------



## cucker tarlson (Jul 16, 2020)

Flanker said:


> Nah you're right, it's a troll post on his blog
> 
> Nope, it means exactly "I made all this shit up"
> 
> How the hell do stuff like this end up on TPU lol


tbh it's a shame
I don't understand the outrage
a single connector would be nice


----------



## Vya Domus (Jul 16, 2020)

BoboOOZ said:


> They're the market leader...



When you are a market leader everything that's a negative becomes a positive ? Might as well pack and go home and accept everything that's shit because it comes from a market leader.

This is some bizarre Stockholm syndrome type of stuff.


----------



## BoboOOZ (Jul 16, 2020)

Vya Domus said:


> When you are a market leader everything that's a negative becomes a positive ? Might as well pack and go home and accept everything that's shit because it comes from a market leader.
> 
> This is some bizarre Stockholm syndrome type of stuff.


When you are the market leader, many changes are more easily accepted, and you can impose some decisions on the market, while the other players have to follow.

Also, when you are the market leader, there are more fanboys who will accept without discussion the compromises you impose.

FWIW, the power draw is not necessarily a negative; it depends on the final performance and the final price. Anyway, it might still be just a rumor; as always, wait and see.


----------



## W1zzard (Jul 16, 2020)

Flanker said:


> How the hell do stuff like this end up on TPU lol


We ask our sources in the industry


----------



## Anymal (Jul 16, 2020)

Vayra86 said:


> Oh man its a Big little connector then? 2 weak plugs and 10 strong ones Alderpere?


Smart.


----------



## Vya Domus (Jul 16, 2020)

BoboOOZ said:


> FWIW, the power draw is not necessarily a negative, it depends on the final performance and the final price.



Or color of the sticker apparently, new connectors are a problem no matter how you spin it.


----------



## cucker tarlson (Jul 16, 2020)

BoboOOZ said:


> FWIW, the power draw is not necessarily a negative, it depends on the final performance and the final price.


vega was unfairly hurt by nvidia's power efficiency mindshare
drawing 300w and losing to 1080Ti by a country mile


----------



## fynxer (Jul 16, 2020)

cucker tarlson said:


> you need a connector on the psu not just the cable
> all modern psus have 8-pin connectors for PEG,not 6 -pin
> it's a nice thing for sure,but if it gets released there will be normal 2x8-pin versions for sure.



I guess for existing modular PSUs they can make a custom cable with two connectors that go into the PSU and a single 12-pin into the graphics card.

That would be a much better solution than an 8+8-pin to 12-pin adapter.


----------



## BoboOOZ (Jul 16, 2020)

cucker tarlson said:


> vega was unfairly hurt by nvidia's power efficiency mindshare
> drawing 300w and losing to 1080Ti by a country mile


And you think that I care? 

I'm here to understand and discuss technology, fanboy flame wars are not my cup of tea.


----------



## lZKoce (Jul 16, 2020)

Suddenly I feel like my PSU is out of standard, because I know I won't be changing it anytime soon. By the time it comes to upgrade, there will be a bunch of new connectors sticking out. I guess this is what it feels like when your focus is on having the latest, as opposed to enjoying what you have.


----------



## Flanker (Jul 16, 2020)

W1zzard said:


> We ask our sources in the industry


Sorry about that.


----------



## BoboOOZ (Jul 16, 2020)

lZKoce said:


> Suddenly I felt my PSU is out of standard. Because I know I won't be changing anytime soon. By the time comes to upgrade, there will be a bunch of new connectors sticking out.  I guess this is what it feels when your focus is on having the latest as opposed to enjoying what you have


Well, if you stick with 1080p, of course you don't need to upgrade yet.


----------



## cucker tarlson (Jul 16, 2020)

why would you want a single connector instead of two or three
we are outraged



W1zzard said:


> We ask our sources in the industry


I hope this is true

my psu is 6 yo,due upgrade next year


----------



## Verpal (Jul 16, 2020)

*看来有一些读者尤其是国外的读者不能理解中文的幽默，我更新一下最新的图纸和消息汇总。消息是真的。既然Techpowerup发了图纸，那我也发一下。*
(Translation: "It seems some readers, especially those abroad, can't understand Chinese humor, so I'm updating with the latest diagram and a news summary. The news is real. Since TechPowerUp posted a diagram, I'll post one too.")

You guys can't read Chinese? 

以上的内容都是我编的。("All of the above content was made up by me.")
Anyone with reasonably competent Chinese would know that is literally a joke, nothing more.
FCPowerUp is legit; they have been reviewing power supplies since the birth of Jesus Christ. If you don't believe it, at least try to read the write-up from Igor's Lab.


----------



## W1zzard (Jul 16, 2020)

cucker tarlson said:


> I hope this is true


Why would I lie to you? For a few thousand clicks?


----------



## Assimilator (Jul 16, 2020)

Nobody has yet mentioned the elephant in the room, namely that this connector isn't part of the ATX specification. The whole point of that spec is to guarantee that if you buy an "ATX PSU", you know exactly what type of connectors you can expect it to come with. This throws all of that out the window, now you'll have "ATX+12-pin PSU"s. Which consequently means anyone can start adding whatever goddamn connector type they think the industry will support, you end up with a plethora of connectors outside the spec, and choosing (and marketing) a PSU becomes a nightmare.

tl;dr vendors randomly adding arbitrary power connectors is a bad thing for everyone concerned, which is one of the primary reasons the ATX specification exists.

As such, unilaterally introducing a new power connector outside the ATX spec would be an extremely foolish move by NVIDIA. Even if they feel that the current 6- and 8-pin connectors are suboptimal, if they want to replace or augment those connectors then they must do so via the standard. Anything else is Apple levels of hubris that will only end poorly.

This is not even getting into the possibility of Intel revoking ATX certification for Ampere GPUs and any PSUs that decide to include this connector - because Intel would be entirely within its rights to do so.

*As such, I see the possibility of this connector being introduced with Ampere as low.* Much more likely is that NVIDIA is prepping it for inclusion into an upcoming revision of the ATX specification, and obviously that will entail sending it out to PSU manufacturers in order to get their feedback, hence the leaks.


----------



## cucker tarlson (Jul 16, 2020)

W1zzard said:


> Why would I lie to you? For a few thousand clicks?


is thousand a lot ?



Assimilator said:


> Nobody has yet mentioned the elephant in the room, namely that this connector isn't part of the ATX specification. The whole point of that spec is to guarantee that if you buy an "ATX PSU", you know exactly what type of connectors you can expect it to come with. This throws all of that out the window, now you'll have "ATX+12-pin PSU"s. Which consequently means anyone can start adding whatever goddamn connector type they think the industry will support, you end up with a plethora of connectors outside the spec, and choosing (and marketing) a PSU becomes a nightmare.
> 
> tl;dr vendors randomly adding arbitrary power connectors is a bad thing for everyone concerned, which is one of the primary reasons the ATX specification exists.
> 
> ...


atx has revisions.
maybe new one is coming


----------



## W1zzard (Jul 16, 2020)

cucker tarlson said:


> is thousand a lot ?


Nope, right now 454 people are reading the news post, which is around 5% of our current total traffic.
Working off leaks can be tempting for a smaller site though; I've been there, many years ago


----------



## cucker tarlson (Jul 16, 2020)

W1zzard said:


> Nope, right now 454 people are reading the news post, which is around 5% our current total traffic.
> Working off leaks can be tempting for a smaller site though, I've been there many years ago


then no, not for a thousand.
how sure are you tho?


----------



## W1zzard (Jul 16, 2020)

cucker tarlson said:


> then no,not for a thousand.
> how surre are you tho ?


Sure enough to allow my guys to post an article. Nothing stops NVIDIA from cancelling this project, or implementing it only on a specific model, or on Founders Edition, or on Titan, or on Special Pro OC Edition


----------



## kayjay010101 (Jul 16, 2020)

Verpal said:


> *看来有一些读者尤其是国外的读者不能理解中文的幽默，我更新一下最新的图纸和消息汇总。消息是真的。既然Techpowerup发了图纸，那我也发一下。 *
> 
> You guys can't read Chinese?
> 
> ...


I see, so the "I made this up" part is the joke, meaning the actual article is real. That makes more sense now. I had only seen someone else's translation of the last sentence; I can't read Chinese myself. Apologies.


----------



## cucker tarlson (Jul 16, 2020)

W1zzard said:


> Sure enough to allow my guys to post an article. Nothing stops NVIDIA from cancelling this project, or implementing it only on a specific model, or on Founders Edition, or on Titan, or on Special Pro OC Edition


or specific cooler model
imo there might be two,a 2000-like with standard 2x8-pin and this new dual sided thing with the 12-pin


----------



## kapone32 (Jul 16, 2020)

fynxer said:


> *Question is will these aftermarket 12pin cables be free for existing high end GOLD/Platinum/Titanium PSU owners* from big PSU manufacturers like Corsair, EVGA, SeaSonic among others.
> 
> As reference when a new cpu mounting standard are introduced from Intel/AMD usually cpu cooler manufacturers like Noctua and others send out free adapter kits to existing cpu cooler owners.
> 
> ...


I don't think so. EVGA might give their customers some love, but not Corsair.


----------



## cucker tarlson (Jul 16, 2020)

kapone32 said:


> I don't think so EVGA might give their customers some love but not Corsair


imo they'll come with the card


----------



## W1zzard (Jul 16, 2020)

cucker tarlson said:


> or specific cooler model
> imo there might be two,a 2000-like with standard 2x8-pin and this new dual sided thing with the 12-pin


I'm just not convinced that this can offer any selling point. It's not going to be cheaper because production volume for the new plug will be low, so the connector will be a buck or so more expensive, which will turn into $10 retail at least, probably more for marketing, logos and stickers. All for the convenience of plugging in one fewer cable? Unless you're a reviewer, you do that once, after you buy your card.

Also I'm highly skeptical how they plan on cooling 400+ W cards. Good 2080 Tis are like 35 dBA at 350 W, I'm not sure if I would buy a $1000+ card that's not quiet and that heats up my room like crazy, even if it can run 4K 120 Hz


----------



## cucker tarlson (Jul 16, 2020)

W1zzard said:


> I'm just not convinced that this can offer any selling point. It's not going to be cheaper because production volume for the new plug will be low, so the connector will be a buck or so more expensive, which will turn into $10 retail at least, probably more for marketing, logos and stickers. All for the convenience of plugging in one fewer cable?
> 
> Also I'm highly skeptical how they plan on cooling 400+ W cards. Good 2080 Tis are like 35 dBA at 350 W, I'm not sure if I would buy a $1000+ card that's not quiet and that heats up my room like crazy, even if it can run 4K 120 Hz


prolly just those that need 8+8 or higher
and the new cooler design will be more expensive too


----------



## BoboOOZ (Jul 16, 2020)

W1zzard said:


> Also I'm highly skeptical how they plan on cooling 400+ W cards. Good 2080 Tis are like 35 dBA at 350 W, I'm not sure if I would buy a $1000+ card that's not quiet and that heats up my room like crazy, even if it can run 4K 120 Hz


For an expensive card, liquid cooling should be a perfectly acceptable option. But it will still warm up your room nicely...


----------



## W1zzard (Jul 16, 2020)

BoboOOZ said:


> For an expensive card, liquid cooling should be a perfectly acceptable option. But it will still warm up your room nicely...


Not sure, seems the market isn't buying cards with bulky water cooling radiators. RMA rates could also eat up the small profits board partners have in this business.


----------



## TheDeeGee (Jul 16, 2020)

Lucky I'm only getting an RTX 3060 that can make do with a single 8-pin (probably).


----------



## BoboOOZ (Jul 16, 2020)

W1zzard said:


> Not sure, seems the market isn't buying cards with bulky water cooling radiators. RMA rates could also eat up the small profits board partners have in this business.


Well, sooner or later they will have to come up with a solution for this. 
Die sizes for GPUs are much larger than die sizes for CPUs, so either manufacturers keep leaving performance on the table (and if the competition is animated, I have a hard time imagining they will) or they will have to come up with even better coolers. And given the form factor of graphics cards, AIO coolers seem like a simpler option than 3-4 slot air coolers.


----------



## W1zzard (Jul 16, 2020)

BoboOOZ said:


> Die sizes for GPUs are much larger than die sizes for CPUs


That's actually an interesting point, with 7 nm, die sizes will be smaller = higher heat density



BoboOOZ said:


> AIO coolers seems like a simpler option than doing 3-4 slot graphic cards with air coolers


I was reasonably happy with the ASUS 2080 Ti Matrix, but not convinced this is ready for a million unit per year market


----------



## ZoneDymo (Jul 16, 2020)

would it not be nicer if the next high end gpu's were so efficient we could just go back to a single 6pin connector or so?


----------



## kiriakost (Jul 16, 2020)

According to the latest information on female pin development, while a basic Molex pin can do 9 A max, the newest ones are rated to deliver 12 A max. 
With that figure we can do the math on maximum current transfer per plug. 
The truth, though, is that the industry cares about the average continuous current rating, which should be about 30% below the maximum for safety reasons.
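Taking those figures at face value (12 A peak per contact and a ~30% derating, both as the poster states them, not from any datasheet we have verified), the continuous rating of six +12 V pins lands almost exactly on the rumored 600 W:

```python
# Rough continuous-power estimate from the pin ratings quoted above.
PIN_MAX_CURRENT = 12.0   # amps, claimed peak rating of the newer female contacts
DERATING = 0.70          # run continuous current ~30% below the peak rating
VOLTAGE = 12.0           # volts per power pin
POWER_PINS = 6           # +12 V pins in the rumored 12-pin connector

peak_w = PIN_MAX_CURRENT * VOLTAGE * POWER_PINS   # absolute peak: 864 W
continuous_w = peak_w * DERATING                  # sustained: ~605 W
print(f"peak {peak_w:.0f} W, continuous ~{continuous_w:.0f} W")
```

Which would neatly explain where the "capable of 600 W" claim comes from.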


----------



## BoboOOZ (Jul 16, 2020)

ZoneDymo said:


> would it not be nicer if the next high end gpu's were so efficient we could just go back to a single 6pin connector or so?


I used to have a passively cooled GeForce 2 MX. It probably drew 10 watts or something. We're not getting back to that; GPUs are packing more and more transistors, so you cannot reduce power draw unless you also drastically reduce clock speeds.

BTW, that's something you can do very well yourself; my 5700 XT runs mostly passively cooled at 120 W.


----------



## Vayra86 (Jul 16, 2020)

Vya Domus said:


> Funny how when AMD has a power hungry GPU every one thinks power consumption is everything but when Nvidia hints at an upcoming power hungry atrocity everyone's cool with it.



You're on a roll aren't you? Let's just watch the dust settle and not jump on every photoshop whackjob that flies around the net...


----------



## kayjay010101 (Jul 16, 2020)

BoboOOZ said:


> Well, sooner or later they will have to come with a solution for this.
> Die sizes for GPUs are much larger than die sizes for CPUs, so either the manufacturers will have to keep leaving performance on the table (but if the competition is animated, I have a hard time imagining that they will) or they will have to come with even better coolers. And given the form aspect of graphic cards, AIO coolers seems like a simpler option than doing 3-4 slot graphic cards with air coolers.


Can't wait until reference cards start shipping as hybrids like the Fury X did. Costs a bit more and takes up at least one fan slot but it cools so much better than air cooling.


----------



## RH92 (Jul 16, 2020)

Ehhhhhhhhhhhhhh .........


https://twitter.com/i/web/status/1283502759292162051
TPU should at the very least check what the Chinese text says instead of running with whatever BS leak comes their way, just saying!


----------



## EarthDog (Jul 16, 2020)

RH92 said:


> TPU should at the very least check what the Chinese text says instead of running with whatever BS leak comes to their hands , just saying !


But........... "sources". Time will tell. But if W1z says he has a source, I believe that... however, not necessarily that the source is right.


----------



## Vya Domus (Jul 16, 2020)

Vayra86 said:


> You're on a roll aren't you?



Indeed, I can't help but notice these things.



BoboOOZ said:


> Die sizes for GPUs are much larger than die sizes for CPUs



Intel is still making chips right up to their reticle limit and that's likely not going to change.


----------



## BoboOOZ (Jul 16, 2020)

W1zzard said:


> I was reasonably happy with the ASUS 2080 Ti Matrix, but not convinced this is ready for a million unit per year market


Haven't seen that card (a bit above my budget   ) but the cooling solution is really smart. All you need is good airflow inside the case.


----------



## RH92 (Jul 16, 2020)

EarthDog said:


> But........... "sources". Time will tell. But if W1z says he has a source... I believe that... however not that the source is or is not right.




Thing is, the "source" everyone who covered this BS leak is using is precisely the Chinese text you see in that tweet, so yeah, there's that. This article wasn't written by W1z.


----------



## Recus (Jul 16, 2020)

So how do you connect it at the PSU end? Also, rumor says the 12-pin is only for the Founders Edition.





----------



## lexluthermiester (Jul 16, 2020)

What's the point of this new connector? 6+6 or 8+6 isn't good enough? Even 8+8? This doesn't compute. I don't care what NVIDIA's reasoning is; this is a change with no logic behind it. Hopefully this is just a rumor.


----------



## iO (Jul 16, 2020)

Nah. Nvidia might suggest using a new plug in a future PCIe spec revision but unless it gets approved by the PCI-SIG and becomes part of the official spec, chances are slim this will end up real.


----------



## Legacy-ZA (Jul 16, 2020)

W1zzard said:


> Not sure, seems the market isn't buying cards with bulky water cooling radiators. RMA rates could also eat up the small profits board partners have in this business.



"Small profits" Bwhawhahahahahaha! Good one.


----------



## W1zzard (Jul 16, 2020)

Legacy-ZA said:


> "Small profits" Bwhawhahahahahaha! Good one.


"board partners". GPU makers are keeping the big profits, guys like ASUS, MSI are barely making any profit after RMA, support and marketing


----------



## Vya Domus (Jul 16, 2020)

W1zzard said:


> MSI are barely making any profit after RMA



I somehow find that hard to believe for a company with a market cap of 100 billion.


----------



## W1zzard (Jul 16, 2020)

Vya Domus said:


> I somehow find that hard to believe for a company with a market cap of 100 billion.


TWD


----------



## Vya Domus (Jul 16, 2020)

W1zzard said:


> TWD



Yeah I realized that, still, they aren't exactly small.


----------



## BoboOOZ (Jul 16, 2020)

Check out Gamers Nexus' videos; you will understand margins are getting smaller and smaller for partners. I think the situation was different 15 years ago, which is why there was far more innovation in graphics cards. Nowadays all graphics cards are practically the same, give or take a few MHz here and there and a more or less accomplished cooling solution.


----------



## Vya Domus (Jul 16, 2020)

What you also need to understand is margins *and* volume shipped. Amazon has measly margins too, but what do you know, they are one of the largest, maybe the largest, companies out there.


----------



## BoboOOZ (Jul 16, 2020)

True, but Amazon is not a manufacturer, it's just a reseller. Small margins at high volume work fine for reselling; for manufacturers it's harder. A few faulty products are enough to wipe out your entire margin.


----------



## gridracedriver (Jul 16, 2020)

600 watt?

LOOOL


----------



## L'Eliminateur (Jul 16, 2020)

londiste said:


> The connector pinouts are not quite accurate.
> While technically most PSUs provide +12V on pin 2 for 6-pin connector, that is not the spec and pin 5 accordingly is sense. 6-pin connector officially has 2 +12V pins.
> Similarly, 8-pin connector includes 2 sense pins 4 and 6 and has 3 +12V pins.
> I bet 12-pin connector will end up with 5 +12V pins.



I don't think so. The current connectors have sense pins because they can be joined together, and without sense pins the card has no way of knowing whether you plugged a 6-pin or an 8-pin cable into it, so pins have to be sacrificed for that.
Since this new 12-pin connector seems to be monolithic (although it's mentioned it can be split in half, which has me puzzled), it wouldn't need any sense pins, as a full 12-pin is always plugged in.
Or, if it can indeed be split for smaller GPUs, then you're right: it would require one sense pin on the second half.


----------



## tomc100 (Jul 16, 2020)

Soon the gpu will just plug directly into the wall with an AC adapter.


----------



## L'Eliminateur (Jul 16, 2020)

tomc100 said:


> Soon the gpu will just plug directly into the wall with an AC adapter.


Haha, that can't happen. The power consumption of GPUs means the PC PSU is the only viable option, or the power brick would essentially be a dedicated PC PSU, which doesn't make sense.


----------



## Jism (Jul 16, 2020)

L'Eliminateur said:


> jaja cannot happen, the power consumption of GPUs means the PC PSU is the only viable option, or the power brick would essentially be a dedicated PC PSU, doesn't make sense









The 3dfx actually came with its own power brick, plugged into the wall, to feed the GPU.

Apart from that, weird decision. 2x 8-pin should be more than enough for most GPUs, unless this thing is supposed to feed GPUs in enterprise markets or machines.

The worry about lots of wires is overrated as well. A single yellow wire on a well-built PSU can easily push 10 A, up to 13 A.


----------



## Legacy-ZA (Jul 16, 2020)

Vya Domus said:


> Yeah I realized that, still, they aren't exactly small.



Don't believe their waffle for a moment; they make moooooooooooooore than enough. Not "small profits" at all. I could tell you stories that would leave you infuriated at what people pay for today's products. Suffice to say, most consumers are completely unaware of how badly they're being ripped off.


----------



## Overclocker_2001 (Jul 16, 2020)

I bet this connector will be used in the server/HPC space, where one cable is better than two, and where PSUs are designed to be fitted inside a chassis with special connectors.

Remember the NVIDIA HGX GPU? Well, that's a 400 W beast... but it doesn't have any connector to power it, because a 4x 8-pin PEG cable is not an option.

600 W for a single card (single or dual GPU, doesn't matter) is definitely too much for consumers, but not for pro/server/HPC applications.


----------



## Vya Domus (Jul 16, 2020)

Overclocker_2001 said:


> but not for PRO / server / HPC application



Yeah it is, when you have thousands of them the cost of electricity and cooling stacks up.


----------



## Dirt Chip (Jul 16, 2020)

What about LED??!
Must have RGB lighting in this new spec and adapters!


----------



## Flanker (Jul 16, 2020)

Verpal said:


> *[Translated from Chinese:] It seems some readers, especially those abroad, can't understand Chinese humor. I've updated the latest drawings and a summary of the news. The news is real. Since TechPowerUp published the drawing, I'll post it too.*
> 
> You guys can't read Chinese?
> 
> ...



I read and speak Chinese and I don't see the joke? Some historical reference with FCPowerup?
I get that FCPowerup is legit, I'm just confused AF


----------



## Mistral (Jul 16, 2020)

So now we have to have different nVidia and AMD connectors on PSUs, or what?


----------



## Krzych (Jul 16, 2020)

Vya Domus said:


> Funny how when AMD has a power-hungry GPU everyone thinks power consumption is everything, but when Nvidia hints at an upcoming power-hungry atrocity everyone's cool with it.



This is entirely relative. There is a difference between pointlessly drawing way more power for the same performance and pushing the limits of performance, and we are talking about the latter here. To do as badly as AMD did back in the days you refer to, they would have to release a new 300 W+ card that's slower than an RTX 2070.


----------



## Th3pwn3r (Jul 16, 2020)

Krzych said:


> This is entirely relative. There is a difference between pointlessly drawing way more power for the same performance and pushing the limits of performance, and we are talking the latter here. To do as badly as AMD did back in the days you refer to, they would have to release a new 300W+ card thats slower than RTX 2070.


Exactly. In terms of power consumption versus performance, AMD wasn't doing well.


----------



## Verpal (Jul 16, 2020)

Flanker said:


> I read and speak Chinese and I don't see the joke? Some historical reference with FCPowerup?
> I get that FCPowerup is legit, I'm just confused AF



TF? I thought the joke was, like... obvious?
To be fair, if it's late at night and you're just reading the words literally... sure.
But context is important here; the piece is too high-effort and well-referenced to be a joke, especially when the joke is just a single line at the bottom of the page.

Anyway, he has put in additional material already. Although I like the write-up from Igor's Lab more, FCPowerUp's info is about the same; go check it out.


----------



## Parn (Jul 16, 2020)

Does that mean any potential RTX 3080/3080 Ti buyers would have to buy one of those shiny new PSUs to benefit from less cable clutter? Unless this 12-pin connector can provide double the wattage of the 8-pin, what's the point?


----------



## thebluebumblebee (Jul 16, 2020)

Okay, I didn't read the thread, but...

I can't see this coming to desktops. AI-focused cards? Yes. Desktops? No. Why? Because places like California would have a fit over the power usage. To me, the industry is very conscious of its power usage and NOT trying to draw attention to itself.

New California Energy Commission Regulation Threatens Pre-built Gaming Desktops (www.techpowerup.com)

All ASUS Motherboards Meet Stringent New California Energy Commission Standards (www.techpowerup.com)


----------



## Razrback16 (Jul 16, 2020)

Good info, thanks.


----------



## BoboOOZ (Jul 16, 2020)

thebluebumblebee said:


> I can't see this coming to desktops.  AI focused cards?  Yes.  Desktops? No.  Why?  Because places like California would have a power usage fit.  To me, the industry is very conscience of their power usage and NOT trying to draw attention to themselves.


If I understand correctly, all these regulations are about efficiency, not absolute power draw. That's the reason for the evolution of the ATX format: fewer conversion losses. There will still be 1500 W PSUs; it's just that the voltage conversions will be made at a single point, to avoid waste.


----------



## nickbaldwin86 (Jul 16, 2020)

I hope this is true. would be great to have a single plug. I have 2 8pins, which is still more wires and involves two plugs and two cables.

wondering when cards will just have 2x12pin  LOL


----------



## Assimilator (Jul 16, 2020)

One more thing: the schematic for that connector has durability at only 25 insertion/removal cycles.



RH92 said:


> Ehhhhhhhhhhhhhh .........
> 
> 
> __ https://twitter.com/i/web/status/1283502759292162051
> TPU should at the very least check what the Chinese text says instead of running with whatever BS leak comes to their hands , just saying !



You should at the very least check the previous posts in this thread before you post something that has already been addressed, thereby making yourself look ignorant.

Just saying!


----------



## W1zzard (Jul 16, 2020)

Assimilator said:


> The schematic for that connector has durability at only 25 insertion/removal cycles


Hard to believe, and it would be terrible if true. I must have plugged my PSU cables in several hundred times and they're still fine. Slightly less force needed to connect, but no problem otherwise. Didn't we hear similar claims when LGA sockets were first announced?


----------



## Assimilator (Jul 16, 2020)

W1zzard said:


> Hard to believe, and would be terrible if true. I must have plugged my PSU cables several hundred times and they're still fine. Slightly less force needed to connect, but np otherwise. Didn't we heard similar claims the first time LGA sockets were announced?



I had a look at Molex's page for current PCIe connectors afterwards, and those are only rated at 30 cycles! Guess they just take the worst-worst-worst case...


----------



## Vya Domus (Jul 16, 2020)

Krzych said:


> This is entirely relative. There is a difference between pointlessly drawing way more power for the same performance and pushing the limits of performance, and we are talking the latter here. To do as badly as AMD did back in the days you refer to, they would have to release a new 300W+ card thats slower than RTX 2070.




"To do as badly" is entirely relative, gotcha. You can try and justify it all you can want, an unacceptable amount of power should remain unacceptable no matter what.


----------



## Chrispy_ (Jul 16, 2020)

I'm not really interested in a GPU that needs more than 250W. Even if the graphics are phenomenal, I don't need that noise or heat and after the lacklustre RTX game library I can honestly say that awesome graphics do not fix bad or re-hashed gameplay.


----------



## TheoneandonlyMrK (Jul 16, 2020)

BoboOOZ said:


> Well, graphic cards power consumption is only going up, so they might as well ensure that power delivery is flawless. Otherwise, many users might complain about their card's buggy drivers crashing.
> 
> Still, I wouldn't like to change my PSU just yet.


A cheap multi-rail PSU plus adapter cables is a sure way to imitate buggy drivers, in my experience.

Not good. Ampere is really looking like 400-450 watts at the top end.

After biting, RTX ended up like PhysX: underutilized and a disappointment.


----------



## Aerpoweron (Jul 16, 2020)

If this 12-pin plug has any truth to it, then it's GTX 480 times all over again for NVIDIA.


----------



## agatong55 (Jul 16, 2020)

theoneandonlymrk said:


> Multi rail cheap PSU plus adapter cables is a sure way to Imitate buggy driver's from my experience.
> 
> Not good, Ampere is really looking like 4-450 watts at the top end.
> 
> After biting, Rtx ended up like Physx, shit and underutilized, a disappointment.



Usually the first gen of anything is a disappointment, but now with consoles using RTX, there will actually be games that use it.


----------



## thebluebumblebee (Jul 16, 2020)

This has to be for a server environment. Think about how stiff twelve 16 AWG wires would be.


----------



## AusWolf (Jul 16, 2020)

cucker tarlson said:


> you need a connector on the psu not just the cable
> all modern psus have 8-pin connectors for PEG,not 6 -pin
> it's a nice thing for sure,but if it gets released there will be normal 2x8-pin versions for sure.


That is true, though I'm wondering what the options are when you have two 8-pin connectors on a single cable. Would it be possible for the PSU manufacturer to supply a cable with one 12-pin instead?


----------



## cucker tarlson (Jul 16, 2020)

AusWolf said:


> That is true, though I'm wondering what the options are when you have two 8-pin connectors on a single cable. Would it be possible for the PSU manufacturer to supply a cable with one 12-pin instead?


even daisy chaining two 8-pins is not recommended.


----------



## AusWolf (Jul 16, 2020)

cucker tarlson said:


> even daisy chaining two 8-pins is not recommended.


Why is that? It shouldn't be a problem with the correct wire gauge.


----------



## thebluebumblebee (Jul 16, 2020)

The current 8-pin connector is rated for 150 watts, or 50 watts per hot wire. This 12-pin connector is rated at either 9 or 9.5 amps per hot wire, which puts it at at least 108 watts per hot wire. Still don't see a problem?
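The per-wire arithmetic above can be sketched quickly. This is a rough calculation using the numbers quoted in this thread; the 9-9.5 A per-pin figure is the rumored rating, not a published spec:

```python
def connector_power(amps_per_pin, hot_pins, volts=12.0):
    """Total power a connector can carry, given a per-pin current rating."""
    return amps_per_pin * volts * hot_pins

# Official 8-pin PCIe: 150 W spread over 3 hot wires
print(150 / 3)                  # 50 W per hot wire

# Rumored 12-pin rating: 9 to 9.5 A per pin, six +12 V pins
print(connector_power(9.0, 6))  # 648 W total (108 W per pin)
print(connector_power(9.5, 6))  # 684 W total (114 W per pin)
```

Either rating comfortably covers the ~600 W figure from the article, which is why the per-pin contact quality matters more than the raw pin count.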


----------



## londiste (Jul 16, 2020)

thebluebumblebee said:


> The current 8 pin connector is rated for 150 watts, or 50 watts per hot wire.  This 12 pin connector is rated for either 9 or 9.5 amps per hot wire which puts it at at least 108 watts per hot wire.  Still don't see a problem?


Similar spec for 8-pin connector is 8 amps per hot wire - 96W or so.


----------



## TheoneandonlyMrK (Jul 16, 2020)

agatong55 said:


> Usually, first-gen of anything is a disappointment but now with consoles using RTX, there will actually be games that actually use it.


Yeah, except they're not, so that's not necessarily going to happen.


----------



## dicktracy (Jul 16, 2020)

Difference is, Nvidia requires more power on the topend max tuned cards whereas AMD needs the same at midrange.


----------



## Vayra86 (Jul 16, 2020)

W1zzard said:


> Hard to believe, and would be terrible if true. I must have plugged my PSU cables several hundred times and they're still fine. Slightly less force needed to connect, but np otherwise. Didn't we heard similar claims the first time LGA sockets were announced?



I'm finding the whole story about people buying new PSUs for a GPU a bit terrible if true. And hard to believe.


----------



## Space Lynx (Jul 16, 2020)

Well, I imagine adapters won't be ideal, so you'll need a new PSU to really run Ampere properly, and PSU shortages have been a real thing over the last 6 months. I just bought my EVGA 700 W Gold last year, and it works great for me. I think I'm just going to risk Big Navi and hope Lisa Su heard the community loud and clear and that Big Navi drivers won't be as much of an issue this round. So a Ryzen 4800X and Big Navi it is for me. Now I just have to keep my fingers crossed on drivers, but I suppose if they're truly horrible I can just return it for a refund.


----------



## Totally (Jul 16, 2020)

cucker tarlson said:


> well,not the worst of ideas to have even the 3x8-pin cards like lightning run off a single connector instead of this mess



It would run off two 12-pins, not a single one; I'm not sure that would be a step up, more of a sidestep.


----------



## mechtech (Jul 16, 2020)

What was wrong with 2x 6-pins? This is almost like Apple inventing unique parts instead of sticking with standards.


----------



## arbiter (Jul 17, 2020)

OK, just because the connector is able to do 600 watts doesn't mean the GPU you plug in will use all of that. I do see the idea that this single 600 W connector is a way to eliminate the need for two cables to power the GPU. It would also help on the GPGPU side, for compute cards that need a ton of power, cutting them from two power cables per card to one. So to me this is a way to remove up to one power cable per GPU from a PC, and to remove many more on the server side, where a machine with 5 or 6+ GPUs could otherwise need two power cables per card.

As for the comments worrying about needing a new PSU: I don't think the power that 2x 8-pin provides is going to be exceeded on the consumer side any time soon. Even though the 8-pin is rated at 150 watts per the spec, it can do more than that, provided the PSU you own isn't some cheaply made pile of junk. AMD had a GPU that was pulling something like 250+ watts per 8-pin.



Totally said:


> it would run off of two 12-pins not a single, not sure that would be a step up more of an aside.


It would be one, since the 12-pin can do 600 watts. Those 3x 8-pins are 450 watts total.


----------



## Hyderz (Jul 17, 2020)

If this is true, do you think GPUs from, say, ASUS, Gigabyte, MSI, etc. would include an adapter for current PSU connectors?


----------



## Grimmyn (Jul 17, 2020)

Nvidia bought 3dfx.
Is "just add moooore" the solution?

Recently Acquired: 3DFX Voodoo 5-6000 Rev3700A! (www.techpowerup.com)


----------



## laszlo (Jul 17, 2020)

I really don't understand why we need so many 12 V inputs into a video card. The current comes from the PSU, one rail delivering the needed amps; even three 12 V rails can deliver the maximum current the cards need, and the wires support it. At the PCB edge they create 3-6 entry points with different uses within the card, but they are all ultimately connected to the same PSU rail, which either delivers or doesn't. You won't get cleaner power to the different subsystems once everything is physically interconnected at the PSU...


----------



## rtwjunkie (Jul 17, 2020)

CandymanGR said:


> GPUs should go the way of efficiency. 600 W power requirements for GPUs? It should have been the opposite.
> Plus this connector is too heavy and it will sag the cards down.
> 
> Genius!


Time to get yourself a brace. They are cheap and barely visible while holding up a lot of the weight.

My complaint is that I would rather "wrestle" with two connector cords that are easy to get out of the way than one thick cord that will ACTUALLY be tough to "wrestle" out of the way. A thick cord will not make for a clutter-free case that doesn't impede airflow.


----------



## Chrispy_ (Jul 17, 2020)

Is Nvidia aware that nobody can buy power supplies at the moment? 
Making your new product dependent on something that's unobtainable at the moment seems like a poorly-planned idea.


----------



## BoboOOZ (Jul 17, 2020)

Chrispy_ said:


> Is Nvidia aware that nobody can buy power supplies at the moment?
> Making your new product dependent on something that's unobtainable at the moment seems like a poorly-planned idea.


I'm not at all sure that this connector, even if the rumor is correct, is arriving with the 30xx series of cards, or that there won't be at least some type of adapter available. New standards take a while to implement, and this looks more like a long shot to me.


----------



## AusWolf (Jul 17, 2020)

BoboOOZ said:


> I'm not sure at all that this connector, even the rumor seems correct, is arriving with the 30XX series of cards, or that there won't be some type of convertor available, at least. New standards take a while to implement and this looks more like a long shot to me.


I agree. The last 3-4 generations have all delivered 1.5-2x the performance at the same power consumption as the previous one. I don't believe (consumer) Ampere is about to break this trend, especially at a time when they're making the jump from 12 to 7 nm as well.


----------



## robb213 (Jul 17, 2020)

Vya Domus said:


> Funny how when AMD has a power-hungry GPU everyone thinks power consumption is everything, but when Nvidia hints at an upcoming power-hungry atrocity everyone's cool with it.


Oh, they paid for it with the GTX 480, my "egg cooker" as I called it back then. Mine would nearly overheat in Dead Island in a HAF X case, hitting about 100 °C against the 105 °C throttle limit. And I consume less power with this 1080 Ti than I did with that... I regretted not getting the HD 5870 instead.

Every generation is a shit-flinging contest; nothing new here.


----------



## medi01 (Jul 17, 2020)

dicktracy said:


> Nvidia requires more power on the topend max tuned cards


Uh, why does NV suddenly require (much) more power for the topend GPUs? Are we back to Fermi times?


----------



## duynguyenle (Jul 18, 2020)

I understand the desire to have a single connector for a graphics card instead of a bundle of them, especially for more power hungry hardware. What I don't get is why on earth Nvidia feels the need to draw up their own bloody connector.

There already exists a 12-circuit header housing, literally from the exact same connector series that the ATX 24-pin, EPS 8-pin and PCIe 6-pin come from; specifically, this part here: https://www.molex.com/molex/products/part-detail/crimp_housings/0469921210. I don't even buy the argument about insufficient power handling capability. Standard Mini-Fit Jr crimp terminals and connector housings are rated up to 9 A per circuit, so for the 12-circuit part you're looking at 9 A * 12 V * 6 = 648 W of power, around the same as the rough '600 W' figure quoted in the article. You don't need a new connector with a new crimp terminal design and non-standard keying for this. THE EXISTING 12-CIRCUIT MINI-FIT JR CONNECTOR WILL DO THE JOB JUST FINE. Not to mention, you can get Mini-Fit Jr terminals rated for even higher amperage (13 A for the copper/tin crimp terminals when used with 18 AWG or thicker conductors, IIRC), in which case the standard 12-pin cable would carry even more power. This is literally just Nvidia trying to reinvent the wheel for no apparent reason.

As someone who crimps their own PSU cables, this is a hecking pain in the butt, and I hope whoever at Nvidia came up with this idea gets 7 years of bad luck. Just use the industry-standard part that already exists, FFS.

/rant
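For what it's worth, the Mini-Fit Jr arithmetic in the post above checks out. A quick sketch using the ratings the post quotes (the 9 A standard and 13 A high-current figures are taken from the post, not verified against a datasheet here):

```python
V_RAIL = 12.0       # volts on the +12 V rail
POWER_CIRCUITS = 6  # 12-circuit housing: 6 power + 6 ground

# Per-circuit current ratings quoted in the post
standard_a = 9.0        # standard Mini-Fit Jr crimp terminals
high_current_a = 13.0   # higher-rated terminals with 18 AWG or thicker wire

standard_w = standard_a * V_RAIL * POWER_CIRCUITS
high_current_w = high_current_a * V_RAIL * POWER_CIRCUITS
print(standard_w)      # 648.0 W
print(high_current_w)  # 936.0 W
```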


----------



## L'Eliminateur (Jul 18, 2020)

Jism said:


> The 3dfx actually came with it's own power brick to be plugged into the wall to feed the GPU.
> 
> Apart from that, weird decision. 2x8 pins should be more then enough for most GPU's unless this thing is supposed to feed GPU's in enterprise markets or machines.
> 
> Lot of wires are overrated as well. One single yellow wire on a good build PSU could easily push 10A up to 13A.


That 3dfx board drew a fraction of the power of a modern GPU; that's why it used an external brick. There was no PCIe power connector back then and no standard for internal board power connectors; the most they could do was a Molex connector, which isn't very reliable or high-power.
As I've said, an external power brick for a modern GPU would be a PC PSU in an external case (thus costing upwards of 100 USD), as it would need 500 W of output. Plus you'd need a multitude of thick cables and a connector occupying an entire slot on the back (there's no room for the power connector next to the regular video outputs), and that's another fan making noise and getting clogged.

2x 8-pin may be barely enough for Ampere, but not for the future, maybe not even for the 3080 Ti (or whatever name it comes out under); there are already 2080 Ti cards with 3x 8-pin.

Yes, a wire can push 10+ A, but you have to account for the voltage drop, heating in the wire, and the contact resistance of the connector. It's not that simple (otherwise we'd use one big thick cable instead of 2x 8-pin, for example), and flexibility matters too.
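The voltage-drop point can be made concrete with a back-of-the-envelope estimate. This is only a sketch: the per-metre resistances are approximate values for copper wire, and the connector contact resistance is an illustrative guess, not a measured figure:

```python
def cable_loss(current_a, length_m, ohms_per_m, contact_ohms=0.005):
    """Round-trip voltage drop and heat dissipated in one supply/return wire pair."""
    # Out and back through the cable, plus a contact at each end
    r_total = 2 * (length_m * ohms_per_m + contact_ohms)
    v_drop = current_a * r_total
    p_heat = current_a ** 2 * r_total
    return v_drop, p_heat

# One wire pair carrying 10 A over a 0.6 m cable run
for gauge, r in (("16 AWG", 0.0132), ("18 AWG", 0.0210)):
    v, p = cable_loss(10, 0.6, r)
    print(f"{gauge}: {v * 1000:.0f} mV drop, {p:.2f} W heating the cable")
```

A few watts dissipated in the harness per pair is why a spec budgets current per pin and per gauge rather than trusting one thick wire.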


----------



## jonnyGURU (Jul 18, 2020)

duynguyenle said:


> I understand the desire to have a single connector for a graphics card instead of a bundle of them, especially for more power hungry hardware. What I don't get is why on earth Nvidia feels the need to draw up their own bloody connector.
> 
> There already exist a 12-circuit header housing (literally from the exact same connector series that the ATX24 pin, EPS 8 pin and PCIe 6 pin comes from, specifically, this part here: https://www.molex.com/molex/products/part-detail/crimp_housings/0469921210). I don't even buy the argument about insufficient power handling capability. Standard Minifit JR crimp terminals and connector housings are rated up to 9A per circuit pairs, so for the 12-circuit part, you're looking at 9A*12V*6 = 648W of power. This is around the same as the rough '600W' figure quoted in the article, not to mention, you can get higher . You don't need a new connector along with new crimp terminal design and non-standard keying for this. THE EXISTING ATX 12 PIN CONNECTOR WILL DO THE JOB JUST FINE. Not to mention, you can get Minifit-Jr terminals that are rated for even higher amperage (13A for copper/tin crimp terminals when used with 18AWG or thicker conductors, IIRC, in which case the standard ATX 12-pin cable will carry even more power). This is literally just Nvidia trying to reinvent the wheel for no apparent reason.
> 
> ...



That Molex part is a Mini-Fit. The one in the drawing is a Micro-Fit; look at the dimensions. And it's not "made up". It exists, and has for a long time. I can buy one from Digikey, Mouser, etc. from four or five different manufacturers.

Anyway, this isn't a news story. This is someone downloading a drawing from a connector supplier and posting it as "news". Real news would be seeing the connector on the card itself. Am I right?

Now look what's popping up in my Google ads!





See... Not a "made up" connector.


----------



## duynguyenle (Jul 18, 2020)

jonnyGURU said:


> That Molex is a mini-fit.  The one in the drawing is a micro-fit.  Look at the dimensions.  And it's not "made up".  It exists.  Has for a long time.  I can buy one from Digikey, Mouser, etc. from four or five different manufacturers.
> 
> Any way.. This isn't a news story.  This is someone downloading a drawing from a connector supplier and posting it as "news".  Real news would be seeing the connector on the card itself.  Am I right?
> 
> ...



Thanks for the reference. I didn't recognise the keying of the connector housing, and the wording of the article made me think they had designed a fancy new connector housing; this invalidates my previous point. What's the technical improvement in moving from Mini-Fit Jr to Micro-Fit, then? Most of the other connectors commonly used in ATX PCs are Mini-Fit Jr; what do you gain by going to Micro-Fit? Is it really that much better in terms of mechanical strength or electrical/current handling?


----------



## jonnyGURU (Jul 19, 2020)

duynguyenle said:


> Thanks for the reference, I didn't recognise the keying of the connector housing and the wording of the article made me think they designed a fancy new connector housing. This invalidates my previous point. What's the technical improvement you get from moving from the Minifit Jr to Minifit then? Most of the other connectors commonly used for ATX PCs are Minifit Jr, what do you gain by going to Micro-fit? Is it really that much better in terms of mechanical strength or electrical/current handling?



Smaller size, nothing more. If this is real, then they're only doing it because they ran out of PCB space.


----------



## Nero1024 (Jul 19, 2020)

Can somebody explain this stupid decision?


----------



## medi01 (Jul 19, 2020)

Nero1024 said:


> Can somebody explain this stupid decision?


Pissing off TSMC and going with Samsung has back*fire*d.


----------



## ThrashZone (Jul 19, 2020)

Hi,
Most PSUs have six 8-pin VGA ports on them, so I don't see this as an issue; it's just speculation that you'd need a new PSU, when at worst you'd most likely just need a new adapter or cable.
A 30-series Titan RTX may need more, but those aren't exactly mainstream cards either.

Guess we'll find out sooner or later


----------



## BoboOOZ (Jul 19, 2020)

medi01 said:


> Pissing off TSMC and going with Samsung has back*fire*d.


That's the short way of putting it. Basically, Nvidia is anticipating higher-TDP graphics cards in the near future.

This might be because the improvement from moving to Samsung 8 nm is less than they expected, or simply because they intend to leave less performance on the table and place their cards further toward the right side of the voltage/frequency curve, sacrificing some efficiency for more raw power.

To be fair, we know nothing of the TDPs from team red; they might have gone up too, in spite of using TSMC's 7 nm EUV.


----------



## sergionography (Jul 20, 2020)

Fermi 2020


----------



## Shatun_Bear (Jul 20, 2020)

The spec allows up to 600 W, so an Ampere 3080 Ti might draw 400 W+!! This looks like Nvidia dropping the ball by going with Samsung's 8 nm fab process, or Ampere as an architecture being a bit of a turd. Likely a bit of both.

RDNA2 on TSMC's 7nm seems to be quite formidable.



BoboOOZ said:


> To be fair, we know nothing of the TDPs from team red, they might've gone up too, in spite of using TSMC's 7nm EUV.



We know a little. The PS5's RDNA2 GPU runs at 2.2 GHz in a ~100 W package, offering near-2080 performance.

A 350 W RDNA2 GPU is going to give the 2080 Ti a run for its money.


----------



## BoboOOZ (Jul 20, 2020)

Shatun_Bear said:


> We know a little. PS5 RDNA2 GPU is 2.2Ghz in a ~100W package offering near 2080 performance.


That's only 36 CUs, and the TDP is not 100 W; the reason the PS5 is so big is to allow better ventilation. So it is indeed more efficient than RDNA1, but it's unclear by exactly how much.


Shatun_Bear said:


> A 350W RDNA2 GPU is going to give a 2080 Ti a run for its money.


It had better do more than that; it will launch at the same time as Ampere, and the 2080 Ti is a generation old now. It has to give the 3080 a run for its money, be it at 350 W or 500 W.


----------



## Shatun_Bear (Jul 20, 2020)

BoboOOZ said:


> That's only 36CU's and the TDP is not 100W, the reason the PS5 is so big is to allow better ventilation. So it indeed is more efficient than RDNA1, but it's unclear by exactly how much.
> 
> It better do more than that, it will launch at the same time with Ampere, the 2080Ti is a generation old now. It has to give the 3080 a run for its money, be it at 350W or 500W.



I'm talking about the TDP of the PS5 GPU cores only, not the entire system. 100W was a guess.

And yes latest rumours from sources I trust state the top Big Navi will be around 40% faster than a 2080 Ti, so it will give that a run for its money as stated and challenge the 3080 Ti.


----------



## BoboOOZ (Jul 20, 2020)

Shatun_Bear said:


> I'm talking about the TDP of the PS5 GPU cores only, not the entire system. 100W was a guess.


Yes, I reckoned you were guesstimating, but I think you're being overoptimistic. I imagine SmartShift in these APUs gives more oomph to the GPU part, so I think the TDP is at or above 200 W. I'm guesstimating too.


Shatun_Bear said:


> And yes latest rumours from sources I trust state the top Big Navi will be around 40% faster than a 2080 Ti, so it will give that a run for its money as stated and challenge the 3080 Ti.


Here's a recap from Tom that points out that we will see high TDPs from Nvidia this fall:


----------



## EarthDog (Jul 20, 2020)

ThrashZone said:


> Hi,
> Most psu's have six 8 pin vga ports on them so I don't see this as an issue and just speculation needing a new psu when most likely just needs a new adapter or cable at worst.
> 30 series Titan rtx may need more but those aren't exactly mainstream cards either.
> 
> Guess we'll find out sooner or later


Most have two 6+2-pin connectors... some (typically a bit higher power, ~700 W+) have four. A few, over the 1 kW range, have six. Most do NOT have six, but two or four.



Shatun_Bear said:


> And yes latest rumours from sources I trust state the top Big Navi will be around 40% faster than a 2080 Ti, so it will give that a run for its money as stated and challenge the 3080 Ti.


So, essentially, what you are saying is that an RDNA2 card will be 'close enough' to a 300-400 W Ampere card to give it a "run for its money"? What does that mean, exactly? Like 10% behind? I'm curious how you came up with that conclusion...

What I see is this: the 2080 Ti (FE) currently leads the 5700 XT (Nitro+) by 42% (1440p). With all the rumors about high power use, and even with Nvidia getting a significant node shrink (versus AMD, who is only tweaking) plus a new architecture, you still think that holds?

I mean, I hope you're right, but from what we've seen so far, I don't understand how that kind of math works out. You're assuming that Ampere, with a die shrink and a new architecture, will only be about as much faster as the 2080 Ti was over the 1080 Ti (~25-30%), while a new architecture and a node tweak will gain upwards of 70% performance? What am I missing here?

If NVIDIA's Ampere comes in at 300 W+, I don't see RDNA2 coming close (split the difference between the 2080 Ti and the Ampere flagship).
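For what it's worth, the arithmetic behind chaining those relative-performance figures can be sketched like this (both multipliers are rumoured numbers quoted in this thread, so the result is purely speculative):

```python
# Chaining the relative-performance figures quoted in this thread
# (rumour numbers, not measurements).
ti_over_5700xt = 1.42    # 2080 Ti (FE) vs 5700 XT (Nitro+) at 1440p, per the post
big_navi_over_ti = 1.40  # rumoured Big Navi advantage over the 2080 Ti

# If both hold, the Big Navi flagship would need roughly double
# the 5700 XT's performance.
big_navi_over_5700xt = ti_over_5700xt * big_navi_over_ti
print(f"Big Navi would need ~{big_navi_over_5700xt:.2f}x the 5700 XT")  # ~1.99x
```

That ~2x jump over AMD's own previous flagship is exactly the scaling being questioned above.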


----------



## Chrispy_ (Jul 20, 2020)

BoboOOZ said:


> Here's a recap from Tom that points out that we will see high TDP's from Nvidia this fall:


Tom dropped off the radar for me these last few months; I guess he fell out of my feed. Still, I believe he's credible and smart with a reasonable track record, and he is _really_ certain that 1st-gen Ampere will be an overheating, power-hungry dumpster fire on Samsung 8nm.

Even if he's only half-right, that doesn't bode well for the 3000-series. 

It will also mean that Nvidia plays dirty (using DLSS and black-box dev tools that hinder AMD/Intel performance to get the unfair advantage their inferior hardware needs). For this generation, I suspect that means proprietary RTX-specific extensions get pushed heavily, rather than DX12's DXR API. Gawd, I hope I'm wrong...


----------



## Th3pwn3r (Jul 20, 2020)

Chrispy_ said:


> Tom dropped off the radar for me these last few months, I guess he fell off my feed. Still, I believe he's credible and smart with a reasonable track record and he is _really _certain that 1st-gen Ampere will be an overheating, power-hungry dumpster fire on Samsung 8nm.
> 
> Even if he's only half-right, that doesn't bode well for the 3000-series.
> 
> It will also mean that Nvidia play dirty (using DLSS and black-box dev tools that hinder AMD/Intel performance to get the unfair advantage their inferior hardware needs). In the case of this generation, I suspect that means that proprietary RTX-specific extensions are heavily pushed, rather than DX12's DXR API. Gawd, I hope I'm wrong....



Lol inferior hardware. Inferior hardware that's better. Wut?


----------



## Shatun_Bear (Jul 20, 2020)

Th3pwn3r said:


> Lol inferior hardware. Inferior hardware that's better. Wut?



We don't know who's 'better' yet.


----------



## BoboOOZ (Jul 20, 2020)

Chrispy_ said:


> It will also mean that Nvidia play dirty (using DLSS and black-box dev tools that hinder AMD/Intel performance to get the unfair advantage their inferior hardware needs). In the case of this generation, I suspect that means that proprietary RTX-specific extensions are heavily pushed, rather than DX12's DXR API. Gawd, I hope I'm wrong....


I haven't seen the edit.

That's always the marketing battle to be fought, and AMD has been very bad at this in the past. And Nvidia are masters at this game; just ask 3dfx, S3, and Kyro...

But this time there are AMD APUs in all the consoles that matter, and the whole Radeon marketing team has been changed. So I reckon they might have something resembling a strategy.


----------



## Chrispy_ (Jul 20, 2020)

Th3pwn3r said:


> Lol inferior hardware. Inferior hardware that's better. Wut?


It's based on the video. Tom's guess is that Big Navi will perform 50-60% better than a 2080 Ti, whilst the same evidence points towards Nvidia needing insane power draw and cooling just to hit 40% more than a 2080 Ti.

If he's right, it means AMD will have the superior hardware.

I hope you realise this is a speculation thread though and there's no hard evidence on Ampere or Big Navi yet.

Me? I'm sitting on the fence and waiting for real-world testing and independent reviews. I'm old enough to have seen this game between AMD/ATi and Nvidia played out over and over again for 20+ years. It would not be the first time that either company had intentionally leaked misleading performance numbers to throw off the other team.


----------



## Th3pwn3r (Jul 20, 2020)

When was the last time AMD had better hardware than Nvidia? From a performance standpoint.


----------



## BoboOOZ (Jul 20, 2020)

The best way to find the answer to your own question is to go to this page:








NVIDIA GeForce GTX 1080 Ti Specs (www.techpowerup.com): NVIDIA GP102, 1582 MHz, 3584 Cores, 224 TMUs, 88 ROPs, 11264 MB GDDR5X, 1376 MHz, 352 bit
				



and use the relative performance graph to find top-of-the-line cards launched in the same year. I found this for you:








NVIDIA GeForce GTX 780 Specs (www.techpowerup.com): NVIDIA GK110, 902 MHz, 2304 Cores, 192 TMUs, 48 ROPs, 3072 MB GDDR5, 1502 MHz, 384 bit

AMD Radeon HD 7990 Specs (www.techpowerup.com): AMD Malta x2, 1000 MHz, 2048 Cores, 128 TMUs, 32 ROPs, 3072 MB GDDR5, 1500 MHz, 384 bit

NVIDIA GeForce GTX 780 Ti Specs (www.techpowerup.com): NVIDIA GK110B, 928 MHz, 2880 Cores, 240 TMUs, 48 ROPs, 3072 MB GDDR5, 1753 MHz, 384 bit


----------



## Chrispy_ (Jul 20, 2020)

Th3pwn3r said:


> When was the last time AMD had better hardware than Nvidia? From a performance standpoint.


If you're asking seriously, it depends what you mean:

In terms of performance per dollar or performance per transistor?

Currently they do. AMD's Navi 10 is a reasonable competitor for TU106, and it is faster and cheaper than either the 2060S or the 2070 for a lower transistor count. It can't do raytracing, but arguably TU106 is too weak to do it anyway. I've been wholly disappointed by my 2060S's raytracing performance, even with DLSS trying desperately to hide the fact it's only rendering at 720p. Heck, my 2060S can barely run Quake II or Minecraft.
In terms of halo/flagships?

In 2008, Terascale architecture (HD 4000 series) ended a few years of rubbish from ATi/AMD and was better than the 9800GTX in every way.
In 2010, Fermi (GTX 480) was a disaster that memes were born from.
In 2012, Kepler (GTX 680) had an edge over the first iteration of GCN (HD 7970) because DX11 was still the norm. As DX12 games appeared, Kepler fell apart badly.
In 2014, Kepler on steroids (GTX 780 Ti and Titan) tried to make up the difference, but AMD just made Hawaii (290X), which was like an HD 7970 on steroids, to match.
Nvidia has pretty much held the flagship position since Maxwell (900 series), and generally offered better performance/watt and performance/transistor even before you consider that they made consumer versions of their huge enterprise silicon (980 Ti, 1080 Ti, 2080 Ti). The Radeon VII was a poor attempt to do the same, and it wasn't a very good product even if you ignore the price: it was just Vega's failures clocked a bit higher, with more VRAM that games (even 4K games) didn't really need.

So yeah, if you don't remember that the performance crown has been traded back and forth a lot over the years, then you need to take off your special Jensen-Huang spectacles and revisit some nostalgic YouTube comparisons of real games frequently being better on AMD/ATi hardware. _Edit: or just look at BoboOOZ's links._

I don't take sides. If Nvidia puts out rubbish, I'll diss it.
If AMD puts out rubbish, I'll diss that too.
I just like great hardware, ideally at a reasonable price and power consumption.


----------



## TheoneandonlyMrK (Jul 20, 2020)

Th3pwn3r said:


> When was the last time AMD had better hardware than Nvidia? From a performance standpoint.


The 7970 clawed performance back, and the original HD 5870 had its contemporary beat.
And later this year...

Moore's Law Is Dead might as well have quoted me verbatim, though I can't remember where on here I called it.

And if the many rumours are as true as usual (i.e. bits are, but 50% is balls), then it still doesn't look rosy for Nvidia this generation regardless.


----------



## Fluffmeister (Jul 20, 2020)

Turing is two years old, it's already died of boredom waiting for Big Navi to beat it to death.


----------



## Chrispy_ (Jul 20, 2020)

Fluffmeister said:


> Turing is two years old, it's already died of boredom waiting for Big Navi to beat it to death.


Yeah, but Turing _was_ beaten to death by Little Navi at under $400. Hell, the $375 or so that the 2060 is going for is a joke because it can't raytrace well enough to matter, and it's no faster than a 5600 XT, which is two price tiers below it.

Even in 2019 there was no excuse for a card costing $350 to not have at least 8 GB of RAM, and the $350 Founders Edition was as rare as hen's teeth, with all of the partner cards coming in at $370-$410. For a 6 GB card? *Ouch!*

Without Navi, Nvidia would have continued to take the piss with such stellar values as the $600 2070 FE. The "Super" lineup and price cuts were Nvidia's answer to Navi. They didn't upgrade all their shader counts and reduce prices out of the goodness of their hearts; that's not how businesses run...


----------



## Fluffmeister (Jul 21, 2020)

The joys of a free market, how profound.


----------



## medi01 (Jul 21, 2020)

BoboOOZ said:


> But this time there are AMD APU's in all the consoles that matter, and the whole Radeon marketing team has been changed. So I reckon they might have something resembling a strategy.


"Poor Volta" was actually hilarious.
It's not the marketing team but the product development team that failed to deliver.

AMD seems to have stopped saying things just because some kids want to hear them.



Chrispy_ said:


> offered better performance/Watt and performance/Transistor


AMD was also cramming more transistors into the same area on the same process, but running them at lower clocks (e.g. Polaris). It has changed with Vega/Navi.



Fluffmeister said:


> The joys of a free market


Is on the other side when monopolies or, often, even duopolies are involved.


----------



## cucker tarlson (Jul 21, 2020)

4:50









AIBs are still going for 2x 8-pin, with 3x 8-pin for the overclocking versions


----------



## Th3pwn3r (Jul 21, 2020)

I should have worded things a bit differently. What I meant was in terms of current hardware, generation-wise. Nvidia's offerings are ancient while AMD's are brand new. It would be cool if the latest gen were released from Nvidia and AMD at the same time. I'm not biased; I buy whatever performs the best with no budget for my main rig, and I buy whatever offers the best performance per dollar for my other machines if I choose not to use what's in storage.


----------



## Vayra86 (Jul 21, 2020)

Th3pwn3r said:


> I should have worded things a bit differently. What I meant was in terms of current hardware generation wise. Nvidia's offerings are ancient while AMD are brand new. It would be cool if latest gen were released from Nvidia and AMD at the same time. I'm not biased, I buy whatever performs the best with no budget for my main rig and I buy whatever offers best performance per dollar for my other machines if I choose not to use what's in storage.



Well, the latest is that they're going to be at least about a quarter apart in release schedule, but we can also count on AMD not having AIB versions ready at launch, and you can rest assured Nvidia will open the full barrage close to launch.



Chrispy_ said:


> Yeah, but Turing _was _beaten to death by Little Navi at under $400. Hell, the $375 or so that the 2060 is going for is a joke because it can't raytrace well enough to matter and it's no faster than a 5600XT which is two price tiers below it.
> 
> Even in 2019 there was no excuse for a card costing $350 to not have at least 8GB RAM, and the $350 founders edition was as rare as hen's teeth with all of the partner cards coming in at $370-$410. For a 6GB card? *Ouch!*
> 
> Without Navi, Nvidia would have continued to take the piss with such stellar values as the $600 2070FE. The "Super" lineup and price cuts were Nvidia's answer to Navi. They didn't upgrade all their shader counts and reduce prices out of the goodness of their hearts or charity, that's not how businesses run....



The whole Super lineup was not so much a response to Navi as a response to shareholders and the overall Turing reception. Initially, that is. The later tweaks were definitely responses in terms of price point. But to think Super exists because Navi is Navi... please. Nvidia doesn't need AMD to release updated cards. And Super is pretty competitive with Navi.

Also... let's turn this around. We can moan about 6 GB, but what about die size, hm? Navi is what... 255 mm²? A 2060 is _twice as big_. Why pay nearly $400 for something with much better yields?


----------



## cucker tarlson (Jul 21, 2020)

nvidia releases stuff early, amd releases stuff late.
it's amd that needs to haul ass



Vayra86 said:


> Well the latest is that they're at least going to be about a quarter apart in release schedule, but we can also count on AMD not having AIB versions ready at launch, and you can rest assured Nvidia will open the full barrage of that close to launch.


and a reference that doesn't suck

I like what nvidia are doing with the FE coolers on rtx 2000;
it drives aib prices down.

example:
my 2070s trio has the best cooler tpu ever tested. it cost me 2560 pln while the fe was 2400.
meanwhile the 5700xt reference was 1900 pln, but decent aibs were 2200 pln.
see the markup? percentage-wise it's twice as big on amd partner cards
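The percentages behind that claim work out as follows (a quick sketch using the PLN prices quoted in the post; the prices themselves are one buyer's local examples):

```python
# Sanity-checking the "twice as big" markup claim (prices in PLN, from the post).
def markup_pct(aib_price: float, reference_price: float) -> float:
    """Percentage premium of an AIB card over the reference/FE price."""
    return (aib_price - reference_price) / reference_price * 100

nv = markup_pct(2560, 2400)   # 2070 Super Trio vs Founders Edition
amd = markup_pct(2200, 1900)  # decent 5700 XT AIB vs reference

print(f"NVIDIA AIB markup: {nv:.1f}%")  # ~6.7%
print(f"AMD AIB markup:    {amd:.1f}%")  # ~15.8%
```

So the AMD partner-card premium here is roughly 2.4x the NVIDIA one, which is the "twice as big, percentage-wise" point being made.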


----------



## Vayra86 (Jul 21, 2020)

cucker tarlson said:


> example:
> my 2070s trio is the best cooler tpu ever tested.it cost me 2560 while fe was 2400.
> meanwhile 5700xt reference was 1900pln,but decent aibs were 2200 pln.
> see the markup ? percentage wise it's twice as big on amd partner cards



Initially, FE was a tweak to push AIB prices up. Not sure what idea you are trying to sell here. The markup exists, and FE caused it. Now that the markup is in effect, we say 'oh look, it's almost the same price'... yeah. It's also one LOCAL example from one generation. Let's go into wait mode on that one... because so far the net result of FE is certainly not that AIBs got cheaper. The price point for each tier has gone up. Maybe what you want to say is that FE sets a price guide as a hard item in the market, which keeps things somewhat under control... except it never really did so far in any tangible way. But theoretically... yes.

All I read out of your example here is that quality solutions are priced accordingly, in both Navi's and Turing's cases. Navi stock cooler = shit value. AIB coolers = good value. Nvidia's FE, which is also open-air = good value, but not quite as good as the best tested one.


----------



## cucker tarlson (Jul 21, 2020)

Vayra86 said:


> Initially, FE was a tweak to push AIB prices up. Not sure what idea you are trying to sell here. The mark up exists and FE caused it. Now that the mark up is in effect we say 'oh look its almost the same price'... yeah  Its also one LOCAL example of one generation. Let's go into wait mode on that one... because so far the net result of FE is certainly not that AIBs got cheaper. The price point for each tier has gone up. Maybe what you want to say is that FE sets a price guide as a hard item in the market, which keeps things somewhat under control.... except it never really did so far in any tangible way. But theoretically... yes.


*not* a local example
msrp on the fe was 500 and on the trio it was 515
entry-level aibs were 500

you get the idea of "what I'm selling"?
now check the 5700xt msrp
a $40 premium over a $400 card for a good aib


----------



## Vayra86 (Jul 21, 2020)

cucker tarlson said:


> *not* a local example
> msrp on fe was 500 and on trio it was 515
> entry level aibs were 500
> 
> ...



Yeah, so how do you defend the statement that AIB price levels are lower because of the FE? Their bottom end is priced like an FE now, which is pretty close to a top-end card. So for most customers, the net result is their GPU got more expensive, not cheaper.


----------



## cucker tarlson (Jul 21, 2020)

Vayra86 said:


> Yeah, so how do you defend the statement that AIB price levels are lower because of the FE?


by looking at the numbers, maybe
you pay a lower % markup on the best 2070s cooler than on the best 5700xt coolers like the nitro

15 bucks on a $500 card vs 20-40 dollars on a $400 card?

how do I make it easier for you?


----------



## TheoneandonlyMrK (Jul 21, 2020)

cucker tarlson said:


> *not* a local example
> msrp on fe was 500 and on trio it was 515
> entry level aibs were 500
> 
> ...


2070 Super £489-699, 5700 XT £369-469. Scan UK, now.

Since when did one price statement suit reality?


----------



## cucker tarlson (Jul 21, 2020)

theoneandonlymrk said:


> 2070super 489-699£  5700Xt 369-469£. Scan UK now.
> 
> Since when did one price statement suit reality.


I said I was comparing the MSRPs provided in TPU reviews. You may wanna start reading before our conversation blows up into a 5-page-long spat.










PowerColor Radeon RX 5700 XT Red Devil Review (www.techpowerup.com): The PowerColor Radeon RX 5700 XT Red Devil is overclocked out of the box and uses a massive triple-slot, triple-fan cooler that delivers outstanding noise levels. It's actually the quietest Radeon card we ever tested, and quieter than all NVIDIA RTX cards except for one custom design.

MSI Radeon RX 5700 XT Gaming X Review (www.techpowerup.com): The factory-overclocked MSI Radeon RX 5700 XT Gaming X is a large triple-slot, dual-fan design with a completely new cooler that achieves the best temperatures of all Navi cards we've tested so far. Even at its low temperatures, the card runs very quietly, and idle-fan-stop is included, too.

Sapphire Radeon RX 5700 XT Nitro+ Review (www.techpowerup.com): Sapphire's Radeon RX 5700 XT Nitro+ is the company's flagship Navi card. It comes with a large triple-slot, triple-fan cooler that runs quiet and cool. Adjustable RGB lighting and fan-stop is included, too. A unique addition is the ability to control the dual BIOS switch through software.
				




A 40-50 dollar markup on a $400 card vs $15 for the 2070 Super Trio.


----------



## TheoneandonlyMrK (Jul 21, 2020)

cucker tarlson said:


> I said I was comparing msrps provided in tpu reviews you may wanna start reading before our conversation blows up into a 5 page long spat
> 
> 
> 
> ...


You may wanna stop implying stuff about others you can't know.

I read your comments, but I feel the price of one card on day one means nothing to anyone now.

So I checked up, provided facts, and there you go, staying calm.

Stop grabbing at one-offs like they're the law, too.


----------



## cucker tarlson (Jul 21, 2020)

theoneandonlymrk said:


> You may wanna stop implying stuff about others you can't know.
> 
> I read your comments, but feel the price of one card on day one means nothing to anyone now.
> 
> ...


so official msrp is a one off
but one store in one country at one given time is "the law"

good grief
you can't be taught


----------



## TheoneandonlyMrK (Jul 21, 2020)

cucker tarlson said:


> so official msrp is a one off
> but one store in one country at one given time is "the law"
> 
> good grief
> you can't be taught


Your use of language, tut.

It was an example; you're just pushing your same Nvidia-best mantra with release prices, which I think is ridiculous.

No need to teach or preach, and what right anyway.


----------



## Shatun_Bear (Jul 21, 2020)

Nvidia card prices in shops are extortionate, anyone trying to argue they're not needs to get off the green koolaid.


----------



## cucker tarlson (Jul 21, 2020)

Shatun_Bear said:


> Nvidia card prices in shops are extortionate, anyone trying to argue they're not needs to get off the green koolaid.


they are
wasn't my point at all

my point was: the better the reference model, the harder aibs have to try to sell theirs.


----------



## medi01 (Jul 21, 2020)

Vayra86 said:


> Super line up was not so much a response to Navi
















Denial - Wikipedia (en.wikipedia.org)


----------



## Vayra86 (Jul 22, 2020)

cucker tarlson said:


> my point was,the better the reference model,the harder aibs have to try to sell theirs.



That is what I said too, in fact. But it's not what you said. AIB models didn't get better because of the FE; the FE just positions itself higher in the price stack. But initially (Pascal), the FE was simply a markup, and AIBs did position their cards _below it_, while cards comparable to a Trio were priced royally above an FE. That is why I said the single example says nothing, because it doesn't exist in isolation.

The comparison to the Navi reference is also unfair because the Navi reference is pretty shit; it's comparable to the initial FE, more so than the current one. The two aren't the same thing.

So we can be all 'oh my, it's so cheap compared to XYZ', but what really happens is 'you get what you pay for'.


----------



## cucker tarlson (Jul 22, 2020)

Vayra86 said:


> The comparison to Navi reference is also unfair because Navi reference is pretty shit, its comparable to the initial FE, more so than the current one. The two aren't the same thing.


it is fair. the shittier the reference, the higher the markup on quality partner coolers.


----------



## Vayra86 (Jul 22, 2020)

cucker tarlson said:


> it is fair.the shittier the reference,the higher the markup on quality partner coolers.



Alright, then we are in full agreement after all, let's leave it there lol


----------



## cucker tarlson (Jul 22, 2020)

Vayra86 said:


> Alright, then we are in full agreement after all, let's leave it there lol


that was my point from the very beginning


----------



## Saabjock (Jul 28, 2020)

Most people do not have a lot of room between the forward end of their video card and the front of the case. Positioning the connector there may pose a few issues with case fitment.


----------



## RainingTacco (Aug 1, 2020)

Saabjock said:


> Most people do not have a lot of room between the forward end of their videocard and the front of the case. Positioning the connector there may pose a few issues with case fitment.



You are right. With a quality flexible cable, there's no problem placing it at the side, just as usual. Manufacturers should stop putting inline caps in power connectors.


----------



## P4-630 (Aug 16, 2020)

_I've been talking to industry sources of mine who have said that NVIDIA will be using the 12-pin PCIe power connector on its new GeForce RTX 30 series Founders Edition graphic cards, while AIB partners on the other hand will not be using the 12-pin connector, and rather will have multiple 8-pin PCIe power connectors._










GeForce RTX 3090: 12-pin PCIe on Founders Edition, not on custom cards (www.tweaktown.com): We're hearing that NVIDIA will use a new 12-pin PCIe power connector on the GeForce RTX 3090 Founders Edition, not custom cards.


----------



## kayjay010101 (Aug 17, 2020)

P4-630 said:


> _I've been talking to industry sources of mine who have said that NVIDIA will be using the 12-pin PCIe power connector on its new GeForce RTX 30 series Founders Edition graphic cards, while AIB partners on the other hand will not be using the 12-pin connector, and rather will have multiple 8-pin PCIe power connectors._
> 
> 
> 
> ...


By "multiple" 8-pin connectors, do they mean two or three? I'd lean towards three, considering the lack of clarification and the use of "multiple" instead of "two".
It'd be crazy to see 3x 8-pin connectors become the new standard; that was kind of reserved for high-end LN2-focused AIB cards previously.
Can't wait for the 4x 8-pin K|NGP|N card from EVGA


----------



## EarthDog (Aug 17, 2020)

2 or 3, depending on the card and its power requirements...

I highly doubt that will be 'standard' on anything south of the Titan and non-Titan flagship parts, but yeah, we've seen this before on a couple of GPUs... yep. The MSI Lightning IIRC had it, as well as the K|NGP|N.


----------

