
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? Keep in mind that Nvidia (according to Tom) might be trying to make their lineup less confusing to consumers (I won't hold my breath). The heat output is going to be enormous, because Nvidia is obsessed with mindshare: there are so many people who will pointlessly throw themselves at a corporate identity simply to have the best at a million dollars per card, even when $200 is what they can actually afford.

Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target: a slider for the card's power, even if it has steps of 25 watts or so. I know AMD will still crank the power up, but I don't need 300 FPS. I'm still on a 60 Hz screen for now and am interested in 120/144 Hz later this year, but options are limited in the part of the market I'm looking at, since I game maybe 2% of the time I'm at my rig.

From what I keep seeing, the 3090 will be around 320 W and the 3080 around 220 W; I guess we'll see soon enough. I get the feeling the 3090 was actually made to be their next Titan, given the amount of memory it has.
 
Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target: a slider for the card's power, even if it has steps of 25 watts or so. I know AMD will still crank the power up, but I don't need 300 FPS. I'm still on a 60 Hz screen for now and am interested in 120/144 Hz later this year, but options are limited in the part of the market I'm looking at, since I game maybe 2% of the time I'm at my rig.
The Radeon driver already lets you define a power budget, both globally and on a per-game basis. I run my 5700 XT at 120 W most of the time, and it's passively cooled when it does. And, by the way, the slider is continuous.

The only thing missing is memory underclocking, but that's more like peanuts compared to the GPU itself.
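For what it's worth, on Linux you don't even need the Radeon GUI for this: the amdgpu driver exposes a power cap through the standard hwmon sysfs interface, with values in microwatts. A minimal sketch, assuming the card is `card0` and the hwmon index is `hwmon0` (both vary per system, so check your own paths first):

```shell
# Set a GPU power cap via amdgpu's hwmon interface (Linux, run as root).
# NOTE: card0/hwmon0 are assumptions; list /sys/class/drm/card*/device/hwmon/
# on your machine to find the right entry.
TARGET_W=120                           # desired cap in watts, e.g. 120 W
TARGET_UW=$((TARGET_W * 1000000))      # sysfs power1_cap expects microwatts
echo "$TARGET_UW"                      # 120000000

# Uncomment to actually apply the cap (path is an assumption for your system):
# echo "$TARGET_UW" > /sys/class/drm/card0/device/hwmon/hwmon0/power1_cap
```

The driver clamps writes to the range advertised by `power1_cap_min` / `power1_cap_max` next to that file, so you can't set something the board can't do.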
 
In other words, that big RDNA2 chip is beating the 3080 (hardly surprising, to be honest).
I haven't seen a single solid RDNA2 leak, only suppositions, have you?
 
Yeah, and besides, it would have been better if he had said GA102 can't beat it either. Shame.
 
From what I've seen, adapters will be included.
Yeah, I had just found that out, so I'm not worried (other than the looks, lol).
From what I keep seeing, the 3090 will be around 320 W and the 3080 around 220 W; I guess we'll see soon enough. I get the feeling the 3090 was actually made to be their next Titan, given the amount of memory it has.
I thought the way they're doing this, they're scrapping the Titan badging from the gaming lineup and making it a pro card, but I could be mistaken.
 
Yeah, I had just found that out, so I'm not worried (other than the looks, lol).

I thought the way they're doing this, they're scrapping the Titan badging from the gaming lineup and making it a pro card, but I could be mistaken.

Not sure; you could be right, for all I know. Although I'm not after a 300 W+ video card, that's for sure, regardless of the price.
 
Were you not around for the last ten years? Every GPU release has had a hype train.
And if not new, unreleased tech, what then do we discuss? Do we just compare benchmarks and fix issues? Some people like a less dry discussion.
You can simply not read them.

Sorry I have no idea what you're trying to say.
 
So worth it: skipping all the leaks and the pointless speculation on a forum. It wouldn't have felt the same otherwise. I made an awesome dinner, sat down, and watched most of it. Very satisfied, even though I'm not an Nvidia customer, but that's not the point.

 
So all the 30 series have a 12 pin connector?
Simple: not all cards need that many power lines. For many cards, one 6-pin/8-pin is enough. So why build a connector with a bunch of power lines when only a few are needed? It's needless and wasteful.


You need to read that Wikipedia article you're quoting a little closer. DP 2.0 was on the roadmap in 2016, but it wasn't finalized until June 26th, 2019. Additionally, DP 2.0 modulation ICs are expensive and offer marginal benefit to the consumer over 1.4a-based ICs. DP 2.0 is best suited for commercial and industrial applications at the moment.
 
Either way.
I still prefer a single connection over multiple ones.
While I would generally agree with that, I don't have any PSUs with that connection, and an adapter cable is pointless. When PSUs have that connector, I'd be fine using it, but I'm not going to replace $650 worth of in-use PSUs in my home just to accommodate a change no one needs or asked for. And to be fair, AIBs are not going to use the connector, and I'm not buying any of NVidia's own FE lineup (either personally or for my shop). So it's kind of a moot point at this time.
 
The connector should come with the nV FE card anyway.
 