
ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

btarunr

Editor & Senior Moderator
Staff member
It looks like GeForce RTX 3090 Ti is indeed the name of the maxed-out "Ampere" GA102 silicon, and NVIDIA did not go with "RTX 3090 SUPER" for its naming. A picture of an ASUS TUF Gaming GeForce RTX 3090 Ti graphics card has emerged that confirms the naming. The board design looks similar to that of the RTX 3090 TUF Gaming, except that the Axial-Tech fans have changed, with more blades on the impellers.

The GeForce RTX 3090 Ti is expected to max out the GA102 silicon, featuring all 10,752 CUDA cores, 84 RT cores, and 336 Tensor cores physically present on the silicon. The memory size is unchanged from the RTX 3090, at 24 GB of GDDR6X. What's new is that NVIDIA is reportedly using faster 21 Gbps-rated memory chips, compared to 19.5 Gbps on the RTX 3090. The typical board power is rated at 450 W, compared to 350 W on the RTX 3090. NVIDIA is expected to announce the card at its January 4 press event on the sidelines of the 2022 International CES.
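For a rough sense of how those figures stack up against the regular RTX 3090, here is a quick back-of-the-envelope sketch. The 384-bit bus width and the 3090's 10,496-core count are not stated above; they are the RTX 3090's well-known published specs, assumed to carry over.

```python
# Back-of-the-envelope comparison of the RTX 3090 and the rumoured RTX 3090 Ti,
# using the figures quoted above. The bus width and the 3090's core count are
# assumptions taken from the regular RTX 3090's published specs.

BUS_WIDTH_BITS = 384  # assumed: same 384-bit memory bus as the RTX 3090

def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak memory bandwidth in GB/s from per-pin data rate and bus width."""
    return gbps_per_pin * bus_width_bits / 8

rtx_3090    = {"mem_gbps": 19.5, "tbp_w": 350, "cuda": 10496}
rtx_3090_ti = {"mem_gbps": 21.0, "tbp_w": 450, "cuda": 10752}

for name, card in (("RTX 3090", rtx_3090), ("RTX 3090 Ti", rtx_3090_ti)):
    print(f"{name}: {bandwidth_gb_s(card['mem_gbps']):.0f} GB/s, {card['tbp_w']} W TBP")

# Relative changes: roughly +7.7% bandwidth and +2.4% CUDA cores for +28.6% board power.
for key in ("mem_gbps", "cuda", "tbp_w"):
    gain = rtx_3090_ti[key] / rtx_3090[key] - 1
    print(f"{key}: +{gain:.1%}")
```

On those numbers, the headline change is the power budget rather than the shader count.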



View at TechPowerUp Main Site
 
450W?? :laugh: :laugh: :laugh:
Those GPUs are really starting to become not only ridiculous, but absurdly priced as well...
 
Jeez, 450 watts out of the box. So with the power target maxed out, perhaps 500 watts? And for the top-tier models, something like 500 to 550 watts max.

My new EVGA RTX 3080 FTW3 Ultra already draws 380 watts stock, and with the power target maxed it goes to 400 watts. That is already more than enough power consumption for me.
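For what it's worth, the power-target math in that post works out roughly like this; the slider percentages are illustrative guesses, since actual headroom depends on the card and BIOS.

```python
# Rough sketch of how a raised power-target slider scales board power.
# Slider headroom varies per card/BIOS; the percentages below are
# illustrative assumptions, not published limits.

def max_board_power(stock_watts: float, power_target_pct: float) -> float:
    """Board power with the power-target slider raised by power_target_pct percent."""
    return stock_watts * (1 + power_target_pct / 100)

print(round(max_board_power(380, 5.3)))   # ~400 W: the 3080 FTW3 Ultra example above
print(round(max_board_power(450, 11.0)))  # ~500 W: the poster's guess for a maxed-out 3090 Ti
```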
 
"The typical board power is rated at 450 W, compared to 350 W on the RTX 3090."
Good ol' days when Fermi aka "Thermi" was called the absolute space heater..
 
"The typical board power is rated at 450 W, compared to 350 W on the RTX 3090."
Good ol' days when Fermi aka "Thermi" was called the absolute space heater..
[attached image: power consumption chart]

Getting close to Fermi territory here NVidia...
[attached image: power consumption chart]

Whatever comes next had better follow Kepler's footsteps and not look to increase performance so much as cut down on power consumption.
If you can make the GTX 680 consume 300W compared to the 580's 450W and be more powerful by 23% according to TPU then you can do the same here.
Except I want more emphasis on less power draw and less emphasis on performance. The 3090 Ti is already stupid fast, no need to go overkill.
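Taking the poster's numbers at face value (a later reply points out they are total-system figures from TPU's charts, not card-only TDPs), the efficiency jump Kepler delivered looks roughly like this:

```python
# Quick perf-per-watt comparison using the figures quoted in the post above.
# Note: those wattages are total-system numbers from TPU's charts (see the
# reply further down), not card-only TDPs.

gtx_580_w = 450   # system power with a GTX 580, as quoted by the poster
gtx_680_w = 300   # system power with a GTX 680, as quoted by the poster
perf_gain = 1.23  # GTX 680 ~23% faster, per the TPU figure cited

perf_per_watt_gain = perf_gain * (gtx_580_w / gtx_680_w)
print(f"GTX 680 perf/W vs GTX 580: ~{perf_per_watt_gain:.2f}x")  # ~1.85x
```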
 
Dang, I still remember when 450-550W was enough for an entire budget system. :rolleyes:
 
Getting close to Fermi territory here NVidia...
Whatever comes next had better follow Kepler's footsteps and not look to increase performance so much as cut down on power consumption.
If you can make the GTX 680 consume 300W compared to the 580's 450W and be more powerful by 23% according to TPU then you can do the same here.
Except I want more emphasis on less power draw and less emphasis on performance. The 3090 Ti is already stupid fast, no need to go overkill.
That's total system consumption in the charts you posted with the Thermi text. On this 3090 Ti, the card alone is going to have a 450W TBP.
 
That's total system consumption in the charts you posted with the Thermi text. On this 3090 Ti, the card alone is going to have a 450W TBP.
Spontaneous combustion anyone?
 
Whatever comes next had better follow Kepler's footsteps and not look to increase performance so much as cut down on power consumption.

"Expect next-generation graphics cards to consume more power"

"In a recent Tweet, the leaker @kopite7kimi stated that "400 is not enough" when referring to Nvidia's next-generation RTX 40 series."
 
Spontaneous combustion anyone?
At least now there's a reason to get those 1kW or beefier PSUs, even after the death of SLI/CF...
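The PSU-sizing arithmetic behind that remark is roughly the following; only the GPU figure comes from the article, and the CPU, rest-of-system, and headroom numbers are illustrative assumptions.

```python
# Rough PSU sizing for a 450 W card. Only the GPU figure comes from the
# article; the CPU/rest-of-system numbers and the headroom factor are
# illustrative assumptions.

gpu_tbp_w = 450   # rumoured RTX 3090 Ti typical board power
cpu_w     = 150   # assumed high-end CPU under gaming load
rest_w    = 75    # assumed board, RAM, storage, fans, peripherals

steady_draw_w = gpu_tbp_w + cpu_w + rest_w
headroom      = 1.4  # assumed margin for transient spikes / PSU efficiency sweet spot

print(f"Steady draw ~{steady_draw_w} W, suggested PSU ~{steady_draw_w * headroom:.0f} W")
```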
 
In view of both the current pricing and power hunger of the Ampere generation of graphics cards I will stick with my GTX 1080 Ti FE for the foreseeable future. I could imagine buying a new mid-range card from Nvidia's RTX 4xxx series, AMD's RDNA3 or Intel's upcoming ARC some day but I doubt the prices for graphics cards will go down before 2023/24 when all the newly erected fabs start producing new chips. I hope I can nurse my 1080 Ti through this time of crisis...
 
In view of both the current pricing and power hunger of the Ampere generation of graphics cards I will stick with my GTX 1080 Ti FE for the foreseeable future. I could imagine buying a new mid-range card from Nvidia's RTX 4xxx series, AMD's RDNA3 or Intel's upcoming ARC some day but I doubt the prices for graphics cards will go down before 2023/24 when all the newly erected fabs start producing new chips. I hope I can nurse my 1080 Ti through this time of crisis...
1080 Ti here as well, and it still plays games flawlessly. I'll probably grab a used 2080 Ti when prices drop to sane levels. I've been running second-hand flagships a gen or two old for several years.
 
It's okay guys, I already have one

The jump from 10,496 to 10,752 is so tiny that these cards are really just going to fix the VRAM cooling and call it a day :/

[attached image: 3090ti.png]
 

It's okay guys, I already have one

The jump from 10,496 to 10,752 is so tiny that these cards are really just going to fix the VRAM cooling and call it a day :/

Thought you could have sold that special card for one gazillion dollars.
 
What's also weird is that they have so many variants of the flagship GPU: 3080, 3080 Ti, 3090, and the upcoming 3090 Ti. Before, it was usually only the high-end Ti and a Titan. And yeah, I remember the 780/780 Ti/Titan/Titan Black and the 1080 Ti/Titan X/Titan Xp, but still.

And still the prices are insane, and there are practically no cards even for those who can afford them.
 
At that point we will start to see 4-slot cards more regularly. I think I would prefer the extra hassle of an integrated AIO at this point...
Think about it: if the thing goes up in flames rendering the menu of Halo Infinite, the leak from the AIO will put out the fire for you.
 
At that point we will start to see 4-slot cards more regularly. I think I would prefer the extra hassle of an integrated AIO at this point...
Think about it: if the thing goes up in flames rendering the menu of Halo Infinite, the leak from the AIO will put out the fire for you.
Well, for a typical build a 4-slot card would be okay, as not many people run a sound card or other add-in cards anymore.

And does Halo Infinite also have unlimited FPS in the menu, or what? I totally missed that.
 
What's also weird is that they have so many variants of the flagship GPU: 3080, 3080 Ti, 3090, and the upcoming 3090 Ti. Before, it was usually only the high-end Ti and a Titan. And yeah, I remember the 780/780 Ti/Titan/Titan Black and the 1080 Ti/Titan X/Titan Xp, but still.

And still the prices are insane, and there are practically no cards even for those who can afford them.
It isn't weird at all in the case of the Ti cards. They're essentially replacements for the originals, offering a tiny performance boost for a higher price. Jensen regrets every single day pricing the originals so "low" considering what's happened after that. It's just an excuse to unofficially sunset those cheaper SKUs, since everything from the 3080 up is competing for the same die. They trickle out a tiny number of 3080s for PR reasons, allowing them to keep claiming that prices haven't risen, whilst diverting all the dies they can to the much more profitable variants.
 
I'm actually quite happy I'm not buying an Ampere GPU. But that's what I've been saying since it got released. The whole gen is an utter mess: TDPs through the roof, VRAM is hit or miss, and RT is still early-adopter nonsense. And it's not 7 nm TSMC but shaky Samsung, which ties directly into the stack's other issues.

Then again, I'm also not buying an RX GPU :D But that's just an availability issue. The gen itself is solid: a normal product stack, proper balance, and, as per AMD's mojo, slightly behind on feature set.

All things considered, it's not a huge issue that stuff is hardly available, as long as you have a working GPU.
 
It isn't weird at all in the case of the Ti cards. They're essentially replacements for the originals, offering a tiny performance boost for a higher price. Jensen regrets every single day pricing the originals so "low" considering what's happened after that. It's just an excuse to unofficially sunset those cheaper SKUs, since everything from the 3080 up is competing for the same die. They trickle out a tiny number of 3080s for PR reasons, allowing them to keep claiming that prices haven't risen, whilst diverting all the dies they can to the much more profitable variants.
Well, on the other hand, not always: the 980/1080/2080 didn't use the same chip, and they had less memory and a narrower memory bus.
 
I look at the specs and don't care; it's just another card gamers won't be able to buy. If Nvidia or AMD cared, they would be producing more of their own cards, or at least putting decent supply-chain tracking in place to stop direct-to-miner sales. It's not like the cost couldn't be paid for while still staying massively cheaper than prices are now. The truth is they are all happy with the way things are, and it's disgusting.
 
Well, on the other hand, not always: the 980/1080/2080 didn't use the same chip, and they had less memory and a narrower memory bus.

Nvidia is forced to push their largest chip from x80 and up to keep the stack sensible and worthwhile. This is telling. The last time they did that was the Kepler refresh, and they weren't really winning at the time. They needed their big chip in the 780 to fight AMD's offerings, while the Ti above it was just for epeen purposes. The same thing is happening today against RX. Compare this to Pascal, where the largest chip was only used in the utterly fantastic 1080 Ti, which commanded a major perf gap over the 1080, while the 1070 also used GP104 and was still competitive with 25% fewer shaders and slower VRAM.

If Nvidia is able to keep their Gx104 SKU all the way from x70 to x80, you know they have a strong generation and product stack. If they have to use Gx102 from x80 onwards... it's a dead end and a gen pushed to the limit of the silicon. Generally not the best things you can buy; history repeats.
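A quick sanity check on the "25% fewer shaders" point above: the core counts are not in the post, they are the widely published retail specs for the two GP104 cards.

```python
# Both the GTX 1080 and GTX 1070 use GP104; core counts below are the
# widely published retail specs, not figures from this thread.

gtx_1080_cores = 2560  # full GP104
gtx_1070_cores = 1920  # cut-down GP104

reduction = 1 - gtx_1070_cores / gtx_1080_cores
print(f"GTX 1070 has {reduction:.0%} fewer CUDA cores than the GTX 1080")  # 25%
```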
 
Nvidia is forced to push their largest chip from x80 and up to keep the stack sensible and worthwhile. This is telling. The last time they did that was the Kepler refresh, and they weren't really winning at the time. They needed their big chip in the 780 to fight AMD's offerings, while the Ti above it was just for epeen purposes. The same thing is happening today against RX.

If Nvidia is able to keep their Gx104 SKU all the way from x70 to x80, you know they have a strong generation and product stack. If they have to use Gx102 from x80 onwards... it's a dead end and a gen pushed to the limit of the silicon. Generally not the best things you can buy; history repeats.
Yeah, I remember the Kepler era well. GCN was IMO better in the long run; the 7970 was already a great card, and the 290(X) made it even better.

But on topic, it's just so hella stupid to have this many SKUs when the prices are insane and there just aren't any cards for customers.
 