Monday, December 20th 2021

ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

It looks like GeForce RTX 3090 Ti is indeed the name of the maxed-out "Ampere" GA102-based flagship, and NVIDIA did not go with "RTX 3090 SUPER" for its naming. A picture of an ASUS TUF Gaming GeForce RTX 3090 Ti graphics card has emerged that confirms the name. The board design looks similar to that of the RTX 3090 TUF Gaming, except that the Axial-Tech fans have changed, with more blades on the impellers.

The GeForce RTX 3090 Ti is expected to max out the GA102 silicon, featuring all 10,752 CUDA cores, 84 RT cores, and 336 Tensor cores physically present on the chip. The memory size is unchanged from the RTX 3090, at 24 GB of GDDR6X. What's new is that NVIDIA is reportedly using faster 21 Gbps-rated memory chips, compared to 19.5 Gbps on the RTX 3090. The typical board power is rated at 450 W, compared to 350 W on the RTX 3090. NVIDIA is expected to announce the card at its January 4 press event on the sidelines of the 2022 International CES.
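For context on what the faster memory buys, here is a rough back-of-the-envelope bandwidth estimate. This is a minimal sketch that assumes the RTX 3090's 384-bit memory bus carries over unchanged to the Ti; the helper function name is purely illustrative.

def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak theoretical bandwidth: (bus width / 8) bytes per transfer * transfers per second
    return bus_width_bits / 8 * data_rate_gbps

# Assumes the RTX 3090's 384-bit bus is unchanged on the RTX 3090 Ti
print(memory_bandwidth_gb_s(384, 19.5))  # RTX 3090:    936.0 GB/s
print(memory_bandwidth_gb_s(384, 21.0))  # RTX 3090 Ti: 1008.0 GB/s

On those assumptions, the 21 Gbps chips would lift peak memory bandwidth from 936 GB/s to roughly 1 TB/s, an increase of a little under 8%.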
Sources: HXL (Twitter), VideoCardz, ITHome

63 Comments on ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

#1
Prima.Vera
450W?? :laugh: :laugh: :laugh:
Those GPUs are really starting to become not only ridiculous, but absurdly priced as well...
Posted on Reply
#2
Tomgang
Jeez, 450 watts out of the box. So with the power target maxed out, perhaps 500 watts? And for the top-tier models, maybe 500 to 550 watts max.

My new EVGA RTX 3080 FTW3 Ultra already draws 380 watts at stock, and with the power target maxed it goes to 400 watts. That is already more than enough power consumption for me.
Posted on Reply
#3
Ruru
S.T.A.R.S.
"The typical board power is rated at 450 W, compared to 350 W on the RTX 3090."
Good ol' days when Fermi aka "Thermi" was called the absolute space heater..
Posted on Reply
#4
WhitetailAni
Maenad"The typical board power is rated at 450 W, compared to 350 W on the RTX 3090."
Good ol' days when Fermi aka "Thermi" was called the absolute space heater..

Getting close to Fermi territory here, NVIDIA...

Whatever comes next had better follow in Kepler's footsteps and focus not so much on increasing performance as on cutting power consumption.
If you can make the GTX 680 consume 300 W compared to the 580's 450 W while being 23% faster according to TPU, then you can do the same here.
Except I want more emphasis on lower power draw and less emphasis on performance. The 3090 Ti is already stupid fast; no need to go overkill.
Posted on Reply
#5
Rhein7
Dang, still remember how 450-550W is enough for a budget system. :rolleyes:
Posted on Reply
#6
Ruru
S.T.A.R.S.
RealKGB
Getting close to Fermi territory here, NVIDIA...

Whatever comes next had better follow in Kepler's footsteps and focus not so much on increasing performance as on cutting power consumption.
If you can make the GTX 680 consume 300 W compared to the 580's 450 W while being 23% faster according to TPU, then you can do the same here.
Except I want more emphasis on lower power draw and less emphasis on performance. The 3090 Ti is already stupid fast; no need to go overkill.
That's the total system consumption in the Thermi-era figures you posted. On this 3090 Ti, the card alone is going to have a 450 W TBP.
Posted on Reply
#7
WhitetailAni
Maenad
That's the total system consumption in the Thermi-era figures you posted. On this 3090 Ti, the card alone is going to have a 450 W TBP.
Spontaneous combustion anyone?
Posted on Reply
#8
Bwaze
RealKGB
Whatever comes next had better follow in Kepler's footsteps and focus not so much on increasing performance as on cutting power consumption.
"Expect next-generation graphics cards to consume more power"

"In a recent Tweet, the leaker @kopite7kimi stated that "400 is not enough" when referring to Nvidia's next-generation RTX 40 series."
Posted on Reply
#9
Ruru
S.T.A.R.S.
RealKGB
Spontaneous combustion anyone?
At least now there's a reason to get those 1 kW or beefier PSUs, even after the death of SLI/CF...
Posted on Reply
#10
TheHughMan
RTX 3090 Ti: the best $5,000 you'll ever spend.
Posted on Reply
#11
Flydommo
In view of both the current pricing and the power hunger of the Ampere generation of graphics cards, I will stick with my GTX 1080 Ti FE for the foreseeable future. I could imagine buying a new mid-range card from Nvidia's RTX 4xxx series, AMD's RDNA3, or Intel's upcoming ARC some day, but I doubt prices for graphics cards will go down before 2023/24, when all the newly erected fabs start producing new chips. I hope I can nurse my 1080 Ti through this time of crisis...
Posted on Reply
#12
Lightning
I mean, the 3090 Strix OC already peaks at 415 W in gaming and 449 W in Furmark, and winter is here.
Posted on Reply
#13
Ruru
S.T.A.R.S.
Flydommo
In view of both the current pricing and the power hunger of the Ampere generation of graphics cards, I will stick with my GTX 1080 Ti FE for the foreseeable future. I could imagine buying a new mid-range card from Nvidia's RTX 4xxx series, AMD's RDNA3, or Intel's upcoming ARC some day, but I doubt prices for graphics cards will go down before 2023/24, when all the newly erected fabs start producing new chips. I hope I can nurse my 1080 Ti through this time of crisis...
1080 Ti here as well, and it still plays games flawlessly. I'll probably grab a used 2080 Ti when prices drop to sane levels. I've been using second-hand flagships a generation or two old for several years.
Posted on Reply
#14
Mussels
Freshwater Moderator
It's okay guys, I already have one

10,496 to 10,752 is such a tiny jump that these cards are really just going to have fixed VRAM cooling and call it a day :/
Posted on Reply
#15
the54thvoid
Super Intoxicated Moderator
Mussels
It's okay guys, I already have one

10,496 to 10,752 is such a tiny jump that these cards are really just going to have fixed VRAM cooling and call it a day :/
Thought you could have sold that special card for one gazillion dollars.
Posted on Reply
#16
Ruru
S.T.A.R.S.
What's also weird is how many variants of the flagship GPU there are: 3080, 3080 Ti, 3090, and the upcoming 3090 Ti. It used to be just the high-end Ti and the Titan. And yeah, I remember the 780/780 Ti/Titan/Titan Black and the 1080 Ti/Titan X/Titan Xp, but still.

And still the prices are insane, and there are practically no cards even for those who can afford them.
Posted on Reply
#17
GoldenX
At that point we will start to see 4-slot cards more regularly. I think I would prefer the extra hassle of an integrated AIO at this point...
Think about it: if the thing goes up in flames rendering the menu of Halo Infinite, the leak from the AIO will put out the fire for you.
Posted on Reply
#18
Ruru
S.T.A.R.S.
GoldenX
At that point we will start to see 4-slot cards more regularly. I think I would prefer the extra hassle of an integrated AIO at this point...
Think about it: if the thing goes up in flames rendering the menu of Halo Infinite, the leak from the AIO will put out the fire for you.
Well, for a typical build, a 4-slot card would be okay, as not many people have a sound card or other add-in cards anymore.

And does Halo Infinite also have unlimited FPS in the menu or what? I totally missed that.
Posted on Reply
#19
Aretak
Maenad
What's also weird is how many variants of the flagship GPU there are: 3080, 3080 Ti, 3090, and the upcoming 3090 Ti. It used to be just the high-end Ti and the Titan. And yeah, I remember the 780/780 Ti/Titan/Titan Black and the 1080 Ti/Titan X/Titan Xp, but still.

And still the prices are insane, and there are practically no cards even for those who can afford them.
It isn't weird at all in the case of the Ti cards. They're essentially replacements for the originals, offering a tiny performance boost for a higher price. Jensen regrets every single day pricing the originals so "low" considering what's happened after that. It's just an excuse to unofficially sunset those cheaper SKUs, since everything from the 3080 up is competing for the same die. They trickle out a tiny number of 3080s for PR reasons, allowing them to keep claiming that prices haven't risen, whilst diverting all the dies they can to the much more profitable variants.
Posted on Reply
#20
Vayra86
I'm actually quite happy I'm not buying an Ampere GPU, but that's what I've been saying since it got released. The whole gen is an utter mess: TDPs through the roof, VRAM is hit or miss, and RT is still early-adopter nonsense. And it's not 7 nm TSMC but shaky Samsung, which relates directly to the stack's other issues.

Then again, I'm also not buying an RX GPU :D But that's just an availability issue. The gen itself is solid: a normal product stack, proper balance, and, as per AMD's mojo, slightly behind on feature set.

All things considered, it's not a huge issue that stuff's hardly available, if you have a working GPU.
Posted on Reply
#21
Ruru
S.T.A.R.S.
Aretak
It isn't weird at all in the case of the Ti cards. They're essentially replacements for the originals, offering a tiny performance boost for a higher price. Jensen regrets every single day pricing the originals so "low" considering what's happened after that. It's just an excuse to unofficially sunset those cheaper SKUs, since everything from the 3080 up is competing for the same die. They trickle out a tiny number of 3080s for PR reasons, allowing them to keep claiming that prices haven't risen, whilst diverting all the dies they can to the much more profitable variants.
Well, on the other hand, not always. The 980/1080/2080 didn't use the same chip as the flagship and had less memory and a narrower memory bus.
Posted on Reply
#22
Broken Processor
I look at the specs and don't care; it's just another card gamers won't be able to buy. If Nvidia or AMD cared, they would be producing more of their own cards, or at least putting decent supply-chain tracking in place to stop direct-to-miner sales. It's not like the cost couldn't be paid for while still being massively cheaper than now. The truth is they are all happy with the way things are, and it's disgusting.
Posted on Reply
#23
Vayra86
Maenad
Well, on the other hand, not always. The 980/1080/2080 didn't use the same chip as the flagship and had less memory and a narrower memory bus.
Nvidia is forced to push their largest chip from x80 and up to keep the stack sensible and worthwhile. This is telling. The last time they did that was the Kepler refresh, and they weren't really winning at the time. They needed their big chip in the 780 to fight AMD's offerings, while the Ti above it was just for epeen purposes. The same thing is happening today against RX. Compare this to Pascal, where the largest chip was only used in the utterly fantastic 1080 Ti, which commanded a major performance gap over the 1080, while the 1070 also used GP104 and was still competitive with 25% fewer shaders and slower VRAM.

If Nvidia is able to keep their Gx104 SKU all the way from x70 > x80, you know they have a strong generation and product stack. If they have to use Gx102 from x80 onwards... it's a dead end and a gen pushed to the limit of the silicon. Generally not the best things you can buy; history repeats.
Posted on Reply
#24
Ruru
S.T.A.R.S.
Vayra86
Nvidia is forced to push their largest chip from x80 and up to keep the stack sensible and worthwhile. This is telling. The last time they did that was the Kepler refresh, and they weren't really winning at the time. They needed their big chip in the 780 to fight AMD's offerings, while the Ti above it was just for epeen purposes. The same thing is happening today against RX.

If Nvidia is able to keep their Gx104 SKU all the way from x70 > x80, you know they have a strong generation and product stack. If they have to use Gx102 from x80 onwards... it's a dead end and a gen pushed to the limit of the silicon. Generally not the best things you can buy; history repeats.
Yeah, I remember the Kepler era well. GCN was IMO better in the long run; the 7970 was already a great card and the 290(X) made it even better.

But on topic, it's just so hella stupid to have this many SKUs when the prices are insane and there just aren't cards for customers.
Posted on Reply
#25
nguyen


That 3090 Ti box looks just as sexy as mine :D
Posted on Reply