Monday, December 20th 2021

ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

It looks like GeForce RTX 3090 Ti is indeed the name of the maxed-out "Ampere" GA102 silicon, and that NVIDIA did not go with "RTX 3090 SUPER". A picture of an ASUS TUF Gaming GeForce RTX 3090 Ti graphics card has emerged, confirming the naming. The board design looks similar to that of the RTX 3090 TUF Gaming, except that the Axial-Tech fans have changed, with more blades on the impellers.

The GeForce RTX 3090 Ti is expected to max out the GA102 silicon, featuring all 10,752 CUDA cores, 84 RT cores, and 336 Tensor cores physically present on the chip. The memory size is unchanged from the RTX 3090, at 24 GB of GDDR6X. What's new is that NVIDIA is reportedly using faster 21 Gbps-rated memory chips, compared to 19.5 Gbps on the RTX 3090. Typical board power is rated at 450 W, up from 350 W on the RTX 3090. NVIDIA is expected to announce the card at its January 4 press event on the sidelines of the 2022 International CES.
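For context, the bandwidth uplift from the faster memory is easy to work out. A minimal sketch, assuming the RTX 3090 Ti keeps the RTX 3090's 384-bit memory bus (the bus width is not stated in the leak):

```python
# Peak GDDR6X bandwidth in GB/s: (bus width in bits / 8) * data rate in Gbps.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3090 = peak_bandwidth_gb_s(384, 19.5)     # 936 GB/s
rtx_3090_ti = peak_bandwidth_gb_s(384, 21.0)  # 1008 GB/s, assuming the same 384-bit bus

print(f"RTX 3090:    {rtx_3090:.0f} GB/s")
print(f"RTX 3090 Ti: {rtx_3090_ti:.0f} GB/s (+{rtx_3090_ti / rtx_3090 - 1:.1%})")
```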
Sources: HXL (Twitter), VideoCardz, ITHome

63 Comments on ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

#26
Mussels
Freshwater Moderator
nguyen: That 3090 Ti box looks just as sexy as mine :D
I'd say their looks are a.... Tie
Posted on Reply
#27
chrcoluk
450W haha sorry had to laugh.
Posted on Reply
#28
ARF
Vayra86: I'm actually quite happy I'm not buying an Ampere GPU. But that's what I've been saying since it got released. The whole gen is an utter mess. TDPs through the roof, VRAM is hit or miss, and RT is still early-adopter nonsense. And it's not 7 nm TSMC, but shaky Samsung, which directly relates to the stack's other issues.

Then again, I'm also not buying an RX GPU :D But that's just an availability issue. The gen itself is solid: a normal product stack, proper balance, and, as per AMD's mojo, slightly behind on feature set.

All things considered, it's not a huge issue that stuff's hardly available, if you have a working GPU.
Yeah, it's like the people who made those stupid decisions for the lineup have very low intelligence.
RTX 3080 10 GB and RTX 3090 24 GB is absolute nonsense.

In the worst case, it should have been:
RTX 3080 12 GB and RTX 3090 16 GB,
or RTX 3080 16 GB and RTX 3090 20 GB.
Posted on Reply
#29
Chomiq
ARF: Yeah, it's like the people who made those stupid decisions for the lineup have very low intelligence.
RTX 3080 10 GB and RTX 3090 24 GB is absolute nonsense.

In the worst case, it should have been:
RTX 3080 12 GB and RTX 3090 16 GB,
or RTX 3080 16 GB and RTX 3090 20 GB.
It still doesn't make sense, because the 12 GB 3080 Ti exists.
Posted on Reply
#30
ARF
Chomiq: It still doesn't make sense, because the 12 GB 3080 Ti exists.
That memory capacity is still very low. It should have been 16 GB.
Posted on Reply
#31
Dux
Judging from the box length, this is some super big card. Can't wait for Jensen to come out with BS statements again about power efficiency with this 450 W card. Also, this box is as much as I will see of this card. :laugh:
Posted on Reply
#34
InVasMani
Next up: AMD with a 32-bit-bus GPU backed by like 4 MB of Infinity Cache for a paltry $250.
Posted on Reply
#36
mechtech
A picture of a box - pretty much the only thing everyone will see.
Posted on Reply
#37
GerKNG
Three high-end GPUs... for a three percent difference in performance.
ARF: In the worst case, it should have been:
RTX 3080 12 GB and RTX 3090 16 GB,
or RTX 3080 16 GB and RTX 3090 20 GB.
almost all of these combinations are impossible.
Posted on Reply
#38
WhoDecidedThat
Micron made 8 Gbit/1 GByte GDDR6X chips, which obviously wasn't enough. I wonder: if they couldn't make 2 GB chips, why not compromise and make 12 Gbit/1.5 GByte GDDR6X chips?

The RTX 3080 especially would have been pretty sweet with 15 GB VRAM (10 chips x 1.5 GB per chip).
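The arithmetic behind that wish is simple: total VRAM is chip density times chip count. A quick sketch, where the 12 Gbit density is the hypothetical part (GDDR6X shipped at 8 Gbit, with 16 Gbit parts arriving later):

```python
# The RTX 3080 populates 10 of GA102's twelve 32-bit memory channels (a 320-bit bus),
# one GDDR6X chip per channel, so capacity scales directly with chip density.
chips = 10
for label, density_gbit in [("8 Gbit (what shipped)", 8),
                            ("12 Gbit (hypothetical)", 12),
                            ("16 Gbit (2 GB chips)", 16)]:
    gb_per_chip = density_gbit / 8
    print(f"{label}: {chips} x {gb_per_chip:g} GB = {chips * gb_per_chip:g} GB")
```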
Posted on Reply
#39
GoldenX
Maenad: Well, for a typical build, a 4-slot card would be okay, as not many people have a sound card or other add-in cards anymore.

And does Halo Infinite also have unlimited FPS in the menu or what? I totally missed that.
True, but in my case it would stop me from testing another vendor on the same PC. Testing can lead to weird things: a few days ago I had to test an issue with Flatpak and video decoding where, if you had an Nvidia GPU accompanied by any other dedicated card, decoding would fail and make the app crash. A 4-slot card would block me from testing that.

Halo's menu seems to be like New World, a GPU toaster.
Posted on Reply
#40
ARF
GerKNG: almost all of these combinations are impossible.
For the incompetently stupid Nvidia engineers? :D Yes!
Posted on Reply
#41
GerKNG
ARF: For the incompetently stupid Nvidia engineers? :D Yes!
no, it is impossible for a simple reason:
every memory chip is bound to a 32-bit memory channel,
and you have 8 Gbit chips (1 GB).

with a 320-bit memory bus you can use 10 or 20 memory chips, and nothing in between (without making the card a complete mess that's barely functional as soon as the "odd" memory gets used, like with the 970).
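For the curious, that constraint is easy to enumerate. A minimal sketch, assuming only 8 Gbit (1 GB) GDDR6X chips, mounted either one per 32-bit channel or two per channel in clamshell mode (which is how the RTX 3090 reaches 24 GB):

```python
# Possible GDDR6X capacities for a given bus width with 1 GB (8 Gbit) chips only.
def possible_capacities_gb(bus_width_bits: int, chip_gb: float = 1.0) -> tuple[float, float]:
    channels = bus_width_bits // 32                     # each chip occupies one 32-bit channel
    return channels * chip_gb, channels * chip_gb * 2   # single-sided vs. clamshell

for bus in (256, 320, 384):
    single, clamshell = possible_capacities_gb(bus)
    print(f"{bus}-bit bus: {single:g} GB single-sided or {clamshell:g} GB clamshell")
```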
Posted on Reply
#42
Richards
10-15% more performance at 4K with the increase in memory bandwidth plus 2 extra SMs, best case.
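As a rough sanity check on that estimate, the raw hardware deltas are small; anything beyond them would have to come from higher clocks. A quick sketch, using the RTX 3090's known figures against the rumoured RTX 3090 Ti specs:

```python
# Relative hardware gains, RTX 3090 -> RTX 3090 Ti (rumoured).
sm_gain = 84 / 82 - 1        # 84 vs. 82 SMs (10,752 vs. 10,496 CUDA cores): ~2.4%
bw_gain = 21.0 / 19.5 - 1    # 21 vs. 19.5 Gbps on the same 384-bit bus: ~7.7%

print(f"SM gain: {sm_gain:.1%}, bandwidth gain: {bw_gain:.1%}")
```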
Posted on Reply
#43
ARF
GerKNG: no, it is impossible for a simple reason:
every memory chip is bound to a 32-bit memory channel,
and you have 8 Gbit chips (1 GB).

with a 320-bit memory bus you can use 10 or 20 memory chips, and nothing in between (without making the card a complete mess that's barely functional as soon as the "odd" memory gets used, like with the 970).
There are two workarounds - the second being the infamous 3.5-4 GB 970. But you are wrong here, because the 970 has a 256-bit memory interface, so in theory it should function normally with its 4 GB.
It has only 3.5 GB at full speed for another reason (a partially disabled L2/ROP partition leaves 0.5 GB on a slow path).
And yes, Nvidia has already designed other models that decouple the memory-interface width from the VRAM capacity :D

And first - you have to design the memory interface with the memory capacity in mind, not the other way round.

If you want 16 GB, then give the card a 256-bit memory interface.
Posted on Reply
#44
GerKNG
ARF: There are two workarounds - the second being the infamous 3.5-4 GB 970. But you are wrong here, because the 970 has a 256-bit memory interface, so in theory it should function normally with its 4 GB.
It has only 3.5 GB at full speed for another reason (a partially disabled L2/ROP partition leaves 0.5 GB on a slow path).
And yes, Nvidia has already designed other models that decouple the memory-interface width from the VRAM capacity :D

And first - you have to design the memory interface with the memory capacity in mind, not the other way round.

If you want 16 GB, then give the card a 256-bit memory interface.
so you want to reduce the performance of the card to get a couple of gigs more VRAM for no reason?
Posted on Reply
#45
Mussels
Freshwater Moderator
Tigger: 450W and people whine at ADL :laugh:
How long before American power sockets can't handle the load of both a CPU and GPU on the same fuse?
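For what it's worth, that ceiling is easy to estimate. A rough sketch, assuming a standard 15 A / 120 V North American branch circuit, the common 80% continuous-load guideline, and a hypothetical high-draw rig:

```python
# Headroom on a typical 15 A / 120 V branch circuit vs. a hypothetical high-draw rig.
circuit_w = 15 * 120             # 1800 W nominal
continuous_w = circuit_w * 0.8   # ~1440 W under the usual 80% continuous-load guideline

rig_w = 450 + 250 + 150          # hypothetical: 450 W GPU + 250 W CPU + rest of system/display
print(f"Continuous budget: {continuous_w:.0f} W, example rig: ~{rig_w} W")
```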
Posted on Reply
#46
Solid State Soul ( SSS )
Prima.Vera: 450W?? :laugh: :laugh: :laugh:
Those GPUs are not only becoming ridiculous, but their prices are absurd too...
Bro, that's like the entire power draw of an i7-8700 with a 1080 Ti rig.
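As a rough sanity check on that comparison, a sketch using the published TDP/TBP figures, with the rest-of-system number being a ballpark assumption:

```python
# Rough whole-rig draw for an i7-8700 + GTX 1080 Ti build under combined load.
cpu_tdp_w = 65     # Intel Core i7-8700 rated TDP
gpu_tbp_w = 250    # GeForce GTX 1080 Ti reference board power
rest_est_w = 100   # motherboard, RAM, drives, fans (ballpark assumption)

total_w = cpu_tdp_w + gpu_tbp_w + rest_est_w
print(f"Estimated rig draw: ~{total_w} W vs. 450 W for the RTX 3090 Ti alone")
```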
Posted on Reply
#48
truehighroller1
Bwaze"Expect next-generation graphics cards to consume more power"

"In a recent Tweet, the leaker @kopite7kimi stated that "400 is not enough" when referring to Nvidia's next-generation RTX 40 series."
Good thing my PSU is 1600 watts. Bring it on, liberals.
Posted on Reply
#49
ARF
GerKNG: so you want to reduce the performance of the card to get a couple of gigs more VRAM for no reason?
The miserable 10 GB on the RTX 3080 already decreases performance, forcing Nvidia to do shenanigans in the driver to force lower-resolution textures in games.

RTX 3080 VRAM usage warnings and the issue with VRAM pool sizes: the compromise of 4K gaming | ResetEra

Far Cry 6 needs more VRAM than the Nvidia RTX 3080 has to load HD textures | PCGamesN
Posted on Reply
#50
Vayra86
ARF: The miserable 10 GB on the RTX 3080 already decreases performance, forcing Nvidia to do shenanigans in the driver to force lower-resolution textures in games.

RTX 3080 VRAM usage warnings and the issue with VRAM pool sizes: the compromise of 4K gaming | ResetEra

Far Cry 6 needs more VRAM than the Nvidia RTX 3080 has to load HD textures | PCGamesN
Exactly, and those shenanigans don't show up in reviews at launch. Win-win, right? 10 GB was 'enough', and people can happily live in ignorance that it is. Unless they frequent TPU, where I'll keep massaging that one in, since lots of buyers were convinced all was fine with 'oh, otherwise the card's perf isn't going to push those details anyway' or 'I'll just lower my textures'. On GPUs that are hitting the 1K MSRP mark, less than a year post-launch. It's utterly ridiculous.

To each their own, it was easy to predict this.

So no, Nvidia won't do the hardware workaround for 'sufficient' VRAM anymore; after they got burned on Maxwell, they're designing around it, and they sell you the argument that all is well with 50% less VRAM relative to core power compared to Pascal. Meanwhile, they still release double-VRAM versions for the entire stack. Early-adopter heaven: cash in twice on fools with money. It's much better than being forced to settle with customers over your 3.5 GB.

The newest argument to avoid buyers' remorse is 'muh muh but we have DLSS, so it looks great anyway'. :roll::roll::roll: Okay, enjoy being on Nvidia's DLSS leash for your gaming: what used to be waiting for SLI profiles is now waiting for DLSS profiles. But! This time, it will all be different, right? :D

It's a piss-poor gen, this one. Even the 450 W on this 3090 Ti underlines it - they can't make a full die to save their life without pulling out all the stops on voltage. Samsung 8 nm. Fan-tas-tic node.
Posted on Reply