
NVIDIA GeForce RTX 4090 Twice as Fast as RTX 3090, Features 16128 CUDA Cores and 450W TDP

Don't expect more than +50% from anything but maybe the top card, and at massive power draw.

The 4080 will be 50% over the 3080.

The July launch will surprise a lot of people, if true.

I would like to make a request for the 4090 and 7900 XT reviews: please make a summary graph at 4K, max RT, DLSS/FSR Quality mode.

This is how we will play games this gen.
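For context, a quick sketch of what that preset choice implies for internal render resolution at 4K; the 1/1.5 per-axis scale is the commonly cited factor for the DLSS/FSR Quality modes, so treat it as an assumption here:

    # Rough sketch: internal render resolution for 4K output with the
    # "Quality" upscaling preset (assumed 1/1.5 per-axis scale factor).
    OUTPUT_W, OUTPUT_H = 3840, 2160      # 4K UHD target
    QUALITY_SCALE = 1 / 1.5              # assumed Quality-mode scale

    render_w = round(OUTPUT_W * QUALITY_SCALE)
    render_h = round(OUTPUT_H * QUALITY_SCALE)
    shaded_fraction = (render_w * render_h) / (OUTPUT_W * OUTPUT_H)

    print(f"Internal render: {render_w}x{render_h}")            # 2560x1440
    print(f"Pixels shaded vs native 4K: {shaded_fraction:.0%}")  # ~44%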
 
Even with RDNA2, AMD has reached chip-size/performance parity with NVIDIA.

I don't see why they'd lag with RDNA3.
 
Twice as fast means that instead of 100 fps at 4K (3090) it can output 200 fps, right? ;)
 
Twice as fast means that instead of 100 fps at 4K (3090) it can output 200 fps, right? ;)
Nope. FLOPS don't equal frames per second; I think you'll most likely see a 30-40 fps increase at the same settings.
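A rough back-of-the-envelope on that point; the 3090 figures below are its public specs, while the 4090 shader count is just the rumored number from this article:

    # Paper FP32 throughput: 2 ops per shader per clock.
    def tflops(shaders, boost_ghz):
        return 2 * shaders * boost_ghz / 1000.0

    rtx_3090 = tflops(10496, 1.70)      # ~35.7 TFLOPS at the official ~1.7 GHz boost
    rumored_4090_shaders = 16128        # rumored count from the article

    # Clock the rumored part would need just to double the 3090's paper FLOPS:
    needed_ghz = 2 * rtx_3090 / (2 * rumored_4090_shaders / 1000.0)
    print(f"3090: {rtx_3090:.1f} TFLOPS; a 16128-shader card needs ~{needed_ghz:.2f} GHz to double it")
    # Even then, frames won't double: memory bandwidth, CPU limits and engine
    # overheads mean fps rarely scales 1:1 with theoretical FLOPS.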
 
Can't wait to see the actual cards. I'm sure there will be a few members showing off their rigs with these power monsters in them; the funniest part will be the justification stories for buying them.
 
I see that NVIDIA, and to be honest everybody else too, forgot a very important detail needed to justify buying those overpriced cards.
Are there any good or excellent games worth buying such cards for? No, I mean really. So far I see zero justification for upgrading a video card for that amount of money.
I remember buying new video cards just to play the demanding games of their time: Quake 2, Deus Ex, Doom 3, Dragon Age, Mass Effect, Crysis, The Witcher 3, Cyberpunk 2077... and nothing since then, really.
So the question is: why?
 
Did you even read this "rumor" article?

I never understood where those 600 W or more rumors came from. TSMC N5 has triple the transistor density of Samsung 8N, and the 40 series might even be made on the 4N process.

So with "only" 16128 CUDA cores (much less than double the 3090's 10496), a 450 W TDP should mean a huge increase in the number of RT cores, and probably very high clocks.

Maybe the 4090 Ti can get to 600 W, with more shaders and even crazier clocks. But if it costs $3000, whether the TDP is 300 W or 600 W makes no difference, as only a few people will buy it anyway.
As a matter of fact, I believe both AMD and NVIDIA agreed to increase power consumption on purpose, since the corporations need to sell us the 1000-1200 W PSUs they made in huge numbers since the mining days. My best bet is that the 4090/Ti will be around 500-575 W, with AMD around 10-15% lower in consumption. These kinds of rumours came from hardcore AMD fanboys on Twitter. Remember back when the 680 gave us better performance with less power consumption than the 480 and 580, or the 1080 versus the old 980/980 Ti? But sure, we'll save the environment by using more power-hungry stuff, lol.
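For what it's worth, a rough check of the "triple the transistor density" point quoted above, using the public GA102 (RTX 3090) figures; everything about the hypothetical 40-series die below is an assumption for illustration only:

    # Rough density check. GA102 figures are public; the 40-series die here is hypothetical.
    ga102_transistors = 28.3e9                                  # GA102, Samsung 8N
    ga102_area_mm2 = 628.4
    ga102_density = ga102_transistors / ga102_area_mm2 / 1e6    # ~45 MTr/mm^2

    n5_density = 3 * ga102_density                              # "triple the density" claim, ~135 MTr/mm^2

    # Going from 10496 to 16128 shaders is only ~1.54x, so even a hypothetical die
    # with ~1.7x GA102's transistors would still be far smaller on a 3x-denser node:
    assumed_transistors = 1.7 * ga102_transistors
    print(f"Implied die area: {assumed_transistors / (n5_density * 1e6):.0f} mm^2")  # ~356 mm^2 vs GA102's 628 mm^2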
 
I no longer see the point in all this graphics power, as we already have enough to meet the needs of 4K ultra settings with the likes of the current 3080 Ti and 3090. They should now be focused on efficiency, not power-hungry hardware.
 
As a matter of fact, I believe both AMD and NVIDIA agreed to increase power consumption on purpose, since the corporations need to sell us the 1000-1200 W PSUs they made in huge numbers since the mining days. My best bet is that the 4090/Ti will be around 500-575 W, with AMD around 10-15% lower in consumption. These kinds of rumours came from hardcore AMD fanboys on Twitter. Remember back when the 680 gave us better performance with less power consumption than the 480 and 580, or the 1080 versus the old 980/980 Ti? But sure, we'll save the environment by using more power-hungry stuff, lol.
You only mentioned Nvidia, so there's that green tint, I see.

You don't have to buy the most expensive, you know; efficiency does not live there.
 
MSRP is going to be 3000-4000 USD? We've already seen that they can price these however they want, and they're still selling.
 
MSRP is going to be 3000-4000 USD? We've already seen that they can price these however they want, and they're still selling.
Selling and profiting are NOT the same. And if you had been keeping up with the industry's global economic forecasts over the last few months, your honest assessment would be different.
 
So it's 450 W. So why were there so many articles in the last couple of months claiming "leaked" information which said up to 900W?
 
I'm a bit surprised that there are so many comments harping on about the GPU market's inflated prices, considering prices just plunged in May. Has anyone priced the RTX 30 series lately? I just purchased a new Zotac NVIDIA GeForce RTX 3080 Ti (12 GB GDDR6X) for $900.00 on Amazon, and I'm extremely happy with it.

Most 30 series cards are now priced below MSRP, and this is industry-wide, so it shouldn't be difficult to find any 30 series card at or below MSRP. If prices remain there, the 40 series will be priced in line with current trends, putting the flagship 40 series card somewhere between $1,200 and $1,600 based on debut prices we've seen in the past.

The release timeline will most likely slip a few times, so I don't expect to see these available for quite a while. Given the volatility of the current world economy, it's difficult to predict whether we'll see prices near MSRP.


 
I just purchased a new Zotac NVIDIA GeForce RTX 3080 Ti (12 GB GDDR6X) for $900.00 on Amazon
Damn, that's cheap. They're still going for over $1,700-$1,900 Gougelandastani Plunkitt's here in Gougelandastan (NZ).

[Attached image: 3080Ti Pricing.jpg]
 
I no longer see the point in all this graphics power, as we already have enough to meet the needs of 4K ultra settings with the likes of the current 3080 Ti and 3090. They should now be focused on efficiency, not power-hungry hardware.
As someone who works in the film industry, I can attest that we will continually need MORE powerful GPUs to keep up with the hardware requirements needed to render 8K and higher video footage. We don't use consumer/gaming graphics cards for editing and rendering pro-grade footage, since we need MUCH more powerful cards than the consumer RTX series to process special effects and convert footage to different formats. Even the 3090 Ti is far too slow to be practical in the post-production phase of filmmaking. Most of our editing rigs are custom built for us by a company that specializes in ultra-high-end video editing systems. Most of the newer systems cost upward of $50,000 and come equipped with multiple server-grade GPUs and multi-CPU motherboards. The good part about the release of the 40 series is that the cost of professional-grade video cards will drop, allowing us to stretch our budget further than before.

There will always be a need for more powerful GPUs and CPUs, even if you're just gaming with your PC. Progress will continue, and games will require more and more powerful hardware to keep up with the latest graphics engines, especially as we transition to 8K monitors for gaming and video editing. When 1080p was the industry standard for gaming, most people didn't see a need for 4K hardware. Now that 4K gaming is on the rise, 8K will become more and more commonplace, following the same trend that computer hardware and software have always followed.

As far as efficiency is concerned, you are correct that the manufacturers should be more focused on creating more efficient GPUs. Energy prices are soaring, and the U.S. isn't going green fast enough to justify the wattage these GPUs are pulling.
 
As someone who works in the film industry, I can attest that we will continually need MORE powerful GPUs to keep up with the hardware requirements needed to render 8K and higher video footage. [...]

If you consider the Steam Hardware Survey to be anywhere close to accurate then 1080p is by far the most used resolution coming in at around 67% adoption. 4K coming in at 2.6%. I regularly check the survey and in the past 4 years 4K adoption has only increased by around 1%. The thing holding 4K back isn't that there are no GPUs that can handle 4K, it's that they are too expensive for the average gamer.

It may take many, many years before 4K even starts to go mainstream. It may never go mainstream because as you pointed out GPUs are indeed getting faster but games continue to require more resources at the same time. Forget about 8K for the foreseeable future. An 8K monitor has the same number of pixels to process as 16 1080p monitors.
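The pixel math behind that 16x comparison, for anyone who wants to check it:

    # Pixels per frame at common resolutions.
    res = {"1080p": (1920, 1080), "1440p": (2560, 1440),
           "4K": (3840, 2160), "8K": (7680, 4320)}
    pixels = {name: w * h for name, (w, h) in res.items()}

    print(f"8K / 1080p: {pixels['8K'] / pixels['1080p']:.0f}x")  # 16x
    print(f"8K / 4K:    {pixels['8K'] / pixels['4K']:.0f}x")     # 4x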

I would be surprised if the 4090 is twice as fast as the 3090 but we will see soon.
 
That's precisely why I think they released the Ti series. They were setting the stage for raising their MSRPs, but they needed a higher baseline to raise from than the one the 30 series started out with.

The 3080 was $700. Jumping from that to what is likely $1,200+ would look terrible. No way to slice that except to release a 3080 12GB with a new effective MSRP of $1k, so it's now "only" $200 more. Same with the 3090: that was a "budget friendly" $1,500, but Nvidia fixed that by releasing the 3090 Ti for $2,000. Now only $500 more. Same with the 3070 at $500 versus the 3070 Ti at $600.

They released that entire line to juice the MSRPs so the next bump doesn't seem as dramatic. So I wouldn't compare the 4090's pricing to the 3090's; I'd compare it to the 3090 Ti's. That's why that latter card had to come out no matter what: it was essential to make it look like a markup of $500 instead of $1k.

That suggests to me the 4090 will be $2,500+, the 4080 $1.2k+, and the 4070 $650+. That's the MSRP of the baseline cards; we all know you have to add another $100-200 for AIB cards.
That's why I bit their hand off for the 3080 Founders Edition: they clearly underpriced it for the market, and we're not getting a 4080 for anywhere near that price. Every so often they seem to have a golden-priced generation where you just have to jump in.

Before that, I had purchased an Aorus 1080 Ti for £450 with a AAA game thrown in as a freebie during the end-of-Pascal 1080 Ti sell-off (sold the code for £30, so a net £420 for an xx80 Ti card); I took advantage of that as well, which was an absolutely insane price. I paid £370 for a GTX 1070 at the start of the Pascal generation.
 
As someone who works in the film industry, I can attest that we will continually need MORE powerful GPUs to keep up with the hardware requirements needed to render 8K and higher video footage. [...]
This is irrelevant to this discussion. How many CONSUMER 8K+ films, rendering devices, panels, etc., are out there to purchase, and logically priced, in today's inflated/recession economy? It would take roughly 2028-era GPUs (intentionally) before consumer cards could even render demanding 8K gaming titles buttery smooth, and only then would this be worth considering.

What?! The film industry makes up ~0.055% of the whole GPU market share?

I'm sure the GPU vendors are not betting on that market for mass growth.
 
As someone who works in the film industry, I can attest that we will continually need MORE powerful GPUs to keep up with the hardware requirements needed to render 8K and higher video footage. [...]
One thing that really annoys me with the film industry is not the graphics but the ridiculous sound/music score. Why the dramatized background music needs to be so loud that you can't hear the dialogue is beyond me. It's got to the point where you need to set your expensive surround sound to "dialogue enhanced" mode! Don't they realize there is more impact without it, and that the level of acting is enhanced? Anyway, it's gotten to the point where I have no interest in watching many shows because of it. Why bother watching if you can't hear people talk?
 
Soon we'll need a PSU just for the GPU. Yes, that power is nice, but 450 W for the GPU alone seems ridiculous! :cool:
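For a sense of scale, a rough PSU-sizing sketch around a 450 W card; every figure below is an assumed typical value, not a measurement:

    # Rough PSU sizing; all component figures are assumptions.
    gpu_w = 450        # rumored TDP from the article
    cpu_w = 200        # assumed gaming load for a high-end CPU
    rest_w = 100       # assumed board, RAM, storage, fans, peripherals
    headroom = 1.3     # ~30% margin for transient spikes and efficiency sweet spot

    recommended = (gpu_w + cpu_w + rest_w) * headroom
    print(f"Suggested PSU: ~{round(recommended / 50) * 50} W")   # ~1000 W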
 
I no longer see the point in all this graphics power, as we already have enough to meet the needs of 4K ultra settings with the likes of the current 3080 Ti and 3090. They should now be focused on efficiency, not power-hungry hardware.
Exactly my thoughts. There isn't a game currently in existence that a 3090/3090 Ti can't handle, and it doesn't appear that there are any upcoming titles that call for such power. Look at the Steam surveys and the current-gen consoles: most people are FINALLY moving up to 1440p, and 4K and above is an infinitesimally small share of the global PC user base. For machine learning and AI research I understand the power upgrade, but for consumers? As a 3090 Ti owner, I have to ask: WTF are people going to do with a 4090 Ti? The 30 series cards are finally hitting software maturity, and Nvidia is about to drop an entirely new architecture? It's senseless.
 
Exactly my thoughts. There isn't a game currently in existence that a 3090/3090 Ti can't handle, and it doesn't appear that there are any upcoming titles that call for such power. [...]

You bought a card with no efficiency for $2000 (at least that was the initial price) and you are asking what people will do with a 4090 Ti? Seems like a total contradiction.

Those cards are not for "most" people. That is why they cost ludicrous amounts of money, because there will always be enthusiasts that will buy them. They do not have to sell a lot of them, since the margins are insane.

And there is no specific performance level required to run games at max settings. Graphics are still evolving, with diminishing returns, but they are evolving. And even though rasterization has pretty much peaked, we now have ray tracing, which is still in its infancy. But if you look at games like Metro Exodus EE or Dying Light 2, it is practically impossible to play them at 4K and high framerates, even with upscaling technologies. The RT implementations in those games are amazing, especially global illumination. And that will only get better, but we need more power for that.
And have you seen UE5 with Nanite and Lumen? That looks like nothing we have seen before.

That is why new cards will be focusing on RT performance. If you look at the leaks about the 4070 and 4080, they will only have a small increase in the number of CUDA cores. Most of the performance gains will come from high clocks, thanks to the new process.
But these cards should have more RT cores, though there have been zero rumors about that.
 