
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

Two trains of thought, not necessarily contradictory ~

If Nvidia was able to pull off the 2.5-3x perf/W efficiency gain, it's possible they'd have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of nothingburger called nCoV ravaging the entire world atm. Now depending on which side of the fence you're on, Nvidia's margins could be higher, though I'm 100% certain their overall sales would be (much?) lower!

Next is what we see right now: Nvidia cannot really get the perf/W efficiency leap that some of the leaks suggested. That means the Nvidia cards will not be better in nearly all metrics vs AMD, unlike last gen. So pricing them at enthusiast grade is nearly impossible for them. Hence the current "attractive" pricing.

The only way Nvidia prices Ampere the way they have now is if RDNA2 is really competing with them on perf/W & likely perf/$ as well. Anyone remember Intel's mainstream-quad-cores-for-a-decade BS till Zen launched? This is likely the same game being played again.
I'm at a loss for words for all the conjuring and bending that is happening in this post. Just take the performance / price increase already... sheesh.
 
Yeah, I'm not paying $700 for a 10GB card.
RX570 for $120 has 8GB FFS and I already ditched my 6GB card because it ran out of VRAM.
 
SLI is dead. Nor will you need it with a 3090.
b!tch we need flight simulator at 4K60

Yeah, I'm not paying $700 for a 10GB card.
RX570 for $120 has 8GB FFS and I already ditched my 6GB card because it ran out of VRAM.
bro these cards are WAY more cost effective than turing. Yes, the 3090 is, technically speaking, cheaper than the 2080 ti.

And who cares about VRAM? It's not like it affects your performance in any noticeable way, even if you had 20GB you likely wouldn't notice the single digit FPS bump.
 
Based on all the performance data from both next-gen consoles and the technical document releases: they clearly will have evaluated PS5 and Xbox dev kits, and likely had rough desktop RDNA2 performance leaked to them.

The pricing and gigantic, inefficient 3090 reflect this. Why else would they do it? You think they just rolled the dice and decided to slash their margins on volume sellers, and produce an ultra low yield furnace halo product for the LULs?
That is true, and the price gap between the 3080 and 3090 perhaps indicates where NV expects AMD to have competitive products. But twice the 2080 Ti's performance (unless it is RTX bazinga aggravated with fancy AI upscaling known as DLSS, in which case it is lawsuit-worthy misleading) is unexpected, and so is an 8k+ CUDA core $700 card.

2.5-3x perf/W efficiency
They themselves claim 1.9x.
 
Many leaks proven correct, except silly stuff like the copro/FPGA. Even the late "reasonable" prices. I'm particularly interested in the 2xFP32 perf, as they're quoting 10496 CUDA cores for the 3090. Does this mean INT32+FP32/FP32+FP32, with the compiler extracting parallelism? 2xFP32 per clock? I also presume TF32/FP64 tensor support for the gaming cards? Need the Ampere white paper... Also nice that all GA chips support hardware AV1 decode.
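A back-of-the-envelope sketch of where 10496 could come from, assuming the doubled-FP32 rumor is right and each SM pairs a dedicated FP32 datapath with a second shared FP32/INT32 one. The 82-SM count and ~1.7 GHz boost clock below are assumptions, not figures from the announcement:

```python
# Hypothetical Ampere SM math: 64 dedicated FP32 lanes plus 64 shared
# FP32/INT32 lanes would let Nvidia count 128 "CUDA cores" per SM.
sms = 82                       # assumed SM count for the 3090-class part
fp32_per_sm = 64 + 64          # dedicated FP32 + shared FP32/INT32 lanes
cuda_cores = sms * fp32_per_sm
print(cuda_cores)              # 10496, matching the quoted 3090 spec

# Peak FP32 throughput at an assumed ~1.7 GHz boost, 2 FLOPs per FMA:
boost_ghz = 1.7
tflops = cuda_cores * 2 * boost_ghz / 1000
print(round(tflops, 1))        # ~35.7 TFLOPS
```

If the second datapath is shared with INT32 the way the rumor suggests, real-world throughput would sit somewhere below that peak whenever integer work is in flight, which is exactly the "compiler extracting parallelism" question above.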
 
The only way Nvidia prices Ampere the way they have now is if RDNA2 is really competing with them on perf/W & likely perf/$ as well. Anyone remember Intel's mainstream-quad-cores-for-a-decade BS till Zen launched? This is likely the same game being played again.

Stop with your red-pill fantasy please. Nvidia released the 1080 Ti at 700usd which demolished AMD until this very day. At this point the 700usd RTX 3080 are meant for 1080 Ti owners who refused to upgrade for so long.
 
There might be an RTX 3080 Ti coming; Nvidia is possibly holding off to see what RDNA2 brings.

No Ti branding this time around supposedly. I think with the Super models they'll double the VRAM on higher SKUs.

A 3080S 20GB / 3070S 16GB model is what you should keep an eye out for if they're forced by AMD.
 
Love how much cognitive dissonance must be going on in people's heads right now. Just a day ago people were still saying this kind of performance / price was "literally impossible" on this very forum.

I'll be first to say I didn't expect the prices.
 
I'll wait for benchmarks. Here's hoping the 3070 is 2080Ti equivalent or better.

My upgrade path always used to be when 1 card (reasonably priced or if funds are available) = 2 older gen cards I have in SLI.

Dual 8800 GTS 512MB in SLI roughly equals 1 GTX 280
Add a second GTX 280 for SLI roughly equals 1 GTX 570
Not having funds I didn't upgrade my 570s to a 780Ti and waited for next gen....then jumped on a 980Ti and I've been using it since.

I've been waiting for a single card priced in the $500 range that can give twice the performance of my 980Ti and a 2080Ti is that card, but not in the $1000+ price range. Hell no.

If the 3070 or even an AMD equivalent card around the same price can give me double the performance of my 980Ti and cost is around $500, then this generation will be the one I finally upgrade my GPU.
 

Also, who is joining me on the hype train, choo choo? But be warned, the hype train is really hot :p


What really took me by surprise was the CUDA core count. I did not foresee Ampere having this many cores. This also explains why the RTX 3080 and 3090 are 300-watt+ TDP cards. No doubt with that many CUDA cores Ampere is gonna be a serious beast. The RTX 3080 also surprised me with the price. Not so much Ngreedia this time as I had feared. The RTX 3080 looks on paper like a solid 4K GPU, although I do have my concerns about only 10 GB of VRAM being future-proof for the next two years. There are already games that use 8 GB+ of VRAM at 4K, and if we look at Microsoft Flight Simulator, that is already close to 10 GB at 4K. But VRAM amount aside, Ampere looks really solid.

Sorry GTX 1080 Ti, but I think it's time our ways go in different directions... no, don't look at me like I am betraying you.
 
Stop with your red-pill fantasy please. Nvidia released the 1080 Ti at 700usd which demolished AMD until this very day. At this point the 700usd RTX 3080 are meant for 1080 Ti owners who refused to upgrade for so long.
Yea, the cognitive dissonance must be strong right now. People believe one thing so strong for so long until reality hits them on the head like a ton of bricks. All they can do is remain with their outdated opinion just to relieve the pain of having been wrong for this long.
 
What I'm also fascinated by is how a false leak about a price rise can change the general perception of a product's price and value.
When I asked three friends of mine some time back what they would think if the release price of the 3000 series stayed the same, all of them were more or less like 'meh, lower would be nice, but it's expected that the price shouldn't change'. After the price leak, and NVIDIA now denying it, all of them are amazed by the price.
Feels almost as if the leak was a marketing trick :)
 
The only way Nvidia prices Ampere the way they have now is if RDNA2 is really competing with them on perf/W & likely perf/$ as well. Anyone remember Intel's mainstream-quad-cores-for-a-decade BS till Zen launched? This is likely the same game being played again.

More like Nvidia is expecting RDNA2 to compete with them on both price and efficiency, but that doesn't mean AMD will actually compete. We've had this kind of moment with Nvidia vs AMD several times already. Not saying RDNA2 can't compete, or that it will just repeat history, but Nvidia going all out is not definite proof that AMD has something good coming.
 
Yea, the cognitive dissonance must be strong right now. People believe one thing so strong for so long until reality hits them on the head like a ton of bricks. All they can do is remain with their outdated opinion just to relieve the pain of having been wrong for this long.
man yall really be fighting over A HUGE PRICE CUT and A HUGE PERFORMANCE GAIN... cmon yall be happy for once lol
 
man yall really be fighting over A HUGE PRICE CUT and A HUGE PERFORMANCE GAIN... cmon yall be happy for once lol
Welcome to 2020?
 
And who cares about VRAM? It's not like it affects your performance in any noticeable way, even if you had 20GB you likely wouldn't notice the single digit FPS bump.
I had to reduce detail in Doom Eternal because my 2060 lacked enough VRAM even for 1440p. It literally wouldn't run; it wanted 7.5GB when only 6GB was available.
Cyberpunk might just fit into 10GB, but I suspect games in 2021 are going to be pushing 12GB, with console devs targeting that for VRAM allocation.
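For a rough sense of where 4K VRAM actually goes, here is a toy budget; every number in it is an illustrative assumption rather than a measured figure from any particular game:

```python
# Toy 4K VRAM budget (all figures are illustrative assumptions).
width, height = 3840, 2160
bpp = 4                                # bytes per pixel for an RGBA8 target

def mb(nbytes):
    return nbytes / (1024 ** 2)

render_target = width * height * bpp   # one full-resolution color buffer
gbuffer = 5 * render_target            # assumed 5-target deferred G-buffer
print(round(mb(render_target)))        # ~32 MB per buffer
print(round(mb(gbuffer)))              # ~158 MB for the whole G-buffer

# Even a fat G-buffer is small next to texture/geometry streaming pools,
# which is why console budgets (not resolution alone) drive VRAM demand.
texture_pool_gb = 8                    # assumed pool sized to console budgets
total_gb = mb(gbuffer) / 1024 + texture_pool_gb
print(round(total_gb, 1))              # ~8.2 GB before OS/driver overhead
```

The takeaway from the sketch is that the frame buffers themselves are cheap; it's the streaming pools sized against console memory budgets that would push a 10GB card toward its limit.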
 
man yall really be fighting over A HUGE PRICE CUT and A HUGE PERFORMANCE GAIN... cmon yall be happy for once lol
There is no price cut to speak of, the performance gain is nice on the other hand.
 
Yeah, I'm not paying $700 for a 10GB card.
RX570 for $120 has 8GB FFS and I already ditched my 6GB card because it ran out of VRAM.
I actually think it's admirable that they aren't ripping people off with too much VRAM. Well, they are doing it with 24GB on the 3090, but you have to be gullible to think you need that much VRAM. That extra amount of VRAM is usually just to inflate the price for no reason.
 
There is no price cut to speak of, the performance gain is nice on the other hand.
Performance/dollar is increased over 2x. THAT is huge.
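That "over 2x" can be sanity-checked with simple division, using launch MSRPs and Nvidia's own claimed performance as inputs, so treat the relative-performance numbers below as marketing claims (and the ~1.35x 2080 Ti figure as a rough assumption), not benchmarks:

```python
# Perf-per-dollar sketch from launch MSRPs and claimed performance,
# normalized so RTX 2080 = 1.0. The 3080's 2.0x is Nvidia's claim and
# the 2080 Ti's ~1.35x is a rough assumption, not a measured result.
cards = {
    #               (MSRP $, claimed relative perf)
    "RTX 2080":    (699, 1.0),
    "RTX 2080 Ti": (1199, 1.35),
    "RTX 3080":    (699, 2.0),
}
perf_per_dollar = {name: perf / price for name, (price, perf) in cards.items()}
gain = perf_per_dollar["RTX 3080"] / perf_per_dollar["RTX 2080 Ti"]
print(round(gain, 2))   # ~2.54x the 2080 Ti's perf/$, i.e. "over 2x"
```

Against the same-priced 2080 the gain is exactly the claimed 2x; it's the comparison with the $1199 2080 Ti that pushes the ratio well past it.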
 
Awesome numbers, awesome prices... what's even more awesome is that the number of CUDA cores suggest a Ti version coming out later for each segment.
 
I wonder what relative performance means this time. I have a gut feeling that this incredible speed-up (roughly ~1.7x compared to the 2080S, looking at the graph) is all about ray tracing and not so much rasterization, but I would love to be wrong here.

I think the numbers are all with ray tracing and DLSS on when they mention double the performance. If you go to their website, the three games they have up all need DLSS and ray tracing enabled for the claimed double performance at 4K.
 
Stop with your red-pill fantasy please. Nvidia released the 1080 Ti at 700usd which demolished AMD until this very day. At this point the 700usd RTX 3080 are meant for 1080 Ti owners who refused to upgrade for so long.

Stop with your green-pill fantasy please. 1080 Ti owners refused to upgrade because of the 60% price hike.

Everybody is so frantic about the 30xx pricing, when the truth is this is the good old Pascal pricing coming back to replace the insanity that was Turing.
 
Performance/dollar is increased over 2x. THAT is huge.
Of course, compared to the dumpster fire that was Turing's 35% performance increase plus a price hike. Although large performance gains in new generations are not unheard of.
 
but nvidia going all out is not a definite proof that AMD have something good coming out.
Not saying it is, but a combination of the global pandemic & RDNA2 may have forced their hand. The last thing JHH would want is to alienate their base by pricing it outside their reach when incomes across the board are plummeting. There's also the fact that HPC & DC revenue surpassed gaming just last quarter, so they have more wiggle room to price aggressively now.

The point is ~ if Nvidia could price it to Turing levels they'd almost certainly do so.
 