Wednesday, September 2nd 2020

NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

NVIDIA just announced its new generation GeForce "Ampere" graphics card series. The company is taking a top-to-bottom approach with this generation, much like "Turing," by launching its two top-end products first: the GeForce RTX 3090 24 GB and the GeForce RTX 3080 10 GB graphics cards. Both cards are based on the 8 nm "GA102" silicon. Join us as we live-blog the pre-recorded stream by NVIDIA, hosted by CEO Jen-Hsun Huang.

Update 16:04 UTC: Fortnite gets RTX support. NVIDIA demoed an upcoming update to Fortnite that adds DLSS 2.0, ambient occlusion, and ray-traced shadows and reflections. Coming soon.
Update 16:06 UTC: NVIDIA Reflex technology works to reduce e-sports game latency. Without elaborating, NVIDIA spoke of a feature that works to reduce input and display latencies "by up to 50%". The first supported games will be Valorant, Apex Legends, Call of Duty Warzone, Destiny 2 and Fortnite—in September.
Update 16:07 UTC: Announcing NVIDIA G-SYNC eSports Displays—a 360 Hz IPS dual-driver panel that launches through various monitor partners this fall. The display has a built-in NVIDIA Reflex precision latency analyzer.
Update 16:07 UTC: NVIDIA Broadcast is a brand-new app, available in September, that offers a turnkey solution for enhancing video and audio streams using the AI capabilities of GeForce RTX. It makes it easy to filter and improve your video and add AI-based backgrounds (static or animated), and it builds on RTX Voice to filter out background noise from audio.
Update 16:10 UTC: Ansel evolves into Omniverse Machinima, an asset exchange that helps independent content creators use game assets to create movies. Think fan-fiction Star Trek episodes using Star Trek Online assets. Beta in October.
Update 16:15 UTC: Updates to the AI tensor cores and RT cores. In addition to higher numbers of RT and tensor cores, the 2nd-generation RT cores and 3rd-generation tensor cores offer higher IPC. Minimizing the performance impact of ray tracing appears to be an engineering goal with Ampere.
Update 16:18 UTC: Ampere 2nd Gen RTX technology. Traditional shader throughput is up 2.7x, ray-tracing units are 1.7x faster, and the tensor cores bring a 2.7x speedup.
Update 16:19 UTC: Here it is! Samsung 8 nm and Micron GDDR6X memory. The announcement of Samsung and 8 nm came out of nowhere, as we were widely expecting TSMC 7 nm. Apparently NVIDIA will use Samsung for its Ampere client-graphics silicon, and TSMC for its lower-volume A100 professional-level compute processors.
Update 16:20 UTC: Ampere has almost twice the performance per Watt compared to Turing!
Update 16:21 UTC: Marbles 2nd Gen demo is jaw-dropping! NVIDIA demonstrated it at 1440p 30 Hz, or 4x the workload of first-gen Marbles (720p 30 Hz).
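For context, the "4x the workload" figure follows directly from the pixel counts of the two resolutions; a quick check, assuming the standard 16:9 resolutions, is below.

```python
# Pixel-count ratio between the two Marbles demos
# (assumes standard 16:9 resolutions: 720p = 1280x720, 1440p = 2560x1440)
first_gen = 1280 * 720     # 921,600 pixels per frame
second_gen = 2560 * 1440   # 3,686,400 pixels per frame

print(second_gen / first_gen)  # 4.0 -> four times the pixels per frame at the same 30 Hz
```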
Update 16:23 UTC: Cyberpunk 2077 figures big in the next generation. NVIDIA is banking extensively on the game to highlight the advantages of Ampere. The 200 GB game could absorb gamers for weeks or months on end.
Update 16:24 UTC: New RTX IO technology accelerates the storage sub-system for gaming. It works in tandem with the new Microsoft DirectStorage technology, the Windows API counterpart of the Xbox Velocity Architecture, which is able to pull resources from disk directly into the GPU. It requires game engines to support the technology. The tech promises a 100x throughput increase and significant reductions in CPU utilization. It's timely, as PCIe Gen 4 SSDs are on the anvil.
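As a rough illustration of why moving decompression to the GPU matters (the figures in this sketch are assumptions picked for the example, not NVIDIA's numbers): a fast PCIe Gen 4 NVMe drive can read on the order of 7 GB/s of compressed data, and at an assumed ~2:1 compression ratio that becomes roughly 14 GB/s of asset data arriving in VRAM—a stream the CPU would otherwise have to decompress and shuffle itself.

```python
# Back-of-envelope sketch of the RTX IO / DirectStorage pitch.
# Both figures below are illustrative assumptions, not NVIDIA-provided numbers.
ssd_read_gb_s = 7.0        # sustained read of a fast PCIe Gen 4 NVMe SSD, GB/s
compression_ratio = 2.0    # assumed average compression ratio of game assets

# Traditional path: SSD -> system RAM -> CPU decompression -> copy to VRAM,
# which costs CPU cycles and extra copies. RTX IO's pitch is SSD -> GPU,
# with decompression handled on the GPU itself.
effective_asset_gb_s = ssd_read_gb_s * compression_ratio
print(f"~{effective_asset_gb_s:.0f} GB/s of decompressed asset data into VRAM")
```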

Update 16:26 UTC: Here it is, the GeForce RTX 3080, 10 GB GDDR6X, running at 19 Gbps, 238 tensor TFLOPs, 58 RT TFLOPs, 18 power phases.
Update 16:29 UTC: Airflow design. 90 W more cooling performance than Turing FE cooler.
Update 16:30 UTC: Performance leap, $700: up to 2x as fast as the RTX 2080, available September 17.
Update 17:05 UTC: GDDR6X was purpose-developed by NVIDIA and Micron Technology, which could be the exclusive vendor of these chips to NVIDIA. These chips use the new PAM4 signaling scheme to significantly increase data rates over GDDR6. On the RTX 3090, the chips tick at 19.5 Gbps (data rate), with memory bandwidth approaching 940 GB/s.
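The "approaching 940 GB/s" figure checks out from the stated data rate, assuming the RTX 3090's widely reported 384-bit memory bus (PAM4 carries two bits per symbol versus NRZ's one, which is how GDDR6X reaches these per-pin rates):

```python
# Memory bandwidth from per-pin data rate and bus width
# (19.5 Gbps is the stated data rate; the 384-bit bus is the widely
#  reported RTX 3090 figure, used here as an assumption)
data_rate_gbps = 19.5   # Gbps per pin
bus_width_bits = 384

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 936 GB/s, i.e. "approaching 940 GB/s"
```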
Update 16:31 UTC: RTX 3070, $500, faster than RTX 2080 Ti, 60% faster than RTX 2070, available in October. 20 shader TFLOPs, 40 RT TFLOPs, 163 tensor TFLOPs, 8 GB GDDR6.
Update 16:33 UTC: Call of Duty: Black Ops Cold War is RTX-on.

Update 16:35 UTC: RTX 3090 is the new TITAN. Twice as fast as the RTX 2080 Ti, 24 GB GDDR6X. The Giant Ampere. A BFGPU at $1,500, available from September 24. It is designed to power 60 fps gaming at 8K resolution and is up to 50% faster than the Titan RTX.

Update 16:43 UTC: Wow, I want one. On paper, the RTX 3090 is the kind of card I want to upgrade my monitor for. Not sure if a GPU ever had that impact.
Update 16:59 UTC: Insane CUDA core counts, 2-3x increase generation-over-generation. You won't believe these.
Update 17:01 UTC: GeForce RTX 3090 in detail. Over ten thousand CUDA cores!
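As a rough check on what "over ten thousand CUDA cores" translates to, single-precision throughput follows from core count and boost clock, since each CUDA core performs one fused multiply-add (two FP32 operations) per clock. The sketch below uses the widely reported 10,496-core count and ~1.70 GHz boost clock as assumptions:

```python
# FP32 throughput estimate: CUDA cores x 2 ops per clock (FMA) x boost clock
# (10,496 cores and a ~1.70 GHz boost are the widely reported RTX 3090 figures,
#  used here as assumptions; they are not stated in the text above)
cuda_cores = 10_496
boost_clock_ghz = 1.70

fp32_tflops = cuda_cores * 2 * boost_clock_ghz / 1000
print(f"~{fp32_tflops:.1f} TFLOPs FP32")  # roughly 35-36 TFLOPs
```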
Update 17:02 UTC: GeForce RTX 3080 details. More insane specs.

Update 17:03 UTC: The GeForce RTX 3070 has more CUDA cores than a TITAN RTX. And it's $500. Really wish these cards came out in March. 2020 would've been a lot better.
Here's a list of the top 10 Ampere features.

Update 19:22 UTC: For a limited time, gamers who purchase a new GeForce RTX 30 Series GPU or system will receive a PC digital download of Watch Dogs: Legion and a one-year subscription to the NVIDIA GeForce NOW cloud gaming service.

Update 19:47 UTC: All Ampere cards support HDMI 2.1. The increased bandwidth provided by HDMI 2.1 allows, for the first time, a single-cable connection to 8K HDR TVs for ultra-high-resolution gaming. Also supported is AV1 video decode.
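A quick calculation shows why the extra bandwidth matters here: an uncompressed 8K 60 Hz stream with 10-bit HDR color already exceeds even HDMI 2.1's 48 Gbps link rate, so the single-cable 8K claim leans on both the new bandwidth and the Display Stream Compression support HDMI 2.1 brings.

```python
# Uncompressed bandwidth of an 8K 60 Hz 10-bit HDR (RGB) video stream
width, height = 7680, 4320
refresh_hz = 60
bits_per_pixel = 3 * 10   # RGB, 10 bits per channel

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{raw_gbps:.0f} Gbps uncompressed")  # ~60 Gbps before blanking/overhead

# HDMI 2.0 tops out at 18 Gbps of link bandwidth, HDMI 2.1 at 48 Gbps,
# so single-cable 8K60 HDR needs HDMI 2.1 plus Display Stream Compression.
```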

Update 20:06 UTC: Added the complete NVIDIA presentation slide deck at the end of this post.

Update Sep 2nd: We received the following info from NVIDIA regarding international pricing:
  • UK: RTX 3070: GBP 469, RTX 3080: GBP 649, RTX 3090: GBP 1399
  • Europe: RTX 3070: EUR 499, RTX 3080: EUR 699, RTX 3090: EUR 1499 (this might vary a bit depending on local VAT)
  • Australia: RTX 3070: AUD 809, RTX 3080: AUD 1139, RTX 3090: AUD 2429

502 Comments on NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

#101
Max(IT)
Arkz: Yeah but let's see what shops actually sell 'em at.
Well, after the initial rush with low availability and higher prices, the price should be lower, not higher.
Third-party solutions with better coolers could cost a little more.
#103
chodaboy19
Ferrum Master: Jen-Hsun for sure likes kitchen shows...
He's keeping everyone safe, nothing but praise there.
#104
Space Lynx
Astronaut
chodaboy19: He's keeping everyone safe, nothing but praise there.
i loved when he pulled the 3090 out of the oven. glorious.
#105
midnightoil
Chomiq: Based on what? A single RT demo or next gen console games running at 4K30 with checkerboard rendering?
Based on all the performance data from both next-gen consoles and technical document releases, the fact that they will clearly have evaluated PS5 and Xbox dev kits, and that they likely had rough desktop RDNA2 performance figures leaked to them.

The pricing and gigantic, inefficient 3090 reflect this. Why else would they do it? You think they just rolled the dice and decided to slash their margins on volume sellers, and produce an ultra low yield furnace halo product for the LULs?
#106
Fluffmeister
Raendor: I was almost dropping a tear when he was talking about the 1080 (which I run to this day) and that it's safe to get a 3070/80 now :D
Hey, I'm still on a 980 Ti, but it does seem to be time to pull the trigger, 500 bucks and boom, 2080 Ti performance and all the shiny new tech, damn Nvidia ARE big meanies!

Just waiting on Vermeer too and it will finally be time to blow the cobwebs from my wallet.
#107
Imouto
RIP AMD.

This time around they won't be able to compete even in price. I see the higher ups at AMD hovering their mouses over emails asking their providers to stop the RDNA2 production.
#108
PowerPC
Love how much cognitive dissonance must be going on in people's heads right now. Just a day ago people were still saying this kind of performance / price was "literally impossible" on this very forum.
#109
RH92
mouacyk: He looked at me through the screen and called me his "Pascal friend" and said "it's safe to upgrade". :laugh: Guy is salesman for sure.
:laugh: :laugh: :laugh: :laugh: That's for sure! Time for my 1080 Ti to find a new home. I'm really tempted between the 3070 and 3080; if the 3070 can rasterize as well as the 2080 Ti I might go with it, if not I'll grab myself a 3080.
#110
Vya Domus
Imouto: AMD hovering their mouses over emails asking their providers to stop the RDNA2 production.
Too late, a couple of million consoles are already commissioned to have RDNA2.

Do you even know what planet you are on?
#111
TheoneandonlyMrK
I'll wait on benches and reviews.
I feel like my 2060 just became iGPU-grade though, that's for sure.
#112
Foxiol
As a happy owner of a three-month-old RTX 2080 Super with an LG CX6LB OLED 4K 120 Hz 55" TV as a monitor... yup, I knew it wasn't going to be enough for playing at actual 4K, but it does its job at 1440p. Depending on how those RTX 3080 reviews are at 4K resolution, I might get one without even thinking about it.

Also, I bought it for 789€ (Gigabyte Windforce OC 8G model) and now it could sell for what... 400€ at most? Nvidia killed the resale market for those that paid way more than I did, and stores are now sitting on massive amounts of 2080 Tis for around 1300-1600€... What are they going to do, basically, since 3070 prices can't go much higher than the recommended price tag? I bet no one in their right mind is going to buy a new 2080 Ti now after this.

All in all I don't care that much. I'm glad NVIDIA came to their senses for once and kept prices in the right place; the 3080 is well priced in my opinion, BUT that gap between it and the 3090 could mean they have something more to show.

Again, 4K reviews of that 3080 will seal the deal for me. HDMI 2.1 is also my main reason to upgrade, for having stable and functional 120 Hz and G-SYNC at 4K even if I'm not reaching higher frame rates.
#113
Raendor
Fluffmeister: Hey, I'm still on a 980 Ti, but it does seem to be time to pull the trigger, 500 bucks and boom, 2080 Ti performance and all the shiny new tech, damn Nvidia ARE big meanies!

Just waiting on Vermeer too and it will finally be time to blow the cobwebs from my wallet.
In the same boat. Although if it takes long, I'll just get something like a 10400F, which will be fine for 1440p gaming to cover me till AM5/LGA1700.
#114
Amite
I can feel the used GPU prices on eBay falling without even looking.
#115
kings
519€ for the RTX 3070 and 719€ for the RTX 3080 in Europe... not bad at all. Those are Founders Edition prices.

It looks like my 980Ti will finally get its well-deserved rest.
#116
CrAsHnBuRnXp
jesdals: But does the RTX 3090 support SLI?
SLI is dead. Nor will you need it with a 3090.
#117
ZoneDymo
so uhh where can I watch this presentation back?
#118
theGryphon
Can't wait to see what NVIDIA will have for sub-75W LP segment :toast:
#119
Imouto
Vya Domus: Too late, a couple of million consoles are already commissioned to have RDNA2.

Do you even know what planet you are on?
That's one of the points, though. Consoles are obsolete even before launch.
#120
kings
ZoneDymo: so uhh where can I watch this presentation back?
#121
PowerPC
CrAsHnBuRnXp: SLI is dead. Nor will you need it with a 3090.
It was dead but they actually implemented a new version of SLI just for this card.
#122
R0H1T
PowerPC: Love how much cognitive dissonance must be going on in people's heads right now. Just a day ago people were still saying this kind of performance / price was "literally impossible" on this very forum.
Two trains of thought, not necessarily contradictory:

If Nvidia had been able to pull off the 2.5-3x perf/W efficiency, it's possible they would have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of nothingburger called nCoV ravaging the entire world atm. Now depending on which side of the fence you are on, Nvidia's margins could be higher, though I'm 100% certain their overall sales would be (much?) lower!

Next is what we see right now: Nvidia cannot really get the perf/W efficiency leap some of the leaks suggested. That means the Nvidia cards will not be better in nearly all metrics vs AMD, unlike the last gen. So pricing them at enthusiast grade is nearly impossible for them. Hence the current "attractive" pricing.

The only way Nvidia prices Ampere the way they have now is if RDNA2 is really competing with them on perf/W, and likely perf/$ as well. Anyone remember Intel's mainstream-quad-cores-for-a-decade BS till Zen launched? This is likely the same game played over again.
#123
BorisDG
I'm curious how they will perform on PCI-E 3.0
#124
efikkan
Specs are looking pretty good (on paper), perhaps except the TDP numbers.

It's funny to see that most "leakers" were pretty wrong on these specs.
#125
PowerPC
R0H1T: Two trains of thought, not necessarily contradictory:

If Nvidia had been able to pull off the 2.5-3x perf/W efficiency, it's possible they would have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of nothingburger called nCoV ravaging the entire world atm. Now depending on which side of the fence you are on, Nvidia's margins could be higher, though I'm 100% certain their overall sales would be (much?) lower!

Next is what we see right now: Nvidia cannot really get the perf/W efficiency leap some of the leaks suggested. That means the Nvidia cards will not be better in nearly all metrics vs AMD, unlike the last gen. So pricing them at enthusiast grade is nearly impossible for them. Hence the current "attractive" pricing.

The only way Nvidia prices Ampere the way they have now is if RDNA2 is really competing with them on perf/W, and likely perf/$ as well. Anyone remember Intel's mainstream-quad-cores-for-a-decade BS till Zen launched? This is likely the same game played over again.
I'm at a loss for words for all the conjuring and bending that is happening in this post. Just take the performance / price increase already... sheesh.