Monday, October 7th 2019

NVIDIA Could Launch Next-Generation Ampere GPUs in 1H 2020

According to sources over at Igor's Lab, NVIDIA could launch its next generation of GPUs, codenamed "Ampere", as soon as the first half of 2020. Having just recently launched the GeForce RTX Super lineup, NVIDIA could surprise us again in the coming months with a replacement for its Turing lineup of graphics cards. Expected to directly replace current high-end models such as the GeForce RTX 2080 Ti and RTX 2080 Super, Ampere should bring the many performance and technology advancements usually associated with a new graphics card generation.

For starters, we could expect a notable die shrink in the form of a 7 nm node, replacing the aging 12 nm process Turing is currently built on. This alone should bring a more than 50% increase in transistor density, resulting in significantly higher performance and lower power consumption compared to the previous generation. NVIDIA's foundry of choice is still unknown; however, current speculation points to Samsung manufacturing Ampere, possibly due to delivery issues at TSMC. Architectural improvements should follow as well: ray tracing is expected to persist and be enhanced, possibly with more hardware allocated to it, along with better software to support the ray tracing ecosystem of applications.
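For a rough sense of what such a density jump could mean for die sizes, here is a minimal back-of-the-envelope sketch in Python. The 7 nm density figure is purely an assumption for illustration (nothing about Ampere is confirmed); TU102's published die size and transistor count serve as the 12 nm baseline.

```python
# Back-of-the-envelope sketch: what a >50% transistor-density increase
# could imply for die area. The 7 nm density is an assumed figure,
# not a confirmed Ampere specification.
tu102_area_mm2 = 754.0          # TU102 (RTX 2080 Ti) die size on 12 nm
tu102_transistors_mtr = 18_600  # ~18.6 billion transistors, in millions

density_12nm = tu102_transistors_mtr / tu102_area_mm2  # ~24.7 MTr/mm^2
density_7nm = density_12nm * 1.5                        # assumed +50% uplift

same_budget_at_7nm = tu102_transistors_mtr / density_7nm
print(f"12 nm density: {density_12nm:.1f} MTr/mm^2")
print(f"TU102-sized transistor budget at 7 nm: ~{same_budget_at_7nm:.0f} mm^2 "
      f"(vs. {tu102_area_mm2:.0f} mm^2 today)")
```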
Source: Igor's Lab via WCCFTech

33 Comments on NVIDIA Could Launch Next-Generation Ampere GPUs in 1H 2020

#1
oxrufiioxo
That would be pretty awesome. I'm currently more curious/concerned about pricing. My wife may not be cool with a $2,000-ish flagship. :)
#2
Hyderz
Wow, already? That's pretty quick considering companies generally stretch out about a year before releasing a new GPU.
Could it be they want a piece of the market share before new GPUs from AMD and Intel come into play in 2020?
#3
dicktracy
This will be my next upgrade. Also, Big Navi = Midrange Ampere worst case scenario.
#4
robb
HyderzWow, already? That's pretty quick considering companies generally stretch out about a year before releasing a new GPU.
Could it be they want a piece of the market share before new GPUs from AMD and Intel come into play in 2020?
What new GPU? The SUPER cards were little more than a response to what AMD was launching, and they use existing chips. Turing launched back in September 2018, so we are probably looking at 18 months or more by the time Ampere comes out.
#5
Hotobu
oxrufiioxoThat would be pretty awesome. I'm currently more curious/concerned about pricing. My wife may not be cool with a $2,000-ish flagship. :)
Yeah, I'm hoping they don't have a price hike as well, but it seems inevitable. It'd be nice if they didn't increase from the 2000 series at least (even though that's a bit steep).
#6
Upgrayedd
I really hope Intel puts up a tough fight so these cards don't get too pricey.
#7
zo0lykas
UpgrayeddI really hope Intel puts up a tough fight so these cards don't get too pricey.
Don't think so. Look at Intel's pricing overall, for Optane SSDs, for CPUs, and you're hoping they'll sell GPUs on the cheap?
#8
Noim
If my crystal ballz are anything to go by (more reliable than Wccftech, btw), I have been expecting Nvidia's next GPU for H1 2020 ever since we got Cyberpunk 2077's release date. Imo, there is no way Nvidia can let this "RTX showroom by CDPR" launch without new beefy GPUs to bring a satisfactory RTX experience to the masses.
#9
Chomiq
Could be a response to big Navi launch
#10
las
Can't wait to replace my 1080 Ti.

RTX 3080 will do. I bet they launch in March, before Cyberpunk.
#11
Ergastolano
ChomiqCould be a response to big Navi launch
Indeed. A good move.
#12
kings
UpgrayeddI really hope Intel puts up a tough fight so these cards don't get too pricey.
Intel has already said through Raja that they will not fight in the high-end for now.

Nvidia will continue to have free rein in this segment and charge whatever they want. Unless there's a surprise from AMD, but honestly I'm not seeing them with the ability to fight Nvidia at the high end.
#13
Animalpak
I just bought my RTX 2080 Ti Strix last month; I can skip the upgrade unless the performance gain is more than the usual +15%. I'm more concerned about upgrading my CPU/MB/RAM.
#14
JalleR
My 1080 Ti is tired of generating 4K pictures, so a beefy 3080 Ti would be nice.

I think/hope they will go with 50%+ more RT cores (I would like to see 100%, but let's be realistic), maybe skip the AI cores, and then 50% more CUDA cores. Price? Well, it depends on the competition, so probably the same (I would like it a little cheaper, but who doesn't want that :) )
#15
Ergastolano
I would like to see the new RX 5500 (XT) and RX 5600 (XT) :).
#16
las
AnimalpakI just bought my RTX 2080 Ti Strix last month; I can skip the upgrade unless the performance gain is more than the usual +15%. I'm more concerned about upgrading my CPU/MB/RAM.
Why did you buy a 2080 Ti over a year after its release? I hope you got a hefty discount, because everyone knew Ampere / 7 nm would come soon.

Ampere is going to be a major bump in perf compared to Pascal -> Turing. Node + New (awaited) Arch.

Turing was never what we wanted. Ampere was teased back in 2018.
Turing was a gapfiller because of Ampere delay. SUPER refresh was straight up milking.

I expect 3070 to beat 2080 Ti (less power and heat too, obviously).

Can't wait.
#17
Animalpak
lasWhy did you buy a 2080 Ti over a year after its release? I hope you got a hefty discount, because everyone knew Ampere / 7 nm would come soon.

Ampere is going to be a major bump in perf compared to Pascal -> Turing. Node + New (awaited) Arch.

Turing was never what we wanted. Ampere was teased back in 2018.
Turing was a gapfiller because of Ampere delay. SUPER refresh was straight up milking.

I expect 3070 to beat 2080 Ti (less power and heat too, obviously).

Can't wait.
I bought it for $1,200 instead of $1,359. Also, with my new 27" 1440p 165 Hz G-Sync screen, it was urgent to buy a GPU to match.

I don't think they will allow themselves to make the 2000 series look too bad because, being a new chip, they will probably want to use it for other generations of GPUs without unlocking the full potential of Ampere.

Remember that those who have a 1080 Ti can still use it throughout 2020 without worries, as it is still a capable GPU, especially at 1080p (the resolution the majority of PC players use).
#18
las
AnimalpakI bought it for $1,200 instead of $1,359. Also, with my new 27" 1440p 165 Hz G-Sync screen, it was urgent to buy a GPU to match.

I don't think they will allow themselves to make the 2000 series look too bad because, being a new chip, they will probably want to use it for other generations of GPUs without unlocking the full potential of Ampere.

Remember that those who have a 1080 Ti can still use it throughout 2020 without worries, as it is still a capable GPU, especially at 1080p (the resolution the majority of PC players use).
Depends on how you play your games, I guess. I'm gladly lowering settings to achieve 100+ fps (at all times), and when you do this, you become much more CPU bound, even at 1440p.

Ampere is an all-new arch. Long awaited. Was supposed to come after Pascal. Turing was rushed out as a stop-gap solution. Half-baked arch and chips (hence the SUPER releases later, probably).

Of course the 1080 Ti will do 1080p; it still does 1440p flawlessly in pretty much any new game, maxed out with a 100 fps average. On par with a 2070 SUPER / 2080 non-SUPER.

The 3000 series is going to bring us what the 2000 series never did: a true generational leap.
#19
medi01
AleksandarK...the many performance and technology advancements usually associated with a new graphics card generation...
Eat that, 1080Ti owners, not one, many.... Oh, wait, "usually"... :))))
#20
ZoneDymo
Honestly, who cares? Prices are way too high anyway; anyone with some self-respect would not buy cards right now.

I guess the one thing I would be interested in seeing, which IMO needs to happen with these cards, is at least a doubling of ray tracing performance, because that is nowhere near where it needs to be.
#21
Space Lynx
Flagship Ampere at 1 grand might be the last GPU I ever buy. Let's hope it comes out around tax refund time; I should be getting a decent amount back for paying off one of my student loans. Not to mention the interest I am now saving by not having that loan comes to around $1,600 a year in interest alone... so yeah, the Ampere flagship is going to be mine as a reward. Navi just has too many issues; I hope AMD can fix them, but my trust is lacking. It just seems to be bad story after bad story I have read of crashes three hours into a game, etc.
#22
ppn
The 3080 will beat the 2080 Ti at the same price point. Transistor density should improve by about 60%: TSMC's 7 nm offers roughly 41 MTr/mm² compared to 25 MTr/mm² on 14 nm. The 2080 SUPER would shrink to about 330 mm². But performance can't be lifted 50%; that would require hitting 3 GHz clock speeds. The biggest chip they can make on EUV is 429 mm².
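A quick sanity check of those figures (assuming TU104, the RTX 2080 SUPER chip, at its publicly listed ~545 mm² and ~13.6 billion transistors) lands close to the same numbers:

```python
# Reproduce the quoted shrink estimate from the stated densities.
tu104_area_mm2 = 545.0          # TU104 (RTX 2080 SUPER) die size today
tu104_transistors_mtr = 13_600  # ~13.6 billion transistors, in millions

density_now = tu104_transistors_mtr / tu104_area_mm2   # ~25 MTr/mm^2
density_7nm_tsmc = 41.0                                 # quoted TSMC 7 nm density

shrunk_area = tu104_transistors_mtr / density_7nm_tsmc  # ~332 mm^2, near the ~330 quoted
print(f"Current density: {density_now:.1f} MTr/mm^2, shrunk die: ~{shrunk_area:.0f} mm^2")
```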
#23
Unregistered
'Bout time - there hasn't been a single product released since Pascal that would be an upgrade for me and that's appropriately priced for its performance. I'll be curious to see what the 3000 series is like and if the pricing comes back down out of the stratosphere. As someone else said, this timing makes a lot of sense - many folks will be wanting to upgrade for CP2077.
#24
efikkan
While we do expect Nvidia to launch their next generation sometime "soon", this "source" from WCCFTech is completely bogus.
It relies on two details:
- Alleged EEC certifications for GA104-400, which we know is a typo, as EEC certifications only happen after the product is complete.
- Some guy claims it's coming in H1 2020.

This article should be titled "Some guy thinks Nvidia will launch Ampere GPUs in H1 2020".
While this might turn out to be right, it would be by coincidence; anyone can reason that Nvidia will replace their lineup sooner or later, and sometime in 2020 is a fairly safe bet.
HyderzWow, already? That's pretty quick considering companies generally stretch out about a year before releasing a new GPU.
Turing was released late 2018, but was also delayed. So ~two years between generations is pretty normal.
lasAmpere is going to be a major bump in perf compared to Pascal -> Turing. Node + New (awaited) Arch.

Turing was never what we wanted. Ampere was teased back in 2018.
Turing was a gapfiller because of Ampere delay. SUPER refresh was straight up milking.
lasAmpere is an all-new arch. Long awaited. Was supposed to come after Pascal. Turing was rushed out as a stop-gap solution. Half-baked arch and chips (hence the SUPER releases later, probably).
First of all, get the facts straight: Pascal was the "filler" due to delays of Volta. Turing is derived from Volta and is a major improvement over Maxwell/Pascal. The chips were not "half-baked"; like almost every GPU generation over the last ten years, Nvidia did some mid-life refreshes.

The next architecture may very well turn out to be "Ampere", but I haven't seen anything but baseless speculation about it. I'm puzzled how you would know it's a major bump over Turing. "Ampere" might even be a datacenter-only GPU for all we know.

I just want to remind people about all the BS about AMD's "Arcturus", which was supposed to be the successor to GCN but in reality was Vega-based.
Razrback16'Bout time - there hasn't been a single product released since Pascal that would be an upgrade for me and that's appropriately priced for its performance. I'll be curious to see what the 3000 series is like and if the pricing comes back down out of the stratosphere. As someone else said, this timing makes a lot of sense - many folks will be wanting to upgrade for CP2077.
Seriously, you upgrade every generation?
It would make more sense to buy a higher tier product and keep it for longer.
#25
Unregistered
efikkanSeriously, you upgrade every generation?
It would make more sense to buy a higher tier product and keep it for longer.
I already buy pretty high end, generally speaking. At times, yes, I upgrade generation to generation - but only if I'm actually GPU limited. If I were playing everything at my target resolution and framerates I would stay with the same gear for quite some time, but usually I end up GPU or VRAM limited, lol. Money isn't an issue for me, but at the same time, I'm not going to spend an outrageous amount of money for meager gains such as the Pascal -> Turing jump - I just don't want to set that kind of precedent for the future. If I were to do that, I'd be sending a message to Nvidia that that kind of pricing scheme is okay, when it's not. So I'm holding steady with my 1080 Ti cards for now. Hopefully the 3000 series is more reasonably priced. If not, I'll wait for second-hand Turing cards to come down to more reasonable price ranges.