# Rumor: NVIDIA's Next Generation GeForce RTX 3080 and RTX 3070 "Ampere" Graphics Cards Detailed



## AleksandarK (Jan 20, 2020)

NVIDIA's next generation of graphics cards, codenamed Ampere, is set to arrive sometime this year, presumably around GTC 2020, which takes place starting March 22nd. Before NVIDIA CEO Jensen Huang officially reveals the specifications of these new GPUs, we have the latest round of rumors coming our way. According to VideoCardz, which cites multiple sources, the die configurations of the upcoming GeForce RTX 3070 and RTX 3080 have been detailed. Built on Samsung's latest 7 nm manufacturing process, this generation of NVIDIA GPUs promises a big improvement over the previous one.

For starters, the two dies that have appeared carry the codenames GA103 and GA104, corresponding to the RTX 3080 and RTX 3070 respectively. Perhaps the biggest surprise is the Streaming Multiprocessor (SM) count. The smaller GA104 die has as many as 48 SMs, resulting in 3072 CUDA cores, while the bigger, oddly named GA103 die has as many as 60 SMs, for 3840 CUDA cores in total. These increases in SM count should result in a notable performance uplift across the board. Alongside the higher SM count comes a wider memory bus: the smaller GA104 die that should end up in the RTX 3070 uses a 256-bit memory bus allowing for 8/16 GB of GDDR6 memory, while its bigger brother, the GA103, has a 320-bit bus that allows the card to be configured with either 10 or 20 GB of GDDR6 memory. In the images below you can check out the alleged diagrams for yourself and judge whether they look fake; as always, take this rumor with a grain of salt.
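The rumored figures line up with Turing's layout of 64 CUDA cores per SM, and the memory capacities follow directly from the bus widths (one 8 Gb/16 Gb GDDR6 chip per 32-bit channel). A quick back-of-the-envelope sanity check; note the 14 Gbps GDDR6 data rate below is an assumption carried over from Turing cards, not part of the rumor:

```python
# Sanity-check the rumored Ampere configurations.
# Assumes Turing's 64 CUDA cores per SM and (hypothetically) 14 Gbps GDDR6.

CUDA_PER_SM = 64   # Turing convention, assumed to carry over
GDDR6_GBPS = 14    # per-pin data rate in Gb/s, assumed

def cuda_cores(sms: int) -> int:
    """Total CUDA cores for a given SM count."""
    return sms * CUDA_PER_SM

def bandwidth_gb_s(bus_bits: int, gbps: float = GDDR6_GBPS) -> float:
    """Memory bandwidth: bus width (bits) x per-pin rate (Gb/s) / 8 bits per byte."""
    return bus_bits * gbps / 8

print(cuda_cores(48))       # GA104: 3072
print(cuda_cores(60))       # GA103: 3840
print(bandwidth_gb_s(256))  # GA104 @ 14 Gbps: 448.0 GB/s
print(bandwidth_gb_s(320))  # GA103 @ 14 Gbps: 560.0 GB/s
```

Faster GDDR6 (16 or 18 Gbps) would push the bandwidth figures proportionally higher.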


 



*View at TechPowerUp Main Site*


----------



## londiste (Jan 20, 2020)

Tl;dr
GA103 - 60 SM (3840 SP) - 320-bit VRAM  (10/20 GB) - assumed 3080
GA104 - 48 SM (3072 SP) - 256-bit VRAM (8/16 GB) - assumed 3070

7nm has not been favourable to higher clocks so far, but it has been considerably better in power consumption and efficiency. With frequencies around 2 GHz (and some architectural improvements), GA104 would be faster than the 2080 Super and GA103 faster than the 2080 Ti.

Assuming Nvidia's usual chip numeration there should be another higher-end GA102 in the works...


----------



## kapone32 (Jan 20, 2020)

Depending on pricing I may make the jump to Nvidia if this holds true. I cringe at the potential price point for these particular cards.


----------



## Razrback16 (Jan 20, 2020)

kapone32 said:


> Depending on pricing I may make the jump to Nvidia if this holds true. I cringe at the potential price point for these particular cards.



Ya, I will wait and see what the 3080 Ti is priced at when released. It needs to have, at bare minimum, 16GB of RAM. If they can price it appropriately, I'll buy a pair of them. If not, I will continue to be patient and wait for a good deal on eBay down the road.


----------



## xkm1948 (Jan 20, 2020)

I do wonder how different the overall design will be from the Turing uArch. Also, I hope they don't cut down the Tensor cores. It has been really nice to use a consumer-level GPU for dl/ml acceleration.


----------



## CrAsHnBuRnXp (Jan 20, 2020)

Razrback16 said:


> Ya, I will wait and see what the 3080 Ti is priced at when released. It needs to have at bare minimum, 16GB of RAM or more. If they can get it priced appropriately, I'll buy a pair of them. If not, I will continue to be patient and wait for a good deal on ebay down the road.


Why? Dual card systems are basically dead these days. Yes, there is still support for SLI, but it's at the point where it's unneeded and troublesome.


----------



## Otonel88 (Jan 20, 2020)

Not really an expert on the in-depth details, but the *2080 Ti* currently has *4352 CUDA cores*, while the rumored *GA103* die could have *3840 CUDA cores* across as many as *60 SMs*.
So on paper this is not faster than the *2080 Ti, right?*
Obviously the smaller die would bring other benefits, such as a lower TDP, which I am really interested in.
Do you guys think the GA103 card would be under 200 W TDP?
I'm always looking at lower-TDP cards: less heat for the cooler to deal with translates into quieter card operation.


----------



## xkm1948 (Jan 20, 2020)

Otonel88 said:


> Not really an expert on the in-depth details, but the *2080 Ti* currently has *4352 CUDA cores*, while the rumored *GA103* die could have *3840 CUDA cores* across as many as *60 SMs*.
> So on paper this is not faster than the *2080 Ti, right?*
> Obviously the smaller die would bring other benefits, such as a lower TDP, which I am really interested in.
> Do you guys think the GA103 card would be under 200 W TDP?
> I'm always looking at lower-TDP cards: less heat for the cooler to deal with translates into quieter card operation.



Different generation of stream processors cannot be compared just by numbers.


----------



## Otonel88 (Jan 20, 2020)

xkm1948 said:


> Different generation of stream processors cannot be compared just by numbers.


Good point


----------



## kapone32 (Jan 20, 2020)

CrAsHnBuRnXp said:


> Why? Dual card systems are basically dead these days. Yes, there is still support for SLI but it's at the point where its unneeded and troublesome.



Multi GPU is for anyone who wants it. I have not replaced my Vega 64s with 5700 XTs for exactly that reason. If multi GPU were really dead, motherboards would not be touting it and shipping with 2-, 3- and 4-way SLI bridges. As much as people complain about it being troublesome, it is not as bad as people make it out to be. There are plenty of games that support multi GPU anyway. As an example, I get 107 FPS average at 4K playing Jedi Fallen Order @ Ultra settings.


----------



## CrAsHnBuRnXp (Jan 20, 2020)

kapone32 said:


> Multi GPU is for anyone that wants it. I have not replaced my Vega 64s with 5700Xts for exactly that reason. If Multi GPU was really dead MBs would not be touting it and giving you 2, 3 and 4 SLI bridges. As much as people complain about it being troublesome it is  not as bad as people make it out to be. There are plenty of Games that support Multi GPU anyway. As an example I get 107 FPS average at 4K playing Jedi Fallen Order @ Ultra settings.


Triple and quad SLI is officially dead as nvidia no longer supports it.


----------



## Otonel88 (Jan 20, 2020)

Any opinions or guesses on the TDP of these new cards on 7nm?


----------



## Kaotik (Jan 20, 2020)

This "new leak" is actually part of a rumor from last May:

https://twitter.com/i/web/status/1131031428878094337


----------



## kapone32 (Jan 20, 2020)

CrAsHnBuRnXp said:


> Triple and quad SLI is officially dead as nvidia no longer supports it.



Yeah, I know; I was just pointing out that we still get those. Personally I would never go for more than 2 GPUs in a setup. On modern motherboards it would be nearly impossible to fit more than 2 GPUs in some cases, unless you install single-slot water blocks.


----------



## Adam Krazispeed (Jan 20, 2020)

IDK, my thoughts...

AM102 - ?? SM (???? SP) - RT cores 108 - 384-bit VRAM - assumed RTX Titan XXX
AM104 - 68 SM (4352 SP) - RT cores 88 - 320-bit VRAM - assumed 3080 Ti - 12/16/32 GB
AM106 - 46 SM (2944 SP) - RT cores 68 - 256-bit VRAM - assumed 3070 Ti - 12/16 GB

7nm has not brought much higher clocks so far, but it has been much better in power consumption/efficiency. With frequencies of 1.7-2 GHz, GA104 could be faster than a 2080 Super and GA102 faster than a 2080 Ti; maybe 20-30% max, but up to 50% max in RT.

Assuming Nvidia's usual chip numeration there should be another higher-end GA102 in the works... a TITAN model!



Adam Krazispeed said:


> IDK, my thoughts..
> 
> AM102 - ??  sm (?????   SP) -RT Cores 108 - 384bit VRAM  - assumed RTX Titan XXX
> AM104 - 68 SM (4352  SP) - RT Cores 88  - 320-bit VRAM  - assumed 3080ti -12gb-16gb-32gb
> ...


It also depends on who NVIDIA is using: TSMC or Samsung. It's rumored that TSMC's 7nm is a bit better than Samsung's at the moment (whether EUV or DUV).


----------



## medi01 (Jan 20, 2020)

https://twitter.com/i/web/status/1219171309298561024


----------



## VrOtk (Jan 20, 2020)

That 320-bit bus looks weird: by giving your xx80-grade card such high memory bandwidth, you start to cripple your xx80 Ti's advantage at higher resolutions (unless it uses a 512-bit bus or HBM2).
Though I'd be happy to be wrong, as better-grade products for the same or a cheaper price are always welcome.


----------



## Space Lynx (Jan 20, 2020)

medi01 said:


> https://twitter.com/i/web/status/1219171309298561024



going to be a great year for high end gaming.

I'm leaning towards Big Navi and a Ryzen 4700X, but if the Intel/Nvidia high end ends up being 30+% faster for 200-300 bucks more... then yeah, I'll probably go green.


----------



## efikkan (Jan 20, 2020)

GTC is a compute/professional graphics conference. I can't remember the last time Nvidia launched consumer products there, so this sounds unlikely, though not impossible; keep in mind we've had similar rumors about GTC in previous years too.
I would expect them to show data center products and perhaps give some hints about the upcoming consumer products.


----------



## Gungar (Jan 20, 2020)

VrOtk said:


> That 320-bit bus looks weird: allowing your xx80 grade card to have such high memory bandwidth, you start to cripple your xx80Ti's performance at higher resolutions (unless it uses 512-bit bus or HBM2).
> Though I'd be happy to be wrong, as better grade products for the same or cheaper price is always a welcome.



Yeah, I was thinking the same, but it really wouldn't surprise me if the new 3080 Ti comes with HBM.


----------



## EarthDog (Jan 20, 2020)

kapone32 said:


> Multi GPU is for anyone that wants it.


True... for anyone who is a masochist. I mean, if people want to put themselves through a spotty/inconsistent performance uptick while always using 2x the power and dealing with 2x the heat... go for it. mGPU just needs to die already, or they should go all in and get it working. This technology has been 'meh' for far too long.


Gungar said:


> it really wouldn't surprise me that the new 3080Ti comes with HBM.


I don't see this happening... 

Does anyone else see HBM making any headway in consumer graphics? When first released, AMD touted it as the second coming... but it turned out to be fairly useless comparatively (it did shrink the gaps against NV cards at high resolutions, but those cards never had the core horsepower for anything past 1440p in the first place).


----------



## _Flare (Jan 20, 2020)

They could go for 4 SMs per TPC, leading to doubled RT performance,
or they could place 8 or even 12 TPCs per GPC, also leading to 2x performance.
People shouldn't forget that nvidia gets about 2x the effective area going from 12nm to 7nm.
I bet nvidia has plenty of ideas for getting performance from that extra area, and doubled RT performance scales 1:1 with SMs.


----------



## DeathtoGnomes (Jan 20, 2020)

Wonder what AMD will counter this with.


----------



## Gungar (Jan 20, 2020)

EarthDog said:


> Does anyone else see HBM making any headway in consumer graphics? When first released, AMD touted it as the second coming... but it turns out, it was fairly useless comparatively (I mean it did shrink gaps between NV cards at high resolutions, but never had the core horsepower in the first place for anything past 1440p).



HBM is by far the most powerful and efficient graphics memory out there right now. AMD has just been incapable of producing a good GPU for some time now.


----------



## EarthDog (Jan 20, 2020)

DeathtoGnomes said:


> Wonder what AMD will counter this with.


Big Navi is due out at the same time... 

... wondering if it will still be 'big' compared to these (or w.e NV's high-end will be) or what...


----------



## Otonel88 (Jan 20, 2020)

I think whatever the potential of the Ampere architecture, the performance will be around 10-15% better than the current generation.
This is so Nvidia can 'milk' the architecture as much as possible in the coming years. If they run tests and realise they could push up to 60% more performance, they will release that performance in batches over the next generations of cards. So yeah, I am excited about the new cards, but Nvidia is a business, and they won't release the full performance in the first batch, as that would leave them empty-handed.


----------



## EarthDog (Jan 20, 2020)

Otonel88 said:


> I think what ever the potential of the Ampere architecture, the performance will be around 10% - 15% improved over current generation.


I highly doubt it. This past generation we saw the least amount of performance increase from flagship to flagship from NV (and that was 25-30%).


----------



## Xaled (Jan 20, 2020)

Even a 100% improvement means nothing if the price increases 95% and the price/performance ratio stays the same.


----------



## Dave65 (Jan 20, 2020)

I'll wait for FAT Navi and then compare it with Nvidia's offering.


----------



## Otonel88 (Jan 20, 2020)

EarthDog said:


> I highly doubt it. This past generation we saw the least amount of performance increase from flagship to flagship from NV (and that was 25-30%).



What I mean is that if the Ampere architecture's potential is 50%, the first Ampere GPUs will release with an improvement of 10-15%, leaving the rest of the potential for future cards. I think Nvidia will not give all of Ampere's potential to the next 30xx cards, just so they have something left in the tank for future cards. Plus they will make architectural improvements in the future, potentially getting even more out of Ampere.



Xaled said:


> Even a 100% improvement means nothing if the price increases 95% and the price/performance ratio stays the same.



Honestly, I think the price will either stay the same or increase. I don't think it will decrease. We have seen nothing but price increases from Nvidia over the past few years.


----------



## EarthDog (Jan 20, 2020)

Otonel88 said:


> What I mean, is that if the Ampere architecture potential is 50%, the first ampere GPUs will release with an improvement of 10%-15%. Leaving the rest of potential for future cards. I think Nvidia will not give all the potential of Ampere with the next 30xx cards, just so they have something left in the tank for other future cards. Plus they will do architecture improvements in the future potentially getting even more from Ampere.


In other words, you are calling these their mid-range cards??? I'm still confused... 10-15% over what?

Also, it has been generations since we have seen architectural changes mid life-cycle for a GPU... IPC is what it is.


----------



## Otonel88 (Jan 20, 2020)

EarthDog said:


> Big Navi is due out at the same time...
> 
> ... wondering if it will still be 'big' compared to these (or w.e NV's high-end will be) or what...



It will be 'big navi' compared to the current generation. I don't think AMD can take Nvidia in the top-tier cards just yet. Historically they were always a bit behind in top-tier cards.


----------



## Xaled (Jan 20, 2020)

EarthDog said:


> I highly doubt it. This past generation we saw the least amount of performance increase from flagship to flagship from NV (and that was 25-30%).








And the RTX 2xxx series came three years after. Nvidia could have made more improvement in the RTX series, but there was no need to, and there won't be a need with this series either.


----------



## repman244 (Jan 20, 2020)

Xaled said:


> But still Nvidia could've made more improvement but there were no need to and there won't be need in this series as well



5970 is a dual GPU card, the only valid comparison is the 5870.


----------



## Otonel88 (Jan 20, 2020)

EarthDog said:


> In other words, you are calling these their mid-range cards??? Ik still confused... 10-15% over what?
> 
> Also, it has been generations since we have seen architectural changes in mid life cycle for gpu... IPC is what it is.



10-15% over the top-of-the-range Nvidia cards like the 2080 Super or 2080 Ti.
If they are running tests at the moment and the new Ampere generation shows a 50% improvement over the current high-end cards, then the GPUs coming in the next few years could improve over the current generation as follows:
3080 (+15% improvement over 2080)
4080 (+30% improvement over 2080)... and so on.

They will release the performance in batches over the coming years' cards.

(I am just speculating on percentages, but I guess you get my point)


----------



## Xaled (Jan 20, 2020)

repman244 said:


> 5970 is a dual GPU card, the only valid comparison is the 5870.


Who even mentioned the 5xxx series? I am comparing Nvidia to Nvidia.


----------



## EarthDog (Jan 20, 2020)

Otonel88 said:


> 10% - 15% over top of the range Nvidia cards like 2080 super or 2080 TI.
> If they are running tests and at the moment and the new Ampere generation has an improvement of 50% over the current high end cards then the GPU to come in the next years could have improvements over the current generations as it follows:
> 3080 (+15% improvement over 2080)
> 4080 (+30% improvement over 2080) .. and so on.
> ...


Then that isn't midrange. You're being confusing. Lol

With Turing, they released the high end first and midrange after.


----------



## Otonel88 (Jan 20, 2020)

EarthDog said:


> Then that isnt midrange. You are confusing. Lol



I am talking about what Nvidia could do with Ampere over the top cards of the current architecture.


----------



## EarthDog (Jan 20, 2020)

Otonel88 said:


> I am talking about what Nvidia could do with Amepere over the current top cards from the current architecture


Yep. It's going to do a lot more than that, flagship to flagship.



Xaled said:


> And the RTX 2xxx series came 3 years after. But still Nvidia could've made more improvement in the RTX series but there were no need to and there won't be need in this series as well


Isn't that the same base arch, though, but die-shrunk? I don't recall. Regardless, it was the least improvement in a loooong while.


----------



## FordGT90Concept (Jan 20, 2020)

londiste said:


> 7nm has not been favourable to higher clocks so far but has been considerably better in power consumption and efficiency.


That's at TSMC.  This is Samsung.

Judging by the core count, I expect Ampere to be quite different from Turing.  Like a Kepler to Maxwell jump in architecture over Pascal to Turing.


----------



## Xaled (Jan 20, 2020)

EarthDog said:


> Yep. It's going to  do a lot more than that, flagship to flagehip.


I agree it can be up to 50% better, maybe even more, but I really doubt Nvidia would play all of its cards, as there is no need to, and prices would increase by the same amount.


----------



## EarthDog (Jan 20, 2020)

Xaled said:


> I agree it can be up to 50% better, maybe even more, but I really doubt Nvidia would play all of its cards, as there is no need to, and prices would increase by the same amount.


To be clear, I dont think it will be 50% either, but it will surely be more than 10-15%.


----------



## 64K (Jan 20, 2020)

Otonel88 said:


> 10% - 15% over top of the range Nvidia cards like 2080 super or 2080 TI.
> If they are running tests and at the moment and the new Ampere generation has an improvement of 50% over the current high end cards then the GPU to come in the next years could have improvements over the current generations as it follows:
> 3080 (+15% improvement over 2080)
> 4080 (+30% improvement over 2080) .. and so on.
> ...



What are you basing this on?

Let's look at past performance increases to get some clarity about what we can probably expect from Ampere with the lower process node and new architecture.

RTX 2080 (non Super) showed an increase of 37% in average performance over the GTX 1080

GTX 1080 showed an increase of 67% in average performance over the GTX 980




Spoiler: Benches Done Here on RTX 2080, GTX 1080, GTX 980
Why in the world are you expecting a lousy 15% increase in performance from Ampere over Turing?


----------



## Xaled (Jan 20, 2020)

Comparing names is just wrong; you should compare the best GPU to the best of the previous generation.


----------



## Razbojnik (Jan 20, 2020)

lynx29 said:


> going to be a great year for high end gaming.
> 
> I'm leaning towards big navi and ryzen 4700x, but if intel/nvidia high end ends up being 30+% fast for a 200-300 bucks more... than yeah I prob will go green.



Yeah... pity there are no games worthy of such power. I mean, by the time these cards come out, an average $250 card will be able to rock 1440p with ray tracing on ultra in all games... plus games are being delayed like hell lately. We're getting to the point where there will be no point in buying expensive GPUs for gaming, just like there's no point in buying expensive CPUs for gaming... a $250 CPU maxes everything. I'm not complaining... I guess? Right? Hmm, there is a gap here. GPUs are getting insanely powerful... while we are getting games of lesser quality. Come to think of it, there's a lot to complain about here.


----------



## 64K (Jan 20, 2020)

Xaled said:


> Comparing names is just wrong, you should compare the best GPU to the best one of the previous generation



Ok let's look at high end GPUs

The RTX 2080 Ti showed an average increase in performance over the 1080 Ti of 33%

The 1080 Ti showed an average increase in performance over the 980 Ti of 75%



Spoiler: Benches Done Here of RTX 2080 Ti, GTX 1080 Ti, GTX 980 Ti


----------



## TheoneandonlyMrK (Jan 20, 2020)

EarthDog said:


> Big Navi is due out at the same time...
> 
> ... wondering if it will still be 'big' compared to these (or w.e NV's high-end will be) or what...


I think it will be bigger but not 'big', personally. Navi 21 Lite is a console GPU, Navi 21 is due soon and thought to be big, but Navi 23 is allegedly the one.

I hope these Nvidia 3070/3080 specs are wrong, personally; not that there is much wrong with them per se, I'm just ever hopeful for some performance doubling to kick in soon to make 8K usable, and this sounds like a mere step up in performance.

No, I am not going to use 8K, but I expect something that can do 8K to do 4K very well. And yes, before I get trolled: AMD are not doing what I want either.


----------



## Razbojnik (Jan 20, 2020)

kapone32 said:


> Multi GPU is for anyone that wants it. I have not replaced my Vega 64s with 5700Xts for exactly that reason. If Multi GPU was really dead MBs would not be touting it and giving you 2, 3 and 4 SLI bridges. As much as people complain about it being troublesome it is  not as bad as people make it out to be. There are plenty of Games that support Multi GPU anyway. As an example I get 107 FPS average at 4K playing Jedi Fallen Order @ Ultra settings.



For the money you spent... I can't say I'm impressed with those frames... and I can only imagine how awful the stuttering at 4K is lol


----------



## Otonel88 (Jan 20, 2020)

EarthDog said:


> To be clear, I dont think it will be 50% either, but it will surely be more than 10-15%.


Whatever that percentage turns out to be, we will not get all of it at once in the next high-end cards.


----------



## medi01 (Jan 20, 2020)

64K said:


> RTX 2080 (non Super) showed an increase of 37% in average performance over the GTX 1080



It sounds like more than it actually is.


----------



## EarthDog (Jan 20, 2020)

Otonel88 said:


> Whatever that percentage will be, we will not get all of that at once for the next high end cards.


All I am saying is that we will get A LOT more than 10-15% gains out of the box from the new generation when comparing apples to apples (tier to tier). The only differences we've seen over the past couple of generations are clock bumps and, in the case of the Super cards, core/spec differences to extract more performance. But 10-15% is what you get from putting a 2080 Ti on water and overclocking it... next gen will show double that, at minimum.


----------



## kapone32 (Jan 20, 2020)

Razbojnik said:


> For the money you spent...I can't say I'm impressed with those frames...and I can only imagine how awful the stuttering in 4k is lol



Are you sure about that? There is absolutely no stuttering, and the money I spent on my 2 Vega 64s was less than buying a brand new 2080 Ti here in Canada (including the water blocks).


----------



## Xaled (Jan 20, 2020)

64K said:


> What are you basing this on?
> 
> Let's look at past performance increases to get some clarity about what we can probably expect from Ampere with the lower process node and new architecture.
> 
> ...


Nah, even that is not correct; Titan to Titan would be the best comparison.


----------



## Berfs1 (Jan 20, 2020)

kapone32 said:


> Depending on pricing I may make the jump to Nvidia if this holds true. I cringe at the potential price point for these particular cards.


It was rumored that they would be around half the price; I'm not sure of the accuracy of that statement, but where the 2080 Ti MSRP was $999, I think the 3080 Ti may be $899, the 3080 $599, and the 3070 $379. Just my guess though...


----------



## TheoneandonlyMrK (Jan 20, 2020)

Would also be nice to hear some rumours about the improvements to the RTX hardware and tensor cores too; shader numbers alone are not going to tell us much about performance.

@Berfs1 I can't see any of that panning out. Since when did Nvidia improve on their best and then undercut it? I think we'll get the prices we have now plus 50 dollars minimum, at least at launch, with Supers or something coming out later once stocks of the 2xxx series run out, tbf.


----------



## ppn (Jan 20, 2020)

If Samsung's 7nm EUV provides 40 MTr/mm² density, just like TSMC's 7nm DUV, then the 3072-CUDA 2080 Super chip shrinks from 545 mm² to ~340 mm². 2070→3060, 2080S→3070; that's a 20% performance increase for 2060→3060 and 33% for 2070→3070, combined with a price drop. And since the EUV tools are limited to 429 mm², a 3080 only 25% bigger than the 3070 means ~420 mm², and there could be no 3080 Ti; this is where the next-gen Hopper comes into play with multi-chip. If Samsung's 7nm EUV really is 77 MTr/mm², or comes close to 64 MTr/mm² real density, it will be mind-blowing, as a 2080 Ti would shrink into a ~300 mm² die. But I doubt it can be done, since Navi is barely 41 MTr/mm² real on TSMC's 7nm DUV, which is 91 MTr/mm² on paper.
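The die-shrink arithmetic in the post above can be sketched out; the 40 MTr/mm² figure for Samsung's 7nm EUV is the post's assumption, not a confirmed spec, and the TU104 numbers are the known 12 nm part:

```python
# Die-size scaling from transistor density, following the post's assumptions.
TU104_AREA_MM2 = 545.0        # 2080 Super (TU104) die size on 12 nm
TU104_TRANSISTORS_M = 13_600  # transistor count in millions (13.6 billion)

# Implied 12 nm density, then the post's assumed Samsung 7nm EUV density.
density_12nm = TU104_TRANSISTORS_M / TU104_AREA_MM2  # ~25 MTr/mm^2
density_7nm_euv = 40.0                               # MTr/mm^2 (assumption)

# Same transistor budget re-laid-out at the higher density.
shrunk_area = TU104_TRANSISTORS_M / density_7nm_euv
print(round(density_12nm))  # ~25 MTr/mm^2
print(round(shrunk_area))   # ~340 mm^2, matching the post's 545 -> ~340 estimate
```

At the more optimistic 64 MTr/mm² "real" density the same math shrinks even a 2080 Ti-sized transistor budget (~18,600 MTr) to roughly 290 mm², which is where the post's ~300 mm² figure comes from.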


----------



## Berfs1 (Jan 20, 2020)

theoneandonlymrk said:


> Would also be nice to hear some rumours about the improvements to RTX hardware and tensor cores too, shader numbers are not going to inform us much about the performance in isolation.
> 
> @Berfs1 I Can't see any of that panning out, since when did Nvidia improve their best then undercut it? so I think the prices we have now plus 50 dollars minimum, at least at launch with supers or something coming out later ,once stocks of the 2xxx series run out, makes sense tbf.


Cus the 20 series was extremely overpriced from the beginning and everyone knew that.


----------



## eidairaman1 (Jan 20, 2020)

Berfs1 said:


> Cus 20 series was extremely overpriced from the beginning and everyone knew that.



Green are spacecases in price


----------



## TheoneandonlyMrK (Jan 20, 2020)

Berfs1 said:


> Cus 20 series was extremely overpriced from the beginning and everyone knew that.


They are not cheap now either, and they have been out a while. Do you think they will drop the 2080 Ti another couple of hundred to fit the 3xxx lineup in? I don't. People did buy them; sure, they didn't fly off the shelf, but weigh that against 7nm yield issues and this could be the best way for Nvidia to play it out.

They are not known for launching a full lineup in a day; they will introduce their next GPUs slowly and cautiously to maximize profits. Early birds will pay, then as yields improve, a Super or something with a price cut, ish, sort of.


----------



## 64K (Jan 20, 2020)

medi01 said:


> It sounds like more than it actually is.



I have no idea what you are saying. It is what it is. And it is a 37% average increase in performance for the RTX 2080 (non-Super) over the GTX 1080 at 1440p in a 23-game suite benched on this site. Have a look at the benches if you want:









NVIDIA GeForce RTX 2080 Founders Edition 8 GB Review - www.techpowerup.com


----------



## ppn (Jan 20, 2020)

1080→2080 was 1.45x at 4K, but look at the transistor count: 13600M/7200M = 1.88x. So in the name of new features like async compute, variable rate shading, RTRT and tensor cores, we did not get the full ~90% improvement implied by the transistor count right away, but 45% instead; the 90% will come later as games start to use those features. But next gen should provide linear scaling, and that means an RTX 3080 at ~17000M transistors would be ~25% faster than the 2080S. If we take into account clock increases (18 Gbps memory and a 2.5 GHz GPU) for another 20%, we get the 50% number at half the power.
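The transistor-vs-performance scaling argument above boils down to two ratios; the speedup figure is the post's own 4K number, and the transistor counts are the published Pascal/Turing specs:

```python
# How much of Turing's transistor growth showed up as raw 4K performance,
# per the figures quoted in the post above.
GTX1080_TR_M = 7_200          # GP104 transistors, millions
RTX2080_TR_M = 13_600         # TU104 transistors, millions
OBSERVED_4K_SPEEDUP = 1.45    # 1080 -> 2080, the post's figure

tr_ratio = RTX2080_TR_M / GTX1080_TR_M       # transistor growth
realized = OBSERVED_4K_SPEEDUP / tr_ratio    # fraction realized as perf today

print(round(tr_ratio, 2))  # 1.89x the transistors...
print(round(realized, 2))  # ...but only ~0.77 of that growth as day-one speedup
```

The gap between the two ratios is the silicon spent on RT/tensor hardware and features games didn't yet use, which is exactly the post's point.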


----------



## medi01 (Jan 20, 2020)

64K said:


> I have no idea what you are saying. It is what it is. And it is a 37% average increase in performance  for a RTX 2080 (non Super) over a GTX 1080 at 1440p in a 23 game suite benched on this site. Have a look at the benches if you want to:
> 
> 
> 
> ...



We are talking about a *17% more expensive card* pulling roughly 1/3rd of the performance ahead, give or take, one year (or was it more?) after the original.


----------



## Space Lynx (Jan 20, 2020)

Razbojnik said:


> Yeah...pity there are no games worthy of such powers, I mean when these cards come an average 250$ card will be able to rock 1440p and ray tracing on ultra in all games...plus games are being delayed like hell lately. we're getting there where there will be no point in buying expensive gpu's for gaming, just like there's no point in buying expensive cpu's for gaming...a 250$ cpu maxes all.  I'm not complaining...I guess? Right? Hmm, there is a gap here. Gpu's are getting insanely powerful...while we are getting games of lesser quality. Come to think of it, there's a lot of things to complain about in here.




Huh? I can't even play The Witcher 3 maxed out at 1440p/144 Hz/144 fps yet, even with an RTX 2080 Super...

Sorry if you don't enjoy high refresh, but I do; GPUs have a long way to catch up before I can truly enjoy gaming.


----------



## 64K (Jan 20, 2020)

medi01 said:


> We are talking about *17% more expensive card*, pulling roughly 1/3rd of perf ahead, give or take, one (or was it more?) year later than the original.



I am not talking about the price increase and neither was the guy that I was replying to. We were talking about performance increase. If you want to talk about price increase then yes I think Nvidia did jack up the prices somewhat on Turing more than they needed to.



ppn said:


> 1080->2080 1,45x in 4K, but look at transistor count. 13600M/7200M = 1.88x. So in the name of new improvements like async compute and variable shader rate we did not get the real 90% improvement that is provided by transistor count right away but 45% instead. 90% will come later as games start to use it. But next gen will provide linear scaling and that means RTX 3080 17000M transistors ~~ 25% faster than 2080S.



A good bit of the transistor increase on Turing is due to the RT cores and Tensor cores on Turings. If Turings had just stuck with a lot more CUDA cores then the performance increase would have been much better over Pascals but Nvidia wanted to push RTRT forward.

The bottom line is that Ampere will be on the 7 nm process node as compared to Turing on 12 nm, so it should be a good bit more efficient, leaving room to add more cores and faster clocks at the same wattage.

IMO the performance increase with Ampere will be somewhere between 35% and 50% over Turing
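ppn's transistor-count reasoning quoted above is easy to sanity-check with quick arithmetic. A sketch (the counts are the rumored figures circulating in this thread, not confirmed specs):

```python
# Rough perf scaling from transistor counts (rumored figures from this thread).
pascal_1080 = 7_200   # million transistors
turing_2080 = 13_600
ampere_3080 = 17_000  # rumored

# Raw transistor ratio Pascal -> Turing vs. the ~1.45x perf observed at 4K:
ratio_turing = turing_2080 / pascal_1080
print(f"2080/1080 transistor ratio: {ratio_turing:.2f}x (observed perf: ~1.45x)")

# If, as ppn argues, Ampere scales roughly linearly with transistors over Turing:
ratio_ampere = ampere_3080 / turing_2080
print(f"3080/2080 transistor ratio: {ratio_ampere:.2f}x (~{(ratio_ampere - 1) * 100:.0f}% faster)")
```

So the ~25% figure is just the raw 17000M/13600M ratio; whether real performance tracks it depends on how much of the Turing die was "spent" on RT and Tensor hardware, as discussed above.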


----------



## dicktracy (Jan 20, 2020)

Big. Navi. Is. Dead. Don't even bother releasing that expensive chip just to get defeated jebaited by midrange Ampere.


----------



## Vayra86 (Jan 20, 2020)

xkm1948 said:


> I do wonder how much different the overall design would be from Turing uArc. Also I hope they don't cut down the Tensorflow units. It has been really nice to use consumer level GPU for dl/ml acceleration.



Looking at the shader counts I don't think it's going to be a huge change; more like a small refinement of Turing and similar die sizes. 500 additional shaders for the x80.



dicktracy said:


> Big. Navi. Is. Dead. Don't even bother releasing that expensive chip just to get defeated jebaited by midrange Ampere.



Business as usual. AMD still hasn't caught up and I reckon they might not get there anytime soon either. You don't simply make two generational jumps in one go. So guess what. They will compete on price in the upper mid range and that is all... once more.



ppn said:


> 1080->2080 1,45x in 4K, but look at transistor count. 13600M/7200M = 1.88x. So in the name of new improvements like async compute and variable shader rate, RTRT and tensor we did not get the real 90% improvement that is provided by transistor count right away but 45% instead. 90% will come later as games start to use it. But next gen will provide linear scaling and that means RTX 3080 17000M transistors ~~ 25% faster than 2080S. If we take into account the clock increase 18Gbps and 2.5Ghz Gpu for another 20% we get the 50% number at 1/2 power.



I think that's realistic; 25% faster is kinda what they need to differentiate from Turing. And as a bonus they even keep the 2080 Ti performance level somewhat 'premium' for a while longer, until they bring out a new big die. I do think a 3080 Ti could stay at Turing's 4352 shaders: it's still going to obliterate everything (still ~25% above the 3080), and it would mean that card potentially won't cost upwards of $1K this time. As it should.


----------



## TheoneandonlyMrK (Jan 20, 2020)

ppn said:


> 1080->2080 1,45x in 4K, but look at transistor count. 13600M/7200M = 1.88x. So in the name of new improvements like async compute and variable shader rate, RTRT and tensor we did not get the real 90% improvement that is provided by transistor count right away but 45% instead. 90% will come later as games start to use it. But next gen will provide linear scaling and that means RTX 3080 17000M transistors ~~ 25% faster than 2080S. If we take into account the clock increase 18Gbps and 2.5Ghz Gpu for another 20% we get the 50% number at 1/2 power.


That would be nice. I think you're being optimistic, though; hopefully you're right.


----------



## ZoneDymo (Jan 20, 2020)

Be sure to empty your wallets loyal customers.


----------



## Vayra86 (Jan 20, 2020)

ZoneDymo said:


> Be sure to empty your wallets loyal customers.



The more you buy...


----------



## Zmon (Jan 20, 2020)

It's going to be a bit ridiculous if Nvidia is going to keep their current price hike for their 3xxx series. It was already ridiculous that the 2080ti had an MSRP of $1199, which was a massive 58% increase over the 1080ti's MSRP of $699. If we go back to Kepler and Maxwell, Maxwell even had a price decrease over Kepler of about $50. There isn't really any good justification for the current price gouging besides the "massive die" argument. I hope Nvidia fixes this, but I highly doubt it with a current lack of competition from AMD.


----------



## Vayra86 (Jan 20, 2020)

Zmon said:


> It's going to be a bit ridiculous if Nvidia is going to keep their current price hike for their 3xxx series. It was already ridiculous that the 2080ti had an MSRP of $1199, which was a massive 58% increase over the 1080ti's MSRP of $699. If we go back to Kepler and Maxwell, Maxwell even had a price decrease over Kepler of about $50. There isn't really any good justification for the current price gouging besides the "massive die" argument. I hope Nvidia fixes this, but I highly doubt it with a current lack of competition from AMD.



But there is a good argument for keeping Turing levels of pricing this time: similar die size on a smaller node. Effectively, you're looking at a more complicated die on a node that is new, so yields are not optimal yet. Maxwell was cheap for good reasons: the 970, for example, had a cut-down die and was the real price king, and it was on a very easy, familiar node with very high yields. Even so, the 980 wasn't exactly a bang/buck card, and neither was the 980 Ti. Still good though, I agree.

But that doesn't mean 3080ti will keep its price point, see my post above.

Nvidia will want to keep its (royal) margin, but most of their price bumps are actually quite understandable within that rationale. Those 'expensive' Turing dies are effin' huge, and it's only the second gen on 16~12 nm.


----------



## eidairaman1 (Jan 20, 2020)

Zmon said:


> It's going to be a bit ridiculous if Nvidia is going to keep their current price hike for their 3xxx series. It was already ridiculous that the 2080ti had an MSRP of $1199, which was a massive 58% increase over the 1080ti's MSRP of $699. If we go back to Kepler and Maxwell, Maxwell even had a price decrease over Kepler of about $50. There isn't really any good justification for the current price gouging besides the "massive die" argument. I hope Nvidia fixes this, but I highly doubt it with a current lack of competition from AMD.



Do your research; AMD answered last year, and they aren't finished either


----------



## Zmon (Jan 20, 2020)

eidairaman1 said:


> Do your research, amd answered last year, they aren't finished either


Sure, there's plenty of research; AMD can only compete with Nvidia at the low-to-mid tier currently. They are of course planning a high-end GPU, but how that performs remains to be seen. We can talk all day about current die sizes and prices, which will most likely stay the same for Ampere regardless of what AMD does.


----------



## cucker tarlson (Jan 20, 2020)

Looks like a Turing copy-paste.




eidairaman1 said:


> amd answered last year


they answered TU106, with the same performance but a lack of features


----------



## Nihilus (Jan 20, 2020)

Looks like Nvidia is going for a more even performance stepping next time around. This gen, the 2070 Super, 2080, and 2080 Super were all closer together than the 2080 Super was to the 2080 Ti.

Next gen, the 3070 should be 2080 Super-like and the 3080 could be at 2080 Ti levels.

Someone mentioned 16 GB for the 3080 Ti, but I find that highly doubtful unless it is using HBM2. No way they are running a 512-bit bus on GDDR6.

Best guess is 12 GB of GDDR6 at 16 Gbps or more.
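The bus-width reasoning here is simple arithmetic: each GDDR6 chip has a 32-bit interface, so the bus width fixes the chip count, and capacity and peak bandwidth follow from it. A sketch (the chip densities and data rates below are illustrative assumptions):

```python
def gddr6_config(bus_width_bits, gbit_per_chip, data_rate_gbps):
    """Capacity (GB) and peak bandwidth (GB/s) for a GDDR6 setup.

    Each GDDR6 chip exposes a 32-bit interface, so chip count = bus width / 32.
    """
    chips = bus_width_bits // 32
    capacity_gb = chips * gbit_per_chip / 8            # 8 bits per byte
    bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
    return capacity_gb, bandwidth_gbs

# Nihilus' guess: a 384-bit bus with 8 Gbit chips at 16 Gbps -> 12 GB, 768 GB/s
print(gddr6_config(384, 8, 16))

# Rumored GA103: 320-bit bus, so 10 chips -> 10 GB (8 Gbit) or 20 GB (16 Gbit)
print(gddr6_config(320, 8, 14))
print(gddr6_config(320, 16, 14))
```

This also shows why 16 GB is awkward without HBM2: with uniform chip sizes it implies a 512-bit bus (16 × 8 Gbit chips) or a 256-bit bus with 16 Gbit chips, neither of which fits a top-end card cleanly.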


----------



## rtwjunkie (Jan 20, 2020)

Gungar said:


> HBM is by far the most powerful and efficient graphic memory out there right now. AMD is just incapable of producing any good gpu, for sometime now.


For practical, consumer and gaming use, I see GDDR6 being used on GA102, 103, 104. GDDR6 has proven itself.  HBM not so much. I’m with @EarthDog on this.


----------



## Razrback16 (Jan 20, 2020)

CrAsHnBuRnXp said:


> Why? Dual card systems are basically dead these days. Yes, there is still support for SLI but it's at the point where its unneeded and troublesome.



I've had pretty good luck with multi gpu. I only have maybe 2-3 games that don't utilize it. I play exclusively at 4k so for me, multi gpu is pretty much a necessity if I want games to have nice steady framerates and be able to crank up the image quality settings.


----------



## EarthDog (Jan 20, 2020)

rtwjunkie said:


> For practical, consumer and gaming use, I see GDDR6 being used on GA102, 103, 104. GDDR6 has proven itself.  HBM not so much. I’m with @EarthDog on this.


This... still waiting to see its benefits at the consumer level. I get the super high bandwidth, but clearly it isn't needed, and it seems (someone correct me on this if needed) that GDDR6 is cheaper to produce anyway? So while HBM can be beneficial for high-bandwidth applications, we aren't seeing it needed for gaming or general consumer use.


----------



## 64K (Jan 20, 2020)

The people that are claiming that Ampere will only be a trivial increase in performance aren't paying attention to Nvidia's track record for performance increases with each successive generation.

Average performance increase over previous generation:

RTX 2080 Ti over GTX 1080 Ti 33%  (low due to the introduction of RT Cores and Tensor cores  on the 2080 Ti instead of just a lot more CUDA cores)
GTX 1080 Ti over GTX 980 Ti 75%
GTX 980 Ti over GTX 780 Ti 41%
GTX 780 Ti over GTX 580 204%
GTX 580 over GTX 285 70%



Spoiler: Benchmarks Backing the Performance Increase Over Previous Generations For The Past 11 Years
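Taken at face value, those per-generation multipliers compound; a quick tally using only the percentages quoted above:

```python
# Compound the per-generation gains quoted above (each expressed as a multiplier).
gains = {
    "2080 Ti / 1080 Ti": 1.33,
    "1080 Ti / 980 Ti": 1.75,
    "980 Ti / 780 Ti": 1.41,
    "780 Ti / 580": 3.04,
    "580 / 285": 1.70,
}

total = 1.0
for step, mult in gains.items():
    total *= mult
    print(f"{step}: x{mult:.2f} (cumulative: x{total:.1f})")

# Net result: by these numbers a 2080 Ti is roughly 17x a GTX 285.
```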


----------



## Aerpoweron (Jan 20, 2020)

A 320-bit memory bus. I really hope we get more details on how that bus is wired to the memory controllers; I just want to rule out a mess like the GTX 970 was.

@64K 
You skipped a few generations. The GTX 480, and GTX 680 are missing.


----------



## cucker tarlson (Jan 20, 2020)

64K said:


> RTX 2080 Ti over GTX 1080 Ti 33%  (low due to the introduction of RT Cores and Tensor cores  on the 2080 Ti instead of just a lot more CUDA cores)


I think it's a scaling problem


----------



## dicktracy (Jan 20, 2020)

64K said:


> The people that are claiming that Ampere will only be a trivial increase in performance aren't paying attention to Nvidia's track record for performance increases with each successive generation.
> 
> Average performance increase over previous generation:
> 
> ...


The 980 Ti to 1080 Ti jump is pretty much a good example of what to expect from Ampere. New node (7nm EUV this time, not just 7nm!) and new arch + not wanting Intel to ever catch up to them = yuuge increase!


----------



## eidairaman1 (Jan 20, 2020)

Zmon said:


> Sure, there's plenty of research; AMD can only compete with Nvidia at the low-to-mid tier currently. They are of course planning a high-end GPU, but how that performs remains to be seen. We can talk all day about current die sizes and prices, which will most likely stay the same for Ampere regardless of what AMD does.



5800/5900 series


----------



## Razrback16 (Jan 20, 2020)

kapone32 said:


> Are you sure about that? There is absolutely no stuttering and the money I spent on my 2 Vega 64s was less than buying a brand new 2080TI here in Canada (Including the water blocks).



Ya, I remember years back, before I tried my first SLI setup (2x 780 Ti), being scared to death about the stuttering people kept talking about. I've run 780 Ti SLI, Titan X (Maxwell) SLI, and now 1080 Ti SLI... I haven't had a lick of stuttering in any games I play. I mean zippo. I generally run vsync @ 60 fps in 4K and my games have been butter smooth. If I ever feel like a game is running right around that 60 fps limit for my cards and may fluctuate (which can cause input lag in those rare situations), I switch out of true vsync and enable adaptive vsync at the driver level, and that takes care of any issues.

My experiences with multi gpu have been great. It's obviously not for everyone given that the scaling is never 1:1, and in some cases not even close, but if you have tons of cash and / or are just a hardware enthusiast that wants maximum image quality and / or framerates, it's something I'd recommend people try.



Berfs1 said:


> Cus 20 series was extremely overpriced from the beginning and everyone knew that.



Ya, I'd like to think they'll get the prices back down into the realm of reason, but I am skeptical with Nvidia. I may need to just plan on buying second-hand Turing once prices get down into reasonable territory.


----------



## 64K (Jan 20, 2020)

Aerpoweron said:


> @64K
> You skipped a few generations. The GTX 480, and GTX 680 are missing.



I left out the GTX 480 (Fermi) because there weren't benches here comparing it directly to a GTX 780 Ti (Kepler), and in any case the GTX 580 was the same generation as the GTX 480, just with a vapor chamber for lower temps and a few more shaders.

The GTX 680 has nothing to do with the comparisons I was making. It was a midrange Kepler, not the high end; that was some shenanigans Nvidia pulled on the uninformed. The GTX 780 Ti, which came later, was the high-end Kepler.


----------



## EarthDog (Jan 20, 2020)

64K said:


> That was some shenanigans that Nvidia pulled on the uninformed.


To be clear, this is the same shenanigans as AMD with Polaris, right?

Edit: cat got your tongue?


----------



## efikkan (Jan 20, 2020)

I would advise against trying to estimate the performance when we don't know a single thing about its performance characteristics. Back when Turing's details were pretty much confirmed, most predicted a ~10% performance increase, and there was an outcry from many claiming it would be a huge failure. Then it surprised "everyone" by offering significant performance gains anyway.

We still don't know what Nvidia's next gen even is at this point. In the worst case, we're looking at a shrink of Turing with some tweaks and higher clocks, but it could also be a major architectural improvement. While I'm not sure I believe the details of this "leak", I do think future improvements will come from architectural changes rather than just "doubling" of SMs every node shrink.


----------



## moproblems99 (Jan 20, 2020)

kapone32 said:


> Multi GPU is for anyone that wants it. I have not replaced my Vega 64s with 5700Xts for exactly that reason. If Multi GPU was really dead MBs would not be touting it and giving you 2, 3 and 4 SLI bridges. As much as people complain about it being troublesome it is  not as bad as people make it out to be. There are plenty of Games that support Multi GPU anyway. As an example I get 107 FPS average at 4K playing Jedi Fallen Order @ Ultra settings.



Multi-GPU is great for people who like to waste 50% of their money more than 50% of the time. I'll never do it again. Nothing like sitting there waiting for a patch or driver update to get the other card working; meanwhile, I finished the game already.


----------



## cucker tarlson (Jan 20, 2020)

I think they'll want to:

1. use the performance of the PS5 as a reference, so expect a $350-400 RTX 3060 card trading blows with the 2070S/original 2080
2. have a lot of SKU options from the very beginning, to respond to changing price points fluidly, hence the new GA103 die
3. jack up the prices in ~2080 Super/2080 Ti (RTX 3070) territory and upwards. Expect $600 3070 cards.


----------



## Assimilator (Jan 20, 2020)

LOL @ HBM. It's dead on consumer cards and it's not coming back, the only reason AMD ever used it was because Fiji was a disgusting power hog and if they'd added a GDDR controller and memory on top of that it would've been even more of a TDP turd. I assume they continued with it on Vega because by that time they were invested (and Vega was not without TDP and memory bandwidth issues of its own), but it's pretty significant that with Navi - their first mid- to high-end GPU in a while that doesn't suck power like a sponge - they've ditched HBM entirely.

HBM's higher bandwidth, lower power consumption and higher cost only make sense in compute environments where the memory subsystem is completely maxed out transferring data between the host system and other cards in that system.


----------



## moproblems99 (Jan 20, 2020)

64K said:


> The GTX 680 has nothing to do with the comparisons I was making. It was a midrange Kepler, not the high end; that was some shenanigans Nvidia pulled on the uninformed. The GTX 780 Ti, which came later, was the high-end Kepler.



While I generally agree, I believe NV's mid-range 680 was pretty competitive against AMD's high end...


----------



## Minus Infinity (Jan 20, 2020)

Otonel88 said:


> 10% - 15% over top of the range Nvidia cards like 2080 super or 2080 TI.
> If they are running tests and at the moment and the new Ampere generation has an improvement of 50% over the current high end cards then the GPU to come in the next years could have improvements over the current generations as it follows:
> 3080 (+15% improvement over 2080)
> 4080 (+30% improvement over 2080) .. and so on.
> ...



No, I disagree. The next cards after Ampere are a clean-sheet new architecture called Hopper, named after computer scientist Grace Hopper. IMO 3xxx cards will see around 30% at a minimum compared to 2xxx cards: the 3080 at least as fast as the 2080 Ti, the 3070 faster than the 2080 Super. But RT will be much faster for all cards; I honestly expect a 100% improvement. They have no choice if they want to make it a feature you actually want to enable, because it's so lame right now. I'd expect 4xxx cards to be 70%+ faster than 2xxx cards and 200% faster in RT.

Hard to recall Nvidia ever doing a lame 10-15% improvement on a new(er) architecture.


----------



## kapone32 (Jan 20, 2020)

moproblems99 said:


> Multi-GPU is great for people who like to waste 50% of their money more than 50% of the time. I'll never do it again. Nothing like sitting there waiting for a patch or driver update to get the other card working; meanwhile, I finished the game already.



Indeed, I have been running multi-GPU since the GTS 450 days. It is your opinion that it does not seem good, and maybe you had a bad experience. With my Vega 64 Crossfire setup I do not see a waste of money or time. I can't speak for Nvidia, but every time I update my GPU drivers both cards work perfectly. Maybe I am just lucky.


----------



## metalfiber (Jan 20, 2020)

I'll stick with my original prediction and say the new video card will be tied to Cyberpunk 2077 ....

> GeForce RTX 2070 Super Beats Radeon 5700 XT in FFXV Benchmark (www.techpowerup.com)
> In a recent submission to the Final Fantasy XV Benchmark database, the upcoming NVIDIA GeForce RTX 2070 Super GPU has been benchmarked. The submission comes just a few days before the Super series officially launches. In the benchmark's tests, the RTX 2070 Super scored 7479 points at 1440p...


----------



## 64K (Jan 21, 2020)

moproblems99 said:


> While generally I agree, I believe that NV mid-range 680 was pretty competitive against AMD's high-end...



It was. The GTX 680 was a little faster than the HD 7970 and about $50 cheaper, until AMD released the HD 7970 GHz Edition high-end GPU, which caught up with Nvidia's upper-midrange GTX 680.


----------



## moproblems99 (Jan 21, 2020)

64K said:


> It was. The GTX 680 was a little faster than the HD 7970 and about $50 cheaper, until AMD released the HD 7970 GHz Edition high-end GPU, which caught up with Nvidia's upper-midrange GTX 680.



So, adding 1 + 1, it really doesn't matter that it was a mid-range card, does it?  In most cases, people are buying for performance, not chip size, transistor count, board label, or anything else.  It is pretty much performance.  Would anything have really changed if the board was labeled a 100 or 102?



Assimilator said:


> the only reason AMD ever used it was because Fiji was a disgusting power hog



This is exactly the only reason it ever made it to consumer gpus.



kapone32 said:


> Indeed, I have been running multi-GPU since the GTS 450 days. It is your opinion that it does not seem good, and maybe you had a bad experience. With my Vega 64 Crossfire setup I do not see a waste of money or time. I can't speak for Nvidia, but every time I update my GPU drivers both cards work perfectly. Maybe I am just lucky.



Indeed. The one and only time I used XFire was in its 'heyday', the HD 6800 series. It was total trash. Never again.

Edit:  That said.  I am half tempted to pick up a second V56 and see what happens.  For science.


----------



## EarthDog (Jan 21, 2020)

kapone32 said:


> Indeed, I have been running multi-GPU since the GTS 450 days. It is your opinion that it does not seem good, and maybe you had a bad experience. With my Vega 64 Crossfire setup I do not see a waste of money or time. I can't speak for Nvidia, but every time I update my GPU drivers both cards work perfectly. Maybe I am just lucky.


I wouldn't call you lucky, but I would say you are in the minority. 

They just need to either REALLY make it work, so scaling is better and consistent across a lot more titles, or abort altogether.


----------



## kapone32 (Jan 21, 2020)

moproblems99 said:


> So, adding 1 + 1, it really doesn't matter that it was a mid-range card, does it?  In most cases, people are buying for performance, not chip size, transistor count, board label, or anything else.  It is pretty much performance.  Would anything have really changed if the board was labeled a 100 or 102?
> 
> 
> 
> ...



Well I have to be honest, the only games I played in those days was Total War Medieval 2 and Total War Shogun 2, throw in some Torchlight and Titan Quest as well all of which have full multi GPU support. Then I got into Just Cause 2, Batman Arkham and Deus Ex HR which all support crossfire. Then I discovered Sleeping Dogs and Shadow of Mordor. Then Rome 2 was launched and after that, Attila which again fully support crossfire. Just Cause 3 and I will end it with Total War Warhammer 1 & 2 which both support multi GPU. Even though 3 Kingdoms does not support multi GPU, I fully expect Warhammer 3 to continue crossfire support (as long as you can edit the script) as it will be an expansion of the existing game.


----------



## EarthDog (Jan 21, 2020)

kapone32 said:


> (as long as you can edit the script)


Exactly the stuff most users simply do not want to deal with, and one of the hassles. Most people can't figure out how to install a second card, let alone rename .exe files to get things to work. People just want it to work...

Unless you are running 4K and don't want to drop $1K on a 60 fps-capable card, I suppose it's viable... but the writing has been on the wall for years now: it is, and rightfully so, a dying breed.


----------



## Prima.Vera (Jan 21, 2020)

Like every generation, most likely but not guaranteed, the performance for the 30xx series will be something like:
3070 = RTX 2080
3080 = RTX 2080 Ti
3080 Ti = Titan RTX

So there shouldn't be any surprises here, tbh...


----------



## Hyderz (Jan 21, 2020)

My wallet ran away when it saw the RTX 2080 Ti launch price; to this day it still shivers.


----------



## moproblems99 (Jan 21, 2020)

kapone32 said:


> Well I have to be honest, the only games I played in those days was Total War Medieval 2 and Total War Shogun 2, throw in some Torchlight and Titan Quest as well all of which have full multi GPU support. Then I got into Just Cause 2, Batman Arkham and Deus Ex HR which all support crossfire. Then I discovered Sleeping Dogs and Shadow of Mordor. Then Rome 2 was launched and after that, Attila which again fully support crossfire. Just Cause 3 and I will end it with Total War Warhammer 1 & 2 which both support multi GPU. Even though 3 Kingdoms does not support multi GPU, I fully expect Warhammer 3 to continue crossfire support (as long as you can edit the script) as it will be an expansion of the existing game.



Here we are: the two extremes. I had a terrible time while you had a great time. We are but a small sample size, so let's assume the truth lies in the middle. More than 25% and less than 100% of games will work with mGPU (and the percentage of newer games is dropping). Of those games, most will not have linear scaling; some might see more than 50%, and many more, less than 50%. All of these amazing benefits for twice the money, twice the heat (up to), and twice the power (up to)!! So now you need a beefier PSU and better case cooling. And your top GPU is going to get cooked.

If you find that all of your games work with mGPU then it may make sense. But for the majority (by a margin), it doesn't. You can spend 50% more on a GPU and jump up at least two tiers. You'll get a much more consistent experience with less draw on the system. And the best part: for the majority of games that don't have great scaling, you'll get the same performance or better.

Edit: I suppose this generation you would not have been able to jump two tiers. Perhaps the Turing generation added some fractional tenths of a percent to the "Should I bother with mGPU?" question.
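The trade-off above can be made concrete with a little arithmetic. A hypothetical sketch; the prices, frame rates, and scaling factors below are illustrative assumptions, not measurements:

```python
def perf_per_dollar(base_fps, price, scaling=1.0, cards=1):
    """Effective FPS per dollar for a (possibly multi-GPU) setup.

    scaling: fraction of a second card's throughput actually realized
    (1.0 = perfect linear scaling, 0.0 = no mGPU support at all).
    """
    fps = base_fps * (1 + scaling * (cards - 1))
    return fps / (price * cards)

# Illustrative numbers: a $400 card doing 60 fps in some title.
single = perf_per_dollar(60, 400)                            # one card
sli_good = perf_per_dollar(60, 400, scaling=0.7, cards=2)    # well-supported title
sli_bad = perf_per_dollar(60, 400, scaling=0.0, cards=2)     # no mGPU support
tier_up = perf_per_dollar(90, 600)                           # one bigger card, +50% cost

print(f"single: {single:.4f}, SLI(70%): {sli_good:.4f}, "
      f"SLI(0%): {sli_bad:.4f}, tier-up: {tier_up:.4f}")
```

Even at a generous 70% scaling, the two-card setup delivers less performance per dollar than either the single card or the one-tier-up card, and it collapses entirely in unsupported titles, which is the point being made above.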


----------



## SRB151 (Jan 21, 2020)

Let's face it. 7nm means smaller dies, much lower cost, better yields, lower power, and higher clocks. Nvidia has ALWAYS gotten more out of any silicon than anyone, so 100% performance increases, lower prices than RTX cards, more memory as the article says, and more efficiency. The only thing standing in their way of total domination will be Raja and his upcoming, infinite scaling XE cards. Great year for super graphics.


----------



## moproblems99 (Jan 21, 2020)

SRB151 said:


> Let's face it.  7nm means smaller dies, much lower cost, better yields, lower power, and higher clocks.
> Nvidia has ALWAYS gotten more out of any silicon than anyone,  so 100% performance increases, lower
> prices than RTX cards, more memory as the article says, and more efficiency.  The only thing standing
> in their way of total domination will be Raja and his upcoming, infinite scaling XE cards.  Great year for
> super graphics.



What did they put in my tea....

Our Savior.....Raja!  Who everyone booed out of AMD?


----------



## 64K (Jan 21, 2020)

moproblems99 said:


> So, adding 1 + 1, it really doesn't matter that it was a mid-range card, does it?  In most cases, people are buying for performance, not chip size, transistor count, board label, or anything else.  It is pretty much performance.  Would anything have really changed if the board was labeled a 100 or 102?



It matters a lot and it's why people are paying $500 to $700 for a midrange Nvidia GPU today.


----------



## moproblems99 (Jan 21, 2020)

64K said:


> It matters a lot and it's why people are paying $500 to $700 for a midrange Nvidia GPU today.



Not because the competition hasn't exactly put out an alternative?  Now granted, when AMD had as good (or better) cards, it isn't exactly like many people bought them.

Intel charged an arm and a leg because they had no one to force them to lower the price. Look what happened when Ryzen came out. Until AMD puts out some juice, prices will stay where they are. The board designation has nothing to do with where we are.


----------



## 64K (Jan 21, 2020)

moproblems99 said:


> Not because the competition hasn't exactly put out an alternative?  Now granted, when AMD had as good (or better) cards, it isn't exactly like many people bought them.
> 
> Intel charged an arm and a leg because they had no one to force them to lower the price. Look what happened when Ryzen came out. Until AMD puts out some juice, prices will stay where they are. The board designation has nothing to do with where we are.



Your assertion was that it doesn't matter, but it does: the GTX 680 was a midrange card on which Nvidia placed a $500 MSRP, the former price for a high-end GPU, and the x80 name, which had likewise been reserved for the high end, just because the 680 was on par with the high-end 7970. There are still, to this very day, people who think the GTX 680 was a high-end GPU.


----------



## moproblems99 (Jan 21, 2020)

64K said:


> Your assertion was that it doesn't matter, but it does: the GTX 680 was a midrange card on which Nvidia placed a $500 MSRP, the former price for a high-end GPU, and the x80 name, which had likewise been reserved for the high end, just because the 680 was on par with the high-end 7970. There are still, to this very day, people who think the GTX 680 was a high-end GPU.



It beat the competition's high end, so why does it matter?


----------



## 64K (Jan 21, 2020)

moproblems99 said:


> It beat the competition's high end, so why does it matter?




Well, the RTX 2080 Ti beat the competition. Why does it matter that it was $1,200?

The Titan RTX beat the competition. Why does it matter that it was $2,500?

The Nvidia shenanigans began with the GTX 680, and people have just paid any damn thing since then.


----------



## rtwjunkie (Jan 21, 2020)

moproblems99 said:


> It beat the competition's high end, so why does it matter?


Because it fundamentally shifted prices artificially upward for what appears to be forever. All because they passed off an upper-midrange card as high end and charged accordingly.


----------



## Khonjel (Jan 21, 2020)

At this point it's more entertaining to watch how much Nvidia slam-dunks AMD rather than hoping AMD will ever put up a fight.


----------



## ixi (Jan 21, 2020)

EarthDog said:


> All I am saying is that we will get A LOT more than 10-15% gains out of the box from the new generation when comparing apples to apples (tier to tier). The only differences we've seen the past couple of generations are clock swaps and in the case of the super cards, core/spec differences, in order to extract more performance out of these cards. But 10-15% is putting a 2080 Ti on water and overclocking it... next gen will show double that, at minimum.



Are you sure about clock speeds?

Compare the 980 Ti vs the GTX 1080 Ti vs the RTX 2080. The difference between them is around 25-30%.


----------



## cucker tarlson (Jan 21, 2020)

If your midrange beats the competition's high end, the problem is that the competition's high end is really midrange.


----------



## Krzych (Jan 21, 2020)

Really hyped for this gen. Turing was "suspicious" right from the beginning and still turned out better than expected, to be honest, but this time I think we will get more Pascal-like improvements in both performance and power. Everything is pointing to that, the same way everything pointed to Turing being a smaller jump. Plus expected huge improvements in RT performance. New consoles, want it or not, will be affecting the market somewhat too, and prices should be much better. Maybe not going back to the old ways (like Maxwell), but some Pascal-level prices, with tricks like the RTX 3080 masquerading as the top-end card for a few months before the 3080 Ti arrives, are possible I think.



kapone32 said:


> Multi GPU is for anyone that wants it. I have not replaced my Vega 64s with 5700Xts for exactly that reason. If Multi GPU was really dead MBs would not be touting it and giving you 2, 3 and 4 SLI bridges. As much as people complain about it being troublesome it is  not as bad as people make it out to be. There are plenty of Games that support Multi GPU anyway. As an example I get 107 FPS average at 4K playing Jedi Fallen Order @ Ultra settings.



Exactly that. The only people who say things like "dead, not worth it, troublesome" are those who never wanted such a configuration in the first place; they complain about something that doesn't even concern them just to reassure themselves. SLI is up there for anyone who wants it and is capable of managing it. It was never perfect or reliable as far as support in newly released games goes, and it was never about making sense perf/money-wise across the board; it is strictly a pushing-the-limit kind of thing, in selected games. But I guess that is too hard to understand, since there always has to be some guy entering the thread and talking about dead multi-GPU, no matter what thread it is. It is almost like they are actively looking for an opportunity.

The rate of support and tweakability has been practically stable for years now; the only notable loss was Ubisoft dropping SLI from the Assassin's Creed series starting with Origins, a series that had SLI support since forever. But at the same time they dropped all the extra features to use SLI for, and Odyssey is basically a non-scalable console port that cannot even take proper advantage of new GPUs, so it is a whole series going astray, not just an SLI-exclusive problem. Also, we are yet to see how universal the new SLI CFR mode is going to be once fully developed (and how tweakable); it will probably go official along with the Ampere launch. Even now it shows some notable gains in games with zero multi GPU support and no custom profiles available. So things are rather looking up compared to right now, if anything.


----------



## Metroid (Jan 21, 2020)

Finally: very high settings + 4K at a constant 60 FPS, and 144 Hz in old games for sure.


----------



## Xajel (Jan 21, 2020)

These seem good, as long as NV doesn't repeat the RTX 2000 launch prices again. The RTX 3070 should be $499 (like the current Super), and the RTX 3080 should be $649~$699...

And please, no exclusive Founders Edition at launch that costs $100 more.


----------



## ratirt (Jan 21, 2020)

I see a lot of people have their wishes in terms of price and/or performance for the new NV graphics. Let's hope at least one of these wishes comes true.
I hope the shrink will bring some more performance. As for the new arch, I'm not so sure what to expect; not much has been revealed.
My main concern, on the other hand, is the price. We will have to wait a bit longer to get all the answers.


----------



## 64K (Jan 21, 2020)

The R&D costs for RTX GPUs have probably already been recouped from the early adopters. AMD will be stepping into the RTRT ring with their releases this year. Consoles will have a hardware solution to accelerate ray tracing this year. Intel will probably be stepping into the RTRT ring this year too.

Prices for the 3070 and 3080 should be more down to earth or Mr Huang will have to make a similar announcement to the press like he made a while back concerning the RTX Turings: _"Sales are failing to meet expectations."_


----------



## Super XP (Jan 21, 2020)

SRB151 said:


> Let's face it.  7nm means smaller dies, much lower cost, better yields, lower power, and higher clocks.
> Nvidia has ALWAYS gotten more out of any silicon than anyone,  so 100% performance increases, lower
> prices than RTX cards, more memory as the article says, and more efficiency.  The only thing standing
> in their way of total domination will be Raja and his upcoming, infinite scaling XE cards.  Great year for
> super graphics.


Interesting how people already count AMD out of the GPU business, without understanding the reasoning behind hanging onto GCN so long, or the reasoning behind the releases of Fury, Vega & Radeon VII. I could get into why AMD stuck to the low-to-mid range of GPUs, but I'm sure most on TPU already know why. 

A quick word of advice: RDNA 2 will make Nvidia rethink its Ampere GPU launch.


----------



## EarthDog (Jan 21, 2020)

Super XP said:


> A quick word of advice: RDNA 2 will make Nvidia rethink its Ampere GPU launch.


So confident, yet nothing behind it.   

Bookmarked... My guess (shared as a guess, not as fact guised as 'advice'): I don't think it will faze Nvidia much at all. If they are lucky, they will have a card that competes with the 2080 Ti... which Ampere will clearly beat by a significant amount.


----------



## Super XP (Jan 21, 2020)

EarthDog said:


> So confident, yet nothing behind it.
> 
> Bookmarked... My guess (shared as a guess, not as fact guised as 'advice'): I don't think it will faze Nvidia much at all. If they are lucky, they will have a card that competes with the 2080 Ti... which Ampere will clearly beat by a significant amount.


I was only referring to SRB151's confidence that AMD has nothing upcoming on the GPU side of things. His own comments are posted as facts without any facts lol

I never mentioned anything about RDNA2 performance. What I did say is that RDNA2 WILL make Nvidia rethink its Ampere launch, depending on who launches 1st. 

Dissect that as you will, but it could be cost-to-performance, ya know. This ain't a Vega-like repeat, was my point.


----------



## EarthDog (Jan 21, 2020)

Super XP said:


> I was only referring to SRB151's confidence that AMD has nothing upcoming on the GPU side of things. His own comments are posted as facts without any facts lol
> 
> I never mentioned anything about RDNA2 performance. What I did say is that RDNA2 WILL make Nvidia rethink its Ampere launch, depending on who launches 1st.
> 
> Dissect that as you will, but it could be cost-to-performance, ya know. This ain't a Vega-like repeat, was my point.


Yeah, lol, his post is ridiculous. 

I suppose every launch will cause the other to 'think' about things, be it price or whatever, but I took your post to mean it will be some sort of game changer. I mean, you said it was 'advice' like it was a fact that RDNA2 will be good enough... only time will tell. The confidence exuded from both parties is premature.


----------



## Super XP (Jan 21, 2020)

EarthDog said:


> Yeah, lol, his post is ridiculous.
> 
> I suppose every launch will cause the other to 'think' about things, be it price or whatever, but I took your post to mean it will be some sort of game changer. I mean, you said it was 'advice' like it was a fact that RDNA2 will be good enough... only time will tell. The confidence exuded from both parties is premature.


I think my stance and my optimism for RDNA2 come from AMD having struggled with both the GPU and CPU for Soooooo Longggg lol 
And from AMD putting almost all its resources into building Zen. So I'm hopeful AMD pulls a rabbit out of RDNA2 and actually puts out competitive GPUs, so we all benefit. 
It's been far too long, and we need something from AMD that will hopefully turn some heads towards them.

But I agree. Both are premature. Though Nvidia has had a better track record, for obvious reasons lol


----------



## EarthDog (Jan 21, 2020)

Super XP said:


> It's been far too long, and we need something from AMD that will hopefully turn some heads towards them


No doubt! But I have little faith (in the high end) and am hoping for a surprise.


----------



## QUANTUMPHYSICS (Jan 21, 2020)

If I do buy a 3080 Ti, I will sell my 2080 Ti on eBay and probably get around $700 - $800 for it.
From now on I will only buy cards with a built-in AIO.

I am extremely impressed with the temperature of my FTW3 compared to my BLACK. 

My BLACK will rise to 55 degrees after 2 - 3 hours of gaming with ray tracing on.

My FTW3 doesn't go past 40 degrees in the same time period. 

I don't overclock (as a practice).


----------



## Super XP (Jan 21, 2020)

EarthDog said:


> No doubt! But I have little faith (in the high-end) and hoping for a surprise.


Amen to that. Lol



QUANTUMPHYSICS said:


> If I do buy a 3080 Ti, I will sell my 2080 Ti on eBay and probably get around $700 - $800 for it.
> From now on I will only buy cards with a built-in AIO.
> 
> I am extremely impressed with the temperature of my FTW3 compared to my BLACK.
> ...


A card like that doesn't need overclocking. 
And when you sell it be sure to mention that it was never overclocked. That might squeeze out more $$$ for you.


----------



## moproblems99 (Jan 21, 2020)

rtwjunkie said:


> Because it fundamentally shifted the prices artificially upward for what appears forever. All because they passed off a high-midrange card as high end and charged accordingly.



Possibly.  I think it is because AMD couldn't put up a solid competitor, so NV could charge high-end prices for a mid-range card, because the competition's high-end card, priced as a high-end card, could only compete with a mid-range card.

If AMD can pull a Zen in GPUs, we will get a price drop because NV will have to compete on price.  In a sense, the entire industry hinges on AMD coming around, or this will be the status quo.  The good thing is, I don't think AMD is that far away.  Will it be this gen?  Only time will tell.


----------



## efikkan (Jan 21, 2020)

moproblems99 said:


> If AMD can pull a Zen in GPUs, we will get a price drop because NV will have to compete on price.  In a sense, the entire industry hinges on AMD coming around or this will be status quo.  The good thing is, is I don't think AMD is that far away.  Will it be this gen?  Only time will tell.


You have to remember what it took for AMD to catch up with Zen/Zen2: over 3 years of Intel struggling to get their designs to work on the new nodes. We shouldn't hope for the same to happen in the graphics market; years of stagnation for AMD to catch up…

AMD is currently 1.5 - 2 "generations" behind Nvidia. It is fairly unlikely that they will catch up in a single iteration. What they need to do is to take targeted steps towards closing the gap, and not fall further behind.

AMD are preparing Navi 21, 22 and 23, and if a tweaked Navi is all they have for the next 1-2 years, then they will risk increasing the gap even further if Nvidia's next gen brings any architectural improvements at all.


----------



## moproblems99 (Jan 21, 2020)

efikkan said:


> You have to remember what it took for AMD to catch up with Zen/Zen2; over 3 years of Intel struggling with getting their designs to work on the new nodes. We shouldn't hope for the same to happen in the graphics market; years of stagnation for AMD to catch up…
> 
> AMD is currently 1.5 - 2 "generations" behind Nvidia. It is fairly unlikely that they will catch up in a single iteration. What they need to do is to take targeted steps towards closing the gap, and not fall further behind.
> 
> AMD are preparing Navi 21, 22 and 23, and if a tweaked Navi is all they have for the next 1-2 years, then they will risk increasing the gap even further if Nvidia's next gen brings any architectural improvements at all.



Maybe.  Maybe not.  Could they surprise us this year?  Maybe.  Is it likely?  Not really.  But I am not discounting the proverbial rabbit getting proverbially pulled from the proverbial hat.

Edit: They don't need to surpass NV to shift prices.  They just need to get a little more competitive across more of the range.  See the 2060 for an exhibit.

Additionally, AMD may like the high prices and want to cash in for a bit.  Who could blame them?


----------



## kapone32 (Jan 21, 2020)

It is entirely possible that AMD could surprise us. Before Ryzen officially launched, there was not a lot of information from AMD regarding the details of AM4, other than info that had been rehashed months before. If anyone had told me that the 5700 XT would be 20 to 30% faster than the Vega 64, I would have laughed at them, but the fact of the matter is that it is. The biggest caveat, though, is that even if Big Navi is as fast as or even 10 to 15% faster than the 2080 Ti, Nvidia will still have a sizable lead at the high end with the 3080 Ti.


----------



## EarthDog (Jan 21, 2020)

kapone32 said:


> It is entirely possible that AMD could surprise us. Before Ryzen officially launched there was not a lot of information from AMD regarding the details of AM4 other than the info that was rehashed months before. If anyone had told me that the 5700XT would be 20 to 30% faster than the Vega 64 I would have laughed at them but the fact of the matter is that it is. The biggest caveat though is if Big Navi is as fast or even 10 to 15% faster than the 2080TI Nvidia will still have a sizable lead for the high end with the 3080Ti.


I mean, one would've hoped the 5700 XT would win over V64... the latter was based on years-old GCN. One would figure this was a given, not a surprise. That said, it's also a couple % _slower_ than the Radeon VII, the previous flagship....


----------



## Super XP (Jan 21, 2020)

kapone32 said:


> It is entirely possible that AMD could surprise us. Before Ryzen officially launched there was not a lot of information from AMD regarding the details of AM4 other than the info that was rehashed months before. If anyone had told me that the 5700XT would be 20 to 30% faster than the Vega 64 I would have laughed at them but the fact of the matter is that it is. The biggest caveat though is if Big Navi is as fast or even 10 to 15% faster than the 2080TI Nvidia will still have a sizable lead for the high end with the 3080Ti.


All AMD needs is a competitive portfolio for all segments, including the high end. They don't have to beat Nvidia's 2080 Ti and 3080 Ti, they just have to remain competitive, which would help make prices fairer. 
All I know is RDNA1 is a quarter of what a full new RDNA2 design will push out. 

I'm quite optimistic about RDNA2. Can't wait to see more info on it. Hopefully the ATI Technologies days are coming back


----------



## EarthDog (Jan 21, 2020)

Super XP said:


> All I know is RDNA1 is a quarter of what a full new RDNA2 design will push out.


How do you know this?


----------



## Super XP (Jan 21, 2020)

EarthDog said:


> How do you know this?


Leaked roadmaps? Why do you ask? Lol 
Based on several sites, including Guru3D. All based on rumours, of course.


----------



## kapone32 (Jan 21, 2020)

EarthDog said:


> I mean, one would've hoped the 5700 XT would win over V64... the latter was based on years-old GCN. One would figure this was a given, not a surprise. That said, it's also a couple % _slower_ than the Radeon VII, the previous flagship....



Yes, and probably because of the 16 GB of memory on the VII, so that there is no latency involved in sending info to DRAM. One thing that interests me is if AMD will use a 384-bit bus for Big Navi and, if so, whether we will see 12 or 16 GB of GDDR6 on those cards.


----------



## moproblems99 (Jan 21, 2020)

EarthDog said:


> That said, it's also a couple % _slower_ than the Radeon VII, the previous flagship....



Even though the 5700 XT is the highest we got out of it, I am not 100% sure that was the idea going in.


----------



## EarthDog (Jan 21, 2020)

Super XP said:


> Leaked Roadmaps? Why you ask. Lol
> Based on several sites including Guru3D. All based on rumours of course.


In other words, you don't really _know_ anything. 

I'm not trying to be a dickhead here, but I am a bit tired of seeing 'knowing' and other premature information under the guise of facts. 



kapone32 said:


> Yes, and probably because of the 16 GB of memory on the VII, so that there is no latency involved in sending info to DRAM. One thing that interests me is if AMD will use a 384-bit bus for Big Navi and, if so, whether we will see 12 or 16 GB of GDDR6 on those cards.


Memory capacity has nothing to do with that card's performance (in games), nor with my talking point. To reiterate, we saw a flagship part come out that is slower than the last-gen flagship part. It is taking an arch tweak and another generation to get there... IF RDNA 2 is competitive at the high end (within a few percent of the 2080 Ti). 

Some people's faith is seemingly misplaced in rumors and misinformation.


----------



## P4-630 (Jan 21, 2020)

Super XP said:


> All I know is RDNA1 is a quarter of what a full new RDNA2 design will push out.



Links please....


----------



## Krzych (Jan 21, 2020)

Super XP said:


> A quick word of advice: RDNA 2 will make Nvidia rethink its Ampere GPU launch.



Yeah, we hear that before every AMD launch; every time, they are about to come out with some secret revolutionary GPU that is going to save the world, but then the real products come out...



QUANTUMPHYSICS said:


> If I do buy a 3080Ti, I will sell my 2080Ti on Ebay and probably get around $700 - $800 for it.
> From now on I will only buy cards with built in AIO.
> 
> I am extremely impressed with the temperature of my FTW3 over my BLACK.
> ...



If Ampere is any good, then you are not getting anywhere near $700-$800. The new RTX 3070 should be close to the 2080 Ti in both normal performance and RT performance, at half the power and no more than $449-499. I have a 2080 Ti too, but only now comes the time to really pay for the madness that was going on in recent years. Pascal did keep its price thanks to the inflated pricing of Turing, so we were able to not feel the price of Turing, but now comes a generation with not only a proper leap in performance and efficiency, but also much cheaper cards. Even in normal circumstances this causes the previous generation to depreciate by 50%, and with Turing, which was both less of a generational jump and much more expensive than usual, this is going to reach 70%, especially for the 2080 Ti.

This is not going to hit those who keep upgrading: you sell your 2080 Ti for, let's say, 40% of the new 3080 Ti's price, add 60% of your own cash, and you get 50%+ more performance, so that's not a big deal. Also, cards don't depreciate just because; Turing will only depreciate as much as Ampere makes it, so if Ampere is weak, Turing won't depreciate, and the ratio is maintained. But as far as absolute depreciation goes, I mean the difference between purchase price and sell price (for example, for those who just want to sell the card and move on without buying a new one), it is going to be a record one.
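The upgrade math above can be made concrete. A tiny sketch with hypothetical prices: the 40%/60% split is from the post, but the dollar figures are made up for illustration only.

```python
# Hypothetical numbers illustrating the upgrade math; the 40% sale ratio
# comes from the post, the dollar prices are invented.
new_3080ti_price = 1000.0          # assumed next-gen flagship price
sale_ratio = 0.40                  # sell the 2080 Ti for ~40% of that

# Out-of-pocket cost for someone who upgrades:
cash_needed = new_3080ti_price * (1 - sale_ratio)
print(f"Out-of-pocket for the upgrade: ${cash_needed:.0f}")   # $600

# Absolute depreciation for someone who just sells and walks away:
purchase_price = 1200.0            # illustrative 2080 Ti purchase price
sale_price = sale_ratio * new_3080ti_price
depreciation = 1 - sale_price / purchase_price
print(f"Depreciation vs purchase price: {depreciation:.0%}")  # 67%
```

With these made-up numbers the walk-away depreciation already lands near the 70% the post predicts.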


----------



## kapone32 (Jan 21, 2020)

Krzych said:


> If Ampere is any good then you are not getting anywhere near $700-$800. New RTX 3070 should be close to 2080 Ti with both normal performance and RT performance at half the power and no more than $449-499. I have a 2080 Ti too, but only now it will be the time to really pay for the madness that was going on in recent years. Pascal did keep it's price thanks to inflated pricing of Turing, so we were able to not feel the price of Turing, but now comes the generation not only with proper leap in performance and efficiency, but also much cheaper. This even in normal circumstances causes the previous generation to depreciate by 50%, and with Turing that was both less of generational jump and much expensive than usual, this is going to reach 70%, especially for 2080 Ti.

I seriously fear that these cards will be uber expensive. The 3080 Ti could be as high as $1700+ here in Canada; the reason I say that is that the 2080 Ti is still $1400+ here in Canada. I can also see Nvidia releasing their new high end before Big Navi gets released, too. To be honest, though, what I really want is a fast card for 4K that can OC like Tahiti did, so nicely, for under $500.


----------



## moproblems99 (Jan 21, 2020)

Krzych said:


> If Ampere is any good then you are not getting anywhere near $700-$800. New RTX 3070 should be close to 2080 Ti with both normal performance and RT performance at half the power and no more than $449-499. I have a 2080 Ti too, but only now it will be the time to really pay for the madness that was going on in recent years. Pascal did keep it's price thanks to inflated pricing of Turing, so we were able to not feel the price of Turing, but now comes the generation not only with proper leap in performance and efficiency, but also much cheaper. This even in normal circumstances causes the previous generation to depreciate by 50%, and with Turing that was both less of generational jump and much expensive than usual, this is going to reach 70%, especially for 2080 Ti.



You are not getting a large performance increase, large efficiency gain, and large price drop.  Let's get that out of the way right now.  You may get one of them.  If lucky.


----------



## ppn (Jan 21, 2020)

Well, the 3070 is a 2080S on steroids: 60% faster than the 2070, with 60% of the chip size at the same power, higher clocks, and a lower price... than the 2080S.


----------



## kapone32 (Jan 21, 2020)

ppn said:


> Well 3070 is 2080S on steroids, this is 2x faster than 2070. 3/4 the chip size at same power higher clocks, and lower price,.... than the 2080S. but higher than 2070.



So we have pricing details for Ampere?


----------



## ppn (Jan 21, 2020)

We have nothing for Ampere at all, just grains of salt.


----------



## efikkan (Jan 21, 2020)

ppn said:


> We have nothing for ampere at all. just grains of salt.


Frankly, we don't even have the name confirmed…

The only "safe" guess is that it will be _at least_ a node shrink.


----------



## Super XP (Jan 21, 2020)

P4-630 said:


> Links please....


Why not check the internet? There's a wealth of speculation and rumours for both Big Navi / RDNA2 and Ampere. lol

http://www.redgamingtech.com/rdna-2...vements-to-take-on-nvidia-analysis-exclusive/

https://www.techradar.com/news/amd-...ay-be-out-in-2020-according-to-leaked-roadmap

A lot of the expertise of Zen's design team has moved to Radeon to help push AMD's graphics performance to the next level.

https://www.overclock3d.net/news/gp...erformance_efficiency_improvements_-_rumour/1

AMD's CEO, Lisa Su, has confirmed that *PC gamers should expect to see Radeon graphics cards with hardware-accelerated raytracing this year*. Furthermore, Lisa Su has also confirmed that "a high-end Navi" graphics card is in the works.

https://wccftech.com/amd-zen-3-ryzen-epyc-cpus-and-rdna-2-radeon-rx-gpus-2020-launch/

https://hothardware.com/news/amd-radeon-rx-navi-21-gddr6-rdna-2

https://upnewsinfo.com/2020/01/18/a...-than-rdna-1-with-no-added-power-consumption/

https://www.techpowerup.com/262098/ray-tracing-and-variable-rate-shading-design-goals-for-amd-rdna2

https://www.pcgamesn.com/amd-rdna-2-gpu-zen

https://www.guru3d.com/news-story/amd-radeon-5800-xt-(big-navi)-to-get-80-compute-units.html


----------



## efikkan (Jan 21, 2020)

80 CUs? Does it pass the smell test?
40 CUs in Navi 10 is already 225 W, but double that, plus some kind of RT cores? This quickly sounds like >400 W, even with the benefits of 7nm EUV. AMD had better have some secret tricks up their sleeve to do this.
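That power worry can be sanity-checked with naive scaling. A rough sketch only: the 15% EUV power saving is a pure assumption, and scaling board power linearly with CU count ignores fixed costs like memory and I/O, so the real figure could land well away from this.

```python
# Back-of-envelope power estimate for a hypothetical 80-CU part.
# All inputs are thread assumptions, not measurements.
navi10_cus = 40
navi10_power_w = 225.0        # board power of the 40-CU Navi 10 (5700 XT)

big_navi_cus = 80
euv_power_scaling = 0.85      # assumed ~15% power saving from 7nm EUV (a guess)

# Naive linear scaling with CU count, before adding any RT hardware:
estimate = navi10_power_w * (big_navi_cus / navi10_cus) * euv_power_scaling
print(f"Naive 80-CU estimate: {estimate:.1f} W")  # 382.5 W
```

Even this optimistic napkin math lands close to the ">400 W" concern once RT hardware and fixed board power are added back in.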


----------



## Super XP (Jan 21, 2020)

efikkan said:


> 80 CUs? Does it pass the smell test?
> 40 CUs in Navi 10 is already 225W, but doubling that plus some kind of RT cores? This quickly sounds like >400W even with benefits from 7nm EUV. AMD better have some secret tricks up their sleeve to do this.


Try not to confuse the RDNA1 5700 and 5700 XT with RDNA2. They aren't the same, according to several reliable sources.


----------



## P4-630 (Jan 21, 2020)

Super XP said:


> Why not check the internet. There's a wealth of speculation and rumours for both Big Navi / RDNA2 and Ampere. lol
> 
> http://www.redgamingtech.com/rdna-2...vements-to-take-on-nvidia-analysis-exclusive/
> 
> ...





Super XP said:


> *All I know is RDNA1 is a quarter of what a full new RDNA2 design will push out.*



Just clicked a few links, I see "half".


----------



## moproblems99 (Jan 21, 2020)

P4-630 said:


> Just clicked a few links, I see "half".



*Remembers previous hype vs launch reality*


----------



## Super XP (Jan 21, 2020)

P4-630 said:


> Just clicked a few links, I see "half".


Are you joking with me? lol, all sources claim it's a brand spanking new design that will feature some exceptional things, including VRS and RT. 
At the end of the day it's rumours and speculation, but note that a couple of those sources have been right before.


----------



## Otonel88 (Jan 21, 2020)

When is Nvidia expected to announce the new Ampere cards?
And AMD with Big Navi? 
Maybe if they are close together it will create some kind of competition between them, which is always good for the consumer. (Look at the 5600 release.)


----------



## Super XP (Jan 21, 2020)

Otonel88 said:


> When is Nvidia expected to announce the new Ampere cards?
> And AMD with Big Navi?
> Maybe if they are close together it will create some kind of competition between them, which is always good for the consumer. (Look at the 5600 release.)


Couldn't agree more, lol


----------



## EarthDog (Jan 21, 2020)

Super XP said:


> Why not check the internet. There's a wealth of speculation and rumours for both Big Navi / RDNA2 and Ampere. lol
> 
> *snip*


From a link (or two) of yours...


> AMD claims up to a 50 percent improvement in performance for the same power consumption.



And with Nvidia's Ampere rumored to offer the same thing (50% more performance for the same power)... if we assume both are true, that leaves the landscape in a similar position, no? 

I wonder how AMD is going to accomplish this on the same node with just an arch tweak. Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.


----------



## moproblems99 (Jan 21, 2020)

EarthDog said:


> I wonder how AMD is going to accomplish this on the same node with just an arch tweak.



Aren't they moving to EUV?  That should give them something... or not.  Or RDNA2 would have to be radically different, like Zen3 is purported to be.  But that doesn't seem to make sense at this point.  Or HBM, anyone, lol?  Get those power savings somehow.



EarthDog said:


> Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.



They could be hedging on RTX gains this year seeing as they likely know the steep ass hill AMD has to climb.

Lots of ifs, lots of coulda, lots of we don't really know.  Dash of salt.....voila!


----------



## Super XP (Jan 22, 2020)

EarthDog said:


> From a link (or two) of yours...
> 
> 
> And with Nvidia Ampre saying the same thing (50% performance for the same power)...if we assume both are true... that leaves the landscape in a similar position, no?
> ...


But you assume AMD is going to do nothing more than an arch tweak, whereas the majority of sources state RDNA2 is a new GPU architecture. But it's also something that AMD is keeping quiet, just like how they kept Zen quiet.


----------



## EarthDog (Jan 22, 2020)

Super XP said:


> But you assume AMD is going to do nothing more than an arch tweak, whereas the majority of sources state RDNA2 is a new GPU architecture. But it's also something that AMD is keeping quiet, just like how they kept Zen quiet.


Lol, I like your positive outlook, even if it is inherently misplaced here. 

Your links show 50% for AMD; so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place, if we follow the rumors, eh?


----------



## Super XP (Jan 22, 2020)

EarthDog said:


> Lol, I like your positive outlook, even if it is inherently misplaced here.
> 
> Your links show 50% for AMD; so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place, if we follow the rumors, eh?


Agreed.


----------



## ratirt (Jan 22, 2020)

EarthDog said:


> Lol, I like your positive outlook, even if it is inherently misplaced here.
> 
> Your links show 50% for AMD; so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place, if we follow the rumors, eh?





Super XP said:


> Agreed.


If we follow the rumors, Ampere won't be Ampere, Zen3 will suck and Cascade Lake will mop the floor with it, and the new NV graphics might not come out.
Well, God knows what is true or not, and if rumors are in play here, we users and customers know nothing about the actual product and will know nothing till it shows up. It is in the companies' best interest not to reveal anything serious or meaningful that would jeopardize the product in any way. The rumors are manufactured, not leaked, nowadays.
This new NV graphics will be good. Why wouldn't it be? RT is being pushed forward, and it's been our "leather jacket dude's" dream, so now let us wait and see what the price will be and how this will play out against AMD and what the latter will show with RDNA2. BUT, if anyone justifies the price being jacked up again because more RT cores are in the GPU, or because it is the high end of the highest end, then please leave the planet and never come back.


----------



## Berfs1 (Jan 22, 2020)

Razrback16 said:


> Ya I remember years back before I tried my first SLI setup (2x 780 Ti) I was scared to death about the stuttering people kept talking about. I've run 780 Ti SLI, Titan X (maxwell) SLI, and now 1080 Ti SLI...I haven't had a lick of stuttering on any games I play. I mean zippo. I generally run vsync @ 60fps in 4k and my games have been butter smooth. If I ever feel like a game is running right around that 60fps limit for my cards and may fluctuate (which can cause input lag in those rare situations), then I switch out of true vsync and enable adaptive vsync at the driver level and that will take care of any issues.
> 
> My experiences with multi gpu have been great. It's obviously not for everyone given that the scaling is never 1:1, and in some cases not even close, but if you have tons of cash and / or are just a hardware enthusiast that wants maximum image quality and / or framerates, it's something I'd recommend people try.
> 
> ...


The 1070 Ti was actually the best performance/$ card of the 10 series, with the 1080 Ti right behind. Take a look at this: https://docs.google.com/spreadsheets/d/1-JXBPMRZtUx0q0BMMa8Nzm8YZiyPGkjeEZfZ2DOr8y8/edit?usp=sharing This is my database for NVIDIA graphics cards, and I also have launch MSRPs and price/TFLOPS (in this context, it is directly relevant when comparing GPUs with the same architecture). I only use launch MSRPs, not adjusted MSRPs, like the GTX 1080 which launched at $699 but was later dropped by, I think, $100 or $200; I don't remember the price cut.
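The price/TFLOPS metric used in that database is easy to compute. A minimal sketch, using approximate public GTX 1070 Ti figures (2432 shaders, ~1.683 GHz boost, $449 launch MSRP) as assumptions:

```python
# FP32 throughput: 2 ops per shader per clock (fused multiply-add).
def tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000.0

def price_per_tflops(msrp_usd: float, shaders: int, boost_ghz: float) -> float:
    return msrp_usd / tflops(shaders, boost_ghz)

# GTX 1070 Ti, approximate figures: 2432 shaders, 1.683 GHz boost, $449 MSRP
print(round(tflops(2432, 1.683), 2))                 # ~8.19 TFLOPS
print(round(price_per_tflops(449, 2432, 1.683), 2))  # ~$54.85 per TFLOPS
```

The same two functions work for any card in the list, as long as the comparison stays within one architecture, as the post notes.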



theoneandonlymrk said:


> Would also be nice to hear some rumours about the improvements to RTX hardware and tensor cores too; shader numbers are not going to inform us much about the performance in isolation.
> 
> @Berfs1 I can't see any of that panning out. Since when did Nvidia improve their best then undercut it? So I think the prices we have now plus 50 dollars minimum, at least at launch, with Supers or something coming out later, once stocks of the 2xxx series run out, makes sense tbf.


Yeah, the pricing was a rumor, but I can believe that the performance will be substantially better. Also, since you mentioned the Super cards, here are a few facts that not a lot of people know about them: they have better price/performance, but lower performance/watt than the non-Supers. Not a lot of people touched on that, so I just wanted to put it out there that the performance/watt is actually worse on Supers than on non-Supers. Everyone wants the most performance with the lowest cost and lowest power; you can only get a good balance of two of those, and you will never get max performance at the lowest power.


----------



## candle_86 (Jan 22, 2020)

VrOtk said:


> That 320-bit bus looks weird: allowing your xx80-grade card to have such high memory bandwidth, you start to cripple your xx80 Ti's performance at higher resolutions (unless it uses a 512-bit bus or HBM2).
> Though I'd be happy to be wrong, as better-grade products for the same or a cheaper price are always welcome.



The last card with a 320-bit bus was the 8800 GTS; it's too odd a number to be the complete memory controller. I'd wager there is some disabled silicon, and the controller is really 384-bit for the Ti.


----------



## Diverge (Jan 22, 2020)

As much as I want to buy the latest and greatest GPU each year... I think I am going to stop doing so until they make gaming great again! 

All the game developers, except maybe a few, suck! Tired of quarter-baked games filled with bugs or heavily monetized with microtransactions...


----------



## Berfs1 (Jan 22, 2020)

candle_86 said:


> The last card with a 320-bit bus was the 8800 GTS; it's too odd a number to be the complete memory controller. I'd wager there is some disabled silicon, and the controller is really 384-bit for the Ti.


Not the last card to have a 320-bit bus width; the GTX 470 had 320-bit, as did the GTX 560 Ti (448 cores) and GTX 570, but yes, it is weird nonetheless. Not to mention the 3080 Ti may be 352- or 384-bit; I mean, the 1080 Ti and 2080 Ti had 352-bit, and the 780 Ti and 980 Ti had 384-bit, so who knows at this point lol. But I have a hunch the Titan Ampere may have 48 GB of VRAM and the 3080 Ti may have 20 GB/24 GB of VRAM, since the 80 Ti cards have always had half or near half the VRAM of the Titan cards, the one exception being the Titan Z (which was a dual-GPU card).
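The bus-width/VRAM pairings in the rumor fall straight out of GDDR6 chip sizes. A quick sketch, assuming one memory chip per 32-bit channel and the 1 GB / 2 GB chip densities available at the time:

```python
# Each GDDR6 chip sits on a 32-bit channel, so capacity options are simply
# (bus width / 32) x chip density for each available density.
def vram_options(bus_width_bits: int, chip_sizes_gb=(1, 2)) -> list:
    channels = bus_width_bits // 32
    return [channels * size for size in chip_sizes_gb]

print(vram_options(256))  # [8, 16]  -> the rumored RTX 3070 configs
print(vram_options(320))  # [10, 20] -> the rumored RTX 3080 configs
print(vram_options(384))  # [12, 24]
```

This is why the rumor's 8/16 GB and 10/20 GB options are internally consistent with its 256-bit and 320-bit buses.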


----------



## efikkan (Jan 22, 2020)

Berfs1 said:


> Not the last card to have a 320-bit bus: the GTX 470 had 320-bit, as did the GTX 560 Ti (448-core) and GTX 570, but yes, it is weird nonetheless. Not to mention the 3080 Ti may be 352- or 384-bit; the 1080 Ti and 2080 Ti had 352-bit, the 780 Ti and 980 Ti had 384-bit, so who knows at this point, lol. But I have a hunch the Titan Ampere may have 48 GB of VRAM and the 3080 Ti 20 or 24 GB, since the 80 Ti cards have always had half or near half the VRAM of the Titan cards, with the one exception being the Titan Z (which was a dual-GPU card).


Most of these were just partially disabled memory controllers. GPUs don't have a single memory controller, but multiple 64-bit controllers, and even these can be partially disabled to get 32-bit increments.
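That arithmetic is easy to sketch: the effective bus width is just the sum of the enabled slices across the 64-bit controllers, where each controller can be fully enabled, halved to 32-bit, or fused off entirely. A minimal illustration (the controller counts below are examples, not confirmed die configurations):

```python
# Effective bus width from per-controller states: each 64-bit memory
# controller can be fully enabled (64), halved (32), or disabled (0).
def bus_width(controller_states):
    """controller_states: one entry per 64-bit controller, each 64, 32, or 0."""
    assert all(s in (64, 32, 0) for s in controller_states)
    return sum(controller_states)

print(bus_width([64] * 6))         # 384 -- all six controllers active
print(bus_width([64] * 5 + [32]))  # 352 -- one controller halved (1080 Ti-style)
print(bus_width([64] * 5 + [0]))   # 320 -- one controller fused off entirely
```

This is why odd-looking widths like 320 or 352 bits don't necessarily reveal the full controller on the die; a 384-bit design with one controller cut down produces exactly these numbers.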

This rumor does, however, contain two specific and odd details:
1) Different SM counts per GPC (Nvidia usually keeps these the same within an architecture)
2) A separate die for a chip with a 320-bit memory controller
These two pieces of information are either completely true or completely wrong, and anyone under NDA would immediately know whether this entire rumor is true or just BS.



Diverge said:


> As much as I want to buy the latest and greatest GPU each year... I think I am going to stop doing so until they make gaming great again!
> 
> All game developers, but maybe a few, suck! Tired of quarter-baked games filled with bugs, or heavily monetized with microtransactions...


Why upgrade so often? Upgrade when you need to.

But you're right about game developers; high quality work is more the exception than the norm. It's not Nvidia that's "killing" gaming, it's poor software.


----------



## Super XP (Jan 22, 2020)

efikkan said:


> Most of these were just partially disabled memory controllers. GPUs don't have a single memory controller, but multiple 64-bit controllers, and even these can be partially disabled to get 32-bit increments.
> 
> This rumor does, however, contain two specific and odd details:
> 1) Different SM counts per GPC (Nvidia usually keeps these the same within an architecture)
> ...


It's not the developers' fault; it's the gaming companies' boards of directors that conjure up techniques to squeeze as much money as possible into their own pockets while greatly limiting development funding.

They want to invest less money in proper game development while still trying to make huge profits.

Nowadays games are pushed out too quickly, with lots of issues and bugs. It's really too bad, and hopefully things change.


----------



## Diverge (Jan 22, 2020)

efikkan said:


> Why upgrade so often? Upgrade when you need to.



For no logical reason. I like playing with new gadgets. It's a disease... LOL. But now that they cost over $1200 each year, I'm going to stop wasting my money... especially since I can't find any games that suck me in anymore.


----------



## candle_86 (Jan 22, 2020)

Diverge said:


> For no logical reason. I like playing with new gadgets. It's a disease... LOL. But now that they cost over $1200 each year, I'm going to stop wasting my money... especially since I can't find any games that suck me in anymore.



I would say the average gamer is on an RX 580 or GTX 1060, and this is the middle road they're going to target.


----------



## Super XP (Jan 22, 2020)

candle_86 said:


> I would say the average gamer is on an RX 580 or GTX 1060, and this is the middle road they're going to target.


Here's what I have (you can also see it in the drop-down menu for my System Specs), and I have absolutely *no issues playing on Ultra High picture quality settings at 1440p.* In games like DOOM, RAGE 2, the entire METRO series, Wolfenstein: The New Order, The Old Blood, II: The New Colossus, and Youngblood, Resident Evil 2 Remake, Resident Evil 7: Biohazard, Mad Max, PREY, Dying Light, etc., I get anywhere from 60 FPS up to 100 FPS. I have never experienced any slowdowns except a couple of times in Dying Light: when I pick up items too quickly, FPS drops to about 50, then goes back up to my average of 75 for that game. DOOM is about 70-100 FPS depending on the area, and the same mostly goes for the Wolfenstein games.

Specs:
* Sapphire Radeon RX 580 8GB Nitro+ SE 
* AMD Ryzen 7 1700X @ stock
* G.Skill TridentZ 32GB (2 x 16GB) DDR4 3200 
* Asus 27" (MG278Q) 144Hz WQHD 1440p

My next upgrade:
Radeon RX 6700XT 8GB
AMD Ryzen 7 4800X
Same Ram & Monitor. 
X670 chipset mobo Socket AM4 etc., I can dream right? lol
End of my Ramblings....


----------



## efikkan (Jan 22, 2020)

Super XP said:


> It's not the developers fault it's the gaming companies Board of Directors that conjure up techniques to squeeze as much money as possible into there pockets and greatly limiting development funding.
> 
> They want less money invested into proper game development while still trying to make huge profits.
> 
> Nowadays games are popped out too quickly with lots of issues and bugs. It's really too bad, and hopefully things change.


Yes, agree. I should have emphasized that I meant the game development companies, not the poor individuals given the "impossible" task.

But as I've mentioned in other threads before, it's often not only about money spent, but about "quick turnover". These companies' boards often prefer rapid product cycles over spending the time necessary to build a proper game engine first, then do full-scale content creation, and then finally _proper_ testing before shipping.


----------



## Prima.Vera (Jan 23, 2020)

Estimated or confirmed release date?


----------



## Manoa (Jan 23, 2020)

I want to tell you something that nobody mentions about multi-card setups: input delay.
People have to understand that SLI "frames" are not really frames :x
Think about it: what is the point of 2x the FPS with 3x the input delay?!

Yeah, it's true, all the games have been really bad for the last 10 years :x
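A very simplified model makes the point: with alternate-frame rendering (AFR), the common SLI mode, two GPUs render alternating frames, so frames *complete* twice as often, but each individual frame still takes a full single-GPU frame time to render. This sketch uses assumed round numbers, and it ignores queuing and sync overhead, which in practice make AFR latency even worse:

```python
# Simplified AFR model (illustrative assumptions, not measurements):
# throughput doubles, per-frame render latency does not improve.
def single_gpu(frame_time_ms: float):
    fps = 1000.0 / frame_time_ms
    latency_ms = frame_time_ms  # one frame in flight
    return fps, latency_ms

def afr_two_gpus(frame_time_ms: float):
    fps = 2 * (1000.0 / frame_time_ms)  # frames finish twice as often...
    latency_ms = frame_time_ms          # ...but each still takes this long
    return fps, latency_ms

print(single_gpu(33.3))    # ~30 fps, 33.3 ms per frame
print(afr_two_gpus(33.3))  # ~60 fps, still 33.3 ms per frame
```

So the FPS counter doubles while the time from input to pixel stays the same at best, which matches the complaint above.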


----------



## Nihilus (Jan 23, 2020)

ppn said:


> Well, the 3070 is a 2080S on steroids; it is 60% faster than the 2070, at 60% of the chip size, the same power, higher clocks, and a lower price... than the 2080S.



And how will the 3070 get past the same bandwidth limitations as the 2080S if it uses the same 256-bit GDDR6 memory? The 16 Gbps chips are already quite expensive. It's been shown that even a standard 2080 performs the same as the 2080S when the memory is set to the same speed, despite the SM deficiency.

This is why the 3080 is getting a 320-bit bus; otherwise it would perform the same as the 3070. It also saves room for a 3080 Ti, which will most likely get a full 384-bit bus.
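The bandwidth math behind this argument is straightforward: bus width in bits divided by 8, times the per-pin data rate in Gbps, gives GB/s. The 320-bit figure comes from the rumor; the data rates below are typical GDDR6 speeds used as assumptions:

```python
# Back-of-the-envelope GDDR6 bandwidth:
# (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps = GB/s.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 15.5))  # 496.0 GB/s -- 2080 Super class
print(bandwidth_gbs(256, 16.0))  # 512.0 GB/s -- fastest common GDDR6
print(bandwidth_gbs(320, 16.0))  # 640.0 GB/s -- rumored 320-bit card
```

On these assumptions, widening the bus to 320-bit buys roughly 25-30% more bandwidth than any 256-bit GDDR6 configuration, without needing exotic memory speeds.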


----------



## Prima.Vera (Jan 23, 2020)

Btw, when was Intel supposed to launch their new GPUs??


----------



## Super XP (Jan 25, 2020)

efikkan said:


> Yes, agree. I should have emphasized that I meant the game development companies, not the poor individuals given the "impossible" task.
> 
> But as I've mentioned in other threads before, *it's often not only about money spent, but about "quick turnover". These companies' boards often prefer rapid product cycles over spending the time necessary to build a proper game engine first, then do full-scale content creation, and then finally proper testing before shipping.*


That is exactly what I meant to say, lol.



Prima.Vera said:


> Btw, when was Intel supposedly to launch their new GPUs ??


If Intel is serious about discrete graphics, I can see them matching both AMD and Nvidia in about 3 years' time. As for the release, I heard sometime in 2020, but it's barely going to do 1080p, from what I last read.


----------

