# So is the 10 GB of VRAM on the RTX 3080 still enough?



## Tomgang (Sep 22, 2020)

This has been discussed before, I know. But now, seeing the spec requirements for the new Watch Dogs: Legion, I really have to say: no, 10 GB is not enough these days. Not for 4K at least, now or in the foreseeable future.

I don't know about you guys, but I am holding off to see if we can't get an RTX 3080 20 GB or an RTX 3080 Ti.

What do you think about the 10 GB of VRAM?


----------



## MxPhenom 216 (Sep 22, 2020)

It's enough. I don't expect the 20GB version to really boost performance in games right now.


----------



## dirtyferret (Sep 22, 2020)

Historically, both Nvidia and AMD (ATI prior) have been damn good about pairing the right amount of RAM with their GPU chips. I wouldn't touch an RTX 3080 20GB (assuming it would ever be offered) with a ten-foot pole unless it was offered at the same price as the 10GB version. The extra RAM on GPUs (think Nvidia GTX 960 4GB, AMD RX 470 8GB, AMD RX 580 8GB) rarely resulted in a major real-world difference. You would be better off moving up to a faster GPU.


----------



## MxPhenom 216 (Sep 22, 2020)

dirtyferret said:


> Historically, both Nvidia and AMD (ATI prior) have been damn good about pairing the right amount of RAM with their GPU chips. I wouldn't touch an RTX 3080 20GB (assuming it would ever be offered) with a ten-foot pole unless it was offered at the same price as the 10GB version. The extra RAM on GPUs (think Nvidia GTX 960 4GB, AMD RX 470 8GB, AMD RX 580 8GB) rarely resulted in a major real-world difference. You would be better off moving up to a faster GPU.



This. The only time the extra memory on these cards, beyond what the card was originally meant to have, ever mattered was in multi-GPU configurations.


----------



## Calmmo (Sep 22, 2020)

As long as it convinces you, I guess...

It's just the 2080 Ti spec, hence the 11 GB "req".


----------



## Solid State Soul ( SSS ) (Sep 22, 2020)

No.

Games are always developed for consoles first, then scaled up for PC. With the next-gen 4K consoles having 16 GB of memory, 10 GB of VRAM is enough for up to 1440p gaming; at 4K you'll likely be limited years down the road, especially if you like to max out the textures.

The golden rule for a future-proof GPU, IMO, is to buy one with the same amount of RAM as the consoles or more. With 10 GB, maxing out the graphics in high-resolution gaming will be limited years down the road; anything lower than about 13 GB of video memory will hold you back.

We already see some PC ports of current-gen games consume up to 8 GB of memory when maxed out, and it'll only increase with true next-gen console games.


----------



## MxPhenom 216 (Sep 22, 2020)

Solid State Soul ( SSS ) said:


> No.
> 
> Games are always developed for consoles first, then scaled up for PC. With the next-gen 4K consoles having 16 GB of memory, 10 GB of VRAM is enough for up to 1440p gaming; at 4K you'll likely be limited years down the road, especially if you like to max out the textures.
> 
> ...



Devs do not have access to all of that 16 GB for their games. I think around 3-4 GB is reserved for the OS and background processes.

Also, these days I'm not so sure games are developed for consoles first 100% of the time anymore. And ports are kind of a thing of the past now that everything is basically x86 and Windows (PlayStation being an exception to this).


----------



## Solid State Soul ( SSS ) (Sep 22, 2020)

MxPhenom 216 said:


> Devs do not have access to all of that 16 GB for their games. I think around 3-4 GB is reserved for the OS and background processes.
> 
> Also, these days I'm not so sure games are developed for consoles first 100% of the time anymore.


Even with a small portion of memory reserved for the OS, we're still looking at more VRAM than an RTX 3080 has.


----------



## Tomgang (Sep 22, 2020)

I know 20 GB will not boost performance over 10 GB unless the memory is faster, like if the 10 GB version uses 19 Gbps chips and the 20 GB model used 21 Gbps chips. But I can also tell you how it feels when you run out of VRAM. That is not a nice feeling: lagging worse than an old man trying to run, and games might even crash.

Better to have too much VRAM than too little. I will personally wait and see if we get a card between the 3080 and 3090 that is more suitable for my needs, because I like cranking the eye candy to 11.


----------



## Solid State Soul ( SSS ) (Sep 22, 2020)

Tomgang said:


> I know 20 GB will not boost performance over 10 GB unless the memory is faster, like if the 10 GB version uses 19 Gbps chips and the 20 GB model used 21 Gbps chips. But I can also tell you how it feels when you run out of VRAM. That is not a nice feeling: lagging worse than an old man trying to run, and games might even crash.
> 
> Better to have too much VRAM than too little. I will personally wait and see if we get a card between the 3080 and 3090 that is more suitable for my needs, because I like cranking the eye candy to 11.


If you play at 4K, 16 GB of VRAM should be a minimum for next gen.

Nvidia knows this, which is why they named the 24 GB, workstation-centric Ampere card the RTX 3090: a weird move to entice gamers to buy that one if they want a future-proof next-gen gaming experience, since it's the most expensive offering. But that's just my guess.

If Sony could ship a $399 console with 8 GB of RAM 7 years ago, then Nvidia could have done way better here and now.


----------



## MxPhenom 216 (Sep 22, 2020)

Tomgang said:


> I know 20 GB will not boost performance over 10 GB unless the memory is faster, like if the 10 GB version uses 19 Gbps chips and the 20 GB model used 21 Gbps chips. But I can also tell you how it feels when you run out of VRAM. That is not a nice feeling: lagging worse than an old man trying to run, and games might even crash.
> 
> Better to have too much VRAM than too little. I will personally wait and see if we get a card between the 3080 and 3090 that is more suitable for my needs, because I like cranking the eye candy to 11.



It's not about whether the RAM is faster or not. It's about whether the GPU is fast enough to actually make use of the extra RAM. And a lot of games these days will occupy all the VRAM you can give them, but that doesn't necessarily mean they have to, or that the extra memory is actually benefiting the game.



Solid State Soul ( SSS ) said:


> Even with a small portion of memory reserved for the OS, we're still looking at more VRAM than an RTX 3080 has.



Except the raw performance of the 3080's GPU will slap anything that's in a console, even with the slightly smaller amount of available VRAM.


----------



## Vya Domus (Sep 22, 2020)

Enough to play games? Yes. Enough for maxed-out settings at 4K? Questionable...


----------



## dirtyferret (Sep 22, 2020)

MxPhenom 216 said:


> It's not about whether the RAM is faster or not. It's about whether the GPU is fast enough to actually make use of the extra RAM. And a lot of games these days will occupy all the VRAM you can give them, but that doesn't necessarily mean they have to, or that the extra memory is actually benefiting the game.



This. Too many people confuse using RAM with needing RAM.



Solid State Soul ( SSS ) said:


> If Sony could ship a $399 console with 8 GB of RAM 7 years ago, then Nvidia could have done way better here and now


The PS4 shipped with 8 GB of GDDR5 RAM plus 1 GB of DDR3 for the OS. The AMD RX 5500 XT 4GB still easily outperforms the PS4. If you want to "future proof", then buy a video card in the future.


----------



## moproblems99 (Sep 22, 2020)

Considering the specs of Cyberpunk, I would say yes. Considering most people only keep GPUs 2 or 3 years, I can't see 10 GB not being enough for that time span. Any longer than that is like asking how long a keg will last.


----------



## Solid State Soul ( SSS ) (Sep 22, 2020)

dirtyferret said:


> This. Too many people confuse using RAM with needing RAM.
> 
> 
> The PS4 shipped with 8 GB of GDDR5 RAM plus 1 GB of DDR3 for the OS. The AMD RX 5500 XT 4GB still easily outperforms the PS4. If you want to "future proof", then buy a video card in the future.



It's not about performance; it's about having the ability to max out textures and details, which is why people buy high-end GPUs.


----------



## EarthDog (Sep 22, 2020)

moproblems99 said:


> Considering the specs of Cyberpunk, I would say yes. Considering most people only keep GPUs 2 or 3 years, I can't see 10 GB not being enough for that time span. Any longer than that is like asking how long a keg will last.


Beeeeeeeeeelch.

Done.

10GB I think is fine for the life of that card, 3-5 years, I'd imagine. It may get long in the tooth in some titles towards the end, but, contrary to what pokemon thinks, I'd rather have a faster overall card with 10GB than one ~20% slower with 16GB in general. That said, it depends on the games you play (if you mod heavily, you may need to reconsider).

I think it would behoove some of those in this thread to look at VRAM use in some of TPU's game reviews at 4K. There may be a title out there that caps it, but it's another question whether the gaming experience actually suffers from this or, as was said earlier, whether the space is just allocated. There is a difference.

EDIT: At 4K, W1z's last 5 game performance reviews averaged 6.8 GB of VRAM use. The highest was 8.9 GB (the lowest was 4.8). I didn't go back further, but just saying... it will likely be enough for a while...
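For anyone who wants to sanity-check the headroom, here's a quick back-of-the-envelope sketch. The 6.8 GB average and 8.9 GB worst case are the review figures quoted above; everything else is just arithmetic, not a claim about any specific game:

```python
# Rough headroom check against a 10 GB card, using the review figures above.
card_vram_gb = 10.0
avg_use_gb = 6.8   # average 4K VRAM use across the last 5 TPU game reviews
peak_use_gb = 8.9  # hungriest single title observed

avg_headroom = (card_vram_gb - avg_use_gb) / card_vram_gb * 100
peak_headroom = (card_vram_gb - peak_use_gb) / card_vram_gb * 100

print(f"Average headroom: {avg_headroom:.0f}%")     # ~32% free on a typical title
print(f"Worst-case headroom: {peak_headroom:.0f}%")  # ~11% free on the hungriest title
```

So even the worst case measured leaves some room, which is the whole point about allocation vs. need.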


----------



## Tomgang (Sep 22, 2020)

Solid State Soul ( SSS ) said:


> If you play at 4K, 16 GB of VRAM should be a minimum for next gen.
> 
> Nvidia knows this, which is why they named the 24 GB, workstation-centric Ampere card the RTX 3090: a weird move to entice gamers to buy that one if they want a future-proof next-gen gaming experience, since it's the most expensive offering. But that's just my guess.
> 
> If Sony could ship a $399 console with 8 GB of RAM 7 years ago, then Nvidia could have done way better here and now.





MxPhenom 216 said:


> It's not about whether the RAM is faster or not. It's about whether the GPU is fast enough to actually make use of the extra RAM. And a lot of games these days will occupy all the VRAM you can give them, but that doesn't necessarily mean they have to, or that the extra memory is actually benefiting the game.
> 
> 
> 
> Except the raw performance of the 3080's GPU will slap anything that's in a console, even with the slightly smaller amount of available VRAM.



I am on a 1920 x 1200, 60 Hz screen that is around 8 years old now, with only a DVI connection: fitting for my X58 system, but not for a new Zen 3/Ampere-powered system. I need a new screen with HDMI/DisplayPort anyway, so I might as well get a proper 4K screen now. Just as with my other hardware, I keep my screens for long periods before replacing them.


----------



## EarthDog (Sep 22, 2020)

Tomgang said:


> I am on a 1920 x 1200, 60 Hz screen that is around 8 years old now, with only a DVI connection: fitting for my X58 system, but not for a new Zen 3/Ampere-powered system. I need a new screen with HDMI/DisplayPort anyway, so I might as well get a proper 4K screen now.


Honestly, I'd rather get a 2560x1440 144 Hz monitor than go 4K 60 Hz...

I'm riding mine out (1440/144 Hz) for a couple more years for sure. I'd rather do that than 4K/60...


----------



## moproblems99 (Sep 22, 2020)

EarthDog said:


> 10GB I think is fine for the life of that card, 3-5 years, I'd imagine. It may get long in the tooth in some titles towards the end, but, contrary to what pokemon thinks, I'd rather have a faster overall card with 10GB than one ~20% slower with 16GB in general. That said, it depends on the games you play (if you mod heavily, you may need to reconsider).
> 
> I think it would behoove some of those in this thread to look at VRAM use in some of TPU's game reviews at 4K. There may be a title out there that caps it, but it's another question whether the gaming experience actually suffers from this or, as was said earlier, whether the space is just allocated. There is a difference.



If we want to believe the rumor mill, that Big Navi is only equal to a 2080 Ti (impressive in itself) and that it comes with 16GB of VRAM, I'd take the 3080 in a heartbeat. Not that this is a good comparison, but 1,000 hp doesn't do any good on a tricycle. I mean, it does, just not in this comparison.


----------



## arbiter (Sep 22, 2020)

Solid State Soul ( SSS ) said:


> No.
> 
> Games are always developed for consoles first, then scaled up for PC. With the next-gen 4K consoles having 16 GB of memory, 10 GB of VRAM is enough for up to 1440p gaming; at 4K you'll likely be limited years down the road, especially if you like to max out the textures.
> 
> ...





MxPhenom 216 said:


> Devs do not have access to all of that 16 GB for their games. I think around 3-4 GB is reserved for the OS and background processes.
> 
> Also, these days I'm not so sure games are developed for consoles first 100% of the time anymore. And ports are kind of a thing of the past now that everything is basically x86 and Windows (PlayStation being an exception to this).


Let's also not forget: yes, the console has 16 GB of memory and some of it is reserved, but it's also SHARED memory. That memory has to house all game and graphics data, whereas on a PC those are usually separate pools. Even if a game needs more than 10 GB of VRAM, with the speed of storage nowadays it's not as big an issue as it used to be, since system memory is fast now and so is data storage.



MxPhenom 216 said:


> It's not about whether the RAM is faster or not. It's about whether the GPU is fast enough to actually make use of the extra RAM. And a lot of games these days will occupy all the VRAM you can give them, but that doesn't necessarily mean they have to, or that the extra memory is actually benefiting the game.





dirtyferret said:


> This. Too many people confuse using RAM with needing RAM.


You can test this on a Windows machine right now. A fresh install of Win10 on, say, 8 GB will use what, ~3 GB? Slap another 8 GB stick in there and it could go up to 4-5 GB. More RAM being used doesn't mean it's needed; it just gets used to cache extra data that might be needed.


----------



## Solid State Soul ( SSS ) (Sep 22, 2020)

dirtyferret said:


> you want to "future proof" then buy a video card in the future.


Not everyone buys a new graphics card every two years, and if someone bought a 1070 4 years ago, that person would still be a very happy 1080p gamer to this day.

Nvidia's xx70 series has had 8 GB of memory for three generations now. GPU vendors must keep up with video memory demands, especially with late current-gen games recommending upwards of 10 GB for 4K gaming.


----------



## MxPhenom 216 (Sep 22, 2020)

Solid State Soul ( SSS ) said:


> Not everyone buys a new graphics card every two years, and if someone bought a 1070 4 years ago, that person would still be a very happy 1080p gamer to this day.
> 
> Nvidia's xx70 series has had 8 GB of memory for three generations now. GPU vendors must keep up with video memory demands, especially with late current-gen games recommending upwards of 10 GB for 4K gaming.



I still have a 1070 at 1440p............... and it's f****** dog slow now.


----------



## Tomgang (Sep 22, 2020)

EarthDog said:


> Honestly, I'd rather get a 2560x1440 144 Hz monitor than go 4K 60 Hz...
> 
> I'm riding mine out (1440/144 Hz) for a couple more years for sure. I'd rather do that than 4K/60...



But that 2560x1440 screen will put a glass ceiling on my 4K. Sorry, I just had to say it, after you told me the same thing about my X58 system.

Yeah, I could go that route, but I'm used to a 60 Hz screen and I'm more interested in high settings and eye candy than high fps. But I will consider your suggestion.


----------



## Tatty_One (Sep 22, 2020)

MxPhenom 216 said:


> I still have a 1070 at 1440p............... and it's f****** dog slow now.


I was in the same boat; the newer titles with a decent degree of eye candy pushed it hard. Luckily for me, I got a 2060 Super as a birthday present (a special birthday) as a placeholder until I can get my hands on a 3070 or something tasty from AMD in the spring.


----------



## dirtyferret (Sep 22, 2020)

Solid State Soul ( SSS ) said:


> It's not about performance; it's about having the ability to max out textures and details, which is why people buy high-end GPUs.


Yes, and when they no longer perform up to people's desires, they purchase new ones. Hence why I said: if you really want to worry about future-proofing, then purchase in the future. In the present, get whatever you want for your desired performance, be it low, medium, or ultra settings at 900p, 1080p, 1440p, 4K, or anything in between.


----------



## Tomgang (Sep 22, 2020)

MxPhenom 216 said:


>



What's left of me after waiting as well.


----------



## moproblems99 (Sep 22, 2020)

Solid State Soul ( SSS ) said:


> Not everyone buys a new graphics card every two years, and if someone bought a 1070 4 years ago, that person would still be a very happy 1080p gamer to this day.
> 
> Nvidia's xx70 series has had 8 GB of memory for three generations now. GPU vendors must keep up with video memory demands, especially with late current-gen games recommending upwards of 10 GB for 4K gaming.



Please note the VRAM amount of the Fury X compared to the 980 Ti and even the 1070.


----------



## docnorth (Sep 22, 2020)

I think that, except for textures, 10 GB of GDDR6X has the bandwidth and speed of about 13 GB of GDDR6. Anyway, if a game needs more VRAM than the GPU has at, e.g., ultra settings, then the GPU usually doesn't have the processing power to keep 60 fps at those settings.
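That "~13 GB" figure roughly follows from the per-pin data rates; here's a sketch of where it comes from. Capacity and bandwidth aren't actually interchangeable, and the 14 Gbps GDDR6 figure is an assumption (the common speed on Turing cards), so treat this as illustrative arithmetic only:

```python
# Scale capacity by per-pin data rate to get a "GDDR6-equivalent" figure.
# Purely illustrative: capacity does not really convert to bandwidth.
gddr6x_rate_gbps = 19.0  # RTX 3080 GDDR6X per-pin data rate
gddr6_rate_gbps = 14.0   # typical GDDR6 per-pin rate (assumed, e.g. RTX 2080)

capacity_gb = 10.0
equivalent_gb = capacity_gb * gddr6x_rate_gbps / gddr6_rate_gbps
print(f"{equivalent_gb:.1f} GB")  # ~13.6 GB, close to the ~13 GB figure above
```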


----------



## John Naylor (Sep 23, 2020)

Whenever a card has been issued in two versions (680/770, 960, and even the 1060), testing has shown that the extra RAM brought nothing to the table. Yes, folks love to bring up the poor console ports or the rare game that is the exception, but the exception does not invalidate the rule. I saw hundreds of posts saying 3 GB was inadequate for the 1060 at 1080p. Things were a bit more difficult there, as the 3 GB model had a different GPU with 11% fewer shaders. But in testing with TPU's test suite, having 11% fewer shaders only resulted in a 6% hit on performance.

Now, if we are going to argue that 3 GB is impacting performance, it inevitably follows that the 6% gap would have to grow at 1440p... it doesn't. Now let's talk exceptions:

On the 6 GB card, RoTR is 32% faster than on the 3 GB card, so obviously VRAM is the cause here, right? Not so fast: if 3 GB is inadequate at 1080p, the performance at 1440p must be truly horrendous, no? Then care to explain why, at 1440p, the advantage is only 16%? In what scenario does the relative performance gap between VRAM amounts decrease at higher resolutions? Something else is obviously at play here.

From the 600 series on, whenever two cards have dropped with two different RAM amounts, the prevailing outcry has been "only the larger amount will do". And yet fps, with rare exceptions, is not affected.

2GB and 4GB 680 - https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
_So, what can we glean from all of that? For one thing, with any single monitor the 2GB video card is plenty - even on the most demanding games and benchmarks out today. When you scale up to three high-res screens the games we tested were all still fine on 2GB, though maxing out some games’ settings at that resolution really needs more than a single GPU to keep smooth frame rates. With the right combination of high resolution and high detail, though, there is the potential for 2GB to be insufficient. For future games, or perhaps current games that were not tested in this article, you might be better off with a 4GB card if - and only if - you plan to run across a multi-screen configuration._

2GB and 4GB 770 (34 game tests) - http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
_There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference.  If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards.   Let’s only look at the games where there is more than a single FPS difference between the two GTX 770s._

_Metro: 2033 is completely unplayable on either card at our highest resolution, and even GTX 770 4GB SLI wouldn’t be playable either.  Sleeping Dogs has a problem actually displaying on the outer LCDs although the performance is cut, so this benchmark has to be discounted.  This leaves five games out of 30 where a 4GB GTX 770 gives more than a 1 frame per second difference over the 2GB-equipped GTX 770.  And one of them, Metro: Last Light, still isn’t even quite a single frame difference.  Of those five games, two of them are unplayable at 5760×1080, although in these cases 4GB GTX 770 SLI would finally make some sense over 2GB GTX 770 SLI.  That only leaves Lost Planet 2 and two racing games that gain some advantage by choosing a single GTX 770 4GB card over the single GTX 770 2GB.  And in Lost Planet 2, we were able to add even higher anti-aliasing – from 8xAA to CSAA8XQ and to CSAA32X – but the performance difference was greatest with 8xAA._

_There is one last thing to note with Max Payne 3: it would not normally allow one to set 4xAA at 5760×1080 with any 2GB card, as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. This has been quite an enjoyable exploration for us in comparing the 4GB vRAM-equipped EVGA GTX 770 SC to the 2GB reference GTX 770 at the same clocks. Unless a gamer plays at 5760×1080, we wouldn’t recommend choosing 4GB vRAM over 2GB for a single GTX 770 for today’s games. However, if a gamer is planning to SLI GTX 770s at 5760×1080, then it might be reasonable to pick the 4GB version._


2 GB and 4 GB 960 - https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,26.html
_The performance of the GeForce 960 series overall is certainly okay if you stick to 1080P, this is really a 1080P card. At 2560x1440 the case becomes trickier with 2 GB, once the card runs out of memory frames will start swapping back and forth in the frame-buffer, resulting in a performance loss. Admittedly, even there the card manages to do OK thanks to compression techniques that save memory, but it definitely is the resolution where you want more than 2GB memory, and thus the 4GB model is applicable here. So overall it will be a fun and sweet Full HD gaming product and yeah, though the specs are a little less exciting, really we are not disappointed by the performance if 1920x1080 is your gaming domain._

Is 4 GB Enough ? -  https://www.extremetech.com/gaming/...x-faces-off-with-nvidias-gtx-980-ti-titan-x/5
_We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” The answer, after all, applies to more than just the Fury X — Nvidia’s GTX 970 and 980 both sell with 4GB of RAM, as do multiple AMD cards and the cheaper R9 Fury. Based on our results, I would say that the answer is yes — but the situation is more complex than we first envisioned. First, there’s the fact that out of the fifteen games we tested, only four could be forced to consume more than 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this._


Using a utility to show that the system is using.... (no, that's incorrect).... allocating more RAM, simply because it is there, does not mean that the extra RAM is needed. This is demonstrated in the above tests, but never more obviously than in the Max Payne test with the two 770s.


----------



## MrGRiMv25 (Sep 23, 2020)

Another thing to consider about the VRAM size is that Nvidia don't make the card in an isolated microcosm and just slap an arbitrary amount on it. They tend to talk with developers when creating a new architecture and ask them what VRAM amounts they would be happy with, etc.

As for future-proofing, it's not really a thing you can do very well with GPUs anyway, so by the time 20GB of VRAM is absolutely needed, the card will more than likely be too slow to actually make use of it.


----------



## Naito (Sep 23, 2020)

This time around, 10GB should be sufficient, but that doesn't mean it's comfortable. The next-gen consoles offer ~10GB for games as it is. Port a next-gen game to PC, add a few settings that can be dialed up to 11, and you're going to see increased demand for VRAM. Pair this with wider use of RT technologies, and it will only exacerbate the issue. I'd argue the RTX 3080 has a memory bus capable of easily and effectively accessing 20GB of VRAM. The RTX 3090 _only_ has 175GB/s more bandwidth to access double the VRAM - that's only ~23% more bandwidth for 100% more VRAM!
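Those bandwidth numbers check out against the published specs (320-bit bus at 19 Gbps on the 3080, 384-bit at 19.5 Gbps on the 3090); a quick sketch of the arithmetic:

```python
# Memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps) -> GB/s
def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    return bus_bits / 8 * rate_gbps

rtx3080 = bandwidth_gbs(320, 19.0)   # 760 GB/s
rtx3090 = bandwidth_gbs(384, 19.5)   # 936 GB/s

print(f"Delta: {rtx3090 - rtx3080:.0f} GB/s")             # ~176 GB/s
print(f"Extra bandwidth: {(rtx3090 / rtx3080 - 1):.0%}")  # ~23% more, for 100% more VRAM
```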


----------



## rtwjunkie (Sep 23, 2020)

MxPhenom 216 said:


> It's not about whether the RAM is faster or not. It's about whether the GPU is fast enough to actually make use of the extra RAM. And a lot of games these days will occupy all the VRAM you can give them, but that doesn't necessarily mean they have to, or that the extra memory is actually benefiting the game.


This!



dirtyferret said:


> too many people confuse using RAM with needing RAM


And This!

I can’t understand how so many have been suckered in by the card makers and game devs. Many games will USE as much VRAM as is available, but they don’t NEED it. W1zz has proved this several times. What is actually needed is less, with no drop in quality.

History is littered with cards whose GPUs did not have the horsepower to make use of the additional VRAM, like the GTX 960 4GB.

There is also the issue of GPUs bought with way more VRAM than needed (for supposed future-proofing) becoming obsolete for what you want to play before future games can make use of that extra VRAM.

Of course, if it’s just epeen you want, then have at it.


----------



## Solid State Soul ( SSS ) (Sep 23, 2020)

MrGRiMv25 said:


> Another thing to regard about the VRAM size is that Nvidia don't make the card in an isolated microcosm and just slap an arbitrary amount on them. They tend to talk with developers when creating a new architecture and ask them what VRAM amounts they would be happy with etc.


I don't believe that. With the announcement of the Xbox Series S, developers voiced their disappointment with its 10GB of RAM, saying it will hold back next-gen detail, and that is supposed to be a 1440p console that is yet to be released, not a 4K card like an RTX 3080 with 10GB.

If some developers are voicing concerns about 10GB not being enough for a 1440p next-gen console, then what does that tell you about a 4K card with 10GB like the RTX 3080?


----------



## moproblems99 (Sep 23, 2020)

Solid State Soul ( SSS ) said:


> I don't believe that. With the announcement of the Xbox Series S, developers voiced their disappointment with its 10GB of RAM, saying it will hold back next-gen detail, and that is supposed to be a 1440p console that is yet to be released, not a 4K card like an RTX 3080 with 10GB.
> 
> If some developers are voicing concerns about 10GB not being enough for a 1440p next-gen console, then what does that tell you about a 4K card with 10GB like the RTX 3080?



It's a console; the name of the game is compromise and holding back.


----------



## biffzinker (Sep 23, 2020)

How is 10GB of RAM on a 3080 not enough compared to the consoles? The CPU doesn’t have to share that RAM space with the GPU, unlike on consoles, where the OS and apps occupy 4-6GB of it.


----------



## Naito (Sep 23, 2020)

biffzinker said:


> How is 10GB of RAM on a 3080 not enough compared to the consoles? The CPU doesn’t have to share that RAM space with the GPU, unlike on consoles, where the OS and apps occupy 4-6GB of it.



It's true consoles share their RAM pools, but let's not forget the Series X has 16GB of GDDR6, with 10GB primarily dedicated to games.


----------



## ppn (Sep 23, 2020)

It may or may not be enough; are you willing to take the risk? I wish my 770 had had 4GB and my 780 6GB, for good reason: my experience testifies that the memory was always insufficient. I had to upgrade prematurely, and it neutered my ambitions for SLI. And the 3080 20GB could be NVLink enabled. One more reason why it's worth waiting for the 4070 16GB.


----------



## blued (Sep 23, 2020)

I think even 8GB is enough until the next gen. Just wait for the 3070 8GB and 16GB versions to be compared at 4K and see for yourself. Anyone who whips out a VRAM usage chart showing more than that with higher-VRAM cards just doesn't get it (or doesn't understand the difference between VRAM allocated vs. actually needed).


----------



## Kissamies (Sep 23, 2020)

It feels like yesterday when people were saying "256MB is total overkill", and now people are wondering if 10GB is enough. Damn, how time flies.


----------



## Vayra86 (Sep 23, 2020)

The gist of this topic, funnily enough, is that we all agree on one thing:

"10GB is enough, because the alternative of 20GB is really too much."

That is a very convoluted way of saying "10GB is all the choice we really have, so we'll make it work."

Reflect on that, for 700-800 dollars' worth of GPU.


----------



## Kissamies (Sep 23, 2020)

Vayra86 said:


> The gist of this topic, because the funny thing is, we all agree on one thing:
> 
> "10GB is enough, because the alternative of 20GB is really too much"
> 
> ...


Just wondering: could 15GB be possible with mixed-density chips?
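For what it's worth, the arithmetic works out: the 3080's 320-bit bus is ten 32-bit channels, one module each, so mixing 1GB and 2GB modules could in principle land on 15GB. Whether Nvidia would ever ship an asymmetric config like that is another matter; this is just a sketch:

```python
# The 3080's 320-bit bus = ten 32-bit memory channels, one GDDR6X module each.
channels = 320 // 32  # 10 modules

# Hypothetical mixed-density configuration: half 1 GB modules, half 2 GB modules.
config_gb = [1] * (channels // 2) + [2] * (channels // 2)
total = sum(config_gb)
print(f"{channels} modules -> {total} GB")  # 10 modules -> 15 GB
```

The catch with mixed densities historically (the GTX 660 did something like this) is that part of the memory ends up accessible at reduced bandwidth, which is probably why it stays a dream.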


----------



## EarthDog (Sep 23, 2020)

Vayra86 said:


> The gist of this topic, because the funny thing is, we all agree on one thing:
> 
> "10GB is enough, because the alternative of 20GB is really too much"
> 
> ...


Sure... if this were actually a "make it work" situation. But it isn't; it will work fine.

Remember, 4K gaming according to Steam is like 2% of users. So very few people would have this supposed "conundrum" you speak of in the first place.


----------



## Vayra86 (Sep 23, 2020)

EarthDog said:


> Sure... if this was actually a 'make it work' situation. But, it isn't. Remember, 4K gaming according to steam is like 2% of users. So very few in the first place would likely have this generally derived conundrum in the first place.



According to TPU it's 99%, apparently... everyone is saying "this is a 4K card". It's all a matter of perception. Half of the people saying they'll buy the hardware aren't ever even close to hovering over the button to do so. It's just nice to talk about hardware and "plans".

You also have to consider that this sort of leap forward is usually accompanied by a monitor upgrade.



Chloe Price said:


> Just wondering: could 15GB be possible with mixed-density chips?



That'd be a nice middle ground, really... but yeah. Let's dream on.


----------



## EarthDog (Sep 23, 2020)

Vayra86 said:


> According to TPU it's 99%, apparently... everyone is saying "this is a 4K card"


What people think of the card (and I agree it has enough power to easily drive 4K/60; so did the 2080 Ti, really) and what they buy it for are two different things. It's still OK at 4K and will be for a while, for several reasons listed multiple times in this thread already.

Thread needs to be closed... the answer is YES. lol

EDIT (to the below post's nonsense): There's no concern with "turning textures and settings all the way up". FFS, read any game review that covers VRAM use... geez.


----------



## Vya Domus (Sep 23, 2020)

biffzinker said:


> How is 10GB of RAM on a 3080 not enough compared to the consoles? The CPU doesn’t have to share that RAM space with the GPU unlike consoles with the OS/Apps occupying 4-6GB.



Again, it's more nuanced. It will be enough in the sense that games will run fine for many years; people are wondering what will happen if you push textures and settings all the way up. That's where the concern is.


----------



## sepheronx (Sep 23, 2020)

MxPhenom 216 said:


> I have a 1070 still at 1440p...............and its f****** dog slow now.



Really? I got a 1070 and the damn thing still plays all the games I throw at it at high settings minus Crysis remastered which was high/med settings.

Then again, that is at 1080p.


----------



## dirtyferret (Sep 23, 2020)

Vayra86 said:


> The gist of this topic, because the funny thing is, we all agree on one thing:
> 
> "10GB is enough, because the alternative of 20GB is really too much"


I don't agree with that, and I don't think most of the other "older" gamers here (EarthDog, rtwjunkie, etc.) do either.  We have been around long enough to watch video RAM increase from one generation of video cards to the next.  What I am saying (and I believe they are too, but they can correct me if I'm wrong) is that 10GB is enough for the RTX 3080.  When the day comes that you need 20GB (and it will come), the GA102 used in the RTX 3080 will be too slow to take advantage of that RAM, and there will be better alternatives on the market that can fully use 20GB.  So if there ever were an RTX 3080 20GB card, it would show minimal real-world benefit over the RTX 3080 10GB across most games.


----------



## Metroid (Sep 23, 2020)

Today 10GB is enough; in 3 years' time it won't be. Buying the 20GB card for gaming right now isn't needed either; rumors suggest the 20GB version will cost $200 more, and I would save that $200 for something else. The reality is that you will be able to buy a 4080 with 16GB or 20GB in 2023, and that is when it will be needed.


----------



## EarthDog (Sep 23, 2020)

dirtyferret said:


> I don't agree with that, and I don't think most of the other "older" gamers here (EarthDog, rtwjunkie, etc.) do either.  We have been around long enough to watch video RAM increase from one generation of video cards to the next.  What I am saying (and I believe they are too, but they can correct me if I'm wrong) is that 10GB is enough for the RTX 3080.  When the day comes that you need 20GB (and it will come), the GA102 used in the RTX 3080 will be too slow to take advantage of that RAM, and there will be better alternatives on the market that can fully use 20GB.  So if there ever were an RTX 3080 20GB card, it would show minimal real-world benefit over the RTX 3080 10GB across most games.


This.

I think where a 3080 20GB would shine is for those who want to run 4K and mod their titles. Otherwise, at 10GB, even at 4K, the card has years to go. And for the 2560x1440/144Hz crowd, I don't think there could be a better choice until AMD releases its cards.


----------



## Vayra86 (Sep 23, 2020)

I agree 20 is overkill. 12~16 would have been perfect, emphasis on 16.

For 1440p, yes, 10GB is a good place to be, but it's still a shaky balance with that much core power.


----------



## TumbleGeorge (Oct 16, 2020)

This thread is old, but I see a lot of problems in some of the VRAM usage tests. They used different resolutions, including 4K with ultra settings... but with AA disabled? Why? Is there a real test with games at 4K, with 4K textures, displayed on a 4K PC monitor with 10-bit, 12-bit, or 16-bit color, at ultra or "badass" settings, with everything enabled including anti-aliasing? I could not find tests in which all these conditions were met. There is always something excluded, so are the claims about how much video memory is enough a lie?!


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> This thread is old, but I see a lot of problems in some of the VRAM usage tests. They used different resolutions, including 4K with ultra settings... but with AA disabled? Why? Is there a real test with games at 4K, with 4K textures, displayed on a 4K PC monitor with 10-bit, 12-bit, or 16-bit color, at ultra or "badass" settings, with everything enabled including anti-aliasing? I could not find tests in which all these conditions were met. There is always something excluded, so are the claims about how much video memory is enough a lie?!


Why would AA be enabled at 4K with ultra settings? Not only would even a current high-end GPU be unable to hold a solid fps above 30, it is also highly redundant, since AA is, in my observation, nothing more than "faking high resolution". AA is only good for 1080p or even 1440p, when a user hits the resolution limit of their monitor but still wants to reduce any remaining jaggies. In current modern games at 4K, the jaggies should already be reduced enough that the eye can barely perceive any. Using AA at 4K ultra is awesome and all, but it would stress the GPU long before video memory could be saturated. Therefore, as things stand, using AA at 4K is an unrealistic real-world scenario.


----------



## Deleted member 193596 (Oct 16, 2020)

btw..
there is a HUGE difference between allocating and utilizing VRAM..


----------



## TumbleGeorge (Oct 16, 2020)

Rei said:


> Why would AA be enabled at 4K with ultra settings? Not only would even a current high-end GPU be unable to hold a solid fps above 30, it is also highly redundant, since AA is, in my observation, nothing more than "faking high resolution". AA is only good for 1080p or even 1440p, when a user hits the resolution limit of their monitor but still wants to reduce any remaining jaggies. In current modern games at 4K, the jaggies should already be reduced enough that the eye can barely perceive any. Using AA at 4K ultra is awesome and all, but it would stress the GPU long before video memory could be saturated. Therefore, as things stand, using AA at 4K is an unrealistic real-world scenario.


Because it is a possible scenario. Your thesis is just unlikely, not impossible. There are many, many gamers with different priorities. Can you prove that no one would take advantage of the opportunity to play with all the settings turned on and at maximum? On a 4K monitor with a small 27-28 inch diagonal some settings may not make sense, but won't the shortcomings in picture quality be visible on monitors with a larger diagonal?
Edit: 


WarTherapy1195 said:


> btw..
> there is a HUGE difference between allocating and utilizing VRAM..


I know that, and my question is not about that.


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> Because it is a possible scenario. Your thesis is just unlikely, not impossible. There are many, many gamers with different priorities. Can you prove that no one would take advantage of the opportunity to play with all the settings turned on and at maximum? On a 4K monitor with a small 27-28 inch diagonal some settings may not make sense, but won't the shortcomings in picture quality be visible on monitors with a larger diagonal?


Your scenario is the one that is unrealistic. A larger screen is best viewed from a greater distance, unless you want to snap your neck and not get the best viewing experience. At that distance, the shortcomings become less visible unless you're looking for them. I didn't say that no one would use AA at 4K ultra, but as I mentioned in my previous post, any 4K-native game using AA would stress the GPU too much to reach even 30fps before the video memory could be fully utilized. So realistically, don't use AA at 4K, even at high settings, if you want to game close to 60fps, no matter the screen size.


TumbleGeorge said:


> Edit:
> 
> I know that, and my question is not about that.


He wasn't replying to you, BTW. He's just replying in general to this thread.


----------



## TumbleGeorge (Oct 16, 2020)

Rei said:


> Your scenario is the one that is unrealistic. A larger screen is best viewed from a greater distance, unless you want to snap your neck and not get the best viewing experience. At that distance, the shortcomings become less visible unless you're looking for them. I didn't say that no one would use AA at 4K ultra, but as I mentioned in my previous post, any 4K-native game using AA would stress the GPU too much to reach even 30fps before the video memory could be fully utilized. So realistically, don't use AA at 4K, even at high settings, if you want to game close to 60fps, no matter the screen size.


Are you brave enough to write on Jason Huang's Twitter that his fastest mainstream video card is too weak and not powerful enough to meet the heavy requirements of turning all settings at 4K resolution to maximum, including AA?


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> Are you brave enough to write on Jason Huang's Twitter that his fastest mainstream video card is too weak and not powerful enough to meet the heavy requirements of turning all settings at 4K resolution to maximum, including AA?


Why would I? It's called common sense. You're the one asking for an unrealistic real-world scenario. If you think it can be done, then try it out yourself, find out if you can get above 30fps or even close to 60fps, and let us know the result.


----------



## Vya Domus (Oct 16, 2020)

WarTherapy1195 said:


> there is a HUGE difference between allocating and utilizing VRAM..



People keep talking about this as if they know what any of it actually means.

First of all, allocated VRAM eventually becomes used VRAM. Secondly, consider a 10 GB card: if 9.5 GB are allocated and, say, only 9 GB are used, but the application then requests one more allocation of 1 GB, then even though "technically", going by used VRAM, there is enough space, actually there isn't. You can't allocate 10.5 GB; you either need to swap buffers out or resort to virtual memory (which only really exists for things like CUDA). See how easy it is to be below the limit and still experience problems?

Memory transfers and allocations are also asynchronous, meaning they don't necessarily take place when they are invoked. Because of this, the driver will try to clump together as many allocations and transfers as possible, as early as possible, based on how much memory is free. Why? Allocations are really expensive and can cause some really ugly halts; that's the infamous "stutter" people experience when VRAM usage gets close to the maximum available, as the driver keeps allocating and deleting buffers. This is also why GPUs with more memory allocate more buffers earlier: to minimize the performance hit.
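To make the headroom problem concrete, here is a toy model in Python (purely illustrative; the `VramPool` class and its numbers are made up, and a real driver is vastly more complicated):

```python
class VramPool:
    """Toy model of GPU memory: what limits a new allocation is the
    *allocated* total, not how much of that memory is actually 'used'."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.allocated_gb = 0.0

    def alloc(self, size_gb):
        # A request fails if it would push the allocated total past
        # capacity -- the driver would have to evict buffers or spill
        # to system memory instead.
        if self.allocated_gb + size_gb > self.capacity_gb:
            return False
        self.allocated_gb += size_gb
        return True

pool = VramPool(10.0)       # a 10 GB card
assert pool.alloc(9.5)      # 9.5 GB allocated (perhaps only ~9 GB touched)
assert not pool.alloc(1.0)  # the next 1 GB request cannot fit
```

So a monitoring tool reading "9 GB used" can still coincide with allocation failures and buffer eviction.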


----------



## Vayra86 (Oct 16, 2020)

TumbleGeorge said:


> Because it is possible scenario. Your thesis is just unrealistic, but not unpossible. There are many, many gamers with different attentions. Can you prove that no one would take advantage of the opportunity to play with all the settings turned on and to the maximum? a 4k monitor with a small diagonal of 27-28 inches may not make sense from some settings, but will the shortcomings in the picture quality not be visible on monitors with a larger diagonal?
> Edit:
> 
> I know that and my question is not for that.



The scenario you actually DO run into in the real world with a 10GB card at 4K is modded games. Those additional assets, textures, etc. will not necessarily tax the GPU core, but they will eat up memory. The same issue can also appear at 1440p.

Perspective, though: it can ALSO happen with a 1080 Ti and its 11GB. The problem is that you're much more likely to run into it with a 3080 10GB, because the core lets you do a lot more while maintaining decent performance. You're getting +80% or so on the core, but -10% VRAM. As with all PC parts, the key is balance. Poor balance makes bottlenecks visible.

Another real scenario, coming soon, is that the console VRAM budget (even just for games) will exceed 10GB by a good margin, a margin developers will most certainly use sooner rather than later. It's realistic to expect console ports to exceed 10GB within the next three years. We saw a similar thing with the PS4 and games needing upwards of 4GB.



Vya Domus said:


> People keep talking about this as if they know what any of it actually means.
> 
> First of all, allocated VRAM eventually becomes used VRAM. Secondly, consider a 10 GB card: if 9.5 GB are allocated and, say, only 9 GB are used, but the application then requests one more allocation of 1 GB, then even though "technically", going by used VRAM, there is enough space, actually there isn't. You can't allocate 10.5 GB; you either need to swap buffers out or resort to virtual memory (which only really exists for things like CUDA). See how easy it is to be below the limit and still experience problems?
> 
> Memory transfers and allocations are also asynchronous, meaning they don't necessarily take place when they are invoked. Because of this, the driver will try to clump together as many allocations and transfers as possible, as early as possible, based on how much memory is free. Why? Allocations are really expensive and can cause some really ugly halts; that's the infamous "stutter" people experience when VRAM usage gets close to the maximum available, as the driver keeps allocating and deleting buffers. This is also why GPUs with more memory allocate more buffers earlier: to minimize the performance hit.



Thanks for setting that eternal misguided one-liner straight. I hope it's the last time... obviously it won't be


----------



## TumbleGeorge (Oct 16, 2020)

Yes, my "unrealistic" scenario will become more and more realistic in the near future. But Jason said something like "Nvidia guarantees that the 3080 *10GB* is enough and adequate today, and for 3-4 years into the future, for all games". That is not an exact quote! Maybe I'm misremembering what he said?


----------



## londiste (Oct 16, 2020)

Vayra86 said:


> Another real scenario, coming soon, is that the console VRAM budget (even just for games) will exceed 10GB by a good margin, a margin developers will most certainly use sooner rather than later. It's realistic to expect console ports to exceed 10GB within the next three years. We saw a similar thing with the PS4 and games needing upwards of 4GB.


Nope. It is not realistic to expect console ports to exceed 10GB.

Both consoles have 16GB RAM; 2-3 GB will be reserved for the system, and a couple GB will be used as game RAM. The best example here is the Xbox Series X, with 10GB of faster RAM and 6GB of slightly slower RAM.
There are going to be some edge cases where more is squeezed out of the memory allocation, but this is not going to be significant.
The Xbox Series S as a baseline may also pose a problem: it has a total of 10GB RAM, of which 2GB is quite slow in VRAM terms.

The Xbox One and PS4 had 8GB total memory, of which 5.5GB was available for games, covering both RAM and VRAM purposes. The Xbox One X got more memory - 16GB - which was used primarily as VRAM for higher resolution and textures/assets. Compared to the Xbox One X, the new generation is not going to have any more memory available.


----------



## Vayra86 (Oct 16, 2020)

londiste said:


> Nope. It is not realistic to expect console ports to exceed 10GB.
> 
> Both consoles have 16GB RAM; 2-3 GB will be reserved for the system, and a couple GB will be used as game RAM. The best example here is the Xbox Series X, with 10GB of faster RAM and 6GB of slightly slower RAM.
> There are going to be some edge cases where more is squeezed out of the memory allocation, but this is not going to be significant.
> ...



Slower RAM... and ways to pull data out of storage a lot faster than the PC will have anytime soon, or at least, than we have today. Maybe you're right, but even if the consoles cap at 10GB, that means you're stuck at a hard limit of console image quality on a 700-dollar GPU. Yay? I thought we bought PCs exactly to do a little bit more.

Another thing of note, though I don't deem it likely here: remember how an additional core was unlocked for devs after the last-gen console launch. These things are not set in stone... and if unlocking resources can provide a competitive advantage...


----------



## londiste (Oct 16, 2020)

Vayra86 said:


> Slower RAM... and ways to pull data out of storage a lot faster than the PC will have anytime soon, or at least, than we have today.


This is what RAM is for on the PC.


----------



## TumbleGeorge (Oct 16, 2020)

londiste said:


> This is what RAM is for on the PC.


For sharing with a GPU that has run out of onboard VRAM? Yes, but system memory is much slower, so we'd have the same kind of issue as the GTX 970 with its 0.5GB of slower-than-the-rest memory, only more pronounced?!


----------



## londiste (Oct 16, 2020)

TumbleGeorge said:


> For sharing with a GPU that has run out of onboard VRAM? Yes, but system memory is much slower, so we'd have the same kind of issue as the GTX 970 with its 0.5GB of slower-than-the-rest memory, only more pronounced?!


The quote was also relevant. 
When it comes to pulling data out of storage, going through system RAM is easy enough on the PC. From RAM to the GPU it's PCIe x16, which is plenty fast even compared to the consoles' special compressed-SSD bandwidth numbers.
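For rough orders of magnitude, here is a back-of-envelope sketch (the figures are rounded public vendor numbers from around 2020, so treat them as approximate):

```python
# PCIe 4.0: 16 GT/s per lane, 128b/130b encoding overhead, 16 lanes
pcie4_x16_gbs = 16 / 8 * (128 / 130) * 16     # ~31.5 GB/s host-to-GPU

# PS5 SSD: 5.5 GB/s raw; Sony quotes roughly 8-9 GB/s typical
# once hardware decompression is factored in
ps5_ssd_raw_gbs = 5.5
ps5_ssd_compressed_gbs = 9.0

# The PCIe link comfortably exceeds even the compressed figure
assert pcie4_x16_gbs > ps5_ssd_compressed_gbs
print(f"PCIe 4.0 x16: ~{pcie4_x16_gbs:.1f} GB/s")
```

So the link from system RAM to the GPU is not the bottleneck in this comparison; the question is whether the data is staged in RAM in time.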


----------



## TumbleGeorge (Oct 16, 2020)

londiste said:


> The quote was also relevant.
> When it comes to pulling data out of storage going through system RAM is easy enough on the PC. From RAM to GPU that is PCI-e x16 which is plenty fast even when compared to console special compressed SSD bandwidth numbers.


Mainstream system RAM tops out at dual channel. Dual-channel DDR4-3200 gives up to 51.2GB/s. Compare that with the speed of the GDDR6X on the RTX 3080. If system RAM is used only for the CPU part of the game's work, that's maybe not a problem, but if it has to serve as shared VRAM to sustain the GPU's work... there is a problem. The RTX 3080 is Nvidia's mainstream flagship in 2020, not a super-budget card like the GeForce 6200 TurboCache in 2004.
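The arithmetic behind that comparison, as a quick sanity check (standard spec numbers: 8 bytes per DDR4 channel transfer, and the 3080's 19 Gbps GDDR6X on a 320-bit bus):

```python
# Dual-channel DDR4-3200: 3200 MT/s x 8 bytes per transfer x 2 channels
ddr4_3200_dual_gbs = 3200e6 * 8 * 2 / 1e9    # = 51.2 GB/s

# RTX 3080 GDDR6X: 19 Gbps per pin x 320-bit bus
rtx3080_vram_gbs = 19e9 * 320 / 8 / 1e9      # = 760.0 GB/s

# VRAM is roughly 15x faster than dual-channel system RAM, which is
# why spilling over the bus into system memory hurts so much
print(f"{rtx3080_vram_gbs / ddr4_3200_dual_gbs:.1f}x")
```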


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> Yes, my "unrealistic" scenario will become more and more realistic in the near future. But Jason said something like "Nvidia guarantees that the 3080 *10GB* is enough and adequate today, and for 3-4 years into the future, for all games". That is not an exact quote! Maybe I'm misremembering what he said?


Of course it's gonna happen in the future, when GPUs become more powerful and 4K becomes more ubiquitous, but not as things stand at present.
Your quoting of Jason (who the hell is he?) does nothing for your argument about video memory, nor is it related to what we were discussing.
So... Who the hell is Jason?


londiste said:


> Nope. It is not realistic to expect console ports to exceed 10GB.
> 
> Both consoles have 16GB RAM; 2-3 GB will be reserved for the system, and a couple GB will be used as game RAM. The best example here is the Xbox Series X, with 10GB of faster RAM and 6GB of slightly slower RAM.
> There are going to be some edge cases where more is squeezed out of the memory allocation, but this is not going to be significant.
> ...


Just to be clear here, the split RAM on the new Xbox Series consoles is not the same arrangement as on some previous-gen consoles such as the Xbox One, PS3, PS2, PS Vita, PSP, etc., or even the PC. Both pools, the XSX's 10+6 GB and the XSS's 8+2 GB, are still used as system/game memory and as video memory; it's just a matter of priority as to which assets get the faster, larger pool and which get the other. Also, both the XSX and PS5 set aside 13.5GB of RAM for games and other applications, while the XSS gets 7.5GB, with the Xbox system software drawing only from the slower pool.
Also, the Xbox One X didn't have 16GB of RAM; it had 12GB, with 9.5GB set aside for games and other applications.


Vayra86 said:


> Another thing of note, though I don't deem it likely here: remember how an additional core was unlocked for devs after the last-gen console launch.


I don't think that happened. Unless you're talking about the Boost feature, which was more about unlocking additional potential of the overall system (CPU, GPU, and RAM), not really an additional core.


----------



## Mussels (Oct 16, 2020)

By the time 10GB is a limit, we'll be two to three GPU generations further in.


Jesus, people can still game happily on 4GB cards now, and only NOW is 8GB 'normal' enough for game devs to optimise for it.


----------



## londiste (Oct 16, 2020)

Rei said:


> Also, the Xbox One X didn't have 16GB of RAM; it had 12GB, with 9.5GB set aside for games and other applications.


Good correction on Xbox One X RAM, I have no idea how I mixed that up.


TumbleGeorge said:


> Mainstream system RAM tops out at dual channel. Dual-channel DDR4-3200 gives up to 51.2GB/s. Compare that with the speed of the GDDR6X on the RTX 3080. If system RAM is used only for the CPU part of the game's work, that's maybe not a problem, but if it has to serve as shared VRAM to sustain the GPU's work... there is a problem. The RTX 3080 is Nvidia's mainstream flagship in 2020, not a super-budget card like the GeForce 6200 TurboCache in 2004.


Again, please read the quote my comment was about. It was about pulling stuff out of storage.


----------



## Zach_01 (Oct 16, 2020)

I do not believe that 10GB is a limiting factor today for 4K gaming. But we don't know what will happen next year. For users replacing their GPU every 1-2 years this won't matter. For those keeping it 4-5 years or even more... ...?
If VRAM fills up and the game needs more, isn't data then moved over to system RAM?
Maybe PCIe speeds and DDR5 become relevant at some point (not now).

Anyway, Nvidia has played the "go to the higher-cost card" game to have it all. It will probably charge $200 more for the 20GB variant, and again Nvidia buyers will pay $1000-1200 for the high-end cards, and the dream of having a high-end GPU at a $700 price tag will remain a dream for most people. And we are back to the 2080 Ti era...


----------



## londiste (Oct 16, 2020)

Rei said:


> Vayra86 said:
> 
> 
> > Another thing of note, though I don't deem it likely here: remember how an additional core was unlocked for devs after the last-gen console launch. These things are not set in stone... and if unlocking resources can provide a competitive advantage...
> ...


Access to the 7th core (with caveats: a shared core) was given a couple of years after the consoles were released, in the 2015-ish timeframe.


----------



## TumbleGeorge (Oct 16, 2020)

Mussels said:


> By the time 10GB is a limit, we'll be two to three GPU generations further in.
> 
> 
> Jesus, people can still game happily on 4GB cards now, and only NOW is 8GB 'normal' enough for game devs to optimise for it.


8GB did not start with the GTX 1080. There have been 8GB cards since the 290X 8GB, which launched in 2014... or is this just marketing? That is a little more than 6 years ago. Software development is not a fast process, but 6 years! After that, 8GB cards continued with the 390/390X, the 480 8GB, and so on until today.

Next: I think Jason Huang has enough lawyers to defend him when he fails. In my opinion, pushing a "flagship" onto the market with less video memory than the two previous generations of flagships from the same company is not something for the person who made that call to be proud of. Excuses such as how much faster GDDR6X is compared with GDDR6, I do not accept.


----------



## londiste (Oct 16, 2020)

TumbleGeorge said:


> Next: I think Jason Huang has enough lawyers to defend him when he fails. In my opinion, pushing a "flagship" onto the market with less video memory than the two previous generations of flagships from the same company is not something for the person who made that call to be proud of.


Looking at the history, how did AMD fare with Fiji? That must have been awful, right?


----------



## TumbleGeorge (Oct 16, 2020)

londiste said:


> Looking at the history, how did AMD fare with Fiji? That must have been awful, right?


Agree!


----------



## ThrashZone (Oct 16, 2020)

Hi,
If Nvidia had filled all 11 memory slots on the 3080, it would be right at the recommended spec for maxed settings.


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> Agree!


LOL! Well, that went over your head.... 


TumbleGeorge said:


> 8GB did not start with the GTX 1080. There have been 8GB cards since the 290X 8GB, which launched in 2014... or is this just marketing? That is a little more than 6 years ago. Software development is not a fast process, but 6 years! After that, 8GB cards continued with the 390/390X, the 480 8GB, and so on until today.
> 
> Next: I think Jason Huang has enough lawyers to defend him when he fails. In my opinion, pushing a "flagship" onto the market with less video memory than the two previous generations of flagships from the same company is not something for the person who made that call to be proud of. Excuses such as how much faster GDDR6X is compared with GDDR6, I do not accept.


Just like your "English" and reasoning went over my head. Also, was there ever a "Jason Huang" working at Nvidia?
@londiste Can you please clarify TumbleGeorge's quote for me, since you seemed to get it.


----------



## ThrashZone (Oct 16, 2020)

Hi,
Misuse of the term: the product is being called a flagship instead of what the 3080 really is, mainstream.


----------



## EarthDog (Oct 16, 2020)

ThrashZone said:


> Hi,
> Misuse of the term: the product is being called a flagship instead of what the 3080 really is, mainstream.


What does this mean? The 3080 isn't the mainstream card. Even when/if a 3080 Ti comes out, that is still the 3rd card down the stack. I'd call the 3070 and 3060 mid-range... but few would consider the xx80 mid-range over the last 3 generations.


----------



## ThrashZone (Oct 16, 2020)

EarthDog said:


> What does this mean? The 3080 isn't the mainstream card. Even when/if a 3080 Ti comes out, that is still the 3rd card down the stack. I'd call the 3070 and 3060 mid-range... but few would consider the xx80 mid-range over the last 3 generations.


Hi,
Mainstream is usually the most-sold card.
The 3090 will never be it.
The 3080 in the $700 US range may not be it either, but it's a lot closer than calling the 3090 mainstream; the 3090 is actually top of the line, more commonly called the flagship.


----------



## EarthDog (Oct 16, 2020)

ThrashZone said:


> Hi,
> Mainstream is usually the most-sold card.
> The 3090 will never be it.
> The 3080 in the $700 US range may not be it either, but it's a lot closer than calling the 3090 mainstream; the 3090 is actually top of the line, more commonly called the flagship.


Mainstream is where the most cards are sold, but that won't be the 3080. Nvidia called the 3080 its flagship, while the 3090 is a Titan replacement. The 3070 and 3060 I would call mid-range cards today. 

But yeah, not sure who you were talking to, but if they said the 3090 is mainstream, that's hilarious.


----------



## Rei (Oct 16, 2020)

EarthDog said:


> Mainstream is where the most cards are sold, but that won't be the 3080. Nvidia called the 3080 its flagship, while the 3090 is a Titan replacement. The 3070 and 3060 I would call mid-range cards today.
> 
> But yeah, not sure who you were talking to, but if they said the 3090 is mainstream, that's hilarious.





EarthDog said:


> What does this mean? The 3080 isn't the mainstream card. Even when/if a 3080 Ti comes out, that is still the 3rd card down the stack. I'd call the 3070 and 3060 mid-range... but few would consider the xx80 mid-range over the last 3 generations.





ThrashZone said:


> Hi,
> Mainstream is usually the most-sold card.
> The 3090 will never be it.
> The 3080 in the $700 US range may not be it either, but it's a lot closer than calling the 3090 mainstream; the 3090 is actually top of the line, more commonly called the flagship.


I dunno if Nvidia has an official flagship for the Ampere line, but I'd call the 3080 the mainstream card, the 3090 the enthusiast card, the 3060 the mid-range card, and the 3070 prolly the 2nd mainstream card. That's how it is in my book, anyway, but I could always rewrite it pending further developments after more releases come out.


----------



## ThrashZone (Oct 16, 2020)

EarthDog said:


> Mainstream is where the most cards are sold, but that won't be the 3080. Nvidia called the 3080 its flagship, while the 3090 is a Titan replacement. The 3070 and 3060 I would call mid-range cards today.
> 
> But yeah, not sure who you were talking to, but if they said the 3090 is mainstream, that's hilarious.


Hi,
Nothing else has been released though, only the 3080 and 3090.
But when there are only two cards released, only one can be the flagship, and the other, cheaper one has to be mainstream, lol.
We can likely agree the 3080 was a mainstream third-party gougers' card, lol.


----------



## EarthDog (Oct 16, 2020)

Rei said:


> I dunno if Nvidia has an official flagship for the Ampere line, but I'd call the 3080 the mainstream card, the 3090 the enthusiast card, the 3060 the mid-range card, and the 3070 prolly the 2nd mainstream card. That's how it is in my book, anyway, but I could always rewrite it pending further developments after more releases come out.


I wouldn't. Again, Nvidia themselves called it the flagship (until a 3080 Ti releases). The 3090 is the Titan replacement (not called a Titan, but a Titan in most respects nonetheless); it's a crossover card (gaming/Quadro). The 3080/3080 Ti is the enthusiast/flagship tier, the 3070/3060 are mainstream, and anything less would be considered budget/entry level. 


ThrashZone said:


> Hi,
> Nothing else has been released though, only the 3080 and 3090.
> But when there are only two cards released, only one can be the flagship, and the other, cheaper one has to be mainstream, lol.
> We can likely agree the 3080 was a mainstream third-party gougers' card, lol.


See the forest for the trees, my guy. Obviously the product stack will fill out. Judging the lineup as if the full stack will never exist is a myopic POV.


----------



## rtwjunkie (Oct 16, 2020)

EarthDog said:


> What does this mean? The 3080 isn't the mainstream card. Even when/if a 3080 Ti comes out, that is still the 3rd card down the stack. I'd call the 3070 and 3060 mid-range... but few would consider the xx80 mid-range over the last 3 generations.


True, but he's got a point. The xx80 cards are high-end, not the flagship though. Just like a naval fleet, there can be only one (flagship). No pun intended, until I got to the end and saw what I did.


----------



## ThrashZone (Oct 16, 2020)

Hi,
The flagship 3080 will have 11GB; call it a 3080 Super-duper or Ti, maybe, though I doubt it.
If the Ti follows the 3070 Ti with 16GB, I'm not sure why a 3080 Ti wouldn't have 16GB too.


----------



## Rei (Oct 16, 2020)

EarthDog said:


> I wouldn't. Again, Nvidia themselves called it the flagship (until a 3080 Ti releases). The 3090 is the Titan replacement (not called a Titan, but a Titan in most respects nonetheless); it's a crossover card (gaming/Quadro). The 3080/3080 Ti is the enthusiast/flagship tier, the 3070/3060 are mainstream, and anything less would be considered budget/entry level.


I agree that the 3090 is a Titan replacement (even the pricing strategy is similar), but considering it's not called Titan and has been marketed towards mainstream consumers, it may now be defined as mainstream; though at double the 3080's price it's definitely not mainstream, so for me that is up in the air.



ThrashZone said:


> Hi,
> The flagship 3080 will have 11GB; call it a 3080 Super-duper or Ti, maybe, though I doubt it.
> If the Ti follows the 3070 Ti with 16GB, I'm not sure why a 3080 Ti wouldn't have 16GB too.


I do wish Nvidia would drop the Ti and Super sub-branding. It's not appealing to me. Just stick to the numbered naming.


----------



## ThrashZone (Oct 16, 2020)

Hi,
Well, since the 3080 is half the price, it has to be the mainstream card, because nothing cheaper has been released.
But yes, normally, if there were other options, the $400-500 range would be closer to mainstream.


----------



## rtwjunkie (Oct 16, 2020)

Rei said:


> I agree that the 3090 is a Titan replacement (even the pricing strategy is similar), but considering it's not called Titan and has been marketed towards mainstream consumers, it may now be defined as mainstream; though at double the 3080's price it's definitely not mainstream, so for me that is up in the air.
> 
> 
> I do wish Nvidia would drop the Ti and Super sub-branding. It's not appealing to me. Just stick to the numbered naming.


Ti’s have been around forever, at least as far as I remember, including when Ti were the first letters on a card name (Ti-4200, for example).


----------



## ThrashZone (Oct 16, 2020)

Hi,
Indeed, I'll be holding off for either an 11GB or 16GB 3080.


----------



## Rei (Oct 16, 2020)

rtwjunkie said:


> Ti’s have been around forever, at least as far as I remember, including when Ti were the first letters on a card name (Ti-4200, for example).


I know...  
I'm just hoping that they would drop it soon or now with the 3000 series. Just go with something like 3065, 3075, 3085, 3095, 9999 or even 666 if the performance comes in between two existing cards.
I vote for 666, BTW... The Nvidia GeForce RTX 666!!!


----------



## ThrashZone (Oct 16, 2020)

Hi,
The last Titan had 24GB, so saying the 20GB 3090 is its replacement might be a stretch.


----------



## dirtyferret (Oct 16, 2020)

Mussels said:


> By the time 10GB is a limit, we'll be two to three more generations of GPU's in
> 
> 
> Jesus people can still game happily on 4GB cards now, and only NOW now is 8GB 'normal' enough for game devs to optimise for it


Yes but how can I justify the 12GB of ram in my GPU if I don't immediately state anything less than 12GB is no longer capable of performing even basic gaming the very second I install my new GPU?  Just be glad I haven't installed 64GB of system RAM and then decided it's a fact you can't possibly game with anything less.


----------



## moproblems99 (Oct 16, 2020)

TumbleGeorge said:


> 8GB did not start with the GTX 1080. 8GB cards have existed since the 290X 8GB, which launched in 2014... Is this just marketing? That is a little more than 6 years ago. Software development is not a fast process, but 6 years! After that, 8GB cards continued with the 390/390X, the 480 8GB, and so on to today.



It has nothing to do with when 8GB cards were first introduced.  It has everything to do with when 8GB becomes mainstream.  8GB mainstream cards have only been a thing for a couple of years.

I wish people would realize that most things (especially software) don't target the extremes.  They target the average.  Why?  Because what good does it do to spend years developing stuff that only a tiny fraction of the audience can play?


----------



## birdie (Oct 16, 2020)

People from TPU should remind themselves once in a while what the common denominator in graphics is:



Steam Hardware & Software Survey


Out of the top 20 cards, most feature 6GB of VRAM or less.

And game developers do pay attention to that unless they want to release a game which will tank because its reviews will be horrible due to unrealistic requirements.

I do understand this is a forum for tech enthusiasts but just remember that the world doesn't revolve around you and your purchasing power.

/thread


----------



## dirtyferret (Oct 16, 2020)

ThrashZone said:


> Hi,
> Misuse of the term flagship for the product, instead of what the 3080 really is: mainstream.


You keep using the word "Mainstream".  I do not think it means what you think it means.



birdie said:


> I do understand this is a forum for tech enthusiasts but just remember that the world doesn't revolve around you and your purchasing power.



This is a discussion of $2,000 PCs that we will declare outdated in six to nine months.  Please leave common sense statements such as yours at the door.


----------



## TumbleGeorge (Oct 16, 2020)

Mainstream covers every card from a generation that is not marked (and priced expensively) as enthusiast, semi-pro, or professional. They are differentiated by the number of CUs, the size of the VRAM, and the speed of the VRAM (which depends on the speed and number of chips and on the bus bandwidth), and based on those characteristics they are placed into different mainstream subclasses by price.


----------



## Zach_01 (Oct 16, 2020)

Am I wrong?
Because I thought mainstream GPU cards were around $200+...

The idea that has arisen in the last 3~5 years, that mainstream GPU cards cost $400~500, is a distortion of reality.
Most people around the world buy a sub-$300 GPU for gaming.

We must not judge things by the users in here or on any other forum.


----------



## EarthDog (Oct 16, 2020)

rtwjunkie said:


> True, but he’s got a point. The xx80 are high range. Not the flagship though. Just like a naval fleet, there can be only one (flagship). No pun intended until I got to the end and saw what I did.


xx80 isn't mid-range... it hasn't been for 3 generations. It is literally the second card down the stack of several... it isn't mid-range. Some may want to argue the 3090 is the flagship and that is fine, that still makes it the second card down the stack and not mid-range.



birdie said:


> /thread


lol, this thread isn't even about that so, yeah.....


ThrashZone said:


> Hi,
> The last Titan had 24GB, so saying the 20GB 3090 is its replacement might be a stretch.


lol, no.


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> Mainstream covers every card from a generation that is not marked (and priced expensively) as enthusiast, semi-pro, or professional. They are differentiated by the number of CUs, the size of the VRAM, and the speed of the VRAM (which depends on the speed and number of chips and on the bus bandwidth), and based on those characteristics they are placed into different mainstream subclasses by price.


Number of CUs? As in Compute Units? That is an AMD term & doesn't apply to Nvidia. Segmentation is also not based on the size or speed of the VRAM, as those can vary up or down between generations & GPUs, but on the overall performance of the GPU. That is determined by the fillrate & FLOPS of the GPU, which are calculated from the core clock & the core configuration.
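As a rough sketch of that arithmetic (using the 3080's published figures of 8704 CUDA cores and a ~1.71 GHz boost clock, and counting an FMA as two operations):

```python
def peak_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical FP32 peak: 2 ops per FMA * cores * clock (GHz), in TFLOPS."""
    return 2 * cuda_cores * boost_clock_ghz / 1000

# RTX 3080 reference figures: 8704 CUDA cores at ~1.71 GHz boost.
print(round(peak_tflops(8704, 1.71), 1))  # → 29.8
```

That matches Nvidia's advertised ~29.8 shader TFLOPS; real game performance obviously also depends on fillrate, bandwidth, and how well the shaders are fed.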


----------



## ThrashZone (Oct 16, 2020)

Hi,
The way Nvidia has been milking versions, you'd have to be living in a cave to think they won't fill the price gap between the 3080 and 3090, and push past the 3090's price again too.


----------



## rtwjunkie (Oct 16, 2020)

EarthDog said:


> xx80 isn't mid-range... it hasn't been for 3 generations. It is literally the second card down the stack of several... it isn't mid-range. Some may want to argue the 3090 is the flagship and that is fine, that still makes it the second card down the stack and not mid-range.


It sounds like you’re arguing with me, yet you said what I just said.


----------



## TheoneandonlyMrK (Oct 16, 2020)

MxPhenom 216 said:


> Devs do not have access to all that 16gb for their games. I think around 3-4gb is reserved for the OS and background processes.
> 
> Also these days, I'm not so sure games are exactly developed for consoles first 100% of the time anymore. And ports are kind of a thing of the past now that everything is basically x86 and Windows (PS being an exception to this)


Yes, but then the PCMR crowd requires *better* than consoles, and that's going to cost.
Saying consoles can only use so-and-so is meaningless to a PC owner, who can often scale a few sliders higher, force crazier options on via hacks or drivers, or add shader and game mods. 10GB? It depends how long you're keeping it and what you're going to do with it; it's not an easy, simple answer, whatever people skewed by their own perspective will say.

It depends.


----------



## TumbleGeorge (Oct 16, 2020)

Rei said:


> Number of CUs? As in Compute Units? That is an AMD term & doesn't apply to Nvidia. Segmentation is also not based on the size or speed of the VRAM, as those can vary up or down between generations & GPUs, but on the overall performance of the GPU. That is determined by the fillrate & FLOPS of the GPU, which are calculated from the core clock & the core configuration.


Ok just add number of cuda core or something that is in characteristic of card with Nvidia GPU. I'm not pretend for perfect description. Core clock different than referent, espetially of graphic cards depending of decisions of assemblators of cards with non referent design...and of decision for additional overclock from card owners.


----------



## Rei (Oct 16, 2020)

TumbleGeorge said:


> Ok just add number of cuda core or something that is in characteristic of card with Nvidia GPU. I'm not pretend for perfect description. Core clock different than referent, espetially of graphic cards depending of decisions of assemblators of cards with non referent design...and of decision for additional overclock from card owners.


Ok, I'm sorry. I know English isn't your main language, but I'm having difficulty understanding your sentence. I'll reply again after someone else elaborates on your words or after you have restructured your sentence.


----------



## TumbleGeorge (Oct 17, 2020)

I'm bumping this thread because today there are owners of the RTX 3080 with their own real-world home experience of it. I hope they will share their VRAM usage results here. I want to know if there are any discrepancies between real-world results and the results published in official reviews.


----------



## rtwjunkie (Oct 17, 2020)

TumbleGeorge said:


> I'm bumping this thread because today there are owners of the RTX 3080 with their own real-world home experience of it. I hope they will share their VRAM usage results here. I want to know if there are any discrepancies between real-world results and the results published in official reviews.


You simply need to go back and look at the AAA game tests that W1zzard has done in the last year (there are 4 or 5 of them) to see exactly how much VRAM actually gets used.  He has tested for that specifically.


----------



## TumbleGeorge (Oct 17, 2020)

rtwjunkie said:


> You simply need to go back and look at the AAA game tests that W1zzard has done in the last year (there are 4 or 5 of them) to see exactly how much VRAM actually gets used.  He has tested for that specifically.


Last year there was no RTX 3080 10GB, but thanks, I will search for W1zzard's results...

...there is something else that specifically interests me: tests with PC games that rarely, or never, get tested in official reviews, for various political... uh, marketing reasons. It would also be interesting to test gameplay not of the vanilla game, but with various heavy mods with high-resolution textures and highly detailed objects. There are people who love to play with more than what the game's authors offer: more beautiful and detailed visualization of the characters and the background. A kind of, hmm... remaster by consumers/users, not by professionals. And to see how these "charms" affect the amount of VRAM used on an RTX 3080 10GB.

PS. I read review of RTX 3090 Strix and see that:



> At a higher resolution, VRAM usage goes up


  

How does this affect the RTX 3080 10GB?
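For owners who want to report real-world numbers, here is a minimal sketch of reading per-GPU memory usage via `nvidia-smi` (this assumes the standard NVIDIA driver tools are installed; the parsing helper is my own, not part of any official API):

```python
import subprocess

# nvidia-smi query for per-GPU memory usage in MiB, without header or units.
QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_vram(csv_line: str) -> tuple[int, int]:
    """Parse one 'used, total' CSV line from nvidia-smi into MiB integers."""
    used, total = (int(v.strip()) for v in csv_line.split(","))
    return used, total

def current_vram(gpu_index: int = 0) -> tuple[int, int]:
    """Return (used, total) MiB for one GPU; needs the NVIDIA driver installed."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_vram(out.stdout.splitlines()[gpu_index])
```

Keep in mind the usual caveat from reviews: this reports memory *allocated* by applications, which is not the same as memory actually *needed*.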


----------



## EarthDog (Oct 17, 2020)

TumbleGeorge said:


> How does this affect the RTX 3080 10GB?


The same as on the 3090.

10gb is fine for even 4k for the next couple of years. People are making a bigger deal about this than it really is I feel. 

So many consumers will be fooled by a more is better premise in the coming weeks.


----------



## ThrashZone (Oct 17, 2020)

Hi,
Yep, saw a report on a wretched website: a 3080 20GB at $1k US in November.
Go there at your own ad-block... risk

```
https://www.thefpsreview.com/2020/10/14/geforce-rtx-3080-20-gb-to-cost-999-nvidia-reportedly-flooding-gpu-market-in-november-to-wash-away-amd/
```
It said Nvidia is going to flood the market before the AMD release, lol.

I still think there will be a 12GB Super Duper version.


----------



## TumbleGeorge (Oct 17, 2020)

EarthDog said:


> The same as on the 3090.
> 
> 10gb is fine for even 4k for the next couple of years. People are making a bigger deal about this than it really is I feel.
> 
> So many consumers will be fooled by a more is better premise in the coming weeks.


We will see over the next 3-4 years who is right. Save your promised comment somewhere.
But you are right that after 4 years the 3080 10GB will still be enough for browser needs, playing movies, and casual gaming. AAA games may list the RTX 3080 10GB in their minimum system requirements because of its lack of VRAM, not because of GPU performance.


----------



## Zach_01 (Oct 17, 2020)

TumbleGeorge said:


> We will see over the next 3-4 years who is right. Save your promised comment somewhere.
> But you are right that after 4 years the 3080 10GB will still be enough for browser needs, playing movies, and casual gaming. AAA games may list the RTX 3080 10GB in their minimum system requirements because of its lack of VRAM, not because of GPU performance.


Here's my stand on the subject. 3080 10GB cards are already really close to, or past, the $1000 point. Right at the AMD GPU release, nVidia will probably flood the market with 3070s and 20GB 3080s at a $1000 MSRP, and no one will care about the 10GB version anymore.
Mission accomplished.
nVidia sells the flagship GPU for $1100~1200. Again. It was never going to offer a $700 flagship with 30% more performance than the $1200 2080Ti. At that price ($700) no one at nVidia is happy, and I doubt they're even making a decent profit margin.


----------



## ThrashZone (Oct 17, 2020)

Hi,
Yeah, with the memory slots available, 10GB really makes no sense.
11, or maybe 12, makes the most sense, but 10GB is stupid. Come on AMD, drop it already.


----------



## Zach_01 (Oct 17, 2020)

Almost everyone in this thread has said that 10GB is OK for 4K today and for the next 1 or 2 years. I think you are the one saying it isn't.

But you missed the point.
What we're trying to say is that with the next cards they will insist we need more VRAM and will try to make 10GB obsolete, because they want to sell at higher prices. The higher a GPU's MSRP, the bigger their profit margin. That's why the 3080 initially released with 10GB and not 11 or 12. It has 1GB less than the 2080Ti, and that was not random: it sets up the card they really want to sell you afterwards.
That is marketing, and Jensen is a mastermind at it. Hats off.


----------



## ThrashZone (Oct 17, 2020)

Hi,
If you fail to notice the 4K ultimate-settings spec recommendations, then that's on you, lol.
But yeah, if you can't get the best internet speed, it's all wasted.


----------



## Zach_01 (Oct 17, 2020)




----------



## arbiter (Oct 17, 2020)

10GB I think is fine for 1080p and 1440p, but 4K is a touchier number. I know some people will say "but consoles have 16GB" without considering that those consoles are pushing 4K. That 16GB is SHARED RAM, so it has to hold the OS, the data the CPU needs, and the data the GPU needs. If 16GB is enough for a shared system like that, then 10GB of dedicated memory should be fine, since a PC also has 16GB of system RAM just for the CPU side.
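A toy budget comparison along those lines (the OS reserve is the ~3-4GB estimate mentioned earlier in the thread, and the CPU-side figure is my own assumption; neither is an official number):

```python
# Toy memory-budget comparison; all figures are assumptions, not official.
CONSOLE_TOTAL_GB = 16.0   # shared GDDR6 pool on the new consoles
OS_RESERVE_GB = 3.5       # thread estimate: ~3-4 GB reserved for OS/background
CPU_SIDE_GAME_GB = 3.0    # assumed CPU-side game data living in the same pool

console_graphics_gb = CONSOLE_TOTAL_GB - OS_RESERVE_GB - CPU_SIDE_GAME_GB
pc_dedicated_vram_gb = 10.0  # RTX 3080; CPU data lives in separate system RAM

print(console_graphics_gb)   # 9.5 GB left for graphics on the console
print(pc_dedicated_vram_gb)  # 10.0 GB dedicated on the PC side
```

Under those assumptions, the console's graphics budget lands at roughly the same 10GB mark as the 3080's dedicated VRAM.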


----------



## EarthDog (Oct 17, 2020)

The only reason I'd get a 3080 20gb is if I was planning on 4k gaming on this card for several years. If my normal life cycle of a gpu is 3 years or so, 10gb will be enough.


----------



## moproblems99 (Oct 18, 2020)

TumbleGeorge said:


> The only reason I'd get a 3080 20gb is if I was planning on 4k gaming on this card for several years. If my normal life cycle of a gpu is 3 years or so, 10gb will be enough.



And if your cycle is longer, the 3080 won't be brute enough anyway.


----------



## TumbleGeorge (Oct 18, 2020)

moproblems99 said:


> And if your cycle is longer, the 3080 won't be brute enough anyway.


Wrong quote, my friend. Those words are not mine. You cited EarthDog's comment with my nickname as the author.

On topic: the 3080 10GB over a 3-year life cycle:
1. First year - hot, like first love
2. Second year - neutral
3. Third year - I HATE IT


----------



## EarthDog (Oct 18, 2020)

TumbleGeorge said:


> Wrong quote, my friend. Those words are not mine. You cited EarthDog's comment with my nickname as the author.
> 
> On topic: the 3080 10GB over a 3-year life cycle:
> 1. First year - hot, like first love
> ...


lol, whatever man...You're delusional.


----------



## Parsian (Oct 19, 2020)

So is 10GB enough for 4K ultra-settings gaming for at least 2-3 years?


----------



## Rei (Oct 19, 2020)

Parsian said:


> So is 10GB enough for 4K ultra-settings gaming for at least 2-3 years?


Short Answer: Yes, it's enough.


----------



## ThrashZone (Oct 19, 2020)

Parsian said:


> So is 10GB enough for 4K ultra-settings gaming for at least 2-3 years?


Hi,
Actually, you'd need to look at each game's recommended specs for those preferences.
Not all are the same; a blanket "yes" is misleading.
Watch Dogs... says 11GB at those options, plus notice the internet speed requirement too.


----------



## Zach_01 (Oct 19, 2020)

Let’s talk a little constructively here... 

What happens if, in the future (1-2 years), a game at 4K absolutely needs 12~13GB of VRAM? What will happen on a card with 10GB of VRAM?
Won’t the extra data (2~3GB) be transferred to the system’s RAM?
So the 3080 is a PCI-E 4.0 card. Won’t that help with all that data transfer back and forth? I know it’s not the same, but here is one way for PCI-E 4.0 to become relevant.

AMD already supports PCI-E 4.0 and Intel will too in a few months. Yes... more expenses... but who can’t do that when buying a $900 GPU? Either that, or buy the $1000+ 20GB.


----------



## EarthDog (Oct 19, 2020)

Zach_01 said:


> So the 3080 is a PCI-E 4.0 card. Won’t that help with all that data transfer back and forth?


Short answer is yes... look at what the 5500XT 4GB/8GB testing on 3.0/4.0....


----------



## ThrashZone (Oct 19, 2020)

Hi,
Short answer is buy a console lol


----------



## Zach_01 (Oct 19, 2020)

EarthDog said:


> Short answer is yes... look at what the 5500XT 4GB/8GB testing on 3.0/4.0....


That’s exactly what I had in mind. Wasn’t intuition...


----------



## EarthDog (Oct 19, 2020)

Zach_01 said:


> That’s exactly what I had in mind. Wasn’t intuition...


Maybe use a declarative statement instead of a question.


----------



## Zach_01 (Oct 19, 2020)

EarthDog said:


> Maybe use a declarative statement instead of a question.


Indeed... I thought the rest of it was declarative enough, because I was talking as if the answer was yes.
Anyway it’s always best to be clear.


----------



## ThrashZone (Oct 19, 2020)

Hi,
Warming up for Jeopardy lol


----------



## sepheronx (Oct 19, 2020)

I saw this good video on PCIe 3.0 vs 4.0 on a RTX 3090.


----------



## londiste (Oct 19, 2020)

Zach_01 said:


> What happens if, in the future (1-2 years), a game at 4K absolutely needs 12~13GB of VRAM? What will happen on a card with 10GB of VRAM?
> Won’t the extra data (2~3GB) be transferred to the system’s RAM?
> So the 3080 is a PCI-E 4.0 card. Won’t that help with all that data transfer back and forth? I know it’s not the same, but here is one way for PCI-E 4.0 to become relevant.


If a game's VRAM usage is well managed, the first thing to take a hit in the general case is the dynamic texture pool. There is some buffer there, so it will not have a noticeable effect at first: basically more texture streaming in and out of VRAM. If the implementation cannot keep up, the result is pop-in and/or stutter.
PCIe 4.0 x16 is still only 31.5GB/s. That pales in comparison to VRAM bandwidth - 760GB/s in the case of the RTX 3080 - in addition to massively worse latency. It'll help compared to PCIe 3.0, but not all that much in the big picture.
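A back-of-envelope sketch of those two bandwidth figures, for a hypothetical 3GB spill into system RAM (latency and real-world streaming overhead ignored):

```python
PCIE4_X16_GBPS = 31.5   # effective PCIe 4.0 x16 bandwidth, GB/s
VRAM_3080_GBPS = 760.0  # RTX 3080 GDDR6X bandwidth, GB/s

def transfer_ms(gigabytes: float, bandwidth_gbps: float) -> float:
    """Milliseconds to move `gigabytes` once at the given bandwidth."""
    return gigabytes / bandwidth_gbps * 1000.0

# Moving a hypothetical 3 GB overflow once:
print(round(transfer_ms(3, PCIE4_X16_GBPS)))     # ~95 ms over PCIe 4.0
print(round(transfer_ms(3, VRAM_3080_GBPS), 1))  # ~3.9 ms at VRAM speed
```

At 60fps a frame budget is ~16.7ms, which is why spilling even a few gigabytes over the bus shows up as stutter rather than a graceful slowdown.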


----------



## Parsian (Oct 19, 2020)

ThrashZone said:


> Hi,
> Actually, you'd need to look at each game's recommended specs for those preferences.
> Not all are the same; a blanket "yes" is misleading.
> Watch Dogs... says 11GB at those options, plus notice the internet speed requirement too.



What should i do? Should i wait for 20gb version? But i saw a post on reddit:


https://www.reddit.com/r/nvidia/comments/j9xupx


----------



## rtwjunkie (Oct 19, 2020)

Parsian said:


> What should i do? Should i wait for 20gb version? But i saw a post on reddit:
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/j9xupx


You’re actually asking what you should do? Those are the same type of results obtained here. No reason to wait for 20 GB VRAM.


----------



## Rei (Oct 19, 2020)

Parsian said:


> What should i do? Should i wait for 20gb version? But i saw a post on reddit:
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/j9xupx


Don't bother. 20GB of VRAM is mainly beneficial for video editing & production work. Unless you game at 4K, at Ultra settings, with AA (which stands for Anti-Aliasing, NOT Alcoholics Anonymous) & all other graphical fluff at max, you won't get the full benefit of having that much VRAM; and at those settings, you won't likely get close to 60fps anyway. AA doesn't do much for 4K visuals currently, & without AA you won't need a lot of VRAM. Attempting it will likely leave you with wasted cash & regret, & before long you & your GPU will end up at an AA (which this time stands for Alcoholics Anonymous, NOT Anti-Aliasing) meeting.


----------



## FullTank (Oct 20, 2020)

To put it simply: as long as you're playing at 4K or below, and playing optimized games, old or new, the RTX 3080's 10GB of VRAM will be enough for a while.
(Some games can have issues with that amount; more info below.)

I'm a user of four R9 390s and have played a bunch of new games at 1440p & 4K, and some old ones at 4K and fake 6, 8 & 11K (GeDoSaTo). I never ran out of VRAM except a single time (when I used two 390Xs), in Mirror's Edge Catalyst, with CrossFire disabled (the game doesn't support it) and the "compress VRAM for lower-VRAM cards like the 970" option turned off. That pushed my total VRAM usage to around 7.1 or 7.3GB at 1440p (not at the highest settings; it was impossible to play at 2160p, and the lack of VRAM was the issue as long as that compress-VRAM option stayed disabled), so I assume it may in fact go very high at real 4K.

The entire VRAM thing is not really about new vs. old games; it's about the game itself. From time to time I've seen super-unoptimized games eat tons of VRAM. Eco ate about 6GB at launch, then got reduced to 2 after lots of complaints, which is still a ton for such a game. Escape From Tarkov, some months ago, was eating all 8GB at 1080p medium-low with high textures when I lent my GPU to my neighbor's girlfriend, after they asked me to see why their GTX 970 was a slideshow in that game. (She later dumped my neighbor and stole the R9 390X, the 80+ 600W PSU, and the additional fans I had lent; not a thankful person, I swear. Luckily, with the help of the police, I got 190 euros back instead of the lost goods. It was my first time ever going to the police, and it was a good one.) I also wrote a post about this high VRAM usage in Tarkov here, the one from YaroslavZ -




https://www.reddit.com/r/EscapefromTarkov/comments/eu4pdn


----------



## TumbleGeorge (Oct 20, 2020)

All or most of the conclusions so far are based on results from games that are already available, and they are certainly only valid for the moment. 3-4 years is enough time for things to change significantly. There are many 3D game engines, some with new versions... Yes, engines differ in popularity and in how light or heavy the resulting games are. After 2-3-4 years... it is intuition and speculation whether 10GB of VRAM will be enough to play a NEW game at 4K with ultra settings. DLSS eats additional VRAM when in use... Who can be sure enough to write a short answer? Only God himself! Yes, there are J. Huang's promises, but... I never completely trust anyone.


----------



## Parsian (Oct 20, 2020)

rtwjunkie said:


> You’re actually asking what you should do? Those are the same type of results obtained here. No reason to wait for 20 GB VRAM.



I was asking @ThrashZone, because he said "Watch Dogs... says 11GB at those options, plus notice the internet speed requirement too."


----------



## ThrashZone (Oct 20, 2020)

Parsian said:


> What should i do? Should i wait for 20gb version? But i saw a post on reddit:
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/j9xupx


Hi,
Just wait for the AMD release.
It's supposed to have 16GB, will likely be cheaper, and should force Nvidia to drop prices further and release a GPU with more memory.
I'd say 12GB would be fine, and these cards have two blank memory spots waiting for Nvidia to fill without using the back of the card.

lol, yeah, Watch Dogs comes with the cards too; isn't that a pickle.
Streaming internet speed and consistency matter.


----------



## Vayra86 (Oct 20, 2020)

TumbleGeorge said:


> We will see over the next 3-4 years who is right. Save your promised comment somewhere.
> But you are right that after 4 years the 3080 10GB will still be enough for browser needs, playing movies, and casual gaming. AAA games may list the RTX 3080 10GB in their minimum system requirements because of its lack of VRAM, not because of GPU performance.



Lol, well that's a bit much, isn't it? 



TumbleGeorge said:


> All or most of the conclusions so far are based on results from games that are already available, and they are certainly only valid for the moment. 3-4 years is enough time for things to change significantly. There are many 3D game engines, some with new versions... Yes, engines differ in popularity and in how light or heavy the resulting games are. After 2-3-4 years... it is intuition and speculation whether 10GB of VRAM will be enough to play a NEW game at 4K with ultra settings. DLSS eats additional VRAM when in use... Who can be sure enough to write a short answer? Only God himself! Yes, there are J. Huang's promises, but... I never completely trust anyone.



For some perspective, in 3-4 years time we might see some high-end titles that would _prefer_ 12GB VRAM for the highest settings. Additionally, modded games could want upwards of 10GB already today, just add enough assets and bingo. In both cases the card will be plenty fast for the resolution but come up short in the VRAM department... but other than that? 10GB is fine.


----------



## kapone32 (Oct 20, 2020)

The question here is why would Nvidia release a card with double the VRAM weeks after the launch of a card where the majority of orders are still pending?


----------



## Zach_01 (Oct 20, 2020)

“Broken record at sight...” =me


kapone32 said:


> The question here is why would Nvidia release a card with double the VRAM weeks after the launch of a card where the majority of orders are still pending?


Because Ampere was not ready for September; nVidia just wanted to intercept RDNA2, and in addition they never really wanted to sell a $700 flagship. More like $1000~1200...


----------



## kapone32 (Oct 20, 2020)

Zach_01 said:


> “Broken record at sight...” =me
> 
> Because Ampere was not ready for September; nVidia just wanted to intercept RDNA2, and in addition they never really wanted to sell a $700 flagship. More like $1000~1200...


Hahaha indeed and it's not like the 3090 is so much faster than the  current 3080


----------



## rtwjunkie (Oct 20, 2020)

Parsian said:


> I was asking @ThrashZone, because he said "Watch Dogs... says 11GB at those options, plus notice the internet speed requirement too."


And???!!! My answer still stands. If you only had a question for him maybe you should have PM’d him instead of circling around and around with the drama in the forum. Now ask yourself, what advantage is there for game makers to develop games that only .05% of people can play?  Just buy the best card you can afford at the time and enjoy gaming.



kapone32 said:


> The question here is why would Nvidia release a card with double the VRAM weeks after the launch of a card where the majority of orders are still pending?


Um, because marketing and buyer gullibility and willingness to take whatever they sell at any price.


----------



## kapone32 (Oct 20, 2020)

rtwjunkie said:


> And???!!! My answer still stands. If you only had a question for him maybe you should have PM’d him instead of circling around and around with the drama in the forum. Now ask yourself, what advantage is there for game makers to develop games that only .05% of people can play?  Just buy the best card you can afford at the time and enjoy gaming.
> 
> Plus there is something called settings that can improve the performance without giving a loss of visual quality to 98% of users
> 
> Um, because marketing and buyer gullibility and willingness to take whatever they sell at any price.



Well they were very smart in the way they did it.


----------



## TumbleGeorge (Oct 20, 2020)

Vayra86 said:


> Lol well thats a bit much isn't it.
> 
> 
> 
> For some perspective, in 3-4 years time we might see some high-end titles that would _prefer_ 12GB VRAM for the highest settings. Additionally, modded games could want upwards of 10GB already today, just add enough assets and bingo. In both cases the card will be plenty fast for the resolution but come up short in the VRAM department... but other than that? 10GB is fine.


The requirements will grow faster than expected, because there has to be an incentive to sell the next generations of video cards, and by the fall of 2024 there will be at least 2 new generations.


----------



## EarthDog (Oct 20, 2020)

TumbleGeorge said:


> The requirements will grow faster than expected, because there has to be an incentive to sell the next generations of video cards


If that were true, we'd already be on 20GB cards for the mainstream... 

The reqs will not change as fast as you think... it would alienate too many people too quickly. There is a fine line between wanting to sell more GPUs and software catching up and actually using that amount of VRAM (or suffering the negative effects that come with it... there is a difference between allocated and in use).


----------



## Vayra86 (Oct 20, 2020)

TumbleGeorge said:


> The requirements will grow faster than expected, because there has to be an incentive to sell the next generations of video cards, and by the fall of 2024 there will be at least 2 new generations.



The requirements follow the growth of the mainstream, not of high-end GPUs. The new mainstream is likely to cap out at 10-12GB in the next 3-4 years. Basically, that is the area the 3080 is 'pioneering' now for a larger audience. You also have to consider that we've had 11GB GPUs for as much as 3 years now... and yet games still cap out at 8.

The requirements grow, but not as fast as you think they do. The new console generation is the mainstream metric for the next 3-5 years. And while it does have more than 10GB... it's not much more. Of course, you can argue that a PC with a 3080 will target higher visual fidelity than a weaker console GPU.

Irony has it that a 20GB version of the 3080 would lack balance just as much as the 10GB version does today, just in the other direction.  Ideally, you'd see 12-16GB.


----------



## TumbleGeorge (Oct 20, 2020)

Yes, 12-16GB is better, but we are discussing the 3080 10GB. If this card had been made with 12 or 16GB of VRAM from the beginning, there would be zero interest in discussing its VRAM size. Yes, 20GB is too much; it would be reflected in the price and would make the RTX 3090 pointless.


----------



## EarthDog (Oct 20, 2020)

TumbleGeorge said:


> Yes, 12-16GB is better, but we are discussing the 3080 10GB. If this card had been made with 12 or 16GB of VRAM from the beginning, there would be zero interest in discussing its VRAM size. Yes, 20GB is too much; it would be reflected in the price and would make the RTX 3090 pointless.


There's zero interest in discussing 10GB on this card... for most. Few are concerned... and rightfully so.


----------



## Zach_01 (Oct 20, 2020)

TumbleGeorge said:


> Yes, 12-16GB is better, but we are discussing the 3080 10GB. If this card had been made with 12 or 16GB of VRAM from the beginning, there would be zero interest in discussing its VRAM size. Yes, 20GB is too much; it would be reflected in the price and would make the RTX 3090 pointless.


_...broken record mode On_

Because if they gave users 12-16GB from the beginning they couldn’t ask 1000+$ for it. They wanted to create the hype of a 700$ flagship with no availability and then sell the card they really want to sell and make their desired profit margin.

_broken record mode Off..._

They know most users will go crazy for the 20GB over the 10GB model. Even though here on TPU most of us, sane and informed users, know about real VRAM usage.


----------



## moproblems99 (Oct 20, 2020)

Zach_01 said:


> Even though here on TPU most of us, sane and informed users, know about real VRAM usage.



This thread proves otherwise.


----------



## TheLostSwede (Oct 21, 2020)

Sorry lads and lasses, there won't be any 20 or 16GB cards...
At least not for now.








NVIDIA allegedly cancels GeForce RTX 3080 20GB and RTX 3070 16GB - VideoCardz.com

NVIDIA has just told its board partners that it will not launch the GeForce RTX 3080 20GB and RTX 3070 16GB cards as planned. NVIDIA allegedly cancels its December launch of the GeForce RTX 3080 20GB and RTX 3070 16GB. This is still very fresh...

videocardz.com


----------



## ShurikN (Oct 21, 2020)

TheLostSwede said:


> Sorry lads and lasses, there won't be any 20 or 16GB cards...
> At least not for now.
> 
> 
> ...


I'm gonna take a wild guess and say that the alleged TSMC refresh will have the Super naming and introduce double the vram of the non-super cards.


----------



## P4-630 (Oct 21, 2020)

ShurikN said:


> I'm gonna take a wild guess and say that the alleged TSMC refresh will have the Super naming and introduce double the vram of the non-super cards.



Not much time for that:

_DigiTimes report suggests the Nvidia Ampere GPUs launching in late 2020 may be replaced by 5 nm Hopper GPUs one year later._
_Apparently, Nvidia underestimated the impact of AMD's 7 nm GPUs and is now looking to phase out the upcoming Ampere GPUs faster, _
_as the green team has already pre-booked an important part of TSMC's 5 nm production capacity for 2021, when the Hopper GPUs are expected to hit the market._

https://www.notebookcheck.net/DigiT...5-nm-Hopper-GPUs-one-year-later.464133.0.html


----------



## TheLostSwede (Oct 21, 2020)

ShurikN said:


> I'm gonna take a wild guess and say that the alleged TSMC refresh will have the Super naming and introduce double the vram of the non-super cards.


Might also have something to do with Micron and GDDR6X supply.


----------



## budgetgaming (Nov 30, 2020)

Solid State Soul ( SSS ) said:


> No.
> 
> Games are always developed for consoles first, then scaled up for pc and with next Gen 4K consoles have 16 gb of memory, the 10 gb of ram is enough up to 1440p gaming on 4K you'll likely to be limited years down the road, especially if you like to max out the textures.
> 
> ...





moproblems99 said:


> Considering the specs of Cyberpunk, I would say yes.  Considering most people only keep gpus 2 or 3 years, I can't see 10gb not being enough for that time span.  Any longer than that is like asking how long a keg will last.


By the time it's not enough, GPU manufacturers will release a new gen of GPUs to take our money. The 3080 20GB is just to show the competitor that if you have 16, Nvidia has 20. They tell you it's future-proof, but nothing in PC is future-proof. And if we are still comparing PC and console, it's not apples to apples... Consoles are built to perform gaming only. They can say 4K 120fps, but it's always stated "up to": maybe the loading screen runs at 120fps, but in real gaming, at true 4K with half the teraflops of an RTX 3080, do you think it will exceed 60 fps? That's a marketing gimmick, like Z490 boards supporting PCIe Gen 4, and then Intel releases a 500-series chipset... not even a year later. What future proof?


----------



## MrGRiMv25 (Nov 30, 2020)

I was one of the people saying it would probably be enough for this gen, but it's looking like it's cutting it close at the moment, given the recently announced game requirements. The way to tell will be the performance difference between the 3080 and 6800 XT in games where they are evenly matched in rasterization. I still think it will be fine for 99% of cases, but only time will tell.


----------



## budgetgaming (Nov 30, 2020)

MrGRiMv25 said:


> I was one of the people saying it would probably be enough for this gen, but it's looking like it's cutting it close at the moment, given the recently announced game requirements. The way to tell will be the performance difference between the 3080 and 6800 XT in games where they are evenly matched in rasterization. I still think it will be fine for 99% of cases, but only time will tell.


Well


Have you seen the new Watch Dogs specs for 4K ultra settings? 11GB... Not even a year, and they force people who are crazy about specs to buy new cards again. Nvidia is playing us, while the RX 6800 XT knows how to kick Nvidia, and Nvidia fans...


----------



## Mussels (Nov 30, 2020)

budgetgaming said:


> Well
> 
> Have you seen the new Watch Dogs specs for 4K ultra settings? 11GB... Not even a year, and they force people who are crazy about specs to buy new cards again. Nvidia is playing us, while the RX 6800 XT knows how to kick Nvidia, and Nvidia fans...



That's because they tested it on a 2080 Ti 11GB, the fastest card they had available to test with. Even if they got 30-series cards on launch day, there's no time to optimise the game for them.


----------



## budgetgaming (Nov 30, 2020)

Mussels said:


> That's because they tested it on a 2080 Ti 11GB, the fastest card they had available to test with. Even if they got 30-series cards on launch day, there's no time to optimise the game for them.



Sorry if I'm pissed at Intel and Nvidia. A week ago I bought the Z490I Unify board, and yesterday I found out a 500-series chipset motherboard is releasing in 2021, and I haven't even assembled my rig yet... The best spec is only today's talk; tomorrow is different.


----------



## Vayra86 (Nov 30, 2020)

budgetgaming said:


> Sorry if I'm pissed at Intel and Nvidia. A week ago I bought the Z490I Unify board, and yesterday I found out a 500-series chipset motherboard is releasing in 2021, and I haven't even assembled my rig yet... The best spec is only today's talk; tomorrow is different.



That is how hardware releases work, and it's why you never want to pay too much for it.

It's why I'm always calmly waiting out the storm to buy at a very competitive price, usually long post-release. It's pretty comfy being half a gen behind the curve, or even a full gen, or buying into sub-top. You avoid lots of problems and buyer's remorse.



ShurikN said:


> I'm gonna take a wild guess and say that the alleged TSMC refresh will have the Super naming and introduce double the vram of the non-super cards.



That was my initial take on the 10GB card as well. I mean, it was clear as day that this was not going places, and not a day goes by without another confirmation that Ampere will turn obsolete faster than you can blink. I think Nvidia realized it had gotten lazy and complacent with its pioneering Turing release, which was really a shitload of silicon gone to waste with nothing to show for it, and then followed up with a product range on a grossly inferior node, forcing itself AGAIN into a larger die than the competition while not really being better at anything. They're going to have to do _something._ Refreshing Ampere might not be enough, and I think the Hopper rumor is credible in that sense.

Realistically, that's what they've got now. We can be all cheery about their added-value bullshit, but all of it is proprietary, so it's nothing you SHOULD care about - remember PhysX, remember G-Sync - they've both gone the way of the dodo, or pretty much. It's fun if they have it alongside a normal GPU comparison, but that's all it really is. And that includes the additional RT performance, too - you can rest assured the focus on the consoles will push more dev budget to the approach that works everywhere and not just with RTX special sauce. They're all just bonus points that come on top of whatever a GPU should be doing: produce frames. And the simple fact is, AMD is much better at that right now, doing more for less in every possible way: power, die size, and even VRAM capacity for the mid-to-long term.


----------



## budgetgaming (Nov 30, 2020)

Vayra86 said:


> That is how hardware releases work and its why you never want to pay too much for it.
> 
> Its why I'm always calmly waiting out the storm to buy at a very competitive price. Usually long post-release. Its pretty comfy being half a gen behind the curve, or even a full gen, or buy into sub-top. You avoid lots of problems and buyers remorse.
> 
> ...


Do you think they planned this all along: giving us a shortage of supply so that they can sell it for more, waiting for AMD to release their RX 6000 cards, and giving us news that they are going to release a 20GB version so that buyers will wait for it? Because they knew it would be a 16GB version, since the Xbox and PS5 using RDNA 2 have 16GB.


----------



## Vayra86 (Nov 30, 2020)

budgetgaming said:


> Do you think they planned this all along: giving us a shortage of supply so that they can sell it for more, waiting for AMD to release their RX 6000 cards, and giving us news that they are going to release a 20GB version so that buyers will wait for it? Because they knew it would be a 16GB version, since the Xbox and PS5 using RDNA 2 have 16GB.



No, I don't believe in conspiracies. I believe in the way markets work and respond. Competitors responding to each other is what we've seen with the Ampere launch and AMD's follow-up.

Nvidia wanted to pre-empt AMD to catch its buyers on mindshare, because they would otherwise start having doubts, as AMD has the better offer now. It's clear as day. They also try to use their RT and the comparison to Turing to show us they're somehow better, but in reality that only works because Turing was pretty shit to begin with - they also needed SUPER before it was a meaningful product line.

Now AMD has delivered, and Nvidia is firing on all cylinders to keep the losses to a minimum. There is also a rumor mill with lots of absolute nonsense in it. Whether or not Nvidia planned a 16-20-whatever GB card is irrelevant until they themselves announce it, and they never did. What's happening now is that Nvidia is turning from leader into follower: they HAVE to implement stuff like RTX IO because AMD is leading us into a new console gen with fast access to storage, for example. And wrt ray tracing, most titles will be coming out console-first, even despite the presence of Turing.

The trend is clear: Nvidia wants to pre-empt developments happening at large and in doing so define the marketplace. They seem to be failing at it this time. 10GB might well be enough - but there is this nagging thought that it's probably not, a few years down the line, for the high resolutions it's made for, and yes, the console capacity is the writing on the wall. Ignoring that is living in denial - it's suboptimal at the very least.

The supply shortages... are shortages. It happens. We have a global pandemic, a bottleneck on fab capacity, Christmas holiday shopping, and a fight over the best nodes available. Also, there are too many of us on this planet, so you can readily expect more of this in the future. No conspiracy involved; it's just humans being human.


----------



## Sovsefanden (Nov 30, 2020)

10GB will be plenty for 4K gaming; by the time it's not anymore, the 3080 will be too slow for 4K anyway, and so will the 6800 XT.

A lot of VRAM is not going to help when the GPU is not capable. We have seen this many times.

The 3080 beats the 6800 XT easily at 4K, and this is even without DLSS enabled.

Remember that Ampere has tensor memory compression, which can lower VRAM usage by up to 40%.


If you insist on maxing all new games out at 4K, you will simply be upgrading every 1-2 years anyway.



People really need to understand how RAM allocation works.
Tons of game engines simply allocate most or all VRAM, yet use only a small percentage of that amount.

Even at 4K, I don't think there's any game that uses more than 8GB (bandwidth matters too, though).

Godfall will have a "texture pack" that requires 12GB - you know, that AMD-sponsored game with horrible review scores.
VERY WEIRD that it requires 12GB when the 3080 only has 10GB, right? 

Do you remember the Shadow of Mordor "ULTRA HD" texture pack? https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/

Back when AMD had more VRAM overall, too? It used way more VRAM without improving the textures anyway.
Uncompressed vs compressed, but it ended up with the same visuals, haha.

GAME DEVS SHOULD KNOW how to do lossless compression, and next-gen consoles won't have anywhere NEAR 10GB of VRAM for "native 4K" (reality: dynamic res)

The XSX and PS5 get 16GB of shared RAM, meaning graphics will get 8GB TOPS, if not 5-6GB...
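To put rough numbers on that split (the OS reserve figure and the graphics/game split below are assumptions for illustration, not published budgets):

```python
# Back-of-the-envelope budget for a 16GB shared-memory console.
# os_reserve and graphics_share are assumed figures, not official specs.
total_shared = 16.0    # GB of unified RAM (XSX / PS5)
os_reserve = 2.5       # GB held back for system software (assumed)
available = total_shared - os_reserve

graphics_share = 0.6   # hypothetical graphics vs game-logic split
graphics_budget = available * graphics_share

print(f"available to the game: {available:.1f} GB")            # 13.5 GB
print(f"plausible graphics budget: {graphics_budget:.1f} GB")  # 8.1 GB
```

Shift the split and you land anywhere between roughly 6 and 10GB for graphics, which is why estimates in this thread vary so much.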


----------



## Vayra86 (Nov 30, 2020)

Sovsefanden said:


> *GAME DEVS SHOULD KNOW* how to do lossless compression, and next-gen consoles won't have anywhere NEAR 10GB of VRAM for "native 4K" (reality: *dynamic res*)



Key points in your post, and neither is a guarantee. What'll happen is that dynamic resolution will force an Nvidia GPU into lower detail levels sooner and faster than games on a similarly performing AMD GPU. Consoles DO have access to 10+ GB of VRAM, though. They could have 11-13GB for graphics, potentially.

I agree with you and most others when it comes to how we used to approach VRAM. But what has been is not what tomorrow looks like. We're looking at higher bandwidth usage and a preference for fast access to storage, ESPECIALLY because there is much more going over that bus now that allocation is even more dynamic than it used to be. Already we notice how interconnects matter wrt latency, for example. Frame times are already better in numerous games on an RDNA2 card - look at TPU reviews. It's not just the FPS: frame delivery also benefits from not swapping and re-allocating. And that's right away, post-release. It won't be getting worse on the AMD side of things, only better. Ampere can similarly still improve... but there is much less wiggle room.

Something's gonna give. 10GB means a heavier load on swaps, which results in one of two things: a lower detail level, or lower performance. The only real basis anyone has for saying '10GB is enough' is believing Nvidia on its green eyes that it will be. But Nvidia is not leading the game industry at the start of a new console gen. We all know this - if you look at the past, every new console gen was a major influence on a new performance level in games. Nvidia fed the last console release with a very solid GTX 970 (and lo and behold... it also had a so-so memory system!) but even so, you don't want to run a 4GB GPU for the last crop of PS4 games, do you? You want 6 or 8GB at least.

So if you're going to look back, don't look with rose-tinted glasses; be honest and apply the principle everywhere. Games DO exceed the last console gen's launch GPU right now, they have for several years, and it's not even exclusive to the highest resolution. The logical conclusion is that the 3080's 10GB will be obsolete long before this console gen is over. And then, when you're being honest with yourself... consider whether that is acceptable or not. The trend is clear: Nvidia has been cutting back on VRAM per performance tier while steadily increasing the price points, the competitor is not, and the competitor is defining the general direction of game/port development. Just put two and two together.

It's all crystal-ball guesswork... but this is my educated guess.
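The swap penalty is easy to put into rough numbers. A minimal sketch, assuming nominal peak PCIe bandwidth (sustained rates are lower) and a hypothetical 1GB working-set overflow:

```python
# How long does it take to stream swapped-out assets back into VRAM
# over PCIe, compared to a 60 fps frame budget? Bandwidth figures are
# theoretical peaks; real sustained transfer is lower.
frame_budget_ms = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps
swap_gb = 1.0                     # hypothetical VRAM overflow to re-stream

for name, bandwidth_gbps in [("PCIe 3.0 x16", 16.0), ("PCIe 4.0 x16", 32.0)]:
    transfer_ms = swap_gb / bandwidth_gbps * 1000.0
    frames_lost = transfer_ms / frame_budget_ms
    print(f"{name}: {transfer_ms:.1f} ms to move {swap_gb:.0f} GB "
          f"(~{frames_lost:.1f} frame budgets)")
```

Even at PCIe 4.0 peak, moving 1GB costs nearly two full frame budgets, so the engine either stalls, streams over many frames, or drops detail.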


----------



## Sovsefanden (Nov 30, 2020)

Vayra86 said:


> Key points in your post, and neither is a guarantee. What'll happen is that dynamic resolution will force an Nvidia GPU into lower detail levels sooner and faster than games on a similarly performing AMD GPU. Consoles DO have access to 10+ GB of VRAM, though. They could have 11-13GB for graphics, potentially.
> 
> I agree with you and most others when it comes to how we used to approach VRAM. But what has been is not what tomorrow looks like. We're looking at higher bandwidth usage and a preference for fast access to storage, ESPECIALLY because there is much more going over that bus now that allocation is even more dynamic than it used to be. Already we notice how interconnects matter wrt latency, for example. Frame times are already better in numerous games on an RDNA2 card - look at TPU reviews. It's not just the FPS: frame delivery also benefits from not swapping and re-allocating. And that's right away, post-release. It won't be getting worse on the AMD side of things, only better. Ampere can similarly still improve... but there is much less wiggle room.
> 
> ...



Who keeps a GPU for 8 years, though? An entire console generation. Most who do that keep playing the same games, like WoW etc.

Most PC gamers that play new AAA games will be upgrading AT LEAST 2 times in a console generation; for me, more like 4 times. Every 2 years is what I do 

A high amount of VRAM will not save you, because the GPU is not getting faster, so you will still be looking at low fps in the end, forcing you to lower image quality - and the VRAM requirement drops as a result, making the extra VRAM pointless again.

That's why you don't go all-out on VRAM if the GPU is not absolute high-end to begin with; it's better to upgrade more often than to try to future-proof.

Never be more than 2 generations behind if you want proper driver support from Nvidia/AMD or even game devs (which mostly test with the newest and last gen).

Nvidia and AMD will focus on the newest arch first, then "last gen"; older archs *might* get support, might not. You will see wonky performance and issues; this is what people with older GPUs often experience in new games.

Take the 390X for example: it had 8GB, but today it can barely do 1080p maxed. The GPU is way too dated even though the VRAM is fine - it still does not save performance. Then look at the 3070 with 8GB too, performing like a 2080 Ti even at 4K: https://www.techpowerup.com/review/msi-geforce-rtx-3090-suprim-x/33.html

A friend of mine bought a 380X solely because of the VRAM. He thought he could use the card for 5+ years, but he has experienced flicker and weird glitches in tons of new games in recent years, plus bad performance in most games released after 2018 - very bad in some of them (unplayable), shadow bugs and even crashing (but the card loops 3DMark for hours, meaning it's the game/drivers).

Meanwhile the Fury X released with 4GB and AMD claimed this was more than enough, yet it aged like milk because of the 4GB. It can still do 1080p "fine", though - the problem is that it was a 1440p-4K card. The 980 Ti still does 1440p DECENT in most games using medium settings; the Fury X can barely do low.

6GB in 2020 is still decent for 1080p, 8GB for 1440p and 10GB for 4K, and you should be fine for a few years - if not, lower some settings and enjoy anyway. Who cares if you play a game at 95% IQ instead of 100%?


----------



## budgetgaming (Nov 30, 2020)

NVIDIA Allegedly Sold $175 Million Worth of Ampere GeForce RTX 30 GPUs To Crypto Miners, Could Be A Contributing Factor Behind Immense Shortages

Financial analysts have suggested that NVIDIA sold $175 million worth of GeForce RTX 30 Ampere gaming GPUs to crypto miners.

www.google.com




Still just humans being human?


----------



## rtwjunkie (Nov 30, 2020)

budgetgaming said:


> Sorry if I'm pissed at Intel and Nvidia. A week ago I bought the Z490I Unify board, and yesterday I found out a 500-series chipset motherboard is releasing in 2021, and I haven't even assembled my rig yet... The best spec is only today's talk; tomorrow is different.


And your Z490 board will suddenly work less well when the 590 comes out?  Stop falling for marketing. Your board will perform great for years!


----------



## Sovsefanden (Nov 30, 2020)

rtwjunkie said:


> And your Z490 board will suddenly work less well when the 590 comes out?  Stop falling for marketing. Your board will perform great for years!



LOL, YES.

Intel, AMD and Nvidia are all doing YEARLY updates, some better than others, but NEW STUFF = MORE SALES, which is the point.

ALL TECH BUSINESSES do this now.


----------



## londiste (Nov 30, 2020)

rtwjunkie said:


> And your Z490 board will suddenly work less well when the 590 comes out?  Stop falling for marketing. Your board will perform great for years!


Intel has a pretty established schedule of two chipset/motherboard generations per socket. With 400-series being first chipset series for LGA1200, pretty sure 500-series chipsets and whatever CPUs come next are all going to be compatible. Rumors and news bits so far are saying exactly that. The only question mark might be PCIe 4.0 support (and based on early information and rumors, most Z490 boards should be compatible given that new CPUs come with support).


----------



## rtwjunkie (Nov 30, 2020)

Sovsefanden said:


> LOL, YES.
> 
> Intel, AMD and Nvidia are all doing YEARLY updates, some better than others, but NEW STUFF = MORE SALES, which is the point.
> 
> ALL TECH BUSINESSES do this now.





londiste said:


> Intel has a pretty established schedule of two chipset/motherboard generations per socket. With 400-series being first chipset series for LGA1200, pretty sure 500-series chipsets and whatever CPUs come next are all going to be compatible. Rumors and news bits so far are saying exactly that. The only question mark might be PCIe 4.0 support (and based on early information and rumors, most Z490 boards should be compatible given that new CPUs come with support).


You are both missing the point. Yearly updates and releases don't make a product perform less well. It will still do whatever it did the day before a new product came out. And in the case of CPUs/motherboards, they are relevant for several years.


----------



## Sovsefanden (Nov 30, 2020)

rtwjunkie said:


> You are both missing the point. Yearly updates and releases don’t make a product perform less well.



Demands are rising, and when you buy a high-end product and see that 1 year later you now have a midrange product, you will always feel bad.

My 8700K at 5.2 GHz still smashes any Ryzen chip in gaming and emulation, which is what this home rig is used for.

My 3080 needs to last till 2022, and then I will be on the 4000 series and Hopper.

Never buy the refreshed series; always jump on the new arch first - you are then assured at least 2 years of PRIME FOCUS.


----------



## rtwjunkie (Nov 30, 2020)

Sovsefanden said:


> Demands are rising, and when you buy a high-end product and see that 1 year later you now have a midrange product, you will always feel bad.
> 
> My 8700K at 5.2 GHz still smashes any Ryzen chip in gaming and emulation, which is what this home rig is used for.


You seem to contradict yourself with your two paragraphs.  

Besides, what one “feels” has nothing to do with performance numbers being the same today as they were the day before.


----------



## Khonjel (Nov 30, 2020)

Simple answer.

For people who change hardware every two-three years, YES.
But people like me who stick with hardware for years, NOPE.


----------



## Vayra86 (Nov 30, 2020)

Sovsefanden said:


> Who keeps a GPU for 8 years, though? An entire console generation. Most who do that keep playing the same games, like WoW etc.
> 
> Most PC gamers that play new AAA games will be upgrading AT LEAST 2 times in a console generation; for me, more like 4 times. Every 2 years is what I do
> 
> ...



My experience right now is different.

Until a few weeks back I ran a 1080p panel with this GTX 1080. Now I'm running 3440x1440 and I'm seeing allocation easily creep towards 6.5-7GB. That's not 4K, mind, and the GPU is still plenty capable of pushing this resolution. The 3080 is a whole lot (as in 2x) faster, but only carries 2GB more. That is out of balance no matter how you twist it. Push a game that is heavy on mods and additional textures/tweaks and you'll quickly run into trouble, too.

If you upgrade often, then yes, sure, 10GB will probably carry you. But it's very easy to keep a GPU for 5 years. I'm doing it, and I can easily see how a 3080 in this situation would have caused me problems. Upgrading for me is a functional move, not something I do because upgrading is fun. It costs money, and the gain is usually limited the faster you do it. If you compare the past three generations and how VRAM capacities have scaled alongside core power, Ampere is a break from the norm, and in a pretty excessive way too - a _reduction_ of VRAM relative to a _faster_ core. There is no real logic behind any defense of that, is there, apart from having to believe that somehow Nvidia can make do with less than what will soon be the norm. We're fast moving towards an 8GB-minimum, 12GB-optimal situation, I think.
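That scaling break is easy to check with the numbers from this comparison (the 2x performance figure is the rough estimate used above, not a measured average):

```python
# VRAM growth vs core-power growth, GTX 1080 -> RTX 3080.
# perf=2.0 is the rough "2x faster" estimate, not a benchmarked mean.
gtx1080 = {"vram_gb": 8, "perf": 1.0}
rtx3080 = {"vram_gb": 10, "perf": 2.0}

vram_scale = rtx3080["vram_gb"] / gtx1080["vram_gb"]   # 1.25x more VRAM
perf_scale = rtx3080["perf"] / gtx1080["perf"]         # 2.0x more core power

vram_per_perf = vram_scale / perf_scale
print(f"VRAM grew {vram_scale:.2f}x, core power {perf_scale:.1f}x "
      f"-> {vram_per_perf:.3f}x VRAM per unit of performance")
```

By this metric the 3080 carries a bit over half the VRAM per unit of core power that the GTX 1080 did, which is the imbalance being argued here.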


----------



## Sovsefanden (Nov 30, 2020)

Vayra86 said:


> My experience right now is different.
> 
> Until a few weeks back I ran a 1080p panel with this GTX 1080. Now I'm running 3440x1440 and I'm seeing allocation easily creep towards 6.5-7GB. That's not 4K, mind, and the GPU is still plenty capable of pushing this resolution. The 3080 is a whole lot (as in 2x) faster, but only carries 2GB more. That is out of balance no matter how you twist it. Push a game that is heavy on mods and additional textures/tweaks and you'll quickly run into trouble, too.
> 
> If you upgrade often, then yes, sure, 10GB will probably carry you. But it's very easy to keep a GPU for 5 years. I'm doing it, and I can easily see how a 3080 in this situation would have caused me problems. Upgrading for me is a functional move, not something I do because upgrading is fun. It costs money, and the gain is usually limited the faster you do it. If you compare the past three generations and how VRAM capacities have scaled alongside core power, Ampere is a break from the norm, and in a pretty excessive way too - a _reduction_ of VRAM relative to a _faster_ core.



VRAM usage does not equal VRAM requirement.

More VRAM = higher usage. Always. It's allocation - read up on it.

Yes, you were doing it because you ran 1080p, which is less than phones these days. At 3440x1440 you will now have to upgrade way more often, unless you skimp on settings and lower image quality.

Day vs night.

1440p is the bare minimum for me, at 144 Hz.

Keeping a GPU for 5 years will put you in the entry-level segment. That's 980 Ti performance TODAY - you are at GTX 1660 level. Good luck trying that at 3440x1440 going forward. No matter which GPU you buy, VRAM is not going to save you, because after 4 years you will get little to NO support from AMD/Nvidia or game devs. They won't care about old architectures.

You will be far better off replacing the GPU with a midrange card every 2 years.

The 3080 won't do well at 3440x1440 in 5 years, and the 6800 XT won't either. Even the 3090 and 6900 XT will be considered low-end at that point, or lower midrange at least.


----------



## Vayra86 (Nov 30, 2020)

Sovsefanden said:


> VRAM usage does not equal VRAM requirement.
> 
> More VRAM = higher usage. Always. It's allocation - read up on it
> 
> ...



It was already pointed out that frame times do suffer when you're swapping. You're gonna have to do better than this.

I've also pointed out that an upgrade was not necessary even as VRAM usage went up, closer towards the limit of this GPU. It's time you connect some dots. To summarize the above: I would not dare buy into a 10GB GPU at the resolution I run today, knowing how well the current GPU already does with the core power on tap and how much VRAM it wants.


----------



## Sovsefanden (Nov 30, 2020)

Khonjel said:


> Simple answer.
> 
> For people who change hardware every two-three years, YES.
> But people like me who stick with hardware for years, NOPE.



People who game at 4K will have to upgrade every 1-3 years tops, or they will have to drop settings a lot.

Never try to future-proof. 

Paying $1,500 for a 3090, for example - it won't age well just because it has 24GB of VRAM. In 3 years it won't even be midrange.


----------



## budgetgaming (Nov 30, 2020)

TheLostSwede said:


> Sorry lads and lasses, there won't be any 20 or 16GB cards...
> At least not for now.
> 
> 
> ...











GeForce RTX 3080 20GB Registered at EEC - Coming in December?

Yes, no, yes, no... that's pretty much what the rumor train has been on an RTX 3080 fitted with 20GB VRAM. Face it, there has been chatter on a 20GB model for ages now, but things get serious once t...

www.guru3d.com

They did register a 20GB model of the RTX 3080.


----------



## Sovsefanden (Nov 30, 2020)

Seriously, over 5-6 years you are far better off buying a $500 card 3 times than paying $1,500 for a 3090 today. In 3-4 years it will run like crap and the $500 card will destroy it.

After 4 years you are 3 generations behind, aka NO SUPPORT OR TWEAKING.

2016: Pascal
2018: Turing
2020: Ampere
2022: Hopper

By 2022, Turing will be meh.
By 2024, Ampere will be meh.

Even my old 1080 Ti, which is pretty much legendary at this point, performed pretty average at 1440p by mid-2020, even with a +15% OC; hell, in some newer games the 2060 Super beat it. It's barely a 1440p card.

If you play older and less demanding games, then fine, but for newer / AAA games you won't be able to buy ANYTHING that lasts 5-6 years.

For AMD it's more wonky: the Fury X aged like milk, for example, but Tahiti and Hawaii aged well because AMD refreshed them tons of times and didn't change the arch much.

The Radeon VII was a mistake, though - EoL after a few months.


----------



## TheLostSwede (Nov 30, 2020)

budgetgaming said:


> GeForce RTX 3080 20GB Registered at EEC - Coming in December?
> 
> 
> Yes, no, yes, no ... that's pretty much what the rumor train has been on an RTX 3080 fitted with 20GB VRAM. Face it, there has been chatter on a 20GB model for ages now, but things get serious once t...
> ...


Gee whiz, look at you, quoting a comment from over a month ago...
That was even before the AMD announcement, so it's not really relevant any more.
Also, I said "at least not for now", but I guess you missed that?


----------



## Vayra86 (Nov 30, 2020)

Sovsefanden said:


> People who game at 4K will have to upgrade every 1-3 years tops, or they will have to drop settings a lot.
> 
> Never try to future-proof.
> 
> Paying $1,500 for a 3090, for example - it won't age well just because it has 24GB of VRAM. In 3 years it won't even be midrange.



The 3090 is just as out of balance as the 3080 is, just on the other end of the spectrum. Bad examples don't prove a point.

Its just a different approach to upgrading. Each one is fine, but they work on different metrics.


----------



## Sovsefanden (Nov 30, 2020)

budgetgaming said:


> GeForce RTX 3080 20GB Registered at EEC - Coming in December?
> 
> 
> Yes, no, yes, no ... that's pretty much what the rumor train has been on an RTX 3080 fitted with 20GB VRAM. Face it, there has been chatter on a 20GB model for ages now, but things get serious once t...
> ...



It's simply the 3080 Ti: 20GB on a 320-bit bus with the same cores as the 3090.


----------



## budgetgaming (Nov 30, 2020)

TheLostSwede said:


> Gee whiz, look at you, quoting a comment from over a month ago...
> That was even before the AMD announcement, so it's not really relevant any more.
> Also, I said "at least not for now", but I guess you missed that?











MSI registers RTX 3060 Ti and RTX 3080 20 GB video cards with the EEC | Aroged

Twitter user Komachi has spotted a collection of new video cards from MSI on the website of the Eurasian Economic Commission (EEC). It concerns four different RTX 3090 SKUs, four RTX 3080 10 GB cards, 18 RTX 3060 Ti models, and ten RTX 3080 variants with 20 GB video memory. The registration...

www.aroged.com
				




Sorry I was looking using My phone....now I moving to PC to discuss....this is 30 Nov news it still yes no yes no for now......registering does not mean that the cards still on the Go, but they are not diching it....


----------



## Sovsefanden (Nov 30, 2020)

Rule of thumb: never buy a $1,000 GPU unless you're rich, because it will age like milk and you'll be bored with it after two years anyway.

Change every 2-3 years, 4 years tops.


----------



## phanbuey (Nov 30, 2020)

Sovsefanden said:


> Rule of thumb: never buy a $1,000 GPU unless you're rich, because it will age like milk and you'll be bored with it after two years anyway.
> 
> Change every 2-3 years, 4 years tops.



Unfortunately that rule will be harder and harder to follow, as most of the 4K-capable GPUs will be $1k and up: 6900 XT, 3090, and a 20GB 3080 Ti... It seems a 20GB 3080 is coming as well, but probably in the $850-900 range.

10GB is a little low for the power of this GPU, but honestly not by much; even at 4K I rarely break ~6 GB. By the time you routinely exceed that much VRAM, you'll really want more GPU power anyway.
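(If anyone wants to sanity-check numbers like that on their own rig: here's a minimal Python sketch that polls `nvidia-smi`, which ships with the NVIDIA driver, for VRAM usage while a game runs. The helper names are my own, and keep in mind that tools like this report *allocated* VRAM, which is often higher than what the game actually needs.)

```python
import shutil
import subprocess

def parse_vram_csv(line: str) -> tuple[int, int]:
    """Parse one 'used, total' CSV line from nvidia-smi (values in MiB)."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

def current_vram_mib(gpu: int = 0) -> tuple[int, int]:
    """Query (used, total) VRAM in MiB for one GPU via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", f"--id={gpu}",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return parse_vram_csv(out)

if __name__ == "__main__":
    # Only attempt the query on machines that actually have the driver.
    if shutil.which("nvidia-smi"):
        used, total = current_vram_mib()
        print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
```

Run it in a loop (or under `watch`) while playing and you'll see what the thread is arguing about.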


----------



## budgetgaming (Nov 30, 2020)

Sovsefanden said:


> Rule of thumb: never buy a $1,000 GPU unless you're rich, because it will age like milk and you'll be bored with it after two years anyway.
> 
> Change every 2-3 years, 4 years tops.



Maybe we watch too many benchmarks and tech reviews. We should spend more time playing or editing rather than looking at more and more new products, so that we won't feel bad about one-year-old tech. Just my thought...


----------



## phanbuey (Nov 30, 2020)

budgetgaming said:


> Maybe we watch too many benchmarks and tech reviews. We should spend more time playing or editing rather than looking at more and more new products, so that we won't feel bad about one-year-old tech. Just my thought...



That's definitely true... I would stick with the 2080 Ti, but the lack of HDMI 2.1 is a bit rough on the 4K TV: 120 Hz doesn't look so great.


----------



## moproblems99 (Nov 30, 2020)

Sovsefanden said:


> Demands are rising and when you buy a high-end products and see 1 year later you now have a mid-end product you will always feel bad



Not me. Not to use the car analogy again, but...
Car manufacturers come out with new features and moar horsepower every model year. Do you complain about cars too?

Of course, the flip side is that if manufacturers stop innovating yearly, your product stays relevant longer. Then everyone can scream that they are milking us because they only innovate and release products every three years.

Which do you prefer?


----------



## EarthDog (Nov 30, 2020)

Sovsefanden said:


> Most PC gamers, that play new AAA games, will be upgrading AT LEAST 2 times in a console generation, for me, more like 4 times. Every 2 year is what I do


You're a friggin baller upgrading every two years, lol. Most people hang on to GPUs for at least 3-4 years.


Sovsefanden said:


> Demands are rising and when you buy a high-end products and see 1 year later you now have a mid-end product you will always feel bad


lol, what are you smoking, son? Share!

Seriously, it won't be outdated in a year and suddenly slow down to mid-range performance.

Are we losing something in translation or is this really the way you think? Don't upgrade now because DDR5 and PCIe 5 are 'close'? A flagship suddenly becomes mid-range after a year?

Yikes man... yikes.


----------



## sepheronx (Nov 30, 2020)

budgetgaming said:


> Sorry if i am pissed with intel and nvidia, 1 week ago just bought the z490i unify board, and yesterday i just found out 2021 is releasing 500 chipset motherboard, i have not assemble my rig yet.... Best spec is only todays talk, tommorow is diffrent



Actually, your board (it's Gigabyte, right?) will support one PCIe x16 slot at 4.0 and one NVMe slot as well.

Same with my MSI Gaming Plus.

The PCIe 4.0 compliance work was actually done before Intel told manufacturers that its 10th gen wouldn't support 4.0. The only one to have left it out was ASUS, so their boards are apparently out of luck on PCIe 4.0.


----------



## budgetgaming (Nov 30, 2020)

moproblems99 said:


> Not me. Not to use the car analogy again, but...
> Car manufacturers come out with new features and moar horsepower every model year. Do you complain about cars too?
> 
> Of course, the flip side is that if manufacturers stop innovating yearly, your product stays relevant longer. Then everyone can scream that they are milking us because they only innovate and release products every three years.
> ...


Well, if they don't innovate, they don't make new products and get no sales, and then how do they cover the company's running costs?


sepheronx said:


> Actually, your board (it's Gigabyte, right?) will support one PCIe x16 slot at 4.0 and one NVMe slot as well.
> 
> Same with my MSI Gaming Plus.
> 
> The PCIe 4.0 compliance work was actually done before Intel told manufacturers that its 10th gen wouldn't support 4.0


It's the MSI Z490I Unify ITX board.


----------



## sepheronx (Nov 30, 2020)

budgetgaming said:


> Well, if they don't innovate, they don't make new products and get no sales, and then how do they cover the company's running costs?
> 
> It's the MSI Z490I Unify ITX board.



Yeah, so you are like me. You got PCIe 4.0 compliance on your board so technically when 11th gen is out, it should work.  At least that is what others including the swede told me.









ASUS Z490 Motherboards Severely Lack Behind In PCIe 4.0 Support Compared To Competitors, No Proper Hardware-Level Integration For Rocket Lake CPUs

The PCIe Gen 4 integration on several Z490 motherboards from ASUS, ASRock, MSI & Gigabyte has been detailed for upcoming Intel 11th Gen CPUs.

www.google.com


----------



## budgetgaming (Nov 30, 2020)

sepheronx said:


> Yeah, so you are like me. You got PCIe 4.0 compliance on your board so technically when 11th gen is out, it should work.  At least that is what others including the swede told me.
> 
> 
> 
> ...


Then why bother releasing the 500-series chipsets?


----------



## sepheronx (Nov 30, 2020)

budgetgaming said:


> Then why bother releasing the 500-series chipsets?



I believe it's to streamline PCIe 4.0 support across the whole platform. Only Z490 motherboards have it now, and only for one x16 slot and one x4 (NVMe) slot; that's it.


----------



## anachron (Nov 30, 2020)

EarthDog said:


> You're a friggin baller upgrading every two years, lol. Most people hang on to GPUs for at least 3-4 years.
> lol, what are you smoking, son? Share!



I would have agreed with you when I was playing at 1080p, but at 1440p my 2070 Super is already barely enough for 60 fps in some games, even with a few settings reductions. I'm not even mentioning reaching 144 fps. I doubt I will be able to keep it as long as my previous cards, and it's probably even worse for people playing at 4K.


----------



## EarthDog (Nov 30, 2020)

anachron said:


> I would have agreed with you when I was playing at 1080p, but at 1440p my 2070 Super is already barely enough for 60 fps in some games, even with a few settings reductions. I'm not even mentioning reaching 144 fps. I doubt I will be able to keep it as long as my previous cards, and it's probably even worse for people playing at 4K.


The 2070 S is a high-Hz 1080p card, or 1440p/60... there will always be some titles that are an exception. A 2070 Super isn't a 4K card in the first place, so you're right on that point... if you're already trying to punch up a class, no shit.


----------



## anachron (Nov 30, 2020)

EarthDog said:


> The 2070 S is a high-Hz 1080p card, or 1440p/60... there will always be some titles that are an exception. A 2070 Super isn't a 4K card in the first place, so you're right on that point... if you're already trying to punch up a class, no shit.


I know, but it doesn't invalidate @Sovsefanden's reasoning. I have been able to play comfortably enough with my 2070 Super OC, and with the money I saved by not buying a 2080 Ti at the time, plus what I've set aside in the two-year gap, I should be able to upgrade next year to a far better GPU than a 2080 Ti for not much more than my 2070 Super cost. So I don't see why it seems unreasonable to change GPUs every two or three years when playing at this kind of resolution.
Although, to be honest, I would probably upgrade in the current 6800 XT / RTX 3080 price range, since I even get a nice bonus for working during lockdown.


----------



## Sovsefanden (Dec 1, 2020)

budgetgaming said:


> Maybe we watch too many benchmarks and tech reviews. We should spend more time playing or editing rather than looking at more and more new products, so that we won't feel bad about one-year-old tech. Just my thought...



Yep, I agree. Most people will be satisfied with a 5700 XT or 2070 Super at 1440p; even a 5700 non-XT or 2070 non-Super would do fine for 99% of people at this res.



EarthDog said:


> The 2070 S is a high-Hz 1080p card, or 1440p/60... there will always be some titles that are an exception. A 2070 Super isn't a 4K card in the first place, so you're right on that point... if you're already trying to punch up a class, no shit.



A 2070 Super can easily do 1440p/144 Hz if you tweak settings, unless your CPU/memory are turds.

Most people who aim for 120+ fps are playing multiplayer titles and care a lot less about image quality anyway; performance is key, and without maxing games out, spotting enemies becomes way easier.


----------



## FinneousPJ (Dec 1, 2020)

I think it's enough right now but most people look for a GPU to last ~3 years. I think the 16 GB AMD is offering is the better option looking forward.


----------



## Sovsefanden (Dec 1, 2020)

FinneousPJ said:


> I think it's enough right now but most people look for a GPU to last ~3 years. I think the 16 GB AMD is offering is the better option looking forward.



For 4K gamers maybe; meanwhile, 99% use 1440p or lower.

I will take 1440p at 144 Hz/fps any day over 2160p at 60 Hz/fps.

4K gamers should be buying a 3080 as the bare minimum (a 6800 non-XT with an OC can work "fine" too, but the 6800 XT seems like the way to go here). The 6800 is a good 4K entry-level card though, when it can be had for MSRP... in 3-6 months.

Or wait for the 3080 Ti and 6900 XT.

I will be replacing my 3080 in 2022.


----------



## FinneousPJ (Dec 1, 2020)

Sovsefanden said:


> For 4K gamers maybe, meanwhile 99% uses 1440p or lower
> 
> I will take 1440p at 144 Hz/fps any day over 2160p at 60 Hz/fps


True which is why I'm waiting on the 12 GB 6700 (XT). I am also on 1440p 144 Hz.


----------



## Sovsefanden (Dec 1, 2020)

FinneousPJ said:


> True which is why I'm waiting on the 12 GB 6700 (XT). I am also on 1440p 144 Hz.



That's the true sweet spot.


----------



## nguyen (Dec 1, 2020)

FinneousPJ said:


> I think it's enough right now but most people look for a GPU to last ~3 years. I think the 16 GB AMD is offering is the better option looking forward.



16GB of VRAM is not gonna help you with RT performance. Are you sure there won't be any titles with good RT implementations coming out in the next 3 years?
DLSS is another thing: who knows when DirectML will even be ready, the RX 6000 cards don't have tensor cores, and AI training also needs to be done on a per-game basis.

For 1440p / UW 1440p with RT, or 4K gaming, the 3080 10GB will always be the better option, now and for the foreseeable future.


----------



## budgetgaming (Dec 1, 2020)

RTX is the best choice; it's already a mature ecosystem, and for streamers it has the advantage. The 3000 series is the second RTX generation, so supposedly they know what's best for next-gen PC gaming in RT, while AMD is only on its first gen. Maybe once RX's ray tracing has matured in its second generation we can talk about considering RX versus RTX GPUs. For me, AMD is always the second choice; even if I don't have enough money to buy an Nvidia card, I will prefer to wait, even though Nvidia cards have less RAM. Remember the RX 570 and 580: they had 8 GB of RAM but no horsepower to deliver the visuals, and even the GTX 1060 with 6 GB had better performance.


----------



## FinneousPJ (Dec 1, 2020)

nguyen said:


> 16GB of VRAM is not gonna help you with RT performance. Are you sure there won't be any titles with good RT implementations coming out in the next 3 years?
> DLSS is another thing: who knows when DirectML will even be ready, the RX 6000 cards don't have tensor cores, and AI training also needs to be done on a per-game basis.
> 
> For 1440p / UW 1440p with RT, or 4K gaming, the 3080 10GB will always be the better option, now and for the foreseeable future.


What do you mean am I sure? I didn't say anything about RT. You must be confused.



Spoiler: To use a famous example



Are you sure you've stopped beating your wife?


----------



## nguyen (Dec 1, 2020)

FinneousPJ said:


> What do you mean am I sure? I didn't say anything about RT. You must be confused.



So you bought into the 16GB VRAM superiority, when there are zero games right now that take advantage of it.
Yet DXR is becoming mainstream now, with 10+ games that can benefit from stronger RT hardware acceleration.


----------



## FinneousPJ (Dec 1, 2020)

nguyen said:


> So you bought into the 16GB VRAM superiority, when there are zero games right now that take advantage of it.
> Yet DXR is becoming mainstream now, with 10+ games that can benefit from stronger RT hardware acceleration.


Yes, if you have a look I literally said 10 GB is enough right now. I'm not sure I get your point here.


----------



## Vayra86 (Dec 1, 2020)

nguyen said:


> 16GB of VRAM is not gonna help you with RT performance. Are you sure there won't be any titles with good RT implementations coming out in the next 3 years?
> DLSS is another thing: who knows when DirectML will even be ready, the RX 6000 cards don't have tensor cores, and AI training also needs to be done on a per-game basis.
> 
> For 1440p / UW 1440p with RT, or 4K gaming, the 3080 10GB will always be the better option, now and for the foreseeable future.



You can play a title with good RT a few years post release, and enjoy the game at much better framerates on a GPU that does that good RT in a smooth way. You cannot however download additional VRAM for your current GPU. VRAM is used in every single game and if you upgrade to a higher resolution, you'll need more in *every single situation*. Not a handful. A few RT settings can be ignored, the game plays without it. But having to cut back on basic settings every single time on stuff like textures just so it fits for your silly 10GB GPU? Meh. I can think of better ways to spend 700+ on a GPU.

RT & DLSS... myeah. Keep buying into the proprietary crap. How did G-Sync work out now? PhysX? Oh... yeah. Meanwhile, games have done godrays since forever at much lower budgets... I'm not seeing the must-have here. There is no killer app, only grossly expensive visual effects. We've had those before; remember tessellation? Nobody batted an eye, and I dare say it was more impressive after having looked at bump maps for decades. Nobody asked for RT... now we need it, when there is barely content to show for it? Mkay, to each their own. Is this commerce talking, or did you think of it yourself?

Resale value of a 3080 will be shite, too. Nobody wants a 10GB GPU two years down the line, especially not with that kind of core oomph. The same thing applies to the 3070 and everything else Nvidia deems necessary to release with 8-10GB and performance equal to or better than the 2080 Ti, a card that already carried 11GB.

Games aren't going anywhere and neither is GPU performance. Having to play the newest on release is exactly the same as buying GPUs at launch: it's a mountain of shit you're climbing to reach the desired destination, more often than not. Wait half a year or a year, and all is well in the world. For games, that also means that not only do you get to pay half price or less, but you also get a feature-complete experience, perhaps even a GOTY edition, or something that is DLC-complete.

This chasing of the carrot is what causes issues, not the products themselves. Same thing with people hunting their beloved 3080 or 6800 or whatever right now, before Christmas... like... do we even logic? Why not wait? Is your house on fire if you don't?


----------



## nguyen (Dec 1, 2020)

FinneousPJ said:


> I think it's enough right now but most people look for a GPU to last ~3 years. *I think the 16 GB AMD is offering is the better option looking forward*.



In 3 years there will probably be hundreds of DXR titles; maybe 10 of them will benefit from 16GB of VRAM.


----------



## anachron (Dec 1, 2020)

Vayra86 said:


> You can play a title with good RT a few years post release, and enjoy the game at much better framerates on a GPU that does that good RT in a smooth way. You cannot however download additional VRAM for your current GPU. VRAM is used in every single game and if you upgrade to a higher resolution, you'll need more in *every single situation*. Not a handful. A few RT settings can be ignored, the game plays without it. But having to cut back on basic settings like textures just so it fits for your silly 10GB GPU? Meh. I can think of better ways to spend 700+.
> 
> Games aren't going anywhere and neither is GPU performance. Having to play the newest on release is exactly the same as buying GPUs at launch: it's a mountain of shit you're climbing to reach the desired destination, more often than not. Wait half a year or a year, and all is well in the world. For games, that also means that not only do you get to pay half price or less, but you also get a feature-complete experience, perhaps even a GOTY edition, or something that is DLC-complete.
> 
> This chasing of the carrot is what causes issues, not the products themselves. Same thing with people hunting their beloved 3080 or 6800 or whatever right now, before Christmas... like... do we even logic? Why not wait? Is your house on fire if you don't?


While i did run into issues which i think are related to the 8GB of VRAM of my GPU in a single game, removing the HD texture pack while keeping RT was still a better visual result for my taste. As with RT or not, i think it's a matter of personal preferences, as there is no solution with both a lot of VRAM and correct RT performances right now in this price range.


----------



## FinneousPJ (Dec 1, 2020)

nguyen said:


> In 3 years there will probably be hundreds of DXR titles; maybe 10 of them will benefit from 16GB of VRAM


We shall see lol


----------



## Vayra86 (Dec 1, 2020)

anachron said:


> While i did run into issues which i think are related to the 8GB of VRAM of my GPU in a single game, removing the HD texture pack while keeping RT was still a better visual result for my taste. As with RT or not, i think it's a matter of personal preferences, as there is no solution with both a lot of VRAM and correct RT performances right now in this price range.



Yeah, one game. How long do you play that? 8 hours? 16? 40?


----------



## Sovsefanden (Dec 1, 2020)

Vayra86 said:


> You can play a title with good RT a few years post release, and enjoy the game at much better framerates on a GPU that does that good RT in a smooth way. You cannot however download additional VRAM for your current GPU. VRAM is used in every single game and if you upgrade to a higher resolution, you'll need more in *every single situation*. Not a handful. A few RT settings can be ignored, the game plays without it. But having to cut back on basic settings every single time on stuff like textures just so it fits for your silly 10GB GPU? Meh. I can think of better ways to spend 700+ on a GPU.
> 
> RT & DLSS... myeah. Keep buying into the proprietary crap. How did G-Sync work out now? PhysX? Oh... yeah. Meanwhile, games have done godrays since forever at much lower budgets... I'm not seeing the must-have here. There is no killer app, only grossly expensive visual effects. We've had those before; remember tessellation? Nobody batted an eye, and I dare say it was more impressive after having looked at bump maps for decades. Nobody asked for RT... now we need it, when there is barely content to show for it? Mkay, to each their own. Is this commerce talking, or did you think of it yourself?
> 
> ...



LMAO, are you talking resale prices and think AMD will be superior here? AMD GPUs are pretty much worth nothing after you are done with them, just like Android phones.
AMD slowly lowers the prices on its GPUs, meaning your GPU will simply drop lower and lower in value, worth peanuts in the end.

The same thing happens with Ryzen CPUs. Look at pricing on the 1000, 2000 and 3000 series now. A used 1000/2000 chip is worth a bag of chips at this point.


Meanwhile, Intel and Nvidia hold their prices much better, like Apple products. Why? Demand is much higher, and prices are kept near MSRP levels until products go EoL.


----------



## Vayra86 (Dec 1, 2020)

nguyen said:


> In 3 years there will probably be hundreds of DXR titles; maybe 10 of them will benefit from 16GB of VRAM



And you can still play all hundred of them, but now with balls-to-the-wall RT performance and not the early adopter BS you have today.


----------



## TumbleGeorge (Dec 1, 2020)

Upgrading more often than every 4-5 years is not ecological. Your children will get cancer because of your excessive consumption!


----------



## Vayra86 (Dec 1, 2020)

Sovsefanden said:


> LMAO, are you talking resale prices and think AMD will be superior here? AMD GPUs are pretty much worth nothing after you are done with them, just like Android phones.
> AMD slowly lowers the prices on its GPUs, meaning your GPU will simply drop lower and lower in value, worth peanuts in the end.
> 
> The same thing happens with Ryzen CPUs. Look at pricing on the 1000, 2000 and 3000 series now. A used 1000/2000 chip is worth a bag of chips at this point.
> ...



Nvidia cards held their price better because they simply _were better._ That is no longer true today; it's time you start seeing that... I've been buying and reselling Nvidia cards since 2012, and I'm still saying this. It's clear as day.

Part of the reason for good resale value was mindshare, and mindshare is directly related to how good the products have been over the last X years. The Nvidia cards that hold their value well are the well-endowed VRAM cards: the 780 with 3GB, the 980 Ti with 6, the 1080 Ti with 11. Even 980s sold better because they didn't have the 0.5GB of turd-VRAM; they held value, while 970s didn't.

A 3080 with 10? It's a joke!

You have much to learn, buddy.


----------



## nguyen (Dec 1, 2020)

FinneousPJ said:


> We shall see lol



No one knows the future, just buy the better product today. This was my original message LOL.


----------



## Vayra86 (Dec 1, 2020)

nguyen said:


> No one knows the future, just buy the better product today. This was my original message LOL.



I do.


----------



## anachron (Dec 1, 2020)

Vayra86 said:


> Yeah, one game. How long do you play that? 8 hours? 16? 40?


I played it for about 55 hours, but I'm not sure what your point is. A game with issues at 10 GB of VRAM at 1440p will most probably still be the exception for a few years, so in the end choosing a GPU because of that is no more logical than choosing because of RT performance. Once again, it's only a matter of personal preference.


----------



## FinneousPJ (Dec 1, 2020)

nguyen said:


> No one knows the future, just buy the better product today. This was my original message LOL.


Obviously. It seems like we disagree which is the better product today.


----------



## nguyen (Dec 1, 2020)

FinneousPJ said:


> Obviously. It seems like we disagree which is the better product today.



Yeah, let's hope you find your "better product" at any reasonable price soon.


----------



## FinneousPJ (Dec 1, 2020)

nguyen said:


> Yeah, let's hope you find your "better product" at any reasonable price soon


Yes you too.


----------



## nguyen (Dec 1, 2020)

anachron said:


> I played it for about 55 hours, but I'm not sure what your point is. A game with issues at 10 GB of VRAM at 1440p will most probably still be the exception for a few years, so in the end choosing a GPU because of that is no more logical than choosing because of RT performance. Once again, it's only a matter of personal preference.



Yeah I find it odd that there are people who think like this:
-Turn off RT for better performance: perfectly acceptable
-Reduce Ultra detail to High for better performance: totally unacceptable

So some people are blind to RT, yet very picky about details; it doesn't make any sense.


----------



## Vayra86 (Dec 1, 2020)

nguyen said:


> Yeah I find it odd that there are people who think like this:
> -Turn off RT for better performance: perfectly acceptable
> -Reduce Ultra detail to High for better performance: totally unacceptable
> 
> So some people are blind to RT, yet very picky about details; it doesn't make any sense.



OK I'll try one more time  But I'll start by saying YES, you are correct. It is a personal consideration - we all try to crystal ball ourselves out of this, its never going to be conclusive until its too late 

- RT performance is early adopter territory. Next gen may turn things on their head altogether and make current-day perf obsolete straight away; check Turing > Ampere RT perf for proof. Remember, AMD is shipping a lite version of RT in RDNA2. It can go either way: the industry goes full steam on it and RDNA3 or beyond pushes it far more heavily, or it doesn't, and focus goes back towards better raster perf while RT takes a similar place as, say, tessellation, just another effect to use. The supposed 'RT advantage' of Nvidia can also dwindle faster than you can blink if devs start optimizing for consoles first. The additional die space Nvidia spends on it won't be used properly unless Nvidia keeps throwing bags of money around, like it has so far, to get RTX implemented.

It's far too early to determine that RT is 'here to stay' in the projected way as a 'major part' of the graphics pipeline. If the market doesn't eat it, it'll die. It's a very expensive effect; look at the price surges and demand issues... they are related.

RT is also not efficient at this time. It's the same thing as enabling overly costly AA that barely shows an advantage. Yes, you *can*... but why? In a large number of situations it really doesn't add much. You can still count the examples where it does on one hand, and you'll have fingers left.

- 10GB VRAM is not resale-worthy. It just isn't; it's yesterday's capacity. The past two gens already had more, and the fact that we're already discussing it at launch speaks volumes. You buy this to use it for a few years and then it gets knocked down the product tiers very fast. I haven't seen anyone disagree with that, by the way, even in this topic; we ALL draw the conclusion that 10GB will impose limitations pretty soon. The idea that this somehow 'scales with the core power' has absolutely no basis in the past: we've always seen capacity increase or stay equal with increasing core power. You can't ignore that imbalance. It's there and it'll show.

- 16GB VRAM is very resale-worthy, especially given that there is lots of core power on tap and the balance with core power relative to past gens is kept intact. Well-balanced GPUs last longest; it's just that simple. When they run out of oomph, they run out of all things at the same time, and that tends to take a long while. Until they do... you can resell them. A GPU without such balance doesn't resell like that; you can only resell it for 'conditional' situations, i.e. specific use cases. "The 3080's a great card for 1440p now" is probably the punchline. You'll insta-lose all potential buyers with a 4K panel or even UWs; your niche got that much smaller.

- VRAM is used everywhere. If you're short, you'll be tweaking your settings every time, not just in the games that may or may not have RT worth looking at. So going forward in time, say you buy a 4K monitor 3 years from now... with a 3080 you might also feel the urge to upgrade the GPU. With a 12-16GB card, you most certainly won't have to.

As for a hundred DXR titles... yeah. In a similar vein we also have 'hundreds' of DX12 titles... that we still prefer to run in DX11, because it's the same thing but better.

TL;DR: what it REALLY comes down to is how keen you are to early adopt RT. Except now it's not the Turing days, where the competition had nothing to place against that consideration: the competition has a technically more durable product, and it even does RT too! That's a pretty steep price tag to keep going green, if you ask me.
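To put a rough number on the resolution point: only part of a game's VRAM footprint scales with pixel count. Here's a back-of-envelope Python sketch (my own illustrative numbers, not from any post or benchmark) of how full-screen render targets grow with resolution; texture pools, which usually dominate the budget, don't scale this way.

```python
# Rough estimate of the pixel-count-dependent slice of VRAM: a handful
# of full-screen render targets (color, depth, G-buffer layers, etc.)
# at an assumed 4 bytes per pixel each. Both numbers are guesses chosen
# for illustration; real engines vary wildly.
def render_target_mib(width: int, height: int,
                      bytes_per_pixel: int = 4, buffers: int = 6) -> float:
    """MiB consumed by `buffers` full-screen targets at the given size."""
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mib(w, h):.0f} MiB of render targets")
```

Even under these assumptions, 4K needs roughly 4x the render-target memory of 1080p, which is a small slice of 10 GB on its own; the real squeeze comes when high-resolution texture packs are layered on top.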


----------



## Vya Domus (Dec 1, 2020)

nguyen said:


> Yeah I find it odd that there are people who think like this:
> -Turn off RT for better performance: perfectly acceptable
> -Reduce Ultra detail to High for better performance: totally unacceptable
> 
> So some people are blind to RT, yet very picky about details; it doesn't make any sense.



Of course it doesn't make sense, since you appear to be blind to everything except RT.

Turning RT on can sometimes halve performance, but the visual impact is minimal.
Going from Low to Ultra usually halves performance as well, or even worse, but the visual impact is massive.

You wouldn't play on Low but with RT on, would you? The priorities people have are clear, and they make perfect sense.


----------



## londiste (Dec 1, 2020)

Vya Domus said:


> Going from Low to Ultra usually halves performance as well, or even worse, but the visual impact is massive.


Who said anything about going from Ultra to Low? Impact to image quality from changing Texture setting from Ultra to Very High is either minimal or nonexistent.


----------



## Vya Domus (Dec 1, 2020)

londiste said:


> Who said anything about going from Ultra to Low? Impact to image quality from changing Texture setting from Ultra to Very High is either minimal or nonexistent.



I am talking about settings in general not just textures.


----------



## FinneousPJ (Dec 1, 2020)

Vayra86 said:


> TL;DR: what it REALLY comes down to is how keen you are to early adopt RT. Except now it's not the Turing days, where the competition had nothing to place against that consideration: the competition has a technically more durable product, and it even does RT too! That's a pretty steep price tag to keep going green, if you ask me.



Exactly. I am not interested in early adopting RTRT in the least. If AMD were to offer an RT-disabled option, I would buy that... This seems to deeply offend some people; I can't understand it, lol.


----------



## Sovsefanden (Dec 1, 2020)

I know for sure that I will be enabling ray tracing in Cyberpunk 2077. There's a massive difference in image quality, unless you are blind; there is plenty of in-game footage showcasing it. AMD's 6000 series will get RT support in this game sometime next year, but the performance hit on AMD is massive, with the 3070 beating the 6800 XT. At least the 3070 has the option to turn on DLSS 2.0 to increase fps by up to 100% and mitigate the fps hit, or simply turn RT off and just enjoy a massive fps gain; the choice is yours.

RT is here to stay and will only get more and more common going forward. AMD needs to improve its RT performance A LOT for the 7000 series.

AMD users downplay the importance of RT, DLSS and tons of other RTX features because, well, you know why..

Those High and Ultra presets are going to include RT in a few years.


----------



## londiste (Dec 1, 2020)

FinneousPJ said:


> Exactly. I am not interested in early adopting RTRT in the least. If AMD were to offer an RT-disabled option, I would buy that... This seems to deeply offend some people; I can't understand it, lol.


The reverse is also true - some people seem to be deeply offended by others wanting to try or early adopt RTRT.
The cost of hardware RT acceleration is not high to begin with.


Vya Domus said:


> I am talking about settings in general not just textures.


The context of the thread is VRAM and future-proofing based on VRAM amounts. The main setting that affects this is texture quality.


----------



## Vayra86 (Dec 1, 2020)

Sovsefanden said:


> I know for sure that I will be enabling ray tracing in Cyberpunk 2077. There is a massive difference in image quality, unless you are blind; there is plenty of in-game footage showcasing it. AMD's 6000 series will get RT support in this game sometime next year, but the performance hit on AMD is massive, with the 3070 beating the 6800 XT. At least the 3070 has the option to turn on DLSS 2.0 to increase fps by up to 100% and mitigate the hit



It's just a late-to-the-party add-on in CBP2077, and most of the lighting is intact whether RTX is on or off.

I'm not sure what you've seen, though. Got some examples? I'm open to that killer app... really I am.



FinneousPJ said:


> Exactly. I am not interested in early adopting RTRT in the least. If AMD were to offer an RT-disabled option, I would buy that... This seems to deeply offend some people. I can't understand it lol



Part of the defense is the fact that people are going to feel like they have to change camps. When you've been running a certain tint for a long time, it grows on you. I notice the same thing: a strange reluctance to switch, because I realistically don't have hands-on experience with RDNA2 yet, and I do have it with Nvidia.

Still, though, I'm thoroughly unimpressed with the product line Nvidia is producing right now, especially after the year*s* (!) of Turing, which were also extremely weak. They didn't push much forward; all we really got out of it was paying a big fat RT tax. They pre-empted RT and they failed miserably from a consumer standpoint. What have we really got now? A 2080 Ti that got eclipsed in a year, and a SUPER lineup that was too late to matter. And now they follow up with these measly VRAM amounts? GTFO. Not worth my cash.

I see a lot of corporate push because Nvidia had bad numbers post-mining and post-Pascal. They used RT to have 'the next best thing'. Good for shareholders, but I'm not seeing my benefit. AMD said it right at the time: until the midrange (the consoles) starts moving, it's dead anyway.


----------



## FinneousPJ (Dec 1, 2020)

londiste said:


> The reverse is also true - some people seem to be deeply offended by others wanting to try or early adopt RTRT.
> The cost of hardware RT acceleration is not high to begin with.
> The context of the thread is VRAM and future-proofness due to VRAM amounts. The main setting that affects this is texture quality.


Maybe. I haven't seen much of the reverse, though.


----------



## Sovsefanden (Dec 1, 2020)

Vayra86 said:


> It's just a late-to-the-party add-on in CBP2077, and most of the lighting is intact whether RTX is on or off.
> 
> I'm not sure what you've seen though, got some examples? I'm open to that killer app... really I am.
> 
> ...



Yeah, that's why Nvidia has been working closely with CDPR for years now, haha!

Why so mad? Because Nvidia can afford to back a huge title like Cyberpunk? The biggest game release in years?

Weeell, AMD has Godfall, you know, the game with insanely bad review ratings.
Most AMD users can't wait for that 12GB texture pack! LMAO! I would rather watch paint dry than play Godfall.


----------



## Vya Domus (Dec 1, 2020)

londiste said:


> The context of the thread is VRAM and future-proofness due to VRAM amounts. The main setting that affects this is texture quality.



This is the comment to which I was replying:



> -Reduce Ultra *detail* to High for better performance: totally unacceptable






londiste said:


> The main setting that affects this is texture quality.



That's not really true; the bulk of the memory used is not taken up by textures.


----------



## Sovsefanden (Dec 1, 2020)

By the time 16GB is required for 4K, the 6800 and 6800 XT will have laughable performance anyway; even the 6900 XT will look like a low- to mid-end solution.

I guess you are new to high-resolution gaming if you think VRAM is going to save a weak GPU.


----------



## londiste (Dec 1, 2020)

Vayra86 said:


> I'm not sure what you've seen though, got some examples? I'm open to that killer app... really I am.


That killer app was Control when it comes to visuals.


Vayra86 said:


> They didn't push much forward; all we really got out of it was paying a big fat RT tax. They pre-empted RT and they failed miserably from a consumer standpoint. What have we really got now? A 2080 Ti that got eclipsed in a year, and a SUPER lineup that was too late to matter.


The 2080 Ti was released in September 2018, and the 3000 series came in September 2020. Two years.
The big fat RT tax seems questionable at best; the RT cores take up about 3% of the die, if not less.


----------



## Vya Domus (Dec 1, 2020)

londiste said:


> That killer app was Control when it comes to visuals.



That "killer app" came out one year after the 2000 series launched, and it also came with an unbelievable performance hit.


----------



## Sovsefanden (Dec 1, 2020)

Vya Domus said:


> That "killer app", came out one year after 2000 series launched and it also came with an unbelievable performance hit.



You have no experience with RTX or Control, it seems, LMAO... as I expected.

You own a GTX card and are considering going with AMD's 6000 series, so you are in denial.

Wait till you see Cyberpunk screenshots using RTX, and then the dull-looking OFF image shortly after. That's the AMD experience.


----------



## sepheronx (Dec 1, 2020)

Sovsefanden said:


> You have no experience with RTX or Control it seems LMAO ...



Can you elaborate with some kind of evidence, please?


----------



## londiste (Dec 1, 2020)

Vya Domus said:


> That "killer app", came out one year after 2000 series launched and it also came with an unbelievable performance hit.


There really haven't been that many major changes in graphics for a long while. The last one was the lower-level APIs, DX12/Vulkan, and that effectively took even longer (and it still affects DXR adoption). The purpose of the low-level APIs was a performance boost, not visuals, and that is also still in progress, with varying results. Back when new stuff was introduced every couple of years, major performance hits with the latest and greatest were the norm. Remember the performance hit from tessellation, or from AA (not even accounting for the initial supersampling)? There are things that need to be figured out with every new technique, and two years is not a long time for that. A lot of the groundwork has been laid, though: the APIs are there, the big engines have support by now, etc.

The expectation that a new effect will come at no performance cost, or a minor one, is naive. Especially with something like RTRT, which is well understood as a technology and has very clear performance implications. Glad to see AMD can now start contributing to improving RTRT.

Edit:
The biggest short-term improvements in RTRT performance are not likely to come from increasing the performance of the current RT acceleration hardware. That is very easy to scale up if needed, but even Nvidia is clearly holding back from adding more RT cores. Optimizing the ray projection is also pretty well-known territory. The big base hit on performance comes from setting up the scene and data, and that is seeing slow but constant improvement. I would speculate that once a "standard" enough solution is reached, manufacturers will start looking at hardware acceleration around that as well.


----------



## Vayra86 (Dec 1, 2020)

Sovsefanden said:


> Yeah thats why Nvidia have been working closely with CDPR for years now, haha!
> 
> Why so mad? Because Nvidia can afford to back a huge title like Cyberpunk? Biggest game release in years?
> 
> ...



Look, if you're going to honor your post count/day and actually come here to troll the usual AMD-Nvidia rage debate, go elsewhere. So far you did quite well, but this is taking things into the gutter. Stahp. It won't work, and you won't be happier for going there. We have experience.

I've asked you for examples SHOWING the marked difference in Cyberpunk 2077 with RTX on and off. I ask this specifically because you seem to have seen major differences, but I never did. It's an honest question. Either answer it with honesty or GTFO; it's a pointless debate if we're not backing up anything we say with good arguments or evidence.


----------



## Sovsefanden (Dec 1, 2020)

It's funny that people think the 16GB will make the 6800 series relevant in 4-5 years, though; it will be considered absolute garbage at that point.



Vayra86 said:


> Look, if you're going to honor your post count/day and actually come here to troll the usual AMD-Nvidia rage debate, go elsewhere. So far you did quite well, but this is taking things into the gutter. Stahp. It won't work and you won't be more happy for going there. We have experience
> 
> I've asked you for examples SHOWING the marked difference in Cyberpunk 2077 with RTX on and off. I ask this specifically because you seem to have seen major differences but I never did. Its an honest question. Either answer it with honesty, or GTFO



Why should I show you? Can't you find it yourself? There are plenty of videos and articles about it, lmao.

I know it's hard to accept as a non-RTX owner, though.


----------



## Vayra86 (Dec 1, 2020)

Sovsefanden said:


> It's funny that people think the 16GB will make 6800 series relevant in 4-5 years tho, will be considered absolute garbage at that point
> 
> 
> 
> ...



Thank you; the ignore button is one click away, you know. This just confirms my assumption above about your agenda. Your post count gives it away.

Why should I be able to find something I have not found yet? What alternate reality is this?


----------



## Vya Domus (Dec 1, 2020)

londiste said:


> Remember the performance hit from Tessellation or AA (not even accounting for the initial supersampling)?



And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.



Vayra86 said:


> Look, if you're going to honor your post count/day and actually come here to troll the usual AMD-Nvidia rage debate, go elsewhere. So far you did quite well, but this is taking things into the gutter. Stahp. It won't work and you won't be more happy for going there. We have experience
> 
> I've asked you for examples SHOWING the marked difference in Cyberpunk 2077 with RTX on and off. I ask this specifically because you seem to have seen major differences but I never did. Its an honest question. Either answer it with honesty, or GTFO - its a pointless debate if we're not backing up anything we say with good arguments or evidence.



Stop responding to paid trolls (at least I hope he's being paid). Report and move on.


----------



## Sovsefanden (Dec 1, 2020)

Vya Domus said:


> And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.
> 
> 
> 
> Stop responding to paid trolls (at least I hope he's being paid). Report and move on.



You are calling me a troll only because you are a fanboy and can't look past your own nose. Funny.

Oh well, nine more days until I will be playing Cyberpunk maxed out in full RTX glory.


----------



## Vayra86 (Dec 1, 2020)

Vya Domus said:


> And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.
> 
> 
> 
> Stop responding to paid trolls (at least I hope he's being paid). Report and move on.



He's not getting paid, I hope; he's pretty bad at the game. Lasted less than a week!


----------



## Vya Domus (Dec 1, 2020)

Vayra86 said:


> He's not getting paid I hope, he's pretty bad at the game.



No, I really hope he is. I cringe at the idea that someone's time could be worth so little that they would post that kind of garbage for free.


----------



## londiste (Dec 1, 2020)

Vya Domus said:


> And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.


Tessellation has a far, far smaller performance hit than it had in the beginning, and there have been improvements even relatively recently. It is used sparingly, but it does see constant use where needed; it is all over the place.


----------



## Vayra86 (Dec 1, 2020)

londiste said:


> Tessellation has far far smaller performance hit than it had in the beginning and there have been improvements even relatively recently. It is used sparingly but it does see constant use where needed, it is all over the place.



This is true - and that is, in my view, also a good place for RT effects. In many situations you can achieve something similar with a much less costly raster effect. And since we don't have unlimited graphics power on tap... choices always have to be made, as they always have been. It's about how the budget is divided, and I think AMD is striking a much more conservative, and better, balance that way.


----------



## Vya Domus (Dec 1, 2020)

londiste said:


> Tessellation has far far smaller performance hit than it had in the beginning and there have been improvements even relatively recently. It is used sparingly but it does see constant use where needed, it is all over the place.



If it has a smaller performance hit, then why should I expect that something which is way more computationally expensive will fare any better?


----------



## londiste (Dec 1, 2020)

Vayra86 said:


> In many situations you can do similar with a much less costly raster effect. And since we don't have unlimited graphics power on tap... choices always have to be made, as they always have been made. Its about how budget is divided, and I think AMD is seeing things in a much more conservative - better - balance that way.


There are effects that are pretty darn difficult to do with rasterization, at least to the same level. The reflections in Watch Dogs Legion, especially on cars, are the first recent example that comes to mind.

There is definitely performance budgeting going on, and approaches to it vary wildly: the scene used for RT is usually simplified (see for example DF's Spider-Man video); if not simplified, it is distance-limited (different RT settings); the number of reflective surfaces is reduced (different settings; the BF5 RT patch videos have good examples); in many if not most cases the RT scene is not updated every frame; there are various temporal tricks going on, etc. The RT part itself is also tested and optimized, but it is far more known and straightforward: the way rays are cast, how they are distributed, how many bounces, distance limiting, etc.

The thing with RT effects is that they are rarely in your face. In most cases the result is a relatively subtle difference, but performance hit aside, they usually bring a good improvement in immersion. They are also somehow more impressive in motion.
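As a rough illustration of why those knobs matter (a back-of-the-envelope sketch, not how any real engine budgets; the function and numbers are illustrative only), the raw ray count scales multiplicatively with resolution, rays per pixel, and bounce depth, so halving the RT resolution and refreshing every other frame compounds quickly:

```python
def rays_per_second(width, height, rays_per_pixel, max_bounces, fps,
                    rt_update_every_n_frames=1):
    """Crude upper bound on rays traced per second.
    Each bounce can spawn a follow-up ray, so cost grows with (1 + max_bounces).
    Updating the RT scene only every Nth frame divides the effective cost."""
    primary = width * height * rays_per_pixel
    per_frame = primary * (1 + max_bounces)
    return per_frame * fps / rt_update_every_n_frames

# 4K at 60 fps, 1 ray/pixel, 1 bounce: roughly a billion rays per second
full = rays_per_second(3840, 2160, 1, 1, 60)
# Same settings, but RT done at half resolution and refreshed every 2nd frame
budgeted = rays_per_second(1920, 1080, 1, 1, 60, rt_update_every_n_frames=2)
print(f"{full:.2e} vs {budgeted:.2e} rays/s")  # the budgeted version is 8x cheaper
```

None of this accounts for denoising or BVH build cost, which is exactly the "setting up the scene and data" overhead mentioned above.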


----------



## Vayra86 (Dec 1, 2020)

londiste said:


> There are effects that are pretty darn difficult to do with rasterization, at least to the same level. Reflections in Watch Dogs Legion - especially on cars - is the first that comes to mind from recent examples.
> 
> There is definitely performance budgeting going on. Approaches to it vary wildly, scene for RT is usually simplified (for example DF's Spiderman video), if not simplified it is distance limited (different RT settings), amount of reflecting surfaces is reduced (different settings, BF5 RT patches videos have good examples) in many if not most cases the scene used for RT is not updated every frame, there are different temporal tricks going on etc. The part about RT itself is also tested and optimized but it is far more known and straightforward - the way rays are cast, they are distributed, how many bounces, distance limiting etc.
> 
> The thing with RT effects is that they are rarely in your face. In most cases the result is a relatively subtle difference but performance hit aside they usually bring a good improvement in immersion. They are also somehow more impressive in motion.



Cyberpunk seems to suffer a similar fate in some areas and scenes, then: less complexity to cater to RT.

Not sure if that's a win...


----------



## anachron (Dec 1, 2020)

Vayra86 said:


> OK I'll try one more time  But I'll start by saying YES, you are correct. It is a personal consideration - we all try to crystal ball ourselves out of this, its never going to be conclusive until its too late
> 
> - RT performance is early adopter territory. Next gen may turn things on its head altogether and make current day perf obsolete straight away. You can check Turing > Ampere RT perf for proof. Remember, AMD is having a lite-version of RT in RDNA2. It can go either of both ways - the industry goes full steam on it and RDNA3 or beyond will push it far more heavily, or they really don't and focus goes back towards better raster perf while RT takes a similar place as, say, Tesselation - just another effect to use. The supposed 'RT advantage' of Nvidia can also dwindle faster than you might blink if devs start optimizing for consoles first. The additional die space Nvidia has for it, won't be used properly unless Nvidia keeps using bags of money like they have so far to get RTX implementation.
> 
> ...


You are right that we can't know for sure what will happen in the future; this is why there is no "perfect" answer, as neither AMD's nor NVIDIA's offerings cover all the possibilities.
While I have run into an issue with 8GB of VRAM in one specific game, 10GB may still be enough for a few years... or not. But regarding RT, if I buy a 6800 XT right now, I don't need a crystal ball to know that I will have worse performance in the few RT games I play or intend to play soon, which would be quite sad after spending that much money on a new GPU. So, for _someone who appreciates RT_ like me, NVIDIA seems to be the better choice. If you don't care about RT at all, or don't mind waiting for it to mature, then indeed AMD's offerings may be the better choice.



Vayra86 said:


> - 10GB VRAM is not resale-worthy. Its just not. Its yesterday's capacity. Past two gens already had more. The fact we're already discussing it at launch speaks volumes.. You buy this to use it for a few years and then it gets knocked down the product tiers very fast. I haven't seen anyone disagree with that, by the way, even in this topic. We ALL draw the conclusion that 10GB will impose limitations pretty soon. The idea that this somehow 'scales with the core power' has absolutely no basis in the past - in the past, we've always seen an increase or equal capacity with increasing core power. You can't ignore that disbalance. Its there and it'll show.
> 
> - 16GB VRAM is very resale-worthy, especially given the fact that there is lots of core power on tap and the balance with the core power relative to past gen is also kept intact. Well balanced GPUs last longest. Its just that simple. When they run out of oomph, they run out of all things at the same time, and that tends to take a long while. Until they do... you can resell them. A GPU without such balance doesn't resell like that - you can only resell it on 'conditional' situations, ie specific use cases. '3080's a great card for 1440p now', is probably the punchline. You'll insta-lose all potential buyers with a 4K panel or even UWs - your niche got that much smaller.


To be honest, I don't buy a GPU based on resale value. I mean, I might compare it if everything else were equal, but it's at the bottom of the list. Then again, it may depend on personal circumstances.



Vayra86 said:


> - VRAM is used everywhere. If you're short, you'll be tweaking your settings every time, not just in the games that may or may not have RT worth looking at. So going forward in time, say you'll be buying a 4K monitor 3 years from now... with a 3080 you might also feel the urge to upgrade the GPU. With a 12-16GB card, you most certainly won't have to.


I may agree with you at 4K; at 1440p it seems there is still room with 10GB for some time. But the 16GB of VRAM will not be of much use either if you have to reduce image quality because the card is not fast enough for the games that require that amount. But as you said, we have no crystal ball.



Vayra86 said:


> As for a hundred DXR titles... yeah. In a similar vein we also have 'hundreds' of DX12 titles... that we still prefer to run in DX11 because its the same thing but better.
> 
> TL DR what it REALLY comes down to... is how keen you are to early adopt RT. Except now its not the Turing days where the competition has nothing to place against that consideration - the competition has a technically more durable product - and it even does RT too! That's a pretty steep price tag to keep going green if you ask me.


Well, I agree with you that the price tag is steep, but as I said, I would feel worse buying something that performs worse in a game I have already played with similar settings. Especially since the only realistic reason I would have to change my GPU is performance in RT games, as I have had no issues with non-RT games so far.

In the end, it seems to me that the good thing is that we at least have a choice now.


----------



## EarthDog (Dec 1, 2020)

Sovsefanden said:


> Yep I agree most people will be satisfied with 5700XT or 2070 Super at 1440p, even 5700 non-XT and 2070 non-Super would do fine for 99% of people at this res
> 
> 
> 
> ...


This was, of course, under the assumption of ultra settings... we're not talking about esports titles and competitive games where turning settings down is key. We're talking about the Joe Blow gamer who strives for ultra.


----------



## anachron (Dec 1, 2020)

The funny thing about the whole dispute in this thread is that, given the non-existent availability of both brands, people vouching for AMD should be happy that people vouching for NVIDIA don't buy the cards they want, and vice versa. It's a win-win situation...


----------



## nguyen (Dec 1, 2020)

anachron said:


> The funny thing about the whole dispute in this thread is that, given the non existent availability of both brand, people vouching for AMD should be happy that people vouching for NVIDIA don't buy the card they want and vice versa. It's a win win situation...



Well, I'm sure that some of the people vouching for AMD in this thread are not buying the RX 6000 at all, or even actually playing today's games... kinda pointless discussion, really.

But hey, I found an upcoming new game that will make good use of RT: Riftbreaker, sponsored by AMD.

And turning RT on crushes performance on AMD hardware.


----------



## Vayra86 (Dec 1, 2020)

anachron said:


> The funny thing about the whole dispute in this thread is that, given the non existent availability of both brand, people vouching for AMD should be happy that people vouching for NVIDIA don't buy the card they want and vice versa. It's a win win situation...



Comment of the day. Shame we can't upvote.

So, let's start a poll: "Which GPU are you not buying this year?"



nguyen said:


> Well I'm sure that some people who are vouching for AMD in this thread are not buying the RX6000 at all, or even actually playing today games...kinda pointless discussion really.
> 
> But hey I found some upcoming new game that will make good use of RT, it's Riftbreaker, sponsored by AMD
> 
> ...



The penny really has to drop sometime, but is the plank on your forehead this thick? Many people _DO. NOT. CARE._
Gaming isn't about graphics, you know; it's about games. If you need graphics teasers to buy into games, man... you're far gone, IMO. The visual aspect is just that: one aspect, with varying relevance.

That also applies to the 'not early adopting' part. Most people don't give a rat's behind about what's new. They look at what's nice to have and easy to get. I fall into that category, for example, and guess what: I'm still a hardcore gamer nonetheless. It's a very diverse target audience.


----------



## Chomiq (Dec 1, 2020)

This is kind of a pointless discussion; both sides are arguing over a "what if" scenario 2-3 years out.

In 2-3 years the 3080 won't be enough to run 4K at ultra anyway; that's expected.
16 GB of VRAM won't bottleneck you at 4K, but GPU performance is based on more than just VRAM.
In 2-3 years Nvidia will be on gen 3-4 of its RT implementation, and AMD will probably be on gen 2-3. The odds are that GPUs from the early gens will be useless for anything above 1080p medium RT.
If both Sony and MS release mid-gen variants of their consoles, then what's achievable by the XSX/PS5 with RT will become the baseline for an "RT Low" setting.


----------



## Calmmo (Dec 1, 2020)

The 3080 is a great short- to mid-term GPU; I doubt it will age well, but that's beside the point. People who are looking to buy a high-end GPU and keep it for 4-5 years... well, just wait for the higher-VRAM versions.


----------



## londiste (Dec 1, 2020)

Vayra86 said:


> The penny really has to drop sometime, but is the plank on your forehead this thick? Many people _DO. NOT. CARE. _
> Gaming isn't about graphics you know. Its about games. If you need graphics teasers to buy into games, man... you're far gone IMO. The visual aspect is just that: one aspect - with varying relevance.


For this particular audience the end of days is already here. Midrange from the last generation has more power than they need, and probably high-end from the generation before as well.


----------



## Vayra86 (Dec 1, 2020)

londiste said:


> For this particular audience the end of days is already here. Midrange from last generation has more power than they need and probably high end from generation before as well



Correctamundo! The thing is, if you look at overall market share per tier, the vast majority is in that group. It's shifting a little, but not by much.

I'm missing some FPS now with this new res, but other than that... everything's plenty playable.


----------



## Vendor (Dec 1, 2020)

dirtyferret said:


> AMD RX 5500XT 4GB still easily outperforms the PS4


Actually, the PS4 GPU (the non-Pro one) is even slower than a GTX 950.


----------



## londiste (Dec 1, 2020)

Vendor said:


> actually, ps4 gpu (non-pro one) is even slower than a gtx 950


The closest match for the PS4 GPU among desktop cards is probably the Radeon HD 7850. The PS4 Pro GPU sits between the RX 470 and RX 480.


----------



## budgetgaming (Dec 3, 2020)

Yesterday Nvidia launched the 3060 Ti... what does that mean? My guess, which we have talked about before, is correct: next year you will see a 3070 Ti and a 3080 Ti on the market...


----------



## Mussels (Dec 3, 2020)

budgetgaming said:


> Yesterday nvidia launch 3060ti....what does it mean? My guess which we have talk about it is correct next year u will see a 3070ti and 3080ti in the market....



Of course, and probably Super models after that. There will always be a better model coming along.


----------



## vincat84 (Dec 26, 2020)

What about the new RTX IO technology? I believe this is going to address the card's VRAM shortfall (when needed) by giving it faster access to the M.2 PCIe 4.0 SSD.


----------



## lexluthermiester (Dec 26, 2020)

MxPhenom 216 said:


> Its enough.


For now.


MxPhenom 216 said:


> I don't expect the 20GB version to really boost performance in games right now.


That's just the thing, really: adding more VRAM is never about a "performance boost" so much as it is about making sure you have enough VRAM that performance doesn't degrade when a game needs more and is forced to swap data out to system RAM. In this respect, having more RAM is always good. It's always better to have more than you need than to not have enough. This applies to system RAM as much as to VRAM.
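To put rough numbers on the "swap out to system RAM" point (a back-of-the-envelope sketch only; real games use compressed texture formats and streaming, so actual usage is far lower and varies widely), an uncompressed RGBA texture costs width × height × 4 bytes, plus roughly a third more for its mip chain:

```python
def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture, in MiB.
    A full mip chain adds roughly 1/3 on top of the base level."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

# One 4096x4096 RGBA texture with mips is roughly 85 MiB uncompressed,
# so on the order of 120 such textures resident at once would already
# fill a 10 GB card, before geometry, render targets, and buffers.
per_texture = texture_mib(4096, 4096)
budget_gb = 10
print(f"{per_texture:.1f} MiB each, ~{budget_gb * 1024 / per_texture:.0f} fit in {budget_gb} GB")
```

Block compression (BCn) cuts this by 4-8x in practice, which is why the spillover point is hard to predict from texture counts alone.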


----------



## Mussels (Dec 26, 2020)

Resizable BAR + Nvidia's RTX IO (and AMD's variant) will probably change how things work as far as game loading goes.


----------



## Chrispy_ (Dec 27, 2020)

IMO 10GB is enough for now. It probably won't be enough at some point next year, but people who splurge on flagship vanity cards at 3x the cost of anything in the performance/$ sweet spot aren't likely to give a shit about their card next year, because they'll be buying whatever new model comes out then anyway.

If you care about longevity or performance/$, then 10GB is too little for a long-term card. 12GB is probably going to become the new baseline for Ultra settings (it's been 8GB for some games since 2019), and with the competition shipping 16GB and the inevitable PS5/XBSX refresh in 12-18 months, you can expect GPU power and VRAM quantities to increase, just like the One X did. My guess is that the XBSX refresh will come with 20GB of RAM to take advantage of developers targeting 16GB GPUs cross-platform, whilst still having 4GB of RAM for the OS and game code to run on the CPU.

As always, this discussion is entirely subjective opinion, because if anyone could accurately predict the future, the human race would be in a much better position than it is right now.


----------



## EarthDog (Dec 27, 2020)

It's a good thing 10GB will be fine for a generation's worth of time (really, one year won't suddenly change anything). Sure, we all want more, but as good as it is to have a buffer, it's a waste of money when you aren't using anywhere near its capacity. If you play at 4K/ultra, perhaps it will be good to have towards the end of the card's life in a few years...

In theory, the Xbox can use up to 13.5GB. In reality, it's a pool shared with the rest of the system, one that also shares bandwidth, and games will use less. I highly doubt this generation of consoles will push RAM use past 10GB, and I doubt they'll add more system RAM in an updated console... there's no reason to in a closed ecosystem.

As was mentioned... this is an opinion as well...


----------



## Rei (Dec 27, 2020)

Chrispy_ said:


> and the inevitable PS5/XBSX refresh in 12-18 months, you can expect the GPU power and VRAM quantities to increase, just like One X did. My guess is that the XBSX refresh will come with 20GB RAM to take advantage of developers targeting 16GB GPUs cross-platform whilst still having 4GB RAM for the OS and game code to run on the CPU.


Why do you think there will be a refresh of the PS5 & Xbox 4 in less than 2 years? The only reason there was a refresh of the last-gen consoles was to take advantage of the sudden demand in the 4K market segment while maintaining the status quo at 1080p. I highly doubt there will be a 6K market, and I don't see 8K content becoming commonplace in the next 4-5 years, so there won't be a need for a mid-gen refresh. By that time, both (or all 3) companies will already be working on their next-gen consoles.


EarthDog said:


> I doubt they'll add more system RAM to an updated console...there's no reason to in a closed ecosystem.


Why not? They did that for the Xbox One X, with 4 GB of additional RAM, and 1 GB for the PS4 Pro. Though the PS4 Pro's extra RAM was DDR3 for OS use, meaning the GDDR5 allowance for games went up from 5.5 GB to 6.5 GB.
If this gen's refreshed consoles have to support higher-res output without impacting graphical fidelity or game performance, they will have to increase RAM as well; otherwise it'll be a half-baked refresh.


----------



## EarthDog (Dec 27, 2020)

Rei said:


> Why do you think that there will be a refresh of PS5 & Xbox 4 in less than 2 years? The only reason that there was a refresh of last gen console was to take advantage of the sudden demand in 4K market segment while maintaining status quo for 1080p. I highly doubt that there will be a 6K market while I don't see 8K content becoming commonplace in the next 4-5 years that there will be a need for mid-gen refresh. By that time, both (or all 3) companies would have already been working on the next-gen console.
> 
> Why not? They did that for the Xbox One X with 4 GB of additional RAM, & 1 GB for the PS4 Pro. Though the PS4 Pro's extra RAM was DDR3 reserved for OS use, meaning the GDDR5 allowance for games went up from 5.5 GB to 6.5 GB.
> If this gen's refreshed consoles are to support higher-rez output without impacting graphical fidelity or game performance, they would have to increase RAM as well, otherwise it'll be a half-baked refreshed console.


I forgot they did that in the past. But as you said earlier in that post, requirements for monitors won't be going up so there's that. We'll see if 13.5GB for all isn't enough. 

That said, I wonder how much that did for either console... how many titles could take advantage of it and what the difference is between versions fps wise.


----------



## Rei (Dec 27, 2020)

EarthDog said:


> That said, I wonder how much that did for either console... how many titles could take advantage of it and what the difference is between versions fps wise.


This is just my observation from watching too many Digital Foundry comparison videos, but there were a few misses, such as better graphics and/or rez but a worse frame rate than on the original console. Sure enough, that was in quality mode, not performance mode. My guess is, it depends on how well each game is optimized, as well as the developer's choice to prioritize rez/graphics over frame rate & vice-versa.


----------



## EarthDog (Dec 27, 2020)

Rei said:


> This is just my observation from watching too many Digital Foundry comparison videos, but there were a few misses, such as better graphics and/or rez but a worse frame rate than on the original console. Sure enough, that was in quality mode, not performance mode. My guess is, it depends on how well each game is optimized, as well as the developer's choice to prioritize rez/graphics over frame rate & vice-versa.


So...snake oil, really. Just another reason to sell something 'better'. Not a need/requirement for improvement.


----------



## Rei (Dec 27, 2020)

EarthDog said:


> So...snake oil, really. Just another reason to sell something 'better'. Not a need/requirement for improvement.


Yup, totally with you.

It just dawned on me why 8K isn't gonna be a thing for a while: storage space & pricing. The price of high-capacity storage devices needs to go down hard before people can take in more 8K content, as it takes up several times more space than 4K content.
No easily affordable large storage drives means lack of 8K adoption, which means no mid-gen refresh.
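The scaling argument checks out on the back of an envelope; a minimal sketch in Python (the 100 GB install size is a hypothetical figure, not from any real title, and it assumes equal codec efficiency, frame rate, and bit depth):

```python
# Pixel-count ratio roughly drives storage/bitrate for same-quality content.
res_4k = 3840 * 2160
res_8k = 7680 * 4320
ratio = res_8k / res_4k
print(ratio)  # 4.0 -- 8K pushes 4x the pixels of 4K

# Hypothetical: a 100 GB 4K-asset install scaled naively to 8K assets.
install_4k_gb = 100
print(install_4k_gb * ratio)  # 400.0 (GB)
```

So even a naive scaling puts 8K installs in territory that current drive pricing makes painful, which is the point about storage gating adoption.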


----------



## Chrispy_ (Dec 27, 2020)

Rei said:


> Why do you think that there will be a refresh of PS5 & Xbox 4 in less than 2 years? The only reason that there was a refresh of last gen console was to take advantage of the sudden demand in 4K market segment while maintaining status quo for 1080p. I highly doubt that there will be a 6K market while I don't see 8K content becoming commonplace in the next 4-5 years that there will be a need for mid-gen refresh. By that time, both (or all 3) companies would have already been working on the next-gen console.



Because it has happened with previous generations and for the current generation it's shown on the roadmap already. It's not that it might happen, it will _definitely happen_. Timing is a bit of a guess, but that's really all that's uncertain.

Look at it from Microsoft and Sony's perspective, in a year from now AMD and TSMC will be able to produce the same silicon design on a cheaper process that also requires less power and cooling enabling a cheaper console. That lets them sell the "same" console at a lower entry price, enticing more consumers into their console ecosystem and making them more profit. At the same time, they can offer an upgraded console that delivers 4K60 where the previous console maybe only did 4K30 and hardcore gamers will buy that console as an upgrade, which is yet another sale for the console manufacturer and potentially one more console passed onto a friend without a console - guess what, that's one more customer brought into the console ecosystem.

The sales model for consoles is about selling subscriptions and games. There's really not much profit in the consoles themselves so it makes competitive sense to offer the most appealing console you can at all times.


----------



## Rei (Dec 27, 2020)

Chrispy_ said:


> Look at it from Microsoft and Sony's perspective, in a year from now AMD and TSMC will be able to produce the same silicon design on a cheaper process that also requires less power and cooling enabling a cheaper console. That lets them sell the "same" console at a lower entry price, enticing more consumers into their console ecosystem and making them more profit.


This part isn't called a refresh. It's a revision & has already been done before with PS1-4, Xbox 360, Xbox One S, Wii, etc.


Chrispy_ said:


> At the same time, they can offer an upgraded console that delivers 4K60 where the previous console maybe only did 4K30 and hardcore gamers will buy that console as an upgrade, which is yet another sale for the console manufacturer and potentially one more console passed onto a friend without a console - guess what, that's one more customer brought into the console ecosystem.
> 
> The sales model for consoles is about selling subscriptions and games. There's really not much profit in the consoles themselves so it makes competitive sense to offer the most appealing console you can at all times.


This is unlikely to happen in a controlled ecosystem if the appeal is just a minor boost in performance. For a mid-gen refresh, they would need a larger motivation, such as 8K support without graphical fidelity & performance being worse than the 4K equivalent, like they did with last gen's refresh; otherwise the new refresh is likely gonna flop. And 8K adoption won't happen for a while without some hurdles being cleared, such as the storage capacity vs. pricing issue I mentioned in my previous post.


----------



## Chrispy_ (Dec 28, 2020)

Rei said:


> This part isn't called a refresh. It's a revision & has already been done before with PS1-4, Xbox 360, Xbox One S, Wii, etc.


Potayto, potahto. Similar specs as the original, but sold cheaper as a slim/lite/XS or whatever.



Rei said:


> This is unlikely to happen in a controlled ecosystem if the appeal is just a minor boost in performance.


I don't think you understood me; I'm not speculating or guessing, I'm reporting stuff that has already been confirmed: Phil Spencer, head of Xbox, said back in September that both the refresh/revision and the successor to the Series X were already in development, according to the Kotaku interview, and he's answered further probing via Twitter and in a weird (but popular) Nintendo Twitch stream, too.

So, whilst the successor and a refresh/revision are both _definitely _coming, we don't know exactly when and I suspect Microsoft won't make any announcements until the shine has worn off the fresh new Series X. My guess is that it'll be the 2022 holiday season, but could easily be before then due to TSMC constraints and the Series X silicon being relatively expensive compared to previous consoles. They'll cut costs with revised silicon as soon as they possibly can and take advantage of whatever generational improvements AMD have made to Zen and RDNA since then without breaking compatibility.

Given dev feedback, improved raytracing performance is low-hanging fruit for AMD/Microsoft - as are tweaks to provide a DLSS equivalent (FidelityFX with intelligent VRS, perhaps?) that might make 4K60 more achievable. Almost everyone has _at least_ a 4K60 TV, but neither the PS5 nor the XBSX can realistically run current games at 4K60. They still have to use dynamic resolution scaling, and some games just stick to 30fps instead. CP2077 is perhaps the first major launch for the new consoles, and 4K60 isn't happy on either of them.


----------



## lexluthermiester (Dec 28, 2020)

Chrispy_ said:


> IMO 10GB is enough for now. It probably won't be enough at some point next year, but people who splurge on flagship vanity cards at 3x the cost of anything in the performance/$ sweet spot aren't likely to give a shit about their card next year, because they'll be buying the new model that comes out then, anyway.


Actually, those are really good points!


----------



## Mussels (Dec 28, 2020)

Thing is, you can always turn a texture setting down from ultra to high and drop that VRAM usage. 8GB is going to be the 'high end' target for almost all game devs, as they know it's incredibly common.

Hell, the GTX 1060 6GB dominates the Steam charts, so 6GB at medium/high settings seems like a reasonable target for most titles.

Unlike some games (cough CP2077), in most games you can turn the textures or AA down and VRAM usage plummets, whereas with GPU grunt you're kinda screwed if you don't have enough.


----------



## evernessince (Dec 28, 2020)

londiste said:


> That killer app was Control when it comes to visuals.
> 2080Ti was released in September 2018. 3000-series came September 2020. 2 Years.
> Big fat RT tax seems to be questionable at best. RT Cores take up about 3% of the die if not less.



I'd have to disagree

Here is a video demonstrating RTX in Control:










To be honest I'm getting the same quality of reflections with my 1080 Ti in Cyberpunk 2077 with RTX off.

Unless you are specifically cherry picking examples like the RTX trailer does for nearly every game, there are multiple ways to do rasterized reflections that are high quality without ray tracing.



vincat84 said:


> What about the new RTX IO technology? I believe this is going to fix the VRAM lack in the card (if needed it) by accessing faster to the M2 pcie 4 SSD card



No, accessing an M.2 PCIe SSD will still be many times slower than VRAM. There's no way to get around the fact that the physical distance from the VRAM to the GPU is a tiny fraction of that to the NVMe SSD; on the 30xx series it's pushed up really close to the GPU. RTX IO is only there to address the increasing complexity and in-use asset sizes in games, not to diminish VRAM requirements.



Chomiq said:


> This is kind of a pointless discussion, both sides are arguing over "what if" scenario in 2-3 years.
> 
> In 2-3 years 3080 won't be enough to run at 4K at ultra anyway, that's expected.
> 16 GB of VRAM won't bottleneck you at 4K but GPU performance is based on more than just VRAM.
> ...



Resolution doesn't have as much of an effect on memory consumption as you think. CP2077 shows maybe a 400MB difference between 1440p and 4K.

If you think dropping the resolution from 4K to 1440p is going to save these cards memory-wise, think again. 2-3 years is a pretty pitiful life expectancy as well, no? I've had my 1080 Ti for longer than 3 years and have yet to drop the texture settings. Then again, it does have more VRAM than cards just released. Imagine that, a card actually designed to last.
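The ~400MB figure is plausible from first principles: render targets scale with pixel count, but the texture pool (the bulk of VRAM use) does not. A rough sketch (the count of 10 full-resolution targets is an assumed ballpark for a deferred renderer, not a measured figure):

```python
# Why resolution alone moves VRAM only modestly: render targets scale with
# resolution, while the texture pool (which dominates) stays fixed.
def target_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

fb_1440 = target_mib(2560, 1440)  # ~14 MiB per 32-bit full-screen target
fb_4k = target_mib(3840, 2160)    # ~32 MiB per 32-bit full-screen target

# Assume ~10 full-resolution targets (G-buffer, depth, post-processing chain):
targets = 10
print(round(targets * (fb_4k - fb_1440)))  # 176 (MiB) extra going 1440p -> 4K
```

A couple hundred MiB of render-target growth against a multi-GB texture pool is consistent with the observed 1440p-to-4K delta.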



EarthDog said:


> It's a good thing 10GB will be fine for a generation's worth of time (really, one year won't suddenly change anything). Sure, we all want more there, but as good as it is to have a buffer, it's a waste of money when you aren't using close to its capacity. If you play at 4k/ultra perhaps it's good to have towards the end of its life in a few years...



We are already using the entire VRAM amount or more, depending on the resolution and game. The only buffer you have left is the game engine and drivers frugally managing your VRAM to avoid massive stuttering.

The 1080 Ti has 11GB and is almost 4 years old. Do you honestly think it's good for games or customers to be buying a 3080 for the same price with only 10GB? Do you honestly believe that doesn't limit game devs? That's right now; forget about the future. 1 year in, it might be fine. Will it last another 4 years like the 1080 Ti has? Most likely not, nor should it; a VRAM decrease to 10 GB over a period of 8 years, from a developer standpoint, has got to be a joke.

There is no defense for the amount of VRAM on Nvidia's new cards, aside from the 3090. It's paltry. People defending this are holding back game devs. A newly released card should not be at its VRAM limit right out of the gate. You can't make games that utilize more VRAM when no cards on the market exist with more VRAM, aside from a $1,500 prosumer product. People forget that a VRAM buffer isn't just for the longevity of the card itself; it's also there to push devs to use that VRAM down the line. Software follows hardware, not the other way around.

Not saying we need 16GB (12GB would have been fine), but the current situation is poor. You can argue till the cows come home about whether X or Y amount of VRAM is required for game A or B, but what isn't arguable is that stagnant, or even net-negative, VRAM on video cards has the potential to hold back game developers and, by extension, games. It stands to reason that software follows hardware; hardware with more VRAM has to be released before developers can utilize it.


----------



## lexluthermiester (Dec 28, 2020)

evernessince said:


> A newly released card should not be at its VRAM limit right out of the gate.


This.


evernessince said:


> Not saying we need 16GB


I disagree. AMD has the right idea with the 6000 series Radeons having 16GB from the get-go.


----------



## Vayra86 (Dec 28, 2020)

Mussels said:


> Thing is, you can always turn a texture setting down from ultra to high, and drop that VRAM usage.
> 
> *Unlike some games* (cough CP2077) most games you can turn the textures or AA down and VRAM usage plummets



You assume you will maintain equal control over your settings in games as you always have, while we keep getting a longer list of titles that really don't offer that control at all. And the new consoles, especially ports who utilize new technologies, are going to simplify those settings further into dynamic and probably hard to control setups - OR - you'll find you have to push sliders down pretty damn far to get a desired effect.

But, yes, you can turn settings down... I can also do that on a 2016 GTX 1080 with *8GB*.
The assumption that core power will run out first, on a GPU with more than 200% of the oomph but a mere 125% of the VRAM, is a very strange one in relative terms...


----------



## cueman (Dec 28, 2020)

yes it is.

If you play 4K games on a 27" monitor, I know of only one game (out of 19) that needs a little bit more, and that's with every option maxed out.
If you play at anything lower, no problem... now and in the future...


----------



## DropToasterInTub (Dec 28, 2020)

I don't necessarily think the argument over whether 10 GB is enough for today's games is the real problem; the bigger issue is that the x70 SKU has been stuck on 8 GB for almost 5 years now. There's been absolutely NO progression whatsoever for the x70 and below since Pascal... and Pascal's already nearing 5 years old. Yes, NVIDIA finally gave the x80 a bump to 10 GB, but even the Maxwell GTX TITAN X had more VRAM than that back in early 2015. Of course, that's comparing a TITAN card against a mere mortal GeForce product, but it shows how stagnant NVIDIA has been with VRAM amounts for quite some time now. Pascal doubled Maxwell's VRAM across the board, and tripled it from the GTX 960 to the 1060.

It's even funnier that an RTX 3060 12 GB is rumored, which has more VRAM than a 3080. I don't believe the 3060 will benefit from it, as the GPU itself is too weak to take advantage of it; but this just goes to show that NVIDIA's making a mess of their product stack when they could've just gone with 12 GB for the 3080 on a 384-bit bus. Even with 16 Gbps G6 memory on a 384-bit bus, it would still have more VRAM, and slightly more bandwidth, than 19 Gbps G6X on a 320-bit bus (768 GB/s for 384-bit G6 vs. 760 GB/s for 320-bit G6X). Perhaps it'd even be the cheaper route, too. Plus, the 3070 could have had 10 GB on a 320-bit bus with the same 16 Gbps G6 memory, but now with even more bandwidth.
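Those bandwidth figures follow directly from bus width times per-pin data rate; a quick check (the 384-bit/16 Gbps config is the poster's hypothetical, not a shipping card):

```python
# GDDR bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits-per-byte.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(384, 16))  # 768.0 GB/s -- the hypothetical 12GB G6 config
print(bandwidth_gbs(320, 19))  # 760.0 GB/s -- the shipping 10GB G6X 3080
```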


----------



## Night (Dec 28, 2020)

I would rather run out of VRAM and be able to lower some 'not so essential' setting(s) than have 20 GB of VRAM without the processing power to use it. Knowing Nvidia, this won't be a $10 increase anyway.


----------



## EarthDog (Dec 28, 2020)

evernessince said:


> We are already using the entire VRAM amount or more depending on the resolution and game


Look at any of the games reviewed at TPU. At 4K using ultra settings, there is ONE title that uses more than 10GB. One. The rest are well under that, typically by a few GB. I'm not worried about that snowball getting much bigger (and in this case, that title doesn't have any hitching issues; it's an allocation vs. actual use thing).


----------



## Vayra86 (Dec 28, 2020)

DropToasterInTub said:


> I don't necessarily think the argument over whether 10 GB is enough for today's games is the real problem; the bigger issue is that the x70 SKU has been stuck on 8 GB for almost 5 years now. There's been absolutely NO progression whatsoever for the x70 and below since Pascal... and Pascal's already nearing 5 years old. Yes, NVIDIA finally gave the x80 a bump to 10 GB, but even the Maxwell GTX TITAN X had more VRAM than that back in early 2015. Of course, that's comparing a TITAN card against a mere mortal GeForce product, but it shows how stagnant NVIDIA has been with VRAM amounts for quite some time now. Pascal doubled Maxwell's VRAM across the board, and tripled it from the GTX 960 to the 1060.
> 
> It's even funnier that an RTX 3060 12 GB is rumored, which has more VRAM than a 3080. I don't believe the 3060 will benefit from it, as the GPU itself is too weak to take advantage of it; but this just goes to show that NVIDIA's making a mess of their product stack when they could've just gone with 12 GB for the 3080 on a 384-bit bus. Even with 16 Gbps G6 memory on a 384-bit bus, it would still have more VRAM, and slightly more bandwidth, than 19 Gbps G6X on a 320-bit bus (768 GB/s for 384-bit G6 vs. 760 GB/s for 320-bit G6X). Perhaps it'd even be the cheaper route, too. Plus, the 3070 could have had 10 GB on a 320-bit bus with the same 16 Gbps G6 memory, but now with even more bandwidth.



If Nvidia had snagged TSMC 7nm for Ampere they very well might have been able to.

I think with their wishlist they just couldn't cram more than this into the already heavily expanded TDP budgets while meeting Navi's performance at a favorable price point. They would have to make an even bigger die to do so, cutting further into margins and therefore increasing risk. And there is already a gap between Navi's die size and Ampere's; it's the polar opposite of the gap Nvidia had going for it in every gen over the last few years.

The cost of RT?
Not on my wallet, that's for sure.


----------



## lexluthermiester (Dec 28, 2020)

Night said:


> I would rather run out of VRAM and be able to lower some 'not so essential' setting(s) than have 20 GB of VRAM without the processing power to use it. Knowing Nvidia, this won't be a $10 increase anyway.


Can't agree with this. It's always better to have more RAM. Always.


----------



## EarthDog (Dec 28, 2020)

lexluthermiester said:


> Can't agree with this. It's always better to have more RAM. *Always*.


Unless you're that guy... and this guy... and whoever else feels this way.

As you know, the difference between ultra and high is often tough to distinguish without pausing and comparing stills. Couple that with less need for high AA at 4K UHD, and there are a few settings that can be turned down to save some vRAM without a notable loss of IQ.

I'm an ultra guy myself (2560x1440/144), but I'd rather save $100(?) and make a couple of tweaks at the EOL personally... others don't mind paying the premium and not using the extra RAM 90% of the time over the typical life cycle of a GPU (a few years). There is certainly more than one way to skin this cat.


----------



## lexluthermiester (Dec 28, 2020)

EarthDog said:


> I'm an ultra guy myself (2560x1440/144)


I'm not. But experience has taught me time and again that having more RAM is ALWAYS better than not having enough. Fraking ALWAYS!


EarthDog said:


> but would rather save $100(?) and make a couple of tweaks at the EOL personally...


I learned in the '80s that anytime a company released a card with a certain amount of RAM installed, it wouldn't be long before they (or a partner) released a similar card with more (often double) the amount of RAM for a reasonable cost. I generally tend to wait for those cards, with few exceptions. As you might imagine, I'm going to wait until a 3070(Ti?) or 3080 variant with 16GB+ VRAM is released and get it. 8GB or 10GB is just not enough for future gaming titles. It's barely enough for the latest new hotness. 2021 is going to see several titles released that will push the limits again.

With the Pascal and Turing generation cards there were no expectations of expanded-VRAM variants, because no games were getting close to the VRAM limit at that time. The Radeon VII with its 16GB of VRAM was viewed as superfluous, and maybe it was. But fast forward to today: with this latest gen of cards, games are getting very close to the VRAM limits, and AMD, correctly, has planned ahead with 16GB cards. Nvidia is going to join them shortly, as they can see games pushing up against that VRAM limit and want to offer solutions to keep gamers happy. This is not a new concept. This is history repeating itself yet again.


----------



## EarthDog (Dec 28, 2020)

lexluthermiester said:


> But experience has taught me time and again that having more RAM is ALWAYS better than not having enough. Fraking ALWAYS!


The absolute nature of your blanket statement has no legs. It does with how _you_ use your PC and spend your money (the irony isn't lost that you aren't an ultra guy but still bellowing about 10GB not being enough, lol), but there are other ways to do so as I just explained. 'Always' simply isn't true. 



lexluthermiester said:


> 2021 is going to see several titles release that will push the limits again.


I'm not holding my breath... consoles rock 13.5GB of a shared pool... RAM and vRAM... so it likely isn't those that are pushing more vRAM use. Sure it will go up with time, but as we've seen in a small cross section of game reviews here, there is literally one title that uses over 8GB of vRAM at 4K (IIRC, there may be two? - maybe that was 10GB, who knows, I'm not looking, lol). But I also addressed that point with allocation and use/what you will experience in the game as well.  



lexluthermiester said:


> But fast forward to today, games are getting very close to the VRAM limits


They aren't though. Please read some of TPU's game reviews and focus on the vRAM use at 4K/Ultra. The ONLY reason 10GB concerns me on this card is if you run 4K, plan on using mods in games, and keeping this thing for more than a few years. Otherwise, you'll be fine for the next couple of years running Ultra or the next few with tweaking a few titles down. Still, the vast majority of titles will fit within the onboard vRAM buffer for years.

EDIT: It's too bad that the Radeon 7 would be a potato trying to run some titles at 4K Ultra/60... it's not an overall 4K 60 card at 8GB, nor at 16GB. Are there any titles that trip over 8GB at 2560x1440? So the R7 is a great example of too much vRAM for the appropriate res and not enough horsepower to use it at 4K. Over half the titles in the TPU review are below 60 FPS at 4K, a few WELL below that value and what many would call 'unplayable'. 

EDIT2: For giggles, I looked through the last 10 of TPU's game performance reviews... here are the actual 4K values (max settings; some include RT):








(Source: TechPowerUp, www.techpowerup.com)
				




Cyberpunk 2077 - 7.1/*9.9* GB (w/o RT, w/ RT)
Godfall - 8.6GB
AC: Valhalla - 6.1GB
Watch Dogs - 7.4GB
Star Wars - 4.6GB
Horizon Zero Dawn - 8.6GB
Death Stranding - 4.9GB
Gears Tactics - 6.2GB
Resident Evil 3 - 7.2/7.7GB (DX11/DX12)
Doom Eternal - 8.3GB

I would say one title is very close (over 9GB) approaching the limit. The others, most, are not remotely close using 7.7GB or less. Maybe there are other titles that show more, but just giving 10 examples of titles that don't hit 10GB.... which happens to be the latest 10 titles he's tested. You can continue on down the list if you want.
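Those measurements can be restated as headroom against the 3080's 10GB buffer; a small sketch (titles and numbers copied from the list above, using each title's highest reading):

```python
# TPU 4K max-settings VRAM readings vs. a 10GB budget.
usage_gb = {
    "Cyberpunk 2077 (w/ RT)": 9.9,
    "Godfall": 8.6,
    "Horizon Zero Dawn": 8.6,
    "Doom Eternal": 8.3,
    "Resident Evil 3 (DX12)": 7.7,
    "Watch Dogs": 7.4,
    "Gears Tactics": 6.2,
    "AC: Valhalla": 6.1,
    "Death Stranding": 4.9,
    "Star Wars": 4.6,
}
budget = 10.0
for title, gb in sorted(usage_gb.items(), key=lambda kv: -kv[1]):
    print(f"{title:24s} {gb:4.1f} GB   headroom {budget - gb:4.1f} GB")
# Only one title lands within 1 GB of the budget; none exceed it.
```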


----------



## user112 (Dec 28, 2020)

10GB is likely enough for 99% of AAA titles out currently, but for emulation or games that offer enhanced texture packs it's a major limitation, since you often don't need a lot of GPU core power but you do need a ton of VRAM. Capacity aside, I'd like AMD and Nvidia to stop using 1GB memory chips so GPUs get smaller for once, but for that we need memory companies to stop just boosting clock speeds at lower voltages and cut latency to under 16 milliseconds.


----------



## nguyen (Dec 28, 2020)

Lol, don't be like HUB, who love to make future predictions when perf/dollar is not in AMD's court.
If RTX/DLSS is only available in 1% of games, there are far fewer games than that which require more than 10GB of VRAM, and therefore it's statistically irrelevant (as you can avoid those games altogether).
Remember Godfall's 12GB VRAM bullcrap? Yeah, AMD marketing was trying pretty hard there. HUB was shilling pretty hard, then tried to come out as victims when Nvidia's PR team wanted to cut ties with them.


----------



## Vayra86 (Dec 28, 2020)

EarthDog said:


> The absolute nature of your blanket statement has no legs. It does with how _you_ use your PC and spend your money (the irony isn't lost that you aren't an ultra guy but still bellowing about 10GB not being enough, lol), but there are other ways to do so as I just explained. 'Always' simply isn't true.
> 
> I'm not holding my breath... consoles rock 13.5GB of a shared pool... RAM and vRAM... so it likely isn't those that are pushing more vRAM use. Sure it will go up with time, but as we've seen in a small cross section of game reviews here, there is literally one title that uses over 8GB of vRAM at 4K (IIRC, there may be two? - maybe that was 10GB, who knows, I'm not looking, lol). But I also addressed that point with allocation and use/what you will experience in the game as well.
> 
> ...



Shared pool or not, what do you expect? Game logic on its own isn't that RAM intensive; if it's 2GB, you're being generous. A full-blown Windows 10 install is a mere 4 GB right now, and can be much lower too.

And, again... storage. Consoles will be using a special cache system.

And as for the examples being benched in TPU reviews right now - if you have a couple, that's already a sizeable _percentage_ of the games being benched, is it not? It's only fair to translate that to the entire catalogue of games coming out. I don't recall ever playing only what W1zzard wants to bench

Might be wrong, but I sense some shifting goalposts here... slowly but surely. And we're still only waiting for the cards to be readily available


----------



## Chrispy_ (Dec 28, 2020)

lexluthermiester said:


> AMD has the right idea with the 6000 series Radeons having 16GB from the get-go.


My feeling, knowing a few game devs in person, is that they work with the common hardware constraints. The tools to make ultra-high quality art assets are readily available - developers are basically making textures and meshes automatically from photography, 3D scans, and pointclouds now. If 16GB VRAM becomes common then they'll design levels that utilise 16GB of art. It's really not any extra work for them, they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to what fits into common VRAM sizes, 2GB, 4GB, or 8GB for example.

So some devs, at least, have been automatically ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards. It's what you see on that splash screen in many games with all the technology logos. Devs aren't manually putting in extra effort to hand-make their own optimisation and modelling tools, they're already out there in mainstream use. Heck, our in-house AEC visualisation studio uses them on a daily basis, I spend several hours a month troubleshooting asset conversion, interoperability and other niggles for those applications.

We've had 8GB cards for over 6 years at the high end (290X) and almost 5 years at the midrange (RX470). That's right, 8GB has been at the mainstream, mass-market price point for HALF A DECADE.
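The "move the slider" point maps onto simple texture-budget arithmetic; a sketch using typical block-compression figures (BC7 at ~1 byte/texel and the ~4/3 mip-chain factor are standard ballparks, but the texture sizes here are illustrative, not from any particular game):

```python
def texture_mib(width, height, bytes_per_texel, mips=True):
    """VRAM cost of one texture; a full mip chain adds ~1/3 (geometric series)."""
    base = width * height * bytes_per_texel
    if mips:
        base *= 4 / 3
    return base / 2**20

# Doubling source-art resolution quadruples the texture pool, which is why
# "moving the slider" from 2K to 4K art roughly 4x-es VRAM spent on textures:
print(round(texture_mib(2048, 2048, 1), 1))  # 5.3 MiB per BC7 texture
print(round(texture_mib(4096, 4096, 1), 1))  # 21.3 MiB per BC7 texture
```

That quadratic growth per doubling is exactly why a 16GB common baseline would let devs ship one notch higher on the compression slider with no extra authoring work.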


----------



## Vayra86 (Dec 28, 2020)

Chrispy_ said:


> My feeling, knowing a few game devs in person, is that they work with the common hardware constraints. The tools to make ultra-high quality art assets are readily available - developers are basically making textures and meshes from photography now. If 16GB VRAM becomes common then they'll design levels that utilise 16GB of art. It's really not any extra work for them, they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to what fits into common VRAM sizes, 2GB, 4GB, or 8GB for example.
> 
> So some devs, at least, have been automatically ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards. It's what you see on that splash screen in many games with all the technology logos. Devs aren't manually putting in extra effort to hand-make their own optimisation and modelling tools, they're already out there in mainstream use. Heck, our in-house AEC visualisation studio uses them on a daily basis, I spend several hours a month troubleshooting asset conversion, interoperability and other niggles for those applications.
> 
> We've had 8GB cards for over 6 years at the high end (290X) and almost 5 years at the midrange (RX470). That's right, 8GB has been at the mainstream, mass-market price point for HALF A DECADE.



The tools are even so commonly available, that most modders are keen to use them as well and present us with better quality packs for games


----------



## EarthDog (Dec 28, 2020)

Vayra86 said:


> Shared pool or not, what do you expect, game logic on its own isn't that RAM intensive. If its 2GB, you're being generous. A full blown Windows 10 install is a mere 4 GB right now, and can be much lower too.
> 
> And, again... storage. Consoles will be using a special cache system.
> 
> ...


Slowly but surely is right. I'm not saying it won't increase; I'm saying there is plenty of room (for most titles) for an increase while still slotting in under 10GB at 4K/Ultra (and again, you can lower some settings to high and not use copious amounts of AA). Some of these could grow by over 100% before hitting 10GB.


Vayra86 said:


> TPU reviews right now - if you have a couple,


I looked, there are NONE that go over 10GB... one is close at 9.9 with RT enabled, however. Please see the edit where I listed each of the last 10.


----------



## phanbuey (Dec 28, 2020)

At stock game settings you should be fine; if you go nuts with texture packs and mods then you could run out. In the future you will run out, but by then you'll have a new card anyway.


----------



## Chrispy_ (Dec 28, 2020)

EarthDog said:


> I'm saying there is plenty of room (for most titles) to have an increase while still slotting in under 10GB at 4K/Ultra (and again, you can lower some to high and not use copious amount of AA).


My take on this is that game devs won't target 10GB the same way they didn't target the oddball 11GB in a 2080Ti.

They'll target 8GB, and 16GB, perhaps 12GB depending on sales of 12GB cards, but I'm even dubious of that. Every extra VRAM size they target is extra effort in packaging, testing and configuration. Sure, it's effortless to make content for a specific VRAM size, but there's more work to be done making sure that a scene (from any point in the game map) doesn't overstretch its budget - be that through manual level segmentation or dynamic geometry/view culling.

With 10GB being so close to the mainstream 8GB config, I would bet they don't bother even if by some miracle the 3080 sells to 100x more people than expected. 25% more VRAM isn't enough to make any significant quality difference and that's why even if 12GB cards become popular, I'm not sure it's enough to make a visual quality jump that is worth devs' extra effort.


----------



## EarthDog (Dec 28, 2020)

Chrispy_ said:


> My take on this is that game devs won't target 10GB the same way they didn't target the oddball 11GB in a 2080Ti.
> They'll target 8GB, and 16GB, perhaps 12GB depending on sales of 12GB cards, but I'm even dubious of that.


They won't target 16GB until both companies have cards out that support it. I can see AMD pushing this, but not Nvidia/TWIMTBP titles or the rest of gaming. Worth noting, these are flagships (3090 be damned, the crossover card)... where most people buy cards and play, they have less VRAM available (as they use less at a lower res).

Just like the core wars, we're seeing companies add more of something that isn't terribly useful today or in the next couple of years in most cases (of course there are some exceptions). I totally get it... there are uneducated users constantly looking to buy... and if I didn't know any better, seeing more VRAM versus less on the box and not knowing what little difference it makes overall, I'd go with more too. This is marketing and, IMO, a bit of milking the consumer... just like dropping 12c/24t+ into the mainstream platform is a joke (for most people).


----------



## Vya Domus (Dec 28, 2020)

nguyen said:


> yeah AMD marketing was trying pretty hard there.



Remind me again, what exactly were they trying?


----------



## Vayra86 (Dec 28, 2020)

EarthDog said:


> They won't target 16GB until both companies have cards out that support it.



To this one I can only agree... but I think it's no longer safe to assume Nvidia gets to determine what the limit will be. There is a lot of chatter about higher-VRAM cards for the whole stack.

And let's face it, it's not like the market is flooded with 10GB cards right now.


----------



## EarthDog (Dec 28, 2020)

Vayra86 said:


> To this one I can only agree... but I think it's no longer safe to assume Nvidia gets to determine what the limit will be. There is a lot of chatter about higher-VRAM cards for the whole stack.
> 
> And let's face it, it's not like the market is flooded with 10GB cards right now.


Right... 10GB+ won't saturate the market for years. If we average those 10 titles (using the highest values I listed, note), vRAM use is just over 7GB at 4K/Ultra. I think we've got some time and headroom. 

Worth noting, I find the 8GB 3070 more troubling than the 10GB 3080. The 3070 is a plenty playable 4K/60 Ultra card, and there are already 3 titles that can eclipse that mark. Most don't/won't... but we're seeing that as a tipping point. The 3070 is better suited to 1440p/144Hz+, IMO. Meanwhile, the 10GB on the 3080 is plenty for all titles at 4K/Ultra/60 today and for an overwhelming majority of titles for years to come... and you can always turn settings down a smidge on the few titles that go nuts. Or get AMD and never worry (except for potential driver issues... or never using 25% of the RAM on it, lol).


----------



## evernessince (Dec 28, 2020)

EarthDog said:


> Look at any of the games reviewed at TPU. At 4K using ultra settings there is ONE title that uses more than 10GB. One. The rest are well under that, typically by a few GB. I'm not worried about that snowball getting much bigger (in this case, that title doesn't have any hitching issues; it's an allocation vs. actual use thing).



From what I see, a majority are around 9.3GB. Like I said in my other comment, that's OK for a year.

You also avoided my other point: games cannot utilize VRAM they don't have. The failure of Nvidia to increase VRAM on its cards is bound to lead to compromises in games. Hardware needs to have more VRAM first, not the other way around.

Saying "hey look guys, devs aren't making games that brand new products can't handle" is a bad position to take. It's the same BS people said while Intel was still dominant in the CPU market. No need for more than 4 cores since that's all Intel sells to gamers, right? Again, people defending practices that hold the industry back. Look where that landed Intel: behind AMD and losing server market share to ARM. If it hadn't been for AMD, people would still be rocking 4 cores and paying a premium to do so.


----------



## EarthDog (Dec 28, 2020)

evernessince said:


> From what I see, a majority are around 9.3GB.


You're not seeing that from my link (and the list I made from all of the games I linked). Where are you seeing 9.3 GB?

Compromises? Naa... sorry. I doubt that. There isn't a need, they have headroom.


----------



## Mussels (Dec 29, 2020)

lexluthermiester said:


> Can't agree with this. It's always better to have more RAM. Always.



Unless you have a budget. Or it's slower and hurts FPS. Or the product doesn't exist yet.

I always upgrade VRAM with every GPU upgrade, which is why I went from 8GB to 10GB - I was never VRAM limited, only GPU limited.

4K gamers are the same: yeah, sure, you might get close to 8GB now... but you're also barely managing 60FPS with DLSS or settings turned right down.


----------



## lexluthermiester (Dec 29, 2020)

Chrispy_ said:


> My feeling, knowing a few game devs in person, is that they work with the common hardware constraints.


True, but they allow for the extreme users and future expansion. I know a couple devs too, and they are always looking forward.



Chrispy_ said:


> That's right, 8GB has been at the mainstream, mass-market price point for HALF A DECADE.


Right? Kinda pathetic really...



Mussels said:


> Unless you have a budget.


If money was tight (and it once was), I'd rather save up for an extra month or two and spring for a better card. But I do see your point.


Mussels said:


> Or its slower and hurts FPS.


This hasn't bothered me in the past and still wouldn't now. I would be delighted with a 3080 with the same cores and speed but a 256-bit memory bus and 16GB of VRAM. Perfectly acceptable scenario, even if they called it a 3070 Ti (they have done similar things in the past).


----------



## evernessince (Dec 29, 2020)

EarthDog said:


> You're not seeing that from my link (and the list I made from all of the games I linked). Where are you seeing 9.3 GB?
> 
> Compromises? Naa... sorry. I doubt that. There isn't a need, they have headroom.



Hardware Unboxed talks about it in this video: 








The 8GB 3070 is already over its VRAM amount at 1440p, even with lowered texture settings, as Steve and Tim point out.

You honestly think the 3080, with a mere 2GB more, has VRAM headroom at higher settings and resolutions? Very doubtful.

It's not even just about the current games on the market, either. As I pointed out earlier, the lack of headroom leaves no room for devs to push their games to be bigger, better, and more immersive. Instead of spending resources on making content and improving the game, they have to worry about the crappy VRAM amount on Nvidia cards. Like I pointed out earlier, software follows hardware. VRAM amounts have to increase in order for games to utilize more.

Heck my modded skyrim has been using around 10GB for years.

Brilliant move by Nvidia, though: they managed to reduce the cost of their cards by reducing VRAM. Heck, they even have people defending the reduction in VRAM. Now they'll sell you what you should have gotten at launch for $100 more (on top of the $60 or so AIB tax they charge over MSRP).


----------



## Mussels (Dec 29, 2020)

At ultra settings.

You act like you can't turn one or two settings down and tank that VRAM usage.


----------



## EarthDog (Dec 29, 2020)

evernessince said:


> The 8GB 3070 is already over its VRAM amount at 1440p, even with lowered texture settings, as Steve and Tim point out.


Help me out... I don't have 41 mins to sit through that video. What games and settings are they 'averaging' 9.3GB at? This thread is about the 10GB 3080... I made a passing mention of the 3070. What titles show over 8GB of use at 1440p (nothing at TPU, note)? Come on now... the info was laid out for you; please don't make us work for it.



evernessince said:


> Heck my modded skyrim has been using around 10GB for years.


Right... I mentioned modding could be an issue...



evernessince said:


> the lack of headroom


there is headroom... in 9 of 10 titles at 4K UHD in TPU testing... I can keep going.

Got any website reviews? Really, I'm not sitting through that long of a video...

EDIT: So I took the time to skip around... he mentioned ONE title (Watch Dogs: Legion), and that was with RT and HD texture packs at 1440p. Where again did you get an 'average' of 9.3GB? Are we going to base our perspective on outliers? I don't get that line of thinking.


----------



## nguyen (Dec 29, 2020)

HUB or Techspot would spin the data to make it look like the 8GB of VRAM on the 3070 is a weak point. Here is their Cyberpunk 2077 benchmark without RT or DLSS at 1440p Ultra:






Now here is 1440p Ultra with DLSS:





Enabling DLSS in Cyberpunk 2077 will instantly lower VRAM allocation by 1GB, yet they claim the 3070 being slower than the 2080 Ti with RTX/DLSS on is because of a VRAM limitation? What kind of editorial logic is that?
HUB really has an agenda of their own; they just played the victim when confronted by an Nvidia PR rep.

All this discussion about VRAM size is like comparing megapixels on cameras, where uninformed buyers get duped into thinking more megapixels means a better camera.
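The DLSS observation above has a simple mechanical explanation: the game renders internally at a reduced resolution, so resolution-dependent buffers (G-buffer, depth, intermediate render targets) shrink, while texture memory stays the same. A rough sketch, assuming the commonly cited ~0.667 per-axis scale for DLSS Quality mode (an approximation, not an official figure):

```python
# Approximate the internal render resolution DLSS Quality mode uses, and how
# much of the native pixel count actually gets shaded. The 0.667 per-axis
# scale is the commonly cited figure for Quality mode; treat it as an estimate.

def internal_resolution(width, height, scale=0.667):
    return round(width * scale), round(height * scale)

out_w, out_h = 2560, 1440                       # native output (1440p)
in_w, in_h = internal_resolution(out_w, out_h)  # roughly 1708 x 960
pixel_ratio = (in_w * in_h) / (out_w * out_h)

print(f"internal: {in_w}x{in_h}, shading {pixel_ratio:.0%} of native pixels")
```

With render targets dropping to under half their native pixel count, a ~1GB lower allocation is plausible, since only part of a game's VRAM footprint scales with resolution.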


----------



## lexluthermiester (Dec 29, 2020)

nguyen said:


> All this discussion about VRAM size is like comparing megapixels on cameras, where uninformed buyers get duped into thinking more megapixels means a better camera.


That was as adorable as it was disturbed and flawed. I'm not a particular fan of TechSpot, but it's not at all OK to put a false spin on TechSpot's review like that. Tim is no one's fool and does his due diligence. Those conclusions are more or less spot on and very much in line with what every other OBJECTIVE reviewer has found. Your comment above is little more than the same spin-doctoring you claim they're doing. Kind of a d-bag move there, and for what, so you can earn a few brownie points in a pointless debate? Classy, real classy.

Here's the deal, folks: you want to live with less VRAM and thus potentially limit yourself in the near future? Hey, live in the past, it's all good. Those of us who want the most out of our tech will buy the cards that will go the distance. The rest of you can limit yourselves and/or be forced to upgrade sooner rather than later. Yes, yes.


----------



## nguyen (Dec 29, 2020)

lexluthermiester said:


> That was as adorable as it was disturbed and flawed. I'm not a particular fan of TechSpot, but it's not at all OK to put a false spin on TechSpot's review like that. Tim is no one's fool and does his due diligence. Those conclusions are more or less spot on and very much in line with what every other OBJECTIVE reviewer has found. Your comment above is little more than the same spin-doctoring you claim they're doing. Kind of a d-bag move there, and for what, so you can earn a few brownie points in a pointless debate? Classy, real classy.
> 
> Here's the deal, folks: you want to live with less VRAM and thus potentially limit yourself in the near future? Hey, live in the past, it's all good. Those of us who want the most out of our tech will buy the cards that will go the distance. The rest of you can limit yourselves and/or be forced to upgrade sooner rather than later. Yes, yes.






Funny that their own testing at 4K Ultra shows that 3070 is only 3% behind 2080 Ti.
So their conclusion about 3070 lacking VRAM in CP2077 doesn't add up.

TPU and Guru3d testing do not reflect that 3070 is lacking at 4K ultra in CP2077 either.

Here on TPU, W1zzard would surely look at these irregularities before coming up with some nonsense conclusion. These kinds of queries would surely get buried before long in the Techspot forum.

Oh, so VRAM is the future and RTX/DLSS is the past? Thumbs up to you, sir. Since I'm pretty sure that FidelityFX Super Resolution will reduce VRAM requirements, either the 16GB of VRAM or FSR will become pointless, as they cancel out each other's benefits.


----------



## lexluthermiester (Dec 29, 2020)

nguyen said:


> Funny that their own testing at 4K Ultra shows that 3070 is only 3% behind 2080 Ti.
> So their conclusion about 3070 lacking VRAM in CP2077 doesn't add up.
> 
> TPU and Guru3d testing do not reflect that 3070 is lacking at 4K ultra in CP2077 either.


Are you on drugs? I ask because TechSpot's testing shows a 1 frame per second difference between the 3070 and 2080 Ti at 4K. TPU's testing, that *you* just referenced, shows what? Oh, right, a 1 frame per second difference. And Guru3D (BTW, you referenced the wrong page)?








Cyberpunk 2077: PC graphics perf benchmark review (Game performance, 3840x2160 Ultra HD) - www.guru3d.com



Oh, their tests showed the same frame rate.

All of them are within 2 (TWO) frames per second of each other. Margin-of-error kind of thing. Hmmm...


----------



## nguyen (Dec 29, 2020)

lexluthermiester said:


> Are you on drugs? I ask because TechSpot's testing shows a 1 frame per second difference between the 3070 and 2080 Ti at 4K. TPU's testing, that *you* just referenced, shows what? Oh, right, a 1 frame per second difference. And Guru3D (BTW, you referenced the wrong page)?
> 
> 
> 
> ...



Oh boy, you don't understand what I'm pointing at, do you? Techspot's testing without RT/DLSS shows the 3070 on par with the 2080 Ti, same as TPU and Guru3D.

But in their RTX/DLSS article, the 3070 is 9% slower than the 2080 Ti with RTX OFF/DLSS ON and 2% slower with RT Ultra/DLSS ON.










And this is their take on the 3070's 8GB of VRAM:


> If you want a 60 FPS experience using ray tracing at this resolution, you’ll need to have at least an RTX 2080 Ti in your setup to achieve that level of performance. The RTX 3080 and 3090 are strong performers, but the RTX 3070 isn’t as much. That’s because if you are using the Ultra preset, with ray tracing, 8GB of VRAM is right on the edge in this title and actually does limit performance to some extent.



Which makes no sense whatsoever; their conclusion about the 3070 lacking VRAM is totally baseless, as their own testing contradicts it.


----------



## lexluthermiester (Dec 29, 2020)

nguyen said:


> Which makes no sense whatsoever; their conclusion about the 3070 lacking VRAM is totally baseless, as their own testing contradicts it.


Ok. Whatever. You need to take a closer look at those reviews you quoted. You're missing a few things.


----------



## TumbleGeorge (Dec 29, 2020)

EarthDog said:


> there is headroom... in 9 of 10 titles at 4K UHD in TPU testing... I can keep going


It would be good to retest in May 2021 with a few new titles and modded patches added. Many players, once they finish the original game, then play mods. Games must be retested with mods!


----------



## EarthDog (Dec 29, 2020)

TumbleGeorge said:


> It would be good to retest in May 2021 with a few new titles and modded patches added. Many players, once they finish the original game, then play mods. Games must be retested with mods!


lol... whenever. I already have a few posts bookmarked so we can look at this again in a 'year'...

I'm surprised that people can make statements like this (so absolute) when there is little evidence pointing in that direction.



lexluthermiester said:


> Ok. Whatever. You need to take a closer look at those reviews you quoted. You're missing a few things.


Such as?

Have the conversation instead of leaving a vague dismissal for him. I have to be honest, I see his point about the conclusion. It's possible I missed something too ... can you clarify what you're so passionately dismissive about? I'm not speaking for nguyen, but I've got an open mind...


----------



## dirtyferret (Dec 29, 2020)

TumbleGeorge said:


> It would be good to retest in May 2021 with a few new titles and modded patches added. Many players, once they finish the original game, then play mods. Games must be retested with mods!


Is your last name trump?

I can't believe that this thread is A) 13 pages long and B) people are discussing GPU RAM like it's modular system RAM in your PC. RAM is only part of the video card, and I'll take a more powerful video card over a weaker one with more RAM (think RX 580 4GB over RX 570 8GB, or RTX 2060 6GB over GTX 1070 8GB). If the 10GB holds back the RTX 3080 compared to the competition, then customers will let Nvidia know with their wallets.


----------



## TumbleGeorge (Dec 29, 2020)

dirtyferret said:


> If the 10GB holds back the RTX 3080 compared to the competition, then customers will let Nvidia know with their wallets.


Customers are hungry for hardware, bored from isolation because of COVID-19; scalpers exist, coin miners exist. What do you think Nvidia will learn from wallets? The market is not normal!


----------



## thesmokingman (Dec 29, 2020)

Mussels said:


> At ultra settings.
> 
> You act like you can't turn one or two settings down and tank that VRAM usage.



Act like...? Anyone can do that. That's not the point, is it? Limiting the VRAM on high-end cards is a bit of an insult given how much they've monetized VRAM.


----------



## dirtyferret (Dec 29, 2020)

TumbleGeorge said:


> Customers are hungry for hardware, bored from isolation because of COVID-19; scalpers exist, coin miners exist. What do you think Nvidia will learn from wallets? The market is not normal!


Goblin Sharks exist!
Croc shoes exist!
Disco existed, went away, and now exists again!

All have just as much impact on the RTX 3080's performance.


----------



## mouacyk (Dec 29, 2020)

No matter how much I despise the 10GB on the 3080, I can't find one to buy.


----------



## lexluthermiester (Dec 30, 2020)

EarthDog said:


> It's possible I missed something too ... can you clarify what you're so passionately dismissive about? I'm not speaking for nguyen, but I've got an open mind...


How about the fact that Nguyen is blatantly twisting facts to suit the very flawed narrative they are attempting (and failing) to express.


----------



## 95Viper (Dec 30, 2020)

Stay on topic.
Stop any insults and personal attacks.
Keep it technical.

Read the guidelines if you need a refresher on how/what to post.



> *Be polite and Constructive*, if you have nothing nice to say then don't say anything at all.
> This includes trolling, continuous use of bad language (ie. cussing), flaming, baiting, retaliatory comments, system feature abuse, and insulting others.
> Do not get involved in any off-topic banter or arguments. Please report them and avoid instead.


----------



## EarthDog (Dec 30, 2020)

lexluthermiester said:


> How about the fact that Nguyen is blatantly twisting facts to suit the very flawed narrative they are attempting (and failing) to express.


Can you be more specific? What facts is he twisting? He posted charts that support what he is saying, no? What are you contesting, exactly? Be clear, and please support YOUR assertion as well so we're able to figure this out for ourselves.

Apologies if I'm being particularly dense here.


----------



## Metroid (Dec 30, 2020)

Yeah, 10GB is enough for gaming today and, I'd guess, for at least 2 more years. If you want deep learning and other things, go for the 20GB Ti version or the 3090.

There is a new article about it.









Are 10GB of GDDR6X VRAM enough for 4K/Ultra gaming? - www.dsogaming.com
A lot of gamers have been wondering whether its 10GB of GDDR6X VRAM is enough for 4K/Ultra gaming. So, time to find out.


----------



## lexluthermiester (Dec 31, 2020)

EarthDog said:


> Apologies if I'm being particularly dense here.


If you're serious, then I'll explain a bit. For a moment it seemed like you were being antagonistic.

Ok, if you look at post #314 (one page back), you'll see a comment made about how HUB and TS, in Nguyen's view, were putting "spin" in their reviews.

Naturally I looked through the review cited, found no such problem, and responded.

In post #316, he made comparisons to TPU and Guru3D reviews and stated a few things that were simply inaccurate, if not willfully deceptive.

In the next post I pointed out the flaws in his statements and pointed to the correct page of the review in question for the context of the discussion at hand.

In Nguyen's next statement, post #318, they made even more inaccurate statements and not only misquoted the cited data but seemed to be deliberately twisting facts out of context to fit their flawed, factless narrative.

As a general rule, if I feel like people are discussing a subject in earnest, I'll go out of my way to help them see the real deal or get more facts that can help them understand more about the subject being discussed. But when it seems like  people are just being deceptive or worse, willfully ignorant, I lose the will to be helpful or continue the discussion. That's when you see me say, "Google it yourself" or "whatever". I've got no time for people who are going to waste it.


----------



## Mussels (Dec 31, 2020)

Well, my warranty claim came back with a 3090 instead of a 3080, so I'll let you guys know if I ever run out of VRAM.


----------



## nguyen (Dec 31, 2020)

Mussels said:


> Well, my warranty claim came back with a 3090 instead of a 3080, so I'll let you guys know if I ever run out of VRAM.



Your 3080 was defective and they sent you a 3090? Is this a blessing in disguise?


----------



## Mussels (Dec 31, 2020)

nguyen said:


> Your 3080 was defective and they sent you a 3090? Is this a blessing in disguise?



Discounted upgrade due to no stock. It's a 'mere' Galax model, but it's still quiet, fast, and RGB blingy.


----------



## EarthDog (Dec 31, 2020)

lexluthermiester said:


> If you're serious, then I'll explain a bit. For a moment it seemed like you were being antagonistic.


Apologies that you feel this way. I do get frustrated at (what feels like) your consistent lack of supporting information without being prodded. Maybe it shows even when I ask nicely (not antagonizing, just being honest with you).

So... the point of #314 is this...


nguyen said:


> Enabling DLSS in Cyberpunk 2077 will instantly lower VRAM allocation by 1GB, yet they claim the 3070 being slower than the 2080 Ti with RTX/DLSS on is because of a VRAM limitation? What kind of editorial logic is that?


... to which the charts he posted support the assertion, correct?

The first chart shows 1440p Ultra with the 3070 and 2080 Ti both hitting 56 FPS. The second chart shows 1440p Ultra + DLSS, and the 2080 Ti is ~9% (~8 FPS) faster at the same res.

Now look at the chart in #316... when you go up to 4K Ultra (which uses more VRAM than 1440p, right?), the difference between the 2080 Ti and 3070 is 3% (~1 FPS), as it should be. TPU and Gulu3D (if anyone gets that joke, LMK... ) support this claim that they are the same speed at 4K... while using MORE VRAM than 1440p+DLSS. *How is that possible?* *What are we BOTH missing here?*





lexluthermiester said:


> In the next post I pointed out the flaws in his statements and posted the correct page in the subject review to be referring to in the context of the discussion at hand.


Did you, though? In #317 it feels like you missed his point, which #318 attempts to bring back on track... to which you promptly blew him off in #319. The fact that they are the same speed at 4K at 3 sites isn't the point. The point is that at 4K, which uses MORE VRAM than 1440p + DLSS, they are still tied, yet when using DLSS alone and LESS VRAM, there is a significant gap.

The charts in #318 show that when RT _and_ DLSS are used (where _more_ VRAM is allocated vs. not using RT and DLSS), the 3070 _closes_ that gap to 1%. *If VRAM was a problem, how does the gap shrink when it's using more at the higher res?*



lexluthermiester said:


> In Nguyen's next statement, post318, they made even more inaccurate statements and not only misquoted the cited data but seemed to be deliberately twisting facts out of context to fit their flawed, factless narrative.


*What was misquoted? What are those twisted facts, exactly?* This is where the details matter, Lex.

I guess the question, at least to me, is: *how is it possible that 4K UHD shows the two cards neck and neck, but at a lower resolution with DLSS the gap is nearly 10% (even with more tensor cores)?* After cutting through all this (thanks, random insomnia, lol), I think I know the answer... but it isn't anyone twisting a narrative intentionally (that I can see). I don't think HUB/Techspot have an agenda, but I can see why he feels this way given the conclusion he quoted. That said... I think I found a curiosity/hole in Nguyen's point... but it isn't what I think you are trying to describe, Lex.



lexluthermiester said:


> As a general rule, if I feel like people are discussing a subject in earnest, I'll go out of my way to help them see the real deal or get more facts that can help them understand more about the subject being discussed. But when it seems like people are just being deceptive or worse, willfully ignorant, I lose the will to be helpful or continue the discussion. That's when you see me say, "Google it yourself" or "whatever". I've got no time for people who are going to waste it.


I felt he was discussing this in earnest. He posted his opinion and supported it with charts and multiple references. You came in shooting him down and seemingly missed the point. He clarified the point with more charts and you blew him off thinking the above. I don't believe he was being deceptive, nor willfully ignorant. That said, it doesn't mean he (we both) didn't miss something. Now, I think I see what the issue is...

What I think he missed that may shed some light on things... is the fact that RT on Ampere is a lot faster than Turing b/c more RT bits. So the faster RT overcomes the 8GB of vRAM 'issue' and closes the gap regardless(?). Now, that seems a bit counterintuitive since with RT enabled, you use _more_ vRAM... but it's the only thing I can think of. In the end, it doesn't seem like when using RT/DLSS that there is an issue with RAM. *If so, it should have manifested itself in their results, correct?* We don't see that, yet we see what the conclusion says. It doesn't seem to match.

Let's be clear, all of his charts support what he is saying.... but the reasoning behind his conclusion may be flawed is all. So, you may be right in some respect, Lex, but more so by accident/different reasons than actually/accurately pointing out what he was missing. 

*So in the end, about the vRAM situation (3070/3080... it doesn't matter)...... is it the RT that is overcoming the so-called problem? *


----------



## nguyen (Dec 31, 2020)

EarthDog said:


> What I think he missed that may shed some light on things... *is the fact that RT on Ampere is a lot faster than Turing b/c more RT bits*. So the faster RT overcomes the 8GB of vRAM 'issue' and closes the gap regardless(?). Now, that seems a bit counterintuitive since with RT enabled, you use _more_ vRAM... but it's the only thing I can think of. In the end, it doesn't seem like when using RT/DLSS that there is an issue with RAM. If so, it should have manifested itself in their results, correct? We don't see that, yet we see what the conclusion says. It doesn't seem to match.
> 
> Let's be clear, all of his charts support what he is saying.... but the reasoning behind his conclusion may be flawed is all. So, you may be right, Lex, but more so by accident/different reasons than accurately pointing out what he was missing.
> 
> So in the end, about the vRAM situation (3070/3080... it doesn't matter)...... is it the RT that is overcoming the so-called problem?



Very well-thought-out discussion, sir, but I have to add something: the results for the 3070 at 1440p + RT Ultra + DLSS do not indicate any performance degradation caused by lack of VRAM when you compare the 3070 to both the 2080 Ti and the 3080.

1440p Ultra + DLSS Quality:
3080: 107.3 fps
2080 Ti: 90.1 fps
3070: 82.6 fps
These are the numbers HUB used as baseline performance; they said so themselves.

1440p + RT reflections + DLSS Quality:
3080: 76.3 fps = 29% drop
2080 Ti: 62.9 fps = 30% drop
3070: 57.8 fps = 30% drop

1440p + RT Ultra + DLSS Quality:
3080: 60.6 fps = 44% drop from baseline
2080 Ti: 47.8 fps = 47% drop from baseline
3070: 46.7 fps = 44% drop from baseline
Note that HUB also uses these calculations in their article.

The 3070's performance is consistent with the 3080's; what I found inconsistent is the baseline performance of the 2080 Ti. Perhaps the particular test scene HUB used favors Turing in terms of rasterization performance; however, HUB should not draw conclusions about the 3070 lacking VRAM when their own data does not support that idea.
If the 3070 were indeed VRAM limited at 1440p + RT Ultra + DLSS Quality, so would the 3080 be.
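For anyone who wants to sanity-check the drop-from-baseline figures above, a quick sketch reproduces the arithmetic (the quoted 44%/47%/44% come out to roughly 43.5/46.9/43.5 before rounding):

```python
# Recompute the "% drop from baseline" figures for the RT Ultra numbers above.
# All fps values come from the HUB charts as quoted in this thread.

def pct_drop(baseline_fps, fps):
    """Percent of frame rate lost relative to the baseline run."""
    return (1 - fps / baseline_fps) * 100

baseline = {"3080": 107.3, "2080 Ti": 90.1, "3070": 82.6}  # 1440p Ultra + DLSS Q
rt_ultra = {"3080": 60.6, "2080 Ti": 47.8, "3070": 46.7}   # + RT Ultra

for card in baseline:
    print(f"{card}: {pct_drop(baseline[card], rt_ultra[card]):.1f}% drop")
# 3080: 43.5% drop, 2080 Ti: 46.9% drop, 3070: 43.5% drop
```

The 3070's drop matches the 3080's almost exactly, which is the consistency argument being made: a card actually running out of VRAM would fall off disproportionately.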


----------



## lexluthermiester (Dec 31, 2020)

EarthDog said:


> *What was misquoted? What are those twisted facts, exactly?* This is where the details matter, lex.





EarthDog said:


> That said........ I think I found a curiosity/hole in Ngyen's point... but it isn't what I think you are trying to describe, Lex.


Yup, whatever, earthdog.


----------



## Caring1 (Dec 31, 2020)

Mussels said:


> Discounted upgrade due to no stock. It's a 'mere' Galax model, but it's still quiet, fast, and RGB blingy.


Selling for $2,500 minimum here.
Just a slight price bump from the 3080.


----------



## Sandbo (Dec 31, 2020)

Given the current numbers, 10GB of VRAM isn't exactly a limit for most games.
Plus, with AMD actually able to catch up via Big Navi (ray tracing aside), more heated competition should be ahead;
I wouldn't expect the 3080 to age as well as the 1080 Ti even if it had more RAM.

I would have gotten one if it was anywhere close to MSRP.


----------



## Antonis_35 (Dec 31, 2020)

My 5 cents, and forgive me if I am stating the obvious:
If you are the type of person who upgrades with every new GPU generation, then 10GB of VRAM is enough.
If you are the type of person who keeps their card for 3 years or longer, then:
1) If you game at 1440p and plan to stay there, 10GB should still be enough.
2) If you game at 4K, or plan to, then 10GB of VRAM might not be enough in the long term.


----------



## Sandbo (Jan 11, 2021)

Just to give my experience after playing Minecraft RTX on my new 3070 today:
in just one demo (the Neon something), I could see GPU RAM use was > 8GB (so capped there), and in the game some textures aren't fully rendered (some transparent blocks).
This is an edge case, but obviously in some games RAM use could go beyond 8GB just by having more textures, so I would say it depends; if you are concerned, you might want something with more RAM.

In general, I believe games releasing in the coming 2 years should take into account that 8-10GB of RAM is still the mainstream.


----------



## ZoneDymo (Jan 11, 2021)

If it's enough, it's only because game devs adjust for the "lack of" VRAM.

If all video cards had 20GB of VRAM, I'm sure we would see some amazing high-resolution texture packs.
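Some back-of-the-envelope numbers show why texture packs are the fastest way to eat VRAM. The sketch below assumes uncompressed RGBA8 (4 bytes/pixel), block-compressed BC7 (1 byte/pixel), and the usual ~4/3 overhead for a full mip chain; real engines stream and compress more aggressively, so treat these as upper-bound estimates:

```python
# Estimate the VRAM cost of a single texture. Each halving-resolution mip
# level adds memory; a full mip chain costs about 4/3 of the base level.

def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    base = width * height * bytes_per_pixel
    return (base * 4 / 3 if mipmaps else base) / 2**20

print(f"{texture_mib(2048, 2048, 4):.1f} MiB")  # 2K RGBA8, uncompressed
print(f"{texture_mib(4096, 4096, 4):.1f} MiB")  # 4K RGBA8: 4x per resolution step
print(f"{texture_mib(4096, 4096, 1):.1f} MiB")  # 4K with BC7 compression
```

At ~85 MiB per uncompressed 4K texture, a few hundred high-resolution textures resident at once would fill even a 20GB card, which is why a bigger VRAM pool really would invite bigger texture packs.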


----------



## Sunny and 75 (Jan 11, 2021)

Recent leaks suggest they're gonna offer a 12GB 3060 to compete with the 6700 XT from AMD. So if 10GB was really enough, they wouldn't have done so; they would have released the card with 6GB instead of 12GB, saying that 6GB of VRAM is enough for the 3060.

The 2070 Super exists because of the 5700 XT. The same story happened with Polaris and Pascal (though the R9 390 series offered 8GB before Polaris). It's like AMD keeps making them up their game; otherwise we would still be getting 4C/8T i7s, and that 3080 would have been like 6-8GB (more than likely 6GB, God!!!).

When I think of it, we really owe it all to them. So a huge thank you to AMD for pushing the industry further ahead into the right (sane) direction. Feeling so blessed!


----------



## mouacyk (Jan 11, 2021)

Horses for courses... more and more gamers are adopting 144Hz+ displays, which are better suited to resolutions below 4K.  With the limited supply and high cost of GDDR6/X, NVidia perfected performance cards for 1080p and 1440p.  If you want to performance-proof 4K in *all *games, you need to step up to the 3090 with its 24GB of VRAM.  

Is anyone really contending that the 16GB in AMD's Radeons is the sole determinant of 4K performance?  Given the limited memory configurations of a 256-bit die, AMD's only other option was 8GB, which they would have had to sell cheaper, and it would have looked way worse than the RTX 3080.  People, this was a marketing decision, not one that delivers functional gains.
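The capacity options here fall straight out of the bus width. A quick sketch (assuming 32-bit GDDR6/GDDR6X chips in 1GB or 2GB densities, which were the shipping options at the time):

```python
# Sketch of how memory bus width constrains total VRAM capacity.
# Assumes 32-bit GDDR6/GDDR6X chips in 1GB or 2GB densities,
# the shipping options when these cards launched.

CHIP_BUS_BITS = 32
CHIP_DENSITIES_GB = (1, 2)

def vram_options(bus_width_bits):
    """Possible total VRAM sizes (GB) for a given bus width."""
    chips = bus_width_bits // CHIP_BUS_BITS
    return [chips * density for density in CHIP_DENSITIES_GB]

print(vram_options(256))  # 256-bit (6800 XT class): [8, 16]
print(vram_options(320))  # 320-bit (3080 class):    [10, 20]
print(vram_options(384))  # 384-bit (3090 class):    [12, 24]
```

So on a 256-bit die the jump really is 8GB or 16GB, nothing in between, which is why the choice reads as marketing rather than engineering.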


----------



## phanbuey (Jan 11, 2021)

mouacyk said:


> Horses for courses... more and more gamers are adopting 144Hz+ displays, which are better suited to resolutions below 4K.  With the limited supply and high cost of GDDR6/X, NVidia perfected performance cards for 1080p and 1440p.  If you want to performance-proof 4K in *all *games, you need to step up to the 3090 with its 24GB of VRAM.
> 
> Is anyone really contending that the 16GB in AMD's Radeons is the sole determinant of 4K performance?  Given the limited memory configurations of a 256-bit die, AMD's only other option was 8GB, which they would have had to sell cheaper, and it would have looked way worse than the RTX 3080.  People, this was a marketing decision, not one that delivers functional gains.



And performance-proof in 4K means 4K ULTRA... which the 3080 will struggle with today. 4K High (which looks identical to Ultra) and 4K Medium, which still looks awesome in most games at much higher FPS, will be viable for a long time to come.

10GB is well within the performance bracket of the card.  The 3080 Ti and 3090 will even be replaced by the 4 series before they use all that RAM anyway.


----------



## EarthDog (Jan 11, 2021)

phanbuey said:


> which the 3080 will struggle with today..


No... it does not. Please take the time to read some of W1z's game reviews. So far, not ONE has used 10GB (one is very close). The 3080 doesn't struggle today because of vRAM limitations on any title running canned Ultra settings at 4K UHD, outside of the ONE title that has 9.8GB allocated (not used; this is an important distinction that many seem to gloss over or not understand). The rest of the titles he reviewed (again, using canned Ultra settings) over the past year averaged around 7GB of use. This card has plenty of time before its vRAM becomes a problem.
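The allocated-versus-used distinction can be shown with a toy model (all numbers hypothetical, purely illustrative): monitoring overlays report what the driver has reserved, not what a frame actually touches.

```python
# Toy model of the allocated-vs-used distinction. All numbers are
# hypothetical: a game reserves VRAM pools up front (which overlays
# report as "usage"), but each frame only touches part of them.

class VramPool:
    def __init__(self, reserved_mb):
        self.reserved_mb = reserved_mb  # what monitoring tools show
        self.touched_mb = 0             # actual working set

    def touch(self, mb):
        # A frame references some resident data; can't exceed the pool.
        self.touched_mb = min(self.reserved_mb, self.touched_mb + mb)

pool = VramPool(reserved_mb=9800)  # reported as "9.8GB allocated"
pool.touch(5200)                   # textures referenced this frame
pool.touch(1800)                   # buffers and render targets

print(pool.reserved_mb)  # 9800 -> looks like the card is nearly full
print(pool.touched_mb)   # 7000 -> what the frame actually needed
```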

That said, if you plan to mod games and add texture packs (which few actually do) and play at 4K/Ultra/60+, this may be a concern on some titles. 

Can this thread be closed? I think everyone and their mom has posted an opinion already. This serves no purpose (but to come back in 3 years and laugh at some posts).


----------



## phanbuey (Jan 11, 2021)

EarthDog said:


> No... it does not. Please take the time to read some of W1z's game reviews. So far, not ONE has used 10GB (one is very close). The 3080 doesn't struggle today because of vRAM limitations on any title running canned Ultra settings at 4K UHD, outside of the ONE title that has 9.8GB allocated (not used; this is an important distinction that many seem to gloss over or not understand). The rest of the titles he reviewed (again, using canned Ultra settings) over the past year averaged around 7GB of use. This card has plenty of time before its vRAM becomes a problem.
> 
> That said, if you plan to mod games and add texture packs (which few actually do) and play at 4K/Ultra/60+, this may be a concern on some titles.
> 
> Can this thread be closed? I think everyone and their mom has posted an opinion already. This serves no purpose (but to come back in 3 years and laugh at some posts).


That's not what I'm saying...

Basically, FPS will tank at ultra settings due to GPU horsepower before any RAM limitation.

Examples:
Borderlands 3, Ultra settings: the GPU manages high 60s with dips into the mid/high 50s, which feels awful for a shooter; at tweaked High/Medium settings it runs great at 80-110 FPS (and looks identical).
Outer Worlds: same story, except with even lower dips.
Cyberpunk (LOL - RIP): needs tweaked Medium/High settings and DLSS to even play at 4K; tweaked settings run between 70-90 FPS.

IMO newer games at ultra settings will beat the GPU into a slideshow way before the RAM quantity becomes an issue.


----------



## Sunny and 75 (Jan 12, 2021)

Chrispy_ said:


> If 16GB VRAM becomes common then they'll design levels that utilise 16GB of art. It's really not any extra work for them, they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to what fits into common VRAM sizes, 2GB, 4GB, or 8GB for example.
> 
> So some devs, at least, have been automatically ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards.



Interesting.


----------



## wolf (Jan 12, 2021)

I had the pleasure of gaming at a mate's place, plugging my 3080 box via HDMI 2.1 into an LG CX 65 and playing various games (DOOM Eternal, Cyberpunk, Jedi Fallen Order, to name a few), and holy sh*t was the 4K VRR OLED experience amazing. 

I am totally sold on OLED, and the 3080 absolutely bloody excels at 4K gaming in the here and now. Really makes me want the CX 48 as my gaming monitor; I can't believe a TV does such an awe-inspiring job of it. The way it handles VRR was buttery to a level I have not experienced in person. Absolutely blew me away.


----------



## Sunny and 75 (Jan 12, 2021)

Yeah playing on big screen is something else entirely.


----------



## phanbuey (Jan 12, 2021)

I have the Nano 85 49" variant which I love to death -- and yeah the 4K 120hz HDR w/ Gsync is just nuts for games.  It's no OLED tho -- I think once you see that in action it can never be unseen lol.


----------



## Chrispy_ (Jan 12, 2021)

wolf said:


> I had the pleasure of gaming at a mates place, plugging my 3080 box via HDMI 2.1 into an LG CX 65 and playing various games, DOOM Eternal, Cyberpunk, Jedi fallen order to name a few, and holy sh*t was the 4k VRR OLED experience amazing.
> 
> I am totally sold on OLED, and the 3080 absolutely bloody excels at 4K gaming in the here and now. Really makes me want the CX 48 as my gaming monitor; I can't believe a TV does such an awe-inspiring job of it. The way it handles VRR was buttery to a level I have not experienced in person. Absolutely blew me away.


Yep. It sure is a sight to behold.

The closest you'll get without spending megabucks is a VA panel with strobing backlight. Extremely high contrast compared to your typical IPS display and the strobing backlight really does make tracking objects in motion unbelievably clear.

I have yet to find or use a VA panel that does this perfectly. I've had a couple of VA panels with strobing backlights but their pixel response wasn't convincing enough compared to OLED. I currently have a great 165Hz VA monitor with fast enough pixel response but it doesn't have a strobe, which is the one thing stopping it from being the "perfect VA display".


----------



## londiste (Jan 12, 2021)

High-refresh VA is a tricky thing due to the bad pixel response times inherent to VA, and the balancing act between speed and the artifacting from the excessive overdrive used to mitigate them. A strobing backlight is not for everyone; it works very well for motion clarity but might be flickery depending on how your eyes perceive it. There are also not many monitors that do both a strobing backlight and VRR, which is a shame.


----------



## Mussels (Jan 12, 2021)

On my 165Hz here I had to spend ages testing the 'overdrive' settings to get the best out of VA:
Low makes scrolling text on TPU smear everywhere.
Balanced is actually decent.
Fast is great for scrolling text, but shows weird artifacting in games, a kind of super-fast colour smear.


----------



## Sunny and 75 (Jan 12, 2021)

Chrispy_ said:


> pixel response wasn't convincing enough compared to *OLED*



This one looks good:

The LG 32EP950 with a 4K OLED Pro display is official (www.displayspecifications.com): "The LG 32EP950 from the high-end UltraFine LG series of monitors has been announced at CES 2021. This monitor has a 31.5-inch 4K display with an OLED Pro display that covers 99% of the DCI-P3 color space and 99% of the Adobe RGB color space."

Hope they're gonna offer an UltraGear version of the panel also. Next CES maybe.


----------



## gfump (Jan 13, 2021)

More VRAM was originally added for SLI support or higher performance in professional workloads.  VRAM bandwidth will always be a bottleneck, just as RAM speed bottlenecks the CPU on a PC.  In games, 8GB looks to be the sweet spot for 2-3 years at 1080p and 1440p; any VRAM beyond that is for 4K+ resolutions and extra texture cache.  However, NVidia skimped with 8GB of VRAM on the current 3060 Ti and 3070, knowing how efficient Microsoft DirectStorage will be for texture streaming and Sampler Feedback in the near future, hopefully later this year.


----------



## lexluthermiester (Jan 13, 2021)

Adc7dTPU said:


> Yeah playing on big screen is something else entirely.


While true, there are some downsides unless you're sitting on a couch.


----------



## Sunny and 75 (Jan 13, 2021)

lexluthermiester said:


> While true, there are some down sides unless you're sitting on a couch.



No denying that, my friend.


----------



## wolf (Jan 13, 2021)

lexluthermiester said:


> While true, there are some down sides unless you're sitting on a couch.


I was sat at a table pretty much right up at the TV lol. Now my 34" curved ultrawide will take some re-adjusting to; it seems so small.

Having said that, it was nice to be back on a dedicated gaming monitor. There are obviously things it just can't do that the OLED did: perfect blacks, amazing HDR... like dude, f**k me, DOOM Eternal/Cyberpunk/Jedi Fallen Order pop SO HARD in HDR on that TV. But going back to the ultrawide was still sweet. I really have grown to love 21:9; I can drive higher framerates at 3440x1440, and at my viewing distance it's extremely clear.


----------



## Sunny and 75 (Jan 13, 2021)

wolf said:


> I really have grown to love 21:9



My brother has got one and it's so immersive. I wish it were the mainstream display ratio instead of 16:9. It's gradually becoming one, though.


----------



## Chrispy_ (Jan 13, 2021)

Adc7dTPU said:


> This one looks good:
> 
> 
> 
> ...


As much as I love OLED, I'm not convinced they're ready for monitor use yet. 8+ hours a day of a static taskbar is going to do horrible things to even current-tech OLEDs. The older one at work isn't faring too well (it must be 4 years old now).

As always, when something has known issues/caveats that are supposedly "fixed" with a new generation, I'll wait for longer-term reviews before spending my own money on it.

As for work, it's possible we'll pick one or two up for some reason, but at the moment OLEDs aren't competitive in the colour-accuracy department, and that's usually the justification required for spending more than your typical Dell UltraSharp would cost. I would be all over this if anyone here did HDR mastering, but we don't do that yet.


----------



## nguyen (Jan 14, 2021)

wolf said:


> I was sat at a table pretty much right up at the TV lol. Now my 34" curved ultrawide will take some re-adjusting too, it seems so small.
> 
> Having said that it was nice to be back on a dedicated gaming monitor, there are obviously things the OLED did that it just can't do, perfect blacks, amazing HDR like dude f**k me DOOM Eternal/Cyberpunk/Jedi fallen order pop SO HARD in HDR on that TV. But going back to the ultrawide was still sweet, I really have grown to love 21:9, I can drive higher framerates at 3440x1440, and at my viewing distance it's extremely clear.



Oh boy, your post made me splurge on an LG OLED CX 48in 

Well, for anyone wondering if 10GB of VRAM is enough for 4K: it will be, for a long time. CP2077 uses around 9,500MB with all the bells and whistles, and my watercooled 3090 can only manage 50-60 FPS (DLSS in Balanced mode)


----------



## wolf (Jan 14, 2021)

nguyen said:


> Oh boy your post made me splurge on an LG OLED CX 48in


Niiiiiiiice! Any regrets? Dude, it's a serious bit of kit, hey. Gaming on it was truly an experience; it's a dream for me for that experience to be normalised. Would be hard to go back to anything 'normal' now.


----------



## nguyen (Jan 14, 2021)

wolf said:


> Niiiiiiiice! any regrets? dude it's a serious bit of kit hey. Gaming on it was truly an experience, it's a dream for me for that experience to be normalised. Would be hard to go back to anything 'normal' now.



Well, I had been looking to buy this TV for a while now, but the 48in model was hard to come by (while the 55in is plentiful).
By chance I saw your post, and looking around I found this model on sale. Even the salesperson and delivery guy were surprised that their store had this model at all.
And holy crap, Cyberpunk 2077 is like another game with an OLED screen in HDR mode, dude. The response time on this TV is dead-on too; I can headshot bad guys in CP2077 easily even at <60FPS (aim assist is off, of course).


----------



## wolf (Jan 14, 2021)

nguyen said:


> Well I had been looking to buy this TV for awhile now but the 48in model was hard to come by (while the 55in one is aplenty).


I can't find it in Aus at all, apparently too little demand, people want the 55" or bigger.


nguyen said:


> By chance I saw your post and looking around I found this model on sale . Even the sale person and delivery guy were surprised that their store has this model at all .
> And holy crap Cyberpunk 2077 is like another game with an OLED screen in HDR mode dude, response time on this TV is dead on too, I can headshot bad guys in CP2077 easily even at <60FPS (aim assist is off of course).


The HDR is utterly gorgeous, the glowing neon in that game... man, the lightsaber glow and the darkness of space in Fallen Order were awe-inspiring.

Dead-on with pixel response too, and the flow-on of handling Gsync/FreeSync so well; it's just perfection. And it had me asking myself: I can always picture what I want future hardware to be. It's easy to want 2x as fast, 2x as much with GFX, CPU, RAM etc., but with this screen I find myself asking, how do they improve on it? Sure, maybe higher Hz, but the OLED experience itself is simply top tier. How do they improve what appears to be perfection? Let's wait and see.


----------



## nguyen (Jan 14, 2021)

wolf said:


> I can't find it in Aus at all, apparently too little demand, people want the 55" or bigger.
> 
> The HDR is utterly gorgeous, the glowing neon in that game, man the lightsaber glow/darkness of space in fallen order was awe inspiring.
> 
> Dead-on with pixel response too, and the flow-on of handling Gsync/FreeSync so well; it's just perfection. And it had me asking myself: I can always picture what I want future hardware to be. It's easy to want 2x as fast, 2x as much with GFX, CPU, RAM etc., but with this screen I find myself asking, how do they improve on it? Sure, maybe higher Hz, but the OLED experience itself is simply top tier. How do they improve what appears to be perfection? Let's wait and see.



I can find some areas where current OLED can improve. First is the shifting luminance when a large portion of the screen is bright; it is pretty obvious when surfing the web, actually (white just becomes yellowish when a different wallpaper comes on screen LOL). Second, brightness can be improved too; it seems the 11th gen will bring higher brightness levels with the G series.
Though I don't think LG can make too much of a difference with their 11th-gen OLED screen, so I'm just gonna enjoy this one for years to come. As a bonus, it's wall-mounted, so I have much more desk space than before


----------



## Chrispy_ (Jan 14, 2021)

wolf said:


> Sure, maybe higher Hz, but the OLED experience itself is simply top tier. How do they improve what appears to be perfection? Let's wait and see.


Sample-and-hold blur. Mitigating that is the next step for OLED; it goes hand-in-hand with higher refresh rates.

Why Do Some OLEDs Have Motion Blur? (blurbusters.com): "Written by Mark Rejhon (aka Chief Blur Buster) originally in 2013. Edited for 2019. OLED has been regarded as a Holy Grail for eliminating motion blur. Unfortunately, not always: The portable Playstation Vita, and the iPhone X and Samsung Galaxy still have lots of motion blur during fast..."

Inevitably, combating it will require higher frequencies and higher OLED brightness to eliminate flicker and dimming, but if you had the opportunity to game at 160Hz on a CRT back in the day, you'll know first-hand what every flat-panel technology since then has been missing.


----------



## sepheronx (Jan 14, 2021)

phanbuey said:


> I have the Nano 85 49" variant which I love to death -- and yeah the 4K 120hz HDR w/ Gsync is just nuts for games.  It's no OLED tho -- I think once you see that in action it can never be unseen lol.


So since you got the 49" version of my TV, I need to ask: how does the VRR work on it?

I'm rather disappointed with the backlight bleeding and poor black levels of this TV, TBH. But since I purchased it, I'd better use it.


----------



## phanbuey (Jan 14, 2021)

sepheronx said:


> So since you got the 49" of my TV, I need to ask how the VRR works on it?
> 
> I'm rather disappointed with the back light bleeding and poor black levels of the TV TBH.  But since I purchased it, better use it.



VRR works great on it as long as you keep it above 50 FPS.

I am using it with G-Sync, and even though it's not 'officially supported', it works on everything if you check 'enable display specific settings'. 

VRR activates around 48Hz or so, but when it turns on and off the screen will darken (flicker) for a split second. So: NVidia FPS cap at 118, and settings tuned to make sure games run in that 65-118 range, and it's perfect; no flickering, smooth frames and no lag. It will still flicker in loading screens, but I don't care about that.

I don't notice backlight bleeding on mine (I did turn off local dimming and all of the other features).  The black levels are way better than on most gaming/IPS monitors I've used, so no complaints there.  My 240Hz Samsung G7 VA had slightly better blacks, but had uneven colors from one side of the screen to the other, worse backlight bleeding, and very noticeable pixel inversion at anything above 120Hz.


----------



## lexluthermiester (Jan 14, 2021)

Wait, let's review....


nguyen said:


> Well for anyone wondering if 10GB VRAM is enough for 4K, it will be for a long time.


You said this and in the very next sentence you said....


nguyen said:


> CP2077 use around 9500MB with all the bell and whistle


...this. You then followed up with...


nguyen said:


> and my watercooled 3090 can only manage 50-60FPS (DLSS in Balanced mode)


...this.

By your own statement you show that games are right on the bleeding edge of high-end cards right NOW. 8GB cards are going to suffer. Games made over the next few years are going to push those limits even further, as they always have. 8GB/10GB is simply not enough.

This demonstrates how misunderstood GPU memory and game graphics really are.


----------



## ThrashZone (Jan 14, 2021)

Hi,
Just another example of an RTX on/off comparison


----------



## Deniz_Sorkun (Jan 14, 2021)

Tomgang said:


> This has been discussed before I know. But now seing the spec requirements for the new watch dog legion game. I really have to say, no 10 gb is not enough these days. Not for 4k at least now or in the for seen future.
> 
> I dont know by you guys, but I am holding on and se if we can not get a rtx 3080 20 GB or a rtx 3080 ti.
> 
> ...


It would be great if it were 16GB. For now it is enough, but *for now*.


----------



## phanbuey (Jan 14, 2021)

lexluthermiester said:


> Wait, let's review....
> 
> You said this and in the very next sentence you said....
> 
> ...



There's a big difference between 'ENOUGH' and 'BLEEDING EDGE' settings in games.  Games scale; your settings determine usage. Is 10GB or 8GB 'enough' for ULTRA settings for years to come? No.  Is 10GB or 8GB enough for 4K Medium-High settings? Yes, and it will be for a *long *time.

The difference between an RX 480 8GB and an RX 480 4GB is a perfect example; a 1080p card from 2016:

Is 4GB enough? Yes, it was. Will it run Cyberpunk now at 1080p and not run out? No, it will suffer, but both the 8GB and 4GB versions are in the low 30s FPS, so buying an 8GB RX 480 back then never really made a difference for actual gaming.  By the time you enable all the settings at 4K needed to run you out of 10GB of VRAM, you will be running like trash at 45 FPS anyway.

TL;DR: 'enough' is really what's being deliberated in this thread. If your definition of 'enough' is to run at the highest settings with all outlier games/user texture packs for the next 4 years and not have an issue, then no, it's not 'enough'.  If your definition of 'enough' is 'can it run medium/high settings in stock/unmodded games for the next 4 years and not run out?', then yes, it's fine.


----------



## lexluthermiester (Jan 14, 2021)

phanbuey said:


> Is 4GB enough? yes it was


Key word there: "was". We're not talking about GPUs that are 5+ years old. We're talking about modern GPUs that run modern and future games. The amount of RAM on offer currently by NVidia, with the exception of the 3060 12GB and 3090 24GB, simply does not offer any room to grow for future games. Comparatively, AMD has no such problem. The 6800/6900 cards all have 16GB and thus will not hold game development and game playing back to any measurable degree. The RTX 8GB/10GB cards will. NVidia needs to release the 16GB and 20GB variants of the mid and top tier cards.


----------



## phanbuey (Jan 14, 2021)

lexluthermiester said:


> Key word there: "was". We're not talking about GPUs that are 5+ years old. We're talking about modern GPUs that run modern and future games. The amount of RAM on offer currently, with the exception of the 3060 12GB and 3090 24GB, simply does not offer any room to grow for future games.



If you look back 4+ years, these same discussions happened with the 480 4GB, and the same point about growth was made.  *And you're right *-- the RAM doesn't allow any room to grow or mod games, but when even the full-fat 3090 GPU is getting 55FPS at settings that would even begin to stress the 10GB, the 3080 GPU itself doesn't really have any room to grow either.



lexluthermiester said:


> Comparitively, AMD has no such problems. The 6800/6900 cards all have 16GB and thus will not hold game development and game playing back to any measurable degree. The RTX 8GB/10GB cards will. NVidia needs to release the 16GB & 20GB variations of the mid & top tier cards.



Agreed - the 16GB is much nicer. AMD cards have always been more generous with RAM relative to GPU power (except the Fury), but they always end up dying the exact same unceremonious death at 40 FPS and below before the extra VRAM comes into play.  For editing, content creation and compute, that's a totally different story; for gaming it's largely GPU-limited.


----------



## lexluthermiester (Jan 14, 2021)

phanbuey said:


> If you look back 4+ years, these same discussions happened with the 480 4GB, and the same point about growth was made.  *And you're right *-- the ram doesn't allow any room to grow or mod games, but when even the full fat 3090 GPU is getting 55FPS at settings that would even begin to stress the 10GB, the 3080 GPU itself doesn't really have any room to grow either.
> 
> 
> 
> ...


Ah, that's the key point: processing textures has never been as GPU-intensive a task as post-processing effects such as shadows, AA and various other effects. However, those textures do take up a lot of room in VRAM, so having a lot of VRAM can offer vast improvements to certain aspects of games without a severe GPU load overhead. For example, the 3060 12GB will be able to do far more than its 6GB sibling. Likewise, using your earlier examples, an RX 480 8GB could offer much higher texture settings without taxing the GPU much more, whereas the 4GB model was limited in the amount, size and quality of the textures it could practically handle. When you add the VRAM cost of other post-processing FX, 4GB quickly runs out, even at 1080p (as was seen with some games on the GTX 980); 8GB does not. When you bump up to 1440p/1600p, games start to brush up against that 8GB limit. At 4K many games are right at the limit of 10GB and will soon push beyond it.
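A back-of-the-envelope sketch of that texture arithmetic (illustrative sizes only, assuming uncompressed RGBA8 and a ~4/3 mip-chain overhead, not figures from any specific game):

```python
# Back-of-the-envelope texture VRAM cost. Illustrative numbers only:
# a full mip chain adds about a third on top of the base level
# (1 + 1/4 + 1/16 + ... = 4/3).

def texture_mb(width, height, bytes_per_texel=4, mipmapped=True):
    """Approximate VRAM cost of one texture, in MB."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmapped else base
    return total / (1024 ** 2)

# Uncompressed 4096x4096 RGBA8 (4 bytes/texel):
print(round(texture_mb(4096, 4096), 1))                     # 85.3 MB
# Block-compressed at ~1 byte/texel (e.g. BC7) cuts that 4x:
print(round(texture_mb(4096, 4096, bytes_per_texel=1), 1))  # 21.3 MB
```

A few dozen high-resolution textures resident at once is how a game eats gigabytes of VRAM while barely moving the GPU load.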


----------



## Mussels (Jan 15, 2021)

When CP2077 maxes out my RTX 3090 on the GPU side, I'm 100% confident GPU usage will run out before VRAM usage in future titles as well, like everyone has said.

It's very easy to turn textures from Ultra to High (in games not coded poorly) and suddenly you have VRAM to spare


----------



## wolf (Jan 15, 2021)

I mean, personally I tend to follow game optimisation guides from the likes of Digital Foundry and Hardware Unboxed to get the best 'bang for buck' from my settings anyway. I do love the "turn all the dials to 11" experience when you have excessive GPU horsepower to spare, but in most modern games some tweaking goes a long way.

A point to note, at least from my limited experience and rose-coloured recollections, is they often say something along these lines: "There is very little to no visible difference between Ultra and High textures, we even have trouble telling the difference from Ultra to Medium, but since there's virtually no framerate impact, we recommend Ultra if you have the VRAM."

Naturally this isn't always going to be the case, but it happens surprisingly often, and you often need to take screenshots and pixel-peep to find anything meaningful.

Would I have preferred it if the 3080 were 12+GB? Sure, but having had the product going on 4 months now, I am not disappointed one bit with it.


----------



## lexluthermiester (Jan 15, 2021)

Mussels said:


> When CP2077 maxes out my RTX 3090 on the GPU, i'm 100% confident GPU usage will run out before VRAM usage in future titles as well, like everyone has said.
> 
> It's very easy to turn textures from ultra to high (in games not coded poorly) and suddenly you have VRAM to spare


Yeah, but you have 24GB of VRAM. And you're missing the point...


wolf said:


> "There is very little to no visible difference between Ultra and High textures, we even have trouble telling the difference from ultra to Medium, but since there's virtually no framerate impact, we recommend Ultra if you have the VRAM"


While I don't exactly agree with this statement, it does illustrate the point. Why should we have to turn texture quality down? GPU makers should be making cards with enough VRAM that we don't need to.

At this point in time, with the raw GPU power on offer from both AMD and NVidia, there is simply no excuse for making video cards that leave no room for growth. 8GB is yesterday's news. 10GB is a pittance. The bar should start at 12GB and go up from there, regardless of whether current games will use it all, because they soon will.



wolf said:


> I am not disappointed one bit with it.


Now? Maybe not, but you soon might.


----------



## wolf (Jan 15, 2021)

I feel you, dude, it *is* a compromise and I would have preferred 12GB or more for sure. But at this rate I still feel lucky just to own one. Since I don't game at 4K outside of that one recent experience, I do hope I won't need to lower those settings anytime soon.

I am keen to keep revisiting this topic/thread as the years pass and check in with how it's all going, because in the end it's all totally relative: relative to what res you play at, the games you play, etc. So some people will 100% find it not enough, possibly already, and for some people it will be 100% fine for the life of the card, with all sorts of experiences anywhere in between.


----------



## Metroid (Jan 15, 2021)

A midrange card like the 3060 having more VRAM than the 3080 is BS, and I do have a 3080 already. Yes, it will be enough for the next 3 years, but the 3080, being a flagship, was supposed to have at least 16GB. AMD did it right.


----------



## lexluthermiester (Jan 15, 2021)

wolf said:


> But at this rate I still feel lucky just to own one.


That's a fair point. However, patience will very likely pay off. The 12GB 3060 is coming very soon, the 20GB 3080 Ti is also soon, and the 16GB 3070 is very likely soon thereafter. I have an RTX 2080, so I'm not hurting for GPU power. But there is no way in hell I'm willing to settle for a card that has no more VRAM than my current card, no matter how good it is.


----------



## wolf (Jan 15, 2021)

lexluthermiester said:


> That's a fair point. However, patience will very likely pay off. The 12GB 3060 is coming very soon, the 3080ti 20GB is also soon and the 16GB 3070 is very likely soon thereafter. I have an RTX 2080 so I'm not hurting for GPU power. But there is no way in hell I'm willing to settle for a card that has no more VRAM than my current card, no matter how good it is.


I came from a GTX 1080 which I bought at launch, so over 4 years later I was realllly wanting that upgrade, and mid last year I moved up to the 3440x1440 ultrawide, where the GTX 1080 was really starting to show its age. I am very happy I got one; if there were a 20GB one that cost more, I'm not even sure I'd have bought it, tbh, assuming it would cost even more and be hard to get. I've already clocked a few games on it and am very happy with the perf level. Like I said, it'd just have been sick if it were a 12GB card to begin with. Let's see if I eat these words in ~2 years


----------



## nguyen (Jan 15, 2021)

Metroid said:


> A midrange card like 3060 to have more vram than the 3080 is bs and I do have a 3080 already, yes will be enough for next 3 years but the 3080 for being a flagship, supposed to be at least 16gb. AMD did it right.



Except that the GPU core on the RX 6800 XT cannot take advantage of the 16GB VRAM buffer to gain leverage against the RTX 3080, though. 
The RX 6800 XT core shows its strength at 1440p and lower (due to the Infinity Cache hit rate being higher at 1440p and below). At those resolutions, 10GB of VRAM is more than enough for a very long time.
At 4K, where VRAM usage generally only increases by ~20% but the compute requirement is ~125% higher compared to 1440p, is where the Ampere architecture shows its prowess. 

The 6800 XT, and even the 6900 XT/3090, run out of compute juice before their cores can properly utilize more than 10GB of VRAM at 4K.
So yeah, unless you do some extreme texture modding at 1440p (which I highly doubt would bring any visual benefit) to break the 10GB VRAM buffer, there is no usage scenario where 10GB of VRAM limits the 3080 against the 6800 XT.

Now, if we are talking about the 10GB 3080 vs a 20GB 3080, there are very limited cases where 20GB of VRAM comes in handy (RTX ON without DLSS); it is not worth paying 150-200 USD for 10GB more VRAM. Nvidia also delayed the 20GB 3080 Ti indefinitely anyway.
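For what it's worth, that ~125% compute figure checks out from pixel counts alone; a quick sanity check, assuming shading cost scales roughly linearly with resolution:

```python
# Sanity check on the ~125% compute figure quoted above: shading work
# scales roughly with the number of pixels rendered per frame.

def pixels(width, height):
    return width * height

ratio = pixels(3840, 2160) / pixels(2560, 1440)
print(ratio)  # 2.25 -> 4K shades 125% more pixels than 1440p
```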


----------



## Metroid (Jan 15, 2021)

nguyen said:


> Except that the GPU core on the RX6800XT cannot take advantage of the 16GB VRAM buffer to gain leverage against the RTX3080 though.
> The RX6800XT core shows its strength at 1440p and lower (due to the Infinity Cache hit rate being higher at 1440p and below). At those resolutions 10GB VRAM is more than enough for a very long time.
> At 4K, where VRAM usage generally only increases by ~20% but the compute requirement is ~125% higher compared to 1440p, is where the Ampere architecture shows its prowess.
> 
> ...


 Yeah, it's not worth paying 200 usd for 10gb more. What I'm saying is, the 3080 should have come by default with at least 16gb.


----------



## nguyen (Jan 15, 2021)

Metroid said:


> Yeah, it's not worth paying 200 usd for 10gb more. What I'm saying is, the 3080 should have come by default with at least 16gb.



16GB VRAM is a very bad idea, for several reasons:
_Higher production cost and lower stock, which would make the current situation worse.
_Lower efficiency, since GDDR6X uses something like 4-5W per module; the card would use ~25W more.
_Lower performance, since 16GB VRAM would run on a 256-bit bus instead of a 320-bit bus.

All those detriments and none of the benefits that supposedly come with 16GB VRAM. Nvidia already designed the 3080 10GB to be perfectly balanced between cost, performance and efficiency. The 3080 10GB will probably become the most successful high-end GPU Nvidia has ever made.
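
For anyone wondering why capacity and bus width are tied together in the list above: each GDDR6X package talks to the GPU over its own 32-bit channel, so the package count fixes both numbers at once. A small illustrative sketch (my own, using the 1 GB and 2 GB module sizes available at the time):

```python
# Sketch of the capacity/bus-width coupling described in the post above.
# Each GDDR6X package provides a 32-bit channel, so choosing the number
# of packages simultaneously sets the capacity and the memory bus width.

BITS_PER_CHANNEL = 32

def config(packages, gb_per_package):
    """Return (capacity_gb, bus_width_bits) for a given memory layout."""
    return packages * gb_per_package, packages * BITS_PER_CHANNEL

print(config(10, 1))  # (10, 320) -- the shipping 3080 layout
print(config(8, 2))   # (16, 256) -- a hypothetical 16 GB variant
```

This is why a hypothetical 16 GB card built from 2 GB packages drops to a 256-bit bus, while a 20 GB card would instead need to double up packages on the same 320-bit bus.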


----------



## wolf (Jan 15, 2021)

I think the best amount it reasonably could/should have come with is 12GB; 10GB _feels_ bad after the 11GB 2080Ti and 1080Ti, where 12GB on the 3080 would have drawn virtually zero complaints IMO. In any case we got what we got for many reasons, and a good product it is.

My gut says they did it knowing full well a 20GB super/Ti would follow in 6-12 months or so.


----------



## nguyen (Jan 15, 2021)

wolf said:


> I think the best amount it reasonably could/should have come with is 12GB, 10GB _feels_ bad after the 2080Ti and 1080Ti where 12GB would have had virtually 0 complaints IMO. In any case we got what we got for many reasons, and a good product it is.



The 1080Ti and 2080Ti come with 11GB VRAM though.
But yeah, I agree that 12GB VRAM for the 3080 Ti would be perfect, if it ever existed.


----------



## wolf (Jan 15, 2021)

nguyen said:


> 1080Ti and 2080Ti comes with 11GB VRAM though
> But yeah I agree that 12GB VRAM for the 3080 Ti would be perfect, if it ever existed.


Indeed they did, I just worded my sentence badly, edited!


----------



## TumbleGeorge (Jan 15, 2021)

nguyen said:


> 6800XT and even the 6900XT


They're limited by 300 watt BIOSes and lower frequencies. I think that will change soon. The 3090 and 3080 have BIOSes for 400-450 watts, maybe even more. When we see results from 6800/6900 cards equipped with modded BIOSes, we'll get a fairer comparison and also see where the limits on VRAM usage really are.


----------



## Vayra86 (Jan 15, 2021)

Mussels said:


> At ultra settings.
> 
> You act like you cant turn one or two settings down and tank that VRAM usage.



You can. But that is a choice no 700 dollar *current* gen GPU should ever present. Come on... we never had to. You kinda act like there are no alternatives to these 3070s and 3080s... but that's not true.


----------



## lexluthermiester (Jan 15, 2021)

nguyen said:


> Except that the GPU core on the RX6800XT cannot take advantage of the 16GB VRAM buffer to gain leverage against RTX3080 though.


Nonsense. Again, you are misunderstanding the difference between texture quality, which uses a lot of VRAM storage but is NOT GPU intensive, and post-processing effects, which also have a VRAM tax and are GPU intensive.


----------



## Mussels (Jan 16, 2021)

lexluthermiester said:


> Nonsense. Again, you are misunderstanding the difference between texture quality, which uses a lot of VRAM storage but is NOT GPU intensive, and post-processing effects, which also have a VRAM tax and are GPU intensive.


I think that's the argument I've been making that people miss as well.

VRAM usage/textures don't really impact GPU usage. An 8GB RX580 gets the same FPS as a 4GB one; you can just turn the textures up in some titles.

I commented earlier about my 3090 being GPU limited before it is VRAM limited as an obvious (to me) point: if a 3090 is GPU limited before it reaches 10GB of VRAM... well, so will anything else be.


----------



## nguyen (Jan 16, 2021)

Mussels said:


> I think that's the argument I've been making that people miss as well.
> 
> VRAM usage/textures don't really impact GPU usage. An 8GB RX580 gets the same FPS as a 4GB one; you can just turn the textures up in some titles.
> 
> I commented earlier about my 3090 being GPU limited before it is VRAM limited as an obvious (to me) point: if a 3090 is GPU limited before it reaches 10GB of VRAM... well, so will anything else be.



LOL yeah, turning up the texture quality might make some people feel good about spending extra for more VRAM, but the reality is it doesn't make any difference beyond High texture quality anyway.

Here are some 4K images of Doom Eternal; left is High texture and right is Ultra Nightmare. 1.6GB higher VRAM usage for a literally imperceptible IQ improvement.


----------



## lexluthermiester (Jan 16, 2021)

nguyen said:


> LOL yeah, turning up the texture quality might make some people feel good about spending extra for more VRAM, but the reality is it doesn't make any difference beyond High texture quality anyway.
> 
> Here are some 4K images of Doom Eternal; left is High texture and right is Ultra Nightmare. 1.6GB higher VRAM usage for a literally imperceptible IQ improvement.
> 


Ok, now turn off AA/FSAA/DLSS and walk closer to those plants and let's see the difference. Many games often require a player to get up close and personal with environmental objects. Turning textures down will affect visual quality.


----------



## nguyen (Jan 16, 2021)

lexluthermiester said:


> Ok, now turn off AA/FSAA/DLSS and walk closer to those plants and let's see the difference. Many games often require a player to get up close and personal with environmental objects. Turning textures down will affect visual quality.



In case you can't tell what is going on, that is the closest I could stand to the wall in Doom Eternal LOL.
Changing only 1 setting, Texture Quality, from Ultra Nightmare to High reduces VRAM usage from ~10.6GB to ~9GB.
The result is self-explanatory: it makes almost no perceivable IQ difference; even on a 4K 48in OLED TV I can't make out any difference.
And why should I turn off the AA/FSAA? Every other setting is set to Ultra Nightmare, as it should be. Might be best if you do some testing of your own instead of talking nonsense.


----------



## Sunny and 75 (Jan 16, 2021)

Reaching nearly 400 posts now. I'm simply speechless.

The Maxwell *980 had 4 GB VRAM*, then came the Pascal *1080 with 8 GB VRAM*. *THAT* is what I consider an upgrade. You get *both GPU performance and VRAM capacity at the same time*.
That 10 GB should have been at least 12 GB (*50% more VRAM*) for *Nvidia to convince me to upgrade*. The GPU power is definitely there, *60% more performance (vs. 2080) is nothing to sneeze at*, but the *mere 25% more VRAM simply is not gonna cut it for me*.
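
The generational percentages quoted above check out; here is the arithmetic spelled out (my own sketch, using the cards' stock VRAM sizes):

```python
# Worked version of the generational VRAM jumps quoted above.

def vram_gain(old_gb, new_gb):
    """Percentage increase in VRAM capacity between two cards."""
    return (new_gb - old_gb) / old_gb * 100

assert vram_gain(4, 8) == 100.0    # 980 -> 1080: capacity doubled
assert vram_gain(8, 10) == 25.0    # 2080 -> 3080: the contested jump
assert vram_gain(8, 12) == 50.0    # what a 12 GB 3080 would have offered
```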


----------



## nguyen (Jan 16, 2021)

Adc7dTPU said:


> Reaching nearly 400 posts now. I'm simply speechless.
> 
> The Maxwell *980 had 4 GB VRAM*, then came the Pascal *1080 with 8 GB VRAM*. *THAT* is what I consider an upgrade. You get *both GPU performance and VRAM capacity at the same time*.
> That 10 GB should have been at least 12 GB (*50% more VRAM*) for *Nvidia to convince me to upgrade*. The GPU power is definitely there, *60% more performance (vs. 2080) is nothing to sneeze at*, but the *mere 25% more VRAM simply is not gonna cut it for me*.



Yet the 1080, 1080Ti and 2080Ti all got replaced by cheaper GPUs with lower VRAM capacity:
1080 --> 2060 6GB
1080Ti -->  2070 Super
2080Ti --> 3070

Core horsepower > VRAM capacity. More VRAM costs more money and brings almost nothing to the table.
Here is a question: for 500usd, would you prefer a 3070 GPU with 8GB VRAM or a 3060Ti GPU with 16GB VRAM?


----------



## TumbleGeorge (Jan 16, 2021)

nguyen said:


> More VRAM cost more


Mr. Einstein said "everything is relative" or something.


----------



## Sunny and 75 (Jan 16, 2021)

nguyen said:


> Yet the 1080, 1080Ti and 2080Ti all got *replaced* by cheaper GPUs with lower VRAM capacity
> 1080 --> 2060 6GB
> 1080Ti --> 2070 Super
> 2080Ti --> 3070



You mean to say that the mighty RTX 2080 Ti got replaced by a mere 3070. And the 3090 is just there for, idk, to pose for a photoshoot?!?!?!
A GeForce xx80 class graphics card replaced by a mere xx60 class variant, wow, you don't say! I feel so overwhelmed; seriously, there's nothing left to say, I got nothing! I literally got nothing!!!
And the RTX 2080 is just there to amuse us, right?! You know, for the fun of it, to entertain us as a live performance would, yes?!

*Nvidia replaced* the *1080 with the 2080*, the *1080 Ti with the 2080 Ti*, and the *2080 Ti with the 3090*. And now, come late February, the *2060 6 GB* is *going to be replaced* by the *3060 12 GB*. THAT is how it works, as it did before.

Nvidia just marketed those cards that way for PR purposes. What it means is they advertised that if you, the 1070 owner, upgrade to a 2070 Super, you will get the performance a 1080 Ti owner does. Simple as that. That's what it reads, nothing more, nothing less. It does NOT mean that 11 GB on the 1080 Ti was a bit too much, so we at Nvidia wisely decided to lower the VRAM capacity and release an 8 GB version of it in the form of the 2070 Super. It ONLY means: here's what we're offering at the moment (July 2019 in the case of the 2070 Super), take it or leave it. That's it, that's the WHOLE story.
The same goes for the 2060, 3070 and all the other cards as well. Those are some ****ing awesome cards, no denying that, so when Nvidia advertises them as faster than a previous flagship or something like that, it's just marketing material. It does not mean that the GTX 1080, which launched three years before the RTX 2060, should have had 6 GB VRAM, and so on.




nguyen said:


> Here is a question: for 500usd, would you prefer a 3070 GPU with 8GB VRAM or a 3060Ti GPU with 16GB VRAM?



What's that supposed to mean?! Is that even a question?!

Say Nvidia offers us a 4070 8 GB sometime in 2022 and markets it as faster than the 3090 24GB; so what, it does not mean anything. It does not mean that 8 GB is enough VRAM for a 4070, and it does not mean that 24 GB VRAM was overrated or anything of the sort.
It ONLY means: we're offering the performance of the 3090 with a third of its VRAM as a 4070 package at a certain price, so if you're interested, take a bite. That's it.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------
There is the argument about modding also:


Vayra86 said:


> And is the use of mods so strange on a PC? I have to say that for any game I play more than a single playthrough, its one of the first things I check out. They expand games and increase value. If I don't want mods, I can buy a console - modding is a key selling point for PC gaming.
> 
> I'll also concede that its acceptable that others accept different standards from what they get out of a GPU. But I can see myself running into trouble with 10GB going forward, and that is well founded in what I've seen up until today wrt performance and capacity. You're at liberty to think otherwise and base your choices on that  But I wouldn't be too sure, neither am I - we just can't tell and that is a 'risk' for a purchase.


I did not switch from the PS4 to PC to see my modding experience limited by that 10 GB of VRAM.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------
And here's a quote from a 3080 owner:


Metroid said:


> They put 10gb on the rtx 3080 flagship gpu and 12 gb on a mainstream gpu; this must be a joke to 3080 users.





Metroid said:


> I have a evga rtx 3080 ftw3


----------



## Vayra86 (Jan 16, 2021)

TumbleGeorge said:


> Mr. Einstein said "everything is relative" or something.


E=10GB?


----------



## wolf (Jan 17, 2021)

Adc7dTPU said:


> And here's a quote from a 3080 owner:


Well, I'm a 3080 owner too and I don't really give a hoot that the 3060 has more VRAM. I'm sure it'll upset some people, but it's not a universally accepted norm. And like I've said, if there had been a 20gb 3080 that cost more (the absolute bare min would be $100 usd more, but I'd wager more like $150-300), then I wouldn't have bought it anyway.


----------



## lexluthermiester (Jan 17, 2021)

wolf said:


> (absolute bare min would be $100 usd more, but I'd wager more like $150-300)


$80 more, $100 at most. IMHO, worth it!


----------



## wolf (Jan 17, 2021)

lexluthermiester said:


> $80 more, $100 at most. IMHO, worth it!


Time will tell there too I guess, my bare min is your at most haha


----------



## simlife (Jan 17, 2021)

MxPhenom 216 said:


> Its enough. I don't expect the 20GB version to really boost performance in games right now.


well duh, i think the question is more towards whether 12-16 GB is worth consideration.. du doy... the switch has 4 gbs of total system/vram memory... so 5 times more memory (for the gpu alone) than a console with 90 million systems sold... huh... are you ok ???


----------



## lexluthermiester (Jan 17, 2021)

simlife said:


> well duh, i think the question is more towards whether 12-16 GB is worth consideration.. du doy... the switch has 4 gbs of total system/vram memory... so 5 times more memory (for the gpu alone) than a console with 90 million systems sold... huh... are you ok ???


Are you seriously comparing a console that is several years old to modern PC parts?


----------



## TumbleGeorge (Jan 17, 2021)

lexluthermiester said:


> Are you seriously comparing a console that is several years old to modern PC parts?


Which of the latest consoles uses parts that are several years old? Nintendo's?


----------



## Vayra86 (Jan 17, 2021)

simlife said:


> well duh, i think the question is more towards whether 12-16 GB is worth consideration.. du doy... the switch has 4 gbs of total system/vram memory... so 5 times more memory (for the gpu alone) than a console with 90 million systems sold... huh... are you ok ???


The _Switch?!_ Those games look like they're still stuck on N64 graphics, come on. What is it anyway, 720p?


----------



## MxPhenom 216 (Jan 17, 2021)

simlife said:


> well duh, i think the question is more towards whether 12-16 GB is worth consideration.. du doy... the switch has 4 gbs of total system/vram memory... so 5 times more memory (for the gpu alone) than a console with 90 million systems sold... huh... are you ok ???



The f*** did you just say? Sit this one out...


----------

