# NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type



## btarunr (Dec 26, 2018)

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down, or should we say triple down, on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes: 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
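The arithmetic behind those counts can be sketched in a few lines of Python (purely illustrative; the size, type, and ASIC-class labels are taken from the rumor above, nothing here is confirmed by NVIDIA):

```python
from itertools import product

# Rumored RTX 2060 matrix: three memory sizes x two memory types
# gives six retail variants; the two known TU106 ASIC classes
# ("A" and "non-A") would double the possible device-ID count.
sizes = ["3 GB", "4 GB", "6 GB"]
mem_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]

variants = list(product(sizes, mem_types))
device_ids = list(product(sizes, mem_types, asic_classes))

print(len(variants))    # 6
print(len(device_ids))  # 12
```

The ~300-model figure then follows from each AIC partner building several SKUs (coolers, clocks, RGB) on top of each of the six base variants.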



 



*View at TechPowerUp Main Site*


----------



## ShurikN (Dec 26, 2018)

3GB on a mid range gpu in 2019... I'm at a loss for words.


----------



## Blueberries (Dec 26, 2018)

People had a hard time with 3GB vs 6GB... 

...nevermind...

not even going to try.


----------



## Prima.Vera (Dec 26, 2018)

The leather jacket idiot seems to be completely losing his mind. 
Releasing 6 different cards under the same name is proof that nVidia lost it.
And lost it hard...


----------



## Xzibit (Dec 26, 2018)

Interesting, given that turning on RTX in BF5 uses more VRAM.  3GB will be low SD, 4GB will be regular SD, 6GB will be a tiny pinch of RTX.


----------



## INSTG8R (Dec 26, 2018)

Xzibit said:


> Interesting, given that turning on RTX in BF5 uses more VRAM.  3GB will be low SD, 4GB will be regular SD, 6GB will be a tiny pinch of RTX.


I have a feeling RTRT will be all but useless at this level.


----------



## kastriot (Dec 26, 2018)

Nvidia is going intel way..


----------



## Tomgang (Dec 26, 2018)

Wow, that's low, Nvidia. I replaced my two GTX 970s with a GTX 1080 Ti one and a half years ago because I ran out of VRAM.

And the Pascal GTX 1060 3 GB I did not recommend to people that play games even when it came out.

And now they release a 3 GB variant again. WTF Nvidia. 3 GB of RAM is outdated in 2018 and that sure as hell ain't gonna change in 2019. Unless you are a 720p gamer, and how many do that in 2018/2019?

What's next? They secretly put out a card with DDR4 RAM, like they did with the GT 1030, and totally cripple memory bandwidth.


----------






## GoldenX (Dec 26, 2018)

The whole RTX launch seems something out of Bethesda/EA Games.


----------



## Fouquin (Dec 26, 2018)

GS, GT, GTO, GTS, XT, RTX


----------



## M2B (Dec 26, 2018)

You can easily run almost every single game on a 3GB card at 1080p just fine; you're just gonna lower the texture quality in some games, and that's all. The 3GB version is not ideal but is not as useless as some people think; it actually might be a good value card for those who need a fairly powerful GPU and don't care about maximum-quality textures or VRAM-related settings.
If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.


----------



## FordGT90Concept (Dec 26, 2018)

You can get an 8 GiB RX 580 for $220.  A 3 GiB card these days shouldn't be going for much more than $120 (the realm of budget cards).  The only way NVIDIA offering these cards makes any sense is if they have really, really, really bad yields.  The fact that they are going to exist strongly suggests that is the case.  Even 4 GiB is rather sad.

At most they should offer GDDR5 and GDDR6 variants of the 2060 6 GiB, and they should call the GDDR5 variant the 2060 MX to clearly indicate it's the lesser of the two.


----------



## Mistral (Dec 26, 2018)

How are they going to have an RTX 2060 when the RTX 2070 is barely adequate to do anything RTX-related?


----------



## Durvelle27 (Dec 26, 2018)

M2B said:


> You can easily run almost every single game on a 3GB card at 1080p just fine; you're just gonna lower the texture quality in some games, and that's all. The 3GB version is not ideal but is not as useless as some people think; it actually might be a good value card for those who need a fairly powerful GPU and don't care about maximum-quality textures or VRAM-related settings.
> If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
> 8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.


I have a Gigabyte GTX 1060 3GB OC edition and at 1080p it struggles in the 3 main games I play, and it's not the power, it's the VRAM



Mistral said:


> How are they going to have an RTX 2060 when the RTX 2070 is barely adequate to do anything RTX-related?


Honestly that comes down to devs and how they implement RT. The 2070 has no problem playing BFV at 1080p Ultra w/RT while still maintaining 60 FPS


----------



## MicroUnC (Dec 26, 2018)

Dear AMD!

Where Are You!?


----------



## M2B (Dec 26, 2018)

Durvelle27 said:


> I have a Gigabyte GTX 1060 3GB OC edition and at 1080p it struggles in the 3 main games I play, and it's not the power, it's the VRAM



Maybe you should improve your reading skills?
Yeah, 3GB is not ideal and is not enough for maximum VRAM-related settings in most AAA games, but IF the price is right (something around $140 sounds right to me, which is unlikely) it's not gonna be a bad value card.


----------



## Fatalfury (Dec 26, 2018)

Gigabyte...
39 SKUs for the RTX 2060, ready before launch.
0 SKUs for the RX 590... even a few months after launch.

Well done, top 3 brands (Gigabyte, Asus, MSI), for unofficially supporting GPP.


----------



## ShurikN (Dec 26, 2018)

M2B said:


> Maybe you should improve your reading skills?
> Yeah, 3GB is not ideal and is not enough for maximum VRAM-related settings in most AAA games, but IF the price is right (something around $140 sounds right to me, which is unlikely) it's not gonna be a bad value card.


Even the 1060 3GB wasn't $140... Why would a 2060 (with a larger die) be cheaper?


----------



## satrianiboys (Dec 26, 2018)

M2B said:


> Maybe you should improve your reading skills?
> Yeah, 3GB is not ideal and is not enough for maximum VRAM-related settings in most AAA games, but IF the price is right (something around $140 sounds right to me, which is unlikely) it's not gonna be a bad value card.



The true definition of fanspin..hm i mean fanboi..


----------



## IceScreamer (Dec 26, 2018)

But, April fools' is 4 months away.


----------



## StefanM (Dec 26, 2018)

Interesting. Also check out: https://portal.eaeunion.org/sites/o...3&listid=d84d16d7-2cc9-4cff-a13b-530f96889dbc
for some unannounced GPUs



> *PG183 QS1, PG183-A00, PG183-A01, PG184, PG189, PG189-A00*, QUADRO RTX 4000, QUADRO RTX 5000, QUADRO RTX 6000, QUADRO RTX 8000, *QUADRO T1000, QUADRO T2000,* TESLA T4, *PG183, TESLA TU102*


----------



## FordGT90Concept (Dec 26, 2018)

Durvelle27 said:


> Honestly that comes down to devs and how they implement RT. The 2070 has no problem playing BFV at 1080P Ultra w/RT and still maintain 60FPS


2560x1440 D3D > 1920x1080 DXR on an RTX 2070.  BFV manages those FPS numbers by hugely cutting down on the number of rays, which means there's really no point in using them.  Also, there are only a few million people that will ever play BFV on PC, and only a fraction of a fraction of those have an RTX card.



StefanM said:


> Interesting. Also check out: https://portal.eaeunion.org/sites/o...3&listid=d84d16d7-2cc9-4cff-a13b-530f96889dbc
> for some unannounced GPUs


PG?


----------



## _UV_ (Dec 26, 2018)

Well, they still have room for a 5GB and a 2GB later... I mean with DDR4 or even DDR3.


----------



## Candor (Dec 26, 2018)

I don't think this is enough variants. Please, let's add GTX 1160s to this.


----------



## StefanM (Dec 26, 2018)

FordGT90Concept said:


> PG?


Already used with previous generations.
You see these letters in photos of a bare circuit board.
I don't know the actual meaning though.


----------



## ArbitraryAffection (Dec 26, 2018)

Makes me even happier with the 8GB on the RX 570 I just picked up for £150 (plus 2 free AAA games). Yes, I know the 2060 will be faster, but AMD offering 8GB at this price point is insane value. Oh, and the card runs 1080p just fine.

I had to sell my ENTIRE PC because I got behind on the rent. In fact, the only thing new in my current PC is this graphics card. Bargain-basement PC. Check my PC specs if you wanna know, but 99% of it is either super cheap second-hand or freebies I got given by friends.


----------



## lexluthermiester (Dec 26, 2018)

ShurikN said:


> 3GB on a mid range gpu in 2019... I'm at a loss for words.


For 1080p screens (which most gamers are still using), 3GB is still reasonable.



INSTG8R said:


> I have a feeling RTRT will be all but useless at this level.


That will depend on the settings level.


----------



## Nxodus (Dec 26, 2018)

AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
NVIDIA = expensive

I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards


----------



## ArbitraryAffection (Dec 26, 2018)

lexluthermiester said:


> For 1080p screens (which most gamers are still using), 3GB is still reasonable


3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS though), and Far Cry 5 will use over 4GB at 1080p too. While most games could be played easily on 3GB with medium or less-than-ultra textures, it is completely lacking any form of future-proofing. A bad investment IMO.



Nxodus said:


> AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
> NVIDIA = expensive
> 
> I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards



Low quality troll is low quality.


----------



## Mescalamba (Dec 26, 2018)

M2B said:


> You can easily run almost every single game on a 3GB card at 1080p just fine, but you're gonna lower the texture quality in some games and that's all. the 3GB version is not ideal but is not as usless as some people think, it actually might be a good value card for those needing a fairly powerful GPU and don't care about the maximum quality textures or VRAM related settings.
> if the 3GB RTX 2060 is going to be priced under 200$, I don't see a problem but anything higher than 200$ is unacceptable.
> 8GB VRAM on a 800$ card (RTX 2080) is more disappointing than 3GB on a budget card to me.



Some games will simply NOT run at 3GB. There aren't that many of them, but that said, 6GB is the bare minimum. I wouldn't consider anything with less than 8GB an upgrade at this point.


----------



## Recus (Dec 26, 2018)

First AMD fanboys blaming Nvidia for holding AIBs on short leash. Then Nvidia let AIBs run loose... Still Nvidia's fault.


----------



## Nxodus (Dec 26, 2018)

ArbitraryAffection said:


> Low quality troll is low quality.



I'm not trolling, but the NVIDIA hate these days is slowly getting to me.

Just look at the forums: AMD sub-forum has 387 pages while NVIDIA has 230. AMD cards are shit-tier. Whenever a new game gets released the internet forums are full of AMD owners crying.


----------



## ArbitraryAffection (Dec 26, 2018)

Nxodus said:


> I'm not trolling, but the NVIDIA hate these days is slowly getting to me.
> 
> Just look at the forums: AMD sub-forum has 387 pages while NVIDIA has 230. AMD cards are shit-tier. Whenever a new game gets released the forums are full of AMD owners crying.


Sure thing hon.


----------



## Rowsol (Dec 26, 2018)

This is really stupid.  Just test each config and pick the one that makes the most sense.  I don't see them doing this crap with the more expensive models so why this one?


----------



## lexluthermiester (Dec 26, 2018)

ArbitraryAffection said:


> 3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS though), and Far Cry 5 will use over 4GB at 1080p too.


That assumes full AA, which most people turn down or off, which naturally reduces the memory footprint, even with HD texturing.



Nxodus said:


> AMD is the retarded child that you have to hide in the attic when guests come over to visit.


Please take your fanboy trolling elsewhere, no one here cares about such silly nonsense..


----------



## ArbitraryAffection (Dec 26, 2018)

lexluthermiester said:


> That assumes full AA, which most people turn down or off, which naturally reduces the memory footprint, even with HD texturing.


True, but with the 2060 approaching 1070 performance, surely you'd expect people to want to play with settings maxed? I always thought the xx60 series was about max-settings FHD gaming.


----------



## Paganstomp (Dec 26, 2018)

I'm going for the one that has the fastest changing RGB lighting on it.


----------



## lexluthermiester (Dec 26, 2018)

ArbitraryAffection said:


> True, but with the 2060 approaching 1070 performance, *surely you'd expect people to want to play with settings maxed?* I always thought the xx60 series was about max-settings FHD gaming.


Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate VS AA.


----------



## ArbitraryAffection (Dec 26, 2018)

lexluthermiester said:


> Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate VS AA.


Fair enough. I suppose it depends on the type of AA, too. I'm actually running TAA enabled in Fallout 4 and the 570 manages it at 60fps, but I am also turning down other things like shadows. Warframe also uses TAA, but that's really light and I'm at 120FPS more often than not at FHD. Hilariously, Fallout 76 uses over 7GB of VRAM at 1080p. Don't even ask why, the textures don't look anywhere near good enough haha. I think it's caching into RAM, or Bethesda just copy-pasted their Fallout 4 hi-res textures to it (really badly optimised). I would still prefer more VRAM though. I feel like 8GB on my 570 was a real bargain even though I won't be able to use it all before I run out of GPU power, but at least I will have zero VRAM issues.


----------



## efikkan (Dec 26, 2018)

FordGT90Concept said:


> You can get an 8 GiB RX 580 for $220.  A 3 GiB card these days shouldn't be going for much more than $120 (the realm of budget cards).


I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.



ArbitraryAffection said:


> Makes me even more happy with the 8GB on the RX 570 I just picked up for £150. (plus 2x free AAA games). Yes I know 2060 will be faster but AMD offering 8GB at this price point is insane value. Oh and the card runs 1080p just fine.


RTX 2060 will perform far beyond RX 570, a low-end card. Brag all you want about 8GB, there is no way that card needs it.



ArbitraryAffection said:


> 3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS though), and Far Cry 5 will use over 4GB at 1080p too.


Memory usage doesn't mean memory requirement; many applications allocate more memory than they need. What matters is performance, or lack thereof. Stuttering might be an indicator of too little memory.


----------



## 27MaD (Dec 26, 2018)

Blueberries said:


> People had a hard time with 3GB vs 6GB...


And now there are 6 different memory sizes and types.


----------



## kings (Dec 26, 2018)

A 3GB card in 2019 for $300, incoming!

I hope AMD wakes up fast! Maybe they will be able to bring some serious competition at 7 nm, at least as long as Nvidia continues with 12 nm/16 nm.


----------



## oxidized (Dec 26, 2018)

6 variants of the 2060, rofl, this is totally insane. I really hope this isn't true, but seeing something similar happen with the 1060 doesn't give me much hope. They should've upgraded the 2060 to 8GB and maybe made a 4GB version, both GDDR6 of course, or if the price was right, the 4GB variant could've had GDDR5X just to cut the costs.


----------



## FordGT90Concept (Dec 26, 2018)

Recus said:


> First AMD fanboys blaming Nvidia for holding AIBs on short leash. Then Nvidia let AIBs run loose... Still Nvidia's fault.


NVIDIA is the one shipping 6 variants of the silicon to AIBs.  AIBs install the GDDR chips that are compatible.  AIBs can only control which chips they get, not what is available.



efikkan said:


> I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.


DRAM is a large chunk of the cost of manufacturing a video card.  Additionally, having so little VRAM makes the card less valuable to gamers even at 1920x1080.  64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.  Couldn't care less about benchmarks.  More and more games not only do graphics on the GPU, but computation on the GPU as well.


----------



## efikkan (Dec 26, 2018)

FordGT90Concept said:


> 64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.


"64-bit" games have nothing to do with GPU memory usage. That makes no sense from a technical perspective.



FordGT90Concept said:


> Couldn't care less about benchmarks.


This really says it all, doesn't it?



FordGT90Concept said:


> More and more games not only do graphics on GPU, but computation on GPU as well.


They do, but this is still irrelevant.


----------



## Durvelle27 (Dec 26, 2018)

lexluthermiester said:


> Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate VS AA.


I have a RTX 2070 and play everything on max settings with AA on triple monitors


----------



## lexluthermiester (Dec 26, 2018)

Durvelle27 said:


> I have a RTX 2070 and play everything on max settings with AA on triple monitors


And your framerates are?


----------



## bug (Dec 26, 2018)

Write your congressman, limit the number of designs to one! Because choice is to be avoided! 

Wth guys, this is just different memory capacity and type. I'd be more curious if this means different memory bus widths and, like 1060 before this, different internal configurations.


----------



## Durvelle27 (Dec 26, 2018)

lexluthermiester said:


> And your framerates are?


Depends on the game.

I actually did a full mini review showing its performance. You can find it under my profile content.

It actually holds its own very well.


----------



## Vya Domus (Dec 26, 2018)

FordGT90Concept said:


> 64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.



Double-precision is rarely, if ever, used for shading in games. That is, unless you meant something else? Also, just because your application is compiled for 64-bit machines doesn't mean there is a direct correlation between that and memory usage. You can have a program that uses a smidgen of 64-bit data types and yet it may use just a couple of megabytes, or several gigabytes. It's more about quantity, not just the data types that are used.



Recus said:


> First AMD fanboys blaming Nvidia for holding AIBs on short leash. Then Nvidia let AIBs run loose... Still Nvidia's fault.



You are completely out of touch with this subject.



Nxodus said:


> AMD is pure shit



Nah, that'd be your comment.


----------



## CandymanGR (Dec 26, 2018)

What? No 5GB version? 3-4-6... where is the 5? I guess they missed that one.

Nvidia has fucked it up, and no matter what fanboys tell you, RTX was a fiasco.
And people who claim 3GB is enough for 1080p gaming are twisting the truth. That is true only for esports games, which are designed to be lightweight. You CANNOT play a lot of modern AAA games because they demand more than 3GB of VRAM EVEN at 1080p/medium/high settings (GTA V, Far Cry 4, Shadow of Mordor, etc). And this will happen more often as newer games come out. So let's cut the crap, shall we?

P.S. And before you criticize me as an AMD fanboy, I will tell you that of the more than 20 GPUs I've had, only two were AMD.


----------



## Nxodus (Dec 26, 2018)

Vya Domus said:


> You are completely out of touch with the subject.
> 
> 
> 
> Nah, that'd be your comment.



quality argument. 

-Help! my game is crashing
-What's your setup?
-cheap AMD with 9 billion GDDR 
-Ok please wait 2 weeks until devs fix the game so it runs properly on your crapstick


----------



## lexluthermiester (Dec 26, 2018)

CandymanGR said:


> Nvidia has fucked it up, and not matter what fanboys are telling you, the RTX was a fiasco.


No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.


----------



## Vya Domus (Dec 26, 2018)

Your threadcrapping is getting annoying.


----------



## CandymanGR (Dec 26, 2018)

lexluthermiester said:


> No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.



I don't trust a single word from people who say that what they bought is the best. And also, 3GB is NOT enough. You are biased as hell.


----------



## Vya Domus (Dec 26, 2018)

The point isn't whether 3GB is enough or not. It's just rather pathetic to ship a card that will likely be at the very least $250+ with just 3GB in 2019.


----------



## CandymanGR (Dec 26, 2018)

Vya Domus said:


> The point isn't whether 3GB is enough or not. It's just rather pathetic to ship a card that will likely be at the very least $250+ with just 3GB in 2019.



It is. But the practical side is also important. You cannot build a gaming GPU in 2019 with 3GB of VRAM, it's stupid. But Nvidia is milking the cow and doesn't care about anything else.


----------



## Nxodus (Dec 26, 2018)

Vya Domus said:


> Your threadcrapping is getting annoying.



No, I'd say the NVIDIA bashing is getting out of hand. Also, I really don't understand the AMD love. It's just absurd. NVIDIA needs a proper competitor; AMD is dragging NVIDIA down.


----------



## windwhirl (Dec 26, 2018)

Wow... I like how Nvidia doesn't rely on market research and prefers to have a field day, poking the sleeping dragon /s


----------



## Durvelle27 (Dec 26, 2018)

People are saying 3GB is enough; it is for some games, but that's not the case for all newer titles.

Plus let's look at it like this:

The RTX 2070 is meant to replace the GTX 1070, which is an 8GB GPU aimed at 1440p gaming with ultra details.

Would you really want to buy a 1070 replacement aimed at 1440p gaming with 3GB of VRAM?

Also, with the MSRPs of the current RTX lineup:

RTX 2080 Ti $999
RTX 2080 $699
RTX 2070 $499

Nvidia will likely price the 2060 accordingly, around $299-$399. Are you really willing to pay that amount for a 3GB GPU that will become VRAM-limited before it even utilizes its full potential?


----------



## Nxodus (Dec 26, 2018)

Durvelle27 said:


> around $299-$399, are you really willing to pay that amount for a 3GB GPU that will become VRAM-limited before it even utilizes its full potential



Can't you people see, NVIDIA is finally opening up to the cheapo market. The 3GB ones will be much cheaper than $300; why do you assume they will charge $300 for them? Everyone knows RAM is expensive; less RAM, lower price.


----------



## CandymanGR (Dec 26, 2018)

Durvelle27 said:


> People are saying 3GB is enough; it is for some games, but that's not the case for all newer titles.



I already mentioned that.
But you contradict yourself in the next part of your post: you say the 3GB WILL be a limiting factor (in the future), while in the first part you say it is not a limiting factor for all games (which is true for esports games, for example). But if someone is playing esports, he has NO reason to upgrade the GPU, since he can play just fine with the previous gen. We buy new GPUs to play the new AAA titles with as much performance as possible, not to play LoL at 500 fps.




Durvelle27 said:


> Nvidia will likely price the 2060 accordingly, around $299-$399. Are you really willing to pay that amount for a 3GB GPU that will become VRAM-limited before it even utilizes its full potential?



But... but... 3GB of VRAM is ALREADY a limiting factor. Of course the card will never reach its full potential, because it is ALREADY limited by VRAM. What the....?


----------



## B-Real (Dec 26, 2018)

And some people were laughing at the RX 590. TBH, the RX 590 looks like a better product compared to this complete mess.



lexluthermiester said:


> No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.


Of course you have to defend having been milked in a crazy way. Those who try to defend this piece of crap can't understand that our concern is not its performance. The BIGGEST PROBLEM is the smaller performance gain than the 700-to-900 series switch (that was around 40%, now it is 30%), and while the 900 series was $50-100 cheaper than their 700 series equivalents, the 20 series is $100-300 (in reality: $130-500) more expensive than the 10 series. Moreover, the other great problem is that its most advertised feature is present in only 1 game so far in the cards' 3.5-month lifespan, one of the most optimized games/engines on the market, yet it couldn't squeeze out a steady 60 fps at FHD with the RTX level above low on a $1100-1200 GPU. With the patch that lowered picture quality you can get this to a 77 fps average, and only 45-55 average in 4K. And this was achieved with the developer announcing in advance that they had to decrease the effect of RTX. Tomb Raider may hit the 30s at FHD with the 2080 Ti. The third game that was announced with RTX (FFXV) is now cancelled. This is the worst price-performance GPU family ever.



Nxodus said:


> AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
> NVIDIA = expensive
> 
> I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards



LOL. Butthurt guy.



Nxodus said:


> quality argument.
> 
> -Help! my game is crashing
> -What's your setup?
> ...



You know which company released a WHQL-certified driver that crashed an AAA title? It was NV with Watch Dogs 2. Laughing at you so loud right now.


----------



## FordGT90Concept (Dec 26, 2018)

efikkan said:


> They do, but this is still irrelevant.


Computation requires VRAM too.



Vya Domus said:


> Double-precision is rarely used, if ever for shading, in game. That is unless you meant something else ? Also, just because your application is compiled for 64-bit machines that doesn't mean there is direct correlation between that and memory usage. You can have a program that uses a smidget of 64-bit data types and yet it may use just a couple of megabytes or several gigabytes. It's more about quantity, not just data types that are used.


Virtual memory address space.


----------



## TheGuruStud (Dec 26, 2018)

I see a paid shill showed up in short order to protect nvidia. Cmon, nvidia. You gotta hire Clinton levels of shills to have a chance (note: not a political remark, literally a reference in scale).


----------



## Aquinus (Dec 26, 2018)

I guess nVidia's stock hasn't hit rock bottom yet.


----------



## ArbitraryAffection (Dec 26, 2018)

efikkan said:


> I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.
> 
> 
> RTX 2060 will perform far beyond RX 570, a low-end card. Brag all you want about 8GB, there is no way that card needs it.
> ...


Wow, I'm reading a lot of hostility in your post. There's no need for that. And actually, there are situations where the 570 could use over 4GB, so the 8GB can make sense. Especially if I want to run high-resolution textures, something a 3GB 2060 will not be able to do, and hilariously it may even perform worse than the 570 in that situation.

The _fact_ is, AMD is offering more value in the mid-range. And the 570 isn't low-end, it's lower-mid-range. But feel free to defend getting less hardware for _more _money. That seems to be the whole idea at Intel/Nvidia these days.


----------



## Vya Domus (Dec 26, 2018)

FordGT90Concept said:


> Virtual memory address space.



That's on the OS side of things and again, doesn't have anything to do with how much VRAM an application will use.


----------



## EarthDog (Dec 26, 2018)

Wow... just wow. Can't say I like this many variants... holy cow.



lexluthermiester said:


> Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate VS AA.


I'd have to imagine you'd be in a minority saying AA isn't needed at 1080p. Even in fps games I notice AA being off at that res. If I wanted to play Minecraft, I would.

4K is the only place I can disable it and still manage not to notice.

Now if it's an fps issue, of course, shut it down... but it's to the detriment of IQ big time at 1080p.


----------



## efikkan (Dec 26, 2018)

ArbitraryAffection said:


> And actually, there are situations where the 570 could use over 4GB, so the 8GB can make sense. Especially if I want to run high-resolution textures, something a 3GB 2060 will not be able to do, and hilariously it may even perform worse than the 570 in that situation.


Please note that I did not say there are no use cases for having more memory, but just because _some_ may need it doesn't mean everyone needs it. For many, the cheaper cards offer much more value. It's no accident that both AMD and Nvidia offer their RX 480/580 and GTX 1060 in low and high memory configurations.

And as I've mentioned, the fact that a game allocates more memory doesn't mean it _needs_ it. To evaluate that, you need to evaluate reductions in performance and/or rendering quality.



ArbitraryAffection said:


> The fact is, AMD is offering more value in the mid-range. And 570 isn't low-end, it's lower-mid-range.  But feel free to defend getting less hardware for more money. Seems basically the whole idea of Intel / Nvidia these days.


If you keep expanding it, it's no longer the mid-range. 
In the mid-range, GTX 1060 3GB/6GB have been the better choice over RX 480/580 4GB/8GB. The only slice in there where AMD have no direct competition from Nvidia is the new RX 590, but the only argument for this one is if you can't afford GTX 1070. RX 590 is still a little odd, considering how close to GTX 1060 and RX 580 it really is. In general, AMD is hardly competitive in the mid-range, and except for possibly the RX 590, they can't claim to offer better value. This is only going to get more challenging as RTX 2060 arrives.
In the low-end, AMD have more compelling options, like the RX 570 vs. GTX 1050 Ti.


----------



## 95Viper (Dec 26, 2018)

Keep it on topic.
Stop the name calling.
Keep it civil and constructive.

Thank You and have a nice day.


----------



## eidairaman1 (Dec 26, 2018)

INSTG8R said:


> I have a feeling RTRT will be all but useless at this level.



Good ol' NV adding to the confusion: first the 1060, now this.

If anything, skip the 3 GB GDDR5 crap and make the 2050 Ti 4 or 6 GB with GDDR6 only...


----------



## FordGT90Concept (Dec 26, 2018)

Vya Domus said:


> That's on the OS side of things and again, doesn't have anything to do with how much VRAM an application will use.


It can't directly address anything outside of a 32-bit virtual address space (which includes system RAM and VRAM).  Like system RAM, VRAM can use something akin to Physical Address Extension to get around that, but performance is terrible, so it's not generally viable for games.

Point is, the reason why 1, 2, or 3.5 GiB was okay for so long is because games were developed for the Xbox 360 and PlayStation 3, which only had 512 MiB of memory.  Everything was kept 32-bit because AAA games had to fit in such a tiny footprint on consoles anyway.  With the transition to the Xbox One and PlayStation 4 and their 8 GiB of memory, 64-bit games are everywhere, and with them a "smoke it if you got it" approach to memory usage.  That's why the GTX 970 has aged poorly and so many in this thread scoff at 3 GiB variants of graphics cards.  3 GiB was more than enough several years ago.  It isn't anymore because the 32-bit/512 MiB veil has finally been truly lifted.
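The arithmetic behind the 32-bit ceiling and the 512 MiB console era is easy to check:

```python
# Back-of-the-envelope check: a 32-bit pointer can distinguish 2^32 byte
# addresses, i.e. 4 GiB, which has to cover RAM + VRAM mappings combined.
GiB = 1024 ** 3
MiB = 1024 ** 2

addressable_32bit = 2 ** 32
print(addressable_32bit // GiB)  # 4 (GiB)

# Xbox 360 / PS3 era: the whole game had to fit in 512 MiB,
# so a 32-bit address space was never the bottleneck.
console_memory = 512 * MiB
print(addressable_32bit // console_memory)  # 8 (times headroom)
```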


----------



## eidairaman1 (Dec 26, 2018)

FordGT90Concept said:


> It can't directly address anything outside of a 32-bit virtual address space (which includes system RAM and VRAM).  Like system RAM, VRAM can use something akin to Physical Address Extension to get around that, but performance is terrible, so it's not generally viable for games.
> 
> Point is, the reason why 1, 2, or 3.5 GiB was okay for so long is because games were developed for the Xbox 360 and PlayStation 3, which only had 512 MiB of memory.  Everything was kept 32-bit because AAA games had to fit in such a tiny footprint on consoles anyway.  With the transition to the Xbox One and PlayStation 4 and their 8 GiB of memory, 64-bit games are everywhere, and with them a "smoke it if you got it" approach to memory usage.  That's why the GTX 970 has aged poorly and so many in this thread scoff at 3 GiB variants of graphics cards.  3 GiB was more than enough several years ago.  It isn't anymore because the 32-bit/512 MiB veil has finally been truly lifted.



Even the 3GB 7970/280s are getting that way, 6GB variants are doing better.


----------



## efikkan (Dec 26, 2018)

FordGT90Concept said:


> It can't directly address anything outside of a 32-bit virtual address space (which includes system RAM and VRAM).  Like system RAM, VRAM can use something akin to Physical Address Extension to get around that, but performance is terrible, so it's not generally viable for games.
> 
> Point is, the reason why 1, 2, or 3.5 GiB was okay for so long is because games were developed for the Xbox 360 and PlayStation 3, which only had 512 MiB of memory.  Everything was kept 32-bit because AAA games had to fit in such a tiny footprint on consoles anyway.  With the transition to the Xbox One and PlayStation 4 and their 8 GiB of memory, 64-bit games are everywhere, and with them a "smoke it if you got it" approach to memory usage.  That's why the GTX 970 has aged poorly and so many in this thread scoff at 3 GiB variants of graphics cards.  3 GiB was more than enough several years ago.  It isn't anymore because the 32-bit/512 MiB veil has finally been truly lifted.


Please don't mix up register width with memory capacity; one has nothing to do with the other, and I cringe every time someone conflates them.

Also be aware that GPU memory is a separate address space controlled by the GPU. In theory, there is nothing preventing you from having a GPU with >4GB memory on a 32-bit system.


----------



## FordGT90Concept (Dec 26, 2018)

efikkan said:


> Please don't mix register width bits with memory capacity, it has nothing to do with it, and I cringe every time someone mixes these up.


I didn't.



efikkan said:


> Also be aware that GPU memory is a separate address space controlled by the GPU.


Graphics driver.  It could be a 32-bit (WoW64) driver, which is why the 4 GiB limitations may still apply.



efikkan said:


> In theory, there is nothing preventing you from having a GPU with >4GB memory on a 32-bit system.


Just like you could have 64 GiB in a 32-bit system but it will only be able to address 4 GiB of it.  Moot argument.


I'm trying to get a definitive answer from engineers on this question because it's not something that's really been addressed in relation to D3D10 and newer.


----------



## Casecutter (Dec 26, 2018)

I'll ask the simple question(s)... How does a chip built with a GDDR6 memory controller run GDDR5?  Okay, sure, it's a combined GDDR5/GDDR6 memory controller and they fuse off the bits not needed in post, but wouldn't that make the die bigger, with parts wasted?  
Why do I have that GTX 970 4 GB taste back all of a sudden?


----------



## efikkan (Dec 26, 2018)

FordGT90Concept said:


> Just like you could have 64 GiB in a 32-bit system but it will only be able to address 4 GiB of it.  Moot argument.


That's wrong in two ways. Firstly, a 32-bit CPU can have a larger-than-32-bit physical address space, as some Intel CPUs did. Similarly, the Intel 8086 (16-bit) had a 20-bit address bus, and the 80286 (16-bit) had a 24-bit address bus. Secondly, the GPU's internal addressing is not controlled from the CPU side, nor is GPU memory mirrored in system memory. You can have more GPU memory than system memory and have no trouble filling it up; whether the CPU and OS are 32-bit has nothing to do with this.

To end this: 32-/64-bit has nothing to do with the graphics memory usage of games, so please keep the two separate.
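The address-bus arithmetic behind those historical examples is easy to sanity-check (the helper function is just for illustration):

```python
# The physical memory ceiling follows the address-bus width, not the
# CPU's register width (bus widths per the historical examples above).
def addressable_bytes(bus_bits):
    return 2 ** bus_bits

MiB = 1024 ** 2
GiB = 1024 ** 3
print(addressable_bytes(20) // MiB)  # 1   -> 8086: 16-bit CPU, 20-bit bus, 1 MiB
print(addressable_bytes(24) // MiB)  # 16  -> 80286: 16-bit CPU, 24-bit bus, 16 MiB
print(addressable_bytes(36) // GiB)  # 64  -> PAE on 32-bit x86: 36-bit, 64 GiB
```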


----------



## Supercrit (Dec 26, 2018)

Imagine each memory variant also has different shader numbers to spice it up. Maximum consumer confusion.


----------



## Aquinus (Dec 26, 2018)

efikkan said:


> nor is GPU memory mirrored in system memory.


Now that definitely depends on the graphics API and the application. GPU memory *might* be mirrored depending on what the developer is trying to do.


efikkan said:


> You can have more GPU memory than system memory and have no troubles filling it up, and if the CPU and OS is 32-bit has nothing to do with this.


GPU address space still needs to get mapped like any other peripheral. Your statement is flat out incorrect. Also, don't go using PAE as a response. We all know how rickety that barge is.


----------



## GoldenX (Dec 26, 2018)

bug said:


> Write your congressman, limit the number of designs to one! Because choice is to be avoided!
> 
> Wth guys, this is just different memory capacity and type. I'd be more curious if this means different memory bus widths and, like 1060 before this, different internal configurations.


The fact that there is a 4 GB variant is a total giveaway.


----------



## Prince Valiant (Dec 26, 2018)

EarthDog said:


> Wow... just wow. Can't say I like this many variants... holy cow.
> 
> I'd have to imagine you'd be in a minority saying AA isn't needed at 1080p. Even in FPS games I notice when AA is off at that res. If I wanted to play Minecraft, I would.
> 
> ...


The number of variants is nuts. This must be how they plan on clearing out their GDDR5X stockpile .


----------



## unikin (Dec 26, 2018)

This is getting ridiculous, a 3 GB mid-range GPU in 2019? I have an RX 580 4 GB in my 2nd build and have problems playing AC Odyssey with high-quality textures turned on. I hope AMD doesn't catch NGreedia's flu. The GTX 1070 Ti/1080 Ti were the last NVIDIA GPUs worth buying in 2018, and it looks like things ain't gonna change for the better in 2019.


----------



## ArbitraryAffection (Dec 26, 2018)

efikkan said:


> In the mid-range, GTX 1060 3GB/6GB have been the better choice over RX 480/580 4GB/8GB.


Would you honestly recommend a 1060 3GB at all? I mean, seriously? The 6GB, yes, I get that. But the 3GB? No. The card is a joke.


----------



## bug (Dec 26, 2018)

TheGuruStud said:


> I see a paid shill showed up in short order to protect nvidia. Cmon, nvidia. You gotta hire Clinton levels of shills to have a chance (note: not a political remark, literally a reference in scale).


Why would Nvidia need protection in this case?
You don't want a 3GB mid range card, you don't buy one. That's how it was before, that's how it still is.


----------



## dirtyferret (Dec 26, 2018)

Great news!  I was afraid the RTX 2060 would turn out to be like the half dozen different variants of the GTX 1060 and confuse people....



bug said:


> You don't want a 3GB mid range card, you don't buy one. That's how it was before, that's how it still is.



So let it be written, so let it be done


----------



## INSTG8R (Dec 26, 2018)

lexluthermiester said:


> That will depend on the settings level


720p on Low?


----------



## Charcharo (Dec 26, 2018)

efikkan said:


> Please note that I did not say there is no use cases for having more memory, but just because _some_ may need it, doesn't mean everyone needs it. For many, the cheaper cards offer much more value. It's no accident that both AMD and Nvidia offer their RX 480/580 and GTX 1060 in low and high memory configurations respectively.
> 
> And as I've mentioned, the fact that a game allocates more memory doesn't mean it _needs_ it. To evaluate that, you need to evaluate reductions in performance and/or rendering quality.
> 
> ...



Video games these days definitely DO need more than 3GB. Even 4GB is not enough; I am an ex-Fury owner, trust me. 

We know the tired argument about allocation versus actually needing it, but games really do need it at this point in time. The "GPU too weak" argument is also moot: the 7970 is weaker and still sees improvement going from the 3GB to the 6GB model. Besides, textures are one of the cheapest and best ways to improve immersion, provided one has the VRAM. I'd rather lower lighting on a weaker GPU than textures. Genuinely using more VRAM won't tank performance in the vast majority of cases. 

To add more to the argument - mods. We are talking about PC gaming GPUs, so modifications matter. Many mods improve graphics and textures but are VRAM heavy, so for a lot of people, the VRAM size is important due to that as well. 

3GB on an RTX 2060, a GPU likely above the 1060, maybe somewhere between a Fury and a 1070 in performance? Yeah, it is bad. Even 4GB is bad there. Even 6GB is not great. And there are games and mods that can crush 8GB of VRAM at 1440p, so...

I will ignore the part about the RX 570 and 580 not being competitive since it's likely bait.
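To put the texture/VRAM point in rough numbers, here's a back-of-the-envelope footprint estimate (uncompressed RGBA with the usual ~1/3 extra for mipmaps; purely illustrative — real games use block compression, which shrinks this a lot):

```python
# Rough per-texture VRAM footprint: width * height * bytes per texel,
# plus ~1/3 extra for the mipmap chain. Uncompressed RGBA assumed.
def texture_mb(width, height, bytes_per_texel=4, mipmaps=True):
    base = width * height * bytes_per_texel
    total = base * 4 // 3 if mipmaps else base
    return total / (1024 ** 2)

print(round(texture_mb(2048, 2048), 1))  # 21.3 MB for one 2K texture
print(round(texture_mb(4096, 4096), 1))  # 85.3 MB for one 4K texture
```

A texture mod that swaps in a few dozen 4K textures eats VRAM fast, which is why modders care so much about the size.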


----------



## bug (Dec 26, 2018)

Charcharo said:


> Video games these days definitely DO need more than 3GB. Even 4GB is not enough, I am an ex-Fury owner, trust me.



It depends on what you play, really. Just look at the list of games TPU has reviewed, most of them will run fine with 3GB of VRAM. Sure, that raises the question of you needing an RTX 2060 if you play mainly those games, but that's another discussion.

And yes, I would prefer mid-range cards to start at 4GB, but until we see prices and internal configurations, we really don't know what's going on here. We're just foaming at the sight of Nvidia in the title.


----------



## efikkan (Dec 26, 2018)

ArbitraryAffection said:


> Would you honestly recommend a 1060 3GB, at all?  I mean seriously? the 6GB, yes, I get that. But the 3GB, no. The card is a joke.


Weren't you the one talking about hostility a few posts back? 

GTX 1060 3GB is certainly no joke, and no more a joke than e.g. RX 580 4GB. You are too fixated on specifications, specifications which you may not fully understand. Pascal and Turing have more advanced memory compression than Polaris, and the GPUs allocate memory differently. Comparing across architectures is not that simple, that's why I keep reminding people to look at benchmarks matching their use case. I even have a 3GB version in one of my PCs, and I've put them into several other builds as well.

And you keep missing the point; is more better? Yes, certainly! But what if you don't need more? There is a good price difference between the GTX 1060 3GB and 6GB, and between the RX 580 4GB and 8GB. A gamer on a very tight budget at 1080p who wants a higher framerate rather than higher details would certainly appreciate having the option.

My issue with the various versions of the GTX 1060 is the naming; I would have called the 3GB version GTX 1060 and the 6GB version GTX 1065 or GTX 1060 Ti, etc.


----------



## Charcharo (Dec 26, 2018)

bug said:


> It depends on what you play, really. Just look at the list of games TPU has reviewed, most of them will run fine with 3GB of VRAM. Sure, that raises the question of you needing an RTX 2060 if you play mainly those games, but that's another discussion.
> 
> And yes, I would prefer mid range cards to start at 4GB, but until we see prices and internal configuration, we really don't know what's going on here. We're just foaming at the sight on Nvidia in the title.



This is silly. People do not go "Well, I have X GPU, so I won't play games A through C." Almost all modern GPUs can achieve decent performance in all games, without exception. If you accept console settings, a 1050 Ti will do 50-60 fps at 1080p. If you accept 30 fps, the RX 570 can do high settings at 4K, or Ultra at 1080p. People with RX 560s won't just say "Well, I won't play DOOM at all!"; they will lower a few settings and have a great experience yet again.

And for what it's worth, VRAM choking takes time to truly become an issue. I can tell you games like DOOM can and do choke my old Fury at 1440p due to demand for more memory, despite what reviews show. They likely test scenes for shorter amounts of time, don't force Nightmare settings, and/or are in parts of the game that are heavy, but not on memory. 

And the modding thing is always a factor. Especially since it is popular across the entire PC Gaming community. From people with 2080 Tis to people with GT 1030s.


----------



## bug (Dec 26, 2018)

ArbitraryAffection said:


> Would you honestly recommend a 1060 3GB, at all?  I mean seriously? the 6GB, yes, I get that. But the 3GB, no. The card is a joke.


The 3GB version was $40 less for like 6% less performance at launch. I'm not sure how things have changed since then.


----------



## EarthDog (Dec 26, 2018)

Charcharo said:


> Almost all modern GPUs can achieve decent performance in all games, without exceptions.


They don't? Aren't there minimum and recommended requirements on games for attaining the optimal gaming experience? 

The only people who 'accept' console settings are those who have to. You can lower settings in-game and make it worse than a console. You can crank settings up and get 30 FPS, which on a PC, in most genres, is NOT an acceptable average (and the 1% lows would be abhorrent). So it's not silly to say "I have XXX GPU, can I play this game well?" Then again, not all PC users prefer, or can afford, a PC that will look better than a console, even though half the point of a PC is that in many cases it looks better than one. The lines have blurred over time, but there is plenty of reason to look at a GPU and ask whether it can play a game.


----------



## Charcharo (Dec 26, 2018)

EarthDog said:


> They don't? There isn't minimum and recommended requirements on games to attain the optimal gaming experience?
> 
> The only people that 'accept' console settings are those who have to. You can lower settings in game and make it worse than a console. You can crank settings up and get 30 FPS and on a PC in most genre's, is NOT an acceptable average FPS (and 1% would be abhorrent).  So... that is not silly to say I have a XXX GPU, can I play this game well? But then again not all PC users prefer or can afford a PC that will look better than a console. half the point of a PC is because in many cases, it looks better than a console. The lines have blurred over time, but... there is plenty of reason to look at a GPU and see if I can play a game.




Do not look at minimum and recommended system requirements. They are made up and make no sense at all. I do not even look at them these days and have not for over a decade. 

The difference between Medium and Ultra settings is overrated most of the time. Plus, most of the time, Ultra is not even true Ultra these days. Witcher 3's settings menu does not compare to what even a pleb like me can do in the ini file in 2 minutes without any real modding. That is real Ultra settings.

If you can afford an RX 560/ GTX 1050 then you are already quite a bit above the PS4 and Xbox One basic. PS4 Pro is about matched by 1050 Ti and Xbox One X always loses in games vs the GTX 1060 6 GB. 

I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.


----------



## EarthDog (Dec 26, 2018)

Charcharo said:


> I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.


Me neither. In this case you are setting up a second standard (yours) and dismissing what the majority follows. So......... there is that. 

I didn't say there is a huge difference between settings, but it's certainly noticeable (which will depend on the title, of course). Again, people don't PC game intending to turn settings down (unless they have to... $). What you as an enthusiast do to manipulate game files isn't what I would call 'standard' either. So, to keep to the one-standard, zero-hypocrisy line of thinking... people don't generally do that. 

Again, those requirements are there for a reason. You choosing not to follow them is your choice, however they are there to attempt to set expectations on game play with hardware.


----------



## efikkan (Dec 26, 2018)

Charcharo said:


> If you can afford an RX 560/ GTX 1050 then you are already quite a bit above the PS4 and Xbox One basic. PS4 Pro is about matched by 1050 Ti and Xbox One X always loses in games vs the GTX 1060 6 GB.


Don't forget that many budget gamers upgrade old computers with a new cheap graphics card, just saying…


----------



## dirtyferret (Dec 26, 2018)

Can the RTX 2060 play Crysis?


----------



## Charcharo (Dec 26, 2018)

EarthDog said:


> Me either. In this case you are setting up a second standard (yours) and dismissing what the majority follows. So......... there is that.
> 
> I didn't say there is a huge difference between settings, but certainly noticeable (which will depend on the title of course). Again, people don't PC game to need to turn settings down (unless they have to...$). What you as an enthusiast does to manipulate games files isn't what I would call 'standard' either. So to keep on the one standard and zero hypocrisy line of thinking.... people don't generally do that.
> 
> Again, those requirements are there for a reason. You choosing not to follow them is your choice, however they are there to attempt to set expectations on game play with hardware.




The majority buys AAA games like Origins, Odyssey, Wolfenstein 2, DOOM, Witcher 3 and others just fine...

What I am doing as an enthusiast is enabling what Ultra used to mean 15-20 years ago.  Nothing more, just a return to how things used to be. I do not think people have to do that since I think the "Ultra or nothing" mentality in PC Gaming is insane.

Those requirements do not reflect reality, so the reason they are there is not objectively correct, and has not been for the past decade. Show me at least 2-3 times in the last 5 years when they meant something and I will give you some props. As it is, they are often just lies.


----------



## FordGT90Concept (Dec 26, 2018)

dirtyferret said:


> Can the RTX 2060 play crysis?


Yes.  It's not a very demanding game by today's standards.


----------



## EarthDog (Dec 26, 2018)

Charcharo said:


> The majority buys AAA games like Origins, Odyssey, Wolfenstein 2, DOOM, Witcher 3 and others just fine...
> 
> What I am doing as an enthusiast is enabling what Ultra used to mean 15-20 years ago.  Nothing more, just a return to how things used to be. I do not think people have to do that since I think the "Ultra or nothing" mentality in PC Gaming is insane.
> 
> Those requirements do not reflect reality so the reason they are there is not objectively correct and has not been in the past decade. Show me at least 2-3 times when it meant something from the last 5 years and I will give you some props. As it is, it is often just lies.


There are AAA titles that can bring a GPU down...

Good for you! I am glad you dig down and edit ini files. Just saying most people don't.

How about you show me where it *doesn't* give a general reference point to start from (the minimum)? I'm not the one pushing the rock uphill and in need of support for my assertions. The recs are a GUIDELINE, not a rule... but you are just saying they are lies, LOL.


----------



## Charcharo (Dec 26, 2018)

EarthDog said:


> There are plenty of AAA titles that can bring a GPU to its knees.
> 
> Good for you! I am glad you dig down and edit ini files. Just saying most people don't.
> 
> How about you show me where it *doesn't* give a general reference point to start from (minimum). I'm not the one pushing the rock uphill and in need of support for my assertions.  The recs are a GUIDELINE, not a rule.



And yet they sell by the millions and average gamers play them. Hell my poor countrymen play these games on their low-end PCs without issues...

I know most people don't dig through the ini files, but as long as you agree that true Ultra is there, tis fine. We agree.

If you define things super generally, I guess you can stretch everything enough to win. My old ATI 5770 finished Witcher 3 at low settings, locked 30 fps, 900p just fine. It was under the minimum requirements, had a locked fps, and was not even using the lowest possible settings (the resolution wasn't as low as it would go). The i5 750 didn't bottleneck it. So obviously this page is a lie to me:
https://www.systemrequirementslab.com/cyri/requirements/the-witcher-3-wild-hunt/12446

Read the OpenGL part here:
https://www.hardocp.com/article/2014/05/21/wolfenstein_new_order_performance_review

Hell, for a recent example see this:

[embedded video]
R9 380 is much slower than R9 290... and yet it runs well. We can do this for almost all games ultimately, but what matters is that the requirements are made up. I can give better requirements than the developers and that is sad.


----------



## EarthDog (Dec 26, 2018)

Charcharo said:


> And yet they sell by the millions and average gamers play them. Hell my poor countrymen play these games on their low-end PCs without issues...


Because they meet the minimum requirements, or chances are the gaming experience is poor (not as the dev intended).



Charcharo said:


> I know most people don't dig through the ini files, but as long as you agree that true Ultra is there, tis fine. We agree.


With respect, I honestly dont care. This has nothing to do with anything here really.

Your examples seem like exactly what I have explained, do they not? I also don't believe there is a standard for minimums/recs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here... not less. I wouldn't call having to lower the resolution to hit 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.


----------



## Charcharo (Dec 26, 2018)

EarthDog said:


> Because they meet the minimum requirements, or chances are the gaming experience is poor (not as the dev intended).
> 
> With respect, I honestly dont care. This has nothing to do with anything here really.
> 
> ...



I am actually shocked to see someone talking about system requirements in late 2018. I thought people stopped looking at these years ago, and I would never have guessed enthusiasts use them. No offence, this is just a shock to me, as I seriously have not seen this happen in a very long time (years). And no, ;p most don't meet the requirements.

I understand your argument about the ini file manipulation, but do know that when you say you run X game on Ultra I will say that you do not actually run it at Ultra.

So your idea of minimum is literally twice the minimum playable fps that the majority of gamers tolerate, at a resolution much higher than the minimum the application actually supports? Your idea of minimum is literally my idea of recommended requirements...

Words have meanings in languages. Minimum should mean minimum, not some standard above what the majority of console gamers (which is like half of gaming) can achieve.  And JayZ's R9 380 was doing a LOT better than that. An R9 290 can literally almost max the game out at 60+ fps at 1080p. What kind of minimum is that???

I would prefer actual rules with meaning over things I can toss aside and laugh at, with API lies (as proven by the new order) tossed in. Solid, dependable and correct rules with clear meanings and definitions. That is something I think all humans love.


----------



## EarthDog (Dec 26, 2018)

Charcharo said:


> I will say that you do not actually run it at Ultra.


Ultra is what the devs say it is through their preset. Anything else is adding on top. Just because you go to fuel cutoff and past redline.... 



Charcharo said:


> your idea of minimum is literally twice the minimum playable fps that the majority of gamers tolerate at a resolution that is much higher than the minimum supported by the actual application? Your idea of minimum is literally my idea of recommended requirements...


Maybe? I just know that, generally, many wouldn't consider 30 fps an enjoyable gaming experience on PC. I'm part of that group.


----------



## Charcharo (Dec 26, 2018)

EarthDog said:


> Ultra is what what the dev's say it is through their preset. Anything else is adding on top. Ultra is what they say it is. Just because you go to fuel cutoff and past redline....
> 
> Maybe? I just know that generally 30 fps many wouldn't consider an enjoyable gaming experience on PC.  I'm part of that group.




So Nightmare settings in DOOM do not exist? I mean, I am all for authorial intent, but there is an argument to be made for Death of the Author, especially when a decade ago things were more logical and Ultra really meant "as high as it would go before breaking". That makes sense. 

Many and majority are not synonymous. I don't consider 30 fps playable either, but beggars can't be choosers, and if it's good enough for some rich Americans on their consoles, it's good enough for me.


----------



## unikin (Dec 26, 2018)

C'mon guys, anything below 30 fps is a disaster, anything below 20 fps unplayable. We are talking about a MIDRANGE GPU in 2019 here, not low-end stuff like the 560/1050, with performance somewhere between the 1070 and 1070 Ti given the same number of CUDA cores as the 1070 and slightly better IPC. So a 1440p/60+ fps capable GPU. Pairing it with 3 GB of VRAM is a sin.


----------



## ASOT (Dec 26, 2018)

3 gigs or 3.5 gigs of VRAM, so much debate about it )))) For 1080p medium to high it's OK.

Let's hope and expect it will come at a competitive price for us. AMD has been rebranding with the crappy 590, and the Vega 56 and 64 are sadly a joke.


----------



## bug (Dec 26, 2018)

Charcharo said:


> Do not look at *minimum and recommended system requirements*. They are made up and make no sense at all. I do not even watch them these days and have not for over a decade.
> 
> The difference between Medium and Ultra settings... is overrated most of the time.  Plus, most of the time, *Ultra settings is not even true Ultra these days*. Witcher 3's settings menu does not compare to *what even a pleb like me can do in the ini file in 2 minutes* without any real modding. That is real Ultra settings  .
> 
> ...


You do, however, possess an uncanny ability to mix together all sorts of things barely related to the subject.

The simple truth is games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and who coded it and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I _have_ met people who delayed playing a game because they were planning an upgrade and wanted to make the best of it.


----------



## M2B (Dec 26, 2018)

EarthDog said:


> Because they meet the minimum requirements, or chances are the gaming experience is poor (not as the dev intended).
> 
> With respect, I honestly dont care. This has nothing to do with anything here really.
> 
> ...



They don't really think 30 FPS is unplayable; they think that if they say 30 FPS is unplayable, they'll look cool.
More than 120 million people in the world are currently using consoles and enjoying their games at 30 FPS.
Don't get me wrong, 30 FPS is not ideal, and it's far less enjoyable than 60+ FPS, but it's not unplayable by any means.
I'll proudly choose that "unplayable" 30 FPS Red Dead Redemption 2 over 95 percent of games at 60+ FPS.


----------



## EarthDog (Dec 26, 2018)

Charcharo said:


> Many and majority are not synonymous. I don't consider 30 fps playable either, but beggars cant be choosers and if its good enough for some rich Americans on their consoles,  its good enough for me.


Oh, I'd bet good money a majority wouldn't consider 30 fps playable in most genres/titles. RTS I can do... FPS... I'd cry and likely get a headache...

Playable and enjoyable I'm kind of using interchangeably. I mean... 15 is playable. The game plays... but as for the experience and it being enjoyable, the majority tend to agree 30 fps isn't for PCs.

But... this is all a bit OT. I'd love a thread to get to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).


----------



## lexluthermiester (Dec 26, 2018)

CandymanGR said:


> I dont trust a single word from people who say that what they bought is the best.


So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..


CandymanGR said:


> And also, 3gb are NOT enough.


Sure they are, when the settings are properly configured.


CandymanGR said:


> You are biased as hell.


The word you're looking for is experienced. I own a PC shop, and we build every kind of system for every budget, from bleeding-edge gaming to economy-minded gaming, from Ryzen or Intel to Radeon and GeForce. A 2060 3GB will be the bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.


B-Real said:


> Of course you have to defend that you have been milked in a crazy way.


Interesting perspective. Another is that I'm sharing actual experience and that it is positive despite the price increase.


B-Real said:


> Those who try to defend this piece of crap can't understand our concern is not its performance


While I haven't used a 2060 yet, I have built systems with 2080 Tis, 2080s (and own one), 2070s, Vega 64s, Vega 56s, RX 580s and so on. Each has their pros and cons. Calling one or the other "crap" is so baselessly non-objective as to effectively sound like drivel. The rest of your points only matter to people looking for reasons to whine and nitpick.


B-Real said:


> This is the worst price-performance GPU family ever.


That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of its GTX 10xx/Titan counterpart, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are those who can't afford one or have to wait a bit longer to save up for one. Everyone else is buying them up, which is why they are still consistently selling out, months after release.


----------



## moproblems99 (Dec 26, 2018)

Note: I didn't read the comments, only skimmed the article, and didn't see a reference to tensor cores. BUUUUTTTT, if this has any number of tensor cores, I want one. It will never see a 3D scenario, so I don't care how it performs in games.



EarthDog said:


> But....this is all a bit OT. I'd love a thread to get down to the bottom.of why 30 fps seems different on a console versus a PC (I know why movies can get away it).



Doesn't it have to do with the fact that they (movies) run at a constant frame rate?



lexluthermiester said:


> I own a PC show



Really, what channel and time slot?


----------



## lexluthermiester (Dec 26, 2018)

EarthDog said:


> I'd have to imagine youd be in a minority saying AA isnt needed at 1080p.


Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p. Pixel density at 1080p, 1440p and up mostly eliminates the need for AA: the "pixel laddering" effect isn't noticeable or pronounced like it was at lower resolutions, so AA simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.



moproblems99 said:


> Really, what channel and time slot?


LOL! Typo corrected...


----------



## moproblems99 (Dec 26, 2018)

lexluthermiester said:


> Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p.




I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.

EDIT:

For me, I don't turn anything down as long as I am over 60 fps (75 now, while I have this monitor). I can't stand no AA; it looks like crap. Although I still need to try rendering above my native resolution and downscaling, to see whether the visual-vs-performance tradeoff is better or worse.


----------



## EarthDog (Dec 26, 2018)

lexluthermiester said:


> Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p. Pixel density 1080p, 1440p and up mostly eliminates the need for AA as the "pixel laddering" effect of the past isn't noticeable or pronounced like it was in the past with lower resolutions and simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.
> 
> 
> LOL! typo corrected..


links plz... 

I recall that poll and walked away with a somewhat different takeaway.

Yeah... let's be clear here. Nobody said max AA... but you said "AA off" and that "it wasn't needed at 1080p and higher". We concluded from that poll that an overwhelming majority used AA, be it maxed or somewhere in between. A single-digit percentage turned it off, while 47% said it depends on game settings, which could be on or off.

But the reality is that most use it when they can.


----------



## lexluthermiester (Dec 26, 2018)

moproblems99 said:


> I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.


That would be a big assumption. I personally doubt it, but who knows..


moproblems99 said:


> Although, still need to try rendering above my native and downscaling to see if visual vs perf is better/worse.


Oh turn that on and leave it on. Then turn your AA down or off. You'll like the steady framerates much better.


EarthDog said:


> links plz...


https://store.steampowered.com/stats/
https://www.techpowerup.com/forums/...ings-for-running-games-and-benchmarks.250063/


----------



## EarthDog (Dec 26, 2018)

See edit above for your poll. 

I'm mobile and can't dig down on the Steam stats link... feeling saucy enough to post an image of it?


----------



## CandymanGR (Dec 26, 2018)

lexluthermiester said:


> So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..
> 
> Sure it is when the settings are properly configured.
> 
> The word your looking for is experienced. I own a PC shop and we build every kind of system for every budget from bleeding edge gaming to economy minded gaming. From Ryzen or Intel to Radeon and Geforce. A 2060 3GB will be bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.



So mentioning our past tech experience now gives credibility to what we're saying? That's flawed logic, not mine.
Should I start mentioning how many systems I have built in the 25 years I've worked as an IT specialist? Should I? Really?
I don't trust people who buy something which obviously has flaws and then defend it like there is no tomorrow. That reads a bit like "butthurt" to me.

3GB of VRAM is NOT enough for MANY games. Not all, but many (AAA titles mostly). I can name a few; I already DID. I've said that two times already. Those are the games we're buying GPUs for, not lightweight games. You don't agree with that?

Do you think what I am saying is coming out of my a**? You think I haven't run benchmarks myself to see what's what? Of course with the "right settings" VRAM usage can go below 3GB, but that's not the point, because for the games I'm speaking of, going below 3GB of VRAM usage usually also means going to medium/low settings. GTA V, for example, takes about 3.5GB at medium/high settings at 1080p, NOT even Ultra. Now if you start editing .ini files, then we're talking about a very customized experience, and that's not normal for the average user. And even that method cannot change the performance hits or gains beyond the capabilities of the GPU. It is just a more customized method: you can trade quality for speed (and vice versa) exactly as you want, because sometimes the in-game settings don't satisfy all tastes. That's all. But the performance of a GPU with 3GB of VRAM will be what it is.

You say all that about your past experience, yet you are ready to defend a NEW GPU for 2019 with 3GB of VRAM. And you call me ignorant!!!!!!!!!!

P.S. And please don't start with NVIDIA's "magic" compression.


----------



## lexluthermiester (Dec 26, 2018)

CandymanGR said:


> So now mentioning our tech past experience, gives credibility to what we're saying? That's flawed logic, not mine.
> Should i start that mentioning how many systems i have build in the last 25 years i work as IT specialist? Should i? Really?


No, this isn't a contest.


CandymanGR said:


> Do you think what i am saying is coming out of my a**?


Yes? Mostly because you're not taking into account most real-world usage scenarios. For 1080p, a 2060 with 3GB will work and perform very well in most games out today. Very few AAA titles can't be made to run well on such a card. How do I know this, you ask? Because it can be done with a 1060 with 3GB. A 2060 is a better-performing card, so naturally, what a 1060 can do a 2060 will do better. Simple deductive reasoning is all that is required to arrive at that conclusion.


CandymanGR said:


> for medium/high settings at 1080p, NOT even Ultra.


Most people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.


CandymanGR said:


> And you call me ignorant!!!!!!!!!!


Though you took it out of context, that's what I said. And your statements above continue to lend merit to that conclusion.


----------



## CandymanGR (Dec 26, 2018)

lexluthermiester said:


> No, this isn't a contest.


Then why did you start it?



lexluthermiester said:


> Yes? Mostly because you're not taking into account most real-world usage scenario's. For 1080p a 2060 with 3GB will work and perform very well in most games out today.


Yes, and a 950 with 2GB of VRAM will also work and perform "well" in most games. But that's NOT the point. We're NOT talking about cards of 2016 (that's how old the 1060 is, in case you don't remember). And the VRAM requirements of games keep getting higher. You just cannot accept it. And I can tell you games that look like shit with custom settings that keep VRAM usage below 3GB. Like Far Cry 4 or Shadow of Mordor.



lexluthermiester said:


> Very few AAA titles can not be made to run well on such as card. How do I know this you ask? Because it can be done with 1060 with 3GB. A 2060 is a better performing card so, naturally, what a 1060 can do a 2060 will do better. Simply deductive reasoning is all that is required to arrive at that conclusion.


On the contrary, a faster core with RAM as fast as a 2016 card's (I am referring to the GDDR5 versions) might starve for data sooner and suffer memory-bus bottlenecks, especially in VRAM-hungry games! But obviously you know how the card will perform without even seeing it first.



lexluthermiester said:


> Most people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.


Is that even an argument? Obviously most people customise their settings. So? We need points of reference in order to discuss this; otherwise we can say "yeah, you can customize game X to run with even 2GB of VRAM usage". That's not the point! You're avoiding the point systematically.

Edit: Oh, and one more thing, by the way. My English is not great, but as far as I remember, calling someone "biased" is not an insult, while calling someone "ignorant" actually is. Especially in the way you've used it.


----------



## lexluthermiester (Dec 27, 2018)

CandymanGR said:


> Then why you've started it?





CandymanGR said:


> Yes, and also a 950 with 2gb vram will work and perform "well" in most games.


No, it wouldn't.


CandymanGR said:


> But thats NOT the point.


Sure it is. Gaming performance is exactly the point here.


CandymanGR said:


> We're NOT talking about cards of 2016 (thats how old 1060 is in case you dont remember).


Oh gee wiz, thanks for reminding me...


CandymanGR said:


> And i can tell you games that look like shit with custom settings for less than 3gb vram usage.


That's your opinion and a completely subjective one. You're welcome to it.


CandymanGR said:


> But obviously you know how the card will perform without even seeing it first.


Sure I can; here's my premise. I had a 1080 and upgraded to a 2080. The performance jump was significant. I have a 1070 in one of my other PCs, and it is known that the 2070 is a big jump in performance over it. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do, a 2060 will do much better. I don't need to see it to accurately predict the general performance of such a card.



CandymanGR said:


> but as far as i remember calling someone "biased" is not an insult


Depends on how you use it, but I digress..


----------



## CandymanGR (Dec 27, 2018)

lexluthermiester said:


> No, it wouldn't.


Yes it would.




> Sure it is. Gaming performance is exactly the point here.


We were not talking about performance in general; we were talking about whether 3GB of VRAM is enough.




> Oh gee wiz, thanks for reminding me...


When arguments end, irony starts.




> That's your opinion and a completely subjective one. You're welcome to it.


And yours also. Stop presenting your subjective opinion as fact.




> Sure can, here my premise for logic. I had a 1080 and upgraded to a 2080. Performance jump was significant. I have 1070 in one of my other PC's it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do a 2060 will do much better. Don't need to see it to be able to accurately conclude the general performance of such a card.


So from the point of discussion, which was that 3GB of VRAM is not enough, you've reached the conclusion that "the next-gen card will be faster than the previous gen". No shit, Sherlock!? What a great discovery you've made. Of course it will be!

But I speak of a specific area of performance. How do you know, for example, whether the bus width for the GDDR5 models is the same, and therefore whether the bandwidth is similar to the previous gen or not? How do you know the GDDR5 models won't take a performance hit because of bandwidth? How do you know, in particular, whether the 3GB models will keep the GPU fed with data? Your deductive logic is just as flawed as the rest of your arguments.
This is exactly what I mean when I say you are avoiding the point systematically.


----------



## lexluthermiester (Dec 27, 2018)

@CandymanGR 
You're nitpicking and no longer offering merit based arguments. At this point it's obviously about ego for you, so I'm out.


----------



## Xzibit (Dec 27, 2018)

EarthDog said:


> See edit above for your poll.
> 
> I'm mobile and cant dig down on the steam stats link... feeling saucy and post an image of it?



I was curious too as to why that reference and if SHS added it. I did not find any.


----------



## CandymanGR (Dec 27, 2018)

lexluthermiester said:


> @CandymanGR
> You're nitpicking and no longer offering merit based arguments. At this point it's obviously about ego for you, so I'm out.



I quote specific parts, as the forum rules require; I cannot quote a whole paragraph every time. I am not nitpicking, I am trying to prove a point here.


----------



## lexluthermiester (Dec 27, 2018)

EarthDog said:


> I'm mobile and cant dig down on the steam stats link... feeling saucy and post an image of it?


I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT;
There are still these;
https://store.steampowered.com/hwsurvey


----------



## GoldenX (Dec 27, 2018)

So, 192 bit for the 3 and 6GB variants, that's fine.
And 128 for the 4GB one? That's some nice downgrade in performance.


----------



## lexluthermiester (Dec 27, 2018)

GoldenX said:


> So, 192 bit for the 3 and 6GB variants, that's fine.
> And 128 for the 4GB one? That's some nice downgrade in performance.


We don't have those specs yet. I hope not. Maybe the 4GB 128bit will be the GDDR6 variant? That would even out the performance..


----------



## GoldenX (Dec 27, 2018)

lexluthermiester said:


> We don't have those specs yet. I hope not. Maybe the 4GB 128bit will be the GDDR6 variant? That would even out the performance..


Looking at that Gigabyte chart, there will be both GDDR6 and GDDR5 4GB variants. So, a GDDR5 128bit RTX, not nice.


----------



## lexluthermiester (Dec 27, 2018)

GoldenX said:


> Looking at that Gigabyte chart, there will be both GDDR6 and GDDR5 4GB variants. So, a GDDR5 128bit RTX, not nice.


Would have to agree unless the mem clocks are really high to make up for the difference.


----------



## CandymanGR (Dec 27, 2018)

lexluthermiester said:


> Would have to agree unless the mem clocks are really high to make up for the difference.



I don't think the extra clock can cover a one-third bandwidth cut.


----------



## lexluthermiester (Dec 27, 2018)

CandymanGR said:


> I dont think the extra clock can cover 1/3 of bandwidth cut.


It's been done on previous-gen cards from both AMD and NVIDIA. We'll see what happens once the review/information embargoes are lifted.
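Since the bus-width-vs-clock question is pure arithmetic, here's a quick sketch. Peak bandwidth is just bus width times per-pin data rate; the data rates below (8 Gbps for GDDR5, 14 Gbps for GDDR6) are typical figures for those memory types, not confirmed RTX 2060 clocks:

```python
# Back-of-the-envelope memory bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8.
# Data rates are assumptions: ~8 Gbps for GDDR5, ~14 Gbps for GDDR6.

def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_bits * data_rate_gbps / 8

configs = {
    "192-bit GDDR6 (14 Gbps)": bandwidth_gbs(192, 14),  # 336.0 GB/s
    "192-bit GDDR5 (8 Gbps)":  bandwidth_gbs(192, 8),   # 192.0 GB/s
    "128-bit GDDR6 (14 Gbps)": bandwidth_gbs(128, 14),  # 224.0 GB/s
    "128-bit GDDR5 (8 Gbps)":  bandwidth_gbs(128, 8),   # 128.0 GB/s
}

for name, bw in configs.items():
    print(f"{name}: {bw:.0f} GB/s")
```

On these assumed rates, a 128-bit GDDR5 card would have well under half the bandwidth of the 192-bit GDDR6 one; GDDR5 clocks would have to rise ~50% just to offset a one-third bus-width cut, which is exactly the sticking point above.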


----------



## Xzibit (Dec 27, 2018)

lexluthermiester said:


> Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. *Then there's Steam's own stats that show most people turn AA down or off,* most of them running at 1080p. Pixel density 1080p, 1440p and up mostly eliminates the need for AA as the "pixel laddering" effect of the past isn't noticeable or pronounced like it was in the past with lower resolutions and simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.



Still can't find it. As for pixel density, the survey doesn't specify what size monitor people use either.



lexluthermiester said:


> I know they've had those stats available. I just can't find it. Maybe they took it down?
> EDIT;
> There are still these;
> https://store.steampowered.com/hwsurvey



They never have shown AA use in those surveys.


----------



## EarthDog (Dec 27, 2018)

lexluthermiester said:


> I know they've had those stats available. I just can't find it. Maybe they took it down?
> EDIT;
> There are still these;
> https://store.steampowered.com/hwsurvey


I don't ever recall seeing AA stats there. I can't even fathom how they would keep track of something so variable anyway.

Thanks though.


----------



## FordGT90Concept (Dec 27, 2018)

6 GiB (11.16%) variant is obviously more popular than 3 GiB (6.53%) and 5 GiB (1.86%).


----------



## Charcharo (Dec 27, 2018)

bug said:


> You do however posses an uncanny ability to mix together all sorts of things barely related to the subject.
> 
> The simple truth is games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and who coded it and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I _have_ met people that delayed playing a game, because they were planning an upgrade and thought they'd make the best of it.



That you can't see the relation is not my problem. It is pretty obvious to me...

I have met people who won't play a game if they can't max it out, and also people who won't play a game since they will be getting an upgrade soon. I haven't met a person who looks at system requirements, unironically, since 2011.



EarthDog said:


> Oh, I'd bet good money a majority wouldn't consider 30 fps to be playable in most genres/titles. RTS, I can do... FPS...I'd cry and likely get a headache...
> 
> Playable and enjoyable I'm kind of using interchangeably. I mean... 15 is playable. The game plays...but the experience and it being enjoyable, the majority tend agree 30 fps isnt for PCs.
> 
> But....this is all a bit OT. I'd love a thread to get down to the bottom.of why 30 fps seems different on a console versus a PC (I know why movies can get away it).



Majority rule is not a valid argument. It never has been, but if you want to play with that: consoles as a whole (not individual platforms) are bigger than PC gaming. RTS is extremely demanding; what you are looking for seems to be turn-based strategy or tactics. Those you can tolerate at 30 fps.

PCs are not some ephemeral platform that is so different from consoles. The listed requirements in games do not mean much; that is the point. BTW, Wolfenstein: The New Order used to be playable (before a game update broke its wobbly engine) at 60 fps, 1080p, on a 6850. Even using your wacky ideas for what system requirements should mean, this is illogical.


3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.


----------



## Batailleuse (Dec 27, 2018)

M2B said:


> You can easily run almost every single game on a 3GB card at 1080p just fine, but you're gonna lower the texture quality in some games and that's all. the 3GB version is not ideal but is not as usless as some people think, it actually might be a good value card for those needing a fairly powerful GPU and don't care about the maximum quality textures or VRAM related settings.
> if the 3GB RTX 2060 is going to be priced under 200$, I don't see a problem but anything higher than 200$ is unacceptable.
> 8GB VRAM on a 800$ card (RTX 2080) is more disappointing than 3GB on a budget card to me.


What's the point of buying a 3GB 2060 to run everything on low? You can buy a 4GB or 6GB 1060 that will do better for cheaper, or even a 1070 that will do better still.


----------



## bibob94 (Dec 27, 2018)

Well all I can say is, this is bullshit.


----------



## lexluthermiester (Dec 27, 2018)

Xzibit said:


> They never have shown AA use in those surveys.


Yes they have. That kind of fine-grained detail is available to the Steam client and gets reported to Steam's telemetry service. The info has been available; it just doesn't seem to be there anymore.


----------



## Kissamies (Dec 27, 2018)

ShurikN said:


> 3GB on a mid range gpu in 2019... I'm at a loss for words.


Still kicking with a GTX 780 (I know, this was high-end back in the day).

BUT WHY?! Why can't there just be one or two models..


----------



## Vayra86 (Dec 27, 2018)

This just goes to show that Nvidia still cannibalizes the x50-x60 range in every possible way it can. They've been doing it since the 550 Ti and never stopped. I'm not that familiar with the generations prior when it comes to VRAM trickery, but maybe this was their thing for even longer.

Nothing new here, apart from the ever-increasing complexity of buying one of these GPUs. The takeaway is the same: when you're shopping in this territory, you will always be making tradeoffs from the moment you purchase. And those tradeoffs can range well into uncomfortable territory for even casual gamers. As for the rest of the four-page discussion about quality settings... not very relevant, I'd say.

For a meaty discussion of 3GB vs. more, we have live examples of the 780, 780 Ti and 1060 3GB already running into those limits at settings these GPUs can push comfortably. That says enough, because the 2060 will be faster on the core. The balance is off, and 4GB and up is recommended. What the specs on a game's box say isn't even relevant in that regard; it's about how a GPU is balanced.


The key element is price, and knowing Nvidia, they will price it far too high. So, TL;DR: these cards are going to be eclipsed by a more power-hungry AMD alternative. /thread, and Happy New Year.


----------



## goodeedidid (Dec 27, 2018)

ShurikN said:


> 3GB on a mid range gpu in 2019... I'm at a loss for words.


Hey, they give you the choice: if you wanna go budget then do it. Nobody is going to be rocking a 2060 with a 4K screen, so 3GB shouldn't be that bad, for FHD that is.


----------



## efikkan (Dec 27, 2018)

Charcharo said:


> 3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.


So how do you pick your threshold? Is it an arbitrary number? It's 2018, therefore cards need x amount of memory?
I base my conclusions on facts, and the fact is that the GTX 1060 3GB is still a good option for many buyers.

And as I've said many times already, just because _some_ need more memory doesn't mean everyone does.
Modding and high-res texture packs are a niche thing, an edge case. If you're one of those who do it, buy a card with more memory; it's as simple as that.


----------



## FordGT90Concept (Dec 27, 2018)

4 is okay (though not so much in the RTX 2060's case, because of the gimped bandwidth); >4 is preferred.


----------



## efikkan (Dec 27, 2018)

FordGT90Concept said:


> 4 is okay (not so much in RTX 2060 case because of gimped bandwidth), >4 is preferred.


*Why* is 4GB okay and 3GB not?
Different GPUs have different levels of compression, and different ways of allocating and managing memory.


----------



## Tsukiyomi91 (Dec 27, 2018)

I would just rather take the full 6GB GDDR6 variant anyway, because any smaller/slower memory configuration will tank the GPU a lot, and more complaints will show up...


----------



## Vayra86 (Dec 27, 2018)

efikkan said:


> *Why* is 4GB okay and 3GB not?
> Different GPUs have different levels of compression, and different ways of allocating and managing memory.



Call it the value of experience. Lacking that, 3GB looks like a great choice for a midrange GPU; then you use one, learn it's not all as rosy as the reviews and benches told you, and make a better choice next time. Or you're thát casual a gamer that you never touch a game that hits the limit, or accidentally never use settings that hit it. In those edge cases, yes, 3GB is fine.

It's not like there aren't alternatives at the same price point. Back when the 780 (Ti) and 7970 were hot, that was different. Today, it's silly to pick one; it's simply not the preferable choice, especially given the often tiny price gap to something better.


----------



## efikkan (Dec 27, 2018)

Vayra86 said:


> Call it the value of experience, and lacking that, 3GB is a great choice for a midrange GPU. Then you use one, you learn its not all that rosy as the reviews and benches told you, and you make the better choice next time. Or you're thát casual in gaming that you never touch a game that hits a limit or accidentally never use settings that hit a limit. In those edges cases, yes, 3GB is fine.
> 
> Its not like there aren't any alternatives at the same price point. Back when the 780(ti) and 7970 were hot, that was different. Today, its silly to pick one. Its simply not the preferable choice especially given the often tiny price gap for something better.


In the real world, GTX 1060 3GB works just fine for an entry mid-range card, and that's what the _experience_ tells us.

Comparing Kepler directly to Turing is not fair, newer GPU architectures are more efficient.


----------



## HZCH (Dec 27, 2018)

Confusing AF.

What the hell were they thinking? And 3GB of RAM for a midrange card is an insult.

Hopefully for future buyers, it might be fake...


----------



## Vayra86 (Dec 27, 2018)

efikkan said:


> In the real world, GTX 1060 3GB works just fine for an entry mid-range card, and that's what the _experience_ tells us.
> 
> Comparing Kepler directly to Turing is not fair, newer GPU architectures are more efficient.



You said it right. Entry midrange. Bottom barrel. And releasing a similar 3GB 'next gen' means it has dropped lower than that.

If that is something you feel comfy spending ~200 bucks on, by all means. I'd suggest spending 220~240 to get *double* VRAM and more consistency alongside higher performance. These 2060's are going to present a choice along those lines and we all know they will, so let's stop fooling each other. This is a typical penny wise / pound stupid trade off.


----------



## M2B (Dec 27, 2018)

efikkan said:


> *Why* is 4GB okay and 3GB not?
> Different GPUs have different levels of compression, and different ways of allocating and managing memory.



4GB allows you to use Ultra/High-quality textures in most games, whereas Ultra textures are out of reach on a 3GB card in newer AAA titles; sometimes you need to set texture quality to medium to avoid stuttering and other problems on a 3GB card.
That extra 1GB of memory makes a noticeable difference.

"Different GPUs have different levels of compression"
That's mostly about memory bandwidth, not the frame buffer.
You can't use better textures on one 3GB card vs. another 3GB card and say "my GPU has better memory compression".
In real-world scenarios it doesn't work like that.
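To put rough numbers on why textures dominate the frame buffer: an uncompressed RGBA8 texture costs width × height × 4 bytes, plus about a third more for its mip chain (the geometric series 1 + 1/4 + 1/16 + ... converges to 4/3). Real games use block compression, which shrinks this 4-8×, but the scaling with resolution is the same. A minimal sketch:

```python
# Rough VRAM cost of an uncompressed RGBA8 texture, including its full mip chain.
# The mip chain adds ~1/3 on top: 1 + 1/4 + 1/16 + ... -> 4/3 of the base level.

def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Approximate resident size in MiB, mips included."""
    base = width * height * bytes_per_pixel
    return base * 4 / 3 / 2**20

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mib(size, size):.1f} MiB")
```

Even with compression, a few hundred "ultra" textures resident at once eats a large slice of a 3GB card before the framebuffer, geometry, and render targets are counted, which is why texture quality is the first setting to drop.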


----------



## FordGT90Concept (Dec 27, 2018)

efikkan said:


> In the real world, GTX 1060 3GB works just fine for an entry mid-range card, and that's what the _experience_ tells us.
> 
> Comparing Kepler directly to Turing is not fair, newer GPU architectures are more efficient.


The 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you? Now we're almost 2.5 years later. I wouldn't be surprised if it shifts to somewhere between 4:1 and 8:1 with the RTX 2060.

Why? I already explained it to you: the 32-bit barrier is gone. Games aren't treading lightly with their memory footprint anymore. Xbox One X ports have access to 11 GiB of RAM + VRAM. As the next generation of consoles launches, it's going to go even higher. 3 GiB is like a person with a broken leg: they'll live, but not well. Games want more memory; run out and the framerate tanks.


----------



## ShurikN (Dec 27, 2018)

Buying 3GB in 2019 is buying for yesterday, not tomorrow. There is zero future-proofing in it, which would be fine if this were a $120 card; in reality it'll be around $300.
Some games will work fine, but most visually high-end titles will struggle.
Battlefield V is already unplayable at 1080p with less than 4GB of VRAM.


----------



## CandymanGR (Dec 27, 2018)

Yes, but people here still try to persuade the rest of us that "3GB is just fine", as if we haven't seen for ourselves how far that is from the truth.
But none of these people would actually buy such a card. It's funny: people with 2080s telling other members that "3GB is fine".
But hey, they are "experienced"; they know it all, and the rest of us are ignorant.

P.S. Yes, I am being sarcastic.


----------



## Tatty_One (Dec 27, 2018)

FordGT90Concept said:


> *6 GiB version of GTX 1060 outsold the 3 GiB versions 2:1*.  What does that tell you?  Now we're almost 2.5 years later.  I wouldn't be surprised if it shifts to 4:1 to 8:1 with the RTX 2060.
> 
> Why? I already explained it to you: the 32-bit barrier is gone.  Games aren't treading lightly with their memory footprint anymore.  Xbox One X ports have access to 11 GiB RAM + VRAM.  As the next generation of consoles launch, it's going to go even higher.  3 GiB is like a person with a broken leg: they'll live but not well.  Games want more memory.  Run out and framerate tanks.



To be fair, that was not just due to the increased memory but possibly also the 128 extra shader units the 6GB version had; it was simply a faster card, even when not using the additional memory, for some $40 or $50 more. The combination of both just made so much more sense for those who could stretch to the additional cost.
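The published specs bear that out: the GTX 1060 6GB carries 1280 CUDA cores against the 3GB model's 1152, so it was roughly 11% wider before the memory difference even enters the picture:

```python
# GTX 1060 core counts (published specs): the 6GB model is a wider chip,
# not just the same GPU with more memory bolted on.
cores_3gb = 1152
cores_6gb = 1280

extra = cores_6gb - cores_3gb
print(f"{extra} extra cores, +{extra / cores_3gb:.1%} shader throughput")
```

That headroom, not the extra VRAM alone, is a big part of why the 6GB card benchmarks ahead even in titles that never fill 3GB.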


----------



## EarthDog (Dec 27, 2018)

Tatty_One said:


> To be fair, that was not just due to the increased memory but possibly the 128 more shader units the 6GB version had, it was just a faster card even when not using the additional memory for some 40 or 50$ more...……. the combination of both just made so much more sense for those who could stretch to the additional cost.


Indeed... if anyone knew about it. That wasn't exactly advertised much. Prospective buyers would have to know, or look up and compare specs. Most people looking at these see "6GB is more than 3GB" and think it's better. You give consumers too much credit.


----------



## bug (Dec 27, 2018)

This is off topic (if this thread ever had one), but I was just looking at some ads for laptops. The GPU model wasn't even mentioned, it was "Nvidia graphics with 4GB of RAM". A bit scary imho.


----------



## phill (Dec 27, 2018)

Has Nvidia gone mad?? So much effort for a middle-of-the-road card. I just don't get it...


----------



## micropage7 (Dec 27, 2018)

Sometimes I feel like they're poisoning the market; I mean, they create a small gap to push consumers to add a little more money to get the higher model.


----------



## bug (Dec 27, 2018)

micropage7 said:


> sometimes i feel like they poisoning the market, i mean they create small gap to push consumer to add little more money to get higher one


Yeah, I hate them for that, too. I mean, before Nvidia, this tactic was almost unheard of. Oh, wait!


----------



## Space Lynx (Dec 27, 2018)

Can't wait for 7nm AMD gpu's to make Nvidia cry. Sick of this crap.


----------



## Vayra86 (Dec 27, 2018)

micropage7 said:


> sometimes i feel like they poisoning the market, i mean they create small gap to push consumer to add little more money to get higher one



You can go to a random bar in town and find the same trickery in a simple price list of booze. Or even a supermarket where all that matters is where the product is positioned - at eye level or far below.


----------



## efikkan (Dec 27, 2018)

FordGT90Concept said:


> 6 GiB version of GTX 1060 outsold the 3 GiB versions 2:1.  What does that tell you?


That means Nvidia sold more GTX 1060 3GB cards than AMD did of RX 400/500 combined (according to Steam survey), so that tells me more people want this than Polaris.

And don't forget the 6 GB version of GTX 1060 is like 5-6% faster too.



FordGT90Concept said:


> Why? I already explained it to you: the 32-bit barrier is gone.


Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.



ShurikN said:


> Buying 3GB in 2019 is buying for yesterday, not tomorrow. There is zero future-proofing with it. Which would be fine if that was a $120 card. In reality it'll be around $300.
> 
> Some games will work fine, but most visually high-end titles will struggle.
> 
> Battlefield 5 is already unplayable at 1080p with less than 4GB VRAM.


Aah, the eternal "future proofing" argument.
I remember all those who bought GCN over Kepler because it was more "future proofing" in Direct3D 12. Then R9 390(X) with 8 GB for "future proofing". And then Fiji with HBM for "future proofing", but then suddenly memory capacity didn't matter any more, because HBM was so glorious. Then Polaris with 8 GB for "future proofing", because memory capacity suddenly mattered again.

In real world it's a balancing act. You'll have to guess your requirements for the immediate future. But taking "future proofing" too far is going to be wasted money in the end. History has proven that paying extra for a lot of "future proofing" has never paid off.



lynx29 said:


> Can't wait for 7nm AMD gpu's to make Nvidia cry. Sick of this crap.


Then prepare yourself for disappointment.


----------



## Space Lynx (Dec 27, 2018)

efikkan said:


> Then prepare yourself for disappointment.



sure thing buddy.


----------



## Casecutter (Dec 27, 2018)

Just to ask: with a 192-bit bus, how would they access 4 GB with only three channels?


----------



## efikkan (Dec 27, 2018)

Casecutter said:


> Just to ask: with a 192-bit bus, how would they access 4 GB with only three channels?


Two options that I know of:
- Disable one memory controller and use 128-bit, possibly compensate with faster memory.
- Use an imbalanced memory configuration, like GTX 660/660 Ti.
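A rough back-of-the-envelope check of the first option (the figures are typical per-pin data rates used for illustration, not confirmed specs for any RTX 2060 variant): a 128-bit bus with fast GDDR6 can actually exceed a 192-bit bus running GDDR5.

```python
def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    # Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate.
    return bus_width_bits / 8 * data_rate_gtps

# Illustrative configurations only:
full_bus_gddr5 = bandwidth_gbps(192, 8)    # 192-bit @ 8 Gbps GDDR5  -> 192 GB/s
cut_bus_gddr6 = bandwidth_gbps(128, 14)    # 128-bit @ 14 Gbps GDDR6 -> 224 GB/s
print(full_bus_gddr5, cut_bus_gddr6)
```

So with these assumed speeds, the narrower bus would not even need the full compensation.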


----------



## Deleted member 158293 (Dec 27, 2018)

LOL at 3GB...

*Looks through closet for old 3GB 7970...  yup...  launched in 2012...*

I don't get it, but ok... guess the 2060 is the "Sucker's Edition"...


----------



## bug (Dec 27, 2018)

lynx29 said:


> Can't wait for 7nm AMD gpu's to make Nvidia cry. Sick of this crap.


Yeah, people have been waiting for that since AMD's nomenclature used four digits.

Edit: Oops, I forgot about the 285.


----------



## Vayra86 (Dec 27, 2018)

efikkan said:


> That means Nvidia sold more GTX 1060 3GB cards than AMD did of RX 400/500 combined (according to Steam survey), so that tells me more people want this than Polaris.
> 
> And don't forget the 6 GB version of GTX 1060 is like 5-6% faster too.
> 
> ...



You can believe whatever you want to believe, and if you think 3 GB GPUs are the shit in 2019, more power to you. The sales records show a different picture, with higher-capacity models vastly outselling the 3 GB ones even in the midrange. Compare that to the Kepler days, when standard high-end VRAM was 2 GB. Nobody in their right mind bought the 4 GB 670 or 680, simply because games could barely _even allocate more than 1.5GB._ Today they allocate 6 GB and up without issues. Game development has changed a bit. There is a reason 4 GB 970s were released when Maxwell popped up (a 25% increase over a *similar core power* GTX 780 (Ti)), and that amount _doubled_ for the next-gen equivalent, the GTX 1070. It's clear as day that the balance has completely shifted towards the console norm in terms of VRAM. There is a reason even the midrange RX 480 comes in 8 GB flavors too.

Explain this: how does a 5-6% performance gap translate to 25% or 50% less VRAM? Where is the balance in that? And why would you *not* suffer a performance hit from such a cutdown when you push data over the same, rather narrow bus?

Use common sense instead of gazing endlessly at performance summaries that reduce all detail to a single percentage and are rarely based on a fully comprehensive benchmark suite. Reviews are an indicator, not an absolute, all-encompassing truth. People apparently still didn't get that memo. It's the exception that proves the rule when it comes to VRAM, and you only need one edge case to kill the experience.


----------



## Gasaraki (Dec 27, 2018)

M2B said:


> You can easily run almost every single game on a 3GB card at 1080p just fine, but you're gonna lower the texture quality in some games, and that's all. The 3GB version is not ideal, but it's not as useless as some people think; it actually might be a good value card for those needing a fairly powerful GPU who don't care about maximum-quality textures or VRAM-related settings.
> If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
> 8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.



OMG, I NEEDS 12GB of RAM!!! 

The truth is that if most people game at 1080p (which they do), 3 GB of VRAM should be enough for the vast majority of games. I game at 3440x1440 and most games don't break 4 GB of VRAM.


----------



## Tatty_One (Dec 27, 2018)

Gasaraki said:


> OMG, I NEEDS 12GB of RAM!!!
> 
> The truth is that if most people game at 1080p (which they do), 3 GB of VRAM should be enough for the vast majority of games. *I game at 3440x1440 and most games don't break 4 GB of VRAM*.


I struggle to comprehend what games you are playing then. I moved from a 4GB 290X to a 1070 a few months back. I only play one game, World of Tanks, and when they completely updated their game engine, the 290X on ultra settings moved overnight from an average of 2.9 GB usage to 4.4 GB, and that game is hardly demanding even on ultra at my 2560 x 1080.


----------



## cyneater (Dec 27, 2018)

I love how everyone is triggered by the 3, 4, and 6 GB variants.

What about the 11 GB flagship? It's the same as the last generation, and it's an odd number.
12 GB would be better, or 16 GB.

Since there is no competition, Nvidia could stick their logo on a turd and market that.

Where is the blast processing?


----------



## kanecvr (Dec 27, 2018)

Nxodus said:


> AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
> NVIDIA = expensive
> 
> I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards



You're kidding, right? The latest Nvidia software is garbage. AMD Adrenalin has better features, includes out-of-the-box OC and fan profiles, runs and feels smoother, is less taxing than GeForce Experience and launches faster, doesn't nag you to create an account to use some features, and it's been bug-free for years. Nvidia's drivers, by contrast, have had versions with critical bugs like "forgetting" to spin up your fans under load, mismanaging GPU voltage, failing to install over stock, signed Windows drivers, and so on.

And don't get me started on overheating. My 1080 FE would quickly go to 83C and throttle down to 1300 MHz, causing it to perform WORSE than my old 1070. I took the FE card back and got an MSI card, which did pretty much the same thing. I had to buy and install a $100 cooler to get the card to stop throttling. Same for power usage: the 180 W TDP on the 1080 is pure fiction. Under load my 1080 draws 200-240 W on its own (tested with an ammeter on the second 12 V rail on the PSU, which only the video card is using: 20 amps x 12 V = 240 W). I thought the card draws 180 W if not allowed to boost, but at stock 1530 MHz it draws almost 16 amps: 12 V x 16 A = 192 W. If you're referring to the 1060, then yeah, those are cool cards. 70-75C even with cheap, crappy coolers, and they are fast cards. They're OK for 1080p, but that's it. The 580 can do a lot better, especially overclocked versions like the XFX 580 GTS OC Black Edition (that card does get pretty freakin' hot though).

As for instability: you've never used an AMD card, right? I've had a 7950, then two, then a 280X, then bought a second one, and then a 290 (non-X). They were all rock-solid. I also played around with a Vega 64, and while the max FPS is not as high as on a 1080 (in some games), the minimum FPS and frame times are miles better on the Vega, so much so that most games are noticeably smoother, even though the framerate is a tiny bit lower. I've been trying to trade my 1080 for a Vega 64, but guess what: nobody wants to take the trade! The only reason I switched to Nvidia is the mining boom, which made AMD cards climb in price to a silly degree. A Vega 64 was twice the price I paid for my 1080, so I said screw that and bought what made sense at the time.

You are *DEFINITELY RIGHT on the innovation part though*... AMD needs to get off their asses and release something competitive, not that 590 (i.e. overclocked 580 BULLSHIT). And this goes for both AMD and Nvidia fans. Left to their own devices, Nvidia will end up charging $5000 for a high-end GPU.



lexluthermiester said:


> For 1080p screens(which most gamers are still using), 3GB is still reasonable.
> 
> 
> That will depend on the settings level.



The 470 and 570 are great little cards. And so is the 580. I picked up a 4 GB 580 in November for $150 for my living room PC (i5 2500K @ 4 GHz, mATX form factor) and it runs 1080p @ ultra flawlessly. I even play some games in 4K (less demanding ones like Civ 6 and some oldies). For that price Nvidia was offering a 1050 Ti, which is significantly slower.


----------



## CandymanGR (Dec 27, 2018)

efikkan said:


> Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.



A 32-bit CPU cannot address a 64-bit memory space. And you need 64-bit addressing in order to access more than 4 gigabytes of memory.
There is NO way for memory above 4 GB to be addressable from a 32-bit CPU, not even with virtual memory paging. At least not on an x86 CPU.




efikkan said:


> Two options that I know of:
> - Disable one memory controller and use 128-bit, possibly compensate with faster memory.
> - Use an imbalanced memory configuration, like GTX 660/660 Ti.



There is NO way to cover the one-third deficit from cutting the memory bus with higher clocks, because GDDR5 memory has limits on the speeds it can achieve. That's why I think they are also using GDDR6 for the same model. Because IT MATTERS for this generation, even for the middle range. Probably the GDDR6 model will be much faster, and maybe with a slightly different core config. I guess Nvidia knows GDDR5 is not as fast as the core needs its memory to be, but they don't give a damn. Milking the cow is the way for them. You would need a 30-35% increase in memory speed to cover the deficit.

It's so funny seeing you guys trying to defend something that sucks so hard. Really, some people here should consider a new career in comedy (that was a joke).


----------



## bug (Dec 27, 2018)

Tatty_One said:


> I struggle to comprehend what games you are playing then.  I moved from a 4GB 290X to a 1070 a few months back, I only play one game which is world of tanks and they completely updated their game engine and overnight the 290X on ultra settings moved from an average 2.9Gig usage to 4.4 and that game is hardly demanding even on ultra at my 2560 x 1080.


You don't have to update a game engine for that effect. Just upscale your textures 2x and you get 4x* the VRAM usage without actually improving quality.

*without factoring in compression
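The quadratic scaling described above is easy to verify with simple arithmetic (the 2048x2048 RGBA8 texture here is purely an illustrative example, not a figure from any specific game):

```python
def texture_bytes(width, height, bytes_per_pixel=4):
    # Uncompressed size of a single mip level; RGBA8 uses 4 bytes per pixel.
    return width * height * bytes_per_pixel

base = texture_bytes(2048, 2048)   # 16 MiB
up2x = texture_bytes(4096, 4096)   # 64 MiB: doubling each side quadruples memory
print(up2x // base)
```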



CandymanGR said:


> *A 32-bit CPU cannot address a 64-bit memory space.* And you need 64-bit addressing in order to access more than 4 gigabytes of memory.
> There is NO way for memory above 4 GB to be addressable from a 32-bit CPU, not even with virtual memory paging. At least not on an x86 CPU.



Ah, this misconception is with us since Athlon64 days. I suggest you look up PAE, the address space hasn't been confined by the general architecture for quite some time. It's awkward to do, so this practice isn't all that widespread (afaik), but it exists.
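For reference, the limits involved can be stated as plain numbers: PAE widens x86 physical addresses from 32 to 36 bits, so the machine can hold more RAM, while each process still sees at most a 4 GiB virtual address space at a time.

```python
# PAE extends x86 physical addressing from 32 to 36 bits.
physical_limit_no_pae = 2 ** 32   # 4 GiB of physical RAM without PAE
physical_limit_pae = 2 ** 36      # 64 GiB of physical RAM with PAE
virtual_limit = 2 ** 32           # each process still gets a 4 GiB virtual window

print(physical_limit_pae // 2 ** 30)  # physical limit with PAE, in GiB
```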


----------



## CandymanGR (Dec 27, 2018)

bug said:


> Ah, this misconception is with us since Athlon64 days. I suggest you look up PAE, the address space hasn't been confined by the general architecture for quite some time. It's awkward to do, so this practice isn't all that widespread (afaik), but it exists.



Show me an example of a 64-bit application working on a 32-bit CPU, then. There is no way to have a 32-bit application with a 64-bit address space on a 32-bit CPU.
There are no 64-bit memory registers on a 32-bit CPU.

Edit: If my style of writing feels aggressive, sorry. I am not attacking anyone, I just disagree with passion.


----------



## RichF (Dec 28, 2018)

This is what happens when corporations feel no fear.

No fear of consumer retaliation for anti-consumer practices.

No fear of government oversight reining anti-consumer practices in (really the same thing, since governments are supposed to be people elected to do the people's work).

This is what happens when there is monopoly, duopoly, and quasi-monopoly.

The tech world has far too little competition in a lot of areas and this is what consumers get. If you don't like it you're not going to get anywhere by engaging with forum astroturfers. Organize and get political action.


----------



## bug (Dec 28, 2018)

CandymanGR said:


> Show me an example of a 64-bit application working on a 32-bit CPU, then. There is no way to have a 32-bit application with a 64-bit address space on a 32-bit CPU.
> There are no 64-bit memory registers on a 32-bit CPU.
> 
> Edit: If my style of writing feels aggressive, sorry. I am not attacking anyone, I just disagree with passion.


You refuse to educate yourself with the same passion, it would seem.


----------



## CandymanGR (Dec 28, 2018)

bug said:


> You refuse to educate yourself with the same passion, it would seem.



Thanks for your valuable input.


----------



## bug (Dec 28, 2018)

CandymanGR said:


> Thanks for your valuable input.


I gave you my input above: look up PAE and read about it. (And not because I'm too lazy to detail, but because it's a lot to read.)
You're acting as if that never happened.


----------



## CandymanGR (Dec 28, 2018)

bug said:


> I gave you my input above: look up PAE and read about it. (And not because I'm too lazy to detail, but because it's a lot to read.)
> You're acting as if that never happened.



And I also told you it "doesn't work well, not even with virtual memory paging", but you also act as if nothing happened. Those CPUs cannot "see" the whole memory at once. Memory paging sucks; it's ancient technology. That's why we went to x86-64.
You forget that PAE may indeed support a larger memory range, but in THEORY. In reality, the virtual address space capabilities of those CPUs (Pentium Pro) remained 32-bit. This changed with AMD's x86-64.

Edit: In any case, I won't say more about this, because I think it is off topic.


----------



## FordGT90Concept (Dec 28, 2018)

bug said:


> Ah, this misconception is with us since Athlon64 days. I suggest you look up PAE, the address space hasn't been confined by the general architecture for quite some time. It's awkward to do, so this practice isn't all that widespread (afaik), but it exists.


PAE is not practical in gaming (as stated repeatedly). The latency is too high, framerates plummet.  It's like going into the 3.5-4.0 GiB territory of a GTX 970.

The point of mentioning it is that it manifests a watershed moment.  When games were developed for 32-bit, their memory usage was very restricted.  The moment games switched to 64-bit, suddenly there was memory available so developers sought to use it.  Fury X marks the transition.  4 GiB was okay then but it definitely isn't okay now--especially in premium cards.

Just look at the response to this thread.  All but two people, by my count, are scoffing at the notion of a 3 GiB 2060.  It's sad that yields are so low they feel they need to debut four extra models of sub-par cards under the same brand.


----------



## CandymanGR (Dec 28, 2018)

That's because PAE on 32-bit CPUs cannot access all physical memory at once. So it uses a method of virtualizing the memory space in pages, then using paging and segmentation to access all available RAM (in parts). But this introduces lots of wait states for the processor every time it needs to access data on a different "memory page". At least, that is my understanding.


----------



## RichF (Dec 28, 2018)

FordGT90Concept said:


> It's sad that yields are so low they feel they need to debut four extra models of sub-par cards under the same brand.


There are other incentives for doing that. Yields, for example, don't explain things like GPUs that came with much more VRAM than they could put to use, GPUs with large amounts of slow VRAM and others — with the same chip — that have much less but much faster VRAM (e.g. 2 GB DDR3 on one and 512 MB GDDR5 on the other), packaging that makes the GPU seem powerful and useful for serious gaming, numbers that make the GPU sound more powerful than the previous generation... you know... the whole sad bag of tricks.

And tricks are what they are. They're not merely a matter of efficiently dealing with yield problems. Far from it. There are plenty of ways to deal with yields that don't involve intentionally confusing the consumer. But, that's how consumers are parted with more money than they otherwise would be. That, of course, is the entire point of the business of advertising.

The primary reason to sell two, or three, or fifteen different specs with the same number name (e.g. 1060) is to confuse the consumer. This is why, for instance, Sapphire sold Vega cards with vapor chambers (and got them reviewed), then sold basically identical cards to consumers without the vapor chambers. Bait and switch deception, in many forms.


----------



## efikkan (Dec 28, 2018)

CandymanGR said:


> A 32-bit CPU cannot address a 64-bit memory space. And you need 64-bit addressing in order to access more than 4 gigabytes of memory.
> 
> There is NO way for memory above 4 GB to be addressable from a 32-bit CPU, not even with virtual memory paging. At least not on an x86 CPU.


This is a common confusion, even among many engineers, unfortunately. You are mixing address width with register width. While these two can be the same width, CPUs and software can certainly work around a mismatch between the two.
The old 8086 (16-bit) had a 20-bit address bus. It had to use two registers to specify a memory address. This is extra overhead, but completely achievable.
Another example is the old 8-bit 6502 (and derivatives), famous for the Commodore 64, Atari 2600, Apple II, and NES. This 8-bit chip had a 16-bit address width, allowing direct access to 64 KB. Machines like the Commodore 64 employed a technique called bank switching to extend this even further.
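The 8086 scheme described above can be sketched directly: the 16-bit segment register is shifted left by 4 bits and added to the 16-bit offset, yielding a 20-bit physical address from two 16-bit registers.

```python
def phys_addr_8086(segment, offset):
    # Real-mode 8086: physical address = segment * 16 + offset,
    # truncated to 20 bits -> a 1 MiB address space.
    return ((segment << 4) + offset) & 0xFFFFF

# Example: the well-known 8086 reset vector at segment F000h, offset FFF0h.
print(hex(phys_addr_8086(0xF000, 0xFFF0)))
```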



CandymanGR said:


> And I also told you it "doesn't work well, not even with virtual memory paging", but you also act as if nothing happened. Those CPUs cannot "see" the whole memory at once. Memory paging sucks; it's ancient technology. That's why we went to x86-64.


The fact police need to correct you again.
Memory paging is not ancient, nor is it outdated in any way. Paging must not be confused with swapping/pagefiles, which is when memory pages are moved to another storage medium. Paging is just the division of memory into sections organized into a continuous virtual memory space for each application; it's essential for multitasking operating systems.



CandymanGR said:


> There is NO way to cover the one-third deficit from cutting the memory bus with higher clocks, because GDDR5 memory has limits on the speeds it can achieve.


Well, it might not cover all of it, but it might not _have to_. Turing cards do in general have much more memory bandwidth than Pascal already, so they have a lot of headroom.


----------



## bug (Dec 28, 2018)

FordGT90Concept said:


> PAE is not practical in gaming (as stated repeatedly). The latency is too high, framerates plummet.  It's like going into the 3.5-4.0 GiB territory of a GTX 970.
> 
> The point of mentioning it is that it manifests a watershed moment.  When games were developed for 32-bit, their memory usage was very restricted.  The moment games switched to 64-bit, suddenly there was memory available so developers sought to use it.  Fury X marks the transition.  4 GiB was okay then but it definitely isn't okay now--especially in premium cards.
> 
> Just look at the response to this thread.  All but two people, by my count, are scoffing at the notion of a 3 GiB 2060.  It's sad that yields are so low they feel they need to debut four extra models of sub-par cards under the same brand.





CandymanGR said:


> That's because PAE on 32-bit CPUs cannot access all physical memory at once. So it uses a method of virtualizing the memory space in pages, then using paging and segmentation to access all available RAM (in parts). But this introduces lots of wait states for the processor every time it needs to access data on a different "memory page". At least, that is my understanding.


My only assertion here was that 32-bit CPUs _can_ use more than the 32-bit address space allows.


----------



## lexluthermiester (Dec 28, 2018)

bug said:


> My only assertion here was that 32-bit CPUs _can_ use more than the 32-bit address space allows.


Which is correct. There are many ways to map memory beyond the physical limits of a CPU.


----------



## CandymanGR (Dec 28, 2018)

Yes, but you cannot access all the memory at once.
Memory paging sucks, and it IS ancient.

Of course I don't confuse MEMORY SEGMENTATION AND PAGING with virtual memory. I think YOU do.

A 32-bit CPU cannot USE more memory; it just segments the RAM into 32-bit memory pages! This is NOT the same as x64 memory addressing. JESUS!



efikkan said:


> This is a common confusion, even among many engineers, unfortunately. You are mixing address width with register width. While these two can be the same width, CPUs and software can certainly work around a mismatch between the two.
> The old 8086 (16-bit) had a 20-bit address bus. It had to use two registers to specify a memory address. This is extra overhead, but completely achievable.
> Another example is the old 8-bit 6502 (and derivatives), famous for the Commodore 64, Atari 2600, Apple II, and NES. This 8-bit chip had a 16-bit address width, allowing direct access to 64 KB. Machines like the Commodore 64 employed a technique called bank switching to extend this even further.



Funny. You say it is not ancient tech, yet you give examples of ancient processors, like the 8086 or the MOS 6502. And the Motorola 68000 had a 16-bit bus but 32-bit memory and data registers. So? You compare ancient architectures that had no performance hit whatsoever from using memory segmentation and bank-switched addressing, because the CPU itself was so slow!



efikkan said:


> The fact police need to correct you again.


Wooooooow... really? So you are the... tech police here, who are always right while the others are wrong, and you correct them? No shit? Really? Do you have more jokes like that?



efikkan said:


> Memory paging is not ancient, nor is it outdated in any way. Paging must not be confused with swapping/pagefiles, which is when memory pages are moved to another storage medium. Paging is just the division of memory into sections organized into a continuous virtual memory space for each application; it's essential for multitasking operating systems.



You don't even KNOW what memory paging and segmentation are, do you? Read again. I am talking about 32-bit memory addressing vs 64-bit memory addressing, and you and bug are saying whatever comes to your minds. And if PAE were such a panacea as you and bug imply, we would still be using 32-bit CPUs. We went to x86-64 FOR A REASON. Do you know what it was?


----------



## bug (Dec 28, 2018)

@CandymanGR At this point I would suggest you either drop it or try to be more concise about what you're trying to say.


----------



## CandymanGR (Dec 28, 2018)

efikkan said:


> Well, it might not cover all of it, but it might not _have to_. Turing cards do in general have much more memory bandwidth than Pascal already, so they have a lot of headroom.



Turing cards have more bandwidth? Are you sure? An RTX 2080 has more bandwidth than a GTX 1080?


----------



## 95Viper (Dec 28, 2018)

Get back on topic.
Stop the side topic bickering.

Thank you and try to have a nice day.


----------



## CandymanGR (Dec 28, 2018)

bug said:


> @CandymanGR At this point I would suggest you either drop it or try to be more concise about what you're trying to say.


Seriously?

Really? REALLY?  Ok then.

32-bit CPUs cannot "see" and address the whole physical memory *at once* (if more than 4 GB). Only 64-bit CPUs can.

PAE is NOT the same as x64 memory addressing, and it CANNOT use all memory at once (if more than 4 GB). Anyone who says otherwise is ignorant and should stop pretending he is... tech police. If someone still keeps insisting on that, he has no fucking clue.

Memory paging and segmentation, bank switching, and other ancient shit are for history lessons, not for 2019 tech.
And multitasking hasn't needed memory paging since 1992, because of memory protection!

Turing does not have more bandwidth. Bandwidth is the combination of bus width and memory speed. It is not... MAGIC, as some "experts" here believe.

Some wannabe experts should really reconsider their opinion of themselves.

Covered?


----------



## FordGT90Concept (Dec 28, 2018)

Oh look! A *6 year old* game (Max Payne 3) is using 3.2 GiB VRAM at 1920x1200!


----------



## efikkan (Dec 28, 2018)

I don't know if anyone has mentioned it, but if the rumor of six RTX 2060 variants is true, perhaps some might be third-world editions™ (similar to the GTX 1060 5GB)?
As mentioned, I'm not a fan of this due to the naming confusion.


----------



## FordGT90Concept (Dec 28, 2018)

Perhaps but 4 GiB _and_ 3 GiB?  GDDR5 _and_ GDDR6?

A 2060 3 GiB GDDR6 and 6 GiB GDDR6 make sense (albeit stupid on the 3 GiB SKU) for a western release, and a 2060 4 GiB GDDR5 release for internet cafes.

Maybe two of these variants are actually 2050s?

Edit: No... the list is all of them exclusively for Gigabyte.  Only two logical conclusions:
a) the rumor is wrong or
b) Gigabyte has lost its marbles.

Having that many SKUs to support doesn't make business sense.


----------



## Tsukiyomi91 (Dec 29, 2018)

@FordGT90Concept it doesn't really make sense, unless Gigabyte thinks it's doing everyone a favor that no one asked for...


----------



## lexluthermiester (Dec 29, 2018)

CandymanGR said:


> An RTX 2080 has more bandwidth than a GTX 1080?


Yes it does. A lot. So it would be safe to conclude that a 2060 is going to have a similar increase compared to a 1060.
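The arithmetic behind that claim (launch-spec figures from memory, worth double-checking): both cards use a 256-bit bus, but the RTX 2080's 14 Gbps GDDR6 outruns the GTX 1080's 10 Gbps GDDR5X.

```python
# Peak bandwidth in GB/s: bus width in bytes times per-pin data rate in Gbps.
gtx_1080_gbs = 256 // 8 * 10   # 256-bit bus, 10 Gbps GDDR5X -> 320 GB/s
rtx_2080_gbs = 256 // 8 * 14   # 256-bit bus, 14 Gbps GDDR6  -> 448 GB/s
print(rtx_2080_gbs / gtx_1080_gbs)
```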



FordGT90Concept said:


> Oh look! A *6 year old* game (Max Payne 3) is using 3.2 GiB VRAM at 1920x1200!


Settings?


----------



## FordGT90Concept (Dec 29, 2018)

lexluthermiester said:


> Settings?


Max (har har) except for MSAA (4x) and tessellation (off).


----------



## gamerman (Dec 29, 2018)

For mid-range gaming, meaning FHD resolution, 3 GB of memory is more than enough, if someone doesn't know it.

And that 3 GB 2060 is aimed at exactly that target; it's still the fastest GPU for it, with 3 GB.

So if you use an FHD monitor, the 2060 with 3 GB is the best choice, because you get near 50 fps in almost all games at a low price.


----------



## lexluthermiester (Dec 29, 2018)

FordGT90Concept said:


> Max (har har) except for MSAA (4x) and tessellation (off).


Try this: turn the MSAA off, turn the shadows down to low, and turn the tessellation back on to normal. Granted, you're running a Radeon, but that shouldn't matter too much concerning memory load. Post a screenshot with those settings to see if the memory load drops. The reason I ask is as follows: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3 GB cards are good gaming cards when settings are configured well.


----------



## FordGT90Concept (Dec 29, 2018)

lexluthermiester said:


> Try this: turn the MSAA off, turn the shadows down to low, and turn the tessellation back on to normal. Granted, you're running a Radeon, but that shouldn't matter too much concerning memory load. Post a screenshot with those settings to see if the memory load drops. The reason I ask is as follows: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3 GB cards are good gaming cards when settings are configured well.


My card has 8 GiB.  Why would I gimp the game to save VRAM?  To be fair, the game mostly hovered around 2 GiB.

The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.


----------



## GhostRyder (Dec 29, 2018)

Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting only the different VRAM versions. It's going to be one heck of a confusing lineup, that's for sure.


----------



## bug (Dec 29, 2018)

GhostRyder said:


> Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting only the different VRAM versions. It's going to be one heck of a confusing lineup, that's for sure.


I'm betting support for both GDDR5 and GDDR6 is baked in because availability couldn't be predicted. In reality, we'll see one or the other taking the lion's share.


----------



## efikkan (Dec 30, 2018)

FordGT90Concept said:


> The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.


The point wasn't to make a game use more than 3 GB of memory; that's easy. The point is that most games can be configured to run well within those constraints.
Just because you can find a use for more than 3 GB doesn't mean everyone needs it.


----------



## FordGT90Concept (Dec 30, 2018)

Nor was I trying. I just happened to notice it; the thread was getting derailed, so I posted it.


----------



## EarthDog (Dec 30, 2018)

efikkan said:


> The point wasn't to make a game use more than 3 GB of memory; that's easy. The point is that most games can be configured to run well within those constraints.
> Just because you can find a use for more than 3 GB doesn't mean everyone needs it.


Many users like to crank the settings/ultra to get what they paid for too. 

We can turn things down to run on, and look like, a potato.


----------



## lexluthermiester (Dec 30, 2018)

EarthDog said:


> Many users like to crank the settings/ultra to get what they paid for too.


Not everyone.


EarthDog said:


> We can turn things down to run on, and look like, a potato.


Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning off or down certain settings to maximize performance on a lower-tier card is not going to have that result either.


----------



## bug (Dec 30, 2018)

lexluthermiester said:


> Not everyone.



I would expect a clueless user to just try various presets and settle for one. A more informed user will know how to tweak at least some of the settings. I wouldn't expect many users to just max out everything and refuse to play any other way, any more than I expect drivers to get behind the wheel and just press the pedal to the metal. Users who do that probably don't buy a mid-range card to begin with. But I have seen stranger things.



lexluthermiester said:


> Really? Turning off settings that don't mean much will not result in a game looking like a "potato". Further, turning off or down certain settings to maximize performance on a lower-tier card is also not going to have that result either.


It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.


----------



## lexluthermiester (Dec 30, 2018)

bug said:


> It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.


Good points all!


----------



## EarthDog (Dec 31, 2018)

lexluthermiester said:


> Not everyone.
> 
> Really? Turning off settings that don't mean much will not result in a game looking like a "potato". Further, turning off or down certain settings to maximize performance on a lower-tier card is also not going to have that result either.


I didn't say everyone. 

I didn't buy a PC to have it look worse than a console. Some need to... some choose to, others like ultra. It is what it is.


----------



## lexluthermiester (Dec 31, 2018)

EarthDog said:


> I didn't buy a PC to have it look worse than a console.


LOLOLOLOLOL!


----------



## FordGT90Concept (Dec 31, 2018)

I only tend to change graphics settings when the game looks atrocious on start (especially games defaulting to anything less than my monitor's native resolution).  That was the case with Max Payne (defaulted to 800x600).  Naturally, the game didn't know what an RX 590 was, so it defaulted to medium/low settings. That's when I turned everything up to max (hehe), checked the framerate, which was noticeably terrible at around 35 fps, and adjusted MSAA and tessellation down.  Obviously the game uses an NVIDIA-biased implementation of tessellation which hasn't been optimized for AMD in the last six years.

With newer games on newer cards, the defaults are usually good enough.  It's older games that don't know what the graphics card is that need tweaking.


----------



## GhostRyder (Dec 31, 2018)

bug said:


> I'm betting GDDR5 and GDDR6 support is baked in because availability couldn't be predicted. IRL we'll see one or the other taking the lion's share.


That's a fair point I had not considered.  Personally I just hope there really are not that many versions lol.


----------



## Divide Overflow (Dec 31, 2018)

Que the threads asking to "unlock" the extra memory in their lower tier models.


----------



## EarthDog (Dec 31, 2018)

Cue, even...but +1


----------



## B-Real (Jan 1, 2019)

lexluthermiester said:


> That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of the GTX10XX/Titan counterparts, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait an extra bit of time to save money for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.


So for you, the 20 series not even beating the 10 series the way the 900 series beat the 700 is still enough to say it's not the worst price-performance GPU ever made.


lexluthermiester said:


> Here's a set of facts;
> 1. Every generation of new GPU's get a price increase.


You are simply LYING. Comparing MSRP prices (that's what you have to do, not comparing a previous-gen GPU after its price drop to the new gen), there was a price decrease of $50-100 in the 700-900 switch (where there was a slightly bigger jump in performance), and a price increase of $50-100 in the 900-1000 switch, which brought a HUGE performance leap. There was minimal to no price jump with the 600-700 switch except for the $150 increase of the 780. There was also minimal to no increase in the case of the 500-600. And now we are speaking of a $100-300 (which in reality was more like $500) price jump. Don't you really feel how pathetic that is, or are you just an NV employee?

Plus, it wouldn't have been that bad if there had been a minimal price increase for the RTX series, let's say $50 for the 2070 and 2080. But whatever you say, just check TechPowerUp's poll before the release of RTX, check the stock market, check the general reception of the cards among potential customers, and you will know you are simply lying to yourself, too.


lexluthermiester said:


> The 2080 cleanly beats out the 1080 and beats out 1080ti if it doesn't match it. Also RTX offers advancements Pascal can not. The 2080/2080ti and RTX Titan are the best on the market. NVidia knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.


As some already reacted to this, I have to do that too: Wow, it cleanly beats out a 2.5-year-old card by nearly 30%. What a result! The 2080 is 1% faster than the 1080 Ti, which is essentially equal performance, so it doesn't beat it out. Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be OC'd better than the 2080.


lexluthermiester said:


> Try this, turn the MSAA off. Turn the shadows down to low. Turn the tessellation back on to normal. Granted you're running a Radeon, but shouldn't matter too much concerning memory load. Post a screen shot with those setting to see if the memory load drops. The reason I ask is as follows, now that GPU prices have come down for used 1060's, I've been installing then in client PC's and tweaking driver settings to keep performance good. So I know that 1060 3gb cards are good gaming cards when settings are config'd well.



So you are advising people who want to buy a ~$350+ 3GB 2060 (which is near the price of the 1070 with 8GB) to lower settings in FHD. LOL. No other words needed. I hope you only advise your customers Intel-NV rigs. 

The fact is that objectively the only really good point of the RTX series is the Founders Edition's solid cooling solution (in terms of noise and cooling performance) and neat look (subjective).


----------



## lexluthermiester (Jan 1, 2019)

B-Real said:


> So for you, the 20 series not even beating the 10 series the way the 900 series beat the 700 is still enough to say it's not the worst price-performance GPU ever made.


Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.


B-Real said:


> You are simply LYING.


Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?


B-Real said:


> As some already reacted to this, I have to do that too


Of course you would. Sure.


B-Real said:


> Wow, cleanly beats out a 2,5 year old card by nearly 30%.


It would seem you know how to read like an expert.


B-Real said:


> What a result! 2080 is 1% faster than the 1080Ti


So 30% is equal to 1%? Is that what you're saying?


B-Real said:


> which is totally equal in performance


Your math skills are dizzying!


B-Real said:


> so it doesn't beat it out.


Ok, sure.


B-Real said:


> Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be OC'd better than the 2080.


Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How are things going on that?


----------



## Vayra86 (Jan 1, 2019)

lexluthermiester said:


> Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.
> 
> Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?
> 
> ...



I'm not seeing much of a change. The topic went right back to shit the moment you started 'moderating' everything posted.

Suffice it to say, I'm out. Enjoy yourselves.


----------



## vip3r011 (Jan 1, 2019)

$250? Maybe I can dream for an RTX 2060 3GB.


----------



## remixedcat (Jan 2, 2019)

Nah, NVIDIA needs to have a card with over 30 variations like the Galaxy S4 lmao


----------



## RichF (Jan 4, 2019)

remixedcat said:


> Nah, NVIDIA needs to have a card with over 30 variations like the Galaxy S4 lmao


Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link



(eyeroll count not gratuitous)


----------



## remixedcat (Jan 4, 2019)

Wowzaz!!!


----------



## bug (Jan 4, 2019)

RichF said:


> Ask and ye shall receive...
> 
> Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link
> 
> ...


Yeah, there aren't 40 cards in there. Just an assumption, based on what _could_ vary between models.


----------



## Assimilator (Jan 7, 2019)

https://www.techpowerup.com/251236/nvidia-unveils-the-geforce-rtx-2060-graphics-card $349 for the top-end version with 6GB of GDDR6.


----------



## FordGT90Concept (Jan 11, 2019)

I talked to someone in the know about the theoretical case of a 32-bit D3D10 game and how that relates to VRAM.  Textures practically have to pass through RAM to reach VRAM.  The only way to avoid that is via DMA, which is very fringe stuff.

My understanding is that there's a variety of ways to make it crash, because that RAM limitation is absolute:
1) If you try to load too many resources, GART will overflow, crashing the executable.
2) If you try to load a huge asset (depending on conditions, but it could be smaller than 3 GiB in size), it will crash because the RAM can't hold the asset before handing it to the VRAM.
3) If you try to hold too many assets in RAM in transit to VRAM and fail to release the RAM fast enough, going over the virtual memory limit, it will crash.

In other words, even under 32-bit D3D10, you're dancing on a razor's edge when dealing with VRAM.  VRAM (unless DMA is used, and good luck with that) is practically limited by addressable RAM space.

Coming full circle, this is fundamentally why the Fury X's 4 GiB and the GTX 970's 3.5 GiB were okay a few years ago but not so much now.  Any game that might need a 64-bit address space is usually 64-bit.  The days of claustrophobic memory usage are gone.
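The address-space arithmetic behind this can be sketched in a few lines. This is a back-of-envelope illustration only: the texture dimensions, the mip-chain overhead, and the 2 GiB / 4 GiB user-space figures are assumptions about a typical 32-bit Windows process (the larger figure requires the large-address-aware flag on a 64-bit OS), not measurements of any specific game.

```python
# Back-of-envelope: why a 32-bit game process caps how much VRAM it can
# realistically fill when every texture must be staged through system RAM.

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate size of one RGBA texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# Usable address space of a 32-bit Windows process: 2 GiB by default,
# 4 GiB with the large-address-aware flag on a 64-bit OS.
user_space_default = 2 * 1024**3
user_space_laa     = 4 * 1024**3

tex = texture_bytes(4096, 4096)          # one 4K RGBA texture with mips
print(f"4096x4096 RGBA + mips: {tex / 1024**2:.0f} MiB")
print(f"Fit in 2 GiB space:    {user_space_default // tex} textures")
print(f"Fit in 4 GiB (LAA):    {user_space_laa // tex} textures")
```

Even before counting the executable, game logic, audio, and the API's own allocations, a couple of dozen large textures exhaust the 2 GiB case, which is roughly the "claustrophobic" situation described above.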


----------



## bug (Jan 12, 2019)

FordGT90Concept said:


> I talked to someone in the know about the theoretical case of a 32-bit D3D10 game and how that relates to VRAM.  Textures practically have to pass through RAM to reach VRAM.  The only way to avoid that is via DMA, which is very fringe stuff.
> 
> My understanding is that there's a variety of ways to make it crash because that RAM limitation is absolute.
> 1) If you try to load too many resources, GART will overflow crashing the executable.
> ...


DMA access used to be everywhere a while back; not sure whether Win10 restricts it somehow (and I wouldn't be surprised if it does).
As for loading huge textures, two things. First, there's this thing called streaming: you don't have to keep the entire thing in RAM at once to load it. Second, I'm pretty sure no game uses a 3GB piece of texture.
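A minimal sketch of what streaming means here, making no assumptions about any particular engine: process a large asset in fixed-size chunks so peak memory stays at the chunk size, never the whole file. The in-memory `BytesIO` buffer and the SHA-256 digest below are just stand-ins for a texture file on disk and whatever processing the loader does.

```python
# Streaming sketch: peak RAM use equals the chunk size, not the asset size.
import hashlib
import io

def stream_digest(f, chunk_size=1 << 20):
    """Hash a file-like object without loading it whole (1 MiB chunks)."""
    h = hashlib.sha256()
    while True:
        chunk = f.read(chunk_size)
        if not chunk:
            break
        h.update(chunk)
    return h.hexdigest()

# A 5 MiB in-memory "asset" stands in for a texture file on disk.
data = b"\x00" * (5 * 1024 * 1024)
digest = stream_digest(io.BytesIO(data))
assert digest == hashlib.sha256(data).hexdigest()  # same result as one-shot
```

The same chunked pattern applies whether the destination is a hash, a decompressor, or a buffer upload: a hypothetical 3 GB asset would still only ever occupy one chunk of RAM at a time.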


----------



## FordGT90Concept (Jan 12, 2019)

Developers use the methods provided to them by APIs like D3D, OpenGL, and Vulkan, which don't use DMA to move the textures.  DMA hasn't been used in earnest since before those APIs became common.  All of those APIs also support super-textures, either for sectioning (using segments of a larger image to reduce HDD/SSD load) or sky domes.

Yes, streaming, but most engines that are big on streaming (like UE4) and are of the era where >4 GiB of VRAM can be used are also already 64-bit, so it's a non-issue.  This is where you run into problems with graphics cards that have <4 GiB of VRAM, because the API has to shuffle assets between RAM and VRAM, which translates to stutter.
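That shuffle can be modeled with a toy LRU-evicting "VRAM" cache. The capacities and texture sizes below are arbitrary assumptions chosen only to make the thrashing visible; this is not a model of any real driver's residency policy.

```python
# Toy model of the RAM<->VRAM shuffle: a "VRAM" smaller than a frame's
# working set forces costly re-uploads every single frame.
from collections import OrderedDict

class VramCache:
    def __init__(self, capacity):
        self.capacity = capacity       # units of "VRAM"
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size, in LRU order
        self.uploads = 0               # RAM->VRAM transfers (the stutter source)

    def bind(self, tex_id, size):
        if tex_id in self.resident:                # already resident: cheap
            self.resident.move_to_end(tex_id)
            return
        while self.used + size > self.capacity:    # evict least recently used
            _, evicted = self.resident.popitem(last=False)
            self.used -= evicted
        self.resident[tex_id] = size
        self.used += size
        self.uploads += 1                          # costly bus transfer

# Each frame touches 5 textures of 1 unit each; "VRAM" holds only 4,
# so with LRU eviction every bind misses after the first frame.
cache = VramCache(capacity=4)
for frame in range(10):
    for tex in range(5):
        cache.bind(tex, 1)
print(cache.uploads)  # 50: the whole working set re-uploads every frame
```

With capacity 5 instead of 4, the same loop would perform only the 5 initial uploads; dropping just one unit below the working set turns every frame into a full re-upload, which is exactly why a card slightly short on VRAM stutters rather than degrading gracefully.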


----------



## bug (Jan 12, 2019)

FordGT90Concept said:


> Developers use the methods provided to them by APIs like D3D, OpenGL, and Vulkan, which don't use DMA to move the textures.  DMA hasn't been used in earnest since before those APIs became common.  All of those APIs also support super-textures, either for sectioning (using segments of a larger image to reduce HDD/SSD load) or sky domes.


That doesn't prevent said APIs from doing DMA internally.



FordGT90Concept said:


> Yes, streaming, but most engines that are big on streaming (like UE4) and are of the era where >4 GiB VRAM can be used are also already 64-bit so it's a non-issue.  This is where you run into problems with graphics cards that have  <4 GiB VRAM because the API is having to shuffle assets between RAM and VRAM which translates to stutter.



Streaming is the non-naive way to handle large assets in programming.


----------



## FordGT90Concept (Jan 12, 2019)

bug said:


> That doesn't prevent said APIs from doing DMA internally.


But they don't, and that's the point.  Everything in VRAM flows through RAM.

The argument I made before, that 32-bit and VRAM are intrinsically linked, is effectively true.  Now that the 32-bit barrier is gone, VRAM usage has soared in games that can use it.


----------

