# Digital foundry RTX 3080 early review



## Hugis (Sep 1, 2020)

Enjoy


----------



## ShurikN (Sep 1, 2020)

Impressive. 
Here's the article for those that don't want to watch the video








Hands-on with RTX 3080 - is this really Nvidia's biggest leap in gen-on-gen performance?

It's a time of change, of transition. With the arrival of the next generation consoles, the scale, scope and ambition o…

www.eurogamer.net


----------



## Tomgang (Sep 1, 2020)

That definitely looks very promising. I think Nvidia has a winner with the RTX 3080, and it even looks like Nvidia isn't lying in its press material: 60 to 100% faster than the RTX 2080, and by that measure even a bit faster still over my GTX 1080 Ti.

I think I have found the GPU to power my planned Zen 3 based system.


----------



## Fluffmeister (Sep 1, 2020)

Looks great, but I'm gonna wait for RDNA4, just to be sure.


----------



## ppn (Sep 1, 2020)

The 3080 is about as much faster than the 2080 as the 1080 was over the 980 - history repeats when there is a node shrink. A shrink with exactly the same 1.76x ratio of transistors per die area, and not surprisingly the performance hovers around 1.76x too. Only now it's 775mm² vs 627mm², where it used to be 398 vs 314 die sizes, four years later.
Unfortunately they can't keep getting bigger; the 5nm 40 series could see the return to 400mm² again for good.


----------



## mouacyk (Sep 1, 2020)

> Regardless, an overall uplift of 77.6 per cent gen-on-gen is good going - up there with the best of my recorded results in my GTX 980 to 1080 comparisons.


  A little bit disingenuous - but of course gamers are as dumb as they come. Going from 2080 to 3080 required an SKU upgrade to provide the same gen-over-gen improvement. Now, why did they have to do that... is there something lurking, and are they preparing to defend against it?


----------



## bubbleawsome (Sep 1, 2020)

mouacyk said:


> Little bit disingenuous, but of course gamers are as dumb as they come.  2080 to 3080 required an SKU upgrade to provide the same gen-over-gen improvement.  Now, why did they have to do that... is there something lurking about and they are prepared to defend against it?


What? They’re both xx80 and both cost the right amount.


----------



## Deleted member 24505 (Sep 1, 2020)

So 100% was optimistic


----------



## xkm1948 (Sep 1, 2020)

I see loads of Pascal and Vega folks jumping onto the 3080 this round.


----------



## MxPhenom 216 (Sep 1, 2020)

tigger said:


> So 100% was optimistic



But not that far away. They said 100% better for RTX-specific titles, not all titles.


----------



## Selaya (Sep 1, 2020)

> [ ... ]
> Full disclosure: I can bring you the results of key tests today, but there are caveats in place. Nvidia has selected the games covered, for starters, and specified 4K resolution to remove the CPU completely from the test results and in all cases, settings were maxed as much as they could be. The games in question are Doom Eternal, Control, Shadow of the Tomb Raider, Battlefield 5, Borderlands 3 and Quake 2 RTX. Secondly, frame-time and frame-rate metrics are reserved for the reviews cycle, meaning our tests were limited to comparisons with RTX 2080 (its last-gen equivalent in both naming and price) and differences had to be expressed in percentage terms
> [ ... ]


Yeah, a whole lotta RTX pad and nothings. I very much doubt non-RTX performance gains are all that significant. Also, it's compared against the 2080 (not even 2080S? I honestly don't know), not 2080Ti.


----------



## MxPhenom 216 (Sep 1, 2020)

Selaya said:


> Yeah, a whole lotta RTX pad and nothings. I very much doubt non-RTX performance gains are all that significant. Also, it's compared against the 2080 (not even 2080S? I honestly don't know), not 2080Ti.



Eurogamer is still seeing over 50% gains over the 2080 even in non-RTX titles.


----------



## Super XP (Sep 2, 2020)

Looks interesting. Waiting for RDNA2, and I also want to see more reviews and benchmarks.
Need to be sure there are no image quality issues with this Nvidia GPU.


----------



## iuliug (Sep 2, 2020)

Nvidia did not allow them to show actual frame rates, and picked the games. DF also acknowledged financial ties to Nvidia. Therefore these are super best-case scenarios.


----------



## Cvrk (Sep 4, 2020)

This is the SINGLE best piece of comedy released since yesterday. I smiled. It was that good. Hope you enjoy it as much as I did. Brilliant content.


----------



## Chomiq (Sep 4, 2020)

Hugis said:


> Enjoy


It's not a review, more like a preview. They weren't allowed to show fps numbers, and were limited by Nvidia in hardware choice, games and in-game settings.
On top of that, the video was sponsored by Nvidia.


----------



## bug (Sep 4, 2020)

Cvrk said:


> This is the SINGLE best piece of comedy released since yesterday. I smiled. It was that good. Hope you enjoy it as much as I did. Brilliant content.


Neah, that scene has been done to death. And the argument is stupid - picking on a new gen for having better perf for the same $$$. Like that never happened before.


----------



## Super XP (Sep 4, 2020)

Cvrk said:


> This is the SINGLE best piece of comedy released since yesterday. I smiled. It was that good. Hope you enjoy it as much as I did. Brilliant content.


They've used that video clip for years:
HD DVD vs. Blu-ray
PS1/2/3/4 vs. all the Xbox consoles
ATI vs. Nvidia
AMD vs. Intel
Pioneer vs. Panasonic
etc.
It's hilarious content.


----------



## Shatun_Bear (Sep 4, 2020)

I expect AMD's 6900 XT to be faster and to have 16GB, not a measly 10GB, which won't last with next gen starting imminently.

A $700-800 card (or more, let's be honest - stock is low, so the price will be north of $800 for most) with 10GB is a joke.


----------



## bug (Sep 4, 2020)

Shatun_Bear said:


> AMD's 6900XT I expect to be faster and have 16GB, not a measly 10GB which won't last with next gen starting imminently.
> 
> A $700-800 card (or more, let's be honest, there is low stock so the price will be north of $800 for most) with 10GB is a joke.


What is this next-gen you speak about? We already have 4k and DXR, anything else incoming that I'm not aware of?


----------



## Shatun_Bear (Sep 4, 2020)

bug said:


> What is this next-gen you speak about? We already have 4k and DXR, anything else incoming that I'm not aware of?



No, I meant 'next gen' games coming soon with the new consoles.

Consider that today's big AAA games, built for the ancient and slow PS4/XB1 with their netbook CPUs and HDDs, can already easily eat 6GB of your GPU's memory at 4K.

So can you imagine something built only for consoles that are about 5-6x faster than those, and what memory those games are going to eat at 4K on PC? You'll be pushing right up against that 10GB limitation.


----------



## bug (Sep 4, 2020)

Shatun_Bear said:


> No I meant 'next gen' games are coming soon with the new consoles.
> 
> To think the big AAA games that are built with the ancient and slow PS4/XB1 and their netbook CPUs and HDDs today can eat 6GB of your GPUs memory easily @ 4K.
> 
> So can you imagine something built only for consoles that are about 5-6X faster than those, and what memory these games are going to eat @ 4K on PC? You'll be pushing right up against that 10GB limitation.


Neah, crappy ports will cripple performance, but they don't eat into VRAM.
Sure, PC ports tend to get higher-res texture packs, but if the PS5 gets by with 16GB of (shared) RAM, 10GB of dedicated VRAM will be more than enough.


----------



## Shatun_Bear (Sep 5, 2020)

bug said:


> Neah, crappy ports will cripple performance, but they don't eat into VRAM.
> Sure, PC ports tend to get higher-res texture packs, but if the PS5 gets by with 16GB of (shared) RAM, 10GB of dedicated VRAM will be more than enough.



I really don't think so - not once we see the first next-gen-only title next year. We'll have to wait for benchmarks.


----------



## MxPhenom 216 (Sep 5, 2020)

Shatun_Bear said:


> AMD's 6900XT I expect to be faster and have 16GB, not a measly 10GB which won't last with next gen starting imminently.
> 
> A $700-800 card (or more, let's be honest, there is low stock so the price will be north of $800 for most) with 10GB is a joke.



I think you're in for some disappointment.


----------



## Shatun_Bear (Sep 5, 2020)

MxPhenom 216 said:


> I think you're in for some disappointment.



I'm pretty confident, as a couple of leaks from Ro_game (who is accurate) and RGT suggest there will even be a competitor for the 3090... hold on to your buttcheeks, the ride is about to start.


----------



## Amite (Sep 5, 2020)

Just canceled my bid on eBay for a used 2080 Ti @ $620. They may be $550 in two weeks.


----------



## bug (Sep 6, 2020)

Shatun_Bear said:


> I really don't think so. Not when we see the first next gen only title next year, we'll have to wait for benchmarks.


Have you just thwarted your own argument by admitting the 3080 is good for over a year from now?


----------



## John Naylor (Sep 6, 2020)

Shatun_Bear said:


> AMD's 6900XT I expect to be faster and have 16GB, not a measly 10GB which won't last with next gen starting imminently... with 10GB is a joke.



Déjà vu all over again... been hearing this tune played for almost a decade now: "What, are you nuts? 2GB ain't enough for the 770."

GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech
"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference... This leaves five games out of 30 where a 4GB GTX 770 gives more than a 1 frame per second difference over the 2GB-equipped GTX 770.  And one of them, _Metro: Last Light_ still isn’t even quite a single frame difference.  ... There is one last thing to note with _Max Payne 3_:  It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB.  However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting.  And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. "

Gigabyte GeForce GTX 960 G1 Gaming 4GB review - VRAM Analysis 2GB vs 4GB - Alien Isolation
Video Card Performance: 2GB vs 4GB Memory - Puget Custom Computers
Is 4GB of VRAM enough? AMD’s Fury X faces off with Nvidia’s GTX 980 Ti, Titan X | ExtremeTech

"We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.” Our own testing backed up this claim. Just like having more cores, this only matters if it performs differently."

As AlienBabelTech showed, despite the claimed limitation that Max Payne could not be played at 5760×1080 with the 2GB 770, they fooled the system: when they switched the 4GB card out for the 2GB one, it played just fine. Likewise, despite the claim at the time that the 3GB 1060 was not enough for 1080p, TPU's test results proved that wrong. The two cards actually had different GPUs (the 3GB version has 11% fewer shaders), yet gaming performance differed by only 6% at 1080p, and the 3GB card maintained its relative performance against the 6GB version at 1440p. If the claim that 3GB was not enough were true, there is no way the relative performance between the 3GB and 6GB cards could stay the same at both 1080p and 1440p.

Yes, we've seen the YouTubers looking to get their 15 minutes by "proving" the VRAM can be exceeded with repeated zoom-ins and zoom-outs, which makes the graphics card request more memory allocations. When playing a game without these shenanigans, those conditions simply do not occur. So the point is that one of two things has to happen to create a VRAM problem: a) a poor console port, or b) weird conditions have to be "created" with intent. Whenever we have had the opportunity to compare otherwise identical cards, as in the links above, it has been a non-issue.

And who's to blame for vendors raising prices?

a)  Early adopters who need to be first on the block with the new shiny thing, continually refreshing vendor web pages and willing to settle for any card they can get.
b)  Competition that can't put out a competitive product to force prices down.
c)  A consumer base willing to pay higher prices because demand for the cards exceeds supply - they sell faster than they can be made.

A vendor has no incentive to drop prices on cards he has trouble keeping in stock.


----------



## Rob94hawk (Sep 6, 2020)

ShurikN said:


> Impressive.
> Here's the article for those that don't want to watch the video
> 
> 
> ...



If the 3080 is that good getting the 3090 is just for bragging rights.


----------



## arbiter (Sep 6, 2020)

Amite said:


> Just canceled my bid on Ebay for a used 2080 ti @ 620.00 . They may be 550.00 in 2 weeks.


If 3070 reviews put it faster than the 2080 Ti, then $400-450 is probably more in line with where prices would drop to.


----------



## F7GOS (Sep 6, 2020)

Amite said:


> Just canceled my bid on Ebay for a used 2080 ti @ 620.00 . They may be 550.00 in 2 weeks.



For what it's worth, I've picked one up for less than £500 this week, so there are a few sellers out there desperate to sell based on the slides shown.

Will be very interested to see how a 2080 Ti really stacks up against the 3070 in non-DXR titles.


----------



## ThrashZone (Sep 6, 2020)

Rob94hawk said:


> If the 3080 is that good getting the 3090 is just for bragging rights.


Hi,
The jury is still out, but I was thinking the 3090 is more an SLI'd 3080 than a 30-series Titan - that has yet to drop.


----------



## MxPhenom 216 (Sep 6, 2020)

ThrashZone said:


> Hi,
> Jury still out but 3090 is more a sli 3080 I was thinking not a 30 titan, that has yet to drop.



I mean, Jensen pretty much said it's the new Titan in the announcement. It doesn't have two GPUs on one PCB.


----------



## Vayra86 (Sep 7, 2020)

I'd advise everyone to hold back on the 3080. It's likely to be eclipsed by something meant to undercut Navi, and that can go both ways: either a cheaper ($550, $600?) 3070 Ti or some form of higher-VRAM 3080 Ti. I reckon the x70 Ti is likely, as the performance gap between x80 and x90 isn't too large.

If the x70 is a full GA104, that means an x70 Ti would be a further-cut GA102. And beyond the x80, there is also wiggle room for a 6-block (instead of 5.5) GA102. Additionally, there is wiggle room in VRAM speeds - GDDR6, GDDR6X, and further improvements on both, as they are still fresh.

The current stack is very visibly incomplete: the price gap between x70 and x80 is too large, and the 10GB is too oddball to stand for long. Looking at how that die is cut, it doesn't seem like a very efficient way to push that product either. It's clear that yields are likely to improve too, which gives another opportunity to fill out the stack with more competitive products.

Above all, I think it's clear the 10GB is suboptimal.



John Naylor said:


> Déjà vu all over again... been hearing this tune played for almost a decade now: "What, are you nuts? 2GB ain't enough for the 770."
> 
> GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech
> "There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference... This leaves five games out of 30 where a 4GB GTX 770 gives more than a 1 frame per second difference over the 2GB-equipped GTX 770.  And one of them, _Metro: Last Light_ still isn’t even quite a single frame difference.  ... There is one last thing to note with _Max Payne 3_:  It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB.  However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting.  And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. "
> ...



TL;DR
RTX IO is proving you wrong. Nvidia needs that route to bypass their VRAM limitations, and they need it because:
A: consoles will heavily utilize SSDs as secondary VRAM;
B: they have access to 12-13GB.

Stop looking back - look at the present day, where VRAM capacities have doubled gen over gen since Maxwell, and that 1060 3GB is now widely considered subpar while its 6GB brother is future-proof. Kepler comparisons are not going to work for you anymore; it's a different age, and we're exploring new ways to push new technologies forward. RT is just a tiny part of that - streaming game engines are extremely common now and demand a different hardware balance, with VRAM capacity having an important and direct impact on performance.

You also forget to ask _why_ game engines load everything into VRAM that they can. It's so they can reduce the latency to produce another frame, and to make sure they are not in the way of any core GPU processing work. It's pretty odd to think there is no performance impact from having to do more swaps. The bottom line is that 10GB on a top-end GPU *is a compromise.* There's just no other way to explain it. $700 is a lot of money for compromises.

Here's a 2020 video, instead of a 2015 (!!!) one, comparing the Fury X with a 980 Ti and a 2060.
4 vs 6 = the gap between sub-60 with lots of variance and a locked 60 at 1440p, for example. Or between dropping into complete oblivion (10 FPS range) and having a somewhat playable slideshow (20-30 FPS) at 4K.
You like math... do it without blinders on.










Time to change perspective - here are GPUs with similar core power but a 2GB VRAM gap. The 2060 is obviously 20-30% faster. The 980 Ti is not, yet it still rocks while the Fury X is turning to shit.






And just to drive it home for you... here's that beloved 1060, in 2020, at a very reasonable 1080p. That's a big gap for a few shaders, I'd say.


----------



## ppn (Sep 7, 2020)

Whatever Nvidia releases, a 3080 on a big 628mm² chip means the 5nm Hopper shrink is closer than we think - Q1 2022. So no time for a 3080 Super.



MxPhenom 216 said:


> I mean, Jensen pretty much said it's the new Titan in the announcement. It doesn't have two GPUs on one PCB.


It's a 3080 that carries an NVLink connector.


----------



## Vya Domus (Sep 7, 2020)

Vayra86 said:


> And just to drive it home for you... here's that beloved 1060, in 2020, at a very reasonable 1080p. That's a big gap for a few shaders I'd say.



Leave him be, he's gonna post another block of text.


----------



## bug (Sep 7, 2020)

Vya Domus said:


> Leave him be, he's gonna post another block of text.


I really don't see what the VRAM fuss is about. If you want to cling onto your video card for a few more years, you'll have to lower details past some point. Whether because of VRAM, raw HP or missing features, it's going to be obsolete.
I've always bought mid-range cards, never cared about VRAM size, and have always been able to skip at least one generation before upgrading again. I imagine it would have been the same had I bought high-end instead.


----------



## Vayra86 (Sep 7, 2020)

bug said:


> I really don't see what the VRAM fuss is about. If you want to cling onto your video card for a few more years, you'll have to lower details past some point. Whether because of VRAM, raw HP or missing features, it's going to be obsolete.
> I've always bought mid-range cards, never cared about VRAM size, and have always been able to skip at least one generation before upgrading again. I imagine it would have been the same had I bought high-end instead.



Aren't you also still on 1080p?

Anyone looking to use a 3080 at 4K should definitely consider the impact of its VRAM limit. It's not a linear thing throughout gens and price tiers... most of the time it's balanced out just right, like you say. Some products, however, really are not. For 1440p, definitely, 8-10GB will do fine over its lifetime.


----------



## bug (Sep 7, 2020)

Vayra86 said:


> Aren't you also still on 1080p?
> 
> Anyone looking to use a 3080 at 4K should definitely consider the impact of its VRAM limit. It's not a linear thing throughout gens and price tiers... most of the time it's balanced out just right, like you say. Some products, however, really are not. For 1440p, definitely, 8-10GB will do fine over its lifetime.


Like I said, I have always been able to disregard VRAM size when buying. I'm simply looking for something that is at least 20% faster than what I already have and that plays _current_ games well.

As far as this card is concerned, by the time 10GB is inadequate you'll have upgraded anyway, because further generations will have significantly beefed up RTRT.
Also, the solution to not enough VRAM is dead simple: turn down texture detail a notch or two.


----------



## Vayra86 (Sep 7, 2020)

bug said:


> Like I said, I have always been able to disregard VRAM size when buying. I'm simply looking for something that is at least 20% faster than what I already have and that it plays well _current_ games.
> 
> As far as this card is concerned, by the time 10GB will be inadequate, you'll have upgraded because further generations will have significantly beefed up RTRT anyway
> Also, the solution to not enough VRAM is dead simple: turn down texture detail a notch or two.



Obviously you can fix it. But that implies there is something to fix, which is odd for an x80 card and not something we're generally used to seeing from Nvidia's high end. The easy comparison is the 1080 Ti, which already had more VRAM with much lower core performance. It's very hard to defend that the 3080 still has 'good balance' with 1GB less and a performance level nearly 75% higher (or is it even more?). Oh - AND additional load from RT.

Also... lol @ texture quality reduction... that's not why people buy into high-end tiers. For a midranger, yes, that's the balancing act you paid for. Above that, it's never been necessary and simply shouldn't be.


----------



## bug (Sep 7, 2020)

Vayra86 said:


> Obviously you can fix it. But that implies there is something to fix, which is odd for an x80 card and not something we're generally used to seeing from Nvidia's high end. The easy comparison is the 1080 Ti, which already had more VRAM with much lower core performance. It's very hard to defend that the 3080 still has 'good balance' with 1GB less and a performance level nearly 75% higher (or is it even more?). Oh - AND additional load from RT.


There's no correlation between the raw HP and VRAM size. I can write you a Minesweeper game that runs on potatoes and pair it with a 500GB texture that won't fit onto any current video card. (Won't prove anything  at the end of the day)


Vayra86 said:


> Also... lol @ texture quality reduction... that's not why people buy into high-end tiers. For a midranger, yes, that's the balancing act you paid for. Above that, it's never been necessary and simply shouldn't be.


Well, if you hang onto it long enough, it won't be high-end anymore, so I guess my reasoning stands.


----------



## Shatun_Bear (Sep 7, 2020)

bug said:


> Have you just thwarted your own argument by admitting the 3080 is good for over a year from now?



So it's a good investment of $700-900 (AIB or real prices) to buy a graphics card that will be limited after just one year?

Anyway, this defence force for 10GB already looks silly, as a 3070 Ti with 16GB was mistakenly listed early in a Lenovo pre-built. And the 20GB 3080s are coming soon too.


----------



## MrGRiMv25 (Sep 7, 2020)

I'd say the VRAM issue is pretty irrelevant apart from a few edge cases. Gears 5 didn't use more than ~6GB of VRAM at 4K, and neither did Rage 2 and quite a few other games, according to many TPU benchmarks over the past couple of years. Even Control's benchmark at TPU was measured at around 6.5GB at 4K.

Sure, there might be a couple that reserve more VRAM than they need, but I don't think it's as critical as many seem to think. It might start becoming a bit of a squeeze over the next year or two, but by then you'd probably need a new card anyway to get max performance. Besides, if the rumours are true, the people that want more VRAM can just wait until the 16/20GB cards come out.


----------



## bug (Sep 7, 2020)

Shatun_Bear said:


> So it's a good investment of $700-900 (AIBs or real prices) to buy a graphics card that will be limited after just one year?
> 
> Anyway this defence force for 10GB is already looking silly as a 3070 Ti with 16GB was mistakenly listed early in a Lenovo pre-build. And the 20GB 3080s are coming soon too.


I'm obviously no match for your future-predicting skills, so I'll just stop here.


----------



## hat (Sep 7, 2020)

I'd also agree that the VRAM issue isn't all it's cracked up to be. It's already been proven that games can allocate more VRAM than is actually in use - much like Windows gobbles up more RAM than necessary: it loads more stuff because the memory is available, and frees it when a demanding application that requires it is launched.


----------



## bug (Sep 7, 2020)

MrGRiMv25 said:


> I'd say the VRAM issue is pretty irrelevant apart from a few edge cases, Gears 5 didn't use more than 6~GB of VRAM at 4K, neither did Rage 2 and quite a few other games according to many TPU benchmarks over the couple year or so. Even Control's benchmark at TPU was measured at using around 6.5GB at 4K.


If only it were so simple. Programs always allocate memory before using it, and many allocate in advance to avoid frequent allocation requests. Allocated != used, as proven by several game tests right here, where games shown to "use" in excess of 8GB of VRAM ran with no penalty on cards with only 6GB (numbers pulled out of my rear, I don't remember them exactly).

I obviously wouldn't defend a 3080 with 4GB of VRAM, but all things considered, 10GB of VRAM will be a limitation for number crunchers sooner than it will affect gamers, imho.
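The allocated-versus-used distinction can be sketched with a toy model (purely illustrative - the `AssetCache` class and all sizes here are made up, and real engines and drivers are far more involved): a cache that reserves a large pool up front looks "full" to a monitoring tool, while the assets actually resident stay well below the pool size, and a smaller pool simply forces eviction rather than failure.

```python
# Illustrative sketch, not actual engine or driver code: an asset cache that,
# like many game engines, grabs a large pool up front but only touches part
# of it. "Allocated" is what a monitoring tool would report; "used" is the
# working set that actually matters for performance.

class AssetCache:
    def __init__(self, pool_mb):
        self.pool_mb = pool_mb      # reserved up front ("allocated")
        self.resident = {}          # assets actually uploaded ("used")

    def load(self, name, size_mb):
        # A smaller pool forces eviction, not a crash.
        if self.used_mb() + size_mb > self.pool_mb:
            self.evict(size_mb)
        self.resident[name] = size_mb

    def evict(self, needed_mb):
        # Drop oldest-loaded assets until the new one fits.
        for name in list(self.resident):
            if self.used_mb() + needed_mb <= self.pool_mb:
                break
            del self.resident[name]

    def used_mb(self):
        return sum(self.resident.values())

cache = AssetCache(pool_mb=10_000)      # a tool would report ~10 GB "in use"
for i in range(8):
    cache.load(f"texture_{i}", 800)     # but only 6.4 GB is actually resident
print(cache.pool_mb, cache.used_mb())   # prints: 10000 6400
```

The same workload on a card with a smaller pool just evicts older assets instead of failing, which is roughly why games that "use" 8GB on one card can still run fine on a 6GB one.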


----------



## Shatun_Bear (Sep 7, 2020)

The fanaticism for Nvidia is so strong that people are defending these products without them even being on the market or in their hands. Instead of asking questions and trying to peer through Nvidia's marketing the other night, to get a better understanding of what we'll be buying, the fanboys have risen up in staunch defence, telling us to shut up and accept everything Nvidia says.

One example: Nvidia's claim that the 3080 is 2x the performance of the 2080. I've seen this marketing line repeated across multiple forums over the last few days as justification for a prospective 3080 purchase. But even in Nvidia's own cherry-picked titles and settings, the games benched show gains of ~80%. Best case. So what is this 2x, or 100%, performance uplift claim about? It's complete BS.

Then there are the others - '10GB is perfect', '30 teraflops' - mixed in with fanboy claims like 'DLSS will be in every game', 'AMD won't compete with the 3070', '380W is no problem', etc.


----------



## bug (Sep 7, 2020)

Shatun_Bear said:


> The fanaticism for Nvidia is so strong that people are defending these products without them even being on the market or in their hands. Instead of asking questions and trying to peer through Nvidia's marketing the other night, to get a better understanding of what we'll be buying, the fanboys have risen up in staunch defence, telling us to shut up and accept everything Nvidia says.
> 
> One example: Nvidia's claim that the 3080 is 2x the performance of the 2080. I've seen this marketing line repeated across multiple forums over the last few days as justification for a prospective 3080 purchase. But even in Nvidia's own cherry-picked titles and settings, the games benched show gains of ~80%. Best case. So what is this 2x, or 100%, performance uplift claim about? It's complete BS.
> 
> Then there are the others - '10GB is perfect', '30 teraflops' - mixed in with fanboy claims like 'DLSS will be in every game', 'AMD won't compete with the 3070', '380W is no problem', etc.


Did it ever occur to you that more level-headed guys are just waiting for reviews, before deciding one way or another? Unlike those that will pick on anything just because?

Do you have any idea how you sound picking on a card because it is "only" 80% faster than its predecessor at the same price point?


----------



## Cvrk (Sep 8, 2020)




----------



## bug (Sep 8, 2020)

I just realized I have no idea how the mid-range will look this time around. Most likely Nvidia won't pull another 1660+2060 stunt again. And there's so much room below that $500 price point.


----------



## ThrashZone (Sep 8, 2020)

MxPhenom 216 said:


> I mean, Jensen pretty much said it's the new Titan in the announcement. It doesn't have two GPUs on one PCB.


Hi,
He said Titan performance, not a replacement or a new Titan.


----------



## xtreemchaos (Sep 8, 2020)

WOW, that's doggone fantastic! I was going to go for a 3070, but after seeing the vid I'm thinking of paying the extra and going for that monster. I haven't seen any benchmarks for the 3070 yet though, so I could change my mind again.


----------



## P4-630 (Sep 8, 2020)

bug said:


> I just realized I have no idea how the mid-range will look this time around. Most likely Nvidia won't pull another 1660+2060 stunt again. And there's so much room below that $500 price point.



Rumors said even lower-spec GPUs will get RT. Not sure about that, but we'll see.


----------



## phanbuey (Sep 8, 2020)

This is the same as the Pascal/1080 launch...

If there is a 16GB 3070 Ti (as per the Lenovo leak), there will be a 20GB 3080 Ti released a bit later, and the current 3080 will move into the upper mid-range. They left a gigantic hole between the 3090 and the 3080/3070 for additional SKUs, depending on what AMD brings to the table.


----------



## Bubster (Sep 8, 2020)

Happy with my 1080 SLI - I get 70-80 fps @ 4K. I don't see the point in wasting money on the gimmick that RTX was... The 3090, on the other hand, is a beast - 8K @ 60fps and the RTX IO thing - but it will be super pricey. Good things come to those who wait.


----------



## Vayra86 (Sep 8, 2020)

MrGRiMv25 said:


> I'd say the VRAM issue is pretty irrelevant apart from a few edge cases, Gears 5 didn't use more than 6~GB of VRAM at 4K, neither did Rage 2 and quite a few other games according to many TPU benchmarks over the couple year or so. Even Control's benchmark at TPU was measured at using around 6.5GB at 4K.
> 
> Sure there might be a couple that reserve more VRAM than they need but I don't think it's as critical as many seem to think. It might start becoming a bit of a squeeze over the next year or two but by then you'd probably need a new card anyway to get max performance. Besides, if the rumours are true then the people that want more VRAM can just wait until the cards with 16/20GB VRAM come out.



It's not _critical_ - at least, that is not what I am personally saying about the 10GB on this specific card.

It is definitely _suboptimal_, though, which implies it's probably wise to wait a little for something with not 10GB but 12-16GB, so you have parity with the consoles going forward and more headroom than you had on *much weaker* GPUs that already sported more. Nvidia didn't put 11GB on those because they thought it was a cool number; they did it because they considered it a good balance with the core power on tap. I seriously do NOT understand why people defend the idea that a GPU twice as fast can then make do with less 'and it's fine'. It really isn't. It is the thing that will limit this card and make it obsolete - not core power. You will most likely have that in abundance, but could still fall short on VRAM. Put that alongside the perception of it being 'a lot of GPU for 700'... well, I'm sure people can put two and two together here. The incentive is lots of power, but with a hard limit down the line. Resale value of this 3080 might very well plummet a few years from now - in fact, I'm sure it will. This is no 1080 Ti; this is a weird cut that won't last. If you thought the 2080 Ti's resale value dropping so hard right now was something, be prepared for the 3080 to do the same, and probably sooner.

This is especially true when you start modding: when you start adding assets to games, VRAM requirements explode and can quite easily double. In a game like Gears of War, or some simulation or city-builder title that normally makes do with 4-6GB, a doubling will suddenly give you a 700-dollar GPU that is pretty useless - potentially even more useless than an 11GB 1080 Ti from three years ago. That is strange, and defending it simply makes no sense. Dropping detail that otherwise takes very little GPU power just to accommodate limited VRAM is another such weird argument... you don't do that on high end, because you shouldn't have to.

Nvidia has always cut corners on VRAM to limit the potential of a specific card - whether in speeds, in capacities, or by serving us an asymmetrical bus that creates a performance impact (550 Ti, 660, 970... the list is long).


----------



## Shatun_Bear (Sep 8, 2020)

phanbuey said:


> this is the same as the pascal / 1080 launch...
> 
> If there is a 16gb 3070 ti (as per the lenovo leak) there will be  a 20Gb 3080ti, just released a bit later, and the current 3080 will move into the upper mid range.  They left a gigantic hole between the 3090 and the 3080/3070 for additional SKUs depending on what AMD brings to the table.



Yep, I think the 3080 is just to snare the early adopter suckers who'll buy it regardless, even 10GB for $700-900. Then a few weeks later Nvidia will officially reveal the 3070 Ti and 3080 Ti which will be far better deals.


----------



## Vayra86 (Sep 8, 2020)

Shatun_Bear said:


> Yep, I think the 3080 is just to snare the early adopter suckers who'll buy it regardless, even 10GB for $700-900. Then a few weeks later Nvidia will officially reveal the 3070 Ti and 3080 Ti which will be far better deals.



The 3080 is preliminary fire for Navi 20.

And an early adopter tax, yes.


----------



## bug (Sep 8, 2020)

Vayra86 said:


> It is definitely _sub optimal_ though, so that could imply it's probably wise to wait it out a little bit for something that is not 10GB, but 12-16 so you can have parity with consoles going forward and also more headroom than you had on *much weaker* GPUs that already sported more. Nvidia didn't put 11GB on those because they thought it was a cool number. They did it, because they considered it good balance with the core power on tap.


Honestly, the amount of VRAM is dictated by the memory ICs and the memory interface. Nvidia doesn't get that much headroom. They just needed something between the 3090's 24GB and the presumably 6-8GB of their midrange, and for this generation that ended up at 10GB.
They can't widen the memory bus (it would require unlocking as many memory blocks as on a 3090 and a more expensive PCB; not impossible, but probably not practical at this point), so the only way they could offer more VRAM is by using higher-capacity chips. 20GB sounds good, but it depends on the availability of said chips.
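In numbers: GDDR6X uses one 32-bit channel per memory device, with 1GB per channel available today and 2GB as the rumoured route to the bigger cards. A minimal sketch of how capacity falls straight out of bus width (the helper name is mine):

```python
# VRAM capacity follows from bus width: one 32-bit channel per GDDR6X device,
# with 1 GB per channel today (2 GB via denser chips or clamshell mounting).
def vram_options(bus_width_bits, gb_per_channel=(1, 2)):
    channels = bus_width_bits // 32
    return [channels * c for c in gb_per_channel]

print("320-bit (RTX 3080):", vram_options(320))  # [10, 20]
print("384-bit (RTX 3090):", vram_options(384))  # [12, 24]
```

So a 10GB card can only grow to 20GB without touching the bus, which is exactly the rumoured variant.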


----------



## Vayra86 (Sep 8, 2020)

bug said:


> Honestly, the amount of VRAM is dictated by ICs and memory interface. Nvidia doesn't get that much headroom. They just needed something between 24 and presumably the 6-8GB of their midrange and for this generation that ended up on 10GB.
> They can't widen the memory bus (it would require unlocking as many blocks as on a 3090 and a more expensive PCB - not impossible, but probably not practical at this point), so they could only offer more VRAM is using higher capacity chips. 20GB sounds good, but it depends on availability of said chips.



Conclusion: it's a sacrifice made to accommodate the weird GA102 cut they made, and thus sub-optimal.
Thanks.

All they had to do was go for 6 memory blocks instead of 5.5 and they'd have had 12GB and 24GB cards instead of this. Yields don't allow it. In the end that is purely an economic decision; in other words, WE are footing the bill to give Nvidia the ability to make that decision and deliver the GPU at 700. And they want to deliver at 700 because that is the best sales outlook for THEM.


----------



## Shatun_Bear (Sep 8, 2020)

Vayra86 said:


> 3080 is preliminary fire for Navi 20.
> 
> And early adopter tax yes



Likely to combat Navi, yes. It seems the launch has been rushed out the gate early. I question why the review embargo lifts so close to release day. If these cards are the biggest jump ever for Nvidia, why not let reviewers be wowed by their performance a week before launch?


----------



## bug (Sep 8, 2020)

Shatun_Bear said:


> Yep, I think the 3080 is just to snare the early adopter suckers who'll buy it regardless, even 10GB for $700-900. Then a few weeks later Nvidia will officially reveal the 3070 Ti and 3080 Ti which will be far better deals.


Unless RDNA2 turns up fantastic, there will be no refresh of these parts sooner than next year.


----------



## phanbuey (Sep 8, 2020)

Shatun_Bear said:


> Yep, I think the 3080 is just to snare the early adopter suckers who'll buy it regardless, even 10GB for $700-900. Then a few weeks later Nvidia will officially reveal the 3070 Ti and 3080 Ti which will be far better deals.



They love to do this to their high-end customers; they did it with the 9xx and 10xx series... just imagine all the guys buying a $1400 card. They're gonna sound like the people who bought the Titan when the 1080 Ti came out with Pascal...


----------



## Vayra86 (Sep 8, 2020)

phanbuey said:


> They love to do this to their high-end customers; they did it with the 9xx and 10xx series... just imagine all the guys buying a $1400 card. They're gonna sound like the people who bought the Titan when the 1080 Ti came out with Pascal...



I call that customer due diligence, honestly. It's what I'm trying to inspire here too.


----------



## Assimilator (Sep 8, 2020)

Shatun_Bear said:


> Likely to combat Navi, yes. It seems the launch has been rushed out the gate early. I question why review embargo lifts so close to release day. If these cards are the biggest jump ever for Nvidia, why not let reviewers be wowed by their performance a week before launch?



Get yourself a dictionary and look up the word "hype".

Or, just look at this thread. How many people are talking about Ampere? Almost everyone. How many are talking about RDNA2? Almost nobody. And even those are *reacting* to Ampere.

NVIDIA has always been strong at controlling perception, getting and keeping their products in the public eye. That isn't rushing, that's good marketing. Something AMD could do well to learn from.


----------



## nguyen (Sep 8, 2020)

PCIe 4.0 x16 will pretty much make a big VRAM pool pointless. Just like with the 5500 XT 4GB, faster PCIe bandwidth can help compensate for lower VRAM.

Spending that extra money on faster RAM makes more sense. After all, faster RAM leads to better 1% low FPS in every game.


----------



## Assimilator (Sep 8, 2020)

nguyen said:


> PCIe 4.0 x16 will pretty much make a big VRAM pool pointless. Just like with the 5500 XT 4GB, faster PCIe bandwidth can help compensate for lower VRAM



You evidently have no grasp of fundamental issues like "latency" and "clock cycles".


----------



## bug (Sep 8, 2020)

phanbuey said:


> They love to do this to their high-end customers; they did it with the 9xx and 10xx series... just imagine all the guys buying a $1400 card. They're gonna sound like the people who bought the Titan when the 1080 Ti came out with Pascal...


Do what to their customers? Enable more of the chips once yields pick up? Bastards!


----------



## kapone32 (Sep 8, 2020)

MxPhenom 216 said:


> Eurogamer is seeing performance of non RTX still over 50% over the 2080.


Isn't that the same as the 2080TI?


----------



## MxPhenom 216 (Sep 8, 2020)

kapone32 said:


> Isn't that the same as the 2080TI?



Not exactly









NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB Review
NVIDIA debuted its Turing graphics architecture today, straightaway with the flagship RTX 2080 Ti. This card packs the promise of real-time ray tracing at 4K UHD, besides huge gains in performance. NVIDIA also put out its best cooler design since TITAN, commanding a very high price for some very...
www.techpowerup.com


----------



## kapone32 (Sep 8, 2020)

MxPhenom 216 said:


> Not exactly
> 
> 
> 
> ...


Thanks. I'm more impressed by the price. I am waiting for RDNA2, but if I were going to buy Nvidia I would get the 3080.


----------



## phanbuey (Sep 8, 2020)

bug said:


> Do what to their customers? Enable more of the chips once yields pick up? Bastards!



Yes... it's the "yields" that cause them to come out with the $500-more-expensive card first, and then, once that pool of suckers is exhausted, release the same thing for $500 less... They're definitely not holding it back as part of their market strategy.


----------



## bug (Sep 8, 2020)

phanbuey said:


> Yes... it's the "yields" that cause them to come out with the $500-more-expensive card first, and then, once that pool of suckers is exhausted, release the same thing for $500 less... They're definitely not holding it back as part of their market strategy.


I'm thinking you don't realize that without the initial yields (i.e. actually building the stuff), there would be no improved yields to release later on.


----------



## phanbuey (Sep 8, 2020)

bug said:


> I'm thinking you don't realize without the initial yields (i.e. actually building the stuff), there would be no improved yields to release later on.



That's a given, but the costs of the initial yields are usually spread out across the product line... i.e. you don't pay back the millions it took to make the first batch of chips by pricing those chips at $1m a pop. That spread is part of your business strategy, and with Nvidia it is massively front-loaded: very expensive cards come out first, because they know their die-hard early-adopter base will sell kidneys to get their hands on them, and then virtually the same card arrives at a 50-60% markdown in a Ti variant later.


----------



## bug (Sep 8, 2020)

phanbuey said:


> That's a given, but the costs of the initial yields are usually spread out across the product line... i.e. you don't pay back the millions it took to make the first batch of chips by pricing those chips at $1m a pop. That spread is part of your business strategy, and with Nvidia it is massively front-loaded: very expensive cards out first, as they know their die-hard early-adopter base will sell kidneys to get their hands on their cards, and then virtually the same card with a 50-60% markdown in a Ti variant after.


I was talking about the improvements you make after you build like a million chips, not the initial engineering samples. And by "you", I mean the foundry.


----------



## Cvrk (Sep 9, 2020)

@bug I think I mentioned this, not sure.

The RTX 3060 is gonna be one amazing price/performance king.

Even though the RTX 3070 is equivalent to (and more than) the 2080 Ti, and some games still don't run well on a 2080 Ti at 4K, few of us have 4K displays to take advantage of it; the majority are still playing at 1080p or 1440p with high refresh rates. That being said, for the majority of titles on 2018-2020 graphics engines and the upcoming 2021 ones, the 3070 is powerful enough, or too much, for 1080p gaming.

So the RTX 3060 will be exactly what you need for 1080p high refresh rate, something that will do more than 100fps. As for the price: smaller than a 3070, of course.
--------------------

I'm thinking of breaking the bank for the 3070, considering I upgraded to a 1440p 144Hz monitor. I need the power, and I'm also thinking about future-proofing for the upcoming 2-3 years...
It might be a mistake. At this rate future-proofing means nothing, and the 3060 will probably be more of what I need. My old RX 480 can still do 1440p at 60 in almost all titles (believe it or not).

The prices are not gonna be at MSRP. The $600 RTX 3070 is not happening for the majority of the world. Maybe in Canada and the USA, but not in EU regions, where Covid is greatly increasing prices because of shipping. And VERY importantly, that price is bare-bones for the Founders Edition, which will be very limited. Most of us will have to buy Asus, MSI, etc., and those will not start at $600.

SO in real life, the RTX 3060 will cost $600 and the RTX 3070 around $700, if not more.


----------



## ppn (Sep 9, 2020)

The 3060 is a 192-bit card. Both bandwidth (336 vs 760 GB/s) and shaders (3840 vs 8704) deliver 44% of a 3080. That's an unusually weak x60 card, an x50 Ti at best.

For example:
$200 GTX 960: 55% of the performance of the $550 GTX 980
$300 GTX 1060: 60% of the performance of the $700 GTX 1080

So 50% of the performance of a $700 card can't possibly cost more than $200.

If you mean a 3060 Ti carved out of GA104 with a 256-bit bus, then yes, $300. But $600 for that weakling is preposterous.
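The scaling argument above as a quick sketch, using only the performance shares and prices quoted in this post (the ~0.7 price-per-performance ratio is just the rough average of the two historical cases):

```python
# Historical x60-to-x80 price/performance ratios, and the price they imply
# for a card delivering ~44% of a $700 RTX 3080.
history = [
    ("GTX 960 vs GTX 980",   200, 550, 0.55),
    ("GTX 1060 vs GTX 1080", 300, 700, 0.60),
]
for pair, price_lo, price_hi, perf_share in history:
    ratio = (price_lo / price_hi) / perf_share  # price share paid per unit of performance share
    print(f"{pair}: {perf_share:.0%} of the performance at "
          f"{price_lo / price_hi:.0%} of the price (ratio {ratio:.2f})")

# Applying a ~0.7 ratio to 44% of a $700 card:
print(f"implied x60 price: ~${0.44 * 0.7 * 700:.0f}")
```

Which lands in the same ballpark as the ~$200 figure argued above.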


----------



## P4-630 (Sep 9, 2020)

_Ethereum miners show interest in Nvidia’s RTX 3080

Last week, the publication dates for reviews of the RTX 3080 became clear. On September 14, we should know how the Founders Edition of Nvidia's new flagship performs, with data on the custom GPUs and general availability on September 17. According to some photos on the Chinese forum Baidu, some users managed to get several cards before launch. One of the photos shows a stack of RTX 3080 cards.

The reason for this interest would lie in the mining power of the cards, specifically for Ethereum. According to a screenshot, the RTX 3080 achieves a compute power of 115 MH/s, almost three times as much as the RTX 2080 with 35 to 40 megahashes per second. While no figures for energy consumption are given, the 3080 seems theoretically more efficient if one contrasts the TDP of both cards with the hashrate. Moreover, mining cryptocurrency in Asia is quite profitable, given the relatively low energy costs._
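If the leaked hashrate holds, the efficiency gap can be sketched against board power. The 320 W / 215 W TDP figures are my assumption here, since the article gives no power numbers:

```python
# Hashing efficiency implied by the quoted figures.
# TDPs are assumed board power, not measured mining draw.
cards = {
    "RTX 3080": (115.0, 320),  # leaked MH/s, assumed board power in watts
    "RTX 2080": (37.5, 215),   # midpoint of the quoted 35-40 MH/s
}
for name, (mhs, tdp) in cards.items():
    print(f"{name}: {mhs} MH/s / {tdp} W = {mhs / tdp:.2f} MH/s per watt")
```

Roughly double the hashes per watt, which would explain the miner interest even at a higher sticker price.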









Ethereum miners show interest in Nvidia's RTX 3080
Last week the publication dates for reviews of the RTX 3080 became clear. On September 14 we should know how the Founders Edition of N...
nl.hardware.info


----------



## Fluffmeister (Sep 9, 2020)

So essentially it doesn't matter how much they are going to cost, because they will sell loads anyway to the lovely mining community.


----------



## Chomiq (Sep 9, 2020)

P4-630 said:


> _Ethereum miners show interest in Nvidia’s RTX 3080
> 
> Last week, the publication dates for reviews of the RTX 3080 became clear. On September 14, we should know how the Founders Edition of Nvidia’s new flagship is performing, with data on the custom GPU's and general availability on September 17. According to some photos on the Chinese forum Baidu, there are some users who managed to get several cards before launch. One of the photos shows a stack of RTX 3080 cards.
> 
> ...


Here we go again.


----------



## moproblems99 (Sep 9, 2020)

Amite said:


> Just canceled my bid on Ebay for a used 2080 ti @ 620.00 . They may be 550.00 in 2 weeks.



Still isn't worth it.


----------



## Vayra86 (Sep 9, 2020)

ppn said:


> 3060 is a 192 bit card. both bandwidth 760/336 and shaders 8704/3840 deliver 44% of 3080, this unusualy weak 60 card, 50Ti at best.
> 
> for example
> $200 GTX 960 55% performance of $550 GTX 980,
> ...



Sorry... an unusually weak x60 card? This is NORMAL for an x60: some sort of VRAM limitation. These always happen in the midrange and even infect the x70 from time to time (or the x80...).

The x60 is the domain of the asymmetrical bus (GTX 660: 1.5/0.5GB split with unequal bandwidth between the two, making it utter shit in SLI, also 192-bit; GTX 660 Ti: 192-bit memory that was too slow for the much faster core compared to the 660), of less VRAM (GTX 1060: 3GB; RTX 2060: 6GB versus 8 on the equally performing 1080), and usually means a full 106 die. What we'll get now, though, is likely some cut from GA104, and 192-bit implies they've cut about 25% of the GPCs. It looks like we'll get a situation similar to the 1070 vs 1080: a gap of about 25%, and major differences in memory.

I'm not sure why you think the x60 will use a smaller die, unless we also maintain that the x70 is not a full 104 already, but I'm not so sure of that, seeing the gap with the 3080. As for $600... if the 3070 is already at $500, how?


----------



## ppn (Sep 9, 2020)

The $350 RTX 2060 delivers 70% of the performance of the $700 RTX 2080; a 42% performance step leads to double the price. Now with GA104 we have to see which card delivers that 70% of the performance, because that one should cost $350. It looks like the 3070 is 60% of the 3080, so the winner is the 3070 Ti. The 3060 is a total joke: divide its 3840 new shaders by the golden ratio and you get a real Turing equivalent of about 2304, but with a 192-bit bus, barely faster than a 2060.


----------



## Vayra86 (Sep 9, 2020)

ppn said:


> $350 RTX 2060 delivers 70% performance of the $700 RTX 2080, see 42% improvement leads to double the price. Now with GA104 we have to see what is that card delivering 70% performance that should cost $350. Looks like 3070 is 60% of 3080, so the winner is 3070Ti. 3060 is a total joke, 3840 new shaders divide by golden ratio you get the real 2304 Turing equivalent, but with 192 bit bus, and barely faster than 2060.



Nah, I think the x60 will hit 1080 Ti levels now. That's the usual 30-odd % increase at least, and it also sets the tiers apart correctly; after all, the gap from 1080 Ti to 2080 Ti is pretty big, and if that's where the 3070 is at... there is lots of wiggle room. It all hinges on the real performance of the 3070, really, if we take Nvidia's 2080 Ti-equivalence statement for granted...


----------



## mouacyk (Sep 9, 2020)

The RTX 3080 hits 43K in Firestrike Performance, which is only +34% over my 1080 Ti. Looks like I won't be upgrading yet.

And over here https://www.techpowerup.com/forums/...pcmark10-firestrike-scores-2019.259705/page-3, a 2080 Ti hit 38K, which puts the 3080 at only +13%. DF is so full of it. Unless, of course, this new leak is wrong.
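Spelled out, using only the scores quoted in this thread (a sketch, not official numbers; the 1080 Ti score is backed out from the +34% figure):

```python
# Relative uplifts implied by the leaked 43K Firestrike graphics score.
rtx_3080 = 43_000
gtx_1080_ti = round(43_000 / 1.34)   # backed out from the +34% figure above
rtx_2080_ti = 38_000                 # score from the linked forum thread

for name, old in [("1080 Ti", gtx_1080_ti), ("2080 Ti", rtx_2080_ti)]:
    print(f"vs {name}: +{(rtx_3080 / old - 1) * 100:.0f}%")
```

The 2080 Ti comparison rounds to about +13%, a long way short of the gen-on-gen numbers DF reported.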


----------



## FinneousPJ (Sep 9, 2020)

Yeah... not a review, just sponsored bullcrap. Frankly I'm a bit disappointed; until now I always thought DF held a high journalistic standard.


----------



## Shatun_Bear (Sep 10, 2020)

FinneousPJ said:


> Yeah... not a review, just sponsored bullcrap. Frankly I'm a bit disappointed, before now I always thought DF had a high journalistic standard.



The handful of 'benchmarks' from DF have really done a good job for Nvidia; they've got people convinced that Ampere is an insane leap over Turing in rasterization. But of course, Nvidia asked DF to bench only certain games at certain settings, so the results are very flattering. I cannot wait to see W1zzard's performance averages for these cards: proper benchmarking instead of paid, sponsored deceit, which will take those expectations down a peg or two.


----------



## bug (Sep 10, 2020)

Shatun_Bear said:


> The handful of 'benchmarks' from DF have really done a good job for Nvidia, they've got people convinced that Ampere is an insane leap over Turing in rasterization. But of course, Nvidia asked DF only to bench certain games at certain settings, so that the results are very flattering. I cannot wait to see W1zzard's performance averages of these cards, proper benchmarking instead of paid sponsored deceit, which will take down those expectations a peg or two.


Your problem is, taking "down those expectations a peg or two", still paints Ampere as a hell of a fast set of cards.


----------



## Shatun_Bear (Sep 10, 2020)

bug said:


> Your problem is, taking "down those expectations a peg or two", still paints Ampere as a hell of a fast set of cards.



The 3080 looks like 30-40% more performance than a 2080 Ti for over 30% more power draw. The raw numbers are good, but the node is not as good as TSMC's enhanced 7nm, hence the horrible power draw/heat.

Also, with the real numbers above, it seems likely to me that one of the Big Navis will be faster, have more memory, and be more power efficient to boot.


----------



## nguyen (Sep 10, 2020)

Shatun_Bear said:


> The 3080 looks like 30-40% more performance than a 2080 Ti for over 30% more power draw. Raw numbers are good, but the node is not as good as TSMC's 7nm enhanced, hence the horrible power draw/heat.
> 
> Also, with the real numbers above, it seems likely to me one of the Big Navi's will be faster, have more memory and be more power efficient to boot.



Jup AMD Bike will definitely run faster on road, have more memory foam and can run on human generated power


----------



## phanbuey (Sep 10, 2020)

mouacyk said:


> RTX 3080 hits 43K in Firestrike Performance, which is only +34% to my 1080TI.  Looks like I won't be upgrading yet.
> 
> 
> 
> ...



Firestrike is EOL as a benchmark. The difference between a 1080 Ti and a 2080 Ti is 23000-ish to 25000-ish, which is barely a 10% increase, when in reality the performance gap is closer to 35% in actual games. Once you start blowing past the 15-20K mark on Futuremark benches, they stop being good predictors of performance. That's why you have to bump up to Extreme, or Time Spy (or Time Spy Extreme) in this case.

It's more likely that the 3080 will be 80% or so faster than the 1080 Ti at 4K / high-fps 1440p.


----------



## mouacyk (Sep 10, 2020)

phanbuey said:


> Firestrike is EOL as a benchmark. The difference between a 1080 Ti and a 2080 Ti is 23000-ish to 25000-ish, which is barely a 10% increase, when in reality the performance gap is closer to 35% in actual games. Once you start blowing past the 15-20K mark on Futuremark benches, they stop being good predictors of performance. That's why you have to bump up to Extreme, or Time Spy (or Time Spy Extreme) in this case.
> 
> *It's more likely that the 3080 will be 80% or so faster than the 1080 Ti at 4K / high-fps 1440p*


Again, assuming these leaks are true... your (and DF's) claim is still unsubstantiated past 1080p.

https://www.3dmark.com/fs/18799348 - Extreme 15238 vs 21370 (+40%)
https://www.3dmark.com/fs/19240894 - Ultra 8918 vs 10876 (+22%)


----------



## John Naylor (Sep 10, 2020)

Vayra86 said:


> I'd advise everyone to hold back on 3080's. This is likely going to be eclipsed by something to undercut Nav



Based upon what? Certainly not the track record of the last 7 years. AMD hasn't made a dent in the top two tiers since before the 2xx series.








AMD Is Losing Ground to NVIDIA -- and It Could Get Worse | The Motley Fool
After a period when GPU sales momentum favored the underdog, NVIDIA's aggressive strategy looks poised to give AMD a healthy challenge of its own.
www.fool.com
				




AMD lost 10% market share in the past year (32% to 22%)... that's not what normally happens when you have a superior product. The 5600 XT was AMD's home run last generation. I would love them to make an impact in the next highest tier, or maybe two... but threaten the top 2 tiers? As much as I think everyone would love to see that happen, what have you seen that nobody else has that makes it a likely scenario?

7xx .... 780 and 780 Ti were faster
9xx ... Performance differences were so substantial that the 970 alone outsold all AMD cards combined by a factor of 2+
10xx ... nVidia took the top tiers from the 1060 on up.
20xx... nVidia took the top tiers from the 1660 on up



> Above all I think its clear the 10GB is sub optimal.



Based upon what? Internet posters have been posting this since the 6xx series, and side-by-side testing has disproved it each and every time. A few exceptions do not invalidate the rule. Yes, there were some exceptions (i.e. poor console ports, or when your GPU isn't quite up to your resolution, like the 960), but they said this about the 680 2GB and 4GB cards and ever since, and yet the only time VRAM showed a significant difference was when settings were set so high the games were unplayable, substantially under 30 fps.









GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech
Do you need 4GB of ram? We tested EVGA's GTX 770 4GB versus Nvidia's GTX 770 2GB version, at 1920x1080, 2560x1600 and 5760x1080.
alienbabeltech.com
				




"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference.  If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards.... There is one last thing to note with _Max Payne 3_:  It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB.  However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting.  And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.  "









Video Card Performance: 2GB vs 4GB Memory
Similar video cards are often available in versions with more than one memory size. The GeForce GTX 680 is an example, and comes in both 2GB and 4GB variants. With computer components more is often better, but does doubling the memory on a video card like this actually help with game performance...
www.pugetsystems.com
				











Gigabyte GeForce GTX 960 G1 Gaming 4GB review
In this review we check out the 4GB version of the Gigabyte G1 Gaming GeForce GTX 960. The GTX 960 is the mainstream product that we figured has too little memory, will this 4GB version resolve our co... VRAM Analysis 2GB vs 4GB - Alien Isolation
www.guru3d.com
				











Is 4GB of VRAM enough? AMD's Fury X faces off with Nvidia's GTX 980 Ti, Titan X - ExtremeTech
Is 4GB enough for a high-end GPU? We investigated and tested 15 titles to find out.
www.extremetech.com
				






> And just to drive it home for you... here's that beloved 1060, in 2020, at a very reasonable 1080p. That's a big gap for a few shaders I'd say.



A few? 1280 vs 1152, that's 11%... and notice that the relative performance across 18 games in TPU's testing was 6% at 1080p and 6% at 1440p. By what concoction of logic can VRAM be the limiting factor if the relative gap is 6% at 1080p and still 6% when you move to 1440p? We can do this all day long with battling links, but the fact is you can prove either side of this argument simply by selecting the games chosen to test.









MSI GTX 1060 Gaming X 3 GB Review
MSI's GTX 1060 Gaming X 3 GB might come with half the memory amount only, but still brings the big guns in form of the large dual-fan TwinFrozr cooler. Our review will test whether 3 GB is a viable alternative to 6 GB if you are trying to save some money.
www.techpowerup.com
				




"In terms of performance, this factory-overclocked GTX 1060 3 GB card sits about 7% behind the reference GTX 1060 6 GB, which is roughly what I would have expected given the clocks, shaders, and memory. What I did not expect, though, is the large loss of performance in Tomb Raider or Hitman. Other games seem completely unaffected by having 3 GB less VRAM at their disposal, especially at 1080p. If you look at 4K in several games, you can see a small dip in performance compared to the 6 GB reference; Deus Ex and Far Cry Primal are good examples of that.* In my opinion, these cases rather argue for the 3 GB version being the smarter choice because they are good enough for 1080p, and you're not going to be getting smooth 4K framerates on any card in this performance range, no matter how much memory it has - the shading power is simply too low.* You also have to take into account that we are using highest detail settings, which are often a little bit too demanding for the GTX 1060, so people will naturally reduce their settings, which will bring down VRAM usage, smoothing out the differences at the same time.* Tomb Raider, which sees a performance loss of around 25% in even 1080p, requires you to reduce details on both the 6 GB and 3 GB version in order to achieve 60 FPS*. If you go with a 3 GB version, you might have to dial down settings just a little more, but not by much, but will have saved quite some money in return. "

If your game selection includes a lot of console ports, strategy games, and the likes of Hitman and TR, the 6GB would be the better choice, but **most people** will do just fine with the 3GB.


----------



## phanbuey (Sep 10, 2020)

mouacyk said:


> Again, assuming these leaks are true... your (and DF's) claim is still unsubstantiated past 1080p.
> 
> https://www.3dmark.com/fs/18799348 - Extreme 15238 vs 21370 (+40%)
> https://www.3dmark.com/fs/19240894 - Ultra 8918 vs 10876 (+22%)



My claim is that an ancient DX11 bench is not a good indicator of real world performance.  That has been substantiated with the 1080ti to 2080ti difference already.


----------



## Vayra86 (Sep 11, 2020)

John Naylor said:


> Is this based upon  ?  Certainly not the track record in last 7 years.  AMD hasn't made a dent in the top 2 tiers since before the 2xx series
> 
> 
> 
> ...



A huge wall of text... you could also just say 'hey, shit, 3GB actually did fall off faster than expected', finally agree, and move on. Or you can stick to old beliefs and 2015 data... You can rest assured the list of games like the ones you happened to stumble upon (the ones not getting benched... but played in real life) is far longer than this.

If a subset of games suffers, the GPU is simply less capable. It's very simple.


----------



## Hugis (Sep 12, 2020)

TecLab Bilibili benchmark leak

taken from : https://www.hexus.net/tech/news/gra...a-geforce-rtx-3080-benchmarks-video-released/


----------



## MrGRiMv25 (Sep 13, 2020)

Vayra86 said:


> It's not _critical._ At least that is not what I am personally saying about the 10GB on this specific card.
> Nvidia has always cut corners on VRAM to limit the potential of a specific card, whether in speeds, in capacities, or by serving us an asymmetrical bus that creates a performance impact (550 Ti, 660, 970... the list is long).



I would have replied earlier, but I got hit by the irony bus when my Radeon died from corrupted VRAM... I know what you mean though; Nvidia have always been pretty stingy when it comes to RAM, even back when the 8800 launched and there was the 320MB version of the GTS. This will most definitely be the last generation that can get by with 8GB on even mid-range cards.


----------

