# NVIDIA GeForce RTX 3070 Founders Edition



## W1zzard (Oct 27, 2020)

NVIDIA's GeForce RTX 3070 is faster than the RTX 2080 Ti, for $500. The new FE cooler is greatly improved, runs very quietly, and even has fan stop. In our RTX 3070 Founders Edition review, we'll also take a look at RTX performance, energy efficiency, frametimes, and overclocking.

*Show full review*


----------



## Rais (Oct 27, 2020)

Nice Review! But how was the experience with coil whine, if any?


----------



## wolf (Oct 27, 2020)

Another excellent and comprehensive review. 

Now to see the hype train stop one way or another and find out what it's up against


----------



## birdie (Oct 27, 2020)

A wonderful review and a great card; I just cannot quite accept the fact that it's still priced quite high:

GTX 770 - $399
GTX 970 - $329
GTX 1070 - $449

Lastly, it's the most power-efficient Ampere GPU, but I expected much higher overall efficiency for this generation. It's only a few percent more efficient than the GTX 1660 Ti, which is based on a much less advanced node. I expect availability will totally suck for at least the next three months: Ampere cards are so good that several generations of AMD/NVIDIA owners are in line to get them. The RTX 3080 is still nowhere to be seen: https://www.nowinstock.net/computers/videocards/nvidia/rtx3080/


----------



## Xaled (Oct 27, 2020)

Another fraudulent release:
Not available on the market.
If it becomes available, it will be at a much higher price than the announced one.
If NVIDIA does what it did with the 3080, it should be sued (unfortunately, only people in the US can do that).



birdie said:


> A wonderful review, a great card, only I cannot quite accept the fact that it's still priced quite high:
> 
> GTX 770 - $399
> GTX 970 - $329
> ...


Except the actual price will be much, much higher than the announced $500.


----------



## RedelZaVedno (Oct 27, 2020)

Very nice GPU. Basically a 2080 Ti minus 3 GB of GDDR6 with a 20% TDP reduction, for 500 bucks. Let's see what AMD has to offer tomorrow. I have a gut feeling it will offer on-par performance, 4 GB more VRAM, worse ray tracing, and a slightly better TDP for 50 bucks less. In that case I'll go with AMD, as 8 GB will soon not suffice for 4K gaming. Plus, supporting the underdog with a superior offer is always a good deed.


----------



## birdie (Oct 27, 2020)

RedelZaVedno said:


> In that case I'll go with AMD as 8GB will soon not be enough for 4K gaming anymore.



W1zzard has addressed this, and it has been dispelled a thousand times already:



> GeForce RTX 3070 comes with 8 GB memory and that will be the basis for a lot of discussion, just like on RTX 3080 10 GB. I feel like 8 GB is plenty of memory at the moment, especially considering that this card is targeted at 1440p gaming, which isn't as memory-intensive as 4K. Even with 4K, I feel like you'll run out of shading power long before memory becomes an issue. Next-gen consoles do have more memory, but their 16 GB is for OS + game + graphics, which means effective graphics memory is close enough to the 8 GB RTX 3070 offers. Nobody can predict the future, should you ever feel VRAM runs out, just sell the RTX 3070 and buy whatever card is right at that time. Personally I rather pay less today, than spend extra to solve an unknown problem—8 GB of GDDR6 memory cost around $40, so it will be interesting to see at which price point AMD's reportedly-16 GB RDNA2 Radeon RX 6000 comes and whether the extra memory will be able to make a difference.


----------



## bubbleawsome (Oct 27, 2020)

That is ridiculously quiet, especially for a card that is (or should eventually be) sold at the baseline MSRP. I'm pretty impressed.

For everyone complaining about price, maybe it'll help to think of this more as what would've been a 3080, and the real 3080 as what a 3080 Ti would've been. This still matches a 2080 Ti for $500.


----------



## newtekie1 (Oct 27, 2020)

birdie said:


> only I cannot quite accept the fact that it's still priced quite high



If you don't like it, you're more than welcome as a free consumer to go buy the AMD alternative that gives the same amount of performance for cheaper...


----------



## birdie (Oct 27, 2020)

newtekie1 said:


> If you don't like it, you're more than welcome as a free consumer to go buy the AMD alternative that gives the same amount of performance for cheaper...



I've never owned a GPU that costs more than $300 - not my price range. I'm just concerned that this generation's xx60 cards will cost a lot more than the 10xx/16xx/20xx cards. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for the RTX 4060.


----------



## ViperXTR (Oct 27, 2020)

Noice, one of my target GPUs upgrading from a GTX 1070. But yeah, I won't be able to get one anyway, since I won't find it anywhere, and even if I do, it's already twice the price locally.


----------



## RedelZaVedno (Oct 27, 2020)

birdie said:


> GeForce RTX 3070 comes with 8 GB memory and that will be the basis for a lot of discussion, just like on RTX 3080 10 GB. I feel like 8 GB is plenty of memory at the moment, especially considering that this card is targeted at 1440p gaming, which isn't as memory-intensive as 4K. Even with 4K, I feel like you'll run out of shading power long before memory becomes an issue. Next-gen consoles do have more memory, but their 16 GB is for OS + game + graphics, which means effective graphics memory is close enough to the 8 GB RTX 3070 offers. Nobody can predict the future, should you ever feel VRAM runs out, just sell the RTX 3070 and buy whatever card is right at that time. Personally I rather pay less today, than spend extra to solve an unknown problem—8 GB of GDDR6 memory cost around $40, so it will be interesting to see at which price point AMD's reportedly-16 GB RDNA2 Radeon RX 6000 comes and whether the extra memory will be able to make a difference.



Well, I'm only into flight sims, where there are tons of textures to load, so more VRAM really makes a difference in some titles. DCS World runs smoother on an 11 GB 1080 Ti than on an 8 GB 2070S/2080, for example. For me, 8 GB is as low as I'd go on a tight budget, 10 GB is pretty much the minimum to consider, 12 GB is price-wise optimal, and 16 GB is nice for future-proofing.


----------



## EarthDog (Oct 27, 2020)

So... ~2080Ti performance w or w/o DLSS...for $500... nice!


----------



## Haile Selassie (Oct 27, 2020)

A good improvement, but you can clearly see performance per watt isn't greatly improved with Ampere compared to Turing (220 W vs. 250 W). A good 15%, but that's it.
A noteworthy mention is that GA104 in this configuration has 1,536 more CUDA cores than TU102 (5,888 vs. 4,352) and runs at a higher core clock as well.

Is there perhaps a bottleneck in the pipeline that causes stalls?
Is the new Ampere architecture starved for memory bandwidth?
Is the CUDA core somewhat 'RISC'-ed / 'bulldozered' compared to the Turing CUDA core?

It's good for consumers that a mid-range card with last-gen flagship performance can now be had for half the price of the previous-gen flagship SKU. But it does achieve that using brute force.


----------



## Valantar (Oct 27, 2020)

As expected a very attractive GPU. Very much looking forward to AMD's response tomorrow, but even with this alone the GPU market is looking better than in a while. That tiny PCB would be _excellent_ for a water cooled SFF build too. Of course it's a shame that the xx70 series is now at $500, but hopefully that means the (permanent) addition of a xx60 Ti tier filling in the ~$400 bracket, allowing for ~$300 xx60 cards. If not ... that's a shame.

Now, bring on RDNA 2 and let us see them both duke it out. Can't wait!


----------



## HD64G (Oct 27, 2020)

So, the myth NVIDIA spread, and some believed, of the 3070 beating the 2080 Ti is dead. And most of those people also believed that Big Navi would battle with this GPU.

Nice review as usual @W1zzard


----------



## Valantar (Oct 27, 2020)

HD64G said:


> So, dead is the myth nVidia spreaded and some believed of 3070 beating 2080Ti. And most of those people also believed that Big Navi will battle with this GPU.
> 
> Nice review as usual @W1zzard


Well... 1% slower in 1080p, 1% faster in 1440p and 2160p. Technically that is a win, I guess? Though for all intents and purposes it is a tie, obviously. AMD's answer to this (whether that's the 6800 or the 6700 XT or something else) will definitely be interesting. As will the next tier down, given that ~$300-400 is often peak GPU value territory.


----------



## RedelZaVedno (Oct 27, 2020)

I really don't get NVIDIA's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer for the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top of it to make the 3070 look silly. What am I missing?


----------



## DuxCro (Oct 27, 2020)

Looking forward to not being able to buy this anywhere. And when it does pop up somewhere, It's gonna be overpriced.


----------



## PerfectWave (Oct 27, 2020)

Rais said:


> Nice Review! But how was the experience with coil whine, if any?


From GUru3d:

Much like the 3080, the GeForce RTX 3070 does exhibit coil squeal. Is it annoying? It's at a level you can hear it. In a closed chassis, that noise would fade away in the background. However, with an open chassis, you can hear coil whine/squeal. Graphics cards all make this in some form, especially at high framerates; this can be perceived.



----------



## EarthDog (Oct 27, 2020)

RedelZaVedno said:


> I really don't get Nvidia timing this time around. Why show 3070 a day before RDNA2 reveal? AMD will surely have an answer for 3070 in RDNA2 lineup tomorrow. All it has to do is to price it a bit cheaper and pop more vram on top of it to make 3070 look silly. What am I missing?


Quite obviously to steal some thunder. AMD can't 'pop more vram' on the card a day before. AMD's card is what it is on that front. Surely clocks are being tweaked, but yeah.


----------



## Vya Domus (Oct 27, 2020)

Haile Selassie said:


> Is there perhaps a bottleneck in the pipeline that causes stall?
> Is the new Ampere architecture starving memory bandwidth?
> Is the CUDA core somewhat 'RISC'-ed / 'bulldozered' compared to Turing CUDA code?



The Ampere SM has 128 CUDA cores; however, only 64 of them can execute FP32 operations concurrently with 64 integer operations in a clock cycle. That means in the best-case scenario the Ampere SM has twice the shading power of a Turing SM (128 FP32 per cycle), and in the worst case it's about as fast (64 FP32).

In other words, the SM has some pretty bad constraints making it less efficient per cycle in terms of FP32.
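A rough way to see the impact (a toy model of my own, not an official NVIDIA formula; the 36-INT-per-100-FP figure is NVIDIA's published Turing-era estimate for games):

```python
# Toy model of an Ampere SM: 64 dedicated FP32 lanes plus 64 shared
# lanes that issue either FP32 or INT32 on a given clock. This is an
# illustrative sketch, not an official NVIDIA model.

def effective_fp32_per_clock(int_fraction: float) -> float:
    """Effective FP32 ops/clock when `int_fraction` of the shared
    lanes' issue slots go to INT32 instead of FP32."""
    dedicated, shared = 64, 64
    return dedicated + shared * (1.0 - int_fraction)

# Pure FP32 workload: the full 128 ops/clock.
print(effective_fp32_per_clock(0.0))             # 128.0
# NVIDIA has cited roughly 36 INT32 per 100 FP32 instructions in games;
# treating that as ~26% of shared-lane issue slots going to INT32:
print(round(effective_fp32_per_clock(0.26), 1))  # 111.4
```

So even under a typical game instruction mix, the model lands well short of the doubled peak, which fits the benchmark numbers in the review.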


----------



## gridracedriver (Oct 27, 2020)

Hah, faster than the 2080 Ti... though an overclocked 2080 Ti still slaps it around.
Pity those who sold off their 2080 Ti.


----------



## Fourstaff (Oct 27, 2020)

birdie said:


> I've never owned a GPU which costs more than $300   - not my price range. I'm just concerned that this generation XX60 cards will cost a lot more than 10XX/16XX/20XX cards. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for RTX 4060.



xx70 cards are now pushed to the (bottom of the) high end, the same way BMW upsized the 3 Series from a two-door coupe to a land yacht.


----------



## medi01 (Oct 27, 2020)

I have doubts about card availability at announced price.


----------



## TristanX (Oct 27, 2020)

Overall, the 3080 is better.


----------



## W1zzard (Oct 27, 2020)

Rais said:


> Nice Review! But how was the experience with coil whine, if any?


I checked again for you. If I run over 200 FPS and put my ear right next to the card I can hear a little bit of coil whine, but it's minimal, similar to other cards, not worth mentioning.


----------



## medi01 (Oct 27, 2020)

RedelZaVedno said:


> I really don't get Nvidia timing this time around. Why show 3070 a day before RDNA2 reveal? AMD will surely have an answer for 3070 in RDNA2 lineup tomorrow. All it has to do is to price it a bit cheaper and pop more vram on top of it to make 3070 look silly. What am I missing?


The key question is whether it's a real release or, à la 3080, a move to trick AMD into pricing its own cards lower, while NV shrugs off "oh, no availability, sorry" until they have an actual answer to Big Navi.


----------



## RedelZaVedno (Oct 27, 2020)

EarthDog said:


> Quite obviously to steal some thunder. AMD can't 'pop more vram' on the card a day before. AMD's card is what it is on that front. Surely clocks are being tweaked, but yeah.


I mean, the 3070 is such an easy target for AMD to beat. Adding 4 GB of extra GDDR6 costs 20 bucks at most, and as we've seen from reputable leaks, RDNA2 can hit nearly 2.5 GHz now. All AMD has to do is offer the Xbox Series X's 52 CU GPU (less than 300 mm² die size) clocked at 2.2 GHz (the PS5's gaming clock speed) to marginally beat the 3070 in standard rasterization performance. I'd expect NVIDIA to wait for the 6700 XT announcement and then tweak and release accordingly. I think NVIDIA will lose the xx70 battle this time around if it can't release a competitive 3070 Ti soon after the 6700 XT launches.


----------



## okbuddy (Oct 27, 2020)

I was really right this time: the 3070 is 10% faster than the 2080 Super.

Yay!


----------



## EarthDog (Oct 27, 2020)

RedelZaVedno said:


> I mean 3070 is such an easy target for AMD to beat. Adding 4GB extra GDDR6 costs 20 bucks at most and as we've seen from reputable leaks RDNA2 can hit nearly 2.5Ghz now. All AMD has to offer is "X Box X's" 52CU (less than 300 mm2 die size) GPU clocked at 2.2Ghz (PS5 gaming clock speed) to marginally beat 3070 in standard rasterization performance. I'd expect Nvidia to wait for 6700XT announcement and then tweak and release 3070 accordingly. I think Nvidia will lose xx70 battle this time around if it can't release competitive 3070 TI soon after 6700XT launches.


Not the goal I shot at... but ok.


----------



## Rais (Oct 27, 2020)

PerfectWave said:


> From GUru3d:
> 
> Much like the 3080, the GeForce RTX 3070 does exhibit coil squeal. Is it annoying? It's at a level you can hear it. In a closed chassis, that noise would fade away in the background. However, with an open chassis, you can hear coil whine/squeal. Graphics cards all make this in some form, especially at high framerates; this can be perceived.



Feels like a lot of words to say nothing, plus it's false that coil whine fades away into background noise, which has lower frequencies. Is it more or less what you'd expect from a high-end card? Reading between the lines, as a reviewer myself, I think it's noticeable, and that's not good.



W1zzard said:


> I checked again for you. If I run over 200 FPS and put my ear right next to the card I can hear a little bit of coil whine, but it's minimal, similar to other cards, not worth mentioning.


Guru3d's mention made it sound like a worse problem. Thanks for the second check, I'll trust you.


----------



## xkm1948 (Oct 27, 2020)

Thank you for the detailed review. The 3070 is truly a nice beast.


----------



## birdie (Oct 27, 2020)

RedelZaVedno said:


> I really don't get Nvidia timing this time around. Why show 3070 a day before RDNA2 reveal? AMD will surely have an answer for 3070 in RDNA2 lineup tomorrow. All it has to do is to price it a bit cheaper and pop more vram on top of it to make 3070 look silly. What am I missing?



1. RTRT performance. AMD will be a lot slower.
2. DLSS (read: free performance, a tier higher than your card is actually capable of). AMD is just missing it.


----------



## kruk (Oct 27, 2020)

@W1zzard : why don't you mention the very high multi-monitor power draw in the cons of your Ampere reviews? Compared to Turing or Pascal, it's significantly worse. AMD gets constantly bashed for this metric, but NVIDIA gets a free pass?


----------



## W1zzard (Oct 27, 2020)

kruk said:


> @W1zzard : why don't you mention the very high multi monitor power draw in the cons of your Ampere reviews, because compared to Turing or Pascal its significantly worse? AMD gets constantly bashed for this metric, but nVidia gets a free pass?


It's half of AMD's?



Rais said:


> Guru3d mention felt like a worse problem. Thanks for your second check, i'll trust you.


Could be variance between samples


----------



## iuliug (Oct 27, 2020)

Since NVIDIA stopped shipping the 3080 and 3090 Founders Editions, the 3070 FE will probably go only to IT influencers.


----------



## dicktracy (Oct 27, 2020)

This will sell like crazy.


----------



## Cheeseball (Oct 27, 2020)

kruk said:


> @W1zzard : why don't you mention the very high multi monitor power draw in the cons of your Ampere reviews, because compared to Turing or Pascal its significantly worse? AMD gets constantly bashed for this metric, but nVidia gets a free pass?



At least with the NVIDIA cards it's not running the VRAM at max clocks. For some reason, running multi-monitor (and even some single monitors at 144 Hz) causes the VRAM to stay maxed on the AMD cards.


----------



## RedelZaVedno (Oct 27, 2020)

birdie said:


> 1. RTRT performance. AMD will be a lot slower.
> 2. DLSS (read: free performance, a tier higher than your card is actually capable of). AMD is just missing it.


1. RT really doesn't matter for now, and the PS5/Xbox Series X will have RDNA2 chips in them, so expect AMD's implementation of ray tracing to rule the gaming market, with the only exception being NVIDIA-sponsored titles.
2. Yeah, AI upscaling could become a valid advantage of Ampere IF DLSS can be implemented at the GPU driver level rather than in game code, as not many devs will walk the extra mile to add it to their code without financial compensation, I'm afraid.


----------



## Haile Selassie (Oct 27, 2020)

Vya Domus said:


> The Ampere SM has 128 CUDA cores however only 64 of them can execute FP32 operations concurrent with 64 integer operations in a clock cycle, meaning that in the best case scenario the Ampere SM has twice the shading power (128 FP32 per cycle) of a Turing one and in the worst case scenarios it's about as fast (64 FP32).
> 
> In other words the SM has some pretty bad constraints making it less efficient per cycle in terms of FP32.


This is exactly what I was looking for, thank you. So indeed, it seems most games will never benefit from the new SM array, as games seldom run pure FP32 workloads without INT32 mixed in, correct?


----------



## Condelio (Oct 27, 2020)

Hey! Congratulations on your marriage!


----------



## Vya Domus (Oct 27, 2020)

Haile Selassie said:


> This is exactly what I was looking for, thank you. So indeed, it seems most games will never benefit from the new SM array, as games seldom run pure FP32 workloads without INT32 mixed in, correct?



Graphics workloads are predominantly mixed between FP32 and INT32. When integer instructions are issued, the SM can no longer process 128 FP32 operations in that clock cycle. That happens pretty often, since you constantly need to calculate addresses, which requires INT32, so I suspect the SM hardly ever achieves its maximum FP32 throughput.


----------



## Searing (Oct 27, 2020)

birdie said:


> W1zzard has mentioned this and this has been disspelled a thousand times already:



No, he has not. Next-gen games haven't even been released yet, and we are on the edge. He said it was good enough for now, which isn't what we should be looking at; what matters is the next year, as PS5-level games get ported to PC. There is also a difference between "enough" and providing the hardware so that game makers can ship better texture packs, and NVIDIA is holding PCs back here. I was able to use more than 4 GB the day the 1070 released, because game makers shipped new options and new texture packs.

Four years from the 770 to the 1070 got us 4x more VRAM. Four years from the 1070 to the 3070 got us nothing at all. That has never happened before, and I think we shouldn't be so glib about it.


----------



## okbuddy (Oct 27, 2020)

We need to run all these cards with a 5950X next month to dig deeper.


----------



## kruk (Oct 27, 2020)

W1zzard said:


> It's half of AMD ?



Yes, for the 3070 it's lower than AMD's previous-gen x700 card, but it's also higher than NVIDIA's previous-gen x70 card, and in my opinion that is a con.

The extreme non-gaming power consumption of the 3090 (example: https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/29.html) is also a con, but it isn't mentioned in the conclusion either ...


----------



## QUANTUMPHYSICS (Oct 27, 2020)

There is only ONE Benchmark I'm interested in. 

MICROSOFT FLIGHT SIMULATOR 2020 4K (since no one bothers benchmarking DCS World).

The 3070 is just below the 2080 Ti at 30 FPS (±5 FPS).

For $500 vs. $1,200, that's pretty good.

But I have the 3090, and I can't even see 60 FPS in that game.


----------



## EarthDog (Oct 27, 2020)

Searing said:


> No he has not. Next gen games haven't even been released yet and we are on the edge. He said it was good enough for now, which isn't what we should be looking at, but over the next year as PS5 level games get ported to PC. There is also a difference between "enough" and providing the hardware so that game makers can provide better textures packs, and nVidia is holding PCs back here. I was able to use more than 4GB the day the 1070 released because game makers released new options and new texture packs.
> 
> 4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before and I think we shouldn't be so glib about it.


It will be OK to hold your breath... seriously. 

People need to get over this RAM thing already... so much crying over spilled milk.


----------



## thematrix606 (Oct 27, 2020)

birdie said:


> A wonderful review, a great card, only I cannot quite accept the fact that it's still priced quite high:
> 
> GTX 770 - $399
> GTX 970 - $329
> ...



As a consumer, I also want cheaper products. 

But keep in mind that there is not only inflation but so many other factors in the world that influence these prices (like corona). The 770 was released in 2013; that's 7 years ago. How much more performance do we get out of a 3070 than we did from a 770? No other market can keep up with such major iterations. CPUs, cars, planes, internet speeds... nothing can keep up with the major gains we get every year from GPUs. So at least we can be happy about that.


----------



## birdie (Oct 27, 2020)

QUANTUMPHYSICS said:


> There is only ONE Benchmark I'm interested in.
> 
> MICROSOFT FLIGHT SIMULATOR 2020 4K    (since no one bothers benchmarking DCS-WORLD).
> 
> ...



The game is currently extremely poorly optimized and not yet ported to DirectX12 (which is doubly strange since it's published by Xbox Game Studios) - this is why Wizz refuses to review it.


----------



## MxPhenom 216 (Oct 27, 2020)

Xaled said:


> Another fraudy release
> Not available in market
> If available, would be with a much higher price than the announced one.
> If Nvidia does what it did with 3080 it should be sued (unfortunately only people in US can do that)
> ...



The card isn't actually out yet. It doesn't hit the market till Thursday.


----------



## newtekie1 (Oct 27, 2020)

birdie said:


> I've never owned a GPU which costs more than $300   - not my price range. I'm just concerned that this generation XX60 cards will cost a lot more than 10XX/16XX/20XX cards. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for RTX 4060.



You missed the point of my post entirely. No competition leads to inflated prices. When I said you are welcome to go buy the similar performing card from AMD for cheaper, I was poking fun at the fact that there is no similar performing card from AMD.


----------



## medi01 (Oct 27, 2020)

QUANTUMPHYSICS said:


> The 3070 is just below the 2080Ti at 30fps (+/- 5 fps)
> 
> For $500 vs. $1200, that's pretty good.



The MSRP of the 2080 Ti is $999, cough.
You can tell me that's just a bogus announced price, and I'd agree.
And I'd tell you that so is $499 for the 3070 at this point.


----------



## Dave65 (Oct 27, 2020)

Can't wait to search high and low for one and then not get one.


----------



## Turmania (Oct 27, 2020)

Wow, so I'm getting this card; this is the proper one. Almost the same power requirements as the 2070 Super but 2080 Ti performance. This is what I wanted to see from them! A lovely card to look at as well; the one issue is the location of the power connector.


----------



## Colddecked (Oct 27, 2020)

Valantar said:


> Well... 1% slower in 1080p, 1% faster in 1440p and 2160p. Technically that is a win, I guess? Though for all intents and purposes it is a tie, obviously. AMD's answer to this (whether that's the 6800 or the 6700 XT or something else) will definitely be interesting. As will the next tier down, given that ~$3-400 is often peak GPU value territory.



IMO it's a win, since they're doing it at less power (so at the same power it should be a clear win). I hope AMD announces their full range tomorrow, not just the top end. It'd be great to know how AMD plans on filling the $300 to $200 tiers, which in turn would put some pressure on NVIDIA to announce their midrange GPUs...


----------



## Valantar (Oct 27, 2020)

Oh, by the way, am I the only one who thought the renders of the 3070 at launch looked kind of boxy and weird compared to the 3080 and 3090, but now after seeing actual photos thinks it looks much better than those? This is a really nicely designed GPU. The mid-shroud power connector is a bit meh, but that can definitely be overcome with some cable management effort. And otherwise it's a lot more understated and visually harmonious than its two bigger siblings.



Colddecked said:


> IMO its a win since they're doing it at less power (so at same power should be clear win).  I hope amd announces their full range tomorrow, not just the top end.  It'd be great to know how amd plans on filling the 300 dollar to 200 dollar tiers, which in turn would put some pressure on nvidia to announce their midrange gpus...


Yeah, that would be great if so, but I'd frankly be surprised if we see more than the top 3-4 SKUs from them at once. GPUs are much more distinct than CPUs, so if you launch too many at once you risk them drowning each other out. It's a better PR and communications tactic to distribute out the launches - get people fired up with the high end, then trickle out ever cheaper cards as time passes. But just to be clear, I would be very happy if we saw the full Navi 21+22+23 tomorrow!


----------



## Magistar (Oct 27, 2020)

Hmm, the 2070 is listed at $340, so it does very well in price/performance. But the cheapest I can find is €397. Typo?


----------



## mechtech (Oct 27, 2020)

Really nice card.  Hopefully no odd issues like the 3080.

I guess after exchange rate and shipping and 13% sales tax it will be about $800 here in Canada.

To all the 2080 Ti owners: maybe you can sell them for $600 and get a 3070.


----------



## Xajel (Oct 27, 2020)

I think it's long overdue now, but when will we see ultrawide resolutions? At least 3440x1440. It really makes sense now with these cards and the growing popularity of gaming ultrawides.


----------



## Vayra86 (Oct 27, 2020)

birdie said:


> I've never owned a GPU which costs more than $300   - not my price range. I'm just concerned that this generation XX60 cards will cost a lot more than 10XX/16XX/20XX cards. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for RTX 4060.



Same here, $500 is the upper limit for me; I could stretch to $520-530, but any further is a no-no. A waste of money. As it looks, $500 does net a good chunk of performance over this 1080 (which also cost $500).

$20 for inflation over 3-4 years, that's fine.

Overall, it looks as expected: a perfect 1080p card.


----------



## N3M3515 (Oct 27, 2020)

birdie said:


> I've never owned a GPU which costs more than $300   - not my price range. I'm just concerned that this generation XX60 cards will cost a lot more than 10XX/16XX/20XX cards. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for RTX 4060.



And by the time the 4060 arrives, I bet it will cost at least $450, lol... GPUs have become so mainstream, just like cellphones; they keep going up and up (and people keep buying them!).


----------



## RedelZaVedno (Oct 27, 2020)

Am I the only one who is not impressed with the 3070?

GTX 770 ($399) -> GTX 970 ($329) = *+43%* performance increase / -21% price reduction = price/performance winner
GTX 970 ($329) -> GTX 1070 ($379) = *+47%* performance increase / +15% price increase = 2nd place
GTX 1070 ($379) -> RTX 2070/S ($499) = *+37/+52%* performance increase / +32% price increase = last place
*RTX 2070/S ($499) -> RTX 3070 ($499) = +39/+26% performance increase / equal elevated price = 3rd place*

It gives us a below-average performance increase while retaining Turing's price hike. How can this be a good deal? Just because Turing sucked BIG TIME doesn't mean anything better becomes good.
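As a quick sanity check, the same comparison can be expressed as performance per dollar (a sketch using the launch prices and performance gains quoted above, taking the non-Super figures for the 2070 steps):

```python
# Relative perf-per-dollar gain for each xx70 generation step, using
# the launch prices and performance multipliers quoted above.
gens = [
    ("GTX 770 -> GTX 970",   399, 329, 1.43),
    ("GTX 970 -> GTX 1070",  329, 379, 1.47),
    ("GTX 1070 -> RTX 2070", 379, 499, 1.37),
    ("RTX 2070 -> RTX 3070", 499, 499, 1.39),
]

for name, old_price, new_price, perf_gain in gens:
    # How much more performance per dollar the new card delivers.
    value_gain = perf_gain * old_price / new_price
    print(f"{name}: {value_gain:.2f}x perf/$")
```

That puts the 3070 step at about 1.39x perf per dollar, better than Turing's 1.04x but well short of Maxwell's 1.73x, which matches the ranking above.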


----------



## efikkan (Oct 27, 2020)

RedelZaVedno said:


> Am I the only one who is not impressed with 3070?
> 
> GTX 770 ($399) -> GTX 970 ($329) = *+43%* performance increase / -21% price reduction = price/performance winner
> GTX 970 ($329) -> GTX 1070 ($379) = *+47%* performance increase / +15% price increase = 2nd place
> ...


Pretty much; bumping the previous high-end down two tiers is nothing to complain about.
Prices have varied a lot between generations, even more so if you go further back. Prices have certainly been lower, but that was back when there was real competition.
$399 (2013) would translate to $446 today, so $499 isn't half bad. If Navi2 does its job, perhaps they'll shave $50 off that price.
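The inflation adjustment above checks out (the cumulative CPI multiplier here is an approximation I'm assuming, not an official figure):

```python
# $399 in 2013 dollars, adjusted with an assumed ~11.8% cumulative
# US CPI inflation for 2013-2020 (approximate multiplier).
CPI_2013_TO_2020 = 1.118
print(round(399 * CPI_2013_TO_2020))  # 446
```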


----------



## Ibotibo01 (Oct 27, 2020)

> *So, RTX 3060Ti will be between RTX2080S and RTX2080Ti. Also RTX 3060 will be equal to RX 5700 XT. RTX 3070 will be equal 2080 Ti.
> RTX 3060 Ti = RTX2080S+9%*


The RTX 3060 Ti is next.


RedelZaVedno said:


> *RTX 2070/S ($499) -> RTX 3070 ($499) = +39/+26% performance increase / equal elevated price = 3rd place *


Wait for AMD's discount. The GPU wars have begun.


----------



## Colddecked (Oct 27, 2020)

Valantar said:


> Oh, by the way, am I the only one who thought the renders of the 3070 at launch looked kind of boxy and weird compared to the 3080 and 3090, but now after seeing actual photos thinks it looks much better than those? This is a really nicely designed GPU. The mid-shroud power connector is a bit meh, but that can definitely be overcome with some cable management effort. And otherwise it's a lot more understated and visually harmonious than its two bigger siblings.
> 
> 
> Yeah, that would be great if so, but I'd frankly be surprised if we see more than the top 3-4 SKUs from them at once. GPUs are much more distinct than CPUs, so if you launch too many at once you risk them drowning each other out. It's a better PR and communications tactic to distribute out the launches - get people fired up with the high end, then trickle out ever cheaper cards as time passes. But just to be clear, I would be very happy if we saw the full Navi 21+22+23 tomorrow!



Yeah, I suppose they have stock they want to clear as well before the new midrange totally nukes its value. At this pace, the x60/Ti series won't be out till January, which kind of fits NVIDIA's historical patterns.


----------



## Searing (Oct 27, 2020)

EarthDog said:


> It will be OK to hold your breath... seriously.
> 
> People need to get over this RAM thing already... so much crying over spilled milk.



Hold my breath and crying? Neither is happening here. Grow up and talk like an adult.

Of course games will work with 8 GB; that isn't the point. I like how you just missed that from the 770 to the 1070 we got 4x more VRAM, yet from the 1070 to the 3070 we didn't get any more at all. And I'm unreasonable? We've never gone four years without more VRAM, and this during the launch year of a new console driving up VRAM requirements very soon. More importantly, build it and they will come: give all your cards more VRAM and game makers will release content for them. The month the 1070 launched, I suddenly had games with new video options and texture packs that used more than 4 GB, and the same people were "crying" like you that we didn't need 8 GB then, and yet voilà, we got better settings than console games and used the VRAM up quickly. Doom Eternal already needs more than 8 GB for top settings, and more games soon will.


----------



## Colddecked (Oct 27, 2020)

Searing said:


> Hold your breath and crying? None of which is happening here. Grow up and talk like an adult.
> 
> Of course games will work with 8GB, that isn't the point. I like how you just missed that from the 770 to the 1070 we got 4x more VRAM, yet from the 1070 to the 3070 we got no increase at all. And I'm unreasonable? We've never gone 4 years without more VRAM, and this during the launch year of new consoles that will drive up VRAM requirements very soon. More importantly, build it and they will come: give all your cards more VRAM and game makers will release content for them. The month the 1070 launched I suddenly had games with new video options and texture packs that used more than 4GB, and the same people were "crying" like yourself that we didn't need 8GB then. Doom Eternal already needs more than 8GB for top settings, and more games soon will.



I think it's a valid point; if you are into texture quality/packs/mods, then you should just wait for a card that satisfies your VRAM requirements. Nvidia has never really gone all out with VRAM size on the 80 series, probably to give the Ti model more purpose.


----------



## MxPhenom 216 (Oct 27, 2020)

Does anyone have a guess on when market availability for AMD's new cards will be? Because it's certainly not tomorrow during the unveil.


----------



## Verbatim (Oct 27, 2020)

Maxwell and Pascal were still better in terms of price-to-performance ratio versus their previous series.

GTX 970: 329eur, +40% FPS over the GTX 770
GTX 1070: 449eur, +62% FPS over the GTX 970
RTX 3070: 499eur, +31% FPS over the RTX 2070 Super

So a higher price and a lower FPS increase.

Some may think the RTX 3070 is great value when it really isn't! *Ampere gets knocked out cold easily* when it comes to p/p ratio if you look back in history.

Let's hope that AMD will fix this shit move from Nvidia.
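For anyone who wants to check the math, here is a quick sketch using the euro prices and FPS gains quoted above (the RTX 2070 Super's 499 launch price is my own assumption, as the post doesn't state it) that computes how much each jump improved performance per euro over its predecessor:

```python
# Each entry: (card, launch price EUR, predecessor price EUR,
#              relative performance vs predecessor, from the post above)
cards = [
    ("GTX 970",  329, 399, 1.40),  # vs GTX 770
    ("GTX 1070", 449, 329, 1.62),  # vs GTX 970
    ("RTX 3070", 499, 499, 1.31),  # vs RTX 2070 Super (price assumed)
]

value_gain = {}
for name, price, prev_price, perf in cards:
    # Performance-per-euro relative to the predecessor:
    # > 1.0 means more FPS per euro than the card it replaced.
    value_gain[name] = perf / (price / prev_price)
    print(f"{name}: {(value_gain[name] - 1) * 100:+.0f}% perf-per-euro vs predecessor")
```

Note the picture depends on the metric: by raw FPS gain the 3070's jump is the smallest of the three, while by perf-per-euro (roughly +70% for the 970, +19% for the 1070, +31% for the 3070 under these assumptions) it lands between the other two.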


----------



## Poul-erik (Oct 27, 2020)

I can see that Hardware Unboxed is also testing the 3070, and 8 GB of VRAM is looking almost too little. I was also surprised that such a powerful card only has 8 GB of RAM.


----------



## RedelZaVedno (Oct 27, 2020)

MxPhenom 216 said:


> Does anyone have a guess on when market availability for AMD's new cards will be? Because it's certainly not tomorrow during the unveil.


Probably good, but don't expect to see Navi 21 offerings (high end) before the 2nd half of November or even early December, and Navi 22 (mid-range) before Q1 2021, according to pretty reliable leaks given to RedGamingTech by AIBs.


----------



## MxPhenom 216 (Oct 27, 2020)

RedelZaVedno said:


> Probably good, but don't expect to see Navi 21 offerings (high end) before the 2nd half of November or even early December, and Navi 22 (mid-range) before Q1 2021, according to pretty reliable leaks given to RedGamingTech by AIBs.



Okay, so it's still a ways out then. Guess I will be battling with bots on Thursday to get two 3070s for two builds I'm doing.


----------



## Mordeafaca (Oct 27, 2020)

Everyone is comparing to a 2080 Ti and assuming you get the same performance for half the price.
I would like people to realize that compared to a 2070 Super, which can be bought for under 500€, we are getting around 20-25% more performance but paying almost 50€ more; that is certainly not impressive.
Just hoping the AMD announcement brings some competitiveness to the market, because these 3000-series launch prices are quite high.


----------



## P4-630 (Oct 27, 2020)

If you aren't gaming that much anymore you can always get this once available:


----------



## MxPhenom 216 (Oct 27, 2020)

Mordeafaca said:


> Everyone is comparing to a 2080 Ti and assuming you get the same performance for half the price.
> I would like people to realize that compared to a 2070 Super, which can be bought for under 500€, we are getting around 20-25% more performance but paying almost 50€ more; that is certainly not impressive.
> Just hoping the AMD announcement brings some competitiveness to the market, because these 3000-series launch prices are quite high.



Where the hell are you getting a 2070S that cheap? Unless it's used? On Newegg they are like $600, if not more.


----------



## B-Real (Oct 27, 2020)

That power consumption (even if it matches the 2080Ti performance)...



birdie said:


> 1. RTRT performance. AMD will be a lot slower.
> 2. DLSS (read: free performance, a tier higher performance than your card is actually capable of). Just missing.



1. Which games use RT in a way that makes you stop playing just to watch the scenery? There are not enough games that support it in a way that makes you say, "Hmm, that blows my mind (eye)!"
2. DLSS is only good when it's DLSS 2.0. How many games support 2.0? 1.0 just washes out the picture.
3. Even RTX hardware is not fast enough at RT.


----------



## MxPhenom 216 (Oct 27, 2020)

B-Real said:


> That power consumption (even if it matches the 2080Ti performance)...
> 
> 
> 
> ...



RTX is way better on 30xx though. Fine for games that actually have it, and those games really don't require super high fps to offer a good experience.

But you better anticipate a ton more games will have RT in them since next gen consoles do have the hardware for it.


----------



## Cheeseball (Oct 27, 2020)

@W1zzard 

Would it be possible to revisit the Borderlands 3 benchmark [for all cards] moving forward? It looks like Gearbox may have fixed the DX12 renderer (it does still crash more often than DX11, but the loading times seem to be fixed).


----------



## Mordeafaca (Oct 27, 2020)

MxPhenom 216 said:


> Where the hell are you getting 2070S that cheap? Unless used? Newegg they are like $600, if not more



Bought one a year ago from Gigabyte for 480€.
Today you can get one for even less.









Graphics card listing: MSI GeForce RTX 2070 SUPER Ventus GP OC 8G (www.pcdiga.com)

Base clock: 1605 MHz | Boost clock: 1785 MHz | CUDA cores: 2560 | Memory interface: 256-bit | TDP: 215 W


----------



## MxPhenom 216 (Oct 27, 2020)

Mordeafaca said:


> Bought one a year ago from gigabyte for 480€.
> Today you can even get it for lower prices.
> 
> 
> ...



I guess it depends on where you're located, but that ain't happening in the US.


----------



## EarthDog (Oct 27, 2020)

Searing said:


> Hold your breath and crying? None of which is happening here. Grow up and talk like an adult.
> 
> Of course games will work with 8GB, that isn't the point. I like how you just missed that from the 770 to the 1070 we got 4x more VRAM, yet from the 1070 to the 3070 we got no increase at all. And I'm unreasonable? We've never gone 4 years without more VRAM, and this during the launch year of new consoles that will drive up VRAM requirements very soon. More importantly, build it and they will come: give all your cards more VRAM and game makers will release content for them. The month the 1070 launched I suddenly had games with new video options and texture packs that used more than 4GB, and the same people were "crying" like yourself that we didn't need 8GB then, and yet voila, we got better settings than console games and used the VRAM up quickly. Doom Eternal already needs more than 8GB for top settings, and more games soon will.


I feel the same way about those continuously bitching about a non-issue. Here's more of the same to wallow in. 






1. I didn't say they wouldn't. All I am saying is 8GB is fine and will be for the coming years. There are also improved compression algorithms.
2. You assume the consoles will drive up requirements "quickly"... a fair guess is that requirements could go up... but "quickly" is absolutely up for debate. If you ask others, ray tracing won't catch on either, but this will? lol
3. You always get better settings than console. That isn't anything new.
4. Doom Eternal needs less than 8GB up through 2560x1440 on Ultra Nightmare settings. 

If you mod and need more VRAM, there are other options available... they may not be Nvidia... I get it. But that doesn't change the fact that 99% of today's games running at 2560x1440 or less (where this card is really targeted) won't show any issues. As for the next few years, only time will tell. But again, there is headroom in a lot of titles even at 1440p/ultra settings. That isn't to say a couple of titles won't display VRAM-limited issues, but for the next 3-4 years, 8GB at that resolution and those settings will be fine for an overwhelming majority of titles.


----------



## Cheeseball (Oct 27, 2020)

MxPhenom 216 said:


> I guess where you're located, but that ain't happening in the US.



There have been various times when Best Buy had stock of the RTX 2070 Super Founders Edition at its $499.99 MSRP. I believe the last time I saw it come in stock was earlier this month (most likely to clear out inventory).

That said, you probably can't get one at that price anymore.


----------



## MxPhenom 216 (Oct 27, 2020)

EarthDog said:


> I feel the same way about those continuously bitching about it. Perhaps a graphic will help.
> 
> View attachment 173533



Considering these GPUs are much faster and have quite a bit more memory bandwidth, as long as there is enough RAM, I don't see what the fuss is about. 8-10GB will be enough for quite some time, and by the time it isn't, there will already be new, faster cards out. Though that 16GB the AMD cards will have is kind of nice to see, not gonna lie. Simpletons will eat up the AMD cards purely for having more RAM, even if on average the cards are slower than Nvidia's offerings.


----------



## kapone32 (Oct 27, 2020)

RedelZaVedno said:


> I really don't get Nvidia timing this time around. Why show 3070 a day before RDNA2 reveal? AMD will surely have an answer for 3070 in RDNA2 lineup tomorrow. All it has to do is to price it a bit cheaper and pop more vram on top of it to make 3070 look silly. What am I missing?


They are trying to remove AMD mindshare.


----------



## W1zzard (Oct 27, 2020)

Cheeseball said:


> Would it be possible to revisit the Borderlands 3 benchmark [for all cards] moving forward? It looks like Gearbox may have fixed the DX12 renderer (it does still crash more often than DX11, but the loading times seem to be fixed).


Yeah I'll think about this for next rebench, for the 2021 setup


----------



## N3M3515 (Oct 27, 2020)

RedelZaVedno said:


> Am I the only one who is not impressed with 3070?
> 
> GTX 770 ($399) -> GTX 970 ($329) = *+43%* performance increase / -21% price reduction = price/performance winner
> GTX 970 ($329) -> GTX 1070 ($379) = *+47%* performance increase / +15% price increase = 2nd place
> ...



The thing is, prices never used to change generation over generation (or at least they went 2 or 3 gens before an increase). I mean, the same $250 GPU slot next gen would be much faster and still cost $250. Now they up the price every generation.


----------



## kapone32 (Oct 27, 2020)

Hi W1zzard, nice review. I wanted to know your thoughts on DirectStorage. If it is part of the DX12 API, is it something that could be enabled through a software update, or is it something specific to the GPU and GPU driver?


----------



## RedelZaVedno (Oct 27, 2020)

MxPhenom 216 said:


> RTX is way better on 30xx though. Fine for games that actually have it, and those games really don't require super high fps to offer a good experience.
> 
> But you better anticipate a ton more games will have RT in them since next gen consoles do have the hardware for it.


Next-gen consoles are built on AMD's RDNA/Zen architecture, not Nvidia's tensor cores, so I don't expect to see a lot of hardware ray tracing optimized for Nvidia hardware between 2021 and 2025. The PC gaming market is devs' second thought when it comes to AAA game development. The best we can hope for are decent console ports nowadays, and even those are getting worse and worse optimization.


----------



## W1zzard (Oct 27, 2020)

kapone32 said:


> Hi W1zzard, nice review. I wanted to know your thoughts on DirectStorage. If it is part of the DX12 API, is it something that could be enabled through a software update, or is it something specific to the GPU and GPU driver?


It is some sort of Microsoft API, so I seriously doubt it will come to anything other than Windows 10. I would guess they'll give it to us in a future Windows update (like "20H2"), not some small downloadable patch. Then the question is how you get a big enough install base for game devs to find it interesting. Maybe they'll make it compatible with whatever exists on the next-gen Xbox. Then the question is how much of a difference it can make for load times, which brings us back to "will developers spend time and $ on it".

The promise is big though, nothing we can do other than wait and see


----------



## kapone32 (Oct 27, 2020)

W1zzard said:


> It is some sort of Microsoft API, so I seriously doubt it will come to anything other than Windows 10. I would guess they'll give it to us in a future Windows update (like "20H2"), not some small downloadable patch. Then the question is how you get a big enough install base for game devs to find it interesting. Maybe they'll make it compatible with whatever exists on the next-gen Xbox. Then the question is how much of a difference it can make for load times, which brings us back to "will developers spend time and $ on it".
> 
> The promise is big though, nothing we can do other than wait and see



Ok, thanks. It is interesting that it could be an OS update. Incidentally, I watched the PS5 technical brief where they talked about achieving over 7000 MB/s by utilizing NVMe along with GPU compute, so it may be closer than we think. If it is a Windows 10 update, would it be something that the game developer would have to add? There must be an incentive they can use, beyond just the adoption of NVMe, to make this a reality. The potential is insane.


----------



## ZoneDymo (Oct 27, 2020)

Oddly more impressive than the 3080 and 3090.


----------



## SamuelL (Oct 27, 2020)

Mild disappointment. It looks good, but I was hoping we would see this slot right in the middle of the 2080 Ti and the 3080, not essentially match the 2080 Ti, plus or minus a percent or two depending on resolution and DX version. Still a good deal if they are ever available at MSRP; I just wish it were a "great" deal compared to some past launches.


----------



## Thuban (Oct 27, 2020)

Can't wait for the RDNA2 products! I *WANT* a 16 GB framebuffer, just because.


----------



## bpgt64 (Oct 27, 2020)

Yes and you too can hope to buy one of twelve that will be available this year!


----------



## SIGSEGV (Oct 27, 2020)

A paper launch, and an overflow of reviews on every tech news site and forum.
Nice trick.


----------



## MxPhenom 216 (Oct 27, 2020)

RedelZaVedno said:


> Next-gen consoles are built on AMD's RDNA/Zen architecture, not Nvidia's tensor cores, so I don't expect to see a lot of hardware ray tracing optimized for Nvidia hardware between 2021 and 2025. The PC gaming market is devs' second thought when it comes to AAA game development. The best we can hope for are decent console ports nowadays, and even those are getting worse and worse optimization.



Consoles are built on RDNA2, actually, so they have dedicated RT hardware in those designs. Though I suspect AMD's first attempt at RT hardware will yield around Turing-level ray tracing performance.

If I'm not mistaken, Nvidia's Tensor cores are not what is used for RT calculations. Nvidia GPUs have dedicated RT cores, which are separate from the CUDA cores and Tensor cores.


----------



## ChristTheGreat (Oct 27, 2020)

So, since I have issues with the AMD encoder (I don't run a dual-PC setup), I think my RX 5700 will be replaced by a 3070/3080 when it's available... well, unless I can grab a 2070 Super or 2080 for very cheap, haha! Still running 1080p at 144 fps, but I could upgrade to a G-Sync or 1440p monitor!

Great review wizz again!


----------



## efikkan (Oct 27, 2020)

Thuban said:


> Can't wait for the RDNA2 products! I *WANT* a 16 GB framebuffer, just because.


I seriously hope RDNA2 will not be another round of "inferior" cards with a bunch of extra VRAM on them to claim they're more "future proof".

AMD's current _flagship_, Radeon VII, holds up so remarkably well against RTX 3070's measly 8 GB in 4K… oh wait.


----------



## MxPhenom 216 (Oct 27, 2020)

efikkan said:


> I seriously hope RDNA2 will not be another round of "inferior" cards with a bunch of extra VRAM on them to claim they're more "future proof".
> 
> AMD's current _flagship_, Radeon VII, holds up so remarkably well against RTX 3070's measly 8 GB in 4K… oh wait.



I am literally expecting AMD and the Nvidia 3000 cards to trade blows. I don't know if there will be a clear winner this time. Well, until Nvidia comes out with the Super cards on TSMC 7nm, that is.


----------



## owen10578 (Oct 27, 2020)

Nice and detailed review as always! I really liked the cooler comparison part you added this time around. Wow did Nvidia really create an efficient cooler design on this little guy too.


----------



## John Naylor (Oct 27, 2020)

RedelZaVedno said:


> I really don't get Nvidia timing this time around. Why show 3070 a day before RDNA2 reveal? AMD will surely have an answer for 3070 in RDNA2 lineup tomorrow. All it has to do is to price it a bit cheaper and pop more vram on top of it to make 3070 look silly. What am I missing?



Last time, they dropped the price by $50 the day before release day, which can only be seen as raising the performance white flag. You don't lower prices when you have a better product; you raise them.




DuxCro said:


> Looking forward to not being able to buy this anywhere. And when it does pop up somewhere, It's gonna be overpriced.



You could sacrifice being the first one on the block to have one... and just wait till the new and improved later steppings arrive with mature BIOSes, flaws corrected, and slightly better OCs.




RedelZaVedno said:


> I mean 3070 is such an easy target for AMD to beat. ....



I think if it were so easy, AMD would have done it against the 7xx, 9xx, 10xx and 20xx series. Last gen the only tier winner was the 5600 XT. I hope they can do it... but it's not like they've had a horse in the upper tiers in quite a while.




Searing said:


> I was able to use more than 4GB the day the 1070 released because game makers released new options and new texture packs.
> 4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before and I think we shouldn't be so glib about it.



That wasn't driven by RAM so much as by GPUs at that point becoming capable of handling higher resolutions. If you look at the same-card VRAM comparisons done for the 6xx, 7xx, 9xx and 10xx series, the conclusion is always the same: by the time you hike the settings to a point where VRAM matters, the game is unplayable. Playing games with X VRAM vs Y VRAM, yes, when pushed with higher loads and max settings, we do see double-digit percentage differences... like 15 fps vs 20 fps. But either way, no one is going to play a game at those settings at that resolution. The problem there is not the VRAM; don't play at 2160p with a GPU targeted at 1440p. In every era, the VRAM thing rears its head, and in every era it has come to the same conclusion.

6xx Era - https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
So, what can we glean from all of that? For one thing, with any single monitor the 2GB video card is plenty - even on the most demanding games and benchmarks out today. When you scale up to three high-res screens the games we tested were all still fine on 2GB, though maxing out some games’ settings at that resolution really needs more than a single GPU to keep smooth frame rates. 

7xx Era - http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
You couldn't install Max Payne on a system with a 2GB 770 (the game reported needing 2.7 GB), but it would install with the 4 GB card, and yet... when they popped out the 4 GB and installed the 2 GB, it ran at the same fps with the same visual quality and user experience. RAM allocation and RAM usage are two different things. Resolutions used in this test went up to 5760x1080.

"There isn't a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference.  ***If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards***."

9xx Era -  https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,26.html
The GTX 960 should probably have been launched with 3 GB VRAM standard, hence I can recommend these 4 GB versions very much. But yeah, the end-results remain a bit trivial with the price-premium in mind -- that I need to admit.

10xx Era - http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

"We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly .... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards will larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”

"We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” ... Based our results, I would say that the answer is yes — but the situation is more complex than we first envisioned.

First, there’s the fact that out of the fifteen games we tested, only four of could be forced to consume more than the 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this. While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, ***provoking this problem in current-generation titles required us to use settings that rendered the games unplayable any current GPU***

Also... in TPU's 1060 3GB review, you have to wonder why Nvidia stripped out 11% of the shaders from the 6 GB version. The reason is the 6 GB card had to show a performance difference to justify its price, and without the cut in shader count it would not. When you look at 1080p scores between the two, the 6 GB card with 11% more shaders is 6% faster... It stands to reason that if VRAM were in any way responsible for that difference, the 6% would grow substantially at 1440p... in the real world it does not; it's the same 6% performance difference.

There are certain games that show anomalies, such as poor console ports. I have seen the performance difference between the 3 GB and 6 GB cards get bigger at 1440p... and one game where it got closer at 1440p. In short, 99% of the time where we see substantial fps differences between the same card with different memory amounts, the GPU was stressed to a point where the game was unplayable. At any combination of resolution and settings, if VRAM is going to be a problem, the GPU is already a problem. Finding "a game" that doesn't follow this observation doesn't change the rule; there are a few rare anomalies.
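The requested-vs-used distinction from the Brandon Bell quote above can be illustrated with a toy model (pure Python; all the numbers are made up for illustration): a game asks the driver for lots of memory, the driver grants whatever the card has, but the game only ever touches its actual working set:

```python
def allocate(requested_mb: int, card_vram_mb: int) -> int:
    """Toy driver: grants up to everything the card has.

    Monitoring tools report this granted amount, not real usage.
    """
    return min(requested_mb, card_vram_mb)

def working_set(resolution: str) -> int:
    """Hypothetical texture working set per resolution, in MB."""
    return {"1080p": 3200, "1440p": 4600, "2160p": 7400}[resolution]

# The same game on an 8 GB card and a 16 GB card: tools report very
# different "usage", but the memory actually touched is identical.
for vram_mb in (8192, 16384):
    granted = allocate(requested_mb=vram_mb, card_vram_mb=vram_mb)
    used = working_set("1440p")
    print(f"{vram_mb} MB card: reported ~{granted} MB, really uses {used} MB")
```

In this toy model both cards run identically at 1440p; only when the working set outgrows the smaller card's VRAM would the extra memory actually matter.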

Pay attention to the article ... "With those performance numbers, RTX 3070 is the perfect choice for the huge 1440p gamer crowd".  That 8 GB is just fine for 1440p. Adding 4 GB isn't going to help you, by the time you get past 8GB (used not allocated) at 2160p and high settings, your GPU will be delivering < 30 fps.  Again, W1zzard nailed it:

_"The GeForce RTX 3070 comes with 8 GB of memory, which will be the basis for a lot of discussion, just like on the RTX 3080 10 GB. I feel like 8 GB is plenty of memory for the moment, especially considering this card is targeted at 1440p gaming, which isn't as memory-intensive as 4K. Even with 4K, I feel you'll run out of shading power long before memory becomes an issue._ 

As to why Nvidia timed it this way? ... Because they are not idiots. Back in the 7xx era, AMD was in trouble; they had nothing, and many pundits were asking "why does the 780 have the specifications that were leaked for the 770?" AMD spent a fortune building up the 290/290X over several months, and the week before its release, Nvidia suddenly announced they apparently had this 780 Ti thing that was just sitting on the shelf and would ship in two weeks. It took all the air out of AMD's sails, and the 2xx series fell flat.

Why do they charge what they charge? ... Because no one is stopping them. Price is what the market will bear, and 4.5 times as many people are choosing with their wallets to pay the premium. AMD doesn't have a recent-generation card in the top 20; Nvidia has 7. The top 5 cards in use are 1xxx-series cards... the 580 places 10th and the 570 places 16th. What message is sent when new cards come out and folks are buying them up at 10, 20, 30% or more over MSRP? It says "we can charge more". Corporate officers are legally bound, within legal boundaries, to maximize profit for their investors. Failure to do so is grounds for dismissal or even malfeasance charges. 

The 480 had more RAM than the 1060, but what did it accomplish? With both cards overclocked, the 1060 was 16.6% faster in TPU's gaming test suite, quieter, and ran cooler. When folks say "it will be easy to beat Nvidia"... on what recent past wins is that based? The 1060 is the most popular card in use today with 10.79% market share... the 480 has 0.49%... the 580 has 2.22%. Combined that's 2.71%, or 25% of the 1060. That's not a win... the 5600 XT is a win; it's clearly better than the competition in its price range.  

If we want prices to drop, one of two things has to happen, only one of which seems possible:

a)  Stop buying
b)  A competitive product hits the market

a) is not going to happen, leaving b) as the only possible price-impacting scenario... so while I can still root for b), nothing AMD has done in the last 7-8 years (other than the 5600 XT) tells me it is likely to happen. Having more RAM is not a feature unless it brings something to the table. The fact that Nvidia is beginning to show their hand with the xx70 tends to suggest they know something we don't, and that's not a good sign. I hope I'm wrong. Our hope for better pricing lies in AMD delivering a 3070-level "5600 XT-like" card with the power to compete with the 3070... slapping extra RAM on it is not going to help here.


----------



## Steevo (Oct 27, 2020)

Congrats on 17 days and many more to come. 

Looks like a winner if they can keep it in stock. Any ideas on the red team and their upcoming hardware launch? Do you have any of those yet?


----------



## r9 (Oct 27, 2020)

I just released a card, XYZ. It's 10% faster than the RTX 3070 and only $449... availability is about the same as for the other cards.


----------



## Chrispy_ (Oct 27, 2020)

Looks like the best 3000-series card IMO. 

It's a paper launch for now, so I guess the real question is whether it's still $499, and whether anyone can actually buy one at that price, once AMD's 6000 series is actually available to buy.


----------



## dgianstefani (Oct 27, 2020)

Remember, Ampere is a brand-new architecture with 1st-gen drivers; I expect another few % from driver optimizations in the next few months. Add to that an extra ~5% from overclocking, and it's a solid improvement over the 2080 Ti.


----------



## XL-R8R (Oct 27, 2020)

John Naylor said:


> MASSIVE SNIP



I was reading these amusing comments until I got to this. John, seriously, I've never seen a bigger post in my life; it probably has more words than the actual review!

Calm down man, calm down... "smh" for real.





As you were.


----------



## Turmania (Oct 27, 2020)

The issue for AMD will be convincing consumers that they are reliable from day one. Let's be honest, they have yet to convince on that front.


----------



## spnidel (Oct 27, 2020)

Price-performance is only "disruptive" because the 2000 series' price-performance was dogshit; the 3000 series is what the 2000 series should have been in terms of price-performance.


----------



## N3M3515 (Oct 27, 2020)

spnidel said:


> Price-performance is only "disruptive" because the 2000 series' price-performance was dogshit; the 3000 series is what the 2000 series should have been in terms of price-performance.



Incredible, that's so true. Let's hope tomorrow AMD can compete on price, performance and availability.


----------



## spnidel (Oct 27, 2020)

N3M3515 said:


> Incredible, that's so true. Let's hope tomorrow amd can compete on price, performance and availability


Oh, they can definitely compete; the only question is whether they feel altruistic enough to undercut Nvidia on price.
My bet is on "no".


----------



## Thuban (Oct 27, 2020)

efikkan said:


> I seriously hope RDNA2 will not be another round of "inferior" cards with a bunch of extra VRAM on them to claim they're more "future proof".
> 
> AMD's current _flagship_, Radeon VII, holds up so remarkably well against RTX 3070's measly 8 GB in 4K… oh wait.


How did AMD fare previously in 3DMark? According to the leaks, it sure does look mighty. This paper launch sort of proves the point. I expect this to be a Zen 2 moment for AMD's graphics card division.


----------



## efikkan (Oct 27, 2020)

Thuban said:


> *How did AMD fare previously in 3DMark?* According to the leaks, it sure does look mighty. This paper launch sort of proves the point. I expect this to be a Zen 2 moment for AMD's graphics card division.
> 
> View attachment 173557


If the architecture is _really_ changed, then nobody can really tell from this.

Tens of thousands of cards shipping is *not* a paper launch. There were probably more RTX 3080s shipped on launch day than Radeon VII did throughout its lifespan.


----------



## Thuban (Oct 27, 2020)

You're saying this wasn't rushed? The leather jacket sure as hell could have done a better job of supplying more cards to the market. At any rate, let's wait another month and see how AMD handles this.


----------



## mahoney (Oct 27, 2020)

Do some of you understand we live in pandemic times? I wonder if the same whiners will be complaining when there's a shortage of AMD's CPUs and GPUs, not to mention shops increasing prices because of it.


----------



## efikkan (Oct 27, 2020)

Thuban said:


> You're saying this wasn't rushed? The leather jacket sure as hell could have done a better job of supplying more cards to the market. At any rate, let's wait another month and see how AMD handles this.


Rushed? What specifically is rushed?
This launch window was planned out two years ago.
The fact is that postponing it wouldn't have improved the supply to date, that would require more production lines from Samsung.


----------



## Minus Infinity (Oct 27, 2020)

Hmm, for 1440p gaming it's only 24% faster on average than the 2070 Super; I would have expected at least 30%. At least the price is good, well, compared to Turing anyway, but I'm still waiting for RDNA2 to see where we land this generation. Hopper is the Nvidia generation to wait for.


----------



## Camm (Oct 28, 2020)

I don't think we can gauge the value on this until cards come out and we see prices.

We know Nvidia played funny buggers with pricing to AIBs, forcing 3080s and 3090s to be hundreds of dollars dearer than MSRP. If we get pricing around MSRP, this card looks awesome (although I'll couch that with "maybe", at least until we see AMD), but if it's higher, well, that value proposition starts falling pretty quickly (especially with the number of 2080 Tis on the used market that you should be able to get ~20% cheaper than the 3070's MSRP).


----------



## Xex360 (Oct 28, 2020)

Not very impressive given that the card will realistically cost around $800, not MSRP, and won't be available. Nvidia messed up Ampere with their pricing and availability; hopefully AMD won't make the same mistake, or at least will launch their cards at the same MSRP everywhere and not inflate prices outside the US for no reason.


----------



## dayne878 (Oct 28, 2020)

Well, considering I have the 2080ti, I don't see myself upgrading to the 3070. I'll wait until the 3080 becomes available, I guess. Although at this point I might just wait for the 3080ti and hope there's more stock of that when the time comes. I'm having a hell of a time finding a 3080 on Newegg (where I have the store credit card).


----------



## ViperXTR (Oct 28, 2020)

8GB VRAM? Only time will tell, once PS5 and Series X games start becoming the norm.

One of my mistakes last time was getting a system that was good enough for current games; once the PS4 and Xbox One became the baseline, I suddenly had to drop details and run at a lower fps target just to keep things playable.

Probably the same for the CPU too: the PS4 and Xbone Jaguar cores were doomed from the start, but now the next gen is armed with much more capable Zen 2 cores.


----------



## Camm (Oct 28, 2020)

ViperXTR said:


> 8GB VRAM? Only time will tell, once PS5 and Series X games start becoming the norm.



Should be noted that because of the bus split, the Xbox expects 10GB to be VRAM, with the other 4GB or so for game-related memory usage (and 2GB for the OS).

Obviously the PS5 is much more dynamic because it has a full 16GB without a split bus, but I'd expect anything multiplat to adhere to that at console-level fidelity settings.


----------



## ViperXTR (Oct 28, 2020)

Camm said:


> Should be noted that because of the bus split, the Xbox expects 10GB to be VRAM, with the other 4GB or so for game-related memory usage (and 2GB for the OS).
> 
> Obviously the PS5 is much more dynamic because it has a full 16GB without a split bus, but I'd expect anything multiplat to adhere to that at console-level fidelity settings.


Both systems will also utilize NVMe PCIe 4.0 SSDs; this is the first time consoles have had actual optimization and configuration for SSDs


----------



## renz496 (Oct 28, 2020)

ViperXTR said:


> 8gb vram? Only time will tell, when PS5 and series x games starts becoming a norm.
> 
> One of my mistakes last time is i got a system that is good enough for current games but ps4 and xbox one started becoming the baseline, i suddenly have to drop down details and run at lower fps target to just make it playable.
> 
> Probably same for the cpu too, ps4 and xbone jaguar cores are already doomed from the start, but now the next gen is armed with a much more capable zen 2 cores.



Personally, I think VRAM usage probably won't increase the way it did when games started being developed exclusively for the 8th gen. Yes, it will increase, but probably only a bit more. The 9th-gen consoles only double the total RAM inside the console, from 8GB to 16GB, whereas from the 7th gen to the 8th gen the increase in total RAM was 32x (PS3) and 16x (360) respectively. When game developers suddenly had significantly more RAM to use, they started brute-forcing it instead of doing the extreme optimization they had to do on the 7th gen. This time they will probably have to be more cautious with their resource management, especially since the 9th-gen consoles are meant to target 4K (which will need more VRAM).
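For reference, these generational multipliers can be checked against commonly quoted console memory totals (the 32x figure for the PS3 holds if you count only its 256 MB of main RAM, since its 512 MB was split between CPU and GPU; counting the full 512 MB gives 16x):

```python
# Commonly quoted console memory totals, in MB.
ram_mb = {
    "PS3 main": 256,         # 256 MB XDR main + 256 MB GDDR3 video
    "PS3 total": 512,
    "Xbox 360": 512,         # unified GDDR3
    "PS4 / Xbox One": 8192,  # unified, 8 GB
    "PS5 / Series X": 16384, # unified, 16 GB
}

# 7th -> 8th gen jump
print(ram_mb["PS4 / Xbox One"] // ram_mb["PS3 main"])        # 32 (main RAM only)
print(ram_mb["PS4 / Xbox One"] // ram_mb["Xbox 360"])        # 16

# 8th -> 9th gen jump is far smaller
print(ram_mb["PS5 / Series X"] // ram_mb["PS4 / Xbox One"])  # 2
```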


----------



## ViperXTR (Oct 28, 2020)

renz496 said:


> Personally, I think VRAM usage probably won't increase the way it did when games started being developed exclusively for the 8th gen. Yes, it will increase, but probably only a bit more. The 9th-gen consoles only double the total RAM inside the console, from 8GB to 16GB, whereas from the 7th gen to the 8th gen the increase in total RAM was 32x (PS3) and 16x (360) respectively. When game developers suddenly had significantly more RAM to use, they started brute-forcing it instead of doing the extreme optimization they had to do on the 7th gen. This time they will probably have to be more cautious with their resource management, especially since the 9th-gen consoles are meant to target 4K (which will need more VRAM).


In the end, we just have to find out


----------



## ironwolf (Oct 28, 2020)

My current GTX 1070 card is worried it will be getting replaced with a 3070 card by Jan/Feb, if stock holds up then.


----------



## Mussels (Oct 28, 2020)

At least the power consumption is reasonable on this card; the 3070 isn't a space heater.

Seeing how much lower the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.


----------



## MxPhenom 216 (Oct 28, 2020)

Mussels said:


> At least the power consumption is reasonable on this card; the 3070 isn't a space heater.
> 
> Seeing how much lower the power consumption of my 1080 is compared to these new cards is a shock; *they've really regressed on that.*



Thank you Samsung...


----------



## ixi (Oct 28, 2020)

Xaled said:


> Another fraudy release
> Not available in market
> If available, would be with a much higher price than the announced one.
> If Nvidia does what it did with 3080 it should be sued (unfortunately only people in US can do that)
> ...


In my country the cheapest is 620 euro, then 650, and then it goes over 720 euro. Nice, right? Not to mention that it still hasn't been released...


----------



## medi01 (Oct 28, 2020)

John Naylor said:


> Last time, they dropped the price by $50 on the day before release day... which can only be seen as raising the performance white flag. You don't lower prices when you have a better product; you raise them



AMD announced pricing on the 5700 series.
NV countered with Super (bigger and more expensive to produce) versions of their cards.
AMD adjusted pricing on its GPUs with chips of a "whopping" 250mm² size.
The "white flag" is fanboi imagination.



MxPhenom 216 said:


> Thank you Samsung...


Thank Jensen for OC-ing the card to the max.


----------



## Vayra86 (Oct 28, 2020)

Searing said:


> Hold your breath and crying? None of which is happening here. Grow up and talk like an adult.
> 
> Of course games will work with 8GB, that isn't the point. I like how you just missed the entire 770 to 1070 we got 4x more VRAM and yet 1070 to 3070 we didn't get any more at all. And I'm unreasonable? We've never gone 4 years without more VRAM. And this during a launch year of a new console driving up VRAM requirements very soon. More importantly, build it and they will come, give all your cards more VRAM and game makers will release content for them. The month the 1070 launched I suddenly had games with new video options and texture packs that used more than 4GB and the same people were "crying" like yourself that we didn't need 8GB then, and yet voila we got better settings than console games and used the VRAM up quickly. Doom Eternal already needs more than 8GB for top settings and more games will soon.



NO man, you have to understand, you buy 4K-capable cards to game at 1440p with High settings. And that's fine for 700-800 dollars' worth of GPU nowadays. Can't be nitpicking too much about 10GB, right? Look at the god damn EF PEE ES for the money! This card is fine.

And texture packs?! Consoles don't get those either, stop whining.

Practical experience with more VRAM usage? No, you're lying. It's not commonplace at all. I don't see it myself every day either. Allocated is not 'in use'! It's fine! Stutter doesn't exist!

/s


----------



## medi01 (Oct 28, 2020)

For this review, and in fact 95%+ of reviews where manufacturers send in cards to be tested: *has it ever been tested how close to "typical" the review samples are*?

In the CPU world we have the "silicon lottery"; don't we have the same in the GPU world?

Imagine you are manufacturing graphics cards. Would you take a bunch of them, benchmark them, and pick the best to send in for reviews?

The only case I could imagine where this wouldn't be a problem is if fluctuations are really small, like 1-3%.


----------



## R0H1T (Oct 28, 2020)

MxPhenom 216 said:


> Thank you Samsung


We don't really know if Samsung is at fault or Nvidia pushed the cards way past reasonable limits, not unlike AMD, or indeed the Ampere uarch. The only way we'd know for sure is if Nvidia launches the exact same cards on TSMC 7nm or any other "better" node.


----------



## HD64G (Oct 28, 2020)

R0H1T said:


> We don't really know if Samsung is at fault or Nvidia pushed the cards way past reasonable limits, not unlike AMD, or indeed the Ampere uarch. The only way we'd know for sure is if Nvidia launches the exact same cards on TSMC 7nm or any other "better" node.


Samsung's 8nm node isn't made for big dies, and thus, when pushed to higher clocks than their optimum, they draw too much power. And this time nVidia had to push for high clocks to keep the GPU crown (today we should learn for sure, methinks). It's also obvious that this Ampere arch is compute-focused and doesn't like high clocks either. So Ampere on Samsung was a bad combo. Pascal could have worked better if made on this node, but Ampere and Turing not so well, with their big dies and tensor cores.


----------



## Vya Domus (Oct 28, 2020)

Mussels said:


> Seeing how much lower the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.



So it is a space heater then.


----------



## Turmania (Oct 28, 2020)

You can undervolt a 3080 to 200W levels and lose only about 5% performance. So that means, as stated before, these cards are great to a certain point and then lose efficiency.


----------



## Valantar (Oct 28, 2020)

Turmania said:


> You can undervolt a 3080 to 200W levels and lose only about 5% performance. So that means, as stated before, these cards are great to a certain point and then lose efficiency.


If you get a good sample you might get there, but ~35% less power with that little performance loss sounds rare. Still, GPU undervolting is extremely valuable with these power-hungry cards.
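As a rough sanity check on those numbers (the 320 W figure is the 3080's stock power limit; the 200 W and ~5% figures are the claim under discussion, not measurements):

```python
# Perf-per-watt gain implied by the claimed RTX 3080 undervolt.
stock_power_w = 320   # RTX 3080 stock power limit
uv_power_w = 200      # claimed undervolt target
stock_perf = 1.00     # relative performance at stock
uv_perf = 0.95        # claimed ~5% performance loss

gain = (uv_perf / uv_power_w) / (stock_perf / stock_power_w)
print(f"~{gain:.2f}x performance per watt")  # ~1.52x
```

If the claim held, that would be roughly a 50% efficiency improvement for a 5% frame-rate cost, which is why it sounds more like a golden sample than a typical result.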



medi01 said:


> For this review, and in fact 95%+ of reviews where manufacturers send in cards to be tested: *has it ever been tested how close to "typical" the review samples are*?
> 
> In the CPU world we have the "silicon lottery"; don't we have the same in the GPU world?
> 
> ...


Some reviewers have done things like that, and some reviewers purposely buy products off the shelf for review to control for this - for example, GamersNexus often buys GPUs for testing rather than getting them from the OEM. That obviously isn't possible for pre-release reviews, but in general the variance is essentially zero. Given the relatively loose voltages and aggressive boosting algorithms on GPUs, you won't see much of a difference unless you're undervolting - and no OEM would be dumb enough to send out a pre-undervolted review sample, even if it would technically be possible.


----------



## Markosz (Oct 28, 2020)

Well, I'm thoroughly surprised about the performance. I expected it to only deliver "2080 Ti performance" with RTX on and DLSS 2.0 in a few select games. But damn, this is not bad, very intriguing GPU.
It's a shame it is not going to be available for a while. I haven't seen a single 3080 in any Hungarian store yet since release and I have a feeling the 3070 is going to be even worse as I expect the demand to be higher.
Maybe in 3-4 months the 30-series are going to be actually "released" and available.


----------



## Verbatim (Oct 28, 2020)

Markosz said:


> Well, I'm thoroughly surprised about the performance. I expected it to only deliver "2080 Ti performance" with RTX on and DLSS 2.0 in a few select games. But damn, this is not bad, very intriguing GPU.
> It's a shame it is not going to be available for a while. I haven't seen a single 3080 in any Hungarian store yet since release and I have a feeling the 3070 is going to be even worse as I expect the demand to be higher.
> Maybe in 3-4 months the 30-series are going to be actually "released" and available.


The RTX 3070 is good overall, but the GTX 970 and GTX 1070 were better when they launched. The hype increases every year, and for some reason people cannot think clearly.

That's probably because of excellent Nvidia marketing.


----------



## Haile Selassie (Oct 28, 2020)

MxPhenom 216 said:


> Thank you Samsung...


Why Samsung? In this day and age of MCU design, you design a chip around the intended node and its limitations, not the other way around.
MCUs are pushed to the brink of stability with these ultra-high 2GHz boost clock frequencies.


----------



## Tebbs. (Oct 28, 2020)

Wondering why Fortnite is not used in testing cards?
One of the most played games, and I for one would base my purchase on results for it.


----------



## EarthDog (Oct 28, 2020)

Tebbs. said:


> Wondering why Fortnite is not used in testing cards?
> One of the most played games, and I for one would base my purchase on results for it.


Same reason most MP games aren't used... because consistent testing is nearly impossible in MP games. Besides, Fortnite runs on a potato in the first place.


----------



## Tebbs. (Oct 28, 2020)

Fair point


----------



## W1zzard (Oct 28, 2020)

EarthDog said:


> Same reason most MP games aren't used... because consistent testing is nearly impossible in MP games. Besides, Fortnite runs on a potato in the first place.


That, and patches can occur at random times, so I can't guarantee a consistent game version between test runs


----------



## renz496 (Oct 28, 2020)

HD64G said:


> Samsung's 8nm node isn't made for big dies and thus, when pushed to higher clocks than their optimum they draw too much power. And this time nVidia had to push for high clocks to keep the GPU crown (today we should learn for sure me thinks). And it is obvious that this Ampere arch is compute focused and doesn't like high clocks either. So, Ampere on Samsung was a bad combo. Pascal could work better if made on this but Ampere and Turing not so well with big dies and tensor cores.



I don't think Nvidia pushed for high clocks with their Ampere design; if they had, we would probably already be seeing something beyond 2.2GHz. Starting with Turing, Nvidia has been looking for something other than increased clock speed to increase performance. Also, while Samsung's node isn't made for big die sizes, one way or another Nvidia had to "force" Samsung to make their big GPUs; how else is Samsung going to get the experience if they never make one?


----------



## ratirt (Oct 28, 2020)

Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti, and that would be a very good card if the price is OK.
If I were to go for an NV card, I'd wait for the 3070 Ti.


----------



## ViperXTR (Oct 28, 2020)

ratirt said:


> Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti, and that would be a very good card if the price is OK.
> If I were to go for an NV card, I'd wait for the 3070 Ti.


I can see it already consuming almost the same power as the RTX 3080 even with some shader cores disabled, as it's using the same die


----------



## Valantar (Oct 28, 2020)

ratirt said:


> Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti, and that would be a very good card if the price is OK.
> If I were to go for an NV card, I'd wait for the 3070 Ti.


$200 price gap, 100W power consumption gap, 23% performance gap at 1440p? Yeah, that does open the door for a $600 3070 Ti, as long as there are enough sufficiently defective GA102 dice to make that viable as a product. Though it would also make for a rather crowded product stack, with AIB 3070s easily exceeding the price of a base 3070 Ti, making developing such cards a lot less attractive for partners. Going by recent history we might instead see a 3070 Super in a while fulfilling the same role while replacing the 3070 outright or pushing its price down a bit.
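Taking the review's ~23% 1440p gap and the MSRPs at face value, here's a quick sketch of where a hypothetical $600 3070 Ti would have to land to match the 3070 on price/performance:

```python
# Relative 1440p performance per dollar, with the 3070 as the baseline.
cards = {"RTX 3070": (500, 1.00), "RTX 3080": (700, 1.23)}  # (MSRP $, relative perf)
for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 500:.2f}x 3070 perf/$")

# A $600 card only matches the 3070's perf/$ at >= 1.2x its performance.
needed = 600 / 500 * 1.00
print(f"A $600 3070 Ti needs {needed:.2f}x 3070 performance to break even")
```

So a $600 part delivering somewhere between 1.00x and 1.23x would need to land in the upper half of that range just to avoid being worse value than the card below it.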


----------



## ratirt (Oct 28, 2020)

ViperXTR said:


> I can see it already consuming almost the same power as the RTX 3080 even with some shader cores disabled, as it's using the same die


Well, I was focusing more on the performance difference between the two than on what the power consumption looks like.
If you compare the 3080 and the 3090, the difference is not that significant, and yet there are rumors about a 3080 Ti being slotted between the two. It would have made more sense to have a 3070 Ti to cover the performance range.



Valantar said:


> we might instead see a 3070 Super in a while fulfilling the same role while replacing the 3070 outright or pushing its price down a bit.


Actually, I don't care about the name of the card, Ti or Super or whatever. The point is that the performance gap between the two is huge. It would have made more sense to make a 3070 Ti/Super than a 3080 Ti with a 12% performance gap (or 15%, whatever it is) between the 3080 and the 3090.
I don't think the replacement is a great option here; the gap is huge. I'm also not convinced the replacement would attract more people, rather the contrary. People want the cards now, not once they already have whatever 3000-series card they end up buying.


----------



## Luminescent (Oct 28, 2020)

In my country you can't find one, and even if you are lucky at some point, it costs around $840.
If AMD doesn't put out something good, we are looking at a future where having a decent GPU will cost a serious sum of money.


----------



## MxPhenom 216 (Oct 28, 2020)

R0H1T said:


> We don't really know if Samsung is at fault or Nvidia pushed the cards way past reasonable limits, not unlike AMD, or indeed the Ampere uarch. The only way we'd know for sure is if Nvidia launches the exact same cards on TSMC 7nm or any other "better" node.



I know. I was being partially sarcastic and facetious


----------



## HD64G (Oct 28, 2020)

renz496 said:


> I don't think Nvidia pushed for high clocks with their Ampere design; if they had, we would probably already be seeing something beyond 2.2GHz. Starting with Turing, Nvidia has been looking for something other than increased clock speed to increase performance. Also, while Samsung's node isn't made for big die sizes, one way or another Nvidia had to "force" Samsung to make their big GPUs; how else is Samsung going to get the experience if they never make one?


The first big die made on 8nm couldn't give great yields or efficiency. Trial and error makes a manufacturing process better, so maybe in 6 months or so yields will be better. Efficiency, though, cannot get MUCH better. And Turing as an arch cannot reach ultra-high clocks, being a compute-focused one imo.


----------



## Umbral (Oct 28, 2020)

AMD strategy:

6900 XT = 5% slower than 3090, $650 16GB RAM

6800 XT = 5% slower than 3080, 5% slower than 6900XT $450 16 GB RAM
*6800 = 10~15% faster than 3070, 5% slower than 6800 XT $400 16 GB RAM*

6700 = 10% faster than 3060 Ti, 10% slower than 3070, $325 12 GB RAM

AMD wins at price/performance at every tier. Not to mention 99% games optimized for consoles on AMD hardware.

*Best Card for price/performance = 6800 vanilla*


----------



## bpgt64 (Oct 28, 2020)

I don't care who you are, you have to have mad respect for Lisa Su at this point...kicking intel and nvidia in the balls in the same year...shit.


----------



## madshi (Oct 28, 2020)

@W1zzard, your "Noise Normalized Cooler Testing" is unreal, well done! Looks like the perfect way to compare the effectiveness of different coolers. Do you plan to make this a regular part of your GPU reviews in the future? Would love that. But I suppose it's probably a lot of work.

If these curves were available for your previous reviews, we could e.g. compare how good the 30xx TUF vs STRIX coolers are (even though one review is for the 3080 and the other for the 3090). Right now I find it hard to judge whether the TUF or STRIX cooler is better. Naturally, I would expect the STRIX to win, but the 3080 TUF cooler performed surprisingly well, while I was a bit disappointed with the 3090 STRIX test results. So I'm not sure the 30xx STRIX cooler is any better than the 30xx TUF.


----------



## W1zzard (Oct 28, 2020)

madshi said:


> @W1zzard, your "Noise Normalized Cooler Testing" is unreal, well done! Looks like the perfect way to compare the effectiveness of different coolers. Do you plan to make this a regular part of your GPU reviews in the future? Would love that. But I suppose it's probably a lot of work.


Oh w00t, nice to have you onboard, I used your hooking stuff many many years ago, learned a lot from it 

Yeah I'll make this a permanent addition to my reviews



> If these curves were available for your previous reviews


no plans to bring it to older cards, but it will be in all my 3070 custom-design reviews, so you can finally know which cooler is best, and by how much


----------



## madshi (Oct 28, 2020)

W1zzard said:


> Oh w00t, nice to have you onboard, I used your hooking stuff many many years ago, learned a lot from it






W1zzard said:


> Yeah I'll make this a permanent addition to my reviews


Awesome - you keep raising the bar for GPU reviews!!


----------



## QUANTUMPHYSICS (Oct 29, 2020)

I drove to MICROCENTER this morning.  It was raining.  I waited in line just 20 minutes.  Got my voucher.  Got my card.  Very glad to get an FE, because when I got the 3080 I was only able to get an XC3.

Waiting for my 3090FE by mail.


----------



## ixi (Oct 29, 2020)

Umbral said:


> AMD strategy:
> 
> 6900 XT = 5% slower than 3090, $650 16GB RAM
> 
> ...



Where did you get the information that the 6900 XT is 5% slower than the 3090? Just guessing?


----------



## QUANTUMPHYSICS (Oct 29, 2020)

ixi said:


> Where did you get the information that the 6900 XT is 5% slower than the 3090? Just guessing?




I actually prefer it when MY DRIVERS ACTUALLY WORK .


----------



## ixi (Oct 29, 2020)

QUANTUMPHYSICS said:


> I actually prefer it when MY DRIVERS ACTUALLY WORK .



Never had problems with Nvidia or AMD drivers. But I haven't used AMD since the GTX 760.


----------



## Valantar (Oct 29, 2020)

ixi said:


> Where did you get the information that the 6900 XT is 5% slower than the 3090? Just guessing?


I would guess that's based off that AMD's comparison graphs for the 6900 XT were with SAM and Rage Mode on - SAM is supposed to deliver ~2% performance increases on average, and the same goes for Rage Mode. If that was the basis it's obviously just a guess though.


----------



## olstyle (Oct 30, 2020)

One of the must-read reviews on the net, as always, although I don't see eye to eye with you on the VRAM question (might be because the GPU guy on my main site sees it very differently).
For me the GPU itself is very interesting, but the card they built around it, not so much.


> 8 nanometer production process


Why is this part of the Pros list? Power efficiency is good, and that is already in the list, but the production process itself has no direct effect on the end user (and if you really wanted to rate it, 8nm Samsung would be a Con compared to 7nm TSMC).


MxPhenom 216 said:


> Though I suspect AMDs first attempt at RT hardware will yield around Turing performance for ray tracing.


Nvidia has done that once before, with tessellation: ATI had been trying to push the feature since the Radeon 8500, yet their first DX-compatible implementation was promptly beaten by NV's very first one.
But from what we can see in leaks, AMD seems to be on par with Turing and behind Ampere in that regard, that's right.


----------



## MoleUK (Oct 30, 2020)

Any idea why my 3070 Founders would be locked at a 30% minimum fan speed? No 0 RPM mode. Also, MSI Afterburner is reporting the core clock not going below 1725 and the memory not going below 7001. I'm assuming there's a software conflict going on somewhere.

Fixed it: had to DDU the drivers, then reinstall without GeForce Experience. The 3070 is now properly idling with fans at 0% and the core clock at normal idle speeds.


----------



## 80-watt Hamster (Nov 1, 2020)

It amuses me that Polaris still tops the performance per dollar chart.


----------



## GreiverBlade (Nov 2, 2020)

birdie said:


> A wonderful review, a great card, only I cannot quite accept the fact that it's still priced quite high:
> 
> GTX 770 - $399
> GTX 970 - $329
> ...


Wait, what? The 1070 was $449? ... Mine was more like $540 ... (luckily it was a standard exchange from insurance) ... pffff, some people are lucky enough to see the MSRP unicorn ...

The 3070 will be 600ish for me (confirmed ... for most AIB custom models: 599 CHF for the cheapest, and near 2k for a 3090) ... hmm, at 430ish as an option, for now I have an RX 5700 XT (well, it's still competitive; it's still close enough to a 2070 Super, which in turn is not far from a 2080 Ti, which in turn is ... eh? invalid? oh, well...)




QUANTUMPHYSICS said:


> I actually prefer it when MY DRIVERS ACTUALLY WORK .


Overexaggerated ... well, I've never had any issues with AMD drivers ... nV on the other hand ... I've had to roll back driver versions more than once ...


----------



## More Sly (Nov 3, 2020)

QUANTUMPHYSICS said:


> I actually prefer it when MY DRIVERS ACTUALLY WORK .


I switched to AMD from Nvidia specifically because of driver issues with green. Haven't had an issue in a long, long time.

I'm actually hesitant to switch back to Nvidia because of past driver experiences.


----------



## ixi (Nov 3, 2020)

In Latvia the cheapest RTX 3070 starts from 820 euro from well-known sellers. RTX 3080... 1100, haha.

I'm amazed that people still buy these overpriced crap GPUs... (why crap - because these products shouldn't cost that much).

Good thing I'm waiting for the AMD GPUs. But looking at what sellers are doing right now... AMD prices are gonna be on the same level, I'm guessing...


----------



## EarthDog (Nov 3, 2020)

ixi said:


> (why crap - because these products shouldn't cost that much).


Surely we understand the difference between a 'crap' product and one that is overpriced, right?


----------



## More Sly (Nov 3, 2020)

ixi said:


> In Latvia the cheapest RTX 3070 starts from 820 euro from well-known sellers. RTX 3080... 1100, haha.
> 
> I'm amazed that people still buy these overpriced crap GPUs... (why crap - because these products shouldn't cost that much).
> 
> Good thing I'm waiting for the AMD GPUs. But looking at what sellers are doing right now... AMD prices are gonna be on the same level, I'm guessing...


I live in Germany, but I'm not European. A bit surprised to hear it costs so much there. Can't you just order from other countries, or are the shipping costs raised? I often buy from the UK since there are no import fees within the EU (won't last for long, I guess).


----------



## ixi (Nov 5, 2020)

EarthDog said:


> Surely we understand the difference between a 'crap' product and one that is overpriced, right?



Let's look at it the way I see it.

The 3080 retail price needs to be around 850, not freaking 1100-ish.

Same goes for the 3070: with a custom cooler, around 550 euro... not starting from around 800 euro.

You wanna tell me you are ready to overpay by almost 300 euro for a 3070 or 3080?




> I live in Germany, but I'm not European. A bit surprised to hear it costs so much there. Can't you just order from other countries, or are the shipping costs raised? I often buy from the UK since there are no import fees within the EU (won't last for long, I guess).



Delivery price depends on which country and carrier you choose to order from. Right now I don't have a PC and am waiting for the AMD CPUs to hit the market to see what will happen. Maybe I'll hold off for one more year for the new Intel and AMD CPUs. I'm not in a hurry.

In my free time I'm playing Pokémon Sword and Shield on the NS, haha.


----------



## Valantar (Nov 5, 2020)

ixi said:


> Let's look at it the way I see it.
> 
> The 3080 retail price needs to be around 850, not freaking 1100-ish.
> 
> ...


But that's exactly what was said here: they are overpriced, not "crap products". If the official pricing was that high that would indeed mar the overall impression of the product, but if distributors and retailers are pushing prices up due to low supply, well, that doesn't reflect poorly on the product, only on the people/companies increasing prices.


----------



## EarthDog (Nov 5, 2020)

ixi said:


> Lets go from aspect which I see it.






Valantar said:


> But that's exactly what was said here: they are overpriced, not "crap products". If the official pricing was that high that would indeed mar the overall impression of the product, but if distributors and retailers are pushing prices up due to low supply, well, that doesn't reflect poorly on the product, only on the people/companies increasing prices.





EarthDog said:


> Surely we understand the difference between a 'crap' product and one that is overpriced, right?


----------

