
NVIDIA GeForce RTX 3070 Founders Edition

Nice and detailed review as always! I really liked the cooler comparison part you added this time around. Wow, Nvidia really created an efficient cooler design on this little guy too.
 
I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer to the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and put more VRAM on top to make the 3070 look silly. What am I missing?

Last time, they dropped the price by $50 the day before release day ... which can only be seen as raising the performance white flag. You don't lower prices when you have a better product; you raise them.


Looking forward to not being able to buy this anywhere. And when it does pop up somewhere, it's gonna be overpriced.

You could sacrifice being the first one on the block to have one ... and just wait till the new and improved later steppings arrive with mature BIOSes, flaws corrected, and slightly better OCs.


I mean 3070 is such an easy target for AMD to beat. ....

I think if it were so easy, AMD would have done it with 7xx, 9xx, 10xx and 20xx. Last gen the only tier winner was the 5600 XT. I hope they can do it ... but it's not like they've had a horse in the upper tiers in quite a while.


I was able to use more than 4GB the day the 1070 released because game makers released new options and new texture packs.
4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before and I think we shouldn't be so glib about it.

That wasn't driven by RAM as much as by GPUs that were by then capable of handling higher resolutions ... if you look at the same-card VRAM comparisons done for the 6xx, 7xx, 9xx and 10xx series ... the conclusion is always the same: when you hike the settings to a point where VRAM matters, the game is unplayable. Comparing games with X VRAM vs Y VRAM, yes, when pushed with higher loads and max settings we do see double-digit percentage differences ... like 15 fps vs 20 fps. But either way, no one is going to play a game at those settings at that resolution. The problem here is not the VRAM; don't play at 2160p with a GPU targeted at 1440p. In every era, the VRAM thing rears its head, and in every era it has come to the same conclusion.

6xx Era - https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
So, what can we glean from all of that? For one thing, with any single monitor the 2GB video card is plenty - even on the most demanding games and benchmarks out today. When you scale up to three high-res screens the games we tested were all still fine on 2GB, though maxing out some games’ settings at that resolution really needs more than a single GPU to keep smooth frame rates.

7xx Era - http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
You couldn't install Max Payne on a system with a 2 GB 770 (the game reportedly uses 2.7 GB), but it would install on the 4 GB card, and yet ... when they popped out the 4 GB and installed the 2 GB, it ran at the same fps with the same visual quality and user experience. RAM allocation and RAM usage are two different things. Resolutions used in this test went up to 5760x1080.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ***If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards***."

9xx Era - https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,26.html
The GTX 960 should probably have been launched with 3 GB VRAM standard, hence I can recommend these 4 GB versions very much. But yeah, the end-results remain a bit trivial with the price-premium in mind -- that I need to admit.

10xx Era - http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

"We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly ... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”

"We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” ... Based on our results, I would say that the answer is yes — but the situation is more complex than we first envisioned.

First, there’s the fact that out of the fifteen games we tested, only four could be forced to consume more than 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this. While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, ***provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU***."
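
If it helps make the allocation-vs-usage distinction concrete, here's a minimal sketch (assuming an NVIDIA GPU, a current driver, and the pynvml bindings are installed) of how most monitoring tools read VRAM counters via NVML. The "used" figure is memory the driver has reserved, which is exactly the requested-not-touched number Brandon Bell describes.

```python
# Minimal sketch: read VRAM counters the way most monitoring tools do,
# via NVML (pip install pynvml). Assumes an NVIDIA GPU and driver.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

# NVML reports memory the driver has allocated/reserved on the device,
# not the working set the GPU actually touches each frame.
print(f"total:    {info.total / 2**30:.2f} GiB")
print(f"reserved: {info.used / 2**30:.2f} GiB  ('used' = allocated, not touched)")
print(f"free:     {info.free / 2**30:.2f} GiB")

pynvml.nvmlShutdown()
```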

Also ... in TPU's 1060 3GB review, you have to wonder why Nvidia stripped out 10% of the shaders relative to the 6 GB version. The reason is the 6 GB card had to show a performance difference to justify its price, and without the cut in shader count, it would not. When you look at 1080p scores between the two, the 6 GB card with 11% more shaders is 6% faster ... It stands to reason that if VRAM were in any way associated with that difference, that 6% would grow substantially at 1440p ... in the real world it does not; it's the same 6% performance difference.

There are certain games that show anomalies, such as poor console ports. I have seen the performance difference between the 3 GB and 6 GB get bigger at the 1440p resolution ... and one game where it got closer at 1440p. In short, 99% of the time, where we see substantial differences in fps between the same cards with different memory amounts, the GPU was stressed to a point where the game was unplayable. At any combination of resolution and settings, if VRAM is going to be a problem, the GPU is already a problem. Finding "a game" that doesn't follow this observation doesn't change the rule; there are a few rare anomalies.

Pay attention to the article ... "With those performance numbers, RTX 3070 is the perfect choice for the huge 1440p gamer crowd". That 8 GB is just fine for 1440p. Adding 4 GB isn't going to help you; by the time you get past 8 GB (used, not allocated) at 2160p and high settings, your GPU will be delivering < 30 fps. Again, W1zzard nailed it:

"The GeForce RTX 3070 comes with 8 GB of memory, which will be the basis for a lot of discussion, just like on the RTX 3080 10 GB. I feel like 8 GB is plenty of memory for the moment, especially considering this card is targeted at 1440p gaming, which isn't as memory-intensive as 4K. Even with 4K, I feel you'll run out of shading power long before memory becomes an issue."

As to why Nvidia timed it this way? ... Because they are not idiots. Back in the 7xx era, AMD was in trouble. They had nothing ... many pundits were asking "why does the 780 have the specifications that were leaked for the 770?" AMD spent a fortune building up the 290 / 290X over several months, and the week before its release, Nvidia suddenly announced that they apparently had this 780 Ti thing that had just been sitting on the shelf and would ship in two weeks. Took all the wind out of AMD's sails, and the 2xx series fell flat.

Why do they charge what they charge? ... Because no one is stopping them. Price is what the market will bear, and 4.5 times as many people are choosing with their wallets to pay the price premium. AMD doesn't have a recent-generation card in the top 20; Nvidia has 7. The top 5 cards in use are 1xxx series cards ... the 580 places 10th and the 570 places 16th. What message is sent when new cards come out and folks are buying them up at 10, 20, 30% or more over MSRP? ... It says "we can charge more." Corporate officers are bound, within legal boundaries, to maximize profit for their investors; failure to do so is grounds for dismissal or even malfeasance charges.

The 480 had more RAM than the 1060, but what did it accomplish? ... With both cards overclocked, the 1060 was 16.6% faster in TPU's gaming test suite, quieter, and ran cooler. When folks say "it will be easy to beat Nvidia" ... upon what recent past wins is this being based? The 1060 is the most popular card in use today with 10.79% market share ... the 480 has 0.49% ... the 580 has 2.22%. Combined that's 2.71%, or 25% of the 1060's share. That's not a win ... the 5600 XT is a win; it's clearly better than the competition in that price range.
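
A quick back-of-the-envelope check of those share figures, sketched in Python (numbers taken as quoted above from the Steam survey, not independently verified):

```python
# Share figures as quoted in the post above (Steam hardware survey).
gtx_1060 = 10.79  # % of cards in use
rx_480   = 0.49
rx_580   = 2.22

polaris = rx_480 + rx_580
print(f"480 + 580 combined:   {polaris:.2f}%")            # 2.71%
print(f"fraction of the 1060: {polaris / gtx_1060:.0%}")  # ~25%
```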

If we want prices to drop, one of two things has to happen, only one of which seems possible:

a) Stop buying
b) A competitive product hits the market

a) is not going to happen, leaving us with b) as the only possible price-impacting scenario ... so while I can still root for b) ... nothing AMD has done in the last 7-8 years (other than the 5600 XT) tells me that this is likely to happen. Having more RAM is not a feature unless it brings something to the table. The fact that Nvidia is beginning to show their hand with the xx70 tends to suggest that they know something we don't, and that's not a good sign. I hope I'm wrong. Our hope for better pricing lies in AMD being able to deliver a 3070-level, "5600 XT like" card that has the power to compete with the 3070 ... slapping extra RAM on it is not going to help here.
 
Congrats on 17 days and many more to come.

Looks like a winner if they can keep it in stock. Any ideas on the red team and their upcoming hardware launch? Do you have any others yet?
 
I just released a card, XYZ; it's 10% faster than the RTX 3070 and only $449 ... availability is about the same as for the other cards.
 
Looks like the best 3000-series card IMO.

It's a paper launch for now, so I guess the real question is whether it's still $499, and whether anyone can actually buy one at that price once AMD's 6000 series is available to buy.
 
Remember, Ampere is a brand new architecture with 1st-gen drivers; I expect another few % from driver optimisations in the next few months. Add to that an overclock for another 5% performance, and it's a solid improvement over the 2080 Ti.
 
MASSIVE SNIP

I was reading these amusing comments until I got to this. John, seriously, I've never seen a bigger post in my life, and it probably has more words than the actual review!!

Calm down man, calm down...... "smh" for real.

As you were.
 
The issue with AMD will be convincing consumers that they are reliable from day one. Let's be honest, they have yet to convince us on that front.
 
price-performance is only "disruptive" because the 2000 series price-performance was dogshit; the 3000 series is what the 2000 series should've been in terms of price-performance
 
price-performance is only "disruptive" because the 2000 series price-performance was dogshit; the 3000 series is what the 2000 series should've been in terms of price-performance

Incredible, that's so true. Let's hope tomorrow AMD can compete on price, performance, and availability.
 
Incredible, that's so true. Let's hope tomorrow AMD can compete on price, performance, and availability.
Oh, they definitely can compete; the only question is whether they feel altruistic enough to undercut Nvidia when it comes to prices.
My bet is on "no".
 
I seriously hope RDNA2 will not be another round of "inferior" cards with a bunch of extra VRAM on them to claim they're more "future proof".

AMD's current flagship, Radeon VII, holds up so remarkably well against RTX 3070's measly 8 GB in 4K… oh wait. :rolleyes:
How has AMD previously fared in 3DMark? According to the leaks, it surely does look mighty. This paper launch sort of proves the point. I expect this to be a Zen 2 moment for AMD's graphics card division.

(attached: leaked 3DMark results chart)
 
How has AMD previously fared in 3DMark? According to the leaks, it surely does look mighty. This paper launch sort of proves the point. I expect this to be a Zen 2 moment for AMD's graphics card division.

(attached: leaked 3DMark results chart)
If the architecture has really changed, then nobody can really tell from this.

Tens of thousands of cards shipping is not a paper launch. There were probably more RTX 3080s shipped on launch day than Radeon VIIs shipped in that card's entire lifespan.
 
You're saying this wasn't rushed? Sure as hell, the leather jacket could have done a better job of supplying more cards to the market. At any rate, let's wait another month and see how AMD handles this.
 
Do some of you understand we live in pandemic times? I wonder if the same whiners will be complaining when there's a shortage of AMD's CPUs and GPUs, not to mention shops increasing prices because of it.
 
You're saying this wasn't rushed? Sure as hell, the leather jacket could have done a better job of supplying more cards to the market. At any rate, let's wait another month and see how AMD handles this.
Rushed? What specifically is rushed?
This launch window was planned out two years ago.
The fact is that postponing it wouldn't have improved the supply to date; that would require more production lines from Samsung.
 
Hmm, for 1440p gaming it's only 24% faster on average than the 2070 Super. Would have expected 30% at least. At least the price is good, well, at least compared to Turing, but I'm still waiting for RDNA2 to see where we land this generation. Hopper is the Nvidia card to wait for.
 
I don't think we can gauge the value on this until cards come out and we see prices.

We know Nvidia played funny buggers with pricing to AIBs, forcing 3080s and 3090s to be hundreds of dollars dearer than MSRP. If we get pricing around MSRP, this card looks awesome (although I'll couch that with a "maybe" for the next day, until we see AMD), but if it's higher, well, that value proposition starts falling pretty quickly (especially with the number of 2080 Tis on the used market, which you should be able to get 20% cheaper than the 3070's MSRP).
 
Not very impressive given that the card costs around $800 here, and won't be available. Nvidia messed up Ampere with their pricing and availability; hopefully AMD won't make the same mistake, or at least will launch their cards at the same MSRP everywhere and not inflate prices outside the US for no reason.
 
Well, considering I have the 2080 Ti, I don't see myself upgrading to the 3070. I'll wait until the 3080 becomes available, I guess. Although at this point I might just wait for the 3080 Ti and hope there's more stock of that when the time comes. I'm having a hell of a time finding a 3080 on Newegg (where I have the store credit card).
 
8 GB VRAM? Only time will tell, when PS5 and Series X games start becoming the norm.

One of my mistakes last time was getting a system that was good enough for current games; but when the PS4 and Xbox One started becoming the baseline, I suddenly had to drop details and run at a lower fps target just to make games playable.

Probably the same for the CPU too; the PS4 and Xbone Jaguar cores were doomed from the start, but now the next gen is armed with much more capable Zen 2 cores.
 
8 GB VRAM? Only time will tell, when PS5 and Series X games start becoming the norm.

It should be noted that, because of the bus split, the Xbox expects 10 GB to be VRAM, with the other 4 GB or so for game-related memory usage (and 2 GB for the OS).

Obviously the PS5 is much more dynamic because it has a full 16 GB without a split bus, but I'd expect anything multiplat to adhere to that at console-level fidelity settings.
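
For a rough sense of the budget being described, here's a back-of-the-envelope sketch using the commonly cited Series X figures (the post above rounds them to roughly 4 GB game memory and 2 GB OS); treat the constants as assumptions from public specs, not anything verified here.

```python
# Back-of-the-envelope Xbox Series X memory budget.
# Constants are the commonly cited public figures; the post above rounds them.
GPU_OPTIMAL_GB = 10.0   # 560 GB/s partition, intended as "VRAM"
STANDARD_GB    = 6.0    # 336 GB/s partition
OS_RESERVED_GB = 2.5    # carved out of the standard partition

game_general = STANDARD_GB - OS_RESERVED_GB   # ~3.5 GB for non-VRAM game data
print(f"VRAM-like budget:    {GPU_OPTIMAL_GB:.1f} GB")
print(f"general game memory: {game_general:.1f} GB")
print(f"total for games:     {GPU_OPTIMAL_GB + game_general:.1f} GB of 16 GB")
```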
 
It should be noted that, because of the bus split, the Xbox expects 10 GB to be VRAM, with the other 4 GB or so for game-related memory usage (and 2 GB for the OS).

Obviously the PS5 is much more dynamic because it has a full 16 GB without a split bus, but I'd expect anything multiplat to adhere to that at console-level fidelity settings.
Both systems will also utilize NVMe PCIe 4.0 SSDs; this is the first time consoles have actual optimization and configuration for SSDs.
 
8 GB VRAM? Only time will tell, when PS5 and Series X games start becoming the norm.

One of my mistakes last time was getting a system that was good enough for current games; but when the PS4 and Xbox One started becoming the baseline, I suddenly had to drop details and run at a lower fps target just to make games playable.

Probably the same for the CPU too; the PS4 and Xbone Jaguar cores were doomed from the start, but now the next gen is armed with much more capable Zen 2 cores.

Personally, I think VRAM usage probably won't increase the way it did when games started being developed exclusively for the 8th gen. Yes, it will increase, but probably only a bit. The 9th-gen consoles only double the total RAM inside the console, from 8 GB to 16 GB; from the 7th gen to the 8th gen, the increase in total RAM was 32 times (PS3) and 16 times (360) respectively. When game developers suddenly had significantly more RAM to use, they started brute-forcing it instead of doing the extreme optimization they had to do on the 7th gen. This time they'll probably have to be more cautious with their resource management, especially as the 9th-gen consoles are meant to target 4K (which will need more VRAM).
 
Personally, I think VRAM usage probably won't increase the way it did when games started being developed exclusively for the 8th gen. Yes, it will increase, but probably only a bit. The 9th-gen consoles only double the total RAM inside the console, from 8 GB to 16 GB; from the 7th gen to the 8th gen, the increase in total RAM was 32 times (PS3) and 16 times (360) respectively. When game developers suddenly had significantly more RAM to use, they started brute-forcing it instead of doing the extreme optimization they had to do on the 7th gen. This time they'll probably have to be more cautious with their resource management, especially as the 9th-gen consoles are meant to target 4K (which will need more VRAM).
In the end, we just have to find out.
 