Tuesday, September 8th 2020

NVIDIA GeForce RTX 3070 Founders Edition Pictured in the Flesh

Here's one of the first clear pictures of an NVIDIA GeForce RTX 3070 Founders Edition graphics card in the flesh (that isn't an NVIDIA press shot or render). A PC enthusiast in China with access to Founders Edition cards of all three RTX 30-series models announced on September 1 posted a family shot, which provides a nice size comparison.

The RTX 3070 Founders Edition is noticeably shorter than the RTX 3080 FE. Both cards are dwarfed by the RTX 3090 FE. Unlike the RTX 3080 and RTX 3090, the RTX 3070 FE uses a more conventional approach to airflow, with both of its fans on the obverse side of the card, even though the second fan still pushes some of its airflow through a partial cutout in the PCB. All three cards use the 12-pin Molex Micro-Fit 3.0 power connector. The previous-generation flagship RTX 2080 Ti has become the butt of gamer memes thanks to the RTX 3070, as NVIDIA advertised the new card as being faster than the RTX 2080 Ti at less than half its price, starting at $499. This announcement has forced some RTX 2080 Ti owners to dump their cards on eBay at throwaway prices.
Source: David Eneco (Twitter)

50 Comments on NVIDIA GeForce RTX 3070 Founders Edition Pictured in the Flesh

#26
ppn
Hourglass, back-side exhaust fan thing, with air-blocking front cover, but I'm waiting for enthusiasts like Optimum to put one of those under water cooling in a single-slot mITX case. The 3070 only makes sense if the price drops to 50% of the 2080 Super.
Posted on Reply
#27
ThrashZone
Hi,
Not sure how the 3080 and 3090 are different sizes when the water blocks are the exact same model :confused:
Posted on Reply
#28
Milky
But who is Charles and did he ever give the charger back?
Posted on Reply
#29
ratirt
bugWhat exactly isn't real? Because I see real, bigger than expected performance jumps (even if real world benchmarking puts that 10-20% lower) at roughly the same prices as before.
You see it? So far I haven't seen any performance benchmarks, just NV claims and charts which give sketchy numbers. Good these are coming out, but I'd rather hold off my celebration until these are actually out and benchmarked.
Posted on Reply
#30
buzzi
Am I the only one that sees fan spinning?
Posted on Reply
#31
Turmania
I will wait for the reviews on power consumption numbers as I run an ITX case and heat is always an issue. But I would be lying if I said the 3070 is not tempting. Who would say no to a $500 card that performs better than a $1,200 card you can get today?
Posted on Reply
#32
Vayra86
ratirtYou see it? So far I haven't seen any performance benchmarks just NV claims and charts which give sketchy numbers. Good these are coming out but I'd rather hold off my celebration after these are actually out and benchmarked.
We've seen numbers already in a Digital Foundry test which obviously favored Nvidia's current selection of titles there, but there is a wide variety of games on show, including titles like Shadow of the Tomb Raider. There you can see some order of 50-70% perf increase in comparisons. Not for the 3070 though, just yet, but it does give us an indication that the perf bump is real.

And on top of that, we also know historically Nvidia's claims are generally true, with the usual reservation that it's best-case numbers we're getting. But even that... it's never a truly inflated claim; even if the performance wasn't there on launch day, it still got there after a few driver updates. And more, too. Remember Shader Cache? And all the improvements on Kepler? None of those were advertised or reviewed, but we still got them.

There is nothing sketchy about it, in any case. You'd also have to wonder 'who benefits'... Nvidia will tarnish years of good reputation and trust if they inflate their claims now, while they have nothing to win... these GPUs virtually sell themselves at these price/perf levels. Especially that 3070.
tiggerThe 3090 is comical
It's so friggin huge. A bracket might be compulsory or you might rip a hole in your board :D
Posted on Reply
#33
TheDeeGee
Hmm, so the 3090 cooling doesn't work the same as the 3080 then, since the other side is covered up.
Posted on Reply
#34
AusWolf
"This announcement has forced some RTX 2080 Ti owners to dump their cards on Ebay at throwaway prices" - nobody was forced to do anything. Buying or selling is a choice you make on your own.

Buying a 2080 Ti right before the 30 series launch is a bad financial decision, but you win the prize of a first-class idiot if you get rid of it for a fraction of what you paid only because the 3070 is faster. :kookoo:
Posted on Reply
#35
Vayra86
AusWolf"This announcement has forced some RTX 2080 Ti owners to dump their cards on Ebay at throwaway prices" - nobody was forced to do anything. Buying or selling is a choice you make on your own.

Buying a 2080 Ti right before the 30 series launch is a bad financial decision, but you win the prize of a first-class idiot if you get rid of it for a fraction of what you paid only because the 3070 is faster. :kookoo:
I'd say buying the 2080 Ti new was the bad financial decision. Selling it fast is the sensible one, if that means you buy a 3070 in return for your 450-500 bucks. That's a free warranty/hardware renewal/improved RT performance all in one go. Of course, if you then proceed to order a 3090... yeah, that's just stacking fools, money, and parting all in one.

And if you can buy a 2080 Ti now for 500... The 3070 is likely to end up higher in the end, especially if you want an equal-ish AIB version of it. All you really miss out on is some RT perf, possibly. I also think it's not the greatest idea in the world. But hey, you do own a 2080 Ti at that price now. More importantly, the 3070 isn't out yet and could well end up just below the 2080 Ti in raster perf, and then that marginally lower or equal price is suddenly not so bad at all.
Posted on Reply
#36
EarthDog
ThrashZoneHi,
Not sure how the 3080 and 3090 are different sizes the water blocks are the exact same model :confused:
Easy. We only see the heatsink, not the PCB. PCB could easily still be the same under the heatsink. ;)
Posted on Reply
#37
Chrispy_
I'll still wait for benchmarks, but if the 3070 outperforms a 2080Ti then the most anyone can realistically expect to sell their (old, used, inefficient) 2080Ti for is about $400.
Posted on Reply
#38
milewski1015
Typo:

"with both of its cards on the obverse side of the card" should probably say "with both of its fans on the obverse side of the card"
Posted on Reply
#39
Easo
Raendor3070 beats 2080ti
We really gotta see the reviews first. The hype is unreal at this moment and it is not wise to feed it.
Posted on Reply
#40
Minus Infinity
If you're an Aussie please note Nvidia is refusing to sell FE cards in Australia, with no reason given at all.
Posted on Reply
#41
ratirt
Vayra86We've seen numbers already in a Digital Foundry test which obviously favored Nvidia's current selection of titles there, but there is a wide variety of games on show, including titles like Shadow of the Tomb Raider. There you can see some order of 50-70% perf increase in comparisons. Not for the 3070 though, just yet, but it does give us an indication that the perf bump is real.

And on top of that, we also know historically Nvidia's claims are generally true, with the usual reservation that its best-case we're getting. But even that... its never a truly inflated claim, even if the performance wasn't there on launch day it still got there after a few driver updates. And more, too. Remember Shader Cache? And all the improvements on Kepler? None of those were advertised or reviewed but we still got them.

There is nothing sketchy about it, in any case. You'd also have to wonder 'who benefits'... Nvidia will tarnish years of good reputation and trust if they inflate their claims now, while they have nothing to win... these GPUs virtually sell themselves at these price/perf levels. Especially that 3070.
It shows some % of uplift but you can't see any FPS anyway. I'd rather wait for the entire suite and then wrap my mind around how much faster these are in comparison to Turing.
It is not just NV. AMD does that marketing too: showing charts which don't express or give enough info or actual numbers, just % or "2x faster" etc.
I'd still rather wait before I get excited. The price is up a notch for me and that is not so good. Justification like "it is pricier because it is faster than Turing" is not convincing, and it still stands: this is not the way it should be, and I'm surprised people support these price bumps. That's just my opinion.
Posted on Reply
#42
Caring1
Minus InfinityIf you're an Aussie please note Nvidia is refusing to sell FE cards in Australia, with no reason given at all.
Not the first time, previously they were only available in OEM builds.
Posted on Reply
#43
Vayra86
ratirtIt shows some % of uplift but you cant see any FPS anyway. I'd rather wait for the entire suite and then wrap my mind around how much faster these are in comparison to Turing.
It is not just NV. AMD does that marketing too. Showing charts which dont express or give enough info or actual number. But % or 2x faster etc.
Still rather wait before I get excited. The price is up a notch for me and that is not so good. Justification like it is pricier because it is faster than Turing is not convincing and It still stands, this is not the way it should be and I'm surprised people support these price bumps. That's just my opinion.
Oh yeah absolutely, here take a Love for that. I can only agree :P We've certainly not seen enough to decide anything.
Posted on Reply
#44
bug
Minus InfinityIf you're an Aussie please note Nvidia is refusing to sell FE cards in Australia, with no reason given at all.
I think those cards are unavailable in many countries. But why would you want one? So far, FE cards have come with the lowest configured TDP, making them the slowest options.
Posted on Reply
#45
AusWolf
Vayra86I'd say buying the 2080ti new, was the bad financial decision. Selling it fast is the sensible one, if that means you buy a 3070 in return for your 450-500 bucks. That's a free warranty/hardware renewal/improved RT performance all in one go. Of course, if you then proceed to order a 3090.... yea, that's just stacking fools, money, and parting all in one.

And if you can buy a 2080ti now for 500... The 3070 is likely to end up higher in the end, especially if you want an equal-ish AIB version of it. All you really miss out on is some RT perf, possibly. I also think its not the greatest idea in the world. But hey, you do own a 2080ti at that price now. But more importantly, the 3070 isn't out, could well end up just below 2080ti in raster perf, and then that marginally lower, or equal price is suddenly not so bad at all.
Buying any graphics card above 3-400 bucks is never a financial decision, as the resale value drops the most sharply in the enthusiast segment. It's always been this way, and I'm positive that it's never gonna change.

I'm just puzzled that all those people who bought their 2080 Ti's even tried to look at it as something of monetary value, and not a piece of hardware to play games on - which they still can, despite any launch announcement from nVidia or AMD.
Posted on Reply
#46
Vayra86
AusWolfBuying any graphics card above 3-400 bucks is never a financial decision, as the resale value drops the most sharply in the enthusiast segment. It's always been this way, and I'm positive that it's never gonna change.

I'm just puzzled that all those people who bought their 2080 Ti's even tried to look at it as something of monetary value, and not a piece of hardware to play games on - which they still can, despite any launch announcement from nVidia or AMD.
Dunno man, this 1080 has been going for 3 years and will get sold around 250 EUR. When I bought it, sale price was around 500. That's 250 bucks for 3 years full-on high end gaming. The price of a midranger that is pretty much useless today.

Is it really more expensive? ;) If you "play your cards right" (badum tss)? It's really not my experience so far. Some upgrades have been completely cost neutral...
Posted on Reply
#47
AusWolf
Vayra86Dunno man, this 1080 has been going for 3 years and will get sold around 250 EUR. When I bought it, sale price was around 500. That's 250 bucks for 3 years full-on high end gaming. The price of a midranger that is pretty much useless today.

Is it really more expensive? ;) If you 'play your cards right' (badum tss) ? Its really not my experience so far. Some upgrades have been completely cost neutral...
That is a lucky scenario, but I'm sure you didn't buy that 1080 thinking how well it would keep its value in the next 3 years. ;)

On the other hand, your case demonstrates how little improvement Turing had over Pascal - or rather, how those improvements cost disproportionately more.
Posted on Reply
#48
Vayra86
AusWolfThat is a lucky scenario, but I'm sure you didn't buy that 1080 thinking how well it would keep its value in the next 3 years. ;)

On the other hand, your case demonstrates how little improvement Turing had over Pascal - or rather, how those improvements cost disproportionately more.
It's not luck. It's about not overpaying for something and timing it right, plus a bit of prediction on how the market might develop. But you're right, Turing most definitely kept Pascal's value high.

And yes, I do buy a new GPU thinking ahead about the resale value. I already had my sights set on an Ampere release around this point in time back when I bought the 1080. It could've also lasted until 2021. Turing could have also been a lot better. But back then we did already know Nvidia had more shrinking to do as 7nm was a known quantity and we also knew AMD wasn't going places anytime soon. Stagnation was bound to happen, followed by a bigger jump further away as green adopts 7nm. Which turned into 8... :P
Posted on Reply
#49
AusWolf
Vayra86Its not luck. its about not overpaying for something and timing it right, plus a bit of a prediction on how the market might develop. But you're right, Turing most definitely kept Pascal's value high.

And yes, I do buy a new GPU thinking ahead about the resale value. I already had my sights set on an Ampere release around this point in time back when I bought the 1080. It could've also lasted until 2021. Turing could have also been a lot better. But back then we did already know Nvidia had more shrinking to do as 7nm was a known quantity and we also knew AMD wasn't going places anytime soon. Stagnation was bound to happen, followed by a bigger jump further away as green adopts 7nm. Which turned into 8... :p
Fair enough. :) I guess you think about your purchases more than I do (which is good). I always just buy the highest tier that I can afford, and think about the resale value only when it's time to upgrade. My 1660 Ti was my first graphics card that I sold because of some fluctuations in the used market, and not because I needed an upgrade.

Edit: I don't suppose there's gonna be another big jump from 7/8 nm to something else in the near future, so I'm hoping Ampere will hold its value for a while (like Turing didn't).
Posted on Reply
#50
Vayra86
AusWolfFair enough. :) I guess you think about your purchases more than I do (which is good). I always just buy the highest tier that I can afford, and think about the resale value only when it's time to upgrade. My 1660 Ti was my first graphics card that I sold because of some fluctuations in the used market, and not because I needed an upgrade.

Edit: I don't suppose there's gonna be another big jump from 7/8 nm to something else in the near future, so I'm hoping Ampere will hold its value for a while (like Turing didn't).
Too early to tell, in my opinion. We need numbers: temperature under AIB coolers, boost behavior and spread within the same SKU, end performance, etc. So far Ampere certainly doesn't seem perfect at all; it's nothing like the super-refined Pascal in that sense. But not bad either, don't get me wrong. Could well be another Fermi. That sold well, too, especially in the midrange, and there is reason to believe you really don't need to go much higher (for the vast majority).

We really don't know yet how well Samsung's 8nm really does, especially compared to TSMC's 7nm, but there is very little reason to believe it's better.
Posted on Reply