
NVIDIA GeForce RTX 2070 Founders Edition

Whoa, whoa, whoa... the non-FE 2070 is supposed to be cheaper than my 1070? What kind of sorcery is that, huh? Weren't 1070s supposed to be ~$375, not $526? Eh?
Well, a 2070 will be $600 and above for me then... I guess a second-hand (non-miner) or even a new Vega 64 will be more than enough at 1440p; the 2070 is not really much faster...

Confirmed... $669 and above for the 2070 (well, the 2080 is ~$900 and the 2080 Ti is $1,500+ :laugh:)

But then the 2070 is cheaper than Vega 64 here. Intel... please help!
Ah, for once I don't have that issue... and also, why "Intel... please help!"? For their future GPU? I wouldn't count on it; it will probably be overpriced too :laugh:
 
Yeah, well, sadly the market has changed; not sure why it should be cheaper than a 1070 when it's better than a Vega 64. Maybe I'm smoking the wrong weed... but hey-ho.

The market clearly needs a third player. Rajmeister is at Intel... let's see if they can spice the market up from there; he failed with Vega.
 
What the market needs is 8 nm. We have a 445 mm² chip on 12 nm; the optical shrink of that on 8 nm is about 200 mm², dropping to a 192-bit bus with 6/12 GB. This 2070 is the 3060 in disguise, and that one will cost a lot less.
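The shrink figure above is just area scaling; a quick sketch of the arithmetic, assuming die area scales with the square of the linear feature size (a rough first-order approximation only, real shrinks rarely scale perfectly):

```python
# Rough optical-shrink estimate: area scales with the square of the
# linear feature size (first-order approximation only).
die_12nm_mm2 = 445
area_scale = (8 / 12) ** 2        # 12 nm -> 8 nm linear shrink, squared
die_8nm_mm2 = die_12nm_mm2 * area_scale
print(f"{die_8nm_mm2:.0f} mm^2")  # ~198 mm^2, close to the ~200 mm^2 claimed
```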
 
There's a strong possibility for a 2019 launch.
Only if "strong possibility" means wishful thinking. All the legit news sources are saying holiday season 2020 or even 2021.


In May, Sony Interactive CEO John Kodera revealed to the Wall Street Journal that the PS5 would not be released until at least 2021.
 
Why did they have to go for a ghetto mod for that power connector, when the PCB has solder points for an 8-pin connector?
 
Sorry, but this card should have been at least $200 cheaper to justify the purchase. A GTX 1080 goes for $350 in second-hand shops nowadays, so this card has ZERO value right now at its ridiculous price.
 
I really like the design and build quality of the FE. $100 more expensive than MSRP but looks sharp and is visibly a high-end piece.

Also nice to see the attention to detail in the internals, like in here:

[attached image: wtf3.jpg]
 
Stop counting ray tracing and DLSS as positives, because even in NVIDIA's own benchmarks ray tracing is not playable on the RTX 2080 Ti (unless you count 30-60 fps at 1080p playable on a $1,250 GPU in 2019?), and DLSS is about the same as reducing the rendered resolution: the same performance "boost" and the same image-quality penalty, again per NVIDIA's benchmarks, so imagine the lackluster "advantages" in real-world games. These positives should be ignored, so people see the real value of the RTX 20xx.

Because so far the RTX 20xx is going exactly as NVIDIA planned: they released a new generation that is WORSE than the previous gen, and most people still go "...yeah, but ray tracing and DLSS are a plus, at least they will be soon" and make the purchase. NVIDIA bent us over and milked us real good with this release...

Some might say: "Well, I see that and I won't buy, you won't buy, but why the salty hate?" The answer is that NVIDIA has set a new bar for total market screw-up. The next RTX 3080 will have an imaginary $800 MSRP (you didn't think they'd go back?), will cost $1,000 on the market, will be 0-5% faster than the RTX 2080, but will offer the new RZGSD (just made it up) as a positive that will "just work" like a charm at some point in the near future. And what do you think, will AMD or Intel sell their equivalent for $400? Or will they price-match whatever NVIDIA charges, knock off $50-100, and call it a day?
 
Well, if you can't afford it there's the 2060 next year, with 980 Ti performance. You just can't pay this price, and remembering how the GTX 280 went from $660 to $330 in just six months, nobody in their right mind should. This is early-adopter tax.

Now, the 20 series is clearly ~39% faster than the 10 series overall: the 1070 is 72% of a 2070, and the 1080 is 73% of a 2080 at 1440p. So the 1060 successor at 73% would be a "2060 Ti", not the 128-bit 2060, since that one drops the bus width a notch.

The 20 series could have been +50% faster at the same price if Samsung's 10 nm process had been ready on time and AMD/GloFo had started using it and lowered prices too.

NVIDIA expanded the die by ~50% just to fit RT and Tensor cores instead of more CUDA, for a reason.
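The percentages in the post above can be sanity-checked: saying "the 1070 is 72% of a 2070" is the same as saying the 2070 is about 39% faster. A quick sketch of that arithmetic:

```python
# "X is 72% of Y" means Y is (1/0.72 - 1) = ~39% faster than X.
ratio_1070_vs_2070 = 0.72
ratio_1080_vs_2080 = 0.73
print(f"2070 over 1070: +{1 / ratio_1070_vs_2070 - 1:.0%}")  # +39%
print(f"2080 over 1080: +{1 / ratio_1080_vs_2080 - 1:.0%}")  # +37%
```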
 
Stop counting ray tracing and DLSS as positives ... NVIDIA bent us over and milked us real good with this release...
Because people don't get it, they're gonna be ripped off every damn time.
Apple gives us a clear example of how things work with the crowd.
 
Looks good; the only downside is the price if you don't need RTX technology.
I'm wondering about one of the benchmarks, Monster Hunter World.
Did you leave Volumetric Rendering ON? Because it is bugged and extremely heavy at high resolutions like 4K.
 
NVIDIA expanded the die by ~50% just to fit RT and Tensor cores instead of more CUDA, for a reason.
Any source for that? Looking at the specifications of the Pascal, Volta and Turing cards, I would put that extra die space at around 20%. Possibly less, as Turing's conventional parts are Volta-derived, which has a few more features and is bigger than Pascal.
 
Turing is the first new card family I can remember that gives you less performance per dollar than the previous generation.
 
Turing is the first new card family I can remember that gives you less performance per dollar than the previous generation.
The thing I find funny is, when Vega 64 launched last year, many people were like "Hurr, GTX 1080-grade performance for the same price but a year late, hurr hurr". Well, NVIDIA has just launched a card with GTX 1080-grade performance for more money, two years late, and many seem to be lapping it up xD
 
Well, if the price goes down it's a decent card. I'll wait for next gen, though.

I wouldn't be surprised if NVIDIA refreshes Turing on sub-10 nm next year already. After all, Turing was designed for a smaller node, which could explain the small improvements.
 
Nice card; AIB customs are going for £459 here, no wonder they are selling out fast. Vega 64 and the GTX 1080 Ti need a price cut.
Sad state of affairs when second-hand NVIDIA cards are the only competition for... NVIDIA cards.

But then the 2070 is cheaper than Vega 64 here. Intel... please help!

Nice card? At OCUK, a custom 1080 is cheaper than the four reference- and Mini-style 2070s. It consumes more than a 1080, and the RTX features will probably provide negligible performance, judging from the 2080 Ti SotTR demo, BFV and the Metro Exodus news.
The ASUS Strix Vega 64 is £450, bundled with three games worth €150 (including AC Odyssey). So you are lying.
You must be joking, or a person straight from the NV factory. Do not lie, please. Lying is bad.

The 1080 Ti is much worse off; it is like the GTX 780 Ti compared to the GTX 1060 in async compute, Vulkan and such. It will slow down much faster in the future, and if DLSS is enabled, the 2070 takes no performance hit, while the 1080 Ti gets some 2x AA with a performance hit.
I just love how some NV trolls have invaded the forum.

The FE model is too pricey, and personally I think Nvidia should stop making them due to the negative response.

The base model RTX 2070 is a good buy though; at $500 it offers slightly better performance than a GTX 1080. Still, if they could cut another $50 it would be very well priced.

Don't forget that the 1000-series FE models ran at base clock speeds, while the 2000-series models are clocked around AIB levels. So when you get an AIB 1080, which is cheaper than the FE models, it offers the same performance as the RTX 2070 FE.

Techspot compared a factory-overclocked 1080 and 1080 Ti against a reference-clocked 2070.

Quote below from Techspot review...

The 2070 FE is 1620 MHz base, 1710 MHz boost. The ASUS Turbo and MSI Armor models are 1620 MHz boost. So the 2070 FE has a higher clock speed than the cheapest AIB models, and comparing it to a base-clocked (base in the very meaning of the word) 1080 does not give proper results.

Well, if you can't afford it there is the 2060 next year ... the 20 series is clearly 39% faster than the 10 series overall: the 1070 is 72% of a 2070, and the 1080 is 73% of a 2080 at 1440p ...
Nope.

It's around 27% faster than a 1070. The 700-to-900 jump was about the same performance-wise (though the 900 series gained a bit more over the 700 series than the 2000 series does over the 1000 series), but most importantly, they did it at CHEAPER prices comparing the same categories. The 1000-series cards were more expensive than their 900-series counterparts, but they were WAY FASTER (nearly double the performance leap percentage-wise). Now, with RTX, you get a slightly smaller 700-to-900-style performance leap with a massive price increase, ranging from $120 (2070) to a whopping $400-500 (2080 Ti). A complete failure of a generation.
https://static.techspot.com/articles-info/1727/bench/GTX1070.png
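Translating the "27% faster with a massive price increase" point into perf-per-dollar, assuming the $379 GTX 1070 launch MSRP and the $499 RTX 2070 base MSRP (a back-of-envelope sketch, not benchmark data):

```python
# Perf/$ comparison under the stated assumptions.
perf_1070, price_1070 = 1.00, 379   # baseline, $379 launch MSRP
perf_2070, price_2070 = 1.27, 499   # ~27% faster, per the post above
ppd_change = (perf_2070 / price_2070) / (perf_1070 / price_1070) - 1
print(f"perf/$ change: {ppd_change:+.1%}")  # about -3.5%: more money per frame
```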
 
None of the German tech press got an FE; only a very small number of US sites got one from NVIDIA.
They stopped the FE rollout given the lashing they've been getting on perf/$. They figured $500 would look better for the initial reviews.

The other major introduction is the Tensor Core, which made its debut with the "Volta" architecture. These too are specialized components, tasked with 4x4x4 matrix multiplication, which speeds up the building and training of AI deep-learning neural nets. Their relevance to gaming is limited at this time, but NVIDIA is introducing a few AI-accelerated image-quality enhancements that could leverage Tensor operations.
Relevance to gaming is as limited as RTX is. Tensor cores are an integral and required part of NVIDIA's RTX package. The Turing RT hardware can't sustain more than 2 rays/pixel at any sort of decent resolution, especially with shadows and reflections, so denoising via Tensor cores is a must. We're not talking path tracing here...
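For reference, the Tensor Core primitive discussed above is a fused multiply-accumulate on small matrix tiles, D = A×B + C. A pure-Python sketch of the math (the actual hardware works on FP16 inputs with FP16/FP32 accumulation; plain floats here just to show the operation):

```python
# One Tensor Core op on 4x4 tiles: D = A x B + C (fused multiply-add).
def fma_4x4(A, B, C):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) + C[i][j]
             for j in range(4)] for i in range(4)]

A = [[float(4 * i + j) for j in range(4)] for i in range(4)]   # 0..15
I = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]  # identity
C = [[1.0] * 4 for _ in range(4)]                              # all ones
D = fma_4x4(A, I, C)    # A times identity, plus ones -> every element of A + 1
print(D[0])             # [1.0, 2.0, 3.0, 4.0]
```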
 
Nope.

It's around 27% faster than a 1070.


Sorry to troll, but buying a 1080 Ti now is like buying a 780 Ti 11 months before the GTX 970, because that is how long it had left before dropping from $699 to something like $399. And it's not fair to use other sites' test results; OK, around +33% then. But I can't help noticing how Wolfenstein II makes good use of async compute and the 2070 destroys the Ti there, and other games just benefit greatly for some reason. The 1080 Ti doesn't have it; it is obsolete, don't buy. The 2070 will be worth $399 brand new in 6 to 11 months from now, so comparing prices today is useless, unless you are OK with burning money at a rate of $300 per year. That is all.

Any source for that? Looking at the specifications of the Pascal, Volta and Turing cards, I would put that extra die space at around 20%. Possibly less, as Turing's conventional parts are Volta-derived, which has a few more features and is bigger than Pascal.

545/314 mm² or 445/314 mm². 20% more CUDA only takes ~10% more space, so the rest is RT cores, Tensor cores, the bigger L1 cache and whatever else is new.
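The die-area remainder argument above as arithmetic; note that the "+20% CUDA costs ~+10% area" figure is the poster's estimate, taken at face value here:

```python
# TU106 vs GP104 die area, and how much is left over for the new blocks.
tu106_mm2, gp104_mm2 = 445, 314
total_growth = tu106_mm2 / gp104_mm2       # ~1.42x overall
cuda_growth = 1.10                         # +20% CUDA ~ +10% area (estimate)
other_growth = total_growth / cuda_growth  # RT, Tensor, bigger L1, etc.
print(f"total: +{total_growth - 1:.0%}, non-CUDA remainder: +{other_growth - 1:.0%}")
```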
 
Yeah, well, sadly the market has changed; not sure why it should be cheaper than a 1070 when it's better than a Vega 64. Maybe I'm smoking the wrong weed... but hey-ho.

The market clearly needs a third player. Rajmeister is at Intel... let's see if they can spice the market up from there; he failed with Vega.
Sarcasm ...

I meant my 1070 was $526 because it was overpriced (and it was the only custom model cheaper than an FE)... so the 2070 will not be $499 but $669 (confirmed price for me), and the only serious part of my post was: if I want to upgrade my 1070, the best option will be a Vega 64... the 2070 has no incentive over it... and V64 prices are slowly dropping here.

As for Intel... well, if Raj (I wouldn't call him "meister") failed with Vega... I wouldn't count on Intel's GPU :laugh: But he didn't fail with Vega; I find the Vega 56 and 64 quite good even today (if it weren't for the mining craze...), they just need some price drops...

Oh well, for my next upgrade Intel and NVIDIA are barred from my wishlist, I guess.
 
The FE model is too pricey, and personally I think Nvidia should stop making them due to the negative response.

The base model RTX 2070 is a good buy though; at $500 it offers slightly better performance than a GTX 1080. Still, if they could cut another $50 it would be very well priced.
The funny thing is, Phoronix ran some compute tests and this thing is so fast it offers the best price/performance ratio even at current prices :eek:
Of course, that doesn't do much for gamers...
 
The funny thing is, Phoronix ran some compute tests and this thing is so fast it offers the best price/performance ratio even at current prices :eek:
Of course, that doesn't do much for gamers...


The RTX series should have been marketed as lower end professional cards giving developers a chance to implement ray tracing and DLSS ahead of consumer availability of the technology.
 
The RTX series should have been marketed as lower end professional cards giving developers a chance to implement ray tracing and DLSS ahead of consumer availability of the technology.
They're still pretty darn fast in games as well. But those prices... if I never see them again on video cards, it will be too soon.
 
You can get a new one for $550 (which is still cheaper than what a 2070 goes for now)... but I've had used cards last way more than three years, so... not sure where you're getting that stat.

Also, this wasn't the case last generation, where a 1070 was faster and cheaper than a 980 Ti... so I'm also not sure where you're getting "the same argument has been made at every generational change", because that's definitely not true, at least not for GPUs. It was true for Intel's Sandy/Ivy/Haswell/Skylake releases, mostly because they were incremental and expensive for no reason (much like this).

This.
 
"editors choice" for so many negatives... umm.. ok. Meanwhile you'll see clearance (i suspect b/c you are already starting to see these occasionally today) $299/$350 1080s for black friday shopping season. Just buy one of those, overlock it and save yourself $200+
 
"editors choice" for so many negatives... umm.. ok. Meanwhile you'll see clearance (i suspect b/c you are already starting to see these occasionally today) $299/$350 1080s for black friday shopping season. Just buy one of those, overlock it and save yourself $200+
I'd go for a 1080 or even a used 1080 Ti anytime before this. The price/performance is light-years beyond what the 2070 offers.
 