# NVIDIA: GeForce RTX 3080 Reviews Delayed, RTX 3070 Availability Confirmed



## btarunr (Sep 12, 2020)

NVIDIA, in a GeForce community forums post by a staff member, announced that reviews of the GeForce RTX 3080 Founders Edition have been delayed to September 16, with the review NDA lifting at 6:00 AM Pacific Time. The NDA was originally slated to lift on September 14. According to a Reddit post by NVIDIA representative "NV-Tim," the delay comes in response to certain reviewers requesting more time, as COVID-19 impacted their sampling logistics.

In other news, NVIDIA announced that its $499 (starting) GeForce RTX 3070 graphics card will be available on October 15, 2020. The performance-segment graphics card is hotly anticipated by the gaming community, as the other two products in the series—the RTX 3080 and RTX 3090—are enthusiast-segment products. NVIDIA claims that the RTX 3070 beats the RTX 2080 Ti in performance, which means the card should be capable of 1440p high-refresh-rate gaming and 4K UHD gaming at 60 Hz.





*View at TechPowerUp Main Site*


----------



## HD64G (Sep 12, 2020)

nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).


----------



## Anymal (Sep 12, 2020)

I get a little excited every time I see nvidia and/or their logo in news titles. Does it make me a fanboy?


----------



## arbiter (Sep 12, 2020)

HD64G said:


> nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).


Looking at specs, the 3070 has ~30% more shaders and maybe 200 MHz higher listed clocks. At 1080p and 1440p it should be maybe 15-20% faster, but that is just a guess.


----------



## Metroid (Sep 12, 2020)

I'm a lot more interested in the 3060 Ti after its specs were leaked. The 3070 was born dead. The 3070 Ti will likely be a lot more interesting.


----------



## DuxCro (Sep 12, 2020)

HD64G said:


> nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).



He said "faster than 2080Ti"


----------



## Xuper (Sep 12, 2020)

DuxCro said:


> He said "faster than 2080Ti"
> View attachment 168445


All resolutions or just 1080p/1440p?


----------



## THANATOS (Sep 12, 2020)

arbiter said:


> Looking at specs, the 3070 has ~30% more shaders and maybe 200 MHz higher listed clocks. At 1080p and 1440p it should be maybe 15-20% faster, but that is just a guess.


RTX 3070 vs 2080Ti
SM: 46 vs 68(+48%)
Shaders: 5888(+35%) vs 4352
TMUs: 184 vs 272(+48%)
ROPs: 64 vs 88(+38%)
Bandwidth: 448GB/s vs 616GB/s (+38%)

Edit: If it really is as fast as or faster than the RTX 2080 Ti in standard rasterization, then good job Nvidia. The only downside is 8 GB vs 11 GB of VRAM, and a relatively high TDP for a smaller process.
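For anyone who wants to double-check those deltas, here is a quick sketch (using only the spec numbers quoted above; each percentage is simply the larger value over the smaller, minus one):

```python
# Relative deltas between the RTX 3070 and RTX 2080 Ti paper specs,
# as quoted above. Each delta is (larger / smaller - 1), rounded.
specs = {
    # name: (RTX 3070, RTX 2080 Ti)
    "SMs":       (46,   68),
    "Shaders":   (5888, 4352),
    "TMUs":      (184,  272),
    "ROPs":      (64,   88),
    "Bandwidth": (448,  616),  # GB/s
}

for name, (gm, ti) in specs.items():
    hi, lo = max(gm, ti), min(gm, ti)
    winner = "3070" if gm > ti else "2080 Ti"
    print(f"{name}: +{round((hi / lo - 1) * 100)}% in favor of the {winner}")
```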


----------



## isvelte (Sep 12, 2020)

DuxCro said:


> He said "faster than 2080Ti"
> View attachment 168445



Well, he also said the 3080 is double the performance of the 2080, but that was only in one game, Quake II RTX; the others are roughly +70-80% vs the 2080. So I'm not convinced this one is true. I wish these tech companies would learn that it's better to under-promise and over-deliver. Also, Turing shaders are not 1-to-1 with Ampere shaders; all the benchmarks we already know about the 3080 prove that.


----------



## Luminescent (Sep 12, 2020)

HD64G said:


> nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).


They can "fix" the 2080 Ti's performance through drivers so it ends up behind or equal to the 3070. For current games they might leave it alone, but future games will make the 3070 look better.


----------



## THANATOS (Sep 12, 2020)

Metroid said:


> I'm a lot more interested in the 3060 Ti after its specs were leaked. The 3070 was born dead. The 3070 Ti will likely be a lot more interesting.


Why would the 3070 Ti (Super) be more interesting? If it's just a full GA104 (392 mm²), then I don't think there will be more than 48 SMs, 6144 CUDA cores, 192 TMUs, 64 ROPs, and that won't increase performance by much.
Only if the amount of VRAM is doubled to 16 GB will I find it much more interesting, but then the question will be the price.


----------



## Kohl Baas (Sep 12, 2020)

Xuper said:


> All resolutions or just 1080p/1440p?



My guess is just RT.


----------



## Tomgang (Sep 12, 2020)

The GPUs for the 2-in-1 system I am planning to build are not yet officially confirmed: an RTX 3060 Ti, if the specs are true, and an RTX 3080 Ti, if it's named that (or maybe Super).


----------



## Metroid (Sep 12, 2020)

THANATOS said:


> Why would the 3070 Ti (Super) be more interesting? If it's just a full GA104 (392 mm²), then I don't think there will be more than 48 SMs, 6144 CUDA cores, 192 TMUs, 64 ROPs, and that won't increase performance by much.
> Only if the amount of VRAM is doubled to 16 GB will I find it much more interesting, but then the question will be the price.



Anything really would be more interesting than the 3070.


----------



## Vayra86 (Sep 12, 2020)

Metroid said:


> Anything really would be more interesting than the 3070.



Can you elaborate?


----------



## Xuper (Sep 12, 2020)

THANATOS said:


> RTX 3070 vs 2080Ti
> SM: 46 vs 68(+48%)
> Shaders: 5888(+35%) vs 4352
> TMUs: 184 vs 272(+48%)
> ...


I heard Nvidia counts FP32 units as CUDA cores. This means they're not the same as the Turing architecture's.


----------



## claylomax (Sep 12, 2020)

Kohl Baas said:


> My guess is just RT.



This.


----------



## ppn (Sep 12, 2020)

The 5888 shaders perform like 4352 FP32 + 1536 integer units, with the ability to run FP32 + integer 1:1 as 2944+2944, or as 2x2944 FP32. So Ampere can tap into 50% extra performance, but with the memory bus unchanged it is more like 25% faster than the 2080, which sounds close to a 2080 Ti.
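That split can be sketched as a toy throughput model (an illustration only, assuming half of the 5888 units sit on a datapath that can run either FP32 or INT32, per the description above):

```python
# Toy model of the Ampere dual datapath described above:
# 2944 units are FP32-only, 2944 can run FP32 *or* INT32.
TOTAL = 5888
FLEX = TOTAL // 2  # units that can switch between FP32 and INT32

def fp32_throughput(int_fraction):
    """Effective FP32 unit count when int_fraction (0.0-1.0) of the
    flexible datapath is occupied by integer work."""
    return (TOTAL - FLEX) + FLEX * (1 - int_fraction)

print(fp32_throughput(0.0))          # pure FP32 load: all 5888 units
print(fp32_throughput(1.0))          # fully mixed load: 2944 FP32 + 2944 INT32
print(fp32_throughput(1536 / FLEX))  # the 4352 FP32 + 1536 INT32 case above
```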


----------



## ThrashZone (Sep 12, 2020)

Metroid said:


> Anything really would be more interesting than the 3070.





Vayra86 said:


> Can you elaborate?


Hi,
Flight simulators, for one; even the 3070 Ti would be better, seeing it has 16 GB of video memory instead of a piddly 8 GB, if reports are correct.
Even the 3080 with 10 GB is anticlimactic; maybe the 3080 Ti will come with 20 GB.


----------



## ppn (Sep 12, 2020)

If you had to choose between a 3080 10GB and a 3070 16GB at almost the same price ($600-700), +60% performance or +60% memory, what would you choose? It's just a 3060 with a 3070 sticker on it.


----------



## ThrashZone (Sep 12, 2020)

Hi,
If the 3080 had at least 12 GB, nobody would care.


----------



## Vya Domus (Sep 12, 2020)

isvelte said:


> I wish these tech companies would learn that it's better to under-promise and over-deliver.



Over promising is always more effective, that's how marketing works.


----------



## QUANTUMPHYSICS (Sep 12, 2020)

Microcenter told me I could pick up a 3080 on Thursday morning September 18th. 

They told me I can pick up a Kingpin3090 on September 24th.


----------



## EarthDog (Sep 12, 2020)

Luminescent said:


> They can "fix" the 2080 Ti's performance through drivers so it ends up behind or equal to the 3070. For current games they might leave it alone, but future games will make the 3070 look better.


Please refrain from posting such drivel.


----------



## Naito (Sep 12, 2020)

Performance is looking fairly promising in a leaked RTX 3080 review:



> Firestrike - 200fps+, 31919 score, 25% faster than 2080Ti
> 
> Firestrike Extreme - 1440p, 20101 score, 24% faster than 2080Ti , 45% faster than 2080S
> 
> ...



Not sure if legit, but here is a link to source


----------



## BiggieShady (Sep 12, 2020)

Xuper said:


> All resolutions or just 1080p/1440p?





Kohl Baas said:


> My guess is just RT.





claylomax said:


> This.


Wait, what? We have already seen, in both RTX and non-RTX games, at least a 1.7x speedup (max 2x) from the 2080 to the 3080. That confirms the 3070 being faster than the 2080 Ti exactly in those games where the 2080-to-3080 speedup is closer to 2x.
I have seen Doom Eternal (Vulkan), Shadow of the Tomb Raider (partial RTX, DX12) and Control (full RTX, DX12) tested on the Digital Foundry YouTube channel, all showing consistent frame-rate scaling.


----------



## lexluthermiester (Sep 12, 2020)

Anymal said:


> I get a little excited every time I see nvidia and/or their logo in news titles. Does it make me a fanboy?


No, it just makes you excitable. And that's a good thing! It means you still enjoy the release of new hardware. Stay excitable, life's better that way!


----------



## Daven (Sep 12, 2020)

lexluthermiester said:


> No, it just makes you excitable. And that's a good thing! It means you still enjoy the release of new hardware. Stay excitable, life's better that way!



I agree with this sentiment, but loving corporations can lead to trouble in society; both corruption and monopolies form from such sentiment. Just make sure the excitement is for the technology and not for Nvidia or AMD or Intel etc.

And I want to reiterate those posts saying that 1 Ampere CUDA core DOES NOT EQUAL 1 Turing CUDA core. An Ampere core is about 0.7x a Turing core. If they were equal, a 3080 (8704 cores) would be way, way, way more than UP TO double the performance of a 2080 (2944 cores). Both boost clocks are the same, so you can rule out clock differences: 1710 MHz on the 2080 and 1710 MHz on the 3080.
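The 0.7x figure is easy to sanity-check from the numbers in this thread (a rough estimate only, taking NVIDIA's ~2x best-case claim at face value):

```python
# Sanity check of the ~0.7x per-core figure: if Ampere and Turing
# CUDA cores were equal, core count alone would set the speedup.
cores_3080, cores_2080 = 8704, 2944
naive_speedup = cores_3080 / cores_2080            # ~2.96x if cores were equal
observed_speedup = 2.0                             # NVIDIA's best-case 3080-vs-2080 claim
per_core_ratio = observed_speedup / naive_speedup  # effective Ampere core vs Turing core

print(round(naive_speedup, 2))   # 2.96
print(round(per_core_ratio, 2))  # 0.68, close to the 0.7x above
```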


----------



## laszlo (Sep 12, 2020)

Anymal said:


> I get a little excited every time I see nvidia and/or their logo in news titles. Does it make me a fanboy?



Nope; I myself prefer to be a fanboy of my wallet... squeeze it all until an upgrade is required, and I don't care about brand anymore... a best price/perf ratio fanboy...


----------



## Shatun_Bear (Sep 12, 2020)

arbiter said:


> Looking at specs, the 3070 has ~30% more shaders and maybe 200 MHz higher listed clocks. At 1080p and 1440p it should be maybe 15-20% faster, but that is just a guess.



These numbers are, frankly, nonsense. At 1080p there will be less of a difference. Anyway, leaked benchmarks have put the 3080 at just 30-35% faster than the 2080 Ti, so there's not a chance in hell the 3070 is 15-20% faster, or anywhere close. I would suggest you put down the green Kool-Aid and wait to be enlightened.


----------



## BiggieShady (Sep 12, 2020)

Shatun_Bear said:


> put down the green Kool-Aid


You have to realize there is no Kool-Aid.
The embargo prohibits comparing any frame rates in (p)reviews, but all the "nonsense" numbers, as you call them, are already known.


----------



## MxPhenom 216 (Sep 12, 2020)

HD64G said:


> nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).



No, I'm fairly certain he said it's faster than a 2080 Ti. Even the plot they showed had the 3070 higher up on it.


----------



## lexluthermiester (Sep 12, 2020)

Mark Little said:


> but loving corporations can lead to trouble in society.


Ah but I did not imply that anyone should love a company. Only that being excited about technology in general is a good way to keep life fun and interesting. And with the current generation of CPU's & GPU's there is much to be excited about.


----------



## BiggieShady (Sep 12, 2020)

MxPhenom 216 said:


> No, I'm fairly certain he said it's faster than a 2080 Ti. Even the plot they showed had the 3070 higher up on it.


They also said 2x the performance from 2080 to 3080... but that's the peak value in tests; it's commonly between 1.7x and 1.9x.
So I'd say the 3070 will be about the same as a 2080 Ti in real-world averages (but with more RTX grunt).


----------



## lexluthermiester (Sep 12, 2020)

MxPhenom 216 said:


> No, I'm fairly certain he said it's faster than a 2080 Ti. Even the plot they showed had the 3070 higher up on it.


This is correct. NVIDIA has clearly stated that the 3070 is faster than the 2080 Ti. It's all over the place.


----------



## Daven (Sep 12, 2020)

lexluthermiester said:


> Ah but I did not imply that anyone should love a company. Only that being excited about technology in general is a good way to keep life fun and interesting. And with the current generation of CPU's & GPU's there is much to be excited about.



Oh sorry, that was more directed at Anymal than you. I was jumping back to the original comment.


----------



## lexluthermiester (Sep 12, 2020)

Mark Little said:


> Oh sorry, that was more directed at Anymal than you. I was jumping back to the original comment.


I gotcha, no worries.


----------



## dalekdukesboy (Sep 12, 2020)

All I can say is....I'm very glad I didn't buy a 2080 series card, particularly the 2080ti when it was at full price...ouch.


----------



## BArms (Sep 12, 2020)

dalekdukesboy said:


> All I can say is....I'm very glad I didn't buy a 2080 series card, particularly the 2080ti when it was at full price...ouch.



You might as well never buy any technology if you're worried about it being replaced by something better.  I got my money's worth.


----------



## harm9963 (Sep 12, 2020)

This is very interesting.


----------



## TheGuruStud (Sep 12, 2020)

They have to hide the fact that the 3080 is the same as an overclocked 2080 Ti at 4K (it needs the RAM performance). IPC looks to be nearly identical. I said they're fake cores, but nooooo, Huang would never lie.


----------



## lexluthermiester (Sep 12, 2020)

dalekdukesboy said:


> All I can say is....I'm very glad I didn't buy a 2080 series card, particularly the 2080ti when it was at full price...ouch.


Really?


BArms said:


> You might as well never buy any technology if you're worried about it being replaced by something better. I got my money's worth.


Agree with this. I have greatly enjoyed my 2080. Of course I got mine before the "super" version came out. No regrets.


----------



## ZoneDymo (Sep 12, 2020)

dalekdukesboy said:


> All I can say is....I'm very glad I didn't buy a 2080 series card, particularly the 2080ti when it was at full price...ouch.



This is pretty much always the case though; the higher end you go, the more you deal with diminishing returns.



lexluthermiester said:


> No, it just makes you excitable. And that's a good thing! It means you still enjoy the release of new hardware. Stay excitable, life's better that way!



I don't really agree with this; it just means you fall for hype all the time, which is the purpose of the company but not good for "you" as a consumer.


----------



## lexluthermiester (Sep 12, 2020)

ZoneDymo said:


> I don't really agree with this; it just means you fall for hype all the time, which is the purpose of the company but not good for "you" as a consumer.


You can disagree all you want, doesn't make you right. There is nothing wrong with finding tech advances fun, interesting and exciting. Getting excited does not make one gullible either. You want to be dull and cynical? Feel free. Don't ruin things for anyone else.


----------



## Vya Domus (Sep 12, 2020)

Mark Little said:


> And I want to reiterate those posts saying that 1 Ampere CUDA core DOES NOT EQUAL 1 Turing CUDA core. An Ampere core is about 0.7x a Turing core. If they were equal, a 3080 (8704 cores) would be way, way, way more than UP TO double the performance of a 2080 (2944 cores). Both boost clocks are the same, so you can rule out clock differences: 1710 MHz on the 2080 and 1710 MHz on the 3080.



It doesn't work like that; a CUDA core just means an FP32 unit. One FLOP is one FLOP no matter what. Ampere appears to be less effective per SM, not per CUDA core, at least in terms of graphics performance. Everyone should stop talking about CUDA cores or stream processors in this context.


----------



## ZoneDymo (Sep 12, 2020)

lexluthermiester said:


> You can disagree all you want, doesn't make you right. There is nothing wrong with finding tech advances fun, interesting and exciting. Getting excited does not make one gullible either. You want to be dull and cynical? Feel free. Don't ruin things for anyone else.



Ermm, yeah? It does not make me wrong either, lol; what kind of response is that?
You shared an opinion and I shared mine, with some argumentation I might add, so if anything I am more right than the other way around.
Being critical means not falling into the traps set by companies to lighten your wallet a tad; cynical and dull is something else entirely.
And if anything I am not "ruining" anything; the ruining can only come from the critical mindset being proven correct, and then it's just facts. If it turns out to be all true, which is something someone with a critical mind also embraces, then all is fine and dandy and people can be happy.

But only at the appropriate time, with the appropriate information.


----------



## dalekdukesboy (Sep 12, 2020)

I'm glad my post got y'all thinking. I agree with the general idea that you buy new top-end GPUs expecting that, before you know it, the value has plummeted and a new card performs the same at 1/4 the price. However, maybe the 2080 Ti is older than I think; it just seems that it got totally eclipsed by a new generation very quickly and didn't last nearly as well as, say, my 1080 Ti, which I got for 500 bucks and kept for a couple of years while it still was in the mix performance-wise despite being aged technology.


----------



## ppn (Sep 12, 2020)

I hope the people that didn't receive their review samples still don't receive them on time, for making us wait. This is torture.


----------



## lexluthermiester (Sep 12, 2020)

ZoneDymo said:


> Being critical means to not fall into traps set by companies to lighten your wallet a tad, cynical and dull is something else entirely.


And that is where you failed to understand the context of what I was saying to the other user.



ppn said:


> I hope the people that didn't receive their review samples still don't receive them on time, for making us wait. This is torture.


There's a spiteful perspective.


----------



## Markosz (Sep 12, 2020)

Eh, there will be some sites who "didn't get the memo" and release the reviews on 14th anyways.


----------



## lexluthermiester (Sep 12, 2020)

Markosz said:


> Eh, there will be some sites who "didn't get the memo" and release the reviews on 14th anyways.


I'm sure that might happen, but NVIDIA will likely react poorly, as reviewers are expected to keep up with developing events.


----------



## sutyi (Sep 12, 2020)

Vya Domus said:


> It doesn't work like that; a CUDA core just means an FP32 unit. One FLOP is one FLOP no matter what. Ampere appears to be less effective per SM, not per CUDA core, at least in terms of graphics performance. Everyone should stop talking about CUDA cores or stream processors in this context.



The marketing team at work, mah dude. 10K CUDA cores sounds better than 5K CUDA cores with workload-dependent double FP32 performance...


----------



## Jomale (Sep 12, 2020)

So, the price will rise high; the cooler alone reportedly costs NVIDIA up to $150/card... And look at Reddit.


----------



## dicktracy (Sep 12, 2020)

1440p is the new 1080p. Thanks Nvidia!


----------



## AddSub (Sep 12, 2020)

Techwiz YouTuber nonsense and clickbait aside, as well as questionable "leaks," do we have official info on the $150 cooler? I know a little bit about metallurgy and a bit about electronics, so where are the cost numbers coming from? (/r/AMD doesn't qualify.) Are the motor windings in those fans not copper but gold or palladium or something rare and exotic? Machining, at both scale AND quality, has gotten ridiculously affordable over the last 25 years (thank you, massive build-up in China), and those heatsinks, while pretty hefty, are nothing we haven't seen before. In fact, compared to the giants of yesteryear, they might be a little on the lighter side. So again: aluminum, copper, zinc, iron... those are the usual suspects, and they are dirt cheap. Where is the $150 figure coming from? Sounds like absolute effin nonsense!


...
..
.


----------



## mahoney (Sep 12, 2020)

HD64G said:


> nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).


If you look at the slides it says Faster than the 2080ti


----------



## ppn (Sep 13, 2020)

It's 20 TFLOPS, so it must be 50% faster, right? No. We don't know what that number means yet. Until then, think of it as a cut-down 2080 Ti: 256-bit, with 4352 FP32 CUDA cores + 1536 integer units, or 2944+2944 depending on the load, or 5888 FP + 0 INT; the real 2080 Ti is 352-bit, with 4352 FP + 4352 INT. This is just a glorified 2080 Super with some 4 billion extra transistors potentially doing nothing useful. I can't justify the use of ray tracing yet, even at 1440p, unless it has no performance impact.
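For reference, the 20 TFLOPS headline number is just shader count times clock (a back-of-the-envelope sketch; the ~1.73 GHz boost clock is an assumption, and 2 FLOPs per core per clock counts an FMA as two operations):

```python
# Where the ~20 TFLOPS figure comes from:
# shaders x 2 FLOPs/clock (FMA) x boost clock.
shaders = 5888
boost_ghz = 1.73  # approximate RTX 3070 boost clock (assumption)
tflops = shaders * 2 * boost_ghz / 1000

print(round(tflops, 1))  # 20.4 peak FP32 TFLOPS on paper
```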


----------



## lexluthermiester (Sep 13, 2020)

AddSub said:


> do we have official info on $150 cooler?


Short answer, no. From everything that is available, it seems like it's not more than $20 (maybe $30) worth, no matter how "good" it looks.


----------



## damric (Sep 13, 2020)

If performance is real, this is the first time I'm excited by a green product since Maxwell. An RTX 3070 would be a good fit to my 60Hz 2160p monitor.


----------



## B-Real (Sep 13, 2020)

HD64G said:


> nVidia's CEO clearly said on the livestream that 3070=2080Ti performance, not higher. And when reviewed I predict that at 4K it will be slightly behind 2080Ti (~5%).



Then the graph lied.


https://i.cdn29.hu/apix_collect_c/2009/nvidia-geforce-rtx-3000/nvidia_geforce_rtx_3000_screenshot_20200901185706_2_original_760x760.jpg
		


"RTX 3070

Faster than 2080 Ti"



Naito said:


> Performance is looking fairly promising in a leaked RTX 3080 review:
> 
> 
> 
> Not sure if legit, but here is a link to source


Regarding the leaks on VideoCardz, it could be much less than expected: the 3080 is 24% faster at 4K than the 2080 Ti in Far Cry New Dawn, which is a more typical game, meaning no RT or DLSS. What is certain is that the FC ND 2000-series results are nearly identical to the ones benchmarked on Guru3D.









*NVIDIA GeForce RTX 3080 synthetic and gaming performance leaked - VideoCardz.com*

*NVIDIA GeForce RTX 2080 SUPER review - www.guru3d.com*
				






damric said:


> If performance is real, this is the first time I'm excited by a green product since Maxwell. An RTX 3070 would be a good fit to my 60Hz 2160p monitor.


Why since Maxwell? Pascal was maybe the best performance jump of all NV generations. Take NV's official marketing with a grain of salt (I know you wrote "if performance is real").


----------



## TheoneandonlyMrK (Sep 13, 2020)

AddSub said:


> Techwiz YouTuber nonsense and clickbait aside, as well as questionable "leaks," do we have official info on the $150 cooler? I know a little bit about metallurgy and a bit about electronics, so where are the cost numbers coming from? (/r/AMD doesn't qualify.) Are the motor windings in those fans not copper but gold or palladium or something rare and exotic? Machining, at both scale AND quality, has gotten ridiculously affordable over the last 25 years (thank you, massive build-up in China), and those heatsinks, while pretty hefty, are nothing we haven't seen before. In fact, compared to the giants of yesteryear, they might be a little on the lighter side. So again: aluminum, copper, zinc, iron... those are the usual suspects, and they are dirt cheap. Where is the $150 figure coming from? Sounds like absolute effin nonsense!
> 
> 
> ...
> ...


At what scale? These aren't being made at any great scale.
They're going to be unicorn tears before long.


----------



## damric (Sep 13, 2020)

B-Real said:


> Why since Maxwell? Pascal was maybe the best performance jump of all NV generations. Take NV's official marketing with a grain of salt (I know you wrote "if performance is real").



I guess because I was so disappointed by the smoking-hot turds known as Fermi and Kepler compared to what ATI/AMD was offering. When Maxwell arrived, NV was competitive again against GCN. Pascal was more of an evolution of Maxwell, no?


----------



## lexluthermiester (Sep 13, 2020)

damric said:


> Pascal was more of an evolution of Maxwell no?


No. Pascal was a complete architectural rebuild. Tegra was an evolution of Maxwell.


----------



## AddSub (Sep 13, 2020)

lexluthermiester said:


> Short answer, no. From everything that is available, it seems like it's not more than $20 (maybe $30) worth, no matter how "good" it looks.



That seems like a legitimate estimate. We are looking at what... 750 g of material? Probably nickel-plated copper (a vapor-chamber layout?), non-exotic fans by all accounts, no bling of any kind other than some aluminum trimming (possibly silver-painted plastic, even). This is not some next-gen futuristic design. It is literally a sub-1 kg heatsink (albeit nicely machined) with two fans (ball bearings?). $150? How? It doesn't even...

...these launches are starting to mirror console launches in the sophistication and quality of their rumors. ("THE NEXT PLAYSTATION IS TOTALLY AWESOME! IT HAS GOLD PLATED CONTROLLERS THAT COST $2700 EACH BRO!") That is the new target demographic, I guess.

...
..
.


----------



## Slizzo (Sep 13, 2020)

damric said:


> I guess because I was so disappointed by the smoking-hot turds known as Fermi and Kepler compared to what ATI/AMD was offering. When Maxwell arrived, NV was competitive again against GCN. Pascal was more of an evolution of Maxwell, no?



Dunno where you've been, but Kepler smoked what AMD had running at the time. The fact that NVIDIA felt it was OK to release their "midrange" GK104 to compete with AMD's top products should show you why.


----------



## ZoneDymo (Sep 13, 2020)

lexluthermiester said:


> And that is where you failed to understand the context of what I was saying to the other user.



I would say the failure is on your side; you are the one trying to impart information here. But do go ahead and try again.


----------



## efikkan (Sep 13, 2020)

AddSub said:


> Techwiz YouTuber nonsense and clickbait aside, as well as questionable "leaks," do we have official info on the $150 cooler? <snip> Where is the $150 figure coming from? Sounds like absolute effin nonsense!


I'm assuming these are the same kind of figures we typically get whenever Apple releases a product: more or less what it would cost _you_ to order one set of parts and assemble it yourself, not what it costs when you order millions of them. Anyone with a tiny bit of understanding of production would see that these cost claims are just worthless BS.
There is just no way that the RTX 3080, a card which will retail at $700, uses a cooler costing $150.



lexluthermiester said:


> No. Pascal was a complete architectural rebuild. Tegra was an evolution of Maxwell.


Actually, you're wrong. Pascal was a tweaked (and extended) version of Maxwell, introduced due to delays of Volta (just look at older roadmaps showing Kepler -> Maxwell -> Volta), but it's still a good design.


----------



## Mouth of Sauron (Sep 13, 2020)

Hey, @staff!

This basically means that you have the card already, and will be able to publish a review in a few days? YES/NO is enough...


----------



## HenrySomeone (Sep 13, 2020)

dalekdukesboy said:


> All I can say is....I'm very glad I didn't buy a 2080 series card, particularly the 2080ti when it was at full price...ouch.


That can only somewhat apply to the last couple of months; those who bought one close to release enjoyed two years of unrivaled top-tier performance, and the even smarter among them recently sold theirs for close to $1000 without much trouble, making it an unprecedentedly good value buy for a high-end GPU. I'd say you might as well be sad you didn't buy one at the right time, if you wanted to and had the means...


----------



## WeeRab (Sep 13, 2020)

3070 FE cards @ $499 will only be available to the 1%. 
That is:  Reviewers and selected 'influencers'.
The rest of us will have to put up with vastly inflated 'market' pricing due to fake availability issues.
Mark my words.
Just like Intel's BS 9900k/10900k pricing and the rtx2060 Unicorn,  released to counter AMD's rx5600. That nobody could actually buy.
They are playing you for suckers.


----------



## HenrySomeone (Sep 13, 2020)

What a load of red propaganda drivel! The RTX 2060, released to counter the RX 5600 XT?


----------



## efikkan (Sep 13, 2020)

WeeRab said:


> 3070 FE cards @ $499 will only be available to the 1%.
> That is:  Reviewers and selected 'influencers'.
> The rest of us will have to put up with vastly inflated 'market' pricing due to fake availability issues.
> Mark my words.
> ...


Why do people like you have to spread this FUD?
Several started doing this immediately after the announcement of Ampere. It seems like you are nervous.



WeeRab said:


> Just like Intel's BS 9900k/10900k pricing and the rtx2060 Unicorn,  released to counter AMD's rx5600. That nobody could actually buy.


These are outright lies.


----------



## Vayra86 (Sep 13, 2020)

ZoneDymo said:


> Ermm, yeah? It does not make me wrong either, lol; what kind of response is that?
> You shared an opinion and I shared mine, with some argumentation I might add, so if anything I am more right than the other way around.
> Being critical means not falling into the traps set by companies to lighten your wallet a tad; cynical and dull is something else entirely.
> And if anything I am not "ruining" anything; the ruining can only come from the critical mindset being proven correct, and then it's just facts. If it turns out to be all true, which is something someone with a critical mind also embraces, then all is fine and dandy and people can be happy.
> ...



Just lex being lex... somebody said a bad thing about his 2080 and that's how he gets. It's okay; he gets too excited about hardware.



lexluthermiester said:


> No. Pascal was a complete architectural rebuild. Tegra was an evolution of Maxwell.



No, Maxwell got several Pascal-intended (or roadmapped) tweaks. The whole CUDA-based architecture is an evolution of itself; this is true even today. The architecture always changes according to market demand. This also explains how their Titan lost many capabilities and floated between gaming and pro/semi-pro positioning all the time. And it still does, really.

Pascal was merely a shrink, with more bits cut off to make it more gaming-oriented, and improved power delivery. It's not too dissimilar from Ampere: Turing did the groundwork, and Ampere nails the efficiency with a shrink and a feature refinement. Similarly, Maxwell introduced the GPU Boost iteration that got wings with the new node.



WeeRab said:


> 3070 FE cards @ $499 will only be available to the 1%.
> That is:  Reviewers and selected 'influencers'.
> The rest of us will have to put up with vastly inflated 'market' pricing due to fake availability issues.
> Mark my words.
> ...



No, just you, apparently.

What's going on, guys? Jeeeesus. It's a bunch of GPUs, and people get all emotional, and the 'facts' flying across the table get weirder with every page and post.


----------



## Nater (Sep 13, 2020)

^ He's spot on. Only the people who make that zero-day "add to cart" click, or get in line at places like Microcenter, are getting these MSRP prices. On Sept 19th we'll see NIB 3080s on eBay for $1200. They'll be short on or out of stock through Christmas.

Not because it's a paper launch, but because it's that damn good and the demand is there.


----------



## puma99dk| (Sep 13, 2020)

I've been wondering if it would be worth it to get the RTX 3080 FE, but without knowing the noise, heat and so on of that cooler design, it makes me want to wait.

Plus, I'd love to see the price and performance of AMD this time around.


----------



## Fruban (Sep 13, 2020)

Anymal said:


> I get a little excited everytime I see nvidia and/or their logo in news titles. Does it make me a fanboy?



Just makes me wish I had money to burn


----------



## lexluthermiester (Sep 13, 2020)

efikkan said:


> Actually you're wrong. Pascal was a tweaked (and extended) version of Maxwell, which were introduced due to delays of Volta(just look at older roadmaps showing Kepler->Maxwell->Volta), but it's still a good design though.


Interesting theory, but that's not what happened.



ZoneDymo said:


> I would say the failure is on your side; you are the one trying to impart information here, but do go ahead and try again.


You can say whatever you wish...



Mouth of Sauron said:


> Hey, @staff!
> 
> This basically means that you have the card already, and will be able to publish a review in a few days? YES/NO is enough...


They're bound by NDA, they can't confirm or even deny that info.



Vayra86 said:


> Just lex being lex... somebody said a bad thing about his 2080 and that's how he gets. Its okay, he gets too excited about hardware.


And this is you being yourself. If by "lex being lex" you mean being objective and realistic, then yeah, I'm being me. Being excited by life is part of how one stays happy. 

The user above stated that having just bought a 2080 Ti they feel "buyer's remorse". They fail to realize that the card they bought is leaps and bounds better than what they had previously. Couple that with the fact that they have the card in hand and can actually use it, which means they won't have to wait for stock of the 30x0 series card about to be released, which will instantly be sold out for months to come. Choosing to be a cynic is just that: a choice. Choosing to be objective and positive is a far better option. Maybe you and a few others should try it some time instead of being, well, the way you are.


----------



## efikkan (Sep 13, 2020)

lexluthermiester said:


> Interesting theory, but that's not what happened.


Not a theory, just the facts, sir.
Volta was delayed; Pascal was put in between as a refinement, a shrunk and extended Maxwell with some improvements from Volta. Volta evolved into Turing, the Pascal successor in the consumer space.


Spoiler


----------



## lexluthermiester (Sep 13, 2020)

efikkan said:


> Not a theory, just the facts sir.
> Volta was delayed, Pascal was put in between as a refinement, shrunk and extended Maxwell with some improvements from Volta. Volta evolved into Turing, the Pascal successor in the consumer space.
> 
> 
> Spoiler


That's it? Pictures? LOL!


----------



## damric (Sep 13, 2020)

Slizzo said:


> Dunno where you've been but Kepler smoked what AMD had running at the time. The fact that NVIDIA felt it OK to release their "midrange" GK104 to compete with AMD's top products should show you why.



_"The Radeon R9 290 and R9 290X hit NVIDIA's high-end GPU lineup back in fall-2013. The $399 R9 290 was faster than the $999 GTX TITAN and the $650 GTX 780; while the R9 290X held out on its own until NVIDIA made a product intervention with the GTX 780 Ti and GTX TITAN Black to reclaim those two price points."_

-W1zzard, AMD Radeon R9 Fury X 4 GB Review, www.techpowerup.com


----------



## ppn (Sep 13, 2020)

puma99dk| said:


> I've been wondering if it would be worth it to get the RTX 3080 FE, but without knowing the noise, heat and so on of that cooler design, it makes me want to wait.
> 
> Plus, I'd love to see the price and performance of AMD this time around.



Avoid the 3080 no matter what; it uses the old 8nm process at ~44MTr/mm² density, instead of the 6nm ~66MTr/mm² that should be the norm. Even AMD doesn't use a real 7nm; it's ~41MTr/mm². I have a suspicion 7nm+ isn't any better, except for some clock and power improvements... Wait for Hopper.
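For what it's worth, those density figures roughly check out against the published die sizes and transistor counts (GA102 at ~28.3 billion transistors on ~628mm², Navi 10 at ~10.3 billion on ~251mm²); a quick sketch assuming those public numbers:

```python
# Transistor density (MTr/mm^2) from published die size and transistor count.
def density_mtr_per_mm2(transistors_millions, die_area_mm2):
    return transistors_millions / die_area_mm2

ga102 = density_mtr_per_mm2(28_300, 628.4)   # Ampere GA102, Samsung 8nm
navi10 = density_mtr_per_mm2(10_300, 251.0)  # RDNA Navi 10, TSMC 7nm

print(f"GA102:   {ga102:.0f} MTr/mm^2")   # ~45
print(f"Navi 10: {navi10:.0f} MTr/mm^2")  # ~41
```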


----------



## Caring1 (Sep 13, 2020)

Anymal said:


> I get a little excited everytime I see nvidia and/or their logo in news titles. Does it make me a fanboy?


It makes you a Tech head like most people here.


----------



## Fluffmeister (Sep 13, 2020)

damric said:


> _
> 
> 
> 
> ...



OMFG I CANT WAIT TO READ THIS!!!!1

That first comment of the review always makes me laugh.


----------



## lexluthermiester (Sep 14, 2020)

ppn said:


> Avoid the 3080 no matter what; it uses the old 8nm process at ~44MTr/mm² density, instead of the 6nm ~66MTr/mm² that should be the norm.


That is an uninformed statement. Samsung is the manufacturer of the 30x0 series dies. Even though it's 8nm, Samsung's process has differing characteristics from TSMC's and must therefore be judged on the merits of its performance, not the simple aspect of node size. And as performance numbers have yet to be released (completely ignoring leaks, which are summarily without merit), that information is not specifically known.


----------



## xBruce88x (Sep 14, 2020)

THANATOS said:


> RTX 3070 vs 2080Ti
> SM: 46 vs 68(+48%)
> Shaders: 5888(+35%) vs 4352
> TMUs: 184 vs 272(+48%)
> ...



Keep in mind the shaders are a bit different architecturally this time around. It should still be an improvement, but not as directly comparable to the 2080 Ti. Basically, one 2080 Ti shader/CUDA core is not the same as one 3070 core, even at the same clock speeds.
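As a rough back-of-envelope illustration of that point (the shader counts and boost clocks below are the publicly listed specs, and the doubled FP32 per Ampere SM is per NVIDIA's own claims; real game performance scales far less than linearly):

```python
# Back-of-envelope theoretical FP32 throughput from listed specs.
# An FMA counts as 2 FLOPs per shader per clock.
def tflops(shaders, boost_mhz):
    return shaders * boost_mhz * 1e6 * 2 / 1e12

rtx_2080_ti = tflops(4352, 1545)  # Turing
rtx_3070 = tflops(5888, 1725)     # Ampere (FP32 lanes doubled per SM)

print(f"2080 Ti: {rtx_2080_ti:.1f} TFLOPS")  # ~13.4
print(f"3070:    {rtx_3070:.1f} TFLOPS")     # ~20.3
```

On paper that is ~50% more FP32 throughput, yet NVIDIA only claims 2080 Ti parity, which illustrates exactly why one Ampere "core" doesn't translate into one Turing core's worth of game performance.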


----------



## Slizzo (Sep 14, 2020)

damric said:


> _"The Radeon R9 290 and R9 290X hit NVIDIA's high-end GPU lineup back in fall-2013. The $399 R9 290 was faster than the $999 GTX TITAN and the $650 GTX 780; while the R9 290X held out on its own until NVIDIA made a product intervention with the GTX 780 Ti and GTX TITAN Black to reclaim those two price points."
> 
> -W1zzard
> 
> ...



That's more than a full year after the release of the GTX 680 on the GK104 Kepler die. GTX 680 released March of 2012. And as you state, a bigger GK110 launched 6 months after the GTX 780 did and performed better than the 290X.


----------



## puma99dk| (Sep 14, 2020)

ppn said:


> Avoid the 3080 no matter what; it uses the old 8nm process at ~44MTr/mm² density, instead of the 6nm ~66MTr/mm² that should be the norm. Even AMD doesn't use a real 7nm; it's ~41MTr/mm². I have a suspicion 7nm+ isn't any better, except for some clock and power improvements... Wait for Hopper.



The problem is the RTX 3070 ain't powerful enough for 4K@144Hz, and the RTX 3090 is a rip-off, so Nvidia counted it right.


----------



## ratirt (Sep 14, 2020)

puma99dk| said:


> The problem is the RTX 3070 ain't powerful enough for 4K@144Hz, and the RTX 3090 is a rip-off, so Nvidia counted it right.


I think it might be hard even for the 3090 to hold 144FPS at 4K all the way in certain games. Dips will happen for sure. 
Wonder what caused the delay for the 3080.


----------



## watzupken (Sep 14, 2020)

ppn said:


> Avoid the 3080 no matter what; it uses the old 8nm process at ~44MTr/mm² density, instead of the 6nm ~66MTr/mm² that should be the norm. Even AMD doesn't use a real 7nm; it's ~41MTr/mm². I have a suspicion 7nm+ isn't any better, except for some clock and power improvements... Wait for Hopper.



If you go by that logic to not upgrade, then you will likely never upgrade. For all we know, Hopper may not be using a real 5 or 7nm either, so will you skip it again? 8nm is a refined Samsung 10nm, so I am not expecting it to be as dense as the 6nm you mentioned. Moreover, people care more about performance improvement than about how dense the chip is.



puma99dk| said:


> The problem is the RTX 3070 ain't powerful enough for 4K@144Hz, and the RTX 3090 is a rip-off, so Nvidia counted it right.


The mid-range xx70 models are generally not meant for 4K gaming in the first place. The same goes for AMD's side of things. High-FPS 4K is reserved for the top-end cards, and quite frankly, I don't think even the likes of the RTX 3090 can play all games at 4K@144Hz consistently. It may be possible with DLSS, but I recall that Nvidia didn't introduce DLSS to drive very high FPS; most DLSS implementations target 60FPS at up to 4K.


----------



## ratirt (Sep 14, 2020)

watzupken said:


> RTX 3090 can play all games at 4K@144hz consistently


I'm pretty sure not all games will run at 144FPS constantly with a 3090.


----------



## damric (Sep 14, 2020)

Slizzo said:


> That's more than a full year after the release of the GTX 680 on the GK104 Kepler die. GTX 680 released March of 2012. And as you state, a bigger GK110 launched 6 months after the GTX 780 did and performed better than the 290X.



Quite underwhelming considering the price.

AMD was able to keep refreshing the same old GCN GPUs for years because the Kepler portfolio was so bad, lol.


----------



## puma99dk| (Sep 14, 2020)

ratirt said:


> I think it might be hard even for 3090 to get 144FPS at 4k all the way in certain games. Dips will happen for sure.
> Wonder what caused the delay for 3080.



Same here, because a delay is interesting no matter if it's good or bad; it's more about the reason why.



watzupken said:


> The mid range xx70 models are generally not meant for 4K gaming in the first place. The same goes for the AMD's side of things. High FPS 4K is reserved for the top end cards, and very frankly, I don't think the likes for the RTX 3090 can play all games at 4K@144hz consistently. It may be possible with DLSS, but I recalled that Nvidia introduced DLSS not meant to drive very high FPS. Most DLSS implementations target 60FPS up to 4K.



You don't say. That's why the RTX 3080 is a good competitor here: the RTX 3090 costs about 1900USD from Nvidia's own site, while the RTX 3080 sits at a more reasonable 900USD, give or take, depending on AIB models. I know I am not blowing 1900USD on the FE, and even more for an AIB card, not gonna happen. This is also why I think it's sad that AMD ain't out with something yet.


----------



## kayjay010101 (Sep 14, 2020)

ppn said:


> Avoid the 3080 no matter what; it uses the old 8nm process at ~44MTr/mm² density, instead of the 6nm ~66MTr/mm² that should be the norm. Even AMD doesn't use a real 7nm; it's ~41MTr/mm². I have a suspicion 7nm+ isn't any better, except for some clock and power improvements... Wait for Hopper.


God... I waited 2 years for a replacement for my 1070, then another 2 years for this to come out. Now you expect me to wait another 2 years for Hopper?
Hopper will probably be good, sure, but it's at least 2 years away from now. 
Samsung 8nm is still better than the TSMC 12nm that Turing and Pascal were on, and that shows in how good Ampere is shaping up to be. I'll happily buy the 3080 on Thursday, and then I might upgrade when Hopper arrives if it's so damn good.


----------



## Vayra86 (Sep 14, 2020)

damric said:


> Quite underwhelming considering the price.
> 
> AMD was able to keep refreshing the same old GCN GPUs for years because Kepler portfolio was so bad lol.



Kepler's first gen was indeed neck and neck between AMD and Nvidia.

The Kepler refresh started showing the cracks in GCN and forced AMD to deploy a 512-bit bus on those 'great' R9 290/290Xes. They also managed to get probably 2 AIB products out with a card that didn't sound like an airplane while boiling you in your room. Realistically, the R9 290X was AMD's/GCN's last swan song, and after that they simply had nothing. Fury X was a dead end, plagued with issues, not the least of them being margin and supply, a problem they handily repeated with Vega, while the latter had no competitive edge whatsoever. HBM was very visibly the only escape AMD saw at the time to get 'moar cores' on their slow-as-molasses memory subsystem.

Meanwhile, Nvidia explored and deployed its first delta compression tech in Maxwell and could make do with half or less of the bus width with el cheapo GDDR5. Alongside GPU Boost 3.0, this obliterated AMD's entire lineup, which by then consisted of a failed Tonga attempt and 7970 rebrands, and soon after they declared a 'focus on the midrange'. For all that focus, they also 'focused on consoles'... in practice, they just ran a skeleton crew on the whole division.

We know how that went. Their focus on the midrange was a repeatedly rebranded Polaris chip, a failed Vega chip, and the most buggy GPU ever released called Navi 10.

Time for some redactions in your history notebook I'd say. I'm sure Navi 20 will do fantastic, after all, track record seems good. 



lexluthermiester said:


> "lex being lex" you mean being objective and realistic



Mhm. In the eye of the beholder. You've stacked several 'facts' here that are proven untrue and clear for all to see. But keep at it, it's entertaining.


----------



## neatfeatguy (Sep 14, 2020)

I was hoping for some interesting reading today... guess it can wait.

What I don't understand is this:
_"...the delay was in response to certain reviewers requesting more time from NVIDIA as COVID-19 impacted their sampling logistics. "_

It's horrifically vague and doesn't make sense. 
Did Nvidia fail to get samples out in a timely fashion? I've shipped things via USPS and UPS over the last six months and they've been arriving on time at their destinations across the US... barring any kind of natural weather issues.


----------



## P4-630 (Sep 14, 2020)

neatfeatguy said:


> I've shipped things about via USPS and UPS over the last six months and things have been arriving on time to their destinations across the US



There are more countries in the world than just the US...


----------



## neatfeatguy (Sep 14, 2020)

P4-630 said:


> There are more countries in the world than just the US...


True, but we have equipment/parts for the machinery we use shipped in from Canada, Italy, Germany and Mexico, from small parts to things that need to go on container ships, and we haven't had any unforeseen downtime from parts due to late/delayed shipping... maybe we're just lucky.


----------



## ppn (Sep 14, 2020)

8nm does the job of delivering a new product, but it looks like we have a GTX 780 Ti/980 Ti scenario: both of those 600mm² dies were replaced within a 10-12 month window by a smaller, faster card for half the price. So anywhere between 12-18 months from now, 8nm is obsolete. Why on earth would you pay $750 for that? It will be worth less than $250 resale so soon you wouldn't even get to play 3 games on it.


----------



## lexluthermiester (Sep 14, 2020)

Vayra86 said:


> You've stacked several 'facts' here that are proven untrue and clear for all to see. But keep at it, it's entertaining.


Yup, that's what happened... The word "concrete" is coming to mind... But yes, VERY entertaining indeed!


ppn said:


> 8nm does the job of delivering a new product, but it looks like we have a GTX 780 Ti/980 Ti scenario: both of those 600mm² dies were replaced within a 10-12 month window by a smaller, faster card for half the price. So anywhere between 12-18 months from now, 8nm is obsolete.


That's not going to happen.


----------



## Flow (Sep 14, 2020)

ppn said:


> 8nm does the job of delivering a new product, but it looks like we have a GTX 780 Ti/980 Ti scenario: both of those 600mm² dies were replaced within a 10-12 month window by a smaller, faster card for half the price. So anywhere between 12-18 months from now, 8nm is obsolete. Why on earth would you pay $750 for that? It will be worth less than $250 resale so soon you wouldn't even get to play 3 games on it.


You pay for the performance it delivers. And thus far the rumours are it's faster than a 2080 Ti, at a much lower price.
So sure, by this time next year they'll have even better offerings, even more bang for your buck.
As for now, this product seems to deliver what it says it will; upcoming reviews will of course prove this right or wrong, but I suspect it will hold true. 
I couldn't care less how big or small the die is, as long as it performs.


----------



## ratirt (Sep 15, 2020)

Flow said:


> You pay for the performance it delivers. And thus far the rumours are it's faster than a 2080 Ti, at a much lower price.
> So sure, by this time next year they'll have even better offerings, even more bang for your buck.
> As for now, this product seems to deliver what it says it will; upcoming reviews will of course prove this right or wrong, but I suspect it will hold true.
> I couldn't care less how big or small the die is, as long as it performs.


Sure, you pay for performance, but just because the 3070 is faster than a 2080 Ti doesn't mean you should pay the 2080 Ti's price for what is considered a mid-range card like the 3070. Performance needs to move forward, offering better value at each card tier. The price for the 3070 is not bad, but it could have been lower, considering Turing and NV's new release aren't that far apart architecturally. Also keep in mind that Turing cards were freakishly overpriced; I remember all the fuss after Turing's release and the dislike of the pricing just because it brought RT (and the RT performance wasn't that good unless you had at least a 2080 or 2080 Ti).


----------



## EarthDog (Sep 15, 2020)

I think some people are getting confused by tiers and naming, honestly. If you take away their stripes and look at performance you see...

3070 ~= 2080 Ti for less than half the cost ($500 vs $1,200). That is $500 for roughly previous-gen flagship performance. In the previous generation, the GTX 1070 was a few percent faster than a 980 Ti for $200 (~31%) less ($450 vs $650).

Feels to me like pricing is coming back down to earth. Sure, it's still higher than it used to be, but I don't feel bad paying $500 for a 4K/60 card... or a 1440p/144Hz one.
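A quick sketch of that gen-on-gen math, using the launch MSRPs quoted above:

```python
# Price drop for "previous-gen flagship performance", per generation.
# MSRPs are the US launch prices quoted in the post above.
def discount(old_price, new_price):
    """Percentage saved buying the new card over the old flagship."""
    return (old_price - new_price) / old_price * 100

pascal = discount(650, 450)   # GTX 980 Ti -> GTX 1070
ampere = discount(1200, 500)  # RTX 2080 Ti -> RTX 3070

print(f"1070 vs 980 Ti:  {pascal:.0f}% cheaper")   # ~31%
print(f"3070 vs 2080 Ti: {ampere:.0f}% cheaper")   # ~58%
```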


----------



## P4-630 (Sep 15, 2020)

21 Hours to go!....


----------



## lexluthermiester (Sep 15, 2020)

P4-630 said:


> 21 Hours to go!....


LOL! It would seem you're excited, yes?


----------



## puma99dk| (Sep 15, 2020)

lexluthermiester said:


> LOL! It would seem you're excited, yes?



Who ain't?


----------

