# NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores



## btarunr (Sep 1, 2020)

NVIDIA just announced its new-generation GeForce "Ampere" graphics card series. The company is taking a top-to-bottom approach with this generation, much like with "Turing," launching its two top-end products first: the GeForce RTX 3090 24 GB and the GeForce RTX 3080 10 GB graphics cards. Both cards are based on the 8 nm "GA102" silicon. Join us as we live-blog the pre-recorded stream by NVIDIA, hosted by CEO Jen-Hsun Huang.

*Update 16:04 UTC*: Fortnite gets RTX support. NVIDIA demoed an upcoming update to Fortnite that adds DLSS 2.0, ambient occlusion, and ray-traced shadows and reflections. Coming soon.

*Update 16:06 UTC*: NVIDIA Reflex technology works to reduce e-sports game latency. Without elaborating, NVIDIA spoke of a feature that works to reduce input and display latencies "by up to 50%". The first supported games will be Valorant, Apex Legends, Call of Duty: Warzone, Destiny 2, and Fortnite—in September.

*Update 16:07 UTC*: Announcing NVIDIA G-SYNC eSports Displays—a 360 Hz IPS dual-driver panel launching through various monitor partners this fall. The display has a built-in NVIDIA Reflex precision latency analyzer.

*Update 16:07 UTC*: NVIDIA Broadcast is a brand-new app, available in September, that offers a turnkey solution for enhancing video and audio streams using the AI capabilities of GeForce RTX. It makes it easy to filter and improve your video and add AI-based backgrounds (static or animated), and it builds on RTX Voice to filter background noise out of audio.

*Update 16:10 UTC*: Ansel evolves into Omniverse Machinima, an asset exchange that helps independent content creators use game assets to create movies. Think fan-fiction Star Trek episodes using Star Trek Online assets. Beta in October.

*Update 16:15 UTC*: Updates to the AI tensor cores and RT cores. In addition to greater numbers of RT and tensor cores, the 2nd-generation RT cores and 3rd-generation tensor cores offer higher IPC. Making ray tracing have as little performance impact as possible appears to be an engineering goal with Ampere.

*Update 16:18 UTC*: Ampere 2nd-gen RTX technology. Traditional shader throughput is up 2.7x, ray-tracing units are 1.7x faster, and the tensor cores bring a 2.7x speedup.

*Update 16:19 UTC*: Here it is! Samsung 8 nm and Micron GDDR6X memory. The Samsung 8 nm announcement came out of nowhere, as we were widely expecting TSMC 7 nm. Apparently NVIDIA will use Samsung for its Ampere client-graphics silicon, and TSMC for the lower-volume A100 professional-grade scalar processors.

*Update 16:20 UTC*: Ampere has almost twice the performance per Watt compared to Turing!

*Update 16:21 UTC*: The Marbles 2nd-gen demo is jaw-dropping! NVIDIA demonstrated it at 1440p 30 fps, or 4x the workload of the first-gen Marbles demo (720p 30 fps).

*Update 16:23 UTC*: Cyberpunk 2077 figures big in the next generation. NVIDIA is banking extensively on the game to highlight the advantages of Ampere. The 200 GB game could absorb gamers for weeks or months on end.

*Update 16:24 UTC*: The new RTX IO technology accelerates the storage sub-system for gaming. It works in tandem with the new Microsoft DirectStorage technology (the Windows API version of the Xbox Velocity Architecture), which can pull resources from disk directly into the GPU. It requires game engines to support the technology. The tech promises a 100x throughput increase and significant reductions in CPU utilization. It's timely, as PCIe Gen 4 SSDs are on the anvil.

*Update 16:26 UTC*: Here it is, the GeForce RTX 3080, 10 GB GDDR6X, running at 19 Gbps, 238 tensor TFLOPs, 58 RT TFLOPs, 18 power phases.

*Update 16:29 UTC*: Airflow design. 90 W more cooling performance than the Turing FE cooler.

*Update 16:30 UTC*: Performance leap, $700. 2x as fast as RTX 2080, available September 17. Up to 2x faster than the original RTX 2070.

*Update 17:05 UTC*: GDDR6X was purpose-developed by NVIDIA and Micron Technology, which could be an exclusive vendor of these chips to NVIDIA. The chips use the new PAM4 encoding scheme to significantly increase data rates over GDDR6. On the RTX 3090, the chips tick at 19.5 Gbps (data rate), with memory bandwidth approaching 940 GB/s.
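Those per-pin data rates translate to bandwidth in the usual way: data rate times bus width, divided by eight bits per byte. A minimal sketch, assuming the widely reported bus widths (384-bit for the RTX 3090, 320-bit for the RTX 3080):

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth: per-pin rate (Gbps) x bus width (bits) / 8 bits-per-byte."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 3090: 19.5 Gbps on a 384-bit bus -> 936 GB/s, i.e. "approaching 940 GB/s".
print(peak_bandwidth_gb_s(19.5, 384))  # 936.0
# RTX 3080: 19 Gbps on a 320-bit bus.
print(peak_bandwidth_gb_s(19.0, 320))  # 760.0
```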

*Update 16:31 UTC*: RTX 3070: $500, faster than the RTX 2080 Ti and 60% faster than the RTX 2070, available in October. 20 shader TFLOPs, 40 RT TFLOPs, 163 tensor TFLOPs, 8 GB GDDR6.

*Update 16:33 UTC*: Call of Duty: Black Ops Cold War is RTX-on.

*Update 16:35 UTC*: The RTX 3090 is the new TITAN. Twice as fast as the RTX 2080 Ti, with 24 GB GDDR6X. The giant Ampere, a BFGPU: $1,500, available from September 24. It is designed to power 60 fps at 8K resolution, and is up to 50% faster than the Titan RTX.

*Update 16:43 UTC*: Wow, I want one. On paper, the RTX 3090 is the kind of card I want to upgrade my monitor for. Not sure if a GPU ever had that impact.
*Update 16:59 UTC*: Insane CUDA core counts, 2-3x increase generation-over-generation. You won't believe these.

*Update 17:01 UTC*: The GeForce RTX 3090 in detail. Over ten thousand CUDA cores!

*Update 17:02 UTC*: GeForce RTX 3080 details. More insane specs.

*Update 17:03 UTC*: The GeForce RTX 3070 has more CUDA cores than a TITAN RTX. And it's $500. Really wish these cards had come out in March; 2020 would've been a lot better.

Here's a list of the top 10 Ampere features.

*Update 19:22 UTC*: For a limited time, gamers who purchase a new GeForce RTX 30 Series GPU or system will receive a PC digital download of Watch Dogs: Legion and a one-year subscription to the NVIDIA GeForce NOW cloud gaming service.

*Update 19:47 UTC*: All Ampere cards support HDMI 2.1. The increased bandwidth provided by HDMI 2.1 allows, for the first time, a single-cable connection to 8K HDR TVs for ultra-high-resolution gaming. AV1 video decode is also supported.

*Update 20:06 UTC*: Added the complete NVIDIA presentation slide deck at the end of this post.

*Update Sep 2nd*: We received the following info from NVIDIA regarding international pricing:

- UK: RTX 3070: GBP 469, RTX 3080: GBP 649, RTX 3090: GBP 1,399
- Europe: RTX 3070: EUR 499, RTX 3080: EUR 699, RTX 3090: EUR 1,499 (this might vary a bit depending on local VAT)
- Australia: RTX 3070: AUD 809, RTX 3080: AUD 1,139, RTX 3090: AUD 2,429


*View at TechPowerUp Main Site*


----------



## jesdals (Sep 1, 2020)

That's a lot of sweet talk without substance on the key issue for the new RTX 3000 cards - how fast is RDNA2 actually going to be? Samsung 8 nm confirmed. Love that they compared the 3080 with the 1080 series instead of the RTX 2080


----------



## dir_d (Sep 1, 2020)

"A Triple Double" Best quote


----------



## Houd.ini (Sep 1, 2020)

So it is made on a 7nm 8nm process? Interesting!


----------



## xkm1948 (Sep 1, 2020)

$699 3080???????????????????

OH MY GOD


----------



## Chomiq (Sep 1, 2020)

They just wiped the floor with that pricing. 3080 for $699, 3070 for $499.

Really surprised they went so low.

3090 $1499.


----------



## chaosmassive (Sep 1, 2020)

It is safe to upgrade now!


----------



## jesdals (Sep 1, 2020)

But does the RTX 3090 support SLI?


----------



## chaosmassive (Sep 1, 2020)

Chomiq said:


> They just wiped the floor with that pricing. 3080 for $699, 3070 for $499.
> 
> Really surprised they went so low.
> 
> 3090 $1499.



My initial reaction when the pricing was revealed: AMD is so done this time, unless they're able to pull some magic out of their hat.
Especially the 3070; it will be really hard to beat.


----------



## kayjay010101 (Sep 1, 2020)

Oh I am so happy


----------



## Makaveli (Sep 1, 2020)

Love the irony of him pulling the 3090 out of the oven.


----------



## mouacyk (Sep 1, 2020)

Yes to the future, captain Picard! No to today's prices.


----------



## jesdals (Sep 1, 2020)

I missed the RTX 3090 price


----------



## Rowsol (Sep 1, 2020)

The 3070 matching or exceeding the 2080ti at less than half the price! That's a win in my book.


----------



## caleb (Sep 1, 2020)

Rowsol said:


> The 3070 matching or exceeding the 2080ti at less than half the price! That's a win in my book.



Yeah, I feel pretty good that I sat that generation out on a 1060


----------



## kayjay010101 (Sep 1, 2020)

Typo, 3090 mentions availability Sep 24th, not Sep 17th


----------



## mouacyk (Sep 1, 2020)

Your turn AMD.  Show us the goods.

$700 - $1500 leaves a big gap for a 11GB 3080Ti to slot in again at 352-bit.  Gonna guess $1100-$1200!


----------



## Legacy-ZA (Sep 1, 2020)

jesdals said:


> I missed the RTX 3090 price



$1499 ($1500)

I was hoping to find out more about the RTX3060.


----------



## Makaveli (Sep 1, 2020)

3090 is $1500, 3080 is $700, and 3070 is $500 I believe

Which leaves room for a 3090 Ti at $2000 lol


----------



## elemelek (Sep 1, 2020)

Anyone knows when we can expect reviews?


----------



## JalleR (Sep 1, 2020)

Well, that looks promising. My 1080 Ti needs an upgrade NOW


----------



## AddSub (Sep 1, 2020)

Is there SLI support at all?


----------



## iO (Sep 1, 2020)

Woah, that's going to be hard to beat.


----------



## VallThore (Sep 1, 2020)

I wonder what "relative performance" means this time. I have a gut feeling that this incredible speed-up (roughly ~1.7x compared to the 2080S, looking at the graph) is all about ray tracing and not so much about rasterization, but I would love to be wrong here.


----------



## xkm1948 (Sep 1, 2020)

Where is the pre-order page?


----------



## birdie (Sep 1, 2020)

AddSub said:


> Is there SLI support at all?



Highly unlikely.


----------



## Hyderz (Sep 1, 2020)

there might be an rtx 3080ti coming, nvidia is possibly holding off to see what RDNA2 brings


----------



## chodaboy19 (Sep 1, 2020)

There is not much to nitpick here, I think this generation is a lock for Nvidia. They really crushed it.


----------



## wheresmycar (Sep 1, 2020)

3070 for $499

3080 for $699 

:O

Not sure who I should give thanks to... NVIDIA for the effort or AMD for the compo. Either way, I couldn't have asked for anything better (me gettin a 3080 yippiieeee)


----------



## Space Lynx (Sep 1, 2020)

WTF JUST HAPPENED!!!!!!   A 2080 TI FOR $500  OMFG LMAO THIS IS THE GREATEST DAY EVER... HOLY CRAP....


this is unbelievable... Nvidia just came out swinging.  I don't see Big Navi matching this I really don't...


----------



## dir_d (Sep 1, 2020)

He said the 3080 is a 4K 60 fps card and way better than the 2080 Ti. That made me pause a little, because I thought the 2080 Ti was a 4K 60 fps card. Am I wrong?


----------



## Fleurious (Sep 1, 2020)

How does the non-RTX performance compare to 2080ti?


----------



## Raendor (Sep 1, 2020)

Oh sweet baby Jesus, the 3070 is sweet for its price/performance. Not even bothered by the 8 GB of VRAM in this case, as it's totally a 1440p card where this buffer will be just fine. Wonder, though, if they will push a Super out with 2-4 more gigs. Doesn't matter, I need my 3070 in time for Cyberpunk 2077. Now to decide whether I upgrade to B550 or Z490, which is a tough choice at the moment.



dir_d said:


> He said the 3080 is a 4k 60fps card and way better than 2080Ti. That made me question a little because i thought 2080Ti was a 4k 60fps card am i wrong?


It's now demoted to 1080p 30 fps console-peasantry tier... just joking


----------



## ddarko (Sep 1, 2020)

xkm1948 said:


> Where is the pre-order page?



3090 & 3090 Ti Graphics Cards (www.nvidia.com): "TITAN class performance for the ultimate in gaming." Available September 24.

NVIDIA GeForce RTX 3080 Family (www.nvidia.com): "The Ultra High Performance that Gamers Crave, powered by NVIDIA Ampere Architecture." Available September 17.

NVIDIA GeForce RTX 3070 Family (www.nvidia.com): "The AI Revolution has Come to Gaming." Available October.


----------



## Max(IT) (Sep 1, 2020)

Good. The 3070 could be a good upgrade for my 2070 Super


----------



## Chris34 (Sep 1, 2020)

Seems Nvidia took notice of last year's AMD RX5700 price joke. Those are well priced GPU. Surprising coming from Nvidia.


----------



## Space Lynx (Sep 1, 2020)

Max(IT) said:


> Good. The 3070 could be a good upgrade for my 2070 Super



Problem is, re-sell value is going to be terrible this round... I feel bad for someone who just bought a 2080 Ti 31 days ago... ouch. lol


----------



## robiatti (Sep 1, 2020)

Wonder if the 3090 / 3080 will even need a water-block. the cooler design looks fairly solid.


----------



## mouacyk (Sep 1, 2020)

robiatti said:


> Wonder if the 3090 / 3080 will even need a water-block. the cooler design looks fairly solid.


Any sane air cooling enthusiast can see that NVidia left a dumpster fire for Intel and AMD to deal with, by dumping the GPU VRM heat right into the CPU.  Don't forget the poor motherboard makers too, with their VRMs overheating now. And possibly RAM.  With the 3090, you can't have passively cooled RAM or mobo VRM anymore.


----------



## AddSub (Sep 1, 2020)

birdie said:


> Highly unlikely.



So gotta get the right card right away. Right now I can force SLI profiles on quite a few games and it works with pretty decent scaling (have two GTX 1070s). Getting a 3070 and then another a year down the road would have been nice....

...
..
.


----------



## xkm1948 (Sep 1, 2020)

ddarko said:


> 3090 & 3090 Ti Graphics Cards
> 
> 
> TITAN class performance for the ultimate in gaming.
> ...




There should be pre-order going up before that.


----------



## MxPhenom 216 (Sep 1, 2020)

jesdals said:


> That's a lot of sweet talk without substance on the key issue for the new RTX 3000 cards - how fast is RDNA2 actually going to be? Samsung 8 nm confirmed. Love that they compared the 3080 with the 1080 series instead of the RTX 2080



They compared it to both. Did you even watch the feed?

Jensen tried to make it especially clear that people with Pascal cards should upgrade now if they skipped Turing.


----------



## xkm1948 (Sep 1, 2020)

Yeah look at that CUDA count. 

GG Navi2


----------



## chstamos (Sep 1, 2020)

Welp! I didn't catch the live stream. Is the RTX 3070 supposed to be twice as fast as the 2080 Ti without ray tracing too, or only with RT on?

If it's twice the rasterization performance, this is actually an amazing new lineup! I was set for disappointment; I thought we were going to have another Turing debut on our hands, which I had found very underwhelming. This is actually great!


----------



## Raendor (Sep 1, 2020)

I’m a bit disappointed 3070 is not with the new cool dissipation design. Could fit nice in my Conswole. Guess will put in Ncase M1 in classic layout.


----------



## Space Lynx (Sep 1, 2020)

how do you all like my new sig?  never thought i'd see the day... LMAO


----------



## Chomiq (Sep 1, 2020)

lynx29 said:


> how do you all like my new sig?  never thought i'd see the day... LMAO


How mature of you.


----------



## Raendor (Sep 1, 2020)

lynx29 said:


> how do you all like my new sig?  never thought i'd see the day... LMAO


Lol. Yeah, always dumb to buy halo products for ridiculous prices. Going 1-2 steps down and doing the same on the next cycle is the wisest strategy.


----------



## MxPhenom 216 (Sep 1, 2020)

mouacyk said:


> Any sane air cooling enthusiast can see that NVidia left a dumpster fire for Intel and AMD to deal with, by dumping the GPU VRM heat right into the CPU.



That doesn't exactly matter if the CPU is water-cooled. At least with this design, only about half the heat of the card is theoretically dumped back into the case, whereas with custom designs from board partners, all of it is. And VRMs usually don't run hotter than the GPU itself anyway.


----------



## kayjay010101 (Sep 1, 2020)

lynx29 said:


> how do you all like my new sig?  never thought i'd see the day... LMAO


Haha I feel so bad I sold my 2080 Ti to this dude a month ago for $1300 and he thought he got such a good deal because I included a $150 waterblock and the card was $1350 new.. Now it's worth barely $500 lmao


----------



## Dimi (Sep 1, 2020)

RTX 3080 it is for me. Should be a great upgrade from my 1070.


----------



## Arkz (Sep 1, 2020)

Chomiq said:


> They just wiped the floor with that pricing. 3080 for $699, 3070 for $499.
> 
> Really surprised they went so low.
> 
> 3090 $1499.



Yeah but lets see what shops actually sell em at.


----------



## Raendor (Sep 1, 2020)

kayjay010101 said:


> Haha I feel so bad I sold my 2080 Ti to this dude a month ago for $1300 and he thought he got such a good deal because I included a $150 waterblock and the card was $1350 new.. Now it's worth barely $500 lmao


I’d still buy 2080ti for 300


----------



## mrthanhnguyen (Sep 1, 2020)

Any info on a 3080 Ti coming? The 3090 is the Titan replacement, not the 2080 Ti's. I don't want to buy a 3080 and then have them release a 3080 Ti a month after.


----------



## mouacyk (Sep 1, 2020)

mrthanhnguyen said:


> Any info 3080ti coming? 3090 is the Titan replacement not 2080ti. I dont want to buy 3080 then they release 3080ti a month after that.


They always do, up to 6 months down the line. Maybe they'll call it the RTX 3085 this time.


----------



## Fluffmeister (Sep 1, 2020)

Very impressive indeed, and all those cores! Core blimey!

Damn the 3070 is tempting at that price, but let's be honest, I'd frickin love a 3090.


----------



## Raendor (Sep 1, 2020)

Dimi said:


> RTX 3080 it is for me. Should be a great upgrade from my 1070.


I think they'll do the same thing as with Pascal and release a 3080 Ti with something like 14 GB of RAM. I bought a 1080 at launch in 2016 and since then decided never to go for the x80 card, even though it was good. Makes more sense to get the x70 and upgrade to the x80 Ti for the price of the x80.


----------



## framebuffer (Sep 1, 2020)

mouacyk said:


> Any sane air cooling enthusiast can see that NVidia left a dumpster fire for Intel and AMD to deal with, by dumping the GPU VRM heat right into the CPU.



This is the first thing I thought,
but maybe they think that everybody is into watercooling nowadays...


----------



## nguyen (Sep 1, 2020)

A 2080 Ti at a 2000 MHz core clock is only about 17.5 TFLOPs of shader performance, and the 3080 is 30 TFLOPs at 1700 MHz?
But yeah, the 3080 Ti, if there is one, will be the one to get: $1000-1200 with 11 GB of VRAM, and it will almost match the 3090 (which is basically the Titan).
I bet NVIDIA was able to get a sweet deal on Samsung 8N to get the prices so low.


----------



## Raendor (Sep 1, 2020)

voodooFX said:


> this is the first thing I thought
> but maybe the think that everybody is into watercooling nowadays...



that design would work perfect in some itx cases actually


----------



## xkm1948 (Sep 1, 2020)

3090 for me. Guess I will just retire the 2080Ti to my work PC.


----------



## mouacyk (Sep 1, 2020)

Raendor said:


> I think they'll do the same thing as with Pascal and release a 3080 Ti with something like 14 GB of RAM. I bought a 1080 at launch in 2016 and since then decided never to go for the x80 card, even though it was good. Makes more sense to get the x70 and upgrade to the x80 Ti for the price of the x80.


It doesn't work that way. Memory buses come in 32-bit increments, and chips are available in 1 GB or 2 GB sizes only. The next step up without stepping on the 3090 is 352-bit, which is 11 GB. Putting 22 GB on it is also way too close to the 3090.
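That capacity arithmetic, assuming one memory chip per 32-bit channel and today's 1 GB / 2 GB GDDR6X densities, works out as:

```python
def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    """VRAM capacity: one chip per 32-bit channel, times capacity per chip."""
    assert bus_width_bits % 32 == 0, "GDDR bus widths come in 32-bit increments"
    return (bus_width_bits // 32) * chip_gb

print(vram_gb(320, 1))  # 10 -> RTX 3080
print(vram_gb(352, 1))  # 11 -> a hypothetical 352-bit 3080 Ti
print(vram_gb(384, 2))  # 24 -> RTX 3090
```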


----------



## R0H1T (Sep 1, 2020)

Chomiq said:


> *They just wiped the floor with that pricing. *3080 for $699, 3070 for $499.
> 
> Really surprised they went so low.
> 
> 3090 $1499.


How? Do you know something about *RDNA2 performance* that we don't? Remember the "Ampere is 50% faster than Turing while sipping 50% less power" claims? How'd that turn out?

*1.9x perf/W*, last I checked. NVIDIA definitely knows more about RDNA2 perf than you or me ~ so just like during the Zen launch, you can be sure that, much like Intel, *NVIDIA is taking AMD seriously because they have to*!


----------



## Nater (Sep 1, 2020)

He called the 3090 the "new" Titan...but no mention if it'll have Pro CAD support...

Was hoping to avoid getting a gaming card AND still wanting a Quadro card for SolidWorks.


----------



## Pumper (Sep 1, 2020)

Finally, some price cuts. Since there won't be this bullshit with "standard" and "Founders Edition" at different prices, both the 3070 and 3080 are launching $100 lower than the 2000 series.

3090 is the new Titan, so that's a $1000 price cut.

Now to wait for proper benchmarks and then RDNA2 just to see if AMD will be able to force nvidia to reduce their prices even more.

The disappointing thing is that 3070 should have been at least a 10GB card and the 3080 12GB.


----------



## boomheadshot8 (Sep 1, 2020)

Just wait for release and they will be tested. I think the 3060 is also a good deal if it's around $350


----------



## midnightoil (Sep 1, 2020)

Chomiq said:


> They just wiped the floor with that pricing. 3080 for $699, 3070 for $499.
> 
> Really surprised they went so low.
> 
> 3090 $1499.



They're expecting RDNA2 to be faster and more efficient than Ampere  ... whilst hoping the huge, brute force 3090 will retain them the absolute crown.   Given how inefficient this is looking as an architecture, and that they're on a slightly shrunk 10nm process (that's what Samsung 8nm is), that will be touch and go too ... depends how big the biggest Navi 2x is.

In light of that, they can't maintain their current pricing model.


----------



## Franzen4Real (Sep 1, 2020)

xkm1948 said:


> 3090 for me. Guess I will just retire the 2080Ti to my work PC.


same, 2080ti going out into the living room for big screen and VR gaming.


----------



## Calmmo (Sep 1, 2020)

I'm onboard the 3080 train, but not feeling confident on a 10gb card being enough for the next 2 years.


----------



## chris.london (Sep 1, 2020)

The RTX 3090 is only 20% faster than the 3080 at twice the price. The 3080 will likely clock higher too. I think this settles it for me.


----------



## Space Lynx (Sep 1, 2020)

Chomiq said:


> How mature of you.



awww .  well i wasn't the only one that thought it was in good fun, sorry, you want a hug?


----------



## kapone32 (Sep 1, 2020)

boomheadshot8 said:


> Just wait for release and they will be tested. I think the 3060 is also a good deal if it's around $350



It all remains to be seen.


----------



## Chomiq (Sep 1, 2020)

midnightoil said:


> They're expecting RDNA2 to be faster and more efficient than Ampere  ...


Based on what? A single RT demo or next gen console games running at 4K30 with checkerboard rendering?


----------



## Legacy-ZA (Sep 1, 2020)

boomheadshot8 said:


> Just wait for release and they will be tested. I think the 3060 is also a good deal if it's around $350



I am hoping for around $300 for the RTX3060. Perhaps we might even see a $175 price tag down the line for the RTX3050. Leaving a lot of room for the Ti's as AMD launches their new hardware.


----------



## mouacyk (Sep 1, 2020)

3080 did step up an SKU tier though, from 256-bit (8GB) to 320-bit (10GB).  That's a change backwards toward the Fermi refresh tiering.  That has to be a sign of something to expect from RDNA2.


----------



## Vya Domus (Sep 1, 2020)

I am a bit baffled by that "two floating point operations per clock" thing. Whenever NVIDIA/AMD/Intel/etc. talk about TFLOPS, what they really mean is fused multiply-add, and that is already "two floating point operations per clock". So... what's this then?

You'd need either ~8000 shaders, or rather FP units (kind of impossible), or ~4 GHz (also obviously impossible) in order to get 30 TFLOPS. Something's off.
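The back-of-envelope above can be checked directly. The usual marketing formula is FP32 units x boost clock x 2 (one fused multiply-add counts as two FLOPs). A sketch, assuming the spec-sheet figures circulating after the keynote (8704 FP32 units and a ~1.71 GHz boost for the RTX 3080; 4352 units for the 2080 Ti): the 30 TFLOPs claim only works if the FP32 unit count, not the clock, roughly doubled.

```python
def fp32_tflops(fp32_units: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPS: units x clock (GHz) x 2 ops per FMA, scaled to tera."""
    return fp32_units * boost_ghz * 2 / 1000

print(round(fp32_tflops(4352, 2.00), 1))  # 17.4 -> a 2080 Ti overclocked to 2 GHz
print(round(fp32_tflops(8704, 1.71), 1))  # 29.8 -> near the quoted 30 TFLOPs
```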


----------



## Space Lynx (Sep 1, 2020)

kayjay010101 said:


> Haha I feel so bad I sold my 2080 Ti to this dude a month ago for $1300 and he thought he got such a good deal because I included a $150 waterblock and the card was $1350 new.. Now it's worth barely $500 lmao



that does hurt oof


----------



## renz496 (Sep 1, 2020)

nguyen said:


> A 2080 Ti at 2000mhz core clock is only about 17.5TFlops of shaders performance and the 3080 is 30TFlops at 1700mhz ?
> But yeah the 3080 Ti if there is one will be the one to get, 1000-1200usd with 11GBs of VRAM that will almost match the 3090 (which basically is the Titan).
> I bet Nvidia was able to get a sweet deal with Samsung 8N to get the prices so low.



It has something to do with the new restructured FP32. I heard some rumors before that the Samsung process probably can't push clocks higher, so NVIDIA is doing something else to increase performance: they said NVIDIA would double the amount of FP32 units. And yeah, maybe NVIDIA really did get a sweet deal; Samsung really needs people to manufacture at their fabs. I heard Qualcomm is turning back to TSMC due to Samsung 5 nm yield issues.


----------



## birdie (Sep 1, 2020)

midnightoil said:


> They're expecting RDNA2 to be faster and more efficient than Ampere  ... whilst hoping the huge, brute force 3090 will retain them the absolute crown.   Given how inefficient this is looking as an architecture, and that they're on a slightly shrunk 10nm process (that's what Samsung 8nm is), that will be touch and go too ... depends how big the biggest Navi 2x is.
> 
> In light of that, they can't maintain their current pricing model.



God, there's so much wild unwarranted speculation/AMD fanaticism in this comment it's actually cringe-worthy.


----------



## Space Lynx (Sep 1, 2020)

birdie said:


> God, there's so much wild unwarranted speculation/AMD fanaticism in this comment it's actually cringe-worthy.



I am an AMD fanboy, and I don't see AMD coming close to this, I think they scared Nvidia a little though for sure, which is great for competition.  I am buying 4800x and 3080 100%


----------



## Dante Uchiha (Sep 1, 2020)

Huge chips, an expensive design, tons of state-of-the-art memory. So is this where NVIDIA's margins sink along with the stock price? As a gamer, I think it's amazing, btw.

Even if it doesn't win the crown of absolute best performance, RDNA2 will probably be more efficient.


----------



## moproblems99 (Sep 1, 2020)

jesdals said:


> But does RTX 3090 support SLI



Who cares.  Sli sucks


----------



## Max(IT) (Sep 1, 2020)

lynx29 said:


> prob is re-sell value is going to be terrible this round... i feel bad for someone who just bought a 2080 ti 31 days ago... ouch. lol


I don’t really care about re-sell value TBH...
The 2070 super served me well for a while, so I’m happy.
I will replace it in a few months, after the first batch of reviews.


----------



## Philaphlous (Sep 1, 2020)

Oh the days of Fermi are long gone...


----------



## boomheadshot8 (Sep 1, 2020)

look at the nvidia stock  will see when AMD shows us their beast


----------



## Raendor (Sep 1, 2020)

AMD will release another Vega potato after months of hyping and not living up to any promises, like it was in the Pascal days. That's why there's still no news.


----------



## Vya Domus (Sep 1, 2020)

birdie said:


> God, there's so much wild unwarranted speculation/AMD fanaticism in this comment it's actually cringe-worthy.



There you go again. You just can't help yourself but flame and mention AMD on every single occasion, I don't think there is a single comment you write without the word "AMD" or "fanboy" in it.

You're the only cringe worthy thing on here.


----------



## mouacyk (Sep 1, 2020)

moproblems99 said:


> Who cares.  Sli sucks


I think the fact that they disabled SLI support altogether on the 3070 and 3080 is a little telling about the potential of Explicit Multi-GPU, now that Vulkan and DX12 have had time to mature. SLI was brute force; it did suck!


----------



## Chomiq (Sep 1, 2020)

jesdals said:


> But does RTX 3090 support SLI


It does.


----------



## xkm1948 (Sep 1, 2020)

lynx29 said:


> I am an AMD fanboy, and I don't see AMD coming close to this, I think they scared Nvidia a little though for sure, which is great for competition.  I am buying 4800x and 3080 100%




Nope. To me Nvidia is only competing with itself at this point. In other words, how to get the Pascal crowd to move on and upgrade? Jensen even mentioned that in his talk!


----------



## moproblems99 (Sep 1, 2020)

I might not wait to see what AMD has.  Depending on how much faster the 3080 over 2080ti is, I'll grab it now and enjoy games for 2 months instead of rotating on my thumb.


----------



## Prior (Sep 1, 2020)

They better release some damn good games to make an RTX 3080 worth buying. Still happy with 1080p nowadays.


----------



## john_ (Sep 1, 2020)

I am totally impressed. First Nvidia proved why it is called a software company that also creates hardware. 
Then I think it hit the target with prices.


----------



## birdie (Sep 1, 2020)

Dante Uchiha said:


> Huge chips, expensive design, tons of state-of-the-art memory. So now's where nvdia's margins will sink along with the stock price? As a gamer I think it's amazing btw.
> 
> Even if it doesn't win the crown of absolute best performance, RDNA2 will probably be more efficient.



NVIDIA is in for a long game, sir. Too many Pascal owners have refused to upgrade to RTX 2000 because the performance increase just isn't tempting enough. Check Steam HW stats - the GTX 1060 is miles more popular than all other cards combined. The RTX 3060, if it's priced correctly (let's say around $300), could become a smashing success and actually improve their bottom line hugely vs. having more expensive cards which won't sell well. I know there are mostly computer enthusiasts in the comments section of TPU, but trust me, the RTX 3060 will sell in volumes far higher than the 3070-3080-3080 Ti (?)-3090 _combined_.


----------



## mouacyk (Sep 1, 2020)

xkm1948 said:


> Nope. To me Nvidia is only competing with itself at this point. In other words, how to get the Pascal crowd to move on and upgrade? Jensen even mentioned that in his talk!


He looked at me through the screen and called me his "Pascal friend" and said "it's safe to upgrade".  Guy is salesman for sure.


----------



## Space Lynx (Sep 1, 2020)

birdie said:


> NVIDIA is in for a long game, sir. Too many Pascal owners have refused to upgrade to RTX 2000 because the performance increase just isn't tempting enough. Check Steam stats - the GTX 1060 is miles more popular than all other cards combined. RTX 3060, if it's priced correctly (let's say around $300) could become a smashing success and will actually improve hugely their bottom line vs. having more expensive cards which won't sell well. I know there are mostly computer enthusiasts in the comments section of TPU but trust me, RTX 3060 will sell in volumes which will be far higher than 3070-3080-3080 Ti (?)-3090 _combined_.



Not to mention NVIDIA's stock price; gaming revenue is a small share of it, last I checked. They moved into healthcare, cars, and AI, all of which generate way more money than gaming ever could. We just happen to get the benefits of that. That's why NVIDIA stock is so good and will stay good. Jensen is a smart guy.


----------



## Ferrum Master (Sep 1, 2020)

Jen-Hsun for sure likes kitchen shows....


----------



## dicktracy (Sep 1, 2020)

Cheap prices and the fastest performance available, especially in RT, which all next-gen games will use. NVIDIA incontestably wins again. When's Intel joining the war again?


----------



## Raendor (Sep 1, 2020)

mouacyk said:


> He looked at me through the screen and called me his "Pascal friend" and said "it's safe to upgrade".  Guy is salesman for sure.


I was almost dropping a tear when he was talking about the 1080 (which I run to this day) and said it's safe to get a 3070/80 now.


----------



## Batailleuse (Sep 1, 2020)

AddSub said:


> Is there SLI support at all?



Anything under DX12/Vulkan basically has multi-GPU support integrated; SLI in itself is DEAD, and no game released in the last 5 years officially supports it.


----------



## good11 (Sep 1, 2020)

What about the PCI-E version on them?

Can someone clarify: 3.0 or 4.0?

Thank you


----------



## Max(IT) (Sep 1, 2020)

Arkz said:


> Yeah but lets see what shops actually sell em at.


Well, after the initial rush with low availability and higher prices, the price should settle lower, not higher.
Third-party solutions with better coolers could cost a little more.


----------



## Berfs1 (Sep 1, 2020)

Was it just me or did the stream crash midway? Also, check this out: https://docs.google.com/spreadsheets/d/1-JXBPMRZtUx0q0BMMa8Nzm8YZiyPGkjeEZfZ2DOr8y8/edit?usp=sharing


----------



## chodaboy19 (Sep 1, 2020)

Ferrum Master said:


> Jen-Hsun for sure likes kitchen shows....



He's keeping everyone safe, nothing but praise there.


----------



## Space Lynx (Sep 1, 2020)

chodaboy19 said:


> He's keeping everyone safe, nothing but praise there.



i loved when he pulled the 3090 out of the oven. glorious.


----------



## midnightoil (Sep 1, 2020)

Chomiq said:


> Based on what? A single RT demo or next gen console games running at 4K30 with checkerboard rendering?



Based on all the performance data from both next-gen consoles, technical document releases, the fact that they clearly will have evaluated PS5 and Xbox dev kits, and that they likely had rough RDNA2 desktop performance leaked to them.

The pricing and gigantic, inefficient 3090 reflect this.  Why else would they do it?  You think they just rolled the dice and decided to slash their margins on volume sellers, and produce an ultra low yield furnace halo product for the LULs?


----------



## Fluffmeister (Sep 1, 2020)

Raendor said:


> I was almost dropping a tear when he was talking about 1080 (which I run to this day) and that it safe to get 3070/80 now



Hey, I'm still on a 980 Ti, but it does seem to be time to pull the trigger, 500 bucks and boom, 2080 Ti performance and all the shiny new tech, damn Nvidia ARE big meanies!

Just waiting on Vermeer too and it will finally be time to blow the cobwebs from my wallet.


----------



## Imouto (Sep 1, 2020)

RIP AMD.

This time around they won't be able to compete even in price. I see the higher ups at AMD hovering their mouses over emails asking their providers to stop the RDNA2 production.


----------



## PowerPC (Sep 1, 2020)

Love how much cognitive dissonance must be going on in people's heads right now. Just a day ago people were still saying this kind of performance / price was "literally impossible" on this very forum.


----------



## RH92 (Sep 1, 2020)

mouacyk said:


> He looked at me through the screen and called me his "Pascal friend" and said "it's safe to upgrade".  Guy is salesman for sure.



That's for sure! Time for my 1080 Ti to find a new home. I'm really tempted between the 3070 and 3080; if the 3070 can rasterize as well as the 2080 Ti I might go with it, and if not I will grab myself a 3080.


----------



## Vya Domus (Sep 1, 2020)

Imouto said:


> AMD hovering their mouses over emails asking their providers to stop the RDNA2 production.



Too late, a couple of million consoles are already commissioned to have RDNA2.

Do you even know what planet you are on?


----------



## TheoneandonlyMrK (Sep 1, 2020)

I'll wait on benches and reviews.
I feel my 2060 just became iGPU-grade though, that's for sure.


----------



## Foxiol (Sep 1, 2020)

As a happy owner of a three-month-old RTX 2080 Super with an LG CX6LB OLED 4K 120 Hz 55" TV as a monitor... yup, I knew it wasn't going to be enough for playing at actual 4K, but it does its job at 1440p. Depending on how those RTX 3080 reviews look at 4K resolution, I might get one without even thinking about it.

Also, I bought it for 789€ (Gigabyte Windforce OC 8G model) and now it would sell for what... 400€ at most? Nvidia killed the resale market for those that paid way more than I did, and now stores are sitting on massive amounts of 2080 Tis at around 1300-1600€. What are they going to do, since 3070 prices can't go much higher than the recommended price tag? I bet no one in their sane mind is going to buy a new 2080 Ti after this.

All in all I don't care that much. I'm glad Nvidia came to their senses for once and kept prices in the right place. The 3080 is well priced in my opinion, BUT the gap between it and the 3090 could mean they have something more to show.

Again, 4K reviews of that 3080 will seal the deal for me. HDMI 2.1 is also my main reason to upgrade, for having stable and functional 120 Hz and G-Sync at 4K even if I'm not reaching higher frame rates.


----------



## Raendor (Sep 1, 2020)

Fluffmeister said:


> Hey, I'm still on a 980 Ti, but it does seem to be time to pull the trigger, 500 bucks and boom, 2080 Ti performance and all the shiny new tech, damn Nvidia ARE big meanies!
> 
> Just waiting on Vermeer too and it will finally be time to blow the cobwebs from my wallet.



In the same boat. Although if it takes long, I'll just get something like a 10400F, which will be fine for 1440p gaming to cover me till AM5/LGA1700.


----------



## Amite (Sep 1, 2020)

I can feel the used GPU prices on eBay falling without even looking.


----------



## kings (Sep 1, 2020)

519€ for the RTX 3070 and 719€ for the RTX 3080 in Europe... not bad at all. Those are Founders Edition prices.

It looks like my 980Ti will finally get its well-deserved rest.


----------



## CrAsHnBuRnXp (Sep 1, 2020)

jesdals said:


> But does RTX 3090 support SLI


SLI is dead. Nor will you need it with a 3090.


----------



## ZoneDymo (Sep 1, 2020)

so uhh where can I watch this presentation back?


----------



## theGryphon (Sep 1, 2020)

Can't wait to see what NVIDIA will have for sub-75W LP segment


----------



## Imouto (Sep 1, 2020)

Vya Domus said:


> Too late, a couple of millions of consoles are already commissioned to have RDNA2.
> 
> Do you even know on what planet you are ?



That's one of the points, though. Consoles are obsolete even before launch.


----------



## kings (Sep 1, 2020)

ZoneDymo said:


> so uhh where can I watch this presentation back?


----------



## PowerPC (Sep 1, 2020)

CrAsHnBuRnXp said:


> SLI is dead. Nor will you need it with a 3090.


It was dead but they actually implemented a new version of SLI just for this card.


----------



## R0H1T (Sep 1, 2020)

PowerPC said:


> Love how much cognitive dissonance must be going on in people's heads right now. Just a day ago people were still saying this kind of performance / price was "literally impossible" on this very forum.


Two trains of thought, not necessarily contradictory ~

If Nvidia had been able to pull off the 2.5-3x perf/W efficiency, it's possible they may have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of _nothingburger_ called nCoV ravaging the entire world atm. Now depending on which side of the fence you are on, Nvidia's margins could be higher, though I'm 100% certain their overall sales would be (much?) lower!

Next is what we see right now: Nvidia cannot really get the perf/W efficiency leap some of the leaks suggested. That means Nvidia's cards will not be better in nearly all metrics vs AMD, unlike last gen. So pricing them at enthusiast grade is nearly impossible for them. Hence the current "attractive" pricing.

The only way Nvidia prices Ampere the way they have now is when *RDNA2 is really competing with them on perf/W & likely perf/$ as well*. Anyone remember Intel's decade of mainstream quad-core *BS* till Zen launched? This is likely the same game played over again.


----------



## BorisDG (Sep 1, 2020)

I'm curious how they will perform on PCI-E 3.0


----------



## efikkan (Sep 1, 2020)

Specs are looking pretty good (on paper), perhaps except the TDP numbers.

It's funny to see that most "leakers" were pretty wrong on these specs.


----------



## PowerPC (Sep 1, 2020)

R0H1T said:


> Two trains of thoughts, not necessarily contradictory ~
> 
> If Nvidia was able to pull the 2.5-3x perf/W efficiency it's possible they may have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of _nothingburger_ called nCoV ravaging the entire world atm. Now depending on which side of the fence you are, NVidia's margins could be higher though I'm 100% certain their overall sales would be (much?) lower!
> 
> ...


I'm at a loss for words for all the conjuring and bending that is happening in this post. Just take the performance / price increase already... sheesh.


----------



## Chrispy_ (Sep 1, 2020)

Yeah, I'm not paying $700 for a 10GB card.
RX570 for $120 has 8GB FFS and I already ditched my 6GB card because it ran out of VRAM.


----------



## Berfs1 (Sep 1, 2020)

CrAsHnBuRnXp said:


> SLI is dead. Nor will you need it with a 3090.


b!tch we need flight simulator at 4K60



Chrispy_ said:


> Yeah, I'm not paying $700 for a 10GB card.
> RX570 for $120 has 8GB FFS and I already ditched my 6GB card because it ran out of VRAM.


bro these cards are WAY more cost effective than turing. Yes, the 3090 is, technically speaking, cheaper than the 2080 ti.

And who cares about VRAM? It's not like it affects your performance in any noticeable way, even if you had 20GB you likely wouldn't notice the single digit FPS bump.


----------



## medi01 (Sep 1, 2020)

midnightoil said:


> Based on all the performance data from both next gen consoles, technical document releases, and that they clearly will have evaluated PS5 and XB dev kits, and likely had rough performance of RDNA2 desktop leaked to them.
> 
> The pricing and gigantic, inefficient 3090 reflect this.  Why else would they do it?  You think they just rolled the dice and decided to slash their margins on volume sellers, and produce an ultra low yield furnace halo product for the LULs?


That is true, and the price gap between the 3080 and 3090 perhaps indicates where NV expects AMD to have competitive products, but twice the 2080 Ti's performance (unless it is RTX bazinga aggravated with the fancy AI upscaling known as DLSS, in which case it is lawsuit-worthy misleading marketing) is unexpected, and so is an 8k+ CUDA core $700 card.



R0H1T said:


> 2.5-3x perf/W efficienc


They themselves claim 1.9x.


----------



## steen (Sep 1, 2020)

Many leaks proven correct, except the silly stuff like a coprocessor/FPGA. Even the late "reasonable" prices. I'm particularly interested in the 2x FP32 performance, as they're quoting 10496 CUDA cores for the 3090. Does this mean INT32+FP32 / FP32+FP32 issue, with the compiler extracting parallelism? 2x FP32 per clock? I also presume TF32/FP64 tensor support for the gaming cards? Need the Ampere white paper... Also nice that all GA chips support hardware AV1 decode.
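That 2x FP32 question can at least be sanity-checked against NVIDIA's headline TFLOPS figure with a back-of-envelope calculation. A quick sketch; the ~1.70 GHz boost clock and the FMA-counts-as-2-FLOPs convention are assumptions here, not quotes from the stream:

```python
# Back-of-envelope shader throughput from the quoted core count.
# Assumptions (not from the announcement): FMA counts as 2 FLOPs per
# core per clock, and the boost clock is ~1.70 GHz.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPS = cores * clock (GHz) * 2 (FMA) / 1000."""
    return cuda_cores * boost_ghz * 2 / 1000

# RTX 3090: 10496 CUDA cores -> roughly 35.7 TFLOPS peak FP32
print(round(fp32_tflops(10496, 1.70), 1))
```

If that lands in the mid-30s TFLOPS range, the 10496 figure only makes sense with the doubled FP32 datapath per SM already baked in.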


----------



## nguyen (Sep 1, 2020)

R0H1T said:


> Two trains of thoughts, not necessarily contradictory ~
> 
> If Nvidia was able to pull the 2.5-3x perf/W efficiency it's possible they may have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of _nothingburger_ called nCoV ravaging the entire world atm. Now depending on which side of the fence you are, NVidia's margins could be higher though I'm 100% certain their overall sales would be (much?) lower!
> 
> ...



Stop with your red-pill fantasy please. Nvidia released the 1080 Ti at 700usd which demolished AMD until this very day. At this point the 700usd RTX 3080 are meant for 1080 Ti owners who refused to upgrade for so long.


----------



## sutyi (Sep 1, 2020)

Hyderz said:


> there might be an rtx 3080ti coming, nvidia is possibly holding off to see what RDNA2 brings



No Ti branding this time around, supposedly. I think with the Super models they'll double the VRAM on the higher SKUs.


A 3080S 20GB / 3070S 16GB model is what you should keep an eye out for if they are forced by AMD.


----------



## moproblems99 (Sep 1, 2020)

PowerPC said:


> Love how much cognitive dissonance must be going on in people's heads right now. Just a day ago people were still saying this kind of performance / price was "literally impossible" on this very forum.



I'll be first to say I didn't expect the prices.


----------



## neatfeatguy (Sep 1, 2020)

I'll wait for benchmarks. Here's hoping the 3070 is 2080Ti equivalent or better.

My upgrade path always used to be when 1 card (reasonably priced or if funds are available) = 2 older gen cards I have in SLI.

Dual 8800 GTS 512MB in SLI roughly equals 1 GTX 280
Add a second GTX 280 for SLI roughly equals 1 GTX 570
Not having funds I didn't upgrade my 570s to a 780Ti and waited for next gen....then jumped on a 980Ti and I've been using it since.

I've been waiting for a single card priced in the $500 range that can give twice the performance of my 980Ti and a 2080Ti is that card, but not in the $1000+ price range. Hell no.

If the 3070 or even an AMD equivalent card around the same price can give me double the performance of my 980Ti and cost is around $500, then this generation will be the one I finally upgrade my GPU.


----------



## Tomgang (Sep 1, 2020)

All I can say is:

Also, who is joining me on the hype train? Choo choo! But be warned, the hype train is really hot.

What really took me by surprise was the CUDA core count. I did not foresee Ampere having this many cores. It also explains why the RTX 3080 and 3090 are 300 W+ TDP cards. No doubt with that many CUDA cores Ampere is gonna be a serious beast. The RTX 3080 also surprised me with the price; not so much Ngreedia this time as I had feared. On paper the RTX 3080 looks like a solid 4K GPU, although I do have my concerns about only 10 GB of VRAM being future-proof for the next two years. There are already games that use 8 GB+ of VRAM at 4K, and Microsoft Flight Simulator is already close to 10 GB at 4K. But VRAM amount aside, Ampere looks really solid.

Sorry GTX 1080 Ti, but I think it's time our ways go in different directions... no, don't look at me like I'm betraying you.


----------



## PowerPC (Sep 1, 2020)

nguyen said:


> Stop with your red-pill fantasy please. Nvidia released the 1080 Ti at 700usd which demolished AMD until this very day. At this point the 700usd RTX 3080 are meant for 1080 Ti owners who refused to upgrade for so long.


Yea, the cognitive dissonance must be strong right now. People believe one thing so strongly for so long, until reality hits them on the head like a ton of bricks. All they can do is cling to their outdated opinion just to relieve the pain of having been wrong for this long.


----------



## VallThore (Sep 1, 2020)

What also fascinates me is how well a false leak about a price rise can change the general perception of a product's price and value.
When I asked three friends of mine some time back what they would think about keeping the 3000 series at the same release price, all of them were more or less like 'meh, lower would be nice, but it's expected that the price shouldn't change'. After the price leak, and NVIDIA now denying it, all of them are amazed at the price.
Feels almost as if the leak was a marketing trick.


----------



## Berfs1 (Sep 1, 2020)

Just reposting my link, look at 30 series, compare to 20 series, then talk. https://docs.google.com/spreadsheets/d/1-JXBPMRZtUx0q0BMMa8Nzm8YZiyPGkjeEZfZ2DOr8y8/edit?usp=sharing


----------



## renz496 (Sep 1, 2020)

R0H1T said:


> Two trains of thoughts, not necessarily contradictory ~
> 
> If Nvidia was able to pull the 2.5-3x perf/W efficiency it's possible they may have priced it similar to the 2xxx lineup. Of course Nvidia would be looking at RDNA2 perf & that big ball of _nothingburger_ called nCoV ravaging the entire world atm. Now depending on which side of the fence you are, NVidia's margins could be higher though I'm 100% certain their overall sales would be (much?) lower!
> 
> ...



more like nvidia is expecting RDNA2 to compete with them on both price and efficiency. but that doesn't mean AMD will actually compete. we've had this kind of nvidia vs AMD moment several times already. not saying that RDNA2 can't compete or won't repeat history, but nvidia going all out is not definite proof that AMD has something good coming out.


----------



## Berfs1 (Sep 1, 2020)

PowerPC said:


> Yea, the cognitive dissonance must be strong right now. People believe one thing so strong for so long until reality hits them on the head like a ton of bricks. All they can do is remain with their outdated opinion just to relieve the pain of having been wrong for this long.


man yall really be fighting over A HUGE PRICE CUT and A HUGE PERFORMANCE GAIN... cmon yall be happy for once lol


----------



## CrAsHnBuRnXp (Sep 1, 2020)

Berfs1 said:


> man yall really be fighting over A HUGE PRICE CUT and A HUGE PERFORMANCE GAIN... cmon yall be happy for once lol


Welcome to 2020?


----------



## Chrispy_ (Sep 1, 2020)

Berfs1 said:


> And who cares about VRAM? It's not like it affects your performance in any noticeable way, even if you had 20GB you likely wouldn't notice the single digit FPS bump.


I had to reduce detail in Doom Eternal because my 2060 lacked enough VRAM for even 1440p. It literally wouldn't run; it needed 7.5 GB when only 6 GB was available.
Cyberpunk might just fit into 10 GB, but I suspect games in 2021 are going to be pushing 12 GB, with the console devs targeting that for VRAM allocation.


----------



## Zubasa (Sep 1, 2020)

Berfs1 said:


> man yall really be fighting over A HUGE PRICE CUT and A HUGE PERFORMANCE GAIN... cmon yall be happy for once lol


There is no price cut to speak of, the performance gain is nice on the other hand.


----------



## PowerPC (Sep 1, 2020)

Chrispy_ said:


> Yeah, I'm not paying $700 for a 10GB card.
> RX570 for $120 has 8GB FFS and I already ditched my 6GB card because it ran out of VRAM.


I actually think it's admirable that they aren't ripping people off with too much VRAM. Well, they are doing it with 24GB on the 3090, but you have to be gullible to think you need that much VRAM. That extra amount of VRAM is usually just to inflate the price for no reason.


----------



## Berfs1 (Sep 1, 2020)

Zubasa said:


> There is no price cut to speak of, the performance gain is nice on the other hand.


Performance/dollar is increased over 2x. THAT is huge.
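For what it's worth, the "over 2x" figure is easy to sanity-check from the launch MSRPs. A rough sketch; the relative-performance value is a placeholder taken from NVIDIA's "3070 ≈ 2080 Ti" claim, not a measured result:

```python
# Rough perf-per-dollar comparison at launch MSRPs.
# rel_perf values are placeholder assumptions from NVIDIA's claims,
# not benchmark results.
cards = {
    "RTX 2080 Ti": {"price": 1199, "rel_perf": 1.00},  # baseline
    "RTX 3070":    {"price": 499,  "rel_perf": 1.00},  # claimed ~2080 Ti level
}

def perf_per_dollar(card: dict) -> float:
    return card["rel_perf"] / card["price"]

ratio = perf_per_dollar(cards["RTX 3070"]) / perf_per_dollar(cards["RTX 2080 Ti"])
print(f"{ratio:.2f}x")  # ~2.4x at these assumed numbers
```

Of course the ratio shrinks if you use the $999 2080 Ti MSRP instead of street prices, or if the 3070 falls short of 2080 Ti performance in reviews.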


----------



## AusWolf (Sep 1, 2020)

Awesome numbers, awesome prices... what's even more awesome is that the number of CUDA cores suggests a Ti version coming out later for each segment.


----------



## Nkd (Sep 1, 2020)

VallThore said:


> I wonder what relative performance means this time. I have a gut feeling that this incredible (roughly ~1.7x compared to 2080S looking at the graph) speed-up it's all about raytracing and not so much for rasterization but I would love to be wrong here.



I think the numbers are all with ray tracing and DLSS on when they mention double the performance. If you go to their website, they have three games up, and all of them have DLSS and ray tracing on for the claimed double performance at 4K.


----------



## Kohl Baas (Sep 1, 2020)

nguyen said:


> Stop with your red-pill fantasy please. Nvidia released the 1080 Ti at 700usd which demolished AMD until this very day. At this point the 700usd RTX 3080 are meant for 1080 Ti owners who refused to upgrade for so long.



Stop with your green-pill fantasy please. 1080 Ti owners refused to upgrade because of the 60% price hike.

Everybody is so frantic about the 30xx pricing, when the truth is that this is the good old Pascal pricing coming back to replace the insanity that was Turing.


----------



## Zubasa (Sep 1, 2020)

Berfs1 said:


> Performance/dollar is increased over 2x. THAT is huge.


Of course, compared to the dumpster fire that was Turing's 35% performance increase plus a price hike.
Although large generational performance gains are not unheard of.


----------



## R0H1T (Sep 1, 2020)

renz496 said:


> but nvidia going all out is not a definite proof that AMD have something good coming out.


Not saying it is, but a combination of the global pandemic & RDNA2 may have forced their hand. The last thing JHH would want is to alienate his base by pricing it outside their reach when incomes across the board are plummeting. There's also the fact that HPC & DC revenue surpassed gaming just last quarter, so they have more wiggle room to price it more aggressively now.

The point is ~ if Nvidia *could* price it at Turing levels, they'd almost certainly do so.


----------



## Berfs1 (Sep 1, 2020)

Zubasa said:


> Of course, compare to the dumpster fire that is 35% performance increase for Turing and a price hike.
> Although large performance gains in new generations is not unheard of before.


Actually, 3090 has higher performance/$ than 2080 ti.


----------



## Nkd (Sep 1, 2020)

AusWolf said:


> Awesome numbers, awesome prices... what's even more awesome is that the number of CUDA cores suggest a Ti version coming out later for each segment.



Ti is gone. It's going to be Super refreshes from now on. That's one of the reasons they went with 3090 instead of 3080 Ti at launch.


----------



## Makaveli (Sep 1, 2020)

Imouto said:


> RIP AMD.
> 
> This time around they won't be able to compete even in price. I see the higher ups at AMD hovering their mouses over emails asking their providers to stop the RDNA2 production.



You have no clue what you're talking about; thanks for posting to show everyone else.


----------



## Zubasa (Sep 1, 2020)

Berfs1 said:


> Actually, 3090 has higher performance/$ than 2080 ti.


And that is somehow something unique for a new generation of GPUs?
We got that in all the generations of GPUs before Turing.


----------



## Chrispy_ (Sep 1, 2020)

PowerPC said:


> I actually think it's admirable that they aren't ripping people off with too much VRAM. Well, they are doing it with 24GB on the 3090, but you have to be gullible to think you need that much VRAM. That extra amount of VRAM is usually just to inflate the price for no reason.


see my post here
I've already run out of VRAM this year, far bigger games are coming.


----------



## P4-630 (Sep 1, 2020)

https://www.msi.com/Landing/GeForce-RTX-30-Series


----------



## Raendor (Sep 1, 2020)

Maybe game devs should learn again how to compress textures without losing much quality. When games these days weigh 100 gigs, most of which is textures, it's just stupid.


----------



## ZoneDymo (Sep 1, 2020)

Chrispy_ said:


> see my post here
> I've already run out of VRAM this year, far bigger games are coming.



^ I mean don't get me wrong, a lot is good here, but look at the most recent TPU game review, Horizon Zero Dawn: that game uses over 8 GB of VRAM at 4K.

And that is today... I mean, here you buy that $500 RTX 3070 to get RTX 2080 Ti performance, but it has only 8 GB of VRAM... that is going to be a problem soon, I think.


----------



## Berfs1 (Sep 1, 2020)

Zubasa said:


> And that is somehow something unique for a new generation of GPUs?
> We got that in all the generations of GPUs before Turing.


Actually, from my numbers, the 600 to 700 series was a (very slight) decrease in average performance/$. The 10 to 20 series was a more noticeable decrease in average performance/$.


----------



## raptori (Sep 1, 2020)

The "twice as fast" claim, or the 3070 being faster than the 2080 Ti: *is it general performance*, RT performance only, or do we not know yet?


----------



## Nkd (Sep 1, 2020)

lynx29 said:


> WTF JUST HAPPENED!!!!!!   A 2080 TI FOR $500  OMFG LMAO THIS IS THE GREATEST DAY EVER... HOLY CRAP....
> 
> 
> this is unbelievable... Nvidia just came out swinging.  I don't see Big Navi matching this I really don't...



Hold your horses and wait for reviews. Browsing NVIDIA's website, it seems those numbers are based on RTX and DLSS being on, not just pure rasterization. NVIDIA is just combining all three now, I believe.


----------



## Zubasa (Sep 1, 2020)

Berfs1 said:


> Actually, from my numbers, 600 to 700 series was a (very slight) decrease in average performance/$. 10 to 20 series was a more noticeable average performance/$.


Both of them are Kepler, that generation is pretty much a rebrand for the most part. The GTX 780 / Ti was the only new die of note in the 700-series.
Basically the whole RX400 vs RX500 rubbish.


----------



## Berfs1 (Sep 1, 2020)

ZoneDymo said:


> ^ I mean dont get me wrong, a lot is good here, but look at the most recent TPU game review, Horizon Zero Dawn, that game uses over 8gb of Vram on 4k.
> 
> And that is today..... I mean here you buy that 500 dollar RTX3070 to get RTX2080Ti performance but it has only 8gb of ram...that is going to be a problem soon I think.


Okay, but you can afford $50 of RAM, right? 'Cause if it goes past the VRAM limit, it can spill over into system RAM. We had this situation earlier where 8 GB of RAM with a 1080 Ti was actually not a bad idea (at the time), because the 1080 Ti had more VRAM, but if you paired it with a 1060 3GB you definitely needed 16 GB. NOW, RAM is cheaper, only $100 for 32 GB. Just get that, and you will be fine.


----------



## Vayra86 (Sep 1, 2020)

So much for all the BS of the last few weeks then, huh.

I suppose Nvidia is ready to make their RT push and take a much smaller margin on it. That 3070 is pretty neat at 499, but first... raw performance plx. RT performance is an irrelevant metric.


----------



## Berfs1 (Sep 1, 2020)

Nkd said:


> hold your horses and wait for reviews. Browsing NVIDIA’s website it seems those numbers are based off rtx and dlss on. Not just pure rasterizarion. NVIDIA’s is just combining all three now I believe.


Not true, the core counts are literally doubled. MORE than doubled.


----------



## Fluffmeister (Sep 1, 2020)

Gets juicy from about 6 minutes in; it compares a cranked 2080 to a 3080.

Go to 6:16 to be precise, and well... gee.


----------



## Berfs1 (Sep 1, 2020)

Zubasa said:


> Both of them are Kepler, that generation is pretty much a rebrand for the most part.
> Basically the whole RX400 vs RX500 rubbish.


OK, congrats... the 30 series on average has a much lower price per unit of performance. Yay, it's better.


----------



## Vayra86 (Sep 1, 2020)

ZoneDymo said:


> ^ I mean dont get me wrong, a lot is good here, but look at the most recent TPU game review, Horizon Zero Dawn, that game uses over 8gb of Vram on 4k.
> 
> And that is today..... I mean here you buy that 500 dollar RTX3070 to get RTX2080Ti performance but it has only 8gb of ram...that is going to be a problem soon I think.



I think it's clear they're trying to sell you a 3090 if you REALLY want 4K.

As they should. It's still early-adopter territory, and you should've known better if you thought that 1080 Ti would carry you in the long run. Not you specifically, the royal you.


----------



## nguyen (Sep 1, 2020)

Yup, the RTX 3080 will straight up murder the 2080 Ti: 40-50% faster than the 2080 Ti and 80% faster than the RTX 2080 just in rasterization.

Edited: @Fluffmeister just posted the DigitalFoundry link like 5s faster than I did


----------



## Chrispy_ (Sep 1, 2020)

Nkd said:


> hold your horses and wait for reviews. Browsing NVIDIA’s website it seems those numbers are based off rtx and dlss on. Not just pure rasterizarion. NVIDIA’s is just combining all three now I believe.


This is what I'd heard in a couple of streams discussing the new RT performance too. Nvidia are going to be pushing DLSS _hard_ this generation.


----------



## Nkd (Sep 1, 2020)

chstamos said:


> Welp! I don't have the live stream. Is RTX 3070 supposed to be twice as fast as 2080 Ti without raytracing, too, or only with RT on?
> 
> If it's twice the rasterization performance this is actually an amazing new lineup! I was set for disappointment, thought we were going to have another turing debut on our hands, which I had found very underwhelming. This is actually great!



Well, looking at the website, under performance the 3080 is twice as fast as the 2080 Super with RTX and DLSS on. They show numbers from three games. They didn't really show any specific per-game numbers. So it looks like their overall performance figure is based on RTX and DLSS being on.


----------



## PowerPC (Sep 1, 2020)

Chrispy_ said:


> see my post here
> I've already run out of VRAM this year, far bigger games are coming.


VRAM is one of the most expensive parts of the GPU right now and, so far, the least important when it comes to performance. You can increase the VRAM, but then you'll have to increase the price considerably; just look at the 3090... I mean, these cards are for today. I doubt anybody can do anything about VRAM prices. Game devs really need to (and probably will) adapt to this reality, unless these prices change. I think you can find flaws with anything if you want.


----------



## medi01 (Sep 1, 2020)

So kopite7kimi was right about pricing.

According to him/her, Big Navi soundly beats the 3070 (rendering a 3070 Ti useless).
This means we have:

3070 << Big Navi < 3080

It also explains the unexpectedly conservative pricing.


----------



## Berfs1 (Sep 1, 2020)

Makaveli said:


> You have no clue what you are talking about thanks for posting to show everyone else.


How sure are you that YOU know what you are talking about? AMD has not been able to compete with the 2080 Ti, except with the *rumored* (and never released) RX 5950 XT. At least Nvidia made something with MORE THAN 2x THEIR CURRENT FLAGSHIP CARD!!



Nkd said:


> well looking at the website and under performance 3080 is twicea as fast as 2080 super with rtx and dlss on. They show numbers from 3 games.They didn’t really show any game numbers specifically. So looks like their overall performance number is based on rtx and dlss on.


They didn't show exact numbers because they can vary. Because not everyone has the exact same CPU in the exact same configuration.


----------



## RedelZaVedno (Sep 1, 2020)

Digital Foundry benchmarks 3080 = 65-90% faster than 2080... AWESOME!!!


----------



## medi01 (Sep 1, 2020)

Mm, did the number of transistors per CU go down?



RedelZaVedno said:


> Digital Foundry benchmarks 3080 = 65-90% faster than 2080... AWESOME!!!


So far so good for the "two times faster" promise, lol.


----------



## Fluffmeister (Sep 1, 2020)

nguyen said:


> Jup, RTX 3080 will straight up murdering 2080 Ti, 40-50% faster than 2080 Ti and 80% faster than RTX 2080 just in rasterization
> 
> Edited: @Fluffmeister just posted the DigitalFoundry link like 5s faster than I did



Yeah, just beat ya to it! Sorry fella! But yeah, the 3080 smacks the 2080 pretty hard.


----------



## Nkd (Sep 1, 2020)

nguyen said:


> Jup, RTX 3080 will straight up murdering 2080 Ti, 40-50% faster than 2080 Ti and 80% faster than RTX 2080 just in rasterization
> 
> Edited: @Fluffmeister just posted the DigitalFoundry link like 5s faster than I did



Well, looks like Navi is just going to be competing with the 3080 from a rasterization standpoint, looking at the specs. I don't see them trying to compete with the 3090, because the math doesn't add up unless of course they suddenly have more cores than rumored. Plus, I'm not sure AMD really wants to compete at $1000+. Just not something they shoot for.


----------



## Vayra86 (Sep 1, 2020)

Fluffmeister said:


> Gets juicy from about 6 minutes in, compares a cranked 2080 to a 3080.
> 
> Go to 6.16 to be precise and well.... gee.



I think it's always best to take a worst-case, much-salt scenario on these comparisons, because you're not usually looking at the toughest scenes games will throw at you.

Realistically, I think what the 3080 will give you is +75% over a 1080 Ti / 2070S. Because let's face it, the 2080 wasn't much of its own tier.

Which is STILL pretty damn decent at 700, I have to say. It's also good to see that the x80 pulls up the absolute performance already. That's a lot better than Turing leaving it to the 2080 Ti.


----------



## PowerPC (Sep 1, 2020)

Fluffmeister said:


> Gets juicy from about 6 minutes in, compares a cranked 2080 to a 3080.
> 
> Go to 6.16 to be precise and well.... gee.


So Doom Eternal performance of the 3080 vs the 2080 is increased by a whopping 81% without RTX, because that game doesn't even support RTX afaik.

And my prediction a couple of weeks ago was around 60%, which was actually made fun of on this forum hahaha. Even 60% would still have been a lot, and much more than at least two of the last generational increases.


----------



## randompeep (Sep 1, 2020)

Out there wishing people would calm their asses down on the VRAM battle. Like, what the heck do you do with dozens and dozens of gigs?! Isn't this the hardware you're supposed to game on for the next 3-5 years, like you do with a next-gen console, huh? How many of y'all are gonna do it at 4K 120 Hz or 8K (just to call it out)?
And for the VRAM-'horny' peeps, go hunt the used market! Big deals are already on globally


----------



## Zubasa (Sep 1, 2020)

medi01 said:


> So far so good for the "two times faster" promise, lol.


Ampere got more than double the CUDA cores, but the rest of the card (ROPs / bandwidth etc.) didn't get that kind of increase.
So the performance checks out in the end.


Vayra86 said:


> I think it's always best to take a worst-case, much-salt view of these comparisons, because you're not usually looking at the toughest scenes games will throw at you.
> 
> Realistically I think what the 3080 will give you is +75% over a 1080 Ti / 2070S. Because let's face it, the 2080 wasn't much of a tier of its own.
> 
> Which is STILL pretty damn decent at $700, I have to say. It's also good to see that the x80 pulls up the absolute performance already. That's a lot better than Turing leaving it to the 2080 Ti.


Yeah, a 70-ish% increase is like Maxwell to Pascal, which is very nice; finally we get a "real" generational increase.


----------



## Chrispy_ (Sep 1, 2020)

medi01 said:


> So
> kopite7kimi
> was right about pricing.
> 
> ...


big navi is still a ways off though, right?


----------



## mrthanhnguyen (Sep 1, 2020)

Don't forget the real benchmarks won't come from Nvidia's slides.


----------



## chodaboy19 (Sep 1, 2020)

Wow the performance increase is truly brutal across the board! 

After some more driver and game optimization, the performance will just keep going up!


----------



## Chomiq (Sep 1, 2020)

Imouto said:


> That's one of the points, though. Consoles are obsolete even before launch.


100 mil + of PS4 sold says otherwise.


sutyi said:


> No Ti branding this time around supposedly. I think with the super models they'll double the VRAM on higher SKUs.
> 
> 
> 3080S 20GB / 3070S 16GB model what you should keep an eye out if they are forced by AMD.


They have a big price gap between 3080 and 3090. They can hold out until flagship RDNA2 and then drop 3080 Ti to fit this scenario.


----------



## Vayra86 (Sep 1, 2020)

Nkd said:


> Well, it looks like Navi is only going to be competing with the 3080 from a rasterization standpoint, looking at the specs. I don't see them trying to compete with the 3090 because the math doesn't add up, unless of course they suddenly have more cores than rumored. Plus I'm not sure AMD really wants to compete at $1000+. It's just not something they shoot for.



I think Navi is going to *struggle* catching this 3080, to be honest. AMD has yet to surpass 2070S performance convincingly, and now they're making an 80% jump ahead? Not likely, unless they make something absolutely gargantuan. But let's not dive into the next pond of speculation... my heart...

By the by, do we have TDPs for these Ampere releases already? The real numbers?


----------



## HD64G (Sep 1, 2020)

raptori said:


> The " twice as fast" or 3070 is faster than 2080Ti , *is it the general performance* ? or RT performance only ? or we don't know yet ...


That increase in performance is clearly the FPS with RTX ON.


----------



## RedelZaVedno (Sep 1, 2020)

randompeep said:


> Out there wishing people would calm their asses down on VRAM battle. Like the heck you do with dozens and dozens?! Isn't this the hardware you'd be supposed to game the next 3-5 years like you do with a next-gen console, huh ? How many of y'all are gonna do it in 4k 120 Hz or 8k (just to call it out) ?
> And for the VRAM 'horny' peeps, go hunt the used market! Big deals are already on globally


Just look at Microsoft FS 2020. A 2080 Ti manages only 31 FPS (with dips to 21) over New York City at 4K/ULTRA and wants to use 12.7 GB of VRAM. The 3080 will hopefully get us to 55 fps (with dips into the 40s). Having 16 GB would probably smooth those fps dips further. Needless to say, the 3080 is still a godsend for flight simmers


----------



## sutyi (Sep 1, 2020)

Chomiq said:


> 100 mil + of PS4 sold says otherwise.
> 
> They have a big price gap between 3080 and 3090. They can hold out until flagship RDNA2 and then drop 3080 Ti to fit this scenario.



Most of the leaked info points to AIB partners pushing towards a more transparent naming scheme for the product stack. Super has a better ring to it.

I'm not doubting they'll release something between the 3080 and the 3090 down the line, but 80% chance it will not be called a 3080 Ti.
It will probably be called a 3080 Super, with 20 GB of memory and a 350 W TGP. They'll release it around $899-999 when they have enough cut-down cores.


----------



## efikkan (Sep 1, 2020)

medi01 said:


> According to him/her Big Navi soundly beats 3070 (rendering 3070Ti useless).
> This means we have:
> 3070 << Big Navi < 3080
> It also explains the unexpectedly conservative pricing.


That's a pretty bold statement when nobody has compared them in real workloads yet.

Don't get me wrong, I hope Big Navi™ will push the prices a bit down further.


----------



## Icon Charlie (Sep 1, 2020)

HD64G said:


> That increase in performance is clearly the FPS with RTX ON.



I agree.  

I want to see standardized testing from several different sites on a large number of games and programs; that's when judgement should be placed.

Starting at 1080p and going up from there. I want to see real performance gains without RTX on.


----------



## PowerPC (Sep 1, 2020)

With the VRAM discussion, you also have to mention that the 30x0 cards all have 60% faster VRAM than the previous generation (which makes freeing up and writing to memory much faster), and will therefore probably have a much smaller VRAM bottleneck than Turing, if you even encounter it as a problem, which most games so far don't, even at 4K. 8-10 GB is still pretty good. But they'll probably release Ti variants of all these cards with increased VRAM, at much higher prices.


----------



## Chomiq (Sep 1, 2020)

sutyi said:


> Most of the leaked info points to AIB partners pushing towards a more transparent naming scheme for the product stack. Super has a better ring to it.
> 
> I'm not doubting they'll release something between the 3080 and the 3090 down the line, but 80% chance it will not be called a 3080 Ti.
> It will probably be called a 3080 Super, with 20 GB of memory and a 350 W TGP. They'll release it around $899-999 when they have enough cut-down cores.


Thing is, Super is perfectly fine if you're doing a refresh. Ti is helpful if you need something quick to one up the competition. They can do Super refresh in 12 months, but Ti would be a direct response if RDNA2 actually managed to match 3080 in non-RT performance.


----------



## Zubasa (Sep 1, 2020)

chodaboy19 said:


> Wow the performance increase is truly brutal across the board!
> 
> After some more driver and game optimization, the performance will just keep going up!


You might want to tone down the excitement a bit, generally nVidia GPUs don't tend to gain a lot of performance unless there is some glaring bug in the driver.
nVidia is generally pretty good at getting most of the performance out of their cards from the get go.
For example, the relative performance between Pascal and Turing more or less stayed the same in most cases.


Spoiler: GeForce RTX 2080 Ti vs GTX 1080 Ti vs GTX 980 Ti: Flagship Versus (www.techspot.com)

----------



## medi01 (Sep 1, 2020)

efikkan said:


> That's a pretty bold statement when nobody have compared them in real workloads yet.



How NV/AMD "know" what is being cooked in the other kitchen has always been a mystery to me (and I've wondered how much of it could be blocked at the driver level).

Nevertheless, 3080 and below pricing is... not usual, is it?


----------



## Metroid (Sep 1, 2020)

I'm going with the 3080 because it's the best price/performance GPU, but I want more memory; 10 GB is not enough. I like Nvidia's cooling design, but I'll have to wait for reviews and then decide which 3080 to buy. Nvidia reference cards will only come with 10 GB, so I need to wait and see which company will pack in more GDDR and have the best cooling solution.

3070 = $499
3080 = $699
3090 = $1499

The 3090 is too expensive for the performance it will give; I mean, it's better to buy 2 × 3080 and run them in SLI. I don't think the 3090 will even be 50% faster than the 3080.

Also, Nvidia worked wonders: the 2080 is 12 nm and the 3080 is 8 nm, so in that sense we should expect only ~50% more performance, and yet it's a lot more than that. If we take the 1080, which is 16 nm, and compare it to the 3080 at 8 nm (which is not even a true 7 nm), the gain will be much more than 100%, and for that I'm happy.


----------



## Zubasa (Sep 1, 2020)

medi01 said:


> How NV/AMD "know" what is being cooked in the other kitchen has always been a mystery to me (and I've wondered how much of it could be blocked at the driver level).
> 
> Nevertheless, 3080 and below pricing is... not usual, is it?


Depends on how you think of it; Turing's pricing was "unusual" in the bad way.
TBH, I was kind of expecting nVidia to raise the prices on these as well.


----------



## medi01 (Sep 1, 2020)

HD64G said:


> That increase in performance is clearly the FPS with RTX ON.


On one hand there are 2+ times the CUDA cores; on the other, lower clocks and a lower number of transistors per CU (???).

If increasing the number of CUs 2.5 times only brings 1.5 times better perf, I'd call it disappointing.



Zubasa said:


> Depends on how you think of it, Turing's pricing was "unusual" in the bad way.


The ASP bump started back in AMD's Raja era, which means Pascal.


----------



## SIGSEGV (Sep 1, 2020)

I knew the 3090 would be more costly. It's very obvious that the main target of this card is intermediate or even beginner researchers and professionals, especially those who use CUDA libraries to have fun with deep learning. With the 3090's hefty price, I'd rather rent Google Cloud AI to serve my passion.
I really hope AMD has a secret recipe to counter these ampere GPU lineups.


----------



## medi01 (Sep 1, 2020)

Metroid said:


> the best cost benefit price/performance gpu


It actually is the 3070 (which one could safely say without even watching anything), but still, wench for baitmarks: it shouldn't take long for actual tests, from places not known to be NV's bent-over stops, to roll out.


----------



## efikkan (Sep 1, 2020)

medi01 said:


> How NV/AMD "know" what is being cooked in the other kitchen has always been a mystery to me (and I've wondered how much of it could be blocked at the driver level).


The fact is they don't know, not until it's too late anyway.
Nvidia themselves don't even know the precise performance until the final stepping is ready, and that's usually 1-2 months before launch.



medi01 said:


> Nevertheless, 3080 and below pricing is... not usual, is it?


2080/3080 and 2070/3070 pricing seems the same to me.


----------



## Zubasa (Sep 1, 2020)

medi01 said:


> On one hand there are 2+ times the CUDA cores; on the other, lower clocks and a lower number of transistors per CU (???).
> 
> If increasing the number of CUs 2.5 times only brings 1.5 times better perf, I'd call it disappointing.
> 
> ...


And before that, nVidia started shifting the x80-class GPUs down the stack, going from the biggest x100-class dies originally to the x104-class dies now.
As for performance per CU, you never get 100% scaling from increasing the CU count, since it is not practical to increase the cache, bandwidth, etc. by 2.5x as well.
You might also lose some clock speed to keep a monster GPU within a certain power envelope.
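To put rough numbers on that scaling argument, here's a toy sketch; the compute-bound fraction and the 1.2x bandwidth factor are made-up illustrations, not measured Ampere figures:

```python
# Toy model of why perf doesn't scale 1:1 with CUDA core count:
# part of a frame is compute-bound, the rest is bandwidth-bound,
# and each part only speeds up with its own resource.

def effective_speedup(compute_x, bandwidth_x, compute_frac=0.6):
    """Weighted harmonic mean (Amdahl-style split; fractions assumed)."""
    return 1.0 / (compute_frac / compute_x + (1.0 - compute_frac) / bandwidth_x)

# 2.5x the shader throughput but only ~1.2x the memory bandwidth:
print(round(effective_speedup(2.5, 1.2), 2))  # ~1.74x, nowhere near 2.5x
```

So even a 2.5x jump in core count lands well under 2x unless bandwidth and caches grow with it, which is roughly what the leaked numbers suggest.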


----------



## Gameslove (Sep 1, 2020)

Geforce RTX 3070 is amazing, so expected, plus RTX IO.


----------



## randompeep (Sep 1, 2020)

RedelZaVedno said:


> Just look at Microsoft FS 2020. A 2080 Ti manages only 31 FPS (with dips to 21) over New York City at 4K/ULTRA and wants to use 12.7 GB of VRAM. The 3080 will hopefully get us to 55 fps (with dips into the 40s). Having 16 GB would probably smooth those fps dips further. Needless to say, the 3080 is still a godsend for flight simmers



Yeah, but a newer architecture means more efficiency in texture processing. I'm a fanboy of my wallet, and honestly Nvidia has always killed it on this side of things. Just looking at RTX vs RX 5xxx series VRAM consumption tells you the truth!
MS FS is a weird flex, much needed by the hardware industry just to justify upgrading to the current gen!! It's like a football match: whoever wins doesn't matter anyway, both parties are gonna get paid well enough (AMD & Nvidia). But the gaming market has been rising anyway since the COVID 'era' started, so I wouldn't be surprised if AMD jumps to a smaller semiconductor node as soon as next year tho!


----------



## Icon Charlie (Sep 1, 2020)

medi01 said:


> How NV/AMD "know" what is being cooked in the other kitchen has always been a mystery to me (and I've wondered how much of it could be blocked at the driver level).
> 
> Nevertheless, 3080 and below pricing is... not usual, is it?


Going back over MSRPs a few years:

The standard 1080 was $599 in 2016
The standard 2080 was $699 in 2018

What I think is that Nvidia is really afraid of NAVI and has priced accordingly.


----------



## Xaled (Sep 1, 2020)

jesdals said:


> But does RTX 3090 support SLI


Why does it matter?

I don't believe the prices of the 3070 and 3080s.
Nvidia is just lying about launch prices; no one will be able to buy these cards at these prices.


----------



## randompeep (Sep 1, 2020)

PowerPC said:


> With the VRAM discussion, you also have to mention that the 30x0 cards all have 60% faster VRAM than the previous generation (which makes freeing up and writing to memory much faster), and will therefore probably have a much smaller VRAM bottleneck than Turing, if you even encounter it as a problem, which most games so far don't, even at 4K. 8-10 GB is still pretty good. But they'll probably release Ti variants of all these cards with increased VRAM, at much higher prices.



Hell yeah. Someone make the crowd chill!
Can you guys imagine (it's unrealistic, but still) if they still made $100-150 GPUs with these memories? RTX 3010 & RTX 3030, lol. They would probably kill AMD's APU game. But I guess there is a deal behind the scenes:
'As long as you guys don't touch our area, we're leaving yours'


----------



## medi01 (Sep 1, 2020)

efikkan said:


> 2080/3080 and 2070/3070 pricing seems the same to me.





Icon Charlie said:


> The Standard 2080


Yeah, "the standard".

A major perf leap (mkay, everyone understands the claimed figures are well above the real ones, but still a major leap) coming together with price modesty, hm.


----------



## Xaled (Sep 1, 2020)

Yeah, the price of the never-existing standard edition, yeah.


----------



## Zubasa (Sep 1, 2020)

Xaled said:


> I don't believe the prices of 3070 and 3080s
> Nvidia is just lying about launch prices and no one would be able to buy these cards for these prices.


An interesting tidbit about pricing:
Jensen finally called the 2080 Ti a $1200 GPU in this presentation.
I take it he's finally admitting that the $999 MSRP was BS right from the get-go.
EVGA sold a couple of cards at that price in the first week or so, and it never came back in stock until that SKU was replaced by something more expensive.


----------



## Icon Charlie (Sep 1, 2020)

Xaled said:


> Why does it matter?
> 
> I don't believe the prices of 3070 and 3080s
> Nvidia is just lying about launch prices and no one would be able to buy these cards for these prices.
> ...


Oh, I am expecting the price to jack up because of the gerbil mentality to buy shiny new things.

It happened during the 2080 series launches.
It will happen with this launch as well.


----------



## DuxCro (Sep 1, 2020)

Why the F only 8GB and 10GB of VRAM?


----------



## Hellfire (Sep 1, 2020)

Not sure if this has been mentioned (I've not read ANY of the previous posts), but surely the flow-through fan could cause problems for anyone with an air cooler on their CPU.

I see a lot of hot air being blown onto the CPU. I know a lot of people use AIOs or watercooling, but high-end air coolers are still used by many.








DuxCro said:


> Why the F only 8GB and 10GB of VRAM?



I imagine the cost of GDDR6X plays a big factor, driving the price of the GPU up some? I'd imagine 12-16 GB cards will come eventually, maybe Tis?


----------



## Xaled (Sep 1, 2020)

Icon Charlie said:


> Oh, I am expecting the price to jack up because of the gerbil mentality to buy shiny new things.
> 
> It happened during the 2080 series launches.
> It will happen with this launch as well.


No, this is different; the 2080 Ti never even got close to its launch price, except for maybe 20 cards from EVGA on Black Friday or something


Zubasa said:


> An interesting tidbit about pricing:
> Jensen finally called the 2080 Ti a $1200 GPU in this presentation.
> I take it he's finally admitting that the $999 MSRP was BS right from the get-go.
> EVGA sold a couple of cards at that price in the first week or so, and it never came back in stock until that SKU was replaced by something more expensive.


Exactly. And I believe they did it just once in a while to get publicity, and most importantly, if they get sued for such fraud they have proof that they sold it at that price.


----------



## randompeep (Sep 1, 2020)

Zubasa said:


> An interesting tidbit about pricing:
> Jensen finally called the 2080 Ti a $1200 GPU in this presentation.
> I take it he's finally admitting that the $999 MSRP was BS right from the get-go.
> EVGA sold a couple of cards at that price in the first week or so, and it never came back in stock until that SKU was replaced by something more expensive.



I thought EVGA was caught in a black hole ever since their 2012-ish wave of expensive cards haha
they were caught off guard when the RGB wave started, and I couldn't name a goofier modern GPU AIB. Apparently their cards easily go out of stock, so their availability has sucked big time in the last 2-3 years


----------



## Quicks (Sep 1, 2020)

WOW, people get so excited about getting screwed over the previous generation; this is the trend that catches suckers. Always buy the best you can afford every 3rd generation. The 1080 Ti is still a good card to have now, and only the 30XX will be a worthy successor. The 40XX will suck again and be overpriced, not that I'm saying the 30XX pricing is justified.


----------



## ppn (Sep 1, 2020)

DuxCro said:


> Why the F only 8GB and 10GB of VRAM?


New memory type; supply must be satisfied.
A 5888-CUDA-core card with 8GB is like 2944 with 4GB, so we wait for the SUPER 16/20GB versions. Better to wait for 5nm anyway, with another 2x density and 1/2 the power.


----------



## Fluffmeister (Sep 1, 2020)

Xaled said:


> Why does it matter?
> 
> I don't believe the prices of 3070 and 3080s
> Nvidia is just lying about launch prices and no one would be able to buy these cards for these prices.



Bait and switch like Vega's "launch only" prices?


----------



## DuxCro (Sep 1, 2020)

ppn said:


> New memory type. Supply must be satisfied.
> 5888 Cuda cards with 8GB is like 2944 with 4GB, so we wait the SUPERS 16/20GB versions. Better wait for 5nm anyway with another 2x density, and 1/2 power.


RTX 3070 will use GDDR6, not GDDR6X


----------



## sergionography (Sep 1, 2020)

ITS OVER 9000!


----------



## Zubasa (Sep 1, 2020)

randompeep said:


> I thought EVGA was caught in a black hole ever since their 2012-ish wave of expensive cards haha
> they were caught off guard when the RGB wave started, and I couldn't name a goofier modern GPU AIB. Apparently their cards easily go out of stock, so their availability has sucked big time in the last 2-3 years


Maybe. Or, as I suspected and Xaled said above, it might have been a PR stunt, and potentially a way to get the lawyers off their asses.


----------



## DuxCro (Sep 1, 2020)

Not gonna buy a founders edition. Would love to get Asus strix or Gigabyte Aorus and then OC until it starts to squeal.


----------



## dicktracy (Sep 1, 2020)

Consoles went from midrange specs to lowend now. No need to use overly gimped checkerboard ray tracing with RDNA2 either!


----------



## randompeep (Sep 1, 2020)

Zubasa said:


> Maybe, or as I suspected and Xaled said above. It might have been a PR stunt and potentially to get the lawyers off their asses.



I don't know what EVGA prices are like outside of Europe, but over here I can see similarities with the Vertu phones.
Not really tryna be a clown; I do respect Asus, Gigabyte, MSI (maybe the best), even Palit and the others who are tryna be competitive. No wonder they've always been 'green'


----------



## ppn (Sep 1, 2020)

dicktracy said:


> Consoles went from midrange specs to lowend now. No need to use overly gimped checkerboard ray tracing with RDNA2 either!



Most definitely; a 3060 with 4608 CUDA cores, if everything doubled and moved one tier up, makes the unreleased consoles look like an RTX 3050.
2080 x2 = 3070, 2080 Ti x2 = 3080, except the memory bandwidth, which remains the same and is probably bottlenecking.


----------



## dayne878 (Sep 1, 2020)

Well, I'll definitely be getting the 3080. I have a 2080 Ti that I bought last year, and while I'll be sad that I'll only be able to resell it for $400-500 USD if I'm lucky, at least it will offset some of the cost of the new card. I'm just glad they didn't make the 3080 $999 and the 3090 $2000 or something like the speculation said. I could see a 3080 Ti coming in at $999 next year to fill that gap, along with a Titan for the $2500 bracket.


----------



## PowerPC (Sep 1, 2020)

Icon Charlie said:


> Oh I am expecting the price to jack up because of Gerbal mentality to buy shiny new things.
> 
> It happened during the 2080 series of launches.
> It will happen with this launch as well.


Yes, but this is people's own fault; Nvidia really doesn't have much to do with it. As long as AMD drags behind with GPUs, this will always happen.


----------



## R0H1T (Sep 1, 2020)

ppn said:


> Most definitely; a 3060 with 4608 CUDA cores, if everything doubled and moved one tier up, makes the unreleased consoles look like an RTX 3050.
> 2080 x2 = 3070, 2080 Ti x2 = 3080, except the memory bandwidth, which remains the same and is probably bottlenecking.


Cool, find me an 8c/16t + 16 GB high-speed-memory, class-leading-SSD system for $400~600


----------



## Vya Domus (Sep 1, 2020)

Imouto said:


> Consoles are obsolete even before launch.



Says who, you ?

What does "obsolete" mean, that it isn't the fastest thing in the world ? Yeah it's probably obsolete if you are some random forum dweller but for someone like Sony and Microsoft it isn't.  They'll gladly sell dozens of millions of these "obsolete" consoles.



ppn said:


> Most definitely; a 3060 with 4608 CUDA cores, if everything doubled and moved one tier up, makes the unreleased consoles look like an RTX 3050.
> 2080 x2 = 3070, 2080 Ti x2 = 3080, except the memory bandwidth, which remains the same and is probably bottlenecking.



Your GPU arithmetic always cracks me up, you gotta trademark it. God knows how you come up with this stuff.


----------



## TheGuruStud (Sep 1, 2020)

Fake double cores with more scam cores. No actual FPS numbers, nothing. MAYBE 2x RT performance according to liar Huang.

Yeah, really inspiring (along with the price and we know what low prices mean).


----------



## Oberon (Sep 1, 2020)

10000 CUDA cores, <50% greater performance and 40% more power.


----------



## Valantar (Sep 1, 2020)

Hot damn, color me impressed. The 3090 is ridiculous  (more than 2x the price of the 3080 for a relatively minor performance increase and a VRAM increase that doesn't have a use in gaming), but both the 3080 and especially the 3070 look really, really good. The 3070 supposedly beating the 2080 Ti at $499 just underscores how ridiculously priced that GPU was in the first place, but also really puts the pressure on AMD. If Nvidia's numbers pan out I'm struggling to see how AMD could keep up with the 3080 even with a similarly power hungry card (320W!) and the reported (and likely best-case scenario) 50% perf/W increase of RDNA 2. On the other hand I guess this will force AMD's pricing and tiering of their cards downwards. This will sure be an interesting fall and winter for gamers.

Those doubled FP32 cores are certainly interesting. Wonder if they're able to keep them fed?


----------



## T3RM1N4L D0GM4 (Sep 1, 2020)

I hope FE comes with 12pin psu adapter...


----------



## Liviu Cojocaru (Sep 1, 2020)

3080 will be in my PC probably, pre-ordering this


----------



## RedelZaVedno (Sep 1, 2020)

I'm really afraid for AMD... Nvidia already owns 80% of the discrete GPU market, and Ampere looks like a nuclear option. Is RDNA2 gonna be good enough to compete with the 3080? It looks like the 3080 is 2080 Ti +50-60%. Even if AMD can compete, they will be forced to sell below $700, probably around $600, in order to sell well enough. The same is true for a 3070 competitor; it will have to undercut the 3070 by 100 bucks to be a serious consideration. Can AMD do that price-wise, being on the more expensive TSMC 7 nm node?


----------



## Vya Domus (Sep 1, 2020)

Valantar said:


> Hot damn, color me impressed. The 3090 is ridiculous  (more than 2x the price of the 3080 for a relatively minor performance increase and a VRAM increase that doesn't have a use in gaming)



This is where they got sneaky; the gap between the two in VRAM is simply too large to be a mistake. 10GB is an unusually small amount of memory for the 3080, given that you could get 11GB in a card two generations ago for about the same money. Games already push a lot of VRAM, and it can only go up in the future; this is obviously a card intended for 4K with the highest settings and all that.

I can't help but think it's done on purpose: you get the compute you need but not the memory.


----------



## chodaboy19 (Sep 1, 2020)

Quicks said:


> WOW people get so excited about getting screwed over the previous generation, this is the trend to catch suckers. Always buy the best you can afford every 3rd generation. 1080TI is still a good card to have now only the 30XX will be a worthy successor. The 40XX will suck again and be overpriced, not that I am stating the 30XX is justified in pricing.


Well people got to enjoy the 2000-series for 2 years so it's not all a loss.


----------



## DuxCro (Sep 1, 2020)

RedelZaVedno said:


> I'm really afraid for AMD... Nvidia already owns 80% of the discrete GPU market, and Ampere looks like a nuclear option. Is RDNA2 gonna be good enough to compete with the 3080? It looks like the 3080 is 2080 Ti +50-60%. Even if AMD can compete, they will be forced to sell below $700, probably around $600, in order to sell well enough. The same is true for a 3070 competitor; it will have to undercut the 3070 by 100 bucks to be a serious consideration. Can AMD do that price-wise, being on the more expensive TSMC 7 nm node?


I think AMD will be between RTX 3070 and 3080 with their best card.


----------



## Fluffmeister (Sep 1, 2020)

RedelZaVedno said:


> I'm really afraid for AMD... Nvidia already owns 80% of the discrete GPU market, and Ampere looks like a nuclear option. Is RDNA2 gonna be good enough to compete with the 3080? It looks like the 3080 is 2080 Ti +50-60%. Even if AMD can compete, they will be forced to sell below $700, probably around $600, in order to sell well enough. The same is true for a 3070 competitor; it will have to undercut the 3070 by 100 bucks to be a serious consideration. Can AMD do that price-wise, being on the more expensive TSMC 7 nm node?



It's a good point; it wouldn't surprise me if Samsung's 8nm node is cheaper than TSMC's 7nm. The second-hand market of cheap 2080 Tis, and the frankly still rapid 2080s/2070s, could also be a headache for them.

All RTX cards will support RTX IO too: https://www.nvidia.com/en-us/geforce/news/rtx-io-gpu-accelerated-storage-technology/


----------



## Valent117 (Sep 1, 2020)

me reading :
hmmm
hmm yes interesting
oh nice cooler
hm
10000 ????????????????? (check rumored specs) WTF NVIDIA LMAO


----------



## sepheronx (Sep 1, 2020)

I'll wait on those prices and benchmarks. For both this and RDNA2 before I make a leap. Gives me time to also save money.

I don't aim for the best of the best, just the best that $500 will get me. So it will have to be a 3060 when it arrives (hopefully not gimped with 6GB).


----------



## Frick (Sep 1, 2020)

JalleR said:


> well that looks promising, My 1080TI needs an upgrade NOW



*thinking about my GTX760*

When are reviews due, and when does RDNA2 drop?


----------



## Valantar (Sep 1, 2020)

Vya Domus said:


> This is where they got sneaky; the gap between the two in VRAM is simply too large to be a mistake. 10GB is an unusually small amount of memory for the 3080, given that you could get 11GB in a card two generations ago for about the same money. Games already push a lot of VRAM, and it can only go up in the future; this is obviously a card intended for 4K with the highest settings and all that.
> 
> I can't help but think it's done on purpose: you get the compute you need but not the memory.


You might be right, but considering the explicit implementation of DirectStorage, actual VRAM usage for games using this is likely to decrease, not increase. Heck, current VRAM needs for games even at 4k are massively bloated due to duplication and games streaming assets based on HDD response times and transfer rates, with huge portions of in-VRAM assets never being used at all. DirectStorage aims to solve this (didn't MS claim a 2.5x improvement or some such for the XSX?), so I'm really not worried at all. As long as the game is stored on a fast NVMe SSD, it'll likely perform admirably.


----------



## ahenriquedsj (Sep 1, 2020)

I will wait for benchmarks and the launch of AMD to decide.


----------



## Turmania (Sep 1, 2020)

Dr.Lisa Su, has left the building....


----------



## Lionheart (Sep 1, 2020)

Well those specs look delicious, the VRAM could be better on the 3080 IMO but whatever, most likely jumping to the green team until reviews come out.


----------



## witkazy (Sep 1, 2020)

Huh, maybe 2020 will be good for something; my last Nvidia was a 760, so it might be about bloody time. The 3070 does it for me, I think; future-proofing is a myth, it seems.


----------



## ddarko (Sep 1, 2020)

xkm1948 said:


> There should be pre-order going up before that.



Not for Founders Edition cards; the Ampere information megathread on reddit has a giant banner on top: "There is no Founders Edition Pre-Order." This was taken from a Q&A done with Nvidia product managers right after the launch:



That doesn't preclude custom card preorders.


----------



## Krzych (Sep 1, 2020)

For all the malcontents who didn't even bother to read the graphs and watch the available videos, here is a preview of the 3080 with the same performance numbers:


----------



## SN2716057 (Sep 1, 2020)

And now we wait for reviews and... <drumroll> ...availability.


----------



## MxPhenom 216 (Sep 1, 2020)

ppn said:


> most definitely 3060 with 4608 Cuda if everything doubled and moved one tier above is making the unreleased consoles look like RTX 3050.
> 2080 x2 = 3070, 2080Ti x2=3080. except the memory bandwith, that remains the same and probably bottlenecking.



Memory bandwidth remains the same? Not at all.


----------



## Makaveli (Sep 1, 2020)

Krzych said:


> For all the malcontents who didn't even bother to read the graphs and watch available videos, here is a preview of 3080 with same performance numbers:



lol this video has been posted 3 times now and the last two posts are on the same page!!!!!!!!!

Try reading the thread before posting!


----------



## Ibotibo01 (Sep 1, 2020)

I will probably sell my RTX 2060. Please AMD, release RDNA2 as soon as possible. It is odd that the RTX 3070 has 5888 cores and it gets RTX 2080 Ti performance. The GTX 1060 has 1280 cores but its performance is the same as the GTX 980, which has 2048 cores. Ampere's 20 TFLOPS GPU is equal to Turing's 13 TFLOPS GPU??


----------



## PowerPC (Sep 1, 2020)

Ibotibo01 said:


> I will probably sell my RTX 2060. Please AMD release RDNA2 as soon as. It is odd RTX 3070 has 5888 cores and it gets RTX 2080 Ti performance. GTX 1060 has 1280 cores but it's performance same with GTX 980 which is 2048 cores. Ampere's 20TFLOPS GPU is equal Turing's 13TFLOPS GPU??


That's why you should never measure GPU performance in TFLOPS. It's all about more specialized hardware with GPUs.


----------



## chodaboy19 (Sep 1, 2020)

Quick question:

The fan on the top side is supposed to suck air from the bottom and exhaust it out the top. But when I look at the shape of the blades, it looks like the air would flow in the opposite direction? I'm a bit confused...


----------



## Vya Domus (Sep 1, 2020)

Valantar said:


> You might be right, but considering the explicit implementation of DirectStorage, actual VRAM usage for games using this is likely to decrease, not increase. Heck, current VRAM needs for games even at 4k are massively bloated due to duplication and games streaming assets based on HDD response times and transfer rates, with huge portions of in-VRAM assets never being used at all. DirectStorage aims to solve this (didn't MS claim a 2.5x improvement or some such for the XSX?), so I'm really not worried at all. As long as the game is stored on a fast NVMe SSD, it'll likely perform admirably.



DirectStorage aims to reduce IO overhead, not necessarily memory requirements; 1 GB of textures is still going to be 1 GB of textures, they'll just load more efficiently. Just because an engine no longer needs to load as many things ahead of time doesn't mean the memory won't fill up with something else. In fact, that's the goal: to allow for an increase in the amount of assets used.


----------



## efikkan (Sep 1, 2020)

Never have we witnessed more rumors about an upcoming GPU architecture, and never have they been so wrong.
I believe it was just a couple of days ago some video claimed to know _everything_ about Ampere… (most of it was wrong)
The approximate CUDA core count has been known to those with real access since the beginning, yet most leakers missed this up until yesterday, so they are obviously serving false information. This is the moment to go back and look at the various "leaks" and evaluate whether they're based on something real or are just fake news or click-bait.

Let this be a lesson about using more common sense and skepticism.


----------



## medi01 (Sep 1, 2020)

ppn said:


> 2080 x2 = 3070


Huang said 3070 = 2080 Ti, 3080 = 2x 2080 (DF said 65-90% faster, not 100%)


----------



## Vya Domus (Sep 1, 2020)

PowerPC said:


> That's why you should never measure GPUs performance in TFLOPS.



You can always use TFLOPS for that: pick any two random GPUs and compare their TFLOPS ratings, then the actual performance. I'll bet you anything that probably 80% of the time the GPU with more TFLOPS will be faster in the real world as well. This time around Nvidia is doing something finicky with the way they count CUDA "cores", I put that in quote marks because they were never real cores (same with AMD's stream processors); it's the SM/CU that's the real "core" of the GPU. But for some reason this time around they chose to be even more inconsistent as to what that means. Probably to make it look more impressive.




PowerPC said:


> It's all about more specialized hardware with GPUs.



Nvidia would sure like you to believe that. Shading languages don't run on specialized hardware; they can't, they need generic all-purpose processors.


----------



## TheoneandonlyMrK (Sep 1, 2020)

efikkan said:


> Never have we witnessed more rumors about an upcoming GPU architecture, and never have they been so wrong.
> I believe it was just a couple of days ago some video claimed to know _everything_ about Ampere… (most of it was wrong)
> The approximate CUDA core count have been known to those with real access since the beginning, yet most leakers missed this up until yesterday, so they are obviously serving false information. This is the moment to go back and look at the various "leaks" and evaluate if it's based on something real or just fake news or click-bait.
> 
> Let this be a lesson about using more common sense.


Samsung 8nm wasn't fake; lots of nonsense with snippets of truth. Some got memory speed and allocation right, but these new split 32-bit shader cores are new and un-rumoured, and they look similar in theory to someone else's design.


----------



## MxPhenom 216 (Sep 1, 2020)

chodaboy19 said:


> Quick question:
> 
> The fan on the top side, is supposed to suck air from the bottom and exhaust it to the top. But, when I look at the shape of the blades, it looks like the air would flow in the opposite direction? I'm a bit confused...
> 
> ...



They probably have the fan spinning in the opposite direction than what you're thinking.


----------



## medi01 (Sep 1, 2020)

Frick said:


> When are reviews due, and when does RDNA2 drop?


"Leaked" as november, by the dude who got  a number of things right.
He touted big navi to be much faster than A104.


----------



## Deleted member 24505 (Sep 1, 2020)

Smashed AMD in the GPU wars


----------



## medi01 (Sep 1, 2020)

Ibotibo01 said:


> I will probably sell my RTX 2060. Please AMD release RDNA2 as soon as. It is odd RTX 3070 has 5888 cores and it gets RTX 2080 Ti performance. GTX 1060 has 1280 cores but it's performance same with GTX 980 which has 2048 cores. Ampere's 20TFLOPS GPU is equal Turing's 13TFLOPS GPU??


Good point.


----------



## Darksword (Sep 1, 2020)

If 3070s are $499, the used prices on 2080 Tis are about to be hit super hard.


----------



## Fluffmeister (Sep 1, 2020)

medi01 said:


> "Leaked" as november, by the dude who got  a number of things right.
> He touted big navi to be much faster than A104.



Ah yes, the fabled "Big Navi", it does appear to be the second coming of Christ for our resident fans that only buy AMD.

Fret not, here is a sneak peek of its potential:





----------



## MxPhenom 216 (Sep 1, 2020)

Darksword said:


> If 3070's are $499 the used prices on 2080Ti's is about to be hit super hard.



Best sell now


----------



## Krzych (Sep 1, 2020)

Makaveli said:


> lol this video has been posted 3 times now and the last two post are on the same page!!!!!!!!!



lol generic copy-paste complaining has been posted like 210 times now and on the same pages!!!!!!


----------



## RevengeFNF (Sep 1, 2020)

Ibotibo01 said:


> I will probably sell my RTX 2060. Please AMD release RDNA2 as soon as. It is odd RTX 3070 has 5888 cores and it gets RTX 2080 Ti performance. GTX 1060 has 1280 cores but it's performance same with GTX 980 which has 2048 cores. Ampere's 20TFLOPS GPU is equal Turing's 13TFLOPS GPU??



First, the 3070 does not have the same performance as the 2080 Ti; it is faster.

Second, the 2080 Ti has more memory bandwidth than the 3070. That's why the 3070 needs a lot more CUDA cores. You can't just compare the CUDA cores, you need to compare everything.


----------



## Oberon (Sep 1, 2020)

Valantar said:


> The 3070 supposedly beating the 2080 Ti at $499 just underscores how ridiculously priced that GPU was in the first place, but also really puts the pressure on AMD. If Nvidia's numbers pan out I'm struggling to see how AMD could keep up with the 3080 even with a similarly power hungry card (320W!) and the reported (and likely best-case scenario) 50% perf/W increase of RDNA 2.



A 50% perf/watt uplift from Navi puts a 5700XT class card at 3080 performance for around 300W based on some back of the envelope math.


----------



## TheoneandonlyMrK (Sep 1, 2020)

RevengeFNF said:


> First, 3070 does not have the same performance of the 2080 Ti, it is faster than it.
> 
> Second, 2080 Ti have more memory bandwidth than the 3070. That's why 3070 needs a lot more Cuda Cores. You can't just compare the Cuda Cores, you need to compare everything.


So why then are they saying that the 3070 is for 1440p? It's interesting; reviews will tell all.


----------



## Vya Domus (Sep 1, 2020)

RevengeFNF said:


> Second, 2080 Ti have more memory bandwidth than the 3070. That's why 3070 needs a lot more Cuda Cores.



That's not how it works, even in the slightest. You don't trade one for the other.


----------



## yotano211 (Sep 1, 2020)

On reddit, people are selling some 2080 Tis for $450-500; last week it was around $850-900


----------



## Fluffmeister (Sep 1, 2020)

yotano211 said:


> On reddit, people are selling some 2080ti's for 450-500, last week it was around 850-900



Nvidia are literally competing with themselves at this point.


----------



## Valantar (Sep 1, 2020)

Oberon said:


> A 50% perf/watt uplift from Navi puts a 5700XT class card at 3080 performance for around 300W based on some back of the envelope math.


It does, but I don't quite think 50% will be an average number, especially since unlike Nvidia AMD doesn't have the benefit of a node shrink. There's also a question of whether AMD will be willing to go big enough on their top end die. Those decisions were likely made two years ago, so it'll be interesting to see where they placed their bets.




On a different topic, after processing those massive CUDA core counts for a couple of hours I'm now wondering if Ampere is the generation where Nvidia's gaming performance/Tflop comes crashing down. No doubt they'll still be powerful, but doubling the ALUs and leaving everything else the same is bound to create heaps of bottlenecks.






Vya Domus said:


> DirectStorage aims to reduce IO overhead not necessarily memory requirements, 1 GB of textures are still going to be 1 GB of textures, they'll just load more efficiently. Just because an engine no longer needs to load as many things ahead of time doesn't mean the memory wont fill up with something else, in facts that's the goal, to allow for an increases in the amount of assets used.


But that's the thing, isn't it - if you load textures more efficiently, i.e. you stop loading ones you don't actually need, you inherently reduce the memory footprint as you are by default loading fewer textures. Sure, you can then load other things more aggressively, but wouldn't it then make sense to use the same JIT principle for those loads as well? And what other data is supposed to fill several GB of VRAM? Reducing the texture prefetch time from an assumed 1-2s (HDD speed) to .1s or even less (NVMe SSD speed) can lead to dramatic drops in the amount of texture data that needs to be in memory. I'm obviously not saying this will necessarily result in dramatic across-the-board drops in VRAM usage, but it's well documented that current VRAM usage is massively bloated and wasteful and not actually necessary to sustain or even increase performance.



RevengeFNF said:


> Second, 2080 Ti have more memory bandwidth than the 3070. That's why 3070 needs a lot more Cuda Cores.


That is literally the opposite of how this works. More cores necessitate more memory bandwidth for the cores to have data to work on. That would be like compensating for your car having no wheels by giving it a more powerful engine.


----------



## matar (Sep 1, 2020)

RTX 3080 is the sweet spot.


----------



## kings (Sep 1, 2020)

I think Nvidia is somewhat redefining what a CUDA core is with the introduction of these new shaders. I don't think it will be directly comparable to the CUDA cores of the previous generation.

Anyway, soon we should have it all dissected.


----------



## Oberon (Sep 1, 2020)

Valantar said:


> It does, but I don't quite think 50% will be an average number, especially since unlike Nvidia AMD doesn't have the benefit of a node shrink. There's also a question of whether AMD will be willing to go big enough on their top end die. Those decisions were likely made two years ago, so it'll be interesting to see where they placed their bets.



That's fair, but AMD has been pretty honest about their projected performance under Su. Should be interesting!



Valantar said:


> On a different topic, after processing those massive CUDA core counts for a couple of hours I'm now wondering if Ampere is the generation where Nvidia's gaming performance/Tflop comes crashing down. No doubt they'll still be powerful, but doubling the ALUs and leaving everything else the same is bound to create heaps of bottlenecks.



There's definitely a big architectural change there that I'm interested to hear about. At a very high, naive level it seems like a move toward a more GCN-like layout, or rather like AMD and NVIDIA are converging a bit in terms of general shader design.


----------



## mouacyk (Sep 1, 2020)

kings said:


> I think Nvidia is somewhat reviewing the term of what is a Cuda Core with the introduction of these new shaders. I don't think it will be directly comparable to the Cuda Cores of the previous generation.
> 
> Anyway, soon we should have it all dissected.


They are taking a page from AMD's Bulldozer and Piledriver days, obviously not to catch up, but to extend their lead even further. As someone already said, it's probably not easy to keep the extra ALUs completely fed, thereby losing some of the scaling.


----------



## Initialised (Sep 1, 2020)

When he got the 3090 out of the oven!

A nice nod to using an oven to fix the half-baked solder on the 8800 GTX.


----------



## PowerPC (Sep 1, 2020)

Vya Domus said:


> You can always use TFLOPS for that, pick any two random GPUs and compare their TFLOPS ratings then the actual performance. I'll be you anything that probably 80% of the time the GPU with more TFLOPS will be faster in the real world as well. This time around Nvidia is doing something finicky with the way they count CUDA "cores", I put that in quote marks because they were never real cores (same with AMD's stream processors), it's the SM/CU that's the real "core" of the GPU. But for some reason this time around they chose to be even more inconsistent as to what that means. Probably to make it look more impressive.
> 
> Nvidia would sure like you to believe that. Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.



The TFLOPS he was referring to was the 20 TFLOPS of the 3070 compared to the 13 TFLOPS of the 2080 Ti. If these cards have equivalent performance, TFLOPS doesn't matter!!!

And of course they are using specialized hardware! Do you actually think they are going to waste general-purpose CPUs just to compute graphics??? And you probably know that CPUs aren't even that good at those kinds of computations. That's the reason we have GPUs in the first place. Your argument doesn't even make sense for that reason. And to add to that, a GPU has many thousands of little processing cores that are all the same, all doing pretty much the same exact matrix computations and manipulations for those graphics. That's a far cry from what a general-purpose CPU does, to say the least. How would Nvidia even hide something like this?

The part about shading languages blatantly makes no sense.


----------



## TheoneandonlyMrK (Sep 1, 2020)

Soo, when is NDA up on reviews.

Release Day?


----------



## biffzinker (Sep 1, 2020)

Anyone else know about RTX IO being supported on Turing?


> RTX IO will function on both Turing and Ampere GPUs, so RTX 20-series owners won't be left in the dust here.











						Nvidia's RTX IO technology promises faster load times for RTX-equipped systems
					

This technology will run on Microsoft's next-generation DirectStorage API, and it will enable "up to" 100x faster drive performance compared to traditional hard drives. Nvidia claims RTX...




					www.techspot.com


----------



## PowerPC (Sep 1, 2020)

There is already a 3070 Super and a 3080 Ti listed on TPU with 16 GB and 20 GB VRAM respectively! Also probably a significant performance boost. As if Nvidia already knew they'd have people complaining about the VRAM. I assume they'll probably cost a premium compared to these "low" VRAM versions, unfortunately... A big reason why the 3090 costs $1500 is the 24 GB of VRAM. But who knows, NVIDIA hasn't even mentioned them yet.

I wonder where TPU gets this information.

https://www.techpowerup.com/gpu-specs/geforce-rtx-3070-super.c3675
https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-ti.c3581


----------



## ViperXTR (Sep 1, 2020)

So the CUDA cores do double calculations, and marketing needed them to look good on paper, so they doubled the numbers?


----------



## PowerPC (Sep 1, 2020)

theoneandonlymrk said:


> So why then are they saying that the 3070 is for 1440p it's interesting, reviews will tell all.


Because they are probably just being honest. The 2080 Ti was never truly a 4K card, especially now that we have something like the 3090 that will really handle 4K easily, I assume. It was even called a 4K/8K card in the presentation, but I'd be very skeptical about the 8K part. Honest on one side but then dishonest again on the other. Classic marketing.


----------



## Vya Domus (Sep 1, 2020)

PowerPC said:


> The TFLOPS he was referring to was 13 TFLOPS of the 3070 compared to 20 TFLOPS of the 2080 ti. If these cards have equivalent performance, TFLOPS doesn't matter!!!



The 2080 Ti has nowhere near 20 TFLOPS; it has about 13 TFLOPS. TFLOPS and performance are highly correlated; it's the most objective measure of performance possible, whether you like it or not. Rarely do you ever come across an example counter to that general rule.



PowerPC said:


> Do you actually think they are going to waste general-purpose CPUs just to compute graphics???



GPUs *are general purpose*. They have been since the early 2000s; that's why we have programmable shaders.



PowerPC said:


> And to just add to that, a GPU has many thousands little processing cores that are all the same, all doing pretty much the same exact matrix computations and manipulations for those graphics. That's a far cry from what a general-purpose CPU does, to say the least.



First of all, like I said, these things don't really have thousands of cores, but I'm not going to go into that; the point is that the analogue of a core is the SM. They do 4x4 matrix arithmetic if you choose to program that; they might as well do something else, which they often do within shaders, because they're general purpose.





PowerPC said:


> The part about shading language blatantly makes no sense.




No it's not, and it makes perfect sense; you think so because you have probably never seen a shader and don't know what I am talking about.

This is some random GLSL shader I found on the internet:






A lot more than matrix multiplication, huh? It's basically C code, and you can't run C on special-purpose hardware; you need a fairly robust ISA and control logic, just like in a typical CPU. A GPU core is very similar to a CPU core, they're just optimized differently.



PowerPC said:


> How would Nvidia even hide something like this?



I don't know what you are on about, you make it sound like it's some sort of conspiracy. It's really funny.


----------



## PowerPC (Sep 1, 2020)

Vya Domus said:


> The 2080ti has no where near 20 TFLOPS , it has about 13 TFLOPS. TFLOPS and performance are highly correlated, it's the most objective measure of performance possible whether you like it or not. Rarely do you ever come across an example counter to that general rule.


It's the other way around: the 3070 has 20 TFLOPS and the 2080 Ti has 13 TFLOPS... You could just read the original post about that and figure that out by now. Even if I miswrote the correct order, TFLOPS still don't predict performance if these two cards have very similar performance.



Vya Domus said:


> First of all like I said these things don't really have thousands of cores but I'm not going to go into that, the point is that the analogous of a core is the SM. They do 4x4 matrix arithmetic if you chose to program that, they might as well do something else, which they do often within shaders because they're general purpose.



No, it's not. When I (and most people) say cores on a GPU, I mean shading units. The 3080 has 8704 cores in this case. They all work in parallel, because a GPU makes use of parallel computing WAY more than a CPU. That is the difference that makes the whole GPU very different from a CPU.

And that C code runs purely on the GPU? Are you so sure of that? C is run on the CPU, and the CPU eventually just controls the GPU...


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> TFLOPS still don't predict performance.



It predicts performance incredibly well, strikingly so. I know people get angry about that but it's the truth. Size matters, or in this case TFLOPS.



PowerPC said:


> It's the other way around. 3070 has 20 TFLOPS and 2080 ti has 13 TFLOPS... You could just read the original post about that and figure that out by now.



You don't get it, even if you go by Nvidia's numbers, *the GPU with more TFLOPS is the faster one*.









						NVIDIA GeForce RTX 30 Series graphics cards with Ampere architecture
					

Experience the best performance for gamers and creators, with incredible advanced features.



					www.nvidia.com
				




5888 shaders × 2 FLOPs per clock × 1730 MHz ≈ 20.4 TFLOPS

Nvidia claims the 3070 is faster than the 2080 Ti and guess what, the 3070 has more TFLOPS. Tada!


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> It predicts performance incredibly well, strikingly so. I know people get angry about that but it's the truth. Size matters, or in this case TFLOPS.
> 
> 
> 
> ...


It's not 50% faster, as it should be if you just compare the TFLOPS! Your point still makes no sense. I'm pretty sure it'll maybe be just 10% faster if you're lucky. So yeah, if you're off by 40%, that's not predicting performance. All the people I've heard are saying it's going to be pretty much the same performance.


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> It's not 50% faster as it should be if you just compare the TFLOPS! Your point still makes no sense.



Did you hear me say it's exactly 50% or whatever? I said higher TFLOPS means higher performance in general, which is true. I don't know why you are so reluctant to accept it.



PowerPC said:


> No, it's not. When I (and most people) say cores on a GPU, I mean Shading Units. The 3080 has 8704 cores in this case.



A core needs to fetch, decode, and execute instructions on its own; CUDA cores or whatever Nvidia calls them don't do that, that's just marketing. Functionally speaking, the SM is the core in a GPU. Have you noticed how Nvidia never says "core" but always makes sure to write "*CUDA* core"? It's because they're not really cores, they're something else. They don't even do any shading; a CUDA core just means an FP32 unit.



PowerPC said:


> And that C code runs purely on the GPU? Are you so sure of that? C is run on the CPU and the CPU eventually just controls the GPU...



Yes, it runs purely on the GPU, instruction by instruction, for each instance of the shader. Look man, you are clearly not knowledgeable about these things, that's fine. You can either take my word for it or look all of this up on your own.


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> Did you hear me say it's exactly 50% or whatever ? I said higher TFLOPS means higher performance in general, which is true. I don't know why you are so reluctant to accept it.


Because no one even talked about "higher means higher". The poster I was referring to, before you interjected, was questioning the 20 TFLOPS vs 13 TFLOPS..... You just don't seem to get it still.



Vya Domus said:


> A core needs to fetch, decode, and execute instructions on its own; CUDA cores or whatever Nvidia calls them don't do that, that's just marketing. Functionally speaking, the SM is the core in a GPU. Have you noticed how Nvidia never says "core" but always makes sure to write "*CUDA* core"? It's because they're not really cores, they're something else. They don't even do any shading; a CUDA core just means an FP32 unit.


And I never talked about CUDA cores, you're just strawmanning me again. I'm talking about shading units, which do show the performance. By having more, you get faster GPUs. SMs aren't even the "cores", they are just arrays of shading units, which do the actual work.



Vya Domus said:


> Yes, it runs purely on the GPU. Look man, you are clearly not knowledgeable about these things, that's fine. You can either take my word for it or look all of this up on your own.


That is garbage. The CPU always works together with the GPU; the CPU instructs the GPU to do things all the time. I think you are way less knowledgeable than you believe.


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> Because no one even talked about "higher means higher". The poster who I was referring to before you interjected, was questioning the 20 TFLOPS vs 13 TFLOPS..... You just don't seem to get it still.



I'll lay it out as simply as I can:

You said that you can't predict performance with TFLOPS, except you can: given a value, you can tell with a fairly good degree of accuracy whether it will be faster than an existing GPU or not. How that doesn't qualify as a prediction, only you know.


----------



## Hotobu (Sep 2, 2020)

PowerPC said:


> There is already a 3070 Super and a 3080 ti listed on TPU with 16 GB and 20 GB VRAM respectively! Also probably a significant performance boost. As if Nvidia already knew they'll have people complain about the VRAM. I assume they'll probably cost a premium compared to these "low" VRAM versions, unfortunately.... A big reason why 3090 cost $1500 is the 24 GB of VRAM. But who knows, NVIDIA hasn't even mentioned them yet.
> 
> I wonder where TPU gets this information.
> 
> ...



Why is stuff like this even on the main page? The more I look at TPU the more its credibility takes a hit with me. Should Reddit speculation be siteworthy?


----------



## Nkd (Sep 2, 2020)

Vayra86 said:


> I think Navi is going to *struggle *catching this 3080 to be honest. AMD has yet to surpass 2070S performance convincingly, and now they're making an 80% jump ahead? Not likely, unless they make something absolutely gargantuan. But let's not dive into the next pond of speculation... my heart...
> 
> By the by, do we have TDPs for these Ampere releases already? The real numbers?



I am going by pure data that's out there and what's rumored. If Big Navi has at minimum double the CUs of the 5700 XT, that will get it right close to 3080 territory. There should also be other tweaks made to increase IPC, and Big Navi should get fairly high clock speeds given the speeds on the Xbox Series X and how efficient that chip is as an APU.

I suspect they'll compete with the 3080 at minimum. Nvidia seems to have done right here pricing the 3080 at $699.99. That does put AMD in a tough spot; they will have to undercut NVIDIA even at the same speed, and would have to be faster to sell close to $699.99-750.


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> I'll lay it out as simple as I can :
> 
> You said that you can't predict performance with TFLOPS, except you can, given a value you can tell with a fairly good degree of accuracy if it will be faster or not than an existing GPU. How does that not qualify as a prediction only you know.


Because that wasn't even the thing in question... This is getting annoying to discuss with you because you are obviously trying to mischaracterize completely what I was talking about. Just stop, you missed the point. It's OK, move on. The point is that a potential 3060 could also have many more TFLOPS than a 2080 Ti but still be slower. That's the whole point. It just works in this case, but it's still a difference of 50% more TFLOPS for pretty much the same performance on the 3070, so TFLOPS, again, don't reflect the actual PERFORMANCE of the GPU, as I have repeated many times to you...



Hotobu said:


> Why is stuff like this even on the main page? The more I look at TPU the more its credibility takes a hit with me. Should Reddit speculation be siteworthy?


It was there for the 3070, 3080 and 3090 with all the details like this for at least a week now. And I think that was all the correct information, too. I thought that was weird as well.


----------



## Nkd (Sep 2, 2020)

Vya Domus said:


> It predicts performance incredibly well, strikingly so. I know people get angry about that but it's the truth. Size matters, or in this case TFLOPS.
> 
> 
> 
> ...



I think what he meant is it's not as fast as the TFLOPS suggest. 7 more TFLOPS is a lot if you're talking Turing TFLOPS. So with Ampere you are actually getting less performance per TFLOP, since the 3070 is not 1.7x the performance of the 2080 Ti. It's almost like the TFLOPS of GCN, where you get less gaming performance.

So yes, it has higher TFLOPS, but it's not as fast as they suggest.


----------



## Bruno_O (Sep 2, 2020)

The power and heat for the 3080/3090 is really bad, but what killed these cards for me is the VRAM size, just pathetic. The 3070 should have 12GB and the 3080 16GB at this point, and everyone can try to ignore the reality as much as they want, but these cards will severely lack memory, both now and even more in the next few years. Consoles are getting 16GB of GDDR6 and should cost 500 USD for the entire thing, and people are going crazy for a 3070 at the same price with just 8GB.

People got so used to the 2000 series' ridiculous pricing that they are now blind and just see the price drops. The 2000 series was nVidia's true colours when AMD couldn't compete; the current pricing is a direct response to RDNA2, the arch that will power the next 5 years of consoles and receive the most optimizations from developers, again because everything is made to/for consoles, where the bulk of gamers are, and then ported to PC. nVidia is scared of becoming another Intel, and I'm loving all of this, since the new Radeons will definitely be more power efficient, age better, apparently have more VRAM, and are now limited by nVidia's prices!


----------



## Th3pwn3r (Sep 2, 2020)

lynx29 said:


> how do you all like my new sig?  never thought i'd see the day... LMAO


It's dumb. Plenty of us with 2080 Tis will just keep them and put them in our other rigs. Plenty of people who bought 2080 Tis can afford another video card that's just as expensive. Guess who will be buying 3090s? A lot of the same people who bought 2080 Tis. Personally, I'll probably exchange my 2080 Ti for a 3080 since I don't think I need a 3090.


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> That is garbage. CPU always works together with the GPU. CPU instructs the GPU to do things all the time.



I tried, yet you made sure to prove to me on every occasion that you have absolutely no idea what you are talking about. Do you realize how idiotic it would be if I wrote, say, a + b in a GPU shader and the CPU had to instruct the GPU step by step on how to do that? What would even be the point of having a GPU?

Shaders run exclusively on the GPU, line by line, instruction by instruction, with no intervention from the CPU side; that's what GPUs do. You know precisely nothing about this subject, yet you insist on correcting me on everything I say. May you dwell in your ignorance for as long as you want; I'm out.









Dunning–Kruger effect - Wikipedia (en.wikipedia.org)


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> I tried, yet you made sure to prove to me on every occasion that you have absolutely no idea what you are talking about. Do you realize how idiotic it would be if I wrote, say, a + b in a GPU shader and the CPU had to instruct the GPU step by step on how to do that? What would even be the point of having a GPU?
> 
> Shaders run exclusively on the GPU, line by line, instruction by instruction, with no intervention from the CPU side; that's what GPUs do. You know precisely nothing about this subject, yet you insist on correcting me on everything I say. May you dwell in your ignorance for as long as you want; I'm out.
> 
> ...


You are just insanely inaccurate about everything you say.

I never said the CPU does the GPU's computation line by line. That is something you just invented somehow. You just can't stop lying about what I said; you really don't give a f***, it seems.

The CPU can give instructions to the GPU, and the GPU will actually compute those instructions. The CPU doesn't need to do the computation itself, but it does need to orchestrate what the GPU does. That is what the program someone wrote in C is doing. The program itself doesn't do anything either; it's just instructions for the CPU, and the CPU passes instructions on to the GPU to compute whatever the program wants computed.

If you knew something about C, you would know it gets compiled down to assembly language and then down to machine code for x86 CPUs. Assembly targets a single instruction set, so it can't just run on anything you'd like.

You are only projecting your own lack of knowledge onto others. You don't seem to be that bright either; literally everything you've said so far has been wrong, and even others have corrected you. It's actually hard for me to remember encountering anyone as willfully ignorant, or even just as purposefully deceptive, as you on here before.


----------



## Vya Domus (Sep 2, 2020)

Maybe you have trouble remembering what you wrote so I will post the exact comments and order in which you replied.



Vya Domus said:


> Yes, it runs purely on the GPU, instruction by instruction for each instance of the shader.



To which you responded very confidently that this is "garbage" :



PowerPC said:


> That is garbage. CPU always works together with the GPU. CPU instructs the GPU to do things all the time.



Now you write :



PowerPC said:


> I never said CPU does the computation of the GPU line by line. That is something you just invented somehow. You just can't stop lying about what I said, you really don't give a f*** it seems.



No buddy, you thought that idea was "garbage", you wrote that yourself not knowing even what I was talking about. 

Stop, you are the laughing stock of everyone reading this.


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> Maybe you have trouble remembering what you wrote so I will post the exact comments and order in which you replied.
> 
> 
> 
> ...


How is that even supposed to contradict what I said???

YOU said it runs line by line on the GPU. Then I replied that's garbage / totally wrong. I said it runs on both every time, and you don't seem to get that simple principle. You have posted total nonsense over and over again.

The CPU is still always the host processor; it orchestrates what the GPU has to do. The program just doesn't run purely on the GPU, it also runs on the CPU. You have no sense of the abstraction happening with that C program. I'm still accurate on that, despite your silly attempts to frame it otherwise. You make extremely illogical points, so that won't happen either way. But somehow you keep trying.


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> I said it runs on both every time and you don't seem to get that simple principle.



It doesn't run on both, you have no clue in the slightest how these things work. I said above how unimaginably idiotic it would be for that to be the case. A shader is compiled and runs on the GPU alone, not on both. Can you still not understand how dumb that would be ?



PowerPC said:


> The program just doesn't purely run on the GPU



The shader runs purely on the GPU, period. I don't know what this "program" you talk about is; I wrote exclusively about shaders.



PowerPC said:


> YOU said it runs line by line on the GPU.



Which is 100% correct. Everything in the shader, the code that I posted earlier, *runs on the GPU*. All of it, never on both.



PowerPC said:


> You have no sense of the abstraction happing with that C program.



   

*IT'S NOT A C PROGRAM.* It's a GLSL shader, written in a shading language with C-style syntax. I wrote that as clearly as possible; of course you don't have any idea what that means. That's why you don't understand any of this either. This is what I am trying to show you.


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> It doesn't run on both, you have no clue in the slightest how these things work. I said above how unimaginably idiotic it would be for that to be the case. A shader is compiled and runs on the GPU alone, not on both. Can you still not understand how dumb that would be ?
> 
> 
> 
> The shader runs purely on GPU, period. I don't know what's this "program" you talk about, I wrote exclusively about shaders.


The computation for the shader runs on the GPU, that's it. The C programs run on both, that's it. The discussion was about the C program, and you started it, so you're just out of your mind now, suddenly changing the facts around to appear right. That's some weird, childish reasoning.

Let's sum up:
You were wrong on the TFLOPS argument.
You were wrong on the general-purpose argument.
You are still wrong on the C program argument, which btw you started and which has nothing at all to do with this whole discussion.

Or what did this have to do with anything now? You wrote this:


Vya Domus said:


> Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.



Of course they run on specialized hardware. It's called a Shading Unit inside a GPU. That's definitely as specialized as it gets, my friend. And there is no such thing as a "shading language" if it's written in C. C is an all-purpose language, not a shading one... whatever that is even supposed to mean.

Now stop this completely silly discussion. But you just don't seem to know when to stop, do you?


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> The discussion was about the C program



*IT'S NOT A C PROGRAM*

You can't figure that out even now ? Read back that comment.



PowerPC said:


> You were wrong on the TFLOPS argument.
> You were wrong on the general-purpose argument.
> You are still wrong on the C program argument, which btw. you stared and which has nothing at all to do with this whole discussion.



I was correct about all of them.

More TFLOPS indicate more performance.
GPUs are general purpose; that's why they are used for all sorts of things besides graphics. That's also why GPGPU (general-purpose computing on graphics processing units) is a thing. It's in the name.


I thoroughly explained why; can you do the same?



PowerPC said:


> *And there is no such thing as a "shading language"*, if it's written in C. C is an all-purpose language, no a shading one.... Whatever that is even supposed to mean.











OpenGL Shading Language - Wikipedia (en.wikipedia.org)




Just for how long will you make me do this ? Have I not proven you wrong enough times ? I feel like a bully. 

You're gonna use that cognitive dissonance thing and tell me that you never said that, right ?


----------



## B-Real (Sep 2, 2020)

xkm1948 said:


> $699 3080???????????????????
> 
> OH MY GOD



Wut? This may be the same pricing as the RTX 2000 series.  No Founders Edition mentioned in the charts, so the cheapest AIB ones may start from this price range too - just like with the RTX 2080.



xkm1948 said:


> Nope. To me Nvidia is only competing with itself at this point. In other words, how to get the Pascal crowd to move on and upgrade? Jensen even mentioned that in his talk!


Competition doesn't mean only the high end. It's there at mid and entry range too.


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> *IT'S NOT A C PROGRAM*
> 
> You can't figure that out even now ? Read back that comment.


You can keep changing and editing posts but that's ok.

You haven't answered the question of how that language (C or not, it doesn't matter) runs purely on the GPU. Everything has to go through the CPU at some point; the GPU and CPU work together constantly, in every process, to get the result of a game on the screen. That's why you still need a CPU to play games. We wouldn't even need a CPU if you could just program on the GPU. That's definitely not how it works.

And either way, what is this point even trying to accomplish? What are you trying to reach with this exact point, besides all the other points that you were wrong on? You still haven't responded to this, so I'll quote you again:


Vya Domus said:


> Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.


Are you actually saying Shading Units aren't specialized hardware when it's literally in the name???


----------



## Parn (Sep 2, 2020)

The power usage is crazy. Both the 90 and 80 are above 300W and even the 70 is approaching the territory that used to be reserved for the 80 Ti cards. But then if 3070 is actually faster than 2080 Ti by a good margin, that means power efficiency has been improved with Ampere. BTW where is that 12-pin pcie aux power connector?


----------



## biffzinker (Sep 2, 2020)

efikkan said:


> Specs are looking pretty good (on paper), perheps except TDP numbers.


I thought 220 watts for the 3070 wasn't a bad TDP for the performance they're advertising.


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> You haven't answered the question of how that language (C or not, it doesn't matter) purely runs on the GPU. Everything has to go through the CPU.



You write "int a = 1;" in a shader; it's compiled and then sent to the GPU, where it's executed. Is that too much to comprehend? Do you understand what "run" means?



PowerPC said:


> Are you actually saying Shading Units aren't specialized hardware when it's literally in the name???



Both AMD and Nvidia stopped calling them that a long time ago. Anyway, you still haven't addressed how shading languages supposedly don't exist, despite this little thing:









OpenGL Shading Language - Wikipedia (en.wikipedia.org)






> (*GLSL*) is a high-level shading language with a syntax based on the C programming language.





PowerPC said:


> And either way, what is this point trying to accomplish?



That you know nothing, you are a compulsive liar, and really, really stubborn. It wasn't my initial goal, but alas.


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> You write "int a = 1;" in a shader, it's compiled then sent onto the GPU where it's executed by it. Is that too much to comprehend ? Do you understand what "run" means ?


So you still really think everything that code does runs only and purely on the GPU. That is completely wrong, and I won't keep repeating this point, because you don't seem to understand how computers or the von Neumann architecture work at a basic level.


Vya Domus said:


> Both AMD and Nvidia have stopped calling them like that for a very long time. Anyway, you still haven't addressed how shading languages don't exist despite this little thing :


No, it's still called that, because Nvidia and AMD don't determine what basic hardware components are called. Intel definitely also uses shading units in their iGPUs.
https://en.wikipedia.org/wiki/Unified_shader_model


> *Unified shader architecture* (or *unified shading architecture*) is a hardware design by which all shader processing units of a piece of graphics hardware are capable of handling any type of shading tasks. Most often Unified Shading Architecture hardware is composed of an array of computing units and some form of dynamic scheduling/load balancing system that ensures that all of the computational units are kept working as often as possible.






Vya Domus said:


> That you know nothing, you are compulsive liar, and really, really stubborn. It wasn't my initial goal, but alas.



Again, totally projecting. What was your initial goal, other than showing how much you know about shaders but not the overall picture of how a computer works? I don't need to know how everything works, because I know how abstraction in computer architecture works; that's the point of having it. So, you still haven't told me about your initial goal. It seems you just wanted to talk s*** at other members of the forum and nothing else. You are truthfully a sad and silly human being. I kind of pity you for going to this length to defend a useless point that doesn't further the discussion at all. Do you just need to vent because something went wrong in your life?


----------



## Vya Domus (Sep 2, 2020)

PowerPC said:


> So you still really think everything that code does only and purely runs on the GPU. That is completely wrong



It's a fucking GPU shader! Of course it runs only on the GPU, *THAT'S WHAT IT'S FOR!* My God, you can't be this dense; really, I hope you're just a bad troll.



PowerPC said:


> you don't seem to understand how computers or Von Neumann Architecture works on a basic level.



Lmao you're just writing some random ass computer science thingies that you know about.



PowerPC said:


> Again, totally projecting. What was your initial goal, other than showing how much you know about shaders but not the overall picture of how a computer works? I don't need to know how everything works, because I know how abstraction in computer architecture works; that's the point of having it. So, you still haven't told me about your initial goal. It seems you just wanted to talk s*** at other members of the forum and nothing else. You are truthfully a sad and silly human being. I kind of pity you for going to this length to defend a useless point that doesn't further the discussion at all. Did you just need to vent from something that went wrong in your life?



Damn, are you getting emotional? Want a tissue or something? You're not gonna cry on me, are you?


----------



## Flanker (Sep 2, 2020)

Over9000.gif


----------



## PowerPC (Sep 2, 2020)

Vya Domus said:


> It's a fucking GPU shader ! Of course it runs only on the GPU, *THAT'S WHAT IT'S FOR !* My God, you can't be this dense, really I hope you're just a bad troll.



You have no idea what a small line of higher-level code gets compiled to in the computer, or which components carry out that code. And get this: it doesn't even matter, it's called abstraction! But you have no clue what that is, do you?

Now, you still haven't answered why you even wrote that comment about languages not running on specialized hardware (which they actually can, with the help of the CPU), since it literally had nothing to do with anything being discussed. It's not relevant to the specialized-hardware point that you were wrong about; you can still have specialized hardware, like shader units or TMUs or ROPs. It's also not relevant to the TFLOPS argument that you were also wrong about. So what was it even relevant for, you silly goose? Tell me. I've asked at least three times already, and you can't answer, but I can keep asking.


Vya Domus said:


> Lmao you're just posting some random ass computer science thingies that you know about.


Sure I am. I just have a CS degree for nothing, but you are the expert... Probably you tried programming a game once, it didn't work out, and now you like to play yourself up as someone of importance here. It's not working for you; sorry to be the one who tells you the truth.


Vya Domus said:


> Damn are you getting emotional, want tissue or something ? You're not gonna cry on me are you ?


I'm not the one who needs to cry, but the people who are around you. And they probably also need to run from the kind of pathetic liar and wannabe you are. I won't even say troll, because you somehow manage to be way beneath that.


----------



## Xex360 (Sep 2, 2020)

Finally a real leap with a focus on CUDA cores; even the 3070 has more than the 2080 Ti. A bit disappointed by the memory amounts though: 12 GB for the 3070 and 16 GB for the 3080 would've been better.
I wonder what AMD will do now. They have to launch a big GPU; we need the power to run games at 4K.


----------



## mrthanhnguyen (Sep 2, 2020)

Can anyone calculate how many more FPS you'd get at 1440p going from a 2080 Ti to a 3090?


----------



## ViperXTR (Sep 2, 2020)

Wait,
so from what I've read, each core can do INT+FP or FP+FP, versus only INT+FP in the previous generation? So it's still two units inside each core, but only one gets used for specific operations?


----------



## AteXIleR (Sep 2, 2020)

I'm thankful and glad for the probable generational leap, and for the price of the two lower more mainstream models.


----------



## Xex360 (Sep 2, 2020)

ViperXTR said:


> wait,
> Oh so from what ive read, each core can do INT+FP, FP+FP vs previous generation of INT+ FP only? Its still 2 core inside but only one when doing specific operations?


Interesting, can someone explain this for the laymen among us.


----------



## Lionheart (Sep 2, 2020)

I know Nvidia recommends a 750W PSU for the 3080 but I'm hoping my 650W Gold rated PSU will suffice, reviews/time will tell.


----------



## nguyen (Sep 2, 2020)

Xex360 said:


> Interesting, can someone explain this for the laymen among us.



Read through this concurrent FP+INT article








GeForce GTX 1660 Ti's Advanced Shaders Accelerate Performance In The Latest Games (www.nvidia.com)
				




Basically the 2080 Ti has 13.5 TFLOPS of FP32 and 13.5 TFLOPS of INT32; if a game fully leveraged both FP32 and INT32 instructions, the 2080 Ti would effectively have 27 TFLOPS of combined FP+INT. So depending on how many INT32 instructions are used, the 2080 Ti's effective TFLOPS range from 13.5 to 27. From the SoTR example, 38 out of every 100 instructions are INT, which means the 2080 Ti effectively has 13.5 + 13.5 × (38/62) = 21.77 TFLOPS.

Meanwhile the 20 TFLOPS of the 3070 are fixed (FP32 or FP32+INT32) and do not depend on the game engine's usage of INT instructions.

@Vya Domus and @PowerPC You guys forgot that the 2080 Ti can do concurrent FP32+INT32, so effectively the 2080 Ti has 27 TFLOPS in rare instances.
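That back-of-the-envelope arithmetic can be sketched as a small Python helper (a rough model only; `effective_tflops` is a made-up name, and the 13.5 TFLOPS and 38/100 INT-ratio inputs are the figures quoted in this post):

```python
def effective_tflops(fp32_tflops: float, int_per_100: float) -> float:
    """Estimate combined FP+INT throughput for a Turing-style GPU
    with separate FP32 and INT32 pipes of equal width.

    For every (100 - int_per_100) FP instructions, int_per_100 INT
    instructions can run concurrently on the otherwise-idle INT pipe.
    """
    fp_share = 100 - int_per_100
    return fp32_tflops + fp32_tflops * (int_per_100 / fp_share)

# Shadow of the Tomb Raider example: 38 of every 100 instructions are INT
print(round(effective_tflops(13.5, 38), 2))  # 21.77
```

With 0 INT instructions the estimate collapses to the plain 13.5 TFLOPS FP32 figure, and as the INT share approaches 50/100 it approaches the theoretical 27 TFLOPS ceiling.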


----------



## yotano211 (Sep 2, 2020)

Lionheart said:


> I know Nvidia recommends a 750W PSU for the 3080 but I'm hoping my 650W Gold rated PSU will suffice, reviews/time will tell.


I wouldn't push it.
If I ever build a desktop again, I would never mess with or cheap out on the power supply. I'd buy everything else used, but the power supply I would buy new. So I wouldn't push it.


----------



## wolf (Sep 2, 2020)

Well well well, looks like a lot of rose coloured crystal balls were off the mark.

A 3080 is definitely the card I was waiting for!


----------



## Valantar (Sep 2, 2020)

Lionheart said:


> I know Nvidia recommends a 750W PSU for the 3080 but I'm hoping my 650W Gold rated PSU will suffice, reviews/time will tell.


Depends on the rest of your hardware and how noisy your PSU is, but 650W with a 320W GPU should be perfectly fine unless you're running very high end hardware or overclocking.

If Ampere is power limited to TDP like Turing, you have 320W for the GPU, plus 20-30W for the motherboard and RAM, ~5W per SSD, 15W per 3.5" HDD, ~5W per couple of fans, 10W per AIO pump, and however much power your CPU needs. For a 65W AMD chip that is 88W, for a 95W AMD it's 144W, and for a 10th gen Intel it's 150-250W depending on the SKU. That's at stock clocks within the boost window (which might be infinite depending on your motherboard). I would add 20% margin on top of that for safety, and at least another 20% if you're overclocking - likely more.

Of course it's highly unlikely for all components in the system to draw maximum power at once, and CPUs pretty much never run at 100% while gaming, so there's some extra margin in there too. 650W would as such be rather slim for a system with something like a 10700K or 10900K (my formula ends up at 675W minimum assuming a couple of SSDs and a few fans), but should work fine with a less power hungry CPU, or if you undervolt and/or tune the power limits of one of those power hogs.
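As a rough illustration, that budgeting rule can be written out in a few lines of Python (the per-component wattages are the approximate figures from this post; `psu_estimate` and its defaults are hypothetical names chosen for the example):

```python
def psu_estimate(gpu_w, cpu_boost_w, ssds=2, hdds=0, fans=4, aio_pumps=0,
                 overclocked=False):
    """Rough minimum-PSU estimate following the per-component figures
    quoted in the post: motherboard + RAM ~25 W, ~5 W per SSD,
    15 W per 3.5" HDD, ~5 W per couple of fans (2.5 W each),
    10 W per AIO pump, plus a 20% safety margin and another
    20% on top when overclocking."""
    total = (gpu_w + cpu_boost_w + 25 + 5 * ssds + 15 * hdds
             + 2.5 * fans + 10 * aio_pumps)
    margin = 1.2 * (1.2 if overclocked else 1.0)
    return total * margin

# A 320 W RTX 3080 with a ~200 W boost-window CPU, 2 SSDs, a few fans:
print(round(psu_estimate(320, 200)))  # 678, close to the ~675 W figure above
```

Plugging in a 65W-class CPU instead shows why a quality 650W unit can still be plenty for a 3080 build at stock clocks.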


----------



## dicktracy (Sep 2, 2020)

Their test system was an i9-10900K. Nvidia seems to be confident that PCIe 3.0 is not a bottleneck. Jensen, what happened? No love for your niece?


----------



## nguyen (Sep 2, 2020)

dicktracy said:


> Their test system was with an i9 10900k. Nvidia seems to be confident that PCIE 3.0 is not a bottleneck. Jensen, what happened? No love for your niece?



At 4K there really is no difference between PCIe 3.0 and 4.0.
And Nvidia tested them with a 9900K; it still wouldn't make any difference versus a 10900K.


----------



## laszlo (Sep 2, 2020)

Surprisingly good pricing from them; I'd call it a "preemptive strike" in anticipation of AMD's release...

The best part: we end users, not the fanboys, will get good prices from both of them.


----------



## PooPipeBoy (Sep 2, 2020)

The RTX 3070 is going to sell like hot tamales.
If Nvidia can satisfy demand at that $499 price point, then I'll take back every time I've ever complained about their Sheriff of Nottingham business strategy.


----------



## watzupken (Sep 2, 2020)

Parn said:


> The power usage is crazy. Both the 90 and 80 are above 300W and even the 70 is approaching the territory that used to be reserved for the 80 Ti cards. But then if 3070 is actually faster than 2080 Ti by a good margin, that means power efficiency has been improved with Ampere. BTW where is that 12-pin pcie aux power connector?



I agree the power requirements have gone through the roof. Waiting for official reviews to see how much performance improvement we are getting with this generation. Also I am not convinced the CUDA core count is correct, i.e. the actual physical number of cores may be half of what is advertised.


----------



## R0H1T (Sep 2, 2020)

dicktracy said:


> No love for your *niece*?


Huh  not this again


----------



## bubbleawsome (Sep 2, 2020)

I wonder if this generation will be majorly power limited, where unlocking TDP will actually have a measurable effect.


----------



## yotano211 (Sep 2, 2020)

R0H1T said:


> Huh  not this again


In some places around the world there is plenty of love for the niece, along with the sister and 1st cousin.


----------



## sith'ari (Sep 2, 2020)

*Once again, nVidia's magic in the works*!!!
Ultra-hyped by what Jensen announced.


----------



## R0H1T (Sep 2, 2020)

yotano211 said:


> In some places around the world there is plently of love for the niece, along with the sister and 1st cousin.


Yeah except they aren't related.

Sorry if I missed any jokes in there


----------



## InVasMani (Sep 2, 2020)

This is definitely going to make it tough to give RDNA2 real consideration, from the looks of things, though we still don't know how it will compare. It's probably still a bit premature to call this a grand slam by Nvidia, but that pricing is aggressive this time around and exactly what's needed to propel RTRT forward. I am keen to see just how competitively AMD's cards stack up, and at what price points. I could see this putting a big damper on Intel's GPU ambitions too.


----------



## Shatun_Bear (Sep 2, 2020)

They've left AMD an open goal by using Samsung's clearly inferior 8nm process node. TSMC 7nm-enhanced RDNA2, with more memory and lower power draw, will beat out the 3080 but will lose in RT quite handily.



VallThore said:


> I wonder what relative performance means this time. I have a gut feeling that this incredible (roughly ~1.7x compared to 2080S looking at the graph) speed-up it's all about raytracing and not so much for rasterization but I would love to be wrong here.



It's not rasterization; people are being bamboozled by Nvidia marketing, as expected. The rasterization performance is exactly as the leaks rumoured:

3080: 25% faster than the 2080 Ti.
3090: 45% faster than the 2080 Ti.


----------



## Vayra86 (Sep 2, 2020)

PooPipeBoy said:


> The RTX 3070 is going to sell like hot tamales.
> If Nvidia can satisfy demand at that $499 price point, then I'll take back every time I've ever complained about their Sheriff of Nottingham business strategy.



I've always been a little puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of a premium. But there's almost always something on offer for that premium, and there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.

I mean... the 970 was super competitive on price too. The 660 Ti was the same back during Kepler, and the 670 was seen as the 'poor man's 680' but performed virtually the same. The 1070 dropped the 980 Ti's price point by a few hundred... and it's happening again with the x70 today. The price of an x70 has risen, but so have the feature set and the performance gap to the bottom end.

Even during the mining craze the midrange was populated, and its price, while inflated, was not as volatile as others.



Shatun_Bear said:


> They've left AMD an open goal because they're using Samsung's clearly inferior 8nm process node. TSMC 7nm enhanced RDNA2 with more memory and lower power draw will beat out the 3080 but will lose in RT quite handedly.
> 
> 
> 
> It's not rasterization, people being bamboozled by Nvidia marketing as expected. The rasterization perf is exactly as the leaks rumoured:



'The' leaks? The 12-pin was the only truly accurate one, man (alright, and the pictures). Nvidia played this well; you can rest assured everything we got was carefully orchestrated, and that includes the teasing of the 12-pin. Marketing gets a head start with these leaks. We also heard $1400-2000 worth of GPU, which obviously makes the announcement of the actual pricing even stronger.


----------



## efikkan (Sep 2, 2020)

biffzinker said:


> I thought the 220 watts for the 3070 wasn’t a bad TDP for the performance their advertising.


I think it's a bit much for a 70-class card, but the real problem is the TDP of 3080 and 3090. I think >300W is too much to cool at a reasonable noise level.


----------



## Vayra86 (Sep 2, 2020)

efikkan said:


> I think it's a bit much for a 70-class card, but the real problem is the TDP of 3080 and 3090. I think >300W is too much to cool at a reasonable noise level.



It is a bit much; it's basically a full 104 die's overclocked power consumption. And it is a full 104 as well, isn't it? That means the actual SKUs are still doing what they did, and Nvidia is maintaining its stack in a broad sense. It's just that the 3080 and 3090 have an odd gap for being off the same die, clearly a yield-based decision... It's clear GA102 isn't exactly in a fantastic place, if you ask me; there might be some distinct binning differences there.


----------



## lesovers (Sep 2, 2020)

OK, let's just get real here and come back to earth, everyone, and do the numbers. From NVidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070: +40%
2080S to 3080: +64%

and from Techpowerups 4K benchmarks;








ASUS Radeon RX 5700 XT TUF EVO Review - Improved Cooler, Tested (www.techpowerup.com)
				




1070 to 2070S +66%
1080 to 2080S +60%

The 1000 to 2000 super series gave us a bigger increase than these new cards!!!


----------



## efikkan (Sep 2, 2020)

Vayra86 said:


> It is a bit much, basically its a full 104 die's overclocked power consumption. And it is a full 104 as well isn't it? This means that the actual SKUs are still doing what they did, and Nvidia is maintaining its stack in a broad sense. It's just that the 3080 and 3090 have an odd gap for being off the same die, clearly a yield based decision... Its clear GA102 isn't directly a fantastic place to be, if you ask me, there might be some distinct binning differences there.


I don't care about which chip they use in which tier, that has changed in pretty much each generation, what matters is how it performs and how much energy it consumes.

I think 220W is a bit much but tolerable for the RTX 3070; however, there is a substantial jump up to 320W for the RTX 3080, which I think is too hot.

The performance and price gap between RTX 3080 and RTX 3090 probably indicates the production volume of RTX 3090. GTX 1080 Ti and RTX 2080 Ti have been big sellers, even outselling some of AMD's mid-range cards. Time will tell if RTX 3090 will be scarce.


----------



## Vayra86 (Sep 2, 2020)

lesovers said:


> Ok lets just get real here and come back to earth everyone and lets do the numbers, from NVidia's optimistic 4K benchmarks (RTX off) in the presentation;
> 
> 2070S to 3070  +40%
> 2080S to 3080  +64%
> ...



That's why I skipped Turing. SUPER was too late to the party.


----------



## lesovers (Sep 2, 2020)

Vayra86 said:


> That's why I skipped Turing. SUPER was too late to the party.



Yes, but we are comparing the old generation to the new one, unless the Super cards were a generational change?



lesovers said:


> Ok lets just get real here and come back to earth everyone and lets do the numbers, from NVidia's optimistic 4K benchmarks (RTX off) in the presentation;
> 
> 2070S to 3070  +40%
> 2080S to 3080  +64%
> ...



and for AMD benchmarks at 4K;

RX580 to RX5700XT +100%

and to the future Big Navi (RDNA 2) at 4K;

RX5700XT to RX6900XT +120%???


----------



## medi01 (Sep 2, 2020)

Fluffmeister said:


> Ah yes, the fabled "Big Navi", it does appear to be the second coming


*Soundly beating the 3070* doesn't sound like the second coming to me... *at all*?
It could explain NV suddenly being so modest about pricing; it's such a great contrast to Turing prices, with the 2080 Ti never offered at the claimed $999 MSRP.


----------



## Valantar (Sep 2, 2020)

Vayra86 said:


> I've always been a little bit puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of premium. But there's almost always something on offer for that premium. And then there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.
> 
> I mean... the 970 was super competitive also on price. The 660ti was the same back during Kepler and the 670 was seen as the 'poor man's 680', but performed virtually the same. The 1070 was dropping the 980ti price point down by a few hundred... and its happening again with x70 today. The price of an x70 has risen... but so has the featureset and the performance gap to the bottom end.
> 
> Even with the mining craze the midrange was populated and the price, while inflated, was not quite as volatile as others.


There's no doubt that per-tier pricing has made some major jumps in recent years. Have we gotten more performance at that tier? Sure! But perf/$ has barely moved at all, at least until RDNA showed up and Nvidia launched the Supers. Turing was essentially "pay the same for the same level of rasterization performance, but with RT added, at a lower product tier" when compared to Pascal, with the top end of course moving upwards in both price and performance. This, on the other hand, looks like an excellent value play, and finally a significant improvement in perf/$ from day 1, even compared with Pascal. Of course there are reasons for this, such as more expensive process nodes, more expensive memory, more complex PCBs and more complex coolers, but overall GPU pricing per market segment has still seen a significant increase. Mainstream GPUs used to be around (and often below) $200, while the most heavily marketed GPUs are now typically above $300, with anything lower treated as a sort of second-class citizen. The selection below $200 is downright awful, even with the slightly better value 1650S on the market. I'm hoping for some downward price creep this time around; with these efficiency improvements there should be plenty of room for small, cheap chips with good performance.


----------



## R0H1T (Sep 2, 2020)

MxPhenom 216 said:


> Best *sell now*


That would have been apt a week or two back. After today's shellacking, the only people buying a used *2080 Ti* at anything above $300-400 are either just waking up from a coma/hibernation, or living in a complete bubble and somehow having the urge to spend big bucks on what is now an obsolete card.


----------



## medi01 (Sep 2, 2020)

Xex360 said:


> even the 3070 has more than the 2080ti,


Much more, yet it roughly matches it on performance; curious, isn't it?
In past generations, perf/CU figures were rising.

As if someone just decided to double the claimed figure just for marketing lulz.



Shatun_Bear said:


> It's not rasterization, people being bamboozled by Nvidia marketing as expected. The rasterization perf is exactly as the leaks rumoured:
> 
> 3080 25% faster than 2080 Ti.
> 3090 45% faster than 2080 Ti.


What is the source for those claims?



efikkan said:


> even outselling some of AMD's mid-range cards.


You trust (and misread) the Steam hardware survey too much.
For actual sales, check reports from actual shops, e.g. Mindfactory.


----------



## lemoncarbonate (Sep 2, 2020)

I think I will be happy with the 3070: significantly less power than the 3080, but still a good performer for 3440x1440 75 Hz.
I'll wait for RDNA2. I hope AMD can deliver at least 3070 performance for less power and a lower price.


----------



## Chrispy_ (Sep 2, 2020)

PowerPC said:


> VRAM is one of the most expensive parts of the GPU right now and so far, the least important when it comes to performance. You can increase the VRAM, but then you'll have to increase the price considerably, just look at 3090... I mean, these cards are for today. I doubt anybody can do anything about VRAM prices. Game devs really need (and probably will) adapt to this reality, unless these prices change. I think you can find flaws with anything, if you want.


PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.

10GB cards will soon be inadequate, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!

The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.

I was expecting the 3080 to be 16GB and the 3070 to be 12GB, to be honest...


----------



## TheoneandonlyMrK (Sep 2, 2020)

Chrispy_ said:


> PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.
> 
> 10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!
> 
> The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.


I see the memory allocation as Nvidia's move to keep a GPU release in a few months to a year relevant. They're already pushing the power envelope, so silicon and process optimization won't net much of a gain; more, higher-speed VRAM will sell cards in a year.


----------



## Fluffmeister (Sep 2, 2020)

Well, the 3070 is only 220W, that's less than the 5700XT, and that has a feature set that is, well... lacking.


----------



## Valantar (Sep 2, 2020)

Chrispy_ said:


> PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.
> 
> 10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!
> 
> ...


The PS5 likely has the same split between the OS and software as the XSX, reserving 2.5GB for the system and leaving 13.5GB for software. This of course has to serve as both RAM and VRAM for the software, so games exceeding 10GB in VRAM alone is quite unlikely. Of course the PC typically supports higher detail levels, leading to higher VRAM usage, but new texture streaming techniques (and especially DirectStorage) are likely to dramatically reduce the amount of "let's keep it in VRAM in case we need it" data, which is the majority of current VRAM usage on both PCs and consoles. If developers start designing with NVMe as a baseline, VRAM utilization can drop very noticeably from this alone. Current games pre-load data based on HDD transfer rates and seek times, meaning data is loaded very aggressively and early, with the majority of it being flushed without ever seeing use.


----------



## P4-630 (Sep 2, 2020)

btarunr said:


> Here it is, the GeForce RTX 3080, 10 GB GDDR6X, running at 19 Gbps, 238 tensor TFLOPs, 58 RT TFLOPs, 18 power phases.



Hmm, according to Tom's, 68 RT TFLOPs and 272 tensor TFLOPs:

Nvidia GeForce RTX 3080 Founders Edition Review: A Huge Generational Leap in Performance
The GeForce RTX 3080 delivers big gains over Turing and the RTX 2080 Ti, but at a lower price.
www.tomshardware.com


----------



## LocutusH (Sep 2, 2020)

Anyone knows how and when can we preorder FE cards?


----------



## Chrispy_ (Sep 2, 2020)

Valantar said:


> The PS5 likely has the same split between the OS and software as the XSX, reserving 2.5GB for the system and leaving 13.5GB for software. This if course has to serve as both RAM and VRAM for the software, so games exceeding 10GB in VRAM alone is quite unlikely. Of course the PC typically supports higher detail levels, leading to higher VRAM usage, but new texture streaming techniques (and especially DirectStorage) are likely to dramatically reduce the amount of "let's keep it in VRAM in case we need it" data, which is the majority of current VRAM usage on both PCs and consoles. If developers start designing with NVMe as a baseline, VRAM utilization can drop very noticeably from this alone. Current games pre-load data based on HDD transfer rates and seek times, meaning data is loaded very aggressively and early, with the majority of it being flushed without ever seeing use.


If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.

This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.


----------



## kayjay010101 (Sep 2, 2020)

LocutusH said:


> Anyone knows how and when can we preorder FE cards?


An NVIDIA employee stated in a Reddit Q&A yesterday that there will be no preorders. The cards simply go on sale on the 17th/24th/in October.


----------



## Chrispy_ (Sep 2, 2020)

LocutusH said:


> Anyone knows how and when can we preorder FE cards?


You can't, but you can sign up for when orders go live. In my experience with Pascal and Turing launches they are out of stock before the email arrives, so it's not much use.

https://www.nvidia.com/en-gb/geforce/buy/ (regional; you'll need to change en-gb to your country code)

Also, does anyone want to buy a 2080Ti for more than the cost of a 3080 _and_ 3070 combined? Nvidia has you covered!


----------



## Valantar (Sep 2, 2020)

P4-630 said:


> Hmm, according to toms 68 RT and 272 tensor
> 
> View attachment 167503
> 
> ...


Tom's is wrong. That is closer to the 3090's specs, though not quite. 



Chrispy_ said:


> If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.
> 
> This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.


Given that DirectStorage is on the XSX, the PS5 uses a similar system, and most high budget games are developed for consoles (too), I would be very surprised if this didn't happen. I guess they might make some sort of legacy mode, though it would be far less effort for developers to aim for console specs as a minimum. Though to be frank even aiming for SATA SSDs as a baseline would largely fix this, as seek times matter more for this than raw transfer rates.


----------



## Chrispy_ (Sep 2, 2020)

Valantar said:


> Though to be frank even aiming for SATA SSDs as a baseline would largely fix this, as seek times matter more for this than raw transfer rates.


Yeah, like I said, it'd be nice if I'm wrong this time.
I don't fancy replacing my 2.5TB of library drives with NVMe.


----------



## P4-630 (Sep 2, 2020)

Chrispy_ said:


> You can't, but you can sign up for when orders go live. In my experience with Pascal and Turing launches they are out of stock before the email arrives, so it's not much use.
> 
> https://www.nvidia.com/en-gb/geforce/buy/ <regional, you'll need to change en-gb to your country code.
> 
> ...



Did you check this?

Videokaarten - Vraag & Aanbod - Tweakers
tweakers.net


----------



## BoboOOZ (Sep 2, 2020)

Chrispy_ said:


> If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.
> 
> This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.


Shadowlands already lists an SSD as a minimum requirement for the game. But from that to 3 GB/s is another step up; I would imagine they would rather ask for more RAM/VRAM.


----------



## Valantar (Sep 2, 2020)

BoboOOZ said:


> Shadowlands already includes an SSD as a minimum requirement for the game. But from that to 3GB/s that's another step up, I would imagine they would rather ask for more RAM/VRAM.


But if the only option for more VRAM is a $1499 GPU... then buying a $100 or $200 SSD is far easier, no?


Chrispy_ said:


> Yeah, like I said, it'd be nice if I'm wrong this time.
> I don't fancy replacing my 2.5TB of library drives with NVMe.


It really wouldn't be hard to make a flexible solution for this - just make game platforms (GOG, Steam, Epic, etc.) identify what types of storage you have in your system (Windows already does this for SSDs and HDDs on a system level, but it should be trivial to differentiate between SATA and NVMe too), add a tag to games requiring fast storage so the launcher knows, and allow the platform to shuffle games between drives as needed (obviously with user configuration options like always keeping certain games on storage type X or Y, etc.).
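The launcher-side policy described above could be sketched roughly like this (all names, tiers, and the placement rule are hypothetical, just to illustrate the tag-matching idea):

```python
# Hypothetical sketch of a launcher matching a game's storage-requirement
# tag against the drives available in the system.
from dataclasses import dataclass

# Ordered fastest-first so tiers can be compared numerically.
TIERS = {"nvme": 2, "sata_ssd": 1, "hdd": 0}

@dataclass
class Drive:
    name: str
    tier: str       # one of TIERS
    free_gb: float

def pick_drive(drives, required_tier, size_gb):
    """Return the fastest drive that meets the game's tag and has space, else None."""
    need = TIERS[required_tier]
    for d in sorted(drives, key=lambda d: -TIERS[d.tier]):
        if TIERS[d.tier] >= need and d.free_gb >= size_gb:
            return d
    return None

drives = [Drive("C:", "nvme", 120.0), Drive("D:", "hdd", 900.0)]
print(pick_drive(drives, "nvme", 80.0).name)   # game tagged "requires fast storage" -> C:
print(pick_drive(drives, "hdd", 400.0).name)   # legacy game, any drive is fine -> D:
```

A real platform would of course also honor user pins ("always keep game X on drive Y") before applying any automatic rule.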


----------



## Ubersonic (Sep 2, 2020)

wheresmycar said:


> :O
> 
> Not sure who I should give thanks to... NVIDIA for the effort or AMD for the compo.


Intel.

This price/performance combo is a straight-up attempt to knock AMD out of the high-end GPU market, something Nvidia haven't been able to try previously, as the monopolies commission would have come calling.

If Nvidia can KO AMD hard enough, they will end up in a two-horse race with Intel, who will be in AMD's old spot of having the second-best CPUs and GPUs.


----------



## BoboOOZ (Sep 2, 2020)

Valantar said:


> But if the only option for more VRAM is a $1499 GPU... then buying a $100 or $200 SSD is far easier, no?


There'll be much more affordable cards (RDNA2 and then Nvidia) with more than 10GB in just a few months, I'm sure.



Ubersonic said:


> If Nvidia can KO AMD hard enough they will end up in a two horse race with Intel who will be in AMDs old spot of having the second best CPUs and gpus.


I confess the doubled shader operation trick reminds me exactly of what they pulled 20 years ago with the GeForce 2 GTS ("T" stood for texel, meaning two textures applied per pixel per clock cycle) to eliminate 3dfx, and it worked just fine back then: against so much brute force, 3dfx was lost and soon disappeared.


----------



## Shatun_Bear (Sep 2, 2020)

Vayra86 said:


> I've always been a little bit puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of premium. But there's almost always something on offer for that premium. And then there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.
> 
> I mean... the 970 was super competitive also on price. The 660ti was the same back during Kepler and the 670 was seen as the 'poor man's 680', but performed virtually the same. The 1070 was dropping the 980ti price point down by a few hundred... and its happening again with x70 today. The price of an x70 has risen... but so has the featureset and the performance gap to the bottom end.
> 
> ...



Come on, Red Gaming Tech and MLID were correct about:

- Nvidia using Samsung's inferior 8nm process node
- The cards drawing huge power as a result; 320 and 380W is not normal. One even claimed the exact power draw, which was on the money
- The 3080 being the card they push hard, as its performance is much, much closer to the 3090's than the price suggests. This is to combat Navi
- The performance numbers and their relative gaps, which were all spot on

So we knew a load about this release, and Nvidia were definitely more leaky here than with Turing or Pascal.


----------



## BoboOOZ (Sep 2, 2020)

Shatun_Bear said:


> So we knew a load about this release and Nvidia were definitely more leaky here than with Turing, Pascal.


Tom from MLID said most of his RDNA2 info was coming from Nvidia sources too.


----------



## medi01 (Sep 2, 2020)

Fluffmeister said:


> Well the 3070 is only 220W, that's ...


That contradicts Huang's statement about 1.9x better perf/W (taking the 2080 Ti as a 270 W card, the 3070 should have been ~145 W).


----------



## Shatun_Bear (Sep 2, 2020)

Fluffmeister said:


> Well the 3070 is only 220W, that's less than the 5700XT, and that has a feature set that is well.... Lacking.



The 5700XT is RDNA1, approaching two years old. A better comparison will be the 6700XT vs the 3070.


----------



## Vayra86 (Sep 2, 2020)

Shatun_Bear said:


> Come on, Red Gaming Tech and MLID were correct about:
> 
> - them using Samsung's inferior 8nm process node
> - The cards will draw huge power as a result. 320 and 380W is not normal. One even claimed the exact power draw which was on the money
> ...



Quite true in fact, yeah. Still though, I'm pretty sure this was orchestrated leaking. The timing, the content... how do you sell 320W? By letting us ease into it... and then bringing a favorable price point.

These 'tubers are just free or nearly free marketing tools.


----------



## medi01 (Sep 2, 2020)

Soo, talking about *transistor density*, TSMC 7nm DUV vs Samsung 8nm:

5700XT: 250 mm², 10.3 billion => 41.2 million transistors per mm²
3080: 627 mm², 28 billion => 44.7 million transistors per mm²

Remind me, who was saying that Samsung 8nm is faux and just a marketing name for 10nm?
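For what it's worth, the arithmetic above checks out; a quick sketch using the same figures quoted in the post:

```python
# Checking the post's arithmetic: transistors per mm^2 for the two dies,
# using the die sizes and transistor counts quoted above.

def density_mtr_per_mm2(transistors_bn, die_mm2):
    """Million transistors per square millimetre."""
    return transistors_bn * 1000 / die_mm2

navi10 = density_mtr_per_mm2(10.3, 250)   # 5700 XT, TSMC 7nm DUV
ga102  = density_mtr_per_mm2(28.0, 627)   # RTX 3080, Samsung 8nm

print(f"Navi 10: {navi10:.1f} Mtr/mm^2")  # ~41.2
print(f"GA102:   {ga102:.1f} Mtr/mm^2")   # ~44.7
```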


----------



## nguyen (Sep 2, 2020)

Comparing the RTX 3070 to the 2080 Super, which are more similar in specs (8 GB on a 256-bit bus, 16 Gbps VRAM), suggests that Ampere is around 35-40% more efficient than Turing.
I guess we lose a bit of efficiency going to Samsung 8N, but the lower prices justify all that.


----------



## ppn (Sep 2, 2020)

nguyen said:


> Comparing RTX 3070 to 2080 Super which are more similar in specs (8 GB on a 256-bit bus, 16 Gbps VRAM) suggests that Ampere is around 35-40% more efficient than Turing.
> I guess we lose a bit of efficiency going Samsung 8N but the lower prices justified all that.



More like the 3070 is similar to the 2070: ~450 mm², 256-bit. So Nvidia managed to squeeze in 6144 CUDA cores, or 2.66x more compared to 2304, while bumping the memory speed only to 16 Gbps. The average of 1.14x and 2.66x is 1.9x, so it is 90% more efficient on average; of course, where pure computational power comes into play it is 2.66x, full die to full die, TU106 vs GA104 (3070 Ti).


----------



## efikkan (Sep 2, 2020)

medi01 said:


> You trust (and mistread) steam hardware survey too much.
> For actual sales check reports from actual shops, e.g. mindfactory.


Steam has a major marketshare among PC gamers, and is way more representative than a single shop. The only thing missing from the Steam hardware survey is people who buy graphics cards and don't game.



Chrispy_ said:


> PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.
> 
> 10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!


Dynamic, as in the game is able to decide how much is system RAM and how much is VRAM.

By the time 8 GB is inadequate for gaming, the performance of RTX 3070 will be too, and you will be buying a "RTX 6070"…



Chrispy_ said:


> The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.


I think you are putting too much faith in game developers. Most of them just take an off-the-shelf game engine, load in some assets, do some scripting and call it a game. Most game studios don't do a single line of low-level engine code, and the extent of their "optimizations" are limited to adjusting assets to reach a desired frame rate.



Chrispy_ said:


> If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.


Not really. The difference between a "standard" 500 MB/s SSD and a 3 GB/s SSD will be loading times. For resource streaming, 500 MB/s is plenty.
Also, don't forget that these "cheap" NVMe QLC SSDs can't deliver 3 GB/s sustained, so if a game truly depended on this, you would need an SLC SSD or Optane.



Chrispy_ said:


> This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.


Games in general aren't particularly good at utilizing the hardware we currently have, and the trend in game development has clearly been less performance optimization, so what makes you think this will change all of a sudden?


----------



## R0H1T (Sep 2, 2020)

medi01 said:


> Contradicts Huang's statements about *1.9 better perf/w* (taking 2080Ti as 270W card, 3070 should have been 145W)


That's closer to the best case scenario, for one game, you don't seriously believe that the card will do (at least) 1.9x better perf/W across the board under all workloads do you?


----------



## nguyen (Sep 2, 2020)

Yeah, the 1.9x perf/watt figure is pretty bullshit.
The top of the Turing curve is its worst perf/watt, while the Ampere curve is at its most efficient.
Just take the FPS values at 120W: Ampere at ~58 FPS vs Turing at ~45 FPS is around a 30% improvement.
I could do the same with the 2080 Ti vs the 1080 Ti: lower the TDP of the 2080 Ti to 140W and I still get 1080 Ti performance. Does that mean Turing is 1.8x the perf/watt of Pascal? Heck no.

Looking at the Ampere curve, we also know why the new cards' TDPs are much higher: perf just scales so well with power.
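The two readings of the claim can be put side by side (the wattage and FPS figures are the ones quoted in this thread, read off NVIDIA's chart, not measurements):

```python
# Two ways to read NVIDIA's "1.9x perf/W" efficiency claim.

# 1) Headline reading: at equal performance to a 270 W 2080 Ti,
#    an Ampere part would need only ~142 W.
claimed_gain = 1.9
turing_watts = 270
iso_perf_watts = turing_watts / claimed_gain
print(f"implied power at 2080 Ti performance: {iso_perf_watts:.0f} W")  # ~142

# 2) Iso-power reading of the curve: at 120 W, ~58 FPS (Ampere) vs
#    ~45 FPS (Turing) is a much more modest gain.
iso_power_gain = 58 / 45
print(f"iso-power gain at 120 W: {(iso_power_gain - 1) * 100:.0f}%")  # ~29
```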


----------



## Valantar (Sep 2, 2020)

BoboOOZ said:


> There'll be much more affordable (RDNA2 and then Nvidia) cards with more than 10GB in just a few months, I'm sure.


Looking at the tiny gap between the 3080 and the 3090, how do you imagine that? Sure, there might be 12 and 16GB RDNA2 cards, but given Nvidia's massive marketshare advantage, developers won't tune their games for those primarily. Beyond that, "much more affordable"? How? Best case scenario we get a 3080 Ti with the same core count as the 3090 but 12GB of VRAM and lower clocks at $999. Cheaper than that isn't happening, and even that would be borderline miraculous.


Shatun_Bear said:


> Come on, Red Gaming Tech said/MLID were correct with:
> 
> - them using Samsung's inferior 8nm process node
> - The cards will draw huge power as a result. 320 and 380W is not normal. One even claimed the exact power draw which was on the money
> ...


380W? The 3090 is 350W. Other than that, this seems to have been accurate.


medi01 said:


> Contradicts Huang's statements about 1.9 better perf/w (taking 2080Ti as 270W card, 3070 should have been 145W)


Perf/W isn't a linear function, and Nvidia has obviously chosen to push clocks (and thus power, losing efficiency as they move up the DVFS curve) to increase the absolute performance to gain a better competitive position in anticipation of RDNA 2. All that graph says is that "Ampere" (likely the GA104 die, unknown core count and memory configuration) at ~140W could match the 2080 Ti/TU102 at ~270W. Which might very well be true, but we'll never know outside of people undervolting and underclocking their GPUs, as Nvidia is never going to release a GPU based on this chip at that power level (unless they go entirely insane on mobile, I guess).


ppn said:


> More like 3070 is similar to 2070,. 450mm2, 256 bit. so nvidia managed to squeeze 6144 Cuda or 2.66x more compared to 2304. while bumping the memory speed only to 16. And the average of 1,14+2,66 is 1.9x, so it is 90% more efficient on average, of course where the pure computation power comes into play it is 2.66, Full die to full die TU106 Vs GA104 3070Ti.


You can't compare Turing Cuda cores to Ampere Cuda cores as if they are the same, as there is obviously a fundamental shift in the architecture with Nvidia doubling the ALUs overall. Also, calculating efficiency from cores/area is ... absurd. Efficiency for a GPU is a function of performance over power consumption, and neither of the two can be directly calculated from on-paper specs, especially given a new architecture. Given the doubling of ALUs with seemingly minor changes elsewhere in the rasterization part of the GPU (it's not like everything else has doubled, after all), it's highly likely that perf/Tflop is going to drop significantly for this generation. After all, they are presenting the ~20Tflop 3070 as a bit faster than the ~13Tflop 2080 Ti, not 50% faster.
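To make that last point concrete, here is the implied perf-per-TFLOP ratio using the approximate figures from the paragraph above (the 5% "a bit faster" margin is an assumption):

```python
# Rough perf-per-TFLOP comparison: if the ~20 TFLOP 3070 only slightly
# beats the ~13 TFLOP 2080 Ti, rasterization throughput per theoretical
# FLOP must have dropped with Ampere's doubled ALUs.
tflops_3070, tflops_2080ti = 20.0, 13.0
rel_perf_3070 = 1.05  # "a bit faster" than the 2080 Ti (assumed 5%)

# Normalize both cards to performance per TFLOP, then take the ratio.
perf_per_tflop_ratio = (rel_perf_3070 / tflops_3070) / (1.0 / tflops_2080ti)
print(f"Ampere perf/TFLOP vs Turing: {perf_per_tflop_ratio:.2f}x")  # ~0.68
```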


----------



## Prima.Vera (Sep 2, 2020)

btarunr said:


> *Update 16:23 UTC*: Cyberpunk 2077 is playing big on the next generation. NVIDIA is banking extensively on the game to highlight the advantages of Ampere. The 200 GB game could absorb gamers for weeks or months on end.


Why is this screen so blurred? I hope the game won't be like that, lol.


----------



## BoboOOZ (Sep 2, 2020)

Valantar said:


> Looking at the tiny gap between the 3080 and the 3090, how do you imagine that? Sure, there might be 12 and 16GB RDNA2 cards, but given Nvidia's massive marketshare advantage, developers won't tune their games for those primarily. Beyond that, "much more affordable"? How? Best case scenario we get a 3080 Ti with the same core count as the 3090 but 12GB of VRAM and lower clocks at $999. Cheaper than that isn't happening, and even that would be borderline miraculous.


An $850 3080 20GB? That would be much more affordable than $1500, right?


----------



## SN2716057 (Sep 2, 2020)

I'll wait for multiple reviews, but I'm eyeing the 3080. Hopefully it can run MSFS2020 at 60 fps.


----------



## Makaveli (Sep 2, 2020)

SN2716057 said:


> I'll wait for multiple reviews but I'm eyeing for the 3080. Hopefully it can run MSFS2020 at 60fps.



Isn't 30 fps all that's needed for MSFS2020? That's pretty much what all the reviewers are saying.


----------



## SN2716057 (Sep 2, 2020)

Makaveli said:


> isn't 30fps all that is needed for MSFS2020 as that is pretty much what all the reviewers are saying.


Please no blasphemy.


----------



## John Naylor (Sep 2, 2020)

The numbers are certainly real... they just don't make this stuff up... but like every other benchmark, they bear little resemblance to everyday usage. No different from RAID benchmarks or multicore benchmarks: they only matter when you have an application that can use them, and 98.5% of folks can't. What we will see in gaming will be in the 20% range from the 2080 to the 3080... as I recall, that was what was shown for the marble demo.

After the price bump from the mining craze rise/fall, the warehouses full of last-gen cards and, in the US, the import tariffs, it was nice to see the naysayers were wrong about the pricing. In 2017, Hot Hardware did a graph of prices for the top gaming card (that leaves out Titan-class cards) and from 2000 to 2017, the average price of the top card hovered around $700 in 2017 US dollars. That makes the 3080 a bargain, both for the MSRP and given that prices are now inflated by pandemic and tariff impacts.


----------



## Valantar (Sep 2, 2020)

BoboOOZ said:


> A 850$ 3080 20GB? That would be much more affordable than 1500$, right?


That configuration is never going to happen. Not only is the BOM cost for 10GB of GDDR6X likely more than $150, they would need to design a new PCB for this SKU - I sincerely doubt the 3080's PCB has double-sided VRAM pads given its pricing. The amount is massive overkill too, placing the card in a segment where datacenters and others running massive datasets would gobble them up before gamers could ever get their hands on them. That's who the 3090 is meant to attract, after all, so why undercut it with the segment most eager to buy it?


----------



## R0H1T (Sep 2, 2020)

Well, if they get higher-density GDDR6X in there, all of a sudden it isn't so impossible anymore. But then again, when they have the 3090 to sell, why would they willingly take an axe to it?

Though I do expect the Supes to debut some time after, perhaps next year?


Spoiler

No, not them!


----------



## BoboOOZ (Sep 2, 2020)

Valantar said:


> That configuration is never going to happen. Not only is the BOM cost for 10GB of GDDR6X likely more than $150, they would need to design a new PCB for this SKU - I sincerely doubt the 3080's PCB has double sided VRAM pads given its pricing. The amount is massive overkill too, placing the card in a segment where datacenters and others running massive datasets would gobble them up before gamers could ever get their hands on them. That's who the 3090 is meant to attract, after all, so why undercut it with the segment most eager to buy it?


They won't undercut it if they don't need to; that is, if AMD doesn't come out with a 16GB-or-more card beating it or thereabouts. They will do it as late as possible, of course.
As for the datacenter crowd gobbling them up, why would that be a problem, as long as they sell well? The availability of the 3090 will be small in any case.


----------



## Lycanwolfen (Sep 2, 2020)

Rather buy two 2080ti supers and SLI them


----------



## DuxCro (Sep 2, 2020)

According to the RTX 2080 review on Guru3D, it achieves 45 fps average in Shadow of the Tomb Raider at 4K with the same settings Digital Foundry used in their video. But they achieve around 60 fps, which is only 33% more, while they claim avg fps is 80% higher. You can see the fps counter in the top left corner with Tomb Raider. Was Vsync on in the captured footage? If there was an 80% increase in performance, avg fps should be around 80. The 2080 Ti's fps on the same CPU DF was using should be over 60 in Shadow of TR, so the RTX 3080 is just around 30% faster than the 2080 Ti. So the new top-of-the-line Nvidia gaming GPU is just 30% faster than the previous top of the line. When you look at it like that, I really don't see any special jump in performance. The RTX 3090 is for professionals, and I don't even count it in at that price.
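The post's arithmetic, spelled out (numbers are the poster's readings of Guru3D and DF footage, not verified benchmarks):

```python
# Sanity-checking the fps reasoning: 45 fps for a 2080 at 4K (per Guru3D),
# ~60 fps in DF's 3080 footage, against NVIDIA's claimed +80%.
fps_2080 = 45
fps_3080_observed = 60

observed_gain = fps_3080_observed / fps_2080 - 1
print(f"observed gain: {observed_gain:.0%}")  # ~33%

claimed_gain = 0.80
print(f"fps implied by +80%: {fps_2080 * (1 + claimed_gain):.0f}")  # ~81
```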


----------



## BoboOOZ (Sep 2, 2020)

DuxCro said:


> Vsync was on in captured footage?


At some point in the beginning they mentioned VSync was on, IIRC.

Anyways, let's wait for the marketing BS to settle and for the real numbers to show up, recorded on multiple resolutions, from serious test sites...


----------



## MxPhenom 216 (Sep 2, 2020)

Lycanwolfen said:


> Rather buy two 2080ti supers and SLI them



There's no such thing as a 2080 Ti Super there, bud.


----------



## mouacyk (Sep 2, 2020)

MxPhenom 216 said:


> There's no such thing as a 2080Ti Super there bud.


At the expected $400 resale value, it may as well join the Supes.


----------



## Chomiq (Sep 2, 2020)

Prima.Vera said:


> Why this screen is so blurred? I hope the game won't be like that, lol.


It CAN look like that; it's a graphical effect that can be toggled off in the settings menu:

The Witcher 3: Wild Hunt - Chromatic Aberration Can be Turned Off
wccftech.com


----------



## Valantar (Sep 2, 2020)

BoboOOZ said:


> They won't undercut it if they don't need to, that is, if AMD doesn't come with a 16GB or more card beating it or thereabouts. They will do it as late as possible, of course.
> As for the datacenter crowd gobbing them up, why would that be a problem, as long as they sell well? The availability of the 3090 will be small in any case.


You're shifting the perspective of your arguments as you go here - that's generally not a very good way of making a point, and serves to show that the ground on which you are building your arguments is shaky. In one sentence you're arguing from a perspective of "gamers will need more than 10GB of VRAM, Nvidia must provide", and in the next you're arguing from a perspective of "it doesn't matter who buys the GPUs as long as they sell". See the issue? The latter perspective is in direct conflict with the former. Are you arguing from the perspective of the "common good" of gamers, or are you arguing from the perspective of Nvidia continuing to be a successful business?

As for what you are saying: for compute customers, any sale of a potential 20GB 3080 is a lost sale of a 3090 - those customers can _always_ afford the next tier up if it makes sense in terms of features or performance. The compute performance difference between the 3080 and 3090 is small enough that 100% of those customers would then instead buy the 3080 20GB unless they are in the tiny niche where _another_ 4GB actually makes a difference. For the rest, they'll just buy two 3080 20GBs. And again, 20GB of VRAM with no additional bandwidth or compute performance makes pretty much zero sense. Games won't exceed 10GB of VRAM any time soon unless their developers either don't care or are incompetent.


----------



## RyzenTestRig (Sep 2, 2020)

$699 3080, yes.
Some of us will also need a $150 PSU upgrade on top of that.
320W without overclocking.

The 3070 seems way more reasonable.

Let's see what Big Navi has to offer before we upgrade.


----------



## T3RM1N4L D0GM4 (Sep 2, 2020)

Frick said:


> *thinking about my GTX760*



Me too :-\


----------



## Lycanwolfen (Sep 2, 2020)

MxPhenom 216 said:


> There's no such thing as a 2080Ti Super there bud.



Oh, then 2080s.


----------



## BoboOOZ (Sep 2, 2020)

Valantar said:


> You're shifting the perspective of your arguments as you go here - that's generally not a very good way of making a point, and serves to show that the ground on which you are building your arguments is shaky. In one sentence you're arguing from a perspective of "gamers will need more than 10GB of VRAM, Nvidia must provide", and in the next you're arguing from a perspective of "it doesn't matter who buys the GPUs as long as they sell". See the issue? The latter perspective is in direct conflict with the former. Are you arguing from the perspective of the "common good" of gamers, or are you arguing from the perspective of Nvidia continuing to be a successful business?


I'm not shifting perspectives; you're probably overanalysing my (maybe too short) messages.

The whole point should be considered only from the viewpoint of the company, in a more or less competitive market.
I'm pretty certain that in a year or two there will be more games requiring more than 10 GB of VRAM in certain situations, but I think if AMD comes out with a competitive option against the 3080 (with a more reasonable amount of memory), reviews will point out this problem, let's say, in the next 5 months. If this happens, Nvidia will have to react to remain competitive (they are very good at this).


Valantar said:


> As for what you are saying: for compute customers, any sale of a potential 20GB 3080 is a lost sale of a 3090 - those customers can _always_ afford the next tier up if it makes sense in terms of features or performance. The compute performance difference between the 3080 and 3090 is small enough that 100% of those customers would then instead buy the 3080 20GB unless they are in the tiny niche where _another_ 4GB actually makes a difference. For the rest, they'll just buy two 3080 20GBs.


No SLI on the 3080? Anyways, I will repeat myself, Nvidia will do this only if they have to, and AMD beats the 3080.


Valantar said:


> And again, 20GB of VRAM with no additional bandwidth or compute performance makes pretty much zero sense.


Do you mean to say that the increase from 8 to 10 GB is proportional to the compute and bandwidth gap between the 2080 and the 3080? It's rather obvious that it's not. On the contrary, if you look at the proportions, the 3080 is the outlier of the lineup: it has 2x the memory bandwidth of the 3070 but only 25% more VRAM.


Valantar said:


> Games won't exceed 10GB of VRAM any time soon unless their developers either don't care or are incompetent.


Older games have textures optimized for viewing at 1080p. This gen is about 4K gaming being really possible, so we'll see more detailed textures.
They will be leveraged on the consoles via streaming from the SSD, and on PCs via increased RAM/VRAM usage.


----------



## Lycanwolfen (Sep 2, 2020)

Single card, 3 slots, ouch. I know you guys say video games today do not use SLI. Well, I beg to differ. I went to 4K gaming a couple of years ago and I play a few games like FFXIV and Doom and a few others. With a single 1070 Ti in FFXIV at 4K I pushed about 50 to 60 fps, but in SLI I pushed over 120 fps at 4K, so the game says it does not support it, but it does. SLI is always on. I have met many people who bought SLI and did not know how to configure it, so they never saw the benefit from it. Also, in SLI the game ran smoother, cleaner. Now maybe it cannot address all the memory, but it still can use both GPUs, which increases the smoothness.

When I ran 1080p gaming I ran two 660 Tis in SLI and everything was sweet. But 4K? Nope, could not handle the load.


----------



## kapone32 (Sep 2, 2020)

Well from 16 pages of posts and all the 30 series announcements 2020 is shaping up to be the year of the Desktop Computer.


----------



## Makaveli (Sep 2, 2020)

SLI is dead notice how the list gets smaller and smaller as we get to the current year.









57 Best SLI Supported Games That Scale (2020 List) - BGC (www.build-gaming-computers.com)

Updated list of the best games that support SLI or NVLink with an official NVidia profile, including both recent AAA games and older classics with good scaling.


----------



## Manoa (Sep 2, 2020)

Lycanwolfen said:


> Single card, 3 slots, ouch. I know you guys say video games today do not use SLI. Well, I beg to differ. I went to 4K gaming a couple of years ago and I play a few games like FFXIV and Doom and a few others. With a single 1070 Ti in FFXIV at 4K I pushed about 50 to 60 fps, but in SLI I pushed over 120 fps at 4K, so the game says it does not support it, but it does. SLI is always on. I have met many people who bought SLI and did not know how to configure it, so they never saw the benefit from it. Also, in SLI the game ran smoother, cleaner. Now maybe it cannot address all the memory, but it still can use both GPUs, which increases the smoothness.
> 
> When I ran 1080p gaming I ran two 660 Tis in SLI and everything was sweet. But 4K? Nope, could not handle the load.


Yes, you increased smoothness by increasing fps, but you also increased the latency of every one of those frames by a factor of 2 or even 3. Fermi was the last generation where SLI was done right; since then it's all been dead :x
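The factor-of-2-or-3 claim can be sketched with simple arithmetic, assuming alternate-frame rendering keeps two to three frames in flight between input sampling and display (the fps figures below are made up for illustration, not measurements):

```python
def latency_ms(fps, frames_in_flight):
    """Rough input-to-display latency if `frames_in_flight` frames sit
    between sampling your input and the result reaching the screen."""
    return frames_in_flight * 1000.0 / fps

# Single GPU at 60 fps, one frame in flight:
single = latency_ms(60, 1)     # ~16.7 ms
# AFR SLI at 110 fps, but 2-3 frames in flight (one per GPU plus queue):
afr_low = latency_ms(110, 2)   # ~18.2 ms
afr_high = latency_ms(110, 3)  # ~27.3 ms

print(f"{single:.1f} {afr_low:.1f} {afr_high:.1f}")
```

So even when fps nearly doubles, each displayed frame can be as stale as or staler than on a single GPU - the "smooth but laggy" feel SLI users describe.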


----------



## etayorius (Sep 2, 2020)

I paid $370 for my GTX 1070 in 2017. $500 for a 3070 does not seem very fair to me. Next gen the RTX 4070 will be $600.


----------



## Chrispy_ (Sep 2, 2020)

efikkan said:


> I think you are putting too much faith in game developers. Most of them just take an off-the-shelf game engine, load in some assets, do some scripting and call it a game. Most game studios don't do a single line of low-level engine code, and the extent of their "optimizations" are limited to adjusting assets to reach a desired frame rate.
> change all of a sudden?


I graduated alongside, lived with, and stay in touch with multiple game developers from Campo Santo (now Valve), Splash Damage, Jagex, Blizzard, King, Ubisoft, and by proxy EA and Activision; I think they'd all be insulted by your statement. More importantly, even if there is a grain of truth to what you say, the "off-the-shelf engines" have been slowly but surely migrating to console-optimised engines over the last few years.



efikkan said:


> Not really. The difference between a "standard" 500 MB/s SSD and a 3 GB/s SSD will be loading times. For resource streaming, 500 MB/s is plenty.
> Also, don't forget that these "cheap" NVMe QLC SSDs can't deliver 3 GB/s sustained, so if a game truely depended on this, you would need a SLC SSD or Optane.


You're overanalyzing this. I said 3GB/s simply because that's a commonly-accepted read speed of a typical NVMe drive. Also, even the worst PCIe 3.0 x4 drives read at about 3GB/s sustained, no matter whether they're QLC or MLC. The performance difference between QLC and MLC is only really apparent in sustained write speeds.



efikkan said:


> Games in general isn't particularly good at utilizing the hardware we have currently, and the trend in game development has clearly been less performance optimization, so what makes you think this will change all of a sudden?


My point was exactly that. Perhaps English isn't your first language, but when I said "_the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator_" - that was me saying that it ISN'T going to change suddenly, and it's been like this for 25 years without changing. That's exactly why games aren't particularly good at utilising the hardware we have currently: the devs need to make sure it'll run on a dual-core with 4GB RAM and a 2GB graphics card from 9 years ago.


----------



## medi01 (Sep 2, 2020)

efikkan said:


> The only thing missing from the Steam hardware survey is people who buy graphics cards and don't game.


I used to buy graphics cards, game, and not be on Steam.
Pretty sure most people who are into Blizzard games do not use Steam.
That doesn't explain why WoW players would necessarily skip NV... and this was when AMD bothered to explain what was going on.
Their main argument was their absence from the internet cafe business, which was skewing the figures a lot (each user that logged in was counted separately).
Steam fixed it somewhat, but not entirely to AMD's liking (in AMD's words), brushing it off as Valve not really caring about how representative that survey is. (yikes)

Mindfactory is a major PC parts online shop in Germany, and it shows the buying habits of the respective DIY demographic in Germany. I don't see why that is not relevant.



Valantar said:


> All that graph says is that "Ampere" (likely the GA104 die, unknown core count and memory configuration) at ~140W could match the 2080 Ti/TU102 at ~270W. Which might very well be true, but we'll never know outside of people undervolting and underclocking their GPUs, as Nvidia is never going to release a GPU based on this chip at that power level (unless they go entirely insane on mobile, I guess).


This makes the statement fairly useless, whereas AMD's perf/W claim (+50% in RDNA2) reflects practical reality, at least in TPU reviews.


----------



## Valantar (Sep 2, 2020)

BoboOOZ said:


> I'm not shifting perspectives, you're probably overanalysing my (maybe too short) messages.


Sorry, but no. You started out by arguing from the viewpoint of gamers needing more VRAM - i.e. basing your argument in customer needs. Regardless of your intentions, shifting the basis of the argument to the viewpoint of the company is a dramatic shift that introduces conflicting interests to your argumentation, which you need to address.


BoboOOZ said:


> The whole point should be considered only from the viewpoint of the company, in a more or less competitive market.


Again, I have to disagree. I don't give a rodent's behind about the viewpoint of Nvidia. They provide a service to me as a (potential) customer: providing compelling products. They, however, are in it for the profit, and often make choices in product segmentation, pricing, featuresets, etc. that are clearly aimed at increasing profits rather than providing benefits to the customer. There are of course relevant arguments to be presented in terms of whether what customers may need/want/wish for is feasible in various ways (technologically, economically, etc.), but that is as much of the viewpoint of the company as should be taken into account here. Adopting an Nvidia-internal perspective on this is meaningless for anyone who doesn't work for Nvidia, and IMO even meaningless for them unless that person is in a decision-making position when it comes to these questions.


BoboOOZ said:


> I'm pretty certain that in a year or two there will be more games requiring more than 10 GB of VRAM in certain situations, but I think if AMD comes out with a competitive option against the 3080 (with a more reasonable amount of memory), reviews will point out this problem, let's say, in the next 5 months. If this happens, Nvidia will have to react to remain competitive (they are very good at this).


There will definitely be games requiring more VRAM over time. But more than 10GB? Again, I have my doubts. Sure, there will always be outliers, and there will always be games that take pride in being extremely graphically intensive. There will also always be settings one can enable that consume massive amounts of VRAM if desired, mostly with negligible (if at all noticeable) impacts on graphical quality. But beyond that, the introduction of DirectStorage for Windows, and alongside it the _very_ likely beginning of SSDs being a requirement for most major games in the future, will directly serve to decrease VRAM needs. Sure, new things can be introduced to take up the space freed up by not prematurely streaming in assets that never get used, but the chance of those new features taking up all that was freed up plus a few GB more is very, very slim. Of course not every game will use DirectStorage, but every cross-platform title launching on the XSX will at least have it as an option - and removing it might necessitate rearchitecting the entire structure of the game (adding loading screens, corridors, etc.), so it's not something that can be removed easily.


BoboOOZ said:


> No SLI on the 3080? Anyways, I will repeat myself, Nvidia will do this only if they have to, and AMD beats the 3080.


SLI? That's a gaming feature. And you don't even need SLI for gaming with DX12 multi-adapter and the like. Compute workloads do not care one iota about SLI support. NVLink does have some utility if you're teaming up the GPU to work as one, but it's just as likely (for example in huge database workloads, which can consume _massive_ amounts of memory) that each GPU can do the same task in parallel, working on different parts of the dataset, in which case PCIe handles all the communication needed. The same goes for things like rendering.


BoboOOZ said:


> Do you mean to say that the increase from 8 to 10 GB is proportional to the compute and bandwidth gap between the 2080 and the 3080? It's rather obvious that it's not. On the contrary, if you look at the proportions, the 3080 is the outlier of the lineup: it has 2x the memory bandwidth of the 3070 but only 25% more VRAM.


...and? Increasing the amount of VRAM to 20GB won't change the bandwidth whatsoever, as the bus width is fixed. For that to change they would have to add memory channels, which we know there are two more of on the die, so that's possible, but then you're talking either 11/22GB or 12/24GB - the latter of which is where the 3090 lives. The other option is of course to use faster rated memory, but the chances of Nvidia introducing a new SKU with twice the memory _and_ faster memory is essentially zero at least until this memory becomes dramatically cheaper and more widespread. As for the change in memory amount between the 2080 and the 3080, I think it's perfectly reasonable, both because the amount of memory isn't directly tied to feeding the GPU (it just needs to be enough; more than that is useless) but bandwidth is (which has seen a notable increase), and because - once again - 10GB is likely to be plenty for the vast majority of games for the foreseeable future.


BoboOOZ said:


> Older games have textures optimized for viewing at 1080p. This gen is about 4K gaming being really possible, so we'll see more detailed textures.
> They will be leveraged on the consoles via streaming from the SSD, and on PCs via increased RAM/VRAM usage.


The entire point of DirectStorage, which Nvidia made a massive point out of supporting with the 3000-series, is precisely to handle this in the same way as on consoles. So that statement is fundamentally false. If a game uses DirectStorage on the XSX, it will also do so on W10 as long as the system has the required components. Which any 3000-series-equipped system will have. Which will, once again, _reduce_ VRAM usage.



medi01 said:


> First, this makes the statement fairly useless, whereas AMD's perf/w claim (+50% in RDNA2) is reflecting practical reality at least in TPU reviews.


It absolutely makes the statement useless. That's how marketing works (at least in an extremely simplified and partially naive view): you pick the best aspects of your product and promote them. Analysis of such statements very often shows them to be meaningless when viewed in the most relevant context. That doesn't make the statement _false_ - Nvidia _could_ likely make an Ampere GPU delivering +90% perf/W over Turing if they wanted to - but it makes it misleading, given that it doesn't match the in-use reality of the products that are actually made. I also really don't see how the +50% perf/W for RDNA 2 claim can be reflected in any reviews yet, given that no reviews of any RDNA 2 product exist yet (which is natural, seeing how no RDNA 2 products exist either).


----------



## medi01 (Sep 2, 2020)

Valantar said:


> I also really don't see how the +50% perf/W for RDNA 2


I actually meant RDNA1.


----------



## R0H1T (Sep 2, 2020)

Valantar said:


> and removing it might necessitate rearchitecting the *entire structure of the game* (adding loading screens, corridors, etc.), so it's not something that can be removed easily.


Not sure how accurate that is; there are rumors of a cheap Xbox following (accompanying?) the regular one's release, and that one sure as hell isn't going to use just as fast an SSD.


----------



## mouacyk (Sep 2, 2020)

To the people who don't like the high stock TDPs: you can thank the overclockers who went to great lengths to circumvent Nvidia's TDP lockdown on Pascal and Turing. Nvidia figured that if people would go to extreme lengths to shunt-mod flagship GPUs to draw power in excess of 400W, why not push a measly 350W and look good in performance at the same time?


----------



## FeelinFroggy (Sep 2, 2020)

The pricing and CUDA core count is definitely a surprise. While competition from AMD's RDNA is a driver for the leap, I think the next-gen console release is what is pushing this performance jump and price decrease. I think Nvidia fears that PC gaming is getting too expensive and the next-gen consoles may take away some market share if prices can't be lowered.

Plus, it is apparent that Nvidia has been sandbagging since Pascal, as AMD just had nothing to compete with.


----------



## Valantar (Sep 2, 2020)

R0H1T said:


> Not sure how accurate that is; there are rumors of a cheap Xbox following (accompanying?) the regular one's release, and that one sure as hell isn't going to use just as fast an SSD.


Actually it is _guaranteed_ to use that. The XSX uses a relatively cheap ~2.4GB/s SSD. The cheaper one might cut the capacity in half, but it won't move away from NVMe. The main savings will come from less RAM (lots of savings), a smaller SoC (lots of savings) and accompanying cuts in the PSU, VRM, cooling, likely lack of an optical drive, etc. (also lots of savings when combined). The NVMe storage is such a fundamental part of the way games are built for these consoles that you can't even run the games off slower external storage, so how would that work with a slower drive internally?


----------



## Shatun_Bear (Sep 2, 2020)

DuxCro said:


> According to RTX 2080 review on Guru3D, it achieves 45 fps average in Shadow of the Tomb Raider in 4K and same settings as Digital Foundry used in their video. But they achieve around 60fps. Which is  only 33% more.  But they claim avg fps is 80% higher. You can see fps counter in left top corner with Tomb Raider. Vsync was on in captured footage? If there was 80% increase in performance. Avg fps should be around 80.  RTX 2080Ti fps on same cpu DF was using should be over 60 in Shadow of TR. So RTX 3080 is  Just around 30% faster than 2080Ti.  So the new TOP of the line Nvidia gaming  GPU is just 30% faster than previous TOP of the line GPU. When you look at it like that, i really don't see any special jump in performance.  RTX 3090 is for professionals and i don't even count it in at that price.



I can't wait to see the REAL performance increase by reputable sites (Digital Foundry is not reputable; this was a paid marketing deal for Nvidia) of a 3080 vs a 2080 or 2080 Ti, without the cherry-picking, marketing fiddling of figures and nebulous tweaking of settings (RT, DLSS, vsync, etc.).

Before it's even been reliably benchmarked it's being proclaimed as the greatest thing ever. But I've been in this game long enough to know that the figures sans RT are not nearly as impressive as is being touted by Nvidia's world-class marketing and underhanded settings fiddling.


----------



## R0H1T (Sep 2, 2020)

It will be NVMe storage, that's a given ~ but the same one, or more specifically one with the same storage speed & dedicated hardware compression? I'm not sure about that, though tbf it depends on how low they will *price* the _lesser_ variant.

*Storage Matters: Why Xbox and Playstation SSDs Usher In A New Era of Gaming*


----------



## xorbe (Sep 2, 2020)

I just can't imagine having a 350W card in my system.  250W is pretty warm.  I felt that 180W blower was a sweet spot.  All of these cards should have at least 16GB imho.


----------



## Shatun_Bear (Sep 2, 2020)

The Series X SSD is not exactly fast or advanced. The PS5's, ok, that is impressive.

So the budget Series S will surely have the same SSD as the X for reasons mentioned above. If it doesn't, MS have created even more problems for themselves with game development.


----------



## BluesFanUK (Sep 2, 2020)

Awaiting reviews before I take any real interest in this. Nvidia have a history of bullshitting.


----------



## Makaveli (Sep 2, 2020)

etayorius said:


> I paid $370 for my GTX 1070 in 2017. $500 for a 3070 does not seem very fair to me. Next gen the RTX 4070 will be $600.



Since when did NV care about what is fair? They will sell at what the market will bear.



Shatun_Bear said:


> The Series X SSD is not exactly fast or advanced. The PS5's, ok, that is impressive.
> 
> So the budget Series S will surely have the same SSD as the X for reasons mentioned above. If it doesn't, MS have created even more problems for themselves with game development.



There is a reason the Direct Storage API was created.


----------



## BoboOOZ (Sep 2, 2020)

Valantar said:


> Sorry, but no. You started out by arguing from the viewpoint of gamers needing more VRAM - i.e. basing your argument in customer needs. Regardless of your intentions, shifting the basis of the argument to the viewpoint of the company is a dramatic shift that introduces conflicting interests to your argumentation, which you need to address.
> 
> Again, I have to disagree. I don't give a rodent's behind about the viewpoint of Nvidia. They provide a service to me as a (potential) customer: providing compelling products. They, however, are in it for the profit, and often make choices in product segmentation, pricing, featuresets, etc. that are clearly aimed at increasing profits rather than providing benefits to the customer. There are of course relevant arguments to be presented in terms of whether what customers may need/want/wish for is feasible in various ways (technologically, economically, etc.), but that is as much of the viewpoint of the company as should be taken into account here. Adopting an Nvidia-internal perspective on this is meaningless for anyone who doesn't work for Nvidia, and IMO even meaningless for them unless that person is in a decision-making position when it comes to these questions.
> 
> ...


My dude, you spend too long arguing, too little understanding. I'm going to cut this discussion a little short because I don't like discussions that don't go anywhere, no disrespect intended. The 3080 already has loads of bandwidth; all it's lacking is memory size.

If you don't believe me, plot an x-y graph with memory bandwidth x FP32 perf as x and memory size as y. Plot the 780, 980, 1080, 2080, 3080, 3070 and 3090 on it and you'll see if there are any outliers. Or we'll just agree to disagree.
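For what it's worth, the suggested comparison can be roughed out in a few lines. The spec figures below are approximate launch numbers from memory (bandwidth in GB/s, FP32 in TFLOPS, VRAM in GB) and should be checked against real spec sheets before drawing firm conclusions:

```python
# Approximate launch specs: (bandwidth GB/s, FP32 TFLOPS, VRAM GB).
cards = {
    "GTX 780":  (288, 4.0,  3),
    "GTX 980":  (224, 4.6,  4),
    "GTX 1080": (320, 8.9,  8),
    "RTX 2080": (448, 10.1, 8),
    "RTX 3070": (448, 20.3, 8),
    "RTX 3080": (760, 29.8, 10),
    "RTX 3090": (936, 35.6, 24),
}

# One crude way to look for an outlier: GB of VRAM per GB/s of bandwidth.
vram_per_gbps = {name: vram / bw for name, (bw, fp32, vram) in cards.items()}

for name, ratio in sorted(vram_per_gbps.items(), key=lambda kv: kv[1]):
    print(f"{name}: {ratio:.4f} GB per GB/s")
```

On these numbers the 3080 does carry noticeably less VRAM per unit of bandwidth than the 2080 or 3070 - though whether that makes it an "outlier" or just a different balance point is exactly what the two of you disagree on.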


----------



## tehehe (Sep 2, 2020)

etayorius said:


> I paid $370 for my GTX 1070 in 2017. $500 for a 3070 does not seem very fair to me. Next gen the RTX 4070 will be $600.


I agree. People thinking these prices are low are bonkers. We don't have enough competition in the GPU space. $500 for an 8GB card in 2020. Are they joking? It will be obsolete fast.


----------



## ppn (Sep 2, 2020)

tehehe said:


> I agree. People thinking these prices are low are bonkers. We don't have enough competition in the GPU space. $500 for an 8GB card in 2020. Are they joking? It will be obsolete fast.



Yeah, 1070 +35% = 2070, +45% = 3070; this thing should be at least 95% faster than a 1070, and VRAM remains the same 8GB.

But this is the gimped chip, in order to protect 2080 Ti users who got gutted by the 60% price cut, 1199 to 499. 11GB is all they have left, and not for long; we should get the 6144-CUDA 16GB version at some point, only for $599.

8GB should be fine at low-detail e-sports for the next 4 years. I get unplayable frame rates below 8GB, and even 45% won't help with the framerate.


----------



## efikkan (Sep 2, 2020)

Chrispy_ said:


> I graduated alongside, lived with, and stay in touch with multiple game developers from Campos Santos (now Valve), Splash Damage, Jagex, Blizzard, King, Ubisoft, and by proxy EA, and Activision; I think they'd all be insulted by your statement. More importantly, even if there is a grain of truth to what you say, the "off-the-shelf engines" …


Most studios don't make their own game engine in-house anymore, unfortunately. That's not an insult, but a fact. There has been a clear trend of fewer studios making their own engines for years, and the lack of performance optimizations and buggy/broken games at launch are the results. There are some studios, like id Software, which still do quality work.

We are talking a lot about new hardware features and new APIs in this forum, yet the adoption of such features in games is very slow. Many have been wondering why we haven't seen the revolutionary performance gains we were promised with DirectX 12. Well, the reality is that for generic engines the low-level rendering code is hidden behind layers upon layers of abstractions, so those are not going to get the full potential.



Chrispy_ said:


> … have been slowly but surely migrating to console-optimised engines over the last few years.


"Console optimization" is a myth.
In order to optimize code, low-level code must be written to target specfic instructions, API features or performance characteristics.
When people are claiming games are "console optimized", they are usually referring to them not being scalable, so it's rather lack of optimization if anything.



Chrispy_ said:


> My point was exactly that. Perhaps English isn't your first language but when I said "the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator" - that was me saying that it ISN'T going to change suddenly, and it's been like this for 25 years without changing. That's exactly why games aren't particularly good at utilising the hardware we have currently, because the devs need to make sure it'll run on a dual-core with 4GB RAM and a 2GB graphics card from 9 years ago.


Games today are usually not intentionally catering to the lowest common denominator; it's more a result of the engine they have chosen, especially if they don't make one in-house. If support for 10-year-old PCs were a priority, we would see more games with support for older Windows versions etc.


----------



## Valantar (Sep 2, 2020)

Shatun_Bear said:


> I can't wait to see the REAL performance increase by reputable sites (Digital Foundry is not reputable; this was a paid marketing deal for Nvidia) of a 3080 vs a 2080 or 2080 Ti, without the cherry-picking, marketing fiddling of figures and nebulous tweaking of settings (RT, DLSS, vsync, etc.).
> 
> Before it's even been reliably benchmarked it's being proclaimed as the greatest thing ever. But I've been in this game long enough to know that the figures sans RT are not nearly as impressive as is being touted by Nvidia's world-class marketing and underhanded settings fiddling.


Just a technicality: there's a big difference between closely regulated exclusive access to hardware and paid marketing. Is it a marketing plot by Nvidia? Absolutely. Does it undermine DF's credibility whatsoever? No. Why? Because they are completely transparent about the process, the limitations involved, and how the data is presented. Their conclusion is also "we should all wait for reviews, but this looks very good for now":



> It's early days with RTX 3080 testing. In terms of addressing the claims of the biggest generational leap Nvidia has ever delivered, I think the reviews process with the mass of data from multiple outlets testing a much wider range of titles is going to be the ultimate test for validating that claim. That said, some of the numbers I saw in my tests were quite extraordinary and on a more general level, the role of DLSS in accelerating RT titles can't be understated.



That there? That's nuance. (Something that is sorely lacking in your post.) They are making very, very clear that this is a preliminary hands-on, in no way an exhaustive review, and that there were massive limitations on which games they could test, how they could run the tests, which data they could present from these tests, and how it could be presented. There is also no disclosure of this being paid content, which they are required by law to provide if it is. So no, this is not a "paid marketing deal". It's an exclusive preview. Learn the difference.


----------



## r9 (Sep 2, 2020)

I just hope this new gen brings the prices down on used cards, because the prices for used GPUs are nuts.
People asking new-card money for their used crap.
Hopefully AMD has something competitive this time around and has the same effect on Nvidia as it had on Intel.
Because after many, many years I see better value in an Intel i7 10700 than in any Ryzen.


----------



## dir_d (Sep 2, 2020)

Shatun_Bear said:


> The Series X SSD is not exactly fast or advanced. The PS5's, ok, that is impressive.
> 
> So the budget Series S will surely have the same SSD as the X for reasons mentioned above. If it doesn't, MS have created even more problems for themselves with game development.


MS didn't have to overtune the SSD because they created a whole new API, DirectStorage. I think the PS5 and the Xbox will have about the same effective speed.


----------



## Shatun_Bear (Sep 3, 2020)

dir_d said:


> MS didn't have to overtune the SSD because they created a whole new API, DirectStorage. I think the PS5 and the Xbox will have about the same effective speed.



No, the PS5 SSD is literally TWICE as fast, and its IO is apparently significantly more advanced; there's no chance they are similar in performance.



Valantar said:


> Just a technicality: there's a big difference between closely regulated exclusive access to hardware and paid marketing. Is it a marketing plot by Nvidia? Absolutely. Does it undermine DF' s credibility whatsoever? No. Why? Because they are completely transparent about the process, the limitations involved, and how the data is presented. Their conclusion is also "we should all wait for reviews, but this looks very good for now":
> 
> 
> 
> That there? That's nuance. (Something that is sorely lacking in your post.) They are making very, very clear that this is a preliminary hands-on, in no way an exhaustive review, and that there were mssive limitations to which games they could test, how they could run the tests, which data they could present from these tests, and how they could be presented. There is also no disclosure of this being paid content, which they are required by law to provide if it is. So no, this is not a "paid marketing deal". It's an exclusive preview. Learn the difference.



It's a paid marketing deal.


----------



## SkynetAI (Sep 3, 2020)

So I have a 750 W Bronze PSU. I know I need to upgrade, but to what wattage?


----------



## Xex360 (Sep 3, 2020)

SkynetAI said:


> So I have a 750 W Bronze PSU. I know I need to upgrade, but to what wattage?


Why? Gamers Nexus made a piece on how much power a rig actually uses.


----------



## SkynetAI (Sep 3, 2020)

lol, I figured I had to upgrade since the card is going to pull a lot of power. But I want the RTX 3090 when it drops.


----------



## ppn (Sep 3, 2020)

Be serious. You need a 1 kW Titanium-class PSU, then load it to 50-60%, where it runs at its highest efficiency (97%), whisper quiet or fanless: a 400 W GPU and a 200 W CPU. That's all; everything else in the system is irrelevant unless you have 10 HDDs or 10 fans or something preposterous.


----------



## Prima.Vera (Sep 3, 2020)

I love how they spelled "Hotel" with both Latin and Katakana characters


----------



## DuxCro (Sep 3, 2020)

You know... I'll be super pissed if I buy an RTX 3070 and then a few months later Nvidia announces a 3070 with 16 GB of VRAM. So I'll just wait for AMD to release their cards as well. They said RDNA 2 cards will be on the market before the new consoles.


----------



## nguyen (Sep 3, 2020)

DuxCro said:


> You know... I'll be super pissed if I buy an RTX 3070 and then a few months later Nvidia announces a 3070 with 16 GB of VRAM. So I'll just wait for AMD to release their cards as well. They said RDNA 2 cards will be on the market before the new consoles.



Might as well buy the RTX 3080 then; 1 GB of GDDR6 used to cost around $12, so adding another 8 GB to the 3070 would bring it too close to the 3080. Furthermore, AMD won't be able to undercut Ampere's pricing without hurting their financials, like when they released the Radeon VII.
Why everyone is so obsessed with VRAM is beyond me. With DLSS 2.0 you effectively use only as much VRAM as 1080p or 1440p requires when playing at 4K, and Nvidia is already touting 8K upscaled from 1440p with DLSS 2.0.
Yeah, DLSS just makes your GPU last much longer; even the 2060 6 GB can survive AAA games for quite a while thanks to DLSS.


----------



## Valantar (Sep 3, 2020)

Shatun_Bear said:


> It's a paid marketing deal.


You keep saying that, as if repeating it makes it true. Do you have _any_ basis on which to back up that claim? Any proof that DF is being paid by Nvidia? Any proof that this is an "advertorial" rather than strictly limited exclusive-access hands-on content? So far all you've served to demonstrate is that you seem unable to differentiate between similar but fundamentally different things.


----------



## DuxCro (Sep 3, 2020)

nguyen said:


> Might as well buy the RTX 3080 then; 1 GB of GDDR6 used to cost around $12, so adding another 8 GB to the 3070 would bring it too close to the 3080. Furthermore, AMD won't be able to undercut Ampere's pricing without hurting their financials, like when they released the Radeon VII.
> Why everyone is so obsessed with VRAM is beyond me. With DLSS 2.0 you effectively use only as much VRAM as 1080p or 1440p requires when playing at 4K, and Nvidia is already touting 8K upscaled from 1440p with DLSS 2.0.
> Yeah, DLSS just makes your GPU last much longer; even the 2060 6 GB can survive AAA games for quite a while thanks to DLSS.


And how many games use DLSS? A handful. The same goes for ray tracing. The more I think about Ampere, the more it looks like a garbage architecture. Nvidia had to increase the number of CUDA cores to ridiculous amounts, the TDP is 320 W for the 3080, and if you do the math, the card is only 30-35% faster than the 2080 Ti. RDNA 2 in the Xbox Series X is 12 TFLOPS and, according to Digital Foundry, it performs on the level of an RTX 2080 Super. So if AMD managed to achieve that level of performance in a console, a high-end, high-power-consumption graphics card could easily beat the RTX 2080 Ti by 30-40%. AMD surely won't cheap out on the amount of VRAM the way Nvidia does. Worst case scenario, I expect AMD to release their top card at $599, with performance between the RTX 3070 and 3080, 12 GB of VRAM, and a TDP below 300 W.


----------



## nguyen (Sep 3, 2020)

DuxCro said:


> And how many games use DLSS? A handful. The same goes for ray tracing. The more I think about Ampere, the more it looks like a garbage architecture. Nvidia had to increase the number of CUDA cores to ridiculous amounts, the TDP is 320 W for the 3080, and if you do the math, the card is only 30-35% faster than the 2080 Ti. RDNA 2 in the Xbox Series X is 12 TFLOPS and, according to Digital Foundry, it performs on the level of an RTX 2080 Super. So if AMD managed to achieve that level of performance in a console, a high-end, high-power-consumption graphics card could easily beat the RTX 2080 Ti by 30-40%. AMD surely won't cheap out on the amount of VRAM the way Nvidia does. Worst case scenario, I expect AMD to release their top card at $599, with performance between the RTX 3070 and 3080, 12 GB of VRAM, and a TDP below 300 W.



So you are looking for future-proofing with extra VRAM but disregard DLSS as future-proofing because it "currently" isn't supported by many games? What kind of red-tinted logic is that?
How many games currently need more than 8 GB of VRAM at 4K? One or two; one is a flight sim and the other a poorly ported game.
How many DLSS 2.0 games are there at the moment? More than seven, with more to come in the very near future, all of them AAA games.

Yeah, Ampere doesn't impress me from an efficiency-gain standpoint, but the Samsung 8N node just scales so well with power that capping the TDP would waste its potential; I'm sure many 5700 XT owners would understand this argument.




From the look of this, I bet pushing even more power would net a meaningful performance gain, but it would just be ridiculous. But hey, at least you get a free heater in the winter.
And if you cap the FPS, which I think gives a better gaming experience than letting the FPS run wild, the 3080 would match the 2080 Ti's performance at half the TGP, which is 160 W.


----------



## Bytales (Sep 3, 2020)

Is the RTX 3090 the full chip, or does it have some parts/shaders disabled?
If it does, Nvidia is basically selling a broken chip for 1,500 EUR.


----------



## efikkan (Sep 3, 2020)

DuxCro said:


> The more i think about Ampere. the more it looks like garbage architecture.


So Nvidia just announced their largest architectural improvement in ages, and you think it looks like garbage. You're trolling, right? By your standards, what is _not_ a garbage architecture then?



DuxCro said:


> Nvidia had to increase number of cuda cores to  ridiculous amounts, TDP is 320W for 3080, and if you do the math, the card is only 30-35% faster than 2080Ti.


I would like to see that math.
Is it the same kind of math that predicted Turing would perform at most 10% better than Pascal?



DuxCro said:


> AMd surely won't cheap out on amount of VRAM as Nvidia does.  Worst case scenario, i expect AMD to release their top card for $599, performance between RTX 3070 and 3080, 12GB of VRAM and TDP <  300W.


It's funny that you would know that. I honestly don't know precisely where it will end up, and I'm not sure AMD even knows yet.

Historically, AMD has added extra VRAM to lure buyers into thinking a product is more future-proof, especially when the competition is more attractive.


----------



## Bytales (Sep 3, 2020)

It might very well be garbage if they're selling a busted chip (3090) for 1,500 euros.


----------



## BoboOOZ (Sep 3, 2020)

Bytales said:


> It might very well be garbage if they're selling a busted chip (3090) for 1,500 euros.


Disabling parts of most chips is a standard industry way of improving yields, and all manufacturers have been using it for a long time.
All that matters is that the GPU gives you the amount of shaders, memory, etc. that is written on the box; the rest is irrelevant.


----------



## nguyen (Sep 3, 2020)

More VRAM ≠ more future-proof; otherwise the Radeon VII should have beaten every GPU here.


----------



## Valantar (Sep 3, 2020)

Bytales said:


> Is the RTX 3090 the full chip, or does it have some parts/shaders disabled?
> If it does, Nvidia is basically selling a broken chip for 1,500 EUR.


What on earth are you going on about? This is the silliest thing I've seen in all of these Ampere threads, and that is saying something. Cut-down chips are in no way whatsoever _broken_. The 2080 Ti was a cut-down chip. Did you raise the same point about that? Is the Radeon 5700 "broken", or the 5600 XT? Disabling defective parts of chips to improve yields is _entirely_ unproblematic, has been the standard mode of operation for chipmakers for many years, and is a godsend for those of us who value reasonable component prices, as without it they would likely double.


----------



## TheoneandonlyMrK (Sep 3, 2020)

nguyen said:


> More VRAM ≠ more future-proof; otherwise the Radeon VII should have beaten every GPU here.


Your answer to a misconception is no more true than his misconception.
More VRAM can extend a card's useful life, not its performance per se, though marginally that can be the case.
The RVII will remain viable for quite a while yet.


----------



## Bytales (Sep 3, 2020)

You are missing my point here. I understand your point of view about disabling chips.
I would have raised the same complaint about the 2080 Ti, if I were interested in it.
Nvidia designed the chip the way they did, and when they built it, parts of it didn't work out. So basically it's busted silicon, an imperfect chip they are selling for 1,500 euros. If I hand over 1,500 euros, at least give me the full working chip as it was designed.
I paid 1,300 euros for a liquid-cooled Vega Frontier Edition, but I got a full Vega chip as intended, with fully working shaders, paired with 16 GB of HBM memory, and water cooling.
The 3090 is a "garbage chip" they sell to recoup some money, while they keep the fully working chip and sell it for 5,000 euros in a Quadro or Titan card.
Compared to that, the full-chip Vega I paid 1,300 euros for is a good deal.

My point is I want the full chip they designed, not an imperfect version sold for a lot of money.
While you and/or others might see a good deal in buying a 3090 "busted" chip for 1,500 euros, I do not.

And you are mistaken, my friend. Cut-down chips are indeed broken. A part of them couldn't be built as intended, hence broken, so they scrapped that and disabled it. It's wasted chip space that could have held more transistors, making the whole chip better overall.

I don't find it funny to pay 1,500 EUR for a "broken chip".

And the fact is, we don't even know if the 3090 is the full chip or not. I have asked about it, but haven't gotten any answer until now. You seem to believe that the 3090 is indeed not the full chip?


----------



## Fluffmeister (Sep 3, 2020)

nguyen said:


> More VRAM ≠ more future-proof; otherwise the Radeon VII should have beaten every GPU here.



Yeah, I mean, does 8 GB on an RX 570 make it magically "future proof"? I guess for some it does.


----------



## efikkan (Sep 3, 2020)

theoneandonlymrk said:


> Your answer to a misconception is no more true than his misconception.
> More VRAM can extend a card's useful life, not its performance per se, though marginally that can be the case.
> The RVII will remain viable for quite a while yet.


If anything, it shows that VRAM is not a bottleneck.

But your claim that more VRAM can extend a card's useful life has been used as an argument for years and has yet to pan out. The reality is that unless the usage pattern in games changes significantly, you're not going to get any future-proofing out of it: in order to use more VRAM for higher details, you also need more bandwidth to deliver that data and more computational power to utilize it.

More VRAM mostly makes sense for various pro/semi-pro uses. Which is why I've said I prefer extra VRAM to be an option for AIBs instead of mandated.


----------



## TheoneandonlyMrK (Sep 3, 2020)

Fluffmeister said:


> Yeah, I mean, does 8 GB on an RX 570 make it magically "future proof"? I guess for some it does.


So five-plus years of acceptable mainstream 1080p gaming is not proof of that to you? There's no point debating it here, it's not on topic, but I disagree with you.
@efikkan see Polaris; you're wrong, pal.
We'll just have to disagree. You can retort, but this is off topic, so I won't be continuing it in this thread.


----------



## Fluffmeister (Sep 3, 2020)

Valantar said:


> What on earth are you going on about? This is the silliest thing I've seen in all of these Ampere threads, and that is saying something. Cut-down chips are in no way whatsoever _broken_. The 2080 Ti was a cut-down chip. Did you raise the same point about that? Is the Radeon 5700 "broken", or the 5600 XT? Disabling defective parts of chips to improve yields is _entirely_ unproblematic, has been the standard mode of operation for chipmakers for many years, and is a godsend for those of us who value reasonable component prices, as without it they would likely double.



I for one wouldn't mind if my cut down garbage GPU performed like a 3090.


----------



## Melvis (Sep 3, 2020)

If the performance claims are true, then I'm more excited about the "2660 Ti", if that's what they would call it?


----------



## Valantar (Sep 3, 2020)

Bytales said:


> You are missing my point here. I understand your point of view about disabling chips.
> I would have raised the same complaint about the 2080 Ti, if I were interested in it.
> Nvidia designed the chip the way they did, and when they built it, parts of it didn't work out. So basically it's busted silicon, an imperfect chip they are selling for 1,500 euros. If I hand over 1,500 euros, at least give me the full working chip as it was designed.
> I paid 1,300 euros for a liquid-cooled Vega Frontier Edition, but I got a full Vega chip as intended, with fully working shaders, paired with 16 GB of HBM memory, and water cooling.
> ...


This screed is beyond ridiculous.

Firstly, it clearly demonstrates that you don't understand how defects in silicon happen. They are an unavoidable consequence of manufacturing - there will _always_ be defects. There is nothing Nvidia can do to alleviate this. Sure, designing bigger chips increases the likelihood of them having some defect or other, but calling them "broken" is ridiculous. Are the i7-10700 or the Ryzen 7 3700X "broken" just because they couldn't reach the power/frequency bin required to be designated as 10700K or 3800X? Obviously not. And just like defects, differing clock and power behaviours of different chips across the same wafer is ultimately random. It can be controlled for somewhat and mitigated, but never removed entirely.

Secondly, unless having the full chip delivers noticeably more performance, what value does it have to you if your chip is fully enabled? The Vega 64 Water Cooled edition barely outperformed the V64 or the V56, yet cost 2x more. That is just silly. I mean, you're welcome to your delusions, but don't go pushing them on others here. Cut-down chips are a fantastic way of getting better yields out of intrinsically imperfect manufacturing methods, allowing for far better products to reach end users.

Thirdly, many (if not most!) cut-down chips aren't even defective, they are cut down to provide silicon for faster-selling lower-tier SKUs. On a mature process even with big dice the defect rate is tiny, meaning that after a while, most cut-down chips will be made from fully working silicon simply because there is no more defective silicon to use. Sure, some will always be defective, as I said above, but mostly this comes down to product segmentation.


----------



## DuxCro (Sep 3, 2020)

efikkan said:


> So Nvidia just announced their largest architectural improvement in ages, and you think it looks like garbage. You're trolling, right? By your standards, what is _not_ a garbage architecture then?
> 
> 
> I would like to see that math.
> Is it the same kind of math that predicted Turing to perfor at most 10% better than Pascal?







LOL. And you actually believe what Jensen said in that reveal? He also said two years ago that the RTX 2080 Ti gives 50% better performance than the 1080 Ti. In reality, it turned out to be more like 10-15%. So it's not hard to beat your previous (Turing) architecture when that brought minimal improvements to performance, RTX excluded of course. The RTX 2080 achieves around 45 fps in Shadow of the Tomb Raider at 4K with the same settings Digital Foundry used in their video. They say the RTX 3080 is 80% faster, so simple math says that is around 80 fps. The RTX 2080 Ti achieves around 60 fps (source: Guru3D), so the RTX 2080 Ti to 3080 performance difference is 30-35% based on Tomb Raider. RTX 2080 Ti: 12 nm manufacturing process, 250 W TDP, 13.5 TFLOPS. RTX 3080: 8 nm manufacturing process, 320 W TDP, 30 TFLOPS. And only a 30-35% performance difference from this power-hungry GPU with double the number of CUDA cores. Yes, I would call Ampere catastrophically ineffective garbage.
Also, where is Jensen pulling that data about Ampere having 1.9 times the performance per watt of Turing? It would mean that an RTX 3080 at 250 W TDP should have 90% higher performance than the RTX 2080 Ti. But no: it has a 320 W TDP and 30-35% higher performance. So the power efficiency of Ampere is only minimally better watt for watt. The claim about 1.9X efficiency is a straight-out lie.

I wonder if Jensen believes the bullshit coming out of his mouth?


----------



## ppn (Sep 3, 2020)

Full chips should be 5376 shaders on a 384-bit bus and 3584 shaders on a 256-bit bus; in reality those shaders are some sort of hyperthreading, with dual shader operations per clock.

We should evaluate performance per transistor: 28,000 million vs. 18,600 million should result in about 50% better performance, provided the memory bandwidth also grows by 50%.
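The transistor-count estimate above works out as a quick sanity check. A minimal sketch, assuming (as the post does) that performance scales linearly with transistor count, which is a rough heuristic rather than a measurement:

```python
# Rough scaling heuristic: assume performance tracks transistor count.
GA102_MTRANS = 28_000   # GA102 transistor count in millions, per the post
TU102_MTRANS = 18_600   # TU102 transistor count in millions, per the post

ratio = GA102_MTRANS / TU102_MTRANS
print(f"Expected uplift if perf scaled with transistors: {ratio - 1:.0%}")
```

In practice the uplift also depends on clocks, memory bandwidth, and how well the doubled FP32 datapaths are fed, so this is only an upper-bound style estimate.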


----------



## efikkan (Sep 3, 2020)

Valantar said:


> Secondly, unless having the full chip delivers noticeably more performance, what value does it have to you if your chip is fully enabled? The Vega 64 Water Cooled edition barely outperformed the V64 or the V56, yet cost 2x more. That is just silly. I mean, you're welcome to your delusions, but don't go pushing them on others here. Cut-down chips are a fantastic way of getting better yields out of intrinsically imperfect manufacturing methods, allowing for far better products to reach end users.


If I may add, cut down chips may actually even be "better" in some ways. Like GTX 970 achieved higher performance per TFlop than GTX 980, because it effectively had more scheduling resources and cache per core. I believe we saw something similar with Vega56 vs. Vega64 too, where Vega56 did better than it "should" overclocked vs. its bigger brother.



DuxCro said:


> LOL. And you actually believe what Jensen said in that reveal?


You are the one challenging the performance figures, so you are the one who has to prove it. Or should I just take your word for it?



DuxCro said:


> Yes, I would call Ampere catastrophically ineffective garbage.


You are just going to keep digging yourself deeper, aren't you? 
Here is another shovel for you;
So if Ampere is catastrophically ineffective, what does that make Polaris, Vega, Vega 20 and Navi then?


----------



## DuxCro (Sep 3, 2020)

efikkan said:


> If I may add, cut down chips may actually even be "better" in some ways. Like GTX 970 achieved higher performance per TFlop than GTX 980, because it effectively had more scheduling resources and cache per core. I believe we saw something similar with Vega56 vs. Vega64 too, where Vega56 did better than it "should" overclocked vs. its bigger brother.
> 
> 
> You are the one challenging the performance figures, you are the one to prove it. Or should I just take your word for it?
> ...


Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that historically Nvidia is full of shit, and Intel and AMD as well. I used simple math from available data. Independent reviews will tell the truth, as always. But fanboys are going to believe what they want to believe, and buy anything at any price. Trying to use logic and real-life data with fanboys is like trying to have a logical conversation with a religious fanatic.


----------



## Valantar (Sep 3, 2020)

DuxCro said:


> Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that historically Nvidia is full of shit, and Intel and AMD as well. I used simple math from available data. Independent reviews will tell the truth, as always. But fanboys are going to believe what they want to believe, and buy anything at any price. Trying to use logic and real-life data with fanboys is like trying to have a logical conversation with a religious fanatic.


You're part of the problem here, not part of the solution. Saying "Nvidia is full of shit" does nothing to promote the view that one should always wait for independent reviews. All you're doing is riling people up and inciting unnecessary conflict. I don't trust Nvidia's cherry-picked performance numbers either, but I still find myself arguing against you simply because of how you present your "points".


----------



## DuxCro (Sep 3, 2020)

Valantar said:


> You're part of the problem here, not part of the solution. Saying "Nvidia is full of shit" does nothing to promote the view that one should always wait for independent reviews. All you're doing is riling people up and inciting unnecessary conflict. I don't trust Nvidia's cherry-picked performance numbers either, but I still find myself arguing against you simply because of how you present your "points".


I'm sorry. I'll try to never again use simple math if that offends you. I'm done with this topic. Peace out.
Oh, quick edit: I will be picking up an RTX 3080, because I have connections over at Gainward and I can get one at factory price.


----------



## nguyen (Sep 3, 2020)

DuxCro said:


> Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that historically Nvidia is full of shit, and Intel and AMD as well. I used simple math from available data. Independent reviews will tell the truth, as always. But fanboys are going to believe what they want to believe, and buy anything at any price. Trying to use logic and real-life data with fanboys is like trying to have a logical conversation with a religious fanatic.



Kinda funny, I think you are full of BS too. Where did you find numbers that say the 2080 Ti is only 10-15% faster than the 1080 Ti? Your ass?


----------



## Valantar (Sep 3, 2020)

DuxCro said:


> I'm sorry. I'll try to never again use simple math if that offends you. I'm done with this topic. Peace out.
> Oh, quick edit: I will be picking up an RTX 3080, because I have connections over at Gainward and I can get one at factory price.


I will admit to losing interest in anything math related after middle school, but I certainly never came across any type of math that involved calling anyone "full of shit". That might just be me though. Besides that, trying to calculate gaming performance from the on-paper specs of a new architecture with major changes (such as the doubled ALU count, with who knows what other changes, likely creating bottlenecks that weren't there before and dropping perf/TFLOPS) is arguably _more_ dubious than Nvidia's claims. Their numbers are cherry-picked, but at least they have a basis in some tiny corner of reality. Purely theoretical calculations of performance add very little to the discussion, underscoring the fact that we should all be waiting for a wide selection of third-party reviews before making any kind of decision. And, to be honest, we should also wait to see what the competition comes up with. RDNA 2 looks very promising, after all.


----------



## rsouzadk (Sep 3, 2020)

Coming from a RTX 2080, the 3080 seems enticing.


----------



## DuxCro (Sep 3, 2020)

Valantar said:


> Ah, yes, the true sign of a superior intellect: ad hominem attacks, insults and pejoratives. Reported. If you can't present your arguments in at least a somewhat polite way, maybe you shouldn't be posting on forums?


Lets not insult each other here. People disagree with my opinons and i disagree with theirs sometimes. But we can have a polite discussion.


----------



## 95Viper (Sep 3, 2020)

Hi there, everyone!
Keep it on topic.
Quit the arguing and try to keep the discussion on the technical side.
No name calling allowed and enough BS has been thrown and discussed.
As the Guidelines state:  





> Be polite and Constructive, if you have nothing nice to say then don't say anything at all.



Thank You and Have a Great Day


----------



## DuxCro (Sep 3, 2020)

Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From the scenes they showed side by side, I calculated an average 40% difference, so my previous math (30-35%) was wrong. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28% (250 W vs. 320 W), while the performance difference hovers around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?


----------



## Valantar (Sep 3, 2020)

DuxCro said:


> Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From the scenes they showed side by side, I calculated an average 40% difference, so my previous math (30-35%) was wrong. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
> So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28% (250 W vs. 320 W), while the performance difference hovers around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?


That's pretty interesting indeed. Benchmarks will be well worth the read when they arrive.


----------



## sutyi (Sep 3, 2020)

DuxCro said:


> Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From the scenes they showed side by side, I calculated an average 40% difference, so my previous math (30-35%) was wrong. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
> So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28% (250 W vs. 320 W), while the performance difference hovers around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?



Most likely a cherry-picked RT game with DLSS, or prosumer software. If you look at the perf/W graph they provided it says 1.9X, but in reality isn't that more like 1.5X?
Realistically, Ampere is 1.35-1.40X faster than Turing in normal rasterization performance.

The problem is that most currently available RT titles are bound by their DXR 1.0 implementation, so they might not show the full RT potential of Ampere over Turing, even though it will still be considerably faster in those titles.


----------



## nguyen (Sep 3, 2020)

DuxCro said:


> Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From the scenes they showed side by side, I calculated an average 40% difference, so my previous math (30-35%) was wrong. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
> So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28% (250 W vs. 320 W), while the performance difference hovers around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?



Yeah the 1.9x Perf/Watt is derived from here







Basically, the 2080 Super gets 60 fps at 250 W TGP while the 3080 can get 60 fps with only 130 W, meaning the 2080 Super uses 1.9X more watts per frame.
Which is not correct; it should have been FPS divided by power use (perf/watt), by which Ampere is somewhere between 35-45% more efficient (the higher the power usage, the more efficient Ampere becomes relative to Turing; Ampere seems to scale much better with voltage).
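The two readings of "perf/watt" above can be made concrete with a quick calculation. The 60 fps / 250 W / 130 W figures come from the post; the 107 fps at 320 W figure is a hypothetical full-power operating point chosen to illustrate a ~1.4x gain, not a measured number:

```python
# Two ways to read Nvidia's "1.9x perf/watt" claim, using the figures above.
turing_fps, turing_watts = 60, 250    # 2080 Super at its full TGP
ampere_capped_watts = 130             # 3080 power-limited to the same 60 fps

# Nvidia's reading: power needed to reach the SAME frame rate.
iso_perf_ratio = turing_watts / ampere_capped_watts
print(f"Iso-performance power ratio: {iso_perf_ratio:.2f}x")  # ~1.92x

# The usual reading: fps-per-watt at each card's own operating point.
# Hypothetical: a 3080 doing ~107 fps at its full 320 W TGP.
ampere_fps, ampere_watts = 107, 320
perf_per_watt_gain = (ampere_fps / ampere_watts) / (turing_fps / turing_watts)
print(f"Perf/watt at full power: {perf_per_watt_gain:.2f}x")  # ~1.39x
```

The two numbers differ because efficiency curves are nonlinear: measuring at the competitor's frame-rate cap lands on the flattest, most favorable part of the new card's voltage/frequency curve.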


----------



## Shatun_Bear (Sep 3, 2020)

DuxCro said:


> LOL. And you actually believe what Jensen said in that reveal? He also said two years ago that the RTX 2080 Ti gives 50% better performance than the 1080 Ti. In reality, it turned out to be more like 10-15%. So it's not hard to beat your previous (Turing) architecture when that brought minimal improvements to performance, RTX excluded of course. The RTX 2080 achieves around 45 fps in Shadow of the Tomb Raider at 4K with the same settings Digital Foundry used in their video. They say the RTX 3080 is 80% faster, so simple math says that is around 80 fps. The RTX 2080 Ti achieves around 60 fps (source: Guru3D), so the RTX 2080 Ti to 3080 performance difference is 30-35% based on Tomb Raider. RTX 2080 Ti: 12 nm manufacturing process, 250 W TDP, 13.5 TFLOPS. RTX 3080: 8 nm manufacturing process, 320 W TDP, 30 TFLOPS. And only a 30-35% performance difference from this power-hungry GPU with double the number of CUDA cores. Yes, I would call Ampere catastrophically ineffective garbage.
> Also, where is Jensen pulling that data about Ampere having 1.9 times the performance per watt of Turing? It would mean that an RTX 3080 at 250 W TDP should have 90% higher performance than the RTX 2080 Ti. But no: it has a 320 W TDP and 30-35% higher performance. So the power efficiency of Ampere is only minimally better watt for watt. The claim about 1.9X efficiency is a straight-out lie.
> 
> I wonder if Jensen believes the bullshit coming out of his mouth?



Exactly. It's as if people here have never witnessed an Nvidia/AMD launch before, or have collective amnesia.

*How on earth are some of you spewing these marketing numbers from Nvidia, and thus hyperbolic assertions about Ampere, without a single reputable benchmark of any of the cards?* You're like the perfect consumer drones in a dystopian future where consumerism has gone into overdrive.

I will say it again: let's come back to these comments in a couple of weeks when W1zzard has put the cards through his benchmark suite. Let's see how these '200% faster' and '2-3X the performance' exclamations hold up when you blow away the smoke and shatter the mirrors to look at actual rasterization performance averages across more than 10 games.


----------



## Fluffmeister (Sep 3, 2020)

Hey, we still remember Polaris was gonna offer a whopping 2.5x perf/watt over cards like the 290X:





You're right, don't believe everything you read.


----------



## medi01 (Sep 3, 2020)

ppn said:


> Full chips should be 5376 shaders on a 384-bit bus and 3584 shaders on a 256-bit bus; in reality those shaders are some sort of hyperthreading, with dual shader operations per clock.



In the past, a CUDA core could do one FP op and one INT op per clock.
An Ampere core can do one FP and one INT op, or two FP ops.
So technically it's twice the TFLOPS, but the same core.

Saying it's twice the cores is like AMD claiming double the number of Bulldozer cores.
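The dual-issue point above can be sketched as a toy throughput model. The 64 FP32-only + 64 FP32/INT32 lane split per SM is the commonly reported Ampere layout; the 35% INT mix is a hypothetical workload, and real scheduling is far more complex than this:

```python
def fp32_per_clock_per_sm(int_fraction: float) -> float:
    """Toy model of an Ampere SM: 64 FP32-only lanes plus 64 lanes that
    issue either FP32 or INT32 each clock. INT work displaces FP32 work
    on the shared lanes only."""
    FP_ONLY, SHARED = 64, 64
    total_slots = FP_ONLY + SHARED                     # 128 issue slots/clock
    int_ops = min(int_fraction * total_slots, SHARED)  # INT capped at shared lanes
    return total_slots - int_ops                       # remaining slots run FP32

# Pure-FP32 shader: the full "doubled" 128 FP32 ops per clock.
print(f"{fp32_per_clock_per_sm(0.0):.1f}")   # 128.0
# With a meaningful INT mix, the doubling shrinks in practice.
print(f"{fp32_per_clock_per_sm(0.35):.1f}")  # 83.2
```

This is why the headline TFLOPS figure doubles while real-game gains do not: any INT work in the shader eats into the second FP32 path.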


----------



## Icon Charlie (Sep 3, 2020)

SkynetAI said:


> So I have a 750 W Bronze PSU. I know I need to upgrade, but to what wattage?


Probably not. You should invest some money in a watt meter to measure the actual wattage being drawn from your outlet.

In simple terms, this is how I set up all of my PSUs for my computers. It seems to work well for me; it might work for you.

Right now I have an 850 W 80+ Gold PSU, rated at 90% efficiency @ 50% load.
So the number I use is 850 × 90% / 2 (the 50% load point) ≈ 382 W. I personally want to keep my draw below this number, as this will prolong the life of the PSU. I have a few in the 10-year range at the moment, and those are the computers I mostly use to play games on.

With my watt meter I measure my usage normally as well as while I'm playing my games on the computer I am using. I Do this for 2 or more hours  recording the data for that rig. 

My normal load level for my current computer is at 105 Watts. My Max watt level playing games is at 250W max.  Well below the 50% threshold of my PSU.   I should have no problems with any future upgrade however I'm not the type who buys on a whim. I do my research first, then buy it needed.  

My comp is  AMD 3600, 32GB G,Skill Flare X CL14 PC3200 Ram, MSI X570 A-Pro MB, Visiontek RX 5700 w/10% undervolt, 500gb Samsung 860EVO SSD, WD 2TB HDD, LG Blu Ray.

Again.  This is how I set up my PSU's concerning watt usages and it has worked for me for at least 15 years.
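Icon Charlie's rule of thumb can be written out as a short sketch. The function name is mine, and the `0.90` factor simply mirrors his formula (strictly speaking, 80+ efficiency relates wall draw to DC output rather than usable capacity, so this reproduces his headroom rule rather than an electrical truth):

```python
def headroom_target(psu_watts: float, efficiency: float = 0.90) -> float:
    """Ceiling for measured load under the 50%-load rule described above."""
    return psu_watts * efficiency / 2

target = headroom_target(850)        # 382.5 W for an 850 W 80+ Gold unit
measured_gaming_load = 250           # watt-meter reading while gaming (example)

print(f"stay under: {target:.0f} W")
print("OK" if measured_gaming_load < target else "consider a bigger PSU")
```

By this rule, a 750 W unit gives a target of about 337 W, which is why a typical single-GPU gaming load in the 250-350 W range still fits.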


As for this whole new generation of video cards?

When the new Navi comes out, we can see all of the performance gains over the last generation of cards.

HOWEVER... and this is something people should be aware of:

1.  Most of the world is running at 1080p or less.
2.  Gaming monitors are mostly at the 1080p level; 1440p has recently become more of a sweet spot, as 4K is still too expensive for what you are getting.
3.  Video gaming advancements are still 3 to 5 years behind the current technology.
4.  Most people are tight on money these days.

It makes no difference if your monitor cannot handle the additional power these new cards offer.

I am averaging over 165 fps at 1440p playing Overwatch on a 32-inch, 1440p, 165 Hz Pixio PX329 monitor driven by an RX 5700 that I got new for $280.00 nine months ago.  Heh, grandpa here playing pew pew.  Everything right now is working great.

Unless NVIDIA (and AMD) can pull 50% real performance gains over last year's video cards, only those gerbils will be buying them, because most people's current video cards are still doing the job.

Unless there is a fantastic price-to-performance ratio on these upcoming video cards, I don't think I'll be throwing $500 to $700 at one.


----------



## ppn (Sep 3, 2020)

Judging by the DF results we have a 55% faster 70-tier card and a 75% faster 80-tier card. So NVIDIA pulled a Bulldozer on us: in their desire to impress with 2x FP32 performance, they created a GPGPU card that is going to mine well, but not so well for games or for prices. Although my 1060 mined 1 ETH in one month, and that can buy me a 3070 now, lol. But still, 1.66x more transistors fitted into the same 450 mm² die size for the 70-tier card, and it's hampered by the ridiculous VRAM gimping. What is very disappointing is that 16 Gbps memory is used; there is not much difference here. 66% more transistors with the same memory, only 16% faster: that is garbage. It should be 20 Gbps and 16 GB.
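The memory complaint above can be put in numbers. Peak bandwidth is just bus width (in bytes) times the per-pin data rate; a sketch using ppn's 16 Gbps figure versus the 20 Gbps he wants on a 256-bit bus (his figures, not official specs):

```python
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 16))  # 512.0 GB/s with 16 Gbps chips
print(bandwidth_gbs(256, 20))  # 640.0 GB/s with the 20 Gbps parts ppn wants
```

That 25% bandwidth gap is the core of the objection: 66% more transistors fed by the same memory subsystem.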


----------



## efikkan (Sep 3, 2020)

Icon Charlie said:


> Probably not.  You should invest some money into a watt meter to get the actual wattage being used coming from your outlet.


But is that really needed, though?
I'm pretty sure 750 W is plenty unless he/she is overclocking heavily or using an HEDT CPU (and probably enough even then).
I would be much more concerned with the quality and age of the PSU.

I had one computer with a heavily used, roughly 10-year-old Corsair TX750 80+ Bronze (I believe) that was very unstable; I put in a Seasonic Focus+ Gold 650 W and it improved a lot.

In general I prefer quality over quantity; I'll take a Seasonic 650 W over a crappy 1000 W any day. Still, when the budget allows it, I do choose some breathing room to keep the typical load around 50% for long-term stability, as you said.


----------



## mechtech (Sep 4, 2020)

Icon Charlie said:


> HOWEVER... And this is something that people should be aware of.
> 
> 1.  Most of the world is running on 1080p OR Less.
> 2.  Gaming monitors are mostly set at the 1080p level.  recently 1440p is becoming more and more of the Sweet Spot as* 4K is still too expensive for what you are getting.*
> ...



My old RX 480 runs Terraria at 4K at 60 fps.


----------



## TheoneandonlyMrK (Sep 4, 2020)

mechtech said:


> My old RX 480 runs Terraria at 4K at 60 fps.


I was just playing Ace Combat 7 with my monitor plugged into my laptop's RTX 2060 at 4K/80 fps max settings. The Vega is just as capable, but we all game differently on different games.


----------



## Icon Charlie (Sep 4, 2020)

mechtech said:


> My old RX 480 runs Terraria at 4K at 60 fps.


So does my 1070. And you can even get 4K monitors in the $300+ range.

But there is a big difference between something running at 60 Hz and something running at 144 Hz+ with an IPS panel at 4K.  I've seen the difference and it's nice... but not that nice when the monitor I'm looking at starts at $600; then add a card that can take full advantage of that monitor, and that is a lot of money.

That is why I purchased the Pixio 32-inch 1440p 165 Hz monitor for under $300, though it is a bit of overkill for me, as I'm so used to 27-inch monitors.

Hardware Monitor did a review in late 2018.

But it was Level1Techs that sold me on the monitor.

And finally, the 27-inch brand-name monitor that I wanted cost more than this one.

Now back to the NVIDIA 3000 series of video cards.  You know there is going to be a limited supply of the 3000 series at launch... right?  Yeah, I am hearing the same rumors about limited supply and, of course, price increases.  If those rumors come true, it will be just like the 2000 series' limited-supply launch.

I'll just wait until late October/the holiday season to pick up any additional components as needed, and to see what AMD has to offer.

But I am interested to see the actual gaming performance over previous generations, as well as how much heat these cards generate.


----------



## rtwjunkie (Sep 4, 2020)

dir_d said:


> He said the 3080 is a 4K 60 fps card and way better than the 2080 Ti. That made me question a little, because I thought the 2080 Ti was a 4K 60 fps card. Am I wrong?


It was. And it still is for older games from when it was released. But game tech moves on, and newer games may not be playable at 4K on anything but the newest cards.

4K is a moving finish line, as I have always said. Your 4K card today won't be one for long.


----------



## AlwaysHope (Sep 4, 2020)

Ordered a factory-OC RX 5700 XT last week...
at a price point with PP that suited my budget.
The last time I had an NVIDIA card in my then-Intel gaming rig was back in 2008!
There's something about mixing an AMD platform with an NVIDIA product that just doesn't sit well with me today.


----------



## biffzinker (Sep 4, 2020)

AlwaysHope said:


> There's something about mixing an AMD platform with an NVIDIA product that just doesn't sit well with me today.


I didn't get hung up on mixing the two, and I still don't care. At one time I had a K6-III 450 MHz with NVIDIA's Riva TNT.


----------



## BoboOOZ (Sep 4, 2020)

biffzinker said:


> I didn't get hung up on mixing the two, and I still don't care. At one time I had a K6-III 450 MHz with NVIDIA's Riva TNT.


Oooh, an old guy.

At that point, AMD wouldn't have a graphics division for more than another 5 years...



AlwaysHope said:


> Ordered factory OC RX 5700XT last week...


Without waiting for RDNA2 launch?
How much did you pay, out of curiosity?


----------



## Rakhmaninov3 (Sep 4, 2020)

My Xpert 2000 Pro doesn't even need a fan.  Why would I upgrade?


----------



## BoboOOZ (Sep 4, 2020)

Rakhmaninov3 said:


> Xpert 2000 Pro doesn't even need a fan.  Why would I upgrade.


I didn't get your reference at first.  It's 2000 as in the year... You might need to upgrade for multi-monitor support, though.


----------



## medi01 (Sep 4, 2020)

dir_d said:


> He said the 3080 is a 4K 60 fps card and way better than the 2080 Ti. That made me question a little, because I thought the 2080 Ti was a 4K 60 fps card. Am I wrong?



Consoles are at 2080/2080 Super level and will target 4K, but devs are more likely to target 30 fps than 60 fps. So you'd need something twice as fast as a 2080. The 3080 falls short; maybe the 3090.


----------



## webdigo (Sep 4, 2020)

What's the power consumption under gaming for the 3070 and 3080?


----------



## Splinterdog (Sep 4, 2020)

BFGPU
Nuff said.


----------



## Valantar (Sep 4, 2020)

Splinterdog said:


> BFGPU
> Nuff said.


BFPower draw?


----------



## Caring1 (Sep 4, 2020)

Splinterdog said:


> BFGPU
> Nuff said.


BFG GPUs?








BFG Announces Four New Graphics Cards Featuring ThermoIntelligence Cooling
BFG Technologies announced today the release of the BFG NVIDIA GeForce 8600 GT OC and OC2 graphics cards with ThermoIntelligence cooling. Both cards are available now in 256 MB & 512 MB versions at leading retailers and etailers throughout North America and Europe. We've added our custom...
www.techpowerup.com



That's a name I haven't heard in a while.


----------



## KevSmeg (Sep 4, 2020)

NVIDIA knows how to piss off the customers who bought RTX 20-series cards this year, LOL.


----------



## Chomiq (Sep 4, 2020)

Valantar said:


> BFPower draw?


According to them, 30 W over 3080.

But that depends on what and how they actually measure.


----------



## Splinterdog (Sep 4, 2020)

Caring1 said:


> BFG GPUs?
> 
> 
> 
> ...


Erm, Quake/Doom reference?


----------



## Valantar (Sep 4, 2020)

Chomiq said:


> According to them, 30 W over 3080.
> 
> But that depends on what and how they actually measure.


Yes, but the 3080 is itself 70+ W over the 2080. So definitely BFPD.


----------



## mouacyk (Sep 4, 2020)

Splinterdog said:


> Erm, Quake/Doom reference?


NVIDIA would likely have you think differently.  They will likely claim the BFGPU is part of their BFGD ecosystem, which didn't take off but is now, more than ever, ready for Ampere to power it.


----------



## Splinterdog (Sep 4, 2020)

mouacyk said:


> NVIDIA would likely have you think differently.  They will likely claim the BFGPU is part of their BFGD ecosystem, which didn't take off but is now, more than ever, ready for Ampere to power it.


Gotcha.
Big Format Gaming Display.
Sounds a bit dull in comparison, but I'm sure that's why Nvidia gave it that name in the first place   








Eyes-on: Nvidia's massive 65-inch BFGD gaming monitors get real with HP's Omen X Emperium
The first monitor based on Nvidia's BFGD, the HP Omen X Emperium, will hit the streets loaded with cutting-edge display technologies and luxurious extras galore.
www.pcworld.com


----------



## Rakhmaninov3 (Sep 4, 2020)

Caring1 said:


> BFG GPUs?
> 
> 
> 
> ...



I had a BFG 8800 GTX; it was awesome.


----------



## AlwaysHope (Sep 5, 2020)

BoboOOZ said:


> Oooh, an old guy .
> 
> At that point, AMD didn't have a graphics division for more than another 5 years...
> 
> ...


Even when RDNA2 enters the retail channel, it will be high-end cards first, which will be overkill for my gaming needs.

Besides, just like every GPU launch, there will be "teething" issues with drivers, etc...


----------



## Valantar (Sep 5, 2020)

mouacyk said:


> NVIDIA would likely have you think differently.  They will likely claim the BFGPU is part of their BFGD ecosystem, which didn't take off but is now, more than ever, ready for Ampere to power it.


I don't think you can call that an ecosystem; there were a few displays, then none. And now they're pretty much obsolete, as any decent TV matches their specs and features for a quarter of the price.


----------



## Icon Charlie (Sep 5, 2020)

*Honest RTX 3000 Series Announcement Parody.... Enjoy *


----------



## mechtech (Sep 6, 2020)

Icon Charlie said:


> So does my 1070. And you can even get 4K monitors @$300+ range.
> 
> But there is a big difference between something that is running at 60hz and something running at the 144hz+ with a IP panel @4K.  I've seen the difference and its nice... but not that nice when the monitor I'm looking at starts at the $600, then add a card that can take full advantage of the monitor in question and that is a lot of money.
> 
> ...



My 4K screen was about $350 CAD, and I'm happy with it.  When there is a 120 Hz model for the same price, that's when I'll buy one.  I refuse to pay a $300 premium for another 60 Hz panel.

As for the NV 3000 series, I have bigger fish to fry: I have to replace the shingles on my house.  A GPU upgrade for me is at least two years away.


----------



## BoboOOZ (Sep 7, 2020)

Here's a video from Tom, bringing a bit of realism to the hype train.

For those who don't know him, he's not an AMD fanboy; he just tends to wear Intel, AMD, or Nvidia T-shirts on occasion, depending on the content of the video.


----------



## efikkan (Sep 7, 2020)

BoboOOZ said:


> Here's a video from Tom, bringing a bit of realism to the hype train.


Coming from the guy who cited "sources" claiming 4-5x performance gains in ray tracing for Ampere.
Now he is claiming RDNA2 can compete with all tiers of NVIDIA's lineup, but only if they _want to_ (7:30). Last year he claimed AMD had "Big Navi" ready but didn't want to release it (despite no evidence pointing to an RDNA1 "Big Navi"). He is just rambling and speculating, and calling it leaks.

Ampere has in general not been overhyped; if anything, the specs and claimed performance seem to have caught most by surprise. (As with any claims, these need to be confirmed by real reviews, of course.) The hype for RDNA2 is much higher, but if AMD is nearly as good as NVIDIA at concealing the real specifics from "leakers", then we might not know until it's unveiled whether "Big Navi" is a 2080 Ti-class card for cheap or a true high-end contender. This new trend of "leakers" spreading speculation may actually help hide the tiny bits of _actual leaks_ out there.


----------



## TheoneandonlyMrK (Sep 7, 2020)

efikkan said:


> Coming from the guy who cited "sources" claiming 4-5x performance gains in ray tracing for Ampere.
> Now he is claiming RDNA2 can compete with all tiers of NVIDIA's lineup, but only if they _want to_ (7:30). Last year he claimed AMD had "Big Navi" ready but didn't want to release it (despite no evidence pointing to an RDNA1 "Big Navi"). He is just rambling and speculating, and calling it leaks.
> 
> Ampere has in general not been overhyped; if anything, the specs and claimed performance seem to have caught most by surprise. (As with any claims, these need to be confirmed by real reviews, of course.) The hype for RDNA2 is much higher, but if AMD is nearly as good as NVIDIA at concealing the real specifics from "leakers", then we might not know until it's unveiled whether "Big Navi" is a 2080 Ti-class card for cheap or a true high-end contender. This new trend of "leakers" spreading speculation may actually help hide the tiny bits of _actual leaks_ out there.


NVIDIA did indeed hype their card; by their own figures they struggle to prove their own claims. To be fair, a company shouldn't downplay its performance, but it's clear NVIDIA hyped this launch. Let's wait for reviews before getting too excited.


----------



## BoboOOZ (Sep 7, 2020)

efikkan said:


> Ampere has in general not been overhyped, if anything the specs and claimed performance seems to have caught most by surprise. (As with any claims, these needs to be confirmed with real reviews of course.)


If you actually listen to what he's saying, he's saying exactly that: Ampere was not overhyped, and the performance is exactly what he leaked half a year ago. The hype for NVIDIA only started last week, because many people misinterpret the very high theoretical FP32 performance and the performance comparisons based on a handful of cherry-picked games, which leads some to believe there's an 80-100% improvement over the last generation.


efikkan said:


> The hype for RDNA2 is much higher, but if AMD is nearly as good as Nivida at concealing the real specifics from "leakers", then we might not know until it's unveiled whether "Big Navi" is a 2080 Ti class card for cheap, or if it's a true high-end contender. It may seem like this new trend of all these "leakers" spreading speculation may actually help hide the tiny bits of _actual leaks_ out there.


I have no idea where the hype for Big Navi is; you must be spending your time on different forums than me.
For now, Tom (and others) haven't had a single solid leak to know even whether AMD is planning to compete in the high end; it's all conjecture, deduction, and wishful thinking at this point.


----------



## randompeep (Sep 17, 2020)

RedelZaVedno said:


> Just look at Microsoft FS 2020. The 2080 Ti manages only 31 fps (with dips to 21) over New York City at 4K/Ultra and wants to use 12.7 GB of VRAM. The 3080 will hopefully get us to 55 fps (with dips to the 40s). Having 16 GB would probably smooth those dips further. Needless to say, the 3080 is still a godsend for flight simmers.


Hey, it seems like the reviews are here: ~45 fps in M$ FS 2020 at 4K Ultra. Hoping everyone got this clear: Microsoft made an experimental, unoptimized game just for the GPU market's well-being. I'm not saying RTX 3000 is trash, but in some markets the 3080 AIB cards came in at 50-60% over the advertised FE price. That's quite a hike, and it makes them not worth the money for the next 12 months in select regions. TBH, I'm sitting out and waiting for further discounts on the second-hand market for the RX 570/GTX 1060 3GB. The 3 GB cards are getting obsolete, so take my $60 if you sell one of those mosquitos.


----------



## BoboOOZ (Sep 17, 2020)

DuxCro said:


> Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From those screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math was wrong; I had calculated 30-35%. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so there could be less difference at lower resolutions.
> So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28% (250 W vs. 320 W), while the performance difference is around 40%. So how can NVIDIA claim they achieved 1.9x (+90%) performance per watt? Were they referring to RTX performance?


So now we know: their claim is based on the performance of fully path-traced Quake and Minecraft exclusively.

And the difference you are seeing between the 2080 and the 3080 is so large because the settings are cherry-picked so the VRAM requirement is higher than 8 GB but smaller than 10 GB. Nice job, NVIDIA and DF!
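The skepticism in the quoted post is simple arithmetic: the perf/watt gain is the performance ratio divided by the power ratio. A sketch using DuxCro's figures (~1.40x performance, 250 W vs. 320 W TDP; the function name is mine):

```python
def perf_per_watt_gain(perf_ratio: float, old_w: float, new_w: float) -> float:
    """Relative perf/watt improvement given a performance ratio and two TDPs."""
    return perf_ratio / (new_w / old_w)

# DuxCro's numbers: ~40% faster, TDP up from 250 W to 320 W
gain = perf_per_watt_gain(1.40, 250, 320)
print(f"{gain:.2f}x perf/watt")  # ~1.09x, nowhere near the claimed 1.9x
```

For what it's worth, NVIDIA's 1.9x figure was reportedly measured at a fixed performance level (power needed to hit the same frame rate), which is a different and far more flattering comparison than perf/watt at full load.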


----------



## Bubster (Sep 21, 2020)

I wonder how much the 3080 with 20 GB will cost?


----------



## R0H1T (Sep 21, 2020)

Half an arm & two fourths of a leg?


----------

