# NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked



## AleksandarK (Aug 28, 2020)

Just ahead of the September launch, specifications of NVIDIA's upcoming RTX Ampere lineup have been leaked by industry sources over at VideoCardz. According to the website, three alleged GeForce SKUs are being launched in September - RTX 3090, RTX 3080, and RTX 3070. The new lineup features major improvements: 2nd generation ray-tracing cores and 3rd generation tensor cores made for AI and ML. When it comes to connectivity and I/O, the new cards use the PCIe 4.0 interface and have support for the latest display outputs like HDMI 2.1 and DisplayPort 1.4a.

The GeForce RTX 3090 comes with 24 GB of GDDR6X memory running on a 384-bit bus at 19.5 Gbps. This gives a memory bandwidth of 936 GB/s. The card features the GA102-300 GPU with 5,248 CUDA cores running at 1695 MHz, and is rated for 350 W TGP (board power). While the Founders Edition cards will use NVIDIA's new 12-pin power connector, non-Founders Edition cards, from board partners like ASUS, MSI and Gigabyte, will be powered by two 8-pin connectors. Next up are the specs for the GeForce RTX 3080, a GA102-200 based card that has 4,352 CUDA cores running at 1710 MHz, paired with 10 GB of GDDR6X memory running at 19 Gbps. The memory is connected over a 320-bit bus that achieves 760 GB/s of bandwidth. The board is rated at 320 W and the card is designed to be powered by dual 8-pin connectors. And finally, there is the GeForce RTX 3070, which is built around the GA104-300 GPU with an as-yet-unknown number of CUDA cores. We only know that it has the older non-X GDDR6 memory that runs at 16 Gbps on a 256-bit bus. The GPUs are supposedly manufactured on TSMC's 7 nm process, possibly the EUV variant.
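The leaked bandwidth figures follow directly from bus width and per-pin data rate; a quick illustrative Python sanity check (not part of the leak itself):

```python
# bandwidth (GB/s) = bus width (bits) / 8 bits-per-byte * per-pin data rate (Gbps)
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbps(384, 19.5))  # RTX 3090: 936.0 GB/s
print(mem_bandwidth_gbps(320, 19.0))  # RTX 3080: 760.0 GB/s
```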



 



*View at TechPowerUp Main Site*


----------



## Chaitanya (Aug 28, 2020)

I will wait to see how these new GPUs perform in Helicon before deciding on an upgrade.


----------



## Vya Domus (Aug 28, 2020)

> The card features GA102-300 GPU with 5248 CUDA cores running at 1695 MHz, rated for 350 W TGP.



So it turns out that the A100 was indeed a very good indicator of what the GAxxx counterparts are going to look like; I don't know why everyone thought otherwise. Anyway, that's about 14% more shaders for 25% more power compared to TU102, and given that larger processors tend to be more power efficient, something must have gone wrong with the manufacturing process. Or something else related to the SMs themselves.


----------



## TheLostSwede (Aug 28, 2020)

No poll option for Waiting for the reviews?


----------



## koaschten (Aug 28, 2020)

Chaitanya said:


> I will wait to see how these new GPUs perform in Helicon before deciding on upgrade.


Also going to wait to see which cards get watercooler support, because I don't want an extra 350W of heat stuck in a small metal box.


----------



## W1zzard (Aug 28, 2020)

TheLostSwede said:


> No poll option for Waiting for the reviews?


Added


----------



## Daven (Aug 28, 2020)

So let's see

RTX 3090 5248 CUDA, 1695 MHz boost, 24 GB, 936 GB/s, 350 W, 7 nm process

RTX 2080 Ti  4352 CUDA, 1545 MHz boost, 11 GB, 616 GB/s, 250 W, 12 nm process

I guess all the potential fab process power savings were erased by the extra RAM, RAM speed, CUDA, RT and tensor cores.

Edit: Maybe comparing to the RTX 3080 is more informative:

RTX 3080 4352 CUDA, 1710 MHz boost, 10 GB, 760 GB/s, 320 W, 7 nm process

RTX 2080 Ti  4352 CUDA, 1545 MHz boost, 11 GB, 616 GB/s, 250 W, 12 nm process

Almost no difference between these cards except on the RT and Tensor side. If the price is much lower than $1000 for the 3080 then you can get 2080 Ti performance on the 'cheap'.


----------



## CrAsHnBuRnXp (Aug 28, 2020)

Good. Still 2 8 pin connectors from board partners. Nothing has changed. Just nvidia being weird.


----------



## JAB Creations (Aug 28, 2020)

Who the hell is team red? Okay, gotta figure this one out...


Team Blue: Intel.
Team Green: AMD.
Team Lime: Nvidia.
Yeah, there _used_ to be ATI. "But! But! Nvidia already has green!" No, they're lime and AMD has been around _way the hell longer_ than Nvidia.

I can't wait for the 6950XTX.


----------



## Raendor (Aug 28, 2020)

Only 10 GB on the 3080? Seriously? And 8 GB on the 3070 is the same we've had for the x70 ever since Pascal. That's lame.


----------



## EarthDog (Aug 28, 2020)

Chaitanya said:


> I will wait to see how these new GPUs perform in Helicon before deciding on an upgrade.


Helicon? hahahahahahahahahhaha

Might as well say minecraft.


----------



## steen (Aug 28, 2020)

TheLostSwede said:


> No poll option for Waiting for the reviews?


Reviews will be critical. Current bench suites may need reviewing. It'll be interesting to see what 34-billion-odd transistors @ 1.7 GHz can do.


----------



## Vya Domus (Aug 28, 2020)

Mark Little said:


> If the price is much lower than $1000 for the 3080



We all know that ain't gonna happen. "Much lower" will mean $899 or something along those lines, probably.


----------



## Tomgang (Aug 28, 2020)

I just hope RTX 3000 will not be Fermi all over again. The triple-slot cooler for the RTX 3090, plus 350 watt TDP and 320 watt for the RTX 3080, give some cause for concern about heat and temperatures.

With that said, I am planning on an RTX 3080 to be the GPU for what I intend to be an AMD Zen 3 based system. Besides, I just sold my GTX 1080 Ti, so I am currently stuck with a GTX 1060 6 GB. I sold my GTX 1080 Ti while I could still get a decent price for it; the RTX 3000 launch will push its value down.


----------



## JAB Creations (Aug 28, 2020)

CrAsHnBuRnXp said:


> Good. Still 2 8 pin connectors from board partners. Nothing has changed. Just nvidia being weird.



They're not being weird; they're trying to distract people because they feel threatened by AMD, which no longer has a stock value of under $2 resulting from the crony tactics of both Intel and Nvidia, so naturally they're going to do their absolute best. Why?

"Nvidia has the best card at $36,700! So when I spend $170 I'll somehow magically get the best card I can!" - Fanboyism

Now you take that with "Oh, but it's actually smaller than two eight-pins!", which is intended for Nvidia fanboys to basically say, "I don't care that my room is 20 degrees warmer and my electricity bill doubled! I need those 14 FPS because twice the FPS of my 144Hz monitor isn't enough for some very objective, unquestionable reason!"

That is the problem with cronies: they know how weaker psychological mindsets work and they have no qualms about taking advantage of people.


----------



## King Mustard (Aug 28, 2020)

*RTX 3090*
24 GB of GDDR6X memory running on a 384-bit bus at 19.5 Gbps (memory bandwidth capacity of 936 GB/s)
5,248 CUDA cores, 1695 MHz boost (Gainward Phoenix Golden Sample 3090 will boost to 1725 MHz)
350 W TGP (board power)

*RTX 3080*
10 GB of GDDR6X memory running on a 320-bit bus at 19 Gbps (memory bandwidth capacity of 760 GB/s)
4,352 CUDA cores, 1710 MHz boost (Gainward Phoenix GS 3080 will boost to 1740 MHz)
320 W TGP (board power)

*RTX 3070*
? GB of GDDR6 memory running on a 256-bit bus at 16 Gbps (memory bandwidth capacity of ? GB/s)
? CUDA cores running at ?
? W TGP (board power)

As a gamer, I want the performance of the 3090 but with the VRAM amount of the 3080. Let's hope for a 3080 Ti with 12 GB of GDDR6X and 5,120 CUDA cores.


----------



## medi01 (Aug 28, 2020)

AleksandarK said:


> The GPUs are supposedly manufactured on TSMC's 7 nm process


Based on what?


----------



## kayjay010101 (Aug 28, 2020)

medi01 said:


> Based on what?


Based on what the article says. Bottom of the article states the following:


> The data that we saw clearly mention the 7nm fabrication node. At this time we are unable to confirm if this is indeed true.


----------



## RedelZaVedno (Aug 28, 2020)

Is 1695 MHz a boost or a base frequency? Very disappointing if it's the boost, as it would mean only 17.8 TFLOPS (2080 Ti + 30%) if there is no arch gain. But even if IPC is increased by 15% due to the new arch, it would still mean only 20.45 TFLOPS (50% more than the 2080 Ti). These rasterization performance gains would maybe justify an $800/1000 price tag, but certainly not $1.4K or more.


----------



## Chaitanya (Aug 28, 2020)

koaschten said:


> Also going to wait to see which cards get watercooler support, because I don't want an extra 350W of heat stuck in a small metal box.


I think water cooling support will take 1-2 months after the release of the new GPUs. I am considering an upgrade from my ageing GTX 1060 to something in the same range, not a top-end or enthusiast-class GPU.


EarthDog said:


> Helicon? hahahahahahahahahhaha
> 
> Might as well say minecraft.


I don't play games, and one of the main programs I use is Helicon for photo stacking; it scales and performs very well on GPUs. You can check their website for performance numbers on various CPUs and GPUs.


----------



## kayjay010101 (Aug 28, 2020)

Chaitanya said:


> I think water cooling support will take 1-2 months after the release of the new GPUs. I am considering an upgrade from my ageing GTX 1060 to something in the same range, not a top-end or enthusiast-class GPU.


EK and Alphacool have already stated they have designs ready for launch. I wouldn't be surprised if we see AIB cards like MSI's Sea Hawk, which uses the EK block, released day one.


----------



## EarthDog (Aug 28, 2020)

Chaitanya said:


> I don't play games, and one of the main programs I use is Helicon for photo stacking; it scales and performs very well on GPUs. You can check their website for performance numbers on various CPUs and GPUs.


Oh boy... I was thinking a game, HeliBORNE not Helicon.. my bad!!!


----------



## xkm1948 (Aug 28, 2020)

Nvidia is seriously leaky this time around. Curious to see the results as well as the Tensor core config. Would be nice if the consumer variant of Ampere would also receive the sparse matrix FP16 Tensor like A100.


----------



## erek (Aug 28, 2020)

looks fake, why's the spacing different?


----------



## Chomiq (Aug 28, 2020)

I'll wait for RDNA2 reviews and then consider my options. NV has the experience of beta testing their RTX tech for 2 years so it is tempting. I'm planning to buy PS5 at some point, so AMD will get my money either way.


----------



## RedelZaVedno (Aug 28, 2020)

*RTX 3080*
10 GB of GDDR6X memory running on a 320-bit bus at 19 Gbps (memory bandwidth capacity of 760 GB/s)
4,352 CUDA cores running at 1710 MHz
320 W TGP (board power)
-------------------------------------------------------------
*FP32 (float) performance = 14.88 TFLOPS (?)*

RTX 2080TI*
11 GB of GDDR6 memory running on a 352-bit bus at 14 Gbps (memory bandwidth capacity of 616 GB/s)
4,352 CUDA cores running at 1545 MHz
250 W TGP (board power) 
------------------------------------------------------------
*FP32 (float) performance = 13.45 TFLOPS*

This can't be right, only a 10% rasterization performance gain over the 2080 Ti? I REALLY hope Nvidia managed to get some IPC gain from the new node/arch, or it's gonna be Turing déjà vu all over again.
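The FP32 figures above come from the standard estimate of CUDA cores × 2 FLOPs per clock (one FMA) × clock speed; a quick illustrative check in Python, using the leaked boost clocks:

```python
# Peak FP32 throughput estimate: cores * 2 FLOPs per clock (FMA) * clock (GHz)
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000  # GFLOPS -> TFLOPS

print(round(fp32_tflops(4352, 1.710), 2))  # RTX 3080 at 1710 MHz: 14.88
print(round(fp32_tflops(4352, 1.545), 2))  # RTX 2080 Ti at 1545 MHz: 13.45
print(round(fp32_tflops(5248, 1.695), 2))  # RTX 3090 at 1695 MHz: 17.79
```

Note this is a peak-rate figure only; real gaming clocks (and any per-clock architectural changes) shift the picture.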


----------



## ThrashZone (Aug 28, 2020)

HI,
Waiting for reviews, and then some, before even thinking of the 30 series after the 20 series space invaders.
But from the looks of it, Hydro Copper all the way; this ugly air cooler I'll never see in person.


----------



## Flying Fish (Aug 28, 2020)

This TGP rather than TDP is going to be annoying and cause confusion for people... can see it in this thread already.

I mean, how do you compare TGP to the old TDP values? Is the 350 W TGP going to be similar to the 250 W TDP of the 2080 Ti, plus extra for board power? But then, can you see the rest of the components using another 100 W?


----------



## Krzych (Aug 28, 2020)

So it is basically confirmed that 3080 will be at least 20% faster than 2080 Ti if it has the same amount of CUDA cores but on a new architecture, there is no way for it to deliver less considering the spec. 3090 should be at least 50%. This should be like bare minimum conservative expectation at this point.


----------



## mouacyk (Aug 28, 2020)

If true, this will be the first time since the GTX 580 that NVidia goes >256-bit for the *80 again, though still well short of 384 bits, and with a price tag to choke on.


----------



## BorisDG (Aug 28, 2020)

Can't wait to grab my KFA2 HOF RTX 3090 baby.


----------



## steen (Aug 28, 2020)

Flying Fish said:


> This TGP rather than TDP is going to be annoying and cause confusion for people... can see it in this thread already.
> 
> I mean, how do you compare TGP to the old TDP values? Is the 350 W TGP going to be similar to the 250 W TDP of the 2080 Ti, plus extra for board power? But then, can you see the rest of the components using another 100 W?


Yep. Been doing it myself using TBP instead of TGP. Nv says they're similar. I wouldn't be using frameview, though...


----------



## DuxCro (Aug 28, 2020)

If those specs are real, AMD could easily beat them TFLOPS for TFLOPS, considering the Xbox Series X has a 12 TFLOPS GPU and you can't go wild with TDP in a small console. I think Nvidia will be mostly pushing for ray tracing performance gains. Again, if these specs are real: 17.8 TFLOPS on the RTX 3090 by the math. 5248 CUDA cores × 2 = 10,496 FLOPs per clock × 1695 MHz ≈ 17.8 TFLOPS.


----------



## Jinxed (Aug 28, 2020)

Mark Little said:


> Edit: Maybe comparing to the RTX 3080 is more informative:
> 
> RTX 3080 4352 CUDA, 1710 MHz boost, 10 GB, 760 GB/s, 320 W, 7 nm process
> 
> ...



This could be important. Same amount of CUDA cores, yet quite a bit higher TDP rating even though it's on 7nm EUV versus 12nm, which means one thing and one thing only: the number of RT and Tensor cores will be MASSIVELY higher.


----------



## chodaboy19 (Aug 28, 2020)

According to this rumor the RTX 3080 will have variants and some of those variants will cram 20GB of VRAM:



> The card is reportedly going to feature up to 10 GB of memory that is also going to be GDDR6X but there are several vendors who will be offering the card with a massive 20 GB frame buffer but at higher prices. Since the memory is running at 19 Gbps across a 320-bit bus interface, we can expect a bandwidth of up to 760 GB/s.











NVIDIA's GeForce RTX 3090, RTX 3080, RTX 3070 Specs Leak Out, Will Utilize 7nm Process Node - Up To 350W Flagship, 24 GB VRAM, 19.5 Gbps GDDR6X Speeds (wccftech.com)


----------



## ppn (Aug 28, 2020)

CrAsHnBuRnXp said:


> Good. Still 2 8 pin connectors from board partners. Nothing has changed. Just nvidia being weird.



The 12-pin is hidden under the shroud. This is just for the purpose of making a detachable shroud with integrated 8-pins instead of them being soldered on like the 1060/2060 FE.

In any case, if the dies are bigger than 429 mm², it can't possibly be EUV.


----------



## BoboOOZ (Aug 28, 2020)

DuxCro said:


> If those specs are real, AMD could easily beat them TFLOPS for TFLOPS, considering the Xbox Series X has a 12 TFLOPS GPU and you can't go wild with TDP in a small console. I think Nvidia will be mostly pushing for ray tracing performance gains. Again, if these specs are real: 17.8 TFLOPS on the RTX 3090 by the math. 5248 CUDA cores × 2 = 10,496 FLOPs per clock × 1695 MHz ≈ 17.8 TFLOPS.


Not to mention AMD have significantly higher speeds, and performance scales much better with frequency than with cores.

Problem is, we have no idea what AMD is preparing with RDNA2, outside of reasonable guesses based on those console APUs.


----------



## RedelZaVedno (Aug 28, 2020)

WTF is happening with TDPs of xx80 series? Smaller nodes, yet higher TDPs. We're coming to IR panel wattage here. 3080 alone can heat up 6 m² room, add 160W CPU on top of that and you get a decent standalone room heater. I'm not gonna install AC just to use PC 

GTX 980... 165 W
GTX 1080...180 W
RTX 2080(S)... 215/250 W
RTX 3080... 320W


----------



## Jism (Aug 28, 2020)

This is just an enterprise card designed for AI / DL / whatever workload being pushed into gaming. These cards normally failed the enterprise quality stamp, so having up to 350 W of TDP / TBP is not unknown. It's like Linus Torvalds said about Intel: stop putting stuff in chips that only makes them look good in really specific (AVX-512) workloads. These RT/Tensor cores probably account for a big part of the extra power consumption.

Price is probably between $1000 and $2000. Nvidia is the new Apple.


----------



## Jinxed (Aug 28, 2020)

RedelZaVedno said:


> WTF is happening with TDPs of xx80 series? Smaller nodes, yet higher TDPs. We're coming to IR panel wattage here. 3080 alone can heat up 6 m² room, add 160W CPU on top of that and you get a decent standalone room heater. I'm not gonna install AC just to use PC
> 
> GTX 980... 165 W
> GTX 1080...180 W
> ...


Simple. There will be a LOT more RT cores on the GPU.


----------



## medi01 (Aug 28, 2020)

BoboOOZ said:


> Not to mention AMD have significantly higher speeds, and performance scales much better with frequency than with cores.


There is no word on whether 1.7Ghz is base or boost frequency.
Given that Sony can push its RDNA chip to 2.1Ghz, if 7nm is true, NV boost frequency must be higher than that.



Jinxed said:


> RT cores


Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?


----------



## RedelZaVedno (Aug 28, 2020)

Jinxed said:


> Simple. There will be a LOT more RT cores on the GPU.


I understand that, but it's still crazy. Gaming PC should not consume +500W of power. Most of PC gamers don't live in Alaska or Greenland like places.


----------



## GeorgeMan (Aug 28, 2020)

I'll most probably pass. 1080ti should be enough for a couple more years, until we have some real next gen games and until we see if ray tracing really proceeds or it'll remain a gimmick. By then we should have affordable gpus with more vram than 1080Ti's 11GB.


----------



## EarthDog (Aug 28, 2020)

RedelZaVedno said:


> I understand that, but it's still crazy. Gaming PC should not consume +500W of power. Most of PC gamers don't live in Alaska or Greenland like places.


and even then, room temp is still 22C. You act like they need to sit outside to be cooled. Remember the 295x2? A 500W gpu...



GeorgeMan said:


> until we see if ray tracing really proceeds


wait... so consoles and amd are going RT and you have a question on if it proceeds?


----------



## DuxCro (Aug 28, 2020)

medi01 said:


> There is no word on whether 1.7Ghz is base or boost frequency.
> Given that Sony can push its RDNA chip to 2.1Ghz, if 7nm is true, NV boost frequency must be higher than that.
> 
> 
> Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?


1.7GHz is around the boost clock for RTX 2000 series, is it not? 
Edit. It says 1710MHz boost for RTX3080


----------



## ZoneDymo (Aug 28, 2020)

Honestly, to me this feels a bit like Intel's 6700K-to-7700K type of stuff.

Just kinda meh all around... and really, 10 GB of RAM for a 3080? I would at least have given it 16 or so.

Oh well, guess just like with Intel, this is what you get with no competition in that area. Remember, the RX 5700 (XT) is really more around RTX 2060S / RTX 2070S territory.


----------



## GeorgeMan (Aug 28, 2020)

EarthDog said:


> and even then, room temp is still 22C. You act like they need to sit outside to be cooled. Remember the 295x2? A 500W gpu...
> 
> wait... so consoles and amd are going RT and you have a question on if it proceeds?


I'm talking about nvidia's approach (RTX and dedicated hardware), not ray tracing as a technology. Consoles include AMD and it'll be working in a different way.


----------



## BoboOOZ (Aug 28, 2020)

medi01 said:


> There is no word on whether 1.7Ghz is base or boost frequency.
> Given that Sony can push its RDNA chip to 2.1Ghz, if 7nm is true, NV boost frequency must be higher than that.
> Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?



Oh, I imagine 1.7 is base, not boost. But 400MHz boost gain is way overoptimistic, IMO, it will be more around 200-250MHz.
And 7nm would be at least surprising at this point, but who knows.


----------



## RedelZaVedno (Aug 28, 2020)

EarthDog said:


> and even then, room temp is still 22C. You act like they need to sit outside to be cooled. Remember the 295x2? A 500W gpu...
> 
> wait... so consoles and amd are going RT and you have a question on if it proceeds?


What do you mean? My PC room's temperature warms up to 26C (and above when outside temps are hitting +35C for a few days) during summer months and I live in a well isolated house positioned in moderate climate. Add +500W PC into the room and you get easily above 28C. I underclock 1080TI during summer months to get it to consume around 160W during gaming, but I can't see how I could underclock 320W GPU to get similar results.


----------



## Jinxed (Aug 28, 2020)

medi01 said:


> Why on earth would RT cores need to consume anything, unless one of those dozen games that support it is running?


Trying to pull 2 year old arguments? Hardly. Given the fact that many AAA titles are raytracing enabled, the fact that main game engines like UE now support raytracing and some of the biggest games are going to be raytraced - Cyberpunk 2077, Minecraft just for an example - nobody believes those old arguments anymore. And you seem to have a bit of a split personality - your beloved AMD is saying new consoles and their RDNA2 GPUs will support raytracing as well. So what are you really trying to say? Are you trying to prepare for the eventuality that AMD's raytracing performance sucks?


----------



## BoboOOZ (Aug 28, 2020)

DuxCro said:


> 1.7GHz is around the boost clock for RTX 2000 series, is it not?
> Edit. It says 1710MHz boost for RTX3080


Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration at 1.9GHz, why would Nvidia only manage 1.7? I would believe this only if it was on Samsung 8nm and if it's really not a good node...


----------



## DuxCro (Aug 28, 2020)

BoboOOZ said:


> Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration at 1.9GHz, why would Nvidia only manage 1.7? I would believe this only if it was on Samsung 8nm and if it's really not a good node...


Well, it says 1650MHz boost for my RTX 2060 Super. But still it boosts to 1850MHz out of the box. And 2000MHz + with slight OC.


----------



## RedelZaVedno (Aug 28, 2020)

BoboOOZ said:


> Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration at 1.9GHz, why would Nvidia only manage 1.7? I would believe this only if it was on Samsung 8nm and if it's really not a good node...


It is on Samsung's 8nm (comparable to TSMC's 10nm), all leakers pointed in that direction (Samsung's "5nm" in base case scenario). And yes, it is a shitty node compared to TSMC's 7nm EUV, hence shitty clock speeds.



DuxCro said:


> Well, it says 1650MHz boost for my RTX 2060 Super. But still it boosts to 1850Mhz out of the box. And 2000Mhz + with slight OC.



A 5248-shading-unit GPU is a different beast than a 1920 SU GPU... more SUs, lower clock speeds.


----------



## M2B (Aug 28, 2020)

That 1.7 GHz boost is not an indicator of actual gaming-load boost clocks.
The 2080 Ti, for example, is rated for 1545 MHz and you'll never see one under 1750 MHz in actual games, unless something is broken with the cooling.


----------



## Metroid (Aug 28, 2020)

"HDMI 2.1 and DisplayPort 1.4a. "

Finally. Now all monitors that require huge bandwidth will come with HDMI 2.1.


----------



## DuxCro (Aug 28, 2020)

RedelZaVedno said:


> It is on Samsung's 8nm (comparable to TSMC's 10nm), all leakers pointed in that direction (Samsung's "5nm" in base case scenario). And yes, it is a shitty node compared to TSMC's 7nm EUV, hence shitty clock speeds.


If i remember correctly, Nvidia said that AMD reserved most of the 7nm over at TSMC. But they went on ahead and beat AMD into reserving 5nm for next year. So wait for RTX 3000 Super series for TSMC 5nm.


----------



## rtwjunkie (Aug 28, 2020)

Raendor said:


> Only 10 GB on the 3080? Seriously? And 8 GB on the 3070 is the same we've had for the x70 ever since Pascal. That's lame.


Do you really “need” more than 10GB VRAM?


----------



## RedelZaVedno (Aug 28, 2020)

DuxCro said:


> If i remember correctly, Nvidia said that AMD reserved most of the 7nm over at TSMC. But they went on ahead and beat AMD into reserving 5nm for next year. So wait for RTX 3000 Super series for TSMC 5nm.


Not a chance. Apple and AMD are TSMC's prime partners. NVidia can get a piece of 5nm production, but not at the production scale it needs for PC gaming GPUs. I can see 4xxx HPC and high-end Quadros being on TSMC's 5nm, but nothing else.


----------



## EarthDog (Aug 28, 2020)

GeorgeMan said:


> I'm talking about nvidia's approach (RTX and dedicated hardware), not ray tracing as a technology. Consoles include AMD and it'll be working in a different way.


The point, however, is that RT tech is here... it isn't a gimmick with everyone all in. Capeesh? 


RedelZaVedno said:


> What do you mean? My PC room's temperature warms up to 26C (and above when outside temps are hitting +35C for a few days) during summer months and I live in a well isolated house positioned in moderate climate. Add +500W PC into the room and you get easily above 28C. I underclock 1080TI during summer months to get it to consume around 160W during gaming, but I can't see how I could underclock 320W GPU to get similar results.


Have a cookie...just saying 300W is nothing compared to some cards in the past.


----------



## Vayra86 (Aug 28, 2020)

They're having a laugh. I might hard pass on this for another gen. Still left wondering how on earth this is all worth it just for a handful of pretty weak RT titles...


----------



## BoboOOZ (Aug 28, 2020)

RedelZaVedno said:


> It is on Samsung's 8nm (comparable to TSCM's 10nm), all leakers pointed in that direction (Samsung's "5nm" in base case scenario). And yes, it is a shitty node compared to TSMC's 7nm EUV, hence shitty clock speeds.


Well, it seems even the guys from videocardz aren't sure it's 7nm, so it does look quite surprising. However, RDNA1 was on plain 7nm, not P or EUV. But maybe Nvidia boost ratings are conservative, as the others are pointing out, we'll have to see.


----------



## Jinxed (Aug 28, 2020)

Vayra86 said:


> They're having a laugh. I might hard pass on this for another gen. Still left wondering how on earth this is all worth it just for a handful of pretty weak RT titles...


As in Cyberpunk 2077? Yeah, right, that's a weak title. Minecraft? Yeah, also a weak title. Anything made on UE for the foreseeable future? Also weak and unimportant. Console ports made for DXR? Also unimportant ..  Riiiiight.


----------



## Turmania (Aug 28, 2020)

Seems a bit power hungry but we will know for sure with reviews.


----------



## BoboOOZ (Aug 28, 2020)

RedelZaVedno said:


> Not a chance. Apple and AMD are TSMC's prime partners. NVidia can get a piece of 5nm production, but not the production scale it needs for PC gaming GPUs. I can see 4xxx HPC and high end Quadros being on TSCM's 5nm but nothing else.


Indeed and also, TSMC seem to prefer splitting their capacity amongst their clients, rather than allowing one client to book the whole capacity for a given node.


----------



## ODOGG26 (Aug 28, 2020)

DuxCro said:


> If i remember correctly, Nvidia said that AMD reserved most of the 7nm over at TSMC. But they went on ahead and beat AMD into reserving 5nm for next year. So wait for RTX 3000 Super series for TSMC 5nm.



No they actually didnt. https://www.hardwaretimes.com/amd-p...rs-w-tsmc-taking-space-vacated-by-huawei-ban/


----------



## Nyek (Aug 28, 2020)

DisplayPort 1.4a seriously?


----------



## TheoneandonlyMrK (Aug 28, 2020)

DuxCro said:


> If i remember correctly, Nvidia said that AMD reserved most of the 7nm over at TSMC. But they went on ahead and beat AMD into reserving 5nm for next year. So wait for RTX 3000 Super series for TSMC 5nm.


You realise AMD and TSMC announced a partnership on 5nm before that rumour came out.
As for TSMC 5nm Supers, that's dreamy IMHO.

Soooo, 300 watts max, some said yesterday; you can't exceed two 8-pins at 300, they said. Balls, I said, you wouldn't need heavier gauge wire.

350 watts at base clocks; what's the max OC pull on that, 500? We'll see.

I own a Vega 64, of course you can, and I'll take this opportunity to welcome Nvidia back into Foreman grill territory. It's been lonely here for two years.


----------



## Maelwyse (Aug 28, 2020)

Mark Little said:


> So let's see
> 
> RTX 3090 5248 CUDA, 1695 MHz boost, 24 GB, 936 GB/s, 350 W, 7 nm process
> 
> ...



There are BIG differences between the 3080 and the 2080 Ti.
1. Clock speed: 1710 vs 1545 MHz, a >10% clock increase. That right there explains most of the power difference, as well as the efficiency differences.
2. RTX core changes. You can call this IPC, if you wanted to, if my understanding is right. Supposedly the RT cores are hugely more efficient.
3. Heat. I am only just now experiencing how much heat a video card pushes into the room. Let's put it this way: you can buy a ~$300 air conditioning unit that exhausts to the outside. Those can remove roughly 8,000-12,000 BTU of heat per hour. In my experience, that can cool my bedroom from 80F to 70F in about 10 minutes; your mileage will vary. A simple conversion shows 320 watts is roughly 1,091.9 BTU per hour. If my uneducated, guesswork math is close to right, that's roughly 10 degrees per hour that needs to be cooled, or vented out of the room, on top of what your computer puts out without the graphics card. I know my room gets HOT if I run games for a few hours solid, and it's going to be roughly 30% more heat from the video card. And I'm still running a 10-series card.
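The watts-to-BTU/h conversion used above is straightforward; an illustrative Python check (1 BTU ≈ 1055.06 J, and a watt is one joule per second):

```python
# Convert sustained electrical power (W) to heat output in BTU per hour.
def watts_to_btu_per_hour(watts: float) -> float:
    return watts * 3600 / 1055.06  # J/h divided by J per BTU

print(round(watts_to_btu_per_hour(320), 1))  # 320 W card: ~1091.9 BTU/h
```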


----------



## ODOGG26 (Aug 28, 2020)

Also from this article, AMD is getting an "exclusive" special enhanced 5nm node. So their 5nm products might be crazy good if it's being specifically built to match AMD's product designs. Really interesting stuff.









AMD Zen 4 Powered Ryzen 5000 & EPYC Genoa CPUs Get Exclusive 'Enhanced 5nm Node' From TSMC (wccftech.com)


----------



## EarthDog (Aug 28, 2020)

Is this thread about NV GPUs soon to be released or AMD nodes?


----------



## Jinxed (Aug 28, 2020)

EarthDog said:


> Is this thread about NV GPUs soon to be released or AMD nodes?


Exactly. From the attempts to downplay raytracing to the AMD fans talking about future nodes, it doesn't seem like the Red team has much confidence in RDNA2 / Big Navi. I for one am really curious just how much Nvidia is going to push raytracing forward. The rumours about a 4x increase could be true, given the TDP and classic CUDA core counts from this leak. Maybe even more than that? Can't wait to see.


----------



## BoboOOZ (Aug 28, 2020)

Well, a bit more on topic, I saw the poll, and apparently there are 5% of the users willing to wait 5 years or more for Intel to come with competitive high-end desktop graphics   ...


----------



## AnarchoPrimitiv (Aug 28, 2020)

Wow, these ARE power hungry. Now I'm really starting to believe the leaks that Nvidia was forced to do this because RDNA2 is that competitive, and that the second-biggest Navi will be the price-to-performance-to-power-usage star of the new gen cards. 

Now I'm really excited for RDNA2.... But I did just get a 5700 XT last November, so maybe I actually just might check out the Xbox Series X in all honesty.... Never been excited about consoles, but it's definitely different this time around


----------



## DuxCro (Aug 28, 2020)

Idk, should I upgrade from my HD 4850 512MB GDDR3? Not much performance difference, it seems. Minus the lack of ray tracing.


----------



## ZoneDymo (Aug 28, 2020)

rtwjunkie said:


> Do you really “need” more than 10GB VRAM?



Bit of a hard question to answer; do you "need" anything in this space? Do you even "need" a dedicated GPU?

This is about high-end gaming, and more Vram to work with is better: higher resolution textures, shadow quality and other stuff.
10gb on a new flagship 3080 is..... just extremely lackluster.

Like I said, it's like Big N is following Big I and just selling us a nothingburger due to lack of competition in these price brackets.


----------



## BoboOOZ (Aug 28, 2020)

DuxCro said:


> Idk should i upgrade from my HD 4850 512MB GDDR3? Not much performance difference, it seems. Minus the lack of Ray tracing.


Well if you're only playing Baldur's Gate II, ray tracing isn't supported anyway...


----------



## AnarchoPrimitiv (Aug 28, 2020)

Jinxed said:


> Trying to pull 2 year old arguments? Hardly. Given the fact that many AAA titles are raytracing enabled, the fact that main game engines like UE now support raytracing and some of the biggest games are going to be raytraced - Cyberpunk 2077, Minecraft just for an example - nobody believes those old arguments anymore. And you seem to have a bit of a split personality - your beloved AMD is saying new consoles and their RDNA2 GPUs will support raytracing as well. So what are you really trying to say? Are you trying to prepare for the eventuality that AMD's raytracing performance sucks?



Someone needs to tell this guy to chill out and to stop taking criticism of a corporation and their products personally (weird, right?).  I never understood the individuals who appoint themselves as the defenders of someone else's honor, especially when that someone isn't even a person but an abstract entity.  Everyone needs to take a breath and remind themselves these are inanimate pieces of hardware...

.... Can't we all just get along?

P.S.  The guy he's replying to isn't trying to "prepare for AMD ray tracing to be bad"; he's making the point that the consoles dominate how things get done because consoles represent the vast bulk of gaming. So what he's saying is that however AMD does ray tracing will be the predominant way ray tracing gets done, and therefore games will primarily optimize for AMD's method over Nvidia's.

I'm just imagining the head of a development team calling 911:
911 Operator: "What's the problem this evening?"

Developer: "My family had a package delivered today that had an Nvidia video card in it and a .357 round...and now there's some guy wearing a leather jacket snooping around the bushes outside of my house!"

911 Operator: "OK, can you describe the individual so the police know who they're looking for?"

Developer: "Yeah, OK.... Um.... He's probably about 5'1" tall, has a $5 haircut and he's ranting and raving like a madman.... Something about 'buying more is saving more', he might be mentally unstable."

911 Operator:" OK, you said he's wearing a leather jacket, what else is he wearing?"

Developer:" He's just wearing the leather jacket and a pair of knee high construction boots, but other than that, he's completely naked from the waist down"

911 Operator: "We'll have someone there immediately"


----------



## EarthDog (Aug 28, 2020)

ZoneDymo said:


> Bit of a hard question to answer; do you "need" anything in this space? Do you even "need" a dedicated GPU?
> 
> This is about high-end gaming and more Vram to work with is better: higher resolution textures, shadow quality and other stuff.
> 10gb on a new flagship 3080 is..... just extremely lackluster.
> ...


To play games at ultra settings at 1080p/60hz, YES you need a dedicated GPU for most titles.

The flagship isn't a 3080, however.

lol at the third line... here is a tissue for your premature ejac...errr opinion.


----------



## AnarchoPrimitiv (Aug 28, 2020)

BoboOOZ said:


> Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration at 1.9GHz, why would Nvidia only manage 1.7? I would believe this only if it was on Samsung 8nm and if it's really not a good node...



Maybe they can boost higher, but then they'd be consuming 450-600 watts
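For a rough sense of why, the usual CMOS rule of thumb says dynamic power scales linearly with frequency and with the square of voltage; a minimal sketch, with purely illustrative ratios (not leaked figures):

```python
# Dynamic power ~ C * V^2 * f: clocks scale power linearly, voltage quadratically.
# Higher clocks usually need higher voltage, so power rises superlinearly.
def scaled_power(base_watts: float, freq_ratio: float, volt_ratio: float) -> float:
    """Estimate power after scaling clock frequency and core voltage."""
    return base_watts * freq_ratio * volt_ratio ** 2

# Hypothetical: pushing a 350 W card 12% higher on clocks at 15% more voltage.
print(round(scaled_power(350, 1.12, 1.15)))  # ~518 W
```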



DuxCro said:


> If i remember correctly, Nvidia said that AMD reserved most of the 7nm over at TSMC. But they went on ahead and beat AMD into reserving 5nm for next year. So wait for RTX 3000 Super series for TSMC 5nm.



Where have you heard that concerning 5nm? I don't know how that's possible, considering AMD has had 5nm reservations for Zen 4 for a while now



EarthDog said:


> The point, however, is that RT tech is here... it isn't a gimmick with everyone all in. Capeesh?
> Have a cookie...just saying 300W is nothing compared to some cards in the past.



Yeah, the past as in 45nm to 28nm.... For them to be that high at 8nm (Samsung is 8nm, right?) is a little curious, but then again I've heard leaks that Nvidia is really worried about RDNA2 and was forced into this position


----------



## ZoneDymo (Aug 28, 2020)

EarthDog said:


> To play games at ultra settings at 1080p/60hz, YES you need a dedicated GPU for most titles.
> 
> The flagship isn't a 3080, however.
> 
> lol at the third line... here is a tissue for your premature ejac...errr opinion.



Well that's the point: "do you 'need' X" is a difficult question to answer; now you are quantifying it, now we can talk about something.
Do you need more than 10gb of Vram....well yeah, if a game developer wants to stuff massive high-quality textures on your 4k or higher resolution screen, it does.
Why would the 3090 have 24gb if 10 was enough....

The flagship isn't the 3080 but it's close enough really; this is the first time we have a 3090 as a clear bracket above. Before this it was more 50 - 60 - 70 - 80 and then 80 special edition.
80 is up there, way too up there to have a pedestrian 10gb of Vram. Today I would give that to the 3060 honestly; the 1060 had 6.....

No clue what your reaction to "the third line " is about.


----------



## Vayra86 (Aug 28, 2020)

Jinxed said:


> As in Cyberpunk 2077? Yeah, right, that's a weak title. Minecraft? Yeah, also a weak title. Anything made on UE for the foreseeable future? Also weak and unimportant. Console ports made for DXR? Also unimportant ..  Riiiiight.



Those titles are not any weaker or stronger with or without RT; reading comprehension, buddy.

They were all there before we even knew it was going to be in them. I'm hard passing on paying a premium for features I barely get to use and see little value in. It is just a graphical update. Not much more. There is no new content for it, no new gameplay, nothing that gives me even a minute more fun than it used to.

I'm not paying to early adopt and beta test technology that really isn't feasible in the economic reality we used to have. And that reality has gotten worse, too. It's not like we're enjoying big paychecks because all is going so well lately, are we? Why then would the industry force us to move into much bigger dies with much higher cost to make? You know the answer.

The fact power budgets apparently need to go up to 320W even for a non-enthusiast 3080 is telling. It speaks of a design that needed to be stretched to meet its demands. It's Vega all over again, really, and I passed on that too. 320W is not entering my case, especially not just for some fancy lighting.

My principles haven't changed; maybe now people understand pro-Nvidia doesn't exist over here... I am pro-consumer and pro-progress. This isn't progress, it's brute-forcing something that used to be done much more efficiently, and making us pay the bill. Hell no. I'm still interested in how Nvidia and AMD fill up the mid range. It better be really good price/perf-wise, or I'm left with this 1080 for another few years. Which is fine, by the way; it does all I need it to.


----------



## Jinxed (Aug 28, 2020)

AnarchoPrimitiv said:


> Someone needs to tell this guy to chill out and to stop taking criticism of a corporation and their products personally (weird, right?).  I never understood the individuals who appoint themselves as the defenders of someone else's honor, especially when that someone isn't even a person but an abstract entity.  Everyone needs to take a breath and remind themselves these are inanimate pieces of hardware...
> 
> .... Can't we all just get along?
> 
> ...


Trying to belittle the head of a corporation that currently beats your beloved company on every level does not change the actual facts. No matter how hard you try. Raytracing is here to stay; lots of games use it and even more are going to. Both Intel and AMD are also getting onto the raytracing train with their future products, so it's the future. This leak shows pretty clearly that raytracing is going to be a big focus point of Ampere. We've heard many attempts to joke about Huang's leather jackets etc., and it doesn't achieve anything other than showing you ran out of arguments. Let's just stick to the facts and the topic of this article - Ampere specs - not leather jackets.


----------



## AnarchoPrimitiv (Aug 28, 2020)

Jinxed said:


> Exactly. From the attempts to downplay raytracing to the AMD fans talking about future nodes, it doesn't seem like the Red team has much confidence in RDNA2 / Big Navi. I for one am really curious just how much Nvidia is going to push raytracing forward. The rumours about a 4x increase could be true, given the TDP and classic CUDA core counts from this leak. Maybe even more than that? Can't wait to see.



What are you talking about? The reason he was bringing up AMD nodes is because someone literally made the completely false claim that Nvidia had 5nm reserved to the point of blocking AMD out, and then people were correcting him. That's all; it seems straightforward with no ulterior motives, at least to me.



Jinxed said:


> Trying to belittle the head of a corporation that currently beats your beloved company on every level does not change the actual facts. No matter how hard you try. Raytracing is here to stay; lots of games use it and even more are going to. Both Intel and AMD are also getting onto the raytracing train with their future products, so it's the future. This leak shows pretty clearly that raytracing is going to be a big focus point of Ampere. We've heard many attempts to joke about Huang's leather jackets etc., and it doesn't achieve anything other than showing you ran out of arguments. Let's just stick to the facts and the topic of this article - Ampere specs - not leather jackets.


Dude, it's a joke to lighten the mood, and instead of laughing, you decided to somehow take it personally again.... Also, why does making a joke about Nvidia automatically make someone an AMD fanboy?  I don't follow how that works.... Are there only two choices in this world: either be personally offended for Nvidia or be an AMD fanboy?  How about I have no loyalty and buy whatever offers the best value at the price I can afford?  Is that OK with you, or am I forced into this false binary you've imagined?

Never would I imagine that a light-hearted joke to cut the tension, directed toward someone a million miles away from this forum, would result in someone here being personally offended and then attacking me personally, when I did no such thing. Why was it necessary to make it personal?

"Ran out of arguments"?  What argument was I making?  I believe I didn't make a single argument in this entire thread, and it seems like you're imagining a fight that doesn't exist.... I'm sincerely just trying to understand what occurred here, and why you think I'm engaging in combat when all I did was make a joke to make people laugh, and haven't attacked anyone or called anyone out.  So far the only thing I did was try and point out to someone that people were talking about AMD process nodes to correct someone else who had brought it up first and made a false statement about Nvidia reserving all the 5nm production, when there are numerous reputable sources to the contrary.... Why are you attempting to drag me into a fight with you for no reason I can discern?

Again, I'm not trying to be combative, never have been, I'm just sincerely and genuinely trying to understand what's going on to de-escalate the situation and find out what's occurred to remedy it


----------



## Jinxed (Aug 28, 2020)

Vayra86 said:


> Those titles are not any weaker or stronger with or without RT.
> 
> They were all there before we even knew it was going to be in them. I'm hard passing for paying premium on features I barely get to use and see little value in. It is just a graphical update. Not much more. There is no new content for it, no new gameplay, nothing that gives me even a minute more fun than it used to.


So is higher resolution, shadows, higher-detailed models and basically all the graphical advances over the years. All just graphical updates. With your logic, we could all just stay at 640x480. In reality all those advances, including raytracing, improve immersion. And while a good story and gameplay are still the key, and it's sad that some games put graphical fidelity first, I agree on that, adding more immersion and realism is a huge boost when the gameplay is good.


----------



## Xex360 (Aug 28, 2020)

I was expecting the 3070 to comfortably beat the 2080 Ti given the new node and architecture, but it doesn't seem like it, and given Nvidia's dominance in the market I don't expect them to price Ampere reasonably. I only hope AMD can do a Zen 2 for GPUs. Navi was very good (besides driver issues); it beat Turing while being cheaper, but the performance offered was a bit low. It would've been better to have a 5800 XT comparable to a 2080/S for $100 less.


----------



## Vayra86 (Aug 28, 2020)

Jinxed said:


> So is higher resolution, shadows, higher-detailed models and basically all the graphical advances over the years. All just graphical updates. With your logic, we could all just stay at 640x480. In reality all those advances, including raytracing, improve immersion. And while a good story and gameplay are still the key, and it's sad that some games put graphical fidelity first, I agree on that, adding more immersion and realism is a huge boost when the gameplay is good.



Blanket statements. Not interested in this level of debate, sorry. Realism and immersion were achieved prior to RT. It is not necessary, it is only necessary to feed an industry that is running into the limits of mainstream resolutions. Which underlines my statement: I can do just fine riding this 1080 for a few more years. A 2016 card that still hits 120 fps at 1080p. That is why RT exists. Don't kid yourself. Nothing is stopping me from jumping in when there IS actual content for it, that CAN actually run on something that doesn't need a nuclear fission plant and ditto cooling.

Smart companies create new demand to feed their business. It's always been this way. It's up to us to tell them to stick it back in their behind for further refinement... or we can jump on it like gullible fools and pay their premium for them. In hindsight everyone can agree that Turing was exactly that: early-adopter territory. Why is Ampere different? The content still isn't there, the price has increased yet again... explain the madness.

I didn't buy 300W AMD cards, and I don't plan to start now even if the color is different.



BoboOOZ said:


> Well, these might still be inaccurate leaks, but 10GB sounds weird to me too. As does the idea of having a 3080 variant with 20GB.
> 
> I always thought that AMD's policy of making the 5500 with 2 memory sizes was stupid: 4GB is definitely not enough for some games, while 8GB will never be fully utilized by such a weak GPU with low bandwidth. I prefer the manufacturer to pick the optimal memory size for that SKU and stick with it.
> 
> And there's a good chance that 10GB might not be enough for 4K in 2 years.



The whole thing looks like a complete failure if you ask me, and Turing was no different. Too pricy and bulky, too little to show us to motivate that price.

I think it's very likely we will see a quick SUPER update to this lineup. And it could be big; if it truly is all based on foundry issues, an update might fix things bigtime. Fix, there I said it, because this looks broken af.


----------



## xSneak (Aug 28, 2020)

I was hoping for displayport 2.0 on these cards. Disappointing because these cards will be able to do high frame rates at 4k.


----------



## BoboOOZ (Aug 28, 2020)

ZoneDymo said:


> Why would the 3090 have 24gb if 10 was enough....


Well, these might still be inaccurate leaks, but 10GB sounds weird to me too. As does the idea of having a 3080 variant with 20GB.

I always thought that AMD's policy of making the 5500 with 2 memory sizes was stupid: 4GB is definitely not enough for some games, while 8GB will never be fully utilized by such a weak GPU with low bandwidth. I prefer the manufacturer to pick the optimal memory size for that SKU and stick with it.

And there's a good chance that 10GB might not be enough for 4K in 2 years.


----------



## crazyeyesreaper (Aug 28, 2020)

So 350 watts max; with the TDP limit raised, the card could get close to 400+ watts OCed


----------



## EarthDog (Aug 28, 2020)

ZoneDymo said:


> Well that's the point, "do you 'need' X" is a difficult question to answer; now you are quantifying it, now we can talk about something.
> Do you need more than 10gb of Vram....well yeah if a game developer wants to stuff massive high-quality textures on your 4k or higher resolution screen it does.
> Why would the 3090 have 24gb if 10 was enough....
> 
> ...


Do I really have to answer why a flagship has 24GB versus another card down the stack having less? You've been here long enough... think about it. 

But it ISN'T the flagship. There isn't a 'close enough'. 10GB = pedestrian, lolololol. It's 25% more than the 2080... and a mere 1GB less than the 2080 Ti. I'd call it an improvement... especially at where it is intended to play games. 

That is a reaction to the third line of the post I quoted. Premature is premature.


----------



## Jinxed (Aug 28, 2020)

AnarchoPrimitiv said:


> What are you talking about? The reason he was bringing up AMD nodes is because an obvious Nvidia fan literally made the completely false claim that Nvidia had 5nm reserved to the point of blocking AMD out, and then people were correcting him. Do you even bother reading things before commenting on them?


Oh really? And this is what?


BoboOOZ said:


> Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7nm iteration at 1.9GHz, why would Nvidia only manage 1.7? I would believe this only if it was on Samsung 8nm and if it's really not a good node...



He was not the one starting the process nodes argument. Let's just end it right here and move on.


----------



## BoboOOZ (Aug 28, 2020)

Xex360 said:


> I was expecting the 3070 to comfortably beat the 2080 Ti given the new node and architecture, but it doesn't seem like it, and given Nvidia's dominance in the market I don't expect them to price Ampere reasonably. I only hope AMD can do a Zen 2 for GPUs. Navi was very good (besides driver issues); it beat Turing while being cheaper, but the performance offered was a bit low. It would've been better to have a 5800 XT comparable to a 2080/S for $100 less.


Well, looking at memory bandwidths, the 3070 might tie or beat a 2080 Super in pure rasterized performance, and that's a very decent showing.

Where the 3070 will beat the 2080 Ti handily is most probably in ray tracing...
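The bandwidth comparison above is easy to check with the standard formula (per-pin data rate times bus width, divided by 8 bits per byte); the 3070 and 3080 inputs are the leaked figures from the article, and the 2080 Super figure is its public spec:

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s = per-pin rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gb_s(16.0, 256))  # rumored RTX 3070, GDDR6:  512.0 GB/s
print(peak_bandwidth_gb_s(15.5, 256))  # RTX 2080 Super, GDDR6:    496.0 GB/s
print(peak_bandwidth_gb_s(19.0, 320))  # rumored RTX 3080, GDDR6X: 760.0 GB/s
```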


----------



## medi01 (Aug 28, 2020)

Jinxed said:


> Joined Jan 21, 2020
> 
> 2 year old arguments?


Impressive.




Jinxed said:


> raytracing performance sucks?


It doesn't matter, as we are not talking about performance, but only about power consumption.

So... why would "RT cores" (of which we now have "more") consume power when NOT running one of the handful of RT-enabled games?


----------



## ppn (Aug 28, 2020)

The most interesting card is the 3070, if it's a full chip, nothing gimped, and on 7nm EUV.


----------



## Blueberries (Aug 28, 2020)

The most interesting thing about this leak is that the 3090 will be sold by AIB partners and won't be exclusive like the previous generation Titans


----------



## Jinxed (Aug 28, 2020)

Vayra86 said:


> Blanket statements. Not interested in this level of debate, sorry. Realism and immersion were achieved prior to RT. It is not necessary, it is only necessary to feed an industry that is running into the limits of mainstream resolutions. Which underlines my statement: I can do just fine riding this 1080 for a few more years. A 2016 card that still hits 120 fps at 1080p. That is why RT exists. Don't kid yourself. Nothing is stopping me from jumping in when there IS actual content for it, that CAN actually run on something that doesn't need a nuclear fission plant and ditto cooling.



You started it, were proven wrong and now you're running away. Evolution of computer graphics is incremental and there's no denying that. Trying to downplay a significant advance in technology just because your favourite company is behind is silly. You started this level of debate by trying to downplay RT with false claims. If you don't want to have such debates, don't start them next time and stay on topic.



medi01 said:


> Jinxed said:
> Joined Jan 21, 2020
> 2 year old arguments?





medi01 said:


> Impressive.



Yes, AMD fans were trying to downplay raytracing since Turing GPUs were released. So yes, 2 years LOL


----------



## DuxCro (Aug 28, 2020)

Rumor control. Don't take any of these RUMORS as facts.


----------



## EarthDog (Aug 28, 2020)

Vayra86 said:


> The whole thing looks like a complete failure if you ask me, and Turing was no different.


Who replaced Vayra with one of our fanboys? 

NV is penalized by some for leading the way. If it wasn't the way (RT), AMD wouldn't follow. Pricing and power use are likely to be out of line, but I'd imagine most any title will now be playable with full RT........ at least that is the hope with the rumored power consumption. Couple that with consoles getting RT titles, and there is a better chance than ever to see a lot more momentum on that front, no? Such absolutes from you...... ... not used to that.


----------



## Vayra86 (Aug 28, 2020)

Jinxed said:


> You started it, were proven wrong and now you're running away. Evolution of computer graphics is incremental and there's no denying that. Trying to downplay a significant advance in technology just because your favourite company is behind is silly. You started this level of debate by trying to downplay RT with false claims. If you don't want to have such debates, don't start them next time and stay on topic.



No, I'm not running away, I'm making a choice as a consumer. It's really that simple. If AMD produces something similar, it's a no-buy in either camp.

I agree about evolution in computer graphics. It's about picking your battles. Remember 3DFX?



EarthDog said:


> Who replaced Vayra with our fanboys.
> 
> NV is penalized by some for leading the way. If it wasn't the way (RT), AMD wouldn't follow. Pricing and power use are likely to be out of line, but I'd imagine most any title will now be playable with full RT........ at least that is the hope with the rumored power consumption. Couple that with consoles getting RT titles, there is a better chance than ever to see a lot more momentum on that front, no? Such absolutes from you...... ... not used to that.



Wait and see mode for me, simple. I care about content before hardware.

Let's see that momentum, first. So far the consoles aren't out and we know it takes a while for content to mature on them. Patience, young gwasshoppa



ppn said:


> The most interesting card is 3070. if full chip nothing gimped and 7nm EUV.



The last hope...


----------



## RedelZaVedno (Aug 28, 2020)

I'll postpone my buying decision until RDNA2 comes out, plus a few months for prices to settle. I'll go with AMD if RDNA2 brings more value to the table; if not, I'll opt for Nvidia, or skip this generation altogether if the price/performance ratios of both GPUs suck. The 1080 Ti is still a decent GPU after all, and I really don't care about ray tracing at this point in time. Maybe that will change in 2-3 years.


----------



## Jinxed (Aug 28, 2020)

Vayra86 said:


> No, I'm not running away, I'm making a choice as a consumer. It's really that simple. If AMD produces something similar, it's a no-buy in either camp.


You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.


----------



## ZoneDymo (Aug 28, 2020)

EarthDog said:


> Do I really have to answer why a flagship has 24GB versus another card down the stack having less? You've been here long enough... think about it.
> 
> But, it ISN'T, the flagship. there isn't a 'close enough'. 10GB = pedestrian, lolololol. It's 25% more than the 2080... and a mere 1GB less than 2080 Ti. I'd call it an improvement... especially at where it is intended to play games.
> 
> That is a reaction to the third line of the post I quoted. Premature is premature.



No, but you do have to answer why that gap is so insanely huge, more than twice the RAM? Borderline 2.5x? That is just insane.
And again, in the midrange of old, the RX 480 had 8gb of ram and the gtx1060 had 6gb of ram...to have an RTX 3080 now with 10gb is just pathetic imo with an eye on progression and placement.


----------



## AnarchoPrimitiv (Aug 28, 2020)

Jinxed said:


> Oh really? And this is what?



Me doing a recap of what happened: namely, that someone made the claim that 5nm was all reserved by Nvidia, when there are numerous sources claiming otherwise. But if you disagree that's fine too; it's not a big deal.

I'll be happy to move on, I would just like to respectfully ask why you felt it necessary to label me an AMD fanboy, when nothing I did made any indication of that, and also felt the need to be personally offended by my joke, and then decided to make me personally a target of your ire and make a bunch of negative assumptions about my character, that's all.

I think when I said "did you bother reading before commenting", you were assuming I was doing it in an antagonistic way when I was sincerely just asking if you had happened to miss those comments previous to your own.  That's why I'm confused, because never once was I intending any of my comments to be combative, but they were somehow taken as such.  So now, I'm trying to understand why that was, that's all.  I'm more than happy to just drop it, but I'd prefer for us to come to an understanding and end on a more cordial and friendly note and walk away amicably, so that I can then do my best to avoid misunderstandings in the future, but if you feel that's not necessary, that's more than fine too.

Again, I'm saying all this in a purely friendly, sincere and respectful way and not in an antagonistic or confrontational way at all


----------



## EarthDog (Aug 28, 2020)

Vayra86 said:


> Wait and see mode for me, simple.


Certainly better than......


Vayra86 said:


> The whole thing looks like a complete failure if you ask me, and Turing was no different.


   



ZoneDymo said:


> No, but you do have to answer why that gap is so insanely huge, more than twice the RAM?


Now you're talking.


----------



## Vayra86 (Aug 28, 2020)

Jinxed said:


> You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.



I'm curious what you think my motivation would be, then, if I were lying about this. But you are free to believe whatever you like 



EarthDog said:


> Certainly better than......
> 
> 
> Now you're talking.



Fair enough lol


----------



## rtwjunkie (Aug 28, 2020)

ZoneDymo said:


> Do you need more than 10gb of Vram....well yeah if a game developer wants to stuff massive high quality textures on your 4k or higher resolution screen it does.
> Why would the 3090 have 24gb if 10 was enough....


Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.

As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.


----------



## RedelZaVedno (Aug 28, 2020)

ZoneDymo said:


> No, but you do have to answer why that gap is so insanely huge, more than twice the RAM? Borderline 2.5x? That is just insane.
> And again, in the midrange of old, the RX 480 had 8gb of ram and the gtx1060 had 6gb of ram...to have an RTX 3080 now with 10gb is just pathetic imo with an eye on progression and placement.


I agree, 10GB is just not enough on an xx80 GPU these days. MS Flight Simulator 2020 called for 12.7GB of VRAM usage while flying over Dubai at 4K/ultra. 16GB of VRAM should be a safe amount on high-end, 4K-capable GPUs. But I do believe Nvidia's AIBs will offer a 20GB variant (according to Igor's Lab, which is one of the most trusted leakers imo), but for more $$$, price gouging wherever it can; it's Ngreedia after all ;-)


----------



## Vayra86 (Aug 28, 2020)

rtwjunkie said:


> Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.
> 
> As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.



Well it's pretty strange to see 11GB on a 3-year-old GPU and then 10GB on the newer one that is going to be much faster. That's a step back no matter how you spin it.

As to the 24GB... it speaks of a product lineup that is out of balance, not as intended, whatever you want to attribute it to. But they sure as hell don't do it just because it's a nice number to win with. Previous GPUs underline that. VRAM capacity was never a real battle, especially not in the high end.


----------



## medi01 (Aug 28, 2020)

*Cyberpunk *will have *customizable genitalia* (as they are misogynists, only penises).

Imagine how glorious all that will be, when RT gimmick is slapped on it.

That alone justifies paying $2000 for 3090 (although I think it would cost $1400, basically like 2080Ti)


----------



## Jinxed (Aug 28, 2020)

AnarchoPrimitiv said:


> I'll be happy to move on, I would just like to respectfully ask why you felt it necessary to label me an AMD fanboy, when nothing I did made any indication of that, and also felt the need to be personally offended by my joke, and then decided to make me personally a target of your ire and make a bunch of negative assumptions about my character, that's all.



What I actually said was:


> Exactly. From the attempts to downplay raytracing to the AMD fans talking about future nodes, it doesn't seem like the Red


AMD fans are not necessarily AMD fanboys. You said that, not me. Although I do use that term when it's really obvious. I don't recall, and can't find, anywhere in this discussion where I called YOU an AMD fanboy. You used that first. Are you done?


----------



## rtwjunkie (Aug 28, 2020)

Jinxed said:


> Yes, AMD fans were trying to downplay raytracing since Turing GPUs were released. So yes, 2 years LOL


Lots of rational, clear-headed Nvidia fans have downplayed RT as well.



Vayra86 said:


> Well it's pretty strange to see 11GB on a 3-year-old GPU and then 10GB on the newer one that is going to be much faster. That's a step back no matter how you spin it.


You’re comparing apples to oranges. Model to model is what you must compare. Does the 2080 have 11GB VRAM? No. It occupies the same space within its lineup as the 3080 does. 3080 improves upon that by adding 2GB.

11GB was what the top-ranked 2080 Ti had. That's the same place in its family the new 3090 will occupy.


----------



## Jinxed (Aug 28, 2020)

rtwjunkie said:


> Lots of rational, clear-headed Nvidia fans have downplayed RT as well.


Not sure which ones you're talking about. But I certainly recall a lot of people who were known to hate Nvidia for years doing that, only to say: "Hey, I have an Nvidia GPU, I'm unbiased / an Nvidia fan / etc.". That more than anything else shows me how bad the situation of Team Red is, if even their most adamant fans are buying Nvidia GPUs (provided they were telling the truth about that in the first place).


----------



## rtwjunkie (Aug 28, 2020)

Jinxed said:


> You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.


Make up your mind. You are either giving him the benefit of the doubt or you don’t believe him.


----------



## Vayra86 (Aug 28, 2020)

rtwjunkie said:


> Lots of rational, clear-headed Nvidia fans have downplayed RT as well.
> 
> 
> You’re comparing apples to oranges. Model to model is what you must compare. Does the 2080 have 11GB VRAM? No. It occupies the same space within its lineup as the 3080 does. 3080 improves upon that by adding 2GB.
> ...



No, I'm comparing performance to performance because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080ti compared to 3080 still does that a gen later than the 1080ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan as it did for a 1080 with 8GB during Pascal. Catch my drift?


----------



## Jinxed (Aug 28, 2020)

rtwjunkie said:


> Make up your mind. You are either giving him the benefit of the doubt or you don’t believe him.


He actually stated multiple things in his reply, and I am replying to multiple statements. I don't believe that it's as simple as him making a consumer choice. But I give him the benefit of the doubt that he's not going to buy any GPU if AMD also brings a raytracing-oriented GPU. Read the whole quoted part, please.


----------



## ODOGG26 (Aug 28, 2020)

AnarchoPrimitiv said:


> What are you talking about, the reason why he was bringing up AMD nodes is because someone literally made the completely false claim that Nvidia had 5nm reserved to the point of blocking AMD out, and then people were correcting him, that's all; seems straightforward with no ulterior motives, at least to me?
> 
> 
> Dude, it's a joke to lighten the mood, and instead of laughing, you decided to somehow take it personally again.... Also, why is making a joke about Nvidia, automatically make someone an AMD fanboy?  I don't follow how that works.... are there only two choices in this world, either be personally offended for Nvidia or be an AMD fanboy?  How about I have no loyalty and buy whatever offers the best value at the price I can afford?  Is that OK with you, or am I forced into this false binary you've imagined?
> ...



Yeah, he and the other guy didn't follow why the node replies were there and went finger-happy on their keyboards to label people AMD fanboys, which to me clearly shows who was fanboying. Like you said, people really need to lighten up. It's not that serious.


----------



## M2B (Aug 28, 2020)

Vayra86 said:


> No, I'm comparing performance to performance because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080ti compared to 3080 still does that a gen later than the 1080ti. That's a looong freeze on VRAM to then give a flagship 24GB.
> 
> Let me put it differently. It doesn't speak for the 3080's lifespan as it did for a 1080 with 8GB during Pascal. Catch my drift?



The 3080 most probably will have a 20GB variant.


----------



## rtwjunkie (Aug 28, 2020)

Vayra86 said:


> No, I'm comparing performance to performance because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080ti compared to 3080 still does that a gen later than the 1080ti. That's a looong freeze on VRAM to then give a flagship 24GB.
> 
> Let me put it differently. It doesn't speak for the 3080's lifespan as it did for a 1080 with 8GB during Pascal. Catch my drift?


Just don't compare price tiers and you're good. Just the chips and where they are in their tiers. It keeps the picture much clearer.

Sure, you're correct that the 3080 isn't a huge leap in VRAM. But honestly, it may mostly not be needed. Don't you frequently argue that devs are loading way more into VRAM than necessary?


----------



## Vayra86 (Aug 28, 2020)

M2B said:


> The 3080 most probably will have a 20GB variant.



That would make the appearance of the 10GB one even more questionable.



rtwjunkie said:


> Just don't compare price tiers and you're good. Just the chips and where they are in their tiers. It keeps the picture much clearer.
> 
> Sure, you're correct that the 3080 isn't a huge leap in VRAM. But honestly, it may mostly not be needed. Don't you frequently argue that devs are loading way more into VRAM than necessary?



OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yields, scraps and leftovers to me, sold at a premium. Let's see how that VRAM is wired...


----------



## ODOGG26 (Aug 28, 2020)

Jinxed said:


> Oh really? And this is what?
> 
> 
> He was not the one starting the process nodes argument. Let's just end it right here and move on.



No, he didn't start a process node argument. You did. He seemed to be genuinely curious or confused as to why these would be the clocks if using the same node as another player who managed higher clocks. He wasn't bashing NVIDIA or anything, because it seems like those numbers may or may not be right. But let's move on, because you clearly aren't comprehending things properly atm.


----------



## HammerON (Aug 28, 2020)

ODOGG26 said:


> Yeah, he and the other guy didn't follow why the node replies were there and went finger-happy on their keyboards to label people AMD fanboys, which to me clearly shows who was fanboying. Like you said, people really need to lighten up. It's not that serious.


Alright folks, stop with the fanboy crap - pretty please.
Remember that it is okay to disagree (and just leave it at that) and it often does not work to challenge those with different opinions than yours.
Carry on.


----------



## M2B (Aug 28, 2020)

Vayra86 said:


> That would make the appearance of the 10GB one even more questionable.
> 
> 
> 
> OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.



Probably because of the expense.
If they only release a 20GB variant, it should end up being at the very least $100 more expensive and add nothing to performance (for now at least), so they might think it's better to have, let's say, a base price of 799 instead of 999 so people don't freak out that much.
And let's be honest here, 20GB is very much overkill, and if you don't want to keep your card for ~4 years, why pay more...


----------



## BoboOOZ (Aug 28, 2020)

HammerON said:


> Alright folks, stop with the fanboy crap - pretty please.
> Remember that it is okay to disagree (and just leave it at that) and it often does not work to challenge those with different opinions than yours.
> Carry on.


Or just use the ignore function, it works wonders


----------



## rtwjunkie (Aug 28, 2020)

Vayra86 said:


> That would make the appearance of the 10GB one even more questionable.
> 
> 
> 
> ...


My personal opinion is Nvidia is still experimenting. You waiting another gen might be a good idea. I may either do that as well or pick up a used 2080 Ti.


----------



## Vayra86 (Aug 28, 2020)

rtwjunkie said:


> My personal opinion is Nvidia is still experimenting. You waiting another gen might be a good idea. I may either do that as well or pick up a used 2080 Ti.



You said it. +1.



M2B said:


> Probably because of the expense.
> If they only release a 20GB variant, it should end up being at the very least $100 more expensive and add nothing to performance (for now at least), so they might think it's better to have, let's say, a base price of 799 instead of 999 so people don't freak out that much.
> And let's be honest here, 20GB is very much overkill and if you don't want to keep your card for 4~ years why pay more...



I agree, but then that raises the question of how things are balanced in terms of margins etc.... if they can't produce that kind of GPU at 800, something is amiss. They have a node shrink, and they're even pretty late with it. The fact is, RT demands a much bigger die and we're paying the full price for it. The fact also is, RT content is still rare. By the time there is sufficient content, the GPU will be ready for replacement.


----------



## TheoneandonlyMrK (Aug 28, 2020)

Jinxed said:


> So is higher resolution, shadows, higher-detailed models and basically all the graphical advances over the years. All just graphical updates. With your logic, we could all just stay at 640x480. In reality all those advances, including raytracing, improve immersion. And while a good story and gameplay is still the key and it's sad that some games put graphical fidelity first, I agree on that, adding more immersion and realism is a huge boost when the gameplay is good.


We sort of are, with DLSS and upscaling, no?

Anyway,
I'm eager for a tech face-off, less of the over-the-fence sh#t talking and more doing.

With the upcoming GPU releases plus consoles at the same time, I can't see them all being winners, not with the Rona kicking wages to the kerb.


----------



## dicktracy (Aug 28, 2020)

7nm confirmed! Some fanboys are going to have to make up new lies for Ampere.


----------



## Jinxed (Aug 28, 2020)

rtwjunkie said:


> My personal opinion is Nvidia is still experimenting. You waiting another gen might be a good idea. I may either do that as well or pick up a used 2080 Ti.


Given the hundreds of hours of developer-oriented videos, workshops and presentations about techniques to implement raytracing, even actually lending developers to companies like CD Projekt, DICE, Microsoft, 4A and Epic to help them implement raytracing in their games and engines, developing new technologies of AI denoising and supersampling to support raytracing, along with what seems like a massive investment in raytracing silicon real estate on their GPUs, it sure does not seem like they are experimenting. It seems like they are fully committed. But of course you are entitled to your opinion.


----------



## Vayra86 (Aug 28, 2020)

Jinxed said:


> Given the hundreds of hours of developer-oriented videos, workshops and presentations about techniques to implement raytracing, even actually lending developers to companies like CD Projekt, DICE, Microsoft, 4A and Epic to help them implement raytracing in their games and engines, developing new technologies of AI denoising and supersampling, along with what seems like a massive investment in raytracing silicon real estate on their GPUs, it sure does not seem like they are experimenting. It seems like they are fully committed. But of course you are entitled to your opinion.



Commitment is nice, results count 

Turing didn't exactly generate momentum, did it?


----------



## BoboOOZ (Aug 28, 2020)

dicktracy said:


> 7nm confirmed! Some fanboys are going to have to make up new lies for Ampere.


7nm or 14nm, who cares, what matters is performance, price, and efficiency.
Anyway don't get all hyped up, quote from videocardz:
"The data that we saw clearly mention the 7nm fabrication node. At this time we are unable to confirm if this is indeed true. "


----------



## RedelZaVedno (Aug 28, 2020)

rtwjunkie said:


> Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.
> 
> As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.


That is just not true. When a game streams textures in real time (MS Flight Simulator 2020) or has procedurally generated worlds (SpaceEngine), you easily surpass 10GB of VRAM usage. I hit 12.7GB of VRAM usage when flying over Dubai at 4K/ultra, and filled all 16GB of Radeon VII VRAM in SpaceEngine. xx80 GPUs are HIGH END GPUs meant to be used at 4K resolution, so 10GB of VRAM just won't cut it. Hell, the Xbox Series X/PS5 have 16GB of GDDR6 with a 320-bit bus on board. Sure, it's not totally comparable because they lack DDR4, but still, they have ultra-fast SSDs to compensate to a degree.
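For context on why 4K asset sets inflate VRAM so quickly, here is a rough back-of-the-envelope sketch (my own illustration, not from the post; real engines use BCn texture compression, which this deliberately ignores, so the absolute numbers overstate actual usage):

```python
# Rough VRAM estimate for uncompressed RGBA8 textures.
# A full mipmap chain adds roughly one third on top of the base level.

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel          # base mip level, bytes
    total = base * 4 / 3 if mipmaps else base        # + mip chain (~1/3 extra)
    return total / 1024**2                           # convert to MiB

# A scene streaming 500 hypothetical 4096x4096 textures:
per_tex = texture_mb(4096, 4096)
print(f"{per_tex:.1f} MB each, {500 * per_tex / 1024:.1f} GB total")
```

Even with compression cutting that by 4-6x, it is easy to see how uncompressed or poorly streamed assets blow past a 10GB budget at 4K.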


----------



## Jinxed (Aug 28, 2020)

Vayra86 said:


> Commitment is nice, results count


We have to wait four more days for results (and then some for the reviews). But it's fun times for hardware fans either way.


----------



## rtwjunkie (Aug 28, 2020)

Jinxed said:


> Given the hundreds of hours of developer-oriented videos, workshops and presentations about techniques to implement raytracing, even actually lending developers to companies like CD Projekt, DICE, Microsoft, 4A and Epic to help them implement raytracing in their games and engines, developing new technologies of AI denoising and supersampling, along with what seems like a massive investment in raytracing silicon real estate on their GPUs, it sure does not seem like they are experimenting. It seems like they are fully committed. But of course you are entitled to your opinion.


The thing is, they are only committed to still getting it right. Content is still sparse. Name a few devs and a few upcoming titles on top of a few minimally RT’d titles so far, and we are still in the development phase, at buyer expense.

Fact is, every one of the released games still looks fantastic without RTRT. None of these features they are making available on super-priced GPUs makes playing the game better or is necessary. Game immersion (at least in SP games) comes from gameplay and a well-written story. Graphics only enhance it.



RedelZaVedno said:


> xx80 GPUs are HIGH END GPUs meant to be used at 4K resolution


They never have been actual 4K GPU’s, no matter what generation. Only the 80 Ti’s have been, and even then only on some currently released games, and by a year later when newer games come out they are left behind.


----------



## RedelZaVedno (Aug 28, 2020)

rtwjunkie said:


> They never have been actual 4K GPU’s, no matter what generation. Only the 80 Ti’s have been, and even then only on some currently released games, and by a year later when newer games come out they are left behind.


Anything costing 800 bucks should be 4K capable imho. The 1080Ti/Radeon VII/2080(S) are all 4K-capable cards.... It's 2020, for God's sake; people are buying 8K TV sets and we're still discussing whether GPUs costing near a grand should be 4K capable?


----------



## ppn (Aug 28, 2020)

"4K capable" means 60 FPS, not counting the 1% lows, with every setting at low (and disabled when applicable), in almost every game under the sun, or 99% of them. By that measure it is still very capable, I think. Even a 2070 is capable then; yay, let's sell it for $800. Now if you are talking about every setting maxed, those goalposts are constantly moving. That kind of thinking leads to very short-lived glory and a badly spent $800.
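For reference, the "1% lows" ppn mentions are just a percentile statistic over frame times. A minimal sketch of one common way reviewers compute them (hypothetical sample data; definitions vary between outlets):

```python
# Average FPS and "1% lows" from a list of frame times in milliseconds.

def fps_stats(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    # One common definition: mean FPS over the slowest 1% of frames.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_1pct = 1000 * n / sum(worst[:n])
    return avg_fps, low_1pct

times = [16.7] * 99 + [33.3]   # mostly 60 FPS with one 30 FPS stutter
avg, low = fps_stats(times)
print(f"avg {avg:.1f} FPS, 1% low {low:.1f} FPS")
```

The point of the metric: a card can average 60 FPS at 4K while its 1% lows reveal the stutters that make it feel anything but "4K capable".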


----------



## Jinxed (Aug 28, 2020)

rtwjunkie said:


> Fact is, every one of the released games still looks fantastic without RTRT. None of these features they are making available on super-priced GPUs makes playing the game better or is necessary. Game immersion (at least in SP games) comes from gameplay and a well-written story. Graphics only enhance it.


And I agree there, as I told Vayra:


> And while a good story and gameplay is still the key and it's sad that some games put graphical fidelity first, I agree on that, adding more immersion and realism is a huge boost when the gameplay is good.



Why NOT enhance it? I remember playing Thief 1 and 2. And it was great. Very primitive shadows, but still so immersive. Then Thief 3 came out, also excellent gameplay and story, but with a few generations of graphics enhancements, and that was such a massive boost. Did you need it in the previous two? No. Did it add a LOT to the 3rd episode? Absolutely. It's progress. It still baffles me that some people put their brand preferences above such an incredible technical advancement as raytracing, just because one brand is way behind.


----------



## BoboOOZ (Aug 28, 2020)

RedelZaVedno said:


> Anything costing 800 bucks should be 4K capable imho. The 1080Ti/Radeon VII/2080(S) are all 4K-capable cards.... It's 2020, for God's sake; people are buying 8K TV sets and we're still discussing whether GPUs costing near a grand should be 4K capable?


500-600 USD consoles will be 4K capable, so enthusiast GPUs should definitely be 4K capable.


----------



## rtwjunkie (Aug 28, 2020)

Jinxed said:


> And I agree there, as I told Vayra:
> 
> 
> Why NOT enhance it? I remember playing Thief 1 and 2. And it was great. Very primitive shadows, but still so immersive. Then Thief 3 came out, also excellent gameplay and story, but with a few generations of graphics enhancements, and that was such a massive boost. Did you need it in the previous two? No. Did it add a LOT to the 3rd episode? Absolutely. It's progress. It still baffles me that some people put their brand preferences above such an incredible technical advancement as raytracing, just because one brand is way behind.


No brand preference here. Every few years I go AMD, but mostly I use Nvidia.  I simply call things as I see them. I have a preference, but certainly not a loyalty.

I do agree, since I already said it, that gee-whiz graphics can enhance an already immersive game. But the cost for that extra needs to be reasonable, considering that none of it is mandatory for an enjoyable game.


----------



## TheoneandonlyMrK (Aug 28, 2020)

Jinxed said:


> And I agree there, as I told Vayra:
> 
> 
> Why NOT enhance it? I remember playing Thief 1 and 2. And it was great. Very primitive shadows, but still so immersive. Then Thief 3 came out, also excellent gameplay and story, but with a few generations of graphics enhancements, and that was such a massive boost. Did you need it in the previous two? No. Did it add a LOT to the 3rd episode? Absolutely. It's progress. It still baffles me that some people put their brand preferences above such an incredible technical advancement as raytracing, just because one brand is way behind.


Okay, less than five.

Starting to feel like a Family Guy slapping skit.




ppn said:


> "4K capable" means 60 FPS, not counting the 1% lows, with every setting at low (and disabled when applicable), in almost every game under the sun, or 99% of them. By that measure it is still very capable, I think. Even a 2070 is capable then; yay, let's sell it for $800. Now if you are talking about every setting maxed, those goalposts are constantly moving. That kind of thinking leads to very short-lived glory and a badly spent $800.


That's what you perceive as capable; meanwhile a fair few are gaming at 4K 60 in many games with whatever they have.
Neither you nor I gets to set the rules for everyone.


----------



## Jinxed (Aug 28, 2020)

rtwjunkie said:


> No brand preference here. Every few years I go AMD, but mostly I use Nvidia.  I simply call things as I see them. I have a preference, but certainly not a loyalty.
> 
> I do agree, since I already said it, that gee-whiz graphics can enhance an already immersive game. But the cost for that extra needs to be reasonable, considering that none of it is mandatory for an enjoyable game.


Reasonable is relative. It seems to me most people on tech forums these days focus too much on the high end. But the significance of the RTX 3090, strangely enough, is more important for the low end. Let me explain: if we really see a 4x or greater improvement in raytracing performance, we can start getting raytracing cards all the way down to the bottom of the stack. There may not be any more GTX cards. And even the lowest gaming card, like an RTX 3050, could have more raytracing power than the RTX 2060 has now. Do you realize the significance of that? Then raytracing REALLY becomes massive. It's so surprising almost no one mentions that in their articles.


----------



## P4-630 (Aug 28, 2020)

Jinxed said:


> There may not be any more GTX cards. And even the lowest gaming card like RTX 3050 could have more raytracing power than the RTX 2060 has now.



_The other intriguing tidbit is the claim that GTX is dead, and that there will be RTX prefixes up and down the GeForce stack, with Tensor and RT Cores being dropped into even the lowest spec Ampere GPUs._ 








Nvidia Ampere rumour suggests it will kill the cost of ray tracing (www.pcgamer.com)


----------



## Deleted member 193596 (Aug 28, 2020)

The 10GB 3080 is a huge letdown..

Should be 16GB


----------



## rtwjunkie (Aug 28, 2020)

Jinxed said:


> Reasonable is relative. It seems to me most people on tech forums these days focus too much on the high end. But the significance of the RTX 3090, strangely enough, is more important for the low end. Let me explain: if we really see a 4x or greater improvement in raytracing performance, we can start getting raytracing cards all the way down to the bottom of the stack. There may not be any more GTX cards. And even the lowest gaming card, like an RTX 3050, could have more raytracing power than the RTX 2060 has now. Do you realize the significance of that? Then raytracing REALLY becomes massive. It's so surprising almost no one mentions that in their articles.


Yes, I do recognize the significance of a card like the 3090. I’ve been around quite a while; I know how the filter-down of tech works.

I like to skip gens though, so buying mid-level doesn’t work for me.  Even so, cost is a factor, and it’s not likely Nvidia will get my top-end dollars for the 3090.


----------



## Rob94hawk (Aug 28, 2020)

Chaitanya said:


> I will wait to see how these new GPUs perform in Helicon before deciding on an upgrade.



I'm waiting on how it handles Zork.


----------



## Dave65 (Aug 28, 2020)

Not buying a thing until Big Navi comes out. Then I will decide!


----------



## Sithaer (Aug 28, 2020)

I'm more interested in the ~cheaper mid-range/budget options; most likely even the 3060 will be out of my budget range, but if these new-gen cards push down the price on prev _'current'_ gen cards, then that's all fine with me.

RT I'm not exactly hyped about; it's a nice idea/tech, but I'm not interested in it enough to pay extra for it yet, _'regardless of the brand'_.
Maybe 1-2 gens later when it's more common/matured.


----------



## harm9963 (Aug 28, 2020)

Will preorder my 3090 at Micro Center for pickup; I live down the street, here in Houston.


----------



## dragontamer5788 (Aug 28, 2020)

rtwjunkie said:


> Fact is, every one of the released games still looks fantastic without RTRT. None of these features they are making available on super-priced GPU’s make playing the game better or are necessary. Game immersion (at least on SP games) comes from gameplay and wel-written story. Graphics only enhances it.



Let's make a hypothetical: let's say that raytraced shadows would help the player see where enemy mooks are in some generic stealth game. Obviously, Raytracing would be very good in this kind of situation (and it'd be great to actually have the new Raytracing feature interact with gameplay features and/or puzzles). However, what happens to all the normal guys without Raytracing? They can no longer play the game effectively. So you need to come up with a rasterization trick to estimate the shadows, so that the game runs on their computers.

We're simply not at the stage yet where Raytracing can be used for a mechanical advantage. Otherwise, you'd alienate too many gamers from the game.

---------

It's really hard for me to come up with good uses of Raytracing that would actually affect the gameplay loop, however. Outside of observing shadows, or maybe mirror reflections... but even "Portal" ended up with tons of reflections (fake, non-raytraced ones, but good enough that most people didn't notice the issues). So it's not like you need raytracing to make "accurate-enough for video games" kinds of mechanics.


----------



## Hotobu (Aug 29, 2020)

dragontamer5788 said:


> Let's make a hypothetical: let's say that raytraced shadows would help the player see where enemy mooks are in some generic stealth game. Obviously, Raytracing would be very good in this kind of situation (and it'd be great to actually have the new Raytracing feature interact with gameplay features and/or puzzles). However, what happens to all the normal guys without Raytracing? They can no longer play the game effectively. So you need to come up with a rasterization trick to estimate the shadows, so that the game runs on their computers.
> 
> We're simply not at the stage yet where Raytracing can be used for a mechanical advantage. Otherwise, you'd alienate too many gamers from the game.
> 
> ...



To your point it's hard for me to come up with many good uses outside of visual fidelity but here are a few:

- a Hitman game where you use reflections to your advantage
- maybe some FPS/multiplayer games with some reflective surfaces
- perhaps enemies that can actually *see and hear* as opposed to having generic sight and hearing cones


Still, I'm taken aback by the number of people who seem to be downplaying Raytracing as a technology. It's true that lighting is accurate enough for games, but true Raytracing should take visual fidelity a step further. I think when Raytracing starts becoming more of a thing, we'll look back and see how much of a departure from reality current lighting actually is.


----------



## steen (Aug 29, 2020)

Vayra86 said:


> That would make the appearance of the 10GB one even more questionable.
> 
> OK! Let's compare the actual chips. Both are GA102 dies! What gives?! Its not even a 104.
> 
> Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yield and scraps and leftovers to me, sold at premium. Let's see how that VRAM is wired...


It's a combination of factors, including yield (as you've alluded to), memory bus width, granularity of GDDR6X, and cost. The 3080 10GB, with a 20% reduced memory bus (and associated memory controllers) and fewer 19 Gbps GDDR6X modules and associated power stages (I'm assuming), yields a drop of only 30W TGP. Consider the 3950X vs the 3900X. If raster perf has a marginal increase as implied, we can see why there was some concern over allocation of resources within the arch. I believe they've gone all in with RTX (2xFP32) as well as the far more flexible tensor cores, and will push TF32 and other reduced-precision formats heavily. It will work especially well with game engines built around DLSS reduced-res upscaling. If the NN model(s) continue with reconstruction improvement on the order of DLSS 2.0, then who cares if only 25% of displayed pixels came from the game engine and the rest is magic, right?
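The bandwidth figures being debated in this thread follow directly from bus width and per-pin data rate; a quick sanity check against the leaked specs from the article (a sketch, using only the numbers already quoted above):

```python
# Peak memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) * data rate in Gbps per pin.

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bandwidth_gbs(384, 19.5))  # 3090: 384-bit @ 19.5 Gbps -> 936.0 GB/s
print(bandwidth_gbs(320, 19.0))  # 3080: 320-bit @ 19.0 Gbps -> 760.0 GB/s
```

This is why cutting the bus from 384-bit to 320-bit (and the module count with it) drops both capacity and bandwidth together; the two are wired to the same trade-off.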


----------



## TechLurker (Aug 29, 2020)

Something I noticed not being mentioned is that they seem to be settling on advertising these as PCIe 4.0 ready (whether or not that's really necessary yet), which likely answers the question/dilemma GamersNexus touched upon regarding how NVIDIA would advertise their cards. While it's an insignificant element for those in the know, it's a big thing for the masses, who'd just look at it and then panic because their Intel mobo doesn't have PCIe 4.0 capability, and the only platforms on the market with PCIe 4.0 at the time of release are AMD's.

I'm actually hoping NVIDIA decides to completely follow through on that marketing, if only to see how Intel would spin things to stop would-be RTX 3000 series buyers from panic-buying an AMD build to go along with their new GPU (which would hilariously also benefit the same mobo makers frustrated with Intel's failure on intended PCIe 4.0).


----------



## Hotobu (Aug 29, 2020)

TechLurker said:


> Something I noticed not being mentioned is that they seem to be settling on advertising these as PCIe 4.0 ready (whether or not that's really necessary yet), which likely answers the question/dilemma GamersNexus touched upon regarding how NVIDIA would advertise their cards. While it's an insignificant element for those in the know, it's a big thing for the masses, who'd just look at it and then panic because their Intel mobo doesn't have PCIe 4.0 capability, and the only platforms on the market with PCIe 4.0 at the time of release are AMD's.
> 
> I'm actually hoping NVIDIA decides to completely follow through on that marketing, if only to see how Intel would spin things to stop would-be RTX 3000 series buyers from panic-buying an AMD build to go along with their new GPU (which would hilariously also benefit the same mobo makers frustrated with Intel's failure on intended PCIe 4.0).



I definitely thought about this. There will certainly be some people who go with AMD/X570 just to get PCIe 4.0 (unnecessarily), which is funny because this launch could be a bit of a boost to AMD solely because of that. I'm kind of in that same boat because I'm on X370 now, and while I'd like to stay with it, I am probably going to end up getting the 3090 after the dust settles, depending on benchmarks, RDNA2, etc. But one thing I need to know first is whether there's any benefit to running it on PCIe 4.0. There may not be, but I want a definitive answer on that before I do anything.


----------



## Caring1 (Aug 29, 2020)

I want to see some funky fish tailed cards.


----------



## sYn (Aug 29, 2020)

So 7nm, woop woop, and the new tensor cores are able to calculate FP32, and we will have a GPU with triple the TFLOPS xD, let's see


----------



## GhostRyder (Aug 29, 2020)

Well, it seems we were right to be skeptical of the pricing rumors before, as while the prices are still high, they are not as big a jump at the top as thought. Now I only have one major concern...

That 12-pin. I intend to water-cool this card, but it seems I am going to be stuck one of two ways:
1: Buy the reference card, but have to purchase a new PSU due to the 12-pin.
2: Hope to god I pick a non-reference design that gets a water block and uses the three 8-pin connections.

Guess I'll have to wait and see.


----------



## Hotobu (Aug 29, 2020)

GhostRyder said:


> Well, it seems we were right to be skeptical of the pricing rumors before, as while the prices are still high, they are not as big a jump at the top as thought. Now I only have one major concern...
> 
> That 12-pin. I intend to water-cool this card, but it seems I am going to be stuck one of two ways:
> 1: Buy the reference card, but have to purchase a new PSU due to the 12-pin.
> ...



Why would you have to buy a new PSU? Why can't you use an adapter? Especially if it's a modular PSU, you can probably buy a new cable direct from your PSU manufacturer eventually.


----------



## bubbleawsome (Aug 29, 2020)

GhostRyder said:


> Well, it seems we were right to be skeptical of the pricing rumors before, as while the prices are still high, they are not as big a jump at the top as thought. Now I only have one major concern...
> 
> That 12-pin. I intend to water-cool this card, but it seems I am going to be stuck one of two ways:
> 1: Buy the reference card, but have to purchase a new PSU due to the 12-pin.
> ...


Reference cards are supposedly shipping with a 2x8-pin to 12-pin adapter.


----------



## lexluthermiester (Aug 29, 2020)

TheLostSwede said:


> No poll option for Waiting for the reviews?


Looks like it was added, and *most* are voting for it. I'm in with that vote. But in addition to reviews, price points are important. Given the world's current economic conditions, most people are going to be less able to afford the prices NVidia was charging for the RTX 20xx line-up.

(Before anyone flames me for repeating what might have already been said: TL;DR, yes, I know I'm late to the party again.)
Holy Hannah, the wattage on these cards! Granted, they are going to be premium performance cards, but unless buyers already have beefy PSUs, they must also factor in the cost of a PSU on top of the cost of the GPU. Of course those costs haven't been stated, but we can safely presume they will be similar to the RTX 20xx series.

One also has to wonder if NVidia has a plan for the GTX line-up and what they might look like.


----------



## Frick (Aug 29, 2020)

Flying Fish said:


> This TGP rather than TDP thing is going to be annoying and cause confusion for people... You can see it in this thread already.
> 
> I mean, how do you compare TGP to the old TDP values? Is the 350 W TGP going to be similar to the 250 W TDP of the 2080 Ti, plus extra for board power? But then, can you see the rest of the components using another 100 W?



TGP?



Jinxed said:


> Trying to pull 2 year old arguments? Hardly. Given the fact that many AAA titles are raytracing enabled, the fact that main game engines like UE now support raytracing and some of the biggest games are going to be raytraced - Cyberpunk 2077, Minecraft just for an example - nobody believes those old arguments anymore. And you seem to have a bit of a split personality - your beloved AMD is saying new consoles and their RDNA2 GPUs will support raytracing as well. So what are you really trying to say? Are you trying to prepare for the eventuality that AMD's raytracing performance sucks?



But it's a good question. If I turn RT off, does the card use less power? If not, why not, given that the reason for the power use is the RT hardware?


----------



## calkapokole (Aug 29, 2020)

The leaked specs of the RTX 3080 don't make sense. Knowing the GA100 and A100 chip configuration, it's easy to deduce that a fully enabled GA102 chip will have 6,144 CUDA cores split between 96 SMs, which are further organized into 6 GPCs. If the RTX 3080 uses only 68 SMs (~71%), then a lot of silicon is wasted. I think there is space between the GA102 and GA104 chips for a GA103 chip, which, fully enabled, would have 5,120 CUDA cores (80 SMs, 5 GPCs). The RTX 3080 will probably be based on the GA103 chip if it indeed has 4,352 CUDA cores.
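That deduction can be sanity-checked in a couple of lines, assuming (as the post does, extrapolating from GA100) 64 FP32 CUDA cores per SM; the per-SM layout of gaming Ampere was not confirmed at the time, so treat this as a sketch of the post's arithmetic, not a spec:

```python
# Assumption from the post above: 64 CUDA cores per SM, carried over
# from GA100. Not a confirmed figure for gaming Ampere at leak time.
CORES_PER_SM = 64

def cuda_cores(sms: int) -> int:
    """CUDA core count for a given number of enabled SMs."""
    return sms * CORES_PER_SM

full_ga102 = cuda_cores(96)       # 6 GPCs x 16 SMs -> 6144 cores
leaked_3080 = cuda_cores(68)      # matches the leaked 4352-core figure
print(leaked_3080 / full_ga102)   # fraction of the full die enabled
```

With 68 of 96 SMs enabled, roughly 29% of the shader array would sit dark, which is the "wasted silicon" the post objects to.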


----------



## BiggieShady (Aug 29, 2020)

What's with the 7 nm process power consumption? 3080 vs 2080 Ti: similar specs, +150 MHz on the GPU and faster GDDR, but maybe fewer RAM chips... and the result is +100 W on a 7 nm node. I wouldn't expect the power envelope to expand by a hundred watts even on the same node.


----------



## saki630 (Aug 29, 2020)

Without benchmarks to show the performance difference between the 3080 and the 3090 at the same settings, we are going to be fighting over nothing here. It's obvious the 3080 is the best choice if it's priced where it should be. The 3090 is the 'Ti' variant that the 2080 Ti crowd will purchase and see performance gains from. Then, sometime around March 2021, the real 3090 Ti variant releases and prices adjust accordingly.

I have a 1080 Ti, and I want a performance increase (3080+) without spending a kidney. This 1080 Ti was $550. Someone tell me what I can purchase for $500-700 in the next month that will give me an increase in performance in my sexsimulator69?


----------



## Hardware Geek (Aug 29, 2020)

Jism said:


> This is just a enterprise card designed for AI / DL / whatever workload being pushed into gaming. These cards normally fail the enterprise quality stamp. So having up to 350W of TDP / TBP is not unknown. It's like Linus torwards said about Intel: Stop putting stuff in chips that only make themself look good in really specific (AVX-512) workloads. These RT/Tensor cores proberly count up big for the extra power consumption.
> 
> Price is proberly in between 1000 and 2000$. Nvidia is the new apple.


Do you mean *probably*?


----------



## TheLostSwede (Aug 29, 2020)

erek said:


> looks fake, why's the spacing different?
> 
> View attachment 166928











GAINWARD GeForce RTX 3090 and RTX 3080 Phoenix leaked, specs confirmed - VideoCardz.com

Gainward is preparing its next-generation Phoenix series for the RTX Ampere launch. In just a few days, NVIDIA will unveil its GeForce RTX 30 series based on the Ampere architecture. These cards will be NVIDIA's first gaming cards based on a 7 nm process...

videocardz.com


----------



## BiggieShady (Aug 29, 2020)

Ah, beloved Gainward aka Palit division for EU, good to see it still going strong.


----------



## CandymanGR (Aug 29, 2020)

3090 launch price: $1,400
2080 Ti launch price (standard edition): $999



I laugh at the people who really believed NVIDIA when it said "Ampere will be cheaper than what Turing was."
Even some tech sites repeated that. It is so funny how people believe in hope rather than reality. And the "70% faster than Volta" NVIDIA claims is a joke. Not even in RTX scenarios will the difference be that big. Maybe it will be that much in CUDA processing, and that's all.

P.S. That's a 40% increase in price. I bet the difference in performance (in gaming) will be less than that (maybe 25-30% in real-world scenarios). Mark this post for reference when reviews come out. Over and out.


----------



## RandallFlagg (Aug 29, 2020)

Mark Little said:


> ...
> RTX 3090 5248 CUDA, 1695 MHz boost, 24 GB, 936 GB/s, 350 W, 7 nm process
> 
> RTX 2080 Ti  4352 CUDA, 1545 MHz boost, 11 GB, 616 GB/s, 250 W, 12 nm process
> ...



Yes, it's not a bad upgrade but it is predictable.  Basically everything in their lineup shifts one level, 3080=2080Ti + higher clock, 3070=2080+higher clock.  If the pattern follows to the midrange, which I think it will, we'll see slightly higher than 2070 / 2070 Super performance from the 3060 / 3060 Ti.  

The biggest impact to future PC games and the capabilities of PC games will be if they take the 1650/1660 series and include ray tracing and DLSS at 2060+ levels of performance.  That will be a mainstream card and would become a new baseline for developers to target for games to be released in 2-3 years.  

Still, I suspect we will see the same performance at the same price points for a while (~6 months after release), regardless of the names. Only the 3090 offers significantly more performance, and that's very much a niche product with more marketing value than sales volume, as those types of cards typically garner 0.1% of the market.

Ironically the pricing situation may hinge a lot on potential competition from Intel and its new Xe discrete GPU.
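The bandwidth figures in the quoted specs fall straight out of bus width times per-pin data rate. A minimal check (bus widths and data rates taken from the leaked specs discussed in this thread):

```python
# GDDR6/6X bandwidth: bus width (bits) x per-pin data rate (Gbps),
# divided by 8 bits per byte.
def bandwidth_gb_s(bus_width_bits: int, rate_gbps: float) -> float:
    return bus_width_bits * rate_gbps / 8

print(bandwidth_gb_s(384, 19.5))  # RTX 3090: 936.0 GB/s
print(bandwidth_gb_s(320, 19.0))  # RTX 3080: 760.0 GB/s
print(bandwidth_gb_s(352, 14.0))  # RTX 2080 Ti: 616.0 GB/s
```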


----------



## GhostRyder (Aug 29, 2020)

Hotobu said:


> Why would you have to buy a new PSU? Why can't you use an adapter? Especially if it's a modular PSU you can probably buy a new cable direct from your PSU manufacturer eventually.


My PSU is no longer supported, I guarantee it. It's a modular design, but it uses screw-in round connectors, and I have had it for a while. It's a 1300 W gold PSU.




bubbleawsome said:


> Reference cards are supposedly shipping with a 2x8pin->12pin adaptor.


Oh, I missed that. Then I am no longer worried; I'll just get a reference 3090 and a water block and use the adaptors for a while.


----------



## saikamaldoss (Aug 29, 2020)

Can't wait to learn more about RDNA2 so I can make an informed decision... NV or AMD.

My Vega 64 can't do 4K 30 fps in Project CARS 3. I get 26 fps, but it's still smooth thanks to FreeSync. Still, I can't wait to upgrade.


----------



## Prior (Aug 29, 2020)

NVLink SLI is only available on the 3090. Why would you even want to SLI a beast like that, NVIDIA?


----------



## P4-630 (Aug 29, 2020)

Prior said:


> NVLink SLI is only available on 3090, why would you want to sli a beast Nvidia?



If you got unlimited cash to burn.


----------



## ppn (Aug 29, 2020)

2080 SUPER: 13,600 million transistors, 3,072 shading units, 48 RT cores

3070: 30,000 million transistors, 3,072 shading units, 96 RT cores

What, other than doubling the RT cores from 48 to 96, did doubling the transistor count buy? OMG, this could have been a 6,144 CUDA core count for the transistor budget it has.


----------



## medi01 (Aug 29, 2020)

Hotobu said:


> Raytracing should take visual fidelity a step further.


Had that been the case, people wouldn't have needed to ask Epic whether the Unreal PS5 demo was using DXR-like calls or not.

RT fails to deliver on its main promise: _easier-to-develop_ realistic reflections/shadows.

Short term, it could evaporate the way PhysX did.


----------



## rtwjunkie (Aug 29, 2020)

GhostRyder said:


> Buy the reference, but have to purchase new PSU due to 12 pin.


From what I've seen, adapters will be included.


----------



## RandallFlagg (Aug 29, 2020)

medi01 said:


> Had that been the case, people wouldn't need to ask Epic whether Unreal PS5 demo was using DXR like calls or not.
> 
> RT fails to deliver on its main promise: _easier_to_develop_ realistic reflections/shadows.
> 
> Short term, it could evaporate the way PhysX did.




PhysX did not evaporate. It became ubiquitous to the point that people don't know it's there anymore. It's used by Unreal Engine 3+, Unity, and a host of others.


----------



## RoutedScripter (Aug 29, 2020)

I come here as someone who does not like spoilers... Just days from the reveal, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks. At the least, the news media, if they are eager to profit off the leaks through drama and traffic, should put up big spoiler warnings and some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments; I did not read any posts here in this thread either.


----------



## John Naylor (Aug 29, 2020)

ZoneDymo said:


> No, but you do have to answer why that gap is so insanely huge. More than twice the RAM? Borderline 2.5x? That is just insane.
> And again, in the midrange of old, the RX 480 had 8 GB of RAM and the GTX 1060 had 6 GB... to have an RTX 3080 now with 10 GB is just pathetic, IMO, with an eye on progression and placement.



Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU with different RAM sizes, in almost every game there was no observable impact on performance.









GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech

Do you need 4 GB of RAM? We tested EVGA's GTX 770 4GB versus NVIDIA's GTX 770 2GB version at 1920x1080, 2560x1600, and 5760x1080.

alienbabeltech.com
				



2 GB vs 4 GB GTX 770... when everyone was saying 2 GB was not enough, this test showed otherwise.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with _Max Payne 3_:  It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB.  However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting.  And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s."

Same here .... https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here .... http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here .... https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Yes, you can find some games that will show a difference, mostly sims with bad console ports.

And let's remember... the 3 GB version of the 1060 did just fine. They were not the same GPU; the 3 GB version had 10% fewer shaders, which gave the 6 GB version a VRAM-independent speed advantage. The extra shaders gave the 6 GB version a 6% speed advantage over the 3 GB... So, going to 1440p, if there was even a hint of impact due to VRAM, that 6% should have been much bigger... it wasn't. We only saw a difference at 4K.

Based upon actual testing at lower resolutions and scaling up accordingly, my expectation for the 2080 was 12 GB, so I was surprised by the odd 11 GB number... For the 3080, I thought they'd do 12, so 10 tells me that NVIDIA must know more than we do. No sense putting it in if it's not used... no different than having an 8+6 power connector on a 225-watt card. Just because the connectors and cable can deliver 225 watts (+75 from the slot) doesn't mean it will ever happen.

Nvidia's Brandon Bell has addressed this topic more than once, saying that the utilities that are available "_all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available._" The card manufacturers gave us more RAM because customers would buy it. But for the 1060... the test results proved we don't need more than 3 GB at 1080p; the 6 GB version didn't add anything to the mix other than more shaders.

So now for the "why 10 GB?" question.

When they did the 1060 3 GB, why did they disable 10% of the shaders? It didn't save any money. Let's look at W1zzard's conclusion:

"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "

In other words, they needed to kill 10% of the shaders because otherwise the performance would be the same and folks would have no reason to spring for the extra $$ for the 6 GB card. Same with the 970's 3.5 GB... it was clearly done to gimp the 970 and provide a performance gap below the 980. When I heard there was a 3080 and a 3090 coming, I expected 12 and 16 GB. Now I can't help but wonder... is 12 GB the sweet spot for 4K, and is the use of 10 GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?
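The 1060 argument above, in a few lines (shader counts from W1zzard's quoted conclusion; the ~6% performance gap is the figure the post cites, so it is reproduced here as a given, not a measurement):

```python
# GTX 1060 6 GB vs 3 GB: the cut was in shaders, not just VRAM.
shaders_6gb = 1280
shaders_3gb = 1152

shader_cut = 1 - shaders_3gb / shaders_6gb
print(f"Shader reduction: {shader_cut:.0%}")

# The post's point: the observed ~6% gap is fully explained by the
# shader cut alone, so any VRAM-driven gap at 1440p should have
# widened it beyond that; per the reviews cited, it didn't.
observed_perf_gap = 0.06
print(observed_perf_gap <= shader_cut)
```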


----------



## RandallFlagg (Aug 29, 2020)

I like what you're saying on the VRAM thing, based on my experience it's irrelevant in *most* games.

One exception, though: MMOs. Specifically, if you've ever played an MMO and been in an area with 200+ other players wearing many different textures for their gear, VRAM definitely comes into play. Smaller-VRAM cards simply can't hold all those textures, at least not at any quality. What you wind up with is half the people looking normal and the other half looking like they're in their underwear.

Anyway, that is kind of an edge case. Even in those settings, if it's truly bothersome, one can typically reduce texture quality and the problem is solved.


----------



## Vayra86 (Aug 29, 2020)

John Naylor said:


> Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU with different RAM sizes, in almost every game there was no observable impact on performance.
> 
> 
> 
> ...



You might be on the money there, but even so, the shader deficit is a LOT more than 10% on the 3080. It's about 1,000 shaders fewer?

And to be fair, I'm not so silly as to think this move will somehow push people doubting a 3080 toward a 3090 that is a whole lot more expensive. They will probably look at their 2080 Ti and think, meh... I'm going to sit on this 11 GB for a while; heck, it even does RT already. Remember that the 2080 Ti was already a stretch in terms of price. Will they nail that again? Doubtful, especially when the gimping of a lower tier is so obvious; it's not something one might be keen to reward. More likely, I think, the 3080 is the step up aimed at 2080/2080S and 2070S buyers, and they gain 2 GB compared to what they had.

I think a bigger part of the 10 GB truth is really that they had some sort of issue getting more on it, either a cost issue or something yield/die related. That is why I'm so, so interested in how they cut this one up and how the VRAM is wired.


----------



## reflex75 (Aug 29, 2020)

It's a shame that a new high-end GPU gets only 10 GB while the new consoles will have more (16 GB) and be cheaper!


----------



## xman2007 (Aug 30, 2020)

RoutedScripter said:


> I come here as someone who does not like spoilers... Just days from the reveal, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks. At the least, the news media, if they are eager to profit off the leaks through drama and traffic, should put up big spoiler warnings and some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments; I did not read any posts here in this thread either.


Just read the title and avoid the thread. What's with the drama, Britney?


----------



## lexluthermiester (Aug 30, 2020)

rtwjunkie said:


> From what I've seen, adapters will be included.


If adapters are included, why make the new connector in the first place? Makes no sense at all.



reflex75 said:


> That's a shame only 10GB for a new high end GPU, while new consoles will have more (16GB) and be cheaper!


Consoles with 16 GB (of which NONE are out yet, BTW) have to share that 16 GB system-wide, whereas 10 GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.


----------



## rtwjunkie (Aug 30, 2020)

lexluthermiester said:


> If adapters are included why make the new connector in the first place? Makes no sense at all.
> 
> 
> Consoles with 16GB(which NONE are out yet BTW) have to share that 16GB system wide. Whereas 10GB dedicated for the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.


Remember years ago, when power requirements were changing quickly? We got adapters in nearly every video card box. It'll be the same until it's presumed everyone has a new PSU, in about 5 years.


----------



## lexluthermiester (Aug 30, 2020)

rtwjunkie said:


> Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box.  It’ll be the same until it is presumed everyone has new PSU’s in about 5 years.


While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.


----------



## rtwjunkie (Aug 30, 2020)

lexluthermiester said:


> While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.



Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.


----------



## lexluthermiester (Aug 30, 2020)

rtwjunkie said:


> Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.


You didn't?!? Well hot damn... LOL!  No worries mate.


----------



## TranceHead (Aug 30, 2020)

lexluthermiester said:


> While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.


What's wrong with replacing multiple connectors with one single connector?
I welcome it.


----------



## yotano211 (Aug 30, 2020)

I would hate to see the 3000 cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175 W card, but in laptops it's 115 W. The 3070 is rumored to be 320 W, so will NVIDIA stay with the same 115 W, or maybe increase it to 150 W, which is 2080 mobile territory?
I guess all laptops will be Max-Q designs with the 3000 cards.


----------



## DuxCro (Aug 30, 2020)

RoutedScripter said:


> I come here as someone who does not like spoilers... Just days from the reveal, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks. At the least, the news media, if they are eager to profit off the leaks through drama and traffic, should put up big spoiler warnings and some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments; I did not read any posts here in this thread either.


Spoilers? What is this? Game? Movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK. Here it goes: Nvidia releases another ultra overpriced series of RTX cards, people still buy, Nvidia laughs as it gets showered in money, end credits.


----------



## kanecvr (Aug 30, 2020)

JAB Creations said:


> Who the hell is team red? Okay, gotta figure this one out...
> 
> 
> Team Blue: Intel.
> ...



The modern AMD logo is orange. The Radeon Technology Group's is red.

As for the poll, I'm not spending $1,000 on a GPU. I'm only upgrading when I can get a 4K-capable (60 fps) GPU for $400-500.


----------



## AsRock (Aug 30, 2020)

JAB Creations said:


> They're not being weird, they're trying to distract people that they feel threatened by AMD who no longer has a stock value of under $2 from the crony tactics of both Intel and Nvidia so naturally they're going to do their absolute best. Why?
> 
> "Nvidia has the best card at $36,700! So when I spend $170 I'll somehow magically get the best card I can!" - Fanboyism
> 
> ...



Wouldn't the smaller connector help with airflow, even more so with how this card is designed?


----------



## Dwarden (Aug 30, 2020)

DisplayPort 1.4a ?  2.0 was finalized in year 2016 ... 
wth is with PC companies failing to use latest PC standards for PC components


----------



## Vayra86 (Aug 30, 2020)

yotano211 said:


> I would hate to see the 3000 cards in laptops, they would have to be heavily throttled to get it not to blow up the laptop.
> The 2070 is a 175w card but in laptops its 115w, this 3070 is rumored to be 320w but will Nvidia stay with the same wattage of 115w or maybe increase it to 150w which is 2080 mobile territory.
> I guess all laptops will be max-q designs _grossly underpowered and overpriced _with the 3000 cards.




FTFY

It's another reason I really don't get this set of products. Even for a Max-Q, that would require some pretty creative shifting of tiers and performance. Effectively, your mobile x80 would be slower than a desktop x60 or something. I mean, how much binning can you do...



Dwarden said:


> DisplayPort 1.4a ?  2.0 was finalized in year 2016 ...
> wth is with PC companies failing to use latest PC standards for PC components



Max out that profit, that's what.


----------



## medi01 (Aug 30, 2020)

RandallFlagg said:


> PhysX did not evaporate. It became ubiquitous to the point...


Of not being used in games. 
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...


----------



## semantics (Aug 30, 2020)

medi01 said:


> Of not being used in games.
> But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...


Parts of PhysX are part of UE4 and Unity. Granted, the old CUDA acceleration is pretty dead, but NVIDIA's physics package is still pretty prevalent.


----------



## RoutedScripter (Aug 30, 2020)

DuxCro said:


> Spoilers? What is this? Game? Movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK. Here it goes: Nvidia releases another ultra overpriced series of RTX cards, people still buy, Nvidia laughs as it gets showered in money, end credits.



That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? So I have to avoid tech sites for weeks every time such reveals are heating up.

When I was 17 years old, I sure liked all the leaks and stuff and had a good time dramatizing... but those were the times when I had no responsibilities. I had all the time in the world; I could sit for hours and dribble on about what new tech would be in the next console. Been there, done that, but I got fed up with the spoilers. It feels so much better to see E3 or some game show event how it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization, leaking it in an almost criminally disrespectful way, is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and to the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.

But that's my opinion. Calling it stupid, sure, it's free speech, go ahead; we can't force a private forum or site to censor, of course not. But it would be a good idea to try to find a compromise and to overhaul the website news sections so the news writers and staff could post spoiler warnings for articles, with forum tools to put up warnings when navigating the forum as well.

Again, I'm not sure how big this leak is in detail, but from the slight glimpse I've had, it seems big... I did not actually see it, though; I successfully hid my eyes.

If I were NVIDIA right now, after SO MUCH HYPE for MONTHS, where sites wrote they had never seen such hype for a GPU before... it was building up so nicely, and now some idiot messes it all up just 3 days before the reveal. If I were NVIDIA, I would go nuclear: I'd practically cancel and delay everything for 30 days, delay the review GPUs, and even recall them, forcefully canceling the shipments and rerouting them while in transit so the reviewers wouldn't get them either; it's the reviewers who are mostly responsible for spreading the leaks, and this is how most people get spoiled. That would be the most epic reply, so epic that all the Leak-Obsession-Disorder losers who would cry about it would be doing it for nothing. Industry analysts and pundits would be impressed by the boldness of the move, and I think this would in fact have helped the company's image in the long run and with the key people. Yes, the 15-year-old gamer-head losers would hate it, but it wouldn't matter, because everyone, including the investors, would know everyone would still buy it after the delay. No sales for 30 days might hurt them a bit, sure, but if I were the CEO I'd take it; my pride stands first!


----------



## lexluthermiester (Aug 31, 2020)

TranceHead said:


> Whats wrong with replacing multiple connectors with 1 single connector?
> I welcome it


Simple, not all cards need that many power lines. For many cards one 6pin/8pin is enough. So why build a connector that has a bunch of power lines when only a few are needed? It's needless and wasteful.



Dwarden said:


> DisplayPort 1.4a ?  2.0 was finalized in year 2016 ...
> wth is with PC companies failing to use latest PC standards for PC components


You need to read that Wikipedia article you're quoting a little more closely. 2.0 was on the roadmap in 2016, but it wasn't finalized until June 26th of 2019. Additionally, DP 2.0 modulation ICs are expensive and offer marginal benefit to the consumer over 1.4a-based ICs. DP 2.0 is best suited for commercial and industrial applications ATM.


----------



## medi01 (Aug 31, 2020)

semantics said:


> Parts of PhysX are part of UE4 and Unity. Granted, the old CUDA acceleration is pretty dead, but NVIDIA's physics package is still pretty prevalent.


In other words, CPU PhysX; not very relevant to this thread, don't you think?


----------



## JAB Creations (Aug 31, 2020)

AsRock said:


> Wouldn't the smaller connector help with airflow ?, even more so with how this card is designed.



Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? Keep in mind that NVIDIA (according to Tom) might be trying to make their lineup less confusing to consumers (I won't hold my breath). The heat output is going to be enormous, because NVIDIA is obsessed with mindshare: there are so many people who will pointlessly throw themselves at a corporate identity simply because it has the best card at a million dollars when all they can afford is $200.

Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target, like a slider for card power, even if it has steps of 25 watts or so. I know AMD will still crank the power up, but I don't need 300 FPS. I am still using a 60 Hz screen for now and am interested in 120/144 Hz later this year, though options are limited in the area of the market I'm looking at, as I game maybe 2% of the time I'm at my rig.


----------



## TheoneandonlyMrK (Aug 31, 2020)

RoutedScripter said:


> That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? So I have to avoid tech sites for weeks every time such reveals are heating up.
> 
> When I was 17 years old, I sure liked all the leaks and stuff and had a good time dramatizing... but those were the times when I had no responsibilities. I had all the time in the world; I could sit for hours and dribble on about what new tech would be in the next console. Been there, done that, but I got fed up with the spoilers. It feels so much better to see E3 or some game show event how it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization, leaking it in an almost criminally disrespectful way, is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and to the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.
> 
> ...


Were you not about for the last ten years, every GPU release had a hype train.
And if not new unreleased tech, what then do we discuss, do we just compare benchmarks, fix issues, some like a less dry discussion.
You can not read them.


----------



## AsRock (Aug 31, 2020)

JAB Creations said:


> Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? Keep in mind that NVIDIA (according to Tom) might be trying to make their lineup less confusing to consumers (I won't hold my breath). The heat output is going to be enormous, because NVIDIA is obsessed with mindshare: there are so many people who will pointlessly throw themselves at a corporate identity simply because it has the best card at a million dollars when all they can afford is $200.
> 
> Frankly I would really like to see AMD release the 6000 series with a setting that allows you to choose a TDP target. Like a slider for the card power, even if it has steps like 25 watts or something. I know AMD will still crank the power up, I don't need 300FPS, I am still using a 60Hz screen for now and am interested in 120/144 later this year but options are limited in the area of the market I'm looking as I game maybe 2% of the time I'm at my rig.



From what I keep seeing, the 3090 will be around 320 W and the 3080 around 220 W; I guess we will see soon enough. I get the feeling that the 3090 was actually made to be their next Titan, with the amount of memory it has.


----------



## BoboOOZ (Aug 31, 2020)

JAB Creations said:


> Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target, like a slider for the card power, even if it has steps of 25 watts or something. I know AMD will still crank the power up. I don't need 300 FPS; I am still using a 60 Hz screen for now and am interested in 120/144 Hz later this year, but options are limited in the area of the market I'm looking at, as I game maybe 2% of the time I'm at my rig.


The Radeon driver already allows you to define a power budget, both globally and on a per-game basis. I run my 5700 XT at 120 W most of the time, and the card stays passively cooled at that level. And, btw, the slider is continuous.

The only thing missing is memory underclocking, but that's more like peanuts compared to the GPU itself.
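For anyone wanting to do this outside the driver UI: on Linux the amdgpu driver exposes the same power budget through the standard hwmon sysfs interface. Here is a minimal sketch; the hwmon path and the 120 W figure are illustrative only (the path varies per system, and writing the cap requires root):

```python
# Sketch: capping an AMD GPU's power budget on Linux via the amdgpu
# hwmon sysfs interface. The hwmon standard expresses power values
# in microwatts, so watts must be scaled up before writing.
from pathlib import Path

def watts_to_microwatts(watts: float) -> int:
    """Convert watts to the microwatt units used by power1_cap."""
    return int(watts * 1_000_000)

def set_power_cap(hwmon_dir: str, watts: float) -> None:
    """Write a new power cap; requires root privileges."""
    cap_file = Path(hwmon_dir) / "power1_cap"
    cap_file.write_text(str(watts_to_microwatts(watts)))

# Example (path is hypothetical, check your own system):
# set_power_cap("/sys/class/drm/card0/device/hwmon/hwmon3", 120)
```

The driver clamps the written value to the range reported by `power1_cap_min` and `power1_cap_max`, so out-of-range requests are rejected rather than applied.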


----------



## medi01 (Aug 31, 2020)

This guy (note the date of his first post):


__ https://twitter.com/i/web/status/1280862817671770113
Says this:


__ https://twitter.com/i/web/status/1300464910019690505

In other words, that big RDNA2 chip is beating 3080 (hardly surprising, to be honest)


----------



## BoboOOZ (Aug 31, 2020)

medi01 said:


> In other words, that big RDNA2 chip is beating 3080 (hardly surprising, to be honest)


I haven't seen a single solid RDNA2 leak, only suppositions, have you?


----------



## Fluffmeister (Aug 31, 2020)

Yeah, besides, it would have been better if he said GA102 cannot beat it either. Shame.


----------



## medi01 (Aug 31, 2020)

BoboOOZ said:


> I haven't seen a single solid RDNA2 leak, only suppositions, have you?


Chip size was "leaked" (sort of).
Everything else was not (as with Ampere).



Fluffmeister said:


> Yeah besides, It would have been better if he said GA102 cannot beat it either, *shame*.


Lol.


----------



## GhostRyder (Sep 1, 2020)

rtwjunkie said:


> From what I've seen, adapters will be included.


Yea, I had just found that out, so I'm not worried (other than looks, lol).


AsRock said:


> From what I keep seeing, the 3090 will be around 320 W and the 3080 around 220 W; I guess we will see soon enough. I get the feeling that the 3090 was actually made to be their next Titan, with the amount of memory it has.


I thought that with the way they are doing this, they are scrapping the Titan badging from the gaming lineup and making it a pro card, but I could be mistaken.


----------



## AsRock (Sep 1, 2020)

GhostRyder said:


> Yea, I had just found that out, so I'm not worried (other than looks, lol).
> 
> I thought that with the way they are doing this, they are scrapping the Titan badging from the gaming lineup and making it a pro card, but I could be mistaken.



Not sure; you could be right for all I know. Although I am not after a 300 W+ video card, that's for sure, regardless of the price.


----------



## medi01 (Sep 1, 2020)

medi01 said:


> This guy (note the date of his first post):
> 
> 
> __ https://twitter.com/i/web/status/1280862817671770113
> ...



Uh, hold on, isn't the 3080 also GA102 this time?


----------



## RoutedScripter (Sep 1, 2020)

theoneandonlymrk said:


> Were you not around for the last ten years? Every GPU release has had a hype train.
> And if not new, unreleased tech, what do we discuss? Do we just compare benchmarks and fix issues? Some like a less dry discussion.
> You can always just not read them.



Sorry I have no idea what you're trying to say.


----------



## TheoneandonlyMrK (Sep 1, 2020)

RoutedScripter said:


> Sorry I have no idea what you're trying to say.


Sorry I don't get you.


----------



## RoutedScripter (Sep 1, 2020)

So worth it, skipping all the leaks and the pointless speculation on a forum; it wouldn't have felt the same otherwise. I made an awesome dinner, sat down, and watched almost all of it, very satisfied, even though I'm not an Nvidia customer, but that's not the point.


----------



## TranceHead (Sep 10, 2020)

So do all the 30-series cards have a 12-pin connector?


lexluthermiester said:


> Simple, not all cards need that many power lines. For many cards one 6pin/8pin is enough. So why build a connector that has a bunch of power lines when only a few are needed? It's needless and wasteful.
> 
> 
> You need to read that wikipedia article you're quoting a little closer. 2.0 was on the roadmap in 2016 but it wasn't finalized until June 26th of 2019. Additionally, DP2.0 modulation ICs are expensive and offer marginal benefit to the consumer over 1.4a based ICs. DP2.0 is best suited for commercial and industrial applications ATM.


----------



## lexluthermiester (Sep 10, 2020)

TranceHead said:


> So do all the 30-series cards have a 12-pin connector?


Yes & No. All of the NVidia branded FE cards will. AIB cards will not. So it will depend on which brand you buy from.


----------



## TranceHead (Sep 10, 2020)

lexluthermiester said:


> Yes & No. All of the NVidia branded FE cards will. AIB cards will not. So it will depend on which brand you buy from.


Either way, I still prefer a single connection over multiple connectors.


----------



## lexluthermiester (Sep 10, 2020)

TranceHead said:


> Either way, I still prefer a single connection over multiple connectors.


While I would generally agree with that, I don't have any PSUs that have that connection, and an adapter cable is pointless. When PSUs have that connector, I'd be fine using it, but I'm not going to replace $650 worth of in-use PSUs in my home just to accommodate a change no one needs or asked for. And to be fair, AIBs are not going to use the connector, and I'm not buying any of NVidia's own FE lineup (either personally or for my shop). So it's kind of a moot point at this time.


----------



## AsRock (Sep 10, 2020)

The adapter should come with the nV FE card anyways.


----------

