# NVIDIA Announces the GeForce RTX 3060, $330, 12 GB of GDDR6



## btarunr (Jan 12, 2021)

NVIDIA today announced that it is bringing the NVIDIA Ampere architecture to millions more PC gamers with the new GeForce RTX 3060 GPU. With its efficient, high-performance architecture and the second generation of NVIDIA RTX, the RTX 3060 brings amazing hardware raytracing capabilities and support for NVIDIA DLSS and other technologies, and is priced at $329.

NVIDIA's 60-class GPUs have traditionally been the single most popular cards for gamers on Steam, with the GTX 1060 long at the top of the GPU gaming charts since its introduction in 2016. An estimated 90 percent of GeForce gamers currently play with a GTX-class GPU. "There's unstoppable momentum behind raytracing, which has quickly redefined the new standard of gaming," said Matt Wuebbling, vice president of global GeForce marketing at NVIDIA. "The NVIDIA Ampere architecture has been our fastest-selling ever, and the RTX 3060 brings the strengths of the RTX 30 Series to millions more gamers everywhere."






With newer gaming titles come bigger worlds with cinematic graphics and real-time raytracing — these are gaming workloads that only RTX-powered platforms are suited to handle. The GeForce RTX 3060 has twice the raster performance and 10x the raytracing performance of the GTX 1060, making it a formidable upgrade opportunity and the foundation of a gaming PC platform powerful enough to handle cutting-edge titles such as Cyberpunk 2077 and Fortnite with RTX On at 60 frames per second.

The RTX 3060's key specifications include:
- 13 shader-TFLOPs
- 25 RT-TFLOPs for raytracing
- 101 tensor-TFLOPs to power NVIDIA DLSS (Deep Learning Super Sampling)
- 192-bit memory interface
- 12 GB of GDDR6 memory
Resizable BAR will be supported on the GeForce RTX 30 Series starting with the RTX 3060. When combined with a compatible motherboard, this advanced PCI Express technology enables all of the GPU memory to be accessed by the CPU at once, providing a performance boost in many games.
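One practical way to check whether Resizable BAR is actually active on a Linux system is to compare the BAR1 aperture that `nvidia-smi -q -d MEMORY` reports against total VRAM; with ReBAR off, BAR1 is typically only 256 MiB. A minimal sketch (the sample output below is illustrative, not captured from a real card):

```python
import re

def bar1_covers_vram(nvidia_smi_output: str) -> bool:
    """Return True if the reported BAR1 aperture is at least as large as
    total VRAM, i.e. Resizable BAR is likely active."""
    def grab(section: str) -> int:
        # Find the first "Total : <n> MiB" line after the section header
        m = re.search(section + r".*?Total\s*:\s*(\d+)\s*MiB",
                      nvidia_smi_output, re.S)
        return int(m.group(1)) if m else 0
    vram = grab(r"FB Memory Usage")
    bar1 = grab(r"BAR1 Memory Usage")
    return bar1 >= vram

# Illustrative text shaped like `nvidia-smi -q -d MEMORY` output
sample = """
    FB Memory Usage
        Total                             : 12288 MiB
    BAR1 Memory Usage
        Total                             : 16384 MiB
"""
print(bar1_covers_vram(sample))  # True: the CPU can map all 12 GB at once
```

Without ReBAR the same parser would see a small BAR1 aperture and return False.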

Like all RTX 30 Series GPUs, the RTX 3060 supports the trifecta of GeForce gaming innovations: NVIDIA DLSS, NVIDIA Reflex and NVIDIA Broadcast, which accelerate performance and enhance image quality. Together with real-time ray tracing, these technologies are the foundation of the GeForce gaming platform, which brings unparalleled performance and features to games and gamers everywhere.

*NVIDIA DLSS: The AI Gift That Gamers Love*

AI is revolutionizing gaming — from in-game physics and animation simulation to real-time rendering and AI-assisted broadcasting features. Powered by dedicated AI processors on GeForce RTX GPUs called Tensor Cores, NVIDIA DLSS boosts frame rates while generating beautiful, crisp game images and gives gamers the performance headroom to maximize raytracing settings and increase output resolutions. DLSS is available in more than 25 games, with more added every month.
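The "performance headroom" comes from shading fewer pixels: DLSS renders the frame at a reduced internal resolution and lets the Tensor Cores reconstruct the target resolution. A rough sketch using the commonly published per-axis scale factors (approximate values, not an official API):

```python
# Commonly cited per-axis render scales for DLSS 2 modes (approximate)
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal resolution DLSS shades before upscaling to (out_w, out_h)."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

# 4K output in Performance mode: the GPU shades only a quarter of the pixels
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```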

*NVIDIA Reflex and Broadcast: The Ultimate Play*
NVIDIA Reflex technology reduces system latency (or input lag), making games more responsive and giving players in competitive multiplayer titles an edge over the opposition. NVIDIA Broadcast is a suite of audio and video AI enhancements, including virtual backgrounds, motion capture and advanced noise removal, that users can apply to chats, Skype calls and video conferences.

*Advanced GeForce Experience Features*
All NVIDIA GeForce GPUs benefit from GeForce Experience, a tool used by tens of millions of gamers to optimize game settings, record and upload gameplay, stream gameplay, take screenshots, and download and install Game Ready Drivers. The latest features include:

- One-click automatic GPU Tuning: GeForce Experience now supports GPU Tuning, which can automatically create overclocking profiles using an advanced scanning algorithm.
- Enhanced in-game monitoring overlay: GeForce Experience's already robust in-game overlay now adds performance stats, temperatures and latency metrics, including NVIDIA Reflex Latency Analyzer stats.

*Where to Buy*
The GeForce RTX 3060 will be available in late February, starting at $329, as custom boards — including stock-clocked and factory-overclocked models — from top add-in card providers such as ASUS, Colorful, EVGA, Gainward, Galaxy, Gigabyte, Innovision 3D, MSI, Palit, PNY and Zotac. Look for GeForce RTX 3060 GPUs at major retailers and etailers, as well as in gaming systems by major manufacturers and leading system builders worldwide.



----------



## Mad_foxx1983 (Jan 12, 2021)

so this card got more Vram than the 3060 ti, 3070 and 3080. why nvidia??


----------



## Nihilus (Jan 12, 2021)

Mad_foxx1983 said:


> so this card got more Vram than the 3060 ti, 3070 and 3080. why nvidia??


Because you touch yourself at night.  Seriously, you are just asking the same question that was answered many times.


----------



## Xaled (Jan 12, 2021)

Mad_foxx1983 said:


> so this card got more Vram than the 3060 ti, 3070 and 3080. why nvidia??


Blame AMD for not bringing SAM early enough!


----------



## neatfeatguy (Jan 12, 2021)

"NVIDIA DLSS: The AI Gift That Gamers Love"

As for availability - I guess we actually wait and see. The 3060 Ti can't be purchased easily, just like all the other GPUs that have recently launched.


----------



## Gameslove (Jan 12, 2021)

Hopefully they have started producing more RTX 3060 cards at that $329 (€329) price than they did for the earlier RTX 3090, 3080, 3070 and 3060 Ti releases.


----------



## 15th Warlock (Jan 12, 2021)

Nice little card, with a plentiful pool of VRAM, I love it.

Hate to say this but I wonder if that MSRP includes the new 25% tariff on all video cards manufactured in China


----------



## yotano211 (Jan 12, 2021)

I can understand the 60-class cards being the most popular, but the 3070 has 4 GB less memory while being much faster at the higher resolutions that need more memory. I can see a 3070 Ti coming very soon with a higher memory size.


----------



## ZoneDymo (Jan 12, 2021)

"and is priced at $329"

yeah....no


----------



## Anymal (Jan 12, 2021)

ZoneDymo said:


> "and is priced at $329"
> 
> yeah....no


Fake MSRP!


----------



## TheinsanegamerN (Jan 12, 2021)

yotano211 said:


> I can understand the 60-class cards being the most popular, but the 3070 has 4 GB less memory while being much faster at the higher resolutions that need more memory. I can see a 3070 Ti coming very soon with a higher memory size.


Likely we'll see a 16GB 3070ti and a 20GB 3080ti this year. Nvidia really shot themselves in the foot launching before the high density GDDR6X was ready. 

OTOH they seem to have no issue selling the current cards, so maybe they were right on the money.


----------



## Vayra86 (Jan 12, 2021)

TheinsanegamerN said:


> Likely we'll see a 16GB 3070ti and a 20GB 3080ti this year. Nvidia really shot themselves in the foot launching before the high density GDDR6X was ready.
> 
> OTOH they seem to have no issue selling the current cards, so maybe they were right on the money.



Planned obsolescence suits the early adopter well, yes.

But hey, 10 GB is enough, no worries, because even the midrange AND the consoles will have ~12 GB now, so all is well! Never mind the fact that volume sales will start happening now and going forward, not before this moment. So we now have the unique situation that a midrange lineup AND the competition all have more VRAM than a few one-offs called the vanilla 3070 and 3080.

You have to have some pretty solid faith to think that's not an issue going forward. Religious, in fact.

Devs optimize for the mainstream, and it's fast looking like 10 GB is an oddball capacity - effectively, we already had 11 GB cards two generations back. Back then it was too much and now it is too little.


----------



## Deleted member 197223 (Jan 12, 2021)

$329 equates to about €450 if you look at the current price of a 3060 Ti. Either way, this card will be a great contender against the €400 AMD counterpart once prices have settled sometime in late 2021. And I really hope Nvidia stops with the Super nonsense.


----------



## Xaled (Jan 12, 2021)

The 3070's $500 MSRP meant $1000.
And the 3060 Ti's $400 was actually $800.


----------



## TheinsanegamerN (Jan 12, 2021)

Vayra86 said:


> Planned obsolescence suits the early adopter well, yes.
> 
> But hey, 10 GB is enough, no worries, because even the midrange AND the consoles will have ~12 GB now, so all is well! Never mind the fact that volume sales will start happening now and going forward, not before this moment. So we now have the unique situation that a midrange lineup AND the competition all have more VRAM than a few one-offs called the vanilla 3070 and 3080.
> 
> ...


There are still 0 reviews by tech sites showing the 10GB to be a limiting factor. The 3080 topped the 2080 Ti, handily, in every test. If the VRAM were insufficient, the 3080's 1% lows would be annihilated by Turing.

The "mainstream" number game devs will be targeting on PC is currently less than 4GB. The VAST majority of the market has 4GB or less, with a sizeable minority at 6GB, a smaller minority at 8GB, etc. Game consoles will have 12GB for games, yes, but that is TOTAL memory - both system and video memory, which are two separate pools on PC. A game using 12GB of video memory would have nothing left over for the game logic itself.

The 290X 8GB was totally worthless; the 8GB 480, same deal - not enough power for the 1440p gaming where such memory might have mattered. The 8GB 3070 manages to meet or exceed the 1% lows and averages of the 11GB 2080 Ti consistently.

By the time the 10GB 3080's RAM is insufficient for high/ultra settings, we'll likely have far more powerful GPUs on the market. Not to mention that even when it does eventually run out, there are plenty of tweaks to fix the problem. Most tests are done with settings maxed out, including AA. AA gobbles up VRAM, and at 1440p and above it's really not that noticeable; FXAA works better and has less of a VRAM penalty.

I've yet to see a modern game that doesn't run properly on my Vega 64 at 1440p. The most VRAM-demanding games I've seen are Cities: Skylines, with its unoptimized RAM allocation, and Doom Eternal, which only goes above 8GB on unplayable settings.


----------



## Gameslove (Jan 12, 2021)

Theoretically, the RTX 3060 will have 10% less performance than the RTX 3060 Ti.....


----------



## Vayra86 (Jan 12, 2021)

TheinsanegamerN said:


> There are still 0 reviews by tech sites showing the 10GB to be a limiting factor. The 3080 topped the 2080 Ti, handily, in every test. If the VRAM were insufficient, the 3080's 1% lows would be annihilated by Turing.
> 
> The "mainstream" number game devs will be targeting on PC is currently less than 4GB. The VAST majority of the market has 4GB or less, with a sizeable minority at 6GB, a smaller minority at 8GB, etc. Game consoles will have 12GB for games, yes, but that is TOTAL memory - both system and video memory, which are two separate pools on PC. A game using 12GB of video memory would have nothing left over for the game logic itself.
> 
> ...



Time will tell, but we're already getting pretty close to that limit in some titles, and yes, having less is a performance hit. We've seen this before in games that exceed VRAM when they get revisited a few years post-release. And there are many examples of the core having enough power but VRAM being the main reason to upgrade cards. It's as good a reason as any, right? It's certainly not true that 'the core runs out when VRAM runs out' is a rule of thumb. Sorry. I have too many examples, even recent ones, that tell a different story.

You say tricks and tweaks, but that's nonsense... what really happens is the market caters to the common denominator, and it's certainly not 10GB. It's either 8 GB for the lower/mid segment, and most certainly towards 12GB for the upper segment going forward. Or are we now going to argue that PC performance is capped at whatever the consoles present us? That would be lying to ourselves - PC demands have ALWAYS exceeded console demands. So... 'by the time' might come round much sooner than you think. I think 3 years is very optimistic, and 2 is very plausible. Pretty short lifetime for balls to the wall gaming on a 700 MSRP with inflation. Especially with lots of alternatives next to it that offer more. Good luck reselling that card in 2 years' time. It's immediately a midranger at best. Irony has it that a 3060 with 12GB, or better yet a 3070 with 16, will fetch better value by then, while being cheaper to buy now.

But be that as it may, you are certainly correct _today_. The examples of 10GB being problematic are hardly there. It mostly involves modding and per-game examples - but they do exist, including at 4K.


----------



## goodeedidid (Jan 12, 2021)

Another nice and exciting product that won't be in stock anywhere. Maybe another lost container, who knows..


----------



## TheinsanegamerN (Jan 12, 2021)

Vayra86 said:


> Time will tell, but we're already getting pretty close to that limit in some titles, and yes, having less is a performance hit. We've seen this before in games that exceed VRAM when they get revisited a few years post-release. And there are many examples of the core having enough power but VRAM being the main reason to upgrade cards. It's as good a reason as any, right? It's certainly not true that 'the core runs out when VRAM runs out' is a rule of thumb. Sorry. I have too many examples, even recent ones, that tell a different story.


How about you post some of those examples? People often clamor that there is plenty of evidence that VRAM limits are an issue, yet can never point to tests showing the results. Maybe a YouTube video at BEST, but never any well-tested, verified results. One of the biggest symptoms of VRAM limitation is stuttering, which shows up as atrocious 1% low FPS. Most review sites test this now. I've yet to see ONE that shows such behaviour from a 3070 or 3080.



> You say tricks and tweaks, but that's nonsense... what really happens is the market caters to the common denominator, and it's certainly not 10GB. It's either 8 GB for the lower/mid segment, and most certainly towards 12GB for the upper segment going forward. Or are we now going to argue that PC performance is capped at whatever the consoles present us? That would be lying to ourselves - PC demands have ALWAYS exceeded console demands. So... 'by the time' might come round much sooner than you think. I think 3 years is very optimistic, and 2 is very plausible. Pretty short lifetime for balls to the wall gaming on a 700 MSRP with inflation. Especially with lots of alternatives next to it that offer more. Good luck reselling that card in 2 years' time. It's immediately a midranger at best. Irony has it that a 3060 with 12GB, or better yet a 3070 with 16, will fetch better value by then, while being cheaper to buy now.


Are you REALLY insinuating that the common denominator for PC development is 8GB, when >90% of PC gamers use cards with 6GB of VRAM or less? Are you even aware of how rare high-end cards are? Polaris was the first generation where >6GB cards were affordable for the masses, with the 8GB 470 and 480.



> But be that as it may, you are certainly correct _today_. The examples of 10GB being problematic are hardly there. It mostly involves modding and per-game examples - but they do exist, including at 4K.


So you admit the only examples you can find involve mods (LMFAO), and the rest is assumed based on speculation and nothing else? One or two benchmarks intentionally written to break hardware HARDLY count as general gaming.


----------



## dirtyferret (Jan 12, 2021)

I look forward to seeing this card out of stock everywhere but ebay for twice the MSRP.



Vayra86 said:


> Pretty short lifetime for balls to the wall gaming on a 700 MSRP with inflation


I would think so but you really shouldn't game like that especially if you want to have little Vayras with Mrs. Vayra.


----------



## Doc-J (Jan 12, 2021)

TheinsanegamerN said:


> How about you post some of those examples? People often clamor that there is plenty of evidence that VRAM limits are an issue, yet can never point to tests showing the results. Maybe a YouTube video at BEST, but never any well-tested, verified results. One of the biggest symptoms of VRAM limitation is stuttering, which shows up as atrocious 1% low FPS. Most review sites test this now. I've yet to see ONE that shows such behaviour from a 3070 or 3080.
> 
> Are you REALLY insinuating that the common denominator for PC development is 8GB, when >90% of PC gamers use cards with 6GB of VRAM or less? Are you even aware of how rare high-end cards are? Polaris was the first generation where >6GB cards were affordable for the masses, with the 8GB 470 and 480.
> 
> So you admit the only examples you can find involve mods (LMFAO), and the rest is assumed based on speculation and nothing else? One or two benchmarks intentionally written to break hardware HARDLY count as general gaming.


Really good explanation. 10 or 8GB is sufficient for the next 2 years, but certain people don't trust the reviews and speak without any tests to demonstrate it.


----------



## dir_d (Jan 12, 2021)

The only reason I think Nvidia decided to put in 12GB of VRAM would be what happens to the 3070 maxed out in DLSS with 8GB of RAM. Plus, they couldn't release a 6GB version - customers would throw a fit.


----------



## Turmania (Jan 12, 2021)

what is the power target of the card? 150 w?


----------



## Vayra86 (Jan 12, 2021)

dirtyferret said:


> I look forward to seeing this card out of stock everywhere but ebay for twice the MSRP.
> 
> 
> I would think so but you really shouldn't game like that especially if you want to have little Vayras with Mrs. Vayra.



Haha already got a little Mrs Vayra walking around over here for two years now 



TheinsanegamerN said:


> How about you post some of those examples? People often clamor that there is plenty of evidence that VRAM limits are an issue, yet can never point to tests showing the results. Maybe a YouTube video at BEST, but never any well-tested, verified results. One of the biggest symptoms of VRAM limitation is stuttering, which shows up as atrocious 1% low FPS. Most review sites test this now. I've yet to see ONE that shows such behaviour from a 3070 or 3080.
> 
> Are you REALLY insinuating that the common denominator for PC development is 8GB, when >90% of PC gamers use cards with 6GB of VRAM or less? Are you even aware of how rare high-end cards are? Polaris was the first generation where >6GB cards were affordable for the masses, with the 8GB 470 and 480.
> 
> So you admit the only examples you can find involve mods (LMFAO), and the rest is assumed based on speculation and nothing else? One or two benchmarks intentionally written to break hardware HARDLY count as general gaming.



It's a crystal ball prediction. I already conceded that 8GB is not the norm today, but it soon will be, as that is what the consoles have; and if you want a bit more, 10GB is going to come up short sooner rather than later - but again, a prediction. I can't offer you the facts today, but what I do see is that I have an 8GB GPU today and we're steadily moving closer to its limit in new titles. The momentum is there and the increase will be mainstream soon. Cyberpunk, for example, will happily eat up 7+ GB and still runs fine on my 1080 while doing so.

And is the use of mods so strange on a PC? I have to say that for any game I play for more than a single playthrough, it's one of the first things I check out. They expand games and increase value. If I don't want mods, I can buy a console - modding is a key selling point for PC gaming.

I'll also concede that it's acceptable for others to accept different standards for what they get out of a GPU. But I can see myself running into trouble with 10GB going forward, and that is well founded in what I've seen up until today with regard to performance and capacity. You're at liberty to think otherwise and base your choices on that. But I wouldn't be too sure - neither am I. We just can't tell, and that is a 'risk' for a purchase.


----------



## Metroid (Jan 12, 2021)

They put 10GB on the RTX 3080 flagship GPU and 12GB on a mainstream GPU; this must be a joke to 3080 users. The $330 price tag will in reality be $500, we all know that.


----------



## Sunny and 75 (Jan 12, 2021)

Can't wait to see the TPU Review of the 3060 12 GB and look at how it performs against the 2080.


----------



## RedelZaVedno (Jan 12, 2021)

1,300 fewer CUDA cores than the 3060 Ti? That translates into roughly 35% less rendering performance. It's basically a 2060S/2070 with 12 gigs of VRAM for 70 bucks less. So only a 12-15% gain over the 2060?
That's a very shitty generational performance uplift. The only GPUs worth buying in the Ampere lineup are the 3060 Ti and the 3080 at MSRP. They both lack 2 gigs of additional VRAM, but still, everything else is kind of meh.


----------



## dirtyferret (Jan 12, 2021)

Metroid said:


> They put 10GB on the RTX 3080 flagship GPU and 12GB on a mainstream GPU; this must be a joke to 3080 users. The $330 price tag will in reality be $500, we all know that.


The RTX 3090 is the flagship, and the RTX 3080 has a 320-bit memory bus versus the 192-bit one on the RTX 3060, so it's not really an apples-to-apples comparison.


----------



## Metroid (Jan 12, 2021)

dirtyferret said:


> The RTX 3090 is the flagship, and the RTX 3080 has a 320-bit memory bus versus the 192-bit one on the RTX 3060, so it's not really an apples-to-apples comparison.


Wrong, the 3080 is the flagship; the 3090 is Titan-like. Nvidia's CEO said the 3080 is the flagship at the presentation where he announced the GPUs - check the video from last September.


----------



## Sunny and 75 (Jan 12, 2021)

RedelZaVedno said:


> It's 2060S/2070 with 12 gigs of vram basically for 70 bucks less.


 
We'll know that for sure in late February.


----------



## dirtyferret (Jan 12, 2021)

Metroid said:


> Wrong, the 3080 is the flagship; the 3090 is Titan-like. Nvidia's CEO said the 3080 is the flagship at the presentation where he announced the GPUs - check the video from last September.


the RTX 3080 came out earlier, so at the time it was the flagship until the RTX 3090.

Other websites call the RTX 3090 the current Nvidia flagship as well:

AMD Radeon RX 6900 XT vs. Nvidia RTX 3090: Flagship Battle | Digital Trends (www.digitaltrends.com)

NVIDIA GeForce RTX 3090 Flagship 'Ampere' Graphics Card Unleashed For $1499 US - 10496 Cores, 24 GB GDDR6X, 50% Faster Than Titan RTX (wccftech.com)

but hey, I'm sure they would love to hear from an amateur such as yourself correcting them on their mistakes


----------



## Metroid (Jan 12, 2021)

dirtyferret said:


> the RTX 3080 came out earlier so at the time it was the flagship until the RTX 3090.
> 
> Other websites call the RTX 3090 the current Nvidia flagship as well
> AMD Radeon RX 6900 XT vs. Nvidia RTX 3090: Flagship battle
> ...


Like I said before, check the video; it came straight from Nvidia's CEO.

Edit: Here, watch it - I had to find the moment where he says the 3080 is the new flagship myself


----------



## yotano211 (Jan 12, 2021)

Metroid said:


> They put 10GB on the RTX 3080 flagship GPU and 12GB on a mainstream GPU; this must be a joke to 3080 users. The $330 price tag will in reality be $500, we all know that.


The 3080 mobile version will have 16gb. It appears the mobile 3080 is a downclocked 3070.


----------



## Metroid (Jan 12, 2021)

yotano211 said:


> The 3080 mobile version will have 16gb. It appears the mobile 3080 is a downclocked 3070.



I have an EVGA RTX 3080 FTW3, and I have never seen a graphics card this large in my entire life - 3 slots, really huge. Yeah, no way in hell will the mobile version be the same thing as the desktop one; it'll be heavily downclocked to keep heat to a minimum. I would never buy a mobile 3080 GPU.


----------



## jboydgolfer (Jan 12, 2021)

this launch is so ass-backwards, I'm just going to avoid it. such ridiculous treatment of consumers by Nvidia.

they stopped shipping cards because they didn't want inventory on shelves when they raised prices, so they stopped & let the pool dry up, which increased consumer desire & interest in the product as demand easily outweighed the supply. some time in the future they'll begin shipping them again (suddenly the effect of corona & component shortages on manufacturing will evaporate), & the consumers who have been looking at $3000 2080 Tis & $4000 3090s will drool at the comparably minor 25% price increase. they'll change the $400 cards to $500 & so on, & no one will bat an eye. I'd say it was genius, had they not stuck it to the consumer during Christmas & the holidays.


----------



## yotano211 (Jan 12, 2021)

Metroid said:


> I have an EVGA RTX 3080 FTW3, and I have never seen a graphics card this large in my entire life - 3 slots, really huge. Yeah, no way in hell will the mobile version be the same thing as the desktop one; it'll be heavily downclocked to keep heat to a minimum. I would never buy a mobile 3080 GPU.


The mobile 3080 is based on the desktop 3070, which is rated at 220W. I don't know the power rating of the mobile 3080, but if it's anything like the mobile 2080, it should stay at 150W. It's pretty easy to downclock the desktop 3070 to fit a 150W power rating.
You might not buy a laptop with a 3080, but others would; don't say you wouldn't buy it when you're not the target audience and have no need for it.


----------



## docnorth (Jan 12, 2021)

Turmania said:


> what is the power target of the card? 150 w?


Reported as 170w at VideoCardz.


----------



## Caring1 (Jan 12, 2021)

Any bets on how many threads we get on scalpers, high prices and lack of availability once these are released?


----------



## Sunny and 75 (Jan 12, 2021)

docnorth said:


> Reported as 170w at VideoCardz.



Yeah, and that 12 GB of GDDR6 appears to be 16 Gbps as well. Not gonna make that much of a difference, but still. Look:

Source: VCZ


----------



## Minus Infinity (Jan 12, 2021)

Available late February - they didn't say which one though, 2022 or 2023!


----------



## yotano211 (Jan 13, 2021)

Caring1 said:


> Any bets on how many threads we get on scalpers, high prices and lack of availability once these are released?


Any bets on reseller haters complaining about high prices? Post all your bets below. I'll give good odds.


----------



## Sunny and 75 (Jan 13, 2021)

Vayra86 said:


> And is the use of mods so strange on a PC? I have to say that for any game I play for more than a single playthrough, it's one of the first things I check out. They expand games and increase value. If I don't want mods, I can buy a console - modding is a key selling point for PC gaming.
> 
> I'll also concede that it's acceptable for others to accept different standards for what they get out of a GPU. But I can see myself running into trouble with 10GB going forward, and that is well founded in what I've seen up until today with regard to performance and capacity. You're at liberty to think otherwise and base your choices on that. But I wouldn't be too sure - neither am I. We just can't tell, and that is a 'risk' for a purchase.



Couldn't agree more. That's exactly why I switched from a PS4 to PC and have been enjoying the PC experience ever since.


----------



## LukeCuda (Jan 13, 2021)

there is some benchmarking evidence that 3GB is enough for 4K in current games. the drop in framerate is no greater than on higher-GB cards with the same performance. people just look at VRAM-usage monitors and don't understand how a GPU works.

it's possible they won't test future games with such low GB, so maybe it will be limiting in certain circumstances. hard to predict the future. but with NVMe on the consoles, it might be even less of a problem going forward.


----------



## Chrispy_ (Jan 13, 2021)

Cool, it doesn't look to be cut down too much from GA104.

13 shader-TFLOPs vs 16.2 of the 3060Ti
101 tensor-TFLOPs vs 129 of the 3060Ti
So 80% of the Ti's performance, and the FE is likely to be extremely compact - perfect for replacing all those 1060s that are starting to struggle.
It's just a shame that this is going to cost about $600 for a while. Good job I don't need a new GPU right now....
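Chrispy_'s ~80% figure can be sanity-checked from the published numbers: shader-TFLOPs are just core count × boost clock × 2 FP32 ops per clock on Ampere. A quick sketch using the announced 3584 cores and the 3060 Ti's 4864 (the ~1.78/~1.67 GHz boost clocks are taken from spec sheets and are assumptions here):

```python
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    # Ampere CUDA cores retire 2 FP32 ops per clock (one fused multiply-add)
    return cuda_cores * boost_ghz * 2 / 1000

rtx_3060 = fp32_tflops(3584, 1.777)     # ~12.7 TFLOPs
rtx_3060_ti = fp32_tflops(4864, 1.665)  # ~16.2 TFLOPs
print(f"{rtx_3060 / rtx_3060_ti:.0%} of the Ti's theoretical throughput")
```

Which lands right around the quoted 13 vs 16.2 TFLOPs.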


----------



## LukeCuda (Jan 13, 2021)

Chrispy_ said:


> 13 shader-TFLOPs vs 16.2 of the 3060Ti
> 101 tensor-TFLOPs vs 129 of the 3060Ti


and 17% less memory bandwidth


----------



## Solid State Soul ( SSS ) (Jan 13, 2021)




----------



## Chrispy_ (Jan 13, 2021)

LukeCuda said:


> and 17% less memory bandwidth


It would have been worse, but for some reason Nvidia decided to use faster GDDR6. Availability? Or perhaps they're just trying to compensate for the drop in bus width and don't want it to become bandwidth-starved.

Oh, I just spotted that Nvidia have released an image of the 3060:

It could _almost_ be an exhausting blower....
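The bus-width/clock trade-off is simple arithmetic: bandwidth = (bus width ÷ 8) × per-pin data rate. A sketch assuming the rumoured 16 Gbps chips on the 3060's 192-bit bus versus the 3060 Ti's 14 Gbps on 256-bit (both figures are assumptions from the thread, not confirmed specs):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # bits transferred per pin per second * number of pins, / 8 for bytes
    return bus_width_bits / 8 * data_rate_gbps

rtx_3060 = bandwidth_gb_s(192, 16.0)     # 384.0 GB/s, if 16 Gbps holds
rtx_3060_ti = bandwidth_gb_s(256, 14.0)  # 448.0 GB/s
print(f"{1 - rtx_3060 / rtx_3060_ti:.0%} less bandwidth than the Ti")
```

With slower 15 Gbps chips instead, the gap would widen to roughly 20%.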


----------



## renz496 (Jan 13, 2021)

Metroid said:


> They put 10gb on the rtx 3080 gpu flagship and 12 gb on a maistream gpu, this must be a joke to 3080 users. The $330 price tag will be in reality $500, we all know that.


That's simply how competition works. Early adopters always have issues like this. You either accept it or just don't buy it and wait for a better deal. That's it.


----------



## Metroid (Jan 13, 2021)

renz496 said:


> That's simply how competition works. Early adopters always have issues like this. You either accept it or just don't buy it and wait for a better deal. That's it.


Agreed.


----------



## Sunny and 75 (Jan 13, 2021)

Chrispy_ said:


> Oh, I just spotted that Nvidia have released an image of the 3060
> 
> 
> 
> ...



If that's the final product then it seems one fan is enough to cool that GA106 GPU.


----------



## N3M3515 (Jan 13, 2021)

Ok, let me see if I get it: less GPU power - more VRAM? ROFL, wtf


----------



## Anymal (Jan 13, 2021)

3080 twice the 2080, 3060 twice the 1060, lol


----------



## Caring1 (Jan 13, 2021)

Chrispy_ said:


> Oh, I just spotted that Nvidia have released an image of the 3060
> 
> View attachment 183788
> 
> It could_ almost_ be an exhausting blower....


And no power connectors.


----------



## Max(IT) (Jan 13, 2021)

Quite strange that they compared it with the four-and-a-half-year-old 1060 rather than the 2060/2060 Super it is going to replace.
Millions of players? Yes, Nvidia. You should try to reach millions of actual players instead of the scalpers and miners you have so far...



Mad_foxx1983 said:


> so this card got more Vram than the 3060 ti, 3070 and 3080. why nvidia??


I think it will be asked many times... With that kind of bus, the choice was between 6 and 12 GB. 6 GB would not have been enough, so... you have your answer



15th Warlock said:


> Nice little card, with a plentiful pool of VRAM, I love it.
> 
> Hate to say this but I wonder if that MSRP includes the new 25% tariff on all video cards manufactured in China


As usual, the MSRP will be relevant to a few US customers only... Customers outside the US will most probably pay something like €400-450 for this card



TheinsanegamerN said:


> How about you post some of those examples? People often clamor that there is plenty of evidence that VRAM limits are an issue, yet can never point to tests showing the results. Maybe a YouTube video at BEST, but never any well-tested, verified results. One of the biggest symptoms of VRAM limitation is stuttering, which shows up as atrocious 1% low FPS. Most review sites test this now. I've yet to see ONE that shows such behaviour from a 3070 or 3080.
> 
> Are you REALLY insinuating that the common denominator for PC development is 8GB, when >90% of PC gamers use cards with 6GB of VRAM or less? Are you even aware of how rare high-end cards are? Polaris was the first generation where >6GB cards were affordable for the masses, with the 8GB 470 and 480.
> 
> So you admit the only examples you can find involve mods (LMFAO), and the rest is assumed based on speculation and nothing else? One or two benchmarks intentionally written to break hardware HARDLY count as general gaming.


This. I'm not happy about Nvidia's memory size choices, but so far I have never seen a test showing we are approaching VRAM issues on a 10 GB card, even at 4K.



dir_d said:


> The only reason why I think Nvidia decided to put in 12GB of VRAM would be what happens to the 3070 maxed out with DLSS and 8GB of VRAM. Plus, they couldn't release a 6GB version; customers would throw a fit.


What happened? Nothing. So far the 3070 is fine with 8 GB, even maxed out.


----------



## Ibotibo01 (Jan 13, 2021)

RIP RTX 3060. AMD will release the RX 6700 XT and RX 6700. The RX 6700 XT will compete with the RTX 2080/S, and the RX 6700 with the RTX 2070/S. Nvidia could release an RTX 3060 Super after the RX 6700, or maybe a price cut. In the 900, 1000, and 2000 series, the xx60 GPU had half the xx80 GPU's core count, so the RTX 3060 should have 4352 cores. If the RTX 3060 were $279 it would dominate its price region, but the pricing of the 3060 is bad. I hoped to see the GTX 760's promise again, but all I saw was a 3060 ballooned with VRAM. Even so, I believe Nvidia will release an RTX 3050 Ti 6GB for $179-199. I would choose the RTX 3050 Ti over the RTX 3060 and wait for the RTX 4000 series.


----------



## watzupken (Jan 13, 2021)

Mad_foxx1983 said:


> So this card got more VRAM than the 3060 Ti, 3070 and 3080. Why, Nvidia??


I feel a card at this level is unlikely to use more than 8GB, since it is good for up to 1440p. The rest of the VRAM is there to compete with the mid-range cards from AMD, which are also expected to have 12GB of VRAM over a 192-bit bus. If AMD includes the massive Infinity Cache on their mid-range cards, then I think the 6700 XT may be faster. Just my guess.


----------



## Sunny and 75 (Jan 13, 2021)

Caring1 said:


> And no power connectors.



Maybe it's in that dark corner on the right.


----------



## Vayra86 (Jan 13, 2021)

RedelZaVedno said:


> 1,300 fewer CUDA cores than the 3060 Ti? That translates into 35% less rendering performance. It's basically a 2060S/2070 with 12 gigs of VRAM for 70 bucks less. So only a 12-15% gain over the 2060?
> That's a very shitty generational performance uplift. The only GPUs worth buying in the Ampere lineup are the 3060 Ti and 3080 at MSRP. They both lack 2 gigs of additional VRAM, but still, everything else is kind of meh.



Yep, Nvidia's Ampere stack is fast looking like a big pile of 'just below par' crap with tons of software to make it work properly. In BETA, mind, and subject to change.

It's starting to look less attractive as it gets seeded further. Strange how that works. The lackluster specs along with the shitty price situation are quickly pushing me to a no-buy again. They made a big mistake trying to pre-empt as they did.


----------



## Fourstaff (Jan 13, 2021)

Any chance of an RTX 3050, or is this the weakest card able to effectively push RT at 1080p60?


----------



## cst1992 (Jan 13, 2021)

@W1zzard Waiting for the review!


----------



## Chrispy_ (Jan 13, 2021)

Caring1 said:


> And no power connectors.





Adc7dTPU said:


> Maybe it's in that dark corner on the right.


Clearly just a simplistic render, but it gives an idea of the design language Nvidia is going for with the FE: the silver S of the 3070 and 3060 Ti is gone, and it looks to me like the only fan on the front (the bottom as we see it) is at the back. There can't be another fan on the other side, because that's where the PCB is, so this is looking like either a blower or a blow-through design, based on this artist's impression.

It's not really final enough to say, but the texture we can see on the bottom does look like a _cover_, rather than the chevron-angled heatsink fins we've seen on higher-tier cards.
As for power connectors, Nvidia will be hooking into the PCB, which will be short again, so it's very likely they'll be in the middle of the card like the rest of the Ampere line-up.

The other option is that this render is so half-assed because Nvidia isn't planning an FE and will hope the partners carry the design at launch. That's usually what happens from the 50-series down, with renders of Nvidia's own design existing only as marketing images on Nvidia's website. It could take effect from the 60-series this time...


----------



## cst1992 (Jan 13, 2021)

If it only has the one fan, I'd say skip the 3060 FE.

That one fan can't be enough to cool a 175W card (more, since it'll still use the 12-pin connector for up to 225W of power draw).


----------



## Max(IT) (Jan 13, 2021)

Ibotibo01 said:


> RIP RTX 3060. AMD will release RX 6700 XT and RX 6700. RX 6700 XT will compete RTX 2080/S, RX 6700 will compete RTX 2070/S. Nvidia could release RTX 3060 Super after RX 6700 or maybe price cut.  However, 900 series, 1000 series, 2000 series' xx60 GPU has half of xx80 GPU's core count. RTX 3060 should have 4352 cores. If RTX 3060 was $279, it would dominate the price region but price of 3060 is bad. At least, I hoped to see GTX 760's promises but only i saw 3060 is baloon of VRAM. Even so, I believe that Nvidia will release RTX 3050 Ti 6GB for $179-199. I would choose RTX 3050 Ti over RTX 3060 and wait RTX 4000 series.


yes... AMD will release other products with very poor RT performance and customers will still buy Geforce cards...


----------



## N3M3515 (Jan 13, 2021)

Max(IT) said:


> yes... AMD will release other products with very poor RT performance and customers will still buy Geforce cards...


"The customers" will buy whatever fits their interests. The only game where RT is noticeable is CP2077, and not even the 3090 can max it out unless you compromise. I just took a look at the list of 30 games that I play or will play, and not even one has support for RT. So I guess for the vast majority of "customers", RT is meaningless right now. Maybe next gen.


----------



## Anymal (Jan 13, 2021)

Well, Nvidia's market share has risen during the Turing period, so...


----------



## N3M3515 (Jan 13, 2021)

Anymal said:


> Well, Nvidia's market share has risen during the Turing period, so...


RT or not, Nvidia has good GPUs, and I bet that Nvidia's share is 90% GTX 1060 and GTX 1660.


----------



## Chrispy_ (Jan 13, 2021)

cst1992 said:


> If it only has the one fan, I'd say skip the 3060 FE.
> 
> That one fan can't be enough to cool a 175W card (more, since it'll still use the 12-pin connector for up to 225W of power draw).


Utter nonsense! There are hundreds (maybe even thousands) of cards, both past and present, that adequately cool GPUs over 175W with a single fan. I'm currently using a single-fan card with a total board TDP of around 165-170W, and it's quiet because the single fan only runs at 1400rpm even under a full FurMark torture test.

Cooling 175W should be a piece of cake for a single fan. The 3080 at 350W doesn't seem to have any problem (175W per fan), and with two fans on a card there's much less heatsink _per fan_ too.


----------



## ThrashZone (Jan 13, 2021)

Hi,
If you want to OC a card to the max, a single fan is nowhere near enough cooling, unless you live near the arctic circle and leave the windows open lol

Hybrids, which only have one fan, might be the exception.


----------



## TheoneandonlyMrK (Jan 13, 2021)

Nvidia's releases are getting more comical and less real with each one.
They're not even going to bother with a real reference release at MSRP, so naturally the OEMs that just announced price hikes on every other GPU are not going to be selling many of these at MSRP. Wave one may have some at that price to dodge legal pursuit, but after that these will quickly scale the price range, IMHO.

As for the card, it looks good and I wouldn't mind a go, but I don't think I will bite.

They're trying too hard via alternative means (HUB et al.) to get my cash ATM.


----------



## Max(IT) (Jan 13, 2021)

N3M3515 said:


> "The customers" will buy whatever fits their interests. The only game where RT is noticeable is CP2077, and not even the 3090 can max it out unless you compromise. I just took a look at the list of 30 games that I play or will play, and not even one has support for RT. So I guess for the vast majority of "customers", RT is meaningless right now. Maybe next gen.


That's your opinion based on... let me see... ah, OK, an RX 580... Very relevant.


----------



## dirtyferret (Jan 13, 2021)

cst1992 said:


> If it only has the one fan, I'd say skip the 3060 FE.
> 
> That one fan can't be enough to cool a 175W card (more, since it'll still use the 12-pin connector for up to 225W of power draw).


What if that one fan has RGB?


----------



## Fluffmeister (Jan 13, 2021)

The GTX 1060 is by far the most popular card with gamers on Steam, and the RTX 2060 is well up there too; no doubt Nvidia hopes to continue that trend with the RTX 3060. Hell, it even has enough VRAM to attract the "Lol 10GB, I must have loads of VRAM because of dodgy consoles in the coming years!!!1" crowd.

Smart.


----------



## cst1992 (Jan 13, 2021)

dirtyferret said:


> What if that one fan has RGB?


meh.


----------



## TheoneandonlyMrK (Jan 13, 2021)

Max(IT) said:


> That's your opinion based on... Let me see... Ah ok, an Rx 580... Very relevant.


I mean, come on, you have a 3080, so all of a sudden RTX means more than the sun. WTAF dude, his 580 is WAYYYYY more representative of the PC gaming public ATM than your 3080 is, yet his opinion is worth less than yours? I DON'T THINK SO.

Remove those blinkers; your perspective is not everyone's, or any more right.


----------



## Max(IT) (Jan 13, 2021)

theoneandonlymrk said:


> I mean, come on, you have a 3080, so all of a sudden RTX means more than the sun. WTAF dude, his 580 is WAYYYYY more representative of the PC gaming public ATM than your 3080 is, yet his opinion is worth less than yours? I DON'T THINK SO.
> 
> Remove those blinkers; your perspective is not everyone's, or any more right.


Says who? Someone "incidentally" owning another AMD card with no RT support...

I didn't say my opinion is worth more. I just said I ACTUALLY played several games using RT, and every one of them counts. And in the near future it will be even more relevant.

PS: I saw a "/RTX 2060" in your signature. I don't know what it means, but even if you own one, an RTX 2060 is not exactly the hardware to test RT.


----------



## ThrashZone (Jan 13, 2021)

Hi,
Yeah reflections isn't a reason I game


----------



## cst1992 (Jan 13, 2021)

Reflections? Reflections were in NFS 5 too. Also in Burnout Paradise.
The vast majority of gamers don't have and don't really need RTX. Maybe when it becomes more mainstream, but at the moment, they don't.


----------



## Chrispy_ (Jan 13, 2021)

Max(IT) said:


> Says who? Someone "incidentally" owning another AMD card with no RT support...
> 
> I didn't say my opinion is worth more. I just said I ACTUALLY played several games using RT, and every one of them counts. And in the near future it will be even more relevant.
> 
> PS: I saw a "/RTX 2060" in your signature. I don't know what it means, but even if you own one, an RTX 2060 is not exactly the hardware to test RT.


The RTX 2060 blows chunks for RT performance. I ditched mine in favour of a 2070S, which still isn't great at RT unless you cheat with DLSS too, and the number of RTX+DLSS games can be counted on your fingers - hardly a representative cross-section of popular games today (let alone two years ago, when Turing was brand new!)

I bought my 2060 on performance/Watt alone, not caring about RTX at all. I ditched it because it ran out of VRAM for the third game in a row. Everything I tried with RTX ran like garbage and required me to accept serious resolution, image quality, and framerate drops. No thanks! As a 6GB raster-performance card, the 2060 is fine, but the RTX tax was never worth it; even at $299, when the price was reduced, it still couldn't justify its almost $100 premium over the 1660S, and at the original 2060 price of $350 it was borderline obscene.

The Steam survey is honestly the best representation of the market right now, and the median gamer is running an Intel quad core and a GTX 1060. Like it or not, it's going to stay that way for a while, because there's nothing available on store shelves for 1060 owners to upgrade to even if they want to.


----------



## TheoneandonlyMrK (Jan 14, 2021)

Max(IT) said:


> Says who? Someone "incidentally" owning another AMD card with no RT support...
> 
> I didn't say my opinion is worth more. I just said I ACTUALLY played several games using RT, and every one of them counts. And in the near future it will be even more relevant.
> 
> PS: I saw a "/RTX 2060" in your signature. I don't know what it means, but even if you own one, an RTX 2060 is not exactly the hardware to test RT.


Are you kidding? You think I haven't played Cyberpunk at max settings with DLSS on anyway @1080p? I get 40 smooth frames, but I take it that's too peasant for you. (As my system specs state, it's a laptop: a ROG Strix Scar II with an i7 + 2060.)

Says who? Someone who is not biased but realistic.

RTX is here, new dawn and all that; meanwhile the earth wasn't shattered and the sun didn't park itself up Nvidia's ass.

Calm the FF down. I played all the same RTX games at 1080p and 4K, and I'm still not that bothered: in some cases it's good, in others it's hard to tell in play, and in others it's a total nothing burger.
Essential it very much isn't.
And I can't see that changing until 2022 at the earliest, though that last bit is my opinion.


----------



## ThrashZone (Jan 14, 2021)

Hi,
Yeah, I saw no real reason to pony up for the 20 series, so the vaporware 30 series... lol. Maybe in about 12 months they might be around at a better price.
The 1080 Ti and Titan Xp are doing just fine.


----------



## Sunny and 75 (Jan 15, 2021)

N3M3515 said:


> The only game where RT is noticeable is CP2077, and *not even the 3090 can max it out unless you compromise*.



AD102 Lovelace to the rescue then! 








NVIDIA AD102 (Lovelace) GPU rumored to offer up to 18432 CUDA cores - videocardz.com


----------



## Chrispy_ (Jan 15, 2021)

Adc7dTPU said:


> AD102 Lovelace to the rescue then!
> 
> 
> 
> ...


Don't forget to buy your 2000W PSUs now, before the scalpers realise you need them for Lovelace.



N3M3515 said:


> RT or not, Nvidia has good GPUs, and I bet that Nvidia's share is 90% GTX 1060 and GTX 1660.


You don't have to guess, there are stats for that sort of thing:






What's crazy is that the 1060's market share in the #1 spot is actually _increasing_.
I can only guess that's because cards that miners have held onto for 3 years are being sold and bought by gamers now.


----------



## Anymal (Jan 15, 2021)

Well, it's not market share that you are comparing.


----------



## Max(IT) (Jan 15, 2021)

Anymal said:


> Well, it's not market share that you are comparing.


Exactly... how people can look at Steam stats as market share is beyond my comprehension...


----------



## Sunny and 75 (Jan 16, 2021)

Chrispy_ said:


> Don't forget to buy your *2000W* PSUs now, before the scalpers realise you need them for *Lovelace*.


2000W 

Thanks for the heads-up!  I'll keep that in mind.


----------



## P4-630 (Jan 25, 2021)

NVIDIA GeForce RTX 3060 listed early for almost as much as RTX 3060 Ti - videocardz.com


----------



## N3M3515 (Jan 25, 2021)

Max(IT) said:


> Exactly... how people can look at Steam stats as market share is beyond my comprehension...


Do you have a more reliable source? (besides anecdotal)


----------



## Max(IT) (Jan 25, 2021)

N3M3515 said:


> Do you have a more reliable source? (besides anecdotal)



Your question confirms to me that you don't have any clue about what market share is.


----------



## N3M3515 (Jan 25, 2021)

Max(IT) said:


> Your question confirms to me that you don't have any clue about what market share is.


Right back at you


----------



## dirtyferret (Jan 25, 2021)

Max(IT) said:


> Your question confirms to me that you don't have any clue about what market share is.


Yes, but JPR does, since they literally get paid to track market share for investors, and they have Nvidia at 80%, which is just 3% lower than the Steam hardware survey.









Pandemic distorts global GPU market results - Jon Peddie Research reports that PC GPU market shipments increased by 2.5% sequentially from last quarter (www.jonpeddie.com)


----------



## Max(IT) (Jan 25, 2021)

dirtyferret said:


> Yes, but JPR does, since they literally get paid to track market share for investors, and they have Nvidia at 80%, which is just 3% lower than the Steam hardware survey.
> 
> 
> 
> ...


Again, all of you are literally confusing market share with installed base...

They are just different numbers.
Market share is based on sales in a specific time frame (usually a quarter, but it could be yearly...).
The Steam hardware survey is reporting on hardware that may have been sold three or four years ago.

You are speaking about two different things.

According to the Steam survey, the GTX 1060 is the most used Nvidia GPU, yet Nvidia hardly sold ANY 1060s in the last two or three quarters...
Do you see the difference?
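A toy example of that distinction (numbers entirely made up for illustration): the installed base can stay dominated by an old card even when 100% of current-quarter sales go to a new one.

```python
# Hypothetical unit counts, not real data: "installed base" counts what
# people own (what Steam's survey samples), while "market share" counts
# what sold in the current quarter (what JPR reports).
installed = {"GTX 1060 (2016)": 900, "RTX 3080 (2020)": 100}
q_sales   = {"GTX 1060 (2016)": 0,   "RTX 3080 (2020)": 50}

base_share   = {k: v / sum(installed.values()) for k, v in installed.items()}
market_share = {k: v / sum(q_sales.values()) for k, v in q_sales.items()}

print(base_share)    # the 1060 dominates what people own
print(market_share)  # the 3080 takes 100% of current sales
```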


----------



## dirtyferret (Jan 25, 2021)

Max(IT) said:


> again, all of you are literally confusing market share with installed base...
> 
> Those numbers are just different numbers.
> market share is based on the sales in a specific time frame (usually a quarter, but could be yearly ...).
> ...


Actually, it's you who seems to fail at understanding how to read; if you look at the JPR chart, it literally states "market share". As for the GTX 1060, once again you are the only one with his panties in a bunch over it, as I never mentioned anything about that GPU.


----------



## N3M3515 (Jan 26, 2021)

Max(IT) said:


> According to the Steam survey, the GTX 1060 is the most used Nvidia GPU, yet Nvidia hardly sold ANY 1060s in the last two or three quarters...


And the evidence?


----------



## Max(IT) (Jan 26, 2021)

dirtyferret said:


> Actually, it's you who seems to fail at understanding how to read; if you look at the JPR chart, it literally states "market share". As for the GTX 1060, once again you are the only one with his panties in a bunch over it, as I never mentioned anything about that GPU.
> 


Actually, the 1060 was literally quoted a few posts above yours, using the Steam hardware survey as a source for market share, which is WRONG.

Quoting from the comment above:

_What's crazy is that the 1060's market share in the #1 spot is actually increasing._

JPR is showing market share.
Steam isn’t showing market share.

I’m not denying Nvidia has 80% market share at all.
I was just agreeing with the other customer how Steam hardware survey isn’t showing market share.

Next time before accusing, read the whole discussion.



N3M3515 said:


> And the evidence?


My fault.
Arguing with someone who thinks Nvidia was selling many GTX 1060s in 2020 (a product discontinued years ago) was clearly a waste of my time.


----------



## dirtyferret (Jan 26, 2021)

Max(IT) said:


> Actually, the 1060 was literally quoted a few posts above yours, using the Steam hardware survey as a source for market share, which is WRONG.
> 
> Quoting from the comment above:
> 
> ...


I read the whole discussion, and all I see is an immature person tossing a hissy fit.


----------



## Max(IT) (Jan 26, 2021)

dirtyferret said:


> I read the whole discussion, and all I see is an immature person tossing a hissy fit.


Reported for insulting, again.

BTW, it doesn't seem you actually read the discussion.
It started with:

_What's crazy is that the 1060's market share in the #1 spot is actually increasing.

post #82_

We just pointed out that that IS NOT market share, because Steam surveys are not about sales.
Then you kicked in with a pointless link to a market share report unrelated to the Steam hardware survey...


----------



## N3M3515 (Jan 26, 2021)

Max(IT) said:


> My fault.
> Arguing with someone who thinks Nvidia was selling many GTX 1060s in 2020 (a product discontinued years ago) was clearly a waste of my time.


I did not even remember what I was talking about...
*****yes... AMD will release other products with very poor RT performance and customers will still buy Geforce cards...*****
That is the phrase I replied to.
My point was that AMD's "poor" RT performance is irrelevant at this point, because in the only game where it is noticeable (CP2077), not even the 3090 (which 99% of people can't or won't buy) is able to max it out. So for the vast majority of people, RIGHT NOW, RT doesn't matter, and that "advantage" of Nvidia's is pointless. If I buy, for example, an RTX 2060, 2060 Super, 3060, or 3060 Ti (which are not cheap GPUs by any means), I know I won't be able to play CP2077 at a respectable FPS with RT on.
That's one thing; the other thing is, who tf buys a GPU to play one game? Like 99% of games right now don't use RT, or at least not in a meaningful way.

Yes, I would like to play with RT on, hell yeah, BUT only once it is available in a lot of games and the performance hit is less than half of what it is now. To say right now that RT is the shit... I'm sorry, but IMHO it is irrelevant at this stage.


----------



## 95Viper (Jan 26, 2021)

As I have stated before: debate the statement, not the member.
If needed, report the problem.
Do not report the problem and then post a retaliatory post to continue the drama you just reported.

Also, if you have nothing nice to say... well, read the forum guidelines.

Here is a quote from the guidelines:


> Posting in a thread
> Be polite and Constructive, if you have nothing nice to say then don't say anything at all.
> This includes trolling, continuous use of bad language (ie. cussing), flaming, baiting, retaliatory comments, system feature abuse, and insulting others.



Thank You, and please, Have a Great Day


----------



## Max(IT) (Jan 26, 2021)

N3M3515 said:


> I did not even remember what I was talking about...
> *****yes... AMD will release other products with very poor RT performance and customers will still buy Geforce cards...*****
> That is the phrase I replied to.
> My point was that AMD's "poor" RT performance is irrelevant at this point, because in the only game where it is noticeable (CP2077), not even the 3090 (which 99% of people can't or won't buy) is able to max it out. So for the vast majority of people, RIGHT NOW, RT doesn't matter, and that "advantage" of Nvidia's is pointless. If I buy, for example, an RTX 2060, 2060 Super, 3060, or 3060 Ti (which are not cheap GPUs by any means), I know I won't be able to play CP2077 at a respectable FPS with RT on.
> ...


I stopped speaking about RT several days ago... I don't agree with your opinion, but now I am just speaking about market share and the Steam hardware survey...

I think RT performance is A MUST today, but as I said above, right now I am just speaking about the difference between market share and the Steam survey.


----------



## Sunny and 75 (Jan 29, 2021)

Read this: AMD's response to DLSS








AMD Files Patent for Chiplet Machine Learning Accelerator to be Paired With GPU, Cache Chiplets - www.techpowerup.com




And with Jay *saying* that Ampere is using some kind of HT, this launch (both AMD's and Nvidia's) is getting less and less interesting to me.


----------



## Gstorm CZE (Feb 4, 2021)

The 3060 looks cool, like the other Nvidia cards, but I can't complain about AMD's products either. When the competition with AMD heats up and they try to attract customers, weird decisions come out, like that 12GB of VRAM, which clashes with previous generations' setups like the 6-8GB of the 20xx series, or even the newer 30xx cards from last year (2020).

I guess it makes some sense, though. While the extra memory sometimes goes unused, say with older games, new games with higher VRAM requirements and higher-res textures keep coming out, and it lets you play at higher resolutions (2K or even 4K) at lower FPS versus a card with more GPU cores but less VRAM, maybe with less stuttering too, since system RAM never gets touched. It certainly depends on the other system parts, mainly the CPU and RAM. As I don't play last-gen AAA titles unless they get to me by accident or as a gift, that VRAM would be unused most of the time for me, so it's overpriced. I usually go for budget GPUs, like the 1050 I've had for three years.

If GPUs get back in stock at prices close to MSRP, so the value is OK, I'll probably get something newer. I haven't decided yet whether it will be a 1650S/1660S, a 2060-series card if there's a big discount, or a 3050/3050 Ti; the 3050 Ti looks the most future-proof for casual gaming, would still be a big step up from my 1050, and will hopefully be reasonably priced. Its 4GB of VRAM looks like quite a big compromise to hit a low price, but maybe Nvidia knows best what memory size fits each GPU core for most uses. And since their GeForce Experience tool recommends settings for each GPU, it's easy to optimize your in-game settings for any card and VRAM amount you end up with, so in the end I don't worry much about that 12GB of VRAM in the 3060.


----------

