Tuesday, October 25th 2022

NVIDIA Partners Quietly Launch GeForce RTX 3060 with 8GB (128-bit) Memory

NVIDIA's add-in board partners today began quietly launching the GeForce RTX 3060 8 GB, a variant of the RTX 3060 with a third of its memory size and memory bus width sawed off. The RTX 3060, NVIDIA's best-selling desktop graphics SKU from the RTX 30-series "Ampere," originally launched with 12 GB of GDDR6 memory across a 192-bit wide memory bus, which at its reference speed of 15 Gbps (GDDR6-effective) works out to 360 GB/s of memory bandwidth. The new variant comes with 8 GB of GDDR6 memory across a narrower 128-bit memory interface at the same 15 Gbps data rate, which works out to 240 GB/s of memory bandwidth.
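The bandwidth figures above follow directly from bus width and per-pin data rate. As a quick illustration (not from the source article; the function name is purely illustrative), the arithmetic in Python:

# Peak theoretical GDDR6 bandwidth: bus width in bits / 8 bits-per-byte * per-pin data rate in Gbps
def gddr6_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(gddr6_bandwidth_gb_s(192, 15.0))  # RTX 3060 12 GB: 360.0 GB/s
print(gddr6_bandwidth_gb_s(128, 15.0))  # RTX 3060 8 GB: 240.0 GB/s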

Besides memory size, bus width, and bandwidth, NVIDIA hasn't tinkered with the core configuration of the RTX 3060 8 GB. It still comes with 3,584 CUDA cores across 28 SM, which work out to 112 Tensor cores, 28 RT cores, 112 TMUs, and 48 ROPs. The GPU's base frequency is set at 1320 MHz, and boost frequency at 1777 MHz, same as the original RTX 3060. Even the typical graphics power is unchanged, at 170 W. The new 8 GB variant doesn't replace the original, but is being positioned a notch below it, possibly to compete against the likes of the Radeon RX 6600 (non-XT), and perhaps even the Arc A750.
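For the same kind of sanity check, here is the per-SM breakdown implied by those totals (a small sketch using only the figures quoted above; the 48 ROPs are tied to the GPCs rather than the SMs on Ampere):

# Per-SM resources implied by the RTX 3060's quoted totals (28 SMs)
sms = 28
print(3584 // sms)  # 128 CUDA cores per SM
print(112 // sms)   # 4 Tensor cores per SM
print(28 // sms)    # 1 RT core per SM
print(112 // sms)   # 4 TMUs per SM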
Source: VideoCardz

35 Comments on NVIDIA Partners Quietly Launch GeForce RTX 3060 with 8GB (128-bit) Memory

#26
rjc34
3rold said:
Wasn't it Nvidia that said that same name and different memory configuration would be confusing to the customer???
I myself have Nvidia on both my systems as well, but I don't feel like supporting them in the future. Too much bs going on at the green front.
This is absurd news so soon after “unlaunching” the 4080 12GB for exactly the same problem of unreasonable consumer confusion. Just revive the “SE” suffix last used (for consumer cards) on the GTX 560 SE with its cut-down GF114 core. Or LE, or Lite, or whatever else. At the end of the day it’s all just a gigantic mess anyway: the 3080 10GB and 12GB are effectively identical in performance, the 3060 and 3060 Ti are entirely different GPUs, yet the 3070/80/90 Ti variants are just more fully enabled versions of the chips in their non-Ti counterparts.

Mid-gen card naming has pretty much always been a confusing mess as they try to slot new products in between the existing SKUs.


On the topic of BS though:

They’re all gigantic mega-corporations looking to get your money. Given that the consumer CPU and GPU markets have been dominated by duopolies for so long, the story is often similar - the company with the dominant product sets their own rules, and the underdog usually puts on the image that they’re on the consumer’s side.

One only has to look at AMD’s shift before and after Zen 3 to see how quickly that marketing schtick vanishes into thin air when the tables turn. They did a phenomenal job leveraging that marketing niche with 1st/2nd/3rd gen Ryzen to ramp sales and foster solid brand recognition and perception.

All that is to say: the grass is not greener. Enthusiasts like us invest the time and energy in picking the hardware best suited to our needs/wants. If/when AMD has a GPU generation that achieves against Nvidia what Zen 3/4 did against Intel - when the capabilities of the product as a whole come out on top instead of falling back on better “value” (rasterization perf/$) - you can be absolutely sure they will make that exact same switch.
Posted on Reply
#27
Keullo-e
S.T.A.R.S.
First they cancel the 4080 12GB to rename it, since even they admitted that its name was misleading. Now they release a 3060 with a cut-down memory bus.

Man, I want the same stuff their PR staff is smoking.
Posted on Reply
#28
RandallFlagg
Nvidia's midrange (3050/3060) and upper midrange (3060 Ti/3070) cards are way overpriced.

I doubt this card is going to help; even if it lowers the price of a 3060 by 10%, it'll still be overpriced.
Posted on Reply
#29
xBruce88x
defaultluser said:
it really depends on how cheap this variant is - we still haven't seen 3050 pricing below 290 yet!

probably better to wait for the 4050 with the same memory capacity, but higher performance!
I wouldn't put it past them to flat out skip the lower end this round, as they have plenty of 3xxx series cards to fill that performance level. Probably why they're STILL releasing a new version of the 3060.
Posted on Reply
#30
Max(IT)
Nvidia keeps playing dirty games with customers… same name for different products.
Posted on Reply
#31
Nopa
Strange milking strategy from NVIDIA. A 3060 & 3060 Ti with 12 GB of GDDR6X would at least tone the criticism down a bit.
Posted on Reply
#32
ARF
Max(IT) said:
Nvidia keeps playing dirty games with customers… same name for different products.
AMD is not much better. They rebranded an RX 570 as an RX 580 for the Chinese market. Very ugly.

Or the RX 5500 XT and RX 6500 XT have the same performance, which means the *500 tier doesn't stay consistent across generations.
RX 6500 XT is a rebranded RX 6300 or something.

It just shows that they are free to name them as they wish - even a card called "monkey-donkey" would be accepted and sold en masse :banghead:
Posted on Reply
#33
Unregistered
chstamos said:
This should be the 3050, at 3050's prices, not the pos that they're selling.
This shouldn't have been released in the first place. Nvidia should give more to its consumers by dropping the price of the 12 GB model, but this is a multi-billion-dollar company, not a humanitarian organization, and this is not a fair world we live in. :)
#34
ARF
spanjaman said:
This shouldn't have been released in the first place. Nvidia should give more to its consumers by dropping the price of the 12 GB model, but this is a multi-billion-dollar company, not a humanitarian organization, and this is not a fair world we live in. :)
True, that's why there is at least some competition to relieve the pain - you can consider cheaper and better options like the Radeon RX 6650 XT and Intel Arc A770 ;)
Posted on Reply
#35
Lycanwolfen
All these new cards, and quite frankly I'm not impressed at all. All this super sampling/upscaling BS. I will be impressed when I see a video card run 4K or 8K natively without some software trick. DLSS and AMD's FSR render everything at 1080p then scale it up to 4K or 8K. So Nvidia and AMD want me to spend 1000 or more on a video card that can only do 4K and 8K nicely by running 1080p. If I wanted to just run 1080p, I'd just run some 660 Tis - they can run 1080p just fine. My two 1070 Tis in SLI can run 4K pretty sweet, and it's pure hardware. Then people tell me it has ray tracing or RTX. Half-Life 2 had ray tracing and it didn't need a special card to run it - it was built into the game. I think gamers should demand better. Force AMD and Nvidia to make better cards with less power and better tech. 1070 Tis draw about 275 watts of power each, and the 2070 Super was a 50-watt reduction with higher GPU output - that to me was a step forward. The 30 series and 40 series don't impress me; it's like, here's 600 watts of gaming power, yet it only renders well at 1080p. I've seen videos of games running without DLSS at 4K and, man, really, 64 fps, some 83 fps. I mean the image looks great, but my 1070 Tis in SLI can run the same games at 100 fps to 130 fps.
Posted on Reply