Wednesday, March 12th 2025

NVIDIA Reportedly Prepares GeForce RTX 5060 and RTX 5060 Ti Unveil Tomorrow

NVIDIA is set to unveil its RTX 5060 series graphics cards tomorrow, according to information from VideoCardz, which claims NVIDIA shared launch details with some media outlets today. The announcement will cover two desktop models, the RTX 5060 and RTX 5060 Ti, confirming leaks from industry sources last week. The upcoming lineup will feature three variants: RTX 5060 Ti 16 GB, RTX 5060 Ti 8 GB, and RTX 5060. All three cards will utilize identical board designs and the same GPU, allowing manufacturers to produce visually similar Ti and non-Ti models. Power requirements are expected to range from 150 to 180 W. NVIDIA's RTX 5060 Ti will ship with 4608 CUDA cores, a modest 6% increase over the previous-generation RTX 4060 Ti. The most significant improvement comes from the move to GDDR7 memory, which could deliver over 50% higher bandwidth than its predecessor if NVIDIA maintains the expected 28 Gbps memory speed across all variants.

The standard RTX 5060 will feature 3840 CUDA cores paired with 8 GB of GDDR7 memory. This configuration delivers 25% more GPU cores than its predecessor and marks a step up in GPU tier from AD107 (XX7) to GB206 (XX6). The smaller GB207 GPU is reportedly reserved for the upcoming RTX 5050. VideoCardz's sources indicate the RTX 5060 series will hit the market in April. Tomorrow's announcement is strategically timed ahead of the Game Developers Conference (GDC), which begins next week. All models in the series will retain the 128-bit memory bus of their predecessors while delivering significantly improved memory bandwidth: 448 GB/s, compared to the previous generation's 288 GB/s for the Ti model and 272 GB/s for the standard variant. The improved bandwidth stems from the introduction of GDDR7 memory.
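For reference, the bandwidth figures above follow directly from the per-pin data rate and the bus width. Here is a minimal sketch in Python, assuming the rumored 28 Gbps GDDR7 speed and the 18 Gbps / 17 Gbps GDDR6 speeds of the outgoing RTX 4060 Ti and RTX 4060:

  def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
      # Peak bandwidth (GB/s) = per-pin rate (Gbps) x bus width (bits) / 8 bits per byte
      return data_rate_gbps * bus_width_bits / 8

  print(peak_bandwidth_gbs(28, 128))  # 448.0 GB/s - rumored RTX 5060 series (GDDR7)
  print(peak_bandwidth_gbs(18, 128))  # 288.0 GB/s - RTX 4060 Ti (GDDR6)
  print(peak_bandwidth_gbs(17, 128))  # 272.0 GB/s - RTX 4060 (GDDR6)
  print(448 / 288 - 1)                # ~0.556, i.e. the "over 50%" uplift for the Ti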
Source: VideoCardz

29 Comments on NVIDIA Reportedly Prepares GeForce RTX 5060 and RTX 5060 Ti Unveil Tomorrow

#2
_roman_
How long will it take until we see the 8GiB VRAM argument?

edit: German coverage: www.pcgameshardware.de/Grafikkarten-Grafikkarte-97980/News/Acer-Gaming-PC-mit-8-GiByte-Grafikspeicher-1468001/

It seems Acer leaked the specs. They highly doubt the 16 GiByte model will be readily available. They also state that 8 GiByte is not enough. I wonder why they write GiByte instead of the GiB unit or gibibyte

Gemini says:
  • GiB stands for gibibyte.
  • It's a unit of measurement based on powers of 2.
  • "GiByte" with a capital "B" is not a recognized or standardized term.
  • The correct term is "gibibyte" (GiB).
#3
Prima.Vera
_roman_How long will it take until we see the 8GiB VRAM argument?
As soon as they test those cards with the most demanding games ;)
#4
Onasi
Prima.VeraAs soon as they test those cards with the most demanding games ;)
…which is a scenario these cards aren’t even meant for. While I would like to see more VRAM, these models will do perfectly fine for what they are meant to be, which isn’t 4K with all sliders mindlessly pushed to the right.
#5
Tsukiyomi91
Performance for the 5060 8GB GDDR6 model will be either at parity with or slightly faster than the 4060 8GB, while the 5060 8GB GDDR7 variant will be either slightly behind the 4060 Ti 8GB or at parity with it. The 5060 Ti 8GB will at best close the gap to a 5070, but the 1% lows will suffer as usual in games that require more than 8GB of VRAM. The 5060 Ti 16GB variant will be the only xx60-tier card that will perform at parity with the 5070 in certain games.
#6
Quicks
Onasi…which is a scenario these cards aren’t even meant for. While I would like to see more VRAM, these models will do perfectly fine for what they are meant to be, which isn’t 4K with all sliders mindlessly pushed to the right.
Well, to be honest, there are games at 1080p that will eat up 8GB of VRAM on Medium/High, so defending the 8GB scenario in 2025 is becoming a bit stupid. You need a minimum of 12GB even for 1080p.
#7
Denver
Tsukiyomi91Performance for the 5060 8GB GDDR6 model will be either at parity with or slightly faster than the 4060 8GB, while the 5060 8GB GDDR7 variant will be either slightly behind the 4060 Ti 8GB or at parity with it. The 5060 Ti 8GB will at best close the gap to a 5070, but the 1% lows will suffer as usual in games that require more than 8GB of VRAM. The 5060 Ti 16GB variant will be the only xx60-tier card that will perform at parity with the 5070 in certain games.
No, we're looking at a +25% increase in shader count and probably much higher clocks, so it's probably a 30% improvement over its predecessor. It would be decent if the 8GB didn't hold back the GPU's potential.

Picture this: with the next gen of consoles boasting 32GB of VRAM, games will set a new standard, rendering anything considered mid or low-end today obsolete.

Keep in mind that today, achieving what consoles can accomplish with 16GB or less of unified memory typically requires a combination of 16GB of RAM + 12GB of VRAM on a PC.
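A quick sanity check of the arithmetic above: the expected uplift compounds core count with clock speed. Here is a minimal sketch in Python, where the +25% shader figure comes from the leaked specs (3840 cores vs. the RTX 4060's 3072) and the ~4% clock gain is purely a hypothetical placeholder, since clocks for these SKUs are unconfirmed:

  cores_5060, cores_4060 = 3840, 3072
  core_uplift = cores_5060 / cores_4060            # 1.25, i.e. +25% shaders
  assumed_clock_uplift = 1.04                      # hypothetical; actual clocks unknown
  total = core_uplift * assumed_clock_uplift - 1   # compounded uplift
  print(f"{total:.0%}")                            # ~30%, matching the ballpark estimate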
#8
Philaphlous
It'll be interesting to see the modders modding 8GB to 16GB on these cards. Wish it could be done in laptops...
#9
Wirko
It's not just about gaming; people are also becoming more inclined to experiment with AI stuff on their PCs. 8 GB is a little tight for that.
#11
Epaminombas
RTX 5070 is already a low-end GPU.
5080 is mid-range.
5090 is high-end.

There shouldn't even be a 5060 and 5050, maybe only for notebooks...
#12
lexluthermiester
EpaminombasRTX 5070 is already a low-end GPU.
5080 is mid-range.
5090 is high-end.

There shouldn't even be a 5060 and 5050, maybe only for notebooks...
While the performance of RTX 5000 has been disappointing, your suggestion is (and I'm being nice here) daft as a brush. :slap::rolleyes:

The 5060/5050 offerings are still needed to fill in market segments the rest do not fit into.
#13
Niceumemu
I thought the paper industry was on a downturn, but it seems Nvidia is making it work.
#14
Epaminombas
lexluthermiesterWhile the performance of RTX 5000 has been disappointing, your suggestion is (and I'm being nice here) daft as a brush. :slap::rolleyes:

The 5060/5050 offerings are still needed to fill in market segments the rest do not fit into.
As long as there are RTX 5060/5050 or RX 7600/9060 cards, games will continue to be scaled down.

Developers design games around the lowest GPU on the market, so everything gets leveled down to something below low-end.

12GB VRAM GPUs shouldn't even exist anymore. Just like 6-core CPUs.
#15
Dirt Chip
The art of side-grading: NV's mid-range, gen by gen, is what it is.
#16
b1k3rdude
So an upsold 5050 rebranded as a 5060, like the 4060 was, but worse, as I imagine the uplift will be peanuts.

meh..
#17
Darc Requiem
Onasi…which is a scenario these cards aren’t even meant for. While I would like to see more VRAM, these models will do perfectly fine for what they are meant to be, which isn’t 4K with all sliders mindlessly pushed to the right.
That's the thing, they aren't. These are 1080p cards, and 1080p cards shouldn't require you to turn down settings at 1080p because of a lack of VRAM. Games at 1080p are pushing up against and spilling over the 8GB VRAM buffers. Nvidia pushes ray tracing, which increases VRAM usage further, and yet skimps on VRAM. It reminds me of when Sony launched the PS3. They included Talladega Nights to show off the difference between Blu-ray and DVD, but they only included composite cables with the console, so the Blu-ray's image quality would look just like a DVD's. Even in that silly situation, the consumer could at least buy better cables.
#18
Nater
Does anybody care at this point?
#19
lexluthermiester
EpaminombasAs long as there are RTX 5060/5050 or RX 7600/9060 cards, games will continue to be scaled down.

Developers design games around the lowest GPU on the market, so everything gets leveled down to something below low-end.
As market surveys show, most people buy the "60"-class GeForce cards. This has been true for nearly 20 years.
Epaminombas12GB VRAM GPUs shouldn't even exist anymore. Just like 6-core CPUs.
Oh please. Seriously with your elitism shtick?
#20
Onasi
Prima.VeraI'm talking about 1080p gaming.
www.techpowerup.com/review/gigabyte-geforce-rtx-4060-gaming-oc/34.html
Darc RequiemThat's the thing, they aren't. These are 1080p cards, and 1080p cards shouldn't require you to turn down settings at 1080p because of a lack of VRAM. Games at 1080p are pushing up against and spilling over the 8GB VRAM buffers. Nvidia pushes ray tracing, which increases VRAM usage further, and yet skimps on VRAM. It reminds me of when Sony launched the PS3. They included Talladega Nights to show off the difference between Blu-ray and DVD, but they only included composite cables with the console, so the Blu-ray's image quality would look just like a DVD's. Even in that silly situation, the consumer could at least buy better cables.
No, what you people are talking about is RT, which, even at 1080p, is a punishing feature to enable for ANY GPU. Doesn't matter what NV says or thinks, absolutely anyone remotely sane understands that RT is still not ready to be a permanent tentpole feature, at least not in the way it's currently implemented. You can add however much more VRAM to the 4060 - it will not make it truly RT capable. That's just how things are. So RT is a no-go, what else is there? Insane optional texture packs that make zero difference at 1080p anyway? That's not to mention the fact that I already mentioned in another thread - most people playing on PC and buying relatively budget cards don't even play graphically advanced titles. Can a 5060 pull high FPS in [insert e-sports title of choice] and also perform reasonably well on medium-high settings in AAA? Yes? Congratulations, the card has a reason to exist and will absolutely become a commercial success. I will never tire of repeating this, but what PC enthusiasts THINK should be and what the actual reality of the market IS pretty much never align.

Edit: @lexluthermiester kinda said what I did in a more concise and based way.
#21
Epaminombas
The CPU and GPU market should be based on the Xbox Series X and PS5 consoles.

They are 8 cores + 14GB VRAM and 2GB for the operating system.

12GB VRAM should be an RTX 5050
16GB the RTX 5060
18GB RTX 5070
24GB RTX 5080
32GB RTX 5090

An RX 9060 has to come with 12GB to be good. I hope the RX 9050 doesn't exist.

Many games today require 12GB of VRAM to be played at 1080p at 60fps...
#22
lexluthermiester
EpaminombasThe CPU and GPU market should be based on the Xbox Series X and PS5 consoles.

They are 8 cores + 14GB VRAM and 2GB for the operating system.

12GB VRAM should be an RTX 5050
16GB the RTX 5060
18GB RTX 5070
24GB RTX 5080
32GB RTX 5090

An RX 9060 has to come with 12GB to be good. I hope the RX 9050 doesn't exist.

Many games today require 12GB of VRAM to be played at 1080p at 60fps...
Oh, so more of your unrealistic elitism? Let's fix that:

32GB RTX 5090
24GB RTX 5080 (I'll go along with this)
16GB RTX 5070 (because how would you do 18GB on a 256-bit bus?)
12GB RTX 5060 Ti
8GB RTX 5060
6GB RTX 5050

There we go. Why? Because THIS wish list is both realistic and technologically doable. But that's not what they're doing, so wishes might as well be fishes...
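For context on the 256-bit parenthetical above: each GDDR chip occupies a 32-bit slice of the bus, so total capacity is the chip count times the per-chip density. A minimal sketch in Python, assuming standard 2GB (16Gbit) modules in a non-clamshell layout (the 3GB-module case is included purely for illustration):

  def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int = 2) -> int:
      # One chip per 32-bit channel; capacity = chip count x per-chip density
      chips = bus_width_bits // 32
      return chips * chip_density_gb

  print(vram_capacity_gb(256))     # 16 GB: eight 2GB chips; 18GB has no clean layout
  print(vram_capacity_gb(256, 3))  # 24 GB: possible with 3GB modules, where available
  print(vram_capacity_gb(128))     # 8 GB: the rumored RTX 5060 configuration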
#23
Bobaganoosh
NaterDoes anybody care at this point?
Sort of. The issue is that tons of people buy x050/x060-series GPUs because of the price, and Nvidia always claims how "good" these entry-level cards are and how gamey they game at games. They seemingly have barely improved since the 2000 series, and even with those, you needed to get the 3060 Ti to get a real improvement over the 2060, and the 4060 didn't even catch up with the 3060 Ti. So we're YEARS down the line and the low end is still offering a tiny improvement over previous gens, but the previous gens disappear off the market, so there's always demand for the new generation just because that's all there is. Intel came in with a good-performing entry-level card, but they seem to have no ability to actually produce them. If you look at the inventory of B580s, they've had a few short-lived restocks in the last month. Looking at Newegg today, there's an RTX 4060 for $365, an RTX 3050 for $342, and an RX 6600 XT for $375 (just for some examples in this segment). They all get outperformed by the ~$265 B580 (well, they would if you could buy one).

So, yes, as boring as these probably are, they are the cheaper cards and people will buy them if they're available, because that's what happens at the entry level. The question is: did Nvidia actually make them perform well for entry-level cards, and will they be sold at decent prices? The answer is probably "no and no" lol.
#24
Epaminombas
You have to face the fact that $500 is a low-cost GPU.

This is no longer the year 2004. Hardware prices have increased because developing new technologies is more expensive and manufacturing chips is more expensive, not only because of inflation, but also because today there are 8.2 billion people in the world.

In 2004, there were 6 billion.

These 2 billion additional people are consuming and making the price of everything skyrocket.

There are more people with purchasing power, so it is impossible for Nvidia to sell a high-cost GPU for $500 like it did in 2004 with the GeForce 6800.

CPU and GPU chips also have to compete for manufacturing capacity with other electronics, such as chips for cell phones, tablets, TVs, and cars.

The price of everything has skyrocketed and there is no going back.
#25
Rexter
It saddens me that they are releasing that slop for such a price. Single-digit generational uplift. Absolutely not worth the money.
It's even more ironic that Nvidia is sitting on a "real" 5060:
(quick note: I am not affiliated with that particular YouTube channel or anything of the sort).
The video is about a franken-GPU from AliExpress that has a 4090M GPU with 16GB of VRAM. In the benchmarks in the video, it consistently performs identical to or faster than the 5070. Oh, and it uses around 100W of power. Why in buttcrack's name isn't this the one that Nvidia is releasing? 5070 performance!? 16GB VRAM!? 100W at load!? IT'S THERE! IT'S RIGHT THERE NVIDIA! JUST SELL IT OFFICIALLY!