Monday, August 26th 2024

NVIDIA's RTX 5060 "Blackwell" Laptop GPU Comes with 8 GB of GDDR7 Memory Running at 28 Gbps, 25 W Lower TGP

At a recent event hosted by Chinese laptop manufacturer Hasee, the company's chairman Wu Haijun unveiled exciting details about NVIDIA's upcoming GeForce RTX 5060 "Blackwell" laptop GPU. Attending the event was industry insider Golden Pig Upgrade, who managed to catch some details of the card, set to launch next year. The RTX 5060 is expected to be the first in the market to feature GDDR7 memory, a move that aligns with earlier leaks suggesting NVIDIA's entire Blackwell lineup would adopt the new standard. This upgrade is anticipated to deliver substantial boosts in bandwidth and possibly increased VRAM capacities in other SKUs. Perhaps most intriguing is the reported performance of the RTX 5060: Wu said this laptop SKU could offer performance comparable to the current RTX 4070 laptop GPU, exceeding it in ray tracing scenarios and matching or coming close to its rasterization performance.

This leap in capabilities is made even more impressive by the chip's reduced power consumption, with a maximum TGP of 115 W compared to the RTX 4060's 140 W. The reported power efficiency gains are not exclusive to RTX 5060. Wu suggests that the entire Blackwell lineup will see significant reductions in power draw, potentially lowering overall system power consumption by 40 to 50 watts in many Blackwell models. While specific technical details remain limited, it's believed the RTX 5060 will utilize the GB206 GPU die paired with 8 GB of GDDR7 memory, likely running at 28 Gbps in its initial iteration.
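The leak doesn't mention a bus width, but if GB206 keeps the 128-bit memory bus of the RTX 4060 Laptop GPU's AD107 (an assumption, not something confirmed here), the quoted 28 Gbps GDDR7 works out to a sizeable bandwidth jump over the outgoing part's 16 Gbps GDDR6. A quick sketch of that arithmetic:

```python
# Back-of-the-envelope peak memory bandwidth. The 128-bit bus for the
# RTX 5060 Laptop GPU is an assumption based on its predecessor, not
# confirmed by the leak.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

rtx_4060m = bandwidth_gbs(16, 128)  # GDDR6 at 16 Gbps on 128-bit
rtx_5060m = bandwidth_gbs(28, 128)  # GDDR7 at 28 Gbps on 128-bit (assumed bus)
print(f"RTX 4060 Laptop: {rtx_4060m:.0f} GB/s")
print(f"RTX 5060 Laptop: {rtx_5060m:.0f} GB/s ({rtx_5060m / rtx_4060m - 1:+.0%})")
```

On those assumed figures the move to 28 Gbps GDDR7 alone would lift peak bandwidth from 256 GB/s to 448 GB/s, a 75% increase, without any change to bus width.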
Source: via Wccftech

108 Comments on NVIDIA's RTX 5060 "Blackwell" Laptop GPU Comes with 8 GB of GDDR7 Memory Running at 28 Gbps, 25 W Lower TGP

#51
sLowEnd
8GB is deeply disappointing for an upcoming midrange GPU, mobile or not.
Posted on Reply
#52
Dr. Dro
TheinsanegamerNSorry? Does 1080p on a laptop use less VRAM than 1080p on desktops? Because Techspot has already demonstrated games suffering performance issues at 1080p with 8GB.
Since when exactly is 8 GB a problem with a 1080p display, I must have missed something
Posted on Reply
#53
wolf
Better Than Native
Dr. DroSince when exactly is 8 GB a problem with a 1080p display, I must have missed something
Because HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!
Posted on Reply
#54
oxrufiioxo
wolfBecause HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!
Also, there's more than just performance issues: in some games, LODs and texture quality are automatically downgraded on an 8GB GPU.

Even Black Myth: Wukong, which looks fantastic, has poor texture quality if you look too closely, even at max settings, likely a concession to stay within 8GB.

My issue with 8GB cards at the entry level has nothing to do with performance or what settings someone has to use; it's the stagnation for almost a decade in the $300-ish price range and below, and with Nvidia, the $400 price range and below...
Posted on Reply
#55
Dr. Dro
wolfBecause HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!
I mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine
oxrufiioxoAlso, there's more than just performance issues: in some games, LODs and texture quality are automatically downgraded on an 8GB GPU.

Even Black Myth: Wukong, which looks fantastic, has poor texture quality if you look too closely, even at max settings, likely a concession to stay within 8GB.

My issue with 8GB cards at the entry level has nothing to do with performance or what settings someone has to use; it's the stagnation for almost a decade in the $300-ish price range and below, and with Nvidia, the $400 price range and below...
Wukong on medium settings at 1080p with full resolution scale (so a true 1080p render) will fit in 6 GB; you saw my post in the benchmark thread. It's reasonable to ask medium settings of a card such as the 1070 Ti, which this 5060 will no doubt wipe the floor with.
Posted on Reply
#56
oxrufiioxo
Dr. DroI mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine



Wukong on medium settings at 1080p with full resolution scale (so a true 1080p render) will fit in 6 GB; you saw my post in the benchmark thread. It's reasonable to ask medium settings of a card such as the 1070 Ti, which this 5060 will no doubt wipe the floor with.
What I was saying is that even at the Cinematic preset, texture quality isn't very good on close inspection, so I'm not surprised it has modest VRAM requirements; games lately have had meh texture quality in general.

I feel this is a side effect of how much Nvidia has let VRAM stagnate at the low to mid-range...

Thankfully mods like the Witcher 3 HD rework and CP2077 HD rework are saving the day.
Posted on Reply
#57
Macro Device
The main issue is there's no oomph. The 3060 didn't make the 2060 look like garbage. Neither did the 4060. Similar price, similar performance. I don't like it and I don't buy it; we should all thank enthusiasts for their upgrade itch, which makes it viable to sell stuff for a lot of money. 4070M-level performance doesn't sound impressive unless we're talking extremely cheap laptops ($750 or lower).

8 GB is a problem but lack of bang per buck improvement is a much bigger problem.
Posted on Reply
#58
Onasi
oxrufiioxoWhat I was saying is that even at the Cinematic preset, texture quality isn't very good on close inspection, so I'm not surprised it has modest VRAM requirements; games lately have had meh texture quality in general.
Honestly, it's a bit hard to say nowadays whether it's the texture quality being low or the over-abundance of post-processing and temporal AA making everything LOOK like it is. For what it's worth, my friend works in 3D modeling and regularly looks at models from current games, and his opinion is that the textures are mostly fine and decently high-res (his rants about shit alpha channels aside, since everyone just relies on TAA to hide dithering); it's just that developers rely on engines to do a LOT automatically nowadays, and it often looks wrong.
Posted on Reply
#59
oxrufiioxo
Macro DeviceThe main issue is there's no oomph. The 3060 didn't make the 2060 look like garbage. Neither did the 4060. Similar price, similar performance. I don't like it and I don't buy it; we should all thank enthusiasts for their upgrade itch, which makes it viable to sell stuff for a lot of money. 4070M-level performance doesn't sound impressive unless we're talking extremely cheap laptops ($750 or lower).

8 GB is a problem but lack of bang per buck improvement is a much bigger problem.
For sure the lack of progress at the 60/60ti range has been troubling.
OnasiHonestly, it's a bit hard to say nowadays whether it's the texture quality being low or the over-abundance of post-processing and temporal AA making everything LOOK like it is. For what it's worth, my friend works in 3D modeling and regularly looks at models from current games, and his opinion is that the textures are mostly fine and decently high-res (his rants about shit alpha channels aside, since everyone just relies on TAA to hide dithering); it's just that developers rely on engines to do a LOT automatically nowadays, and it often looks wrong.
True, but mods can and do improve it substantially. My guess is that game sizes already ballooning out of control is also a factor, but an optional update with higher-quality, higher-resolution textures would be nice.
Posted on Reply
#60
wolf
Better Than Native
oxrufiioxoEven Black Myth: Wukong, which looks fantastic, has poor texture quality if you look too closely, even at max settings, likely a concession to stay within 8GB.
Is that your opinion of what's happening, or confirmed to be what's happening? From all the content I've seen, it has nothing to do with VRAM pool size, as some textures are repeatably poor quality on cards with 12+ GB.

Overall I agree on the pricing though; it'd be nice to see more VRAM across the entire stack, but on this SKU I don't see it as a necessity. I also don't know what to even try to believe at this point; we usually get high-end desktop parts well before mobile, and it seems odd to get a leak about a mobile 5060 at this point in the pre-release cycle.
Dr. DroI mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine
Oh I agree; it might not be the GPU that appeals to me the most, but I'd wager it's fine. It was more a dig at HUB/Techspot and the testing they do to highlight problems, with methodology specifically constructed to prove their supposition correct (on at least one occasion).
Posted on Reply
#61
R0H1T
Dr. DroI mean I get 8 GB isn't enough to go full ultra even at 1440p, but for 1080, 8 GB is just fine
I mean, why does JHH need to be so greedy? Isn't that $3 trillion market cap enough :shadedshu:

Rhetorical question btw :ohwell:
wolfOh I agree, it might not be the GPU that appeals to me the most but I'd wager it's fine.
So if you want to run custom ML models you now need to pay more for VRAM as well :wtf:
Posted on Reply
#62
TheinsanegamerN
wolfBecause HUB/Techspot clearly demonstrated that if you specifically craft a test to overload 8GB of VRAM, you can!
As in: "play games designed for consoles with 16GB of RAM". Such stringent testing!

IDK why you guys are so insistent on holding onto that 8GB lifeline. Consoles have 16GB of RAM. 8GB is no longer a baseline. It's OK to move on. Buying a GPU with 8GB of RAM at this point is like buying a 2GB GPU in 2015: a pointless move that will likely backfire.

But I forgot, Resident Evil Village is a specifically crafted benchmark; nobody actually PLAYS it.
Posted on Reply
#63
Onasi
@oxrufiioxo
The ballooning sizes are also, IMO, pure laziness that stems from the trend started on previous console gens. Devs stopped compressing assets to save on CPU resources due to consoles having weak CPUs and they just kinda… kept going like this. Next COD is what, like 400 gigs for a full install? Shit’s ridiculous.
Posted on Reply
#64
R0H1T
It's probably more to do with DirectStorage, because Sony/MS(?) have something which desktops still don't.
Posted on Reply
#65
TheinsanegamerN
Dr. DroSince when exactly is 8 GB a problem with a 1080p display, I must have missed something
There are games, such as RE: Village and Forspoken, that even at 1080p experience issues with 8GB cards. Village had crashing issues and severe stuttering; Forspoken textures would straight up not load properly. Frame graphs looked fine, but anyone LOOKING at the gameplay could tell something was wrong.

I just really don't get it. These GPUs cost nearly as much as a console by themselves, cannot play at settings consoles can, and yet you still need the rest of the PC. Sure, your use case may not demand it now, but when 512MB and 2GB cards went out to pasture there wasn't this constant demand that games be made to work on them; people understood that they were outdated and it was time to move on. But with 8GB, people get really hung up on it, like it's some kind of insult to say 8GB just isn't enough anymore. When Wolfenstein: The New Order came out and needed 3+ GB for the highest settings, people didn't cry about how it should fit fine in 2GB; they accepted that 2GB cards were coming to their end.

Any card over $200 should have more than 8GB of VRAM today. To do otherwise is an insult to the consumer. If you are buying these $400+ cards and running games at low settings to avoid running out of VRAM, something is seriously wrong.
Posted on Reply
#66
oxrufiioxo
R0H1TIt's probably more do with Directstorage, because Sony/MS(?) have something which desktops still don't.
So far, DirectStorage has been a bust on PC, with lowered performance for negligible differences in loading times, etc., and with Nixxes basically saying there are better options on PC.
Posted on Reply
#67
R0H1T
I meant Sony already has dedicated hardware for decompression on the PS5; MS probably has something similar, although in software. So consoles are much better than PCs in that regard, as of now.
Posted on Reply
#68
wolf
Better Than Native
TheinsanegamerNAs in: "play games designed for consoles with 16GB of RAM". Such stringent testing!
It's my understanding consoles can't access all 16GB as video memory; it's more like 10-12, and they're often targeting 4K output. And yeah, there are at most a handful of games that can't manage the maximum setting, and seemingly people are allergic to turning the texture setting down one, maybe two notches.

I absolutely agree there are games where you can't max all the sliders at 1080p on an 8GB card, but the equation isn't as simple as that fact being true = 8GB is a wholly insufficient amount for certain SKUs.
TheinsanegamerNIDK why you guys are so insistent on holding onto that 8GB lifeline.
I'm not; I wouldn't buy this myself, but I think the issue is also somewhat overblown for an unreleased, rumored low-end mobile video card. Personally, having targeted 4K120 since my last GPU upgrade, I'll be after a 16+ GB card when I next upgrade.
Posted on Reply
#69
64K
R0H1TI mean, why does JHH need to be so greedy? Isn't that $3 trillion market cap enough :shadedshu:

Rhetorical question btw :ohwell:


So if you want to run custom ML models you now need to pay more for VRAM as well :wtf:
Market cap isn't wealth. It's just what investors think the price of the stock should be at any given time.

The thing is, adding 12 GB of VRAM to an entry-level laptop GPU isn't going to put more money into Huang's pocket. It will probably take money out of it. The added cost would just be passed down to consumers who are on a tight budget and already looking hard at the cost of the laptop. Some may choose not to buy it if it adds more than they are comfortable spending.

Being on a tight budget and making compromises is a difficult subject to broach on a tech enthusiast site. It's not how most of us think, but I can assure you that there are a lot of gamers out there who care very much about the budget, especially at the level of this laptop.
Posted on Reply
#70
R0H1T
The point is, with all the AI money JHH is printing, adding 4GB more to the GPU would cost peanuts, literally and figuratively, for Nvidia. Yet here we are; Nvidia is more like today's Apple in this instance!
Posted on Reply
#71
Dr. Dro
oxrufiioxoSo far, DirectStorage has been a bust on PC, with lowered performance for negligible differences in loading times, etc., and with Nixxes basically saying there are better options on PC.
The PC potato race strikes again. Most gamers have really dated, poorly maintained hardware. The condition of the average gaming PC is really bad.
TheinsanegamerNThere are games, such as RE:Village and forespoken that, even at 1080p, experience issues with 8GB cards. Village had crashing issues and severe stuttering, Forespoken textures would straight up not load properly. Frame graphs looked fine, but anyone LOOKING at the gameplay could tell something was wrong.
Visual degradation in such engines is an anomaly, not the norm. But these symptoms emerge primarily when settings are pushed beyond the hardware's reasonable capabilities, IMO.
Posted on Reply
#72
Onasi
R0H1TThe point is, with all the AI money JHH is printing, adding 4GB more to the GPU would cost peanuts, literally and figuratively, for Nvidia. Yet here we are; Nvidia is more like today's Apple in this instance!
It would not be peanuts, actually. You'd need to double the capacity to keep the same memory bus, or move to a wider bus to get 12 GB. I suppose funky asymmetrical configs are possible, but there is a reason they aren't done often. So for a 128-bit bus it has to be either 8 or 16, and 16 gigs of brand-new GDDR7 would probably be considered a pointless waste on an entry SKU.
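The "either 8 or 16" arithmetic above can be sketched quickly: each GDDR package occupies a 32-bit channel, so a symmetric config is (bus width / 32) packages, and clamshell mode (two packages per channel) doubles that. This assumes 2 GB (16 Gb) packages, the launch density for GDDR7:

```python
# Symmetric VRAM capacities for a given bus width, assuming 2 GB (16 Gb)
# GDDR7 packages. Each package sits on its own 32-bit channel; clamshell
# mode puts two packages on each channel, doubling capacity.
def capacities_gb(bus_width_bits: int, package_gb: int = 2) -> tuple:
    """Return (normal, clamshell) capacities in GB for a symmetric config."""
    packages = bus_width_bits // 32
    return packages * package_gb, packages * package_gb * 2

print(capacities_gb(128))  # 128-bit bus: the "either 8 or 16" case
print(capacities_gb(192))  # 192-bit bus, for comparison
```

Anything in between (say, 12 GB on 128-bit with 2 GB packages) requires mixing densities across channels, which is the non-uniform-performance scenario discussed below.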
Posted on Reply
#73
R0H1T
OnasiI suppose funky asymmetrical configs are possible, but there is a reason they aren't done often.
The 48GB DDR5 kits on desktops are a recent and relevant example, so, no, GDDR7 can also do it. But yes, I did have the 128-bit bus in mind, and how that would also affect packaging. 16GB on a mobile xx60 chip isn't happening, so we can all forget about that pipedream.
Posted on Reply
#74
Onasi
@R0H1T
VRAM has more issues than RAM with asymmetric configs. It's stringent in that each GDDR package has a 32-bit connection. Anything like, say, getting 12 gigs on a 128-bit bus, while theoretically possible, will cause non-uniform performance for part of that memory. CPU IMCs are more flexible and can run essentially whatever (to a certain extent), especially after a firmware update.
Posted on Reply