Friday, March 23rd 2018

NVIDIA Sneaks Less Powerful GeForce MX150 Variant Into Ultrabooks

NVIDIA quietly launched the GeForce MX150 mobile GPU in May of last year. The team at Notebookcheck has since discovered that there are actually two variants of the GeForce MX150 in the wild: the standard 1D10 variant and the much slower 1D12 variant. Normally, this wouldn't raise any alarms. However, neither NVIDIA nor the notebook manufacturers distinguish the two variants from each other. Buyers who purchase an ultrabook or notebook with a GeForce MX150 are essentially playing the lottery: they have no idea which variant is inside the product until they run a utility like GPU-Z to find out. But just how significant is the performance difference between the two variants? Let's look at Notebookcheck's findings.
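The two variants share a retail name but carry different PCI device IDs (1D10 vs. 1D12, under NVIDIA's 10DE vendor ID), which is what tools like GPU-Z surface. As a rough sketch of how a buyer could map a reported device ID to the variant, using the IDs and figures quoted by Notebookcheck (the `identify_mx150` helper is our own, hypothetical name):

```python
# Map the MX150's PCI device ID to the variant described above.
# IDs 0x1D10 / 0x1D12 and the clock/TDP figures come from Notebookcheck's findings.
VARIANTS = {
    0x1D10: {"name": "standard 1D10", "core_mhz": 1469, "boost_mhz": 1532, "tdp_w": 25},
    0x1D12: {"name": "low-power 1D12", "core_mhz": 937, "boost_mhz": 1038, "tdp_w": 10},
}

def identify_mx150(device_id: int) -> str:
    """Return a human-readable description for a GeForce MX150 PCI device ID."""
    v = VARIANTS.get(device_id)
    if v is None:
        return f"unknown device ID 0x{device_id:04X}"
    return f"{v['name']} ({v['core_mhz']} MHz core, {v['tdp_w']} W TDP)"

# Example: GPU-Z reports the device ID as "10DE 1D12" on the slower variant.
print(identify_mx150(0x1D12))  # low-power 1D12 (937 MHz core, 10 W TDP)
```

On Linux, the raw ID can be read with `lspci -nn`, which prints it in the `[10de:1d10]` / `[10de:1d12]` form.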

Starting with the GeForce MX150's specifications, the standard 1D10 variant has a 1469 MHz core clock, 1532 MHz boost clock, and 1502 MHz memory clock. Notebookcheck first saw this variant in the MSI PL62 and ASUS ZenBook UX430UN. They later discovered the underclocked 1D12 variant in the Lenovo IdeaPad 320S, ZenBook 13 UX331UN, Xiaomi Mi Notebook Air 13.3, HP Envy 13, and ZenBook UX331UA notebooks. The 1D12 variant has a 937 MHz core clock, 1038 MHz boost clock, and 1253 MHz memory clock. Right off the bat, that's a 36 percent reduction in the core clock alone. According to the 3DMark and 3DMark 11 tests, consumers can expect anywhere from a 20 to 25 percent performance hit with the less powerful variant. The charts don't lie: of the 13 notebooks tested by Notebookcheck, the five models equipped with the 1D12 variant of the GeForce MX150 sit at the bottom of the list. NVIDIA's move to sneak the 1D12 variant into thin-and-light notebooks was probably meant to meet a 10 W TDP envelope, as opposed to the original variant's 25 W. Luckily, the 1D12 variant has only appeared in 13-inch notebooks so far.
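As a quick sanity check on the article's arithmetic (the clock figures are the ones quoted above; note the observed 20-25 percent benchmark hit is smaller than the raw core-clock gap, since boost behavior and thermals dominate in practice):

```python
# Clock figures quoted from Notebookcheck, in MHz.
std  = {"core": 1469, "boost": 1532, "memory": 1502}   # 1D10 variant
slow = {"core":  937, "boost": 1038, "memory": 1253}   # 1D12 variant

# Percentage reduction of each clock relative to the standard variant.
drops = {k: (std[k] - slow[k]) / std[k] * 100 for k in std}
for k, d in drops.items():
    print(f"{k}: -{d:.1f}%")
# core: -36.2%, boost: -32.2%, memory: -16.6%
```

The core clock drop is where the headline 36 percent figure comes from; the boost and memory clocks are cut less severely.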
Source: Notebookcheck

95 Comments on NVIDIA Sneaks Less Powerful GeForce MX150 Variant Into Ultrabooks

#26
HopelesslyFaithful
I ordered this and I am quite happy with it.
www.newegg.com/Product/Product.aspx?Item=N82E16834316267

It has its issues, but it's the only affordable and powerful 2-in-1 I can find, and traveling internationally makes having a 2-in-1 a godsend. I have my issues, like Acer's locked BIOS. It took forever to get the CPU to run at a 25W TDP through software, and some other things like the IPS screen has bad backlight bleed, but nothing like some Lenovos' BLB, so that's good.

The build quality is great too, but it has soldered RAM, so only 8GB... a 16GB version isn't in the wild.

I have many small beefs, like I always do with laptops, but this is shockingly powerful and nice for 900-1000 bucks given the specs and quality.

I really don't know why they can't throw better specs like this into ultrabooks. The MX150 is trash compared to my 2-in-1.

-0.1V and a 25W TDP on that Intel CPU, with a 35W TDP short boost, is amazing. At 25W TDP it runs at around 85C while playing War Thunder/NS2 in South Korea, and the GPU runs just fine. I haven't overclocked the GPU because I don't know the specs, but it was plenty fast enough to enjoy 1080p at 60 Hz!!!
Posted on Reply
#27
newtekie1
Semi-Retired Folder
moproblems99Typically, you won't see me say anything nice about nVidia, but I don't think they are in the wrong here. If the only thing different is the clocks then nothing was 'snuck' although perhaps it should have a low power moniker or something. Now, if they started jacking with core counts, shaders, or memory specs - we have a problem.
The only thing I think they did wrong was not actually list official specs for the MX150. I see no reason not to list the minimum specs for the MX150.
evernessinceVery few laptops have that capability. You'd also have to assume that the motherboard/PSU can deliver that extra power. Laptops especially are rated at a specific spec for a reason. Unless it's a gaming-oriented system, I wouldn't risk it.
It's not really the extra power that I'd be worried about, but the extra heat.
Posted on Reply
#28
Recon-UK
And this only aids nVIDIA I guess?
Bad yields? Making cash by naming a lower-spec GPU the same as one already out with more grunt?
Who knows... Does it affect me? No
Posted on Reply
#29
HopelesslyFaithful
Recon-UKAnd this only aids nVIDIA I guess?
Bad yields? Making cash by naming a lower-spec GPU the same as one already out with more grunt?
Who knows... Does it affect me? No
I see this as nvidia helping OEMs boost marketing and profits. I don't see how this benefits nvidia... This would hurt nvidia, not the OEMs.
Posted on Reply
#30
moproblems99
newtekie1The only thing I think they did wrong was not actually list official specs for the MX150. I see no reason not to list the minimum specs for the MX150.
I agree for the most part, but it is possible that another company will have an ultrabook with shittier thermals that runs even lower than what nVidia has posted. If anything, they should just post cores, shaders, CUDA count, bus width, etc.
Posted on Reply
#31
Imsochobo
This ain't anything new.
But they should stop doing this; it's misleading consumers...

Same as an i7 in ultrabooks that can be dual-core...
Posted on Reply
#32
newtekie1
Semi-Retired Folder
moproblems99I agree for the most part but it is possible that another company will have an ultrabook with shittier thermals and run even lower than what nVidia has posted. If anything, they should just post core,shader, cuda, bus width, etc.
From what I gather, nVidia is setting the minimum specs for the lower power version still. It is just up to the manufacturers to pick which version they want to use. If they can't design a laptop around the lower power version, and the dedicated graphics ends up thermal throttling below the base clock, then that isn't nVidia or anyone else's fault but the laptop manufacturer.

Also, even with the low power version's specs set in stone, we can still have laptops using the same specced GPUs getting different performance. We see that in the benchmarks in the first post. The reason being that some likely were getting too hot and dropping from the boost clock, while others were in better designed machines with better cooling and didn't drop as far away from the boost clock. But as long as they were staying at or above the base clock, that is acceptable.

And I'd be willing to bet that if you took the higher performance MX150 and put it in any of those laptops that have the weaker version, the performance wouldn't really go up any. Because I'm guessing they all would thermal throttle and the performance would be the same. I mean, it is pretty obvious that even the ones with the weaker version MX150 were still being limited by thermals.
Posted on Reply
#33
HopelesslyFaithful
newtekie1From what I gather, nVidia is setting the minimum specs for the lower power version still. It is just up to the manufacturers to pick which version they want to use. If they can't design a laptop around the lower power version, and the dedicated graphics ends up thermal throttling below the base clock, then that isn't nVidia or anyone else's fault but the laptop manufacturer.

Also, even with the low power version's specs set in stone, we can still have laptops using the same specced GPUs getting different performance. We see that in the benchmarks in the first post. The reason being that some likely were getting too hot and dropping from the boost clock, while others were in better designed machines with better cooling and didn't drop as far away from the boost clock. But as long as they were staying at or above the base clock, that is acceptable.

And I'd be willing to bet that if you took the higher performance MX150 and put it in any of those laptops that have the weaker version, the performance wouldn't really go up any. Because I'm guessing they all would thermal throttle and the performance would be the same. I mean, it is pretty obvious that even the ones with the weaker version MX150 were still being limited by thermals.
There are also BIOS settings that are designed to keep the total power of the system under X TDP. Some laptops have universal power limits that will downclock the CPU or GPU if total power draw breaks X wattage. It might not even be thermal throttling. This was an issue with some laptops having hidden BIOS settings that throttled the whole system; I forget which laptops from years back had that.
Posted on Reply
#34
megamanxtreme
While that is said and done, where's the MX110?
The MX130 finally shows up, and it is in a build with an i7, with a higher price than an affordable ($600) laptop with an MX150 to boot.
I seriously wanted a cheap dedicated graphics chip so I don't have to rely on dual-channel memory, since a laptop in that price range (under $400) would most likely come with single-channel memory.
Posted on Reply
#35
R0H1T
So what about this one :confused:
Posted on Reply
#36
londiste
The MX150 is an entry-level GPU geared towards OEMs that never had fixed clock speed and TDP specs. It's rather interesting that Nvidia provides no ranges whatsoever on their page, though.
From varying sources, the TDP range is 10-36W, with the standard implementation being 25W?
Posted on Reply
#37
Vya Domus
birdieOr NVIDIA slandering news titles work as a click bait?
It's the clickbait Nvidia "bashing" news that annoys you? You know what annoys me? Your flame-war inducing comments that I always see:
Posted on Reply
#38
Shihab
If I'm not mistaken, having the same GPU [short] model sold under different configurations is not new to the mobile scene: notebook OEMs have been selling GPUs with less dedicated memory for a long time now, yet look at Nvidia's specs for those GPUs and you'll only see the highest capacity listed. Back then, it was the notebook makers' job to tell how much memory the GPU had. I don't see this as any different, and Nvidia's materials on the matter clearly point to that fact...

I'd shake my pitchfork at the notebook brands first, personally.
Posted on Reply
#39
HTC
Perhaps I misunderstood something: were the ultrabooks being sold with the better GPU before? If yes, then the whole "they did it to fit the power envelope" argument doesn't fly @ all.

Assuming "yes" to the question above, the way I see it is not that nVidia changed the GPU to a less powerful version, but that they purposely left the GPU model's specs vague in the "original version" and can therefore change it to a substantially less performing model while keeping "the same specs". As such, you could theoretically buy 2 of these "exact models" and have one perform 20+% better than the other. Dunno what you dudes call that, but I call it "legal fraud".
Posted on Reply
#40
Fluffmeister
R0H1TSo what about this one :confused:
That's just the higher clocked 1D10 variant overclocked even more, baby!
Posted on Reply
#41
Imsochobo
HopelesslyFaithfulThat's not misleading consumers... the i7 is generally the top CPU in its designated category (now they have i9s)... are you trolling with that comment?
Absolutely not.
I've seen some replace a 6700K + GTX 1070 desktop with an i7 laptop with a 1070 and thought it was just as good.
Yes, it happens A LOT!

Or think their i7 is on par with the desktop part.
Posted on Reply
#42
newtekie1
Semi-Retired Folder
HopelesslyFaithfulthere is also BIOS settings that are designed to keep total power of system under X TDP. Some laptops have universal power limits that will downclock the CPU or GPU if total power draw breaks X wattage. It might not even be throttling. This was an issue with some laptops having hidden BIOS settings that throttled the whole system. I forgot what laptops years back had that.
Yeah, but after reading the reviews for the laptops in question here, it is obviously a heat issue. These 13" ultra-thins just don't have the space for adequate cooling. The processors aren't able to keep their max turbos in a lot of them either.
HTCPerhaps I misunderstood something: were the ultrabooks being sold with the better GPU before? If yes, then the whole "they did it to fit the power envelope" argument doesn't fly @ all.
The answer is no. The higher power version seems to always be used in larger laptops, while the lower power version is in the 13" ultrabooks.
Posted on Reply
#43
HTC
newtekie1The answer is no. The higher power version seems to always be used in larger laptops, while the lower power version is in the 13" ultrabooks.
I C.

I take it then that those ultrabooks didn't exist @ all with the "original" MX150 GPU, and they were only introduced when the "recent" MX150 came about. If so, then there's absolutely no problem, other than nVidia "trying" to confuse the potential buyer by not providing "proper" specs: had they provided the specs, this issue would not arise @ all. Labeling 2 GPUs with substantially different performance the same (MX150) certainly does not help.
Posted on Reply
#44
newtekie1
Semi-Retired Folder
HTCI C.

I take it then that those ultrabooks didn't exist @ all with the "original" MX150 GPU, and they were only introduced when the "recent" MX150 came about. If so, then there's absolutely no problem, other than nVidia "trying" to confuse the potential buyer by not providing "proper" specs: had they provided the specs, this issue would not arise @ all.
Correct, the same model laptop never used different versions of the MX150.
Posted on Reply
#45
R0H1T
FluffmeisterThat's just the higher clocked 1D10 variant overclocked even more, baby!
Yeah, I don't think Nvidia can be hanged for this; the OEM wanted something to fit in a 10W TDP, and you get an underclocked MX150. Its clocks can easily be doubled, unless there's a hard TDP limit and/or these are the lower MX150 bins. Kinda like the RX 560D, IIRC :wtf:
Posted on Reply
#46
moproblems99
FluffmeisterThat's just the higher clocked 1D10 variant overclocked even more, baby!
They really need to add a T moniker to finish off this variant.
Posted on Reply
#47
Vayra86
Alright, let me show the problem with this MX150 when a company also has an MX130, and why I feel Nvidia should take a hit on this. The gaps in performance are capable of edging comfortably into each other's territory. Take special note of that Cloud Gate 720p result: the extreme width of the red bar on the MX150 there, with min/max performance almost doubled in points. Also, Fire Strike 1080p: the MX130's average scores higher than the MX150's minimums. So much for a 'generational leap forward'... Keep in mind the MX130 is Maxwell, thus less efficient.

www.notebookcheck.net/GeForce-MX130-vs-GeForce-MX150_8132_8000.247598.0.html

Posted on Reply
#48
newtekie1
Semi-Retired Folder
Vayra86Alright, let me show the problem with this MX150 when a company also has an MX130, and why I feel Nvidia should take a hit on this. The gaps in performance are capable of edging comfortably into each other's territory. Take special note of that Cloud Gate 720p result: the extreme width of the red bar on the MX150 there, with min/max performance almost doubled in points. Also, Fire Strike 1080p: the MX130's average scores higher than the MX150's minimums. So much for a 'generational leap forward'... Keep in mind the MX130 is Maxwell, thus less efficient.
Yes, in a few synthetics it looks like there isn't much performance difference. But did you look at the actual game tests? In any of the tests that actually have an MX130 score, the MX150's minimums beat the MX130.

FF XV low 720p: MX130 = 22.5FPS MX150 Minimum = 25.2FPS
Assassin's Creed Origins Low 720p: MX130 = 29FPS MX150 Minimum = 42FPS
Middle Earth: SoW Low 720p: MX130 = 43FPS MX150 Minimum = 47FPS
Rocket League Low 720p: MX130 = 94FPS MX150 Minimum = 127FPS

So, obviously, even the weaker version of the MX150 is still outperforming the MX130 in real world tests.

But, again, the biggest part of this is likely going to come down to the thermals of the laptop. Which is why the more efficient, cooler-running MX150 is performing better than the MX130 in the real world tests. Even Notebookcheck's own reviews of laptops with the weaker MX150 note that it will boost to over 1600MHz when the laptop is cool. The problem is these 13" ultrathins don't keep things cool for very long; the same reviews note that even the processor begins to lower its boost under load due to heat.

And the MX130's TDP is 30W for the configuration that comes close to matching the weaker MX150, while the weaker MX150 only has a TDP of 10W. So that is why the weaker MX150 exists. If they put a top MX130 in one of these 13" laptops, it would throttle so hard it wouldn't even come close to scoring as well as it did in those synthetics, forget about the real world tests. I doubt it would even be able to maintain its base clock.
Posted on Reply
#49
Vayra86
newtekie1Yes, in a few synthetics it looks like there isn't much performance difference. But did you look at the actual game tests? In any of the tests that actually have a MX130 score, the MX150's minimums beat the MX130.

FF XV low 720p: MX130 = 22.5FPS MX150 Minimum = 25.2FPS
Assassin's Creed Origins Low 720p: MX130 = 29FPS MX150 Minimum = 42FPS
Middle Earth: SoW Low 720p: MX130 = 43FPS MX150 Minimum = 47FPS
Rocket League Low 720p: MX130 = 94FPS MX150 Minimum = 127FPS

So, obviously, even the weaker version of the MX150 is still outperforming the MX130 in real world tests.

But, again, the biggest part of this is likely going to come down to the thermals of the laptop. Which is why the more efficient, and less heat outputting, MX150 is performing better than the MX130 in the real world tests. Because, even notebookcheck's own reviews on laptops with the weaker MX150 note that it will boost to over 1600MHz when the laptop is cool. The problem is these 13" ultrathins don't keep things cool for very long, again the same reviews note that even the processor begins to lower the boost under load too due to heat.

And the MX130's TDP is 30W for the configuration that is coming close to matching the weaker MX150. While the weaker MX150 only has a TDP of 10W. So that is why the weaker MX150 exists. If they would put a top MX130 in one of these 13" laptops, it would throttle so hard it wouldn't even come close to scoring as well as it did in those synthetics, forget about the real world tests. I doubt it would even be able to maintain its base clock.
You're right, I'm not contesting a word of what you're saying. But now I'm pretending to be Mr. Average Joe, with a small, even average knowledge of tech, having learned to always check benchmarks. Gonna go shopping for a laptop with a 'bit more' than just a weak IGP, and I find Nvidia's MX150 on the net. It takes an awful lot of knowledge, even reading between the lines and into a laptop's cooling capabilities, to really learn anything at all. The MX150 by itself tells me nothing; worst case, it'll do less than my old, slightly bulkier laptop with an MX130 in it.

THAT is what's leaving a really sour taste in my mouth. And you mentioned Intel's confusing forest of mobile CPUs; yes, I agree on that as well. Was it intended? Of course this was intended... and that is why I feel these articles are very much right to exist and pop up once in a while, or in fact EVERY TIME a new low has been reached. The gaps are widening as the form factors get more extreme, and this is a really bad trend.

Apart from this, taking a longer look at the four game benches you linked, I see some striking links: AC Origins is CPU-heavy, and Rocket League runs at a very high FPS, which also means high CPU load. FFXV and Middle Earth are most certainly not, and here we see the MX130 and MX150 scores get frighteningly close. It underlines what you're saying about heat and Maxwell versus Pascal; at the same time, it shows that the graphics performance of the MX150 is extremely inconsistent whereas the MX130's is not. On the desktop, this is touted as Pascal's greatest feature and praised for its headroom; on mobile, it's abused to the limit to feign greater performance than we're actually getting. This is a change from the norm with regards to what Nvidia is/was offering. What's next, reversed GPU Boost? 'It may boost to the spec sheet's clock once in a while, if you're lucky'? Because essentially that is what this is.


It's exactly like you're telling it: Nvidia doesn't need to do this for ANY reason whatsoever, except to mislead and to support OEMs in doing that trick with them. Even if they had just specified a TDP and clock speed range (not even all the possible configurations, just min/max), the performance gap would be excusable and explainable. Even Intel doesn't play the game so badly; they at least specify the core counts somewhere.
Posted on Reply
#50
newtekie1
Semi-Retired Folder
Vayra86You're right, I'm not contesting a word of what you're saying. But now I'm pretending to be Mr Average Joe - with a small, even average knowledge of tech and having learned to always check benchmarks. Gonna go shopping for a laptop with a 'bit more' than just a weak IGP, and I find Nvidia's MX150 on the net. It takes an awful lot of knowledge and even reading between the lines and into a laptop's cooling capabilities to really learn anything at all. The MX150 by itself tells me nothing - worst case, it'll do less than my old slightly bulkier laptop with an MX130 in it.
We aren't talking "slightly" less bulky here. The difference between the laptops we see the MX130 in and the ones with the weaker MX150 is miles apart, to the point that anyone shopping for one isn't shopping for the other. The MX130 comes in full-thickness 15" laptops (I don't think I've seen it in anything else due to the high thermals), and the weaker MX150 only comes in 13" ultrathins. They are in completely different categories.

And at the end of the day, that 13" ultrathin is still outperforming the 15" thick bastard, so I would like to meet the person who wouldn't be happy with that.

Apart from this, taking a longer look at the four game benches you linked, I see some striking links: AC Origins is CPU-heavy, and Rocket League runs at a very high FPS, which also means high CPU load. FFXV and Middle Earth are most certainly not, and here we see the MX130 and MX150 scores get frighteningly close. It underlines what you're saying about heat and Maxwell versus Pascal; at the same time, it shows that the graphics performance of the MX150 is extremely inconsistent whereas the MX130's is not. On the desktop, this is touted as Pascal's greatest feature and praised for its headroom; on mobile, it's abused to the limit to feign greater performance than we're actually getting. This is a change from the norm with regards to what Nvidia is/was offering. What's next, reversed GPU Boost? 'It may boost to the spec sheet's clock once in a while, if you're lucky'? Because essentially that is what this is.

Again, the inconsistencies come down to thermals more than anything else. The fact of the matter is that the MX150 is being put in computers that struggle to keep it cool, and that is why the performance varies. The laptops that the MX130 goes in don't have the same issues with cooling, and the MX130 would struggle way more if it was put in 13" ultrathins. And that is what people need to realize: when you start pushing thermal envelopes to the point that a 10W GPU starts to overheat, it really doesn't matter if you started with the faster MX150, because it ain't going to perform any better.
Posted on Reply