
SCHENKER XMG NEO 17 M22 Released: Ryzen 9 6900HX, RTX 3080 Ti, 16:10 Display, Liquid Cooling

D

Deleted member 185088

Guest
Given that they're just different bins and configurations of the same silicon I really don't see the problem - unless you want a GA102 mobile GPU, which just isn't feasible in anything smaller than a huge, chunky 17"+ (likely even larger) device anyhow due to package size, VRM and VRAM requirements, and more. As for existing configurations, the mobile configurations are vastly more efficient, which is what makes them usable in a portable device in the first place. And if you want more, just build a portable ITX system in some ~10l case with a full sized GPU and a portable monitor. You'll get full desktop performance, some portability, and desktop pricing too.
With Pascal they put the same desktop GPUs in laptops, just with reduced clocks; now they sell an overpriced 3080 Ti that is basically a slow 3070 Ti. For 5,000€ or so I expect to see the full GPU, or at least the 3080.

Hi everyone,

we would like to directly reply to some of your comments and questions. For more information, please check out our deep dive on XMG NEO in our own sub-reddit.



It covers all the important hot spots: CPU, GPU and VRAM. This was the most efficient way to have a substantial impact on thermals without having any negative impact on air cooling and mobility.

The main goal was to reduce total system noise in GPU-focused workloads such as gaming. Reviews of the XMG NEO 15 (E22) with XMG OASIS, including one on TechPowerUp, have confirmed that this target has indeed been met.
Let me show you some numbers:

In a CPU all-core rendering workload, you will still have to deal with some fan noise because the CPU's surface area is very small.
Our air/water hybrid cooling system is able to increase sustained CPU power by 20%, but it still requires the laptop fans to push it over the finish line.

View attachment 257855

In a GPU-focused workload, the total system fan noise drops to almost idle levels because the 175 W GPU power can be cooled at quite low temperatures.

View attachment 257854
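As a sanity check on why 175 W is easy for a liquid loop to carry, here is a back-of-the-envelope heat balance. The flow rate is my own hypothetical assumption, not an XMG spec; only the 175 W figure comes from the post above.

```python
# Steady-state coolant temperature rise: delta_T = P / (m_dot * c_p).
# 175 W is the GPU power limit quoted above; the flow rate is assumed.

P = 175.0          # W, GPU heat load
c_p = 4184.0       # J/(kg*K), specific heat of water
flow_lpm = 1.0     # L/min, hypothetical pump flow rate

m_dot = flow_lpm / 60.0        # kg/s (1 L of water is ~1 kg)
delta_T = P / (m_dot * c_p)
print(f"Coolant warms by only ~{delta_T:.1f} K while carrying 175 W")
```

Even at a modest 1 L/min, the water only warms by a couple of kelvin, which is why the heat can be rejected quietly by one large, slow radiator fan instead of small, fast laptop fans.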

In the design process, it was important to minimize weight, cost and complexity of the "waterblock" because it is supposed to only be an optional add-on to the air cooling.

Users who buy the laptop without the intention of buying XMG OASIS should not have to sacrifice too much in weight and cost.



We have been selling this design since January this year and have not yet had a single case of anybody spilling water into their laptop.
The droplets that can leak during disconnect are absolutely minimal. The twin tube connector of XMG OASIS has a self-sealing valve.
Liquid damage of the laptop itself would most likely only be possible through gross negligence.

Our warranty policy is outlined in the FAQ on the product page.

Covered under warranty

If the water pipe inside the laptop is leaking due to normal wear and tear, this would be covered under warranty.

NOT covered under warranty
  • If the water pipe inside the laptop is damaged due to mechanical damage (impact shock) or inappropriate servicing.
  • If the water pipe inside the laptop is damaged by exposing it to excessive pressure with 3rd party devices, including air pressure gauges or 3rd party water cooling solutions, other than XMG OASIS.
  • If the water pipe inside the laptop is damaged by exposing it to freezing temperatures with liquid inside.
  • Any other damage from liquid that creeps in from outside the system, including accidental damage that might occur during inappropriately handled refill, drainage and disconnection operations.
RMA procedures

The distinction between “warranty” and “self-inflicted damage” should be clear enough in most cases. The usual process looks like this:
  • If you discover any issue with your product, please contact us.
  • If we come to the conclusion that your product issue might have a hardware root cause, we will offer you a free RMA shipment*.
  • The service technicians in our RMA department will inspect the product. If there is a defect, we will find out the most probable root cause of it.
  • If the defect or its root cause is not covered under warranty (see examples above), we will reach out to you and offer an alternative solution.
* Free shipping after consultation with our support applies within the European Economic Area (EEA and EFTA), i.e. within the 27 member states of the EU plus Switzerland, Iceland, Norway and Liechtenstein.

Our user manual (delivered in colored print to all owners of XMG OASIS) is quite clear about how to safeguard against mishandling.

You can find the full user manual here.

Please allow me to quote the relevant part:

Disconnect the water tubes

When you are finished using XMG OASIS and you intend to disconnect the water tubes, please follow these instructions:
  1. As a precaution, shut down the laptop or send it to hibernate mode. Sending it merely to standby mode is not sufficiently safe.
  2. Disconnect the power adapter from XMG OASIS and disconnect the DC cable of XMG OASIS from your laptop.
  3. Keep a handkerchief, tissue paper or microfiber cloth ready on hand. There will be small droplets of liquid emerging during the next step. Remove them as soon as they emerge.
  4. Gently squeeze the lock mechanism of the quick release connector and swiftly remove it from the laptop horizontally.
  5. Inspect the area around the water ports of the laptop for liquid droplets. Remove all fluids, keep all surfaces clean and dry.
  6. Now close the rubber seal of the laptop. This operation may cause additional liquid to be squeezed out around the edges and the ventilation hole of the rubber seal. Inspect the area again and clear off all remaining fluids. Make sure the rubber seal is fully closed and all surfaces are clean and dry.

View attachment 257852

Leak warning: make sure that XMG OASIS is turned off before you remove the water tubes. If you disconnect during operation, the water pressure caused by the operating pump will cause additional leakage.

Heat warning: if you have been using the laptop with the tubes connected but with XMG OASIS not running, the quick release connector (which is made of metal) may have accumulated a lot of heat from the system. Please make sure to test the connector’s temperature with a quick touch before fully grasping it.




No support from silicon partners.

Exception: we've had models with a Desktop CPU and a Laptop GPU, but those have always been difficult to move forward due to lackluster support from some partners.
Even if you use the Mobile GPU, it's not guaranteed that you get the green light if you want to bundle it with a Desktop CPU.



We are working with our partners on projects with AMD Radeon graphics. Those projects are still under discussion, so we cannot yet make any promises regarding availability.

However, balancing CPU and GPU power is already entirely possible in Intel (CPU) + NVIDIA and AMD (CPU) + NVIDIA solutions. NVIDIA introduced Dynamic Boost years ago, which controls GPU power based on CPU load. This is not really a new concept, and it works very well even when combining different CPU and GPU vendors.
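A toy model of the idea (purely illustrative - this is not NVIDIA's actual Dynamic Boost algorithm, and all wattages are made up): a shared package budget gets shifted toward whichever processor is busier.

```python
def split_budget(total_w, cpu_util, gpu_util, cpu_min=35.0, gpu_min=80.0):
    """Divide a shared power budget between CPU and GPU.

    Utilizations are in [0, 1]; the minimums keep both sides responsive.
    All numbers here are hypothetical, for illustration only.
    """
    spare = total_w - cpu_min - gpu_min
    demand = cpu_util + gpu_util
    if demand == 0:
        return cpu_min, gpu_min
    # Hand out the spare wattage in proportion to relative demand.
    cpu_w = cpu_min + spare * cpu_util / demand
    gpu_w = gpu_min + spare * gpu_util / demand
    return cpu_w, gpu_w

# In a GPU-bound game, most of the spare budget flows to the GPU:
cpu_w, gpu_w = split_budget(230.0, cpu_util=0.3, gpu_util=1.0)
print(f"CPU: {cpu_w:.0f} W, GPU: {gpu_w:.0f} W")
```

The point of the sketch is that the controller only needs load information, not vendor-specific silicon details, which is why the concept works across CPU/GPU vendor combinations.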

If anyone has any other questions about our announcement, feel free to ping me here in the forums.

Cheers,
Tom
Thank you for your answer.
I'm curious though: is AMD also not supporting this? I had a 6800 XT that seemed to hover around 230 W during gameplay; with lower voltages and clocks it would be a better solution than the hopelessly slow 3080 Ti mobile.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
With Pascal they put the same desktop GPUs in laptops, just with reduced clocks; now they sell an overpriced 3080 Ti that is basically a slow 3070 Ti. For 5,000€ or so I expect to see the full GPU, or at least the 3080.
You put way too much weight on labels/model numbers.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
With Pascal they put the same desktop GPUs in laptops, just with reduced clocks; now they sell an overpriced 3080 Ti that is basically a slow 3070 Ti. For 5,000€ or so I expect to see the full GPU, or at least the 3080.
Pascal is also pretty much the only series in which this has been true. And it makes perfect sense - names are only arbitrary signifiers of relative performance between tiers, and laptops are not desktops, so there's no reason to expect them to be identical, especially given the drastically different power and thermal envelopes. And, once again, you're not getting a GA102 in a mobile form factor - it's too large, needs too much space for VRAM, needs too wide a VRM to reasonably fit in even a large gaming laptop, even if they underclocked it for efficiency. It would be a super expensive, bespoke product tier that would only exist in DTR laptops - which barely sell at all. It would lose money for both Nvidia and laptop makers. As @bug said, you put way too much weight on model numbers. Higher number=better, but mobile and desktop are entirely separate lists, and tiers are not cross-comparable.
 
Joined
Jan 25, 2020
Messages
2,227 (1.24/day)
System Name GrandadsBadAss
Processor I7 13700k w/ HEATKILLER IV PRO Copper Nickel
Motherboard MSI Z790 Tomahawk Wifi
Cooling BarrowCH Boxfish 200mm-HWLabs SR2 420/GTS 360-BP Dual D5 MOD TOP- 2x Koolance PMP 450S
Memory 2x16gb G.SKILL Trident Z5 Neo RGB 6400
Video Card(s) Asrock 6800xt PG D w/ Byski A-AR6900XT-X
Storage WD SN850x 1TB NVME M.2/Samsung 980 Pro 1TB NVMe M.2
Display(s) Acer XG270HU
Case Phanteks Enthoo Pro 2 Server Edition w/3 Noctua NF-A14 2000 IP67/4 be quiet! LIGHT WINGS LX 120mm
Audio Device(s) Logitech z623 <---THE SUCK
Power Supply FSP Hydro Ti PRO 1000w
Mouse Logitech G502
Keyboard Roccat Vulcan Aimo
Software Win 10/11pro
There aren't many options for ANY kind of water cooling in a mass-produced laptop. There are simply far too many variables at play, not to mention the problems of physically fitting something that might actually work into such a tight space.
Once you get the idea of a custom water-cooled loop out of your head and look at it from the perspective of such a small physical area, you're not left with much else than what we see here.
Unless you think your customer might be willing to carry around a 240 rad and res/pump combo along with their laptop (not likely). It could work, but then you'd have to design some kind of ultra-flat CPU and GPU blocks that won't turn your laptop into something military-grade thick. Again, the space constraints.
With all the complaints we see from people about their gaming laptops overheating... I think this is a good example of people thinking outside the box for once.
 

bug

There aren't many options for ANY kind of water cooling in a mass-produced laptop. There are simply far too many variables at play, not to mention the problems of physically fitting something that might actually work into such a tight space.
Once you get the idea of a custom water-cooled loop out of your head and look at it from the perspective of such a small physical area, you're not left with much else than what we see here.
Unless you think your customer might be willing to carry around a 240 rad and res/pump combo along with their laptop (not likely). It could work, but then you'd have to design some kind of ultra-flat CPU and GPU blocks that won't turn your laptop into something military-grade thick. Again, the space constraints.
With all the complaints we see from people about their gaming laptops overheating... I think this is a good example of people thinking outside the box for once.
Not to mention "liquid cooling" is not mentioned on the official website anyway, it's just an unfortunate addition on TPU.
The official website only states:
The excellent cooling system of the XMG NEO 17 is one of the most striking features of the laptop, which is designed for uncompromising high performance. The interconnected system with five heat pipes and four heat sinks and air outlets allows the processor and graphics card to be pushed to their ultimate limits, while the two 11 mm fans are characterised by an unobtrusive, low-frequency sound characteristic.
 

XMG Support

Schenker Rep
Joined
Apr 1, 2020
Messages
23 (0.01/day)
I'm curious though: is AMD also not supporting this? I had a 6800 XT that seemed to hover around 230 W during gameplay; with lower voltages and clocks it would be a better solution than the hopelessly slow 3080 Ti mobile.

The requirements between laptop and desktop cards are just different. Laptops have quite different PCB layouts, where the same key components need to fit into a much tighter space. Silicon vendors don't like to spend R&D resources on helping OEMs cram a desktop layout into a laptop, basically re-inventing the laptop layout that already exists.

The 230 watts total board power of your 6800 XT is way beyond anything that is currently possible in laptops. NVIDIA RTX 3080 Ti takes up to 175 watts and is already at the limit of what most vendors can do.

It's true that AMD's RDNA2 has made huge gains in performance-per-watt efficiency. That's why we'd be excited to bring a laptop with RDNA2 to market. But this is not about desktop vs. laptop; it's about supply, vendor support and delivering a full, working product. We are not ready to reveal any specific plans yet.

Not to mention "liquid cooling" is not mentioned on the official website anyway, it's just an unfortunate addition on TPU.

Support for XMG OASIS is mentioned on the product page and you can select XMG OASIS on bestware when configuring your XMG NEO (E22 and M22).

But good point, the word "liquid" is not to be found on the product page at the moment. Perhaps we will add a link to the XMG OASIS micro-page, which explains the liquid cooling solution in detail.

From the XMG OASIS FAQ:

What is XMG OASIS?

XMG OASIS is a modular liquid cooling system that has been specially developed for the XMG NEO series. The system consists of an external housing that contains a liquid reservoir, a pump, a fan and a radiator. The radiator is the heart of the liquid cooling system: warm liquid flows through a tube from the laptop into the radiator and is cooled there by the large-area case fan. The cold liquid flows back into the laptop via a second tube, thus forming a closed circuit. Due to the large surface area of the 120mm fan, XMG OASIS generates significantly less fan noise than a usual laptop air cooling system.

The two cooling tubes are connected to a metal pipe inside the laptop through a 2-in-1 quick release connector on the back of the laptop. The pipe inside the laptop is soldered to the traditional heat pipes of the laptop’s air cooling system. The tube follows a curved path across the cooling system, indirectly touching the laptop’s main heat emitters: processor, graphics card, voltage regulators and video memory. Liquid flows through the inside of this tube and, thanks to its high thermal conductivity, transports the excess heat from the aforementioned hotspots directly to the outside with surprisingly high efficiency.
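The air side of that loop is easy to sanity-check too. Under assumed (not official) numbers for the 120 mm radiator fan, it can dump 175 W into room air with only a modest temperature rise:

```python
# Air-side heat rejection: delta_T = P / (m_dot_air * c_p_air).
# The fan airflow is a hypothetical guess; 175 W matches the GPU
# power limit mentioned earlier in the thread.

P = 175.0              # W, heat to reject at the radiator
cfm = 50.0             # assumed 120 mm fan airflow, CFM
rho_air = 1.2          # kg/m^3, air density
c_p_air = 1005.0       # J/(kg*K), specific heat of air

m_dot = cfm * 0.000471947 * rho_air    # CFM -> m^3/s -> kg/s
delta_T = P / (m_dot * c_p_air)
print(f"Exhaust air is only ~{delta_T:.1f} K above ambient")
```

A single 120 mm fan at a quiet speed already moves enough air for this, which is consistent with the low-noise claim for GPU workloads.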

Cheers,
Tom
 
Joined
Jul 5, 2013
Messages
28,260 (6.75/day)
Desktop PC watercooling: here's my $200, 1kg pure copper GPU waterblock with precision machined microchannels and highly optimized flow paths, alongside my $150 CPU block that's nearly as complex.

Laptop water cooling: yeah, so there's this tiny, kinda flat copper pipe here, you see, that runs kind of across the CPU and GPU in a little loop, on top of the other heatpipes, and it's got water in it.
Actually, physics says it should work. You can joke, but I'm betting it's effective. That said...

I like this system. Ticks the boxes for me. I would undervolt/underclock it all, and I'd really like someone to do a 16:10 19" or 20" notebook, but this system is solid IMHO.
 

bug

Actually, physics says it should work. You can joke, but I'm betting it's effective. That said...
Sure it should work. But like @XMG Support said above, 175W is about the limit of what it can do. Whereas on a desktop, you can "easily" manage 100W on top of that. Of course, it's still physics. Airflow, more material and more room to work with have to amount to something.
 
175W is about the limit of what it can do. Whereas on a desktop, you can "easily" manage 100W on top of that. Of course, it's still physics. Airflow, more material and more room to work with have to amount to something.
Those are good points. However, this is a mobile system. It doesn't need to handle more. What it does, it likely does very well. I haven't seen one personally, so I can't say for sure, but the short math says it should work well.
 
Actually, physics says it should work. You can joke, but I'm betting it's effective. That said...
Way ahead of you there:
I think this is the same OEM solution that LTT has covered a few times, and apparently it works kind of decently
The fact that it works doesn't make the difference between this and desktop custom water cooling any less funny.
 
Joined
Dec 30, 2010
Messages
2,200 (0.43/day)
I'm sure they could develop or install something like an ancient Thermaltake Tide Water inside of it:



Other than collecting dust, it's pretty much maintenance-free.
 
Joined
Jun 15, 2021
Messages
63 (0.05/day)
It's so funny how obviously shoehorned in there that one pathetic pipe is. It doesn't even cover the maximum surface area it could, FFS! Just goes to show that fools and their money...
The small pipe with liquid doesn't cool the GPU/CPU cores themselves so much as the heatpipes that actually cool down the CPU/GPU.

it actually works very well
 

VSG

Editor, Reviews & News
Staff member
Joined
Jul 1, 2014
Messages
3,695 (0.97/day)
I feel like I should share these directly in here so people see the data:

CPU temps stress oasis.png


GPU temps stress oasis.png


GPU clock stress oasis.png


I have a Ryzen-based XMG Neo 15 here already that will be tested on the OASIS too, but after that I'll get the NEO 17 in so you guys will have that data as well. Point is the internal liquid cooling setup works fine, just that the design of the external OASIS cooler unit could be improved for this market.
 
D

Deleted member 185088

Guest
Pascal is also pretty much the only series in which this has been true. And it makes perfect sense - names are only arbitrary signifiers of relative performance between tiers, and laptops are not desktops, so there's no reason to expect them to be identical, especially given the drastically different power and thermal envelopes. And, once again, you're not getting a GA102 in a mobile form factor - it's too large, needs too much space for VRAM, needs too wide a VRM to reasonably fit in even a large gaming laptop, even if they underclocked it for efficiency. It would be a super expensive, bespoke product tier that would only exist in DTR laptops - which barely sell at all. It would lose money for both Nvidia and laptop makers. As @bug said, you put way too much weight on model numbers. Higher number=better, but mobile and desktop are entirely separate lists, and tiers are not cross-comparable.
Before, we had an "m" after the name to differentiate between laptop and desktop GPUs; with Pascal (and later Turing) nVidia removed it because they put the same die in laptops.
The rest of your arguments fall apart, as 3080 Ti laptops are insanely expensive, and Turing laptop GPUs had huge dies, like the 2080 Super, which had 200 W, unlike the fake 3080 Ti mobile with 175 W.
You and @bug are naïve enough to fall for nVidia's marketing; the new naming scheme is a lie to mislead people (alongside the awful power nonsense).
 
Before, we had an "m" after the name to differentiate between laptop and desktop GPUs; with Pascal (and later Turing) nVidia removed it because they put the same die in laptops.
The rest of your arguments fall apart, as 3080 Ti laptops are insanely expensive, and Turing laptop GPUs had huge dies, like the 2080 Super, which had 200 W, unlike the fake 3080 Ti mobile with 175 W.
You and @bug are naïve enough to fall for nVidia's marketing; the new naming scheme is a lie to mislead people (alongside the awful power nonsense).
Well, it's good to know that we are dealing with someone with deep engineering knowledge on a level rivalling at least Nvidia's.

/s

We never saw a mobile TU102 implementation, so... yeah. Still topped out at the second largest die, still topped out at 256-bit memory. Same as now. Turing didn't have quite the same massive transient power spikes of Ampere either, which might go some way towards explaining its marginally higher power limits. But OEMs are also free to configure the 3080 Ti mobile at 175W+ if they want to - that this hasn't happened likely says more about diminishing returns for performance than anything else. Also, do you really need that 'm'? It's a laptop gpu. It sits on a laptop motherboard. It is not a desktop GPU - it can't be. It's obvious Nvidia doesn't want to highlight the difference between the two, but... does it matter? Yes, it's a branding and marketing exercise that removes a tiny sliver of clarity, but the only point at which I find it even remotely problematic is if it were actually confusing to buyers - and I don't believe there are enough people cross-shopping laptops and desktops for that to be much of a problem.
 
D

Deleted member 185088

Guest
Well, it's good to know that we are dealing with someone with deep engineering knowledge on a level rivalling at least Nvidia's.

/s

We never saw a mobile TU102 implementation, so... yeah. Still topped out at the second largest die, still topped out at 256-bit memory. Same as now. Turing didn't have quite the same massive transient power spikes of Ampere either, which might go some way towards explaining its marginally higher power limits. But OEMs are also free to configure the 3080 Ti mobile at 175W+ if they want to - that this hasn't happened likely says more about diminishing returns for performance than anything else. Also, do you really need that 'm'? It's a laptop gpu. It sits on a laptop motherboard. It is not a desktop GPU - it can't be. It's obvious Nvidia doesn't want to highlight the difference between the two, but... does it matter? Yes, it's a branding and marketing exercise that removes a tiny sliver of clarity, but the only point at which I find it even remotely problematic is if it were actually confusing to buyers - and I don't believe there are enough people cross-shopping laptops and desktops for that to be much of a problem.
You are completely missing my point: nVidia shouldn't use misleading names, that's all. They used to have a reasonable naming scheme; now it doesn't make any sense.
I am quite aware of laptops' limitations; as such, I didn't suggest a 3090, but rather putting the same die in both if the same name is used, and giving it more TGP, like 200 W. With that, a 3070 Ti in a laptop would have similar levels of performance to the desktop GPU.
As for the rest of your "arguments", as I said, you fall for nVidia's marketing; to be fair, it's quite effective. It is sad to see here; we ought to be able to see through marketing.
 

bug

What context? The only thing I read was excuses/speculation to justify nVidia's misleading marketing. None of my arguments were addressed.
There are no arguments to address. The correlation between laptop and desktop part names does not exist. Simple as that.
 
D

Deleted member 185088

Guest
There are no arguments to address. The correlation between laptop and desktop part names does not exist anymore. Simple as that.
Fixed that for you!
/S
Let's just agree to disagree, though it is interesting to hear from people with different views.
 
What context? The only thing I read was excuses/speculation to justify nVidia's misleading marketing. None of my arguments were addressed.
Your arguments weren't addressed? Have you even had multiple? Let's see:
Didn't know that, probably nVidia doesn't want competition for their pathetic laptop GPUs, with Pascal they managed to put the same GPUs on laptops (better with the 1070 which had more cuda cores).
With Pascal they put the same desktop GPUs on laptops, just reduced clocks
These are essentially the same thing being said, just from slightly different angles. "Laptop GPUs used to be closer to desktop GPUs, or named differently." Which was, among other places, addressed in this post:
Pascal is also pretty much the only series in which this has been true. And it makes perfect sense - names are only arbitrary signifiers of relative performance between tiers, and laptops are not desktops, so there's no reason to expect them to be identical, especially given the drastically different power and thermal envelopes.
And once again:
Before we had an "m" after the name to differentiate between laptop and desktop GPUs,
This is just naming. Names are arbitrary. You can think that one naming convention is better or worse than another, but that's just an opinion.
with Pascal (and later Turing) nVidia removed it because they put the same die on laptops.
No, they removed it because they wanted to remove the stigma of "m GPUs are crap" which had - deservedly - cemented itself over the years.

For the response to that, again, see above.

Moving on:
The rest of your arguments fall apart as the 3080ti laptops are insanely expensive
This isn't an argument, at least not one that relates to the topic at hand whatsoever. Does one segment being more expensive somehow make desktop and laptop GPUs inherently comparable? What? Does the fact that a large excavator costs more than a family sedan suddenly render them somehow comparable? Different things are different things. Mobile GPUs are not desktop GPUs.
and Turing laptop GPU had huge dies like the 2080 super which had 200w unlike the fake 3080ti mobile with 175w.
And you have already been told that this is not true. The size difference between mobile Turing and Ampere is marginal - they're both huge. Mobile Turing topped out at TU104. I was actually wrong before, as Ampere doesn't top out at GA104, but has the GA103S as its largest mobile variant. These are very comparable in terms of die size - 545mm2 and 496mm2 respectively. Yes, GA103S is marginally smaller, but not to a degree that matters. You have also persistently failed to address the actually space-consuming components of a laptop GPU, which matter far more than die size: the board space needed for ancillary circuitry, mainly VRMs and VRAM. Ampere boosts much more aggressively than Turing and as a consequence has significant power excursions/current spikes, and thus needs a beefier VRM for the same wattage. They also top out at the same 256-bit VRAM bus, with 8 VRAM dice - they're at the same level.

What does this tell us? Well, we can use our eyes and look at the boards of desktop GPUs and compare them to the boards of high end gaming laptops. Two things are clear: high end GPUs have huge and/or jam-packed boards with a lot of Z-height for VRM components; laptops are always jam-packed, do not have the luxury for lots of Z-height (outside of a few very thick laptops), and are thus more limited in what you can pack into them. Which is why there are no TU102 or GA102 laptop variants. There just isn't room, outside of the tiny, hyper-expensive niche of DTR laptops - and developing a specific SKU for that would be a massive money sink for everyone involved.

You're also blatantly ignoring the massive power consumption increase from Turing to Ampere and its attendant consequences when looking at mobile SKUs in your insistence on comparing them to desktop cards. The 2080 Super mobile has a 100W lower TDP than its desktop counterpart (configurable up to a 50W lower TDP), at 150W (up to 200W) vs. 250W. So, that's a reasonable gap, right? But how does that change when the 3080 Ti has a full 100W TDP increase over the 2080 Super? That delta grows - you can't just choose to increase laptop TDPs by 100W - physics doesn't work that way. We've also seen a fundamental change in how laptop GPUs are segmented from Nvidia, with the Max-Q labeling being abandoned, as they've realized that it's better to allow each OEM to configure the GPU to what their chassis can actually cool (with a lower bound for acceptable performance per model) than forcing fixed configurations that might not fit a given thermal envelope. This has the disadvantage of less clarity for consumers, as you can't look for the Max-Q label for low power or its absence for high performance, but also has the advantage of providing a wider, more diverse set of options for consumers as laptops can be designed at any point in between those two extremes.
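The "delta grows" point is just arithmetic on the TDPs quoted above. The 175 W figure for the mobile 3080 Ti comes from the earlier post in this thread, and the 350 W desktop figure follows from the stated 100 W increase over the 250 W 2080 Super; these are nominal board-power numbers, and actual configurable limits vary by laptop:

```python
# Desktop-vs-mobile TDP gap, Turing vs Ampere, using the wattages
# discussed in this thread (nominal figures, not per-SKU spec sheets).
desktop_2080s, mobile_2080s_max = 250, 200    # W (mobile at its 200 W upper config)
desktop_3080ti, mobile_3080ti = 350, 175      # W (175 W as shipped in many laptops)

gap_turing = desktop_2080s - mobile_2080s_max    # 50 W
gap_ampere = desktop_3080ti - mobile_3080ti      # 175 W
print(f"Desktop-vs-mobile TDP gap: Turing {gap_turing} W -> Ampere {gap_ampere} W")
```

The gap more than triples, which is exactly why mobile Ampere SKUs sit further from their desktop namesakes than mobile Turing did.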

There is also of course the fact that with Turing, 80-tier desktop GPUs were 04-series (with the exception of the Ti, which had no mobile counterpart), while with Ampere, they are 02. This also changes things, no? Again: are you expecting GA102 on mobile? That is, again, a significantly larger die - 628 mm² - and, crucially, one equipped with 12 memory channels. Would you want a mobile GA102 cut down to 8 channels? Or are you just selectively ignoring the impossibility of implementing this much VRAM in any reasonably sized gaming laptop?
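The VRAM point follows directly from bus width: GDDR6/6X devices each use a 32-bit interface, so the minimum number of memory packages on the board is the bus width divided by 32 (clamshell mounting can double that count, never halve it). A minimal sketch:

```python
# Minimum VRAM package count implied by a GPU's memory bus width,
# assuming standard 32-bit-per-device GDDR6/6X interfaces.
def vram_die_count(bus_width_bits: int, per_die_bits: int = 32) -> int:
    return bus_width_bits // per_die_bits

print(vram_die_count(256))  # TU104 / GA103S: 8 packages
print(vram_die_count(384))  # GA102: 12 packages - 50% more board area just for VRAM
```

Four extra VRAM packages plus their traces and power delivery is a lot of board area in a chassis where every square millimetre is already spoken for.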

And, once again, remember the market segments and how they are evolving. Gaming laptops are shrinking. Drastically - the 17" segment today is dominated by thin-bezeled laptops around the size of 15" laptops just a few years ago. Larger than 17" is essentially nothing. And they're getting thinner and more portable too. While you could argue that the growth of SFF means desktops are also shrinking, it's still a world apart in terms of cooling capability and what can actually be fit within the confines of the chassis. A GA102 laptop GPU isn't feasible in the types of laptops that sell in any type of quantity today. Period.

This isn't "falling for Nvidia's marketing". It is taking into account the actual relevant realities surrounding these two separate product segments, and accepting that despite having similar names, names are ultimately arbitrary and for GPUs only designate relative performance within their same generation and brand (with a lot of flexibility). You are insisting that the names should be non-arbitrary, i.e. that there must be a material commonality between a mobile 3080 Ti and a desktop 3080 Ti, beyond both of them being at or near the top of their respective product stacks - extreme performance, highly priced GPUs in their respective markets. This insistence of yours is misguided and ill-founded, and from your arguments seems to be based on some weak reasoning and unrealistic desires. You cannot compare a 3080 Ti laptop to a 3080 Ti desktop and expect similar performance - they'll both be crazy fast, but still wildly different. That was true with Turing and Pascal too. And with previous generations.
 
D

Deleted member 185088

Guest
Your arguments weren't addressed? Did you even make more than one? Let's see:


These are essentially the same thing being said, just from slightly different angles. "Laptop GPUs used to be closer to desktop GPUs, or named differently." Which was, among other places, addressed in this post:

And once again:

This is just naming. Names are arbitrary. You can think that one naming convention is better or worse than another, but that's just an opinion.

No, they removed it because they wanted to remove the stigma of "m GPUs are crap" which had - deservedly - cemented itself over the years.

For the response to that, again, see above.

Moving on:

Again you fail miserably to address my points, or even to understand them. I'm aware of all the points you raised, but they're irrelevant.
As I said earlier we can agree to disagree.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Again you fail miserably to address my points, or even to understand them. I'm aware of all the points you raised, but they're irrelevant.
As I said earlier we can agree to disagree.
This isn't us agreeing to disagree. It's just you looking at things from a very specific point of view, one that allows you to talk trash about Nvidia. It's your right, of course. Just be a man and admit it.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Again you fail miserably to address my points, or even to understand them. I'm aware of all the points you raised, but they're irrelevant.
As I said earlier we can agree to disagree.
... so, perhaps try to expand on your arguments, if they are so incomprehensible to us? You said you have brought up arguments that haven't been addressed; I quoted the majority of your postings in this thread and showed how it had been responded to. What are we missing, beyond your vague and general thing of disagreeing that desktop and laptop GPUs should be considered separately due to being distinct and separate types of products?
 