
Alienware's Fewer CUDA Core Controversy Explodes, Company Admits Error, Announces mid-June Fix

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,301 (7.52/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Last week, it surfaced that Alienware shipped certain m15 gaming laptops with GeForce RTX 3070 Laptop GPUs that have fewer CUDA cores than standard (4,608 vs. 5,120), without properly advertising this in its marketing material. Over the weekend, the company's train-wreck of a response played out: first from Alienware's parent company Dell, and later from Alienware itself.

Dell, in a statement to Jarrod's Tech, tried to normalize the practice. "CUDA core counts per NVIDIA baseline may change for individual OEM, such as ourselves [Dell], to allow to provide a more specific design and performance tuning. Be assured the changes made by our engineering team for this computer model was done after careful testing and design choices to bring the most stable and best performance possible for our customers, if at a later date more CUDA cores can be unlocked via a future update, we will be swift to make it available on our support website," the Dell statement read.



Here's the controversy: Alienware did not advertise the specific configuration of the RTX 3070 Laptop GPU in this notebook; it only mentioned the GPU name. One is led to believe they are buying a notebook with a GPU they've independently researched to be of a certain configuration. Clock-speed tuning by OEMs is acceptable to a certain degree, but certainly not 10 percent fewer CUDA cores. CUDA cores aren't the only thing reduced, either. Since Dell/Alienware cuts the number of streaming multiprocessors available to the GPU, there are proportionate reductions in RT cores (ray tracing cores), Tensor cores, and TMUs as well.
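As a rough sanity check on that proportionality, the arithmetic below assumes the usual Ampere per-SM layout of 128 CUDA cores, 4 Tensor cores, 1 RT core, and 4 TMUs per SM; those per-SM figures are an assumption for illustration, not something Dell or NVIDIA have stated about this specific vBIOS.

```python
# Back-of-the-envelope sketch: if shaders are cut per streaming multiprocessor (SM),
# every per-SM resource shrinks by the same proportion. The per-SM counts below are
# the typical Ampere layout and are assumed for illustration only.
PER_SM = {"CUDA cores": 128, "Tensor cores": 4, "RT cores": 1, "TMUs": 4}

full_sms = 5120 // PER_SM["CUDA cores"]     # 40 SMs on a full RTX 3070 Laptop GPU
shipped_sms = 4608 // PER_SM["CUDA cores"]  # 36 SMs on the affected Alienware m15

for unit, per_sm in PER_SM.items():
    print(f"{unit}: {full_sms * per_sm} -> {shipped_sms * per_sm}")
print(f"Reduction: {1 - shipped_sms / full_sms:.0%}")  # -> 10%
```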

Meanwhile, Dell's misfired attempt at damage control was quickly eclipsed by Alienware, which trashed the "optimization" excuse offered by its parent company and instead called it an error. The company released a statement to Tom's Hardware: "We have been made aware that an incorrect setting in Alienware's vBIOS is limiting CUDA Cores on RTX 3070 configurations. This is an error that we are working diligently to correct as soon as possible. We're expediting a resolution through validation and expect to have this resolved as early as mid-June. In the interim, we do not recommend using a vBIOS from another Alienware platform to correct this issue. We apologize for any frustration this has caused."

View at TechPowerUp Main Site
 
Joined
Oct 22, 2014
Messages
14,170 (3.81/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
ALIENWARE: "we do not recommend using a vBIOS from another Alienware platform to correct this issue."

BTARUNR: "Given that Alienware is asking end-users to tamper with video BIOS"

One of you is wrong and I'm pretty sure it isn't Alienware.
 
Joined
Jul 13, 2016
Messages
3,337 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Wouldn't this also be on Nvidia for failing to properly denote differences in GPUs?
 
Joined
Dec 10, 2014
Messages
1,335 (0.36/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | Koss KSC75
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Tiny11 Windows 11 Education 24H2 x64
I'm reading from the reddit threads that Dell/Alienware might be trying to sabotage AMD here. A few weeks ago I saw someone point out that their prebuilts with Ryzen 5000-series CPUs are named R10 or something, while the 11th-gen Intel ones are R12. While innocent minds may think it's just to differentiate between AMD and Intel models, the 10th-gen Intel-powered ones were named R11. Not to mention they actively cripple AMD systems with single-channel memory.

I think it's safe to say (considering Dell and Intel's special relationship in the past) that we're seeing Dell/Alienware with its pants down, ugly backside and all. Notice this latest controversy is also affecting AMD system(s).

Here's the source: Alienware Really Doesn't Want You to Buy an AMD Ryzen PC - ExtremeTech
 
Joined
Jan 14, 2021
Messages
32 (0.02/day)
Location
Australia
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS Crosshair X670E Extreme
Cooling Alphacool Core 1 CPU / Alphacool Core 7900XTX Nitro / Hardware Labs GTS360 & GTS180x2
Memory 2×32GB G.Skill Flare X (6000MHz @ 30-40-40-96)
Video Card(s) Sapphire RX 7900XTX Nitro+
Storage 2TB Kingston KC3000 / 2×4TB Kingston KC3000
Display(s) Samsung Odyssey G8 (3440x1440 175Hz OLED)
Case Fractal Design Torrent
Audio Device(s) HyperX Cloud Alpha Wireless
Power Supply Corsair RMx1200 Shift
Mouse Logitech G Pro Superlight 2
Keyboard Keychron Q6 Max (Gateron Jupiter Red)
ALIENWARE: "we do not recommend using a vBIOS from another Alienware platform to correct this issue."

BTARUNR: "Given that Alienware is asking end-users to tamper with video BIOS"

One of you is wrong and I'm pretty sure it isn't Alienware.
I'm pretty sure btarunr was referring to the fact that end-users will have to flash the vBIOS themselves once Alienware releases the "fixed" vBIOS for the m15 laptops. While flashing a vBIOS is not inherently dangerous, there is always the chance of bricking it, and for the non-technically minded that would be a major PITA.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Wouldn't this also be on Nvidia for failing to properly denote differences in GPUs?
No, because nVidia does properly denote differences between the GPUs. The RTX 3070 Mobile GPU is supposed to have 5120 CUDA cores; nVidia publishes this spec. AFAIK, OEMs are not allowed to sell variations on these specs. The only things OEMs are given to adjust are maximum power consumption (80-120 W for the 3070 Mobile) and boost clock (1290-1640 MHz for the 3070 Mobile). Though the boost clock will be overridden by nVidia's GPU Boost in the driver anyway, as long as thermals and power allow.
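To put that in concrete terms, here's a rough sketch of the constraint as described above; the allowed ranges are the ones quoted in this post, and the TGP and boost values plugged in for the Alienware config are hypothetical, since only its shader count is known from the article.

```python
# Sketch of the OEM constraint as described above: TGP and boost clock are OEM
# knobs within NVIDIA's published window, while the shader count is a fixed spec.
# Ranges are the ones quoted in this post; treat them as illustrative.
RTX_3070_MOBILE_SPEC = {
    "tgp_watts": (80, 120),           # OEM-configurable
    "boost_clock_mhz": (1290, 1640),  # OEM-configurable (GPU Boost overrides it anyway)
    "cuda_cores": (5120, 5120),       # fixed spec, not an OEM knob
}

def check_oem_config(config):
    """Return a list of values that fall outside the published spec window."""
    violations = []
    for key, (low, high) in RTX_3070_MOBILE_SPEC.items():
        value = config.get(key)
        if value is None or not (low <= value <= high):
            violations.append(f"{key}={value} (allowed {low}-{high})")
    return violations

# The shipping Alienware m15 vBIOS per the article; TGP and boost here are hypothetical.
print(check_oem_config({"tgp_watts": 115, "boost_clock_mhz": 1560, "cuda_cores": 4608}))
# -> ['cuda_cores=4608 (allowed 5120-5120)']
```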
Let's see if Nvidia or Dell get sued for this, it is basically the 970 BS all over again :shadedshu:
Except nVidia had nothing to do with this, it's all on Dell.
Not to mention they actively cripple AMD systems with single-channel memory.
They do that shit with Intel systems too, even in desktops where it makes absolutely no freakin' sense. It's like Dell has not learned the concept of dual-channel memory. I mean, I know a single 16GB stick of RAM is $1 or $2 cheaper than two 8GB sticks, but do they have to penny pinch that much? You can buy a $1700 gaming desktop from them with an i7-10700K and it has a single 16GB stick of RAM in it, like WTF?!
 
Last edited:
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
Except nVidia had nothing to do with this, it's all on Dell.
And you base this on? If this wasn't officially approved then Nvidia can sue Dell as well, let's see if they do! They don't exactly have a stellar record of not lying to their customers do they? And before you go the "benefit of doubt" route I don't give any benefits to conniving, lying, scheming, thieving profit grabbing corporations because 9/10 you're proven right :rolleyes:
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
And you base this on? If this wasn't officially approved then Nvidia can sue Dell as well, let's see if they do! They don't exactly have a stellar record of not lying to their customers do they? And before you go the "benefit of doubt" route I don't give any benefits to conniving, lying, scheming, thieving profit grabbing corporations because 9/10 you're proven right :rolleyes:
I base this on the fact that the issue is solved by a VBIOS flash, which puts the source of the problem 100% in the hands of Dell's BIOS team. The chips nVidia sold Dell were correct; Dell's BIOS team just fucked up.

And you are absolutely correct, nVidia could sue Dell. It is likely that Dell violated their contract with nVidia by selling chips that had shaders locked via the VBIOS. However, I can guarantee you nVidia won't sue. It isn't worth losing Dell as a customer over a mistake that is minor in the scheme of things and that Dell is taking the blame for and correcting (at least according to the Alienware department's statement).
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
Right, they didn't fuck up ~ it was a design decision!
The chances of this being unintentional (from Dell) are basically zero; the chances of Dell slipping this past Nvidia are also very close to zero. Like I said, Nvidia can sue Dell for this; let's see if they do.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
The chances of this being unintentional (from Dell) are basically zero
And what do you base this on? The fact is we'll probably never know if it was intentional or not. It certainly wouldn't be the first time a test BIOS/Driver/Software made it out to the public with some of the test parameters still in place. So it is very possible that is the case here. This could easily be a case of a VBIOS engineer just playing around and seeing if it is possible to adjust CUDA core count via VBIOS and forgetting to change things back before compiling the production VBIOS. And as pointed out in the video you posted, there really isn't any good reason to disable CUDA cores when you can just adjust the power target of the GPU to meet the thermals of the laptop design.

It is kind of pointless to assume either way. Mistakes happen and we don't know if it was intentional or not. So we can really only judge them on how they react. I'll say the first response was pretty bullshit on Dell's part, and sounds like a PR guy just trying to come up with an answer. The second response though sounds better and more realistic.

This doesn't change the fact that it was an error at Dell and has nothing to do with nVidia.

the chances of Dell slipping this past Nvidia are also very close to zero. Like I said, Nvidia can sue Dell for this; let's see if they do.
Oh, I guarantee you nVidia already knows about this. But they won't sue, for the reason I posted above. The most they'll probably do is have a strongly worded phone call with some Dell exec telling them to try harder to make sure this doesn't happen again.
 
Last edited:
Joined
Jul 13, 2016
Messages
3,337 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
No, because nVidia does properly denote differences between the GPUs. The RTX 3070 Mobile GPU is supposed to have 5120 CUDA cores; nVidia publishes this spec. AFAIK, OEMs are not allowed to sell variations on these specs. The only things OEMs are given to adjust are maximum power consumption (80-120 W for the 3070 Mobile) and boost clock (1290-1640 MHz for the 3070 Mobile). Though the boost clock will be overridden by nVidia's GPU Boost in the driver anyway, as long as thermals and power allow.

Except nVidia had nothing to do with this, it's all on Dell.

They do that shit with Intel systems too, even in desktops where it makes absolutely no freakin' sense. It's like Dell has not learned the concept of dual-channel memory. I mean, I know a single 16GB stick of RAM is $1 or $2 cheaper than two 8GB sticks, but do they have to penny pinch that much? You can buy a $1700 gaming desktop from them with an i7-10700K and it has a single 16GB stick of RAM in it, like WTF?!

Got ya, thanks for explaining. So this is definitely Dell's fault. That said, I do believe that Dell would not have attempted to do this had there been specific model numbers for the specific SKUs instead of just "RTX 3070" or "RTX 3080". As there's already such a large variance between RTX 3070s on the mobile platform (and you can't even tell whether I'm talking about mobile or not, as there's not even a mobile SKU identifier), they probably figured people would be none the wiser. There's zero transparency in regards to what the customer is getting, and I find that most OEMs do not list graphics card TGP except for some high-end models. Short of actually denoting performance in the model number, they should be required to publish TGP, clocks, and core count. Nvidia is in some part to blame for continuing to make mobile GPU models and performance more opaque.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
That said, I do believe that Dell would not have attempted to do this had there been specific model numbers for the specific SKUs instead of just "RTX 3070" or "RTX 3080".

There are specific model names for the specific SKUs. RTX 3070 is the desktop GPU. The mobile RTX 3070 is actually officially named "RTX 3070 Mobile GPU", and this is what nVidia refers to it as in all of their published spec sheets. That's actually the name of the mobile graphics card, it does have a different name than the desktop version.

I think the issue with the name, again, comes down to the OEMs, as most seem to shorten the name down to just RTX 3070 in their marketing. Which I think is deceptive and nVidia should assert more control over that.
 
Joined
Mar 28, 2020
Messages
1,761 (1.02/day)
Another reason not to consider an overpriced Dell system. Given the response below, it's certainly a deliberate action rather than a mistake. And I certainly don't recall Nvidia marketing their RTX 3070 with 10% fewer cores.

"CUDA core counts per NVIDIA baseline may change for individual OEM, such as ourselves [Dell], to allow to provide a more specific design and performance tuning.
 
Joined
Jul 13, 2016
Messages
3,337 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
There are specific model names for the specific SKUs. RTX 3070 is the desktop GPU. The mobile RTX 3070 is actually officially named "RTX 3070 Mobile GPU", and this is what nVidia refers to it as in all of their published spec sheets. That's actually the name of the mobile graphics card, it does have a different name than the desktop version.

I think the issue with the name, again, comes down to the OEMs, as most seem to shorten the name down to just RTX 3070 in their marketing. Which I think is deceptive and nVidia should assert more control over that.

No no no. "RTX 3070 mobile GPU" is clearly not intended to be a model name. Nvidia likely refers to it like that because it's the only way even they themselves can tell the difference between their mobile version and desktop version. Not a single OEM calls it the RTX 3070 mobile GPU. That's not an OEM issue, that's an Nvidia issue if everyone is doing it. For good reason too, that would be one of the worst model names I've ever seen, especially when Nvidia could simply bring back the M specification. Makes no sense when Nvidia had MUCH better alternatives and dropped them.

IMO there should also be TGP indicators as well. For example:

low TGP 3070 - RTX 3070-M3
mid TGP 3070 - RTX 3070-M5
high TGP 3070 - RTX 3070-M7
highest TGP 3070 - RTX 3070-M9

This really isn't that hard to do and relies on existing nomenclature to give customers an immediate idea of where the performance of each version stands.
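To make it concrete, here's a toy sketch of the idea; the watt break points and the exact M3/M5/M7/M9 tiers are made up for illustration, since NVIDIA has no such scheme today.

```python
# Toy sketch of the suffix idea above. The tiers and watt break points are invented
# purely for illustration; they are not anything NVIDIA actually publishes.
def mobile_model_name(base: str, tgp_watts: int) -> str:
    if tgp_watts < 90:
        tier = "M3"   # low TGP
    elif tgp_watts < 105:
        tier = "M5"   # mid TGP
    elif tgp_watts < 120:
        tier = "M7"   # high TGP
    else:
        tier = "M9"   # highest TGP
    return f"{base}-{tier}"

print(mobile_model_name("RTX 3070", 85))   # -> RTX 3070-M3
print(mobile_model_name("RTX 3070", 125))  # -> RTX 3070-M9
```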
 
Joined
Oct 22, 2014
Messages
14,170 (3.81/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
Another reason not to consider an overpriced Dell system. Given the response below, it's certainly a deliberate action rather than a mistake. And I certainly don't recall Nvidia marketing their RTX 3070 with 10% fewer cores.

"CUDA core counts per NVIDIA baseline may change for individual OEM, such as ourselves [Dell], to allow to provide a more specific design and performance tuning.
Reading between the lines, that's an admission they know their systems are inferior and incapable of handling the full power and heat.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
I just checked W1zzard's review of the desktop RTX 3070 and it was using 220 watts average in gaming just for the FE 3070. That's not even including the CPU and other components in the laptop. That would drain a battery very quickly. Dell probably had to do something to prevent that but they should have been up front about it.
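Rough math on that, assuming something like an 86 Wh battery (a typical capacity for a gaming laptop, picked here only for illustration) and ignoring the CPU, display, and conversion losses:

```python
# Rough battery-drain estimate. 220 W is the desktop RTX 3070 FE gaming average
# mentioned above; the 86 Wh battery is an assumed typical capacity, and CPU,
# display, and conversion losses are ignored, so real runtime would be even lower.
battery_wh = 86
gpu_power_w = 220

runtime_minutes = battery_wh / gpu_power_w * 60
print(f"{runtime_minutes:.0f} minutes")  # ~23 minutes on GPU draw alone
```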
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
No no no. "RTX 3070 mobile GPU" is clearly not intended to be a model name. Nvidia likely refers to it like that because it's the only way even they themselves can tell the difference between their mobile version and desktop version. Not a single OEM calls it the RTX 3070 mobile GPU. That's not an OEM issue, that's an Nvidia issue if everyone is doing it. For good reason too, that would be one of the worst model names I've ever seen, especially when Nvidia could simply bring back the M specification. Makes no sense when Nvidia had MUCH better alternatives and dropped them.
They don't "refer" to it, this is literally the name that shows up in GPU-Z, in nVidia control panel and in every nVidia spec sheet. The name of the GPU is officially "RTX 3070 Mobile GPU", if the OEMs don't advertise it as such, that's on them. But they are probably going with the idea that the consumer should know they are getting the mobile version of the GPU when buying a laptop. I'm not saying that is the right thing to do, I'm just saying that is the idea they are going with and likely enough to get them out of legal trouble. The only time it would be an issue would be if they put the RTX 3070 Mobile GPU in a desktop PC and didn't make the consumer aware.
IMO there should also be TGP indicators as well. For example:

low TGP 3070 - RTX 3070-M3
mid TGP 3070 - RTX 3070-M5
high TGP 3070 - RTX 3070-M7
highest TGP 3070 - RTX 3070-M9

This really isn't that hard to do and relies on existing nomenclature to give customers an immediate idea of where the performance of each version stands.
The thing is, nVidia doesn't set the TGP. They aren't selling multiple SKUs with different TGPs. All nVidia does is set an acceptable range, and the OEM just has to make sure the card lands somewhere in that range. And the performance difference in the range nVidia sets is not that major. The difference between the worst 3070 Mobile and the best in Jarrod's video was about 10%. He only noticed the shader issue because the Alienware was then another 10% slower than even the worst tested 3070 Mobile laptop.

The other issue is that the TGP still doesn't really matter that much in practice. You'll see in Jarrod's video that 3070 Mobiles with lower TGPs are outperforming some 3070 Mobiles with higher TGPs. Why? Because they are all still likely thermal throttling at some point, which kind of throws the whole point of a TGP out the window. IMO, nVidia should just set a solid TGP for their mobile GPUs. If the laptop OEMs can't build a laptop to keep them cool, that's on them. The reviews will tell you about the throttling before you buy the laptop. That's how it was before, and I don't see any reason we had to change it and add this sliding TGP scale in the first place.
 
Joined
Feb 1, 2013
Messages
1,270 (0.29/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Maybe someone was trying to hide 10% performance to be used secretly, for personal gains... one of the most prevalent reasons to not use/buy prebuilts, unless you gut it and reload the OS/drivers from scratch.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
one of the most prevalent reasons to not use/buy prebuilts
This is a laptop, so DIY building isn't really an option.
unless you gut it and reload the OS/drivers from scratch
Since this wouldn't affect this issue in any way, I don't see why it is relevant in this thread. Please explain how this applies to this thread.
 
Joined
Jan 25, 2019
Messages
44 (0.02/day)
Dell will either correct this through a VBIOS or they will not/can't. If they don't/can't, the knives will come out.
 
Joined
Jul 13, 2016
Messages
3,337 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
They don't "refer" to it, this is literally the name that shows up in GPU-Z, in nVidia control panel and in every nVidia spec sheet. The name of the GPU is officially "RTX 3070 Mobile GPU", if the OEMs don't advertise it as such, that's on them. But they are probably going with the idea that the consumer should know they are getting the mobile version of the GPU when buying a laptop. I'm not saying that is the right thing to do, I'm just saying that is the idea they are going with and likely enough to get them out of legal trouble. The only time it would be an issue would be if they put the RTX 3070 Mobile GPU in a desktop PC and didn't make the consumer aware.
lol what are they supposed to put in there "RTX 3070"? That's precisely my point, it's extremely confusing. The fact that you have to add words after the model name to even tell the difference between the two says enough. Mind you the model there is still RTX 3070. What it's officially called and model name are two separate things.

The thing is, nVidia doesn't set the TGP. They aren't selling multiple SKUs with different TGPs. All nVidia does is set an acceptable range, and the OEM just has to make sure the card lands somewhere in that range. And the performance difference in the range nVidia sets is not that major. The difference between the worst 3070 Mobile and the best in Jarrod's video was about 10%. He only noticed the shader issue because the Alienware was then another 10% slower than even the worst tested 3070 Mobile laptop.

They do:


Both AMD and Intel do this, it is not surprising.

The other issue is that the TGP still doesn't really matter that much in practice. You'll see in Jarrod's video that 3070 Mobiles with lower TGPs are outperforming some 3070 Mobiles with higher TGPs. Why? Because they are all still likely thermal throttling at some point, which kind of throws the whole point of a TGP out the window. IMO, nVidia should just set a solid TGP for their mobile GPUs. If the laptop OEMs can't build a laptop to keep them cool, that's on them. The reviews will tell you about the throttling before you buy the laptop. That's how it was before, and I don't see any reason we had to change it and add this sliding TGP scale in the first place.

On average, higher TGP SKUs are faster. There's always going to be OEMs that do a poor job at cooling, but that doesn't change the fact that higher TGP parts will perform better than a lower TGP SKU when not thermally and electrically limited.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
lol what are they supposed to put in there "RTX 3070"? That's precisely my point, it's extremely confusing. The fact that you have to add words after the model name to even tell the difference between the two says enough. Mind you the model there is still RTX 3070. What it's officially called and model name are two separate things.
So you don't consider the RTX 3080 and RTX 3080 Ti as different models? Or the RX 6800 and RX 6800 XT as different models? Or are you willing to admit when you add things to the Model Name, they are distinguished as different models?

The model name of the GPU we are talking about is RTX 3070 Mobile GPU. That is the model name; it is a different model than the RTX 3070 and has different specs.

They do:

Both AMD and Intel do this, it is not surprising.


On average, higher TGP SKUs are faster. There's always going to be OEMs that do a poor job at cooling, but that doesn't change the fact that higher TGP parts will perform better than a lower TGP SKU when not thermally and electrically limited.
No, nVidia does not set the TGP. They provide a TGP range that the OEM is then allowed to set the TGP within. But nVidia doesn't have "Low, Mid, High" TGP numbers like you suggest.
 
Joined
Aug 20, 2007
Messages
21,545 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
ALIENWARE: "we do not recommend using a vBIOS from another Alienware platform to correct this issue."

BTARUNR: "Given that Alienware is asking end-users to tamper with video BIOS"

One of you is wrong and I'm pretty sure it isn't Alienware.
No, they are advising users to wait to tamper with the bios until they have an official patch executable.

Neither is wrong, technically.
 