Friday, June 24th 2022

MSI MEG Ai1300P and Ai1000P Power Supplies Leaked, Optimized for GPU Power Spikes

MSI is readying a line of high-end "smart" power supplies under the MEG Ai P-series. Some of the most relevant slides from its launch presentation were leaked to the web, courtesy of g01d3nm4ng0. The PSUs feature ATX 3.0 and PCI-Express Gen 5.0 readiness, including native PCI-Express 16-pin 12VHPWR connectors capable of delivering 600 W of power. The lineup includes two models: the Ai1300P (1300 W) and the Ai1000P (1000 W).

A key aspect of these PSUs' ATX 3.0 compliance is their ability to handle power excursions (spikes in load) from GPUs lasting between 1 and 10 milliseconds. Specifically, these PSUs can handle GPU power excursions of up to 3x the GPU's nameplate power (e.g., a GPU with 450 W typical power pulling 1350 W spikes). In some older PSUs, such excursions are known to trigger the overload protection and shut down the system. The MSI MEG Ai P-series PSUs can handle excursions of up to 2x the PSU's nameplate capacity (e.g., 2600 W for the 1300 W model) and 3x the rating of the 16-pin connector (i.e., 1800 W). These spikes typically have an extremely short duration, so they don't threaten the integrity of the circuits.
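The excursion limits described above can be summed up with some simple arithmetic. The sketch below is purely illustrative (the function name and structure are our own, not anything from MSI or the ATX 3.0 document); it just computes the three headroom figures the slides cite:

```python
# Illustrative check of the excursion limits described above (not an official
# ATX 3.0 formula): the PSU is said to tolerate short spikes of up to 3x a
# GPU's rated power, 2x the PSU's nameplate capacity, and 3x the 600 W limit
# of the 16-pin 12VHPWR connector.

CONNECTOR_WATTS = 600  # 16-pin 12VHPWR rated power

def excursion_headroom(psu_watts: int, gpu_watts: int) -> dict:
    """Return the peak transient power each limit allows, in watts."""
    return {
        "gpu_limit": 3 * gpu_watts,            # 3x GPU nameplate power
        "psu_limit": 2 * psu_watts,            # 2x PSU nameplate capacity
        "connector_limit": 3 * CONNECTOR_WATTS # 3x the connector's 600 W
    }

# Example from the article: a 450 W GPU on the 1300 W Ai1300P.
limits = excursion_headroom(psu_watts=1300, gpu_watts=450)
print(limits)  # {'gpu_limit': 1350, 'psu_limit': 2600, 'connector_limit': 1800}
```

These match the figures in the slides: 1350 W spikes from a 450 W GPU, 2600 W total for the 1300 W model, and 1800 W through the 16-pin connector.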
Another interesting feature of the MSI MEG Ai P-series is the ability for the user to monitor the various runtime electrical and thermal parameters of the PSU using the Gamer Intelligence app. The PSU interfaces with software via a USB 2.0 connection (internal header). The app lets you monitor load on the various voltage rails, real-time switching efficiency, fan speed, etc. There's no word on when MSI plans to launch these.
Sources: g01d3nm4ng0 (Twitter), VideoCardz

28 Comments on MSI MEG Ai1300P and Ai1000P Power Supplies Leaked, Optimized for GPU Power Spikes

#1
RH92
For those who don't understand the power excursion slide :

#3
DeathtoGnomes
P4-630: Warranty?
Its a leak, maybe the next one will. :D
#4
LFaWolf
I hope TPU get a sample to test it out. Sounds quite interesting
#5
maxfly
The most important question...who builds em? If they are built by a reputable oem they could hit the ground running.
#6
ZoneDymo
would it not be amazing if PSU manufacturers would come together and just say "no" to these ridiculous requirements?
#7
DoLlyBirD
Efficiency this and that but you're doubling power requirements every generation, fuck your 600w GPU's give me a 200w that will smoke the last gen by 2x at the same power draw FFS is it that hard?
#8
mechtech
"The PSUs feature ATX 3.0 and PCI-Express Gen 5.0 readiness,"

Does not mean it's fully compliant with (or exceeds) the ATX 3.0 specification.
#9
sector-z
DoLlyBirD: Efficiency this and that but you're doubling power requirements every generation, fuck your 600w GPU's give me a 200w that will smoke the last gen by 2x at the same power draw FFS is it that hard?
Wtf?? Just buy a 1600 W like many do and you're good for many years. I had a customer who bought the Silverstone 1500 W 80+ Silver back when it released, in the i7 first-gen days (i7 920), and he never needed to upgrade.
#10
crmaris
LFaWolf: I hope TPU get a sample to test it out. Sounds quite interesting
We will do :)
#11
chinobis
crmaris: We will do :)
Fellow compatriot, I've been meaning to bug you about making a round-up of PSUs with hardware monitoring capabilities. I went down a rabbit hole to find out why this cool feature is seemingly only supported by a few old PSUs, and found it comes down to a combination of wildly inaccurate readings and/or crappy software, so please dig a bit deeper into this aspect when you review MSI's new power supply. Thank you for the fantastic work you've done over the years.

PS My current Antec 750w psu is almost 12 years old, still working 24/7, still noiseless.
#12
Rx771
It's almost the same as this. There's also the weaker BQ 1500 W platform.
#13
chrcoluk
RH92: For those who don't understand the power excursion slide :

One of his better videos. It confirms what I thought: a lower power budget likely lowers the transient spike amount, which was my point when I mentioned in another thread that a temporary measure can be to reduce the power budget on a GPU (or reduce it via other means, like a less demanding game configuration). The phenomenon is not new to Ampere either; it was on the 1080 Ti he tested.
#14
HisDivineOrder
Can't wait till the power excursion limit goes above what the wall can provide and people can't understand what's happening.

Or we could all just say no to absurd GPU's.
#15
OC-Ghost
Blocking the whole internal USB connector while only using half :mad:

I'm probably not the only one who has run into issues with that stupidity while needing USB headers for liquid cooling and the case
#16
80251
Do these power excursions affect crypto-miners or is it only a gaming phenomenon? These PSU's look like they're going to be expensive.
#17
Frick
Fishfaced Nincompoop
"Power excursions".:rolleyes:
80251: Do these power excursions affect crypto-miners or is it only a gaming phenomenon? These PSU's look like they're going to be expensive.
It's a thing with variable loads, so not with mining.
#18
HenrySomeone
DoLlyBirD: Efficiency this and that but you're doubling power requirements every generation, fuck your 600w GPU's give me a 200w that will smoke the last gen by 2x at the same power draw FFS is it that hard?
It is yes, because the process nodes are not advancing at a rate that they used to ... and it's only going to get worse unfortunately...
#19
mama
So how important will the 12 pin connector be... really? What changed to 16 pin now?
Posted on Reply
#20
ARF
HenrySomeone: It is yes, because the process nodes are not advancing at a rate that they used to ... and it's only going to get worse unfortunately...
I think it will become better. Because the users will not need to buy new products so often.
And you can always undervolt - actually that will be the best practice from now on - limiting the crazy wattage.
#21
DoLlyBirD
HenrySomeone: It is yes, because the process nodes are not advancing at a rate that they used to ... and it's only going to get worse unfortunately...
Makes no sense. Yes, nodes aren't advancing at the same rate, but every new node still brings at least 30% higher performance or transistor density. Add generational architectural improvements on top of that, and there is no reason for TDP to keep rising. 500 W cards for gaming are an abomination IMO; we have stagnated, if not gone backwards, compared to 5-10 years ago.
#22
Mussels
Freshwater Moderator
We're gunna see PCI-E cables with capacitors on the end, arent we
mama: So how important will the 12 pin connector be... really? What changed to 16 pin now?
same as the 12 pin, but with voltage sensing rails to help the PSU compensate for these transient loads faster
#23
jonnyGURU
Except it doesn't actually pass the ATX 3.0 test plan put in place by Intel. The platform is the CWT CST and it fails.

You can't just slap a 12VHPWR connector on a PSU (which isn't even necessary since the two "sense" wires just need to terminate to ground) and say it's ATX 3.0 compliant.
#24
Ferd
jonnyGURU: Except it doesn't actually pass the ATX 3.0 test plan put in place by Intel. The platform is the CWT CST and it fails.

You can't just slap a 12VHPWR connector on a PSU (which isn't even necessary since the two "sense" wires just need to terminate to ground) and say it's ATX 3.0 compliant.
Marketing department "super powers"
#25
HenrySomeone
ARF: I think it will become better. Because the users will not need to buy new products so often.
And you can always undervolt - actually that will be the best practice from now on - limiting the crazy wattage.
It's already better if you're satisfied with either maxed-out 1080p 60-75 Hz or casual 1080p 144 Hz gaming. The 3060, 6600 XT and 6600 all do that at very reasonable power usage (especially the last one, which would be almost excellent from a technical standpoint if not for the dumb x8 limitation; of course, they are all still priced way too high). However, for higher resolutions and/or refresh rates, especially 4K (and the slowly emerging 8K), it won't get better anytime soon. The processing power required to drive those at high settings and respectable framerates in newer AAA titles will require "gas guzzling" cards for years, if not decades, to come.