System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this though.... |
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
> 40-series cards seem to have stupid high wattages, as do modern Intel and AM5 CPUs

The 4090 uses less power than a 3090 Ti, and Nvidia have designed the 40-series to have less severe power spikes beyond the power limit:
A 3090Ti is a 450W card, but spikes into the 600W range.
A 4090FE is a 450W card, and it never spikes above 500W.
Essentially, by fixing that behaviour and using it as a marketing point in their Lovelace launch events, Nvidia have confirmed that their power regulation on Ampere was awful, and that it's their own fault that 1000W PSUs weren't strong enough for Ampere's power spikes, bluescreening systems running their 320W and 450W cards.
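To put rough numbers on those spike figures, here's a quick back-of-envelope sketch. The 450W/600W/500W values are the ones quoted above; the rest-of-system transient and the "keep worst-case transients under ~80% of the PSU rating" rule are my own assumptions for illustration, not measured values or any vendor's guidance.

```python
# Back-of-envelope: how far each card spikes past its power limit, and what
# PSU size comfortably absorbs the worst-case transient.
# Card wattages are the figures quoted in this post; the system-transient
# and headroom numbers below are assumptions for illustration only.

cards = {
    "3090 Ti": {"rated_w": 450, "spike_w": 600},
    "4090 FE": {"rated_w": 450, "spike_w": 500},
}

CPU_AND_SYSTEM_SPIKE_W = 300   # assumed worst-case CPU + rest-of-system transient
HEADROOM = 0.8                 # assumed rule of thumb: transients <= 80% of PSU rating

for name, c in cards.items():
    overshoot = c["spike_w"] / c["rated_w"] - 1
    worst_case_w = c["spike_w"] + CPU_AND_SYSTEM_SPIKE_W
    psu_needed_w = worst_case_w / HEADROOM
    print(f"{name}: spikes {overshoot:.0%} past its {c['rated_w']}W limit; "
          f"worst-case transient ~{worst_case_w}W -> comfortable PSU ~{psu_needed_w:.0f}W")
```

Under those assumptions the Ampere card lands north of a 1000W unit while the 4090 stays at or under it, which is the same conclusion as above, just with the arithmetic written out.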
AMD have a similar problem with the 6900XT, so I'm not just bashing Nvidia here, but we're talking about the 40-series as per your comment.