Thursday, October 15th 2020
Silverstone Intros VIVA Line of Mainstream PSUs
Silverstone today introduced the VIVA line of mainstream PSUs. Available in mid-range capacities of 550 W, 650 W, and 750 W, these PSUs offer fixed cabling and cover all the essentials for a gaming PC build. Under the hood, the VIVA series features a single +12 V rail design, 80 Plus Bronze efficiency, active PFC, and the most common electrical protections: over/under-voltage, overload, and short-circuit. All three models offer flat black cables. A noise-optimized 120 mm fan with a minimum noise output of 18 dBA cools the PSU. All three models come with two 4+4 pin EPS connectors. The 750 W model includes four 6+2 pin PCIe power connectors, while the 550 W and 650 W models have two. The VIVA series PSUs are built in a 14 cm-long chassis. The company didn't reveal pricing or warranty information.
28 Comments on Silverstone Intros VIVA Line of Mainstream PSUs
If it's not modular, it's not worth my time or moolah, hehehe :)
And with today's increasingly power-hungry parts (i.e. 10th & 11th gen Intel CPUs and the RTX 3xxx cards), who in their right mind would try to build a "gaming" rig with only a 750 W PSU anyway?
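For a rough sense of the numbers, here is a back-of-the-envelope power budget; the component wattages below are assumptions for a hypothetical 10th-gen i9 + RTX 3080 build, not measured figures:

```python
# Back-of-the-envelope power budget for a hypothetical high-end 2020 build.
# All wattages are rough assumptions, not measurements.
components = {
    "CPU (e.g. Core i9-10900K under heavy load)": 250,
    "GPU (e.g. RTX 3080 board power, plus spikes)": 320,
    "Motherboard, RAM, SSDs, fans": 100,
}

total_draw_w = sum(components.values())   # worst-case simultaneous draw
headroom = 0.80                           # keep sustained load at or below ~80% of rating

for psu_rating_w in (550, 650, 750, 850):
    ok = total_draw_w <= psu_rating_w * headroom
    print(f"{psu_rating_w} W PSU: estimated load {total_draw_w} W "
          f"({total_draw_w / psu_rating_w:.0%} of rating) -> {'comfortable' if ok else 'tight'}")
```

On those assumptions a 750 W unit ends up near 90% of its rating under a worst-case combined load, which is why opinions differ on whether that leaves enough headroom for the newest parts.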
Pick any two.
Realistically, if the quality is good enough, that will be fine.
It sure makes a change from all the ridiculously expensive 1500W PSUs being launched at exactly the time Nvidia are officially killing SLI for consumer use.
Some of these comments are good for a strong laugh.
Thanks!
The industry has shifted focus towards technologies that are fundamentally incompatible with, or unable to take advantage of, additional GPUs - variable refresh, lag reduction, screen-space compute shaders covering occlusion, reflections, etc. It's also absolutely terrible for VR, where multiple latency-sensitive viewports are rendered extremely efficiently by sharing very similar data and calculations, with both viewports passing through the pipeline and very little overhead for the second viewport on the same GPU. Multi-GPU is the opposite of that: it generates a single viewport using duplicated data sets across duplicate GPUs and converges the result into a single, heavily buffered stream, hiding the added variance behind extra buffering that results in higher latency. It's inefficient and expensive not just to duplicate all that data, but to work on it separately, when there are clear benefits to keeping everything in a single unified dataset. This is why single large caches are always better than multiple smaller caches.
Journalism has also evolved away from focusing solely on average FPS and now looks at a more complete picture of gameplay fluidity - 99th-percentile FPS, frame delivery times, minimum FPS, and how smooth the frame-time graphs are - and this more complete picture of game performance exposes all the problems with multi-GPU rendering. Average FPS aside, pretty much every other metric comes out worse, which is hard, objective evidence that multi-GPU doesn't feel great even when the FPS numbers are higher.
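As a rough illustration of those metrics, here is a minimal sketch (not any particular outlet's methodology) that turns a list of per-frame render times in milliseconds into average FPS and a 99th-percentile figure; the sample frame times are made up:

```python
import math

# Minimal sketch: frame-time metrics from a list of per-frame render times (ms).
# The sample data are invented; real reviews capture thousands of frames per run.
frame_times_ms = [16.7, 16.9, 16.5, 17.1, 16.8, 33.4, 16.6, 16.7, 45.0, 16.8]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 99th-percentile frame time (nearest-rank): 99% of frames take no longer than this.
# Its reciprocal is what reviews often report as "99th percentile" or "1% low" FPS.
ordered = sorted(frame_times_ms)
rank = min(len(ordered), math.ceil(0.99 * len(ordered)))
p99_ms = ordered[rank - 1]

print(f"average FPS:          {avg_fps:.1f}")
print(f"99th pct frame time:  {p99_ms:.1f} ms  ->  {1000.0 / p99_ms:.1f} FPS")
```

In this toy data set the occasional long frame drags the 99th-percentile figure far below the average, which is the sort of gap reviewers point to when they say multi-GPU doesn't feel as smooth as its average FPS suggests.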
Finally, there are the other, smaller reasons that have always existed (poor performance scaling, needless doubling of expensive things like GDDR6 or HBM2, the fact that you need a huge case with good cooling and a beefy power supply) and new ones: both next-gen GPU families support direct storage access over PCIe, and trying to get that to work across two cards will, at the very best, double the amount of data read and effectively halve the PCIe bandwidth and storage performance, all while multi-GPU is itself reducing that available bandwidth by stealing it for the GPU-to-GPU data feed. Not to mention the newest reason of all: you need a pair of $1,500 cards to even try to get the damn thing working at the moment. Back in the heyday of multi-GPU you could get half-decent results with most of the cards on the market, starting from about $150 per card.
There is no scenario in which gaming adopts multi-GPU again. It has died, and both the devs and the GPU driver teams are breathing a collective sigh of relief that it's almost gone for good. It doesn't work for consoles, it doesn't work for laptops, it's a nightmare best avoided in any kind of cross-platform development, and Nvidia no longer offer it on anything below their $1,500 flagship.
Anyway, looks like I might keep these in mind for replacing my current PSU in case I need more power for the 2680v2.
Therefore it's useless to use the 80 Plus certification to judge whether a PSU's price is fair.
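To put a rough number on it: the running-cost gap between efficiency tiers is small at typical loads. A quick sketch, with assumed efficiency figures (roughly Bronze vs Gold at 50% load) and an assumed usage pattern and electricity rate:

```python
# Rough sketch of what an efficiency-tier difference costs at the wall.
# Efficiency figures approximate 80 Plus Bronze vs Gold at 50% load (115 V);
# the load, yearly hours, and electricity rate are assumptions.
dc_load_w = 300        # sustained DC load while gaming
hours_per_year = 1000  # assumed gaming hours per year
rate_per_kwh = 0.15    # assumed USD per kWh

for tier, efficiency in (("Bronze", 0.85), ("Gold", 0.90)):
    wall_w = dc_load_w / efficiency
    yearly_cost = wall_w / 1000.0 * hours_per_year * rate_per_kwh
    print(f"{tier}: ~{wall_w:.0f} W at the wall, ~${yearly_cost:.2f} per year")
```

On those assumptions the gap works out to a few dollars a year, so the badge alone doesn't tell you whether a particular unit is worth what it costs.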
I'm still a little unsure of why full modular is so popular though. I get that some people with lots of money and vanity builds want to replace every visible cable with some custom-braided, colour-coded replacements - but that's a tiny, insignificant niche.
For the remaining 99.9%, I don't understand why you'd want a modular 24-pin cable and 4+4 CPU power cable. They're mandatory, and for any PSU over 300 W, at least one of the PCIe power cables is too. Making them modular just increases the cost and adds needless bulk where the cables exit the PSU, potentially making the PSU body longer than it would otherwise need to be.