Tuesday, February 18th 2020
UK Prepares $1.6 Billion for the Most Powerful Weather Forecasting Supercomputer
The UK government has set aside a budget of £1.2 billion, roughly $1.56 billion. With this budget, it plans to install the world's most powerful weather-forecasting supercomputer in 2022. The Met Office currently relies on three Cray XC40 supercomputers capable of a combined peak of about 14 PetaFLOPs. The future system is set to dwarf that number: with plans to make it 20 times more powerful than the current machines, we can estimate that the new supercomputer will deliver around 280 PetaFLOPs of computing performance.
The supercomputer deployment will follow a series of cycles: the first system, arriving in 2022, will be six times more powerful than the current solution. To reach the full 20-times improvement, the machine will be upgraded over the following five years. While we do not know what will power the new machine, a CPU-plus-multi-GPU node configuration seems very likely, as GPUs have gained a lot of traction in weather-prediction models lately.
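As a quick sanity check on the figures above, the two deployment phases can be worked out from the quoted ~14 PetaFLOPs baseline (the multipliers are the ones stated in the article; the exact peak numbers are estimates, not confirmed specifications):

```python
# Rough sanity check of the performance figures quoted above.
# Assumed baseline: the current three Cray XC40 systems peak at ~14 PFLOPs.
current_pflops = 14.0

phase1 = current_pflops * 6   # 2022 system: ~6x the current machines
final = current_pflops * 20   # after the five-year upgrade cycle: ~20x

print(f"2022 system: ~{phase1:.0f} PFLOPs")    # ~84 PFLOPs
print(f"Upgraded system: ~{final:.0f} PFLOPs")  # ~280 PFLOPs
```

The 20x figure is what puts the upgraded system in the ~280 PetaFLOPs range cited above.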
43 Comments on UK Prepares $1.6 Billion for the Most Powerful Weather Forecasting Supercomputer
www.tomshardware.com/uk/news/amd-epyc-7742-cpus-tapped-for-european-weather-predicting-supercomputer
Check back in roughly 8-12 years when it's complete / finally cancelled for the total cost.
nothing else.
10 Xeons = 15 EPYC and 100+ more cores = budget build. :p
I just wish Nvidia would stop dressing up Tensor cores as a gaming feature that requires developers to waste time on half-baked DLSS supersampling instead of adding content and/or polish to a new release.
As for forecast quality:
Local forecasts (i.e. what you check to know if it's going to rain tomorrow) are very precise for 2-3 days ahead. Beyond that they degrade quickly, and that is not a computational limit - weather is just too chaotic.
Large-area forecasts (i.e. what will happen in your province/country as a whole - large pressure zones, hurricanes etc.) can be modeled even a few weeks ahead.

There is nothing better. It's a cluster. You make it as big as you need.
What you actually mean is: cheaper. Based on EPYC it may be cheaper.
What you forget is that even if EPYCs offer better value, AMD's supply is more limited.
So not every supercomputer can be built on AMD silicon. This is one of the fundamental factors behind current Intel pricing (in all segments): 7nm capacity is still relatively limited.
AMD CPUs offer better value at the moment, i.e. you need fewer sockets to deliver the required cluster performance.
And of course there's a good chance this system can benefit from AVX-512. It really is a big selling point for Intel.

First of all: the speeds you've mentioned are for home use, not for datacenters...
Second: this supercomputer will likely be kept offline, i.e. no direct Internet access (it isn't needed).
Assume that the power and cooling costs of a supercomputer cluster are 10x higher than the Internet connectivity costs, and you'll be in the right ballpark at least.
It's PDEs after all. I mean: there are many lines of code, but the idea is quite simple.
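To illustrate how simple the core idea is, here is a toy sketch of the kind of PDE these models solve: the 1D linear advection equation du/dt + c * du/dx = 0, stepped with a first-order upwind finite-difference scheme. This is an illustration of the numerical-PDE idea only, not anything resembling a real atmospheric model:

```python
# Toy sketch of the PDE idea behind weather models: 1D linear advection
# (a quantity u carried along by a constant "wind" c), solved with a
# first-order upwind scheme on a periodic domain.

def advect(u, c, dx, dt, steps):
    """Advance the field u by `steps` upwind time steps (periodic domain)."""
    u = list(u)
    r = c * dt / dx          # Courant number; must satisfy 0 < r <= 1
    assert 0 < r <= 1, "unstable time step"
    for _ in range(steps):
        # u_new[i] = u[i] - r * (u[i] - u[i-1]); u[-1] wraps around (periodic)
        u = [u[i] - r * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

# A square pulse drifting to the right with the wind.
u0 = [1.0 if 10 <= i < 20 else 0.0 for i in range(100)]
u1 = advect(u0, c=1.0, dx=1.0, dt=1.0, steps=30)
```

Real models do the same thing in 3D with many coupled fields (wind, pressure, temperature, humidity) and far more sophisticated numerics, which is where the PetaFLOPs go.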
Some fairly decent models / libraries are free and open-source. COAMPS is very popular:
www.nrlmry.navy.mil/coamps-web/web/home
Met Office uses a model called Unified Model. AFAIK it's not open-source, but it's fairly popular as well (I worked with it at university in Poland).
As in many other problems in physics: it's actually the input data that makes this difficult - not the actual model complexity.
1) you're mostly using input from surface stations, but you're actually modeling the atmosphere in 3D
2) the number of stations is limited. In the case of the UK and the Met Office: only ~270 stations report in real time or hourly
www.metoffice.gov.uk/research/climate/maps-and-data/uk-synoptic-and-climate-stations
3) no matter how small your country is, you have to model the whole atmosphere anyway :)
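A tiny sketch of why sparse stations are the bottleneck: to get a field on a model grid you have to spread a handful of point observations over it somehow, e.g. with inverse-distance weighting (the station positions and values below are made up for the example; real systems use far more sophisticated data assimilation):

```python
# Toy illustration of the sparse-observation problem: estimate a value
# anywhere on a map from a handful of stations using inverse-distance
# weighting (IDW). Stations are (position, observed value) pairs.

stations = [((2.0, 3.0), 10.5), ((8.0, 1.0), 12.0), ((5.0, 9.0), 9.0)]

def idw(x, y, obs, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (pos, value) pairs."""
    num = den = 0.0
    for (sx, sy), v in obs:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return v                     # exactly at a station
        w = 1.0 / d2 ** (power / 2.0)    # weight ~ 1 / distance^power
        num += w * v
        den += w
    return num / den

estimate = idw(5.0, 5.0, stations)  # somewhere between the three stations
```

With ~270 stations covering a whole country, everything between the stations is an estimate like this one - more compute doesn't fix that, more data does.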
This means you're forced to use third-party data for the rest of the planet. Of course national institutes cooperate (and some data is just free), but it means you have very little influence on data quality and data-point density.

I'm not sure where you're going with this. Numerical weather forecasting has been around for decades. It works.
We know equations that have proven to be very accurate. We come up with something better from time to time. It's not magic.
As for satellite imaging: you can't even fully determine the current conditions from it, which means you can't base any kind of forecast on it.
Sure, satellites can be used as an additional source, but there's very little data that can be extracted.