The Next Generation Modelling Systems Programme
Designing modelling systems for architectures that don't yet exist.
Why we need next generation modelling systems
The power of supercomputers underpins our ability to deliver ever more accurate and useful weather forecasts and climate predictions. That power has increased exponentially since the advent of electronic computers in the 1950s. However, fundamental physical and engineering constraints mean that continuing that increase will require not only a huge further increase in the number of computing processors but also a proliferation of new processor types, and in particular supercomputer architectures that employ a mix of processor types (heterogeneous architectures).
Exploiting the power of these new architectures requires changes to the design of weather and climate prediction systems – both the algorithms that emulate the physics of the atmosphere and oceans and the software infrastructures in which those algorithms are encoded.
Designing modelling systems for architectures that do not yet exist is clearly a challenge! It is essential, therefore, that the new systems are as flexible as possible and portable to different architectures without the software needing to be completely rewritten for each and every processor type. Fortunately, help is at hand. Modern software design and practices offer the exciting prospect of applying the principle of separation of concerns, which separates hardware-specific aspects from science-specific aspects (see later for a little more detail). Developing and implementing this approach is itself a major challenge, requiring co-design between computational scientists, software engineers, and weather and climate scientists. But the rewards will be rich: more flexible and portable systems will be more usable and more efficient, freeing scientists from having to understand the complexities of specific computer architectures.
Aims of the programme
The Met Office has therefore established the Next Generation Modelling Systems Programme to reformulate and redesign the Met Office’s complete weather and climate research and operational/production systems, including oceans and the environment, to allow the Met Office and its partners to fully exploit future generations of supercomputer for the benefits of society. Recognising the importance of this endeavour the Next Generation Modelling Systems Programme is one of the Met Office’s Corporate Strategic Actions; it is also one of the themes of its Research and Innovation Strategy for the coming decade.
The programme is targeting implementation of the Next Generation Modelling Systems for operational Numerical Weather Prediction (NWP) towards the end of life of the Met Office’s new supercomputer, ahead of porting to its replacement system, which is anticipated in 2027.
Improving the accuracy of weather forecasts and climate predictions
Improved accuracy of weather forecasts and climate predictions will be achieved by harnessing greater compute resources to increase the granularity with which the atmosphere and oceans are represented by the computer models, improve the accuracy of the representation of key physical processes, better represent the uncertainty in forecasts, and capture more extreme events further ahead in time. Additionally, by employing modern software practices in its redesign, the Next Generation Modelling Systems Programme will improve the usability, robustness, and flexibility of those modelling systems. This will improve the whole user experience and enable us to build new, exciting partnerships to deliver more science, more quickly.
The work of the Programme is organised under a series of projects which cover all aspects from the processing of observations, their ingestion into the modelling systems, through the various modelling systems needed for a complete Earth system simulation, to the visualisation and verification of the output of those systems. Specifically:
Next Generation Observations Processing System (NG-OPS)
This project will create a replacement for the current Observations Processing System (OPS) used by the global and regional (UKV) atmospheric data assimilation systems and marine systems. This entails a redesign of the OPS to increase flexibility, scalability, portability and efficiency, in order to exploit future generations of supercomputers while retaining important processing properties of the current system (e.g. retrieval of cloud and surface information including skin temperature and emissivity, quality control over a profile, and the ability to add observations in research mode). To achieve this, the project has adopted the JEDI code framework.
Data Assimilation (NG-DA)
This project will deliver a flexible, modular Data Assimilation system for the Next Generation Modelling Systems that can interface with the LFRic software. There will be an emphasis on ensemble Data Assimilation systems, and a particular challenge is to maintain continuity of operational NWP during the transition to the Next Generation Modelling Systems. Like the Observations Processing System project, this project has adopted the JEDI code framework.
GungHo Atmospheric Science
The remit of the GungHo Atmospheric Science Project (GHASP) is to develop science code, within the LFRic infrastructure, that will form the basis of the atmospheric component of a global and regional forecast and climate model. This entails development of the GungHo dynamical core formulation and code implementation, as well as porting of the physical parametrisation schemes from the Unified Model (UM).
A key element of GungHo is the change from a latitude-longitude based array of points, with its clustering of points near the poles (the ‘pole problem’ or ‘polar singularity’), to a cubed-sphere based array of points which are more uniformly distributed.
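The clustering near the poles is easy to quantify: on a uniform latitude-longitude mesh the north-south spacing is constant, but the east-west spacing shrinks with the cosine of latitude. A back-of-the-envelope sketch (illustrative spacings only, not an actual model grid):

```python
import math

R_EARTH_KM = 6371.0  # mean Earth radius

def zonal_spacing_km(lat_deg: float, dlon_deg: float = 1.0) -> float:
    """Physical east-west distance between adjacent grid points
    on a uniform latitude-longitude mesh, at a given latitude."""
    return math.radians(dlon_deg) * R_EARTH_KM * math.cos(math.radians(lat_deg))

# On a 1-degree mesh, points crowd together towards the poles,
# which forces very short timesteps or special polar treatment:
for lat in (0.0, 60.0, 85.0, 89.9):
    print(f"lat {lat:5.1f}: {zonal_spacing_km(lat):8.3f} km")
```

A cubed-sphere mesh avoids this collapse in spacing, which is why GungHo adopts it.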
LFRic & LFRic Inputs
The LFRic project will deliver a comprehensive software infrastructure for running the next generation atmosphere model comprising GungHo dynamics and UM physics running on a global cubed-sphere or regional latitude-longitude mesh. The LFRic atmosphere model needs to be scalable and easily portable between future diverse supercomputer architectures. To this end, the design imposes a separation of concerns between the science code, which implements the scientific algorithms, and the parallel code, which executes the science code using a number of parallel computations.
A tool called PSyclone is being developed to generate and optimise the parallel code for a range of different compute architectures. The project works very closely with the GHASP project which is delivering both the GungHo and UM science codes into the LFRic atmosphere model. The LFRic Inputs Project has been created to develop the tools to manipulate the files generated and used by LFRic as part of the flow of data through the overall modelling system.
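The separation of concerns can be illustrated with a deliberately simplified sketch. This is not PSyclone's actual API and the kernel is a toy calculation; it only shows the division of labour the design imposes, with the science kernel unaware of mesh traversal and parallelism:

```python
# Illustrative sketch of the separation-of-concerns idea (hypothetical
# names, not PSyclone's real interface): the science kernel operates on
# one vertical column; the generated "PSy layer" owns the looping and
# parallelisation strategy for the target architecture.

from concurrent.futures import ThreadPoolExecutor

def increment_kernel(column, dt):
    """Science code: a toy tendency update applied to one column."""
    return [value + dt * 0.5 * value for value in column]

def psy_layer(columns, dt, workers=4):
    """Parallel code: applies the kernel over every column of the mesh.
    A code-generation tool would emit and optimise this layer per
    architecture; here we simply fan out over threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda col: increment_kernel(col, dt), columns))

mesh = [[1.0, 2.0], [3.0, 4.0]]   # two columns, two levels each
print(psy_layer(mesh, dt=0.1))
```

Because only `psy_layer` knows about threads (or, in reality, MPI, OpenMP or GPU offload), the science code can stay unchanged when the target machine changes.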
Marine systems

Since the current marine systems run at the Met Office do not suffer from the same polar singularity issues as the atmosphere model, the associated codes (including NEMO, NEMOVAR and WAVEWATCH III) will be maintained within the Next Generation Modelling Systems. However, the models still need to be prepared for the changes in future supercomputing architectures. Therefore, for each of the marine systems the project includes implementation of cost-effective improvements for Central Processing Unit (CPU)-based machines, adaptation of the codes to Graphics Processing Unit (GPU)-based machines, and development of a strategy for adaptation to other future architectures, exploring the same separation of concerns approach that is being used for the atmosphere model.
Next Generation Atmosphere Composition (NG-Composition)
This project aims to port the United Kingdom Chemistry and Aerosol (UKCA) sub-model of the UM to LFRic, with a particular focus on the gas-phase chemistry. Aerosol modelling capability is being ported as part of the GungHo Atmospheric Science Project (GHASP).
Coupled modelling

This project will develop an atmosphere-ocean-ice-land-hydrology model based on GHASP and the NEMO marine system. The different scientific components will be coupled together using the OASIS3-MCT coupler.
Next Generation Visualisation, Analysis & Tools (NG-VAT)
The Visualisation, Analysis and Tools Project aims to advance the scientific visualisation and model evaluation tools needed to support the Next Generation Modelling Systems by exploiting the Met Office-led open source Iris Python package. Most current tooling can only handle structured data, such as the UM’s latitude-longitude grid, so new tooling is required to manage and exploit the cubed-sphere mesh of the Next Generation Modelling Systems’ atmosphere model. Iris is being extended to support the analysis and visualisation of unstructured UGRID data, as output by the LFRic atmosphere model.
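The reason unstructured output needs new tooling can be shown in a few lines. On a UGRID-style mesh there is no (i, j) grid to slice into: each face is defined by a connectivity list indexing a flat node array. A toy example with hypothetical data (plain Python, not the Iris API):

```python
# Hypothetical unstructured mesh: five nodes, a quad face and a triangle.
nodes = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (2.0, 0.5)]
face_node_connectivity = [[0, 1, 2, 3], [1, 4, 2]]
face_values = [280.0, 282.0]   # e.g. one temperature per face

def face_centroid(face):
    """Mean position of a face's nodes, found by walking its connectivity."""
    xs = [nodes[n][0] for n in face]
    ys = [nodes[n][1] for n in face]
    return (sum(xs) / len(face), sum(ys) / len(face))

# Any plotting or regridding must traverse the connectivity like this,
# rather than slicing a 2-D latitude-longitude array:
for face, value in zip(face_node_connectivity, face_values):
    print(face_centroid(face), value)
```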
Next Generation Verification System (NG-Ver)
This project will implement a replacement for the current Met Office Verification System (VER) used for both global and regional model atmosphere verification. The replacement will also provide verification for the ocean models.
The current VER system is over 20 years old and could not produce verification on the cubed-sphere mesh of the Next Generation Modelling Systems’ atmosphere model without significant rewriting. The Model Evaluation Tools (MET) verification package, together with the METplus Python wrappers, developed by the National Center for Atmospheric Research (NCAR) Developmental Testbed Center, has been chosen as the replacement. MET is open source, widely used in research communities, and is being implemented for operational use at the US National Centers for Environmental Prediction (NCEP).
Next Generation Research to Operations (NG-R2O)
This project is responsible for pulling together the work delivered by other Next Generation Modelling Systems projects and related programmes (such as those responsible for delivering our global and regional configurations) into coherent Numerical Weather Prediction (NWP) systems and preparing these systems for operational implementation on the Met Office’s next supercomputing system by 2027/28. The scope includes the global NWP system (which will be a coupled atmosphere-ocean-land-sea ice system by then), the UK NWP system, regional NWP for other areas of the globe, and any other NWP models (including sub-km NWP) that may be implemented between now and 2027/28.
Next Generation Research to Climate Use (NG-R2C)
The Next Generation Modelling Systems: Research to Climate Use project (NG-R2C) will, in collaboration with other Next Generation Modelling Systems projects, carry out the developments necessary to ensure that climate needs are well understood, enabling timely adoption of the new systems for key climate science and service activities. The project will also develop climate-specific workflows to exploit the new capabilities of next-generation supercomputing systems.
Build system (Fab)
This project will deliver a new build system to support LFRic, the UM and the other large, primarily Fortran, science codebases that the Met Office manages. The project was conceived following an analysis of the Met Office’s existing build system, fcm-make, which was found not to support the required functionality. Various options, including further development of fcm-make and adoption of readily available alternatives, were investigated but found not to meet the Met Office’s needs.
The build system we require needs to be both capable and flexible. In addition to compiling source files and linking object files to produce the executable, other operations are needed, including pre-processing and automatic code generation (e.g. using PSyclone). These operations complicate dependency analysis and make it harder to achieve efficient incremental builds, which are essential for codebases of the order of a million lines of code.
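At its core, the incremental-build decision such a tool must make is simple to state: rebuild a target only when it is missing or one of its prerequisites has changed. A minimal sketch of that decision, using hypothetical file names (real Fortran dependency analysis must also follow module USE statements and pre-processed or generated sources):

```python
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    """Content hash of a source file; unlike a timestamp, this is robust
    to fresh checkouts and files touched without being changed."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def needs_rebuild(target: Path, sources: list, state: dict) -> bool:
    """Rebuild if the target is missing or any prerequisite's content
    differs from the hash recorded after the last successful build."""
    if not target.exists():
        return True
    return any(state.get(str(src)) != digest(src) for src in sources)
```

After each successful compile, the build tool would record `digest(src)` for every prerequisite in `state`, so that unchanged parts of a million-line codebase are skipped on the next run.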
Atmospheric-dispersion modelling (NAME-NGMS)
This project will ensure that the Numerical Atmospheric-dispersion Modelling Environment (NAME) continues to be an effective Lagrangian-Eulerian dispersion model which is capable of using NWP data from LFRic. It will ensure that NAME can continue to deliver operational services efficiently on future supercomputing architectures. Coupling NAME with LFRic so that the two models can run together at the same time will also be investigated.
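The Lagrangian part of such a model can be sketched in a few lines: particles are advected by a wind field taken from NWP data, with a random walk standing in for turbulent diffusion. This is a toy illustration only (constant wind and diffusivity are placeholders for real NWP input, and the names are hypothetical, not NAME's code):

```python
import random

def step(particles, wind, k_diff, dt, rng):
    """Advance each (x, y) particle by one timestep: advection by the
    wind plus a Gaussian random walk representing turbulent diffusion."""
    sigma = (2.0 * k_diff * dt) ** 0.5
    return [
        (x + wind[0] * dt + rng.gauss(0.0, sigma),
         y + wind[1] * dt + rng.gauss(0.0, sigma))
        for (x, y) in particles
    ]

rng = random.Random(42)
cloud = [(0.0, 0.0)] * 1000      # release 1000 particles at the origin
for _ in range(10):
    cloud = step(cloud, wind=(5.0, 0.0), k_diff=0.5, dt=60.0, rng=rng)

mean_x = sum(x for x, _ in cloud) / len(cloud)
print(mean_x)   # plume centre drifts downwind while the cloud spreads
```

Coupling NAME with LFRic would let the wind sampled in each step come directly from the running atmosphere model rather than from files written to disk.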
Working in partnership
The Next Generation Modelling Systems Programme is large and complex. Its success depends on the efforts of many partners. Some of these are long-standing partners, such as the operational and research centres around the world in the UM Partnership. The Bureau of Meteorology (Australia), the National Institute of Water and Atmospheric Research (New Zealand) and Oak Ridge National Laboratory (a US Air Force partner) have been contributing to the programme’s technical infrastructure around code generation, visualisation, and the porting of physics routines. Others are new, such as the Joint Center for Satellite Data Assimilation and the National Center for Atmospheric Research (both in the United States), which have led the development of the JEDI infrastructure and METplus respectively. And for some, such as the Science and Technology Facilities Council’s Hartree Centre, which has developed PSyclone, the programme has reinvigorated long-term collaborations.
The people and their skills to develop, run, and interpret the output of the next generation modelling systems are as important as the systems themselves. Developing our people and those of our partners is therefore an essential element of the Programme, and one which will also ensure that we can continue to develop and improve the next generation systems long after the Programme is closed. Much initial work has already been undertaken on the more specialised aspects, for example for the developers of new components of the system such as GungHo and LFRic, but, together with our partners, we are now beginning to develop more formal training for users of the systems as a whole.