As well as developing our own modelling capability, it is important to build collaborative relationships with other agencies and organisations so that we can draw on the expertise of others. On the weather side, the atmospheric model is developed collaboratively with other national meteorological and climate services, including the Australian Bureau of Meteorology and the Korea Meteorological Administration. In areas such as atmospheric chemistry and ocean biogeochemistry we work closely with UK academia, and for ocean modelling we no longer develop our own model but instead use a community ocean model called NEMO. A full Earth system model is so complex that it would be impossible to produce without effective collaboration.

Even on more local scales, the Met Office is working with research institutes including the Centre for Ecology & Hydrology, the National Oceanography Centre and Plymouth Marine Laboratory to build an environmental prediction system for the UK. By coupling the Met Office’s atmospheric model with an ocean model, a wave model and a river-flow model, it may be possible to predict not only where rainfall will occur, but also where rivers will burst their banks, where flooding will take place, and what impact environmental conditions will have on issues such as sedimentation in harbours.

Making sure the software complements the hardware

One of the key challenges looking ahead is ensuring that existing models run effectively on the new generation of supercomputers, so that improvements in speed and processing power can be exploited. However, because individual processor cores are no longer getting faster, the improvement in supercomputers is coming from running ever more processors in parallel.

Today, a typical forecast uses a few thousand parallel processing “threads”, whereas in a decade this may exceed 100,000. Unfortunately, many current forecast models are not coded to take advantage of this, and we expect diminishing returns because the models do not scale as well as they should at very high resolutions and processor counts.
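
The practical effect of a model that does not fully parallelise can be illustrated with Amdahl’s law. The sketch below is a hypothetical back-of-the-envelope calculation, not Met Office code, and the 2% serial fraction is an assumption chosen purely for illustration.

    # Illustrative only: Amdahl's law gives the ideal speed-up on a given
    # number of threads when a fixed fraction of the work cannot be
    # parallelised. The 2% serial fraction is an assumed figure.
    def amdahl_speedup(threads, serial_fraction):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / threads)

    for threads in (1_000, 10_000, 100_000):
        speedup = amdahl_speedup(threads, serial_fraction=0.02)
        print(f"{threads:>7} threads -> {speedup:5.1f}x speed-up")

With these assumed numbers the speed-up is roughly 48x on 1,000 threads but only about 50x on 100,000 threads: a hundred-fold increase in processors delivers almost no extra performance, which is why the code itself has to change.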

The Met Office global model uses a grid based on latitude and longitude. While that equates to roughly a 10km grid over the mid-latitudes, towards the poles the grid boxes become only tens of metres apart in the east-west direction, and this narrowing makes it hard for the computer algorithms to run faster and faster on more processors. The net result is that the model code is being rewritten around a completely different grid system, one better aligned to the way the next generation of supercomputers will run.
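
The narrowing of the grid towards the poles follows directly from the geometry of a latitude-longitude grid: the east-west width of a grid box shrinks with the cosine of latitude. The short sketch below is an illustrative calculation only, assuming a notional 10km spacing at the equator rather than the model’s actual grid definition.

    # Illustrative geometry of a latitude-longitude grid: east-west spacing
    # shrinks with the cosine of latitude. 10 km at the equator is an
    # assumed, notional figure.
    import math

    EARTH_RADIUS_KM = 6371.0
    EQUATOR_SPACING_KM = 10.0
    dlon_rad = EQUATOR_SPACING_KM / EARTH_RADIUS_KM  # longitude step in radians

    for lat_deg in (0, 45, 80, 89, 89.9):
        spacing_m = 1000 * EARTH_RADIUS_KM * dlon_rad * math.cos(math.radians(lat_deg))
        print(f"latitude {lat_deg:5}: east-west spacing ~ {spacing_m:7.0f} m")

Boxes that are about 7km wide at 45 degrees latitude are only around 17 metres wide at 89.9 degrees, and it is that polar narrowing which limits how well the algorithms scale across many processors.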

While this will future-proof the models in the long term, it is a huge undertaking, which again requires academic collaboration.

Getting the correct weather data

It is fair to say that a forecast model is only as good as the information fed into it, so a huge amount of technology is dedicated to observing the Earth’s weather systems, including satellites, weather stations, instrumentation on aircraft, and ocean buoys. Global data gathering is a hugely collaborative process, and the results are fed back to the Met Office in Exeter and ingested into our models. The same observations are then used to evaluate the forecast models, so the quantities of data involved are staggering.
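
As a purely illustrative sketch of that evaluation step, a common approach is to compare forecast values with the matching observations and summarise the difference with statistics such as the bias and root-mean-square error; the temperature values below are made up for the example.

    # Hypothetical forecast verification: compare forecasts with observations
    # at matching times and places, then summarise the error. Values are
    # invented for illustration.
    import math

    forecast_c    = [14.2, 15.1, 13.8, 16.0, 12.5]   # forecast temperature (deg C)
    observation_c = [13.9, 15.6, 13.5, 16.4, 12.1]   # observed temperature (deg C)

    errors = [f - o for f, o in zip(forecast_c, observation_c)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    print(f"bias = {bias:+.2f} C, rmse = {rmse:.2f} C")

In practice this comparison is made for millions of observations across many variables, levels and lead times, which is where the staggering data volumes come from.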

By harnessing all the available technology, intelligence and data, the Met Office is able to model complex weather systems with impressive precision. Today’s four-day forecast is as accurate as a one-day forecast was thirty years ago, yet the public and our customers continually demand ever more local, accurate and detailed forecasts. Continued planning and investment in science, computing and the environmental observing network will ensure further improvements, no matter what the weather throws at us.
