Big data, big challenge
The Met Office is facing new and increasing challenges because of the huge volume of data that we manage and produce.
From its beginnings over 160 years ago, Met Office forecasting has been driven by data. The simple observations once used to hand-plot synoptic charts have given way to the billions of observations we handle every day from satellites, weather stations, radar, ocean buoys, planes, shipping and the public.
The 335 million observations we store every day require huge computational capability. Our new supercomputer provides the processing power needed to manipulate the data in a timely and effective way. The complex numerical models we develop in turn create enormous data outputs, used for climate and weather prediction and by data users throughout the world to make weather outlooks more accurate than ever before.
Making the most of data
“To maximise the investment in the new supercomputer, it’s important that both the Met Office and external organisations can continue to use our data to build innovative new products and services,” says Alex Longden, Met Office Data Provisioning Business Manager. That access needs to remain unhindered by the huge increase in data volumes, which can easily bog down systems and slow delivery to a crawl.
Because of the exponential growth in our weather and climate data sets, we are investing in new dissemination technologies, including APIs, to transform how we make our data available. These tools let users make finer selections within the data, subdividing it into more fields (a property known as granularity). This in turn minimises the volume of data a user needs to ‘ingest’, or receive, leading to a faster, more sustainable and cost-effective method of data delivery for both the Met Office and our customers.
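The idea of granular selection can be illustrated with a short sketch. The endpoint URL and parameter names below are purely hypothetical, not the Met Office’s actual API; the point is that requesting only the fields and timesteps you need keeps the response payload small, rather than ingesting the full model output.

```python
from urllib.parse import urlencode


def build_forecast_query(base_url, lat, lon, parameters, timesteps=None):
    """Build a granular data request.

    Asking only for the named fields (and, optionally, specific
    timesteps) keeps the response small compared with downloading
    the whole forecast data set.
    """
    query = {
        "latitude": lat,
        "longitude": lon,
        # Comma-separated list of requested fields, e.g. temperature
        # and wind speed only, rather than every model variable.
        "parameters": ",".join(parameters),
    }
    if timesteps is not None:
        query["timesteps"] = ",".join(str(t) for t in timesteps)
    return f"{base_url}?{urlencode(query)}"


# Illustrative request: two fields, three timesteps, for one location.
url = build_forecast_query(
    "https://example.invalid/forecast",  # placeholder endpoint
    51.5,
    -0.1,
    ["air_temperature", "wind_speed"],
    timesteps=[0, 3, 6],
)
print(url)
```

A coarser-grained service would force the user to download everything and filter locally; exposing the selection in the request moves that work to the provider and shrinks what travels over the network.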
State of weather data infrastructure
The Met Office recognises that increases in observational and forecast data volumes have implications across both the public and private weather sectors. To understand this better, we recently partnered with the Open Data Institute to carry out a review on ‘The state of weather data infrastructure’.
“The review is encouraging discussion on how the global weather data infrastructure can be sustained and deliver value to society,” explains Alex. “As well as looking at the need for continuing investment in technical infrastructure and supercomputing resources, the review looks at the role of global, regional and national meteorological services in collecting observations and generating forecasts.”
The review also highlights the technologies creating new data and, in turn, generating new big data challenges. Supercomputing is enabling new and improved weather models that harness a variety of weather observations from ground-, air-, sea- and space-based monitoring and sensors. These trends sit within a wider landscape of innovation and changing consumer expectations, where instant, real-time access to data is increasingly essential.
All of this presents huge challenges but, as Alex says, “There are massive opportunities for organisations to unlock even more value from weather data. Creativity and innovation will ensure that new products and services are developed to meet the needs of specific sectors in our society.”
Big Data Drive
• To make Met Office data more useful, we are undertaking a Big Data Drive to transform how our data is accessed, disseminated and stored.
• Several major changes are being researched, tested and implemented to ensure that we take the steps needed to realise the opportunities that our data provides for society.
• In 2015 we produced around 30TB of public weather data every day, and we expect this to rise to 200 to 250TB a day once the final phase of the supercomputer is installed.