Long-term and accurate climate data is needed to underpin work to understand, mitigate and adapt to climate change. At the Met Office Hadley Centre we monitor the climate system and compile observational evidence of variability and change to build a complete picture of the world's climate.
Dr Peter Stott, Head of Climate Monitoring and Attribution, explains the importance of maintaining global and regional temperature datasets, and how these give us not only a robust record of past climate but also confidence in computer model projections of future climate.
The thing most people will immediately say is temperature, and this, of course, is a key parameter for us to record. Together with the University of East Anglia's Climatic Research Unit, we compile a temperature dataset for the entire world, known as HadCRUT. In conjunction with the National Climate Information Centre, based at the Met Office, we also compile temperature datasets for the UK and are responsible for maintaining the Central England Temperature series. This is the longest instrumental record in the world and gives a unique insight into temperature over a period spanning more than 350 years.
But there are many other elements which are equally important for us to know about. How much heat is stored in the oceans; the extent of polar sea-ice; glacier mass; and the temperature high in the stratosphere are among the records that combine to provide a comprehensive picture of our climate.
Whatever we observe, length of record is absolutely essential. This is because naturally occurring year-to-year variability tells us very little about climate change; it is only by looking over longer periods of time that we can detect underlying trends within the 'noise' of shorter records.
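The effect of record length can be sketched in a few lines of Python. This is purely an illustrative example, not Met Office code: the trend size, noise level and record lengths are invented numbers chosen to show how a short record can hide a real trend that a long record reveals.

```python
import numpy as np

# Illustrative sketch: a small warming trend of 0.02 degrees C per year
# hidden in year-to-year noise of 0.2 degrees C (one standard deviation).
# All values are assumptions for illustration only.
rng = np.random.default_rng(42)
true_trend = 0.02                     # degrees C per year (assumed)
years = np.arange(150)
series = true_trend * years + rng.normal(0.0, 0.2, size=years.size)

# Trend estimated from a short 10-year window vs the full 150-year record.
short_fit = np.polyfit(years[:10], series[:10], 1)[0]
long_fit = np.polyfit(years, series, 1)[0]
print(f"trend from  10 years: {short_fit:+.3f} degrees C/yr")
print(f"trend from 150 years: {long_fit:+.3f} degrees C/yr")
```

Over the full record the fitted slope sits very close to the true trend, while the standard error of a 10-year fit is larger than the trend itself, so the short-window estimate is essentially noise.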
The reason we're now able to monitor more than simply land and sea-surface temperature is the advent of modern technologies. So we've moved from having thermometers and hygrometers measuring temperature and humidity to instruments flown on satellites and travelling at depth in the world's oceans to measure a much larger range of elements. Using satellites we can monitor the extent and thickness of sea- and land-ice, and specialist submerged floats provide us with temperature and salinity readings deep in the oceans, as well as information on currents, measurements that previously were not available to us.
These datasets are much more than a straightforward averaging of a range of temperatures. They are expressed in terms of an anomaly; that is, how each part of the series differs from the average conditions. A lot of work goes into their construction: temperature measurements are quality controlled, and careful account is taken of the different ways in which temperatures have been measured over the years. For instance, sea temperatures used to be measured by hauling in buckets from the sides of ships but are now mostly measured by special measuring buoys.
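The basic idea of an anomaly can be shown in a short sketch. This is not the HadCRUT algorithm, which involves gridding, quality control and bias adjustments; the temperatures and baseline period below are hypothetical, chosen only to show what "difference from average conditions" means.

```python
import numpy as np

# Illustrative sketch: expressing a station's annual mean temperatures
# as anomalies relative to a baseline period. Values are hypothetical.
temps = np.array([14.1, 14.3, 13.9, 14.6, 14.8, 15.0, 15.2])  # degrees C

baseline = temps[:5].mean()   # average over an assumed 5-year baseline period
anomalies = temps - baseline  # how each year differs from "normal"
print(np.round(anomalies, 2))
```

Working in anomalies rather than absolute temperatures makes stations at different altitudes and in different climates comparable, because each is measured against its own local average.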
By building a comprehensive picture of past climate we can test how well climate models perform. We start our climate models at some point in the past, run them forward to the present day and look to see how well the model replicates what has actually happened. We can check to see whether the model has the correct climatology in terms of temperature and humidity; we can check whether the model has the correct variability, such as successfully replicating the variability between El Niño and La Niña conditions in the Pacific Ocean; and we can check how well the model represents clouds, showing us how well it has captured the physical processes of our atmosphere. We can also verify that the model correctly captures the main features of climate change over the last century.
Increasingly, we are able to perform a forensic examination of the causes of extreme weather. An international group of scientists is examining specific instances of extreme weather and, through a process of 'optimal fingerprinting', establishing how the odds, or frequency, of a given event occurring are changing and why that might be. We are able to identify and isolate likely causes, be they natural or man-made, for temperature extremes, with more work being done to establish causes for extremes of rainfall.
The European heatwave of 2003, which led to tens of thousands of additional deaths, is estimated to have been made at least twice as likely by human activities. Through these studies we can build a picture of future climate risks and help inform decisions on how best to adapt to, or mitigate, them.
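A statement like "at least twice as likely" can be expressed as a risk ratio between two modelled worlds, one with all forcings and one with natural forcings only. The probabilities below are invented for illustration and are not the values from the 2003 heatwave study.

```python
# Illustrative sketch with invented numbers: comparing the probability of
# exceeding a heat threshold in model runs with and without human influence.
p_natural = 0.05       # hypothetical chance per year, natural forcings only
p_all_forcings = 0.12  # hypothetical chance per year, all forcings included

risk_ratio = p_all_forcings / p_natural   # how many times more likely
far = 1.0 - p_natural / p_all_forcings    # fraction of attributable risk
print(f"risk ratio: {risk_ratio:.1f}x, FAR: {far:.2f}")
```

With these assumed numbers the event is 2.4 times more likely in the all-forcings world, and the fraction of attributable risk says that roughly 58% of the current risk is attributable to the added influence.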
In terms of temperature, there is an international initiative to build a new dataset with more data at daily timescales, benchmarked so that different records can be compared more reliably. This will give us much more information on variability and emerging trends in extreme temperatures at local scales.
Today's more sophisticated climate models include feedbacks within the climate system, evaluating the response of the climate to thawing permafrost and the carbon cycle, among other things.
So, just as temperature records demonstrate the effectiveness of climate models, we need accurate measurements of permafrost and of the changing carbon dioxide balance in the atmosphere to give us confidence that these newer models accurately reflect what is actually happening.
Last updated: 17 September 2012