Just a few days ago, before a brutal winter storm blasted the Midwest and the Northeast, weather forecasters warned it could be one of the worst winter storms in memory. But who can remember the difference between one bad storm and another, year after year? To compare a storm with all those that came before it, and to study possible long-term climate trends in winter storms and other atmospheric phenomena, meteorologists have usually looked back to records of snowfall or temperature. Now, climatologists from around the world are assembling a far more detailed picture of past weather than ever before, one that reaches back nearly 140 years.
Historical weather map for the morning of January 28, 1922, the day the infamous “Knickerbocker Storm” hit Washington, DC. Credit: NOAA
Working together as part of the 20th Century Reanalysis Project, researchers from the National Oceanic and Atmospheric Administration (NOAA) and the Cooperative Institute for Research in Environmental Sciences (CIRES) are combing history books to retrieve old records of atmospheric pressure and surface temperature from around the world. Compiling them into an immense database, they are using the data to go backwards in time, modeling past climate, and creating maps of global weather conditions at six-hour intervals. This immense collection of historical climate data not only lets researchers recreate past weather, but could also help them better understand natural and human-driven climate variations.
“With this project, we’re helping answer the question ‘what was happening with the weather at that exact time?’ And now we can do that going all the way back to 1871,” says climate scientist Gilbert Compo of CIRES at the University of Colorado in Boulder. As the project’s lead investigator, Compo says that gathering pressure and temperature observations from around the world over long periods of time is essential: it lets researchers learn about historical weather and climate patterns, and then use that understanding to develop, test, and interpret climate models and their simulations.
“Climate change may alter a region’s weather and its dominant weather patterns,” Compo recently wrote to colleagues in a review of the project. “We need to know if we can understand and simulate the variations in weather and weather patterns over the past 100 years to have confidence in our projections of changes in the future.”
Often, looking back at weather over time means considering individual days, or monthly and annual averages, and comparing them against other days, months, and years. The reanalysis approach, however, works more like a weather forecast, except that it runs backward through history instead of forward from today. Starting with just a few pieces of observed weather data, such as air pressure readings at a set of weather stations, the reanalysis model fills in what the rest of the atmosphere was likely doing at that moment. Then, as the model steps forward in time, researchers feed in more observations from other locations and times. As each new piece of real information is added, the model can make a more educated guess about what was happening nearby. Consequently, the more points of observed historical data researchers enter into the reanalysis, the closer the recreated weather is likely to be to what actually happened.
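The "fill in the blanks" step described above is what atmospheric scientists call data assimilation, and modern reanalyses typically do it with ensemble Kalman-filter methods. The toy sketch below, in Python with made-up numbers (the grid, the "true" pressure field, and the noise levels are all invented for illustration, not taken from the project), shows the core idea: a single barometer reading nudges every point on the grid in proportion to how strongly that point co-varies with the observed location across the ensemble.

```python
import numpy as np

def enkf_update(ensemble, obs_index, obs_value, obs_var):
    """One ensemble-Kalman-style assimilation step.

    ensemble: (n_members, n_points) array of modeled pressure fields.
    A single observation at grid point obs_index adjusts every grid
    point, weighted by its covariance with the observed location.
    """
    predicted = ensemble[:, obs_index]              # each member's guess at the station
    anomalies = ensemble - ensemble.mean(axis=0)
    pred_anom = predicted - predicted.mean()
    # Covariance between every grid point and the observed location
    cov = anomalies.T @ pred_anom / (len(ensemble) - 1)
    gain = cov / (pred_anom.var(ddof=1) + obs_var)  # Kalman gain per grid point
    innovation = obs_value - predicted              # observation minus each guess
    return ensemble + innovation[:, None] * gain[None, :]

rng = np.random.default_rng(0)
members, grid = 56, 20                              # 56 members, as in the project
truth = 1000.0 + 5.0 * np.sin(np.linspace(0, np.pi, grid))
ensemble = truth + rng.normal(0, 3.0, size=(members, grid))
obs = truth[5] + rng.normal(0, 0.5)                 # one noisy barometer reading
updated = enkf_update(ensemble, 5, obs, 0.25)
```

After the update, the ensemble mean at the observed point sits closer to the reading and the spread there shrinks, which is the sense in which each new historical barometer record sharpens the recreated weather nearby.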
This reanalysis approach to studying past climate and weather is not new, but the 20th Century Reanalysis Project draws on a much longer record of observations than previous reanalyses. Because it relies only on pressure and temperature, for which the historical record is far richer than for variables like precipitation or wind speed, the project can tap a larger pool of data that reaches further back in time and covers more of the globe. With a more extensive set of observations, the reanalysis should, in theory, recreate a more accurate version of historical weather.
Even with this more extensive data collection, however, there are still places and times in history where few observations exist, and there the reanalysis depends largely on the model’s own estimate of the weather. In particular, large areas of the South Pacific, the Arctic, and the Antarctic simply did not have many observations in the 1800s and early 1900s, so the reanalysis is more uncertain there. In regions with good records of atmospheric pressure, on the other hand, such as North America, Europe, and much of the Atlantic Ocean, the recreated weather is likely more accurate, Compo and others say.
Sample contour plot of sea level atmospheric pressure produced by the 20th Century Reanalysis Project. Credit: NOAA
The product of the 20th Century Reanalysis is a series of weather maps that cover the entire Earth, from the surface all the way up to the jet stream, at six-hour intervals from 1871 to 2008. To account for the uncertainty inherent in any reanalysis, Compo’s team actually produced 56 recreations for each time period, each one a possible representation of the past weather around the world given the observations fed into the climate model.
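Running 56 parallel recreations gives more than one "best" map: the agreement among members doubles as an error bar. A minimal sketch of that idea, with random numbers standing in for real reconstructions (the grid size and pressure values are invented for illustration):

```python
import numpy as np

# Hypothetical stand-in for the project's 56-member ensemble: each
# "member" is one plausible reconstruction of a sea-level pressure
# field (in hPa) on a small latitude-longitude grid.
rng = np.random.default_rng(1)
members = 1010.0 + rng.normal(0.0, 2.0, size=(56, 4, 6))

best_estimate = members.mean(axis=0)        # ensemble mean: the most likely map
uncertainty = members.std(axis=0, ddof=1)   # spread: how poorly constrained each point is
```

Where historical observations were dense, the members agree and the spread is small; over the data-sparse South Pacific or polar regions, the members diverge and the spread flags the reconstruction as less trustworthy.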
Compo first conceived of the 20th Century Reanalysis Project in 2000 with his colleague Jeff Whitaker from NOAA. Since then, they’ve built their voluminous database, thanks in large part to collaboration with an international initiative from the UK’s Met Office, led by climatologist Rob Allan. According to Compo, surveying the weather over such a long period of time will now allow researchers to identify with more certainty whether modern extreme weather events are within the range of natural variability, and also how particular climate cycles, including El Niño and La Niña, influence these events. The reanalysis of past weather may even yield a new perspective on these larger climate patterns themselves, he says.
Putting the maps to work
Already the benefits of having such a long time span and more detailed geographic coverage are becoming clear, says Compo. He and his colleagues have compared their reanalysis with previous studies of trends in several global climate circulation indices over the 20th century, to see how rising global temperatures in recent decades may be influencing the three-dimensional structure of those circulations. For example, previous studies found that over a shorter time period there was a trend toward a stronger positive North Atlantic Oscillation (NAO), a polar circulation pattern that influences winter weather in North America and Europe.
In years with a “positive” NAO index, winters on the continents can be much milder than average, and in the early 1990s, some people wondered if global warming could be altering the NAO. According to Compo, the earlier reanalysis results indicated that might be the case.
“But when we included data going back to 1871, and going forward to 2008, the trend disappeared,” he says. “Now it doesn’t look like the increasing temperatures have had a strong influence on the NAO.” Compo says their reanalysis, which illuminates these climate circulations in three dimensions, also gives a different perspective than most previous studies, which evaluate the NAO by comparing barometric pressure readings from just a few locations. This longer preliminary analysis of the NAO and a few other global circulation measures is published in a new review of the 20th Century Reanalysis Project that appeared in the January issue of the Quarterly Journal of the Royal Meteorological Society.
“This new reanalysis confirms that the primary component driving the NAO is simple natural variability,” says NOAA atmospheric scientist James Overland, who has independently studied changing patterns in the NAO.
Overland believes that in the future, if temperatures in the Arctic continue to warm as rapidly as they have in recent decades, there could be a noticeable influence on the NAO, but he says that most current observations indicate natural variation is still the major driver.
The researchers who developed the 20th Century Reanalysis won’t be the only ones able to use the data, says Compo. With a complete collection of weather maps now available, other scientists can access the data and apply it to their own climate studies. Though he expects many more applications of the reanalysis, Compo thinks the information is particularly useful for climate modelers who want to compare simulations of past climate against the real weather.
For example, Columbia University climatologist Richard Seager has already used an earlier version of the reanalysis, spanning 1908-1958, in studies of the climate conditions behind the Dust Bowl drought that left the Great Plains parched in the 1930s. After simulating that era’s weather with a common climate model, which does not make use of observations, Seager compared the simulations to the recreated weather maps from the 20th Century Reanalysis Project and confirmed they were a close match to what the weather likely was. The comparison not only shed light on the weather patterns that gave rise to the drought and dust storms, but also helped Seager verify that the climate model was accurately simulating conditions at the time.
A dust storm approaches houses in Stratford, Texas, in April 1935, during the Dust Bowl. Credit: NOAA
Although a lot of detailed information can come from this iteration of the reanalysis, there is also concern that some researchers might misuse it without understanding where its errors come from.
“This reanalysis is a really important growth of the science, but there are growing pains that go along with it,” says John Fasullo, a climatologist at the National Center for Atmospheric Research in Boulder, Colo. Fasullo says that the reanalysis method of sampling and “filling in the blanks” with models inevitably introduces errors. His concern is that some users won’t realize that the recreated maps always contain some amount of information that was never observed, but was reconstructed by a model of the climate.
Fasullo says another limitation is that the NOAA/CIRES reanalysis depends primarily on atmospheric pressure observations, which aren’t always well correlated with rain, clouds, and wind speeds.
“The project is a good stepping stone, but we’ll never get the full reanalysis,” he says.
Mining historical records for weather
To gather historical readings of atmospheric pressure and surface temperature from around the world, researchers have had to look well beyond the archives of their local weather stations. To reach further back in time than previous databases, and to cover parts of the planet that have never had regular weather observation posts, Compo and Allan had to get creative. They turned to records kept by explorers and mariners. Some of their most valuable resources have been the logbooks of ships and sailing vessels from the late 19th and early 20th centuries.
“Before the age of satellites, mariners were very interested in what the barometric pressure was along their sailing paths, so almost every ship carried a barometer and took observations,” says Compo. With help from historians to identify ships that tracked ocean routes where weather stations were few and far between, Allan has been using novel techniques to digitize the logbook observations, and Compo has incorporated the information into the reanalysis.
Already, more than 250 terabytes of data have gone into the 20th Century Reanalysis Project, more than ten times the information held by the U.S. Library of Congress. And while a complete set of weather maps is now available for other scientists to use in their research, more historical and future observations will be added going forward. All of this additional pressure and temperature information helps improve the accuracy of the recreated weather.
“For the time prior to the Second World War, there is probably still as much information that hasn’t been digitized as has already been added,” says Allan. And as they continue to uncover data, he expects the reanalysis will only get better. “That’s a lot more data we can still get that will improve on what has already gone in there.”