Posts Tagged ‘Climate Change’

I told you so

October 20, 2012
In 1981 Hansen et al. made predictions for the change in global mean temperature expected over the course of the coming century. The figure shows their predictions alongside four independent estimates of what has actually happened.

According to Gore Vidal, the four most beautiful words in the English language are “I told you so”. My hero James Hansen can justifiably speak those words, but I am sure they don’t feel beautiful to him.

In 1981, together with six NASA colleagues, he published a paper in Science magazine entitled ‘Climate Impact of Increasing Atmospheric Carbon Dioxide’. Science magazine won’t let you read it but it is available online here. The paper is not that difficult to understand and if you are curious about these things, it’s a good read. I particularly liked the inclusion of a simple analogy:

“The surface temperature resulting from the greenhouse effect is analogous to the depth of water in a leaky bucket with constant inflow rate. If the holes in the bucket are reduced slightly in size the water depth and water pressure will increase until the flow rate out of the holes once again equals the inflow rate. Analogously, if the atmospheric infrared opacity increases, the temperature of the surface and the atmosphere will increase until the emission of radiation from the planet again equals the absorbed solar energy.”
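
For anyone who likes to see that equilibrium idea in numbers, here is a minimal sketch. It is the standard textbook ‘single-layer atmosphere’ illustration with round values of my own choosing – not the model used in the paper – but it shows how increasing the infrared opacity raises the surface temperature required to restore the balance between incoming and outgoing energy:

```python
# A minimal single-layer energy-balance sketch of the 'leaky bucket' idea:
# the surface warms until outgoing radiation again balances absorbed sunlight.
# Textbook physics with rounded numbers, for illustration only.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.30     # fraction of sunlight reflected back to space

absorbed = S0 * (1 - ALBEDO) / 4          # absorbed sunlight, averaged over the sphere
t_effective = (absorbed / SIGMA) ** 0.25  # ~255 K: the temperature with no atmosphere

def surface_temperature(opacity):
    """Surface temperature beneath a single, partially opaque atmospheric layer.

    'opacity' is the infrared emissivity of the layer (0 to 1). Raising it is
    the analogue of shrinking the holes in the bucket."""
    return t_effective * (2.0 / (2.0 - opacity)) ** 0.25

for eps in (0.75, 0.78, 0.80):
    print(f"infrared opacity {eps:.2f}: surface temperature ~ {surface_temperature(eps):.1f} K")
```

With an opacity of about 0.78 this toy gives a surface temperature of roughly 288 K – close to the observed global mean – and nudging the opacity upwards nudges the equilibrium temperature up with it, just as the bucket analogy suggests.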

The figure at the top of the page shows Figure 6 from their paper, on which I have overlaid four independent estimates of what has actually happened since then. At the time the paper was published, global mean temperature was declining and the predictions were thus extremely bold. However, looking back, the authors’ predictions now seem conservative. And indeed the authors were careful and conservative, though clear about specific predictions.

In the summary they state

“Potential effects on the climate in the 21st Century include the creation of drought-prone regions in North America and central Asia … erosion of the West Antarctic Ice Sheet … and an opening of the fabled Northwest Passage”

Well, North America has been prone to drought, and the Northwest Passage now regularly opens in summer. Thankfully the West Antarctic Ice Sheet seems relatively stable.

All through the paper the authors consider the uncertainties arising from the simplicity of their model and the many poorly-understood effects – such as cloud cover and solar variability – which affect climate. However, they test their predictions against plausible variations in these factors and find that the predictions of warming are robust against a wide range of plausible feedback effects. They conclude with a wider non-scientific perspective

Political and economic forces affecting energy use and fuel choice make it unlikely that the CO2 issue will have a major impact on energy policies until convincing observations of global warming are in hand. In light of historical evidence that it takes several decades to complete a major change in fuel use, this makes large climate change almost inevitable.

However the degree of warming will depend strongly on the energy growth rate and the choice of fuels for the next century. Thus CO2 effects on climate may make full exploitation of coal resources undesirable. An appropriate strategy may be to encourage energy conservation and develop alternative energy sources while using fossil fuels as necessary during the next few decades.

The climate change induced by anthropogenic release of CO2 is likely to be the most fascinating global geophysical experiment that man will ever conduct. The scientific task is to help determine the nature of future climatic effects as early as possible. The required efforts in global observations and climate analyses are challenging, but the benefits from improved understanding of climate will surely warrant the work involved.

To me these views seem modest, realistic and optimistic. But I bet that although James Hansen and his colleagues predicted the climate 30 years ahead, they never guessed that in the 21st Century the US would have congressmen such as Paul Broun.

To understand such ignorance we have to turn again to Gore Vidal:

The United States was founded by the brightest people in the country – and we haven’t seen them since.

Acknowledgement: This article is based on a blog story at Real Climate:

Denial in Action

September 23, 2012
Interesting Map showing the different ‘seas’ within the Arctic Ocean. Source NSIDC

The collapse in the extent of the summer minimum of Arctic Sea Ice has been a shock to everyone, but in honesty, not really a surprise. But the disappearance of three quarters of a million square kilometres of sea ice seemed to be such a dramatic change that I was sure that Climate Change ‘sceptics’ would be holding up their hands and saying simply ‘I was wrong’. So I headed over to the Sea Ice Update pages of Anthony Watts’ ‘Watts Up With That’ site to witness their surrender.

But far from admitting that their world view was flawed, the ‘Climate Sceptics’ were responding in a manner which would be hilarious if it were not so tragic. The discussion is a classic example of a group unable to ‘distinguish the forest from the trees’. It is focussed on individual facts (the trees), which are discussed in detail and critically examined. But anyone who raises the wider context of those facts (the forest) – i.e. the only theory which predicted the sea-ice melting – is denounced. Indeed, our concern that this might happen is the very reason that the sea-ice data exists.

The page begins with a section noting that:

…there are some quite large Sea Surface Temperature Anomalies in the Arctic at present [up to 7 °C]. They appear to centered in four primary areas, the coasts of the Beaufort, Laptev and Kara Seas, as well as the middle of Baffin Bay. There are a multitude of potential explanations for these anomalies, let’s take them individually

We then get the individual potential explanations which I will summarise:

  1. Could be due to the low sea ice extent which means areas previously covered with ice are now exposed.
  2. Could be due to an ‘unusually strong storm’ which occurred early in August which could have broken up the ice cover.
  3. Could be Albedo Feedback – the replacement of reflective sea ice with dark ocean – likely to be a factor.
  4. Could be anthropogenically-warmed river discharges – quite likely a factor in some areas.
  5. Could be Northern Polar Lower Troposphere Anomalies – basically the air temperature has warmed over the decades, perhaps by enough to explain the sea surface temperature anomalies.
  6. Could be Tundra Vegetation Feedback – where the sea ice has retreated plants have begun to grow, changing surface albedo.

I have summarised these explanations but each one is discussed in detail. The discussions then cover other possible explanations:

  • Arctic Drilling
  • Undersea Volcanoes
  • Soot from Chinese Coal Power Stations
  • The effect of the North Atlantic Oscillation – a persistent weather pattern with two distinct stable states.
  • Absorption of Energy from Geomagnetic Storms
  • Increased use of icebreakers and even tourist boats.
  • There has been no extra melting – just dispersal of sea ice into smaller pieces which are not counted as contiguous sea ice.

All these are discussed intelligently, helpfully and politely. It is an admirable example of a community of interested people discussing a topic. But when someone suggests:

There’s the increased release of anthropogenic greenhouse gases, such as CO2.

they are quickly told…

OK, you made a conjecture. Now, show a direct connection between Arctic ice melt and anthropogenic CO2, per the scientific method: testable, and using raw data. Otherwise, you have just expressed an opinion, nothing more.

In short – we don’t want to know about this.

In fact climate models – our way of taking account of as many factors as we can think of – predicted long ago that Arctic warming would result from CO2 emissions. And Arctic warming can be reasonably expected to thin the ice sheet over the Arctic Ocean, which will then break up when there is a storm. All of the factors mentioned above may be proximate causes of the ice break-up and enhanced sea-surface temperatures. But the ultimate cause is in all probability the emission of greenhouse gases.

What we learn is that this group of well-meaning, interested and intelligent people simply rejected the most likely cause of this astonishing phenomenon. It caused me to wonder if there is any event which would cause these people – not to change their minds, perhaps – but to shift their opinion slightly. To consider that perhaps all the world’s experts in climate studies might just have a point worth considering?

Cosmic Rays and Climate Change

August 31, 2011
Clouds - condensed water vapour - formed around tiny particles emitted from jet engines. Do cosmic rays give rise to similar 'contrails' that initiate the growth of clouds? Click for larger version

Cosmic rays are the particles (probably protons and not ‘rays’ at all) that are ejected from extreme events throughout our galaxy and beyond, and that bombard the Earth from all directions. I discussed the basic phenomenology and the fantastic satellite recently launched to study them here. This article is about recent stories reporting a link between cosmic ray flux and the formation of water droplets in the atmosphere – clouds. Various articles describing the research can be found here:

Water molecules in the atmosphere have quite different effects depending on whether they are present as droplets – i.e. in a cloud – or as isolated molecules, i.e. water vapour. In either form they have roughly similar effects on infrared light emitted by the Earth, but as we all know, clouds block visible light. The process by which droplets form has been the object of extensive study for more than a century – one of the major effects affecting the stability of droplets is called the Kelvin effect – and yet still we do not collectively understand how water vapour condenses in the atmosphere to form cloud droplets.

Of course it’s not that we know nothing, but even though the process of droplet formation is ubiquitous and important, it is complex. The most significant fact is that even when there is more than enough water vapour in the air to form liquid droplets (so-called super-saturated air) they just don’t form by themselves. The chance of water molecules spontaneously clumping together is infinitesimal. In practice, they need a ‘seed’ of some kind which allows water molecules to stick to it and which forms the ‘nucleus’ of a droplet which can grow.

CERN's Illustration of the process of droplet formation. Click for larger version. Courtesy CERN

The research from CERN (who can generate proton beams very easily) evaluated the effect of cosmic rays (i.e. fast protons) on the formation of the smallest droplets under different simulated atmospheric conditions and in the presence of different impurities. The results were complex, but can be divided into two parts:

  • When simulating the atmosphere at an altitude of 1 kilometre (3000 feet), where the temperature is approximately -10 °C (prime cloud-forming temperature), they were surprised to find that the rate of droplet formation was only one thousandth of that observed in the real atmosphere, with or without the ‘cosmic ray’ bombardment.
  • When simulating the atmosphere at an altitude of 5 kilometres (16000 feet), where the temperature is approximately -25 °C, they found that ‘cosmic ray’ bombardment enhanced the rate of tiny droplet formation by a factor of 10.

So the results indicate that droplet formation is even more complex than had been previously considered. But as many reports were at pains to point out, this is not really news because nobody ever thought they understood the process in the first place! And the droplets formed in the experiment were still too small to grow into cloud droplets and scatter light. Small droplets – perhaps 10 nanometres (about 50 atoms) across – do not necessarily grow to be the large 1 micron size droplets typically found in clouds. Small droplets tend to evaporate faster than larger droplets and so when there is a mix of droplet sizes, small droplets tend to shrink and larger ones tend to grow – that is a manifestation of the Kelvin effect I mentioned above. However, no doubt we will eventually figure out how the process works.
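
For those who like to see the Kelvin effect in numbers, here is a small sketch using the standard Kelvin equation – textbook physics, with rounded values for water that I have chosen purely for illustration. It shows why a 10-nanometre droplet needs a large supersaturation to survive while a micron-sized droplet needs almost none:

```python
import math

# Kelvin equation: the equilibrium vapour pressure over a droplet of radius r
# exceeds that over a flat water surface by a factor exp(2*gamma*Vm / (r*R*T)).
# Values are rounded and for illustration only.
GAMMA = 0.076    # surface tension of water near -10 °C, N/m (approximate)
VM = 1.8e-5      # molar volume of liquid water, m^3/mol
R = 8.314        # gas constant, J/(mol K)
T = 263.0        # about -10 °C, in kelvin

def supersaturation_needed(radius_m):
    """Ratio of vapour pressure over a droplet to that over a flat surface."""
    return math.exp(2 * GAMMA * VM / (radius_m * R * T))

for diameter_nm in (10, 100, 1000):
    radius = 0.5 * diameter_nm * 1e-9
    print(f"{diameter_nm:5d} nm droplet: needs p/p_flat of about {supersaturation_needed(radius):.3f}")
```

The numbers say that a 10 nm droplet only survives if the surrounding vapour pressure is roughly 25–30% above the flat-surface value, while a 1 micron droplet is happy with a fraction of a percent. So in a mixture, vapour tends to leave the small droplets and condense on the large ones – exactly the ‘small ones shrink, big ones grow’ behaviour described above.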

However, I would like to single out the disingenuous Andrew Orlowski, who writes for The Register, for special castigation. Mr. Orlowski is an iconoclast who enjoys mocking the achievements of others. From reading his articles it is clear that Mr. Orlowski objects to the idea that anthropogenic carbon dioxide emissions could conceivably be affecting the climate. So Mr Orlowski loves the idea that Cosmic Rays could be affecting the climate because they are ‘not our fault’ and we can just ignore the ‘liberal whingers’ calling for controls on energy usage. I don’t know why he so persistently rejects the idea that carbon dioxide emissions could be affecting the climate, because it’s a pretty sound idea with quite a lot of evidential and theoretical support. But every report he writes on the subject focuses on the things which people can’t explain and implies that the whole concept is thought up by a liberal/authoritarian elite who – unlike the free-thinking Mr Orlowski – are unwilling to accept new data. However he never has the honesty to state what he actually thinks. So, for example, he ends his article with a quote implying that the lead author thinks that previous climate studies are bunkum.
When Dr Kirkby first described the theory in 1998, he suggested cosmic rays “will probably be able to account for somewhere between a half and the whole of the increase in the Earth’s temperature that we have seen in the last century.”

But in fact the actual results of Dr. Kirkby’s work are completely inconclusive – telling us only what we knew before – that we don’t understand the basic process of cloud formation.

As new research fills in the gaps in our knowledge of the many complex factors that affect our climate, many media sources invite us to view the work in an essentially confrontational light. The question they ask is whether this report strengthens the views of climate change ‘sceptics’ or climate change ‘supporters’ (!). Frequently one voice from each camp will be quoted to further this sense of antagonism. But in fact there will always remain many areas of uncertainty and we – you and I and scientists and governments – have to cope with this uncertainty. We have to make our climate models as best we can even though we don’t understand all the elements: we have to make decisions about energy usage and generation (wind turbines, electricity pylons, banned light bulbs) in the face of this uncertainty. These decisions are difficult enough in themselves and we would all do better without this kind of tribal response to each new piece of information.

The power of water

May 18, 2011
US Army Corps of Engineers photograph of the Morganza Spillway.

“If you ask my advice, there is nothing as nice, as messing about on the river”. And humans have felt that way for a long time. Our settlements have followed river valleys from the mountains out to the sea since the dawn of time. But when the water – either from the river or the sea – wants to be where we happen to live, there is generally only one winner. When disaster strikes, people are wont to say that it was unpredictable. However in almost all cases, exactly the opposite is true. These disasters are in fact entirely predictable – it is just that we have short collective memories. So for example:

  • The tsunami which hit Japan earlier this year was really only a one-in-one-hundred-year event. How could people have collectively ‘forgotten’ that the sea did this?
  • The flooding in Brisbane last year was entirely predictable, and had happened previously as recently as 1974.
  • Hurricane Katrina’s terrible toll in New Orleans is really quite understandable in a city which is built below sea level!

And in the face of these disasters I have been extremely impressed by the US handling of the current flooding on the Mississippi – summarised in this Washington Post graphic. The authorities have followed the floods and predicted the extent of the flooding downstream several days in advance. They have destroyed levees to flood farmland rather than cities, and finally opened the splendidly-named Morganza Spillway to successfully prevent flooding in New Orleans.

The Morganza Spillway was envisioned after the great flooding of 1927, and completed in 1954 in the sure and certain knowledge that at some time in the future there would be another flood that might widen the river to 80 miles across in places. If the Thames flooded like that it would nearly reach the south coast. What I admire is the collective political and engineering understanding that built and maintained this structure through all these years in which it wasn’t needed: it was last opened in 1973. The actions of the engineers have turned flooding from a catastrophe causing loss of life and distressing rescue, into a predictable disaster – the flooding of the surrounding farmlands has been completely predictable, giving people many days’ notice and allowing them to leave their homes safely.

Reading about the smart flood management, I allowed myself to imagine that the collective might of the US Army Corps of Engineers might be deployed to, say, Bangladesh or Pakistan, to wage a ‘War on Water’. Imagine if they constructed dams and levees to protect the country from sea flooding in case of sea level rise, and created designated flooding areas to manage floods from extreme rains or melting. It might cost a few billion dollars, but the benefits would last for a century or more – giving people plenty of time to do all the things we love to do by the riverside.

Do you want to bet on whether sea level is rising?

March 6, 2011

 

Inundation in Newport Beach

The models we use to predict future climate are impressive, but not perfect, and so there are uncertainties in our knowledge of what will happen to Earth’s climate. In this sense uncertainty around Climate Change is no different from uncertainty around the consequences of any other activity we undertake. And yet we need to make decisions now in the face of this uncertainty. I was struck by this reality as I read this story in the LA Times on the decision by Newport Beach council to undertake works to mitigate the effects of any possible sea level rise.

Many people in this wealthy conservative enclave think the whole Climate Change issue is nothing more than nonsense. But nonetheless, if sea levels are rising then they will still get wet whether they believe sea levels are rising or not! It comes down to the simple question of whether people are prepared to bet with their homes that sea levels are not rising. With houses costing millions of dollars each, the answer appears to be ‘No’. The article ends with a quote from Newport Beach Councilwoman Nancy Gardner:

“We aren’t going to retreat — we’ve got so much invested in real estate, but the sooner we can start to think long-term, the more creative we can be in our solutions.” [My italics]

I like the way she puts this – and I wish that others could grasp the simplicity of her approach:

The sooner we can start to think long-term, the more creative we can be in our solutions.

I’ll say ‘Amen’ to that.


Adapting to Climate and Climate Change

January 13, 2011
Picture of Brisbane in 1974

When we see weather extremes such as the current Brisbane floods, it’s easy to associate such a dramatic event with Climate Change – after all sea surface temperatures off the east of Australia have been in excess of 26 °C – and that’s where all the water has come from. However, as the above picture from 1974 shows, floods are nothing new in Brisbane, and this level of flooding appears to be roughly a 1 in 50 year event. In other words this is part of the normal climate of the area – or at least there is no obvious evidence otherwise.

And so the question that occurs to me is this: Why did people build a city in a place where they knew there would be floods? Now I know that a dam was built to reduce the risk of flooding, and so presumably things would be worse now if the dam had not been built. But nonetheless the question remains – is it smart to build houses in a place that catastrophically floods once in 50 years? I think the answer is ‘No’.

Now I suspect almost the entire population of Brisbane would disagree with me. In the same way, the people of New Orleans decided after Katrina that the answer to the question ‘Is it smart to build a city below sea level in a hurricane belt?’ was ‘Yes’. Why do people make such decisions? Well, the answer is that the individuals involved have lost everything they owned – and if they moved on and abandoned their old homes, then they would become effectively destitute. So staying makes sense for each individual. But eventually, in places where the flooding risk is so severe, and the costs of avoidance – in terms of dams and levees – so great, we will just have to say collectively: let’s move on. Because it seems pretty likely that Brisbane will flood again, and so will New Orleans.

So the problems of living in marginally sustainable places such as Brisbane and New Orleans are severe. And there the question we are coping with is one connected with the frequency of extreme events in a particular climate. Now suppose that people decide to stay and spend 10 billion pounds making the place safe for, say, 100 years. How many times, and how frequently, are people prepared to spend that amount of money before it becomes cheaper to move on – to move higher? If climate change is active, many coastal communities may find that 1 in 100 year events happen once every 30 years – and then the economic and personal choice becomes more evenly balanced. And it will be many tens of years before we can definitively conclude that the climate has changed. But that is the nature of the challenge we face.

Now these are difficult questions to answer for rich countries such as Australia or the USA. For a poor country, such as Bangladesh, there is just nowhere else for people to go and then the prospect of increased frequency of flooding isn’t just an economic question, but a matter of life and death.

Climate Change Discussion at Protons for Breakfast

November 25, 2010
Wind Farms in Texas

I am just back from discussing ‘Climate Change’ at Protons for Breakfast. And after having eaten – I was ravenous! – I am reflecting on a very moving evening.

  • So many people – children and adults – concerned and interested and coming along to these sessions
  • So many helpers giving up their time.
  • My friend Lindsay standing outside in the freezing cold to make sure that people found their way to the right car park!
  • Andrew Russell giving up his time to be an expert when all NPL’s experts were abroad!

And as we came to the end, one of the attendees stood up and encouraged everyone not to give up hope – and she related her experience of how things were changing in Africa and that solar photovoltaics were making a real difference. I remembered the first few times we had run Protons for Breakfast and how depressed I had felt about our situation. Now, I don’t feel depressed about our situation – even though I still don’t know what will happen. But now I feel that as the reality of our situation becomes apparent, humanity has the capability to act together. And although there will be squabbles and political manoeuvring, we will do something. It won’t be ideal, but it will be – in some sense – enough.

Arriving home I watched the BBC News where there was a feature about giant wind farms in Texas. The feature stressed how it was politically unacceptable to mention anything ‘green’ or global warming related, but the wind farms were there nonetheless – the largest wind farms in the world – colossal constructions harvesting a sustainable resource which should still be reaping rewards long after the ‘nodding donkeys’ beneath them have nodded for the last time.

The world really is changing – in small ways and in large ways – and momentarily I feel happy. Being amongst fellow citizens and work colleagues like these, I feel quite sure humanity will adapt to our new reality.

Homogenisation III: It’s complicated…

October 25, 2010
Figure 1 of the Menne and Williams Paper describing their method of homogenisation.

I have now written two blogs on homogenisation of climate data (this one and this one) and really want to get on with blogging about other things – mainly matters less fraught. So let’s finish this off and move on.

I realise that both my previous articles were embarrassingly oversimplified. Matt Menne sent me his paper detailing how he and his colleague Claude Williams homogenised the climate data. On reading the paper I experienced several moments of understanding, several areas of puzzlement, and a familiar feeling which approximates humiliation. Yes, humiliation. Whenever I encounter a topic I feel I should have understood but haven’t, I find myself feeling terrible about my own ignorance.

Precis…

You can read the paper for yourself, but I thought I would try to precis the paper because it is not simple. It goes like this:

  • The aim of the paper is to develop an automatic method (an algorithm) that can consider every climate station temperature record in turn and extract an overall ‘climate’ trend reflected in all series.
  • The first step is to average the daily maximum and minimum values to give averaged monthly minimum and maximum values and monthly averages. This averaging reduces the ‘noise’ on the data by a factor of approximately 5 (the square root of 30 measurements) for the maximum and minimum data and 7.5 for the average (the square root of 60 measurements).
  • Next we compare each station with a network of ‘nearby’ stations by calculating the difference between the target station data and each of its neighbours. In the paper, example data (Figure 1) is given that shows that these difference series are much less ‘noisy’ than the individual series themselves. This is because the individual series are correlated: for example, when the monthly average temperature in Teddington is high, then the monthly average temperature at nearby stations such as Hounslow is also likely to be high. Because the temperatures tend to go up and down together, the differences between them show much less variability than either series by itself.
  • The low ‘noise’ levels on the difference series are critically important. This allows the authors to sensitively spot when ‘something happens’ – a sudden change in one station or the other (or both). Of course at this point in the analysis they don’t know which data set (e.g. Teddington or Hounslow) contains the sudden change. Typically these changes are caused by a change of sensor, or location of a climate station, and over many decades these are actually fairly common occurrences. If they were simply left in the data sets which were averaged to estimate climate changes, then they would be an obvious source of error.
  • The authors use a statistical test to detect ‘change points’ in the various difference series, and once all the change points have been identified they seek to identify the series in which the change has occurred. They do this by looking at difference series with multiple neighbours (Teddington – Richmond, Teddington – Feltham, Teddington – Kingston, etc.) and identifying the ‘culprit’ series which has shifted. So consider the Teddington – Hounslow difference series. If Teddington is the ‘culprit’ then all the difference series which have Teddington as a partner will show the shift. However if, say, Hounslow has the shift, then we would not expect to see a shift at that time in the Teddington – Richmond difference series. (A rough code sketch of this idea appears after this list.)
  • They then analyse the ‘culprit’ series to determine the type of shift that has taken place. They have four general categories of shift: a step-change; a drift; a step-change imposed on a drift; or a step-change followed by a drift.
  • They then adjust the ‘culprit’ series to estimate what it ‘would have shown’ if the shift had not taken place.
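
To make the difference-series idea concrete, here is a toy sketch in Python. The station names, the numbers and the crude change-point test are all my own inventions for illustration – the real algorithm uses proper statistical tests and many neighbour pairs simultaneously – but it shows why a difference series is so much quieter than either station alone, and how a spurious step stands out in it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly-average temperatures (°C) for two nearby stations.
# Both share the same regional weather 'signal'; 'teddington' also has a
# spurious +1.5 °C step halfway through (e.g. the station was moved).
months = 240
regional = 10 + 5 * np.sin(2 * np.pi * np.arange(months) / 12) \
             + rng.normal(0, 1.0, months)
teddington = regional + rng.normal(0, 0.3, months)
hounslow = regional + rng.normal(0, 0.3, months)
teddington[months // 2:] += 1.5          # the artefact we want to detect

# The difference series is far less noisy than either station on its own,
# because the shared regional variability cancels out.
diff = teddington - hounslow
print("spread of Teddington alone:      %.2f °C" % teddington.std())
print("spread of the difference series: %.2f °C" % diff.std())

# Crude change-point check: compare the mean of the difference series before
# and after each candidate month and pick the largest shift.
best_month, best_shift = None, 0.0
for k in range(24, months - 24):
    shift = diff[k:].mean() - diff[:k].mean()
    if abs(shift) > abs(best_shift):
        best_month, best_shift = k, shift
print(f"suspected change point at month {best_month}, shift of about {best_shift:+.2f} °C")
```

With many such pairs, a shift that shows up in every difference series involving Teddington – but not in, say, the Richmond – Hounslow series – points to Teddington as the culprit, which is the logic described in the list above.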

So I hope you can see that this is not simple and that is why most of the paper is spent trying to check how well the algorithms they have devised for:

  • spotting change points,
  • identifying ‘culprit’ series,
  • categorising the type of change point
  • and then adjusting the ‘culprit’ series.

are working. Their methods are not perfect. But what I like about this paper is that they are very open about the shortcomings of their technique – it can be fooled, for instance, if change points occur in different series at almost the same time. However the tests they have run show that it is capable of extracting trends with a fair degree of accuracy.

Summarising…

It is a sad fact – almost an inconvenient truth – that most climate data is very reproducible, but often has large uncertainty of measurement. The homogenisation approach to extracting trends from this data is a positive response to this fact. Some people take exception to the very concept of ‘homogenising’ climate data. And it is indeed a process in which subtle biases could occur. But having spoken with these authors, and having read this paper, I am sure that the authors would be mortified if there was an unidentified major error in their work. They have not made the analysis choices they have because it ‘causes’ a temperature rise which is in line with their political or personal desires. They have done the best they can to be fair and honest – and it does seem as though the climate trend they uncover just happens to be a warming trend in most – but not all – regions of the world.

You can read more about their work here.

Homogenisation

September 25, 2010
Raw data: annual average minimum temperature from Reno, Nevada 1895 to 2005

UPDATE: This page contains errors! Please see the comments for clarification. I have posted a second version of this calculation here.

As I have mentioned in several recent posts, the raw data from even relatively sophisticated climate stations is really rather poor quality. Rather than just ignore this data altogether, researchers have looked at the data and noted that although the actual temperatures might not be correct to within perhaps ±2 °C, the errors in the measurement are likely to have remained constant for a long time. This is because the methods used to take the data have changed only very slowly. So researchers have looked at the data, not to determine the absolute temperature at that station, but in order to determine whether the temperatures have changed. There are many problems inherent in this, so I thought it would be interesting to show very explicitly the kind of thing involved in this endeavour. To show this I have extracted data from a slide that Matt Menne showed in his talk at the Surface Temperature Workshop.

The graph at the head of this article shows the raw data for a single station near Reno, Nevada. Each day a thermometer which records the maximum temperature and the minimum temperature is read, and the 365 (or 366 in a leap year) measurements of the minimum daily temperature are averaged to produce each data point on the graph. We can see that the year-to-year scatter is rather low. However, two features stand out and I have re-drawn the above graph highlighting these features.

Raw data for the annual mean minimum temperature highlighting significant features.

The first feature is a dramatic shift in the data: the years following 1937 appear to be between 3 °C and 4 °C colder than the years prior to 1936. This is a pretty obvious artefact and occurred because the station was moved from one location to another – microclimates vary by this much even over distances as short as a few metres! (Think about how one side of your car can have frost on it in the morning but the other side doesn’t!) The second feature is a strikingly linear 4 °C rise in temperature since 1975. This looks strongly like an Urban Heat Island (UHI) effect – a real effect – but not caused by a shift in climate. The question that the ‘homogenisation process’ tries to answer positively is this: can we recover any information at all from the above graph? To try to extract the trend of the data, researchers look at the difference between this station and its ten nearest neighbours – many kilometres away from this station. This difference data is shown below.

Difference between minimum temperatures at Reno and the mean from its 10 nearest neighbours.

If we add this difference data to the raw data, then we should be able to compensate for the local anomalies in the data. The compensated graph is shown below in red with the original data shown in grey. It is pretty clear that the adjusted data is a better representation of the climate-related temperature changes at the Reno, Nevada station than the original data.

Data for the mean annual minimum temperature after adjustment by comparison with its neighbours

The adjusted data do not show a sudden jump in temperature in 1936/1937. And they do not show the real rise in temperatures at the station due to the UHI effect. The data is said to have been homogenised. Now I have simplified the process a little, but not much. The professionals can make a statistical assessment of the uncertainty associated with the process.
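
As a toy illustration of the kind of adjustment described above – with invented numbers, and a much cruder procedure than the professionals actually use – the following sketch builds a fake ‘Reno-like’ record containing a relocation step and a UHI drift, uses the difference from a neighbour average to estimate both artefacts, and removes them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented annual-mean minimum temperatures, loosely in the spirit of the
# Reno example: a small regional trend, a -3.5 °C step when the station is
# 'moved' after 1936, and a steady urban heat island (UHI) drift after 1975.
years = np.arange(1895, 2006)
regional = 2.0 + 0.005 * (years - 1895) + rng.normal(0, 0.3, years.size)

station = regional.copy()
station[years > 1936] -= 3.5                          # station relocation
station += np.clip((years - 1975) * 0.13, 0, None)    # local UHI warming

# Mean of ten nearby stations: shares the regional signal, not the artefacts.
neighbours = regional + rng.normal(0, 0.3 / np.sqrt(10), years.size)

# The station-minus-neighbours difference series makes the artefacts obvious:
# a step around 1936/37 and a roughly linear rise after 1975.
diff = station - neighbours

# Estimate the relocation step from the difference series and undo it.
step = diff[(years > 1936) & (years <= 1975)].mean() - diff[years <= 1936].mean()
adjusted = station.copy()
adjusted[years > 1936] -= step

# Estimate the UHI drift (°C per year) from the late part of the difference
# series and undo that too.
late = years > 1975
drift = np.polyfit(years[late], diff[late], 1)[0]
adjusted[late] -= drift * (years[late] - 1975)

print("raw trend:        %+.2f °C per century" % (np.polyfit(years, station, 1)[0] * 100))
print("adjusted trend:   %+.2f °C per century" % (np.polyfit(years, adjusted, 1)[0] * 100))
print("regional 'truth': %+.2f °C per century" % (np.polyfit(years, regional, 1)[0] * 100))
```

The raw series has a trend dominated by the relocation step and the UHI ramp; the adjusted series recovers something close to the regional trend. The real analysis is of course far more careful about deciding which series contains the shift and how uncertain the adjustment is.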

When you look closely at the graphs of the temperature of the Earth versus time you will see that they are all labelled ‘temperature anomaly’ rather than temperature change. The data from the land surface portion of the Earth’s temperature record (around one third of the data) have ALL been adjusted in this way to highlight only changes in temperature common to many stations spread over a wide geographical area. The homogenisation analysis extracts from the original data only the portion of it which corresponds to long term trends and rejects artefacts which occur only in a single localised station.

Is this process fair? Well, I don’t know. From having talked with the scientists involved in this work I am sure that they are doing the best they can with the data which exists. I am 100% sure that they are not surreptitiously ‘fixing’ the maths to make the temperature rise for political ends. The rises they observe are a relatively robust signal. Could the whole process be flawed in some unanticipated way? Well, it’s possible, and that’s for you to make up your own mind about. But I feel obliged to point out that, whether you are convinced by this process or not, there are plenty of other reasons to be concerned about even the possibility that our climate might be changing.

Surface Temperature Workshop: Response to Paul M

September 21, 2010

PaulM left a comment on my Surface Temperature Workshop Blog, and my response was too long to fit in the small ‘reply’ box.

Your admiration and praise for the openness of this process is somewhat misplaced. See the comments by Roger Pielke at

http://pielkeclimatesci.wordpress.com/2010/09/20/candid-admissions-on-shortcomings-in-the-land-surface-temperature-data-ghcn-and-ushcn-at-the-september-exeter-meeting/

Key people in the field were not invited, and no information has been provided on who was invited or even who attended the meeting.

On the workshop blog, links to posts “will be limited to views from workshop participants”.

Also your remark “the spectrum of adjustments is generally as much positive as it is negative” is misleading. The net effect of the adjustments in the US is to introduce a warming of about 0.5 F over the period 1950-2000, as shown at
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html

PaulM

Thanks for that. I disagree, and I think Roger Pielke’s blog comments are unfair. He has disabled comments so I can’t respond there.

I am not a technical expert in this field – I am a general physicist and specialist in temperature measurement, and my contributions to the workshop were generally along the lines of insisting that uncertainty of measurement estimates be included in the data files from the outset. A pretty mundane input, but hopefully significant. I don’t know the details of the invitation process but the people involved did seem to me to feel slightly traumatised and so probably didn’t invite people they viewed as ‘hostile’. I think they wanted skeptics rather than cynics. What I didn’t (and still don’t) understand is why this community has been the focus of so much negativity. I think the reason they are being so widely criticised is because the output of their work indicates that the Earth is warming, and this is a politically unwelcome result for some people. From my point of view, if their work is wrong then the errors will show up eventually, but actually the ‘signal’ they see appeared to me to be fairly robust. There are plenty of other reasons to be concerned that humans might be affecting the climate and this is just one more, and IMHO, one of the less significant pieces of evidence.

The individuals I spoke with were very open to answering my ‘dumb’ questions. As a group, they seemed to me to be very genuine people who were just trying to communicate clearly what their research revealed. They spoke of their errors – and how any admission of error caused them to be pilloried – and they spoke of the stress of trying to work in the face of that.

You raised the issue of adjustments to the data and I have been slowly working on a blog posting on that specifically – hopefully in the next day or two. The key thing I learned at the meeting concerned this adjustment – the homogenisation process. Historical and current meteorological data was and is compiled for reasons other than climate research, and so with the possible exception of the new US climate reference network pretty much all the data from around the globe has large measurement uncertainties, probably greater than 1 °C – but these are mainly Type B (systematic) uncertainties. However, the Type A uncertainty – the reproducibility of the monthly or yearly-averaged data – is good, probably less than 0.1 °C. What this community has done is to ask the question ‘Can one do anything with this data?’ and the answer they give is ‘Yes’, providing one can assess the effect of shifts in the Type B terms. I have been slowly reading through the literature on this and their arguments seem sound. The key point is that it is essential to adjust the data to cope with shifts and drifts in the Type B terms. So the adjustments they make to the data do not ‘introduce’ a warming trend, they ‘reveal’ it.

All the best

Michael

