Archive for the ‘Protons for Breakfast’ Category

Science in the face of complexity

February 4, 2016
Jeff Dahn: Battery Expert at Dalhousie University

My mother-in-law bought me a great book for Christmas: Black Box Thinking by Matthew Syed. Thanks Kathleen 🙂

The gist of the book is easy to state: our cultural attitude towards “failure” – essentially one of blame and shame – is counterproductive.

Most of the book is spent discussing this theme in relation to the practice of medicine and the law, contrasting attitudes in these areas to those in modern aviation. The stories of unnecessary deaths and of lives wasted are horrific and shocking.


But when he moves on to engineering, the theme plays out more subtly. He discusses the cases of James Dyson, the Mercedes Formula 1 team, and David Brailsford from Sky Cycling. All of them have sought success in the face of complexity.

In the case of Dyson, his initial design of a ‘cyclone-based’ dust extractor wasn’t good enough, and the theory was too complex to guide improvements. So he started changing the design and seeing what happened. As recounted, he investigated 5,127 prototypes before he was satisfied with the results. The relevant point here is that his successful design created 5,126 failures.

One of his many insights was to devise a simple measurement technique that detected tiny changes in the effectiveness of his dust extraction: he sucked up fine white dust and blew the exhaust over black velvet.

Jeff Dahn

This approach put me in mind of Jeff Dahn, a battery expert I met at Dalhousie University.

Batteries are really complicated and improving them is hard because there are so many design features that could be changed. What one wants is a way to test as many variants as quickly and as sensitively as possible in order to identify what works and what doesn’t.

However, when it comes to battery lifetime – the rate at which the capacity of a battery falls over time – it might seem inevitable that testing would take years.

Not so. By charging and discharging batteries in a special manner and at elevated temperatures, it is possible to accelerate the degradation. Jeff then detects this with precision measurements of the ‘coulombic efficiency’ of the cell.

‘Coulombic efficiency’ sounds complicated but is simple. One first measures the electric current as the cell is charged. If the electric current is constant during charging, then the electric current multiplied by the charging time gives the total amount of electric charge stored in the cell. One then measures the same thing as the cell discharges.

For the lithium batteries used in electric cars and smart phones, the coulombic efficiency is around 99.9%. But it is that tiny amount (less than 0.1%) of the electric charge which doesn’t come back that is progressively damaging the cell and limiting its life.

One of Jeff’s innovations is the application of precision measurement to this problem. By measuring electric currents with uncertainties of around one part in a million, Jeff can measure that 0.1% of non-returned charge with an uncertainty of around 0.1%. So he can distinguish between cells that are 99.95% efficient and cells that are 99.96% efficient. That may not sound much, but the second cell loses 20% less charge per cycle!
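The arithmetic behind that last claim is easy to check. Here is a short sketch with made-up numbers (not Jeff’s data) showing why a shift from 99.95% to 99.96% efficiency is a 20% improvement in the charge lost per cycle:

```python
# Coulombic efficiency compares the charge stored during charging with
# the charge recovered during discharging. Illustrative numbers only.

def coulombic_efficiency(charge_in_ah, charge_out_ah):
    """Fraction of the stored charge that is recovered on discharge."""
    return charge_out_ah / charge_in_ah

# Two hypothetical cells, each charged with 1.000 Ah:
ce_a = coulombic_efficiency(1.000, 0.9995)   # 99.95 % efficient
ce_b = coulombic_efficiency(1.000, 0.9996)   # 99.96 % efficient

loss_a = 1 - ce_a   # charge lost per cycle: 0.05 %
loss_b = 1 - ce_b   # charge lost per cycle: 0.04 %

# The second cell loses 20 % less charge per cycle:
improvement = (loss_a - loss_b) / loss_a
print(f"Cell A loses {loss_a:.4%} per cycle, cell B loses {loss_b:.4%}")
print(f"Relative improvement: {improvement:.0%}")
```

The point of the precision measurement is exactly this: the difference between the two cells lives entirely in the fourth decimal place of the efficiency.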

By looking in detail at the Coulombic efficiency, Jeff can tell in a few weeks whether a new design of electrode will improve or degrade battery life.

The sensitivity of this test is akin to the ‘white dust on black velvet’ test used by Dyson: it doesn’t tell him why something got better or worse – he has to figure that out for himself. But it does tell him quickly which things were bad ideas.

I couldn’t count the ammeters in Jeff’s lab – each one attached to a test cell – but he was measuring hundreds of cells simultaneously. Inevitably, most of these tests will make the cells perform worse and be categorised as ‘failures’.

But this system allows him to fail fast and fail often: and it is this capability that allows him to succeed at all. I found this application of precision measurement really inspiring.

Thanks Jeff.






Explanations are not always possible.

April 28, 2015
I asked Google how to get from NPL to Richmond and it assumed I meant Richmond, Virginia, USA instead of Richmond upon Thames, which is 5 kilometres away.

I have spent a fair amount of time in my life trying to explain things to people. But I think now – in all but the most basic of cases – explanations are impossible.

The reason I feel this is because I think that giving an explanation is like giving directions. And most people will acknowledge that unless you know where someone is ‘starting from’, it is impossible to give general directions to a given ‘destination’.

But while we accept that every set of directions should start with the question: “Where are you now?”, people are reluctant to acknowledge that logically every explanation should start with the question: “What do you know now?”.

Instead there seems to be a widespread belief that explanations can exist ‘by themselves’.

Of course we can draw maps and then explain how to navigate the map. And if someone can follow this, then they can learn the route to a particular ‘destination’. Or someone might already be familiar with the ‘landscape’. In these cases explanations are possible.

But many people find maps difficult. However:

  • Getting someone to drive you to a destination does not in general teach you the route.
  • And programming a ‘sat-nav’ to take someone to a particular location will also – in general – fail to teach them the route. They may have ‘arrived’ but they will be ‘lost’.
  • Travelling by tube to a location teaches them nothing about where they are!

Similarly, by sophistry, or by entertaining imagery, it is possible to give people the illusion that they understand something. But unless they can link this new state of insight to their previous understanding, they will still be ‘lost’.

I thought I would illustrate the general idea with a picture of a route on a Google map. But when I tried to generate a route from Teddington to nearby Richmond (upon Thames), Google assumed that the word ‘Richmond’ referred to the much more populous Richmond, Virginia!

And the impossibility of explanations is clear in this video of Dave Allen ‘explaining’ how to tell the time. It features the classic lines:

“There are three hands on a clock. The first hand is the hour hand, the second hand is the minute hand, and the third hand is the second hand.”


Theories and Facts about Santa Claus

December 21, 2014

My friend Alom Shaha recently scripted the video above to try to explain the concept of ‘a scientific theory’.

I like the video, but I feel one point gets lost. And that is that ‘theories’ are like ‘plot lines’ in a TV detective programme – they link together ‘the facts’ to tell ‘a story’.

Example #1: Café Mimmo

  • I am observed at 10:15 a.m. leaving my house on Church Road
  • I am observed at 10:21 a.m. at the junction of Church Road and Walpole St.
  • I am observed at 10:27 a.m. near to Café Mimmo on Broad Street.
  • I am observed at 10:28 a.m. in Café Mimmo on Broad Street.
  • I am observed at 10:58 a.m. walking North on Church Road.

These are ‘the facts’. But what is really going on? Is there a ‘story’ that links all these facts together? Let me propose a theory:

  • Theory: Michael goes for coffee every day at Café Mimmo

This theory links all these facts i.e. it explains how they relate to each other.

If this is a good theory, it ought to be able to predict new facts Рand these predictions could then be checked.

Notice that the theory doesn’t specify the route I take to the Café. So even though the theory explains why I was seen on Church Road, it doesn’t predict that I always will be.

But the theory does predict that I will go to Café Mimmo every day. This could be easily tested. And if I didn’t visit every day, the theory could either be invalidated, or it might need to be revised to state that Michael goes for coffee most days at Café Mimmo.

I am sure you get the idea. But notice how the theory is simpler and more meaningful than a large number of facts. Facts tell us ‘what happened’, but theories tell us ‘what’s going on’.

Example #2: Santa Claus

  • On Christmas Day (usually, but sometimes on Christmas Eve) children all around the world receive presents.
  • On Christmas Eve parents leave out gifts – typically whisky (please) and a carrot – and these gifts are at least partly consumed by the morning.

These are ‘the facts’. But is there a theory that links these facts together? In fact there are two theories.

The traditional theory is that a mysterious being known as ‘Santa Claus’ delivers the presents on a sleigh pulled by flying reindeer and filled with an improbably large bag of toys.

Alternatively, some people contend that there is a vast conspiracy in operation in which otherwise honest and rational parents consistently lie to their children, year after year. According to this theory, it is in fact parents that buy all the presents, and fabricate forensic evidence of a visit by the fictional being ‘Santa Claus’.

The traditional theory has been heavily criticised, largely due to the unknown identity and quasi-magical abilities of the ‘Santa Claus’ figure.

However, I have never been a fan of conspiracy theories – they always seem rather unlikely to me. In particular I am sceptical that millions upon millions of parents would systematically lie to their children. I would never do such a thing.

So I will leave it up to you to imagine experiments that you could perform that would help to decide which theory is in fact correct.

One obvious experiment is to stay up late on Christmas Eve and watch for the arrival of ‘Santa Claus’.

But can I please ask you to be careful: Santa Claus is known never to bring gifts to boys and girls that don’t go to sleep at night. So use a webcam.

But whatever you believe: I hope you have a Happy Christmas 🙂

Protons for Breakfast

December 14, 2014
The last few moments of the 20th presentation of Protons for Breakfast. (Picture by Lindsay Chapman)

Last Thursday I finished the twentieth and final presentation of Protons for Breakfast, and this weekend I am busy trying to do nothing: mainly sleeping, but in turn feeling sad, happy, proud and relieved.

I am lost for words. However, despite being lost for words, I want to say three things.

Thing one: Thank you

I have loved putting on the course and learned so much in so many ways in doing so. So the first words are simple:

  • Thank You.

To whom?

  • To ‘the helpers’ without whom the course would not be possible: it is rare to have such great colleagues and friends.
  • To the expert helpers – especially those who travelled to take part.
  • To my wife and children who have put up with 12 weeks of psychological absence each year for the last 10 years.
  • To NPL management, especially Fiona Auty, who has supported this kind of thoroughly non-corporate outreach.
  • To everyone who attended, because fundamentally the course was for, and about, you rather than me.

And finally to Jorge Ramirez who – just when I thought I had been given the ultimate gift (a framed triple-point-of-water cell!) – topped that with the Protons for Breakfast Song!

Thing two: So why am I stopping?

Protons for Breakfast is a very personal course: and it needs to be presented by me.

That is part of what has made it successful Рthat I am genuinely present and not reading from any kind of script Рbut ultimately that makes it unsustainable.

This wasn’t what I thought originally. I had hoped that members of the team of helpers would take over parts of the course and that gradually it would become more of ‘a production’.

However, making that shift would have involved much more work – and since I wasn’t able to make that change, it seemed better to just keep on doing what we had been doing.

The immediate reason for stopping now doesn’t really matter. But the more profound reason is that this kind of activity – relatively free-form and focussed on a particular individual – does not sit easily in any kind of modern corporate structure.

So despite the good will and support from many individuals, ending the course while it was popular and successful was probably best for all concerned.

And hopefully NPL or someone else will ‘pick the bones’ of the course and create a replacement that is sustainable.

Thing three: What Next?

I don’t know. And at this point, I don’t want to think about it.

But I do have a few ideas! And if I ever catch my breath and get my energy back, perhaps I will actually make them real. Because:

Protons for Breakfast… is what you need…
Protons for Breakfast… will get you thinking…


I love data

November 24, 2014
Lake Windermere on a misty morning – or at least that’s what Wikipedia claims this is.

Lake Windermere in the UK’s Lake District is beautiful.

Standing alone, as from a rampart’s edge,
I overlooked the bed of Windermere,
Like a vast river, stretching in the sun.
With exultation, at my feet I saw
Lake, islands, promontories, gleaming bays,
A universe of Nature’s fairest forms
Proudly revealed with instantaneous burst,
Magnificent, and beautiful, and gay.

Thus Wordsworth described what he saw.

The UK’s Lake Ecological Observatory Network (UKLEON) also ‘looks’ at Lake Windermere and eight other lakes. Its observations are recorded less poetically than Wordsworth’s. But its data tell a story that Wordsworth missed: the story of how the lake interacts with its environment.

For example, the graph below shows the temperature of Lake Windermere at different depths below its surface. The data tell us that below 20 metres, the lake undergoes only a small seasonal temperature change. Nearer the surface the seasonal temperature changes are more dramatic, with the upper few metres showing changes of 18 °C from winter to summer.

Temperature versus depth profiles in Lake Windermere for the last year.
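The conclusion drawn from the graph (a large swing near the surface, almost none at depth) is easy to extract programmatically once the readings are in hand. Here is a sketch using synthetic data chosen to imitate the behaviour described above; the real UKLEON files will differ in format:

```python
import math

# Estimate the seasonal temperature swing at each depth from a year of
# readings. The numbers below are synthetic, not real UKLEON data.

def seasonal_swing(readings):
    """Maximum minus minimum temperature over the year, per depth."""
    return {depth: max(temps) - min(temps) for depth, temps in readings.items()}

# Fake a year of daily temperatures: a large annual cycle near the
# surface, shrinking to almost nothing below 20 m.
days = range(365)
readings = {
    depth: [8 + amplitude * math.sin(2 * math.pi * d / 365) for d in days]
    for depth, amplitude in [(1, 9.0), (5, 7.0), (10, 4.0), (20, 1.0), (30, 0.5)]
}

for depth, swing in sorted(seasonal_swing(readings).items()):
    print(f"{depth:>3} m depth: seasonal swing ~ {swing:.1f} °C")
```

With an amplitude of 9 °C at 1 m depth, the computed swing comes out at about 18 °C, matching the figure quoted in the text.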

Looking at the corresponding data for Blelham Tarn, a smaller lake which feeds into Windermere, we see strikingly similar changes but with differences that reflect the ecology of that particular lake.

Temperature versus depth profiles in Blelham Tarn for the last year.

UKLEON makes lots of data available to allow a wide range of studies of lake ecology. Specifically data is available on:

Air Temperature; Barometric Pressure; Wind Speed; Wind Direction (from vector addition of unit length vectors); Solar radiation; Surface photosynthetically available radiation; Surface water temperature; Temperature Profiles; Underwater photosynthetically available radiation (1 m); Conductivity normalised to 25 °C; pH; Dissolved oxygen (% saturation); Dissolved oxygen (mg per litre); Dissolved carbon dioxide; Chlorophyll a concentration (relative units); Phycocyanin concentration (relative units).

And UKLEON is just a small part of the UK Environmental Change Network, UKECN. Through this portal you can gain access to a wider range of data in a wider range of environments.

For example, the figure below shows the earliest recorded spawning date for frogs over a period of 13 years to 2010. Who knew that data was available? Or maybe you are interested in butterfly counts? UKECN has the data for you.

The day of year on which frogs were first observed to spawn at 9 different sites around the UK.

I am not an ecologist, and so I don’t understand the significance of many of these measured quantities.

But I love the fact that the data is available for anyone to look at and perhaps make their own discoveries.


The coolest sandpit in the world.

November 17, 2014

At the end of October 2014 I visited the British Geological Survey (BGS) in Keyworth, near Nottingham.

I was attending a meeting about geological repositories for either nuclear waste or carbon dioxide.

In the foyer of the BGS was an ‘interactive sandpit’ in which the height of the sand was monitored by a Kinect sensor (as used with an Xbox games console). From the sand-height measurements a computer then calculated an appropriate ‘contour’ image to project onto the sand.

The overall effect was magical and I could have played there for much longer than felt appropriate.

Schematic diagram of the ‘interactive sand pit’. A Kinect sensor determines the sand height and a computer then calculates an appropriate image to project onto the sand.
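The core computational step (turning a grid of sand heights into a coloured contour image) is surprisingly simple to sketch. The band edges and colours below are invented for illustration; the BGS system presumably does something far smoother:

```python
# Map sand heights (as a depth sensor might report them, in cm) to
# discrete 'contour' colour bands for the projector. The band limits
# and colours here are made up for this sketch.

BANDS = [                     # (upper height limit in cm, colour)
    (5, "deep blue"),         # valleys become water
    (10, "green"),            # low ground
    (15, "brown"),            # hills
    (float("inf"), "white"),  # peaks become snow
]

def band_colour(height_cm):
    """Return the colour of the first band that covers this height."""
    for limit, colour in BANDS:
        if height_cm <= limit:
            return colour

def contour_image(heights):
    """Convert a 2-D grid of heights into a grid of colours."""
    return [[band_colour(h) for h in row] for row in heights]

heights = [[2.0, 7.0],
           [12.0, 20.0]]
print(contour_image(heights))
# [['deep blue', 'green'], ['brown', 'white']]
```

Re-run this every time the sensor delivers a new height frame and you get the magical effect of contours chasing your hands through the sand.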

The meeting itself was fascinating with a variety of contributors who had completely different perspectives on the challenges.

However, what is holding back the construction of a UK repository for nuclear waste is nothing to do with the scientific or engineering challenges: it is a failure of political leadership.

The UK has been a pioneer of nuclear power, and we have reaped the benefits of that technology.

But we have been a laggard at cleaning up the radioactive waste generated by the nuclear industry. In this field Sweden and Finland have led the way.

Admittedly their repositories will be smaller than the UK’s, and so easier to construct: I have been informed that the UK’s repository will need to be ‘about the size of Carlisle’. But it is all do-able.

And when the UK eventually builds a repository, its cost will be inflated by the need to ensure the safety of the repository for a million years. What?… did I just say… one million years? ‘Yes’ I did. And ‘Yes’, that’s bonkers.

This time-scale makes for a number of unique challenges. At the meeting I attended, scientists were confident of the safety for a time-span somewhere between 10,000 and 100,000 years. And frankly, for me that would be good enough.

The ridiculous specifications that must be guaranteed before construction can begin contrast with the laissez-faire attitude towards burning carbon and affecting Earth’s climate. Why do we not have a moratorium on emitting carbon until we can be sure it is safe?

For example, one area of uncertainty is the potential significance of microbiological fauna within rocks deep below the Earth, something about which we know very little. Do we have to wait until we can understand the millions of as-yet-undiscovered microbes before we can proceed?

Of course the main uncertainty – which is ultimately unresolvable – arises from the extreme lengths of time under consideration. This leads to the consideration of extremely unlikely scenarios.

For example, the Swedish repository company SKB is carrying out extensive research on what will happen to the repository if there is another ice age, and the repository is covered by several kilometres of ice.

First of all, given the problème du jour of global warming, this is frankly unlikely. And secondly, if Sweden were covered by several kilometres of ice, then of course all the people in Sweden would already be dead! At that point the safety of the repository would be a moot point.

You can learn about this research in three short but intensely dull videos here.

Wind versus Nuclear: The real story in pictures

November 3, 2014
Graph showing the electricity generated by nuclear and wind power (in gigawatts) every 5 minutes for the months of September and October 2014. The grey area shows the period when wind power exceeded nuclear power. (Click Graph to enlarge)

For a few days in October 2014, wind energy consistently generated more electricity in the UK than nuclear power. Wow!

You may have become aware of this through several news outlets. The event was reported on the BBC, but curiously the Daily Mail seems not to have noticed.

Alternatively, you may, like me, have been watching live on Gridwatch – a web site that finally makes the data on electricity generation easily accessible.

I was curious about the context of this achievement and so I downloaded the historically archived data on electricity generated from coal, gas, nuclear and wind in the UK over the last three years. (Download Page)

And graphing the data tells a powerful story of the potential of wind generation – but also of the engineering challenges involved in integrating wind power into a controllable generating system.
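Finding the spells when wind exceeded nuclear in the 5-minute records takes only a few lines. This sketch uses a tiny synthetic list of records; the column names in the real Gridwatch files should be checked before adapting it:

```python
# Find runs of consecutive 5-minute records in which wind output
# exceeded nuclear output. The records below are synthetic (values in GW).

def wind_beats_nuclear(rows):
    """Yield (start, end) index pairs of runs where wind > nuclear."""
    start = None
    for i, row in enumerate(rows):
        if row["wind"] > row["nuclear"]:
            if start is None:
                start = i
        elif start is not None:
            yield start, i
            start = None
    if start is not None:
        yield start, len(rows)

rows = [
    {"wind": 5.8, "nuclear": 6.1},
    {"wind": 6.3, "nuclear": 6.1},
    {"wind": 6.5, "nuclear": 6.0},
    {"wind": 5.5, "nuclear": 6.0},
]
for start, end in wind_beats_nuclear(rows):
    print(f"wind exceeded nuclear for {(end - start) * 5} minutes")
# wind exceeded nuclear for 10 minutes
```

Run over the full three-year archive, the same loop picks out the grey region shaded in the graph at the top of the page.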

The challenges arise from the fluctuations in wind power which are very significant. The first challenge is in the (un)predictability of the fluctuations, and the second challenge is coping with them Рwhether or not they have been predicted. Both these challenges will grow more difficult as the fraction of wind energy used by the grid increases over the next decade.

As an example, consider in detail an event earlier in October, shown in the graph at the top of the page.

Detail from the graph at the top of the page showing how, earlier in October, wind power went from an impressive 6 GW to less than 1 GW in a period of around 18 hours. (Click Graph to enlarge)

The grid operators have a wind forecast running 6 to 24 hours ahead and would have planned for this. The forecasts are typically accurate to about 5%, and so at the high end that amounts to a margin of error of 0.3 GW – which is within the reserves that the grid can cope with routinely.

However the fluctuations in wind power are becoming larger as the amount of wind power increases. The graph below shows the monthly averages of electricity produced by Wind and Nuclear since May 2011. Also shown in pink and light blue are the data (more than 300,000 of them!) taken every 5 minutes.

Monthly averages of electricity produced by Wind and Nuclear since May 2011. Also shown in pink and light blue are the data (more than 300,000 of them!) taken every 5 minutes. It is clear that the fluctuations in wind power are large – and getting ever larger. (Click Graph to enlarge)

Incorporating wind energy is a real engineering challenge which costs real money to solve. Nonetheless, as explained in this excellent Royal Academy of Engineering report, we expect capacity to double to ~20 GW by 2020, and to at least double again by 2030. So these problems do need to be solved.

Because wind-generated electricity supply does not respond to electricity demand, as the contribution of wind energy grows we will reach two significant thresholds.

  • When demand is high, unanticipated reductions in wind-generated supply could exceed the margins within which the grid operates.
  • When demand is low, unanticipated increases in wind-generated supply could exceed the base supply from nuclear power, which cannot be easily switched off.

These challenges will require both economic and engineering adaptations. At the moment, because the marginal cost of wind power is so low, we basically use all the wind power that is available.

However, it is possible to ‘trim’ wind turbines so that they do not produce their maximum output. In a future system with 40 GW of wind generating capacity, we might value predictability and controllability over sheer capacity. Then as the wind falls, the turbines could adjust to try to keep output constant.
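A minimal sketch of that trade-off: a farm that could sometimes deliver more is run instead at a constant, conservative setpoint, so its output stays predictable until the wind genuinely cannot supply it. The numbers are illustrative only.

```python
# 'Trimming' for predictability: deliver a constant target when the
# wind allows it, and all available power otherwise.

def trimmed_output(available_gw, target_gw):
    """Output of a wind farm held at a deliberate setpoint."""
    return min(available_gw, target_gw)

available = [6.0, 5.2, 4.1, 3.0, 2.2]   # falling wind over several hours
target = 3.5                             # conservative, predictable setpoint

delivered = [trimmed_output(a, target) for a in available]
print(delivered)
# [3.5, 3.5, 3.5, 3.0, 2.2]
```

The cost of the scheme is obvious from the first two readings: 2.5 GW and 1.7 GW of available power are deliberately thrown away in exchange for hours of steady, plannable output.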

These challenges lie ahead and are difficult but entirely solvable. And their solution will be essential if we really want to phase out fossil fuels by 2100.

But for the moment wind is providing on average about 2 GW of electrical power, which is around 6% of UK average demand. This is a real achievement and as a country we should be proud of it.

Perhaps someone should tell the Daily Mail.

Bent Hamer and the Kilogram

September 25, 2014

Tonight I find myself a thousand miles from home in a hotel in Espoo, on the outskirts of Helsinki. Outside it is raining and the temperature is just 10 °C.

I am here to discuss with colleagues from around Europe some of the minutiae associated with a new definition of the units of temperature: the kelvin and the degree Celsius.

You really don’t want to know the details: we worry about them so that you don’t have to. 🙂

And the day is auspicious. Thursday 25th September marks the 125th anniversary of the adoption of the kilogram as the international standard of mass. You can read NPL’s commentary on the anniversary here. And there is a new film release ‘about metrology’.

The movie at the top of the page is a trailer for a film by Bent Hamer which appears to use the kilogram as a metaphor for… well, I haven’t seen the movie yet.

But the mere existence of the movie does indicate that the ‘kilogram problem’ has entered popular culture – at least to some limited extent – and that is >fantastic<.

My wife and I have admired Bent Hamer’s previous films and sought them out at out-of-the-way cinemas. And we shall probably have to do the same with this one.

Bent Hamer’s films about IKEA researchers and retired railwaymen were not really about IKEA researchers or retired railwaymen. And I am sure this film is not really about the kilogram.

It is probably about the same thing that every other Bent Hamer film is about: the weirdness of other people’s ‘normal’ lives, and by implication, the weirdness of our own lives. And how important it is to nonetheless grab whatever happiness we can from the passing moments.

But I am filled with excitement at the prospect of this film. Parts of it are definitely shot at the International Bureau of Weights and Measures (BIPM) which is a thrill for us ‘insiders’ to see.

And he has certainly caught something of the obsessive personality disorder that – if not actually required – tends to accompany an interest in metrology.

I suspect that Hamer’s fondness for humanity would probably lead him to sympathise with a statement such as

Man is the measure of all things

And if I met him I would probably have to disagree.

The fact that I can type this article on a computer in Finland and have it appear on a server hosted in the United States, and be viewed wherever you are viewing this, rests on agreed measurement standards that are not amenable to different people’s opinions.

And the whole purpose of almost everything I do in my work – including this meeting – is to move beyond situations where correct answers are ‘a matter of opinion’.

But nonetheless, to see metrology dramatised in this way brings a smile to my face, and yields a frisson of simple pleasure.

I can’t wait for ‘1001 kelvin’.


What do you do with an old nuclear reactor?

September 11, 2014
To search for additional amounts of radiation in the scrap from a nuclear power station you first need to screen out the normal level of radioactive background. To do this you must build a ‘chamber’ using special, non-radioactive bricks.

I find myself in the Hotel Opera, Prague this rainy Thursday evening, tired after having spent a fascinating day at the Czech Centre for Nuclear Research UJV Rez.

There I saw one outcome of a European collaboration (called MetroRWM) designed to answer just one of the difficult questions that arise when one needs to take apart an old nuclear power station. This is something Europe will need to become good at in the near future.

This didn’t concern the highly-radioactive parts of the power station: that’s another story.

This concerned the 99% of a nuclear power station which is no more radioactive than a normal power station.

What should happen is that this material should join the normal scrap system and be re-used.

However, the understandable surplus of precaution that surrounds nuclear matters will prevent this, unless every single bucket load of concrete or scrap metal can be verified to have a level of activity less than a specified standard.

The collaboration based at UJV Rez have built an apparatus to do just that. And most importantly, they have proved that it works – i.e. that tiny hot-spots on the inside of pipes can be detected quickly and reliably.

Here is how it works.

To detect the tiny levels of radiation potentially coming from hidden radioactive particles, the apparatus uses ultra-sensitive radiation detectors.

However these detectors are useless if they are not shielded because our normal environment contains too much radioactive material. So the first step is to shield the detectors.

The low-background chamber at UJV Rez. At the far end you can see a fork lift truck has just loaded a pallet, which will travel through the chamber and emerge at this end. The doors at this end are currently closed.

The UJV team did this by building a ‘room’ using a special type of brick which is almost as good as lead at keeping out radiation, but much cheaper, much lighter, and much easier to work with. Using this they lowered the level of radiation inside to just 1% of the background radiation.

The two ultra-sensitive radiation detectors can be seen inside the shielded room as the doors open to allow the entry of a test pallet.

They then built a system for loading pallets of material onto a conveyor at one end, and drawing them through the shielded room to check the radioactivity in all parts of each pallet. The measurement took about 5 minutes, and after this the pallet emerged from the other end (video below).

The key questions are:

  • How do you ensure that ‘not detecting something’ means that there is nothing there?
  • Could some activity slip through if it were shielded by some gravel, or steel piping?
  • Could it slip through if it was in the bottom corner of the pallet?
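The first question – what ‘not detecting something’ can mean statistically – has a standard answer in counting statistics. This sketch uses a simplified Currie-style decision threshold with invented count numbers; the actual UJV procedure is not described here and is certainly more sophisticated:

```python
import math

# Decide whether a pallet's counts are significantly above background.
# Uses the simplified decision threshold L_c = k * sqrt(2B) for a paired
# background measurement (normal approximation to Poisson counting).

def decision_threshold(background_counts, k=1.645):
    """Net counts needed before a result counts as a 'detection' (~95 %)."""
    return k * math.sqrt(2 * background_counts)

background = 400      # counts from an empty pallet in a 5-minute run
measured = 447        # counts from the pallet under test

net = measured - background
threshold = decision_threshold(background)
verdict = "investigate" if net > threshold else "release as clean"
print(f"net counts = {net}, threshold = {threshold:.1f} -> {verdict}")
# net counts = 47, threshold = 46.5 -> investigate
```

The practical lesson is the same one the shielded room embodies: the smaller the background B, the smaller the threshold, and the fainter the hot-spot that can be reliably caught in a 5-minute run.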

To answer these questions the UJV team, in collaboration with scientists across Europe, created samples that simulated many of these possible scenarios.

Pallets of ‘radioactive’ waste. These pallets are a standard size, but their thickness is determined by the need to be sure any radioactivity trapped inside can be detected. The pallets above have been made very slightly more radioactive than the background.

One of their clever ways of testing the machine was to create samples of known radioactivity and place them inside hollow steel balls (actually petanque balls!).

A colleague showing how a very low-level sample of known activity can be placed inside a hollow steel ball, simulating radiation trapped inside steel pipes.

The machine could then search for the activity when the balls were arranged in many different ways.

A pallet filled with steel balls, some of which have radioactive samples of known activity concealed inside.

The aim of all this effort is that scrap material like that in the picture below can be rapidly screened on-site and sent for recycling, in the confidence that no hazard will ensue at any time in the future, no matter how the material is treated.

The aim of the system is to screen very diverse scrap such as these old pipes and ducts.

These measurements are not easy – but this work really impressed me.

Would you like to measure the surface temperature of the Earth?

August 18, 2014
Our estimate of global mean temperature is derived from analysis of around 30,000 individual station records such as the one above. This graph shows how a single station record must sometimes be pieced together from fragmentary data in different holdings. Data from Geoscience Data Journal 30 JUN 2014 DOI: 10.1002/gdj3.8

Every few weeks an e-mail filters through to me from NPL’s web enquiry service. One arrived last week:

  • We seem to be taking far reaching conclusions (climate change) from temperature measurements taken around the world over the last 300 years. Taking into account the geographic spread of such readings and the changes in methods of measurement over the years is this scientifically robust? ¬†¬†

I wrote a rather long-winded reply, which you can read below, but after I sent it I realised I had forgotten something: I could have invited the enquirer to work out the answer for themselves!

That might have been a bit cheeky, because it is not a simple task. However, the data is freely available and its provenance has never been better established than in the databank released by the International Surface Temperature Initiative (ISTI).

ISTI is an effort to which I have myself contributed minimally by sitting on its steering committee and occasionally answering questions related to calibration and the fundamentals of temperature measurement.

The motivation for ISTI was to make sure the entire databank was publicly available. This was back in 2010, when global warming sceptics were claiming that global warming was a hoax. Since then sceptics have switched to believing all the data and claiming that its fluctuations are of profound significance. In any case, making the data publicly accessible and free allows even small research groups – or even you, if you are not too busy – to make their own analyses.

The paper linked here describes the structure of the databank and the way it has been constructed. It is a long paper, and here I only want to draw your attention to the detailed scrutiny paid to every one of the 30,000 station records.

Each station record has a latitude, longitude and altitude associated with it. But one needs to decide whether a short record from one source is a unique record – in which case it must be included – or a duplicate of some fraction of a nearby station's record. Alternatively, as the graphic at the top shows, different parts of an individual station record may be available in different data holdings, possibly with mis-spelt names.

This kind of analysis is tedious in the extreme, but is actually essential in creating a databank that people can trust. I am filled with admiration for the achievements of my ISTI colleagues. Now anyone can work out the surface temperature of the Earth for themselves.

So what are you waiting for?


My Reply to the NPL Enquiry

Hello. I am an NPL scientist specialising in temperature measurement. Amongst other claims to fame, I have made the most accurate temperature measurements in human history, and I also sit on the steering committee of a group (the International Surface Temperature Initiative) that has established an open-source database of surface temperature records. I have been asked to respond to your message, which reads:

  • We seem to be taking far reaching conclusions (climate change) from temperature measurements taken around the world over the last 300 years. Taking into account the geographic spread of such readings and the changes in methods of measurement over the years is this scientifically robust? ¬†

The short answer is ‚ÄėYes, it is scientifically robust‚Äô but it is not at all obvious that it should be so.

Let me give a slightly longer answer.

You are quite right to be concerned, but in fact the data are analysed in a careful way which we believe eliminates most (if not all) of the inevitable distortions.

First of all, the data.

Typically the data consist of twice-daily readings of the maximum and minimum temperatures taken inside a Stevenson screen. These are averaged to produce a daily ‘average’, and this is then further averaged to produce a ‘monthly mean temperature’. Because of this averaging, the monthly mean temperatures have rather low variability and enable trends to be seen more easily than might be expected.

This Met Office website shows typical data from the UK.
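
The variance-reducing effect of that averaging can be sketched numerically. In the toy example below the 15 °C 'climate' value and the 3 °C day-to-day scatter are invented, and real daily temperatures are correlated from one day to the next, so the true reduction is somewhat smaller than the independent-days figure:

```python
import random
import statistics

random.seed(1)

# Thirty synthetic daily means: a fixed monthly climate value of 15 °C
# plus independent day-to-day weather noise of 3 °C standard deviation.
daily = [15.0 + random.gauss(0.0, 3.0) for _ in range(30)]

monthly_mean = statistics.mean(daily)
daily_scatter = statistics.stdev(daily)        # roughly 3 °C
expected_monthly_scatter = 3.0 / 30 ** 0.5     # ≈ 0.55 °C if days were independent

print(monthly_mean, daily_scatter, expected_monthly_scatter)
```

Averaging 30 noisy days thus produces a monthly number whose scatter is several times smaller than the weather itself, which is what lets slow trends show through.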

Secondly, the analysis.

The monthly mean temperature data from around the world have been analysed by four different teams and – very substantially – they all agree that the air temperature above the surface of the Earth has warmed substantially in the last century. One key part of their analysis is that a station's data are compared only with that station's own history. So we look back through a single station record and ask “Does it have a rising or falling trend?”. If life were simple – which it never is – all we would have to do would be to extract the trends and average them over the correct area of the Earth's surface. But life is never simple.
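
Extracting a trend from a single clean record is the easy part: a least-squares slope. A minimal sketch on synthetic data (the 0.7 °C-per-century figure below is invented for the example):

```python
def linear_trend(xs, ys):
    """Least-squares slope of ys against xs: the warming (or cooling)
    trend of a single station record, compared only with itself."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

years = list(range(1900, 2000))
temps = [10.0 + 0.007 * (y - 1900) for y in years]  # synthetic station record
print(linear_trend(years, temps) * 100)             # slope in °C per century
```

The hard part, as the next paragraphs explain, is deciding what to do with records that are not clean.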

As one looks back through a record one sees many ‘glitches’, or sudden changes, and typically there is no explanation for why they occurred. Were they real? Should we leave the glitch in the data? Or try to correct it? And if we correct it, which way should we adjust the data? Typically these glitches occur when an instrument is changed, or when a station is re-sited, e.g. from one side of an airfield to another. Different teams take different approaches, but all of them take great care not to distort the data.

Let me give you one example. This is called the Pairwise Homogenisation Algorithm (PHA), and it works especially well in places such as Europe or the USA where there is a high density of weather stations. Since we are looking for a signal of ‘climate change’, we would expect it to show up on all station records in an area a few hundred kilometres across. So if we detect a glitch, we can compare that individual record with all nearby station records, one pair at a time. In this way we can see whether the glitch is real or just affects that one station, and we can correct it if we choose to. The PHA can also detect the so-called ‘urban heat island’ effect.
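
The pairwise idea can be caricatured in a few lines of code: subtract each neighbour's record from the candidate's, and a step that survives in every difference series is a local artefact rather than regional climate. The records below are invented, with a deliberate +1 °C jump inserted half-way through the candidate:

```python
def difference_series(station, neighbour):
    # Subtracting a neighbour removes the shared regional climate signal.
    return [s - n for s, n in zip(station, neighbour)]

def step_size(diff, k):
    # Mean of the difference series after index k minus the mean before it.
    before = sum(diff[:k]) / k
    after = sum(diff[k:]) / (len(diff) - k)
    return after - before

regional = [10.0, 10.2, 9.9, 10.1, 10.0, 10.3, 10.1, 9.8, 10.2, 10.0]
candidate = [t + (1.0 if i >= 5 else 0.0) for i, t in enumerate(regional)]
neighbours = [regional, [t + 0.1 for t in regional]]

# The +1.0 °C step shows up against *every* neighbour, so it is local:
for nbr in neighbours:
    print(step_size(difference_series(candidate, nbr), 5))
```

The real PHA must also locate the breakpoint, judge its statistical significance and handle neighbours with their own glitches, but the logic of the comparison is the same.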

Thirdly, geography.

So the PHA allows us to extract a trend from around 30,000 weather station records. But these are not distributed uniformly around the globe, and critically, Africa and the poles have relatively short data records. Nonetheless, based on reasonable assumptions, the warming trend does seem to be widely distributed.
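
Averaging 'over the correct area of the Earth's surface' means weighting: a band of latitude near a pole encloses far less area than the same band at the equator. A common sketch is to weight each latitude band by the cosine of its latitude (the anomaly numbers below are invented):

```python
import math

def area_weighted_mean(bands):
    """Mean of (latitude, temperature) pairs weighted by cos(latitude),
    so that small polar areas do not dominate the global average."""
    weights = [math.cos(math.radians(lat)) for lat, _ in bands]
    total = sum(w * t for w, (_, t) in zip(weights, bands))
    return total / sum(weights)

# Invented warming anomalies (°C) for six latitude bands:
bands = [(75, 2.0), (45, 0.8), (15, 0.4), (-15, 0.4), (-45, 0.7), (-75, 1.8)]
print(area_weighted_mean(bands))  # smaller than the naive mean, ~1.02 °C
```

Because the strong (invented) polar warming covers little area, the weighted global figure comes out well below the naive average of the six numbers.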

And finally.

The weather records that we have were not compiled for the purpose of climatology, but we have adapted them for that purpose. We can also look at other records, such as the dates of first blooming of particular flowers. These data are not so easy to interpret, but they are also much less prone to numerical manipulation. And critically, we can look at the extent of the melting of Arctic sea ice. Many different things we measure tell a story of warming that is consistent with what we would expect from carbon dioxide emissions.

The Berkeley Earth Project web site has particularly nice graphs and is well organised if you would like to investigate this question further.

If you have any remaining questions, please don’t hesitate to contact me.


Dr. Michael de Podesta MBE CPhys MInstP

Principal Research Scientist
National Physical Laboratory
Teddington, TW11 0LW
Telephone +44 (0)20 8943 6476

