Archive for the ‘The Future’ Category

Talking about the ‘New’ SI

July 3, 2017

I was asked to give a talk about the SI to some visitors tomorrow morning, and so I have prepared some PowerPoint slides

If you are interested, you can download them using this link (.pptx 13 Mb!): please credit me and NPL if you use them.

But I also experimentally narrated my way through the talk and recorded the result as a movie.

The result is… well, a bit dull. But if you’re interested you can view the results below.

I have split the talk into three parts.

Part 1: My System of Units

This 14 minute section is the fun part. It describes a hypothetical system of units which is a bit like the SI, but in which all the units are named after my family and friends.

The idea is to show the structure of any system of units and to highlight some potential shortcomings.

It also emphasises the fact that systems of units are not ‘natural’. They have been created by people to meet our needs.

Part 2: The International System of Units

This 22 minute section – the dullest and most rambling part of the talk – explains the subtle rationale for the changes in the SI upon which we have embarked.

There are two key ideas in this part of the talk:

  • Firstly, there is a description of the separation of two concepts: the definition of a unit, and the way in which copies of the unit are ‘realised’.
  • And secondly, there is a description of the role of natural constants in the new definitions of the units of the SI.

Part 3: The Kilogram Problem

This 11 minute section is a description of one of the two ways of solving the kilogram problem: the Kibble balance. It has three highlights!

  • It features a description of the balance by none other than Bryan Kibble himself.
  • There is an animation of a Kibble balance which takes just seconds to play but which took hours to create!
  • And there are also some nice pictures of the Mark II Kibble Balance installed in its new home in Canada, including a short movie of the coil going up and down.


This is all a bit dull, and I apologise. It’s an experiment and please don’t feel obliged to listen to all or any of it.

When I talk to a live audience I hope it will all be a little punchier – and that the 2800 seconds it took to record this will be reduced to something nearer to its target 2100 seconds.




Not everything is getting worse!

April 19, 2017

Carbon Intensity April 2017

Friends, I find it hard to believe, but I think I have found something happening in the world which is not bad. Who knew such things still happened?

The news comes from the fantastic web site MyGridGB which charts the development of electricity generation in the UK.

On the site I read that:

  • At lunchtime on Sunday 9th April 2017,  8 GW of solar power was generated.
  • On Friday all coal power stations in the UK were off.
  • On Saturday, strong winds and solar combined with low demand to briefly provide 73% of power.

All three of these facts fill me with hope. Just think:

  • 8 gigawatts of solar power. In the UK! IN APRIL!!!
  • And no coal generation at all!
  • And renewable energy providing 73% of our power!

Even a few years ago each of these facts would have been unthinkable!

And even more wonderfully: nobody noticed!

Of course, these were just transients, but they show we have the potential to generate electricity with a significantly lower carbon intensity.

Carbon Intensity is a measure of the amount of carbon dioxide emitted into the atmosphere for each unit (kWh) of electricity generated.

Wikipedia tells me that electricity generated from:

  • Coal has a carbon intensity of about 1.0 kg of CO2 per kWh
  • Gas has a carbon intensity of about 0.47 kg of CO2 per kWh
  • Biomass has a carbon intensity of about 0.23 kg of CO2 per kWh
  • Solar PV has a carbon intensity of about 0.05 kg of CO2 per kWh
  • Nuclear has a carbon intensity of about 0.02 kg of CO2 per kWh
  • Wind has a carbon intensity of about 0.01 kg of CO2 per kWh

The graph at the head of the page shows that in April 2017 the generating mix in the UK had a carbon intensity of about 0.25 kg of CO2 per kWh.
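That figure is just the weighted average of the intensities listed above. As a sketch (the generating shares below are illustrative numbers I have invented, not MyGridGB's actual data):

```python
# Approximate carbon intensities (kg of CO2 per kWh) quoted above.
INTENSITY = {
    "coal": 1.0, "gas": 0.47, "biomass": 0.23,
    "solar": 0.05, "nuclear": 0.02, "wind": 0.01,
}

def mix_intensity(shares):
    """Weighted-average carbon intensity of a generating mix.

    `shares` maps generation type to its fraction of total output.
    """
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(INTENSITY[kind] * fraction for kind, fraction in shares.items())

# An invented illustrative mix: 50% gas, 25% wind, 15% nuclear, 10% solar.
mix = {"gas": 0.50, "wind": 0.25, "nuclear": 0.15, "solar": 0.10}
print(f"{mix_intensity(mix):.2f} kg CO2 per kWh")  # → 0.25 kg CO2 per kWh
```

With these made-up shares the answer comes out close to the April 2017 figure, which shows how directly the mix determines the intensity.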

MyGridGB’s mastermind is Andrew Crossland. On the site he has published a manifesto outlining a plan which would actually reduce our carbon intensity to less than 0.1 kg of CO2 per kWh.

What I like about the manifesto is that it is eminently doable.

And who knows? Perhaps we might actually do it?

Ahhhh. Thank you Andrew.

Even thinking that a good thing might still be possible makes me feel better.


Remarkably Unremarkable

February 24, 2017


The ‘Now’

‘The future’ is a mysterious place.

And our first encounter with ‘the future’ is ‘the now’.

Today I felt like I encountered the future when I drove a car powered by a hydrogen fuel cell. And far from being mysterious it was remarkably unremarkable.

The raw driving experience was similar to using a conventional car with automatic transmission.

But instead of filling the car with liquid fuel derived from fossil plant matter,  I filled it with hydrogen gas at a pressure 700 times greater than atmospheric pressure.


This was achieved using a pump similar in appearance to a conventional petrol pump.


This was the interface to some industrial plant which generated 80 kg of hydrogen each day from nothing more than electricity and water. This is enough to fill roughly 20 cars.
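The numbers above imply about 4 kg of hydrogen per fill, and a little ideal-gas arithmetic (my own rough estimate, not a figure from the filling station) shows why a pressure 700 times atmospheric is needed to get that mass into a car-sized tank:

```python
# Rough arithmetic for the filling station described above. The 4 kg
# per fill follows from the quoted figures; the tank-volume estimate is
# my own ideal-gas sketch (real hydrogen at 700 bar is less compressible
# than an ideal gas, so actual tanks are somewhat larger).

R = 8.314        # molar gas constant, J/(mol K)
T = 293.0        # roughly room temperature, K
p = 700e5        # 700 bar, in pascals
M_H2 = 2.016e-3  # molar mass of hydrogen, kg/mol

daily_output = 80.0            # kg of hydrogen per day
fill_mass = daily_output / 20  # "enough to fill roughly 20 cars"
n_moles = fill_mass / M_H2
volume_litres = n_moles * R * T / p * 1000

print(fill_mass)             # → 4.0 (kg per fill)
print(round(volume_litres))  # ideal-gas estimate: roughly 70 litres
```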

This is small scale in comparison with a conventional petrol station, but these are early days. We are still at the interface with the future. Or one possible future.

The past

Some years ago, I remember making measurements of the temperature and humidity inside a fuel cell during operation.

The measurements were difficult, and the results surprising – to me at least.

And at the end of the project I remember thinking “Well, that was interesting, but it will never work in practice”.

Allow me please to eat my words: it works fine.

Today I was enormously impressed by the engineering prowess that made the fuel cell technology transparent to the driver.

The future

What I learned today was that the technology to make cars which emit no pollution at their point of use exists, now.

The range of this car is 300 miles and it takes only 5 minutes to re-fill. When there are more re-filling stations than the dozen or so currently around the UK, this will become a very attractive proposition.

I have no idea if fuel cell cars will become ubiquitous. Or whether they will become novelties like steam-powered cars from the end of the nineteenth century.

Perhaps this will represent the high-water mark of this technology. Or perhaps this will represent the first swallow in a summer of fuel cell cars.

None of us can know the future. But for the present, I was impressed.

It felt like the future was knocking on the door and asking us to hurry up.

When will the North Pole become the North Pool?

December 16, 2016


It is a sad fact, but it is likely that within my lifetime it will become possible to sail to the North Pole. I am 56.

Tragically it is also true that there is absolutely nothing that you or I can do about it.

In fact, even in the unlikely event that humanity en masse decided it wanted to prevent this liquefaction, there would be literally nothing we could do to stop it.

The carbon dioxide we have already put in the atmosphere will warm the Earth’s surface for a few decades yet even if we stopped all emissions right now.


The particular line of causation between carbon dioxide emissions and warming of the Arctic is long, and difficult to pin down.

Similarly it is difficult to determine if a bull in a china shop broke a particular vase, or whether it was a shop helper trying to escape.

Nonetheless, in both cases the ultimate cause is undeniable.

What does the figure show?

The animation at the head of the page, stolen from NASA’s Earth Observatory, is particularly striking and clear.

The animation shows the extent of sea ice versus the month of the year, for every year from 1979 up to November 2016.

Initially the data is stable: each year looks much like the last. But since the year 2000, we have seen reductions in the amount of sea ice which remains frozen over the summer.

In 2012, an additional one million square kilometres – four times the area of England, Scotland and Wales combined – melted.

The summer of 2016 showed the second largest melt ever.

The animation highlights the fact that the Arctic has been so warm this autumn that sea ice is forming at an unprecedentedly slow rate.

The Arctic sea ice extent for November 2016 is about one million square kilometres less than we might expect at this time of year.

My Concern 

Downloading the data from the US National Snow and Ice Data Center, I produced my own graph of exactly the same data used in the animation.

The graph below lacks the drama of the animated version at the head of the article. But it shows some things more clearly.


This static graph shows that the minimum ice extent used to be stable at around 7 ± 1 million square kilometres. The minimum value in 2012 was around half that.
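For anyone who wants to reproduce this kind of graph, the processing is simple once the extent data is downloaded: take the minimum extent within each year. A minimal sketch using invented illustrative numbers (the real NSIDC files have their own format):

```python
# Find each year's minimum sea-ice extent from (year, month, extent) records.
# The values here are invented for illustration; the real series comes from
# the US National Snow and Ice Data Center.
from collections import defaultdict

records = [
    (1980, 3, 16.0), (1980, 9, 7.5),  # extents in millions of km^2
    (2012, 3, 15.2), (2012, 9, 3.4),  # 2012: roughly half the old minimum
]

minima = defaultdict(lambda: float("inf"))
for year, month, extent in records:
    minima[year] = min(minima[year], extent)

print(dict(minima))  # → {1980: 7.5, 2012: 3.4}
```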

The animated graph at the head of the article highlights the fact that the autumn freeze (dotted blue circle) is slower than usual – something which is not clear in the static graph.

My concern is that if this winter’s freeze is ‘weak’, then the ice formed will be thin, and then next summer’s melt is likely to be especially strong.

And that raises a big question at the very heart of our culture.

When the North Pole becomes the North Pool, where will Santa live?


Science in the face of complexity

February 4, 2016
Jeff Dahn: Battery Expert at Dalhousie University


My mother-in-law bought me a great book for Christmas: Black Box Thinking by Matthew Syed: Thanks Kathleen 🙂

The gist of the book is easy to state: our cultural attitude towards “failure” – essentially one of blame and shame – is counterproductive.

Most of the book is spent discussing this theme in relation to the practice of medicine and the law, contrasting attitudes in these areas to those in modern aviation. The stories of unnecessary deaths and of lives wasted are horrific and shocking.


But when he moves on to engineering, the theme plays out more subtly. He discusses the cases of James Dyson, the Mercedes Formula 1 team, and David Brailsford from Sky Cycling. All of them have sought success in the face of complexity.

In the case of Dyson, his initial design of a ‘cyclone-based’ dust extractor wasn’t good enough, and the theory was too complex to guide improvements. So he started changing the design and seeing what happened. As recounted, he investigated 5,127 prototypes before he was satisfied with the results. The relevant point here is that his successful design created 5,126 failures.

One of his many insights was to devise a simple measurement technique that detected tiny changes in the effectiveness of his dust extraction: he sucked up fine white dust and blew the exhaust over black velvet.

Jeff Dahn

This approach put me in mind of Jeff Dahn, a battery expert I met at Dalhousie University.

Batteries are really complicated and improving them is hard because there are so many design features that could be changed. What one wants is a way to test as many variants as quickly and as sensitively as possible in order to identify what works and what doesn’t.

However when it comes to battery lifetime – the rate at which the capacity of a battery falls over time – it might seem inevitable that this would take years.

Not so. By charging and discharging batteries in a special manner and at elevated temperatures, it is possible to accelerate the degradation. Jeff then detects this with precision measurements of the ‘coulombic efficiency’ of the cell.

‘Coulombic efficiency’ sounds complicated but is simple. One first measures the electric current as the cell is charged. If the electric current is constant during charging then the electric current multiplied by the charging time gives the total amount of electric charge stored in the cell. One then measures the same thing as the cell discharges.

For the lithium batteries used in electric cars and smart phones, the coulombic efficiency is around 99.9%. But it is that tiny amount (less than 0.1%) of the electric charge which doesn’t come back that progressively damages the cell and limits its life.

One of Jeff’s innovations is the application of precision measurement to this problem. By measuring electric currents with uncertainties of around one part in a million, Jeff can measure that 0.1% of non-returned charge with an uncertainty of around 0.1%. So he can distinguish between cells that are 99.95% efficient and cells that are 99.96% efficient. That may not sound like much, but the second one is 20% better!
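The calculation itself is as simple as described above: charge is current multiplied by time, and the efficiency is the ratio of charge out to charge in. A sketch with hypothetical numbers:

```python
def coulombic_efficiency(i_charge, t_charge, i_discharge, t_discharge):
    """Charge recovered on discharge divided by charge stored on charging.

    With a constant current, charge (in coulombs) is just current x time.
    """
    q_in = i_charge * t_charge
    q_out = i_discharge * t_discharge
    return q_out / q_in

# Hypothetical cell: 1 A for 3600 s going in, 1 A for 3598.2 s coming out.
eff = coulombic_efficiency(1.0, 3600.0, 1.0, 3598.2)
print(f"{eff:.4%}")  # → 99.9500%

# The point about sensitivity: a 99.96%-efficient cell loses 0.04% of its
# charge per cycle - 20% less than the 0.05% lost by a 99.95% cell.
loss_a, loss_b = 1 - 0.9995, 1 - 0.9996
print(f"{(loss_a - loss_b) / loss_a:.0%}")  # → 20%
```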

By looking in detail at the Coulombic efficiency, Jeff can tell in a few weeks whether a new design of electrode will improve or degrade battery life.

The sensitivity of this test is akin to the ‘white dust on black velvet’ test used by Dyson: it doesn’t tell him why something got better or worse – he has to figure that out for himself. But it does tell him quickly which things were bad ideas.

I couldn’t count the ammeters in Jeff’s lab – each one attached to a test cell – but he was measuring hundreds of cells simultaneously. Inevitably, most of these tests will make the cells perform worse and be categorised as ‘failures’.

But this system allows him to fail fast and fail often: and it is this capability that allows him to succeed at all. I found this application of precision measurement really inspiring.

Thanks Jeff.






Restoring my faith in Quantum Computing

February 1, 2016
Jordan Kyriakidis from Dalhousie University Physics Department


I am a Quantum Computing Sceptic.

But last week at Dalhousie I met Jordan Kyriakidis, who explained one feature of Quantum Computing that I had not appreciated: even if a quantum computer only gave the right answer one time in a million operations, it might still be useful.

His insight made me believe that Quantum Computing just might be possible.

[Please be aware that I am not an expert in this field. And I am aware that experts are less sceptical than I am. Indeed many consider that the power of quantum computing has already been demonstrated. Additionally Scott Aaronson argues persuasively (in his point 7) that my insight is wrong.]


Conventional digital computers solve problems using mathematics. They have been engineered to perform electronic operations on representations of numbers which closely mimic equivalent mathematical operations.

Quantum computers are different. They work by creating a physical analogue of the problem which requires solving.

An initial state is created and then the computational ‘engine’ is allowed to evolve using basic physical laws and hopefully arrive at a state which represents a solution to the problem at hand.

My problem

There are many conceivable implementations of a quantum computer and I am sceptical about them all!

My scepticism arises from the analogue nature of the computation. At some point the essential elements of the quantum computer (called ‘Qubits’ and pronounced Q-bits) can be considered as some kind of oscillator.

The output of the computer – the answer – depends on interference between the Qubits being orchestrated in a precise manner. And this interference between the Qubits is completely analogue.

Analogue versus digital

Physics is fundamentally analogue. So, for example, the voltages present throughout a digital computer vary between 0 volts and 5 volts. However the engineering genius of digital electronics is that it produces voltages that are either relatively close to 0 volts, or relatively close to 5 volts. This allows the voltages to be interpreted unambiguously as representing either a binary ‘1’ or ‘0’. This is why digital computers produce exactly the same output every time they run.

Quantum Computing has outputs that can be interpreted unambiguously as representing either a binary ‘1’ or ‘0’. However the operation of the machine is intrinsically analogue. So tiny perturbations that accumulate over the thousands of operations on the Qubits in a useful machine will result in different outputs each time the machine is run.
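The contrast can be illustrated with a toy simulation (entirely my own illustration, not a model of any real machine): apply thousands of slightly noisy operations to a signal, with and without digital restoration of the logic levels.

```python
import random

def noisy_op(v):
    """An 'identity' operation with a tiny analogue error added."""
    return v + random.gauss(0, 0.01)

def threshold(v):
    """Digital restoration: snap the voltage to the nearer logic level."""
    return 5.0 if v > 2.5 else 0.0

analogue = digital = 5.0
for _ in range(10_000):
    analogue = noisy_op(analogue)           # errors accumulate
    digital = threshold(noisy_op(digital))  # each error is wiped out

print(digital)  # → 5.0, exactly, on every run
# `analogue`, by contrast, has performed a random walk of typical size
# 0.01 * sqrt(10000) = 1 volt, and differs from run to run.
```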

Jordan’s Insight

To my surprise Jordan acknowledged my analysis was kind-of-not-wrong. But he said it didn’t matter for the kinds of problems quantum computers might be good at solving. The classic problem is factoring of large numbers.

For example working out that the number 1379 is the result of multiplying 7 × 197 will take you a little work. But if I gave you the numbers 7 and 197 and asked you to multiply them, you could do that straightforwardly.

Finding the factors of large numbers (100 digits or more) is hard and slow – potentially taking the most powerful computers on Earth hundreds of years to determine. But multiplying two numbers – even very large numbers – together is easy and quick on even a small computer.

So even if a quantum computer attempting to find factors of a large number were only right one time in a million operations – that would be really useful! Since the answers are easy to check, we can sort through them and get rid of the wrong answers easily.

So a quantum computer could reduce the time to factor large numbers even though it was wrong 99.9999% of the time!
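This ‘check the answers cheaply’ strategy is easy to demonstrate. In the sketch below the ‘quantum computer’ is just a random-guess stand-in (my own invention), but the structure – an unreliable factoriser wrapped in a cheap multiplication check – is the point:

```python
import random

def is_valid_factorisation(n, a, b):
    """Checking an answer is one multiplication - cheap even for huge numbers."""
    return a * b == n and 1 < a < n

def noisy_factoriser(n):
    """Stand-in for a mostly-wrong quantum computer: a random guess."""
    a = random.randrange(2, n)
    return a, n // a

def factor_with_checking(n, max_tries=1_000_000):
    """Keep asking the unreliable machine until a candidate passes the check."""
    for _ in range(max_tries):
        a, b = noisy_factoriser(n)
        if is_valid_factorisation(n, a, b):
            return a, b
    return None

print(factor_with_checking(1379))  # e.g. (7, 197), since 1379 = 7 x 197
```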

I can easily imagine quantum computers being mostly wrong and I had thought that would be fatal. But Jordan made me realise that might still be very useful.

Thanks Jordan 🙂


By the way, you might like to check out this web site which will factor large numbers for you. I learned that the number derived from my birth date (29/12/1959 → 29121959) is prime!

Climate Hopes and Fears.

December 14, 2015

FT Calculator for Greenhouse Gas emissions required to achieve various degrees of global warming. If we continue on our current path, we are headed towards – in our best estimation – 6 degrees Celsius of global warming. The calculator allows you to see the anticipated effects of the pledged emission reductions.

The Paris agreement on Climate Change is cause for hope. In honesty, I cried at the news.

But the task that the countries of the world have set for themselves is breathtakingly difficult.

And in the euphoria surrounding the Paris Accord, I am not sure the level of difficulty has been properly conveyed.

The process will involve an entire century of ever stronger commitment to meet even the most minimal of targets.

Imagine going on a long car journey full of 200 ‘children’ who will bicker and fight – some of whom are not too bright but are armed with nuclear weapons. How long will it be until we hear the first ‘Are we there yet?’ or ‘I wanna go home now!’ or ‘Can I have some extra oil now?’ or ‘It’s all Johnny’s fault!’ or ‘It’s not fair!’

Perhaps unsurprisingly, it is the Financial Times that has cut to the chase with an Interactive Calculator that shows the level of emission reductions required to meet various warming targets.

The calculator indicates that if we continue on our current path, we are headed (in our best estimation) towards 6 °C of global warming.

The calculator then allows you to see the anticipated effects of the pledged emission reductions.

What is shocking is that even the drastic (and barely believable) reductions pledged in Paris are not sufficient to achieve the 2 °C limit.

As quoted by the Guardian, James Hansen (whom I greatly admire) is certainly sceptical:

“It’s a fraud really, a fake. It’s just bullshit for them to say: ‘We’ll have a 2C warming target and then try to do a little better every five years.’ It’s just worthless words. There is no action, just promises. As long as fossil fuels appear to be the cheapest fuels out there, they will be continued to be burned.”

Hansen suggests all governments should institute a $15/tonne carbon tax (rising each year by $10/tonne). He sees the price of oil (and coal and gas) as the single essential lever we need to pull to achieve our collective goals.
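The proposed schedule is simple to write down: starting at $15/tonne and rising by $10/tonne each year, the tax would pass $100/tonne within a decade.

```python
# Hansen's schedule as quoted above: $15/tonne, rising $10/tonne per year.
def carbon_tax(years_after_start):
    """Carbon tax in dollars per tonne, with year 0 being $15/tonne."""
    return 15 + 10 * years_after_start

print([carbon_tax(y) for y in range(4)])  # → [15, 25, 35, 45]
print(carbon_tax(10))                     # → 115
```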

Personally I am with Hansen on the need for urgent action right now, but I feel more charitable towards our leaders.

I don’t know whether it is more rational to feel hopeful or fearful.

But despite myself, I do feel hopeful. I hope that maybe in my lifetime (I expect to die aged 80 in 2040) I will have seen global emissions peak and the rate of increase of atmospheric carbon dioxide begin to flatten.


Volcanic Clouds

September 28, 2015

The estimated average air temperature above the land surface of the Earth. The squiggly lines are data and the grey lines give an indication of uncertainty in the estimate. The bold black line shows the results of a model based on the effects of carbon dioxide and the effect of named volcanoes. Figure is from the Berkeley Earth Temperature Project

The explosion of Mount Tambora in 1815 was the largest explosion in recorded history. Its catastrophic local effects – earthquakes, tsunami, and poisonous crop-killing clouds – were witnessed by many people including Sir Stamford Raffles, then governor of Java.

Curiously, one year later, while touring through France, Raffles also witnessed deserted villages and impoverished peasantry caused by the ‘year without a summer’ that caused famine throughout Europe.

But at the time no-one connected the two events! The connection was not made until the late 20th Century when scientists were investigating the possibility of a ‘nuclear winter’ that might arise from multiple nuclear explosions.

Looking at our reconstructed record of the air temperature above the land surface of the Earth at the head of this article, we can see that Tambora lowered the average surface temperature of the Earth by more than 1 °C and its effects lasted for around three years.

Tambora erupted just 6 years after a volcanic explosion in 1809 whose location is still unknown. We now know that together they caused the decade 1810-1820 to be exceptionally cold. However, at the time the exceptional weather was just experienced as an ‘act of god’.

In Tambora: The Eruption That Changed the World, Gillen D’Arcy Wood describes both the local nightmare near Tambora, and more significantly the way in which the climate impacts of Tambora affected literary, scientific, and political history around the globe.

In particular he discusses:

  • The effect of a dystopian ‘summer’ experienced by the Shelleys and Lord Byron in their Alpine retreat.
  • The emergence of cholera in the wake of a disastrous monsoon season in Bengal. Cholera went on to form a global pandemic that eventually reached the UK through trade routes.
  • The period of famine in the rice-growing region of Yunnan that led to a shift towards opium production.
  • The bizarre warming – yes, warming – in the Arctic that led to reports of ice free northern oceans, and triggered decades of futile attempts to discover the fabled North West Passage.
  • The dramatic and devastating advance of glaciers in the Swiss alps that led to advances in our understanding of ice ages.
  • The ‘other’ Irish Famine – a tale of great shame and woe – prefacing the great hunger caused by the potato-blight famines in later decades.
  • The extraordinary ‘snow in June’ summer in the eastern United States

Other Volcanic Clouds

Many Europeans will recall the chaos caused by the volcanic clouds from the 2010 eruptions of the Icelandic volcano Eyjafjallajökull (pronounced like this, or phonetically ‘[ˈeɪjaˌfjatlaˌjœːkʏtl̥]’).

The 2010 eruptions were tiny in historical terms with effects which were local to Iceland and nearby air routes. This is because although a lot of dust was ejected, most of it stayed within the troposphere – the lower weather-filled part of the atmosphere. Such dust clouds are normally rained out over a period of a few days or weeks.

Near the equator the boundary between the troposphere and stratosphere – known as the tropopause – is about 16 km high, but this boundary falls to around 9 km nearer the poles.

For a volcanic cloud to have wider effects, the volcanic explosion must push it above the tropopause into the stratosphere. Tiny particles can be suspended here for years, and have a dramatic effect on global climate.


Tambora may have been ‘the big one’ but it was not alone. Looking at our reconstructed air temperature record at the head of this article, we can see that large volcanic eruptions are not rare. And the 19th Century had many more than the 20th Century.

Near the start of the recorded temperature history is the eruption of Laki in Iceland (1783-84). Local details of this explosion were recorded in the diary of Jon Steingrimsson, and in their short book Island on Fire, Alexandra Witze and Jeff Kanipe describe the progression of the eruption and its effects further afield – mainly in Europe.

In the UK and Europe the summer consisted of prolonged ‘dry fogs’ that caused plants to wither and people to fall ill. On the whole people were mystified by the origin of these clouds, even though one or two people – including the prolific Benjamin Franklin, then US Ambassador to France – did in fact make the connection with Icelandic volcanoes.

Purple Clouds

Prior to the two books on real volcanic clouds, I had previously read a fictional account of such an event: The Purple Cloud by M P Shiel, published in 1901, and set in the early decades of that century.

This is a fictional, almost stream-of-consciousness, account of how an Arctic explorer discovers a world of beauty at the North Pole – including un-frozen regions. But by violating Nature’s most hidden secrets, he somehow triggers a series of volcanic eruptions at the Equator which over the course of a couple of weeks kill everyone on Earth – save for himself.

I enjoyed this book, but don’t particularly recommend it. However, having since read accounts of these genuine historical events, it is striking to me that the concept of a globally significant volcanic cloud already existed at the end of the nineteenth century.

Final Words

The lingering flavour of these books – factual and fictional – is that historically there have been poorly-appreciated tele-connections between historical events.

Now, we live in a world in which the extent and importance of these global tele-connections has never been greater.

And in this world we are vulnerable to events such as volcanic clouds which – as the chart at the top of the page shows – affect the entire world and are not that rare.

The strange truth about El Nino

May 13, 2015

Satellite data showing changes in the height of the sea surface. The large straight red band just to the left (West) of South America represents a region of warmer water which has expanded and caused an increase in the height of the sea surface. This picture was stolen from the BBC web site.

You will probably be hearing a lot about El Niño this year because the Australian Bureau of Meteorology have predicted that El Niño conditions will build through the coming year.

The news stories will all look something like this:

  • Yada Yada Yada
  • Drought/Flood somewhere because of El Niño and Climate Change.
  • Isn’t it terrible

There will be nothing you can do except to experience a sense of vulnerability. And if you are of a similar disposition to me, you may also experience an increased sense of general anxiety.

However the amazing fact, which I have never seen mentioned in all my years of reading about this stuff, is that, collectively, we have no idea what causes El Niño events.

And we certainly can’t predict the events: the current ‘prediction’ is only being made because the El Niño has already begun!

So what do we know?

An Australian Bureau of Meteorology graphic describing the weather patterns in the Pacific Ocean as being either neutral, La Nina or El Nino


The term El Niño describes a set of linked atmospheric and oceanic conditions. And we understand that the weather patterns in the Pacific Ocean oscillate between three states.

  • Neutral (about 50% of the time)
  • La Niña
  • El Niño (Every 4 to 7 years)

These patterns are so large that they affect the weather right around the globe, and the oscillation between these ‘phases’ is called the El Niño Southern Oscillation (ENSO).

The Australian Bureau of Meteorology have an excellent video description of the phenomenon and its consequences for Australia.

And what don’t we know?

We don’t know what causes the oscillation from one phase of ENSO to another. And so we can’t predict changes from one phase to another.

And importantly, although it is quite conceivable that future Climate Change could affect the transitions from one phase of ENSO to another, we have no idea whether there will be any effect.

And why don’t we know it?

Well obviously, I don’t know the answer to this question. But I think it is this.

ENSO is a linked oceanic and atmospheric phenomenon.

Each of the three phases is self-sustaining i.e. changes in the wind patterns reinforce changes in the location of warm water. And changes in the location of the warmer water reinforce the changes in the wind patterns.

But the variability of the weather is such that it can move the weather patterns from one self-reinforcing phase to another.

And so these planetary scale weather events are triggered by some as yet unknown ‘local’ or ‘short-term’ variability in weather.

Things may improve. As I wrote in my review of Climate Models in the IPCC’s 5th Assessment Report, some models now spontaneously produce ENSO-like behaviour that was not explicitly programmed into them.

And as models of the atmosphere and ocean improve, they will become better able to simulate weather on both the small scale and on the largest scales.

So as our understanding develops it seems likely that changes of ENSO phase will eventually become predictable.

Is anything truly impossible?

October 27, 2014

A recent Scientific American article highlighted the work of two Canadian engineers, Todd Reichert and Cameron Robertson, who built the world’s first (and only) human-powered helicopter.

After they had completed their brilliant and imaginative work, they learned of a recent paper which showed that what they had just done was impossible.

Their achievement put me in mind of Lord Kelvin’s misguided pronouncement:

Heavier-than-air flying machines are impossible.

This is a popular meme: an illustrious expert says something is impossible; an ingénue shows it is not.

But nonetheless, there are (presumably?) things which, even though they may be imagined, are still either truly or practically impossible.

But how can you distinguish between ideas which are truly or practically impossible, and those which are just hard to imagine?

This is not merely an academic question

The UK is currently committed to spending hundreds of millions of pounds on a nuclear fusion experiment called ITER, which I am confident will never result in the construction of even a single power station.

Wikipedia tells me the build cost of the project is an astonishing $50 billion – ten times its original projected cost. Impossible projects have a way of going over budget.

I explained my reasons for considering the project to be impossible here.

And on reading this, Jonathan Butterworth, Head of Physics at UCL, tweeted that he:

could write a similar post on why the LHC is impossible. IMHO

But I don’t think he could. Let me explain with some examples:

1. The Large Hadron Collider (LHC), where Jonathan works, is a machine called a synchrotron, which is itself a development of a cyclotron.

The first cyclotron was built in a single University physics department in 1932 (History). If, back then, you had told someone the specification of the LHC, would they have said it was impossible?

I don’t think so. Because although each parameter (size, energy etc.) has been stretched – through astonishing ingenuity and technical virtuosity – the LHC is an extrapolation from something that they knew definitely worked.

2. A modern nuclear power station is an engineering realisation of ‘a pile of graphite bricks’ that was first constructed beneath the stand of a playing field at the University of Chicago in 1942.

Within this ‘pile’, the first controlled nuclear reaction took place and worked exactly as had been anticipated. Would the people who witnessed the reaction have said a nuclear power station was impossible?

Definitely not. Everyone in the room was aware of the significance (good and bad) of what had been achieved.

Controlled nuclear fusion is in an entirely different category from either of these stories of engineering success.

  • It has never worked.

We have never created sustained nuclear fusion, and the reasons for this failure have changed as we have understood the problem better.

The rationale for ITER is – cutting through a great deal of technical detail – that it is bigger than previous versions. This increases the volume of the plasma (where energy is released by fusion) in relation to the surface area (where it is lost).
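The geometry behind this rationale is easy to check: volume grows as the cube of linear size but surface area only as the square, so bigger machines keep more of their heat. A sketch for a sphere (real tokamaks are tori, but the scaling argument is the same):

```python
import math

def volume_to_surface_ratio(r):
    """For a sphere: (4/3 * pi * r^3) / (4 * pi * r^2) = r / 3."""
    volume = 4 / 3 * math.pi * r**3
    surface = 4 * math.pi * r**2
    return volume / surface

# Doubling the linear size doubles the ratio of plasma volume (where
# fusion energy is released) to surface area (where energy is lost).
print(round(volume_to_surface_ratio(6.0) / volume_to_surface_ratio(3.0), 6))  # → 2.0
```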

I expect that ITER will meet its technical goals (or most of them). But even on this assumption, they would then have to solve the technical problems associated with confining a plasma at a temperature of 150 million °C for 30 years rather than 10 seconds.

As I explained previously, I just don’t think solutions to these problems exist that would allow reliable operation for 30 years with 90% availability required for power generation.

So I think controlled nuclear fusion as a means of generating power is – while perfectly conceivable – actually impossible.

What if – in 50 years time – we make it work? 

Then I will be proved wrong. If I am alive, I will apologise.

However, even in this optimistic scenario, it will be 50 years too late to affect climate change, which is a problem which needs solving now.

And we will have spent money and energy that we could have spent on solving the problems that face us now, using solutions which we know will definitely work.
