The Past, Present and Future of Measurement

October 22, 2017

Measurement, the simple process of comparing an unknown quantity with a standard quantity, is the essential component of all scientific endeavours. We are about to enter a new epoch of metrology, one which will permit the breathtaking progress of the last hundred years to continue unimpeded into the next century and beyond.

The dawn of this new age has been heralded this week by the publication of an apparently innocuous paper in the journal Metrologia. The paper is entitled:

Data and Analysis for the CODATA 2017 Special Fundamental Constants Adjustment

and its authors, Peter Mohr, David Newell, Barry Taylor and Eite Tiesinga, form the Task Group on Fundamental Constants of the Committee on Data for Science and Technology, commonly referred to as CODATA. In this article I will try to explain the relevance of CODATA’s paper to developments in the science of metrology.

The Past

The way human beings began to make sense of their environment was by measuring it. We can imagine that our agrarian ancestors might have wondered whether crops were taller or heavier this year than last. Or whether plants grew better in one field rather than another. And they would have answered these questions by creating standard weights and measuring rods.

But to effectively communicate their findings, the standard units of measurement would need to be shared. First between villages, and then towns, and then counties and kingdoms. Eventually entire empires would share a system of measurement.

First, units of weight and length were shared. Then, as time became more critical for scientific and technical endeavours, units of time were added to systems of measurement. And these three quantities – mass, length and time – are shared by all systems of units.

These quantities formed the so-called ‘base units’ of a system of measurement. Many other quantities could be described in terms of these ‘base units’. For example, speeds would be described in multiples of [the base unit of length] divided by [the base unit of time]. They might be [feet] per [second] in one system, or [metres] per [second] in another.

Over the last few hundred years, the consistent improvement in measurement techniques has enabled measurements with reduced uncertainty. And since no measurement can ever have a lower uncertainty than the standard quantity in that system of units, there has been a persistent drive to have the most stable, most accurately-known standards, so that they do not form a barrier to improved measurements.

The Present

Presently, all scientific and technical measurements on Earth are made using the International System of Units, the SI. The naming of this system – as an explicitly international system – represented a profound change in conception. It is not an ‘imperial’ system or an ‘English’ system, but a shared enterprise administered by the International Bureau of Weights and Measures (BIPM), a laboratory located in diplomatically-protected land in Sèvres, near Paris, France. Its operation is internationally funded by the dozens of nations who have signed the international treaty known as the Convention of the Metre.

In essence, the SI is humanity’s standard way of giving quantitative descriptions of the world around us. It is really an annex to all human languages, allowing all nationalities and cultures to communicate unambiguously in the realms of science and engineering.

Founded in 1960, the SI was based upon the system of measurement using the metre as the unit of length, the kilogram as the unit of mass, and the second as the unit of time. It also included three more base units.

The kelvin and degree Celsius were adopted as units of temperature, and the ampere was adopted as the unit of electric current. The candela was defined as the unit of luminous intensity – a measure of how bright lights of different colours appear to human beings. And then in 1971 the often qualitative science of chemistry was brought into the fold with the introduction of the mole as a unit of amount of substance, a recognition of the increasing importance of analytical measurements.

SI Circle - no constants

The SI is administered by committees of international experts that seek to make sure that the system evolves to meet humanity’s changing needs. Typically these changes are minor and technical, but in 1983 an important conceptual change was made.

Since the foundation of the SI, the ability to measure time had improved more rapidly than the ability to measure length. It was realised that if the metre was defined differently, then length measurements could be improved.

The change proposed was to define what we mean by ‘one metre’ in terms of the distance travelled by light, in a vacuum, in a fixed time. Based on Einstein’s insights, the speed of light in a vacuum, c, is thought to be a universal constant, but at the time it had to be measured in terms of metres and seconds, i.e. human-scale measurement standards. This proposal defined the metre in terms of a natural constant – something we believe is truly constant.
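The logic of this inversion can be made concrete in a few lines. In the sketch below (my illustration, not part of any official definition) the value of c is fixed exactly, and lengths then follow from time measurements; the 2 ns round trip is a purely hypothetical example:

```python
# The speed of light in vacuum, c, has a fixed, exact value in m/s.
c = 299_792_458

# One metre is then *defined* as the distance light travels in a vacuum
# in 1/299792458 of a second:
one_metre = c * (1 / c)
print(one_metre)   # ≈ 1.0, by construction

# Any length measurement then reduces to a time measurement. For example,
# a (hypothetical) 2 ns round-trip time-of-flight corresponds to:
time_of_flight = 2.0e-9            # seconds, out and back
length = c * time_of_flight / 2    # metres, one way
print(f"{length:.3f} m")           # ≈ 0.300 m
```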

The re-definition went well, and set metrologists thinking about whether the change could be adopted more widely.

The Future

Typically every four years, CODATA examines the latest measurements of the natural constants and proposes new best estimates of their values.

Measurement Graphic

This is strange. We believe that the natural constants are really constant, not having changed measurably since the first few seconds of our universe’s existence, whereas our human standards are at most a few decades old and (as with all human standards) are subject to slow changes. Surely it would make more sense to base our measurement standards on these fundamental constants of the natural world? This insight is at the heart of the changes which are about to take place. The CODATA publication this week is the latest in a series of planned steps that will bring about this change on 20th May 2019.

Constants Graphic

What will happen on 20th May 2019?

On the 20th May 2019, revised definitions of four of the base units of the SI will come into force. More than 10 years of careful measurements by scientists world-wide will ensure that the new definitions are, as closely as possible, equivalent to the old definitions.

The change is equivalent to removing the foundations underneath a structure and then inserting new foundations which should leave the structure supported in exactly the same way. However the new foundations – being based on natural constants rather than human artefacts – should be much more stable than the old foundations.

If the past is any guide to the future, then in the coming decades and centuries, we can anticipate that measurement technology will improve dramatically. However we cannot anticipate exactly how and where these improvements will take place. By building the SI on foundations based on the natural constants, we are ensuring that the definitions of the unit quantities of the SI will place no restriction whatever on these future possible improvements.

The kilogram

The kilogram is the SI unit of mass. It is currently defined as the mass of the International Prototype of the Kilogram (IPK), a cylinder of platinum-iridium alloy held in a safe at the BIPM. Almost every weighing in the world is, indirectly, a comparison against the mass of this artefact.

On 20th May 2019, this will change. Instead, the kilogram will be defined in terms of a combination of fundamental constants including the Planck constant, h, and the speed of light, c. Although more abstract than the current definition, the new definition is thought to be at least one million times more stable.

The new definition will enable a new kind of weighing technology called a Kibble balance. Instead of balancing the weight of a mass against another object whose mass is known by comparison with the IPK, the weight will be balanced by a force which is calculable in terms of electrical power, and which can be expressed as a multiple of the fundamental constants e, h and c.
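A rough sketch of the Kibble-balance principle: in ‘weighing’ mode the weight m·g of the mass is balanced by an electromagnetic force on a current-carrying coil, and in ‘moving’ mode the same coil generates a voltage, and equating the two gives the ‘virtual power’ balance m·g·v = U·I. In practice U and I are measured via the Josephson and quantum Hall effects, which is how h and e enter. All the numbers below are purely illustrative, not real measurement data:

```python
# Kibble-balance sketch: the virtual-power balance  m * g * v = U * I.
# All numbers below are illustrative, not real measurement data.
g = 9.80665     # local gravitational acceleration, m/s^2
v = 0.002       # coil velocity in 'moving' mode, m/s
U = 0.98066     # induced voltage, volts
I = 0.02        # balancing current, amperes

# The unknown mass follows from purely electrical and kinematic quantities:
m = U * I / (g * v)
print(f"m ≈ {m:.4f} kg")   # ≈ 1.0000 kg for these illustrative values
```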

The ampere

The ampere is the SI unit of electrical current. It is presently defined in terms of the current which, if it flowed in two infinitely thin, infinitely long, parallel wires would (in vacuum) produce a specified force between the wires. This definition, arcane even by metrologists’ standards, was intended to allow the measurement of the ampere in terms of the force between carefully constructed coils of wire. Sadly, it was out of date shortly after it was implemented.

On 20th May 2019, this will change. Instead, the ampere will be defined in terms of a particular number of electrons per second, each with an exactly specified electrical charge e, flowing past a point on a wire. This definition finally corresponds to the way electric current is described in textbooks.

The new definition will give impetus to techniques which create known electrical currents by using electrical devices which can output an exactly countable number of electrons per second. At the moment these devices are limited to approximately 1 billion (a thousand million) electrons per second, but in future this is likely to increase substantially.
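The arithmetic behind the new definition is simple enough to sketch. The value of e below is the exact value that will be fixed; the pump rate is the approximate figure quoted above:

```python
# The elementary charge, e, has a fixed, exact value in the revised SI.
e = 1.602_176_634e-19   # coulombs

# One ampere is one coulomb per second, i.e. this many electrons per second:
electrons_per_second = 1 / e
print(f"{electrons_per_second:.4e} electrons/s")   # ≈ 6.2415e+18

# A single-electron pump delivering ~1 billion electrons per second
# therefore generates only a tiny current:
pump_current = 1e9 * e
print(f"{pump_current:.3e} A")   # ≈ 1.602e-10 A, i.e. ~0.16 nA
```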

The kelvin

The kelvin is the SI unit of temperature. It is currently defined as the temperature of the ‘triple point of water’. This temperature – at which liquid water, solid water (ice) and water vapour (but no air) co-exist in equilibrium – is defined to be 273.16 kelvin exactly. Glass cells re-creating this conjunction are located in every temperature calibration lab in the world, and every temperature measurement is a comparison of how much hotter a temperature is than the temperature at one position within a ‘triple point of water cell’.

On 20th May 2019, this will change. Instead, the kelvin will be defined in terms of a particular amount of energy per molecule as specified by the Boltzmann constant, kB. This definition finally corresponds to the way thermal energy is described in textbooks.

The requirement to compare every measurement of temperature with the temperature of the triple point of water adds uncertainty to measurements at extremely low temperatures (below about 20 K) and at high temperatures (above about 1300 K). The new definition will immediately allow small improvements in these measurement ranges, and further improvements are expected to follow.

The definition of the degree Celsius (°C) in terms of the kelvin will remain unchanged.
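To make the ‘energy per molecule’ idea concrete, here is a small sketch. The value of kB is the exact value that will be fixed; the 3/2 factor is the standard ideal-gas result for translational kinetic energy, added here by way of illustration:

```python
# The Boltzmann constant, k_B, has a fixed, exact value in the revised SI.
k_B = 1.380_649e-23   # joules per kelvin

# The kelvin is defined so that a temperature T corresponds to a
# characteristic thermal energy k_B * T per molecule. For an ideal gas,
# the mean translational kinetic energy per molecule is (3/2) k_B T:
T = 273.16   # the triple point of water, K (now a *measured* temperature)
mean_kinetic_energy = 1.5 * k_B * T
print(f"{mean_kinetic_energy:.3e} J")   # ≈ 5.657e-21 J per molecule
```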

The mole

The mole is the SI unit of ‘amount of substance’. It is currently defined as the amount of substance which contains the same number of ‘elementary entities’ as there are atoms in 12 grams of carbon-12. The change in the definition of the kilogram required a re-think of this definition.

On 20th May 2019, it will change. The mole will be defined as the amount of substance which contains a particular, exactly specified, number of elementary entities. This number – known as the Avogadro number, NA – is currently estimated experimentally, but in future it will have a fixed value.

The specification of an exact number of entities effectively links the masses of microscopic entities such as atoms and molecules to the new definition of the kilogram.
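The counting definition is easy to sketch. The value of NA below is the exact number that will be fixed; the carbon-12 molar mass is the approximate measured value:

```python
# The Avogadro number, N_A, has a fixed, exact value in the revised SI.
N_A = 6.022_140_76e23   # elementary entities per mole

# The mole is now defined by counting: an amount of substance is just a
# number of entities divided by N_A.
entities = 3.011_070_38e23
amount = entities / N_A
print(f"{amount:.2f} mol")   # 0.50 mol

# And the fixed N_A links atomic masses to the kilogram. The molar mass of
# carbon-12 (about 12 g/mol, now a measured quantity) gives the mass of a
# single carbon-12 atom:
m_atom = 0.012 / N_A
print(f"{m_atom:.4e} kg")    # ≈ 1.9926e-26 kg
```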

The ‘New’ SI

On 20th May 2019 four of the seven base units of the SI will be re-defined. But what of the other three?

The second is already defined in terms of the natural frequency of microwaves emitted by atoms of a particular caesium isotope. The metre is defined in terms of the second and the speed of light in vacuum – a natural constant. And the candela is defined in terms of Kcd, the only natural constant in the SI that relates to human beings. So from 20th May 2019 the entire SI will be defined in terms of natural constants.
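For reference, here are the seven defining constants with the exact values assigned to them. The tabulation is my own; the values are those of the revised SI:

```python
# The seven defining constants of the revised SI, with their exact values.
# Each one underpins the base unit named in parentheses.
defining_constants = [
    ("Δν(Cs)", 9_192_631_770,     "Hz",    "caesium hyperfine frequency (second)"),
    ("c",      299_792_458,       "m/s",   "speed of light in vacuum (metre)"),
    ("h",      6.626_070_15e-34,  "J s",   "Planck constant (kilogram)"),
    ("e",      1.602_176_634e-19, "C",     "elementary charge (ampere)"),
    ("k_B",    1.380_649e-23,     "J/K",   "Boltzmann constant (kelvin)"),
    ("N_A",    6.022_140_76e23,   "1/mol", "Avogadro constant (mole)"),
    ("K_cd",   683,               "lm/W",  "luminous efficacy of 540 THz light (candela)"),
]

for symbol, value, unit, role in defining_constants:
    print(f"{symbol:7} = {value} {unit:6} {role}")
```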

SI Circle - with constants

The SI is not perfect. And it will not be perfect even after the redefinitions come into force. This is because it is a system devised by human beings, for human beings. But by incorporating natural constants into the definitions of all its base units, the SI has taken a profound step towards being a system of measurement which will enable ever greater precision in metrology.

And who knows what features of the Universe these improved measurements will reveal.

Santa Rosa Fire: Update

October 18, 2017

The fires surrounding Santa Rosa are slowly coming under control. And – thankfully – rains are due tomorrow (Thursday 19th October).

From the San Francisco Chronicle’s interactive graphic page, I compiled the animated gif above to show how closely the fires approached Santa Rosa from two directions.

Each frame shows one day’s fire extent, starting with day 0 – the day before the fires – and ending with day 11: the 18th October 2017.

What is particularly striking is the rapidity of the spread of the so-called ‘Tubbs Fire’ on Day 2.

It completely outpaced any attempt to contain it, and it devastated the north-eastern suburb of Fountaingrove.

I confess I immediately thought that this extraordinary fire, following an extended drought, had the fingerprints of ‘Climate Change’ written all over it.

And I was sure that Suzanne would feel that way too. But this is not quite the open-and-shut case that I thought.

It turns out that there was a great ‘Hanley’ fire in 1964 that has an uncanny geographical overlap with the 2017 fires. Several newspapers have featured recollections of the blaze (here and here) and Suzanne sent me the graphic below.

The ‘hatched’ regions correspond to the extent of the 1964 fires, and the coloured dots correspond to the 2017 fires. The overlap is… suspicious. And it rather changes ‘the story’ of the fires.

Santa Rosa Fires 1964-2017

The narrative now appears to be less “OMG! Climate-change-induced fire-mageddon”, and more “Were the proper planning procedures followed?”, i.e. rather less dramatic, but just as important if it happened to be your house that was burned.

It appears that the fire burned in the same place as it did previously, but in the intervening 53 years, thousands of homes were built.

One irony that Suzanne reports is that a couple of years ago, there was a move to build a fire station in Fountaingrove, but there were objections because it might lower property values in the area.

The fire station was eventually built, but – like everything else there – it has been devastated.

I suspect that when they re-build, having a fire station nearby will not be seen with quite the negativity that it once was.

UPDATE: The Washington Post have also run with this story under the headline “Santa Rosa ignored nature’s warning”. Perhaps after reading my blog?


The Santa Rosa Fire

October 16, 2017

Santa Rosa Fires

It’s been a few years since we visited my friend Suzanne in Santa Rosa. But I remember the street on which she lives.

In the early mornings, I would go jogging around the neighbourhood, which was sleepy and suburban.

But this memory contrasts strongly with the news that the largest of the recent Northern California wildfires (the Tubbs Fire) stopped just about 100 metres short of Suzanne’s home.

The map at the head of the page (stolen from the excellent San Francisco Chronicle) shows how the fire swept across the suburbs from the north east, and by nothing more than good chance, stopped just short of her home.

It’s not just people who have lost homes. Businesses have been burned out too.

But Keysight (formerly Agilent, formerly Hewlett-Packard), who have their HQ in Santa Rosa, seem to have lost only a few out-buildings.

Aside from the unimaginable personal and financial losses, this must be devastating for the entire community.

I don’t want to say anything about this now because it’s too shocking and the fires are not yet out. But until this moment, I would never have believed this possible.


The Joy of Science

October 15, 2017

For the last couple of weeks I have been a helper at Averil Horton’s Science Club.

It’s very low key.

Just a few adults doing science experiments themselves. And then discussing the results.

Helping – which requires no prior work on my part – has reminded me of the simple pleasure people experience from doing stuff for themselves.

And the pleasantness of discussing what one sees with others. It is the Joy of Science.

Here are some pictures.

Setting things on fire…


Seeing what floats in what…


Playing with colours…


Averil & Colours

And finishing with a little bit of magic…



It’s a shame…

August 2, 2017


Pictured above is the humble grave of James Clerk Maxwell.

By all accounts, he was a kind and humble man, and so in many ways it is an entirely appropriate memorial.

But simple as it is, surely we could show our respect and admiration by as simple an act as mowing the grass? It seems not.

My attention was drawn to the unkempt state of his grave by this article in the Scottish Daily Record.

In death we are all equal.

And I have no doubt that Maxwell himself would have wanted no fuss.

But some people – very few – have led such exceptional lives that it is appropriate for us to collectively mark their mortal remains in a way which shows how much we honour their achievements in life.

This is not an indicator of our belief in any kind of saintliness on their part.

It is rather a statement about us.

It is a statement about what we currently admire and treasure and celebrate.

I have been told that Ren Zhengfei, the founder and President of Huawei Technology visited the grave and was embarrassed and shocked.

To neglect the grave of such a monumental figure says something about us.

It is actually a matter of national shame. And while acknowledging that Maxwell was decidedly Scottish, I draw the boundaries of ‘nation-hood’ more widely.

So how great was James Clerk Maxwell?

Maxwell’s many contributions to our modern view of the world are difficult to summarise without being trite, and they span an enormous range. But here are two of his achievements concerning light.


The first colour photograph taken using Maxwell’s prescription. (Credit: Wikipedia)

Having made a breakthrough in understanding the nature of human colour vision, he used that understanding to describe how to take the first colour photograph.


A picture from Wikipedia showing a young James Clerk Maxwell at Trinity College, Cambridge. He is holding one of the colour wheels that he used to study colour vision.

Later he became the first person to appreciate that light was an electrical phenomenon.

And the equations he wrote down to describe the nature of light are still those we use today to describe just about all electrical and magnetic phenomena*.

Richard Feynman, the person who made the next step in our understanding of light, said:

“From a long view of the history of mankind — seen from, say, ten thousand years from now — there can be little doubt that the most significant event of the 19th century will be judged as Maxwell’s discovery of the laws of electrodynamics. The American Civil War will pale into provincial insignificance in comparison with this important scientific event of the same decade.”

And Michael de Podesta, the person writing this blog said:

“I named my son after him”

That a true hero should not be honoured in his own land is a shame on us all.

Surely we could collectively manage to keep the grass on his grave tidy?


*Note for pedants: In fact the equations we use are a simplified form of Maxwell’s Equations devised by Oliver Heaviside after Maxwell’s tragic early death.

Work Experience

August 2, 2017

Film Crew


I had a work experience student with me last week. Let’s call him ‘William’.

On reflection, I am rather concerned about the impression that the “work” he witnessed might have on him.


Firstly, everything was very ‘bitty’: it was hard to concentrate on a single task for any period as long as a half day.

And in between explicit tasks, I spent a fair amount of time composing e-mails. That’s right, I said composing, not writing. Because e-mails are generally not simply ‘written’.

For despite the immediacy of the transmission, words in e-mails have to be chosen as carefully as words in a missive that might travel more slowly.

So even though I may appear to be sitting in front of a computer for an hour, I am in fact ‘composing’: plucking words from the vacuum of possibility, and then distilling the raw words to create clear and unambiguous text.

Anyway, I think that bit may have been a bit boring for him.


Secondly, although primarily temperature-related, it was extremely diverse.

One activity involved measuring the temperature of the air using our non-contact thermometer and hygrometer (NCTAH).

NCTAH in lab with notes

We set up the experiment in one of NPL’s ultra-stable temperature labs which we normally use for dimensional measurements.

The idea was to compare the temperature indicated by NCTAH with four conventional thermometers. However while NCTAH operated beautifully, it was the readings of the conventional sensors I couldn’t understand.

They indicated that objects in the room were hotter than the air in the room by as much as 0.3 °C. Unfortunately I was in a bit of a rush and I was bamboozled by this result. And I am still working on an answer. However I would have liked him to see something simple ‘just work’. Hey, ho.

And finally…

A film crew visited to interview me about the re-definition of the kelvin. They were charming and professional and genuinely interested in the subject.

They shot a long interview one afternoon, and then the next day they must have spent a good two hours filming me walking.

It wasn’t just walking. We spent a fair amount of time opening doors and then walking. Also walking and then opening doors.

Then it was time for a solid 30 minutes of emerging from corridors, and turning into corridors.

I am not sure what I made of the experience, and I am curious to see what the director Ed Watkins will make of the footage. But he and his colleagues seemed happy as they headed off to film at the PTB in Braunschweig, Germany.

And as for what ‘William’ made of it all, I haven’t a clue. It involved quite a lot of just ‘sitting’ and ‘keeping out of shot’.

But I guess he got to see how documentaries are constructed which might have been the most valuable experience of all.

Would you like to work with me?

July 29, 2017
Lab Panorama

The Acoustic Thermometry Lab at NPL (Photo by Sam Gibbs: thanks 🙂 )

Friends and colleagues,

  • Do you know anyone who would like to work with me?

In the next few months I expect to be starting some new projects at NPL. And this means that I will not be able to work on my existing projects 😦

So NPL have created the opportunity for someone to work with me to help complete those projects.

  • You can read about the job here.
  • It’s also on the NPL web site here where it’s described as “Research Or Higher Research Scientist – Temperature & Humidity”, reference 65552.

What’s involved?

Good question. And it is one that is still being decided.

But it would involve working mainly in the acoustic thermometry lab.

Lab Panorama with notes

In acoustic thermometry, the temperature of a gas is inferred from measurements of the speed of sound.

On the left-hand side of the picture is an apparatus that uses a spherical resonator to measure the speed of sound. It is the most accurate thermometer on Earth.

On the right-hand side of the picture is a new apparatus that uses a cylindrical resonator to measure the speed of sound and has been designed to operate up to 700 °C.
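For the curious, the inference at the heart of acoustic thermometry can be sketched in a few lines: for an ideal gas the speed of sound depends only on temperature and the gas properties, so a measured speed gives the temperature directly. The argon numbers below are approximate and purely illustrative:

```python
# Acoustic-thermometry sketch: for an ideal gas, the speed of sound is
#   c = sqrt(gamma * k_B * T / m)
# so a measured speed of sound, c, implies a thermodynamic temperature T.
k_B   = 1.380_649e-23            # Boltzmann constant, J/K
gamma = 5 / 3                    # heat-capacity ratio of a monatomic gas
m_ar  = 39.948 * 1.660_539e-27   # approximate mass of one argon atom, kg

def temperature_from_speed_of_sound(c):
    """Ideal-gas temperature (K) implied by a speed of sound c (m/s) in argon."""
    return m_ar * c ** 2 / (gamma * k_B)

# The speed of sound in argon near room temperature is about 319 m/s:
print(f"{temperature_from_speed_of_sound(319):.1f} K")   # ≈ 293 K
```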

The job would involve learning about these techniques but that wouldn’t be the main activity.

Running around the lab is 50 metres of bright yellow tubing that we refer to as ‘an acoustic waveguide’.

By measuring the transmission of sound along the tube it is possible to turn it into a useful thermometer. I hope.

Finding out whether this can be made to work practically would be one part of the job. And testing the same idea in smaller tubes would be another.

Finally, by measuring the speed of sound in air it is possible to measure the temperature of the air and we would like to investigate applications of this technology.

What does the job involve?

Well it will involve learning a lot of new stuff. Typically projects involve:

  • Programming in LabVIEW to control instruments and to acquire and analyse data.
  • Writing spreadsheets, reports and PowerPoint presentations.
  • Keeping track of stuff in a lab book.
  • Using acoustic and optical transducers.
  • Signal processing.
  • Electronics.
  • Mechanical design and construction.
  • Vacuum and gas handling systems – ‘plumbing’.

And lots more. And the chance that someone with those skills will walk through the door is pretty low.

So prior knowledge is great but the key requirement is the mindset to face all those unknown things without letting the bewilderment become overwhelming.

So we are looking for someone with enthusiasm.


Learning new stuff is painful. Especially when it seems endless.

So I couldn’t imagine working with someone who wasn’t enthusiastic about the miracle of physics.

And there is one benefit which isn’t mentioned in the advert.

To cope with the inevitable disappointments and to reward ourselves for our minor successes, our research group has freely available Tunnock’s Caramel Wafers.

Anyway, if this person isn’t you, please do pass on the opportunity to anyone you think might be interested.

The closing date for applications is 28th August 2017.


Exactitude and Inexactitude

July 19, 2017

Exactitude and Inexactitude

After being a professional physicist for more than 30 years, I realised the other day that I write for a living.

Yes, I am a physicist, and I still carry out experiments, do calculations and write computer programs.

But at the end of all these activities, I usually end up writing something: a scientific paper; a report; some notes for myself; or a blog article like this.

But although the final ‘output’ of most of what I do is a written communication of some description, nobody ever taught me to write.

I learned to write by reading what I had written. And being appalled.

Appalled by missed words and typographic errors, and by mangled ideas and inappropriate assumptions of familiarity with the subject matter.

Learning to write is a difficult, painful and never-ending process.

And over and over again I am torn between exactitude – which I seek – and inexactitude, which I have learned to tolerate for two reasons.

  • Firstly, a perfect article which is never completed communicates nothing. Lesson one for writing is that finishing is essential.
  • Secondly, an article which has all the appropriate details will be too long and may never be read by the people with whom I seek to communicate.

So in order to communicate optimally, I need to find the appropriate tension between the competing forces of exactitude and inexactitude.

This blog 

When I write for this blog, I try to write articles that are about 500 words long. I rarely succeed.

Typically, I write something. Read it. And then add explanatory text, either at the start or at the end.

But with each extra word I type, I realise that fewer and fewer people will read the article and appreciate the clarity of my writing.

And I have to acknowledge that if I had written fewer words I might have communicated something to more people.

Or even communicated more by omitting detail that people might find obfuscatory.

Indeed I have to acknowledge – and this is hard – that I could have even written something erroneous and communicated something to more people.

For example

For example, in the previous article on the GEO600 Gravity Wave detector, I said that “moving a mirror by half a wavelength of light caused the interferometer to change from constructive to destructive interference.”

Now I know what you are thinking: and yes, it only has to move by a quarter of a wavelength of light.

I realised this before I finished the article but it had already taken hours, and I had already recorded the narrative to the movie.

Similarly, my animation showed one of the reflections coming from the wrong side of a piece of glass (!), and it omitted the normal ‘compensator’ plate in the interferometer.

And how many people noticed or complained? None so far.

So the article was published and presumably communicated something, inexactly and slightly incorrectly. And it was not wholly erroneous.

Exactitude and Inexactitude.

Exactitude and Inexactitude are like two mis-matched protagonists in a ‘buddy movie’.

At the start they hate each other, but over the course of ‘a journey’ in which they are compelled to accompany one another, they learn to love each other for what they are, and to accept each other for what they are not.

Inexactitude: You drive me crazy, but I love you.

Gravity Wave Detector#2

July 15, 2017

GEO600 One arm


After presenting a paper at the European Society of Precision Engineering and Nanotechnology (EUSPEN) in Hannover back in May, I was offered the chance to visit a Gravity Wave Detector. Wow! I jumped at the opportunity!

The visiting delegation were driven in a three-minibus convoy for about 30 minutes, ending up in the middle of a field of cabbages.

After artfully turning around and re-tracing our steps, we found a long, straight, gated track running off the cabbage-field track.

Near the gate was a shed, and alongside the road ran some corrugated sheet covering what looked like a drainage ditch.

These were the only clues that we were approaching one of the most sensitive devices that human beings have ever built: the GEO600 gravity-wave detector (Wikipedia or GEO600 home page).

Even as we drove down the road, the device in ‘the ditch’ was looking for changes in the length of the 600 metre arm of less than one thousandth the diameter of a single proton.

Nothing about how to achieve such sensitivity is obvious. And as my previous article made clear, there have been many false steps along the way.

But even the phenomenal sensitivity of this detector turns out to be not quite good enough to detect the gravity waves from colliding black holes.

In order to detect recent events GEO600 would have to have been between 3 and 10 times more sensitive.

The measuring principle

The GEO600 device as it appears above ground is illustrated in the drone movie above.

It consists of a series of huts and an underground laboratory at the intersection of two 600 metre long ‘arms’.

In the central laboratory, a powerful (30 watt) laser shines light of a single wavelength onto a beam-splitter: a piece of glass with a thin metal coating.

The beam-splitter reflects half the light and transmits the other half, creating two beams which travel at 90° to each other along the two arms of the device.

At the end of the arms, a mirror reflects the light back to the beam-splitter and onto a light detector where the beams re-combine.

Aside from the laser, all the optical components are suspended from anti-vibration mountings inside vacuum tubes about 50 cm in diameter.

When set up optimally, the light traversing the two arms interferes destructively, giving almost zero light signal at the detector.

But a motion of one mirror by half of a wavelength of light (~0.0005 millimetres), will result in a signal going from nearly zero watts (when there is destructive interference) to roughly 30 watts (when there is constructive interference).

So this device – which is called a Michelson Interferometer – senses tiny differences in the path of light in the two arms. These differences might be due to the motion of one of the mirrors, or due to light in one arm being delayed with respect to light in the other arm.
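The interferometer’s response can be sketched as follows. The two recombined beams differ in phase by 2π·ΔL/λ, where ΔL is the path difference, and because the light travels out and back along an arm, a mirror motion of d changes ΔL by 2d. The 30 watt and 1064 nm figures are the ones quoted for GEO600; the formula is the standard two-beam interference result, not taken from the GEO600 team:

```python
import math

# Two-beam interference sketch for a Michelson interferometer.
# A mirror displacement d changes the optical path difference by 2*d,
# because the light travels out and back along the arm.
P0  = 30.0        # laser power, watts (full bright-fringe level)
lam = 1.064e-6    # laser wavelength, metres (1064 nm)

def detector_power(d):
    """Power (W) at the detector for a mirror displacement d (m) away
    from the destructive-interference ('dark fringe') operating point."""
    phase = 2 * math.pi * (2 * d) / lam   # phase difference between beams
    return P0 * math.sin(phase / 2) ** 2

print(detector_power(0.0))        # dark fringe: 0 W
print(detector_power(lam / 4))    # bright fringe: 30 W
```

As the ‘Exactitude and Inexactitude’ article above confesses, the out-and-back path means a quarter-wavelength mirror motion is already enough for a full dark-to-bright swing.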


The basic sensitivity to motion can be calculated (roughly) as follows.

Shifting one mirror by half a wavelength (roughly 0.0005 millimetres) results in an optical signal increasing from near zero to roughly 30 watts, a sensitivity of around 60,000 watts per millimetre.

Modern silicon detectors can detect perhaps a pico-watt (10⁻¹² watts) of light.

So the device can detect a motion of just

10⁻¹² watts ÷ 60,000 watts per millimetre

or roughly 2 × 10⁻¹⁷ mm, which is about 10⁻²⁰ metres. Or one hundred-thousandth of the diameter of a proton!

If the beam paths are each 600 metres long then the ability to detect displacements is equivalent to a fractional strain of roughly 10⁻²³ in one beam path over the other.

So GEO600 could, in principle, detect a change in length of one arm compared to the other by a fraction:

0.000 000 000 000 000 000 000 01

There are lots of reasons why this sensitivity is not fully realised, but that is the basic operating principle of the interferometer.
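The back-of-envelope arithmetic above can be checked in a few lines of Python, using the same rough figures quoted in the text:

```python
# Rough sensitivity estimate for GEO600, using the round figures from the text.
laser_power_watts = 30.0
half_wavelength_mm = 0.0005        # mirror motion for zero -> full signal
detectable_power_watts = 1e-12     # ~1 picowatt, a modern silicon detector
arm_length_m = 600.0

# The optical signal changes by ~30 W over ~0.0005 mm of motion:
sensitivity_w_per_mm = laser_power_watts / half_wavelength_mm  # 60,000 W/mm

# Smallest detectable motion, in millimetres and in metres:
min_motion_mm = detectable_power_watts / sensitivity_w_per_mm  # ~1.7e-17 mm
min_motion_m = min_motion_mm / 1000.0                          # ~1.7e-20 m

# Fractional strain: change in one 600 m arm relative to the other.
strain = min_motion_m / arm_length_m                           # ~3e-23

print(f"{sensitivity_w_per_mm:.0f} W/mm, {min_motion_m:.1e} m, strain {strain:.1e}")
```

The result agrees with the figures in the text: a displacement of a few times 10⁻²⁰ metres, and a strain of order 10⁻²³.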

The ‘trick’ is isolation

The scientists running the experiment think that a gravity wave passing through the detector will cause tiny, fluctuating changes in the length of one arm of GEO600 compared with the other arm.

The changes they expect are tiny which is why they made GEO600 so sensitive.

But in the same way that a super-sensitive microphone in a noisy room would just make the noise appear louder, so GEO600 is useless unless it can be isolated from noise and vibrations.

So the ‘trick’ is to place this extraordinarily sensitive ‘microphone’ into an extraordinarily ‘quiet’ environment. This is very difficult.

If one sits in a quiet room, one can slowly become aware of all kinds of noises which were previously present, but of which one was unaware:

  • the sound of the flow of blood in our ears;
  • the sound of the house ‘creaking’;
  • other ‘hums’ of indeterminate origin.

Similarly, GEO600 can ‘hear’ previously unimaginably ‘quiet’ sounds:

  • the ground vibrations of Atlantic waves crashing on the shores of Europe;
  • the atom-by-atom ‘creeping’ of the suspension holding the mirrors.


So during an experiment, the components of GEO600 sit in a vacuum and the mirrors and optical components are suspended from silica (glass) fibres, which are themselves suspended from the end of a spring-on-a-spring-on-a-spring!

In the photograph below, the stainless steel vacuum vessels containing the key components can be seen in the underground ‘hub’ at the intersection of the two arms.

GEO600 Beam Splitter

They are as isolated from the ‘local’ environment as possible.

The output of the detector – the brightness of the light on the detector – is shown live on one of the many screens in the control ‘hut’.

GEO 600 Control Centre

But instead of a graph of ‘brightness versus time’, the signal is shown as a graph of the frequencies of vibration detected by the silicon detector.


The picture below shows a graph of the strain – the difference in length of the two arms – detected at different frequencies.

[Please note the graph is what scientists call ‘logarithmic’. This means that a given distance on either axis corresponds to a constant multiplier. So each group of horizontal lines corresponds to a change in strain by a factor of 10, and the maximum strain shown on the vertical axis is 10,000 times larger than the smallest strain shown.]

Sensitivity Curve

The picture above shows two traces, with four key features:

  • The blue curve showed the signal being detected as we watched. The red curve was the best performance of the detector. So the detector was performing close to its optimal performance.
  • Both curves are large at low frequencies, have a minimum close to 600 Hz, and then rise slowly: this is the background noise of the detector. Ideally the team would like it to be about 10 times lower, particularly at low frequencies.
  • Close to the minimum is a large cluster of spikes: these are the natural frequencies of vibration of the mirror suspensions and the other optical components.
  • There are lots of spikes caused by specific noise sources in the environment.

If a gravity wave passed by…

…it would appear as a sudden spike at a particular frequency, and this frequency would then increase, and finally the spike would disappear.

It would be over in less than a second.

And how could they tell it was a gravity wave and not just random noise? Well, that’s the second trick: gravity wave detectors hunt in pairs.

The signal from this detector is analysed alongside signals from other gravity wave detectors located thousands of kilometres away.

If the signal came from a gravity wave, then they would expect to see a similar signal in the second detector either just before or just afterwards – within a ‘time window’ consistent with a wave travelling at the speed of light.
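A minimal sketch of that coincidence test, assuming only the detector separation and the speed of light (the numbers and the function name are illustrative – the real analysis pipelines are far more sophisticated):

```python
SPEED_OF_LIGHT_M_PER_S = 3.0e8

def could_be_same_wave(t1_s, t2_s, separation_m):
    """True if two candidate events are close enough in time that a single
    wave travelling at the speed of light could have caused both."""
    max_delay_s = separation_m / SPEED_OF_LIGHT_M_PER_S
    return abs(t1_s - t2_s) <= max_delay_s

# Detectors ~3000 km apart: the light-travel time between them is 10 ms.
print(could_be_same_wave(0.000, 0.008, 3.0e6))  # within the window
print(could_be_same_wave(0.000, 0.050, 3.0e6))  # too far apart in time
```

A signal appearing in both detectors inside this window is a candidate; one appearing in only a single detector is almost certainly local noise.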


Because powerful lasers were in use, visitors were obliged to wear laser goggles!

This was the second gravity wave detector I have seen that has never detected a gravity wave.

But I have seen this one in the new era, in which we know these waves exist.

People have been actively searching for these waves for roughly 50 years, and I am filled with admiration for the nobility of the researchers who spent their careers searching for them without success.

But the collective effect of these decades of ‘failure’ is a collective success: we now know how to ‘listen’ to the Universe in a new way, one which will probably revolutionise how we look at the cosmos in the coming centuries.

A 12-minute Documentary

Gravity Wave Detector #1

July 6, 2017
Me and Albert Einstein

Not Charlie Chaplin: That’s me and Albert Einstein. A special moment for me. Not so much for him.

I belong to an exclusive club! I have visited two gravity wave detectors in my life.

Neither of the detectors has ever detected gravity waves, but nonetheless both of them filled me with admiration for their inventors.

Bristol, 1987 

In 1987, the buzz of the discovery of high-temperature superconductors was still intense.

I was in my first post-doctoral appointment at the University of Bristol and I spent many late, late nights ‘cooking up’ compounds and carrying out experiments.

As I wandered around the H. H. Wills Physics department late at night I opened a door and discovered a secret corridor underneath the main corridor.

Stretching for perhaps 50 metres along the subterranean hideout was a high-tech arrangement of vacuum tubing, separated every 10 metres or so by a ‘castle’ of vacuum apparatus.

It lay dormant and dusty and silent in the stillness of the night.

The next day I asked about the apparatus at morning tea – a ritual amongst the low-temperature physicists.

It was Peter Aplin who smiled wryly and claimed ownership. Peter was a kindly antipodean physicist, a generalist – and an expert in electronics.

New Scientist article from 1975

He explained that it was his new idea for a gravity wave detector.

In each of the ‘castles’ was a mass suspended in vacuum from a spring made of quartz.

He had calculated that by detecting ‘ringing’ in multiple masses, rather than in a single mass, he could make a detector whose sensitivity scaled as its Length² rather than as its Length.

He had devised the theory; built the apparatus; done the experiment; and written the paper announcing that gravity waves had not been detected with a new limit of sensitivity.

He then submitted the paper to Physical Review. It was at this point that a referee had reminded him that:

When a term in L² is taken from the left-hand side of the equation to the right-hand side, it changes sign. You will thus find that in your Equation 13, the term in L² will cancel.

And so his detector was not any more sensitive than anyone else’s.

And so…

If it had been me, I think I might have cried.

But as Peter recounted this tale, he did not cry. He smiled and put it down to experience.

Peter was – and perhaps still is – a brilliant physicist. And amongst the kindest and most helpful people I have ever met.

And I felt inspired by his screw up. Or rather I was inspired by his ability to openly acknowledge his mistake. Smile. And move on.

30 years later…

…I visited GEO600. And I will describe this dramatically scaled-up experiment in my next article.

P.S. (Aplin)

Peter S Aplin wrote a review of gravitational wave experiments in 1972 and had a paper at a conference called “A novel gravitational wave antenna”. Sadly, I don’t have easy access to either of these sources.

