Archive for the ‘Personal’ Category

Impact

December 3, 2017

I can collect my state retirement pension in just 97 months.

The closeness of this date – and its week-by-week countability – is a great comfort to me when I feel under pressure at work.

And one abiding pressure is the requirement that I personally create ‘impact’ from my work.

I had cause to reflect on this when I visited the Royal Society in London last week.

The Royal Society

The Royal Society has a lot of marble and in one portion of that marble is carved its admirable motto.

Nullius in verba

or

Take nobody’s word for it

This is a reflection of the belief of the founding fellows in the primacy of experimental results over beliefs.

In short, it is a declaration that science has to deal with the world as it is, and not as we would wish it to be.


The Royal Society also has the best door handles I have ever seen.


The door handles evoke the double strands of DNA, and I was told they had to be re-made because the first batch had the wrong chirality!

And just in case you were in doubt about its prestige, its walls are lined with portraits of its past presidents: Newton, Boyle, Darwin, Kelvin, Rutherford, and…

Hans Sloane


Hans Sloane was the president of the Royal Society who followed Sir Isaac Newton: a tough act to follow.

But in terms of ‘impact’ I think he may even have exceeded Newton. History records three great achievements.

  • He founded the Chelsea Physic Garden for the study of plants from all over the world. Mmmm. Not bad.
  • He donated his collection of antiquities to found the British Museum. Mmmm. Impressive.
  • But finally, as recorded on a small label underneath his portrait, comes by far his greatest achievement: he invented Milk Chocolate!

Citing Nullius in verba, I am disinclined to believe that he really invented Milk Chocolate. I suspect he re-invented it or modified a previous recipe.

But as I imagine him filling out his annual appraisal form, I feel sure that in the ‘impact’ section, he would have mentioned the popularisation of chocolate as his most significant achievement.

And since his death in 1753, literally billions of human beings have experienced momentary pleasure, or relief from anxiety, by simply eating a small amount of chocolate.

In all of history, has humanity ever had a greater benefactor? 

 

 

How do we know anything?

November 18, 2017


This is an edited video of a talk I gave recently to the NPL Postgraduate Institute about the forthcoming changes to the International System of Units, the SI.

It’s 36 minutes long and you can download the PowerPoint slides here.

It features the first ever public display of the Standard Michael – the artefact defining length in the SM, le systeme de moi, or My System of Units. (It’s shown at about 6 minutes 40 seconds into the video).

The central thesis of the talk is summarised in the slide below:

Measurement

In the talk I explain how the forthcoming changes to the SI will improve future measurements.

I hope you enjoy it.

 

Before understanding comes familiarity

November 14, 2017

Averil Horton

It is tough being an adult. Hey. We all know that.

But it is especially tough if you realise as an adult that science fascinates you. There are relatively few places where you can go and learn about science without being condescended to, or treated like a child.

I tried to create such an environment when I ran Protons for Breakfast and my friend Averil Horton is now trialling her ‘Science Club’ with adults.

I am attending as a helper – which is delightfully low stress compared to running a class!

And the key insight of which I have been reminded repeatedly is that experience has to come before understanding. And for adults, just gaining exposure to the experiential pleasure of hands-on experimentation is so difficult!

I won’t describe the classes in detail, but below I will just post a couple of pictures showing the kinds of things people do. And with the exception of a couple of potentially dangerous things – everyone does everything!

Cutting Potassium…


Chromatography…

Colours

Growing Silver…


Burning Magnesium…


Experiments with density…


Burning hydrocarbons…


And we are not even half way through!

Measuring Temperature with Sound

November 12, 2017

Measurement

I have just given the first of a series of five talks for The Training Partnership, a company that provides ‘enrichment’ days for A level students.

Since one of my key messages is the importance of measurement in science, I feel obliged to perform some measurements during the presentation.

I find this worrisome, but I think it works well. When it works!

Anyway, with four more presentations to go I thought I would create a page with links to all the resources I use in the talk.

PowerPoint

The PowerPoint presentation can be found here. Please feel free to steal animations if you think they will be helpful, but please give credit to NPL.

Software

During the presentation I use:

  • Audacity for capturing acoustic wave-forms and analysing them: it is astonishing software, and completely free.
  • Sound Card Oscilloscope for detecting the resonance within the spherical resonator: it is excellent and free for educational users. It also comes with a built-in oscillator, but for the demo it is much clearer if I use a separate device. So I use…
  • Signal Generator, an app for iOS devices. There are many others for both iOS and Android, but this one is fine and costs £0.99.

And this is the spreadsheet I use to interpret the results from the experiment.
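
For anyone who prefers code to a spreadsheet, here is a minimal sketch of the kind of calculation involved – inferring temperature from the fundamental resonance of the tube. It assumes a tube open at both ends, dry air treated as an ideal gas, and it ignores end corrections; the function names and example numbers are mine for illustration, not taken from the spreadsheet.

```python
# Minimal sketch: infer air temperature from the fundamental resonance of a tube.
# Assumes a tube open at both ends, dry air as an ideal gas, and no end corrections.

GAMMA = 1.4        # adiabatic index of dry air (approximate)
R = 8.314          # molar gas constant, J/(mol K)
M_AIR = 0.02897    # molar mass of dry air, kg/mol

def speed_of_sound_from_resonance(f1_hz: float, tube_length_m: float) -> float:
    """For a tube open at both ends, the fundamental is f1 = c / (2 L)."""
    return 2.0 * tube_length_m * f1_hz

def temperature_from_speed(c_m_per_s: float) -> float:
    """Invert c = sqrt(gamma * R * T / M) to obtain T in kelvin."""
    return c_m_per_s ** 2 * M_AIR / (GAMMA * R)

if __name__ == "__main__":
    f1 = 156.0     # example measured fundamental frequency, Hz
    length = 1.1   # tube length, m
    c = speed_of_sound_from_resonance(f1, length)
    T = temperature_from_speed(c)
    print(f"speed of sound ≈ {c:.1f} m/s, temperature ≈ {T:.1f} K ({T - 273.15:.1f} °C)")
```

With those example numbers the sketch gives a speed of sound of about 343 m/s and a temperature of about 293 K (20 °C).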

Hardware

In my talk I use the same microphone for all the demonstrations, a commercial lapel microphone from RS Components (RS Stock No. 242-8911), which costs about £20. Similar devices are available from other suppliers.

I chose this particular model because it is more robust than home-made contraptions and has a small head – so it fits inside tubes. Larger microphones will work but they tend to damp acoustic waves more strongly.

I hold it in place with a blob of Blu-Tack.

The miniature loudspeaker I use for the resonator demo is quite specialised. It is from a range of products used in headphones, mobile phones and hearing aids produced by the Knowles corporation.

I think the model I used is from this ‘BK’ series. It requires wires to be soldered onto very tiny terminals, and then wired to a 4 mm jack plug that can connect to a mobile phone.

One alternative would be to dismantle a pair of in-ear headphones and just use the loudspeaker from one earphone.

Tube and Resonator

The metal tube I use in the talk is a 1.1 metre long stainless steel tube, approximately 9.5 mm in diameter. You can also use many other types of tubing, such as copper or steel plumbing tube.

In general, a longer tube gives more accurate measurements at room temperature, but it is obviously more difficult to heat a longer tube uniformly.

The resonator is a 3-D printed version of the copper resonator we used to measure the Boltzmann constant and make the most accurate temperature measurements in history.

I have placed the 3-D printing files in a zipped folder here. There are files for the Northern Hemisphere, the Southern Hemisphere, and the plugs. Creating the resonator is quite complicated and I will write a separate blog post on that later.

Good luck!

 

The Past, Present and Future of Measurement

October 22, 2017

Measurement, the simple process of comparing an unknown quantity with a standard quantity, is the essential component of all scientific endeavours. We are about to enter a new epoch of metrology, one which will permit the breath-taking progress of the last hundred years to continue unimpeded into the next century and beyond.

The dawn of this new age has been heralded this week by the publication of an apparently innocuous paper in the journal Metrologia. The paper is entitled:

Data and Analysis for the CODATA 2017 Special Fundamental Constants Adjustment

and its authors, Peter Mohr, David Newell, Barry Taylor and Eite Tiesinga, constitute the Committee on Data for Science and Technology, commonly referred to as CODATA. In this article I will try to explain the relevance of CODATA’s paper to developments in the science of metrology.

The Past

The way human beings began to make sense of their environment was by measuring it. We can imagine that our agrarian ancestors might have wondered whether crops were taller or heavier this year than last. Or whether plants grew better in one field rather than another. And they would have answered these questions by creating standard weights and measuring rods.

But to effectively communicate their findings, the standard units of measurement would need to be shared. First between villages, and then towns, and then counties and kingdoms. Eventually entire empires would share a system of measurement.

First, units of weight and length were shared. Then, as time became more critical for scientific and technical endeavours, units of time were added to systems of measurement. And these three quantities – mass, length and time – are shared by all systems of units.

These quantities formed the so-called ‘base units’ of a system of measurement. Many other quantities could be described in terms of these ‘base units’. For example, speeds would be described in multiples of [the base unit of length] divided by [the base unit of time]. They might be [feet] per [second] in one system, or [metres] per [second] in another.

Over the last few hundred years, the consistent improvement in measurement techniques has enabled measurements with reduced uncertainty. And since no measurement can ever have a lower uncertainty than the standard quantity in that system of units, there has been a persistent drive to have the most stable, most accurately-known standards, so that they do not form a barrier to improved measurements.

The Present

Presently, all scientific and technical measurements on Earth are made using the International System of Units, the SI. The naming of this system – as an explicitly international system – represented a profound change in conception. It is not an ‘imperial’ system or an ‘English’ system, but a shared enterprise administered by the International Bureau of Weights and Measures (BIPM), a laboratory located in diplomatically-protected land in Sèvres, near Paris, France. Its operation is internationally funded by the dozens of nations who have signed the international treaty known as the Convention of the Metre.

In essence, the SI is humanity’s standard way of giving quantitative descriptions of the world around us. It is really an annex to all human languages, allowing all nationalities and cultures to communicate unambiguously in the realms of science and engineering.

Founded in 1960, the SI was based upon the system of measurement using the metre as the unit of length, the kilogram as the unit of mass, and the second as the unit of time. It also included three more base units.

The kelvin and degree Celsius were adopted as units of temperature, and the ampere was adopted as the unit of electric current. The candela was adopted as the unit of luminous intensity – a measure of how bright lights of different colours appear to human beings. And then in 1971 the often qualitative science of chemistry was included in the fold with the introduction of the mole as a unit of amount of substance, a recognition of the increasing importance of analytical measurements.

SI Circle - no constants

The SI is administered by committees of international experts that seek to make sure that the system evolves to meet humanity’s changing needs. Typically these changes are minor and technical, but in 1983 an important conceptual change was made.

Since the foundation of the SI, the ability to measure time had improved more rapidly than the ability to measure length. It was realised that if the metre was defined differently, then length measurements could be improved.

The change proposed was to define what we mean by ‘one metre’ in terms of the distance travelled by light, in a vacuum, in a fixed time. Based on Einstein’s insights, the speed of light in a vacuum, c, is thought to be a universal constant, but at the time it had to be measured in terms of metres and seconds, i.e. human-scale measurement standards. This proposal defined a metre in terms of a natural constant – something we believe is truly constant.

The re-definition went well, and set metrologists thinking about whether the change could be adopted more widely.

The Future

Typically every four years, CODATA examine the latest measurements of a range of natural constants, and propose the best estimates of their values.

Measurement Graphic

This is strange. We believe that the natural constants are really constant, not having changed measurably since the first few seconds of our universe’s existence, whereas our human standards are at most a few decades old, and (as with all human standards) are subject to slow changes. Surely it would make more sense to base our measurement standards on these fundamental constants of the natural world? This insight is at the heart of the changes which are about to take place. The CODATA publication this week is the latest in a series of planned steps that will bring about this change on 20th May 2019.

Constants Graphic

After years of work by hundreds of scientists, the values of the natural constants recommended by the CODATA committee will be fixed – and will form the basis for the new definitions of the seven SI base units.

What will happen on 20th May 2019?

On the 20th May 2019, revised definitions of four of the base units of the SI will come into force. More than 10 years of careful measurements by scientists world-wide will ensure that the new definitions are, as closely as possible, equivalent to the old definitions.

The change is equivalent to removing the foundations underneath a structure and then inserting new foundations which should leave the structure supported in exactly the same way. However the new foundations – being based on natural constants rather than human artefacts – should be much more stable than the old foundations.

If the past is any guide to the future, then in the coming decades and centuries, we can anticipate that measurement technology will improve dramatically. However we cannot anticipate exactly how and where these improvements will take place. By building the SI on foundations based on the natural constants, we are ensuring that the definitions of the unit quantities of the SI will place no restriction whatever on these future possible improvements.

The kilogram

The kilogram is the SI unit of mass. It is currently defined as the mass of the International Prototype of the Kilogram (IPK), a cylinder of platinum-iridium alloy held in a safe at the BIPM. Almost every weighing in the world is, indirectly, a comparison against the mass of this artefact.

On 20th May 2019, this will change. Instead, the kilogram will be defined in terms of a combination of fundamental constants including the Planck constant, h, and the speed of light, c. Although more abstract than the current definition, the new definition is thought to be at least one million times more stable.

The new definition will enable a new kind of weighing technology called a Kibble balance. Instead of balancing the weight of a mass against another object whose mass is known by comparison with the IPK, the weight will be balanced by a force which is calculable in terms of electrical power, and which can be expressed as a multiple of the fundamental constants e, h and c.
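
As a rough sketch of the principle (and not a description of any particular laboratory’s apparatus), the balance is operated in two modes, and combining them eliminates the magnetic field and coil geometry:

```latex
% Sketch of the Kibble balance principle (illustrative only).
% Weighing mode: an electromagnetic force on a current-carrying coil balances the weight
m g = B L I
% Moving mode: the same coil, moved at speed v through the field, generates a voltage
V = B L v
% Eliminating the geometry factor B L equates mechanical and electrical power
m g v = V I
% Because V and I can be measured using the Josephson and quantum Hall effects,
% the product V I can be expressed in terms of the Planck constant h, tying the
% mass m to the fixed value of h.
```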

The ampere

The ampere is the SI unit of electrical current. It is presently defined in terms of the current which, if it flowed in two infinitely thin, infinitely long, parallel wires would (in vacuum) produce a specified force between the wires. This definition, arcane even by metrologists’ standards, was intended to allow the measurement of the ampere in terms of the force between carefully constructed coils of wire. Sadly, it was out of date shortly after it was implemented.

On 20th May 2019, this will change. Instead, the ampere will be defined in terms of a particular number of electrons per second, each with an exactly specified electrical charge e, flowing past a point on a wire. This definition finally corresponds to the way electric current is described in textbooks.

The new definition will give impetus to techniques which create known electrical currents by using electrical devices which can output an exactly countable number of electrons per second. At the moment these devices are limited to approximately 1 billion (a thousand million) electrons per second, but in future this is likely to increase substantially.
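
To put some illustrative numbers on this, here is a short calculation using the value of the elementary charge that will be fixed in the revised SI; the ‘1 billion electrons per second’ figure is the one quoted above.

```python
# Illustrative arithmetic for the revised ampere.
e = 1.602176634e-19  # elementary charge in coulombs (fixed exactly in the revised SI)

# One ampere corresponds to this many electrons passing a point each second:
electrons_per_second_for_one_amp = 1.0 / e
print(f"{electrons_per_second_for_one_amp:.3e} electrons per second")  # ~6.24e18

# A single-electron pump delivering ~1 billion electrons per second produces
# only a very small current:
pump_current_amps = 1e9 * e
print(f"{pump_current_amps:.2e} A")  # ~1.6e-10 A, i.e. about 0.16 nanoamps
```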

The kelvin

The kelvin is the SI unit of temperature. It is currently defined as the temperature of the ‘triple point of water’. This temperature – at which liquid water, solid water (ice) and water vapour (but no air) co-exist in equilibrium – is defined to be 273.16 kelvin exactly. Glass cells re-creating this conjunction are located in every temperature calibration lab in the world, and every temperature measurement is a comparison of how much hotter a temperature is than the temperature at one position within a ‘triple point of water cell’.

On 20th May 2019, this will change. Instead, the kelvin will be defined in terms of a particular amount of energy per molecule as specified by the Boltzmann constant, kB. This definition finally corresponds to the way thermal energy is described in textbooks.

The requirement to compare every measurement of temperature with the temperature of the triple point of water adds uncertainty to measurements at extremely low temperatures (below about 20 K) and at high temperatures (above about 1300 K). The new definition will immediately allow small improvements in these measurement ranges, and further improvements are expected to follow.

The definition of the degree Celsius (°C) in terms of the kelvin will remain unchanged.

The mole

The mole is the SI unit of ‘amount of substance’. It is currently defined as the amount of substance which contains the same number of ‘elementary entities’ as there are atoms in 12 grams of carbon-12. The change in the definition of the kilogram required a re-think of this definition.

On 20th May 2019, it will change. The mole will be defined as the amount of substance which contains a particular, exactly specified, number of elementary entities. This number – known as the Avogadro number, NA – is currently estimated experimentally, but in future it will have a fixed value.

The specification of an exact number of entities effectively links the masses of microscopic entities such as atoms and molecules to the new definition of the kilogram.

The ‘New’ SI

On 20th May 2019 four of the seven base units of the SI will be re-defined. But what of the other three?

The second is already defined in terms of the natural frequency of microwaves emitted by atoms of a particular caesium isotope. The metre is defined in terms of the second and the speed of light in vacuum – a natural constant. And the candela is defined in terms of Kcd, the only natural constant in the SI that relates to human beings. So from 20th May 2019 the entire SI will be defined in terms of natural constants.

SI Circle - with constants

The SI is not perfect. And it will not be perfect even after the redefinitions come into force. This is because it is a system devised by human beings, for human beings. But by incorporating natural constants into the definitions of all its base units, the SI has taken a profound step towards being a system of measurement which will enable ever greater precision in metrology.

And who knows what features of the Universe these improved measurements will reveal.

Would you like to work with me?

July 29, 2017
Lab Panorama

The Acoustic Thermometry Lab at NPL (Photo by Sam Gibbs: thanks 🙂 )

Friends and colleagues,

  • Do you know anyone who would like to work with me?

In the next few months I expect to be starting some new projects at NPL. And this means that I will not be able to work on my existing projects 😦

So NPL have created the opportunity for someone to work with me to help complete those projects.

  • You can read about the job here.
  • It’s also on the NPL web site here where it’s described as “Research Or Higher Research Scientist – Temperature & Humidity”, reference 65552.

What’s involved?

Good question. And it is one that is still being decided.

But it would involve working mainly in the acoustic thermometry lab.

Lab Panorama with notes

In acoustic thermometry, the temperature of a gas is inferred from measurements of the speed of sound.
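
As a rough sketch of the physics behind this (an ideal-gas approximation, not the working equations of the apparatus):

```latex
% Ideal-gas approximation linking the speed of sound c to temperature T,
% where gamma is the adiabatic index, k_B the Boltzmann constant and m the
% average molecular mass of the gas:
c = \sqrt{\frac{\gamma k_B T}{m}}
\quad \Longrightarrow \quad
T = \frac{m c^2}{\gamma k_B}
% So a measurement of c in a gas of known composition yields the temperature.
```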

On the left-hand side of the picture is an apparatus that uses a spherical resonator to measure the speed of sound. It is the most accurate thermometer on Earth.

On the right-hand side of the picture is a new apparatus that uses a cylindrical resonator to measure the speed of sound and has been designed to operate up to 700 °C.

The job would involve learning about these techniques but that wouldn’t be the main activity.

Running around the lab is 50 metres of bright yellow tubing that we refer to as ‘an acoustic waveguide’.

By measuring the transmission of sound along the tube it is possible to turn it into a useful thermometer. I hope.

Finding out whether this can be made to work practically would be one part of the job. And testing the same idea in smaller tubes would be another.

Finally, by measuring the speed of sound in air it is possible to measure the temperature of the air and we would like to investigate applications of this technology.

What does the job involve?

Well it will involve learning a lot of new stuff. Typically projects involve:

  • Programming in LabVIEW to control instruments and acquire and analyse data.
  • Writing spreadsheets, reports and PowerPoint presentations.
  • Keeping track of stuff in a lab book.
  • Using acoustic and optical transducers.
  • Signal processing.
  • Electronics.
  • Mechanical design and construction.
  • Vacuum and gas handling systems – ‘plumbing’.

And lots more. And the chance that someone with those skills will walk through the door is pretty low.

So prior knowledge is great but the key requirement is the mindset to face all those unknown things without letting the bewilderment become overwhelming.

So we are looking for someone with enthusiasm.

Enthusiasm?

Learning new stuff is painful. Especially when it seems endless.

So I couldn’t imagine working with someone who wasn’t enthusiastic about the miracle of physics.

And there is one benefit which isn’t mentioned in the advert.

To cope with the inevitable disappointments and to reward ourselves for our minor successes, our research group has freely available Tunnock’s Caramel Wafers.

Anyway, if this person isn’t you, please do pass on the opportunity to anyone you think might be interested.

The closing date for applications is 28th August 2017.

 

Gravity Wave Detector #2

July 15, 2017

GEO600 One arm

GEO600

After presenting a paper at the European Society of Precision Engineering and Nanotechnology (EUSPEN) in Hannover back in May, I was offered the chance to visit a Gravity Wave Detector. Wow! I jumped at the opportunity!

The visiting delegation were driven in a three-minibus convoy for about 30 minutes, ending up in the middle of a field of cabbages.

After artfully turning around and re-tracing our steps, we found a long, straight, gated track running off the cabbage-field track.

Near the gate was a shed, and alongside the road ran some corrugated sheet covering what looked like a drainage ditch.

These were the only clues that we were approaching one of the most sensitive devices that human beings have ever built: the GEO600 gravity-wave detector (Wikipedia or GEO600 home page).

Even as we drove down the road, the device in ‘the ditch’ was looking for length changes in the 600 metre road of less than one thousandth the diameter of a single proton.

Nothing about how to achieve such sensitivity is obvious. And as my previous article made clear, there have been many false steps along the way.

But even the phenomenal sensitivity of this detector turns out to be not quite good enough to detect the gravity waves from colliding black holes.

In order to detect recent events GEO600 would have to have been between 3 and 10 times more sensitive.

The measuring principle

The GEO600 device as it appears above ground is illustrated in the drone movie above.

It consists of a series of huts and an underground laboratory at the intersection of two 600 metre long ‘arms’.

In the central laboratory, a powerful (30 watt) laser shines light of a single wavelength onto a beam-splitter: a piece of glass with a thin metal coating.

The beam-splitter reflects half the light and transmits the other half, creating two beams which travel at 90° to each other along the two arms of the device.

At the end of the arms, a mirror reflects the light back to the beam-splitter and onto a light detector where the beams re-combine.

Aside from the laser, all the optical components are suspended from anti-vibration mountings inside vacuum tubes about 50 cm in diameter.

When set up optimally, the light traversing the two arms interferes destructively, giving almost zero light signal at the detector.

But a motion of one mirror by half of a wavelength of light (~0.0005 millimetres) will result in a signal going from nearly zero watts (when there is destructive interference) to roughly 30 watts (when there is constructive interference).

So this device – which is called a Michelson Interferometer – senses tiny differences in the path of light in the two arms. These differences might be due to the motion of one of the mirrors, or due to light in one arm being delayed with respect to light in the other arm.

Sensitivity

The basic sensitivity to motion can be calculated (roughly) as follows.

Shifting one mirror by half a wavelength (roughly 0.0005 millimetres) results in an optical signal increasing from near zero to roughly 30 watts, a sensitivity of around 60,000 watts per millimetre.

Modern silicon detectors can detect perhaps a pico-watt (10⁻¹² watt) of light.

So the device can detect a motion of just

10⁻¹² watts ÷ 60,000 watts per millimetre

or roughly 2 × 10⁻¹⁷ mm which is 10⁻²⁰ metres. Or one hundred thousandth the diameter of a proton!

If the beam paths are each 600 metres long then the ability to detect displacements is equivalent to a fractional strain of roughly 10⁻²³ in one beam path over the other.

So GEO600 could, in principle, detect a change in length of one arm compared to the other by a fraction:

0.000 000 000 000 000 000 000 01

There are lots of reasons why this sensitivity is not fully realised, but that is the basic operating principle of the interferometer.
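
The back-of-envelope arithmetic above can be reproduced in a few lines. The numbers are the approximate ones quoted in this post (with an assumed optical wavelength of about 1 micrometre), not GEO600 specifications.

```python
# Rough reproduction of the sensitivity estimate described above.
wavelength_m = 1.0e-6            # assumed optical wavelength, ~1 micrometre
full_fringe_power_w = 30.0       # output swings from ~0 W to ~30 W over half a wavelength
min_detectable_power_w = 1e-12   # ~1 picowatt detectable by a silicon photodiode
arm_length_m = 600.0             # length of each arm

# Power changes by ~30 W for a mirror motion of half a wavelength (~0.0005 mm)
sensitivity_w_per_m = full_fringe_power_w / (wavelength_m / 2.0)

# Smallest detectable mirror motion, and the corresponding strain
min_displacement_m = min_detectable_power_w / sensitivity_w_per_m
min_strain = min_displacement_m / arm_length_m

print(f"displacement ≈ {min_displacement_m:.1e} m, strain ≈ {min_strain:.1e}")
# -> displacement ≈ 1.7e-20 m, strain ≈ 2.8e-23
```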

The ‘trick’ is isolation

The scientists running the experiment think that a gravity wave passing through the detector will cause tiny, fluctuating changes in the length of one arm of GEO600 compared with the other arm.

The changes they expect are tiny which is why they made GEO600 so sensitive.

But in the same way that a super-sensitive microphone in a noisy room would just make the noise appear louder, so GEO600 is useless unless it can be isolated from noise and vibrations.

So the ‘trick’ is to place this extraordinarily sensitive ‘microphone’ into an extraordinarily ‘quiet’ environment. This is very difficult.

If one sits in a quiet room, one can slowly become aware of all kinds of noises which were previously present, but of which one was unaware:

  • the sound of the flow of blood in our ears;
  • the sound of the house ‘creaking’;
  • other ‘hums’ of indeterminate origin.

Similarly, GEO600 can ‘hear’ previously unimaginably ‘quiet’ sounds:

  • the ground vibrations of Atlantic waves crashing on the shores of Europe;
  • the atom-by-atom ‘creeping’ of the suspension holding the mirrors.


So during an experiment, the components of GEO600 sit in a vacuum and the mirrors and optical components are suspended from silica (glass) fibres, which are themselves suspended from the end of a spring-on-a-spring-on-a-spring!

In the photograph below, the stainless steel vacuum vessels containing the key components can be seen in the underground ‘hub’ at the intersection of the two arms.

GEO600 Beam Splitter

They are as isolated from the ‘local’ environment as possible.

The output of the detector – the brightness of the light on the detector – is shown live on one of the many screens in the control ‘hut’.

GEO 600 Control Centre

But instead of a graph of brightness versus time, the signal is shown as a graph of the frequencies of vibration detected by the silicon detector.

Results

The picture below shows a graph of the strain – the difference in length of the two arms – detected at different frequencies.

[Please note the graph is what scientists call ‘logarithmic’. This means that a given distance on either axis corresponds to a constant multiplier. So each group of horizontal lines corresponds to a change in strain by a factor of 10, and the maximum strain shown on the vertical axis is 10,000 times larger than the smallest strain shown.]

Sensitivity Curve

The picture above shows two traces, which share several key features:

  • The blue curve showed the signal being detected as we watched. The red curve was the best performance of the detector. So the detector was performing close to its best.
  • Both curves are large at low frequencies, have a minimum close to 600 Hz, and then rise slowly. This is the background noise of the detector. Ideally they would like this to be about 10 times lower, particularly at low frequencies.
  • Close to the minimum is a large cluster of spikes: these are the natural frequencies of vibration of the mirror suspensions and the other optical components.
  • There are lots of spikes caused by specific noise sources in the environment.

If a gravity wave passed by…

…it would appear as a sudden spike at a particular frequency, and this frequency would then increase, and finally the spike would disappear.

It would be over in less than a second.

And how could they tell it was a gravity wave and not just random noise? Well that’s the second trick: gravity wave detectors hunt in pairs.

The signal from this detector is analysed alongside signals from other gravity wave detectors located thousands of kilometres away.

If the signal came from a gravity wave, then they would expect to see a similar signal in the second detector either just before or just afterwards – within a ‘time window’ consistent with a wave travelling at the speed of light.
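
To give a feel for the size of that window, here is a short calculation with an illustrative detector separation of 3000 km (an example figure, not the spacing of any particular pair of detectors).

```python
# Illustrative: maximum light-travel-time delay between two widely separated detectors.
speed_of_light_m_per_s = 3.0e8
separation_m = 3.0e6             # example separation of ~3000 km

max_delay_s = separation_m / speed_of_light_m_per_s
print(f"maximum delay ≈ {max_delay_s * 1000:.0f} milliseconds")  # ≈ 10 ms
```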

Reflections


Because powerful lasers were in use, visitors were obliged to wear laser goggles!

This was the second gravity wave detector I have seen that has never detected a gravity wave.

But this visit took place in the new era in which we now know that these waves exist.

People have been actively searching for these waves for roughly 50 years and I am filled with admiration for the nobility of the researchers who spent their careers fruitlessly searching and failing to find gravity waves.

But the collective effect of these decades of ‘failure’ is a collective success: we now know how to ‘listen’ to the Universe in a new way which will probably revolutionise how we look at the Universe in the coming centuries.

A 12-minute Documentary

Gravity Wave Detector #1

July 6, 2017
Me and Albert Einstein

Not Charlie Chaplin: That’s me and Albert Einstein. A special moment for me. Not so much for him.

I belong to an exclusive club! I have visited two gravity wave detectors in my life.

Neither of the detectors has ever detected gravity waves, but nonetheless, both of them filled me with admiration for their inventors.

Bristol, 1987 

In 1987, the buzz of the discovery of high-temperature superconductors was still intense.

I was in my first post-doctoral appointment at the University of Bristol and I spent many late, late nights ‘cooking’ up compounds and carrying out experiments.

As I wandered around the H. H. Wills Physics department late at night I opened a door and discovered a secret corridor underneath the main corridor.

Stretching for perhaps 50 metres along the subterranean hideout was a high-tech arrangement of vacuum tubing, separated every 10 metres or so by a ‘castle’ of vacuum apparatus.

It lay dormant and dusty and silent in the stillness of the night.

The next day I asked about the apparatus at morning tea – a ritual amongst the low-temperature physicists.

It was Peter Aplin who smiled wryly and claimed ownership. Peter was a kindly antipodean physicist, a generalist – and an expert in electronics.


New Scientist article from 1975

He explained that it was his new idea for a gravity wave detector.

In each of the ‘castles’ was a mass suspended in vacuum from a spring made of quartz.

He had calculated that by detecting ‘ringing’ in multiple masses, rather than in a single mass, he could make a detector whose sensitivity scaled as its Length² rather than as its Length.

He had devised the theory; built the apparatus; done the experiment; and written the paper announcing that gravity waves had not been detected with a new limit of sensitivity.

He then submitted the paper to Physical Review. It was at this point that a referee had reminded him that:

When a term in L² is taken from the left-hand side of the equation to the right-hand side, it changes sign. You will thus find that in your Equation 13, the term in L² will cancel.

And so his detector was not any more sensitive than anyone else’s.

And so…

If it had been me, I think I might have cried.

But as Peter recounted this tale, he did not cry. He smiled and put it down to experience.

Peter was – and perhaps still is – a brilliant physicist. And amongst the kindest and most helpful people I have ever met.

And I felt inspired by his screw up. Or rather I was inspired by his ability to openly acknowledge his mistake. Smile. And move on.

30 years later…

…I visited GEO600. And I will describe this dramatically scaled-up experiment in my next article.

P.S. (Aplin)

Peter S Aplin wrote a review of gravitational wave experiments in 1972 and had a paper at a conference called “A novel gravitational wave antenna“. Sadly, I don’t have easy access to either of these sources.

 

What is Life?

June 28, 2017
Royal Trinity Hospice

A pond in the garden of the Royal Trinity Hospice.

On Monday, my good friend Paula Chandler died.

It seems shocking to me that I can even type those words.

She had cancer, and was in a hospice, and her passing was no surprise to her or those who loved her. But it was, and still is, a terrible shock.

It is unthinkable to me that we will never converse again.

How can someone be alive and completely self-aware and witty on Saturday; exchanging texts on Sunday evening; and then simply gone on Monday morning?

Her body was still there, but the essential spark that anyone would recognise as being ‘Paula’, was gone.

As I sat in the garden of the Royal Trinity Hospice, I reflected on a number of things.

And surrounded by teeming beautiful life, the question of “What is Life?” came to my mind. Paula would have been interested in this question.

What is life?

In particular I tried to recall the details of the eponymous book by Addy Pross.

In honesty I can’t recommend the book because it singularly fails to answer the question it sets itself.

In the same way that a book called “How to become rich” might provide an answer for the author but not the reader, so Addy Pross’s book was probably valuable for Addy Pross as he tried to clarify his thoughts. And to that extent the book is worth reading.

Life is ubiquitous on Earth, and after surveying previous authors’ reflections, Addy Pross focuses the question of “What is Life?” at one specific place: the interface between chemistry and biology:

  • In chemistry, reactions run their course blindly and become exhausted.
  • In biology, chemistry seeks out energy sources to maintain what Addy Pross calls a dynamic, kinetic stability.

So how does chemistry ‘become’ biology?

In the same way that a spinning top is stable as long as it spins, or a vortex persists in a flowing fluid, life seems to be a set of chemical reactions which exhibit an ability to ‘keep themselves going’.

What is life?

Re-naming ‘life’ as ‘dynamic kinetic stability’ does not seem to me to be particularly satisfactory.

It doesn’t explain how or why things spontaneously acquire dynamic kinetic stability any more than saying something is alive explains its aliveness.

I do expect that one day someone will answer the question of “What is Life?” in a meaningful technical way.

But for now, as I think about Paula, and the shocking disappearance of her unique dynamic kinetic stability, I am simply lost for words.

Not everything is getting worse!

April 19, 2017

Carbon Intensity April 2017

Friends, I find it hard to believe, but I think I have found something happening in the world which is not bad. Who knew such things still happened?

The news comes from the fantastic web site MyGridGB which charts the development of electricity generation in the UK.

On the site I read that:

  • At lunchtime on Sunday 9th April 2017, 8 GW of solar power was generated.
  • On Friday all coal power stations in the UK were off.
  • On Saturday, strong winds and solar combined with low demand to briefly provide 73% of power.

All three of these facts fill me with hope. Just think:

  • 8 gigawatts of solar power. In the UK! IN APRIL!!!
  • And no coal generation at all!
  • And renewable energy providing 73% of our power!

Even a few years ago each of these facts would have been unthinkable!

And even more wonderfully: nobody noticed!

Of course, these were just transients, but they show we have the potential to generate electricity with a very low carbon intensity.

Carbon Intensity is a measure of the amount of carbon dioxide emitted into the atmosphere for each unit (kWh) of electricity generated.

Wikipedia tells me that electricity generated from:

  • Coal has a carbon intensity of about 1.0 kg of CO2 per kWh
  • Gas has a carbon intensity of about 0.47 kg of CO2 per kWh
  • Biomass has a carbon intensity of about 0.23 kg of CO2 per kWh
  • Solar PV has a carbon intensity of about 0.05 kg of CO2 per kWh
  • Nuclear has a carbon intensity of about 0.02 kg of CO2 per kWh
  • Wind has a carbon intensity of about 0.01 kg of CO2 per kWh

The graph at the head of the page shows that in April 2017 the generating mix in the UK has a carbon intensity of about 0.25 kg of CO2 per kWh.
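
For the curious, a grid-average figure like that is just a generation-weighted average of the carbon intensities listed above. The mix fractions in the sketch below are made-up illustrative numbers, not actual UK grid data.

```python
# Sketch: grid-average carbon intensity as a generation-weighted average.
# Intensities (kg of CO2 per kWh) are the Wikipedia figures quoted above;
# the example mix fractions are invented for illustration.
intensity = {
    "coal": 1.0,
    "gas": 0.47,
    "biomass": 0.23,
    "solar": 0.05,
    "nuclear": 0.02,
    "wind": 0.01,
}

example_mix = {   # hypothetical fractions of total generation (they sum to 1)
    "coal": 0.02,
    "gas": 0.45,
    "biomass": 0.08,
    "solar": 0.10,
    "nuclear": 0.20,
    "wind": 0.15,
}

grid_intensity = sum(example_mix[src] * intensity[src] for src in example_mix)
print(f"grid average ≈ {grid_intensity:.2f} kg of CO2 per kWh")  # ≈ 0.26 with this mix
```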

MyGridGB’s mastermind is Andrew Crossland. On the site he has published a manifesto outlining a plan which would actually reduce our carbon intensity to less than 0.1 kg of CO2 per kWh.

What I like about the manifesto is that it is eminently doable.

And who knows? Perhaps we might actually do it?

Ahhhh. Thank you Andrew.

Even thinking that a good thing might still be possible makes me feel better.

 

