Archive for September, 2011

The Speed of Neutrinos: Part 2

September 29, 2011
The xkcd take on the neutrino speed measurements. Click to link to xkcd site

It would be great to have an experiment that nobody on Earth could understand. Now there are certainly things which we don’t fully understand:

  • We don’t understand how rain drops form in a cloud (link).
  • We don’t understand how life came to be.
  • We don’t understand what ‘dark matter’ is.
  • We don’t know how high-temperature superconductors work.

But I don’t count any of these in the same category of mysterious as the prospect of super-luminal neutrinos.

  • I don’t doubt for a moment that we will eventually understand how rain drops form in a cloud.
  • I don’t know whether we will ever understand how life came to be, but I have no problem with the idea that ‘something happened’ – it’s just that I have no idea what that something is!
  • I don’t trust any cosmological deductions from astronomical data – it’s all so far away and we know so little about what is going on. I have no trouble being amazed but I feel sure that discoveries over further decades will resolve matters.
  • We only fail to understand the details of how high-temperature superconductors work: the general phenomenon is fairly well understood.

But an experiment that we can do in a lab on Earth, and yet which unequivocally disobeys accepted laws? That would be amazing. And 100 years ago that was exactly the case. Although the nascent science we now call Physics was triumphant, with successful theories of light and the structure of matter, there were several experiments that were utterly inexplicable.

  • Most notable was the phenomenon of radioactivity: certain minerals just spontaneously got hot! This contravened the idea of conservation of energy and the laws of thermodynamics. And nobody knew what was happening. It took decades before our understanding of atoms enlarged sufficiently that radioactivity ‘made sense’.

In 1901 Lord Kelvin drew attention to ‘two small clouds’ on the otherwise clear horizon of 19th Century physics: the difficulty in understanding the Michelson-Morley experiment and the inability to understand the heat capacity of gases. As it turned out, these ‘clouds’ heralded intellectual ‘storms’ which changed our view of the world profoundly.

  • The heat capacity of a gas is a measure of how much energy it takes to increase the temperature of a fixed amount of gas by one degree. How dull a measurement is that! The theories of physics said the answer should be the same for all gases – but in fact the answer differed for different gases – and changed with temperature. This simple fact could not be explained until the theory of Quantum Mechanics was developed – and this radically changed our view of all microscopic processes.
  • The Michelson-Morley experiment measured differences in the speed of light in two perpendicular directions at the same time. The world was astounded to find out that the speed of light was the same in every direction – no matter how fast the Earth was moving around the Sun or through the cosmos. When Einstein explained this, he changed our concept of the ‘relativity’ of motion, and out of his explanation arose the idea that particles of matter could not exceed the speed of light. Einstein’s arguments were completely general and for the last century they have been observed to be correct time and time again. So it would be more than a surprise if it should transpire that neutrinos could move even a tiny bit faster than light. I will write more about this on another evening.

Having an experiment that we could do on Earth which nobody could understand would be amazing. If the superluminal speed of neutrinos were confirmed then physicists would be stumped – and the lesson of history seems to be that when we have something that we just don’t understand, then we can expect that this will lead to new ways of seeing the world.

I would love to be alive when such a phenomenon was discovered.

The Speed of Neutrinos: Part 1

September 28, 2011
Map showing the path of the neutrino beam as it passes 730 km through the Earth from CERN to the Gran Sasso Laboratory. The width of the beam is exaggerated - it is only around 6 km wide when it reaches the detector. Map courtesy of Google

You may have read recently about an experiment at CERN which found that neutrinos appeared to travel faster than light. I was going to write about the significance of this result (should it be confirmed) – but when I looked at what the experimenters actually did I was filled with admiration for their work. And I felt that it might be nice to just describe what they did. In doing this I hope you will see that although it is quite possible that they are right, there is also a distinct possibility that they have made a mistake. You can read the researchers’ own account of their work here.

The neutrinos are generated in CERN using a proton accelerator. Every 6 seconds a ‘kicker magnet’ sends protons out of a ring in two pulses, each 10 microseconds long and separated by 50 milliseconds. Through a complicated series of interactions the protons generate neutrinos of a type known as muon neutrinos, which travel in roughly the same direction as the protons. Neutrinos interact with matter so weakly that they can travel through the Earth and barely notice it. Of the 10²⁰ neutrinos generated in this experiment over a period of three years, only around 16,000 were detected in the Gran Sasso Laboratory, deep underground in Italy – 730 km away. In other words only roughly 1 neutrino was detected for every 10,000,000,000,000,000 that were emitted from CERN. How could the transit time of this one neutrino in 10 million billion possibly be timed?!
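Those odds follow from one line of arithmetic. This little sketch simply divides the two figures quoted above (10²⁰ emitted, roughly 16,000 detected):

```python
# Fraction of emitted neutrinos that were actually detected,
# using the figures quoted in the text.
emitted = 1e20        # neutrinos generated at CERN over three years
detected = 16_000     # neutrinos detected at Gran Sasso (approximate)

emitted_per_detection = emitted / detected
print(f"About 1 detection per {emitted_per_detection:.0e} emitted")  # roughly 6e+15
```

That is about six million billion emitted neutrinos for every one detected – consistent with the round figure of 1 in 10,000,000,000,000,000 above.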

Well, first the researchers synchronised clocks at the two laboratories using a clock on a GPS satellite as a common reference source. They took great care over this and think their clocks agree within 1 nanosecond. This may sound incredible, but your computer is probably processing instructions at a rate of at least 1 per nanosecond (if you have a 1 gigahertz processor), so you can think of this as synchronising the clocks of two computers to within one processor cycle – difficult but imaginable.

Then the researchers measured the distance as best they could using GPS satellites and a ground survey. They found the distance to be 730,534.61 metres with an uncertainty of only 20 centimetres – and they could easily detect a 7 centimetre shift in this distance after the L’Aquila earthquake.

So how could the transit time of this one neutrino in 10 million billion possibly be timed?! It can’t. Instead the researchers first measured the current of protons headed towards the target where the neutrinos were created. They did this by wrapping a transformer around the beam, which gave a voltage when the proton ‘current’ pulsed through the transformer’s coils. They reasoned that the rate of production of neutrinos ought to be proportional to the rate at which protons hit the target. When they had measured the shape of the proton pulse, averaged over the millions of pulses in three years, they looked for neutrinos arriving at the detector 2.436801 milliseconds later – the transit time for light. By restricting their view in this way they were able to ignore the possibility that a neutrino from the rest of the universe would arrive in this tiny time window. They expected that as the neutrinos were detected, they would see a pulse shape slightly delayed with respect to the time that light would have taken (had it been able to travel through 730 km of solid rock!). To their great surprise they found that although the 16,111 neutrinos they detected over three years did have the same shape as the average proton pulse shape, the neutrino pulse occurred slightly earlier than they would have expected.
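As a sketch of where that 2.436801 millisecond figure comes from: it is just the surveyed baseline divided by the defined speed of light.

```python
# Light transit time over the CERN to Gran Sasso baseline.
SPEED_OF_LIGHT = 299_792_458.0   # metres per second (defined value)
BASELINE = 730_534.61            # metres, from the GPS and ground survey

transit_time_ms = BASELINE / SPEED_OF_LIGHT * 1e3   # convert seconds to ms
print(f"Light transit time: {transit_time_ms:.6f} ms")  # about 2.436801 ms
```

Note how the 20 centimetre uncertainty in the baseline corresponds to less than a nanosecond of transit time – which is why the survey had to be so careful.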

The graph below shows the average proton pulse shape from each of the two 10 microsecond pulses which occur every six seconds.

The average shape of the two pulses of protons that generate the neutrinos averaged over all the pulses used in three years.

The graph below shows the detected neutrino arrival times alongside the expected arrival time (in red) based on the average proton pulse shape assuming the neutrinos travelled at the speed of light. The neutrino distribution is in advance of its expected arrival time by around 1 microsecond. The researchers can understand most of that in terms of signal delays in equipment. But there is a stubborn 0.06 microseconds that they are unable to explain – and that is why they have publicised their work – basically to ask for help.

The dots show the measured arrival times of neutrinos at the detector compared with the expected timing (shown in red) if the neutrinos travelled at the speed of light. Notice that the rise in neutrino count occurs a little earlier than expected.
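To get a feel for the size of the claimed effect, divide the unexplained 0.06 microsecond advance by the roughly 2.44 millisecond transit time. This is only a rough sketch using the figures quoted above:

```python
# Fractional amount by which the neutrinos would exceed the speed of light,
# if the unexplained early arrival were real.
early_arrival = 60e-9        # seconds (the stubborn 0.06 microseconds)
light_transit = 2.436801e-3  # seconds (light travel time for 730.53 km)

fractional_excess = early_arrival / light_transit
print(f"(v - c)/c is roughly {fractional_excess:.1e}")  # a few parts in 100,000
```

So the claim is not that neutrinos are much faster than light – only faster by a couple of parts in a hundred thousand. That is exactly why such extreme timing precision was needed.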

Now there are lots of places that they could have made mistakes, and that is what I expect they will find eventually. But I just want to express my admiration for their work.

  • First of all for the sheer chutzpah of trying to do such an audacious experiment! Detecting neutrinos is hard in itself, but this is just amazing!
  • Secondly, it is an example of how precision measurement can reveal the physical details of a phenomenon in the same way that a microscope shows visual details: if they hadn’t tried to time the pulses so accurately, they would never have been able to meaningfully ask the question of whether neutrinos travelled faster than light.
  • And finally, although I think they will find an error, I really hope they don’t, and that their result stands: it would be great to have an experiment that nobody on Earth understood – how exciting would that be!

One small ringtone for man…

September 26, 2011


After having visited NASA last week, I noticed that it is now possible to download ‘sounds of NASA’ as ring tones, including some of the most famous sound bites.

The NASA home page has lots of other links you may enjoy too. Strangely, they didn’t have ‘May the Force be with you’.

Meeting one’s heroes

September 26, 2011
My colleague Jonathan Pearce and I at the NASA Glenn Research Centre

I have just returned from an exhausting trip to the NASA Glenn Research Center in Cleveland, Ohio. Staff there had sought out someone to teach them how to ‘measure the temperature’, and settled on NPL: What an endorsement! And representing NPL were myself and my colleague Jonathan Pearce. We worked night and day for weeks beforehand to prepare the course and then had an exhausting three days teaching and talking about temperature measurement, but the staff at NASA seemed happy, so I guess it was worth it in the end.

Beforehand I was terrified. In my mind NASA represented the ultimate in technological capability and I was anxious that we simply wouldn’t be able to answer their questions. Rationally I knew that Jonathan and I are pretty expert at what we do, but that didn’t stop me being scared – especially at the start. But it quickly transpired that although the people on the course were very expert in their fields too, they appreciated a little clarification about some subtleties of temperature measurement. In short, the people at NASA were not ‘super beings’ but instead they were just like myself, and Jonathan and our colleagues at NPL.

NASA is an awesome organisation which is having some difficulties at the moment. According to recent testimony to Congress by the first man on the Moon, Neil Armstrong, NASA has rather lost its way.

In summary, some significant progress has been achieved during the past year. However, NASA, with insufficient resources, continues to try to fulfill the directives of the Administration and the mandates of the Congress. The result is a fractious process that satisfies neither. The absence of a master plan that is understood and supported by government, industry, academia and society as a whole frustrates everyone. NASA itself, riven by conflicting forces and the dashed hopes of canceled programs, must find ways of restoring hope and confidence to a confused and disconsolate work force. The reality that there is no flight requirement for a NASA pilot-astronaut for the foreseeable future is obvious and painful to all who have, justifiably, taken great pride in NASA’s wondrous space flight achievements during the past half century.

Winston Churchill famously stated: “The Americans will always do the right thing after they have exhausted all the alternatives”. In space flight, we are in the process of exhausting alternatives. I am hopeful that, in the near future, we will be doing the right thing.

I obviously don’t have Neil Armstrong’s insight, but I think perhaps his perspective is a little distorted. He was after all a single individual who was carried aloft at the pinnacle of a stupendous technological enterprise which at its peak consumed around 4% of the entire US federal budget*. And now I think perhaps he is being a little hard on his colleagues who are having to cope in more humble roles and in less affluent times. I recognised many phenomena at NASA because similar changes have affected NPL. The people at both organisations are being urged to do more with less, and to do work which is ‘more relevant’. On the ground, this can be personally difficult for committed staff who find themselves working with ever diminishing resources, and fewer and fewer colleagues. And in the background there is still the glow of the glory days which will never return.

It’s always difficult when one meets one’s heroes. Visiting NASA I found out that my heroes were not ‘super beings’, but real human beings. And since their achievements are the achievements of real human beings, I find them even more admirable. ‘NASA, you’re still my hero.’

*For comparison NPL costs roughly 0.003 % of UK GDP.

An inspiring story of incompetence

September 13, 2011
A liquid-filled LED light bulb. Designed with delightful incompetence.


Schadenfreude is the German word that describes the pleasure derived from another person’s misfortune. Reading this WIRED story about the design of LED light bulbs, I will confess to feeling great pleasure at reading about the utter incompetence with which these were designed. On reflection, it was not exactly pleasure at their misfortune, but relief. My feeling was: “If they can bring their product to market after ignoring really obvious problems with it, then maybe I should not be so hard on myself about the mistakes I have made in my own work.” Either way, I felt inspired to follow my ideas with renewed vigour.

The story relates how in 1997 Ron Lenk had the idea of cooling the LEDs inside a light bulb with a gel rather than a gas – what a great idea! He patented the idea and founded a company, Switch, to manufacture their Superbulb. All kinds of things happened, but things were looking tricky when, in 2009, they hired a physicist, David Horn, who noticed that something was very wrong. On touching the outer envelope with his finger, he noticed that it wasn’t warm! If the bulb wasn’t getting warm, then the gel within the bulb wasn’t doing its job of cooling the LEDs. So after two years of experiments and development, it took an outsider to notice that something was very wrong. Everyone in the development team must have known that!

They seem to have things in hand now. But this willingness to notice problems that everyone would prefer to ignore is critical to the success of projects. It’s an example of negative feedback, and it often takes an outsider to provide it. It put me in mind of my own project, in which I am trying to measure temperatures by timing pulses of sound down tubes. Even though the idea is mine, I will confess to thinking it’s a pretty clever idea. But making it work reliably takes all kinds of skills that I don’t have. I had been feeling really fed up about the project – there are still one or two things that I can’t figure out – but reading this story inspired me to look at things again. Because after all, tomorrow is another day.


When the ice melts…

September 8, 2011
The annual melting of arctic sea ice reaches its maximum extent in September. This year the melting has been roughly similar to 2007, the greatest extent of melting since humans evolved as a separate species. Click for a larger image. Data from NSIDC

As we near the end of summer in the northern hemisphere, the annual melting of the free-floating sea ice is reaching its maximum extent. Soon the temperatures will fall, the ice will re-form, and another winter freeze will commence. The annual melting has been the subject of satellite observation for more than 30 years and there is no doubt that the extent of the summer melting is increasing.

The data from the US National Snow and Ice Data Centre indicate that this year will probably see the greatest extent of summer melting since humans evolved as a separate species around one million years ago. Compared to 1979, this summer has witnessed the melting of an additional 2 million square kilometres of sea ice. Just to repeat that: two MILLION square kilometres! This is an area of roughly nine hundred miles by nine hundred miles. It is colossal.
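As a quick check on the scale of that number, we can treat the extra melted area as a square (an illustrative round-figure calculation, nothing more):

```python
import math

# Additional summer sea-ice loss compared with 1979, as quoted in the text.
extra_melt_km2 = 2_000_000
KM_PER_MILE = 1.609344

side_km = math.sqrt(extra_melt_km2)    # side of an equivalent square
side_miles = side_km / KM_PER_MILE
print(f"Equivalent square: about {side_km:.0f} km ({side_miles:.0f} miles) on a side")
```

That is a square of sea ice stretching roughly from London to the middle of the Mediterranean on each side – and all of it gone compared with 1979.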

As a specialist in temperature measurement, I am familiar with the use of ‘temperature fixed points’ for calibrating thermometers. Even when heated, a mixture of ice and water does not increase in temperature, but instead maintains a stable temperature close to 0 ºC until all the ice has melted. In this sense, the Arctic Ocean in summer forms a gigantic, planetary-scale fixed point. But when all the ice has melted, heating the liquid causes a temperature rise. And climatologically, should the summer ever see the complete melting of arctic sea ice – in perhaps as little as 50 to 100 years according to current trends – we can expect the temperature of the region to begin to rise: no longer will an ice-water mixture stabilise the regional temperature.

The gob-smacking magnitude of ‘ice events’ – even on a smaller scale – is evident in this BBC story about the giant ice island that broke off from the Petermann Glacier in Greenland last year. It is still a major hazard to shipping around Newfoundland, with a web page tracking it constantly, and this video gives some idea of the scale of just one of the iceberg fragments.

And is all this due to anthropogenic global warming? Well I don’t know, and neither does anyone else. But it is a distinct possibility that we have played a part.

Trends in the FTSE 100

September 5, 2011
27 years of data for the FTSE100 Share Index. Also shown is a line which follows the trend of the index from 1984 to 1994. Data from the Financial Times. Click for larger version

Since reading the Financial Times the other week I have come over all financially-minded and so I present the above graph to you for your consideration. I was very surprised when I came across it on the FT website – it shows a perspective on recent stock market turmoil which I have never seen on the BBC News. It shows approximately 27 years of data for the FTSE100 Share Index – a general measure of stock market prices for UK companies. Also shown is a line which follows the trend of the index from 1984 to 1994 showing roughly 7% growth year-after-year for a decade. I have just three things to say.

  1. Whatever process the data represents made sense up until the mid-1990s. Since then it doesn’t look as though there is any kind of consistent trend for more than a few years.
  2. Based on this data, it would be rash IMHO to place a bet – perhaps using all the money in one’s pension fund – on the value of the index in (say) 20 years.
  3. Shares can go up as well as down. Or up and down as well as down and up. I am not a registered Financial Advisor.
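For what it’s worth, that ‘roughly 7% growth year-after-year’ trend line implies the index almost doubled over the decade – simple compounding, and emphatically not financial advice:

```python
# Compound growth over the 1984-1994 trend period described above.
annual_growth = 0.07
years = 10

multiple = (1 + annual_growth) ** years
print(f"7% per year for {years} years multiplies the index by {multiple:.2f}")
```

It is exactly this kind of smooth, predictable compounding that the post-1995 data no longer show.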

Global Warming Policy Foundation: A Toxic Brand

September 4, 2011
The modest HQ of the GWPF situated just off The Mall in central London tells you all you need to know about the wealth of its backers. Pictures from Google

Some brands are toxic. It doesn’t matter what is being sold: if it carries a toxic brand label, it will not sell. As evidence I offer you the News of the World, the Conservative Party in Scotland, and finally the Global Warming Policy Foundation (GWPF).

The naming of the GWPF is misleading: it implies that they hold meaningful opinions about Global Warming. But they have no more valuable opinions about Global Warming than the Flat Earth Society has on the diameter of the Earth: they are simply in denial that carbon emissions could even conceivably affect climate. They are not climate-change skeptics: that would be an insult to climate-change skeptics. One of their less ludicrous previous publications has advocated the widespread adoption of shale gas extraction across the UK, a technology known to pollute water supplies, cause earthquakes, and emit just as much carbon dioxide as conventional gas extraction! They advertise this as ‘greener’ than wind power!

Their latest output is the Myth of Green Jobs, predictably publicised by Andrew Orlowski on The Register. Summarising, the report says that renewable electricity is more capital intensive and more expensive than fossil-fuel-generated electricity; that if we spend money on renewable energy then we can’t spend it on other things; and that this will harm business and jobs. In itself, I don’t doubt a word of the report, and it has many sensible things to say: such as advocating that we compare different strategies for reducing carbon emissions by evaluating the cost per ton of reduced emissions. However, because the report is published under the GWPF brand we can have no doubt about the intention of the author: to undermine any response to the possible negative consequences of carbon emissions.

Most people agree that we don’t fully understand the consequences of carbon emissions. But most people would also agree that there are substantial reasons to be concerned. From the standpoint of a Global Warming Denier, doing anything to respond to this potential threat is damaging economically. Why? Because as far as they are concerned it is money down the drain. However, from a more rational standpoint, doing something makes perfect sense – even if that something is not optimal. We are using resources now to begin to develop a response to things which might happen.

The GWPF would not accept that carbon dioxide is a pollutant. But it is. Historically businesses used to pollute rivers and objected to restrictions on their emissions on the grounds that not polluting would be uneconomic. This kind of behaviour is now viewed as not just unacceptable, but immoral. Now, we all pay for the clearing up of this mess by an effective ‘green tax’ on all the chemicals we use: these goods cost more than they otherwise would but our rivers are healthier than they have been for generations. Similarly, the GWPF say that higher electricity prices are bad for businesses – and they are. But if we believe collectively that it is unacceptable to pollute our shared atmosphere in order to get cheap electricity, then this is a price worth paying.

One can argue about whether the government are doing the right things with regard to energy policy. But the aim of GWPF is not to engage in such a discussion. It has only one policy which is simply:

Cheap energy – no matter what the cost.

I think this is an immoral and – literally – toxic stance.

A carbon neutral gym?

September 3, 2011
Is it feasible to make a 'carbon neutral' gym?

My colleagues Laurie and Andrew have just recovered from producing NPL’s blockbusting ‘Energy Harvesting’ stand at the Royal Society’s Summer Exhibition. And last week they asked me whether it was possible to have a carbon neutral gym. The idea must have occurred to many people as they sweat and struggle. Surely, they think, all this effort is not entirely pointless? Surely we could capture some of this work being done and use it to power the lights? Surely if Seattle can have 20 carbon neutral office blocks, we can manage a gym! Well, let’s see.

Let’s look at the power that 20 squad members of a rugby team training together might produce.

  • Rowing Machines: If I try hard I can generate 150 W of power on a rowing machine for around 10 minutes. Let’s assume rugby players can generate 250 W for 15 minutes. If we had 5 machines then we could rotate banks of 5 players through these machines to generate a continuous 5 x 250 = 1250 W – that’s impressive.
  • Treadmills: A treadmill actually consumes energy :-(, and a typical device might have a 2 horse power (1500 W) motor. So either we could have 6 rowing machines powering one treadmill, or more sensibly, we could just abandon treadmills and get the team to run around a Rugby pitch.
  • Exercise bikes: I don’t have a figure for bikes, but in my experience they are less knackering than rowing so I would guess they can generate less energy. Let’s guess we have 5 exercise bikes and each one can generate 150 W; then we could rotate banks of 5 players through these machines to generate a continuous 5 x 150 = 750 W.
  • Cross Trainers: I don’t have a figure for cross-trainers (sometimes called elliptical trainers), but in my experience they are more knackering than exercise bikes but less knackering than rowing, so I would guess they can generate power somewhere in between the previous two. Let’s guess we have 5 cross-trainers and each one can generate 200 W; then we could rotate banks of 5 players through these machines to generate a continuous 5 x 200 = 1000 W.

So with 5 of each type of cardio machine and no treadmills, 15 rugby players could generate 1250 W + 750 W + 1000 W = 3000 W – a useful amount of power. Each person would be working at a rate which would burn roughly 1000 kcal/hour, which is pretty intense exercise, so let’s guess that they could do this for one hour.

  • Muscle Machines: In these machines pulleys translate the physical motion into a lift of some weights, typically by about half a metre. Let’s assume that each lift is of 40 kg; then each lift requires m × g × h = 40 × 10 × 0.5 = 200 joules of energy. If we could capture this in a way that generated electricity, and a lift was performed on average every second (assuming several people are exercising simultaneously), then that could generate 200 W of power.
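The machine-by-machine estimates above can be totted up in a few lines. The machine counts and per-player outputs here are simply the guesses made in the text, not measured figures:

```python
# Continuous power from rotating banks of 5 players through each machine type.
machines = {
    "rowing machines": (5, 250),  # (number of machines, watts per player)
    "exercise bikes":  (5, 150),
    "cross trainers":  (5, 200),
}

total_cardio_w = sum(count * watts for count, watts in machines.values())
muscle_machines_w = 200  # one 200 J lift per second on the weight stacks
print(f"Cardio: {total_cardio_w} W, plus about {muscle_machines_w} W from weights")
```

Notice how sensitive the total is to the per-player guesses: halve the assumed outputs and the whole scheme halves with them.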

Is it feasible? Given these simple calculations the answer has to be ‘Yes’. But there are caveats:

  • In general one can’t capture all the energy people dissipate in these machines: dynamos are not ideally efficient, the muscle machines would not couple easily to dynamos, some players may not be ideally fit, and we have assumed 100% occupancy of the equipment. So in practice our 3000 W of power would be substantially reduced, but 1000 W would be a fair estimate.
  • Humans generate heat when they exert themselves – about 6 times more than the work they produce. So if 3000 W of work is being produced there would be an astonishing 18 kW of heat produced, and some kind of cooling or ventilation would be required – which would take energy. Let’s assume this is negligible and that we cool using a chimney effect.
  • The gym would need to have a relatively large floor area which would probably require lighting – which would probably require several hundred watts  – but let’s ignore this and assume the training is in daylight.
  • Over one hour, the team would have plausibly generated 1 kWh of useful electrical energy, which would have cost at most £0.25.
  • Over one day (12 hours) assuming the equipment was highly utilised, this might amount to 12 kWh of energy, which would have cost at most £3.00.
  • Over a week (5 days) this might amount to 60 kWh of energy, which would have cost at most £15.
  • Over a year this might amount to 3120 kWh of energy, which would have cost at most £780.
So all in all: this 2.6% per annum return on investment might well make sense in the context of the £30,000 or so which such a gym might cost.

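The savings arithmetic in the list above can be checked in a few lines, using the same assumptions (a practical 1 kW average output, 12 hours a day, 5 days a week, £0.25/kWh, and a notional £30,000 gym):

```python
# Annual energy, saving and return on investment for the hypothetical gym.
useful_power_kw = 1.0          # practical average output after losses
hours_per_year = 12 * 5 * 52   # 12 h/day, 5 days/week, 52 weeks
price_per_kwh = 0.25           # pounds per kWh
gym_cost = 30_000              # pounds, rough cost of equipping the gym

annual_kwh = useful_power_kw * hours_per_year
annual_saving = annual_kwh * price_per_kwh
roi = annual_saving / gym_cost
print(f"{annual_kwh:.0f} kWh/year, saving £{annual_saving:.0f}, ROI {roi:.1%}")
```

Every one of those inputs is generous – full occupancy, fit players, efficient dynamos – so the real return would almost certainly be lower.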
But would that be carbon neutral? Well, no. Such a gym would not recoup the carbon released during the manufacture of the equipment for many decades – probably beyond the lifetime of the equipment. But people might well prefer a gym where they feel their efforts aren’t being wasted. The worst possible outcome, however, would be if additional generating equipment were incorporated into gym machines just to make people feel better – and the carbon cost of that additional equipment was never recouped either!

As many people have found, micro-generation of electricity can make sense in certain contexts – for instance, if the gym were in an isolated location which could not connect to the electricity grid. But nothing can approach the low cost of centrally generated electricity from cheap carbon-emitting fuel. If electricity from the grid were all renewably generated, lots of schemes such as this one would make economic sense. However, electricity might then cost perhaps £1.00/kWh and we would all take much more care not to waste a joule. Anyone in favour of massively increasing electricity prices?
