A level standards: A national disgrace

Scores on the Prior Knowledge Test (PKT) given to all students at the start of their B.Sc degree course at Bristol University Physics Department. The test has remained essentially unchanged, but scores have systematically declined despite the amazing increase in apparent performance at A level.

In 1985 around 9% of students obtained ‘A’ grades in their A-level exams. Since then, the percentage has increased every year and in 2010 the result was 27%. Is that because students are working harder and teachers are teaching smarter? Well, I can’t answer that question for every subject, but for physics the answer is an unequivocal ‘No’.

I can say this with certainty because the experience has been mournfully recounted to me by many teachers. And this month the anecdotal evidence was confirmed in an interesting paper by Professor Peter Barham published in Physics Education (download pdf here). Since 1975 the entire student intake to the physics course at Bristol University has taken essentially the same Prior Knowledge Test (PKT). Professor Barham published an analysis of these results and I have summarised one aspect of his analysis in the graph at the head of the page. It shows that scores testing prior knowledge of physics have declined by around 25% and are still falling.

Summarising: our national exam structure is reporting continuous improvement, but in reality students can achieve less and less. As I have commented previously, results such as this represent a national disgrace. This shameful situation is the product of several factors.

Together these elements have combined to drive down educational standards. If this were a sporting challenge, these steps would involve:

  • Lowering the height of a ‘high jump test’ year upon year.
  • Stating that students could jump one metre when in fact they could only jump 25 centimetres, but they could do it 4 times.
  • The high jump inspectors guarantee the average height of the bar, but it is higher in some places and lower in others. The high jump inspectors publish guides explaining where the low points are.

In this case we can see that reports of continuous improvements in ‘high jump’ would be laughable. However, the educational system really matters. It is the way in which we pass on our combined cultural appreciation from generation to generation. And the confidence trick in which our educational establishment has conspired should make us angry.

However, things appear to be changing. The modularisation of courses is slowly being undone. For his GCSE exams my second son will have to take all his science exams at the end of the course, a more difficult task than that which faced my first son. Michael Gove has now announced that in future A level ‘content’ will need to ‘involve’ academics at universities. At this point it is impossible to foresee what this ‘involvement’ actually means, but it is probably a step in the right direction. However, only when exam boards become fully independent of publishing companies do we stand a chance of reversing the continuous devaluation of educational qualifications.


31 Responses to “A level standards: A national disgrace”

  1. Steve McGann Says:

    Fascinating stuff. Thanks for this 🙂 My son is now 15, and this kind of thing has worried me for some time. Anecdotally – parents of new Physics undergraduates are telling me that their A* children are having to sit remedial classes in the first term to bring them up to scratch. Not good.

    • protonsforbreakfast Says:

      Steve. It’s tough in schools. GCSE does not prepare children for A level, A level is not enough for University. I don’t know what to do for my own children and all I feel I can do is try to just state the truth – people are hesitant to speak out for fear of offending teachers or students. But there is a big problem and I wish the people responsible for it would address it. As you say: ‘not good’.

  2. edhui Says:

    Hi Michael
    I know you’ve done so before, but just to make absolutely sure everybody gets the whole impact of what you’re writing, you should include the fact that the high jump analogy includes the further point that the people making the tape used to measure the jump are also the ones publishing the instructions on how to jump, and being assessed on how high the students manage. I know it’s obvious once you’ve got the point, but if common sense was actually common, we’d not be where we are today.

    You know where I work. I’ve had the opportunity in the last few months to ask, face to face, over 120 students aged 11–16 where, physically, the mass of a tree comes from – as in, using conservation of mass, where does the matter that makes wood come from. I have been offered as answers water, soil, oxygen, the seed, sunlight. Only three students answered correctly, using carbon dioxide from the atmosphere as part of their answers, and none of those understood how that impacted the current climate debate. Had I asked what photosynthesis was, I would have had a much better response, but this illustrates how ‘chunking’ the syllabus can suck all scientific meaning out of it.

    Yesterday I asked an A level maths student – predicted an A* in this summer’s exams and having performed on target in all assessments – what the name of the curve of a quadratic equation was. She had no idea that it was a parabola, no knowledge of conic sections, and no application of the laws of motion and s = ut + ½at² in her syllabus, which did include the solution of quadratic equations in their many and varied forms. Makes me want to take a running jump, and not just to describe the path I would take…

    • protonsforbreakfast Says:

      Your tale is so sad. It rings very true, but makes me sad, and actually angry. The subjects children study – not just science – are so fascinating that I cannot understand how they can be reduced to the uninspiring state many children experience.

      I am about to alter the blog on the point you mention!

  3. Nestor Patrikios Says:

    It seems that exams are being used for two reasons: to assess pupils and to assess teaching standards. And doing neither well!

    Given the large numbers, you can surely assume students don’t change statistically year on year and so normalise results to give the same spread of results (e.g. 10% As, 20% Bs etc.) every year. An A then tells you something useful and year-independent.

    If you want to assess teaching standards you should do something else!

    • protonsforbreakfast Says:

      That is what used to be done!

      As you say, exams serve (at least) two purposes. This could be done by having an A grade indicate that a result is in the top 10% (say), a B grade the next 20% (say), and C grades would then extend down to a particular pass mark. So it would be OK for more people to pass an exam and get a C grade, but A and B grades would retain their significance in distinguishing amongst the class.
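      To make the arithmetic of such norm-referenced grading concrete, here is a minimal sketch. The 10%/20% fractions and the pass mark of 40 are illustrative assumptions for this example, not historical grade boundaries:

      ```python
      def norm_referenced_grades(scores, bands=((0.10, "A"), (0.20, "B")), pass_mark=40):
          """Assign grades by rank within the cohort: the top 10% get an A,
          the next 20% a B, everyone else above the pass mark a C, the rest fail.
          The fractions and pass mark here are illustrative, not real boundaries."""
          order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
          grades = [None] * len(scores)
          start = 0
          for fraction, grade in bands:
              count = round(fraction * len(scores))
              for i in order[start:start + count]:
                  grades[i] = grade
              start += count
          # Remaining candidates: C if they cleared the pass mark, otherwise Fail.
          for i in order[start:]:
              grades[i] = "C" if scores[i] >= pass_mark else "Fail"
          return grades
      ```

      However many candidates sit the exam, an A then always means ‘top 10% of that year’s cohort’, which is exactly the year-independent signal a university or employer wants.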

      The problem is not that there is no solution, but that it is in no one’s interest to maintain standards. Schools want better marks, pupils want better marks, the government wants better marks, publishers want better marks, exam boards want better marks. None of these people have taken responsibility for the deception.

      Perhaps it is because Mr. Gove is such an outsider that he has felt able to take a step in the right direction.

  4. adamhstevens Says:

    As a physics teacher, I agree with your general premise.

    However, it’s interesting that you automatically associate performance in a multiple choice ‘knowledge’ quiz with achievement. Such a test can never really give a full picture of the capability of students.

    There are other points – should schools really prepare students for university simply with ‘knowledge’? It’s a sad state of affairs when students don’t understand Newton’s laws when beginning their 1st year of a Physics degree, but can we really say that this failing would be due to the ‘dumbing’ down of physics syllabi, or even the grading of their exams?

    Yes, the way courses and exams are developed is not ideal, but changes to exams over the last 20 years are not /all/ bad. Let’s not throw out the baby with the bathwater–Mr Gove’s proposed changes are not necessarily going to make things better.

    I’m also interested in the graph you provide here. I can’t find it, or the data that would produce it, in the paper you linked to. How did you make it? (Not intended in a critical way, just interested!)

    • protonsforbreakfast Says:


      Considering Physics, I agree that exam results are an imperfect measure of students’ ability. But although imperfect, they are not useless. In this article I was struck by the conjunction of ever-improving results with declining scores on the PKT. This is a pretty clear indicator of ‘a problem’ and that problem has many dimensions. The quality of teaching, teachers’ background, the age of teachers, changing perceptions of science – I could go on. And if you read back through the GCSE standards topic on the blog – you will see that I have gone ‘on and on’ in the past!

      “Should schools really prepare students for university simply with ‘knowledge’?” Yes. I think that is almost the only thing that schools should do. Teenagers’ ability to learn, synthesise and create is astounding. Aged 52, I am well aware that I no longer have the ability I once had. I think the time and the knowledge are precious.

      I agree that not all changes have been bad. Personally I would guess that the general standard of teaching is better. But until the basic issue of the persistent ‘devaluation of the currency’ is tackled then no one will be able to measure the effect of any other changes.

      The data are extracted from Figure 2 of Peter Barham’s paper. I scanned the figure, placed a grid over the graph and read off the data. I estimate the uncertainty is around ±1 mark, which is much smaller than the ±10 mark spread of the results. I did this because data cannot be copyrighted, but images can. I am happy to send you the spreadsheet if you would like.
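      For anyone repeating the exercise, the pixel-to-data step is just a linear map fixed by two known points on each axis. A small sketch – the calibration numbers below are hypothetical, not taken from the paper:

      ```python
      def make_axis_map(pixel_a, value_a, pixel_b, value_b):
          """Return a function mapping a pixel coordinate to a data value,
          calibrated from two known points on the axis (e.g. labelled ticks)."""
          scale = (value_b - value_a) / (pixel_b - pixel_a)
          return lambda pixel: value_a + (pixel - pixel_a) * scale

      # Hypothetical calibration: pixel 50 sits on the '1975' tick, pixel 650 on '2010'.
      year_of = make_axis_map(50, 1975, 650, 2010)
      ```

      One such map per axis converts every grid reading into (year, score) pairs, and the ±1 mark uncertainty quoted above is then just the score-axis map applied to the pixel resolution of the scan.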

      All the best


      • adamhstevens Says:

        I totally agree that devaluation of exam scores is an issue. But (and I guess I’m in the minority here) I don’t think it’s a big one, in part because
        a) I think exams are essentially useless anyway, except as a fairly arbitrary ‘figure of merit’ for universities to look at for admission. Time and time again I have seen hard-working students who are capable, but simply struggle with exams and tests and therefore receive a bad mark.
        b) I think comparing students results from now with those from 10, 20, 30 years ago is unproductive. Yes, universities struggle to decide between the numerous ‘A*’ students, but I’m not sure this is really such a bad thing–it means they have to be a little more ‘inventive’ with how they assess applicants, which is definitely a good thing.

        These exams are no doubt different to those of the past. To say they are ‘easier’ is not helpful, and since it is such a relative measure it can never really be proved with facts (I would heavily contest it – in my first year of teaching (a few years ago) I tried an A-Level exam paper and genuinely struggled).

        Using a decline in ‘quality’ of exams to revert aspects of education back to ‘how they used to be’ would be very unproductive. I have issues with the discussion in the linked paper about modularisation – the idea that students revising with a ‘fire and forget’ attitude lose more knowledge before university than they would with one final exam, where they have to remember it for longer, is faintly ridiculous to be honest.

        Thanks for confirming where the data is from. I think it’s interesting that the standard deviation on those data is so large (almost 30%) and am unsure what this really means.

        I guess what I’m trying to say is that I think we should move away from exams as a summative assessment method, and particularly not regress to the examination style of 30 years ago. I have seen the lack of conceptual knowledge of first-year physics students for myself (whilst doing research for the Physics Education Research group, whose work I highly recommend https://sites.google.com/site/edpersite/) and I totally agree that there is a problem somewhere between A-Level and university. But I think looking too hard at examination methods would avoid the real issues, which as you say are more likely to do with the quality of teaching (which could undoubtedly be related to the way students are examined!) – there is no simple fix and I think that some of the repair methods outlined here would be a step in the wrong direction.

      • edhui Says:

        Hi Adam
        There are a number of issues in your comment. Michael’s data shows a trend that, for the purposes of this discussion, can be accepted. If the Bristol PKT decline were actually an erroneous result, and knowledge had in fact increased in proportion to the increase in A level A and A* grades over time, then we would have nothing to worry about!

        You cannot agree that ‘there is a problem somewhere between A level and university’ (if that is what you really mean) because there is no education process between A level and university, except a few wild visits to music festivals. If the universities find the students unprepared, then it is their prior education (home situation as well as school) that is at fault. Michael is not saying there is a problem between A level and university- he is saying there is a problem at or before A level. Personally I think there is a cultural problem- the complete lack of realisation that immersion into computer consumer culture (social networking and games) for hours a day can have a bad effect on attainment. Lacking modern IT, I spent my teenage years tinkering with cars, motorbikes and radio controlled aircraft. My son does not. I think that makes a difference.

        But (and Michael will correct me if I’m wrong) Michael makes a central and I think irrefutable point- exams should award grades to predefined percentages of the students, so that say 5% of students always get A grades. This is because cultural environments change, as do the difficulties of teaching ever changing syllabi. But the best students should always make the best progress. As an employer, I am uninterested in whether a candidate can remember the facts he crammed for many years ago. I am interested in how he compared to the rest of his cohort.

        As a biologist, almost everything I learnt over thirty years ago has been made obsolete by modern genetic advances. My exam performances in terms of the questions I could answer then are now functionally completely irrelevant. But my exam grades, achieved in the days when only certain percentages achieved each grade, serve as a permanent record of my achievement relative to my peers. Much else might be wrong, but the current exam system is certainly wrong.

      • adamhstevens Says:

        Well, the Bristol data can of course be taken at face value, but what they show is open to interpretation. They suggest that pupils’ knowledge is decreasing. All the test measures, however, is how good new students are at taking that test. I don’t have details of the test itself, but I wonder, if they were given another test, say one that focused on more functional aspects of science rather than knowledge, whether the results would be the same.

        By ‘between A-Level and university’ I did mean more generally the latter stages of school education, although I think it’s important to point out that the Bristol research paper implies that some of the poor performance is due to students ‘forgetting’ things between doing their exams and starting university. It doesn’t say that that’s the biggest problem, but it does say it.

        I would agree that there’s a cultural problem, but would disagree on the specifics. I wouldn’t say it was to do with the preponderance of IT, but (from my own experience) a more general social/cultural malaise with regards to education. And I don’t think this culture will change if we keep just using exam results.

        I also agree about the distribution of A-Level results – a percentage based apportioning is sensible. Using one final exam for a subject to do that with is not necessarily the answer though. There are much better metrics than exams to grade a cohort of students with, but, again, the ‘culture’ is that exams tell us everything about a person.

    • edhui Says:

      I think you’re right that exams don’t really tell you everything you want to know about a person. The only thing they used to tell you was the person’s rank in their cohort, and that’s what’s been removed by ‘standards’ grading.
      If there are better metrics than exams, I’m certain they aren’t controlled assessments where students memorise answers to known questions at home and regurgitate the precise words in the assessment, or coursework where teachers can influence the content as the student does the work in school. An exam is just a situation where the student has to answer previously unseen questions in a secure environment. That is a perfectly fair situation to test someone’s success at learning.

  5. andrewducker Says:

    I find it fascinating that the standards dropped precipitously under the Conservatives (1980-1999 seems to be the massive drop) and then under Labour stabilised at the new lower level.

    I wonder what the change was, and whether Labour actively stopped further decline or simply failed to do anything and the new level was a natural lower one. Impossible to tell from this data, of course.

  6. Hairyears Says:

    Do, please, offer a comment about the Oxford and Cambridge ‘A’ levels.

    Are they still run by the Universities?

    I note that private-sector schools use those exams to stretch the better pupils, as their curricula (and examination questions) are more engaging as well as more demanding; mediocre pupils are spoon-fed the EdExcel A-level to get their guaranteed ‘A’ grades.

    • protonsforbreakfast Says:

      You can read about the labyrinthine history of the Exam Boards on Wikipedia

      There are currently 5 boards and yes, Oxford and Cambridge A levels are set by OCR – Oxford, Cambridge and RSA Examinations – which is still administered by a branch of Cambridge University.

      The troubling boards are the two largest: Edexcel and AQA.

      Edexcel is wholly-owned by Pearson publishing. AQA have an ‘exclusive publishing agreement’ with the publisher Nelson Thornes. Basically the economics goes like this:
      • There is no money in exams, but there is BIG money in books.
      • Schools are under pressure to achieve better results and they choose the exam board they think is most appropriate for their students. In many cases this means the choice they perceive to be easier.
      • The publishers have a close relationship with one specific exam board and write the books to closely match the ‘scheme of work’ proposed by one particular exam board.
      • So it is in the interests of both the publishers and the schools to make the match between the books and the exam syllabus closer and closer. Trying to take an Edexcel A level with a Nelson Thornes textbook would be madness.
      So the net result is that the publishing and examination arms work together and there is no one who guarantees standards are maintained.

      • edhui Says:

        You are so polite.
        ‘So the net result is that the publishing and examination arms work together and there is no one who guarantees standards are maintained.’

        Surely the more interesting observation is that this arrangement – in which exam standards are not anchored in competition between students or boards, but are subject to a disarmament race between publishers hungry for sales to schools that require the easiest attainment targets – can be EXPECTED to drive standards down.

        The fact that no-one guarantees standards is only relevant because you EXPECT standards to go down because of the way the system is set up. Nobody in the real world has to guarantee that prices go up, or that governments levy taxes, or that we all die. Only when you have oligopolies that you expect to fleece you do you need things like Ofcom, Ofgem, etc.

  7. Hairyears Says:

    That is a counsel of despair. Monkey-Choice questions are limited, just as you say: but extended questions will – if properly structured – test retained knowledge, insight, ability to apply concepts in unfamiliar situations, and offer ‘bonus marks’ opportunities to demonstrate a wider reading of the subject.

    It costs more money to mark these questions – and few examination boards have the skills, these days, to construct an exam and manage a marking process with such broad and demanding material, so as to give reproducible and meaningful scores.

    I would remind you that the maximum mark for mere recall of material was explicitly stated by the examiners to be a ‘D’ in 1985, in London Board and O&C A-Levels in the sciences: I realise that this is greeted with disbelief by younger teachers.

    Exams from the JMB and AEB were a grade or two easier, even then.

  8. andyxl Says:

    Is this effect not dominated by the fact that we are taking in a larger fraction of the population ? Possibly that 9% is just as good as it used to be. We are just pushing further into the Gaussian distribution but pretending we are not. You could argue that it is a good thing to re-normalise grades as we take more students; but it still ought to be done transparently.
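    The size of this renormalisation effect is easy to estimate if we assume (and this is my assumption, not something established by the data) that ability is normally distributed across the population. A quick Monte Carlo sketch comparing the mean ability of the top 9% with the top 27%:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    ability = rng.standard_normal(1_000_000)  # assumed Gaussian ability distribution
    ability.sort()

    def top_fraction_mean(sorted_ability, fraction):
        """Mean ability of the top `fraction` of the population."""
        n = int(len(sorted_ability) * fraction)
        return sorted_ability[-n:].mean()

    top9 = top_fraction_mean(ability, 0.09)    # mean ability of the 1985-style top 9%
    top27 = top_fraction_mean(ability, 0.27)   # mean ability of the 2010-style top 27%
    ```

    Under this assumption the average ‘A-grade student’ falls from roughly 1.8 to roughly 1.2 standard deviations above the population mean, so some decline in average prior knowledge among A-grade entrants would be expected even with unchanged standards – which is exactly why the renormalisation ought to be transparent.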

    • edhui Says:

      Do we have any evidence or reason to believe Bristol is pushing further into the Gaussian? As far as I understood, the university intake was increasing, but doing so by new courses being offered, old polytechnics becoming unis, etc. I didn’t think that established physics courses were having a harder time recruiting their chosen slice of the population. In any case, with grade inflation, I don’t imagine their nominal grade requirements have fallen, so the reduction in PKT results would still be a cause for concern. I don’t think the data fits your concern.

      • protonsforbreakfast Says:

        The major ‘Russell Group’ physics courses have no problem getting people to apply, but like Bristol, they have real problems with a lack of prior knowledge.

      • edhui Says:

        I hear you. I don’t see any alternative for your data other than your explanation. I disagree with you on one point though- people happily make things worse all the time, as long as it is personally profitable for them, or they are put under sufficient stress by their superiors, accountants, directors or politicians, or they’re plain stupid. It’s an underlying theme to your blog, be it A levels, sustainable energy, global warming or mobile phone safety.

    • protonsforbreakfast Says:


      Thanks for that thought. The article was focussing on Physics, and the number of people taking Physics A level declined from 32,000 in 1996 to around 28,000 in 2007. It has increased since then and is currently back at around 30,000.

      As others have said, there are several things happening. The decline in the Physics PKT stems mainly from a large reduction in the content of the Physics syllabus and possibly to some extent the modularisation of the course. The increase in A level scores is mainly caused by a change in marking procedures. Both changes are defensible – very few people do things hoping to make them ‘worse’! As you say, if the changes were transparent, we could all make sense of it, but the pretence that things are ‘getting better’ when in fact they are objectively getting worse is – putting on my sad voice – really shameful.


      Click to access review.pdf


  9. kieran martin Says:

    It is important to be careful when drawing broad conclusions from a set of correlational data. To be clear, what this study demonstrates is that the intake of physics students to Bristol University has shown a steady decline in achievement on a particular test that Bristol University applies.

    You have then taken this fact and drawn a conclusion: that A-levels have become worse at preparing students for university physics. But sadly, as you have not presented information on the students each year, it could equally be due to changing demographics in Bristol applications over that amount of time.

    I suspect this study is reasonable evidence of a decline in certain A-level standards, but without demographic data it is currently not terribly meaningful.

    You also suggest three reasons why A-levels have got worse, but these arguments are less well substantiated.

    • protonsforbreakfast Says:


      I agree. And disagree.

      I think these data are moderate evidence that ‘physics knowledge’ amongst A and A* students has declined. I think that is meaningful. I think that the coincidence of this with an apparent improvement in performance is significant, indicating that the meaning of a mark has become disconnected from real-world achievement. This is a shameful failure.

      This is not a game. I am not trying to ‘make a point’. Out there, we collectively are failing our children by failing to educate them.

      In 500 words it is hard to substantiate all the points in the article. But I think the analogy with sport is informative. There are – like it or not – absolute standards of knowledge in physics. There are things people either know or don’t. We should be encouraging our students to strive and understand more, not giving out meaningless prizes for mediocre performance.

      All the best: Michael

      • edhui Says:

        More power to your elbow, Michael. The point that a few of your correspondents have made is that you haven’t provided absolute evidence for your thesis, and in doing so they are missing two important points:

        1. You’re not trying to prove anything just for scientific interest.
        2. You’re drawing attention to evidence that is not ‘consequence-neutral’. We have a responsibility not to regard your post as a simple piece of experimental data which can be right or wrong. If you’re right, it’s a national disgrace. If your data is inconclusive, then you have identified a possible national disgrace, because the publisher-examiner-school league table and budget coalition CANNOT drive standards upwards when dealing with a hard science (or any other science, for that matter).

        So pointing to holes in your argument, if any, misses the important point. It’s like you saying you don’t think there are enough lifeboats on the Titanic, and people replying that you’ve only counted one side – the ship may not be symmetrical. By pointing out the ‘publisher of textbook writes the exam’ issue, you’ve already shown there’s an iceberg. By adding the Bristol data, you’re saying there aren’t enough lifeboats. Anyone arguing the toss about lifeboat numbers just isn’t seeing the whole picture.

        I love the way it’s generated discussion, though.

  10. Pete Says:

    Awesome blog… I just had to read it after thinking how insane it is for 26% of students to be able to attain A/A* at A-Level.

    The sad thing is, that due to the Government/Exam boards and teachers putting their heads in the sand, the whole country will be paying the price in 20/30 years time, when all the clever folk from the past are pushing up daisies.

  11. GCSE and A level exams: parental reflections | Protons for Breakfast Blog Says:

    […] problem is that – in Physics at least – the syllabus has been reduced and there has been around two decades of grade […]

  12. John Dray Says:

    I have just read this blog, as I was trying to work out what my 1984 A levels were worth in today’s market. The comments are illuminating. When I did my O levels we covered Newton’s laws of motion and s = ut + ½at² in physics, and the parabola in maths… they were re-stated at A level and taken further. By A level we had covered all conic sections including the hyperbola, circle, ellipse and line pair. Exam boards were linked to universities. Exam grades actually graded people, rather than everyone getting A/A*. Then again, the physics degree course I attended had to be changed into a four-year course in the 90s, because the first year was spent teaching students what their forebears used to know when they arrived at university.

    • protonsforbreakfast Says:

      Thanks John. I really don’t know what to say about this: re-reading it, it is all still true! I too took A levels in the olden days (’77 or ’78 I guess) and in those days the exam boards were linked to universities and were not wholly-owned subsidiaries of publishing companies. How any government let this happen is beyond me and why they let it continue is – still – a scandal. M

  13. John Dray Says:

    Thanks for your reply. My only explanation, and it is very cynical, is that keeping more children/young adults in education for longer ensures that the unemployment figures are lower. You pay one teacher and keep 30 pupils off the jobless figures. Wrecks education – lots of those people would be better off in apprenticeships, rather than in an education system that doesn’t work for them. The high-fliers need space to fly, rather than be bored with education that must operate at a pedestrian pace.

  14. Jonathan Bagley Says:

    Interesting stuff with a high level of discussion in the comments. I came across your blog, looking for information, after reading this in today’s Guardian.
