Always read the small print – the decoupling of skills and knowledge in our exams

There is a funny thing about the skills versus knowledge debate. It seems odd that anyone would presume many people seriously argue for what is obviously a false dichotomy: a choice between skills or knowledge. On the other hand, I do know that many people decouple skills from knowledge. They try to teach generic skills, with subject matter chosen largely for its suitability as a vehicle for teaching those skills. Thinking skills, learning to learn and twenty-first-century skills are education buzzwords.

However, there is actually quite a large volume of research suggesting that:

  1. Skills are the product of fluency of knowledge in a specific area.
  2. Skills learnt in one area don’t transfer readily to other areas.

The idea that it is possible to teach generic skills such as critical thinking or creative thinking is so seriously contested by the research [this article is a clear summary of research in the area] that it seems odd that schools are so gung-ho about buying into a skills-based educational agenda. I suppose it is so beguiling – make schooling about the inculcation of transferable skills and you don’t have to try to justify curriculum content for its own sake. However, it is even more surprising that our GCSE and A level exams are entirely built around the assumption that skills can be separated from knowledge, decoupled, and separately assessed. Should a nation’s examination system really be based on assumptions seriously contested by science?

Which brings me to the first of many points where I fear you might start to drift off, because I need to show you the most tedious part of exam specifications, the bit any sane teacher just skips over: I want to show you some assessment objectives.

Here are some old style (old specification) assessment objectives for history GCSE:

AO1: recall, select, organise and deploy knowledge of the specification content and communicate it through description, explanation and analysis of:

  • the events, people, changes and issues studied
  • the key features and characteristics of the periods, topics and societies studied (70%)

AO2/3: source work (22%) and interpretation of historical events (8%)

Here are the ones currently in use:

AO1: Recall, select and communicate their knowledge of history (37%)

AO2: Demonstrate their understanding of the past through explanation and analysis of, and judgements about, key features and the concepts in history of causation, consequence and change (36%)

AO3: Source work (27%)

Can you spot the difference?

The problem is that this small change to specifications in most subjects is distorting our whole examination system. We can’t ignore it.

The assessment objectives dictate the mark schemes and the mark schemes dictate how the students are assessed. We teach our kids to deftly hop, skip and jump their way through those mark schemes, otherwise they won’t pass. But what if the mark schemes don’t actually describe progression in our subject? What if they attempt to test a facility in demonstrating a totally fictional hierarchy of ‘generic’ skills? Surely not…

Problem 1: Mark schemes assess skills separately from knowledge and understanding, when the distinction is meaningless.

Today I have been marking politics A level mocks. Here is a 10-mark question chosen at random:

Explain three political functions of pressure groups.

  • Up to 7 marks are available for: ‘developed knowledge and understanding’
  • Up to 3 marks are available for ‘Intellectual skills’: specifically ‘the ability to analyse and explain how pressure groups function.’

So there we have it: the decoupling of knowledge from apparently generically teachable skills in the assessment objectives means they must be assessed separately in the mark scheme.

I sat and looked at those descriptions today till my head began to spin. How can a student show level 3 understanding of the functions of pressure groups (AO1) without explaining how they function (AO2)? How able are we to analyse anything convincingly without deploying good knowledge? Marking becomes easier if the mark scheme defines the content that will count as AO2, but while that might help the markers it leads to a checklist approach to essay marking and hardly solves the core problem. Sometimes it is nice to clearly reward a candidate who knows loads but hasn’t got it together in an argument, and that feels like the grain of truth behind this approach. However, they have hardly shown good understanding of the issues (AO1) if those facts are not well used. Serious, worthy examiners perform feats of mental gymnastics to make this stuff work in their own minds. If ever there was a case of the emperor’s new clothes…

If this incoherence in the mark scheme were the only problem, the decoupling of skills and knowledge would be merely a frustration. However, it is not just nonsense; it is pernicious nonsense.

Problem 2: Assessing skills distorts the mark scheme progression and leads to unreliable assessment.

I teach Political Ideologies to A2 students. I prepare students to answer essay questions such as:
‘Socialism is defined by its opposition to capitalism, discuss.’

I returned from a long maternity leave four years ago to find the A2 essay titles were exactly the same but the mark schemes had decoupled knowledge from skills. In fact they went further than a two-way split:

AO1: ‘Knowledge and understanding’

AO2: ‘Intellectual skills’ of ‘analysis and evaluation’

AO2: ‘Synoptic skills’, which means being able to ‘identify competing viewpoints’ and showing awareness of how they ‘affect interpretation’ and ‘shape conclusions’.

AO3: ‘Communication and coherence’

We have to judge politics A2 candidates using four different level descriptors for the one essay and thus make four different decisions, as defined in the list above. Can you identify competing viewpoints while not analysing? Can you show consistently good understanding incoherently? Three quarters of the marks are now for discrete skills that can apparently be demonstrated separately from knowledge and understanding.

When answering the question above, weaker students tend to give descriptions of the different sorts of socialism and then perhaps mention in passing how each strand of socialism viewed capitalism. My aim as a teacher is to improve their understanding so they can get beyond this. Able students really do actively compare types of socialism and explain WHY they had different approaches. I have done years of external examining and was used to marking essays using a set of level descriptors that had some flexibility but were built on the assumption that meaningful analysis comes from a foundation of secure knowledge and understanding and, by definition, is not frequently evident in ‘C’ grade answers.

Now though… to score a C grade, students must show ‘C’ grade ‘analytical skills’ to match their ‘C’ grade knowledge and understanding, when almost by definition those with only ‘C’ grade knowledge cannot effectively analyse. Teachers and textbooks routinely provide students with lists of arguments they can make in essays to help them do this, because they would struggle otherwise. It is a delusion to believe that these students are now genuinely analysing rather than describing, and that they are developing generic analytical skills they can apply to this or any other question. They are parroting arguments they could not have developed themselves and sometimes barely understand (despite my best efforts).

Given that it is virtually impossible to judge the knowledge and understanding separately from the analytical skills, it is inevitable that examiners are often told which responses to credit as AO2 rather than AO1. Sometimes weak students have learnt these AO2 arguments off by rote and miraculously get ‘A’ grades. On many other occasions I have seen from photocopied scripts that able students have neglected to hit the points identified as AO2 and plummet to a ‘C’ grade. Either way, the mark scheme is not actually describing the real progression between weaker and stronger responses, and this will inevitably lead to distortions and injustices.

Problem 3: Mark schemes now describe a fictional hierarchy of ‘generic’ skills

GCSE and A level mark schemes assume students will make progress in skills, apparently independent of the content they grasp. Questions are written to test analytical skills, and the subject matter is simply the necessary vehicle for demonstrating them.

The following is from guidance to AQA GCSE history examiners:

“Each ‘level’… [of the mark scheme] represents a stage in the development of the student’s quality of thinking, and, as such, recognition by the assistant examiner of the relative differences between each level descriptor is of paramount importance… Indicative content must not however determine the level into which an answer is placed; the student’s level of critical thinking determines this.”

But there is no such thing as measurable progress in a stand-alone skill of analysis or critical thinking.

There really, really isn’t. You can’t judge quality of thinking and then adjust within the level depending on quality of content, because the grasp of the content dictates the quality of the thinking. Any mark scheme which tries to describe this fiction is nonsensical. The research is pretty clear: it is grasp of the detail that allows a student to analyse effectively. We all know that students will quickly lose this apparent skill when faced with a topic or an entirely new subject they don’t understand. When we hear a student reasoning intelligently about football but then find him unable to transfer this ‘skill’ to his history work, is this because he has suddenly lost command of a skill? How can one have ‘quality thinking’ without deep knowledge and understanding? A question’s difficulty actually depends on the complexity of its content. In fact it is very common to hear history teachers complain that the NC levels, which rest on similarly flawed assumptions, do not actually describe progression in history.

Despite this, AQA decides how examiners might identify evidence of a generic ability to describe, explain, assess and compare in GCSE history, because the assessment objectives require progress in these ‘skills’ to be assessed. So mark schemes try to define, inevitably quite narrowly, how apparently stand-alone skills can be demonstrated. What we all know is that in practice students won’t conform to these criteria without some pretty clear guidance.

A student won’t conform to the mark scheme’s description of progression, and get the marks, unless they are drilled.

So those flawed assessment objectives bear a heavy responsibility for the miserable hours of technique coaching necessary to ensure candidates get the grades they deserve. Did we really believe that by teaching the technique to climb a particular mark scheme our students were mastering transferable skills?

So why is the whole structure of our exams based on this nonsense? Why is no one questioning such flawed assumptions? Lots of people who are influential in education know that skills can’t be decoupled from knowledge, and it isn’t as if our exam system isn’t important enough to make a fuss over. It even looks like the new exams will be keeping these problematic assessment objectives. I don’t really understand why no one is making a fuss about this.

 

Some of these points have already been blogged here. A detailed look at mark schemes can be found in the comments section of that post.


9 thoughts on “Always read the small print – the decoupling of skills and knowledge in our exams”

  1. Gosh! I had to read it twice, carefully. A thorough treatment of an essential point. You are quite right; have you challenged the Chief Examiner of these courses? You should. I had never considered the pervasive nature of a small change to a principle of assessment, especially when based on a misconception; I will look at mine ‘with new eyes’. Great job!

    1. The problem is that it is not just some courses. These sorts of AOs are pretty much across the board, so the decoupling can be found on most GCSEs and A levels. I haven’t looked at subjects like maths, where I guess it has less impact, but you will see the same split on most specs.
      The post is dense and I am grateful you ploughed through!

  2. Brilliant post on the problems with the generic mark schemes. Completely agree with this analysis. Have been writing on a similar theme with regard to KS3 here http://www.andallthat.co.uk/blog but this is more important, I think. Like you, I also mark A2 ideologies and sometimes struggle to get my head around the way the generic skills are applied to the question. Most of the time examiners hammer these issues out in moderation, but even this doesn’t happen face to face any more. Couple this with the fact that a generic skill band can sometimes cover 3 grades (AQA A2 History Unit 3) and you get a recipe for disaster.
