Courage and Ambition in Teaching and Learning

First published 06/03/2015

This blog first appeared on the EuroSTAR blog in April 2014 (http://conference.eurostarsoftwaretesting.com/2014/courage-and-ambition-in-teaching-and-learning/)

In 2013, Cem Kaner (http://kaner.com) asked me to review a draft of the ‘Domain Testing Workbook’ (DTW), written by Cem in partnership with Sowmya Padmanabhan and Douglas Hoffman. I was happy to oblige and pleased to see the book published in October 2013. At 480 pages, it’s a substantial work and I strongly recommend it. I want to share some ideas we exchanged at the time; they relate to the ‘Transfer Problem’.

In academic circles, the Transfer of Learning relates to how a student applies the knowledge they gain in class to different situations or to the real world. The transfer problem is discussed in the preface of DTW, where Sowmya and Cem relate some observations of student performance in a final exam that contained both questions closely resembling the practice exercises and a more challenging problem set in an unfamiliar context.

Almost every student handled the first type of question very well, but every student failed the more challenging question. It appears that the students were able to apply their knowledge in familiar situations, but not in an unfamiliar one. The transfer problem has been studied by academics and is a serious problem in the teaching of science in particular, but it also seems to exist in the teaching of software testing.

The ‘Transfer of Learning’ challenge is an interesting and familiar topic to me.

Like many people, in my final school year I sat A-Level examinations. In my chosen subjects – Mathematics, Physics and Chemistry – the exam questions tended to focus on ‘point topics’ lifted directly from the syllabus, and they were similar to the practice questions on previous papers. But I also sat Scholarship or S-Level exams in maths and physics. In these exams, the questions were somewhat harder because they tended to merge two or more syllabus concepts into one problem. They were clearly harder to answer and required more imagination and, I’m tempted to say, courage. I recall a simple example (it sticks in my mind – exam questions have a tendency to do that, don’t they?).

Now, the student would be familiar with the modulus of a number, |x|, being its absolute or positive value, whether x itself is positive or negative. They would also be familiar with the quadratic equation ax² + bx + c = 0, which can often be solved by trial and error but can always be solved using the quadratic formula. A quadratic wrapped in a modulus would not be familiar, however, so this problem demands a little more care. I leave it to you to solve.
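To give a flavour of the extra step involved, here is a sketch of an illustrative problem in that style (my own made-up example, not the original exam question): the modulus simply splits one unfamiliar problem into two familiar quadratics.

```latex
% Illustrative example only (not the original exam question):
% solve |x^2 - 4x| = 3 by splitting the modulus into two cases.
\[
  |x^2 - 4x| = 3
  \;\Longrightarrow\;
  x^2 - 4x = 3 \quad\text{or}\quad x^2 - 4x = -3
\]
\[
  x^2 - 4x - 3 = 0 \;\Rightarrow\; x = 2 \pm \sqrt{7},
  \qquad
  x^2 - 4x + 3 = 0 \;\Rightarrow\; x = 1 \ \text{or}\ x = 3
\]
```

Neither case is hard on its own; the trick is recognising that the unfamiliar combination is just two familiar problems in disguise.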

Now, this ‘harder question’ style was (and still is) used by some universities in their own entrance exam papers. I sat university entrance exams that consisted exclusively of this pattern of question. Whether it is an effective discriminator of talent, I don’t know, but I got through them, thank goodness.

My experience of testers being unable to transfer skills to real-world or more complex contexts is a manifestation of the ‘transfer problem’. It seems to me that it is not a lack of intellect that causes people to struggle with problems outside a familiar context. Instead, I’d like to consider two attitudes to teaching and learning that we should encourage: courage and ambition. For the first, I will draw a parallel with sports coaching.

Most years, I coach rowing at my local club. In rowing, and in sculling in particular, if a sculler makes a mistake they can capsize the boat, fall into the river and get rather wet. There is a risk, and that risk makes people unwilling to commit to correct rowing technique. Correct technique demands, first of all, that the rower gets their blades off the water, which leaves them in a very vulnerable, unstable position. They have to learn how to balance a boat before they can move it quickly: they must be confident first, skilled second, and only then can they apply their power to make the boat move fast.

It’s a bit like taking the stabilisers (training wheels) off a pushbike – it takes some confidence and skill for a beginner rider to do that. Coaching and learning tightrope walking, skiing, climbing and gymnastics are all similar.

Coaching athletes involves asking them to have courage, to trust their equipment, the correct technique and the laws of physics, and not to fear the water or a fall in the snow. In fact, coaches almost force people to fail so that they recognise that failure doesn’t hurt so much and that they can commit, knowing the consequence of failure is not so bad after all.

I remember many years ago when I was learning to ski in a class of ten people – at one point on a new slope, the whole class was having difficulty. So the ski instructor took us to an ‘easier slope’. We struggled there too, but made some progress. Then we went back to the first slope. Remarkably, everyone could ski down the first slope with ease. In fact, the ski instructor had lied to us – he took us to a harder slope to ‘go back to basics’. It turned out that it was confidence that we lacked, not the skill.

Getting people to recognise that the risk isn’t so bad, to place trust in things they know, and to have the courage to try and keep trying can’t be learnt from a book or an online course. It takes practice in the real world, perhaps in some form of apprenticeship, and with coaching support, not just teaching. Coaches must strongly and continuously challenge the people they coach.

The best that a book can do is present the student (and teacher) with some harder problems like this, with worked examples. Even if we expect the student to fail, we should still set them this kind of problem, but then the teacher or coach has to walk through the solution, pointing out carefully that it’s not just allowed, but essential, to think outside the box – or outside the core syllabus – and perhaps even to trust their hunches.

Coaches/trainers and testers both need courage.

Test design techniques are often taught as rote procedures whereby one learns to identify a type of coverage item (a boundary, a state transition, a decision in code) and then derive test cases to cover those items until 100% coverage is achieved. There is nothing wrong with knowing these techniques, but they always seem to be taught out of context. Practice problems are based on static, unambiguous and, above all, simple requirements or code, so when the student encounters a real, complicated, ambiguous, unstable requirement it is no wonder they find it hard to apply the techniques effectively – or at all.
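As a rough sketch of what that rote procedure looks like when mechanised – assuming a made-up requirement that an age field must accept values from 18 to 65 inclusive – boundary value analysis yields its test cases almost automatically:

```python
# Illustrative sketch of boundary value analysis as a rote procedure.
# The requirement ("age must be between 18 and 65 inclusive") is a
# made-up example, not taken from any real specification.

def boundary_values(lower, upper):
    """Classic boundary inputs for an inclusive numeric range."""
    return [lower - 1, lower, lower + 1, upper - 1, upper, upper + 1]

def derive_test_cases(lower, upper):
    """Pair each boundary input with its expected outcome (valid or not)."""
    return [(value, lower <= value <= upper) for value in boundary_values(lower, upper)]

if __name__ == "__main__":
    for value, expected_valid in derive_test_cases(18, 65):
        print(f"input={value:>3}  expect_valid={expected_valid}")
```

Derivation this mechanical is easy to teach and easy to assess, which is precisely why the messiness of real requirements comes as such a shock.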

These stock techniques are often presented as a way of preparing documentation to be used as test scripts. They aren’t taught as test models – models with more or less effectiveness or value for money – to be selected and managed; they are taught as clerical procedures. The problem with real requirements is that you need half a dozen different models on each page, on each paragraph, even.

A key aspect of exploratory testing is that you should not be constrained; you should be allowed and encouraged to choose models that align with the task in hand, so that your testing is more direct, appropriate and relevant. But this ‘freedom of model choice’ applies to all testing, not just exploratory testing, because at one level all testing is exploratory. I’ve said that before as well (http://gerrardconsulting.com/index.php?q=node/588).

In future, testers need to be granted the freedom to choose their test models, but for this to work, testers must hone their modelling skills. We should be teaching what test models are and how models can be derived, compared, discarded, selected and used. This is a much more ambitious goal than teaching and learning the rote procedures that we call, rather pompously, test design techniques. I am creating a full-day workshop to explore how we use models and modelling in testing. If you are interested or have suggestions for how it should work, I’d be very interested to hear from you.

We need to be more ambitious in what we teach and learn as testers.

Tags: #teaching #learning

Paul Gerrard