Talking with a colleague at another institution this week, I heard that many of his colleagues felt there was a ‘student problem’: the teaching was good, but the students were simply unable to make much of it. Their poor grades and weak academic practice were, essentially, their fault.
I’m going to go out on a limb here and suggest that such views are not uncommon. You see echoes of them in those listicles on Facebook come marking time: “look what my dumbass students wrote”, “15 photos that make you ask why we bother to even try to teach them.” I’ve done it myself, like when I talk to people about the class presentation I had to sit through on Claude Monet’s contribution to European integration. I’m smart, they’re dumb, and you’re smart, because you understand what the mistake is and why they’ve made it. Basically, XKCD for political science, but much less generous in its humour.
A couple of years ago, I was part of a pilot project/committee on documenting teaching efforts. The goal was to create a process that mirrored disciplinary research for faculty to use to demonstrate “a peer-reviewed contribution to teaching with real impact in the classroom.” After much deliberation, we called the process a Peer-Reviewed Exploration in Teaching, or PRET for short. As peer review is the gold standard in research, a PRET builds on this idea. In this case, the peer group was other faculty, across different disciplines at the university.
As a participant, the process of doing a PRET encouraged quite a bit of reflection. The first part of the process was a proposal that linked specific learning objectives to an activity “that is grounded in pedagogical literature and designed to meet those learning objectives”. Although I’ve always felt that active learning is beneficial, thinking deeply about (and setting out in a proposal) why this particular activity (I was evaluating my use of the Statecraft simulation) should meet those particular learning objectives was more difficult than I expected. The peer feedback on this step was very useful in solidifying my thinking.
The process included peer observations of my class over a number of sessions – before, during, and after the activity. The observers also conducted a focus group with the students. The report that my peer observers produced from the observations and focus group was eye-opening. It also brought a new level to the debriefing process that we always say is so important with simulations.
Although the PRET formalized this process, I think the basic idea is generalizable and worth the effort. I could see an informal peer-to-peer exchange of proposals and observations working. It encouraged me to think about the goals of the activity in a more systematic way, and it provided an outside perspective on whether the activity was meeting these learning goals. As a peer reviewer of other faculty, I learned quite a bit about what active learning looked like in other disciplines and adopted some new ideas.
There’s a lot to be said for banality. It’s probably the most under-rated of teaching practices, mainly because it’s so little remarked upon. We always talk about pushing our students to the edge of their knowledge and understanding, so that this edge is pushed further back, and we also keep flagging the core ideas as lodestones, but we only rarely come back to the stuff in-between: the logical corollaries of the core concepts.
Yesterday’s class with my negotiation students was a case in point.
The session was centred on the theme of preparation, and asked them to agree a governmental coalition in the wake of the Spanish elections. This threw up lots of great thoughts and discussions about many points: Spanish politics, coalition-building in general, verisimilitude in simulations (they ended up with a grand coalition), and even stuff about preparation (I’d possibly been less than helpful about what prep they needed to do).
But for me, the big lesson was one that I end up discussing at length every year, but never quite manage to embed explicitly in the module’s work. And it’s a banal point: simply put, it’s not what you say, it’s what people hear that matters.
Even if you’ve never studied negotiation, you know enough about constructivism to appreciate the objective weight of subjective interpretations, enough about the importance of clear communication in any sphere of life, and enough about life to know that misunderstandings and talking-at-cross-purposes happen pretty often.
But you also might well have never put those things together to consider the banal point that people will tend to understand things as they understand them, rather than as someone else understands them. So it doesn’t matter if I think I’m being clear, if you don’t think I’m being clear.
Likewise, my students seemed to have a bit of a block in understanding why there had been some tension in the negotiations. A couple of groups had left the room to work out some options, and didn’t want to be disturbed by emissaries from the other parties. Unfortunately, since they were the PP and PSOE and ended up with that grand coalition proposal, when they did present it to the others, they didn’t get much joy. Both sides were still quite sore about it, even during the debrief, and we had to work through how this had come to pass before we could get to all that other stuff I mentioned.
I’ve written about this before in a different context and that’s maybe the point: at some stage it becomes so obvious that it’s hard to remember that we need to remember it.
In my case, I’m fortunate that it’s such a pervasive issue that it does always come up at some point in class, but you might not have that. All of us might do well to remember that to leap from central theoretical tenet straight to the boundary can be exciting and engaging, but it can also come with costs.
It’s really only fitting that the presence of the QAA at work should drive this week’s reflection and activity. As their audit team spend the week meeting with staff and students, and working through the huge pile of documentation that we’ve given them, we’ve necessarily had to make some adjustments to our usual order of things. In my case, that meant bringing my class to an end a bit early yesterday, so that I could get to my meeting on time.
I’ll admit that when I first got the schedule, I did um and ah about whether I should simply move the class altogether: that would mean less rushing about for me, and more class time for the students. But since I’ve been prepping for this visit for about a year and a half, I doubted that the extra time would do anything much to help. This is part of my classic model of exam-stress management: it’s usually too early to stress, until it’s too late to stress. As long as I don’t think about it in the transitional (‘it’s the right time to stress’) period, I’m fine.
Anyway, more importantly, I was also thinking about what this particular session with the students was trying to achieve. It’s about negotiation in practice, following on from two sessions on negotiation theory, and is basically there to manage the transition into the much more active phase of the module, by highlighting the difficulties of using theory in practice.
In previous years, I did the following: recapped the key messages from the theory, showed them a video that I made of a haggle, discussed how I was good and bad in using theory (bad, mainly), shared my key practical tips on negotiation, then got them to play a small crisis game (like this one). In short, I built linearly from theory through my failings (to make them comfortable with reflective practice), then discussed their practice in the crisis game.
That structure made a degree of sense, but it is also rather gentle in immersing students in what is to come, and probably unnecessarily so, on the evidence of all the cohorts I’ve taught, who seem to grasp the need to look in on their practice and then articulate that externally.
So I chopped it all around. I got them to watch the video before class (also saving me the hassle of IT/speakers/etc.), then launched straight into the crisis game as they arrived in the classroom, debriefed them using the theory, before moving on to discussion of the video, and finally my tips.
This refocusing on their practice meant the session was a much more constructive opportunity for them to consider what they did and do, still with the knowledge that (via the video) I’m also capable of self-critique (not least of my acting/directing skills). Giving them the practical experience at the top of the class really helped to give them something to hang their thoughts on.
The bonus of all this was that it also shortened the session, because the video was moved out, so next year I can work on adding more content/activity to the session.
If there’s a general point behind this, then it’s to think about how and why we scaffold our students’ learning. Often, there’s sound reasoning in building up to more advanced activity, but it’s also worth reflecting on whether that later activity is really any more advanced, or just different from the norm. As I’ve discussed in other contexts, sometimes we need to credit our students with more resilience and capability than we do: isn’t self-reflection something we want to be developing in our students from day one?
You probably don’t have the impetus of a QAA visit to help you try moving things around, but it’s still worth a try.
As long-standing readers might recall, I read. A lot. But I don’t tend to talk/write about my reading much, either because it’s not immediately relevant or because the rest of my life is so rich with incident. Probably the former, then.
However, I’ve just ploughed through Philip Tetlock & Dan Gardner’s “Superforecasting”, which does offer some interesting perspectives that might be tied to our work in the classroom.
Tetlock will be known to you for the meme that most predictions are no better than chance in their accuracy. As he points out, while that might be true in the aggregate, it’s not true for every forecaster, and this book (which follows up Expert Political Judgment, the work that set out the rigorous basis for the meme) explores those people who have demonstrated a consistent and measurable ability to outperform both the metaphorical ‘crowd’ and other mechanisms, such as forecasting markets.
Tetlock’s core point is that such ability is not innate, but learnt and learnable. And that’s what interests me here.
Over the past couple of weeks, I’ve been teaching students about negotiation theory, before I drop them into the metaphorical deep end of actual negotiation. One of the core skills I’ve been trying to stress to them is the need for preparation, which forms the bedrock of good practice.
However, while it’s easy to say “be prepared”, it’s often hard to know how to prepare, especially if the situation is a complex one. And I can see utility in Tetlock’s approach as a way of improving the chances of having as good an oversight of the preparatory phase as possible.
In practical terms, that means doing things like breaking down intractable problems into tractable sub-problems, looking at the situation from as many different perspectives as possible and engaging in constant checking of your practice and biases. All the kinds of things we might be encouraging students to do as active learners, in fact.
And perhaps that’s the point. Learning is a memetic process, where we draw analogies across from one place to another, hopefully casting light on the way. Tetlock’s interest in this book is not in effective actors (although he does mention several examples of how his core ideas get operationalised), but in our ability to forecast events. I’m less interested in that, but can see how I could use it to improve students’ performance elsewhere.
Tetlock talks about the rise of the ‘mission command’ model in the military, the notion that what matters is the underlying intent, which local commanders then find the most appropriate way to achieve. This replacement of the strict hierarchy of command-and-control with something much more flexible and adaptable is not so different from the shift towards active learning. We set out learning objectives and students find their own path towards them.
I wouldn’t want to stretch this too far, but it makes the point that we should be casting our nets as widely as possible when developing our practice, because the analogies can take us to places that we might never have considered before.
A few weeks ago, I wrote about simulating the Greek crisis. I suggested then that one issue in doing this was the difficulty of carrying things over from year to year in the classroom: students change, curricula change, you never quite know whether it’s still going to be relevant, etc.
As Amanda rightly pointed out in an email to me some time later, you can perfectly well do it, with a bit of thought. So it’s with that bit of thought that I am now doing it.
Last year I created my first online asynchronous simulation for the INOTLES project in which I participate. As you’ll see from the post, it’s a simplified recreation of the East European situation, with a friendly (if ponderous) EU-like structure on one side and a confident (if worryingly so) Russia-like country on the other.
I played this with my students too, with the upshot that the ‘Russians’ produced a surprising success in sealing a deal with the ‘East Europeans’ (largely over a misunderstanding, but let’s not pretend that doesn’t happen in real life too). I put the simulation back on the shelf, mused on what had happened and then basically forgot about it.
Until Amanda’s email. There’s no reason why this year’s students can’t pick up where their predecessors left off.
It’s a fictional scenario, with all the requisite information provided. Since it allows for a wide-ranging set of actions, there is no obvious end-point or stable equilibrium. Indeed, one might imagine that some students might take the opportunity to revise the actions of the past, just because they can. Certainly, given the rather devil-may-care approach to a second round of the Hobbes games in class yesterday, that looks like a likely outcome.
I’ll quote Amanda at some length here:
Whenever we do a simulation, it tends to be a new run of an old game–how neat would it be to have the simulation just continue, with students acting as the newly appointed representative for that country and having to work with old agreements produced by students who are no longer in power? I find the idea really interesting, not only for the sense of realism it brings to ongoing negotiations, but also for the real-world skill of having to step into a job vacated by someone else and having to figure out what the prior office holder did and how to incorporate their decisions into your own.
Amanda’s last point is perhaps the crucial one: we all have to pick up other people’s stuff and deal with it – it’s a basic staple of professional life – so getting to experience that is a useful opportunity for personal development.
Indeed, the game’s original conceit of opening with no particular prior situation is clearly unrealistic, so we’ll learn about path dependency directly.
Amanda’s one concern was about record-keeping: how to capture what had happened, so that we can pick it up again. Well, I’ll admit that this isn’t a big issue in this case. The final agreement reached ran to a full four lines of hand-written text and there was nothing else to share. I’m hoping that this time around we’ll have clearer sight of the next year, so that paper-trails can be left, with all the joys that brings.
As usual, this is all new territory for me, so I’ll be reporting back as we progress.
So it turns out I’m a Thinker. And a Doer. And a Planner. And a Solver.
Last night, I took my eldest’s learning styles questionnaire, which had come home from school earlier on, and which revealed that I am a very balanced individual. Possibly.
Now it could be that I am very balanced. Or that my enormous experience and modesty mean that I have learnt to appreciate and internalise these different approaches. Or that the questionnaire is not very good at distinguishing between them. Or that learning styles are possibly unhelpful as a model for understanding individuals.
Or maybe it’s just because I didn’t tick number 33: “I’m fun to be around.” Who knows?
And that’s actually the issue we confront very often in L&T: we just don’t know what we’re measuring and whether it’s meaningful. It’s easy to poke holes in a questionnaire for kids, but it’s just as common an issue in universities, where all these clever people sit around working hard to describe clever ways to resolve precisely such issues.
I’ve observed before that students’ evaluations of modules/courses – which we set such great store by in the UK – can only ever be part of judging the quality and effectiveness of our teaching. “Have we made a good impression on our students?” is a different question from “have they learnt something useful?”, yet we often conflate the two.
Likewise, as has been noted on this blog many times, we talk about new pedagogies and why they’re great, without much evidence to back that up. Just because I think simulations are great doesn’t mean they’re great, and my anecdote is no match for systematic research into outcomes.
So what to do?
Partly, we have to rein in our enthusiasm and be frank about the limits to our knowledge. I think I speak for all the ALPSblog authors when I say that we talk as much about the costs and limitations of our active learning practice as we do about the benefits: beware anyone who offers you the moon on a stick.
Partly, we have to go out and get that evidence. For my part, I’m going to use some of the money the nice people at the Higher Education Academy gave me to conduct some research into simulations to do just that. I’m still working on the details, but it would seem to be a worthwhile activity, on a number of levels.
To return to a theme we discussed not so long ago, if we are to bring more people along with us in improving pedagogy, then we have to make the case better.
You – as a Thinker, Planner, Doer and Solver – can be part of that too.