Scenario modelling

One of the more regular questions we get from students of Politics/IR is ‘what’s going to happen on X?’

We study political events; they see political events and, not unreasonably, assume we know how it will all play out.

Of course, that’s easier said than done, as my own track record on assorted elections, referendums and scandals has demonstrated.

But the question remains a good one, because it’s an opportunity to apply theory to practice, and to appreciate where the uncertainties lie.

Of course, right now the invasion of Ukraine is the big example on many people’s minds in Europe, but you could add in the looming SCOTUS decision on Dobbs v. Jackson Women’s Health Organization, or the renewed tensions in North Korea, or Taiwan’s situation, or pretty much anything else really.

Each of these political situations has a wide range of possible outcomes, so working through what those might be, and the factors that might weigh in deciding between them, is a useful exercise for students and colleagues alike.

This reminds me of the excellent work done by Jon Worth during the hot phase of Brexit, where lots of uncertainty existed and everybody had a hot take to share.

His approach was to work through necessary decision points and allocate weightings to the likelihood of various outcomes, ultimately producing a summary set of overall results of varying probabilities. You can find his last set of diagrams here.
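For readers curious about the mechanics, the arithmetic behind such diagrams can be sketched in a few lines of Python. The decision points, branch labels and weightings below are entirely invented for illustration; real diagrams, including Jon’s, also let later branches depend on what happened earlier, whereas this sketch treats the decision points as independent for simplicity:

```python
from itertools import product

# Each decision point maps branch labels to probability weightings
# (each dict's weights sum to 1). Labels and numbers are illustrative only.
decision_points = [
    {"deal": 0.6, "no deal": 0.4},       # hypothetical decision point 1
    {"ratified": 0.7, "rejected": 0.3},  # hypothetical decision point 2
]

def outcome_probabilities(points):
    """Combine branch weightings into overall probabilities, one per path
    through the tree: the product of the weights along that path."""
    results = {}
    for path in product(*(p.items() for p in points)):
        labels = tuple(label for label, _ in path)
        prob = 1.0
        for _, weight in path:
            prob *= weight
        results[labels] = prob
    return results

probs = outcome_probabilities(decision_points)
for path, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(" -> ".join(path), f"{p:.0%}")
```

The point of working this through with students is less the multiplication itself than the argument over what the decision points and weightings should be, which is exactly the iterative, transparent process described above.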

Crucially, Jon did this in a very transparent way, gathering input from social media contacts on both the steps involved and the probabilities to attribute. As you’ll see, this made each diagram an iterative process.

Jon used open-source software for this and put in a lot of time. He’ll freely discuss how getting feedback proved harder and harder over time, so this isn’t something students should do more than once. But you can certainly see how a small group could produce a diagram within a session and then work to refine it among themselves: for many topics, you could return to those diagrams the next time you ran the class, a year later, to see how they stood up.

The value here is in the unpacking of assumptions and the explicit consideration of how things fit together. Whether students make the right call on what happens or not, they learn – through debriefing – why things turned out the way they did.

Creating Wicked Students 3

Time to reflect on the previous semester’s successes and failures:

I might be on to something with the Wicked Problems that I created for my comparative politics course. Previous posts on the subject are here and here. A brief synopsis of the activity: in class, teams of students have to quickly determine and present a possible solution to an unstructured, authentic problem. I put four of these exercises into the course:

  • Political risk consultants recommend to Volkswagen executives which of two sub-Saharan African states is most suitable for establishing a new automobile manufacturing site and sales network.
  • Defense Intelligence Agency analysts identify which of three Latin American U.S. allies is most susceptible to a Russian GRU election disinformation campaign.
  • The United States Institute for Peace delivers a conference speech on constitutional design for leaders of Libya’s major political parties that compares constitutionally-established institutions of government across four states.
  • Members of Iran’s Mujahedin-e-Khalq create a strategy for overthrowing the Islamic Republic by examining revolutionary movements in four other states.

Students found the exercises engaging. My exams included a question that asked students to reflect on what they learned about their problem-solving ability from each Wicked Problem, and their answers indicated a reasonable degree of meta-cognition.

But it was obvious that students failed to use the methods of comparison that I repeatedly demonstrated during class discussions. I expected students to organize their cases and variables into a simple table, like I had, but they didn’t. So, for example, instead of something like this:

Variable                        Country A    Country B
Ethnically heterogeneous        No           Yes
Prior civil war                 No           Yes
Major oil exporter              No           Yes
High level of political risk    No           Yes

students presented the equivalent of this:

Nigeria has a large population and represents a larger automobile market than Rwanda, so Volkswagen should site its new operation in Nigeria.

I suppose the solution is to require that students create their presentations by filling in a blank table, which will force them to select cases and variables in a logical manner.


A short one today, to encourage you to read this thread from C. Thi Nguyen on how he’s challenging students’ understanding of grading:

Suffice to say here that his ideas resonate a lot with my own, but he’s in a position to do more about it with his class. For those of you who are bound to your institutional requirements on grading and assessment, this is still a really useful discussion to have, with both students and colleagues.

That’s a great question

Not this (source)

As someone who’s been listening to a lot of US-based podcasts during the Covid era, the title of this post is very familiar to me, since it’s the first thing a guest says when asked any question on said podcasts.

Even – and hear me out on this one – when it’s not a great question.

It is, of course, more a linguistic tic and a means of getting a couple of seconds longer to work out a reply (maybe even a great reply) to the great question.

Which is fine, but it’s also had me thinking about how we handle questions in our classes.

For many starting out in teaching, questions figure large in the pantheon of ‘shit that can go wrong’. What if I get asked something I don’t know the answer to? What if I get muddled up? What if the student disagrees with me? What if my colleagues find out?

So let’s try to map out some strategies and things to remember.

Firstly, recognise that the key reason questions from students in class seem daunting is that you don’t know what they’re going to ask. If you’re approaching teaching as an exercise in ‘I must do my thing’, then you have likely decided rather precisely what your thing is, and anything else is at best a distraction and at worst a sand trap.

Start by thinking through what teaching is trying to do: it’s about helping students to learn, not about you getting stuff out there. So it’s about them, not you. Which means you have to accept that there is always likely to be a gap between what you’re trying to communicate and what they understand of your communication.

[Small aside: I once spent 15 minutes discussing with students what ‘iterated’ means, because they’d not heard of the word before.]

This leads into a second point: students almost always are asking about something because they’re trying to understand. Just like journalists, almost none of them are out to get you when they ask a question.

Indeed, think about the shift you experienced from before your first ever class to after it: before, you probably worried about ‘all the difficult questions’, but after, you more likely worried about whether you’d ever get any communication out of your class. Most students won’t ask stuff, and the ones that do are the ones who are interested in what you have to say.

Thirdly, the classroom isn’t a quiz show. There isn’t a prize for answering quickly and you don’t just move on after your answer. Instead, see questions as part of a dialogue with students. You can ask them to explain or expand on their question if you think that might help. You can say that you don’t have the answer to hand, but you’ll come back to them with it (and then you must totally do that). You can ask the rest of the class if they have any ideas.

[Another small aside from my first month of teaching: I got asked to explain voluntary export restraints, and got no further than rearranging the three words and shrugging. This was not good.]

In short, see questions not as a threat, but an opportunity.

Yes, you should think about what students might ask about – the stuff that you know people find tricky, the application to a contemporary case – but more important is treating a question as a way to valorise the student in the learning process (that’s what ‘there are no stupid questions’ really means BTW), to give them a stake in your class. The more you respond constructively to questions, the more comfortable students will be in asking, and the better the chances that they will understand what you want them to be learning.

Maybe all those guests on those American podcasts might actually be on to something after all.

Say what you see

As part of the package of materials I’m building for our new Masters in IR, I’ve been trying out some different ways of stimulating reflection and discussion with our students.

As a reminder, we run a fully distance-learning model, with weekly asynchronous bundles of written, audio and visual elements, with a lot of our students doing this around work or other commitments.

The mix of elements is an important point for us, both because it maintains interest for students and because it opens up different ways of looking at key questions when opportunities for face-to-face discussion with peers are limited: sometimes just trying to approach an issue in a different manner can help things to click together.

As such, while I’ve been putting a lot of effort into an asynchronous negotiation exercise, I’ve also been looking at ways of tackling other elements in an interesting and engaging way.

So I’ve been exploring some visual analysis of a table.

You’ll know the table, since it launched a thousand memes, back before the visuals from Ukraine became a lot more visceral (in all senses).

Source: DW

We’ve written before about getting students to produce memes on subjects, but this time I’m more interested in the image as originally presented.

Strange as the table might be, it’s also evidently a conscious choice (given that Putin met other world leaders in different settings and around different furniture), so the question we might ask of students is: “what impression is Putin trying to convey here?”

That’s partly about the impression on Macron, but also (and more importantly) the impression on other audiences, both within Russia and beyond.

In my exercise, I ask students to write about what they think of these different communications, with some prompts for reflection once they’re done. These prompts are important because they remind the student that what they see is not necessarily what others see.

This is a key part of the activity, since the polysemic nature of political communication is not always so obvious in other media, whereas visuals open up much more space for multiple interpretations. To make the obvious point here: this table set-up looks so odd to me that it must have some other set of meanings that I have missed.

Moreover, precisely because this image got so reworked for memes and mockery, there’s a follow-up exercise here to ask students to consider what those memes try to do and how they try to do it.

For my students, this will be a 15-minute exercise in total, but if you ran this in class you could easily make an hour of it, through exchanging ideas and grounding it back into wider patterns of Putin’s (self-)representation and communication. Plus how our view of it changes with all that has followed.

Assessing large groups

Among the many upsides of working with relatively small groups of students for most of my career has been that I’ve not been driven primarily by volume-management concerns. I could look across at my colleagues in Law or Business and laugh at the thought of having to juggle several hundred exam scripts at once.

(One time, a Business lecturer proudly told us how he’d marked 400 exams in three days, only for it to become clear it was a multiple-choice paper, with answers scanned in, which raised questions about why it had taken so long.)

But in the spirit of cooperation, a recent tweet about the need to treat assessment as an integral part of our teaching activity prompted this response from a Legal colleague:

This is an issue for many of us at some point: the big compulsory course where innovation feels like a luxury.

So what to do?

Sylvia’s dilemma is three-fold: assessment needs to a) serve learning objectives, b) minimise opportunities for cheating, and c) be practical to turn around with reasonable speed. We’ve not had the chance to speak about her specific situation, so what follows should be read more generically.

My personal view is that we always have to place learning objectives first in assessment: do we test for the things that we consider it essential that the students should have learnt?

Any course or module covers a variety of elements: substantive knowledge; research skills; presentational and interpersonal skills; and more general aspects of critical thinking and building confidence. That breadth is important, because it underlines that ‘knowing facts’ isn’t the be-all and end-all here: even for us in academia, we probably make as much use of the skills and competences we gained from our study as we do of the knowledge (and we’re at the high end of the spectrum of knowledge use).

Sylvia mentions vivas as a nominally ideal form of assessment, possibly because they’re interactive and personal and offer lots of opportunities to test how far a student can work with what they know. Having sat through vivas for a course of 100 students, I could point up some issues, but the analysis still holds: here’s something that better serves the learning objectives.

So are there other ways to get that same benefit without the big time implications of a viva system?

Two strategies suggest themselves, if we’re treating final written exams as an unsatisfactory option: different formats and collaborative working.

Asking students to produce posters, infographics or podcasts not only opens up different ways of presenting material, but also requires considerable distillation of substantive knowledge into key points, which in turn stimulates more critical engagement. Yes, students will be unfamiliar with the practicalities, but this can be covered with some preparatory sessions, and it develops presentational skills that might be otherwise neglected.

If you want to stick to text, then asking for shorter written pieces – policy briefs, submissions to a court – can also keep the focus on distillation, plus give experience in formats they might encounter in their work (unlike a long-form essay).

And all of these options could be used with collaborative approaches too. Learning to work together is a valuable skill [he writes from his shed], so why not test for that? Group projects can be marked for the group as a whole, plus individual marks for short reflective pieces on what each person contributed and got from it.

Of course, free-riding is an issue, and some disciplines might encounter accreditation barriers on collaborative assessment, but the problems need not be any greater than for final exams.

The right answer will vary from case to case: your capacities; the nature of your course; your institution’s attitude; the willingness of your students to buy into it. But these discussions are still worth having. Just because things have ‘always been like this’, doesn’t mean they should continue like this, especially if it’s not working for you or your students.

If you have more ideas on this, or want to chat about your assessment, drop me a line and we’ll talk.

The interrelation between attendance and writing assignment in a PBL course

This guest post comes from Patrick Bijsmans (Maastricht University) and Arjan Schakel (University of Bergen)

In one of his recent contributions to this blog, Chad asks why students should attend class. In his experience

[C]lass attendance and academic performance are positively correlated for the undergraduate population that I teach. But I can’t say that the former causes the latter given all of the confounding variables.

The question of whether attendance matters often pops up, reflected in blog posts, such as those by Chad and by Patrick’s colleague Merijn Chamon, and in recent research articles on the appropriateness of mandatory attendance and on student drop-out. In our own research we present strong evidence that attendance in a Problem-Based Learning (PBL) environment matters, even for the best students, and that attending or not attending class also influences whether international classroom exchanges benefit student learning.

Last year we reported on an accidental experiment in one of Patrick’s courses that allowed us to compare the impact of attendance and the submission of tasks in online and on-campus groups in Maastricht University’s Bachelor in European Studies. We observed that attendance appeared to matter more for the on-campus students, whereas handing in tasks was important for the online students.

This year the same course was fully taught on-campus again, although students were allowed to join online when they displayed symptoms of or had tested positive for Covid-19 (this ad-hoc online participation was, unfortunately, not tracked). We did the same research again and there are some notable conclusions to be drawn.

In the first-year BA course that we looked at, students learn how to write a research proposal (see here). The course is set up as a PBL course, so it does not come as a big surprise that attendance once again significantly impacted students’ chances of passing the course.


Learning about teaching


Last night found me at our kids’ school, for a talk on revising. Aside from being a reminder of how quickly people decide that facemasks aren’t prudent any more, it brought home some lessons about the way we construct teaching for others.

The talk was primarily a run-through of what will be happening after Easter with exams, plus subject-specific sections on useful resources and good revision practice. Its content was much as you imagine, and as familiar to me (as a teacher) as it was to my daughter (who’s now on her third time of hearing it all in as many weeks).

So what’s worth mentioning here, on a site devoted to university education, where we don’t (usually) draw parents into it all?

The teachers here have clearly had some training on revision, including some useful models of ‘how to revise’, which they brought to the table. But what was missing (for me at least) was an unpacking of how revision fits into the broader process of learning.

Back at the start of the year, we got a welcome talk about what the next cycle for our daughter would be (the two years up to her first major external exams). In that was lots of stuff, but not so much about how keeping materials and notes would be a key part of ‘revising’, in the sense that they discussed last night. Revision implies vision a first time, and all the revision techniques set out in the current talk require a baseload of substantive knowledge and understanding to be able to produce the materials for effective assessment performances.

Put the other way around, if you’d not done the work until now, having six weeks until the tests to revise as the school would like you to is not a viable proposition.

And this is where this all matters for you (and me). Assessment (and by extension, revision) is too often treated as a separate element of the educational experience; something tagged on at the end, just because we have to.

Instead, assessment is an integral part of learning and should be handled as such, a logical extension to what happens in a class and in a student’s broader package of work through a programme.

This disconnect was evident in a couple of other places too, last night.

One of the teachers asked that students didn’t come to them, asking for ‘help with something vague’, but rather with a precise and focused query: ‘I have tried to do this past paper question on topic X and I can’t seem to make sense of it, despite several tries’, seemed to be the preferred line.

Now, as a teacher, I appreciate that more precision means more scope to get into the nuts and bolts with a student, but I also appreciate that the bigger problem is students not coming to ask for help at all. If I were a student who was struggling, being told I needed to come with a precise inquiry would strike me as all the more daunting.

Here the issue is one of assumption-making about student engagement and buy-in to the programme of study. Even the most wonderful teaching set-up does not guarantee that engagement and we always have to be alert to those students that haven’t found their place within it.

That’s best treated not as the fault of the student, or the teacher, but of the specific instance. In a university setting we have more discretion to change and adapt that instance to accommodate individuals on a different path, but in a much more prescriptive system – such as that found in schools – the need to nudge/shove everyone into the same track is much greater.

The key take-home for me from all of this is that we need to be thoughtful about how we communicate with our students. That means not simply setting out what to do, but rather explaining what we’re trying to achieve (in the broad and narrow senses): it doesn’t stop us from recommending techniques to follow, but it does then require us to explain why these might work.

Since I don’t want to paint our school in a completely bad light, they did do this last night when talking about planning revision. As was explained, prioritising topics is a key first step in making a revision timetable: the focus should be on what’s less comfortable or familiar, because that’s where the biggest gains can be, rather than sticking to the stuff you know.

Of course, sometimes even the stuff you know turns out to be not as simple as you might think.

Academics as part of a community

Striking, but probably not enough

The invasion of Ukraine raises many questions, both academic and practical: certainly it has the feel of an event that will shape a lot of lives for a very long time.

Part of that has been the question of how to respond, as an individual and as part of a community.

I’ve had to deal with this a bit more than many, as a function of chairing UACES, the UK’s European Studies association.

We have a few Ukrainian (and Russian) members, plus many more with personal or familial ties, reflecting the entangled nature of European society.

Like many other academic bodies, we issued a statement that condemned the invasion and called on members to help where they could. However, we also didn’t want to just leave things there.

As a result, we’ve carried on working to link up with bodies that can provide direct support, such as CARA and Scholars at Risk, to try to maximise the connections that exist, while also opening ourselves up to help people more individually, as best we can.

How much help that ends up providing remains to be seen, but already at this stage it has underlined for me the importance of not falling into the performative trap of simply declaring “X is Bad” and then moving on.

If you can see ways we could be doing more, then I’m really keen to hear them. Each of us might not be able to do much to address or remedy the situation, but equally it behoves us all to try where we can.

Creating Wicked Students 2

As promised last week, here is an example of a wicked problem I’ve given to my comparative politics class.

  • You are an employee of The Scowcroft Group.
  • Volkswagen wants to expand into a new African market.
  • Setting up production facilities and distribution channels will take three years.
  • Which sub-Saharan African country should Volkswagen choose to expand into?
  • Your task is to compare risk to political stability for two sub-Saharan African nation-states, and choose the one with the least risk.
  • Use ≥ 1 quantitative and ≥ 1 non-quantitative indicator.
  • Present your recommendation to Volkswagen’s CEO and board of directors.
  • You have 15 minutes to create a 3-minute presentation.

I show the instructions, small teams of students work on the problem, and each team presents its solution. I grade the presentations using this rubric: