The simplicity-complexity dilemma

Having been all chuffed with how my EU simulation was received in Prague at EuroTLC, I read Patricia’s post about using doughnuts to model a two-level game with a mixture of admiration and jealousy.

The admiration comes from the elegance of its design and jealousy from the feeling that I’d not come up with something nearly as good.

So, props to Patricia (and hello to my local doughnut vendor), but it also raises an interesting question that was niggling me in Prague and which has been a long-running debate here on ALPSblog, namely the tension between making something ‘realistic’ and drawing out the essence of a situation.

It’s a generic problem for all teaching and learning: we can’t (or shouldn’t) hope to describe and explain every last thing in the world around us, so we use heuristics of theory and extrapolation to provide ‘good enough’ models.

Similarly, when building simulations or games, we’re trying to draw out the key processes and dynamics, to expose them for students to see them more clearly and to then take them back to the building of their understanding of the world.

The difficulty comes, of course, in deciding what’s important and what’s not.

The great strength of Patricia’s exercise is that it’s all about the two-level game: it’s lean on its specificity to doughnuts, so it can be used to illustrate any two-level game.

My game is much less lean (and doesn’t provide any tasty pay-offs), but it does include some other mechanisms that I consider important for my students’ understanding, namely the role of outside parties, the consequences of particular choices and the potential to challenge the entire premise of the activity.

Neither is ‘right’ in its approach, but each stresses different aspects.

To make the point, my session in Prague largely consisted of participants talking about what else they could add in to do other things. Again, not right or wrong, but different emphases.

These things can potentially be crippling, to both designer and user.

For the designer, the fear of missing something out can mean throwing in too much, especially if you’re relatively new to the process. It takes a degree of courage to strip things right back to one thing and to accept that it doesn’t do it all.

For the user, the anxiety that you’re not hitting all (or enough) of your learning outcomes might mean a desire to shovel more into a scenario, or to feel you have to play multiple activities, or even to drop them all and stick with the lecture.

The key to unlocking all these is to talk with students. It might not have been the first point you picked up from Patricia’s post, but the post-game debrief is essential: getting students to talk through what they’ve done and then to connect it to the rest of their learning.

It’s here that the process of essentialisation actually becomes a positive help: asking what was and wasn’t realistic about what they did can open up big areas of discussion and debate and invites thoughtful consideration about what else is happening in a given scenario.

Put differently, not trying to do it all helps to point up the things you’re not trying to do.

My core rule in designing activities has long been KISS: keep it simple, stupid. If I can’t be clear about what I intend the activity to do, then I can’t expect anyone else to be clear.

Indeed, by seeking out that core process, I’m also trying to make sense of a phenomenon: to see my students then play that out also helps me to see if my sense is a useful or instructive one.

If you like, this is another example of the value of gaps: rather than trying to do it all for students, by leaving things open we can encourage them to think and develop for themselves. Which is rather the point.

The Doughnut Negotiation: Win-Sets with Sprinkles

Today we have a guest post from Dr. Patricia Blocksome, Assistant Professor of Social Science, US Army School of Advanced Military Studies. She can be reached via her LinkedIn profile at https://www.linkedin.com/in/pblocksome/.

 

Putnam argues that international negotiations between states occur simultaneously with domestic negotiations between intrastate coalitions – the two-level game. At the domestic level, politicians have to form coalitions large enough to ratify an international agreement. These domestic coalitions establish the win-set, the spectrum of acceptable outcomes for the state. At the international level, each state attempts to achieve an agreement that falls within its domestic win-set. When states have overlapping domestic win-sets, an international agreement is possible. Negotiations can occur concurrently over two or more different issues, leading to potential trade-offs, where a gain in one area can offset a loss in another.
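To make the win-set logic concrete, here is a minimal illustrative sketch (in Python; the numbers and the tariff framing are invented, and it is not part of Patricia’s exercise) that treats each state’s domestic win-set as an interval of acceptable outcomes and checks whether an agreement zone exists where the two intervals overlap.

```python
# Illustrative sketch of Putnam's win-set overlap condition: an international
# agreement is possible only where the two states' domestic win-sets overlap.
# The intervals and the tariff framing below are invented for illustration.

def win_set_overlap(win_set_a, win_set_b):
    """Return the overlapping range of two win-sets given as (low, high)
    tuples, or None if no mutually ratifiable agreement exists."""
    low = max(win_set_a[0], win_set_b[0])
    high = min(win_set_a[1], win_set_b[1])
    return (low, high) if low <= high else None

# Hypothetical example: each state's domestic coalition will only ratify
# outcomes inside these bounds (say, a tariff level in percent).
state_a = (5, 20)   # State A's domestic win-set
state_b = (15, 30)  # State B's domestic win-set

print(win_set_overlap(state_a, state_b))  # (15, 20): any deal here can be ratified by both
```

Trade-offs across concurrent issues could be sketched the same way, with one win-set per issue and side-payments shifting the bounds.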

So how does this apply to doughnuts? Continue reading

Model Diplomacy: Smart, easy to use foreign policy simulations

Model Diplomacy is a series of free National Security Council simulations put out by the Council on Foreign Relations. Michelle used it in her class last year, and I decided based on her positive experience to try it in my intro to IR class this year.  In this post I’m going to explain the basics of Model Diplomacy, discuss my experience using it, and give my recommendation.  Spoiler Alert: I loved it.

Continue reading

Snippets from Comparative Politics

Some end-of-the-semester thoughts on my comparative politics course, in relation to a post from the beginning of the semester and to Simon’s post last week about a framework for active learning.

First, the simple stuff:

Running this course with only ten students at 8:00 a.m. is problematic, for reasons I have mentioned before. Lack of students definitely decreases the level of activity in my Gerkhania simulation. Attendance has picked up but is still only eighty or ninety percent, so in the future I really need to give pop quizzes — in paper, rather than electronic, form — on a semi-frequent basis.

I have noticed a problem with the reading responses. For these assignments, I usually pair an article from an academic journal — often the Journal of Democracy — with shorter and more current items from news outlets like The Atlantic, Politico, and The New York Times. Some students developed the habit of reading only the latter and ignoring the former. I need to force students to read the journal articles, but haven’t quite figured out the best way of doing this.

Now for the complex stuff: Continue reading

Building a grid for measuring the effect of Active Learning

The great thing about colleagues is the way that they get you to move beyond yourself. Reading Peter’s summary of our Nicosia discussion is a case in point, setting out our agenda in a way that makes me want to write more about the ideas involved.

That means the dream I had last night about how to run my negotiating course will have to wait until next week, for which we might all be grateful.

At the centre of Peter’s idea is the creation of a framework that would allow colleagues to engage in a more systematic and rigorous examination of the effects of Active Learning. In so doing, it plots a middle path through the challenges I set out before.

On the one hand, a framework can be too vague, offering no real purchase on the issues involved, nor a mechanism for comparison of individual pieces of research, even if it would have the benefit of flexibility.

On the other, prescription might guide the work much better, but at the risk of missing out important elements. And that’s after the long, hard struggle to agree such a detailed model in the first place.

The compromise approach suggested by our discussions is to divide the big question of ‘what effects?’ along three discrete and meaningful dimensions.

The first is to unpack ‘Active Learning’. Our workshop alone contained simulations, creation of videos, semi-structured facilitated group discussion, problem-based learning and more: each rather different, each brought together by not much more than the placing of the student in the centre of the learning activity.

Indeed, much of my informal conversation in Nicosia was precisely about what makes Active Learning, Active Learning. Given that variety, it’s difficult to come up with a definition that includes everything listed above but excludes something like a lecture. And there’s a question about whether lectures should be excluded in any case: colleagues using EVS might feel that they’re doing Active Learning.

And no, I didn’t get to an answer on this one. There’s maybe something in thinking about learning as being about stimulus-response, with active learning focused more on the response element, but by that point I was feeling that I was hopelessly out of my depth and in need of an educational scientist with some emergency theory.

Digressions aside, this dimension logically matters: the type of thing you do in your learning environment should influence what students learn from it. By differentiating across the variety, we might be able to spot commonalities and differences, especially as it doesn’t a priori exclude consideration of the effect of non-Active Learning situations too, as a benchmark.

Which leads to the second dimension of types of effect.

Here again, much discussion ensued in Nicosia about what types of effect to consider and how to group them. As I’ve discussed already, Bloom’s tripartite division into cognitive, affective and psychomotor domains forms an obvious starting point, even if you can have a discussion about whether something like self-confidence is a skill or a disposition or something else.

However you resolve this one, there are still the three main areas of ‘facts’, skills and attitudes. Clearly one can break each of these down into more specific elements, and consider interactions between each of them – if my students enjoy it more, do they learn more facts? – but this does at least begin to structure the range of what we might consider.

The third dimension – of context – is somewhat different, since it’s not about the activity per se, but rather the environment in which Active Learning takes place. Several of our papers dealt with school children rather than university students, posing a question of whether this made any fundamental difference.

My personal experience makes me think that it is more a difference of degree than of kind: higher levels of confidence and knowledge allow university students to take simulation scenarios further than school pupils, in terms of depth, realism and reflection. However, others find rather different dynamics, which suggests that differentiating across this dimension might hold value.

Again, we come back to the impact of types of Active Learning and to the scope and magnitude of effects.

And this might be the biggest challenge: measurement.

Peter didn’t try to specify minimum or common standards for measuring effects, in part because of the scale and scope indicated by the three dimensions. However, we have to hope that as we start to work on this, we might all develop a better sense of what works, and how: to take the obvious example, some techniques will work better than others for capturing different effects.
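As a way of picturing what such a grid might look like in practice, here is a minimal, purely hypothetical sketch (the field names and example values are my own, not anything agreed in Nicosia): each record places one measured effect along the three dimensions, so that studies sharing two dimensions can be compared on the third.

```python
# Hypothetical sketch of the three-dimensional grid: each record locates one
# measured effect of (non-)Active Learning along the dimensions discussed above.
# Field names and example values are invented for illustration only.
from dataclasses import dataclass

@dataclass
class GridEntry:
    activity: str     # dimension 1: type of Active Learning (simulation, PBL, video-making, ...)
    effect: str       # dimension 2: type of effect ('facts', skills, attitudes)
    context: str      # dimension 3: setting (school, undergraduate, postgraduate, ...)
    measure: str      # how the effect was captured (e.g. a pre/post questionnaire)
    magnitude: float  # reported size of the effect, on whatever scale the study used

grid = [
    GridEntry("simulation", "attitudes", "school", "pre/post questionnaire", 0.6),
    GridEntry("problem-based learning", "facts", "undergraduate", "exam scores", 0.3),
]

# Entries sharing two dimensions let us compare across the third, e.g. the
# same activity and effect type in school versus university contexts.
```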

So, a plan. And a grid.

On to the next step.

Sticking it to STEM: getting school kids into Politics

This guest post is by Karen Heard-Lauréote, Reader in European Politics at the University of Portsmouth.

My STEM-based colleagues are always going out to “feeder” schools and blowing stuff up (in contained experiments of course), conducting maths magic and playing with Meccano to design crazy structures in an effort to encourage pupils (especially girls) to consider studying one of their subjects at University. And there’s a lot of money sloshing around in the STEM subject promotion kitty to do this.

In the humanities and social sciences we have far less spectacular tricks up our sleeve to boost interest amongst school pupils in our disciplines and encourage them to aspire to apply for one of our courses. Let’s be honest – taster lectures are about as innovative as it sometimes gets when we political scientists do school outreach.

In a climate of decline in humanities and social sciences recruitment and funding, and in a context of widening participation in HE, the time has perhaps come to join our STEM colleagues and put a few fireworks into our own outreach activities.

And so, as a keen advocate of active learning in my university-based UG- and PG-level pedagogy, I thought about using EU political decision-making simulations as an outreach tool in schools. School funding for careers activities and support has been hugely reduced in recent years, and it turns out that schools are only too willing to get local HE providers in to do such activities – particularly in the last week of term, when the teaching staff are exhausted!

The idea is simple. We developed a crisis-meeting scenario which had sufficient verisimilitude to a real phenomenon (in our case the Calais refugee crisis) but reduced the complexity of the decision-making process and took some liberties with the “facts” to make the scenario manageable to simulate in 3 hours and as close to the pupils’ own experience as possible (swapping Calais with Cherbourg, which has a direct ferry route to Portsmouth).

We developed role cards for actors ranging from the CEO of Brittany Ferries to local council and city leaders, local MPs, and local NGO and business groups, and went into the school a week before the simulation to assign roles and instruct pupils on how to prepare. A week later we came back and ran the simulation.

It was a hoot!

We saw pupils fully assimilate and inhabit their roles – a few became so entrenched in the arguments of their character that they surprised both themselves and their teachers with their enthusiasm for negotiation, problem-solving, diplomacy and the use of political rhetoric to persuade others. Political science, a field of study that school pupils may previously have perceived as abstract, dry and serious, suddenly became alive, attractive and exciting in the context of the simulation.

So apart from being a great deal of fun, what does this kind of activity tell us about active learning? The results of a pre- and post-event pupil questionnaire showed us three main effects of simulations used in this context.

First, the simulations increased the participants’ interest in pursuing university degrees in fields cognate to EU politics. They boosted pupils’ interest in studying social sciences at university, thus raising aspirations, and, most interestingly, they more specifically boosted interest in studying political science and IR (where many of them placed European politics – but that’s another debate) as university subjects.

Second, the simulations increased the participants’ self-assessed knowledge of EU politics.

Third, the simulations increased the importance participants placed on understanding the workings of the EU.

Taken together, these findings support our claim that EU-related simulations may be used as outreach tools to increase interest in pursuing EU-related subjects at university level.
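For anyone wanting to replicate that measurement, a minimal sketch of the pre/post comparison might look like the following (the questionnaire item and the responses are invented; the post does not reproduce the actual instrument or data).

```python
# Illustrative sketch of a pre/post questionnaire comparison; the item wording
# and the responses are invented, not the data from the Portsmouth event.
from statistics import mean

# Self-assessed agreement (1 = strongly disagree, 5 = strongly agree) with
# "I am interested in studying politics or IR at university", one value per pupil.
pre = [2, 3, 2, 4, 3, 2, 3]
post = [4, 4, 3, 5, 4, 3, 4]

changes = [after - before for before, after in zip(pre, post)]
print(f"mean change: {mean(changes):+.2f}")  # a positive value indicates increased interest
```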

In the humanities and social sciences we may not have the safety goggles, Bunsen burners, medical instruments, Meccano sets and other paraphernalia associated with STEM subjects to wow and amaze school children, but we do have powerful ideas and debates which, with a little nurturing of contacts in schools, we can explore in a fun way through the use of active learning techniques.

Simulations as an outreach tool to boost general interest in HE participation and specific interest in European politics could be worth a try.

First Annual Teach, Play, Learn Conference

Announcing the first annual Teach, Play, Learn Conference on Friday, June 22, 2018, at Indiana University South Bend. The goals of the conference are to:

  • generate awareness and interest in the changing technologies and pedagogies in the quickly evolving area of educational games and playful learning.
  • demonstrate benefits of using games as part of classroom education.
  • showcase practical solutions for the design and implementation of games in the educational context.

Deadline for proposal submission is April 27. Details are here.

Abandon: Fall 2017 edition

As promised in my last post about teaching risk-averse students, I am going to again apply Simon’s ABC technique to last semester’s teaching. And since I taught two sections of my first-year seminar, I’ll focus on that.

First item on the “abandon” list: in-class peer review of student-designed games. Although I think the rubric that students use to evaluate classmate-designed games is good, they simply refuse to use it to provide honest feedback. I know that the majority of the students understand at least some of the principles reflected by the rubric because of the way they have analyzed the games in the final exam. In the classroom, though, they rate the games as perfect. A potential replacement for the peer review process — and this is really more of a “begin” item — is a short writing assignment after each round of game design in which they compare the game their team designed with another team’s game that they played in class.

Second thing to abandon: my organization of memo-writing assignments. I have assumed, incorrectly, that first-semester college students can grasp the purpose and format of a memo with minimal instruction on my part. After three separate iterations of the assignment, complete with an opportunity to rewrite each memo, I didn’t see significant improvement in the quality of students’ work, which was the same thing that happened in the course last year. A possible solution is to walk students step by step through the mechanics of writing a memo in class, so that by the end of the process they have in their hands a  document that they can submit for a “perfect” grade. But this would remove pretty much any opportunity for students to independently engage in creative thinking, which is another term for problem solving. More holding of students’ hands to protect them from anything they might find unpleasant. I’ll have to think more about how to better organize an assignment like this.

Third item on the list, which is speculative at this point: abandon the whole course. I’ve been teaching this first-year seminar since its inception four years ago, when it came into being through a revision to my university’s general education requirements. The developmental rationale for the course is not well-defined, and the learning outcomes associated with it are mostly not measurable. Valid data on how the course may be of benefit to students simply isn’t being collected, which means that it is really nothing but an empty three-credit curricular requirement. While I think the topic on which I have built the course is a good one, I am finding it less enjoyable to teach over time. And interaction with university administrators about aspects of teaching it has been less than satisfactory. So, if I have the opportunity in future fall semesters not to teach the course, I might take it.