Model Diplomacy is a series of free National Security Council simulations put out by the Council on Foreign Relations. Michelle used it in her class last year, and I decided based on her positive experience to try it in my intro to IR class this year. In this post I’m going to explain the basics of Model Diplomacy, discuss my experience using it, and give my recommendation. Spoiler Alert: I loved it.
My diary tells me that today I’m uploading materials to the EuroTLC website, in anticipation of the third conference at the end of the month.
EuroTLC is something we’ve written about before (here), but it’s worth revisiting how it differs from APSA’s TLC, which had been something of an inspiration.
While TLC has kept a rather academic bent to its work – streamed workshops focusing on papers – EuroTLC has been more interested in applied approaches: how to do stuff in the classroom. That’s evident in the structures – lots of practical sessions and a variety of formats – and in the general collective model of lots of different organisations chipping in.
Of course, there’s the practical point that getting together enough papers to run a TLC-style event is very difficult, and indeed rather redundant, given the existence of TLC itself. There’s never been a desire to cannibalise TLC, but rather to fill a gap that was felt to exist in the market.
But why does all this matter?
Well, building on the work I’ve been doing in Nicosia, EuroTLC is a good moment to advance that agenda. Some of the Joint Session participants will be there too, so it’s an obvious jumping-off point to get some more buy-in from colleagues.
And here the nature of EuroTLC becomes more relevant. If participants are more interested in ‘doing stuff in the classroom’ than in research per se, how do we make the connection?
Of course, this is a slightly moot point, since I’m aware that the two things aren’t mutually exclusive and – more to the point – that interest in one often co-exists with the other. I want to do better and more useful things in my classroom, so I’m interested in what constitutes ‘better’ or ‘more useful’.
At the same time, I know from past EuroTLCs that playfulness is not always easily aligned with rigour. The opportunity to try out new pedagogic things is a joy in itself – the moment of thinking “why did no one ever try this before?” – but it’s not the same as attempting a methodical and dispassionate analysis. Put bluntly, sometimes it’s just enough to be trying something new.
But this is rather why Peter and I set up our Cypriot workshop in the first place: neophilia isn’t enough. And that’s not even getting to the dull fact that most things have been tried before; it’s just that we didn’t know about them. Which is the point.
I’m all for exploring what we do, but that doesn’t have to be without a map. Indeed, a map might point us to new (for us) things that actually work.
Talking with participants at other EuroTLCs (and TLC, for that matter) I often encounter the sentiment – usually the next day – of “well, it was fun, but was it any good?”
That’s the moment to connect the different elements.
Now I just need to loiter around hotel lobbies and airport departure lounges to buttonhole people.
You’ve been warned.
Some final thoughts on my globalization course this semester:
First, the community partnership with a local non-profit organization might have worked better in a research methods course. Because I structure my courses around daily reading and writing assignments, they are effectively hybrid in design. Initial encounters with and applications of knowledge happen outside the classroom. This makes it easier for me to cut content from a course if needed, and in a community partnership, something planned before the semester begins always needs to be abandoned midstream.
As I mentioned in Part 4 and Part 5 of this series, this time around the ethnography assignment was what got left by the side of the road. But my decision was driven by the lack of quality data gathered by students, not because the time that students spent working on behalf of the community partner was greater than expected. I knew going into the course that I would not be creating a bunch of Margaret Meads, but the classroom instruction on field interviewing did not produce the level of proficiency necessary to complete an assignment that students had never before encountered.
A lack of facility in working with data also showed up in the infographic assignments that replaced the ethnography. Students’ infographics included percentages calculated from the data they had collected, but the percentages often did not reflect informative observations about local patterns of food consumption.
Though this course was 100-level, the students in it ranged from first-years to seniors. So the lessons here for me are, first, that I should not assume that the students who enroll in this course have any prior training in working with data, and second, that a project of this nature requires a full semester devoted to teaching research skills, not a brief introduction wedged into a course whose focus is on acquiring topical knowledge. In sum, I tried to do too much within the confines of a single course.
Second, if I think students should gain a better understanding of community, I need to do a better job of getting them to define and work with the concept. The maps of the local community that students drew at the end of the semester did not vary much from the maps they drew at the beginning, and their discussion of community in the end-of-semester meta-cognitive assignment was often unfocused. To be fair, I now think the prompt I created for the assignment was itself needlessly complex — one of my bad habits.
So, as usual, when I teach this course again next year, it will be back to the drawing board for more changes.
Links to all posts in this series:
I’m pleased to report that even after a gap of several years, I still recently managed to destroy a colleague’s enjoyment of The Lego Movie by pointing out its representation of fascism, including the Newspeak of “everything is awesome.”
Such found objects are valuable, not simply as a way of robbing the joy from quality time with the kids, but also a way into discussing complex political issues.
This resurfaced for me once again, for a couple of reasons.
Firstly, there was a very interesting piece on the morality of superheroes, which built on the emerging questioning within the Hollywood system of whether masked individuals meting out extra-judicial ‘justice’ might not be quite the unmitigated good it once was portrayed as.
(And yes, I know that graphic novels got to this a long time ago, but we’re talking here about a form that a lot more people consume.)
The second was the consequence of being left home alone and watching The Hitman’s Bodyguard (THB), which I shall not review beyond noting a key piece of action occurs in Coventry.
It’s a classic odd-couple buddy movie, with many wisecracks and location scenery, and for that it’s very run-of-the-mill.
However, the story turns on genocide and responsibilities to act (in various ways). There’s a bunch of ethics thrown in, although not enough that anyone seems to notice the jarring effect of key characters laughing about ‘ass’ as they walk through the scene of a bombing.
All of which suggests that there might be two levels of discussion one could have with students about the issues involved.
At the obvious level, there’s the ethics as promoted by the film(s) you discuss. In the case of THB, there’s a tension between natural and judicial justice, as well as between means and ends. There’s even an element of the balance between structure and agency, in the discussion about life-partners, that might open up some useful lines of debate.
As the article notes, such overt discussion of the great responsibilities of great power is becoming more common in superhero movies, which might be a reflection of producers’ increased confidence in what audiences can handle, or might simply be because fighting people eventually runs out of steam. But the consequence is that ethics, even if it is ethics-by-numbers, is there on the screen to be considered. And if you have a class that’s still getting to grips with the basics, then this is as good a way in as any.
But there’s also the less-obvious layer of discussion: the kind of stuff that’s either not mentioned or not even obviously considered by the movie’s makers.
To take an obvious example, THB isn’t about gender, but it’s also about gender. That’s clear from the gendering of roles, the rescuing of women and the occasional knob joke. I’m guessing it’s not what the director wanted me to think about, as I watched, and I’m also guessing it’s not what the director thought very much about either, but that’s precisely the point. Such dimensions get woven into the fabric of a cultural product, and it is for us to notice and unpick those.
Culture invites multiple readings, and so let’s try doing just that. Wikipedia tells me THB got ‘mixed reviews’, and I can believe that: any film that portrays such a lax depiction of border controls deserves to be challenged.
First, the simple stuff:
Running this course with only ten students at 8:00 a.m. is problematic, for reasons I have mentioned before. Lack of students definitely decreases the level of activity in my Gerkhania simulation. Attendance has picked up but is still only eighty or ninety percent, so in the future I really need to give pop quizzes — in paper, rather than electronic, form — on a semi-frequent basis.
I have noticed a problem with the reading responses. For these assignments, I usually pair an article from an academic journal — often the Journal of Democracy — with shorter and more current items from news outlets like The Atlantic, Politico, and The New York Times. Some students developed the habit of reading only the latter and ignoring the former. I need to force students to read the journal articles, but haven’t quite figured out the best way of doing this.
Now for the complex stuff:
At the invitation of the blog owner, Chad, I’m back to continue the theme of what I’ve learned in the last year of building my own business doing dissertation and academic coaching and freelance editing. This begins a two-part series on common problems I’ve seen while working with faculty on their research, project, and time management. This is part 1 of 2.
Faculty usually begin their careers trained to do one thing: research. If they’re lucky, they’ve been trained to teach, at least a little bit, too. But no one ever begins their career trained in administration and management. Those are, theoretically, on-the-job skills that you pick up on the way. As a result, most faculty have vastly underdeveloped systems for managing administrative processes: committee work, cycles of paperwork like monthly meeting agendas, required paperwork for grants and other funding, and the most dreaded one of all – email.
For most of us, email becomes the default way of managing our committee work, paperwork, and other not-research-but-still-necessary-business. Which means, then, that a system to manage our email becomes a necessity. That system needs to comprise two parts: incoming management, and archiving management.
Managing incoming email needs to be something that you do deliberately, not something done haphazardly. I recommend setting aside 2-3 times per day to process your inbox. Anything that can be answered in 3 sentences or less gets a response; the rest get deleted, archived immediately if appropriate, or placed in a specific folder or given a tag/flag indicating that follow-up is required. Then, once a day, have a dedicated time for churning through the things that require more detailed follow-up. Set a designated amount of time for this and stick to it. That doesn’t mean you can’t tackle one or two semi-quick ones if you have 10 minutes between meetings, but it does mean that email becomes a designated, deliberate task, rather than an interstitial one.
The second part of email management is archiving. The goal is to keep your inbox containing only those things that are active: ongoing conversations, tasks you’re working on, things you need to follow up on. Everything else that’s closed should be either deleted or archived into a system of folders. Most of us are reasonably good at this, but it’s a good idea to make part of your Friday shutdown routine a quick cleanout of the inbox to archive anything that’s been completed that week that hasn’t already been put away so that you can start the week with an empty inbox.
These and other skills are things I can help you develop through academic coaching. If you’re interested in academic coaching, the summer is a great time to start. It gives you a chance to develop and solidify new or better habits before the chaos of term time arrives. Feel free to take a look around my website at http://www.leannecpowner.com/coaching/ and if you’re interested, drop me an email at Leanne@leannecpowner.com . The initial consultation is free. You can also follow me on Facebook at https://www.facebook.com/LeanneCPowner/ or Twitter @LeanneCPowner for free daily writing tips.
The great thing about colleagues is the way that they get you to move beyond yourself. Reading Peter’s summary of our Nicosia discussion is a case in point, setting out our agenda in a way that makes me want to write more about the ideas involved.
That means the dream I had last night about how to run my negotiating course will have to wait until next week, for which we might all be grateful.
At the centre of Peter’s idea is the creation of a framework that would allow colleagues to engage in a more systematic and rigorous examination of the effects of Active Learning. In so doing, it plots a middle path through the challenges I set out before.
On the one hand, a framework can be too vague, offering no real purchase on the issues involved, nor a mechanism for comparison of individual pieces of research, even if it would have the benefit of flexibility.
On the other, prescription might guide the work much better, but at the risk of missing out important elements. And that’s after the long, hard struggle to agree such a detailed model in the first place.
The compromise approach suggested by our discussions is to divide the big question of ‘what effects?’ along three discrete and meaningful dimensions.
The first is to unpack ‘Active Learning’. Our workshop alone contained simulations, creation of videos, semi-structured facilitated group discussion, problem-based learning and more: each rather different, each brought together by not much more than the placing of the student in the centre of the learning activity.
Indeed, much of my informal conversation in Nicosia was precisely about what makes Active Learning, Active Learning. Given the variety, it’s difficult to come up with a definition that includes everything listed above but excludes something like a lecture. And there’s a question about whether lectures should be excluded in any case: colleagues using EVS might feel that they’re doing Active Learning.
And no, I didn’t get to an answer on this one. There’s maybe something in thinking about learning as being about stimulus-response, with active learning focused more on the response element, but by that point I was feeling that I was hopelessly out of my depth and in need of an educational scientist with some emergency theory.
Digressions aside, this dimension logically matters: the type of thing you do in your learning environment should influence what students learn from it. By differentiating across the variety, we might be able to spot commonalities and differences, especially as it doesn’t a priori exclude consideration of the effect of non-Active Learning situations too, as a benchmark.
Which leads to the second dimension of types of effect.
Here again, much discussion ensued in Nicosia about what types of effect to consider and how to group them. As I’ve discussed already, Bloom’s tripartite division into cognitive, affective and psychomotor domains forms an obvious starting point, even if you can have a discussion about whether something like self-confidence is a skill or a disposition or something else.
However you resolve this one, there are still the three main areas of ‘facts’, skills and attitudes. Clearly one can break each of these down into more specific elements, and consider interactions between each of them – if my students enjoy it more, do they learn more facts? – but this does at least begin to structure the range of what we might consider.
The third dimension – of context – is somewhat different, since it’s not about the activity per se, but rather the environment in which Active Learning takes place. Several of our papers dealt with school children rather than university students, posing a question of whether this made any fundamental difference.
My personal experience makes me think that it is more a difference of degree than kind: higher levels of confidence and knowledge allow university students to take simulation scenarios further than school pupils, in terms of depth, realism and reflection. However, others find rather different dynamics, which suggests that differentiation across this might hold value.
Again, we come back to the impact of types of Active Learning and to the scope and magnitude of effects.
And this might be the biggest challenge: measurement.
Peter didn’t try to specify minimum or common standards for measuring effects, in part because of the scale and scope indicated by the three dimensions. However, we have to hope that as we start to work on this, we might all develop a better sense of what works how: to take the obvious example, some techniques will work better than others for different effects.
So, a plan. And a grid.
On to the next step.
This guest post comes from Peter Bursens, University of Antwerp.
In a previous post Simon referred to the lavish Cypriot mezze as a metaphor for the discussions during our ECPR workshop on active learning. There is indeed a lot on our plate when it comes to elaborating a systematic research agenda on the effects of active learning. Thanks to the participants we now have at least a shopping list to purchase the necessary ingredients.
The primary aim of the workshop was to go beyond descriptions and good practices of active learning tools in political science. Participants were invited to collect empirical data from their active learning environment and apply appropriate methods to explore the effects on learning outcomes.
We identified five parameters to situate the papers of the workshop: the dependent variable, the independent and intervening variables, methods, data and context.
The dependent variable refers to the different types of learning outcomes. Knowledge, skills and attitudes were often used as broad categories, although these concepts were defined differently according to the theories used (cognitive, affective and regulative outcomes or cognitive, emotional and behavioural outcomes, to name just two). Other more concrete outcome variables included interest, motivation and self-efficacy. Yet other papers measured effects on civic engagement or even on the motivation to study political science in higher education.
The independent variables often referred to students’ dispositions, such as gender, age, previous education, previous experience, social capital and others. As intervening variables, the papers looked at a variety of active learning instruments. Most papers dealt with different types of simulations and role play games, but others used movies, ICT tools, learning approaches such as problem-based learning, and video production.
Papers applied a wide variety of methods: some used (advanced) quantitative statistics, others pre- and post-tests, while some used qualitative tools such as discussion groups, interviews, observations or even diaries. Often the choice of method followed the ontological position of the researcher, as most were positivist and some were constructivist-minded.
Data varied according to the methods. Survey data were most common, although some papers had response or sample issues. Most papers relied on self-reporting, while objective and observational data were rarer. Datasets ranged from a few hundred students to just four.
Finally, the context varied as well. Higher education students (though in different types of programmes and courses, and also extra-curricular events) were the most popular. Some papers also looked at secondary school pupils.
Of course, the workshop only addressed a small number of the potential questions to be asked regarding the effects of active learning environments. Nevertheless, from the workshop, a three-dimensional projection could be derived that can help the political science community to define the puzzles of a future research agenda. A typical research question for a paper within this agenda would be: what effect does active learning environment X in context Y have on learning outcome Z? A final observation regarding the research agenda is that political science would benefit from the theories and methods of educational science.
Conclusion? A lot to digest for the workshop participants. More guests at the table would be warmly welcomed!
A post about advising, a topic we haven’t talked about much on this blog — an example from last year is here — but which, depending on where you work, may be seen by your superiors as the panacea for everything from retention to student psycho-social dysfunction. Hence the push for advising to evolve from “transactional” to “transformational” (always be wary of alliteration). Since students nominally attend college to obtain an education, and faculty are the ones who formally provide that education, the responsibility for advising students frequently falls to them.
My university recently hired a consultant to evaluate the advising landscape on campus. His report highlighted several aspects of advising that are, in his view, in need of improvement:
- Constant churn in academic administrators.
- Absence of accountability for university employees whose duties include advising, whether in a supervisory or “point of service” capacity.
- Advising mechanisms designed without input from the people who hypothetically need to be advised (students).
- Information relevant to the student academic experience generated by one part of the university not being shared with other parts, something I discussed in 2012.
- Online resources that are difficult for students to locate and inconvenient to use.
Note that faculty lack the authority or resources to solve any of these problems.
So what is a faculty member to do, especially in the midst of a requirement-heavy curriculum that presents the academic path through college as a series of boxes to check off, instead of as a process that is heavily influenced by the student’s choice of social interactions? Something that I am slowly migrating toward — initially reflected in the print and digital promotional material that I have designed for my department — is to present options to students in the form of “here is the choice a past student made in this situation, and this is where that student ended up. Your results might differ, but we know that this outcome is at least possible.” I am hoping that giving advisees concrete examples like this will more effectively communicate what might be beneficial for them to know.
This guest post is by Karen Heard-Lauréote, Reader in European Politics at the University of Portsmouth.
My STEM-based colleagues are always going out to “feeder” schools and blowing stuff up (in contained experiments, of course), conducting maths magic and playing with Meccano to design crazy structures in an effort to encourage pupils (especially girls) to consider studying one of their subjects at university. And there’s a lot of money sloshing around in the STEM subject promotion kitty to do this.
In the humanities and social sciences we have far less spectacular tricks up our sleeve to boost interest amongst school pupils in our disciplines and inspire them to aspire to apply for one of our courses. Let’s be honest – taster lectures are about as innovative as it sometimes gets when we political scientists do school outreach.
In a climate of decline in humanities and social sciences recruitment and funding, and in a context of widening participation in HE, the time has perhaps come to join our STEM colleagues and put a few fireworks into our own outreach activities.
And so as a keen advocate of active learning in my university-based UG and PG-level pedagogy I thought about using EU political decision-making simulations as an outreach tool in schools. School funding for careers activities and support has hugely reduced in recent years and it turns out that schools are only too willing to get local HE providers in to do such activities – particularly in the last week of term when the teaching staff are exhausted!
The idea is simple. We developed a crisis-meeting scenario which had sufficient verisimilitude to a real phenomenon (in our case the Calais refugee crisis) but reduced the complexity of the decision-making process and took some liberties with the “facts” to make the scenario manageable to simulate in 3 hours and as close to the pupils’ own experience as possible (swapping Calais with Cherbourg, which has a direct ferry route to Portsmouth).
We developed role cards with actors ranging from the CEO of Brittany Ferries to local council and city leaders, local MPs, and local NGO and business groups, and went into the school a week before the simulation to assign roles and instruct pupils on how to prepare. A week later we came back and ran the simulation.
It was a hoot!
We saw pupils fully assimilate and inhabit their roles – a few became so entrenched in the arguments of their characters that they surprised both themselves and their teachers with their enthusiasm for negotiation, problem-solving, diplomacy and the use of political rhetoric to persuade others. Political science, a field of study that school pupils may previously have perceived as abstract, dry and serious, suddenly came alive, becoming attractive and exciting in the context of the simulation.
So apart from being a great deal of fun, what does this kind of activity tell us about active learning? The results of a pre- and post-event pupil questionnaire showed us three main effects of simulations used in this context.
First, the simulations increased the participants’ interest in pursuing university degrees in fields cognate to EU politics. The simulations boosted pupils’ interest in studying social sciences at university, thus raising aspirations, and, most interestingly, boosted their interest more specifically in studying political science and IR (where many of them placed European politics – but that’s another debate) as university subjects.
Second, the simulations increased the participants’ self-assessed knowledge of EU politics.
Third, the simulations increased the importance participants placed on understanding the workings of the EU.
Taken together, these findings support our claim that EU-related simulations may be used as outreach tools to increase interest in pursuing EU-related subjects at university level.
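For readers curious how such pre/post questionnaire effects can be quantified, here is a minimal sketch. The scores and the item wording are entirely hypothetical (the post doesn’t publish its data or instrument); it simply shows the within-pupil shift calculation one might run on paired Likert-scale responses:

```python
from statistics import mean

# Hypothetical 1-5 Likert responses to an item such as
# "How interested are you in studying EU politics at university?"
# Each position is the same pupil before and after the simulation.
pre = [2, 3, 2, 4, 1, 3, 2, 3]
post = [4, 4, 3, 5, 3, 4, 3, 4]

def mean_shift(pre_scores, post_scores):
    """Average within-pupil change on a paired pre/post questionnaire item."""
    diffs = [after - before for before, after in zip(pre_scores, post_scores)]
    return mean(diffs)

shift = mean_shift(pre, post)
print(f"mean pre: {mean(pre):.2f}, mean post: {mean(post):.2f}, shift: {shift:+.2f}")
```

With samples as small as a single school class, a paired significance test (e.g. a Wilcoxon signed-rank test) would be the natural next step before claiming an effect.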
We may not have safety goggles, Bunsen burners, medical instruments, Meccano sets and the other paraphernalia associated with STEM subjects in humanities and social sciences to wow and amaze school children, but we do have powerful ideas and debates which, with a little nurturing of contacts in schools, we can explore in a fun way through the use of active learning techniques.
Simulations as an outreach tool to boost general interest in HE participation and specific interest in European politics could be worth a try.