A very short one today, as I’m struggling with a pile of stuff that I’m not sure I understand.
While it’s great that I get to do things I wouldn’t otherwise have been able to, Brexit has also meant I get asked to explain things that are either at the edge of my knowledge, or which are so novel that no-one’s considered them before.
You might have this in your classroom sometimes – I know I still do – so a couple of thoughts on how I handle it.
Firstly, work from what you know.
Nothing is so out-there that it doesn’t touch on something that’s much more settled, so build your conceptual bridge out from that. It not only gives you something more solid to work with, but often it’s where those involved are working from too.
Secondly, consider the range of options.
Politics is great to study because of its uncertainty, but that usually works within a bounded set of pathways. The more you can work through what that set might include, the better you can evaluate how actors might choose among them.
And thirdly, don’t be afraid to say you don’t know.
No-one knows everything and sometimes it’s a matter of either being too early to tell, or too uncertain to guess. Park it, and say what markers would tell you that things are changing, so that your audience is left with some tools, even if they don’t get the answer there and then.
Right, back to the world of UK Parliamentary procedure.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Maxine David.
In her chapter Life after academia: preparing students for successful collaboration, Kovačević talks us through her 2017 experiences teaching a course on EU Enlargement at the University of Economics in Bratislava (EUBA). We are first given a little insight into teaching practices at EUBA and into Slovak Higher Education legislation, before moving on to the detail regarding the problems she felt needed remedying, the method she employed, the rationale for it and expected results. Data collection and results are then discussed, albeit the latter more comprehensively than the former. The chapter ends with Kovačević’s reflections on the application of group presentations and the challenges involved in devising a reliable research design to generate data on applied teaching methods.
Many academics who have been teaching for a good number of years will recognise
their own early teaching days in the experiences Kovačević describes. At one
level, this is rather depressing; I, for instance, have assessed students
through group presentations for a decade and more. Even at the beginning, I did
not consider it as terribly innovative given it was what I had encountered on
my own Bachelor’s degree back in the 1990s. The chapter therefore raised
questions for me about what might really be called innovative. In turn, that
suggests the real value in Kovačević’s chapter: first, that it adds to our
knowledge of other contexts; second, that it highlights the wider failure of
many academics to engage sufficiently in an exchange of pedagogical knowledge.

Kovačević is therefore to be commended for the degree to which she has problematised the
learning process, thinking about impediments to learning and how to overcome
them at this early stage in her career. She grounds her thinking in the
literature explaining the benefits of group work and presentations, especially
in respect of developing transferable skills and enhancing employability. Based
on that literature and her prior experience in teaching the course, she comes
up with three hypotheses. The first of these is somewhat unclear: “The
innovation—i.e. group work—takes place in a classroom environment that is
supportive of learning via collaboration”. Does she mean that she is
hypothesising that group work is innovative, that group work is learning by collaboration,
or something else? The other two hypotheses are clearer: students’ interest in
the subject matter will be increased as a result of the process of creating a
group presentation; and there will be a noticeable increase in subject-specific
knowledge, as well as related skills.
For many teaching in environments that regularly apply such methods, these will
be self-evident: as Kovačević herself acknowledges, the benefits of
student-centred learning are already well-recognised. Clearly, however, whether
as a whole or just in pockets, methods that put the student at the centre of
the learning process are not the norm for EUBA (and many other institutions).
In her course, working with seminar groups of around 13 students, Kovačević began by
having students collaborate to produce a poster on Turkey in the EU’s
enlargement process before moving on to the creation of a PowerPoint
enlargement process before moving on to the creation of a Powerpoint
presentation. She is keen to point out the support that was offered throughout
the process, including instructor and peer feedback. It is a shame,
incidentally, that we did not hear more about this peer feedback, a notably
tricky area (see, for instance: Liu and Carless).
The chapter is weakest in talking us through definitions and in the section on data
collection and methods, though the latter aspects are partly addressed in the
conclusions. On definitions, it is not entirely clear what is meant by
“presentation”. Presumably, it is confined to a PowerPoint presentation but it
could be more (role plays) or less (students acting as rapporteurs) extensively
construed. The question is an important one for those thinking about adaptations
to the method Kovačević applies.
Questions about measurement and comparison are also insufficiently considered. For
example, the third hypothesis (“Student learning, including knowledge and
skills after collaborative group work, is noticeable”) raises questions about how
levels of learning can be measured, and compared to what. If we accept group
presentations as innovative, we must accept also that others need to be
persuaded of the relative benefits of
such innovative teaching, otherwise, why change? As such, the persuasive
potential of the chapter is reduced. Methodologically, it would have been
useful to know how Kovačević recorded and evaluated the “student activity and
behaviour” she observed.
Despite the number of unanswered questions, the chapter is an important one. It
functions as a reminder that there is still much to be done to convince others
of the benefits of changing ways of thinking and doing because innovation is
not contagious. It is a reminder too that rigorous and reliable evidence is
sometimes difficult to generate and that without that, it becomes all the more
difficult to overcome resistance to change. Finally, the chapter is important
because it raises implicitly the question of whose responsibility it is to
bring about change. Should it be contingent on young scholars, under pressure
in so many other ways, to undertake all this work? I think we all know the
answer to that question; what we are doing to address it is another matter.
My first-year module this semester has been a real training ground for me. Not only am I going all-in on flipping, but I’m also trialling the new assessment software that the University is thinking of using.
By extension, that also means it’s a training ground for my students, something that I’ve been very open about with them.
The flipping seems to be working and I’ll be writing up my thoughts on that later in the semester, but having come through the first use of the software I need to make some decisions now.
In part, my situation arises from wanting to push how we used the software past a conventional approach. Not only did students submit a literature review to it, but they then had to review someone else’s using the system, all in aid of a final piece of self-reflection (which we’re marking now).
Using the marking function is a bit more involved than just submitting work and a couple of people did get a bit lost on that. But the bigger problem was that not everyone submitted work.
In the good old days (i.e. last year and before) we did all this in-class, so it was much simpler to cover (the exceptionally few) missing pieces. However, because we’d pre-selected peer reviewers, we ended up with some students having nothing to review and others not getting their work reviewed.
That’s a failing on my part: next time, I’d leave allocation until after the first submission was in, so everyone who submitted got allocated and reviewed.
But that’s next time. What about now?
Already, I’ve indicated to everyone that not getting peer feedback won’t count against them in marking, but a couple of students have felt that absent such comments they’re not in a position to complete the self-reflection.
To that, I’ve had to underline that it’s self-reflection, so peer feedback was only ever one component of that: indeed, the whole purpose of the somewhat-convoluted exercise is to get students becoming more independent and critical about their learning.
All that said, peer review was added in here to help prompt everyone to think more about what they’ve done and what they could do.
As we sit down to mark, the question will be how much we can, and should, take the circumstances into account. Until we’ve seen the full range of work, that’s going to be a tricky call to make.
However, it all highlights an important point in such situations: do we have fall-backs?
Trying new things is inherently risky – that’s why many colleagues stick with what they know – but with some risk management, that need not be a barrier to moving practice forward.
Annoying though our situation here is, it’s not fatally compromising to the endeavour: we know who’s affected and how; they’re still able to submit work; and the assessment is relatively small in the overall scheme of things.
Yes, we’ll be using the system again for the final exam, but without the aspects that have proved problematic. Indeed, the exam has already been trialled elsewhere in the University, so that’s well-understood.
So, on balance, I feel comfortable that we can manage the situation and implement the necessary changes next time around to remove the problems identified.
Which is, of course, a big part of the reason for trying it out in the first place.
One of the many fascinating aspects of my Erasmus study
abroad year in Bonn was that the town was then undergoing a major change:
following reunification, the capital was being moved to Berlin, necessitating a
multi-million DM programme of construction, re-construction and general upheaval.
Right now, I feel a bit like I’m facing my own Umzug, not least because while the
Germans were moving only the once, I’ve got two moves ahead of me in the next 9
months or so.
The reason is the usual one for a university: the juggling
of spaces against changing needs is a constant for most colleagues and the
biggest wonder of it all is that our Department’s not moved in its 15-year existence. That’s great, but it means that now we are moving, there’s a
problem: what to take?
Usually, this isn’t the kind of thing I’d bother you with,
but because we’re doing it twice, there’s an additional constraint: I can take
only three packing crates to the interim location, with the rest going into storage.
Three crates? Not so bad, maybe.
Welcome to my office.
I did actually get moved some years back when I was doing my
Associate Dean role, and I used about 30 crates, and even then got told off for it.
So you see my dilemma: three crates to take the stuff I’ll
actually use between May and January.
Of course, a lot of stuff doesn’t get used very much: I’ll
admit that I cleared out a couple of shelves directly upon hearing the news, as
our local book harvest is coming by next week: old textbooks might have some
historiographical value, but realistically they’re not a burning priority.
Which leaves me with still 30-odd crates-worth of choices.
The priority seems to me to be around teaching: my research
is mostly based around materials I can transport virtually, plus we’re next
door to the library (or rather, we are now: not when we move). But teaching
needs me to have access immediately to some texts and to teaching aids.
That means trying to map out what my classes might look like
in the first semester of the next academic year, plus the second semester just
in case we get delayed (my Brexit research on contingency planning coming in handy).
Put like that, the problem suddenly becomes much more
manageable: it’s now a matter of boxes of Lego, blindfolds, whiteboard markers
and post-it notes rather than my extensive collection of notes on Danish
euroscepticism in the late 1990s (two crates-worth, last time I looked).
As an exercise, I’m actually finding it rather cathartic: it
feels like an extension of much of the rest of my work experience (I’m writing
this on a crowded train heading into London, laptop balanced on my knees, for example).
And if I really need something I’ve packed away, they say it can be retrieved,
so the peril is relatively low.
Of course, I’m not going to suggest that you move just for
the sake of it, but it is good to occasionally ask yourself whether you need
all that stuff you’ve piled up. Asking yourself what you really need is
a good question at any time, not least because it invites you to focus on the
core of what you do.
So it’s actually all good.
Except for the plants, which are going to be a very different matter.
A few years back I wrote about The Lego Movie and how it captured the operation of fascism in a form that was both accessible and about as enjoyable as fascism ever can be.
Some readers of this blog were unhappy that I’d ruined that film for them, so it’s only right I try to do that for another generation of scholars and young parents.
The Lego Movie 2 isn’t as good as the first one (confine discussion of whether this is ever possible to yourself, away from me), but obviously I watched it, because, well, Lego.
This aside, there’s still plenty of politics going on in the film, which I’ll write about now with some mild spoilers (because you’re not 9 years old and because you’re a serious scholar).
For the IR types among you, there’s a whole bunch of realism going on, with security dilemmas, anarchy and the brutishness that this engenders in actors. The collapse of the (ultimately benign) dictatorship in the first movie produces a literal and metaphorical wasteland in which actions are guarded and security is everything. The final reconciliation of the actors this time occurs when they identify a common external threat and work together to overcome it.
As a musing on power in its various forms, the movie offers a useful way to conceptualise how actors operate under uncertainty and the tension between collaborative and conflictual action, as captured in the notions of being a ‘master builder’ or a ‘master smasher’.
For the gender scholars, there’s a bunch of gendered roles, paternalism (and maternalism), as well as how children ‘become’ adults going on in all this too. It’s also a classic of the ‘absent father’ trope and all that implies.
But for my purposes, the film is all about constructivism.
The logic of appropriateness runs through the entire piece, as individuals strive and struggle to either fit into their environment or communicate their intentions.
If Emmet’s arc is one of finding a persona and an attitude that works for him (puberty alert!), then Bianca’s is one of learning to signal intentions less ambiguously.
As someone who teaches negotiation, I recognised a lot of these tensions from my classroom, where students are apt to adopt personae as ‘negotiators’ that don’t always sit comfortably with their more general sense of being.
In both classroom and film, individuals try out different gambits, with varying degrees of sincerity and of success. But ultimately, as the film suggests, it is when there is a more open exchange of views that progress is made, clearing up the confusions and misunderstandings and realigning how we view others’ actions.
Indeed, the whole film turns on how individuals perceive one another and themselves: Finn misunderstands Bianca; Emmet is misled by Rex; Lucy struggles (as in the first film) with her sense of identity; Batman has to learn about living with light as well as dark; and Superman has to find accommodation with the Green Lantern.
Identity here is thus not purely about being true to yourself, but also about being true to others. Whether you accept that by doing both you end up with a happy society – as the movie argues (as movies are wont to do) – is another matter.
But however you take it, this case highlights how we can use cultural products to illustrate and illuminate our teaching: the beauty of politics is that it is pervasive, so we can find it pretty much anywhere we look, if we choose to see.
There’s not much that separates PoliSci academics from others in most aspects of pedagogy, but one that is quite notable is the question of “what’s your politics?”
The reasons for this should be pretty clear, so I’ll not get into that, but instead will offer some thoughts, because we get this kind of thing on our side of the Atlantic too.
As the various respondents to Carolyn’s tweet suggest, the very question speaks to a set of assumptions, which can be usefully exposed and explored.
However, that can be a deflection, rather than an answer, so it still behoves us to consider what answers we can give.
It’s something I’ve had to chew on a lot in recent years, given my work on Brexit: “how did you vote?” is now getting overtaken by “what do you think we should do?”
The fact that I genuinely don’t know what we should do is neither here nor there, because the rest of what I’m offering people is what I claim to be impartial and fair insight into assorted issues, so if I’m seen as speaking for any one party then my whole work is compromised.
This is, of course, the problem we all face: politics gets seen as a clash of interests with no objective truth to be defended, thus meaning we must all be on one side or another.
Without wishing to get lost down an ontological or epistemological hole on this one, I think it’s possible to mark out a more segmented view of politics: we have our own views, but the consequence of those is limited, especially if we are reflective about these.
Thus I can acknowledge how I voted in the referendum, while also stressing that my interest now is in helping others to reach an informed and considered set of decisions about what comes next. It helps that this is my heartfelt belief – process matters much more than outcome to me right now.
But we can also communicate such messages in different ways in our classroom.
Promoting and defending a range of perspectives on contentious issues; fostering a space in which different views can be discussed with respect and tolerance; acknowledging the limits of what evidence (and anecdote, for that matter) can tell us.
These elements often prove to be much more meaningful in conveying the values of academic inquiry and debate and the interplay between facts and opinions than any “what’s your politics?” discussion.
Still doesn’t make it that much easier when you get asked, though.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Heidrun Maurer.
Innovation and active learning are nowadays catchphrases in Higher Education, often welcomed too easily and with their successful implementation taken for granted. Stanislava Kováčová from Masaryk University aimed to test the added value of active learning herself.
To her own
surprise, her experiment did not show any significant indication that her
students had learnt more after active learning than after traditional lectures.
In her contribution “Does active learning work? The experiences of Brno and Tehran psychology students” she presents the collected data and reflects on how her students experienced passive lecture-focused and active student-focused learning. She tests three hypotheses: whether students in an active learning environment participate more, gain a higher level of content knowledge, or engage more. While the results are not statistically significant, they counterintuitively suggest a tendency for lecturing to be a more effective practice than active learning.
Stanislava’s contribution encourages all of us to think more carefully about how to measure the success of the teaching methods that we employ. Her attempt tellingly showcases the complexity of measuring learning, but also how important it is to think about the methods of data collection. In Stanislava’s case, institutional policies made it difficult to gather reliable, comparable data. In addition, one needs to consider the right moment to test the effect of learning tools, especially when it is not only about content but also skills: is it right after the class, at the end of term, or years after?
Furthermore, measuring the effect of learning must depend on the objective(s) that we set for our teaching innovation. Stanislava had decided to assess participation (“students asking questions”), knowledge (“students being able to answer questions”), and engagement (“students taking notes”), but those criteria will vary depending on the expected outcomes of the innovation. It is generally a good reminder that we should not innovate for innovation’s sake, but that all attempts to improve the learning experience have to start from a concise definition of what is meant to change and why.
Stanislava’s project also reminds us that it is not just a question of whether we use active learning tools but how we apply and integrate them into our
students´ learning. It seems like a plausible explanation that students in Brno
and Tehran were overwhelmed with the task and would have needed more attempts
to get used to switching from a more lecture-based system to actively engaging with
the exercises. Another explanation could be that the exercises that we
sometimes use are not achieving what they are meant to achieve, and they would
need a different design altogether. Especially for colleagues unfamiliar with
active learning there is a tendency to design active learning exercises that
are too prescriptive and too narrow, as they do not allow students to engage in
researching and asking their own questions.
Last but not least, the reflections on Stanislava’s project even more tellingly emphasise what we must not ignore when employing active learning pedagogy: students’ skills like active listening, processing information and taking notes must not be taken for granted, and should also be actively – even more consciously – encouraged and trained in an active learning environment.

Active learning pedagogy can help us a great deal to design tools to engage students,
facilitate their learning, and train them as researchers. But applying active
learning effectively asks for a different mindset, and its successful
application is harder in practice than it is often made out to be. Adding a few exercises to a traditional curriculum is often not enough to harness its potential. Like learning more generally, it can only work with practice, critical reflection, and sometimes, trial and error.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Alexandra Mihai.
Active learning refers to a large range of teaching strategies and methods that put the student at the centre of the learning process. From debates and group discussions, to simulations and problem-based learning, this versatile approach has the students’ active engagement at the forefront, while teachers play the role of a coach or facilitator. Fujdiak captures in her chapter a very interesting instance of active learning being employed in an International Security Policy course. Her two goals are to find out what students think about the methods used and to assess whether they were contributing to a deeper understanding of the topic. In order to do that, Fujdiak analyses student feedback from “minute papers”, enhanced by her own classroom observations.
The active learning approach was introduced in
the second part of the course. The main aim was to complement the series of
lectures from the first part and to get students to engage with the topics,
which would in turn lead to a more effective learning experience. For this part
of the course the large class was split into three seminar groups of 26
students each and the classes were conducted by seminar leaders, one of whom
was Fujdiak. Throughout the six seminars she used various learning activities
such as group discussions in various formats, brainstorming, mind-mapping and
role play. The chapter contains annexes detailing the activities and their
perceived impact, as well as a visual representation of the findings.
By analysing students’ qualitative feedback via content analysis and through her own observations, Fujdiak could draw a few conclusions concerning the impact of her active learning activities.
First of all, students found the student-student interaction very useful and their overall engagement in class increased. Two seminars received mixed ratings: one where the guest lecturer did not employ active learning at all and another where the activity was not planned very well in terms of timing. This shows that students are very fine observers insofar as activity design is concerned. Moreover, the more familiar they get with active learning, the higher their expectations become, and this is mirrored in their degree of engagement with the respective tasks.
In her chapter, Fujdiak emphasizes some of the most important aspects of active learning. In order for this teaching approach to fulfil its main goals, it is crucial to put a considerable amount of effort into class design, with a focus on providing students with a clear structure and instructions. Moreover, effective learning activities need to be meaningfully integrated into the overall course design.
It would be interesting to see whether some of the activities would have a bigger impact if they were to take place in alternation with the related lectures, rather than in a separate part of the course, somewhat in isolation. As Fujdiak herself explains, this is not always a choice one has; she, like many other early career academics, had to operate within a pre-defined course structure. Her varied active learning activities and her reflective study are proof that teaching innovation can occur even under rather rigid external conditions. The important thing is to establish clear learning objectives, be receptive to students’ needs and feedback, and be bold enough to try out new ways of engaging students in their learning.
For reasons best known to others, it’s the end of our first
semester here, so that means coursework grades are going back to students.
I was even more interested than usual in this event this
time around because something unusual happened with my class: they came to talk
with me about their assessment.
I know that might seem mundane, but despite my best efforts
my office hours have often resembled one of the remoter oases in a desert:
potentially of use, but rarely visited by anyone.
I’d love to tell you what was different this semester, but I
genuinely have no idea: I did the things I usually did, so maybe it was a
cohort effect. Or not.
In any case, I reckon I sat down for discussions with most
of the students and emailed with several others. In those exchanges we
typically covered both generic guidance on what was required and specific
discussion on students’ plans.
Of course, the big question is whether that helped the
students to do better.
At this point, I’ll note that my class had about 35 students
and it’s a one-off event so far, so I’m alive to the risk of over-reading the outcomes.
Against that, the marking has been confirmed by the second marker.
That said, the main positive outcome was that the bottom
half of the class moved up quite markedly. In previous years, I’ve always had a
cluster of students who simply didn’t ‘get’ the assessment – a reflective essay
– and thus came out with poor marks. This time, I had only a couple of students
in that situation, and they appeared (from my records) not to have attended
most of the classes, and hadn’t come to talk.
Put differently, the tail was severely trimmed and the large
bulk of students secured a decent grade.
What didn’t appear to happen, though, was an overall shift upwards: the top end remained where it had been previously.
Again, I’m not sure why this might be. Without another cohort, I’m not even sure whether my guidance actually made any difference at all.
Quite aside from the specific instance, it does underline for
me how little we know about the ways in which our teaching practice does and
doesn’t impact on student learning.
In this case, I don’t really know how one could ethically
test the impact of formative feedback and support, given the multiple variables
at play. If you have an idea, I’d love to hear it.