After last week’s class discussion about participation, I decided to run an exercise that made it really easy to show the marginal benefit of preparation.
I told students to prepare for a meeting about putting together an agenda for another negotiation, and gave them all specific roles, plus some rules of procedure.
(For those who are looking for Brexit sims, this was a Council working group, putting together an agenda for the Commission to take to the UK to discuss the Political Declaration).
Because it was about formulating an agenda, I hoped that students would see they didn’t need to get too deeply into substantive positions, as long as they could frame the general areas to be covered.
Plus, by giving clear roles and rules, I incentivised everyone to push out their own draft agendas prior to the meeting. In so doing, I hoped they’d see that even a small amount of preparation could have big effects.
Badges are not exactly a new concept, but like many ideas that come from games, they are still not widely known in the world of higher education and learning. As many of us know from playing sports or participating in Scouts-style programs as children, badges are physical marks of achievement given to those who show competency at a particular skill or who do well in a competition. In video games like World of Warcraft, meeting a particular goal is called an ‘achievement’. Each achievement you earn flashes across the public chat, allowing for recognition, and some of them come with special items or titles in the game. Defeat the Lich King, and you not only get to bask in the glory of victory, but you earn the title ‘Kingslayer’. While typically meant as a form of visible recognition for an achievement, earning achievements or badges can motivate behavior and can be seen as credentials in their own right.
Motivation, recognition, and credentialing skills: sounds like what we want to do in the classroom, right?
Oddly, it took being interviewed for a research project to really crystallise my thoughts on this subject, after some months of it niggling away.
Earlier in the year, my institution launched a consultation on a captured content policy.
This was intended – in its words – to ensure improved access to learning materials and to allow for more flexible delivery, and was sold with a large dose of student demand (via our Students Union).
For those of you who’ve not had this conversation at your place of work, captured content covers lecture capture (semi-automatic filming of lectures to be uploaded to the VLE); flipped content; and anything else that’s a recording of teaching.
As an aside, there’s mixed evidence in the literature of its benefit for students: Owston et al suggest it’s particularly of use for low-achieving students, and Shaw et al see most benefits for non-native speakers; but Stroup et al find no evidence of impact on GPAs; while Danielson et al suggest that the kind of lecture has an impact.
However, as presented, the university wanted to have a whole lot more of this kind of thing, across the board, including talk of a largely-compulsory system of lecture capture.
Cue much concern from colleagues.
This ranged from how to deal with mixed lecture-seminar sessions, to the impact on students’ willingness to talk about sensitive subjects, to administrators using recordings for management purposes, to the principal worry that students just wouldn’t turn up to class if they could watch it online later on.
In its defence, the final, approved policy didn’t go as far as the draft plans, so there’s a lot more scope for instructor discretion about using captured content; although we’re all required to have discussions about how best to proceed on this front. Some of our teaching rooms now have automatic recording of classes, but defaulting to not making these available to students or anyone else.
So that’s all fine, right? University over-reaches in its plans, colleagues feed into consultation, university responds and adapts. That’s what should happen. Right?
I’m not so sure.
To come back to the original sell, a key part of it all was that push from the Students Union to the effect that lecture capture would improve the quality and student-centredness of lectures.
Here we have to remember that lecture capture (since it was that, rather than captured content in general) is not about content, but about delivery. In a system that automatically records lectures, the expectation should be that lectures continue as they have, but now with the option of being available online.
Now imagine you’re sitting in a lecture.
You don’t understand something, so you either raise your hand to ask the lecturer, or you ask the person sat next to you.
In both cases, you’ll get an almost instantaneous clarification from someone immediately and directly focused on the subject matter, with a pretty good chance of resolving the issue.
But if you’re watching a captured lecture and don’t understand the one explanation it contains, then you’ve got to email or visit the lecturer, who has to fit responding around whatever else it is they’re doing.
Much more time, much more effort, many more points of failure.
So no lecture capture then?
This is why I’ve never gone for lecture capture, but instead have travelled down the road of flipping. In the latter case, you’re using the contact time to give space for student questions and clarification, so it’s a much more engaged model than just recording the stuff that’s already happening in class.
Importantly, that’s what works for what I’m doing and what I’m trying to achieve.
And this is perhaps the central point.
In all the years of teaching that I’ve done, at all the institutions that I’ve encountered and worked for, I don’t ever recall a policy about optimising student learning.
I’ve seen policies about captured content or using VLEs; regulations about the volume and nature of assessment and size of modules; and more learning and teaching strategies than I care to remember.
But never a document about how to make informed pedagogic choices about designing the best possible learning experience for students.
If it’s appeared anywhere, then it’s in teacher training courses, and then generally only indirectly.
I can understand why this is – those other things are much more tangible and measurable – but it does raise a question about the focus of our work.
Importantly, I feel that too often we find ourselves in situations where “student learning” is conflated with “student satisfaction”: if only we can make them happy, then they’ll get more out of it.
Even on its own terms, I don’t see the logic of this, even before we get to whether it’s something that’ll serve our students well in the wider world.
All of which is a very roundabout way of saying that if I change how I teach in my class, then it’s because I’ve made a considered decision about its pedagogic merits, rather than because of an institutional policy.
Today’s post is more about career development than teaching . . .
Academia is a bureaucratic work environment. Information is constantly documented and distributed. Often this happens to the same piece of information multiple times. Consequently I began recycling my writing as much as possible several years ago, in the belief that it is better to make minor changes, or none at all, to writing upon which I have already expended mental energy. An underlying principle here is writing with an ultimate rather than a proximal use in mind. What is the most valuable end to which this writing can be eventually directed? A simple example: the proposal for your conference presentation becomes the abstract for the conference paper, which in turn becomes the abstract for the manuscript submitted to a journal.
A second and, for some, more important example: the stream of email, editorial comments, draft committee proposals, and other written minutiae that one produces — it’s all work. Don’t let it disappear into the ether. Instead, use it for future contract renewal, tenure, or promotion.
I admit that I didn’t fully recognize the potential value of this writing until my wife — also an academic — compiled her application dossier for promotion to full professor. Watching her, I realized that, in the course of my day-to-day business as an associate professor and department chair, I had generated chains of emails and memos that constituted evidence of service and scholarship à la the Boyer model. I saw that this material, if organized coherently, could form much of my own application for promotion, in many cases verbatim.
We have been following the ALPS-blog discussion on students’ participation between Amanda and Simon with great interest. The situations they discuss are very familiar.
In the programmes that we teach in, learning takes place according to the principles of problem-based learning (PBL), through active participation and discussions in tutorials. We can grade students’ participation with a +0.5 on top of the exam grade for exceptionally good participation, or a -0.5 for insufficient participation – a system introduced following discussions about the problem of
We too see students who remain silent. We train students, encourage participation and discuss group dynamics, but students may not feel comfortable or skilled enough to live up to our expectations – certainly not in their first weeks at university.
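As a minimal sketch of how that grading adjustment works (the function name, the category labels, and the 1–10 clamp are my assumptions – the posts only specify the ±0.5 adjustments):

```python
def adjusted_grade(exam_grade: float, participation: str) -> float:
    """Apply the participation adjustment described above.

    'exceptional' adds 0.5 to the exam grade, 'insufficient'
    subtracts 0.5, and anything else leaves it unchanged.
    Clamping to the 1-10 scale is an assumption, not stated
    in the original description.
    """
    adjustment = {"exceptional": 0.5, "insufficient": -0.5}.get(participation, 0.0)
    return max(1.0, min(10.0, exam_grade + adjustment))


# A student with a 7.0 exam grade and exceptional participation
print(adjusted_grade(7.0, "exceptional"))  # 7.5
```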
I want to draw our readers attention to two new edited volumes they might find useful in their own teaching. Full disclosure: I have chapters in both of them, so my recommendation is not without bias. Both are interdisciplinary in approach, which can be very helpful in furthering our own innovation as teachers.
The first book is Human Rights in Higher Education: Institutional, Classroom, and Community Approaches to Teaching Social Justice, edited by Lindsey N. Kingston and published by Palgrave in its Studies in Global Citizenship, Education and Democracy series. Many of our classes touch on human rights, and this book offers different perspectives on how to bring a human rights and social justice approach to undergraduate education. All of the authors are connected to Webster University, but are from different disciplines including philosophy, sociology, criminology, law, photography, and psychology. The approaches look at fostering human rights education at the institutional level (considering campus culture, student affairs, and research programs), the classroom level (through specific courses, study abroad, and projects), and the community level (conferences, teaching non-traditional students, and legal outreach). My own chapter evaluates an interdisciplinary course I co-created with professors in philosophy and education on the Millennium and Sustainable Development Goals that included a three-day educational simulation of hunger and poverty at Heifer Ranch in Perryville, Arkansas.
The other book is Learning from Each Other: Refining the Practice of Teaching in Higher Education, edited by Michele Lee Kozimor-King and Jeffrey Chin and published by University of California Press. The social scientists in this book offer innovative ways to approach curriculum design, classroom instruction, out-of-classroom experiences, and assessment. One of the chapters, Jay R. Howard’s ‘Student Reading Compliance and Learning in the Social Sciences’, touches directly on previous ALPS conversations about encouraging students to do the reading, and is well worth a look. My chapter dives into the literature on simulations and games in the social sciences, evaluating data from published simulations in political science to determine whether concerns about simulations taking too much classroom time are valid (spoiler alert: I say no).
There are lots of great books out there on pedagogy, but if you want some very recent work directly speaking to social scientists, you might want to check these two books out!
Amanda’s post prompts me to do a bit more reflecting on us, the instructor.
It’s really easy to focus on students as the source of problems, but as Amanda rightly underlines, that’s not the most productive of frames.
As a less-experienced lecturer, one of the most useful lessons I got in my training was that we go through different stages in our understanding of what’s happening in a classroom.
You start out by thinking it’s all about yourself, then you move to thinking it’s all about the students, before finally understanding that it’s actually about the situation you and they are in.
So part of that is recognising that you matter, but you’re not the only thing that matters.
And, frankly, sometimes we’re not at the top of our game.
Either that means we’ve not prepared enough, or we’re not on the ball enough in the classroom.
I’ve done that – not often, but more than once – and I’m going to guess that you have too.
What’s the problem?
Clearly, there are lots of reasons why this happens, and I’m not really so interested in why, precisely because of that diversity. I know it happens to me when I teach straight after landing from an international flight, but that’s scarcely useful.
The more interesting point is to explore what impact this lack of prep has on your class and what you can do about it.
In the broadest of terms, this is a problem because of the signal it sends to your students. Just as you know full well when they’re not concentrating in class, so too do they know when you’re not.
Just think back to when you were getting taught and you’ll recall the occasions you were on the receiving end.
If we ask students to be ‘in the room’, then we have to do the same. That’s why I always laugh at academic conferences when everyone sits at the back of the room, doing other stuff on their laptops, despite what they say to their students back home.
(It’s also why I don’t say those things to my students.)
What’s the solution?
Three steps suggest themselves.
First, acknowledge what you’re falling short on. This doesn’t have to be a big mea culpa, but just a simple recognition that you know what’s (not) happening and not trying to bluff your way out.
If nothing else, it’s better to get out in front of it and own it, before someone else does that for you.
Second, adapt what you’re doing in class to minimise the impact on student learning. If you could only prep 2/3rds of a lecture, focus on that part rather than winging the last part. If you’re supposed to be providing feedback, try using peer evaluation to replace a block of it.
That’s not always possible: if you forgot the key piece of equipment, then you should sort out getting it ASAP. But you need to demonstrate your intention to make the session still work, either in a slightly different way or with a bit of delay. What’s critical is that you don’t just notice you’re not firing on all cylinders, but that you also act on it.
Third, after the session is done, you take action to make up any shortfall in the class and to avoid it happening again. That might mean some jiggling of content for next week’s class, or some additional materials on the online environment.
The longer-term redressing needs you to be reflexive and honest about what went wrong (which you should be doing in any case) and finding ways to deal with it.
So now when I fly, I either schedule it so I can rest afterwards, or I move classes.
For you that might mean changing your schedules, or changing what you do in class, or getting a big orange sign to point to the key piece of equipment, so you don’t forget it.
Taken together, I can’t promise you’ll never have this problem again (especially if you’ve not yet had this problem), but I can tell you that it’ll become much more manageable and much less likely to happen again.
For a variety of reasons, I’ve been thinking of late about whether I matter or not.
Maybe it’s the after-effect of coming back from leave to discover that things have been just fine in my absence, or maybe it’s that the kids are old enough to need no support other than top-ups for their phones.
But certainly it’s also about the start of the academic year.
As someone dedicated to active learning, I know that I have to work from my students, rather than have them work from me. Their centrality implies a less central role for me.
That’s particularly true in my autumn module on negotiation, which very explicitly and consciously puts students front and centre, and puts me at the metaphorical and literal side of the classroom, trying to help them to understand what they’re doing.
The corollary of this is that if students don’t bother, or aren’t bothered, then there’s little I can do to force learning upon them.
At best, I’m like the sun in that favourite fable of IR: my best chance lies in offering positive encouragement and opportunity, not in brow-beating and punishing.
The challenge – for me, at least – is how to keep that sun beating down.
The round of academic events at the end of summer is always a good moment to gather thoughts and find new ideas to help in this. This year, it’s been good to hear again about the value of building a high level of communication with students, giving them some ownership of the process and acknowledging where the limits of my capacity lie.
This last point is a bit of a paradox: by being clear about what I can’t do, I can also strengthen the value of what I can. This is not so much modesty as realism and reflection: if I seek to inculcate such values in students, then I can do no better than practise them myself.
Of course, the difficulty comes in also having to acknowledge that you aren’t in complete control of things. I’m fine with saying that, but I know many colleagues aren’t, not least for fears that it undermines their authority.
The answer to this is that rather than thinking you have to know the answer to all possible questions, you really only need to know how to answer all possible questions.
That might seem semantic, but actually it’s about feeling confident about your more abstracted skills – of reflection, of research, of analysis – and applying them to the novel case your student has just presented to you.
Sometimes that means turning the question back to the student, or to the class, to answer (or to work out what they’d need to do to answer). Sometimes it’s a matter of returning to underlying principles. Sometimes it’s just saying that you’d need to go off and do some work to answer it next time.
All of these options rely on us being honest with students.
It’s too easy to fall into the trap of “we know everything, you know nothing”, which underpins much of the didactic model: I’m the reservoir of knowledge, you should just sit downstream and drink your fill.
Instead, we have to recognise our limits and students’ abilities. I’m certainly not ashamed to admit that I’ve learnt as much from students as I have from colleagues: very different things, certainly, but still valuable things.
And in all this I do matter.
I might not be at the centre of the classroom, but that doesn’t mean I don’t shape, contribute, encourage and support. In short, I’m part of a group that learns.
And that’s what keeps me so eager to get back to the classroom.
In line with Simon’s last post, something of a continued meditation on conferences and academic disciplinary associations in the USA, relative to last weekend’s one-day TLC, which was embedded within the APSA annual meeting:
Conferences reflect perverse incentives that do not reflect the realities of the academic labor market. Only a small minority of people who obtain PhDs, regardless of field, end up working as tenured professors at elite research universities teaching one or two, or zero, courses per semester. Yet to have even a chance of being hired or tenured by any institution, regardless of its position in the reputational pecking order, one is supposed to present (at conferences) and publish (in journals) research. The research is almost always irrelevant to anyone outside the discipline and much of the time also irrelevant to those within it.
These norms allow academic conferences to prey financially on graduate students, who are led to believe that they must attend, to both present research and to interview. In an age of digital communication tools and decreasing numbers of tenure-track positions, neither search committees nor disciplinary associations should be encouraging graduate students to pay out of pocket to attend conferences, the costs of which can exceed $1,000 per event.
But therein lies the rub: the more people who register for and attend a conference, the more profitable the conference is to the disciplinary association that has organized it. Whether a conference enables graduate students, their advisers, or other faculty to become more effective at what most academics spend most of their time doing — teaching — is not a concern. To claim otherwise is to ignore the economics of the system.
Conference attendance by full-time faculty is subsidized by their employers in the form of professional development support. Yet the way in which most conferences are structured means that opportunities are lacking for enhancing the teaching skills used on a daily basis in the workplace. Given the declining fortunes of many colleges and universities in the USA, this subsidization is likely to decrease, and decrease substantially, at some point in the near future — or maybe it’s occurring already.