In the spirit of Chad, I’m writing this week about one of the periodic threats to politics research here in the UK.
Birkbeck is both a storied institution and one with a special educational position, delivering most of its programmes through evening classes in London. It reaches a section of society that most universities don’t or can’t.
The plans to restructure have been left rather vague by senior management, but include options to cut significant numbers of staff across a range of departments, including Politics, whose members include names you’ve probably heard of, wherever you’re from.
There are petitions you can sign, and letters you can share, and I’d strongly encourage you to do so.
Sadly, this isn’t the first time this has happened, and it won’t be the last: the economic downturn and tightening government support for universities have meant a lot of management meetings and opaque comms to staff and students. Despite their courses being relatively cheap to run, the social sciences and humanities often end up in the firing line, perhaps because there appears to be little sunk cost and because they don’t get the STEM-is-economically-vital style boosterism.
I can’t be the only person to have experienced this: you’re writing a book and you realise that it’s either getting too long and/or that some of the stuff you thought should go into it doesn’t really fit any more. So, reluctantly but also with a sigh of relief, you cut it.
But then what do you do? Bin it completely in spite of all the hard work you put into it? Really? Sure, that’s the sensible thing to do – the thing you’d be perfectly happy doing if you were a totally rational individual rather than a living, breathing human being prone to practically all the cognitive biases under the sun: in this case the so-called sunk cost fallacy.
In reality, what you often do, if you’re anything like me, is to think whether there’s something you could do with it somewhere else. I mean, you could always turn it into a journal article, right?
Wrong! At least in my case. At 18,000 words and with a whole bunch of endnotes, it was going to be agony trying to cut it down to make it short enough for a decent journal. I did explore the possibility of going in the opposite direction and beefing it up to the 25,000 required for those short (and short turn-around) books that a couple of well-known publishers now seem quite keen on. But two things about that put me off.
First, I would have been topping and tailing it with ‘theory’ for the sake of it – something I hate doing. And, second, have you actually seen how much those things cost!? Only university libraries could possibly afford to buy them, and I’m not really sure (morally speaking) that even they should be spaffing forty or fifty quid on such footling things anyway.
In any case, I had the temerity to think that what I’d written might be the sort of thing that people who were simply interested in, rather than formally studying, British politics, contemporary history and the EU might enjoy reading. I also thought that, since it was originally written to be ‘approachable’, it might come in handy, too, for anyone teaching those subjects – both at post-16 and post-18.
One alternative would have been to stick it up as a post on my blog. Yet, to be honest, it wouldn’t really have fitted too well because that’s simply where I collect (mostly for my own memory’s sake rather than because many people read it there) the very short-form stuff I write for newspapers and websites. But no-one else was going to host it, I was sure – so sure I didn’t even ask anyone.
It was only then I thought about self-publishing it. Initially, I dismissed the idea – I mean, that’s ‘vanity-publishing’, right? OK for your pseudonymous erotic novel but, for something academic? Surely not?
I use Twitter quite a lot. And have done for a long time (since 2008, apparently). So I’m also quite invested in it.
Perhaps I knew this already, but the events of the last few months have definitely confirmed it. We don’t need to rehearse the progress of leveraged buyouts by people with galaxy brains here; suffice it to say that I’ve been working on alternative platforms, just in case.
Of course the problem is your classic network effect issue: Twitter’s great for me (and for lots of other academics) because lots of other people use it. So while people might want to go elsewhere, they also want to make sure others also go there too, to rebuild the networks.
Sadly, there’s no definitive choice on this front. In the past fortnight, I’ve seen more activity on Facebook, LinkedIn and other platforms, especially from those who use those already for work purposes.
Personally, those don’t do what I want to recreate from Twitter, so it’s been an adventure in the wild woods of open-architecture microblogging sites.
The prime destination so far has been Mastodon, a network so decentralised that if you google it, you get the band, not the thing (hence me linking to the Wikipedia page).
This is my quick guide to setting up there, from your Twitter (while still keeping the latter going, which is what I plan to do).
For those of you on Twitter, you’ll have seen much grumbling about Mastodon, mainly because it’s not Twitter. The user interface is not great and, because it’s spread across lots of little servers, there are bandwidth issues.
However, it’s nothing that a smart person like yourself can’t handle.
First up, you need to create an account.
This needs you to pick a server: it really doesn’t matter which, since you can see all the other servers’ outputs and it’s the same interface, so either follow a colleague’s recommendation (I’m on mastodon.social) or pick whatever comes up on a search.
Second, connect it to your Twitter.
The thread below shows you how to really easily scrape Twitter for others on Mastodon, so you can avoid dull searching on the latter, and how to cross-post on both networks simultaneously: this all took me 5 minutes to sort.
And that’s about it. You’re covered for now.
At this stage, I’m taking this as a precautionary move: Twitter remains my (and others’) primary network, so as long as it doesn’t descend into abject chaos, I’ll be sticking there.
That said, having a lifeboat – smaller, less pretty – is a sensible move, especially since we appear to be at the whim of an individual unable to accept either market forces or the power of parody accounts.
So take a bit of time to consider your position on this and if you have any good suggestions, then stick them in the comments below.
This blog post is all about the value of being clear.
Recently, I’ve been doing some work on helping colleagues get work out to audiences, both academic and non-academic.
Being able to communicate effectively is a central skill for all parts of an academic’s work these days: teaching, research and leadership/admin.
As such, it has given me pause to reflect on how we can check whether we’re on the right track, since the key issue is typically one of clarity.
Communicating is a necessarily loss-y process. What I think has to be turned into what I say/do, which then has to be received by my audience and turned back into something they might think: each step is imperfect. That’s as true for a discussion at home about the laundry as it is for a presentation you might do for work.
And I’ll focus here on presentations because they are the set-pieces, when you are very overtly trying to put ideas into the minds of others.
So, how to check? Three questions to ask yourself:
Is the core message clear? Whatever you present on, there needs to be a point to it. So do you know what the point is and have you put that front and centre?
For me, I like to start presentations not with an overview of the structure of my talk, but with the core message. It’s not a murder-mystery, where we find out what’s actually been going on at the end, so push your core idea out directly and unadorned right at the top. That way, the rest of the presentation keeps speaking back to that, everyone (including you) can see why you’re talking about what you’re talking about, and if anyone does lose attention as you go then they still got the key bit.
If you don’t know what you’re trying to say, then you’ll want to find something sharpish. Personally, I find this is also a good exercise for deciding whether to make a contribution in a discussion, given that there’s already plenty of people out there who talk without saying anything. Be useful in your participation.
Are you sticking to the brief? Whatever the situation, there are expectations or rules about your presentation, so work to them.
Most obviously, this includes keeping to time. You should never have to say you’ll try to keep to time, because a) you will be keeping to time, and b) saying that simply wastes time, likely making the problem worse. If you’re not sure how long you’ve got, err on the side of brevity and save time for Q&A, because that’s where you can say stuff that your audience actively wants to hear, rather than what you think they want to hear. If you’re not sure how long your presentation takes, practise and assume the real thing will run a bit longer than that.
But it also includes keeping to the subject. Think about what the purpose of the presentation might be and what the needs of your audience might be: work to those, rather than starting from “what would I like to talk about”. Again, brevity is good if it allows more time for discussion.
Finally, are you practically clear? Think again about the loss-iness of communication: are you sticking barriers in the way?
You know those presentations you go to where someone puts up a slide and says “you probably can’t read it” or “don’t worry about all this”: that’s really annoying, right? Either the content is important – in which case make it legible – or it’s not – in which case remove it. Include as little as possible and as much as necessary: this is part of having a clear core message.
Likewise, cut the guff: try to talk plainly and directly as much as possible. Think about what jargon or technical terms your audience will understand and which your presentation genuinely demands. This doesn’t mean being simplistic, but rather that your choice of words conveys the message clearly.
And speak to your audience. This is about actually trying to engage them in the moment: making eye contact, reading their reactions for (mis)understanding, adapting to them. It’s one of the trickiest elements here, precisely because you can’t practise this alone: try watching other people and how they succeed/fail in this.
If you can be confident you’ve cleared these three tests then you’re on your way to better communication.
Like equating time spent in a physical classroom with knowledge learned, the assumption that learning always matters most to U.S. college students does not mesh with reality. U.S. universities in the main operate on the basis of their customers’ revealed preferences. Experiences that seem at least as attractive to these students as learning include:
Occupational credentialing. Like it or not, students are aware of the economic benefits of college. Accurately or not, many students perceive that these benefits derive from meeting the requirements for a diploma, not from what is learned. Given that elite universities in the USA function as prestige goods, I can’t say that this view is entirely incorrect.
Maturation and individuation. Students are willing to pay (or, in reality, borrow) tens of thousands of dollars to live independently of their parents for the first time. Colleges that cater to 18-22 year olds are happy to provide this revenue-generating service.
Recreation and entertainment. Many first-time, full-time students choose a four-year institution on the basis of whether they will be able to continue to play the sport that they played in high school, for example. Others are quite willing to watch this happen, even at taxpayer expense.
How did the pandemic affect student demand for and access to these experiences? It’s probably too early to identify any changes in what a bachelor’s degree from State U. signals to employers, given the economy’s current strong demand for labor. College certainly wasn’t a maturation experience while campuses were closed. Anecdotally it seems like students were happy to return to campus dormitories and apartments, regardless of the cost, and university CFOs breathed a sigh of relief as auxiliary revenue streams kicked in again — despite the continuing national decline in college enrollment. And I don’t know of any collegiate athletic programs that have been dismantled post-pandemic.
But there does seem to be something different in the wind. I know of several institutions where enrollment began declining several years ago, federal pandemic aid provided a temporary stopgap, and now broad swathes of academic programs are being eliminated as they try to cut their way to financial viability. I also am seeing reports of the customer-facing employees of higher education — faculty and graduate students — abandoning academia for better salaries and greater job satisfaction elsewhere. The same seems to be true of mid-level non-instructional university staff. Last, the few campuses that I’ve been on over the last year seem less lively than has customarily been the case. Fewer people walking between buildings, less crowded parking lots, and more empty chairs. Maybe this is because people discovered during lockdown that the benefits of working or studying remotely were at least equal to its costs.
It looks from my biased perspective that the pandemic might have been my long-awaited inflection point for higher education. The online experience may be quantitatively or qualitatively different from in-person instruction, but as I’ve stated above, learning for learning’s sake has not been the top priority for many college students for quite a while.
I’m reminded of MIT president L Rafael Reif’s statement in June 2020 about MIT’s plan for its upcoming fall semester — “Everything that can be taught effectively online will be taught online” (italics original). I’m also reminded of last year’s purchase of edX, MIT’s non-profit MOOC platform, by the publicly-traded company 2U. In ten years, edX went from nothing to a market valuation of $800 million. There are at least some people out there who think that physical presence in the classroom is no longer essential to the educational experience of college.
This weekend I’ve been in Turkey, speaking at a workshop and being treated with a huge amount of respect and deference, because I’m a professor.
Whether those same people would have thought the same of me in the airport, as I walked away from security without my mobile phone and forced a very unseemly effort to resolve the issue, is a moot point.
So let’s moot away.
A major challenge in many parts of our education system is deference to seniority: we stick labels on people that somehow mean they know better.
Yes, a part of getting a fancier job title is knowing your subject, as expressed through publications and engagement. But it’s not the whole picture.
And knowing something doesn’t necessarily mean knowing everything.
All weekend I listened to colleagues just starting out on their independent research journeys, talking about subjects they knew about and which I didn’t. As much as I could offer some insights from my own understanding, I was very aware of the limits of their utility to the needs of those colleagues, so I became more intent on leaving space for those who could say something more, or on asking questions that might open up more reflection on their part.
Just as we want workshops or conferences to be spaces for developing ideas, so too the classroom.
Students and teachers very often fall into their socialised roles: one lot listen to the other lot to gain knowledge and ‘be taught’. But again this misses the potential to recognise that everyone in the room has something to give.
My fervour for active learning comes precisely from the realisation that I didn’t know it all, couldn’t know it all and needed to have my students’ insights to gain more. The more we can place students into positions where they can take control and make contributions the better the chances of creating not only a more rounded understanding of the matter in hand, but also the skills and comfort to continue doing that in whatever future role they take on.
Inquisitiveness about others and their understanding and knowledge is central. We have to be open to what others can bring to the table, not because of their title, but because of the knowledge and reflection they have.
As the security guy at the airport could testify, just because you’re a professor doesn’t mean you can’t also be a doofus too.
It’s been a while since we offered up a cultural product as material for teaching, but since I’d hate for Mr Bezos to have spent all that money for nothing, let’s consider the Rings of Power as an option for a minute or two.
For those who’ve missed it, this is a spin-off of the Lord of the Rings/Hobbit sagas of JRR Tolkien, a man who’d probably be working on an Impact Case Study these days, and concerns the events that lead up to those original texts.
Amazon snaffled up – if that’s the right word for years of negotiation and $250m for the rights alone – the option to make this series, which is just about to drop the final episode of its (presumably) first season. Given the investment, the number of characters and the quantity of pregnant pauses as someone says something weighty in import, there’s going to be more of this to come.
Those pauses also gave me an opportunity to reflect on the politics of the series.
On a quick reading, there are four obvious points of engagement with the material in respect of our curricula.
Firstly, there’s a strong US draw-down in Iraq vibe in the early episodes, as the elvish leadership wants to wind up its orc-hunting operation and close its in-field placement of forces. Themes about obligation, trust and resilience are plentiful, as is the question of how effective any of this long-term activity has been in either rooting out the initial problems or the emergent threats.
Secondly, you might consider the tensions between cooperation and conflict, both within the various species and between them. None of the societies depicted have meaningful democratic mechanisms, but in each there are paths to influence and shape policy, even if in some cases that involves bits of convenient magic. There’s also the moot point of whether the orcs could even have rehabilitated themselves into Middle-Earth society without [spoiler] fighting and – somewhat related – whether they could operate in a non-authoritarian system.
Thirdly, you might reflect on the political values implicit in the production itself. That might include discussion on the use of accents to frame species (why are all the harfoots/harfeet Irish?), the diversity of casting and the placing of women into key roles and functions, all of which say something about how cultural products work right now. Comparing these points with the earlier Peter Jackson films would also open up more points.
Finally, there is a lot of political communication going on here. One might ask questions about whether all crowds are so led, but the value of clear and motivating messaging is made clear time and again, as is the power of symbols. The series hangs as much on how individuals imagine others to be as it does on how those others actually are, which isn’t a bad point to reflect upon in our contemporary political debates.
The challenge in all of this might be that this is already a big pile of screen time for students to engage with: each episode is clocking in at about 70 minutes. However, if you think they might already have watched it, then you certainly have an in to a set of useful discussions and potential activities.
To take one example, I’ll make the wild guess that season 1 ends on a cliffhanger: gaming out the possible paths and their political logics would easily fill a seminar session, plus you’d have the follow-up option to note and review the discussion with a new group once season 2 drops.
Or you could just find a volcanologist and try to work out whether the triggering of Orodruin in E5 is even vaguely viable.
During the spring 2022 semester, Holstead noticed a very high rate of absenteeism in her courses. She surveyed 245 of her students about their reasons for not attending class; 175 responded.
Over a third of the respondents said they regularly did not attend class. Common reasons included physical illness, depression, attendance wasn’t required, boredom, tiredness, and conflicting family care commitments.
Students said they regularly came to class if they felt a connection to other students or the professor, if they felt it improved their mental health, or if attendance was required.
I thought this was an interesting exercise in gathering data, so I’ll be administering a similar survey in the coming week. I’ll report the results in my next post.
But I want to point out two underlying assumptions to this kind of survey, and my objection to her recommendation that faculty require students to attend class. The assumptions are that learning is a function of time spent in the physical classroom and that students are in college mainly to learn. I’ve written before about why the first assumption should be discarded. I’ll belabor that point a bit more — always happy to beat a dead horse that people keep trying to ride — by connecting it to the pandemic.
This semester, and probably for the foreseeable future, students who test positive for Covid are required by my university to quarantine for at least five days. In practical terms, this means missing up to a week’s worth of classes. Faculty are expected to accommodate these students accordingly, and rightly so, in my opinion. But from my perspective, such a policy is long overdue, and it shouldn’t be limited to the latest communicable virus. Penalizing students when they are absent from class punishes not only those who are infected with contagious diseases, but also commuter students who decide not to drive to campus on icy roads during a snowstorm, and students with ailments that are periodically physically debilitating. The list goes on. It’s an accessibility and equity issue.
The credit hour is the quantum building block of college curricula. It is a proxy for how long a student sits in a classroom chair. As Matt Reed pointed out recently in Inside Higher Ed, this measurement exists because it meets the bureaucratic needs of the institutions that use it. It was never a valid or reliable indicator of learning. Maybe it’s time for a different measurement.
As for the faulty logic behind the second assumption, I’ll discuss that in an upcoming post.
The UK’s Office for Students has now published its decisions on student outcomes.
As I discussed earlier this year, when the consultation went out, this is an intellectually dubious way to ensure sub-standard university providers are held to account.
My dubiousness stemmed then from the metrics used, which focused on progression, completion and post-qualification employment: to reduce the ‘value’ of a degree to these points suggests a rather narrow view of what we try to do with our teaching and with students’ learning.
You’ll be shocked to hear that despite much grumbling from providers, the OfS has broadly stuck with its plans, albeit with a rather less automatic penalty process.
Speaking as someone whose institution is below its benchmark for various elements, I’m aware this might all sound like motivated reasoning, but equally I’m also aware that my university is easily the biggest provider in the UK, taking in mostly non-standard students (so typically mature and part-time), so I do wonder why a one-size-fits-all approach was deemed to be appropriate.
As others have pointed out, the discretion in the application of penalties effectively leaves us very uncertain about how any of this plays out: the OfS could take a very rigid view and just hit every infringement of metricised performance to make a point of how standards need to be raised, or it could be very relaxed about it all and treat this as a diagnostic tool for providing support.
Certainly, contextual factors are mentioned here, but equally it is fair to say that OfS has tended to be somewhat at loggerheads with universities about How Things Should Be, especially with a government standing over them that seems to want a recasting of Higher Education.
Even if we are [back?] in a phase of British politics being very much less than settled, it’s clear that all of this will mean more interest by providers in metrics, and that colleagues will need to keep in mind how that plays out in their subject fields.
As much as I like to say that the best and most sustainable route to good metrics is through good academic practice – i.e. not through metric-chasing – it’s also clear that we have to have a clear eye on what metrics count for internal managers and external regulators.
The more we can articulate a coherent and cohesive vision of our efforts to build learning environments for our students, the better we can push back against the effects of trying to reduce such things to points on a dashboard.
Given the traditional academic hiring cycle, interview season will soon be upon us. I thought both novice and veteran job seekers might find this Harvard Business Review article on interview red flags to be helpful.
A few personal examples of bullets dodged and not dodged:
When individually meeting each member of the department during one campus visit, one said about another, “He’s been trying to sabotage my career since I got here.”
During the same routine at another university, it wasn’t until talking privately at the end of the day with the interim chair — a dean — that I learned that the department was in receivership because of interpersonal conflict.
A position was advertised three years in a row. I applied the first time the ad appeared, never even received a rejection notice, and assumed “oh well, someone else got the job.” Applied a second time when the same ad appeared the following year. Several months later, I received a strange email stating that “some” applications had mysteriously disappeared from a locked office and that the search had been halted. The ad appeared again. I applied a third time, interviewed, and received an offer, which I accepted. My probationary contract was not renewed mid-way through my second year on the job, after I had unknowingly helped interview the person who became my replacement.
An interview at a small university included a meeting with the president and vice president for academic affairs. The latter struck me as having the personality of an old-timey small town banker — cautious, conservative, honest. The former seemed like a used car salesman. Less than two months after I had started the job, the president became embroiled in a scandal that received national media coverage. He was eventually forced to resign because of the bad publicity, but not until several other people quit or were fired.