For reasons best known to others, it’s the end of our first
semester here, so that means coursework grades are going back to students.
I was even more interested than usual in this event this
time around because something unusual happened with my class: they came to talk
with me about their assessment.
I know that might seem mundane, but despite my best efforts
my office hours have often resembled one of the remoter oases in a desert:
potentially of use, but rarely visited by anyone.
I’d love to tell you what was different this semester, but I
genuinely have no idea: I did the things I usually did, so maybe it was a
cohort effect. Or not.
In any case, I reckon I sat down for discussions with most
of the students and emailed with several others. In those exchanges we
typically covered both generic guidance on what was required and specific
discussion on students’ plans.
Of course, the big question is whether that helped the
students to do better.
At this point, I’ll note that my class had about 35 students
and it’s a one-off event so far, so I’m alive to not over-reading the outcomes.
Against that, the marking has been confirmed by the second marker.
That said, the main positive outcome was that the bottom
half of the class moved up quite markedly. In previous years, I’ve always had a
cluster of students who simply didn’t ‘get’ the assessment – a reflective essay
– and thus came out with poor marks. This time, I had only a couple of students
in that situation, and they appeared (from my records) to have not attended
most of the classes, and hadn’t come to talk.
Put differently, the tail was severely trimmed and the large
bulk of students secured a decent grade.
What didn’t appear to happen, though, was an overall shift upwards: the top end remained where it had been previously.
Again, I’m not sure why this might be. Without another cohort, I’m not even sure whether my guidance actually made any difference at all.
Quite aside from the specific instance, it does underline for me how little we know about the ways in which our teaching practice does and doesn’t impact on student learning.
In this case, I don’t really know how one could ethically
test the impact of formative feedback and support, given the multiple variables
at play. If you have an idea, I’d love to hear it.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Johan Adriaensen.
When teaching writing skills, our objective is not to have students explain what a good paper looks like; we want them to write a good one. Similarly, through our methodological courses, we hope students will not only understand the logic of a research method, but also be able to apply it in practice.
This need for practical application inevitably
pushes us towards active learning pedagogies. One way to achieve this is
through a flipped classroom model. This requires students to learn the essentials
of theory at home so that contact hours can be devoted to practical
applications and discussion. This enables the instructor to provide support where
it is most needed. As a result, student engagement with the material is stimulated
through learning by doing.
In her chapter, Kateřina Fridrichová elaborates on her experiences while
applying this method in an elective course on research methods for Master’s
students of international relations at Masaryk University in Brno. As I was familiar with the potential
of flipped classrooms in teaching research methods through the work of Michael Touchton, I formulated a set of
questions on the practice of teaching in a flipped classroom, as opposed to its
effects on students (which are also analysed in the chapter).
Firstly, I wonder how best to facilitate the learning process during students’ self-study.
Fridrichová referred to the use of general readings and short
explanatory videos. To ensure that students prepared in advance, she
requested summaries of the readings. Unfortunately, despite an
elaborate assessment of student and teacher experiences, we learn little about
the need, perception or effectiveness of pre-recorded lectures and/or the short
explanatory videos used by Fridrichová (and how to determine which
topics merit such support).
Second, I ask myself how to organize the in-class sessions so as to improve learning. As is common in flipped teaching, there was
still scope for ‘mini-lectures’ in her class to address issues requiring
further explanation. These were then followed by various practical applications
and exercises. I particularly liked the idea of working with data collected from
the students. The exercises were diverse, well designed and appropriate
for a method such as QCA, which is commonly applied to small- and medium-N studies.
Thirdly, I remain puzzled about the implications in terms of preparation and
workload for the instructor. Fridrichová’s
honest and open reflection highlights the importance of preparation for the
class sessions, and in particular how to cope with small class sizes. The shift
from traditional teaching practices to a student-led, active teaching format
often necessitates a different type of preparation to keep the class engaged.
In her case, the small group of students (4) further complicated this challenge.
While the initial time-investment seemed to be significant, the rewards
seem more than worth the effort. The impacts on her students were diverse,
ranging from greater confidence in applying the method and greater engagement
during class to scope for peer learning and an appreciation of applying the
theory in practice. Having read the chapter, my interest in flipped teaching
has only increased.
Yesterday I found myself on the campus of another university, attending an advisory board of a research project.
One of the key topics was about impact – taking the work into the community of users.
As we talked about this, I was struck by the way in which there are a lot of parallels between this and teaching.
Most importantly, both impact and teaching need to be focused on the needs of your audience from the outset: it cannot simply be a function of what you want to do.
That means understanding your audiences, reflecting on their needs and tailoring what you have to offer. It’s easy to do the most convenient thing, but that might not be the most useful thing you can do.
Secondly, there has to be a recognition that there is more than one way to skin the proverbial cat and that your choice of activity is not pre-determined.
This implies a willingness to explore options and to try out new things that might work better. In the case of this project, we talked a lot about blogging and the options it might open, both in and of itself and as a gateway to other activity, but the principle is the same for any other form of working.
Finally, there needs to be a tolerance of failure. Just as not every single student responds positively to our teaching, so not every person targeted for impact work demonstrates interest.
Failing in these cases obviously requires adjustment and different approaches, but it is not intrinsically a problem: there are always limits to what we can achieve in working with others. What matters is our response and adaptation.
Seen together, both teaching and impact work should remind us that we operate in collective environments: we alone cannot – and should not – take everything on our own shoulders, but rather need to work with others to find common cause.
For most academics, the gears of course planning grind exceedingly fine. We tinker with projects, lectures, and assignments, trying to create what we imagine as the ideal learning experience. But that’s frequently not what we do outside of the classroom.
The winter holiday break is a good time to take stock of one’s life and position oneself better for the future. Although it’s never too late, the sooner you begin taking charge of your personal affairs, the better. So, some basics:
I ask these questions because, if your experience has been anything like mine, you didn’t get trained in personal financial management while in graduate school, and you probably haven’t utilized whatever training might be available through your employer.
Last month I wrote about the multi-year death spiral at Iowa Wesleyan University. My 2017 column for Inside Higher Ed discussed four broad signs that a small college or university is headed toward failure. But how can a faculty member employed by a tuition-dependent institution like Iowa Wesleyan get a firmer grip on his or her employer’s financial health?
One way to do this is to calculate the percentage change over time in a school’s annual total expenses per full-time equivalent (FTE) undergraduate. The larger the expansion in expenses per student, the worse the school’s financial condition and the lower the chances of its long-term survival.
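As a rough sketch, that rule of thumb is just a percentage change in a ratio between two fiscal years. The figures below are entirely hypothetical, invented for illustration, and not drawn from any school’s actual filings:

```python
# A minimal sketch of the rule of thumb described above: the percentage
# change in annual total expenses per full-time-equivalent (FTE)
# undergraduate between two fiscal years. All numbers are hypothetical.

def expenses_per_fte(total_expenses, fte_undergrads):
    """Annual total expenses divided by FTE undergraduate enrollment."""
    return total_expenses / fte_undergrads

def pct_change(start, end):
    """Percentage change from the start value to the end value."""
    return (end - start) / start * 100

# Hypothetical FY 2011 and FY 2016 figures for an imaginary college:
# expenses grow modestly while enrollment shrinks, so the per-student
# figure climbs sharply.
fy2011 = expenses_per_fte(30_000_000, 1_200)  # $25,000 per student
fy2016 = expenses_per_fte(33_000_000, 880)    # $37,500 per student

print(f"Expenses per FTE rose {pct_change(fy2011, fy2016):.0f}%")  # 50%
```

Note that the increase can be driven from either side of the ratio: even flat spending looks worse per student when enrollment falls.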
I decided to apply this rule of thumb to several colleges and universities that suffered declining enrollment, eliminated academic programs, or were otherwise reported to be in financial difficulty. My analysis uses data from fiscal years 2011 through 2016. Why use this time span? Prior to FY 2011, colleges and universities were trying to cope with the immediate effects of the Great Recession, which, as I have previously argued, accelerated what are probably near-permanent changes in undergraduate enrollment. It seemed fair to give schools six years after the economy had begun to stabilize to adjust to the new normal. Finally, when I began my analysis, the most recent publicly available federal tax filings were from FY 2016.
Below are my results, ordered from the smallest increase in expenses per FTE undergraduate to the largest. I bear none of these schools any ill will. Many have histories of serving marginalized populations. But I predict that at least half of them will close within the next five years.
During last September’s annual conference of the University Association for Contemporary European Studies (UACES) in Bath, Simon kicked off the teaching and learning afternoon gathering with a teaching and learning bingo.
I enjoyed this a lot. It was nice, active and fun. I got to meet new people and learned new things about teaching and learning. Based on this (perhaps somewhat subjective) experience, I decided to hijack Simon’s idea and use it in two similar, but different, settings. This is what happened.
Problem-Based Learning workshop Bolzano
Maastricht University is known for its application of Problem-Based Learning (PBL). New staff have to attend a PBL introduction training session upon starting at our university. All teaching staff also need to complete the so-called University Teaching Qualification (UTQ) trajectory (this, in fact, applies to all higher education institutions in the Netherlands). One of my duties is to coordinate the UTQ at my home faculty. And it was this – plus my teaching experience – that led to my being invited to convene a workshop entitled ‘Tutors in problem-based learning: from distant facilitator to approachable coach’ at the University of Bolzano in early October.
Since I did not know any of the people there, I thought this would be a great opportunity to use the T&L bingo.
I adapted Simon’s bingo to my own needs. The instructions are relatively straightforward and the items of a diverse nature, including more light-hearted ones.
It worked surprisingly well. I got to know the participants, plus they got to know each other a bit better too. It also provided me with some input for the workshop (‘The one thing they’re hoping to learn more about today’). Considering it was a day-long workshop, this helped me to focus on specific points and also, towards the end, to check if everyone thought that we sufficiently covered their needs.
University Teaching Qualification workshop Maastricht
I already wrote that I coordinate the UTQ at my home faculty. This year 13 colleagues have to complete the trajectory. The group of participants is very diverse in terms of disciplinary backgrounds and teaching experience, and includes teaching assistants, PhD students and a professor. Not everyone knew each other, so once again the bingo seemed like a good idea. And once again, I thought I could use the input for the workshop.
This time too, I adapted the bingo to the setting, with specific questions about the topics that we were going to focus on during the day.
Even though there was some hilarity as to whether this was really a serious exercise (see this tweet), participants actively engaged in it. As they will be working together throughout this academic year, it was important that they got to know each other – the usual round of introductions is a bit boring, especially because it usually does not produce new information. Once more I included a question that gave me specific input as to the expectations for the day (‘Your colleague’s personal learning goal for today’).
What I learned
The good: after a bit of hesitation everyone got really involved. Some colleagues did their utmost to talk to everyone; others opted for a longer conversation when the issue at hand was interesting. Reason enough to do it again, though I’d probably want to explain the exercise better in order to avoid awkward moments at the start.
The bad: timing is an issue. In both cases, we took much more time than I had anticipated, as everyone really got into it and because I had encouraged them to try and talk to everyone. Next time I might consider using a timer or buzzer. Or perhaps offer a prize to the person who gets most boxes filled within a set time.
The ugly: I enjoyed it so much that I also got completely carried away. And subsequently lost track of time… As such, a timer is definitely needed! Someone needs to keep track of me too. I might ask one of the other participants to be in charge of time instead.
I’d love to try this exercise with students. I think it would make a great course opening. I mostly teach in a programme with 300+ students, so they do not always know each other, despite changing tutor groups every 8 weeks and every course. Plus it would be a great way to have them discuss a course topic in a more informal setting and get to know more about the course’s intended learning outcomes. Obviously, I would have to adapt the bingo to the course itself. I would include a debriefing so that we all get to know more about students’ prior knowledge of the topic and our expectations of the course, the group and the tutor (i.e. me).
Nearly five years ago I wrote about the dim prospects for Iowa Wesleyan University (formerly College). The school had just announced that it would eliminate half of its academic majors and terminate more than forty percent of its faculty members.
Earlier this month, Iowa Wesleyan’s president announced that it was in danger of shutting down. The university’s trustees decided on November 15 to keep the university operating until at least December 2019. My guess is that their decision just delays the inevitable. According to Iowa Wesleyan’s IRS filings, it suffered from negative net revenue — a deficit — for five of six fiscal years from 2011 to 2016. Operating expenses per full-time equivalent undergraduate increased by more than fifty percent during the same period. Supposedly the university needs $4.6 million in additional revenue to stay open until the end of 2019.
Iowa Wesleyan faces the same problem that many other similarly-sized private colleges and universities face: declining demand coupled with increased operating costs. For many small-enrollment, tuition-dependent higher education institutions, the future is financially unsustainable.
I’ll be writing more about how to interpret the relationship between a school’s enrollment and operating expenses in the coming months, both here and, possibly, at Inside Higher Ed.