Teaching innovation improves student performance

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Carolina Curvale.

In her study, Petra Srnisova documents the results of implementing constructive alignment and active learning methods in an Accounting course at the University of Economics in Bratislava. Her motivation for incorporating teaching innovation into her course emerged from an observation to which, I think, a good number of educators can relate. She noted that students’ field of study tends to shape their level of interest in a specific course. In her case, Commerce students appeared to be less concerned than other students about the minute details that are essential to Accounting.

The author cleverly included strategies to promote student engagement in applying theory to practice across three sessions that took place between a first and a second assessment. The innovated sessions introduced Post-it, group work, and pair work activities aimed at promoting problem-solving, critical thinking, and collaborative work. The skills practiced during the sessions were expected to engage students and help them improve their performance in the assessments. The results of the study reveal that the exercise was more fruitful in achieving the second of these goals.

In regard to performance, the author compared student scores before (first assessment) and after the teaching innovation (second assessment). Student scores indeed improved, from an average of 64% to 76%. A course taught the prior year, without the innovation, showed that students also improved from the first to the second assessment, but by much less (only 2 percentage points). These results encourage teaching innovation as a way to improve student performance, although it would be interesting to control for the groups’ overall grade scores. In my opinion, this is a very important result that may also contribute to achieving better engagement, as students who perform better may be more prone to participate in class.

Based on the collected data, the author could not conclude that the teaching innovation produced the expected effect, that is, that active learning techniques promote student engagement. While the survey questions measuring student interest before and after the course reveal no change, on average students reported that they did pay attention in class. The qualitative data gathered from the instructor’s notes and from an external observer provide contrasting information: the instructor perceived more engagement during the innovation sessions, while the observer did not register heightened participation, although the observer attended only one session and could not fully compare the group’s performance.

The chapter systematically documents the results of adopting a teaching innovation aimed at improving both student interest and performance in an Accounting course. While the results are mixed, the experience is flawlessly analyzed and presented, and the author herself offers avenues for improving the experience in the future. In my view, the chapter offers interesting and practical ideas on how to improve the teaching-learning experience when the topic of the course is not directly related to the students’ major – something we can all learn from!

Bringing culture back in: a comment on Pechersky’s study on student-centred learning

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Stephan Engelkamp.

Many years ago, I found myself attending a class on intercultural learning, or Etudes culturelles, as the course was called at the time. I was a young Erasmus student, enrolled at the Institut d’Etudes Politiques Strasbourg in France, and part of a large group of international students who would soon be released into Sciences Po’s regular courses. To be fair, I cannot say that I was particularly thrilled by the prospect of attending this seminar. Mostly struggling with my uneasy relationship with the French language, I did not really see the point in discussing cultural stereotypes for weeks and months.

However, this was a mandatory course, and so I attended. For whatever it was worth, it was a good opportunity to get to know my fellow Erasmus students and maybe make some new friends. The seminar turned out to be fun and helpful. What I remember most vividly was what turned out to be the best part of the seminar: discussing different cultural points of view with international students, as competent practitioners of their respective cultures.

This brings me to Alexander Pechersky’s insightful contribution on the potential outcomes of introducing student-centred learning to the curriculum, specifically when teaching fuzzy but enacted concepts such as culture. The chapter reports on the results of a study the author conducted when teaching seminars on intercultural learning, which were offered to local and Erasmus students. The author starts with a contextualisation of his own academic socialisation abroad, reflecting on the different cultures of learning he experienced in different university settings during his academic path. This leads Pechersky to the following assumptions: students with a higher degree of control over learning activities should be more satisfied with the learning exercise, internalise the learning material better and gain a deeper understanding of the concepts studied.

To test these assumptions, the author developed a quasi-experimental research design for three seminar groups. Each seminar starts with the lecturer’s mini lecture, followed by a quiz as an icebreaker to make students more comfortable participating in the next step: a student-centred part in which students work through a case study on the session’s subject matter. The design of the three settings varies according to the degree of freedom students have in controlling the exercise. Student satisfaction and learning outcomes are traced using a survey and participant observation.

As the survey results demonstrate, the hypotheses could only be partially corroborated. While the results on learning satisfaction seem to be as expected – the more control students have, the more satisfied they are – the results regarding learning outcomes are somewhat mixed. However, the impressions of the observing colleague seem to suggest that the group with the most control over the learning exercise had the most sophisticated discussions of concepts.

One challenge of the research design may be the limited number of observations, due to the small number of students, which makes it difficult to apply even descriptive statistical methods. To address this methodological issue, the author might have considered assigning reflective essays rather than using surveys.

Methodological issues aside, I suggest an alternative way to account for the unexpected results regarding the students’ learning outcomes. As the author rightly states, “[I]n student-centered learning (SCL) the teacher assumes the role of a facilitator and invites students to participate in the learning process by relating information to prior knowledge and discussion with others.” Hence, students’ prior knowledge and experiences may be a key variable in the quasi-experiment. As the seminars focus on intercultural and communication skills, group composition may affect the learning outcome, but maybe not in the way the author assumes.

Pechersky theorises that students’ prior experience with student-centred learning may explain the outcome. An alternative explanation may relate to the content of the course, which focuses on intercultural learning, and to students’ backgrounds, specifically whether they are international students. From this angle, students’ experience with intercultural exchange may determine the learning outcomes. International students may be more invested in the subject matter due to their personal situation, which probably allows them to bring in their personal experiences more effectively.

In any case, Pechersky’s contribution raises interesting questions about variables of success of student-centred learning. I would love to see a follow-up piece drawing on a larger set of observations! As an Erasmus alumnus, I clearly see the value of taking individual intercultural experiences seriously, both inside and outside the classroom.

Engaging youths in studying political processes – there are no “one size fits all” methods

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Niina Meriläinen.

Many European states are now experiencing growing social and political passivity among some youths, which may lead to their lifelong marginalization and radicalization. To reverse this trend, many university teachers feel committed to encouraging and empowering their students to become active in their communities, to play a more active role in the democratic development of societies, and to support the principles of human rights and the rule of law.

In his chapter, Martin Karas, from the University of Economics in Bratislava, reports on various aspects of how to engage non-political science undergraduates. Karas introduced three active learning exercises – a debate, a group analysis of a primary source, and cut-up cards – to help students distinguish between various political science concepts, to achieve higher levels of student engagement, to improve knowledge retention, and to build understanding of political actors, issues and practices. To research the effects of these exercises on student learning, he combined qualitative and quantitative methods.

Whereas active learning methods led to higher levels of student engagement in Karas’ class, they did not significantly affect knowledge retention and understanding. Karas’ results are consistent with the literature, which reports a connection between active learning and student engagement but finds no robust evidence of a link between active learning and knowledge retention and understanding.

Karas’ findings are similar to those of Meriläinen, Pietilä, & Varsaluoma (2018) and of Meriläinen’s forthcoming research (2019). The latter focuses on engaging non-subject students in vocational schools to contribute to social change processes, including law drafting. Naturally, we need to understand that there are various forms of youth engagement and participation in social and political change processes: some that gatekeepers such as officials regard as credible, and some that those gatekeepers overlook and view as non-credible.

Karas’ research illustrates that engaging non-subject students can increase participation in quantitative terms but may not produce a long-term, qualitative understanding of the issues, actors and events in the political sphere. This implies that more effective learning methods (and more multidisciplinary research) are needed to achieve lasting and profound awareness, engagement and participation of non-subject students in societal and political change processes.

If we wish to achieve the desired change towards equal, human-rights-based, inclusive and sustainable societies, the active citizenship and participation of youths are essential. Karas’ chapter is an exciting example from this research field. While designing learning methods for non-subject students, the various actors working together, such as researchers, teachers, youth workers and volunteers, should take into account several key issues.

These include making courses “accessible” to various groups of students (individuals with disabilities, students with different language competences, asylum seekers, and migrants in general) and paying attention to power relations among youths and to their dissimilar interests and skills. As youths are not a homogeneous group that engages with one voice, no “one size fits all” teaching and learning method can effectively reach all of them. Because youths from various backgrounds should become agenda setters in the democratic development of societies, teaching and learning methods must be designed to address their different needs.

Learning your lines

So I have this colleague, who does a lot of public speaking. And when I say a lot, I mean really a very large pile of it indeed.

I guess this works too

They have to give a TED talk this week and they’re anxious, because they aren’t sure they will remember what it is they have to say.

I found this surprising, given their extensive experience, but also a bit reassuring: just because someone can make it all look rather effortless doesn’t mean it actually is.

At some level, we’re all like a swan: gracefully gliding across the water, while furiously paddling underneath. The only question is the ratio between the two, but it’s always there.

In any case, it got me thinking about how I’d handle the ‘memorise your speech’ thing, not least because I will have given six of the things by the end of next week.


What’s good for me and what’s good for my students?

Evidently, my existential rut continues.

Last week I taught on the same subject on three separate occasions, plus got filmed talking about that subject too.

I’d been a bit anxious about this concatenation of coverage, even if it was on my research specialisation: talk enough about something and eventually you end up saying all you’ve got to say. Or worse, I might forget what I’d said to whom.

In the end, that was not a problem: I’d been good about defining quite clearly what each session was focused on, so the internal consistency was fine, as was the relationship between the different bits.

However, each time I was doing something quite different.

There was one conventional lecture, one set of seminars and one Q&A-type session to support a flipped lecture. Plus that filming.

Now I’m not one to blow my own trumpet, but I gave a good lecture: it flowed, it had coherence, the students were engaged (and occasionally entertained) and I even got a little round of applause at the end.

Lovely.

But it was also the session that I worried gave the least to the audience: they were passive recipients, rather than active learners. In the Q&A and in the seminars, the content was driven by their needs and learning processes: even the filmed lecture is going to become part of a more interactive package.

However, my ability to give lectures seemed to be better than my ability to give the other formats (not that they were bad, to be clear: just not as good as the lecture).

So what to do?

Stick with what I’m best at, or focus on what I understand to be best for my students’ learning?

Of course, I’m setting up a bit of a false choice here: my evaluation of my relative capacities is completely anecdotal, plus I know that the evidence about pedagogic formats isn’t completely nailed down.

However, the point still stands, because our subjective view of such situations shapes how we engage with pedagogy: I think we all know plenty of people who stick with what they know because they’re good at it, or at least better at it than some other approach they’ve not tried before.

The difficulty is separating our anxiety/indifference about new pedagogies from any more objective limitation on our ability to use them. My personal view is that just because I’m not smashing it in my seminars, it doesn’t mean I shouldn’t be doing them: instead, I should be trying to identify and address my weaknesses.

Ways to make a university course more attractive and to improve student learning

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Ulrich Hamenstädt.

In her chapter, Ludmila Kašpárková discusses several methods for making a relatively unpopular university course more attractive to students. The chapter first presents the challenge of teaching a course for which students’ motivation is low. Kašpárková then outlines three changes she made to improve students’ motivation and learning outcomes. Finally, she presents the results of her evaluation of how the new teaching style helped to improve the course and student learning.

For me, this chapter is a good example of how to deal with a challenge that a university lecturer faces from time to time: taking over a course that is part of the compulsory curriculum but is unpopular among students for various reasons. While the content of this course was relatively fixed, the approach to teaching offered many opportunities for improvement. In this sense, the chapter presents three ways to improve learning that can easily be applied to various university courses.

The first approach Kašpárková used was to implement John Biggs’s concept of constructive alignment in the didactic design of the course. To ensure that the students understood the syllabus and the learning objectives, and to help them develop ownership of their learning, she discussed with them their possible takeaways from the course. This may seem obvious, but since it takes time, it is often neglected or cut short in practice. The second change to the course was the introduction of new learning activities, such as role plays and peer discussions, to replace passive learning from lecturing. This was a key element of the innovation, since learning in the social sciences is not just about imparting knowledge but – above all – about enabling students to develop their skills. Thirdly, the examination at the end of the semester was replaced by continuous assessment to enhance the quality of student learning.

Overall, it is interesting to read how all these changes together influenced students’ learning outcomes. Ludmila Kašpárková’s work presents a valuable approach to transforming and improving a course for students. The series of changes reported in the chapter can also be applied individually to course sessions or teaching situations. 

Getting flipped

So last week was our final teaching week before the Easter break. For reasons we don’t need to go into here, it’s not the end of our teaching block for the semester, so it’s a bit of a breather.

As such, my usual expectation is that class attendance takes a bit of a hit.

However, my lecture of 120 students only saw 15 turn up, which – even by my standards – isn’t good. At all.

Obviously those who came couldn’t account for their fellow students’ absence, so I decided to gather some data.

Using a quick SurveyMonkey poll, I offered students a range of options to choose from. I got about 40% of the class (including a couple who had come) to respond, which is reasonable enough to make some observations.

The first is that timing does matter. My lecture is 1000 on a Thursday, and the night before had been a big sports night, so there was certainly a bunch of students incapacitated by that. Also, a couple of students noted that Thursdays are solid with classes for them, abetted by some deadlines for other modules due at the same time.

However, while those factors explain part of it, I was also rather curious about whether our experiment in flipping plays a role.

My module is a first-year introduction to European integration, and I’ve been trying out a flipped format. This involves a pre-recorded lecture online, with the conventional lecture time being given over to Q&A on the recording and elaboration of key themes: there’s also a seminar session, which runs on more conventional lines.

So far, that Q&A part has not been running as well as I would like: only a small number of students ask questions and there’s evidently a block who haven’t watched the lecture beforehand, so can’t ask.

Part of my concern in trying out flipping was that students might see it as a way to disengage from the face-to-face element of the module. My efforts to tackle this include highlighting that the recording doesn’t contain enough in itself to give students all the material and framing they need to do well in the module: the Q&A always includes stuff that relates much more directly to the final exam (and I say as much).

Clearly, the survey highlights that this isn’t resonating with the students.

Even if we allow for a degree of “what might look like the answer least likely to cause offence”, there’s a big block covering the lack of utility of the (Q&A) lecture element.

Certainly, I can see that if you’ve not watched the video, then the lecture isn’t that useful, but I’m more concerned about those who feel that the recording suffices.

All of which leaves me in a quandary.

Part of me wants to rework the remaining sessions after Easter to be much more explicit in leaving material out of the online stuff, with the lecture picking it up instead.

But another part of me wants to stick with my approach to date and see how it goes with the exam.

Right now, I don’t have the answers to this one. I need to explore some more to see if attendance was similarly down in other modules, to better triangulate what’s happening here.

Your thoughts are welcome.

Unsure about it all?

A very short one today, as I’m struggling with a pile of stuff that I’m not sure I understand.

While it’s great that I get to do things I wouldn’t otherwise have been able to, Brexit has also meant I get asked to explain things that are either at the edge of my knowledge, or so novel that no-one’s considered them before.

You might have this in your classroom sometimes – I know I still do – so here are a few thoughts on how I handle it.

Firstly, work from what you know.

Nothing is so out-there that it doesn’t touch on something that’s much more settled, so build your conceptual bridge out from that. It not only gives you something more solid to work with, but often it’s where those involved are working from too.

Secondly, consider the range of options.

Politics is great to study because of its uncertainty, but that usually works within a bounded set of pathways. The more you can work through what that set might include, the better you can evaluate how actors might choose among them.

And thirdly, don’t be afraid to say you don’t know.

No-one knows everything, and sometimes it’s either too early to tell or too uncertain to guess. Park it, and say what would be a marker that things are changing in a way you could read, so that your audience is left with some tools, even if they don’t get the answer there and then.

Right, back to the world of UK Parliamentary procedure.

More Pre-Post Post-its

A follow-up about asking students why they do what they do . . . For the second stage of this data-gathering exercise, I had students use Post-its to anonymously answer three questions at the beginning of class:

  • How are you feeling right now? (the one-word check-in)
  • Why are you feeling what you’re feeling?
  • Why did you come to class today?

Nineteen out of twenty-three students, or more than eighty percent, reported feeling badly — the same proportion as last time. Of the nineteen, ten referenced being tired while four wrote “stressed.” Only one wrote “hungry.” The overwhelming majority of people in this group attributed their feelings to too little sleep and too much work.

The other four students felt “happy,” “good,” “relaxed,” and “chill.” Three of these students attributed their feelings to having had time to eat, buy coffee, or otherwise get ready before class. One of them mentioned sleeping comfortably, while another wrote “not super-stressed . . . trying to stay calm for the day ahead.”

I sorted answers to the third question into a few different categories, which are shown below, along with their frequencies. A few students’ comments fell into more than one category.

  • I had to; attendance is mandatory: 7
  • Get a good grade: 5
  • I am paying for the course: 3
  • Learn something: 3
  • Participate in discussion: 1
  • Collaborate with teammates on an upcoming assignment: 3
  • Miscellaneous reasons — “My roommate told me I couldn’t skip,” “I was awake so I figured why not,” “Because I didn’t go to the last one,” “I try to go to all of my classes,” “Didn’t want to miss anything,” “To avoid falling behind”: 6

In sum, only seven students, or thirty percent, indicated that they had been intrinsically motivated to attend class that day; i.e., they came to learn or participate in a learning-oriented activity. More than half of the students indicated that they were extrinsically motivated by the fear that their grades would be harmed if they did not attend. What I think is interesting here: I do not penalize students for being absent from class — I regard them as legal adults, free to suffer the natural consequences of their actions. I do not grade on attendance or class participation. Only students’ written work, submitted before class, gets assessed.
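For anyone repeating this kind of tally with a larger class, here is a minimal sketch in Python of the arithmetic above. It is only an illustration: the category labels, the counts, and the intrinsic/extrinsic grouping are simply the ones from the list and discussion above, treating each category count as a distinct student, as in the percentages above.

    from collections import Counter

    # Post-it category counts, copied from the list above
    responses = Counter({
        "attendance is mandatory": 7,
        "get a good grade": 5,
        "paying for the course": 3,
        "learn something": 3,
        "participate in discussion": 1,
        "collaborate with teammates": 3,
        "miscellaneous": 6,
    })

    # Grouping assumed from the discussion above
    intrinsic = {"learn something", "participate in discussion", "collaborate with teammates"}
    extrinsic = {"attendance is mandatory", "get a good grade"}

    students_in_class = 23

    intrinsic_n = sum(responses[c] for c in intrinsic)  # 7
    extrinsic_n = sum(responses[c] for c in extrinsic)  # 12

    print(f"intrinsic: {intrinsic_n}/{students_in_class} = {intrinsic_n / students_in_class:.0%}")  # ~30%
    print(f"extrinsic: {extrinsic_n}/{students_in_class} = {extrinsic_n / students_in_class:.0%}")  # ~52%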

More thoughts on this subject in a future post . . .

Pre-Post Post-its

Sometimes the best way to find out why students do what they do is to ask them.

During a recent lunchtime conversation with a colleague, I learned about the “one-word check-in” — asking students to each describe, with a single adjective, how they felt at that moment. I decided to incorporate this into a data collection exercise that I hoped would demonstrate one benefit of taking notes in class — a problem for which I still haven’t figured out a solution.

My hypothesis: students who took notes — a more cognitively engaging activity than just listening — would be more likely to feel better by the end of class.

I collected data in my course on globalization, which meets twice a week in seventy-five-minute sessions from 9:30 a.m. to 10:45 a.m. The class, when everyone attends, has only twenty-five students, so my results are not statistically significant.

As students were entering the classroom and settling into their chairs, I gave each person three Post-it notes, along with a playing card dealt from a stacked deck (more on this further down). I told everyone to mark their Post-it notes with the suit and number of the playing card they had received. This allowed me to sort the Post-its by individual student afterward. Students should also number each Post-it with a 1, 2, or 3, to simplify keeping them in the correct sequence after class. I didn’t think of this at the time, but luckily I kept each pile of Post-it notes separate after they were collected.

The data:

  • At the beginning of class, students wrote a one-word check-in on Post-it #1.
  • After the discussion of that day’s reading response, students wrote on Post-it #2 answers to “Have I written any notes during today’s class?” and “Why?”
  • Students then clustered into teams to discuss plans for an upcoming project assignment. Note that this introduces a methodological flaw in my research design, but it turned out to be irrelevant.
  • At the end of class, students wrote a one-word check-out on Post-it #3.

A different, randomly selected student collected each set of Post-it notes after students had finished writing on them and placed the notes face down on a table. The goal here was to make it obvious that I was trying to preserve the anonymity of students’ responses. However, I had dealt cards from a stacked deck (low-value cards on the bottom) so that I could identify which responses were from men and which were from women — because I expected that women would be more likely to take notes.

Now for the results. Out of 23 students who were in class that day . . .
