Now I’ve got a bit more space, and some data, I want to provide you with an initial write-up of each of them. Next week, I’ll go through the one that I think has more obvious potential – the flipped format – but today it’s going to be the gladiatorial model that gets the attention.
The conceit – and I think that’s the right word here – is that a presenter starts off with three minutes to present, then the audience get to vote (by app/website) on whether they get another three minutes. This repeats up to a maximum of 12 minutes per paper.
The options I settled on for each round of voting were ‘Yes’, ‘No’ and ‘Yes, but…’, which I suggested to the audience would be a way of signalling a willingness to give time, but with a caution to up one’s game.
We ran the panel with four papers – including mine – ordered randomly immediately beforehand, and we had about 18 people in the audience.
How it unfolded on the day
The first bit of feedback from the audience, as I explained the format to them, was ‘why?’, which required a bit of an explanation of my intention to create a format where everyone was more engaged and where there was more interaction.
Each paper took a very different approach to the time issue:
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Nanette S. Levinson.
Alica Retiova’s chapter, focusing on her innovative teaching experiment in a first-year writing-intensive seminar, provides plenty of great advice and, most importantly, evidence of what works in her classroom. A particular bonus is that she does not just assess the innovation itself: she also measures and shares student perceptions of it.
The innovation Retiova successfully implemented is the use of written peer feedback on student papers, designed to improve writing skills in the field and foster critical thinking. She also has the goal of catalyzing students’ confidence in their own abilities. Showing us exactly how to foreshadow implementing the innovation of written peer-to-peer feedback, Retiova competently explains her techniques for developing and accurately assessing student ability to ‘feed forward’ (suggestions to their peers for future writing) as well as backward (specific feedback on their current position papers). Just as importantly, she measures this over time (three different papers in a three-week period) and with the same peer evaluators.
As a long-time faculty member who herself teaches a first-year seminar, I look forward to following Retiova’s tips and the techniques she tried. I also encourage future experiments that recognize the role of culture in attempts to catalyze independent learning. Retiova found that the experiment contributed only “partly” to developing students’ confidence as independent learners. Based upon my research in cross-cultural communication, I note that some cultures foster a more hierarchical view of the professor and student, with the professor being viewed as the major source of learning and knowledge. Thus, culture itself may play a role in shaping students’ views and, indeed, abilities to develop confidence in their own role in assessing the work of other students in their classes.
It is inspiring to read about the focus on excellence in teaching and assessment shown by Retiova, an early career faculty member. This bodes well for the next generation of faculty leaders and their focus on fostering excellence in student learning, including critical thinking and field-specific writing skills.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Liudmila Mikalayeva.
Daniela Jaklová Střihavková tells her story as a highly motivated seminar tutor in a first-year course on social work. The problem she sets out to solve is a mismatch between the grader’s expectations and the students’ performance: the average grade for exam essays was between 16 and 18 points in 2016-2017, far from the maximum of 30 points. What can the tutor do to close this gap between expectations and performance?
The setup of the course will be familiar to many young instructors: while a senior professor gives weekly content-focused lectures and grades students’ work at the end of the course, a junior tutor is in charge of interactive seminars where students discuss and apply knowledge. The examination consists mainly of two essays checking students’ understanding of the nature of social work and of the social worker’s role, formulated in proper academic language. The essays are challenging for students, since they receive little training in writing them in the course itself.
Jaklová Střihavková very reasonably suggests improving the course through constructive alignment: course design should guarantee the fit between teaching goals, teaching methods and assignments. So if a major teaching goal is to enable students to define the role of the social worker in a concrete case, they should receive guidance on how to do it, have an opportunity to practice it and receive feedback before the exam. The author introduced two additional seminar sessions where students practiced a task similar to that of the exam essay. Combining individual and group work, she provided space for the students to confront the complex task and reflect on their own performance. While she cannot prove that the essays received better grades because of these changes, both the grader and the students were happier with the learning outcome.
The effort by the seminar tutor to bridge the gap between the grader’s expectations and the students’ actual work was, however, only partly successful. Even after the additional seminars, students continued to feel unsure about what the grader expected from them, and the grader was still unhappy with how they used disciplinary vocabulary. I see three issues explaining the persistence of the gap.
A relatively minor point is that oral exercises may not be effective enough to support students’ success in written tasks. A much more important drawback, underlined by the author herself, is the absence of clear and explicit criteria for grading: the professor would need to make an effort to detail the requirements. And, most significantly, the course structure as such is at the core of the problem: the person grading students’ work is not directly in touch with the students, and is senior enough to have forgotten how challenging it is for undergraduate students to understand and use academic jargon and navigate the often-implicit expectations of their work.
Jaklová Střihavková is right to point out that, to improve learning outcomes, students need space to reflect on the curriculum, but young and senior instructors should become more reflective as well. Clarifying expectations, aligning content, teaching approaches and assignments, and communicating among themselves and with the students are key, and cannot be replaced by teaching experience alone. Students as well as instructors will benefit.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Alistair Jones.
We’ve all had the problem of students turning up to class without adequate preparation. To gain the full benefits of any classroom discussion or debate, student preparation is essential. Yet there are too many other distractions, resulting in unprepared students attending class. Natália Gachallová explains how she got her students to prepare for class and the ensuing benefits, embedding the explanation in the existing literature.
Preparation for an introduction to Latin anatomical nomenclature, you might think, would be essential. The obvious incentive is the required mark of 75% to pass the first part of the course. Yet experience suggested this was not the case. Thus Gachallová decided to innovate her teaching. She introduced fortnightly summative online mini-quizzes. These would reinforce prior learning as well as provide a useful revision tool. There was also a reward component, where students gained extra marks for the final exam based on their average score in the quizzes.
Such innovation can be time-consuming, especially given the length and volume of classes that are undertaken. In this case, there is a cohort of over 130 students! Gachallová does not mention how much time was spent preparing these quizzes, especially in comparison to the preparation undertaken previously. Follow-up questions were used in class to stimulate debate – an example would be interesting to see.
An online student survey was utilised to measure the effectiveness of this approach, which produced remarkable findings. Around 85% of respondents claimed to find the in-class quizzes beneficial. Conversely, some respondents complained about the quizzes being too challenging, and voiced concerns over spelling mistakes leading to marks being dropped.
The benefits are visible in the grades of students. Both the average mark and the overall pass rates improved. The exception is in the tail of the class, where pass rates at the third attempt at sitting the final exam were almost unchanged. Gachallová takes into consideration the extra marks gained by students from the online quizzes. Her findings showed most students did not need them: most of those who passed the exam would have passed without the extra marks, while a very small number of students failed the exam despite gaining the extra marks from the online quizzes. The reward component was meaningful for about 5% of all students.
The key message from this chapter is simple. If students engage with the learning materials throughout the academic year, they are far more likely to pass. Online quizzes appear to motivate students to engage with class preparation. Formative assessments AND formative feedback can increase student success in the summative assessments.
Some of you may consider a Latin Medical Terminology course to be rather niche. It might be that online quizzes are not deemed appropriate for other subjects. Yet that is to miss the bigger picture. There is a need to engage with the students; to encourage them to take responsibility for their learning. One way is through the use of technology. The fact that a small incentive was added in relation to the marks may be inconsequential – which is something for future exploration.
If students are not going to engage with classroom preparation, why hold the class? If students merely parrot what has been said in class, has any learning actually happened? Consider the six levels of knowledge in Bloom’s Taxonomy (1956): to have knowledge, comprehension, application, analysis, synthesis and evaluation. If using technology and incentives to innovate our teaching can help our students to a deeper level of knowledge, then it is worth the experimentation.
So last year, I mused on trying out some different types of conference panels, to see if we couldn’t be doing better than our usual two-minutes-at-the-end-for-a-long-winded-question-that’s-more-a-statement-actually.
Long-standing readers (and those who’ve just gone back to the original post) will remember I’m trying out two formats here.
Format 1: flipped panels
For our panel (106) on Monday on the EU27 and Brexit negotiations, we’re trying to use a flipped approach.
That involves us recording presentations that run as long as we like and posting them online beforehand, then having only very brief one-sliders in the panel itself.
You can watch the efforts of myself (here), Natasza (here) and Petr (here) right now.
As you can see, we’ve all gone longer than the usual 15 minutes and I personally found it nice not to have to worry about the time-limit for this.
In Lisbon, we’ll have 5 minutes to present key points and then we go into a nice block of time for Q&A/discussion.
Format 2: gladiator time!
For our second panel (416) on Tuesday on Learning & Teaching, we’re getting the audience involved.
At the start, we’ll be getting everyone to access Poll Everywhere, via laptop or app, so they can vote.
The presenters will be given an initial 3 minutes to present, before the audience gets to vote on whether to give them another 3 minutes.
Presenters can get up to a total of 12 minutes, with each block being conditional on the audience being willing to go along with more of this.
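For the curious, the timing rule is simple enough to pin down precisely. Here is a toy sketch (the function name and vote labels are mine, purely illustrative, and nothing to do with Poll Everywhere’s actual tooling):

```python
# Toy model of the 'gladiator' timing rule: a presenter starts with an
# unconditional 3-minute block, and after each block the audience votes
# on granting another 3 minutes, up to a 12-minute cap. The labels
# mirror the panel's voting options: 'yes', 'yes_but' and 'no'.

def speaking_time(votes, block=3, cap=12):
    """Return total minutes granted, given the sequence of vote outcomes.

    'yes' and 'yes_but' both grant another block (the latter being a
    warning to up one's game); 'no' ends the talk immediately.
    """
    total = block  # the opening block needs no vote
    for vote in votes:
        if total >= cap or vote == "no":
            break
        total += block
    return min(total, cap)

print(speaking_time(["yes", "yes_but", "no"]))  # 9
print(speaking_time(["yes", "yes", "yes"]))     # 12
```

Three favourable votes after the opening block is all it takes to reach the 12-minute maximum, which is why the ‘cliff-edges’ mentioned below matter so much to presenters.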
We’re going to draw lots for the order of papers too, because I’m guessing it’ll take a while for the audience to get used to using their power, and it only seems fair.
I’ve given a couple of notes to the presenters to reflect upon:
How do you balance getting your key points across with trying the patience of your audience? Logically, you build in some kind of cliff-edge every 3 minutes, but what if your paper isn’t like that, or doesn’t have that kind of content? How much do you trail content, and how do you spring surprises?
How do you structure any PowerPoint you’re using? I can keep the results of the polling on my laptop, but again, what do you need to communicate to the audience, and when?
I’ll not say how I’m approaching these points, because I’m guessing there will be a variety of approaches, but I’ve found it much more thought-provoking/tricky/a pain for my content than the flipped model.
To back up the panels, I’ll be asking for feedback from both audiences and presenters on the formats, and I’ll give you a quick run-down on that post-conference, with a view to publishing at some point.
With that in mind, if someone else wants to try these formats at their event, then knock yourselves out: I’m happy sharing my feedback forms with you too.
So, as I go off to finalise that second panel, I will hope that you had a good summer. If you’re coming to Lisbon, then I hope to see you at these panels: please do either watch the flipped presentations and/or download the app.
If you’re not, then I’m sure I’ll be telling you about it soon enough!
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Carolina Curvale.
Petra Srnisova documents in her study the results of implementing constructive alignment and active learning methods in an Accounting course at the University of Economics in Bratislava. Her motivation for incorporating teaching innovation into her course emerged from an observation to which, I think, a good number of educators can relate. She acknowledged that students’ field of study tends to be related to their level of interest in a specific course. In her case, Commerce students appeared to be less worried than other students about minute details that are essential to Accounting.
The author cleverly included strategies to promote student engagement in applying theory to practice in three sessions that took place between a first and a second assessment. The innovated sessions introduced post-it, group work, and pair work activities aimed at promoting problem-solving, critical thinking, and collaborative work. The skills practiced during the sessions were expected to engage students and help them improve their performance in the assessments. The results of the study reveal that the exercise was more fruitful in achieving the second of these goals.
In regard to performance, the author compared student scores before (first assessment) and after the teaching innovation (second assessment). The scores indeed improved, from an average of 64% to 76%. A course taught the prior year, without the innovation, showed that students also improved from the first to the second assessment, but by much less (only 2 percentage points). These results encourage innovation in teaching as a way to improve student performance, although it would be interesting to control groups by their overall grade scores. In my opinion, this is a very important result that may also contribute to achieving better engagement, as students who perform better may be more prone to participate in class.
The author could not conclude, based on the collected data, that the teaching innovation produced the expected effect, that is, that active learning techniques promote student engagement. While the survey questions measuring student interest before and after the course reveal no change, on average students reported that they did pay attention in class. The qualitative data gathered from the instructor’s notes and from an external observer provide contrasting information: the instructor perceived more engagement during the innovation sessions, while the observer did not register heightened participation, although the observer attended only one session and could not fully compare the group’s performance.
The chapter systematically documents the results of adopting a teaching innovation aimed at improving both student interest and performance in an Accounting course. While the results are mixed, the experience is flawlessly analyzed and presented, and the author herself offers avenues for improving it in the future. In my view, the chapter offers interesting and practical ideas on how to improve the teaching-learning experience when the topic of the course is not directly related to the students’ major – something we can all learn from!
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Stephan Engelkamp.
Many years ago, I found myself attending a class on intercultural learning, or Etudes culturelles, as the course was called at the time. I was a young Erasmus student, enrolled at the Institut d’Etudes Politiques Strasbourg in France, and part of a large group of international students who would soon be released on Sciences Po’s regular courses. To be fair, I cannot say that I was particularly thrilled by the prospect of attending this seminar. Mostly struggling with my uneasy relationship with the French language, I did not really see the point in discussing cultural stereotypes for weeks and months.
Still, this was a mandatory course, and so I attended. For whatever it was worth, it was a good opportunity to get to know my fellow Erasmus students and maybe make some new friends. The seminar turned out to be fun and helpful. What I remember most vividly was the best part of the seminar: discussing different cultural points of view with international students, as competent practitioners of their respective cultures.
This brings me to Alexander Pechersky’s insightful contribution on the potential outcomes of introducing student-centred learning to the curriculum, specifically when teaching fuzzy but enacted concepts such as culture. The chapter reports on the results of a study the author conducted when teaching seminars on intercultural learning, which were offered to local and Erasmus students. The author starts with a contextualisation of his own academic socialisation abroad, reflecting on the different cultures of learning he experienced in different university settings during his academic path. This leads Pechersky to the following assumptions: students with a higher degree of control in student activities should be more satisfied with the learning exercise, better internalise the learning material and gain a deeper understanding of studied concepts.
To test these assumptions, the author developed a quasi-experimental research design for three seminar groups. Each seminar starts with the lecturer’s mini-lecture, which is then followed by a quiz as an icebreaker to make students more comfortable participating in the next step: a student-centred part in which students work through a case study on the session’s subject matter. The design of the three settings varies according to the degree of freedom students have in controlling the exercise. Student satisfaction and learning outcomes are traced using a survey and participant observation.
As the survey results demonstrate, the hypotheses could only be partially corroborated. While the results on learning satisfaction seem to be as expected – the more control students have, the more satisfied they are – the results regarding the learning outcomes are somewhat mixed. However, the impressions of the observing colleague seem to suggest that the group with the most control over the learning exercise had the most sophisticated discussions. A challenge of the research design may be the limited number of observations, due to the small number of students, which may make it difficult to apply even descriptive statistical methods. To address this methodological issue, the author might have considered assigning reflective essays rather than using a survey.
Methodological issues aside, I suggest an alternative way to account for the unexpected results regarding the students’ learning outcomes. As the author rightly states, “[I]n student-centered learning (SCL) the teacher assumes the role of a facilitator and invites students to participate in the learning process by relating information to prior knowledge and discussion with others.” Hence, students’ prior knowledge and experiences may be a key variable in the quasi-experiment. As the seminars focus on intercultural and communication skills, group composition may affect the learning outcome, but maybe not in the way the author assumes.
Pechersky theorises that students’ prior experience with student-centred learning may explain the outcome. An alternative explanation may relate to the content of the course, which focuses on intercultural learning, and to student background, specifically that of international students. From this angle, students’ experience with intercultural exchange may determine the learning outcomes. International students may be more invested in the subject matter due to their personal situation, which probably allows them to bring in their personal experiences. In any case, Pechersky’s contribution raises interesting questions about the variables of success in student-centred learning. I would love to see a follow-up piece drawing on a larger set of observations! As an Erasmus alumnus, I clearly see the value of taking individual intercultural experiences seriously, both inside and outside the classroom.
Wonderfully, several parts of my professional life have seen recent instances of people being less than completely satisfied with my work, or that of my colleagues.
I say wonderfully not sarcastically, but genuinely, because it’s a great opportunity to question whether things are working or not.
It’s easy to get stuck into a rut, where you’re comfortable or feel you have the answers. Sometimes you need a bit of a bump to get you re-engaged.
(Of course, some bumps are a bit bigger than others, but hey, what are you going to do?)
Sometimes these things happen because you’ve not kept up your standards; sometimes because things you thought were explicit have become implicit; sometimes because things have changed.
In each case, it’s more useful that your response is open and enquiring, rather than closed and defensive. No-one likes being told that their work isn’t up to scratch, but that’s not a good enough reason to carry on regardless.
Importantly, you have to start by recognising that problems with your work aren’t the same as problems with you. Contain the issue to your actions – which you can control – rather than allow it to become a comment on your being – which you can’t.
This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Niina Meriläinen.
Many states in Europe are now experiencing growing social and political passivity among some young people, which may lead to their lifelong marginalization and radicalization. To reverse this trend, many university teachers feel committed to encouraging and empowering their students to become active in their communities, and to play a more active role in the democratic development of societies and in supporting the principles of human rights and the rule of law.
In his chapter, Martin Karas, from the University of Economics in Bratislava, reports on various aspects of how to engage non-political science undergraduates. Karas introduced three active learning exercises – a debate, a group analysis of primary sources, and cut-up cards – to help students distinguish between various political science concepts, achieve higher levels of student engagement, improve knowledge retention, and create understanding of political actors, issues and practices. While researching the effects of these exercises on student learning, he combined qualitative and quantitative methods.
Whereas the active learning methods led to higher levels of student engagement in Karas’ class, they did not significantly affect knowledge retention and understanding. Karas’ results align with the literature, which reports a connection between active learning and student engagement but no robust evidence of a link between active learning and knowledge retention and understanding.
Karas’ findings are similar to those of Meriläinen, Pietilä, & Varsaluoma (2018) and Meriläinen’s forthcoming research (2019). The latter research focuses on engaging non-subject students in vocational schools to contribute to social change processes, including law drafting. Naturally, we need to understand that there are various forms of youth engagement and participation in social and political change processes: some that gatekeepers such as officials give credibility to, and some that those gatekeepers overlook and view as non-credible.
Karas’ research illustrates that engaging non-subject students can increase quantitative participation but may not result in a long-term qualitative understanding of the issues, actors and events in the political sphere. This implies that more effective learning methods (and more multidisciplinary research) are needed to achieve lasting and profound awareness, engagement and participation of non-subject students in societal and political change processes.
If we wish to achieve desired change towards equal, human rights based, inclusive and sustainable societies, active citizenship and participation of youths is an essential requirement. Karas’ chapter is an exciting example from this research field. While designing learning methods for non-subject students, various actors working together, such as researchers, teachers, youth workers and volunteers, should take into account several key issues.
This includes making the courses “accessible” for various groups of students – including individuals with disabilities, students with differing language competences, asylum seekers, and migrants in general – and paying attention to power relations among youths and to the dissimilar interests and skills of various youths. As they are not a homogenous group that engages with one voice, a “one size fits all” teaching and learning method cannot effectively reach all of them. Because youths from various backgrounds should become agenda setters in the democratic development of societies, teaching and learning methods must be designed to address their different needs.
I spent the weekend cycling through Northern England, with an old school friend.
If I were so-minded, I could write you a whole blog post about how this was an analogy to the learning process, or to our professional careers, or something else.
But I won’t.
I won’t, not because I can’t, or because it’s not useful, but rather because during one of the quieter stretches of our ride, when we’d exhausted the catching-up chat and the comments on the beauty of it all, I got to thinking about the place of metaphor in teaching.
As you do. Or, at least, as I do.
It’s a truism to note that all teaching proceeds by metaphor and analogy: we explain the things people don’t understand by drawing connections across to things they do understand. Everything is like everything else, in some way.
However, it’s easy to forget this, to think that we are building radically new structures of understanding for our students with what we do.
But even that notion of ‘structures’ is an analogy: it gives us a visual metaphor for how we can understand this very thing.
You’ll be unsurprised to learn that I found a rather good example of this on Twitter, using almost-Lego:
As often happens with my Thoughts While Cycling, I’m not sure it comes to a whole lot, but if we recognise that this happens, then we can use it to good, or better, effect.
Importantly, it also helps us to be wary of misusing metaphors, as happens a lot in my other work: metaphors can blind us to things as much as they can enlighten us.