Does the Question Determine the Answer?

Regular readers of this blog know that I sometimes ponder the clarity of my assignment and exam prompts (some past posts on this subject are here, here, and here). Students sometimes don’t hit what, in my mind, the question targets, so I revise in the hopes of creating a prompt that is more transparent. But I don’t want prompts to be answerable with a Jeopardy-like regurgitation of facts. I want students to exert some cognitive effort to figure out how to apply concepts that are relevant to the question at hand.

Usually this situation occurs with my undergraduates, but I’m noticing it more frequently with master’s degree students. A recent example is an assignment from my graduate-level introduction to comparative politics course:

Continue reading

What Do Grades Mean?

What do grades actually mean? I began pondering this question while designing a course for the fall semester. Theoretically a grade indicates the amount of knowledge or skill that a student possesses. But really? Those of us working in the USA are quite familiar with grade inflation. A final grade of C today probably doesn’t indicate the same level of knowledge or skill proficiency as the C from fifty years ago. There is also the persistent problem of knowing whether our assessment tools are measuring the types of learning that we think they are, or want them to. And it is probably safe to assume that, both in and out of the classroom, a lot of learning is happening that we just aren’t interested in trying to measure. The situation gets even more complex given that (again, in the USA) a “learning activity” often won’t function as intended if students believe that it has no discernible effect on their course grades.

I structure my syllabi so that the sum total of points available from all assessed work is greater than what is needed for any particular final grade. For example, a student might need to accumulate at least 950 points over the semester for an A, but there could be 1,040 points available. I do this deliberately to create wiggle room for students: with so many assignments, students don’t need to get perfect scores on, or even complete, all of them. While this leads to higher grades in my courses than if I graded strictly on a bell curve, I want to give students plenty of opportunities to practice, fail, and improve. And I firmly believe that sloppy writing indicates sloppy thinking, while good writing indicates the converse. So in reality, what I’m doing with most of my assignments is evaluating the writing abilities of my students.
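To make the arithmetic of that wiggle room concrete, here is a minimal sketch of how a point budget like this can map onto letter grades. Only the 950-point A threshold and the 1,040 available points come from the example above; the other cutoffs and the function itself are purely illustrative, not my actual scale.

```python
# Illustrative point-budget grading: cutoffs are fixed point totals rather
# than percentages of the points available, so a student can skip or fumble
# some assignments and still reach a given grade. Only the 950-point A and
# the 1,040 available points come from the post; the rest is hypothetical.
POINTS_AVAILABLE = 1040
GRADE_CUTOFFS = [("A", 950), ("B", 850), ("C", 750), ("D", 650)]

def final_grade(points_earned: int) -> str:
    for letter, cutoff in GRADE_CUTOFFS:
        if points_earned >= cutoff:
            return letter
    return "F"

print(final_grade(960))  # "A" despite leaving roughly 80 points on the table
```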

This system often produces a bimodal grade distribution clustered toward the high end. Expend a lot of effort and demonstrate a certain level of proficiency, and you will get a grade somewhere between an A and a B-. Choose not to expend the effort, or consistently demonstrate an inability to perform at a minimum level, and you will get a D or an F. I’m comfortable with this result, in part because I know from the cognitive science research on learning that repeated exposure and frequent testing build long-term memory.

This leads me to the reason for doubting that grades in my courses mean the same thing as they do in courses where the only assessment comes from mid-term and final exams composed of multiple-choice questions. Yes, the proportion of A’s in the latter might be lower than in the former, but I bet that on average my students are retaining more. At least I like to think that’s the case. There is no way for me to be sure.

Call for Proposals

The next New England Faculty Development Conference will be held on November 8 at the College of the Holy Cross in Worcester, Massachusetts. The deadline for proposals is August 17. Full details are here. The NEFDC is totally teaching-oriented and interactive workshops are encouraged.

I am the new Director of Faculty Development at my university, as well as managing editor of this blog, so please get in touch if you would like to publicize a teaching-related conference or event.

A Classroom Competition in Risk Taking

Today we have a guest post from Kyle Haynes, assistant professor of political science at Purdue University. He can be reached at kylehaynes [at] purdue [dot] edu.

Thomas Schelling’s (1966) groundbreaking work on “brinkmanship” explains how deterrent threats are made credible between nuclear-armed opponents. Schelling argued that although rational leaders would never consciously step off the ledge into nuclear Armageddon, they might rationally initiate a policy that incurs some risk of events spiraling into an inadvertent nuclear exchange. Whichever state can tolerate a greater risk of accidental disaster could then escalate the crisis until the adversary, unwilling to incur any additional risk, concedes. For Schelling, this type of crisis bargaining is a competition in risk taking. I use the following simulation to teach this concept:

The simulation begins by randomly splitting the entire class into pairs of students. One student in each pair is designated as Player 1 (P1), the other as Player 2 (P2). At the beginning of each game the instructor places nine white table tennis balls and a single orange table tennis ball into an empty bowl or small bucket. In Round 1 of the game, P1 must decide whether to concede the first extra credit point to P2, or to “stand firm” and refuse to concede. If P1 concedes, P2 receives one point and P1 receives zero points. If P1 stands firm, the instructor will blindly draw a single ball from the ten in the bowl. If the instructor draws a white ball, both players survive, and the game continues to the next round. If the instructor draws an orange ball, then “disaster” occurs and both players lose two points.

If the game continues to the second round, the instructor removes a white ball from the bowl and replaces it with another orange ball: there are now eight white balls and two orange balls. It is P2’s turn to decide whether to stand firm or concede. If P2 concedes, P1 receives one point. If P2 stands firm and the instructor draws a white ball, both players survive, and the game continues to Round 3. If, however, the instructor draws an orange ball, both players lose two points.
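For readers who want a sense of how quickly the danger mounts, here is a minimal Python sketch of the underlying arithmetic. It assumes the pattern from the first two rounds continues (one white ball swapped for an orange one each round) and that neither player ever concedes; everything else about the snippet, including the function name, is illustrative rather than part of the original exercise.

```python
# A rough sketch of the escalating risk in the Schelling-style game above.
# Assumption (the excerpt stops at Round 2): each new round swaps one more
# white ball for an orange one, so Round n has n orange and 10 - n white balls.
# The script shows how fast the cumulative chance of "disaster" grows if
# neither player ever concedes; it is an illustration, not the authors' code.

def disaster_probabilities(max_rounds: int = 9):
    survive_so_far = 1.0
    for n in range(1, max_rounds + 1):
        p_disaster_this_round = n / 10          # orange balls / total balls
        survive_so_far *= 1 - p_disaster_this_round
        yield n, p_disaster_this_round, 1 - survive_so_far

for round_no, p_round, p_cumulative in disaster_probabilities():
    print(f"Round {round_no}: disaster this round {p_round:.0%}, "
          f"cumulative {p_cumulative:.1%}")
```

Under those assumptions, by the third round the cumulative chance of disaster already approaches a coin flip, which is exactly the pressure that makes the exercise a competition in risk taking.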

Continue reading

Teaching innovation improves student performance

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Carolina Curvale.

In her study, Petra Srnisova documents the results of implementing constructive alignment and active learning methods in an Accounting course at the University of Economics in Bratislava. Her motivation for incorporating teaching innovation into her course emerged from an observation to which, I think, a good number of educators can relate. She acknowledged that students’ field of study tends to be related to their level of interest in a specific course. In her case, Commerce students appeared to be less concerned than other students with the minute details that are essential to Accounting.

The author cleverly included strategies to promote student engagement in applying theory to practice across three sessions that took place between the first and second assessments. The redesigned sessions introduced post-it, group work, and pair work activities aimed at promoting problem-solving, critical thinking, and collaborative work. The skills practiced during the sessions were expected to engage students and help them improve their performance on the assessments. The results of the study reveal that the exercise was more fruitful in achieving the second of these goals.

In regard to performance, the author compared student scores before (first assessment) and after (second assessment) the teaching innovation. The scores indeed improved, from an average of 64% to 76%. A course taught the prior year, without the innovation, showed that students also improved from the first to the second assessment, but by much less (only 2 percentage points). These results encourage innovation in teaching as a way to improve student performance, although it would be interesting to control for the groups’ overall grade scores. In my opinion, this is a very important result that may also contribute to better engagement, as students who perform better may be more prone to participate in class.

The author could not conclude, based on the collected data, that the teaching innovation produced the expected effect, that is, that active learning techniques promote student engagement. While the survey questions measuring student interest before and after the course reveal no change, students on average reported that they did pay attention in class. The qualitative data gathered from the instructor’s notes and from an external observer provide contrasting information: the instructor perceived more engagement during the innovation sessions, while the observer did not register heightened participation, although the observer attended only one session and could not fully compare the group’s performance.

The chapter systematically documents the results of the adoption of teaching innovation aimed at improving both student interest and performance in an Accounting course. While the results are mixed, the experience is flawlessly analyzed and presented, and the author herself offers avenues for improving the experience in the future. In my view, the chapter offers interesting and practical ideas on how to improve the teaching-learning experience when the topic of the course is not directly related to the students’ major – something we can all learn from!

Bringing culture back in: a comment on Pechersky’s study on student-centred learning

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Stephan Engelkamp.

Many years ago, I found myself attending a class on intercultural learning, or Etudes culturelles, as the course was called at the time. I was a young Erasmus student, enrolled at the Institut d’Etudes Politiques Strasbourg in France, and part of a large group of international students who would soon be released into Sciences Po’s regular courses. To be fair, I cannot say that I was particularly thrilled by the prospect of attending this seminar. Mostly struggling with my uneasy relationship with the French language, I did not really see the point in discussing cultural stereotypes for weeks and months.

However, this was a mandatory course, and so I attended. For whatever it was worth, it was a good opportunity to get to know my fellow Erasmus students and maybe make some new friends. The seminar turned out to be fun and helpful. What I remember most vividly was what turned out to be the best part of the seminar: discussing different cultural points of view with international students, as competent practitioners of their respective cultures.

This brings me to Alexander Pechersky’s insightful contribution on the potential outcomes of introducing student-centred learning to the curriculum, specifically when teaching fuzzy but enacted concepts such as culture. The chapter reports on the results of a study the author conducted when teaching seminars on intercultural learning, which were offered to local and Erasmus students. The author starts with a contextualisation of his own academic socialisation abroad, reflecting on the different cultures of learning he experienced in different university settings during his academic path. This leads Pechersky to the following assumptions: students with a higher degree of control in student activities should be more satisfied with the learning exercise, better internalise the learning material and gain a deeper understanding of studied concepts.

To test these assumptions, the author developed a quasi-experimental research design across three seminar groups. Each seminar starts with a mini lecture from the lecturer, followed by a quiz that serves as an icebreaker to make students more comfortable participating in the next step: a student-centred segment in which students work through a case study on the session’s subject matter. The design of the three settings varies according to the degree of freedom students have in controlling that exercise. Student satisfaction and learning outcomes are traced using a survey and participant observation.

As the survey results demonstrate, the hypotheses could only be partially corroborated. While the results on learning satisfaction seem to be as expected – the more control students have, the more satisfied they are – the results regarding learning outcomes are somewhat mixed. However, the impressions of the observing colleague suggest that the group with the most control over the learning exercise had the most sophisticated discussions of concepts.

One challenge of the research design may be the limited number of observations, due to the small number of students, which makes it difficult to apply even descriptive statistical methods. To address this methodological issue, the author could have considered assigning reflective essays rather than using surveys.

Methodological issues aside, I suggest an alternative way to account for the unexpected results regarding the students’ learning outcome. As the author rightly states, “[I]n student-centered learning (SCL) the teacher assumes the role of a facilitator and invites students to participate in the learning process by relating information to prior knowledge and discussion with others.” Hence, students’ prior knowledge and experiences may be a key variable in the quasi-experiment. As the seminars focus on intercultural and communication skills, group composition may affect the learning outcome, but perhaps not in the way the author assumes.

Pechersky theorises that students’ prior experience with student-centred learning may explain the outcome. An alternative explanation may relate to the content of the course, which focuses on intercultural learning, and to student background, specifically that of international students. From this angle, students’ experience with intercultural exchange may determine the learning outcomes. International students may be more invested in the subject matter due to their personal situation, which probably allows them to bring in their personal experiences more effectively.

In any case, Pechersky’s contribution raises interesting questions about variables of success of student-centred learning. I would love to see a follow-up piece drawing on a larger set of observations! As an Erasmus alumnus, I clearly see the value of taking individual intercultural experiences seriously, both inside and outside the classroom.

Managing discontent

Wonderfully, several parts of my professional life have seen recent instances of people being less than completely satisfied with my work, or that of my colleagues.


I say wonderfully not sarcastically, but genuinely, because it’s a great opportunity to question whether things are working or not.

It’s easy to get stuck into a rut, where you’re comfortable or feel you have the answers. Sometimes you need a bit of a bump to get you re-engaged.

(Of course, some bumps are a bit bigger than others, but hey, what are you going to do?)

Sometimes these things happen because you’ve not kept up your standards; sometimes because things you thought were explicit have become implicit; sometimes because things have changed.

In each case, it’s more useful that your response is open and enquiring, rather than closed and defensive. No-one likes being told that their work isn’t up to scratch, but that’s not a good enough reason to carry on regardless.

Importantly, you have to start by recognising that problems with your work aren’t the same as problems with you. Contain the issue to your actions – which you can control – rather than allowing it to become a comment on your being – which you can’t.

Continue reading

To engage youths in studying political processes – there are no “one size fits all” methods

This guest post is part of a series linked to the publication of G. Pleschova & A. Simon (eds.) Learning to teach in central Europe: Reflections from early career researchers. This post comes from Niina Meriläinen.

Many European states are now experiencing growing social and political passivity among some youths, which may lead to their lifelong marginalization and radicalization. To reverse this trend, many university teachers feel committed to encouraging and empowering their students to become active in their communities, and to play a more active role in the democratic development of their societies and in supporting the principles of human rights and the rule of law.

In his chapter, Martin Karas, from the University of Economics in Bratislava, reports on various aspects of how to engage non-political science undergraduates. Karas introduced three active learning exercises – a debate, a group analysis of primary sources, and cut-up cards – to help students distinguish between various political science concepts, to achieve higher levels of student engagement, to improve knowledge retention, and to create understanding of political actors, issues and practices. While researching the effects of these exercises on student learning, he combined qualitative and quantitative methods.

Whereas the active learning methods led to higher levels of student engagement in Karas’ class, they did not significantly affect knowledge retention and understanding. Karas’ results align with the literature, which reports a connection between active learning and student engagement but no robust evidence of a link between active learning and knowledge retention or understanding.

Karas’ findings are similar to those of Meriläinen, Pietilä, & Varsaluoma (2018) and Meriläinen’s forthcoming research (2019). The latter focuses on engaging non-subject students in vocational schools to contribute to social change processes, including law drafting. Naturally, we need to understand that there are various forms of youth engagement and participation in social and political change processes: some that gatekeepers such as officials deem credible, and some that those gatekeepers overlook and view as non-credible.

Karas’ research illustrates that engaging non-subject students can increase quantitative participation but may not result in a long-term, qualitative understanding of the issues, actors, and events in the political sphere. This implies that more effective learning methods (and more multidisciplinary research) are needed to achieve lasting and profound awareness, engagement, and participation of non-subject students in societal and political change processes.

If we wish to achieve the desired change towards equal, human rights-based, inclusive, and sustainable societies, the active citizenship and participation of youths are essential. Karas’ chapter is an exciting example from this research field. While designing learning methods for non-subject students, the various actors working together, such as researchers, teachers, youth workers and volunteers, should take several key issues into account.

These include making courses “accessible” to various groups of students (including students with disabilities, students with differing language competences, asylum seekers, and migrants in general), and paying attention to power relations among youths and to their dissimilar interests and skills. Youths are not a homogenous group that engages with one voice, so no “one size fits all” teaching and learning method can effectively reach all of them. Because youths from various backgrounds should become agenda setters in the democratic development of societies, teaching and learning methods must be designed to address their different needs.

What Sticks?

Inside Higher Ed recently published a column written by a community college dean on the most important subjects one took or could take in high school — part of a larger conversation that originated on Twitter. Responses to the column mentioned:

  • Theater productions, to learn how to work with other people who have different perspectives and objectives.
  • A foreign language, to learn principles of grammar that allow one to become a better communicator in English.
  • Typing, to learn how to communicate more quickly with less effort.
  • Bookkeeping, to learn how to manage one’s personal finances.

Comments also referenced the processes through which the learning occurred. For example, one person mentioned that he gained a better understanding of the here and now when a history teacher worked backward from the present instead of using the traditional method of moving from the distant past toward today (which in high school is almost never reached).

The column and the comments got me thinking about the same question as applied to college. What undergraduate course was the most useful to you, and why?

As I wrote a few years ago, I generally don’t remember anything about the content of my college courses. Sorry, James Clerk Maxwell, I’ve forgotten how to use your equations. But I do have memories of what actions I performed when I originally learned the content and how I felt when that happened. The general process stuck. The specific outcome did not.

While I have tried in my own teaching career to better emphasize process over content, I still don’t get the kind of feedback contained in the Inside Higher Ed piece. My university doesn’t collect data on this level from alumni. So maybe it’s time I started doing it myself with a survey.

“You know, it’s like…”

I spent the weekend cycling through Northern England, with an old school friend.

It didn’t rain all the time…

If I were so-minded, I could write you a whole blog post about how this was an analogy to the learning process, or to our professional careers, or something else.

But I won’t.

I won’t, not because I can’t, or because it’s not useful, but rather because during one of the quieter stretches of our ride, when we’d exhausted the catching-up chat and the comments on the beauty of it all, I got to thinking about the place of metaphor in teaching.

As you do. Or, at least, as I do.

It’s a truism to note that all teaching proceeds by metaphor and analogy: we explain the things people don’t understand by drawing connections across to things they do understand. Everything is like everything else, in some way.

However, it’s easy to forget this, to think that we are building radically new structures of understanding for our students with what we do.

But even that notion of ‘structures’ is an analogy: it gives us a visual metaphor for how we can understand this very thing.

You’ll be unsurprised to learn that I found a rather good example of this on Twitter, using almost-Lego.

As often happens with my Thoughts While Cycling, I’m not sure it comes to a whole lot, but if we recognise that this happens, then we can use it to good, or better, effect.

Importantly, it also helps us to be wary of misusing metaphors, as happens a lot in my other work: metaphors can blind us to things as much as they can enlighten us.