Notes From a Conference of Damp Showers and Wet Snow*

Continuing on a theme . . . some notes on today’s pedagogical discussion at ISA 2023:

Teachers want to create an environment that facilitates learning and stimulates a spirit of curiosity. Students may have different expectations. As one session participant put it, students can have the purely transactional attitude of “I’m not going into debt so I can feel emancipated.”

In a similar vein, we talk about what students should get out of a college education, but we don’t ask what they bring to it; e.g., a K-12 education where the teacher was the sole authority in the classroom.

So we are frequently faced with a situation where students don’t want to engage with new knowledge in unpredictable ways, because doing so makes them feel uncomfortable — a feeling they do their best to avoid.

To resolve this dilemma, students need to become familiar with tools for giving and receiving feedback productively so that they can learn from each other. They also need to learn how to articulate why they hold certain positions, why those positions are important to them, and what they mean when they state those positions.

During the conversation, I thought of a tweak to an assignment that might help with the above. As I have written previously, many of my students are unable to identify the author’s thesis, independent variables, and dependent variable in Perusall readings. I’m thinking of adding “What is a question about this article that you want answered?” to the assignments, with the stipulation that the answer needs to come from their classmates, not me. This could also be a way of getting students to design their own quiz questions.

*Allusion to 19th-century Russian literature, of which I am mostly ignorant — a known unknown that I am at present mostly comfortable with.

Discussion-Based Quizzes 2

First, I hope to see folks at some of the pedagogy-oriented sessions at this week’s ISA meeting in Montreal. Feel free to chat with me or the illustrious Dr. Simon Usherwood.

Second, a follow-up to my post last month about no-stakes quizzes and class discussion.

I gave students another quiz, on survivorship bias, a topic of the reading assigned earlier in the week. Here is the prompt for the questions (note that the scenario is a fictionalized version of a famous World War II example):

The United States has gone to war against Zambia. A high percentage of U.S. warplanes are being shot down by the Zambian military. You work for the U.S. Department of Defense as an operations research analyst. You have been given the task of recommending where additional armor plating should be installed on U.S. warplanes to better protect them from enemy fire. The image below shows a representative pattern of damage from Zambian anti-aircraft fire to U.S. warplanes that have made it back to home base. 

Responses to question 1:

Responses to question 2:

Despite the tiny sample, I think the quiz scores illustrate how easy it is to memorize a concept’s definition while being unable to meaningfully apply it. Students frequently equate memorization with knowing, and hence with learning, when mostly it is neither.

Discussion-Based Quizzes

Alternative title: Stuff Just Happens, a persistent student mental paradigm that I’ve written about before (here and here).

I’m teaching an undergraduate introduction to research methods course this semester, for which I created a series of practice quizzes that contribute nothing to the final course grade. I expected the quizzes, which range from one to three questions, to function as retrieval practice and launch pads for class discussion. This is the same idea behind polling students to check their comprehension while lecturing, but I used the Canvas LMS quiz tool instead of a polling platform because 1) I was too lazy to learn the latter, and 2) I wanted all course content housed in one place.

The activity is not working as well as I thought it would, for a few reasons. First, Canvas identifies the correct answer to each question when displaying quiz results, as shown below, which shuts down discussion about which option is correct and why. A pie chart that doesn’t label the correct answer, like the one Google Forms produces, would work better.

Second, this is a small class that meets at 8:00 a.m. The quantity and quality of discussion decline markedly when even a few students are absent, which is usually the case.

But most importantly, given their answers to the quiz questions, students are still having a lot of difficulty forming plausible and potentially valid causal inferences from the data that I present. I’ve given six practice quizzes so far, and on average students answer questions correctly only about 50% of the time. Here is a typical example from a prior quiz:

Based on the visual representation of data below, a logical hypothesis is:

Advice for Job-Fillers

Although probably too late in the calendar year to put into practice, here is a flip-side follow-up to my prior post about campus interviews:

It’s time to stop the costly performative rituals that are contributing to the disintegration of the academy.

Move to a twice- or even thrice-yearly hiring cycle. Academia is the only industry I know of that limits hiring to an annual schedule. If you run your searches on a staggered timeline, your applicant pool will probably improve and your top choices will be less likely to be snagged by competitors.

Stop requiring recommendation letters at the beginning of the search process. Demanding them up front wastes the time of letter writers and applicants. We all know that an overwhelming percentage of these letters are never read because the applications they are part of quickly get tossed. Get references only for the candidates on your short list, and then actually check them.

While the science supports eliminating job interviews entirely, this probably isn’t going to happen, so at least make them less onerous. The pandemic demonstrated that there is no need to bring finalists to campus. And there is no demonstrated benefit in subjecting them to one to two days of back-to-back meetings with people who have no direct effect on the specified duties of the position. Is it essential for every candidate to have a 30-minute conversation with the Associate Director of Strategic Student Development Initiatives? No one who interviews for an IT or facilities management staff position has to suffer through this, and those units function perfectly well.

Finally, per the article linked to above, structure the applicant evaluation process to minimize bias and noise. Use rubrics to score candidates on clearly defined criteria. Collect the results, average them, and distribute this information to the search committee before discussion of the applicants’ relative merits. This will help prevent any single person in the room from unreasonably affecting the outcome.
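The score-then-average step can be sketched in a few lines of Python. This is only an illustration of the procedure — the reviewer names, candidate names, three rubric criteria, and 1–5 scale below are all hypothetical:

```python
# Hypothetical rubric scores: committee member -> candidate -> criterion
# scores (three criteria, each rated 1-5).
scores = {
    "reviewer_a": {"candidate_1": [4, 5, 3], "candidate_2": [3, 3, 4]},
    "reviewer_b": {"candidate_1": [5, 4, 4], "candidate_2": [2, 4, 3]},
    "reviewer_c": {"candidate_1": [3, 4, 5], "candidate_2": [4, 3, 3]},
}

def average_scores(scores):
    """Average each candidate's rubric mean across all committee members."""
    per_candidate = {}
    for reviewer in scores.values():
        for candidate, criteria in reviewer.items():
            # Each reviewer contributes one number: their mean across criteria.
            per_candidate.setdefault(candidate, []).append(sum(criteria) / len(criteria))
    return {c: round(sum(v) / len(v), 2) for c, v in per_candidate.items()}

print(average_scores(scores))  # → {'candidate_1': 4.11, 'candidate_2': 3.22}
```

Circulating only these committee-wide averages before the meeting is what keeps one loud voice from anchoring everyone else’s judgment.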

How To Identify Problem Students

While going down the YouTube rabbit hole early last September, I stumbled across this video by a Canadian lawyer about the three types of clients to avoid. These clients display urgency, flattery, or (lack of) responsibility — often simultaneously. As stated in the video, these signals occur in any customer service industry. I’ve certainly seen them, and probably you have, too.

Urgency — a student claims to have an emergency that requires your immediate action. Questions for you to ask: “Is this a real or perceived emergency? Did the situation arise because of the student’s behavior?” In a serendipitous example, two weeks after watching the video, I received an email from a student with “URGENT CONCERN” in the subject line. It wasn’t urgent, nor was it my concern.

Flattery — a student says that you are the only professor who can resolve their problem. It is an attempt to distract you from the real cause of the situation. E.g., “This is my favorite course, but it’s the only one I’m doing badly in this semester, and if my GPA drops below X, I will lose my scholarship and have to drop out of college. Are there any extra credit assignments?”

Responsibility — nothing is the student’s fault. For example (actual email I received last month): “The wi-fi is completely shut down on campus and I can’t submit anything, I’ve been trying to for the past hour. I know our assignment is due and I’ve tried submitting it but I don’t know what to do. I can attach the writing here but can’t upload anything to Canvas.” My response? “The library has computers on every floor.”

Advice For Job-Seekers

We’re in peak season for campus interviews. At this stage of my career, I’ve had and seen many of them. So a bit of advice to those whose applications ended up at the top of the list:

We’ve all probably heard the statement, “No one in the room knows as much about your topic as you do,” intended to alleviate the anxiety of speaking before an audience of strangers. In an attempt to strengthen the performance of job candidates, I now propose the Chad Raymond Corollary: “No one in the room is as interested in your topic as you are.”

If your interview includes a research presentation or a teaching demo, practice multiple times — by which I mean full dress rehearsals, not just reviewing what you think you’re going to say in your mind. Boil the talking down to one or two main points. Eliminate words that you stumble over. Whittle down the content until you can deliver the entire presentation at a steady, deliberate pace within the specified time limit.

Then cut at least another 25 percent. Preferably more.

Why? Expect a delay because the technology in the room isn’t working properly. Then introductory announcements by the hosts. And you will need to entertain questions at the end. But mainly because people don’t really want to hear someone else speak non-stop for 45 minutes about a topic that has less import to them than what they will eat for dinner that night.

The above also applies to conference presentations.

Last, and I can’t stress this enough, figure out how you are going to engage your audience. If you prompt the people in the room to do something, they will likely form a stronger, more positive view of your abilities. At minimum, they won’t be looking at their phones.

New Journal – Call for Papers

An announcement:

The School for International Training (SIT) is debuting an academic journal for the publication of research on the world’s most critical global issues.

The new Journal of Critical Global Issues, a peer-reviewed, open-access digital journal, will contribute to SIT’s mission to educate future scholars and professionals to address critical issues in pursuit of a more sustainable, peaceful, and just world. The journal aspires to support respectful communities, foster intercultural understanding, advocate for social justice and inclusion, and promote sustainability.

The Journal of Critical Global Issues invites proposals from researchers and scholars to contribute to an online roundtable discussion in May focused on the following areas: climate and the environment; development and inequality; education and social change; geopolitics and power; global health and well-being; identity and human resilience; and peace and justice. Roundtable presenters will have the opportunity to publish work related to their presentations in the inaugural issue of the Journal of Critical Global Issues. We seek contributors from diverse theoretical and methodological perspectives to join us for this event.

Event information:

Location: Virtual
When: May 15-17, 2023
To submit a proposal for a roundtable discussion, please submit a 500-word abstract of your presentation here by February 15.

Questions? Contact university.relations@sit.edu.

How Do I Get An “A”?

Last summer, when building LMS sites for my fall semester undergraduate courses, I inserted a link titled “How do I get an ‘A’?”, assuming it would get students’ attention. The link was to this short video about the importance of deadlines.*

I decided to expand on this idea for the spring semester and beyond, with an LMS page that contains the link to the video and this advice:

  • The due date is not the do date. Instructions and deadlines for all course assignments are available in the syllabus and on Canvas from the beginning of the semester. Plan ahead and complete assignments several days before they are due.
  • See the syllabus for the location of reading assignments. Ask librarians how to access these materials at no cost. There are computers available for this in the library and at other campus locations.
  • Revise your writing to eliminate as many unnecessary words as possible. Bad writing is an indication of sloppy thinking. If you are not familiar with the revision process, use the Writing Center.
  • Read the feedback on the quality of your work that is contained in assignment rubrics and my comments. It is not possible for me to care more about your learning than you do.
  • Sleep, eat, and exercise. Sufficient quantities of each are necessary for learning.

While the above can be construed as facilitating more learned helplessness among students, I’m finding that my syllabus quizzes just aren’t doing the job of communicating some of the most basic academic aspects of being a college student.

*Courtesy of TikTok via Reddit. Not something I created.

The Death of Curiosity? Part 3

A final review of the previous semester, this time on my course about environmental politics and economic development. I tweak the design and content of this course every year, probably because it’s my favorite topic to teach (some prior examples of this here and here).

As in the other undergraduate course that I taught, I administered my own course evaluation; the sample is 18 of 22 students. Here are the results for the questions with a 5-point scale from “strongly agree” to “strongly disagree”:

  • I now have a better understanding of the causes of poverty and economic growth: 4.3
  • I now have a better understanding of the relationship between economic development, environmental change, and risk: 4.4
  • The game design project helped me learn about environmental vulnerability and risk analysis: 3.6
  • I am now better able to use risk analysis as a decision making tool in my own life: 4.0
  • More courses at this university should include training in skills like risk analysis: 4.2

The relatively low score for the third question matched my observations. As in previous iterations of the course, teams of students designed games. This year I specified that the games needed to teach players about the environmental vulnerabilities faced by business owners. I devoted portions of some classes to presentations about system design and failure, and there were many writing assignments about the relationships between economic development, climate change, and risk. Yet, as in prior years, the games students built had little relevance to the design objective. In terms of mechanics, they mainly resembled Monopoly or Life.

I’m taking this as a sign that I need to impose even more limitations on the creativity students can but don’t exercise on this project. Next year I’m going to require that the games:

  • Be played on a board that is a map of the local community.
  • Have player roles that focus on a specific industry or institution threatened by climate change — such as tourism, food, or housing.
  • Contain mechanics that take into account the system components of place, people, and processes.

The good news is that I was completely surprised by the answers to the evaluation question “My favorite reading in the course?” Eleven of the respondents named the novel How to Get Filthy Rich in Rising Asia, by Mohsin Hamid. Comments about the book included:

  • Clearly written and entertaining.
  • Nice to be able to connect with a character throughout the story.
  • Explained the timeline of a developing country through a perspective that I could visualize.
  • Unique and thought provoking.

I’ll definitely be including this novel in the course next year.

The Death of Curiosity? Part 2

Continuing to review my fall semester . . .

The forecasting project might have helped students learn Middle East politics and history. I’d rate it as a success on that front. As to whether their decision making skills have improved from using the CHAMP method, who knows?

At five different points in the semester, students forecasted the likelihood of these events occurring by December 9:

  • The value of the Turkish lira against the U.S. dollar decreases to less than 22:1.
  • In Iran, the Assembly of Experts names a new Supreme Leader.
  • An anti-government protest in Cairo results in at least twenty demonstrators arrested, injured, and/or killed.
  • The president or prime minister of Lebanon is assassinated.
  • Turkey ends its occupation of Syrian territory.

None of these events happened before the deadline, but that was ok given my purposes for the project. Here are the class’s predictions, with average percentage probability on the y-axis:
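The class-average lines on a chart like this are just per-checkpoint means of the individual estimates. A minimal sketch in Python — the checkpoint labels and probabilities below are invented for illustration, not the actual class data — with a Brier score helper, since scoring forecasts against outcomes (here, none of the events occurred) is the natural next step:

```python
# Hypothetical forecasts: each student's probability estimate (0-1) for one
# event, recorded at two sample checkpoints (a real run would have five).
forecasts = {
    "checkpoint_1": [0.30, 0.50, 0.20, 0.40],
    "checkpoint_5": [0.10, 0.25, 0.05, 0.20],
}

def class_average(forecasts):
    """Mean of the class's probability estimates at each checkpoint."""
    return {cp: round(sum(ps) / len(ps), 2) for cp, ps in forecasts.items()}

def brier_score(p, occurred):
    """Squared error of one forecast: 0 is a perfect call, 1 is the worst.

    `occurred` is 1 if the event happened by the deadline, else 0.
    """
    return (p - occurred) ** 2

print(class_average(forecasts))  # → {'checkpoint_1': 0.35, 'checkpoint_5': 0.15}
```

If the checkpoint averages drift toward zero for events that never happen, as in the invented numbers above, that is at least weak evidence the class is updating in the right direction.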

I need to tweak some of the project’s components. For example, the prompt for the last individual assignment — assess how your forecasts have been affected by cognitive biases — included this statement:

“People like Daniel Kahneman, Charles Wheelan, Tim Harford, Gerd Gigerenzer, and Nassim Taleb have written about cognitive biases and how to counter their effects.”

A few students did not discuss cognitive biases at all. Others clearly did a bad job of Googling “cognitive biases” and what the above individuals have written about them. In the future I’ll need to assign a specific reading on the topic. I see this as another manifestation of student inability or unwillingness to find information that I don’t put right in front of them.

Similarly, I either need to discard the in-class team presentations or formally assess them. Overall, they were of poor quality. Students need an explicit, rigid template for constructing presentations, and they will follow the template only if the presentations are graded. Asking students to give informal, ungraded presentations simply doesn’t work. Given that this country has raised a generation of children who frequently suffer from anxiety disorders, I might need to institute a rule that credit for presentations goes only to the students who deliver them, with the condition that each member of a team can present if they so choose. I already design my courses to provide students with “multiple paths to success,” so optional-yet-graded presentations are not much of a complication for me.

I administered my own course evaluation at the end of the semester. Here are the results — from 20 out of a class of 22 students — for questions with a scale from “strongly agree” (5) to “strongly disagree” (1):

  • The forecasting project improved my ability to analyze political events in the Middle East – 3.9
  • I am now better able to use forecasting as a decision making tool in my own life – 3.7
  • More courses should include training in decision making skills like forecasting – 3.4

I would like the average scores on the second and third items to be higher.

Final comment: the last two reading response assignments before the final exam asked students to respond to “Will Lebanon/Syria still be a single sovereign state in 2030?” I did not realize until the last week of classes that these questions dovetail perfectly with the forecasting project, and that I should somehow integrate the CHAMP method and reading responses so that students get more opportunities to hone their decision making skills.