From the prompt “Complain to my husband about not washing dishes thoroughly in the style of Jane Austen”:
My dearest husband,
I hope this letter finds you well. It is with some vexation that I write to you regarding a matter that has been weighing heavily on my mind: the washing of the dishes.
As you are no doubt aware, this is a task that we both undertake in turn, and one which is essential to the smooth running of our household. And yet, I cannot help but notice that your efforts in this regard have been somewhat lacking of late.
I have observed with no small amount of dismay that you have not been washing the dishes thoroughly, leaving behind a residue of food that is most unappetizing. This, I must confess, has caused me no small amount of discomfort and concern, as it is not only unhygienic, but also highly unpleasant.
I trust that you will take this matter seriously, and make a concerted effort to improve the quality of your dishwashing in the future. I am sure that with a little more attention to detail, you will soon become as proficient in this task as I am.
Today we have a guest post from Rebecca A. Glazier at the School of Public Affairs at the University of Arkansas at Little Rock (rebecca [dot] glazier [at] gmail [dot] com) and Matthew Pietryka at Florida State University’s political science department (mpietryka [at] fsu [dot] edu).
Rebecca Glazier
Many professors are struggling to reach students who are disengaged and burned out. To address these issues and improve student retention, universities are increasingly turning to edtech solutions or big data—everything from predictive analytics to chatbots in discussion boards. These remedies tend to be far removed from students’ daily lives. In contrast, as professors, we are with students in the classroom every day. And this experience often prepares us to know best how to engage our students.
Matthew Pietryka
In a new, open-access article we just published in Education Sciences, “Learning through Collaborative Data Projects: Engaging Students and Building Rapport,” we illustrate how faculty can engage students through collaborative data projects. Rather than relying on top-down university solutions, faculty can use the content of their own courses to involve students in collaborative projects that build rapport and make them feel included and engaged in the course. We see these collaborative data projects as another kind of active learning—getting students thinking outside of the textbook and involved in contributing to a project that is bigger than themselves.
We used data from more than 120 students over two semesters, and our results suggest that most students find these collaborative data projects more enjoyable than typical college assignments. Students also report that the projects make them feel the professor is invested in their learning.
The article we wrote detailing these projects is open access. It provides advice on implementing these projects as well as the R code used to create individualized reports for students participating in the collaborative data projects. The individualized reports help develop rapport between the professor and each student. And this programmatic approach allows professors to scale up these reports to accommodate classes with hundreds of students. Building rapport and doing active learning are often considered possible only in smaller classes, but our approach demonstrates how both can be done in large classes as well—with significantly positive results.
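The article contains the authors’ actual R code; as a rough illustration of the general idea, here is a minimal Python sketch that generates a short individualized report for each student from class survey data. The column names and survey question are hypothetical, invented for this example.

```python
import statistics

def individualized_reports(rows):
    """For each student, compare their response to the class average.

    `rows` is a list of dicts with hypothetical keys: "name" and
    "hours_news" (hours per week spent following the news).
    """
    class_avg = statistics.mean(float(r["hours_news"]) for r in rows)
    reports = {}
    for r in rows:
        hours = float(r["hours_news"])
        relation = "more than" if hours > class_avg else "less than or about"
        reports[r["name"]] = (
            f"Dear {r['name']}: you reported {hours:.1f} hours of news "
            f"consumption per week, {relation} the class average of "
            f"{class_avg:.1f}. Thanks for contributing to our dataset!"
        )
    return reports

# Example usage with made-up data:
rows = [
    {"name": "Ana", "hours_news": "5"},
    {"name": "Ben", "hours_news": "2"},
    {"name": "Cal", "hours_news": "3"},
]
for text in individualized_reports(rows).values():
    print(text)
```

Because the reports are generated from data rather than written by hand, the same loop serves a class of 20 or 200.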
At a time when many faculty members are struggling to engage students, we can take matters into our own hands by designing projects for our classes that draw students in and build rapport with them. It doesn’t take expensive edtech solutions or top-down directives. Mostly, it takes thoughtful pedagogy and prioritizing student connection.
Today we have a guest post, or more accurately, a guest video, from Joel Moore of Monash University, on an innovative use of the ChatGPT AI in a simulation:
Credit for this goes to the economist Richard Thaler, who mentioned the game’s basic premise in a Freakonomics Radio podcast.
I created this game for an admissions office event designed to persuade admitted applicants — high school students — to enroll. There were eight participants in the room. I placed a folded paper placard with the name of a country on it in front of each participant. The countries had varying GDP levels; e.g., El Salvador and China. I used poker chips instead of real money. I introduced the game by asking the group if they thought it was important for all countries to work toward mitigating climate change; everyone agreed. I then announced that we would simulate an international fund for climate defense. Countries could contribute to the fund that would be used to slow climate change and benefit everyone.
The game unfolds in three phases. Since my time was limited to about 35 minutes, I did two rounds for each phase. Probably more rounds per phase would work better. Here are the game’s rules:
Phase 1: Each player begins the game with 5 chips. For each round, a player can contribute 0 to 5 chips to the fund. At the end of the round, the number of chips in the fund doubles and this amount is divided equally among all players.
Phase 2: Players retain any chips acquired in Phase 1. Rules from Phase 1 still apply, plus: You can spend 1 chip to penalize another country. The country that is penalized loses 2 chips. To do this, write the name of the country on a piece of paper and give it to me along with 1 of your chips.
Phase 3: Each player starts with a number of chips that reflects his or her country’s GDP. All other rules from Phases 1 and 2 still apply.
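For readers who want to explore the incentive structure before running the game, the Phase 1 payoff rule can be sketched in a few lines of Python. The contribution amounts below are made up for illustration.

```python
def play_round(chips, contributions):
    """One Phase 1 round: contributions go into a common fund,
    the fund doubles, and the doubled fund is split equally."""
    n = len(chips)
    fund = sum(contributions)
    share = (fund * 2) / n  # equal share of the doubled fund
    return [c - give + share for c, give in zip(chips, contributions)]

# Two players, 5 chips each. A free rider contributes nothing
# while the other player contributes everything:
chips = play_round([5, 5], [0, 5])
print(chips)  # the free rider ends up ahead: [10.0, 5.0]
```

If everyone contributes fully, every player doubles their chips; the free-rider example shows why the fund tends to collapse anyway, which is the collective action problem Phase 2’s penalty rule lets players push back against.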
After six rounds, I led a short discussion, and it was evident that the high school students had picked up on the collective action problem that exists in the provision of international public goods.
Iowa Wesleyan University will close in May. I first wrote about Iowa Wesleyan — then a College — on this blog in early 2014, when I noted that, despite its firing of nearly half of its faculty members, the long-term decline in the number of Iowa’s high school graduates and changes in the higher ed market made the school’s survival unlikely. As I predicted in 2019 and 2021, its articulation agreement with a community college in another state failed to generate sufficient tuition revenue, and its full-time equivalent (FTE) undergraduate enrollment never rebounded to pre-2008 recession levels.
A few years ago, I stated during a conference panel that any private college or university with an undergraduate FTE below 500 was in danger of closing and that faculty members at such institutions should find jobs elsewhere. I’m revising my recommendation — to an undergraduate FTE of less than 1,000.
I’ll end with this:
IRS tax forms and IPEDS enrollment data indicate that Bethany College in West Virginia is in terrible shape.
A small college in Ohio, name not known to me, was supposedly unable to make payroll last week.
Continuing on a theme . . . some notes on today’s pedagogical discussion at ISA 2023:
Teachers want to create an environment that facilitates learning and stimulates a spirit of curiosity. Students may have different expectations. As one session participant put it, students can have the purely transactional attitude of “I’m not going into debt so I can feel emancipated.”
In a similar vein, we talk about what students should get out of a college education, but we don’t ask what they bring to it; e.g., a K-12 education where the teacher was the sole authority in the classroom.
So we are frequently faced with a situation where students don’t want to engage unpredictably with new knowledge, because doing so makes them feel uncomfortable—a feeling they do their best to avoid.
To resolve this dilemma, students need to become familiar with tools for giving and receiving feedback productively so that they can learn from each other. They also need to learn how to articulate why they hold certain positions, why those positions are important to them, and what they mean when they state those positions.
During the conversation, I thought of a tweak to an assignment that might help with the above. As I have written previously, many of my students are unable to identify the author’s thesis, independent variables, and dependent variable in Perusall readings. I’m thinking of adding “What is a question about this article that you want answered?” to the assignments, with the stipulation that the answer needs to come from their classmates, not me. This could also be a way of getting students to design their own quiz questions.
*Allusion to 19th-century Russian literature, of which I am mostly ignorant — a known unknown that I am at present mostly comfortable with.
First, I hope to see folks at some of the pedagogy-oriented sessions at this week’s ISA meeting in Montreal. Feel free to come chat with me or the illustrious Dr. Simon Usherwood.
Second, a follow-up to my post last month about no-stakes quizzes and class discussion.
I gave students another quiz, on survivorship bias, a topic of the reading assigned earlier in the week. Here is the prompt for the questions (note that the scenario is a fictionalized version of a famous World War II example):
The United States has gone to war against Zambia. A high percentage of U.S. warplanes are being shot down by the Zambian military. You work for the U.S. Department of Defense as an operations research analyst. You have been given the task of recommending where additional armor plating should be installed on U.S. warplanes to better protect them from enemy fire. The image below shows a representative pattern of damage from Zambian anti-aircraft fire to U.S. warplanes that have made it back to home base.
Responses to questions 1 and 2 (charts of the response distributions not shown here):
Despite the tiny sample, I think the quiz scores indicate how easy it is to memorize a concept’s definition while being unable to meaningfully apply it. Students frequently equate memorization with knowing, and hence learning, when it is usually neither.
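The famous World War II example behind the quiz can be made concrete with a quick simulation. This sketch uses invented damage zones and treats each plane as taking one hit in a random zone; hits in “fatal” zones mean the plane never returns, so its damage is never observed.

```python
import random

def surviving_damage(n_planes, fatal_zones, seed=0):
    """Each plane takes one hit in a random zone; hits in fatal
    zones down the plane. Count damage only on planes that return."""
    rng = random.Random(seed)
    zones = ["engine", "cockpit", "fuselage", "wings", "tail"]
    counts = {z: 0 for z in zones}
    for _ in range(n_planes):
        hit = rng.choice(zones)
        if hit not in fatal_zones:  # only survivors are observed
            counts[hit] += 1
    return counts

counts = surviving_damage(10_000, fatal_zones={"engine", "cockpit"})
print(counts)  # engine and cockpit show zero damage among survivors
```

The returning planes show damage everywhere except the zones where a hit is fatal—exactly why the analyst should recommend armoring the undamaged areas, not the damaged ones.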
Alternative title: Stuff Just Happens, a persistent student mental paradigm that I’ve written about before (here and here).
I’m teaching an undergraduate introduction to research methods course this semester, for which I created a series of practice quizzes that contribute nothing to the final course grade. I expected the quizzes, which range from one to three questions, to function as retrieval practice and launch pads for class discussion. This is the same idea behind polling students to check their comprehension while lecturing, but I used the Canvas LMS quiz tool instead of a polling platform because 1) I was too lazy to learn the latter, and 2) I wanted all course content housed in one place.
The activity is not working as well as I thought it would, for a few reasons. First, Canvas identifies the correct answer to each question when displaying quiz results, as shown below, which shuts down discussion about which option is correct and why. A pie chart that doesn’t label the correct answer, like the one Google Forms produces, would work better.
Second, this is a small class that meets at 8:00 a.m. The quantity and quality of discussion decline markedly when even a few students are absent, which is usually the case.
But most importantly, given their answers to the quiz questions, students are still having a lot of difficulty forming plausible and potentially valid causal inferences from the data that I present. I’ve given six practice quizzes so far, and on average students answer questions correctly only about 50% of the time. Here is a typical example from a prior quiz:
Based on the visual representation of data below, a logical hypothesis is:
Although probably too late in the calendar year to put into practice, here is a flip-side follow-up to my prior post about campus interviews:
It’s time to stop the costly performative rituals that are contributing to the disintegration of the academy.
Move to a hiring cycle that runs twice or even three times a year. Academia is the only industry that I know of that limits hiring to an annual schedule. If you operate your searches on a staggered timeline, your applicant pool will probably improve and your top choices will be less likely to be snagged by competitors.
Stop requiring recommendation letters at the beginning of the search process. Demanding them up front wastes the time of letter writers and applicants. We all know that an overwhelming percentage of these letters are never read because the applications they are part of quickly get tossed. Get references only for those on your short list and then check them.
While the science supports eliminating job interviews entirely, this probably isn’t going to happen, so at least make them less onerous. The pandemic demonstrated that there is no need to bring finalists to campus. And there is no demonstrated benefit in subjecting them to one to two days of back-to-back meetings with people who have no direct effect on the specified duties of the position. Is it essential for every candidate to have a 30-minute conversation with the Associate Director of Strategic Student Development Initiatives? No one who interviews for an IT or facilities management staff position has to suffer through this, and those units function perfectly well.
Finally, per the article linked to above, structure the applicant evaluation process to minimize bias and noise. Use rubrics to score candidates on clearly defined criteria. Collect the results, average them, and distribute this information to the search committee before discussion of the applicants’ relative merits. This will help prevent any single person in the room from unreasonably affecting the outcome.
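The aggregation step can be as simple as a short script. Here is a minimal Python sketch, with hypothetical candidates, raters, and rubric criteria, that averages each candidate’s scores across criteria and raters before the committee ever meets to discuss.

```python
from statistics import mean

def average_scores(scores):
    """scores: {candidate: {rater: [criterion scores]}}.
    Returns each candidate's score averaged across criteria
    and raters, computed before any group discussion."""
    return {
        cand: mean(mean(marks) for marks in by_rater.values())
        for cand, by_rater in scores.items()
    }

# Made-up example: two candidates, two raters, three rubric criteria.
scores = {
    "Candidate A": {"rater1": [4, 3, 5], "rater2": [3, 3, 4]},
    "Candidate B": {"rater1": [5, 5, 4], "rater2": [4, 5, 5]},
}
print(average_scores(scores))
```

Distributing these averages before discussion is the point: each rater’s independent judgment is locked in before any single voice in the room can anchor the conversation.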
While going down the YouTube rabbit hole early last September, I stumbled across this video by a Canadian lawyer about the three types of clients to avoid. These clients display urgency, flattery, or (lack of) responsibility — often simultaneously. As stated in the video, these signals occur in any customer service industry. I’ve certainly seen them, and probably you have, too.
Urgency — a student claims to have an emergency that requires your immediate action. Questions for you to ask: “Is this a real or perceived emergency? Did the situation arise because of the student’s behavior?” In a serendipitous example, two weeks after watching the video, I received an email from a student with “URGENT CONCERN” in the subject line. It wasn’t urgent, nor was it my concern.
Flattery — a student says that you are the only professor who can resolve their problem. This is an attempt to distract you from the real cause of the situation. E.g., “This is my favorite course, but it’s the only one I’m doing badly in this semester, and if my GPA drops below X, I will lose my scholarship and have to drop out of college. Are there any extra credit assignments?”
Responsibility — nothing is the student’s fault. For example (actual email I received last month): “The wi-fi is completely shut down on campus and I can’t submit anything, I’ve been trying to for the past hour. I know our assignment is due and I’ve tried submitting it but I don’t know what to do. I can attach the writing here but can’t upload anything to Canvas.” My response? “The library has computers on every floor.”