A Pandemic of Irrelevancy?

Debates over responses to the coronavirus pandemic are being driven by epidemiologists, physicians, economists, and elected politicians. Political scientists, in contrast, have been absent from policymaking and public discourse — as has been the historical pattern. An academic discipline that claims to advance a scientific understanding of politics regularly fails to communicate politically about science. A few possibly non-representative, but illustrative, examples:

This historical analysis of municipal public health interventions and epidemic intensity during the 1918 influenza pandemic, published in 2007 in the Proceedings of the National Academy of Sciences of the USA. The authors, who were profiled in Michael Lewis’s book The Premonition, were senior officials in several federal agencies.

Editorials (direct links here, here, here, here, and here) on pandemic policy written by former members of President Biden’s White House transition team, published by the Journal of the American Medical Association on January 6 — and referenced in this New York Times article.

This essay by a professor at the Johns Hopkins School of Medicine about how universities are responding to the pandemic.

The common thread among the authors of the above? Not a political scientist among them. Perhaps the situation is different outside the USA. Or maybe I’m just engaging in confirmation bias. But I welcome suggestions on pandemic-related publications written by political scientists for the educated layperson. It’s a bit frustrating having to rely solely on authors from other fields when teaching undergraduates about public policy.

Perusall

When the spring semester starts, I’ll be using Perusall for the first time, in my comparative politics course. I decided to finally experiment with it for three reasons. First, my previous attempts at getting students to engage in collaborative notetaking have mostly failed. Second, as I mention in that linked post, a couple of my colleagues have raved about Perusall’s ability to turn reading into a social learning process. Third, resiliency is as important as ever when it comes to course design. Given the pandemic and associated mitigation protocols, there is the chance that some or all of my students will be absent from the physical classroom at random points during the semester. Perusall allows students to engage with course content and each other asynchronously online.

I found it easy to set up Perusall by following these basic instructions (on my campus, Perusall has been administratively connected to all Canvas course shells, so there is no need for individual faculty members to install the LTI app). This brief explanatory video was also helpful. Perusall’s user interface is very intuitive. I set up the course’s article library and associated Canvas assignments in only a few minutes. Here is the end result from the Perusall side:

Notice how the layout is exactly what is shown in the video. It is also the same as what students will see.

Perusall uses an algorithm to machine-grade student interaction with each document in the course library, and the algorithm’s output can be synced back to the Canvas gradebook. This means readings can become auto-graded Canvas assignments. Details on this and more are in the instructions I linked to above.
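
Out of curiosity, here is a toy sketch of what a machine-graded engagement score might look like once it lands in a gradebook. To be clear, this is not Perusall’s actual algorithm, which is proprietary; every weight, cap, and input below is invented.

    # Toy illustration only -- NOT Perusall's actual algorithm, which is
    # proprietary. All weights, caps, and input names below are invented.

    def engagement_score(annotations: int, reading_fraction: float,
                         replies: int, max_points: float = 10.0) -> float:
        """Combine hypothetical engagement signals into one gradebook score."""
        annotation_part = min(annotations / 7, 1.0)  # credit tops out near 7 annotations
        reading_part = min(reading_fraction, 1.0)    # fraction of the document viewed
        reply_part = min(replies / 3, 1.0)           # credit for responding to peers
        raw = 0.5 * annotation_part + 0.3 * reading_part + 0.2 * reply_part
        return round(raw * max_points, 2)

    # A student with 5 annotations, 90% of the document read, and 2 replies:
    print(engagement_score(5, 0.9, 2))  # 7.6

Whatever the real scoring rules are, the practical point is the same: the score arrives in Canvas as an ordinary assignment grade.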

I will report on how well all of this has worked once the semester is underway.

Holiday Greetings 2021

Time for another semi-annual update on the financial condition of some U.S. colleges and universities. Standard disclaimer: this is my opinion, based on publicly available information.

In the interest of holding myself accountable, let’s begin with some of the schools that I’ve profiled here before.* The number to the right of a school’s name refers to the percentage increase in expenses per FTE undergraduate from fiscal years (FY) 2011 through 2020. The higher the number, the worse the situation.
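
To be explicit about the arithmetic, the metric is simply the percent change in expenses per FTE undergraduate between the two fiscal years. A minimal sketch, with made-up dollar and enrollment figures:

    # Percent change in expenses per FTE undergraduate, FY 2011 to FY 2020.
    # The dollar and enrollment figures below are made up for illustration.

    def pct_change_per_fte(exp_2011: float, fte_2011: float,
                           exp_2020: float, fte_2020: float) -> float:
        per_fte_2011 = exp_2011 / fte_2011
        per_fte_2020 = exp_2020 / fte_2020
        return 100 * (per_fte_2020 - per_fte_2011) / per_fte_2011

    # Expenses rise while enrollment shrinks, so spending per student jumps:
    print(pct_change_per_fte(200_000_000, 5_000, 230_000_000, 4_000))  # 43.75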

Continue reading “Holiday Greetings 2021”

TLC@APSA 2022 – and a request

A one-day Teaching and Learning Conference will be held at the 2022 APSA meeting in Montreal. The call for proposals is here.

Young-Im Lee, assistant professor of political science at California State University, Sacramento, would like to organize two workshops for the TLC@APSA. Here is her request:

  1. I am curious what other political scientists/their departments do to practice antiracist pedagogy and create antiracist institutions.
  2. I wonder how other political science programs offer career advising for undergraduates, in terms of both graduate school application support and non-academic jobs. I am particularly interested in programs mainly teaching underserved and minoritized students.   

I am not yet in a position to present on these two topics, but I am interested in learning about what others do. I am happy to do the organizing work. Please let me know if you want to share your experience and expertise on either of the two topics above.

Dr. Lee can be contacted at young-im [dot] lee [at] csus [dot] edu.

The Wheels Are Coming Off

The essay below was written by a tenured professor at a public regional comprehensive university in the USA.

We always say we are a “tuition-dependent” state university, so any enrollment downturn hits us hard. What I didn’t fully appreciate before the Covid-19 pandemic is how dependent we are on revenue from the cafeterias and dorms. We suffered huge losses from going completely online for a year.

But wait, there is more! Even before the pandemic, our athletics department, by which I mean our football team, lost $10-12 million per year. They lie about this and hide it as best they can, but at a state institution with strong public records laws the truth has come out. Athletics at our university reports directly to the president and presents at every board of trustees meeting. To the trustees, the athletics department is the university. Only about 5% of the students ever attend a football game, but they all pay several hundred dollars per year in student fees to athletics.

Last year we hit a $25 million deficit. The administration slashed budgets everywhere. Almost half of our classified staff were laid off in June of 2020. Programs were eliminated, adjuncts were all fired, and phones were removed from faculty offices. You can’t get anything done on campus anymore; you email people or leave a voicemail and no one gets back to you. A four-person office I often collaborate with was reduced to one person, the most junior, who was told to do all the work.

But guess what wasn’t cut? Athletics! The administration hired consultants—retired coaches—who decided, without any quantifiable evidence, that football was vital to the character of the institution. Their recommendation? Spend millions more per year on athletics. The administration committed to meeting that goal.

My department was combined with several others to “save money.” Our small but valuable graduate program was eliminated. Three secretaries were laid off and replaced, after a year, with one person working less than full-time to support three dozen faculty. Every faculty meeting is a battle between angry professors and our thin-skinned president who bristles at any criticism.

Enrollment is down another 20% this year. Our region of the state produces fewer high school graduates each year, and that number will drop off a cliff when the birth dearth that followed the 2008 recession reaches college age. In the last ten years, we have invested heavily in our physical campus; we are paying bonds on lovely, state-of-the-art buildings that will never be full.

I am less than five years from a possible early retirement. I am a graduate of this institution, and my wife and I met in a classroom where I now sometimes teach. I love this place. But it is never going to recover.

The More Things Change . . .

A follow-up to my post from last month about changing an exam prompt:

I created two exams for this course with the same two-part design. First, answer some multiple-choice questions. Second, write additions to a Twine story.

For the second exam, five out of seventeen students wrote in a style that resembled, to varying degrees, that of the story. While this marked a minor improvement over the first exam, students incorrectly applied economic concepts more frequently. The average score for the second exam was lower than that of the first exam.

While my sample size is far too small for any difference to be statistically meaningful, I would like students to do better, and I’m wondering how I might change the exam prompt yet again to facilitate this.
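
For what it’s worth, a quick way to put a number on this would be a paired t-test comparing the two exams for the same students. A minimal sketch, with hypothetical score lists standing in for my actual gradebook:

    # Sketch: paired t-test on two exams taken by the same seventeen students.
    # These score lists are hypothetical stand-ins for my actual gradebook.
    from scipy import stats

    exam1 = [78, 85, 62, 90, 71, 88, 67, 74, 81, 59, 93, 70, 76, 84, 65, 72, 80]
    exam2 = [75, 88, 58, 85, 74, 84, 70, 70, 78, 62, 90, 72, 73, 86, 61, 70, 83]

    t_stat, p_value = stats.ttest_rel(exam1, exam2)  # paired: same students twice
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

With only seventeen students, though, the test has little power, so a non-significant result would not tell me much either way.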

A Human Rights Foreign Policy Game

Today we have a guest post from Michelle Goodridge, academic librarian at Wilfrid Laurier University. She can be contacted at mgoodridge [at] wlu [dot] ca.

After a casual conversation about classroom games with my colleague Professor Andrew Robinson, we created a foreign policy simulation for his course, HR 100 Human Rights and Human Diversity. We had two goals for the simulation: first, have students explore why state actors fail to advance human rights domestically and internationally, and second, measure the simulation’s effectiveness in helping students achieve desired learning outcomes.

We modified the International Trade Game by:

  • Orienting the exercise around human rights instead of international trade.
  • Dividing students into three teams of high- and middle-income pro-human-rights democracies, two teams of low-income democracies indifferent to human rights, one team of a high-income anti-human-rights state, and one team representing an NGO.
  • Introducing the political objective of re-election.
  • Creating different winning conditions for each team.

To form teams, students picked one of several different colored t-shirts that we had laid out around the classroom. Each team received a corresponding packet of instructions and resources. I played the role of The Trader, who accepted the geometric shapes produced by teams in exchange for political support units. Andrew injected human rights crises into the simulation via PowerPoint. The simulation ran for an hour, with defined victory conditions that had to be met for a team to win. Often none of the teams met its victory condition, which came as a shock to the students, but it helped illustrate the complexity of international relations.

After the game concluded, we took time to debrief the students, and this is when students made robust connections between the simulation and concepts they had been studying. I can only assume this is because verbalizing these responses right after the exercise is easier than writing them down a week afterward.

We attempted to measure the effectiveness of our Human Rights Foreign Policy Game with pre- and post-test evaluations. The evaluation results were anonymized, coded, and analyzed using SPSS. We found that the richest data came from students’ responses to the evaluation’s open-ended questions. So far, we have run this simulation over six semesters, and we will probably continue to use it because of the high percentage of students reporting that it helped them learn. For more details, please see our article “Objective Assessment of Pedagogical Effectiveness and the Human Rights Foreign Policy Simulation Game,” Journal of Political Science Education 17, 2 (2021): 213-233, DOI: 10.1080/15512169.2019.1623048.

Information Literacy Exercise

Today we have a guest post from Colin Brown, assistant teaching professor in the Department of Political Science at Northeastern University. He can be reached at colin [dot] brown [at] northeastern [dot] edu.

It seems safe to say that political scientists have some concerns these days about information literacy, and information literacy is likely an implicit learning outcome for many of us. This blog has provided a number of good exercises for bringing information literacy into research methods, reading academic research, and headline writing. Inspired by these examples, I attempted to include this skill in my introductory comparative politics class, where democratic (de)consolidation is a major topic. In theory, the class gives students enough background to start keeping up with events around the world—if they choose to do so.

The exercise I tried this year, now available on APSA Educate, forces them to update slightly out-of-date readings on a country facing democratic backsliding (Poland) by finding out what’s happened there in the four or five years since those readings were published. Students were assigned to small groups, and each group was given a different kind of source to examine during a class session. One group read newspaper articles, another examined democracy indexes, yet another searched Wikipedia, etc. Students then applied what they’d read to course concepts—has democracy gotten weaker or stronger in Poland since these were published? Finally, students discussed what they trusted or distrusted about each type of source, and the potential merits of each.

I had a few key goals for students:

  • Think about source material for future courses. In an intro course, students might be unfamiliar with how research articles work and may not have much practice in evaluating the credibility of online sources.
  • Understand that while sources vary in credibility, there are pros and cons to using even the most credible sources. For example, the students who looked at V-Dem, Freedom House, etc., got clear, direct answers to the exercise’s questions, but they also correctly pointed out that they had to accept these organizations’ conceptualizations of democracy. And less credible sources like Wikipedia still had things to offer if used carefully.
  • Bridge the gap between classroom learning and events in the broader world and show how what they’re learning might help them understand the news.

When I ran this exercise in class this year, I budgeted only about 25 minutes for it, but it turned out to need 40 minutes or more to give students enough time to look at multiple sources in their category. We ended up using another 25 minutes the next day, but dividing the exercise into two sessions probably led to shallower searching and a less systematic attempt to make sense of sources.

When running this exercise in the future, I will think more explicitly about the balance between handholding and letting students practice seeking things out on their own. Last time I provided a couple of search terms, told them to keep looking beyond these, and asked them to keep a record of what they searched for (which, as best I could tell, no group did). Next time I will probably experiment with either giving students a fully curated list of search terms, so they can observe how this affects their search results, or, conversely, giving them even more time to “flail about” on their own before offering suggestions.

Statecraft in the International Relations Classroom

Today we have a guest post from Eric Cox, an associate professor at Texas Christian University. He can be contacted at e[dot]cox[at]tcu[dot]edu.

Does the online Statecraft simulation improve student learning when used as a key component of international relations classes? I explored this question in a Journal of Political Science Education article through a controlled comparison of two IR course sections taught during the same semester. One section was randomly chosen to participate in Statecraft; the other was assigned a research paper. The study’s primary finding was that students in both sections performed similarly on exams when controlling for other factors.

Statecraft is a turn-based simulation that divides students into “countries” that they govern. Each country must choose its form of government, economic system, and other attributes. Players also choose whether to focus on domestic spending priorities such as schools, hospitals, and railroads, or on military capabilities. They must deal with terrorism, the melting of Ice Mountain, pirates, and rumors. The simulation is, to put it mildly, complex. I have been using it for just over a decade.

To try to put the students doing the research paper on an equal footing with those engaged with Statecraft, I dedicated several days of class to instruction in research writing skills and peer review. The students in this section spent roughly the same amount of time in class on their paper as the students in the Statecraft section did on the simulation. Both groups also wrote about the same amount.

At the end of the semester, I compared class performance on three exams and gave students a brief survey on their experiences. The initial findings were surprising: students in the research paper section did much better on exams but were less satisfied with the research assignment than the Statecraft students were with the simulation. I then obtained access to students’ GPAs at course entry and re-ran my analysis with GPA, whether students were taking the course for a grade, and whether students were political science majors as controls. Once these controls were introduced, the effect of Statecraft went away. The strongest predictor of course performance was incoming GPA: students with high prior GPAs made As, B students made Bs, and so on. Academic performance was independent of the research paper or Statecraft assignment. However, students in the Statecraft section showed a strong preference for the simulation over a traditional research paper, and students in the research paper section indicated they would have rather done Statecraft. Subsequent student evaluations have also demonstrated the relative popularity of Statecraft.
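
To make the “controls” step concrete, here is a minimal sketch of the kind of specification that produces this result. The data and variable names are fabricated for illustration; the article reports the actual model.

    # Sketch of the control strategy described above, not the article's model.
    # All data here are fabricated; only the regression's structure matters.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 60  # hypothetical enrollment across both sections
    df = pd.DataFrame({
        "statecraft": rng.integers(0, 2, n),     # 1 = Statecraft section, 0 = paper
        "gpa": rng.uniform(2.0, 4.0, n),         # GPA at course entry
        "for_grade": rng.integers(0, 2, n),      # taking the course for a grade
        "polisci_major": rng.integers(0, 2, n),  # political science major
    })
    # Fabricated outcome in which only GPA matters, mirroring the finding above.
    df["exam_score"] = 40 + 15 * df["gpa"] + rng.normal(0, 5, n)

    model = smf.ols("exam_score ~ statecraft + gpa + for_grade + polisci_major",
                    data=df).fit()
    print(model.params)  # the statecraft coefficient sits near zero; gpa dominates

The point of the sketch is only the specification: once incoming GPA enters the model, a section dummy like statecraft has little left to explain.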

That said, my use of Statecraft has evolved, something I discuss in detail in my chapter of Teaching International Relations. Foremost, I dedicate class time to the simulation, and draw examples from the simulation when discussing IR theory, issue areas, and current events. Students have indicated that the simulation gives them a greater appreciation for the complexity of international relations and the challenges leaders face. 

Editor’s note: previous posts on Statecraft can be found here.

Observing Observation

Two weeks ago, students in my economic development and environmental politics course played my simulation on freshwater resource scarcity in Asia. If my memory is correct, it was the first time I had run the simulation in the physical classroom, and I was interested in whether students would behave differently in the face-to-face environment compared to a prior iteration of the simulation that occurred online.

You can lead the students to knowledge . . .

The underlying mechanics of the simulation were unchanged: six teams, each representing a different country with one or more transnational rivers crossing its territory. Turn by turn, the population expands, more food must be produced, and water demand increases, yet countries build dams upriver and rainfall declines because of climate change. Eventually a country suffers a famine and millions of refugees spill into its neighbors.

This time around I added a victory condition: members of the team with the greatest percentage growth in GDP per capita when the simulation ended earned five points (out of a thousand) toward their final grades. I posted a copy of the simulation’s spreadsheet, which shows how actions taken by teams affect water availability, food production, hydroelectricity generation, and GDP, on the LMS and encouraged students to experiment with it before the simulation started.

Students did seem more engaged with the simulation in the classroom than they had been online, though it was also far easier for me to observe their interactions. The real surprise was how baffled students were by the cause-and-effect relationships built into the spreadsheet. Growth in GDP requires growth in hydroelectric capacity, which only comes from building dams. Yet teams were hesitant to build dams. By the end of the simulation, China, for example, had stockpiled enough of a reserve to have constructed over one hundred dams, yet it had built only a handful. The largest change in GDP among the six teams? Only 1.1 percent over a twelve-year period.
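
To make the spreadsheet’s causal chain explicit, here is a toy model of a single country’s turns. Every coefficient is invented, and the real spreadsheet also tracks water, food, and population; only the dams-to-hydro-to-GDP logic is the same.

    # Toy version of the spreadsheet's causal chain; every coefficient is
    # invented. The real spreadsheet also tracks water, food, and population.

    def run_country(turns: int, dams_per_turn: int,
                    start_gdp: float = 2_000.0) -> float:
        """Return GDP per capita after a number of turns (toy model)."""
        gdp_per_capita = start_gdp
        for _ in range(turns):
            new_hydro_mw = dams_per_turn * 100  # each new dam adds capacity
            # GDP growth comes almost entirely from new hydroelectric capacity.
            gdp_per_capita *= 1 + 0.0005 + 0.00001 * new_hydro_mw
        return gdp_per_capita

    for dams_per_turn in (0, 1, 5):
        final = run_country(turns=12, dams_per_turn=dams_per_turn)
        print(f"{dams_per_turn} dams/turn -> {100 * (final / 2_000.0 - 1):.1f}% growth")

In this toy version, a team that never builds dams barely grows, which is roughly the pattern I watched unfold in class.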

Students clearly had not tried to figure out the spreadsheet before the simulation started, and none of them seemed to understand the relationship between economic growth, food, and water. Consequently, many of them flailed about helplessly as their country’s water supply steadily dwindled. When I asked during the debriefing why they chose inaction instead of action, I got mostly blank looks. As I’ve noted before, many students seem to have little understanding of cause and effect; instead, in their worlds, stuff just happens. While I would prefer not to add multiple assignments to the course to force students to work with the simulation’s causal relationships before the simulation actually begins, it might be necessary.