Creating Community I: Reading Response Discussions

Per Amanda’s last post about platform options for online group projects, over the next several weeks I’ll throw out some of my plans for exercises that I hope will create community in the two fall undergraduate courses I’ll be teaching online. I’ll start simple and gradually get more complex.

In the physical classroom, I still use reading responses to generate discussion among students and minimize formal lecturing. Online, I’ll do this with breakout rooms. In the classroom, I typically ask each group of students to summarize for the rest of the class the consensus position it has reached on the reading response; the pattern that emerges from polling groups in this manner often leads to additional discussion. I think this process will be tedious for students in an online environment, so I will tell students that each breakout room needs to create a document with three bullet points that support its argument. I will randomly choose one group of students to present its conclusions to the rest of the class after the breakout discussions are completed. The group will display its three bullet points to the class via screen share. I can ask that other students submit questions or opposing points of view, perhaps through text chat, for follow-up after the presentation. Throughout the process I’ll be asking “Why?” in Socratic fashion.

Technology note: Zoom has had breakout rooms for a long time. Cisco says that an updated version of Webex with this capability will launch at some point in September. There is apparently a method of creating breakout rooms with Microsoft Teams, but to me it looks complicated.

Preparing Students for Online Learning

Students, like many of their professors, dislike change. Much of the student dissatisfaction with last semester’s shift to “remote instruction” ultimately derives from the fact that online education is different from what they had previously experienced. Well, guess what? The fall semester will also be unlike what many of them have grown to expect.

As a faculty member, you’re probably spending the coming weeks or months thinking about how to teach in an environment of lower enrollments, less frequent face-to-face classroom interaction, and hybrid or fully online course formats. It’s doubtful that students are thinking about how their own behaviors as learners might need to change.

To get students to think about the tools and habits they might need next semester, have them complete an online readiness self-assessment. Directing them to general advice about online learning, either before or after the self-assessment, might also be helpful.

Call for Reviewers

The Journal of Political Science Education (JPSE) is looking for manuscript reviewers. JPSE, one of the American Political Science Association's four main journals, publishes articles on the scholarship of teaching and learning (SoTL), political science instruction, reflections on teaching and the academy, and reviews of educational resources.

Manuscripts submitted to JPSE as examples of SoTL discuss empirically based research and contain a rigorous qualitative or quantitative analysis. Political science instruction manuscripts explain how to use the pedagogical technique being described, but unlike SoTL pieces they do not need to include evidence of effectiveness. All submissions, regardless of topic area, need to be evaluated with a critical eye for rigor, clarity, and style. Reviewers must be willing to reject manuscripts that do not merit a revise and resubmit, and they need to complete their reviews in a timely manner.

New and Improved

As I mentioned back in February, there are some glitches in the WordPress software that drives this blog. I probably need to install an updated version of PHP to improve the blog's security and stability, which will force a change in the blog's appearance. In other words: a new look, but the same content and mission.

Along those lines, I would like to invite ideas for guest posts about the new assignments, exams, and techniques that all of you are now developing for an unprecedented fall semester. These ideas can be sent to us. The Covid-19 pandemic is forcing a radical rethink of teaching, and we all know that even the best-designed experiments often do not perform as desired. We welcome examples of failure as well as success, since the former is often more instructive than the latter.

Simulating Covid-19 Classroom Conditions

[Updated: This post describes my experience delivering a simulated classroom lesson, part of my university’s effort to evaluate potential solutions to the challenges posed by the upcoming fall semester — a process that is, or should be, occurring on your campus as well. Testing is a necessary part of the design process, and the process of evaluating potential solutions rarely goes as expected in its initial iterations.]

Last week I simulated fall semester teaching with some students in the physical classroom and others connected remotely via Webex. My main objective for the demo was to identify possible points of failure in the technology that my university is thinking about purchasing, and in this I succeeded beyond my wildest expectations.

The lesson was organized as follows:

Looking Back At Another Simulation

As promised in my last post, a brief review of another self-designed Excel-based simulation that I used this past Spring semester:

The purpose of this simulation was to teach students about freshwater resource use in Asia. I created three preparatory assignments on water scarcity in the region. The twenty-one students in the class were divided into teams that represented countries dependent on rivers that originate in the Himalayan watershed: Bangladesh, China, India, Myanmar, Pakistan, and Vietnam. Each team could build dams on the rivers that transited its country’s territory. Dams:

  • Enabled a country to expand the area of irrigated farmland and produce more food.
  • Generated more hydroelectricity, which in turn increased industrial production, per capita income, and, because of urbanization, municipal demand for water.
  • Reduced the amount of water available to downstream countries.

Countries purchased dams with surplus food, which could also be donated to other countries. Each country's food needs increased annually because of population growth, while rainfall decreased to reflect climate change. These processes served as incentives for countries to build dams. If a country suffered a food deficit in any given year, refugees flowed into neighboring countries, increasing those countries' food needs, which gave some countries an incentive to negotiate on dam construction.

For several countries, however, the demand for water ultimately exceeded its supply, an outcome I had deliberately built into the simulation.
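The yearly bookkeeping behind these rules can be sketched in code. This is a minimal sketch of the update loop, assuming invented coefficients and variable names; the actual spreadsheet's formulas were not published, so every number below is a placeholder, not the real model.

```python
def advance_year(country, upstream_use):
    """Update one country for one year; return the size of any food deficit."""
    # Upstream dams withdraw water before the river reaches this country.
    water = max(0.0, country["rainfall"] - upstream_use)
    # Dams expand irrigated farmland, so food output scales with dam count.
    food = water * (1.0 + 0.2 * country["dams"])
    # Hydroelectricity drives urbanization, which diverts water to cities.
    food -= 0.1 * country["dams"]
    # Population growth raises food needs; climate change trims rainfall.
    country["food_need"] *= 1.05
    country["rainfall"] *= 0.98
    surplus = food - country["food_need"]
    country["surplus"] = max(0.0, surplus)  # spendable on dams or donations
    return max(0.0, -surplus)               # a deficit sends out refugees
```

Chaining `advance_year` over the countries in upstream-to-downstream order, with each country's withdrawals feeding the next call's `upstream_use`, reproduces the downstream scarcity the simulation was built around.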

As happened with Gerkhania in my comparative politics course, the need to regularly update a complex Excel spreadsheet created interruptions. And, as with the other class, I made a few errors in the process, which slowed things down even further. But although the design of the spreadsheet needs some improvement, I was generally pleased with how it worked.

I asked students to complete an anonymous survey about the simulation after it had ended. Seven of the twenty-one students in the class responded. Six of the seven said they thought that the simulation accurately depicted water resource issues in Asia, and five felt that the simulation improved their understanding of these issues. The seventh student thought that the simulation was a confusing, unproductive exercise.

Several commented that communication within and between teams was problematic because of 1) the constraints of the Webex meeting platform, and 2) lack of participation by teammates. This feedback leads me to wonder if I should include a collaborative team assignment before the simulation begins, perhaps one in which teammates' contributions derive from more formalized roles (e.g., agriculture minister, foreign minister). And I do realize that Webex's chat box is not an ideal tool for conversation, so I need to find some other means by which students can communicate with each other in real time outside of the classroom.

But here is the big change I’m considering: for the last several years, based on Michelle Allendorfer’s reasoning, I have scheduled these simulations for the last week of classes. I’m now wondering if I should move them to the beginning of the semester, in an attempt to quickly engage students with course content before they get tired and distracted. This could become important if Fall semester gets disrupted halfway through by Covid-19 like Spring semester did.

When the Sky Falls

A colleague recently asked for my opinion on two articles about the pandemic’s effects on higher education. The first is an interview with entrepreneur and NYU marketing professor Scott Galloway, who says that the current economic landscape makes higher education a tempting target for tech firms — the same point made by Kevin Carey and many others back in 2012.

I told you it wasn’t just a wooden horse.

The second is an op-ed by Glenn Moots, a philosophy and political science professor at Northwood University. Moots argues that online education lacks the “experiential learning, networking, and in-person collaboration, celebration, and commiseration” that students prefer.

Both Galloway and Moots distinguish between a college education and the college experience. The business model of many U.S. colleges and universities has long relied upon successfully selling the latter—i.e., “come here and you can continue playing the sport you played in high school for four more years while majoring in, oh, I don’t know, whatever.” The credential of the bachelor’s degree is an ancillary benefit one gets at the end, not the main product.

This is a business model that works until it doesn't. The model is highly fragile because it assumes a static environment that conforms to one's expectations. Given stagnant or declining household incomes, and shrinking numbers of 18-year-old high school graduates in some regions, it has been an untenable financial strategy for many higher ed institutions for quite some time. The pandemic only made the model's flaws more obvious. And thus we are now faced with an interesting economics question: how much are people willing to pay for the credential of a college diploma when the experience with which it has been historically bundled no longer exists? And which schools can survive at the price point they are now able to charge?

As Galloway points out, a few hyper-elite institutions offer credentials with such a high reputational value that they don’t need to worry about the college experience, or even, really, the education. Universities like MIT figured out nearly two decades ago that they could give away their curricular content for free and not damage their brands.

As a contrasting example, IPEDS data and IRS filings show that Northwood University lost 25% of its FTE undergraduate enrollment between 2007-08 and 2017-18 and suffered budget deficits in fiscal years 2012-2014 and 2018. Positive net revenue for the years in between came mainly from sales of assets. It looks like the demand for Moots' "own little corner of academe"—the experience provided to full-time, campus-residing 18-to-22-year-old students—was in decline long before Covid-19.

More Looking Backward

Continuing with my recent theme of evaluating my teaching over the previous semester:

My courses on comparative politics and Asia both concluded with simulations. I’ll discuss the latter in a future post. As I mentioned last month, I heavily modified my old Gerkhania exercise for comparative politics. The changes were based on a brilliant democratic government simulation that Kristina Flores-Victor of CSU-Sacramento presented at the 2020 APSA Teaching & Learning Conference.

As in previous versions of Gerkhania, students each received fictional identities as members of a newly-formed legislature in a multi-ethnic country with a history of civil strife (think Afghanistan). Over a series of three Webex sessions, I fed the class nine legislative proposals. Action on each proposal caused students to earn or lose political constituency points (representing support from voters) and political capital points (influence within the legislature). These effects varied in ways that corresponded to the identity of each participant.

At the opening and closing of each session, students could exchange constituency points at a 2:1 ratio for either political capital points or reward points that counted toward their course grades. Political capital points could be used to remove a proposal from the agenda, preventing legislators from voting on it, or to return a removed proposal to the agenda. Students took a trivia quiz before the simulation began and again before the second session so that they had constituency and capital points to work with.

Every proposal that was voted down increased by 1 in 18 the probability that Gerkhania would return to a state of civil war at the end of the simulation. If civil war occurred, legislators lost all accumulated reward points.
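To make the point mechanics concrete, here is a sketch of the bookkeeping just described. The class structure and method names are my own assumptions; the post does not show the layout of the actual Excel tracker.

```python
from dataclasses import dataclass

@dataclass
class Legislator:
    constituency: int = 0   # support from voters
    capital: int = 0        # influence within the legislature
    reward: int = 0         # counts toward the course grade

    def exchange(self, kind: str, amount: int) -> None:
        """Trade constituency points at a 2:1 ratio for capital or reward points."""
        cost = 2 * amount
        if cost > self.constituency:
            raise ValueError("not enough constituency points")
        self.constituency -= cost
        if kind == "capital":
            self.capital += amount
        elif kind == "reward":
            self.reward += amount
        else:
            raise ValueError("kind must be 'capital' or 'reward'")

def civil_war_probability(proposals_voted_down: int) -> float:
    """Each failed proposal adds 1/18 to the chance of civil war at the end."""
    return min(1.0, proposals_voted_down / 18)
```

Tracking students as `Legislator` objects like this, rather than as rows of hand-edited spreadsheet cells, would also avoid the update errors described below.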

Considerations for the future:

The point effects of the legislative proposals grew as the simulation progressed, but the stakes attached to the initial proposals turned out to be too small to generate contention among students and need to be increased. With larger stakes at the outset, the second trivia quiz becomes unnecessary and can be scrapped.

I had built a very complicated Excel spreadsheet to track each student's points as the simulation progressed. Using this spreadsheet for the first time, for a simulation that I had originally intended to run in the classroom, proved slightly problematic. With my eyes bouncing between windows on two different monitors, I found it difficult to update spreadsheet cells correctly every time. Editing webpages so that students could track developments also created delays during which students sat idle.

A larger problem: although the simulation’s online environment seemed to negatively affect the amount of interaction between students, I think the small size of the class was the major contributing factor. As I’ve discussed before, these kinds of exercises seem to require a critical mass of participants, which this class didn’t have.

The pandemic most likely also had consequences. Campus classes ended at spring break, students scattered hither and yon, and the semester was extended by an extra week to make up for time lost in the transition to online instruction. By the last week, many students were probably just trying to finish the semester, had other concerns, and may not have been motivated to become heavily invested in the simulation.

Looking Backward and Forward

Expanding on my last post on failures from this semester:

From where I stand, information literacy skills are important, because they help one identify and demolish specious claims made by authority figures. An assignment that, for example, forces students to locate three peer-reviewed journal articles is practice in finding credible information. It also allows students to determine whether a topic is suitable for a semester-long research project.

To me, these outcomes are both beneficial and rather obvious. But from the students’ perspective, the assignment could simply be yet another meaningless hoop to jump through on the way to getting another A+ on a transcript. Given the sources many students cited in the different stages of their storymap projects, it looks like too many of them customarily take the latter approach to research.

Therefore, in future courses that involve research projects, I should create assignments that are limited to the task of locating scholarly sources and place those assignments at the beginning of the semester. I should demonstrate why this skill is useful outside of the classroom.

I’ve noticed a similar problem with student writing — really basic errors that indicate a lack of proofreading. I don’t expend more effort evaluating a student’s work than the student did creating it. But I do know that sloppy writing indicates sloppy thinking and that the former advertises one’s propensity for the latter to the rest of the world. Again, I should demonstrate early in the semester why it’s important to proofread one’s work before it reaches an audience. My favorite example? The missing Oxford comma that cost a dairy company US$5 million.

I’m also seeing, from the last few journal article worksheets students are submitting, that many still do not have a clear understanding of how evidence-based arguments are constructed in academic literature. An author typically poses a research hypothesis or question at the beginning of a journal article and concludes with the same hypothesis or question reworded as a declarative statement. For example, “Why is the sky blue?” in the introduction becomes “The sky is blue because . . . ” in the conclusion. Yet on worksheets some students are writing that the hypothesis is about one thing while the conclusion is about some other thing. So again, students need practice in understanding the components of a written argument in scholarly literature, and that practice needs to happen early in the semester.

In principle I’m talking about scaffolding. But many of my assignments are attempts at getting students to build several different skills simultaneously. I think I need to disentangle my goals for these assignments so that they target only one skill at a time.