Mekong Dam Simulation, Part 2

Today we have a second guest post from Sharmaine Loh and Marek Rutkowski, of Monash University—Malaysia, and Joel Moore, Monash University—Australia. They can be contacted at sharmaine [dot] loh [at] monash [dot] edu, marek [dot] rutkowski [at] monash [dot] edu, and joel [dot] moore [at] monash [dot] edu.

In our last post, we described our Mekong River crisis simulation. The assessments that we use for this simulation are designed to reward student preparation and engagement (a detailed breakdown is in this appendix).

Students are initially provided with detailed position descriptions for employment in the organisations to which they’ve been assigned. They are asked to prepare for a mock job interview for that position, which requires them to conduct research and think about their role in the simulation. Because we have offered this scenario in an applied capstone class, we have required students to identify their own readings and research to be able to fulfil their roles.

Once the simulation begins, students write a weekly strategy memo for the lead member of their organization based on independent research they’ve conducted, an opportunity for them to consider the practical, actionable implications of scholarly work in the social sciences. Students must also document their interactions with other organisations and the media during the simulation in a reflective journal.

The head of each organization in turn relies on his or her team members to regularly provide advice about the best course of action in the unfolding crisis. If a group suggests a questionable course of action, the instructor uses follow-up questions to prompt students to consider possible negative consequences, e.g. how would investors view a decision to cancel the project?

At the end of the course, students analyze their experience of the simulation in a writing assignment.

The simulation is designed to make it difficult for students to upset the status quo. Local and international NGOs usually must settle for limited gains based on a government’s willingness to placate its critics. This sometimes leads to frustration and disillusionment, but it gives students a better understanding of the power disparity between governmental and nongovernmental actors. Students sometimes initially attempt to resolve the crisis by reaching a consensus among all parties involved, but they quickly realize that conflicting interests make this impossible. And although students are allowed to make risky decisions if they are well considered and not purposely disruptive, successfully negotiated political and policy changes in the simulation have always been limited and incremental.

In past iterations of the simulation, the incumbent Thai leadership has usually been able to retain control of the government and dominate issue framing, in some cases solidifying its position in the process. Thai opposition groups have had to navigate between outright rejection of government policies and more conciliatory, constructive criticism. Students have learned that political change is difficult to accomplish without a broad anti-government bloc that includes civil society organisations.

Changes at the international level have also been limited, accurately reflecting the shortcomings of the Lower Mekong governance regime and ASEAN’s commitment to the principle of non-interference. Students’ attempts to amend the 1995 Mekong Agreement have been hindered by states’ competing foreign policy objectives and the strict application of sovereignty. At most, parties have agreed on a controlled and gradual extension of the Mekong River Commission’s supervisory apparatus.

We have identified a few ways in which the simulation can be further improved. Students’ concerns about free riding within teams have continued, even though they are partially mitigated through the use of a team member evaluation tool (e.g. CATME or Feedback Fruits; we used one developed for this class by Joel). A possible solution could be a “divorce option” that allows students to “fire” a free-riding member. We have also observed that students’ insufficient background knowledge can lead to unrealistic behaviour in the simulation. This could be mitigated by increased redundancy within groups (multiple students being given the same or similar role) and by adding academic performance as a criterion in determining group allocation (Joel’s tool for allocating students into groups for class assignments has also been used to allocate students into roles for this class).

Mekong Dam Simulation

Today we have a guest post from Sharmaine Loh and Marek Rutkowski, of Monash University—Malaysia, and Joel Moore, Monash University—Australia. They can be contacted at sharmaine [dot] loh [at] monash [dot] edu, marek [dot] rutkowski [at] monash [dot] edu, and joel [dot] moore [at] monash [dot] edu.

We developed a six-week simulation, with three contact hours per week, about international competition over the freshwater resources of the lower Mekong River. The simulation, which we call the Riparian Dam Crisis, is designed to provide students with the opportunity to build collaboration, communication, and negotiation skills while learning about Southeast Asia. Students are introduced to select theories before the start of the simulation and incentivised to conduct independent research and source other relevant materials to inform the actions of their groups throughout.

The simulation involves a Thai-funded hydroelectric dam project in Laos. Most of the dam’s electricity will be purchased by Thailand. Shortly before the dam goes into operation, a drought reduces downstream water to its lowest level in living memory. This scenario, which resembles the real-life controversy over the Xayaburi dam a few years ago, reflects competing economic and environmental demands, weak regional regimes for dispute resolution, domestic political considerations, and transnational advocacy networks. Students assume the roles of various stakeholders that must try to achieve specific objectives in an evolving situation, such as the Thai, Lao, and Cambodian ministries of foreign affairs, rural NGOs, the regional Mekong River Commission, Thai political parties, and journalists. The stakeholders’ positions are deliberately asymmetric. The dam has been constructed wholly within Laos’s borders, which paradoxically gives the smallest country the largest say in the simulation’s outcome. Cambodia is the most negatively affected by upstream dams in Laos, but it has limited influence over Laos and Thailand because it is not a participant in the project. Meanwhile, Thailand is very susceptible to domestic pressure from interests that either support or oppose the dam.

During the simulation, student journalists representing two Thai media outlets conduct interviews and create stories targeting different audiences. The simulation’s other stakeholders need to engage strategically with reporters to have their actions framed in a positive manner. 

Broadly speaking, there is one constellation of groups that favours pushing forward with the dam, another that generally wants to halt it, and a third whose position is flexible. After an initial feeling-out period, students identify aligned groups and develop strategies to achieve their objectives. Each time we have run this simulation, students have focused their efforts on preserving or creating a sympathetic ruling coalition in Thailand after exhausting other diplomatic avenues. Students have also been quite creative in devising novel strategies to achieve group objectives, such as staging mock mass protest campaigns, lobbying global powers, and bringing down Thailand’s ruling coalition with a vote of no confidence.

In a future post, we will describe how we assess student learning from the simulation and how we adapted it over time in response to student experience.

The Wheels Are Coming Off

The essay below was written by a tenured professor at a public regional comprehensive university in the USA.

We always say we are a “tuition-dependent” state university, so any enrollment downturn hits us hard. What I didn’t fully appreciate before the Covid-19 pandemic is how dependent we are on revenue from the cafeterias and dorms. We suffered huge losses from going completely online for a year.

But wait, there is more! Even before the pandemic, our athletics department, by which I mean our football team, lost $10-12 million per year. They lie about this and hide it as best they can, but at a state institution with strong public records laws the truth has come out. Athletics at our university reports directly to the president and presents at every board of trustees meeting. To the trustees, the athletics department is the university. Only about 5% of the students ever attend a football game, but they all pay several hundred dollars per year in student fees to athletics.

Last year we hit a $25 million deficit. The administration slashed budgets everywhere. Almost half of our classified staff were laid off in June of 2020. Programs were eliminated, adjuncts all fired, phones taken out of faculty offices. You can’t get anything done on campus anymore; you email people or leave a phone message and there is no one to get back to you. A four-person office I often collaborate with was reduced to one person, the most junior, who was told to do all the work.

But guess what wasn’t cut? Athletics! The administration hired consultants—retired coaches—who decided, without any quantifiable evidence, that football was vital to the character of the institution. Their recommendation? Spend millions more per year on athletics. The administration committed to meeting that goal.

My department was combined with several others to “save money.” Our small but valuable graduate program was eliminated. Three secretaries were laid off and replaced, after a year, with one person working less than full-time to support three dozen faculty. Every faculty meeting is a battle between angry professors and our thin-skinned president who bristles at any criticism.

Enrollment is down another 20% this year. Our region of the state produces fewer high school graduates each year, and that number will drop off a cliff when the birth dearth that began with the 2008 recession reaches college age. In the last ten years, we have invested heavily in our physical campus; we are paying bonds on lovely, state-of-the-art buildings that will never be full.

I am less than five years from a possible early retirement. I am a graduate of this institution, and my wife and I met in a classroom where I now sometimes teach. I love this place. But it is never going to recover.

A Human Rights Foreign Policy Game

Today we have a guest post from Michelle Goodridge, academic librarian at Wilfrid Laurier University. She can be contacted at mgoodridge [at] wlu [dot] ca.

After a casual conversation about classroom games with my colleague Professor Andrew Robinson, we created a foreign policy simulation for his course, HR 100 Human Rights and Human Diversity. We had two goals for the simulation: first, have students explore why state actors fail to advance human rights domestically and internationally, and second, measure the simulation’s effectiveness in helping students achieve desired learning outcomes.

We modified the International Trade Game by:

  • Orienting the exercise around human rights instead of international trade.
  • Dividing students into three teams of high- and middle-income, pro-human rights democracies, two teams of low-income democracies indifferent to human rights, one team representing a high-income anti-human rights state, and one team representing an NGO.
  • Introducing the political objective of re-election.
  • Creating different winning conditions for each team.

To form teams, students picked one of several different colored t-shirts that we had laid out around the classroom. Each team received a corresponding packet of instructions and resources. I had the role of The Trader, who accepted the geometric shapes produced by teams in exchange for political support units. Andrew injected human rights crises into the simulation via PowerPoint. The simulation ran for an hour, with defined victory conditions that had to be met for there to be a winner. Often none of the teams met its victory condition, which came as a shock to the students, but it helped illustrate the complexity of international relations.

After the game concluded, we took time to debrief the students, and this is when they made robust connections between the simulation and the concepts they had been studying. I can only assume this is because verbalizing these responses right after the exercise is easier than writing them down a week afterward.

We attempted to measure the effectiveness of our Human Rights Foreign Policy Game with pre/post test evaluations. The evaluation results were anonymized, coded, and analyzed using SPSS. We found that the richest data came from students’ responses to the evaluation’s open-ended questions. So far, we have run this simulation in six semesters, and we will probably continue to use it in the future because of the high percentage of students reporting that it helped them learn. For more details, please see our article “Objective Assessment of Pedagogical Effectiveness and the Human Rights Foreign Policy Simulation Game,” Journal of Political Science Education 17, 2 (2021): 213-233, DOI: 10.1080/15512169.2019.1623048.

Information Literacy Exercise

Today we have a guest post from Colin Brown, assistant teaching professor in the Department of Political Science at Northeastern University. He can be reached at colin [dot] brown [at] northeastern [dot] edu.

It seems safe to say that political scientists have some concerns these days about information literacy, and information literacy is likely an implicit learning outcome for many of us. This blog has provided a number of good exercises for bringing information literacy into research methods, reading academic research, and headline writing. Inspired by these examples, I attempted to include this skill in my introductory comparative politics class, where democratic (de)consolidation is a major topic. In theory, the class gives students enough background to start keeping up with events around the world—if they choose to do so.

The exercise I tried this year, now available on APSA Educate, forces them to update slightly out-of-date readings on a country facing democratic backsliding (Poland) by finding out what’s happened there in the four or five years since the readings were published. Students were assigned to small groups, and each group was given a different kind of source to examine during a class session. One group read newspaper articles, another examined democracy indexes, yet another searched Wikipedia, etc. Students then applied what they’d read to course concepts: has democracy gotten weaker or stronger in Poland since the readings were published? Finally, students discussed what they trusted or distrusted about each type of source, and the potential merits of each.

I had a few key goals for students:

  • Think about source material for future courses. In an intro course, students not only might be unfamiliar with how research articles work, but also may not have a lot of practice in thinking about online source credibility.
  • Understand that while sources vary in credibility, there are pros and cons to using even the most credible sources. For example, the students who looked at V-Dem, Freedom House, etc., got clear, direct answers to the exercise’s questions, but they also correctly pointed out that they had to accept these organizations’ conceptualizations of democracy. And less credible sources like Wikipedia still had things to offer if used carefully.
  • Bridge the gap between classroom learning and events in the broader world and show how what they’re learning might help them understand the news.

When I ran this exercise in class this year, I budgeted only about 25 minutes for it, but it turned out to need 40 minutes or more to give students enough time to look at multiple sources in their category. We ended up using another 25 minutes the next day, but dividing the exercise into two sessions probably led to shallower searching and a less systematic attempt to make sense of sources.

When running this exercise in the future, I will think more explicitly about the balance between handholding and allowing students to practice seeking things out on their own. Last time I provided a couple of search terms, told students to keep searching beyond these, and asked them to keep a record of what they searched for (which, as best I could tell, no group did). Next time I will probably experiment with either giving students a fully curated list of search terms, so they can observe how this affects their search results, or, conversely, giving them even more time to “flail” about on their own before offering suggestions.

Statecraft in the International Relations Classroom

Today we have a guest post from Eric Cox, an associate professor at Texas Christian University. He can be contacted at e[dot]cox[at]tcu[dot]edu.

Does the online Statecraft simulation improve student learning when used as a key component of international relations classes? I explored this question in a Journal of Political Science Education article through a controlled comparison of two IR course sections taught during the same semester. One section was randomly chosen to participate in Statecraft, the other was assigned a research paper. The primary finding of the study was that students in both sections performed similarly on exams when controlling for other factors.

Statecraft is a turn-based simulation that divides students into “countries” that they govern. Each country must choose its form of government, economic system, and other attributes. Players also choose whether to focus on domestic spending priorities such as schools, hospitals and railroads, or on military capabilities. They must deal with terrorism, the melting of Ice Mountain, pirates, and rumors. The simulation is, to put it mildly, complex. I have been using it for just over a decade.

To try to put the students doing the research paper on an equal footing with those engaged with Statecraft, I dedicated several days of class to instruction in research writing skills and peer review. The students in this section spent roughly the same amount of time in class on their paper as the students in the Statecraft section did on the simulation. Both groups also wrote about the same amount.

At the end of the semester, I compared class performance on three exams and gave students a brief survey on their experiences. The initial findings were surprising: the research paper class did much better on exams but was less satisfied with the research assignment than the Statecraft students were with the simulation. I obtained access to students’ GPAs at the time they entered the course and re-ran my analysis with incoming GPA, whether students were taking the course for a grade, and whether they were political science majors as controls. Once these controls were introduced, the effect of Statecraft went away. The strongest predictor of course performance was incoming GPA: students with high prior GPAs made As, B students made Bs, and so on. Academic performance was independent of whether students completed the research paper or the Statecraft assignment. However, students in the Statecraft section showed a strong preference for the simulation over a traditional research paper, and students in the research paper section indicated they would have rather done Statecraft. Subsequent student evaluations have also demonstrated the relative popularity of Statecraft.
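
For readers curious about what this kind of analysis looks like in practice, below is a minimal sketch of a model with those controls. It is illustrative only, not the author’s actual code; the data file and variable names are hypothetical.

    # Illustrative sketch only; the published analysis may differ.
    # Assumes one row per student, with hypothetical column names.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("exam_scores.csv")  # hypothetical file

    # OLS of exam performance on a Statecraft indicator plus the controls
    # described above: incoming GPA, graded-enrollment status, and major.
    model = smf.ols(
        "exam_avg ~ statecraft + incoming_gpa + for_grade + polisci_major",
        data=df,
    ).fit()
    print(model.summary())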

That said, my use of Statecraft has evolved, something I discuss in detail in my chapter of Teaching International Relations. Foremost, I dedicate class time to the simulation, and draw examples from the simulation when discussing IR theory, issue areas, and current events. Students have indicated that the simulation gives them a greater appreciation for the complexity of international relations and the challenges leaders face. 

Editor’s note: previous posts on Statecraft can be found here.

Write Your Own Headlines Activity

This post comes from Chelsea Kaufman, assistant professor of political science at Wingate University. She can be contacted at c[dot]kaufman[at]wingate[dot]edu.

In teaching undergraduate research methods, I often find that the students are intimidated by the subject matter and don’t see its relevance to their lives. I have increasingly emphasized to students that it prepares them to be savvy consumers of political information wherever they might encounter it. This approach introduces an additional challenge, however: students often lack the information literacy skills to evaluate the sources that they access. If I want students to have the skills to evaluate the political information they encounter, I obviously need to teach them these skills. How exactly can this be accomplished? 

It is not enough to tell students which sources are acceptable, because people tend to trust information that aligns with their political predispositions. Simply lecturing to students about the dangers of misinformation can reinforce false beliefs and increase their distrust of reliable sources. 

To avoid this conundrum, I have students write their own headlines based on public opinion poll data. I first find a poll whose results were covered by several media outlets. I then send students a link to (or a printout of) the poll results, without providing any context as to how they were covered in the media. After writing their headlines, students compare them with those of their classmates and with published headlines about the same data. Students learn to interpret data and evaluate whether it received accurate coverage in the media. As the final part of the lesson, I ask them to evaluate the polling methods used to obtain the data, by, for example, considering how a question’s wording might have affected the responses.

You can view detailed instructions for the activity on APSA Educate. You can also read more about this topic and find examples of additional activities in my article “Civic Education in a Fake News Era: Lessons for the Methods Classroom” or my chapter in The Palgrave Handbook of Political Research Pedagogy.

The Challenge Game

Today we have a guest post from Elia Elisa Cia Alves, Federal University of Paraíba (UFPB), and Ana Paula Maielo Silva and Gabriela Gonçalves Barbosa, State University of Paraíba (UEPB), of Brazil. Elia Elisa Cia Alves can be contacted at eliacia [at] gmail [dot] com.

The Challenge Game was developed by a group of professors at the State University of Paraíba and the Mettrica Lab in Brazil. It is suitable for teaching concepts in international relations theory, such as state survival within an anarchic system, the security dilemma, alliances and the balance of power, and hegemony.

To play this game in the classroom, you will need 1) approximately 8 to 50 students who can play either individually or in teams, depending on the purpose to which the game is put, 2) candy, points, or some other reward that can be distributed, and 3) a method of determining the winner of a challenge between two parties, such as dice (high roll wins), rock-paper-scissors, or an online random number generator. Also, the rules of the game should be visible to students during the game.

The game is played in four rounds of approximately ten minutes each. A challenge is a one-candy bet (a loss results in one piece of candy being taken away) with a 50% probability of winning. Any individual or team that is challenged must participate in the challenge. Only one challenge should occur at a time so that the instructor can note what happens. A student or team that ends up with zero candy can no longer issue challenges; they are “dead” for the remainder of the round.

Round 1: Each student starts with one piece of candy. The winner of a challenge takes one piece of candy from the loser and can then challenge someone else. Any student who loses all of his or her candy is out of the game for the round. Depending on class size, the instructor may want to limit each student to a maximum number of challenges.

Round 2: Candy is distributed unequally among students. Most students should have 1-2 candies, a few students should have 3, and only a couple of students should have 4. The instructor may want to allow students to form alliances, in which case allied students can borrow candy from each other if needed. However, lending is optional.

Round 3: Group students into teams. Distribute candy unequally among teams as in Round 2. Each team represents a nation-state. Students within a team decide, using any decision-making method they choose, whether the team challenges any other team. As in Round 2, the instructor might allow teams to form alliances.

Round 4: Group students into teams and distribute candy as in Round 3. The professor grants special rules only to the teams with the most candy, such as altering their odds of winning a challenge. After the game, the professor should debrief the class to link theoretical international relations concepts to students’ experiences of the game. In our JPSE article, we suggest several questions that can be used as part of the debriefing.
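
For instructors who want a rough sense of how unevenly candy can end up distributed under these rules before trying the game in class, here is a minimal Python sketch of Round 1. It is purely illustrative and not part of the authors’ materials; the player and challenge counts are arbitrary assumptions.

    import random

    def simulate_round_one(num_students=20, num_challenges=30, seed=None):
        """Round 1: everyone starts with one candy; each challenge is a
        50/50 bet in which the winner takes one candy from the loser."""
        rng = random.Random(seed)
        candy = {s: 1 for s in range(num_students)}

        for _ in range(num_challenges):
            alive = [s for s, c in candy.items() if c > 0]
            if len(alive) < 2:  # not enough players left to stage a challenge
                break
            challenger, target = rng.sample(alive, 2)
            winner, loser = (challenger, target) if rng.random() < 0.5 else (target, challenger)
            candy[winner] += 1
            candy[loser] -= 1  # a player at zero candy is "dead" for the round

        return candy

    result = simulate_round_one(seed=42)
    survivors = sum(1 for c in result.values() if c > 0)
    print(f"{survivors} of {len(result)} players still hold candy")

Running the sketch a few times with different seeds gives a feel for how concentrated the candy can become by the end of the round even though everyone starts out equal.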

Developing a Podcast Assignment

Today we have a guest post from John McMahon, Assistant Professor of Political Science at SUNY Plattsburgh. He can be contacted at jmcma004 [at] plattsburgh [dot] edu.

Podcast assignments make students the creators of political knowledge, allow them to actively research subjects of interest, and offer them the opportunity to improve their writing, listening, and speaking abilities. The format is more interesting and authentic to students than that of traditional assignments, in part because of the popularity of podcasts among people under the age of thirty-five.

In my experience, there are two especially salient components of podcast assignment design. First, it is necessary to be intentional and clear with oneself and one’s students about the assignment’s required elements. A podcast’s political content, length, required sound elements (clips, effects, music, etc.), type of interview subjects (if any), how its creation is scaffolded—all require careful consideration. The requirements of the assignment need to match course learning objectives.

Second, do not worry too much about the technology. Instructional technology and library staff usually can provide support and resources, from workshops to USB microphones to campus recording studios. If needed, students can simply use their phones to record audio. Audio editing tools like Audacity and GarageBand are easy for students to learn, and instructional videos on podcast creation abound online. In my experience, students have also found Spotify’s Anchor to be an easy platform to use.

Podcast assignments are adaptable to a range of courses. I have used them successfully when teaching political theory and American politics at the 100-, 200-, and 300-level. Crucially, as we enter another pandemic academic term, this kind of assignment is suitable for online, hybrid, and in-person courses, including those that change modality in the middle of the term.

Instructions for one of my podcast assignments are available on APSA Educate, and I have published an article on student podcasting in the Journal of Political Science Education.

Exam Essays that Develop Research Skills: A Second Look at Zotero

Today we have a guest post from Adam Irish, an assistant professor of political science at California State University, Chico.

Like many professors, I change my teaching to fit the class or, in the past year, the Zoom discussion I am leading. My lower division survey courses focus on building a scholarly vocabulary and an understanding of concepts; upper division courses dive deeper into issues so that students can wade into the intellectual fray. However, this past year of online teaching revealed a potential overlap between the two: the development of research citation skills through the incorporation of Zotero.
