Today we have a guest post from Rebecca A. Glazier at the School of Public Affairs at the University of Arkansas at Little Rock (rebecca [dot] glazier [at] gmail [dot] com) and Matthew Pietryka at Florida State University’s political science department (mpietryka [at] fsu [dot] edu).
Many professors are struggling to reach students who are disengaged and burned out. To address these issues and improve student retention, universities are increasingly turning to edtech solutions or big data—everything from predictive analytics to chatbots in discussion boards. These remedies tend to be far removed from students’ daily lives. In contrast, as professors, we are with students in the classroom every day. And this experience often prepares us to know best how to engage our students.
In a new, open-access article we just published in Education Sciences, “Learning through Collaborative Data Projects: Engaging Students and Building Rapport,” we illustrate how faculty can engage students through collaborative data projects. Rather than relying on top-down university solutions, faculty can use the content of their own courses to involve students in collaborative projects that build rapport and make them feel included and engaged in the course. We see these collaborative data projects as another kind of active learning—getting students thinking outside of the textbook and involved in contributing to a project that is bigger than themselves.
We used data from more than 120 students over two semesters, and our results suggest that most students find these collaborative data projects more enjoyable than typical college assignments. Students also report that the projects make them feel the professor is invested in their learning.
The article provides advice on implementing these projects, as well as the R code we used to create individualized reports for students participating in the collaborative data projects. The individualized reports help develop rapport between the professor and each student, and this programmatic approach allows professors to scale the reports up to classes with hundreds of students. Building rapport and engaging in active learning are often considered possible only in smaller classes, but our approach demonstrates that they can be done in large classes as well—with significantly positive results.
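The article itself provides the authors’ R code for these reports. As a rough illustration of the general pattern—loop over class data and render one short, personalized summary per student, so the process scales to hundreds of students—here is a hypothetical Python sketch. All field names and data below are invented for illustration, not taken from the article.

```python
# Hypothetical sketch of the "individualized report" idea described above.
# The published article contains the authors' actual R code; this Python
# version only illustrates the general approach.
import csv
import io

# Stand-in for a gradebook or survey export (names and fields are invented).
CLASS_DATA = """student,responses_submitted,avg_word_count
Avery,5,142
Blake,3,97
"""

def build_report(row, class_avg):
    """Render one student's personalized summary as plain text."""
    return (
        f"Hi {row['student']},\n"
        f"You submitted {row['responses_submitted']} responses "
        f"(class average: {class_avg:.1f}).\n"
        f"Your average response length was {row['avg_word_count']} words.\n"
    )

def build_all_reports(csv_text):
    """Build one report per student from a CSV export of class data."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    class_avg = sum(int(r["responses_submitted"]) for r in rows) / len(rows)
    # One report per student; in practice each would be emailed or printed.
    return {r["student"]: build_report(r, class_avg) for r in rows}

reports = build_all_reports(CLASS_DATA)
print(reports["Avery"])
```

Because the reports are generated programmatically, adding a student is just adding a row of data—the same loop produces two reports or two hundred.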
At a time when many faculty members are struggling to engage students, we can take matters into our own hands by designing projects for our classes that draw students in and build rapport with them. It doesn’t take expensive edtech solutions or top-down directives. Mostly, it takes thoughtful pedagogy and prioritizing student connection.
Open Access article link: https://www.mdpi.com/2227-7102/12/12/897.
Recent episode on the Teaching in Higher Ed Podcast on this research: https://teachinginhighered.com/podcast/engaging-students-through-collaborative-research-projects/.
Today we have a second guest post from Sharmaine Loh and Marek Rutkowski, of Monash University—Malaysia, and Joel Moore, Monash University—Australia. They can be contacted at sharmaine [dot] loh [at] monash [dot] edu, marek [dot] rutkowski [at] monash [dot] edu, and joel [dot] moore [at] monash [dot] edu.
In our last post, we described our Mekong River crisis simulation. The assessments that we use for this simulation are designed to reward student preparation and engagement (a detailed breakdown is in this appendix).
Students are initially provided with detailed position descriptions for employment in the organisations to which they’ve been assigned. They are asked to prepare for a mock job interview for that position, which requires them to conduct research and think about their role in the simulation. We have offered this scenario in an applied capstone class, so we have required students to identify their own readings and research to be able to fulfil their roles.
Once the simulation begins, students write a weekly strategy memo for the lead member of their organization based on independent research they’ve conducted, which gives them an opportunity to consider the practical, actionable implications of scholarly work in the social sciences. Students must also document their interactions with other organisations and the media during the simulation in a reflective journal.
The head of each organization in turn relies on his or her team members to regularly provide advice about the best course of action in the unfolding crisis. If a group suggests a questionable course of action, the instructor uses follow-up questions to prompt students to consider possible negative consequences—e.g., “How would investors view a decision to cancel the project?”
At the end of the course, students analyze their experience of the simulation in a writing assignment.
The simulation is designed to make it difficult for students to upset the status quo. Local and international NGOs usually must settle for limited gains based on a government’s willingness to placate its critics. This sometimes frustrates and disillusions students, but it gives them a better understanding of the power disparity between governmental and nongovernmental actors. Students sometimes initially attempt to resolve the crisis by reaching a consensus among all parties involved, but they quickly realize that conflicting interests make this impossible. Students are allowed to make risky decisions if these are well considered and not purposely disruptive, yet successfully negotiated political and policy changes in the simulation have always been limited and incremental.
In past iterations of the simulation, the incumbent Thai leadership has usually been able to retain control of the government and dominate issue framing, in some cases solidifying its position in the process. Thai opposition groups have had to navigate between outright rejection of government policies and a more conciliatory and constructive criticism. Students have learned that political change is difficult to accomplish without a broad anti-government bloc that includes civil society organisations.
Changes at the international level have also been limited, accurately reflecting the shortcomings of the Lower Mekong governance regime and ASEAN’s commitment to the principle of non-interference. Students’ attempts to amend the 1995 Mekong Agreement have been hindered by states’ competing foreign policy objectives and the strict application of sovereignty. At most, parties have agreed on a controlled and gradual extension of the Mekong River Commission’s supervisory apparatus.
We have identified a few ways in which the simulation can be further improved. Students’ concerns about free riding within teams have persisted, although they are partially mitigated through the use of a team member evaluation tool (e.g., CATME or Feedback Fruits; we used one developed for this class by Joel). A possible solution could be a “divorce option,” where students would be allowed to “fire” a free-riding member. We have also observed that insufficient background knowledge can lead students to behave unrealistically in the simulation. This could be mitigated by increased redundancy within groups (multiple students being given the same or similar role) and by adding academic performance as a criterion in group allocation (Joel’s tool for allocating students into groups for class assignments has also been used to allocate students into roles for this class).
Today we have a guest post from Sharmaine Loh and Marek Rutkowski, of Monash University—Malaysia, and Joel Moore, Monash University—Australia. They can be contacted at sharmaine [dot] loh [at] monash [dot] edu, marek [dot] rutkowski [at] monash [dot] edu, and joel [dot] moore [at] monash [dot] edu.
We developed a six-week simulation, with three contact hours per week, about international competition over the freshwater resources of the lower Mekong River. The simulation, which we call the Riparian Dam Crisis, is designed to provide students with the opportunity to build collaboration, communication, and negotiation skills while learning about Southeast Asia. Students are introduced to select theories before the start of the simulation and incentivised to conduct independent research and source other relevant materials to inform the actions of their groups throughout.
The simulation involves a Thai-funded hydroelectric dam project in Laos. Most of the dam’s electricity will be purchased by Thailand. Shortly before the dam goes into operation, a drought reduces downstream water to its lowest level in living memory. This scenario, which resembles the real-life Xayaburi dam controversy of a few years ago, reflects competing economic and environmental demands, weak regional regimes for dispute resolution, domestic political considerations, and transnational advocacy networks. Students assume the roles of various stakeholders that must try to achieve specific objectives in an evolving situation, such as the Thai, Lao, and Cambodian ministries of foreign affairs, rural NGOs, the regional Mekong River Commission, Thai political parties, and journalists. The dam has been constructed wholly within Laos’s borders, which paradoxically gives the smallest country the largest say in the simulation’s outcome. Cambodia is the most negatively affected by upstream dams in Laos, but it has limited influence over Laos and Thailand because it is not a participant in the project. Meanwhile, Thailand is very susceptible to domestic pressure from interests that either support or oppose the dam.
During the simulation, student journalists representing two Thai media outlets conduct interviews and create stories targeting different audiences. The simulation’s other stakeholders need to engage strategically with reporters to have their actions framed in a positive manner.
Thus, there is one constellation of groups that broadly favours pushing forward with the dam, another that generally wants to halt it, and a third whose position is flexible. After an initial feeling-out period, students identify aligned groups and develop strategies to achieve their objectives. Each time we have run this simulation, students have focused their efforts on preserving or creating a sympathetic ruling coalition in Thailand after exhausting other diplomatic avenues. Students have been quite creative in devising novel strategies to achieve group objectives, such as staging mock mass protest campaigns, lobbying global powers, and bringing down Thailand’s ruling coalition with a vote of no confidence.
In a future post, we will describe how we assess student learning from the simulation and how we adapted it over time in response to student experience.
The essay below was written by a tenured professor at a public regional comprehensive university in the USA.
We always say we are a “tuition-dependent” state university, so any enrollment downturn hits us hard. What I didn’t fully appreciate before the Covid-19 pandemic is how dependent we are on revenue from the cafeterias and dorms. We suffered huge losses from going completely online for a year.
But wait, there is more! Even before the pandemic, our athletics department, by which I mean our football team, lost $10-12 million per year. They lie about this and hide it as best they can, but at a state institution with strong public records laws the truth has come out. Athletics at our university reports directly to the president and presents at every board of trustees meeting. To the trustees, the athletics department is the university. Only about 5% of the students ever attend a football game, but they all pay several hundred dollars per year in student fees to athletics.
Last year we hit a $25 million deficit. The administration slashed budgets everywhere. Almost half of our classified staff were laid off in June of 2020. Programs were eliminated, adjuncts all fired, phones taken out of faculty offices. You can’t get anything done on campus anymore; you email people or leave a phone message, and no one gets back to you. A four-person office I often collaborate with was reduced to one person, the most junior, who was told to do all the work.
But guess what wasn’t cut? Athletics! The administration hired consultants—retired coaches—who decided, without any quantifiable evidence, that football was vital to the character of the institution. Their recommendation? Spend millions more per year on athletics. The administration committed to meeting that goal.
My department was combined with several others to “save money.” Our small but valuable graduate program was eliminated. Three secretaries were laid off and replaced, after a year, with one person working less than full-time to support three dozen faculty. Every faculty meeting is a battle between angry professors and our thin-skinned president who bristles at any criticism.
Enrollment is down another 20% this year. Our region of the state produces fewer high school graduates each year, and that number will drop off a cliff when the fast-approaching birth dearth that followed the 2008 recession hits us. In the last ten years, we have invested heavily in our physical campus; we are paying bonds on lovely, state-of-the-art buildings that will never be full.
I am less than five years from a possible early retirement. I am a graduate of this institution, and my wife and I met in a classroom where I now sometimes teach. I love this place. But it is never going to recover.
Today we have a guest post from Michelle Goodridge, academic librarian at Wilfrid Laurier University. She can be contacted at mgoodridge [at] wlu [dot] ca.
After a casual conversation about classroom games with my colleague Professor Andrew Robinson, we created a foreign policy simulation for his course, HR 100 Human Rights and Human Diversity. We had two goals for the simulation: first, have students explore why state actors fail to advance human rights domestically and internationally, and second, measure the simulation’s effectiveness in helping students achieve desired learning outcomes.
We modified the International Trade Game by:
- Orienting the exercise around human rights instead of international trade.
- Dividing students into three teams of high- and middle-income, pro-human rights democracies; two teams of low-income democracies indifferent to human rights; one team of a high-income state that is anti-human rights; and one team representing an NGO.
- Introducing the political objective of re-election.
- Creating different winning conditions for each team.
To form teams, students picked one of several different colored t-shirts that we had laid out around the classroom. Each team received a corresponding packet of instructions and resources. I played the role of The Trader, who accepted the geometric shapes produced by teams in exchange for political support units. Andrew injected human rights crises into the simulation via PowerPoint. The simulation ran for an hour, with defined victory conditions that had to be met to have a winner. Often none of the teams met its victory condition, which came as a shock to the students, but it helped illustrate the complexity of international relations.
After the game concluded, we took time to debrief the students, and this is when students made robust connections between the simulation and concepts they had been studying. I can only assume this is because verbalizing these responses right after the exercise is easier than writing them down a week afterward.
We attempted to measure the effectiveness of our Human Rights Foreign Policy Game with pre- and post-test evaluations. The evaluation results were anonymized, coded, and analyzed using SPSS. We found that the richest data came from students’ responses to the evaluation’s open-ended questions. So far, we have run this simulation in six semesters, and we will probably continue to use it in the future because of the high percentage of students reporting that it helped them learn. For more details, please see our article “Objective Assessment of Pedagogical Effectiveness and the Human Rights Foreign Policy Simulation Game,” Journal of Political Science Education 17, 2 (2021): 213-233, DOI: 10.1080/15512169.2019.1623048.
Today we have a guest post from Colin Brown, assistant teaching professor in the Department of Political Science at Northeastern University. He can be reached at colin [dot] brown [at] northeastern [dot] edu.
It seems safe to say that political scientists have some concerns these days about information literacy, and information literacy is likely an implicit learning outcome for many of us. This blog has provided a number of good exercises for bringing information literacy into research methods, reading academic research, and headline writing. Inspired by these examples, I attempted to include this skill in my introductory comparative politics class, where democratic (de)consolidation is a major topic. In theory, the class gives students enough background to start keeping up with events around the world—if they choose to do so.
The exercise I tried this year, now available on APSA Educate, forces them to update slightly out-of-date readings on a country facing democratic backsliding (Poland) by finding out what’s happened there in the four or five years since those readings were published. Students were assigned to small groups, and each was given a different kind of source to examine during a class session. One group read newspaper articles, another examined democracy indexes, yet another searched Wikipedia, etc. Students then applied what they’d read to course concepts—has democracy gotten weaker or stronger in Poland since these were published? Finally, students discussed what they trusted or distrusted about each type of source, and the potential merits of each.
I had a few key goals for students:
- Think about source material for future courses. In an intro course, students not only might be unfamiliar with how research articles work, but also may not have a lot of practice in thinking about online source credibility.
- Understand that while sources vary in credibility, there are pros and cons to using even the most credible sources. For example, the students who looked at V-Dem, Freedom House, etc., got clear, direct answers to the exercise’s questions, but they also correctly pointed out that they had to accept these organizations’ conceptualizations of democracy. And less credible sources like Wikipedia still had things to offer if used carefully.
- Bridge the gap between classroom learning and events in the broader world and show how what they’re learning might help them understand the news.
When I ran this exercise in class this year, I budgeted only about 25 minutes for it, but it turned out to need 40 minutes or more to give students enough time to look at multiple sources in their category. We ended up using another 25 minutes the next day, but dividing the exercise across two sessions probably led to shallower searching and a less systematic attempt to make sense of the sources.
When running this exercise in the future, I will think more explicitly about the balance between handholding and allowing students to practice seeking things out on their own. Last time I provided a couple of search terms, told them to keep looking outward beyond these, and to keep a record of what they searched for (which as best I could tell no group did). Next time I will probably experiment with either giving students a fully curated list of search terms, so they can observe how this affects their search results, or, conversely, I might give them even more time to “flail” about on their own before offering suggestions.
Today we have a guest post from Eric Cox, an associate professor at Texas Christian University. He can be contacted at e[dot]cox[at]tcu[dot]edu.
Does the online Statecraft simulation improve student learning when used as a key component of international relations classes? I explored this question in a Journal of Political Science Education article through a controlled comparison of two IR course sections taught during the same semester. One section was randomly chosen to participate in Statecraft, the other was assigned a research paper. The primary finding of the study was that students in both sections performed similarly on exams when controlling for other factors.
Statecraft is a turn-based simulation that divides students into “countries” that they govern. Each country must choose its form of government, economic system, and other attributes. Players also choose whether to focus on domestic spending priorities such as schools, hospitals and railroads, or on military capabilities. They must deal with terrorism, the melting of Ice Mountain, pirates, and rumors. The simulation is, to put it mildly, complex. I have been using it for just over a decade.
To try to put the students doing the research paper on an equal footing with those engaged with Statecraft, I dedicated several days of class to instruction in research writing skills and peer review. The students in this section spent roughly the same amount of time in class on their paper as the students in the Statecraft section did on the simulation. Both groups also wrote about the same amount.
At the end of the semester, I compared class performance on three exams and gave students a brief survey on their experiences. The initial findings were surprising: students in the research paper section did much better on exams but were less satisfied with the research assignment than the Statecraft students were with the simulation. I obtained access to students’ GPAs at the time they entered the course, and re-ran my analysis with GPA, whether students were taking the course for a grade, and whether students were political science majors as controls. Once these controls were introduced, the effect of Statecraft went away. The strongest predictor of course performance was incoming GPA: students with high prior GPAs made As, B students made Bs, and so on. Academic performance was independent of the research paper or Statecraft assignment. However, students in the Statecraft section showed a strong preference for the simulation over a traditional research paper, and students in the research paper section indicated they would have rather done Statecraft. Subsequent student evaluations have also demonstrated the relative popularity of Statecraft.
That said, my use of Statecraft has evolved, something I discuss in detail in my chapter of Teaching International Relations. Foremost, I dedicate class time to the simulation, and draw examples from the simulation when discussing IR theory, issue areas, and current events. Students have indicated that the simulation gives them a greater appreciation for the complexity of international relations and the challenges leaders face.
Editor’s note: previous posts on Statecraft can be found here.
This post comes from Chelsea Kaufman, assistant professor of political science at Wingate University. She can be contacted at c[dot]kaufman[at]wingate[dot]edu.
In teaching undergraduate research methods, I often find that students are intimidated by the subject matter and don’t see its relevance to their lives. I have increasingly emphasized to students that the course prepares them to be savvy consumers of political information wherever they might encounter it. This approach introduces an additional challenge, however: students often lack the information literacy skills to evaluate the sources that they access. If I want students to have the skills to evaluate the political information they encounter, I obviously need to teach them these skills. How exactly can this be accomplished?
It is not enough to tell students which sources are acceptable, because people tend to trust information that aligns with their political predispositions. Simply lecturing to students about the dangers of misinformation can reinforce false beliefs and increase their distrust of reliable sources.
To avoid this conundrum, I have students write their own headlines based on public opinion poll data. I first find a poll with results covered in several media outlets. I then send students a link to (or print out) the results of the poll, without providing any context as to how it was covered in the media. After writing their headlines, students share and compare them with those of their classmates and with published headlines about the same data. Students learn to interpret data and evaluate whether it received accurate coverage in the media. As the final part of the lesson, I ask them to evaluate the polling methods used to obtain the data by, for example, considering how a question’s wording might have impacted the responses.
You can view detailed instructions for the activity on APSA Educate. You can also read more about this topic and find examples of additional activities in my article “Civic Education in a Fake News Era: Lessons for the Methods Classroom” or my chapter in The Palgrave Handbook of Political Research Pedagogy.