Wargaming for Learning: A robust examination of how educational gaming adds value to traditional instructional techniques

My latest article is out in the Journal of Political Science Education and I’m excited to share it with ALPS readers. Alongside Dr. Lisa Kerr, also at the Naval War College, we set out to rigorously examine whether educational gaming is worth the extra time it takes; in other words, do students learn more by playing a game (in this case, a bespoke wargame called War at Sea) when they’ve already encountered the material through traditional methods of learning such as reading, lecture, and discussion of a case study? Our research says yes.

Using objective measures (i.e., a test), we found no statistically significant increase in learning after the traditional methods of instruction when compared to the pretest. Students who wargamed, however, showed statistically significant increases in learning after wargaming, both relative to their own pre-game scores AND relative to students who did not wargame. Students also reported an increased preference for learning through gaming after experiencing both methods. Most studies of gaming in political science face methodological limitations: they are show-and-tell style pieces that share great ideas without assessment, rely on student self-assessment of learning, have small sample sizes, study the authors’ own undergraduate students, or lack control groups. Ours avoids these methodological constraints, and adds to our understanding of the value of using educational games to teach issues of national security and war fighting.

One additional finding that will be of interest to those who study simulations and games: when asked what part of the gaming experience contributed most to their learning, students overwhelmingly pointed to the gameplay and preparation, rather than the debrief. Since most of us point to the debrief as where the learning happens, this finding suggests several possible explanations. First, we might be wrong, and the debrief is not as necessary as we think. This hypothesis absolutely merits further study, as we tend to assert that the debrief matters theoretically without providing evidence for it. Second, students might be wrong. We know from other studies, such as Deslauriers et al. (2019), that students are not always great at identifying their own sources of learning; in that study, students reported learning more from traditional modes of teaching even though their scores showed they learned more from active learning techniques. Finally, it may be that the debrief has value, but that value depends on instructor skill: how much do we prepare for the debrief and help students extract lessons? Do we use a post-game oral debrief or reflective exercises? In other words, it may be the execution of the debrief, rather than the idea of it, that is the problem.

We hope that this article is useful to those instructors still facing pushback on the use of educational games in the classroom and to those thinking through the methodological issues we face in studying the impact of games. Next up, we are interested in studying what specific skills and knowledge are activated during gameplay, which kinds of games map onto different kinds of learning, and how engaged students are during games. Stay tuned, and in the meantime, please let me know what you think of the piece!