Undergraduates Doing Replication? Why Not!?

When I taught Quantitative Methods last spring, a colleague picked up a paper of mine from the printer and came looking for me. “You’re doing replications with the undergrads?” she asked. “Why?” I looked at her and without thinking simply replied, “Why not?” Replicating studies is considered a best practice of sorts in graduate-level methods training. None of the reasons given there – teaching disciplinary norms, emphasizing the importance of transparency in research, etc. – fails to hold in the undergraduate context. If anything, our undergraduates have more need of those objectives than our grad students, who will have them repeatedly reinforced across multiple classes. For most of us who teach undergrads, one methods class is all we’ve got, and we need to make it count.

For most of us, part of the objective of a research methods course is to introduce students to the ways of thinking and doing that characterize social science, and social science research especially. It’s a key point of socialization into the discipline, where they go from being students of politics to being students of political science.

Distance Teaching + 1-on-1 = … Active Learning?

Recently I’ve begun doing some dissertation coaching while in a gap between positions. It’s all done remotely using a free videoconferencing service (more below). But here I am, videocalling with students I’ve never met, trying to step into an ongoing project and guide them out of whatever mess they (think they) are in. It’s an unusual form of teaching, but so far I like it a lot.

The experience has me thinking about what active learning really LOOKS like. Obviously, no simulations are going on here; no games either, or case studies, or other typical discussion materials. So how do you DO active learning when you’re one-on-one, geographically remote, and dealing primarily with the writing process?

Make Your Librarian Love You

I’ve often said that librarians are the most under-utilized resource of any college or university. At one of the schools I’ve taught at, they actually went begging to faculty to be invited in to do things with students. At most of the others, they frequently advertise their services to faculty, hoping that some of us will take them up on their offer.

The usual use of librarians for a research methods course is in teaching students how to find materials for a literature review using library databases. That’s a pretty standard need. But for methods classes that also incorporate qualitative methods, I’d like to suggest a second use for your librarians: teaching a hands-on class on primary source interpretation using materials from the school’s special collections or school archives.

Where Did Your Stuff Come From?

Most American students struggle to understand the extent to which international trade affects their lives, and the ways the US trades with the world. I can show (and have shown) statistics about trade and economics in very graphic and immediate form, but numbers on the scale of trillions are hard to conceptualize.

To combat that, I asked students in an introductory international politics class to go on a scavenger hunt. They were tasked with finding one item from each of five world regions – Europe, Latin America, Sub-Saharan Africa, Middle East & North Africa, and Asia & the Pacific. They had to take a picture of the ‘made in’ indicator (and part of their student ID, to ensure that they didn’t just grab stock photography or Instagram images) and post it to the class learning management system’s discussion board. To sweeten the pot, I offered 2 bonus points for unique entries, where no one else posted something from that country. Specialty foods and beverages were excluded (no taking a picture of a bottle of Stella for Belgium).

Students went crazy hunting for stuff. The two bonus points were apparently a huge incentive, with students finding and posting additional items when someone else duplicated “their” country.


How Do We Know What We Know? Active Learning and the Scientific Method

Positivist epistemology can be the topic of a very boring lecture in an introductory social science or methods course. Fortunately, an activity borrowed from an elementary math class can alleviate the boredom by asking students to identify the sources of their own beliefs about unobservable phenomena and to explore questions about what constitutes an explanation.

The basic outline of this activity, which I use during the second class session of most courses I teach, is that unusual objects are concealed in opaque fabric bags. Small groups of students must attempt to describe and identify the object as thoroughly as possible without opening the bag. A recorder observes the group’s discussion and notes what data points they establish, how they negotiate rival hypotheses, and what threshold or pieces of evidence convinced group members about their findings. The process of investigation, discovery, and persuasion is repeated with several different bags, so that each group gets at least 3 objects. We then discuss what they learned about the objects, and as I reveal each item I link it to concepts in positivist epistemology such as what constitutes sufficient evidence, whether we ever can know for sure without being able to observe things, the role of context in defining the meaning of an observed act, and precision and accuracy in measurement.

This activity has been around for a while; I published it in PS: Political Science and Politics in 2006. Since then, it’s been picked up in history, linguistics, and psychology classes. The article goes through sources, ideas, and materials in more detail than I can here. The objects you use can vary, but I strongly recommend that you keep the two bags with film canisters or other containers (one with cotton balls inside, one without), and the bag with a piece of fabric ribbon. These objects allow you to get at issues of precision, accuracy, unobservability and authority most efficiently.

My own set of objects has evolved, and it shifts from course to course depending on the points I want to emphasize. My methods course, for example, emphasizes the difficulty of doing measurement across time and space. To raise this point, one of the bags includes the cake topper from my wedding – a simple engraved acrylic block intended as an executive award or paperweight. Most groups identify that it’s plastic and engraved and probably a paperweight… but without knowing what the engraving says, its context is completely unknown and the object’s significance is vastly misinterpreted. Out of its original context, our identification – coding – of that object was totally incorrect.

I’ve been using this activity for over a decade now and haven’t made any major improvements or changes to the version described in the PS paper. I strongly encourage you to give it a try: let your students’ own curiosity drive their understanding of how researching unobservables works.

Unconventional Films in the Classroom: Into the (Methodological) Woods

Just after the (last) New Year, I emailed my spring Research Methods sections with the usual ‘class starts soon, here’s the syllabus, order your books’ message. This year’s message contained a strange assignment, due during the second week of class. They were to watch the recent film version of Stephen Sondheim’s Into the Woods and write a one-page reaction to the questions: whose fault is it, and why?

For those unfamiliar with the film, Into the Woods is a complex tale braiding together half a dozen well-known fairy tales around the story of the Baker and his Wife, who remain childless thanks to a curse put on the Baker’s house by the Witch. The Baker and his Wife go into the woods to seek the ingredients to lift the curse. The rest of the village, meanwhile, is driven into the same woods by the arrival of a Giant (Jack’s fault, of course). While in the woods, all sorts of calamities befall the group. By intermission, things seem to be mostly repaired so that happily ever after is plausible, but in the second act, thanks to a continuation of behaviors from before – and the realization by some characters that fulfilled wishes aren’t always what you dreamed of – a second giant drives the group into the woods once again. Disasters of various sorts again ensue. But whose fault is it?

Midway through the second act, the characters confront the realities of determining causation in the real world. As the Baker says to Jack, “It’s because of you that there’s a giant in our midst and my wife is dead.” At that point, the fun begins from a methodological and pedagogical perspective. Are those two separate outcomes, or one? Is the causal chain from Jack to those event(s) equally strong? What constitutes a good explanation? The Baker’s Father was the ultimate trigger of the curse, which was placed on them by the Witch’s Mother – is it one of their faults? Neither of those characters even appears in the show – is that convincing? For that matter, what is “it”?? We have to begin by determining the actual dependent variable of interest: the phenomenon we’re explaining. Because of the nature of this particular show’s plot, all the plots are intertwined; isolating a single simple causal story is impossible. (Hello, equifinality and multiple causality!) The Baker’s wife walked off a cliff (in the more recent version), so it’s technically her own fault that she’s dead – or did the shaking of the earthquake cause her to lose her balance and fall? How do you attribute blame convincingly to an earthquake?

Aside from a bit of grumbling at an assignment so early, and the unfortunate guy who watched it in the frat house without knowing that “it was all singing!!”, this activity was generally well received. Forcing students to pick a side in their reaction papers gave us a starting point for discussion as I tallied blame on the board, then went into some of the justifications. Overall, the highly unorthodox use of film in a research methods class, and the even more unorthodox choice of film, was a very good way to start my admittedly unorthodox approach to a required but dreaded methods class.

The Power of Unpublished Research: You Be the Reviewer

We want our students to learn to read critically and to interrogate and evaluate what they read. Does the author have the right data? Do the conclusions actually follow from the data? Are other explanations missing from the argument? That’s what we want them to ask themselves. A quick look at students’ notes from reading – if they even took any – reveals a totally different set of information, usually focused on the literature review and sometimes the theory. After all, this is the main textually based body of an empirical paper, so it’s easiest for them to read.

Beyond steps we can take to teach students to read articles effectively (see my previous post on R&U and the Article Sort activity), I like to engage my intermediate and upper-level courses in an activity we call “You Be the Reviewer.” Students in all of my classes have already done the R&U activity and read (briefly) about the process papers go through to get published. So at some point in the term, I assign an unpublished article manuscript – often from a colleague or a conference paper pulled from the conference archives with author permission – and ask students to write a journal-style review, including a decision of whether the item should be published.

As support for this assignment, I distribute a handout like the one available here. It suggests some questions for students to consider, reminds them to check R&U for more guidance, and gives them a framework for writing a review. Typically, they are asked to post their reviews to the course learning management site and to bring a hard copy to class for reference. The resulting conversations have been far more in-depth and wide-ranging than anything else I’ve tried. At the end of the discussion, we collectively decide on the disposition of the article. Several classes – including a freshman-level intro course – have voted to reject manuscripts, though, as in the real world, R&Rs are the most common response.

While this activity obviously works better with upper-division classes, even lower-level students have enjoyed it and given very incisive feedback. For lower-level classes, qualitative research or very simple quantitative analysis works best. I normally compile the students’ feedback (copying particularly relevant bits from the CMS and pasting into a document) and send it to the author as thanks for sharing the manuscript. In an undergraduate methods class, I was once able to have the author come and give a (previously prepared) conference-style presentation to the class on the manuscript they had reviewed. The author also took questions, so that the class had a model presentation to use in preparing their own as well as a chance to ask the author about research design decisions and practice giving useful feedback on research-in-progress before their own peer review process.

I’ve found that using a manuscript – an honest-to-goodness pre-publication, looks-like-it-was-written-in-Word-then-PDF’d manuscript – gets a far better reaction than published research. Students are reluctant to question or challenge work by ‘experts’ that’s already been vetted and published, but papers are a different matter.

Have you used unpublished research (other than your own) with your students? What was their reaction?