Here’s a quick and easy game you can use if you ever need to explain some basic methods concepts like variables or organizing data. It requires only a deck of playing cards (more for large classes) and can work for classes of any size.
I’ve often said that librarians are the most under-utilized resource of any college or university. At one of the schools I’ve taught at, they actually went begging to faculty to be invited in to do things with students. At most of the others, they frequently advertise their services to faculty, hoping that some of us will take them up on their offer.
The usual use of librarians for a research methods course is in teaching students how to find materials for a literature review using library databases. That’s a pretty standard need. But for methods classes that also incorporate qualitative methods, I’d like to suggest a second use for your librarians: teaching a hands-on class on primary source interpretation using materials from the school’s special collections or school archives.
Many research methods classes end with student presentations of their research papers. In the typical format, this requires faculty to clear three days of class time for students and faculty alike to sit there and be bored by badly organized and poorly designed ‘presentations’ of research. Unless students are given significant guidance on what to include and how to organize it, their talks usually omit important elements of the paper. Instructors resent the loss of class time, students resent either the loss of instructional time or the obligation to sit there and appear attentive when they don’t care, and no one really gains much from the whole situation except, perhaps, a small bit of public speaking experience.
I’d like to make a plea for poster sessions instead of presentations. In a typical 50- or 75-minute class, you can run two mini sessions where half the class presents and half is the audience. This costs you one period of instructional time – perhaps two – instead of three or four for standard presentations. Poster sessions are all about audience involvement. Instead of talking about their research once, students will speak about it informally and repeatedly for 20 minutes and respond to more questions from their peers as audience clusters come and go. They learn to speak succinctly and clearly about just the highlights of their research while still having to respond to questions about the details that students glean from the poster. With the aid of a simple poster review sheet, the audience members will engage more with their peers’ research and think more critically about it because their active involvement is a crucial part of poster sessions.
In short, poster sessions are significantly more active than traditional panel-style presentations, and they have clear benefits for presenters, audience members, and faculty alike. Guidance for students (and faculty) on what to put on a poster, and how to convert a traditional PowerPoint presentation to a poster, is in chapter 11 of my Empirical Research and Writing: A Political Science Student’s Practical Guide. (Or request a review copy from CQ Press.) That chapter also contains some basic suggestions for how to organize a poster session in a regular classroom without nice big bulletin boards.
Positivist epistemology can be the topic of a very boring lecture in an introductory social science or methods course. Fortunately, an activity borrowed from an elementary math class can alleviate the boredom by asking students to identify the sources of their own beliefs about unobservable phenomena and to explore questions about what constitutes an explanation.
The basic outline of this activity, which I use during the second class session of most courses I teach, is that unusual objects are concealed in opaque fabric bags. Small groups of students must attempt to describe and identify the object as thoroughly as possible without opening the bag. A recorder observes the group’s discussion and notes what data points they establish, how they negotiate rival hypotheses, and what threshold or pieces of evidence convinced group members about their findings. The process of investigation, discovery, and persuasion is repeated with several different bags, so that each group gets at least three objects. We then discuss what they learned about the objects, and as I reveal each item I link it to concepts in positivist epistemology such as what constitutes sufficient evidence, whether we can ever know for sure without being able to observe things, the role of context in defining the meaning of an observed act, and precision and accuracy in measurement.
This activity has been around for a while; I published it in PS: Political Science and Politics in 2006. Since then, it’s been picked up in history, linguistics, and psychology classes. The article goes through sources, ideas, and materials in more detail than I can here. The objects you use can vary, but I strongly recommend that you keep the two bags with film canisters or other containers (one with cotton balls inside, one without), and the bag with a piece of fabric ribbon. These objects allow you to get at issues of precision, accuracy, unobservability and authority most efficiently.
My own set of objects has evolved, and it shifts from course to course depending on the points I want to emphasize. My methods course, for example, emphasizes the difficulty of doing measurement across time and space. To raise this point, one of the bags includes the cake topper from my wedding – a simple engraved acrylic block intended as an executive award or paperweight. Most groups identify that it’s plastic and engraved and probably a paperweight… but without knowing what the engraving says, its context is completely unknown and the object’s significance is vastly misinterpreted. Out of its original context, our identification – coding – of that object was totally incorrect.
I’ve been using this activity for over a decade now and haven’t made any major improvements or changes to the version described in the PS paper. I strongly encourage you to give it a try: let your students’ own curiosity drive their understanding of how researching unobservables works.
Well, my first experience with specifications grading is almost over, and with the semester drawing to a close, it’s time to reflect on the experiment. Find my first entry on specs grading here, and previous entries on this experiment here, here, here, and here. But now, here are my top 5 take-aways and lessons learned from specifications grading:
#5: Specifications Grading is more work up front, but much less at the end and moving forward
Just after the (last) New Year, I emailed my spring Research Methods sections with the usual ‘class starts soon, here’s the syllabus, order your books’ message. This year’s message contained a strange assignment, due during the second week of class. They were to watch the recent film version of Stephen Sondheim’s Into the Woods and write a one-page reaction to the question: whose fault is it, and why?
For those unfamiliar with the film, Into the Woods is a complex tale braiding together half a dozen well-known fairy tales around the story of the Baker and his Wife, who remain childless thanks to a curse put on the Baker’s house by the Witch. The Baker and his Wife go into the woods to seek the ingredients to lift the curse. The rest of the village, meanwhile, is driven into the same woods by the arrival of a Giant (Jack’s fault, of course). While in the woods, all sorts of calamities befall the group. By intermission, things seem to be mostly repaired so that happily ever after is plausible, but in the second act, thanks to a continuation of behaviors from before – and the realization by some characters that fulfilled wishes aren’t always what you dreamed of – a second giant drives the group into the woods once again. Disasters of various sorts again ensue. But whose fault is it?
Midway through the second act, the characters confront the realities of determining causation in the real world. As the Baker says to Jack, “It’s because of you that there’s a giant in our midst and my wife is dead.” At that point, the fun begins from a methodological and pedagogical perspective. Are those two separate outcomes, or one? Is the causal chain from Jack to those event(s) equally strong? What constitutes a good explanation? The Baker’s Father was the ultimate trigger of the curse, which was placed on them by the Witch’s Mother – is it one of their faults? Neither of those characters even appears in the show – is that convincing? For that matter, what is “it”?? We have to begin by determining the actual dependent variable of interest: the phenomenon we’re explaining. Because of the nature of this particular show’s plot, all the plots are intertwined; isolating a single simple causal story is impossible. (Hello, equifinality and multiple causality!) The Baker’s wife walked off a cliff (in the more recent version), so it’s technically her own fault that she’s dead – or did the shaking of the earthquake cause her to lose her balance and fall? How do you attribute blame convincingly to an earthquake?
Aside from a bit of grumbling at an assignment so early, and the unfortunate guy who watched it in the frat house without knowing that “it was all singing!!,” this activity was generally well received. Forcing students to pick a side in their reaction papers gave us a starting point for discussion as I tallied blame on the board, then went into some of the justifications. Overall, the highly unorthodox use of film in a research methods class, and the even more unorthodox choice of films, was a very good way to start my admittedly unorthodox approach to a required but dreaded methods class.
We want our students to learn to read critically and to interrogate and evaluate what they read. Does the author have the right data? Do the conclusions actually follow from the data? Are other explanations missing from the argument? That’s what we want them to ask themselves. A quick look at students’ notes from reading – if they even took any – reveals a totally different set of information, usually focused on the literature review and sometimes the theory. After all, this is the main textually based body of an empirical paper, so it’s easiest for them to read.
Beyond steps we can take to teach students to read articles effectively (see my previous post on R&U and the Article Sort activity), I like to engage my intermediate and upper-level courses in an activity we call “You Be the Reviewer.” Students in all of my classes have already done the R&U activity and read (briefly) about the process papers go through to get published. So at some point in the term, I assign an unpublished article manuscript – often from a colleague or a conference paper pulled from the conference archives with author permission – and ask students to write a journal-style review, including a decision of whether the item should be published.
As support for this assignment, I distribute a handout like the one available here. It suggests some questions for students to consider, reminds them to check R&U for more guidance, and gives them a framework for writing a review. Typically, they are asked to post their reviews to the course learning management site and to bring a hard copy to class for reference. The resulting conversations have been far more in-depth and wide-ranging than anything else I’ve tried. At the end of the discussion, we collectively decide on the disposition of the article. Several classes – including a freshman-level intro course – have voted to reject manuscripts, though, as in the real world, R&Rs are the most common response.
While this activity obviously works better with upper-division classes, even lower-level students have enjoyed it and given incisive feedback. For lower-level classes, qualitative research or very simple quantitative analysis works best. I normally compile the students’ feedback (copying particularly relevant bits from the LMS and pasting into a document) and send it to the author as thanks for sharing the manuscript. In an undergraduate methods class, I once was able to have the author come and give a (previously prepared) conference-style presentation to the class on the manuscript they had reviewed. The author also took questions, so that the class had a model presentation to use in preparing their own as well as a chance to ask the author about research design decisions and practice giving useful feedback on research-in-progress before their own peer review process.
I’ve found that using a manuscript – an honest-to-goodness pre-publication, looks-like-it-was-written-in-Word-then-PDF’d manuscript – gets a far better reaction than published research. Students are reluctant to question or challenge work by ‘experts’ that’s already been vetted and published, but papers are a different matter.
Have you used unpublished research (other than your own) with your students? What was their reaction?