Observing Observation

Two weeks ago, students in my economic development and environmental politics course played my simulation on freshwater resource scarcity in Asia. If my memory is correct, it was the first time running the simulation in the physical classroom, and I was interested in whether students behaved differently in the face-to-face environment compared to a prior iteration of the simulation that occurred online.

You can lead the students to knowledge . . .

The underlying mechanics of the simulation were unchanged: six teams, each representing a different country with one or more transnational rivers crossing its territory. Turn by turn, the population expands, more food must be produced, and water demand increases, yet countries are building dams upriver and rainfall declines because of climate change. Eventually a country has a famine and millions of refugees spill into its neighbors.

This time around I added a victory condition: members of the team with the greatest percentage growth in GDP per capita when the simulation ended earned five points (out of a thousand) toward their final grades. I put a copy of the simulation’s spreadsheet, which shows how actions taken by teams affect water availability, food production, hydroelectricity generation, and GDP, on the LMS and encouraged students to experiment with it before the simulation started.

Students did seem more engaged with the simulation in the classroom than they had online, though it was also far easier for me to observe their interactions. The real surprise was how baffled students were by the cause-and-effect relationships built into the spreadsheet. Growth in GDP requires growth in hydroelectric capacity, which comes only from building dams. Yet teams were hesitant to build dams. By the end of the simulation, China, for example, had stockpiled a large enough reserve to have constructed over one hundred dams, yet it had built only a handful. The largest change in GDP among the six teams? Only 1.1 percent over a twelve-year period.

Students clearly had not tried to figure out the spreadsheet before the simulation started, and none of them seemed to understand the relationship between economic growth, food, and water. Consequently, many of them flailed about helplessly as their country’s water supply steadily dwindled. When asked during the debriefing why they chose inaction instead of action, I got mostly blank looks. As I’ve noted before, many students seem to have little understanding of cause and effect; instead, in their worlds, stuff just happens. While I would prefer not to add multiple assignments to the course to force students to work with the simulation’s causal relationships before the simulation actually begins, it might be necessary.
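For anyone curious about previewing those causal relationships with students, the chain can be reduced to a toy model. Everything in the sketch below is hypothetical (the coefficients, starting values, and variable names are invented, not taken from the actual spreadsheet), but it captures the logic: dams add hydroelectric capacity, capacity drives GDP growth, and declining water eventually constrains food.

```python
# Toy model of the simulation's causal chain. All numbers are
# invented for illustration; they are not the spreadsheet's values.

def run_turns(turns, dams_built_per_turn):
    water = 100.0           # freshwater units available downstream
    population = 50.0       # millions
    hydro_capacity = 10.0   # grows only when dams are built
    gdp_per_capita = 1000.0

    for _ in range(turns):
        population *= 1.02            # population expands each turn
        water *= 0.98                 # rainfall declines with climate change
        water -= dams_built_per_turn  # upstream dams reduce downstream flow
        hydro_capacity += dams_built_per_turn * 2.0
        food = min(water, population)       # food production is water-limited
        if food < population:
            gdp_per_capita *= 0.97          # famine drags the economy down
        else:
            # hydroelectric capacity drives GDP growth
            gdp_per_capita *= 1.0 + 0.005 * hydro_capacity / 10.0
    return gdp_per_capita

passive = run_turns(12, 0)  # inaction: GDP grows only slowly
active = run_turns(12, 2)   # steady dam-building compounds growth
```

Even in a toy version like this, doing nothing leaves GDP nearly flat over twelve turns, while steady dam-building outpaces it despite a late-game water squeeze, which is exactly the relationship students never worked out.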

More Notes About Notes

Simon’s post from last week got me thinking about how my own lecture preparation and delivery have changed during my career. When I began teaching university students, I created extensive, highly structured outlines that supplied me with both a mental schema for organizing the information and the many details I believed were important. Though I didn’t read the outlines word for word, they were a handy reference during lectures, and they also indicated to me just how much I had “covered” in a class. In sum, the outlines made me better able to transmit what I needed to transmit to students. Or so I thought. Reality, at least for students, is usually very different.

2010 version

My graduate education, especially the doctoral dissertation process, had primed me for a teaching style that may not have been the most productive. As a general rule, the undergraduate experience is not, nor should it be, about sorting and remembering huge amounts of information about arcane topics like early 20th-century Vietnamese and Cambodian nationalists.

2021 version

Flash forward to the present: I’m less organized, less detail oriented, and more concerned with big picture concepts when lecturing. I have the flexibility to spend time discussing the implications of a question that a student has raised. Are students learning more? I don’t really know. But I strongly doubt they are learning any less.

Directory Assistance II

I have occasionally written (examples here and here) about students interpreting assignment and exam prompts in ways that differ from what I intend.

This happened again with the first exam in the undergraduate course that I am teaching this semester. The exam prompt directed students to add to a Twine story. In a class of nineteen students, only one actually wrote text to add to the story. The rest of the students wrote up to three pages that described additions to the story. So here is the prompt for the second exam — changes in bold:

“Play the [link to Twine HTML file in Canvas course shell] game. Write a brief paragraph about one character in the Twine that continues the text of the story and presents the reader with a binary yes/no choice to make about the character. Then write a brief paragraph for each outcome of that choice. The three paragraphs need to be part of a plot line that reflects one of the following economic development concepts:

[list of concepts students are being tested on]

Write the story, do not describe it.

At the top of your exam, in four sentences or less, 1) identify which of these concepts your plot line demonstrates, and 2) explain how the concept is demonstrated by your plot line.

Your work will be assessed according to the rubric below.”

The second exam is at the end of this week, so I will soon be able to report on whether the revised prompt is more effective.

What You Think Depends On Where You Stand

Our superb librarians survey students and faculty annually. Results from this year’s survey are in. Student responses to one of the questions:

Faculty responses:

Notice that the frequencies of responses from these two groups are essentially mirror images of each other. Students are extrinsically motivated by grades, so they think in instrumental terms: I need correctly formatted citations and the specified minimum number of sources. Otherwise my grade will be negatively affected. Knowing whether a source is reputable is far less important. Faculty think the reverse: the ability to locate scholarly source material and analyze information for bias matters most.

I have tried to solve this problem in the past, and could not find a satisfactory solution. Consequently, I have focused more on curating quality content for students to consume than on marking down students because of their reliance on websites that are top-listed in Google searches. In fact, it’s one of the reasons I decided to stop assigning traditional research papers.

Given the survey results though, the problem extends far beyond my small corner of the curriculum. I’m not going to solve it independently.

Readers might find these other posts on information literacy skills to be of interest:

The Methods Silo Effect and Fixing Poor Research Skills

Googling

Write Your Own Headlines Activity

Op-Ed Writing Workshop

The Virginia USA chapter of the Scholars Strategy Network is sponsoring a free online op-ed writing workshop on Wednesday, October 20, 3:30 – 5:00 pm. This workshop is a hands-on training for scholars who want to learn how to write and pitch compelling, research-based op-eds. Participants will learn how to craft a good lead, identify and incorporate timely news hooks, signal the author’s unique and relevant expertise, increase the likelihood of publication, and structure an op-ed for maximum impact. Participants are asked to come prepared with an idea for an op-ed in mind; they will be guided through shaping their idea into a first draft.

Registration form and additional details are here.

Personal note: as the author of occasional op-eds for local and national publications, I know firsthand the benefits of being able to write for a non-academic audience.

Collecting Data From Students

As previously discussed, this semester I am attempting to research whether metacognitive exercises improve students’ learning — as measured by exam scores. My class completed the first survey and exam. A few initial impressions about the data and the process of collecting it:

Eighteen students completed the pre-exam survey a total of twenty-seven times. Two students submitted responses three times each. This demonstrates the importance of requiring that students include some kind of identifying information when they complete the survey, so that duplicate submissions can be removed from the data set.
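With an identifier in hand, removing duplicates is straightforward. A minimal sketch, with invented field names and a made-up convention of keeping each student’s most recent submission:

```python
# Hypothetical survey export; field names and values are invented.
responses = [
    {"student_id": "s01", "submitted": "2021-09-10T09:00", "score": 38},
    {"student_id": "s02", "submitted": "2021-09-10T09:05", "score": 41},
    {"student_id": "s01", "submitted": "2021-09-10T09:30", "score": 40},
]

# Sort by timestamp so later submissions overwrite earlier ones,
# leaving one row per student.
latest = {}
for row in sorted(responses, key=lambda r: r["submitted"]):
    latest[row["student_id"]] = row

deduplicated = list(latest.values())
```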

I suspect the survey data are skewed because of the above-average effect, a form of subject bias. With responses coded from 1 (“never”) to 5 (“always”), the highest possible sum score across the ten survey questions is 50. The average for this survey was 40. I doubt students actually engaged in the study strategies referenced in the survey as frequently as they said they did.

The average total score on the exam’s five multiple-choice questions was 7.7 out of 10. Given the small sample and the nature of the data, a statistical analysis comparing these scores against survey responses isn’t meaningful, but I did run a correlation in Excel, which resulted in a very unimpressive r of -0.12.
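For anyone who wants to replicate the arithmetic outside of Excel, a sketch of the comparison: sum each student’s ten coded responses, then compute Pearson’s r against exam scores. The data below are fabricated stand-ins, not my students’ actual responses.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Each row: one student's ten survey responses, coded 1 ("never")
# to 5 ("always"). Values are made up for illustration.
surveys = [[4, 4, 5, 3, 4, 4, 5, 4, 4, 3],
           [5, 5, 4, 4, 5, 3, 4, 4, 5, 4],
           [3, 4, 4, 4, 3, 4, 4, 3, 4, 4]]
sum_scores = [sum(row) for row in surveys]  # maximum possible is 50
exam_scores = [8, 6, 9]                     # out of 10, also made up

r = pearson_r(sum_scores, exam_scores)
```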

The exams in this course are extremely low stakes — the first and second exams are worth 25 points each, and the final exam is worth only 40 points, out of more than 1,000 points available from all graded items. That might have affected how diligent students were in studying for the exam.

Given the small size of the class and the usual host of possible confounding variables, I can already guess that I won’t be able to identify a relationship between the metacognition surveys and students’ exam performance. Repeatedly asking students about their study techniques might help them learn more, but I’m not going to be able to demonstrate it.

Leaving Academia: A Practical Guide

Christopher L. Caterine’s Leaving Academia: A Practical Guide (Princeton U. Press, 2020) is packed with sound career advice for people who have obtained doctorates. The book is also highly relevant to anyone who is just contemplating post-baccalaureate study, because it points out three systemic flaws in graduate education:

First, graduate programs typically emphasize the production of subject matter experts, leading to what Caterine calls the overspecialization trap:

“[N]obody outside the academy can monetize knowledge of . . . constructions of gender in eighteenth-century French novels. Even scientists aren’t safe on this count . . . many still face hiring bias because of the excessive specialization that graduate school requires. Trying to convince nonacademics to value what you study is probably a losing battle” (p. 89).

Just as doctorate holders should emphasize how they study when applying for jobs, graduate programs need to be oriented around methodological training rather than the delivery of factual knowledge. Any worthwhile graduate program needs to teach its students how to quickly distill large amounts of unfamiliar and often contradictory information down to its essentials and present “a coherent narrative in a public forum on short notice” (p. 123). This skill is in constant demand by employers, whereas being the world’s foremost authority on a post-Augustan Roman poet is not.

Second, the elements of good teaching are also immensely beneficial job skills, yet how many graduate programs train their students to become competent teachers? Good teaching requires one to be adept at project management, public speaking, running meetings, balancing divergent stakeholder interests, and emotional intelligence (p. 104). For example, running a classroom debate on a policy topic for which there are no cut and dried answers is an example of the ability to engineer “discussions that orient people toward a shared understanding or goal” (p. 108). These are the kinds of attributes that employers prize.

Finally, just like anyone else, academics need to present themselves and their expertise in an understandable, unambiguous manner. Judging by the terribly written cover letters and resumes I have seen from job applicants, this is not a skill that people commonly acquire through graduate education.

So, for anyone out there thinking about graduate school, what’s the evidence that a program in which you are interested will adequately prepare you for a non-academic career? If you are already university faculty, what aspects of your work have value outside of academia, and how can you clearly communicate this to potential employers? Leaving Academia: A Practical Guide will show you how to find answers to these questions.

Resilience of Learners in Times of Uncertainty

Three weeks ago I wrote about resilient syllabus design. A chance email exchange with someone I’ve never met caused me to reexamine that post for unconscious assumptions. And yes, I had a few. For example:

A course with numerous low-stakes assignments throughout the semester is probably more pedagogically resilient and effective than a course in which the only assessments are a midterm and a final exam. If one of those two exams has to be cancelled for some reason, you’re screwed. But even the low-stakes design assumes students will be able to complete assignments more or less continuously, with perhaps only a brief interruption or two because of weather, contagious disease, or alien invasion.

What happens if people’s homes and workplaces have been destroyed and a significant portion of the population has evacuated? Maybe the campus reopens after electrical power has been restored, but students, wherever they are, might still lack a permanent residence, transportation, employment, internet access, or, in some cases, even an adequate supply of food and water. Euphemistically, they have become the ultimate retention risks.

While there might not be a good solution to this type of worst case scenario, I’m going to be running undergraduate students in my economic development course through some exercises that I hope will get them thinking about “What if?” Just in case the unexpected, or highly unlikely, happens.

Call for Proposals: Work Smarter, Not Harder

Charity Butcher, Tavishi Bhasin, Maia Hallward, and Elizabeth Gordon are creating a book for political science faculty who want to simultaneously increase their research productivity and teaching effectiveness. They are seeking proposals for chapters that will outline how faculty can align their teaching and research interests through:

  • The scholarship of teaching and learning.
  • Textbook authorship.
  • Faculty-student research at both the undergraduate and graduate levels.
  • Study abroad and field research.
  • Designing courses to generate new areas of scholarship.

If you are interested, please submit a 300-400 word abstract and a short bio (around 50 words) to cbutche2@kennesaw.edu by September 30th. Your proposal should describe the essay you would like to contribute, explicitly connecting your chapter to one of the areas above or indicating an additional unlisted area where you think your chapter could fit. Abstracts will be reviewed on a rolling basis, with all decisions completed by October 15. Accepted abstracts will be included in a book proposal to be submitted to Springer as part of the Political Pedagogies book series (edited by Jamie Frueh and David Hornsby).

They anticipate final essays of roughly 3000 words to be submitted by February 1, 2022. Final essays should include specific, tangible examples of teaching and research practices that other professors could reproduce in their own contexts to improve and expand their research and teaching. For more details, please see the full call for proposals at: 

http://facultyweb.kennesaw.edu/cbutche2/docs/Call%20for%20Proposals%20-%20Aligning%20Teaching%20and%20Research.pdf.

Asynchronous Field Research Exercises

To answer Maria’s question in her comment on my last post about syllabus design, here is an example of one of my field research assignments:

To earn full credit: upload your first post by 6:00 p.m. Wednesday and respond to at least one other person’s post by 9:00 a.m. Friday; see the rubric.

  1. Read:
  2. Go to The Opportunity Atlas at https://www.opportunityatlas.org. Click “Begin Exploring.” Click “OK.” Enter Newport’s zip code in the search bar at the upper left. Make sure “Household Income” is selected for “Outcomes.” Select a street in the blue area and a street in the red area.
  3. Walk a portion of the two streets that you selected. What do you see about the built environment that you think relates to economic opportunity in these locations?
  4. Take a photo that indicates what you notice; post the photo and your observations in the discussion. Identify the location shown in the photo. Cite at least one of the readings in your first discussion post.

Here is my rubric for grading the online discussion: