
Not knowing whether one has actually helped students learn is one of the most frustrating aspects of teaching. Assuming an absence of megalomania or the Dunning-Kruger effect, indications that we’ve made a difference are typically quite vague and ambiguous. So I was pleasantly surprised — as in, “hey, maybe students really did benefit from this” — by the results of a knowledge probe that I launched at the beginning and end of the semester in my course on economic development and environmental politics.
The knowledge probe was an ungraded quiz that asked questions about a few basic economic concepts, administered through Google Forms in the first and last weeks of the semester. Results, showing the percentage of respondents who answered each question correctly, are below.
| Concept | Pre-test % correct (N = 21) | Post-test % correct (N = 16) | % Change |
| --- | --- | --- | --- |
| Poverty Trap | 52 | 100 | 92 |
| Diminishing Returns to Capital | 52 | 75 | 44 |
| Skill Matching | 5 | 88 | 1,660 |
| Common Pool Resource Problem | 48 | 81 | 69 |
| Moral Hazard | 38 | 100 | 163 |
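For anyone curious how the % Change column is derived, here is a minimal sketch in Python (my choice of language for illustration), with the pre- and post-test percentages hard-coded from the table above; it simply computes the relative change in the share of correct answers.

```python
# Percentage of correct responses on the pre-test and post-test,
# copied from the table above.
results = {
    "Poverty Trap": (52, 100),
    "Diminishing Returns to Capital": (52, 75),
    "Skill Matching": (5, 88),
    "Common Pool Resource Problem": (48, 81),
    "Moral Hazard": (38, 100),
}

for concept, (pre, post) in results.items():
    # Relative change from pre-test to post-test, as a percentage.
    pct_change = (post - pre) / pre * 100
    print(f"{concept}: {pct_change:.0f}%")
```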
Obviously this wasn’t a perfect tool. The samples are too small to support formal statistical inference. And the proportion of students who reported having previously taken a university economics course differed slightly between the pre-test and the post-test, so the two groups may not be fully comparable. But the numbers at minimum suggest that students learned something over the semester, which gave me a sense of satisfaction that I otherwise wouldn’t have had.