I decided to survey my comparative politics class on their opinions about Perusall after the first exam. Of a total of thirteen students, only eight were in class on the day of the survey, so the results are in no way statistically representative. But here they are anyway. Each survey item was on a five-point scale, with 1 representing “strongly disagree” and 5 “strongly agree.”
Reading other people’s annotations helps me understand assigned readings.
The university should continue to offer Perusall as an option for undergraduate courses.
I find Perusall difficult to use.
I’m more likely to read assigned journal articles that are on Perusall.
Perusall helped me complete reading responses.
Perusall helped me study for the exam.
No obvious warning signs in the results. And my main objective in using Perusall — to increase students’ understanding of assigned readings — was the statement with which they most strongly agreed.
The class has averaged 80% on Perusall assignments so far, which I take as a sign that Perusall’s assessment algorithm fairly evaluates the quality of students’ interaction with assigned readings. Since the marking process involves no effort on my part, it’s a win-win situation. I’m now thinking about how to incorporate Perusall into other courses.
When the spring semester starts, I’ll be using Perusall for the first time, in my comparative politics course. I decided to finally experiment with it for three reasons. First, my previous attempts at getting students to engage in collaborative notetaking have mostly failed. Second, as I mention in that linked post, a couple of my colleagues have raved about Perusall’s ability to turn reading into a social learning process. Third, resiliency is as important as ever when it comes to course design. Given the pandemic and associated mitigation protocols, there is the chance that some or all of my students will be absent from the physical classroom at random points during the semester. Perusall allows students to engage with course content and each other asynchronously online.
I found it easy to set up Perusall by following these basic instructions (on my campus, Perusall has been administratively connected to all Canvas course shells, so there is no need for individual faculty members to install the LTI app). This brief explanatory video was also helpful. Perusall’s user interface is very intuitive. I set up the course’s article library and associated Canvas assignments in only a few minutes. Here is the end result from the Perusall side:
Notice how the layout is exactly what is shown in the video. It is also the same as what students will see.
Perusall uses an algorithm to machine-grade student interaction with each document in the course library, and the algorithm’s output can be synced back to the Canvas gradebook. This means readings can become auto-graded Canvas assignments. Details on this and more are in the instructions I linked to above.
I will report on how well all of this has worked once the semester is underway.
A few weeks ago, I wrote about using one technological platform to circumvent the design constraints of another. Here is another, more serendipitous, example of finding a technological means for achieving an instructional objective.
For an upcoming undergraduate course, I decided to make Twine stories part of my exams. My previous posts on Twine for a team writing project are here, here, and here. (Hard to believe it’s been seven years since I last used it — how time flies.) For now, it is only important to know that Twine is freeware that enables users to create interactive texts in the form of HTML files.
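To give a sense of what students work with, here is a minimal sketch of a branching passage in Twee, the plain-text notation for Twine stories (the passage names and text here are invented for illustration, not taken from my actual exams):

```text
:: Village Square
A crowd has gathered outside the ministry.
[[Join the protest->Protest March]]
[[Watch from a distance->Cafe Window]]
```

Each `[[...]]` link sends the reader to another passage, so extending a plot line simply means writing new passages and linking to them.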
I wanted each of my exams to have two parts that students complete in sequence — first, a series of multiple-choice questions on concepts; second, an essay-type question in which students demonstrate their ability to apply the same concepts by extending a Twine story’s plot line. It is fairly easy (if tedious) to create multiple-choice test questions in the Canvas LMS. One can also set a content module to require that students complete each item in the module in order. But initially I didn’t know how to include the Twine story for each exam’s second part.
Probably all of us have encountered the constraints of educational technology — in a particular situation, it doesn’t do quite what we want it to do, so we try to figure out a workaround. Here is one example:
For the coming academic year, my undergraduate students will complete multiple metacognitive exercises that will supply me with data for some pedagogical research. The exercises consist of surveys that ask students to evaluate the effectiveness of their study habits before and after exams (I’ll describe this in detail in a future post).
Initially, I tried creating these surveys in the Canvas LMS quiz tool, because I can set Canvas to automatically reward students with a certain number of points if they complete a survey. I find point rewards to be necessary because most of the undergraduates I teach won’t do anything unless it has a transparent effect on their course grade. However, I rapidly hit several obstacles — e.g., as far as I can tell, one can easily duplicate an “assignment” in Canvas, but not a “quiz.”
In contrast, it is ridiculously easy to copy, rename, and revise survey instruments in Google Forms. But Google Forms isn’t connected to the Canvas gradebook, and I did not want to have to jump repeatedly between Google Forms and Canvas to record points each time a student completed a survey. Also, I prefer putting as much of my course content as possible in Canvas, because invariably, the more I expect students to use different technological platforms, the more emails I receive about their learned helplessness.
What for me was the most telling part of the second report:
“students’ learning experiences were undermined in myriad ways by poor decisions in the delivery and management of courses. On the pedagogical side, students complained of long lectures with massive slide decks . . . assignments with little scaffolding or connections to learning outcomes . . . and generally trying to replicate face-to-face experiences in online learning environments.” (italics mine)
More musings about higher education in a post-pandemic world . . .
While isolating at home during the winter Covid-19 surge, I re-established contact with an academic fellow traveler from my pre-21st century days as a doctoral student. Our conversation turned to the declining popularity of traditional humanities and social science disciplines among undergraduates, a trend seemingly initiated by the 2008 recession and possibly accelerated by the SARS-CoV-2 pandemic. As professors tend to do, we each had previously identified a second possible cause of this trend: the failure at the undergraduate level of these disciplines to evolve in response to technological change. Back in 2013, I wrote an ALPS post about the need for faculty to examine assumptions about curricular content and delivery given the new technological demands of employers, but my friend expressed it much better late last year here. His basic point: students are more likely to study what reflects their daily experiences and clearly connects to attractive careers than what does not. Universities, being subject to finite resources, will institutionalize the former while casting aside the latter.
As my friend wrote, technologies like internet search, smartphones, big data, and social media were already having an effect before 2008, but they radically altered life afterward. Yet how many undergraduate political science, history, or English literature programs now train majors in app design, predictive analytics, or video production? I’ve taken a few small steps in this direction, with online video content, ArcGIS storymaps, and KnightLab timelines, but always at my own expense and independently of the formal curriculum. My friend has made a much deeper commitment to learning and teaching these technologies, but again, he’s done it despite, not because of, the norms of his discipline.
As the fall semester bears down on us and many schools are finally admitting that yes, a substantial number of courses will be online (fully online, blended, hybrid, hyflex, etc.), I imagine many faculty are experiencing some amount of panic about having to once again suddenly move their courses online. In particular, faculty are concerned about building community in their classes. Online courses can feel very isolating; without physical interaction before and after class, students may not feel connected to either you as the instructor or their fellow students. One way to combat this and build community is to use team-based learning, in which set groups work throughout a term on one or a series of projects. This can give students a small group of people they come to know well, even if they only work with those students asynchronously. Whether you are interested in adopting a team-based learning model or just want to use the occasional group project, it’s a good idea to look at what options we have to do this online. On general approaches, I will direct you to this article by Stephanie Smith Budhai in Faculty Focus; here, let’s stick to recommendations on platforms for group or team learning.
First, a caveat: you don’t have to always dictate what platform your students use to collaborate. If all you care about is the end-project or outcomes, then let them use whatever platform they feel comfortable with. Give them options, certainly, but don’t dictate–let them communicate in whatever way is going to make it easy for them to work together, whether that’s on a social media platform, texting, WhatsApp, or something else. The main reason to ask students to use a particular platform is if you want to be able to check in on their work in progress and to see how things are developing. Each of the below options would allow you to do that (although students may need to grant you access!). Just be sure to explain why you’ve chosen this platform, take some time to train students in how to use it, and be clear on how and why you’ll be dropping in to check on their progress.
Let’s talk about several platforms you can use for group collaboration or team-based learning.
I’ve previously suggested that faculty should still be preparing for their classes to move online at some point this fall, whether their university is planning to be entirely virtual or not. Unless your school has strong institutional practices in place to minimize spread–that is, testing, contact tracing, enforced mask-wearing and social distancing, and protocols for quarantines–there is a strong chance that an outbreak on campus will prompt another sudden move online.
As a faculty educational developer, I had to figure out how I could best support my faculty as they made the transition to online teaching. In the spring I focused on training faculty to teach online using different platforms (Blackboard, Zoom, Microsoft Teams); consulting and troubleshooting; writing and evaluating surveys of students and faculty; and building and sharing resources on a webpage I put together. What else could I do with our one month break that would provide the biggest rate of return as faculty prepare for a fall that will likely include virtual instruction?
As the title of this post gives away, I’ve decided to go with a faculty learning community. I held a faculty panel discussion right before graduation where faculty who taught in the spring shared their challenges, successes, and insights–but as such panels do, it generated as many questions as answers. Those unanswered questions (and responses to the evaluation of the event) guided the choice of topics for this summer-only event.
Higher Ed is in crisis mode in much of the USA, with faculty at a growing list of universities being told that on-campus instruction is suspended until further notice. If you work at one of these institutions, here’s some advice:
First, a great analysis by Rebecca Barrett, assistant professor of sociology at Arkansas State University:
What activities will help students learn what you want them to learn? There are multiple options that serve the same goal, and some function well in an online environment.
Students benefit from consistent variety
Organize the course as a series of similarly-structured modules that include varied tasks; for example, readings + writing assignment + graded discussion + quiz in each module. Spaced repetition of activities with different cognitive demands aids learning. Students appreciate a routine — it helps them develop a schedule in a new environment.
If your students are (or were) full-time residential undergraduates without families, set deadlines for 10:00 pm so they aren’t awake past midnight.
Use a variety of content delivery methods
Convert lecture notes to brief essays or outlines. Create visual presentations with PowerPoint or Prezi. Assign e-book chapters or journal articles in the library’s databases.
Producing high-quality video is labor-intensive and nearly impossible to do on short notice without professional expertise. Audio must also be captioned for hearing-impaired students. Instead, use existing resources like Crash Course.
Include opportunities for student-student interaction; e.g., discussions
Require that students post substantial discussion comments by the halfway point in each week/module, and that they meaningfully respond to the comments of others.
Interact with students by regularly posting your own comments in discussions.
Grading discussions in Canvas is easy with a rubric and Speedgrader.
Your university’s learning management system (LMS) will grade multiple-choice and true/false tests for you.
Create a question bank first, then draw questions for a test from it. This will save time and effort in the long run.
Timed tests in which students have on average only 1-2 minutes per question will minimize cheating. Some universities have purchased tools, integrated with the LMS, that lock down browsers (to prevent new windows from being opened) and monitor students’ activity during tests.
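As a quick back-of-the-envelope check, the 1-2 minutes-per-question guideline translates into a total time limit like this (a minimal sketch; the function name, default of 1.5 minutes, and rounding rule are my own, not a Canvas feature):

```python
import math

def exam_time_limit(num_questions: int, minutes_per_question: float = 1.5) -> int:
    """Return a total exam time limit in minutes, rounded up to the nearest 5."""
    raw = num_questions * minutes_per_question
    return 5 * math.ceil(raw / 5)

# A 30-question test at 1.5 minutes per question gets a 45-minute limit.
print(exam_time_limit(30))
```

Rounding up to a clean five-minute increment just makes the limit easier to announce to students; the exact padding is a judgment call.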
Short, frequent writing assignments (1-2 pages per module) are better than only one or two longer assignments. Frequent practice and feedback = better student work.
Time spent grading can be greatly reduced with rubrics.
Specify file types (doc, docx, pdf) to ensure that you can read what students submit.
Build support networks
Your colleagues down the hall and across campus are valuable resources. Benefit from them. Someone probably knows a solution to the problem you’re struggling with, whether it be a technological obstacle or carving out work time at home when your child’s elementary school has closed.
These actions also improve your teaching in on-campus courses by making it easier for students to learn and by reducing the time and effort you expend on unsatisfying tasks.
Many of you are probably already acquainted with the muddiest point technique — asking students to identify the one aspect of a lesson or assignment that they are most confused by. Often this is accomplished by distributing index cards for students to write on. This semester I’m using an electronic version in a 200-level honors course on Asia: a survey on our Canvas LMS, completed in the last few minutes of class on days when some kind of lecture or discussion is scheduled. The survey consists of the question “What are you most curious or confused about from class today?” Students automatically earn one point toward the final grade by answering it.
With a paperless process, I don’t have to try to decipher students’ handwriting. And I have an archive of students’ responses that I don’t have to transport or store.
Far more importantly, the surveys are demonstrating the difference between my knowledge base and that of my students — which I otherwise would be mostly oblivious to.
For example, my mind automatically defaults to thinking in terms of power, authority, and legitimacy whenever I’m confronted with the task of analyzing an authoritarian state. Or I recall concepts like ethnic identity when discussing nationalism. Or I know that geography is political rather than an immutable law of the universe — as demonstrated by the origins of labels like Far East, Middle East, and Near East. This is not the case with the majority of students in the class, given their survey responses so far.