Maximizing the Medium II

A few weeks ago, I wrote about using one technological platform to circumvent the design constraints of another. Here is another, more serendipitous, example of finding a technological means for achieving an instructional objective.

For an upcoming undergraduate course, I decided to make Twine stories part of my exams. My previous posts on Twine for a team writing project are here, here, and here. (Hard to believe it’s been seven years since I last used it — how time flies.) For now, it is only important to know that Twine is freeware that enables users to create interactive texts in the form of HTML files.

I wanted my exams to each have two parts that students complete in sequence — first, a series of multiple-choice questions on concepts; second, an essay-type question in which students demonstrate their ability to apply the same concepts by extending a Twine story’s plot line. It is fairly easy (if tedious) to create multiple-choice test questions in the Canvas LMS. One can also set a content module to require that students complete each item in the module in order. But initially I didn’t know how to include the Twine story for each exam’s second part.
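A side note on the multiple-choice half: the tedium of entering questions by hand can be reduced by scripting against the Canvas REST API. The sketch below is only a rough illustration and not the method described in this post; the token, IDs, question text, and answer field names are placeholders or assumptions drawn from the public Quiz Questions endpoint, so verify them against your own Canvas instance before relying on it.

```python
# Hedged sketch: add one multiple-choice question to an existing Canvas
# quiz via the REST API. Endpoint and field names follow the public
# Canvas Quiz Questions documentation as I understand it.
import requests

CANVAS_URL = "https://canvas.example.edu"   # hypothetical instance
TOKEN = "YOUR_API_TOKEN"                    # instructor-scoped token
COURSE_ID = 1234                            # placeholder IDs
QUIZ_ID = 5678

payload = {
    "question": {
        "question_name": "Concept check 1",
        "question_type": "multiple_choice_question",
        "question_text": "Which concept best explains the passage above?",
        "points_possible": 1,
        # answer_weight 100 marks the correct choice, 0 the distractors
        "answers": [
            {"answer_text": "Concept A", "answer_weight": 100},
            {"answer_text": "Concept B", "answer_weight": 0},
            {"answer_text": "Concept C", "answer_weight": 0},
        ],
    }
}

resp = requests.post(
    f"{CANVAS_URL}/api/v1/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/questions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print("Created question", resp.json()["id"])
```

Looping this over a spreadsheet of question stems and answer options would turn question entry into a batch job rather than an afternoon of clicking.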

Continue reading “Maximizing the Medium II”

Maximizing the Medium I

Probably all of us have encountered the constraints of educational technology — in a particular situation, it doesn’t do quite what we want it to do, so we try to figure out a workaround. Here is one example:

For the coming academic year, my undergraduate students will complete multiple metacognitive exercises that will supply me with data for some pedagogical research. The exercises consist of surveys that ask students to evaluate the effectiveness of their study habits before and after exams (I’ll describe this in detail in a future post).

Initially, I tried creating these surveys in the Canvas LMS quiz tool, because I can set Canvas to automatically reward students with a certain number of points if they complete a survey. I find point rewards to be necessary because most of the undergraduates I teach won’t do anything unless it has a transparent effect on their course grade. However, I rapidly hit several obstacles — e.g., as far as I can tell, one can easily duplicate an “assignment” in Canvas, but not a “quiz.”

In contrast, it is ridiculously easy to copy, rename, and revise survey instruments in Google Forms. But Google Forms isn’t connected to the Canvas gradebook, and I did not want to have to repeatedly jump between Google Forms and Canvas to record points each time a student completed a survey. Also, I prefer putting as much of my course content as possible in Canvas, because invariably, the more I expect students to use different technological platforms, the more emails I receive about their learned helplessness.

What to do?
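One possible bridge, though not necessarily the workaround described in the full post, is a short script that reads the Google Forms responses (downloaded from the linked response sheet as a CSV) and posts the completion points to a Canvas assignment through the Submissions API. Everything here is a hedged sketch: the file names, column headers, IDs, and point value are placeholders, and the endpoint should be checked against your institution’s Canvas API documentation.

```python
# Hedged sketch: award Canvas points to every student who completed a
# Google Forms survey. Assumes responses.csv (exported from the Form's
# response sheet) has an "Email Address" column, and roster.csv maps
# those emails to Canvas user IDs. Endpoint per the Canvas Submissions
# API ("Grade or comment on a submission").
import csv
import requests

CANVAS_URL = "https://canvas.example.edu"   # hypothetical instance
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 1234                            # placeholder IDs
ASSIGNMENT_ID = 42                          # a points-based "survey" assignment
POINTS = 5

headers = {"Authorization": f"Bearer {TOKEN}"}

# email -> Canvas user ID, from a one-time roster export
with open("roster.csv", newline="") as f:
    roster = {row["email"]: row["canvas_user_id"] for row in csv.DictReader(f)}

with open("responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        user_id = roster.get(row["Email Address"])
        if user_id is None:
            continue  # respondent not on the roster; skip
        resp = requests.put(
            f"{CANVAS_URL}/api/v1/courses/{COURSE_ID}"
            f"/assignments/{ASSIGNMENT_ID}/submissions/{user_id}",
            headers=headers,
            json={"submission": {"posted_grade": POINTS}},
        )
        resp.raise_for_status()
```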

Continue reading “Maximizing the Medium I”

Students’ Experiences With Technology During the Pandemic

The nonprofit association EDUCAUSE has released two reports on a study conducted in Fall 2020 about students’ pandemic experiences with:

  • Connectivity and technology.
  • Learning with technology.

Both reports can be accessed for free here.

For me, the most telling part of the second report was this:

“students’ learning experiences were undermined in myriad ways by poor decisions in the delivery and management of courses. On the pedagogical side, students complained of long lectures with massive slide decks . . . assignments with little scaffolding or connections to learning outcomes . . . and generally trying to replicate face-to-face experiences in online learning environments.” (italics mine)

When the Medium Becomes the Message

More musings about higher education in a post-pandemic world . . .

While isolating at home during the winter Covid-19 surge, I re-established contact with an academic fellow traveler from my pre-21st century days as a doctoral student. Our conversation turned to the declining popularity of traditional humanities and social science disciplines among undergraduates, a trend seemingly initiated by the 2008 recession and possibly accelerated by the SARS-CoV-2 pandemic. As professors tend to do, we each had previously identified a second possible cause of this trend: the failure of these disciplines, at the undergraduate level, to evolve in response to technological change. Back in 2013, I wrote an ALPS post about the need for faculty to examine assumptions about curricular content and delivery given the new technological demands of employers, but my friend expressed it much better late last year here. His basic point: students are more likely to study what reflects their daily experiences and clearly connects to attractive careers than what does not. Universities, being subject to finite resources, will institutionalize the former while casting aside the latter.

As my friend wrote, technologies like internet search, smartphones, big data, and social media were already having an effect before 2008, but they radically altered life afterward. Yet how many undergraduate political science, history, or English literature programs now train majors in app design, predictive analytics, or video production? I’ve taken a few small steps in this direction, with online video content, ArcGIS storymaps, and KnightLab timelines, but always at my own expense and independently of the formal curriculum. My friend has made a much deeper commitment to learning and teaching these technologies, but again, he’s done it despite, not because of, the norms of his discipline.

Online Group Projects to Build Community: Platform Options

As the fall semester bears down on us and many schools are finally admitting that yes, a substantial number of courses will be online (whether fully online, blended, hybrid, or hyflex), I imagine many faculty are experiencing some panic about having to once again suddenly move their courses online. In particular, faculty are concerned about building community in their classes. Online courses can feel very isolating; without physical interaction before and after class, students may not feel connected either to you as the instructor or to their fellow students. One way to combat this and build community is to use team-based learning, where you have set groups working throughout a term on one or a series of projects. This can give students a small group of people that they can come to know well, even if they only work asynchronously with those students. Whether you are interested in adopting a team-based learning model or just want to use the occasional group project, it’s a good idea to look at what options we have to do this online. On general approaches, I will direct you to this article by Stephanie Smith Budhai in Faculty Focus; here, let’s stick to recommendations on platforms for group or team learning.

First, a caveat: you don’t always have to dictate what platform your students use to collaborate. If all you care about is the end product or outcomes, then let them use whatever platform they feel comfortable with. Give them options, certainly, but don’t dictate; let them communicate in whatever way is going to make it easy for them to work together, whether that’s on a social media platform, texting, WhatsApp, or something else. The main reason to ask students to use a particular platform is if you want to be able to check in on their work in progress and to see how things are developing. Each of the options below would allow you to do that (although students may need to grant you access!). Just be sure to explain why you’ve chosen this platform, take some time to train students in how to use it, and be clear on how and why you’ll be dropping in to check on their progress.

Let’s talk about several platforms you can use for group collaboration or team-based learning.

Continue reading “Online Group Projects to Build Community: Platform Options”

Tips for Moving Instruction Online

Still this?

Higher Ed is in crisis mode in much of the USA, with faculty at a growing list of universities being told that on-campus instruction is suspended until further notice. If you work at one of these institutions, here’s some advice:

First, a great analysis by Rebecca Barrett-Fox, assistant professor of sociology at Arkansas State University:

Please Do a Bad Job of Putting Your Courses Online.

Some mundane advice from me:

Design according to student learning outcomes

  • What activities will help students learn what you want them to learn? There are multiple options that serve the same goal, and some function well in an online environment.

Students benefit from consistent difference

  • Organize the course as a series of similarly structured modules that include varied tasks; for example, readings + writing assignment + graded discussion + quiz in each module. Spaced repetition of activities with different cognitive demands aids learning. Students appreciate a routine — it helps them develop a schedule in a new environment.
  • If your students are (or were) full-time residential undergraduates without families, set deadlines for 10:00 pm so they aren’t awake past midnight.

Use a variety of content delivery methods

  • Convert lecture notes to brief essays or outlines. Create visual presentations with PowerPoint or Prezi. Assign e-book chapters or journal articles in the library’s databases.
  • Producing high-quality video is very labor-intensive and nearly impossible on short notice without professional expertise. Audio must be captioned for the hearing-impaired. Instead, use existing resources like Crash Course.

Include opportunities for student-student interaction; e.g., discussions

  • Require that students post substantial discussion comments by the halfway point in each week/module, and that they meaningfully respond to the comments of others.
  • Interact with students by regularly posting your own comments in discussions.
  • Grading discussions in Canvas is easy with a rubric and Speedgrader.

Testing

  • Your university’s learning management system (LMS) will grade multiple-choice and true/false tests for you.
  • Create a question bank first, then draw questions for a test from it. This will save time and effort in the long run.
  • Timed tests in which students have on average only 1-2 minutes per question help deter cheating; see the sketch after this list. Some universities have purchased tools integrated with the LMS that lock down browsers (to prevent new windows from being opened) and monitor students’ activity during tests.
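As mentioned in the list above, here is a hedged sketch of setting up a timed, shuffled quiz through the Canvas Quizzes API. The instance URL, token, and IDs are placeholders, and the parameter names should be verified against your own LMS documentation.

```python
# Hedged sketch: create a timed Canvas quiz (about two minutes per
# question) with shuffled answers, shown one question at a time.
# Parameter names per the public Canvas Quizzes API as I understand it.
import requests

CANVAS_URL = "https://canvas.example.edu"   # hypothetical instance
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 1234                            # placeholder ID
NUM_QUESTIONS = 20

resp = requests.post(
    f"{CANVAS_URL}/api/v1/courses/{COURSE_ID}/quizzes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"quiz": {
        "title": "Module 3 quiz",
        "quiz_type": "assignment",
        "time_limit": NUM_QUESTIONS * 2,      # minutes
        "shuffle_answers": True,
        "one_question_at_a_time": True,
    }},
)
resp.raise_for_status()
print("Created quiz", resp.json()["id"])
```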

Writing assignments

Short, frequent writing assignments (1-2 pages per module) are better than only one or two longer assignments. Frequent practice and feedback = better student work.

Time spent grading can be greatly reduced by using rubrics.

Specify file types (doc, docx, pdf) to ensure that you can read what students submit.

Build support networks

Your colleagues down the hall and across campus are valuable resources. Benefit from them. Someone probably knows a solution to the problem you’re struggling with, whether it be a technological obstacle or carving out work time at home when your child’s elementary school has closed.

These actions also improve your teaching in on-campus courses by making it easier for students to learn and by reducing the time and effort you expend on unsatisfying tasks.

The Muddiest Point, Updated

Many of you are probably already acquainted with the muddiest point technique — asking students to identify the one aspect of a lesson or assignment that they are the most confused by. Often this is accomplished by distributing index cards for students to write on. This semester I’m using an electronic version in a 200-level honors course on Asia: a survey on our Canvas LMS, completed in the last few minutes of class on days for which some kind of lecture or discussion is scheduled. The survey consists of the question “What are you most curious or confused about from class today?” Students automatically earn one point toward the final grade by answering it.

With a paperless process, I don’t have to try to decipher students’ handwriting. And I have an archive of students’ responses that I don’t have to transport or store.

Far more importantly, the surveys are demonstrating the difference between my knowledge base and that of my students — which I otherwise would be mostly oblivious to.

For example, my mind automatically defaults to thinking in terms of power, authority, and legitimacy whenever I’m confronted with the task of analyzing an authoritarian state. Or I recall concepts like ethnic identity when discussing nationalism. Or I know that geography is political rather than an immutable law of the universe — as demonstrated by the origins of labels like Far East, Middle East, and Near East. This is not the case with the majority of students in the class, given their survey responses so far.
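Because the responses pile up in Canvas rather than on index cards, the archive can also be skimmed programmatically. Below is a minimal sketch, assuming the survey results have been downloaded as a CSV (for instance, Canvas’s Student Analysis report) in which one column holds the free-text answers; the file name and column header are placeholders.

```python
# Minimal sketch: tally the most frequent substantive words across the
# archived muddiest-point responses so recurring confusions stand out.
# Assumes a CSV export with one free-text column (name is a placeholder).
import csv
from collections import Counter

RESPONSE_COLUMN = "What are you most curious or confused about from class today?"
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "on", "is",
             "was", "i", "we", "about", "what", "how", "why", "that", "it",
             "for", "this", "are", "be", "with"}

counts = Counter()
with open("muddiest_point_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        for word in row.get(RESPONSE_COLUMN, "").lower().split():
            word = word.strip(".,;:?!\"'()")
            if word and word not in STOPWORDS:
                counts[word] += 1

for word, n in counts.most_common(15):
    print(f"{n:3d}  {word}")
```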

Collaborative Reading – Follow-Up Thoughts

Today we have an update from Colin M. Brown, College Fellow in Government at Harvard University. He can be reached at brown4 [at] fas [dot] harvard [dot] edu.

In a post last year, I talked about the potential of using annotation software like CritiqueIt to make the reading process more collaborative. In short, by creating a single copy of the reading that students can mark up together online, there’s the potential for creating discussion prior to and during class, and also for getting students to see course readings as statements in a dialogue.

My first use of CritiqueIt was promising, but I’m less satisfied after using it further in two undergraduate seminars and a graduate-level continuing education course.

Two things have continued to work, probably still making the tool a net positive. First, as a diagnostic tool CritiqueIt makes class prep easier, because it gives me a window into what students find interesting or are struggling with. Students indicate their interest implicitly or explicitly, and they also seem relatively fine with using their comments to signal that something doesn’t make sense—especially useful when they’re having difficulty with something I didn’t expect. Second, they seem to like it. Students seem to perceive it as a cool new gimmick, and I seem to get credit for trying it.

However, while CritiqueIt lets me know what students want the conversation in class to be about, it hasn’t generated a conversation among students on its own. Students have posted a few responses to other students’ annotations, but the kind of exchange I mentioned in the original post hasn’t happened consistently. Students seem to be completing the assignment because it sends me a signal that they have, in fact, engaged with the reading. This provides useful feedback, as mentioned above, but it was not my ultimate reason for using the tool.

Since I want students to see political science writings as part of an ongoing exchange of ideas, there are three changes that I’ll be implementing next semester, thanks to insights from my colleague Daniel Smail, who has been experimenting with the same tool in his history courses:

  1. Build CritiqueIt into the entire semester. Students need time to get used to the tool, and the expectation that it’s an integral part of their work.
  2. Assign early readers. If everyone reads the night or morning before class, there’s less incentive to start a dialogue that none of their peers will respond to. By dividing up the collaborative readings and having one or two students make their annotations three or four days before class, there will be more time for students to jump into the conversation.
  3. Work CritiqueIt into summative assessment. This also normalizes the use of the tool, and gives students the incentive to develop better commenting skills. Students will need several days to virtually hand the document back and forth so this has to be accounted for in scheduling other assignments. But giving them a longer piece of journalism on the broad course theme and having them react to it, and then to each other, knowing that their comments will be graded on some explicit rubric, might be a better way to tease out their ability to respond critically to arguments—and actually use something they learned from class.

 

Follow-Up on Slack and Specification Grading

Today we have another post by guest contributor William R. Wilkerson, Professor of American Government and Politics at SUNY-Oneonta. He can be reached at bill [dot] wilkerson [at] oneonta [dot] edu.

Earlier this summer I wrote about two changes that I made to my five-week online summer course, Law, Courts and Politics: using Slack for class communication and specifications grading. Both experiments were a success.

Slack

Slack was a great addition. I found it easy to set up and to use. Students liked it. Thanks to the resources I noted in my earlier post, I created a simple structure: a channel for each week was the home of announcements, files, links, and discussion — the center of the course. The introduction channel gave students a place to practice, and the questions forum got some use, especially early in the term.
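For anyone who would rather script that structure than click through Slack’s interface, here is a hedged sketch using Slack’s official Python client (slack_sdk). It is not the author’s setup process, and it assumes a token that is allowed to create public channels; the token and channel names are placeholders.

```python
# Hedged sketch: create a channel per week of a five-week course, plus
# introduction and questions channels, with the slack_sdk WebClient.
from slack_sdk import WebClient

client = WebClient(token="xoxb-your-token")   # placeholder token

# one channel per week, plus the two extras
names = [f"week-{n}" for n in range(1, 6)] + ["introductions", "questions"]

for name in names:
    resp = client.conversations_create(name=name)
    print("Created", resp["channel"]["name"], resp["channel"]["id"])
```

If a channel name is already taken, conversations_create raises a SlackApiError, so a sturdier version would catch that and move on.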

Because Slack has excellent apps for all mobile and computer platforms, I hoped that it would encourage regular communication, which it did. Total posts in the weekly channels ranged from 62 in week 2 to 90 in week 4. I posted reminders and introduced topics, but most posts were from the students. Nine or ten students were active each week; one student never posted in the weekly discussion forums. I was pleased that a group of students began posting mid-week and continued through the end of the week. Students quickly picked up on using hashtags for topics and the @ symbol to connect with their fellow students, which facilitated interaction. Posts were fairly long too, especially when you consider they were writing on their phones. I had expected phone use to result in short responses to comments, but that didn’t happen.

Continue reading “Follow-Up on Slack and Specification Grading”