Category Archives: Training


Review: Applying the QM Rubric Course

I have been working as a content developer at DePaul for nearly five years. In that time, I have heard rumblings about Quality Matters and Quality Matters reviews but never really understood what “QM’ing” a course actually meant. When asked what I would like to focus on for professional development, becoming a certified peer reviewer was the first thing that popped into my head. I have quality-assured many courses and wondered, “How different is that from doing a Quality Matters review?” I was in for an awakening.


Reflections on my Experience in the Teaching and Learning Certificate Program

After attending over a dozen workshops offered through DePaul’s Teaching and Learning Certificate Program (TLCP), I have developed a much better sense of what I can do as a teacher to effect a positive and measurable transformation in my students. I also learned that I am not alone in my quest to find innovative teaching practices that can be readily implemented in my courses.

Quick and Easy Curving

Something Happened! — That Sinking Feeling

Sometimes the grade distribution on your exam seems a bit low — maybe even horrifyingly low.

Perhaps a holiday left too little time to focus on a topic, a bout of illness struck, or a question was ambiguously worded. The assessment might be brand new and still need some tweaking, or maybe the students just didn’t get it — there was a collective lapse in memory.

Whatever the reason, the grade distribution is low and it feels bad for you and worse for your students.

What Happened? — The Empathy Hat

Now that you’ve recognized there is an issue, the next step is to identify the reason for the low scores.
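The full post goes on to cover the fix itself, but as a purely illustrative sketch (the formulas and point values below are assumptions, not the post’s actual method), here is what two common quick curves look like in code: a flat point add and a square-root curve.

```python
# Two common quick curves, shown purely for illustration;
# the formulas and point values are assumptions, not the post's method.

def flat_curve(score, points=5, maximum=100):
    """Add a fixed number of points to every score, capped at the maximum."""
    return min(score + points, maximum)

def sqrt_curve(score, maximum=100):
    """Square-root curve: lifts low scores more than high ones."""
    return round((score / maximum) ** 0.5 * maximum, 1)

raw_scores = [42, 55, 63, 71, 88]
print([flat_curve(s) for s in raw_scores])  # [47, 60, 68, 76, 93]
print([sqrt_curve(s) for s in raw_scores])  # [64.8, 74.2, 79.4, 84.3, 93.8]
```

The square-root curve lifts the lowest scores the most while leaving near-perfect scores nearly untouched, which is part of why it is a popular quick fix.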

Low-Cost Student Assessment

Student X has done the reading all term, they promise! It just happens they missed [concept covered in reading] and must’ve been [doing a good student-like activity] when you talked about [concept covered in lecture]. Now it’s finals week, Student X has no idea what is going on, and it’s going to hurt to fail them. If only there were a way to ensure they were doing the reading (or at worst, have documentation when the grade challenge comes)…

I have been working with James Riely, who teaches a hybrid Data Structures course in the College of Computing and Digital Media, to develop a series of low-value quizzes so he can painlessly assess student reading, lecture attention, and concept mastery. Not only are these quizzes useful for James, but they also allow students to self-assess their grasp of the concepts so they can reach out if need be.
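To give a sense of scale, here is a minimal sketch of what “low-value” can mean in practice; the quiz count and category weight below are assumptions chosen for illustration, not the actual grading scheme for the course.

```python
# Illustrative numbers only: the quiz count and category weight are assumptions,
# not the actual grading scheme for the Data Structures course.

num_quizzes = 20             # e.g., two short quizzes per week in a ten-week term
quiz_category_weight = 10.0  # percent of the final grade for all quizzes combined

per_quiz_weight = quiz_category_weight / num_quizzes
print(f"Each quiz is worth {per_quiz_weight:.1f}% of the final grade.")
# Each quiz is worth 0.5% of the final grade.

missed = 3
print(f"Missing {missed} quizzes costs at most {missed * per_quiz_weight:.1f}%.")
# Missing 3 quizzes costs at most 1.5%.
```

Even a student who bombs a few quizzes takes only a negligible hit to the final grade, while the instructor ends up with a term-long record of who actually engaged with the reading.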

Tools Are Just That

In the DePaul Online Teaching Series (DOTS), the facilitators demo a lot of tools. The sheer number is overwhelming even for the facilitators, so it is no surprise that it is all the more overwhelming for faculty.

As facilitators, we often hear statements like “this tool doesn’t make sense for my class,” “this tool is too complicated,” “there are too many tools being presented,” and “it takes too long to make stuff with the tools—I could never use them in every class.”

Fortunately or unfortunately, presenting tools is important. Bringing a traditionally face-to-face course online means some things will need to be done differently. Translating a lecture online requires tools.

The crux of the problem is not the tools, but the perception that using any particular tool is mandatory. We aren’t dictators determining what you have to use to run your class. We’re offering suggestions. Every tool in DOTS is a suggestion.

And we don’t insist that you use all, most, some, or even any of them. It’s up to the faculty to determine how their students will best learn the subject. The facilitators barrage faculty with options in the hope that there will be something, somewhere, that might be useful in mitigating some of the challenges of bringing the course online.

As facilitators and instructional designers, we say, “think about your teaching style and choose one tool that fits how you like to teach.” Don’t commit to that tool. Use that one tool to accomplish one thing, in one class, in one module. If the tool works, great! Use it again for another module! If the tool doesn’t work or only works passably, you haven’t committed to it. Try one of the others.

Don’t sweat tech. Tech is just a tool. If a tool is broken or doesn’t work well, don’t use it. Get a new one. A different one. Try again.

To use a horrid cooking analogy, “don’t sweat a bad cheese slicer.”

Don’t sweat cheese slicers. Cheese slicers are just a tool. If a cheese slicer is broken or doesn’t work well, don’t use it. Get a new one. A different one. Try again.

You don’t go to the store shouting about how all cheese slicers are wretched tools whose only purpose is to subvert your ability to cut cheese.

You find a better cheese slicer.


Lessons from Four Years of Faculty Development

For the last few years, one of my key job duties has been developing the curriculum and facilitating workshops for the DePaul Online Teaching Series. DOTS is a professional-development program that helps faculty make the transition to online teaching through thirty-six hours of workshops, trainings, and online-learning activities. Since the program’s inception in 2008, we’ve collected extensive feedback from our 239 graduates across all 14 cohorts to find out what they liked about the program and how it could be improved. In response, we’ve tweaked everything from the readings and assignments to the software we promote and the way we arrange the seating for face-to-face workshops. Today, faculty interest in DOTS continues to grow, and our most recent cohorts have set records for total applications and enrollment. 

In the summer of 2012, DOTS won the Sloan Consortium Award for Excellence in Faculty Development for Online Learning. Before I received the news, I’d already committed to giving a presentation at the Sloan-C annual conference to share some of the “secrets” of DOTS’ success. While I was excited I’d be able to mention the award as part of my presentation, I also felt added pressure to include useful tips and lessons that the audience hadn’t heard before.

To prepare for the presentation, I reviewed four years’ worth of DOTS survey feedback, looking at trends in the answers to multiple-choice questions and identifying common themes in the responses to open-ended questions. Because I’d read all of the survey results before, as each cohort completed DOTS, I had several assumptions about which aspects of DOTS would be the most praised and which would be the most criticized. However, poring over all the data in a single day and quantifying the results revealed a few interesting and unexpected findings.

While I’d like to save a few secrets for the Sloan-C attendees, I thought I’d share some of my favorite findings here.

  1. Faculty loved screencasting no matter which tool we used. Over the years, we’ve tested and trained faculty to use just about every screencasting tool imaginable. (Most of our faculty currently use Screencast-o-Matic.com.) We always knew faculty liked screencasting because it was an easy transition from traditional lecture delivery. What was a bit surprising was the fact that 14 percent of survey respondents mentioned screencasting training as one of the most useful elements of DOTS—more than any other tool or concept. In addition, negative comments were almost nonexistent regardless of which screencasting tool they tried.
     
  2. Self-pacing eliminated nearly all complaints about hands-on software trainings. For the first three years of DOTS, we ran hands-on software trainings with a traditional, follow-the-leader approach. A trainer would demonstrate each step on a projector while faculty followed along and completed the same task on their laptops. This approach led to many complaints that the trainer was either moving too quickly or too slowly, and less tech-savvy faculty would often hold up the class as they struggled to keep up. To resolve this, we shifted to a self-paced approach. The trainer now begins with a fast-paced demonstration that lasts roughly ten minutes. During this time, faculty observe without attempting to perform the task. Next, each participant is given a handout and asked to complete a basic task in the software while staff members mingle and provide one-on-one support as needed. This approach has been very well received and allowed us to better meet the needs of our participants regardless of their level of technology experience.
     
  3. Showing amazing examples can backfire. Ten percent of respondents mentioned feeling overwhelmed by some aspect of DOTS. While this isn’t surprising—DOTS has to introduce many new tools and course-design strategies, after all—I found it interesting that some faculty cited the high quality of the example courses as a contributing factor. When we only showed courses with very polished video lectures, interactive games, and multi-level content navigation, some faculty felt intimidated and assumed these courses represented a minimum standard they would have to follow. To address this, we began adding sample courses that provided high-quality instruction with fewer bells and whistles. We also made more of an effort to remind faculty that certain courses had already been through years of revisions after being taught several times.

Through careful evaluation of faculty feedback, we’ve been able to implement strategies like the ones above to ensure DOTS keeps getting better with each cohort. While I’m thrilled we received external recognition from an organization like Sloan-C, I’m most proud of the fact that we’ve always viewed DOTS as a work in progress with room for improvement. As a result, our 2012 spring and summer cohorts were among our largest ever, and received satisfaction ratings of 95 percent and 96 percent, respectively. In addition, a graduate of our first cohort in 2008 recently paid us an incredible compliment by “auditing” DOTS this summer. While she felt DOTS was invaluable as she began her online-teaching journey four years ago, she didn’t want to miss out on the new tools, techniques, and activities that her colleagues raved about after completing the program in 2011. This type of evangelism and passion for the program explains why one of our biggest challenges as we plan future DOTS cohorts is finding meeting spaces on campus big enough to hold all of our new participants and our repeat customers.