Lessons from Four Years of Faculty Development

For the last few years, one of my key job duties has been developing the curriculum and facilitating workshops for the DePaul Online Teaching Series. DOTS is a professional-development program that helps faculty make the transition to online teaching through thirty-six hours of workshops, trainings, and online-learning activities. Since the program’s inception in 2008, we’ve collected extensive feedback from our 239 graduates across all 14 cohorts to find out what they liked about the program and how it could be improved. In response, we’ve tweaked everything from the readings and assignments to the software we promote and the way we arrange the seating for face-to-face workshops. Today, faculty interest in DOTS continues to grow, and our most recent cohorts have set records for total applications and enrollment. 

In the summer of 2012, DOTS won the Sloan Consortium Award for Excellence in Faculty Development for Online Learning. Before I received the news, I’d already committed to giving a presentation at the Sloan-C annual conference to share some of the “secrets” of DOTS’ success. While I was excited I’d be able to mention the award as part of my presentation, I also felt added pressure to include useful tips and lessons that the audience hadn’t heard before.

To prepare for the presentation, I reviewed four years' worth of DOTS survey feedback, looking at trends in answers to multiple-choice questions and identifying common themes in the responses to open-ended questions. Because I'd read each cohort's survey results as they came in, I had several assumptions about which aspects of DOTS would be the most praised and which would be the most criticized. However, poring over all the data in a single day and quantifying the results revealed a few interesting and unexpected findings.

While I’d like to save a few secrets for the Sloan-C attendees, I thought I’d share some of my favorite findings here.

  1. Faculty loved screencasting no matter which tool we used. Over the years, we’ve tested and trained faculty to use just about every screencasting tool imaginable. (Most of our faculty currently use Screencast-o-Matic.com.) We always knew faculty liked screencasting because it offered an easy transition from traditional lecture delivery. What did surprise us was that 14 percent of survey respondents mentioned screencasting training as one of the most useful elements of DOTS—more than any other tool or concept. In addition, negative comments were almost nonexistent regardless of which screencasting tool faculty tried.
     
  2. Self-pacing eliminated nearly all complaints about hands-on software trainings. For the first three years of DOTS, we ran hands-on software trainings with a traditional, follow-the-leader approach. A trainer would demonstrate each step on a projector while faculty followed along and completed the same task on their laptops. This approach led to many complaints that the trainer was moving either too quickly or too slowly, and less tech-savvy faculty would often hold up the class as they struggled to keep up. To resolve this, we shifted to a self-paced approach. The trainer now begins with a fast-paced demonstration that lasts roughly ten minutes. During this time, faculty observe without attempting to perform the task. Next, each participant is given a handout and asked to complete a basic task in the software while staff members mingle and provide one-on-one support as needed. This approach has been very well received and has allowed us to better meet the needs of our participants regardless of their level of technology experience.
     
  3. Showing amazing examples can backfire. Ten percent of respondents mentioned feeling overwhelmed by some aspect of DOTS. While this isn’t surprising—DOTS has to introduce many new tools and course-design strategies, after all—I found it interesting that some faculty cited the high quality of the example courses as a contributing factor. When we showed only courses with very polished video lectures, interactive games, and multi-level content navigation, some faculty felt intimidated and assumed these courses represented a minimum standard they would have to follow. To address this, we began adding sample courses that provided high-quality instruction with fewer bells and whistles. We also made more of an effort to remind faculty that certain courses had already been through years of revisions after being taught several times.

Through careful evaluation of faculty feedback, we’ve been able to implement strategies like the ones above to ensure DOTS keeps getting better with each cohort. While I’m thrilled we received external recognition from an organization like Sloan-C, I’m most proud of the fact that we’ve always viewed DOTS as a work in progress with room for improvement. As a result, our 2012 spring and summer cohorts were among our largest ever and received satisfaction ratings of 95 percent and 96 percent, respectively. In addition, a graduate of our very first cohort in 2008 recently paid us an incredible compliment by “auditing” DOTS this summer. While she found DOTS invaluable as she began her online-teaching journey four years ago, she didn’t want to miss out on the new tools, techniques, and activities that her colleagues raved about after completing the program in 2011. This type of evangelism and passion for the program explains why one of our biggest challenges as we plan future DOTS cohorts is finding meeting spaces on campus big enough to hold all of our new participants and our repeat customers.

About Daniel Stanford

Daniel Stanford is a Learning Design Consultant and former Director of Faculty Development and Technology Innovation at DePaul University's Center for Teaching and Learning. His work in online learning has received awards from the POD Network, the Online Learning Consortium, NAFSA, the Instructional Technology Council, the University of Wisconsin, and Blackboard Inc. Follow @dstanford on Twitter | Connect on LinkedIn
