Category Archives: Training

Sonya Ratliff

What Do I Do Now? Remembering What It Feels Like Being a First-Time Online Student

Can you remember feeling nervous, anxious, and fearful about an upcoming online course you registered for on the advice of an academic advisor? While the advisor gave you some basic information about the course and told you not to worry, the little voice inside would say, “Are you sure you can do this?” That little voice never really went away until the end of the course.

The online world of learning is very different from the face-to-face classroom. Students don’t have the opportunity to speak to the instructor after class or stop by the instructor’s office on the way home to ask a question. Everything, everything is done virtually.

Continue reading

Lori Zalivansky

Stress: The Good, The Bad, and The Downright Ugly

I think it is safe to say we have all experienced some form of stress in our lives, whether in our personal life or at work. Stress isn’t always a bad thing. Sometimes stress, in small doses, can help you perform better and keep you safe in dangerous situations.

This week we had several interviews, and one of the questions we always ask candidates is how they handle pressure and whether they can manage stress. Working in our field can be very stressful. There is a lot of customer support involved in instructional design. As Sharon Guan likes to say, we are free therapy. Whenever an instructor is struggling, they come to us with the hope that we can ease their worries and their stress, which means a lot of the time we are not only dealing with our own personal stress but also taking on the stress of our faculty. Stress is only good if you keep it in a comfortable zone, so how do you make sure not to let yourself get overwhelmed? As one of the candidates said during the interview, you don’t want to get to the point where you are seconds away from throwing your computer out the window. It’s a long way down.

Continue reading

Lori Zalivansky

Walking a Fine Line Between Support and Too Much Support

As an eLearning Content Developer (ECD) at DePaul University, one of my roles is to provide faculty support for all courses using Desire2Learn, whether that means running D2L training sessions, building content, or answering D2L technical questions. One of the biggest challenges I face as an ECD is figuring out when I might be providing “too much support.” I’m sure any faculty reading this are thinking, how could there ever be too much support? But I believe there needs to be a balance between providing the support faculty need and giving them the right amount of encouragement to eventually answer their own questions. Continue reading

Josh Lund

Instructional Designers: Preventative Care is for Us Too

I don’t often write directly to my instructional designer colleagues; usually I try to impart the occasional nuggets of wisdom I’ve gained from teaching, research, or just plain trial and error to faculty, so they can avoid making the same mistakes I have. This time I’ve found a new way to stay inspired and reduce the burnout that can happen in this line of work, and I’m so excited about how it has affected my approach to instructional design (ID) that it bears repeating.

Over the past decade or so, we have all witnessed a major change in health care. The medical profession has shifted its focus from just treating symptoms to preventative care: the idea that changing lifestyle and health habits earlier will reduce the amount of symptomatic care patients require later in life. It does seem to be having a positive effect so far, as hospitals have more time to deal with emergencies, and their doctors and nurses spend less time in consultation over health conditions that are ultimately preventable. Continue reading

Bridget Wagner

Best Practices for Support in Instructional Technology

I have only been working in instructional technology full time for a few months, so I am not really prepared to call myself an expert on anything. Except, maybe, on providing support in an instructional technology department.

As a newcomer to instructional technology, I have spent a lot of time figuring out how to make stuff work. In my current role as a member of a support team, I not only have to figure out how to make stuff work for my own benefit, but also help other people at a university learn to use our learning management system and other educational technology tools.

So I am no course design expert, but I am good at troubleshooting what I do not know, and helping others to do the same. This is what I’ve learned is key in a support role. Continue reading

Lori Zalivansky

Review: Applying the QM Rubric Course

I have been working as a content developer at DePaul for nearly five years. In that time, I have heard rumblings about Quality Matters and Quality Matters reviews, but never really understood what “QM’ing” a course meant. When asked what I would like to focus on for professional development, becoming a certified peer reviewer was the first thing that popped into my head. I have quality-assured many courses and wondered, “How different is that from doing a Quality Matters review?” I was in for an awakening.

Continue reading

Ian Hall

Quick and Easy Curving

Something Happened! — That Sinking Feeling

Sometimes the grade distribution on your exam seems a bit low — maybe even horrifyingly low.

Perhaps there wasn’t enough time to focus on a topic due to a holiday, a bout of illness struck, or a question was ambiguously worded. The assessment might be brand new and still need some tweaking, or maybe the students just didn’t get it and there was a collective lapse in memory.

Whatever the reason, the grade distribution is low and it feels bad for you and worse for your students.
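Since the teaser stops before the post explains its method, here is a minimal sketch of two common quick curves, a flat bump and a square-root curve. The function names and specific formulas are illustrative assumptions, not necessarily the approach the full post recommends.

```python
def curve_sqrt(raw, max_points=100):
    # Square-root curve: helps low scores the most, barely changes high ones.
    # curved = sqrt(raw / max) * max
    return round((raw / max_points) ** 0.5 * max_points, 1)

def curve_flat(raw, bonus=5, max_points=100):
    # Flat curve: add the same number of points to everyone, capped at the max.
    return min(raw + bonus, max_points)

# Example: a 52 becomes roughly a 72 under the square-root curve.
scores = [52, 61, 70, 85]
curved = [curve_sqrt(s) for s in scores]
print(curved)  # [72.1, 78.1, 83.7, 92.2]
```

The square-root curve is popular for "quick and easy" adjustments because it preserves rank order while compressing the gap between struggling and strong students; a flat bump is simpler but lifts everyone equally.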

What Happened? — The Empathy Hat

Now that you’ve identified there is an issue, the next step is to identify the reason for the low scores. Continue reading

Ian Hall

Low-Cost Student Assessment

Student X has done the reading all term, they promise! It just happens they missed [concept covered in reading] and must’ve been [doing a good student-like activity] when you talked about [concept covered in lecture]. Now it’s finals week, Student X has no idea what is going on, and it’s going to hurt to fail them. If only there were a way to ensure they were doing the reading (or, at worst, to have documentation when the grade challenge comes)…

I have been working with James Riely, who teaches a hybrid Data Structures course in the College of Computing and Digital Media, to develop a series of low-value quizzes so he can painlessly assess student reading, lecture attention, and concept mastery. Not only are these quizzes useful for James, but they also allow students to self-assess their grasp of the concepts so they can reach out if need be. Continue reading

Ian Hall

Tools Are Just That

In the DePaul Online Teaching Series (DOTS), the facilitators demo a lot of tools. The sheer number of tools can be overwhelming even for the facilitators, so it is no surprise that it is extremely overwhelming for faculty.

As facilitators, we often hear statements like “this tool doesn’t make sense for my class;” “this tool is too complicated;” “there are too many tools being presented;” “it takes too long to make stuff with the tools—I could never use them in every class.”

Fortunately or unfortunately, presenting tools is important. Bringing a traditionally face-to-face course online means some things will need to be done differently. Translating a lecture online requires tools.

The crux of the problem is not the tools, but the perception that using any particular tool is mandatory. We aren’t dictators determining what you have to use to run your class. We’re offering suggestions. Every tool in DOTS is a suggestion.

And we don’t suggest using all, most, some, or even any of the tools. It’s up to the faculty to determine how their students will best learn the subject. The facilitators barrage faculty with options in the hopes that there will be something, somewhere, that might be useful in mitigating some of the challenges of bringing a course online.

As facilitators and instructional designers, we say, “think about your teaching style and choose one tool that fits how you like to teach.” Don’t commit to that tool. Use that one tool to accomplish one thing, in one class, in one module. If the tool works, great! Use it again for another module! If the tool doesn’t work or only works passably, you haven’t committed to it. Try one of the others.

Don’t sweat tech. Tech is just a tool. If a tool is broken or doesn’t work well, don’t use it. Get a new one. A different one. Try again.

To use a horrid cooking analogy, “don’t sweat a bad cheese slicer.”

Don’t sweat cheese slicers. Cheese slicers are just a tool. If a cheese slicer is broken or doesn’t work well, don’t use it. Get a new one. A different one. Try again.

You don’t go to the store shouting about how all cheese slicers are wretched tools whose only purpose is to subvert your ability to cut cheese.

You find a better cheese slicer.

Daniel Stanford

Lessons from Four Years of Faculty Development

For the last few years, one of my key job duties has been developing the curriculum and facilitating workshops for the DePaul Online Teaching Series. DOTS is a professional-development program that helps faculty make the transition to online teaching through thirty-six hours of workshops, trainings, and online-learning activities. Since the program’s inception in 2008, we’ve collected extensive feedback from our 239 graduates across all 14 cohorts to find out what they liked about the program and how it could be improved. In response, we’ve tweaked everything from the readings and assignments to the software we promote and the way we arrange the seating for face-to-face workshops. Today, faculty interest in DOTS continues to grow, and our most recent cohorts have set records for total applications and enrollment. 

In the summer of 2012, DOTS won the Sloan Consortium Award for Excellence in Faculty Development for Online Learning. Before I received the news, I’d already committed to giving a presentation at the Sloan-C annual conference to share some of the “secrets” of DOTS’ success. While I was excited I’d be able to mention the award as part of my presentation, I also felt added pressure to include useful tips and lessons that the audience hadn’t heard before.

To prepare for the presentation, I reviewed four years’ worth of DOTS survey feedback, looking at trends in the answers to multiple-choice questions and identifying common themes in the responses to open-ended questions. Because I’d read all of the survey results before, as each cohort completed DOTS, I had several assumptions about which aspects of DOTS would be the most praised and which would be the most criticized. However, poring over all the data in a single day and quantifying the results revealed a few interesting and unexpected findings.

While I’d like to save a few secrets for the Sloan-C attendees, I thought I’d share some of my favorite findings here.

  1. Faculty loved screencasting no matter which tool we used. Over the years, we’ve tested and trained faculty to use just about every screencasting tool imaginable. We always knew faculty liked screencasting because it was an easy transition from traditional lecture delivery. What was a bit surprising was that 14 percent of survey respondents mentioned screencasting training as one of the most useful elements of DOTS—more than any other tool or concept. In addition, negative comments were almost nonexistent regardless of which screencasting tool respondents tried.
  2. Self-pacing eliminated nearly all complaints about hands-on software trainings. For the first three years of DOTS, we ran hands-on software trainings with a traditional, follow-the-leader approach. A trainer would demonstrate each step on a projector while faculty followed along and completed the same task on their laptops. This approach led to many complaints that the trainer was moving either too quickly or too slowly, and less tech-savvy faculty would often hold up the class as they struggled to keep up. To resolve this, we shifted to a self-paced approach. The trainer now begins with a fast-paced demonstration that lasts roughly ten minutes. During this time, faculty observe without attempting to perform the task. Next, each participant is given a handout and asked to complete a basic task in the software while staff members mingle and provide one-on-one support as needed. This approach has been very well received and has allowed us to better meet the needs of our participants regardless of their level of technology experience.
  3. Showing amazing examples can backfire. Ten percent of respondents mentioned feeling overwhelmed by some aspect of DOTS. While this isn’t surprising—DOTS has to introduce many new tools and course-design strategies, after all—I found it interesting that some faculty cited the high quality of the example courses as a contributing factor. When we only showed courses with very polished video lectures, interactive games, and multi-level content navigation, some faculty felt intimidated and assumed these courses represented a minimum standard they would have to follow. To address this, we began adding sample courses that provided high-quality instruction with fewer bells and whistles. We also made more of an effort to remind faculty that certain courses had already been through years of revisions after being taught several times.

Through careful evaluation of faculty feedback, we’ve been able to implement strategies like the ones above to ensure DOTS keeps getting better with each cohort. While I’m thrilled we received external recognition from an organization like Sloan-C, I’m most proud of the fact that we’ve always viewed DOTS as a work in progress with room for improvement. As a result, our 2012 spring and summer cohorts were among our largest ever, and received satisfaction ratings of 95 percent and 96 percent, respectively. In addition, a recent graduate of our first cohort in 2008 paid us an incredible compliment by “auditing” DOTS this summer. While she felt DOTS was invaluable as she began her online-teaching journey four years ago, she didn’t want to miss out on the new tools, techniques, and activities that her colleagues raved about after completing the program in 2011. This type of evangelism and passion for the program explains why one of our biggest challenges as we plan future DOTS cohorts is finding meeting spaces on campus big enough to hold all of our new participants and our repeat customers.