Is your syllabus clearly organized? Will your students understand it? Is your course site laid out intuitively? Can students identify where to start and how to find different kinds of information?
Just because it’s easy for you to navigate and interpret your course materials doesn’t mean it’s going to be easy for your students–you have a wealth of information about the discipline and course structure that students don’t have when they first encounter it. And it’s very difficult to look at your course through the eyes of someone who doesn’t already have that context.
Implementing the methods of user experience research in higher education can provide direction. At DePaul, I lead the Learning Experience Research efforts within our Center for Teaching and Learning, and we use these methods to make student-centered design decisions within courses and the learning-management system.
Addressing Points of Friction
Students have complicated lives. A common critique in course evaluations is “The instructor thinks my whole life revolves around this one class.” Certainly, students should expect a degree of time commitment and rigor to attain the learning outcomes of a college course. But if there’s an opportunity to reduce the friction of a course in a way that doesn’t affect the substance of the course, we should be doing that.
I like the metaphor of friction when talking about usability problems. A little friction can slow you down and make you feel uncomfortable. A lot of friction can stop you entirely. And it’s cumulative–a small amount of friction from several sources can still have a big effect.
When I’m evaluating the usability of a course, that’s the lens I’m using–many of the problems I might identify seem minor, and maybe students can recover from their confusion by clicking around the course site a little more, or re-reading the instructions a couple more times. But these little things add up. And if students are going to be grappling with something unclear in their course, I’d rather it be the nuances of your subject matter than the navigation of the course site!
UX Research Methods
Some of the tools in the UX research toolkit might already be familiar to instructors. You probably give surveys. But there are other methods that can offer different insights into how students interact with your course materials.
Surveys
Surveys are a staple of any sort of behavioral research. You’re almost certainly providing a university-standard survey at the end of your course, but consider also giving one at mid-term. A mid-term survey can help you catch student experience issues while there’s still time to make adjustments for the current cohort.
Usability Testing
Usability testing involves watching in real time how students interact with course materials, whether that’s the course site in the learning management system, the syllabus, or assignment prompts. Typically in the Learning Experience Research team, we do these tests during the design process, before a course is offered, to determine what will trip up students as they interact with the materials. These tests can uncover simple points of confusion in wording and layout, or more substantive feedback, such as whether the course provides the right balance of structure and flexibility for current students.
Here’s how we approach usability testing:
- Create a draft version of the course element we’re testing.
- Identify a task for our users to do. This could be something like, “Review the syllabus as if you were a student actually enrolled in the class,” or “Identify the first steps you would take to complete this assignment.”
- Recruit participants who resemble your target audience. We typically recruit other students.
- Ask them to think aloud as they’re completing the task. Do not provide guidance beyond the materials themselves and what an actual student in the class might already know.
- Observe and take notes.
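When notes pile up across several sessions, a tiny script can help you spot which points of friction recur. This is just a minimal sketch, not part of our actual process; the note format and the category names are assumptions for illustration.

```python
from collections import Counter

# Hypothetical think-aloud observation notes as (task, friction category)
# pairs. Both the format and the categories are invented for this example.
notes = [
    ("review syllabus", "unclear wording"),
    ("review syllabus", "navigation"),
    ("start assignment", "navigation"),
    ("start assignment", "unclear wording"),
    ("start assignment", "navigation"),
]

# Tally how often each friction category appears across all sessions,
# so recurring problems rise to the top regardless of which task
# surfaced them.
counts = Counter(category for _task, category in notes)
for category, n in counts.most_common():
    print(f"{category}: {n}")
```

Even a rough tally like this makes it easier to decide which sources of friction to address first.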
We have never had a test where the results didn’t surprise us or reveal a blind spot in our design process.
Field Observation
Our Learning Experience Research team sometimes does classroom observations to give instructors insights into the student experience. Certainly you’re getting some feedback cues from students already–are they maintaining eye contact and nodding, or tilting their heads in confusion? But there are other aspects of the student experience you might miss without another set of eyes in the room. Can students in the back corner hear well enough to follow the conversation you’re having with the student in the front? Can they see the screen? Are you missing raised hands when you’re looking at the other side of the room? Are you giving them enough time to think through a question and respond before answering it for them?
These observations can reveal subtle but important barriers to engagement.
What You Find Might Surprise You
Is it worth the effort to get this kind of feedback from actual students? We wouldn’t keep doing it if we didn’t keep finding things we didn’t expect–and in some cases things that we weren’t even looking for.
Here are a few surprises we’ve encountered:
- In a D2L course site, we included screenshots of the D2L interface within some orientation materials. Every student in our user testing attempted to click those screenshots thinking the buttons were functional D2L buttons–an error they easily recovered from but still an unnecessary point of friction.
- In attempting to determine whether submodules–another level of categorization of materials within a weekly module–made courses easier or harder to navigate, we found that students simply performed better (both in navigating the course site and on short assignments) when the structure resembled one they had already seen in a different class.
- When testing how students reacted to the D2L Grades tool’s default message when an instructor had set the lowest score in a category to be dropped from the grade calculation, one student thought the red, all-caps “DROPPED!” message meant he had been dropped from the class!
- In class observations, we found instances where an instructor would miss seeing a student with a raised hand while making an extended point to one side of the room. The student kept her hand raised for more than a minute before quietly lowering it again.
- Students can tell when materials are boilerplate and not written in the instructor’s own voice. They skim and skip these materials and often distrust that anything in them will reflect the instructor’s actual practice in the course.
Final Thoughts
My job title is “Learning Experience Designer,” but really, every instructor is to some degree designing the learning experience of their students. And as any user experience professional will tell you, it’s important to keep the end user at the center of the design process.
