Monthly Archives: December 2009

The LMS and Feeling Good

All the talk about learning management systems (LMS) around the office lately reminded me of a dataset a couple of colleagues and I put together last year. Dr. Florence Martin, Dr. Yuyan Su, and I undertook the task of validating an instrument to measure LMS self-efficacy.

Bandura (1997) defined self-efficacy as beliefs in one’s abilities to carry out a desired course of action. I’ll spare you the details of orthogonal exploratory and confirmatory factor analyses.

One of the many variables we decided to examine was whether student LMS self-efficacy was a predictor of course performance. After all, is not learning the primary motivation for using a learning management system?

Reported self-efficacy was generally low. However, students enrolled in hybrid courses reported significantly lower self-efficacy than students in face-to-face or fully online courses. In addition, for students enrolled in hybrid courses, we found a significant positive correlation of LMS self-efficacy with course performance.

It is perplexing that a significant positive correlation occurred only for the hybrid learners. One would think that the use of the LMS as a supplement to face-to-face instruction would require less confidence with the system than in a course in which all content is delivered through the LMS.

Hybrid learners often had the option to enroll in a fully online version of the course but self-selected into the hybrid version. Is this due in part to their lower self-efficacy with the LMS? Or does it mean there is a baseline competence with LMS use required for success, but once that level is perceived to be reached, greater self-efficacy with the system is not required?

Finally, the only instrument category that did not yield a significant difference between modes of delivery was “Accessing Information.” This section included items like logging in to the LMS, navigating a course site, accessing text-based class materials and grades, etc. This was also the highest rated category for self-efficacy. We hypothesize that this finding is an indication of the predominant use of an LMS throughout each student’s experience. As suggested by Bandura (1997), the formation of self-efficacy beliefs is based primarily on reflection on and interpretation of past performance.

In my previous post, I referenced the Ralston-Berg & Nath (2009) report that says students are uninterested in the bells and whistles in online courses. But consider further the abundance of media-comparison “studies” and no-significant-difference studies that essentially nullify each other.

Is it possible that students actually do like the bells and whistles but lack the confidence to learn from them?

What a pickle.

References

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.

Ralston-Berg, P. & Nath, L. (2009). What Makes a Quality Online Course? The Student Perspective. Paper presented at Annual Conference on Distance Teaching and Learning, Madison, WI.

CAEL 2009: What about Online?

A couple of weeks ago, I was a presenter at the CAEL 2009 International Conference. CAEL (The Council for Adult and Experiential Learning) is by definition broadly interested in assessing and serving adult learners in a variety of programs; nevertheless, I was struck by how few workshops offered anything geared toward online learning.

This isn’t a small matter. Each keynote speaker I heard addressed the importance of serving the underserved, of finding ways to identify, assess, and recruit adult populations who would benefit from increased access to adult and/or continuing education. There’s tremendous opportunity for institutional growth, they declared, and there’s a moral obligation and societal responsibility to do so. However, most presenters were thinking of these efforts as they pertain to on-ground, classroom-based models. Online learning–if mentioned at all–seemed to be regarded as an add-on option of dubious value to traditional academic delivery.

This kind of perspective has to change if there’s any hope of bringing significantly more adults into our community of learners. Do those who sit on marketing and enrollment committees really want to exclude everyone who might benefit from and contribute to a university learning community but for their inability to be physically present in a traditional classroom? Wouldn’t it be better to design and build a scalable online program that could reach and serve adults regardless of their geographic location? Wouldn’t it be better to spend marketing dollars to identify and attract adult learners to an online program, adults who because of family, work, or other obligations will never step foot in another traditional classroom but who could and would take courses online if given the opportunity?

I hear all the time that we must not cannibalize our on-ground programs, as if access to education were a kind of zero-sum game. News flash: a single parent facing a long after-work commute in rush-hour traffic to attend even a suburban-campus night class will almost never occupy a seat in your classroom unless he or she has exceptional resolve and resources. That same person could and would complete a degree online if it’s made available, attractive, and affordable. My evidence of this is anecdotal, but I’m convinced it would be affirmed by some targeted marketing research. Of course, that would take institutional vision and commitment. And a change of perspective, looking out and away from the classroom to where new opportunity awaits.


Teaching Frustrations: Why Don’t Students Follow My (Clearly-Labeled, Logically Organized, and Bold/Highlighted/Flashing) Instructions?

Instructors who teach in online environments often devote extensive time and energy to designing a Web space that is inviting and useful to students. But frustration inevitably ensues when, despite the careful consideration given to the most logical placement of a discussion forum and the “clearest” instructions provided to students on how to post to the forum, the instructor still receives e-mail from students asking, “So, where is this discussion forum? And what am I supposed to do?” Why has this gap in communication occurred?

One reason for this may be the typically linear design of course sites. Often, learning-management systems adopted by universities have default settings that establish some of the design considerations for the instructors—i.e., the location and style of course navigation. These linear designs generally reflect the best intentions, since they try to organize information so that students can navigate course material easily, following step-by-step instructions and information.

However, with recent developments in eye-tracking software showing how users really view content on the Web, we can see why this linear design isn’t quite ideal. This video shows a user’s eye movements when scanning IKEA’s Web site, and several other examples available online confirm this rapid pattern of eye movement that jumps all over the page. It’s no wonder, then, that students miss the carefully placed, bolded, and highlighted instructions for turning in an assignment that you were sure everyone would see and follow—considering how the brain takes in and processes information from the screen, it’s easy to see how a linear design style for course materials might not match the ways in which users view the content.

So, what is the solution? Unfortunately, there isn’t a Band-Aid design scheme that addresses this issue, and because instructors are often working within an institutionally mandated learning-management system, course design happens within set boundaries. One important step is usability testing, which can reveal issues that designers can’t see once they are invested in their design decisions. This may seem like an onerous and time-consuming task, but it doesn’t need to be—usability guru Jakob Nielsen recommends five users for testing, and even finding two or three people to look at your course and perform key tasks can give you helpful information to improve your course design.
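Nielsen’s five-user recommendation rests on a simple diminishing-returns model (Nielsen & Landauer’s cumulative-discovery formula): if each tester independently uncovers roughly the same fraction L of the problems, then n testers are expected to surface 1 − (1 − L)^n of them. A quick sketch, using the commonly cited average discovery rate of about 31% per tester — an assumption from the usability literature, not a figure measured for any particular course site:

```python
# Expected share of usability problems found by n independent testers,
# per the Nielsen & Landauer model: found(n) = 1 - (1 - L)^n.
# L = 0.31 is the commonly cited average discovery rate per tester
# (an assumed value; any given course site may differ).

def problems_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in [1, 2, 3, 5, 10]:
    print(f"{n} testers -> {problems_found(n):.0%} of problems found")
```

Under that assumed rate, two or three testers would be expected to surface roughly half to two-thirds of the problems, and five about 85% — which is why a small, informal test of your course design is still well worth the effort.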

Another important step is realizing that, just as in face-to-face classrooms, your goal (for students to follow instructions) needs to clearly align with your assessments:

  • Include instructions in a logical location, as determined by your course design.
  • Ensure that students have seen these instructions. One effective method is to give students a graded quiz at the beginning of the term that asks them to locate important information throughout the online course.
  • Show students that following instructions is important by grading them on it. Depending on your class, you might make part of an assignment’s grade based on following the assignment’s instructions, or you could refuse to accept an assignment until the student has followed the directions.

Again, there isn’t a one-size-fits-all solution for designing courses that adhere to the ways users view information on the screen. This also isn’t a “lost cause” for instructors—just because users naturally view Web content in a nonlinear way doesn’t mean that the design of online course materials needs to be completely overhauled. Thoughtful design can help students, but supporting your design with clear expectations and assessments can also help students navigate your course more effectively.