The great thing about illiteracy is that it’s pretty easy to spot. Sure, we’ve all heard stories about some 65-year-old grandpa who shocks his closest friends by revealing he never learned to read. Yet, for the most part, people who can’t read and write don’t sneak into American universities undetected, and they don’t successfully hold down white-collar jobs. I know it’s tempting to argue with me here. This is the part where you want to derail my entire opening argument by telling me all about a student who graduated from University X and couldn’t even sign his own name. Or you might want to rain on my parade with the tale of the Fortune 500 CEO who had his son write all his memos. While I’m sure such things have happened on rare occasions, that doesn’t change the fact that it’s easy to determine whether someone can read and write, assuming you truly want to know.
Unfortunately, it’s not nearly as easy to determine whether someone is computer literate. The problem isn’t that we lack the means to test a person’s level of technological savvy. The problem is that no one can agree on specific minimum, universal standards that define basic computer literacy. And even if we established such standards, no one seems eager to require faculty or students to take a computer literacy test before being approved to dive into the world of online learning. As a result, universities across the country encounter very similar problems as they try to develop online learning programs. Instructors are asked to develop online courses, but they don’t even know how to create zipped files. Students are encouraged to take online courses, but they can’t even find the files they’ve downloaded on their own hard drives. Help desk staff wind up fielding educational technology questions, but insufficient training and bureaucratic problem-logging systems prevent them from answering those questions quickly and effectively.
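To put the instructors’ side of that in perspective: zipping files is a genuinely small skill once somebody spends five minutes teaching it, and an instructional designer could even script it for a whole set of course materials. Here’s a minimal Python sketch of that sort of task, with placeholder filenames of my own invention:

```python
import zipfile

# Bundle a couple of course files into one compressed archive.
# The filenames are placeholders for whatever needs distributing.
with zipfile.ZipFile("week1_materials.zip", "w", zipfile.ZIP_DEFLATED) as archive:
    archive.write("syllabus.pdf")
    archive.write("lecture_notes.docx")
```

None of this is hard. What’s hard is that nobody at most institutions is formally responsible for teaching it.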
So, what is the instructional designer’s role in this whole debacle? Are they just co-dependent enablers who can’t say no? Are they guilty of encouraging computer-illiterate faculty to explore new, painful ways to torture computer-illiterate students without ever addressing the underlying literacy problem? Of course, many professors’ computer literacy improves as they work with instructional designers to develop online courses, because an instructional designer’s job always includes at least a little technology training. Yet this doesn’t resolve the concern I hear faculty express most often when I’m encouraging them to use a new tool in their courses:
“I don’t have time to learn how to use this new technology, let alone teach my students how to use it.”
Of course, all instructional designers have their own ways of mitigating this concern. They promise it won’t take long to learn the new tool. They vow to be there for faculty throughout the quarter whenever questions arise. One of my old bosses had no authority to make faculty complete their courses on time, so she spent a lot of time trying to catch flies with honey—and with coffee and donuts paid for out of her own pocket. (I suspect this approach is quite common among instructional designers whose job security depends on producing a certain number of online courses a year.) Whatever technique is used to get faculty on board, though, the instructor’s concern about time constraints and professional priorities remains valid.
I think most academic administrators would agree that it isn’t fair to expect teachers to be both experts in their fields of study and expert users of the latest educational technologies. However, they’d probably throw in a caveat that a certain level of basic computer literacy is essential in any job field today, including education. Yet, until everyone (at least at the institutional level) can agree on what that essential level of computer literacy is and what should be done to ensure it is met, it seems futile to try to define the role that students, faculty, technical support, and instructional designers must play in a successful online learning program. Before we introduce instructors to the wonders of podcasts or encourage them to set up instructional blogs or wikis or virtual classrooms, shouldn’t we make sure faculty and their students possess certain fundamental digital media knowledge? Shouldn’t we be sure they know the difference between streaming media and progressive downloads? Shouldn’t we be sure they possess certain basic digital media skills, like how to perform a basic image edit in a tool like Photoshop and export the file in the ideal format for its intended use?
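That last skill, for what it’s worth, doesn’t even require Photoshop. As a rough sketch of how little is actually involved, here’s how the same edit-and-export task might look in Python using the Pillow imaging library (Pillow is my own choice of tool here, and the filenames and dimensions are placeholders):

```python
from PIL import Image  # Pillow, a widely used Python imaging library

# Open a source image, shrink it to a web-friendly size, and
# export it as a JPEG. Filenames and dimensions are placeholders.
with Image.open("course_banner.png") as img:
    img.thumbnail((800, 600))  # resizes in place, preserving aspect ratio
    img.convert("RGB").save("course_banner.jpg", "JPEG", quality=85)
```

My point isn’t that faculty should write scripts. My point is that “edit an image and export it in the right format for its intended use” is a concrete, teachable, testable skill: exactly the kind of thing a literacy standard could name.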
I think every institution could benefit from a required computer literacy course with a curriculum developed and approved by a well-rounded team of experts. It’s tempting to believe that such a course isn’t necessary for most students today, since so many of them already know how to add photos to their Flickr accounts or embed a YouTube video in a MySpace page. However, as someone who has recently taught undergrads how to build basic webpages using HTML, I can tell you that learning to use a social networking tool does not a computer literate person make. Those accomplishments mask a very superficial understanding of how the Web—and digital media in general—truly works, and that gap almost always surfaces later, when it’s too late to do anything about it.
I’m not sure how realistic it is to think that computer literacy training and/or standardized testing could ever be forced upon the faculty at most American colleges and universities. Addressing the student side of the problem is probably an easier place to begin, and its benefits would extend far beyond the development of online learning programs. If nothing else, we’d at least ensure that our students are truly prepared for that “digital, global, information-driven economy” I keep reading so much about. Plus, we’d avoid the embarrassment of graduating a generation of students who will one day shock their closest friends by revealing they never learned how to zip a file or edit a photo or compress an audio clip.