The Importance of Defining Computer Literacy

Compared to digital illiteracy, traditional illiteracy is relatively easy to spot. For the most part, people who can’t read and write don’t sneak into universities undetected and they don’t often hold down white-collar jobs. I know it’s tempting to argue with me here. This is the part where you want to derail my entire opening argument by telling me all about a student who graduated from University X and couldn’t even sign his own name. Or you might want to rain on my parade with the tale of the Fortune 500 CEO who had his son write all his memos. While I’m sure such things have happened on rare occasion, that doesn’t change the fact that it’s fairly easy to design an assessment that can determine if someone can read and write at a particular level of proficiency.

Unfortunately, it’s not nearly as easy to determine if someone is computer literate. The problem isn’t that we lack the means to test a person’s level of technology savvy. The problem is that no one can agree on specific minimum, universal standards that define basic computer literacy. And even if we established such standards, no one seems eager to require faculty or students to take a computer literacy test before being approved to dive into the world of online learning. As a result, universities across the country encounter very similar problems as they try to develop online learning programs. Instructors are asked to develop online courses, but they don’t know how to create zipped files or edit a photo. Students are encouraged to take online courses, but they might not know where to find the files they’ve downloaded on their hard drives. Help desk staff wind up answering educational technology questions, but insufficient training and bureaucratic problem-logging systems prevent them from answering these questions quickly and effectively.

So, what is the instructional designer’s role in this whole debacle? Are they just co-dependent enablers who can’t say no? Are they guilty of encouraging computer-illiterate faculty to explore new, painful ways to torture computer-illiterate students without ever addressing the underlying literacy problem? Of course, many professors’ level of computer literacy improves as they work with instructional designers to develop online courses because an instructional designer’s job often includes technology training. Yet, this doesn’t resolve the concern I hear faculty express most often when I’m encouraging them to use a new tool in their courses:

“I don’t have time to learn how to use this new technology, let alone teach my students how to use it.”

Of course, all instructional designers have their own ways of mitigating this. They promise it won’t take long to learn how to use a new tool. They vow to be there for faculty throughout the quarter whenever questions arise. One of my old bosses had no authority to motivate faculty to complete their courses on time, so she spent a lot of time trying to catch flies with honey—and coffee and donuts paid for out of her own pocket. (I suspect this approach is quite common for instructional designers whose job security depends on producing a certain number of online courses a year.) Whatever technique is employed to get faculty on board, the instructor’s concern about time constraints and professional priorities remains valid.

I think most academic administrators would agree that it isn’t fair to expect teachers to be both experts in their fields of study and expert users of the latest educational technologies. However, they’d probably throw in a caveat that a certain level of basic computer literacy is essential in any job field today, including education. Yet, until everyone (at least at the institutional level) can agree on what that essential level of computer literacy is and what should be done to ensure it is met, it seems futile to try to define the role that students, faculty, technical support, and instructional designers must play in a successful online learning program. Before we introduce instructors to the wonders of podcasts or encourage them to set up instructional blogs or wikis or virtual classrooms, shouldn’t we make sure faculty and their students possess certain fundamental digital media knowledge? Shouldn’t we be sure they possess basic digital media skills, like how to perform a simple image edit in a tool like Photoshop and export the file in the ideal format for its intended use?

I think every institution could benefit from a required computer literacy course with a curriculum developed and approved by a well-rounded team of experts. It’s tempting to believe that such a course isn’t necessary for most students today. So many students already know how to add photos to their Flickr accounts or embed a YouTube video in a MySpace page. However, as someone who has recently taught undergrads how to build basic webpages using HTML, I can tell you that learning to use a social networking tool does not a computer literate person make. These accomplishments belie a very superficial knowledge of how the Web—and digital media in general—truly works, and that lack of knowledge almost always shows up later, when it’s too late to do anything about it.

I’m not sure how realistic it is to think that computer literacy training and/or standardized testing could ever be forced upon the faculty at most American colleges and universities. Addressing the student side of the problem is probably an easier place to begin, and its benefits would extend far beyond the development of online learning programs. If nothing else, we’d at least ensure that our students are truly prepared for that “digital, global, information-driven economy” I keep reading so much about. Plus, we’d avoid the embarrassment of graduating a generation of students who will one day shock their closest friends by revealing they never learned how to zip a file or edit a photo or compress an audio clip.

About Daniel Stanford

Daniel Stanford is a Learning Design Consultant and former Director of Faculty Development and Technology Innovation at DePaul University's Center for Teaching and Learning. His work in online learning has received awards from the POD Network, the Online Learning Consortium, NAFSA, the Instructional Technology Council, the University of Wisconsin, and Blackboard Inc. Follow @dstanford on Twitter | Connect on LinkedIn

4 thoughts on “The Importance of Defining Computer Literacy”

  1. An interesting post. I posted some related thoughts about it on my blog. http://edusign.blogspot.com/2008/02/participation-at-core-of-web-20-and.html

    I like your call for literacies, but my biggest departure from your comments is that I don’t think we need to be dictating specific tools anymore. In a day of Google docs, open source applications, and a host of other web applications that easily facilitate communicating, writing, publishing, creating video, and editing photos, learning HTML or Photoshop tasks might be akin to memorizing times tables (I’m not sure that’s all bad either).
    My main point is that we should get people participating and creating regardless of the tool. Tools will come and go. Learners today need to be literate in communicating, participating, and bartering in the marketplace of ideas…and the tools of this marketplace are no longer HTML.

    Digital literacy courses could be a good idea, but as with nearly all learning, we can’t store it up. It is best learned when we actually need it. Faculty will likely be drawn to these tools (and to literacy workshops) when they feel the need to stay relevant and to communicate with their students where those students already communicate.
    I’ll sign off for now. None of this is very well thought through, but your post prompted this quick response.
    -Joel G.

    Thanks for the feedback and the link from your own blog over at http://edusign.blogspot.com/. For the most part, I agree with what you said, and it was clearly a mistake for me to mention Photoshop. I tried as much as possible in the article to cite more general skills that aren’t software-specific: editing a photo, trimming an audio file, zipping files, understanding the basic principle of streaming media and how it differs from progressive download, etc. When I teach my students how to edit images, I show them how to do it in Fireworks. However, I always mention that if they’re already comfortable with another tool (Photoshop, for example), they’re welcome to use that instead.

    Perhaps your point is that teaching any applied computer skills isn’t really making anyone more computer literate, and that we need to focus more on the why and less on the how. Maybe you’re proposing we should teach people more general concepts and theory so they understand, for example, how an online publishing tool works (what tagging is, what a database is, etc.) and why its features are valuable as educational tools. I think it’s great to teach all these things in addition to showing them the “how-to” stuff, e.g., what series of buttons to click to create a new post and make Blogger or WordPress or Movable Type publish it. However, I’m not sure exactly what kind of test you could give someone to determine if they can “communicate and barter in the marketplace of ideas” without also testing their practical skills: their ability to create a discussion board thread in Blackboard, create a new Google doc and invite people to it, etc. I suppose the best computer literacy training would demonstrate key skills in such a way that students could then apply them to any similar tool, but I think this requires a well-planned balance between the theoretical and the practical.

    I think there’s merit in still teaching students to memorize multiplication tables, but that’s a bit different from teaching someone how to edit an image in a specific program like Photoshop. Being able to do basic multiplication in your head can save a person a little time or let them calculate the cost of something at the grocery store when they don’t have a calculator on them, but that’s about where the benefits end. Learning how to edit an image in Photoshop should, in theory, teach a student what menu options and tool icons to look for in another program (iPhoto, Fireworks, etc.) so they can quickly apply those software skills to similar tools.

    I still spend a week or two teaching my students how to hand-code their first HTML webpages because I think it’s important that they not rely entirely on Dreamweaver’s design view (or that of any WYSIWYG editor) to create their website content. I try to teach them general concepts like what HTML tags are and why we wrap them around specific chunks of content in the way that we do, but I don’t make them create 20-page websites entirely in Notepad. Hopefully the result is that they understand a bit more about the way content on the web is structured, they know a bit more about how to create webpages of their own, and in the future, they can troubleshoot problems and adapt to new advances thanks to their mix of theoretical and practical knowledge.
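
    To make that concrete, here’s roughly the kind of starter page I have in mind for that first week of hand-coding. This is a simplified sketch rather than the exact exercise from my course, but it shows what I mean about wrapping tags around specific chunks of content:

        <html>
          <head>
            <!-- The title tag labels the page in the browser's title bar -->
            <title>My First Webpage</title>
          </head>
          <body>
            <!-- The h1 tag marks this chunk of content as the main heading -->
            <h1>Hello, Web</h1>
            <!-- The p tag identifies this chunk of content as a paragraph -->
            <p>Wrapping content in tags tells the browser what each
            piece of content is, not just how it should look.</p>
          </body>
        </html>

    Even a page this small gives students a vocabulary for what Dreamweaver is generating behind the scenes, which is exactly the kind of transferable knowledge I’m after.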
