Category Archives: Pedagogy

Evidence for Online Pedagogy—One More Tool!

It is particularly gratifying to read a headline like this one, which appeared in last week’s Chronicle of Higher Education: “Online-Education Study Reaffirms Value of Good Teaching, Experts Say.”

Gotta love it! "Good teaching" finally makes it into the online tool kit!

The “study” is Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies, recently released by the U.S. Department of Education. The full report is available online.

I quickly downloaded, printed out, read, and marked up my own copy!

While the report found that students taking all or part of their class online performed better than those in a face-to-face class, the study suggests that it was not the medium for delivery itself that accounted for the difference but rather "it was the combination of elements in the treatment conditions (which was likely to have included additional learning time and materials as well as additional opportunities for collaboration) that produced the observed learning advantages" (p. XVII).

This report could serve as a discussion prompt for faculty interested in developing or teaching an online class. It would also be useful to instructional designers who want to examine the specific learning practices it covers.

As always, the IDD Blog wants to hear what catches your attention. Read the report yourself and let us know.


Avoiding Intellectual Clutter: A Student’s Perspective

Students today have access to more information than ever before. Beyond Google-fu and Wikipedia, new technologies allow anyone to research and order practically any publication with a few keystrokes. College students can draw on expansive libraries filled with volumes on the most obscure topics, and at larger universities like DePaul, they can pull up full-text articles from respected academic journals with a few simple searches. The amount of information we students can gather before we ever reach a classroom or open up Blackboard can be overwhelming. This raises a question: given the wealth of information available outside the classroom, what exactly is the role of the instructor?

It cannot simply be to impart information—all the information is out there and available to students. Rather, I think part of the instructor's role has to be to act as a filter: to cut through everything that is out there and identify and present only what is most important, most up to date, and most accessible for students who are just being introduced to a field or subfield of study. That material then needs to be arranged into a coherent, unified form.

I think a lot of professors disregard this filtering function, and either put too much into their syllabi or overwhelm students with “optional resources” for topics they can’t cover within the course.

I know from experience how frustrating it can be when a student is confronted with an overwhelming number of sources. I enjoy being subjected to academic rigor, but I'm put off by instructors handing me articles I “might be interested in” or optional resources, even when I'm really engaged by the subject matter of the class. First of all, I know of no student who has the time, at least during the school year, to go that far above and beyond the course requirements. And more importantly, I think all these secondary resources can create intellectual clutter, distracting from the central principles the course is trying to communicate.

In short, as a student, I’m interested in what I need to know to meet the course goals. All the other stuff is a distraction, more often than not.

Working with Wikis

Wikis are a great tool for collaborative learning, but like any other tool, they need to be used properly. In my role as wiki administrator/Mister Fix-It at SNL Online, I recently checked up on several course wikis that I'd initially created and turned over to faculty, and I was disappointed to find that some were underused and poorly structured. Here are a few tips to make your wiki (or, ahem, workspace, as PBworks, née PBwiki, now calls wikis) easier to use and a better learning environment for you and your students.

Have assignments that use the wiki. This would seem self-evident, but unless students have to go to the wiki to do course work that will be assessed, they won’t use it.

Make those assignments appropriate for a wiki.  A wiki is not a discussion board. A wiki is a great place to work collaboratively. It’s easy to work on a common document without having to exchange endless iterations of Word documents. It’s easy to post work and share it with others in a highly visual environment. You can post and share photos, audio, video, and a wide variety of multimedia widgets and Web tools–things that are clumsy or impossible to do in a discussion. You can set up private folders for each student, so he or she can post sensitive material like a personal journal that only you and the author can see. But if you want a space for students to discuss things, use the discussion feature in your learning management system.

Provide scaffolding for students. Give them low-stakes tasks to do at the start of the course, like creating a personal introduction page, adding a photo and text to it, and creating a link to it from the course-wiki home page. Again, it should be a required activity, not an optional exercise. Your students can then build on this experience.

Provide clear directions for students. Many adult students are intimidated by new technology, and a surprising number of younger students also struggle with unfamiliar applications. Both groups need to know exactly what you want from them and how to create it. At SNL Online we provide faculty and students with role-based wiki FAQs, print and interactive tutorials, and links to PBworks' extensive library of video tutorials to help with the “how-to-do-it.”

Provide navigation. The wiki will be underused if it’s hard to use. You need some kind of navigation and site structure. It can be as simple as a list of links on the home page that direct to student pages; the important thing is to make sure that users can easily find what they need.

Provide a template or wiki structure. I’ve set up some wikis for faculty with the course foundation completed; students needed only to edit existing pages or add pages to an existing section or folder. Some of our faculty prefer to create this structure themselves. So far, both approaches have been more successful than leaving the design and creation of the wiki to chance.  

Monitor and maintain. Because any user with editing privileges can change any page you don't lock down, content can (and frequently does) appear, change, and disappear. To maintain a consistent, usable learning environment, you'll need to keep an eye on your wiki and make corrections, adjustments, and replacements. Every wiki I'm familiar with sends wiki administrators e-mail alerts when a page is edited, which makes it easy to stay up to date; you can usually set the frequency of these alerts or opt out of them altogether.
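If you would rather skim changes on your own schedule than rely on e-mail alerts alone, here is a minimal monitoring sketch. It assumes only that your wiki exposes an RSS or Atom feed of recent activity (many wikis, PBworks workspaces included, can generate one); the feed URL is a placeholder you would swap for your own, and this is not a PBworks API, just a generic feed reader.

```python
# Minimal sketch: read a wiki's recent-activity feed and list the latest edits.
# Assumes the wiki exposes an RSS/Atom feed; the URL below is a placeholder.
# Requires the third-party "feedparser" package (pip install feedparser).
import feedparser

FEED_URL = "https://example-course.pbworks.com/feed"  # placeholder, not a real workspace

def recent_edits(feed_url, limit=10):
    """Return (timestamp, title, link) tuples for the most recent changes."""
    feed = feedparser.parse(feed_url)
    edits = []
    for entry in feed.entries[:limit]:
        edits.append((
            entry.get("updated", entry.get("published", "unknown time")),
            entry.get("title", "(untitled page)"),
            entry.get("link", ""),
        ))
    return edits

if __name__ == "__main__":
    for when, title, link in recent_edits(FEED_URL):
        print(f"{when}  {title}  {link}")
```

Run on whatever schedule suits you (say, once a day), a script like this gives you a quick digest of wiki activity without flooding your inbox.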

Keep ahead of your students. There's certainly something to be said for you and your students learning as you go along, but with new technology, it's far preferable to be comfortable with it yourself before asking your students to use it. Familiarize yourself with the functions and features of your wiki, use all available resources to strengthen your own skill set, and you and your students will create a useful and rewarding collaborative-learning environment.

Webtopia—Democratizing the Internet

For years, writers and urban planners have mapped and envisioned the ideal society by designing utopian metropolises. What follows is my own interpretation and glimpse into a version of a “webtopia,” a re-imagining meant as a prompt for discussing democracy and citizenship on the Web.

[Illustration: “myUSA” citizen-portal mockup]

We might begin to think of the Internet as a public infrastructure or a spatial experience akin to walking a city's streets. We navigate through the vague surrealism of unexplained flashing images and Flash graphics; however, without the same binding of civic infrastructure and citizenship, without our ties to the streets, we instead navigate a corporate labyrinth—an endless mall. It is a spectacle of passive engagement, wherein we consume information, commodities, and products while hardly holding a stake in its architecture. As Google strives to unleash infinite knowledge at our fingertips and YouTube and other Web services promote do-it-yourself content creation, our productive capabilities are exploited. We might question whether the intellectual value of a Web user really drives these companies or whether they instead mean to attract our passive gaze into the corporate consumer spectacle, interested only in the activity of our wallets. These applications and services are by no means free; we pay by submitting ourselves to billions of dollars in advertising spent to follow and manipulate our habits. Furthermore, the content we generate through “free” e-mail, social networking, Web applications, and the like is scanned, exploited, and sold as marketing-analytics research. Is our content, then, really our own?

Can we begin to imagine a Web experience predicated on citizenship before consumerism? If we ever intend to renegotiate the intellectual foundation and potential of the Web, one option might be its decommercialization. The free-use Internet of the public hinges on and is supported by advertising dollars. A new public space could be founded via one of two avenues: a not-for-profit, Wikipedia-style approach, or socialization, an Internet owned, financed, and governed by the people.

The illustration above is a vision of a new interface, a “citizen portal” akin to Google's centralized iGoogle. These would be the new public-domain passages and highways of the Web, owned and operated by the people and designed to encourage and praise democratic participation. Currently, the Internet as a public space, along with its Web architecture, is owned and regulated by only a handful of corporations—Google, Microsoft, Yahoo (much like the big three in television broadcasting). Google's own page layout and interface (its line weights, color schemes, etc.) are designed to optimize its advertising revenue. A noncommercial, government-sponsored provider seems worthy of experiment: a PBS-style alternative for the Internet.

Now imagine how e-learning might be better respected if its platform, the Internet, became a nexus of civic pride. I'm twenty years old, and my peers and I remain skeptical of online learning because of its highly commercial platform—the Web. Education seems diluted into a material good for consumption, rather than active engagement, when a Web-forum discussion on Plato is a click away from penis-enlargement pills. Furthermore, e-learning presents the prospect of increased hours spent online by youth, a goldmine for commerce and advertising, as students can be drawn from an academic essay to Walmart.com within seconds.

When re-imagining the Internet, we must seriously consider and reflect upon how we navigate physical space and how we embed our values into the infrastructure and organization of our cities. Knowledge and education are often perceived as pillars of democracy, not as IPOs or aisles in the mall. But perhaps the disorderly nature of the Web allows for the unexpected positioning of otherwise segregated forms of information. A platform coupling penis enlargement and Plato might liberate higher education from its pedestal—its perceived irrelevancy trapped behind the gates of academia.

Twitter: I have a head cold and…

Twitter

I’m nursing a head cold and have a blog post due. Can I put something together in small tweets of 140 characters or less?

Wondering about the Web literacy of our online students. Some have to be told to scroll to see content "below the fold." Why is this?

Resisting the notion of designing for users who are Web illiterate. Does designing for the few diminish the learning experience for most?

DePaul IDD consultant Daniel Stanford has written about user tech illiteracy. I’m currently thinking he’s onto something we should consider.

I’m thinking that a basic competency in technology should be a prerequisite for students who wish to take courses online.

I’m thinking about the faculty who are similarly challenged by fundamental Web/tech literacy and teach online. Requirements for them?

Concluding that Twitter is good for musing and asking questions; maybe stimulate discussion about user-centered design & Web literacy?

OK, my Twittering ends above. While I still sense that Twittering is a largely narcissistic activity (as is, in my belief, much of social media), I am interested in its ease of use, immediacy, connectivity, and mobility. I’m also interested in how Twitter’s 140-character limit shapes writing and thought: it won’t let me ramble. In that spirit I’ll wrap up with this: I’m going to try using Twitter to document the user illiteracy I encounter day to day. If you’re interested too, follow me at http://twitter.com/dschmidgall.
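As a playful footnote on how that 140-character constraint bites, here is a small illustrative sketch (nothing Twitter itself provides; the word-boundary splitting rule and the numbered prefix are my own assumptions) that chops a longer note into tweet-sized pieces:

```python
# Illustrative only: split a longer note into chunks of at most 140 characters,
# breaking at word boundaries. The "(1/3)"-style prefix added when printing is
# my own convention, not a Twitter feature, and it may nudge a chunk past 140.
def to_tweets(text, limit=140):
    chunks, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if len(candidate) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks

if __name__ == "__main__":
    note = ("Wondering about the Web literacy of our online students. Some have "
            "to be told to scroll to see content below the fold, and I keep "
            "asking myself why that is and what it means for course design.")
    tweets = to_tweets(note)
    for i, tweet in enumerate(tweets, start=1):
        print(f"({i}/{len(tweets)}) {tweet}")
```

Trivial as it is, writing even this much makes the constraint concrete: every word has to earn its place.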


Make Learning Objectives Short, Punchy, and Retainable

No one likes to read learning objectives. Okay, this might be too extreme a statement. Let me rephrase to make it sound more academically correct: no one, other than instructional designers, academic accreditors, faculty/syllabus writers, or students who are bored to tears, likes to read learning objectives—unless they are short, punchy, and, hence, super retainable!

As an instructional-design professional, I fit into the category of learning-objectives reviewers. I have a tendency to browse through the objectives portion of various documents: course syllabi, training brochures, webinar announcements, and even activity notices from my kids' school. I look at them not to learn the purpose of the events but rather to catch “violators” of our learning-objective rules: “to understand” … a vague word; “to improve” … but how?; “to be able to” … under what condition?

The latest “violator” I encountered was Dr. David Allbritton, from DePaul's psychology department. A few weeks ago, he gave a presentation at an online-learning seminar where he shared the learning objectives of his online psychology course, and they looked like this:

  • Think like a scientist
  • Know stuff
  • Figure stuff out
  • Feel connected
  • Find it relevant
  • Don’t cheat

Maybe this was just an abbreviated list of objectives for the sake of a presentation (with the audience being psychology amateurs), but wow, talk about violations worthy of ticketing and fines! The urge to rewrite was so compelling that within a minute, new objectives took shape in my mind:

  • Think like a scientist → Develop and apply critical thinking skills for decision making and problem solving in the subject area
  • Know stuff → Demonstrate knowledge of the subject matter
  • Figure stuff out → Develop problem-solving skills in the subject area
  • Feel connected → Develop an interactive learning community among faculty and students
  • Find it relevant → Apply knowledge and skills obtained from the course to problem-solving in the real world
  • Don’t cheat → Refrain from any behaviors of cheating and/or plagiarism

Now tell me, which one do you like better, mine or his? Or, to phrase it differently, which one is easier to comprehend and to remember?

As a member of the audience, I must say that I like Dr. Allbritton's objectives, which grabbed my attention right away with his “stuff.” And hey, isn't “gain attention” the very first step of Gagné's “Nine Events of Instruction”? Okay, his second one is “inform learners of objectives,” but if your handout, syllabus, or presentation doesn't allow you the space or time to “holler,” wouldn't it be nice to use your objectives as an attention grabber? If the magazines are doing it (e.g., “Lose 10 Pounds in a Day,” a suspicious but nevertheless clear and straightforward objective) and the book publishers are doing it (e.g., How to Cook Everything, an ambitious objective embedded in the title itself), why can't an academic learning guide, such as a syllabus, be made as easy to grasp as they are? I am not talking about commercializing or “sexying up” our academic lingo for the sake of sensationalism; often that isn't needed. Being straightforward is all it takes to win the bid. Simple expressions, such as to know, to apply, to become, to evaluate, and to change, tell students exactly what to expect from the course, from the fundamental or theoretical (to know) to the practical or methodological (to apply) to the ideological or belief-oriented (to become or to think like).

Having gained my attention and that of the rest of the audience, Dr. Allbritton went on to explain, in the same simple and direct way, the strategies he used to ensure that each target was achieved. As someone who is used to seeing and giving presentations in multislide mode, I found that the following one-page handout of his demonstrated, in a clean and clear fashion, a great way of matching instructional strategies with learning objectives:

Objective: Think like a scientist
Strategy: Emphasize use of evidence to make decisions and support ideas
Implementation: In content of lectures, discussions, and group projects

Objective: Know stuff
Strategy: Give ’em content; test ’em
Implementation: PowerPoint lectures; weekly Bb quiz on textbook

Objective: Figure stuff out
Strategy: Make ’em do stuff
Implementation: Discussion questions; group projects

Objective: Feel connected
Strategy: Make them feel they are interacting with real people
Implementation: Introductions assignment; video intro lecture by instructor; voice-overs in PPT lectures; discussions and projects with small groups

Objective: Find it relevant
Strategy: Make them apply it
Implementation: Discussion questions; final project in which they apply material from the course

Objective: Don’t cheat
Strategy: Lots of low-stakes assignments; no high-stakes tests; no “purchasable” term papers
Implementation: Weekly quizzes; lots of small assignments; final paper requires application rather than just summaries

Student Toolkit

Here at DePaul, we have the DePaul Online Teaching Series program (DOTS), where we work with faculty to help prepare them for the unique challenges of teaching online. It’s an intensive program that begins with a crash course in designing an online or hybrid course and goes all the way through working with a design consultant to get the course completed and evaluated.

In order to help the faculty effectively accomplish this, we give them the tools they will need to create their course, including a laptop computer, a webcam, a headset microphone, software, and a portable voice recorder. Doing this ensures that they have all the technology they will need to produce a robust, dynamic, and interesting course.

I received a phone call today from an instructor who had gone through the DOTS program, asking what resources were available to a student who wanted to produce videos to submit to the class. This got me thinking about the aforementioned technology toolkit we give to faculty. At what point will students need a similar toolkit?

A great deal of focus in course design is often placed on creating instructional materials for the students to consume. For example, they watch a video, read an article, or view a Web site. There is not much focus on student-created content—regardless of whether it is eventually offered up for assessment. The majority of the time, students interact with the material through writing a paper, posting to a discussion board, or taking a quiz.

However, what happens when an instructor would like to send off her students to create materials for assessment similar to what the instructor can produce for the students to consume? Where does a student, especially an online student, obtain the required video camera, microphone, or editing software?

This line of thought, combined with a conversation I had the other day about technical requirements for online students, made me wonder whether, in the future, we will publish not only computer specs for students but also lists of the peripheral devices they will need to succeed in an increasingly visual and technical world.

I can’t wait to see where this may lead.


Just Because They’re Young Doesn’t Mean They’re Tech Savvy

A professor I work with recently decided to use Ning to create an online social network for a course. Like Facebook, Ning provides a space where users can communicate and share links, images, and videos. However, Ning allows instructors to create a space that is used exclusively for course-related collaboration and is only accessible by their students. This increased level of privacy and focused purpose helps everyone involved maintain boundaries between their academic and personal lives.

Shortly after the course began, the professor noticed many of her students were having trouble with basic tasks such as uploading images, embedding YouTube video clips, and writing blog posts. The professor told me, “I have a blog and I’m almost fifty. I was shocked that my students have no experience with blogging.” I wish I could say I was as shocked as she was. Unfortunately, I know this problem all too well and I’ve been writing about it periodically for the past year. Back in February of 2008, I wrote a post about the importance of defining computer literacy. My major complaint at that time was the lack of agreement on a minimum technology literacy level for college students. The lack of computer-literacy requirements and classes to support students who don’t meet such requirements places an unfair burden on faculty. Professors who wish to use new technology in their courses wind up serving as tech support for students who lack a fundamental understanding of interactive media.

Back in November, I also wrote about the misleading stereotype of the tech-savvy millennial learner that I hear about so often at conferences. As much as people love to refer to today’s twenty-something college students as “digital natives,” many of these students are more like “digital resident aliens.” They’ve learned just enough to get by, but ask them something that’s not in their phrasebook and you’ll quickly see how superficial their knowledge really is.

Sadly, the lack of a well-rounded technology education isn’t just failing students in the arts and humanities. Students pursuing technology-focused degrees are also suffering. An article in The Chronicle of Higher Education recently noted that many Web-design instructors are not preparing students for the demands of employers in the field. In “Colleges Get Poor Grades on Teaching Web Fundamentals,” the author cites a survey developed by Leslie Jensen-Inman, an assistant professor of art at the University of Tennessee at Chattanooga. Jensen-Inman interviewed thirty-two professional Web designers and discovered that universities are either encouraging students to overspecialize in a particular piece of software or programming language or teaching outdated tools and techniques that are no longer relevant in the working world.

As a part-time Web-design professor, I found this article vindicating, because it supports my belief that students need a broad range of up-to-date knowledge to become successful designers themselves. In addition, I think the basic skills and knowledge that aspiring Web designers need are becoming increasingly essential for all college students. Knowing how to manage digital files, maintain a blog, participate in an online discussion, embed media in a Web page—these are all skills that will prove valuable no matter what a student's career aspirations might be. Now we simply need to recognize that this knowledge won't reach critical mass by osmosis. Hundreds of hours of Wii Tennis, text messaging, or Twittering might do a lot to reduce technophobia in a new generation of students, but they don't necessarily increase students' understanding of how interactive media works or enable them to transfer knowledge from one tool to another.

Many instructional designers might disapprove of the idea that we should relegate new-media education to a single “Technology 101” course. Instead, they often support an integrative approach in which technology is used across the curriculum as a means to an end for a variety of disciplines. I agree that it’s wonderful to see faculty using technology to improve learning in a variety of subject areas, from philosophy to chemistry to mathematics to the fine arts. However, I think attacking the problem from both sides could help ensure the push for technology integration doesn’t always come from the top down.

A Technology 101 course could help ensure today’s students can live up to the tech-savvy stereotype we’ve already forced upon them. With a little support from the bottom, we might finally see more students pushing faculty to use new tools and helping instructors improve their technology literacy. Until then, I’m afraid we might be stuck in an inefficient, reactive model that attempts to support students once assignment deadlines are looming and panic has set in. This approach is a bit like asking students to drive cross-country after giving them the keys to an eighteen wheeler and an 800 number to call if they have questions as they’re barreling down the highway. Will some of them make it? Sure. But a little driver’s ed up front could prevent a lot of disasters down the road.

Language and Thought: Explanation and Understanding

Conventional wisdom views language as a device through which thought is actualized into spoken or written word, as a tool that simply assists in the representation of something that precedes it. To paraphrase a science mentor and dear friend of mine, “We do not create the world through language. Language and explicit knowledge are the poor symbolic systems we use to try and communicate about the real creator of the world: implicit rules and knowledge that are metasymbolic.”

I disagree with this assessment and see an important, fundamental feedback between metasymbolic, implicit rules and knowledge on one hand and language on the other. Understanding language formally, as a symbolic, self-contained system governed simply by syntactical and grammatical rules, is narrow and fails to recognize that language not only expresses thought but also guides it. Such a failure underestimates language's potential to both enrich and stifle thought. With this in mind, the belabored arguments below are meant to support a single, simple statement:
The task of developing rich language skills (ideally in more than one language) should be undertaken not only by language or creative-writing majors but by everyone, since one's level of linguistic skill provides the basis for the development of critical and creative thinking, which is fundamental to all human endeavors.

By the time in our lives that critical thinking and reflection have become prominent aspects of our being, both the use and the understanding of language have themselves become implicit, creating the illusion of a given language's “naturalness.” Those who speak and write fluently in more than one language often discover aspects of thought and feeling that are much more accessible in one linguistic scheme than in another, a discovery that destroys this illusion. I, for one, think and feel differently, express myself differently, and focus on different aspects of my experiences depending on whether I “function” in Greek, English, or German. I can think of several words that exist in one language and not in another (especially words with subtle shades of meaning) that not only suggest differences in how thoughts are expressed but also support the formation of different future thoughts. For example, there is no Greek noun that can capture the meaning of the English “privacy,” while the English “hospitality” and the equivalent Greek “filoxenia” (literally and clumsily translated as “friendship towards strangers”) clearly put the emphasis on different aspects of the concept they describe. In both cases, the linguistic differences reflect and support attitudes towards privacy and guests that are fundamentally different between the two traditions.

The drawbacks of formal approaches to language come to the forefront especially when trying to address prosody and metaphor, linguistic devices that account for a large portion of communicated meaning and of language use and creation in general. All the formal “substitution” theories of metaphor accomplish is to create a model that is “Ptolemaic” in its complexity and uselessness, trying too hard to stick to existing ideas simply because embracing different ones would require thinkers to enlist the help of unfamiliar intellectual traditions. But I will reserve this topic for a future post.

Winograd and Flores (1986) observe that even sophisticated linguists are puzzled by the suggestion that the basis for the meaning of words and sentences cannot ultimately be defined in terms of an objective external world. Words correspond to our intuition about “reality” simply because our purposes in using them are closely aligned with our physical existence in a world and with our actions within it. But this coincidence is the result of our use of language within a tradition (or as some biologists may say, of our “structural coupling” within a “consensual domain”).  As such, this reality is based on language as much as it reflects it.

Ultimately, language, like cognition, is fundamentally social and may be better understood if approached as a “speech act” rather than a formal symbolic system, a move that introduces the importance of “commitment,” as described in speech-act theories of Austin and others. Both language and cognition are relational and historical, in the larger sense of the word. As Winograd and Flores note, the apparent simplicity of physically interpreted terms such as "chair" is misleading and obscures the fact that communication through words such as "crisis" or "friendship" cannot exist outside the domain of human interaction and commitment, both of which are intricately linked to language (as speech act) itself. This apparently paradoxical view that nothing (beyond simple descriptions of physical activity and some sensory experience) exists except through language describes the fundamentally linguistic nature of all experience and motivates me to approach moments of understanding (i.e. “understanding” experiences) as the achievements of explanatory (i.e. linguistic) acts.

The power of language to create, rather than simply express, thought and meaning may actually be more easily recognized through an examination of the relationship between explanation and understanding. The writings of Gadamer (1960), Ricoeur (1991), and others have expanded our conception of explanation, illustrating that it cannot be approached as simply the result of, and subsequent to, understanding.

Explanation and understanding are both products of thought, “moments” of knowing that constantly interact in a productive feedback. This feedback is manifested as communication, reflection, and the like, and it has explanation, rather than understanding, at its center. In this scheme, explanation is linguistic in nature (whether as discourse—with someone or within—or as text), and understanding is cognitive/phenomenological (whether as thinking or thought). Explanation (interpretation) is not seen as a post-facto supplement to understanding but as belonging to understanding's inner structure, an integral part of the content of what is understood. I see Gadamer's efforts to recover the importance of application (“understanding always involves something like applying the text to be understood to the interpreter's present situation”) as evidence that application is the ultimate explanatory act. As an “explanatory achievement,” understanding is the fruit of explanation, “being realized not just for the one for whom one is interpreting but for the interpreter himself.” This essentially argues that understanding is “explaining to self.”

If, along with Gadamer, we conceive every statement as an answer to a question, what we understand as a statement’s meaning is an answer, an explanation. And even though the moment of understanding often seems to occur without explicit interpretation/explanation, it is always preceded by an explanation to self, motivated by the hermeneutic question that has to be asked and be answered in any event of understanding.

The understanding/explanation dialectic parallels the one between thought (understanding) and language (explanation). A thought that cannot be “explained” linguistically (to self or others) is better approached as intuition, not as understanding. The revelatory moment of experiencing a work (linguistic or otherwise) that manages to say to us what we could only intuit is what transforms our intuition into thought: it helps us escape the prison of our previous language (and thoughts), and the intuition is verbally reconstituted through our new language, enriched through our encounter with the work. Our interaction with the work gives us the tools to explain our intuition to ourselves and turn it into a thought, with our newly found understanding being the culmination of an explanatory moment, however “implicit” or “concealed” this moment may seem.

This is just a blog post rather than a piece of academic writing, so I will allow myself the luxury of closing with strong words: Language must be recognized as our means of formulating thought, with all understanding viewed as the result of explanatory moments whose ontology is linguistic. Explanation and understanding, in turn, must be recognized as being tied into a continuous and dynamic feedback loop that develops through the initiation of acts of explanation. With Winograd and Flores, I reject cognition as the manipulation of knowledge of an “objective world,” recognize the primacy of action and its central role in language, and conclude that it is through language that we create our world.

 

References

Gadamer, H. G. (1960). Truth and Method. 2nd edition (1989). New York: Continuum.

Ricoeur, P. (1991). A Ricoeur Reader. M. J. Valdes (ed.). Toronto: University of Toronto Press.

Winograd, T. and Flores, F. (1986). Understanding Computers and Cognition: A New Foundation for Design. Norwood, NJ: Ablex.


Millennial Learners Are Unique, but They’re Not the Jetsons

I just attended the 2008 Lilly Conference on College Teaching, where the theme was “Millennial Learning: Teaching in the 21st Century.” I enjoyed some of the keynote presentations, especially Erica McWilliam's “Is Creativity Teachable? Conceptualizing the Creativity/Pedagogy Relationship in Higher Education.” In her presentation, McWilliam noted that creativity is vital not only in the arts but also in scientific disciplines, where creative thinking leads to key breakthroughs.

While McWilliam believes creativity can be taught, she claimed that it cannot be done simply by giving students free rein over their learning experience. She addressed a critical flaw in the rejection of the traditional sage-on-the-stage model of instruction in favor of the guide-on-the-side approach. According to McWilliam, this trend encourages instructors to become too passive and compromises the level of rigor we traditionally associate with more structured courses and teaching methods. Instead, McWilliam proposed an approach she calls “meddler in the middle.” This approach encourages experiential learning and assignments that foster independent, critical thinking. However, it requires faculty to be actively involved along the way, setting high standards for success and rejecting the notion that all answers are valid.

While I enjoyed some of the keynote presentations at the Lilly Conference, I have to admit that there was also a thorough beating of the dead horse that is the “millennial student.” Several presenters rattled off the same sweeping generalizations about the millennial generation that I've heard so often at past conferences, including classics like “They're multitasking visual learners,” “They prefer to learn by doing,” and everyone's favorite, “They're incredibly tech savvy.”

Even if some of these statements are exaggerations, they’re not particularly harmful because most of them are based on facts or at least a relatively scientific survey. However, I find it hard to hide my annoyance when someone tells yet another anecdote about the now-famous (yet nameless) young college student who text messages her friends while listening to her latest class lectures on her iPod and updating her Facebook page—all while driving to her apartment in the sky in a flying hovercar.

It seems no educational-technology conference presentation is complete these days without the obligatory stock photo of a hip, young student with a laptop tucked under his arm, iPod headphones in his ears, a video-game controller in one hand, and a cell phone in the other. This photo is usually a warning that the presenter is about to describe a bleeding-edge case study that will make use of Second Life, Twitter, Facebook, or some other tool that is revolutionizing education as we know it.

The problem with this recurring emphasis on millennials and their insatiable appetite for bleeding-edge technology is that it makes faculty feel they're always behind the times. Most of the instructors I know are excited if they can figure out how to embed a YouTube video in Blackboard or insert an audio file in a PowerPoint presentation. Now imagine how those instructors must feel when they go to a conference and discover that PowerPoint and YouTube are “so five years ago.”

I’ll be the first to admit that I’m a part of the problem. I just gave a presentation titled “Beyond PowerPoint and YouTube: Making the Most of Multimedia for Language Instruction” at the fall conference for the Illinois Council on the Teaching of Foreign Languages. The session was packed and the attendees were very eager to learn. However, it was clear to me based on their questions and feedback that they would have been just as happy with a session titled, “PowerPoint Tips and Tricks: Making the Most of the Everyday Tool You’ve Never Had Time to Master.”

I’m certainly no PowerPoint evangelist. I like building educational mini-games in Flash, trying out new blogging and wiki tools, and encouraging faculty to use services and sites that often include the word “beta” in their logos. However, I think it’s important to admit that the simplest solution for presenting instructional material is often the best. For many professors, that solution is PowerPoint.

Occasionally, instructors might want a feature that PowerPoint can’t offer. They might want students to be able to view presentations in their web browsers and comment on them. They might want students to be able to create their own presentations with audio-narration and easily share them with others. When those needs arise, it’s important to offer them the simplest, most reliable solution that gets them from point A to point B. If a French professor wants students to create narrated cultural tours of Paris, we should introduce that professor to VoiceThread. We shouldn’t encourage her to establish an island in Second Life, hire three graduate students to build a replica of central Paris, and force her students to create avatars and chat in French inside a bad recreation of the Hall of Mirrors at Versailles.

If you’d like to know more about alternatives to PowerPoint and the features they provide, you can view the multimedia presentation tools comparison I put together in October of 2008. All of the sites listed feature tools I’ve actually tried myself, and I’ve included the pros and cons I discovered after creating and uploading test presentations of my own. Some of the tools I’ve highlighted (e.g., Google Docs) might not win me any awards for being on the bleeding edge of instructional technology. However, as someone who knows a lot of professors, I know from experience that it’s important not to overestimate the tech needs and wants of faculty. And as someone who is technically a millennial by some definitions, I think it’s important not to overestimate the tech needs and wants of millennials. After all, I’m living proof that some millennials are happy with a traditional, well-delivered lecture with minimal fuss. And for the record, I’ve never text messaged a friend while updating my Facebook page.

Now, if you’ll excuse me, I have to take my hovercar in for servicing. My info console has been acting up and it won’t play my video mail or let me make online bill payments while driving at hyperspeed.