Being the sports fan that I am, I have taken note of the recent outbreak of Twitter-related disciplinary actions involving athletes. Those of you who follow the NFL or NBA are familiar with the Chad Ochocincos and Gilbert Arenases of the world. And the trend has filtered down into the collegiate and high-school ranks as well. The Texas Tech football team was banned from tweeting last season, and just last week, a University of Idaho basketball player was suspended for tweets critical of his coaches and teammates. The rationale for the disciplinary action is nearly always that the tweet is “conduct detrimental to the team.”
One of the great challenges and opportunities in online teaching and learning is the capacity to leverage the medium to take a distributed environment and create community. One needs only a moment to see the proliferation of social networking as evidence for the ability of the Web environment to support community. Clearly, not all tools work as envisioned, nor do all courses benefit from the use of certain tools. Yet, does a compelling argument even exist to not make use of such technologies in online learning? But what is the appropriate action when a discussion board is hijacked or a class blog goes up in flames?
Classroom management is not a subject often discussed in online-learning circles. With the increasing socialization of our online courses, is conduct detrimental to the team an issue? And what can be done about it?
We all agree it is imperative to keep striving to improve each student’s learning experience while maintaining a balance that both promotes the use of social tools and preserves an environment of respect.
The question is how?
I am curious to learn about strategies for dealing with, or better yet, preventing such conduct from this community.
You want to use some Web 2.0 technology in your course, so you have each student create a blog on Blogger to chronicle their work and thoughts through the term. As the instructor, you visit these sites and leave comments on the blogs. To keep track of which student has which blog, you ask each student to put his or her name on the blog’s front page and to e-mail you the URL so that you can work through them all, moving from one blog to the next. No grades are shared via the blogs, and your final evaluation of each student comes in feedback that you provide within the Gradebook area of Blackboard.
Is this a violation of FERPA?
There were some very good answers in the comments section, and now it’s time for me to share mine.
The short answer is yes.
There are a few land mines in this scenario, but the one that jumps out at me is that the instructor leaves comments on the blog about the student’s posts. When an instructor reads and grades student-submitted work—as happens with these blogs—that work becomes part of the student’s education record. Remember the definition of an education record according to FERPA (PDF): “Education records are currently defined as records that are directly related to a ‘student’ and maintained by an ‘educational agency or institution’ or by a party acting for the agency or institution.” When the instructor leaves evaluative feedback for the student in a public comment on the post, he or she makes part of the student’s education record public and thereby violates FERPA.
Another land mine in this scenario is that the blogs were not necessarily made private, so anyone could view them and connect a student’s name with enrollment in a particular course, term, and institution. Requiring the student’s name to appear on the front page is also a red flag.
Since you were so good at answering my last question, I pose another to you: what could an instructor do differently in this assignment to keep the academic objective of the assignment (self-reflection) without violating FERPA?
In 1998, I had my first full-time job as a computer-graphic designer in a media center at Indiana State University. The word “computer” in my job title differentiated me from the other graphic designers in the office. While they produced print materials like banners and posters designed in Photoshop or Illustrator, I didn’t do much of the drawing and printing, because to me, the word “computer” meant but one thing—PowerPoint!
PowerPoint, believe it or not, was a high-end, technical tool at the time (meaning higher-end than overhead transparencies). My job was to produce PowerPoint slides for televised distance-learning courses. I remember getting those highlighted textbooks from faculty and typing page after page of content into PowerPoint slides. I remember the “wows” from faculty thrilled to see text flying in line by line. I remember the thrill I felt myself when my designer peers asked whether the animated presentations I created were really done with MS PowerPoint—“It looks like a (Macromedia) Director product,” they said. Soon I was crowned “the PowerPoint guru.”
Yet, deep in my heart, I knew that this glory would not last long: my crown would become an old hat once other users figured out my tricks—or worse, they would be discovered by the vendor, who would then make them part of the application. I thought this would happen within a couple of years.
So I was shocked a few months ago when an associate vice president of my institution asked me about offering a PowerPoint workshop, because she had seen too many presenters that “were sorely in need of training on how to give effective PowerPoint presentations.”
After thirteen years, with all the comings and goings of dazzling new tools, guess what? We are back to PowerPoint!
I was even more shocked when I learned that enrollment in the workshop (Beyond the Bulletpoint: How to Design Low-Tech, High-Effect Presentations) reached thirty-two in a matter of days and the event organizer was asking me whether we should close it or offer another session. Oh, come on, we can’t close it! It was my good old PowerPoint staying cool in the era of Web 2.0! And besides, isn’t it wonderful to know that after more than a decade, people are still interested in my tricks (I mean, they still haven’t figured them out yet)?
I guess this has been a long enough teaser. Let me get to the meat of this entry: the tricks.
My tricks in using PowerPoint are as simple as following two basic rules: a) avoid PowerPoint sins and b) inject creativity into the presentation design.
Avoid PowerPoint Sins
I consider the following behaviors sinful for any PowerPoint presentation:
Sin I: Long, Massive Text Blocks
This means more than six lines of content with a font size smaller than 18. Anyone who throws full-blown paragraphs into the slides is asking PowerPoint to serve as a teleprompter and forgetting the fact that those things are supposed to be hidden from the audience.
Sin II: Long, Full-Sentenced Bullet Points
This might be less sinful than paragraphs, but it still makes it impossible for the audience to grasp the key points no matter how loudly you read them. (And by the way, reading from the slide doubles the sin.)
Sin III: Unnecessary Decorative Elements
Unless your audience is too immature or intellectually challenged to understand your concepts, you should control the use of clip art. I still feel ashamed of this slide I created thirteen years ago. The clip art of the tool box is nothing but an insult to college students.
Sin IV: Excessive Use of Animation
With the infusion of all sorts of digital gadgets, our world is already overanimated. Unless it carries some meaning, animation is merely annoying (see the next section for the meaningful use of animation).
Sin V: Serif Font Type and Low-Contrast Color Schemes
Picky as it may sound, text set in Times New Roman in a PowerPoint screams that it was created by a nonprofessional designer. Those little finishing strokes at the ends of the letterforms aren’t reader friendly for at-a-glance or on-screen reading. And common sense will tell you that dark text on a black or blue background isn’t reader friendly either. Our everyday writing medium is black text on a white background, which can teach us a simple but very useful lesson about the friendliest combination of colors.
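If you want to check an existing deck against Sin I and Sin V without clicking through every slide, a short script can do the nagging for you. Here is a minimal sketch using the third-party python-pptx library; the file name, the six-line and 18-point thresholds, and the list of serif fonts are just my illustrative choices, not hard rules.

```python
# Rough slide audit for Sin I (long text blocks, tiny fonts) and Sin V (serif fonts).
# Assumes python-pptx is installed: pip install python-pptx
from pptx import Presentation
from pptx.util import Pt

MAX_LINES = 6                  # Sin I: more than six lines of content in one block
MIN_FONT = Pt(18)              # Sin I: font size smaller than 18 points
SERIF_FONTS = {"Times New Roman", "Georgia", "Garamond"}  # Sin V: a few common offenders

def audit(path):
    prs = Presentation(path)
    for number, slide in enumerate(prs.slides, start=1):
        for shape in slide.shapes:
            if not shape.has_text_frame:
                continue
            paragraphs = shape.text_frame.paragraphs
            if len(paragraphs) > MAX_LINES:
                print(f"Slide {number}: {len(paragraphs)} lines in one text block (Sin I)")
            for paragraph in paragraphs:
                for run in paragraph.runs:
                    size = run.font.size          # None means the size is inherited
                    if size is not None and size < MIN_FONT:
                        print(f"Slide {number}: {size.pt:.0f}pt text (Sin I)")
                    if run.font.name in SERIF_FONTS:
                        print(f"Slide {number}: serif font {run.font.name} (Sin V)")

audit("lecture.pptx")          # hypothetical file name
```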
Inject Creativity into Presentation Design
I love reading the debate on whether creativity is teachable. This year’s International Conference on College Teaching and Learning frames the question as, “Creativity: Art or Science?” I believe creativity is a mix of art and science: while it does require a fair amount of natural talent, conscious exposure to innovative ideas and approaches will stimulate creative sparks within the ordinary.
Over the years, I’ve seen many great presentations—with and without the use of PowerPoint. The ones that have used PowerPoint usually used it to serve the following four purposes:
To inform, to illustrate, to inspire, and to prepare.
To Inform
In most cases, PowerPoint is used as a visual aid for content delivery during lectures and presentations. People use it to get their point across. But the best way to get the point across is not to throw bullet points at the audience. I’ve found that when information is presented as a story, it’s easier for the audience to comprehend. The following video doesn’t offer any text-based definitions or bullet points about Google Wave’s features; instead, it uses animated graphics to tell a story about e-mail. Can the same be achieved with PowerPoint? My answer is yes.
To Illustrate
To combat the laziness of the human brain, Dr. Chris Atherton from the School of Psychology at the University of Central Lancashire offered some strategies for designing PowerPoint slides:
As you might have noticed, this presentation doesn’t use any of PowerPoint’s built-in templates. Most of the slides are simply black text on a plain white background. The presentation also contains no animation and is, therefore, well suited for online viewing via Slideshare. The plain design lets the plain truths the author wants to share stand out without interruption.
In other cases, animation can be a powerful tool to keep the viewer focused on the flow of information, like in this presentation I did last year on online teamwork (click on the image below to access the presentation on Slideboom):
To Inspire
TED.com, which is my favorite Web site, inspires me not only with its presenters but also with some of its creative PowerPoint design. Look at this one by Larry Lessig on “Laws that Choke Creativity” and feel the choreographic harmony between speech and slide show. In this case, the power of the PowerPoint lies in its ability to hit the key ideas at the right moment.
TED.com assured me that by delivering the best and the brightest directly to our computer screens, technology is breaking through the knowledge monopoly! Someday we might move into an age of presentation Darwinism in which the mediocre can no longer survive, as people click through the Internet to view and rate only the best content. Until then, sites like TED.com have at least helped set a high bar in terms of presentation design.
To Prepare
Lastly, I have seen PowerPoint used as a notebook provided to students by the instructor before, during, or after class. Slides like these can be as self-sufficient as a textbook, allowing students to prepare for class or an exam, or saving them from having to take notes in class. Projecting these slides on the screen to guide an in-class lecture, however, can be dangerously boring (if nothing else, the dimmed lights alone induce the desire to doze off). Such slides are better suited to being a handout than a presentation, but if you really want to use them, you can try removing some key concepts to stimulate brainstorming among students.
The other option takes some time, but PowerPoint does allow us to create minitutorials by hyperlinking text and graphics between slides.
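If you build these often, you can even script the skeleton. Below is a rough sketch of the menu-and-topic-slides idea using the python-pptx library; the layout index, file name, and the click_action.target_slide call reflect my reading of that library’s API and should be double-checked against the version you have installed, so treat it as a starting point rather than a recipe.

```python
# A bare-bones "menu" minitutorial: a contents slide that jumps to topic
# slides and back. The click_action.target_slide assignment is my assumption
# about python-pptx's hyperlink API—verify it against your installed version.
from pptx import Presentation
from pptx.util import Inches

prs = Presentation()
layout = prs.slide_layouts[5]        # "Title Only" layout in the default template

menu = prs.slides.add_slide(layout)
menu.shapes.title.text = "Choose a topic"

topics = ["Avoiding PowerPoint sins", "Injecting creativity"]
for i, name in enumerate(topics):
    topic = prs.slides.add_slide(layout)
    topic.shapes.title.text = name

    # Clickable entry on the menu slide that jumps to the topic slide
    entry = menu.shapes.add_textbox(Inches(1), Inches(2 + i), Inches(5), Inches(0.8))
    entry.text_frame.text = name
    entry.click_action.target_slide = topic

    # "Back to menu" link on each topic slide
    back = topic.shapes.add_textbox(Inches(1), Inches(6.5), Inches(2), Inches(0.5))
    back.text_frame.text = "Back to menu"
    back.click_action.target_slide = menu

prs.save("minitutorial.pptx")        # hypothetical file name
```

Opened in PowerPoint, the text boxes behave like the hyperlinked buttons described above, and you can flesh out each topic slide by hand from there.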
Do you have any creative ideas in using good, old PowerPoint? Post them here so we can share.
I’m a fairly typical, multitasking, always-connected member of generation Y (or a late gen-Xer, depending on who sets the cutoff date). My laptop and I are rarely apart, and I quickly run out of things to occupy my time when I’m deprived of high-speed internet access. (My parents finally upgraded from dial-up just before the holidays and, as a result, I finally agreed to stay with them for more than 48 hours.)
In short, I get bored easily, which is why I’d always suspected that Las Vegas might be my ideal vacation destination. After all, Vegas is designed for people who can’t focus. Buffets are abundant—ideal for those who can’t even commit to a particular entrée for an evening. Cirque du Soleil shows are multiplying like rabbits—perfect for anyone who loves live theatre but hates paying attention to one performer for more than fifteen seconds. Nearly every casino offers a superficial imitation of some ancient city or wonder of the world—fantastic for the tourist who can’t imagine spending an entire week in just France or Italy or Egypt.
I just returned from my first Vegas trip, and unfortunately, it seems all of the city’s catering to multitaskers comes at a high price. At first, I was drawn in by the bells and whistles. A slot machine featuring stars from the hit ’70s game show Password? Amazing! Where do I insert my money? A shopping mall with a maze of canals and happy couples riding in gondolas while being serenaded by a man dressed like the Hamburglar? Incredible! I’ll take two tickets, please.
Unfortunately, it wasn’t long before the bells and whistles lost some of their appeal. “Doesn’t that gondolier know any songs other than ‘Mambo Italiano’?” I wondered. “And why does the attendant in the Parisian pastry shop sound like Marisa Tomei?” I could squint my eyes and pretend the stamped concrete was real cobblestone and the faux-finished walls were made of real plaster. Yet, eventually, I had to accept that underneath it all, Vegas was largely composed of some very mundane raw materials—primarily concrete and overweight chain-smokers.
The same holds true for online courses. We can try desperately to hold our students’ attention with flashy games and constant variety. We can reel them in with the promise that we won’t make them work too hard or stare at any one thing too long. But sometimes what’s fun or easy isn’t what’s best. A bad discussion prompt is not better just because it takes place online. A boring lecture is not more interesting just because you’re watching it on an iPod. And a hamburger with half a press-on nail wedged under the bun is not better just because it was served by a young woman from Des Moines dressed like Cleopatra.
I like to joke with participants in our faculty-development workshops that there is one key to being an amazing online instructor: just be riveting. Of course, that’s easier said than done. But we all have ways of presenting material that can keep students hanging on our every word. By choosing what to present and how to present it, you can make your lectures and assignments funny, relevant, scary, provocative, or inspiring. And you don’t need technological bells and whistles to do this. Professors were creating riveting lessons long before the advent of the first educational technology—paper. (And just imagine all the awful things teachers have forced students to read and write simply because it was finally possible to do so without a hammer and chisel!) That’s not to say educational technology is useless. It’s just important that we don’t let it be a driving force when designing a course.
A colleague recently sent me information about the PBS program titled Digital Nation: Life on the Virtual Frontier. Portions of the show featured several educational-technology scholars discussing the importance of engaging today’s multitasking, millennial learners. There were the usual cliché shots of students texting and updating Facebook while their dinosaur of a professor drones on from the stage below. The scholars talked about the need to keep students engaged the same way their favorite computer games do, with one scholar promoting an entire school curriculum built around game-based learning. While I salute educators for their openness to new teaching methods, I think it’s critical that we not lose sight of what truly makes for an engaging course and what great teachers have been doing right for hundreds of years. In the end, there’s no need for flashing lights and faux finishes if you already have the real Eiffel Tower and great pastries.
In my post from November 9th, 2009, I suggested two discussion starters—polling and pros and cons—and promised more strategies in future posts. So, here are two strategies for getting your online students talking to each other in more depth about course content.
Roles in a Case Study – Present the class with a short case study and assign each group a part to play in that case. Each group discusses its “part,” identifying primary concerns, varying influences, and possible actions for that stakeholder. Each group reports when everyone reconvenes. Discussion flows from there to identify differing approaches to the problem and possibilities for a mutually agreeable solution. I’ve seen this work particularly well with an ethical dilemma; it would also work in any course addressing conflicting concerns and interests.
Problem Solving Based in Theory – In this activity, the instructor (or selected students) provides a real-life situation. Each group develops a response based on a different theoretical stance. When the class reconvenes, each group reports, and a discussion ensues about the differences between the responses.
Why might these work? In each case, breaking up the class into smaller groups 1) puts more pressure on each student to participate and 2) eases the pressure of individually putting an idea out to the entire class.
The advantage to the instructor? As with the discussion starters of November 9th, you get the opportunity not only to see what students think they know—and so have an opportunity to correct misconceptions—but also to see whether or not they can apply what they know.
When instructors who have years of experience teaching face-to-face classes start teaching online, it’s tempting to try to simply “port” their traditional classes into the online environment—that is, to convert their existing classes to a new medium with no modification. These instructors have developed well-tested teaching techniques, sometimes through a painstaking trial-and-error process, and are often understandably hesitant to change them.
But while studies have shown that a well-designed online class can be as effective as a traditional class, there should not be a one-to-one correlation between how a traditional class is put together and how an online class is put together. Web environments have different capabilities and limitations than a face-to-face classroom. For example, online classes allow a discussion to stretch over a period of days (allowing more thoughtful contributions) but limit the immediacy of an in-class conversation, perhaps making it harder to generate the same energy. Online classes allow a nonlinear class experience but limit the instructor’s control over the student’s attention. These capabilities and limitations should be considered in the design of an online course.
But I would like to reiterate this point using an example from outside academia that will hopefully clarify why Web content should be developed with the capabilities and limitations of the Web in mind. This will show what kinds of problems can develop from simply “porting” information to the Web.
Last year, I did some research on the concept of genre in new media and the public sphere. I studied, among other things, the differences between the quality of discourse generated in user comments on political blogs and user comments on newspaper editorials presented online. (By quality of discourse, I mean the tendency of participants to cite evidence for their claims, use logical arguments, avoid ad hominem attacks, etc.) Without going into too much detail about my study, let me just say that I found more productive discourse in the comments attached to political blogs. Why is this?
There may be a number of factors, but one is that when newspapers establish an online presence, they generally just move their articles and editorials onto the Web with no modification. They are not developing online content as much as just presenting their print content on a Web page.
Political blogs, on the other hand, do not simply port content to the Web that was developed for another medium. Rather, they utilize the capabilities of the new technology in creating content. For example, they use hyperlinks to cite their sources, allowing readers to independently verify that the blogger’s characterization of those sources is fair. And bloggers draw on comments to their posts for insight, raw data, and differing perspectives, sometimes even modifying or supplementing their original post in response to user comments.
Because blog entries engage readers using techniques that are unique to the Web experience, they generate a more productive (though still seldom polite) exchange of ideas in their comments sections.
So, how does this relate to online learning? Just as newspapers fail to engage participants by simply porting print content to the Web rather than developing Web content, online classes run the risk of failing to engage online students by porting a face-to-face class to the Web, rather than developing a Web-based class.
So how does an instructor go about developing a class for the Web rather than just on the Web? I’m afraid that’s a large question with a variety of possible answers, and this is beyond the scope of this humble entry. There is plenty of specific advice in other entries in this blog, and if instructors need more help, why, that’s what instructional-design consultants are for!
All the talk about learning management systems (LMS) around the office lately reminded me of a dataset a couple of colleagues and I put together last year. Dr. Florence Martin, Dr. Yuyan Su, and I undertook the task of validating an instrument to measure LMS self-efficacy.
Bandura (1997) defined self-efficacy as beliefs in one’s abilities to carry out a desired course of action. I’ll spare you the details of orthogonal exploratory and confirmatory factor analyses.
One of the many variables we decided to examine was whether student LMS self-efficacy was a predictor of course performance. After all, is not learning the primary motivation for using a learning management system?
Reported self-efficacy was generally low. However, students enrolled in hybrid courses reported significantly lower self-efficacy than students in face-to-face or fully online courses. In addition, for students enrolled in hybrid courses, we found a significant positive correlation of LMS self-efficacy with course performance.
It is perplexing that a significant positive correlation occurred only for the hybrid learners. One would think that using the LMS as a supplement to face-to-face instruction would require less confidence with the system than a course in which all content is delivered through the LMS.
Hybrid learners often had the option to enroll in a fully online version of the course but self-selected into the hybrid version. Is this due in part to their lower self-efficacy with the LMS? Or does it mean there is a baseline competence with LMS use required for success, but once that level is perceived to be reached, greater self-efficacy with the system is not required?
Finally, the only instrument category that did not yield a significant difference between modes of delivery was “Accessing Information.” This section included items like logging in to the LMS, navigating a course site, and accessing text-based class materials and grades. It was also the highest-rated category for self-efficacy. We hypothesize that this finding reflects the predominant use of an LMS throughout each student’s academic experience. As Bandura (1997) suggested, the formation of self-efficacy beliefs is based primarily on reflection on and interpretation of past performance.
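For the curious, the per-mode check described above boils down to a simple correlation run separately for each delivery mode. Here is an illustrative sketch with invented numbers (not our data) showing the general shape of that analysis:

```python
# Illustrative only: Pearson correlation of LMS self-efficacy with course
# performance, computed separately for each delivery mode. The numbers
# below are made up for the example, not taken from the study.
from scipy import stats

groups = {
    "face-to-face": ([3.1, 2.8, 3.5, 3.0, 2.9, 3.3], [82, 75, 88, 79, 81, 84]),
    "hybrid":       ([2.2, 2.9, 3.4, 2.5, 3.1, 2.0], [70, 80, 90, 74, 85, 65]),
    "fully online": ([3.0, 3.2, 2.7, 3.4, 3.1, 2.9], [83, 85, 78, 88, 84, 80]),
}

for mode, (efficacy, grades) in groups.items():
    r, p = stats.pearsonr(efficacy, grades)   # correlation coefficient and p-value
    print(f"{mode:>13}: r = {r:.2f}, p = {p:.3f}")
```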
In my previous post, I referenced the Ralston-Berg and Nath (2009) report, which says students are uninterested in the bells and whistles in online courses. But consider further the abundance of media-comparison “studies” and no-significant-difference studies that essentially nullify each other.
Is it possible that students actually do like the bells and whistles but lack the confidence to learn from them?
What a pickle.
References
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Ralston-Berg, P., & Nath, L. (2009). What makes a quality online course? The student perspective. Paper presented at the Annual Conference on Distance Teaching and Learning, Madison, WI.
Instructors who teach in online environments often devote extensive time and energy to designing a Web space that is inviting and useful to students. But frustration inevitably ensues when, despite the careful consideration given to the most logical placement of a discussion forum and the “clearest” instructions provided to students on how to post to the forum, the instructor still receives e-mail from students asking, “So, where is this discussion forum? And what am I supposed to do?” Why has this gap in communication occurred?
One reason may be the typically linear design of course sites. Often, the learning-management systems adopted by universities have default settings that establish some of the design decisions for instructors—e.g., the location and style of course navigation. These linear designs generally reflect the best intentions, since they try to organize information so that students can navigate course material easily, following step-by-step instructions and information.
However, with recent developments in eye-tracking software showing how users really view content on the Web, we can see why this linear design isn’t quite ideal. This video shows a user’s eye movements when scanning IKEA’s Web site, and several other examples available online confirm this rapid pattern of eye movement that jumps all over the page. It’s no wonder, then, that students miss the carefully placed, bolded, and highlighted instructions for turning in an assignment that you were sure everyone would see and follow—considering how the brain ingests and computes information from the screen, it’s easy to see how a linear design style for course materials might not match the ways in which users view the content.
So, what is the solution? Unfortunately, there isn’t a Band-Aid design scheme that addresses this issue, and because instructors are often working within an institutionally mandated learning-management system, course design happens within set boundaries. One important step is usability testing, which can reveal issues that designers can’t see once they are invested in their design decisions. This may seem like an onerous and time-consuming task, but it doesn’t need to be—usability guru Jakob Nielsen recommends five users for testing, but as this data shows, even finding two or three people to look at your course and perform key tasks can give you helpful information to improve your course design.
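Nielsen’s five-user recommendation rests on a simple diminishing-returns model: each additional tester uncovers a shrinking share of the remaining problems. A quick back-of-the-envelope version, assuming each tester finds roughly 31 percent of the problems (the rate commonly attributed to Nielsen and Landauer; the real figure varies by site and task), looks like this:

```python
# Proportion of usability problems found by n testers, assuming each tester
# independently finds about 31% of them (a commonly cited estimate; the
# actual rate varies by site and task).
PER_TESTER_RATE = 0.31

for n in range(1, 9):
    found = 1 - (1 - PER_TESTER_RATE) ** n
    print(f"{n} tester(s): ~{found:.0%} of problems found")
```

Even two or three testers surface roughly half to two-thirds of the problems, which is why a modest review of your course site is still well worth the effort.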
Another important step is realizing that, just as in face-to-face classrooms, your goal (for students to follow instructions) needs to clearly align with your assessments:
Include instructions in a logical location, as determined by your course design.
Ensure that students have seen these instructions. One effective method is to give students a graded quiz at the beginning of the term that asks them to locate important information throughout the online course.
Show students that following instructions is important by grading them on it. Depending on your class, you might make part of an assignment’s grade based on following the assignment’s instructions, or you could refuse to accept an assignment until the student has followed the directions.
Again, there isn’t a one-size-fits-all solution for designing courses that adhere to the ways users view information on the screen. This also isn’t a “lost cause” for instructors—just because users naturally view Web content in a nonlinear way doesn’t mean that the design of online course materials needs to be completely overhauled. Thoughtful design can help students, but supporting your design with clear expectations and assessments can also help students navigate your course more effectively.
“Often faculty don’t need more training on the tool, they need more training on the affordance of the tool and how to use it to support learning.” Patricia McGee, associate professor from the University of Texas, made this statement while offering tips for training faculty on teaching with technology in the newsletter Higher Ed Impact: Weekly Analysis, published by Academic Impressions.
What she said about learning the tools versus learning the affordance of the tools reminded me of a lot of trainings and conference presentations I have attended, which are usually made up of a lengthy PowerPoint presentation followed by a little bit of product/project demo. The PowerPoint usually covers vendor introductions, the tool’s primary functions displayed as bullet points, a theoretical framework or the background of the product/project (sometimes), the implementation process, and eventually, student feedback. If I am lucky, I might be able to get a few screenshots of the site or a quick run-through of the final project, but often these come at the very end. While a big introduction does help build expectations, without any concrete examples, it is hard for me to understand what exactly this particular technology could bring to my own teaching practice.
Compared to academia, tool providers seem to do better at addressing the issue of affordances up front. If you’ve read Melissa Koenig’s blog entry Story-Telling Tools—Beyond PowerPoint, you might have noticed that almost all of the tool sites incorporate a good number of samples on their home pages (check out PhotoPeach, Glogster, and Toondoo). This shows that the tool producers have figured out the best way to capture the attention of today’s busy and impatient Web visitors—by showing (instead of “telling”) them what has been done by and with the tool. The only challenge here is that many of the examples are for a “general” audience instead of being targeted at educators. Examples of faculty and student use of technology for instructional purposes are usually not presented in one collection. However, that does not mean they cannot be found (isn’t it a general rule that you can find anything on the Internet?). It is up to the trainer to locate the appropriate examples that could get instructors thinking, “How should I use this in my class?”
Speaking of selecting appropriate examples for faculty, Patricia McGee provided another practical tip in the article—adopting a tailored approach. Offering generic examples of educational uses of a technology is not good enough, since faculty in different disciplines have different needs; a technology that works well for one content area may not work for another. Given those varying needs, McGee pointed out that campus-wide training might not be the ideal option. This is exactly why we developed the tailored DePaul Online Teaching Series (DOTS) program, with a well-matched combination of technology, pedagogy, and content knowledge (TPACK), and implemented a liaison model that embeds technology consultants in schools and colleges. Now it is time to bring the same tailored model beyond systematic programs such as DOTS and extend it to all of our training events.
According to the CDW 21st-Century Campus Report, faculty’s lack of technology knowledge remains the greatest campus-technology challenge perceived by students, and training is the type of support most needed by faculty. How useful that training is has become a determining factor in how successfully technology is integrated on campus. The answer could be as simple as a tailored training curriculum structured in a meaningful sequence. The one I’d like to propose includes the following three easy steps:
Step 1: Provide concrete and relevant examples (a demo of the affordance)
Step 2: Pause to choose the best tool for meeting instructor needs
Step 3: Train on the use of the chosen tool and the necessary technology