Small Steps, Big Impact: Micro Experiments in Teaching

Our students are changing…and so should our teaching

I think we are all finding that our long-held assumptions about how to create assessments that safeguard against cheating, "make" students do the weekly readings, and engage students online or in person are increasingly falling short. Our students are changing and evolving. In her Inside Higher Ed piece, "Students Are Less Engaged; Stop Blaming COVID," Jenny Darroch asks us to reframe college students as "knowledge workers." Peter Drucker coined the term "knowledge work" in 1959 and identified six factors that motivate knowledge workers to be productive, including a sense of autonomy, opportunities for continuous learning, and an emphasis on quality of work over quantity. We need to adapt our teaching and learning strategies, but how do we know what is going to work?

Micro experiments explained

Micro experiments might be the answer! David Kolb's Medium article, "The Power of Micro Experiments by Embracing Small Scale Innovation for Big Results," is targeted at business leaders; however, I see his insights as very applicable to higher education.

First, what are micro experiments? They are mini pilot tests of new approaches to teaching, course design, and content development, which can be conducted in a live class or in a more controlled environment, such as a focus group or user-experience lab, or even using AI. The theoretical foundation of micro experiments lies in "design thinking." Guido Stompff, a professor at InHolland University of Applied Sciences, explains how anyone can quickly adopt and integrate rapid tests, or micro experiments, in his TEDx talk (12:53), "Speed up Innovation with Design Thinking." The goal is to gather student feedback and data on a selected element of your course: an in-class exercise, assignment instructions, how you approached teaching specific content in an online lecture, the design of interactive knowledge-check activities, and so on. The feedback is then incorporated into the next version of that course element.

Institutional Support

DePaul University’s Center for Teaching and Learning provides opportunities for faculty and instructional designers to incorporate micro experiments into their design process. The Learning Experience Research (LXR) team consults with faculty and instructional designers on creating rapid prototype tests that elicit student feedback. The LXR team recruited a pool of students through student-centered offices and groups. When a test request is submitted, the team assembles a group of student participants who meet criteria that best represent the learner audience the content is intended for. The LXR team then designs the test, runs it, and reports the results back to the requester.

Learning experience tests have proved very valuable. Here are a few examples:

  • Refining language used to alert students that feedback is available for a graded assignment in our learning management system.
  • Assignment instructions that allow students to use generative AI, with guardrails. Learn more from Jes Kass and Alex Joppie's insightful blog posts.
  • Determining whether students prefer course content broken down more granularly (modules>submodules) or as larger units of content (modules>no submodules).
  • Exploring how students prefer to view videos: embedded on a course webpage or linking externally to a streaming service.

DePaul’s support of the Learning Experience Research team, in the Center for Teaching and Learning, allows faculty and instructional designers to conduct micro experiments that are cost-effective and an easy lift, and that prioritize student-centered learning adapted to our students’ changing motivations and learning needs.

DIY Approach

If your university does not have a learning experience research team, you still have many ways to gather student feedback on your own. DIY!

In their Inside Higher Ed article, “Experimenting with Teaching to Improve Student Learning,” Richard J. Light and Allison Jegla detail two micro experiments that faculty conducted in their courses. Joshua Goodman, a professor at Brandeis University, divided his 60-student class into one control group and two experimental groups to determine whether his communication style via email affected student academic outcomes. The control group received no emails, the first experimental group received emails with an academic tone, and the second experimental group received emails with a more personal tone. His assumption (and hope) was that students who received more personal emails would, of course, perform better in the course. However, his assumption proved incorrect: the tone of the emails made no difference, and the students who received no emails actually ended up performing better. This is somewhat disappointing, as we expect our efforts to connect with students to factor into their motivation and engagement. However, Dr. Goodman did discover that replies from students who received the more personal emails were twice as long and often included personal details that connected to the course content and experience. He realized that connecting content to students’ lived experience is something students crave, and he intends to be more deliberate about making those connections in his future courses.
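For instructors who want to try a similar design themselves, the core mechanics are simple: randomly assign the roster to a control group and two experimental groups, then compare outcomes by group at the end of the term. Here is a minimal sketch in Python; the function names, the 60-student roster, and the fixed seed are illustrative assumptions, not part of Goodman's actual protocol.

```python
import random
import statistics

def assign_groups(students, seed=42):
    """Randomly split a roster into one control group and two
    experimental groups of roughly equal size.

    Hypothetical helper for illustration; a fixed seed makes the
    assignment reproducible so you can re-identify groups later.
    """
    rng = random.Random(seed)
    shuffled = list(students)
    rng.shuffle(shuffled)
    third = len(shuffled) // 3
    return {
        "control": shuffled[:third],                 # receives no emails
        "academic_tone": shuffled[third:2 * third],  # formal emails
        "personal_tone": shuffled[2 * third:],       # personal emails
    }

def mean_outcomes(scores_by_group):
    """Summarize each group's results as a mean final score."""
    return {name: round(statistics.mean(scores), 1)
            for name, scores in scores_by_group.items()}

# Example: a 60-student roster splits evenly into three groups of 20.
roster = [f"student_{i}" for i in range(60)]
groups = assign_groups(roster)
print({name: len(members) for name, members in groups.items()})
```

At the end of the term you would feed each group's final scores into `mean_outcomes` and compare; with a class this small, treat any difference as a signal to investigate rather than a definitive result.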

A midterm student survey is a very effective way to gather specific feedback on a newly designed course component or teaching strategy. Voilà, a micro experiment! DePaul’s Teaching Commons provides excellent guidance on best practices for midterm student surveys. When creating a survey, first determine your goal: do you want general feedback on the student learning experience, or specific feedback on a particular assignment or activity? Then develop survey questions that yield the data you need to answer that goal. Anonymous responses are preferable, as students are more likely to be honest in their replies. Offering both multiple-choice and open-ended questions lets students quickly rate their experience and also explain their feedback. Faculty can then make adjustments midstream, where possible, as they teach. It is gratifying to let students know that their participation and feedback are appreciated and used to create a better learning experience.

Innovation in small steps

Micro experiments aren’t just about tweaking assignments or course materials; they’re about sparking creativity and innovation in our classrooms.

Let’s be curious to learn more about our students, their motivations, and their learning preferences by mixing in small doses of change to see which teaching practices work best. With micro experiments, we can quickly uncover stale, ineffective practices and spark new ideas and approaches to teaching, in the hope of improving the learning experience for our students.


About Lisa Gibbons

Lisa is the Director of Instructional Design in the Center for Teaching and Learning. She has twenty-five years of experience working in educational, non-profit, internet start-up, and publishing organizations, in positions that leveraged her background in user experience, instructional design, and leadership. Lisa earned her B.A. in American Studies from Northwestern University and an M.B.A. from DePaul University’s Kellstadt Graduate School of Business. Outside of work, Lisa focuses on projects that support food access for all and sustainable agriculture; she was also on the leadership team that opened the second food co-op in Chicago, Wild Onion Market.
