Since the pandemic, more instructors at schools and colleges appear to have embraced “flipped learning,” the approach of asking students to watch lecture videos before class so that class time can be used for active learning.
Proponents say the model improves student outcomes by encouraging more interaction among students and professors, and many studies have been conducted to measure the efficacy of the approach. So a group of professors recently performed a meta-analysis to try to assess how well flipped learning is working.
The study considered 173 studies of flipped learning, as well as 46 previous meta-analyses of the approach. And while many of the studies showed gains for learners, the researchers concluded that flipped learning isn’t living up to its promise.
“The current levels of enthusiasm for flipped learning are not commensurate with and far exceed the vast variability of scientific evidence in its favor,” the paper argues.
In fact, the authors made the surprising conclusion that many instances of flipped learning involve more time spent on passive learning than the traditional lecture model, because some professors both assign short video lectures and spend some time in class lecturing to prepare for class activities. As the authors put it: “Indeed, it seems that implementations of flipped learning perpetuate the things they claim to reduce, that is, passive learning.”
The far-reaching meta-analysis considered flipped learning experiments done in elementary schools, high schools and colleges, with the bulk of the studies in the higher ed setting.
The biggest surprise to the researchers as they coded each research project was realizing how many different versions of flipped learning exist, said John Hattie, an emeritus professor at the University of Melbourne who co-authored the study. “The hype is convincing — it’s seductive — but the implementation of the hype is not,” he said. “It has been implemented so variably.”
Another surprise, Hattie said, was that the more active learning done in a flipped classroom, the worse the outcome. He chalks that up to the fact that many professors using the model don’t test whether students are actually learning the material presented in lecture videos, and so some students who skip the videos or watch them on double-speed arrive in class unprepared for the activities.
The researchers do think that flipped learning has merit — if it is done carefully. They end their paper by presenting a model of flipped learning they refer to as “fail, flip, fix and feed,” which they say incorporates the most effective elements identified in their analysis. Basically, they argue that students should first be challenged with a problem even though they can’t properly solve it yet, because the failure to solve it will motivate them to watch the lecture video looking for the necessary information. Classroom time can then be used to fix student misconceptions, with a mix of a short lecture and student activities. Finally, instructors assess the student work and give feedback.
“I hope our paper does not dismiss the ideas underlying [flipped learning] because they’re very powerful ideas,” Hattie said.
‘Hey, We’re All on the Same Team Here’
Fans of flipped learning had some questions about the new study’s conclusions.
Among them is Robert Talbert, a professor in the mathematics department at Grand Valley State University and author of the book “Flipped Learning: A Guide for Higher Education Faculty.”
“It kind of takes flipped learning educators to task, and I thought that was super unnecessary,” Talbert said. “I wanted to reach out to the authors and say, ‘Hey, we’re all on the same team here.’ They’re part of the group doing flipped learning.”
He says he welcomes a tough look at the research, but he argued that the study left out some well-known research on active learning. And he said that by looking at flipped learning across K-12 schools and colleges, the analysis ended up comparing apples and oranges.
“It’s a great discussion starter, and I’m never going to say we can’t publish things that are critical to flipped learning,” Talbert said. “But the paper’s overall message was, ‘All of y’all are doing flipped learning wrong, and we’re doing it right.’ I didn’t think that was fair to people practicing flipped learning.”
The lead author of the paper, Manu Kapur, a professor of learning science and higher education at ETH Zurich, responded to that critique by saying he wanted to push back against an uneven implementation of a popular teaching trend.
“I’m on team science, and this is what the empirical science is proving,” he said in an interview. “The contribution is that we actually coded the kinds of activities” that went into flipped learning efforts. “When you do that, you find that active learning was not as present as it should have been.”
Talbert praised the model the researchers presented, but he noted that it closely resembles a model in a paper by Bertrand Schneider and Paulo Blikstein that the researchers cited but didn’t discuss in their meta-analysis.
Hattie, one of the co-authors of the meta-analysis, acknowledged that their model emerged in part from the experiments they examined. “The new model came particularly from the extensive work from the first author, Manu Kapur, and he and I both learned from this paper and others to build the model,” he said.
And Hattie argued that the uneven results of flipped learning held true no matter which sector of education was considered — K-12 or higher education.
One hope of the paper, he said, is to encourage a more detailed understanding of which parts of flipped learning work best so that those jumping into a trendy teaching strategy can be effective.