A Retraction on Individualization:
I wrote my first book, "The Scientific Principles of Teaching," in 2019 and self-published it at the very start of 2022. In the book, I strongly hypothesized that individualizing the curriculum to student needs was key to accelerating learning. I came to this belief based on three pieces of evidence.
The Case for Individualizing Instruction:
- Theoretically, the idea made strong intuitive sense. I could recall moments as a student when teachers continued to drill information I already knew. Likewise, I could remember moments (especially in Math class) when I did not understand something, and my teacher moved on. It was a central belief of mine that all students are capable of higher learning. However, it was obvious to me as a teacher that not all students grasped all concepts at the same pace. It therefore seemed logical that all students learn at their own pace, and that teaching should be individualized.
- Response to Intervention (RTI) had multiple meta-analyses showing it is an exceptionally effective framework for improving learning outcomes (Tran, 2011; Swanson, 2001; Burns, 2005). RTI is essentially a framework for individualizing curriculum instruction based on student learning rates, so I took this as strong evidence that individualizing the curriculum is important.
- Most importantly, a meta-analysis by Rogers (2008) showed a mean effect size of 2.35 for individualizing curriculum. At the time I originally reviewed this paper, it was the largest effect size I had ever seen in a meta-analysis, and I took it as proof that individualizing instruction was the holy grail of education.
The Case Against Individualizing Instruction:
However, after publishing this book, the Rogers finding nagged at me for years. While, in general, higher effect sizes do point to higher levels of efficacy, rigorous studies in education rarely show effect sizes above .40 and almost never show effect sizes above 1.50 (Slavin, 2018). Recently, I re-read the study, and I must say, there are four serious limitations that need to be considered.
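For context on why those thresholds matter, the effect sizes reported in these meta-analyses are standardized mean differences (Cohen's d): the gap between treatment and control group means, expressed in pooled standard-deviation units. This is the standard formula, not anything specific to Rogers (2008):

```latex
d = \frac{\bar{X}_{\text{treatment}} - \bar{X}_{\text{control}}}{SD_{\text{pooled}}}
```

On this scale, a d of 2.35 would imply that the average student in the individualized condition outperformed roughly 99 percent of the control group, which is why a figure that large in education research warrants close scrutiny.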
1. The paper was not published in a peer-reviewed journal.
2. There were only four studies behind this effect size.
3. The inclusion criteria for the meta-analysis were not clear.
4. It looked specifically at gifted students.
Now, these limitations don’t mean the paper was a bad study or that its findings are invalid. However, they do mean that we cannot say individualizing the curriculum is settled science. I also must admit that some of my own research has recently led me to question my previous convictions on the topic. Firstly, as I have previously discussed on this blog, I submitted a meta-analysis for peer review on systematic vs. unsystematic phonics, which showed roughly double the outcomes for systematic phonics. An earlier, cruder version of this meta-analysis was originally posted to this blog (Hansford, 2022a). And while I have since updated the meta-analysis for peer review, the original findings are essentially unchanged. Moreover, multiple previous meta-analyses have found the same result of roughly double the outcomes for systematic phonics vs. unsystematic phonics (NRP, 2000; Camilli, 2003; Stuebing, 2008).
And here’s the problem: the NRP (2000) interpreted whole language as a philosophy that taught phonics as needed, according to students' perceived individual needs. Similarly, many interpret the Pressley et al. (2002) white paper on balanced literacy to suggest that balanced literacy means teaching phonics to students' individual needs, as opposed to teaching the same curriculum to all students.
Of course, some might argue that while whole language argued in theory for individualized phonics instruction, in practice it minimized phonics instruction. However, even within systematic phonics instruction, I have found some evidence that less teacher autonomy increases student reading outcomes. A sub-analysis of our meta-analysis (Hansford, 2022b) showed that when teachers controlled the pace of phonics instruction, as opposed to following a set pace, reading outcomes went down.
In another meta-analysis I recently submitted for peer review, alongside Elizabeth Reenstra, Pamela Aitchison, and Sky McGlynn, we examined adaptive technology language programs (Hansford, 2022c). These programs attempted to individualize language instruction to students' specific needs. Unfortunately, we found that, on average, there was no difference between computer programs that individualized instruction and those that did not. These results suggest that not only do teachers struggle with individualizing instruction, but so do computers.
So Should We Individualize Instruction?
Truthfully, I still use a great deal of individualization in my own classroom, especially in Math class. That said, many teachers prefer a more one-size-fits-all or scripted approach to instruction. And I have to admit, the scientific case against a one-size-fits-all approach is probably not as strong as I once believed. Moreover, one drawback to a more individualized approach might be excessive time spent on lesson planning.
Written by Nathaniel Hansford
Last Edited 2023/12/04
References:
Burns, M. K., Appleton, J. J., & Stehouwer, J. D. (2005). Meta-Analytic Review of Responsiveness-To- Intervention Research: Examining Field-Based and Research-Implemented Models. Journal of Psychoeducational Assessment, 23(4), 381–394. https://doi.org/10.1177/073428290502300406
Pressley, Michael & Roehrig, Alysia & Bogner, Kristen & Raphael, LM & Dolezal, Sara. (2002). Balanced Literacy Instruction. Focus on Exceptional Children. 34. 1-14. 10.17161/fec.v34i5.6788.
Hansford, N., & King, J. (2022a). A Meta-Analysis of Language Programs. Teaching by Science. https://www.teachingbyscience.com/a-meta-analysis-of-language-programs
Hansford, N., King, J., & Brown, E. (2022b). How Fast Should Phonics be Taught? Teaching by Science. https://www.teachingbyscience.com/phonics-speed
Hansford, N., Reenstra, E., Aitchison, P., & McGlynn, S. (2022c). Computer Based Language Programs. Teaching by Science. https://www.pedagogynongrata.com/_files/ugd/237d54_62011c91c716487781f3126193a09b0f.pdf
Swanson, H. L., & Lussier, C. M. (2001). A Selective Synthesis of the Experimental Literature on Dynamic Assessment. Review of Educational Research, 71(2), 321–363. https://doi.org/10.3102/00346543071002321
Slavin, R. (2018, June 21). John Hattie is wrong [Blog post]. Accessed at https://robertslavinsblog.wordpress.com/2018/06/21/john-hattie-is-wrong on April 21, 2023.
Rogers, K. B. (2008). Academic acceleration and giftedness: The research from 1990 to 2008. A best-evidence synthesis. Paper presented at the proceedings of the Acceleration Poster Session at the 2008 Wallace Research Symposium on Talent Development. www.accelerationinstitute.org/proceedings_2008.pdf
Tran, L., Sanchez, T., Arellano, B., & Lee Swanson, H. (2011). A meta-analysis of the RTI literature for children at risk for reading disabilities. Journal of learning disabilities, 44(3), 283–295. https://doi.org/10.1177/0022219410378447