Helping teachers identify the impact they have on student achievement.
With the publication of his book Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement in 2008, Professor John Hattie invited schools to start using learning growth estimates as a measure of their success in academic achievement. One of his main propositions was that schools should focus on the amount of academic progress students make over a period of time, rather than on the scores or grades they attain. For instance, if Emma (a Year 9 student) gets a high score in the numeracy component of the NAPLAN test, say 800, that is evidence of achievement, not necessarily of growth. Teachers should not look at her most recent score in isolation; they should gather her previous NAPLAN numeracy results (from Year 7, for instance), estimate an effect size, and determine whether progress has taken place. They may find that Emma was already performing at that high level two years ago, meaning that she has made no progress.
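To make the effect size idea concrete, here is a minimal sketch of one common way to estimate it for a cohort: the difference between the post and pre mean scores divided by the average of the two standard deviations. The scores below are hypothetical, and other pooling conventions for the denominator exist; this is an illustration, not the exact procedure behind my estimator.

```python
from statistics import mean, stdev

def effect_size(pre_scores, post_scores):
    """Cohen's d-style growth estimate: difference of means divided by
    the average of the two standard deviations (one common convention)."""
    sd = (stdev(pre_scores) + stdev(post_scores)) / 2
    return (mean(post_scores) - mean(pre_scores)) / sd

# Hypothetical Year 7 and Year 9 NAPLAN numeracy scores for one cohort
year7 = [520, 548, 565, 590, 610, 634, 655]
year9 = [560, 585, 602, 640, 652, 671, 690]

d = effect_size(year7, year9)
print(f"Growth effect size: {d:.2f}")
```

A positive effect size indicates that the cohort's average score has risen relative to the spread of scores; the larger the value, the greater the growth between the two testing occasions.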
This call for measuring and using growth presented several challenges for Australian schools, particularly for teachers, who were immediately assigned this new responsibility, as schools do not typically have a data specialist among their administrative staff. I subsequently conducted a survey in 78 schools in Victoria to explore teachers' perceptions of their new data analyst role. In total, 182 teachers participated in the study. Results revealed that teachers are very busy professionals, with 85% of the sample stating they do not have enough time for activities beyond teaching and planning. Results also indicated that 70% of teachers believed they lacked the statistical skills needed to perform such analyses. Finally, I found that teachers were not willing to learn about statistics, with only 12% of the sample expressing interest in basic statistical training. In an effort to remedy these issues, I developed my learning growth estimator, which is currently used by dozens of schools across Victoria.
Frameworks for interpreting your growth data
Schools can use several comparative frameworks to interpret their learning growth results. Using NAPLAN results as an example, these frameworks include:
- Norm-referenced frameworks: schools can compare their growth with their State or Territory's estimates. By doing so, they can find out whether their students are making more or less progress than the average child in Victoria, for instance.
- Standard-referenced frameworks: schools can also judge effect sizes against a pre-established standard of learning growth (e.g., Visible Learning). According to Hattie, there is an expected amount of learning growth for children in different age groups, so schools can use this framework to check whether their growth estimates meet his suggested benchmark. Please contact us for information about this.
- Criteria-based frameworks: schools can interpret growth based on the progress students make along a developmental continuum that describes increasing levels of competence for a particular skill. In NAPLAN, these levels of competence are known as Bands. Each band has a descriptor outlining the skills students at that level can master, so schools can make judgements of growth as students move from one band to the next.
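As one way to picture how these frameworks operate in practice, the sketch below classifies a school's growth effect size against a reference value. The benchmark of 0.40 echoes Hattie's often-cited hinge point, but all numbers here are hypothetical; schools should use whichever reference value their chosen framework supplies (a State average, a suggested standard, or band-based expectations).

```python
def compare_to_benchmark(school_growth, benchmark, tolerance=0.05):
    """Classify a growth effect size against a reference value.
    The tolerance band treats small differences as 'at benchmark'."""
    if school_growth > benchmark + tolerance:
        return "above benchmark"
    if school_growth < benchmark - tolerance:
        return "below benchmark"
    return "at benchmark"

# Hypothetical school growth of 0.55 against a 0.40 reference
verdict = compare_to_benchmark(0.55, 0.40)
print(verdict)
```

The same comparison logic applies under each framework; only the source of the benchmark changes.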
Final note
Professor Hattie continues to innovate and update his work on factors affecting educational outcomes. I encourage teachers to visit the new Visible Learning website (http://www.visiblelearningmetax.com) and explore variables that may be associated with your students' academic achievement. This is, I believe, a very rich source of information for schools, so please do take a good look at it.
Author: Jesus Camacho-Morles