NAPLAN has been back in the headlines again recently (although, when is it ever really OUT of the headlines?): this time in response to a senate inquiry investigating concerns that NAPLAN is damaging to student and teacher wellbeing and creating a culture of ‘teaching to the test.’ After six years of NAPLAN testing, at a cost to ACARA of $7 to $7.5 million per year, it is time to put NAPLAN to the test, so to speak.
Greens Senator Penny Wright, in handing down the report of a Senate inquiry into NAPLAN testing, has recommended that significant changes be made to both the tests and the MySchool website.
“The committee heard a huge amount of evidence that the MySchool site has introduced a competitive element which is damaging student and teacher wellbeing and resulting in a whole lot of ‘teaching to the test’,” Senator Wright said.
“As a committee we came to the view it’s time the ranking and comparative functions for individual schools on the MySchool site were removed. The committee also thinks that NAPLAN needs some improvement as a diagnostic test – it’s just not helping teachers and parents support students as well as it should.”
In election campaign-mode last year, Education Minister Christopher Pyne suggested the Coalition would consider banning the publication of NAPLAN test data over concerns it was ‘skewing the way people teach.’
In March of this year, however, Minister Pyne substantially softened his standpoint:
“The government committed to review NAPLAN and the My School website to ensure it is meeting the needs of our students.” This statement was hardly a promising indicator of any significant change occurring.
Standardised testing is not new in Australia: state-wide testing of cohorts has taken place since the 1980s. NAPLAN was first introduced in 2008 to replace state-based assessments such as the Basic Skills Test (BST), which varied from state to state. One of NAPLAN’s aims was to provide nationally comparable data on individual student performance in literacy and numeracy. NAPLAN tests are conducted in May each year, and students participate in tests in years 3, 5, 7, and 9.
I would like to make it quite clear: I am not against standardised testing. The data provided by NAPLAN testing can be useful for both parents and schools alike. Each student receives an individual report which provides parents and teachers with feedback on the knowledge and skills that student has demonstrated on the test. This data can be compared to that of other students in their grade across the country. This element of national comparison means that the data remains useful even if students move interstate. Schools can use NAPLAN data to compare incoming students against an existing cohort, and it is common practice for many high schools to request NAPLAN data for students enrolling in year seven. This data is often used alongside other information (such as reports or general ability testing) to place students in streamed classes.
Schools can use data to track trends in areas of student strength and weakness between and across year levels. Teaching can then be structured to ensure these areas are targeted to improve student outcomes. NAPLAN can provide a snapshot of how a cohort is performing as a whole, and facilitate comparisons between year groups.
In reality, however, as a teacher I do not typically use NAPLAN data to inform my practice. Tests are sat in May, yet results are not received by schools until mid-September: leaving only term four to work with students. Rather than expensive and time-consuming NAPLAN tests, in-class assessments and day-to-day monitoring of student learning and understanding are far more useful in informing my practice. As a diagnostic test, NAPLAN leaves much to be desired.
NAPLAN testing can be disruptive to school routine, and with the focus on publishing test data, and the high profile of NAPLAN testing in the media, there is increasing pressure on students to perform well. It is not unusual for heads of department to request teachers spend some time in the weeks preceding the test ‘familiarising’ students with the style of questioning. When teachers ‘teach to the test’, pedagogical practices emphasise rote learning, memorisation, and drills. Testing can motivate performance as students strive to achieve better academic outcomes, particularly when it is not tied to their grade, as in the case of NAPLAN. However, standardised testing can also increase anxiety, promote challenge-avoidance behaviour, and increase levels of self-doubt. Are we willing to sacrifice students’ psychological wellbeing for the sake of publishing test data?
You may accuse me of scaremongering: naysayers may protest that NAPLAN is not high stakes, but my argument is not hyperbolic. With the level of exposure of NAPLAN in the media, and the constant scrutiny teachers are held under by politicians, media, and the general public, this supposedly ‘low stakes’ test seems to grow bigger each year.
Parents, however, appear to be more fond of NAPLAN testing than teachers themselves. A report by the Whitlam Institute at the University of Western Sydney found 56 per cent of parents were in favour of NAPLAN testing, with around 70 per cent finding the tests useful for providing objective and comparable data. By contrast, just 20 per cent of teachers reported positive feelings about NAPLAN testing (it would be interesting to know how many of the parents that disagreed with testing were teachers).
Why? It is not because teachers are averse to scrutiny (goodness knows we receive enough of it from the media). It is because publishing of NAPLAN data via the MySchool website transforms a supposedly ‘low stakes’ assessment into a ‘high-stakes’, highly-publicised extravaganza. Whenever data is published there is the opportunity for it to be misused and misinterpreted. Publishing of data inevitably leads to production of league tables and uninformed, uneducated comparisons being made between schools and states. It is ridiculous to compare the ACT with the Northern Territory: the two cohorts are entirely different, yet as soon as NAPLAN data is released those types of comparisons are inevitable.
Adjunct Professor James Athanasou encapsulates the issue with NAPLAN as it currently stands:
“NAPLAN is not there to swell parents’ egos or to publicly humiliate schools or teachers. The problem is not the test’s assessment, but how it is used.”
Am I proposing an end to standardised testing? No. I do, however, maintain my strong opposition to the publishing of NAPLAN data via the MySchool website. Increasing emphasis on ‘measuring’ students and producing transparent data serves simply to make students (and teachers, and parents) performance or achievement focused. The consequence is that students are no longer learning for mastery and understanding, are less adaptive learners, eschew challenging tasks, and have a greater tendency to blame failure on personal inadequacy (see my post on mastery vs performance learning here).
Ceasing wide-scale publication of test data would be a step in the right direction towards more appropriate use of NAPLAN testing. As long as the MySchool website is in operation, NAPLAN will continue to appear to be a high-stakes test. If we want students to be focused on deep learning, on being engaged, challenge-seeking problem solvers, continuing to emphasise NAPLAN is NOT the way to go about it. NAPLAN should function as it was intended: a diagnostic tool that serves to provide information to schools and parents. Not a measuring stick for the production of league tables and a culture of teaching to the test.