Gerritsen sides with Parata over socio-economic influences on school learning

John Gerritsen reports thus:

It was based on the results of tests of 15-year-olds conducted in 2012 for the Programme for International Student Assessment (PISA). Initial reporting in 2013 said New Zealand’s results were above average, but had worsened and socio-economic background accounted for 18 percent of the difference between New Zealand children’s results in mathematics.

Simply copied from a ministry handout.

Gerritsen would have been well aware of the question in the House about Hekia Parata’s citing of 18% from the PISA-OECD report, but he still chose to allow that figure to stand even though he would have known the indisputable figure from that report was 78 percent.

This is not the first time I have questioned the behaviour of this senior journalist. https://networkonnet.wordpress.com/2015/06/13/john-gerritsen-war-correspondent-for-the-times/

Weak, careless journalism at best – and no retraction, no apologies.

School education should be an activity in which everyone takes the greatest care with the truth; the teachers and children of New Zealand deserve nothing less. The minister of education fails that test and so does John Gerritsen, Radio New Zealand education reporter.

Gerritsen doesn’t set out to be careless, but his gullibility and his unwillingness to engage with the complexity of education often see him ending up in confusing and compromised positions like this.

The minister of education’s policy arguments for cutting back on teachers, for refusing significantly more teacher aides, for refusing to allocate more resources to lower-decile schools, for harping on about teacher shortcomings, and for much else besides, all pivot on the figure for children’s achievement accounted for by socio-economic factors; she usually puts it at around 20%.

Up till now the standard in New Zealand research has been set by Richard Harker (consistent with many overseas studies), who found that ‘anywhere between 70-80% of the between schools variance is due to student “mix” which means that only between 20% to 30% is attributable to the schools themselves.’ (Please note that means a lot more than the teachers in classrooms.)

The minister stood in the House recently and referred to a large-scale study by the OECD and PISA in 2012 which she said supported her figure for socio-economic factors:

‘I do agree with the OECD study in 2012 involving over half a million 15-year-olds from 65 countries, of which one was New Zealand. In fact, 5,000 students from 177 schools in our country participated. That OECD study found that 18 percent—18 percent—of the difference in student achievement can be accounted for by socio-economic factors. That means that 82 percent of student achievement is not statistically explained by socio-economic factors.’

And Gerritsen agrees with the minister; a transcript of his 11 February report on mathematics ability in schools cites a figure of 18% from a PISA report for socio-economic factors.

An academic friend informs me letters have been sent to Gerritsen, but he refuses to respond; the motivation, presumably, is that if he disagrees with Parata he will be on the outer.

18%?

18% as quoted by Parata and Gerritsen or 70-80% as researched by Harker?

Parata and Gerritsen got it wrong, but neither will apologise or correct their error: understandable for her, par for the course, but disappointing in an RNZ reporter.

The actual data the OECD report draws on is in Figure II.2.8 (p. 48):

The average of what is explained by socio-economic factors for all PISA countries is just over 60%. NZ comes in at 78% (consistent with Harker’s findings of 80% in NZ in 1995).  The report says ‘In Chile, Hungary, Ireland, New Zealand, Peru and Slovenia more than 75% of the performance difference between schools is explained by the socio-economic status of students and schools.’ (OECD 2013, p. 49).

The evidence is clear and consistent and it has nothing to do with ideology or politics. It is just a fact that in countries like ours the social background of students accounts for some 70%-80% of the between-schools variance, leaving only 20%-30% to be explained by schools. And that is where Harker had it all those years ago; it was always there, but it didn’t suit Parata and she lied about it, and it suited Gerritsen, it seems, to stay just there.
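The two figures are not even measuring the same thing, which is how an individual-level number like 18% and a school-level number like 78% can both come out of the same data. The following sketch uses invented numbers (the school effect, noise level, and sample sizes are assumptions for illustration, not PISA parameters) to show the same simple regression giving a small “variance explained” when run over individual students and a large one when run over school averages:

```python
import random

random.seed(1)

# Invented data: 100 schools of 25 students. Each school has a
# socio-economic level; a student's score depends partly on that
# level and partly on individual noise. None of these numbers come
# from PISA -- they only exist to illustrate the levels-of-analysis point.
schools = []
for _ in range(100):
    ses = random.gauss(0, 1)                                  # school socio-economic level
    scores = [500 + 40 * ses + random.gauss(0, 90) for _ in range(25)]
    schools.append((ses, scores))

def r_squared(xs, ys):
    """Share of variance in ys explained by a simple linear fit on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Individual level: every student's score against their school's SES.
all_ses = [ses for ses, scores in schools for _ in scores]
all_scores = [s for _, scores in schools for s in scores]

# School level: mean score per school against school SES.
mean_ses = [ses for ses, _ in schools]
mean_scores = [sum(scores) / len(scores) for _, scores in schools]

print(f"variance explained, student level: {r_squared(all_ses, all_scores):.0%}")
print(f"variance explained, school level:  {r_squared(mean_ses, mean_scores):.0%}")
```

Both printed numbers are “variance explained by socio-economic status”, but the first is computed over individual students and the second over school averages, and the second comes out far larger. Quoting an individual-level figure as if it settled a question about differences between schools is exactly the slippage at issue.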


4 Responses to Gerritsen sides with Parata over socio-economic influences on school learning

  1. Thanks Kelvin, this is really important to know. Do you have a link to the particular OECD report?

    I am also trying to find out about a similar claim Hattie makes, in his collaboration with Pearson: What Works in Schools – the politics of distraction, that you wrote about last year.

    Hattie says the variance between schools, based on the 2009 PISA results for reading across all OECD countries, is 36 per cent, and the variance within schools is 64 per cent. For Australia, it is 18 and 72 per cent; and for New Zealand, 16 and 84 per cent.

    This more or less forms the evidential background for his position that teachers are the problem/solution, not the school or social policy.

    However, the variances he uses seem misleading – do you know how he calculated them?

    It’s like saying 50% of kids will be above the median score – it will always be that way no matter what.

    My gut instinct is that if you measure the variance between schools, you take the average for each school and then find the variance of these averages. But when you measure the variance within a school, you are taking all the individual results from each student, including all the outliers.

    So variance between schools will always be smaller than variance between individual students – no matter what. So to use this as evidence for his political posture is misleading.

  2. Kelvin says:

    Nice stuff, George. If you give me your email address, a response by Ivan Snook will give you a running start on the matter.

  3. Laurence caltaux says:

    So what was the 18% figure actually referring to? Is it a completely incorrect figure or does it actually relate to something in the report?

  4. Kelvin says:

    Yes, good question, Laurence. The figure of 18% does appear early in the report; it is a passing figure the ministry has latched on to while completely ignoring the conclusions. In an earlier posting I referred to the 18% and commented on the ruthless dishonesty of the ministry in using it. If the 18% hadn’t been in the report, the minister would never have referred to it. She feels bullet-proof.
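George’s instinct in the first comment about how the two variances are computed can be sketched directly. By the law of total variance, the overall variance across all students splits exactly into a between-school part (the spread of school means around the grand mean) and a within-school part (the spread of students around their own school’s mean). A minimal sketch with invented scores (five schools of 40 students; none of the numbers are from PISA or Hattie):

```python
import random

random.seed(0)

# Invented scores for five schools of 40 students each; school k's
# scores centre on 500 + 30k with individual noise of sd 70.
schools = [[random.gauss(500 + 30 * k, 70) for _ in range(40)] for k in range(5)]

all_scores = [s for school in schools for s in school]
n = len(all_scores)
grand_mean = sum(all_scores) / n

def variance(xs, mean):
    return sum((x - mean) ** 2 for x in xs) / len(xs)

total = variance(all_scores, grand_mean)

# Between-school: variance of school means around the grand mean,
# weighted by school size (all equal here) -- computed from averages,
# as George describes.
means = [sum(school) / len(school) for school in schools]
between = sum(len(school) * (m - grand_mean) ** 2
              for school, m in zip(schools, means)) / n

# Within-school: average squared deviation of each student from
# their own school's mean -- computed from individual results,
# outliers and all.
within = sum((x - m) ** 2
             for school, m in zip(schools, means)
             for x in school) / n

print(f"total variance   {total:8.1f}")
print(f"between schools  {between:8.1f}  ({between / total:.0%})")
print(f"within schools   {within:8.1f}  ({within / total:.0%})")
```

The two components always sum exactly to the total, so the between and within percentages always add to 100; how that total splits depends entirely on how much schools differ from one another relative to the spread inside each school.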
