1) Parents and teachers at A schools were more likely to respond to surveys than parents and teachers at F schools. A schools had an average parent response rate of 31.75%, versus 22.59% at F schools. The difference is smaller for teachers: 48.23% versus 45.03%.
There’s something in this finding for everyone. One could argue that A schools really are higher-quality schools, and that more parents are involved as a result, which is why a higher proportion of them responded. Alternatively, schools with lower levels of parent support may be struggling for that very reason. The graph below, which plots the overall score for high schools against the parent response rate, provides some support for this point (correlation = .54).
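For readers who want to reproduce that correlation figure, here is a minimal sketch of a Pearson correlation computation. The data below are made up for illustration; the original analysis used the actual school-level response rates and overall scores, which are not reproduced here.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()          # center both series
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Illustrative (made-up) data: parent response rate vs. overall school score
response_rate = [10, 20, 30, 40, 55, 67]
overall_score = [45, 50, 48, 60, 70, 72]
print(round(pearson_r(response_rate, overall_score), 2))
```

With the real dataset, this kind of calculation is what yields the .54 correlation cited above.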
2) Schools with higher proportions of Hispanic, African-American, and free lunch kids had much lower parental survey response rates, while those with higher proportions of White and Asian kids had much higher response rates.
To make the graph above, I divided schools into quintiles: five equally sized groups, each containing 20% of the schools. Schools in Quintile 1 have the lowest proportions of a given group, while schools in Quintile 5 have the highest. For example, for the free lunch population, Quintile 1 schools have 48.7% free lunch kids or fewer, while Quintile 5 schools have 85.25% or more. For the Hispanic population, schools with 14.2% Hispanic kids or fewer are Quintile 1 schools, while those with 65.35% or more are Quintile 5 schools.
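The quintile assignment described above can be sketched in a few lines. The school percentages below are invented for illustration; the cutoffs are computed from the data itself, as in the analysis.

```python
import numpy as np

def quintile(values):
    """Assign each value a quintile: 1 = lowest 20%, 5 = highest 20%."""
    v = np.asarray(values, dtype=float)
    cuts = np.percentile(v, [20, 40, 60, 80])    # four interior cutpoints
    return 1 + np.searchsorted(cuts, v, side="right")

# Made-up free-lunch percentages for ten hypothetical schools
pct_free_lunch = [10, 25, 40, 48.7, 55, 60, 70, 80, 85.25, 95]
print(quintile(pct_free_lunch))  # → [1 1 2 2 3 3 4 4 5 5]
```

Each school's demographic percentage is compared against the 20th/40th/60th/80th percentile cutoffs of the full distribution, so the five bins each hold roughly 20% of schools.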
3) This relationship holds for teacher survey response rates as well. Schools with higher proportions of African-American and free lunch kids had significantly lower teacher survey response rates, while those with higher proportions of White and Asian kids had higher response rates.
Overall, these results call into question the validity of the parent surveys for any given school: 25% of schools had parent response rates of 18% or less, and one school had a parent response rate of just 1.97%.
In addition, given how much these results vary by demographics, it is also not clear that we can validly compare actual survey responses across all schools. For example, is it meaningful to look at the safety and respect scores of a school with a parent response rate of 67% and compare those with a school with a response rate of 9%?
Lots of food for thought here – as always, email me if you'd like to see the full tables.
2 comments:
My understanding is that the parent response rate isn't just something that correlates... it's a criterion, a score that counts toward the school's assessment.
We had both parents and kids claim ridiculous things on the surveys: that they are in 12th grade, that we offered classes that we don't (or that we don't offer classes that we do), etc. Only a few weird answers like this, granted, but in a small school, a few nonsensical answers could skew things. I never saw either survey (being out of the country), but it made me wonder what it looked like...