Yesterday, USA Today championed more time in school in an editorial. States are getting behind the idea by providing grants for schools to extend instructional time (see Ed Week article here). But opt-in programs are hard to evaluate precisely because schools are choosing to participate. So how might we figure out whether and how much instructional days matter?

Dave Marcotte, an economist at the University of Maryland, Baltimore County, turned to Mother Nature for guidance. Marcotte reasoned that variation in winter weather made non-trivial differences in the number of instructional days before students took the Maryland School Performance Assessment Program (MSPAP) exams.

So how do kids do when snowfall decreases the number of instructional days before state tests? In Marcotte's paper, "Schooling and Test Scores: A Mother Natural Experiment," published this fall in the Economics of Education Review, we find out that students who took exams in years with heavy snowfall performed worse than their peers in the same school who took MSPAP exams in other years. A 19.1 inch increase in a school year's snowfall - one standard deviation - is associated with 1.2% fewer 3rd graders passing the math test. The effects of instructional time varied by subject: math scores were affected more than reading scores. They also varied by grade level: 3rd graders' scores were affected more than 8th graders' scores.

Teachers - go buy an SUV. Global warming will help increase your school's test scores.

(Image from velverse.com)

## 5 comments:

Your observation of the paper that "a 19.1 inch increase in a school year's snowfall is associated with a 1.2% more 3rd graders passing the math test" seems inconsistent with the general tone of the post that more time in school leads to better test scores. Wonder if this is a typo or a counter-intuitive finding.

Are you sure you've got this right, Eduwonkette? The text of the article reads:

"Snow accumulation (measured in inches) is negatively related to the percentage of 3rd grade students performing satisfactorily on the MSPAP math test (p-value <0.001). The magnitude of this coefficient is most readily understood by thinking about expected differences in test scores between mild and harsh winters. Comparing winters separated by one standard deviation in snowfall (19.1 in), 1.2 percent fewer of the students in 3rd grade during the snowier winter could be expected to have exceeded the criterion for satisfactory performance on the math subject test." You wrote, "A 19.1 inch increase in a school year's snowfall is associated with a 1.2% more 3rd graders passing the math test." Looks to me like the author's results show a negative association between snowfall and MSPAP pass rates (more snowfall is associated with lower pass rates), whereas you wrote that there was a positive association (more snowfall is associated with higher pass rates). Either way, I'm yearning for a snow day.

okay, glad I'm not the only one confused... but what an interesting idea for a study. also the sort of thing that's kind of replicable by sixth graders for a science fair investigation... I might put this on the list. except that NYC doesn't have snow days. really. one in 8 years. ONE.

Oy - I wrote *more* where I meant *fewer*. Now I understand why Matthew was confused - not about the magnitude of the effect, which is what I thought he was responding to, but the direction. Apparently I can't read.

It's corrected now - thanks for the catch, guys.
