Tuesday, November 6, 2007

The NYC School Report Card: A First Cut at the Data

The report cards are out - you can check them out here. There's a lot to talk about, but I thought it would be helpful to first create a profile of the schools that received various grades. Using the progress report data and data from the 2005 release of the NYC School Report Cards, I looked at the average characteristics of schools receiving a given grade (i.e. means). For example, A schools average 15.92% Asian students, while schools that received Fs have an average Asian population of 4.53%.
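For anyone who wants to reproduce this kind of profile, here is a minimal sketch of the calculation in Python/pandas. The file names and column names below are placeholders I've made up for illustration, not the actual field names in the progress report or 2005 report card downloads.

```python
import pandas as pd

# Placeholder file names -- substitute the actual progress report and
# 2005 report card files.
progress = pd.read_csv("progress_reports_2007.csv")    # one row per school: DBN, letter grade, ...
report_card = pd.read_csv("report_cards_2005.csv")     # one row per school: DBN, demographics, ...

# Join the two files on the school identifier (DBN).
merged = progress.merge(report_card, on="DBN", how="inner")

# Average characteristics of the schools receiving each grade.
profile = merged.groupby("grade")[
    ["pct_asian", "pct_black", "pct_hispanic", "pct_free_reduced_lunch"]
].mean()

print(profile.round(2))
```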


The bright line findings from the table below are:
  • Schools with higher grades have higher proportions of Asian students; A schools have an average of 16% Asian, while F schools have 4.5%.
  • Schools with higher grades have much lower proportions of African-American students; A schools have an average of 26%, while F schools have an average of 47%.
  • Schools with higher grades have lower proportions of students qualifying for free and reduced-price lunch (A schools = 65%, F schools = 76%). Interestingly, schools "under review" - those contesting their grades - have much lower percentages of free and reduced-price lunch kids (56.5%).
  • Schools with higher grades have higher proportions of Hispanic and immigrant students.
I'll write in more depth about the comparison group issue later, but for now, these results - particularly the Asian finding - suggest that the comparison groups may not adequately control for the fact that kids from different backgrounds may be on different growth trajectories for reasons that have little to do with the schools they attend. (Read more about this under point #2 here.)


Also interesting are the dimensions where there are very few differences between schools receiving different grades, as you'll see in some of the variables below.

* Notes on the analysis: The most recent data to which I had access were 2005 data; because these variables are highly correlated across years, the general trends should look the same. (If someone has clean 2006 data, send it along and I can rerun these descriptives.) Also note that these are means; if you are interested in standard deviations and ranges, email me.

11 comments:

Anonymous said...

My nightmare scenario for the report cards was that my school would get a B or C, since the test scores have improved from atrocious to merely unacceptable in the last five years. We got a D, which is fair and, I think, correct. The school is under capable new leadership, and a gaggle of hard-working, earnest young teachers have been hired. In short, the seeds for a long-term turnaround have been planted, but it may be a while before they bear fruit. Yesterday, I visited one of the best elementary schools I’ve ever seen in NYC, PS 108K in East New York. First-rate leadership, focused instruction, on-task students, committed parents – clearly a first-rate environment. They got a C, but it’s one of the first schools I’ve seen in an inner-city community that I would feel comfortable having my daughter attend, which, as an educator, is my own personal litmus test. We will probably need to view the report cards as one piece of data, and become more critical consumers of education, relying on a wide variety of inputs.

There are two criticisms of the report cards which I expect we will hear over and over in the days ahead: too much reliance on standardized tests, and the improvement model, which I have to think will make it easier (though not easy) for schools starting from a low baseline to earn an A, while making it impossible, over time, for high-achieving schools to maintain their rating.

I support strong accountability and see nothing wrong with standardized tests, but I can see lots of problems with basing 85% of a school’s grade on such tests. It’s a virtual demand to eliminate everything but tested subjects from the curriculum. Cheating has been rampant with lower stakes. The added pressure of these report cards may make it endemic. I’ve argued in the past for testing systems that make it impossible to game the scores. That need is now absolutely essential for these report cards to be reliable in the future. Likewise, if a school that is doing a great job providing a rigorous, balanced education suddenly feels pressure to abandon a successful curriculum to jump on the test-prep bandwagon, then the cure will be worse than the disease.

Rewarding growth is certainly the most important factor in assessing low-performing schools, but surely a different measure is in order for high-performing schools. I’m not a statistician, so perhaps someone can disabuse me of the following notion: Suppose we apply the report card methodology to the NYC Marathon. If Katie Holmes runs the New York City Marathon again next year, trains hard, and shaves an hour off her time to finish in four and a half hours, would she earn an A? If Paula Radcliffe, this year’s winner, finishes second next year or runs a minute or two slower, she might very well rate a B, even though she completed the race in less than half the time. Should we conclude that Katie Holmes is the better runner? At the very least, the fact that it would be easier for Katie Holmes to improve her time than for Paula Radcliffe needs to be taken into account. Wouldn’t the day eventually come when Paula Radcliffe would need to finish in under two hours just to maintain her A rating?
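To put the arithmetic of the analogy above in concrete terms, here is a toy sketch in Python with made-up finishing times: a score based purely on year-over-year improvement ranks the slower-but-improving runner first, while an absolute measure reverses the order.

```python
# Made-up finishing times (in hours), roughly matching the hypothetical above.
runners = {
    "slow but improving": {"last_year": 5.5, "this_year": 4.5},
    "fast but steady":    {"last_year": 2.38, "this_year": 2.40},
}

def growth_score(t):
    # Rewards only improvement over last year's time.
    return t["last_year"] - t["this_year"]

def level_score(t):
    # Rewards only this year's absolute time (faster = higher score).
    return -t["this_year"]

print("Ranked by growth:", sorted(runners, key=lambda r: growth_score(runners[r]), reverse=True))
print("Ranked by level: ", sorted(runners, key=lambda r: level_score(runners[r]), reverse=True))
```

Under the growth ranking the four-and-a-half-hour runner comes out on top; under the absolute ranking the order flips, which is exactly the trade-off at issue.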

Surely some absolute measure of achievement and quality is in order. I’m on the board of a non-profit which runs a small high school in the South Bronx. The school received an A, which is a testament to the hard work of its staff and some very good results. But I’m not prepared to argue it’s just as good as Bronx Science, which received the same grade.

It’s going to take a long time to really get to the bottom of this data, and to compare it against personal experience where one has objective insights into school quality. Among the South Bronx elementary schools I know well, there were no surprises. Some of the middle school scores did raise my eyebrows. One middle school that made the list of persistently dangerous schools received an A. Another school, one I’ve long encouraged my students to attend, rated a D. It’ll take some digging to find out whether the fault lies in the grading system or in my previous perceptions of those schools.

rpondiscio@aol.com

jack phelps said...

Where did you grab your demographic data from? I'm interested in applying some more rigorous analysis.

Double H said...

Where are the White students? Why do researchers always leave out the data trends for White students and focus on the racial minorities only?

Anonymous said...

White students in NYC probably get signing bonuses, or maybe a free iPod provided by Gates money, to attend the small schools they support.

Anonymous said...

This is the dirty little race card secret which I always suspected, but which no one wants to talk about.

If there have been gains in the school system these last few years (though the test scores are suspicious), it may have a lot to do with the city-wide rise in Asian and Bengali immigration and conscious efforts to drive out a black population. It also has to do with the hard work of staffs who succeed despite the crazed requirements and dulled curriculum initiatives.

Where I used to work on the Lower East Side, principals would mount campaigns against each other in order to recruit Chinese students. Under restructuring, the Region 9 leaders (coming mostly from District 2) made efforts to channel those kids away from District 1 and into their friends' schools in District 2.

Anonymous said...

During the last two years I taught in three Bronx middle schools. One was a truly exemplary school, and I was not surprised this school received an A. That being said, as Robert mentions above, this school in no way compares to Bronx Science or even a school like Mott Hall III. The school is great in comparison to the other two schools I worked in and, sadly, in comparison to most of the city's middle schools, but it certainly could be made better.

The other two schools had various problems, but one was terribly violent and unsafe. Additionally, from what I saw in four months, I can tell you with 100% certainty that very little true education occurred there. Kids were passed to the next grade with no rhyme or reason, my homeroom went a month and a half without a math teacher, and then one of my classes was totally disbanded sometime around October and I was given a new group of students. This school earned a C. I find this shocking. I would love to believe that the school got better in the year and a half since I left, but I know that isn't the case.

The other school, the one I worked in all of last year, received an A. This makes me critical of the whole scoring system. I believe way too much emphasis is placed on this growth model, and I don't believe it is an effective measure of the quality of a school. At this school I was applauded because 3 of my high 2s had become 3s from sixth to seventh grade (I had about 5 high 2s). The truth is two of these kids were exceptionally bright and I was surprised they scored a 2 to begin with. Beyond that, the difference between a high 2 and a 3 really isn't all that much. I know that overall a high number of our students went up from 6th to 7th and from 7th to 8th. I believe 40% of our 8th graders had made tremendous gains from 6th to 8th grade. And while that is excellent, it isn't a real measure of the quality of the school. First, because many of these kids started 6th grade way below grade level. And second, because the school has a series of huge problems, including problems with leadership.

jack phelps said...

Sol -- don't be too convinced until you see some more rigorous analysis. The Asian-American population is substantially smaller than the other populations measured, and attributing such large gains to such a small group may be misleading, depending on what a deeper look reveals. Your LES principals may have been influenced by the same fallacy. I'm not trying to be overly academic (and I certainly admit that the initial look could be accurate), but don't jump to any conclusions! There are plenty of other factors this look doesn't take into account that could be at play.

Anonymous said...

You can find more recent demographic data in the big spreadsheet of "learning environment survey" results that was released a few months ago: http://schools.nyc.gov/NR/rdonlyres/94AA43CA-0343-4F86-92C9-2CB1E6E7A09E/26549/surveyaccess_101107.xls

Go to the "school profile" tab (or something like that) and scroll way over to the right.

eduwonkette said...

Hi everyone,

Re the percent white - the dataset I had didn't include this measure, so I had to extract it from the database separately; I'll post those numbers later on. I agree it's an oversight, and I apologize.

Jack, if you want to email me directly at eduwonkette (at) gmail (dot) com and let me know what variables you're looking for, I'm happy to help.

ms. v. said...

I'm neither a statistics expert nor a progress report expert, but here's one little tidbit that might have something to do with the performance of schools with Hispanic students and immigrants versus the performance of schools with African-American students... schools can receive "extra credit" for having populations of English Language Learner students, which in many schools means recent Hispanic immigrants. For what that's worth...

Anonymous said...

I've heard that students for whom there is no prior year score basically don't count in the progress reports. So do schools that have lots of over-the-counter admits, including new immigrants, basically get screwed for serving kids who don't count in this year's progress reports?