The idealized idea is that in the vast supermarket of educational goods, grades serve as a strong signaling mechanism that tells both educators and parents where their schools stand. They are intended to provide a stronger signal, and thus clearer guidance for action, than raw test scores. By placing schools in a distribution where everyone knows the pecking order, proponents argue, grades create strong incentives for improvement. They are also supposed to serve as shortcuts for parents, allowing them to vote with their feet and create market pressure for schools to improve.
In the locales where grades have been implemented, they appear to have real consequences for behavior. For example, economists David Figlio and Maurice Lucas found that small differences in school grades in Florida affect housing prices. Similarly, a new experimental study by economist Justine Hastings and colleagues found that providing simplified information on test scores "improves" parents' choices (i.e., they pick higher-scoring schools).
What I found is that an A or B school is not necessarily a school with a positive school environment or a well-developed rating on the Quality Review, and vice versa:
* 67 schools that received school environment scores in the lowest 20% of all city schools (i.e. if this was a separate grade, they would have received an F) received As and Bs.
New Yorkers may be wondering about which schools fell into these mismatch categories:
Schools that got As but were in the bottom 20% of school environment category scores:
1) PS 165 ROBERT E SIMON
2) JHS 080 THE MOSHOLU PARKWAY
3) BRONX LITTLE SCHOOL
4) PS 038 THE PACIFIC
5) PS 202 ERNEST S JENKYNS
6) PS 015 JACKIE ROBINSON
7) PS 031 BAYSIDE
Schools that got Ds or Fs but were in the top 20% of the school environment category scores:
1) MS 243 CENTER SCHOOL
2) CENTRAL PARK EAST I
3) ISAAC NEWTON JHS FOR SCIENCE & MATH
4) PS 291
5) PS 007 ABRAHAM LINCOLN
6) BROOKLYN COLLEGE ACADEMY
7) PS 179
8) PS 35 The Clove Valley School
Schools rated as "underdeveloped" (the lowest category) on Quality Reviews that got As and Bs:
As:
1) MS 260 CLINTON SCHL WRITERS &
2) THE EAST VILLAGE COMMUNITY SCHOOL
3) NEW EXPLORATIONS INTO SCIENCE, TECHNOLOGY AND MATH
4) THE HERITAGE SCHOOL
5) JHS 080 THE MOSHOLU PARKWAY
6) SCHOOL FOR EXCELLENCE
7) JHS 050 JOHN D WELLS
8) PS 193 GIL HODGES
Bs:
9) PS 015 ROBERTO CLEMENTE
10) JHS 044 WILLIAM J O'SHEA
11) PS 137 JOHN L BERNSTEIN
12) PS 217/IS 217 ROOSEVELT IS.
13) NEW EXPLORATIONS INTO SCIENCE, TECHNOLOGY & MATH
14) PS/IS 54
15) PS 061 FRANCISCO OLLER
16) PS 072 DR WILLIAM DORNEY
17) PS 107
18) I.S.117 JOSEPH H WADE
19) JHS 125 HENRY HUDSON
20) JHS 162 LOLA RODRIGUEZ DE TIO
21) PS 246 POE CENTER
22) NEW MILLENNIUM BUSINESS ACADEMY MIDDLE SCHOOL
23) ACCION ACADEMY
24) MS 391
25) HIGH SCHOOL FOR TEACHING AND THE PROFESSIONS
26) GLOBAL ENTERPRISE HIGH SCHOOL
27) PS 046 EDWARD C BLUM
28) PS 133 WILLIAM A BUTLER
29) IS 136 CHARLES O DEWEY
30) PS 184 NEWPORT
31) ANDRIES HUDDE
32) PS 305 DR PETER RAY
33) PS 040 SAMUEL HUNTINGTON
34) I.S. 73 - THE FRANK SANSIVIERI INTERMEDIATE SCHOOL
35) PS 156 LAURELTON
Schools graded as well-developed (the highest category) on Quality Reviews that got Ds and Fs:
Ds:
1) MS 243 CENTER SCHOOL
2) PS 016 WAKEFIELD
3) PS 018 JOHN PETER ZENGER
4) PS 044 DAVID C FARRAGUT
5) PS 047 JOHN RANDOLPH
6) PS 007 ABRAHAM LINCOLN
7) I. S. 381
8) BROOKLYN COLLEGE ACADEMY
9) JHS 008 RICHARD S GROSSLEY
10) PS 105 THE BAY SCHOOL
11) PS 166 HENRY GRADSTEIN
12) PS 8 SHIRLEE SOLOMON
13) I S 034 TOTTENVILLE
14) PS 041 NEW DORP
15) PS 055 HENRY M BOEHM
Fs:
16) PS 033 CHELSEA PREP
17) PS 106 PARKCHESTER
18) PS 130 ABRAM STEVENS HEWITT
19) PS 179
20) PS 182
21) PS 238 Anne Sullivan
22) PS 35 The Clove Valley School
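Mismatches like the ones listed above can be flagged with a short script. This is a minimal sketch, not the DOE's methodology: the school names and scores below are invented for illustration, and a real analysis would use the city's published report card and environment survey files.

```python
# Hypothetical illustration: flag schools whose letter grade and school
# environment score point in opposite directions. All data here is made up.

def percentile_cutoff(scores, pct):
    """Return the value at roughly the given percentile (0-100) of scores."""
    ordered = sorted(scores)
    idx = min(int(len(ordered) * pct / 100), len(ordered) - 1)
    return ordered[idx]

def find_mismatches(schools):
    """schools: list of (name, letter_grade, environment_score) tuples."""
    env = [s[2] for s in schools]
    bottom = percentile_cutoff(env, 20)   # bottom-20% threshold
    top = percentile_cutoff(env, 80)      # top-20% threshold
    high_grade_low_env = [s[0] for s in schools
                          if s[1] in ("A", "B") and s[2] <= bottom]
    low_grade_high_env = [s[0] for s in schools
                          if s[1] in ("D", "F") and s[2] >= top]
    return high_grade_low_env, low_grade_high_env

sample = [
    ("School 1", "A", 3.1),
    ("School 2", "B", 8.4),
    ("School 3", "F", 9.2),
    ("School 4", "C", 5.0),
    ("School 5", "D", 9.5),
    ("School 6", "A", 7.7),
]
good_grade_bad_env, bad_grade_good_env = find_mismatches(sample)
# good_grade_bad_env: A/B schools with bottom-quintile environment scores
# bad_grade_good_env: D/F schools with top-quintile environment scores
```

With the invented sample, "School 1" lands in the first list and "School 3" and "School 5" in the second: exactly the two mismatch categories the post tabulates.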
So what does it mean for a school to be good? As savvy parents and teachers know, it depends on what "good" means. The report card grades should be interpreted with that in mind.
4 comments:
I really liked this post. As a former teacher and someone very interested in education policy, I've been thinking about these report card scores a lot over the last couple of days. While I believe that giving schools a uniform ranking can be beneficial in a lot of ways, it seems obvious that the system used was flawed. It bothers me greatly that so much money was spent on a deeply flawed system when it could have been put to use in a way that actually helps students. How can anyone really believe in this data when we know schools that are on the NY State failing list earned As, and schools with good environments and 95% of their kids passing exams could not earn an A (due to a lack of high enough growth)?
Additionally, from what I understand, schools were compared to "like" schools with similar populations. I understand that it is necessary to take into account the number of low-income students and the ELL and Special Ed populations, but I can't help but think that the way the data was utilized legitimizes a system that holds low-income students and schools to a different standard than middle-class students and schools. My former school earned an A even though it is in no way on the same level as schools such as Mott Hall III, Bronx Science, or a number of other A schools. It bothers me that data is being used so haphazardly that an A means something different depending on where a school is located.
Beyond that, my biggest concern has to do with the community I once worked in and other communities like it. Most of my students' parents had no real sense of the school system, and lacked the time or resources to advocate on behalf of their children. I can't help but think that these parents are the ones who will be the most affected by this new report card system. Now they will think, "Great, my kid is at an A school," and see no need to seek any more information or be critical consumers.
This is just a germ of an idea (I lack eduwonkette's masterful grasp of data and statistics) but after three days of looking at these report cards, I'm starting to wonder if it wouldn't make sense to think of these report cards the way college sports teams are classified and then ranked. The best Division III college football team is rarely a match for even a middling Division I team. But within its division a small college might be as daunting a foe as USC would be in the Rose Bowl.
As I said elsewhere here, the small South Bronx high school on whose board I serve got an A, but it can't reasonably be argued that it's as good as Bronx Science, which also got an A. I used to ruefully joke that the middle schools in the South Bronx came in three basic flavors: bad, worse and holy #$%&!. And, looking at the report cards, you could loosely interpret that comparison as accurate. Relative to each other, IS 162, for example might be a B, where IS 151 is a D, etc. Some of the schools have improved a lot relative to where they were and their peers. There's some value in knowing that--especially if those are my only choices--but there's still not a lot of room for complacency about their overarching quality and accomplishments.
In order for the report cards to be truly meaningful, there has to be some transparency and clarity about the comparisons, and this is where they fall down. The peer group methodology may be rock solid statistically (I'm out of my depth to say if it is or is not), but it muddies the water among parents and other users of this data. Clearly not all As are created equal, etc.
It seems what is needed to reward excellence and put pressure on laggards is first a classification of overarching quality (the division in which your school plays ball), then a grade relative to your peer group (your ranking in the AP football poll). This needs to be simple, easily grasped, and clear to users, or else the impact of these report cards *as communications tools* is dulled.
If my goal is a career in the NFL, I'm more likely to be drafted as a starter from a mediocre Division I team than a standout Division III team. So given the choice between signing a letter of intent with a rebuilding Division I program that has had a few lean years, or the winner of last year's Amos Alonzo Stagg Bowl, the Division III championship, I'd choose the former.
Likewise, I'd rather have my child attend a Division I "C" school, than a Division III "A."
rpondiscio@aol.com
Clearly the grades are preposterous, more nonsense from an administration that's accomplished nothing, yet managed the media like masters.
Do you want to know what a good school is? My daughter attends one, because I live in a nearby suburb. Though I work in one of the city's very best regular high schools, the contrast is stunning. She's now in sixth grade, and she's never had a bad teacher.
When I walk into her schools (she's now been in three), the buildings are colorful and clean, be they new or old. Class sizes at her grade level are around 20, and will go up to 25. There are several working computers in every classroom.
When she has problems, I call or visit, and they are quickly resolved. No one talks about charter schools, or vouchers, or closing schools, or reorganizations, or report cards for schools.
If you insist on good teachers, reasonable class sizes and decent facilities, you will have good schools and you can dispense with all this nonsense.
However, neither Mr. Bloomberg nor Mr. Klein (nor Ms. Weingarten, truth be told) could give a rat's behind about such trivialities. And their refusal to confront the rampant dysfunction in our system has doomed it to remain pretty much the way it is, with the addition of an unprecedented atmosphere of fear and loathing for teachers.
Accountability? Give me a break. This is all to divert attention from Tweed, which takes no responsibility for anything, and create more charters to give kids an education that's even more on the cheap than what they're getting already.
If they tried this where I live, the residents would march on the district office with torches and pitchforks. Tweed deserves no less.
I really liked the balance in your post. I think there is great need for, and benefit in, showing parents information about quality schools; however, the next question for us should be how we define "quality."
I am a grad student at USC, and if you're interested, here is a project my research center has been working on. It's called the Charter Schools Indicators (CSI), and it measures the "quality" of charter schools along 13 different indexes... This conversation is just beginning, but I hope the CSI project will be another resource to help us think about how we measure schools.
http://www.usc.edu/dept/education/cegov/CSI_USC.pdf