With the sixth annual release of school progress reports, many parents and teachers are again raising concerns over how to interpret the scores and whether the reports are even useful.
Some take issue with the large grade swings at 14 percent of the 1,200 schools, questioning whether the measurement system is stable.
One father commented on SchoolBook that his son’s elementary school went from an F to an A in a single year with the same staff and principal.
“Any measure of ‘performance’ that can create such a wide disparity in a single year is meaningless,” Steven Levine said.
Others lament the weight given to standardized tests and the practice of basing a report card on a single year’s test scores.
“I have always thought of them as fairly inconsequential,” said Juhyung Harold Lee, a fifth-grade teacher at P.S. 124 Yung Wing elementary school in Manhattan’s Chinatown. “I don’t think that they really depict a school’s progress or quality in any meaningful sort of way.”
Lee’s school fell from the A grade it had earned in past years to a B this year. Lee hypothesizes that the school lost points on the “student progress” measure of year-to-year growth on test scores, which counts for most of the report. Students at Yung Wing tend to be high-performing, he said, so they did not have as much room to grow as pupils in struggling schools. Three-quarters of the school’s pupils were proficient on the state’s English Language Arts exams this year and almost 90 percent were proficient in math.
Lee’s complaint is frequently raised at high-scoring schools. But the city says it compares each school to a “peer group” of 30 to 40 schools with similar demographics, in order to see whether other schools have made more progress with the same populations. Education officials have argued all along that even high-scoring students can make progress, because they have seen other schools where it happens.
But that emphasis on test scores is precisely what worries teachers such as Lee.
Only 15 percent of the progress report score is based on input from teachers and parents, he noted, “which to me seems like the most meaningful indicators” of how the school is doing. Lee also questions basing such a large majority of the report on a standardized test that has been changing from year to year as the state phases in tougher standards. For instance, he said, students had to adjust to a longer testing time, which could have affected their stamina.
“So, for me, it’s very hard to understand how they can really unpack the growth that’s happening there.”
Then there’s the fact that progress reports are based on a single year’s metrics. Critics say this methodology can result in big swings. About a quarter of the 100-plus schools that got D’s and F’s on their progress reports this year got A’s and B’s the previous year.
“You want to give schools a really good indicator of how they’re doing right now,” said James Liebman, a Columbia Law School professor who worked with the city’s Department of Education to develop the first school progress report.
Back in 2007, officials had discussed basing the progress reports on several years’ worth of data. But that idea was rejected.
“If you take three-year averages of how schools would do,” said Liebman, “you could dilute what’s really important to a school, which is ‘The things that we’re doing right now, how well are they working?’”
Shael Polakow-Suransky, the city’s chief academic officer, noted that the city does include a chart on each school’s progress report showing how it has fared over the last three years. But he agreed that the progress report plays an important role in measuring annual shifts and that, yes, schools can change enough in a year to cause a big rise or fall in their letter grade.
“When you dig in behind why things decline in a school, often what you find is that there was something that happened in that school,” he said. “It could have been a change in staff or leadership. It could have been something around the curriculum. It could have been something around the facilities.”
The progress reports offer at least a point of inquiry, Suransky and Liebman say, to reflect on what may be contributing to changes in the school’s performance. And decisions about whether to close a struggling school are never based solely on one year’s progress report, said Suransky. The city looks at multiple years’ worth of data and more qualitative evaluations, such as school visits and meetings with staffers.
But Aaron Pallas, professor of sociology and education at Teachers College at Columbia University, said the methodology behind the city’s progress reports offers “a kind of false sense of precision.”
He singled out the peer index, which is used to compare schools that have similar demographics. The index takes into account economic need, students with disabilities, black and Hispanic students, and English language learners.
Principals often complain about these groupings, claiming their schools are compared either to schools that are actually very different, or to schools so similar that a slight fluctuation in one school’s test scores can lead to a big change in another school’s progress report.
The principals union has said that progress reports are useful but do not tell the whole story of a school’s progress.