Monday, May 16, 2011
All content copyright © 2003-2007
Statewide News Network, Inc.
Contents may not be reproduced
in any form without express written consent.

NYC comptroller questions stats on school progress

NEW YORK - New York City Comptroller John Liu announced that an audit of the Department of Education’s (DOE) High School Progress reports raised questions about the usefulness of the reports in comparing the yearly progress of schools.

“It’s troubling that a system that is used to decide school closings leaves teachers and students confused about what they need to do to improve,” Comptroller Liu said.  “The Department of Education should not leave parents, educators or students in the dark when it’s deciding their fates.”

High School Progress Reports are a DOE accountability tool that assigns schools an annual grade of A through F.  The Report grades play a significant role in the DOE’s decisions to reward high performing schools, perhaps with added funds, and restructure or close low-performing schools.

According to the audit, the DOE has revised the complex formula behind the grades every year.  The frequent changes the agency has made to its grading and other formulas, without determining the impact of those changes, make it difficult, if not impossible, to get a true picture of a school’s progress by comparing its grade from one year to the next.  As a result, the High School Progress Reports paint an unreliable and confusing picture of a school’s progress or failure over time.  Auditors recorded complaints from schools that the DOE’s lack of consistency made it difficult to set goals for students.

The audit focused on 10 high schools representing the five boroughs. It included three schools (Jamaica, Metropolitan Corporate Academy, and Norman Thomas high schools) that the DOE selected in January 2010 for closing.

Chief among the findings:

  • Inaccurate Picture of Year-to-Year Progress
    The DOE’s changes to the formula behind the Progress Report grades make it difficult for parents and educators to measure a school’s performance from one year to the next.  Action: Since the audit, the DOE has posted an advisory on its website regarding year-to-year comparisons of High School Progress Report grades.
  • Lack of Communication
    The audit determined that, while the DOE met with school principals and others about changes, there was no evidence that it actually integrated their feedback into the Progress Report.  Action: The DOE has since published materials summarizing and responding to feedback from educators and others involved in the 2010-2011 review process.
  • Data Reliability
    The audit found that the data — student grades, Regents exam scores, and other information — that the DOE used to calculate each year’s Progress Report grades accurately reflected student data recorded in the DOE’s computer systems and was verifiable.  However, while the data in a given year was accurately recorded, it was not useful as a measure of an individual school’s progress over time.


The DOE generally agreed with nine of the audit’s 10 recommendations and has begun to implement a number of them.  However, the audit notes that “DOE inappropriately misinterpreted, and even exaggerated, many of the audit’s ‘positive’ conclusions as an endorsement for the progress reports,” while simultaneously discounting the reports’ weaknesses.