by Christopher Paslay
The drop in PSSA test scores cannot be attributed solely to improved test security. Cuts in education funding, though not acknowledged by state officials, are also to blame.
The official results of the 2012 PSSA exams are out, and it seems everyone is pointing fingers. Math and reading scores are down 1.5 points statewide, and an average of 8 points in Philadelphia. Teachers unions are blaming cuts in education funding for the slump in student performance, and it appears they have a legitimate argument. Last school year, Governor Corbett cut $860 million from K-12 education, which translated to a loss of about $410 per student. These cuts hit impoverished school districts the hardest; in Philadelphia, state education funding decreased by about $557 per student.
“When resources are pulled from our schools, scores drop,” Jerry Jordan, president of the Philadelphia Federation of Teachers, said.
Education Secretary Ron Tomalis insisted the drop in PSSA scores had nothing to do with recent changes in funding. “I don’t buy the excuse the numbers went down because of budget cuts,” he said. According to Tomalis, scores are down because of heightened test security put in place during last spring’s PSSAs. This conclusion was backed by the state’s Technical Advisory Committee, which studied three possible causes for the drop in scores: funding, changes in test content, and tighter test security.
The recently released 2012 PSSA Official Report, however, contains insufficient data to support TAC’s findings. Although TAC states “that the only scientific cause for the drop in scores from 2011 to 2012 was the Department’s investigation of past testing improprieties which has led to heightened test security measures,” the report offers no analysis of the effect of changes in funding.
How did cutting $860 million from K-12 education impact testing, exactly? How did losing hundreds of teachers, nurses, librarians, counselors and school police affect test scores? How did losing art, music, foreign language, sports programs, clubs, and a multitude of other extracurricular activities impede education? TAC never adequately addresses these issues in the report.
Selective interpretation of test data seems to be the Pa. Dept. of Ed.’s modus operandi. Also missing from the 2012 PSSA report are the forensic audits of the 2010 and 2011 PSSAs, as conducted by the Data Recognition Corporation, the maker of the PSSA. A neatly arranged, prepackaged analysis of the state’s “Integrity Investigation” into cheating on past exams is contained in the report, but this investigation is by no means an adequate substitute for the original audits of the 2010 and 2011 PSSAs themselves. Pennsylvania taxpayers have a right to review the primary documents and draw their own conclusions about which schools and districts cheated on the 2010 and 2011 PSSAs; my gut feeling is still that the Philadelphia School District, although clearly guilty of widespread cheating, was made the primary scapegoat by the state.
The state has also failed to explain why it waited until shortly before the start of the 2012 PSSA to announce its new security policies for administering the exam, and why only Philadelphia and a handful of other districts were required to abide by the new measures. If cheating was so widespread, why weren’t the security measures mandated statewide?
Tragically, it appears that the state’s obsession with testing and test security is only going to get worse. While Corbett’s 2012-13 budget keeps school funding generally flat, it increases spending on educational assessments by 43 percent to $52 million.
The drop in PSSA test scores, especially in large urban districts such as Philadelphia, cannot be attributed solely to improved test security. The state’s claim otherwise is purely political, and until supported by sufficient data, lacks legitimacy.