Florida botches release of new data on teacher evaluations
By Curtis Krueger, Marlene Sokol, Jeffrey S. Solochek and Danny Valentine, Times Staff Writers
Published Wednesday, December 5, 2012
Florida's Department of Education on Wednesday rolled out the results of a sweeping new teacher evaluation system that is designed to be a more accurate, helpful and data-driven measure of how well teachers actually get students to learn.
And then, within hours of releasing the data, the department pulled the numbers off its website and sheepishly admitted that much of the information was wrong.
State officials late Wednesday said thousands of teachers were mistakenly double-counted because they had more than one "job code" in computerized records. That skewed the results.
Department spokeswoman Cynthia Sucher acknowledged it was "distressing" for the agency to learn that the information turned out to be incorrect.
The new evaluation system has been stressful for teachers. Even though it appears that the vast majority have been rated as "effective" or "highly effective," many have been downgraded. Critics of the new system said the problem did not surprise them.
"Garbage in, garbage out," said Bob Schaeffer, public education director for Fair Test, which opposes excessive testing. "The teacher evaluation system is ideologically driven and not ready for prime time . . . When you rush to put a shoddy system in place, you get ludicrous results."
"We told you so," said Marshall Ogletree, executive director of the Pinellas Classroom Teachers Association.
The Florida Legislature approved the comprehensive new system and moved to implement it so quickly that it amounted to "trying to do something that's impossible to do at breakneck speed," he said.
It's not the first time the department has botched the release of important education data. Earlier this year, the department admitted to giving incorrect school grades to 200 schools, including some in Pinellas.
Also, the state decided to toughen the FCAT writing test, but the Board of Education later decided to temporarily lower the passing mark for the test after conceding poor communication could have contributed to an unprecedented drop in writing scores.
On Wednesday, the department posted the data at 10:30 a.m. and at 11 began a nearly hourlong conference call with news reporters from across the state. By early afternoon, Hillsborough County school officials had noticed that the state's figures showed the district employing 23,970 teachers, when the real number was less than 15,000.
"The numbers don't look right, and it's not just us," Hillsborough schools spokesman Stephen Hegarty said. "We have asked the DOE to look at them."
"They were the first to notice it and they called our people," Sucher said later. Department officials then noticed that some other districts, but not all of them, also had data that was double-counted.
Sucher did not say whether the department was at fault for giving school districts faulty instructions on how to supply the data, or whether the school systems had failed to follow those instructions.
Asked how such an error could occur in such a high-profile project, Sucher pointed out that it is a brand-new system, with preliminary data that is scheduled to be finalized later. Also, individual school districts had some latitude in creating their own evaluation systems, so there are differences from county to county.
Therefore, when looking at data supplied by the school districts, "you're not comparing apples to apples, you're comparing apples to grapes," Sucher said.
Today, the department plans to try again to release the data. It should show, on a school-by-school level, what percentage of teachers have been rated as "highly effective," "effective," "needs improvement," "developing" or "unsatisfactory."
However, individual teacher scores will not be immediately released.
Jeffrey S. Solochek and Danny Valentine contributed to this report.